Protecting Database Centric Web Services against SQL/XPath Injection Attacks
NASA Astrophysics Data System (ADS)
Laranjeiro, Nuno; Vieira, Marco; Madeira, Henrique
Web services represent a powerful interface for back-end database systems and are increasingly being used in business-critical applications. However, field studies show that a large number of web services are deployed with security flaws (e.g., SQL Injection vulnerabilities). Although several techniques for the identification of security vulnerabilities have been proposed, developing non-vulnerable web services is still a difficult task. In fact, security-related concerns are hard to apply as they add complexity to already complex code. This paper proposes an approach to secure web services against SQL and XPath Injection attacks by transparently detecting and aborting service invocations that try to take advantage of potential vulnerabilities. Our mechanism was applied to secure several web services specified by the TPC-App benchmark, proving to be 100% effective in stopping attacks, non-intrusive, and very easy to use.
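The abstract above describes transparently detecting invocations whose queries deviate from the service's legitimate behaviour. A minimal sketch of that structural-profiling idea: learn the skeletons of valid queries during a training phase, then abort any invocation whose query skeleton differs. The regex-based normalization and all names are illustrative assumptions, not the authors' implementation.

```python
import re

def query_skeleton(sql: str) -> str:
    """Reduce a SQL statement to its structural skeleton by
    replacing string and numeric literals with placeholders."""
    s = re.sub(r"'[^']*'", "?", sql)   # string literals -> ?
    s = re.sub(r"\b\d+\b", "?", s)     # numeric literals -> ?
    return re.sub(r"\s+", " ", s).strip().lower()

class InvocationGuard:
    """Learning phase records valid skeletons; protection phase
    rejects invocations whose queries deviate from the profile."""
    def __init__(self):
        self.profile = set()

    def learn(self, sql: str):
        self.profile.add(query_skeleton(sql))

    def check(self, sql: str) -> bool:
        return query_skeleton(sql) in self.profile

guard = InvocationGuard()
guard.learn("SELECT price FROM items WHERE id = 42")

# A benign call matches the learned structure:
assert guard.check("SELECT price FROM items WHERE id = 7")
# A tautology-based injection changes the skeleton and is rejected:
assert not guard.check("SELECT price FROM items WHERE id = 7 OR 1=1")
```

The same idea applies to XPath queries, with an XPath-aware normalizer in place of the SQL one.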
Web Services Security - Implementation and Evaluation Issues
NASA Astrophysics Data System (ADS)
Pimenidis, Elias; Georgiadis, Christos K.; Bako, Peter; Zorkadis, Vassilis
Web services development is a key theme in the utilization and commercial exploitation of the semantic web. Paramount to the development and offering of such services is the issue of security features and the way these are applied in instituting trust amongst participants and recipients of the service. Implementing such security features is a major challenge to developers, as they need to balance these with performance and interoperability requirements. Being able to evaluate the level of security offered is a desirable feature for any prospective participant. The authors attempt to address the issues of security requirements and evaluation criteria, while they discuss the challenges of security implementation through a simple web service application case.
Security and Efficiency Concerns With Distributed Collaborative Networking Environments
2003-09-01
have the ability to access Web communications services of the WebEx MediaTone Network from a single login. [24] WebEx provides a range of secure...Web. WebEx services enable secure data, voice and video communications through the browser and are supported by the WebEx MediaTone Network, a global...designed to host large-scale, structured events and conferences, featuring a Q&A Manager that allows multiple moderators to handle questions while
Sward, Katherine A; Newth, Christopher JL; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael
2015-01-01
Objectives To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Material and Methods Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Results Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data consistently transferred using the data dictionary and 1% needed human curation. Conclusions Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. PMID:25796596
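The interoperability result above rests on every site validating its records against a shared data dictionary before transfer, with non-conforming fields flagged for human curation. A toy sketch of that check (the dictionary fields and rules are invented for illustration):

```python
# Toy data dictionary: field -> required type and permitted values.
DATA_DICTIONARY = {
    "sex":       {"type": str, "allowed": {"M", "F"}},
    "age_years": {"type": int, "allowed": None},
}

def validate(record: dict):
    """Return the fields that need human curation because they
    do not conform to the shared data dictionary."""
    problems = []
    for field, rule in DATA_DICTIONARY.items():
        value = record.get(field)
        if not isinstance(value, rule["type"]):
            problems.append(field)
        elif rule["allowed"] is not None and value not in rule["allowed"]:
            problems.append(field)
    return problems

assert validate({"sex": "F", "age_years": 3}) == []
assert validate({"sex": "female", "age_years": 3}) == ["sex"]
```

Records passing validation transfer automatically; the small remainder (1% in the study) goes to a curator.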
A Security Architecture for Grid-enabling OGC Web Services
NASA Astrophysics Data System (ADS)
Angelini, Valerio; Petronzio, Luca
2010-05-01
In the proposed presentation we describe an architectural solution for enabling secure access to Grids and possibly other large-scale on-demand processing infrastructures through OGC (Open Geospatial Consortium) Web Services (OWS). This work has been carried out in the context of the security thread of the G-OWS Working Group. G-OWS (gLite enablement of OGC Web Services) is an international open initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII Project Consortia in order to collect and coordinate experiences in the enablement of OWSs on top of the gLite Grid middleware. G-OWS investigates the problem of the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity in order to enable Earth Science applications and tools. Concerning security issues, the integration of OWS-compliant infrastructures and gLite Grids needs to address relevant challenges, due to their respective design principles. In fact, OWSs are part of a Web-based architecture that delegates security aspects to other specifications, whereas the gLite middleware implements the Grid paradigm with a strong security model (the gLite Grid Security Infrastructure: GSI). In our work we propose a Security Architectural Framework allowing the seamless use of Grid-enabled OGC Web Services through the federation of existing security systems (mostly web based) with the gLite GSI. This is made possible by mediating between different security realms, whose mutual trust is established in advance during the deployment of the system itself. Our architecture is composed of three different security tiers: the user's security system, a specific G-OWS security system, and the gLite Grid Security Infrastructure.
Applying the separation-of-concerns principle, each of these tiers is responsible for controlling the access to a well-defined resource set, respectively: the user's organization resources, the geospatial resources and services, and the Grid resources. While the gLite middleware is tied to a consolidated security approach based on X.509 certificates, our system is able to support different kinds of user security infrastructures. Our central component, the G-OWS Security Framework, is based on the OASIS WS-Trust specifications and on the OGC GeoRM architectural framework. This makes it possible to satisfy advanced requirements such as the enforcement of specific geospatial policies and complex secure web service chained requests. The typical use case is represented by a scientist belonging to a given organization who issues a request to a G-OWS Grid-enabled Web Service. The system initially asks the user to authenticate to his/her organization's security system and, after verification of the user's security credentials, it translates the user's digital identity into a G-OWS identity. This identity is linked to a set of attributes describing the user's access rights to the G-OWS services and resources. Inside the G-OWS Security system, access restrictions are applied by making use of the enhanced geospatial capabilities specified by the OGC GeoXACML. If the required action needs to make use of the Grid environment, the system checks if the user is entitled to access a Grid infrastructure. In that case his/her identity is translated to a temporary Grid security token using the Short Lived Credential Services (IGTF Standard). In our case, for the specific gLite Grid infrastructure, some information (VOMS Attributes) is plugged into the Grid Security Token to grant access to the user's Virtual Organization Grid resources. The resulting token is used to submit the request to the Grid and also by the various gLite middleware elements to verify the user's grants.
Based on the presented framework, the G-OWS Security Working Group developed a prototype enabling the execution of OGC Web Services on the EGEE Production Grid through federation with a Shibboleth-based security infrastructure. Future plans aim to integrate other Web authentication services such as OpenID, Kerberos and WS-Federation.
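The credential flow described above is a chain of identity translations: organization login, then a G-OWS identity carrying access-right attributes, then (if Grid access is needed) a short-lived Grid token with VO attributes. A hypothetical sketch of that chain; all function names, attribute strings, and the VO name are invented for illustration, not part of the G-OWS codebase.

```python
def org_authenticate(user, password, org_db):
    """Tier 1: the user's own organization verifies credentials."""
    return {"user": user, "realm": "org"} if org_db.get(user) == password else None

def to_gows_identity(org_identity, attribute_db):
    """Tier 2: translate the federated identity into a G-OWS
    identity linked to access-right attributes."""
    attrs = attribute_db.get(org_identity["user"], [])
    return {"user": org_identity["user"], "attributes": attrs}

def to_grid_token(gows_identity, vo="envirogrids.vo"):
    """Tier 3: issue a short-lived Grid token carrying VO
    attributes, only for users entitled to the Grid."""
    if "grid" not in gows_identity["attributes"]:
        return None
    return {"subject": gows_identity["user"], "vo": vo, "lifetime_s": 3600}

org_db = {"alice": "pw"}
attr_db = {"alice": ["wms:read", "grid"]}

ident = org_authenticate("alice", "pw", org_db)
gows = to_gows_identity(ident, attr_db)
token = to_grid_token(gows)
assert token == {"subject": "alice", "vo": "envirogrids.vo", "lifetime_s": 3600}
```

Mutual trust between the three tiers is what makes each translation step safe to accept without re-authenticating the user.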
Frey, Lewis J; Sward, Katherine A; Newth, Christopher J L; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael
2015-11-01
To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data consistently transferred using the data dictionary and 1% needed human curation. Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.
Security concept in 'MyAngelWeb' a website for the individual patient at risk of emergency.
Pinciroli, F; Nahaissi, D; Boschini, M; Ferrari, R; Meloni, G; Camnasio, M; Spaggiari, P; Carnerone, G
2000-11-01
We describe the Security Plan for the 'MyAngelWeb' service. The different actors involved in the service are subject to different security procedures. The core of the security system is implemented at the host site by means of a DBMS and standard Information Technology tools. Hardware requirements for sustainable security are needed at the web-site construction sites. They are not needed at the emergency physician's site. At the emergency physician's site, a two-way authentication system (password and test phrase method) is implemented.
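The two-way authentication described for the emergency physician's site works in both directions: the server first proves itself by echoing the user's pre-registered test phrase, and only then does the user submit the password. A minimal sketch under that assumption; the user record, phrase, and salt are invented for illustration.

```python
import hashlib
import secrets

# Illustrative user store; a real deployment would keep this in the DBMS.
USERS = {
    "dr.rossi": {
        "test_phrase": "blue heron at dawn",   # shown so the user can verify the site
        "salt": "a41f",
        "pw_hash": hashlib.sha256(b"a41f" + b"s3cret").hexdigest(),
    }
}

def step1_site_authentication(username):
    """Server proves its identity by echoing the user's test phrase
    before any password is typed."""
    return USERS[username]["test_phrase"]

def step2_user_authentication(username, password):
    """User proves identity with the password (salted-hash compare,
    constant time to avoid timing leaks)."""
    u = USERS[username]
    candidate = hashlib.sha256(u["salt"].encode() + password.encode()).hexdigest()
    return secrets.compare_digest(candidate, u["pw_hash"])

assert step1_site_authentication("dr.rossi") == "blue heron at dawn"
assert step2_user_authentication("dr.rossi", "s3cret")
assert not step2_user_authentication("dr.rossi", "wrong")
```

The test-phrase step is what removes the need for security hardware at the physician's site: a phishing page cannot display a phrase it never learned.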
Savel, Thomas G; Bronstein, Alvin; Duck, William; Rhodes, M Barry; Lee, Brian; Stinn, John; Worthen, Katherine
2010-01-01
Real-time surveillance systems are valuable for timely response to public health emergencies. It has been challenging to leverage existing surveillance systems in state and local communities, and, using a centralized architecture, add new data sources and analytical capacity. Because this centralized model has proven to be difficult to maintain and enhance, the US Centers for Disease Control and Prevention (CDC) has been examining the ability to use a federated model based on secure web services architecture, with data stewardship remaining with the data provider. As a case study for this approach, the American Association of Poison Control Centers and the CDC extended an existing data warehouse via a secure web service, and shared aggregate clinical effects and case counts data by geographic region and time period. To visualize these data, CDC developed a web browser-based interface, Quicksilver, which leveraged the Google Maps API and Flot, a javascript plotting library. Two iterations of the NPDS web service were completed in 12 weeks. The visualization client, Quicksilver, was developed in four months. This implementation of web services combined with a visualization client represents incremental positive progress in transitioning national data sources like BioSense and NPDS to a federated data exchange model. Quicksilver effectively demonstrates how the use of secure web services in conjunction with a lightweight, rapidly deployed visualization client can easily integrate isolated data sources for biosurveillance.
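The federated model above hinges on data stewardship remaining with the provider: the NPDS web service exposes only aggregate clinical-effect and case counts by region and period, never case-level records. A toy sketch of that aggregation boundary (field names and values are invented):

```python
from collections import Counter

def aggregate_case_counts(cases):
    """Only aggregate (region, week) counts cross the web service
    boundary; individual case records stay with the data provider."""
    counts = Counter((c["region"], c["week"]) for c in cases)
    return [{"region": r, "week": w, "count": n}
            for (r, w), n in sorted(counts.items())]

raw = [
    {"region": "SE", "week": "2010-W01", "id": 1},
    {"region": "SE", "week": "2010-W01", "id": 2},
    {"region": "NW", "week": "2010-W02", "id": 3},
]
assert aggregate_case_counts(raw) == [
    {"region": "NW", "week": "2010-W02", "count": 1},
    {"region": "SE", "week": "2010-W01", "count": 2},
]
```

A visualization client such as Quicksilver then only ever sees the aggregated rows, which is what makes a lightweight browser front end feasible.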
Security and Dependability Solutions for Web Services and Workflows
NASA Astrophysics Data System (ADS)
Kokolakis, Spyros; Rizomiliotis, Panagiotis; Benameur, Azzedine; Sinha, Smriti Kumar
In this chapter we present an innovative approach towards the design and application of Security and Dependability (S&D) solutions for Web services and service-based workflows. Recently, several standards have been published that prescribe S&D solutions for Web services, e.g. OASIS WS-Security. However, the application of these solutions in specific contexts has proven problematic. We propose a new framework for the application of such solutions based on the SERENITY S&D Pattern concept. An S&D Pattern comprises all the necessary information for the implementation, verification, deployment, and active monitoring of an S&D Solution. Thus, system developers may rely on proven solutions that are dynamically deployed and monitored by the Serenity Runtime Framework. Finally, we further extend this approach to cover the case of executable workflows which are realised through the orchestration of Web services.
Secure password-based authenticated key exchange for web services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, Fang; Meder, Samuel; Chevassut, Olivier
This paper discusses an implementation of an authenticated key-exchange method rendered on message primitives defined in the WS-Trust and WS-SecureConversation specifications. This IEEE-specified cryptographic method (AuthA) is proven-secure for password-based authentication and key exchange, while WS-Trust and WS-SecureConversation are emerging Web Services Security specifications that extend the WS-Security specification. A prototype of the presented protocol is integrated in the WSRF-compliant Globus Toolkit V4. Further hardening of the implementation is expected to result in a version that will be shipped with future Globus Toolkit releases. This could help to address the current unavailability of decent shared-secret-based authentication options in the Web Services and Grid world. Future work will be to integrate One-Time-Password (OTP) features in the authentication protocol.
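The core idea of password-based key exchange is that both endpoints derive the same session key from the shared password plus fresh nonces, so the password itself never crosses the wire. The toy below illustrates only that derivation symmetry; unlike AuthA or any real PAKE, it is vulnerable to offline dictionary attack and must not be used as a protocol.

```python
import hashlib
import hmac
import os

def derive_session_key(password: bytes, client_nonce: bytes,
                       server_nonce: bytes) -> bytes:
    """Mix the shared password with fresh nonces from both sides;
    each run of the protocol yields a different session key."""
    return hmac.new(password, client_nonce + server_nonce,
                    hashlib.sha256).digest()

password = b"correct horse"
cn, sn = os.urandom(16), os.urandom(16)   # exchanged in the clear

client_key = derive_session_key(password, cn, sn)
server_key = derive_session_key(password, cn, sn)

assert client_key == server_key                                # both sides agree
assert derive_session_key(b"wrong guess", cn, sn) != client_key  # wrong password fails
```

A proven-secure PAKE such as AuthA additionally blinds the exchange with Diffie-Hellman values so that a passive observer cannot test password guesses offline, which is precisely what this toy lacks.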
Savel, Thomas G; Bronstein, Alvin; Duck, William; Rhodes, M. Barry; Lee, Brian; Stinn, John; Worthen, Katherine
2010-01-01
Objectives Real-time surveillance systems are valuable for timely response to public health emergencies. It has been challenging to leverage existing surveillance systems in state and local communities, and, using a centralized architecture, add new data sources and analytical capacity. Because this centralized model has proven to be difficult to maintain and enhance, the US Centers for Disease Control and Prevention (CDC) has been examining the ability to use a federated model based on secure web services architecture, with data stewardship remaining with the data provider. Methods As a case study for this approach, the American Association of Poison Control Centers and the CDC extended an existing data warehouse via a secure web service, and shared aggregate clinical effects and case counts data by geographic region and time period. To visualize these data, CDC developed a web browser-based interface, Quicksilver, which leveraged the Google Maps API and Flot, a javascript plotting library. Results Two iterations of the NPDS web service were completed in 12 weeks. The visualization client, Quicksilver, was developed in four months. Discussion This implementation of web services combined with a visualization client represents incremental positive progress in transitioning national data sources like BioSense and NPDS to a federated data exchange model. Conclusion Quicksilver effectively demonstrates how the use of secure web services in conjunction with a lightweight, rapidly deployed visualization client can easily integrate isolated data sources for biosurveillance. PMID:23569581
31 CFR 344.3 - What provisions apply to the SLGSafe Service?
Code of Federal Regulations, 2014 CFR
2014-07-01
... to the SLGSafe Service? (a) What is the SLGSafe Service? SLGSafe is a secure Internet site on the World Wide Web through which subscribers submit SLGS securities transactions. SLGSafe Internet... (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE FISCAL SERVICE U.S. TREASURY SECURITIES...
31 CFR 344.3 - What provisions apply to the SLGSafe Service?
Code of Federal Regulations, 2012 CFR
2012-07-01
... to the SLGSafe Service? (a) What is the SLGSafe Service? SLGSafe is a secure Internet site on the World Wide Web through which subscribers submit SLGS securities transactions. SLGSafe Internet... (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT U.S. TREASURY SECURITIES...
31 CFR 344.3 - What provisions apply to the SLGSafe Service?
Code of Federal Regulations, 2013 CFR
2013-07-01
... to the SLGSafe Service? (a) What is the SLGSafe Service? SLGSafe is a secure Internet site on the World Wide Web through which subscribers submit SLGS securities transactions. SLGSafe Internet... (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY BUREAU OF THE PUBLIC DEBT U.S. TREASURY SECURITIES...
Service-Oriented Architecture for NVO and TeraGrid Computing
NASA Technical Reports Server (NTRS)
Jacob, Joseph; Miller, Craig; Williams, Roy; Steenberg, Conrad; Graham, Matthew
2008-01-01
The National Virtual Observatory (NVO) Extensible Secure Scalable Service Infrastructure (NESSSI) is a Web service architecture and software framework that enables Web-based astronomical data publishing and processing on grid computers such as the National Science Foundation's TeraGrid. Characteristics of this architecture include the following: (1) Services are created, managed, and upgraded by their developers, who are trusted users of computing platforms on which the services are deployed. (2) Service jobs can be initiated by means of Java or Python client programs run on a command line or with Web portals. (3) Access is granted within a graduated security scheme in which the size of a job that can be initiated depends on the level of authentication of the user.
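The graduated security scheme in item (3) maps authentication strength to a job-size ceiling. A minimal sketch of such a policy table; the levels and CPU-hour limits are illustrative assumptions, not NESSSI's actual configuration.

```python
# Illustrative policy: stronger authentication unlocks larger jobs.
JOB_CEILING_CPU_HOURS = {
    "anonymous":   0,     # no job submission at all
    "password":    10,    # small jobs only
    "certificate": 1000,  # full grid allocation
}

def authorize_job(auth_level: str, requested_cpu_hours: int) -> bool:
    """Graduated security: the size of a job a user can initiate
    depends on the level of authentication of the user."""
    return requested_cpu_hours <= JOB_CEILING_CPU_HOURS.get(auth_level, 0)

assert not authorize_job("anonymous", 1)
assert authorize_job("password", 5)
assert not authorize_job("password", 50)
assert authorize_job("certificate", 500)
```

Unknown authentication levels fall through to a ceiling of zero, which keeps the default posture deny-by-default.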
Design of Provider-Provisioned Website Protection Scheme against Malware Distribution
NASA Astrophysics Data System (ADS)
Yagi, Takeshi; Tanimoto, Naoto; Hariu, Takeo; Itoh, Mitsutaka
Vulnerabilities in web applications expose computer networks to security threats, and many websites are used by attackers as hopping sites to attack other websites and user terminals. These incidents prevent service providers from constructing secure networking environments. To protect websites from attacks exploiting vulnerabilities in web applications, service providers use web application firewalls (WAFs). WAFs filter accesses from attackers by using signatures, which are generated based on the exploit codes of previous attacks. However, WAFs cannot filter unknown attacks because the signatures cannot reflect new types of attacks. In service provider environments, the number of exploit codes has recently increased rapidly because of the spread of vulnerable web applications that have been developed through cloud computing. Thus, generating signatures for all exploit codes is difficult. To solve these problems, our proposed scheme detects and filters malware downloads that are sent from websites which have already received exploit codes. In addition, to collect information for detecting malware downloads, web honeypots, which automatically extract the communication records of exploit codes, are used. According to the results of experiments using a prototype, our scheme can filter attacks automatically so that service providers can provide secure and cost-effective network environments.
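The scheme above sidesteps per-exploit signatures: web honeypots record which hosts have received exploit codes, and the provider then blocks malware downloads served from those hosts. A toy sketch of that filter; the class, hostnames, and the content-type heuristic are invented for illustration.

```python
class MalwareDownloadFilter:
    """Instead of a signature per exploit code, remember which
    websites the honeypots observed receiving exploit codes, and
    block binary downloads originating from them."""
    def __init__(self):
        self.compromised_hosts = set()

    def honeypot_observed_exploit(self, host: str):
        """Called when a web honeypot records an exploit code sent
        to this host."""
        self.compromised_hosts.add(host)

    def allow_download(self, host: str, content_type: str) -> bool:
        """Filter executable payloads served by known-compromised
        hosts; ordinary pages still pass."""
        if host in self.compromised_hosts and content_type == "application/octet-stream":
            return False
        return True

f = MalwareDownloadFilter()
f.honeypot_observed_exploit("evil.example.org")

assert f.allow_download("clean.example.org", "application/octet-stream")
assert not f.allow_download("evil.example.org", "application/octet-stream")
assert f.allow_download("evil.example.org", "text/html")
```

Because the blocklist is fed automatically by the honeypots, no signature has to be written per exploit code, which is the cost advantage the abstract claims.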
Enterprise Considerations for Ports and Protocols
2016-10-21
selected communications. These protocols are restricted to specific ports or addresses in the receiving web service. HTTPS is familiarly restricted...in use by the web services and applications that are connected to the network are required for interoperability and security. Policies specify the...network or reside at the end-points (i.e., web services or clients). ____________________________ Manuscript received June 1, 2016; revised July
2008-03-01
Machine [29]. OC4J applications support Java Servlets , Web services, and the following J2EE specific standards: Extensible Markup Language (XML...IMAP Internet Message Access Protocol IP Internet Protocol IT Information Technology xviii J2EE Java Enterprise Environment JSR 168 Java ...LDAP), World Wide Web Distributed Authoring and Versioning (WebDav), Java Specification Request 168 (JSR 168), and Web Services for Remote
Efficient Authorization of Rich Presence Using Secure and Composed Web Services
NASA Astrophysics Data System (ADS)
Li, Li; Chou, Wu
This paper presents an extended Role-Based Access Control (RBAC) model for efficient authorization of rich presence using secure web services composed with an abstract presence data model. Following the information symmetry principle, the standard RBAC model is extended to support context sensitive social relations and cascaded authority. In conjunction with the extended RBAC model, we introduce an extensible presence architecture prototype using WS-Security and WS-Eventing to secure rich presence information exchanges based on PKI certificates. Applications and performance measurements of our presence system are presented to show that the proposed RBAC framework for presence and collaboration is well suited for real-time communication and collaboration.
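The extension to standard RBAC is that the watcher's role is not assigned globally but derived from a context-sensitive social relation to the presentity, and that role then filters the rich-presence document. A minimal sketch of that idea; the relations, attribute names, and permission sets are invented for illustration, not the paper's model.

```python
# Role -> which presence attributes the role may see.
ROLE_PERMS = {
    "colleague": {"availability"},
    "friend":    {"availability", "location"},
}

# Context-sensitive social relations: (presentity, watcher) -> role.
RELATIONS = {("alice", "bob"): "friend",
             ("alice", "carol"): "colleague"}

def visible_presence(presentity, watcher, presence):
    """Filter a rich-presence document down to the attributes that
    the watcher's relation (role) to the presentity permits."""
    role = RELATIONS.get((presentity, watcher))
    allowed = ROLE_PERMS.get(role, set())
    return {k: v for k, v in presence.items() if k in allowed}

doc = {"availability": "busy", "location": "Denver"}
assert visible_presence("alice", "bob", doc) == doc
assert visible_presence("alice", "carol", doc) == {"availability": "busy"}
assert visible_presence("alice", "mallory", doc) == {}
```

Unknown watchers fall through to the empty permission set, matching the information-symmetry principle: you see only what your relation entitles you to.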
Security Broker—A Complementary Tool for SOA Security
NASA Astrophysics Data System (ADS)
Kamatchi, R.; Rakshit, Atanu
2011-09-01
The Service Oriented Architecture, along with web services, provides a new dimension to the world of reusability and resource sharing. The services developed by a creator can be used by any service consumer from anywhere, regardless of the platform used. This open nature of the SOA architecture also raises security issues at various levels of usage. This paper discusses the implementation benefits of a security broker within the Service Oriented Architecture.
WebGLORE: a web service for Grid LOgistic REgression.
Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian
2013-12-15
WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed datasets that are sensitive. It only transfers aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, JAVA Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of GNU general public license as published by the Free Software Foundation.
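The privacy-preserving trick in GLORE-style fitting is that each site sends only aggregate statistics (here, a gradient vector of the logistic log-likelihood) and the server combines them into a global update. The sketch below shows one such aggregated gradient step with invented two-site data; it is a simplification of the actual Newton-Raphson procedure WebGLORE uses.

```python
import math

def local_gradient(beta, rows):
    """Each site computes this aggregate over its own rows and
    sends only the resulting vector, never patient-level data."""
    g = [0.0] * len(beta)
    for x, y in rows:
        p = 1.0 / (1.0 + math.exp(-sum(b * xi for b, xi in zip(beta, x))))
        for j, xj in enumerate(x):
            g[j] += (y - p) * xj
    return g

def global_step(beta, site_gradients, lr=0.1):
    """The trusted server synthesizes the global update from the
    summed local statistics."""
    total = [sum(gs) for gs in zip(*site_gradients)]
    return [b + lr * t for b, t in zip(beta, total)]

# Toy data at two sites: rows are ((intercept, feature), label).
site_a = [((1.0, 0.0), 0), ((1.0, 1.0), 1)]
site_b = [((1.0, 2.0), 1)]

beta = [0.0, 0.0]
beta = global_step(beta, [local_gradient(beta, site_a),
                          local_gradient(beta, site_b)])
assert beta[1] > 0   # the positive feature pushes toward the positive class
```

Iterating this exchange to convergence yields the same coefficients as pooling the data, which is why the approach breaks down policy barriers without moving sensitive records.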
WebGLORE: a Web service for Grid LOgistic REgression
Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian
2013-01-01
WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed datasets that are sensitive. It only transfers aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, JAVA Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. Availability and implementation: http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of GNU general public license as published by the Free Software Foundation. Contact: x1jiang@ucsd.edu PMID:24072732
OGC and Grid Interoperability in enviroGRIDS Project
NASA Astrophysics Data System (ADS)
Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas
2010-05-01
EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. The geospatial technologies offer very specialized functionality for Earth Science oriented applications, as does the Grid-oriented technology that is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures by providing the basic and the extended features of both technologies. The geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues introduced (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all those problems becomes an important aspect. The Grid promotes and facilitates the secure interoperation of geospatial heterogeneous distributed data within a distributed environment, the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling the OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one.
In these use cases we give special attention to issues such as: the relations between the computational grid and the OGC Web service protocols, the advantages offered by the Grid technology, such as providing secure interoperability between distributed geospatial resources, and the issues introduced by the integration of distributed geospatial data in a secure environment: data and service discovery, management, access and computation. The enviroGRIDS project proposes a new architecture which allows a flexible and scalable approach for integrating the geospatial domain represented by the OGC Web services with the Grid domain represented by the gLite middleware. The parallelism offered by the Grid technology is discussed and explored at the data level, management level and computation level. The analysis is carried out for OGC Web service interoperability in general, but specific details are emphasized for Web Map Service (WMS), Web Feature Service (WFS), Web Coverage Service (WCS), Web Processing Service (WPS) and Catalog Service for Web (CSW). Issues regarding the mapping and the interoperability between the OGC and the Grid standards and protocols are analyzed as they are the base for solving the communication problems between the two environments: grid and geospatial. The presentation mainly highlights how the Grid environment and Grid application capabilities can be extended and utilized in geospatial interoperability. Interoperability between geospatial and Grid infrastructures provides features such as the specific geospatial complex functionality and the high power computation and security of the Grid, high spatial model resolution and geographical area covering, and flexible combination and interoperability of the geographical models.
In accordance with the Service Oriented Architecture concepts and the requirements of interoperability between geospatial and Grid infrastructures, each main functionality is visible from the enviroGRIDS Portal and, consequently, to the end-user applications such as Decision Maker/Citizen oriented Applications. The enviroGRIDS portal is the user's single entry point into the system, and the portal presents a uniform graphical user interface style. Main reference for further information: [1] enviroGRIDS Project, http://www.envirogrids.net/
77 FR 44306 - Service Delivery Plan
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-27
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2012-0048] Service Delivery Plan AGENCY: Social... publicly available. Do not include in your comments any personal information, such as Social Security... function of the Web page to find docket number SSA-2012-0048. The system will issue you a tracking number...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casella, R.
RESTful (REpresentational State Transfer) web services are an alternative implementation to SOAP/RPC web services in a client/server model. BNL's IT Division has started deploying RESTful web services for enterprise data retrieval and manipulation. Data is currently used by system administrators for tracking configuration information and, as the service is expanded, will be used by Cyber Security for vulnerability management and as an aid to cyber investigations. This talk will describe the implementation and outstanding issues as well as some of the reasons for choosing RESTful over SOAP/RPC and future directions.
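The REST-over-RPC distinction in the abstract is that configuration items become resources addressed by URL and manipulated with standard HTTP verbs, rather than named procedure calls. A toy dispatcher illustrating that style; the `/hosts/<name>` resource and its fields are invented, not BNL's actual API.

```python
# In-memory "configuration database": hostname -> attributes.
HOSTS = {"web01": {"os": "RHEL 6", "owner": "sysadmin"}}

def handle(method, path, body=None):
    """A tiny REST dispatcher: the URL names the resource, the
    HTTP verb names the operation (status code, payload) pairs."""
    _, collection, name = path.split("/")
    if collection != "hosts":
        return 404, None
    if method == "GET":
        return (200, HOSTS[name]) if name in HOSTS else (404, None)
    if method == "PUT":
        HOSTS[name] = body          # create or replace the resource
        return 200, body
    return 405, None                # verb not allowed on this resource

assert handle("GET", "/hosts/web01") == (200, {"os": "RHEL 6", "owner": "sysadmin"})
assert handle("PUT", "/hosts/web02", {"os": "RHEL 7"}) == (200, {"os": "RHEL 7"})
assert handle("GET", "/hosts/web02") == (200, {"os": "RHEL 7"})
```

Compared with SOAP/RPC, every operation here is visible in the URL and verb, which simplifies caching, logging, and the kind of audit trail cyber investigations need.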
Grid Enabled Geospatial Catalogue Web Service
NASA Technical Reports Server (NTRS)
Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush
2004-01-01
Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web Information Model, this paper proposes a new information model for Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and NASA EOS Core System, and service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, especially query on-demand data in the virtual community and get it back through the data-related services which provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also allows researchers to focus on science, and not on issues with computing ability, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data producing.
Secure medical digital libraries.
Papadakis, I; Chrissikopoulos, V; Polemi, D
2001-12-01
In this paper, a secure medical digital library is presented. It is based on the CORBA specifications for distributed systems. The described approach relies on a three-tier architecture. Interaction between the medical digital library and its users is achieved through a Web server. The choice of employing Web technology for the dissemination of medical data has many advantages compared to older approaches, but also poses extra requirements that need to be fulfilled. Thus, special attention is paid to the distinguished nature of such medical data, whose integrity and confidentiality should be preserved at all costs. This is achieved through the employment of Trusted Third Parties (TTP) technology for the support of the required security services. Additionally, the proposed digital library employs smartcards for the management of the various security tokens that are used by the above services.
Enhancing the AliEn Web Service Authentication
NASA Astrophysics Data System (ADS)
Zhu, Jianlin; Saiz, Pablo; Carminati, Federico; Betev, Latchezar; Zhou, Daicui; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Grigoras, Costin; Furano, Fabrizio; Schreiner, Steffen; Vladimirovna Datskova, Olga; Sankar Banerjee, Subho; Zhang, Guoping
2011-12-01
Web Services are an XML-based technology that allows applications to communicate with each other across disparate systems, and they are becoming the de facto standard for interoperability between heterogeneous processes and systems. AliEn2 is a grid environment based on web services. The AliEn2 services can be divided into three categories: central services, deployed once per organization; site services, deployed at each of the participating centers; and Job Agents, running automatically on the worker nodes. A security model to protect these services is essential for the whole system. Current web server implementations, such as Apache, are not directly suitable for use within the grid environment: Apache with mod_ssl and OpenSSL supports only X.509 certificates, whereas the common credential in the grid environment is the proxy certificate, used to provide restricted proxying and delegation. An authentication framework was developed for the AliEn2 web services that adds to the Apache web server the ability to accept both X.509 certificates and proxy certificates from the client side. The framework also allows the generation of access control policies to limit access to the AliEn2 web services.
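The proxy certificates mentioned above chain off a user's long-lived X.509 certificate by a naming rule: each proxy's subject extends its issuer's subject by one extra CN component (RFC 3820). The sketch below illustrates just that naming check on toy dict-based "certificates"; the subject strings and fields are invented for illustration, and a real validator would of course parse and verify actual X.509 structures and signatures.

```python
# Illustrative check of the RFC 3820 proxy-certificate naming rule:
# a proxy certificate's subject must equal its issuer's subject plus
# exactly one extra CN component. Certificates are modeled as plain
# dicts here; real code would verify parsed X.509 chains.

def is_valid_proxy_name(proxy_subject: str, issuer_subject: str) -> bool:
    """True if proxy_subject extends issuer_subject by exactly one CN."""
    if not proxy_subject.startswith(issuer_subject + "/CN="):
        return False
    extra = proxy_subject[len(issuer_subject):]
    # Exactly one additional component, e.g. "/CN=12345"
    return extra.count("/") == 1

def validate_chain(chain):
    """chain is ordered leaf-first: [proxy, ..., end-entity cert]."""
    for cert, issuer in zip(chain, chain[1:]):
        if cert["issuer"] != issuer["subject"]:
            return False
        if cert["is_proxy"] and not is_valid_proxy_name(cert["subject"],
                                                        issuer["subject"]):
            return False
    return True

user = {"subject": "/DC=ch/DC=cern/CN=alice user",
        "issuer": "/DC=ch/DC=cern/CN=CERN CA", "is_proxy": False}
proxy = {"subject": "/DC=ch/DC=cern/CN=alice user/CN=12345",
         "issuer": "/DC=ch/DC=cern/CN=alice user", "is_proxy": True}

print(validate_chain([proxy, user]))  # True
```

A server accepting both credential types, as the AliEn2 framework does, would apply this extra naming constraint only to certificates flagged as proxies.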
Detection and Prevention of Insider Threats in Database Driven Web Services
NASA Astrophysics Data System (ADS)
Chumash, Tzvi; Yao, Danfeng
In this paper, we take the first step to address the gap between the security needs in outsourced hosting services and the protection provided in the current practice. We consider both insider and outsider attacks in the third-party web hosting scenarios. We present SafeWS, a modular solution that is inserted between server side scripts and databases in order to prevent and detect website hijacking and unauthorized access to stored data. To achieve the required security, SafeWS utilizes a combination of lightweight cryptographic integrity and encryption tools, software engineering techniques, and security data management principles. We also describe our implementation of SafeWS and its evaluation. The performance analysis of our prototype shows the overhead introduced by security verification is small. SafeWS will allow business owners to significantly reduce the security risks and vulnerabilities of outsourcing their sensitive customer data to third-party providers.
48 CFR 1804.470-3 - IT security requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... the provisioning of services or products (e.g., research and development, engineering, manufacturing... Policies are available at the NASA IT Security Policy Web site at: http://www.nasa.gov/offices/ocio...
A Security-façade Library for Virtual-observatory Software
NASA Astrophysics Data System (ADS)
Rixon, G.
2009-09-01
The security-façade library implements, for Java, IVOA's security standards. It supports the authentication mechanisms for SOAP and REST web-services, the sign-on mechanisms (with MyProxy, AstroGrid Accounts protocol or local credential-caches), the delegation protocol, and RFC3820-enabled HTTPS for Apache Tomcat. Using the façade, a developer who is not a security specialist can easily add access control to a virtual-observatory service and call secured services from an application. The library has been an internal part of AstroGrid software for some time and it is now offered for use by other developers.
Assuring the privacy and security of transmitting sensitive electronic health information.
Peng, Charlie; Kesarinath, Gautam; Brinks, Tom; Young, James; Groves, David
2009-11-14
The interchange of electronic health records between healthcare providers and public health organizations has become an increasingly desirable tool in reducing healthcare costs, improving healthcare quality, and protecting population health. Assuring privacy and security in nationwide sharing of Electronic Health Records (EHR) in an environment such as GRID has become a top challenge and concern. The Centers for Disease Control and Prevention (CDC) and the Science Applications International Corporation (SAIC) have jointly conducted a proof-of-concept study to find and build a common secure and reliable messaging platform (the SRM Platform) to handle this challenge. The SRM Platform is built on the open standards of OASIS, World Wide Web Consortium (W3C) web-services standards, and Web Services Interoperability (WS-I) specifications to provide the secure transport of sensitive EHR or electronic medical records (EMR). Transmitted data may be in any digital form, including text, data, and binary files such as images. This paper identifies the business use cases, architecture, test results, and new connectivity options for disparate health networks among PHIN, NHIN, Grid, and others.
Making Spatial Statistics Service Accessible On Cloud Platform
NASA Astrophysics Data System (ADS)
Mu, X.; Wu, J.; Li, T.; Zhong, Y.; Gao, X.
2014-04-01
Web services can bring together applications running on diverse platforms; users can access and share various data, information and models more effectively and conveniently through a web service platform. Cloud computing has emerged as a paradigm of Internet computing in which dynamic, scalable and often virtualized resources are provided as services. With the rapid growth of massive data and the constraints of network bandwidth, traditional web service platforms face prominent problems such as computational efficiency, maintenance cost and data security. In this paper, we offer a spatial statistics service based on the Microsoft cloud. An experiment was carried out to evaluate the availability and efficiency of this service. The results show that this spatial statistics service is conveniently accessible to the public, with high processing efficiency.
A transmission security framework for email-based telemedicine.
Caffery, Liam J; Smith, Anthony C
2010-01-01
Encryption is used to convert an email message to an unreadable format, thereby securing patient privacy during the transmission of the message across the Internet. Two available means of encryption are: public key infrastructure (PKI) used in conjunction with ordinary email, and secure hypertext transfer protocol (HTTPS) used by secure web-mail applications. Both approaches have advantages and disadvantages in terms of viability, cost, usability and compliance. The aim of this study was to develop an instrument to identify the most appropriate means of encrypting email communication for telemedicine. A multi-method approach was used to construct the instrument. Technical assessments and existing bodies of knowledge regarding the utility of PKI were analyzed, along with survey results from users of Queensland Health's Child and Youth Mental Health Service secure web-mail service. The resultant decision support model identified that the following conditions affect the choice of encryption technology: the correspondent's risk perception, the correspondent's attitude to the security afforded by encryption, the email client used by correspondents, the tolerance to human error, and the availability of technical resources. The decision support model is presented as a flow chart to identify the most appropriate encryption for a specific email-based telemedicine service.
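A flow-chart decision model of this kind reduces naturally to a small predicate over the identified conditions. The function below is a hypothetical sketch of such logic, not the paper's actual instrument: the three boolean questions and the branch order are invented to illustrate the shape of the decision, since PKI over ordinary email demands endpoint key management that only some services can sustain.

```python
# Hypothetical decision-support sketch: choose between PKI over
# ordinary email and a centrally managed secure web-mail (HTTPS)
# service. The questions and threshold logic are illustrative only.

def choose_encryption(uses_standard_email_client: bool,
                      technical_support_available: bool,
                      tolerates_human_error: bool) -> str:
    """Return 'PKI' or 'secure web-mail (HTTPS)' for a telemedicine service."""
    # PKI with ordinary email puts key management at every endpoint,
    # so it only suits services with technical resources and some
    # tolerance for correspondents mishandling keys.
    if (uses_standard_email_client and technical_support_available
            and tolerates_human_error):
        return "PKI"
    # Otherwise HTTPS web-mail keeps the cryptography server-side,
    # away from the correspondents entirely.
    return "secure web-mail (HTTPS)"

print(choose_encryption(True, False, False))  # secure web-mail (HTTPS)
```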
NASA Astrophysics Data System (ADS)
Plessel, T.; Szykman, J.; Freeman, M.
2012-12-01
EPA's Remote Sensing Information Gateway (RSIG) is a widely used free applet and web service for quickly and easily retrieving, visualizing and saving user-specified subsets of atmospheric data - by variable, geographic domain and time range. Petabytes of available data include thousands of variables from a set of NASA and NOAA satellites, aircraft, ground stations and EPA air-quality models. The RSIG applet is used by atmospheric researchers and uses the rsigserver web service to obtain data and images. The rsigserver web service is compliant with the Open Geospatial Consortium Web Coverage Service (OGC-WCS) standard to facilitate data discovery and interoperability. Since rsigserver is publicly accessible, it can be (and is) used by other applications. This presentation describes the architecture and technical implementation details of this successful system with an emphasis on achieving convenience, high performance, data integrity and security.
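An OGC-WCS-compliant service like rsigserver is typically queried with a key-value-pair GetCoverage request. The snippet below sketches how a client might assemble such a URL; the host name, coverage identifier and format string are hypothetical placeholders, not the actual RSIG endpoint or its parameter vocabulary.

```python
# Sketch of building a WCS 1.0.0 KVP GetCoverage request URL,
# selecting a variable, bounding box and time range. Endpoint and
# coverage name below are made up for illustration.
from urllib.parse import urlencode

def wcs_getcoverage_url(base, coverage, bbox, time_range, fmt="NetCDF"):
    params = {
        "SERVICE": "WCS",
        "VERSION": "1.0.0",
        "REQUEST": "GetCoverage",
        "COVERAGE": coverage,
        "BBOX": ",".join(str(v) for v in bbox),  # west,south,east,north
        "TIME": time_range,                      # ISO 8601 interval
        "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)

url = wcs_getcoverage_url("https://example.gov/rsig",
                          "aqs.ozone",
                          (-90.0, 30.0, -80.0, 40.0),
                          "2012-07-01T00:00:00Z/2012-07-02T23:59:59Z")
print(url)
```

Because the interface is a plain HTTP GET, any OGC-WCS client (or a plotting script) can consume the service, which is what makes rsigserver reusable by applications beyond the RSIG applet.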
Bootstrapping and Maintaining Trust in the Cloud
2016-12-01
proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means...IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud ...elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup
17 CFR 248.126 - Delivery of opt out notices.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 17 Commodity and Securities Exchanges 4 2014-04-01 2014-04-01 false Delivery of opt out notices. 248.126 Section 248.126 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION... Internet Web site at which the consumer obtained a product or service electronically and requires the...
17 CFR 248.124 - Reasonable opportunity to opt out.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 17 Commodity and Securities Exchanges 3 2012-04-01 2012-04-01 false Reasonable opportunity to opt out. 248.124 Section 248.124 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION... Internet Web site at which the consumer has obtained a product or service. The consumer acknowledges...
17 CFR 248.124 - Reasonable opportunity to opt out.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 17 Commodity and Securities Exchanges 3 2011-04-01 2011-04-01 false Reasonable opportunity to opt out. 248.124 Section 248.124 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION... Internet Web site at which the consumer has obtained a product or service. The consumer acknowledges...
17 CFR 248.126 - Delivery of opt out notices.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 17 Commodity and Securities Exchanges 3 2012-04-01 2012-04-01 false Delivery of opt out notices. 248.126 Section 248.126 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION... Internet Web site at which the consumer obtained a product or service electronically and requires the...
17 CFR 248.124 - Reasonable opportunity to opt out.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 17 Commodity and Securities Exchanges 4 2014-04-01 2014-04-01 false Reasonable opportunity to opt out. 248.124 Section 248.124 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION... Internet Web site at which the consumer has obtained a product or service. The consumer acknowledges...
17 CFR 248.126 - Delivery of opt out notices.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 17 Commodity and Securities Exchanges 3 2013-04-01 2013-04-01 false Delivery of opt out notices. 248.126 Section 248.126 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION... Internet Web site at which the consumer obtained a product or service electronically and requires the...
17 CFR 248.124 - Reasonable opportunity to opt out.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 17 Commodity and Securities Exchanges 3 2013-04-01 2013-04-01 false Reasonable opportunity to opt out. 248.124 Section 248.124 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION... Internet Web site at which the consumer has obtained a product or service. The consumer acknowledges...
17 CFR 248.126 - Delivery of opt out notices.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 17 Commodity and Securities Exchanges 3 2011-04-01 2011-04-01 false Delivery of opt out notices. 248.126 Section 248.126 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION... Internet Web site at which the consumer obtained a product or service electronically and requires the...
17 CFR 248.124 - Reasonable opportunity to opt out.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 17 Commodity and Securities Exchanges 3 2010-04-01 2010-04-01 false Reasonable opportunity to opt out. 248.124 Section 248.124 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION... Internet Web site at which the consumer has obtained a product or service. The consumer acknowledges...
Empirical analysis of the effects of cyber security incidents.
Davis, Ginger; Garcia, Alfredo; Zhang, Weide
2009-09-01
We analyze the time series associated with web traffic for a representative set of online businesses that have suffered widely reported cyber security incidents. Our working hypothesis is that cyber security incidents may prompt (security conscious) online customers to opt out and conduct their business elsewhere or, at the very least, to refrain from accessing online services. For companies relying almost exclusively on online channels, this presents an important business risk. We test for structural changes in these time series that may have been caused by these cyber security incidents. Our results consistently indicate that cyber security incidents do not affect the structure of web traffic for the set of online businesses studied. We discuss various public policy considerations stemming from our analysis.
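A common way to test for the structural changes this abstract describes is a Chow-type F test comparing residual variance before and after a candidate break date. The sketch below applies it to a mean-only model with made-up traffic figures; the paper's actual model and data are not reproduced here.

```python
# Toy Chow-style structural-break test on a mean-only model:
# compare pooled residual sum of squares against the sum of
# per-segment RSS around a suspected incident date.

def rss(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def chow_f(before, after, k=1):
    """F statistic for a break between the two segments (k parameters)."""
    pooled = rss(before + after)
    split = rss(before) + rss(after)
    dof = len(before) + len(after) - 2 * k
    return ((pooled - split) / k) / (split / dof)

traffic_pre  = [1.0, 2.0, 1.0, 2.0, 1.0, 2.0]      # daily visits (scaled)
traffic_post = [10.0, 11.0, 10.0, 11.0, 10.0, 11.0]
print(chow_f(traffic_pre, traffic_post))  # large F: structure changed
```

A large F (judged against the F distribution with k and dof degrees of freedom) would indicate a break; the paper's finding is that, for the businesses studied, such breaks do not appear after security incidents.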
T-Check in Technologies for Interoperability: Web Services and Security--Single Sign-On
2007-12-01
following tools: • Apache Tomcat 6.0—a Java Servlet container to host the Web services and a simple Web client application [Apache 2007a] • Apache Axis...Eclipse. Eclipse – an open development platform. http://www.eclipse.org/ (2007) [Hunter 2001] Hunter, Jason. Java Servlet Programming, 2nd Edition...Citation SAML 1.1 Java Toolkit SAML Ping Identity’s SAML-1.1 implementation [SourceID 2006] OpenSAML SAML An open source implementation of SAML 1.1
A new information architecture, website and services for the CMS experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Lucas; Rusack, Eleanor; Zemleris, Vidmantas
2012-01-01
The age and size of the CMS collaboration at the LHC means it now has many hundreds of inhomogeneous web sites and services, and hundreds of thousands of documents. We describe a major initiative to create a single coherent CMS internal and public web site. This uses the Drupal web Content Management System (now supported by CERN/IT) on top of a standard LAMP stack (Linux, Apache, MySQL, and php/perl). The new navigation, content and search services are coherently integrated with numerous existing CERN services (CDS, EDMS, Indico, phonebook, Twiki) as well as many CMS internal Web services. We describe the information architecture, the system design, implementation and monitoring, the document and content database, security aspects, and our deployment strategy, which ensured continual smooth operation of all systems at all times.
A new Information Architecture, Website and Services for the CMS Experiment
NASA Astrophysics Data System (ADS)
Taylor, Lucas; Rusack, Eleanor; Zemleris, Vidmantas
2012-12-01
The age and size of the CMS collaboration at the LHC means it now has many hundreds of inhomogeneous web sites and services, and hundreds of thousands of documents. We describe a major initiative to create a single coherent CMS internal and public web site. This uses the Drupal web Content Management System (now supported by CERN/IT) on top of a standard LAMP stack (Linux, Apache, MySQL, and php/perl). The new navigation, content and search services are coherently integrated with numerous existing CERN services (CDS, EDMS, Indico, phonebook, Twiki) as well as many CMS internal Web services. We describe the information architecture; the system design, implementation and monitoring; the document and content database; security aspects; and our deployment strategy, which ensured continual smooth operation of all systems at all times.
NASA Astrophysics Data System (ADS)
Friberg, P. A.; Luis, R. S.; Quintiliani, M.; Lisowski, S.; Hunter, S.
2014-12-01
Recently, a novel set of modules has been included in the open source Earthworm seismic data processing system, supporting the use of web applications. These include the Mole sub-system, for storing relevant event data in a MySQL database (see M. Quintiliani and S. Pintore, SRL, 2013), and an embedded web server, Moleserv, for serving such data to web clients in QuakeML format. These modules have enabled, for the first time in Earthworm, the use of web applications for seismic data processing. Such applications can greatly simplify the operation and maintenance of seismic data processing centers by having one or more servers provide the relevant data, as well as the data processing applications themselves, to client machines running arbitrary operating systems. Web applications with secure online access allow operators to work anywhere, without the often cumbersome and bandwidth-hungry use of secure shell or virtual private networks. Furthermore, web applications can seamlessly access third-party data repositories to acquire additional information, such as maps. Finally, the use of HTML email has brought the possibility of specialized web applications to be used in email clients. This is the case for EWHTMLEmail, which produces event notification emails that are in fact simple web applications for plotting relevant seismic data.
Providing web services as part of Earthworm has enabled a number of other tools as well. One is ISTI's EZ Earthworm, a web-based command and control system for an otherwise command-line-driven system; another is a waveform web service, which serves Earthworm data to additional web clients for plotting, picking, and other web-based processing tools. The current Earthworm waveform web service hosts an advanced plotting capability, providing views of event-based waveforms from a Mole database served by Moleserv. The current trend towards cloud services supported by web applications is driving improvements in JavaScript, CSS and HTML, as well as faster and more efficient web browsers, including mobile ones. It is foreseeable that in the near future web applications will be as powerful and efficient as native applications. The work described here is thus a first step towards bringing the open source Earthworm seismic data processing system to this new paradigm.
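A client of a QuakeML-serving endpoint like Moleserv ultimately just parses namespaced XML. The sketch below extracts origin time and magnitude from a hand-written QuakeML fragment using only the standard library; a real response from the service would contain many more elements and events.

```python
# Minimal sketch of consuming QuakeML: pull each event's origin time
# and magnitude out of the namespaced XML. The SAMPLE fragment is
# hand-written for illustration, not an actual Moleserv response.
import xml.etree.ElementTree as ET

NS = {"q": "http://quakeml.org/xmlns/bed/1.2"}

SAMPLE = """<eventParameters xmlns="http://quakeml.org/xmlns/bed/1.2">
  <event>
    <origin><time><value>2014-03-10T05:18:13Z</value></time></origin>
    <magnitude><mag><value>6.8</value></mag></magnitude>
  </event>
</eventParameters>"""

def summarize_events(quakeml_text):
    root = ET.fromstring(quakeml_text)
    out = []
    for ev in root.findall("q:event", NS):
        t = ev.find("q:origin/q:time/q:value", NS).text
        m = float(ev.find("q:magnitude/q:mag/q:value", NS).text)
        out.append((t, m))
    return out

print(summarize_events(SAMPLE))  # [('2014-03-10T05:18:13Z', 6.8)]
```

Because the payload is standard QuakeML, the same parsing works whether the events come from Mole, from a third-party catalog, or from an email-embedded web application such as EWHTMLEmail.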
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lurie, Gordon
2007-01-02
The cell phone software allows any Java enabled cell phone to view sensor and meteorological data via an internet connection using a secure connection to the CB-EMIS Web Service. Users with appropriate privileges can monitor the state of the sensors and perform simple maintenance tasks remotely. All sensitive data is downloaded from the web service, thus protecting sensitive data in the event a cell phone is lost.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-08
... the proposed rule change is available on the Exchange's Web site at www.nyse.com , at the principal... in a manner to facilitate its distribution via Web sites or mobile devices. \\4\\ See Securities... broadcasters, Web site and mobile device service providers, and others to distribute this data product to their...
Building Multilevel Secure Web Services-Based Components for the Global Information Grid
2006-05-01
NNDC Stand: Activities and Services of the National Nuclear Data Center
NASA Astrophysics Data System (ADS)
Pritychenko, B.; Arcilla, R.; Burrows, T. W.; Dunford, C. L.; Herman, M. W.; McLane, V.; Obložinský, P.; Sonzogni, A. A.; Tuli, J. K.; Winchell, D. F.
2005-05-01
The National Nuclear Data Center (NNDC) collects, evaluates, and disseminates nuclear physics data for basic nuclear research and for applied nuclear technologies, including energy, shielding, medical applications and homeland security. In 2004, to answer the needs of the nuclear data user community, the NNDC completed a project to modernize the data storage and management of its databases and began offering new nuclear data Web services. The principles of database and Web application development, as well as the related nuclear reaction and structure database services, are briefly described.
Evolution of System Architectures: Where Do We Need to Fail Next?
NASA Astrophysics Data System (ADS)
Bermudez, Luis; Alameh, Nadine; Percivall, George
2013-04-01
Innovation requires testing and failing. Thomas Edison was right when he said "I have not failed. I've just found 10,000 ways that won't work". For innovation and improvement of standards to happen, service architectures have to be tested over and over. Within the Open Geospatial Consortium (OGC), testing of service architectures has occurred for the last 15 years. This talk will present the evolution of these service architectures and a possible future path. OGC is a global forum for the collaboration of developers and users of spatial data products and services, and for the advancement and development of international standards for geospatial interoperability. The OGC Interoperability Program is a series of hands-on, fast-paced engineering initiatives to accelerate the development and acceptance of OGC standards. Each initiative is organized in threads that provide focus under a particular theme. The first testbed, OGC Web Services phase 1, completed in 2003, had four threads: Common Architecture, Web Mapping, Sensor Web and Web Imagery Enablement. Common Architecture was a cross-thread theme, ensuring that the Web Mapping and Sensor Web experiments built on a common base architecture. The architecture was based on the three main SOA components: broker, requestor and provider. It proposed a general service model defining service interactions and dependencies; a categorization of service types; registries to allow discovery and access of services; data models and encodings; and common services (WMS, WFS, WCS). For the latter, there was a clear distinction among the different kinds of services: data services (e.g., WMS), application services (e.g., coordinate transformation) and server-side client applications (e.g., image exploitation).
The latest testbed, OGC Web Service phase 9, completed in 2012 had 5 threads: Aviation, Cross-Community Interoperability (CCI), Security and Services Interoperability (SSI), OWS Innovations and Compliance & Interoperability Testing & Evaluation (CITE). Compared to the first testbed, OWS-9 did not have a separate common architecture thread. Instead the emphasis was on brokering information models, securing them and making data available efficiently on mobile devices. The outcome is an architecture based on usability and non-intrusiveness while leveraging mediation of information models from different communities. This talk will use lessons learned from the evolution from OGC Testbed phase 1 to phase 9 to better understand how global and complex infrastructures evolve to support many communities including the Earth System Science Community.
Transactional interactive multimedia banner
NASA Astrophysics Data System (ADS)
Shae, Zon-Yin; Wang, Xiping; von Kaenel, Juerg
2000-05-01
Advertising in TV broadcasting has shown that multimedia is a very effective means to present merchandise and attract shoppers. This has been applied to the Web by including animated multimedia banner ads on web pages. However, the issues of coupling interactive browsing, shopping, and secure transactions, e.g. from inside a multimedia banner, have only recently started to be explored. Currently there is an explosively growing number of back-end services available on the Internet (e.g., business-to-business (B2B) commerce, business-to-consumer (B2C) commerce, and infomercial services). These services are mostly accessible through static HTML web pages at a few specific web portals. In this paper, we investigate the feasibility of using interactive multimedia banners as a pervasive access point for B2C, B2B, and infomercial services. We present a system architecture that involves a layer of middleware agents functioning as the bridge between the interactive multimedia banners and back-end services.
Combining Domain-driven Design and Mashups for Service Development
NASA Astrophysics Data System (ADS)
Iglesias, Carlos A.; Fernández-Villamor, José Ignacio; Del Pozo, David; Garulli, Luca; García, Boni
This chapter presents the Romulus project approach to Service Development using Java-based web technologies. Romulus aims at improving productivity of service development by providing a tool-supported model to conceive Java-based web applications. This model follows a Domain Driven Design approach, which states that the primary focus of software projects should be the core domain and domain logic. Romulus proposes a tool-supported model, Roma Metaframework, that provides an abstraction layer on top of existing web frameworks and automates the application generation from the domain model. This metaframework follows an object centric approach, and complements Domain Driven Design by identifying the most common cross-cutting concerns (security, service, view, ...) of web applications. The metaframework uses annotations for enriching the domain model with these cross-cutting concerns, so-called aspects. In addition, the chapter presents the usage of mashup technology in the metaframework for service composition, using the web mashup editor MyCocktail. This approach is applied to a scenario of the Mobile Phone Service Portability case study for the development of a new service.
Integrating geo web services for a user driven exploratory analysis
NASA Astrophysics Data System (ADS)
Moncrieff, Simon; Turdukulov, Ulanbek; Gulland, Elizabeth-Kate
2016-04-01
In data exploration, several online data sources may need to be dynamically aggregated or summarised over a spatial region, time interval, or set of attributes. With respect to thematic data, web services are mainly used to present results, leading to a supplier-driven service model that limits exploration of the data. In this paper we propose a user-need-driven service model based on geo web processing services. The aim of the framework is to provide a method for scalable and interactive access to various geographic data sources on the web. The architecture combines a data query, a processing technique and a visualisation methodology to rapidly integrate and visually summarise properties of a dataset. We illustrate the environment with a health-related use case that derives the Age Standardised Rate - a dynamic index that requires integrating existing interoperable web services for demographic data with standalone, non-spatial secure database servers used in health research. Although the example is specific to the health field, the architecture and the proposed approach are relevant and applicable to other fields that require integration and visualisation of geo datasets from various web services, and thus, we believe, the approach is generic.
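The Age Standardised Rate derived in the use case is, at its core, a weighted average of age-specific rates using a standard (reference) population as weights. The sketch below computes a directly standardised rate per 100,000 with made-up numbers for three age bands; the actual age bands and standard population used in the paper are not reproduced here.

```python
# Direct age standardisation: weight each age band's crude rate by a
# standard population, so rates are comparable across regions with
# different age structures. All figures below are illustrative.

def age_standardised_rate(cases, populations, standard_pop):
    """Directly standardised rate per 100,000 across age bands."""
    weighted = sum(std * (c / p)
                   for c, p, std in zip(cases, populations, standard_pop))
    return 100_000 * weighted / sum(standard_pop)

cases       = [10, 40, 90]              # events observed per age band
populations = [50_000, 40_000, 10_000]  # local population per band
standard    = [60_000, 30_000, 10_000]  # reference age structure

print(round(age_standardised_rate(cases, populations, standard), 1))  # 132.0
```

In the proposed architecture, the cases and populations would arrive from separate web services (secure health databases and demographic services respectively), with a processing service performing exactly this aggregation on demand.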
Dynamic Construction Scheme for Virtualization Security Service in Software-Defined Networks
Lin, Zhaowen; Tao, Dan; Wang, Zhenji
2017-01-01
For a Software Defined Network (SDN), security is an important factor affecting its large-scale deployment. Existing security solutions for SDN mainly focus on the controller itself, which has to handle all security protection tasks by using the programmability of the network. This undoubtedly places a heavy burden on the controller; more devastatingly, once the controller itself is attacked, the entire network is paralyzed. Motivated by this, this paper proposes a novel security protection architecture for SDN. We design a security service orchestration center in the control plane of SDN; this center is physically decoupled from the SDN controller and constructs SDN security services. We adopt virtualization technology to construct a security meta-function library, and propose a dynamic security service composition algorithm based on web service composition technology. A rule-combining method is used to combine security meta-functions into security services which meet the requirements of users. Moreover, the RETE algorithm is introduced to improve the efficiency of the rule-combining method. We evaluate our solutions in a realistic scenario based on OpenStack. Substantial experimental results demonstrate the effectiveness of our solutions, which achieve effective security protection while placing only a small burden on the SDN controller. PMID:28430155
Dynamic Construction Scheme for Virtualization Security Service in Software-Defined Networks.
Lin, Zhaowen; Tao, Dan; Wang, Zhenji
2017-04-21
For a Software Defined Network (SDN), security is an important factor affecting its large-scale deployment. Existing security solutions for SDN mainly focus on the controller itself, which has to handle all security protection tasks by using the programmability of the network. This undoubtedly places a heavy burden on the controller; more devastatingly, once the controller itself is attacked, the entire network is paralyzed. Motivated by this, this paper proposes a novel security protection architecture for SDN. We design a security service orchestration center in the control plane of SDN; this center is physically decoupled from the SDN controller and constructs SDN security services. We adopt virtualization technology to construct a security meta-function library, and propose a dynamic security service composition algorithm based on web service composition technology. A rule-combining method is used to combine security meta-functions into security services which meet the requirements of users. Moreover, the RETE algorithm is introduced to improve the efficiency of the rule-combining method. We evaluate our solutions in a realistic scenario based on OpenStack. Substantial experimental results demonstrate the effectiveness of our solutions, which achieve effective security protection while placing only a small burden on the SDN controller.
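The rule-combining idea described above can be caricatured in a few lines: rules map each requested protection goal to the meta-functions that must appear in the composed service chain. Everything below (the meta-function names, the goals, the ordering convention) is invented for illustration; the paper's RETE-based engine matches rules far more generally than this lookup does.

```python
# Toy rule-based composition of security meta-functions into a
# service chain. Names and rules are hypothetical placeholders.

META_LIBRARY = ["packet_filter", "dpi", "rate_limiter", "tls_proxy"]

RULES = {
    "block-scanning":  ["packet_filter"],
    "inspect-payload": ["packet_filter", "dpi"],
    "mitigate-dos":    ["rate_limiter"],
    "encrypt-edge":    ["tls_proxy"],
}

def compose_service(goals):
    """Combine meta-functions for the requested goals, de-duplicated
    and kept in library order so the chain is deterministic."""
    needed = {f for g in goals for f in RULES[g]}
    return [f for f in META_LIBRARY if f in needed]

print(compose_service(["inspect-payload", "mitigate-dos"]))
# ['packet_filter', 'dpi', 'rate_limiter']
```

Keeping this composition in an orchestration center outside the controller is precisely what lets the architecture above offload security work from the SDN controller itself.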
Data Mining for Web-Based Support Systems: A Case Study in e-Custom Systems
NASA Astrophysics Data System (ADS)
Razmerita, Liana; Kirchner, Kathrin
This chapter provides an example of a Web-based support system (WSS) used to streamline trade procedures, prevent potential security threats, and reduce tax-related fraud in cross-border trade. The architecture is based on a service-oriented architecture that includes smart seals and Web services. We discuss the implications and suggest further enhancements to demonstrate how such systems can evolve into Web-based decision support systems with the support of data mining methods. We provide a concrete example of how data mining can help to analyze the vast amount of data collected while monitoring a container's movements along its supply chain.
Access Control of Web- and Java-Based Applications
NASA Technical Reports Server (NTRS)
Tso, Kam S.; Pajevski, Michael J.
2013-01-01
Cybersecurity has become a great concern as threats of service interruption, unauthorized access, stealing and altering of information, and spreading of viruses have become more prevalent and serious. Application-layer access control is a critical component in the overall security solution, which also includes encryption, firewalls, virtual private networks, antivirus, and intrusion detection. An access control solution, based on an open-source access manager augmented with custom software components, was developed to provide protection to both Web-based and Java-based client and server applications. The DISA Security Service (DISA-SS) provides common access control capabilities for AMMOS software applications through a set of application programming interfaces (APIs) and network-accessible security services for authentication, single sign-on, authorization checking, and authorization policy management. The OpenAM access management technology designed for Web applications can be extended to meet the needs of Java thick clients and standalone servers that are commonly used in the JPL AMMOS environment. The DISA-SS reusable components have greatly reduced the effort for each AMMOS subsystem to develop its own access control strategy. The novelty of this work is that it leverages an open-source access management product that was designed for Web-based applications to provide access control for Java thick clients and Java standalone servers. Thick clients and standalone servers are still commonly used in businesses and government, especially for applications that require rich graphical user interfaces and high-performance visualization that cannot be met by thin clients running on Web browsers.
Marketing and Selling CD-ROM Products on the World-Wide Web.
ERIC Educational Resources Information Center
Walker, Becki
1995-01-01
Describes three companies' approaches to marketing and selling CD-ROM products on the World Wide Web. Benefits include low overhead for Internet-based sales, allowance for creativity, and ability to let customers preview products online. Discusses advertising, information delivery, content, information services, and security. (AEF)
A demanding web-based PACS supported by web services technology
NASA Astrophysics Data System (ADS)
Costa, Carlos M. A.; Silva, Augusto; Oliveira, José L.; Ribeiro, Vasco G.; Ribeiro, José
2006-03-01
During the last years, the ubiquity of web interfaces has pushed practically all PACS suppliers to develop client applications through which clinical practitioners can receive and analyze medical images, using conventional personal computers and Web browsers. However, due to security and performance issues, the utilization of these software packages has been restricted to Intranets. Paradoxically, one of the most important advantages of digital image systems is to simplify the widespread sharing and remote access of medical data between healthcare institutions. This paper analyses the traditional PACS drawbacks that contribute to their reduced usage over the Internet and describes a PACS based on Web Services technology that supports a customized DICOM encoding syntax and a specific compression scheme, providing all historical patient data in a unique Web interface.
The research of network database security technology based on web service
NASA Astrophysics Data System (ADS)
Meng, Fanxing; Wen, Xiumei; Gao, Liting; Pang, Hui; Wang, Qinglin
2013-03-01
Database technology is one of the most widely applied computer technologies, and its security is becoming more and more important. This paper introduces database security and the security levels of network databases, studies the security technology of the network database with particular emphasis on a sub-key encryption algorithm, and successfully applies this algorithm to a campus one-card system. The realization process of the encryption algorithm is discussed; the method can serve as a reference in many fields, particularly in management information system security and e-commerce.
Symmetric Key Services Markup Language (SKSML)
NASA Astrophysics Data System (ADS)
Noor, Arshad
Symmetric Key Services Markup Language (SKSML) is the eXtensible Markup Language (XML) being standardized by the OASIS Enterprise Key Management Infrastructure Technical Committee for requesting and receiving symmetric encryption cryptographic keys within a Symmetric Key Management System (SKMS). This protocol is designed to be used between clients and servers within an Enterprise Key Management Infrastructure (EKMI) to secure data, independent of the application and platform. Building on many security standards such as XML Signature, XML Encryption, Web Services Security and PKI, SKSML provides standards-based capability to allow any application to use symmetric encryption keys, while maintaining centralized control. This article describes the SKSML protocol and its capabilities.
Secure electronic commerce communication system based on CA
NASA Astrophysics Data System (ADS)
Chen, Deyun; Zhang, Junfeng; Pei, Shujun
2001-07-01
In this paper, we introduce the state of electronic commerce security and then analyze the working process and security of the SSL protocol. Finally, we propose a secure electronic commerce communication system based on a CA. The system provides secure services such as encryption, integrity, peer authentication, and non-repudiation for the application-layer communication software of browser clients and web servers. The system implements automatic key allocation and unified key management by setting up the CA in the network.
Efficient Data Transfer Rate and Speed of Secured Ethernet Interface System.
Ghanti, Shaila; Naik, G M
2016-01-01
Embedded systems are extensively used in home automation systems, small office systems, vehicle communication systems, and health service systems. The services provided by these systems are available on the Internet and need to be protected. Security features like IP filtering, UDP protection, or TCP protection need to be implemented depending on the specific application used by the device. Every device on the Internet must have a network interface. This paper proposes the design of an embedded Secured Ethernet Interface System to protect services available on the Internet against the SYN flood attack. In this experimental study, the Secured Ethernet Interface System is customized to protect a web service against the SYN flood attack. The Secured Ethernet Interface System is implemented on an ALTERA Stratix IV FPGA as a system on chip and uses a modified SYN flood attack protection method. The experimental results indicate an increase in the number of genuine clients getting service from the server, considerable improvement in the data transfer rate, and better response time during the SYN flood attack. PMID:28116350
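The abstract does not detail its "modified SYN flood attack protection method", but one common software-level defence it belongs to the family of — limiting half-open connection attempts per source — can be sketched as follows. This is an illustrative assumption, not the paper's FPGA design; the class name and thresholds are invented.

```python
# Illustrative sketch (not the paper's method): cap the number of TCP SYNs
# accepted from each source IP within a sliding time window.
import time
from collections import defaultdict

class SynRateLimiter:
    """Allow at most `max_syns` SYNs per source IP per `window_s` seconds."""

    def __init__(self, max_syns, window_s):
        self.max_syns = max_syns
        self.window_s = window_s
        self.history = defaultdict(list)   # src IP -> recent SYN timestamps

    def allow(self, src_ip, now=None):
        now = time.monotonic() if now is None else now
        # Keep only the SYNs still inside the sliding window.
        recent = [t for t in self.history[src_ip] if now - t < self.window_s]
        if len(recent) >= self.max_syns:
            self.history[src_ip] = recent
            return False                   # drop: source exceeded its SYN budget
        recent.append(now)
        self.history[src_ip] = recent
        return True

limiter = SynRateLimiter(max_syns=3, window_s=1.0)
print([limiter.allow("1.2.3.4", now=0.0) for _ in range(4)])  # → [True, True, True, False]
```

Real mitigations (SYN cookies, hardware filtering as in the paper) avoid per-source state entirely; the sketch only conveys the filtering idea at the network interface.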
Adopting and adapting a commercial view of web services for the Navy
NASA Astrophysics Data System (ADS)
Warner, Elizabeth; Ladner, Roy; Katikaneni, Uday; Petry, Fred
2005-05-01
Web Services are being adopted as the enabling technology to provide net-centric capabilities for many Department of Defense operations. The Navy Enterprise Portal, for example, is Web Services-based, and the Department of the Navy is promulgating guidance for developing Web Services. Web Services, however, only constitute a baseline specification that provides the foundation on which users, under current approaches, write specialized applications in order to retrieve data over the Internet. Application development may increase dramatically as the number of different available Web Services increases. Reasons for specialized application development include XML schema versioning differences, adoption/use of diverse business rules, security access issues, and time/parameter naming constraints, among others. We are currently developing for the US Navy a system which will improve delivery of timely and relevant meteorological and oceanographic (MetOc) data to the warfighter. Our objective is to develop an Advanced MetOc Broker (AMB) that leverages Web Services technology to identify, retrieve and integrate relevant MetOc data in an automated manner. The AMB will utilize a Mediator, which will be developed by applying ontological research and schema matching techniques to MetOc forms of data. The AMB, using the Mediator, will support a new, advanced approach to the use of Web Services; namely, the automated identification, retrieval and integration of MetOc data. Systems based on this approach will then not require extensive end-user application development for each Web Service from which data can be retrieved. Users anywhere on the globe will be able to receive timely environmental data that fits their particular needs.
Cryptography for a High-Assurance Web-Based Enterprise
2013-10-01
2. Other Cryptographic services - Java provides many cryptographic services through the Java Cryptography Architecture (JCA) framework. The...id=2125 [7]. Miller, Sandra Kay, Fiber Optic Networks Vulnerable to Attack, Information Security Magazine, November 15, 2006, [8]. José R.C
U.S.-Mexican Security Cooperation: the Merida Initiative and Beyond
2010-07-29
Department to USAID for implementation. 71 “Cárteles Perturban al Sistema Carcelario,” El Universal, June 18, 2010. 72 Silvia Otero, “No Investigan 95...a Web .” Milenio. July 28, 2010. U.S.-Mexican Security Cooperation: the Mérida Initiative and Beyond Congressional Research Service 27 Similar
U.S.-Mexican Security Cooperation: the Merida Initiative and Beyond
2010-08-16
2010, those funds had yet to be transferred from the State Department to USAID for implementation. 71 “Cárteles Perturban al Sistema Carcelario,” El...Quejas a Web .” Milenio. July 28, 2010. U.S.-Mexican Security Cooperation: the Mérida Initiative and Beyond Congressional Research Service 27
NASA Astrophysics Data System (ADS)
Navarro-Arribas, Guillermo; Garcia-Alfaro, Joaquin
Web browsers are becoming the universal interface to reach applications and services. Different browsing contexts may be required in order to reach them, e.g., use of VPN tunnels, corporate proxies, anonymisers, etc. By browsing context we mean how the user browses the Web, including mainly the concrete configuration of the browser. When the context of the browser changes, its security requirements also change. In this work, we present the use of authorisation policies to automate the process of controlling the resources of a Web browser when its context changes. Our proposal is oriented towards easing the adaptation to the security requirements of the new context and enforcing them in the browser without the need for user intervention. We present a concrete application of our work as a plug-in for the adaptation of security requirements in the Mozilla/Firefox browser when a context of anonymous navigation through the Tor network is enabled.
Testing in Service-Oriented Environments
2010-03-01
software releases (versions, service packs, vulnerability patches) for one common ESB during the 13-month period from January 1, 2008 through...impact on quality of service: Unlike traditional software components, a single instance of a web service can be used by multiple consumers. Since the...distributed, with heterogeneous hardware and software (SOA infrastructure, services, operating systems, and databases). Because of cost and security, it
Immune Inspired Security Approach for Manets: a Case Study
NASA Astrophysics Data System (ADS)
Mohamed, Yasir Abdelgadir
2011-06-01
This paper extends work that has been established earlier, in which an immune-inspired approach for securing mobile ad hoc networks is specified. Although the research scope there is wireless networks in general and hybrid mobile ad hoc networks in particular, specifying the security system for one of the communications applications that needs a further security approach may help in understanding how effectively the system can contribute to this vital and important network sector. Security in this type of network is important and controversial, as it plays a key role in users' eagerness or reluctance to use the services provided by these networks. In this paper, the immune-inspired security system is specified to secure web services in converged networks.
Della Mea, V; Beltrami, C A
2000-01-01
The last five years' experience has definitely demonstrated the possible applications of the Internet for telepathology. They may be listed as follows: (a) teleconsultation via multimedia e-mail; (b) teleconsultation via web-based tools; (c) distance education by means of the World Wide Web; (d) virtual microscope management through Web and Java interfaces; (e) real-time consultations through Internet-based videoconferencing. Such applications have led to the recognition of some important limits of the Internet when dealing with telemedicine: (i) no guarantees on quality of service (QoS); (ii) inadequate security and privacy; (iii) for some countries, low bandwidth and thus low responsiveness for real-time applications. Currently, there are several innovations in the world of the Internet. Different initiatives have been aimed at improving the Internet protocols in order to provide quality of service, multimedia support, security, and other advanced services, together with greater bandwidth. The forthcoming Internet improvements, although driven by electronic commerce, video on demand, and other commercial needs, are also of real interest for telemedicine, because they address the limits currently slowing down the use of the Internet. When such new services become available, telepathology applications may switch quickly from research to daily practice.
76 FR 51044 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
.... Project: SAMHSA SOAR Web-Based Data Form--NEW In 2009 the Substance Abuse and Mental Health Services... states. SOAR's primary objective is to improve the allowance rate for Social Security Administration (SSA... SAMHSA's direction developed a web-based data form that case managers can use to track the progress of...
Ultrabroadband photonic internet: safety aspects
NASA Astrophysics Data System (ADS)
Kalicki, Arkadiusz; Romaniuk, Ryszard
2008-11-01
Web applications have become the most popular medium on the Internet. This popularity and the ease of use of web application frameworks, combined with careless development, result in a high number of vulnerabilities and attacks. Several types of attacks are possible because of improper input validation. SQL injection is the ability to execute arbitrary SQL queries in a database through an existing application. Cross-site scripting (XSS) is a vulnerability that allows malicious web users to inject code into web pages viewed by other users. Cross-site request forgery (CSRF) is an attack that tricks the victim into loading a page that contains a malicious request. Web spam also affects blogs. There are several techniques to mitigate these attacks. Most important are strong web application design, correct input validation, defined data types for each field, and parameterized statements in SQL queries. Server hardening with a firewall, modern security policy systems, and a safe web framework interpreter configuration are essential. It is advisable to maintain a proper security level on the client side, keep software updated, and install a personal web firewall or IDS/IPS system. Good habits include logging out from services just after finishing work and using a separate web browser for the most important sites, such as e-banking.
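The parameterized-statement defence this abstract recommends can be shown in a few lines. The sketch below uses Python's standard sqlite3 driver; the table and data are invented for illustration.

```python
# Minimal illustration of parameterized SQL statements, the primary
# SQL injection defence. Table and rows here are made up for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup(name):
    # UNSAFE alternative: building the query by string concatenation would
    # let an input like "' OR '1'='1" rewrite the WHERE clause.
    # SAFE: the driver binds the value; the input is never parsed as SQL.
    cur = conn.execute("SELECT secret FROM users WHERE name = ?", (name,))
    return cur.fetchall()

print(lookup("alice"))          # → [('s3cret',)]
print(lookup("' OR '1'='1"))    # → []  (injection string is treated as plain data)
```

The placeholder syntax varies by driver (`?` in sqlite3, `%s` in psycopg2, named parameters elsewhere), but the principle — values are bound, never interpolated into the query text — is the same everywhere.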
Orchestrating BMD Control in Extended BPEL
2008-05-21
Orchestration of secure WebMail, Technical Report ISE-TR-06-08, George Mason University, Fairfax, VA, August 2006. [9] E. Christensen, F. Curbera...methods to access and dissemination control, securing circuit switched (SS7) and IP based telecommunication (VoIP) systems, multimedia, security...decorating the Business Process Execution Language (BPEL) with Quality of Service (QoS), Measures of Performance (MoP), Measures of Effectiveness (MoE
Globus Identity, Access, and Data Management: Platform Services for Collaborative Science
NASA Astrophysics Data System (ADS)
Ananthakrishnan, R.; Foster, I.; Wagner, R.
2016-12-01
Globus is software-as-a-service for research data management, developed at, and operated by, the University of Chicago. Globus, accessible at www.globus.org, provides high speed, secure file transfer; file sharing directly from existing storage systems; and data publication to institutional repositories. 40,000 registered users have used Globus to transfer tens of billions of files totaling hundreds of petabytes between more than 10,000 storage systems within campuses and national laboratories in the US and internationally. Web, command line, and REST interfaces support both interactive use and integration into applications and infrastructures. An important component of the Globus system is its foundational identity and access management (IAM) platform service, Globus Auth. Both Globus research data management and other applications use Globus Auth for brokering authentication and authorization interactions between end-users, identity providers, resource servers (services), and a range of clients, including web, mobile, and desktop applications, and other services. Compliant with important standards such as OAuth, OpenID, and SAML, Globus Auth provides mechanisms required for an extensible, integrated ecosystem of services and clients for the research and education community. It underpins projects such as the US National Science Foundation's XSEDE system, NCAR's Research Data Archive, and the DOE Systems Biology Knowledge Base. Current work is extending Globus services to be compliant with FEDRAMP standards for security assessment, authorization, and monitoring for cloud services. We will present Globus IAM solutions and give examples of Globus use in various projects for federated access to resources. We will also describe how Globus Auth and Globus research data management capabilities enable rapid development and low-cost operations of secure data sharing platforms that leverage Globus services and integrate them with local policy and security.
NASA Technical Reports Server (NTRS)
Sinderson, Elias; Magapu, Vish; Mak, Ronald
2004-01-01
We describe the design and deployment of the middleware for the Collaborative Information Portal (CIP), a mission-critical J2EE application developed for NASA's 2003 Mars Exploration Rover mission. CIP enabled mission personnel to access data and images sent back from Mars, staff and event schedules, broadcast messages, and clocks displaying various Earth and Mars time zones. We developed the CIP middleware in less than two years' time using cutting-edge technologies, including EJBs, servlets, JDBC, JNDI and JMS. The middleware was designed as a collection of independent, hot-deployable web services, providing secure access to back-end file systems and databases. Throughout the middleware we enabled crosscutting capabilities such as runtime service configuration, security, logging, and remote monitoring. This paper presents our approach to mitigating the challenges we faced, concluding with a review of the lessons we learned from this project and noting what we'd do differently and why.
Subotic-Kerry, Mirjana; King, Catherine; O'Moore, Kathleen; Achilles, Melinda; O'Dea, Bridianne
2018-03-23
Anxiety disorders and depression are prevalent among youth. General practitioners (GPs) are often the first point of professional contact for treating health problems in young people. A Web-based mental health service delivered in partnership with schools may facilitate increased access to psychological care among adolescents. However, for such a model to be implemented successfully, GPs' views need to be measured. This study aimed to examine the needs and attitudes of GPs toward a Web-based mental health service for adolescents, and to identify the factors that may affect the provision of this type of service and likelihood of integration. Findings will inform the content and overall service design. GPs were interviewed individually about the proposed Web-based service. Qualitative analysis of transcripts was performed using thematic coding. A short follow-up questionnaire was delivered to assess background characteristics, level of acceptability, and likelihood of integration of the Web-based mental health service. A total of 13 GPs participated in the interview and 11 completed a follow-up online questionnaire. Findings suggest strong support for the proposed Web-based mental health service. A wide range of factors were found to influence the likelihood of GPs integrating a Web-based service into their clinical practice. Coordinated collaboration with parents, students, school counselors, and other mental health care professionals were considered important by nearly all GPs. Confidence in Web-based care, noncompliance of adolescents and GPs, accessibility, privacy, and confidentiality were identified as potential barriers to adopting the proposed Web-based service. GPs were open to a proposed Web-based service for the monitoring and management of anxiety and depression in adolescents, provided that a collaborative approach to care is used, the feedback regarding the client is clear, and privacy and security provisions are assured. 
©Mirjana Subotic-Kerry, Catherine King, Kathleen O'Moore, Melinda Achilles, Bridianne O'Dea. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 23.03.2018.
Scalable and Precise Abstraction of Programs for Trustworthy Software
2017-01-01
calculus for core Java. • 14 months: A systematic abstraction of core Java. • 18 months: A security auditor for core Java. • 24 months: A contract... auditor for full Java. • 42 months: A web-deployed service for security auditing. Approved for Public Release; Distribution Unlimited 4 4.0 RESULTS
2016-04-30
software (OSS) and proprietary (CSS) software elements or remote services (Scacchi, 2002, 2010), eventually including recent efforts to support Web ...specific platforms, including those operating on secured Web /mobile devices. Common Development Technology provides AC development tools and common...transition to OA systems and OSS software elements, specifically for Web and Mobile devices within the realm of C3CB. OA, Open APIs, OSS, and CSS OA
Building Trust Through Secure Web Sites. The Systems Librarian
ERIC Educational Resources Information Center
Breeding, Marshall
2005-01-01
Who can be trusted on the Web? These days, with identity theft seemingly rampant, it's more important than ever to take all possible measures to protect privacy and to shield personal information from those who might not have good intentions. Today, librarians also have to take reasonable precautions to ensure that the online services that they…
Koutelakis, George V.; Anastassopoulos, George K.; Lymberopoulos, Dimitrios K.
2012-01-01
Multiprotocol medical imaging communication through the Internet is more flexible than the tight DICOM transfers. This paper introduces a modular multiprotocol teleradiology architecture that integrates DICOM and common Internet services (based on web, FTP, and E-mail) into a unique operational domain. The extended WADO service (a web extension of DICOM) and the other proposed services allow access to all levels of the DICOM information hierarchy as opposed to solely the Object level. A lightweight client site is considered adequate, because the server site of the architecture provides clients with service interfaces through the web as well as invulnerable space for temporary storage, called User Domains, so that users fulfill their applications' tasks. The proposed teleradiology architecture is pilot-implemented using mainly Java-based technologies and is evaluated by engineers in collaboration with doctors. The new architecture ensures flexibility in access, user mobility, and enhanced data security. PMID:22489237
reCAPTCHA: human-based character recognition via Web security measures.
von Ahn, Luis; Maurer, Benjamin; McMillen, Colin; Abraham, David; Blum, Manuel
2008-09-12
CAPTCHAs (Completely Automated Public Turing test to tell Computers and Humans Apart) are widespread security measures on the World Wide Web that prevent automated programs from abusing online services. They do so by asking humans to perform a task that computers cannot yet perform, such as deciphering distorted characters. Our research explored whether such human effort can be channeled into a useful purpose: helping to digitize old printed material by asking users to decipher scanned words from books that computerized optical character recognition failed to recognize. We showed that this method can transcribe text with a word accuracy exceeding 99%, matching the guarantee of professional human transcribers. Our apparatus is deployed in more than 40,000 Web sites and has transcribed over 440 million words.
EntrezAJAX: direct web browser access to the Entrez Programming Utilities.
Loman, Nicholas J; Pallen, Mark J
2010-06-21
Web applications for biology and medicine often need to integrate data from the Entrez services provided by the National Center for Biotechnology Information. However, direct access to Entrez from a web browser is not possible due to 'same-origin' security restrictions. The use of Asynchronous JavaScript and XML (AJAX) to create rich, interactive web applications is now commonplace. The ability to access Entrez via AJAX would be advantageous in the creation of integrated biomedical web resources. We describe EntrezAJAX, which provides access to Entrez eUtils and is able to circumvent same-origin browser restrictions. EntrezAJAX is easily implemented by JavaScript developers and provides functionality identical to Entrez eUtils, as well as enhanced functionality to ease development. We provide easy-to-understand developer examples written in JavaScript to illustrate potential uses of this service. For the purposes of speed, reliability and scalability, EntrezAJAX has been deployed on Google App Engine, a freely available cloud service. The EntrezAJAX webpage is located at http://entrezajax.appspot.com/
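One classic way a proxy service can sidestep same-origin restrictions is JSONP-style wrapping: the server returns the JSON payload wrapped in a caller-named JavaScript callback, so the browser can load it via a script tag rather than a blocked cross-origin XMLHttpRequest. The sketch below illustrates only that wrapping step; it is an assumption for illustration, not EntrezAJAX's actual response format.

```python
# Sketch of JSONP wrapping as used by cross-origin proxy services.
# (Illustrative only; the real EntrezAJAX response format may differ.)
import json

def jsonp_wrap(payload, callback):
    """Serialize `payload` and wrap it in the client-supplied callback name."""
    if not callback.isidentifier():      # crude guard against script injection
        raise ValueError("invalid callback name")
    return f"{callback}({json.dumps(payload)});"

print(jsonp_wrap({"db": "pubmed", "count": 2}, "handleEntrez"))
# → handleEntrez({"db": "pubmed", "count": 2});
```

Modern services would typically use CORS headers (`Access-Control-Allow-Origin`) instead of JSONP, but the proxy idea — a server the browser *can* reach fetching data from one it cannot — is the same.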
TOKEN: Trustable Keystroke-Based Authentication for Web-Based Applications on Smartphones
NASA Astrophysics Data System (ADS)
Nauman, Mohammad; Ali, Tamleek
Smartphones are increasingly being used to store personal information as well as to access sensitive data from the Internet and the cloud. Establishment of the identity of a user requesting information from smartphones is a prerequisite for secure systems in such scenarios. In the past, keystroke-based user identification has been successfully deployed on production-level mobile devices to mitigate the risks associated with naïve username/password based authentication. However, these approaches have two major limitations: they are not applicable to services where authentication occurs outside the domain of the mobile device - such as web-based services; and they often overly tax the limited computational capabilities of mobile devices. In this paper, we propose a protocol for keystroke dynamics analysis which allows web-based applications to make use of remote attestation and delegated keystroke analysis. The end result is an efficient keystroke-based user identification mechanism that strengthens traditional password protected services while mitigating the risks of user profiling by collaborating malicious web services.
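Keystroke-dynamics analysis of the kind this abstract builds on typically compares timing features — key hold times and inter-key gaps — against an enrolled profile. The toy sketch below shows that idea only; the feature set, threshold rule, and all timings are invented for illustration and bear no relation to the paper's protocol.

```python
# Toy sketch of keystroke-dynamics matching (not the paper's protocol).
# Features and tolerance are illustrative assumptions.

def dwell_and_flight(press, release):
    """press/release: lists of key-down / key-up timestamps (seconds).
    Returns hold times per key plus gaps between consecutive presses."""
    holds = [r - p for p, r in zip(press, release)]
    flights = [p2 - p1 for p1, p2 in zip(press, press[1:])]
    return holds + flights

def matches_profile(sample, profile, tolerance=0.05):
    """Accept if every timing feature is within `tolerance` seconds of the
    enrolled profile (a deliberately naive all-or-nothing rule)."""
    return len(sample) == len(profile) and all(
        abs(s - p) <= tolerance for s, p in zip(sample, profile)
    )

profile = dwell_and_flight([0.00, 0.30, 0.55], [0.08, 0.39, 0.66])
attempt = dwell_and_flight([0.00, 0.31, 0.57], [0.09, 0.40, 0.68])
print(matches_profile(attempt, profile))  # → True (timings close to the profile)
```

Production systems use statistical or machine-learned models over many enrollment samples rather than a fixed per-feature tolerance; the paper's contribution is moving such analysis off the device via remote attestation, which this sketch does not attempt to show.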
Consumer trust to a Web site: moderating effect of attitudes toward online shopping.
San Martín, Sonia; Camarero, Carmen
2008-10-01
In this paper, the authors suggest a model that reflects the role played by Web site characteristics and the previous level of satisfaction as determinant factors of trust in a Web site. They also consider the moderating effects of consumers' motives and inhibitors to purchase online. Results show that satisfaction with previous purchases, the Web site's security and privacy policies, and service quality are the main determinants of trust. The motives and inhibitors that individuals perceive when buying online also determine the type of signals they consider in order to trust.
2015-09-01
interface. 15. SUBJECT TERMS smartphone, HDPT, global graph, DSPro, ozone widget framework, distributed common ground system, web service 16. SECURITY...Lee M. Lessons learned with a global graph and ozone widget framework (OWF) testbed. Aberdeen Proving Ground (MD): Army Research Laboratory (US); 2013
How Homeland Security Affects Spatial Information
ERIC Educational Resources Information Center
Zellmer, Linda
2004-01-01
A recent article in SecurityFocus described how several U.S. government buildings in Washington, DC, could no longer be clearly seen by people using MapQuest's aerial photo database. In addition, the photos of these buildings were altered at the Web sites where they are posted, at the request of the U.S. Secret Service. This is an…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-16
... provisions of the Social Security Act (the Act) and Public Health Service Act. We also issue various manuals..., there is a 3-month lapse between the information available on the Web site and information covered by... notice that provided only Web links to the addenda, or provide this information on a newly- created CMS...
Issues in implementing services for a wireless web-enabled digital camera
NASA Astrophysics Data System (ADS)
Venkataraman, Shyam; Sampat, Nitin; Fisher, Yoram; Canosa, John; Noel, Nicholas
2001-05-01
The competition in the exploding digital photography market has caused vendors to explore new ways to increase their return on investment. A common view among industry analysts is that increasingly it will be services provided by these cameras, and not the cameras themselves, that will provide the revenue stream. These services will be coupled to e-appliance-based communities. In addition, the rapidly increasing need to upload images to the Internet for photo-finishing services as well as the need to download software upgrades to the camera is driving many camera OEMs to evaluate the benefits of using the wireless web to extend their enterprise systems. Currently, creating a viable e-appliance such as a digital camera coupled with a wireless web service requires more than just a competency in product development. This paper will evaluate the system implications in the deployment of recurring revenue services and enterprise connectivity of a wireless, web-enabled digital camera. These include, among other things, an architectural design approach for services such as device management, synchronization, billing, connectivity, security, etc. Such an evaluation will assist, we hope, anyone designing or connecting a digital camera to the enterprise systems.
E-Service Quality Evaluation on E-Government Website: Case Study BPJS Kesehatan Indonesia
NASA Astrophysics Data System (ADS)
Rasyid, A.; Alfina, I.
2017-01-01
This research intends to develop a model to evaluate the quality of e-services in e-government. The proposed model consists of seven dimensions: web design, reliability, responsiveness, privacy and security, personalization, information, and ease of use. The model is used to measure the quality of the e-registration of BPJS Kesehatan, an Indonesian government health insurance program. The validation and reliability testing show that, of the seven dimensions proposed, only four are suitable for the case study. The results show that the BPJS Kesehatan e-registration service performs well on the reliability and responsiveness dimensions, while on the web design and ease-of-use dimensions the e-service still needs to be optimized.
Van Hoecke, Sofie; Steurbaut, Kristof; Taveirne, Kristof; De Turck, Filip; Dhoedt, Bart
2010-01-01
We designed a broker platform for e-homecare services using web service technology. The broker allows efficient data communication and guarantees quality requirements such as security, availability and cost-efficiency by dynamic selection of services, minimizing user interactions and simplifying authentication through a single user sign-on. A prototype was implemented, with several e-homecare services (alarm, telemonitoring, audio diary and video-chat). It was evaluated by patients with diabetes and multiple sclerosis. The patients found that the start-up time and overhead imposed by the platform was satisfactory. Having all e-homecare services integrated into a single application, which required only one login, resulted in a high quality of experience for the patients.
Secure Service Proxy: A CoAP(s) Intermediary for a Securer and Smarter Web of Things
Van den Abeele, Floris; Moerman, Ingrid; Demeester, Piet
2017-01-01
As the IoT continues to grow over the coming years, resource-constrained devices and networks will see an increase in traffic as everything is connected in an open Web of Things. The performance- and function-enhancing features are difficult to provide in resource-constrained environments, but will gain importance if the WoT is to be scaled up successfully. For example, scalable open standards-based authentication and authorization will be important to manage access to the limited resources of constrained devices and networks. Additionally, features such as caching and virtualization may help further reduce the load on these constrained systems. This work presents the Secure Service Proxy (SSP): a constrained-network edge proxy with the goal of improving the performance and functionality of constrained RESTful environments. Our evaluations show that the proposed design reaches its goal by reducing the load on constrained devices while implementing a wide range of features as different adapters. Specifically, the results show that the SSP leads to significant savings in processing, network traffic, network delay and packet loss rates for constrained devices. As a result, the SSP helps to guarantee the proper operation of constrained networks as these networks form an ever-expanding Web of Things. PMID:28696393
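One of the adapter features mentioned above, caching, can be sketched as a proxy that answers repeat requests without contacting the constrained device; the names and TTL here are illustrative assumptions, not the SSP's actual API.

```python
import time

# Minimal sketch of one SSP-style adapter: a response cache in front of a
# constrained device. fetch_from_device and the TTL are invented stand-ins.

class CachingAdapter:
    def __init__(self, origin_fetch, ttl=30.0):
        self.origin_fetch = origin_fetch  # call that reaches the constrained device
        self.ttl = ttl
        self.cache = {}  # resource path -> (expiry time, payload)

    def get(self, path):
        now = time.monotonic()
        hit = self.cache.get(path)
        if hit and hit[0] > now:
            return hit[1]  # served from cache: no traffic to the device
        payload = self.origin_fetch(path)
        self.cache[path] = (now + self.ttl, payload)
        return payload

calls = []
def fetch_from_device(path):
    calls.append(path)
    return f"value-of-{path}"

proxy = CachingAdapter(fetch_from_device, ttl=30.0)
proxy.get("/sensors/temp")
proxy.get("/sensors/temp")
print(len(calls))  # the constrained device was contacted only once
```

A real CoAP proxy would also honor Max-Age options and observe relationships, but the load-shedding principle is the same.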
Secure Service Proxy: A CoAP(s) Intermediary for a Securer and Smarter Web of Things.
Van den Abeele, Floris; Moerman, Ingrid; Demeester, Piet; Hoebeke, Jeroen
2017-07-11
As the IoT continues to grow over the coming years, resource-constrained devices and networks will see an increase in traffic as everything is connected in an open Web of Things. The performance- and function-enhancing features are difficult to provide in resource-constrained environments, but will gain importance if the WoT is to be scaled up successfully. For example, scalable open standards-based authentication and authorization will be important to manage access to the limited resources of constrained devices and networks. Additionally, features such as caching and virtualization may help further reduce the load on these constrained systems. This work presents the Secure Service Proxy (SSP): a constrained-network edge proxy with the goal of improving the performance and functionality of constrained RESTful environments. Our evaluations show that the proposed design reaches its goal by reducing the load on constrained devices while implementing a wide range of features as different adapters. Specifically, the results show that the SSP leads to significant savings in processing, network traffic, network delay and packet loss rates for constrained devices. As a result, the SSP helps to guarantee the proper operation of constrained networks as these networks form an ever-expanding Web of Things.
Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro
2011-07-01
Global cloud frameworks for bioinformatics research databases have become huge and heterogeneous, and solutions face diametrically opposed challenges: cross-integration, retrieval, security and openness. To address this, as of March 2011 organizations including RIKEN had published 192 mammalian, plant and protein life sciences databases holding 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data this database integration framework covers is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools like SPARQL. Actual research also requires specialized tools for analyzing raw original data. To solve these challenges, in December 2009 we developed the lightweight Semantic-JSON interface, which gives secure access to each fragment of linked and raw life sciences data under the control of programming languages popular with bioinformaticians, such as Perl and Ruby. Researchers have successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing and human-readable content such as ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org.
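The access pattern such a lightweight interface enables, fetching one fragment of a linked dataset at a time instead of downloading the whole database, can be sketched as follows; the records, identifiers and link structure are invented, and `fetch_fragment` is a stand-in, not the real Semantic-JSON API.

```python
import json

# Hypothetical sketch of fragment-at-a-time access to linked data.
# The dataset, identifiers and fetch_fragment call are invented stand-ins
# for a remote Semantic-JSON-style service.

DATASET = {
    "gene:abc": {"label": "ABC", "links": ["phenotype:x"]},
    "phenotype:x": {"label": "X trait", "links": []},
}

def fetch_fragment(resource_id):
    """Stand-in for one remote call returning a single JSON record."""
    return json.dumps(DATASET[resource_id])

# A script follows links record by record, never holding the full database.
record = json.loads(fetch_fragment("gene:abc"))
for linked in record["links"]:
    print(json.loads(fetch_fragment(linked))["label"])  # X trait
```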
Fingerprinting Software Defined Networks and Controllers
2015-03-01
2.5.3 Intrusion Prevention System with SDN; 2.5.4 Modular Security Services...Control Message Protocol; IDS Intrusion Detection System; IPS Intrusion Prevention System; ISP Internet Service Provider; LLDP Link Layer Discovery Protocol...layer functions (e.g., web proxies, firewalls, intrusion detection/prevention, load balancers, etc.). The increase in switch capabilities combined
39 CFR 501.8 - Postage Evidencing System test and approval.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Service Web site at http://www.usps.com/postagesolutions/programdoc.html or requests for copies may be.... (b) As provided in § 501.11, the provider has a duty to report security weaknesses to the Postal... basic features or safeguards may be made except as authorized or ordered by the Postal Service in...
GEMSS: privacy and security for a medical Grid.
Middleton, S E; Herveg, J A M; Crazzolara, F; Marvin, D; Poullet, Y
2005-01-01
The GEMSS project is developing a secure Grid infrastructure through which six medical simulation services can be invoked. We examine the legal and security framework within which GEMSS operates. We provide a legal qualification to the operations performed upon patient data, in view of EU directive 95/46, when using medical applications on the GEMSS Grid. We identify appropriate measures to ensure security and describe the legal rationale behind our choice of security technology. Our legal analysis demonstrates there must be an identified controller (typically a hospital) of patient data. The controller must then choose a processor (in this context a Grid service provider) that provides sufficient guarantees with respect to the security of their technical and organizational data processing procedures. These guarantees must ensure a level of security appropriate to the risks, with due regard to the state of the art and the cost of their implementation. Our security solutions are based on a public key infrastructure (PKI), transport-level security and end-to-end security mechanisms in line with the web services security specifications (WS-Security, WS-Trust and WS-SecureConversation). The GEMSS infrastructure ensures a degree of protection of patient data that is appropriate for the health care sector and in line with the European directives. We hope that GEMSS will become synonymous with highly secure data processing, providing a framework by which GEMSS service providers can offer the security guarantees required by hospitals with regard to the processing of patient data.
The Exon-Florio National Security Test for Foreign Investment
2006-03-15
Congressional Research Service, The Library of Congress. CRS Report for Congress, Order Code RL33312: The Exon-Florio National Security Test for Foreign Investment. Report date: 15 March 2006. Summary: The proposed acquisitions of major operations in six major U.S. ports by
Electronic freight management (EFM) standards strategy
DOT National Transportation Integrated Search
2006-04-01
The EFM initiative is a U.S. Department of Transportation (DOT)-sponsored research effort aimed at improving the operating efficiency, safety, and security of freight movement. The initiative involves conducting a deployment test using Web services t...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-03
... provisions of the Social Security Act (the Act) and Public Health Service Act. We also issue various manuals...-month period along with a hyperlink to the full listing that is available on the CMS Web site or the... information and will be available earlier than we publish our quarterly notice. We believe the Web site list...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-08
... provisions of the Social Security Act (the Act) and Public Health Service Act. We also issue various manuals...-month period along with a hyperlink to the full listing that is available on the CMS Web site or the... information and will be available earlier than we publish our quarterly notice. We believe the Web site list...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-26
... provisions of the Social Security Act (the Act) and Public Health Service Act. We also issue various manuals...-month period along with a hyperlink to the full listing that is available on the CMS Web site or the... information and will be available earlier than we publish our quarterly notice. We believe the Web site list...
2005-07-01
policies in pervasive computing environments. In this context, the owner of information sources (e.g. user, sensor, application, or organization...work in decentralized trust management and semantic web technologies. Section 3 introduces an Information Disclosure Agent architecture for...Norman Sadeh. July 2005. CMU-ISRI-05-113, School of Computer Science, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213
Secure Service Invocation in a Peer-to-Peer Environment Using JXTA-SOAP
NASA Astrophysics Data System (ADS)
Laghi, Maria Chiara; Amoretti, Michele; Conte, Gianni
The effective convergence of service-oriented architectures (SOA) and peer-to-peer (P2P) computing is an urgent task, with many important applications ranging from e-business to ambient intelligence. A considerable standardization effort is being carried out by both the SOA and P2P communities, but a complete platform for the development of secure, distributed applications is still missing. In this context, the result of our research and development activity is JXTA-SOAP, an official extension of JXTA enabling Web Service sharing in peer-to-peer networks. Recently we focused on security aspects, providing JXTA-SOAP with a general security management system and specialized policies that target both the J2SE and J2ME versions of the component. Among others, we implemented a policy based on Multimedia Internet KEYing (MIKEY), which can be used to create a key pair and all the required parameters for encryption and decryption of service messages in consumer and provider peers running on resource-constrained devices.
Katayama, Toshiaki; Arakawa, Kazuharu; Nakao, Mitsuteru; Ono, Keiichiro; Aoki-Kinoshita, Kiyoko F; Yamamoto, Yasunori; Yamaguchi, Atsuko; Kawashima, Shuichi; Chun, Hong-Woo; Aerts, Jan; Aranda, Bruno; Barboza, Lord Hendrix; Bonnal, Raoul Jp; Bruskiewich, Richard; Bryne, Jan C; Fernández, José M; Funahashi, Akira; Gordon, Paul Mk; Goto, Naohisa; Groscurth, Andreas; Gutteridge, Alex; Holland, Richard; Kano, Yoshinobu; Kawas, Edward A; Kerhornou, Arnaud; Kibukawa, Eri; Kinjo, Akira R; Kuhn, Michael; Lapp, Hilmar; Lehvaslaiho, Heikki; Nakamura, Hiroyuki; Nakamura, Yasukazu; Nishizawa, Tatsuya; Nobata, Chikashi; Noguchi, Tamotsu; Oinn, Thomas M; Okamoto, Shinobu; Owen, Stuart; Pafilis, Evangelos; Pocock, Matthew; Prins, Pjotr; Ranzinger, René; Reisinger, Florian; Salwinski, Lukasz; Schreiber, Mark; Senger, Martin; Shigemoto, Yasumasa; Standley, Daron M; Sugawara, Hideaki; Tashiro, Toshiyuki; Trelles, Oswaldo; Vos, Rutger A; Wilkinson, Mark D; York, William; Zmasek, Christian M; Asai, Kiyoshi; Takagi, Toshihisa
2010-08-21
Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that do not need to transfer entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project and researchers of emerging areas where a standard exchange data format is not well established to an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and the Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues arising from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security, are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for effective advances in bioinformatics web service technologies.
2010-01-01
Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that do not need to transfer entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project and researchers of emerging areas where a standard exchange data format is not well established to an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and the Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues arising from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security, are discussed. Consequently, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for effective advances in bioinformatics web service technologies. PMID:20727200
Globus | Informatics Technology for Cancer Research (ITCR)
Globus software services provide secure cancer research data transfer, synchronization, and sharing in distributed environments at large scale. These services can be integrated into applications and research data gateways, leveraging Globus identity management, single sign-on, search, and authorization capabilities. Globus Genomics integrates Globus with the Galaxy genomics workflow engine and Amazon Web Services to enable cancer genomics analysis that can elastically scale compute resources with demand.
EntrezAJAX: direct web browser access to the Entrez Programming Utilities
2010-01-01
Web applications for biology and medicine often need to integrate data from Entrez services provided by the National Center for Biotechnology Information. However, direct access to Entrez from a web browser is not possible due to 'same-origin' security restrictions. The use of "Asynchronous JavaScript and XML" (AJAX) to create rich, interactive web applications is now commonplace. The ability to access Entrez via AJAX would be advantageous in the creation of integrated biomedical web resources. We describe EntrezAJAX, which provides access to Entrez eUtils and is able to circumvent same-origin browser restrictions. EntrezAJAX is easily implemented by JavaScript developers and provides the same functionality as Entrez eUtils, as well as enhanced functionality to ease development. We provide easy-to-understand developer examples written in JavaScript to illustrate potential uses of this service. For the purposes of speed, reliability and scalability, EntrezAJAX has been deployed on Google App Engine, a freely available cloud service. The EntrezAJAX webpage is located at http://entrezajax.appspot.com/ PMID:20565938
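A common mechanism for circumventing same-origin restrictions, and one kind of response a relay such as EntrezAJAX can return, is JSONP: the server wraps the JSON payload in a caller-supplied callback so a browser can load it cross-origin via a script tag. The sketch below shows only this generic wrapping step; the payload and callback name are invented.

```python
import json

# Generic JSONP wrapping as done by a server-side relay: the JSON payload
# is embedded in a call to the browser-supplied callback function, so the
# response is valid JavaScript loadable from a <script> tag cross-origin.
# Payload and callback name here are invented for illustration.

def jsonp_wrap(payload: dict, callback: str) -> str:
    return f"{callback}({json.dumps(payload)});"

print(jsonp_wrap({"db": "pubmed", "count": 2}, "handleEntrez"))
```

The browser defines `handleEntrez` before inserting the script tag, and the relayed data arrives as that function's argument.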
Practice improvement, part II: update on patient communication technologies.
Roett, Michelle A; Coleman, Mary Thoesen
2013-11-01
Patient portals (ie, secure web-based services for patient health record access) and secure messaging to health care professionals are gaining popularity slowly. Advantages of web portals include timely communication and instruction, access to appointments and other services, and high patient satisfaction. Limitations include inappropriate use, security considerations, organizational costs, and exclusion of patients who are uncomfortable with or unable to use computers. Attention to the organization's strategic plan and office policies, patient and staff expectations, workflow and communication integration, training, marketing, and enrollment can facilitate optimal use of this technology. Other communication technologies that can enhance patient care include automated voice or text reminders and brief electronic communications. Social media provide another method of patient outreach, but privacy and access are concerns. Incorporating telehealthcare (health care provided via telephone or Internet), providing health coaching, and using interactive health communication applications can improve patient knowledge and clinical outcomes and provide social support. Written permission from the American Academy of Family Physicians is required for reproduction of this material in whole or in part in any form or medium.
Security Risks of Cloud Computing and Its Emergence as 5th Utility Service
NASA Astrophysics Data System (ADS)
Ahmad, Mushtaq
Cloud Computing is being projected by major cloud service provider IT companies such as IBM, Google, Yahoo, Amazon and others as a fifth utility, where clients have access to processing for applications and software projects that need very high processing speed and huge data capacity, for scientific and engineering research problems as well as e-business and data content network applications. These services are provided to different types of clients under DASM (Direct Access Service Management), based on virtualization of hardware and software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments in Cloud Computing and the hardware/software configuration of the cloud paradigm. It also examines the vital aspects of the security risks identified by IT industry experts and cloud clients, and highlights cloud providers' responses to those risks.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-22
... of Rule Change To Extend the Effective Date of the Amendment to the Continuing Disclosure Service of EMMA To Provide for the Posting of Credit Rating and Related Information on the EMMA Public Web Site... service of the MSRB's Electronic Municipal Market Access system (``EMMA'') to provide for the posting of...
Task Force on the Future of Military Health Care
2007-12-01
Navigator. Service programs are supported by the Military Health System Population Health Portal (MHSPHP), a centralized, secure, web-based population...Congress on March 1, 2008. Air Force Medical Support Agency, Population Health Support Division. MHS Population Health Portal Methods. July 2007...HEDIS metrics using the MHS Population Health Portal and reporting in the service systems and the Tri-Service Business Planning tool. DoD has several
Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro
2011-01-01
Global cloud frameworks for bioinformatics research databases have become huge and heterogeneous, and solutions face diametrically opposed challenges: cross-integration, retrieval, security and openness. To address this, as of March 2011 organizations including RIKEN had published 192 mammalian, plant and protein life sciences databases holding 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data this database integration framework covers is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools like SPARQL. Actual research also requires specialized tools for analyzing raw original data. To solve these challenges, in December 2009 we developed the lightweight Semantic-JSON interface, which gives secure access to each fragment of linked and raw life sciences data under the control of programming languages popular with bioinformaticians, such as Perl and Ruby. Researchers have successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing and human-readable content such as ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org. PMID:21632604
Access Control of Web and Java Based Applications
NASA Technical Reports Server (NTRS)
Tso, Kam S.; Pajevski, Michael J.; Johnson, Bryan
2011-01-01
Cyber security has gained national and international attention as a result of near-continuous headlines from financial institutions, retail stores, government offices and universities reporting compromised systems and stolen data. Concerns continue to rise as threats of service interruption and the spreading of viruses become ever more prevalent and serious. Controlling access to application-layer resources is a critical component in a layered security solution that includes encryption, firewalls, virtual private networks, antivirus, and intrusion detection. In this paper we discuss the development of an application-level access control solution, based on an open-source access manager augmented with custom software components, to provide protection to both Web-based and Java-based client and server applications.
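As a minimal sketch of application-level access control of the kind discussed above, the snippet below checks each request against a role-to-permission policy table; the roles, resources and actions are invented and this is not the paper's access manager.

```python
# Hedged sketch of application-level access control: a policy table maps
# roles to permitted (resource, action) pairs, and an enforcement function
# checks every request against it. All policy entries are invented.

POLICY = {
    "admin":  {("config", "write"), ("config", "read"), ("data", "read")},
    "viewer": {("data", "read")},
}

def is_allowed(role: str, resource: str, action: str) -> bool:
    """Deny by default: unknown roles get an empty permission set."""
    return (resource, action) in POLICY.get(role, set())

print(is_allowed("admin", "config", "write"))   # True
print(is_allowed("viewer", "config", "write"))  # False
```

Real access managers externalize this table and add sessions and auditing, but the request-time check is the core of the layered defense described.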
A secure and easy-to-implement web-based communication framework for caregiving robot teams
NASA Astrophysics Data System (ADS)
Tuna, G.; Daş, R.; Tuna, A.; Örenbaş, H.; Baykara, M.; Gülez, K.
2016-03-01
In recent years, robots have become more commonplace in our lives, from factory floors to museums, festivals and shows, and they have started to change how we work and play. With the increase in the elderly population, they have also begun to be used for caregiving services, and hence many countries have been investing in robot development. Advances in robotics and wireless communications have led to the emergence of autonomous caregiving robot teams which cooperate to accomplish a set of tasks assigned by human operators. Although wireless communications and devices are flexible and convenient, they are vulnerable to many risks compared to traditional wired networks. Since robots with wireless communication capability transmit all data types, including sensory, coordination, and control data, over radio frequencies, they are open to intruders and attackers unless protected, and this openness may lead to security issues such as data theft, passive listening, and service interruption. In this paper, a secure web-based communication framework is proposed to address potential security threats due to wireless communication in robot-robot and human-robot interaction. The proposed framework is simple and practical, and can be used by caregiving robot teams to exchange sensory data as well as coordination and control data.
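One standard building block for securing such robot-to-robot messages (an assumption here, not necessarily the paper's mechanism) is an HMAC over the payload plus a sequence number, so both tampered and replayed frames are rejected:

```python
import hashlib
import hmac

# Illustrative sketch, not the paper's framework: each message carries a
# sequence number and an HMAC-SHA256 over "seq:payload" under a shared
# team key (a stand-in value here). Receivers reject bad MACs and replays.

KEY = b"shared-team-key"

def sign(seq: int, payload: str) -> dict:
    mac = hmac.new(KEY, f"{seq}:{payload}".encode(), hashlib.sha256).hexdigest()
    return {"seq": seq, "payload": payload, "mac": mac}

def verify(msg: dict, last_seq: int) -> bool:
    expected = hmac.new(KEY, f"{msg['seq']}:{msg['payload']}".encode(),
                        hashlib.sha256).hexdigest()
    fresh = msg["seq"] > last_seq  # reject replayed frames
    return fresh and hmac.compare_digest(expected, msg["mac"])

m = sign(1, "temp=36.6")
print(verify(m, last_seq=0))  # True
m["payload"] = "temp=99.9"    # tampering breaks the MAC
print(verify(m, last_seq=0))  # False
```

This gives integrity and replay protection; confidentiality would additionally require encrypting the payload, e.g. over TLS.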
Intelligent cloud computing security using genetic algorithm as a computational tools
NASA Astrophysics Data System (ADS)
Razuky AL-Shaikhly, Mazin H.
2018-05-01
An essential change has occurred in the field of Information Technology with cloud computing: the cloud provides virtual resources via the web, but brings great difficulties in the fields of information security and privacy protection. Currently the main problem with cloud computing is how to improve its privacy and security, since security is critical to the cloud. This paper attempts to address cloud security by using an intelligent system with a genetic algorithm as a wall to keep cloud data secure: every service provided by the cloud must detect who receives it and register this, in order to build a list of trusted or un-trusted users based on behavior. Execution of the present proposal has shown good results.
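A toy sketch of the general idea, a genetic algorithm evolving a rule that separates trusted from un-trusted behaviour scores, is shown below; the encoding, fitness function and data are invented, since the abstract does not specify them.

```python
import random

# Toy genetic algorithm: evolve a threshold that separates "trusted" from
# "un-trusted" behaviour scores. Data, encoding and fitness are invented
# for illustration; the paper does not publish its actual design.

random.seed(0)
trusted = [0.9, 0.8, 0.85, 0.95]
untrusted = [0.2, 0.3, 0.1, 0.25]

def fitness(threshold):
    hits = sum(s >= threshold for s in trusted)
    hits += sum(s < threshold for s in untrusted)
    return hits  # correctly classified samples, maximum 8

population = [random.random() for _ in range(10)]
for _ in range(30):                           # generations
    population.sort(key=fitness, reverse=True)
    parents = population[:4]                  # elitist selection
    children = []
    for _ in range(6):
        a, b = random.sample(parents, 2)
        child = (a + b) / 2                   # crossover
        child += random.uniform(-0.05, 0.05)  # mutation
        children.append(min(max(child, 0.0), 1.0))
    population = parents + children

best = max(population, key=fitness)
print(fitness(best))  # 8: the evolved threshold separates both classes
```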
Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment
Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon
2013-01-01
Secure encapsulation and publication of bioinformatics software products based on web services are presented, and the basic functions of biological information processing are realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented using the GRAM components. The services of the bioinformatics software are published to remote users. Finally, a basic prototype system of the biological cloud is achieved. PMID:24078906
Secure encapsulation and publication of biological services in the cloud computing environment.
Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon
2013-01-01
Secure encapsulation and publication of bioinformatics software products based on web services are presented, and the basic functions of biological information processing are realized in the cloud computing environment. In the encapsulation phase, the workflow and functions of the bioinformatics software are analyzed, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. Functions such as remote user job submission and job status query are implemented using the GRAM components. The services of the bioinformatics software are published to remote users. Finally, a basic prototype system of the biological cloud is achieved.
Fifty Years of Silent Service: A Peek Inside the CIA Library.
ERIC Educational Resources Information Center
Newlen, Robert R.
1998-01-01
Describes the CIA (Central Intelligence Agency) library. Highlights include security measures, a day in the life of two CIA librarians, sample reference questions, collection development, the Historical Intelligence Collection, the CIA Web site, and library modernization. (JAK)
User-Centric Secure Cross-Site Interaction Framework for Online Social Networking Services
ERIC Educational Resources Information Center
Ko, Moo Nam
2011-01-01
Social networking services are one of the major technological phenomena of Web 2.0. Hundreds of millions of users post messages, photos, and videos on their profiles and interact with other users, but this sharing and interaction are limited to the same social networking site. Although users can share some content on a social networking site…
BetterThanPin: Empowering Users to Fight Phishing (Poster)
NASA Astrophysics Data System (ADS)
Tan, Teik Guan
The BetterThanPin concept is an online security service that allows users to protect almost any Cloud or Web-based account (e.g. Gmail, MSN, Yahoo, etc.) with "almost" two-factor authentication (2FA). The result is that users can protect their online accounts with better authentication without waiting for the service or cloud provider to offer it.
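The generic construction behind most second factors of this kind is an HOTP-style one-time code (RFC 4226) derived from a shared secret and a counter; the sketch below is this standard construction, not BetterThanPin's actual protocol.

```python
import hashlib
import hmac
import struct

# Standard HOTP (RFC 4226), shown as a generic second-factor building
# block; this is not BetterThanPin's protocol. Server and user device
# derive the same code from a shared secret and a moving counter.

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                          # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test vector for counter 0:
print(hotp(b"12345678901234567890", 0))  # 755224
```

TOTP (RFC 6238) replaces the counter with a time step, which is what most authenticator apps use in practice.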
Safe and Secure Services Based on NGN
NASA Astrophysics Data System (ADS)
Fukazawa, Tomoo; Nisase, Takemi; Kawashima, Masahisa; Hariu, Takeo; Oshima, Yoshihito
Next Generation Network (NGN), which has been undergoing standardization as it has developed, is expected to create new services that converge the fixed and mobile networks. This paper introduces the basic requirements for NGN in terms of security and explains the standardization activities, in particular the requirements for the security function described in Y.2701, discussed in ITU-T SG-13. In addition to the basic NGN security function, requirements for NGN authentication are also described from three aspects: security, deployability, and service. As examples of authentication implementation, three profiles, namely fixed, nomadic, and mobile, are defined in this paper. The "fixed profile" is typically for fixed-line subscribers, the "nomadic profile" basically utilizes WiFi access points, and the "mobile profile" provides ideal NGN mobility for mobile subscribers. All three profiles satisfy the security requirements. The three profiles are then compared from the viewpoint of deployability and service requirements. After showing that none of the three profiles can fulfill all of the requirements, we propose that multiple profiles should be used by NGN providers. As service and application examples, two promising NGN applications are proposed. The first is a strong authentication mechanism that makes Web applications safer and more secure, even against password theft; it is based on the NGN ID federation function. The second provides an easy peer-to-peer broadband virtual private network service aimed at safe and secure communication for personal/SOHO (small office, home office) users, based on NGN SIP (session initiation protocol) session control.
The International Solid Earth Research Virtual Observatory
NASA Astrophysics Data System (ADS)
Fox, G.; Pierce, M.; Rundle, J.; Donnellan, A.; Parker, J.; Granat, R.; Lyzenga, G.; McLeod, D.; Grant, L.
2004-12-01
We describe the architecture and initial implementation of the International Solid Earth Research Virtual Observatory (iSERVO). This has been prototyped within the USA as SERVOGrid, and expansion is planned to Australia, China, Japan and other countries. We base our design on a globally scalable distributed "cyber-infrastructure" or Grid built around a Web Services-based approach consistent with the extended Web Service Interoperability approach. The Solid Earth Science Working Group of NASA has identified several challenges for Earth Science research. In order to investigate these, we need to couple numerical simulation codes and data mining tools to observational data sets. These observational data are now available online in internet-accessible form, and their quantity is expected to grow explosively over the next decade. We architect iSERVO as a loosely federated Grid of Grids, with each country involved supporting a national Solid Earth Research Grid. The national Grid operations, possibly with dedicated control centers, are linked together to support iSERVO, where an international Grid control center may eventually be necessary. We address the difficult multi-administrative-domain security and ownership issues by exposing capabilities as services for which the risk of abuse is minimized. We support large-scale simulations within a single domain using service-hosted tools (mesh generation, data repository and sensor access, GIS, visualization). Simulations typically involve sequential or parallel machines in a single domain supported by cross-continent services. We use Web Services to implement a Service Oriented Architecture (SOA), using WSDL for service description and SOAP for message formats. These are augmented by UDDI, WS-Security, WS-Notification/Eventing and WS-ReliableMessaging in the WS-I+ approach. Support for the latter two capabilities will be available over the next 6 months from the NaradaBrokering messaging system.
We augment these specifications with the powerful portlet architecture using WSRP and JSR168, supported by such portal containers as uPortal, WebSphere, and Apache JetSpeed2. The latter portal aggregates component user interfaces for each iSERVO service, allowing flexible customization of the user interface. We exploit the portlets produced by the NSF NMI (Middleware Initiative) OGCE activity. iSERVO also uses specifications from the Open Geospatial Consortium (OGC), which defines a number of standards for modeling earth-surface feature data and services for interacting with these data. The data models are expressed in the XML-based Geography Markup Language (GML), and the OGC service framework is being adapted to use the Web Service model. The SERVO prototype includes a GIS Grid that currently includes the core WMS and WFS (Map and Feature) services. We will follow best practice in the Grid and Web Service field and will adapt our technology as appropriate. For example, we expect to support services built on WS-RF when it is finalized and to make use of the database interfaces OGSA-DAI and its WS-I+ versions. Finally, we review advances in Web Service scripting (such as HPSearch) and workflow systems (such as GCF) and their applications to iSERVO.
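The WSDL/SOAP message layer this record builds on can be sketched in a few lines. The following is a minimal sketch only: the service namespace and the "GenerateMesh" operation are hypothetical placeholders, since the record does not name concrete iSERVO operations.

```python
# Minimal sketch of a SOAP 1.1 request envelope of the kind a WSDL-described
# service consumes. The service namespace and operation name are hypothetical.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://iservo.example.org/meshgen"  # hypothetical service namespace

def build_soap_request(operation, params):
    """Build a SOAP 1.1 envelope wrapping one operation call."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}{operation}")
    for name, value in params.items():
        param = ET.SubElement(op, f"{{{SVC_NS}}}{name}")
        param.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

request_xml = build_soap_request("GenerateMesh", {"resolution": 0.5, "region": "socal"})
print(request_xml)
```

In practice a WSDL toolkit generates such envelopes from the service description; the sketch only shows the wire-format structure the record refers to.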
Securing a web-based teleradiology platform according to German law and "best practices".
Spitzer, Michael; Ullrich, Tobias; Ueckert, Frank
2009-01-01
The Medical Data and Picture Exchange platform (MDPE), as a teleradiology system, facilitates the exchange of digital medical imaging data among authorized users. It features extensive support of the DICOM standard including networking functions. Since MDPE is designed as a web service, security and confidentiality of data and communication pose an outstanding challenge. To comply with demands of German laws and authorities, a generic data security concept considered as "best practice" in German health telematics was adapted to the specific demands of MDPE. The concept features strict logical and physical separation of diagnostic and identity data and thus an all-encompassing pseudonymization throughout the system. Hence, data may only be merged at authorized clients. MDPE's solution of merging data from separate sources within a web browser avoids technically questionable techniques such as deliberate cross-site scripting. Instead, data is merged dynamically by JavaScriptlets running in the user's browser. These scriptlets are provided by one server, while content and method calls are generated by another server. Additionally, MDPE uses encrypted temporary IDs for communication and merging of data.
SocialRAD: an infrastructure for a secure, cooperative, asynchronous teleradiology system.
Figueiredo, João Filho Matos; Motta, Gustavo Henrique Matos Bezerra
2013-01-01
The popularity of teleradiology services has enabled a major advance in the provision of health services to areas with difficult geographical access. However, this potential has also brought with it a number of challenges: the large volume of data characteristic of imaging tests, and security requirements designed to ensure confidentiality and integrity. Moreover, there are also a number of ethical questions involving the dominant model on the market, whereby this service is outsourced to private companies rather than undertaken directly by professional radiologists. Therefore, the present paper proposes a cooperative model of teleradiology, where health professionals interact directly with the hospitals providing patient care. This has involved the integration of a wide range of technologies, such as the interconnection models Peer-to-Peer, Cloud Computing, Dynamic DNS and RESTful Web Services, as well as security and interoperability standards, with the aim of promoting a secure, collaborative, asynchronous environment. The developed model is currently being used on an experimental basis, providing teleradiology support to cities in the north-eastern hinterland of Brazil, and is fulfilling all expectations.
Job submission and management through web services: the experience with the CREAM service
NASA Astrophysics Data System (ADS)
Aiftimiei, C.; Andreetto, P.; Bertocco, S.; Fina, S. D.; Ronco, S. D.; Dorigo, A.; Gianelle, A.; Marzolla, M.; Mazzucato, M.; Sgaravatto, M.; Verlato, M.; Zangrando, L.; Corvo, M.; Miccio, V.; Sciaba, A.; Cesini, D.; Dongiovanni, D.; Grandi, C.
2008-07-01
Modern Grid middleware is built around components providing basic functionality, such as data storage, authentication, security, job management, resource monitoring and reservation. In this paper we describe the Computing Resource Execution and Management (CREAM) service. CREAM provides a Web service-based job execution and management capability for Grid systems; in particular, it is being used within the gLite middleware. CREAM exposes a Web service interface allowing conforming clients to submit computational jobs to a Local Resource Management System and manage them. We developed a special component, called ICE (Interface to CREAM Environment), to integrate CREAM in gLite. ICE transfers job submissions and cancellations from the Workload Management System, allowing users to manage CREAM jobs from the gLite User Interface. This paper describes some recent studies aimed at assessing the performance and reliability of CREAM and ICE; those tests have been performed as part of the acceptance tests for integration of CREAM and ICE in gLite. We also discuss recent work towards enhancing CREAM with a BES- and JSDL-compliant interface.
Della Mea, V.; Beltrami, C. A.
2000-01-01
The last five years' experience has clearly demonstrated the possible applications of the Internet for telepathology. They may be listed as follows: (a) teleconsultation via multimedia e-mail; (b) teleconsultation via web-based tools; (c) distant education by means of the World Wide Web; (d) virtual microscope management through Web and Java interfaces; (e) real-time consultations through Internet-based videoconferencing. Such applications have led to the recognition of some important limits of the Internet when dealing with telemedicine: (i) no guarantees on the quality of service (QoS); (ii) inadequate security and privacy; (iii) for some countries, low bandwidth and thus low responsiveness for real-time applications. Currently, there are several innovations in the world of the Internet. Different initiatives have been aimed at improving the Internet protocols in order to provide quality of service, multimedia support, security and other advanced services, together with greater bandwidth. The forthcoming Internet improvements, although driven by electronic commerce, video on demand, and other commercial needs, are of real interest also for telemedicine, because they address the limits currently slowing down the use of the Internet. When these new services become available, telepathology applications may switch rapidly from research to daily practice. PMID:11339559
Teaching Web Security Using Portable Virtual Labs
ERIC Educational Resources Information Center
Chen, Li-Chiou; Tao, Lixin
2012-01-01
We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…
Usage of insecure E-mail services among researchers with different scientific background.
Solić, Kresimir; Grgić, Krešimir; Ilakovac, Vesna; Zagar, Drago
2011-08-01
Free web-based e-mail services are considered to have more security flaws than institutional ones, but they are frequently used among scientific researchers for professional communication. The aim of this study was to analyze the frequency of use of insecure free e-mail services for professional communication among biomedical, economics and technical researchers who published papers in one of three journals: Croatian Medical Journal, Automatika and Economic Research. Contact details of authors who provided an e-mail address were collected from papers published in these three journals during a one-year period, drawn from the journals' electronic archives. The domain of each e-mail address was assessed and contacts were categorized into three groups: well-known free web-based e-mail services, national Internet Service Provider (ISP) e-mail services, and institutional or corporate e-mail addresses. The proportion of authors using free web-based e-mail services, the least secure type, was highest among biomedical researchers (17.8%), while every e-mail address collected from the technical journal belonged to the secure institutional type. It seems that all researchers in the technical field and most researchers in economics follow good security practice and use more secure systems for professional communication. The high percentage of biomedical researchers using insecure e-mail services suggests that they need to be warned of the possible security disadvantages of these kinds of e-mail addresses.
The OGC Sensor Web Enablement framework
NASA Astrophysics Data System (ADS)
Cox, S. J.; Botts, M.
2006-12-01
Sensor observations are at the core of the natural sciences. Improvements in data-sharing technologies offer the promise of much greater utilisation of observational data; a key to this is interoperable data standards. The Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) initiative is developing open standards for web interfaces for the discovery, exchange and processing of sensor observations, and for the tasking of sensor systems. The goal is to support the construction of complex sensor applications through real-time composition of service chains from standard components. The framework is based around a suite of standard interfaces and standard encodings for the messages transferred between services. The SWE interfaces include: the Sensor Observation Service (SOS), for parameterized observation requests (by observation time, feature of interest, property, sensor); the Sensor Planning Service (SPS), for tasking a sensor system to undertake future observations; and the Sensor Alert Service (SAS), for subscribing to an alert, usually triggered by a sensor result exceeding some value. The interface design generally follows the pattern established in the OGC Web Map Service (WMS) and Web Feature Service (WFS) interfaces, where the interaction between client and service follows a standard sequence of requests and responses: the first obtains a general description of the service capabilities, the next the detail required to formulate a data request, and the last a data instance or stream. These may be implemented in a stateless "REST" idiom, or using conventional "web-services" (SOAP) messaging. In a deployed system, the SWE interfaces are supplemented by Catalogue, data (WFS) and portrayal (WMS) services, as well as authentication and rights management.
The standard SWE data formats are Observations and Measurements (O&M), which encodes observation metadata and results; Sensor Model Language (SensorML), which describes sensor systems; Transducer Model Language (TML), which covers low-level data streams; and domain-specific GML Application Schemas for definitions of the target feature types. The SWE framework has been demonstrated in several interoperability testbeds, based around emergency management, security, contamination and environmental monitoring scenarios.
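The request sequence described above can be illustrated with a key-value-pair (KVP) encoded SOS request. This is only a sketch: the endpoint, offering name and observed property below are placeholders, and exact parameter names vary between SOS versions.

```python
# Sketch of assembling a KVP-encoded GetObservation request for a hypothetical
# Sensor Observation Service endpoint (parameter names vary by SOS version).
from urllib.parse import urlencode

def sos_get_observation_url(endpoint, offering, observed_property, time_range):
    """Assemble an SOS GetObservation request URL."""
    params = {
        "service": "SOS",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        "eventTime": time_range,
    }
    return endpoint + "?" + urlencode(params)

url = sos_get_observation_url(
    "http://sensors.example.org/sos",            # hypothetical endpoint
    "TEMPERATURE_STATIONS",                      # hypothetical offering
    "urn:ogc:def:property:temperature",          # hypothetical property URN
    "2006-12-01T00:00:00Z/2006-12-02T00:00:00Z", # observation time window
)
print(url)
```

A real client would first issue GetCapabilities to discover the valid offerings and properties before formulating this data request, mirroring the capabilities-then-data sequence the record describes.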
AceCloud: Molecular Dynamics Simulations in the Cloud.
Harvey, M J; De Fabritiis, G
2015-05-26
We present AceCloud, an on-demand service for molecular dynamics simulations. AceCloud is designed to facilitate the secure execution of large ensembles of simulations on an external cloud computing service (currently Amazon Web Services). The AceCloud client, integrated into the ACEMD molecular dynamics package, provides an easy-to-use interface that abstracts all aspects of interaction with the cloud services. This gives the user the experience that all simulations are running on their local machine, minimizing the learning curve typically associated with the transition to using high performance computing services.
Applications of Multi-Channel Safety Authentication Protocols in Wireless Networks.
Chen, Young-Long; Liau, Ren-Hau; Chang, Liang-Yu
2016-01-01
People can use their web browsers or mobile devices to access web services and applications built into these servers. Users have to input their identity and password to log in to the server, and these credentials may be captured by hackers when the network environment is not safe. A multiple secure authentication protocol can improve the security of the network environment: mobile devices can pass authentication messages through Wi-Fi or 3G networks to serve as a second communication channel. However, the number of messages transmitted is often not considered in such protocols, and the more messages are transmitted, the easier they are for hackers to collect and decode. In this paper, we propose two schemes that allow the server to validate the user while reducing the number of messages by using the XOR operation. Our schemes can improve the security of the authentication protocol, and the experimental results show that the proposed protocols are more secure and effective. Applied to second authentication communication channels for smart access control systems, identity identification and e-wallets, our proposed authentication protocols can help ensure the safety of persons and property, and achieve more effective security management mechanisms.
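The abstract does not specify the authors' protocol, so the following is only an illustrative sketch of the general idea it names: using the XOR operation so a single short exchange lets the server validate the client. The nonce/shared-secret design and the key-derivation step are assumptions, not the paper's scheme.

```python
# Illustrative sketch only: how XOR can shorten an authentication exchange.
# A fresh server nonce XORed with a shared secret lets the server validate
# the client from one response, because XOR with the secret is self-inverse.
# This is NOT the authors' exact protocol (the abstract does not give it).
import hashlib
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Hypothetical shared key, derived from a password for the sake of the sketch.
shared_secret = hashlib.sha256(b"user-password").digest()

# Server issues a one-time nonce over the first channel (e.g. the web session).
nonce = os.urandom(32)

# Client replies over the second channel (e.g. Wi-Fi/3G) with nonce XOR secret.
client_response = xor_bytes(nonce, shared_secret)

# Server applies the same XOR; only a client holding the secret recovers nonce.
server_check = xor_bytes(client_response, shared_secret)
print(server_check == nonce)
```

Because one XOR both masks and unmasks the nonce, the exchange needs fewer and shorter messages than a full challenge-response with separate encryption, which is the message-count reduction the abstract emphasizes.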
MOWServ: a web client for integration of bioinformatic resources
Ramírez, Sergio; Muñoz-Mérida, Antonio; Karlsson, Johan; García, Maximiliano; Pérez-Pulido, Antonio J.; Claros, M. Gonzalo; Trelles, Oswaldo
2010-01-01
The productivity of any scientist is affected by cumbersome, tedious and time-consuming tasks that try to make the heterogeneous web services compatible so that they can be useful in their research. MOWServ, the bioinformatic platform offered by the Spanish National Institute of Bioinformatics, was released to provide integrated access to databases and analytical tools. Since its release, the number of available services has grown dramatically, and it has become one of the main contributors of registered services in the EMBRACE Biocatalogue. The ontology that enables most of the web-service compatibility has been curated, improved and extended. The service discovery has been greatly enhanced by Magallanes software and biodataSF. User data are securely stored on the main server by an authentication protocol that enables the monitoring of current or already-finished user’s tasks, as well as the pipelining of successive data processing services. The BioMoby standard has been greatly extended with the new features included in the MOWServ, such as management of additional information (metadata such as extended descriptions, keywords and datafile examples), a qualified registry, error handling, asynchronous services and service replication. All of them have increased the MOWServ service quality, usability and robustness. MOWServ is available at http://www.inab.org/MOWServ/ and has a mirror at http://www.bitlab-es.com/MOWServ/. PMID:20525794
Application of open source standards and technologies in the http://climate4impact.eu/ portal
NASA Astrophysics Data System (ADS)
Plieger, Maarten; Som de Cerff, Wim; Pagé, Christian; Tatarinova, Natalia
2015-04-01
This presentation will demonstrate how to calculate and visualize the climate index SU (number of summer days) on the climate4impact portal. The following topics will be covered during the demonstration: - Security: login using OpenID for access to the Earth System Grid Federation (ESGF) data nodes. The ESGF works in conjunction with several external websites and systems; the climate4impact portal uses X509-based short-lived credentials, generated on behalf of the user with a MyProxy service, and Single Sign-On (SSO) makes these websites and systems work together. - Discovery: faceted search based on, e.g., variable name, model and institute using the ESGF search services. A catalogue browser allows browsing through CMIP5 and other climate model data catalogues (e.g. ESSENCE, EOBS, UNIDATA). - Processing using Web Processing Services (WPS): transform data, subset, export into other formats, and perform climate index calculations using Web Processing Services implemented with PyWPS, based on NCAR NCPP OpenClimateGIS and IS-ENES2 ICCLIM. - Visualization using Web Map Services (WMS): visualize data from ESGF data nodes using ADAGUC Web Map Services. The aim of climate4impact is to enhance the use of climate research data and the interaction with climate effect/impact communities. The portal is based on 21 impact use cases from 5 different European countries, and is evaluated by a user panel consisting of use case owners. It has been developed within the European projects IS-ENES and IS-ENES2 for more than 5 years, and its development currently continues within IS-ENES2 and CLIPC. As the climate impact community is very broad, the focus is mainly on the scientific impact community. This work has resulted in the ENES portal interface for climate impact communities, which can be visited at http://climate4impact.eu/ The portal currently has two main objectives.
The first is a web interface that automatically generates a graphical user interface from WPS endpoints. The WPS calculates climate indices and subsets data using OpenClimateGIS/ICCLIM on data stored in ESGF data nodes. Data is transmitted from ESGF nodes over secured OpenDAP and becomes available in a new, per-user, secured OpenDAP server; the results can then be visualized again using ADAGUC WMS. Dedicated wizards for processing climate indices will be developed in close collaboration with users. The second is to expose climate4impact services as standardized services that can be used by other portals. This adds interoperability between portals and enables the design of specific portals aimed at different impact communities, whether thematic or national.
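The SU index computed in the demonstration follows the standard ECA&D definition that ICCLIM implements: the count of days whose daily maximum temperature exceeds 25 °C. A minimal sketch with toy data, in plain Python rather than the portal's WPS/ICCLIM stack:

```python
# Number of summer days (SU): count of days with daily max temperature > 25 C,
# per the standard ECA&D definition used by ICCLIM. Toy data, not CMIP5 output.
def su_index(tmax_celsius):
    """Count days whose daily maximum temperature exceeds 25 degrees Celsius."""
    return sum(1 for t in tmax_celsius if t > 25.0)

# One toy week of daily maximum temperatures (degrees C):
week = [18.2, 24.9, 25.1, 27.3, 30.0, 22.4, 26.5]
print(su_index(week))  # -> 4 (the days at 25.1, 27.3, 30.0 and 26.5)
```

On the portal the same threshold count runs server-side over gridded model data, returning one SU value per grid cell per year instead of a single scalar.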
Green, Beverly B; Cook, Andrea J; Ralston, James D; Fishman, Paul A; Catz, Sheryl L; Carlson, James; Carrell, David; Tyll, Lynda; Larson, Eric B; Thompson, Robert S
2008-06-25
Treating hypertension decreases mortality and disability from cardiovascular disease, but most hypertension remains inadequately controlled. To determine if a new model of care that uses patient Web services, home blood pressure (BP) monitoring, and pharmacist-assisted care improves BP control. A 3-group randomized controlled trial, the Electronic Communications and Home Blood Pressure Monitoring study was based on the Chronic Care Model. The trial was conducted at an integrated group practice in Washington state, enrolling 778 participants aged 25 to 75 years with uncontrolled essential hypertension and Internet access. Care was delivered over a secure patient Web site from June 2005 to December 2007. Participants were randomly assigned to usual care, home BP monitoring and secure patient Web site training only, or home BP monitoring and secure patient Web site training plus pharmacist care management delivered through Web communications. Percentage of patients with controlled BP (<140/90 mm Hg) and changes in systolic and diastolic BP at 12 months. Of 778 patients, 730 (94%) completed the 1-year follow-up visit. Patients assigned to the home BP monitoring and Web training only group had a nonsignificant increase in the percentage of patients with controlled BP (<140/90 mm Hg) compared with usual care (36% [95% confidence interval {CI}, 30%-42%] vs 31% [95% CI, 25%-37%]; P = .21). Adding Web-based pharmacist care to home BP monitoring and Web training significantly increased the percentage of patients with controlled BP (56%; 95% CI, 49%-62%) compared with usual care (P < .001) and home BP monitoring and Web training only (P < .001). Systolic BP was decreased stepwise from usual care to home BP monitoring and Web training only to home BP monitoring and Web training plus pharmacist care. Diastolic BP was decreased only in the pharmacist care group compared with both the usual care and home BP monitoring and Web training only groups. 
Compared with usual care, the patients who had baseline systolic BP of 160 mm Hg or higher and received home BP monitoring and Web training plus pharmacist care had a greater net reduction in systolic BP (-13.2 mm Hg [95% CI, -19.2 to -7.1]; P < .001) and diastolic BP (-4.6 mm Hg [95% CI, -8.0 to -1.2]; P < .001), and improved BP control (relative risk, 3.32 [95% CI, 1.86 to 5.94]; P<.001). Pharmacist care management delivered through secure patient Web communications improved BP control in patients with hypertension. Trial Registration clinicaltrials.gov Identifier: NCT00158639.
The design and implementation of web mining in web sites security
NASA Astrophysics Data System (ADS)
Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li
2003-06-01
Backdoors and information leaks in Web servers can be detected by applying Web mining techniques to abnormal Web log and Web application log data, enhancing the security of Web servers and avoiding the damage of illegal access. First, a system is proposed for discovering patterns of information leakage in CGI scripts from Web log data. Second, these patterns are provided to system administrators so that they can modify their code and enhance Web site security. Two aspects are described: one is combining the Web application log with the Web log to extract more information, so that Web data mining can discover information that firewalls and intrusion detection systems cannot find; the other is an operational module for the Web site that enhances its security. For clustering server sessions, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.
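The density-based clustering step can be sketched with a minimal DBSCAN-style routine over per-session features. The feature choice below (two numeric values per session) and the toy data are hypothetical; the abstract does not state which session features the system extracts.

```python
# Minimal density-based (DBSCAN-style) clustering of server sessions.
# Points are hypothetical per-session feature vectors; sessions that fall in
# no dense region are labeled -1 (noise), i.e. candidates for abnormal access.
def dbscan(points, eps, min_pts):
    """Label each point with a cluster id, or -1 for noise."""
    def neighbors(i):
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps ** 2]

    labels = [None] * len(points)
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1                    # noise: no dense neighborhood
            continue
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:                          # expand the cluster outward
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster           # border point joins the cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:        # core point: keep expanding
                seeds.extend(j_nbrs)
        cluster += 1
    return labels

# Two dense groups of normal sessions plus one isolated outlier:
sessions = [(1, 1), (1.1, 0.9), (0.9, 1.2),
            (5, 5), (5.1, 4.9), (4.9, 5.2),
            (20, 1)]
print(dbscan(sessions, eps=0.8, min_pts=2))
```

The outlier at (20, 1) receives the noise label, which is exactly the kind of session an administrator would inspect for illegal access.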
GEMSS: grid-infrastructure for medical service provision.
Benkner, S; Berti, G; Engelbrecht, G; Fingberg, J; Kohring, G; Middleton, S E; Schmidt, R
2005-01-01
The European GEMSS Project is concerned with the creation of medical Grid service prototypes and their evaluation in a secure service-oriented infrastructure for distributed on demand/supercomputing. Key aspects of the GEMSS Grid middleware include negotiable QoS support for time-critical service provision, flexible support for business models, and security at all levels in order to ensure privacy of patient data as well as compliance to EU law. The GEMSS Grid infrastructure is based on a service-oriented architecture and is being built on top of existing standard Grid and Web technologies. The GEMSS infrastructure offers a generic Grid service provision framework that hides the complexity of transforming existing applications into Grid services. For the development of client-side applications or portals, a pluggable component framework has been developed, providing developers with full control over business processes, service discovery, QoS negotiation, and workflow, while keeping their underlying implementation hidden from view. A first version of the GEMSS Grid infrastructure is operational and has been used for the set-up of a Grid test-bed deploying six medical Grid service prototypes including maxillo-facial surgery simulation, neuro-surgery support, radio-surgery planning, inhaled drug-delivery simulation, cardiovascular simulation and advanced image reconstruction. The GEMSS Grid infrastructure is based on standard Web Services technology with an anticipated future transition path towards the OGSA standard proposed by the Global Grid Forum. GEMSS demonstrates that the Grid can be used to provide medical practitioners and researchers with access to advanced simulation and image processing services for improved preoperative planning and near real-time surgical support.
39 CFR 501.11 - Reporting Postage Evidencing System security weaknesses.
Code of Federal Regulations, 2014 CFR
2014-07-01
... any repeatable deviation from normal Postage Evidencing System performance. (3) Cyber attacks that... misappropriating assets or sensitive information, corrupting data, or causing operational disruption. Cyber attacks... causing denial-of-service attacks on Web sites. Cyber attacks may be carried out by third parties or...
39 CFR 501.11 - Reporting Postage Evidencing System security weaknesses.
Code of Federal Regulations, 2013 CFR
2013-07-01
... any repeatable deviation from normal Postage Evidencing System performance. (3) Cyber attacks that... misappropriating assets or sensitive information, corrupting data, or causing operational disruption. Cyber attacks... causing denial-of-service attacks on Web sites. Cyber attacks may be carried out by third parties or...
39 CFR 501.11 - Reporting Postage Evidencing System security weaknesses.
Code of Federal Regulations, 2012 CFR
2012-07-01
... any repeatable deviation from normal Postage Evidencing System performance. (3) Cyber attacks that... misappropriating assets or sensitive information, corrupting data, or causing operational disruption. Cyber attacks... causing denial-of-service attacks on Web sites. Cyber attacks may be carried out by third parties or...
A National Crop Progress Monitoring System Based on NASA Earth Science Results
NASA Astrophysics Data System (ADS)
Di, L.; Yu, G.; Zhang, B.; Deng, M.; Yang, Z.
2011-12-01
Crop progress is an important piece of information for food security and agricultural commodities, and timely monitoring and reporting are mandated for agricultural statistical agencies. Traditionally, the weekly reports issued by the National Agricultural Statistics Service (NASS) of the United States Department of Agriculture (USDA) have been based on reports from knowledgeable state and county agricultural officials and farmers; the results are spatially coarse and subjective. In this project, a remote-sensing-supported crop progress monitoring system is being developed using data and derived products from NASA Earth Observing satellites. The Moderate Resolution Imaging Spectroradiometer (MODIS) Level 3 product MOD09 (Surface Reflectance) is used for deriving the daily normalized difference vegetation index (NDVI), vegetation condition index (VCI), and mean vegetation condition index (MVCI). Ratio change relative to the previous year and to the multiple-year mean can also be produced on demand. The time-series vegetation condition indices are further combined with NASS's remote-sensing-derived Cropland Data Layer (CDL) to estimate crop condition and progress crop by crop. To facilitate operational requirements and increase the accessibility of data and products by different users, each component of the system has been developed and implemented following open specifications under the Web Service reference model of the Open Geospatial Consortium Inc. Sensor observations and data are accessed through the Web Coverage Service (WCS), Web Feature Service (WFS), or Sensor Observation Service (SOS) where available, and products are also served through such open-specification-compliant services. For rendering and presentation, the Web Map Service (WMS) is used. A Web-service-based system is set up and deployed at dss.csiss.gmu.edu/NDVIDownload.
Further development will adopt crop growth models, feed the models with remotely sensed precipitation and soil moisture information, and incorporate the model results with vegetation-index time series for crop progress stage estimation.
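The vegetation indices named in the abstract follow standard remote-sensing definitions; a minimal sketch of how they could be computed from surface-reflectance bands (the function names and scaling conventions here are standard usage, not taken from the system described):

```python
def ndvi(nir, red):
    # Normalized difference vegetation index from near-infrared and
    # red surface reflectance; ranges from -1 to 1.
    return (nir - red) / (nir + red)

def vci(ndvi_now, ndvi_min, ndvi_max):
    # Vegetation condition index: current NDVI scaled against the
    # multi-year minimum and maximum for the same pixel/period (0-100).
    return 100.0 * (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min)

def mvci(ndvi_now, ndvi_mean):
    # Mean vegetation condition index: percent departure of current
    # NDVI from the multi-year mean.
    return 100.0 * (ndvi_now - ndvi_mean) / ndvi_mean
```

Time series of these per-pixel indices, masked by the Cropland Data Layer, give the crop-by-crop condition estimates the abstract describes.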
Accelerating Cancer Systems Biology Research through Semantic Web Technology
Wang, Zhihui; Sagotsky, Jonathan; Taylor, Thomas; Shironoshita, Patrick; Deisboeck, Thomas S.
2012-01-01
Cancer systems biology is an interdisciplinary, rapidly expanding research field in which collaborations are a critical means to advance the field. Yet the prevalent database technologies often isolate data rather than making it easily accessible. The Semantic Web has the potential to help facilitate web-based collaborative cancer research by presenting data in a manner that is self-descriptive, human and machine readable, and easily sharable. We have created a semantically linked online Digital Model Repository (DMR) for storing, managing, executing, annotating, and sharing computational cancer models. Within the DMR, distributed, multidisciplinary, and inter-organizational teams can collaborate on projects without forfeiting intellectual property. This is achieved by the introduction of a new stakeholder to the collaboration workflow, the institutional licensing officer, part of the Technology Transfer Office. Furthermore, the DMR has achieved silver level compatibility with the National Cancer Institute’s caBIG®, so users can interact with the DMR not only through a web browser but also through a semantically annotated and secure web service. We also discuss the technology behind the DMR leveraging the Semantic Web, ontologies, and grid computing to provide secure inter-institutional collaboration on cancer modeling projects, online grid-based execution of shared models, and the collaboration workflow protecting researchers’ intellectual property. PMID:23188758
Accelerating cancer systems biology research through Semantic Web technology.
Wang, Zhihui; Sagotsky, Jonathan; Taylor, Thomas; Shironoshita, Patrick; Deisboeck, Thomas S
2013-01-01
Cancer systems biology is an interdisciplinary, rapidly expanding research field in which collaborations are a critical means to advance the field. Yet the prevalent database technologies often isolate data rather than making it easily accessible. The Semantic Web has the potential to help facilitate web-based collaborative cancer research by presenting data in a manner that is self-descriptive, human and machine readable, and easily sharable. We have created a semantically linked online Digital Model Repository (DMR) for storing, managing, executing, annotating, and sharing computational cancer models. Within the DMR, distributed, multidisciplinary, and inter-organizational teams can collaborate on projects, without forfeiting intellectual property. This is achieved by the introduction of a new stakeholder to the collaboration workflow, the institutional licensing officer, part of the Technology Transfer Office. Furthermore, the DMR has achieved silver level compatibility with the National Cancer Institute's caBIG, so users can interact with the DMR not only through a web browser but also through a semantically annotated and secure web service. We also discuss the technology behind the DMR leveraging the Semantic Web, ontologies, and grid computing to provide secure inter-institutional collaboration on cancer modeling projects, online grid-based execution of shared models, and the collaboration workflow protecting researchers' intellectual property. Copyright © 2012 Wiley Periodicals, Inc.
A Web-Based Database for Nurse Led Outreach Teams (NLOT) in Toronto.
Li, Shirley; Kuo, Mu-Hsing; Ryan, David
2016-01-01
A web-based system can provide access to real-time data and information. Healthcare is moving towards digitizing patients' medical information and securely exchanging it through web-based systems. In one of Ontario's health regions, Nurse Led Outreach Teams (NLOT) provide emergency mobile nursing services to help reduce unnecessary transfers from long-term care homes to emergency departments. Currently the NLOT team uses a Microsoft Access database to keep track of the health information on the residents that they serve. The Access database lacks scalability, portability, and interoperability. The objective of this study is the development of a web-based database using Oracle Application Express that is easily accessible from mobile devices. The web-based database will allow NLOT nurses to enter and access resident information anytime and from anywhere.
2014-07-01
voluminous threat environment. Today we regularly construct seamless encrypted communications between machines through SSL or other TLS. These do not ... return to the web application and the user. As a prerequisite to end-to-end communication, an SSL, or other suitable TLS, is set up between each of the ... a TLS connection is established between the requestor and the service provider, within which a WS-Security package will be sent to the service
2010-08-01
the public and for first responders to access disaster information and services provided by government agencies and non-governmental organizations ... thereby reducing the performance gap for a single federal disaster-management site. DMIS provides government and non-governmental organizations ... National Incident Management System (NIMS) and National Response Framework (NRF): a. First responders b. Local governments and agencies c. Regional and federal agencies
Wiki Mass Authoring for Experiential Learning: A Case Study
ERIC Educational Resources Information Center
Pardue, Harold; Landry, Jeffrey; Sweeney, Bob
2013-01-01
Web 2.0 services include sharing and collaborative technologies such as blogs, social networking sites, online office productivity tools, and wikis. Wikis are increasingly used for the design and implementation of pedagogy, for example to facilitate experiential learning. A U.S. government-funded project for system security risk assessment was…
OpenID Connect as a security service in cloud-based medical imaging systems.
Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter
2016-04-01
The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing in healthcare domains. OpenID Connect, which combines OpenID and OAuth, is an emerging representational state transfer-based federated identity solution. It is one of the most widely adopted open standards, has the potential to become the de facto standard for securing cloud computing and mobile applications, and has been called the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among a diagnostic imaging repository (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, so that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model.
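OpenID Connect layers an identity token on top of the OAuth 2.0 authorization-code flow. A minimal sketch of constructing the initial authorization request (the endpoint, client identifier, and redirect URI below are placeholders for illustration, not values from the paper):

```python
from urllib.parse import urlencode

def build_auth_request(authz_endpoint, client_id, redirect_uri, state, nonce):
    # Standard OpenID Connect authorization-code request parameters;
    # the scope must include "openid" for an ID token to be issued.
    params = {
        "response_type": "code",
        "scope": "openid profile",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,   # CSRF protection, echoed back on the redirect
        "nonce": nonce,   # binds the eventual ID token to this request
    }
    return authz_endpoint + "?" + urlencode(params)

url = build_auth_request("https://idp.example/authorize", "pacs-client",
                         "https://di-r.example/callback", "st-1", "n-1")
```

The user agent is redirected to this URL; after authentication, the client exchanges the returned code for an ID token and access token at the provider's token endpoint.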
Session management for web-based healthcare applications.
Wei, L.; Sengupta, S.
1999-01-01
In health care systems, users may access multiple applications during one session of interaction with the system. However, users must sign on to each application individually, and it is difficult to maintain a common context among these applications. We are developing a session management system for web-based applications using LDAP directory service, which will allow single sign-on to multiple web-based applications, and maintain a common context among those applications for the user. This paper discusses the motivations for building this system, the system architecture, and the challenges of our approach, such as the session objects management for the user, and session security. PMID:10566511
NASA Astrophysics Data System (ADS)
Kershaw, Philip; Lawrence, Bryan; Lowe, Dominic; Norton, Peter; Pascoe, Stephen
2010-05-01
CEDA (Centre for Environmental Data Archival), based at STFC Rutherford Appleton Laboratory, is host to the BADC (British Atmospheric Data Centre) and the NEODC (NERC Earth Observation Data Centre), with data holdings of over half a Petabyte. In the coming months this figure is set to increase by over one Petabyte through the BADC's role as one of three data centres hosting the CMIP5 (Coupled Model Intercomparison Project Phase 5) core archive of climate model data. Quite apart from the problem of managing the storage of such large volumes, there is the challenge of collating the data from the modelling centres around the world and enabling access to these data for the user community. An infrastructure to support this is being developed under the US Earth System Grid (ESG) and related projects, bringing participating organisations together in a federation. The ESG architecture defines Gateways, the web interfaces that enable users to access data, and data-serving applications organised into Data Nodes. The BADC has been working in collaboration with the US Earth System Grid team and other partners to develop a security system to restrict access to data. This provides single sign-on via both OpenID and PKI-based means, and uses role-based authorisation facilitated by SAML- and OpenID-based interfaces for attribute retrieval. This presentation will provide an overview of the access control architecture and look at how this has been implemented for CEDA. CEDA has developed expertise in data access and information services over several years through a number of projects to develop and enhance these capabilities. Participation in CMIP5 comes at a time when a number of other software development activities are coming to fruition. New services are in the process of being deployed alongside the services making up the system for ESG. The security system must apply access control across this heterogeneous environment of different data services and technologies.
One strand of the development efforts within CEDA has been the NDG (NERC Datagrid) Security system. This system has been extended to interoperate with ESG, greatly assisted by the standards-based approach adopted for the ESG security architecture. Drawing on experience from previous projects, the decision was taken to refactor the NDG Security software into a component-based architecture to enable a separation of concerns between access control and the functionality of a given application being protected. Such an approach is only possible through a generic interface. At CEDA, this has been realised in the Python programming language using the WSGI (Web Server Gateway Interface) specification. A parallel Java filter-based implementation is also under development with our US partners for use with the THREDDS Data Server. Using such technologies, applications and middleware can be assembled into custom configurations to meet different requirements. In the case of access control, NDG Security middleware can be layered over the top of existing applications without the need to modify them. A RESTful approach to the application of authorisation policy has been key in this approach. We explore the practical implementation of such a scheme alongside the application of the ESG security architecture to CEDA's OGC web services implementation, COWS.
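The layering the abstract describes, where access-control middleware wraps an unmodified application through the generic WSGI interface, can be sketched as follows (the header name and the simple allow-list check are illustrative, not NDG Security's actual API):

```python
class AuthMiddleware:
    """WSGI middleware that rejects requests lacking an authorized user.

    The wrapped application is untouched: access control is a separate
    layer composed in front of it, as the component-based design intends.
    """
    def __init__(self, app, allowed_users):
        self.app = app
        self.allowed_users = set(allowed_users)

    def __call__(self, environ, start_response):
        user = environ.get("REMOTE_USER")
        if user not in self.allowed_users:
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Access denied"]
        # Authorized: delegate to the protected application unchanged.
        return self.app(environ, start_response)

def data_app(environ, start_response):
    # Stand-in for an existing data service being protected.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"dataset contents"]

protected = AuthMiddleware(data_app, allowed_users=["alice"])
```

Because both sides speak plain WSGI, the same middleware can be layered over any compliant data service without modifying it.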
Columbia University's Informatics for Diabetes Education and Telemedicine (IDEATel) Project
Starren, Justin; Hripcsak, George; Sengupta, Soumitra; Abbruscato, C.R.; Knudson, Paul E.; Weinstock, Ruth S.; Shea, Steven
2002-01-01
The Columbia University Informatics for Diabetes Education and Telemedicine (IDEATel) project is a four-year demonstration project funded by the Centers for Medicare and Medicaid Services with the overall goal of evaluating the feasibility, acceptability, effectiveness, and cost-effectiveness of telemedicine. The focal point of the intervention is the home telemedicine unit (HTU), which provides four functions: synchronous videoconferencing over standard telephone lines, electronic transmission of fingerstick glucose and blood pressure readings, secure Web-based messaging and clinical data review, and access to Web-based educational materials. The HTU must be usable by elderly patients with no prior computer experience. Providing these functions through the HTU requires tight integration of six components: the HTU itself, case management software, a clinical information system, Web-based educational material, data security, and networking and telecommunications. These six components were integrated through a variety of interfaces, providing a system that works well for patients and providers. With more than 400 HTUs installed, IDEATel has demonstrated the feasibility of large-scale home telemedicine. PMID:11751801
JPARSS: A Java Parallel Network Package for Grid Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Jie; Akers, Walter; Chen, Ying
2002-03-01
The emergence of high-speed wide area networks makes grid computing a reality. However, grid applications that need reliable data transfer still have difficulty achieving optimal TCP performance, due to the tuning of the TCP window size required to improve bandwidth and reduce latency on a high-speed wide area network. This paper presents a Java package called JPARSS (Java Parallel Secure Stream (Socket)) that divides data into partitions that are sent over several parallel Java streams simultaneously, allowing Java or Web applications to achieve optimal TCP performance in a grid environment without the necessity of tuning the TCP window size. This package enables single sign-on, certificate delegation, and secure or plain-text data transfer using several security components based on X.509 certificates and SSL. Several experiments will be presented to show that using Java parallel streams is more effective than tuning the TCP window size. In addition, a simple architecture using Web services
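The core idea, dividing a payload into partitions sent over several parallel streams and reassembled in order by the receiver, can be sketched as follows (Python here for brevity, although JPARSS itself is a Java package; the function names are illustrative):

```python
def partition(data: bytes, n_streams: int) -> list:
    # Split a payload into at most n_streams near-equal chunks, one per
    # parallel socket; chunk size is the ceiling of len/n_streams.
    chunk = (len(data) + n_streams - 1) // n_streams
    return [data[i:i + chunk] for i in range(0, len(data), chunk)]

def reassemble(parts: list) -> bytes:
    # The receiver concatenates the ordered partitions back together.
    return b"".join(parts)
```

Each chunk would be written to its own TCP connection; with several connections in flight, aggregate throughput is less sensitive to any single connection's window size, which is the effect the paper's experiments measure.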
Cloud-based Predictive Modeling System and its Application to Asthma Readmission Prediction
Chen, Robert; Su, Hang; Khalilia, Mohammed; Lin, Sizhe; Peng, Yue; Davis, Tod; Hirsh, Daniel A; Searles, Elizabeth; Tejedor-Sojo, Javier; Thompson, Michael; Sun, Jimeng
2015-01-01
The predictive modeling process is time-consuming and requires clinical researchers to handle complex electronic health record (EHR) data in restricted computational environments. To address this problem, we implemented a cloud-based predictive modeling system via a hybrid setup combining a secure private server with the Amazon Web Services (AWS) Elastic MapReduce platform. EHR data is preprocessed on a private server and the resulting de-identified event sequences are hosted on AWS. Based on user-specified modeling configurations, an on-demand web service launches a cluster of Elastic Compute 2 (EC2) instances on AWS to perform feature selection and classification algorithms in a distributed fashion. Afterwards, the secure private server aggregates results and displays them via interactive visualization. We tested the system on a pediatric asthma readmission task on a de-identified EHR dataset of 2,967 patients. We conducted a larger-scale experiment on the CMS Linkable 2008–2010 Medicare Data Entrepreneurs’ Synthetic Public Use File dataset of 2 million patients, which achieved over 25-fold speedup compared to sequential execution. PMID:26958172
First experience with the new .cern Top Level Domain
NASA Astrophysics Data System (ADS)
Alvarez, E.; Malo de Molina, M.; Salwerowicz, M.; Silva De Sousa, B.; Smith, T.; Wagner, A.
2017-10-01
In October 2015, CERN’s core website was moved to a new address, http://home.cern, marking the launch of the brand new top-level domain .cern. In combination with a formal governance and registration policy, the IT infrastructure needed to be extended to accommodate the hosting of Web sites in this new top-level domain. We will present the technical implementation in the framework of the CERN Web Services, which provides virtual hosting and a reverse proxy solution, and also includes the provisioning of SSL server certificates for secure communications.
Enabling Secure XMPP Communications in Federated IoT Clouds Through XEP 0027 and SAML/SASL SSO
Celesti, Antonio; Fazio, Maria; Villari, Massimo
2017-01-01
Nowadays, in the panorama of the Internet of Things (IoT), finding the right compromise between interactivity and security is not trivial at all. Currently, most pervasive communication technologies are designed to work locally. As a consequence, the development of large-scale Internet services and applications is not easy for IoT Cloud providers. The main issue is that both IoT architectures and services started out simple but are becoming more and more complex. Consequently, web service technology is often inappropriate. Recently, many operators in both academia and industry have been considering the possibility of adopting the eXtensible Messaging and Presence Protocol (XMPP) for the implementation of IoT Cloud communication systems. In fact, XMPP offers many advantages in terms of real-time capabilities, efficient data distribution, service discovery, and inter-domain communication compared to other technologies. Nevertheless, the protocol lacks native security, data confidentiality, and trustworthy federation features. In this paper, considering an XMPP-based IoT Cloud architectural model, we discuss how message signing/encryption and Single Sign-On (SSO) authentication can be enforced for secure inter-module and inter-domain communications, respectively, in a federated environment. Experiments prove that the security mechanisms introduce an acceptable overhead, considering the obvious advantages achieved in terms of data trustworthiness and privacy. PMID:28178214
Enabling Secure XMPP Communications in Federated IoT Clouds Through XEP 0027 and SAML/SASL SSO.
Celesti, Antonio; Fazio, Maria; Villari, Massimo
2017-02-07
Nowadays, in the panorama of the Internet of Things (IoT), finding the right compromise between interactivity and security is not trivial at all. Currently, most pervasive communication technologies are designed to work locally. As a consequence, the development of large-scale Internet services and applications is not easy for IoT Cloud providers. The main issue is that both IoT architectures and services started out simple but are becoming more and more complex. Consequently, web service technology is often inappropriate. Recently, many operators in both academia and industry have been considering the possibility of adopting the eXtensible Messaging and Presence Protocol (XMPP) for the implementation of IoT Cloud communication systems. In fact, XMPP offers many advantages in terms of real-time capabilities, efficient data distribution, service discovery, and inter-domain communication compared to other technologies. Nevertheless, the protocol lacks native security, data confidentiality, and trustworthy federation features. In this paper, considering an XMPP-based IoT Cloud architectural model, we discuss how message signing/encryption and Single Sign-On (SSO) authentication can be enforced for secure inter-module and inter-domain communications, respectively, in a federated environment. Experiments prove that the security mechanisms introduce an acceptable overhead, considering the obvious advantages achieved in terms of data trustworthiness and privacy.
Information-Flow-Based Access Control for Web Browsers
NASA Astrophysics Data System (ADS)
Yoshihama, Sachiko; Tateishi, Takaaki; Tabuchi, Naoshi; Matsumoto, Tsutomu
The emergence of Web 2.0 technologies such as Ajax and Mashup has revealed the weakness of the same-origin policy[1], the current de facto standard for the Web browser security model. We propose a new browser security model to allow fine-grained access control in the client-side Web applications for secure mashup and user-generated contents. We propose a browser security model that is based on information-flow-based access control (IBAC) to overcome the dynamic nature of the client-side Web applications and to accurately determine the privilege of scripts in the event-driven programming model.
ERIC Educational Resources Information Center
Fischer, Audrey; Cole, John Y.; Tarr, Susan M.; Carey, Len; Mehnert, Robert; Sherman, Andrew M.; Davis, Linda; Leahy, Debra W.; Chute, Adrienne; Willard, Robert S.; Dunn, Christina
2003-01-01
Includes annual reports from 12 federal agencies and libraries that discuss security, budgets, legislation, digital projects, preservation, government role, information management, personnel changes, collections, databases, financial issues, services, administration, Web sites, access to information, customer service, statistics, international…
NASA Technical Reports Server (NTRS)
Hinke, Thomas H.
2004-01-01
Grid technology consists of middleware that permits distributed computations, data, and sensors to be seamlessly integrated into a secure, single-sign-on processing environment. In this environment, a user has to identify and authenticate himself once to the grid middleware, and can then utilize any of the distributed resources to which he has been granted access. Grid technology allows resources that exist in enterprises under different administrative control to be securely integrated into a single processing environment. The grid community has adopted commercial web services technology as a means for implementing persistent, re-usable grid services that sit on top of the basic distributed processing environment that grids provide. These grid services can then form building blocks for even more complex grid services. Each grid service is characterized using the Web Service Description Language, which provides a description of the interface and how other applications can access it. The emerging Semantic Grid work seeks to associate sufficient semantic information with each grid service such that applications will be able to automatically select, compose, and if necessary substitute available equivalent services in order to assemble collections of services that are most appropriate for a particular application. Grid technology has been used to provide limited support to various Earth and space science applications. Looking to the future, this emerging grid service technology can provide a cyberinfrastructure for both the Earth and space science communities. Groups within these communities could transform those applications that have community-wide applicability into persistent grid services that are made widely available to their respective communities.
In concert with grid-enabled data archives, users could easily create complex workflows that extract desired data from one or more archives and process it through an appropriate set of widely distributed grid services discovered using semantic grid technology. As required, high-end computational resources could be drawn from available grid resource pools. Using grid technology, this confluence of data, services, and computational resources could easily be harnessed to transform data from many different sources into a desired product that is delivered to a user's workstation or to a web portal through which it can be accessed by its intended audience.
Securing the anonymity of content providers in the World Wide Web
NASA Astrophysics Data System (ADS)
Demuth, Thomas; Rieke, Andreas
1999-04-01
Nowadays the World Wide Web (WWW) is an established service used by people all over the world. Most of them do not recognize the fact that they reveal plenty of information about themselves or their affiliation and computer equipment to the providers of the web pages they connect to. As a result, many services allow users to access web pages unrecognized or without risk of being backtracked. This kind of anonymity is called user or client anonymity. On the other hand, an equivalent protection for content providers does not exist, although this feature is desirable in many situations in which the identity of a publisher or content provider is to be hidden. We call this property server anonymity. We introduce the first system whose primary target is to offer anonymity for providers of information in the WWW. Besides this property, it also provides client anonymity. Based on David Chaum's idea of mixes, and in the context of the WWW, we explain the term 'server anonymity', motivating the system JANUS, which offers both client and server anonymity.
OpenID Connect as a security service in cloud-based medical imaging systems
Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter
2016-01-01
The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing in healthcare domains. OpenID Connect, which combines OpenID and OAuth, is an emerging representational state transfer-based federated identity solution. It is one of the most widely adopted open standards, has the potential to become the de facto standard for securing cloud computing and mobile applications, and has been called the “Kerberos of the cloud.” We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among a diagnostic imaging repository (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, so that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model. PMID:27340682
IRONSIDES: DNS With No Single Packet Denial of Service or Remote Code Execution Vulnerabilities
2012-02-27
[Garbled extraction: a feature-comparison table of DNS implementations (caching, DNSSEC, TSIG, IPv6, wildcard support, interface) and fragments of reference entries.]
Selling Internet Gambling: Advertising, New Media and the Content of Poker Promotion
ERIC Educational Resources Information Center
McMullan, John L.; Kervin, Melissa
2012-01-01
This study examines the web design and engineering, advertising and marketing, and pedagogical features present at a random sample of 71 international poker sites obtained from the Casino City directory in the summer of 2009. We coded for 22 variables related to access, appeal, player protection, customer services, on-site security, use of images,…
Making Choices in the Virtual World: The New Model at United Technologies Information Network.
ERIC Educational Resources Information Center
Gulliford, Bradley
1998-01-01
Describes changes in services of the United Technologies Corporation Information Network from a traditional library system to a virtual system of World Wide Web sites, a document-delivery unit, telephone and e-mail reference, and desktop technical support to provide remote access. Staff time, security, and licensing issues are addressed.…
2014-11-01
unclassified tools and techniques that can be shared with PNs, to include social engineering, spear phishing, fake web sites, physical access attempts, and ... and instead rely on commercial services such as Yahoo or Google. Some nations have quite advanced cyber security practices, but may take vastly ... unauthorized access to data/systems; inject external network scanning, email phishing, malicious website access, social engineering
Efficient Web Services Policy Combination
NASA Technical Reports Server (NTRS)
Vatan, Farrokh; Harman, Joseph G.
2010-01-01
Large-scale Web security systems usually involve cooperation between domains with non-identical policies. The network management and Web communication software used by the different organizations presents a stumbling block. Many of the tools used by the various divisions do not have the ability to communicate network management data with each other. At best, this means that manual human intervention into the communication protocols used at various network routers and endpoints is required. Developing practical, sound, and automated ways to compose policies to bridge these differences is a long-standing problem. One of the key subtleties is the need to deal with inconsistencies and defaults, where one organization proposes a rule on a particular feature and another has a different rule or expresses no rule. A general approach is to assign priorities to rules and observe the rules with the highest priorities when there are conflicts. The present methods have an inherent inefficiency that heavily restricts their practical application. A new, efficient algorithm combines policies utilized for Web services. The method is based on an algorithm that allows an automatic and scalable composition of security policies between multiple organizations. It is based on defeasible policy composition, a promising approach for finding conflicts and resolving priorities between rules. In the general case, policy negotiation is an intractable problem. A promising method, suggested in the literature, is when policies are represented in defeasible logic, and composition is based on rules for non-monotonic inference. In this system, policy writers construct meta-policies describing both the policy that they wish to enforce and annotations describing their composition preferences.
These annotations can indicate whether certain policy assertions are required by the policy writer or, if not, under what circumstances the policy writer is willing to compromise and allow other assertions to take precedence. Meta-policies are specified in defeasible logic, a computationally efficient non-monotonic logic developed to model human reasoning. One drawback of this method is that at one point the algorithm starts an exhaustive search of all subsets of the set of conclusions of a defeasible theory. Although propositional defeasible logic has linear complexity, the set of conclusions here may be large, especially in real-life practical cases. This phenomenon leads to an inefficient exponential explosion of complexity. The current process of deriving a Web security policy from the combination of two meta-policies consists of two steps: the first is generating a new meta-policy that is a composition of the input meta-policies, and the second is mapping that meta-policy onto a security policy. The new algorithm avoids the exhaustive search of the current algorithm, and provides a security policy that matches all requirements of the involved meta-policies.
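The priority-based resolution described above, where the highest-priority rule wins when organizations disagree and silence imposes no constraint, can be sketched as follows (the rule representation is illustrative, not the paper's defeasible-logic formalism):

```python
def combine_policies(*policies):
    """Combine per-organization policies by rule priority.

    Each policy maps a feature name to a (decision, priority) pair.
    For each feature, the decision with the highest priority wins;
    features an organization is silent on impose no constraint.
    """
    combined = {}
    for policy in policies:
        for feature, (decision, priority) in policy.items():
            current = combined.get(feature)
            if current is None or priority > current[1]:
                combined[feature] = (decision, priority)
    return {feature: decision
            for feature, (decision, _) in combined.items()}

# Hypothetical policies from two cooperating organizations:
org_a = {"encryption": ("require", 10), "logging": ("allow", 5)}
org_b = {"encryption": ("optional", 3), "audit": ("require", 7)}
```

Here org_a's higher-priority "require" on encryption overrides org_b's "optional", while each organization's rules on features the other is silent about pass through unchanged. Defeasible logic generalizes this by letting rules defeat one another conditionally rather than by a single global priority.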
Turning Access into a web-enabled secure information system for clinical trials.
Dongquan Chen; Chen, Wei-Bang; Soong, Mayhue; Soong, Seng-Jaw; Orthner, Helmuth F
2009-08-01
Organizations that have limited resources need to conduct clinical studies in a cost-effective but secure way. Clinical data residing in various individual databases need to be easily accessed and secured. Although widely available, digital certification, encryption, and secure web servers have not been implemented as widely, partly due to a lack of understanding of needs and concerns over issues such as cost and difficulty of implementation. The objective of this study was to test the possibility of centralizing various databases and to demonstrate ways of offering an alternative to a large-scale, comprehensive, and costly commercial product, especially for simple phase I and II trials, with reasonable convenience and security. We report a working procedure to transform a standalone Access database into a secure Web-based information system. For data collection and reporting purposes, we centralized several individual databases, and developed and tested a web-based secure server using self-issued digital certificates. The system lacks audit trails. The cost of development and maintenance may hinder its wide application. The clinical trial databases scattered in various departments of an institution could be centralized into a web-enabled secure information system. Limitations such as the lack of a calendar and audit trail can be partially addressed with additional programming. The centralized Web system may provide an alternative to a comprehensive clinical trial management system.
ERIC Educational Resources Information Center
Shermis, Mark D.; Averitt, Jason
The purpose of this paper is to enumerate a series of security steps that might be taken by those researchers or organizations that are contemplating Web-based tests and performance assessments. From a security viewpoint, much of what goes on with Web-based transactions is similar to other general computer activity, but the recommendations here…
The old age health security in rural China: where to go?
Dai, Baozhen
2015-11-04
The huge number of rural elders and deepening health problems (e.g. growing threats of infectious and chronic diseases) place enormous pressure on old age health security in rural China. This study aims to provide information for policy-makers to develop effective measures for promoting rural elders' access to health care services by examining the current developments and challenges confronting old age health security in rural China. Sources searched included electronic databases; the web pages of the National Bureau of Statistics of China and the National Health and Family Planning Commission of China; the China Population and Employment Statistics Yearbook; the China Civil Affairs' Statistical Yearbook; and the China Health Statistics Yearbooks. Articles were identified from Elsevier, Wiley, EBSCO, EMBASE, PubMed, SCI Expanded, ProQuest, and the National Knowledge Infrastructure of China (CNKI), the most informative database in Chinese. Search terms were "rural", "China", "health security", "cooperative medical scheme", "social medical assistance", "medical insurance" or "community based medical insurance", "old", "elder", "elderly", "aged", or "aging". Google Scholar was searched with the same combination of keywords. The results showed that old age health security in rural China had expanded to all rural elders and substantially improved health care service utilization among them. Increasing chronic disease prevalence rates, pressing public health issues, an inefficient rural health care service provision system, and a lack of sufficient financing challenge old age health security in rural China. Increased funding from the central and regional governments will contribute to reducing urban-rural disparities in the provision of old age health security and to increasing health equity among rural elders across regions.
Meanwhile, initiating provider payment reform may contribute to improving the efficiency of rural health care service provision system and promoting health care service access among rural elders.
17 CFR 240.15c2-12 - Municipal securities disclosure.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Internet Web site or filed with the Commission. (4) The term issuer of municipal securities means the... the public on the Municipal Securities Rulemaking Board's Internet Web site or filed with the... 17 Commodity and Securities Exchanges 4 2014-04-01 2014-04-01 false Municipal securities...
17 CFR 240.15c2-12 - Municipal securities disclosure.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Internet Web site or filed with the Commission. (4) The term issuer of municipal securities means the... the public on the Municipal Securities Rulemaking Board's Internet Web site or filed with the... 17 Commodity and Securities Exchanges 3 2013-04-01 2013-04-01 false Municipal securities...
17 CFR 240.15c2-12 - Municipal securities disclosure.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Internet Web site or filed with the Commission. (4) The term issuer of municipal securities means the... the public on the Municipal Securities Rulemaking Board's Internet Web site or filed with the... 17 Commodity and Securities Exchanges 3 2011-04-01 2011-04-01 false Municipal securities...
17 CFR 240.15c2-12 - Municipal securities disclosure.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Internet Web site or filed with the Commission. (4) The term issuer of municipal securities means the... the public on the Municipal Securities Rulemaking Board's Internet Web site or filed with the... 17 Commodity and Securities Exchanges 3 2012-04-01 2012-04-01 false Municipal securities...
Spaceflight Operations Services Grid (SOSG) Project
NASA Technical Reports Server (NTRS)
Bradford, Robert; Lisotta, Anthony
2004-01-01
The motivation, goals, and objectives of the Space Operations Services Grid Project (SOSG) are covered in this viewgraph presentation. The goals and objectives of SOSG include: 1) Developing a grid-enabled prototype providing space-based ground operations end-user services, through a collaborative effort between NASA, academia, and industry, to assess the technical and cost feasibility of implementing Grid technologies in the space operations arena; 2) Providing space operations organizations and processes, through a single secure portal(s), access to all the information technology (Grid and Web based) services necessary for program/project development, operations, and the ultimate creation of new processes, information, and knowledge.
The secure authorization model for healthcare information system.
Hsu, Wen-Shin; Pan, Jiann-I
2013-10-01
Exploring healthcare systems that assist medical services or transmit patients' personal health information in web applications has been widely investigated. Information and communication technologies have been applied to medical services and the healthcare area for a number of years to resolve problems in medical management. In a healthcare system, not all users are allowed to access all the information. Several authorization models for restricting users to access specific information with specific permissions have been proposed. However, as the number of users and the amount of information grow, the difficulty of administering user authorization increases. This critical problem limits the widespread usage of healthcare systems. This paper proposes a role-based approach and extends it to deal with authorization over the information in a healthcare system. We propose a role-based authorization model that supports authorizations for different kinds of objects, along with a new authorization domain. Based on this model, we discuss the issues and requirements of security in healthcare systems. The security issues for services shared between different healthcare industries are also discussed.
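A minimal sketch of such a role-based authorization model follows; the role, action, and object-kind names are hypothetical, and the paper's "authorization domain" extension is not modeled.

```python
# Minimal RBAC sketch: roles hold permissions on kinds of objects,
# users hold roles, and an access check succeeds if any assigned
# role grants the requested (action, object_kind) pair.

class RBAC:
    def __init__(self):
        self.role_perms = {}   # role -> {(action, object_kind)}
        self.user_roles = {}   # user -> {role}

    def grant(self, role, action, object_kind):
        self.role_perms.setdefault(role, set()).add((action, object_kind))

    def assign(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def authorized(self, user, action, object_kind):
        return any(
            (action, object_kind) in self.role_perms.get(r, set())
            for r in self.user_roles.get(user, set())
        )

rbac = RBAC()
rbac.grant("physician", "read", "health_record")
rbac.assign("alice", "physician")
```

Administering roles rather than per-user grants is what keeps the scheme manageable as users and information grow, which is the scaling problem the abstract identifies.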
Web Application Software for Ground Operations Planning Database (GOPDb) Management
NASA Technical Reports Server (NTRS)
Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey
2013-01-01
A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
Tricks and Clicks: How Low-Cost Carriers Ply Their Trade Through Self-Service Websites
NASA Astrophysics Data System (ADS)
Barry, Chris; Torres, Ann M.
Ethics on the Internet has been a widely debated topic in recent years covering issues that range from privacy to security to fraud. Little, however, has been written on more subtle ethical questions, such as the exploitation of web technologies to inhibit or avoid customer service. Increasingly some firms are using websites to create distance between them and their customer base in specific areas of their operations, while simultaneously developing excellence in sales transaction committal via self-service. This chapter takes a magnifying glass with an ethical lens to one sector - the low-cost, web-based, self-service airline industry, specifically in Ireland. It is noted that the teaching of information systems development (ISD) and, for the most part its practice, assumes ethicality. Similarly, marketing courses focus on satisfying customer needs more effectively and efficiently within the confines of an acceptable ethos. This chapter observes that while these business disciplines are central to the success of self-service websites, there is a disconnect between the normative view and the actuality of practice.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
... Certain Obsolete Text April 17, 2013. Pursuant to Section 19(b)(1) \\1\\ of the Securities Exchange Act of... and Charges for Exchange Services (the ``Fee Schedule'') to remove certain obsolete text related to a.... The text of the proposed rule change is available on the Exchange's Web site at www.nyse.com , at the...
NASA Astrophysics Data System (ADS)
Urrutia-Cordero, Pablo; Ekvall, Mattias K.; Hansson, Lars-Anders
2016-07-01
A major challenge for ecological research is to identify ways to improve resilience to climate-induced changes in order to secure the ecosystem functions of natural systems, as well as ecosystem services for human welfare. With respect to aquatic ecosystems, interactions between climate warming and the elevated runoff of humic substances (brownification) may strongly affect ecosystem functions and services. However, we hitherto lack the adaptive management tools needed to counteract such global-scale effects on freshwater ecosystems. Here we show, both experimentally and using monitoring data, that predicted climatic warming and brownification will reduce freshwater quality by exacerbating cyanobacterial growth and toxin levels. Furthermore, in a model based on long-term data from a natural system, we demonstrate that food web management has the potential to increase the resilience of freshwater systems against the growth of harmful cyanobacteria, and thereby that local efforts offer an opportunity to secure our water resources against some of the negative impacts of climate warming and brownification. This allows for novel policy action at a local scale to counteract effects of global-scale environmental change, thereby providing a buffer period and a safer operating space until climate mitigation strategies are effectively established.
The deegree framework - Spatial Data Infrastructure solution for end-users and developers
NASA Astrophysics Data System (ADS)
Kiehle, Christian; Poth, Andreas
2010-05-01
The open source software framework deegree is a comprehensive implementation of standards as defined by ISO and the Open Geospatial Consortium (OGC). It has been developed with two goals in mind: providing a uniform framework for implementing Spatial Data Infrastructures (SDI) and adhering to standards as strictly as possible. Although it is open source software (Lesser GNU Public License, LGPL), deegree has been developed with a business model in mind: providing the general building blocks of SDIs without license fees while offering customization, consulting, and tailoring through specialized companies. The core of deegree is a comprehensive Java Application Programming Interface (API) offering access to spatial features, analysis, metadata, and coordinate reference systems. As a library, deegree can be, and has been, integrated as a core module inside spatial information systems. It is a reference implementation for several OGC standards and is based on an ISO 19107 geometry model. For end users, deegree is shipped as a web application providing easy-to-set-up components for web mapping and spatial analysis. Since 2000, deegree has been the backbone of many productive SDIs, first and foremost for governmental stakeholders (e.g. the Federal Agency for Cartography and Geodesy in Germany, the Ministry of Housing, Spatial Planning and the Environment in the Netherlands, etc.) as well as for research and development projects through early adoption of standards, drafts, and discussion papers. Besides mature standards like the Web Map Service, Web Feature Service, and Catalogue Services, deegree also implements newer standards like the Sensor Observation Service, the Web Processing Service, and the Web Coordinate Transformation Service (WCTS). While a robust background in standardization (knowledge and implementation) is a must for consultancy, standard-compliant services and encodings alone do not provide solutions for customers.
The added value comes from a sophisticated set of client software, desktop, and web environments. One focus lies on client solutions for specific standards like the Web Processing Service and the Web Coordinate Transformation Service. On the other hand, complex geoportal solutions comprising multiple standards and enhanced by components for user management, security, and map client functionality show the demanding requirements of real-world solutions. The XPlan-GML standard as defined by the German spatial planning authorities is a good example of how complex real-world requirements can get. XPlan-GML is intended to provide a framework for digital spatial planning documents and requires complex Geography Markup Language (GML) features along with Symbology Encoding (SE), Filter Encoding (FE), Web Map Services (WMS), and Web Feature Services (WFS). This complex infrastructure is meant to be used by urban and spatial planners and therefore requires a user-friendly graphical interface hiding the complexity of the underlying infrastructure. Based on challenges faced within customer projects, the importance of easy-to-use software components is emphasized. SDI solutions should be built upon ISO/OGC standards but, more importantly, should be user-friendly and support users in spatial data management and analysis.
Designing, Implementing, and Evaluating Secure Web Browsers
ERIC Educational Resources Information Center
Grier, Christopher L.
2009-01-01
Web browsers are plagued with vulnerabilities, providing hackers with easy access to computer systems using browser-based attacks. Efforts that retrofit existing browsers have had limited success since modern browsers are not designed to withstand attack. To enable more secure web browsing, we design and implement new web browsers from the ground…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karthik, Rajasekar; Patlolla, Dilip Reddy; Sorokine, Alexandre
Managing a wide variety of mobile devices across multiple mobile operating systems is a security challenge for any organization [1, 2]. With the wide adoption of mobile devices to access work-related apps, there is an increase in third-party apps that might either misuse or improperly handle users' personal or sensitive data [3]. HTML5 has been receiving wide attention for developing cross-platform mobile apps. According to International Data Corporation (IDC), by 2015, 80% of all mobile apps will be based in part or wholly upon HTML5 [4]. Though HTML5 provides a rich set of features for building an app, it is a challenge for organizations to deploy and manage HTML5 apps on a wide variety of devices while keeping security policies intact. In this paper, we will describe an upcoming secure mobile environment for HTML5 apps, called Sencha Space, that addresses these issues, and discuss how it will be used to design and build a secure, cross-platform mobile mapping service app. We will also describe how HTML5 and a new set of related technologies, such as the Geolocation API, WebGL, OpenLayers 3, and Local Storage, can be used to provide a high-end, high-performance experience for users of the mapping service app.
Secure, Autonomous, Intelligent Controller for Integrating Distributed Sensor Webs
NASA Technical Reports Server (NTRS)
Ivancic, William D.
2007-01-01
This paper describes the infrastructure and protocols necessary to enable near-real-time commanding, access to space-based assets, and the secure interoperation between sensor webs owned and controlled by various entities. Select terrestrial and aeronautics-based sensor webs will be used to demonstrate time-critical interoperability between integrated, intelligent sensor webs, both terrestrial and between terrestrial and space-based assets. For this work, a Secure, Autonomous, Intelligent Controller and knowledge generation unit is implemented using Virtual Mission Operation Center technology.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-16
...This document revises the mailing address and web-based submission procedures for filing certain notices under the Department of Labor (Department) Employee Benefits Security Administration's fiduciary-level fee disclosure regulation under section 408(b)(2) of the Employee Retirement Income Security Act of 1974 (ERISA). Responsible plan fiduciaries of employee pension benefit plans must file these notices with the Department to obtain relief from ERISA's prohibited transaction provisions that otherwise may apply when a covered service provider to the plan fails to disclose information in accordance with the regulation's requirements.
A Low-Cost and Secure Solution for e-Commerce
NASA Astrophysics Data System (ADS)
Pasquet, Marc; Vacquez, Delphine; Rosenberger, Christophe
We present in this paper a new architecture for remote banking and e-commerce applications. The proposed solution is designed to be low cost and to provide good security guarantees for a client and his issuing bank. Indeed, the main problem for an issuer is to identify and authenticate a client (a cardholder) using his personal computer through the web when this client wants to access remote banking services or to pay on an e-commerce site equipped with a 3D-Secure payment solution. The solution described in this paper is MasterCard Chip Authentication Program compliant and was tested in the project called SOPAS. The main contribution of this system is the use of a smartcard with an I2C bus that drives a terminal equipped only with a screen and a keyboard. During the use of services, the user types his PIN code on the keyboard and all the security-sensitive parts of the transaction are performed by the chip of the smartcard. No security information remains on the personal computer, and a dynamic token created by the card is sent to the bank and verified by the front end. We first present the defined methodology and then analyze the main security aspects of the proposed solution.
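The dynamic-token idea can be illustrated with a simplified HMAC-based sketch. This is not the actual MasterCard CAP protocol: the challenge format, token length, and function names are assumptions made for illustration.

```python
# Simplified one-time-token sketch: the "card" derives a short token
# from a secret shared with the bank's front end, so the personal
# computer only relays the token and never holds the secret.
import hmac
import hashlib

def make_token(card_secret: bytes, challenge: bytes) -> str:
    """Token the card would compute after verifying the PIN locally."""
    return hmac.new(card_secret, challenge, hashlib.sha256).hexdigest()[:8]

def verify_token(card_secret: bytes, challenge: bytes, token: str) -> bool:
    """Check performed by the bank's front end for a fresh challenge."""
    return hmac.compare_digest(make_token(card_secret, challenge), token)
```

Because the challenge changes per transaction, a captured token is useless for replay, which is the property the abstract's "dynamic token" provides.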
NASA Astrophysics Data System (ADS)
Burnett, M.
2010-12-01
One topic that is beginning to influence the systems that support these goals is that of Information Technology (IT) security. Insecure systems are vulnerable to increasing attacks and other negative consequences; sponsoring agencies are correspondingly responding with more refined policies and more stringent security requirements. These affect how EO systems can meet the goals of data and service interoperability and harmonization through open access, transformation, and visualization services. Contemporary systems, including the vision of a system-of-systems (such as GEOSS, the Global Earth Observation System of Systems), utilize technologies that support a distributed, global, net-centric environment. These types of systems rely heavily on open systems, web services, shared infrastructure, and data standards. The broader IT industry has developed and used these technologies in its business- and mission-critical systems for many years, and the industry and its customers have learned the importance of protecting their assets and resources (computing and information) as they have been forced to respond to an ever increasing number of more sophisticated illegitimate “attackers”. This presentation will offer an overview of work done by the CEOS WGISS organization in summarizing security threats, the challenges of responding to them, and the current state of the practice within the EO community.
The Climate-G Portal: a Grid Enabled Scientific Gateway for Climate Change
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Negro, Alessandro; Aloisio, Giovanni
2010-05-01
Grid portals are web gateways aiming at concealing the underlying infrastructure through pervasive, transparent, user-friendly, ubiquitous, and seamless access to heterogeneous and geographically spread resources (i.e. storage, computational facilities, services, sensors, networks, databases). Ultimately, they provide an enhanced problem-solving environment able to deal with modern, large-scale scientific and engineering problems. Scientific gateways can revolutionize the way scientists and researchers organize and carry out their activities. Access to distributed resources, complex workflow capabilities, and community-oriented functionalities are just some of the features that can be provided by such a web-based environment. In the context of the EGEE NA4 Earth Science Cluster, Climate-G is a distributed testbed focusing on climate change research topics. The Euro-Mediterranean Center for Climate Change (CMCC) is actively participating in the testbed, providing the scientific gateway (Climate-G Portal) used to access the entire infrastructure. The Climate-G Portal faces important challenges and must satisfy key requirements; the most relevant ones are presented and discussed in the following. Transparency: the portal has to provide transparent access to the underlying infrastructure, shielding users from low-level details and the complexity of a distributed grid environment. Security: users must be authenticated and authorized on the portal to access and exploit its functionalities. A wide set of roles is needed to clearly assign the proper one to each user. Access to the computational grid must be completely secured, since the target infrastructure for running jobs is a production grid environment; a security infrastructure (based on X.509v3 digital certificates) is therefore needed. Pervasiveness and ubiquity: access to the system must be pervasive and ubiquitous.
This follows naturally from the web-based approach. Usability and simplicity: the portal has to provide simple, high-level, and user-friendly interfaces to ease access to and exploitation of the entire system. Coexistence of general-purpose and domain-oriented services: along with general-purpose services (file transfer, job submission, etc.), the portal has to provide domain-specific services and functionalities. Subsetting of data, visualization of 2D maps around a virtual globe, and delivery of maps through OGC-compliant interfaces (i.e. the Web Map Service - WMS) are just some examples. Since April 2009, about 70 users (85% coming from the climate change community) have been granted access to the portal. A key challenge of this work is the idea of providing users with an integrated working environment: a place where scientists can find huge amounts of data, complete metadata support, a wide set of data access services, data visualization and analysis tools, easy access to the underlying grid infrastructure, and advanced monitoring interfaces.
Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group
NASA Astrophysics Data System (ADS)
Mazzetti, Paolo
2010-05-01
In recent decades two main paradigms for resource sharing emerged and reached maturity: the Web and the Grid. Both have proved suitable for building Distributed Computing Infrastructures (DCIs) supporting the coordinated sharing of resources (i.e. data, information, services, etc.) on the Internet. Grid and Web DCIs have much in common as a result of their underlying Internet technology (protocols, models, and specifications). However, being based on different requirements and architectural approaches, they show some differences as well. The Web's "major goal was to be a shared information space through which people and machines could communicate" [Berners-Lee 1996]. The success of the Web, and its consequent pervasiveness, made it appealing for building specialized systems like Spatial Data Infrastructures (SDIs). In these systems the introduction of Web-based geo-information technologies enables specialized services for geospatial data sharing and processing. The Grid was born to achieve "flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" [Foster 2001]. It specifically focuses on large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation. In the Earth and Space Sciences (ESS), most of the information handled is geo-referenced (geo-information), since spatial and temporal meta-information is of primary importance in many application domains: Earth Sciences, Disaster Management, Environmental Sciences, etc. On the other hand, several application areas need to run complex models which require the large processing and storage capabilities that Grids are able to provide. Therefore the integration of geo-information and Grid technologies is a valuable approach for enabling advanced ESS applications.
Currently both geo-information and Grid technologies have reached a high level of maturity, making it possible to build such an integration on existing solutions. More specifically, the Open Geospatial Consortium (OGC) Web Services (OWS) specifications play a fundamental role in geospatial information sharing (e.g. in the INSPIRE Implementing Rules, the GEOSS architecture, GMES Services, etc.). On the Grid side, the gLite middleware, developed in the European EGEE (Enabling Grids for E-sciencE) projects, is widely deployed in Europe and beyond, has proved highly scalable, and is one of the middleware stacks chosen for the future European Grid Infrastructure (EGI) initiative. Therefore convergence between OWS and gLite technologies would be desirable for seamless access to Grid capabilities through OWS-compliant systems. However, to achieve this harmonization there are some obstacles to overcome. First, a semantic mismatch must be addressed: gLite handles low-level (close to the machine) concepts like "file", "data", "instruments", and "job", while geo-information services handle higher-level (closer to the human) concepts like "coverage", "observation", "measurement", and "model". Second, an architectural mismatch must be addressed: OWS implements a Web Service-Oriented Architecture which is stateless, synchronous, and without embedded security (which is delegated to other specifications), while gLite implements the Grid paradigm in an architecture which is stateful, asynchronous (though not fully event-based), and with strong embedded security (based on the VO paradigm). In recent years many initiatives and projects have worked out possible approaches for implementing Grid-enabled OWSs.
To mention some: (i) in 2007 the OGC signed a Memorandum of Understanding with the Open Grid Forum, "a community of users, developers, and vendors leading the global standardization effort for grid computing"; (ii) the OGC identified "WPS Profiles - Conflation; and Grid processing" as one of the tasks in the Geo Processing Workflow theme of OWS Phase 6 (OWS-6); (iii) several national, European, and international projects investigated different aspects of this integration, developing demonstrators and proofs of concept. In this context, "gLite enablement of OpenGeospatial Web Services" (G-OWS) is an initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII project consortia in order to collect and coordinate experiences on the enablement of OWS on top of the gLite middleware [GOWS]. Currently G-OWS counts ten member organizations from Europe and beyond, with four European projects involved. It has broadened its scope to the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity in order to enable Earth Science applications and tools. Its operational objectives are the following: i) to contribute to the OGC-OGF initiative; ii) to release a reference implementation as standard gLite APIs (under the gLite software license); iii) to release a reference model (including procedures and guidelines) for OWS Grid-ification, as far as gLite is concerned; iv) to foster and promote the formation of consortia for participation in projects and initiatives aimed at building Grid-enabled SDIs. To achieve these objectives, G-OWS bases its activities on two main guiding principles: a) the adoption of a service-oriented architecture based on the information modelling approach, and b) standardization as a means of achieving interoperability (i.e. adoption of standards from ISO TC211, OGC OWS, and OGF).
In the first year of activity G-OWS designed a general architectural framework stemming from the FP6 CYCLOPS studies and enriched by the outcomes of other projects and initiatives involved (i.e. FP7 GENESI-DR, FP7 DORII, AIST GeoGrid, etc.). Some proofs of concept have been developed to demonstrate the flexibility and scalability of this architectural framework. The G-OWS WG developed implementations of a gLite-enabled Web Coverage Service (WCS) and Web Processing Service (WPS), and an implementation of Shibboleth authentication for gLite-enabled OWS in order to evaluate the possible integration of Web and Grid security models. The presentation will aim to communicate the G-OWS organization, activities, future plans, and means to involve the ESSI community. References: [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future", IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Foster 2001] I. Foster, C. Kesselman, and S. Tuecke, "The Anatomy of the Grid", The International Journal of High Performance Computing Applications, 15(3):200-222, Fall 2001. [GOWS] G-OWS WG, https://www.g-ows.org/, accessed: 15 January 2010.
Virtualized Multi-Mission Operations Center (vMMOC) and its Cloud Services
NASA Technical Reports Server (NTRS)
Ido, Haisam Kassim
2017-01-01
This presentation will cover the current and future technical and organizational opportunities and challenges of virtualizing a multi-mission operations center. The full deployment of Goddard Space Flight Center's (GSFC) Virtualized Multi-Mission Operations Center (vMMOC) is nearly complete. The Space Science Mission Operations (SSMO) organization's spacecraft ACE, Fermi, LRO, MMS (4), OSIRIS-REx, SDO, SOHO, Swift, and Wind are in the process of being fully migrated to the vMMOC. The benefits of the vMMOC will be the normalization and standardization of IT services, mission operations, maintenance, and development, as well as ancillary services and policies such as collaboration tools, change management systems, and IT security. The vMMOC will also provide operational efficiencies regarding hardware, IT domain expertise, training, maintenance, and support. The presentation will also cover SSMO's secure Situational Awareness Dashboard, delivered in an integrated, fleet-centric, cloud-based web services fashion. Additionally, SSMO Telemetry as a Service (TaaS) will be covered, which allows authorized users and processes to access telemetry for the entire SSMO fleet, and for the entirety of each spacecraft's history. Both services leverage cloud services in a secure FISMA High and FedRAMP environment, and use distributed object stores to house and provide the telemetry. The services are also in the process of leveraging the elasticity and horizontal scalability of cloud computing. In the design phase is Navigation as a Service (NaaS), which will provide a standardized, efficient, and normalized service for the fleet's space flight dynamics operations. Additional future services that may be considered are Ground Segment as a Service (GSaaS), Telemetry and Command as a Service (TCaaS), Flight Software Simulation as a Service, etc.
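A TaaS-style time-range query over per-spacecraft telemetry might look like the following sketch. The class and method names are hypothetical; the real service sits behind secure cloud object stores and authorization layers that are omitted here.

```python
# Sketch: telemetry points kept in time order per spacecraft and
# queried by inclusive time range, mimicking a fleet-wide
# telemetry-history lookup.
import bisect

class TelemetryStore:
    def __init__(self):
        self.points = {}  # spacecraft -> sorted list of (timestamp, value)

    def ingest(self, craft, ts, value):
        bisect.insort(self.points.setdefault(craft, []), (ts, value))

    def query(self, craft, t0, t1):
        """Return all points for `craft` with t0 <= timestamp <= t1."""
        pts = self.points.get(craft, [])
        lo = bisect.bisect_left(pts, (t0,))
        hi = bisect.bisect_right(pts, (t1, float("inf")))
        return pts[lo:hi]
```

Keeping each spacecraft's stream sorted makes range queries logarithmic in seek cost, which matters when the history spans an entire mission lifetime.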
Crowdsourcing Physical Network Topology Mapping With Net.Tagger
2016-03-01
backend server infrastructure. This includes a full security audit, better web services handling, and integration with the OSM stack and dataset to...a novel approach to network infrastructure mapping that combines smartphone apps with crowdsourced collection to gather data for offline aggregation...and analysis. The project aims to build a map of physical network infrastructure such as fiber-optic cables, facilities, and access points. The
An Ontology for Insider Threat Indicators Development and Applications
2014-11-01
An Ontology for Insider Threat Indicators Development and Applications Daniel L. Costa, Matthew L. Collins, Samuel J. Perl, Michael J. Albrethsen...services, commit fraud against an organization, steal intellectual property, or conduct national security espionage, sabotaging systems and data, as...engineering plans from the victim organization's computer systems to his new employer. The insider accessed a web server with an administrator account
Development of a virtual multidisciplinary lung cancer tumor board in a community setting.
Stevenson, Marvaretta M; Irwin, Tonia; Lowry, Terry; Ahmed, Maleka Z; Walden, Thomas L; Watson, Melanie; Sutton, Linda
2013-05-01
Creating an effective platform for multidisciplinary tumor conferences can be challenging in the rural community setting. The Duke Cancer Network created an Internet-based platform for a multidisciplinary conference to enhance the care of patients with lung cancer. This conference incorporates providers from different physical locations within a rural community and affiliated providers from a university-based cancer center 2 hours away. An electronic Web conferencing tool connects providers aurally and visually. Conferences were set up using a commercially available Web conferencing platform. The video platform provides a secure Web site coupled with a secure teleconference platform to ensure patient confidentiality. Multiple disciplines are invited to participate, including radiology, radiation oncology, thoracic surgery, pathology, and medical oncology. Participants only need telephone access and Internet connection to participate. Patient histories and physicals are presented, and the Web conferencing platform allows radiologic and histologic images to be reviewed. Treatment plans for patients are discussed, allowing providers to coordinate care among the different subspecialties. Patients who need referral to the affiliated university-based cancer center for specialized services are identified. Pertinent treatment guidelines and journal articles are reviewed. On average, there are 10 participants with one to two cases presented per session. The use of a Web conferencing platform allows subspecialty providers throughout the community and hours away to discuss lung cancer patient cases. This platform increases convenience for providers, eliminating travel to a central location. Coordination of care for patients requiring multidisciplinary care is facilitated, shortening evaluation time before definitive treatment plan.
A Web-based telemedicine system for diabetic retinopathy screening using digital fundus photography.
Wei, Jack C; Valentino, Daniel J; Bell, Douglas S; Baker, Richard S
2006-02-01
The purpose was to design and implement a Web-based telemedicine system for diabetic retinopathy screening using digital fundus cameras and to make the software publicly available through Open Source release. The process of retinal imaging and case reviewing was modeled to optimize workflow and implement the use of the computer system. The Web-based system was built on Java Servlet and Java Server Pages (JSP) technologies. Apache Tomcat was chosen as the JSP engine, while MySQL was used as the main database and the Laboratory of Neuro Imaging (LONI) Image Storage Architecture, from LONI at UCLA, as the platform for image storage. For security, all data transmissions were carried over encrypted Internet connections such as Secure Socket Layer (SSL) and HyperText Transfer Protocol over SSL (HTTPS). User logins were required and access to patient data was logged for auditing. The system was deployed at Hubert H. Humphrey Comprehensive Health Center and Martin Luther King/Drew Medical Center of Los Angeles County Department of Health Services. Within 4 months, 1500 images of more than 650 patients were taken at Humphrey's Eye Clinic and successfully transferred to King/Drew's Department of Ophthalmology. This study demonstrates an effective architecture for remote diabetic retinopathy screening.
Electronic Health Records: An Enhanced Security Paradigm to Preserve Patient's Privacy
NASA Astrophysics Data System (ADS)
Slamanig, Daniel; Stingl, Christian
In recent years, demographic change and increasing treatment costs demand the adoption of more cost-efficient, highly qualitative, and integrated health care processes. The rapid growth and availability of the Internet facilitate the development of eHealth services and especially of electronic health records (EHRs), which are promising solutions to meet the aforementioned requirements. Considering current web-based EHR systems, patient-centric and patient-moderated approaches are widely deployed. Besides, there is an emerging market of so-called personal health record platforms, e.g. Google Health. Both concepts provide central, web-based access to highly sensitive medical data. Additionally, the fact that these systems may be hosted by not fully trustworthy providers necessitates thorough consideration of privacy issues. In this paper we define security and privacy objectives that play an important role in the context of web-based EHRs. Furthermore, we discuss deployed solutions as well as concepts proposed in the literature with respect to these objectives and point out several weaknesses. Finally, we introduce a system which overcomes the drawbacks of existing solutions by taking a holistic approach to preserving patients' privacy, and discuss the applied methods.
Security Encryption Scheme for Communication of Web Based Control Systems
NASA Astrophysics Data System (ADS)
Robles, Rosslin John; Kim, Tai-Hoon
A control system is a device or set of devices that manages, commands, directs, or regulates the behavior of other devices or systems. The trend in most systems is that they are connected through the Internet. Traditional Supervisory Control and Data Acquisition (SCADA) systems were connected only to limited private networks. Because Internet-connected SCADA brings many advantages in terms of control, data viewing, and generation, operators are pushed to connect control systems through the Internet. Along with these advantages, however, many issues regarding the security of web SCADA have surfaced. In this paper, we discuss web SCADA and the issues regarding its security. As a countermeasure, a web SCADA security solution using a crossed-crypto-scheme is proposed for use in the communication of SCADA components.
Information System through ANIS at CeSAM
NASA Astrophysics Data System (ADS)
Moreau, C.; Agneray, F.; Gimenez, S.
2015-09-01
ANIS (AstroNomical Information System) is a generic web tool developed at CeSAM to facilitate and standardize the implementation of astronomical data of various kinds through private and/or public dedicated Information Systems. The architecture of ANIS is composed of a database server which contains the project data; a web user interface template which provides high-level services (search, extract, and display imaging and spectroscopic data using a combination of criteria, an object list, an SQL query module, or a cone search interface); a framework composed of several packages; and a metadata database managed by a web administration entity. The process to implement a new ANIS instance at CeSAM is easy and fast: the scientific project has to submit its data or provide secure access to them, the CeSAM team installs the new instance (web interface template and metadata database), and the project administrator can configure the instance with the web ANIS-administration entity. Currently, CeSAM offers through ANIS a web access to VO-compliant Information Systems for different projects (HeDaM, HST-COSMOS, CFHTLS-ZPhots, ExoDAT, ...).
Use of a secure Internet Web site for collaborative medical research.
Marshall, W W; Haley, R W
2000-10-11
Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.
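Two of the security measures enumerated above, password authentication and audit trails, are easy to make concrete. The sketch below is a minimal, hypothetical illustration in Python (not the system described in the article): it stores salted password hashes rather than plaintext and appends every access attempt to an audit trail.

```python
import hashlib
import hmac
import os
import time

class CollaboratorRegistry:
    """Minimal sketch: salted password hashing plus an audit trail."""

    def __init__(self):
        self._users = {}     # username -> (salt, password hash)
        self.audit_log = []  # (timestamp, username, success) per attempt

    def register(self, username, password):
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        self._users[username] = (salt, digest)

    def authenticate(self, username, password):
        ok = False
        record = self._users.get(username)
        if record is not None:
            salt, stored = record
            candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
            ok = hmac.compare_digest(candidate, stored)  # constant-time comparison
        self.audit_log.append((time.time(), username, ok))  # record every access
        return ok

registry = CollaboratorRegistry()
registry.register("alice", "correct horse")
print(registry.authenticate("alice", "correct horse"))  # True
print(registry.authenticate("alice", "wrong"))          # False
print(len(registry.audit_log))                          # 2
```

A production deployment would of course sit behind the TLS, firewall, and certificate measures the article lists; this only shows the credential-storage and auditing idea.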
Protecting clinical data on Web client computers: the PCASSO approach.
Masys, D. R.; Baker, D. B.
1998-01-01
The ubiquity and ease of use of the Web have made it an increasingly popular medium for communication of health-related information. Web interfaces to commercially available clinical information systems are now available or under development by most major vendors. To the extent that such interfaces involve the use of unprotected operating systems, they are vulnerable to security limitations of Web client software environments. The Patient Centered Access to Secure Systems Online (PCASSO) project extends the protections for person-identifiable health data on Web client computers. PCASSO uses several approaches, including physical protection of authentication information, execution containment, graphical displays, and monitoring the client system for intrusions and co-existing programs that may compromise security. PMID:9929243
Abandoned Uranium Mines (AUM) Site Screening Map Service, 2016, US EPA Region 9
As described in detail in the Five-Year Report, US EPA completed on-the-ground screening of 521 abandoned uranium mine areas. US EPA and the Navajo EPA are using the Comprehensive Database and Atlas to determine which mines should be cleaned up first. US EPA continues to research and identify Potentially Responsible Parties (PRPs) under Superfund to contribute to the costs of cleanup efforts. This US EPA Region 9 web service contains the following map layers: Abandoned Uranium Mines, Priority Mines, Tronox Mines, Navajo Environmental Response Trust Mines, Mines with Enforcement Actions, Superfund AUM Regions, Navajo Nation Administrative Boundaries and Chapter Houses. Mine points have a maximum scale of 1:220,000, while Mine polygons have a minimum scale of 1:220,000. Chapter houses have a minimum scale of 1:200,000. BLM Land Status has a minimum scale of 1:150,000. Full FGDC metadata records for each layer can be found by clicking the layer name at the web service endpoint and viewing the layer description. Data used to create this web service are available for download at https://edg.epa.gov/metadata/catalog/data/data.page. Security Classification: Public. Access Constraints: None. Use Constraints: None. Please check sources, scale, accuracy, currentness and other available information. Please confirm that you are using the most recent copy of both data and metadata. Acknowledgement of the EPA would be appreciated.
Choi, Okkyung; Jung, Hanyoung; Moon, Seungbin
2014-01-01
With smartphone distribution becoming common and robotic applications on the rise, social tagging services for various applications, including robotic domains, have advanced significantly. Though social tagging plays an important role when users try to find exact information through web search, the reliability of tags and the semantic relation between web contents and tags are often not considered. Spammers make ill use of this aspect, deliberately putting irrelevant tags on contents to induce users toward advertising contents when they click items in search results. Therefore, this study proposes a detection method for tag-ranking manipulation to solve the problem of existing methods, which cannot guarantee the reliability of tagging. Similarity is measured to rank the grade of tags registered on the contents, and weighted values of each tag are computed by means of synonym relevance, frequency, and semantic distances between tags. Lastly, experimental evaluation results are provided, and the method's efficiency and accuracy are verified through them.
NASA Technical Reports Server (NTRS)
2006-01-01
The Global Change Master Directory (GCMD) has been one of the best known Earth science and global change data discovery online resources throughout its extended operational history. The growing popularity of the system since its introduction on the World Wide Web in 1994 has created an environment where resolving issues of scalability, security, and interoperability have been critical to providing the best available service to the users and partners of the GCMD. Innovative approaches developed at the GCMD in these areas will be presented with a focus on how they relate to current and future GO-ESSP community needs.
Response capabilities of the National Guard: a focus on domestic disaster medical response.
Bochicchio, Daniel
2010-01-01
The National Guard has a 373-year history of responding to the nation's call to duty for service both at home and abroad (The National Guard Bureau Web site: Available at http://www.ngb.army.mil/default. aspx.). The National Guard (NG) is a constitutionally unique organization (United States Constitution, US Government Printing Office Web site: Available at http://www.gpoaccess.gov/constitution/index.html.). Today's Guard conducts domestic disaster response and civilian assistance missions on a daily basis. Yet, the NG's role, mission, and capabilities are not well-known or understood. The National Response Framework (NRF) places significant responsibility on the local and state disaster planners (Department of Homeland Security: National Response Framework. US Department of Homeland Security, Washington, DC, January 2008). The public health professionals are an integral component of the disaster planning community. It is critical that the public health community be knowledgeable of types and capabilities of all the response assets at their disposal.
Distributed Operations Planning
NASA Technical Reports Server (NTRS)
Fox, Jason; Norris, Jeffrey; Powell, Mark; Rabe, Kenneth; Shams, Khawaja
2007-01-01
Maestro software provides a secure and distributed mission planning system for long-term missions in general, and the Mars Exploration Rover (MER) mission specifically. Maestro, the successor to the Science Activity Planner, has a heavy emphasis on portability and distributed operations, and requires no data replication or expensive hardware, instead relying on a set of services running on JPL institutional servers. Maestro works on most current computers with network connections, including laptops. When browsing downlink data from a spacecraft, Maestro functions much like a Web browser. After authenticating the user, it connects to a database server to query an index of data products. It then contacts a Web server to download and display the actual data products. The software also includes collaboration support based upon a highly reliable messaging system. Modifications made to targets in one instance are quickly and securely transmitted to other instances of Maestro. The back end that has been developed for Maestro could benefit many future missions by reducing the cost of a centralized operations system architecture.
Critical Infrastructure: Control Systems and the Terrorist Threat
2004-01-20
Congressional Research Service, The Library of Congress. CRS Report for Congress, received through the CRS Web, Order Code RL31534. Critical...[http://www.pnl.gov/main/sectors/homeland.html]. 68 Rolf Carlson, "Sandia SCADA Program High-Security SCADA LDRD Final Report," Sandia Report SAND2002...and Industry Division
Critical Infrastructure: Control Systems and the Terrorist Threat
2003-07-14
Congressional Research Service, The Library of Congress. CRS Report for Congress, received through the CRS Web, Order Code RL31534. Critical...available online at [http://www.pnl.gov/main/sectors/homeland.html]. 56 Rolf Carlson, "Sandia SCADA Program High-Security SCADA LDRD Final Report"...Industry Division
PRISMATIC: Unified Hierarchical Probabilistic Verification Tool
2011-09-01
security protocols such as for anonymity and quantum cryptography; and biological reaction pathways. PRISM is currently the leading probabilistic...a whole will only deadlock and fail with a probability ≤ p/2. The assumption allows us to partition the overall system verification problem into two ...run on any port using the standard HTTP protocol. In this way multiple instances of the PRISMATIC web service can respond to different requests when
Secure Web-Site Access with Tickets and Message-Dependent Digests
Donato, David I.
2008-01-01
Although there are various methods for restricting access to documents stored on a World Wide Web (WWW) site (a Web site), none of the widely used methods is completely suitable for restricting access to Web applications hosted on an otherwise publicly accessible Web site. A new technique, however, provides a mix of features well suited for restricting Web-site or Web-application access to authorized users, including the following: secure user authentication, tamper-resistant sessions, simple access to user state variables by server-side applications, and clean session terminations. This technique, called message-dependent digests with tickets, or MDDT, maintains secure user sessions by passing single-use nonces (tickets) and message-dependent digests of user credentials back and forth between client and server. Appendix 2 provides a working implementation of MDDT with PHP server-side code and JavaScript client-side code.
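The ticket-and-digest flow described in the abstract can be illustrated with a short sketch. The names and details below are assumptions for illustration (the report's actual implementation uses PHP server-side code and JavaScript client-side code); the essential ideas shown are a single-use nonce (ticket) issued by the server and a message-dependent digest of the user's credentials computed over that ticket.

```python
import hashlib
import hmac
import secrets

class MDDTServer:
    """Sketch of message-dependent digests with tickets (MDDT)."""

    def __init__(self, credential_digest: bytes):
        self._cred = credential_digest  # server-side copy of the user's credential digest
        self._tickets = set()           # outstanding single-use tickets

    def issue_ticket(self) -> str:
        ticket = secrets.token_hex(16)  # single-use nonce sent to the client
        self._tickets.add(ticket)
        return ticket

    def verify(self, ticket: str, message: bytes, digest: bytes) -> bool:
        if ticket not in self._tickets:
            return False               # unknown or already-used ticket
        self._tickets.discard(ticket)  # single use: consume on first verification
        expected = hmac.new(self._cred, ticket.encode() + message, hashlib.sha256).digest()
        return hmac.compare_digest(expected, digest)

def client_digest(credential_digest: bytes, ticket: str, message: bytes) -> bytes:
    # The client derives the same message-dependent digest from its credentials.
    return hmac.new(credential_digest, ticket.encode() + message, hashlib.sha256).digest()

cred = hashlib.sha256(b"alice:secret").digest()
server = MDDTServer(cred)
ticket = server.issue_ticket()
digest = client_digest(cred, ticket, b"GET /report")
print(server.verify(ticket, b"GET /report", digest))  # True
print(server.verify(ticket, b"GET /report", digest))  # False: ticket consumed
```

Because each ticket is consumed on first use, a captured request cannot be replayed, which is the tamper-resistance property the abstract highlights.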
The peer review system (PRS) for quality assurance and treatment improvement in radiation therapy
NASA Astrophysics Data System (ADS)
Le, Anh H. T.; Kapoor, Rishabh; Palta, Jatinder R.
2012-02-01
Peer reviews are needed across all disciplines of medicine to address complex medical challenges in disease care, medical safety, insurance coverage handling, and public safety. Radiation therapy utilizes technologically advanced imaging for treatment planning, often with excellent efficacy. Since planning data requirements are substantial, patients are at risk for repeat diagnostic procedures or suboptimal therapeutic intervention due to a lack of knowledge regarding previous treatments. The Peer Review System (PRS) will make this critical radiation therapy information readily available on demand via Web technology. The PRS has been developed with current Web technology, the .NET framework, and an in-house DICOM library. With the advantages of Web server-client architecture, including the IIS web server and SOAP Web Services on the server side and Silverlight on the client side, patient data can be visualized through a web browser and distributed across multiple locations over the local area network and Internet. The PRS will significantly improve the quality, safety, and accessibility of treatment plans in cancer therapy. Furthermore, the secure Web-based PRS with DICOM-RT compliance will provide flexible utilities for organization, sorting, and retrieval of imaging studies and treatment plans to optimize patient treatment and ultimately improve patient safety and treatment quality.
The Enterprise 2.0 Concept: Challenges on Data and Information Security
NASA Astrophysics Data System (ADS)
Silva, Ana; Moreira, Fernando; Varajão, João
The Web 2.0 wave has "hit" businesses all over the world, with companies taking advantage of the 2.0 concept and new applications stimulating collaboration between employees, and also with external partners (suppliers, contractors, universities, R&D organizations, and others). However, the use of Web 2.0 applications inside organizations has created additional security challenges, especially regarding data and information security. Companies need to be aware of these risks when deploying the 2.0 concept and take a proactive approach to security. This paper identifies and discusses some of the challenges and risks of using Web 2.0 tools, namely when it comes to securing companies' intellectual property.
A security architecture for interconnecting health information systems.
Gritzalis, Dimitris; Lambrinoudakis, Costas
2004-03-31
Several hereditary and other chronic diseases necessitate continuous and complicated health care procedures, typically offered in different, often distant, health care units. Inevitably, the medical records of patients suffering from such diseases become complex, grow in size very fast, and are scattered all over the units involved in the care process, hindering communication of information between health care professionals. Web-based electronic medical records have recently been proposed as the solution to the above problem, facilitating the interconnection of the health care units in the sense that health care professionals can now access the complete medical record of the patient, even if it is distributed across several remote units. However, by allowing users to access information from virtually anywhere, the universe of ineligible people who may attempt to harm the system is dramatically expanded, thus severely complicating the design and implementation of a secure environment. This paper presents a security architecture that has been designed mainly for providing authentication and authorization services in web-based distributed systems. The architecture is based on a role-based access scheme and on the implementation of an intelligent security agent per site (i.e. health care unit). This intelligent security agent: (a) authenticates the users, local or remote, that can access the local resources; (b) assigns, through temporary certificates, access privileges to the authenticated users in accordance with their role; and (c) communicates to other sites (through the respective security agents) information about the local users that may need to access information stored in other sites, as well as about local resources that can be accessed remotely.
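The security agent's first two duties can be sketched compactly. The example below is a hypothetical, stdlib-only illustration of the described scheme, with short-lived HMAC-signed tokens standing in for the paper's temporary certificates and invented role names; it is not the paper's implementation.

```python
import hashlib
import hmac
import json
import time

# Role-based access scheme (illustrative roles and privileges only).
ROLE_PRIVILEGES = {
    "physician": {"read_record", "update_record"},
    "nurse": {"read_record"},
}

class SecurityAgent:
    """Sketch of a per-site agent issuing temporary, signed authorization tokens."""

    def __init__(self, site_key: bytes):
        self._key = site_key  # secret shared among the site's services

    def issue_token(self, user: str, role: str, ttl: int = 300) -> str:
        # Temporary certificate stand-in: signed claims with an expiry time.
        payload = json.dumps({"user": user, "role": role, "exp": time.time() + ttl})
        sig = hmac.new(self._key, payload.encode(), hashlib.sha256).hexdigest()
        return payload + "." + sig

    def authorize(self, token: str, action: str) -> bool:
        payload, _, sig = token.rpartition(".")
        expected = hmac.new(self._key, payload.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, sig):
            return False                     # tampered or foreign token
        claims = json.loads(payload)
        if time.time() > claims["exp"]:
            return False                     # temporary: token has expired
        return action in ROLE_PRIVILEGES.get(claims["role"], set())

agent = SecurityAgent(b"site-shared-secret")
token = agent.issue_token("dr_smith", "physician")
print(agent.authorize(token, "update_record"))  # True
print(agent.authorize(token, "delete_record"))  # False: not in the role's privileges
```

Cross-site access (duty (c)) would amount to one agent issuing such a token and a peer agent at another site validating it under a shared or federated key.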
Investigating weaknesses in Android certificate security
NASA Astrophysics Data System (ADS)
Krych, Daniel E.; Lange-Maney, Stephen; McDaniel, Patrick; Glodek, William
2015-05-01
Android's application market relies on secure certificate generation to establish trust between applications and their users; yet cryptography is often not a priority for application developers, and many fail to take the necessary security precautions. Indeed, there is cause for concern: several recent high-profile studies have observed a pervasive lack of entropy on Web systems leading to the factorization of private keys [1]. Sufficient entropy, or randomness, is essential to generate secure key pairs and combat predictable key generation. In this paper, we analyze the security of Android certificates. We investigate the entropy present in 550,000 Android application certificates using the Quasilinear GCD finding algorithm [1]. Our results show that while the lack of entropy does not appear to be as ubiquitous in the mobile markets as on Web systems, there is substantial reuse of certificates: only one third of the certificates in our dataset were unique. In other words, we find that organizations frequently reuse certificates for different applications. While such a practice is acceptable under Google's specifications for a single developer, we find that in some cases the same certificates are used by a myriad of developers, potentially compromising Android's intended trust relationships. Further, we observed duplicate certificates being used by both malicious and non-malicious applications. The top 3 repeated certificates present in our dataset accounted for a total of 11,438 separate APKs. Of these applications, 451, or roughly 4%, were identified as malicious by antivirus services.
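The entropy-failure attack the abstract references works by computing GCDs across many RSA moduli: two keys generated with poor randomness may share a prime factor, and a single GCD reveals it, factoring both keys at once. The sketch below uses toy numbers for illustration (real moduli are thousands of bits, and the quasilinear algorithm batches the GCDs with product and remainder trees instead of testing every pair):

```python
import math
from itertools import combinations

def find_shared_factors(moduli):
    """Naive pairwise GCD scan over RSA moduli.

    Any GCD greater than 1 is a shared prime, which factors both moduli.
    The quasilinear version replaces this O(n^2) loop with product trees.
    """
    hits = []
    for (i, n1), (j, n2) in combinations(enumerate(moduli), 2):
        g = math.gcd(n1, n2)
        if g > 1:  # shared prime => both keys are compromised
            hits.append((i, j, g))
    return hits

# Toy moduli: the first two "keys" reused the prime 101 due to low entropy.
moduli = [101 * 103, 101 * 107, 109 * 113]
print(find_shared_factors(moduli))  # [(0, 1, 101)]
```

With a shared factor g in hand, the other prime of each modulus follows by division, so both private keys can be reconstructed.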
NASA Astrophysics Data System (ADS)
Barabanov, A. V.; Markov, A. S.; Tsirlov, V. L.
2018-05-01
This paper presents statistical results, and their consolidation, from a study into the security of various web applications against cross-site request forgery (CSRF) attacks. Some of the results were obtained in a study carried out within the framework of certification for compliance with information security requirements. The paper consolidates information about the attack and the protection measures currently used by developers of web applications. It reports several distributions from the study: the distribution of identified vulnerabilities by developer type (Russian and foreign), the distribution of security measures used in web applications, the distribution of identified vulnerabilities by programming language, and data on the number of security measures used in the studied web applications. The results show that in most cases the developers of web applications do not pay due attention to protection against cross-site request forgery attacks. The authors give recommendations to developers that are planning to undergo a certification process for their software applications.
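A typical protection measure of the kind such studies survey is the synchronizer-token pattern: the server derives a secret token for each session, embeds it in every form, and rejects state-changing requests whose token does not match. A minimal stdlib-only sketch follows (all names are illustrative; real frameworks such as Django or Spring generate and check these tokens for you):

```python
import hashlib
import hmac
import secrets

SECRET_KEY = secrets.token_bytes(32)  # server-side secret (illustrative)

def csrf_token(session_id: str) -> str:
    """Derive a per-session CSRF token; the server embeds this in every form."""
    return hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def is_valid_request(session_id: str, submitted_token: str) -> bool:
    """Reject state-changing requests whose token doesn't match the session."""
    return hmac.compare_digest(csrf_token(session_id), submitted_token)

sid = "session-abc123"
token = csrf_token(sid)
print(is_valid_request(sid, token))     # True: legitimate form post
print(is_valid_request(sid, "forged"))  # False: cross-site forgery attempt
```

A forged cross-site request fails because the attacking page can cause the victim's browser to send the session cookie, but cannot read the token embedded in the server's form.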
Aryanto, K Y E; Broekema, A; Langenhuysen, R G A; Oudkerk, M; van Ooijen, P M A
2015-05-01
To develop and test a fast and easy rule-based web environment, with optional de-identification of imaging data, to facilitate data distribution within a hospital environment. A web interface was built using Hypertext Preprocessor (PHP), an open-source scripting language for web development, and Java, with SQL Server to handle the database. The system allows for the selection of patient data and for de-identifying these when necessary. Using the services provided by the RSNA Clinical Trial Processor (CTP), the selected images were pushed to the appropriate services using a protocol based on the module created for the associated task. Five pipelines, each performing a different task, were set up in the server. In a 75-month period, more than 2,000,000 images were transferred and de-identified properly, while 20,000,000 images were moved from one node to another without de-identification. While maintaining a high level of security and stability, the proposed system is easy to set up, integrates well with our clinical and research practice, and provides a fast and accurate vendor-neutral process of transferring, de-identifying, and storing DICOM images. Its ability to run different de-identification processes in parallel pipelines is a major advantage in both clinical and research settings.
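The rule-based de-identification such a pipeline performs (in this system, via RSNA CTP anonymizer scripts) can be sketched abstractly. The example below is not CTP's actual script syntax: it applies simple per-tag rules (remove, hash, keep) to a plain dictionary standing in for a DICOM header, with illustrative tag names.

```python
import hashlib

# Per-tag rules, loosely modeled on de-identification profiles:
# "remove" deletes the element, "hash" pseudonymizes it, "keep" leaves it intact.
RULES = {
    "PatientName": "hash",
    "PatientBirthDate": "remove",
    "AccessionNumber": "hash",
    "Modality": "keep",
    "StudyDescription": "keep",
}

def deidentify(header: dict) -> dict:
    """Apply the rule table to a header; unknown tags are dropped (default-deny)."""
    out = {}
    for tag, value in header.items():
        rule = RULES.get(tag, "remove")
        if rule == "keep":
            out[tag] = value
        elif rule == "hash":
            # Stable pseudonym: the same input maps to the same opaque value,
            # so studies of one patient can still be linked after de-identification.
            out[tag] = hashlib.sha256(value.encode()).hexdigest()[:16]
        # rule == "remove": omit the tag entirely
    return out

header = {"PatientName": "DOE^JANE", "PatientBirthDate": "19600101",
          "Modality": "CT", "StudyDescription": "CHEST"}
clean = deidentify(header)
print("PatientBirthDate" in clean)  # False
print(clean["Modality"])            # CT
```

Running several such rule tables side by side corresponds to the parallel pipelines the abstract credits as a major advantage.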
Web-based Factors Affecting Online Purchasing Behaviour
NASA Astrophysics Data System (ADS)
Ariff, Mohd Shoki Md; Sze Yan, Ng; Zakuan, Norhayati; Zaidi Bahari, Ahamad; Jusoh, Ahmad
2013-06-01
The growing use of the internet and online purchasing among young consumers in Malaysia provides a huge prospect in the e-commerce market, specifically for the B2C segment. In this market, if e-marketers know the web-based factors affecting online buyers' behaviour, and the effect of these factors on the behaviour of online consumers, then they can develop their marketing strategies to convert potential customers into active ones, while retaining existing online customers. A review of previous studies related to online purchasing behaviour in the B2C market has pointed out that the conceptualization and empirical validation of the online purchasing behaviour of Information and Communication Technology (ICT) literate users, or ICT professionals, in Malaysia has not been clearly addressed. This paper focuses on (i) web-based factors which online buyers (ICT professionals) keep in mind while shopping online; and (ii) the effect of web-based factors on online purchasing behaviour. Based on the extensive literature review, a conceptual framework of 24 items in five factors was constructed to determine web-based factors affecting the online purchasing behaviour of ICT professionals. Analysis was performed on 310 questionnaires, collected using a stratified random sampling method from ICT undergraduate students at a public university in Malaysia. The exploratory factor analysis showed that the five factors affecting online purchase behaviour are Information Quality, Fulfilment/Reliability/Customer Service, Website Design, Quick and Details, and Privacy/Security. The result of multiple regression analysis indicated that Information Quality, Quick and Details, and Privacy/Security positively affect online purchase behaviour. The results provide a usable model for measuring web-based factors affecting buyers' online purchase behaviour in the B2C market, as well as guidance for online shopping companies to focus on the factors that will increase customers' online purchases.
Transformative Rendering of Internet Resources
2012-10-01
4 Securing WiFi Connections...comes from legitimate web sites that have themselves been hacked. There is no way of anticipating which of these sites have been hacked and therefore...pose a security threat to visitors. The purpose of most of this web page hacking is to plant malicious code on the web site that will attack any
Reliability, Compliance, and Security in Web-Based Course Assessments
ERIC Educational Resources Information Center
Bonham, Scott
2008-01-01
Pre- and postcourse assessment has become a very important tool for education research in physics and other areas. The web offers an attractive alternative to in-class paper administration, but concerns about web-based administration include reliability due to changes in medium, student compliance rates, and test security, both question leakage…
A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data
NASA Astrophysics Data System (ADS)
Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.
2017-12-01
Cloud-based infrastructure may offer several key benefits, including scalability, built-in redundancy, security mechanisms, and reduced total cost of ownership, as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), and can often provide unmatched "scale-out" capability and data availability to a large and growing consumer base at a price point unachievable with in-house solutions. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost effective but also shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
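The core idea of vending chunked array data from object storage can be sketched as follows. This is a conceptual illustration only: the key layout, dataset identifier, and function names are hypothetical and are not the actual HSDS storage scheme.

```python
# Conceptual sketch: mapping chunk indices of an HDF5-style chunked dataset
# to object-storage keys, so that a slice request fetches only the objects
# it needs. The key layout here is illustrative, not the HSDS format.

def chunk_key(dataset_id: str, chunk_index: tuple) -> str:
    """Build an object-store key for one chunk of a chunked dataset."""
    suffix = "_".join(str(i) for i in chunk_index)
    return f"datasets/{dataset_id}/chunks/{suffix}"

def chunks_for_slice(start: int, stop: int, chunk_size: int):
    """Yield the chunk indices needed to satisfy a 1-D slice request."""
    first = start // chunk_size
    last = (stop - 1) // chunk_size
    for i in range(first, last + 1):
        yield (i,)

# A read of elements [100, 260) from a dataset chunked every 100 elements
# touches chunks 1 and 2, each fetched as an independent object.
keys = [chunk_key("d-42", idx) for idx in chunks_for_slice(100, 260, 100)]
print(keys)
```

Because each chunk is an independent object, reads like this parallelize naturally across object-store requests, which is one source of the "scale-out" behaviour the abstract describes.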
Development of a Virtual Multidisciplinary Lung Cancer Tumor Board in a Community Setting
Stevenson, Marvaretta M.; Irwin, Tonia; Lowry, Terry; Ahmed, Maleka Z.; Walden, Thomas L.; Watson, Melanie; Sutton, Linda
2013-01-01
Purpose: Creating an effective platform for multidisciplinary tumor conferences can be challenging in the rural community setting. The Duke Cancer Network created an Internet-based platform for a multidisciplinary conference to enhance the care of patients with lung cancer. This conference incorporates providers from different physical locations within a rural community and affiliated providers from a university-based cancer center 2 hours away. An electronic Web conferencing tool connects providers aurally and visually. Methods: Conferences were set up using a commercially available Web conferencing platform. The video platform provides a secure Web site coupled with a secure teleconference platform to ensure patient confidentiality. Multiple disciplines are invited to participate, including radiology, radiation oncology, thoracic surgery, pathology, and medical oncology. Participants only need telephone access and Internet connection to participate. Results: Patient histories and physicals are presented, and the Web conferencing platform allows radiologic and histologic images to be reviewed. Treatment plans for patients are discussed, allowing providers to coordinate care among the different subspecialties. Patients who need referral to the affiliated university-based cancer center for specialized services are identified. Pertinent treatment guidelines and journal articles are reviewed. On average, there are 10 participants with one to two cases presented per session. Conclusion: The use of a Web conferencing platform allows subspecialty providers throughout the community and hours away to discuss lung cancer patient cases. This platform increases convenience for providers, eliminating travel to a central location. Coordination of care for patients requiring multidisciplinary care is facilitated, shortening evaluation time before definitive treatment plan. PMID:23942505
Anon-Pass: Practical Anonymous Subscriptions
Lee, Michael Z.; Dunn, Alan M.; Katz, Jonathan; Waters, Brent; Witchel, Emmett
2014-01-01
We present the design, security proof, and implementation of an anonymous subscription service. Users register for the service by providing some form of identity, which might or might not be linked to a real-world identity such as a credit card, a web login, or a public key. A user logs on to the system by presenting a credential derived from information received at registration. Each credential allows only a single login in any authentication window, or epoch. Logins are anonymous in the sense that the service cannot distinguish which user is logging in any better than random guessing. This implies unlinkability of a user across different logins. We find that a central tension in an anonymous subscription service is the service provider’s desire for a long epoch (to reduce server-side computation) versus users’ desire for a short epoch (so they can repeatedly “re-anonymize” their sessions). We balance this tension by having short epochs, but adding an efficient operation for clients who do not need unlinkability to cheaply re-authenticate themselves for the next time period. We measure performance of a research prototype of our protocol that allows an independent service to offer anonymous access to existing services. We implement a music service, an Android-based subway-pass application, and a web proxy, and show that adding anonymity adds minimal client latency and only requires 33 KB of server memory per active user. PMID:24504081
Web vulnerability study of online pharmacy sites.
Kuzma, Joanne
2011-01-01
Consumers are increasingly using online pharmacies, but these sites may not provide an adequate level of security for consumers' personal data. There is a gap in research addressing security vulnerabilities in this industry. The objective is to identify the level of web application security vulnerabilities in online pharmacies and the common types of flaws, thus expanding on prior studies. Technical, managerial, and legal recommendations on how to mitigate security issues are presented. The proposed four-step method first consists of choosing an online testing tool. The next steps involve choosing a list of 60 online pharmacy sites to test and running the software analysis to compile a list of flaws. Finally, an in-depth analysis is performed on the types of web application vulnerabilities. The majority of sites had serious vulnerabilities, most flaws being cross-site scripting or old versions of software that had not been updated. A method is proposed for securing web pharmacy sites, using a multi-phased approach of technical and managerial techniques together with a thorough understanding of national legal requirements for securing systems.
Federated Access to Cyber Observables for Detection of Targeted Attacks
2014-10-01
each manages. The DQNs also utilize an intelligent information extraction capability for automatically suggesting mappings from text found in audit...Harmelen, and others, "OWL web ontology language overview," W3C Recomm., vol. 10, no. 2004–03, p. 10, 2004. [4] D. Miller and B. Pearson, Security...[Online]. Available: http://www.disa.mil/Services/Information-Assurance/HBS/HBSS. [21] S. Zanikolas and R. Sakellariou, "A taxonomy of grid
Cloud Computing for Pharmacometrics: Using AWS, NONMEM, PsN, Grid Engine, and Sonic
Sanduja, S; Jewell, P; Aron, E; Pharai, N
2015-01-01
Cloud computing allows pharmacometricians to access advanced hardware, network, and security resources available to expedite analysis and reporting. Cloud-based computing environments are available at a fraction of the time and effort when compared to traditional local datacenter-based solutions. This tutorial explains how to get started with building your own personal cloud computer cluster using Amazon Web Services (AWS), NONMEM, PsN, Grid Engine, and Sonic. PMID:26451333
A WebGIS-based system for analyzing and visualizing air quality data for Shanghai Municipality
NASA Astrophysics Data System (ADS)
Wang, Manyi; Liu, Chaoshun; Gao, Wei
2014-10-01
An online visual analytical system based on Java Web and WebGIS was designed and implemented to quantitatively analyze and qualitatively visualize air quality data for Shanghai Municipality. By analyzing the architecture of WebGIS and Java Web, we first designed the overall system architecture, then specified the software and hardware environment and determined the main function modules of the system. The visual system was built with the DIV + CSS layout method combined with JSP, JavaScript, and other programming languages in the Java environment. Moreover, the Struts, Spring, and Hibernate (SSH) frameworks were integrated into the system for easy maintenance and expansion. To provide mapping services and spatial analysis functions, we selected ArcGIS for Server as the GIS server. We used an Oracle database and an ESRI file geodatabase to store spatial and non-spatial data in order to ensure data security. In addition, response data from the Web server are resampled to enable rapid visualization in the browser. Experimental results indicate that the system responds quickly to user requests and efficiently returns accurate processing results.
An Automatic Web Service Composition Framework Using QoS-Based Web Service Ranking Algorithm.
Mallayya, Deivamani; Ramachandran, Baskaran; Viswanathan, Suganya
2015-01-01
Web service has become the technology of choice for service oriented computing to meet the interoperability demands in web applications. In the Internet era, the exponential addition of web services nominates the "quality of service" as essential parameter in discriminating the web services. In this paper, a user preference based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and QoS aspect of the web service. When the user's request cannot be fulfilled by a single atomic service, several existing services should be composed and delivered as a composition. The proposed framework allows the user to specify the local and global constraints for composite web services which improves flexibility. UPWSR algorithm identifies best fit services for each task in the user request and, by choosing the number of candidate services for each task, reduces the time to generate the composition plans. To tackle the problem of web service composition, QoS aware automatic web service composition (QAWSC) algorithm proposed in this paper is based on the QoS aspects of the web services and user preferences. The proposed framework allows user to provide feedback about the composite service which improves the reputation of the services.
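The preference-weighted ranking idea behind UPWSR can be sketched briefly. The attribute names, weights, and normalisation scheme below are illustrative assumptions, not the paper's actual algorithm: each candidate is scored as a weighted sum of normalised QoS values, with lower-is-better attributes inverted.

```python
# Hedged sketch in the spirit of QoS-based service ranking: score each
# candidate as a preference-weighted sum of normalised QoS attributes.
# Attribute names and weights are illustrative, not from the paper.

def normalise(values, higher_is_better=True):
    """Scale a list of QoS values to [0, 1], inverting when lower is better."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    if higher_is_better:
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]

def rank_services(services, weights):
    """services: {name: {"reliability": float, "latency_ms": float}}."""
    names = list(services)
    rel = normalise([services[n]["reliability"] for n in names])
    lat = normalise([services[n]["latency_ms"] for n in names],
                    higher_is_better=False)
    scores = {n: weights["reliability"] * r + weights["latency"] * l
              for n, r, l in zip(names, rel, lat)}
    return sorted(names, key=scores.get, reverse=True)

candidates = {
    "svcA": {"reliability": 0.99, "latency_ms": 120},
    "svcB": {"reliability": 0.95, "latency_ms": 40},
    "svcC": {"reliability": 0.90, "latency_ms": 300},
}
print(rank_services(candidates, {"reliability": 0.6, "latency": 0.4}))
```

Changing the weights models different user preferences: a latency-dominated weighting would promote svcB, which is the kind of per-user discrimination the abstract motivates.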
This map service contains data from aerial radiological surveys of 41 potential uranium mining areas (1,144 square miles) within the Navajo Nation that were conducted during the period from October 1994 through October 1999. The US Environmental Protection Agency (USEPA) Region 9 funded the surveys and the US Department of Energy (USDOE) Remote Sensing Laboratory (RSL) in Las Vegas, Nevada conducted the aerial surveys. The aerial survey data were used to characterize the overall radioactivity and excess Bismuth 214 levels within the surveyed areas. This US EPA Region 9 web service contains the following map layers: Total Terrestrial Gamma Activity Polygons, Total Terrestrial Gamma Activity Contours, Excess Bismuth 214 Contours, Excess Bismuth 214 Polygons, Flight Areas. Full FGDC metadata records for each layer can be found by clicking the layer name at the web service endpoint and viewing the layer description. Security Classification: Public. Access Constraints: None. Use Constraints: None. Please check sources, scale, accuracy, currentness and other available information. Please confirm that you are using the most recent copy of both data and metadata. Acknowledgement of the EPA would be appreciated.
Data Publishing and Sharing Via the THREDDS Data Repository
NASA Astrophysics Data System (ADS)
Wilson, A.; Caron, J.; Davis, E.; Baltzer, T.
2007-12-01
The terms "Team Science" and "Networked Science" have been coined to describe a virtual organization of researchers tied via some intellectual challenge, but often located in different organizations and locations. A critical component of these endeavors is publishing and sharing of content, including scientific data. Imagine pointing your web browser to a web page that interactively lets you upload data and metadata to a repository residing on a remote server, which can then be accessed by others in a secure fashion via the web. While any content can be added to this repository, it is designed particularly for storing and sharing scientific data and metadata. Server support includes uploading of data files that can subsequently be subsetted, aggregated, and served in NetCDF or other scientific data formats. Metadata can be associated with the data and interactively edited. The THREDDS Data Repository (TDR) is a server that provides client-initiated, on-demand, location-transparent storage for data of any type that can then be served by the THREDDS Data Server (TDS). The TDR provides functionality to: * securely store and "own" data files and associated metadata * upload files via HTTP and gridftp * upload a collection of data as a single file * modify and restructure repository contents * incorporate metadata provided by the user * generate additional metadata programmatically * edit individual metadata elements The TDR can exist separately from a TDS, serving content via HTTP.
Also, it can work in conjunction with the TDS, which includes functionality to provide: * access to data in a variety of formats via -- OPeNDAP -- OGC Web Coverage Service (for gridded datasets) -- bulk HTTP file transfer * a NetCDF view of datasets in NetCDF, OPeNDAP, HDF-5, GRIB, and NEXRAD formats * serving of very large volume datasets, such as NEXRAD radar * aggregation into virtual datasets * subsetting via OPeNDAP and NetCDF Subsetting services This talk will discuss TDR/TDS capabilities as well as how users can install this software to create their own repositories.
Flexible Web services integration: a novel personalised social approach
NASA Astrophysics Data System (ADS)
Metrouh, Abdelmalek; Mokhati, Farid
2018-05-01
Dynamic composition, or integration, remains one of the key objectives of Web services technology. This paper proposes an innovative approach to dynamic Web services composition based on functional and non-functional attributes and individual preferences. In this approach, social networks of Web services are used to maintain interactions between Web services in order to select and compose those most tightly related to the user's preferences. We use the concept of a Web services community within a social network of Web services to considerably reduce their search space. These communities are created by the direct involvement of Web services providers.
Computational knowledge integration in biopharmaceutical research.
Ficenec, David; Osborne, Mark; Pradines, Joel; Richards, Dan; Felciano, Ramon; Cho, Raymond J; Chen, Richard O; Liefeld, Ted; Owen, James; Ruttenberg, Alan; Reich, Christian; Horvath, Joseph; Clark, Tim
2003-09-01
An initiative to increase biopharmaceutical research productivity by capturing, sharing and computationally integrating proprietary scientific discoveries with public knowledge is described. This initiative involves both organisational process change and multiple interoperating software systems. The software components rely on mutually supporting integration techniques. These include a richly structured ontology, statistical analysis of experimental data against stored conclusions, natural language processing of public literature, secure document repositories with lightweight metadata, web services integration, enterprise web portals and relational databases. This approach has already begun to increase scientific productivity in our enterprise by creating an organisational memory (OM) of internal research findings, accessible on the web. Through bringing together these components it has also been possible to construct a very large and expanding repository of biological pathway information linked to this repository of findings which is extremely useful in analysis of DNA microarray data. This repository, in turn, enables our research paradigm to be shifted towards more comprehensive systems-based understandings of drug action.
An innovative privacy preserving technique for incremental datasets on cloud computing.
Aldeen, Yousra Abdul Alsahib S; Salleh, Mazleena; Aljeroudi, Yazan
2016-08-01
Cloud computing (CC) is a service-based delivery model with gigantic computer processing power and data storage across connected communication channels. It has given an overwhelming technological impetus to the internet (web) mediated IT industry, where users can easily share private data for further analysis and mining. Furthermore, user-friendly CC services enable sundry applications to be deployed economically. Meanwhile, simple data sharing has impelled various phishing attacks and malware-assisted security threats. Some privacy-sensitive applications, such as health services on the cloud, that are built with several economic and operational benefits necessitate enhanced security. Thus, absolute cyberspace security and mitigation against phishing attacks have become mandatory to protect overall data privacy. Typically, diverse application datasets are anonymized to give owners better privacy, but without providing all secrecy requirements for newly added records. Some proposed techniques address this issue by re-anonymizing the datasets from scratch. Utmost privacy protection over incremental datasets on CC is far from being achieved. Certainly, the distribution of huge dataset volumes across multiple storage nodes limits privacy preservation. In this view, we propose a new anonymization technique to attain better privacy protection with high data utility over distributed and incremental datasets on CC. The proficiency of data privacy preservation and improved confidentiality requirements is demonstrated through performance evaluation.
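The general idea of generalisation-based anonymisation, which such techniques build on, can be sketched minimally. The field names, coarsening rules, and helper functions below are illustrative assumptions and do not reproduce the paper's incremental scheme.

```python
# Minimal sketch of generalisation-based anonymisation: quasi-identifiers
# (age, postcode) are coarsened so each released record is less identifying,
# while the sensitive value is retained for analysis. Illustrative only.

def generalise(record):
    """Coarsen quasi-identifiers in one record."""
    decade = (record["age"] // 10) * 10
    return {
        "age_range": f"{decade}-{decade + 9}",
        "postcode": record["postcode"][:3] + "**",  # keep prefix, mask rest
        "diagnosis": record["diagnosis"],           # sensitive value kept
    }

def anonymise_increment(existing, new_records):
    """Apply the same generalisation to newly added records so the
    combined (incremental) release stays consistent."""
    return existing + [generalise(r) for r in new_records]

batch = [{"age": 34, "postcode": "90210", "diagnosis": "flu"}]
print(anonymise_increment([], batch))
```

The incremental problem the paper targets is precisely that new batches must be anonymised consistently with what was already released, without re-anonymising everything from scratch.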
Proposal for a Web Encoding Service (WES) for Spatial Data Transactions
NASA Astrophysics Data System (ADS)
Siew, C. B.; Peters, S.; Rahman, A. A.
2015-10-01
Web services utilization in Spatial Data Infrastructures (SDI) is well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of using City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial data volumes for data delivery. This paper revisits the available OGC Web Services (OWS) and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g., possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS). The integration could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.
Connecting to success: practice management on the Net.
Freydberg, B K
2001-08-15
Profound changes in the way dental practices manage data, patient records, and communication are beginning to unfold. Sooner than most of us can imagine, secured patient medical and dental records will reside on the Internet. Additionally, communication between health care providers and patients will become virtually 100% electronic. As the Application Service Provider (ASP) dental models mature, practices will transition from paper to "paperless" to "web-based" management and clinical systems. This article examines and explains these future frontiers.
NASA Astrophysics Data System (ADS)
Čepický, Jáchym; Moreira de Sousa, Luís
2016-06-01
The OGC® Web Processing Service (WPS) Interface Standard provides rules for standardizing inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes, and client discovery of and binding to those processes in workflows. Data required by a WPS can be delivered across a network or can be available at the server. PyWPS was one of the first server-side implementations of OGC WPS. It is written in the Python programming language and tries to connect to all existing tools for geospatial data analysis available on the Python platform. During the last two years, the PyWPS development team has written a new version (called PyWPS-4) completely from scratch. The analysis of large raster datasets poses several technical issues in implementing the WPS standard. The data format has to be defined and validated on the server side, and binary data have to be encoded using some numeric representation. Pulling raster data from remote servers introduces security risks; in addition, running several processes in parallel has to be possible, so that system resources are used efficiently while preserving security. Here we discuss these topics and illustrate some of the solutions adopted within the PyWPS implementation.
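The encoding issue mentioned above, binary raster payloads needing a text-safe numeric representation inside WPS requests and responses, is commonly solved with base64. The sketch below shows only that encoding step under that assumption; it does not depict PyWPS-4's internal handling.

```python
# Hedged sketch: base64-encoding a binary raster payload so it can travel
# inside an XML/JSON WPS message, then checking it round-trips intact.
# This illustrates the encoding step only, not PyWPS-4 internals.
import base64

raster_bytes = bytes(range(16))  # stand-in for raw raster file content

encoded = base64.b64encode(raster_bytes).decode("ascii")
decoded = base64.b64decode(encoded)

print(encoded)
print(decoded == raster_bytes)
```

The cost of this approach, a roughly 33% size inflation plus decode/validate work on the server, is one reason large-raster handling is singled out as a technical issue.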
Dynamic selection mechanism for quality of service aware web services
NASA Astrophysics Data System (ADS)
D'Mello, Demian Antony; Ananthanarayana, V. S.
2010-02-01
A web service is an interface to a software component that can be accessed by standard Internet protocols. Web service technology enables application-to-application communication and interoperability. The increasing number of web service providers throughout the globe has produced numerous web services providing the same or similar functionality. This necessitates tools and techniques to search for suitable services available over the Web. UDDI (universal description, discovery and integration) was the first initiative to find suitable web services based on the requester's functional demands. However, the requester's requirements may also include non-functional aspects like quality of service (QoS). In this paper, the authors define a QoS model for QoS-aware and business-driven web service publishing and selection. The authors propose a QoS requirement format for requesters to specify their complex demands on QoS for web service selection. The authors define a tree structure called a quality constraint tree (QCT) to represent the requester's variety of requirements on QoS properties with varied preferences. The paper proposes a QoS-broker-based architecture for web service selection, which facilitates requesters specifying their QoS requirements to select a qualitatively optimal web service. A web service selection algorithm is presented, which ranks functionally similar web services based on the degree of satisfaction of the requester's QoS requirements and preferences. The paper defines web service provider qualities to distinguish qualitatively competitive web services. The paper also presents the modelling and selection mechanism for the requester's alternative constraints defined on the QoS. The authors implement the QoS-broker-based system to prove the correctness of the proposed web service selection mechanism.
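A quality constraint tree of the kind described can be sketched as a small recursive structure: internal nodes combine child results with AND/OR, and leaves test a single QoS property against a bound. The node layout and property names below are an illustration of the idea, not the paper's exact QCT definition.

```python
# Hedged sketch of evaluating a quality constraint tree (QCT) against a
# candidate service's QoS values. Node layout is illustrative only.

def evaluate(node, qos):
    """Return True if the candidate's QoS values satisfy this subtree."""
    kind = node["kind"]
    if kind == "leaf":
        value = qos[node["property"]]
        return value <= node["max"] if "max" in node else value >= node["min"]
    results = [evaluate(child, qos) for child in node["children"]]
    return all(results) if kind == "and" else any(results)

# "response time under 200 ms AND (availability >= 0.99 OR cost <= 0.01)"
tree = {"kind": "and", "children": [
    {"kind": "leaf", "property": "response_ms", "max": 200},
    {"kind": "or", "children": [
        {"kind": "leaf", "property": "availability", "min": 0.99},
        {"kind": "leaf", "property": "cost", "max": 0.01},
    ]},
]}
print(evaluate(tree, {"response_ms": 150, "availability": 0.995, "cost": 0.05}))
```

A broker can run such an evaluation per candidate and then rank the survivors by degree of preference satisfaction, which matches the selection flow the abstract outlines.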
Usability and trust in e-banking.
Pravettoni, Gabriella; Leotta, Salvatore Nuccio; Lucchiari, Claudio; Misuraca, Raffaella
2007-12-01
This study assessed the role of usability in trust of e-banking services. A questionnaire was administered to 185 Italian undergraduate working students who volunteered for the experiment (M age = 30.5 yr., SD = 3.1). Participants were differentiated on computer ability (Expert, n = 104; Nonexpert, n = 81) and e-banking use (User, n = 93; Nonusers, n = 92). Analysis showed that the website usability of e-banking services did not play a very important role for the User group. Instead, institution-based trust, e.g., the trust in the security policy of the Web merchant, customers, and the overall trust of the bank were the crucial factors in the adoption of e-banking.
A Flexible Component based Access Control Architecture for OPeNDAP Services
NASA Astrophysics Data System (ADS)
Kershaw, Philip; Ananthakrishnan, Rachana; Cinquini, Luca; Lawrence, Bryan; Pascoe, Stephen; Siebenlist, Frank
2010-05-01
Network data access services such as OPeNDAP enable widespread access to data across user communities. However, without ready means to restrict access to data for such services, data providers and data owners are constrained from making their data more widely available. Even with such capability, the range of different security technologies available can make interoperability between services and user client tools a challenge. OPeNDAP is a key data access service in the infrastructure under development to support CMIP5 (the Coupled Model Intercomparison Project, Phase 5). The work is being carried out as part of an international collaboration including the US Earth System Grid and Curator projects and the EU funded IS-ENES and Metafor projects. This infrastructure will bring together Petabytes of climate model data and associated metadata from over twenty modelling centres around the world in a federation with a core archive mirrored at three data centres. A security system is needed to meet the requirements of organisations responsible for model data including the ability to restrict data access to registered users, keep them up to date with changes to data and services, audit access and protect finite computing resources. Individual organisations have existing tools and services such as OPeNDAP with which users in the climate research community are already familiar. The security system should overlay access control in a way which maintains the usability and ease of access to these services. The BADC (British Atmospheric Data Centre) has been working in collaboration with the Earth System Grid development team and partner organisations to develop the security architecture. OpenID and MyProxy were selected at an early stage in the ESG project to provide single sign-on capability across the federation of participating organisations. Building on the existing OPeNDAP specification an architecture based on pluggable server side components has been developed at the BADC.
These components filter requests to the service they protect and apply the required authentication and authorisation schemes. Filters have been developed for OpenID and SSL client-based authentication, the latter enabling access with MyProxy-issued credentials. By preserving a clear separation between the security and application functionality, multiple authentication technologies may be supported without the need for modification to the underlying OPeNDAP application. The software has been developed in the Python programming language, securing the Python-based OPeNDAP implementation, PyDAP. This utilises the Python WSGI (Web Server Gateway Interface) specification to create distinct security filter components. Work is also currently underway to develop a parallel Java-based filter implementation to secure the THREDDS Data Server. Whilst the ability to apply this flexible approach to the server-side security layer is important, the development of compatible client software is vital to the take-up of these services across a wide user base. To date, PyDAP and wget based clients have been tested, and work is planned to integrate the required security interface into the netCDF API. This forms part of ongoing collaboration with the OPeNDAP user and development community to ensure interoperability.
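The pluggable-filter pattern built on WSGI can be sketched as a small middleware. The header name and the pass/reject check below are illustrative stand-ins; the real filters implement OpenID and SSL client authentication rather than a simple header test.

```python
# Sketch of a WSGI security filter: a middleware wraps the data-serving
# application and rejects requests lacking a credential before they reach
# it. The credential check here is a placeholder for OpenID/SSL schemes.

def secured(app):
    """Wrap a WSGI app with a simple authentication filter."""
    def middleware(environ, start_response):
        if "HTTP_AUTHORIZATION" not in environ:
            start_response("401 Unauthorized",
                           [("Content-Type", "text/plain")])
            return [b"authentication required"]
        return app(environ, start_response)  # pass through to the app
    return middleware

def dap_app(environ, start_response):  # stand-in for a PyDAP-style app
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"dataset"]

app = secured(dap_app)

# Minimal invocation without a web server, capturing the status line:
status = {}
def capture(s, headers):
    status["code"] = s

body = app({"HTTP_AUTHORIZATION": "token"}, capture)
print(status["code"], body)
```

Because the filter and the application only meet at the WSGI interface, filters for different authentication technologies can be stacked or swapped without touching the underlying application, which is the separation the text emphasises.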
Image-based electronic patient records for secured collaborative medical applications.
Zhang, Jianguo; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Yao, Yihong; Cai, Weihua; Jin, Jin; Zhang, Guozhen; Sun, Kun
2005-01-01
We developed a Web-based system to interactively display image-based electronic patient records (EPR) for secured intranet and Internet collaborative medical applications. The system consists of four major components: EPR DICOM gateway (EPR-GW), Image-based EPR repository server (EPR-Server), Web Server and EPR DICOM viewer (EPR-Viewer). In the EPR-GW and EPR-Viewer, the security modules of Digital Signature and Authentication are integrated to perform the security processing on the EPR data with integrity and authenticity. The privacy of EPR in data communication and exchanging is provided by SSL/TLS-based secure communication. This presentation gave a new approach to create and manage image-based EPR from actual patient records, and also presented a way to use Web technology and DICOM standard to build an open architecture for collaborative medical applications.
C-C1-04: Building a Health Services Information Technology Research Environment
Gehrum, David W; Jones, JB; Romania, Gregory J; Young, David L; Lerch, Virginia R; Bruce, Christa A; Donkochik, Diane; Stewart, Walter F
2010-01-01
Background: The electronic health record (EHR) has opened a new era for health services research (HSR) where information technology (IT) is used to re-engineer care processes. While the EHR provides one means of advancing novel solutions, a promising strategy is to develop tools (e.g., online questionnaires, visual display tools, decision support) distinct from, but which interact with, the EHR. Development of such software tools outside the EHR offers an advantage in flexibility, sophistication, and ultimately in portability to other settings. However, institutional IT departments have an imperative to protect patient data and to standardize IT processes to ensure system-level security and support traditional business needs. Such imperatives usually present formidable process barriers to testing novel software solutions. We describe how, in collaboration with our IT department, we are creating an environment and a process that allows for routine and rapid testing of novel software solutions. Methods: We convened a working group consisting of IT and research personnel with expertise in information security, database design/management, web design, EHR programming, and health services research. The working group was tasked with developing a research IT environment to accomplish two objectives: maintain network/ data security and regulatory compliance; allow researchers working with external vendors to rapidly prototype and, in a clinical setting, test web-based tools. Results: Two parallel solutions, one focused on hardware, the second on oversight and management, were developed. First, we concluded that three separate, staged development environments were required to allow external vendor access for testing software and for transitioning software to be used in a clinic. 
In parallel, the extant oversight process for approving/managing access to internal/external personnel had to be altered to reflect the scope and scale of discrete research projects, as opposed to an enterprise-level approach to IT management. Conclusions: Innovation in health services software development requires a flexible, scalable IT environment adapted to the unique objectives of a HSR software development model. In our experience, implementing the hardware solution is less challenging than the cultural change required to implement such a model and the modifications to administrative and oversight processes to sustain an environment for rapid product development and testing.
Cyber security challenges in Smart Cities: Safety, security and privacy.
Elmaghraby, Adel S; Losavio, Michael M
2014-07-01
The world is experiencing an evolution of Smart Cities. These emerge from innovations in information technology that, while they create new economic and social opportunities, pose challenges to our security and expectations of privacy. Humans are already interconnected via smart phones and gadgets. Smart energy meters, security devices and smart appliances are being used in many cities. Homes, cars, public venues and other social systems are now on their path to the full connectivity known as the "Internet of Things." Standards are evolving for all of these potentially connected systems. They will lead to unprecedented improvements in the quality of life. To benefit from them, city infrastructures and services are changing with new interconnected systems for monitoring, control and automation. Intelligent transportation, public and private, will access a web of interconnected data from GPS location to weather and traffic updates. Integrated systems will aid public safety, emergency responders and disaster recovery. We examine two important and entangled challenges: security and privacy. Security includes illegal access to information and attacks causing physical disruptions in service availability. As digital citizens are more and more instrumented, with data available about their location and activities, privacy seems to disappear. Privacy-protecting systems that gather data and trigger emergency response when needed are technological challenges that go hand-in-hand with the continuous security challenges. Their implementation is essential for a Smart City in which we would wish to live. We also present a model representing the interactions between persons, servers and things, the major elements of the Smart City whose interactions are what we need to protect.
Server-Based and Server-Less Byod Solutions to Support Electronic Learning
2016-06-01
[Extraction fragments from the source; surrounding text lost] … Knowledge Online; NSD, National Security Directive; OS, operating system; OWA, Outlook Web Access; PC, personal computer; PED, personal electronic device; PDA, … Fragments quote DOD direction to "institute mobile device policies and standards, and promote the development and use of DOD mobile and web-enabled applications" and note that, with an isolated BYOD web server, properly educated system administrators must carry out and execute the necessary, pre-defined network security …
2014-01-01
With smartphone distribution becoming common and robotic applications on the rise, social tagging services for various applications, including robotic domains, have advanced significantly. Though social tagging plays an important role when users search the web for exact information, the reliability of tags and the semantic relation between web content and tags are not considered. Spammers exploit this by deliberately attaching irrelevant tags to content, inducing users to click advertised items in search results. Therefore, this study proposes a detection method for tag-ranking manipulation, addressing the inability of existing methods to guarantee the reliability of tagging. Similarity is measured to rank the grade of each tag registered on the content, and weighted values for each tag are computed from synonym relevance, frequency, and semantic distances between tags. Lastly, experimental evaluation results are provided, verifying the method's efficiency and accuracy. PMID:25114975
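The weighting idea above can be sketched as follows; the Jaccard character similarity and the equal mixing of relevance and frequency are stand-in assumptions, not the paper's actual synonym-relevance and semantic-distance measures.

```python
# Hedged sketch: a tag's weight combines its frequency with its average
# similarity to the other tags on the same content, so an irrelevant
# (spam) tag scores low. Jaccard character similarity is an illustrative
# stand-in for the paper's synonym/semantic measures.

def jaccard(a, b):
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def tag_weights(tag_counts):
    """tag_counts: {tag: frequency}. Return {tag: weight in [0, 1]}."""
    tags = list(tag_counts)
    max_freq = max(tag_counts.values())
    weights = {}
    for t in tags:
        others = [u for u in tags if u != t]
        relevance = (sum(jaccard(t, u) for u in others) / len(others)) if others else 1.0
        frequency = tag_counts[t] / max_freq
        weights[t] = 0.5 * relevance + 0.5 * frequency  # equal mixing is an assumption
    return weights

w = tag_weights({"robot": 9, "robotics": 8, "casino": 1})
```

Here the off-topic, low-frequency tag scores well below the two mutually related tags, which is the signal a spam detector would threshold on.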
2018-01-01
Background Structural and functional brain images are essential imaging modalities for medical experts to study brain anatomy. These images are typically inspected visually by experts. To analyze images without any bias, they must first be converted to numeric values. Many software packages are available to process the images, but they are complex and difficult to use, as well as hardware intensive. The results obtained after processing vary depending on the native operating system used and its associated software libraries; data processed on one system cannot typically be combined with data on another. Objective The aim of this study was to fulfill the neuroimaging community's need for a common platform to store, process, explore, and visualize their neuroimaging data and results using the Neuroimaging Web Services Interface: a series of processing pipelines designed as a cyber-physical system for neuroimaging and clinical data in brain research. Methods The Neuroimaging Web Services Interface accepts magnetic resonance imaging, positron emission tomography, diffusion tensor imaging, and functional magnetic resonance imaging. These images are processed using existing and custom software packages. The output is then stored as image files, tabulated files, and MySQL tables. The system, made up of a series of interconnected servers, is password-protected and securely accessible through a Web interface, allowing (1) visualization of results and (2) downloading of tabulated data. Results All results were obtained using our processing servers in order to maintain data validity and consistency. The design is responsive and scalable. The processing pipeline starts from a FreeSurfer reconstruction of structural magnetic resonance imaging images.
The FreeSurfer and regional standardized uptake value ratio calculations were validated using Alzheimer's Disease Neuroimaging Initiative input images, and the results were posted at the Laboratory of Neuro Imaging data archive. Leading researchers in the fields of Alzheimer's Disease and epilepsy have used the interface to access and process the data and visualize the results. Tabulated results with unique visualization mechanisms help guide more informed diagnosis and expert rating, providing a truly unique multimodal imaging platform that combines magnetic resonance imaging, positron emission tomography, diffusion tensor imaging, and resting-state functional magnetic resonance imaging. A quality control component was reinforced through expert visual rating involving at least 2 experts. Conclusions To our knowledge, there is no validated Web-based system offering all the services that the Neuroimaging Web Services Interface offers. The intent of the Neuroimaging Web Services Interface is to create a tool for clinicians and researchers with a keen interest in multimodal neuroimaging. More importantly, the Neuroimaging Web Services Interface significantly augments the Alzheimer's Disease Neuroimaging Initiative data, especially since our data contain a large cohort of Hispanic normal controls and Alzheimer's Disease patients. The obtained results can be scrutinized visually or through the tabulated forms, informing researchers of the subtle changes that characterize the different stages of the disease. PMID:29699962
ERIC Educational Resources Information Center
Brandt, D. Scott
1998-01-01
Examines Internet security risks and how users can protect themselves. Discusses inadvertent bugs in software; programming problems with Common Gateway Interface (CGI); viruses; tracking of Web users; and preventing access to selected Web pages and filtering software. A glossary of Internet security-related terms is included. (AEF)
Threats and risks to information security: a practical analysis of free access wireless networks
NASA Astrophysics Data System (ADS)
Quirumbay, Daniel I.; Coronel, Iván. A.; Bayas, Marcia M.; Rovira, Ronald H.; Gromaszek, Konrad; Tleshova, Akmaral; Kozbekova, Ainur
2017-08-01
Nowadays, there is an ever-growing need to investigate, consult and communicate through the internet. This need leads to the expansion of free web access at strategic and functional points for the benefit of the community. However, this open access is also related to the increase of information insecurity. Existing work on computer security primarily focuses on the development of techniques to reduce cyber-attacks. However, these approaches do not address the sector of inexperienced users who have difficulty understanding browser settings. Two methods can address this problem: first, the development of friendly browsers with intuitive setups for new users; and second, the implementation of awareness programs on essential security without going deeply into technical detail. This article presents an analysis of the vulnerabilities of the wireless equipment that provides internet service in open access zones and the potential risks of using these means.
Professional convergence in forensic practice.
Mercer, D; Mason, T; Richman, J
2001-06-01
This paper outlines the development and convergence of forensic science and secure psychiatric services in the UK, locating the professionalization of forensic nursing within a complex web of political, economic, and ideological structures. It is suggested that a stagnation of the therapeutic enterprise in high and medium security provision has witnessed an intrusion of medical power into the societal body. Expanding technologies of control and surveillance are discussed in relation to the move from modernity to postmodernity and the ongoing dynamic of medicalized offending. Four aspects of globalization are identified as impacting upon the organization and application of forensic practice: (i) organized capitalism and the exhaustion of the welfare state; (ii) security versus danger and trust versus risk; (iii) science as a meta-language; and (iv) foreclosure as a mechanism of censorship. Finally, as a challenge for the profession, some predictions are offered about the future directions or demise of forensic nursing.
A radiology department intranet: development and applications.
Willing, S J; Berland, L L
1999-01-01
An intranet is a "private Internet" that uses the protocols of the World Wide Web to share information resources within a company or with the company's business partners and clients. The hardware requirements for an intranet begin with a dedicated Web server permanently connected to the departmental network. The heart of a Web server is the hypertext transfer protocol (HTTP) service, which receives a page request from a client's browser and transmits the page back to the client. Although knowledge of hypertext markup language (HTML) is not essential for authoring a Web page, a working familiarity with HTML is useful, as is knowledge of programming and database management. Security can be ensured by using scripts to write information in hidden fields or by means of "cookies." Interfacing databases and database management systems with the Web server and conforming the user interface to HTML syntax can be achieved by means of the common gateway interface (CGI), Active Server Pages (ASP), or other methods. An intranet in a radiology department could include the following types of content: on-call schedules, work schedules and a calendar, a personnel directory, resident resources, memorandums and discussion groups, software for a radiology information system, and databases.
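The kind of dynamic page generation the article describes can be sketched as a CGI-style handler that renders an on-call schedule and embeds a hidden form field, one of the simple state-keeping techniques mentioned alongside cookies. The schedule data and token scheme are illustrative assumptions.

```python
# Hedged sketch: render departmental data (an on-call schedule) as HTML
# and carry a session token in a hidden field, as the article describes.
# Data, field names, and token handling are invented for illustration.
import html

def render_on_call(schedule, session_token):
    rows = "\n".join(
        f"<tr><td>{html.escape(day)}</td><td>{html.escape(name)}</td></tr>"
        for day, name in schedule
    )
    return (
        "<html><body><h1>On-Call Schedule</h1>"
        f"<table>{rows}</table>"
        "<form method='post' action='/update'>"
        f"<input type='hidden' name='session' value='{html.escape(session_token)}'>"
        "</form></body></html>"
    )

page = render_on_call([("Mon", "Dr. Smith"), ("Tue", "Dr. Jones")], "abc123")
```

Escaping every interpolated value with `html.escape` is the minimum hygiene a real CGI/ASP handler would need when the data comes from a departmental database.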
Robust image obfuscation for privacy protection in Web 2.0 applications
NASA Astrophysics Data System (ADS)
Poller, Andreas; Steinebach, Martin; Liu, Huajian
2012-03-01
We present two approaches to robust image obfuscation based on permutation of image regions and channel intensity modulation. The proposed concept of robust image obfuscation is a step towards end-to-end security in Web 2.0 applications. It helps to protect the privacy of users against threats caused by internet bots and web applications that extract biometric and other features from images for data-linkage purposes. The approaches described in this paper consider that images uploaded to Web 2.0 applications pass through several transformations, such as scaling and JPEG compression, before the receiver downloads them. In contrast to existing approaches, our focus is on usability; therefore, the primary goal is not a maximum of security but an acceptable trade-off between security and resulting image quality.
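The region-permutation building block can be sketched with a key-derived permutation that the receiver inverts with the shared key. The block list and key handling are illustrative, and unlike the paper's scheme this toy version does not survive scaling or JPEG compression.

```python
# Hedged sketch of key-based region permutation, one of the two
# obfuscation building blocks described above. Strings stand in for
# image regions; the permutation is derived deterministically from a
# shared key so the receiver can invert it.
import random

def permutation(n, key):
    """Deterministic permutation of range(n) derived from the key."""
    idx = list(range(n))
    random.Random(key).shuffle(idx)
    return idx

def obfuscate(blocks, key):
    perm = permutation(len(blocks), key)
    return [blocks[i] for i in perm]

def deobfuscate(scrambled, key):
    perm = permutation(len(scrambled), key)
    out = [None] * len(scrambled)
    for pos, i in enumerate(perm):
        out[i] = scrambled[pos]  # undo: position pos held original block i
    return out

regions = ["r0", "r1", "r2", "r3", "r4", "r5"]
scrambled = obfuscate(regions, key="shared-secret")
restored = deobfuscate(scrambled, key="shared-secret")
```

The round trip restores the original order only for holders of the key, which is the privacy property the paper builds on (its real scheme additionally tolerates lossy re-encoding).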
Data privacy preservation in telemedicine: the PAIRSE project.
Nageba, Ebrahim; Defude, Bruno; Morvan, Franck; Ghedira, Chirine; Fayn, Jocelyne
2011-01-01
The preservation of medical data privacy and confidentiality is a major challenge in eHealth systems and applications. A technological solution based on advanced information and communication systems architectures is needed in order to retrieve and exchange the patient's data in a secure and reliable manner. In this paper, we introduce the project PAIRSE, Preserving Privacy in Peer to Peer (P2P) environments, which proposes an original web service oriented framework preserving the privacy and confidentiality of shared or exchanged medical data.
2006-05-01
[Extraction fragments from the source; surrounding text lost] A noise-level table listing sound sources with approximate dB values: Subway Train (New York); Food Blender at 3 ft, 80; Garbage Disposal at 3 ft; Shouting at 3 ft, 70; Vacuum Cleaner at 10 ft; Normal Speech, 60. Separate text fragments mention a statute of 1899, Natural Resources Conservation Service (NRCS) procedures for identifying wetlands for compliance with the Food Security Act, and wetland functions: organic litter and wood contribution to aquatic systems, nutrient retention and cycling, wildlife habitat, and food-web support for a wide range of aquatic …
HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation
NASA Astrophysics Data System (ADS)
Jin, Jin; Zhang, Jianguo; Chen, Xiaomeng; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Feng, Jie; Sheng, Liwei; Huang, H. K.
2006-03-01
As a governmental regulation, the Health Insurance Portability and Accountability Act (HIPAA) was issued to protect the privacy of health information that identifies individuals, living or deceased. HIPAA requires security services supporting the following implementation features: access control, audit controls, authorization control, data authentication, and entity authentication. Among the controls proposed in the HIPAA Security Standards, audit trails are the focus here. Audit trails can be used for surveillance, to detect when interesting events might be happening that warrant further investigation, or forensically, after the detection of a security breach, to determine what went wrong and who or what was at fault. In order to provide security control services and to achieve high and continuous availability, we designed a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation. The system consists of two parts: monitoring agents running on each PACS component computer and a Monitor Server running on a remote computer. Monitoring agents are deployed on all computer nodes in the RIS-integrated PACS system to collect the audit trail messages defined by Supplement 95 of the DICOM standard: Audit Trail Messages. The Monitor Server then gathers all audit messages and processes them to provide security information at three levels: system resources, PACS/RIS applications, and users/patients data access. RIS-integrated PACS managers can now monitor and control the entire RIS-integrated PACS operation through a web service provided by the Monitor Server. This paper presents the design of a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation and gives preliminary results obtained by running this monitoring system on a clinical RIS-integrated PACS.
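The Monitor Server's three-level grouping step can be sketched as follows; the event-type names are invented for illustration and are not the actual DICOM Supplement 95 message codes.

```python
# Hedged sketch of the Monitor Server's grouping step: incoming audit
# messages are sorted into the three reporting levels named above.
# Event names and the fallback level are illustrative assumptions.

LEVELS = {
    "disk-full": "system resources",
    "service-restart": "system resources",
    "ris-login": "PACS/RIS applications",
    "study-query": "PACS/RIS applications",
    "patient-record-read": "users/patients data access",
    "patient-record-export": "users/patients data access",
}

def group_by_level(messages):
    """messages: list of dicts with an 'event' key. Return {level: [messages]}."""
    report = {}
    for msg in messages:
        level = LEVELS.get(msg["event"], "PACS/RIS applications")  # default is an assumption
        report.setdefault(level, []).append(msg)
    return report

report = group_by_level([
    {"event": "patient-record-read", "user": "dr_a"},
    {"event": "disk-full", "node": "pacs01"},
])
```

A real deployment would feed this from agents on every PACS node and expose the grouped view through the Monitor Server's web service.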
Personalization of Rule-based Web Services.
Choi, Okkyung; Han, Sang Yong
2008-04-04
Nowadays, Web users have clearly expressed their wish to receive personalized services. Personalization means tailoring services directly to the immediate requirements of the user. However, the current Web Services System does not provide features supporting this, such as personalization of services and intelligent matchmaking. In this research, a flexible, personalized rule-based Web Services System is proposed to address these problems and to enable efficient search, discovery and construction across general Web documents and Semantic Web documents. The system performs matchmaking among service requesters', service providers' and users' preferences using a rule-based search method, and subsequently ranks the search results. A prototype of efficient Web Services search and construction for the suggested system has been developed based on the current work.
NASA Astrophysics Data System (ADS)
Satoh, Hitoshi; Niki, Noboru; Eguchi, Kenji; Ohmatsu, Hironobu; Kaneko, Masahiro; Kakinuma, Ryutaro; Moriyama, Noriyuki
2010-03-01
Diagnostic MDCT imaging requires a considerable number of images to be read. Moreover, Japan has a shortage of doctors who can diagnose medical images. Against this background, we have provided diagnostic assistance to medical screening specialists by developing a lung cancer screening algorithm that automatically detects suspected lung cancers in helical CT images, a coronary artery calcification screening algorithm that automatically detects suspected coronary artery calcification, and a vertebral body analysis algorithm for quantitative evaluation of osteoporosis. We have also developed a teleradiology network system based on a web medical image conference system. In a teleradiology network, the security of the information network is a very important subject. Our system can hold web medical image conferences among medical institutions at remote locations, and we completed a basic proof-of-concept experiment of the web medical image conference system with an information security solution. The screen of the conference system can be shared by two or more conference terminals at the same time, and opinions can be exchanged using a camera and a microphone connected to the workstation that incorporates the diagnostic assistance methods. Biometric face authentication used on site enables secure file encryption and login, and the privacy and information security technology of the solution ensures compliance with Japanese regulations. As a result, patients' private information is protected. Based on these diagnostic assistance methods, we have developed a new computer-aided workstation and a new teleradiology network that can display suspected lesions three-dimensionally in a short time.
The results of this study indicate that our filmless radiological information system, using the computer-aided diagnosis workstation and our teleradiology network, can increase diagnostic speed and accuracy while improving the security of medical information.
NASA Astrophysics Data System (ADS)
Satibi; Widodo, Catur Edi; Farikhin
2018-02-01
This research aims to optimize forex trading profit automatically using Expert Advisors (EAs) while keeping accuracy and drawdown at acceptable levels. The evaluation system classifies EA performance by trading market session (Sydney, Tokyo, London and New York) to determine the right EA for each session. The evaluation system is web-based, applies the ELECTRE method, interacts in real time with EAs through a web service, and presents a real-time performance dashboard over web socket communication. The web application is programmed in NodeJs. In the testing period, every EA was simulated 24 hours a day across all market sessions for three months; the best EA was selected on profit, accuracy and drawdown criteria calculated with the web-based ELECTRE method. The idea of this research is to compare the single best EA over the testing period with the collaborative performance of the best EA for each market session. Three months of EUR/USD historical data are used as the testing period and another three months as the validation period. As a result, the collaboration of the four best EAs, classified by market session, increased profit percentage consistently in both the testing and validation periods while keeping accuracy and drawdown at secure levels.
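An ELECTRE-I style comparison over the three criteria named above (profit and accuracy to maximize, drawdown to minimize) might look like the sketch below. The weights and thresholds are illustrative assumptions, and the paper's exact ELECTRE variant may differ.

```python
# Hedged sketch of ELECTRE-I outranking over the paper's three criteria.
# Weights, thresholds, and sample values are invented for illustration.

CRITERIA = [("profit", 0.5, +1), ("accuracy", 0.3, +1), ("drawdown", 0.2, -1)]

def concordance(a, b):
    """Sum of weights of criteria where a is at least as good as b."""
    total = 0.0
    for name, weight, sign in CRITERIA:
        if sign * a[name] >= sign * b[name]:
            total += weight
    return total

def discordance(a, b, ranges):
    """Worst normalized amount by which b beats a on any criterion."""
    worst = 0.0
    for name, _, sign in CRITERIA:
        gap = sign * (b[name] - a[name])
        if gap > 0:
            worst = max(worst, gap / ranges[name])
    return worst

def outranks(a, b, ranges, c_hat=0.7, d_hat=0.3):
    """a outranks b when concordance is high and discordance is low."""
    return concordance(a, b) >= c_hat and discordance(a, b, ranges) <= d_hat

ea1 = {"profit": 12.0, "accuracy": 0.80, "drawdown": 5.0}
ea2 = {"profit": 8.0, "accuracy": 0.75, "drawdown": 9.0}
ranges = {"profit": 10.0, "accuracy": 0.20, "drawdown": 10.0}
```

With these sample values, `ea1` dominates `ea2` on all three criteria, so the outranking relation holds in one direction only.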
NASA Astrophysics Data System (ADS)
Alpert, J. C.; Rutledge, G.; Wang, J.; Freeman, P.; Kang, C. Y.
2009-05-01
The NOAA Operational Modeling Archive Distribution System (NOMADS) is now delivering high availability services as part of NOAA's official real time data dissemination at its Web Operations Center (WOC). The WOC is a web service used by all organizational units in NOAA and acts as a data repository where public information can be posted to a secure and scalable content server. A goal is to foster collaborations among the research and education communities, value added retailers, and public access for science and development efforts aimed at advancing modeling and GEO-related tasks. The services used to access the operational model data output are the Open-source Project for a Network Data Access Protocol (OPeNDAP), implemented with the Grid Analysis and Display System (GrADS) Data Server (GDS), and applications for slicing, dicing and area sub-setting the large matrix of real time model data holdings. This approach ensures an efficient use of computer resources because users transmit/receive only the data necessary for their tasks, including metadata. Data sets served in this way with a high availability server offer vast possibilities for the creation of new products for value added retailers and the scientific community. New applications to access data and observations for verification of gridded model output, and progress toward integration with access to conventional and non-conventional observations, will be discussed. We will demonstrate how users can use NOMADS services to repackage area subsets, either as repackaged GRIB2 files or as values selected by ensemble component, (forecast) time, vertical level, global horizontal location, and variable: virtually a six-dimensional analysis service across the internet.
Distributed spatial information integration based on web service
NASA Astrophysics Data System (ADS)
Tong, Hengjian; Zhang, Yun; Shao, Zhenfeng
2008-10-01
Spatial information systems and spatial information in different geographic locations usually belong to different organizations. They are distributed and often heterogeneous and independent from each other. This leads to the formation of many isolated spatial information islands, reducing the efficiency of information utilization. In order to address this issue, we present a method for effective spatial information integration based on web service. The method applies asynchronous invocation of web service and dynamic invocation of web service to implement distributed, parallel execution of web map services. All isolated information islands are connected by the dispatcher of web service and its registration database to form a uniform collaborative system. According to the web service registration database, the dispatcher of web services can dynamically invoke each web map service through an asynchronous delegating mechanism. All of the web map services can be executed at the same time. When each web map service is done, an image is returned to the dispatcher. After all of the web services are done, all images are transparently overlaid together in the dispatcher. Thus, users can browse and analyze the integrated spatial information. Experiments demonstrate that the utilization rate of spatial information resources is significantly raised through the proposed method of distributed spatial information integration.
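The dispatcher pattern described above can be sketched as parallel invocation of every registered service followed by an overlay step once all results arrive. The registry and the strings standing in for rendered map images are illustrative.

```python
# Hedged sketch of the dispatcher: all registered web map services are
# invoked in parallel, and the returned layers are collected (the
# "overlay" step) after every invocation finishes. Service names and
# the string stand-ins for map images are invented for illustration.
from concurrent.futures import ThreadPoolExecutor

def make_service(name):
    def service(bbox):
        # A real service would return a rendered map image for this bbox.
        return f"{name}-layer{bbox}"
    return service

REGISTRY = {"roads": make_service("roads"), "rivers": make_service("rivers")}

def dispatch(bbox):
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(svc, bbox) for name, svc in REGISTRY.items()}
        # Overlay step: gather every layer once all invocations are done.
        return [futures[name].result() for name in REGISTRY]

layers = dispatch(bbox=(0, 0, 10, 10))
```

Submitting all calls before collecting any result is what makes the invocations run concurrently, mirroring the paper's asynchronous delegating mechanism.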
Providing Multi-Page Data Extraction Services with XWRAPComposer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Ling; Zhang, Jianjun; Han, Wei
2008-04-30
Dynamic Web data sources – sometimes known collectively as the Deep Web – increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DYNABOT, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DYNABOT has three unique characteristics. First, DYNABOT utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DYNABOT employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DYNABOT incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.
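SCD-based service matching might be sketched as a coverage check: a service class description lists the parameters a source must accept and the fields its results must expose, and a candidate matches when it covers both sets. The field names are invented for illustration and are not DYNABOT's actual schema.

```python
# Hedged sketch of SCD-based matching: a candidate Deep Web source
# matches a service class description (SCD) when it accepts all required
# inputs and exposes all required outputs. Field names are hypothetical.

def matches_scd(scd, source):
    return (set(scd["required_inputs"]) <= set(source["inputs"])
            and set(scd["required_outputs"]) <= set(source["outputs"]))

event_listing_scd = {
    "required_inputs": {"city", "date"},
    "required_outputs": {"event_name", "venue"},
}
candidate = {
    "inputs": {"city", "date", "category"},
    "outputs": {"event_name", "venue", "price"},
}
```

A crawler probing a discovered form would fill in `inputs`/`outputs` from its analysis of the form and its sample results, then run checks like this against each known SCD to cluster the source.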
Interfaces to PeptideAtlas: a case study of standard data access systems
Handcock, Jeremy; Robinson, Thomas; Deutsch, Eric W.; Boyle, John
2012-01-01
Access to public data sets is important to the scientific community as a resource to develop new experiments or validate new data. Projects such as the PeptideAtlas, Ensembl and The Cancer Genome Atlas (TCGA) offer both access to public data and a repository to share their own data. Access to these data sets is often provided through a web page form and a web service API. Access technologies based on web protocols (e.g. http) have been in use for over a decade and are widely adopted across the industry for a variety of functions (e.g. search, commercial transactions, and social media). Each architecture adapts these technologies to provide users with tools to access and share data. Both commonly used web service technologies (e.g. REST and SOAP) and custom-built solutions over HTTP are utilized in providing access to research data. Providing multiple access points ensures that the community can access the data in the simplest and most effective manner for their particular needs. This article examines three common access mechanisms for web accessible data: BioMart, caBIG, and Google Data Sources. These are illustrated by implementing each over the PeptideAtlas repository and reviewed for their suitability based on specific usages common to research. BioMart, Google Data Sources, and caBIG are each suitable for certain uses. The tradeoffs made in the development of each technology depend on the uses it was designed for (e.g. security versus speed). This means that an understanding of specific requirements and tradeoffs is necessary before selecting the access technology. PMID:22941959
An Automated End-to-End Multi-Agent QoS-Based Architecture for Selection of Geospatial Web Services
NASA Astrophysics Data System (ADS)
Shah, M.; Verma, Y.; Nandakumar, R.
2012-07-01
Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits such as reduced integration expense, better asset reuse, higher business agility, and reduced business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain, and geospatial Web services address it. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users more effectively than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web; such services are often both computation- and data-intensive, involving diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality but different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their providers. In quality-driven selection of web services, it is important to consider the non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution that provides the best-fit web service to a service requester based on QoS.
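QoS-driven selection can be sketched as weighted scoring over normalized non-functional properties, with the highest-scoring functionally equivalent service chosen as the best fit. The two QoS attributes (response time to minimize, availability to maximize) and the weights are illustrative assumptions.

```python
# Hedged sketch of QoS-based selection: candidate services offering the
# same function are ranked by a weighted score over normalized QoS
# attributes. Attribute names, weights, and values are hypothetical.

def qos_score(svc, weights, rt_max):
    rt = 1.0 - svc["response_time"] / rt_max   # lower response time is better
    return weights["response_time"] * rt + weights["availability"] * svc["availability"]

def best_fit(services, weights):
    rt_max = max(s["response_time"] for s in services)
    return max(services, key=lambda s: qos_score(s, weights, rt_max))

candidates = [
    {"name": "wms-a", "response_time": 120.0, "availability": 0.999},
    {"name": "wms-b", "response_time": 400.0, "availability": 0.95},
]
choice = best_fit(candidates, {"response_time": 0.5, "availability": 0.5})
```

In a multi-agent setting, the weights would come from the requester's stated constraints, and each agent would score the candidates it has discovered.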
MedlinePlus Connect: Web Service
... https://medlineplus.gov/connect/service.html MedlinePlus Connect: Web Service To use the sharing features on this ... if you implement MedlinePlus Connect by contacting us . Web Service Overview The parameters for the Web service ...
Focused Crawling of the Deep Web Using Service Class Descriptions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rocco, D; Liu, L; Critchlow, T
2004-06-21
Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.
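As a rough illustration of the service-class idea (not DynaBot's actual data structures), a service class description can be reduced to the set of query parameters a member service must accept, and a discovered query form matched against it:

```python
# Hypothetical sketch: does a crawled form belong to a service class?
# The SCD here lists required/optional query parameters; a form matches
# if it covers all required parameters. Field names are invented.

SCD_BIBLIO = {"required": {"author", "title"}, "optional": {"year"}}

def matches_service_class(form_fields, scd, threshold=1.0):
    """Fraction of required SCD parameters covered must meet the threshold."""
    fields = set(form_fields)
    required = scd["required"]
    coverage = len(required & fields) / len(required)
    return coverage >= threshold

print(matches_service_class({"author", "title", "journal"}, SCD_BIBLIO))  # True
print(matches_service_class({"keyword"}, SCD_BIBLIO))                     # False
```

A real matcher would also probe the form with sample queries and analyze responses, as the abstract describes; this sketch covers only the parameter-coverage part.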
Web-based GIS for spatial pattern detection: application to malaria incidence in Vietnam.
Bui, Thanh Quang; Pham, Hai Minh
2016-01-01
There is great concern about how to build an interoperable health information system spanning public health and health information technology within the development of public information and health surveillance programmes. Technically, major issues remain regarding health data visualization, spatial processing of health data, health information dissemination, data sharing and the access of local communities to health information. In combination with GIS, we propose a technical framework for web-based health data visualization and spatial analysis. Data were collected from open map servers and geocoded with the Open Data Kit package and data geocoding tools. The web-based system is built on open-source frameworks and libraries. The system provides web-based analysis tools for pattern detection through three spatial tests: nearest neighbour, K function, and spatial autocorrelation. The result is a web-based GIS through which end users can detect disease patterns by selecting an area and spatial test parameters, and contribute findings to managers and decision makers. The end users can be health practitioners, educators, local communities, health sector authorities and decision makers. This web-based system allows for the improvement of health-related services to public sector users as well as citizens in a secure manner. The combination of spatial statistics and web-based GIS can be a solution that helps empower health practitioners in direct and specific intersectoral actions, thus providing for better analysis, control and decision-making.
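The first of the three tests named above, the nearest-neighbour test, can be sketched with the Clark-Evans ratio: the observed mean nearest-neighbour distance divided by its expectation under complete spatial randomness (CSR). A ratio well below 1 suggests clustering, above 1 regularity. The coordinates below are invented.

```python
# Sketch of the nearest-neighbour index R = observed mean NN distance /
# expected mean NN distance under CSR (0.5 / sqrt(n / area)).

import math

def nearest_neighbour_index(points, area):
    n = len(points)
    total = 0.0
    for i, (x1, y1) in enumerate(points):
        nearest = min(
            math.hypot(x1 - x2, y1 - y2)
            for j, (x2, y2) in enumerate(points) if i != j
        )
        total += nearest
    observed = total / n
    expected = 0.5 / math.sqrt(n / area)  # CSR expectation
    return observed / expected

# Four points packed into a corner of a large study area -> clustered (R < 1)
cluster = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(nearest_neighbour_index(cluster, area=100.0))  # -> 0.4
```

Production systems normally add an edge-correction term and a significance test; both are omitted here for brevity.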
Advanced Communication and Control Solutions of Distributed Energy Resources (DER)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Asgeirsson, Haukur; Seguin, Richard; Sherding, Cameron
2007-01-10
This report covers work performed in Phase II of a two-phase project whose objective was to demonstrate the aggregation of multiple Distributed Energy Resources (DERs) and to offer them into the energy market. The Phase I work (DE-FC36-03CH11161) created an integrated, but distributed, system and procedures to monitor and control multiple DERs from numerous manufacturers connected to the electric distribution system. Procedures were created which protect the distribution network and personnel who may be working on the network. Using the web as the communication medium for control and monitoring of the DERs, the integration of information and security was accomplished through the use of industry standard protocols such as secure SSL, VPN and ICCP. The primary objective of Phase II was to develop the procedures for marketing the power of the Phase I aggregated DERs in the energy market, increase the number of DER units, and implement the marketing procedures (interface with ISOs) for the DER generated power. The team partnered with the Midwest Independent System Operator (MISO), the local ISO, to address the energy market and demonstrate the economic dispatch of DERs in response to market signals. The selection of standards-based communication technologies allows the system to be deployed and integrated with other utilities’ resources. With the use of a data historian technology to facilitate the aggregation, the developed algorithms and procedures can be verified, audited, and modified. The team has demonstrated monitoring and control of multiple DERs as outlined in the Phase I report, including procedures to perform these operations in a secure and safe manner. In Phase II, additional DER units were added. We also expanded on our Phase I work to enhance communication security and to develop the market model of having DERs, both customer and utility owned, participate in the energy market.
We are proposing a two-part DER energy market model--a utility-need business model and an independent energy aggregator business model. The approach of developing two group models of DER energy participation in the market is unique. The Detroit Edison (DECo, utility)-led team includes: DTE Energy Technologies (Dtech, DER provider), Electrical Distribution Design (EDD, Virginia Tech company supporting EPRI’s Distribution Engineering Workstation, DEW), Systems Integration Specialists Company (SISCO, economic scheduling and real-time protocol integrator), and OSIsoft (PI software system for managing real-time information). This team is focused on developing the application engineering, including the software systems necessary for DER integration, control and sale into the marketplace. Phase II highlights: Installed and tested an ICCP link with SSL (security) between DECo, the utility, and DTE Energy Technologies (DTECH), the aggregator, making DER data available to the utility for both monitoring and control. Installed and tested PI Process Book with circuit and DER operational models for DECo SOC/ROC operators' use in monitoring both utility circuit and customer DER parameters; the PI Process Book models also included DER control for the DECo SOC/ROC operators, which was tested and demonstrated. The DER Tagging and Operating Procedures, which allowed that control to be done in a safe manner, were developed and modified to include the required MOC/MISO notification procedures. The Distribution Engineering Workstation (DEW) was modified to include temperature-normalized load research statistics, using a 30-hour day-ahead weather feed. This allowed day-ahead forecasting of the customer load profile and the entire circuit to determine overload and low-voltage problems. This forecast at the point of common coupling was passed to the DTech DR SOC for use in their economic dispatch algorithm.
Standard Work Instructions were developed for DER notification, sale, and operation in the MISO market. A software mechanism consisting of a suite of new and revised functionality was developed and integrated with the local ISO so that offers can be made electronically without human intervention. DR SOC developed a suite of software enabling DER usage in real time and day-ahead: a generation information file exchange with PI and the utility power flow; a utility day-ahead information file; an Energy Offer Web Service; a Market Result Web Service; a Real-Time Meter Data Web Service; and a Real-Time Notification Web Service. Over 20 DER were registered with MISO in the Demand Response Market, and electronic sale to MISO was demonstrated.
NASA Technical Reports Server (NTRS)
Hochstadt, Jake
2011-01-01
Ruby on Rails is an open source web application framework for the Ruby programming language. The first application I built was a web application to manage and authenticate other applications. One of the main requirements for this application was a single sign-on service, which allowed authentication to be built in one location and implemented in many different applications. For example, users would be able to log in using their existing credentials and access other NASA applications without authenticating again. The second application I worked on was an internal qualification plan app. Previously, the viewing of employee qualifications was managed through Excel spreadsheets. I built a database-driven application to streamline the process of managing qualifications. Employees are able to log in securely to view, edit and update their personal qualifications.
Deng, Wu; Zhao, Huimin; Zou, Li; Li, Yuanyuan; Li, Zhengguang
2012-08-01
Computer and information technology is becoming widespread in medicine manufacturing enterprises for its potential to improve working efficiency and service quality. To cope with the explosive growth of data and information in the application systems of current medicine manufacturing enterprises, we propose a novel application information system integration platform for the medicine manufacturing enterprise, based on a combination of RFID technology and SOA, to implement information sharing and interchange. The approach uses the application integration platform's service interface layer to invoke the RFID middleware, and loose coupling in the integration solution is realized through Web services. The key techniques in RFID event components and an expanded role-based security access mechanism are studied in detail. Finally, a case study is implemented and tested to validate our approach to application system integration in the medicine manufacturing enterprise.
Provenance-Based Approaches to Semantic Web Service Discovery and Usage
ERIC Educational Resources Information Center
Narock, Thomas William
2012-01-01
The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…
WebCIS: large scale deployment of a Web-based clinical information system.
Hripcsak, G; Cimino, J J; Sengupta, S
1999-01-01
WebCIS is a Web-based clinical information system. It sits atop the existing Columbia University clinical information system architecture, which includes a clinical repository, the Medical Entities Dictionary, an HL7 interface engine, and an Arden Syntax based clinical event monitor. WebCIS security features include authentication with secure tokens, authorization maintained in an LDAP server, SSL encryption, permanent audit logs, and application timeouts. WebCIS is currently used by 810 physicians at the Columbia-Presbyterian center of New York Presbyterian Healthcare to review and enter data into the electronic medical record. Current deployment challenges include maintaining adequate database performance despite complex queries, replacing large numbers of computers that cannot run modern Web browsers, and training users who have never logged onto the Web. Although the raised expectations and higher goals have increased deployment costs, the end result is a far more functional, far more available system.
48 CFR 52.222-54 - Employment Eligibility Verification.
Code of Federal Regulations, 2011 CFR
2011-10-01
...) or the Social Security Administration (SSA) may terminate the Contractor's MOU and deny access to the... determines not to suspend or debar the Contractor, then the Contractor must reenroll in E-Verify. (c) Web... at the Department of Homeland Security Web site: http://www.dhs.gov/E-Verify. (d) Individuals...
48 CFR 52.222-54 - Employment Eligibility Verification.
Code of Federal Regulations, 2010 CFR
2010-10-01
...) or the Social Security Administration (SSA) may terminate the Contractor's MOU and deny access to the... determines not to suspend or debar the Contractor, then the Contractor must reenroll in E-Verify. (c) Web... at the Department of Homeland Security Web site: http://www.dhs.gov/E-Verify. (d) Individuals...
49 CFR 393.102 - What are the minimum performance criteria for cargo securement devices and systems?
Code of Federal Regulations, 2012 CFR
2012-10-01
... chains, wire rope, steel strapping, synthetic webbing, and cordage) and other attachment or fastening..., steel strapping, synthetic webbing, and cordage) and other attachment or fastening devices used to... contained within the structure of the vehicle. Securement systems must provide a downward force equivalent...
49 CFR 393.102 - What are the minimum performance criteria for cargo securement devices and systems?
Code of Federal Regulations, 2014 CFR
2014-10-01
... chains, wire rope, steel strapping, synthetic webbing, and cordage) and other attachment or fastening..., steel strapping, synthetic webbing, and cordage) and other attachment or fastening devices used to... contained within the structure of the vehicle. Securement systems must provide a downward force equivalent...
49 CFR 393.102 - What are the minimum performance criteria for cargo securement devices and systems?
Code of Federal Regulations, 2013 CFR
2013-10-01
... chains, wire rope, steel strapping, synthetic webbing, and cordage) and other attachment or fastening..., steel strapping, synthetic webbing, and cordage) and other attachment or fastening devices used to... contained within the structure of the vehicle. Securement systems must provide a downward force equivalent...
ERIC Educational Resources Information Center
Waters, John K.
2009-01-01
In December, Microsoft announced a major security flaw affecting its Internet Explorer web browser. The flaw allowed hackers to use hidden computer code they had already injected into legitimate websites to steal the passwords of visitors to those sites. Reportedly, more than 10,000 websites were infected with the destructive code by the time…
Cyber security challenges in Smart Cities: Safety, security and privacy
Elmaghraby, Adel S.; Losavio, Michael M.
2014-01-01
The world is experiencing an evolution of Smart Cities. These emerge from innovations in information technology that, while they create new economic and social opportunities, pose challenges to our security and expectations of privacy. Humans are already interconnected via smart phones and gadgets. Smart energy meters, security devices and smart appliances are being used in many cities. Homes, cars, public venues and other social systems are now on their path to the full connectivity known as the “Internet of Things.” Standards are evolving for all of these potentially connected systems. They will lead to unprecedented improvements in the quality of life. To benefit from them, city infrastructures and services are changing with new interconnected systems for monitoring, control and automation. Intelligent transportation, public and private, will access a web of interconnected data from GPS location to weather and traffic updates. Integrated systems will aid public safety, emergency response and disaster recovery. We examine two important and entangled challenges: security and privacy. Security includes illegal access to information and attacks causing physical disruptions in service availability. As digital citizens are more and more instrumented with data available about their location and activities, privacy seems to disappear. Privacy protecting systems that gather data and trigger emergency response when needed are technological challenges that go hand-in-hand with the continuous security challenges. Their implementation is essential for a Smart City in which we would wish to live. We also present a model representing the interactions between persons, servers and things. These are the major elements in the Smart City, and their interactions are what we need to protect. PMID:25685517
A component-based, distributed object services architecture for a clinical workstation.
Chueh, H C; Raila, W F; Pappas, J J; Ford, M; Zatsman, P; Tu, J; Barnett, G O
1996-01-01
Attention to an architectural framework in the development of clinical applications can promote reusability of both legacy systems and newly designed software. We describe one approach to an architecture for a clinical workstation application which is based on a critical middle tier of distributed object-oriented services. This tier of network-based services provides flexibility in the creation of both the user interface and the database tiers. We developed a clinical workstation for ambulatory care using this architecture, defining a number of core services including those for vocabulary, patient index, documents, charting, security, and encounter management. These services can be implemented through proprietary or more standard distributed object interfaces such as CORBA and OLE. Services are accessed over the network by a collection of user interface components which can be mixed and matched to form a variety of interface styles. These services have also been reused with several applications based on World Wide Web browser interfaces.
A component-based, distributed object services architecture for a clinical workstation.
Chueh, H. C.; Raila, W. F.; Pappas, J. J.; Ford, M.; Zatsman, P.; Tu, J.; Barnett, G. O.
1996-01-01
Attention to an architectural framework in the development of clinical applications can promote reusability of both legacy systems and newly designed software. We describe one approach to an architecture for a clinical workstation application which is based on a critical middle tier of distributed object-oriented services. This tier of network-based services provides flexibility in the creation of both the user interface and the database tiers. We developed a clinical workstation for ambulatory care using this architecture, defining a number of core services including those for vocabulary, patient index, documents, charting, security, and encounter management. These services can be implemented through proprietary or more standard distributed object interfaces such as CORBA and OLE. Services are accessed over the network by a collection of user interface components which can be mixed and matched to form a variety of interface styles. These services have also been reused with several applications based on World Wide Web browser interfaces. PMID:8947744
Real-Time Remote Monitoring with Data Acquisition System
NASA Astrophysics Data System (ADS)
Faizal Zainal Abidin, Ahmad; Huzaimy Jusoh, Mohammad; James, Elster; Junid, Syed Abdul Mutalib Al; Mohd Yassin, Ahmad Ihsan
2015-11-01
The purpose of this system is to provide a monitoring system for an electrical device and enable remote monitoring via a web-based application. The monitoring system allows the user to monitor the device's condition from anywhere, as the information is synchronised to the website. The current and voltage readings of the monitored equipment, the ambient temperature and the humidity level are monitored and recorded, and these parameters are updated on the web page. All the sensors are connected to a microcontroller; the data are saved on a micro secure digital (SD) card, and the gathered information is sent to a web page synchronously over a GPRS service connection. The collected data are displayed on the website, and the user can download the data directly from the website. The system helps the user monitor device condition and ambient changes with ease. The system has been successfully developed, tested and installed in a residential area in Taman Cahaya Alam, Section U12, Shah Alam, Selangor, Malaysia.
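The logging step described above can be sketched as follows: each cycle, sensor readings are bundled into a record that would be appended to the SD card and pushed to the web page over GPRS. The field names and record layout are assumptions for illustration, not the system's actual format.

```python
# Hypothetical sketch of one monitoring sample; power is derived
# from current * voltage. Field names are invented.

import json
import time

def make_record(current_a, voltage_v, temp_c, humidity_pct, ts=None):
    """Bundle one sample of the monitored parameters into a dict."""
    return {
        "ts": ts if ts is not None else int(time.time()),
        "current_a": current_a,
        "voltage_v": voltage_v,
        "power_w": round(current_a * voltage_v, 2),
        "temp_c": temp_c,
        "humidity_pct": humidity_pct,
    }

record = make_record(1.5, 240.0, 31.2, 78.0, ts=1700000000)
line = json.dumps(record)  # this line would be appended to the SD card log
print(line)
```

Serializing each sample as one JSON line keeps the SD-card log append-only and makes the upload step a straightforward replay of unsent lines.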
NASA Astrophysics Data System (ADS)
Paulraj, D.; Swamynathan, S.; Madhaiyan, M.
2012-11-01
Web Service composition has become indispensable as a single web service cannot satisfy complex functional requirements. Composition of services has received much interest as a way to support business-to-business (B2B) or enterprise application integration. An important component of service composition is the discovery of relevant services. In Semantic Web Services (SWS), service discovery is generally achieved by using the service profile of the Ontology Web Language for Services (OWL-S). The profile of the service is a derived and concise description but not a functional part of the service. The information contained in the service profile is sufficient for atomic service discovery, but it is not sufficient for the discovery of composite semantic web services (CSWS). The purpose of this article is two-fold: first, to show that the process model is a better choice than the service profile for service discovery; second, to facilitate the composition of inter-organisational CSWS by proposing a new composition method which uses process ontology. The proposed service composition approach uses an algorithm which performs a fine-grained match at the level of the atomic process rather than at the level of the entire service in a composite semantic web service. Many works carried out in this area have proposed solutions only for the composition of atomic services, whereas this article proposes a solution for the composition of composite semantic web services.
Automatic geospatial information Web service composition based on ontology interface matching
NASA Astrophysics Data System (ADS)
Xu, Xianbin; Wu, Qunyong; Wang, Qinmin
2008-10-01
With Web services technology, the functions of WebGIS can be offered as geospatial information services, helping to overcome the isolation of information in the geospatial information sharing field. Geospatial information Web service composition, which conglomerates outsourced services working in tandem to offer a value-added service, therefore plays the key role in taking full advantage of geospatial information services. This paper proposes an automatic geospatial information web service composition algorithm that employs the ontology dictionary WordNet to analyze semantic distances among service interfaces. By matching input/output parameters and the semantic meaning of pairs of service interfaces, a geospatial information web service chain can be created from a number of candidate services. A practical application of the algorithm is also presented; its results show the feasibility of the algorithm and its promise for the emerging demand for geospatial information web service composition.
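The interface-matching idea can be sketched by comparing the output parameters of one service with the input parameters of the next. The paper measures semantic distance with WordNet; as a self-contained stand-in, this sketch uses token-set (Jaccard) similarity on parameter names, and the services and parameters are invented.

```python
# Sketch: two services can be chained when every required input of the
# downstream service is matched by a sufficiently similar output of the
# upstream one. Jaccard similarity stands in for WordNet distance.

def similarity(a, b):
    """Token-set (Jaccard) similarity between two parameter names."""
    ta, tb = set(a.lower().split("_")), set(b.lower().split("_"))
    return len(ta & tb) / len(ta | tb)

def can_chain(outputs, inputs, threshold=0.5):
    """Each input must be matched by some output at or above the threshold."""
    return all(
        max(similarity(o, i) for o in outputs) >= threshold
        for i in inputs
    )

geocoder_out = ["point_coordinates"]
renderer_in = ["coordinates"]
print(can_chain(geocoder_out, renderer_in))     # -> True
print(can_chain(geocoder_out, ["zoom_level"]))  # -> False
```

A WordNet-based variant would replace `similarity` with a path- or gloss-based measure so that synonyms (e.g. "location" vs. "position") also match.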
Graph-Based Semantic Web Service Composition for Healthcare Data Integration.
Arch-Int, Ngamnij; Arch-Int, Somjit; Sonsilphong, Suphachoke; Wanchai, Paweena
2017-01-01
Within the numerous and heterogeneous web services offered through different sources, automatic web services composition is the most convenient method for building complex business processes that permit invocation of multiple existing atomic services. The current solutions in functional web services composition lack autonomous queries of semantic matches within the parameters of web services, which are necessary in the composition of large-scale related services. In this paper, we propose a graph-based Semantic Web Services composition system consisting of two subsystems: management time and run time. The management-time subsystem is responsible for dependency graph preparation in which a dependency graph of related services is generated automatically according to the proposed semantic matchmaking rules. The run-time subsystem is responsible for discovering the potential web services and nonredundant web services composition of a user's query using a graph-based searching algorithm. The proposed approach was applied to healthcare data integration in different health organizations and was evaluated according to two aspects: execution time measurement and correctness measurement.
Graph-Based Semantic Web Service Composition for Healthcare Data Integration
2017-01-01
Within the numerous and heterogeneous web services offered through different sources, automatic web services composition is the most convenient method for building complex business processes that permit invocation of multiple existing atomic services. The current solutions in functional web services composition lack autonomous queries of semantic matches within the parameters of web services, which are necessary in the composition of large-scale related services. In this paper, we propose a graph-based Semantic Web Services composition system consisting of two subsystems: management time and run time. The management-time subsystem is responsible for dependency graph preparation in which a dependency graph of related services is generated automatically according to the proposed semantic matchmaking rules. The run-time subsystem is responsible for discovering the potential web services and nonredundant web services composition of a user's query using a graph-based searching algorithm. The proposed approach was applied to healthcare data integration in different health organizations and was evaluated according to two aspects: execution time measurement and correctness measurement. PMID:29065602
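The run-time search over a dependency graph described above can be sketched as a forward-chaining loop: starting from the user's available inputs, fire any service whose inputs are satisfied until the requested output is produced. The service names and parameters below are invented, and the paper's actual graph algorithm and redundancy elimination are not reproduced.

```python
# Sketch: greedy forward chaining over a service dependency graph.
# services maps name -> (required inputs, produced outputs).

def compose(services, have, goal):
    """Return an ordered list of services producing `goal`, or None."""
    have = set(have)
    plan, progress = [], True
    while goal not in have and progress:
        progress = False
        for name, (inputs, outputs) in services.items():
            if name not in plan and set(inputs) <= have:
                plan.append(name)
                have |= set(outputs)
                progress = True
    return plan if goal in have else None

registry = {
    "PatientLookup": (["patient_id"], ["patient_record"]),
    "LabFetch": (["patient_record"], ["lab_results"]),
    "ReportBuilder": (["patient_record", "lab_results"], ["summary_report"]),
}
print(compose(registry, ["patient_id"], "summary_report"))
```

This greedy loop may include services that turn out to be unnecessary; the paper's nonredundant composition would prune such services from the result.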
BioSWR – Semantic Web Services Registry for Bioinformatics
Repchevsky, Dmitry; Gelpi, Josep Ll.
2014-01-01
Despite the variety of available Web services registries specifically aimed at Life Sciences, their scope is usually restricted to a limited set of well-defined types of services. While dedicated registries are generally tied to a particular format, general-purpose ones are more adherent to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common Web services types, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages. Nevertheless, WSDL 2.0 descriptions gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web services descriptions along with the traditional WSDL based ones. The registry provides a Web-based interface for Web services registration, querying and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or using the SPARQL Protocol and RDF Query Language. The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license. PMID:25233118
BioSWR--semantic web services registry for bioinformatics.
Repchevsky, Dmitry; Gelpi, Josep Ll
2014-01-01
Despite the variety of available Web services registries specifically aimed at Life Sciences, their scope is usually restricted to a limited set of well-defined types of services. While dedicated registries are generally tied to a particular format, general-purpose ones are more adherent to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common Web services types, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages. Nevertheless, WSDL 2.0 descriptions gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web services descriptions along with the traditional WSDL based ones. The registry provides a Web-based interface for Web services registration, querying and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or using the SPARQL Protocol and RDF Query Language. The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license.
Reliable Execution Based on CPN and Skyline Optimization for Web Service Composition
Ha, Weitao; Zhang, Guojun
2013-01-01
With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web services composition is widely used in business environments. Because component web services are inherently autonomous and heterogeneous, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality of service (QoS) properties are crucial for selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which involves transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of a transactional composite Web service (TCWS) is formalized by CPN properties. To identify the services with the best QoS properties from the candidate service sets formed in the TCSW-CPN, we use skyline computation to retrieve the dominant Web services; this overcomes the significant information loss incurred when individual scores are reduced to an overall similarity. We evaluate our approach experimentally using both real and synthetically generated datasets. PMID:23935431
Reliable execution based on CPN and skyline optimization for Web service composition.
Chen, Liping; Ha, Weitao; Zhang, Guojun
2013-01-01
With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web services composition is widely used in business environments. Because component web services are inherently autonomous and heterogeneous, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality of service (QoS) properties are crucial for selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which involves transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of a transactional composite Web service (TCWS) is formalized by CPN properties. To identify the services with the best QoS properties from the candidate service sets formed in the TCSW-CPN, we use skyline computation to retrieve the dominant Web services; this overcomes the significant information loss incurred when individual scores are reduced to an overall similarity. We evaluate our approach experimentally using both real and synthetically generated datasets.
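The skyline step used above can be sketched as a dominance filter over QoS vectors: a candidate survives only if no other candidate is at least as good in every dimension and strictly better in at least one. The candidate values below are invented, and lower is taken as better for both dimensions.

```python
# Sketch of skyline computation over (price, response_ms) QoS vectors,
# lower being better in both dimensions. Candidate values are invented.

def dominates(a, b):
    """True if a is no worse than b everywhere and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(candidates):
    """Keep only candidates not dominated by any other candidate."""
    return {
        name: qos
        for name, qos in candidates.items()
        if not any(dominates(other, qos)
                   for o, other in candidates.items() if o != name)
    }

candidates = {
    "svc1": (0.10, 200),  # cheap-ish and fast
    "svc2": (0.05, 300),  # cheapest but slower
    "svc3": (0.12, 250),  # dominated by svc1 in both dimensions
}
print(sorted(skyline(candidates)))  # -> ['svc1', 'svc2']
```

Unlike a weighted sum, the skyline keeps every Pareto-optimal trade-off (cheapest, fastest, and points in between), which is exactly the information-loss argument the abstract makes.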
Aerts, Jozef
2017-01-01
RESTful web services are nowadays state-of-the-art in business transactions over the internet. They are, however, not much used in medical informatics and in clinical research, especially not in Europe. Our objectives were to make an inventory of RESTful web services that can be used in medical informatics and clinical research, including those that can help in patient empowerment in the DACH region and in Europe, and to develop some new RESTful web services for use in clinical research and regulatory review. A literature search on available RESTful web services was performed, and new RESTful web services were developed on an application server using the Java language. Most of the web services found originate from institutes and organizations in the USA, whereas no similar web services made available by European organizations could be found. New RESTful web services were developed for LOINC code lookup, for UCUM conversions and for use with CDISC standards. A comparison is made between "top down" and "bottom up" web services, the latter meant to answer concrete questions immediately. The lack of RESTful web services made available by European organizations in healthcare and medical informatics is striking. RESTful web services may soon play a major role in medical informatics and, when localized for German and other European languages, can help to considerably facilitate patient empowerment. This, however, requires an EU equivalent of the US National Library of Medicine.
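A "bottom up" lookup service of the kind described can be sketched as a thin REST client: build the request URL, then parse the JSON body the service returns. The endpoint path and response fields here are invented; a real LOINC lookup service will differ, so no live call is made and a canned response of the assumed shape is parsed instead.

```python
# Hedged sketch of a RESTful code-lookup client. The base URL, query
# parameters, and JSON field names are assumptions for illustration.

import json
from urllib.parse import urlencode

BASE = "https://example.org/rest/loinc"  # hypothetical endpoint

def lookup_url(code):
    """Build the GET request URL for a LOINC code lookup."""
    return f"{BASE}?{urlencode({'code': code, 'format': 'json'})}"

def parse_response(body):
    """Extract (code, long name) from a JSON response of the assumed shape."""
    doc = json.loads(body)
    return doc["code"], doc["long_name"]

print(lookup_url("2093-3"))
# In place of a live call, parse a canned response of the assumed shape:
canned = '{"code": "2093-3", "long_name": "Cholesterol [Mass/volume] in Serum or Plasma"}'
print(parse_response(canned))
```

Keeping URL construction and response parsing as separate small functions makes such "answer one concrete question" clients easy to localize and to swap between endpoints.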
Security Implications of OPC, OLE, DCOM, and RPC in Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2006-01-01
OPC is a collection of software programming standards and interfaces used in the process control industry. It is intended to provide open connectivity and vendor equipment interoperability. The use of OPC technology simplifies the development of control systems that integrate components from multiple vendors and support multiple control protocols. OPC-compliant products are available from most control system vendors, and are widely used in the process control industry. OPC was originally known as OLE for Process Control; the first standards for OPC were based on underlying services in the Microsoft Windows computing environment. These underlying services (OLE [Object Linking and Embedding], DCOM [Distributed Component Object Model], and RPC [Remote Procedure Call]) have been the source of many severe security vulnerabilities. It is not feasible to automatically apply vendor patches and service packs to mitigate these vulnerabilities in a control systems environment. Control systems using the original OPC data access technology can thus inherit the vulnerabilities associated with these services. Current OPC standardization efforts are moving away from the original focus on Microsoft protocols, with a distinct trend toward web-based protocols that are independent of any particular operating system. However, the installed base of OPC equipment consists mainly of legacy implementations of the OLE for Process Control protocols.
Internet Roadside Cafe #6 [Videotape].
ERIC Educational Resources Information Center
American Library Association Video/Library Video Network, Towson, MD.
This 30-minute videotape takes an in-depth look at World Wide Web business transactions, potential risks, client privacy and security issues by asking businesses and consumers how they do business on the Internet. Also featured in the program is advice about choosing a secure password, the use of credit cards for Web purchasing and a review of…
Weaving a Secure Web around Education: A Guide to Technology Standards and Security.
ERIC Educational Resources Information Center
National Forum on Education Statistics (ED/OERI), Washington, DC.
The purpose of this guidebook is to assist education agencies and organizations--which include state education agencies or state departments of education, school districts, and schools--in the development, maintenance, and standardization of effective Web sites. Also included is a detailed examination of the procedures necessary to provide…
Lizarraga, Gabriel; Li, Chunfei; Cabrerizo, Mercedes; Barker, Warren; Loewenstein, David A; Duara, Ranjan; Adjouadi, Malek
2018-04-26
Structural and functional brain images are essential imaging modalities for medical experts to study brain anatomy. These images are typically inspected visually by experts. To analyze images without any bias, they must first be converted to numeric values. Many software packages are available to process the images, but they are complex and difficult to use, and they are also hardware intensive. The results obtained after processing vary depending on the native operating system used and its associated software libraries; data processed on one system cannot typically be combined with data on another system. The aim of this study was to fulfill the neuroimaging community's need for a common platform to store, process, explore, and visualize their neuroimaging data and results using the Neuroimaging Web Services Interface: a series of processing pipelines designed as a cyber-physical system for neuroimaging and clinical data in brain research. The Neuroimaging Web Services Interface accepts magnetic resonance imaging, positron emission tomography, diffusion tensor imaging, and functional magnetic resonance imaging. These images are processed using existing and custom software packages. The output is then stored as image files, tabulated files, and MySQL tables. The system, made up of a series of interconnected servers, is password-protected, is securely accessible through a Web interface, and allows (1) visualization of results and (2) downloading of tabulated data. All results were obtained using our processing servers in order to maintain data validity and consistency. The design is responsive and scalable. The processing pipeline starts from a FreeSurfer reconstruction of structural magnetic resonance imaging images. The FreeSurfer and regional standardized uptake value ratio calculations were validated using Alzheimer's Disease Neuroimaging Initiative input images, and the results were posted at the Laboratory of Neuro Imaging data archive.
Notable leading researchers in the fields of Alzheimer's Disease and epilepsy have used the interface to access and process the data and visualize the results. Tabulated results with unique visualization mechanisms help guide more informed diagnosis and expert rating, providing a truly unique multimodal imaging platform that combines magnetic resonance imaging, positron emission tomography, diffusion tensor imaging, and resting-state functional magnetic resonance imaging. A quality control component was reinforced through expert visual rating involving at least 2 experts. To our knowledge, there is no validated Web-based system offering all the services that the Neuroimaging Web Services Interface offers. The intent of the Neuroimaging Web Services Interface is to create a tool for clinicians and researchers with a keen interest in multimodal neuroimaging. More importantly, the Neuroimaging Web Services Interface significantly augments the Alzheimer's Disease Neuroimaging Initiative data, especially since our data contain a large cohort of Hispanic normal controls and Alzheimer's Disease patients. The obtained results can be scrutinized visually or through the tabulated forms, informing researchers of subtle changes that characterize the different stages of the disease. ©Gabriel Lizarraga, Chunfei Li, Mercedes Cabrerizo, Warren Barker, David A Loewenstein, Ranjan Duara, Malek Adjouadi. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 26.04.2018.
Web service module for access to g-Lite
NASA Astrophysics Data System (ADS)
Goranova, R.; Goranov, G.
2012-10-01
G-Lite is a lightweight grid middleware for grid computing installed on all clusters of the European Grid Infrastructure (EGI). The middleware is partially service-oriented and does not provide well-defined Web services for job management. The existing Web services in the environment cannot be directly used by grid users for building service compositions in the EGI. In this article we present a module of well-defined Web services for job management in the EGI. We describe the architecture of the module and the design of the developed Web services. The presented Web services are composable and can participate in service compositions (workflows). An example of usage of the module with tools for service compositions in g-Lite is shown.
Web technology for emergency medicine and secure transmission of electronic patient records.
Halamka, J D
1998-01-01
The American Heritage dictionary defines the word "web" as "something intricately contrived, especially something that ensnares or entangles." The wealth of medical resources on the World Wide Web is now so extensive, yet disorganized and unmonitored, that such a definition seems fitting. In emergency medicine, for example, a field in which accurate and complete information, including patients' records, is urgently needed, more than 5000 Web pages are available today, whereas fewer than 50 were available in December 1994. Most sites are static Web pages using the Internet to publish textbook material, but new technology is extending the scope of the Internet to include online medical education and secure exchange of clinical information. This article lists some of the best Web sites for use in emergency medicine and then describes a project in which the Web is used for transmission and protection of electronic medical records.
BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.
Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel
2015-06-02
Bioinformaticians face a range of difficulties in getting locally-installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools behind an easy interface for input and output. Web services, due to their universal nature and widely known interface, are a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS mediates access to registered tools by providing front-end and back-end web services. Programmers can install applications in HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results back to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that lists the registered applications and clients. Applications registered in BOWS can be accessed from virtually any programming language through web services, or using the standard Java clients. The back-end can run in HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
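The front-end/back-end split described above can be sketched as a simple in-memory broker: clients submit jobs and read results through the front-end interface, while cluster workers poll the back-end interface for pending jobs and post results. The class and method names below are illustrative, not the actual BOWS API:

```python
# Minimal in-memory sketch of the BOWS front-end/back-end pattern.
import itertools

class JobBroker:
    """Mediates between submitting clients and HPC worker processes."""
    def __init__(self):
        self._ids = itertools.count(1)
        self._pending = []   # jobs waiting for a worker
        self._results = {}   # job_id -> result

    # --- front-end service (consumed by client programs) ---
    def submit(self, tool, params):
        job_id = next(self._ids)
        self._pending.append({"id": job_id, "tool": tool, "params": params})
        return job_id

    def read_result(self, job_id):
        return self._results.get(job_id)  # None while still running

    # --- back-end service (consumed by cluster workers) ---
    def poll(self):
        return self._pending.pop(0) if self._pending else None

    def post_result(self, job_id, result):
        self._results[job_id] = result

broker = JobBroker()
jid = broker.submit("blast", {"query": "ATGC"})  # client submits via front-end
job = broker.poll()                              # worker picks the job up
broker.post_result(job["id"], "hit-list")        # worker posts the result
print(broker.read_result(jid))                   # → hit-list
```

In the real system the two interfaces are separate web services, so the worker side can live on an HPC cluster while clients run anywhere.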
The IS-ENES climate4impact portal: bridging the CMIP5 and CORDEX data to impact users
NASA Astrophysics Data System (ADS)
Som de Cerff, Wim; Plieger, Maarten; Page, Christian; Tatarinova, Natalia; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Sjökvist, Elin; Vega Saldarriaga, Manuel; Santiago Cofiño Gonzalez, Antonio
2015-04-01
The aim of climate4impact (climate4impact.eu) is to enhance the use of climate research data and the interaction with climate effect/impact communities. The portal is based on 17 impact use cases from 5 different European countries, and is evaluated by a user panel consisting of the use case owners. It was developed within the IS-ENES European project and is currently operated and further developed in the IS-ENES2 project. As the climate impact community is very broad, the focus is mainly on the scientific impact community. Climate4impact is connected to the Earth System Grid Federation (ESGF) nodes containing global climate model (GCM) data from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) and regional climate model (RCM) data from the Coordinated Regional Climate Downscaling Experiment (CORDEX). This global network of climate model data centers offers services for data description, discovery and download. The climate4impact portal connects to these services using OpenID, and offers a user interface for searching, visualizing and downloading global climate model data and more. A challenging task is to describe the available model data and how they can be used; the portal informs users about possible caveats when using climate model data. All impact use cases are described in the documentation section, using highlighted keywords pointing to detailed information in the glossary. Climate4impact currently has two main objectives. The first one is to work on a web interface which automatically generates a graphical user interface on WPS endpoints. The WPS calculates climate indices and subsets data using OpenClimateGIS/icclim on data stored in ESGF data nodes. Data are then transmitted from ESGF nodes over secured OpenDAP and become available in a new, per-user, secured OpenDAP server. The results can then be visualized again using ADAGUC WMS.
Dedicated wizards for processing climate indices will be developed in close collaboration with users. The second one is to expose climate4impact services, so as to offer standardized services that can be used by other portals (like the future Copernicus platform, developed in the EU FP7 CLIPC project). This has the advantage of adding interoperability between several portals, as well as enabling the design of specific portals aimed at different impact communities, either thematic or national. In the presentation the following subjects will be detailed: - Lessons learned developing climate4impact.eu - Download: directly from ESGF nodes and other THREDDS catalogs - Connection with the downscaling portal of the University of Cantabria - Experiences with the question-and-answer site via Askbot - Visualization: visualize data from ESGF data nodes using ADAGUC Web Map Services. - Processing: transform data, subset, export into other formats, and perform climate index calculations using Web Processing Services implemented by PyWPS, based on NCAR NCPP OpenClimateGIS and IS-ENES2 icclim. - Security: login using OpenID for access to the ESGF data nodes. The ESGF works in conjunction with several external websites and systems. The climate4impact portal uses X509-based short-lived credentials, generated on behalf of the user with a MyProxy service. Single Sign-On (SSO) is used to make these websites and systems work together. - Discovery: faceted search based on e.g. variable name, model and institute using the ESGF search services. A catalog browser allows browsing through CMIP5 and any other climate model data catalogues (e.g. ESSENCE, EOBS, UNIDATA).
NASA Astrophysics Data System (ADS)
Argenti, M.; Giannini, V.; Averty, R.; Bigagli, L.; Dumoulin, J.
2012-04-01
The EC FP7 ISTIMES project has the goal of realizing an ICT-based system that exploits distributed and local sensors for non-destructive electromagnetic monitoring, in order to make critical transport infrastructures more reliable and safe. Higher situation awareness, thanks to real-time and detailed information and images of the monitored infrastructure's status, improves decision capabilities for emergency management stakeholders. Web-enabled sensors and a service-oriented approach form the core of the architecture, providing a system that adopts open standards (e.g. OGC SWE, OGC CSW) and strives for full interoperability with other GMES and European Spatial Data Infrastructure initiatives, as well as compliance with INSPIRE. The system exploits an open, easily scalable network architecture to accommodate a wide range of sensors, integrated with a set of tools for handling, analyzing and processing large data volumes from different organizations with different data models. Situation awareness tools are also integrated in the system. The definition of sensor observations and services follows a metadata model based on the ISO 19115 core set of metadata elements and the O&M model of OGC SWE. The ISTIMES infrastructure is based on an e-Infrastructure for geospatial data sharing, with a Data Catalog that implements the discovery services for sensor data retrieval, acting as a broker through static connections based on standard SOS and WNS interfaces; a Decision Support component which helps decision makers by providing support for data fusion and inference and the generation of situation indexes; a Presentation component which implements system-user interaction services for information publication and rendering, by means of a Web portal using SOA design principles; and a security framework using the Shibboleth open source middleware, based on the Security Assertion Markup Language, supporting Single Sign-On (SSO).
ACKNOWLEDGEMENT - The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement n° 225663
Remote autopsy services: A feasibility study on nine cases.
Vodovnik, Aleksandar; Aghdam, Mohammad Reza F; Espedal, Dan Gøran
2017-01-01
Introduction We have conducted a feasibility study on remote autopsy services in order to increase the flexibility of the service with benefits for teaching and interdepartmental collaboration. Methods Three senior staff pathologists, one senior autopsy technician and one junior resident participated in the study. Nine autopsies were performed by the autopsy technician or resident, supervised by the primary pathologist, through the secure, double encrypted video link using Jabber Video (Cisco) with a high-speed broadband connection. The primary pathologist and autopsy room each connected to the secure virtual meeting room using 14″ laptops with in-built cameras (Hewlett-Packard). A portable high-definition web camera (Cisco) was used in the autopsy room. Primary and secondary pathologists independently interpreted and later compared gross findings for the purpose of quality assurance. The video was streamed live only during consultations and interpretation. A satisfaction survey on technical and professional aspects of the study was conducted. Results Independent interpretations of gross findings between primary and secondary pathologists yielded full agreement. A definite cause of death in one complex autopsy was determined following discussions between pathologists and reviews of the clinical notes. Our satisfaction level with the technical and professional aspects of the study was 87% and 97%, respectively. Discussion Remote autopsy services are found to be feasible in the hands of experienced staff, with increased flexibility and interest of autopsy technicians in the service as a result.
Web Services--A Buzz Word with Potentials
János T. Füstös
2006-01-01
The simplest definition of a web service is an application that provides a web API. The web API exposes the functionality of the solution to other applications. The web API relies on other Internet-based technologies to manage communications. The resulting web services are pervasive, vendor-independent, language-neutral, and very low-cost. The main purpose of a web API...
Access and accounting schemes of wireless broadband
NASA Astrophysics Data System (ADS)
Zhang, Jian; Huang, Benxiong; Wang, Yan; Yu, Xing
2004-04-01
In this paper, two wireless broadband access and accounting schemes are introduced. They differ in the client and access router modules. In one scheme, the Secure Shell (SSH) protocol is used in the access system; the SSH server performs authentication based on private-key cryptography. The advantages of this scheme are the security of the user's information and sophisticated access control. In the other scheme, the Secure Sockets Layer (SSL) protocol is used in the access system; it uses public-key cryptography. Nowadays, web browsers generally combine HTTP and the SSL protocol, and we use SSL to encrypt the data between the clients and the access router. The schemes are the same in the RADIUS server part. Remote Authentication Dial In User Service (RADIUS), a security protocol in the client/server form, is becoming the standard authentication/accounting protocol for access to the Internet; it is explained in a flow chart. In our scheme, the access router serves as the client to the RADIUS server.
BioServices: a common Python package to access biological Web Services programmatically.
Cokelaer, Thomas; Pultz, Dennis; Harder, Lea M; Serra-Musach, Jordi; Saez-Rodriguez, Julio
2013-12-15
Web interfaces provide access to numerous biological databases. Many can be accessed programmatically thanks to Web Services. Building applications that combine several of them would benefit from a single framework. BioServices is a comprehensive Python framework that provides programmatic access to major bioinformatics Web Services (e.g. KEGG, UniProt, BioModels, ChEMBLdb). Wrapping additional Web Services, based on either Representational State Transfer or Simple Object Access Protocol/Web Services Description Language technologies, is eased by the use of object-oriented programming. BioServices releases and documentation are available at http://pypi.python.org/pypi/bioservices under a GPL-v3 license.
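The object-oriented wrapping idea mentioned above, hiding URL construction and parameter encoding behind ordinary method calls, can be sketched as follows. The base class, the subclass and the endpoint are illustrative placeholders, not BioServices' real classes or URLs:

```python
# Sketch of wrapping a REST Web Service in a Python class: each service
# becomes a subclass that turns method calls into request URLs.
from urllib.parse import urlencode

class RESTService:
    """Base class: turns method calls into REST request URLs."""
    def __init__(self, base_url):
        self.base_url = base_url.rstrip("/")

    def build_url(self, resource, **params):
        url = f"{self.base_url}/{resource}"
        return f"{url}?{urlencode(params)}" if params else url

class UniProtLike(RESTService):
    """Hypothetical wrapper for a UniProt-style query interface."""
    def __init__(self):
        super().__init__("https://example.org/uniprot")

    def search_url(self, query, fmt="tab"):
        return self.build_url("search", query=query, format=fmt)

u = UniProtLike()
print(u.search_url("zap70"))
# → https://example.org/uniprot/search?query=zap70&format=tab
```

A real wrapper would additionally perform the HTTP request and parse the response, but the value of the pattern is already visible: callers never touch URLs or encodings directly.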
NASA Astrophysics Data System (ADS)
Yu, Weishui; Luo, Changshou; Zheng, Yaming; Wei, Qingfeng; Cao, Chengzhong
2017-09-01
To deal with the “last kilometer” problem in agricultural science and technology information services, we analyzed the feasibility, necessity and advantages of WebApps applied to agricultural information services, and discussed the modes in which WebApps can be used in such services, based on a requirements analysis and the functions of WebApps. To overcome existing apps' defects of difficult installation and weak compatibility across mobile operating systems, the Beijing Agricultural Sci-tech Service Hotline WebApp was developed based on HTML and Java technology. The WebApp has greater compatibility and simpler operation than a native app; moreover, it can be linked to the WeChat public platform, which makes it easy to spread and lets it run directly without a setup process. The WebApp was used to provide agricultural expert consulting services and to push agricultural information, and achieved good preliminary results in application. Finally, we summarize the innovative application of WebApps in agricultural consulting services and discuss the prospects of WebApps in agricultural information services.
An Encryption Scheme for Communication Internet SCADA Components
NASA Astrophysics Data System (ADS)
Robles, Rosslin John; Kim, Tai-Hoon
The trend in most systems is that they are connected through the Internet. Traditional Supervisory Control and Data Acquisition (SCADA) systems were connected only to limited private networks. SCADA is considered critical infrastructure, and connecting it to the Internet puts society in jeopardy, so some operators hold back from doing so. However, an Internet-connected SCADA facility brings many advantages in terms of control, data viewing and report generation, so operators are nonetheless pushed to connect SCADA systems through the Internet. Because of this, many security issues have surfaced. In this paper, we discuss web SCADA and the issues regarding its security. As a countermeasure, a web SCADA security solution using a crossed-crypto-scheme is proposed for use in the communication of SCADA components.
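The crossed-crypto-scheme combines cryptographic primitives to protect SCADA traffic. The toy sketch below illustrates only the general idea of sealing a control message, encrypting with a symmetric keystream and attaching an integrity tag, using stdlib hashing. It is not the paper's actual scheme and is not secure for production use:

```python
# Toy seal/open illustration for protecting a SCADA control message.
# NOT the paper's crossed-crypto-scheme and NOT production-grade crypto.
import hashlib, hmac

def keystream(key, length):
    """Derive a pseudo-random keystream from the shared key (toy construction)."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(key, plaintext):
    """Encrypt with the keystream and attach an HMAC integrity tag."""
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))
    tag = hmac.new(key, ct, hashlib.sha256).digest()
    return ct, tag

def open_(key, ct, tag):
    """Verify the tag, then decrypt; reject tampered messages."""
    if not hmac.compare_digest(tag, hmac.new(key, ct, hashlib.sha256).digest()):
        raise ValueError("integrity check failed")
    return bytes(c ^ k for c, k in zip(ct, keystream(key, len(ct))))

key = b"shared-session-key"
ct, tag = seal(key, b"OPEN VALVE 7")
print(open_(key, ct, tag))  # round-trips to b'OPEN VALVE 7'
```

The point for SCADA communication is that a command altered in transit fails the integrity check before it ever reaches the controller.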
Mashup Model and Verification Using Mashup Processing Network
NASA Astrophysics Data System (ADS)
Zahoor, Ehtesham; Perrin, Olivier; Godart, Claude
Mashups are lightweight Web applications that aggregate data from different Web services, built using ad-hoc composition and not concerned with long-term stability and robustness. In this paper we present a pattern-based approach, called the Mashup Processing Network (MPN). The idea is based on the Event Processing Network and is intended to facilitate the creation, modeling and verification of mashups. MPN provides a view of how different actors interact in mashup development, namely the producer, the consumer, the mashup processing agent and the communication channels. It also supports modeling transformations and validations of data, and offers validation of both functional and non-functional requirements, such as reliable messaging and security, which are key issues within the enterprise context. We have enriched the model with a set of processing operations, categorized into data composition, transformation and validation. These processing operations can be seen as a set of patterns facilitating the mashup development process. MPN also paves the way toward a Mashup-Oriented Architecture, where mashups along with services are used as building blocks for application development.
17 CFR 232.314 - Accommodation for certain securitizers of asset-backed securities.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Securities Rulemaking Board's Internet Web site. [76 FR 4511, Jan. 26, 2011] XBRL-Related Documents ... 17 Commodity and Securities Exchanges 3 2014-04-01 2014-04-01 false Accommodation for certain securitizers of asset-backed securities. 232.314 Section 232.314 Commodity and Securities Exchanges SECURITIES...
17 CFR 232.314 - Accommodation for certain securitizers of asset-backed securities.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Securities Rulemaking Board's Internet Web site. [76 FR 4511, Jan. 26, 2011] XBRL-Related Documents ... 17 Commodity and Securities Exchanges 2 2012-04-01 2012-04-01 false Accommodation for certain securitizers of asset-backed securities. 232.314 Section 232.314 Commodity and Securities Exchanges SECURITIES...
17 CFR 232.314 - Accommodation for certain securitizers of asset-backed securities.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Securities Rulemaking Board's Internet Web site. [76 FR 4511, Jan. 26, 2011] XBRL-Related Documents ... 17 Commodity and Securities Exchanges 2 2013-04-01 2013-04-01 false Accommodation for certain securitizers of asset-backed securities. 232.314 Section 232.314 Commodity and Securities Exchanges SECURITIES...
17 CFR 232.314 - Accommodation for certain securitizers of asset-backed securities.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Securities Rulemaking Board's Internet Web site. [76 FR 4511, Jan. 26, 2011] XBRL-Related Documents ... 17 Commodity and Securities Exchanges 2 2011-04-01 2011-04-01 false Accommodation for certain securitizers of asset-backed securities. 232.314 Section 232.314 Commodity and Securities Exchanges SECURITIES...
Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service
Hatano, Kenji; Ohe, Kazuhiko
2003-01-01
An information retrieval system for the Japanese Standard Disease-Code Master using XML Web Services has been developed. XML Web Services are a new distributed processing technology built on standard internet technologies. With the seamless remote method invocation of XML Web Services, users are able to get the latest disease code master information from their rich desktop applications or internet web sites that refer to this service. PMID:14728364
Toward Exposing Timing-Based Probing Attacks in Web Applications †
Mao, Jian; Chen, Yue; Shi, Futian; Jia, Yaoqi; Liang, Zhenkai
2017-01-01
Web applications have become the foundation of many types of systems, ranging from cloud services to Internet of Things (IoT) systems. Due to the large amount of sensitive data processed by web applications, user privacy emerges as a major concern in web security. Existing protection mechanisms in modern browsers, e.g., the same origin policy, prevent the users’ browsing information on one website from being directly accessed by another website. However, web applications executed in the same browser share the same runtime environment. Such shared states provide side channels for malicious websites to indirectly figure out the information of other origins. Timing is a classic side channel and the root cause of many recent attacks, which rely on the variations in the time taken by the systems to process different inputs. In this paper, we propose an approach to expose the timing-based probing attacks in web applications. It monitors the browser behaviors and identifies anomalous timing behaviors to detect browser probing attacks. We have prototyped our system in the Google Chrome browser and evaluated the effectiveness of our approach by using known probing techniques. We have applied our approach on a large number of top Alexa sites and reported the suspicious behavior patterns with corresponding analysis results. Our theoretical analysis illustrates that the effectiveness of the timing-based probing attacks is dramatically limited by our approach. PMID:28245610
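The monitoring idea above, identifying anomalous timing behaviors, can be illustrated with a minimal statistical detector: flag any operation whose measured duration lies far outside a baseline distribution. The threshold rule and the numbers are illustrative, not the paper's actual detection algorithm:

```python
# Flag an operation whose duration deviates from the baseline by more
# than k standard deviations (illustrative anomaly rule).
from statistics import mean, stdev

def is_anomalous(baseline_ms, observed_ms, k=3.0):
    """True if observed_ms lies more than k sigmas above the baseline mean."""
    mu, sigma = mean(baseline_ms), stdev(baseline_ms)
    return observed_ms > mu + k * sigma

baseline = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]  # normal cross-origin load times
print(is_anomalous(baseline, 10.4))  # within normal variation -> False
print(is_anomalous(baseline, 25.0))  # suspicious timing probe -> True
```

A browser-level defense would collect such baselines per behavior pattern and raise an alert, or add noise, when a page's measurement pattern falls outside them.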
Web Service Execution and Monitoring in Integrated Applications in Support of Business Communities
NASA Astrophysics Data System (ADS)
Chiriacescu, Rares M.; SzóKe, Alexandru; Portase, Sorin; Florea, Monica
Emerging technology is one of the key factors driving the business world toward faster adaptation, faster reaction and shorter communication paths. Building upon such technologies, business communities emerge, geared toward high flexibility in their offerings and collaboration: business-to-customer and business-to-business. To adapt to market requirements, companies must address several technical challenges that arise from the main requirements of the systems they have to introduce: a high degree of flexibility, heterogeneous system collaboration and security of the transferred data.
Informatics for neglected diseases collaborations.
Bost, Frederic; Jacobs, Robert T; Kowalczyk, Paul
2010-05-01
Many different public and private organizations from across the globe are collaborating on neglected diseases drug-discovery and development projects with the aim of identifying a cure for tropical infectious diseases. These neglected diseases collaborations require a global, secure, multi-organization data-management solution, combined with a platform that facilitates communication and supports collaborative work. This review discusses the solutions offered by 'Software as a Service' (SaaS) web-based platforms, despite notable challenges, and the evolution of these platforms required to foster efficient virtual research efforts by geographically dispersed scientists.
Persistence and availability of Web services in computational biology.
Schultheiss, Sebastian J; Münch, Marc-Christian; Andreeva, Gergana D; Rätsch, Gunnar
2011-01-01
We have conducted a study on the long-term availability of bioinformatics Web services: an observation of 927 Web services published in the annual Nucleic Acids Research Web Server Issues between 2003 and 2009. We found that 72% of Web sites are still available at the published addresses; only 9% of services are completely unavailable. Older addresses often redirect to new pages. We checked the functionality of all available services: for 33%, we could not test functionality because there was no example data or a related problem; 13% were truly no longer working as expected; we could positively confirm functionality for only 45% of all services. Additionally, we conducted a survey among 872 Web Server Issue corresponding authors; 274 replied. Of all respondents, 78% indicate their services have been developed solely by students and researchers without a permanent position. Consequently, these services are in danger of falling into disrepair after the original developers move to another institution, and indeed, for 24% of services, there is no plan for maintenance, according to the respondents. We introduce a Web service quality scoring system that correlates with the number of citations: services with a high score are cited 1.8 times more often than low-scoring services. We have identified key characteristics that are predictive of a service's survival, providing reviewers, editors, and Web service developers with the means to assess or improve Web services. A Web service conforming to these criteria receives more citations and provides more reliable service for its users. The most effective way of ensuring continued access to a service is a persistent Web address, offered either by the publishing journal, or created on the authors' own initiative, for example at http://bioweb.me. The community would benefit the most from a policy requiring any source code needed to reproduce results to be deposited in a public repository.
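A quality scoring system of the kind described above can be sketched as a weighted checklist. The criteria below (persistent address, example data, maintenance plan, deposited source code) are the characteristics the study itself highlights, but the weights are illustrative assumptions, not the paper's published scoring formula:

```python
# Weighted-checklist sketch of a Web service quality score.
# Weights are illustrative, not the paper's actual values.
CRITERIA = {
    "persistent_address": 3,   # e.g. a stable URL such as http://bioweb.me
    "example_data": 2,         # without it, functionality cannot even be tested
    "maintenance_plan": 2,
    "source_deposited": 1,     # code in a public repository
}

def quality_score(service):
    """Sum the weights of the criteria the service fulfils."""
    return sum(w for c, w in CRITERIA.items() if service.get(c))

svc = {"persistent_address": True, "example_data": True}
print(quality_score(svc))  # → 5
```

Reviewers or editors could use such a checklist at submission time, since the study reports that services meeting these criteria are cited more and survive longer.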
Persistence and Availability of Web Services in Computational Biology
Schultheiss, Sebastian J.; Münch, Marc-Christian; Andreeva, Gergana D.; Rätsch, Gunnar
2011-01-01
We have conducted a study on the long-term availability of bioinformatics Web services: an observation of 927 Web services published in the annual Nucleic Acids Research Web Server Issues between 2003 and 2009. We found that 72% of Web sites are still available at the published addresses, only 9% of services are completely unavailable. Older addresses often redirect to new pages. We checked the functionality of all available services: for 33%, we could not test functionality because there was no example data or a related problem; 13% were truly no longer working as expected; we could positively confirm functionality only for 45% of all services. Additionally, we conducted a survey among 872 Web Server Issue corresponding authors; 274 replied. 78% of all respondents indicate their services have been developed solely by students and researchers without a permanent position. Consequently, these services are in danger of falling into disrepair after the original developers move to another institution, and indeed, for 24% of services, there is no plan for maintenance, according to the respondents. We introduce a Web service quality scoring system that correlates with the number of citations: services with a high score are cited 1.8 times more often than low-scoring services. We have identified key characteristics that are predictive of a service's survival, providing reviewers, editors, and Web service developers with the means to assess or improve Web services. A Web service conforming to these criteria receives more citations and provides more reliable service for its users. The most effective way of ensuring continued access to a service is a persistent Web address, offered either by the publishing journal, or created on the authors' own initiative, for example at http://bioweb.me. The community would benefit the most from a policy requiring any source code needed to reproduce results to be deposited in a public repository. PMID:21966383
Space Physics Data Facility Web Services
NASA Technical Reports Server (NTRS)
Candey, Robert M.; Harris, Bernard T.; Chimiak, Reine A.
2005-01-01
The Space Physics Data Facility (SPDF) Web services provide a distributed programming interface to a portion of the SPDF software. (A general description of Web services is available at http://www.w3.org/ and in many current software-engineering texts and articles focused on distributed programming.) The SPDF Web services distributed programming interface enables additional collaboration and integration of the SPDF software system with other software systems, in furtherance of the SPDF mission to lead collaborative efforts in the collection and utilization of space physics data and mathematical models. This programming interface conforms to all applicable Web services specifications of the World Wide Web Consortium. The interface is specified by a Web Services Description Language (WSDL) file. The SPDF Web services software consists of the following components: 1) a server program for implementation of the Web services; and 2) a software developer's kit that consists of a WSDL file, a less formal description of the interface, a Java class library (which further eases development of Java-based client software), and Java source code for an example client program that illustrates the use of the interface.
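A client of a WSDL-described SOAP interface such as this ultimately exchanges XML request envelopes with the server. The sketch below builds a minimal SOAP 1.1 envelope; the operation name, parameter, and service namespace are hypothetical placeholders, not taken from the actual SPDF WSDL.

```python
# Minimal sketch of building a SOAP 1.1 request envelope for a
# WSDL-described service. Operation name, parameter, and service
# namespace below are hypothetical, not SPDF's real contract.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.org/spdf"  # placeholder service namespace

def build_soap_request(operation: str, params: dict) -> str:
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}{operation}")
    for name, value in params.items():
        child = ET.SubElement(op, f"{{{SVC_NS}}}{name}")
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# Hypothetical operation and dataset identifier for illustration only.
request = build_soap_request("getDataAvailability", {"dataset": "AC_H0_MFI"})
```

In practice a WSDL-aware client library would generate these envelopes from the WSDL file automatically, which is what the SPDF Java class library eases for Java clients.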
The EMBRACE web service collection
Pettifer, Steve; Ison, Jon; Kalaš, Matúš; Thorne, Dave; McDermott, Philip; Jonassen, Inge; Liaquat, Ali; Fernández, José M.; Rodriguez, Jose M.; Partners, INB-; Pisano, David G.; Blanchet, Christophe; Uludag, Mahmut; Rice, Peter; Bartaseviciute, Edita; Rapacki, Kristoffer; Hekkelman, Maarten; Sand, Olivier; Stockinger, Heinz; Clegg, Andrew B.; Bongcam-Rudloff, Erik; Salzemann, Jean; Breton, Vincent; Attwood, Teresa K.; Cameron, Graham; Vriend, Gert
2010-01-01
The EMBRACE (European Model for Bioinformatics Research and Community Education) web service collection is the culmination of a 5-year project that set out to investigate issues involved in developing and deploying web services for use in the life sciences. The project concluded that in order for web services to achieve widespread adoption, standards must be defined for the choice of web service technology, for semantically annotating both service function and the data exchanged, and a mechanism for discovering services must be provided. Building on this, the project developed: EDAM, an ontology for describing life science web services; BioXSD, a schema for exchanging data between services; and a centralized registry (http://www.embraceregistry.net) that collects together around 1000 services developed by the consortium partners. This article presents the current status of the collection and its associated recommendations and standards definitions. PMID:20462862
The Namibia Early Flood Warning System, A CEOS Pilot Project
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Frye, Stuart; Cappelaere, Pat; Sohlberg, Robert; Handy, Matthew; Grossman, Robert
2012-01-01
Over the past few years, an international collaboration has developed a pilot project under the auspices of the Committee on Earth Observation Satellites (CEOS) Disasters team. The overall team consists of civilian satellite agencies. For this pilot effort, the development team consists of NASA, the Canadian Space Agency, Univ. of Maryland, Univ. of Colorado, Univ. of Oklahoma, the Ukraine Space Research Institute and the Joint Research Centre (JRC) of the European Commission. This development team collaborates with regional, national and international agencies to deliver end-to-end disaster coverage. In particular, the team is collaborating on this effort with the Namibia Department of Hydrology, beginning in Namibia. However, the ultimate goal is to expand the functionality to provide early warning over the southern Africa region. The initial collaboration was initiated by the United Nations Office for Outer Space Affairs and the CEOS Working Group on Information Systems and Services (WGISS). The initial driver was to demonstrate international interoperability using various space agency sensors and models along with regional in-situ ground sensors. In 2010, the team created a preliminary semi-manual system to demonstrate moving and combining key data streams and delivering the data to the Namibia Department of Hydrology during their flood season, which typically runs January through April. In this pilot, a variety of moderate-resolution and high-resolution satellite flood imagery was rapidly delivered and used in conjunction with flood predictive models in Namibia. This was collected in conjunction with ground measurements and was used to examine how to create a customized flood early warning system. During the first year, the team made use of SensorWeb technology to gather various sensor data, which was used to monitor flood waves traveling down basins originating in Angola but eventually flooding villages in Namibia.
The team made use of standardized interfaces such as those articulated under the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) set of web services [1][2]. However, it was discovered that in order to make a system like this functional, there were many performance issues to overcome. Data sets were large, located in a variety of locations behind firewalls, and had to be accessed across open networks, so security was an issue. Furthermore, network access acted as a bottleneck to transferring map products to where they were needed. Finally, during disasters, many users and computer processes act in parallel, and thus it was very easy to overload the single string of computers stitched together in the virtual system that was initially developed. To address some of these performance issues, the team partnered with the Open Cloud Consortium (OCC), which supplied a Computation Cloud located at the University of Illinois at Chicago and some manpower to administer this Cloud. The Flood SensorWeb [3] system was interfaced to the Cloud to provide a high-performance user interface and product development engine. Figure 1 shows the functional diagram of the Flood SensorWeb. Figure 2 shows some of the functionality of the Computation Cloud that was integrated. A significant portion of the original system was ported to the Cloud, and during the past year technical issues were resolved, including web access to the Cloud, security over the open Internet, initial experiments on handling surge capacity by using the virtual machines in the cloud in parallel, tiling techniques to render large data sets as layers on a map, interfaces to allow users to customize the data processing/product chain, and other performance-enhancing techniques. The conclusion reached from this effort and this presentation is that defining the interoperability standards is a small fraction of the work.
For example, once open web service standards were defined, many users could not make use of the standards due to security restrictions. Furthermore, once an interoperable system is functional, a surge of users can render it unusable, especially in the disaster domain.
Research on models of Digital City geo-information sharing platform
NASA Astrophysics Data System (ADS)
Xu, Hanwei; Liu, Zhihui; Badawi, Rami; Liu, Haiwang
2009-10-01
The data related to Digital City is large in quantity, heterogeneous, and multidimensional. With the traditional copy-based method of data sharing, application departments cannot address data updating and data security in real time. This paper first analyzes various patterns of sharing Digital City information and, on this basis, proposes a new shared mechanism of GIS Services, in which data producers provide Geographic Information Services to application users through a Web API, so that data producers and data users can each focus on their respective strengths. An application system for supermarket management is then used as an example to demonstrate the correctness and effectiveness of the proposed method.
Authentication Binding between SSL/TLS and HTTP
NASA Astrophysics Data System (ADS)
Saito, Takamichi; Sekiguchi, Kiyomi; Hatsugai, Ryosuke
While the Secure Socket Layer or Transport Layer Security (SSL/TLS) is assumed to provide secure communications over the Internet, many web applications utilize basic or digest authentication of the Hypertext Transfer Protocol (HTTP) over SSL/TLS. In such a configuration, two different authentication schemes coexist in a single session. Because they are separated by a layer, they are inconvenient for a web application, and the scheme may also cause problems in establishing secure communication. We therefore provide a scheme for binding authentication between SSL/TLS and HTTP without modifying the SSL/TLS protocols or their implementations, and we show the effectiveness of our proposed scheme.
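The layering discussed above can be made concrete: HTTP Basic authentication is just a header carried inside the TLS session, and the credentials are merely base64-encoded, so confidentiality comes entirely from the SSL/TLS layer. The sketch below builds that header using the example credential pair from RFC 7617.

```python
# Sketch of how HTTP Basic authentication rides inside an SSL/TLS session:
# the Authorization header carries base64-encoded (not encrypted) credentials,
# so the TLS layer alone provides confidentiality. The user/password pair
# below is the standard RFC 7617 example, not real credentials.
import base64

def basic_auth_header(user: str, password: str) -> str:
    """Build the value of the HTTP Authorization header for Basic auth."""
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

header = basic_auth_header("Aladdin", "open sesame")
# → "Basic QWxhZGRpbjpvcGVuIHNlc2FtZQ==" (RFC 7617 test vector)
```

Because this token is trivially decodable, the HTTP-layer authentication has no binding to the TLS session underneath it, which is precisely the gap the paper's binding scheme addresses.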
Enhancing UCSF Chimera through web services
Huang, Conrad C.; Meng, Elaine C.; Morris, John H.; Pettersen, Eric F.; Ferrin, Thomas E.
2014-01-01
Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. PMID:24861624
Do You Ignore Information Security in Your Journal Website?
Dadkhah, Mehdi; Borchardt, Glenn; Lagzian, Mohammad
2017-08-01
Nowadays, web-based applications extend to all businesses due to their advantages and ease of use. The most important issue in web-based applications is security. Most academic journals now use these applications, with papers being submitted and published through their websites. As these websites are resources for knowledge, information security is essential for maintaining their integrity. In this opinion piece, we point out vulnerabilities in certain websites and introduce potential future threats. We intend to show how some journals are vulnerable and what will happen if a journal is infected by attackers. This opinion piece is not a technical manual on information security; it is a short inspection we conducted to help improve the security of academic journals.
Real-time GIS data model and sensor web service platform for environmental data management.
Gong, Jianya; Geng, Jing; Chen, Zeqiang
2015-01-09
Effective environmental data management is meaningful for human health. In the past, environmental data management involved developing a specific environmental data management system, but this method often lacked real-time data retrieval and sharing/interoperating capability. With the development of information technology, a Geospatial Service Web method is proposed that can be employed for environmental data management. The purpose of this study is to determine a method to realize environmental data management under the Geospatial Service Web framework. To this end, a real-time GIS (Geographic Information System) data model and a Sensor Web service platform are proposed. The real-time GIS data model manages real-time data, and the Sensor Web service platform is implemented, based on Sensor Web technologies, to support the realization of the data model. Real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed in the Sensor Web service platform. In addition, two use cases of real-time air quality monitoring and real-time soil moisture monitoring based on the real-time GIS data model in the Sensor Web service platform are realized and demonstrated. The total processing times of the two experiments are 3.7 s and 9.2 s. The experimental results show that the method integrating the real-time GIS data model and the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.
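A common way such a Sensor Web platform exposes observations is through an OGC Sensor Observation Service (SOS) endpoint queried via key-value-pair (KVP) parameters. The abstract does not name the exact protocol used, so the sketch below is an assumption for illustration; the endpoint URL, offering name, and property URN are hypothetical.

```python
# Sketch of a KVP GetObservation request to an OGC Sensor Observation
# Service (SOS), a typical Sensor Web retrieval interface. The endpoint,
# offering, and observed-property identifiers below are hypothetical.
from urllib.parse import urlencode

def sos_get_observation_url(endpoint: str, offering: str,
                            observed_property: str) -> str:
    """Assemble a GetObservation request URL with standard SOS 2.0 KVP keys."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return f"{endpoint}?{urlencode(params)}"

url = sos_get_observation_url(
    "http://example.org/sos",          # hypothetical endpoint
    "air_quality",                     # hypothetical offering
    "urn:ogc:def:property:PM2.5",      # hypothetical property URN
)
```

A client for the real-time air quality use case, for example, would poll such a URL and feed the returned observations into the real-time GIS data model.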
Similarity Based Semantic Web Service Match
NASA Astrophysics Data System (ADS)
Peng, Hui; Niu, Wenjia; Huang, Ronghuai
Semantic web service discovery aims at returning the most closely matching advertised services to the service requester by comparing the semantics of the requested service with those of an advertised service. The semantics of a web service are described in terms of inputs, outputs, preconditions and results in the Web Ontology Language for Services (OWL-S), formalized by the W3C. In this paper we propose an algorithm to calculate the semantic similarity of two services by taking a weighted average of their input and output similarities. A case study and applications show the effectiveness of our algorithm in service matching.
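The weighted-averaging step can be sketched as follows. The pairwise similarity values and the weights are illustrative assumptions; the paper's exact concept-similarity measure over the OWL-S descriptions is not reproduced here.

```python
# Sketch of scoring a candidate service against a request by weighted
# averaging of input and output similarities. The similarity values and
# weights here are illustrative, not the authors' exact measure.
def match_score(input_sims, output_sims, w_in=0.5, w_out=0.5):
    """Weighted average of mean input similarity and mean output similarity.

    input_sims / output_sims: pairwise similarity scores in [0, 1] between
    the request's and the advertised service's parameters.
    """
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return w_in * avg(input_sims) + w_out * avg(output_sims)

# Two inputs matched at 0.9 and 0.7, two outputs at 1.0 and 0.6,
# with outputs weighted slightly higher:
score = match_score([0.9, 0.7], [1.0, 0.6], w_in=0.4, w_out=0.6)
# 0.4 * 0.8 + 0.6 * 0.8 = 0.8
```

Ranking advertised services by this score then yields the "most matching" services to return to the requester.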
Flush-mounting technique for composite beams
NASA Technical Reports Server (NTRS)
Harman, T. C.; Kay, B. F.
1980-01-01
Procedure permits mounting of heavy parts to surface of composite beams without appreciably weakening beam web. Web is split and held apart in region where attachment is to be made by lightweight precast foam filler. Bolt hole penetrates foam rather than web, and is secured by barrelnut in transverse bushing through web.
12 CFR 555.310 - How do I notify OTS?
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Describe the transactional web site. (2) Indicate the date the transactional web site will become operational. (3) List a contact familiar with the deployment, operation, and security of the transactional web site. (b) Transition provision. If you established a transactional web site after the date of your last...
12 CFR 155.310 - How do I notify the OCC?
Code of Federal Regulations, 2014 CFR
2014-01-01
... least 30 days before you establish a transactional Web site. The notice must do three things: (a) Describe the transactional web site. (b) Indicate the date the transactional web site will become operational. (c) List a contact familiar with the deployment, operation, and security of the transactional web...
12 CFR 390.222 - How do I notify FDIC?
Code of Federal Regulations, 2013 CFR
2013-01-01
... 30 days before you establish a transactional Web site. The notice must do three things: (1) Describe the transactional Web site. (2) Indicate the date the transactional Web site will become operational. (3) List a contact familiar with the deployment, operation, and security of the transactional Web...
12 CFR 555.310 - How do I notify OTS?
Code of Federal Regulations, 2013 CFR
2013-01-01
...) Describe the transactional web site. (2) Indicate the date the transactional web site will become operational. (3) List a contact familiar with the deployment, operation, and security of the transactional web site. (b) Transition provision. If you established a transactional web site after the date of your last...
12 CFR 390.222 - How do I notify FDIC?
Code of Federal Regulations, 2012 CFR
2012-01-01
... 30 days before you establish a transactional Web site. The notice must do three things: (1) Describe the transactional Web site. (2) Indicate the date the transactional Web site will become operational. (3) List a contact familiar with the deployment, operation, and security of the transactional Web...
12 CFR 555.310 - How do I notify OTS?
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Describe the transactional web site. (2) Indicate the date the transactional web site will become operational. (3) List a contact familiar with the deployment, operation, and security of the transactional web site. (b) Transition provision. If you established a transactional web site after the date of your last...
12 CFR 155.310 - How do I notify the OCC?
Code of Federal Regulations, 2013 CFR
2013-01-01
... least 30 days before you establish a transactional Web site. The notice must do three things: (a) Describe the transactional web site. (b) Indicate the date the transactional web site will become operational. (c) List a contact familiar with the deployment, operation, and security of the transactional web...
12 CFR 555.310 - How do I notify OTS?
Code of Federal Regulations, 2014 CFR
2014-01-01
...) Describe the transactional web site. (2) Indicate the date the transactional web site will become operational. (3) List a contact familiar with the deployment, operation, and security of the transactional web site. (b) Transition provision. If you established a transactional web site after the date of your last...
12 CFR 390.222 - How do I notify FDIC?
Code of Federal Regulations, 2014 CFR
2014-01-01
... 30 days before you establish a transactional Web site. The notice must do three things: (1) Describe the transactional Web site. (2) Indicate the date the transactional Web site will become operational. (3) List a contact familiar with the deployment, operation, and security of the transactional Web...
12 CFR 155.310 - How do I notify the OCC?
Code of Federal Regulations, 2012 CFR
2012-01-01
... least 30 days before you establish a transactional Web site. The notice must do three things: (a) Describe the transactional web site. (b) Indicate the date the transactional web site will become operational. (c) List a contact familiar with the deployment, operation, and security of the transactional web...
Boverhof's App Earns Honorable Mention in Amazon's Web Services
Boverhof's app earned an honorable mention in a competition run by Amazon Web Services (AWS); Amazon officially announced the winners of its EC2 Spotathon on Monday.
Biological Web Service Repositories Review
Urdidiales‐Nieto, David; Navas‐Delgado, Ismael
2016-01-01
Web services play a key role in bioinformatics, enabling the integration of database access and analysis algorithms. However, Web service repositories do not usually publish information on the changes made to their registered Web services. Dynamism is directly related to changes in the repositories (services registered or unregistered) and at the service level (annotation changes). Thus, users, software clients and workflow-based approaches lack enough relevant information to decide when they should review or re-execute a Web service or workflow to get updated or improved results. The dynamism of a repository could serve as a measure for workflow developers to re-check service availability and annotation changes in the services of interest to them. This paper presents a review of the most well-known Web service repositories in the life sciences, including an analysis of their dynamism. Freshness is introduced in this paper and is used as the measure of the dynamism of these repositories. PMID:27783459
NASA Astrophysics Data System (ADS)
Du, Xiaofeng; Song, William; Munro, Malcolm
Web Services, as a new distributed system technology, have been widely adopted by industry in areas such as enterprise application integration (EAI), business process management (BPM), and virtual organisation (VO). However, the lack of semantics in the current Web Service standards has been a major barrier to service discovery and composition. In this chapter, we propose an enhanced context-based semantic service description framework (CbSSDF+) that tackles the problem and improves the flexibility of service discovery and the correctness of generated composite services. We also provide an agile transformation method to demonstrate how the various formats of Web Service descriptions on the Web can be managed and renovated step by step into CbSSDF+-based service descriptions without a large amount of engineering work. At the end of the chapter, we evaluate the applicability of the transformation method and the effectiveness of CbSSDF+ through a series of experiments.
Secure Web-based Ground System User Interfaces over the Open Internet
NASA Technical Reports Server (NTRS)
Langston, James H.; Murray, Henry L.; Hunt, Gary R.
1998-01-01
A prototype has been developed which makes use of commercially available products in conjunction with the Java programming language to provide a secure user interface for command and control over the open Internet. This paper reports the successful demonstration of: (1) security over the Internet, including encryption and certification; (2) integration of Java applets with a COTS command and control product; (3) remote spacecraft commanding using the Internet. The Java-based Spacecraft Web Interface to Telemetry and Command Handling (Jswitch) ground system prototype provides these capabilities. This activity demonstrates the use and integration of current technologies to enable a spacecraft engineer or flight operator to monitor and control a spacecraft from a user interface communicating over the open Internet using standard World Wide Web (WWW) protocols and commercial off-the-shelf (COTS) products. The core command and control functions are provided by the COTS Epoch 2000 product. The standard WWW tools and browsers are used in conjunction with the Java programming technology. Security is provided with current encryption and certification technology. This system prototype is a step in the direction of giving scientists and flight operators Web-based access to instrument, payload, and spacecraft data.
Network Computing Infrastructure to Share Tools and Data in Global Nuclear Energy Partnership
NASA Astrophysics Data System (ADS)
Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya
CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype system of a network computing infrastructure for sharing tools and data to support the U.S.-Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues in applying our information processing infrastructure: accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For the accessibility issue, we adopted SSL-VPN (Secure Socket Layer-Virtual Private Network) technology for access beyond firewalls. For the security issue, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism to strengthen security. We also set a fine-grained access control policy for shared tools and data and used a shared-key-based encryption method to protect tools and data against leakage to third parties. For the usability issue, we chose Web browsers as the user interface and developed a Web application to provide functions supporting the sharing of tools and data. By using the WebDAV (Web-based Distributed Authoring and Versioning) function, users can manipulate shared tools and data through a Windows-like folder environment. We implemented the prototype system on the Grid infrastructure for atomic energy research, AEGIS (Atomic Energy Grid Infrastructure), developed by CCSE/JAEA. The prototype system was applied for trial use in the first period of GNEP.
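The WebDAV folder view mentioned above is driven by plain HTTP requests using the PROPFIND method defined in RFC 4918: listing a collection is a PROPFIND with a Depth header and an XML body. The sketch below assembles such a request as data; the shared-folder path is a hypothetical example.

```python
# Sketch of the WebDAV request behind a folder listing: an HTTP PROPFIND
# with Depth: 1 and an XML body in the DAV: namespace (RFC 4918).
# The shared-folder path below is hypothetical.
DAV_NS = "DAV:"

def propfind_request(path: str) -> dict:
    """Build the components of a Depth-1 PROPFIND request for a collection."""
    body = (
        '<?xml version="1.0" encoding="utf-8"?>'
        f'<propfind xmlns="{DAV_NS}"><allprop/></propfind>'
    )
    return {
        "method": "PROPFIND",
        "path": path,
        "headers": {"Depth": "1", "Content-Type": "application/xml"},
        "body": body,
    }

req = propfind_request("/shared/tools/")  # hypothetical shared folder
```

Because WebDAV is ordinary HTTP, such requests pass through the SSL-VPN and authentication gateway like any other Web traffic, which is what makes the Windows-like folder environment work across firewalls.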
Augmenting Space Technology Program Management with Secure Cloud & Mobile Services
NASA Technical Reports Server (NTRS)
Hodson, Robert F.; Munk, Christopher; Helble, Adelle; Press, Martin T.; George, Cory; Johnson, David
2017-01-01
The National Aeronautics and Space Administration (NASA) Game Changing Development (GCD) program manages technology projects across all NASA centers and reports to NASA headquarters regularly on progress. Program stakeholders expect an up-to-date, accurate status and often have questions about the program's portfolio that require a timely response. Historically, reporting, data collection, and analysis were done with manual processes that were inefficient and prone to error. To address these issues, GCD set out to develop a new business automation solution. In doing this, the program wanted to leverage the latest information technology platforms and decided to utilize traditional systems along with new cloud-based web services and gaming technology for a novel and interactive user environment. The team also set out to develop a mobile solution for anytime information access. This paper discusses a solution to these challenging goals and how the GCD team succeeded in developing and deploying such a system. The architecture and approach taken have proven to be effective and robust and can serve as a model for others looking to develop secure interactive mobile business solutions for government or enterprise business automation.
OC ToGo: bed site image integration into OpenClinica with mobile devices
NASA Astrophysics Data System (ADS)
Haak, Daniel; Gehlen, Johan; Jonas, Stephan; Deserno, Thomas M.
2014-03-01
Imaging and image-based measurements nowadays play an essential role in controlled clinical trials, but electronic data capture (EDC) systems insufficiently support the integration of images captured by mobile devices (e.g. smartphones and tablets). The web application OpenClinica has established itself as one of the world's leading EDC systems and is used to collect, manage and store data of clinical trials in electronic case report forms (eCRFs). In this paper, we present a mobile application for instantaneous integration of images into OpenClinica directly during examination at the patient's bedside. The communication between the Android application and OpenClinica is based on the simple object access protocol (SOAP) and representational state transfer (REST) web services for metadata, and the secure file transfer protocol (SFTP) for image transfer, respectively. OpenClinica's web services are used to query context information (e.g. existing studies, events and subjects) and to import data into the eCRF, as well as to export eCRF metadata and structural information. A stable image transfer is ensured, and progress information (e.g. remaining time) is visualized for the user. The workflow is demonstrated for a European multi-center registry in which patients with calciphylaxis disease are included. Our approach improves the EDC workflow, saves time, and reduces costs. Furthermore, data privacy is enhanced, since storage of private health data on the imaging devices becomes obsolete.
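As an illustration of the SFTP leg of this workflow, the sketch below derives a deterministic remote path for an uploaded image from the study context queried over the web services. The naming scheme, study, event, and subject identifiers are assumptions for illustration, not OpenClinica's actual layout.

```python
# Sketch of mapping queried study context (study, event, subject) to an
# SFTP upload path for a captured image. The path scheme and identifiers
# are hypothetical, chosen only to illustrate the idea of encoding trial
# context into the remote location.
from datetime import datetime, timezone

def remote_image_path(study: str, event: str, subject: str,
                      filename: str, when: datetime) -> str:
    """Build a timestamped remote path that encodes the trial context."""
    stamp = when.strftime("%Y%m%dT%H%M%S")
    return f"/incoming/{study}/{event}/{subject}/{stamp}_{filename}"

path = remote_image_path(
    "calciphylaxis_registry",        # hypothetical study identifier
    "baseline_visit",                # hypothetical event identifier
    "S-0042",                        # hypothetical subject identifier
    "wound.jpg",
    datetime(2014, 3, 1, 12, 30, 0, tzinfo=timezone.utc),
)
```

Encoding the context into the path keeps the transfer channel stateless: the server side can route each incoming file to the right eCRF without a second metadata lookup.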
Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E.; Tkachenko, Valery; Torcivia-Rodriguez, John; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja
2016-01-01
The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu PMID:26989153
Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja
2016-01-01
The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu. © The Author(s) 2016. Published by Oxford University Press.
The impact of web services at the IRIS DMC
NASA Astrophysics Data System (ADS)
Weekly, R. T.; Trabant, C. M.; Ahern, T. K.; Stults, M.; Suleiman, Y. Y.; Van Fossen, M.; Weertman, B.
2015-12-01
The IRIS Data Management Center (DMC) has served the seismological community for nearly 25 years. In that time we have offered data and information from our archive using a variety of mechanisms, ranging from email-based to desktop applications to web applications and web services. Of these, web services have quickly become the primary method for data extraction at the DMC. In 2011, the first full year of operation, web services accounted for over 40% of the data shipped from the DMC. In 2014, approximately 450 TB of data were delivered directly to users through web services, representing nearly 70% of all shipments from the DMC that year. In addition to handling requests directly from users, the DMC switched all data extraction methods to use web services in 2014. On average the DMC now handles between 10 and 20 million requests per day submitted to web service interfaces. The rapid adoption of web services is attributed to the many advantages they bring. For users, they provide on-demand data using an interface technology, HTTP, that is widely supported in nearly every computing environment and language. These characteristics, combined with human-readable documentation and existing tools, make integration of data access into existing workflows relatively easy. For the DMC, the web services provide an abstraction layer to internal repositories, allowing for concentrated optimization of extraction workflow and easier evolution of those repositories. Lending further support to the DMC's push in this direction, the core web services for station metadata, timeseries data and event parameters were adopted as standards by the International Federation of Digital Seismograph Networks (FDSN). We expect to continue enhancing existing services and building new capabilities for this platform. For example, the DMC has created a federation system and tools allowing researchers to discover and collect seismic data from data centers running the FDSN-standardized services.
A future capability will leverage the DMC's MUSTANG project to select data based on data quality measurements. In under five years, the DMC's web services have proven to be a robust and flexible platform that enables continued growth, and we expect their continued enhancement and adoption.
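The FDSN-standardized services described above are plain HTTP endpoints with conventional query parameters, which is why they integrate so easily into existing workflows. As a hedged sketch (the base URL and parameter names follow the published FDSN web service conventions, but treat the exact values as illustrative), a client can assemble a timeseries request like this:

```python
from urllib.parse import urlencode

def build_dataselect_url(base, net, sta, cha, start, end):
    """Build a query URL for an FDSN-style dataselect service."""
    params = {
        "net": net,          # network code
        "sta": sta,          # station code
        "cha": cha,          # channel code
        "starttime": start,  # ISO-8601 start of the time window
        "endtime": end,      # ISO-8601 end of the time window
        "format": "miniseed",
    }
    return base + "?" + urlencode(params)

url = build_dataselect_url(
    "http://service.iris.edu/fdsnws/dataselect/1/query",
    "IU", "ANMO", "BHZ", "2014-01-01T00:00:00", "2014-01-01T01:00:00")
```

Because the interface is just HTTP, the same URL works from wget, cURL, or any programming language with an HTTP client.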
Dao, Tien Tuan; Hoang, Tuan Nha; Ta, Xuan Hien; Tho, Marie Christine Ho Ba
2013-02-01
Human musculoskeletal system resources (HMSR) are valuable for learning and medical purposes. Internet-based information from conventional search engines such as Google or Yahoo cannot respond to the need for useful, accurate, reliable and good-quality human musculoskeletal resources related to medical processes, pathological knowledge and practical expertise. In the present work, an advanced knowledge-based personalized search engine was developed. Our search engine is based on a client-server, multi-layer, multi-agent architecture and the principles of semantic web services to dynamically acquire accurate and reliable HMSR information through a semantic processing and visualization approach. A security-enhanced mechanism was applied to protect the medical information. A multi-agent crawler was implemented to develop a content-based database of HMSR information. A new semantic-based PageRank score, with related mathematical formulas, was also defined and implemented. As a result, semantic web service descriptions were presented in OWL, WSDL and OWL-S formats. Operational scenarios with related web-based interfaces for personal computers and mobile devices were presented and analyzed. A functional comparison between our knowledge-based search engine, a conventional search engine and a semantic search engine showed the originality and robustness of our knowledge-based personalized search engine. In fact, our engine allows different users, such as orthopedic patients and experts, healthcare system managers or medical students, to remotely access useful, accurate, reliable and good-quality HMSR information for their learning and medical purposes. Copyright © 2012 Elsevier Inc. All rights reserved.
LISA, the next generation: from a web-based application to a fat client.
Pierlet, Noëlla; Aerts, Werner; Vanautgaerden, Mark; Van den Bosch, Bart; De Deurwaerder, André; Schils, Erik; Noppe, Thomas
2008-01-01
The LISA application, developed by the University Hospitals Leuven, permits referring physicians to consult the electronic medical records of their patients over the internet in a highly secure way. We decided to completely change the way we secured the application, discard the existing web application and build a completely new application, based on the in-house developed hospital information system, used in the University Hospitals Leuven. The result is a fat Java client, running on a Windows Terminal Server, secured by a commercial SSL-VPN solution.
Web service discovery among large service pools utilising semantic similarity and clustering
NASA Astrophysics Data System (ADS)
Chen, Fuzan; Li, Minqiang; Wu, Harris; Xie, Lingli
2017-03-01
With the rapid development of electronic business, Web services have attracted much attention in recent years. Enterprises can combine individual Web services to provide new value-added services. An emerging challenge is the timely discovery of close matches to service requests among large service pools. In this study, we first define a new semantic similarity measure combining functional similarity and process similarity. We then present a service discovery mechanism that utilises the new semantic similarity measure for service matching. All the published Web services are pre-grouped into functional clusters prior to the matching process. For a user's service request, the discovery mechanism first identifies matching services clusters and then identifies the best matching Web services within these matching clusters. Experimental results show that the proposed semantic discovery mechanism performs better than a conventional lexical similarity-based mechanism.
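The cluster-first discovery strategy described in this abstract can be sketched in a few lines. The weighting scheme, the threshold and the toy Jaccard similarity below are assumptions for illustration, not the authors' actual formulas; the point is the two-stage structure: match the request against cluster centroids first, then rank only the services inside the matching clusters.

```python
def combined_similarity(func_sim, proc_sim, alpha=0.6):
    """Weighted combination of functional and process similarity (alpha assumed)."""
    return alpha * func_sim + (1 - alpha) * proc_sim

def discover(request, clusters, sim, threshold=0.5):
    """Cluster-first matching: filter clusters by centroid, then rank services."""
    # Step 1: keep clusters whose centroid is close enough to the request.
    candidates = [c for c in clusters if sim(request, c["centroid"]) >= threshold]
    # Step 2: rank individual services within the surviving clusters.
    services = [s for c in candidates for s in c["services"]]
    return sorted(services, key=lambda s: sim(request, s), reverse=True)

# Toy demonstration with keyword sets and Jaccard similarity.
def jaccard(a, b):
    return len(a & b) / len(a | b)

clusters = [
    {"centroid": {"payment"}, "services": [{"payment", "card"}, {"payment", "invoice"}]},
    {"centroid": {"weather"}, "services": [{"weather", "forecast"}]},
]
ranked = discover({"payment", "card"}, clusters, jaccard)
```

Pre-grouping pays off at scale: the request is compared against a handful of centroids instead of every service in the pool.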
A verification strategy for web services composition using enhanced stacked automata model.
Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali
2015-01-01
Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different providers, a composite service describing the intact business process of an enterprise can be built. Many standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an initial framework for an Extensible Markup Language (XML) specification language for defining and implementing business process workflows for web services. The problem with most realistic approaches to service composition is the verification of the composed web services, which must rely on formal verification methods to ensure their correctness. A few research works in the literature address verification of web services for deterministic systems; moreover, the existing models do not address verification properties such as dead transitions, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated based on properties such as dead transitions, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS), converted into an ESAM (a combination of a Muller Automaton (MA) and a Push Down Automaton (PDA)), and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results show better performance in finding dead transitions and deadlocks than the existing models.
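The properties checked here (reachability, dead transitions, deadlock) can be illustrated on a plain finite transition system. The sketch below is a simplified stand-in for the ESAM approach, not the Muller/pushdown machinery itself: a dead transition is one whose source state is unreachable, and a deadlock is a reachable, non-final state with no outgoing transition.

```python
from collections import deque

def reachable(transitions, start):
    """Breadth-first search over the transition relation."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        for (src, _label, dst) in transitions:
            if src == state and dst not in seen:
                seen.add(dst)
                queue.append(dst)
    return seen

def dead_transitions(transitions, start):
    """Transitions whose source state can never be reached."""
    seen = reachable(transitions, start)
    return [t for t in transitions if t[0] not in seen]

def deadlocks(transitions, start, final):
    """Reachable non-final states with no outgoing transition."""
    seen = reachable(transitions, start)
    sources = {t[0] for t in transitions}
    return [s for s in seen if s not in sources and s not in final]

# Toy system: s3 is unreachable, s4 is a reachable dead end.
ts = [("s0", "a", "s1"), ("s1", "b", "s2"), ("s1", "d", "s4"), ("s3", "c", "s0")]
dead = dead_transitions(ts, "s0")
stuck = deadlocks(ts, "s0", final={"s2"})
```

Tools like SPIN perform the same kind of state-space exploration, but over the full product of processes and with far more sophisticated reduction techniques.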
Pragmatic Computing - A Semiotic Perspective to Web Services
NASA Astrophysics Data System (ADS)
Liu, Kecheng
The web seems to have evolved from a syntactic web through a semantic web to a pragmatic web. This evolution conforms to the study of information and technology in the theory of semiotics. Pragmatics, concerned with the use of information in relation to context and intended purposes, is extremely important in web services and applications. Much research in pragmatics has been carried out, but at the same time, attempts and solutions have led to further questions. After reviewing current work on the pragmatic web, the paper presents a semiotic approach to web services, particularly for request decomposition and service aggregation.
A systematic review of studies of web portals for patients with diabetes mellitus.
Coughlin, Steven S; Williams, Lovoria B; Hatzigeorgiou, Christos
2017-01-01
Patient web portals are password-protected online websites that offer patients 24-hour access to personal health information from anywhere with an Internet connection. Due to advances in health information technologies, there has been increasing interest among providers and researchers in patient web portals for use by patients with diabetes and other chronic conditions. This article, which is based upon bibliographic searches in PubMed, reviews web portals for patients with diabetes mellitus, including patient web portals tethered to electronic medical records and web portals developed specifically for patients with diabetes. Twelve studies of the impact of patient web portals on the management of diabetes patients were identified. Three had a cross-sectional design, one employed mixed methods, one had a matched-control design, three had a retrospective cohort design, and five were randomized controlled trials. Six (50%) of the studies examined web portals tethered to electronic medical records; the remainder were web portals developed specifically for diabetes patients. The results of this review suggest that secure messaging between adult diabetic patients and their clinicians is associated with improved glycemic control, although negative findings have also been reported. Results from observational studies indicate that many diabetic patients do not take advantage of web portal features such as secure messaging, perhaps because of a lack of internet access or of experience in navigating web portal resources. Although randomized controlled trials provide stronger evidence of the efficacy of web portal use in improving glycemic control among diabetic patients, the number of trials is still small and their results have been mixed. Additional research is needed to identify specific portal features that may impact quality of care or improve glycemic control.
A systematic review of studies of web portals for patients with diabetes mellitus
Williams, Lovoria B.; Hatzigeorgiou, Christos
2017-01-01
Patient web portals are password-protected online websites that offer patients 24-hour access to personal health information from anywhere with an Internet connection. Due to advances in health information technologies, there has been increasing interest among providers and researchers in patient web portals for use by patients with diabetes and other chronic conditions. This article, which is based upon bibliographic searches in PubMed, reviews web portals for patients with diabetes mellitus, including patient web portals tethered to electronic medical records and web portals developed specifically for patients with diabetes. Twelve studies of the impact of patient web portals on the management of diabetes patients were identified. Three had a cross-sectional design, one employed mixed methods, one had a matched-control design, three had a retrospective cohort design, and five were randomized controlled trials. Six (50%) of the studies examined web portals tethered to electronic medical records; the remainder were web portals developed specifically for diabetes patients. The results of this review suggest that secure messaging between adult diabetic patients and their clinicians is associated with improved glycemic control, although negative findings have also been reported. Results from observational studies indicate that many diabetic patients do not take advantage of web portal features such as secure messaging, perhaps because of a lack of internet access or of experience in navigating web portal resources. Although randomized controlled trials provide stronger evidence of the efficacy of web portal use in improving glycemic control among diabetic patients, the number of trials is still small and their results have been mixed. Additional research is needed to identify specific portal features that may impact quality of care or improve glycemic control. PMID:28736732
Access to the NCAR Research Data Archive via the Globus Data Transfer Service
NASA Astrophysics Data System (ADS)
Cram, T.; Schuster, D.; Ji, Z.; Worley, S. J.
2014-12-01
The NCAR Research Data Archive (RDA; http://rda.ucar.edu) contains a large and diverse collection of meteorological and oceanographic observations, operational and reanalysis outputs, and remote sensing datasets to support atmospheric and geoscience research. The RDA contains more than 600 dataset collections which support the varying needs of a diverse user community. The number of RDA users is increasing annually, and the most popular method used to access the RDA data holdings is through web-based protocols, such as wget and cURL based scripts. In 2013, 10,000 unique users downloaded more than 820 terabytes of data from the RDA, and customized data products were prepared for more than 29,000 user-driven requests. In order to further support this increase in web download usage, the RDA is implementing the Globus data transfer service (www.globus.org) to provide a GridFTP data transfer option for the user community. The Globus service is broadly scalable, has an easy-to-install client, is sustainably supported, and provides a robust, efficient, and reliable data transfer option for RDA users. This paper highlights the main functionality and usefulness of the Globus data transfer service for accessing the RDA holdings. The Globus data transfer service, developed and supported by the Computation Institute at The University of Chicago and Argonne National Laboratory, uses GridFTP as a fast, secure, and reliable method for transferring data between two endpoints. A Globus user account is required to use this service, and data transfer endpoints are defined on the Globus web interface. In the RDA use cases, the access endpoint is created on the RDA data server at NCAR. The data user defines the receiving endpoint for the data transfer, which can be the main file system at a host institution, a personal workstation, or a laptop.
Once initiated, the data transfer runs as an unattended background process by Globus, and Globus ensures that the transfer is accurately fulfilled. Users can monitor the data transfer progress on the Globus web interface and optionally receive an email notification once it is complete. Globus also provides a command-line interface to support scripted transfers, which can be useful when embedded in data processing workflows.
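The workflow described above reduces to submitting a small structured document naming two endpoints and the files to move; the service then carries out and verifies the transfer unattended. The sketch below shows the shape of such a request body. The field names and endpoint labels are illustrative rather than an exact rendering of the Globus Transfer API, and `rda#datasets` / `user#laptop` are hypothetical endpoint names.

```python
import json

def make_transfer_request(source_endpoint, dest_endpoint, items, notify=True):
    """Assemble an illustrative transfer-request document."""
    return {
        "source_endpoint": source_endpoint,
        "destination_endpoint": dest_endpoint,
        "notify_on_succeeded": notify,  # e-mail on completion, as described above
        "items": [
            {"source_path": src, "destination_path": dst}
            for src, dst in items
        ],
    }

request = make_transfer_request(
    "rda#datasets",   # hypothetical RDA server endpoint
    "user#laptop",    # hypothetical personal endpoint
    [("/ds633.0/file.grib", "/home/me/file.grib")])
body = json.dumps(request)
```

Once submitted, responsibility for retries and integrity checking sits with the service, which is what makes the background, fire-and-forget usage pattern possible.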
QoS measurement of workflow-based web service compositions using Colored Petri net.
Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra
2014-01-01
Workflow-based web service composition (WB-WSC) is one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the Business Process Execution Language (BPEL) are the main techniques in this category. With the maturity of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide composed web services whose quality meets their customers' requirements. Thus, quality of service (QoS), which refers to non-functional parameters, is important to measure, so that the quality degree of a given web service composition can be determined. This paper presents a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPN) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.
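The probabilistic arithmetic underlying such QoS analyses reduces composition patterns to expected values. The functions below implement the standard workflow-pattern reductions for reliability and response time (sequence, AND-split/join, and probabilistic XOR choice); they illustrate the kind of calculation involved, not WSET's actual implementation.

```python
def sequential(services):
    """Services run one after another: times add, reliabilities multiply."""
    rel, time = 1.0, 0.0
    for r, t in services:  # each service is (reliability, response_time)
        rel *= r
        time += t
    return rel, time

def parallel(services):
    """AND-split/AND-join: all branches run; wait for the slowest."""
    rel = 1.0
    for r, _ in services:
        rel *= r
    return rel, max(t for _, t in services)

def choice(branches):
    """XOR-split with branch probabilities p: expected reliability and time."""
    rel = sum(p * r for p, (r, _) in branches)
    time = sum(p * t for p, (_, t) in branches)
    return rel, time
```

For example, two services with reliabilities 0.9 and 0.8 composed sequentially yield a composite reliability of 0.72 — multiplication makes long chains fragile, which is precisely why composite QoS must be measured rather than assumed.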
Enhancing UCSF Chimera through web services.
Huang, Conrad C; Meng, Elaine C; Morris, John H; Pettersen, Eric F; Ferrin, Thomas E
2014-07-01
Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
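The "entire job submission, monitoring and retrieval process" that Chimera streamlines follows a generic submit/poll/retrieve loop. The sketch below captures that loop with the service calls injected as plain functions so it stays self-contained; a real Opal client would make SOAP calls at each step, and the status names here are assumptions.

```python
import time

def run_job(submit, get_status, get_outputs, poll_interval=0.0):
    """Submit a job, poll until it finishes, then fetch outputs if it succeeded."""
    job_id = submit()
    while True:
        status = get_status(job_id)
        if status in ("DONE", "FAILED"):
            break
        time.sleep(poll_interval)  # back off between polls
    return status, (get_outputs(job_id) if status == "DONE" else None)

# Fake service that reports PENDING, then RUNNING, then DONE.
statuses = iter(["PENDING", "RUNNING", "DONE"])
status, outputs = run_job(
    submit=lambda: "job-1",
    get_status=lambda jid: next(statuses),
    get_outputs=lambda jid: ["out.pdb"])
```

Hiding this loop behind the user interface is what lets users stay focused on the structures rather than on job bookkeeping.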
Masys, D. R.; Baker, D. B.
1997-01-01
The Internet's World-Wide Web (WWW) provides an appealing medium for the communication of health related information due to its ease of use and growing popularity. But current technologies for communicating data between WWW clients and servers are systematically vulnerable to certain types of security threats. Prominent among these threats are "Trojan horse" programs running on client workstations, which perform some useful and known function for a user, while breaching security via background functions that are not apparent to the user. The Patient-Centered Access to Secure Systems Online (PCASSO) project of SAIC and UCSD is a research, development and evaluation project to exploit state-of-the-art security and WWW technology for health care. PCASSO is designed to provide secure access to clinical data for healthcare providers and their patients using the Internet. PCASSO will be evaluated for both safety and effectiveness, and may provide a model for secure communications via public data networks. PMID:9357644
76 FR 19110 - Published Privacy Impact Assessments on the Web
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-06
... the Web AGENCY: Privacy Office, Department of Homeland Security (DHS). ACTION: Notice of Publication... the Privacy Office's Web site between May 3, 2010 and January 7, 2011. DATES: The Privacy Impact Assessments are available on the DHS Web site until June 6, 2011, after which they are obtained by contacting...
Standards-based sensor interoperability and networking SensorWeb: an overview
NASA Astrophysics Data System (ADS)
Bolling, Sam
2012-06-01
The warfighter lacks a unified Intelligence, Surveillance, and Reconnaissance (ISR) environment to conduct mission planning, command and control (C2), tasking, collection, exploitation, processing, and data discovery of disparate sensor data across the ISR Enterprise. Legacy sensors and applications are not standardized or integrated for assured, universal access. Existing tasking and collection capabilities are not unified across the enterprise, inhibiting robust C2 of ISR, including near-real-time, cross-cueing operations. To address these critical needs, the National Measurement and Signature Intelligence (MASINT) Office (NMO) and partnering Combatant Commands and Intelligence Agencies are developing SensorWeb, an architecture that harmonizes heterogeneous sensor data to a common standard for users to discover, access, observe, subscribe to and task sensors. The SensorWeb initiative's long-term goal is to establish an open, commercial standards-based, service-oriented framework to facilitate plug-and-play sensors. The current development effort will produce non-proprietary deliverables, intended as a Government Off-The-Shelf (GOTS) solution to address the U.S. and Coalition nations' inability to quickly and reliably detect, identify, map, track, and fully understand security threats and operational activities.
SCEAPI: A unified Restful Web API for High-Performance Computing
NASA Astrophysics Data System (ADS)
Rongqiang, Cao; Haili, Xiao; Shasha, Lu; Yining, Zhao; Xiaoning, Wang; Xuebin, Chi
2017-10-01
The development of scientific computing is increasingly moving to collaborative web and mobile applications. All these applications need a high-quality programming interface for accessing heterogeneous computing resources consisting of clusters, grid computing or cloud computing. In this paper, we introduce our high-performance computing environment that integrates computing resources from 16 HPC centers across China. Then we present a bundle of web services called SCEAPI and describe how it can be used to access HPC resources with the HTTP or HTTPS protocols. We discuss SCEAPI from several aspects, including architecture, implementation and security, and address specific challenges in designing compatible interfaces and protecting sensitive data. We describe the functions of SCEAPI, including authentication, file transfer and job management for creating, submitting and monitoring jobs, and how to use SCEAPI in an easy-to-use way. Finally, we discuss how to quickly exploit more HPC resources for the ATLAS experiment by implementing a custom ARC compute element based on SCEAPI, and our work shows that SCEAPI is an easy-to-use and effective solution to extend opportunistic HPC resources.
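A RESTful HPC API of this kind typically accepts authenticated JSON requests over HTTPS for job submission. The sketch below assembles such a request; the endpoint path, bearer-token scheme and payload fields are invented for illustration, since SCEAPI's real interface may differ.

```python
import json
import urllib.request

def build_submit_request(base_url, token, app, args, nodes=1):
    """Build a POST request submitting a job to a hypothetical REST HPC API."""
    payload = json.dumps({"app": app, "args": args, "nodes": nodes}).encode()
    return urllib.request.Request(
        base_url + "/jobs",               # assumed job-submission path
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + token,  # assumed token scheme
        },
        method="POST")

req = build_submit_request("https://api.example.org/v1", "TOKEN123",
                           "blast", ["-query", "seq.fa"])
```

The same pattern (token header plus JSON body) extends naturally to the file-transfer and monitoring endpoints such an API would expose.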
The Potential of CGI: Using Pre-Built CGI Scripts to Make Interactive Web Pages.
ERIC Educational Resources Information Center
Nackerud, Shane A.
1998-01-01
Describes CGI (Common Gateway Interface) scripts that are available on the Web and explains how librarians can use them to make Web pages more interactive. Topics include CGI security; Perl scripts; UNIX; and HTML. (LRW)
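The scripts the article surveys are typically Perl, but the gateway contract CGI defines is language-neutral: the server passes the query string through the environment, and the script writes an HTTP response (headers, blank line, body) to standard output. A minimal Python rendering of that contract, kept as a testable function:

```python
from urllib.parse import parse_qs

def handle(environ):
    """Minimal CGI-style handler: read QUERY_STRING, return a full HTTP response."""
    form = parse_qs(environ.get("QUERY_STRING", ""))
    name = form.get("name", ["stranger"])[0]
    body = "<html><body>Hello, %s!</body></html>" % name
    # Header block, blank line, then the body -- the CGI output contract.
    return "Content-Type: text/html\r\n\r\n" + body

response = handle({"QUERY_STRING": "name=librarian"})
```

The security concerns the article raises follow directly from this model: anything in `QUERY_STRING` is attacker-controlled input and must be validated before use.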
User Needs of Digital Service Web Portals: A Case Study
ERIC Educational Resources Information Center
Heo, Misook; Song, Jung-Sook; Seol, Moon-Won
2013-01-01
The authors examined the needs of digital information service web portal users. More specifically, the needs of Korean cultural portal users were examined as a case study. The conceptual framework of a web-based portal is that it is a complex, web-based service application with characteristics of information systems and service agents. In…
Compression-based aggregation model for medical web services.
Al-Shammary, Dhiah; Khalil, Ibrahim
2010-01-01
Many organizations, such as hospitals, have adopted Cloud Web services in their network services to avoid investing heavily in computing infrastructure. SOAP (Simple Object Access Protocol), an XML-based protocol, is the basic communication protocol of Cloud Web services. Web services often suffer congestion and bottlenecks as a result of the high network traffic caused by the large XML overhead; at the same time, the massive load on Cloud Web services, in terms of the large volume of client requests, results in the same problem. In this paper, two XML-aware aggregation techniques based on compression concepts are proposed in order to aggregate medical Web messages and achieve higher message size reduction.
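The intuition behind compression-aware aggregation is that many small SOAP messages share the same verbose tag structure, so bundling them before compressing lets the compressor exploit that redundancy. A toy version (the envelope format is an assumption for illustration, not the paper's technique):

```python
import zlib

def aggregate(messages):
    """Bundle several XML messages into one envelope (illustrative format)."""
    return ("<aggregate>" + "".join(messages) + "</aggregate>").encode()

def compress(blob):
    return zlib.compress(blob, level=9)  # DEFLATE at maximum compression

# Fifty near-identical SOAP-like messages: highly repetitive tag structure.
messages = ["<soap:Envelope><patient id='%d'/></soap:Envelope>" % i
            for i in range(50)]
bundle = aggregate(messages)
packed = compress(bundle)
ratio = len(packed) / len(bundle)  # well below 1 for repetitive XML
```

Compressing the bundle as a whole beats compressing each message individually, because the shared tags only need to be encoded once in the compressor's dictionary.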
Osz, Ágnes; Pongor, Lorinc Sándor; Szirmai, Danuta; Gyorffy, Balázs
2017-12-08
The long-term availability of online Web services is of utmost importance to ensure the reproducibility of analytical results. However, because of a lack of maintenance following acceptance, many servers become unavailable after a short period of time. Our aim was to monitor the accessibility and decay rate of published Web services, as well as to determine the factors underlying trend changes. We searched PubMed to identify publications containing Web server-related terms published between 1994 and 2017. Automatic and manual screening was used to check the status of each Web service. Kruskal-Wallis, Mann-Whitney and Chi-square tests were used to evaluate various parameters, including availability, accessibility, platform, origin of authors, citation, journal impact factor and publication year. We identified 3649 publications in 375 journals, of which 2522 (69%) were currently active. Over 95% of sites were running in the first 2 years, but this rate dropped to 84% in the third year and gradually sank afterwards (P < 1e-16). The mean half-life of Web services is 10.39 years. Working Web services were published in journals with higher impact factors (P = 4.8e-04). Services published before the year 2000 received minimal attention. Offline services were cited less than online services (P = 0.022). The majority of Web services provide analytical tools, and the proportion of databases is slowly decreasing. In conclusion, almost one-third of the Web services published to date have gone out of service. We recommend continued support of Web-based services to increase the reproducibility of published results. © The Author 2017. Published by Oxford University Press.
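As a back-of-the-envelope reading of the reported 10.39-year half-life: if one assumes simple exponential decay (a simplification — the study reports a much slower drop in the first two years), the expected fraction of services still online after t years is 0.5 raised to t divided by the half-life.

```python
def surviving_fraction(years, half_life=10.39):
    """Fraction of services expected online after `years`, assuming
    pure exponential decay with the reported half-life."""
    return 0.5 ** (years / half_life)
```

Under this simplified model, roughly half of the surveyed services would be gone within a decade, which is consistent with the paper's one-third-offline figure for a corpus whose median entry is younger than the half-life.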
Analysis Tool Web Services from the EMBL-EBI.
McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo
2013-07-01
Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
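Services like dbfetch, named above, are driven by a handful of query parameters. The sketch below builds a retrieval URL following the common db/id/format convention; the parameter names match what EMBL-EBI has documented historically, but verify against the current EMBL-EBI documentation before relying on them, and the accession `P12345` is just a placeholder.

```python
from urllib.parse import urlencode

def dbfetch_url(db, entry_id, fmt="fasta"):
    """Build a dbfetch-style retrieval URL (parameter names assumed)."""
    base = "https://www.ebi.ac.uk/Tools/dbfetch/dbfetch"
    return base + "?" + urlencode({"db": db, "id": entry_id, "format": fmt})

url = dbfetch_url("uniprotkb", "P12345")
```

Because the interface is a plain GET request, the same URL can be fetched from a pipeline script, a workflow engine, or a browser without any client library.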
Analysis Tool Web Services from the EMBL-EBI
McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo
2013-01-01
Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods. PMID:23671338
Experimental Internet Environment Software Development
NASA Technical Reports Server (NTRS)
Maddux, Gary A.
1998-01-01
Geographically distributed project teams need an Internet based collaborative work environment or "Intranet." The Virtual Research Center (VRC) is an experimental Intranet server that combines several services such as desktop conferencing, file archives, on-line publishing, and security. Using the World Wide Web (WWW) as a shared space paradigm, the Graphical User Interface (GUI) presents users with images of a lunar colony. Each project has a wing of the colony and each wing has a conference room, library, laboratory, and mail station. In FY95, the VRC development team proved the feasibility of this shared space concept by building a prototype using a Netscape commerce server and several public domain programs. Successful demonstrations of the prototype resulted in approval for a second phase. Phase 2, documented by this report, will produce a seamlessly integrated environment by introducing new technologies such as Java and Adobe Web Links to replace less efficient interface software.
KernPaeP - a web-based pediatric palliative documentation system for home care.
Hartz, Tobias; Verst, Hendrik; Ueckert, Frank
2009-01-01
KernPaeP is a new web-based on- and offline documentation system, which has been developed for pediatric palliative care teams, supporting patient documentation and communication among health care professionals. It provides a reliable system making fast and secure home care documentation possible. KernPaeP is accessible online by registered users using any web browser. Home care teams use an offline version of KernPaeP running on a netbook for patient documentation on site. Identifying and medical patient data are strictly separated and stored on two database servers. The system offers a stable, enhanced two-way algorithm for synchronization between the offline component and the central database servers. KernPaeP is implemented meeting highest security standards while still maintaining high usability. The web-based documentation system allows ubiquitous and immediate access to patient data. Cumbersome paperwork is replaced by secure and comprehensive electronic documentation. KernPaeP helps save time and improves the quality of documentation. Due to development in close cooperation with pediatric palliative professionals, KernPaeP fulfils the broad needs of home-care documentation. The technique of web-based online and offline documentation is in general applicable for arbitrary home care scenarios.
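The two-way synchronization step between the offline component and the central servers can be sketched in outline. KernPaeP's actual algorithm is not published in this abstract; the code below is a hedged illustration of the common last-write-wins idea, assuming each record carries a modification timestamp.

```python
# Hedged sketch of a two-way record synchronization step (illustrative;
# not KernPaeP's published algorithm). Each store maps a record id to a
# (timestamp, payload) pair.

def sync_records(local, remote):
    """Merge two {record_id: (timestamp, payload)} stores.

    For each id, the version with the newer timestamp wins; ids present
    on only one side are kept unchanged.
    """
    merged = dict(remote)
    for rid, (ts, payload) in local.items():
        if rid not in merged or ts > merged[rid][0]:
            merged[rid] = (ts, payload)
    return merged
```

A real implementation would also have to handle deletions and conflicting concurrent edits, which last-write-wins alone cannot resolve safely.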
Biological Web Service Repositories Review.
Urdidiales-Nieto, David; Navas-Delgado, Ismael; Aldana-Montes, José F
2017-05-01
Web services play a key role in bioinformatics, enabling the integration of database access and analysis algorithms. However, Web service repositories do not usually publish information on the changes made to their registered Web services. Dynamism is directly related to the changes in the repositories (services registered or unregistered) and at service level (annotation changes). Thus, users, software clients or workflow-based approaches lack enough relevant information to decide when they should review or re-execute a Web service or workflow to get updated or improved results. The dynamism of the repository could be a measure for workflow developers to re-check service availability and annotation changes in the services of interest to them. This paper presents a review of the most well-known Web service repositories in the life sciences, including an analysis of their dynamism. Freshness is introduced in this paper and has been used as the measure for the dynamism of these repositories. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
The value of Web-based library services at Cedars-Sinai Health System.
Halub, L P
1999-07-01
Cedars-Sinai Medical Library/Information Center has maintained Web-based services since 1995 on the Cedars-Sinai Health System network. In that time, the librarians have found the provision of Web-based services to be a very worthwhile endeavor. Library users value the services that they access from their desktops because the services save time. They also appreciate being able to access services at their convenience, without restriction by the library's hours of operation. The library values its Web site because it brings increased visibility within the health system, and it enables library staff to expand services when budget restrictions have forced reduced hours of operation. In creating and maintaining the information center Web site, the librarians have learned the following lessons: consider the design carefully; offer what services you can, but weigh the advantages of providing the services against the time required to maintain them; make the content as accessible as possible; promote your Web site; and make friends in other departments, especially information services. PMID:10427423
MedlinePlus Connect: How it Works
... it looks depends on how it is implemented. Web Application The Web application returns a formatted response ... for more examples of Web Application response pages. Web Service The MedlinePlus Connect REST-based Web service ...
Unifying Access to National Hydrologic Data Repositories via Web Services
NASA Astrophysics Data System (ADS)
Valentine, D. W.; Jennings, B.; Zaslavsky, I.; Maidment, D. R.
2006-12-01
The CUAHSI hydrologic information system (HIS) is designed to be a live, multiscale web portal system for accessing, querying, visualizing, and publishing distributed hydrologic observation data and models for any location or region in the United States. The HIS design follows the principles of open service-oriented architecture, i.e. system components are represented as web services with well defined standard service APIs. WaterOneFlow web services are the main component of the design. The currently available services have been completely re-written compared to the previous version, and provide programmatic access to USGS NWIS (stream flow, groundwater and water quality repositories), DAYMET daily observations, NASA MODIS, and Unidata NAM streams, with several additional web service wrappers being added (EPA STORET, NCDC and others). Different repositories of hydrologic data use different vocabularies, and support different types of query access. Resolving semantic and structural heterogeneities across different hydrologic observation archives and distilling a generic set of service signatures is one of the main scalability challenges in this project, and a requirement in our web service design. To accomplish the uniformity of the web services API, data repositories are modeled following the CUAHSI Observation Data Model. Access to station metadata is provided via the web service methods GetSites, GetSiteInfo and GetVariableInfo. These methods form the foundation of the CUAHSI HIS discovery interface and may execute over locally-stored metadata or request the information from remote repositories directly. Observation values are retrieved via a generic GetValues method which is executed against national data repositories. The web service responses are document-based, and use an XML schema to express the semantics in a standard format. The service is implemented in ASP.NET, and other providers are implementing WaterOneFlow services in Java.
Reference implementation of WaterOneFlow web services is available. More information about the ongoing development of CUAHSI HIS is available from http://www.cuahsi.org/his/.
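The GetValues call named above takes a site, a variable and a time range. The sketch below assembles such a request's arguments; the "network:code" identifier convention and parameter names follow the general WaterOneFlow pattern but are assumptions that should be verified against the service's WSDL.

```python
# Illustrative sketch of assembling arguments for a WaterOneFlow-style
# GetValues call (parameter names assumed; check the service WSDL).

def get_values_args(network, site_code, variable_code, begin, end):
    """Return the argument mapping for a GetValues request."""
    return {
        "location": f"{network}:{site_code}",      # e.g. "NWIS:08158000"
        "variable": f"{network}:{variable_code}",  # e.g. "NWIS:00060" (discharge)
        "startDate": begin,                        # ISO 8601 date strings
        "endDate": end,
    }
```

A SOAP client generated from the WSDL would then pass these arguments to the GetValues operation and parse the returned XML document.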
A Privacy Access Control Framework for Web Services Collaboration with Role Mechanisms
NASA Astrophysics Data System (ADS)
Liu, Linyuan; Huang, Zhiqiu; Zhu, Haibin
With the popularity of Internet technology, web services are becoming the most promising paradigm for distributed computing. This increased use of web services has meant that more and more personal information of consumers is being shared with web service providers, leading to the need to guarantee the privacy of consumers. This paper proposes a role-based privacy access control framework for Web services collaboration. It utilizes roles to specify the privacy privileges of services, and considers the impact of the historic experience of services in playing roles on their reputation degree. Compared to traditional privacy access control approaches, this framework can make fine-grained authorization decisions, thus efficiently protecting consumers' privacy.
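The core check in such a framework can be sketched briefly: a role grants a set of privacy privileges, and a service must also meet a reputation threshold earned from its history of playing roles. The role names, privilege strings and threshold below are hypothetical, not taken from the paper.

```python
# Hedged sketch of a role-based privacy authorization check. Roles,
# privilege names and the reputation threshold are invented examples.

ROLE_PRIVILEGES = {
    "billing":  {"read:address", "read:payment"},
    "shipping": {"read:address"},
}

def authorize(role, privilege, reputation, min_reputation=0.5):
    """Grant access only if the role carries the requested privacy
    privilege AND the service's reputation degree meets the threshold."""
    return (privilege in ROLE_PRIVILEGES.get(role, set())
            and reputation >= min_reputation)
```

This gives fine-grained decisions in the sense that each privilege is checked individually per role, rather than granting a service blanket access to a consumer's profile.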
Science and Technology Resources on the Internet: Computer Security.
ERIC Educational Resources Information Center
Kinkus, Jane F.
2002-01-01
Discusses issues related to computer security, including confidentiality, integrity, and authentication or availability; and presents a selected list of Web sites that cover the basic issues of computer security under subject headings that include ethics, privacy, kids, antivirus, policies, cryptography, operating system security, and biometrics.…
SIDECACHE: Information access, management and dissemination framework for web services.
Doderer, Mark S; Burkhardt, Cory; Robbins, Kay A
2011-06-14
Many bioinformatics algorithms and data sets are deployed using web services so that the results can be explored via the Internet and easily integrated into other tools and services. These services often include data from other sites that is accessed either dynamically or through file downloads. Developers of these services face several problems because of the dynamic nature of the information from the upstream services. Many publicly available repositories of bioinformatics data frequently update their information. When such an update occurs, the developers of the downstream service may also need to update. For file downloads, this process is typically performed manually followed by web service restart. Requests for information obtained by dynamic access of upstream sources is sometimes subject to rate restrictions. SideCache provides a framework for deploying web services that integrate information extracted from other databases and from web sources that are periodically updated. This situation occurs frequently in biotechnology where new information is being continuously generated and the latest information is important. SideCache provides several types of services including proxy access and rate control, local caching, and automatic web service updating. We have used the SideCache framework to automate the deployment and updating of a number of bioinformatics web services and tools that extract information from remote primary sources such as NCBI, NCIBI, and Ensembl. The SideCache framework also has been used to share research results through the use of a SideCache derived web service.
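Two of the SideCache services described above, local caching and rate control for an upstream source, can be illustrated with a small proxy. This is a sketch of the pattern under stated assumptions (an injected fetch callable, a fixed minimum interval between upstream requests), not SideCache's actual API.

```python
# Sketch of a caching, rate-limited proxy for an upstream data source
# (pattern illustration only; not the SideCache implementation).
import time

class CachingProxy:
    def __init__(self, fetch, min_interval=1.0, clock=time.monotonic):
        self.fetch = fetch                # callable: key -> value (upstream access)
        self.min_interval = min_interval  # minimum seconds between upstream hits
        self.clock = clock
        self.cache = {}
        self._last_fetch = None

    def get(self, key):
        if key in self.cache:
            return self.cache[key]        # served locally; no upstream request
        now = self.clock()
        if self._last_fetch is not None and now - self._last_fetch < self.min_interval:
            # rate control: wait out the remainder of the interval
            time.sleep(self.min_interval - (now - self._last_fetch))
        self._last_fetch = self.clock()
        value = self.fetch(key)
        self.cache[key] = value
        return value
```

A periodic task invalidating the cache when the upstream source publishes an update would complete the picture of automatic service updating.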
Design for Connecting Spatial Data Infrastructures with Sensor Web (sensdi)
NASA Astrophysics Data System (ADS)
Bhattacharya, D.; M., M.
2016-06-01
Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The research seeks to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures, and connecting the interfaces of SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open source SDI, match the APIs and functions between Sensor Web and SDI, and conduct case studies such as hazard and urban applications. We take up co-operative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS and Metadata on the SDI side and, on the Sensor Web side, the 'Sensor Observation Service' (SOS); the 'Sensor Planning Service' (SPS); the 'Sensor Alert Service' (SAS); and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS). In conclusion, it is important for geospatial studies to integrate SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of SDI and Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.
Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.
2011-01-01
Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information science software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.
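One commonly needed derivative such a Web processing service computes is a spatial statistic over a polygon of interest. The sketch below shows an area-weighted mean of gridded values, given the area (or covered fraction) of each intersecting cell; the inputs are illustrative, not the Geo Data Portal's actual API.

```python
# Hedged sketch of an area-weighted mean, a typical spatial statistic
# produced by a Web processing service (inputs are illustrative).

def area_weighted_mean(values, weights):
    """values: per-cell data values; weights: per-cell overlap areas.

    Returns sum(v_i * w_i) / sum(w_i), the mean weighted by how much of
    each grid cell falls inside the polygon of interest.
    """
    total = sum(weights)
    if total == 0:
        raise ValueError("no overlapping area")
    return sum(v * w for v, w in zip(values, weights)) / total
```

Verifying such outputs against an independent GIS package, as the project did with ArcGIS 9.3.1, is a sensible confirmation step.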
Trust and Online Reputation Systems
NASA Astrophysics Data System (ADS)
Kwan, Ming; Ramachandran, Deepak
Web 2.0 technologies provide organizations with unprecedented opportunities to expand and solidify relationships with their customers, partners, and employees—while empowering firms to define entirely new business models focused on sharing information in online collaborative environments. Yet, in and of themselves, these technologies cannot ensure productive online interactions. Leading enterprises that are experimenting with social networks and online communities are already discovering this fact and along with it, the importance of establishing trust as the foundation for online collaboration and transactions. Just as today's consumers must feel secure to bank, exchange personal information and purchase products and services online; participants in Web 2.0 initiatives will only accept the higher levels of risk and exposure inherent in e-commerce and Web collaboration in an environment of trust. Indeed, only by attending to the need to cultivate online trust with customers, partners and employees will enterprises ever fully exploit the expanded business potential posed by Web 2.0. But developing online trust is no easy feat. While various preliminary attempts have occurred, no definitive model for establishing or measuring it has yet been established. To that end, nGenera has identified three, distinct dimensions of online trust: reputation (quantitative-based); relationship (qualitative-based) and process (system-based). When considered together, they form a valuable model for understanding online trust and a toolbox for cultivating it to support Web 2.0 initiatives.
Dynamic Generation of Reduced Ontologies to Support Resource Constraints of Mobile Devices
ERIC Educational Resources Information Center
Schrimpsher, Dan
2011-01-01
As Web Services and the Semantic Web become more important, enabling technologies such as web service ontologies will grow larger. At the same time, use of mobile devices to access web services has doubled in the last year. The ability of these resource constrained devices to download and reason across these ontologies to support service discovery…
A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services
NASA Astrophysics Data System (ADS)
Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.
2015-12-01
Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environments, making it easy for modelers to maintain and update them. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) model metadata describing the external features of a model (e.g., variable name, unit, computational grid, etc.) and (3) a model integration framework. We present the architecture of coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing, uncovering their metadata through BMI functions. After a BMI-enabled model is exposed as a web service, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing the model interface using BMI, as well as providing a set of utilities smoothing the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of BMI-enabled web service models. Using the revised EMELI, an example will be presented on integrating a set of topoflow model components that are BMI-enabled and exposed as web services.
Reference: Peckham, S.D. (2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014, 11th International Conf. on Hydroinformatics, New York, NY.
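The BMI contract that lets a framework like EMELI initialize, step, and query any model without knowing its internals can be sketched with a toy component. This is an illustrative reduction of BMI to a few functions, not the full CSDMS interface; the model and variable name are invented.

```python
# Toy model exposing a few BMI-like functions (illustrative reduction of
# the CSDMS Basic Model Interface; model and variable names are invented).

class LinearReservoir:
    """Drains a fixed fraction of its storage each time step."""

    def initialize(self, config):
        self.k = config.get("k", 0.5)            # outflow fraction per step
        self.storage = config.get("storage", 1.0)
        self.time = 0

    def update(self):
        self.storage *= (1.0 - self.k)           # advance one time step
        self.time += 1

    def get_output_var_names(self):
        return ["reservoir__storage"]

    def get_value(self, name):
        if name != "reservoir__storage":
            raise KeyError(name)
        return self.storage
```

When such a model is exposed as a web service, each of these functions maps onto a remote operation, so the framework can drive and introspect the model over the network exactly as it would locally.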
From Sensor to Observation Web with Environmental Enablers in the Future Internet
Havlik, Denis; Schade, Sven; Sabeur, Zoheir A.; Mazzetti, Paolo; Watson, Kym; Berre, Arne J.; Mon, Jose Lorenzo
2011-01-01
This paper outlines the grand challenges in global sustainability research and the objectives of the FP7 Future Internet PPP program within the Digital Agenda for Europe. Large user communities are generating significant amounts of valuable environmental observations at local and regional scales using the devices and services of the Future Internet. These communities' environmental observations represent a wealth of information which is currently hardly used, or used only in isolation, and is therefore in need of integration with other information sources. Indeed, this very integration will lead to a paradigm shift from a mere Sensor Web to an Observation Web with semantically enriched content emanating from sensors, environmental simulations and citizens. The paper also describes the research challenges to realize the Observation Web and the associated environmental enablers for the Future Internet. Such an environmental enabler could for instance be an electronic sensing device, a web-service application, or even a social networking group affording or facilitating the capability of the Future Internet applications to consume, produce, and use environmental observations in cross-domain applications. The term "envirofied" Future Internet is coined to describe this overall target that forms a cornerstone of work in the Environmental Usage Area within the Future Internet PPP program. Relevant trends described in the paper are the usage of ubiquitous sensors (anywhere), the provision and generation of information by citizens, and the convergence of real and virtual realities to convey understanding of environmental observations. The paper addresses the technical challenges in the Environmental Usage Area and the need for designing a multi-style service-oriented architecture. Key topics are the mapping of requirements to capabilities, providing scalability and robustness, and implementing context-aware information retrieval.
Another essential research topic is handling data fusion and model-based computation, and the related propagation of information uncertainty. Approaches to security, standardization and harmonization, all essential for sustainable solutions, are summarized from the perspective of the Environmental Usage Area. The paper concludes with an overview of emerging, high-impact applications in the environmental areas concerning land ecosystems (biodiversity), air quality (atmospheric conditions) and water ecosystems (marine asset management). PMID:22163827
Innovative Quality-Assurance Strategies for Tuberculosis Surveillance in the United States
Manangan, Lilia Ponce; Tryon, Cheryl; Magee, Elvin; Miramontes, Roque
2012-01-01
Introduction. The Centers for Disease Control and Prevention (CDC)'s National Tuberculosis Surveillance System (NTSS) is the national repository of tuberculosis (TB) data in the United States. Jurisdictions report to NTSS through the Report of Verified Case of Tuberculosis (RVCT) form that transitioned to a web-based system in 2009. Materials and Methods. To improve RVCT data quality, CDC conducted a quality assurance (QA) needs assessment to develop QA strategies. These include QA components (case detection, data accuracy, completeness, timeliness, data security, and confidentiality); sample tools such as National TB Indicators Project (NTIP) to identify TB case reporting discrepancies; comprehensive training course; resource guide and toolkit. Results and Discussion. During July–September 2011, 73 staff from 34 (57%) of 60 reporting jurisdictions participated in QA training. Participants stated usefulness of sharing jurisdictions' QA methods; 66 (93%) wrote that the QA tools will be effective for their activities. Several jurisdictions reported implementation of QA tools pertinent to their programs. Data showed >8% increase in NTSS and NTIP enrollment through Secure Access Management Services, which monitors system usage, from August 2011–February 2012. Conclusions. Despite challenges imposed by web-based surveillance systems, QA strategies can be developed with innovation and collaboration. These strategies can also be used by other disease programs to ensure high data quality. PMID:22685648
NASA Astrophysics Data System (ADS)
Wang, Tusheng; Yang, Yuanyuan; Zhang, Jianguo
2013-03-01
In order to enable multiple disciplines of medical researchers, clinical physicians and biomedical engineers to work together in a secured, efficient, and transparent cooperative environment, we designed an e-Science platform for biomedical imaging research and application across multiple academic institutions and hospitals in Shanghai, using grid-based or cloud-based distributed architecture, and presented this work at the SPIE Medical Imaging conference held in San Diego in 2012. However, as the platform integrates more and more nodes over different networks, the first challenge is how to monitor and maintain all the hosts and services operating across multiple academic institutions and hospitals in the e-Science platform, such as DICOM and Web-based image communication services, messaging services and XDS ITI transaction services. In this presentation, we present a system design and implementation of intelligent monitoring and management which can collect the system resource status of every node in real time, alert when a node or service failure occurs, and thereby improve the robustness, reliability and service continuity of this e-Science platform.
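The alert-on-failure idea can be sketched as a heartbeat check: each node reports status periodically, and the monitor flags any node whose last report is older than a timeout. The node names and threshold below are hypothetical, chosen only to illustrate the mechanism.

```python
# Hedged sketch of heartbeat-based failure detection (node names and
# timeout are invented examples, not the presented system's values).

def find_failed_nodes(last_seen, now, timeout=30.0):
    """Return nodes whose last status report is older than `timeout` seconds.

    last_seen: {node_name: timestamp of last report}, now: current time.
    """
    return sorted(node for node, ts in last_seen.items() if now - ts > timeout)
```

A monitoring daemon would run this check on a schedule and raise an alert (e-mail, dashboard flag) for each returned node or service.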
Managing the Web-Enhanced Geographic Information Service.
ERIC Educational Resources Information Center
Stephens, Denise
1997-01-01
Examines key management issues involved in delivering geographic information services on the World Wide Web, using the Geographic Information Center (GIC) program at the University of Virginia Library as a reference. Highlights include integrating the Web into services; building collections for Web delivery; and evaluating spatial information…
Automated geospatial Web Services composition based on geodata quality requirements
NASA Astrophysics Data System (ADS)
Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael
2012-10-01
Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
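The matching step at the heart of automated composition can be sketched simply: repeatedly select services whose inputs are already satisfied until the goal data product is derivable. Real AI planners, and the paper's conditional planning over geodata quality rules, are considerably richer; the greedy forward chaining and the service names below are illustrative assumptions.

```python
# Hedged sketch of input/output chaining for service composition
# (greedy forward chaining; a stand-in for real AI planning methods).

def compose(services, available, goal):
    """services: {name: (required_inputs_set, produced_output)}.

    Returns an ordered list of service names deriving `goal` from the
    initially `available` data types, or None if no chain exists.
    """
    have, plan = set(available), []
    progress = True
    while goal not in have and progress:
        progress = False
        for name, (inputs, output) in services.items():
            if name not in plan and inputs <= have:
                plan.append(name)
                have.add(output)
                progress = True
    return plan if goal in have else None
```

A conditional planner would additionally branch on quality-rule outcomes (e.g., insert a correction service when a nonconformity with geodata quality is detected at run time).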
BioCatalogue: a universal catalogue of web services for the life sciences
Bhagat, Jiten; Tanoh, Franck; Nzuobontane, Eric; Laurent, Thomas; Orlowski, Jerzy; Roos, Marco; Wolstencroft, Katy; Aleksejevs, Sergejs; Stevens, Robert; Pettifer, Steve; Lopez, Rodrigo; Goble, Carole A.
2010-01-01
The use of Web Services to enable programmatic access to on-line bioinformatics is becoming increasingly important in the Life Sciences. However, their number, distribution and the variable quality of their documentation can make their discovery and subsequent use difficult. A Web Services registry with information on available services will help to bring together service providers and their users. The BioCatalogue (http://www.biocatalogue.org/) provides a common interface for registering, browsing and annotating Web Services to the Life Science community. Services in the BioCatalogue can be described and searched in multiple ways based upon their technical types, bioinformatics categories, user tags, service providers or data inputs and outputs. They are also subject to constant monitoring, allowing the identification of service problems and changes and the filtering-out of unavailable or unreliable resources. The system is accessible via a human-readable ‘Web 2.0’-style interface and a programmatic Web Service interface. The BioCatalogue follows a community approach in which all services can be registered, browsed and incrementally documented with annotations by any member of the scientific community. PMID:20484378
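The multi-facet search a registry like the BioCatalogue offers (by category, tag, provider, and so on) can be sketched over an in-memory catalogue. The record fields and example services below are invented for illustration and do not reflect the BioCatalogue's actual data model or API.

```python
# Minimal sketch of faceted registry search (record fields and example
# data are invented; not the BioCatalogue's data model).

def search(registry, **facets):
    """registry: list of dicts with 'name', 'category', 'tags', 'provider'.

    A service matches when every facet value equals the field, or is a
    member of it when the field is a collection (e.g. the tag set).
    """
    results = []
    for svc in registry:
        if all(value == svc.get(key) or
               (isinstance(svc.get(key), (set, list)) and value in svc[key])
               for key, value in facets.items()):
            results.append(svc["name"])
    return results
```

A production registry would add monitoring status as a further facet, so unavailable or unreliable services can be filtered out of the results.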
Security Aspects of an Enterprise-Wide Network Architecture.
ERIC Educational Resources Information Center
Loew, Robert; Stengel, Ingo; Bleimann, Udo; McDonald, Aidan
1999-01-01
Presents an overview of two projects that concern local area networks and the common point between networks as they relate to network security. Discusses security architectures based on firewall components, packet filters, application gateways, security-management components, an intranet solution, user registration by Web form, and requests for…
NASA Astrophysics Data System (ADS)
Fulker, D. W.; Gallagher, J. H. R.
2015-12-01
OPeNDAP's Hyrax data server is an open-source framework fostering interoperability via easily-deployed Web services. Compatible with solutions listed in the (PA001) session description—federation, rigid standards and brokering/mediation—the framework can support tight or loose coupling, even with dependence on community-contributed software. Hyrax is a Web-services framework with a middleware-like design and a handler-style architecture that together reduce the interoperability challenge (for N datatypes and M user contexts) to an O(N+M) problem, similar to brokering. Combined with an open-source ethos, this reduction makes Hyrax a community tool for gaining interoperability. E.g., in its response to the Big Earth Data Initiative (BEDI), NASA references OPeNDAP-based interoperability. Assuming its suitability, the question becomes: how sustainable is OPeNDAP, a small not-for-profit that produces open-source software, i.e., has no software-sales? In other words, if geoscience interoperability depends on OPeNDAP and similar organizations, are those entities in turn sustainable? Jim Collins (in Good to Great) highlights three questions that successful companies can answer (paraphrased here): What is your passion? Where is your world-class excellence? What drives your economic engine? We attempt to shed light on OPeNDAP sustainability by examining these. Passion: OPeNDAP has a focused passion for improving the effectiveness of scientific data sharing and use, as deeply-cooperative community endeavors. Excellence: OPeNDAP has few peers in remote, scientific data access. Skills include computer science with experience in data science, (operational, secure) Web services, and software design (for servers and clients, where the latter vary from Web pages to standalone apps and end-user programs). Economic Engine: OPeNDAP is an engineering services organization more than a product company, despite software being key to OPeNDAP's reputation. 
In essence, provision of engineering expertise, via contracts and grants, is the economic engine. Hence sustainability, as needed to address global grand challenges in geoscience, depends on agencies' and others' abilities and willingness to offer grants and let contracts for continually upgrading open-source software from OPeNDAP and others.
A Study on Technology Architecture and Serving Approaches of Electronic Government System
NASA Astrophysics Data System (ADS)
Liu, Chunnian; Huang, Yiyun; Pan, Qin
As E-government becomes a very active research area, many solutions to meet citizens' needs are being deployed. This paper provides a technology architecture for E-government systems and approaches to service delivery in Public Administrations. The proposed electronic system addresses the basic E-government requirements of user friendliness, security, interoperability, transparency and effectiveness in the communication between small and medium sized public organizations and their citizens, businesses and other public organizations. The paper presents several serving approaches for E-government, including SOA, web services, mobile E-government and public libraries, each with its own characteristics and application scenarios. Still, a number of E-government issues remain for further research, including organization structure change, research methodology, and data collection and analysis.
Customer Decision Making in Web Services with an Integrated P6 Model
NASA Astrophysics Data System (ADS)
Sun, Zhaohao; Sun, Junqing; Meredith, Grant
Customer decision making (CDM) is an indispensable factor for web services. This article examines CDM in web services with a novel P6 model, which consists of the 6 Ps: privacy, perception, propensity, preference, personalization and promised experience. This model integrates the existing 6 P elements of the marketing mix as the system environment of CDM in web services. The new integrated P6 model deals with the inner world of the customer and incorporates what the customer thinks during the DM process. The proposed approach will facilitate the research and development of web services and decision support systems.
Choi, Okkyung; Han, SangYong
2007-01-01
Ubiquitous Computing makes it possible to determine in real time the location and situations of service requesters in a web service environment, as it enables access to computers at any time and in any place. Though research on various aspects of ubiquitous commerce is progressing at enterprises and research centers, both domestically and overseas, analysis of a customer's personal preferences based on the semantic web and on rule-based services using semantics is not currently being conducted. This paper proposes a Ubiquitous Computing Services System that enables rule-based as well as semantics-based search, supporting the combination of the electronic space and the physical space into one and thereby making possible the real-time search for web services and the construction of efficient web services.
NASA Astrophysics Data System (ADS)
Gupta, V.; Gupta, N.; Gupta, S.; Field, E.; Maechling, P.
2003-12-01
Modern laptop computers, and personal computers, can provide capabilities that are, in many ways, comparable to workstations or departmental servers. However, this doesn't mean we should run all computations on our local computers. We have identified several situations in which it is preferable to implement our seismological application programs in a distributed, server-based, computing model. In this model, application programs on the user's laptop, or local computer, invoke programs that run on an organizational server, and the results are returned to the invoking system. Situations in which a server-based architecture may be preferred include: (a) a program is written in a language, or written for an operating environment, that is unsupported on the local computer, (b) software libraries or utilities required to execute a program are not available on the user's computer, (c) a computational program is physically too large, or computationally too expensive, to run on a user's computer, (d) a user community wants to enforce a consistent method of performing a computation by standardizing on a single implementation of a program, and (e) the computational program may require current information that is not available to all client computers. Until recently, distributed, server-based, computational capabilities were implemented using client/server architectures. In these architectures, client programs were often written in the same language, and they executed in the same computing environment, as the servers. Recently, a new distributed computational model, called Web Services, has been developed. Web Services are based on Internet standards such as XML, SOAP, WSDL, and UDDI. Web Services offer the promise of platform- and language-independent distributed computing. To investigate this new computational model, and to provide useful services to the SCEC Community, we have implemented several computational and utility programs using a Web Service architecture.
We have hosted these Web Services as a part of the SCEC Community Modeling Environment (SCEC/CME) ITR Project (http://www.scec.org/cme). We have implemented Web Services for several of the reasons cited previously. For example, we implemented a FORTRAN-based Earthquake Rupture Forecast (ERF) as a Web Service for use by client computers that don't support a FORTRAN runtime environment. We implemented a Generic Mapping Tool (GMT) Web Service for use by systems that don't have local access to GMT. We implemented a Hazard Map Calculator Web Service to execute Hazard calculations that are too computationally intensive to run on a local system. We implemented a Coordinate Conversion Web Service to enforce a standard and consistent method for converting between UTM and Lat/Lon. Our experience developing these services indicates both strengths and weaknesses in current Web Service technology. Client programs that utilize Web Services typically need network access, a significant disadvantage at times. Programs with simple input and output parameters were the easiest to implement as Web Services, while programs with complex parameter types required a significant amount of additional development. We also noted that Web Services are very data-oriented, and adapting object-oriented software into the Web Service model proved problematic. Also, the Web Service approach of converting data types into XML format for network transmission has significant inefficiencies for some data sets.
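The XML-serialization point above can be made concrete with a minimal SOAP request envelope. The operation and parameter names below are hypothetical, not the actual SCEC service interface; only the envelope/body structure follows the SOAP standard.

```python
# Build a minimal SOAP 1.1 request envelope: every call, whatever the
# platform or language, travels as XML of this shape.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(operation, params):
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, operation)
    for name, value in params.items():
        ET.SubElement(op, name).text = str(value)
    return ET.tostring(env, encoding="unicode")

# Hypothetical coordinate-conversion operation and parameters.
xml_req = build_request("ConvertLatLonToUTM", {"lat": 34.05, "lon": -118.25})
print(xml_req)
```

Even this tiny example hints at the overhead the abstract mentions: two floats become a few hundred bytes of tagged text before they ever reach the network.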
The Organizational Role of Web Services
ERIC Educational Resources Information Center
Mitchell, Erik
2011-01-01
The workload of Web librarians is already split between Web-related and other library tasks. But today's technological environment has created new implications for existing services and new demands for staff time. It is time to reconsider how libraries can best allocate resources to provide effective Web services. Delivering high-quality services…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-01
...-NEW] Agency Information Collection Activities: Online Survey of Web Services Employers; New... Web site at http://www.Regulations.gov under e-Docket ID number USCIS-2013- 0003. When submitting... information collection. (2) Title of the Form/Collection: Online Survey of Web Services Employers. (3) Agency...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-16
...-NEW] Agency Information Collection Activities: Online Survey of Web Services Employers; New... Information Collection: New information collection. (2) Title of the Form/Collection: Online Survey of Web... sector. It is necessary that USCIS obtains data on the E-Verify Program Web Services. Gaining an...
WebTag: Web browsing into sensor tags over NFC.
Echevarria, Juan Jose; Ruiz-de-Garibay, Jonathan; Legarda, Jon; Alvarez, Maite; Ayerbe, Ana; Vazquez, Juan Ignacio
2012-01-01
Information and Communication Technologies (ICTs) continue to overcome many of the challenges related to wireless sensor monitoring, such as the design of smarter embedded processors, the improvement of network architectures, the development of efficient communication protocols or the maximization of life-cycle autonomy. This work aims to improve the data-transmission communication link in wireless sensor monitoring. The upstream communication link is usually based on standard IP technologies, but the downstream side is always masked with the proprietary protocols used for the wireless link (like ZigBee, Bluetooth, RFID, etc.). This work presents a novel solution (WebTag) for direct IP-based access to a sensor tag over the Near Field Communication (NFC) technology for secure applications. WebTag allows direct web access to the sensor tag by means of a standard web browser: it reads the sensor data, configures the sampling rate and implements IP-based security policies. It is, definitely, a new step towards the evolution of the Internet of Things paradigm.
Incentives to Encourage Scientific Web Contribution (Invited)
NASA Astrophysics Data System (ADS)
Antunes, A. K.
2010-12-01
We suggest improvements to citation standards and creation of remuneration opportunities to encourage career scientist contributions to Web2.0 and social media science channels. At present, agencies want to accomplish better outreach and engagement with no funding, while scientists sacrifice their personal time to contribute to web and social media sites. Securing active participation by scientists requires career recognition of the value scientists provide to web knowledge bases and to the general public. One primary mechanism to encourage participation is citation standards, which let contributors improve their reputation in a quantifiable way. But such standards must be recognized by their scientific and workplace communities. Using case studies such as the acceptance of the web in the workplace and the growth of open access journals, we examine what agencies and individuals can do, as well as the time scales needed to secure increased active contribution by scientists. We also discuss ways to jumpstart this process.
NASA Astrophysics Data System (ADS)
Zeitz, Christian; Scheidat, Tobias; Dittmann, Jana; Vielhauer, Claus; González Agulla, Elisardo; Otero Muras, Enrique; García Mateo, Carmen; Alba Castro, José L.
2008-02-01
Besides the optimization of biometric error rates, the overall security system performance with respect to intentional security attacks plays an important role for biometric-enabled authentication schemes. As most traditional user authentication schemes are knowledge- and/or possession-based, we first present a methodology for a security analysis of Internet-based biometric authentication systems, enhancing known methodologies such as the CERT attack taxonomy with a more detailed view on the OSI model. Secondly, as proof of concept, the guidelines extracted from this methodology are strictly applied to an open-source Internet-based biometric authentication system (BioWebAuth). As case studies, two exemplary attacks, based on the security leaks found, are investigated, and the attack performance is presented to show that in biometric authentication schemes security issues need to be addressed alongside biometric error performance tuning. Finally, some design recommendations are given in order to ensure a minimum security level.
A Web service substitution method based on service cluster nets
NASA Astrophysics Data System (ADS)
Du, YuYue; Gai, JunJing; Zhou, MengChu
2017-11-01
Service substitution is an important research topic in the fields of Web services and service-oriented computing. This work presents a novel method to analyse and substitute Web services. A new concept, called a Service Cluster Net Unit, is proposed based on Web service clusters. A service cluster is converted into a Service Cluster Net Unit. Then it is used to analyse whether the services in the cluster can satisfy some service requests. Meanwhile, the substitution methods of an atomic service and a composite service are proposed. The correctness of the proposed method is proved, and the effectiveness is shown and compared with the state-of-the-art method via an experiment. It can be readily applied to e-commerce service substitution to meet the business automation needs.
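A drastically simplified version of the interface-compatibility condition behind service substitution, ignoring the Petri-net machinery of Service Cluster Nets, might look like the following sketch (service definitions are invented):

```python
# Interface-level substitution test: service B can stand in for service A
# if B requires no more inputs than A receives and delivers at least A's
# outputs. This is a simplification of the net-based analysis in the paper.
def can_substitute(a, b):
    return b["inputs"] <= a["inputs"] and b["outputs"] >= a["outputs"]

pay_v1 = {"inputs": {"order_id", "card"}, "outputs": {"receipt"}}
pay_v2 = {"inputs": {"order_id"}, "outputs": {"receipt", "invoice"}}
print(can_substitute(pay_v1, pay_v2))  # True: v2 demands less, offers more
```

The relation is deliberately asymmetric: the leaner, more generous service may replace the stricter one, but not vice versa. The net-based method additionally checks behavioral (ordering) compatibility, which this set comparison cannot capture.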
myExperiment: a repository and social network for the sharing of bioinformatics workflows
Goble, Carole A.; Bhagat, Jiten; Aleksejevs, Sergejs; Cruickshank, Don; Michaelides, Danius; Newman, David; Borkum, Mark; Bechhofer, Sean; Roos, Marco; Li, Peter; De Roure, David
2010-01-01
myExperiment (http://www.myexperiment.org) is an online research environment that supports the social sharing of bioinformatics workflows. These workflows are procedures consisting of a series of computational tasks using web services, which may be performed on data from its retrieval, integration and analysis, to the visualization of the results. As a public repository of workflows, myExperiment allows anybody to discover those that are relevant to their research, which can then be reused and repurposed to their specific requirements. Conversely, developers can submit their workflows to myExperiment and enable them to be shared in a secure manner. Since its release in 2007, myExperiment has grown to over 3500 registered users and contains more than 1000 workflows. The social aspect of the sharing of these workflows is facilitated by registered users forming virtual communities bound together by a common interest or research project. Contributors of workflows can build their reputation within these communities by receiving feedback and credit from individuals who reuse their work. Further documentation about myExperiment, including its REST web service, is available from http://wiki.myexperiment.org. Feedback and requests for support can be sent to bugs@myexperiment.org. PMID:20501605
Applying World Wide Web technology to the study of patients with rare diseases.
de Groen, P C; Barry, J A; Schaller, W J
1998-07-15
Randomized, controlled trials of sporadic diseases are rarely conducted. Recent developments in communication technology, particularly the World Wide Web, allow efficient dissemination and exchange of information. However, software for the identification of patients with a rare disease and subsequent data entry and analysis in a secure Web database are currently not available. To study cholangiocarcinoma, a rare cancer of the bile ducts, we developed a computerized disease tracing system coupled with a database accessible on the Web. The tracing system scans computerized information systems on a daily basis and forwards demographic information on patients with bile duct abnormalities to an electronic mailbox. If informed consent is given, the patient's demographic and preexisting medical information available in medical database servers are electronically forwarded to a UNIX research database. Information from further patient-physician interactions and procedures is also entered into this database. The database is equipped with a Web user interface that allows data entry from various platforms (PC-compatible, Macintosh, and UNIX workstations) anywhere inside or outside our institution. To ensure patient confidentiality and data security, the database includes all security measures required for electronic medical records. The combination of a Web-based disease tracing system and a database has broad applications, particularly for the integration of clinical research within clinical practice and for the coordination of multicenter trials.
NASA Astrophysics Data System (ADS)
Seamon, E.; Gessler, P. E.; Flathers, E.; Sheneman, L.; Gollberg, G.
2013-12-01
The Regional Approaches to Climate Change for Pacific Northwest Agriculture (REACCH PNA) is a five-year USDA/NIFA-funded coordinated agriculture project to examine the sustainability of cereal crop production systems in the Pacific Northwest, in relationship to ongoing climate change. As part of this effort, an extensive data management system has been developed to enable researchers, students, and the public to upload, manage, and analyze various data. The REACCH PNA data management team has developed three core systems to encompass cyberinfrastructure and data management needs: 1) the reacchpna.org portal (https://www.reacchpna.org) is the entry point for all public and secure information, with secure access by REACCH PNA members for data analysis, uploading, and informational review; 2) the REACCH PNA Data Repository is a replicated, redundant database server environment that allows for file and database storage and access to all core data; and 3) the REACCH PNA Libraries, which are functional groupings of data for REACCH PNA members and the public, based on their access level. These libraries are accessible through our https://www.reacchpna.org portal. The developed system is structured in a virtual server environment (data, applications, web) that includes a geospatial database/geospatial web server for web mapping services (ArcGIS Server), use of ESRI's Geoportal Server for data discovery and metadata management (under the ISO 19115-2 standard), Thematic Realtime Environmental Distributed Data Services (THREDDS) for data cataloging, and Interactive Python notebook server (IPython) technology for data analysis. REACCH systems are housed and maintained by the Northwest Knowledge Network project (www.northwestknowledge.net), which provides data management services to support research. 
Initial project data harvesting and meta-tagging efforts have resulted in the interrogation and loading of over 10 terabytes of climate model output, regional entomological data, agricultural and atmospheric information, as well as imagery, publications, videos, and other soft content. In addition, the outlined data management approach has focused on the integration and interconnection of hard data (raw data output) with associated publications, presentations, or other narrative documentation - through metadata lineage associations. This harvest-and-consume data management methodology could additionally be applied to other research team environments that involve large and divergent data.
Development of web-based services for an ensemble flood forecasting and risk assessment system
NASA Astrophysics Data System (ADS)
Yaw Manful, Desmond; He, Yi; Cloke, Hannah; Pappenberger, Florian; Li, Zhijia; Wetterhall, Fredrik; Huang, Yingchun; Hu, Yuzhong
2010-05-01
Flooding is a widespread and devastating natural disaster worldwide. Floods that took place in the last decade in China were ranked the worst amongst recorded floods worldwide in terms of the number of human fatalities and economic losses (Munich Re-Insurance). Rapid economic development and population expansion into low lying flood plains has worsened the situation. Current conventional flood prediction systems in China are suited neither to the perceptible climate variability nor to the rapid pace of urbanization sweeping the country. Flood prediction, from short-term (a few hours) to medium-term (a few days), needs to be revisited and adapted to changing socio-economic and hydro-climatic realities. The latest technology requires implementation of multiple numerical weather prediction systems. The availability of twelve global ensemble weather prediction systems through the ‘THORPEX Interactive Grand Global Ensemble' (TIGGE) offers a good opportunity for an effective state-of-the-art early forecasting system. A prototype of a Novel Flood Early Warning System (NEWS) using the TIGGE database is tested in the Huai River basin in east-central China. It is the first early flood warning system in China that uses the massive TIGGE database cascaded with river catchment models, the Xinanjiang hydrologic model and a 1-D hydraulic model, to predict river discharge and flood inundation. The NEWS algorithm is also designed to provide web-based services to a broad spectrum of end-users. The latter presents challenges, as both databases and proprietary codes reside in different locations and converge at dissimilar times. NEWS will thus make use of a ready-to-run grid system that makes distributed computing and data resources available in a seamless and secure way. The ability to run on different operating systems and to provide a front end accessible to a broad spectrum of end-users is an additional requirement. 
The aim is to achieve robust interoperability through strong security and workflow capabilities. A physical network diagram and a work flow scheme of all the models, codes and databases used to achieve the NEWS algorithm are presented. They constitute a first step in the development of a platform for providing real time flood forecasting services on the web to mitigate 21st century weather phenomena.
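One step such an ensemble system typically performs, turning TIGGE-style ensemble discharges into a warning probability, can be sketched as follows; the discharge values and critical threshold below are invented, not taken from the Huai River study.

```python
# Turn an ensemble of forecast discharges into a flood-warning probability:
# the fraction of members exceeding a critical discharge.
def exceedance_probability(members, threshold):
    return sum(1 for q in members if q > threshold) / len(members)

ensemble_discharge = [820, 955, 1010, 1200, 890, 1130, 970, 1040]  # m3/s
prob = exceedance_probability(ensemble_discharge, threshold=1000)
print(prob)  # 0.5
```

A warning is then issued when this probability crosses a decision threshold agreed with end-users, which is one reason ensemble systems are preferred over a single deterministic forecast.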
Ground System Architectures Workshop GMSEC SERVICES SUITE (GSS): an Agile Development Story
NASA Technical Reports Server (NTRS)
Ly, Vuong
2017-01-01
The GMSEC (Goddard Mission Services Evolution Center) Services Suite (GSS) is a collection of tools and software services along with a robust customizable web-based portal that enables the user to capture, monitor, report, and analyze system-wide GMSEC data. Given our plug-and-play architecture and the needs for rapid system development, we opted to follow the Scrum Agile Methodology for software development. Being one of the first few projects to implement the Agile methodology at NASA GSFC, in this presentation we will present our approaches, tools, successes, and challenges in implementing this methodology. The GMSEC architecture provides a scalable, extensible ground and flight system for existing and future missions. GMSEC comes with a robust Application Programming Interface (GMSEC API) and a core set of Java-based GMSEC components that facilitate the development of a GMSEC-based ground system. Over the past few years, we have seen an uptick in the number of customers who are moving from a native desktop application environment to a web-based environment, particularly for data monitoring and analysis. We also see a need to separate the business logic from the GUI display for our Java-based components and to consolidate all the GUI displays into one interface. This combination of separation and consolidation brings immediate value to a GMSEC-based ground system through increased ease of data access via a uniform interface, built-in security measures, centralized configuration management, and ease of feature extensibility.
A secure mobile crowdsensing (MCS) location tracker for elderly in smart city
NASA Astrophysics Data System (ADS)
Shien, Lau Khai; Singh, Manmeet Mahinderjit
2017-10-01
According to the UN's (United Nations) projection, Malaysia will achieve ageing population status by 2030. The challenge of a growing ageing population lies in health and social care services. As the population lives longer, the costs of institutional care rise, and many elderly people are not able to live independently in their own homes without caregivers. Moreover, this restricts their activity area, safety and freedom in daily life. Hence, a tracking system is worthwhile for caregivers to track their real-time location efficiently. Current tracking and monitoring systems are unable to satisfy the needs of the community. Hence, an Indoor-Outdoor Elderly Secure and Tracking care system (IOET) is proposed to track and monitor the elderly. This Mobile Crowdsensing type of system uses indoor and outdoor positioning to locate elderly people, utilizing RFID, NFC, a biometric system and GPS to secure the safety of the elderly within indoor and outdoor environments. A mobile application and a web-based application are designed for this system. The system provides real-time tracking by combining GPS and NFC for outdoor coverage, ideally in a smart city. For indoor coverage, the system utilizes active RFID to track elderly movement. The system notifies the caregiver of elderly movement, or on request, using a notification service that provides real-time alerts. The caregiver can also review the places visited by the elderly and trace back their movements.
Web Services as Public Services: Are We Supporting Our Busiest Service Point?
ERIC Educational Resources Information Center
Riley-Huff, Debra A.
2009-01-01
This article is an analysis of academic library organizational culture, patterns, and processes as they relate to Web services. Data gathered in a research survey is examined in an attempt to reveal current departmental and administrative attitudes, practices, and support for Web services in the library research environment. (Contains 10 tables.)
Modeling, Simulation and Analysis of Public Key Infrastructure
NASA Technical Reports Server (NTRS)
Liu, Yuan-Kwei; Tuey, Richard; Ma, Paul (Technical Monitor)
1998-01-01
Security is an essential part of network communication. The advances in cryptography have provided solutions to many of the network security requirements. Public Key Infrastructure (PKI) is the foundation of the cryptography applications. The main objective of this research is to design a model to simulate a reliable, scalable, manageable, and high-performance public key infrastructure. We build a model to simulate the NASA public key infrastructure by using SimProcess and MatLab Software. The simulation is from top level all the way down to the computation needed for encryption, decryption, digital signature, and secure web server. The application of secure web server could be utilized in wireless communications. The results of the simulation are analyzed and confirmed by using queueing theory.
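The queueing-theory cross-check mentioned above can be reproduced with the closed-form M/M/1 results; the arrival and service rates here are illustrative, not the parameters of the NASA simulation.

```python
# Closed-form M/M/1 metrics for a single secure web server: utilization,
# mean number of requests in the system, and mean time in the system.
def mm1_metrics(arrival_rate, service_rate):
    rho = arrival_rate / service_rate          # server utilization
    assert rho < 1, "queue is unstable"
    l = rho / (1 - rho)                        # mean number in system
    w = 1 / (service_rate - arrival_rate)      # mean time in system
    return rho, l, w

# e.g. 8 signature-verification requests/s against a 10 requests/s server
rho, l, w = mm1_metrics(arrival_rate=8.0, service_rate=10.0)
print(rho, round(l, 6), w)  # 0.8 4.0 0.5
```

Such analytic values give a sanity check for the simulated throughput and latency of the encryption, decryption and signature stages: if the simulation disagrees sharply with the formulas, the model is suspect.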
An Architecture for Autonomic Web Service Process Planning
NASA Astrophysics Data System (ADS)
Moore, Colm; Xue Wang, Ming; Pahl, Claus
Web service composition is a technology that has received considerable attention in the last number of years. Languages and tools to aid in the process of creating composite Web services have received particular attention. Web service composition is the process of linking single Web services together in order to accomplish more complex tasks. One area of Web service composition that has not received as much attention is dynamic error handling and re-planning, enabling autonomic composition. Given a repository of service descriptions and a task to complete, it is possible for AI planners to automatically create a plan that will achieve this goal. If, however, a service in the plan is unavailable or erroneous, the plan will fail. Motivated by this problem, this paper suggests autonomous re-planning as a means to overcome dynamic problems. Our solution involves automatically recovering from faults and creating a context-dependent alternate plan. We present an architecture that serves as a basis for the central activities of autonomous composition, monitoring and fault handling.
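The re-planning idea can be sketched as a loop that drops a failed service and asks the planner for a context-dependent alternative. The service names and the trivial one-step planner below are invented for illustration; a real system would call an AI planner over service descriptions.

```python
# Execute a plan step by step; on a failed invocation, remove the faulty
# service from the context and replan before continuing.
def execute_with_replanning(goal, services, plan_fn):
    available = dict(services)
    plan = plan_fn(goal, available)
    while plan is not None:
        for step in plan:
            if not available[step]():              # service invocation failed
                del available[step]                # drop the faulty service
                plan = plan_fn(goal, available)    # context-dependent replan
                break
        else:
            return plan                            # all steps succeeded
    return None                                    # no plan achieves the goal

services = {"geocode_v1": lambda: False,           # simulated outage
            "geocode_v2": lambda: True}
plan_fn = lambda goal, avail: ([next(iter(avail))] if avail else None)
print(execute_with_replanning("locate", services, plan_fn))  # ['geocode_v2']
```

The monitoring component of the architecture corresponds to the failure check inside the loop, and fault handling corresponds to shrinking the context and replanning.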
A New Cloud Architecture of Virtual Trusted Platform Modules
NASA Astrophysics Data System (ADS)
Liu, Dongxi; Lee, Jack; Jang, Julian; Nepal, Surya; Zic, John
We propose and implement a cloud architecture of virtual Trusted Platform Modules (TPMs) to improve the usability of TPMs. In this architecture, virtual TPMs can be obtained from the TPM cloud on demand. Hence, the TPM functionality is available for applications that do not have physical TPMs in their local platforms. Moreover, the TPM cloud allows users to access their keys and data in the same virtual TPM even if they move to untrusted platforms. The TPM cloud is easy to access for applications in different languages, since cloud computing delivers services in standard protocols. The functionality of the TPM cloud is demonstrated by applying it to implement the Needham-Schroeder public-key protocol for web authentications, such that the strong security provided by TPMs is integrated into high-level applications. The chain of trust based on the TPM cloud is discussed, and the security properties of the virtual TPMs in the cloud are analyzed.
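The three-message structure of the Needham-Schroeder public-key exchange (with Lowe's fix, which adds the responder's identity to message 2) can be shown symbolically. The "encryption" below is just a key-tagged tuple standing in for the asymmetric operations a real or virtual TPM would perform.

```python
# Symbolic stand-ins for public-key encryption/decryption: a message
# "encrypted" for a principal can only be "decrypted" by that principal.
def enc(recipient, *payload):
    return ("enc", recipient, payload)

def dec(principal, msg):
    tag, recipient, payload = msg
    assert recipient == principal, "wrong private key"
    return payload

# Message 1: A -> B : {Na, A}_Kb
m1 = enc("B", "Na", "A")
na, initiator = dec("B", m1)
# Message 2: B -> A : {Na, Nb, B}_Ka  (Lowe's fix includes B's identity)
m2 = enc("A", na, "Nb", "B")
na_echo, nb, responder = dec("A", m2)
assert na_echo == "Na" and responder == "B"   # A authenticates B
# Message 3: A -> B : {Nb}_Kb
m3 = enc("B", nb)
(nb_echo,) = dec("B", m3)
assert nb_echo == "Nb"                        # B authenticates A
print("mutual authentication complete")
```

In the proposed architecture, the key storage and the asymmetric operations behind `enc`/`dec` would live inside a virtual TPM obtained from the cloud, so the private keys never reach the possibly untrusted local platform.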
Domain-specific Web Service Discovery with Service Class Descriptions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rocco, D; Caverlee, J; Liu, L
2005-02-14
This paper presents DynaBot, a domain-specific web service discovery system. The core idea of the DynaBot service discovery system is to use domain-specific service class descriptions powered by an intelligent Deep Web crawler. In contrast to current registry-based service discovery systems--like the several available UDDI registries--DynaBot promotes focused crawling of the Deep Web of services and discovers candidate services that are relevant to the domain of interest. It uses intelligent filtering algorithms to match services found by focused crawling with the domain-specific service class descriptions. We demonstrate the capability of DynaBot through the BLAST service discovery scenario and describe our initial experience with DynaBot.
Development of a web service for analysis in a distributed network.
Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila
2014-01-01
We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.
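The key property behind GLORE's privacy claim can be shown in a few lines: each site computes only aggregate statistics of the logistic-regression fit, and the coordinator's sum of those contributions equals the pooled-data value exactly, so patient rows never need to leave a site. The sketch below (gradient only, toy data, names our own) illustrates that identity; the published system also exchanges Hessian contributions per Newton iteration.

```python
# Per-site gradient contributions for logistic regression: summing them
# reproduces the pooled-data gradient without sharing patient-level rows.
import math

def site_contrib(X, y, beta):
    """Gradient of the logistic log-likelihood over one site's data."""
    g = [0.0] * len(beta)
    for xi, yi in zip(X, y):
        p = 1.0 / (1.0 + math.exp(-sum(b * x for b, x in zip(beta, xi))))
        for j, xj in enumerate(xi):
            g[j] += (yi - p) * xj
    return g

# Two sites, intercept + one covariate, initial beta = 0
site1_X, site1_y = [(1.0, 0.2), (1.0, 1.5)], [0, 1]
site2_X, site2_y = [(1.0, -0.7), (1.0, 2.1)], [0, 1]
beta = [0.0, 0.0]

partials = [site_contrib(site1_X, site1_y, beta),
            site_contrib(site2_X, site2_y, beta)]
combined = [sum(col) for col in zip(*partials)]

# Same gradient computed as if all rows sat in one database
pooled = site_contrib(site1_X + site2_X, site1_y + site2_y, beta)
assert all(abs(a - b) < 1e-12 for a, b in zip(combined, pooled))
```

Only the short vectors in `partials` cross the network; the coordinator updates `beta` and broadcasts it back for the next iteration.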
Available, intuitive and free! Building e-learning modules using web 2.0 services.
Tam, Chun Wah Michael; Eastwood, Anne
2012-01-01
E-learning is part of the mainstream in medical education and often provides the most efficient and effective means of engaging learners in a particular topic. However, translating design and content ideas into a usable product can be technically challenging, especially in the absence of information technology (IT) support. There is little published literature on the use of web 2.0 services to build e-learning activities. To describe the web 2.0 tools and solutions employed to build the GP Synergy evidence-based medicine and critical appraisal online course. We used and integrated a number of free web 2.0 services including: Prezi, a web-based presentation platform; YouTube, a video sharing service; Google Docs, an online document platform; Tiny.cc, a URL shortening service; and Wordpress, a blogging platform. The course, consisting of five multimedia-rich, tutorial-like modules, was built without IT specialist assistance or specialised software. The web 2.0 services used were free. The course can be accessed with a modern web browser. Modern web 2.0 services remove many of the technical barriers to creating and sharing content on the internet. When used synergistically, these services can provide a flexible and low-cost platform for building e-learning activities. They were a pragmatic solution in our context.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-17
...; Comment Request; NCI Cancer Genetics Services Directory Web-Based Application Form and Update Mailer... currently valid OMB control number. Proposed Collection: Title: NCI Cancer Genetics Services Directory Web... application form and the Web-based update mailer is to collect information about genetics professionals to be...
PaaS for web applications with OpenShift Origin
NASA Astrophysics Data System (ADS)
Lossent, A.; Rodriguez Peon, A.; Wagner, A.
2017-10-01
The CERN Web Frameworks team has deployed OpenShift Origin to facilitate the deployment of web applications and to improve efficiency in terms of computing resource usage. OpenShift leverages Docker containers and Kubernetes orchestration to provide a Platform-as-a-Service solution oriented toward web applications. We will review use cases and how OpenShift was integrated with other services such as source control, web site management and authentication services.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... Request; NCI Cancer Genetics Services Directory Web-Based Application Form and Update Mailer Summary: In... Cancer Genetics Services Directory Web-based Application Form and Update Mailer.
Building asynchronous geospatial processing workflows with web services
NASA Astrophysics Data System (ADS)
Zhao, Peisheng; Di, Liping; Yu, Genong
2012-02-01
Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches and workflow architectures that support asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
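The request/poll asynchrony pattern described above can be sketched generically: the client submits a job, immediately receives an identifier, and retrieves the result later instead of blocking. This is an illustrative stand-in (class and method names our own), not the OGC or WS-BPEL machinery.

```python
# Minimal request/poll asynchronous service: submit() returns a job id
# at once; the client polls for completion and fetches the result later.
import threading
import time
import uuid

class AsyncProcessingService:
    def __init__(self):
        self.jobs = {}

    def submit(self, fn, *args):
        job_id = uuid.uuid4().hex
        self.jobs[job_id] = {"status": "running", "result": None}
        def run():
            self.jobs[job_id]["result"] = fn(*args)
            self.jobs[job_id]["status"] = "done"
        threading.Thread(target=run).start()
        return job_id                    # client resumes immediately

    def poll(self, job_id):
        return self.jobs[job_id]["status"]

    def result(self, job_id):
        return self.jobs[job_id]["result"]

svc = AsyncProcessingService()
jid = svc.submit(lambda a, b: a + b, 2, 3)   # stand-in for a long geoprocess
while svc.poll(jid) != "done":
    time.sleep(0.01)
print(svc.result(jid))                       # -> 5
```

In the WS-BPEL setting the same shape appears as a one-way invoke plus a callback or status operation rather than an in-process thread.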
OneGeology Web Services and Portal as a global geological SDI - latest standards and technology
NASA Astrophysics Data System (ADS)
Duffy, Tim; Tellez-Arenas, Agnes
2014-05-01
The global coverage of OneGeology Web Services (www.onegeology.org and portal.onegeology.org) achieved since 2007 from the 120 participating geological surveys will be reviewed and issues arising discussed. Recent enhancements to the OneGeology Web Services capabilities will be covered, including a new up-to-5-star service accreditation scheme utilising the ISO/OGC Web Mapping Service standard version 1.3, core ISO 19115 metadata additions, and version 2.0 Web Feature Services (WFS) serving the new IUGS-CGI GeoSciML V3.2 geological web data exchange language standard (http://www.geosciml.org/) with its associated 30+ IUGS-CGI vocabularies (http://resource.geosciml.org/ and http://srvgeosciml.brgm.fr/eXist2010/brgm/client.html). Use of the CGI simplelithology and timescale dictionaries now allows those who wish to do so to offer data harmonisation for queries against their GeoSciML 3.2 based Web Feature Services and their GeoSciML_Portrayal V2.0.1 (http://www.geosciml.org/) Web Map Services in the OneGeology portal (http://portal.onegeology.org). Contributing to OneGeology involves offering to serve, ideally, 1:1,000,000 scale geological data (in practice any scale is now warmly welcomed) as an OGC (Open Geospatial Consortium) standard based WMS (Web Mapping Service) service from an available WWW server. This may be hosted either within the geological survey itself or at a neighbouring, regional or other institution that offers to serve the data for them, i.e. offers to help technically by providing the web-serving IT infrastructure as a 'buddy'. OneGeology is a standards-focussed Spatial Data Infrastructure (SDI) and works to ensure that these standards work together; it is now possible for European geological surveys to register their INSPIRE web services within the OneGeology SDI (e.g. see http://www.geosciml.org/geosciml/3.2/documentation/cookbook/INSPIRE_GeoSciML_Cookbook%20_1.0.pdf).
The OneGeology portal (http://portal.onegeology.org) is the first port of call for anyone wishing to discover the availability of global geological web services, and has new functionality to view and use such services, including multiple projection support. KEYWORDS: OneGeology; GeoSciML V 3.2; Data exchange; Portal; INSPIRE; Standards; OGC; Interoperability; GeoScience information; WMS; WFS; Cookbook.
Web Services and Other Enhancements at the Northern California Earthquake Data Center
NASA Astrophysics Data System (ADS)
Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.
2012-12-01
The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. 
These data can be accessed through the above web services and through special NCEDC web pages.
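The REST style described above means a query is just a URL with self-describing parameters. The sketch below composes a station-service query; the endpoint path and parameter names follow general FDSN web-service conventions and are assumptions for illustration, not copied from NCEDC documentation.

```python
# Compose an FDSN-style station web-service query URL (illustrative).
from urllib.parse import urlencode

def station_query(base, network, station, fmt="xml"):
    """Build a query URL for station inventory in the requested format."""
    params = urlencode({"net": network, "sta": station, "format": fmt})
    return f"{base}/fdsnws/station/1/query?{params}"

url = station_query("https://service.ncedc.org", "BK", "CMB")
```

A browser or any HTTP client can then fetch `url` directly, which is exactly the "no batch or email request" property the data center highlights.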
Tools for Administration of a UNIX-Based Network
NASA Technical Reports Server (NTRS)
LeClaire, Stephen; Farrar, Edward
2004-01-01
Several computer programs have been developed to enable efficient administration of a large, heterogeneous, UNIX-based computing and communication network that includes a variety of computers connected to a variety of subnetworks. One program provides secure software tools for administrators to create, modify, lock, and delete accounts of specific users. This program also provides tools for users to change their UNIX passwords and log-in shells. These tools check for errors. Another program comprises a client and a server component that, together, provide a secure mechanism to create, modify, and query quota levels on a network file system (NFS) mounted by use of the VERITAS File System software. The client software resides on an internal secure computer with a secure Web interface; one can gain access to the client software from any authorized computer capable of running web-browser software. The server software resides on a UNIX computer configured with the VERITAS software system. Directories where VERITAS quotas are applied are NFS-mounted. Another program is a Web-based, client/server Internet Protocol (IP) address tool that facilitates maintenance and lookup of information about IP addresses for a network of computers.
Security of social network credentials for accessing course portal: Users' experience
NASA Astrophysics Data System (ADS)
Katuk, Norliza; Fong, Choo Sok; Chun, Koo Lee
2015-12-01
Social login (SL) has recently emerged as a solution for single sign-on (SSO) within the web and mobile environments. It allows users to use their existing social network credentials (SNC) to log in to third-party web applications without the need to create a new identity in the intended application's database. Although it has been used by many web application providers, its applicability to accessing learning materials has not yet been fully investigated. Hence, this research aims to explore users' (i.e., instructors' and students') perception and experience of the security of SL for accessing learning content. A course portal was developed for students at a higher learning institution, providing two types of user authentication: (i) traditional user authentication and (ii) an SL facility. Users comprising instructors and students evaluated the login facility of the course portal through a controlled lab experimental study following a within-subject design. The participants provided feedback on the security of SL for accessing learning content. The study revealed that users preferred SL over traditional authentication; however, they were concerned about the security of SL and about their privacy.
Research of three level match method about semantic web service based on ontology
NASA Astrophysics Data System (ADS)
Xiao, Jie; Cai, Fang
2011-10-01
An important step in Web service application is the discovery of useful services. Keywords are used in service discovery in traditional technologies like UDDI and WSDL, with the disadvantages of requiring user intervention, lacking semantic description, and low accuracy. To cope with these problems, OWL-S is introduced and extended with QoS attributes to describe the attributes and functions of Web services. A three-level service matching algorithm based on ontology and QoS is proposed in this paper. Our algorithm can match Web services by utilizing the service profile, QoS parameters, and the input and output of the service. Simulation results show that it greatly enhances the speed of service matching while high accuracy is also guaranteed.
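The three-level idea can be sketched as successive filters followed by a ranking step. The data shapes, thresholds, and scoring below are illustrative assumptions, not the paper's actual algorithm: level 1 matches the profile category, level 2 enforces QoS constraints, and level 3 ranks by input/output signature overlap.

```python
# Three-level service matching sketch: profile filter, QoS filter, I/O rank.
def match(request, services):
    # Level 1: profile/category match
    cands = [s for s in services if s["category"] == request["category"]]
    # Level 2: QoS constraint (here, a response-time ceiling)
    cands = [s for s in cands
             if s["qos"]["latency_ms"] <= request["max_latency_ms"]]
    # Level 3: rank by overlap of input/output signatures
    def io_score(s):
        return (len(set(s["inputs"]) & set(request["inputs"]))
                + len(set(s["outputs"]) & set(request["outputs"])))
    return sorted(cands, key=io_score, reverse=True)

services = [
    {"category": "weather", "qos": {"latency_ms": 80},
     "inputs": {"city"}, "outputs": {"temp"}},
    {"category": "weather", "qos": {"latency_ms": 300},
     "inputs": {"city"}, "outputs": {"temp"}},
]
req = {"category": "weather", "max_latency_ms": 100,
       "inputs": {"city"}, "outputs": {"temp"}}
best = match(req, services)
```

The speed claim in the abstract comes from the cheap early filters pruning candidates before the more expensive semantic comparison runs.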
The I-Tribe Community Pharmacy Practice Model: professional pharmacy unshackled.
Alston, Greg L; Waitzman, Jennifer A
2013-01-01
To describe a mechanism by which pharmacists could create a disruptive innovation to provide professional primary care services via a Web-based delivery model. Several obstacles have prevented pharmacists from using available technology to develop business models that capitalize on their clinical skills in primary care. Community practice has experienced multiple sustaining innovations that have improved dispensing productivity but have not stimulated sufficient demand for pharmacy services to disrupt the marketplace and provide new opportunities for pharmacists. Pharmacists are in a unique position to bridge the gap between demand for basic primary medical care and access to a competent medical professional. Building on the historic strengths of community pharmacy practice, modern pharmacists could provide a disruptive innovation in the marketplace for primary care by taking advantage of new technology and implementing the I-Tribe Community Pharmacy Practice Model (I-Tribe). This model would directly connect pharmacists to patients through an interactive, secure Web presence that would liberate the relationship from geographic restrictions. The I-Tribe is a disruptive innovation that could become the foundation for a vibrant market in pharmacist professional service offerings. The I-Tribe model could benefit society by expanding access to primary medical care while simultaneously providing a new source of revenue for community practice pharmacists. Entrepreneurial innovation through I-Tribe pharmacy would free pharmacists to become the care providers envisioned by the profession's thought leaders.
Clark, Barry; Wachowiak, Bartosz; Crawford, Ewan W.; Jakubowski, Zenon; Kabata, Janusz
1998-01-01
A pilot study was performed to evaluate the feasibility of using the Internet to securely deliver patient laboratory results, and the system has subsequently gone into routine use in Poland. The system went from design to pilot and then to live implementation within a four-month period, resulting in the LIS-Interlink software product. Test results are retrieved at regular intervals from the BioLink™ LIS (Laboratory Information System), encrypted and transferred to a secure area on the Web server. The primary health-care centres dial into the Internet using a local-cell service provided by Polish Telecom (TP), obtain a TCP/IP address using the TP DHCP server, and perform HTTP ‘get’ and ‘post’ operations to obtain the files by secure handshaking. The data are then automatically inserted into a local SQL database (with optional printing of incoming reports) for cumulative reporting and searching functions. The local database is fully multi-user and can be accessed from different clinics within the centres by a variety of networking protocols. PMID:18924820
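The retrieve-encrypt-transfer loop above can be sketched abstractly. The XOR "cipher" below is a deliberately trivial placeholder standing in for the real encryption (which the article does not specify); it only illustrates that results are opaque in transit and recoverable at the clinic.

```python
# Encrypt-before-upload, decrypt-after-download round trip (toy cipher).
def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Symmetric XOR transform: applying it twice restores the input."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

report = b"HbA1c: 5.4%"          # a lab result leaving the LIS
key = b"secret"                   # shared secret (placeholder)
ciphertext = xor_bytes(report, key)       # what the web server stores
recovered = xor_bytes(ciphertext, key)    # what the clinic decrypts
assert recovered == report
```

A production system would use a vetted cipher and authenticated key exchange; the round-trip structure, however, is the same.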
Interoperability And Value Added To Earth Observation Data
NASA Astrophysics Data System (ADS)
Gasperi, J.
2012-04-01
Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
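The interoperability described above rests on every WMS answering the same request grammar. The sketch below builds a GetMap URL; the endpoint and layer name are made-up examples, while the parameter names follow the WMS 1.3.0 convention.

```python
# Compose an OGC WMS 1.3.0 GetMap request URL (endpoint/layer illustrative).
from urllib.parse import urlencode

def wms_getmap(endpoint, layer, bbox, size=(512, 512)):
    """Return a GetMap URL for one layer over a lat/lon bounding box."""
    q = urlencode({
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0], "HEIGHT": size[1], "FORMAT": "image/png",
    })
    return f"{endpoint}?{q}"

url = wms_getmap("https://example.org/wms", "flood_extent",
                 (40.0, -10.0, 50.0, 5.0))
```

Because the grammar is standard, a client like mapshup can overlay layers from any number of independent servers with the same code path.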
SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services
Gessler, Damian DG; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T
2009-01-01
Background SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. Results There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remaining are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at , developer tools at , and a portal to third-party ontologies at (a "swap meet"). Conclusion SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation. 
SSWAP is novel by establishing the concept of a canonical yet mutable OWL DL graph that allows data and service providers to describe their resources, to allow discovery servers to offer semantically rich search engines, to allow clients to discover and invoke those resources, and to allow providers to respond with semantically tagged data. SSWAP allows for a mix-and-match of terms from both new and legacy third-party ontologies in these graphs. PMID:19775460
Internet/Web-based administration of benefits.
Vitiello, J
2001-09-01
Most funds will face the challenge of deploying at least some Web-based functionality in the near future, if they have not already done so. Clear objectives and careful planning will help ensure success. Issues that must be considered include support requirements, security concerns, functional business objectives, and employer and member Web access.
A Course Evolves-Physical Anthropology.
ERIC Educational Resources Information Center
O'Neil, Dennis
2001-01-01
Describes the development of an online physical anthropology course at Palomar College (California) that evolved from online tutorials. Discusses the ability to update materials on the Web more quickly than in traditional textbooks; creating Web pages that are readable by most Web browsers; test security issues; and clarifying ownership of online…
39 CFR 3001.12 - Service of documents.
Code of Federal Regulations, 2010 CFR
2010-07-01
... or presiding officer has determined is unable to receive service through the Commission's Web site... presiding officer has determined is unable to receive service through the Commission Web site shall be by... service list for each current proceeding will be available on the Commission's Web site http://www.prc.gov...
ChemCalc: a building block for tomorrow's chemical infrastructure.
Patiny, Luc; Borel, Alain
2013-05-24
Web services, as an aspect of cloud computing, are becoming an important part of the general IT infrastructure, and scientific computing is no exception to this trend. We propose a simple approach to develop chemical Web services, through which servers could expose the essential data manipulation functionality that students and researchers need for chemical calculations. These services return their results as JSON (JavaScript Object Notation) objects, which facilitates their use for Web applications. The ChemCalc project http://www.chemcalc.org demonstrates this approach: we present three Web services related with mass spectrometry, namely isotopic distribution simulation, peptide fragmentation simulation, and molecular formula determination. We also developed a complete Web application based on these three Web services, taking advantage of modern HTML5 and JavaScript libraries (ChemDoodle and jQuery).
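The molecular-formula use case can be mimicked client-side in a few lines. This sketch is not the ChemCalc service itself: the mass table holds standard monoisotopic values for four elements only, and the parser handles simple formulas like "C6H12O6" without parentheses or isotope labels.

```python
# Toy monoisotopic mass calculator for simple molecular formulas.
import re

MONO = {"C": 12.0, "H": 1.0078250319,
        "O": 15.9949146221, "N": 14.0030740052}

def mono_mass(formula):
    """Sum monoisotopic masses for a flat formula such as 'C6H12O6'."""
    mass = 0.0
    for elem, count in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        if elem:
            mass += MONO[elem] * (int(count) if count else 1)
    return mass

print(round(mono_mass("C6H12O6"), 4))   # glucose -> 180.0634
```

The real service wraps this kind of computation behind an HTTP endpoint and returns the result as a JSON object, which is what makes it directly consumable from a JavaScript front end.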
Can They Plan to Teach with Web 2.0? Future Teachers' Potential Use of the Emerging Web
ERIC Educational Resources Information Center
Kale, Ugur
2014-01-01
This study examined pre-service teachers' potential use of Web 2.0 technologies for teaching. A coding scheme incorporating the Technological Pedagogical Content Knowledge (TPACK) framework guided the analysis of pre-service teachers' Web 2.0-enhanced learning activity descriptions. The results indicated that while pre-service teachers were able…
An Offline-Online Android Application for Hazard Event Mapping Using WebGIS Open Source Technologies
NASA Astrophysics Data System (ADS)
Olyazadeh, Roya; Jaboyedoff, Michel; Sudmeier-Rieux, Karen; Derron, Marc-Henri; Devkota, Sanjaya
2016-04-01
Nowadays, Free and Open Source Software (FOSS) plays an important role in better understanding and managing disaster risk reduction around the world. National and local governments, NGOs and other stakeholders are increasingly seeking and producing data on hazards. Most hazard event inventories and land use mapping are based on remote sensing data, with little ground truthing, creating difficulties depending on the terrain and accessibility. Open Source WebGIS tools offer an opportunity for quicker and easier ground truthing of critical areas in order to analyse hazard patterns and triggering factors. This study presents a secure mobile-map application for hazard event mapping using Open Source WebGIS technologies such as the Postgres database, PostGIS, Leaflet, Cordova and PhoneGap. The objectives of this prototype are: 1. an offline-online Android mobile application with advanced geospatial visualisation; 2. easy collection and storage of event information; 3. centralized data storage accessible from all clients (smartphone, standard web browser); 4. improved data management through active participation in hazard event mapping and storage. This application has been implemented as a low-cost, rapid and participatory method for recording impacts from hazard events and includes geolocation (GPS data and Internet), visualizing maps with overlays of satellite images, viewing uploaded images and events as cluster points, and drawing and adding event information. The data can be recorded offline (Android device) or online (all browsers) and uploaded to the server whenever an internet connection is available. All events and records can be visualized by an administrator and made public after approval. Different user levels can be defined to control access to the data for communicating the information.
This application was tested for landslides in post-earthquake Nepal but can be used for any other type of hazard, such as floods, avalanches, etc. Keywords: Offline, Online, WebGIS Open source, Android, Hazard Event Mapping
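The offline-then-sync behaviour described above reduces to a local queue that is flushed when connectivity returns. The sketch below is a stand-in for the app's storage layer (class and field names our own), not its actual code.

```python
# Offline event queue: record locally, flush to the server when online.
class EventQueue:
    def __init__(self):
        self.pending = []    # events still on the device
        self.uploaded = []   # stand-in for the server-side store

    def record(self, event):
        """Always works, even with no connection."""
        self.pending.append(event)

    def sync(self, online: bool):
        """Flush pending events only when a connection is available."""
        if online:
            self.uploaded.extend(self.pending)
            self.pending.clear()

q = EventQueue()
q.record({"type": "landslide", "lat": 27.7, "lon": 85.3})
q.sync(online=False)   # still offline: the event stays on the device
q.sync(online=True)    # connection back: the event is uploaded
```

Server-side, the uploaded events would land in the Postgres/PostGIS store and await administrator approval before becoming public, as the abstract describes.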
Mobile Cloud Computing with SOAP and REST Web Services
NASA Astrophysics Data System (ADS)
Ali, Mushtaq; Fadli Zolkipli, Mohamad; Mohamad Zain, Jasni; Anwar, Shahid
2018-05-01
Mobile computing in conjunction with mobile Web services offers an approach by which the limitations of mobile devices may be tackled. Mobile Web services are based on two technologies, SOAP and REST, which work with existing protocols to develop Web services. Both approaches have their own distinct features, yet given the resource constraints of mobile devices, the better of the two is considered to be the one that minimizes computation and transmission overhead while offloading. Transferring load from a mobile device to remote servers for execution is called computational offloading. There are numerous approaches to implementing computational offloading as a viable solution for eradicating the resource constraints of mobile devices, yet a dynamic method of computational offloading is always required for a smooth and simple migration of complex tasks. The intention of this work is to present a distinctive approach that does not engage the mobile resources for longer than necessary. The concept of web services is utilized in our work to delegate computationally intensive tasks for remote execution. We tested both the SOAP and the REST Web services approaches for mobile computing. Two parameters were considered in our lab experiments: execution time and energy consumption. The results show that RESTful Web service execution is far better than executing the same application via the SOAP Web services approach, in terms of both execution time and energy consumption. In experiments with the developed prototype matrix multiplication app, REST execution time is about 200% better than the SOAP approach; in energy consumption, REST is about 250% better.
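Part of the overhead difference the experiments measure comes from message size. The comparison below encodes the same matrix-multiply request as a SOAP envelope and as a REST JSON body; both message shapes are our own illustrative assumptions, not the paper's actual payloads.

```python
# Compare encoded sizes of one offloading request: SOAP envelope vs JSON.
import json

args = {"a": [[1, 2], [3, 4]], "b": [[5, 6], [7, 8]]}

soap = (
    '<?xml version="1.0"?>'
    '<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">'
    "<soap:Body><multiply>"
    f"<a>{args['a']}</a><b>{args['b']}</b>"
    "</multiply></soap:Body></soap:Envelope>"
)
rest = json.dumps(args)

print(len(soap), len(rest))   # the JSON body is markedly smaller
```

Smaller payloads mean less radio time per offloading call, which is one plausible mechanism behind the energy figures reported above (parsing cost is another).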
Discovery Mechanisms for the Sensor Web
Jirka, Simon; Bröring, Arne; Stasch, Christoph
2009-01-01
This paper addresses the discovery of sensors within the OGC Sensor Web Enablement framework. Whereas services like the OGC Web Map Service or Web Coverage Service are already well supported by catalogue services, discovery mechanisms for sensor networks remain a challenge. The focus of this article is on the use of existing OGC Sensor Web components for realizing a discovery solution. After discussing the requirements for a Sensor Web discovery mechanism, an approach is presented that was developed within the EU-funded project “OSIRIS”. This solution offers mechanisms to search for sensors, exploit basic semantic relationships, harvest sensor metadata and integrate sensor discovery into existing catalogues. PMID:22574038
AdaFF: Adaptive Failure-Handling Framework for Composite Web Services
NASA Astrophysics Data System (ADS)
Kim, Yuna; Lee, Wan Yeon; Kim, Kyong Hoon; Kim, Jong
In this paper, we propose a novel Web service composition framework which dynamically accommodates various failure-recovery requirements. In the proposed framework, called Adaptive Failure-handling Framework (AdaFF), failure-handling submodules are prepared during the design of a composite service, and some of them are systematically selected and automatically combined with the composite Web service at service instantiation, in accordance with the requirements of individual users. In contrast, existing frameworks cannot adapt their failure-handling behavior to users' requirements. AdaFF rapidly delivers a composite service supporting requirement-matched failure handling without manual development, and contributes to flexible composite Web service design in that service architects need not concern themselves with failure handling or the varying requirements of users. As a proof of concept, we implemented a prototype system of AdaFF, which automatically generates a composite service instance in Web Services Business Process Execution Language (WS-BPEL) according to a user's requirement specified in XML format and executes the generated instance on the ActiveBPEL engine.
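The select-and-combine step can be sketched as a registry of failure-handling wrappers chosen at instantiation time. The handler names, the string-valued requirement, and the wrapper signatures below are illustrative assumptions, not AdaFF's actual interfaces:

```python
from typing import Callable, Dict

# Failure-handling submodules prepared at design time (names are assumptions)
def with_retry(invoke: Callable, attempts: int = 3) -> Callable:
    """Re-invoke the service up to `attempts` times on failure."""
    def wrapped(*args):
        last = None
        for _ in range(attempts):
            try:
                return invoke(*args)
            except RuntimeError as exc:
                last = exc
        raise last
    return wrapped

def with_substitute(invoke: Callable, backup: Callable) -> Callable:
    """Fall back to a substitute service if the primary one fails."""
    def wrapped(*args):
        try:
            return invoke(*args)
        except RuntimeError:
            return backup(*args)
    return wrapped

def instantiate(service: Callable, requirement: str, **kw) -> Callable:
    # At service instantiation, the requirement-matched submodule is
    # combined with the composite service -- no manual development per user.
    handlers: Dict[str, Callable] = {
        "retry": with_retry,
        "substitute": with_substitute,
    }
    return handlers[requirement](service, **kw)
```

Two users of the same composite service can thus receive instances with different recovery behavior, e.g. `instantiate(svc, "retry")` versus `instantiate(svc, "substitute", backup=other_svc)`.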
VegScape: U.S. Crop Condition Monitoring Service
NASA Astrophysics Data System (ADS)
mueller, R.; Yang, Z.; Di, L.
2013-12-01
Since 1995, the US Department of Agriculture (USDA) National Agricultural Statistics Service (NASS) has provided qualitative biweekly vegetation condition indices to USDA policymakers and the public on a weekly basis during the growing season. Vegetation indices have proven useful for assessing crop condition and identifying the areal extent of floods, drought, major weather anomalies, and vulnerabilities of early/late-season crops. With growing emphasis on extreme weather events and food security issues rising to the forefront of national interest, a new vegetation condition monitoring system was developed. The new portal, named VegScape, was launched at the start of the 2013 growing season. VegScape delivers interactive vegetation indices through web mapping services; users can explore, query and disseminate current crop conditions via an interactive map. Vegetation indices such as the Normalized Difference Vegetation Index (NDVI) and the Vegetation Condition Index (VCI), as well as mean, median, and ratio comparisons to prior years, can be constructed for analytical purposes and on-demand crop statistics. The NASA MODIS satellite, with 250-meter (15-acre) resolution and thirteen years of data history, provides improved spatial and temporal resolution and delivers detailed, timely (i.e., daily) crop-specific condition information. VegScape thus provides supplemental information to support NASS's weekly crop reports. VegScape delivers an agricultural cultivated-crop mask and the most recent Cropland Data Layer (CDL) product to delineate the agricultural domain and visualize prior years' planted crops. Additionally, the data can be exported directly to Google Earth for web mashups or delivered via web mapping services for use in other applications.
VegScape supports the ethos of data democracy by providing free and open access to digital geospatial data layers using open geospatial standards, thereby supporting transparent and collaborative government initiatives. NASS developed VegScape in cooperation with the Center for Spatial Information Science and Systems, George Mason University, Fairfax, VA. (Figure: VegScape ratio-to-median NDVI.)
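The indices VegScape serves are standard remote-sensing formulas. A minimal sketch of how NDVI is derived from band reflectances and how VCI rescales it against a pixel's historical range; the sample reflectance and min/max values are invented for illustration:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from near-infrared and
    red reflectance; ranges from -1 to 1, higher = greener vegetation."""
    return (nir - red) / (nir + red)

def vci(ndvi_now: float, ndvi_min: float, ndvi_max: float) -> float:
    """Vegetation Condition Index: current NDVI scaled against the
    historical min/max for the same pixel and period, in percent."""
    return 100.0 * (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min)

# Illustrative reflectances for one 250 m MODIS pixel
current = ndvi(nir=0.45, red=0.10)   # ≈ 0.636
print(round(vci(current, ndvi_min=0.2, ndvi_max=0.8), 1))  # → 72.7
```

A VCI near 100 means the pixel is at its historical best for that week; values near 0 flag drought-like stress, which is why the index suits the flood/drought monitoring described above.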
A New Approach for Semantic Web Matching
NASA Astrophysics Data System (ADS)
Zamanifar, Kamran; Heidary, Golsa; Nematbakhsh, Naser; Mardukhi, Farhad
In this work we propose a new approach to semantic web matching that improves the performance of Web service replacement. Because automatic systems must ensure self-healing, self-configuration, self-optimization and self-management, all services should always be available, and if one of them crashes it should be replaced with the most similar one. Candidate services are advertised in Universal Description, Discovery and Integration (UDDI) registries, all in Web Ontology Language (OWL). Using a bipartite graph, we match the crashed service against each candidate and choose the service with the maximum matching rate. In effect, we compare two services' functionalities and capabilities to see how well they match. We found that the best way to match two web services is to compare their functionalities.
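The bipartite-matching step can be sketched with a standard augmenting-path algorithm (Kuhn's algorithm): left vertices are the crashed service's capabilities, right vertices a candidate's, and edges mark compatible pairs. The compatibility data below is invented for illustration, not taken from the paper:

```python
def max_bipartite_matching(edges, n_left, n_right):
    """Maximum bipartite matching via augmenting paths (Kuhn's algorithm).
    edges[u] lists the right-side vertices compatible with left vertex u."""
    match_right = [-1] * n_right  # right vertex -> matched left vertex

    def try_augment(u, seen):
        for v in edges.get(u, []):
            if v not in seen:
                seen.add(v)
                # v is free, or its current partner can be re-matched elsewhere
                if match_right[v] == -1 or try_augment(match_right[v], seen):
                    match_right[v] = u
                    return True
        return False

    return sum(try_augment(u, set()) for u in range(n_left))

# Capabilities of the crashed service (left) vs. one candidate (right)
compat = {0: [0, 1], 1: [0], 2: [2]}
matched = max_bipartite_matching(compat, n_left=3, n_right=3)
print(matched / 3)  # matching rate used to rank this candidate → 1.0
```

Running this against every advertised candidate and keeping the one with the highest rate corresponds to choosing "the best service, which had the maximum rate of matching."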
Climatological Data Option in My Weather Impacts Decision Aid (MyWIDA) Overview
2017-07-18
… rules. It consists of two databases (including ClimoDB), a data service (comprising data requestor, data decoder, post processor, and job scheduler components), a collection of web services (including an impact overlay web service), a graphical user interface, and web applications that show weather impacts on selected …
Exploring key factors in online shopping with a hybrid model.
Chen, Hsiao-Ming; Wu, Chia-Huei; Tsai, Sang-Bing; Yu, Jian; Wang, Jiangtao; Zheng, Yuxiang
2016-01-01
Nowadays, the web increasingly influences retail sales. In-depth analysis of consumer decision-making in the context of e-business has become an important issue for internet vendors. However, the factors affecting e-business are complicated and intertwined. To stimulate online sales, it is important to understand the key influential factors and the causal relationships among them. To gain more insight into this issue, this paper introduces a hybrid method that combines the Decision Making Trial and Evaluation Laboratory (DEMATEL) with the analytic network process, called the DANP method, to identify the factors that most strongly drive online business. The causal graph obtained with the DEMATEL approach showed that the "online service" dimension has the highest degree of direct impact on other dimensions; internet vendors are therefore advised to invest strongly in service quality throughout the online shopping process. In addition, the study adopted DANP to measure the importance of key factors, among which "transaction security" proved to be the most important criterion. Hence, transaction security should be treated with top priority to boost online business. With the DANP approach, comprehensive information can be visually detected, so that decision makers can focus on root causes and develop effective actions.
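The DEMATEL step operates on a direct-influence matrix: normalize it, derive the total-relation matrix T = N(I − N)⁻¹, and read each dimension's prominence (D + R) and net relation (D − R) from T's row and column sums. A minimal sketch, where the 3×3 pairwise influence scores are invented purely for illustration and do not come from the study:

```python
import numpy as np

def dematel(direct: np.ndarray):
    """Return (total-relation matrix T, prominence D+R, relation D-R)."""
    # Normalize so the matrix power series converges
    s = max(direct.sum(axis=1).max(), direct.sum(axis=0).max())
    n = direct / s
    t = n @ np.linalg.inv(np.eye(len(direct)) - n)  # T = N (I - N)^-1
    d = t.sum(axis=1)  # influence dispatched by each dimension
    r = t.sum(axis=0)  # influence received by each dimension
    return t, d + r, d - r

# Invented pairwise influence scores among three dimensions
# (e.g. online service, transaction security, web design)
direct = np.array([[0, 3, 2],
                   [2, 0, 1],
                   [1, 2, 0]], dtype=float)
t, prominence, relation = dematel(direct)
print(relation.argmax())  # index of the strongest net-cause dimension → 0
```

A positive D − R marks a "cause" dimension (here the first one, mirroring the paper's finding that one dimension drives the others), while D + R ranks overall importance.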
Design and implementation of CUAHSI WaterML and WaterOneFlow Web Services
NASA Astrophysics Data System (ADS)
Valentine, D. W.; Zaslavsky, I.; Whitenack, T.; Maidment, D.
2007-12-01
WaterOneFlow is a term for a group of web services created by and for the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) community. CUAHSI web services facilitate the retrieval of hydrologic observations information from online data sources using the SOAP protocol. CUAHSI Water Markup Language (below referred to as WaterML) is an XML schema defining the format of messages returned by the WaterOneFlow web services.
Processing biological literature with customizable Web services supporting interoperable formats.
Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia
2014-01-01
Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.
Data partitioning enables the use of standard SOAP Web Services in genome-scale workflows.
Sztromwasser, Pawel; Puntervoll, Pål; Petersen, Kjell
2011-07-26
Biological databases and computational biology tools are provided by research groups around the world and made accessible on the Web. Combining these resources is common practice in bioinformatics, but the integration of heterogeneous and often distributed tools and datasets can be challenging. To date, this challenge has commonly been addressed in a pragmatic way, by tedious and error-prone scripting. Recently, however, a more reliable technique has been identified and proposed as the platform to tie bioinformatics resources together, namely Web Services. In the last decade Web Services have spread widely in bioinformatics and earned the status of a recommended technology. However, in the era of high-throughput experimentation, a major concern regarding Web Services is their ability to handle large-scale data traffic. We propose a stream-like communication pattern for standard SOAP Web Services that enables an efficient flow of large data volumes between a workflow orchestrator and Web Services. We evaluated the data-partitioning strategy by comparing it with typical communication patterns on an example pipeline for genomic sequence annotation. The results show that data partitioning lowers the resource demands of services and increases their throughput, which in consequence makes it possible to execute in silico experiments at genome scale using standard SOAP Web Services and workflows. As a proof of principle, we annotated an RNA-seq dataset using a plain BPEL workflow engine.
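The data-partitioning idea can be sketched independently of any SOAP stack: the orchestrator splits a large dataset into fixed-size chunks, invokes the service once per chunk, and reassembles the results, so neither side ever holds the full payload in a single message. The chunk size and the stand-in "annotation service" below are illustrative only:

```python
from typing import Callable, Iterable, List

def partition(seq: List, chunk_size: int) -> Iterable[List]:
    """Yield successive fixed-size chunks of the input sequence."""
    for i in range(0, len(seq), chunk_size):
        yield seq[i:i + chunk_size]

def call_partitioned(service: Callable[[List], List],
                     data: List, chunk_size: int) -> List:
    """Invoke `service` once per chunk and concatenate the results,
    mimicking a stream-like exchange over standard request/response calls."""
    out: List = []
    for chunk in partition(data, chunk_size):
        out.extend(service(chunk))  # one bounded-size call per chunk
    return out

# Stand-in for a remote annotation service: uppercases each sequence
annotate = lambda chunk: [s.upper() for s in chunk]
print(call_partitioned(annotate, ["acgt", "ttga", "ccat"], chunk_size=2))
# → ['ACGT', 'TTGA', 'CCAT']
```

Because each request carries at most `chunk_size` items, per-call memory on both orchestrator and service stays bounded regardless of dataset size, which is the property that enables genome-scale runs over plain SOAP.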
Exploring NASA GES DISC Data with Interoperable Services
NASA Technical Reports Server (NTRS)
Zhao, Peisheng; Yang, Wenli; Hegde, Mahabal; Wei, Jennifer C.; Kempler, Steven; Pham, Long; Teng, William; Savtchenko, Andrey
2015-01-01
Overview of NASA GES DISC (NASA Goddard Earth Science Data and Information Services Center) data with interoperable services. Open-standard, interoperable services improve data discoverability, accessibility, and usability through metadata, catalogue, and portal standards, and achieve data, information, and knowledge sharing across applications via standardized interfaces and protocols. Open Geospatial Consortium (OGC) data services and specifications include the Web Coverage Service (WCS, for data), the Web Map Service (WMS, for pictures of data), the Web Map Tile Service (WMTS, for pictures of data tiles), and Styled Layer Descriptors (SLD, for rendered styles).
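A request against such OGC services is just a parameterized HTTP GET. A sketch of assembling a WMS 1.3.0 GetMap URL; the host and layer name are placeholders, not actual GES DISC endpoints:

```python
from urllib.parse import urlencode

def wms_getmap_url(base: str, layer: str, bbox: str,
                   width: int = 512, height: int = 512) -> str:
    """Build a WMS 1.3.0 GetMap request for a PNG rendering of one layer."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": bbox,      # WMS 1.3.0 + EPSG:4326: minLat,minLon,maxLat,maxLon
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Placeholder endpoint and layer, for illustration only
url = wms_getmap_url("https://example.gov/wms", "AIRS_Surface_Temp",
                     "-90,-180,90,180")
print(url)
```

The same pattern applies to WCS GetCoverage or WMTS GetTile; only the parameter set changes, which is what makes these services interoperable across clients.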
49 CFR 1510.5 - Imposition of security service fees.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 9 2014-10-01 2014-10-01 false Imposition of security service fees. 1510.5... SECURITY ADMINISTRATION, DEPARTMENT OF HOMELAND SECURITY ADMINISTRATIVE AND PROCEDURAL RULES PASSENGER CIVIL AVIATION SECURITY SERVICE FEES § 1510.5 Imposition of security service fees. (a) Each direct air...
31 CFR 515.578 - Exportation of certain services incident to Internet-based communications.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Internet, such as instant messaging, chat and email, social networking, sharing of photos and movies, web... direct or indirect exportation of web-hosting services that are for purposes other than personal communications (e.g., web-hosting services for commercial endeavors) or of domain name registration services. (4...
31 CFR 515.578 - Exportation of certain services incident to Internet-based communications.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Internet, such as instant messaging, chat and email, social networking, sharing of photos and movies, web... direct or indirect exportation of web-hosting services that are for purposes other than personal communications (e.g., web-hosting services for commercial endeavors) or of domain name registration services. (4...
REMORA: a pilot in the ocean of BioMoby web-services.
Carrere, Sébastien; Gouzy, Jérôme
2006-04-01
Emerging web-services technology allows interoperability between multiple distributed architectures. Here, we present REMORA, a web server implemented according to the BioMoby web-service specifications, providing life science researchers with an easy-to-use workflow generator and launcher, a repository of predefined workflows, and a survey system. Contact: Jerome.Gouzy@toulouse.inra.fr. The REMORA web server is freely available at http://bioinfo.genopole-toulouse.prd.fr/remora; sources are available upon request from the authors.
jORCA: easily integrating bioinformatics Web Services.
Martín-Requena, Victoria; Ríos, Javier; García, Maximiliano; Ramírez, Sergio; Trelles, Oswaldo
2010-02-15
Web services technology is becoming the option of choice for deploying bioinformatics tools that are universally available. One of the major strengths of this approach is that it supports machine-to-machine interoperability over a network. However, a weakness is that various Web Services differ in their definition and invocation protocols, as well as in their communication and data formats, and this presents a barrier to service interoperability. jORCA is a desktop client aimed at facilitating the seamless integration of Web Services. It does so by making a uniform representation of the different web resources, supporting scalable service discovery, and automatically composing workflows. Usability is at the top of the jORCA agenda; thus it is a highly customizable and extensible application that accommodates a broad range of user skills, featuring double-click invocation of services in conjunction with advanced execution control, on-the-fly data standardization, extensibility of viewer plug-ins, drag-and-drop editing capabilities, plus file-based browsing and organization of favourite tools. The integration of bioinformatics Web Services is thus made easier, supporting a wider range of users.
National Centers for Environmental Prediction
Component, Context, and Manufacturing Model Library (C2M2L)
2012-11-01
Topics covered include MML population and the web service interface, relevant questions with associated web services, and implementing web services that provide semantically aware programmatic access to the models, including implementing the MS&T …
Web Server Security on Open Source Environments
NASA Astrophysics Data System (ADS)
Gkoutzelis, Dimitrios X.; Sardis, Manolis S.
Administering critical resources has never been more difficult than it is today. In a changing world of software innovation where major changes occur on a daily basis, it is crucial for webmasters and server administrators to shield their data against an unknown arsenal of attacks. Until now this kind of defense was a privilege of the few: out-budgeted, low-cost solutions left the defender vulnerable to ever-evolving attack methods. Fortunately, the digital revolution of the past decade left its mark, changing the way we approach security forever: open source infrastructure today covers all the prerequisites for a secure web environment in a way we could never have imagined fifteen years ago. The online security of large corporations, military and government bodies is increasingly handled by open source applications, driving the technological trend of the 21st century toward adopting open solutions to e-commerce and privacy issues. This paper describes substantial security precautions for facing privacy and authentication issues in a totally open source web environment. Our goal is to state the best-known problems in data handling and, consequently, to propose the most appealing techniques for facing these challenges through an open solution.
Wilkinson, Mark D; Vandervalk, Benjamin; McCarthy, Luke
2011-10-24
Background: The complexity and inter-related nature of biological data poses a difficult challenge for data and tool integration. There has been a proliferation of interoperability standards and projects over the past decade, none of which has been widely adopted by the bioinformatics community. Recent attempts have focused on the use of semantics to assist integration, and Semantic Web technologies are being welcomed by this community. Description: SADI - Semantic Automated Discovery and Integration - is a lightweight set of fully standards-compliant Semantic Web service design patterns that simplify the publication of services of the type commonly found in bioinformatics and other scientific domains. Using Semantic Web technologies at every level of the Web services "stack", SADI services consume and produce instances of OWL Classes following a small number of very straightforward best practices. In addition, we provide codebases that support these best practices, and plug-in tools for popular developer and client software that dramatically simplify the deployment of services by providers, and the discovery and utilization of those services by their consumers. Conclusions: SADI services are fully compliant with, and utilize only, foundational Web standards; are simple to create and maintain for service providers; and can be discovered and utilized in a very intuitive way by biologist end-users. In addition, the SADI design patterns significantly improve the ability of software to automatically discover appropriate services based on user needs, and to automatically chain these into complex analytical workflows. We show that, when resources are exposed through SADI, data compliant with a given ontological model can be automatically gathered, or generated, from these distributed, non-coordinating resources - a behaviour we have not observed in any other Semantic system. Finally, we show that, using SADI, data dynamically generated from Web services can be explored in a manner very similar to data housed in static triple-stores, thus facilitating the intersection of Web services and Semantic Web technologies. PMID:22024447
The climate4impact portal: bridging the CMIP5 and CORDEX data infrastructure to impact users
NASA Astrophysics Data System (ADS)
Plieger, Maarten; Som de Cerff, Wim; Pagé, Christian; Tatarinova, Natalia; Cofiño, Antonio; Vega Saldarriaga, Manuel; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Sjökvist, Elin
2015-04-01
The aim of climate4impact is to enhance the use of climate research data and the interaction with climate effect/impact communities. The portal is based on 21 impact use cases from 5 European countries and is evaluated by a user panel consisting of the use case owners. It has been developed within the European projects IS-ENES and IS-ENES2 for more than 5 years, and its development currently continues within IS-ENES2 and CLIPC. As the climate impact community is very broad, the focus is mainly on the scientific impact community. This work has resulted in the ENES portal interface for climate impact communities, which can be visited at www.climate4impact.eu. The climate4impact portal is connected to the Earth System Grid Federation (ESGF) nodes containing global climate model (GCM) data from the fifth phase of the Coupled Model Intercomparison Project (CMIP5) and regional climate model (RCM) data from the Coordinated Regional Climate Downscaling Experiment (CORDEX). This global network of climate model data centers offers services for data description, discovery and download. The climate4impact portal connects to these services using OpenID, and offers a user interface for searching, visualizing and downloading global climate model data and more. A challenging task was to describe the available model data and how it can be used; the portal tries to inform users about possible caveats when using climate model data. All impact use cases are described in the documentation section, using highlighted keywords pointing to detailed information in the glossary. During the project, the content management system Drupal was used to enable partners to contribute to the documentation section. In this presentation the architecture and the following items will be detailed: - Visualization: visualize data from ESGF data nodes using ADAGUC Web Map Services.
- Processing: transform data, subset, export into other formats, and perform climate-index calculations using Web Processing Services implemented with PyWPS, based on NCAR NCPP OpenClimateGIS and IS-ENES2 icclim. - Security: login using OpenID for access to the ESGF data nodes. The ESGF works in conjunction with several external websites and systems; the climate4impact portal uses X509-based short-lived credentials, generated on behalf of the user with a MyProxy service, and Single Sign-On (SSO) to make these websites and systems work together. - Discovery: faceted search based on, e.g., variable name, model and institute using the ESGF search services; a catalogue browser allows browsing through CMIP5 and any other climate model data catalogues (e.g. ESSENCE, EOBS, UNIDATA). - Download: directly from ESGF nodes and other THREDDS catalogues. This architecture will also be used for the future Copernicus platform, developed in the EU FP7 CLIPC project. - Connection with the downscaling portal of the University of Cantabria. - Experiences with the question-and-answer site via Askbot. The current work on climate4impact has two main objectives. The first is a web interface that automatically generates a graphical user interface on WPS endpoints. The WPS calculates climate indices and subsets data using OpenClimateGIS/icclim on data stored in ESGF data nodes. Data is then transmitted from ESGF nodes over secured OpenDAP and becomes available on a new, per-user, secured OpenDAP server. The results can then be visualized again using ADAGUC WMS. Dedicated wizards for processing climate indices will be developed in close collaboration with users. The second is to expose climate4impact services, so as to offer standardized services that can be used by other portals. This has the advantage of adding interoperability between several portals, and of enabling the design of specific portals aimed at different impact communities, whether thematic or national.
NASA Astrophysics Data System (ADS)
Petrie, C.; Margaria, T.; Lausen, H.; Zaremba, M.
Explores trade-offs among existing approaches. Reveals strengths and weaknesses of proposed approaches, as well as which aspects of the problem are not yet covered. Introduces a software engineering approach to evaluating semantic web services. Service-Oriented Computing is one of the most promising software engineering trends because of its potential to reduce the programming effort for future distributed industrial systems. However, only a small part of this potential rests on the standardization of tools offered by the web services stack. The larger part rests upon the development of sufficient semantics to automate service orchestration. Currently there are many different approaches to semantic web service descriptions and many frameworks built around them. A common understanding, evaluation scheme, and test bed to compare and classify these frameworks in terms of their capabilities and shortcomings is necessary to make progress in developing the full potential of Service-Oriented Computing. The Semantic Web Services Challenge is an open source initiative that provides public evaluation and certification of multiple frameworks on common, industrially relevant problem sets. This edited volume reports on the first results in developing a common understanding of the various technologies intended to facilitate the automation of mediation, choreography and discovery for Web Services using semantic annotations. Semantic Web Services Challenge: Results from the First Year is designed for a professional audience composed of practitioners and researchers in industry. Professionals can use this book to evaluate SWS technology for its potential practical use. The book is also suitable for advanced-level students in computer science.
Web-services-based spatial decision support system to facilitate nuclear waste siting
NASA Astrophysics Data System (ADS)
Huang, L. Xinglai; Sheng, Grant
2006-10-01
The availability of spatial web services enables data sharing among managers, decision and policy makers, and other stakeholders in much simpler ways than before, and has subsequently created completely new opportunities in the process of spatial decision making. Though generally designed for a certain problem domain, web-services-based spatial decision support systems (WSDSS) can provide a flexible problem-solving environment in which to explore a decision problem, understand and refine its definition, and generate and evaluate multiple alternatives for decision. This paper presents a new framework for the development of a web-services-based spatial decision support system. The WSDSS is comprised of distributed web services that either have their own functions or provide different geospatial data, and may reside on different computers and in different locations. A WSDSS includes six key components: a database management system, a catalog, analysis functions and models, GIS viewers and editors, report generators, and graphical user interfaces. In this study, the architecture of a web-services-based spatial decision support system to facilitate nuclear waste siting is described as an example. The theoretical, conceptual and methodological challenges and issues associated with developing web-services-based spatial decision support systems are described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The system is developed to collect, process, store and present the information provided by the radio frequency identification (RFID) devices. The system contains three parts, the application software, the database and the web page. The application software manages multiple RFID devices, such as readers and portals, simultaneously. It communicates with the devices through application programming interface (API) provided by the device vendor. The application software converts data collected by the RFID readers and portals to readable information. It is capable of encrypting data using 256 bits advanced encryption standard (AES). The application software has a graphical user interface (GUI). Themore » GUI mimics the configurations of the nucler material storage sites or transport vehicles. The GUI gives the user and system administrator an intuitive way to read the information and/or configure the devices. The application software is capable of sending the information to a remote, dedicated and secured web and database server. Two captured screen samples, one for storage and transport, are attached. The database is constructed to handle a large number of RFID tag readers and portals. A SQL server is employed for this purpose. An XML script is used to update the database once the information is sent from the application software. The design of the web page imitates the design of the application software. The web page retrieves data from the database and presents it in different panels. The user needs a user name combined with a password to access the web page. The web page is capable of sending e-mail and text messages based on preset criteria, such as when alarm thresholds are excceeded. A captured screen sample is attached. The application software is designed to be installed on a local computer. The local computer is directly connected to the RFID devices and can be controlled locally or remotely. 
There are multiple local computers managing different sites or transport vehicles. Control from remote sites, and transmission of information to the central database server, take place over a secured Internet connection. The information stored in the central database server is shown on the web page, which users can view on the Internet. A dedicated and secured web and database server (HTTPS) is used to provide information security.
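The XML-driven database update described above might look roughly like the following sketch, which uses an in-memory SQLite database as a stand-in for the SQL server; the XML schema, tag IDs, and table layout are invented for illustration:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML payload, as the application software might emit
# after a portal read. Tag IDs, reader names, and timestamps are invented.
SAMPLE_XML = """
<readings site="storage-A">
  <reading tag="TAG-0001" reader="portal-1" time="2011-06-01T12:00:00Z"/>
  <reading tag="TAG-0002" reader="portal-1" time="2011-06-01T12:00:05Z"/>
</readings>
"""

def load_readings(xml_text, conn):
    """Parse the XML update and insert each reading into the database."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings "
        "(tag TEXT, reader TEXT, time TEXT, site TEXT)"
    )
    root = ET.fromstring(xml_text)
    site = root.get("site")
    rows = [
        (r.get("tag"), r.get("reader"), r.get("time"), site)
        for r in root.findall("reading")
    ]
    conn.executemany("INSERT INTO readings VALUES (?, ?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")  # stand-in for the remote SQL server
inserted = load_readings(SAMPLE_XML, conn)
```

The web page layer would then read these rows back with ordinary SELECT queries and render them in its panels.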
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-11
... Industry Regulatory Authority's (``FINRA'') Central Registration Depository System (``Web CRD''), and must... proposed Rule 313 all associated persons that are not already registered in Web CRD must register (i.e... the Exchange via a Form U4 through FINRA's Web CRD. (Generally, all principals must qualify as...
32 CFR 806b.51 - Privacy and the Web.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 6 2013-07-01 2013-07-01 false Privacy and the Web. 806b.51 Section 806b.51 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION PRIVACY ACT... security notices at major web site entry points and Privacy Act statements or Privacy Advisories when...
Lee, Jae Dong; Yoon, Tae Sik; Chung, Seung Hyun
2015-01-01
Objectives Remote medical services have been expanding globally, and this expansion is steadily increasing. It has had many positive effects, including convenient medical access, timeliness of service, and cost reduction. The speed of research and development in remote medical technology has been gradually accelerating. Therefore, it is expected to expand to enable various high-tech information and communications technology (ICT)-based remote medical services. However, the current state lacks an appropriate security framework that can resolve security issues centered on the Internet of Things (IoT) environment that will be utilized significantly in telemedicine. Methods This study developed a medical service-oriented framework for secure remote medical services, possessing flexibility regarding new service and security elements through its service-oriented structure. First, the common architecture of remote medical services is defined. Next, medical-oriented security threats and requirements within the IoT environment are identified. Finally, we propose a "service-oriented security framework for remote medical services" based on previous work and requirements for secure remote medical services in the IoT. Results The proposed framework is a secure framework based on service-oriented cases in the medical environment. A comparative analysis focusing on the security elements (confidentiality, integrity, availability, privacy) was conducted, and the analysis results demonstrate the security of the proposed framework for remote medical services with IoT. Conclusions The proposed framework has a service-oriented structure. It can support dynamic security elements in accordance with demands related to new remote medical services, which will be diversely generated in the IoT environment.
We anticipate that it will enable secure services to be provided that can guarantee confidentiality, integrity, and availability for all, including patients, non-patients, and medical staff. PMID:26618034
Lee, Jae Dong; Yoon, Tae Sik; Chung, Seung Hyun; Cha, Hyo Soung
2015-10-01
Remote medical services have been expanding globally, and this expansion is steadily increasing. It has had many positive effects, including convenient medical access, timeliness of service, and cost reduction. The speed of research and development in remote medical technology has been gradually accelerating. Therefore, it is expected to expand to enable various high-tech information and communications technology (ICT)-based remote medical services. However, the current state lacks an appropriate security framework that can resolve security issues centered on the Internet of Things (IoT) environment that will be utilized significantly in telemedicine. This study developed a medical service-oriented framework for secure remote medical services, possessing flexibility regarding new service and security elements through its service-oriented structure. First, the common architecture of remote medical services is defined. Next, medical-oriented security threats and requirements within the IoT environment are identified. Finally, we propose a "service-oriented security framework for remote medical services" based on previous work and requirements for secure remote medical services in the IoT. The proposed framework is a secure framework based on service-oriented cases in the medical environment. A comparative analysis focusing on the security elements (confidentiality, integrity, availability, privacy) was conducted, and the analysis results demonstrate the security of the proposed framework for remote medical services with IoT. The proposed framework has a service-oriented structure. It can support dynamic security elements in accordance with demands related to new remote medical services, which will be diversely generated in the IoT environment. We anticipate that it will enable secure services to be provided that can guarantee confidentiality, integrity, and availability for all, including patients, non-patients, and medical staff.
NASA Astrophysics Data System (ADS)
Teng, W.; Chiu, L.; Kempler, S.; Liu, Z.; Nadeau, D.; Rui, H.
2006-12-01
Using NASA satellite remote sensing data from multiple sources for hydrologic applications can be a daunting task and requires a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. In order to facilitate such investigations, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure or "Giovanni," which supports a family of Web interfaces (instances) that allow users to perform interactive visualization and analysis online without downloading any data. Two such Giovanni instances are particularly relevant to hydrologic applications: the Tropical Rainfall Measuring Mission (TRMM) Online Visualization and Analysis System (TOVAS) and the Agricultural Online Visualization and Analysis System (AOVAS), both highly popular and widely used for a variety of applications, including those related to several NASA Applications of National Priority, such as Agricultural Efficiency, Disaster Management, Ecological Forecasting, Homeland Security, and Public Health. Dynamic, context-sensitive Web services provided by TOVAS and AOVAS enable users to seamlessly access NASA data from within, and deeply integrate the data into, their local client environments. One example is between TOVAS and Florida International University's TerraFly, a Web-enabled system that serves a broad segment of the research and applications community, by facilitating access to various textual, remotely sensed, and vector data. Another example is between AOVAS and the U.S. Department of Agriculture Foreign Agricultural Service (USDA FAS)'s Crop Explorer, the primary decision support tool used by FAS to monitor the production, supply, and demand of agricultural commodities worldwide.
AOVAS is also part of GES DISC's Agricultural Information System (AIS), which can operationally provide satellite remote sensing data products (e.g., near-real-time rainfall) and analysis services to agricultural users. AIS enables remote, interoperable access to distributed data by using the GrADS-Data Server (GDS) and the Open Geospatial Consortium (OGC)-compliant MapServer. The latter allows access to AIS data from any OGC-compliant client, such as the Earth-Sun System Gateway (ESG) or Google Earth. The Giovanni system is evolving towards a Service-Oriented Architecture and is highly customizable (e.g., adding new products or services), thus availing the hydrologic applications user community of Giovanni's simple-to-use and powerful capabilities to improve decision-making.
Availability of the OGC geoprocessing standard: March 2011 reality check
NASA Astrophysics Data System (ADS)
Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier
2012-10-01
This paper presents an investigation about the servers available in March 2011 conforming to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification gives support to standard Web-based geoprocessing. The data used in this research were collected using a focused crawler configured for finding OGC Web services. The research goals are (i) to provide a reality check of the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant for a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test if the advances in the use of search engines and focused crawlers for finding Web services can be applied for finding geoscience processing systems. Research results show the feasibility of the discovery approach and provide data about the implementation of the Web Processing Service specification. These results also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
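A crawler probing for WPS servers of the kind surveyed here could, for example, issue the standard GetCapabilities request and check the response. The sketch below builds the request URL and applies a simple validity heuristic; the endpoint is a placeholder, and the heuristic is our own simplification, not the paper's method:

```python
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

def capabilities_url(base):
    """Build the standard WPS 1.0.0 GetCapabilities request URL
    (the key-value-pair GET binding defined by the 2007 OGC spec)."""
    query = urlencode({
        "service": "WPS",
        "request": "GetCapabilities",
        "version": "1.0.0",
    })
    return f"{base}?{query}"

def looks_like_wps(xml_text):
    """Crude liveness check: the response parses as XML and its root
    element is a Capabilities document."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False
    return root.tag.endswith("Capabilities")

# Canned response standing in for a live server's answer.
SAMPLE = ('<wps:Capabilities '
          'xmlns:wps="http://www.opengis.net/wps/1.0.0" service="WPS"/>')
```

A real crawler would fetch `capabilities_url(endpoint)` over HTTP and would also inspect the advertised processes and the `storeExecuteResponse`/`status` flags that signal support for long-running processes.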
Seahawk: moving beyond HTML in Web-based bioinformatics analysis.
Gordon, Paul M K; Sensen, Christoph W
2007-06-18
Traditional HTML interfaces for input to and output from Bioinformatics analysis on the Web are highly variable in style, content and data formats. Combining multiple analyses can therefore be an onerous task for biologists. Semantic Web Services allow automated discovery of conceptual links between remote data analysis servers. A shared data ontology and service discovery/execution framework is particularly attractive in Bioinformatics, where data and services are often both disparate and distributed. Instead of biologists copying, pasting and reformatting data between various Web sites, Semantic Web Service protocols such as MOBY-S hold out the promise of seamlessly integrating multi-step analysis. We have developed a program (Seahawk) that allows biologists to intuitively and seamlessly chain together Web Services using a data-centric, rather than the customary service-centric approach. The approach is illustrated with a ferredoxin mutation analysis. Seahawk concentrates on lowering entry barriers for biologists: no prior knowledge of the data ontology, or relevant services is required. In stark contrast to other MOBY-S clients, in Seahawk users simply load Web pages and text files they already work with. Underlying the familiar Web-browser interaction is an XML data engine based on extensible XSLT style sheets, regular expressions, and XPath statements which import existing user data into the MOBY-S format. As an easily accessible applet, Seahawk moves beyond standard Web browser interaction, providing mechanisms for the biologist to concentrate on the analytical task rather than on the technical details of data formats and Web forms. As the MOBY-S protocol nears a 1.0 specification, we expect more biologists to adopt these new semantic-oriented ways of doing Web-based analysis, which empower them to do more complicated, ad hoc analysis workflow creation without the assistance of a programmer.
Seahawk: moving beyond HTML in Web-based bioinformatics analysis
Gordon, Paul MK; Sensen, Christoph W
2007-01-01
Background Traditional HTML interfaces for input to and output from Bioinformatics analysis on the Web are highly variable in style, content and data formats. Combining multiple analyses can therefore be an onerous task for biologists. Semantic Web Services allow automated discovery of conceptual links between remote data analysis servers. A shared data ontology and service discovery/execution framework is particularly attractive in Bioinformatics, where data and services are often both disparate and distributed. Instead of biologists copying, pasting and reformatting data between various Web sites, Semantic Web Service protocols such as MOBY-S hold out the promise of seamlessly integrating multi-step analysis. Results We have developed a program (Seahawk) that allows biologists to intuitively and seamlessly chain together Web Services using a data-centric, rather than the customary service-centric approach. The approach is illustrated with a ferredoxin mutation analysis. Seahawk concentrates on lowering entry barriers for biologists: no prior knowledge of the data ontology, or relevant services is required. In stark contrast to other MOBY-S clients, in Seahawk users simply load Web pages and text files they already work with. Underlying the familiar Web-browser interaction is an XML data engine based on extensible XSLT style sheets, regular expressions, and XPath statements which import existing user data into the MOBY-S format. Conclusion As an easily accessible applet, Seahawk moves beyond standard Web browser interaction, providing mechanisms for the biologist to concentrate on the analytical task rather than on the technical details of data formats and Web forms. As the MOBY-S protocol nears a 1.0 specification, we expect more biologists to adopt these new semantic-oriented ways of doing Web-based analysis, which empower them to do more complicated, ad hoc analysis workflow creation without the assistance of a programmer. PMID:17577405
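The XPath-based import of existing user data can be illustrated with the XPath subset in Python's standard library; the document fragment, accession numbers, and the simplified MOBY-S-like wrapper below are invented for illustration and are much simpler than the real MOBY-S schema:

```python
import xml.etree.ElementTree as ET

# A fragment of a user-supplied page (contents invented); a Seahawk-style
# client scrapes identifiers out of such documents with XPath and
# repackages them as MOBY-S objects.
DOCUMENT = """
<html>
  <body>
    <p>Sequence record <span class="acc">P00221</span>
       (ferredoxin) and homolog <span class="acc">P00222</span>.</p>
  </body>
</html>
"""

def extract_accessions(xml_text):
    """Pull accession numbers out with an XPath-style query.
    ElementTree supports a subset of XPath; [@class='acc'] selects
    only the spans carrying accessions."""
    root = ET.fromstring(xml_text)
    return [el.text for el in root.iterfind(".//span[@class='acc']")]

def to_moby_object(accession, namespace="UniProt"):
    """Wrap an identifier in a simplified, hypothetical MOBY-S-like
    element; the real MOBY-S object model is far richer than this."""
    return ('<moby:Object moby:namespace="%s" moby:id="%s"/>'
            % (namespace, accession))

accessions = extract_accessions(DOCUMENT)
```

In the real client, XSLT style sheets and regular expressions handle the many page formats that this one-query sketch glosses over.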
Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent
2017-01-01
Abstract Motivation Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two inter-operable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Implementation Opal and Mica are two standalone but inter-operable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. General features Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real-time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Availability Opal and Mica are open-source and freely available at [www.obiba.org] under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. PMID:29025122
Hearn, Paul P.
2009-01-01
Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.
Maintenance and Exchange of Learning Objects in a Web Services Based e-Learning System
ERIC Educational Resources Information Center
Vossen, Gottfried; Westerkamp, Peter
2004-01-01
"Web services" enable partners to exploit applications via the Internet. Individual services can be composed to build new and more complex ones with additional and more comprehensive functionality. In this paper, we apply the Web service paradigm to electronic learning, and show how to exchange and maintain learning objects is a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-08-21
NREL's Developer Network, developer.nrel.gov, provides data that users can incorporate into their own analyses and mobile and web applications. Developers can retrieve the data through a Web services API (application programming interface). The Developer Network handles the overhead of serving web services, such as key management, authentication, analytics, reporting, documentation standards, and throttling, in a common architecture, while allowing the web services and APIs themselves to be maintained and managed independently.
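A client of such an API gateway typically attaches its key as a query parameter, which the gateway uses for authentication, analytics, and throttling. The sketch below builds such a request; the endpoint, route, and parameter names are placeholders, not the actual developer.nrel.gov API:

```python
from urllib.parse import urlencode
from urllib.request import Request

# Placeholder endpoint: the real gateway's routes and parameters differ.
BASE = "https://developer.example.gov/api/solar/v1/resource.json"

def build_request(api_key, **params):
    """Attach the API key the gateway checks on every call, then
    build the GET request (not sent here)."""
    params["api_key"] = api_key
    return Request(BASE + "?" + urlencode(params))

req = build_request("DEMO_KEY", lat=39.7, lon=-105.2)
```

Keeping key handling in the gateway, as the abstract describes, means each underlying service can stay free of authentication code.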
Exploring health information technology education: an analysis of the research.
Virgona, Thomas
2012-01-01
This article is an analysis of published research on Health Information Technology education. The purpose of this study was to examine selected literature using variables such as journal frequency, keyword analysis, universities associated with the research, and geographic diversity. The analysis presented in this paper has identified intellectually significant studies that have contributed to the development and accumulation of the intellectual wealth of Health Information Technology. The keyword analysis suggests that Health Information Technology research has evolved from establishing concepts and domains of health information systems, technology, and management to contemporary issues such as education, outsourcing, web services, and security. The research findings have implications for educators, researchers, and journals.
78 FR 3971 - Children's Online Privacy Protection Rule
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-17
...The Commission amends the Children's Online Privacy Protection Rule (``COPPA Rule'' or ``Rule''), consistent with the requirements of the Children's Online Privacy Protection Act, to clarify the scope of the Rule and strengthen its protections for children's personal information, in light of changes in online technology since the Rule went into effect in April 2000. The final amended Rule includes modifications to the definitions of operator, personal information, and Web site or online service directed to children. The amended Rule also updates the requirements set forth in the notice, parental consent, confidentiality and security, and safe harbor provisions, and adds a new provision addressing data retention and deletion.
NASA Astrophysics Data System (ADS)
Manuaba, I. B. P.; Rudiastini, E.
2018-01-01
Assessment of lecturers is a tool used to measure lecturer performance. Lecturer assessment variables can be measured across three aspects: teaching activities, research, and community service. The broad range of aspects used to measure lecturer performance requires a special framework, so that the system can be developed in a sustainable manner. The aim of this research is to create an API web service data tool, so that the lecturer assessment system can be developed in various frameworks. The research was developed with web services and the PHP programming language, with JSON data as output. The conclusion of this research is that the API web service data application can be developed using several platforms, such as web and mobile applications.
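A minimal sketch of the kind of JSON payload such an assessment API might serve to any client platform follows; the three aspect names come from the abstract, but the scores and the flat additive total are invented assumptions, not the paper's scoring model:

```python
import json

def assess_lecturer(teaching, research, community_service):
    """Combine the three assessment aspects into the JSON record a
    web service API would return to web or mobile clients.
    The additive total is a placeholder scoring rule."""
    record = {
        "teaching": teaching,
        "research": research,
        "community_service": community_service,
        "total": teaching + research + community_service,
    }
    return json.dumps(record)

payload = assess_lecturer(80, 70, 60)
```

Because the service emits plain JSON, the same endpoint can back a PHP web front end or a mobile app without change, which is the portability argument the abstract makes.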
Zbikowski, Susan M; Jack, Lisa M; McClure, Jennifer B; Deprey, Mona; Javitz, Harold S; McAfee, Timothy A; Catz, Sheryl L; Richards, Julie; Bush, Terry; Swan, Gary E
2011-05-01
Phone counseling has become standard for behavioral smoking cessation treatment. Newer options include Web and integrated phone-Web treatment. No prior research, to our knowledge, has systematically compared the effectiveness of these three treatment modalities in a randomized trial. Understanding how utilization varies by mode, the impact of utilization on outcomes, and predictors of utilization across each mode could lead to improved treatments. One thousand two hundred and two participants were randomized to phone, Web, or combined phone-Web cessation treatment. Services varied by modality and were tracked using automated systems. All participants received 12 weeks of varenicline, printed guides, an orientation call, and access to a phone supportline. Self-report data were collected at baseline and 6-month follow-up. Overall, participants utilized phone services more often than the Web-based services. Among treatment groups with Web access, a significant proportion logged in only once (37% phone-Web, 41% Web), and those in the phone-Web group logged in less often than those in the Web group (mean = 2.4 vs. 3.7, p = .0001). Use of the phone also was correlated with increased use of the Web. In multivariate analyses, greater use of the phone- or Web-based services was associated with higher cessation rates. Finally, older age and the belief that certain treatments could improve success were consistent predictors of greater utilization across groups. Other predictors varied by treatment group. Opportunities for enhancing treatment utilization exist, particularly for Web-based programs. Increasing utilization more broadly could result in better overall treatment effectiveness for all intervention modalities.
Implementation of Sensor Twitter Feed Web Service Server and Client
2016-12-01
ARL-TN-0807 ● DEC 2016 ● US Army Research Laboratory. Implementation of Sensor Twitter Feed Web Service Server and Client, by Bhagyashree V Kulkarni (University of Maryland) and Michael H Lee (Computational...)
Tardiole Kuehne, Bruno; Estrella, Julio Cezar; Nunes, Luiz Henrique; Martins de Oliveira, Edvard; Hideo Nakamura, Luis; Gomes Ferreira, Carlos Henrique; Carlucci Santana, Regina Helena; Reiff-Marganiec, Stephan; Santana, Marcos José
2015-01-01
This paper proposes a system named AWSCS (Automatic Web Service Composition System) to evaluate different approaches for automatic composition of Web services, based on QoS parameters that are measured at execution time. The AWSCS is a system to implement different approaches for automatic composition of Web services and also to execute the resulting flows from these approaches. To demonstrate the results of this paper, a scenario was developed in which empirical flows were built to demonstrate the operation of AWSCS, since algorithms for automatic composition are not readily available to test. The results allow us to study the behaviour of running composite Web services when flows with the same functionality but different problem-solving strategies are compared. Furthermore, we observed that the load applied to the running system, as well as the type of load submitted to the system, is an important factor in defining which approach to Web service composition can achieve the best performance in production. PMID:26068216
Tardiole Kuehne, Bruno; Estrella, Julio Cezar; Nunes, Luiz Henrique; Martins de Oliveira, Edvard; Hideo Nakamura, Luis; Gomes Ferreira, Carlos Henrique; Carlucci Santana, Regina Helena; Reiff-Marganiec, Stephan; Santana, Marcos José
2015-01-01
This paper proposes a system named AWSCS (Automatic Web Service Composition System) to evaluate different approaches for automatic composition of Web services, based on QoS parameters that are measured at execution time. The AWSCS is a system to implement different approaches for automatic composition of Web services and also to execute the resulting flows from these approaches. To demonstrate the results of this paper, a scenario was developed in which empirical flows were built to demonstrate the operation of AWSCS, since algorithms for automatic composition are not readily available to test. The results allow us to study the behaviour of running composite Web services when flows with the same functionality but different problem-solving strategies are compared. Furthermore, we observed that the load applied to the running system, as well as the type of load submitted to the system, is an important factor in defining which approach to Web service composition can achieve the best performance in production.
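One simple QoS-driven selection policy of the kind such a system could apply can be sketched as follows; the flow names, measured values, and error-budget rule are invented for illustration and are not the AWSCS algorithm itself:

```python
# Each candidate flow offers the same functionality; pick the one whose
# QoS, as measured at execution time under the current load, is best.
# All numbers below are invented.
CANDIDATES = {
    "sequential-flow": {"response_ms": 420, "errors": 0.01},
    "parallel-flow": {"response_ms": 250, "errors": 0.03},
    "cached-flow": {"response_ms": 180, "errors": 0.00},
}

def best_flow(candidates, max_error=0.02):
    """Drop flows that violate the error budget, then choose the
    fastest of the rest -- one simple QoS policy among many."""
    ok = {name: q for name, q in candidates.items()
          if q["errors"] <= max_error}
    return min(ok, key=lambda name: ok[name]["response_ms"])
```

Because the measurements feed back from execution time, the chosen flow can change as the applied load changes, which matches the paper's observation that load is decisive for which approach performs best.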
Web services as applications' integration tool: QikProp case study.
Laoui, Abdel; Polyakov, Valery R
2011-07-15
Web services are a technology that enables the integration of applications running on different platforms, primarily by using XML to enable communication among different computers over the Internet. A large number of applications were designed as stand-alone systems before the concept of Web services was introduced, and it is a challenge to integrate them into larger computational networks. A generally applicable method of wrapping stand-alone applications into Web services was developed and is described. To test the technology, it was applied to QikProp for DOS (Windows). Although the performance of the application did not change when it was delivered as a Web service, this form of deployment offered several advantages, such as simplified and centralized maintenance, a smaller number of licenses, and practically no training for the end user. Because by using the described approach almost any legacy application can be wrapped as a Web service, this form of delivery may be recommended as a global alternative to traditional deployment solutions. Copyright © 2011 Wiley Periodicals, Inc.
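The wrapping idea can be sketched as a thin layer that shells out to the legacy executable and returns its captured output, which a web service endpoint would then place in the response body; the command used below is a stand-in, not QikProp itself:

```python
import subprocess
import sys

def run_legacy(args):
    """Run the stand-alone program and capture its text output. A
    Web service wrapper would expose this call behind an HTTP/SOAP
    endpoint and return the captured text in the response."""
    result = subprocess.run(args, capture_output=True, text=True, check=True)
    return result.stdout.strip()

# Stand-in for a legacy executable such as QikProp: any command-line
# program can be invoked the same way.
output = run_legacy([sys.executable, "-c", "print('42 descriptors')"])
```

Centralizing the executable behind such a wrapper is what yields the advantages the abstract lists: one installed copy to maintain and license, and clients that never see the legacy interface.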
Using USNO's API to Obtain Data
NASA Astrophysics Data System (ADS)
Lesniak, Michael V.; Pozniak, Daniel; Punnoose, Tarun
2015-01-01
The U.S. Naval Observatory (USNO) is in the process of modernizing its publicly available web services into APIs (Application Programming Interfaces). Services configured as APIs offer greater flexibility to the user and allow greater usage. Depending on the particular service, users who implement our APIs will receive either a PNG (Portable Network Graphics) image or data in JSON (JavaScript Object Notation) format. This raw data can then be embedded in third-party web sites or in apps. Part of the USNO's mission is to provide astronomical and timing data to government agencies and the general public. To this end, the USNO provides accurate computations of astronomical phenomena such as dates of lunar phases, rise and set times of the Moon and Sun, and lunar and solar eclipse times. Users who navigate to our web site and select one of our 18 services are prompted to complete a web form, specifying parameters such as date, time, location, and object. Many of our services work for years between 1700 and 2100, meaning that past, present, and future events can be computed. Upon form submission, our web server processes the request, computes the data, and outputs it to the user. Over recent years, the use of the web by the general public has vastly changed. In response, the USNO is modernizing its web-based data services. This includes making our computed data easier to embed within third-party web sites as well as easier to query from apps running on tablets and smart phones. To facilitate this, the USNO has begun converting its services into APIs. In addition to the existing web forms for the various services, users are able to make direct URL requests that return either an image or numerical data. To date, four of our web services have been configured to run with APIs. Two are image-producing services: "Apparent Disk of a Solar System Object" and "Day and Night Across the Earth."
Two API data services are "Complete Sun and Moon Data for One Day" and "Dates of Primary Phases of the Moon." Instructions for how to use our API services as well as examples of their use can be found on one of our explanatory web pages and will be discussed here.
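A client of such a data API builds a parameterized URL and parses the returned JSON. In the sketch below, the base URL, parameter names, and response shape are invented placeholders, not the actual USNO API:

```python
import json
from urllib.parse import urlencode

def phase_url(base, year):
    """Build a direct URL request of the kind the API accepts in
    place of the web form (parameter names are hypothetical)."""
    return base + "?" + urlencode({"year": year, "numphases": 4})

# A sample of the JSON shape such a service might return; the dates
# and field names are invented.
SAMPLE_RESPONSE = json.dumps({
    "year": 2015,
    "phasedata": [
        {"phase": "New Moon", "date": "2015 Jan 20"},
        {"phase": "First Quarter", "date": "2015 Jan 26"},
    ],
})

phases = [p["phase"] for p in json.loads(SAMPLE_RESPONSE)["phasedata"]]
```

Because the payload is machine-readable JSON rather than a rendered HTML page, a tablet or phone app can consume it directly, which is the point of the API conversion described above.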
SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services.
Gessler, Damian D G; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T
2009-09-23
SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous, disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remainder are wrappers for Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at http://sswap.info (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at http://sswap.info/protocol.jsp, developer tools at http://sswap.info/developer.jsp, and a portal to third-party ontologies at http://sswapmeet.sswap.info (a "swap meet").
SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation. SSWAP is novel by establishing the concept of a canonical yet mutable OWL DL graph that allows data and service providers to describe their resources, to allow discovery servers to offer semantically rich search engines, to allow clients to discover and invoke those resources, and to allow providers to respond with semantically tagged data. SSWAP allows for a mix-and-match of terms from both new and legacy third-party ontologies in these graphs.
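The discovery step, stripped to its essence, matches a query against the terms each resource advertises it consumes and produces. The sketch below uses plain Python sets where SSWAP uses OWL DL graphs and the Pellet reasoner, and all resource names and ontology terms are invented:

```python
# Toy resource descriptions: each provider advertises the ontology
# terms it consumes and produces (all names invented). A discovery
# server matches a client's query against these descriptions.
RESOURCES = [
    {"name": "qtl-mapper", "consumes": {"Marker"}, "produces": {"QTL"}},
    {"name": "blast-wrap", "consumes": {"Sequence"},
     "produces": {"Alignment"}},
]

def discover(consumes, produces):
    """Return the resources whose advertised description covers the
    requested input and output terms (set containment standing in
    for description-logic subsumption)."""
    return [r["name"] for r in RESOURCES
            if consumes <= r["consumes"] and produces <= r["produces"]]
```

Where this sketch uses exact set containment, a reasoner lets a query term match any subsuming class in the ontology hierarchy, which is what makes the real discovery semantic rather than syntactic.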
ERIC Educational Resources Information Center
Relyea, Harold C.; Halchin, L. Elaine; Hogue, Henry B.; Agnew, Grace; Martin, Mairead; Schottlaender, Brian E. C.; Jackson, Mary E.
2003-01-01
Theses five reports address five special issues: the effects of the September 11 attacks on information management, including homeland security, Web site information removal, scientific and technical information, and privacy concerns; federal policy for electronic government information; digital rights management and libraries; library Web portal…
Eccher, Claudio; Eccher, Lorenzo; Izzo, Umberto
2005-01-01
In this poster we describe the security solutions implemented in a web-based cooperative work framework for managing heart failure patients among the different health care professionals involved in the care process. The solution, developed in close collaboration with the Law Department of the University of Trento, is compliant with the new Italian Personal Data Protection Code, issued in 2003, which also regulates the storage and processing of health data.
WebGIS based on semantic grid model and web services
NASA Astrophysics Data System (ADS)
Zhang, WangFei; Yue, CaiRong; Gao, JianGuo
2009-10-01
As the meeting point of network technology and GIS technology, WebGIS has developed rapidly in recent years. However, constrained by the Web on one side and the characteristics of GIS on the other, traditional WebGIS suffers from some prominent problems: it cannot achieve interoperability among heterogeneous spatial databases, and it cannot provide cross-platform data access. The emergence of Web Services and Grid technology has brought great change to the field of WebGIS. Web Services provide interfaces that give sites the ability to share data and communicate with one another. The goal of Grid technology is to turn the Internet into a single large supercomputer that efficiently supports the overall sharing of computing resources, storage resources, data resources, information resources, knowledge resources, and expert resources. For WebGIS, however, these technologies only achieve the physical connection of data and information, and that is far from enough. Because experts in different fields understand the world differently and follow different professional conventions, policies, and habits, they describe the same geographic phenomenon in different ways, producing semantic heterogeneity: the same concept can differ greatly from one field to another. A WebGIS that ignores this semantic heterogeneity will answer users' questions wrongly, or fail to answer them at all. To solve this problem, this paper proposes and evaluates an effective method for developing WebGIS that combines semantic grid and Web Services technology.
In this paper, we study how to construct the ontology and how to combine Grid technology with Web Services, and, based on a detailed analysis of the computing characteristics and application model of the distributed data, we design an ontology-driven WebGIS query system built on Grid technology and Web Services.
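The ontology-driven query idea can be sketched as follows: before a query is dispatched to the heterogeneous spatial databases, the user's term is expanded through an ontology that aligns each community's vocabulary to a shared concept, so that every provider is queried in its own terms. The concept names and vocabulary alignments below are invented for illustration; a real system would load them from a shared geospatial ontology rather than a hard-coded table.

```python
# Sketch of ontology-mediated query expansion for a WebGIS front end.
# The concept/synonym table is hypothetical.

# shared concept -> terms used by different data providers' schemas
ONTOLOGY = {
    "watercourse": {"river", "stream", "creek"},
    "settlement":  {"city", "town", "village"},
}

def expand_term(term):
    """Map a user-supplied term to every provider-level synonym,
    plus the shared concept itself."""
    for concept, synonyms in ONTOLOGY.items():
        if term == concept or term in synonyms:
            return sorted(synonyms | {concept})
    return [term]  # unknown term: pass through unchanged

def build_queries(term, providers):
    """One (provider, term) pair per expanded synonym, so each
    heterogeneous database is asked in vocabulary it understands."""
    expanded = expand_term(term)
    return [(p, t) for p in providers for t in expanded]

print(expand_term("river"))  # -> ['creek', 'river', 'stream', 'watercourse']
```

Resolving the term at the semantic layer, before dispatch, is what lets the system answer a user who says "creek" with data a provider indexed under "stream", which is exactly the heterogeneity problem the abstract describes.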
Fan, Kenneth L; Avashia, Yash J; Dayicioglu, Deniz; DeGennaro, Vincent A; Thaller, Seth R
2014-04-01
Immediately after the January 2010 earthquake in Haiti, plastic surgeons provided disaster relief services through the University of Miami Miller School of Medicine for 5 months. To improve surgical care and promote awareness of plastic surgery's role in humanitarian assistance, an online communication platform (OCP) was initiated. An OCP is a Web-based application combining Web blogging, picture uploading, news posting, and private messaging systems in a single platform. The purpose of this study was to analyze the use of the OCP during disaster relief. Surgeries performed during the period from January 13 to May 28, 2010, were documented. The OCP was established with 4 priorities: ease of use, multimedia integration, organization capabilities, and security. Web traffic was documented. A 17-question survey was administered to 18 plastic surgeons who used the OCP after 1 year to assess their attitudes and perceptions. From January 13 to May 28, 2010, 413 operations were performed at the field hospital; 46.9% of these procedures were performed by plastic surgery teams. In the year beginning January 12, 2011, the OCP had 1117 visits with 530 absolute unique visitors. Of 17 plastic surgeons, 71% responded that the OCP improved follow-up and continuity of care by debriefing rotating plastic surgery teams. One hundred percent claimed that the OCP conveyed the role of plastic surgeons to the public. Results demonstrate the necessity of an OCP during disaster relief. The online communication platform permitted secure exchange of surgical management details, follow-up, photos, and miscellaneous necessary recommendations. Posted experiences and field hospital progress assisted in generating substantial awareness regarding the significant role and contribution played by plastic surgeons in disaster relief.
77 FR 19408 - Reinstate Index to Chapter III in 20 CFR
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-30
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2012-0018] Reinstate Index to Chapter III in 20 CFR AGENCY: Social Security Administration. ACTION: Notice; correction. SUMMARY: The Social Security... Chapter III in Title 20 of the Code of Federal Regulations. The document contains a misprinted Web site...