Science.gov

Sample records for distance web service

  1. Web Service

    MedlinePlus

    ... www.nlm.nih.gov/medlineplus/webservices.html MedlinePlus Web Service To use the sharing features on this ... please enable JavaScript. MedlinePlus offers a search-based Web service that provides access to MedlinePlus health topic ...

  2. Parallelization and optimization of genetic analyses in isolation by distance web service

    PubMed Central

    Turner, Julia L; Kelley, Scott T; Otto, James S; Valafar, Faramarz; Bohonak, Andrew J

    2009-01-01

    Background The Isolation by Distance Web Service (IBDWS) is a user-friendly web interface for analyzing patterns of isolation by distance in population genetic data. IBDWS enables researchers to perform a variety of statistical tests such as Mantel tests and reduced major axis regression (RMA), and returns vector-based graphs. The more than 60 citations since 2005 confirm the popularity and utility of this website. Despite its usefulness, analyses of data sets with over 65 populations can take hours or days to complete due to the computational intensity of the statistical tests. This is especially troublesome for web-based software, since users tend to expect real-time results on the order of seconds or, at most, minutes. Moreover, as genetic data continue to increase and diversify, so does the demand for more processing power. In order to increase the speed and efficiency of IBDWS, we first determined which aspects of the code were most time consuming and whether they might be amenable to improvement by parallelization or algorithmic optimization. Results Runtime tests uncovered two areas of IBDWS that consumed significant amounts of time: randomizations within the Mantel test and the RMA calculations. We found that these sections of code could be restructured and parallelized to improve efficiency. The code was first optimized by combining two similar randomization routines and implementing a Fisher-Yates shuffling algorithm, and those routines were then parallelized. Tests of the parallelization and Fisher-Yates algorithmic improvements were performed on a variety of data sets ranging from 10 to 150 populations. All tested algorithms showed runtime reductions and a very close fit to the predicted speedups based on time-complexity calculations. In the case of 150 populations with 10,000 randomizations, data were analyzed 23 times faster. Conclusion Since the implementation of the new algorithms in late 2007, datasets have continued to increase substantially in size
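
    As a rough illustration of the optimization described above (an in-place Fisher-Yates shuffle driving the Mantel-test randomizations, with the permutation loop split across worker processes), a minimal Python sketch is given below. The function names and the use of numpy/multiprocessing are assumptions made for illustration; this is not the IBDWS code.

        # Minimal sketch (not the IBDWS implementation): Mantel-test randomizations
        # using an in-place Fisher-Yates shuffle, split across worker processes.
        import numpy as np
        from multiprocessing import Pool

        def fisher_yates(order, rng):
            # Shuffle an index array in place in O(n) time.
            for i in range(len(order) - 1, 0, -1):
                j = rng.integers(0, i + 1)
                order[i], order[j] = order[j], order[i]
            return order

        def mantel_r(geo, gen):
            # Correlation between the upper triangles of two distance matrices.
            iu = np.triu_indices_from(geo, k=1)
            return np.corrcoef(geo[iu], gen[iu])[0, 1]

        def permuted_r(args):
            geo, gen, n_perm, seed = args
            rng = np.random.default_rng(seed)
            order = np.arange(geo.shape[0])
            out = np.empty(n_perm)
            for k in range(n_perm):
                fisher_yates(order, rng)
                out[k] = mantel_r(geo[np.ix_(order, order)], gen)
            return out

        def parallel_mantel_p(geo, gen, n_perm=10000, workers=4):
            observed = mantel_r(geo, gen)
            chunks = [(geo, gen, n_perm // workers, s) for s in range(workers)]
            with Pool(workers) as pool:
                null = np.concatenate(pool.map(permuted_r, chunks))
            return observed, (np.sum(null >= observed) + 1) / (len(null) + 1)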

  3. Web-Enabled Distance Education Environment.

    ERIC Educational Resources Information Center

    Bouras, Christos; Lampsas, Petros; Bazaios, Antonis; Tsintilas, Giorgos

    This paper describes the design of a synchronous World Wide Web-based distance education environment developed at the Telematics Laboratory of the Computer Engineering and Informatics Department (CEID) and the Computer Technology Institute (Greece); the environment uses telematics services to conduct lessons over computer networks, simulating a traditional…

  4. Web-Browsing Competencies of Pre-Service Adult Facilitators: Implications for Curriculum Transformation and Distance Learning

    ERIC Educational Resources Information Center

    Theresa, Ofoegbu; Ugwu, Agboeze Matthias; Ihebuzoaju, Anyanwu Joy; Uche, Asogwa

    2013-01-01

    The study investigated the Web-browsing competencies of pre-service adult facilitators in the southeast geopolitical zone of Nigeria. Survey design was adopted for the study. The population consists of all pre-service adult facilitators in all the federal universities in the southeast geopolitical zone of Nigeria. Accidental sampling technique was…

  5. Adding Interactivity to Web Based Distance Learning.

    ERIC Educational Resources Information Center

    Cafolla, Ralph; Knee, Richard

    Web Based Distance Learning (WBDL) is a form of distance learning based on providing instruction mainly on the World Wide Web. This paradigm has limitations, especially the lack of interactivity inherent in the Web. The purpose of this paper is to discuss some of the technologies the authors have used in their courses at Florida Atlantic…

  6. Web service performance script

    Energy Science and Technology Software Center (ESTSC)

    2009-08-01

    This python script, available from ESRI and modified here, checks a server at specified intervals to ensure that web services remain up and running. If any are found to be off, they are automatically turned back on.
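
    The ESTSC record describes the script only in outline. A generic sketch of such a watchdog in Python is shown below: it polls each service endpoint at a fixed interval and calls a restart hook when a check fails. The URLs and the restart mechanism are placeholders, not the ESRI/ArcGIS admin calls used by the actual script.

        # Generic sketch of a service watchdog: poll each service URL on an interval
        # and call a (placeholder) restart hook when a check fails.
        import time
        import urllib.request

        SERVICES = {
            # name -> health-check URL (placeholders, not the actual ESRI endpoints)
            "map_service": "http://example.org/services/map?f=json",
            "geocode_service": "http://example.org/services/geocode?f=json",
        }
        CHECK_INTERVAL_SECONDS = 300

        def is_up(url, timeout=10):
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    return resp.status == 200
            except OSError:
                return False

        def restart(name):
            # Placeholder: the real script would call the server's admin API here.
            print(f"restarting {name}")

        def main():
            while True:
                for name, url in SERVICES.items():
                    if not is_up(url):
                        restart(name)
                time.sleep(CHECK_INTERVAL_SECONDS)

        if __name__ == "__main__":
            main()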

  7. Using Web-Based Distance Learning to Reduce Cultural Distance

    ERIC Educational Resources Information Center

    Wong, L. Fai; Trinidad, S. G.

    2004-01-01

    In recent years, Web-based distance learning (WBDL) systems have become a popular learning environment for many western learners. While it has been established as an effective learning alternative, WBDL is not flourishing in Hong Kong as expected. This paper proposes that this is because Hong Kong students are not trained to learn independently…

  8. Web Page Design in Distance Education

    ERIC Educational Resources Information Center

    Isman, Aytekin; Dabaj, Fahme; Gumus, Agah; Altinay, Fahriye; Altinay, Zehra

    2004-01-01

    Distance education is a contemporary educational process. It facilitates fast, easy delivery of information with concrete hardware and software tools. The development of high technology, the Internet, and web design has made them an effective delivery system for students. Within the global perspective, even all the work…

  9. The EMBRACE web service collection

    PubMed Central

    Pettifer, Steve; Ison, Jon; Kalaš, Matúš; Thorne, Dave; McDermott, Philip; Jonassen, Inge; Liaquat, Ali; Fernández, José M.; Rodriguez, Jose M.; Partners, INB-; Pisano, David G.; Blanchet, Christophe; Uludag, Mahmut; Rice, Peter; Bartaseviciute, Edita; Rapacki, Kristoffer; Hekkelman, Maarten; Sand, Olivier; Stockinger, Heinz; Clegg, Andrew B.; Bongcam-Rudloff, Erik; Salzemann, Jean; Breton, Vincent; Attwood, Teresa K.; Cameron, Graham; Vriend, Gert

    2010-01-01

    The EMBRACE (European Model for Bioinformatics Research and Community Education) web service collection is the culmination of a 5-year project that set out to investigate issues involved in developing and deploying web services for use in the life sciences. The project concluded that in order for web services to achieve widespread adoption, standards must be defined for the choice of web service technology, for semantically annotating both service function and the data exchanged, and a mechanism for discovering services must be provided. Building on this, the project developed: EDAM, an ontology for describing life science web services; BioXSD, a schema for exchanging data between services; and a centralized registry (http://www.embraceregistry.net) that collects together around 1000 services developed by the consortium partners. This article presents the current status of the collection and its associated recommendations and standards definitions. PMID:20462862

  10. MedlinePlus Connect: Web Service

    MedlinePlus

    ... nih.gov/medlineplus/connect/service.html MedlinePlus Connect: Web Service To use the sharing features on this ... if you implement MedlinePlus Connect by contacting us . Web Service Overview The parameters for the Web service ...

  11. MedlinePlus Connect: Web Service

    MedlinePlus

    ... https://medlineplus.gov/connect/service.html MedlinePlus Connect: Web Service To use the sharing features on this ... if you implement MedlinePlus Connect by contacting us . Web Service Overview The parameters for the Web service ...
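
    As an illustration of how such a request is typically formed, the sketch below queries MedlinePlus Connect for patient-education material by diagnosis code using Infobutton-style parameters. The parameter names and code-system OID reflect the documented interface as best understood here; consult the linked page for the authoritative parameter list.

        # Sketch of a MedlinePlus Connect request: look up patient-education material
        # for an ICD-10-CM diagnosis code. Verify parameter names against the
        # MedlinePlus Connect documentation page linked above.
        import json
        import urllib.parse
        import urllib.request

        BASE = "https://connect.medlineplus.gov/service"
        params = {
            "mainSearchCriteria.v.cs": "2.16.840.1.113883.6.90",  # ICD-10-CM code system OID
            "mainSearchCriteria.v.c": "J45.909",                   # unspecified asthma
            "knowledgeResponseType": "application/json",
        }
        with urllib.request.urlopen(BASE + "?" + urllib.parse.urlencode(params)) as resp:
            data = json.load(resp)
        print(json.dumps(data, indent=2)[:400])   # inspect the returned structure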

  12. Web Service: MedlinePlus

    MedlinePlus

    ... this page: https://medlineplus.gov/webservices.html MedlinePlus Web Service To use the sharing features on this ... please enable JavaScript. MedlinePlus offers a search-based Web service that provides access to MedlinePlus health topic ...

  13. RESTful Web Services at BNL

    SciTech Connect

    Casella, R.

    2011-06-14

    RESTful (REpresentational State Transfer) web services are an alternative implementation to SOAP/RPC web services in a client/server model. BNL's IT Division has started deploying RESTful Web Services for enterprise data retrieval and manipulation. Data is currently used by system administrators for tracking configuration information and, as the system is expanded, will be used by Cyber Security for vulnerability management and as an aid to cyber investigations. This talk will describe the implementation and outstanding issues, as well as some of the reasons for choosing RESTful over SOAP/RPC, and future directions.
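
    The talk abstract gives no code; the sketch below only illustrates the RESTful style it contrasts with SOAP/RPC, in which configuration records are resources addressed by URL and manipulated with standard HTTP verbs. The host and paths are placeholders, not BNL's service.

        # Generic sketch of REST-style resource access: configuration records are
        # addressed by URL and read/updated with HTTP verbs. Placeholder endpoints.
        import json
        import urllib.request

        BASE = "https://config.example.org/api/hosts"

        def get_host(name):
            with urllib.request.urlopen(f"{BASE}/{name}") as resp:
                return json.load(resp)

        def update_host(name, record):
            req = urllib.request.Request(
                f"{BASE}/{name}",
                data=json.dumps(record).encode(),
                headers={"Content-Type": "application/json"},
                method="PUT",
            )
            with urllib.request.urlopen(req) as resp:
                return resp.status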

  14. Library Services to Distance Education Students at UNL.

    ERIC Educational Resources Information Center

    Adams, Kate; Cassner, Mary

    This paper presents a program overview of library services to distance education students at the University of Nebraska-Lincoln (UNL). In the fall of 1997, a survey on use of library services was sent to students in an interdepartmental master's program. Questions focused on the use of the library online catalog, the World Wide Web, and the…

  15. Embracing a Customer Service Mindset: A Fresh Examination of Services for Distance Learners

    ERIC Educational Resources Information Center

    Steiner, Heidi

    2013-01-01

    Library literature and blogs frequently discuss customer service and user experience in physical libraries and Web sites, but little is said about this mentality toward services for distance learners specifically. This paper takes customer service best practices from well-known thinkers of the business world and makes connections to services for…

  16. Semantic Web for Manufacturing Web Services

    SciTech Connect

    Kulvatunyou, Boonserm; Ivezic, Nenad

    2002-06-01

    As markets become unexpectedly turbulent with a shortened product life cycle and a power shift towards buyers, the need for methods to rapidly and cost-effectively develop products, production facilities and supporting software is becoming urgent. The use of a virtual enterprise plays a vital role in surviving turbulent markets. However, its success requires reliable and large-scale interoperation among trading partners via a semantic web of trading partners' services whose properties, capabilities, and interfaces are encoded in an unambiguous as well as computer-understandable form. This paper demonstrates a promising approach to integration and interoperation between a design house and a manufacturer by developing semantic web services for business and engineering transactions. To this end, detailed activity and information flow diagrams are developed, in which the two trading partners exchange messages and documents. The properties and capabilities of the manufacturer sites are defined using DARPA Agent Markup Language (DAML) ontology definition language. The prototype development of semantic webs shows that enterprises can widely interoperate in an unambiguous and autonomous manner; hence, virtual enterprise is realizable at a low cost.

  17. Web-Based Communications, the Internet, and Distance Education. Readings in Distance Education, Number 7.

    ERIC Educational Resources Information Center

    Moore, Michael G., Ed.; Cozine, Geoffrey T., Ed.

    This book brings together a selection of articles published in "The American Journal of Distance Education" that are related to Web-based delivery of distance education. Articles include: "Performance and Perceptions of Distance Learners in Cyberspace" (Peter Navarro and Judy Shoemaker); "Distance Education for Dentists: Improving the Quality of…

  18. WEBCAP: Web Scheduler for Distance Learning Multimedia Documents with Web Workload Considerations

    ERIC Educational Resources Information Center

    Habib, Sami; Safar, Maytham

    2008-01-01

    In many web applications, such as the distance learning, the frequency of refreshing multimedia web documents places a heavy burden on the WWW resources. Moreover, the updated web documents may encounter inordinate delays, which make it difficult to retrieve web documents in time. Here, we present an Internet tool called WEBCAP that can schedule…

  19. Semantic Search of Web Services

    ERIC Educational Resources Information Center

    Hao, Ke

    2013-01-01

    This dissertation addresses semantic search of Web services using natural language processing. We first survey various existing approaches, focusing on the fact that the expensive costs of current semantic annotation frameworks result in limited use of semantic search for large scale applications. We then propose a vector space model based service…

  20. APPRIS WebServer and WebServices

    PubMed Central

    Rodriguez, Jose Manuel; Carro, Angel; Valencia, Alfonso; Tress, Michael L.

    2015-01-01

    This paper introduces the APPRIS WebServer (http://appris.bioinfo.cnio.es) and WebServices (http://apprisws.bioinfo.cnio.es). Both the web servers and the web services are based around the APPRIS Database, a database that presently houses annotations of splice isoforms for five different vertebrate genomes. The APPRIS WebServer and WebServices provide access to the computational methods implemented in the APPRIS Database, while the APPRIS WebServices also allow retrieval of the annotations. The APPRIS WebServer and WebServices annotate splice isoforms with protein structural and functional features, and with data from cross-species alignments. In addition, they can use the annotations of structure, function and conservation to select a single reference isoform for each protein-coding gene (the principal protein isoform). APPRIS principal isoforms have been shown to agree overwhelmingly with the main protein isoform detected in proteomics experiments. The APPRIS WebServer allows for the annotation of splice isoforms for individual genes, and provides a range of visual representations and tools to allow researchers to identify the likely effect of splicing events. The APPRIS WebServices permit users to generate annotations automatically in high throughput mode and to interrogate the annotations in the APPRIS Database. The APPRIS WebServices have been implemented using REST architecture to be flexible, modular and automatic. PMID:25990727

  1. APPRIS WebServer and WebServices.

    PubMed

    Rodriguez, Jose Manuel; Carro, Angel; Valencia, Alfonso; Tress, Michael L

    2015-07-01

    This paper introduces the APPRIS WebServer (http://appris.bioinfo.cnio.es) and WebServices (http://apprisws.bioinfo.cnio.es). Both the web servers and the web services are based around the APPRIS Database, a database that presently houses annotations of splice isoforms for five different vertebrate genomes. The APPRIS WebServer and WebServices provide access to the computational methods implemented in the APPRIS Database, while the APPRIS WebServices also allow retrieval of the annotations. The APPRIS WebServer and WebServices annotate splice isoforms with protein structural and functional features, and with data from cross-species alignments. In addition, they can use the annotations of structure, function and conservation to select a single reference isoform for each protein-coding gene (the principal protein isoform). APPRIS principal isoforms have been shown to agree overwhelmingly with the main protein isoform detected in proteomics experiments. The APPRIS WebServer allows for the annotation of splice isoforms for individual genes, and provides a range of visual representations and tools to allow researchers to identify the likely effect of splicing events. The APPRIS WebServices permit users to generate annotations automatically in high throughput mode and to interrogate the annotations in the APPRIS Database. The APPRIS WebServices have been implemented using REST architecture to be flexible, modular and automatic. PMID:25990727

  2. Earth Science Mining Web Services

    NASA Technical Reports Server (NTRS)

    Pham, Long; Lynnes, Christopher; Hegde, Mahabaleshwa; Graves, Sara; Ramachandran, Rahul; Maskey, Manil; Keiser, Ken

    2008-01-01

    To allow scientists further capabilities in the area of data mining and web services, the Goddard Earth Sciences Data and Information Services Center (GES DISC) and researchers at the University of Alabama in Huntsville (UAH) have developed a system to mine data at the source without the need of network transfers. The system has been constructed by linking together several pre-existing technologies: the Simple Scalable Script-based Science Processor for Measurements (S4PM), a processing engine at the GES DISC; the Algorithm Development and Mining (ADaM) system, a data mining toolkit from UAH that can be configured in a variety of ways to create customized mining processes; ActiveBPEL, a workflow execution engine based on BPEL (Business Process Execution Language); XBaya, a graphical workflow composer; and the EOS Clearinghouse (ECHO). XBaya is used to construct an analysis workflow at UAH using ADaM components, which are also installed remotely at the GES DISC, wrapped as Web Services. The S4PM processing engine searches ECHO for data using space-time criteria, staging them to cache, allowing the ActiveBPEL engine to remotely orchestrate the processing workflow within S4PM. As mining is completed, the output is placed in an FTP holding area for the end user. The goals are to give users control over the data they want to process, while mining data at the data source using the server's resources rather than transferring the full volume over the internet. These diverse technologies have been infused into a functioning, distributed system with only minor changes to the underlying technologies. The key to the infusion is the loosely coupled, Web-Services-based architecture: All of the participating components are accessible (one way or another) through SOAP (Simple Object Access Protocol)-based Web Services.

  3. Earth Science Mining Web Services

    NASA Astrophysics Data System (ADS)

    Pham, L. B.; Lynnes, C. S.; Hegde, M.; Graves, S.; Ramachandran, R.; Maskey, M.; Keiser, K.

    2008-12-01

    To allow scientists further capabilities in the area of data mining and web services, the Goddard Earth Sciences Data and Information Services Center (GES DISC) and researchers at the University of Alabama in Huntsville (UAH) have developed a system to mine data at the source without the need of network transfers. The system has been constructed by linking together several pre-existing technologies: the Simple Scalable Script-based Science Processor for Measurements (S4PM), a processing engine at the GES DISC; the Algorithm Development and Mining (ADaM) system, a data mining toolkit from UAH that can be configured in a variety of ways to create customized mining processes; ActiveBPEL, a workflow execution engine based on BPEL (Business Process Execution Language); XBaya, a graphical workflow composer; and the EOS Clearinghouse (ECHO). XBaya is used to construct an analysis workflow at UAH using ADaM components, which are also installed remotely at the GES DISC, wrapped as Web Services. The S4PM processing engine searches ECHO for data using space-time criteria, staging them to cache, allowing the ActiveBPEL engine to remotely orchestrate the processing workflow within S4PM. As mining is completed, the output is placed in an FTP holding area for the end user. The goals are to give users control over the data they want to process, while mining data at the data source using the server's resources rather than transferring the full volume over the internet. These diverse technologies have been infused into a functioning, distributed system with only minor changes to the underlying technologies. The key to this infusion is the loosely coupled, Web-Services-based architecture: All of the participating components are accessible (one way or another) through SOAP (Simple Object Access Protocol)-based Web Services.

  4. Faculty Perceptions of Web-Based Distance Education in Agriculture.

    ERIC Educational Resources Information Center

    Born, Kevin A.; Miller, Greg

    1999-01-01

    A survey of 42 agronomy faculty showed their perceptions of Web-based distance education were higher when they were familiar with the master of science in agronomy program or had viewed a lesson. Their concerns included the value and rigor of Web-based degree programs and the effectiveness of online student-teacher interaction. (SK)

  5. Web Feature Service Semantic Mediation

    NASA Astrophysics Data System (ADS)

    Hobona, G.; Bermudez, L. E.; Brackin, R.; Percivall, G. S.

    2012-12-01

    Scientists from different organizations and disciplines need to work together to find the solutions to complex problems. Multi-disciplinary science typically involves users with specialized tools and their own preferred view of the data including unique characteristics of the user's information model and symbology. Even though organizations use web services to expose data, there are still semantic inconsistencies that need to be solved. Recent activities within the OGC Interoperability Program (IP) have helped advance semantic mediation solutions when using OGC services to help solve complex problems. The OGC standards development process is influenced by the feedback of activities within the Interoperability Program, which conducts international interoperability initiatives such as Testbeds, Pilot Projects, Interoperability Experiments, and Interoperability Support Services. These activities are designed to encourage rapid development, testing, validation, demonstration and adoption of open, consensus based standards and best practices. Two recent Testbeds, the OGC Web Services Phase 8 and Phase 9, have advanced the use of semantic mediation approaches to increase semantic interoperability among geospatial communities. The Cross-Community Interoperability (CCI) thread within these two testbeds, advanced semantic mediation approaches for data discovery, access and use of heterogeneous data models and heterogeneous metadata models. This presentation will provide an overview of the interoperability program, the CCI Thread and will explain the methodology to mediate heterogeneous GML Application Profiles served via WFS, including discovery of services via a catalog standard interface and mediating symbology applicable to each application profile.

  6. Effective Web Design and Core Communication Issues: The Missing Components in Web-Based Distance Education.

    ERIC Educational Resources Information Center

    Burch, Randall O.

    2001-01-01

    Discussion of Web-based distance education focuses on communication issues. Highlights include Internet communications; components of a Web site, including site architecture, user interface, information delivery method, and mode of feedback; elements of Web design, including conceptual design, sensory design, and reactive design; and a Web…

  7. Socialization of Distance Education: The Web as Enabler.

    ERIC Educational Resources Information Center

    Parker, Drew; Rossner-Merrill, Vivian

    The World Wide Web has allowed the delineation of distance versus place-based education to become a spectrum rather than a binary choice. This paper discusses a novel format of distance education, called "Virtual Seminars," and its relation to issues within cognitive flexibility theory. Virtual seminars are interactive courses offered over the…

  8. Technical Services and the World Wide Web.

    ERIC Educational Resources Information Center

    Scheschy, Virginia M.

    The World Wide Web and browsers such as Netscape and Mosaic have simplified access to electronic resources. Today, technical services librarians can share in the wealth of information available on the Web. One of the premier Web sites for acquisitions librarians is AcqWeb, a cousin of the AcqNet listserv. In addition to interesting news items,…

  9. The Organizational Role of Web Services

    ERIC Educational Resources Information Center

    Mitchell, Erik

    2011-01-01

    The workload of Web librarians is already split between Web-related and other library tasks. But today's technological environment has created new implications for existing services and new demands for staff time. It is time to reconsider how libraries can best allocate resources to provide effective Web services. Delivering high-quality services…

  10. Dynamic selection mechanism for quality of service aware web services

    NASA Astrophysics Data System (ADS)

    D'Mello, Demian Antony; Ananthanarayana, V. S.

    2010-02-01

    A web service is an interface to a software component that can be accessed by standard Internet protocols. Web service technology enables application-to-application communication and interoperability. The increasing number of web service providers throughout the globe has produced numerous web services providing the same or similar functionality. This necessitates the use of tools and techniques to search for suitable services available over the Web. UDDI (universal description, discovery and integration) is the first initiative to find suitable web services based on the requester's functional demands. However, the requester's requirements may also include non-functional aspects like quality of service (QoS). In this paper, the authors define a QoS model for QoS-aware and business-driven web service publishing and selection. The authors propose a QoS requirement format for requesters to specify their complex demands on QoS for web service selection. The authors define a tree structure called the quality constraint tree (QCT) to represent the requester's variety of requirements on QoS properties having varied preferences. The paper proposes a QoS broker based architecture for web service selection, which enables requesters to specify their QoS requirements and select a qualitatively optimal web service. A web service selection algorithm is presented, which ranks functionally similar web services based on the degree of satisfaction of the requester's QoS requirements and preferences. The paper defines web service provider qualities to distinguish qualitatively competitive web services. The paper also presents the modelling and selection mechanism for the requester's alternative constraints defined on the QoS. The authors implement the QoS broker based system to prove the correctness of the proposed web service selection mechanism.
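
    A much-simplified sketch of the ranking idea described above, scoring each functionally matching service by the weighted fraction of the requester's QoS constraints it satisfies, is shown below. It is an illustration only, not the authors' quality-constraint-tree algorithm; all names and values are invented.

        # Simplified sketch of QoS-aware ranking: score each functionally matching
        # service by the weighted fraction of the requester's QoS constraints it
        # satisfies. Illustration only, not the paper's QCT-based algorithm.

        def satisfies(value, constraint):
            op, threshold = constraint            # e.g. ("<=", 200) for response time
            return {"<=": value <= threshold, ">=": value >= threshold}[op]

        def rank_services(services, requirements, weights):
            """services: {name: {qos_property: value}}
               requirements: {qos_property: (op, threshold)}
               weights: {qos_property: preference weight}"""
            scores = {}
            for name, qos in services.items():
                scores[name] = sum(
                    weights[p] for p, c in requirements.items()
                    if p in qos and satisfies(qos[p], c)
                )
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        candidates = {
            "svcA": {"response_ms": 120, "availability": 0.999, "cost": 0.02},
            "svcB": {"response_ms": 300, "availability": 0.990, "cost": 0.01},
        }
        reqs = {"response_ms": ("<=", 200), "availability": (">=", 0.995), "cost": ("<=", 0.05)}
        prefs = {"response_ms": 0.5, "availability": 0.3, "cost": 0.2}
        print(rank_services(candidates, reqs, prefs))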

  11. Decreasing transactional distance in a Web-based course.

    PubMed

    Pattillo, Robin E

    2007-01-01

    The Horizon Wimba online Web-conferencing voice system was used to facilitate dialogue and decrease transactional distance in a Web-based course. Small-group (≤6) discussion sessions were held and addressed topics pertinent to clinical practice. Students were asked to evaluate the synchronous voice discussion groups via a Flashlight survey at the end of the semester. Anecdotal and survey responses indicated that discussion groups increased dialogue between faculty and students. PMID:17496503

  12. Distance Learning Library Services in Ugandan Universities

    ERIC Educational Resources Information Center

    Mayende, Jackline Estomihi Kiwelu; Obura, Constant Okello

    2013-01-01

    The study carried out at Makerere University and Uganda Martyrs University in 2010 aimed at providing strategies for enhanced distance learning library services in terms of convenience and adequacy. The study adopted a cross sectional descriptive survey design. The study revealed services provided in branch libraries in Ugandan universities were…

  13. A web service infrastructure for thermochemical data.

    PubMed

    Paolini, Christopher P; Bhattacharjee, Subrata

    2008-07-01

    W3C standardized Web Services are becoming an increasingly popular middleware technology used to facilitate the open exchange of chemical data. While several projects in existence use Web Services to wrap existing commercial and open-source tools that mine chemical structure data, no Web Service infrastructure has yet been developed to compute thermochemical properties of substances. This work presents an infrastructure of Web Services for thermochemical data retrieval. Several examples are presented to demonstrate how our Web Services can be called from Java, through JavaScript using an AJAX methodology, and within commonly used commercial applications such as Microsoft Excel and MATLAB for use in computational work. We illustrate how a JANAF table, widely used by chemists and engineers, can be quickly reproduced through our Web Service infrastructure. PMID:18543903

  14. Discovery and Classification of Bioinformatics Web Services

    SciTech Connect

    Rocco, D; Critchlow, T

    2002-09-02

    The transition of the World Wide Web from a paradigm of static Web pages to one of dynamic Web services provides new and exciting opportunities for bioinformatics with respect to data dissemination, transformation, and integration. However, the rapid growth of bioinformatics services, coupled with non-standardized interfaces, diminishes the potential that these Web services offer. To face this challenge, we examine the notion of a Web service class that defines the functionality provided by a collection of interfaces. These descriptions are an integral part of a larger framework that can be used to discover, classify, and wrap Web services automatically. We discuss how this framework can be used in the context of the proliferation of sites offering BLAST sequence alignment services for specialized data sets.

  15. Storage Manager and File Transfer Web Services

    SciTech Connect

    William A Watson III; Ying Chen; Jie Chen; Walt Akers

    2002-07-01

    Web services are emerging as an interesting mechanism for a wide range of grid services, particularly those focused upon information services and control. When coupled with efficient data transfer services, they provide a powerful mechanism for building a flexible, open, extensible data grid for science applications. In this paper we present our prototype work on a Java Storage Resource Manager (JSRM) web service and a Java Reliable File Transfer (JRFT) web service. A Java client (Grid File Manager) built on top of JSRM is developed to demonstrate the capabilities of these web services. The purpose of this work is to show the extent to which SOAP-based web services are an appropriate direction for building a grid-wide data management system, and eventually grid-based portals.

  16. An Automatic Web Service Composition Framework Using QoS-Based Web Service Ranking Algorithm

    PubMed Central

    Mallayya, Deivamani; Ramachandran, Baskaran; Viswanathan, Suganya

    2015-01-01

    Web services have become the technology of choice for service oriented computing to meet the interoperability demands in web applications. In the Internet era, the exponential addition of web services makes “quality of service” an essential parameter in discriminating between web services. In this paper, a user preference based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and the QoS aspects of the web service. When the user's request cannot be fulfilled by a single atomic service, several existing services should be composed and delivered as a composition. The proposed framework allows the user to specify the local and global constraints for composite web services, which improves flexibility. The UPWSR algorithm identifies best-fit services for each task in the user request and, by choosing the number of candidate services for each task, reduces the time to generate the composition plans. To tackle the problem of web service composition, the QoS aware automatic web service composition (QAWSC) algorithm proposed in this paper is based on the QoS aspects of the web services and user preferences. The proposed framework allows the user to provide feedback about the composite service, which improves the reputation of the services. PMID:26504894

  17. Socialization in the "Virtual Hallway": Instant Messaging in the Asynchronous Web-Based Distance Education Classroom.

    ERIC Educational Resources Information Center

    Nicholson, Scott

    2002-01-01

    Examined differences in communication between master's degree students at Syracuse University who used instant messaging (IM) services and those who did not in the same asynchronous distance education Web-based course. Results showed that students who used IM found it easier to communicate and felt a stronger sense of community. (Author/LRW)

  18. A research on semantic Web services discovery mechanism

    NASA Astrophysics Data System (ADS)

    Chen, Zhijun; Li, Xinke

    2011-12-01

    Semantic Web service discovery focuses on finding the best services among the many services that provide the same or similar functions, based on semantic matching algorithms. This paper first proposes a new QoS model for describing Semantic Web services by adding a new significant QoS characteristic (SU ratio). To improve discovery efficiency, we then give an improved algorithm for matching semantic web services; the new algorithm focuses not only on the inclusion relation of ontology concepts in the taxonomic tree, as in classic algorithms, but also includes binary relations. By comparing weighted semantic distances in the taxonomic tree, the similarity computation is more accurate and the precision and recall of the algorithm are enhanced. Finally, a preliminary case study was carried out to demonstrate its applicability and viability.
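
    A toy sketch of the general idea of path-distance similarity in a taxonomic tree is given below: the similarity of two concepts decreases with the weighted length of the path through their lowest common ancestor. The taxonomy and weighting are invented for illustration and do not reproduce the paper's algorithm.

        # Toy sketch of path-distance similarity in a taxonomy tree: similarity decays
        # with the (weighted) number of edges between two concepts. Illustrative only.

        TAXONOMY = {  # child -> parent
            "SUV": "Car", "Sedan": "Car", "Car": "Vehicle", "Truck": "Vehicle", "Vehicle": None,
        }

        def ancestors(c):
            path = []
            while c is not None:
                path.append(c)
                c = TAXONOMY.get(c)
            return path

        def semantic_distance(a, b, edge_weight=1.0):
            pa, pb = ancestors(a), ancestors(b)
            common = next(x for x in pa if x in pb)       # lowest common ancestor
            return edge_weight * (pa.index(common) + pb.index(common))

        def similarity(a, b):
            return 1.0 / (1.0 + semantic_distance(a, b))

        print(similarity("SUV", "Sedan"))   # closer concepts -> higher similarity
        print(similarity("SUV", "Truck"))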

  19. Enriching the Web Processing Service

    NASA Astrophysics Data System (ADS)

    Wosniok, Christoph; Bensmann, Felix; Wössner, Roman; Kohlus, Jörn; Roosmann, Rainer; Heidmann, Carsten; Lehfeldt, Rainer

    2014-05-01

    The OGC Web Processing Service (WPS) provides a standard for implementing geospatial processes in service-oriented networks. In its current version 1.0.0 it provides the operations GetCapabilities, DescribeProcess and Execute, which can be used to offer custom processes based on single or multiple sub-processes. A large range of ready-to-use, fine-grained, fundamental geospatial processes has been developed by the GIS community in the past. However, modern use cases and whole workflow processes demand lifecycle management and service orchestration. Orchestrating smaller sub-processes is a task towards interoperability; comprehensive documentation using appropriate metadata is also required. Though different approaches have been tested in the past, developing complex WPS applications still requires programming skills, knowledge about the software libraries in use and a lot of integration effort. Our toolset RichWPS aims at providing a better overall experience by setting up two major components. The RichWPS ModelBuilder enables the graphics-aided design of workflow processes based on existing local and distributed processes and geospatial services. Once tested by the RichWPS Server, a composition can be deployed for production use on the RichWPS Server. The ModelBuilder obtains the necessary processes and services from a directory service, the RichWPS semantic proxy. It manages the lifecycle and is able to visualize results and debugging information. One aim will be to generate reproducible results; the workflow should be documented by metadata that can be integrated in Spatial Data Infrastructures. The RichWPS Server provides a set of interfaces to the ModelBuilder for, among other things, testing composed workflow sequences, estimating their performance and publishing them as common processes. Therefore the server is oriented towards the upcoming WPS 2.0 standard and its ability to transactionally deploy and undeploy processes making use of a WPS
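
    As an illustration of the three WPS 1.0.0 operations named above, the key-value-pair form of the requests can be issued over plain HTTP as sketched below; the endpoint and process identifier are placeholders and are not part of RichWPS.

        # Sketch of WPS 1.0.0 key-value-pair requests for the three operations named
        # above. The endpoint and process identifier are placeholders.
        import urllib.parse
        import urllib.request

        WPS = "http://example.org/wps"   # placeholder endpoint

        def wps_get(request, **extra):
            params = {"service": "WPS", "version": "1.0.0", "request": request, **extra}
            with urllib.request.urlopen(WPS + "?" + urllib.parse.urlencode(params)) as resp:
                return resp.read().decode()

        capabilities_xml = wps_get("GetCapabilities")
        describe_xml = wps_get("DescribeProcess", identifier="org.example.Buffer")
        execute_xml = wps_get(
            "Execute",
            identifier="org.example.Buffer",
            datainputs="distance=100;features=http://example.org/data.gml",
        )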

  20. Space Physics Data Facility Web Services

    NASA Technical Reports Server (NTRS)

    Candey, Robert M.; Harris, Bernard T.; Chimiak, Reine A.

    2005-01-01

    The Space Physics Data Facility (SPDF) Web services provide a distributed programming interface to a portion of the SPDF software. (A general description of Web services is available at http://www.w3.org/ and in many current software-engineering texts and articles focused on distributed programming.) The SPDF Web services distributed programming interface enables additional collaboration and integration of the SPDF software system with other software systems, in furtherance of the SPDF mission to lead collaborative efforts in the collection and utilization of space physics data and mathematical models. This programming interface conforms to all applicable Web services specifications of the World Wide Web Consortium. The interface is specified by a Web Services Description Language (WSDL) file. The SPDF Web services software consists of the following components: 1) A server program for implementation of the Web services; and 2) A software developer's kit that consists of a WSDL file, a less formal description of the interface, a Java class library (which further eases development of Java-based client software), and Java source code for an example client program that illustrates the use of the interface.

  1. Enhancing UCSF Chimera through web services.

    PubMed

    Huang, Conrad C; Meng, Elaine C; Morris, John H; Pettersen, Eric F; Ferrin, Thomas E

    2014-07-01

    Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. PMID:24861624

  2. Efficient Web Services Policy Combination

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh; Harman, Joseph G.

    2010-01-01

    Large-scale Web security systems usually involve cooperation between domains with non-identical policies. The network management and Web communication software used by the different organizations presents a stumbling block. Many of the tools used by the various divisions do not have the ability to communicate network management data with each other. At best, this means that manual human intervention into the communication protocols used at various network routers and endpoints is required. Developing practical, sound, and automated ways to compose policies to bridge these differences is a long-standing problem. One of the key subtleties is the need to deal with inconsistencies and defaults where one organization proposes a rule on a particular feature, and another has a different rule or expresses no rule. A general approach is to assign priorities to rules and observe the rules with the highest priorities when there are conflicts. The present methods have inherent inefficiency, which heavily restricts their practical applications. A new, efficient algorithm combines policies utilized for Web services. The method is based on an algorithm that allows an automatic and scalable composition of security policies between multiple organizations. It is based on defeasible policy composition, a promising approach for finding conflicts and resolving priorities between rules. In the general case, policy negotiation is an intractable problem. A promising method, suggested in the literature, is when policies are represented in defeasible logic, and composition is based on rules for non-monotonic inference. In this system, policy writers construct metapolicies describing both the policy that they wish to enforce and annotations describing their composition preferences. These annotations can indicate whether certain policy assertions are required by the policy writer or, if not, under what circumstances the policy writer is willing to compromise and allow other assertions to take
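
    One ingredient of the composition described above, resolving conflicts by rule priority, can be sketched as follows. This is a simplified illustration with invented rule names, not the defeasible-logic algorithm in the report.

        # Simplified sketch of priority-based policy combination: for each feature,
        # keep the decision from the highest-priority rule; unopposed rules pass
        # through. Not the defeasible-logic method described above.

        def combine_policies(rules):
            """rules: list of (feature, decision, priority) tuples."""
            combined = {}
            for feature, decision, priority in rules:
                current = combined.get(feature)
                if current is None or priority > current[1]:
                    combined[feature] = (decision, priority)
            return {f: d for f, (d, _) in combined.items()}

        org_a = [("tls_version", "1.2+", 10), ("audit_logging", "required", 5)]
        org_b = [("tls_version", "1.0+", 3), ("password_rotation", "90d", 4)]
        print(combine_policies(org_a + org_b))
        # {'tls_version': '1.2+', 'audit_logging': 'required', 'password_rotation': '90d'}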

  3. Transimulation - protein biosynthesis web service.

    PubMed

    Siwiak, Marlena; Zielenkiewicz, Piotr

    2013-01-01

    Although translation is the key step during gene expression, it remains poorly characterized at the level of individual genes. For this reason, we developed Transimulation - a web service measuring translational activity of genes in three model organisms: Escherichia coli, Saccharomyces cerevisiae and Homo sapiens. The calculations are based on our previous computational model of translation and experimental data sets. Transimulation quantifies mean translation initiation and elongation time (expressed in SI units), and the number of proteins produced per transcript. It also approximates the number of ribosomes that typically occupy a transcript during translation, and simulates their propagation. The simulation of ribosomes' movement is interactive and allows modifying the coding sequence on the fly. It also enables uploading any coding sequence and simulating its translation in one of three model organisms. In such a case, ribosomes propagate according to mean codon elongation times of the host organism, which may prove useful for heterologous expression. Transimulation was used to examine evolutionary conservation of translational parameters of orthologous genes. Transimulation may be accessed at http://nexus.ibb.waw.pl/Transimulation (requires Java version 1.7 or higher). Its manual and source code, distributed under the GPL-2.0 license, is freely available at the website. PMID:24040122

  4. Between Shots TRANSP Web Service

    NASA Astrophysics Data System (ADS)

    Feibush, E.; Andre, R.; Ludescher-Furth, C.; Kaye, S.; McCune, D.

    2008-11-01

    Running TRANSP between NSTX shots requires rapid data preparation and job submittal. A web service with a graphical user interface and data visualization has been developed to meet these goals. The underlying data preparation system has a command line interface written in Python and runs on a PPPL compute server. The display client is a Java program (ElVis) that sends requests to the data preparation system. As the run data is prepared, graphs are created and sent to the client for display. Flux surface plots are displayed and animated over time. The most commonly used control options are implemented in the UI as buttons and text fields. A time slice or time dependent run can be prepared. The command line interface is available in the client program for expert users to apply advanced settings, to prototype new UI buttons, and to run scripts. The client program contains a simple text editor for modifying the TRANSP namelist. When data preparation is complete the run is submitted to the TRANSP production system. The initial version has been deployed and is being tested in the control room setting. Results will be discussed in the poster presentation. Work performed at PPPL under the auspices of U.S. DOE Contract DE-AC02-76CH03073.

  5. Improving query services of web map by web mining

    NASA Astrophysics Data System (ADS)

    Huang, Maojun

    2007-11-01

    A web map is a hybrid of the map and the World Wide Web (the Web), usually created with WebGIS techniques. With rapid social development, web maps oriented to the public are under pressure to satisfy increasing demand. The geocoding database plays a key role in supporting query services effectively, but traditional geocoding is laborious and time-consuming, while much spatial information available online could serve as a supplementary source for geocoding. Therefore, this paper discusses how to improve query services by web mining. The improvement can be described from three facets: first, improving location queries by discovering and extracting address information from the Web to extend the geocoding database; second, enhancing optimum-path queries for public transit and buffer queries by spatial analysis and reasoning on the extended geocoding database; third, adjusting data-collection strategies according to patterns discovered by web map query mining. Finally, this paper presents the design of the application system and experimental results.

  6. Preservice Mathematics Teachers' Views on Distance Education and Their Web Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Cagirgan Gulten, Dilek

    2013-01-01

    This research aims to investigate primary preservice mathematics teachers' views on distance education and web pedagogical content knowledge in terms of the subscales of general web, communicative web, pedagogical web, web pedagogical content and attitude towards web based instruction. The research was conducted with 46 senior students in the…

  7. Adaptive Service Binding with Lightweight Semantic Web Services

    NASA Astrophysics Data System (ADS)

    Pedrinaci, Carlos; Lambert, Dave; Maleshkova, Maria; Liu, Dong; Domingue, John; Krummenacher, Reto

    Adaptive service selection is acknowledged to provide a number of advantages to optimise the service provisioning process or to cater for advanced service brokering. Semantic Web Services, that is, services that have been enriched with semantic annotations, have often been used for providing adaptive service selection by deferring the binding of services until runtime. Thus far, however, research on Semantic Web Services has mainly been dominated by rich conceptual frameworks such as WSMO and OWL-S, which require a significant effort towards the annotation of services and rely on complex reasoning for which there are no efficient solutions that can scale to the Web yet. In this chapter, in line with current trends on the Semantic Web that sacrifice expressivity in favour of performance, we present a novel approach to providing adaptive service selection that relies on simple conceptual models for services and less expressive formalisms for which there currently exist mature and performant implementations. In particular, we present a set of conceptual models defined in RDF(S) that support both Web services and Web APIs and we show how simple templates abstracting user requirements can be automatically transformed into SPARQL to enable service selection in a scalable manner.
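
    As a rough illustration of the template-to-SPARQL selection the chapter describes, the sketch below queries a simplified RDF service description with rdflib. The vocabulary and data are invented placeholders, not the chapter's conceptual models.

        # Sketch: select services whose declared output matches a requested concept,
        # using a simplified RDF description and a SPARQL query. The vocabulary and
        # data are invented placeholders.
        from rdflib import Graph

        TTL = """
        @prefix ex: <http://example.org/services#> .
        ex:GeocoderA ex:hasOutput ex:Coordinates ; ex:avgResponseMs 120 .
        ex:GeocoderB ex:hasOutput ex:Coordinates ; ex:avgResponseMs 400 .
        ex:Weather   ex:hasOutput ex:Forecast    ; ex:avgResponseMs  80 .
        """

        QUERY = """
        PREFIX ex: <http://example.org/services#>
        SELECT ?svc ?ms WHERE {
          ?svc ex:hasOutput ex:Coordinates ;
               ex:avgResponseMs ?ms .
        } ORDER BY ?ms
        """

        g = Graph()
        g.parse(data=TTL, format="turtle")
        for svc, ms in g.query(QUERY):
            print(svc, ms)   # candidate services, fastest first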

  8. Web server for priority ordered multimedia services

    NASA Astrophysics Data System (ADS)

    Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund

    2001-10-01

    In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions for CM services. The types of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of the distributed network with load balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for improved disk access and higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS output is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority ordered buffering of the retrieved Web pages and CM data streams that are fed into an auto regressive moving average (ARMA) based traffic shaping circuitry before being transmitted through the network.
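
    A minimal sketch of the priority ordering listed above (lower value served first, via a priority queue) is shown below; it illustrates the scheduling idea only and is not the paper's SM/IDS/ITS implementation.

        # Minimal sketch of priority-ordered request handling: requests are queued
        # with the priority levels listed above and served highest-priority first.
        # Illustrative only.
        import heapq
        import itertools

        PRIORITY = {                      # lower value = served earlier
            "admin_rw": 0, "hot_cm_multicast": 1, "cm_read": 2,
            "web_read": 3, "cm_write": 4, "web_write": 5,
        }

        class RequestQueue:
            def __init__(self):
                self._heap, self._counter = [], itertools.count()

            def submit(self, kind, payload):
                heapq.heappush(self._heap, (PRIORITY[kind], next(self._counter), kind, payload))

            def next_request(self):
                _, _, kind, payload = heapq.heappop(self._heap)
                return kind, payload

        q = RequestQueue()
        q.submit("web_read", "/index.html")
        q.submit("cm_read", "movie-42.mp4")
        q.submit("admin_rw", "update-hotlist")
        print(q.next_request())   # ('admin_rw', 'update-hotlist')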

  9. Online Information Services. Caught in the Web?

    ERIC Educational Resources Information Center

    Green, Tim

    1995-01-01

    Provides brief reviews of the sites for several online services of the World Wide Web; the Web as a marketing tool and other aspects of interest to information professionals are highlighted. A sidebar presents information on accessing Internet locations, graphics, online forms, Telnet, saving, printing, mailing, and searching. (AEF)

  10. Domain-specific Web Service Discovery with Service Class Descriptions

    SciTech Connect

    Rocco, D; Caverlee, J; Liu, L; Critchlow, T J

    2005-02-14

    This paper presents DynaBot, a domain-specific web service discovery system. The core idea of the DynaBot service discovery system is to use domain-specific service class descriptions powered by an intelligent Deep Web crawler. In contrast to current registry-based service discovery systems--like the several available UDDI registries--DynaBot promotes focused crawling of the Deep Web of services and discovers candidate services that are relevant to the domain of interest. It uses intelligent filtering algorithms to match services found by focused crawling with the domain-specific service class descriptions. We demonstrate the capability of DynaBot through the BLAST service discovery scenario and describe our initial experience with DynaBot.

  11. CB-EMIS WEB SERVICE SOFTWARE

    Energy Science and Technology Software Center (ESTSC)

    2007-01-01

    This software provides CB-EMIS data to remote devices using a secure internet connection. The CB-EMIS Web Service filters and repackages data in a form suitable for resource-limited devices such as a cell phone. Data transmission is filtered based on a user's authentication level. The web service acts as an intermediary so that no direct connection is possible between the internet and the CB-EMIS server software.

  12. UncertWeb: chaining web services accounting for uncertainty

    NASA Astrophysics Data System (ADS)

    Cornford, Dan; Jones, Richard; Bastin, Lucy; Williams, Matthew; Pebesma, Edzer; Nativi, Stefano

    2010-05-01

    The development of interoperable services that permit access to data and processes, typically using web service based standards opens up the possibility for increasingly complex chains of data and processes, which might be discovered and composed in increasingly automatic ways. This concept, sometimes referred to as the "Model Web", offers the promise of integrated (Earth) system models, with pluggable web service based components which can be discovered, composed and evaluated dynamically. A significant issue with such service chains, indeed in any composite model composed of coupled components, is that in all interesting (non-linear) cases the effect of uncertainties on inputs, or components within the chain will have complex, potentially unexpected effects on the outputs. Within the FP7 UncertWeb project we will be developing a mechanism and an accompanying set of tools to enable rigorous uncertainty management in web based service chains involving both data and processes. The project will exploit and extend the UncertML candidate standard to flexibly propagate uncertainty through service chains, including looking at mechanisms to develop uncertainty enabled profiles of existing Open Geospatial Consortium services. To facilitate the use of such services we will develop tools to address the definition of the input uncertainties (elicitation), manage the uncertainty propagation (emulation), undertake uncertainty and sensitivity analysis and visualise the output uncertainty. In this talk we will outline the challenges of the UncertWeb project, illustrating this with a prototype service chain we have created for correcting station level pressure to sea-level pressure, which accounts for the various uncertainties involved. In particular we will discuss some of the challenges of chaining Open Geospatial Consortium services using the Business Process Execution Language. We will also address the issue of computational cost and communication bandwidth requirements for
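
    A toy sketch of propagating input uncertainty through a two-step service chain by Monte Carlo sampling, the general idea behind the propagation tools described above, is given below. The reduction formula and distributions are rough illustrations, not the UncertWeb station-to-sea-level chain.

        # Toy sketch: propagate input uncertainty through a chain of two processing
        # steps by Monte Carlo sampling and summarize the output distribution.
        # The functions and distributions are illustrative only.
        import numpy as np

        rng = np.random.default_rng(0)
        N = 10000

        # Uncertain inputs: station pressure (hPa) and station elevation (m).
        pressure = rng.normal(loc=990.0, scale=0.5, size=N)
        elevation = rng.normal(loc=55.0, scale=2.0, size=N)

        def service_one(p, h):
            # Step 1: crude reduction of station pressure toward sea level
            # (roughly 8.3 m per hPa near the surface; illustrative only).
            return p + h / 8.3

        def service_two(p0):
            # Step 2: a downstream derived quantity, e.g. a simple anomaly.
            return p0 - 1013.25

        sea_level = service_one(pressure, elevation)
        anomaly = service_two(sea_level)
        print(f"mean={anomaly.mean():.2f} hPa, std={anomaly.std():.2f} hPa")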

  13. Web Service Architecture Framework for Embedded Devices

    ERIC Educational Resources Information Center

    Yanzick, Paul David

    2009-01-01

    The use of Service Oriented Architectures, namely web services, has become a widely adopted method for transfer of data between systems across the Internet as well as the Enterprise. Adopting a similar approach to embedded devices is also starting to emerge as personal devices and sensor networks are becoming more common in the industry. This…

  14. New Interfaces to Web Documents and Services

    NASA Technical Reports Server (NTRS)

    Carlisle, W. H.

    1996-01-01

    This paper reports on investigations into how to extend capabilities of the Virtual Research Center (VRC) for NASA's Advanced Concepts Office. The work was performed as part of NASA's 1996 Summer Faculty Fellowship program, and involved research into and prototype development of software components that provide documents and services for the World Wide Web (WWW). The WWW has become a de-facto standard for sharing resources over the internet, primarily because web browsers are freely available for the most common hardware platforms and their operating systems. As a consequence of the popularity of the internet, tools and techniques associated with web browsers are changing rapidly. New capabilities are offered by companies that support web browsers in order to become or remain a dominant participant in internet services. Because a goal of the VRC is to build an environment for NASA centers, universities, and industrial partners to share information associated with Advanced Concepts Office activities, the VRC tracks new techniques and services associated with the web in order to determine their usefulness for distributed and collaborative engineering research activities. Most recently, Java has emerged as a new tool for providing internet services. Because the major web browser providers have decided to include Java in their software, investigations into Java were conducted this summer.

  15. A Strategic Model of Trust Management in Web Services

    NASA Astrophysics Data System (ADS)

    Sun, Junqing; Sun, Zhaohao; Li, Yuanzhe; Zhao, Shuliang

    This article examines trust and trust management in web services and proposes a multiagent model of trust relationship in web services. It looks at the hierarchical structure of trust management in web services and proposes a strategic model of trust management in web services. The proposed approach in this article will facilitate research and development of trust management in e-commerce, web services and social networking.

  16. Efficiently Selecting the Best Web Services

    NASA Astrophysics Data System (ADS)

    Goncalves, Marlene; Vidal, Maria-Esther; Regalado, Alfredo; Yacoubi Ayadi, Nadia

    Emerging technologies and linked data initiatives have motivated the publication of a large number of datasets, and provide the basis for publishing Web services and tools to manage the available data. This wealth of resources opens a world of possibilities to satisfy user requests. However, Web services may have similar functionality but different performance; therefore, it is necessary to identify, among the Web services that satisfy a user request, the ones with the best quality. In this paper we propose a hybrid approach that combines reasoning tasks with ranking techniques to select the Web services that best implement a user request. Web service functionalities are described in terms of input and output attributes annotated with existing ontologies, non-functional properties are represented as Quality of Service (QoS) parameters, and user requests correspond to conjunctive queries whose sub-goals impose restrictions on the functionality and quality of the services to be selected. The ontology annotations are used in different reasoning tasks to infer implicit service properties and to augment the size of the service search space. Furthermore, QoS parameters are considered by a ranking metric to classify the services according to how well they meet a user's non-functional condition. We assume that all the QoS parameters of the non-functional condition are equally important, and apply the Top-k Skyline approach to select the k services that best meet this condition. Our proposal relies on a two-fold solution which fires a deductive-based engine that performs different reasoning tasks to discover the services that satisfy the requested functionality, and an efficient implementation of the Top-k Skyline approach to compute the top-k services that meet the majority of the QoS constraints. Our Top-k Skyline solution exploits the properties of the Skyline Frequency metric and identifies the top-k services by just analyzing a subset of the services that
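
    A simplified sketch of the skyline step in the selection described above is given below: services not dominated on any (lower-is-better) QoS attribute are kept, and a top-k subset would then be chosen from them. It illustrates the concept only, not the authors' Skyline Frequency implementation.

        # Simplified skyline sketch: a service is kept if no other service is at least
        # as good on every (lower-is-better) QoS attribute and strictly better on one.
        # Concept illustration, not the paper's Top-k Skyline Frequency method.

        def dominates(a, b):
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def skyline(services):
            """services: {name: (latency_ms, cost, 1 - availability)} -- all minimized."""
            return {
                name: qos for name, qos in services.items()
                if not any(dominates(other, qos) for o, other in services.items() if o != name)
            }

        candidates = {
            "svcA": (120, 0.02, 0.001),
            "svcB": (300, 0.01, 0.010),
            "svcC": (150, 0.03, 0.002),   # dominated by svcA
        }
        print(skyline(candidates))        # svcA and svcB survive; svcC is dominated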

  17. Grid Enabled Geospatial Catalogue Web Service

    NASA Technical Reports Server (NTRS)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC)'s Catalogue Service - Web Information Model, this paper proposes a new information model for Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and NASA EOS Core System and the service metadata standards from ISO 19119 to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user, who belongs to an authorized Virtual Organization (VO), can securely publish and manage geospatial resources, especially query on-demand data in the virtual community and get it back through the data-related services which provide functions such as subsetting, reformatting, reprojection etc. This work facilitates the sharing and interoperation of geospatial resources under the Grid environment, and makes geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also allows researchers to focus on science, and not on issues with computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  18. Web-based Service Portal in Healthcare

    NASA Astrophysics Data System (ADS)

    Silhavy, Petr; Silhavy, Radek; Prokopova, Zdenka

    Information delivery is one of the most important tasks in healthcare. The growing sector of electronic healthcare has an important impact on information delivery. There are two basic approaches to delivering information: the first is the web portal and the second is the touch-screen terminal. The aim of this paper is to investigate the web-based service portal. The most important advantage of a web-based portal in the field of healthcare is independent access for patients. This paper deals with the conditions and frameworks for healthcare portals.

  19. Optimizing Web Service Composition While Enforcing Regulations

    NASA Astrophysics Data System (ADS)

    Sohrabi, Shirin; McIlraith, Sheila A.

    To direct automated Web service composition, it is compelling to provide a template, workflow or scaffolding that dictates the ways in which services can be composed. In this paper we present an approach to Web service composition that builds on work using AI planning, and more specifically Hierarchical Task Networks (HTNs), for Web service composition. A significant advantage of our approach is that it provides much of the how-to knowledge of a choreography while enabling customization and optimization of integrated Web service selection and composition based upon the needs of the specific problem, the preferences of the customer, and the available services. Many customers must also be concerned with enforcement of regulations, perhaps in the form of corporate policies and/or government regulations. Regulations are traditionally enforced at design time by verifying that a workflow or composition adheres to regulations. Our approach supports customization, optimization and regulation enforcement all at composition construction time. To maximize efficiency, we have developed novel search heuristics together with a branch and bound search algorithm that enable the generation of high quality compositions with the performance of state-of-the-art planning systems.
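    The abstract combines preference optimization with regulation enforcement at composition time; the sketch below is a generic branch-and-bound over one candidate service per task, with a cost to minimize and a hard "regulation" predicate checked on partial plans. It is a toy stand-in under those assumptions, not the authors' HTN planner, and the tasks, services and policy are invented.

        # Toy branch-and-bound: choose one service per task so that a regulation
        # predicate holds and total preference cost is minimal. Not the HTN planner
        # described in the abstract; tasks, services and the policy are invented.

        def compose(tasks, candidates, cost, allowed):
            best = {"plan": None, "cost": float("inf")}

            def search(i, plan, running_cost):
                if running_cost >= best["cost"]:
                    return                      # bound: cannot beat the incumbent
                if not allowed(plan):
                    return                      # prune: regulation already violated
                if i == len(tasks):
                    best["plan"], best["cost"] = dict(plan), running_cost
                    return
                task = tasks[i]
                for svc in sorted(candidates[task], key=cost):
                    plan[task] = svc
                    search(i + 1, plan, running_cost + cost(svc))
                    del plan[task]

            search(0, {}, 0.0)
            return best["plan"], best["cost"]

        if __name__ == "__main__":
            tasks = ["book_flight", "book_hotel"]
            candidates = {"book_flight": ["airA", "airB"], "book_hotel": ["hotelX", "hotelY"]}
            prices = {"airA": 300, "airB": 250, "hotelX": 120, "hotelY": 90}

            def regulation(plan):
                # Toy corporate policy: airB may not be paired with hotelY.
                return not (plan.get("book_flight") == "airB" and plan.get("book_hotel") == "hotelY")

            print(compose(tasks, candidates, lambda s: prices[s], regulation))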

  20. Designing Crop Simulation Web Service with Service Oriented Architecture Principle

    NASA Astrophysics Data System (ADS)

    Chinnachodteeranun, R.; Hung, N. D.; Honda, K.

    2015-12-01

    Crop simulation models are efficient tools for simulating crop growth processes and yield. Running crop models requires data from various sources as well as time-consuming data processing, such as data quality checking and data formatting, before those data can be input to the model. This limits the use of crop modeling to crop modelers. We aim to make running crop models convenient for various users so that the utilization of crop models will be expanded, which will directly improve agricultural applications. As the first step, we developed a prototype that runs DSSAT on the Web, called Tomorrow's Rice (v. 1). It predicts rice yields based on planting date, rice variety and soil characteristics using the DSSAT crop model. A user only needs to select a planting location on the Web GUI; the system then queries historical weather data from available sources and returns the expected yield. Currently, we are working on weather data connection via the Sensor Observation Service (SOS) interface defined by the Open Geospatial Consortium (OGC). Weather data can be automatically connected to a weather generator for generating weather scenarios for running the crop model. In order to expand these services further, we are designing a web service framework consisting of layers of web services to support compositions and executions for running crop simulations. This framework allows a third-party application to call and cascade each service as it needs for data preparation and running the DSSAT model using a dynamic web service mechanism. The framework has a module to manage data format conversion, which means users do not need to spend their time curating the data inputs. Dynamic linking of data sources and services is implemented using the Service Component Architecture (SCA). This agriculture web service platform demonstrates interoperability of weather data using the SOS interface, convenient connections between weather data sources and the weather generator, and connecting

  1. How Are Teacher-Librarians Finding Resources for Coursework?: Distance Learners and the Role of University Library Services

    ERIC Educational Resources Information Center

    de Jong, Cees-Jan; Branch, Jennifer L.

    2005-01-01

    Providing distance learners access to library services does not automatically translate into usage of those resources. The literature on information-seeking behaviour of distance learners has indicated that they prefer to use local resources, as well as Web-based resources. This study investigates perspectives on library services and available…

  2. The ViennaRNA web services.

    PubMed

    Gruber, Andreas R; Bernhart, Stephan H; Lorenz, Ronny

    2015-01-01

    The ViennaRNA package is a widely used collection of programs for thermodynamic RNA secondary structure prediction. Over the years, many additional tools have been developed building on the core programs of the package to also address issues related to noncoding RNA detection, RNA folding kinetics, or efficient sequence design considering RNA-RNA hybridizations. The ViennaRNA web services provide easy and user-friendly web access to these tools. This chapter describes how to use this online platform to perform tasks such as prediction of minimum free energy structures, prediction of RNA-RNA hybrids, or noncoding RNA detection. The ViennaRNA web services can be used free of charge and can be accessed via http://rna.tbi.univie.ac.at. PMID:25577387

  3. CMR Catalog Service for the Web

    NASA Technical Reports Server (NTRS)

    Newman, Doug; Mitchell, Andrew

    2016-01-01

    With the impending retirement of Global Change Master Directory (GCMD) Application Programming Interfaces (APIs) the Common Metadata Repository (CMR) was charged with providing a collection-level Catalog Service for the Web (CSW) that provided the same level of functionality as GCMD. This talk describes the capabilities of the CMR CSW API with particular reference to the support of the Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) Integrated Catalog (CWIC).

  4. Towards Web Service-Based Educational Systems

    ERIC Educational Resources Information Center

    Sampson, Demetrios G.

    2005-01-01

    The need for designing the next generation of web service-based educational systems with the ability of integrating components from different tools and platforms is now recognised as the major challenge in advanced learning technologies. In this paper, we discuss this issue and we present the conceptual design of such environment, referred to as…

  5. Introducing the PRIDE Archive RESTful web services.

    PubMed

    Reisinger, Florian; del-Toro, Noemi; Ternent, Tobias; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2015-07-01

    The PRIDE (PRoteomics IDEntifications) database is one of the world-leading public repositories of mass spectrometry (MS)-based proteomics data and it is a founding member of the ProteomeXchange Consortium of proteomics resources. In the original PRIDE database system, users could access data programmatically by accessing the web services provided by the PRIDE BioMart interface. New REST (REpresentational State Transfer) web services have been developed to serve the most popular functionality provided by BioMart (now discontinued due to data scalability issues) and address the data access requirements of the newly developed PRIDE Archive. Using the API (Application Programming Interface) it is now possible to programmatically query for and retrieve peptide and protein identifications, project and assay metadata and the originally submitted files. Searching and filtering is also possible by metadata information, such as sample details (e.g. species and tissues), instrumentation (mass spectrometer), keywords and other provided annotations. The PRIDE Archive web services were first made available in April 2014. The API has already been adopted by a few applications and standalone tools such as PeptideShaker, PRIDE Inspector, the Unipept web application and the Python-based BioServices package. This application is free and open to all users with no login requirement and can be accessed at http://www.ebi.ac.uk/pride/ws/archive/. PMID:25904633
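    For readers who want a feel for the programmatic access described above, here is a small Python sketch using requests against the stated base URL. The endpoint path ("project/list") and the query parameter names are assumptions made for illustration and should be checked against the current PRIDE Archive API documentation.

        # Hypothetical sketch of querying the PRIDE Archive RESTful API.
        # Endpoint path and parameter names are assumptions for illustration only.
        import requests

        BASE = "http://www.ebi.ac.uk/pride/ws/archive"  # base URL given in the abstract

        def search_projects(keyword, page_size=10):
            """Search public projects by keyword and return the parsed JSON response."""
            params = {"query": keyword, "show": page_size}   # assumed parameter names
            resp = requests.get(f"{BASE}/project/list", params=params, timeout=30)
            resp.raise_for_status()
            return resp.json()

        if __name__ == "__main__":
            for project in search_projects("human proteome").get("list", []):
                print(project.get("accession"), project.get("title"))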

  6. Predicting Student Performance in Web-Based Distance Education Courses Based on Survey Instruments Measuring Personality Traits and Technical Skills

    ERIC Educational Resources Information Center

    Hall, Michael

    2008-01-01

    Two common web-based surveys, "Is Online Learning Right for Me?" and "What Technical Skills Do I Need?", were combined into a single survey instrument and given to 228 on-campus and 83 distance education students. The students were enrolled in four different classes (business, computer information services, criminal justice, and…

  7. Enhancing Data Interoperability with Web Services

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Zimble, D. A.; Wang, W.; Herring, D.; Halpert, M.

    2014-12-01

    In an effort to improve data access and interoperability of climate and weather data, the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov and Climate Prediction Center (CPC) are exploring various platform solutions to enhance a user's ability to locate, preview, and acquire the data. The Climate.gov and CPC data team faces multiple challenges including the various kinds of data and formats, inconsistency of metadata records, variety of data service implementations, very large volumes of data and geographically distributed locations. We have created the Data Access and Interoperability project to design a web-based platform, where interoperability between systems can be leveraged to allow greater data discovery, access, visualization and delivery. In the interoperable data platform, systems can integrate with each other to support the synthesis of climate and weather data. Interoperability is the ability for users to discover the available climate and weather data, preview and interact with the data, and acquire the data in common digital formats through a simple web-based interface. The goal of the interoperable data platform is to leverage existing web services, implement the established standards and integrate with existing solutions across the earth sciences domain instead of creating new technologies. Towards this effort to improve the interoperability of the platform, we are collaborating with ESRI Inc. to provide climate and weather data via web services. In this presentation, we will discuss and demonstrate how to use ArcGIS to author RESTful based scientific web services using open standards. These web services are able to encapsulate the logic required to handle and describe scientific data through a variety of service types including, image, map, feature, geoprocessing, and their respective service methods. Combining these types of services and leveraging well-documented APIs, including the ArcGIS JavaScript API, we can afford to
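    As a rough illustration of the RESTful pattern described above, the sketch below requests a rendered subset of a raster from an ArcGIS image service using the commonly documented exportImage operation. The host and service names are placeholders, not NOAA's actual services.

        # Sketch: render a bounding box from an (assumed) ArcGIS REST image service.
        # Host and service path are placeholders; parameters follow the documented
        # exportImage operation of the ArcGIS REST API.
        import requests

        HOST = "https://example.noaa.gov/arcgis/rest/services"      # placeholder
        SERVICE = "CPC/SeasonalOutlook/ImageServer"                 # placeholder

        def export_image(bbox, size=(512, 512), out_format="png"):
            """bbox is (xmin, ymin, xmax, ymax) in the service's spatial reference."""
            params = {
                "bbox": ",".join(str(v) for v in bbox),
                "size": f"{size[0]},{size[1]}",
                "format": out_format,
                "f": "image",       # ask for the rendered image bytes directly
            }
            resp = requests.get(f"{HOST}/{SERVICE}/exportImage", params=params, timeout=60)
            resp.raise_for_status()
            return resp.content

        if __name__ == "__main__":
            with open("conus_preview.png", "wb") as fh:
                fh.write(export_image((-125.0, 24.0, -66.0, 50.0)))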

  8. Focused Crawling of the Deep Web Using Service Class Descriptions

    SciTech Connect

    Rocco, D; Liu, L; Critchlow, T

    2004-06-21

    Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.
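    DynaBot's actual service class descriptions are not reproduced in the abstract; the sketch below only illustrates the general idea of scoring a discovered web form against a declarative SCD-like description, with made-up field vocabularies and a trivial scoring rule.

        # Illustrative-only SCD-style matching: score how well the input fields of a
        # discovered web form match a service class description. The vocabulary and
        # scoring rule are invented; DynaBot's real SCDs and matcher are richer.

        EVENT_LISTING_SCD = {
            "required": {"city", "date"},
            "optional": {"category", "venue", "keyword"},
        }

        def match_score(form_fields, scd):
            """Return a score in [0, 1]; 0 if any required concept is missing."""
            fields = {f.lower() for f in form_fields}
            if not scd["required"] <= fields:
                return 0.0
            hits = len(fields & (scd["required"] | scd["optional"]))
            return hits / len(fields)

        if __name__ == "__main__":
            discovered_form = ["City", "Date", "Category", "promo_code"]
            print(match_score(discovered_form, EVENT_LISTING_SCD))  # 0.75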

  9. A Web Services Data Analysis Grid

    SciTech Connect

    William A Watson III; Ian Bird; Jie Chen; Bryan Hess; Andy Kowalski; Ying Chen

    2002-07-01

    The trend in large-scale scientific data analysis is to exploit compute, storage and other resources located at multiple sites, and to make those resources accessible to the scientist as if they were a single, coherent system. Web technologies driven by the huge and rapidly growing electronic commerce industry provide valuable components to speed the deployment of such sophisticated systems. Jefferson Lab, where several hundred terabytes of experimental data are acquired each year, is in the process of developing a web-based distributed system for data analysis and management. The essential aspects of this system are a distributed data grid (site independent access to experiment, simulation and model data) and a distributed batch system, augmented with various supervisory and management capabilities, and integrated using Java and XML-based web services.

  10. User Needs of Digital Service Web Portals: A Case Study

    ERIC Educational Resources Information Center

    Heo, Misook; Song, Jung-Sook; Seol, Moon-Won

    2013-01-01

    The authors examined the needs of digital information service web portal users. More specifically, the needs of Korean cultural portal users were examined as a case study. The conceptual framework of a web-based portal is that it is a complex, web-based service application with characteristics of information systems and service agents. In…

  11. Contract Observation in Web Services Environments

    NASA Astrophysics Data System (ADS)

    Bíba, Jiří; Hodík, Jiří; Jakob, Michal; Pěchouček, Michal

    Electronic contracting, based on explicit representation of different parties' commitments, is a promising way of specifying and regulating behaviour in distributed business applications. A key part of a contract-based system is the process through which the actual behaviour of individual parties is checked for conformance with the contracts set to govern such behaviour. Such checking requires that relevant information on the behaviour of the parties, both with respect to the application processes they execute and to managing their contractual relationships, is captured. The process of collecting all such information, termed contract observation, is the subject of this paper. First, we describe general properties and requirements of such an observation process; afterwards, we discuss specifics of realising contract observation in web services environments. Finally, we show how contract observation has been implemented as part of the IST-CONTRACT web services framework for contract-based systems.

  12. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the
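    Among the diagnostics listed above, the time-lagged correlation map is easy to illustrate; the NumPy sketch below computes a lagged Pearson correlation between two toy monthly series. It is an illustration of the general technique, not CMDA code, and the synthetic data are invented.

        # Illustrative time-lagged Pearson correlation between two monthly series.
        # Not CMDA code; the synthetic "observations" below are invented.
        import numpy as np

        def lagged_correlation(x, y, max_lag):
            """Correlate x(t) with y(t + lag) for lag in [-max_lag, max_lag]."""
            x = np.asarray(x, dtype=float)
            y = np.asarray(y, dtype=float)
            out = {}
            for lag in range(-max_lag, max_lag + 1):
                if lag < 0:
                    xs, ys = x[-lag:], y[:lag]
                elif lag > 0:
                    xs, ys = x[:-lag], y[lag:]
                else:
                    xs, ys = x, y
                out[lag] = float(np.corrcoef(xs, ys)[0, 1])
            return out

        if __name__ == "__main__":
            t = np.arange(240)                         # 20 years of monthly steps
            sst = np.sin(2 * np.pi * t / 12)           # toy "sea surface temperature"
            cloud = np.roll(sst, 3) + 0.1 * np.random.default_rng(0).normal(size=t.size)
            corr = lagged_correlation(sst, cloud, max_lag=6)
            print(max(corr, key=corr.get))             # lag with the strongest correlation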

  13. The Impact of Web Based Resource Material on Learning Outcome in Open Distance Higher Education

    ERIC Educational Resources Information Center

    Masrur, Rehana

    2010-01-01

    One of the most powerful educational options in open and distance education is web-based learning. A blended (hybrid) course combines traditional face-to-face and web-based learning approaches in an educational environment that is nonspecific as to time and place. The study reported here investigated the impact of web based resource material…

  14. The Reality of Web-Based Interaction in an Egyptian Distance Education Course

    ERIC Educational Resources Information Center

    Sadik, Alaa

    2006-01-01

    This paper reports the results of a study conducted to evaluate the reality of interaction in a web-based distance education course. The learners were Egyptian first-grade secondary school students (15-16 years old) and the learning subject was mathematics. To investigate students' interactions via the Web, a Web-based learning environment was…

  15. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same

  16. Distance Learning: Information Access and Services for Virtual Users.

    ERIC Educational Resources Information Center

    Iyer, Hemalata, Ed.

    This volume centers broadly on information support services for distance education. The articles in this book can be categorized into two areas: access to information resources for distance learners, and studies of distance learning programs. Contents include: "The Challenges and Benefits of Asynchronous Learning Networks" (Daphne Jorgensen);…

  17. Security Policy Configuration Analysis for Web Services on Heterogeneous Platforms

    NASA Astrophysics Data System (ADS)

    Hongbin, Ji; Fengyu, Zhao; Tao, Xu

    With the rapid development of web services, the message security of web services between heterogeneous platforms has become increasingly prominent. Two popular web services platforms, Apache Axis2 and Microsoft .NET, each have their own security module (Rampart and WSE, respectively). Due to differences in platform security mechanisms, it is difficult to build secure web service communications between different platforms. This paper first introduces the Apache Axis2 and .NET platforms, and then analyzes the differences between their security mechanisms. Finally, based on a typical security case, a series of steps is designed and tested in order to realize secure web service invocation on heterogeneous platforms.

  18. Using EMBL-EBI services via Web interface and programmatically via Web Services

    PubMed Central

    Lopez, Rodrigo; Cowley, Andrew; Li, Weizhong; McWilliam, Hamish

    2015-01-01

    The European Bioinformatics Institute (EMBL-EBI) provides access to a wide range of databases and analysis tools that are of key importance in bioinformatics. As well as providing Web interfaces to these resources, Web Services are available using SOAP and REST protocols that enable programmatic access to our resources and allow their integration into other applications and analytical workflows. This unit describes the various options available to a typical researcher or bioinformatician who wishes to use our resources via Web interface or programmatically via a range of programming languages. PMID:25501941

  19. Pragmatic Computing - A Semiotic Perspective to Web Services

    NASA Astrophysics Data System (ADS)

    Liu, Kecheng

    The web seems to have evolved from a syntactic web, through a semantic web, to a pragmatic web. This evolution conforms to the study of information and technology from the theory of semiotics. Pragmatics, concerned with the use of information in relation to context and intended purposes, is extremely important in web services and applications. Much research in pragmatics has been carried out; but at the same time, attempts and solutions have led to further questions. After reviewing current work on the pragmatic web, the paper presents a semiotic approach to web services, particularly request decomposition and service aggregation.

  20. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use
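    The wrapping pattern described here (an existing science routine exposed through a Python web framework such as Flask) can be sketched as follows; the "science" function is a toy stand-in and the route name is invented, so this is only a minimal sketch of the pattern, not CMDA's implementation.

        # Minimal sketch of wrapping a science function as a web service with Flask,
        # the pattern the abstract describes. The analysis function is a toy stand-in.
        from flask import Flask, jsonify, request
        import numpy as np

        app = Flask(__name__)

        def seasonal_mean(values, season_length=3):
            """Toy analysis: mean of the last `season_length` values."""
            return float(np.asarray(values, dtype=float)[-season_length:].mean())

        @app.route("/seasonal_mean", methods=["POST"])
        def seasonal_mean_service():
            payload = request.get_json(force=True)
            result = seasonal_mean(payload["values"], payload.get("season_length", 3))
            return jsonify({"seasonal_mean": result})

        if __name__ == "__main__":
            # In production this would sit behind Gunicorn, as the abstract notes.
            app.run(port=8080)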

  1. Web Map Services (WMS) Global Mosaic

    NASA Technical Reports Server (NTRS)

    Percivall, George; Plesea, Lucian

    2003-01-01

    The WMS Global Mosaic provides access to imagery of the global landmass using an open standard for web mapping. The seamless image is a mosaic of Landsat 7 scenes; geographically-accurate with 30 and 15 meter resolutions. By using the OpenGIS Web Map Service (WMS) interface, any organization can use the global mosaic as a layer in their geospatial applications. Based on a trade study, an implementation approach was chosen that extends a previously developed WMS hosting a Landsat 5 CONUS mosaic developed by JPL. The WMS Global Mosaic supports the NASA Geospatial Interoperability Office goal of providing an integrated digital representation of the Earth, widely accessible for humanity's critical decisions.

  2. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps offer geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use the available online geocoding services. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and these services search for an address only by matching it against descriptive data. In addition, there are also some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, to search for places based on their location, non-point representation of results, and display of search results ranked by priority.
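    Here is a minimal Python sketch of the two ingredients the abstract names, a fuzzy "nearness" membership function and a fuzzy overlay of several distance-based memberships; the linear membership shape, the 2 km cutoff and the min operator are assumptions chosen for illustration.

        # Illustrative fuzzy-geocoding ingredients: a "nearness" membership function
        # and a fuzzy overlay (min operator) of several distance-based memberships.
        # The linear shape and the 2 km cutoff are assumptions for this example.

        def nearness(distance_m, cutoff_m=2000.0):
            """Membership in the fuzzy set 'near': 1 at distance 0, 0 beyond the cutoff."""
            return max(0.0, 1.0 - distance_m / cutoff_m)

        def fuzzy_overlay(memberships):
            """Fuzzy intersection (min operator) of several membership values."""
            return min(memberships)

        if __name__ == "__main__":
            # Distances from one candidate location to the landmarks in a query.
            distances = {"train station": 400.0, "city park": 1200.0, "river": 800.0}
            memberships = [nearness(d) for d in distances.values()]
            print(round(fuzzy_overlay(memberships), 2))   # overall "near all of them" score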

  3. Web Services as Public Services: Are We Supporting Our Busiest Service Point?

    ERIC Educational Resources Information Center

    Riley-Huff, Debra A.

    2009-01-01

    This article is an analysis of academic library organizational culture, patterns, and processes as they relate to Web services. Data gathered in a research survey is examined in an attempt to reveal current departmental and administrative attitudes, practices, and support for Web services in the library research environment. (Contains 10 tables.)

  4. Synthetic seismogram web service and Python tools

    NASA Astrophysics Data System (ADS)

    Heimann, Sebastian; Cesca, Simone; Kriegerowski, Marius; Dahm, Torsten

    2014-05-01

    Many geophysical methods require knowledge of Green's functions (GF) or synthetic seismograms over ranges of source and receiver coordinates. Examples include synthetic seismogram generation, moment tensor inversion, the modeling of depth phases for regional and teleseismic earthquakes, or the modeling of pressure-diffusion-induced static displacement and strain. Calculation of Green's functions is a computationally expensive operation and it can be advantageous to calculate them in advance: the same Green's function traces can then be reused several or many times as required in a typical application. Treating Green's function computation as an independent step in a use case's processing chain encourages storing them in an application-independent form. They can then be shared between different applications and they can also be passed to other researchers, e.g. via a web service. Starting now, we provide such a web service to the seismological community (http://kinherd.org/), where a researcher can share Green's function stores and retrieve synthetic seismograms for various point and extended earthquake source models for many different earth models at local, regional and global scale. This web service is part of a rich new toolset for the creation and handling of Green's functions and synthetic seismograms (http://emolch.github.com/pyrocko/gf). It can be used off-line or in client mode. Its core features are: greatly simplified generation of Green's function stores; support for various codes for Green's function computation; an extensible Green's function storage format; flexible spatial indexing of Green's functions; integrated travel time computation; and support for other types of Green's functions, e.g. poro-elastic GFs. It is written in Python.

  5. Web service module for access to g-Lite

    NASA Astrophysics Data System (ADS)

    Goranova, R.; Goranov, G.

    2012-10-01

    G-Lite is a lightweight grid middleware for grid computing installed on all clusters of the European Grid Infrastructure (EGI). The middleware is partially service-oriented and does not provide well-defined Web services for job management. The existing Web services in the environment cannot be directly used by grid users for building service compositions in the EGI. In this article we present a module of well-defined Web services for job management in the EGI. We describe the architecture of the module and the design of the developed Web services. The presented Web services are composable and can participate in service compositions (workflows). An example of usage of the module with tools for service compositions in g-Lite is shown.

  6. P2P Approach for Web Services Publishing and Discovery

    NASA Astrophysics Data System (ADS)

    Islam, Mohmammad Towhidul; Akon, Mursalin; Shen, Xuemin (Sherman)

    Web services are an emerging paradigm for distributing business applications from different platforms to a wide variety of clients. The critical factor in seamlessly accessing web services is discovering the appropriate service and the related service providers. Unfortunately, current web service technologies use a centralized directory to keep the service index, which is not scalable and is at the same time vulnerable to a single point of failure. Peer-to-peer systems are a popular decentralized architecture which can be used for key lookup services with scalability and self-organization. Thus there is an opportunity to combine the P2P framework with web services to provide a scalable solution. In this chapter, we discuss the key methods to deploy web services using peer-to-peer technology.
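    To make the "key lookup" idea concrete, the sketch below shows a toy consistent-hashing ring that maps a service keyword to the peer responsible for its registration; real P2P discovery overlays add routing, replication and churn handling that are omitted here, and the peer addresses are invented.

        # Toy consistent-hashing lookup: map a service keyword to the peer that would
        # store its registration. Peer addresses are invented; routing, replication
        # and churn handling of real DHT-style overlays are omitted.
        import bisect
        import hashlib

        def _h(value: str) -> int:
            return int(hashlib.sha1(value.encode()).hexdigest(), 16)

        class HashRing:
            def __init__(self, peers):
                self._ring = sorted((_h(p), p) for p in peers)
                self._keys = [k for k, _ in self._ring]

            def responsible_peer(self, service_keyword: str) -> str:
                """Return the first peer clockwise from the keyword's hash."""
                idx = bisect.bisect_right(self._keys, _h(service_keyword)) % len(self._ring)
                return self._ring[idx][1]

        if __name__ == "__main__":
            ring = HashRing(["peer-a:9001", "peer-b:9001", "peer-c:9001"])
            print(ring.responsible_peer("currency-conversion"))
            print(ring.responsible_peer("weather-forecast"))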

  7. Solving the Problem of Promoting Distance Library Services

    ERIC Educational Resources Information Center

    Wyss, Paul Alan

    2007-01-01

    Promoting services is a conundrum for any organization. This is especially true for an academic library promoting distance library services. Systems thinking, process mapping, team learning, and diffusion of information practices offer ways of thinking about promoting services that help those involved find novel ways to approach promoting distance…

  8. A Jini-based dynamic service WebGIS model

    NASA Astrophysics Data System (ADS)

    Xuan, Wenling; Chen, Xiuwan; Huang, Zhaoqiang; Zhao, Gang

    2007-06-01

    Current GIS technology has evolved from single-platform GIS systems into WebGIS. However, the provision and application of Geographic Information Services (GIServices) cannot meet the requirements of a pervasive computing environment. Jini/Java technology, a dynamic distributed architecture for providing spontaneous networks of services, might be a solution to improve the GIService performance of current WebGIS. This paper studies and analyses the Jini infrastructure and its dynamic service mechanism, and designs a new WebGIS architecture with a Jini-based dynamic service model. The experiment shows that Jini technology can be integrated into WebGIS to realize dynamic service organization and management.

  9. Creating Web Services from Community Sourced Data

    NASA Astrophysics Data System (ADS)

    Siegel, D.; Scopel, C.; Boghici, E.

    2013-12-01

    In order to extend the World Hydro Basemap and build watershed delineation and river tracing services that cover the entire planet, we are integrating community-contributed data into a global hydrographic dataset. This dataset is the engine behind a foundational set of tools and services intended to enable hydrologic analysis on the web. However, each organization that collects hydrography uses a workflow and data model unique to their mission, which makes synthesizing their data difficult. Furthermore, these data are collected at different resolutions, so running analytics across regions with multiple contributors is not necessarily valid. Thus, instead of merging contributed data into a seamless geodatabase, the goal of our Community Maps for Hydrology program is to create workflows for converting any arbitrary dataset into the Arc Hydro Data Model. This way, tools and services can be pointed towards different contributions interchangeably while still maintaining the autonomy of each dataset. Contributors retain ownership of their data and are responsible for updates and edits, but the tools and services work identically across all contributions. HydroSHEDs data, contributed by the World Wildlife Fund, is used at the smallest scales to ensure global coverage, and national datasets extend our services to the medium-scales where available. A workflow to incorporate LIDAR and other large scale data is being developed as well, so that local governments and engineering companies can contribute to the program. Watershed Delineation Tool The World Hydro Basemap

  10. OGC Web Services standards by example : the European Seismic Portal

    NASA Astrophysics Data System (ADS)

    Frobert, L.; Kamb, L.; Trani, L.; Spinuso, A.; Bossu, R.; Van Eck, T.

    2011-12-01

    NERIES (2006-2010) was an Integrated Infrastructure Initiative (I3) project in the Sixth Framework Program (FP6) of the European Commission (EC), aiming at networking the European seismic networks, improving access to data, allowing access to specific seismic infrastructures and pursuing targeted research developing the next generation of tools for improved service and data analysis. During this project, a web portal was developed using web services to access data and visual Web applications to display them. However, these web services did not conform to any standard, making them difficult to consume by any new user interface. Therefore, for the NERA project, the follow-up of NERIES, we have proposed the use of web service standards to access our data. We have decided to use standards defined by the Open Geospatial Consortium (OGC). The OGC defines standards for the Web service interfaces to access geo-tagged data. The events and seismic stations are also geo-tagged, making these web services suitable for our purpose. Using standard web services gives us the opportunity to distribute our data to all consumers conformant to these standards, through various programming languages and applications. We have implemented a preliminary version of web services conforming to the Web Map Service (WMS) and Web Feature Service (WFS) standards to access our catalog of seismic events (nearly 200 000 events). To visualize them we have made four demo examples on our web site using different technologies (Adobe Flash, JavaScript, Java with NASA World Wind, and uDig, a desktop GIS application). In the future we hope to implement other OGC Web service standards such as: - Sensor Observation Service (SOS) to provide seismic waveform records; - Web Notification Service (WNS); - Catalog Service for the Web (CSW) to provide a search engine of all our web services; - Web Processing Service (WPS) to process data between different services. The power of the use of OGC standards is the easy
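    The WMS requests behind such a portal follow the OGC key-value-pair encoding; the sketch below builds a standard WMS 1.3.0 GetMap URL for an assumed seismic-events layer on a placeholder host, so only the parameter names, not the endpoint, reflect an actual deployment.

        # Build a standard WMS 1.3.0 GetMap request. The host and layer name are
        # placeholders; the parameter names follow the OGC WMS KVP encoding.
        from urllib.parse import urlencode

        def wms_getmap_url(base_url, layer, bbox, width=800, height=400):
            """bbox is (min_lat, min_lon, max_lat, max_lon) for EPSG:4326 in WMS 1.3.0."""
            params = {
                "SERVICE": "WMS",
                "VERSION": "1.3.0",
                "REQUEST": "GetMap",
                "LAYERS": layer,
                "CRS": "EPSG:4326",
                "BBOX": ",".join(str(v) for v in bbox),
                "WIDTH": width,
                "HEIGHT": height,
                "FORMAT": "image/png",
            }
            return f"{base_url}?{urlencode(params)}"

        if __name__ == "__main__":
            print(wms_getmap_url("https://example.emsc.eu/wms",    # placeholder host
                                 layer="seismic_events",           # placeholder layer
                                 bbox=(30.0, -15.0, 60.0, 45.0)))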

  11. Persistence and Availability of Web Services in Computational Biology

    PubMed Central

    Schultheiss, Sebastian J.; Münch, Marc-Christian; Andreeva, Gergana D.; Rätsch, Gunnar

    2011-01-01

    We have conducted a study on the long-term availability of bioinformatics Web services: an observation of 927 Web services published in the annual Nucleic Acids Research Web Server Issues between 2003 and 2009. We found that 72% of Web sites are still available at the published addresses, only 9% of services are completely unavailable. Older addresses often redirect to new pages. We checked the functionality of all available services: for 33%, we could not test functionality because there was no example data or a related problem; 13% were truly no longer working as expected; we could positively confirm functionality only for 45% of all services. Additionally, we conducted a survey among 872 Web Server Issue corresponding authors; 274 replied. 78% of all respondents indicate their services have been developed solely by students and researchers without a permanent position. Consequently, these services are in danger of falling into disrepair after the original developers move to another institution, and indeed, for 24% of services, there is no plan for maintenance, according to the respondents. We introduce a Web service quality scoring system that correlates with the number of citations: services with a high score are cited 1.8 times more often than low-scoring services. We have identified key characteristics that are predictive of a service's survival, providing reviewers, editors, and Web service developers with the means to assess or improve Web services. A Web service conforming to these criteria receives more citations and provides more reliable service for its users. The most effective way of ensuring continued access to a service is a persistent Web address, offered either by the publishing journal, or created on the authors' own initiative, for example at http://bioweb.me. The community would benefit the most from a policy requiring any source code needed to reproduce results to be deposited in a public repository. PMID:21966383
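    A bulk availability probe of the kind behind these statistics can be sketched in a few lines; treating any HTTP response below 400 as "available" is a simplifying assumption, and the sample URLs are simply the two addresses mentioned in this section.

        # Sketch of a bulk availability probe. Counting any HTTP status below 400 as
        # "available" is a simplifying assumption, not the study's exact criterion.
        import requests

        def check_availability(urls, timeout=10):
            status = {}
            for url in urls:
                try:
                    resp = requests.head(url, timeout=timeout, allow_redirects=True)
                    status[url] = resp.status_code < 400
                except requests.RequestException:
                    status[url] = False
            return status

        if __name__ == "__main__":
            sample = ["http://rna.tbi.univie.ac.at", "http://bioweb.me"]
            for url, ok in check_availability(sample).items():
                print(("AVAILABLE   " if ok else "UNREACHABLE ") + url)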

  12. Creating OGC Web Processing Service workflows using a web-based editor

    NASA Astrophysics Data System (ADS)

    de Jesus, J.; Walker, P.; Grant, M.

    2012-04-01

    The OGC WPS (Web Processing Service) specifies how geospatial algorithms may be accessed in an SOA (Service Oriented Architecture). Service providers can encode both simple and sophisticated algorithms as WPS processes and publish them as web services. These services are not only useful individually but may be built into complex processing chains (workflows) that can solve complex data analysis and/or scientific problems. The NETMAR project has extended the Web Processing Service (WPS) framework to provide transparent integration between it and the commonly used WSDL (Web Service Description Language) that describes the web services and its default SOAP (Simple Object Access Protocol) binding. The extensions allow WPS services to be orchestrated using commonly used tools (in this case Taverna Workbench, but BPEL based systems would also be an option). We have also developed a WebGUI service editor, based on HTML5 and the WireIt! Javascript API, that allows users to create these workflows using only a web browser. The editor is coded entirely in Javascript and performs all XSLT transformations needed to produce a Taverna compatible (T2FLOW) workflow description which can be exported and run on a local Taverna Workbench or uploaded to a web-based orchestration server and run there. Here we present the NETMAR WebGUI service chain editor and discuss the problems associated with the development of a WebGUI for scientific workflow editing; content transformation into the Taverna orchestration language (T2FLOW/SCUFL); final orchestration in the Taverna engine and how to deal with the large volumes of data being transferred between different WPS services (possibly running on different servers) during workflow orchestration. We will also demonstrate using the WebGUI for creating a simple workflow making use of published web processing services, showing how simple services may be chained together to produce outputs that would previously have required a GIS (Geographic

  13. A web service for service composition to aid geospatial modelers

    NASA Astrophysics Data System (ADS)

    Bigagli, L.; Santoro, M.; Roncella, R.; Mazzetti, P.

    2012-04-01

    The identification of appropriate mechanisms for process reuse, chaining and composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. In the Earth and Space Sciences, such a facility could primarily enable integrated and interoperable modeling, for which several approaches have been proposed and developed over the last years. In fact, GEOSS is specifically tasked with the development of the so-called "Model Web". At increasing levels of abstraction and generalization, the initial stove-pipe software tools have evolved to community-wide modeling frameworks, to Component-Based Architecture solutions, and, more recently, started to embrace Service-Oriented Architecture technologies, such as the OGC WPS specification and the WS-* stack of W3C standards for service composition. However, so far, the level of abstraction seems too low for implementing the Model Web vision, and far too complex technological aspects must still be addressed by both providers and users, resulting in limited usability and, eventually, difficult uptake. Following the recent ICT trend of resource virtualization, it has been suggested that users in need of a particular processing capability, required by a given modeling workflow, may benefit from outsourcing the composition activities into an external first-class service, according to the Composition as a Service (CaaS) approach. A CaaS system provides the necessary interoperability service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of executable workflows. This work introduces the architecture of a CaaS system, as a distributed information system for creating, validating, editing, storing, publishing, and executing geospatial workflows. This way, the users can be freed from the need of a composition infrastructure and

  14. Library Services to Distance Learners in the Commonwealth: A Reader.

    ERIC Educational Resources Information Center

    Watson, Elizabeth F., Ed.; Jagannathan, Neela, Ed.

    The provision of good library services is a crucial factor in determining the quality of distance education. This collection of articles acquaints readers with distance librarianship as it is practiced in developed and developing countries throughout the British Commonwealth. The reader includes: "Introduction" (Michael Wooliscroft); "Distance…

  15. An Integrated Model in E-Government Based on Semantic Web, Web Service and Intelligent Agent

    NASA Astrophysics Data System (ADS)

    Zhu, Hongtao; Su, Fangli

    One urgent problem in E-government services is how to improve service efficiency by breaking down information islands while constructing integrated service systems. Web Service provides a set of standards for the provision of functionality over the Web, but Web Service descriptions are purely syntactic rather than semantic. The Semantic Web provides interoperability from the syntactic level to the semantic one, not only for human users but also for software agents. The Semantic Web and Intelligent Agents are highly complementary, and existing technologies have made their unification quite feasible, which presents a good opportunity for the development of E-government. Based on Semantic Web and Intelligent Agent technologies, an integrated service model of E-government is suggested in this paper.

  16. Synchronous Distance Education: Using Web-Conferencing in an MBA Accounting Course

    ERIC Educational Resources Information Center

    Ellingson, Dee Ann; Notbohm, Matthew

    2012-01-01

    Online distance education can take many forms, from a correspondence course with materials online to fully synchronous, live instruction. This paper describes a fully synchronous, live format using web-conferencing. Some useful features of web-conferencing and the way they are employed in this course are described. Instructor observations and…

  17. The Effects of Personality Type on Web-Based Distance Learning

    ERIC Educational Resources Information Center

    Bishop-Clark, Cathy; Dietz-Uhler, Beth; Fisher, Amy

    2007-01-01

    Web-based distance learning is a relatively new approach in higher education which is gaining in popularity. Because a Web-based classroom is so different than a traditional face-to-face classroom, the variables that influence success or satisfaction with such a course may be different than those in a face-to-face course. We investigated whether…

  18. Web-Based Distance Instruction: Design and Implications of a Cybercourse Model.

    ERIC Educational Resources Information Center

    Chen, Li-Ling

    This paper describes a cybercourse model that was designed and created by infusing the following four beneficial telecomputing activities into a World Wide Web-based learning system: collaborative learning; demonstration; interactive discussion; and problem solving. Differences between the regular Web-based distance learning system and the…

  19. Breaking out of the Asynchronous Box: Using Web Conferencing in Distance Learning

    ERIC Educational Resources Information Center

    Lietzau, Julie Arnold; Mann, Barbara J.

    2009-01-01

    A discussion of a university library's use of Web conferencing (real-time synchronous instruction) which addresses the questions (1) Is Web conferencing a viable option for distance students in online only classrooms? (2) Do faculty and students benefit from this type of instruction? Four different scenarios are presented with assessment results…

  20. The Web, the Millennium, and the Digital Evolution of Distance Education.

    ERIC Educational Resources Information Center

    Leonard, David C.

    1999-01-01

    Discusses Industrial and Digital Age educational models, needs, and expectations of adult and traditional learners for Internet-based education; knowledge management and its impact on technical communication; the Universal Campus Network and the nature of Web-based education in the near future; elements for success for Web-based distance education…

  1. Effectiveness of Learning Process Using "Web Technology" in the Distance Learning System

    ERIC Educational Resources Information Center

    Killedar, Manoj

    2008-01-01

    The Web is a globally distributed, yet highly personalized medium for the cost-effective delivery of multimedia information and services. The Web is expected to have a strong impact on almost every aspect of how we learn. "Total Quality" is the totality of features, as perceived by the customers of the product or service. The totality of features includes stated…

  2. Geo-enabling Science through Web Services (Invited)

    NASA Astrophysics Data System (ADS)

    White, C. E.

    2010-12-01

    Sharing research is a crucial part of participating in science. The value of a dataset increases when users 1) know the dataset exists, and 2) can access the dataset and use it. Exposing data through web services allows other researchers to quickly access, overlay, and analyze data in the web-based or desktop mapping client of their choice. The ability to mash up different web services can reveal new - especially spatial - relationships between phenomena, and encourage creative uses of the data. This presentation will investigate the scientific and business value of standards-based web services for earth observation data, teach step-by-step how to expose such datasets as web services, and demonstrate tools - such as Catalog Services, REST endpoints, and GIS portals - that enable other researchers to discover and use web-accessible data resources.

  3. BioSWR--semantic web services registry for bioinformatics.

    PubMed

    Repchevsky, Dmitry; Gelpi, Josep Ll

    2014-01-01

    Despite the variety of available Web services registries specifically aimed at the Life Sciences, their scope is usually restricted to a limited set of well-defined types of services. While dedicated registries are generally tied to a particular format, general-purpose ones are more adherent to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common Web service types, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages. Nevertheless, WSDL 2.0 descriptions gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web services descriptions along with the traditional WSDL based ones. The registry provides a Web-based interface for Web services registration, querying and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or using the SPARQL Protocol and RDF Query Language. The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license. PMID:25233118

  4. BioSWR – Semantic Web Services Registry for Bioinformatics

    PubMed Central

    Repchevsky, Dmitry; Gelpi, Josep Ll.

    2014-01-01

    Despite the variety of available Web services registries specifically aimed at the Life Sciences, their scope is usually restricted to a limited set of well-defined types of services. While dedicated registries are generally tied to a particular format, general-purpose ones are more adherent to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common Web service types, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages. Nevertheless, WSDL 2.0 descriptions gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web services descriptions along with the traditional WSDL based ones. The registry provides a Web-based interface for Web services registration, querying and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or using the SPARQL Protocol and RDF Query Language. The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license. PMID:25233118
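    Since the registry is described as queryable via SPARQL, the sketch below shows the general query pattern with the SPARQLWrapper package; the endpoint path under the stated base URL and the WSDL-RDF predicates are assumptions used only to illustrate the idea.

        # Hypothetical SPARQL query against the BioSWR registry. The endpoint path
        # and the ontology predicates are assumptions used only to show the pattern.
        from SPARQLWrapper import SPARQLWrapper, JSON

        ENDPOINT = "http://inb.bsc.es/BioSWR/sparql"   # assumed path under the stated base URL

        QUERY = """
        PREFIX wsdl: <http://www.w3.org/ns/wsdl-rdf#>
        SELECT ?service ?iface WHERE {
          ?service a wsdl:Service ;
                   wsdl:implements ?iface .
        } LIMIT 10
        """

        def list_services():
            client = SPARQLWrapper(ENDPOINT)
            client.setQuery(QUERY)
            client.setReturnFormat(JSON)
            results = client.query().convert()
            return [(b["service"]["value"], b["iface"]["value"])
                    for b in results["results"]["bindings"]]

        if __name__ == "__main__":
            for svc, iface in list_services():
                print(svc, "->", iface)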

  5. Web-Based Course Management and Web Services

    ERIC Educational Resources Information Center

    Mandal, Chittaranjan; Sinha, Vijay Luxmi; Reade, Christopher M. P.

    2004-01-01

    The architecture of a web-based course management tool that has been developed at IIT [Indian Institute of Technology], Kharagpur and which manages the submission of assignments is discussed. Both the distributed architecture used for data storage and the client-server architecture supporting the web interface are described. Further developments…

  6. From a Distance: Robust Reference Service via Instant Messaging

    ERIC Educational Resources Information Center

    Meulemans, Yvonne Nalani; Carr, Allison; Ly, Pearl

    2010-01-01

    Reference service via instant messaging (IM) has significant potential to benefit distance learners. There has been wide experimentation with IM to expand reference services in libraries across the US, with mixed results. Concern has been expressed that IM cannot provide the same reference experience as face-to-face interactions. One academic…

  7. Distance Learning and the Web: Are Advertising Programs Missing the Target?

    ERIC Educational Resources Information Center

    Falk, Louis K.; Rehman, Sharaf; Foster, Dawn

    1999-01-01

    Discusses survey results that examined whether distance education programs at universities offering courses in advertising and/or public relations make use of the Internet/Web pages to inform potential students about courses taught via distance education. The survey and list of Association of Education in Journalism and Mass Communication (AEJMC)…

  8. Innovation in Open & Distance Learning: Successful Development of Online and Web-Based Learning.

    ERIC Educational Resources Information Center

    Lockwood, Fred, Ed.; Gooley, Anne, Ed.

    This book contains 19 papers examining innovation in open and distance learning through development of online and World Wide Web-based learning. The following papers are included: "Innovation in Distributed Learning: Creating the Environment" (Fred Lockwood); "Innovation in Open and Distance Learning: Some Lessons from Experience and Research"…

  9. An Exploration of Cultural Value Orientations in Distance Education Web Marketing

    ERIC Educational Resources Information Center

    DeGaetano, Lora A.

    2013-01-01

    In the current global environment, universities seek to attract international students. The low enrollment of international students at a particular distance education institution demonstrated the competitive challenge in attracting international students. Distance education web marketing communications may be a factor influencing low enrollment…

  10. Modeling quality attributes and metrics for web service selection

    NASA Astrophysics Data System (ADS)

    Oskooei, Meysam Ahmadi; Daud, Salwani binti Mohd; Chua, Fang-Fang

    2014-06-01

    Since the service-oriented architecture (SOA) was designed to develop systems as distributed applications, service selection has become a vital aspect of service-oriented computing (SOC). Selecting the appropriate web service with respect to quality of service (QoS), by treating the choice as a mathematical optimization problem, makes service selection a common concern for service users. Nowadays, the number of web services providing the same functionality has increased, and selecting a service from a set of alternatives that differ in their quality parameters can be difficult for service consumers. In this paper, a new model for QoS attributes and metrics is proposed to provide a suitable, low-complexity solution for optimizing web service selection and composition.
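
    The kind of QoS-driven ranking discussed above can be illustrated with a simple additive weighting scheme: each attribute is normalised (inverting cost-type attributes such as response time) and combined with user-supplied weights. This is a generic sketch with made-up values, not the specific model proposed in the paper.

```python
# Candidate services with raw QoS attributes (illustrative values).
candidates = {
    "serviceA": {"response_time": 120.0, "availability": 0.99, "cost": 0.05},
    "serviceB": {"response_time": 300.0, "availability": 0.95, "cost": 0.01},
    "serviceC": {"response_time": 180.0, "availability": 0.97, "cost": 0.03},
}

# Attributes where smaller is better are marked as "cost" type.
attribute_type = {"response_time": "cost", "availability": "benefit", "cost": "cost"}
weights = {"response_time": 0.5, "availability": 0.3, "cost": 0.2}

def normalise(values, kind):
    """Scale raw attribute values to [0, 1], higher meaning better."""
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0
    if kind == "benefit":
        return {name: (v - lo) / span for name, v in values.items()}
    return {name: (hi - v) / span for name, v in values.items()}

# Normalise each attribute across candidates, then compute weighted scores.
normalised = {
    attr: normalise({name: qos[attr] for name, qos in candidates.items()}, kind)
    for attr, kind in attribute_type.items()
}
scores = {
    name: sum(weights[attr] * normalised[attr][name] for attr in weights)
    for name in candidates
}
print(max(scores, key=scores.get), scores)
```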

  11. Web Information Services at the University of South Africa: A Work in Progress.

    ERIC Educational Resources Information Center

    Hartzer, Sandra; Paterson, Brian; Snyman, Dorette; Thompson, Lisa; van Heerden, Louise; Vorster, Marza; Watkins, Ansie

    1998-01-01

    Outlines progress initiated by the University of South Africa (Unisa) library to develop online support to distance-education users. Topics include a library skills training program delivered via the World Wide Web; research information skills; Internet course development; current awareness service; the library's home page; electronic libraries;…

  12. Unifying Access to National Hydrologic Data Repositories via Web Services

    NASA Astrophysics Data System (ADS)

    Valentine, D. W.; Jennings, B.; Zaslavsky, I.; Maidment, D. R.

    2006-12-01

    The CUAHSI hydrologic information system (HIS) is designed to be a live, multiscale web portal system for accessing, querying, visualizing, and publishing distributed hydrologic observation data and models for any location or region in the United States. The HIS design follows the principles of open service-oriented architecture, i.e. system components are represented as web services with well-defined standard service APIs. WaterOneFlow web services are the main component of the design. The currently available services have been completely re-written compared to the previous version, and provide programmatic access to USGS NWIS (stream flow, groundwater and water quality repositories), DAYMET daily observations, NASA MODIS, and Unidata NAM streams, with several additional web service wrappers being added (EPA STORET, NCDC and others). Different repositories of hydrologic data use different vocabularies, and support different types of query access. Resolving semantic and structural heterogeneities across different hydrologic observation archives and distilling a generic set of service signatures is one of the main scalability challenges in this project, and a requirement in our web service design. To accomplish the uniformity of the web services API, data repositories are modeled following the CUAHSI Observation Data Model. The web service responses are document-based, and use an XML schema to express the semantics in a standard format. Access to station metadata is provided via the web service methods GetSites, GetSiteInfo and GetVariableInfo. These methods form the foundation of the CUAHSI HIS discovery interface and may execute over locally stored metadata or request the information from remote repositories directly. Observation values are retrieved via a generic GetValues method which is executed against national data repositories. The service is implemented in ASP.Net, and other providers are implementing WaterOneFlow services in Java. Reference implementation of
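
    A SOAP client sketch of the GetValues call described above is shown below, using the third-party zeep library. The WSDL URL, the site and variable codes, and the parameter names are illustrative assumptions based on the WaterOneFlow conventions; consult the CUAHSI documentation for the exact service signatures.

```python
from zeep import Client  # third-party SOAP client (pip install zeep)

# Hypothetical WSDL location for an NWIS-backed WaterOneFlow service.
WSDL = "https://hydroportal.cuahsi.org/nwisuv/cuahsi_1_1.asmx?WSDL"

client = Client(WSDL)

# Parameter names (location, variable, startDate, endDate, authToken) follow
# the WaterOneFlow convention; treat them and the codes below as assumptions.
response = client.service.GetValues(
    location="NWIS:10109000",   # illustrative site code
    variable="NWIS:00060",      # illustrative variable (discharge)
    startDate="2006-01-01",
    endDate="2006-01-07",
    authToken="",
)
print(str(response)[:500])  # WaterML/XML response, truncated for display
```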

  13. Experience using web services for biological sequence analysis.

    PubMed

    Stockinger, Heinz; Attwood, Teresa; Chohan, Shahid Nadeem; Côté, Richard; Cudré-Mauroux, Philippe; Falquet, Laurent; Fernandes, Pedro; Finn, Robert D; Hupponen, Taavi; Korpelainen, Eija; Labarga, Alberto; Laugraud, Aurelie; Lima, Tania; Pafilis, Evangelos; Pagni, Marco; Pettifer, Steve; Phan, Isabelle; Rahman, Nazim

    2008-11-01

    Programmatic access to data and tools through the web using so-called web services has an important role to play in bioinformatics. In this article, we discuss the most popular approaches based on SOAP/WS-I and REST and describe our experiences, as a cross-section of the community, with providing and using web services in the context of biological sequence analysis. We briefly review the main technological approaches as well as best-practice hints that are useful for both users and developers. Finally, syntactic and semantic data integration issues with multiple web services are discussed. PMID:18621748

  14. PRONET services for distance learning in mammographic image processing.

    PubMed

    Costaridou, L; Panayiotakis, G; Efstratiou, C; Sakellaropoulos, P; Cavouras, D; Kalogeropoulou, C; Varaki, K; Giannakou, L; Dimopoulos, J

    1997-01-01

    The potential of telematics services is investigated with respect to learning needs of medical physicists and biomedical engineers. Telematics services are integrated into a system, the PRONET, which evolves around multimedia computer based courses and distance tutoring support. In addition, information database access and special interest group support are offered. System architecture is based on a component integration approach. The services are delivered in three modes: LAN, ISDN and Internet. Mammographic image processing is selected as an example content area. PMID:10179585

  15. Web 2.0 Strategy in Libraries and Information Services

    ERIC Educational Resources Information Center

    Byrne, Alex

    2008-01-01

    Web 2.0 challenges libraries to change from their predominantly centralised service models with integrated library management systems at the hub. Implementation of Web 2.0 technologies and the accompanying attitudinal shifts will demand reconceptualisation of the nature of library and information service around a dynamic, ever changing, networked,…

  16. Density estimation of small-mammal populations using a trapping web and distance sampling methods

    USGS Publications Warehouse

    Anderson, David R.; Burnham, Kenneth P.; White, Gary C.; Otis, David L.

    1983-01-01

    Distance sampling methodology is adapted to enable animal density (number per unit of area) to be estimated from capture-recapture and removal data. A trapping web design provides the link between capture data and distance sampling theory. The estimator of density is D = M_{t+1} * f(0), where M_{t+1} is the number of individuals captured and f(0) is computed from the M_{t+1} distances from the web center to the traps in which those individuals were first captured. It is possible to check qualitatively the critical assumption on which the web design and the estimator are based. This is a conceptual paper outlining a new methodology, not a definitive investigation of the best specific way to implement this method. Several alternative sampling and analysis methods are possible within the general framework of distance sampling theory; a few alternatives are discussed and an example is given.

  17. Compression-based aggregation model for medical web services.

    PubMed

    Al-Shammary, Dhiah; Khalil, Ibrahim

    2010-01-01

    Many organizations such as hospitals have adopted Cloud Web services for their network services to avoid investing heavily in computing infrastructure. SOAP (Simple Object Access Protocol), an XML-based protocol, is the basic communication protocol of Cloud Web services. Web services often suffer congestion and bottlenecks as a result of the high network traffic caused by the large XML overhead, and the massive load on Cloud Web services, in terms of the large volume of client requests, exacerbates the problem. In this paper, two XML-aware aggregation techniques based on compression concepts are proposed in order to aggregate medical Web messages and achieve higher message size reduction. PMID:21097152
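
    To make the XML-overhead point above concrete, the snippet below gzips a small SOAP envelope and reports the size reduction. Plain gzip is used purely for illustration; it is not the XML-aware aggregation technique proposed in the paper, and the envelope contents are invented.

```python
import gzip

soap_envelope = b"""<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetPatientRecord xmlns="urn:example-hospital">
      <patientId>12345</patientId>
      <includeHistory>true</includeHistory>
    </GetPatientRecord>
  </soap:Body>
</soap:Envelope>"""

# On such a tiny message the gain is modest; savings grow with larger payloads
# and with many structurally similar messages aggregated together.
compressed = gzip.compress(soap_envelope)
ratio = len(compressed) / len(soap_envelope)
print(f"original: {len(soap_envelope)} bytes, "
      f"gzipped: {len(compressed)} bytes ({ratio:.0%} of original)")
```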

  18. Process model-based atomic service discovery and composition of composite semantic web services using web ontology language for services (OWL-S)

    NASA Astrophysics Data System (ADS)

    Paulraj, D.; Swamynathan, S.; Madhaiyan, M.

    2012-11-01

    Web service composition has become indispensable as a single web service cannot satisfy complex functional requirements. Composition of services has received much interest as a means of supporting business-to-business (B2B) and enterprise application integration. An important component of service composition is the discovery of relevant services. In Semantic Web Services (SWS), service discovery is generally achieved by using the service profile of the Web Ontology Language for Services (OWL-S). The profile of the service is a derived, concise description rather than a functional part of the service. The information contained in the service profile is sufficient for atomic service discovery, but it is not sufficient for the discovery of composite semantic web services (CSWS). The purpose of this article is two-fold: first, to prove that the process model is a better choice than the service profile for service discovery; second, to facilitate the composition of inter-organisational CSWS by proposing a new composition method which uses a process ontology. The proposed service composition approach uses an algorithm which performs a fine-grained match at the level of the atomic process rather than at the level of the entire service in a composite semantic web service. Many works in this area have proposed solutions only for the composition of atomic services, whereas this article proposes a solution for the composition of composite semantic web services.
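
    The fine-grained, atomic-process-level matching described above can be sketched as a simple input/output chaining check: an atomic process becomes usable once its required input concepts are covered by what has already been produced. This is a generic sketch, not the article's algorithm; the process and concept names are illustrative.

```python
# Each atomic process is described by the concepts it consumes and produces.
atomic_processes = {
    "GeocodeAddress":   {"inputs": {"PostalAddress"}, "outputs": {"GeoCoordinate"}},
    "FindNearbyClinic": {"inputs": {"GeoCoordinate"}, "outputs": {"ClinicID"}},
    "BookAppointment":  {"inputs": {"ClinicID", "PatientID"}, "outputs": {"Booking"}},
}

def compose(available, goal):
    """Greedily chain atomic processes until the goal concept is produced."""
    plan, produced = [], set(available)
    progress = True
    while goal not in produced and progress:
        progress = False
        for name, proc in atomic_processes.items():
            if name not in plan and proc["inputs"] <= produced:
                plan.append(name)
                produced |= proc["outputs"]
                progress = True
    return plan if goal in produced else None

print(compose({"PostalAddress", "PatientID"}, "Booking"))
# -> ['GeocodeAddress', 'FindNearbyClinic', 'BookAppointment']
```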

  19. Spatial Data Web Services Pricing Model Infrastructure

    NASA Astrophysics Data System (ADS)

    Ozmus, L.; Erkek, B.; Colak, S.; Cankurt, I.; Bakıcı, S.

    2013-08-01

    The General Directorate of Land Registry and Cadastre (TKGM), the leader in the field of cartography in Turkey, continues its missions: to keep and update the land registry and cadastre system of the country under the responsibility of the treasury, to perform transactions related to real estate, and to establish the Turkish national spatial information system. TKGM, a public agency, has completed many projects, such as Continuously Operating GPS Reference Stations (TUSAGA-Aktif), the Geo-Metadata Portal (HBB), orthophoto base-map production and web services, completion of the initial cadastre, the Cadastral Renovation Project (TKMP), the Land Registry and Cadastre Information System (TAKBIS), the Turkish National Spatial Data Infrastructure Project (TNSDI) and the Ottoman Land Registry Archive Information System (TARBIS). TKGM provides updated maps and map information not only to public institutions but also to society at large, in the name of social responsibility principles. Turkish National Spatial Data Infrastructure activities were initiated by Circular No. 2003/48, declared by the Turkish Prime Ministry in 2003 within the context of the e-Transformation of Turkey Short-term Action Plan. Action No. 47 in that plan states that "a feasibility study shall be made in order to establish the Turkish National Spatial Data Infrastructure", with responsibility assigned to the General Directorate of Land Registry and Cadastre. The feasibility report of the NSDI was completed on 10 December 2010. After the decision of the Steering Committee, the feasibility report was sent to the Development Bank (formerly the State Planning Organization) for further evaluation. There are two main arrangements related to this project (feasibility report). First, there is now only one ministry, the Ministry of Environment and Urbanism, responsible for the establishment, operation and all national-level activities of the NSDI. The second arrangement is related to the institutional level. The

  20. WebGIS based on semantic grid model and web services

    NASA Astrophysics Data System (ADS)

    Zhang, WangFei; Yue, CaiRong; Gao, JianGuo

    2009-10-01

    As the meeting point of network technology and GIS technology, WebGIS has developed rapidly in recent years. Constrained by the Web and by the characteristics of GIS, traditional WebGIS has some prominent problems: for example, it cannot achieve interoperability between heterogeneous spatial databases, nor cross-platform data access. With the appearance of Web Service and Grid technology, great changes have taken place in the field of WebGIS. Web Services provide an interface that gives different sites the ability to share data and intercommunicate. The goal of Grid technology is to turn the Internet into a large supercomputer with which we can efficiently implement the overall sharing of computing, storage, data, information, knowledge and expert resources. For WebGIS, however, merely connecting data and information physically is far from enough. Because of different understandings of the world, different professional regulations, different policies and different habits, experts in different fields reach different conclusions when observing the same geographic phenomenon, and semantic heterogeneity is produced; as a result, there are large differences in how the same concept is treated across fields. If we use WebGIS without considering this semantic heterogeneity, we will answer the questions users pose wrongly, or not at all. To solve this problem, this paper puts forward and tests an effective method of combining the semantic grid with Web Services technology to develop WebGIS. We studied how to construct the ontology and how to combine Grid technology with Web Services and, with a detailed analysis of computing characteristics and application models for distributed data, we designed a WebGIS query system driven by

  1. Rule-based semantic web services matching strategy

    NASA Astrophysics Data System (ADS)

    Fan, Hong; Wang, Zhihua

    2011-12-01

    With the development of Web services technology, the number of services is increasing rapidly, and efficiently discovering the services that exactly match a user's requirements from a large-scale service library has become a challenging task. Many semantic Web service discovery technologies proposed in the recent literature focus only on keyword-based or primitive semantics-based service matching. This paper studies a rule-based, rule-reasoning service matching algorithm against the background of a large-scale service library. Firstly, formal descriptions of semantic web services and of service matching are presented. Service matching is divided into four levels, Exact, Plugin, Subsume and Fail, and their formal descriptions are also presented. Then, service matching is treated as a rule-based reasoning problem. A set of matching rules is given, the related service set is retrieved from a service ontology base through rule-based reasoning, and the matching levels are determined by distinguishing the relationships between each service's I/O and the user's requested I/O. Finally, experiments based on two service sets show that the proposed service matching strategy easily implements smart service discovery and achieves high service discovery efficiency in comparison with the traditional global traversal strategy.
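
    The four matching levels mentioned above (Exact, Plugin, Subsume, Fail) can be illustrated with a toy concept hierarchy and a subsumption check. The level definitions below follow one common convention from the semantic matchmaking literature and may differ in detail from the paper's rules; the concept names are invented.

```python
# Toy concept hierarchy: child -> parent.
PARENT = {
    "SedanCar": "Car",
    "Car": "Vehicle",
    "Vehicle": "Thing",
}

def is_subconcept(a, b):
    """True if concept a equals or is a descendant of concept b."""
    while a is not None:
        if a == b:
            return True
        a = PARENT.get(a)
    return False

def match_level(advertised, requested):
    # Convention assumed here: Plugin when the advertised concept is more
    # specific than the requested one; Subsume when it is more general.
    if advertised == requested:
        return "Exact"
    if is_subconcept(advertised, requested):
        return "Plugin"
    if is_subconcept(requested, advertised):
        return "Subsume"
    return "Fail"

print(match_level("SedanCar", "Car"))   # Plugin
print(match_level("Vehicle", "Car"))    # Subsume
print(match_level("Car", "Car"))        # Exact
print(match_level("Car", "Boat"))       # Fail
```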

  2. Processing biological literature with customizable Web services supporting interoperable formats

    PubMed Central

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. PMID:25006225

  3. Processing biological literature with customizable Web services supporting interoperable formats.

    PubMed

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. PMID:25006225

  4. The impact of web services at the IRIS DMC

    NASA Astrophysics Data System (ADS)

    Weekly, R. T.; Trabant, C. M.; Ahern, T. K.; Stults, M.; Suleiman, Y. Y.; Van Fossen, M.; Weertman, B.

    2015-12-01

    The IRIS Data Management Center (DMC) has served the seismological community for nearly 25 years. In that time we have offered data and information from our archive using a variety of mechanisms, ranging from email-based access to desktop applications to web applications and web services. Of these, web services have quickly become the primary method for data extraction at the DMC. In 2011, the first full year of operation, web services accounted for over 40% of the data shipped from the DMC. In 2014, ~450 TB of data was delivered directly to users through web services, representing nearly 70% of all shipments from the DMC that year. In addition to handling requests directly from users, the DMC switched all data extraction methods to use web services in 2014. On average the DMC now handles between 10 and 20 million requests per day submitted to web service interfaces. The rapid adoption of web services is attributed to the many advantages they bring. For users, they provide on-demand data using an interface technology, HTTP, that is widely supported in nearly every computing environment and language. These characteristics, combined with human-readable documentation and existing tools, make integration of data access into existing workflows relatively easy. For the DMC, the web services provide an abstraction layer to internal repositories, allowing for concentrated optimization of the extraction workflow and easier evolution of those repositories. Lending further support to the DMC's push in this direction, the core web services for station metadata, timeseries data and event parameters were adopted as standards by the International Federation of Digital Seismograph Networks (FDSN). We expect to continue enhancing existing services and building new capabilities for this platform. For example, the DMC has created a federation system and tools allowing researchers to discover and collect seismic data from data centers running the FDSN-standardized services. A future capability
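
    For readers unfamiliar with the FDSN-standardised services mentioned above, the request below shows the general shape of a dataselect query over HTTP. The parameter names follow the published FDSN web service specification; the network, station and time-window values are illustrative examples only.

```python
import requests

# FDSN dataselect service endpoint at the IRIS DMC.
URL = "https://service.iris.edu/fdsnws/dataselect/1/query"

# Illustrative request for one channel over a short time window.
params = {
    "net": "IU",
    "sta": "ANMO",
    "loc": "00",
    "cha": "BHZ",
    "start": "2010-02-27T06:30:00",
    "end": "2010-02-27T07:30:00",
}

response = requests.get(URL, params=params, timeout=60)
response.raise_for_status()
with open("anmo_bhz.mseed", "wb") as fh:  # miniSEED payload
    fh.write(response.content)
print(len(response.content), "bytes of miniSEED written")
```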

  5. Storage Viability and Optimization Web Service

    SciTech Connect

    Stadler, Michael; Marnay, Christ; Lai, Judy; Siddiqui, Afzal; Limpaitoon, Tanachai; Phan, Trucy; Megel, Olivier; Chang, Jessica; DeForest, Nicholas

    2010-10-11

    Non-residential sectors offer many promising applications for electrical storage (batteries) and photovoltaics (PVs). However, choosing and operating storage under complex tariff structures poses a daunting technical and economic problem that may discourage potential customers and result in lost carbon and economic savings. Equipment vendors are unlikely to provide adequate environmental analysis or unbiased economic results to potential clients, and are even less likely to completely describe the robustness of choices in the face of changing fuel prices and tariffs. Given these considerations, researchers at Lawrence Berkeley National Laboratory (LBNL) have designed the Storage Viability and Optimization Web Service (SVOW): a tool that helps building owners, operators and managers to decide if storage technologies and PVs merit deeper analysis. SVOW is an open access, web-based energy storage and PV analysis calculator, accessible by secure remote login. Upon first login, the user sees an overview of the parameters: load profile, tariff, technologies, and solar radiation location. Each parameter has a pull-down list of possible predefined inputs and users may upload their own as necessary. Since the non-residential sectors encompass a broad range of facilities with fundamentally different characteristics, the tool starts by asking the users to select a load profile from a limited cohort group of example facilities. The example facilities are categorized according to their North American Industry Classification System (NAICS) code. After the load profile selection, users select a predefined tariff or use the widget to create their own. The technologies and solar radiation menus operate in a similar fashion. After these four parameters have been inputted, the users have to select an optimization setting as well as an optimization objective. The analytic engine of SVOW is LBNL's Distributed Energy Resources Customer Adoption Model (DER-CAM), which is a mixed

  6. Ethical Issues in Providing Library Services to Distance Learners

    ERIC Educational Resources Information Center

    Needham, Gill; Johnson, Kay

    2007-01-01

    The authors, library practitioners from either side of the Atlantic Ocean, embarked on a dialogue about the ethical challenges encountered in providing library services to distance learners. Unable to find an existing, appropriate ethical framework for their discussion, they agreed to devise their own, informed by relevant professional codes and…

  7. Communicating data quality through Web Map Services

    NASA Astrophysics Data System (ADS)

    Blower, Jon; Roberts, Charles; Griffiths, Guy; Lewis, Jane; Yang, Kevin

    2013-04-01

    The sharing and visualization of environmental data through spatial data infrastructures is becoming increasingly common. However, information about the quality of data is frequently unavailable or presented in an inconsistent fashion. ("Data quality" is a phrase with many possible meanings but here we define it as "fitness for purpose" - therefore different users have different notions of what constitutes a "high quality" dataset.) The GeoViQua project (www.geoviqua.org) is developing means for eliciting, formatting, discovering and visualizing quality information using ISO and Open Geospatial Consortium (OGC) standards. Here we describe one aspect of the innovations of the GeoViQua project. In this presentation, we shall demonstrate new developments in using Web Map Services to communicate data quality at the level of datasets, variables and individual samples. We shall outline a new draft set of conventions (known as "WMS-Q"), which describe a set of rules for using WMS to convey quality information (OGC draft Engineering Report 12-160). We shall demonstrate these conventions through new prototype software, based upon the widely-used ncWMS software, that applies these rules to enable the visualization of uncertainties in raster data such as satellite products and the results of numerical simulations. Many conceptual and practical issues have arisen from these experiments. How can source data be formatted so that a WMS implementation can detect the semantic links between variables (e.g. the links between a mean field and its variance)? The visualization of uncertainty can be a complex task - how can we provide users with the power and flexibility to choose an optimal strategy? How can we maintain compatibility (as far as possible) with existing WMS clients? We explore these questions with reference to existing standards and approaches, including UncertML, NetCDF-U and Styled Layer Descriptors.

  8. Going, going, still there: using the WebCite service to permanently archive cited web pages.

    PubMed

    Eysenbach, Gunther; Trudel, Mathieu

    2005-01-01

    Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information including the WebCite link on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. WebCite can also process publisher-submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, retrospectively caching references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research

  9. Going, Going, Still There: Using the WebCite Service to Permanently Archive Cited Web Pages

    PubMed Central

    Trudel, Mathieu

    2005-01-01

    Scholars are increasingly citing electronic “web references” which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To “webcite” a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its “instructions for authors” accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information including the WebCite link on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) “prospectively” before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. WebCite can also process publisher-submitted “citing articles” (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, retrospectively caching references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have

  10. Service Learning and Building Community with the World Wide Web

    ERIC Educational Resources Information Center

    Longan, Michael W.

    2007-01-01

    The geography education literature touts the World Wide Web (Web) as a revolutionary educational tool, yet most accounts ignore its uses for public communication and creative expression. This article argues that students can be producers of content that is of service to local audiences. Drawing inspiration from the community networking movement,…

  11. A Web Service and Interface for Remote Electronic Device Characterization

    ERIC Educational Resources Information Center

    Dutta, S.; Prakash, S.; Estrada, D.; Pop, E.

    2011-01-01

    A lightweight Web Service and a Web site interface have been developed, which enable remote measurements of electronic devices as a "virtual laboratory" for undergraduate engineering classes. Using standard browsers without additional plugins (such as Internet Explorer, Firefox, or even Safari on an iPhone), remote users can control a Keithley…

  12. The impact of national cultural distance on the number of foreign Web site visits by U.S. households.

    PubMed

    Beugelsdijk, Sjoerd; Slangen, Arjen

    2010-04-01

    We investigate how national cultural distance, defined as the extent to which the shared values and norms in one country differ from those in another, affects the number of Web site visits. Based on a sample of 2,654 U.S. households visiting Web sites in 38 countries over 25 different Web site categories, we find that cultural distance has a negative and significant effect on the number of taste-related foreign Web site visits. In the case of Web sites containing sexually explicit material, we obtain a significantly positive effect of cultural distance. Our findings suggest that cultural distance can be both a source of attraction and a source of repulsion in explaining the number of Web site visits, depending on the nature of the Web site. PMID:20528279

  13. SSWAP: A Simple Semantic Web Architecture and Protocol for Semantic Web Services

    Technology Transfer Automated Retrieval System (TEKTRAN)

    SSWAP (Simple Semantic Web Architecture and Protocol) is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP is the driving technology behind the Virtual Plant Information Network, an NSF-funded semantic w...

  14. The Use of WebCT in Distance Learning Course in University of Manchester

    ERIC Educational Resources Information Center

    Ahmad, Rosman; Edwards, Rodger; Tomkinson, Bland

    2006-01-01

    The World Wide Web has impacted the educational model and has become part of distance education early in this century. Many changes are taking place in higher education for political, economic and educational reasons, and new goals and educational objectives are being set within educational institutions. There is a particular emphasis on producing a more…

  15. Problems of Implementing SCORM in an Enterprise Distance Learning Architecture: SCORM Incompatibility across Multiple Web Domains.

    ERIC Educational Resources Information Center

    Engelbrecht, Jeffrey C.

    2003-01-01

    Delivering content to distant users located in dispersed networks, separated by firewalls and different web domains requires extensive customization and integration. This article outlines some of the problems of implementing the Sharable Content Object Reference Model (SCORM) in the Marine Corps' Distance Learning System (MarineNet) and extends…

  16. A Comparison of Web-Based Concept Mapping Tasks for Alternative Assessment in Distance Teacher Education

    ERIC Educational Resources Information Center

    Oliver, Kevin

    2008-01-01

    Three sections of the same distance education class completed a series of Web-based concept map assessments using one of two methods. Open-ended maps applied in section 1 led students to conduct more relational thinking overall, but variance in map items was very high, introducing more subjectivity into scoring. Pre-selected term mapping applied in…

  17. Offering Distance Education in Health Informatics: The State of the Web Sites.

    ERIC Educational Resources Information Center

    Lazinger, Susan; Handzel, Ruth

    2003-01-01

    Within the framework of a bi-national project, between the University of North Carolina at Chapel Hill and four Israeli universities, a prototype database of programs and courses in health informatics was implemented. Examined Web sites particularly for courses offered via distance education and discusses results of a content analysis. (Author/LRW)

  18. Influence of Web-Based Distance Education on the Academic Department Chair Role

    ERIC Educational Resources Information Center

    Franklin, Kathy K.; Hart, Jan K.

    2006-01-01

    The purpose of this study was to examine academic department chair perceptions about the future influence of web-based distance education on departmental operations and their changing role as academic leader. Using a rating, modified-policy Delphi method, the researcher worked with 22 department chairs employed at public, urban universities in the…

  19. Faculty Adoption Behaviour about Web-Based Distance Education: A Case Study from China Agricultural University

    ERIC Educational Resources Information Center

    Li, Yan; Lindner, James R.

    2007-01-01

    The purpose of this study was to determine China Agricultural University's (CAU's) faculty adoption behaviour about web-based distance education (WBDE). Rogers' (2003) model of five stages in the innovation-decision process was adopted and modified as the theoretical base for the study. Quantitative research was employed and the research design…

  20. The Design of an Integrated System for Web-Based Distance Education.

    ERIC Educational Resources Information Center

    Wang, Hongxue; Holt, Pete

    This paper deals with the design of an integrated system for Web-based distance education (ISWBDE). At the highest level, this system can be dissected into two sub-systems: an integrated course authoring system (ICAS) and an integrated course delivery system (ICDS). The ICAS is designed to help with, or automate the creation and management of…

  1. The Impact of Web Conferencing Training on Peer Tutors' Attitudes toward Distance Education

    ERIC Educational Resources Information Center

    Dvorak, Johanna; Roessger, Kevin

    2012-01-01

    This study investigated the attitudes of peer tutors who received web conferencing training in preparation for synchronous online tutoring. A quasi-experimental design was employed to evaluate changes in peer tutors' attitudes toward distance learning following participation in an online tutor training program. Peer tutors were found to have: (a)…

  2. The Challenge of Designing and Evaluating "Interaction" in Web-Based Distance Education.

    ERIC Educational Resources Information Center

    Gunawardena, Charlotte

    The purpose of this paper is threefold: (1) to discuss issues related to the design of interaction on the World Wide Web using models of interaction developed for distance education; (2) to examine several techniques for the analysis of interactions and the quality of the learning experience in a computer-mediated group conference; and (3) to…

  3. Building asynchronous geospatial processing workflows with web services

    NASA Astrophysics Data System (ADS)

    Zhao, Peisheng; Di, Liping; Yu, Genong

    2012-02-01

    Geoscience research and applications often involve a geospatial processing workflow. This workflow includes a sequence of operations that use a variety of tools to collect, translate, and analyze distributed heterogeneous geospatial data. Asynchronous mechanisms, by which clients initiate a request and then resume their processing without waiting for a response, are very useful for complicated workflows that take a long time to run. Geospatial contents and capabilities are increasingly becoming available online as interoperable Web services. This online availability significantly enhances the ability to use Web service chains to build distributed geospatial processing workflows. This paper focuses on how to orchestrate Web services for implementing asynchronous geospatial processing workflows. The theoretical bases for asynchronous Web services and workflows, including asynchrony patterns and message transmission, are examined to explore different asynchronous approaches to and architecture of workflow code for the support of asynchronous behavior. A sample geospatial processing workflow, issued by the Open Geospatial Consortium (OGC) Web Service, Phase 6 (OWS-6), is provided to illustrate the implementation of asynchronous geospatial processing workflows and the challenges in using Web Services Business Process Execution Language (WS-BPEL) to develop them.
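
    The asynchronous pattern described above, in which a client submits a request and retrieves the result later, can be sketched as a submit-then-poll loop over HTTP. The endpoint paths and JSON fields here are hypothetical; real WPS or WS-BPEL deployments define their own status documents and callback mechanisms.

```python
import time
import requests

BASE = "https://example.org/geoprocessing"  # hypothetical workflow service

# Submit the processing request; the service replies immediately with a job id.
job = requests.post(f"{BASE}/jobs",
                    json={"workflow": "flood-mapping", "region": "eu-demo"},
                    timeout=30).json()
status_url = f"{BASE}/jobs/{job['id']}"

# Poll until the long-running workflow finishes, then fetch the result.
while True:
    status = requests.get(status_url, timeout=30).json()
    if status["state"] in ("succeeded", "failed"):
        break
    time.sleep(10)  # in practice the client resumes other work between polls

if status["state"] == "succeeded":
    result = requests.get(f"{status_url}/result", timeout=60)
    print("result document:", len(result.content), "bytes")
else:
    print("workflow failed:", status.get("message"))
```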

  4. Automated geospatial Web Services composition based on geodata quality requirements

    NASA Astrophysics Data System (ADS)

    Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael

    2012-10-01

    Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.

  5. Provenance-Based Approaches to Semantic Web Service Discovery and Usage

    ERIC Educational Resources Information Center

    Narock, Thomas William

    2012-01-01

    The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…

  6. Reinforcement Learning Based Web Service Compositions for Mobile Business

    NASA Astrophysics Data System (ADS)

    Zhou, Juan; Chen, Shouming

    In this paper, we propose a new solution to Reactive Web Service Composition by modelling it with Reinforcement Learning and introducing modifiable (alterable) QoS variables into the model as elements of the Markov Decision Process tuple. Moreover, we give an example of Reactive-WSC-based mobile banking to demonstrate the solution's intrinsic capability of obtaining the optimized service composition, characterized by (alterable) target QoS variable sets with optimized values. Consequently, we conclude that the solution has good potential for boosting customer experience and quality of service in Web Services, and in applications across the whole electronic commerce and business sector.

  7. Scalable web services for the PSIPRED Protein Analysis Workbench.

    PubMed

    Buchan, Daniel W A; Minneci, Federico; Nugent, Tim C O; Bryson, Kevin; Jones, David T

    2013-07-01

    Here, we present the new UCL Bioinformatics Group's PSIPRED Protein Analysis Workbench. The Workbench unites all of our previously available analysis methods into a single web-based framework. The new web portal provides a greatly streamlined user interface with a number of new features to allow users to better explore their results. We offer a number of additional services to enable computationally scalable execution of our prediction methods; these include SOAP and XML-RPC web server access and new HADOOP packages. All software and services are available via the UCL Bioinformatics Group website at http://bioinf.cs.ucl.ac.uk/. PMID:23748958
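
    Python's standard library can talk to an XML-RPC interface of the kind mentioned above. The endpoint URL and method name in this sketch are purely hypothetical placeholders, not the documented PSIPRED API; the point is only to show how little client code XML-RPC access requires.

```python
import xmlrpc.client

# Hypothetical XML-RPC endpoint and method; consult the service documentation
# for the real URL and signatures.
proxy = xmlrpc.client.ServerProxy("https://example.org/psipred/xmlrpc")

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQ"
job_id = proxy.submit_prediction(sequence)  # hypothetical method name
print("submitted job:", job_id)
```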

  8. Business Systems Branch Abilities, Capabilities, and Services Web Page

    NASA Technical Reports Server (NTRS)

    Cortes-Pena, Aida Yoguely

    2009-01-01

    During the INSPIRE summer internship I acted as the Business Systems Branch Capability Owner for the Kennedy Web-based Initiative for Communicating Capabilities System (KWICC), with the responsibility of creating a portal that describes the services provided by this Branch. This project will help others achieve a clear view of the services that the Business Systems Branch provides to NASA and the Kennedy Space Center. After collecting the data through interviews with subject matter experts and the literature in Business World and other web sites, I identified discrepancies, made the necessary corrections to the sites and placed the information from the report into the KWICC web page.

  9. A SOAP Web Service for accessing MODIS land product subsets

    SciTech Connect

    SanthanaVannan, Suresh K; Cook, Robert B; Pan, Jerry Yun; Wilson, Bruce E

    2011-01-01

    Remote sensing data from satellites have provided valuable information on the state of the earth for several decades. Since March 2000, the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on board NASA's Terra and Aqua satellites have been providing estimates of several land parameters useful in understanding earth system processes at global, continental, and regional scales. However, the HDF-EOS file format, the specialized software needed to process HDF-EOS files, the data volume, and the high spatial and temporal resolution of MODIS data make it difficult for users wanting to extract small but valuable amounts of information from the MODIS record. To overcome this usability issue, the NASA-funded Distributed Active Archive Center (DAAC) for Biogeochemical Dynamics at Oak Ridge National Laboratory (ORNL) developed a Web service that provides subsets of MODIS land products using the Simple Object Access Protocol (SOAP). The ORNL DAAC MODIS subsetting Web service is a unique way of serving satellite data that exploits a well-established and popular Internet protocol to allow users access to massive amounts of remote sensing data. The Web service provides MODIS land product subsets up to 201 x 201 km in a non-proprietary comma-delimited text file format. Users can programmatically query the Web service to extract MODIS land parameters for real-time data integration into models and decision support tools, or connect it to workflow software. Information regarding the MODIS SOAP subsetting Web service is available on the World Wide Web (WWW) at http://daac.ornl.gov/modiswebservice.

  10. BioServices: a common Python package to access biological Web Services programmatically

    PubMed Central

    Cokelaer, Thomas; Pultz, Dennis; Harder, Lea M.; Serra-Musach, Jordi; Saez-Rodriguez, Julio

    2013-01-01

    Motivation: Web interfaces provide access to numerous biological databases. Many can be accessed programmatically thanks to Web Services. Building applications that combine several of them would benefit from a single framework. Results: BioServices is a comprehensive Python framework that provides programmatic access to major bioinformatics Web Services (e.g. KEGG, UniProt, BioModels, ChEMBLdb). Wrapping additional Web Services based either on Representational State Transfer or Simple Object Access Protocol/Web Services Description Language technologies is eased by the usage of object-oriented programming. Availability and implementation: BioServices releases and documentation are available at http://pypi.python.org/pypi/bioservices under a GPL-v3 license. Contact: cokelaer@ebi.ac.uk or bioservices@googlegroups.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24064416
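
    A minimal usage sketch of the package described above is shown below. The UniProt wrapper and its search method appear in the BioServices documentation, but treat the exact keyword arguments and output format as assumptions, since they can change between releases and with upstream API changes.

```python
from bioservices import UniProt  # pip install bioservices

u = UniProt()
# Query UniProt for a gene name; the frmt/limit arguments are assumptions and
# may need adjusting for the installed bioservices version.
results = u.search("zap70", frmt="tsv", limit=5)
print(results)
```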

  11. Web Services Provide Access to SCEC Scientific Research Application Software

    NASA Astrophysics Data System (ADS)

    Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.

    2003-12-01

    Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the
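
    The "web service wrapping" idea described above, running an existing command-line scientific code behind an HTTP interface, can be sketched with a very small Flask application. The executable name and its arguments are hypothetical stand-ins for a real research code; this is a generic sketch, not the SCEC framework itself.

```python
import subprocess
from flask import Flask, jsonify, request  # pip install flask

app = Flask(__name__)

@app.route("/run-simulation", methods=["POST"])
def run_simulation():
    params = request.get_json(force=True)
    # "wave_sim" is a hypothetical legacy C/FORTRAN executable; the wrapper
    # simply shells out to it and returns its output, leaving the code untouched.
    completed = subprocess.run(
        ["./wave_sim", "--magnitude", str(params.get("magnitude", 6.5))],
        capture_output=True, text=True, timeout=300,
    )
    return jsonify({"returncode": completed.returncode,
                    "stdout": completed.stdout,
                    "stderr": completed.stderr})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```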

  12. BPELPower—A BPEL execution engine for geospatial web services

    NASA Astrophysics Data System (ADS)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements are especially in its capabilities in handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the decade. Two scenarios were discussed in detail to demonstrate the capabilities of BPELPower. That study showed a standard-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high performance parallel processing and broad Web paradigms.

  13. Video in Distance Education: ITFS vs. Web-Streaming--Evaluation of Student Attitudes

    ERIC Educational Resources Information Center

    Reisslein, Jana; Seeling, Patrick; Reisslein, Martin

    2005-01-01

    The use of video in distance education courses has a long tradition, with many colleges and universities having delivered distance education courses with video since the 1980s using the Instructional Television Fixed Service (ITFS) and cable television. With the emergence of the Internet and the increased access bandwidths from private homes…

  14. Architecture-Based Reliability Analysis of Web Services

    ERIC Educational Resources Information Center

    Rahmani, Cobra Mariam

    2012-01-01

    In a Service Oriented Architecture (SOA), the hierarchical complexity of Web Services (WS) and their interactions with the underlying Application Server (AS) create new challenges in providing a realistic estimate of WS performance and reliability. The current approaches often treat the entire WS environment as a black-box. Thus, the sensitivity…

  15. Advancing Your Library's Web-Based Services. ERIC Digest.

    ERIC Educational Resources Information Center

    Feldman, Sari; Strobel, Tracy

    This digest discusses the development of World Wide Web-based services for libraries and provides examples from the Cleveland Public Library (CPL). The first section highlights the importance of developing such services, steps to be followed for a successful project, and the importance of having the goal of replicating and enhancing traditional…

  16. Teaching Medical Students at a Distance: Using Distance Learning Benchmarks to Plan and Evaluate a Web-Enhanced Medical Student Curriculum

    ERIC Educational Resources Information Center

    Olney, Cynthia A.; Chumley, Heidi; Parra, Juan M.

    2004-01-01

    A team designing a Web-enhanced third-year medical education didactic curriculum based their course planning and evaluation activities on the Institute for Higher Education Policy's (2000) 24 benchmarks for online distance learning. The authors present the team's blueprint for planning and evaluating the Web-enhanced curriculum, which incorporates…

  17. OntoGene web services for biomedical text mining.

    PubMed

    Rinaldi, Fabio; Clematide, Simon; Marques, Hernani; Ellendorff, Tilia; Romacker, Martin; Rodriguez-Esteban, Raul

    2014-01-01

    Text mining services are rapidly becoming a crucial component of various knowledge management pipelines, for example in the process of database curation, or for exploration and enrichment of biomedical data within the pharmaceutical industry. Traditional architectures, based on monolithic applications, do not offer sufficient flexibility for a wide range of use case scenarios, and therefore open architectures, as provided by web services, are attracting increased interest. We present an approach towards providing advanced text mining capabilities through web services, using a recently proposed standard for textual data interchange (BioC). The web services leverage a state-of-the-art platform for text mining (OntoGene) which has been tested in several community-organized evaluation challenges, with top-ranked results in several of them. PMID:25472638

  18. A Caching Mechanism for Semantic Web Service Discovery

    NASA Astrophysics Data System (ADS)

    Stollberg, Michael; Hepp, Martin; Hoffmann, Jörg

    The discovery of suitable Web services for a given task is one of the central operations in Service-oriented Architectures (SOA), and research on Semantic Web services (SWS) aims at automating this step. For the large amount of available Web services that can be expected in real-world settings, the computational costs of automated discovery based on semantic matchmaking become important. To make a discovery engine a reliable software component, we must thus aim at minimizing both the mean and the variance of the duration of the discovery task. For this, we present an extension for discovery engines in SWS environments that exploits structural knowledge and previous discovery results for reducing the search space of consequent discovery operations. Our prototype implementation shows significant improvements when applied to the Stanford SWS Challenge scenario and dataset.
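
    The idea of reusing previous discovery results can be illustrated with a small memoisation layer in front of a matchmaker: requests with the same goal signature skip the expensive semantic matching step. This is a generic sketch of the general idea with invented service names, not the engine extension described in the paper, which additionally exploits structural knowledge about goals.

```python
import time

SERVICE_REPOSITORY = {
    "HotelBookingService": {"BookHotel", "CancelBooking"},
    "FlightSearchService": {"SearchFlight"},
    "TrainTicketService": {"BookTrain"},
}

def expensive_matchmaking(goal):
    """Stand-in for semantic matchmaking over a large service repository."""
    time.sleep(0.5)  # simulate costly reasoning
    return sorted(name for name, caps in SERVICE_REPOSITORY.items() if goal in caps)

_discovery_cache = {}

def discover(goal):
    # Key the cache on the goal signature; a real engine would also reuse
    # results of related (e.g. subsuming) goals to shrink the search space.
    if goal not in _discovery_cache:
        _discovery_cache[goal] = expensive_matchmaking(goal)
    return _discovery_cache[goal]

start = time.perf_counter(); discover("BookHotel")
print(f"cold lookup:   {time.perf_counter() - start:.2f}s")
start = time.perf_counter(); discover("BookHotel")
print(f"cached lookup: {time.perf_counter() - start:.4f}s")
```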

  19. OntoGene web services for biomedical text mining

    PubMed Central

    2014-01-01

    Text mining services are rapidly becoming a crucial component of various knowledge management pipelines, for example in the process of database curation, or for exploration and enrichment of biomedical data within the pharmaceutical industry. Traditional architectures, based on monolithic applications, do not offer sufficient flexibility for a wide range of use case scenarios, and therefore open architectures, as provided by web services, are attracting increased interest. We present an approach towards providing advanced text mining capabilities through web services, using a recently proposed standard for textual data interchange (BioC). The web services leverage a state-of-the-art platform for text mining (OntoGene) which has been tested in several community-organized evaluation challenges, with top ranked results in several of them. PMID:25472638

  20. Finding, Browsing and Getting Data Easily Using SPDF Web Services

    NASA Technical Reports Server (NTRS)

    Candey, R.; Chimiak, R.; Harris, B.; Johnson, R.; Kovalick, T.; Lal, N.; Leckner, H.; Liu, M.; McGuire, R.; Papitashvili, N.; Roberts, A.

    2010-01-01

    The NASA GSFC Space Physics Data Facility (SPDF) provides heliophysics science-enabling information services for enhancing scientific research and enabling integration of these services into the Heliophysics Data Environment paradigm, via standards-based SOAP and Representational State Transfer (REST) web services in addition to web browser, FTP, and OPeNDAP interfaces. We describe these interfaces and the philosophies behind these web services, and show how to call them from various languages, such as IDL and Perl. We are working towards a "one simple line to call" philosophy extolled in the recent VxO discussions. Combining data from many instruments and missions enables broad research analysis and correlation and coordination with other experiments and missions.
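
    The record mentions calling these services from languages such as IDL and Perl; a comparable call in Python might look like the sketch below. The base URL, dataset identifier and parameter names are illustrative placeholders, not the documented SPDF endpoints.

      # Hypothetical REST request for time-series data; URL and parameters are
      # placeholders standing in for an SPDF-style web service call.
      import requests

      BASE = "https://example.nasa.gov/spdf-ws"          # illustrative, not the real host
      params = {
          "dataset": "EXAMPLE_DATASET",                  # placeholder dataset identifier
          "start": "2010-01-01T00:00:00Z",
          "stop": "2010-01-02T00:00:00Z",
          "format": "json",
      }
      resp = requests.get(f"{BASE}/data", params=params, timeout=60)
      resp.raise_for_status()
      print(list(resp.json().keys()))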

  1. Operational Use of OGC Web Services at the Met Office

    NASA Astrophysics Data System (ADS)

    Wright, Bruce

    2010-05-01

    The Met Office has adopted the Service-Orientated Architecture paradigm to deliver services to a range of customers through Rich Internet Applications (RIAs). The approach uses standard Open Geospatial Consortium (OGC) web services to provide information to web-based applications through a range of generic data services. "Invent", the Met Office beta site, is used to showcase Met Office future plans for presenting web-based weather forecasts, products and information to the public. This currently hosts a freely accessible Weather Map Viewer, written in JavaScript, which accesses a Web Map Service (WMS) to deliver innovative web-based visualizations of weather and its potential impacts to the public. The intention is to engage the public in the development of new web-based services that more accurately meet their needs. As the service is intended for public use within the UK, it has been designed to support a user base of 5 million, the analysed level of UK web traffic reaching the Met Office's public weather information site. The required scalability has been realised through the use of multi-tier tile caching: WMS requests are made for 256x256 tiles for fixed areas and zoom levels; a Tile Cache, developed in house, efficiently serves tiles on demand, managing WMS requests for new tiles; Edge Servers, externally hosted by Akamai, provide a highly scalable (UK-centric) service for pre-cached tiles, passing new requests to the Tile Cache; and the Invent Weather Map Viewer uses the Google Maps API to request tiles from Edge Servers. (We would expect to make use of the Web Map Tiling Service, when it becomes an OGC standard.) The Met Office delivers specialist commercial products to market sectors such as transport, utilities and defence, which exploit a Web Feature Service (WFS) for data relating forecasts and observations to specific geographic features, and a Web Coverage Service (WCS) for sub-selections of gridded data. These are locally rendered as maps or
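
    A WMS GetMap request of the kind served through the tile cache can be issued with the open-source OWSLib package, as in the hedged sketch below; the service URL, layer name and bounding box are placeholders, not the Met Office endpoints.

      # Hedged sketch of a WMS GetMap request using OWSLib; the URL, layer and
      # bounding box are illustrative placeholders.
      from owslib.wms import WebMapService

      wms = WebMapService("https://example.org/wms", version="1.1.1")  # placeholder URL
      img = wms.getmap(
          layers=["precipitation_rate"],        # hypothetical layer name
          srs="EPSG:4326",
          bbox=(-11.0, 49.0, 2.0, 61.0),        # roughly the UK, for illustration
          size=(256, 256),                      # one 256x256 tile, as in the abstract
          format="image/png",
          transparent=True,
      )
      with open("tile.png", "wb") as out:
          out.write(img.read())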

  2. SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services

    PubMed Central

    Gessler, Damian DG; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T

    2009-01-01

    Background SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. Results There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remaining are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at , developer tools at , and a portal to third-party ontologies at (a "swap meet"). Conclusion SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation. SSWAP is novel by establishing

  3. Web Service for Positional Quality Assessment: The WPS Tier

    NASA Astrophysics Data System (ADS)

    Xavier, E. M. A.; Ariza-López, F. J.; Ureña-Cámara, M. A.

    2015-08-01

    In the field of spatial data, more and more information becomes available every day, but we still have very little information about its quality. We consider the automation of spatial data quality assessment a true need for the geomatics sector, and that automation is possible by means of web processing services (WPS) and the application of specific assessment procedures. In this paper we propose and develop a WPS tier centered on the automation of positional quality assessment. An experiment using the NSSDA positional accuracy method is presented. In the experiment, the client uploads two datasets (reference and evaluation data); the processing determines homologous pairs of points (by distance) and calculates the positional accuracy value under the NSSDA standard, and a small report is generated and sent to the client. From our experiment, we reached some conclusions on the advantages and disadvantages of WPSs when applied to the automation of spatial data accuracy assessments.
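
    For readers unfamiliar with the NSSDA statistic used in the experiment, the hedged Python sketch below computes horizontal RMSE from homologous point pairs and the 95% NSSDA accuracy value, under the usual assumption that the x and y error components are of similar magnitude; the sample coordinates are invented for illustration.

      # NSSDA horizontal accuracy from homologous (reference, evaluated) point pairs.
      # Sample coordinates are invented; 1.7308 is the NSSDA factor applicable when
      # RMSE_x and RMSE_y are approximately equal.
      import math

      pairs = [  # ((x_ref, y_ref), (x_eval, y_eval)) in metres, illustrative values
          ((100.0, 200.0), (100.4, 199.7)),
          ((150.0, 260.0), (149.8, 260.5)),
          ((210.0, 310.0), (210.3, 309.6)),
      ]

      sq_dx = [(xe - xr) ** 2 for (xr, _), (xe, _) in pairs]
      sq_dy = [(ye - yr) ** 2 for (_, yr), (_, ye) in pairs]

      rmse_x = math.sqrt(sum(sq_dx) / len(pairs))
      rmse_y = math.sqrt(sum(sq_dy) / len(pairs))
      rmse_r = math.sqrt(rmse_x ** 2 + rmse_y ** 2)

      nssda_95 = 1.7308 * rmse_r  # horizontal accuracy at the 95% confidence level
      print(f"RMSE_r = {rmse_r:.3f} m, NSSDA accuracy = {nssda_95:.3f} m")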

  4. CMS data quality monitoring web service

    NASA Astrophysics Data System (ADS)

    Tuura, L.; Eulisse, G.; Meyer, A.

    2010-04-01

    A central component of the data quality monitoring system of the CMS experiment at the Large Hadron Collider is a web site for browsing data quality histograms. The production servers in data taking provide access to several hundred thousand histograms per run, both live in online as well as for up to several terabytes of archived histograms for the online data taking, Tier-0 prompt reconstruction, prompt calibration and analysis activities, for re-reconstruction at Tier-1s and for release validation. At the present usage level the servers currently handle in total around a million authenticated HTTP requests per day. We describe the main features and components of the system, our implementation for web-based interactive rendering, and the server design. We give an overview of the deployment and maintenance procedures. We discuss the main technical challenges and our solutions to them, with emphasis on functionality, long-term robustness and performance.

  5. The Role of Libraries in Web-Based Distance Education: An Account and an Analysis of the Impact of Web Technology on Distance Learning--What Remains Unchanged, What Is Changing

    ERIC Educational Resources Information Center

    Cooke, Nicole A.

    2004-01-01

    Even though distance education has a long and diverse history, dating back to 1840, in the last ten-to-fifteen years it has been completely transformed by the emergence of Web-based technology. This technology has had an enormous impact on all aspects of distance education (or distance learning as it is increasingly called). In addition to…

  6. Data Mining Web Services for Science Data Repositories

    NASA Astrophysics Data System (ADS)

    Graves, S.; Ramachandran, R.; Keiser, K.; Maskey, M.; Lynnes, C.; Pham, L.

    2006-12-01

    The maturation of web services standards and technologies sets the stage for a distributed "Service-Oriented Architecture" (SOA) for NASA's next generation science data processing. This architecture will allow members of the scientific community to create and combine persistent distributed data processing services and make them available to other users over the Internet. NASA has initiated a project to create a suite of specialized data mining web services designed specifically for science data. The project leverages the Algorithm Development and Mining (ADaM) toolkit as its basis. The ADaM toolkit is a robust, mature and freely available science data mining toolkit that is being used by several research organizations and educational institutions worldwide. These mining services will give the scientific community a powerful and versatile data mining capability that can be used to create higher order products such as thematic maps from current and future NASA satellite data records with methods that are not currently available. The package of mining and related services is being developed using Web Services standards so that community-based measurement processing systems can access and interoperate with them. These standards-based services allow users different options for utilizing them, from direct remote invocation by a client application to deployment of a Business Process Execution Language (BPEL) solutions package where a complex data mining workflow is exposed to others as a single service. The ability to deploy and operate these services at a data archive allows the data mining algorithms to be run where the data are stored, a more efficient scenario than moving large amounts of data over the network. This will be demonstrated in a scenario in which a user uses a remote Web-Service-enabled clustering algorithm to create cloud masks from satellite imagery at the Goddard Earth Sciences Data and Information Services Center (GES DISC).

  7. Creating Web-based, multimedia, and interactive courses for distance learning.

    PubMed

    Mills, A C

    2000-01-01

    A case study describes how faculty at Saint Louis University School of Nursing have developed computer-based, multimedia courses for master's and post-master's nursing education. Lectures with slide presentations are recorded digitally and encoded for multimedia streaming over the World Wide Web for distance learning. The technology is explained in detail in terms of the specific technologies used for lecture and course development. How this technology supports graduate student learning is also presented. PMID:10835811

  8. A Method of EC Model Implementation Using Web Service Functions

    NASA Astrophysics Data System (ADS)

    Kurihara, Jun; Koizumi, Hisao; Ishikawa, Toshiyuki; Dasai, Takashi

    In recent years, advances in computer and communication technology and the associated rapid increase in the number of Internet users are encouraging advances in Electronic Commerce (EC). Business models of EC are being actively developed by many different enterprises and engineers, and implemented in many kinds of fields. Meanwhile Web services that reuse remote components over the Internet are drawing attention. Web services are based on SOAP/WSDL/UDDI and are given an important position as the infrastructure of the EC systems. The article analyzes the functions and structures of various business models, establishing the patterns of their distinctive and common features, and proposes a method of determining the implementation specifications of business models utilizing these patterns and Web service functions. This method has been applied to a parts purchasing system, which is a typical pattern of the B to B (Business to Business) EC applications. The article also discusses the results of evaluating this prototype system.

  9. A resource-oriented web service for environmental modeling

    NASA Astrophysics Data System (ADS)

    Ferencik, Ioan

    2013-04-01

    Environmental modeling is a widely adopted practice in the study of natural phenomena. Environmental models can be difficult to build and use, and thus sharing them within the community is an important aspect. The most common approach to sharing a model is to expose it as a web service. In practice, interaction with such a web service is cumbersome due to the lack of a standardized contract and the complexity of the model being exposed. In this work we investigate the use of a resource-oriented approach to exposing environmental models as web services. We view a model as a layered resource built atop the object concept from Object-Oriented Programming, augmented with persistence capabilities provided by an embedded object database to keep track of its state, and implementing the four basic principles of resource-oriented architectures: addressability, statelessness, representation and uniform interface. For the implementation we use exclusively open-source software: the Django framework, the dyBase object-oriented database and the Python programming language. We developed a generic framework of resources structured into a hierarchy of types and then extended this typology with resources specific to the domain of environmental modeling. To test our web service we used cURL, a robust command-line web client.
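
    The four principles listed in the abstract (addressability, statelessness, representation, uniform interface) can be illustrated with a minimal stand-alone Python HTTP server. The resource layout and the model payload below are invented for illustration and are not the authors' Django/dyBase implementation.

      # Minimal resource-oriented sketch: each model run is an addressable resource
      # with a uniform GET interface and a JSON representation. Illustrative only.
      import json
      from http.server import BaseHTTPRequestHandler, HTTPServer

      MODELS = {  # toy in-memory "archive" standing in for the object database
          "hydro-1": {"state": "finished", "outputs": {"runoff_mm": 42.0}},
          "hydro-2": {"state": "running", "outputs": {}},
      }

      class ModelResource(BaseHTTPRequestHandler):
          def do_GET(self):
              # Addressability: /models/<id> identifies exactly one model resource.
              parts = self.path.strip("/").split("/")
              if len(parts) == 2 and parts[0] == "models" and parts[1] in MODELS:
                  body = json.dumps(MODELS[parts[1]]).encode()
                  self.send_response(200)
                  self.send_header("Content-Type", "application/json")
                  self.end_headers()
                  self.wfile.write(body)  # JSON representation of the resource state
              else:
                  self.send_response(404)
                  self.end_headers()

      if __name__ == "__main__":
          HTTPServer(("localhost", 8000), ModelResource).serve_forever()

    Once running, the resource can be exercised with cURL, e.g. curl http://localhost:8000/models/hydro-1, mirroring the testing approach mentioned in the abstract.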

  10. Exploring Education Major Focused Adult Learners' Perspectives and Practices of Web-Based Distance Education in Sixteen Universities

    ERIC Educational Resources Information Center

    Zhang, Jing

    2009-01-01

    Distance education is not a new concept for all kinds of learners in the modern societies. Many researchers have studied traditional distance education programs for adult learners in the past, but little research has been done on Web-based distance education (WBDE) for adult learners. There are also many popular online universities in the U.S. or…

  11. Web-based health services and clinical decision support.

    PubMed

    Jegelevicius, Darius; Marozas, Vaidotas; Lukosevicius, Arunas; Patasius, Martynas

    2004-01-01

    The purpose of this study was the development of a Web-based e-health service for comprehensive assistance and clinical decision support. The service structure consists of a Web server, a PHP-based Web interface linked to a clinical SQL database, Java applets for interactive manipulation and visualization of signals and a Matlab server linked with signal and data processing algorithms implemented by Matlab programs. The service ensures diagnostic signal- and image-analysis-based clinical decision support. By using the discussed methodology, a pilot service for pathology specialists for automatic calculation of the proliferation index has been developed. Physicians use a simple Web interface for uploading the pictures under investigation to the server; subsequently a Java applet interface is used for outlining the region of interest and, after processing on the server, the requested proliferation index value is calculated. There is also an "expert corner", where experts can submit their index estimates and comments on particular images, which is especially important for system developers. These expert evaluations are used for optimization and verification of automatic analysis algorithms. Decision support trials have been conducted for ECG and ophthalmology ultrasonic investigations of intraocular tumor differentiation. Data mining algorithms have been applied and decision support trees constructed. These services are also being implemented as a Web-based system. The study has shown that the Web-based structure ensures more effective, flexible and accessible services compared with standalone programs and is very convenient for biomedical engineers and physicians, especially in the development phase. PMID:15718591

  12. Web services for open meteorological data in British Columbia

    NASA Astrophysics Data System (ADS)

    Hiebert, J.; Anslow, F. S.

    2012-12-01

    Until recently, British Columbia suffered from a dearth of publicly and easily accessible (open) meteorological data. While Environment Canada (EC) maintains approximately 250 active in situ weather stations, the remaining meteorological and climate data -- which represent the majority of observations made in the province -- have been gathered by the provincial government within several disparate, ministry-specific networks. Those observations have traditionally been either inaccessible to non-government employees or only available on a network-by-network basis by contacting network managers and requesting custom data queries. Under a collaborative agreement between several provincial ministries, private industry and the Pacific Climate Impacts Consortium (PCIC) and with support from EC, the entire province's meteorological archive has been collected into a single database at PCIC and made publicly accessible via web services and open data protocols. In this paper, we describe our web services, built on open-source software, which provide users access to the full catalogue of BC's meteorological observations through a simple user interface. Our geographic web services provide users access to station locations using Open Geospatial Consortium's Web Mapping Service and Web Feature Service protocols. We use OpenDAP to provide users download access to over a century of weather observations through a variety of open formats such as NetCDF, HDF, ASCII, and others. The goals of these web services are twofold. We primarily aim to provide planners, scientists and researchers with timely and comprehensive climate data as conveniently and efficiently as possible. A natural consequence of this is to enable the flexibility to expand the volume and types of data served and to facilitate more sophisticated analysis regarding past and future climate.
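
    Station locations exposed through a Web Feature Service of the kind described here can be retrieved with OWSLib, as sketched below; the service URL and feature type name are placeholders rather than the actual PCIC endpoints.

      # Hedged sketch of a WFS GetFeature request with OWSLib; URL and typename
      # are illustrative placeholders for a station-location layer.
      from owslib.wfs import WebFeatureService

      wfs = WebFeatureService("https://example.org/geoserver/wfs", version="1.1.0")
      resp = wfs.getfeature(typename=["met:stations"],          # hypothetical layer
                            bbox=(-139.0, 48.0, -114.0, 60.0))  # roughly BC
      with open("stations.gml", "wb") as out:
          out.write(resp.read())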

  13. Deploying and sharing U-Compare workflows as web services

    PubMed Central

    2013-01-01

    Background U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare’s components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. Results We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. Conclusions The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform. PMID:23419017

  14. Consuming Web Services: A Yahoo! Newsfeed Reader

    ERIC Educational Resources Information Center

    Dadashzadeh, Mohammad

    2010-01-01

    Service Oriented Architecture (SOA) shows demonstrable signs of simplifying software integration. It provides the necessary framework for building applications that can be integrated and can reduce the cost of integration significantly. Organizations are beginning to architect new integration solutions following the SOA approach. As such,…

  15. ME: Multimodal Environment Based on Web Services Architecture

    NASA Astrophysics Data System (ADS)

    Caschera, Maria Chiara; D'Andrea, Alessia; D'Ulizia, Arianna; Ferri, Fernando; Grifoni, Patrizia; Guzzo, Tiziana

    Information, documents and knowledge for each person and for public and private organizations are fundamental in each activity, and they may be the products or services they provide or supply. The daily activities and decision-making processes are usually based on many different pieces of information, which could be handled on PDAs and mobile devices in general, or stored on laptop computers using a lot of different forms, such as spreadsheets, e-mail messages, Web information obtained as the result of a Google search or a query, and so on. Simulating and managing services and information in catastrophic events and emergencies does not represent an exception. This paper describes the Web Services architecture of the Multimodal collaborative knowledge oriented Environment (ME), a platform designed to manage data, information and services for catastrophic events such as earthquakes, floods and dangerous natural phenomena.

  16. WebGMAP: a web service for mapping and aligning cDNA sequences to genomes

    PubMed Central

    Liang, Chun; Liu, Lin; Ji, Guoli

    2009-01-01

    The genomes of thousands of organisms are being sequenced, often with accompanying sequences of cDNAs or ESTs. One of the great challenges in bioinformatics is to make these genomic sequences and genome annotations accessible in a user-friendly manner to general biologists to address interesting biological questions. We have created an open-access web service called WebGMAP (http://www.bioinfolab.org/software/webgmap) that seamlessly integrates cDNA-genome alignment tools, such as GMAP, with easy-to-use data visualization and mining tools. This web service is intended to facilitate community efforts in improving genome annotation, determining accurate gene structures and their variations, and exploring important biological processes such as alternative splicing and alternative polyadenylation. For routine sequence analysis, WebGMAP provides a web-based sequence viewer with many useful functions, including nucleotide positioning, six-frame translations, sequence reverse complementation, and imperfect motif detection and alignment. WebGMAP also provides users with the ability to sort, filter and search for individual cDNA sequences and cDNA-genome alignments. Our EST-Genome-Browser can display annotated gene structures and cDNA-genome alignments at scales from 100 to 50 000 nt. With its ability to highlight base differences between query cDNAs and the genome, our EST-Genome-Browser allows biologists to discover potential point or insertion-deletion variations from cDNA-genome alignments. PMID:19465381

  17. Academic Public Service Web Sites and the Future of Virtual Academic Public Service

    ERIC Educational Resources Information Center

    Cohn, Ellen; Hibbitts, Bernard

    2005-01-01

    Some faculty have started to use the Internet as a bridge to the public instead of merely to each other. Leveraging their specialist knowledge and their academic authority against perceived public needs, they have created another type of academic Web site on their institutional servers--the academic public service Web site (APSWS). APSWS is an…

  18. 3D medical volume reconstruction using web services.

    PubMed

    Kooper, Rob; Shirk, Andrew; Lee, Sang-Chul; Lin, Amy; Folberg, Robert; Bajcsy, Peter

    2008-04-01

    We address the problem of 3D medical volume reconstruction using web services. The use of the proposed web services is motivated by the fact that the problem of 3D medical volume reconstruction requires significant computer resources and human expertise in medical and computer science areas. Web services are implemented as an additional layer to a dataflow framework called Data to Knowledge. In the collaboration between UIC and NCSA, pre-processed input images at NCSA are made accessible to medical collaborators for registration. Every time UIC medical collaborators inspect images and select corresponding features for registration, the web service at NCSA is contacted and the registration processing query is executed using the Image to Knowledge library of registration methods. Co-registered frames are returned for verification by medical collaborators in a new window. In this paper, we present 3D volume reconstruction problem requirements and the architecture of the developed prototype system at http://isda.ncsa.uiuc.edu/MedVolume. We also explain the tradeoffs of our system design and provide experimental data to support our system implementation. The prototype system has been used for multiple 3D volume reconstructions of blood vessels and vasculogenic mimicry patterns in histological sections of uveal melanoma studied by fluorescent confocal laser scanning microscope. PMID:18336808

  19. Web services at the European Bioinformatics Institute-2009

    PubMed Central

    McWilliam, Hamish; Valentin, Franck; Goujon, Mickael; Li, Weizhong; Narayanasamy, Menaka; Martin, Jenny; Miyar, Teresa; Lopez, Rodrigo

    2009-01-01

    The European Bioinformatics Institute (EMBL-EBI) has been providing access to mainstream databases and tools in bioinformatics since 1997. In addition to the traditional web form based interfaces, APIs exist for core data resources such as EMBL-Bank, Ensembl, UniProt, InterPro, PDB and ArrayExpress. These APIs are based on Web Services (SOAP/REST) interfaces that allow users to systematically access databases and analytical tools. From the user's point of view, these Web Services provide the same functionality as the browser-based forms. However, using the APIs frees the user from web page constraints and is ideal for the analysis of large batches of data, performing text-mining tasks and the casual or systematic evaluation of mathematical models in regulatory networks. Furthermore, these services are widespread and easy to use; they require no prior knowledge of the technology and no more than basic experience in programming. In the following, we report on new and updated services as well as briefly describe planned developments to be made available during the course of 2009–2010. PMID:19435877
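
    Many of the EMBL-EBI analysis tools follow a submit/poll/retrieve pattern over REST; the hedged Python sketch below shows that pattern with a placeholder tool URL, since the concrete endpoints and parameter names vary by tool and have evolved since 2009.

      # Generic submit/poll/retrieve pattern for a REST analysis service.
      # The base URL and parameter names are illustrative placeholders.
      import time
      import requests

      BASE = "https://example.org/Tools/services/rest/exampletool"   # placeholder

      job_id = requests.post(f"{BASE}/run",
                             data={"email": "user@example.org",
                                   "sequence": "MKTAYIAKQR"}).text   # returns a job id

      while requests.get(f"{BASE}/status/{job_id}").text == "RUNNING":
          time.sleep(5)                                              # poll until done

      result = requests.get(f"{BASE}/result/{job_id}/out").text      # fetch the output
      print(result[:200])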

  20. A demanding web-based PACS supported by web services technology

    NASA Astrophysics Data System (ADS)

    Costa, Carlos M. A.; Silva, Augusto; Oliveira, José L.; Ribeiro, Vasco G.; Ribeiro, José

    2006-03-01

    In recent years, the ubiquity of web interfaces has pushed practically all PACS suppliers to develop client applications in which clinical practitioners can receive and analyze medical images, using conventional personal computers and Web browsers. However, due to security and performance issues, the utilization of these software packages has been restricted to Intranets. Paradigmatically, one of the most important advantages of digital image systems is to simplify the widespread sharing and remote access of medical data between healthcare institutions. This paper analyses the traditional PACS drawbacks that contribute to their reduced usage on the Internet and describes a PACS based on Web Services technology that supports a customized DICOM encoding syntax and a specific compression scheme, providing all historical patient data in a unique Web interface.

  1. HWA modelling web services for the IMPEx infrastructure

    NASA Astrophysics Data System (ADS)

    Kallio, Esa; Khodachenko, Maxim; Génot, Vincent; Schmidt, Walter; Häkkinen, Lasse; Jarvinen, Riku; Dyadechkin, Sergey; Pérez-Suárez, David; Topf, Florian; Al-Ubaidi, Tarek; Gangloff, Michel; Budnik, Elena; Bouchemit, Myriam; Bourrel, Natalyia; Penou, Emmanuel; André, Nicolas; Modolo, Ronan; Hess, Sebastien; Alexeev, Igor; Belenkaya, Elena

    2013-04-01

    The EU-FP7 Project "Integrated Medium for Planetary Exploration", IMPEx [1], was established as a result of scientific collaboration between institutions across Europe and is working on the integration of a set of interactive data analysis and modeling tools in the field of space plasma physics. These tools comprise numerical hybrid/MHD and analytical Paraboloid magnetospheric models from the simulation sector as well as tools from the data analysis and visualization sector (AMDA, ClWeb, 3DView). The basic feature of IMPEx consists in the connection of different data sources, including archived computational simulation results and observational data, in order to analyse and visualize scientific data by means of interactive web-based tools. In this presentation we introduce a web service, the Hybrid Web Archive, HWA [2], which enables access to the simulation runs made by the HYB and GUMICS models included in the IMPEx HMM (Hybrid and Magnetohydrodynamic Modelling) environment. The HYB hybrid model and the GUMICS MHD model enable studies of the solar wind interaction with planets, moons, asteroids and comets [2]. We also introduce web services which enable a connection between the HWA and observational data resources. Acknowledgment: IMPEx was funded by the European Commission under the 7th Framework Program, grant agreement no 262863 References: [1] http://impex-fp7.oeaw.ac.at [2] http://hwa.fmi.fi

  2. The Astrophysics Data System Web Services

    NASA Astrophysics Data System (ADS)

    Eichhorn, G.; Accomazzi, A.; Demleitner, M.; Grant, C. S.; Kurtz, M. J.; Murray, S. S.

    1999-12-01

    The Astrophysics Data System is a central part of the Distributed Digital Library for Astronomy. It provides access to most of the astronomical literature, as well as links to many different on-line information sources. The ADS Abstract Service provides a search interface to over 1.5 million references. The ADS Article Service provides access to the full journal articles for all major and most smaller journals, most of them back to volume 1. Links to on-line catalogs, electronic articles, astronomical object information and other data allow the user to quickly find on-line information. A reference and citation database provides information about article citations. We are currently working on greatly expanding the reference/citations database by including reference lists from the journals and by OCRing scanned reference lists. Between reference lists from the publishers and OCR'd reference lists we have recently added almost 1 million reference-citation pairs to the database. OCRing of the abstracts from scanned journal articles allowed us to add over 20,000 abstracts to the searchable database. Both these efforts will continue to add more data to our database. In the near future we will scan microfilms of publications from astronomical observatories, produced by a preservation project at the Harvard Library. This will provide unrestricted access to a large part of the 19th century astronomical literature.

  3. Pioneering a web-Based Museum in Taiwan: Design and Implementation of Lifelong Distance Learning of Science Education.

    ERIC Educational Resources Information Center

    Young, Shelley Shwu-Ching; Huang, Yi-Long; Jang, Jyh-Shing Roger

    2000-01-01

    Describes the development and implementation process of a Web-based science museum in Taiwan. Topics include use of the Internet; lifelong distance learning; museums and the Internet; objectives of the science museum; funding; categories of exhibitions; analysis of Web users; homepage characteristics; graphics and the effect on speed; and future…

  4. Towards Thematic Web Services for Generic Data Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Horanont, T.; Basa, M.; Shibasaki, R.

    2012-07-01

    Spatial analysis packages and thematic mapping are available in a number of traditional desktop GIS. However, visualizing thematic maps through the Internet is still limited to fixed content and restricted changes of the input data. Users with limited GIS knowledge, or people who do not own digital map data, normally have difficulty creating thematic maps from generic data. In this study, we developed thematic mapping services that can be applied to non-spatial data formats served through powerful map service solutions. Novice users who have no GIS software experience or no digital base map can simply input a plain text file with a location identifier field, such as a place name or gazetteer entry, to generate thematic maps online. We implemented a prototype by using web service standards recommended by the Open Geospatial Consortium (OGC), such as Web Map Service (WMS), Web Feature Service (WFS) and Styled Layer Descriptor (SLD), to provide a principle for communication and allow users to visualize spatial information as thematic maps. A great deal of effort has been devoted to making the initial steps of geospatial analysis and visualization accessible to novice users, including those with no past experience using Geographic Information Systems.

  5. Clinical Predictive Modeling Development and Deployment through FHIR Web Services

    PubMed Central

    Khalilia, Mohammed; Choi, Myung; Henderson, Amelia; Iyengar, Sneha; Braunstein, Mark; Sun, Jimeng

    2015-01-01

    Clinical predictive modeling involves two challenging tasks: model development and model deployment. In this paper we demonstrate a software architecture for developing and deploying clinical predictive models using web services via the Health Level 7 (HL7) Fast Healthcare Interoperability Resources (FHIR) standard. The services enable model development using electronic health records (EHRs) stored in OMOP CDM databases and model deployment for scoring individual patients through FHIR resources. The MIMIC2 ICU dataset and a synthetic outpatient dataset were transformed into OMOP CDM databases for predictive model development. The resulting predictive models are deployed as FHIR resources, which receive requests of patient information, perform prediction against the deployed predictive model and respond with prediction scores. To assess the practicality of this approach we evaluated the response and prediction time of the FHIR modeling web services. We found the system to be reasonably fast with one second total response time per patient prediction. PMID:26958207
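
    The FHIR REST conventions used by the described services can be exercised with plain HTTP; the hedged sketch below reads a Patient resource and posts data to a scoring endpoint, where the server base URL and the scoring path are assumptions for illustration, not the authors' deployment.

      # Hedged FHIR sketch: GET a Patient resource and POST data for scoring.
      # The base URL and the scoring path are illustrative placeholders.
      import requests

      FHIR_BASE = "https://example.org/fhir"              # placeholder FHIR server

      # Standard FHIR read interaction: GET [base]/Patient/[id]
      patient = requests.get(f"{FHIR_BASE}/Patient/123",
                             headers={"Accept": "application/fhir+json"}).json()
      print(patient.get("resourceType"), patient.get("id"))

      # Hypothetical prediction request against a model exposed as a FHIR-style service
      payload = {"resourceType": "Parameters",
                 "parameter": [{"name": "patient", "valueString": "Patient/123"}]}
      score = requests.post(f"{FHIR_BASE}/RiskAssessment/$predict",   # illustrative path
                            json=payload,
                            headers={"Content-Type": "application/fhir+json"})
      print(score.status_code)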

  6. Clinical Predictive Modeling Development and Deployment through FHIR Web Services.

    PubMed

    Khalilia, Mohammed; Choi, Myung; Henderson, Amelia; Iyengar, Sneha; Braunstein, Mark; Sun, Jimeng

    2015-01-01

    Clinical predictive modeling involves two challenging tasks: model development and model deployment. In this paper we demonstrate a software architecture for developing and deploying clinical predictive models using web services via the Health Level 7 (HL7) Fast Healthcare Interoperability Resources (FHIR) standard. The services enable model development using electronic health records (EHRs) stored in OMOP CDM databases and model deployment for scoring individual patients through FHIR resources. The MIMIC2 ICU dataset and a synthetic outpatient dataset were transformed into OMOP CDM databases for predictive model development. The resulting predictive models are deployed as FHIR resources, which receive requests of patient information, perform prediction against the deployed predictive model and respond with prediction scores. To assess the practicality of this approach we evaluated the response and prediction time of the FHIR modeling web services. We found the system to be reasonably fast with one second total response time per patient prediction. PMID:26958207

  7. A new reference implementation of the PSICQUIC web service.

    PubMed

    del-Toro, Noemi; Dumousseau, Marine; Orchard, Sandra; Jimenez, Rafael C; Galeota, Eugenia; Launay, Guillaume; Goll, Johannes; Breuer, Karin; Ono, Keiichiro; Salwinski, Lukasz; Hermjakob, Henning

    2013-07-01

    The Proteomics Standard Initiative Common QUery InterfaCe (PSICQUIC) specification was created by the Human Proteome Organization Proteomics Standards Initiative (HUPO-PSI) to enable computational access to molecular-interaction data resources by means of a standard Web Service and query language. Currently providing >150 million binary interaction evidences from 28 servers globally, the PSICQUIC interface allows the concurrent search of multiple molecular-interaction information resources using a single query. Here, we present an extension of the PSICQUIC specification (version 1.3), which has been released to be compliant with the enhanced standards in molecular interactions. The new release also includes a new reference implementation of the PSICQUIC server available to the data providers. It offers augmented web service capabilities and improves the user experience. PSICQUIC has been running for almost 5 years, with a user base growing from only 4 data providers to 28 (April 2013) allowing access to 151 310 109 binary interactions. The power of this web service is shown in PSICQUIC View web application, an example of how to simultaneously query, browse and download results from the different PSICQUIC servers. This application is free and open to all users with no login requirement (http://www.ebi.ac.uk/Tools/webservices/psicquic/view/main.xhtml). PMID:23671334
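
    PSICQUIC services accept MIQL queries over a simple REST search path and return tabular (MITAB) interaction records; the sketch below shows such a query in Python, with the service base URL left as a placeholder since each provider hosts its own endpoint.

      # Hedged PSICQUIC-style query: a MIQL term appended to a REST search path,
      # returning MITAB lines. The base URL is a placeholder for a real provider.
      import requests

      PSICQUIC_BASE = "https://example.org/psicquic/webservices/current"  # placeholder

      query = "BRCA2"                                    # MIQL query term
      resp = requests.get(f"{PSICQUIC_BASE}/search/query/{query}",
                          params={"firstResult": 0, "maxResults": 10})
      resp.raise_for_status()

      for line in resp.text.splitlines():
          columns = line.split("\t")                     # MITAB is tab-separated
          if len(columns) > 1:
              print(columns[0], columns[1])              # interactor A and B identifiers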

  8. Taking advantage of Google's Web-based applications and services.

    PubMed

    Brigham, Tara J

    2014-01-01

    Google is a company that is constantly expanding and growing its services and products. While most librarians possess a "love/hate" relationship with Google, there are a number of reasons you should consider exploring some of the tools Google has created and made freely available. Applications and services such as Google Docs, Slides, and Google+ are functional and dynamic without the cost of comparable products. This column will address some of the issues users should be aware of before signing up to use Google's tools, and a description of some of Google's Web applications and services, plus how they can be useful to librarians in health care. PMID:24735269

  9. [Consumer health web service at DIMDI].

    PubMed

    Hasky-Günther, K

    2004-10-01

    The German Institute of Medical Documentation and Information (DIMDI) extended its Internet services targeted at patients in order to meet the rising interest of the public for understandable, high-quality medical information. Medical terminology is made clear to nonprofessionals by voluminous reference books such as the Roche Encyclopedia of Medicine. By using free offers, such as the possibilities to search in up-to-date medical literature and studies, laypersons can find valuable and quality assured information on their fields of interest. Graphic and film material, which is offered in the virtual medical video shop (VVFM), covers the entire spectrum of specific medical fields and brings the areas of prevention, diagnostics, therapy, aftercare as well as nursing care up for discussion. It is easy to find physicians, hospitals, and self-help groups. Future plans include an extension of the offer to a substantial database-supported information portal for health-related subjects, which will provide the public with simple and speedy access to the health information of the DIMDI and other trustworthy providers under one interface. PMID:15490083

  10. Web services in the U.S. geological survey streamstats web application

    USGS Publications Warehouse

    Guthrie, J.D.; Dartiguenave, C.; Ries, Kernell G., III

    2009-01-01

    StreamStats is a U.S. Geological Survey Web-based GIS application developed as a tool for water-resources planning and management, engineering design, and other applications. StreamStats' primary functionality allows users to obtain drainage-basin boundaries, basin characteristics, and streamflow statistics for gaged and ungaged sites. Recently, Web services have been developed that give remote users and applications access to comprehensive GIS tools that are available in StreamStats, including delineating drainage-basin boundaries, computing basin characteristics, estimating streamflow statistics for user-selected locations, and determining point features that coincide with a National Hydrography Dataset (NHD) reach address. For the state of Kentucky, a Web service also has been developed that provides users the ability to estimate daily time series of drainage-basin average values of daily precipitation and temperature. The use of Web services allows the user to take full advantage of the datasets and processes behind the StreamStats application without having to develop and maintain them. © 2009 IEEE.

  11. Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service

    PubMed Central

    Hatano, Kenji; Ohe, Kazuhiko

    2003-01-01

    An information retrieval system for the Japanese Standard Disease-Code Master using XML Web Service has been developed. XML Web Service is a new distributed processing system based on standard Internet technologies. With the seamless remote method invocation of XML Web Service, users are able to get the latest disease-code master information from their rich desktop applications or Internet web sites that refer to this service. PMID:14728364

  12. The Knowledge Base as an Extension of Distance Learning Reference Service

    ERIC Educational Resources Information Center

    Casey, Anne Marie

    2012-01-01

    This study explores knowledge bases as extension of reference services for distance learners. Through a survey and follow-up interviews with distance learning librarians, this paper discusses their interest in creating and maintaining a knowledge base as a resource for reference services to distance learners. It also investigates their perceptions…

  13. 75 FR 57086 - Submission for Review: Federal Cyber Service: Scholarship for Service (SFS) Registration Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-17

    ... April 19, 2010 at 75 FR 20400, allowing for a 60-day public comment period. One comment was received... MANAGEMENT Submission for Review: Federal Cyber Service: Scholarship for Service (SFS) Registration Web Site... number of qualified students entering the fields of information assurance and computer security in...

  14. Augmenting Basic Web Data Services with Middleware Services to Facilitate Usability and Interoperability

    NASA Astrophysics Data System (ADS)

    Werpy, J.; Torbert, C.

    2014-12-01

    Over the past few years many Data Providers have implemented services that allow for web based (HTTP) interfaces to manipulate, organize, modify, and deliver Earth Science Data. This web architecture provides the foundation for streamlining Earth Science users' utilization of and interaction with the data. However, critical components are missing and need to be developed in order to increase the capabilities, potential, and reach of these services. Middleware services represent a class of Data Services that are able to communicate their capabilities more clearly and effectively to Science Data Users while also leveraging the more raw web services on the back end. A Middleware layer of a services architecture functions to coordinate the interactions of the users with the core web services. This simplifies execution, parameter selection, data integration, data delivery, and data analysis activities. This presentation will outline how the Land Processes Distributed Active Archive Center (LP DAAC) has utilized core services to provide basic access to data, data manipulation, and processing. Beyond that, the presentation will also detail the enhancement of those efforts through the development and implementation of Middleware to augment capabilities and create workflows necessary to enable Science Users to perform meaningful science activities and analysis faster than before. The Middleware layer acts as the "glue" that allows all these separate services to work together. By moving the algorithms that process and organize data closer to the Data Archive and enabling access to them via web services fronted by Middleware services, the LP DAAC helps Science Users to do better, less expensive, and more expansive science much faster than they ever could before.

  15. Secure password-based authenticated key exchange for web services

    SciTech Connect

    Liang, Fang; Meder, Samuel; Chevassut, Olivier; Siebenlist, Frank

    2004-11-22

    This paper discusses an implementation of an authenticated key-exchange method rendered on message primitives defined in the WS-Trust and WS-SecureConversation specifications. This IEEE-specified cryptographic method (AuthA) is provably secure for password-based authentication and key exchange, while WS-Trust and WS-SecureConversation are emerging Web Services Security specifications that extend the WS-Security specification. A prototype of the presented protocol is integrated in the WSRF-compliant Globus Toolkit V4. Further hardening of the implementation is expected to result in a version that will be shipped with future Globus Toolkit releases. This could help to address the current unavailability of decent shared-secret-based authentication options in the Web Services and Grid world. Future work will be to integrate One-Time-Password (OTP) features into the authentication protocol.

  16. GWASS: GRASS web application software system based on the GeoBrain web service

    NASA Astrophysics Data System (ADS)

    Qiu, Fang; Ni, Feng; Chastain, Bryan; Huang, Haiting; Zhao, Peisheng; Han, Weiguo; Di, Liping

    2012-10-01

    GRASS is a well-known geographic information system developed more than 30 years ago. As one of the earliest GIS systems, GRASS has survived mainly as free, open-source desktop GIS software, with users primarily limited to the research community or to programmers who use it to create customized functions. To allow average GIS end users to continue taking advantage of this widely-used software, we developed a GRASS Web Application Software System (GWASS), a distributed, web-based, multi-tiered Geospatial Information System (GIS) built on top of the GeoBrain web service, a project sponsored by NASA using the latest service-oriented architecture (SOA). This SOA-enabled system offers an effective and practical alternative to current commercial desktop GIS solutions. With GWASS, all geospatial processing and analyses are conducted by the server, so users are not required to install any software at the client side, which reduces the cost of access for users. The only resource needed to use GWASS is access to the Internet, and anyone who knows how to use a web browser can operate the system. The SOA framework is revitalizing GRASS as a new means to bring powerful geospatial analysis and resources to more users with concurrent access.

  17. The Evolution of a National Distance Guidance Service: Trends and Challenges

    ERIC Educational Resources Information Center

    Watts, A. G.; Dent, Gareth

    2008-01-01

    Three trends in the evolution of the UK Learndirect advice service are identified: the partial migration from telephone to web-based services; the trend within the telephone service from information/advice-oriented interventions to more guidance-oriented interventions; and the move from a mainly learning-oriented service to a more career-oriented…

  18. Oceanic satellite data service system based on web

    NASA Astrophysics Data System (ADS)

    Kang, Yan; Pan, Delu; He, Xianqiang; Wang, Difeng; Chen, Jianyu; Chen, Xiaoyan

    2011-11-01

    Ocean satellite observations are increasingly important for studying global change, protecting ocean resources and supporting ocean engineering, thanks to their large-area coverage and high observation frequency; they have already given us a global view of ocean environment parameters, including sea surface temperature, ocean color, wind, waves, sea level and sea ice. China has made great progress in ocean environment remote sensing over the last couple of years. These data are widely used for a variety of applications in ocean environment studies, coastal water quality monitoring, fishery resources protection, development and utilization of fishery resources, coastal engineering and oceanography. However, the data have offered no online information access and dissemination, no online visualization and browsing, and no online query and analysis capability. To facilitate the application of the data and to help disseminate them, a web-based service system has been developed. The system provides capabilities for online access, query, visualization and analysis of oceanic satellite information. It disseminates oceanic satellite data to users via real-time retrieval, processing and publishing through standards-based geospatial web services. A region of interest can also be exported directly to Google Earth for display or downloaded. This web service system greatly improves accessibility, interoperability, usability, and visualization of oceanic satellite data without any client-side software installation.

  19. We Cannot See Them, but They Are There: Marketing Library Services to Distance Learners

    ERIC Educational Resources Information Center

    Dermody, Melinda

    2005-01-01

    Distance learners are a unique target-population for the marketing of library services and resources. Because these patrons do not visit the library often, if at all, it is crucial to actively promote the library resources and services available to them. Marketing strategies for distance learning library services need to take a multifaceted…

  20. MyLibrary@LANL: proximity and semi-metric networks for a collaborative and recommender web service.

    SciTech Connect

    Rocha, L. M.; Simas, T.; Rechtsteiner, A.; DiGiacomo, M.; Luce, R. E.

    2005-01-01

    We describe a network approach to building recommendation systems for a WWW service. We employ two different types of weighted graphs in our analysis and development: Proximity graphs, a type of Fuzzy Graphs based on a co-occurrence probability, and semi-metric distance graphs, which do not observe the triangle inequality of Euclidean distances. Both types of graphs are used to develop intelligent recommendation and collaboration systems for the MyLibrary@LANL web service, a user-centered front-end to the Los Alamos National Laboratory's (LANL) digital library collections and WWW resources.
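
    A distance graph is semi-metric at a pair of nodes when some indirect path is shorter than the direct edge, i.e. the triangle inequality fails; the toy Python sketch below computes this ratio with a small Floyd-Warshall pass. The example graph is invented for illustration and is unrelated to the MyLibrary data.

      # Toy semi-metric check: compare direct edge distances with all-pairs shortest
      # paths (Floyd-Warshall); a ratio > 1 marks a semi-metric pair. Illustrative data.
      INF = float("inf")
      nodes = ["a", "b", "c"]
      direct = {("a", "b"): 5.0, ("b", "a"): 5.0,   # direct distance graph
                ("b", "c"): 1.0, ("c", "b"): 1.0,
                ("a", "c"): 1.0, ("c", "a"): 1.0}

      # all-pairs shortest-path distances
      dist = {(i, j): (0.0 if i == j else direct.get((i, j), INF))
              for i in nodes for j in nodes}
      for k in nodes:
          for i in nodes:
              for j in nodes:
                  dist[i, j] = min(dist[i, j], dist[i, k] + dist[k, j])

      for (i, j), d in direct.items():
          ratio = d / dist[i, j]      # semi-metric ratio: direct / shortest indirect
          if ratio > 1:
              print(f"{i}-{j}: direct {d} > shortest path {dist[i, j]} (ratio {ratio:.2f})")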

  1. Application of the Open Geospatial Consortium (OGC) Web Processing Service (WPS) Standard for Exposing Water Models as Web Services

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Castronova, A. M.; Huynh, N.; Caicedo, J. M.

    2012-12-01

    Management of water systems often requires the integration of data and models across a range of sources and disciplinary expertise. Service-Oriented Architectures (SOA) have emerged as a powerful paradigm for providing this integration. Including models within a SOA presents challenges because services are not well suited for applications that require state management and large data transfers. Despite these challenges, thoughtful inclusion of models as resources within a SOA could have distinct advantages that center on the idea of abstracting complex computer hardware and software from service consumers while, at the same time, providing powerful resources to client applications. With these advantages and challenges of using models within SOA in mind, this work explores the potential of a modeling service standard as a means for integrating models as resources within SOA. Specifically, we investigate the use of the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard for exposing models as web services. Through extension of a Python-based implementation of WPS (called pyWPS), we present a demonstration of the methodology through a case study involving a storm event that floods roads and disrupts travel in Columbia, SC. The case study highlights the benefit of an urban infrastructure system with its various subsystems (stormwater, transportation, and structures) interacting and exchanging data seamlessly.
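
    A WPS process wraps a computation behind the standard Execute and DescribeProcess operations; the sketch below shows what a minimal process might look like in the style of the pyWPS 4.x API. The paper used an earlier pyWPS version, so the class names, I/O identifiers and the toy runoff formula here are assumptions for illustration, not the authors' flood model.

      # Minimal WPS process in the style of pyWPS 4.x; input/output names and the
      # runoff formula are invented for illustration.
      from pywps import Process, LiteralInput, LiteralOutput

      class SimpleRunoff(Process):
          def __init__(self):
              super().__init__(
                  self._handler,
                  identifier="simple_runoff",
                  title="Toy runoff estimate",
                  inputs=[LiteralInput("rainfall_mm", "Rainfall depth", data_type="float"),
                          LiteralInput("runoff_coeff", "Runoff coefficient", data_type="float")],
                  outputs=[LiteralOutput("runoff_mm", "Runoff depth", data_type="float")],
              )

          def _handler(self, request, response):
              rain = request.inputs["rainfall_mm"][0].data
              coeff = request.inputs["runoff_coeff"][0].data
              response.outputs["runoff_mm"].data = rain * coeff  # rational-method style
              return response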

  2. SAS- Semantic Annotation Service for Geoscience resources on the web

    NASA Astrophysics Data System (ADS)

    Elag, M.; Kumar, P.; Marini, L.; Li, R.; Jiang, P.

    2015-12-01

    There is a growing need for increased integration across the data and model resources that are disseminated on the web to advance their reuse across different earth science applications. Meaningful reuse of resources requires semantic metadata to realize the semantic web vision of allowing pragmatic linkage and integration among resources. Semantic metadata associates standard metadata with resources to turn them into semantically-enabled resources on the web. However, the lack of a common standardized metadata framework, as well as the uncoordinated use of metadata fields across different geo-information systems, has led to a situation in which standards and related Standard Names abound. To address this need, we have designed SAS to provide a bridge between the core ontologies required to annotate resources and information systems, in order to enable queries and analysis over annotations from a single environment (the web). SAS is one of the services provided by the Geosemantic framework, a decentralized semantic framework that supports the integration between models and data and allows semantically heterogeneous resources to interact with minimal human intervention. Here we present the design of SAS and demonstrate its application for annotating data and models. First we describe how predicates and their attributes are extracted from standards and ingested into the knowledge base of the Geosemantic framework. Then we illustrate the application of SAS in annotating data managed by SEAD and annotating simulation models that have a web interface. SAS is a step in a broader approach to raise the quality of geoscience data and models published on the web and to allow users to better search, access, and use existing resources based on standard vocabularies that are encoded and published using semantic technologies.
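
    Associating a standard name with a resource amounts to publishing a few RDF triples; the hedged sketch below does this with rdflib, where the namespace, the standard-name literal and the dataset URL are invented placeholders rather than the vocabulary actually used by SAS.

      # Hedged semantic-annotation sketch with rdflib: attach a standard variable
      # name to a dataset resource. All URIs below are illustrative placeholders.
      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import DCTERMS, RDF

      EX = Namespace("https://example.org/geosemantics#")          # placeholder vocabulary

      g = Graph()
      dataset = URIRef("https://example.org/data/discharge-2014")  # placeholder resource

      g.add((dataset, RDF.type, EX.Dataset))
      g.add((dataset, DCTERMS.title, Literal("Daily river discharge, 2014")))
      g.add((dataset, EX.standardName,
             Literal("water_volume_transport_in_river_channel")))  # illustrative name

      print(g.serialize(format="turtle"))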

  3. Leveling the Playing Field for Users with Web Services

    NASA Astrophysics Data System (ADS)

    Trabant, C. M.; Ahern, T. K.; Karstens, R.; Weertman, B.; Suleiman, Y. Y.

    2013-12-01

    The dawn of digital seismological data recording began approximately 4 decades ago. Since then, multiple networks of seismological recording stations have existed and continue to operate. It is common for each network to operate a data center to store and distribute the collected data. Increasingly there are data centers that archive and distribute data produced by multiple networks and organizations. The modern landscape for seismological data users consists of many data centers spread across the globe offering a variety of data. Luckily most of these centers exchange data in standard formats defined by the International Federation of Digital Seismograph Networks (FDSN). Working with our partners in the FDSN, the IRIS Data Management Center (DMC) developed specifications for 3 standard web service interfaces that are intended to provide an abstraction layer on each center's customized data management system. These services provide access to seismological time series data, related metadata and event (earthquake) parameters. An important part of the interface design is to adhere to web standards and common conventions, which allows use of ubiquitous web client software and toolkits. Another critical design criterion is simple usage: we recognize that our user base is scientific data consumers and not necessarily technologists. The IRIS DMC has implemented each of these 3 service interfaces and made the common software components freely available. Under the NSF's EarthScope program and within the international COOPEUS project, the DMC worked with European partners to help install these standardized interfaces on their own data management systems. One key development was the addition of these web services to the SeisComP3 data handling system, which is common in many seismological data centers, especially in Europe. The combination of standardized data formats and access interfaces removes the need for complex request brokers that translate between centers. Instead, it allows
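
    The standardized FDSN interfaces described here can be used directly from ObsPy's FDSN client, as in the sketch below; the network, station and time window are arbitrary example values chosen for illustration.

      # Fetch waveforms and station metadata through standard FDSN web services
      # using ObsPy; the chosen network/station/time window are example values.
      from obspy import UTCDateTime
      from obspy.clients.fdsn import Client

      client = Client("IRIS")                     # any FDSN-compliant data centre
      t0 = UTCDateTime("2013-01-01T00:00:00")

      stream = client.get_waveforms(network="IU", station="ANMO",
                                    location="00", channel="BHZ",
                                    starttime=t0, endtime=t0 + 600)
      inventory = client.get_stations(network="IU", station="ANMO", level="channel")

      print(stream)
      print(inventory)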

  4. Stakeholder Expectations of Service Quality in a University Web Portal

    NASA Astrophysics Data System (ADS)

    Tate, Mary; Evermann, Joerg; Hope, Beverley; Barnes, Stuart

    Online service quality is a much-studied concept. There is considerable evidence that user expectations and perceptions of self-service and online service quality differ across business domains. In addition, the nature of online services is continually changing, and universities have been at the forefront of this change, with university websites increasingly acting as portals for a wide range of online transactions involving many different stakeholders. In this qualitative study, we conduct focus groups with a range of stakeholders in a university web portal. Our study offers a number of insights into the changing nature of the relationship between organisations and customers. New technologies are influencing customer expectations. Customers increasingly expect organisations to have integrated information systems and to utilise new technologies such as SMS and web portals. Organisations can be slow to adopt a customer-centric viewpoint, and persist in providing interfaces that are inconsistent or require inside knowledge of organisational structures and processes. This has a negative effect on customer perceptions.

  5. Semantic Web Service Framework to Intelligent Distributed Manufacturing

    SciTech Connect

    Kulvatunyou, Boonserm

    2005-12-01

    As markets become unexpectedly turbulent with shortened product life cycles and a power shift towards buyers, the need for methods to develop products, production facilities, and supporting software rapidly and cost-effectively is becoming urgent. The use of a loosely integrated, virtual-enterprise-based framework holds the potential to survive changing market needs. However, its success requires reliable and large-scale interoperation among trading partners via a semantic web of trading partners' services whose properties, capabilities, and interfaces are encoded in an unambiguous, computer-understandable form. This paper demonstrates a promising approach to integration and interoperation between a design house and a manufacturer that may or may not have a prior relationship, by developing semantic web services for business and engineering transactions. To this end, detailed activity and information flow diagrams are developed, in which the two trading partners exchange messages and documents. The properties and capabilities of the manufacturer sites are defined using the DARPA Agent Markup Language (DAML) ontology definition language. The prototype development of semantic web services shows that enterprises can interoperate widely in an unambiguous and autonomous manner. This contributes towards the realization of virtual enterprises at a low cost.

  6. Geovisualization in the HydroProg web map service

    NASA Astrophysics Data System (ADS)

    Spallek, Waldemar; Wieczorek, Malgorzata; Szymanowski, Mariusz; Niedzielski, Tomasz; Swierczynska, Malgorzata

    2016-04-01

    The HydroProg system, built at the University of Wroclaw (Poland) in the frame of research project no. 2011/01/D/ST10/04171 financed by the National Science Centre of Poland, has been designed to compute predictions of river stages in real time on the basis of multimodelling. This experimental system works on the upper Nysa Klodzka basin (SW Poland) above the gauge in the town of Bardo, with a catchment area of 1744 square kilometres. The system operates in association with the Local System for Flood Monitoring of Klodzko County (LSOP), and produces hydrograph forecasts as well as inundation predictions. To present the up-to-date predictions and their statistics online, a dedicated real-time web map service has been designed. Geovisualisation in the HydroProg map service comprises interactive maps of the study area, interactive spaghetti hydrographs of water-level forecasts along with observed river stages, and animated images of inundation. The LSOP network offers a high spatial and temporal resolution of observations, as the sampling interval is 15 minutes. The main environmental elements related to hydrological modelling are shown on the main map. These include elevation data (hillshading and hypsometric tints), rivers and reservoirs, and catchment boundaries. Furthermore, we added main towns, roads, and political and administrative boundaries for better map understanding. The web map was designed as a multi-scale representation, with levels of detail and zooming according to the scales 1:100 000, 1:250 000 and 1:500 000. Observations of water level in LSOP are shown on interactive hydrographs for each gauge. Additionally, predictions and some of their statistical characteristics (such as prediction errors and Nash-Sutcliffe efficiency) are shown for selected gauges. Finally, predictions of inundation are presented on animated maps which have been added for four experimental sites. The HydroProg system is a strictly

  7. Satellite Technologies and Services: Implications for International Distance Education.

    ERIC Educational Resources Information Center

    Stahmer, Anna

    1987-01-01

    This examination of international distance education and open university applications of communication satellites at the postsecondary level notes activities in less developed countries (LDCs); presents potential models for cooperation; and describes technical systems for distance education, emphasizing satellite technology and possible problems…

  8. Web based aphasia test using service oriented architecture (SOA)

    NASA Astrophysics Data System (ADS)

    Voos, J. A.; Vigliecca, N. S.; Gonzalez, E. A.

    2007-11-01

    Based on an aphasia test for Spanish speakers that analyzes the patient's basic resources of verbal communication, web-enabled software was developed to automate its execution. A clinical database was designed as a complement, in order to evaluate the antecedents (risk factors, pharmacological and medical backgrounds, neurological or psychiatric symptoms, anatomical and physiological characteristics of brain injury, etc.) which are necessary to carry out a multi-factor statistical analysis in different samples of patients. The automated test was developed following a service-oriented architecture and implemented in a web site that contains a test suite, which would allow both integrating the aphasia test with other neuropsychological instruments and increasing the information available on the site for scientific research. The test design, the database, and the study of its psychometric properties (validity, reliability and objectivity) were carried out in conjunction with neuropsychological researchers, who participated actively in the software design based on feedback from the patients and other subjects of investigation.

  9. A Security Architecture for Grid-enabling OGC Web Services

    NASA Astrophysics Data System (ADS)

    Angelini, Valerio; Petronzio, Luca

    2010-05-01

    In the proposed presentation we describe an architectural solution for enabling secure access to Grids, and possibly other large-scale on-demand processing infrastructures, through OGC (Open Geospatial Consortium) Web Services (OWS). This work has been carried out in the context of the security thread of the G-OWS Working Group. G-OWS (gLite enablement of OGC Web Services) is an international open initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII Project Consortia in order to collect and coordinate experiences in the enablement of OWSs on top of the gLite Grid middleware. G-OWS investigates the problem of the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity in order to enable Earth Science applications and tools. Concerning security issues, the integration of OWS-compliant infrastructures and gLite Grids needs to address relevant challenges, due to their respective design principles. In fact, OWSs are part of a Web-based architecture that delegates security aspects to other specifications, whereas the gLite middleware implements the Grid paradigm with a strong security model (the gLite Grid Security Infrastructure: GSI). In our work we propose a Security Architectural Framework allowing the seamless use of Grid-enabled OGC Web Services through the federation of existing security systems (mostly web based) with the gLite GSI. This is made possible by mediating between different security realms, whose mutual trust is established in advance during the deployment of the system itself. Our architecture is composed of three different security tiers: the user's security system, a specific G-OWS security system, and the gLite Grid Security Infrastructure. Applying the separation-of-concerns principle, each of these tiers is responsible for controlling the access to a well-defined resource set, respectively: the user's organization resources, the geospatial resources and services, and the Grid

  10. Diy Geospatial Web Service Chains: Geochaining Make it Easy

    NASA Astrophysics Data System (ADS)

    Wu, H.; You, L.; Gui, Z.

    2011-08-01

    It is a great challenge for beginners to create, deploy and utilize a Geospatial Web Service Chain (GWSC). People in computer science are usually not familiar with geospatial domain knowledge. Geospatial practitioners may lack knowledge about web services and service chains. The end users may lack both. However, integrated visual editing interfaces, validation tools, and one-click deployment wizards may help to lower the learning curve and improve modelling skills, so beginners will have a better experience. GeoChaining is a GWSC modelling tool designed and developed based on these ideas. GeoChaining integrates visual editing, validation, deployment, execution, etc. into a unified platform. By employing a Virtual Globe, users can intuitively visualize raw data and results produced by GeoChaining. All of these features allow users to easily start using GWSC, regardless of their professional background and computer skills. Further, GeoChaining supports GWSC model reuse, meaning that an entire GWSC model, or even a specific part of one, can be directly reused in a new model. This greatly improves the efficiency of creating a new GWSC, and also contributes to the sharing and interoperability of GWSC.

  11. Free Factories: Unified Infrastructure for Data Intensive Web Services

    PubMed Central

    Zaranek, Alexander Wait; Clegg, Tom; Vandewege, Ward; Church, George M.

    2010-01-01

    We introduce the Free Factory, a platform for deploying data-intensive web services using small clusters of commodity hardware and free software. Independently administered virtual machines called Freegols give application developers the flexibility of a general purpose web server, along with access to distributed batch processing, cache and storage services. Each cluster exploits idle RAM and disk space for cache, and reserves disks in each node for high bandwidth storage. The batch processing service uses a variation of the MapReduce model. Virtualization allows every CPU in the cluster to participate in batch jobs. Each 48-node cluster can achieve 4-8 gigabytes per second of disk I/O. Our intent is to use multiple clusters to process hundreds of simultaneous requests on multi-hundred terabyte data sets. Currently, our applications achieve 1 gigabyte per second of I/O with 123 disks by scheduling batch jobs on two clusters, one of which is located in a remote data center. PMID:20514356

  12. Optimizing medical data quality based on multiagent web service framework.

    PubMed

    Wu, Ching-Seh; Khoury, Ibrahim; Shah, Hemant

    2012-07-01

    One of the most important issues in e-healthcare information systems is to optimize the quality of the medical data extracted from distributed and heterogeneous environments, which can greatly improve diagnostic and treatment decision making. This paper proposes a multiagent web service framework based on service-oriented architecture for the optimization of medical data quality in the e-healthcare information system. Based on the design of the multiagent web service framework, an evolutionary algorithm (EA) for the dynamic optimization of medical data quality is proposed. The framework consists of two main components: first, an EA is used to dynamically optimize the composition of medical processes into an optimal task sequence according to specific quality attributes; second, a multiagent framework is proposed to discover, monitor, and report any inconsistency between the optimized task sequence and the actual medical records. To demonstrate the proposed framework, experimental results for a breast cancer case study are provided. Furthermore, to show the unique performance of our algorithm, a comparison with other works in the literature is presented. PMID:22614723

  13. Web services interface for Space Weather: NeQuick 2 web and experimental TEC Calibration

    NASA Astrophysics Data System (ADS)

    Migoya Orue, Yenca O.; Nava, Bruno; Radicella, Sandro M.; Alazo Cuartas, Katy; Luigi, Ciraolo

    2013-04-01

    A web front-end has recently been developed and released to allow retrieving and plotting ionospheric parameters computed by the latest version of the model, NeQuick 2. NeQuick is a quick-run ionospheric electron density model particularly designed for trans-ionospheric propagation applications. It has been developed at the Aeronomy and Radiopropagation Laboratory (now T/ICT4D Laboratory) of the Abdus Salam International Centre for Theoretical Physics (ICTP) in Trieste, Italy, with the collaboration of the Institute for Geophysics, Astrophysics and Meteorology (IGAM) of the University of Graz, Austria. To describe the electron density of the ionosphere up to the peak of the F2 layer, NeQuick uses a profile formulation which includes five semi-Epstein layers with modelled thickness parameters. Through a simple web interface users can exploit all the model features, including the possibility of computing the electron density and visualizing the corresponding Total Electron Content (TEC) along any ground-to-satellite straight-line ray-path. Indeed, the TEC is the ionospheric parameter retrieved from GPS measurements. It complements the experimental data obtained with diverse kinds of sensors and can be considered a major source of ionospheric information. Since the TEC is not a direct measurement, a "de-biasing" procedure or calibration has to be applied to obtain the relevant values from the raw GPS observables. Using the observation and navigation RINEX files corresponding to a single receiver as input data, the web application allows the user to compute the slant and/or vertical TEC following the concept of "arc-by-arc" offset estimation. The combined use of both tools, freely available from the T/ICT4D Web site, will allow the comparison of experimentally derived slant and vertical TEC with modelled values. An online demonstration of the capabilities of the mentioned web services will be illustrated.
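
    As a toy illustration of what a slant TEC computation involves, the sketch below integrates an electron density profile along a ray path; the Chapman-type profile is a placeholder and does not reproduce the NeQuick 2 formulation or the web service's actual interface.

```python
# Toy illustration of TEC as a line integral of electron density along a ray
# path. The Chapman-type profile below is a placeholder, not NeQuick 2.
import numpy as np

def electron_density(h_km):
    """Placeholder Chapman-type profile (electrons per cubic metre)."""
    nmf2, hmf2, scale_height = 1.0e12, 300.0, 60.0
    z = (h_km - hmf2) / scale_height
    return nmf2 * np.exp(0.5 * (1.0 - z - np.exp(-z)))

# Sample the ray path by altitude (a vertical path for simplicity; a slant path
# would be parameterized along the actual receiver-to-satellite line).
altitudes_km = np.linspace(80.0, 20000.0, 5000)
ne = electron_density(altitudes_km)

# Trapezoidal integration of Ne over path length (metres) gives TEC in el/m^2.
path_m = altitudes_km * 1.0e3
tec = np.sum(0.5 * (ne[1:] + ne[:-1]) * np.diff(path_m))
print(f"TEC = {tec / 1.0e16:.2f} TECU")   # 1 TECU = 1e16 electrons per m^2
```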

  14. Chapter 18: Web-based Tools - NED VO Services

    NASA Astrophysics Data System (ADS)

    Mazzarella, J. M.; NED Team

    The NASA/IPAC Extragalactic Database (NED) is a thematic, web-based research facility in widespread use by scientists, educators, space missions, and observatory operations for observation planning, data analysis, discovery, and publication of research about objects beyond our Milky Way galaxy. NED is a portal into a systematic fusion of data from hundreds of sky surveys and tens of thousands of research publications. The contents and services span the entire electromagnetic spectrum from gamma rays through radio frequencies, and are continuously updated to reflect the current literature and releases of large-scale sky survey catalogs. NED has been on the Internet since 1990, growing in content, automation and services with the evolution of information technology. NED is the world's largest database of cross-identified extragalactic objects. As of December 2006, the system contains approximately 10 million objects and 15 million multi-wavelength cross-IDs. Over 4,000 catalogs and published lists covering the entire electromagnetic spectrum have had their objects cross-identified or associated, with fundamental data parameters federated for convenient queries and retrieval. This chapter describes the interoperability of NED services with other components of the Virtual Observatory (VO). Section 1 is a brief overview of the primary NED web services. Section 2 provides a tutorial for using NED services currently available through the NVO Registry. The "name resolver" provides VO portals and related internet services with celestial coordinates for objects specified by catalog identifier (name); any alias can be queried because this service is based on the source cross-IDs established by NED. All major services have been updated to provide output in VOTable (XML) format that can be accessed directly from the NED web interface or using the NVO registry. These include access to images via SIAP, Cone Search queries, and services providing fundamental, multi
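
    The "name resolver" style of access described above can be scripted; the sketch below uses astroquery's NED interface as one possible client. The module path (astroquery.ipac.ned in recent releases, astroquery.ned in older ones) and the returned column names vary with the astroquery version, so treat this as an assumption-laden example rather than the chapter's own tooling.

```python
# Sketch of programmatic access to NED's object lookup ("name resolver") via
# astroquery. The module path is astroquery.ipac.ned in recent astroquery
# releases (older releases exposed astroquery.ned); column names in the
# returned astropy Table also vary between service versions.
from astroquery.ipac.ned import Ned

result = Ned.query_object("M31")   # returns an astropy Table with positions, redshift, etc.
print(result)
```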

  15. WEB-IS2: Next Generation Web Services Using Amira Visualization Package

    NASA Astrophysics Data System (ADS)

    Yang, X.; Wang, Y.; Bollig, E. F.; Kadlec, B. J.; Garbow, Z. A.; Yuen, D. A.; Erlebacher, G.

    2003-12-01

    Amira (www.amiravis.com) is a powerful 3-D visualization package and has been employed recently by the science and engineering communities to gain insight into their data. We present a new web-based interface to Amira, packaged in a Java applet. We have developed a module called WEB-IS/Amira (WEB-IS2), which provides web-based access to Amira. This tool allows earth scientists to manipulate Amira controls remotely and to analyze, render and view large datasets over the internet, without regard for time or location. This could have important ramifications for Grid computing. The design of our implementation will soon allow multiple users to visually collaborate by manipulating a single dataset through a variety of client devices. These clients will only require a browser capable of displaying Java applets. As the deluge of data continues, innovative solutions that maximize ease of use without sacrificing efficiency or flexibility will continue to gain in importance, particularly in the Earth sciences. Major initiatives, such as Earthscope (http://www.earthscope.org), which will generate at least a terabyte of data daily, stand to profit enormously from a system such as WEB-IS/Amira (WEB-IS2). We discuss our use of SOAP (Livingston, D., Advanced SOAP for Web development, Prentice Hall, 2002), a novel 2-way communication protocol, as a means of providing remote commands, and efficient point-to-point transfer of binary image data. We will present our initial experiences with the use of NaradaBrokering (www.naradabrokering.org) as a means to decouple clients and servers. Information is submitted to the system as a published item, while it is retrieved through a subscription mechanism, via what is known as "topics". These topic headers, their contents, and the list of subscribers are automatically tracked by NaradaBrokering. This novel approach promises a high degree of fault tolerance, flexibility with respect to client diversity, and language independence for the

  16. On the Use of Social Networks in Web Services: Application to the Discovery Stage

    NASA Astrophysics Data System (ADS)

    Maamar, Zakaria; Wives, Leandro Krug; Boukadi, Khouloud

    This chapter discusses the use of social networks in Web services with a focus on the discovery stage that characterizes the life cycle of these Web services. Other stages in this life cycle include description, publication, invocation, and composition. Web services are software applications that end users or other peers can invoke and compose to satisfy different needs such as hotel booking and car rental. Discovering the relevant Web services is, and continues to be, a major challenge due to the dynamic nature of these Web services. Indeed, Web services appear/disappear or suspend/resume operations without prior notice. Traditional discovery techniques are based on registries such as Universal Description, Discovery and Integration (UDDI) and Electronic Business using eXtensible Markup Language (ebXML). Unfortunately, despite the different improvements that these techniques have been subject to, they still suffer from various limitations that could slow down the acceptance trend of Web services by the IT community. Social networks seem to offer solutions to some of these limitations but raise, at the same time, some issues that are discussed in this chapter. The contributions of this chapter are threefold: a social network definition in the particular context of Web services; mechanisms that allow Web services to build, use, and maintain their respective social networks; and the adoption of social networks to discover Web services.

  17. Reliable Execution Based on CPN and Skyline Optimization for Web Service Composition

    PubMed Central

    Ha, Weitao; Zhang, Guojun

    2013-01-01

    With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web services composition is widely used in business environments. Because component web services are inherently autonomous and heterogeneous, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality of service (QoS) properties are crucial for selecting the web services to take part in the composition. Transactional properties ensure the reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which incorporates the transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of a transactional composite Web service (TCWS) is formalized by CPN properties. To identify the best services with respect to QoS properties from the candidate service sets formed in the TCWS-CPN, we use skyline computation to retrieve the dominant Web services. This overcomes the significant information loss that occurs when individual scores are reduced to a single overall similarity. We evaluate our approach experimentally using both real and synthetically generated datasets. PMID:23935431
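
    The skyline step can be illustrated independently of the CPN machinery: the sketch below applies the generic Pareto-dominance filter to hypothetical QoS vectors in which every attribute is to be minimized. It shows the technique in general, not the paper's exact formulation.

```python
# Sketch of skyline (Pareto-dominance) filtering over QoS vectors, e.g.
# (response_time, cost, 1 - availability), where smaller is better for every
# attribute. Generic technique only; not the paper's exact formulation.
def dominates(a, b):
    """a dominates b if it is no worse in every attribute and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(candidates):
    """Return the candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]

services = {
    "S1": (120, 0.9, 0.02),   # hypothetical QoS vectors
    "S2": (200, 0.5, 0.01),
    "S3": (150, 1.2, 0.05),   # dominated by S1
}
best = skyline(list(services.values()))
print([name for name, qos in services.items() if qos in best])   # ['S1', 'S2']
```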

  18. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    NASA Astrophysics Data System (ADS)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
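
    A standards-compliant client can pull rendered maps from such a WMS without any custom software; the sketch below uses OWSLib with a hypothetical endpoint URL and layer name (neither is the actual TOPS service).

```python
# Sketch of a standards-based client pulling a rendered map from an OGC WMS
# using OWSLib. The endpoint URL and layer name below are hypothetical.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/tops/wms", version="1.3.0")  # hypothetical endpoint

response = wms.getmap(
    layers=["gross_primary_production"],     # hypothetical layer name
    styles=[""],
    srs="EPSG:4326",
    bbox=(-125.0, 32.0, -114.0, 42.0),       # lon/lat bounding box over California
    size=(512, 512),
    format="image/png",
    transparent=True,
)
with open("gpp.png", "wb") as f:
    f.write(response.read())
```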

  19. WS/PIDS: standard interoperable PIDS in web services environments.

    PubMed

    Vasilescu, E; Dorobanţu, M; Govoni, S; Padh, S; Mun, S K

    2008-01-01

    An electronic health record depends on the consistent handling of people's identities within and outside healthcare organizations. Currently, the Person Identification Service (PIDS), a CORBA specification, is the only well-researched standard that meets these needs. In this paper, we introduce WS/PIDS, a PIDS specification for Web Services (WS) that closely matches the original PIDS and improves on it by providing explicit support for medical multimedia attributes. WS/PIDS is currently supported by a test implementation, layered on top of a PIDS back-end, with Java-based, .NET-based, and Web clients. WS/PIDS is interoperable among platforms; it preserves PIDS semantics to a large extent, and it is intended to be fully compliant with established and emerging WS standards. The specification is open source and immediately usable in dynamic clinical systems participating in grid environments. WS/PIDS has been tested successfully with a comprehensive set of use cases, and it is being used in a clinical research setting. PMID:18270041

  20. Maintenance and Exchange of Learning Objects in a Web Services Based e-Learning System

    ERIC Educational Resources Information Center

    Vossen, Gottfried; Westerkamp, Peter

    2004-01-01

    "Web services" enable partners to exploit applications via the Internet. Individual services can be composed to build new and more complex ones with additional and more comprehensive functionality. In this paper, we apply the Web service paradigm to electronic learning, and show how to exchange and maintain learning objects is a…

  1. Design and implementation of CUAHSI WaterML and WaterOneFlow Web Services

    NASA Astrophysics Data System (ADS)

    Valentine, D. W.; Zaslavsky, I.; Whitenack, T.; Maidment, D.

    2007-12-01

    WaterOneFlow is a term for a group of web services created by and for the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) community. CUAHSI web services facilitate the retrieval of hydrologic observations information from online data sources using the SOAP protocol. CUAHSI Water Markup Language (below referred to as WaterML) is an XML schema defining the format of messages returned by the WaterOneFlow web services.
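
    Because WaterOneFlow is SOAP-based, any generic SOAP client can call it; the sketch below uses the zeep library with a hypothetical WSDL URL, site code, and variable code. Method names follow the WaterOneFlow convention (e.g., GetValues), but exact signatures differ between service versions, so this is an assumption-laden sketch rather than a reference client.

```python
# Sketch of calling a WaterOneFlow SOAP endpoint with the generic zeep SOAP
# client. The WSDL URL, site code, and variable code are hypothetical; the
# GetValues signature follows the WaterOneFlow convention but may differ
# between service versions.
from zeep import Client

client = Client("https://example.org/cuahsi_1_1.asmx?WSDL")  # hypothetical WSDL

# Retrieve a time series of observations for one site and variable;
# the response is typically a WaterML XML document.
waterml = client.service.GetValues(
    location="NWISDV:10109000",    # hypothetical network:site code
    variable="NWISDV:00060",       # hypothetical variable code (discharge)
    startDate="2020-01-01",
    endDate="2020-01-31",
    authToken="",
)
print(str(waterml)[:500])
```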

  2. ClicO FS: an interactive web-based service of Circos

    PubMed Central

    Cheong, Wei-Hien; Tan, Yung-Chie; Yap, Soon-Joo; Ng, Kee-Peng

    2015-01-01

    Summary: We present ClicO Free Service, an online web-service based on Circos, which provides a user-friendly, interactive web-based interface with configurable features to generate Circos circular plots. Availability and implementation: Online web-service is freely available at http://clicofs.codoncloud.com Contact: soonjoo.yap@codongenomics.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26227146

  3. Customer Decision Making in Web Services with an Integrated P6 Model

    NASA Astrophysics Data System (ADS)

    Sun, Zhaohao; Sun, Junqing; Meredith, Grant

    Customer decision making (CDM) is an indispensable factor for web services. This article examines CDM in web services with a novel P6 model, which consists of the 6 Ps: privacy, perception, propensity, preference, personalization and promised experience. This model integrates the existing 6 P elements of the marketing mix as the system environment of CDM in web services. The new integrated P6 model deals with the inner world of the customer and incorporates what the customer thinks during the decision-making process. The proposed approach will facilitate the research and development of web services and decision support systems.

  4. Establishing Transportation Framework Services Using the Open Geospatial Consortium Web Feature Service Specification

    NASA Astrophysics Data System (ADS)

    Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.

    2005-12-01

    As a partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level with data from BTS/DOT, at the state level through VDOT, and with industry through Intergraph. CEOSR develops WFS solutions using Intergraph software. Relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services, 2) VDOT's state transportation data and GIS infrastructure, and 3) BTS/DOT's national transportation data. The project aims to: 1) develop and deploy an operational OGC WFS 1.1 interface at CEOSR, registered with the FGDC/GOS Portal and responding to Web POST requests for transportation framework data as listed in Table 1; 2) build a WFS service that can return data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme in the format given by OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integrate the OGC WFS with CEOSR's clearinghouse nodes; and 4) establish a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS
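
    A sketch of how a client might request transportation framework features from such a WFS is shown below, using OWSLib; the endpoint URL, feature type name, and bounding box are hypothetical.

```python
# Sketch of requesting transportation framework features from an OGC WFS with
# OWSLib. The endpoint URL and feature type name are hypothetical.
from owslib.wfs import WebFeatureService

wfs = WebFeatureService("https://example.org/transportation/wfs", version="1.1.0")

print(list(wfs.contents))                  # advertised feature types

response = wfs.getfeature(
    typename=["framework:Roads"],          # hypothetical feature type
    bbox=(-77.5, 38.7, -77.0, 39.1),       # lon/lat extent around northern Virginia
    maxfeatures=100,
)
with open("roads.gml", "wb") as f:         # GML response, per the WFS specification
    f.write(response.read())
```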

  5. Towards Semantic Web Services on Large, Multi-Dimensional Coverages

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2009-04-01

    Observed and simulated data in the Earth Sciences often come as coverages, the general term for space-time varying phenomena as set forth by standardization bodies like the Open Geospatial Consortium (OGC) and ISO. Among such data are 1-D time series, 2-D surface data, 3-D surface data time series as well as x/y/z geophysical and oceanographic data, and 4-D metocean simulation results. With increasing dimensionality the data sizes grow exponentially, up to petabyte object sizes. Open standards for exploiting coverage archives over the Web are available to a varying extent. The OGC Web Coverage Service (WCS) standard defines basic extraction operations: spatio-temporal and band subsetting, scaling, reprojection, and data format encoding of the result - a simple interoperable interface for coverage access. More processing functionality is available with products like Matlab, Grid-type interfaces, and the OGC Web Processing Service (WPS). However, these often lack properties known as advantageous from databases: declarativeness (describe results rather than the algorithms), safety in evaluation (no request can keep a server busy infinitely), and optimizability (enable the server to rearrange the request so as to produce the same result faster). WPS defines a geo-enabled SOAP interface for remote procedure calls. This makes it possible to webify any program, but does not allow for semantic interoperability: a function is identified only by its function name and parameters, while the semantics is encoded in the (only human-readable) title and abstract. Hence, another desirable property is missing, namely an explicit semantics which allows for machine-to-machine communication and reasoning a la Semantic Web. The OGC Web Coverage Processing Service (WCPS) language, which was adopted as an international standard by OGC in December 2008, defines a flexible interface for the navigation, extraction, and ad-hoc analysis of large, multi-dimensional raster coverages. It is abstract in that it

  7. A "Virtual Fieldtrip": Service Learning in Distance Education Technical Writing Courses

    ERIC Educational Resources Information Center

    Soria, Krista M.; Weiner, Brad

    2013-01-01

    This mixed-methods experimental study examined the effect of service learning in a distance education technical writing course. Quantitative analysis of data found evidence for a positive relationship between participation in service learning and technical writing learning outcomes. Additionally, qualitative analysis suggests that service learning…

  7. The use of geospatial web services for exchanging utilities data

    NASA Astrophysics Data System (ADS)

    Kuczyńska, Joanna

    2013-04-01

    Geographic information technologies and related geo-information systems currently play an important role in the management of public administration in Poland. One of these tasks is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains information about technical infrastructure that is important for many institutions. This requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions which administer transmission lines. The administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data exchange methodology which can be implemented on a variety of hardware and software platforms. This methodology uses the Unified Modeling Language (UML), the eXtensible Markup Language (XML), and the Geography Markup Language (GML). The proposed methodology is based on two different strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series of standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases. The models were written in the universal modeling language UML. A combined model that defines a common data structure was also built. This model was transformed into GML, the standard developed for the exchange of geographic information. The structure of the document describing the data that may be exchanged is defined in an .xsd file. Network services were selected and implemented in a system designed for data exchange based on open source tools. The methodology was implemented and tested. Data in the agreed data structure and metadata were set up on the server. Data access was provided by geospatial network services: data searching possibilities by Catalog Service for the Web (CSW), data

  8. Footprint Database and web services for the Herschel space observatory

    NASA Astrophysics Data System (ADS)

    Verebélyi, Erika; Dobos, László; Kiss, Csaba

    2015-08-01

    Using all telemetry and observational meta-data, we created a searchable database of Herschel observation footprints. Data from the Herschel space observatory are freely available to everyone, but no uniformly processed catalog of all observations has been published yet. As a first step, we unified the data model for all three Herschel instruments in all observation modes and compiled a database of sky coverage information. As opposed to methods using a pixellation of the sphere, in our database sky coverage is stored in exact geometric form, allowing for precise area calculations. Indexing of the footprints allows for very fast searches among observations based on pointing, time, sky coverage overlap and meta-data. This enables us, for example, to find moving objects easily in Herschel fields. The database is accessible via a web site and also as a set of REST web service functions, which makes it usable from program clients like Python or IDL scripts. Data are available in various formats including Virtual Observatory standards.

  9. The Best of Two Worlds: Combining ITV and Web Quests To Strengthen Distance Learning.

    ERIC Educational Resources Information Center

    Mosby, Charmaine

    This presentation describes an English graduate seminar in Local Color and Regionalism in American Literature at Western Kentucky University that was set up as an experimental hybrid course, i.e., roughly 60% face-to-face and 40% Web course (Web quest format). The focus is on the four tasks that comprised the Web quest segment of the course: (1) a…

  10. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    NASA Astrophysics Data System (ADS)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate distributed sensor Web services to meet the requirements of a complex Earth observation scenario. A RESTful-based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTful-based workflow interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.

  11. Communicating and visualizing data quality through Web Map Services

    NASA Astrophysics Data System (ADS)

    Roberts, Charles; Blower, Jon; Maso, Joan; Diaz, Daniel; Griffiths, Guy; Lewis, Jane

    2014-05-01

    The sharing and visualization of environmental data through OGC Web Map Services is becoming increasingly common. However, information about the quality of data is rarely presented. (In this presentation we consider mostly data uncertainty as a measure of quality, although we acknowledge that many other quality measures are relevant to the geoscience community.) In the context of the GeoViQua project (http://www.geoviqua.org) we have developed conventions and tools for using WMS to deliver data quality information. The "WMS-Q" convention describes how the WMS specification can be used to publish quality information at the level of datasets, variables and individual pixels (samples). WMS-Q requires no extensions to the WMS 1.3.0 specification, being entirely backward-compatible. (An earlier version of WMS-Q was published as OGC Engineering Report 12-160.) To complement the WMS-Q convention, we have also developed extensions to the OGC Symbology Encoding (SE) specification, enabling uncertain geoscience data to be portrayed using a variety of visualization techniques. These include contours, stippling, blackening, whitening, opacity, bivariate colour maps, confidence interval triangles and glyphs. There may also be more extensive applications of these methods beyond the visual representation of uncertainty. In this presentation we will briefly describe the scope of the WMS-Q and "extended SE" specifications and then demonstrate the innovations using open-source software based upon ncWMS (http://ncwms.sf.net). We apply the tools to a variety of datasets including Earth Observation data from the European Space Agency's Climate Change Initiative. The software allows uncertain raster data to be shared through Web Map Services, giving the user fine control over data visualization.

  12. Bridging the Distance: Service Learning in International Perspective

    ERIC Educational Resources Information Center

    Florman, Jean C.; Just, Craig; Naka, Tomomi; Peterson, Jim; Seaba, Hazel H.

    2009-01-01

    In this article, the authors describe how an existing partnership between two communities, one in eastern Iowa and one in Mexico, was turned into a cross-disciplinary and international service learning course for students in the University of Iowa Colleges of Engineering, Pharmacy, and Liberal Arts and Sciences. The projects that students worked…

  13. Standards-Based, Web Services for Interoperable Geosciences Data Systems

    NASA Astrophysics Data System (ADS)

    Domenico, B.; Nativi, S.; Bigagli, L.; Caron, J.

    2005-12-01

    Disparate, "stove-pipe" data systems are among the main impediments to many interdisciplinary research projects in the geosciences. The solid earth disciplines and hydrology tend to use Geographic Information Systems (GIS) which enable them to store and interact with data representing as discrete features on or near the surface of the earth. Studies of the oceans and atmosphere on the other hand involve systems that represent data as discrete points in the continuous function space of fluid dynamics. Attempts to understand the nature of severe precipitation and flooding events are hampered by the difficulty of integrating data such as streamflows from hydrological data systems with radar data and precipitation forecasts from atmospheric science data systems. An effort is underway to address some of these issues with an interoperability experiment within the framework of the Open Geospatial Consortium (OGC). The experiment is called GALEON (Geo-interface to Atmosphere, Land, Earth, Ocean NetCDF). Teams at the Unidata Program Center and University of Florence are working with a number of international partners to implement a web services interface to traditional atmospheric and oceanographic datasets currently stored in netCDF form or served via the OPeNDAP protocol . The project will result in a gateway service using Web Coverage Service (WCS) specification of the OGC. Underneath the WCS interface will be a combination of technologies including THREDDS (THematic Real-time Environmental Distributed Data Services) and HDF5 (Heirarchical Data Format) in addition to netCDF and OPeNDAP. A key component of the project is to develop mechanisms for explicit encoding of coordinate system information in the form of Coordinate System extensions to NcML (the netCDF Markup Language), directly in the data files themselves and in the form of GML (Geography Markup Language) extensions to NcML. These extensions, called NcML-GML, include a subset profile of the standard GML which is

  14. Protecting Database Centric Web Services against SQL/XPath Injection Attacks

    NASA Astrophysics Data System (ADS)

    Laranjeiro, Nuno; Vieira, Marco; Madeira, Henrique

    Web services represent a powerful interface for back-end database systems and are increasingly being used in business critical applications. However, field studies show that a large number of web services are deployed with security flaws (e.g., having SQL Injection vulnerabilities). Although several techniques for the identification of security vulnerabilities have been proposed, developing non-vulnerable web services is still a difficult task. In fact, security-related concerns are hard to apply as they involve adding complexity to already complex code. This paper proposes an approach to secure web services against SQL and XPath Injection attacks by transparently detecting and aborting service invocations that try to take advantage of potential vulnerabilities. Our mechanism was applied to secure several web services specified by the TPC-App benchmark, proving to be 100% effective in stopping attacks, non-intrusive, and very easy to use.
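
    One simplified way to approximate the idea of transparently detecting anomalous invocations is to compare the structural signature of the SQL issued at runtime against a signature learned from known-good executions, as sketched below with the sqlparse library. This is an illustration of the general idea, not the detection mechanism actually used in the paper.

```python
# Simplified sketch of structure-based SQL injection detection: the token-type
# "signature" of a runtime query is compared against the signature learned from
# a known-good invocation; a mismatch aborts the call. Illustration only, not
# the paper's mechanism.
import sqlparse

def signature(sql):
    """Sequence of token types, ignoring whitespace; literal values keep only their type."""
    tokens = sqlparse.parse(sql)[0].flatten()
    return tuple(str(tok.ttype) for tok in tokens if not tok.is_whitespace)

# Learned from a legitimate invocation of the service (parameter value "42").
good = signature("SELECT name FROM users WHERE id = '42'")

def guarded(sql):
    if signature(sql) != good:
        raise RuntimeError("query structure deviates from learned profile; aborting")
    return sql   # here the query would be passed on to the database

for sql in ("SELECT name FROM users WHERE id = '7'",
            "SELECT name FROM users WHERE id = '' OR '1'='1'"):
    try:
        guarded(sql)
        print("accepted:", sql)
    except RuntimeError as err:
        print("rejected:", sql, "-", err)
```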

  15. Pre-Service Teachers' Views on Web-Based Classroom Management

    ERIC Educational Resources Information Center

    Boyaci, Adnan

    2010-01-01

    With the invention of the World Wide Web in 1992, delivery of distance education via the Internet and the emergence of web-based classrooms have rapidly gained acceptance as an alternative and supplement to traditional face-to-face classroom instruction (Alavi, Yoo & Vogel, 1997; Rahm & Reed, 1997), which represents a paradigm shift challenging all…

  16. A verification strategy for web services composition using enhanced stacked automata model.

    PubMed

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different service providers, a composite service describing the intact business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides initial groundwork for an Extensible Markup Language (XML) specification language for defining and implementing business practice workflows for web services. The main problem with most realistic approaches to service composition is the verification of the composed web services, which has to rely on formal verification methods to ensure the correctness of the composed services. A few research works in the literature address the verification of web services for deterministic systems. Moreover, the existing models do not address verification properties such as dead transitions, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated based on properties such as dead transitions, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS); the composition is converted into an ESAM (a combination of Muller Automata (MA) and Push-Down Automata (PDA)) and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results show better performance in terms of finding dead transitions and deadlocks in contrast to the
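
    The properties targeted above (reachability, dead transitions, deadlock) can be illustrated on a toy transition system with a plain breadth-first search, as sketched below; this is only a didactic stand-in and is unrelated to ESAM, BPEL4WS, or SPIN themselves. The state names are hypothetical.

```python
# Minimal sketch of checking reachability, dead transitions and deadlock freedom
# on a toy transition system via breadth-first search. Didactic stand-in only;
# not ESAM or SPIN.
from collections import deque

# Hypothetical composed-service states and transitions.
transitions = {
    "start":        ["invoke_hotel"],
    "invoke_hotel": ["invoke_car", "compensate"],
    "invoke_car":   ["confirm"],
    "compensate":   ["end"],
    "confirm":      ["end"],
    "legacy_path":  ["end"],       # never reachable from "start"
    "end":          [],            # terminal state, not a deadlock
}

def reachable(initial):
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

states = reachable("start")
dead = [s for s in transitions if s not in states]                    # transitions that can never fire
deadlocks = [s for s in states if s != "end" and not transitions[s]]  # stuck, non-terminal states

print("reachable:", sorted(states))
print("dead (unreachable) parts:", dead)
print("deadlocks:", deadlocks)
```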

  17. Promoting and tracking the use of hospital library web services by outside entities.

    PubMed

    Leman, Hope

    2010-04-01

    This column describes a process that can be used to market a hospital library Web service for use by external entities and monitor its use by the worldwide audience (e.g., medical, academic and public libraries, offices of research administration). Included are concrete suggestions to help hospital librarians in their efforts to encourage adoption of their Web service by other institutions. PMID:20432141

  18. Determinants of Corporate Web Services Adoption: A Survey of Companies in Korea

    ERIC Educational Resources Information Center

    Kim, Daekil

    2010-01-01

    Despite the growing interest and attention from Information Technology researchers and practitioners, empirical research on factors that influence an organization's likelihood of adoption of Web Services has been limited. This study identified the factors influencing Web Services adoption from the perspective of 151 South Korean firms. The…

  19. Web Services for Dynamic Coloring of UAVSAR Images

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Pierce, Marlon; Donnellan, Andrea; Parker, Jay

    2015-08-01

    QuakeSim has implemented a service-based Geographic Information System to enable users to access large amounts of Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) data through an online interface. The QuakeSim Interferometric Synthetic Aperture Radar (InSAR) profile tool calculates radar-observed displacement (from an unwrapped interferogram product) along user-specified lines. Pre-rendered thumbnails with InSAR fringe patterns are used to display interferogram and unwrapped phase images on a Google Map in the InSAR profile tool. One challenge with this tool lies in the user visually identifying regions of interest when drawing the profile line. This requires that the user correctly interpret the InSAR imagery, which currently uses fringe patterns. In the InSAR fringe pattern the mapping between pixel color and pixel value is not one-to-one, which makes it difficult for QuakeSim users to understand general displacement information. The goal of this work is to generate color maps that directly reflect the pixel values (displacement) as an addition to the pre-rendered images. Because of the extremely uneven distribution of pixel values in an InSAR image, a histogram-based, nonlinear color template generation algorithm is currently under development. A web service enables on-the-fly coloring of UAVSAR images with dynamically generated color templates.
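
    A histogram-based (nonlinear) color mapping can be sketched with standard tools: assign each pixel a color by its empirical quantile rather than by a linear stretch, as below with NumPy and Matplotlib on synthetic data. This illustrates the general approach, not the QuakeSim implementation.

```python
# Sketch of a histogram-based (nonlinear) color mapping for unevenly distributed
# pixel values: colors are assigned by each pixel's empirical quantile rather
# than by a linear stretch. Synthetic data; not the QuakeSim code.
import numpy as np
from matplotlib import cm

rng = np.random.default_rng(0)
displacement = rng.exponential(scale=0.05, size=(256, 256))   # synthetic, skewed values

# Build a lookup from value to quantile using the cumulative histogram.
counts, edges = np.histogram(displacement, bins=256)
cdf = np.cumsum(counts).astype(float)
cdf /= cdf[-1]

# Map each pixel to its quantile, then to an RGBA color.
quantiles = np.interp(displacement, edges[1:], cdf)
rgba = cm.viridis(quantiles)                                  # shape (256, 256, 4)
print(rgba.shape, quantiles.min(), quantiles.max())
```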

  1. Implementing Broad Scale Childhood Immunization Decision Support as a Web Service

    PubMed Central

    Zhu, Vivienne J.; Grannis, Shaun J.; Rosenman, Marc B.; Downs, Stephen M.

    2009-01-01

    Timely vaccinations decrease a child’s risk of contracting vaccine-preventable disease and prevent disease outbreaks. Childhood immunization schedules may represent the only clinical guideline for which there is official national consensus, so an immunization clinical decision support system (CDSS) is a natural application. However, immunization schedules are complex and change frequently. Maintaining multiple CDSSs is expensive and error prone. Therefore, a practical strategy would be an immunization CDSS provided as a centralized web service that can be easily accessed by various electronic medical record (EMR) systems. This allows centralized maintenance of immunization guidelines. We have developed a web service, based on Miller’s tabular model with modifications, which implements routine childhood immunization guidelines. This immunization web service is currently operating on the Regenstrief Institute intranet, and system evaluations are ongoing. We will make this web service available on the Internet. In this paper, we describe this web service-based immunization decision support tool. PMID:20351952

  2. Integration of RFID and web service for assisted living.

    PubMed

    Unluturk, Mehmet S; Kurtel, Kaan

    2012-08-01

    The number of people over 65 years old throughout most stable and prosperous countries in the world is increasing. Availability of their care in their own homes is imperative because of economic reasons and their choices about where to live (World Health Organization, Definition of an older or elderly person. http://www.who.int/healthinfo/survey/ageingdefnolder/en/ ; EQUIP-European Framework for Qualifications in Home Care Services for Older People, http://www.equip-project.com ; Salonen, 2009). "Recent advancement in wireless communications and electronics has enabled the development of low-cost sensor networks. The sensor networks can be utilized in various application areas." (Akyildiz, et al. 2002) These two statements show that there is great promise in wireless technology, and utilizing it in assisted living might be very beneficial to elderly people. In this paper, we propose a software architecture called Location Windows Service (LWS), which integrates Radio Frequency Identification (RFID) technology and a web service to build an assisted living system for elderly people at home. This architecture monitors the location of elderly people without interfering in their daily activities. Location information messages that are generated as the elderly move from room to room indicate that the elderly person is fit and healthy and going about their normal life. The communication must be timely enough to follow elderly people as they move from room to room without missing a location. Unacknowledged publishing, subscription filtering and short location change messages are also included in this software model to reduce the network traffic in large homes. We also propose some defense schemes applied to the network environment of the assisted living system to prevent external attacks. PMID:21537853

  3. Working without a Crystal Ball: Predicting Web Trends for Web Services Librarians

    ERIC Educational Resources Information Center

    Ovadia, Steven

    2008-01-01

    User-centered design is a principle stating that electronic resources, like library Web sites, should be built around the needs of the users. This article interviews Web developers of library and non-library-related Web sites, determining how they assess user needs and how they decide to adapt certain technologies for users. According to the…

  4. Research of three level match method about semantic web service based on ontology

    NASA Astrophysics Data System (ADS)

    Xiao, Jie; Cai, Fang

    2011-10-01

    An important step in applying Web services is the discovery of useful services. Traditional technologies such as UDDI and WSDL use keywords for service discovery, with the disadvantages of required user intervention, lack of semantic description, and low accuracy. To cope with these problems, OWL-S is introduced and extended with QoS attributes to describe the attributes and functions of Web services. A three-level service matching algorithm based on ontology and QoS is proposed in this paper. Our algorithm matches web services by utilizing the service profile and QoS parameters together with the inputs and outputs of the service. Simulation results show that it greatly enhances the speed of service matching while also guaranteeing high accuracy.
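
    One common ingredient of such ontology-based matchmakers is grading input/output compatibility by degrees often labelled exact, plug-in, subsume, and fail; the sketch below does this over a toy concept hierarchy. The hierarchy and the exact grading conventions are assumptions for illustration, not necessarily the algorithm used in the paper.

```python
# Sketch of ontology-based input/output matching with degrees often labelled
# exact > plug-in > subsume > fail, over a toy concept hierarchy. Conventions
# differ between matchmakers; this is an illustration, not the paper's algorithm.
SUBCLASS_OF = {            # hypothetical mini ontology: child -> parent
    "Sedan": "Car",
    "Car": "Vehicle",
    "Vehicle": "Thing",
}

def ancestors(concept):
    chain = []
    while concept in SUBCLASS_OF:
        concept = SUBCLASS_OF[concept]
        chain.append(concept)
    return chain

def match_degree(advertised, requested):
    if advertised == requested:
        return "exact"
    if requested in ancestors(advertised):   # advertisement is more specific
        return "plug-in"
    if advertised in ancestors(requested):   # advertisement is more general
        return "subsume"
    return "fail"

print(match_degree("Car", "Car"))        # exact
print(match_degree("Sedan", "Vehicle"))  # plug-in
print(match_degree("Vehicle", "Sedan"))  # subsume
print(match_degree("Car", "Boat"))       # fail
```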

  5. The Knowledge Web: Learning and Collaborating on the Net. Open and Distance Learning Series.

    ERIC Educational Resources Information Center

    Eisenstadt, Marc, Ed.; Vincent, Tom, Ed.

    This book contains a collection of examples of new and effective uses of the World Wide Web in education from the Knowledge Media Institute (KMi) at the Open University (Great Britain). The publication is organized in three main sections--"Learning Media," "Collaboration and Presence," and "Knowledge Systems on the Web"--and contains the following…

  6. A Framework of Synthesizing Tutoring Conversation Capability with Web-Based Distance Education Courseware

    ERIC Educational Resources Information Center

    Song, Ki-Sang; Hu, Xiangen; Olney, Andrew; Graesser, Arthur C.

    2004-01-01

    Whereas existing learning environments on the Web lack high-level interactivity, we have developed a human tutor-like tutorial conversation system for the Web that enhances educational courseware through mixed-initiative dialog with natural language processing. The conversational tutoring agent is composed of an animated tutor, a Latent Semantic…

  7. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These
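
    As an illustration of the REST-style access pattern described above, the sketch below builds an FDSN-style station-inventory query and fetches the result. The base URL and parameter names follow common FDSN web-service conventions and should be treated as assumptions rather than NCEDC's exact interface.

```python
# Illustrative REST query for station inventory in the FDSN web-service style
# the record refers to. Base URL and parameters are assumptions, not a
# documented NCEDC API reference.
import urllib.parse
import urllib.request

BASE = "https://service.ncedc.org/fdsnws/station/1/query"  # assumed service endpoint

params = {
    "network": "BK",          # Berkeley Digital Seismic Network
    "channel": "BHZ",
    "starttime": "2012-01-01",
    "level": "channel",
    "format": "text",         # or "xml" for StationXML
}

url = BASE + "?" + urllib.parse.urlencode(params)
with urllib.request.urlopen(url) as resp:
    print(resp.read().decode("utf-8"))
```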

  8. A Privacy Access Control Framework for Web Services Collaboration with Role Mechanisms

    NASA Astrophysics Data System (ADS)

    Liu, Linyuan; Huang, Zhiqiu; Zhu, Haibin

    With the popularity of Internet technology, web services are becoming the most promising paradigm for distributed computing. This increased use of web services has meant that more and more personal information of consumers is being shared with web service providers, leading to the need to guarantee the privacy of consumers. This paper proposes a role-based privacy access control framework for Web services collaboration; it utilizes roles to specify the privacy privileges of services and considers the impact of a service's historical experience in playing roles on its reputation degree. Compared to traditional privacy access control approaches, this framework can make fine-grained authorization decisions, thus effectively protecting consumers' privacy.
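
    The following sketch illustrates the kind of fine-grained, role-based privacy decision the abstract describes: a collaborating service may read a privacy attribute only if the role it plays grants that privilege and its reputation degree is sufficient. The role table, attribute names, and threshold are illustrative assumptions, not the authors' model.

```python
# Minimal role-based privacy authorization check. Roles, privacy attributes,
# and the reputation threshold are assumed for illustration.
ROLE_PRIVILEGES = {
    "shipping_partner": {"name", "address"},
    "payment_processor": {"name", "card_token"},
}

def authorize(service_role: str, reputation: float, requested_attr: str,
              min_reputation: float = 0.7) -> bool:
    """Fine-grained decision: the role must allow the attribute and the
    service's reputation degree must meet the threshold."""
    allowed = ROLE_PRIVILEGES.get(service_role, set())
    return requested_attr in allowed and reputation >= min_reputation

print(authorize("shipping_partner", 0.9, "address"))     # True
print(authorize("shipping_partner", 0.9, "card_token"))  # False
```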

  9. The Building of Digital Archives Personalized Service Website based on Web 2.0

    NASA Astrophysics Data System (ADS)

    Ziyu, Cheng; Haining, An

    Web 2.0 technology has been applied to a personalized service website for digital archives. Although such sites currently have few users and awareness of them is limited, the author believes that Web 2.0, with its advantages of speed, convenience, and zero cost, will be accepted by more and more users in the future. With the continuing maturation and popularity of Web 2.0, the personalized services of digital archives will take on new vitality. In this paper, the author proposes approaches for applying Web 2.0 in such a system.

  10. Job submission and management through web services: the experience with the CREAM service

    NASA Astrophysics Data System (ADS)

    Aiftimiei, C.; Andreetto, P.; Bertocco, S.; Fina, S. D.; Ronco, S. D.; Dorigo, A.; Gianelle, A.; Marzolla, M.; Mazzucato, M.; Sgaravatto, M.; Verlato, M.; Zangrando, L.; Corvo, M.; Miccio, V.; Sciaba, A.; Cesini, D.; Dongiovanni, D.; Grandi, C.

    2008-07-01

    Modern Grid middleware is built around components providing basic functionality, such as data storage, authentication, security, job management, resource monitoring and reservation. In this paper we describe the Computing Resource Execution and Management (CREAM) service. CREAM provides a Web service-based job execution and management capability for Grid systems; in particular, it is being used within the gLite middleware. CREAM exposes a Web service interface allowing conforming clients to submit and manage computational jobs on a Local Resource Management System. We developed a special component, called ICE (Interface to CREAM Environment), to integrate CREAM in gLite. ICE transfers job submissions and cancellations from the Workload Management System, allowing users to manage CREAM jobs from the gLite User Interface. This paper describes some recent studies aimed at assessing the performance and reliability of CREAM and ICE; those tests have been performed as part of the acceptance tests for integration of CREAM and ICE in gLite. We also discuss recent work towards enhancing CREAM with a BES- and JSDL-compliant interface.
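
    To make the submit/monitor/cancel pattern concrete, the sketch below mocks a job-management web service client in memory. The class, its method names, and the simplified JDL string are illustrative assumptions, not CREAM's actual WSDL operations or the gLite client tools.

```python
# Schematic sketch of the submit/monitor/cancel lifecycle a job-management
# web service exposes. This is a local mock of the interaction pattern only;
# names and the JDL snippet are assumptions, not the CREAM interface.
import uuid

class MockJobService:
    """In-memory stand-in for a remote job-management web service."""

    def __init__(self):
        self._jobs = {}

    def submit(self, job_description: str) -> str:
        job_id = str(uuid.uuid4())
        self._jobs[job_id] = {"jdl": job_description, "state": "PENDING"}
        return job_id

    def status(self, job_id: str) -> str:
        return self._jobs[job_id]["state"]

    def cancel(self, job_id: str) -> None:
        self._jobs[job_id]["state"] = "CANCELLED"

# A simplified JDL-style job description (illustrative example).
JDL = '[ Executable = "/bin/hostname"; StdOutput = "out.txt"; StdError = "err.txt"; ]'

svc = MockJobService()
jid = svc.submit(JDL)
print(svc.status(jid))   # PENDING
svc.cancel(jid)
print(svc.status(jid))   # CANCELLED
```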