Science.gov

Sample records for distance web service

  1. Isolation by distance, web service

    PubMed Central

    Jensen, Jeffrey L; Bohonak, Andrew J; Kelley, Scott T

    2005-01-01

    Background The population genetic pattern known as "isolation by distance" results from spatially limited gene flow and is commonly observed in natural populations. However, few software programs exist for estimating the degree of isolation by distance among populations, and they tend not to be user-friendly. Results We have created the Isolation by Distance Web Service (IBDWS), a user-friendly web interface for determining patterns of isolation by distance. Using this site, population geneticists can perform a variety of powerful statistical tests, including Mantel tests and Reduced Major Axis (RMA) regression analysis, as well as calculate FST between all pairs of populations and compute basic summary statistics (e.g., heterozygosity). All statistical results, including publication-quality scatter plots in PostScript format, are returned rapidly to the user and can be easily downloaded. Conclusion IBDWS population genetics analysis software is hosted at and documentation is available at . The source code has been made available on SourceForge at . PMID:15760479
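A Mantel test, one of the analyses IBDWS offers, correlates the entries of two distance matrices (e.g. genetic vs. geographic distance) and assesses significance by permuting the rows and columns of one matrix. A minimal sketch in Python, for illustration only (not IBDWS source code):

```python
import numpy as np

def mantel(dist_a, dist_b, permutations=999, seed=0):
    """Correlation between the entries of two distance matrices, with
    significance from a permutation test (rows and columns of one
    matrix are shuffled together)."""
    a = np.asarray(dist_a, dtype=float)
    b = np.asarray(dist_b, dtype=float)
    iu = np.triu_indices_from(a, k=1)      # upper triangle, no diagonal
    r_obs = np.corrcoef(a[iu], b[iu])[0, 1]
    rng = np.random.default_rng(seed)
    n = a.shape[0]
    hits = 0
    for _ in range(permutations):
        p = rng.permutation(n)             # shuffle one matrix's labels
        r = np.corrcoef(a[iu], b[p][:, p][iu])[0, 1]
        if r >= r_obs:
            hits += 1
    return r_obs, (hits + 1) / (permutations + 1)

# Populations on a line: genetic distance proportional to geographic
# distance gives a strong, significant isolation-by-distance signal.
coords = np.array([0.0, 1.0, 3.0, 7.0, 12.0, 20.0])
geo = np.abs(coords[:, None] - coords[None, :])
r, p_value = mantel(geo, 0.5 * geo)
```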

  2. Distance Education Clearinghouse Web Site.

    ERIC Educational Resources Information Center

    Adams, Kate; Martin, Sara

    A World Wide Web site, developed by University of Nebraska-Lincoln Information Services staff and funded by a NEB*SAT (Nebraska's multiple channel satellite and optical fiber educational telecommunications network) grant, provides a clearinghouse of distance education, Internet, and Web page development information that is useful to librarians and…

  3. Web-Browsing Competencies of Pre-Service Adult Facilitators: Implications for Curriculum Transformation and Distance Learning

    ERIC Educational Resources Information Center

    Theresa, Ofoegbu; Ugwu, Agboeze Matthias; Ihebuzoaju, Anyanwu Joy; Uche, Asogwa

    2013-01-01

    The study investigated the Web-browsing competencies of pre-service adult facilitators in the southeast geopolitical zone of Nigeria. Survey design was adopted for the study. The population consists of all pre-service adult facilitators in all the federal universities in the southeast geopolitical zone of Nigeria. Accidental sampling technique was…

  4. Adding Interactivity to Web Based Distance Learning.

    ERIC Educational Resources Information Center

    Cafolla, Ralph; Knee, Richard

    Web Based Distance Learning (WBDL) is a form of distance learning based on providing instruction mainly on the World Wide Web. This paradigm has limitations, especially the lack of interactivity inherent in the Web. The purpose of this paper is to discuss some of the technologies the authors have used in their courses at Florida Atlantic…

  5. Chapter 59: Web Services

    NASA Astrophysics Data System (ADS)

    Graham, M. J.

    Web services are a cornerstone of the distributed computing infrastructure that the VO is built upon, yet to the newcomer they can appear to be a black art. This perception is not helped by the miasma of technobabble that pervades the subject and the seemingly impenetrable high priesthood of actual users. In truth, however, there is nothing conceptually difficult about web services (unsurprisingly, any complexities lie in the implementation details), nor indeed anything particularly new. A web service is a piece of software available over a network, with a formal description of how it is called and what it returns that a computer can understand. Note that entities such as web servers, ftp servers and database servers do not generally qualify, as they lack this standardized description of their inputs and outputs. Prior technologies, such as RMI, CORBA, and DCOM, have employed a similar approach, but the success of web services lies predominantly in their use of standardized XML to provide a language-neutral way of representing data. In fact, the standardization goes further, as web services are traditionally (or as traditionally as five years will allow) tied to a specific set of technologies (WSDL and SOAP conveyed over HTTP with an XML serialization). Alternative implementations are becoming increasingly common and we will cover some of these here. One important thing to remember in all of this, though, is that web services are meant for use by computers and not humans (unlike web pages), which is why so much of the subject seems incomprehensible gobbledegook. In this chapter, we start with an overview of the web services currently in the VO and present a short guide on how to use and deploy a web service. We then review the different approaches to web services, particularly REST and SOAP, and alternatives to XML as a data format. We consider how web services can be formally described and discuss how advanced features such as security, state
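The chapter's definition (software on the network with a machine-readable contract for its calls and replies) is easiest to see in an actual message. Below is a sketch of a SOAP 1.1 request envelope; the ConeSearch operation and the urn:example:vo namespace are invented for illustration, and a real service's operation names and parameter types would come from its WSDL description:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_envelope(operation, params, ns="urn:example:vo"):
    """Wrap an operation call in a SOAP 1.1 envelope. Both sides can
    process this mechanically because the structure is standardized
    XML rather than free-form text."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{ns}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, f"{{{ns}}}{name}").text = str(value)
    return ET.tostring(env, encoding="unicode")

# A hypothetical VO-style positional query, serialized as XML.
msg = soap_envelope("ConeSearch", {"ra": 180.0, "dec": -30.0, "sr": 0.5})
```

In a deployed service this string would be POSTed over HTTP to the endpoint named in the WSDL, and the reply would arrive as another envelope.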

  6. Using Web-Based Distance Learning to Reduce Cultural Distance

    ERIC Educational Resources Information Center

    Wong, L. Fai; Trinidad, S. G.

    2004-01-01

    In recent years, Web-based distance learning (WBDL) systems have become a popular learning environment for many western learners. While it has been established as an effective learning alternative, WBDL is not flourishing in Hong Kong as expected. This paper proposes that this is because Hong Kong students are not trained to learn independently…

  7. Web Page Design in Distance Education

    ERIC Educational Resources Information Center

    Isman, Aytekin; Dabaj, Fahme; Gumus, Agah; Altinay, Fahriye; Altinay, Zehra

    2004-01-01

    Distance education is a contemporary educational process. It facilitates fast, easy delivery of information through its hardware and software tools. With the development of high technology, the Internet and web design have become effective delivery systems for students. Within the global perspective, even all the work…

  9. Web Services Integration on the Fly

    DTIC Science & Technology

    2008-12-01

    The Web Services Business Process Execution Language (WSBPEL) is a related technology addressing service orchestration. Web Services Choreography, described by the Web Services Choreography Description Language (WS-CDL), and Web Services Security are important related areas.

  10. The EMBRACE web service collection

    PubMed Central

    Pettifer, Steve; Ison, Jon; Kalaš, Matúš; Thorne, Dave; McDermott, Philip; Jonassen, Inge; Liaquat, Ali; Fernández, José M.; Rodriguez, Jose M.; Partners, INB-; Pisano, David G.; Blanchet, Christophe; Uludag, Mahmut; Rice, Peter; Bartaseviciute, Edita; Rapacki, Kristoffer; Hekkelman, Maarten; Sand, Olivier; Stockinger, Heinz; Clegg, Andrew B.; Bongcam-Rudloff, Erik; Salzemann, Jean; Breton, Vincent; Attwood, Teresa K.; Cameron, Graham; Vriend, Gert

    2010-01-01

    The EMBRACE (European Model for Bioinformatics Research and Community Education) web service collection is the culmination of a 5-year project that set out to investigate issues involved in developing and deploying web services for use in the life sciences. The project concluded that in order for web services to achieve widespread adoption, standards must be defined for the choice of web service technology, for semantically annotating both service function and the data exchanged, and a mechanism for discovering services must be provided. Building on this, the project developed: EDAM, an ontology for describing life science web services; BioXSD, a schema for exchanging data between services; and a centralized registry (http://www.embraceregistry.net) that collects together around 1000 services developed by the consortium partners. This article presents the current status of the collection and its associated recommendations and standards definitions. PMID:20462862

  11. Web Service: MedlinePlus

    MedlinePlus

    MedlinePlus Web Service (https://medlineplus.gov/webservices.html): MedlinePlus offers a search-based Web service that provides access to MedlinePlus health topic …

  12. RESTful Web Services at BNL

    SciTech Connect

    Casella, R.

    2011-06-14

    RESTful (REpresentational State Transfer) web services are an alternative implementation to SOAP/RPC web services in a client/server model. BNL's IT Division has started deploying RESTful Web Services for enterprise data retrieval and manipulation. The data is currently used by system administrators for tracking configuration information and, as the system is expanded, will be used by Cyber Security for vulnerability management and as an aid to cyber investigations. This talk will describe the implementation and outstanding issues, as well as some of the reasons for choosing RESTful over SOAP/RPC, and future directions.
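A rough sketch of the REST style the talk contrasts with SOAP/RPC: each record is an addressable resource, and a query is a plain GET URL, so any HTTP client can consume it without a SOAP toolkit. The endpoint and resource names below are invented for illustration; the abstract does not name BNL's actual services:

```python
from urllib.parse import urlencode

# Hypothetical inventory endpoint, for illustration only.
BASE = "https://inventory.example.bnl.gov/api"

def rest_url(resource, **filters):
    """Address a collection as a resource and express a query as
    URL parameters. Results of such GETs are cacheable by ordinary
    HTTP infrastructure, one practical advantage over SOAP/RPC."""
    qs = urlencode(sorted(filters.items()))
    return f"{BASE}/{resource}?{qs}" if qs else f"{BASE}/{resource}"

# e.g. "all Linux hosts in rack B12", as one self-describing URL
url = rest_url("hosts", os="linux", rack="B12")
```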

  13. Embracing a Customer Service Mindset: A Fresh Examination of Services for Distance Learners

    ERIC Educational Resources Information Center

    Steiner, Heidi

    2013-01-01

    Library literature and blogs frequently discuss customer service and user experience in physical libraries and Web sites, but little is said about this mentality toward services for distance learners specifically. This paper takes customer service best practices from well-known thinkers of the business world and makes connections to services for…

  14. The Uniframe .Net Web Service Discovery Service

    DTIC Science & Technology

    2003-06-27

    Four element types are used: business, service, binding, and specifications for the services. The UDDI businessEntity element represents business information. The componentTable stores the mapping of the Web services directories and the binding information necessary for clients to consume services.

  15. Semantic Web for Manufacturing Web Services

    SciTech Connect

    Kulvatunyou, Boonserm; Ivezic, Nenad

    2002-06-01

    As markets become unexpectedly turbulent with a shortened product life cycle and a power shift towards buyers, the need for methods to rapidly and cost-effectively develop products, production facilities and supporting software is becoming urgent. The use of a virtual enterprise plays a vital role in surviving turbulent markets. However, its success requires reliable and large-scale interoperation among trading partners via a semantic web of trading partners' services whose properties, capabilities, and interfaces are encoded in an unambiguous as well as computer-understandable form. This paper demonstrates a promising approach to integration and interoperation between a design house and a manufacturer by developing semantic web services for business and engineering transactions. To this end, detailed activity and information flow diagrams are developed, in which the two trading partners exchange messages and documents. The properties and capabilities of the manufacturer sites are defined using DARPA Agent Markup Language (DAML) ontology definition language. The prototype development of semantic webs shows that enterprises can widely interoperate in an unambiguous and autonomous manner; hence, virtual enterprise is realizable at a low cost.

  16. Using the Web for Distance Learning.

    ERIC Educational Resources Information Center

    Altekruse, Michael K.; Brew, Leah

    This chapter explores the pros and cons of Web-based instruction following the University of North Texas' decision to offer counseling courses on the Web. It seems that the most frequent arguments for Web-based instruction are its ability to reach students and its potential for course enrichment. The practice of providing live instruction on the…

  17. Past, Present, and Future of Web Service

    NASA Astrophysics Data System (ADS)

    Kitamura, Yasuhiko

    The World Wide Web was born as a means to provide information through the Internet. As e-shopping sites proliferated on the Internet, the Web came to provide not only information but also services with which users can interact to buy products. This paper describes the basic standards used in Web services (XML, SOAP, and WSDL) and how Web services are implemented on the Java-based platform Axis. It also covers REST-based Web services, which have recently gained attention. Finally, it forecasts the future of Web services from the viewpoint of the Semantic Web.

  18. Web-Based Communications, the Internet, and Distance Education. Readings in Distance Education, Number 7.

    ERIC Educational Resources Information Center

    Moore, Michael G., Ed.; Cozine, Geoffrey T., Ed.

    This book brings together a selection of articles published in "The American Journal of Distance Education" that are related to Web-based delivery of distance education. Articles include: "Performance and Perceptions of Distance Learners in Cyberspace" (Peter Navarro and Judy Shoemaker); "Distance Education for Dentists: Improving the Quality of…

  19. WEBCAP: Web Scheduler for Distance Learning Multimedia Documents with Web Workload Considerations

    ERIC Educational Resources Information Center

    Habib, Sami; Safar, Maytham

    2008-01-01

    In many web applications, such as the distance learning, the frequency of refreshing multimedia web documents places a heavy burden on the WWW resources. Moreover, the updated web documents may encounter inordinate delays, which make it difficult to retrieve web documents in time. Here, we present an Internet tool called WEBCAP that can schedule…

  20. Semantic Search of Web Services

    ERIC Educational Resources Information Center

    Hao, Ke

    2013-01-01

    This dissertation addresses semantic search of Web services using natural language processing. We first survey various existing approaches, focusing on the fact that the expensive costs of current semantic annotation frameworks result in limited use of semantic search for large scale applications. We then propose a vector space model based service…

  1. APPRIS WebServer and WebServices.

    PubMed

    Rodriguez, Jose Manuel; Carro, Angel; Valencia, Alfonso; Tress, Michael L

    2015-07-01

    This paper introduces the APPRIS WebServer (http://appris.bioinfo.cnio.es) and WebServices (http://apprisws.bioinfo.cnio.es). Both the web servers and the web services are based around the APPRIS Database, a database that presently houses annotations of splice isoforms for five different vertebrate genomes. The APPRIS WebServer and WebServices provide access to the computational methods implemented in the APPRIS Database, while the APPRIS WebServices also allow retrieval of the annotations. The APPRIS WebServer and WebServices annotate splice isoforms with protein structural and functional features, and with data from cross-species alignments. In addition, they can use the annotations of structure, function and conservation to select a single reference isoform for each protein-coding gene (the principal protein isoform). APPRIS principal isoforms have been shown to agree overwhelmingly with the main protein isoform detected in proteomics experiments. The APPRIS WebServer allows for the annotation of splice isoforms for individual genes, and provides a range of visual representations and tools to allow researchers to identify the likely effect of splicing events. The APPRIS WebServices permit users to generate annotations automatically in high-throughput mode and to interrogate the annotations in the APPRIS Database. The APPRIS WebServices have been implemented using REST architecture to be flexible, modular and automatic.

  2. Earth Science Mining Web Services

    NASA Astrophysics Data System (ADS)

    Pham, L. B.; Lynnes, C. S.; Hegde, M.; Graves, S.; Ramachandran, R.; Maskey, M.; Keiser, K.

    2008-12-01

    To allow scientists further capabilities in the area of data mining and web services, the Goddard Earth Sciences Data and Information Services Center (GES DISC) and researchers at the University of Alabama in Huntsville (UAH) have developed a system to mine data at the source without the need of network transfers. The system has been constructed by linking together several pre-existing technologies: the Simple Scalable Script-based Science Processor for Measurements (S4PM), a processing engine at the GES DISC; the Algorithm Development and Mining (ADaM) system, a data mining toolkit from UAH that can be configured in a variety of ways to create customized mining processes; ActiveBPEL, a workflow execution engine based on BPEL (Business Process Execution Language); XBaya, a graphical workflow composer; and the EOS Clearinghouse (ECHO). XBaya is used to construct an analysis workflow at UAH using ADaM components, which are also installed remotely at the GES DISC, wrapped as Web Services. The S4PM processing engine searches ECHO for data using space-time criteria, staging them to cache, allowing the ActiveBPEL engine to remotely orchestrate the processing workflow within S4PM. As mining is completed, the output is placed in an FTP holding area for the end user. The goals are to give users control over the data they want to process, while mining data at the data source using the server's resources rather than transferring the full volume over the internet. These diverse technologies have been infused into a functioning, distributed system with only minor changes to the underlying technologies. The key to this infusion is the loosely coupled, Web-Services-based architecture: All of the participating components are accessible (one way or another) through SOAP (Simple Object Access Protocol)-based Web Services.

  3. Earth Science Mining Web Services

    NASA Technical Reports Server (NTRS)

    Pham, Long; Lynnes, Christopher; Hegde, Mahabaleshwa; Graves, Sara; Ramachandran, Rahul; Maskey, Manil; Keiser, Ken

    2008-01-01

    To allow scientists further capabilities in the area of data mining and web services, the Goddard Earth Sciences Data and Information Services Center (GES DISC) and researchers at the University of Alabama in Huntsville (UAH) have developed a system to mine data at the source without the need of network transfers. The system has been constructed by linking together several pre-existing technologies: the Simple Scalable Script-based Science Processor for Measurements (S4PM), a processing engine at the GES DISC; the Algorithm Development and Mining (ADaM) system, a data mining toolkit from UAH that can be configured in a variety of ways to create customized mining processes; ActiveBPEL, a workflow execution engine based on BPEL (Business Process Execution Language); XBaya, a graphical workflow composer; and the EOS Clearinghouse (ECHO). XBaya is used to construct an analysis workflow at UAH using ADaM components, which are also installed remotely at the GES DISC, wrapped as Web Services. The S4PM processing engine searches ECHO for data using space-time criteria, staging them to cache, allowing the ActiveBPEL engine to remotely orchestrate the processing workflow within S4PM. As mining is completed, the output is placed in an FTP holding area for the end user. The goals are to give users control over the data they want to process, while mining data at the data source using the server's resources rather than transferring the full volume over the internet. These diverse technologies have been infused into a functioning, distributed system with only minor changes to the underlying technologies. The key to the infusion is the loosely coupled, Web-Services-based architecture: All of the participating components are accessible (one way or another) through SOAP (Simple Object Access Protocol)-based Web Services.

  4. Transactional Distance in World Wide Web Learning Environments.

    ERIC Educational Resources Information Center

    Chen, Yau-Jane

    2001-01-01

    Describes a study conducted at four Taiwan universities that measured the impact of variables on learners' perceived transactional distance in a World Wide Web learning environment. Examines learners' skill level with the Internet, previous experience in taking distance education courses, extent of interaction, and types of learner support.…

  5. Technical Services and the World Wide Web.

    ERIC Educational Resources Information Center

    Scheschy, Virginia M.

    The World Wide Web and browsers such as Netscape and Mosaic have simplified access to electronic resources. Today, technical services librarians can share in the wealth of information available on the Web. One of the premier Web sites for acquisitions librarians is AcqWeb, a cousin of the AcqNet listserv. In addition to interesting news items,…

  6. Similarity Based Semantic Web Service Match

    NASA Astrophysics Data System (ADS)

    Peng, Hui; Niu, Wenjia; Huang, Ronghuai

    Semantic web service discovery aims at returning the most closely matching advertised services to the service requester by comparing the semantics of the requested service with those of an advertised service. The semantics of a web service are described in terms of inputs, outputs, preconditions and results in the Ontology Web Language for Services (OWL-S), which is formalized by the W3C. In this paper we propose an algorithm that calculates the semantic similarity of two services by taking a weighted average of their input and output similarities. A case study and applications show the effectiveness of our algorithm in service matching.
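The weighted-averaging step described above can be sketched directly. The exact-match concept similarity below is a deliberate simplification standing in for the ontology-based similarity the paper computes over OWL-S descriptions, and the weights are illustrative:

```python
def service_similarity(request, advertised, w_in=0.5, w_out=0.5):
    """Score an advertised service against a request as a weighted
    average of input-side and output-side similarity. Each side's
    similarity is the fraction of requested concepts the advertised
    service covers (exact match = 1, otherwise 0)."""
    def side_sim(req_concepts, adv_concepts):
        if not req_concepts:
            return 1.0          # nothing requested: trivially satisfied
        covered = sum(1.0 for c in req_concepts if c in adv_concepts)
        return covered / len(req_concepts)

    return (w_in * side_sim(request["inputs"], advertised["inputs"])
            + w_out * side_sim(request["outputs"], advertised["outputs"]))
```

Ranking candidates by this score and returning the top matches mirrors the discovery loop the abstract describes.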

  7. The Organizational Role of Web Services

    ERIC Educational Resources Information Center

    Mitchell, Erik

    2011-01-01

    The workload of Web librarians is already split between Web-related and other library tasks. But today's technological environment has created new implications for existing services and new demands for staff time. It is time to reconsider how libraries can best allocate resources to provide effective Web services. Delivering high-quality services…

  8. Revisiting Transactional Distance Theory in a Context of Web-Based High-School Distance Education

    ERIC Educational Resources Information Center

    Murphy, Elizabeth Anne; Rodriguez-Manzanares, Maria Angeles

    2008-01-01

    The purpose of this paper is to report on a study that provided an opportunity to consider Transactional Distance Theory (TDT) in a current technology context of web-based learning in distance education (DE), high-school classrooms. Data collection relied on semi-structured interviews conducted with 22 e-teachers and managers in Newfoundland and…

  9. An Automatic Web Service Composition Framework Using QoS-Based Web Service Ranking Algorithm.

    PubMed

    Mallayya, Deivamani; Ramachandran, Baskaran; Viswanathan, Suganya

    2015-01-01

    Web service has become the technology of choice for service-oriented computing to meet the interoperability demands in web applications. In the Internet era, the exponential addition of web services nominates "quality of service" as an essential parameter for discriminating among web services. In this paper, a user-preference-based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and the QoS aspects of the web service. When the user's request cannot be fulfilled by a single atomic service, several existing services should be composed and delivered as a composition. The proposed framework allows the user to specify local and global constraints for composite web services, which improves flexibility. The UPWSR algorithm identifies the best-fit services for each task in the user request and, by choosing the number of candidate services for each task, reduces the time to generate the composition plans. To tackle the problem of web service composition, the QoS-aware automatic web service composition (QAWSC) algorithm proposed in this paper is based on the QoS aspects of the web services and user preferences. The proposed framework allows the user to provide feedback about the composite service, which improves the reputation of the services.
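QoS-based ranking of candidate services can be sketched as a preference-weighted sum over normalized attributes. The attribute names, the linear scoring, and the min-max normalization below are illustrative; they are not the paper's UPWSR algorithm itself:

```python
def rank_services(candidates, weights):
    """Sort candidate services best-first by a weighted sum of
    min-max-normalized QoS attributes. 'response_ms' is treated as
    a cost (lower is better), so its normalized value is inverted."""
    def score(svc):
        total = 0.0
        for attr, w in weights.items():
            vals = [c[attr] for c in candidates]
            lo, hi = min(vals), max(vals)
            norm = 0.0 if hi == lo else (svc[attr] - lo) / (hi - lo)
            if attr == "response_ms":
                norm = 1.0 - norm
            total += w * norm
        return total
    return sorted(candidates, key=score, reverse=True)
```

Picking the top-ranked candidate for each task in a request, subject to the user's local and global constraints, is the composition step the abstract outlines.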

  10. Discovery and Classification of Bioinformatics Web Services

    SciTech Connect

    Rocco, D; Critchlow, T

    2002-09-02

    The transition of the World Wide Web from a paradigm of static Web pages to one of dynamic Web services provides new and exciting opportunities for bioinformatics with respect to data dissemination, transformation, and integration. However, the rapid growth of bioinformatics services, coupled with non-standardized interfaces, diminishes the potential that these Web services offer. To face this challenge, we examine the notion of a Web service class that defines the functionality provided by a collection of interfaces. These descriptions are an integral part of a larger framework that can be used to discover, classify, and wrap Web services automatically. We discuss how this framework can be used in the context of the proliferation of sites offering BLAST sequence alignment services for specialized data sets.

  11. Storage Manager and File Transfer Web Services

    SciTech Connect

    William A Watson III; Ying Chen; Jie Chen; Walt Akers

    2002-07-01

    Web services are emerging as an interesting mechanism for a wide range of grid services, particularly those focused upon information services and control. When coupled with efficient data transfer services, they provide a powerful mechanism for building a flexible, open, extensible data grid for science applications. In this paper we present our prototype work on a Java Storage Resource Manager (JSRM) web service and a Java Reliable File Transfer (JRFT) web service. A Java client (Grid File Manager) on top of JSRM is developed to demonstrate the capabilities of these web services. The purpose of this work is to show the extent to which SOAP-based web services are an appropriate direction for building a grid-wide data management system, and eventually grid-based portals.

  12. An Automatic Web Service Composition Framework Using QoS-Based Web Service Ranking Algorithm

    PubMed Central

    Mallayya, Deivamani; Ramachandran, Baskaran; Viswanathan, Suganya

    2015-01-01

    Web service has become the technology of choice for service-oriented computing to meet the interoperability demands in web applications. In the Internet era, the exponential addition of web services nominates “quality of service” as an essential parameter for discriminating among web services. In this paper, a user-preference-based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and the QoS aspects of the web service. When the user's request cannot be fulfilled by a single atomic service, several existing services should be composed and delivered as a composition. The proposed framework allows the user to specify local and global constraints for composite web services, which improves flexibility. The UPWSR algorithm identifies the best-fit services for each task in the user request and, by choosing the number of candidate services for each task, reduces the time to generate the composition plans. To tackle the problem of web service composition, the QoS-aware automatic web service composition (QAWSC) algorithm proposed in this paper is based on the QoS aspects of the web services and user preferences. The proposed framework allows the user to provide feedback about the composite service, which improves the reputation of the services. PMID:26504894

  13. Acquiring Evolving Technologies: Web Services Standards

    DTIC Science & Technology

    2016-06-30

    Acquiring Evolving Technologies: Web Services Standards. Harry L. Levinson, Software Engineering Institute, Carnegie Mellon University, 2006.

  14. Distance Learning Library Services in Ugandan Universities

    ERIC Educational Resources Information Center

    Mayende, Jackline Estomihi Kiwelu; Obura, Constant Okello

    2013-01-01

    The study carried out at Makerere University and Uganda Martyrs University in 2010 aimed at providing strategies for enhanced distance learning library services in terms of convenience and adequacy. The study adopted a cross sectional descriptive survey design. The study revealed services provided in branch libraries in Ugandan universities were…

  15. Enriching the Web Processing Service

    NASA Astrophysics Data System (ADS)

    Wosniok, Christoph; Bensmann, Felix; Wössner, Roman; Kohlus, Jörn; Roosmann, Rainer; Heidmann, Carsten; Lehfeldt, Rainer

    2014-05-01

    The OGC Web Processing Service (WPS) provides a standard for implementing geospatial processes in service-oriented networks. In its current version 1.0.0 it defines the operations GetCapabilities, DescribeProcess and Execute, which can be used to offer custom processes based on single or multiple sub-processes. A large range of ready-to-use, fine-grained, fundamental geospatial processes have been developed by the GIS community in the past. However, modern use cases and whole workflow processes demand specifications of lifecycle management and service orchestration. Orchestrating smaller sub-processes is a task towards interoperability; comprehensive documentation through appropriate metadata is also required. Though different approaches were tested in the past, developing complex WPS applications still requires programming skills, knowledge about the software libraries in use, and a lot of integration effort. Our toolset RichWPS aims at providing a better overall experience by setting up two major components. The RichWPS ModelBuilder enables the graphics-aided design of workflow processes based on existing local and distributed processes and geospatial services. Once tested by the RichWPS Server, a composition can be deployed for production use on the RichWPS Server. The ModelBuilder obtains the necessary processes and services from a directory service, the RichWPS semantic proxy. It manages the lifecycle and is able to visualize results and debugging information. One aim is to generate reproducible results; the workflow should be documented by metadata that can be integrated in Spatial Data Infrastructures. The RichWPS Server provides a set of interfaces to the ModelBuilder for, among other things, testing composed workflow sequences, estimating their performance and publishing them as common processes. Therefore the server is oriented towards the upcoming WPS 2.0 standard and its ability to transactionally deploy and undeploy processes making use of a WPS
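The three WPS 1.0.0 operations named above are commonly exercised as key-value-pair GET requests. A minimal sketch of building such requests (the endpoint URL and process identifier are placeholders, not a real RichWPS deployment):

```python
from urllib.parse import urlencode

def wps_request(endpoint, operation, **extra):
    """Build a WPS 1.0.0 key-value-pair GET request URL. Every WPS
    request carries the service name, protocol version, and the
    operation being invoked; operation-specific keys are appended."""
    params = {"service": "WPS", "version": "1.0.0", "request": operation}
    params.update(extra)
    return endpoint + "?" + urlencode(params)

# Discover what the server offers, then ask about one process.
caps = wps_request("https://example.org/wps", "GetCapabilities")
desc = wps_request("https://example.org/wps", "DescribeProcess",
                   identifier="coastline.buffer")
```

An Execute request follows the same pattern with a dataInputs key (or a POST body for complex inputs).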

  16. Space Physics Data Facility Web Services

    NASA Technical Reports Server (NTRS)

    Candey, Robert M.; Harris, Bernard T.; Chimiak, Reine A.

    2005-01-01

    The Space Physics Data Facility (SPDF) Web services provide a distributed programming interface to a portion of the SPDF software. (A general description of Web services is available at http://www.w3.org/ and in many current software-engineering texts and articles focused on distributed programming.) The SPDF Web services distributed programming interface enables additional collaboration and integration of the SPDF software system with other software systems, in furtherance of the SPDF mission to lead collaborative efforts in the collection and utilization of space physics data and mathematical models. This programming interface conforms to all applicable Web services specifications of the World Wide Web Consortium. The interface is specified by a Web Services Description Language (WSDL) file. The SPDF Web services software consists of the following components: 1) A server program for implementation of the Web services; and 2) A software developer's kit that consists of a WSDL file, a less formal description of the interface, a Java class library (which further eases development of Java-based client software), and Java source code for an example client program that illustrates the use of the interface.

  17. Identifying orthoimages in Web Map Services

    NASA Astrophysics Data System (ADS)

    Florczyk, A. J.; Nogueras-Iso, J.; Zarazaga-Soria, F. J.; Béjar, R.

    2012-10-01

    Orthoimages are essential in many Web applications to facilitate the background context that helps to understand other georeferenced information. Catalogues and service registries of Spatial Data Infrastructures do not necessarily register all the services providing access to imagery data on the Web, and it is not easy to automatically identify whether the data offered by a Web service are directly imagery data or not. This work presents a method for an automatic detection of the orthoimage layers offered by Web Map Services. The method combines two types of heuristics. The first one consists in analysing the text in the capabilities document. The second type is content-based heuristics, which analyse the content offered by the Web Map Service layers. These heuristics gather and analyse the colour features of a sample collection of image fragments that represent the offered content. An experiment has been performed over a set of Web Map Service layers, which have been fetched from a repository of capabilities documents gathered from the Web. This has proven the efficiency of the method (precision of 87% and recall of 60%). This functionality has been offered as a Web Processing Service, and it has been integrated within the Virtual Spain project to provide a catalogue of orthoimages and build realistic 3D views.
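The two heuristic types described in the abstract can be sketched as follows; this is a minimal illustration in Python, and the keyword list, distinct-colour and spread thresholds are our own assumptions rather than the authors' actual parameters.

```python
from statistics import pstdev

# Text heuristic: flag a WMS layer whose capabilities metadata mentions
# imagery-related terms (illustrative keyword list, not the paper's).
IMAGERY_KEYWORDS = {"ortho", "orthophoto", "aerial", "satellite", "imagery"}

def text_heuristic(title, abstract=""):
    text = (title + " " + abstract).lower()
    return any(kw in text for kw in IMAGERY_KEYWORDS)

# Content heuristic: orthoimages show rich, varied colour, while thematic
# maps use a few flat colours.  Score a sample of RGB pixels by the number
# of distinct colours and the spread of their brightness (threshold values
# are assumptions chosen for illustration).
def content_heuristic(pixels, min_distinct=64, min_spread=20.0):
    distinct = len(set(pixels))
    spread = pstdev([sum(p) / 3 for p in pixels]) if len(pixels) > 1 else 0.0
    return distinct >= min_distinct and spread >= min_spread

def looks_like_orthoimage(title, abstract, pixels):
    # Either heuristic may flag the layer; a real system would combine
    # them with weights tuned on labelled examples.
    return text_heuristic(title, abstract) or content_heuristic(pixels)
```

A production detector would fetch sample tiles via WMS GetMap requests and decode them before applying the colour analysis; the sketch assumes the pixels are already available as RGB tuples.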

  18. Web Services and Related Works at CDS

    NASA Astrophysics Data System (ADS)

    Schaaff, A.

    2004-07-01

Started at CDS in 2002, the work around Web Services is now in a full exploitation phase. Several services are available via SOAP: the Sesame name resolver for Simbad-NED-VizieR, a GLU tag resolver, a UCD resolver, the UCD tag list, Aladin image access, VizieR catalogue access, etc. A portal is available that publishes all information about how to use the CDS XML Web Services, along with hints on how to start using XML Web Services (tutorial, links, etc.). Other work around XML Web Services is also ongoing at CDS and is described in this article.

  19. Enhancing UCSF Chimera through web services.

    PubMed

    Huang, Conrad C; Meng, Elaine C; Morris, John H; Pettersen, Eric F; Ferrin, Thomas E

    2014-07-01

    Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList.

  20. Improving query services of web map by web mining

    NASA Astrophysics Data System (ADS)

    Huang, Maojun

    2007-11-01

A web map is a hybrid of the map and the World Wide Web (known as the Web), usually created with WebGIS techniques. With rapid social development, web maps oriented towards the public face demands that current query services cannot satisfy. The geocoding database plays a key role in supporting query services effectively, but the traditional geocoding method is laborious and time-consuming. Meanwhile, much spatial information is available online and can serve as a supplementary source for geocoding. This paper therefore discusses how to improve query services by web mining. The improvement has three facets: first, improving location queries by discovering and extracting address information from the Web to extend the geocoding database; second, enhancing optimum-path queries for public transit and buffer queries through spatial analysis and reasoning on the extended geocoding database; third, adjusting data-collection strategies according to patterns discovered by mining web map queries. Finally, the paper presents the design of the application system and experimental results.

  1. Efficient Web Services Policy Combination

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh; Harman, Joseph G.

    2010-01-01

Large-scale Web security systems usually involve cooperation between domains with non-identical policies. The network management and Web communication software used by the different organizations presents a stumbling block. Many of the tools used by the various divisions do not have the ability to communicate network management data with each other. At best, this means that manual human intervention into the communication protocols used at various network routers and endpoints is required. Developing practical, sound, and automated ways to compose policies to bridge these differences is a long-standing problem. One of the key subtleties is the need to deal with inconsistencies and defaults where one organization proposes a rule on a particular feature, and another has a different rule or expresses no rule. A general approach is to assign priorities to rules and observe the rules with the highest priorities when there are conflicts. The present methods have an inherent inefficiency that heavily restricts their practical application. A new, efficient algorithm combines policies utilized for Web services. The method is based on an algorithm that allows automatic and scalable composition of security policies between multiple organizations. It builds on defeasible policy composition, a promising approach for finding conflicts and resolving priorities between rules. In the general case, policy negotiation is an intractable problem. A promising method, suggested in the literature, is to represent policies in defeasible logic and base composition on rules for non-monotonic inference. In this system, policy writers construct metapolicies describing both the policy that they wish to enforce and annotations describing their composition preferences. These annotations can indicate whether certain policy assertions are required by the policy writer or, if not, under what circumstances the policy writer is willing to compromise and allow other assertions to take

  2. Preservice Mathematics Teachers' Views on Distance Education and Their Web Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Cagirgan Gulten, Dilek

    2013-01-01

    This research aims to investigate primary preservice mathematics teachers' views on distance education and web pedagogical content knowledge in terms of the subscales of general web, communicative web, pedagogical web, web pedagogical content and attitude towards web based instruction. The research was conducted with 46 senior students in the…

  3. Transimulation - protein biosynthesis web service.

    PubMed

    Siwiak, Marlena; Zielenkiewicz, Piotr

    2013-01-01

Although translation is the key step during gene expression, it remains poorly characterized at the level of individual genes. For this reason, we developed Transimulation - a web service measuring translational activity of genes in three model organisms: Escherichia coli, Saccharomyces cerevisiae and Homo sapiens. The calculations are based on our previous computational model of translation and experimental data sets. Transimulation quantifies mean translation initiation and elongation time (expressed in SI units), and the number of proteins produced per transcript. It also approximates the number of ribosomes that typically occupy a transcript during translation, and simulates their propagation. The simulation of ribosomes' movement is interactive and allows modifying the coding sequence on the fly. It also enables uploading any coding sequence and simulating its translation in one of three model organisms. In such a case, ribosomes propagate according to mean codon elongation times of the host organism, which may prove useful for heterologous expression. Transimulation was used to examine evolutionary conservation of translational parameters of orthologous genes. Transimulation may be accessed at http://nexus.ibb.waw.pl/Transimulation (requires Java version 1.7 or higher). Its manual and source code, distributed under the GPL-2.0 license, are freely available at the website.
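As a toy illustration of the kind of quantity Transimulation computes, total elongation time can be estimated by summing mean per-codon elongation times of the host organism; the codon-time table below is invented for the example and is not the model's fitted parameters.

```python
# Mean elongation time per codon in seconds -- invented illustrative
# values, not the fitted parameters of the Transimulation model.
CODON_TIMES = {"AUG": 0.05, "GCU": 0.04, "AAA": 0.06, "UAA": 0.02}

def elongation_time(mrna, codon_times=CODON_TIMES, default=0.05):
    """Estimate total elongation time of a coding sequence by summing
    mean per-codon elongation times of the host organism."""
    if len(mrna) % 3 != 0:
        raise ValueError("coding sequence length must be a multiple of 3")
    codons = (mrna[i:i + 3] for i in range(0, len(mrna), 3))
    # Codons missing from the table fall back to a default mean time,
    # mirroring how a heterologous sequence would be handled.
    return sum(codon_times.get(c, default) for c in codons)
```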

  4. Web server for priority ordered multimedia services

    NASA Astrophysics Data System (ADS)

    Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund

    2001-10-01

In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions for CM services. The types of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content-addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of a distributed network with load-balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for improved disk access and higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS output is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority-ordered buffering of the retrieved Web pages and CM data streams, which are fed into an autoregressive moving average (ARMA) based traffic-shaping circuitry before being transmitted through the network.
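The six-level priority order above maps naturally onto a priority queue; a minimal sketch in Python, where the numeric level values and class name are our own illustrative choices, not the paper's.

```python
import heapq
import itertools

# Priority levels from highest to lowest, following the order given in
# the abstract (the numeric values themselves are our assumption).
PRIORITY = {
    "admin_rw": 0,
    "hot_multicast": 1,  # hot page CM and Web multicasting
    "cm_read": 2,
    "web_read": 3,
    "cm_write": 4,
    "web_write": 5,
}

class PriorityScheduler:
    """Serve requests strictly by priority level, FIFO within a level."""

    def __init__(self):
        self._heap = []
        self._order = itertools.count()  # tie-breaker keeps FIFO order

    def submit(self, kind, request):
        heapq.heappush(self._heap, (PRIORITY[kind], next(self._order), request))

    def next_request(self):
        return heapq.heappop(self._heap)[2]
```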

  5. Online Information Services. Caught in the Web?

    ERIC Educational Resources Information Center

    Green, Tim

    1995-01-01

    Provides brief reviews of the sites for several online services of the World Wide Web; the Web as a marketing tool and other aspects of interest to information professionals are highlighted. A sidebar presents information on accessing Internet locations, graphics, online forms, Telnet, saving, printing, mailing, and searching. (AEF)

  6. Domain-specific Web Service Discovery with Service Class Descriptions

    SciTech Connect

    Rocco, D; Caverlee, J; Liu, L; Critchlow, T J

    2005-02-14

    This paper presents DynaBot, a domain-specific web service discovery system. The core idea of the DynaBot service discovery system is to use domain-specific service class descriptions powered by an intelligent Deep Web crawler. In contrast to current registry-based service discovery systems--like the several available UDDI registries--DynaBot promotes focused crawling of the Deep Web of services and discovers candidate services that are relevant to the domain of interest. It uses intelligent filtering algorithms to match services found by focused crawling with the domain-specific service class descriptions. We demonstrate the capability of DynaBot through the BLAST service discovery scenario and describe our initial experience with DynaBot.

  7. Using ESO Reflex with Web Services

    NASA Astrophysics Data System (ADS)

Järveläinen, P.; Savolainen, V.; Oittinen, T.; Maisala, S.; Ullgrén, M.; Hook, R.

    2008-08-01

ESO Reflex is a prototype graphical workflow system, based on Taverna, and primarily intended to be a flexible way of running ESO data reduction recipes along with other legacy applications and user-written tools. ESO Reflex can also readily use the Taverna Web Services features that are based on the Apache Axis SOAP implementation. Taverna is a general-purpose Web Service client and requires no programming to use such services. However, Taverna also has some restrictions: for example, it has no numerical types such as integers. In addition, the preferred binding style is document/literal wrapped, but most astronomical services publish the Axis default WSDL using RPC/encoded style. Despite these minor limitations we have created a simple but very promising test VO workflow using the Sesame name resolver service at CDS Strasbourg, the Hubble SIAP server at the Multi-Mission Archive at Space Telescope (MAST) and the WESIX image cataloging and catalogue cross-referencing service at the University of Pittsburgh. ESO Reflex can also pass files and URIs via the PLASTIC protocol to visualisation tools and has its own viewer for VOTables. We picked these three Web Services to try to set up a realistic and useful ESO Reflex workflow. They also demonstrate ESO Reflex's ability to use many kinds of Web Services, because each of them requires a different interface. We describe each of these services in turn and comment on how it was used.

  8. Web Service Architecture Framework for Embedded Devices

    ERIC Educational Resources Information Center

    Yanzick, Paul David

    2009-01-01

    The use of Service Oriented Architectures, namely web services, has become a widely adopted method for transfer of data between systems across the Internet as well as the Enterprise. Adopting a similar approach to embedded devices is also starting to emerge as personal devices and sensor networks are becoming more common in the industry. This…

  9. New Interfaces to Web Documents and Services

    NASA Technical Reports Server (NTRS)

    Carlisle, W. H.

    1996-01-01

    This paper reports on investigations into how to extend capabilities of the Virtual Research Center (VRC) for NASA's Advanced Concepts Office. The work was performed as part of NASA's 1996 Summer Faculty Fellowship program, and involved research into and prototype development of software components that provide documents and services for the World Wide Web (WWW). The WWW has become a de-facto standard for sharing resources over the internet, primarily because web browsers are freely available for the most common hardware platforms and their operating systems. As a consequence of the popularity of the internet, tools, and techniques associated with web browsers are changing rapidly. New capabilities are offered by companies that support web browsers in order to achieve or remain a dominant participant in internet services. Because a goal of the VRC is to build an environment for NASA centers, universities, and industrial partners to share information associated with Advanced Concepts Office activities, the VRC tracks new techniques and services associated with the web in order to determine the their usefulness for distributed and collaborative engineering research activities. Most recently, Java has emerged as a new tool for providing internet services. Because the major web browser providers have decided to include Java in their software, investigations into Java were conducted this summer.

  10. A Strategic Model of Trust Management in Web Services

    NASA Astrophysics Data System (ADS)

    Sun, Junqing; Sun, Zhaohao; Li, Yuanzhe; Zhao, Shuliang

    This article examines trust and trust management in web services and proposes a multiagent model of trust relationship in web services. It looks at the hierarchical structure of trust management in web services and proposes a strategic model of trust management in web services. The proposed approach in this article will facilitate research and development of trust management in e-commerce, web services and social networking.

  11. Grid Enabled Geospatial Catalogue Web Service

    NASA Technical Reports Server (NTRS)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

A Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web Information Model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, and to the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and especially query on-demand data in the virtual community and get it back through data-related services which provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatial-enabled. It also allows researchers to focus on science, and not on issues with computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  12. Optimizing Web Service Composition While Enforcing Regulations

    NASA Astrophysics Data System (ADS)

    Sohrabi, Shirin; McIlraith, Sheila A.

    To direct automated Web service composition, it is compelling to provide a template, workflow or scaffolding that dictates the ways in which services can be composed. In this paper we present an approach to Web service composition that builds on work using AI planning, and more specifically Hierarchical Task Networks (HTNs), for Web service composition. A significant advantage of our approach is that it provides much of the how-to knowledge of a choreography while enabling customization and optimization of integrated Web service selection and composition based upon the needs of the specific problem, the preferences of the customer, and the available services. Many customers must also be concerned with enforcement of regulations, perhaps in the form of corporate policies and/or government regulations. Regulations are traditionally enforced at design time by verifying that a workflow or composition adheres to regulations. Our approach supports customization, optimization and regulation enforcement all at composition construction time. To maximize efficiency, we have developed novel search heuristics together with a branch and bound search algorithm that enable the generation of high quality compositions with the performance of state-of-the-art planning systems.
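The branch-and-bound idea can be illustrated with a toy composer that picks one service per task while pruning with an admissible lower bound; the task names, candidate services and costs below are invented for the example, and a real composer would also check preconditions, customer preferences and regulations at each step.

```python
# Branch-and-bound selection of one service per task, minimizing total
# cost.  Candidate services and costs are illustrative placeholders.
def compose(tasks, candidates):
    """tasks: list of task names.
    candidates: dict mapping task -> list of (service, cost) pairs."""
    best = {"cost": float("inf"), "plan": None}

    # Admissible lower bound: cheapest candidate for each remaining task.
    cheapest = {t: min(c for _, c in candidates[t]) for t in tasks}

    def search(i, plan, cost):
        # Bound: prune branches that cannot beat the incumbent plan.
        if cost + sum(cheapest[t] for t in tasks[i:]) >= best["cost"]:
            return
        if i == len(tasks):
            best["cost"], best["plan"] = cost, list(plan)
            return
        for service, c in candidates[tasks[i]]:
            plan.append(service)
            search(i + 1, plan, cost + c)
            plan.pop()

    search(0, [], 0)
    return best["plan"], best["cost"]
```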

  13. Designing Crop Simulation Web Service with Service Oriented Architecture Principle

    NASA Astrophysics Data System (ADS)

    Chinnachodteeranun, R.; Hung, N. D.; Honda, K.

    2015-12-01

Crop simulation models are efficient tools for simulating crop growth processes and yield. Running crop models requires data from various sources as well as time-consuming data processing, such as data quality checking and data formatting, before those data can be input to the model. This limits the use of crop modeling to crop modelers. We aim to make running crop models convenient for various users so that the utilization of crop models will expand, which will directly improve agricultural applications. As the first step, we developed a prototype that runs DSSAT on the Web, called Tomorrow's Rice (v. 1). It predicts rice yields based on a planting date, rice variety and soil characteristics using the DSSAT crop model. A user only needs to select a planting location on the Web GUI; the system then queries historical weather data from available sources and returns the expected yield. Currently, we are working on weather data connection via the Sensor Observation Service (SOS) interface defined by the Open Geospatial Consortium (OGC). Weather data can be automatically connected to a weather generator for generating weather scenarios for running the crop model. In order to expand these services further, we are designing a web service framework consisting of layers of web services to support composition and execution of crop simulations. This framework allows a third-party application to call and cascade each service as needed for data preparation and for running the DSSAT model using a dynamic web service mechanism. The framework has a module to manage data format conversion, which means users do not need to spend their time curating the data inputs. Dynamic linking of data sources and services is implemented using the Service Component Architecture (SCA). This agriculture web service platform demonstrates interoperability of weather data using the SOS interface, convenient connections between weather data sources and the weather generator, and connecting

  14. CMR Catalog Service for the Web

    NASA Technical Reports Server (NTRS)

    Newman, Doug; Mitchell, Andrew

    2016-01-01

    With the impending retirement of Global Change Master Directory (GCMD) Application Programming Interfaces (APIs) the Common Metadata Repository (CMR) was charged with providing a collection-level Catalog Service for the Web (CSW) that provided the same level of functionality as GCMD. This talk describes the capabilities of the CMR CSW API with particular reference to the support of the Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) Integrated Catalog (CWIC).

  15. Introducing the PRIDE Archive RESTful web services

    PubMed Central

    Reisinger, Florian; del-Toro, Noemi; Ternent, Tobias; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2015-01-01

    The PRIDE (PRoteomics IDEntifications) database is one of the world-leading public repositories of mass spectrometry (MS)-based proteomics data and it is a founding member of the ProteomeXchange Consortium of proteomics resources. In the original PRIDE database system, users could access data programmatically by accessing the web services provided by the PRIDE BioMart interface. New REST (REpresentational State Transfer) web services have been developed to serve the most popular functionality provided by BioMart (now discontinued due to data scalability issues) and address the data access requirements of the newly developed PRIDE Archive. Using the API (Application Programming Interface) it is now possible to programmatically query for and retrieve peptide and protein identifications, project and assay metadata and the originally submitted files. Searching and filtering is also possible by metadata information, such as sample details (e.g. species and tissues), instrumentation (mass spectrometer), keywords and other provided annotations. The PRIDE Archive web services were first made available in April 2014. The API has already been adopted by a few applications and standalone tools such as PeptideShaker, PRIDE Inspector, the Unipept web application and the Python-based BioServices package. This application is free and open to all users with no login requirement and can be accessed at http://www.ebi.ac.uk/pride/ws/archive/. PMID:25904633
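A client of such a REST API typically just builds parameterized URLs; the sketch below shows this for a hypothetical project search, where the `/project/list` path and parameter names are assumptions for illustration (only the base URL comes from the abstract; consult the API documentation for the exact endpoints).

```python
from urllib.parse import urlencode

# Base URL as given in the abstract.
BASE = "http://www.ebi.ac.uk/pride/ws/archive"

def project_search_url(query, species=None, page=0, show=10):
    """Build a PRIDE Archive project-search URL.  The '/project/list'
    path and the parameter names used here are illustrative assumptions,
    not a verified description of the live API."""
    params = {"query": query, "page": page, "show": show}
    if species is not None:
        params["speciesFilter"] = species
    return f"{BASE}/project/list?{urlencode(params)}"
```

The returned URL would then be fetched with any HTTP client and the JSON response parsed; no login is required according to the abstract.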

  16. Predicting Student Performance in Web-Based Distance Education Courses Based on Survey Instruments Measuring Personality Traits and Technical Skills

    ERIC Educational Resources Information Center

    Hall, Michael

    2008-01-01

Two common web-based surveys, "Is Online Learning Right for Me?" and "What Technical Skills Do I Need?", were combined into a single survey instrument and given to 228 on-campus and 83 distance education students. The students were enrolled in four different classes (business, computer information services, criminal justice, and…

  17. Enhancing Data Interoperability with Web Services

    NASA Astrophysics Data System (ADS)

    Shrestha, S. R.; Zimble, D. A.; Wang, W.; Herring, D.; Halpert, M.

    2014-12-01

    In an effort to improve data access and interoperability of climate and weather data, the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov and Climate Prediction Center (CPC) are exploring various platform solutions to enhance a user's ability to locate, preview, and acquire the data. The Climate.gov and CPC data team faces multiple challenges including the various kinds of data and formats, inconsistency of metadata records, variety of data service implementations, very large volumes of data and geographically distributed locations. We have created the Data Access and Interoperability project to design a web-based platform, where interoperability between systems can be leveraged to allow greater data discovery, access, visualization and delivery. In the interoperable data platform, systems can integrate with each other to support the synthesis of climate and weather data. Interoperability is the ability for users to discover the available climate and weather data, preview and interact with the data, and acquire the data in common digital formats through a simple web-based interface. The goal of the interoperable data platform is to leverage existing web services, implement the established standards and integrate with existing solutions across the earth sciences domain instead of creating new technologies. Towards this effort to improve the interoperability of the platform, we are collaborating with ESRI Inc. to provide climate and weather data via web services. In this presentation, we will discuss and demonstrate how to use ArcGIS to author RESTful based scientific web services using open standards. These web services are able to encapsulate the logic required to handle and describe scientific data through a variety of service types including, image, map, feature, geoprocessing, and their respective service methods. Combining these types of services and leveraging well-documented APIs, including the ArcGIS JavaScript API, we can afford to

  18. Focused Crawling of the Deep Web Using Service Class Descriptions

    SciTech Connect

    Rocco, D; Liu, L; Critchlow, T

    2004-06-21

Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.
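The notion of an SCD-based match can be approximated as set containment over the inputs and outputs a crawled service exposes; a minimal sketch, with field names invented for the example (DynaBot's actual SCD format is richer than this).

```python
# A service class description (SCD) approximated as the sets of input
# parameters and output fields a service in the class must expose.
# The field names below are illustrative, not DynaBot's actual format.
class ServiceClassDescription:
    def __init__(self, name, required_inputs, required_outputs):
        self.name = name
        self.required_inputs = set(required_inputs)
        self.required_outputs = set(required_outputs)

    def matches(self, service):
        """A crawled service matches the class when it exposes at least
        the required input parameters and output fields."""
        return (self.required_inputs <= set(service["inputs"])
                and self.required_outputs <= set(service["outputs"]))

# Toy SCD for the BLAST scenario mentioned in the companion abstract.
blast_scd = ServiceClassDescription(
    "BLAST", ["sequence"], ["alignment", "e_value"])
```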

  19. Altering Leadership Thinking and Organizational Behavior Through Web Services

    DTIC Science & Technology

    2010-04-01

    result. The proliferation of Web 2.0 services is enabling information sharing among employees and leaders. Regrettably, this level of information...investigates the relationship between Web services, commonly called Web 2.0 , and the influence these services wield on organizational behavior. To support the...infrastructure for those who can use Web 2.0 and other IT services. The data were sorted by officer, noncommissioned officer, and enlisted ranks for the

  20. A Web Services Data Analysis Grid

    SciTech Connect

    William A Watson III; Ian Bird; Jie Chen; Bryan Hess; Andy Kowalski; Ying Chen

    2002-07-01

    The trend in large-scale scientific data analysis is to exploit compute, storage and other resources located at multiple sites, and to make those resources accessible to the scientist as if they were a single, coherent system. Web technologies driven by the huge and rapidly growing electronic commerce industry provide valuable components to speed the deployment of such sophisticated systems. Jefferson Lab, where several hundred terabytes of experimental data are acquired each year, is in the process of developing a web-based distributed system for data analysis and management. The essential aspects of this system are a distributed data grid (site independent access to experiment, simulation and model data) and a distributed batch system, augmented with various supervisory and management capabilities, and integrated using Java and XML-based web services.

  1. User Needs of Digital Service Web Portals: A Case Study

    ERIC Educational Resources Information Center

    Heo, Misook; Song, Jung-Sook; Seol, Moon-Won

    2013-01-01

The authors examined the needs of digital information service web portal users. More specifically, the needs of Korean cultural portal users were examined as a case study. The conceptual framework treats a web-based portal as a complex, web-based service application with characteristics of both information systems and service agents. In…

  2. The Impact of Web Based Resource Material on Learning Outcome in Open Distance Higher Education

    ERIC Educational Resources Information Center

    Masrur, Rehana

    2010-01-01

One of the most powerful educational options in open and distance education is web-based learning. A blended (hybrid) course combines traditional face-to-face and web-based learning approaches in an educational environment that is nonspecific as to time and place. The study reported here investigated the impact of web based resource material…

  3. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented it as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map.
We have implemented the
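
    One of the diagnostic methods listed above, the time-lagged correlation map, can be sketched in plain Python. This is an illustration of the general technique, not CMDA's actual code; the toy series and the lag convention below are invented for the example.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

def lagged_correlation(x, y, max_lag):
    """Correlate x(t) against y(t + lag) for lag = -max_lag .. +max_lag.

    Scanning the correlation over a range of lags is what turns a
    plain correlation map into a time-lagged one.
    """
    r = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            xs, ys = x[:len(x) - lag], y[lag:]
        else:
            xs, ys = x[-lag:], y[:len(y) + lag]
        r[lag] = pearson(xs, ys)
    return r

# Toy series: y is x shifted by two steps (y[t] == x[t + 2]), so under
# this pairing convention the most positive correlation appears at lag = -2.
x = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0, 1.0, 2.0, 3.0, 2.0, 1.0]
y = x[2:] + [0.0, 1.0]
lags = lagged_correlation(x, y, 3)
best = max(lags, key=lags.get)
```

    Applied to gridded climate fields, the same scan is run per grid cell to produce a map of the lag at which two variables co-vary most strongly.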

  4. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same

  5. WSDRI-based Semantic Web Service Discovery Framework

    NASA Astrophysics Data System (ADS)

    Sun, Xu; Xu, Yanli; Mao, Mingrong; Dong, Ming

    In Web Service research [1, 2], semantic information should be discovered, selected, and composed automatically; such automation makes Web Services easier to use. In this paper we propose a framework to facilitate the discovery of Web Services. In this framework, we use WSDRI (Web Service Discovery Information) to describe the semantic information. The framework, which comprises a client and a Match Server, is based on WSDRI. We then evaluate the framework through an application on the Internet and find it to be effective. Following this framework, it becomes easy to discover information about Web Services, especially semantic information.

  6. Web Map Services for Hurricane Data

    NASA Astrophysics Data System (ADS)

    de La Beaujardiere, J.

    2005-12-01

    We have established several Web Map Services (WMS) that provide visualizations of numerical weather model data and satellite observations in support of NASA's Modeling, Analysis and Prediction Program 2005 Intensive Analysis (MAP05) Hurricane Project. The primary data component offered is the voluminous output (30GB/day) from the fourth-generation Goddard Earth Observing System (GEOS4) incorporating the finite-volume General Circulation Model (fvGCM). In addition, cloud imagery from the NASA MODIS and NOAA GOES sensors, and other ancillary datasets, are provided. We will discuss and demonstrate these servers using a simple web-based client. WMS is a web service interoperability specification developed by the Open Geospatial Consortium (OGC) and also approved for release as an International Standard (ISO 19128). WMS allows the user to request customized images of geospatial information, selecting exactly the dataset(s), times, geographic area and output size desired. A WMS client can combine output from multiple WMS servers, thus producing a unified view of information stored in a distributed network. Adoption of web services for access to hurricane data can be more efficient than older methods of data access. Specifically, instead of downloading huge datasets to a local workstation for analysis, the data can be stored on a remote server that generates visualizations of user-selected subsets of the data. In addition, a "clip-and-ship" capability can allow the user to visually inspect the data via WMS and then request numeric data values for only the subset of interest. Acknowledgements: Work supported by NASA Geoscience Interoperability Office (GIO) and NASA GSFC Software Integration and Visualization Office (SIVO). Satellite imagery provided by NASA GSFC Ocean Color Group and GOES Project Science Office.
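
    The customized-image request described above can be made concrete with a short sketch that composes a WMS GetMap URL. The parameter names follow the OGC WMS 1.3.0 standard; the base URL and layer name are placeholders for illustration, not a real NASA endpoint.

```python
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width, height, time=None):
    """Build a WMS 1.3.0 GetMap request URL.

    The client chooses exactly the layer, time, geographic area
    (BBOX) and output size it wants; the server renders only that.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": "CRS:84",  # longitude/latitude in WMS 1.3.0
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": "image/png",
    }
    if time:  # optional TIME dimension for forecast/observation steps
        params["TIME"] = time
    return base + "?" + urlencode(params)

# Hypothetical request for a wind layer over the Gulf of Mexico:
url = getmap_url("https://example.org/wms", "geos4_winds",
                 (-100.0, 10.0, -60.0, 40.0), 800, 600,
                 time="2005-08-28T12:00:00Z")
```

    Because every request names its own subset, a client can overlay layers from several WMS servers without ever downloading the underlying model output.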

  7. Distance Learning: Information Access and Services for Virtual Users.

    ERIC Educational Resources Information Center

    Iyer, Hemalata, Ed.

    This volume centers broadly on information support services for distance education. The articles in this book can be categorized into two areas: access to information resources for distance learners, and studies of distance learning programs. Contents include: "The Challenges and Benefits of Asynchronous Learning Networks" (Daphne…

  8. Using EMBL-EBI services via Web interface and programmatically via Web Services

    PubMed Central

    Lopez, Rodrigo; Cowley, Andrew; Li, Weizhong; McWilliam, Hamish

    2015-01-01

    The European Bioinformatics Institute (EMBL-EBI) provides access to a wide range of databases and analysis tools that are of key importance in bioinformatics. As well as providing Web interfaces to these resources, Web Services are available using SOAP and REST protocols that enable programmatic access to our resources and allow their integration into other applications and analytical workflows. This unit describes the various options available to a typical researcher or bioinformatician who wishes to use our resources via Web interface or programmatically via a range of programming languages. PMID:25501941
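
    The programmatic access described above can be sketched with the standard library alone. The URL below follows the published pattern of EBI's Dbfetch REST interface; consult the current EMBL-EBI documentation for the supported databases, formats and parameters, as the specific values here are only an example.

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Base address of EBI's Dbfetch service (see the EBI docs for details).
DBFETCH = "https://www.ebi.ac.uk/Tools/dbfetch/dbfetch"

def dbfetch_url(db, accession, fmt="fasta"):
    """Compose a Dbfetch REST URL for a single database entry."""
    return DBFETCH + "?" + urlencode(
        {"db": db, "id": accession, "format": fmt, "style": "raw"})

def fetch(db, accession, fmt="fasta"):
    """Retrieve the entry over HTTP (requires network access)."""
    with urlopen(dbfetch_url(db, accession, fmt)) as resp:
        return resp.read().decode()

# Example: a UniProtKB entry in FASTA format (accession is illustrative).
url = dbfetch_url("uniprotkb", "P12345")
```

    The same URL works from a browser, from curl, or from any language with an HTTP client, which is exactly the point of offering REST access alongside the Web interface.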

  9. A web service based tool to plan atmospheric research flights

    NASA Astrophysics Data System (ADS)

    Rautenhaus, M.; Dörnbrack, A.

    2012-04-01

    We present a web service based tool for the planning of atmospheric research flights. The tool, which we call the "Mission Support System" (MSS), provides online access to horizontal maps and vertical cross-sections of numerical weather prediction data and in particular allows the interactive design of a flight route in direct relation to the predictions. It thereby fills a crucial gap in the set of currently available tools for using data from numerical atmospheric models for research flight planning. A distinct feature of the tool is its lightweight, web service based architecture, requiring only commodity hardware and a basic Internet connection for deployment. Access to visualisations of prediction data is achieved by using an extended version of the Open Geospatial Consortium Web Map Service (WMS) standard. With the WMS approach, we avoid the transfer of large forecast model output datasets while enabling on-demand generated visualisations of the predictions at campaign sites with limited Internet bandwidth. Usage of the Web Map Service standard also enables access to third-party sources of georeferenced data. The MSS is focused on the primary needs of mission scientists responsible for planning a research flight, addressing in particular the following requirements: (1) interactive exploration of available atmospheric forecasts, (2) interactive flight planning in relation to these forecasts, (3) computation of expected flight performance to assess the technical feasibility (in terms of total distance and vertical profile) of a flight, (4) no transfer of large forecast data files to the campaign site to allow deployment at remote locations and (5) low demand on hardware resources. We have implemented the software using the open-source programming language Python.

  10. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use
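
    The transformation described here, wrapping an existing science routine in a thin web-service layer, can be illustrated with the standard library's WSGI machinery (CMDA itself uses Flask, Gunicorn and Tornado; the function and parameter names below are invented for the sketch).

```python
import json
from urllib.parse import parse_qs
from wsgiref.util import setup_testing_defaults

def seasonal_mean(values, season_length=3):
    """Stand-in 'science code': mean of the first season_length values."""
    chunk = values[:season_length]
    return sum(chunk) / len(chunk)

def app(environ, start_response):
    """Minimal WSGI wrapper: query parameters in, JSON out."""
    qs = parse_qs(environ.get("QUERY_STRING", ""))
    values = [float(v) for v in qs.get("v", [])]
    body = json.dumps({"mean": seasonal_mean(values)}).encode()
    start_response("200 OK", [("Content-Type", "application/json")])
    return [body]

# Exercise the wrapper in-process, without starting an HTTP server:
environ = {}
setup_testing_defaults(environ)
environ["QUERY_STRING"] = "v=1.0&v=2.0&v=3.0"
captured = {}
def start_response(status, headers):
    captured["status"] = status
result = json.loads(b"".join(app(environ, start_response)).decode())
```

    A framework like Flask replaces the hand-written `app` callable with routing and request parsing, but the shape of the wrapper - parse inputs, call the science function, serialize the result - stays the same.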

  11. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is one of the most commonly used spatial analyses. Many online map providers, such as Google Maps, Bing Maps and Yahoo Maps, present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when using them. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and these services search only by address matching based on descriptive data. In addition, there are limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides several capabilities for users, such as the ability to search multi-part addresses, to search places based on their location, to represent results as non-point features, and to display search results in priority order.
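
    The fuzzy machinery in this approach can be sketched in a few lines: a linear membership function for "near" and a minimum-operator overlay that combines several distance maps. The breakpoint distances and the choice of the min operator are illustrative assumptions, not values taken from the paper.

```python
def nearness(distance_m, full=200.0, zero=2000.0):
    """Fuzzy membership for 'near a place': 1.0 within `full` metres,
    falling linearly to 0.0 at `zero` metres."""
    if distance_m <= full:
        return 1.0
    if distance_m >= zero:
        return 0.0
    return (zero - distance_m) / (zero - full)

def fuzzy_overlay(memberships):
    """AND-style overlay: a candidate location is only as good as its
    worst constraint (minimum operator over the fuzzy distance maps)."""
    return min(memberships)

# Candidate location: 300 m from a school, 1500 m from a park.
score = fuzzy_overlay([nearness(300.0), nearness(1500.0)])
```

    Ranking candidates by this overlay score is what lets results be displayed by priority rather than as a flat list of exact address matches.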

  12. Web Map Services (WMS) Global Mosaic

    NASA Technical Reports Server (NTRS)

    Percivall, George; Plesea, Lucian

    2003-01-01

    The WMS Global Mosaic provides access to imagery of the global landmass using an open standard for web mapping. The seamless image is a mosaic of Landsat 7 scenes; geographically-accurate with 30 and 15 meter resolutions. By using the OpenGIS Web Map Service (WMS) interface, any organization can use the global mosaic as a layer in their geospatial applications. Based on a trade study, an implementation approach was chosen that extends a previously developed WMS hosting a Landsat 5 CONUS mosaic developed by JPL. The WMS Global Mosaic supports the NASA Geospatial Interoperability Office goal of providing an integrated digital representation of the Earth, widely accessible for humanity's critical decisions.

  13. Synthetic seismogram web service and Python tools

    NASA Astrophysics Data System (ADS)

    Heimann, Sebastian; Cesca, Simone; Kriegerowski, Marius; Dahm, Torsten

    2014-05-01

    Many geophysical methods require knowledge of Green's functions (GF) or synthetic seismograms over ranges of source and receiver coordinates. Examples include synthetic seismogram generation, moment tensor inversion, the modeling of depth phases for regional and teleseismic earthquakes, or the modeling of pressure-diffusion-induced static displacement and strain. Calculation of Green's functions is a computationally expensive operation and it can be advantageous to calculate them in advance: the same Green's function traces can then be reused several or many times as required in a typical application. Regarding Green's function computation as an independent step in a use-case's processing chain encourages storing them in an application-independent form. They can then be shared between different applications and passed to other researchers, e.g. via a web service. Starting now, we provide such a web service to the seismological community (http://kinherd.org/), where a researcher can share Green's function stores and retrieve synthetic seismograms for various point and extended earthquake source models, for many different earth models, at local, regional and global scale. This web service is part of a rich new toolset for the creation and handling of Green's functions and synthetic seismograms (http://emolch.github.com/pyrocko/gf). It can be used off-line or in client mode. Its core features are: greatly simplified generation of Green's function stores; support for various codes for Green's function computation; an extensible Green's function storage format; flexible spatial indexing of Green's functions; integrated travel-time computation; support for other types of Green's functions, e.g. poro-elastic GFs; and implementation in Python.

  14. UkrVO astronomical WEB services

    NASA Astrophysics Data System (ADS)

    Mazhaev, O. E.

    2017-02-01

    Ukraine Virtual Observatory (UkrVO) has been a member of the International Virtual Observatory Alliance (IVOA) since 2011. The virtual observatory (VO) is not a magic solution to all problems of data storing and processing, but it provides certain standards for building the infrastructure of an astronomical data center. The astronomical databases help data mining and offer users easy access to observation metadata, images within the celestial sphere, and results of image processing. The astronomical web services (AWS) of UkrVO give users handy tools for data selection from large astronomical catalogues for a relatively small region of interest in the sky. Examples of AWS usage are shown.

  15. Creating Web Services from Community Sourced Data

    NASA Astrophysics Data System (ADS)

    Siegel, D.; Scopel, C.; Boghici, E.

    2013-12-01

    In order to extend the World Hydro Basemap and build watershed delineation and river tracing services that cover the entire planet, we are integrating community-contributed data into a global hydrographic dataset. This dataset is the engine behind a foundational set of tools and services intended to enable hydrologic analysis on the web. However, each organization that collects hydrography uses a workflow and data model unique to their mission, which makes synthesizing their data difficult. Furthermore, these data are collected at different resolutions, so running analytics across regions with multiple contributors is not necessarily valid. Thus, instead of merging contributed data into a seamless geodatabase, the goal of our Community Maps for Hydrology program is to create workflows for converting any arbitrary dataset into the Arc Hydro Data Model. This way, tools and services can be pointed towards different contributions interchangeably while still maintaining the autonomy of each dataset. Contributors retain ownership of their data and are responsible for updates and edits, but the tools and services work identically across all contributions. HydroSHEDs data, contributed by the World Wildlife Fund, is used at the smallest scales to ensure global coverage, and national datasets extend our services to the medium-scales where available. A workflow to incorporate LIDAR and other large scale data is being developed as well, so that local governments and engineering companies can contribute to the program.

  16. Model Driven Development of Web Services and Dynamic Web Services Composition

    DTIC Science & Technology

    2005-01-01

    Web Services (WS) has emerged as a new component-based software development paradigm in a network-centric environment based on the Service Oriented...tion of legacy distributed software system toward WS applications; 2) the innovation of new infrastructure, and languages in support of WS application ...approach is presented toward reengineering legacy software systems to WS applications, rather than rewriting the whole legacy software system from

  17. OGC Web Services standards by example : the European Seismic Portal

    NASA Astrophysics Data System (ADS)

    Frobert, L.; Kamb, L.; Trani, L.; Spinuso, A.; Bossu, R.; Van Eck, T.

    2011-12-01

    NERIES (2006-2010) was an Integrated Infrastructure Initiative (I3) project in the Sixth Framework Program (FP6) of the European Commission (EC), aiming at networking the European seismic networks, improving access to data, allowing access to specific seismic infrastructures and pursuing targeted research developing the next generation of tools for improved service and data analysis. During this project, a web portal was developed that used web services to access data and a visual web application to display them. However, these web services did not conform to any standard, making them difficult to consume from any new user interface. Therefore, for the NERA project, the follow-up of NERIES, we have proposed the use of web service standards to access our data. We have decided to use standards defined by the Open Geospatial Consortium (OGC). The OGC defines standards for web service interfaces to access geo-tagged data. The events and seismic stations are also geo-tagged, making these web services suitable for our purpose. Using standard web services gives us the opportunity to distribute our data to all consumers conformant to these standards, through various programming languages and applications. We have implemented a preliminary version of web services conforming to the Web Map Service (WMS) and Web Feature Service (WFS) standards to access our catalog of seismic events (nearly 200 000 events). To visualize them we have made four example demos on our web site using different technologies (Adobe Flash, JavaScript, Java with NASA World Wind, and uDig, a desktop GIS application). In the future we hope to implement other OGC web service standards, such as: - Sensor Observation Service (SOS) to provide seismic waveform records; - Web Notification Service (WNS); - Catalog Service for the Web (CSW) to provide a search engine over all our web services; - Web Processing Service (WPS) to process data between different services. The power of the use of OGC standards is the easy

  18. A web services choreography scenario for interoperating bioinformatics applications

    PubMed Central

    de Knikker, Remko; Guo, Youjun; Li, Jin-long; Kwan, Albert KH; Yip, Kevin Y; Cheung, David W; Cheung, Kei-Hoi

    2004-01-01

    Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: 1) the platforms on which the applications run are heterogeneous, 2) their web interface is not machine-friendly, 3) they use a non-standard format for data input and output, 4) they do not exploit standards to define application interface and message exchange, and 5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH Keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate the capability of machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard coded Java application, Collaxa BPEL Server and Taverna Workbench. The Java program functions as a web services engine and interoperates with these web

  19. Solving the Problem of Promoting Distance Library Services

    ERIC Educational Resources Information Center

    Wyss, Paul Alan

    2007-01-01

    Promoting services is a conundrum for any organization. This is especially true for an academic library promoting distance library services. Systems thinking, process mapping, team learning, and diffusion of information practices offer ways of thinking about promoting services that help those involved find novel ways to approach promoting distance…

  20. Persistence and availability of Web services in computational biology.

    PubMed

    Schultheiss, Sebastian J; Münch, Marc-Christian; Andreeva, Gergana D; Rätsch, Gunnar

    2011-01-01

    We have conducted a study on the long-term availability of bioinformatics Web services: an observation of 927 Web services published in the annual Nucleic Acids Research Web Server Issues between 2003 and 2009. We found that 72% of Web sites are still available at the published addresses; only 9% of services are completely unavailable. Older addresses often redirect to new pages. We checked the functionality of all available services: for 33%, we could not test functionality because there was no example data or a related problem; 13% were truly no longer working as expected; we could positively confirm functionality only for 45% of all services. Additionally, we conducted a survey among 872 Web Server Issue corresponding authors; 274 replied. 78% of all respondents indicate their services have been developed solely by students and researchers without a permanent position. Consequently, these services are in danger of falling into disrepair after the original developers move to another institution, and indeed, for 24% of services, there is no plan for maintenance, according to the respondents. We introduce a Web service quality scoring system that correlates with the number of citations: services with a high score are cited 1.8 times more often than low-scoring services. We have identified key characteristics that are predictive of a service's survival, providing reviewers, editors, and Web service developers with the means to assess or improve Web services. A Web service conforming to these criteria receives more citations and provides more reliable service for its users. The most effective way of ensuring continued access to a service is a persistent Web address, offered either by the publishing journal, or created on the authors' own initiative, for example at http://bioweb.me. The community would benefit the most from a policy requiring any source code needed to reproduce results to be deposited in a public repository.
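
    The availability buckets used in such a survey can be mimicked with a tiny classifier. The category names paraphrase the study; the probe data, status-code thresholds and the example-data flag below are invented for illustration, and no network access is attempted.

```python
def classify(status, has_example_data):
    """Bucket one service: available, redirected, untestable or
    unavailable. Note that a 200 response alone does not prove the
    service still works as expected."""
    if status is None:          # no HTTP response at all
        return "unavailable"
    if 300 <= status < 400:     # old address redirects to a new page
        return "redirected"
    if status == 200:
        return "available" if has_example_data else "untestable"
    return "unavailable"

# Simulated probe results: (HTTP status or None, example data present?)
probes = [(200, True), (200, False), (301, True), (None, False), (200, True)]
counts = {}
for status, example in probes:
    bucket = classify(status, example)
    counts[bucket] = counts.get(bucket, 0) + 1
```

    Running such a probe periodically is the cheap half of the study; confirming functionality still requires example inputs and expected outputs for each service.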

  1. A web service for service composition to aid geospatial modelers

    NASA Astrophysics Data System (ADS)

    Bigagli, L.; Santoro, M.; Roncella, R.; Mazzetti, P.

    2012-04-01

    The identification of appropriate mechanisms for process reuse, chaining and composition is considered a key enabler for the effective uptake of a global Earth Observation infrastructure, currently pursued by the international geospatial research community. In the Earth and Space Sciences, such a facility could primarily enable integrated and interoperable modeling, for which several approaches have been proposed and developed over recent years. In fact, GEOSS is specifically tasked with the development of the so-called "Model Web". At increasing levels of abstraction and generalization, the initial stove-pipe software tools have evolved to community-wide modeling frameworks, to Component-Based Architecture solutions, and, more recently, started to embrace Service-Oriented Architecture technologies, such as the OGC WPS specification and the WS-* stack of W3C standards for service composition. However, so far, the level of abstraction seems too low for implementing the Model Web vision, and far too complex technological aspects must still be addressed by both providers and users, resulting in limited usability and, eventually, difficult uptake. In line with the recent ICT trend of resource virtualization, it has been suggested that users in need of a particular processing capability, required by a given modeling workflow, may benefit from outsourcing the composition activities to an external first-class service, according to the Composition as a Service (CaaS) approach. A CaaS system provides the necessary interoperability service framework for adaptation, reuse and complementation of existing processing resources (including models and geospatial services in general) in the form of executable workflows. This work introduces the architecture of a CaaS system, as a distributed information system for creating, validating, editing, storing, publishing, and executing geospatial workflows. This way, the users can be freed from the need of a composition infrastructure and

  2. The Effects of Personality Type on Web-Based Distance Learning

    ERIC Educational Resources Information Center

    Bishop-Clark, Cathy; Dietz-Uhler, Beth; Fisher, Amy

    2007-01-01

    Web-based distance learning is a relatively new approach in higher education which is gaining in popularity. Because a Web-based classroom is so different than a traditional face-to-face classroom, the variables that influence success or satisfaction with such a course may be different than those in a face-to-face course. We investigated whether…

  3. Synchronous Distance Education: Using Web-Conferencing in an MBA Accounting Course

    ERIC Educational Resources Information Center

    Ellingson, Dee Ann; Notbohm, Matthew

    2012-01-01

    Online distance education can take many forms, from a correspondence course with materials online to fully synchronous, live instruction. This paper describes a fully synchronous, live format using web-conferencing. Some useful features of web-conferencing and the way they are employed in this course are described. Instructor observations and…

  4. The Web, the Millennium, and the Digital Evolution of Distance Education.

    ERIC Educational Resources Information Center

    Leonard, David C.

    1999-01-01

    Discusses Industrial and Digital Age educational models, needs, and expectations of adult and traditional learners for Internet-based education; knowledge management and its impact on technical communication; the Universal Campus Network and the nature of Web-based education in the near future; elements for success for Web-based distance education…

  5. Effectiveness of Learning Process Using "Web Technology" in the Distance Learning System

    ERIC Educational Resources Information Center

    Killedar, Manoj

    2008-01-01

    The Web is a globally distributed, yet highly personalized, medium for cost-effective delivery of multimedia information and services. The Web is expected to have a strong impact on almost every aspect of how we learn. "Total Quality" is the totality of features, as perceived by the customers of the product or service. Totality of features…

  6. Library Services to Distance Learners in the Commonwealth: A Reader.

    ERIC Educational Resources Information Center

    Watson, Elizabeth F., Ed.; Jagannathan, Neela, Ed.

    The provision of good library services is a crucial factor in determining the quality of distance education. This collection of articles acquaints readers with distance librarianship as it is practiced in developed and developing countries throughout the British Commonwealth. The reader includes: "Introduction" (Michael Wooliscroft);…

  7. BioSWR--semantic web services registry for bioinformatics.

    PubMed

    Repchevsky, Dmitry; Gelpi, Josep Ll

    2014-01-01

Despite the variety of available Web services registries aimed specifically at the Life Sciences, their scope is usually restricted to a limited set of well-defined service types. While dedicated registries are generally tied to a particular format, general-purpose ones adhere more closely to standards and usually rely on the Web Service Definition Language (WSDL). Although WSDL is flexible enough to support common Web service types, its lack of semantic expressiveness led to various initiatives to describe Web services via ontology languages. Nevertheless, WSDL 2.0 descriptions gained a standard representation based on the Web Ontology Language (OWL). BioSWR is a novel Web services registry that provides standard Resource Description Framework (RDF) based Web service descriptions along with the traditional WSDL based ones. The registry provides a Web-based interface for Web service registration, querying and annotation, and is also accessible programmatically via a Representational State Transfer (REST) API or using the SPARQL Protocol and RDF Query Language. The BioSWR server is located at http://inb.bsc.es/BioSWR/ and its code is available at https://sourceforge.net/projects/bioswr/ under the LGPL license.
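A registry that exposes its descriptions as RDF can be queried with SPARQL over plain HTTP. The sketch below builds such a GET request; the endpoint path and the WSDL/RDF vocabulary terms are illustrative assumptions, not confirmed details of BioSWR's actual API.

```python
from urllib.parse import urlencode

# Hypothetical SPARQL endpoint URL; BioSWR's real endpoint may differ.
ENDPOINT = "http://inb.bsc.es/BioSWR/sparql"

# Illustrative query using the W3C WSDL-to-RDF mapping vocabulary to list
# registered services and their endpoints (terms assumed for illustration).
QUERY = """
PREFIX wsdl: <http://www.w3.org/ns/wsdl-rdf#>
SELECT ?service ?endpoint WHERE {
  ?service a wsdl:Service ;
           wsdl:endpoint ?endpoint .
} LIMIT 10
"""

def sparql_get_url(endpoint, query):
    """Build a SPARQL-protocol GET request URL for a read-only query."""
    return endpoint + "?" + urlencode(
        {"query": query, "format": "application/sparql-results+json"})

url = sparql_get_url(ENDPOINT, QUERY)
print(url[:80])
```

Fetching the URL with any HTTP client would return JSON-encoded result bindings under the SPARQL protocol.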

  8. Web-Based Course Management and Web Services

    ERIC Educational Resources Information Center

    Mandal, Chittaranjan; Sinha, Vijay Luxmi; Reade, Christopher M. P.

    2004-01-01

    The architecture of a web-based course management tool that has been developed at IIT [Indian Institute of Technology], Kharagpur and which manages the submission of assignments is discussed. Both the distributed architecture used for data storage and the client-server architecture supporting the web interface are described. Further developments…

  9. Web services for distributed and interoperable hydro-information systems

    NASA Astrophysics Data System (ADS)

    Horak, J.; Orlik, A.; Stromsky, J.

    2007-06-01

Web services support the integration and interoperability of Web-based applications and enable machine-to-machine interaction. The concepts of web services and open distributed architecture were applied to the development of T-DSS, the prototype customised for web based hydro-information systems. T-DSS provides mapping services, database related services and access to remote components, with special emphasis placed on output flexibility (e.g. multilingualism), where SOAP web services are mainly used for communication. The remote components are represented above all by remote data and mapping services (e.g. meteorological predictions), modelling and analytical systems (currently HEC-HMS, Modflow and additional utilities), which support decision making in water management.

  10. Web services for distributed and interoperable hydro-information systems

    NASA Astrophysics Data System (ADS)

    Horak, J.; Orlik, A.; Stromsky, J.

    2008-03-01

    Web services support the integration and interoperability of Web-based applications and enable machine-to-machine interaction. The concepts of web services and open distributed architecture were applied to the development of T-DSS, the prototype customised for web based hydro-information systems. T-DSS provides mapping services, database related services and access to remote components, with special emphasis placed on the output flexibility (e.g. multilingualism), where SOAP web services are mainly used for communication. The remote components are represented above all by remote data and mapping services (e.g. meteorological predictions), modelling and analytical systems (currently HEC-HMS, MODFLOW and additional utilities), which support decision making in water management.

  11. An Exploration of Cultural Value Orientations in Distance Education Web Marketing

    ERIC Educational Resources Information Center

    DeGaetano, Lora A.

    2013-01-01

    In the current global environment, universities seek to attract international students. The low enrollment of international students at a particular distance education institution demonstrated the competitive challenge in attracting international students. Distance education web marketing communications may be a factor influencing low enrollment…

  12. Empirical Investigation into Motives for Choosing Web-Based Distance Learning Programs

    ERIC Educational Resources Information Center

    Alkhattabi, Mona

    2016-01-01

    Today, in association with rapid social and economic changes, there is an increasing level of demand for distance and online learning programs. This study will focus on identifying the main motivational factors for choosing a web-based distance-learning program. Moreover, it will investigate how these factors relate to age, gender, marital status…

  13. Innovation in Open & Distance Learning: Successful Development of Online and Web-Based Learning.

    ERIC Educational Resources Information Center

    Lockwood, Fred, Ed.; Gooley, Anne, Ed.

    This book contains 19 papers examining innovation in open and distance learning through development of online and World Wide Web-based learning. The following papers are included: "Innovation in Distributed Learning: Creating the Environment" (Fred Lockwood); "Innovation in Open and Distance Learning: Some Lessons from Experience…

  14. Dimensions of Transactional Distance in the World Wide Web Learning Environment: A Factor Analysis.

    ERIC Educational Resources Information Center

    Chen, Yau-Jane

    2001-01-01

    Discussion of the effectiveness of distance education focuses on Moore's Theory of Transactional Distance and a study of learners' experiences with the World Wide Web in a university course in Taiwan. Highlights include learner-instructor interaction; learner-content interaction; learner-learner interaction; learner-interface interaction; results…

  15. Experience using web services for biological sequence analysis

    PubMed Central

    Attwood, Teresa; Chohan, Shahid Nadeem; Côté, Richard; Cudré-Mauroux, Philippe; Falquet, Laurent; Fernandes, Pedro; Finn, Robert D.; Hupponen, Taavi; Korpelainen, Eija; Labarga, Alberto; Laugraud, Aurelie; Lima, Tania; Pafilis, Evangelos; Pagni, Marco; Pettifer, Steve; Phan, Isabelle; Rahman, Nazim

    2008-01-01

Programmatic access to data and tools through the web using so-called web services has an important role to play in bioinformatics. In this article, we discuss the most popular approaches, based on SOAP/WS-I and REST, and describe our experiences, as a cross-section of the community, with providing and using web services in the context of biological sequence analysis. We briefly review the main technological approaches as well as best-practice hints that are useful for both users and developers. Finally, syntactic and semantic data integration issues with multiple web services are discussed. PMID:18621748
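The contrast between the two styles discussed above can be shown on one operation, "fetch a sequence record". The endpoint, operation and parameter names below are hypothetical, chosen only to illustrate the shape of each approach.

```python
from urllib.parse import urlencode

# REST style: the resource is addressed by a URL; parameters travel in the
# query string, and a plain GET retrieves the representation.
rest_url = ("http://example.org/seqdb/sequence?"
            + urlencode({"id": "P12345", "format": "fasta"}))

# SOAP style: the operation is an XML element wrapped in a standard envelope,
# POSTed to a single service endpoint (namespace and operation invented).
soap_body = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <getSequence xmlns="http://example.org/seqdb">
      <id>P12345</id>
      <format>fasta</format>
    </getSequence>
  </soap:Body>
</soap:Envelope>"""

print(rest_url)
```

The REST request is self-describing in the URL, while the SOAP request carries its structure (and typing, via WSDL) in the XML payload.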

  16. Interactive Distance Education for In-Service Teachers in India.

    ERIC Educational Resources Information Center

    Sharma, Santosh

    2000-01-01

    Discusses interactive television technologies that are under development and experimentation in India for in-service teacher education at the Indira Gandhi National Open University. Describes the VSAT (Very Small Aperture Terminal) system and ISDN (Integrated Services Digital Network) that are used for video technology in distance education.…

  17. Web 2.0 Strategy in Libraries and Information Services

    ERIC Educational Resources Information Center

    Byrne, Alex

    2008-01-01

    Web 2.0 challenges libraries to change from their predominantly centralised service models with integrated library management systems at the hub. Implementation of Web 2.0 technologies and the accompanying attitudinal shifts will demand reconceptualisation of the nature of library and information service around a dynamic, ever changing, networked,…

  18. Density estimation of small-mammal populations using a trapping web and distance sampling methods

    USGS Publications Warehouse

    Anderson, David R.; Burnham, Kenneth P.; White, Gary C.; Otis, David L.

    1983-01-01

Distance sampling methodology is adapted to enable animal density (number per unit of area) to be estimated from capture-recapture and removal data. A trapping web design provides the link between capture data and distance sampling theory. The estimator of density is D = M_(t+1) · f(0), where M_(t+1) is the number of individuals captured and f(0) is computed from the M_(t+1) distances from the web center to the traps in which those individuals were first captured. It is possible to check qualitatively the critical assumption on which the web design and the estimator are based. This is a conceptual paper outlining a new methodology, not a definitive investigation of the best specific way to implement this method. Several alternative sampling and analysis methods are possible within the general framework of distance sampling theory; a few alternatives are discussed and an example is given.
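As a numeric illustration of the estimator quoted in the abstract, the sketch below assumes a half-normal detection function so that f(0) has a closed form computable from the first-capture distances. Both the distances and the detection model are invented for illustration and do not reproduce the paper's full analysis.

```python
import math

# Hypothetical first-capture distances from the web center (arbitrary units).
distances = [1.2, 0.8, 2.5, 1.9, 0.4, 3.1, 1.5, 2.2, 0.9, 1.7]
m = len(distances)  # M_(t+1): number of individuals captured

# Under an assumed half-normal detection function g(x) = exp(-x^2 / (2 sigma^2)),
# the pdf of observed distances has f(0) = sqrt(2 / (pi sigma^2)), and the MLE
# of sigma^2 is the mean of the squared distances.
sigma2 = sum(d * d for d in distances) / m
f0 = math.sqrt(2.0 / (math.pi * sigma2))

density = m * f0  # D = M_(t+1) * f(0), as stated in the abstract
print(round(f0, 4), round(density, 4))
```

Real analyses would also select among detection models and quantify the variance of the estimate.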

  19. Compression-based aggregation model for medical web services.

    PubMed

    Al-Shammary, Dhiah; Khalil, Ibrahim

    2010-01-01

Many organizations such as hospitals have adopted Cloud Web services in their network services to avoid investing heavily in computing infrastructure. SOAP (Simple Object Access Protocol), an XML-based protocol, is the basic communication protocol of Cloud Web services. Web services often suffer congestion and bottlenecks as a result of the high network traffic caused by the large XML overhead. At the same time, the massive load on Cloud Web services, in terms of the large volume of client requests, leads to the same problem. In this paper, two XML-aware aggregation techniques based on compression concepts are proposed in order to aggregate medical Web messages and achieve greater message size reduction.
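The paper's techniques are XML-aware, but the underlying observation, that verbose and repetitive SOAP payloads compress extremely well, can be demonstrated with ordinary gzip. The envelope content below is invented.

```python
import gzip

# A toy SOAP envelope with many similar records; real medical-service
# messages would be far larger, which is where XML overhead hurts most.
soap = (
    '<?xml version="1.0"?>'
    '<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">'
    '<soap:Body>'
    + "".join(f"<patient><id>{i}</id><status>stable</status></patient>"
              for i in range(50))
    + '</soap:Body></soap:Envelope>'
).encode()

compressed = gzip.compress(soap)
ratio = len(compressed) / len(soap)
print(len(soap), len(compressed), round(ratio, 2))
```

XML-aware schemes go further than generic compression by exploiting the shared structure across aggregated messages, but even this generic pass removes most of the redundancy.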

  20. Optimizing QoS-Aware Semantic Web Service Composition

    NASA Astrophysics Data System (ADS)

    Lécué, Freddy

Ranking and optimization of web service compositions are some of the most interesting challenges at present. Since web services can be enhanced with formal semantic descriptions, forming "semantic web services", it becomes conceivable to exploit the quality of semantic links between the services of any composition as one of the optimization criteria. To this end we propose to use the semantic similarities between the output and input parameters of web services. Coupling this with other criteria, such as quality of service (QoS), allows us to rank and optimize compositions achieving the same goal. Here we suggest an innovative and extensible optimization model designed to balance semantic fit (or functional quality) with non-functional QoS metrics. To allow the use of this model in the context of a large number of services, as foreseen by the strategic EC-funded project SOA4All, we propose and test the use of Genetic Algorithms.
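The core of such a model is a fitness function that weighs semantic-link quality against QoS attributes; in the paper this fitness would drive a Genetic Algorithm over many candidates. The sketch below ranks three candidate compositions with invented names, weights and QoS values.

```python
# Candidate compositions: "semantic" scores how well each service's outputs
# match the next service's inputs; the other fields are QoS attributes.
# All values are invented for illustration.
candidates = {
    "comp_A": {"semantic": 0.90, "response_ms": 120, "availability": 0.990},
    "comp_B": {"semantic": 0.70, "response_ms": 60,  "availability": 0.995},
    "comp_C": {"semantic": 0.95, "response_ms": 300, "availability": 0.900},
}

def fitness(c, w_sem=0.5, w_rt=0.3, w_av=0.2, rt_max=500.0):
    """Weighted sum of normalized criteria; higher is better.
    Response time is inverted so that faster compositions score higher."""
    return (w_sem * c["semantic"]
            + w_rt * (1.0 - c["response_ms"] / rt_max)
            + w_av * c["availability"])

ranked = sorted(candidates, key=lambda name: fitness(candidates[name]),
                reverse=True)
print(ranked)
```

A Genetic Algorithm would use exactly this kind of fitness to evaluate populations of compositions when exhaustive ranking is infeasible.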

  1. A VR-Based Shared Web System for Distance Education

    ERIC Educational Resources Information Center

    Shih, Timothy K.; Chang, Ya-Fung; Hsu, Hun-Hui; Wang, Ying-Hong; Chen, Yung-Hui

    2004-01-01

    Distance education has been an important research issue of multimedia computing and communication. Since the instructional activities are implemented on cyberspace, how to control behaviors of students and to increase the degree of communication awareness have been a challenging issue. We propose a system based on the scaffolding theory. Behaviors…

  2. Enhancing the AliEn Web Service Authentication

    NASA Astrophysics Data System (ADS)

    Zhu, Jianlin; Saiz, Pablo; Carminati, Federico; Betev, Latchezar; Zhou, Daicui; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Grigoras, Costin; Furano, Fabrizio; Schreiner, Steffen; Vladimirovna Datskova, Olga; Sankar Banerjee, Subho; Zhang, Guoping

    2011-12-01

Web Services are an XML-based technology that allows applications to communicate with each other across disparate systems, and they are becoming the de facto standard enabling interoperability between heterogeneous processes and systems. AliEn2 is a grid environment based on web services. The AliEn2 services can be divided into three categories: Central services, deployed once per organization; Site services, deployed at each participating center; and Job Agents, running automatically on the worker nodes. A security model to protect these services is essential for the whole system. Current web server implementations, such as Apache, are not suitable for use within the grid environment: Apache with mod_ssl and OpenSSL supports only X.509 certificates, whereas in the grid environment the common credential is the proxy certificate, used to provide restricted proxying and delegation. An authentication framework was developed for the AliEn2 web services to give the Apache Web Server the ability to accept both X.509 certificates and proxy certificates from the client side. The authentication framework also allows the generation of access control policies to limit access to the AliEn2 web services.

  3. Constraint Web Service Composition Based on Discrete Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Fang, Xianwen; Fan, Xiaoqin; Yin, Zhixiang

Web service composition provides an open, standards-based approach for connecting web services together to create higher-level business processes. The standards are designed to reduce the complexity of composing web services, hence reducing time and costs and increasing overall business efficiency. This paper presents optimization methods for web service composition under independent global constraints, based on Discrete Particle Swarm Optimization (DPSO) and the associate Petri net (APN). Exploiting the properties of the APN, an efficient DPSO algorithm is presented that searches for a legal firing sequence in the APN model. Using legal firing sequences of the Petri net greatly shrinks the search space of the DPSO-based service composition. Finally, a simulation experiment compares our method with approximation methods. Theoretical analysis and experimental results indicate that this method achieves both lower computation cost and a higher success ratio of service composition.

  4. Rule-based semantic web services matching strategy

    NASA Astrophysics Data System (ADS)

    Fan, Hong; Wang, Zhihua

    2011-12-01

With the development of Web services technology, the number of services increases rapidly, and it becomes a challenging task to efficiently discover, from a large-scale service library, the services that exactly match a user's requirements. Many semantic Web service discovery technologies proposed in the recent literature focus only on keyword-based or shallow semantics-based service matching. This paper studies a rule-based, rule-reasoning service matching algorithm against the background of a large-scale service library. First, formal descriptions of semantic web services and of service matching are presented. Service matches are divided into four levels: Exact, Plugin, Subsume and Fail, and their formal descriptions are also given. Then, service matching is treated as a rule-based reasoning problem: a set of match rules is given, the set of related services is retrieved from a service ontology base through rule-based reasoning, and the matching levels are determined by distinguishing the relationships between a service's I/O and the I/O of the user's request. Finally, experiments based on two service sets show that the proposed service matching strategy easily implements smart service discovery and obtains high service discovery efficiency in comparison with the traditional global traversal strategy.
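The four match levels named above are conventionally derived from subsumption between the advertised and requested concepts. The sketch below uses a toy concept hierarchy and one common convention for which direction of subsumption counts as Plugin versus Subsume; conventions differ between papers, and all concept names are invented.

```python
# Toy concept hierarchy: child -> parent. A real matcher would reason over
# an ontology rather than a hand-written dictionary.
parent = {
    "ProteinSequence": "Sequence",
    "DNASequence": "Sequence",
    "Sequence": "Data",
}

def ancestors(concept):
    """All concepts that subsume the given concept in the toy hierarchy."""
    seen = set()
    while concept in parent:
        concept = parent[concept]
        seen.add(concept)
    return seen

def match_level(advertised, requested):
    """Classify a service/request concept pair into the four levels."""
    if advertised == requested:
        return "Exact"
    if advertised in ancestors(requested):   # advertised subsumes requested
        return "Plugin"
    if requested in ancestors(advertised):   # requested subsumes advertised
        return "Subsume"
    return "Fail"

print(match_level("Sequence", "Sequence"),
      match_level("Sequence", "ProteinSequence"),
      match_level("ProteinSequence", "Sequence"),
      match_level("DNASequence", "ProteinSequence"))
```

Rule-based matchers encode exactly these subsumption tests as inference rules, which is what lets them avoid a global traversal of the service library.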

  5. Spatial Data Web Services Pricing Model Infrastructure

    NASA Astrophysics Data System (ADS)

    Ozmus, L.; Erkek, B.; Colak, S.; Cankurt, I.; Bakıcı, S.

    2013-08-01

The General Directorate of Land Registry and Cadastre (TKGM), the leader in the field of cartography, largely continues its missions, which are: to keep and update the land registry and cadastre system of the country under the responsibility of the treasury; to perform transactions related to real estate; and to establish the Turkish national spatial information system. TKGM, a public agency, has completed many projects, such as: Continuously Operating GPS Reference Stations (TUSAGA-Aktif), Geo-Metadata Portal (HBB), Orthophoto-Base Map Production and web services, Completion of Initial Cadastre, Cadastral Renovation Project (TKMP), Land Registry and Cadastre Information System (TAKBIS), Turkish National Spatial Data Infrastructure Project (TNSDI), and Ottoman Land Registry Archive Information System (TARBIS). TKGM provides updated maps and map information not only to public institutions but also to the wider society, in the name of social responsibility principles. Turkish National Spatial Data Infrastructure activities were started by the motivation of Circular No. 2003/48, declared by the Turkish Prime Ministry in 2003 within the context of the e-Transformation of Turkey Short-term Action Plan. Action No. 47 in that action plan states that "a feasibility study shall be made in order to establish the Turkish National Spatial Data Infrastructure", with responsibility given to the General Directorate of Land Registry and Cadastre. The feasibility report of the NSDI was completed on 10 December 2010. After the decision of the Steering Committee, the feasibility report was sent to the Development Bank (formerly the State Planning Organization) for further evaluation. There are two main arrangements related to this project (feasibility report). First, there is now only one Ministry, the Ministry of Environment and Urbanism, responsible for the establishment, operation and all national-level activities of the NSDI. The second arrangement concerns the institutional level.
The

  6. Processing biological literature with customizable Web services supporting interoperable formats.

    PubMed

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk.

  7. Processing biological literature with customizable Web services supporting interoperable formats

    PubMed Central

    Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia

    2014-01-01

    Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. PMID:25006225

  8. The impact of web services at the IRIS DMC

    NASA Astrophysics Data System (ADS)

    Weekly, R. T.; Trabant, C. M.; Ahern, T. K.; Stults, M.; Suleiman, Y. Y.; Van Fossen, M.; Weertman, B.

    2015-12-01

The IRIS Data Management Center (DMC) has served the seismological community for nearly 25 years. In that time we have offered data and information from our archive using a variety of mechanisms, ranging from email-based requests to desktop applications, web applications and web services. Of these, web services have quickly become the primary method for data extraction at the DMC. In 2011, the first full year of operation, web services accounted for over 40% of the data shipped from the DMC. In 2014, over 450 TB of data was delivered directly to users through web services, representing nearly 70% of all shipments from the DMC that year. In addition to handling requests directly from users, the DMC switched all data extraction methods to use web services in 2014. On average the DMC now handles between 10 and 20 million requests per day submitted to web service interfaces. The rapid adoption of web services is attributed to the many advantages they bring. For users, they provide on-demand data using an interface technology, HTTP, that is widely supported in nearly every computing environment and language. These characteristics, combined with human-readable documentation and existing tools, make integration of data access into existing workflows relatively easy. For the DMC, web services provide an abstraction layer over internal repositories, allowing for concentrated optimization of extraction workflows and easier evolution of those repositories. Lending further support to the DMC's push in this direction, the core web services for station metadata, time series data and event parameters were adopted as standards by the International Federation of Digital Seismograph Networks (FDSN). We expect to continue enhancing existing services and building new capabilities for this platform. For example, the DMC has created a federation system and tools allowing researchers to discover and collect seismic data from data centers running the FDSN-standardized services. A future capability
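Because the FDSN-standardized services mentioned above are plain HTTP, a request is just a URL with query parameters. The sketch below assembles a station-metadata query; the parameter names follow the FDSN web-service convention, while the specific network and station values are only examples.

```python
from urllib.parse import urlencode

# Base URL of the FDSN station service at the IRIS DMC; consult the FDSN
# web-service specification for the full parameter list.
BASE = "http://service.iris.edu/fdsnws/station/1/query"

params = {
    "net": "IU",         # network code (example value)
    "sta": "ANMO",       # station code (example value)
    "level": "channel",  # depth of metadata to return
    "format": "text",
}

url = BASE + "?" + urlencode(params)
print(url)
```

Fetching this URL from any HTTP client, browser or script returns the metadata directly, which is precisely why such services integrate easily into existing workflows.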

  9. Storage Viability and Optimization Web Service

    SciTech Connect

    Stadler, Michael; Marnay, Christ; Lai, Judy; Siddiqui, Afzal; Limpaitoon, Tanachai; Phan, Trucy; Megel, Olivier; Chang, Jessica; DeForest, Nicholas

    2010-10-11

Non-residential sectors offer many promising applications for electrical storage (batteries) and photovoltaics (PVs). However, choosing and operating storage under complex tariff structures poses a daunting technical and economic problem that may discourage potential customers and result in lost carbon and economic savings. Equipment vendors are unlikely to provide adequate environmental analysis or unbiased economic results to potential clients, and are even less likely to completely describe the robustness of choices in the face of changing fuel prices and tariffs. Given these considerations, researchers at Lawrence Berkeley National Laboratory (LBNL) have designed the Storage Viability and Optimization Web Service (SVOW): a tool that helps building owners, operators and managers to decide if storage technologies and PVs merit deeper analysis. SVOW is an open-access, web-based energy storage and PV analysis calculator, accessible by secure remote login. Upon first login, the user sees an overview of the parameters: load profile, tariff, technologies, and solar radiation location. Each parameter has a pull-down list of possible predefined inputs and users may upload their own as necessary. Since the non-residential sectors encompass a broad range of facilities with fundamentally different characteristics, the tool starts by asking users to select a load profile from a limited cohort group of example facilities. The example facilities are categorized according to their North American Industry Classification System (NAICS) code. After the load profile selection, users select a predefined tariff or use the widget to create their own. The technologies and solar radiation menus operate in a similar fashion. After these four parameters have been entered, users select an optimization setting as well as an optimization objective. The analytic engine of SVOW is LBNL's Distributed Energy Resources Customer Adoption Model (DER-CAM), which is a mixed

  10. Mining biological pathways using WikiPathways web services.

    PubMed

    Kelder, Thomas; Pico, Alexander R; Hanspers, Kristina; van Iersel, Martijn P; Evelo, Chris; Conklin, Bruce R

    2009-07-30

    WikiPathways is a platform for creating, updating, and sharing biological pathways [1]. Pathways can be edited and downloaded using the wiki-style website. Here we present a SOAP web service that provides programmatic access to WikiPathways that is complementary to the website. We describe the functionality that this web service offers and discuss several use cases in detail. Exposing WikiPathways through a web service opens up new ways of utilizing pathway information and assisting the community curation process.

  11. Web services for ecosystem services management and poverty alleviation

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Baez, S.; Veliz Rosas, C.

    2011-12-01

Over the last decades, near real-time environmental observation, technical advances in computer power and cyber-infrastructure, and the development of environmental software algorithms have increased dramatically. The integration of these evolutions is one of the major challenges of the next decade for the environmental sciences. Worldwide, many coordinated activities are ongoing to make this integration a reality. However, far less attention is paid to the question of how these developments can benefit environmental services management in a poverty alleviation context. Such projects are typically faced with large predictive uncertainties, limited resources, and limited local scientific capacity. At the same time, the complexity of the socio-economic contexts requires a strongly bottom-up-oriented and interdisciplinary approach to environmental data collection and processing. Here, we present the results of two projects on integrated environmental monitoring and scenario analysis aimed at poverty alleviation in the Peruvian Andes and Amazon. In the upper Andean highlands, farmers are monitoring the water cycle of headwater catchments to analyse the impact of land-use changes on stream flow and potential consequences for downstream irrigation. In the Amazon, local communities are monitoring the dynamics of turtle populations and their relations with river levels. In both cases, the use of online databases and web processing services enables real-time analysis of the data and scenario analysis. The system provides both physical and social indicators to assess the impact of land-use management options on local socio-economic development.

  12. Ethical Issues in Providing Library Services to Distance Learners

    ERIC Educational Resources Information Center

    Needham, Gill; Johnson, Kay

    2007-01-01

    The authors, library practitioners from either side of the Atlantic Ocean, embarked on a dialogue about the ethical challenges encountered in providing library services to distance learners. Unable to find an existing, appropriate ethical framework for their discussion, they agreed to devise their own, informed by relevant professional codes and…

  13. Going, going, still there: using the WebCite service to permanently archive cited web pages.

    PubMed

    Eysenbach, Gunther; Trudel, Mathieu

    2005-12-30

Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full-text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. WebCite can also process publisher-submitted "citing articles" (submitted, for example, as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, retrospectively caching the references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations.
The resulting WebCite Index may also have applications for research

  14. A Web Service and Interface for Remote Electronic Device Characterization

    ERIC Educational Resources Information Center

    Dutta, S.; Prakash, S.; Estrada, D.; Pop, E.

    2011-01-01

    A lightweight Web Service and a Web site interface have been developed, which enable remote measurements of electronic devices as a "virtual laboratory" for undergraduate engineering classes. Using standard browsers without additional plugins (such as Internet Explorer, Firefox, or even Safari on an iPhone), remote users can control a Keithley…

  15. Communicating data quality through Web Map Services

    NASA Astrophysics Data System (ADS)

    Blower, Jon; Roberts, Charles; Griffiths, Guy; Lewis, Jane; Yang, Kevin

    2013-04-01

    The sharing and visualization of environmental data through spatial data infrastructures is becoming increasingly common. However, information about the quality of data is frequently unavailable or presented in an inconsistent fashion. ("Data quality" is a phrase with many possible meanings but here we define it as "fitness for purpose" - therefore different users have different notions of what constitutes a "high quality" dataset.) The GeoViQua project (www.geoviqua.org) is developing means for eliciting, formatting, discovering and visualizing quality information using ISO and Open Geospatial Consortium (OGC) standards. Here we describe one aspect of the innovations of the GeoViQua project. In this presentation, we shall demonstrate new developments in using Web Map Services to communicate data quality at the level of datasets, variables and individual samples. We shall outline a new draft set of conventions (known as "WMS-Q"), which describe a set of rules for using WMS to convey quality information (OGC draft Engineering Report 12-160). We shall demonstrate these conventions through new prototype software, based upon the widely-used ncWMS software, that applies these rules to enable the visualization of uncertainties in raster data such as satellite products and the results of numerical simulations. Many conceptual and practical issues have arisen from these experiments. How can source data be formatted so that a WMS implementation can detect the semantic links between variables (e.g. the links between a mean field and its variance)? The visualization of uncertainty can be a complex task - how can we provide users with the power and flexibility to choose an optimal strategy? How can we maintain compatibility (as far as possible) with existing WMS clients? We explore these questions with reference to existing standards and approaches, including UncertML, NetCDF-U and Styled Layer Descriptors.
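
    A practical building block in such experiments is the standard WMS 1.3.0 GetMap request, issued once for a data layer and once for its companion uncertainty layer. The sketch below only constructs request URLs; the endpoint and the layer names (`analysed_sst`, `analysed_sst_uncertainty`) are hypothetical, and the pairing convention is an illustration of the WMS-Q idea, not the draft specification itself.

```python
from urllib.parse import urlencode

def getmap_url(base_url, layer, bbox, width=512, height=256,
               crs="CRS:84", fmt="image/png"):
    # Assemble a standard WMS 1.3.0 GetMap query string for one layer.
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
        "TRANSPARENT": "TRUE",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical ncWMS endpoint; one request for the mean field, one for
# the uncertainty field a WMS-Q-aware client would pair with it.
base = "http://example.org/ncWMS/wms"
world = (-180, -90, 180, 90)
mean_url = getmap_url(base, "analysed_sst", world)
uncert_url = getmap_url(base, "analysed_sst_uncertainty", world)
```

    A client could overlay the second response on the first, e.g. as stippling or transparency, to visualize uncertainty.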

  16. The impact of national cultural distance on the number of foreign Web site visits by U.S. households.

    PubMed

    Beugelsdijk, Sjoerd; Slangen, Arjen

    2010-04-01

    We investigate how national cultural distance, defined as the extent to which the shared values and norms in one country differ from those in another, affects the number of Web site visits. Based on a sample of 2,654 U.S. households visiting Web sites in 38 countries across 25 different Web site categories, we find that cultural distance has a negative and significant effect on the number of taste-related foreign Web site visits. In the case of Web sites containing sexually explicit material, we obtain a significantly positive effect of cultural distance. Our findings suggest that, depending on the nature of the Web site, cultural distance can be both a source of attraction and a source of repulsion in explaining the number of Web site visits.

  17. SSWAP: A Simple Semantic Web Architecture and Protocol for Semantic Web Services

    Technology Transfer Automated Retrieval System (TEKTRAN)

    SSWAP (Simple Semantic Web Architecture and Protocol) is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP is the driving technology behind the Virtual Plant Information Network, an NSF-funded semantic w...

  18. Faculty Adoption Behaviour about Web-Based Distance Education: A Case Study from China Agricultural University

    ERIC Educational Resources Information Center

    Li, Yan; Lindner, James R.

    2007-01-01

    The purpose of this study was to determine China Agricultural University's (CAU's) faculty adoption behaviour about web-based distance education (WBDE). Rogers' (2003) model of five stages in the innovation-decision process was adopted and modified as the theoretical base for the study. Quantitative research was employed and the research design…

  19. Web-Based Writing Support: Making It Useable for Distance Teachers

    ERIC Educational Resources Information Center

    Goodfellow, Robin; Strauss, Pat; Puxley, Marianne

    2012-01-01

    This paper considers the issues that distance teachers in higher education who are not writing specialists face in supporting their students' academic writing development. We discuss the usefulness of open web-based writing support resources, and propose the need for a system that serves as an interface with these resources. Such a system should…

  20. The Impact of Web Conferencing Training on Peer Tutors' Attitudes toward Distance Education

    ERIC Educational Resources Information Center

    Dvorak, Johanna; Roessger, Kevin

    2012-01-01

    This study investigated the attitudes of peer tutors who received web conferencing training in preparation for synchronous online tutoring. A quasi-experimental design was employed to evaluate changes in peer tutors' attitudes toward distance learning following participation in an online tutor training program. Peer tutors were found to have: (a)…

  1. Influence of Web-Based Distance Education on the Academic Department Chair Role

    ERIC Educational Resources Information Center

    Franklin, Kathy K.; Hart, Jan K.

    2006-01-01

    The purpose of this study was to examine academic department chair perceptions about the future influence of web-based distance education on departmental operations and their changing role as academic leader. Using a rating, modified-policy Delphi method, the researcher worked with 22 department chairs employed at public, urban universities in the…

  2. Maximizing Learning from Rehearsal Activity in Web-Based Distance Learning

    ERIC Educational Resources Information Center

    Wallace, Tary; Grinnell, Lynn; Carey, Lou; Carey, James

    2006-01-01

    Faculty teaching distance courses continuously seek ways to maximize learning for students. Two practice with feedback strategies were examined in this study for their impact on students' achievement and attitudes in an upper-division university level web-based course. One format contained higher structure and dialog creating lower transactional…

  3. Problems of Implementing SCORM in an Enterprise Distance Learning Architecture: SCORM Incompatibility across Multiple Web Domains.

    ERIC Educational Resources Information Center

    Engelbrecht, Jeffrey C.

    2003-01-01

    Delivering content to distant users located in dispersed networks, separated by firewalls and different web domains requires extensive customization and integration. This article outlines some of the problems of implementing the Sharable Content Object Reference Model (SCORM) in the Marine Corps' Distance Learning System (MarineNet) and extends…

  4. Web-Based Seamless Migration for Task-Oriented Mobile Distance Learning

    ERIC Educational Resources Information Center

    Zhang, Degan; Li, Yuan-chao; Zhang, Huaiyu; Zhang, Xinshang; Zeng, Guangping

    2006-01-01

    As a new computing paradigm, pervasive computing aims to meet the expectation that anybody may obtain services anywhere and at any time; task-oriented seamless migration is one of its applications. Such seamless mobility is well suited to mobile services, such as mobile Web-based learning. In this…

  5. Provenance-Based Approaches to Semantic Web Service Discovery and Usage

    ERIC Educational Resources Information Center

    Narock, Thomas William

    2012-01-01

    The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…

  6. Reinforcement Learning Based Web Service Compositions for Mobile Business

    NASA Astrophysics Data System (ADS)

    Zhou, Juan; Chen, Shouming

    In this paper, we propose a new solution to Reactive Web Service Composition, modeling it with Reinforcement Learning and introducing modified (alterable) QoS variables into the model as elements of the Markov Decision Process tuple. Moreover, we give an example of Reactive-WSC-based mobile banking to demonstrate the solution's intrinsic capability of obtaining an optimized service composition, characterized by (alterable) target QoS variable sets with optimized values. We conclude that the solution has considerable potential to improve customer experience and quality of service in Web Services, and in applications across the wider electronic commerce and business sector.
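
    The MDP formulation described here can be sketched as a small Q-learning loop: states are workflow stages, actions are candidate concrete services, and rewards are derived from QoS attributes. All stage names, services, and QoS numbers below are invented for illustration and are not taken from the paper.

```python
import random

random.seed(0)

stages = [0, 1, 2]          # e.g. authenticate -> check balance -> transfer
candidates = {              # QoS-derived reward per (stage, service)
    (0, "authA"): 0.9, (0, "authB"): 0.6,
    (1, "balA"): 0.5,  (1, "balB"): 0.8,
    (2, "payA"): 0.7,  (2, "payB"): 0.4,
}
actions = {s: [a for (st, a) in candidates if st == s] for s in stages}

# Tabular Q-learning with epsilon-greedy exploration.
Q = {k: 0.0 for k in candidates}
alpha, gamma, eps = 0.5, 0.9, 0.2

for _ in range(500):
    for s in stages:
        a = (random.choice(actions[s]) if random.random() < eps
             else max(actions[s], key=lambda a: Q[(s, a)]))
        nxt = 0.0 if s == stages[-1] else max(Q[(s + 1, b)] for b in actions[s + 1])
        Q[(s, a)] += alpha * (candidates[(s, a)] + gamma * nxt - Q[(s, a)])

# The learned composition: the best concrete service at each stage.
best = [max(actions[s], key=lambda a: Q[(s, a)]) for s in stages]
```

    Because the QoS variables enter only through the reward table, "alterable" QoS targets amount to re-running the loop with new rewards, which is the reactive aspect of the approach.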

  7. Business Systems Branch Abilities, Capabilities, and Services Web Page

    NASA Technical Reports Server (NTRS)

    Cortes-Pena, Aida Yoguely

    2009-01-01

    During the INSPIRE summer internship I acted as the Business Systems Branch Capability Owner for the Kennedy Web-based Initiative for Communicating Capabilities System (KWICC), with the responsibility of creating a portal that describes the services provided by this Branch. This project will help others achieve a clear view of the services that the Business Systems Branch provides to NASA and the Kennedy Space Center. After collecting data through interviews with subject matter experts and the literature in Business World and other web sites, I identified discrepancies, made the necessary corrections to the sites, and placed the information from the report into the KWICC web page.

  8. Tactical Web Services: Using XML and Java Web Services to Conduct Real-Time Net-Centric Sonar Visualization

    DTIC Science & Technology

    2004-09-01

    ...Simulation Framework (XMSF) project. By leveraging existing Web service technology, warfighters at the "tip of the spear" can have access to previously unrealized amounts of...

  9. A SOAP Web Service for accessing MODIS land product subsets

    SciTech Connect

    SanthanaVannan, Suresh K; Cook, Robert B; Pan, Jerry Yun; Wilson, Bruce E

    2011-01-01

    Remote sensing data from satellites have provided valuable information on the state of the earth for several decades. Since March 2000, the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on board NASA's Terra and Aqua satellites have been providing estimates of several land parameters useful in understanding earth system processes at global, continental, and regional scales. However, the HDF-EOS file format, the specialized software needed to process HDF-EOS files, the data volume, and the high spatial and temporal resolution of MODIS data make it difficult for users wanting to extract small but valuable amounts of information from the MODIS record. To overcome this usability issue, the NASA-funded Distributed Active Archive Center (DAAC) for Biogeochemical Dynamics at Oak Ridge National Laboratory (ORNL) developed a Web service that provides subsets of MODIS land products using Simple Object Access Protocol (SOAP). The ORNL DAAC MODIS subsetting Web service is a unique way of serving satellite data that exploits a fairly established and popular Internet protocol to allow users access to massive amounts of remote sensing data. The Web service provides MODIS land product subsets up to 201 x 201 km in a non-proprietary comma-delimited text file format. Users can programmatically query the Web service to extract MODIS land parameters for real-time data integration into models and decision support tools, or connect it to workflow software. Information regarding the MODIS SOAP subsetting Web service is available on the World Wide Web (WWW) at http://daac.ornl.gov/modiswebservice.
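
    The appeal of a comma-delimited response is that it can be consumed with nothing but standard library code. The sketch below parses such a response into a per-date time series; the column layout shown (`date,band,value`) is hypothetical and not the service's documented format.

```python
import csv
import io

# A stand-in for the text body returned by a subset request
# (invented layout, for illustration only).
response = """\
date,band,value
2011-01-01,NDVI,0.42
2011-01-17,NDVI,0.47
2011-02-02,NDVI,0.51
"""

# Parse the delimited text into {date: value} for model ingestion.
series = {}
for row in csv.DictReader(io.StringIO(response)):
    series[row["date"]] = float(row["value"])
```

    A model or workflow engine can then consume `series` directly, with no HDF-EOS tooling involved.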

  10. 78 FR 42537 - Agency Information Collection Activities: Online Survey of Web Services Employers; New...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-16

    ... of Web Services Employers; New Information Collection; Comment Request ACTION: 60-Day Notice. The... information collection. (2) Title of the Form/Collection: Online Survey of Web Services Employers. (3) Agency... obtains data on the E-Verify Program Web Services. Gaining an understanding of the Web Services...

  11. Web Services Provide Access to SCEC Scientific Research Application Software

    NASA Astrophysics Data System (ADS)

    Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.

    2003-12-01

    Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the

  12. Video in Distance Education: ITFS vs. Web-Streaming--Evaluation of Student Attitudes

    ERIC Educational Resources Information Center

    Reisslein, Jana; Seeling, Patrick; Reisslein, Martin

    2005-01-01

    The use of video in distance education courses has a long tradition, with many colleges and universities having been delivering distance education courses with video since the 80's using the Instructional Television Fixed Service (ITFS) and cable television. With the emergence of the Internet and the increased access bandwidths from private homes…

  13. Integrating geo web services for a user driven exploratory analysis

    NASA Astrophysics Data System (ADS)

    Moncrieff, Simon; Turdukulov, Ulanbek; Gulland, Elizabeth-Kate

    2016-04-01

    In data exploration, several online data sources may need to be dynamically aggregated or summarised over a spatial region, time interval, or set of attributes. With respect to thematic data, web services are mainly used to present results, leading to a supplier-driven service model that limits the exploration of the data. In this paper we propose a user-need-driven service model based on geo web processing services. The aim of the framework is to provide a method for scalable and interactive access to various geographic data sources on the web. The architecture combines a data query, processing technique and visualisation methodology to rapidly integrate and visually summarise properties of a dataset. We illustrate the environment on a health-related use case that derives the Age Standardised Rate, a dynamic index that needs integration of the existing interoperable web services of demographic data in conjunction with standalone non-spatial secure database servers used in health research. Although the example is specific to the health field, the architecture and the proposed approach are relevant and applicable to other fields that require integration and visualisation of geo datasets from various web services, and thus, we believe, the approach is generic.
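
    The Age Standardised Rate in this use case is a direct standardisation: age-specific rates from one source (e.g. a health database) are weighted by a standard population from another (e.g. a demographic web service). The formula is standard; the figures below are invented for illustration.

```python
def age_standardised_rate(cases, population, standard_pop, per=100_000):
    """ASR = per * sum(rate_i * w_i) / sum(w_i), with w_i the standard population."""
    assert len(cases) == len(population) == len(standard_pop)
    rates = [c / p for c, p in zip(cases, population)]      # age-specific rates
    weighted = sum(r * w for r, w in zip(rates, standard_pop))
    return per * weighted / sum(standard_pop)

cases        = [10, 40, 90]               # events per age band (invented)
population   = [50_000, 40_000, 30_000]   # local population per age band
standard_pop = [60_000, 30_000, 10_000]   # standard population weights

asr = age_standardised_rate(cases, population, standard_pop)  # 72.0 per 100,000
```

    The "dynamic" aspect in the paper is that the inputs arrive from separate services at query time, so the index is recomputed per region and interval rather than precomputed.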

  14. Proposal for a Web Encoding Service (WES) for Spatial Data Transaction

    NASA Astrophysics Data System (ADS)

    Siew, C. B.; Peters, S.; Rahman, A. A.

    2015-10-01

    The use of web services in Spatial Data Infrastructure (SDI) is well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial datasets for data delivery. This paper revisits the available OGC Web Services (OWS) and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g. possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS). Such integration could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.

  15. Architecture-Based Reliability Analysis of Web Services

    ERIC Educational Resources Information Center

    Rahmani, Cobra Mariam

    2012-01-01

    In a Service Oriented Architecture (SOA), the hierarchical complexity of Web Services (WS) and their interactions with the underlying Application Server (AS) create new challenges in providing a realistic estimate of WS performance and reliability. The current approaches often treat the entire WS environment as a black-box. Thus, the sensitivity…

  16. SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services

    PubMed Central

    Gessler, Damian DG; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T

    2009-01-01

    Background SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. Results There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remaining are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at , developer tools at , and a portal to third-party ontologies at (a "swap meet"). Conclusion SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation. SSWAP is novel by establishing

  17. OntoGene web services for biomedical text mining.

    PubMed

    Rinaldi, Fabio; Clematide, Simon; Marques, Hernani; Ellendorff, Tilia; Romacker, Martin; Rodriguez-Esteban, Raul

    2014-01-01

    Text mining services are rapidly becoming a crucial component of various knowledge management pipelines, for example in the process of database curation, or for exploration and enrichment of biomedical data within the pharmaceutical industry. Traditional architectures, based on monolithic applications, do not offer sufficient flexibility for a wide range of use case scenarios, and therefore open architectures, as provided by web services, are attracting increased interest. We present an approach towards providing advanced text mining capabilities through web services, using a recently proposed standard for textual data interchange (BioC). The web services leverage a state-of-the-art platform for text mining (OntoGene) which has been tested in several community-organized evaluation challenges, with top-ranked results in several of them.

  18. Finding, Browsing and Getting Data Easily Using SPDF Web Services

    NASA Technical Reports Server (NTRS)

    Candey, R.; Chimiak, R.; Harris, B.; Johnson, R.; Kovalick, T.; Lal, N.; Leckner, H.; Liu, M.; McGuire, R.; Papitashvili, N.; Roberts, A.

    2010-01-01

    The NASA GSFC Space Physics Data Facility (SPDF) provides heliophysics science-enabling information services for enhancing scientific research and enabling integration of these services into the Heliophysics Data Environment paradigm, via standards-based Simple Object Access Protocol (SOAP) and Representational State Transfer (REST) web services in addition to web browser, FTP, and OPeNDAP interfaces. We describe these interfaces and the philosophies behind these web services, and show how to call them from various languages, such as IDL and Perl. We are working towards the "one simple line to call" philosophy extolled in the recent VxO discussions. Combining data from many instruments and missions enables broad research analysis and correlation and coordination with other experiments and missions.

  19. Operational Use of OGC Web Services at the Met Office

    NASA Astrophysics Data System (ADS)

    Wright, Bruce

    2010-05-01

    The Met Office has adopted the Service-Orientated Architecture paradigm to deliver services to a range of customers through Rich Internet Applications (RIAs). The approach uses standard Open Geospatial Consortium (OGC) web services to provide information to web-based applications through a range of generic data services. "Invent", the Met Office beta site, is used to showcase Met Office future plans for presenting web-based weather forecasts, products and information to the public. This currently hosts a freely accessible Weather Map Viewer, written in JavaScript, which accesses a Web Map Service (WMS) to deliver innovative web-based visualizations of weather and its potential impacts to the public. The intention is to engage the public in the development of new web-based services that more accurately meet their needs. As the service is intended for public use within the UK, it has been designed to support a user base of 5 million, the analysed level of UK web traffic reaching the Met Office's public weather information site. The required scalability has been realised through the use of multi-tier tile caching: - WMS requests are made for 256x256 tiles for fixed areas and zoom levels; - a Tile Cache, developed in house, efficiently serves tiles on demand, managing WMS requests for new tiles; - Edge Servers, externally hosted by Akamai, provide a highly scalable (UK-centric) service for pre-cached tiles, passing new requests to the Tile Cache; - the Invent Weather Map Viewer uses the Google Maps API to request tiles from Edge Servers. (We would expect to make use of the Web Map Tiling Service, when it becomes an OGC standard.) The Met Office delivers specialist commercial products to market sectors such as transport, utilities and defence, which exploit a Web Feature Service (WFS) for data relating forecasts and observations to specific geographic features, and a Web Coverage Service (WCS) for sub-selections of gridded data. These are locally rendered as maps or
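
    Tile caching of this kind works because a request for a fixed 256x256 tile at a discrete zoom level has a deterministic key that every tier (browser, edge server, tile cache) can compute and reuse. The sketch below derives that key using the standard Web Mercator "slippy map" tiling convention; the Met Office's actual scheme is not described in the abstract and may differ in detail.

```python
import math

def tile_index(lat_deg, lon_deg, zoom):
    # Standard Web Mercator tile indexing: 2**zoom x 2**zoom grid of
    # 256x256 tiles, x increasing eastward, y increasing southward.
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi) / 2.0 * n)
    return x, y

# London at zoom level 6 maps to a single cacheable tile key.
x, y = tile_index(51.5, -0.13, 6)
key = f"6/{x}/{y}.png"
```

    Because every viewer asking for the same area at the same zoom produces the same key, the edge servers can serve the vast majority of traffic without ever touching the WMS.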

  20. Semantic Web Services Challenge, Results from the First Year. Series: Semantic Web And Beyond, Volume 8.

    NASA Astrophysics Data System (ADS)

    Petrie, C.; Margaria, T.; Lausen, H.; Zaremba, M.

    Explores trade-offs among existing approaches. Reveals strengths and weaknesses of proposed approaches, as well as which aspects of the problem are not yet covered. Introduces a software engineering approach to evaluating semantic web services. Service-Oriented Computing is one of the most promising software engineering trends because of the potential to reduce the programming effort for future distributed industrial systems. However, only a small part of this potential rests on the standardization of tools offered by the web services stack. The larger part of this potential rests upon the development of sufficient semantics to automate service orchestration. Currently there are many different approaches to semantic web service descriptions and many frameworks built around them. A common understanding, evaluation scheme, and test bed to compare and classify these frameworks in terms of their capabilities and shortcomings is necessary to make progress in developing the full potential of Service-Oriented Computing. The Semantic Web Services Challenge is an open source initiative that provides a public evaluation and certification of multiple frameworks on common industrially-relevant problem sets. This edited volume reports on the first results in developing a common understanding of the various technologies intended to facilitate the automation of mediation, choreography and discovery for Web Services using semantic annotations. Semantic Web Services Challenge: Results from the First Year is designed for a professional audience composed of practitioners and researchers in industry. Professionals can use this book to evaluate SWS technology for their potential practical use. The book is also suitable for advanced-level students in computer science.

  1. A Generic Evaluation Model for Semantic Web Services

    NASA Astrophysics Data System (ADS)

    Shafiq, Omair

    Semantic Web Services research has gained momentum over the last few years, and by now several realizations exist. They are being used in a number of industrial use-cases. Soon software developers will be expected to use this infrastructure to build their B2B applications requiring dynamic integration. However, there is still a lack of guidelines for the evaluation of tools developed to realize Semantic Web Services and applications built on top of them. In normal software engineering practice such guidelines can already be found for traditional component-based systems. Also some efforts are being made to build performance models for service-based systems. Drawing on these related efforts in component-oriented and service-based systems, we identified the need for a generic evaluation model for Semantic Web Services applicable to any realization. The generic evaluation model will help users and customers to orient their systems and solutions towards using Semantic Web Services. In this chapter, we have presented the requirements for the generic evaluation model for Semantic Web Services and further discussed the initial steps that we took to sketch such a model. Finally, we discuss related activities for evaluating semantic technologies.

  2. The Role of Libraries in Web-Based Distance Education: An Account and an Analysis of the Impact of Web Technology on Distance Learning--What Remains Unchanged, What Is Changing

    ERIC Educational Resources Information Center

    Cooke, Nicole A.

    2004-01-01

    Even though distance education has a long and diverse history, dating back to 1840, in the last ten-to-fifteen years it has been completely transformed by the emergence of Web-based technology. This technology has had an enormous impact on all aspects of distance education (or distance learning as it is increasingly called). In addition to…

  3. Data Mining Web Services for Science Data Repositories

    NASA Astrophysics Data System (ADS)

    Graves, S.; Ramachandran, R.; Keiser, K.; Maskey, M.; Lynnes, C.; Pham, L.

    2006-12-01

    The maturation of web services standards and technologies sets the stage for a distributed "Service-Oriented Architecture" (SOA) for NASA's next generation science data processing. This architecture will allow members of the scientific community to create and combine persistent distributed data processing services and make them available to other users over the Internet. NASA has initiated a project to create a suite of specialized data mining web services designed specifically for science data. The project leverages the Algorithm Development and Mining (ADaM) toolkit as its basis. The ADaM toolkit is a robust, mature and freely available science data mining toolkit that is being used by several research organizations and educational institutions worldwide. These mining services will give the scientific community a powerful and versatile data mining capability that can be used to create higher order products such as thematic maps from current and future NASA satellite data records with methods that are not currently available. The package of mining and related services are being developed using Web Services standards so that community-based measurement processing systems can access and interoperate with them. These standards-based services allow users different options for utilizing them, from direct remote invocation by a client application to deployment of a Business Process Execution Language (BPEL) solutions package where a complex data mining workflow is exposed to others as a single service. The ability to deploy and operate these services at a data archive allows the data mining algorithms to be run where the data are stored, a more efficient scenario than moving large amounts of data over the network. This will be demonstrated in a scenario in which a user uses a remote Web-Service-enabled clustering algorithm to create cloud masks from satellite imagery at the Goddard Earth Sciences Data and Information Services Center (GES DISC).
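
    The clustering scenario at the end can be illustrated with a deliberately tiny stand-in: a 1-D k-means (k=2) that separates bright (cloudy) from dark (clear) pixels. A real mining service such as ADaM would run richer algorithms on full imagery at the archive; the pixel values and threshold behaviour below are invented for illustration.

```python
def kmeans_1d(values, k=2, iters=20):
    # Lloyd's algorithm on scalar values, seeded with the min and max.
    centers = [min(values), max(values)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            i = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[i].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

# Invented radiance values for a strip of satellite pixels.
pixels = [12, 15, 9, 200, 220, 14, 240, 11, 210]
lo, hi = sorted(kmeans_1d(pixels))

# Cloud mask: 1 where a pixel is closer to the bright cluster centre.
cloud_mask = [int(abs(p - hi) < abs(p - lo)) for p in pixels]
```

    Exposed as a web service, the same computation runs next to the archive and returns only the mask, avoiding the transfer of the full image over the network.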

  4. Analyzing the Web Services and UniFrame Paradigms

    DTIC Science & Technology

    2003-04-01

    Service provider and service consumer business applications publish links to Web Services Description Language (WSDL) documents... (business logic level) and provide a new platform to build software for a distributed environment. UniFrame is a research project that aims to

  5. Real-time medical collaboration services over the web.

    PubMed

    Andrikos, Christos; Rassias, Georgios; Tsanakas, Panayiotis; Maglogiannis, Ilias

    2015-08-01

    The gradual shift in modern medical practice from clinical doctors working alone to Multi-Disciplinary Teams (MDTs) raises the need for online real-time collaboration among geographically distributed medical personnel. The paper presents a Web-based platform, featuring efficient medical data management and exchange, for hosting real-time collaborative services. The presented work leverages state-of-the-art features of the web (technologies and APIs) to support client-side medical data processing. Moreover, to address the typical bandwidth bottleneck and known scalability issues of centralized data sharing, an indirect RPC (Remote Procedure Call) scheme is introduced through object synchronization over the WebRTC paradigm.

  6. The RCSB Protein Data Bank: redesigned web site and web services

    PubMed Central

    Rose, Peter W.; Beran, Bojan; Bi, Chunxiao; Bluhm, Wolfgang F.; Dimitropoulos, Dimitris; Goodsell, David S.; Prlić, Andreas; Quesada, Martha; Quinn, Gregory B.; Westbrook, John D.; Young, Jasmine; Yukich, Benjamin; Zardecki, Christine; Berman, Helen M.; Bourne, Philip E.

    2011-01-01

    The RCSB Protein Data Bank (RCSB PDB) web site (http://www.pdb.org) has been redesigned to increase usability and to cater to a larger and more diverse user base. This article describes key enhancements and new features that fall into the following categories: (i) query and analysis tools for chemical structure searching, query refinement, tabulation and export of query results; (ii) web site customization and new structure alerts; (iii) pair-wise and representative protein structure alignments; (iv) visualization of large assemblies; (v) integration of structural data with the open access literature and binding affinity data; and (vi) web services and web widgets to facilitate integration of PDB data and tools with other resources. These improvements enable a range of new possibilities to analyze and understand structure data. The next generation of the RCSB PDB web site, as described here, provides a rich resource for research and education. PMID:21036868

  7. WebAUGUSTUS--a web service for training AUGUSTUS and predicting genes in eukaryotes.

    PubMed

    Hoff, Katharina J; Stanke, Mario

    2013-07-01

    The prediction of protein coding genes is an important step in the annotation of newly sequenced and assembled genomes. AUGUSTUS is one of the most accurate tools for eukaryotic gene prediction. Here, we present WebAUGUSTUS, a web interface for training AUGUSTUS and predicting genes with AUGUSTUS. Depending on the needs of the user, WebAUGUSTUS generates training gene structures automatically. Besides a genome file, either a file with expressed sequence tags or a file with protein sequences is required for this step. Alternatively, it is possible to submit an externally generated training gene structure file and a genome file. The web service optimizes AUGUSTUS parameters and predicts genes with those parameters. WebAUGUSTUS is available at http://bioinf.uni-greifswald.de/webaugustus.

  8. The RCSB Protein Data Bank: redesigned web site and web services.

    PubMed

    Rose, Peter W; Beran, Bojan; Bi, Chunxiao; Bluhm, Wolfgang F; Dimitropoulos, Dimitris; Goodsell, David S; Prlic, Andreas; Quesada, Martha; Quinn, Gregory B; Westbrook, John D; Young, Jasmine; Yukich, Benjamin; Zardecki, Christine; Berman, Helen M; Bourne, Philip E

    2011-01-01

    The RCSB Protein Data Bank (RCSB PDB) web site (http://www.pdb.org) has been redesigned to increase usability and to cater to a larger and more diverse user base. This article describes key enhancements and new features that fall into the following categories: (i) query and analysis tools for chemical structure searching, query refinement, tabulation and export of query results; (ii) web site customization and new structure alerts; (iii) pair-wise and representative protein structure alignments; (iv) visualization of large assemblies; (v) integration of structural data with the open access literature and binding affinity data; and (vi) web services and web widgets to facilitate integration of PDB data and tools with other resources. These improvements enable a range of new possibilities to analyze and understand structure data. The next generation of the RCSB PDB web site, as described here, provides a rich resource for research and education.

  9. Web Processing Service for assisted land cover classification

    NASA Astrophysics Data System (ADS)

    Gasperi, Jérôme; Peyrega, Charles; Dinot, Sébastien; Boileau, Quentin; Manin, Alexis; Heurteaux, Vincent

    2013-04-01

    The Orfeo Toolbox (OTB - http://www.orfeo-toolbox.org/) is an open-source remote sensing image processing software library developed by CNES. The aim of the toolbox is to gather a large number of state-of-the-art algorithms for building processing chains for satellite images. Using the Constellation server (http://www.constellation-sdi.org/), we exposed the main OTB processing chains as Web Processing Services (WPS). The WPS standard provides rules for standardizing the inputs and outputs of geospatial processing services. These services are managed from a web browser using the mapshup web client (http://mapshup.info). mapshup supports both synchronous and asynchronous processes and offers direct visualisation of results. The whole system gives users a complete and comprehensive image processing chain for producing land cover classifications from satellite orthoimagery.
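    A WPS client discovers a process before executing it. The sketch below builds a WPS 1.0.0 key-value-pair DescribeProcess request of the kind a client like mapshup would issue; the endpoint URL and process identifier are hypothetical placeholders.

```python
# Sketch of a WPS 1.0.0 key-value-pair request URL. The endpoint and
# the process identifier are assumed names for illustration.
from urllib.parse import urlencode

def wps_describe_process(endpoint: str, identifier: str) -> str:
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "DescribeProcess",
        "identifier": identifier,
    }
    return endpoint + "?" + urlencode(params)

url = wps_describe_process("https://example.org/wps",
                           "otb.KMeansClassification")
```

    An Execute request then follows the same pattern, usually as a POSTed XML document listing the standardized inputs and requested outputs.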

  10. A Method of EC Model Implementation Using Web Service Functions

    NASA Astrophysics Data System (ADS)

    Kurihara, Jun; Koizumi, Hisao; Ishikawa, Toshiyuki; Dasai, Takashi

    In recent years, advances in computer and communication technology and the associated rapid increase in the number of Internet users are encouraging advances in Electronic Commerce (EC). Business models for EC are being actively developed by many different enterprises and engineers, and implemented in many kinds of fields. Meanwhile, Web services that reuse remote components over the Internet are drawing attention. Web services are based on SOAP/WSDL/UDDI and occupy an important position as the infrastructure of EC systems. The article analyzes the functions and structures of various business models, establishing patterns of their distinctive and common features, and proposes a method of determining the implementation specifications of business models utilizing these patterns and Web service functions. This method has been applied to a parts purchasing system, which is a typical pattern of B to B (Business to Business) EC applications. The article also discusses the results of evaluating this prototype system.

  11. Exploring Education Major Focused Adult Learners' Perspectives and Practices of Web-Based Distance Education in Sixteen Universities

    ERIC Educational Resources Information Center

    Zhang, Jing

    2009-01-01

    Distance education is not a new concept for all kinds of learners in the modern societies. Many researchers have studied traditional distance education programs for adult learners in the past, but little research has been done on Web-based distance education (WBDE) for adult learners. There are also many popular online universities in the U.S. or…

  12. A resource-oriented web service for environmental modeling

    NASA Astrophysics Data System (ADS)

    Ferencik, Ioan

    2013-04-01

    Environmental modeling is a widely adopted practice in the study of natural phenomena. Environmental models can be difficult to build and use, and thus sharing them within the community is important. The most common approach to sharing a model is to expose it as a web service. In practice, interaction with such a web service is cumbersome due to the lack of a standardized contract and the complexity of the model being exposed. In this work we investigate the use of a resource-oriented approach to exposing environmental models as web services. We view a model as a layered resource built atop the object concept from object-oriented programming, augmented with persistence capabilities provided by an embedded object database to keep track of its state, and implementing the four basic principles of resource-oriented architectures: addressability, statelessness, representation and uniform interface. For the implementation we use exclusively open-source software: the Django framework, the dyBase object-oriented database and the Python programming language. We developed a generic framework of resources structured into a hierarchy of types and extended this typology with resources specific to the domain of environmental modeling. To test our web service we used cURL, a robust command-line web client.
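    The four principles the abstract lists can be shown in a toy sketch: each model instance lives at its own address, every request carries all it needs (statelessness), state is exchanged as a JSON representation, and the same small set of verbs applies to every resource. The class and paths below are illustrative stand-ins, not the paper's Django/dyBase implementation.

```python
# Toy sketch of a resource-oriented store: addressability, statelessness,
# representation and uniform interface. An in-memory dict stands in for
# the embedded object database; all names are illustrative.
import json

class ModelStore:
    def __init__(self):
        self._models = {}              # persistence stand-in

    # Uniform interface: the same verbs apply to every resource.
    def put(self, path, state):        # create/replace at an address
        self._models[path] = dict(state)

    def get(self, path):               # representation: state as JSON
        return json.dumps(self._models[path], sort_keys=True)

    def delete(self, path):
        self._models.pop(path, None)

store = ModelStore()
# Addressability: each model instance has its own URI-like path.
store.put("/models/runoff/42", {"soil": "clay", "steps": 100})
doc = store.get("/models/runoff/42")
```

    Mapped onto HTTP, `put`/`get`/`delete` become the corresponding methods, which is exactly what a cURL-based test exercises.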

  13. Building a print on demand web service

    NASA Astrophysics Data System (ADS)

    Reddy, Prakash; Rozario, Benedict; Dudekula, Shariff; V, Anil Dev

    2011-03-01

    There is considerable effort underway to digitize all books that have ever been printed. There is a need for a service that can take raw book scans and convert them into Print on Demand (POD) books. Such a service augments the digitization effort and enables broader access for a wider audience. To make this service practical we identified three key challenges that needed to be addressed: a) produce high-quality images by eliminating artifacts that exist due to the age of the document or that are introduced during the scanning process; b) develop an efficient automated system to process book scans with minimum human intervention; and c) build an ecosystem which allows the target audience to discover these books.

  14. Consuming Web Services: A Yahoo! Newsfeed Reader

    ERIC Educational Resources Information Center

    Dadashzadeh, Mohammad

    2010-01-01

    Service Oriented Architecture (SOA) shows demonstrable signs of simplifying software integration. It provides the necessary framework for building applications that can be integrated and can reduce the cost of integration significantly. Organizations are beginning to architect new integration solutions following the SOA approach. As such,…

  15. Academic Public Service Web Sites and the Future of Virtual Academic Public Service

    ERIC Educational Resources Information Center

    Cohn, Ellen; Hibbitts, Bernard

    2005-01-01

    Some faculty have started to use the Internet as a bridge to the public instead of merely to each other. Leveraging their specialist knowledge and their academic authority against perceived public needs, they have created another type of academic Web site on their institutional servers--the academic public service Web site (APSWS). APSWS is an…

  16. A demanding web-based PACS supported by web services technology

    NASA Astrophysics Data System (ADS)

    Costa, Carlos M. A.; Silva, Augusto; Oliveira, José L.; Ribeiro, Vasco G.; Ribeiro, José

    2006-03-01

    During the last years, the ubiquity of web interfaces has pushed practically all PACS suppliers to develop client applications in which clinical practitioners can receive and analyze medical images using conventional personal computers and Web browsers. However, due to security and performance issues, the utilization of these software packages has been restricted to Intranets. Paradoxically, one of the most important advantages of digital image systems is to simplify the widespread sharing and remote access of medical data between healthcare institutions. This paper analyses the traditional PACS drawbacks that contribute to their reduced usage over the Internet and describes a PACS based on Web Services technology that supports a customized DICOM encoding syntax and a specific compression scheme, providing all historical patient data in a single Web interface.

  17. 3D Medical Volume Reconstruction Using Web Services

    PubMed Central

    Kooper, Rob; Shirk, Andrew; Lee, Sang-Chul; Lin, Amy; Folberg, Robert; Bajcsy, Peter

    2008-01-01

    We address the problem of 3D medical volume reconstruction using web services. The use of the proposed web services is motivated by the fact that 3D medical volume reconstruction requires significant computer resources and human expertise in the medical and computer science areas. The web services are implemented as an additional layer to a dataflow framework called Data to Knowledge. In the collaboration between UIC and NCSA, pre-processed input images at NCSA are made accessible to medical collaborators for registration. Every time the UIC medical collaborators inspect images and select corresponding features for registration, the web service at NCSA is contacted and the registration processing query is executed using the Image to Knowledge library of registration methods. Co-registered frames are returned for verification by the medical collaborators in a new window. In this paper, we present the 3D volume reconstruction problem requirements and the architecture of the developed prototype system at http://isda.ncsa.uiuc.edu/MedVolume. We also explain the tradeoffs of our system design and provide experimental data to support our system implementation. The prototype system has been used for multiple 3D volume reconstructions of blood vessels and vasculogenic mimicry patterns in histological sections of uveal melanoma studied by fluorescent confocal laser scanning microscopy. PMID:18336808

  18. A web service based tool to plan atmospheric research flights

    NASA Astrophysics Data System (ADS)

    Rautenhaus, M.; Bauer, G.; Dörnbrack, A.

    2011-09-01

    We present a web service based tool for the planning of atmospheric research flights. The tool provides online access to horizontal maps and vertical cross-sections of numerical weather prediction data and in particular allows the interactive design of a flight route in direct relation to the predictions. It thereby fills a crucial gap in the set of currently available tools for using data from numerical atmospheric models for research flight planning. A distinct feature of the tool is its lightweight, web service based architecture, requiring only commodity hardware and a basic Internet connection for deployment. Access to visualisations of prediction data is achieved by using an extended version of the Open Geospatial Consortium Web Map Service (WMS) standard, a technology that has gained increased attention in meteorology in recent years. With the WMS approach, we avoid the transfer of large forecast model output datasets while enabling on-demand generated visualisations of the predictions at campaign sites with limited Internet bandwidth. Usage of the Web Map Service standard also enables access to third-party sources of georeferenced data. We have implemented the software using the open-source programming language Python. In the present article, we describe the architecture of the tool. As an example application, we discuss a case study research flight planned for the scenario of the 2010 Eyjafjalla volcano eruption. Usage and implementation details are provided as Supplement.
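    At the heart of the WMS approach is the GetMap request, which asks the server to render a georeferenced image on demand instead of shipping the underlying forecast fields. The sketch below assembles such a request per WMS 1.1.1; the server URL and layer name are hypothetical placeholders.

```python
# Sketch of an OGC WMS 1.1.1 GetMap request URL of the kind a flight
# planning client would issue. Endpoint and layer name are assumptions.
from urllib.parse import urlencode

def wms_getmap(endpoint, layer, bbox, width=800, height=600):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap("https://example.org/wms", "cloud_cover",
                 (-30.0, 40.0, 20.0, 75.0))
```

    Because only the rendered PNG crosses the network, the approach works even from a campaign site with limited Internet bandwidth.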

  19. A web service based tool to plan atmospheric research flights

    NASA Astrophysics Data System (ADS)

    Rautenhaus, M.; Bauer, G.; Dörnbrack, A.

    2012-01-01

    We present a web service based tool for the planning of atmospheric research flights. The tool provides online access to horizontal maps and vertical cross-sections of numerical weather prediction data and in particular allows the interactive design of a flight route in direct relation to the predictions. It thereby fills a crucial gap in the set of currently available tools for using data from numerical atmospheric models for research flight planning. A distinct feature of the tool is its lightweight, web service based architecture, requiring only commodity hardware and a basic Internet connection for deployment. Access to visualisations of prediction data is achieved by using an extended version of the Open Geospatial Consortium Web Map Service (WMS) standard, a technology that has gained increased attention in meteorology in recent years. With the WMS approach, we avoid the transfer of large forecast model output datasets while enabling on-demand generated visualisations of the predictions at campaign sites with limited Internet bandwidth. Usage of the Web Map Service standard also enables access to third-party sources of georeferenced data. We have implemented the software using the open-source programming language Python. In the present article, we describe the architecture of the tool. As an example application, we discuss a case study research flight planned for the scenario of the 2010 Eyjafjalla volcano eruption. Usage and implementation details are provided as Supplement.

  20. Incorporating Web services into Earth Science Computational Environments

    NASA Astrophysics Data System (ADS)

    Fox, G.

    2002-12-01

    Grid technology promises to greatly enhance the analysis of data and their integration into all Earth Science fields. To prepare for this, one should "package" applications as Web Services using standards developed by the computer industry and the W3C consortium. We report on some early experience with several earthquake simulation programs.

  1. Web services at the European Bioinformatics Institute-2009

    PubMed Central

    Mcwilliam, Hamish; Valentin, Franck; Goujon, Mickael; Li, Weizhong; Narayanasamy, Menaka; Martin, Jenny; Miyar, Teresa; Lopez, Rodrigo

    2009-01-01

    The European Bioinformatics Institute (EMBL-EBI) has been providing access to mainstream databases and tools in bioinformatics since 1997. In addition to the traditional web form based interfaces, APIs exist for core data resources such as EMBL-Bank, Ensembl, UniProt, InterPro, PDB and ArrayExpress. These APIs are based on Web Services (SOAP/REST) interfaces that allow users to systematically access databases and analytical tools. From the user's point of view, these Web Services provide the same functionality as the browser-based forms. However, using the APIs frees the user from web page constraints and is ideal for the analysis of large batches of data, performing text-mining tasks and the casual or systematic evaluation of mathematical models in regulatory networks. Furthermore, these services are widespread and easy to use; they require no prior knowledge of the technology and no more than basic experience in programming. In the following we wish to inform of new and updated services as well as briefly describe planned developments to be made available during the course of 2009–2010. PMID:19435877
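    A REST call to such an API is just a parameterized URL. The sketch below builds a request in the style of EBI's dbfetch service; the parameter names follow its documented pattern, but treat them as an assumption and verify against the current EBI documentation before relying on them.

```python
# Sketch of a stateless REST request in the style of EBI dbfetch.
# Parameter names are based on its documented pattern; verify against
# the current documentation before use.
from urllib.parse import urlencode

EBI_DBFETCH = "https://www.ebi.ac.uk/Tools/dbfetch/dbfetch"

def dbfetch_url(db: str, entry_id: str, fmt: str = "fasta") -> str:
    return EBI_DBFETCH + "?" + urlencode(
        {"db": db, "id": entry_id, "format": fmt, "style": "raw"})

url = dbfetch_url("uniprotkb", "P12345")
```

    Fetching the URL with any HTTP client returns the entry as plain text, which makes batch retrieval a simple loop over identifiers.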

  2. Novel web service selection model based on discrete group search.

    PubMed

    Zhai, Jie; Shao, Zhiqing; Guo, Yi; Zhang, Haiteng

    2014-01-01

    In our earlier work, we present a novel formal method for the semiautomatic verification of specifications and for describing web service composition components by using abstract concepts. After verification, the instantiations of components were selected to satisfy the complex service performance constraints. However, selecting an optimal instantiation, which comprises different candidate services for each generic service, from a large number of instantiations is difficult. Therefore, we present a new evolutionary approach on the basis of the discrete group search service (D-GSS) model. With regard to obtaining the optimal multiconstraint instantiation of the complex component, the D-GSS model has competitive performance compared with other service selection models in terms of accuracy, efficiency, and ability to solve high-dimensional service composition component problems. We propose the cost function and the discrete group search optimizer (D-GSO) algorithm and study the convergence of the D-GSS model through verification and test cases.

  3. Pioneering a web-Based Museum in Taiwan: Design and Implementation of Lifelong Distance Learning of Science Education.

    ERIC Educational Resources Information Center

    Young, Shelley Shwu-Ching; Huang, Yi-Long; Jang, Jyh-Shing Roger

    2000-01-01

    Describes the development and implementation process of a Web-based science museum in Taiwan. Topics include use of the Internet; lifelong distance learning; museums and the Internet; objectives of the science museum; funding; categories of exhibitions; analysis of Web users; homepage characteristics; graphics and the effect on speed; and future…

  4. Towards Thematic Web Services for Generic Data Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Horanont, T.; Basa, M.; Shibasaki, R.

    2012-07-01

    Spatial analysis and thematic mapping are available in a number of traditional desktop GIS packages. However, visualizing thematic maps over the Internet is still limited to fixed content and restricted changes to the input data. Users with limited GIS knowledge, or people who do not own digital map data, often have difficulty creating thematic maps from generic data. In this study, we developed thematic mapping services that can be applied to non-spatial data formats and served through powerful map service solutions. Novice users who have no GIS software experience or no digital base map can simply input a plain text file with a location identifier field, such as a place name or gazetteer entry, to generate thematic maps online. We implemented a prototype using web service standards recommended by the Open Geospatial Consortium (OGC), such as the Web Map Service (WMS), Web Feature Service (WFS) and Styled Layer Descriptor (SLD), to provide a principle for communication and allow users to visualize spatial information as thematic maps. The system dedicates a great deal of effort to the initial study of geospatial analysis and visualization for novice users, including those with no past experience using Geographic Information Systems.
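    In this architecture, the thematic style itself travels as an SLD document that the WMS applies when rendering. The sketch below shows a minimal SLD 1.0 fragment for one choropleth class; the layer name, property name and class break are illustrative placeholders a service like this would generate per class.

```python
# Minimal SLD 1.0 document for one choropleth class. Layer name,
# property name and threshold are illustrative placeholders.
import xml.etree.ElementTree as ET

SLD = """<?xml version="1.0" encoding="UTF-8"?>
<StyledLayerDescriptor version="1.0.0"
    xmlns="http://www.opengis.net/sld"
    xmlns:ogc="http://www.opengis.net/ogc">
  <NamedLayer>
    <Name>population_by_region</Name>
    <UserStyle>
      <FeatureTypeStyle>
        <Rule>
          <ogc:Filter>
            <ogc:PropertyIsLessThan>
              <ogc:PropertyName>POP</ogc:PropertyName>
              <ogc:Literal>100000</ogc:Literal>
            </ogc:PropertyIsLessThan>
          </ogc:Filter>
          <PolygonSymbolizer>
            <Fill>
              <CssParameter name="fill">#fee8c8</CssParameter>
            </Fill>
          </PolygonSymbolizer>
        </Rule>
      </FeatureTypeStyle>
    </UserStyle>
  </NamedLayer>
</StyledLayerDescriptor>
"""

root = ET.fromstring(SLD)  # check the document is well-formed
```

    A full thematic style repeats the Rule element once per class, each with its own filter range and fill colour.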

  5. Taking advantage of Google's Web-based applications and services.

    PubMed

    Brigham, Tara J

    2014-01-01

    Google is a company that is constantly expanding and growing its services and products. While most librarians have a "love/hate" relationship with Google, there are a number of reasons to consider exploring some of the tools Google has created and made freely available. Applications and services such as Google Docs, Slides, and Google+ are functional and dynamic without the cost of comparable products. This column addresses some of the issues users should be aware of before signing up for Google's tools, describes some of Google's Web applications and services, and explains how they can be useful to librarians in health care.

  6. Web services in the U.S. geological survey streamstats web application

    USGS Publications Warehouse

    Guthrie, J.D.; Dartiguenave, C.; Ries, Kernell G.

    2009-01-01

    StreamStats is a U.S. Geological Survey Web-based GIS application developed as a tool for water-resources planning and management, engineering design, and other applications. StreamStats' primary functionality allows users to obtain drainage-basin boundaries, basin characteristics, and streamflow statistics for gaged and ungaged sites. Recently, Web services have been developed that allow remote users and applications to access the comprehensive GIS tools that are available in StreamStats, including delineating drainage-basin boundaries, computing basin characteristics, estimating streamflow statistics for user-selected locations, and determining point features that coincide with a National Hydrography Dataset (NHD) reach address. For the state of Kentucky, a web service has also been developed that allows users to estimate daily time series of drainage-basin average values of daily precipitation and temperature. The use of web services allows users to take full advantage of the datasets and processes behind the StreamStats application without having to develop and maintain them. © 2009 IEEE.

  7. WebGIS based community services architecture by griddization managements and crowdsourcing services

    NASA Astrophysics Data System (ADS)

    Wang, Haiyin; Wan, Jianhua; Zeng, Zhe; Zhou, Shengchuan

    2016-11-01

    Along with the fast economic development of cities, rapid urbanization and population surges in China, social community service mechanisms need to be rationalized and policy standards need to be unified, which results in various types of conflicts and challenges for government community services. Based on WebGIS technology, the article presents a community service architecture built on gridded management and crowdsourcing services. The WebGIS service architecture includes two parts: the cloud part and the mobile part. The cloud part refers to community service centres, which can respond to emergencies instantaneously, visualize the scene of an emergency, and analyse the data from the emergency. The mobile part refers to the mobile terminal, which can call the centre, report events, collect data and verify feedback. This WebGIS-based community service system for Huangdao District, Qingdao, was recognized among the "2015 national innovation of social governance typical cases".

  8. The Knowledge Base as an Extension of Distance Learning Reference Service

    ERIC Educational Resources Information Center

    Casey, Anne Marie

    2012-01-01

    This study explores knowledge bases as an extension of reference services for distance learners. Through a survey and follow-up interviews with distance learning librarians, this paper discusses their interest in creating and maintaining a knowledge base as a resource for reference services to distance learners. It also investigates their perceptions…

  9. 75 FR 20400 - Submission for Review: Federal Cyber Service: Scholarship for Service (SFS) Registration Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-19

    ... From the Federal Register Online via the Government Publishing Office OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Federal Cyber Service: Scholarship for Service (SFS) Registration Web Site AGENCY: U.S. Office of Personnel Management. ACTION: 60-Day Notice and request for comments. SUMMARY:...

  10. 75 FR 57086 - Submission for Review: Federal Cyber Service: Scholarship for Service (SFS) Registration Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-17

    ... From the Federal Register Online via the Government Publishing Office OFFICE OF PERSONNEL MANAGEMENT Submission for Review: Federal Cyber Service: Scholarship for Service (SFS) Registration Web Site AGENCY: Office of Personnel Management. ACTION: 30-Day Notice and request for comments. SUMMARY:...

  11. Augmenting Basic Web Data Services with Middleware Services to Facilitate Usability and Interoperability

    NASA Astrophysics Data System (ADS)

    Werpy, J.; Torbert, C.

    2014-12-01

    Over the past few years many data providers have implemented services that allow web-based (HTTP) interfaces to manipulate, organize, modify, and deliver Earth Science data. This web architecture provides the foundation for streamlining Earth Science users' utilization of and interaction with the data. However, critical components are missing and need to be developed in order to increase the capabilities, potential, and reach of these services. Middleware services represent a class of data services that can communicate their capabilities more clearly and effectively to science data users while leveraging the rawer web services on the back end. The middleware layer of a services architecture coordinates the interactions of users with the core web services. This simplifies execution, parameter selection, data integration, data delivery, and data analysis activities. This presentation will outline how the Land Processes Distributed Active Archive Center (LP DAAC) has utilized core services to provide basic access to data, data manipulation, and processing. Beyond that, the presentation will also detail the enhancement of those efforts through the development and implementation of middleware to augment capabilities and create the workflows necessary to enable science users to perform meaningful science activities and analysis faster than before. The middleware layer acts as the "glue" that allows these separate services to work together. By moving the algorithms that process and organize data closer to the data archive and enabling access to them via web services fronted by middleware services, the LP DAAC helps science users do better, less expensive, and more expansive science much faster than they ever could before.
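    The "glue" role can be sketched as a single middleware entry point that chains two back-end calls so the client never orchestrates them directly. The service functions below are pure-Python stand-ins for hypothetical subset and reprojection services, not the LP DAAC API.

```python
# Illustrative middleware workflow: one user-facing call chains two
# hypothetical back-end services (subset, then reproject). The service
# functions are stand-ins for remote HTTP calls.
def subset_service(granule, bbox):
    """Back-end stand-in: clip a granule to a bounding box."""
    return {"granule": granule, "bbox": bbox}

def reproject_service(product, srs):
    """Back-end stand-in: reproject a product to a target SRS."""
    return dict(product, srs=srs)

def middleware_subset_and_reproject(granule, bbox, srs="EPSG:4326"):
    """Single entry point the science user sees."""
    product = subset_service(granule, bbox)
    return reproject_service(product, srs)

result = middleware_subset_and_reproject("MOD11A1.h10v05",
                                         (-100, 30, -90, 40))
```

    In a real deployment each stand-in would be an HTTP call to an archive-side service, so the data never leave the archive between steps.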

  12. Secure password-based authenticated key exchange for web services

    SciTech Connect

    Liang, Fang; Meder, Samuel; Chevassut, Olivier; Siebenlist, Frank

    2004-11-22

    This paper discusses an implementation of an authenticated key-exchange method rendered on message primitives defined in the WS-Trust and WS-SecureConversation specifications. This IEEE-specified cryptographic method (AuthA) is provably secure for password-based authentication and key exchange, while WS-Trust and WS-SecureConversation are emerging Web Services Security specifications that extend the WS-Security specification. A prototype of the presented protocol is integrated in the WSRF-compliant Globus Toolkit V4. Further hardening of the implementation is expected to result in a version that will be shipped with future Globus Toolkit releases. This could help to address the current unavailability of decent shared-secret-based authentication options in the Web Services and Grid world. Future work will be to integrate One-Time-Password (OTP) features into the authentication protocol.
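    The core idea behind password-authenticated key exchange can be shown in a toy form: derive the Diffie-Hellman generator from the shared password, so only a party that knows the password computes the right key. The sketch below is in the spirit of SPEKE, not the AuthA protocol itself, and its parameters are deliberately tiny; it is an illustration, never usable cryptography.

```python
# Toy illustration (NOT AuthA, and not secure): SPEKE-style PAKE where
# the DH generator is derived from the password. Parameters are far too
# small for real use.
import hashlib

P = 0xFFFFFFFFFFFFFFC5  # 2**64 - 59, a small prime for illustration

def password_generator(password: str) -> int:
    # Hash the password and square it mod P (SPEKE-style derivation).
    h = int.from_bytes(hashlib.sha256(password.encode()).digest(), "big")
    return pow(h % P, 2, P)

def public_value(password: str, secret: int) -> int:
    return pow(password_generator(password), secret, P)

def shared_key(secret: int, their_public: int) -> int:
    return pow(their_public, secret, P)

a, b = 0x1234, 0x5678                 # fixed toy secrets for the demo
alice_pub = public_value("s3cret", a)
bob_pub = public_value("s3cret", b)
k_alice = shared_key(a, bob_pub)
k_bob = shared_key(b, alice_pub)      # both sides derive the same key
```

    A party supplying the wrong password derives a different generator, so its key disagrees and authentication fails; real protocols such as AuthA add key confirmation and defenses this toy omits.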

  13. GWASS: GRASS web application software system based on the GeoBrain web service

    NASA Astrophysics Data System (ADS)

    Qiu, Fang; Ni, Feng; Chastain, Bryan; Huang, Haiting; Zhao, Peisheng; Han, Weiguo; Di, Liping

    2012-10-01

    GRASS is a well-known geographic information system developed more than 30 years ago. As one of the earliest GIS systems, GRASS survives today mainly as free, open-source desktop GIS software, with users primarily limited to the research community or programmers who use it to create customized functions. To allow average GIS end users to continue taking advantage of this widely used software, we developed the GRASS Web Application Software System (GWASS), a distributed, web-based, multi-tiered Geospatial Information System (GIS) built on top of the GeoBrain web service, a project sponsored by NASA using the latest service-oriented architecture (SOA). This SOA-enabled system offers an effective and practical alternative to current commercial desktop GIS solutions. With GWASS, all geospatial processing and analyses are conducted by the server, so users are not required to install any software on the client side, which reduces the cost of access. The only resource needed to use GWASS is access to the Internet, and anyone who knows how to use a web browser can operate the system. The SOA framework is revitalizing GRASS as a new means to bring powerful geospatial analysis and resources to more users with concurrent access.

  14. Robust Mechanical-to-Electrical Energy Conversion from Short-Distance Electrospun Poly(vinylidene fluoride) Fiber Webs.

    PubMed

    Shao, Hao; Fang, Jian; Wang, Hongxia; Lang, Chenhong; Lin, Tong

    2015-10-14

    Electrospun polyvinylidene fluoride (PVDF) nanofiber webs have shown great potential for making mechanical-to-electrical energy conversion devices. Previously, PVDF nanofibers were produced either by near-field electrospinning (spinning distance < 1 cm) or by conventional electrospinning (spinning distance > 8 cm). PVDF fibers produced by electrospinning at a spinning distance between 1 and 8 cm (referred to as "short-distance" electrospinning in this paper) have received little attention. In this study, we have found that PVDF electrospun in this distance range still forms fibers, although interfiber connections form throughout the web. The interconnected PVDF fibers can have a β crystal phase content and mechanical-to-electrical energy conversion property comparable to those of fibers produced by conventional electrospinning. However, the interfiber connections were found to considerably stabilize the fibrous structure during repeated compression and decompression for electrical conversion. More interestingly, the short-distance electrospun PVDF fiber webs have higher delamination resistance and tensile strength than PVDF nanofiber webs produced by conventional electrospinning. Short-distance electrospun PVDF nanofibers could therefore be more suitable for the development of robust energy harvesters than conventionally electrospun PVDF nanofibers.

  15. Oceanic satellite data service system based on web

    NASA Astrophysics Data System (ADS)

    Kang, Yan; Pan, Delu; He, Xianqiang; Wang, Difeng; Chen, Jianyu; Chen, Xiaoyan

    2011-11-01

    Ocean satellite observations are increasingly important for studying global change, protecting ocean resources, and implementing ocean engineering, owing to their large-area coverage and high observation frequency. They have already given us a global view of ocean environment parameters, including sea surface temperature, ocean color, wind, waves, sea level, and sea ice. China has made great progress in ocean environment remote sensing over the last couple of years. These data are widely used for a variety of applications in ocean environment studies, coastal water quality monitoring, fishery resources protection, development and utilization of fishery resources, coastal engineering, and oceanography. However, the data offered no online information access and dissemination, no online visualization and browsing, and no online query and analysis capability. To facilitate the application of the data and to help disseminate them, a web service system has been developed. The system provides capabilities for online access, query, visualization, and analysis of oceanic satellite information. It disseminates oceanic satellite data to users via real-time retrieval, processing, and publishing through standards-based geospatial web services. A region of interest can also be exported directly to Google Earth for display, or downloaded. This web service system greatly improves the accessibility, interoperability, usability, and visualization of oceanic satellite data without any client-side software installation.

  16. The Evolution of a National Distance Guidance Service: Trends and Challenges

    ERIC Educational Resources Information Center

    Watts, A. G.; Dent, Gareth

    2008-01-01

    Three trends in the evolution of the UK Learndirect advice service are identified: the partial migration from telephone to web-based services; the trend within the telephone service from information/advice-oriented interventions to more guidance-oriented interventions; and the move from a mainly learning-oriented service to a more career-oriented…

  17. Application of the Open Geospatial Consortium (OGC) Web Processing Service (WPS) Standard for Exposing Water Models as Web Services

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Castronova, A. M.; Huynh, N.; Caicedo, J. M.

    2012-12-01

    Management of water systems often requires the integration of data and models across a range of sources and disciplinary expertise. Service-Oriented Architectures (SOA) have emerged as a powerful paradigm for providing this integration. Including models within a SOA presents challenges because services are not well suited for applications that require state management and large data transfers. Despite these challenges, thoughtful inclusion of models as resources within a SOA could have distinct advantages that center on the idea of abstracting complex computer hardware and software from service consumers while, at the same time, providing powerful resources to client applications. With these advantages and challenges of using models within SOA in mind, this work explores the potential of a modeling service standard as a means for integrating models as resources within SOA. Specifically, we investigate the use of the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard for exposing models as web services. Through extension of a Python-based implementation of WPS (called pyWPS), we present a demonstration of the methodology through a case study involving a storm event that floods roads and disrupts travel in Columbia, SC. The case study highlights the benefit of an urban infrastructure system with its various subsystems (stormwater, transportation, and structures) interacting and exchanging data seamlessly.
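    The Execute operation at the heart of the WPS standard is an XML request naming a process and its inputs. As a rough sketch of the kind of request a pyWPS-backed service would receive, the snippet below assembles a minimal WPS 1.0.0 Execute document with Python's standard library; the process identifier `floodmodel.run` and its rainfall input are hypothetical, invented for illustration, while the namespaces and element names follow the OGC specification.

```python
import xml.etree.ElementTree as ET

WPS = "http://www.opengis.net/wps/1.0.0"   # WPS 1.0.0 namespace
OWS = "http://www.opengis.net/ows/1.1"     # OWS common namespace
ET.register_namespace("wps", WPS)
ET.register_namespace("ows", OWS)

def build_execute_request(process_id, literal_inputs):
    """Assemble a simplified WPS Execute request as an XML string."""
    root = ET.Element(f"{{{WPS}}}Execute",
                      {"service": "WPS", "version": "1.0.0"})
    ET.SubElement(root, f"{{{OWS}}}Identifier").text = process_id
    data_inputs = ET.SubElement(root, f"{{{WPS}}}DataInputs")
    for name, value in literal_inputs.items():
        inp = ET.SubElement(data_inputs, f"{{{WPS}}}Input")
        ET.SubElement(inp, f"{{{OWS}}}Identifier").text = name
        data = ET.SubElement(inp, f"{{{WPS}}}Data")
        ET.SubElement(data, f"{{{WPS}}}LiteralData").text = str(value)
    return ET.tostring(root, encoding="unicode")

# Hypothetical storm-event process with a single literal input
xml_doc = build_execute_request("floodmodel.run", {"rainfall_mm": 120})
```

    In a real deployment this document would be POSTed to the service endpoint, which replies with the process outputs or, for long-running models, a status location to poll.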

  18. SAS- Semantic Annotation Service for Geoscience resources on the web

    NASA Astrophysics Data System (ADS)

    Elag, M.; Kumar, P.; Marini, L.; Li, R.; Jiang, P.

    2015-12-01

    There is a growing need for increased integration across the data and model resources that are disseminated on the web, to advance their reuse across different earth science applications. Meaningful reuse of resources requires semantic metadata to realize the semantic web vision of pragmatic linkage and integration among resources. Semantic metadata associates standard metadata with resources to turn them into semantically enabled resources on the web. However, the lack of a common standardized metadata framework, as well as the uncoordinated use of metadata fields across different geo-information systems, has led to a situation in which standards and related Standard Names abound. To address this need, we have designed SAS to provide a bridge between the core ontologies required to annotate resources and information systems, in order to enable queries and analysis over the annotations from a single web environment. SAS is one of the services provided by the Geosemantic framework, a decentralized semantic framework that supports integration between models and data and allows semantically heterogeneous resources to interact with minimal human intervention. Here we present the design of SAS and demonstrate its application for annotating data and models. First we describe how predicates and their attributes are extracted from standards and ingested into the knowledge base of the Geosemantic framework. Then we illustrate the application of SAS in annotating data managed by SEAD and in annotating simulation models that have a web interface. SAS is a step in a broader approach to raise the quality of geoscience data and models published on the web and to allow users to better search, access, and use the existing resources, based on standard vocabularies that are encoded and published using semantic technologies.
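    At its core, semantic annotation links a resource to a standard term through a predicate, producing subject-predicate-object triples that can then be queried. The toy triple store below illustrates that idea in plain Python; the resource names and predicates (`hasStandardName`, `hasInputName`) are invented for illustration and are not the actual SAS or Geosemantic vocabulary.

```python
from collections import defaultdict

class TripleStore:
    """Minimal in-memory store of (subject, predicate, object) annotations."""

    def __init__(self):
        self.by_predicate = defaultdict(set)

    def annotate(self, subject, predicate, obj):
        self.by_predicate[predicate].add((subject, obj))

    def query(self, predicate, obj=None):
        """Subjects carrying a predicate, optionally filtered by its value."""
        pairs = self.by_predicate.get(predicate, set())
        return sorted(s for s, o in pairs if obj is None or o == obj)

store = TripleStore()
# Hypothetical annotations tying datasets and a model to shared terms
store.annotate("data:discharge_2014", "hasStandardName", "water_discharge")
store.annotate("data:rainfall_2014", "hasStandardName", "rainfall_rate")
store.annotate("model:runoff_sim", "hasInputName", "rainfall_rate")
```

    Because both a dataset and a model point at the shared term `rainfall_rate`, a query over the annotations can pair them automatically, which is the kind of linkage between data and models the abstract describes.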

  19. Effectiveness of Asynchronous Reference Services for Distance Learning Students within Florida's Community College System

    ERIC Educational Resources Information Center

    Profeta, Patricia C.

    2007-01-01

    The provision of equitable library services to distance learning students emerged as a critical area during the 1990s. Library services available to distance learning students included digital reference and instructional services, remote access to online research tools, database and research tutorials, interlibrary loan, and document delivery.…

  20. We Cannot See Them, but They Are There: Marketing Library Services to Distance Learners

    ERIC Educational Resources Information Center

    Dermody, Melinda

    2005-01-01

    Distance learners are a unique target-population for the marketing of library services and resources. Because these patrons do not visit the library often, if at all, it is crucial to actively promote the library resources and services available to them. Marketing strategies for distance learning library services need to take a multifaceted…

  1. Leveling the Playing Field for Users with Web Services

    NASA Astrophysics Data System (ADS)

    Trabant, C. M.; Ahern, T. K.; Karstens, R.; Weertman, B.; Suleiman, Y. Y.

    2013-12-01

    The dawn of digital seismological data recording began approximately four decades ago. Since then, multiple networks of seismological recording stations have existed and continue to operate. It is common for each network to operate a data center to store and distribute the collected data. Increasingly, there are data centers that archive and distribute data produced by multiple networks and organizations. The modern landscape for seismological data users consists of many data centers spread across the globe offering a variety of data. Fortunately, most of these centers exchange data in standard formats defined by the International Federation of Digital Seismograph Networks (FDSN). Working with our partners in the FDSN, the IRIS Data Management Center (DMC) developed specifications for three standard web service interfaces that are intended to provide an abstraction layer on each center's customized data management system. These services provide access to seismological time series data, related metadata, and event (earthquake) parameters. An important part of the interface design is adherence to web standards and common conventions, which allows the use of ubiquitous web client software and toolkits. Another critical design criterion is simple usage: we recognize that our user base consists of scientific data consumers, not necessarily technologists. The IRIS DMC has implemented each of these three service interfaces and made the common software components freely available. Under the NSF's EarthScope program and within the international COOPEUS project, the DMC worked with European partners to help install these standardized interfaces on their own data management systems. One key development was the addition of these web services to the SeisComP3 data handling system, which is common in many seismological data centers, especially in Europe. The combination of standardized data formats and access interfaces removes the need for complex request brokers that translate between centers. Instead, it allows
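    The FDSN specification fixes the URL pattern of the three interfaces (fdsnws-station, fdsnws-dataselect, fdsnws-event), which is what lets the same client code address any compliant center. A minimal sketch of composing such a query with the Python standard library, using the IRIS endpoint and example network/station codes:

```python
from urllib.parse import urlencode

def fdsnws_url(base, service, **params):
    """Compose a query URL for an FDSN web service (station, dataselect, or event)."""
    return f"{base}/fdsnws/{service}/1/query?{urlencode(params)}"

# Station metadata request; swapping `base` targets another compliant center
url = fdsnws_url("https://service.iris.edu", "station",
                 net="IU", sta="ANMO", level="channel", format="text")
```

    Fetching the URL (for example with `urllib.request`) returns the metadata in the requested format; only the base URL changes between data centers, which is exactly the abstraction the standardized interfaces provide.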

  2. SCALEUS: Semantic Web Services Integration for Biomedical Applications.

    PubMed

    Sernadela, Pedro; González-Castro, Lorena; Oliveira, José Luís

    2017-04-01

    In recent years, we have witnessed an explosion of biological data resulting largely from the demands of life science research. The vast majority of these data are freely available via diverse bioinformatics platforms, including relational databases and conventional keyword search applications. This type of approach has achieved great results in recent years, but proved to be infeasible when information needs to be combined or shared among different and scattered sources. Many of these data distribution challenges have been solved with the adoption of the semantic web. Despite the evident benefits of this technology, its adoption introduced new challenges related to the migration process from existing systems to the semantic level. To facilitate this transition, we have developed Scaleus, a semantic web migration tool that can be deployed on top of traditional systems in order to bring knowledge, inference rules, and query federation to the existing data. Targeted at the biomedical domain, this web-based platform offers, in a single package, straightforward data integration and semantic web services that help developers and researchers in the creation of new semantically enhanced information systems. SCALEUS is available as open source at http://bioinformatics-ua.github.io/scaleus/.

  3. Stakeholder Expectations of Service Quality in a University Web Portal

    NASA Astrophysics Data System (ADS)

    Tate, Mary; Evermann, Joerg; Hope, Beverley; Barnes, Stuart

    Online service quality is a much-studied concept. There is considerable evidence that user expectations and perceptions of self-service and online service quality differ in different business domains. In addition, the nature of online services is continually changing and universities have been at the forefront of this change, with university websites increasingly acting as a portal for a wide range of online transactions for a wide range of stakeholders. In this qualitative study, we conduct focus groups with a range of stakeholders in a university web portal. Our study offers a number of insights into the changing nature of the relationship between organisations and customers. New technologies are influencing customer expectations. Customers increasingly expect organisations to have integrated information systems, and to utilise new technologies such as SMS and web portals. Organisations can be slow to adopt a customer-centric viewpoint, and persist in providing interfaces that are inconsistent or require inside knowledge of organisational structures and processes. This has a negative effect on customer perceptions.

  4. Semantic Web Service Framework to Intelligent Distributed Manufacturing

    SciTech Connect

    Kulvatunyou, Boonserm

    2005-12-01

    As markets become unexpectedly turbulent, with shortened product life cycles and a power shift towards buyers, the need for methods to develop products, production facilities, and supporting software rapidly and cost-effectively is becoming urgent. The use of a loosely integrated, virtual-enterprise-based framework holds the potential of surviving changing market needs. However, its success requires reliable and large-scale interoperation among trading partners via a semantic web of trading partners' services, whose properties, capabilities, and interfaces are encoded in an unambiguous, computer-understandable form. This paper demonstrates a promising approach to integration and interoperation between a design house and a manufacturer, which may or may not have a prior relationship, by developing semantic web services for business and engineering transactions. To this end, detailed activity and information flow diagrams are developed, in which the two trading partners exchange messages and documents. The properties and capabilities of the manufacturer sites are defined using the DARPA Agent Markup Language (DAML) ontology definition language. The prototype development of semantic web services shows that enterprises can interoperate widely in an unambiguous and autonomous manner. This contributes towards the realization of virtual enterprises at a low cost.

  5. Geovisualization in the HydroProg web map service

    NASA Astrophysics Data System (ADS)

    Spallek, Waldemar; Wieczorek, Malgorzata; Szymanowski, Mariusz; Niedzielski, Tomasz; Swierczynska, Malgorzata

    2016-04-01

    The HydroProg system, built at the University of Wroclaw (Poland) in the frame of research project no. 2011/01/D/ST10/04171 financed by the National Science Centre of Poland, has been designed for computing predictions of river stages in real time on the basis of multimodelling. This experimental system covers the upper Nysa Klodzka basin (SW Poland) above the gauge in the town of Bardo, with a catchment area of 1744 square kilometres. The system operates in association with the Local System for Flood Monitoring of Klodzko County (LSOP) and produces hydrograph forecasts as well as inundation predictions. To present the up-to-date predictions and their statistics online, a dedicated real-time web map service has been designed. Geovisualisation in the HydroProg map service covers: interactive maps of the study area, interactive spaghetti hydrographs of water level forecasts along with observed river stages, and animated images of inundation. The LSOP network offers a high spatial and temporal resolution of observations, with a sampling interval of 15 minutes. The main environmental elements related to hydrological modelling are shown on the main map. This includes elevation data (hillshading and hypsometric tints), rivers and reservoirs, and catchment boundaries. Furthermore, we added main towns, roads, and political and administrative boundaries for better map understanding. The web map was designed as a multi-scale representation, with levels of detail and zooming according to the scales 1:100 000, 1:250 000, and 1:500 000. Water level observations in LSOP are shown on interactive hydrographs for each gauge. Additionally, predictions and some of their statistical characteristics (such as prediction errors and Nash-Sutcliffe efficiency) are shown for selected gauges. Finally, predictions of inundation are presented on animated maps, which have been added for four experimental sites. The HydroProg system is a strictly

  6. Implementation of the OGC Web Processing Service for Use in Spatial Web Portal

    NASA Astrophysics Data System (ADS)

    Sun, M.; Huang, Q.; Li, J.; Yang, C.

    2010-12-01

    Data intensity and computing intensity are now two critical issues for the Earth sciences. Spatial analytical processing, which is known to be computing intensive, demands better solutions to meet these challenges. Such computing requirements are often beyond the capacities of stand-alone computers. The OGC Web Processing Service (WPS) is a new standard way to provide distributed GIS functions to users through the network instead of on local computers. These distributed analytical functions can then be bound to a spatial web portal backed by cloud computing (e.g., the spatial web portal by CISC, GMU). The bridging between generic computing and spatial processing for the Earth sciences might be achieved by the power of cloud computing. WPS implementations for some basic GIS functions (e.g., buffer, overlay, intersect) and for domain-specific geoprocessing have been published. By integrating all sorts of basic functions, specialists can process applications across different domains. However, there are two problems: the limited functions provided by existing WPS implementations are only the tip of the powerful set of GIS analytical functions and cannot effectively satisfy the intended analysis demands; and the processing of large data and complex computation might be very slow. Thus, this study aims to set up online spatial analytical functions in the spatial web portal, supported by cloud computing, which can chain data management, search, spatial analysis, and visualization together.

  7. 78 FR 60303 - Agency Information Collection Activities: Online Survey of Web Services Employers; New...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-01

    ... of Web Services Employers; New Information Collection ACTION: 30-Day Notice. SUMMARY: The Department... via the Federal eRulemaking Portal Web site at http://www.Regulations.gov under e-Docket ID number... collection. (2) Title of the Form/Collection: Online Survey of Web Services Employers. (3) Agency form...

  8. Web based aphasia test using service oriented architecture (SOA)

    NASA Astrophysics Data System (ADS)

    Voos, J. A.; Vigliecca, N. S.; Gonzalez, E. A.

    2007-11-01

    Based on an aphasia test for Spanish speakers that analyzes the patient's basic resources of verbal communication, web-enabled software was developed to automate its execution. A clinical database was designed as a complement, in order to evaluate the antecedents (risk factors, pharmacological and medical backgrounds, neurological or psychiatric symptoms, brain injury - anatomical and physiological characteristics, etc.) that are necessary to carry out a multi-factor statistical analysis on different samples of patients. The automated test was developed following a service-oriented architecture and implemented in a web site containing a test suite, which would allow both integrating the aphasia test with other neuropsychological instruments and increasing the information available on the site for scientific research. The test design, the database, and the study of its psychometric properties (validity, reliability, and objectivity) were carried out in conjunction with neuropsychological researchers, who participated actively in the software design based on feedback from the patients and other research subjects.

  9. Semantic Web Services with Web Ontology Language (OWL-S) - Specification of Agent-Services for DARPA Agent Markup Language (DAML)

    DTIC Science & Technology

    2006-08-01

    Sycara, and T. Nishimura, "Towards a Semantic Web Ecommerce," in Proceedings of the 6th Conference on Business Information Systems (BIS2003), Colorado...marketplace. Web Service technology is being adopted for Business-to-Business interactions and even in some Business-to-Consumer interactions. The widespread...messaging protocol) makes it possible to describe the functionalities of devices as Web services. As an example, consider a business meeting where

  10. The Plate Boundary Observatory: Community Focused Web Services

    NASA Astrophysics Data System (ADS)

    Matykiewicz, J.; Anderson, G.; Lee, E.; Hoyt, B.; Hodgkinson, K.; Persson, E.; Wright, J.; Torrez, D.; Jackson, M.

    2006-12-01

    The Plate Boundary Observatory (PBO), part of the NSF-funded EarthScope project, is designed to study the three-dimensional strain field resulting from deformation across the active boundary zone between the Pacific and North American plates in the western United States. To meet these goals, PBO will install 852 continuous GPS stations, 103 borehole strainmeter stations, 28 tiltmeters, and five laser strainmeters, as well as manage data for 209 previously existing continuous GPS stations. UNAVCO provides access to data products from these stations, as well as general information about the PBO project, via the PBO web site (http://pboweb.unavco.org). GPS and strainmeter data products can be found using a variety of channels, including map searches, text searches, and station-specific data retrieval. In addition, the PBO construction status is available via multiple mapping interfaces, including custom web-based map widgets and Google Earth. Additional construction details can be accessed from PBO operational pages and station-specific home pages. The current state of health of the PBO network is available via a statistical snapshot, full map interfaces, tabular web-based reports, and automatic data mining and alerts. UNAVCO is currently working to enhance community access to this information by developing a web service framework for the discovery of data products, interfacing with operational engineers, and exposing data services to third-party participants. In addition, UNAVCO, through the PBO project, provides advanced data management and monitoring systems for use by the community in operating geodetic networks in the United States and beyond. We will demonstrate these systems during the AGU meeting, and we welcome inquiries from the community at any time.

  11. Satellite Technologies and Services: Implications for International Distance Education.

    ERIC Educational Resources Information Center

    Stahmer, Anna

    1987-01-01

    This examination of international distance education and open university applications of communication satellites at the postsecondary level notes activities in less developed countries (LDCs); presents potential models for cooperation; and describes technical systems for distance education, emphasizing satellite technology and possible problems…

  12. A Security Architecture for Grid-enabling OGC Web Services

    NASA Astrophysics Data System (ADS)

    Angelini, Valerio; Petronzio, Luca

    2010-05-01

    In the proposed presentation we describe an architectural solution for enabling secure access to Grids, and possibly other large-scale on-demand processing infrastructures, through OGC (Open Geospatial Consortium) Web Services (OWS). This work has been carried out in the context of the security thread of the G-OWS Working Group. G-OWS (gLite enablement of OGC Web Services) is an international open initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII Project Consortia in order to collect and coordinate experiences in the enablement of OWSs on top of the gLite Grid middleware. G-OWS investigates the development of Spatial Data and Information Infrastructures (SDI and SII) based on Grid/Cloud capacity, in order to enable Earth Science applications and tools. Concerning security issues, the integration of OWS-compliant infrastructures and gLite Grids needs to address relevant challenges, due to their respective design principles. In fact, OWSs are part of a web-based architecture that delegates security aspects to other specifications, whereas the gLite middleware implements the Grid paradigm with a strong security model (the gLite Grid Security Infrastructure: GSI). In our work we propose a Security Architectural Framework allowing the seamless use of Grid-enabled OGC Web Services through the federation of existing (mostly web-based) security systems with the gLite GSI. This is made possible by mediating between different security realms, whose mutual trust is established in advance during the deployment of the system itself. Our architecture is composed of three security tiers: the user's security system, a specific G-OWS security system, and the gLite Grid Security Infrastructure. Applying the separation-of-concerns principle, each of these tiers is responsible for controlling access to a well-defined resource set, respectively: the user's organization resources, the geospatial resources and services, and the Grid

  13. Free Factories: Unified Infrastructure for Data Intensive Web Services

    PubMed Central

    Zaranek, Alexander Wait; Clegg, Tom; Vandewege, Ward; Church, George M.

    2010-01-01

    We introduce the Free Factory, a platform for deploying data-intensive web services using small clusters of commodity hardware and free software. Independently administered virtual machines called Freegols give application developers the flexibility of a general purpose web server, along with access to distributed batch processing, cache and storage services. Each cluster exploits idle RAM and disk space for cache, and reserves disks in each node for high bandwidth storage. The batch processing service uses a variation of the MapReduce model. Virtualization allows every CPU in the cluster to participate in batch jobs. Each 48-node cluster can achieve 4-8 gigabytes per second of disk I/O. Our intent is to use multiple clusters to process hundreds of simultaneous requests on multi-hundred terabyte data sets. Currently, our applications achieve 1 gigabyte per second of I/O with 123 disks by scheduling batch jobs on two clusters, one of which is located in a remote data center. PMID:20514356
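    The batch service's "variation of the MapReduce model" is not detailed in the abstract, but the model itself is easy to sketch: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase folds each group. A single-process toy version in Python, here a word count over hypothetical input lines:

```python
from collections import defaultdict
from itertools import chain

def map_reduce(records, mapper, reducer):
    """Toy single-process MapReduce: map, shuffle by key, then reduce."""
    groups = defaultdict(list)
    for key, value in chain.from_iterable(mapper(r) for r in records):
        groups[key].append(value)          # shuffle phase: group by key
    return {k: reducer(k, vs) for k, vs in groups.items()}

# Word count over hypothetical log lines
lines = ["a b a", "b c"]
counts = map_reduce(lines,
                    mapper=lambda line: [(w, 1) for w in line.split()],
                    reducer=lambda k, vs: sum(vs))  # {'a': 2, 'b': 2, 'c': 1}
```

    In the clustered setting the abstract describes, the map and reduce calls would be distributed across the Freegol virtual machines rather than run in one process.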

  14. Optimizing medical data quality based on multiagent web service framework.

    PubMed

    Wu, Ching-Seh; Khoury, Ibrahim; Shah, Hemant

    2012-07-01

    One of the most important issues in e-healthcare information systems is optimizing the quality of medical data extracted from distributed and heterogeneous environments, which can greatly improve diagnostic and treatment decision making. This paper proposes a multiagent web service framework based on service-oriented architecture for the optimization of medical data quality in the e-healthcare information system. Based on the design of the multiagent web service framework, an evolutionary algorithm (EA) for the dynamic optimization of medical data quality is proposed. The framework consists of two main components. First, an EA is used to dynamically optimize the composition of medical processes into an optimal task sequence according to specific quality attributes. Second, a multiagent framework is proposed to discover, monitor, and report any inconsistency between the optimized task sequence and the actual medical records. To demonstrate the proposed framework, experimental results for a breast cancer case study are provided. Furthermore, to show the unique performance of our algorithm, a comparison with other works in the literature is presented.
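    The paper's EA operates on compositions of medical processes; as a generic illustration of the idea (not the authors' algorithm), the sketch below evolves a task ordering toward a quality score using elitist selection and swap mutation. The task names and the fitness function are invented for illustration.

```python
import random

def evolve_sequence(tasks, fitness, generations=200, pop_size=30, seed=42):
    """Toy evolutionary search for a high-fitness ordering of tasks."""
    rng = random.Random(seed)
    pop = [rng.sample(tasks, len(tasks)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]        # elitist selection
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(len(tasks)), 2)
            child[i], child[j] = child[j], child[i]   # swap mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Hypothetical quality attribute: reward positions matching a target sequence
target = ["triage", "imaging", "lab", "diagnosis", "treatment"]
score = lambda seq: sum(a == b for a, b in zip(seq, target))
best = evolve_sequence(target, score)
```

    In the paper's setting the fitness would instead aggregate the chosen quality attributes of a candidate composition, but the select-mutate-replace loop is the same.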

  15. Web services interface for Space Weather: NeQuick 2 web and experimental TEC Calibration

    NASA Astrophysics Data System (ADS)

    Migoya Orue, Yenca O.; Nava, Bruno; Radicella, Sandro M.; Alazo Cuartas, Katy; Ciraolo, Luigi

    2013-04-01

    A web front-end has recently been developed and released to allow retrieving and plotting ionospheric parameters computed by the latest version of the model, NeQuick 2. NeQuick is a quick-run ionospheric electron density model particularly designed for trans-ionospheric propagation applications. It has been developed at the Aeronomy and Radiopropagation Laboratory (now T/ICT4D Laboratory) of the Abdus Salam International Centre for Theoretical Physics (ICTP), Trieste, Italy, in collaboration with the Institute for Geophysics, Astrophysics and Meteorology (IGAM) of the University of Graz, Austria. To describe the electron density of the ionosphere up to the peak of the F2 layer, NeQuick uses a profile formulation which includes five semi-Epstein layers with modelled thickness parameters. Through a simple web interface users can exploit all the model features, including the possibility of computing the electron density and visualizing the corresponding Total Electron Content (TEC) along any ground-to-satellite straight-line ray path. Indeed, the TEC is the ionospheric parameter retrieved from GPS measurements. It complements the experimental data obtained with diverse kinds of sensors and can be considered a major source of ionospheric information. Since the TEC is not a direct measurement, a "de-biasing" procedure, or calibration, has to be applied to obtain the relevant values from the raw GPS observables. Using the observation and navigation RINEX files corresponding to a single receiver as input data, the web application allows the user to compute the slant and/or vertical TEC following the concept of "arc-by-arc" offset estimation. The combined use of both tools, freely available from the T/ICT4D web site, allows the comparison of experimentally derived slant and vertical TEC with modelled values. An online demonstration of the capabilities of these web services will be given.

  16. Chapter 18: Web-based Tools - NED VO Services

    NASA Astrophysics Data System (ADS)

    Mazzarella, J. M.; NED Team

    The NASA/IPAC Extragalactic Database (NED) is a thematic, web-based research facility in widespread use by scientists, educators, space missions, and observatory operations for observation planning, data analysis, discovery, and publication of research about objects beyond our Milky Way galaxy. NED is a portal into a systematic fusion of data from hundreds of sky surveys and tens of thousands of research publications. The contents and services span the entire electromagnetic spectrum from gamma rays through radio frequencies, and are continuously updated to reflect the current literature and releases of large-scale sky survey catalogs. NED has been on the Internet since 1990, growing in content, automation, and services with the evolution of information technology. NED is the world's largest database of cross-identified extragalactic objects. As of December 2006, the system contains approximately 10 million objects and 15 million multi-wavelength cross-IDs. Over 4 thousand catalogs and published lists covering the entire electromagnetic spectrum have had their objects cross-identified or associated, with fundamental data parameters federated for convenient queries and retrieval. This chapter describes the interoperability of NED services with other components of the Virtual Observatory (VO). Section 1 is a brief overview of the primary NED web services. Section 2 provides a tutorial for using NED services currently available through the NVO Registry. The "name resolver" provides VO portals and related internet services with celestial coordinates for objects specified by catalog identifier (name); any alias can be queried because this service is based on the source cross-IDs established by NED. All major services have been updated to provide output in VOTable (XML) format that can be accessed directly from the NED web interface or using the NVO registry. These include access to images via SIAP, Cone-Search queries, and services providing fundamental, multi

  17. On the Use of Social Networks in Web Services: Application to the Discovery Stage

    NASA Astrophysics Data System (ADS)

    Maamar, Zakaria; Wives, Leandro Krug; Boukadi, Khouloud

    This chapter discusses the use of social networks in Web services, with a focus on the discovery stage that characterizes the life cycle of these Web services. Other stages in this life cycle include description, publication, invocation, and composition. Web services are software applications that end users or other peers can invoke and compose to satisfy different needs such as hotel booking and car rental. Discovering the relevant Web services is, and continues to be, a major challenge due to the dynamic nature of these Web services. Indeed, Web services appear/disappear or suspend/resume operations without prior notice. Traditional discovery techniques are based on registries such as Universal Description, Discovery and Integration (UDDI) and Electronic Business using eXtensible Markup Language (ebXML). Unfortunately, despite the different improvements that these techniques have undergone, they still suffer from various limitations that could slow down the acceptance of Web services by the IT community. Social networks seem to offer solutions to some of these limitations but raise, at the same time, some issues that are discussed in this chapter. The contributions of this chapter are threefold: a social network definition in the particular context of Web services; mechanisms that help Web services build, use, and maintain their respective social networks; and the adoption of social networks to discover Web services.

  18. Reliable execution based on CPN and skyline optimization for Web service composition.

    PubMed

    Chen, Liping; Ha, Weitao; Zhang, Guojun

    2013-01-01

    With the development of SOA, complex problems can be solved by combining available individual services and ordering them to best suit the user's requirements. Web services composition is widely used in business environments. Because component web services are inherently autonomous and heterogeneous, it is difficult to predict the behavior of the overall composite service. Therefore, transactional properties and nonfunctional quality of service (QoS) properties are crucial for selecting the web services to take part in the composition. Transactional properties ensure reliability of the composite Web service, and QoS properties can identify the best candidate web services from a set of functionally equivalent services. In this paper we define a Colored Petri Net (CPN) model which involves transactional properties of web services in the composition process. To ensure reliable and correct execution, unfolding processes of the CPN are followed. The execution of the transactional composition Web service (TCWS) is formalized by CPN properties. To identify the best services by QoS properties from the candidate service sets formed in the TCWS-CPN, we use skyline computation to retrieve the dominant (non-dominated) Web services. This avoids the significant information loss incurred when individual QoS scores are reduced to a single overall score. We evaluate our approach experimentally using both real and synthetically generated datasets.
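    The skyline step described in the abstract can be sketched as a simple Pareto-dominance filter. This is a minimal illustration only: the service names and QoS figures are invented, and the paper's CPN/unfolding machinery is not reproduced.

```python
# Skyline (Pareto-optimal) selection over candidate web services.
# Convention assumed here: lower response time ("rt") is better,
# higher availability ("avail") is better.

def dominates(a, b):
    """True if service a is no worse than b on every QoS dimension
    and strictly better on at least one."""
    no_worse = a["rt"] <= b["rt"] and a["avail"] >= b["avail"]
    better = a["rt"] < b["rt"] or a["avail"] > b["avail"]
    return no_worse and better

def skyline(services):
    """Return the services not dominated by any other candidate."""
    return [s for s in services
            if not any(dominates(t, s) for t in services if t is not s)]

# Invented candidates for illustration.
candidates = [
    {"name": "svcA", "rt": 120, "avail": 0.99},
    {"name": "svcB", "rt": 200, "avail": 0.95},  # dominated by svcA
    {"name": "svcC", "rt": 80,  "avail": 0.97},  # trade-off: kept
]

best = skyline(candidates)
```

The skyline keeps svcA and svcC, because neither dominates the other (one trades response time for availability), while svcB is worse than svcA on both dimensions.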

  19. FROG: Time Series Analysis for the Web Service Era

    NASA Astrophysics Data System (ADS)

    Allan, A.

    2005-12-01

    The FROG application is part of the next generation Starlink{http://www.starlink.ac.uk} software work (Draper et al. 2005) and released under the GNU Public License{http://www.gnu.org/copyleft/gpl.html} (GPL). Written in Java, it has been designed for the Web and Grid Service era as an extensible, pluggable tool for time series analysis and display. With an integrated SOAP server, the package's functionality is exposed to the user for use in their own code, and can be used remotely over the Grid as part of the Virtual Observatory (VO).

  20. Building Geospatial Web Services for Ecological Monitoring and Forecasting

    NASA Astrophysics Data System (ADS)

    Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.

    2008-12-01

    The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
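    As a rough illustration of how a client might talk to an OGC-compliant WMS server of the kind described above, the snippet below assembles a GetMap request URL. The endpoint, layer name, and TIME value are hypothetical; only the parameter names follow the OGC WMS 1.1.1 conventions.

```python
from urllib.parse import urlencode

# Sketch of a WMS 1.1.1 GetMap request. "example.org" and the
# layer "tops:gpp" are invented placeholders, not real endpoints.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "tops:gpp",        # hypothetical gridded-product layer
    "SRS": "EPSG:4326",
    "BBOX": "-125,32,-114,42",   # lon/lat bounding box (California)
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/png",
    "TIME": "2008-07-01",        # temporal dimension via the WMS TIME parameter
}
url = "http://example.org/wms?" + urlencode(params)
```

The TIME parameter is the standard WMS hook for the temporal dimension whose limited client support the abstract points out.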

  1. WS/PIDS: standard interoperable PIDS in web services environments.

    PubMed

    Vasilescu, E; Dorobanţu, M; Govoni, S; Padh, S; Mun, S K

    2008-01-01

    An electronic health record depends on the consistent handling of people's identities within and outside healthcare organizations. Currently, the Person Identification Service (PIDS), a CORBA specification, is the only well-researched standard that meets these needs. In this paper, we introduce WS/PIDS, a PIDS specification for Web Services (WS) that closely matches the original PIDS and improves on it by providing explicit support for medical multimedia attributes. WS/PIDS is currently supported by a test implementation, layered on top of a PIDS back-end, with Java-based, .NET-based, and Web clients. WS/PIDS is interoperable among platforms; it preserves PIDS semantics to a large extent, and it is intended to be fully compliant with established and emerging WS standards. The specification is open source and immediately usable in dynamic clinical systems participating in grid environments. WS/PIDS has been tested successfully with a comprehensive set of use cases, and it is being used in a clinical research setting.

  2. A Study of Older Adult Students' Satisfaction with Web-Based Distance Learning at the National Open University of Taiwan

    ERIC Educational Resources Information Center

    Chen, Ho-Yuan

    2010-01-01

    The purpose of this study was to investigate the relationships between older learners' demographic characteristics and their satisfaction with distance learning in the Web-based environment at National Open University in Taiwan (NOUT). Increases in the older adult population have had many impacts throughout societies. The major purpose of older…

  3. Design and implementation of CUAHSI WaterML and WaterOneFlow Web Services

    NASA Astrophysics Data System (ADS)

    Valentine, D. W.; Zaslavsky, I.; Whitenack, T.; Maidment, D.

    2007-12-01

    WaterOneFlow is a term for a group of web services created by and for the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) community. CUAHSI web services facilitate the retrieval of hydrologic observations information from online data sources using the SOAP protocol. CUAHSI Water Markup Language (below referred to as WaterML) is an XML schema defining the format of messages returned by the WaterOneFlow web services.

  4. Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment

    DTIC Science & Technology

    2015-09-30

    SPONSORED REPORT SERIES: Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment, 30 September 2015. Dr. Ying Zhao, Research... tasks completed this year. Task 1. We worked with the OSD OUSD ATL (US) to install the LLA/SSA/CLA system as a web service in the Defense

  5. Customer Decision Making in Web Services with an Integrated P6 Model

    NASA Astrophysics Data System (ADS)

    Sun, Zhaohao; Sun, Junqing; Meredith, Grant

    Customer decision making (CDM) is an indispensable factor for web services. This article examines CDM in web services with a novel P6 model, which consists of the 6 Ps: privacy, perception, propensity, preference, personalization and promised experience. This model integrates the existing 6 P elements of the marketing mix as the system environment of CDM in web services. The new integrated P6 model deals with the inner world of the customer and incorporates what the customer thinks during the DM process. The proposed approach will facilitate the research and development of web services and decision support systems.

  6. Web Service Infrastructure for Correcting InSAR Imaging

    NASA Astrophysics Data System (ADS)

    von Allmen, P. A.; Fielding, E. J.; Xing, Z.; Pan, L.; Fishbein, E.

    2011-12-01

    InSAR images can be obtained from satellite radar data by combining signals acquired at two different times along the spacecraft's orbit, at nearly identical geospatial locations. Changes in the propagation of the radar signal from the first acquisition to the second, caused for example by changes in the tropospheric water vapor content, can lead to a deterioration of the quality of the interferometric data analysis. Other extraneous effects such as ocean tidal loading can also lead to errors that reduce the potential science return of InSAR missions. Data from Global Positioning Systems and infrared radiometers are currently used on an ad hoc basis for tropospheric corrections when available, and operational weather forecasts have been demonstrated to fill in the remaining spatial and temporal gaps. We have developed a set of web services named OSCAR (Online Services for Correcting Atmosphere in Radar) that, transparently to the user, retrieve remote sensing and weather forecast data and deliver atmospheric radar delays on a latitude-longitude grid that can be directly integrated with Interferometric Synthetic Aperture Radar data processing software. We will describe the common web service architecture, relying on RESTful interfaces, that we developed to streamline the development of OSCAR's capabilities. We will also discuss the Bayesian averaging process that we use for merging the radiometric data with numerical weather forecast results. Correcting for biases and estimating the error model will be discussed in detail and validation results will be presented. The success of the correction procedure will be demonstrated by using MODIS data and ECMWF model output. We will also outline the extension of our online correction system to include GPS data to automatically correct for biases in the radiometric data, and a model of ocean tidal loading to correct for long-wavelength errors near coastal regions.
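    The simplest instance of the Bayesian averaging the abstract mentions is an inverse-variance weighted mean of independent Gaussian estimates. The sketch below assumes that special case; the delay values and variances are invented, and OSCAR's actual procedure (with bias correction and error modeling) is more elaborate.

```python
def merge_estimates(values, variances):
    """Inverse-variance weighted mean of independent Gaussian
    estimates. Returns (merged_value, merged_variance)."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * x for w, x in zip(weights, values)) / total
    return mean, 1.0 / total

# Invented example: a zenith wet delay (cm) from a radiometer and
# from a weather model, with assumed error variances.
merged, var = merge_estimates([10.0, 12.0], [1.0, 4.0])
```

The merged estimate sits closer to the lower-variance measurement, and the merged variance is smaller than either input's, which is the point of combining the two sources.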

  7. International cooperation in veterinary public health curricula using web-based distance interactive education.

    PubMed

    Lipman, Len J; Barnier, Valérie M; de Balogh, Katalin K

    2003-01-01

    The expanding field of Veterinary Public Health places new demands on the knowledge and skills of veterinarians. Veterinary curricula must therefore adapt to this new profile. Through the introduction of case studies dealing with up-to-date issues, students are being trained to solve (real-life) problems and come up with realistic solutions. At the Department of Public Health and Food Safety of the Veterinary Faculty at the University of Utrecht in the Netherlands, positive experiences have resulted from the new opportunities offered by the use of information and communication technology (ICT) in education. The possibility of creating a virtual classroom on the Internet through the use of WebCT software has enabled teachers and students to tackle emerging issues by working together with students in other countries and across disciplines. This article presents some of these experiences, through which international exchange of ideas and realities was stimulated, in addition to consolidating relations between universities in different countries. Long-distance education methodologies provide an important tool for meeting the increasing need for international cooperation in Veterinary Public Health curricula.

  8. Establishing Transportation Framework Services Using the Open Geospatial Consortium Web Feature Service Specification

    NASA Astrophysics Data System (ADS)

    Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.

    2005-12-01

    As a teamed partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level (BTS/DOT), the state level (VDOT), and in industry (Intergraph). CEOSR develops WFS solutions using Intergraph software. Relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services; 2) VDOT's state transportation data and GIS infrastructure; and 3) BTS/DOT's national transportation data. The project: 1) develops and deploys an operational OGC WFS 1.1 interface at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation framework data as listed in Table 1; 2) builds a WFS service that returns data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme in the format given by OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; and 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS

  9. Towards Semantic Web Services on Large, Multi-Dimensional Coverages

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2009-04-01

    Observed and simulated data in the Earth Sciences often come as coverages, the general term for space-time varying phenomena as set forth by standardization bodies like the Open GeoSpatial Consortium (OGC) and ISO. Among such data are 1-D time series, 2-D surface data, 3-D surface data time series as well as x/y/z geophysical and oceanographic data, and 4-D metocean simulation results. With increasing dimensionality the data sizes grow exponentially, up to Petabyte object sizes. Open standards for exploiting coverage archives over the Web are available to a varying extent. The OGC Web Coverage Service (WCS) standard defines basic extraction operations: spatio-temporal and band subsetting, scaling, reprojection, and data format encoding of the result - a simple interoperable interface for coverage access. More processing functionality is available with products like Matlab, Grid-type interfaces, and the OGC Web Processing Service (WPS). However, these often lack properties known as advantageous from databases: declarativeness (describe results rather than the algorithms), safety in evaluation (no request can keep a server busy infinitely), and optimizability (enable the server to rearrange the request so as to produce the same result faster). WPS defines a geo-enabled SOAP interface for remote procedure calls. This makes it possible to webify any program, but does not provide semantic interoperability: a function is identified only by its function name and parameters, while the semantics is encoded in the (only human-readable) title and abstract. Hence, another desirable property is missing, namely an explicit semantics which allows for machine-machine communication and reasoning à la Semantic Web. The OGC Web Coverage Processing Service (WCPS) language, which was adopted as an international standard by OGC in December 2008, defines a flexible interface for the navigation, extraction, and ad-hoc analysis of large, multi-dimensional raster coverages. It is abstract in that it
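    The declarative flavor of WCPS can be conveyed by a query string shipped in a key-value GET request. The coverage name, axis labels, and endpoint below are invented for illustration; only the for/return/encode shape of the query follows the WCPS language.

```python
from urllib.parse import urlencode

# A WCPS query is a declarative expression: it names the coverage,
# a spatio-temporal subset, and an encoding, leaving the execution
# strategy to the server (which is what makes it optimizable).
query = (
    'for c in (temperature_4d) '                      # hypothetical coverage
    'return encode(c[lat(40:45), lon(10:15), t("2008-01-01")], "csv")'
)
url = "http://example.org/wcps?" + urlencode({"query": query})
```

Because the request describes the result rather than an algorithm, the server is free to rearrange evaluation, one of the database-style properties the abstract argues WPS lacks.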

  10. The use of geospatial web services for exchanging utilities data

    NASA Astrophysics Data System (ADS)

    Kuczyńska, Joanna

    2013-04-01

    Geographic information technologies and related geo-information systems currently play an important role in the management of public administration in Poland. One of these tasks is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains technical-infrastructure information that is important to many institutions. This requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions that administer transmission lines. The administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data-exchange methodology that can be implemented on a variety of hardware and software platforms. This methodology uses the Unified Modeling Language (UML), eXtensible Markup Language (XML), and Geography Markup Language (GML). The proposed methodology is based on two different strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases. The models were written in UML. A combined model that defines a common data structure was also built. This model was transformed into the GML standard, developed for the exchange of geographic information. The structure of the document describing the data that may be exchanged is defined in an .xsd file. Network services were selected and implemented in a system designed for data exchange based on open source tools. The methodology was implemented and tested. Data in the agreed data structure and metadata were set up on the server. Data access was provided by geospatial network services: data searching possibilities by Catalog Service for the Web (CSW), data

  11. The Best of Two Worlds: Combining ITV and Web Quests To Strengthen Distance Learning.

    ERIC Educational Resources Information Center

    Mosby, Charmaine

    This presentation describes an English graduate seminar in Local Color and Regionalism in American Literature at Western Kentucky University that was set up as an experimental hybrid course, i.e., roughly 60% face-to-face and 40% Web course (Web quest format). The focus is on the four tasks that comprised the Web quest segment of the course: (1) a…

  12. IKey+: a new single-access key generation web service.

    PubMed

    Burguiere, Thomas; Causse, Florian; Ung, Visotheary; Vignes-Lebbe, Régine

    2013-01-01

    Single-access keys are a major tool for biologists who need to identify specimens. The construction process of these keys is particularly complex (especially if the input data set is large), so having an automatic single-access key generation tool is essential. As part of the European project ViBRANT, our aim was to develop such a tool as a web service, thus allowing end-users to integrate it directly into their workflow. IKey+ generates single-access keys on demand, for single users or research institutions. It receives user input data (using the standard SDD format), accepts several key-generation parameters (affecting the key topology and representation), and supports several output formats. IKey+ is freely available (sources and binary packages) at www.identificationkey.fr. Furthermore, it is deployed on our server and can be queried (for testing purposes) through a simple web client also available at www.identificationkey.fr (last accessed 13 August 2012). Finally, a client plugin will be integrated into the Scratchpads biodiversity networking tool (scratchpads.eu).

  13. Cloud/web mapping and geoprocessing services - Intelligently linking geoinformation

    NASA Astrophysics Data System (ADS)

    Veenendaal, Bert; Brovelli, Maria Antonia; Wu, Lixin

    2016-04-01

    We live in a world that is alive with information and geographies. "Everything happens somewhere" (Tosta, 2001). This reality is being exposed in the digital earth technologies providing a multi-dimensional, multi-temporal and multi-resolution model of the planet, based on the needs of diverse actors: from scientists to decision makers, communities and citizens (Brovelli et al., 2015). We are building up a geospatial information infrastructure updated in real time thanks to mobile, positioning and sensor observations. Users can navigate, not only through space but also through time, to access historical data and future predictions based on social and/or environmental models. But how do we find the information about certain geographic locations or localities when it is scattered in the cloud and across the web of data behind a diversity of databases, web services and hyperlinked pages? We need to be able to link geoinformation together in order to integrate it, make sense of it, and use it appropriately for managing the world and making decisions.

  14. A "Virtual Fieldtrip": Service Learning in Distance Education Technical Writing Courses

    ERIC Educational Resources Information Center

    Soria, Krista M.; Weiner, Brad

    2013-01-01

    This mixed-methods experimental study examined the effect of service learning in a distance education technical writing course. Quantitative analysis of data found evidence for a positive relationship between participation in service learning and technical writing learning outcomes. Additionally, qualitative analysis suggests that service learning…

  15. Embracing Change: Adapting and Evolving Your Distance Learning Library Services to Meet the New ACRL Distance Learning Library Services Standards

    ERIC Educational Resources Information Center

    Marcum, Brad

    2016-01-01

    This article examines the update and revision of the current Association of College and Research Libraries (ACRL) Distance Learning Standards that has been proposed and submitted to the ACRL Standards Committee. An in-depth analysis of the update is included, along with some comparisons between the old and new. Practical advice detailing…

  16. Protecting Database Centric Web Services against SQL/XPath Injection Attacks

    NASA Astrophysics Data System (ADS)

    Laranjeiro, Nuno; Vieira, Marco; Madeira, Henrique

    Web services represent a powerful interface for back-end database systems and are increasingly being used in business critical applications. However, field studies show that a large number of web services are deployed with security flaws (e.g., having SQL Injection vulnerabilities). Although several techniques for the identification of security vulnerabilities have been proposed, developing non-vulnerable web services is still a difficult task. In fact, security-related concerns are hard to apply as they involve adding complexity to already complex code. This paper proposes an approach to secure web services against SQL and XPath Injection attacks, by transparently detecting and aborting service invocations that try to take advantage of potential vulnerabilities. Our mechanism was applied to secure several web services specified by the TPC-App benchmark, proving to be 100% effective in stopping attacks, non-intrusive, and very easy to use.
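    The vulnerability class the paper targets can be demonstrated in a few lines: splicing user input into the SQL text lets a crafted string alter the query, while a parameterized query binds the same input as plain data. This sketch shows the underlying problem only, not the paper's server-side detection-and-abort mechanism.

```python
import sqlite3

# In-memory database with one row of "sensitive" data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cr3t')")

malicious = "nobody' OR '1'='1"

# Vulnerable: attacker input becomes part of the SQL statement,
# turning the WHERE clause into a tautology.
leaked = conn.execute(
    "SELECT secret FROM users WHERE name = '" + malicious + "'"
).fetchall()

# Safe: the driver binds the value as data, so no rows match.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (malicious,)
).fetchall()
```

The concatenated query returns the secret row; the parameterized one returns nothing, because the quote characters in the input never reach the SQL parser.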

  17. QoS prediction for Web service in Mobile Internet environment

    NASA Astrophysics Data System (ADS)

    Sun, Qibo; Wang, Lubao; Wang, Shangguang; Ma, You; Hsu, Ching-Hsien

    2016-07-01

    Quality of Service (QoS) prediction plays an important role in Web service recommendation. Many existing Web service QoS prediction approaches are highly accurate and useful in Internet environments. However, the QoS data of Web services in the Mobile Internet are notably more volatile, which causes these approaches to fail to make accurate QoS predictions. In this paper, by weakening the volatility of QoS data, we propose an accurate Web service QoS prediction approach based on the collaborative filtering algorithm. This approach consists of three steps: QoS preprocessing, user-similarity computation, and QoS prediction. We have implemented our proposed approach in an experiment based on real-world and synthetically generated datasets. The results demonstrate that our approach outperforms other approaches in the Mobile Internet.
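    The user-similarity and prediction steps can be sketched as plain user-based collaborative filtering. All QoS values below are invented, and the paper's volatility-weakening preprocessing step is omitted.

```python
from math import sqrt

# Observed response times (ms) per user per service; u2 has not
# invoked s3, so we predict it from similar users.
qos = {
    "u1": {"s1": 100, "s2": 200, "s3": 150},
    "u2": {"s1": 110, "s2": 210},
    "u3": {"s1": 300, "s2": 100, "s3": 400},
}

def similarity(a, b):
    """Pearson correlation over services both users have invoked."""
    common = set(qos[a]) & set(qos[b])
    xa = [qos[a][s] for s in common]
    xb = [qos[b][s] for s in common]
    ma, mb = sum(xa) / len(xa), sum(xb) / len(xb)
    num = sum((x - ma) * (y - mb) for x, y in zip(xa, xb))
    den = sqrt(sum((x - ma) ** 2 for x in xa) * sum((y - mb) ** 2 for y in xb))
    return num / den if den else 0.0

def predict(user, service):
    """Similarity-weighted average over positively similar users."""
    peers = [(similarity(user, u), qos[u][service])
             for u in qos if u != user and service in qos[u]]
    peers = [(w, v) for w, v in peers if w > 0]
    norm = sum(w for w, _ in peers)
    return sum(w * v for w, v in peers) / norm if norm else None

pred = predict("u2", "s3")
```

Here u1's usage pattern correlates positively with u2's while u3's does not, so the prediction for u2's unseen service follows u1's observation.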

  18. WebTraceMiner: a web service for processing and mining EST sequence trace files.

    PubMed

    Liang, Chun; Wang, Gang; Liu, Lin; Ji, Guoli; Liu, Yuansheng; Chen, Jinqiao; Webb, Jason S; Reese, Greg; Dean, Jeffrey F D

    2007-07-01

    Expressed sequence tags (ESTs) remain a dominant approach for characterizing the protein-encoding portions of various genomes. Due to inherent deficiencies, they also present serious challenges for data quality control. Before GenBank submission, EST sequences are typically screened and trimmed of vector and adapter/linker sequences, as well as polyA/T tails. Removal of these sequences presents an obstacle for data validation of error-prone ESTs and impedes data mining of certain functional motifs, whose detection relies on accurate annotation of positional information for polyA tails added posttranscriptionally. As raw DNA sequence information is made increasingly available from public repositories, such as NCBI Trace Archive, new tools will be necessary to reanalyze and mine this data for new information. WebTraceMiner (www.conifergdb.org/software/wtm) was designed as a public sequence processing service for raw EST traces, with a focus on detection and mining of sequence features that help characterize 3' and 5' termini of cDNA inserts, including vector fragments, adapter/linker sequences, insert-flanking restriction endonuclease recognition sites and polyA or polyT tails. WebTraceMiner complements other public EST resources and should prove to be a unique tool to facilitate data validation and mining of error-prone ESTs (e.g. discovery of new functional motifs).
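    One of the features mined, a polyA tail near the 3' end of a cDNA insert, can be located with a simple regular-expression scan. The sequence and the minimum run length below are invented for illustration; WebTraceMiner's actual detection is more sophisticated.

```python
import re

def find_polya(seq, min_len=10):
    """Return the (start, end) span of the last run of >= min_len
    consecutive A's, or None if no such run exists."""
    hits = list(re.finditer("A{%d,}" % min_len, seq.upper()))
    return hits[-1].span() if hits else None

# Invented EST fragment: 19 bases, an 18-base polyA run, 3 trailing bases.
est = "GGCTTACGTGACTTTGCAT" + "A" * 18 + "GTC"
span = find_polya(est)
```

Recording the span rather than trimming it is the point the abstract makes: positional information for the polyA tail is what downstream motif detection relies on.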

  19. Standards-Based, Web Services for Interoperable Geosciences Data Systems

    NASA Astrophysics Data System (ADS)

    Domenico, B.; Nativi, S.; Bigagli, L.; Caron, J.

    2005-12-01

    Disparate, "stove-pipe" data systems are among the main impediments to many interdisciplinary research projects in the geosciences. The solid earth disciplines and hydrology tend to use Geographic Information Systems (GIS), which enable them to store and interact with data represented as discrete features on or near the surface of the earth. Studies of the oceans and atmosphere, on the other hand, involve systems that represent data as discrete points in the continuous function space of fluid dynamics. Attempts to understand the nature of severe precipitation and flooding events are hampered by the difficulty of integrating data such as streamflows from hydrological data systems with radar data and precipitation forecasts from atmospheric science data systems. An effort is underway to address some of these issues with an interoperability experiment within the framework of the Open Geospatial Consortium (OGC). The experiment is called GALEON (Geo-interface to Atmosphere, Land, Earth, Ocean NetCDF). Teams at the Unidata Program Center and the University of Florence are working with a number of international partners to implement a web services interface to traditional atmospheric and oceanographic datasets currently stored in netCDF form or served via the OPeNDAP protocol. The project will result in a gateway service using the Web Coverage Service (WCS) specification of the OGC. Underneath the WCS interface will be a combination of technologies including THREDDS (THematic Real-time Environmental Distributed Data Services) and HDF5 (Hierarchical Data Format) in addition to netCDF and OPeNDAP. A key component of the project is to develop mechanisms for explicit encoding of coordinate system information in the form of Coordinate System extensions to NcML (the netCDF Markup Language), directly in the data files themselves and in the form of GML (Geography Markup Language) extensions to NcML. These extensions, called NcML-GML, include a subset profile of the standard GML which is

  20. Web Services at the National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center (NCDC)

    NASA Astrophysics Data System (ADS)

    Ansari, S.; Baldwin, R.; Del Greco, S.; Lott, N.; Rutledge, G.

    2007-12-01

    NOAA's National Climatic Data Center (NCDC) currently archives over 1.5 petabytes of climatological data from various networks and sources including in-situ, numerical models, radar and satellite. Access to these datasets is evolving from interactive web interfaces utilizing database technology to standardized web services in a Service Oriented Architecture (SOA). NCDC is currently offering several web services using Simple Object Access Protocol (SOAP), XML over Representational State Transfer (REST/XML), Open Geospatial Consortium (OGC) Web Map Service (WMS) / Web Feature Service (WFS) / Web Coverage Service (WCS) and OPeNDAP web service protocols. These services offer users a direct connection between their client applications and NCDC data servers. In addition, users may embed access to the services in custom applications to efficiently navigate and subset data in an automated fashion. NCDC currently provides gridded numerical model data through a THREDDS Data Server and GrADS Data Server which offers OPeNDAP and WCS access. In-situ network metadata are available through WMS and WFS while the corresponding time-series data are accessible through SOAP and REST web services. These in-situ services are a part of the Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI) WaterOneFlow services, a consolidated access system for hydrologic data, and comply with the WaterOneFlow specifications. NCDC's Severe Weather Data Inventory (SWDI), which provides user access to archives of several datasets critical to the detection and evaluation of severe weather, is also accessible through REST/XML services. Providing cataloging, access and search capabilities for many of NCDC's datasets using community-driven standards is a top priority for the ever-increasing data volumes being archived at NCDC. Providing interoperable access is critical to supporting data stewardship across multiple scientific disciplines and user types. This demonstration will

  1. A verification strategy for web services composition using enhanced stacked automata model.

    PubMed

    Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali

    2015-01-01

    Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique of its implementation is web services. An individual service offered by a service provider may represent limited business functionality; however, by composing individual services from different providers, a composite service describing the intact business process of an enterprise can be made. Many new standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an initial framework for an Extended Markup Language (XML) specification language for defining and implementing business process workflows for web services. The main problem with most realistic approaches to service composition is the verification of the composed web services: formal verification methods are needed to ensure the correctness of the composed services. Only a few research works in the literature address the verification of web services, and those target deterministic systems. Moreover, the existing models do not address verification properties such as dead transitions, deadlock, reachability and safety. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated based on properties such as dead transitions, deadlock, safety, liveness and reachability. Initially, web services are composed using the Business Process Execution Language for Web Services (BPEL4WS), converted into an ESAM (a combination of a Muller Automaton (MA) and a Push Down Automaton (PDA)), and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results revealed better performance in finding dead transitions and deadlocks in contrast to the
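The properties the abstract names (reachability, dead transitions, deadlock) can be illustrated with a much simpler structure than the paper's ESAM. The sketch below (our own illustration, not the authors' model or SPIN) checks them on a plain finite transition system via breadth-first reachability:

```python
# Simplified illustration of the verification properties named above:
# reachability, dead transitions, and deadlock on a finite transition system.
# This is a toy reachability analysis, not the paper's ESAM or SPIN.
from collections import deque

def analyze(transitions, start, accepting):
    """transitions: dict mapping state -> {label: next_state}."""
    # Reachability: BFS from the start state.
    reachable = {start}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        for nxt in transitions.get(s, {}).values():
            if nxt not in reachable:
                reachable.add(nxt)
                queue.append(nxt)
    # A dead transition originates in a state that is never reached.
    dead_transitions = [(s, lbl) for s, outs in transitions.items()
                        if s not in reachable for lbl in outs]
    # A deadlock is a reachable, non-accepting state with no outgoing transitions.
    deadlocks = sorted(s for s in reachable
                       if not transitions.get(s) and s not in accepting)
    return reachable, dead_transitions, deadlocks
```

For example, a service with an unreachable retry state `q3` yields one dead transition, and a fault state `q4` with no outgoing edges is reported as a deadlock.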

  2. Bridging the Distance: Service Learning in International Perspective

    ERIC Educational Resources Information Center

    Florman, Jean C.; Just, Craig; Naka, Tomomi; Peterson, Jim; Seaba, Hazel H.

    2009-01-01

    In this article, the authors describe how an existing partnership between two communities, one in eastern Iowa and one in Mexico, was turned into a cross-disciplinary and international service learning course for students in the University of Iowa Colleges of Engineering, Pharmacy, and Liberal Arts and Sciences. The projects that students worked…

  3. Pre-Service Teachers' Views on Web-Based Classroom Management

    ERIC Educational Resources Information Center

    Boyaci, Adnan

    2010-01-01

    With the invention of the World Wide Web in 1992, delivery of distance education via the Internet and the emergence of web-based classrooms have rapidly gained acceptance as an alternative and supplement to traditional face-to-face classroom instruction (Alavi, Yoo & Vogel, 1997; Rahm & Reed, 1997), which represents a paradigm shift challenging all…

  4. Determinants of Corporate Web Services Adoption: A Survey of Companies in Korea

    ERIC Educational Resources Information Center

    Kim, Daekil

    2010-01-01

    Despite the growing interest and attention from Information Technology researchers and practitioners, empirical research on factors that influence an organization's likelihood of adoption of Web Services has been limited. This study identified the factors influencing Web Services adoption from the perspective of 151 South Korean firms. The…

  5. The Business Information Services: Old-Line Online Moves to the Web.

    ERIC Educational Resources Information Center

    O'Leary, Mick

    1997-01-01

    Although the availability of free information on the World Wide Web has placed traditional, fee-based proprietary online services on the defensive, most major online business services are now on the Web. Highlights several business information providers: Profound, NewsNet and ProQuest Direct, Dow Jones and Wall Street Journal Interactive Edition,…

  6. Distance Career Counseling: A Technology-Assisted Model for Delivering Career Counseling Services.

    ERIC Educational Resources Information Center

    Djadali, Yas; Malone, James F.

    The purpose of the present article is to demonstrate the need for distance career counseling services, and to present an evolving counseling model that combines the best practices of face-to-face career counseling with technology. The article begins by tracing the historical development of distance career counseling models, and then illustrates…

  7. Comparison of distance measures in spatial analytical modeling for health service planning

    PubMed Central

    2009-01-01

    Background Several methodological approaches have been used to estimate distance in health service research. In this study, focusing on cardiac catheterization services, Euclidean, Manhattan, and the less widely known Minkowski distance metrics are used to estimate distances from patient residence to hospital. Distance metrics typically produce less accurate estimates than actual measurements, but each metric provides a single model of travel over a given network. Therefore, distance metrics, unlike actual measurements, can be directly used in spatial analytical modeling. Euclidean distance is most often used, but is unlikely to be the most appropriate metric. Minkowski distance is a more promising method. Distances estimated with each metric are contrasted with road distance and travel time measurements, and an optimized Minkowski distance is implemented in spatial analytical modeling. Methods Road distance and travel time are calculated from the postal code of residence of each patient undergoing cardiac catheterization to the pertinent hospital. The Minkowski metric is optimized to approximate travel time and road distance, respectively. Distance estimates and distance measurements are then compared using descriptive statistics and visual mapping methods. The optimized Minkowski metric is implemented, via the spatial weight matrix, in a spatial regression model identifying socio-economic factors significantly associated with cardiac catheterization. Results The Minkowski coefficient that best approximates road distance is 1.54; 1.31 best approximates travel time. The latter is also a good predictor of road distance, thus providing the best single model of travel from a patient's residence to hospital. The Euclidean metric and the optimal Minkowski metric are alternatively implemented in the regression model, and the results compared. The Minkowski method produces more reliable results than the traditional Euclidean metric. Conclusion Road distance and travel time
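The Minkowski metric generalizes the two familiar cases: exponent p=1 gives Manhattan distance, p=2 gives Euclidean distance, and the optimized exponents the abstract reports (1.54 for road distance, 1.31 for travel time) fall between them. A minimal sketch:

```python
# Minkowski distance between two points given as coordinate sequences.
# p=1 is Manhattan distance, p=2 is Euclidean; the optimized exponents
# reported above (1.54 and 1.31) lie between these two special cases.
def minkowski(a, b, p):
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)
```

For the point pair (0, 0) and (3, 4), Euclidean distance is 5, Manhattan distance is 7, and the p=1.54 estimate lies between the two, which is exactly why the tuned exponent can model travel over a real road network better than either extreme.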

  8. SSE Announcement - New GIS Web Mapping Applications and Services

    Atmospheric Science Data Center

    2016-06-30

    ... If you haven’t already noticed the link to the new SSE-GIS web application on the SSE homepage entitled “GIS Web Mapping Applications and Services”, we invite you to visit the site. The Surface meteorology and Solar Energy (SSE) v1.0.3 Web Mapping ...

  9. Integration of RFID and web service for assisted living.

    PubMed

    Unluturk, Mehmet S; Kurtel, Kaan

    2012-08-01

    The number of people over 65 years old throughout most stable and prosperous countries in the world is increasing. Availability of their care in their own homes is imperative because of economic reasons and their choice of where to live (World Health Organization, Definition of an older or elderly person. http://www.who.int/healthinfo/survey/ageingdefnolder/en/ ; EQUIP-European Framework for Qualifications in Home Care Services for Older People, http://www.equip-project.com ; Salonen, 2009). "Recent advancement in wireless communications and electronics has enabled the development of low-cost sensor networks. The sensor networks can be utilized in various application areas." (Akyildiz, et al. 2002) These two statements show that there is great promise in wireless technology, and utilizing it in assisted living might be very beneficial to elderly people. In this paper, we propose a software architecture called Location Windows Service (LWS), which integrates Radio Frequency Identification (RFID) technology and web services to build an assisted living system for elderly people at home. This architecture monitors the location of elderly people without interfering in their daily activities. Location information messages that are generated as the elderly move from room to room indicate that the elderly person is fit and healthy and going about their normal life. The communication must be timely enough to follow elderly people as they move from room to room without missing a location. Unacknowledged publishing, subscription filtering and short location change messages are also included in this software model to reduce the network traffic in large homes. We also propose defense schemes for the network environment of the assisted living system to prevent external attacks.
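The traffic-reduction idea in the abstract (short location-change messages plus filtering of duplicate reads) can be sketched as follows. This is our own illustration with hypothetical names, not the LWS implementation:

```python
# Illustrative sketch (hypothetical names, not the paper's LWS code) of the
# traffic-reduction idea above: publish a short message only when the tracked
# person's room actually changes, suppressing repeated reads from the same
# RFID zone.
class LocationPublisher:
    def __init__(self, publish):
        self.publish = publish      # callback standing in for the web service
        self.last_room = None

    def on_rfid_read(self, person, room):
        if room != self.last_room:  # filter out duplicate same-room reads
            self.last_room = room
            self.publish(f"{person}@{room}")  # short location-change message
```

A tag read in the same room as before produces no network traffic; only genuine room transitions are published.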

  10. The ICNP BaT - from translation tool to translation web service.

    PubMed

    Schrader, Ulrich

    2009-01-01

    The ICNP BaT has been developed as a web application to support the collaborative translation of different versions of the ICNP into different languages. A prototype of a web service is described that could reuse the translations in the database of the ICNP BaT to provide automatic translations of nursing content based on the ICNP terminology globally. The translation web service is based on a service-oriented architecture making it easy to interoperate with different applications. Such a global translation server would free individual institutions from the maintenance costs of realizing their own translation services.

  11. A Framework of Synthesizing Tutoring Conversation Capability with Web-Based Distance Education Courseware

    ERIC Educational Resources Information Center

    Song, Ki-Sang; Hu, Xiangen; Olney, Andrew; Graesser, Arthur C.

    2004-01-01

    Whereas existing learning environments on the Web lack high level interactivity, we have developed a human tutor-like tutorial conversation system for the Web that enhances educational courseware through mixed-initiative dialog with natural language processing. The conversational tutoring agent is composed of an animated tutor, a Latent Semantic…

  12. Working without a Crystal Ball: Predicting Web Trends for Web Services Librarians

    ERIC Educational Resources Information Center

    Ovadia, Steven

    2008-01-01

    User-centered design is a principle stating that electronic resources, like library Web sites, should be built around the needs of the users. This article interviews Web developers of library and non-library-related Web sites, determining how they assess user needs and how they decide to adapt certain technologies for users. According to the…

  13. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These
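The REST-style query pattern described above can be illustrated by composing a station-inventory request URL. The endpoint is hypothetical and the parameter names follow general FDSN web service conventions, not verified NCEDC specifics:

```python
# Hypothetical sketch: composing a RESTful query URL in the style of the
# FDSN-compatible station web service described above. The base endpoint and
# parameter names are assumptions following FDSN conventions, not verified
# NCEDC specifics.
from urllib.parse import urlencode

def station_query(base, network, station, start, end, fmt="xml"):
    params = {"net": network, "sta": station,
              "starttime": start, "endtime": end, "format": fmt}
    return base + "?" + urlencode(params)

url = station_query("https://service.example.org/fdsnws/station/1/query",
                    "BK", "CMB", "2012-01-01", "2012-12-31")
```

Because the request is a plain URL, it can be issued from a web browser or any HTTP client, which is what lets such services replace batch or email-based requests.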

  14. A Privacy Access Control Framework for Web Services Collaboration with Role Mechanisms

    NASA Astrophysics Data System (ADS)

    Liu, Linyuan; Huang, Zhiqiu; Zhu, Haibin

    With the popularity of Internet technology, web services are becoming the most promising paradigm for distributed computing. This increased use of web services has meant that more and more personal information of consumers is being shared with web service providers, leading to the need to guarantee the privacy of consumers. This paper proposes a role-based privacy access control framework for Web services collaboration. The framework utilizes roles to specify the privacy privileges of services, and factors a service's historic experience in playing roles into its reputation degree. Compared to traditional privacy access control approaches, this framework can make fine-grained authorization decisions, thus efficiently protecting consumers' privacy.
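The authorization rule the abstract describes (role grants the privacy privilege, and the service's reputation is sufficient) can be reduced to a small check. The names below are hypothetical, our sketch of the idea rather than the paper's framework:

```python
# Minimal sketch (hypothetical names, not the paper's framework) of role-based
# privacy authorization: a service may access a privacy item only if one of
# its roles grants that item and its reputation degree meets the threshold.
def authorized(service_roles, reputation, role_privileges, item, min_reputation):
    if reputation < min_reputation:
        return False                    # poor historic experience in roles
    return any(item in role_privileges.get(role, ())
               for role in service_roles)
```

The per-item lookup is what makes the decision fine-grained: a delivery service can be granted the consumer's address without ever being authorized to see payment data.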

  15. Understanding Transactional Distance in Web-Based Learning Environments: An Empirical Study

    ERIC Educational Resources Information Center

    Huang, Xiaoxia; Chandra, Aruna; DePaolo, Concetta A.; Simmons, Lakisha L.

    2016-01-01

    Transactional distance is an important pedagogical theory in distance education that calls for more empirical support. The purpose of this study was to verify the theory by operationalizing and examining the relationship of (1) dialogue, structure and learner autonomy to transactional distance, and (2) environmental factors and learner demographic…

  16. The Building of Digital Archives Personalized Service Website based on Web 2.0

    NASA Astrophysics Data System (ADS)

    Ziyu, Cheng; Haining, An

    Web 2.0 technology has been applied to the digital archive personalized service website. Although the site currently has few users and awareness of it is limited, the author believes that Web 2.0, with its advantages of speed, convenience and zero cost, will be adopted by more and more users in the future. As Web 2.0 continues to mature and gain popularity, the personalized service of digital archives will display new vitality. In this paper, the author proposes approaches for applying Web 2.0 in the system.

  17. Job submission and management through web services: the experience with the CREAM service

    NASA Astrophysics Data System (ADS)

    Aiftimiei, C.; Andreetto, P.; Bertocco, S.; Fina, S. D.; Ronco, S. D.; Dorigo, A.; Gianelle, A.; Marzolla, M.; Mazzucato, M.; Sgaravatto, M.; Verlato, M.; Zangrando, L.; Corvo, M.; Miccio, V.; Sciaba, A.; Cesini, D.; Dongiovanni, D.; Grandi, C.

    2008-07-01

    Modern Grid middleware is built around components providing basic functionality, such as data storage, authentication, security, job management, resource monitoring and reservation. In this paper we describe the Computing Resource Execution and Management (CREAM) service. CREAM provides a Web service-based job execution and management capability for Grid systems; in particular, it is being used within the gLite middleware. CREAM exposes a Web service interface allowing conforming clients to submit and manage computational jobs to a Local Resource Management System. We developed a special component, called ICE (Interface to CREAM Environment) to integrate CREAM in gLite. ICE transfers job submissions and cancellations from the Workload Management System, allowing users to manage CREAM jobs from the gLite User Interface. This paper describes some recent studies aimed at assessing the performance and reliability of CREAM and ICE; those tests have been performed as part of the acceptance tests for integration of CREAM and ICE in gLite. We also discuss recent work towards enhancing CREAM with a BES and JSDL compliant interface.

  18. Data partitioning enables the use of standard SOAP Web Services in genome-scale workflows.

    PubMed

    Sztromwasser, Pawel; Puntervoll, Pål; Petersen, Kjell

    2011-07-26

    Biological databases and computational biology tools are provided by research groups around the world, and made accessible on the Web. Combining these resources is a common practice in bioinformatics, but integration of heterogeneous and often distributed tools and datasets can be challenging. To date, this challenge has been commonly addressed in a pragmatic way, by tedious and error-prone scripting. Recently, however, a more reliable technique has been identified and proposed as the platform to tie together bioinformatics resources, namely Web Services. In the last decade, Web Services have spread widely in bioinformatics and earned the title of recommended technology. However, in the era of high-throughput experimentation, a major concern regarding Web Services is their ability to handle large-scale data traffic. We propose a stream-like communication pattern for standard SOAP Web Services that enables efficient flow of large data traffic between a workflow orchestrator and Web Services. We evaluated the data-partitioning strategy by comparing it with typical communication patterns on an example pipeline for genomic sequence annotation. The results show that data-partitioning lowers the resource demands of services and increases their throughput, which in consequence allows in-silico experiments to be executed at genome scale using standard SOAP Web Services and workflows. As a proof of principle, we annotated an RNA-seq dataset using a plain BPEL workflow engine.
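The data-partitioning strategy can be sketched independently of SOAP: split a large payload into fixed-size chunks so that each ordinary request/response call carries only one partition, keeping per-call memory bounded on both ends. A minimal illustration (our own, not the paper's implementation):

```python
# Minimal sketch of the data-partitioning idea above: split a large dataset
# into fixed-size chunks and invoke the (stand-in) service once per chunk,
# so no single call has to carry the whole genome-scale payload.
def partition(records, chunk_size):
    for i in range(0, len(records), chunk_size):
        yield records[i:i + chunk_size]

def call_in_chunks(records, service, chunk_size=100):
    # 'service' stands in for one SOAP operation invocation per partition.
    results = []
    for chunk in partition(records, chunk_size):
        results.extend(service(chunk))
    return results
```

The orchestrator sees a stream of small calls whose results are concatenated, while each service instance only ever holds one chunk in memory.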

  19. AFAL: a web service for profiling amino acids surrounding ligands in proteins

    NASA Astrophysics Data System (ADS)

    Arenas-Salinas, Mauricio; Ortega-Salazar, Samuel; Gonzales-Nilo, Fernando; Pohl, Ehmke; Holmes, David S.; Quatrini, Raquel

    2014-11-01

    With advancements in crystallographic technology and the increasing wealth of information populating structural databases, there is an increasing need for prediction tools based on spatial information that will support the characterization of proteins and protein-ligand interactions. Herein, a new web service is presented termed amino acid frequency around ligand (AFAL) for determining amino acids type and frequencies surrounding ligands within proteins deposited in the Protein Data Bank and for assessing the atoms and atom-ligand distances involved in each interaction (availability: http://structuralbio.utalca.cl/AFAL/index.html). AFAL allows the user to define a wide variety of filtering criteria (protein family, source organism, resolution, sequence redundancy and distance) in order to uncover trends and evolutionary differences in amino acid preferences that define interactions with particular ligands. Results obtained from AFAL provide valuable statistical information about amino acids that may be responsible for establishing particular ligand-protein interactions. The analysis will enable investigators to compare ligand-binding sites of different proteins and to uncover general as well as specific interaction patterns from existing data. Such patterns can be used subsequently to predict ligand binding in proteins that currently have no structural information and to refine the interpretation of existing protein models. The application of AFAL is illustrated by the analysis of proteins interacting with adenosine-5'-triphosphate.

  20. AFAL: a web service for profiling amino acids surrounding ligands in proteins.

    PubMed

    Arenas-Salinas, Mauricio; Ortega-Salazar, Samuel; Gonzales-Nilo, Fernando; Pohl, Ehmke; Holmes, David S; Quatrini, Raquel

    2014-11-01

    With advancements in crystallographic technology and the increasing wealth of information populating structural databases, there is an increasing need for prediction tools based on spatial information that will support the characterization of proteins and protein-ligand interactions. Herein, a new web service is presented termed amino acid frequency around ligand (AFAL) for determining amino acids type and frequencies surrounding ligands within proteins deposited in the Protein Data Bank and for assessing the atoms and atom-ligand distances involved in each interaction (availability: http://structuralbio.utalca.cl/AFAL/index.html ). AFAL allows the user to define a wide variety of filtering criteria (protein family, source organism, resolution, sequence redundancy and distance) in order to uncover trends and evolutionary differences in amino acid preferences that define interactions with particular ligands. Results obtained from AFAL provide valuable statistical information about amino acids that may be responsible for establishing particular ligand-protein interactions. The analysis will enable investigators to compare ligand-binding sites of different proteins and to uncover general as well as specific interaction patterns from existing data. Such patterns can be used subsequently to predict ligand binding in proteins that currently have no structural information and to refine the interpretation of existing protein models. The application of AFAL is illustrated by the analysis of proteins interacting with adenosine-5'-triphosphate.
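The core computation AFAL performs (finding residues whose atoms lie within a distance cutoff of any ligand atom) can be sketched on toy coordinates. This is our illustration of the idea, not AFAL's code:

```python
# Toy sketch of the core computation described above (our illustration, not
# AFAL's code): count protein atoms, grouped by residue type, that lie within
# a cutoff distance of any ligand atom, using Euclidean 3D distances.
from math import dist
from collections import Counter

def residues_near_ligand(protein_atoms, ligand_atoms, cutoff=4.0):
    """protein_atoms: list of (residue_name, (x, y, z)) tuples;
    ligand_atoms: list of (x, y, z) tuples."""
    counts = Counter()
    for residue, coord in protein_atoms:
        if any(dist(coord, lig) <= cutoff for lig in ligand_atoms):
            counts[residue] += 1
    return counts
```

Aggregating such counts over many Protein Data Bank structures, filtered by family, organism or resolution, is what yields the amino acid frequency profiles the service reports.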

  1. Service Knowledge Spaces for Semantic Collaboration in Web-based Systems

    NASA Astrophysics Data System (ADS)

    Bianchini, Devis; de Antonellis, Valeria; Melchiori, Michele

    Semantic Web technologies have been applied to enable collaboration in open distributed systems, where interoperability issues arise due to the absence of a global view of the shared resources. Adoption of service-oriented technologies has improved interoperability at the application level by exporting system functionalities as Web services. In fact, Service Oriented Architecture (SOA) constitutes an appropriate platform-independent approach to implement collaboration activities by means of automatic service discovery and composition. Recently, service discovery has been applied to collaborative environments such as P2P, where independent partners need to cooperate through resource sharing without a stable network configuration and while adopting different semantic models. Model-based techniques relying on the Semantic Web need to be defined to generate semantic service descriptions, allowing collaborative partners to export their functionalities in a semantic way. Semantic-based service matchmaking techniques are in charge of effectively and efficiently evaluating similarity between service requests and service offers in a huge, dynamic distributed environment. The result is an evolving service knowledge space where collaborative partners that provide similar services are semantically related and constitute synergic service centres in a given domain. Specific modeling requirements related to Semantic Web, service-oriented and P2P technologies must be considered.

  2. BioCatalogue: a universal catalogue of web services for the life sciences.

    PubMed

    Bhagat, Jiten; Tanoh, Franck; Nzuobontane, Eric; Laurent, Thomas; Orlowski, Jerzy; Roos, Marco; Wolstencroft, Katy; Aleksejevs, Sergejs; Stevens, Robert; Pettifer, Steve; Lopez, Rodrigo; Goble, Carole A

    2010-07-01

    The use of Web Services to enable programmatic access to on-line bioinformatics is becoming increasingly important in the Life Sciences. However, their number, distribution and the variable quality of their documentation can make their discovery and subsequent use difficult. A Web Services registry with information on available services will help to bring together service providers and their users. The BioCatalogue (http://www.biocatalogue.org/) provides a common interface for registering, browsing and annotating Web Services to the Life Science community. Services in the BioCatalogue can be described and searched in multiple ways based upon their technical types, bioinformatics categories, user tags, service providers or data inputs and outputs. They are also subject to constant monitoring, allowing the identification of service problems and changes and the filtering-out of unavailable or unreliable resources. The system is accessible via a human-readable 'Web 2.0'-style interface and a programmatic Web Service interface. The BioCatalogue follows a community approach in which all services can be registered, browsed and incrementally documented with annotations by any member of the scientific community.

  3. BioCatalogue: a universal catalogue of web services for the life sciences

    PubMed Central

    Bhagat, Jiten; Tanoh, Franck; Nzuobontane, Eric; Laurent, Thomas; Orlowski, Jerzy; Roos, Marco; Wolstencroft, Katy; Aleksejevs, Sergejs; Stevens, Robert; Pettifer, Steve; Lopez, Rodrigo; Goble, Carole A.

    2010-01-01

    The use of Web Services to enable programmatic access to on-line bioinformatics is becoming increasingly important in the Life Sciences. However, their number, distribution and the variable quality of their documentation can make their discovery and subsequent use difficult. A Web Services registry with information on available services will help to bring together service providers and their users. The BioCatalogue (http://www.biocatalogue.org/) provides a common interface for registering, browsing and annotating Web Services to the Life Science community. Services in the BioCatalogue can be described and searched in multiple ways based upon their technical types, bioinformatics categories, user tags, service providers or data inputs and outputs. They are also subject to constant monitoring, allowing the identification of service problems and changes and the filtering-out of unavailable or unreliable resources. The system is accessible via a human-readable ‘Web 2.0’-style interface and a programmatic Web Service interface. The BioCatalogue follows a community approach in which all services can be registered, browsed and incrementally documented with annotations by any member of the scientific community. PMID:20484378

  4. The EarthServer Geology Service: web coverage services for geosciences

    NASA Astrophysics Data System (ADS)

    Laxton, John; Sen, Marcus; Passmore, James

    2014-05-01

    The EarthServer FP7 project is implementing web coverage services using the OGC WCS and WCPS standards for a range of earth science domains: cryospheric; atmospheric; oceanographic; planetary; and geological. BGS is providing the geological service (http://earthserver.bgs.ac.uk/). Geoscience has used remotely sensed data from satellites and planes for some considerable time, but other areas of geosciences are less familiar with the use of coverage data. This is rapidly changing with the development of new sensor networks and the move from geological maps to geological spatial models. The BGS geology service is designed initially to address two coverage data use cases and three levels of data access restriction. Databases of remote sensed data are typically very large and commonly held offline, making it time-consuming for users to assess and then download data. The service is designed to allow the spatial selection, editing and display of Landsat and aerial photographic imagery, including band selection and contrast stretching. This enables users to rapidly view data, assess its usefulness for their purposes, and then enhance and download it if it is suitable. At present the service contains six band Landsat 7 (Blue, Green, Red, NIR 1, NIR 2, MIR) and three band false colour aerial photography (NIR, green, blue), totalling around 1Tb. Increasingly, 3D spatial models are being produced in place of traditional geological maps. Models make explicit spatial information that is implicit on maps and thus are seen as a better way of delivering geosciences information to non-geoscientists. However, web delivery of models, including the provision of suitable visualisation clients, has proved more challenging than delivering maps. The EarthServer geology service is delivering 35 surfaces as coverages, comprising the modelled superficial deposits of the Glasgow area. These can be viewed using a 3D web client developed in the EarthServer project by Fraunhofer. As well as remote sensed

  5. Market Research: The World Wide Web Meets the Online Services.

    ERIC Educational Resources Information Center

    Bing, Michelle

    1996-01-01

    The World Wide Web can provide direct market research data inexpensively or can target the appropriate professional online database and narrow the search. This article discusses the Web presence of research and investment firms, financial pages, trade associations, and electronic publications containing market research data. It lists Uniform…

  6. Enabling Web-Based Analysis of CUAHSI HIS Hydrologic Data Using R and Web Processing Services

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Kadlec, J.; Bayles, M.; Seul, M.; Hooper, R. P.; Cummings, B.

    2015-12-01

    The CUAHSI Hydrologic Information System (CUAHSI HIS) provides open access to a large number of hydrological time series observation and modeled data from many parts of the world. Several software tools have been designed to simplify searching and access to the CUAHSI HIS datasets. These software tools include: Desktop client software (HydroDesktop, HydroExcel), developer libraries (WaterML R Package, OWSLib, ulmo), and the new interactive search website, http://data.cuahsi.org. An issue with using the time series data from CUAHSI HIS for further analysis by hydrologists (for example for verification of hydrological and snowpack models) is the large heterogeneity of the time series data. The time series may be regular or irregular, contain missing data, have different time support, and be recorded in different units. R is a widely used computational environment for statistical analysis of time series and spatio-temporal data that can be used to assess fitness and perform scientific analyses on observation data. R includes the ability to record a data analysis in the form of a reusable script. The R script together with the input time series dataset can be shared with other users, making the analysis more reproducible. The major goal of this study is to examine the use of R as a Web Processing Service for transforming time series data from the CUAHSI HIS and sharing the results on the Internet within HydroShare. HydroShare is an online data repository and social network for sharing large hydrological data sets such as time series, raster datasets, and multi-dimensional data. It can be used as a permanent cloud storage space for saving the time series analysis results. We examine the issues associated with running R scripts online: including code validation, saving of outputs, reporting progress, and provenance management. An explicit goal is that the script which is run locally should produce exactly the same results as the script run on the Internet. Our design can
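One concrete instance of the heterogeneity problem the abstract mentions is resampling an irregular time series onto a regular grid before analysis. A minimal sketch (our own illustration, not HydroShare or CUAHSI code; interpolation of off-grid samples is deliberately omitted):

```python
# Sketch of one heterogeneity problem mentioned above: projecting an irregular
# time series onto a regular time grid, marking gaps with None so downstream
# analysis sees a uniform series. Illustrative only; off-grid samples would
# need interpolation or nearest-neighbour matching, omitted here.
def regularize(samples, start, end, step):
    """samples: dict of timestamp -> value. Returns [(t, value_or_None), ...]."""
    out = []
    t = start
    while t <= end:
        out.append((t, samples.get(t)))  # None where no observation exists
        t += step
    return out
```

A transformation like this is the kind of small, reusable step that a shared script, run either locally or as a web processing service, should perform identically in both settings.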

  7. Some Programs Should Not Run on Laptops - Providing Programmatic Access to Applications Via Web Services

    NASA Astrophysics Data System (ADS)

    Gupta, V.; Gupta, N.; Gupta, S.; Field, E.; Maechling, P.

    2003-12-01

    Modern laptop computers, and personal computers, can provide capabilities that are, in many ways, comparable to workstations or departmental servers. However, this doesn't mean we should run all computations on our local computers. We have identified several situations in which it is preferable to implement our seismological application programs in a distributed, server-based, computing model. In this model, application programs on the user's laptop, or local computer, invoke programs that run on an organizational server, and the results are returned to the invoking system. Situations in which a server-based architecture may be preferred include: (a) a program is written in a language, or written for an operating environment, that is unsupported on the local computer, (b) software libraries or utilities required to execute a program are not available on the user's computer, (c) a computational program is physically too large, or computationally too expensive, to run on a user's computer, (d) a user community wants to enforce a consistent method of performing a computation by standardizing on a single implementation of a program, and (e) the computational program may require current information that is not available to all client computers. Until recently, distributed, server-based, computational capabilities were implemented using client/server architectures. In these architectures, client programs were often written in the same language, and they executed in the same computing environment, as the servers. Recently, a new distributed computational model, called Web Services, has been developed. Web Services are based on Internet standards such as XML, SOAP, WSDL, and UDDI. Web Services offer the promise of platform- and language-independent distributed computing. To investigate this new computational model, and to provide useful services to the SCEC Community, we have implemented several computational and utility programs using a Web Service architecture. We have

  8. The QuakeSim Project: Web Services for Managing Geophysical Data and Applications

    NASA Astrophysics Data System (ADS)

    Pierce, Marlon E.; Fox, Geoffrey C.; Aktas, Mehmet S.; Aydin, Galip; Gadgil, Harshawardhan; Qi, Zhigang; Sayar, Ahmet

    2008-04-01

    We describe our distributed systems research efforts to build the “cyberinfrastructure” components that constitute a geophysical Grid, or more accurately, a Grid of Grids. Service-oriented computing principles are used to build a distributed infrastructure of Web accessible components for accessing data and scientific applications. Our data services fall into two major categories: Archival, database-backed services based around Geographical Information System (GIS) standards from the Open Geospatial Consortium, and streaming services that can be used to filter and route real-time data sources such as Global Positioning System data streams. Execution support services include application execution management services and services for transferring remote files. These data and execution service families are bound together through metadata information and workflow services for service orchestration. Users may access the system through the QuakeSim scientific Web portal, which is built using a portlet component approach.

  9. Measuring Transactional Distance in Web-Based Learning Environments: An Initial Instrument Development

    ERIC Educational Resources Information Center

    Huang, Xiaoxia; Chandra, Aruna; DePaolo, Concetta; Cribbs, Jennifer; Simmons, Lakisha

    2015-01-01

    This study was an initial attempt to operationalise Moore's transactional distance theory by developing and validating an instrument measuring the related constructs: dialogue, structure, learner autonomy and transactional distance. Data were collected from 227 online students and analysed through an exploratory factor analysis. Results suggest…

  10. Distance Learning in Indian Country: Becoming the Spider on the Web.

    ERIC Educational Resources Information Center

    Sanchez, John; Stuckey, Mary E.; Morris, Richard

    1998-01-01

    Examines potential uses of distance learning for maintaining and sustaining American-Indian tribal communities within the United States while allowing access to the information and skills needed for employment in the dominant society. Examines distance education in general, traditional education in tribal contexts, tribal uses of…

  11. Enrollment in Distance Education Courses, by State: Fall 2012. Web Tables. NCES 2014-023

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2014

    2014-01-01

    Postsecondary enrollment in distance education courses, particularly those offered online, has rapidly increased in recent years (Allen and Seaman 2013). Traditionally, distance education offerings and enrollment levels have varied across different types of institutions. For example, researchers have found that undergraduate enrollment in at least…

  12. A Web-Based Model for Developing Assessment Literacy of Secondary In-Service Teachers

    ERIC Educational Resources Information Center

    Fan, Ya-Ching; Wang, Tzu-Hua; Wang, Kuo-Hua

    2011-01-01

    This research investigates the effect of a web-based model, named "Practicing, Reflecting, and Revising with Web-based Assessment and Test Analysis system (P2R-WATA) Assessment Literacy Development Model," on enhancing assessment knowledge and perspectives of secondary in-service teachers, and adopts a single group experimental research…

  13. Motivating Pre-Service Teachers in Technology Integration of Web 2.0 for Teaching Internships

    ERIC Educational Resources Information Center

    Kim, Hye Jeong; Jang, Hwan Young

    2015-01-01

    The aim of this study was to examine the predictors of pre-service teachers' use of Web 2.0 tools during a teaching internship, after a course that emphasized the use of the tools for instructional activities. Results revealed that integrating Web 2.0 tools during their teaching internship was strongly predicted by participants' perceived…

  14. Services for Graduate Students: A Review of Academic Library Web Sites

    ERIC Educational Resources Information Center

    Rempel, Hannah Gascho

    2010-01-01

    A library's Web site is well recognized as the gateway to the library for the vast majority of users. Choosing the most user-friendly Web architecture to reflect the many services libraries offer is a complex process, and librarians are still experimenting to find what works best for their users. As part of a redesign of the Oregon State…

  15. OneGeology Web Services and Portal as a global geological SDI - latest standards and technology

    NASA Astrophysics Data System (ADS)

    Duffy, Tim; Tellez-Arenas, Agnes

    2014-05-01

    The global coverage of OneGeology Web Services (www.onegeology.org and portal.onegeology.org) achieved since 2007 from the 120 participating geological surveys will be reviewed and issues arising discussed. Recent enhancements to the OneGeology Web Services capabilities will be covered, including a new up-to-5-star service accreditation scheme utilising the ISO/OGC Web Mapping Service standard version 1.3, core ISO 19115 metadata additions, and Version 2.0 Web Feature Services (WFS) serving the new IUGS-CGI GeoSciML V3.2 geological web data exchange language standard (http://www.geosciml.org/) with its associated 30+ IUGS-CGI vocabularies (http://resource.geosciml.org/ and http://srvgeosciml.brgm.fr/eXist2010/brgm/client.html). Use of the CGI simplelithology and timescale dictionaries now allows those who wish to do so to offer data harmonisation queries against their GeoSciML 3.2 based Web Feature Services and their GeoSciML_Portrayal V2.0.1 (http://www.geosciml.org/) Web Map Services in the OneGeology portal (http://portal.onegeology.org). Contributing to OneGeology involves offering to serve geological data, ideally at 1:1,000,000 scale (in practice any scale is now warmly welcomed), as an OGC (Open Geospatial Consortium) standard based WMS (Web Mapping Service) from an available WWW server. This may be hosted either within the Geological Survey or at a neighbouring, regional, or other institution that offers to serve the data for them, i.e. that helps technically by providing the web-serving IT infrastructure as a 'buddy'. OneGeology is a standards-focussed Spatial Data Infrastructure (SDI) and works to ensure that these standards work together; it is now possible for European Geological Surveys to register their INSPIRE web services within the OneGeology SDI (e.g. see http://www.geosciml.org/geosciml/3.2/documentation/cookbook/INSPIRE_GeoSciML_Cookbook%20_1.0.pdf). The OneGeology portal (http://portal.onegeology.org) is the first port of call for anyone…

  16. A Survey of the Usability of Digital Reference Services on Academic Health Science Library Web Sites

    ERIC Educational Resources Information Center

    Dee, Cheryl; Allen, Maryellen

    2006-01-01

    Reference interactions with patrons in a digital library environment using digital reference services (DRS) have become widespread. However, such services in many libraries appear to be underutilized. A study surveying the ease and convenience of such services for patrons in over 100 academic health science library Web sites suggests that…

  17. How the OCLC CORC Service Is Helping Weave Libraries into the Web.

    ERIC Educational Resources Information Center

    Covert, Kay

    2001-01-01

    Describes OCLC's CORC (Cooperative Online Resource Catalog) service. As a state-of-the-art Web-based metadata creation system, CORC is optimized for creating bibliographic records and pathfinders for electronic resources. Discusses how libraries are using CORC in technical services, public services, and collection development and explains the…

  18. QoS Measurement of Workflow-Based Web Service Compositions Using Colored Petri Net

    PubMed Central

    Nematzadeh, Hossein; Motameni, Homayun; Nematzadeh, Zahra

    2014-01-01

    Workflow-based web service compositions (WB-WSCs) form one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in this category. With the maturity of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. A business should try to provide good quality, with respect to customers' requirements, in a composed web service. Thus quality of service (QoS), which refers to nonfunctional parameters, is important to measure, so that the quality degree of a given web service composition can be determined. This paper presents a deterministic analytical method for dependability and performance measurement using a Colored Petri net (CPN) with explicit routing constructs and applying the theory of probability. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation. PMID:25110748
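
    Analytical QoS methods of this kind build on standard aggregation rules for workflow routing constructs: in a sequence, response times add and reliabilities multiply; across a parallel (AND-split/AND-join) block, the slowest branch dominates the time while reliabilities still multiply. A hedged sketch of these generic rules (the two-tuple QoS model and function names are illustrative, not taken from WSET):

```python
# Generic QoS aggregation for workflow-based compositions.
# Each service is modeled as a (response_time, reliability) tuple.
from math import prod

def qos_sequence(services):
    """Sequential routing: times add, reliabilities multiply."""
    return (sum(t for t, _ in services), prod(r for _, r in services))

def qos_parallel(branches):
    """AND-split/AND-join: the join waits for the slowest branch."""
    return (max(t for t, _ in branches), prod(r for _, r in branches))

# Workflow A -> (B || C) -> D
a, b, c, d = (1.0, 0.99), (2.0, 0.95), (3.0, 0.98), (0.5, 0.999)
total = qos_sequence([a, qos_parallel([b, c]), d])  # (4.5, ~0.9208)
```

    Loop and choice constructs are handled similarly, weighting branch QoS by routing probabilities, which is where the theory of probability enters the analysis.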

  19. QoS measurement of workflow-based web service compositions using Colored Petri net.

    PubMed

    Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra

    2014-01-01

    Workflow-based web service compositions (WB-WSCs) form one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in this category. With the maturity of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. A business should try to provide good quality, with respect to customers' requirements, in a composed web service. Thus quality of service (QoS), which refers to nonfunctional parameters, is important to measure, so that the quality degree of a given web service composition can be determined. This paper presents a deterministic analytical method for dependability and performance measurement using a Colored Petri net (CPN) with explicit routing constructs and applying the theory of probability. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.

  20. The value of Web-based library services at Cedars-Sinai Health System.

    PubMed Central

    Halub, L P

    1999-01-01

    Cedars-Sinai Medical Library/Information Center has maintained Web-based services since 1995 on the Cedars-Sinai Health System network. In that time, the librarians have found the provision of Web-based services to be a very worthwhile endeavor. Library users value the services that they access from their desktops because the services save time. They also appreciate being able to access services at their convenience, without restriction by the library's hours of operation. The library values its Web site because it brings increased visibility within the health system, and it enables library staff to expand services when budget restrictions have forced reduced hours of operation. In creating and maintaining the information center Web site, the librarians have learned the following lessons: consider the design carefully; offer what services you can, but weigh the advantages of providing the services against the time required to maintain them; make the content as accessible as possible; promote your Web site; and make friends in other departments, especially information services. PMID:10427423

  1. Semi-automatic web service composition for the life sciences using the BioMoby semantic web framework.

    PubMed

    DiBernardo, Michael; Pottinger, Rachel; Wilkinson, Mark

    2008-10-01

    Researchers in the life-sciences are currently limited to small-scale informatics experiments and analyses because of the lack of interoperability among life-sciences web services. This limitation can be addressed by annotating services and their interfaces with semantic information, so that interoperability problems can be reasoned about programmatically. The Moby semantic web framework is a popular and mature platform that is used for this purpose. However, the number of services that are available to select from when building a workflow is becoming unmanageable for users. As such, attempts have been made to assist with service selection and composition. These tasks fall under the general label of automated service composition. We present a prototype workflow assembly client that reduces the number of choices that users have to make by (1) restricting the overall set of services presented to them and (2) ranking services so that the most desirable ones are presented first. We demonstrate via an evaluation of this prototype that a unification of relatively simple techniques can rank desirable services highly while maintaining interactive response times.

  2. Using JavaScript and the FDSN web service to create an interactive earthquake information system

    NASA Astrophysics Data System (ADS)

    Fischer, Kasper D.

    2015-04-01

    The FDSN web service provides a web interface for accessing earthquake metadata (e.g. event or station information) and waveform data over the internet. Requests are sent to a server as URLs and the output is either XML or miniSEED, which makes it hard for humans to read but easy to process with different software. Several data centers already support the FDSN web service, e.g. USGS, IRIS, and ORFEUS. The FDSN web service is also part of the Seiscomp3 (http://www.seiscomp3.org) software. The Seismological Observatory of the Ruhr-University switched to Seiscomp3 as its standard software for the analysis of mining-induced earthquakes at the beginning of 2014. This made it necessary to create a new web-based earthquake information service for publishing results to the general public. This has been done by processing the output of an FDSN web service query with JavaScript running in a standard browser. The result is an interactive map presenting the observed events, with further event and station information shown on a single web page both in a table and on the map. In addition, the user can download event information, waveform data, and station data in different formats such as miniSEED, QuakeML, or FDSNxml. The developed code and all used libraries are open source and freely available.
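
    The URL-based request pattern described above is simple to reproduce. A sketch of assembling an FDSN event query (shown in Python rather than the browser-side JavaScript; `starttime`, `endtime`, `minmagnitude`, and `format` are standard parameters of the FDSN event web service, and the IRIS endpoint is one of the data centers mentioned):

```python
# Assemble an FDSN event web service request URL from key-value parameters.
from urllib.parse import urlencode

def fdsn_event_query(base, **params):
    """Encode query parameters per the FDSN web service specification."""
    return f"{base}/fdsnws/event/1/query?{urlencode(params)}"

url = fdsn_event_query(
    "https://service.iris.edu",
    starttime="2014-01-01", endtime="2014-12-31",
    minmagnitude=2.0, format="xml",
)
```

    A browser client then fetches this URL and parses the returned QuakeML to populate the event table and map.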

  3. Design, Development and Testing of Web Services for Multi-Sensor Snow Cover Mapping

    NASA Astrophysics Data System (ADS)

    Kadlec, Jiri

    This dissertation presents the design, development and validation of new data integration methods for mapping the extent of snow cover based on open access ground station measurements, remote sensing images, volunteer observer snow reports, and cross country ski track recordings from location-enabled mobile devices. The first step of the data integration procedure includes data discovery, data retrieval, and data quality control of snow observations at ground stations. The WaterML R package developed in this work enables hydrologists to retrieve and analyze data from multiple organizations that are listed in the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Water Data Center catalog directly within the R statistical software environment. Use of the WaterML R package is demonstrated by running an energy balance snowpack model in R with data inputs from CUAHSI, and by automating uploads of real-time sensor observations to a CUAHSI HydroServer. The second step of the procedure requires efficient access to multi-temporal remote sensing snow images. The Snow Inspector web application developed in this research enables users to retrieve a time series of fractional snow cover from the Moderate Resolution Imaging Spectroradiometer (MODIS) for any point on Earth. The time series retrieval method is based on automated data extraction from tile images provided by a Web Map Tile Service (WMTS). The average required time for retrieving 100 days of data using this technique is 5.4 seconds, which is significantly faster than other methods that require the download of large satellite image files. The presented data extraction technique and space-time visualization user interface can be used as a model for working with other multi-temporal hydrologic or climate data WMTS services. The third, final step of the data integration procedure is generating continuous daily snow cover maps. A custom inverse distance weighting method has been developed…
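
    The final interpolation step rests on the generic inverse distance weighting (IDW) idea: each scattered observation contributes in proportion to an inverse power of its distance from the prediction point. A minimal sketch of plain IDW (the power parameter `p` and this exact weighting scheme are the textbook method, not the dissertation's custom variant):

```python
# Plain inverse distance weighting over scattered (x, y, value) observations.
def idw(x, y, stations, p=2.0):
    """Interpolate at (x, y) from stations = [(xi, yi, value), ...]."""
    num = den = 0.0
    for xi, yi, v in stations:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v  # exact hit: return the observation itself
        w = 1.0 / d2 ** (p / 2.0)  # weight = 1 / distance**p
        num += w * v
        den += w
    return num / den

obs = [(0.0, 0.0, 1.0), (1.0, 0.0, 0.0)]
mid = idw(0.5, 0.0, obs)  # equidistant from both -> simple average, 0.5
```

    Evaluating this on every cell of a regular grid yields a continuous map from point observations; larger `p` localizes the influence of each station.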

  4. Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment (DAVE)

    DTIC Science & Technology

    2015-05-01

    Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment (DAVE). May 13-14, 2015. Dr. Ying… Topics include: pattern recognition that scales up to Big Data; System Self-Awareness (SSA); Big Data and Deep Learning (BDDL) / Big Data Architecture and…

  5. Web-Based Lecture Technologies: Blurring the Boundaries between Face-to-Face and Distance Learning

    ERIC Educational Resources Information Center

    Woo, Karen; Gosper, Maree; McNeill, Margot; Preston, Greg; Green, David; Phillips, Rob

    2008-01-01

    Web-based lecture technologies (WBLT) have gained popularity amongst universities in Australia as a tool for delivering lecture recordings to students in close to real time. This paper reports on a selection of results from a larger research project investigating the impact of WBLT on teaching and learning. Results show that while staff see the…

  6. Using Allaire's ColdFusion to Deliver Web-Based Information to Distance Learning Students.

    ERIC Educational Resources Information Center

    Arashiro, Peter; Milton, Kirby

    1999-01-01

    Discusses Lansing Community College's (Michigan) Virtual College, initiated in 1997 and now offering 47 courses for online delivery. Reviews ColdFusion 3.1, a database-driven Web application server, and describes how the college has successfully used it to support students and faculty. (EMH)

  7. Development of Two Multimedia Corpora for Adaptive Web-Based Distance Learning.

    ERIC Educational Resources Information Center

    Kopsacheilis, Evangelos V.

    This paper presents two cases of the use of a World Wide Web-based system designed to support online educational procedures. The system presented, called DVT (Dynamic Virtual Trainer), consists of one or more server machines that store structured educational material in the form of autonomous units that may be accessed, combined, and presented to…

  8. The Knowledge Web: Learning and Collaborating on the Net. Open and Distance Learning Series.

    ERIC Educational Resources Information Center

    Eisenstadt, Marc, Ed.; Vincent, Tom, Ed.

    This book contains a collection of examples of new and effective uses of the World Wide Web in education from the Knowledge Media Institute (KMi) at the Open University (Great Britain). The publication is organized in three main sections--"Learning Media,""Collaboration and Presence," and "Knowledge Systems on the…

  9. Building from Where We Are: Web Services and Java-Based Clients to Enable Virtual Observatories

    NASA Astrophysics Data System (ADS)

    Candey, R. M.; Chimiak, R. A.; Han, D. B.; Harris, B. T.; Johnson, R. C.; Klipsch, C. A.; Kovalick, T. J.; Leckner, H. A.; Liu, M. H.; McGuire, R. E.

    2005-12-01

    The Space Physics Data Facility at NASA Goddard has developed a strong foundation in space science mission services and data for enhancing the scientific return of space physics research and enabling integration of these services into the emerging Virtual discipline Observatory (VxO) paradigm. We are deploying a critical set of foundation components, leveraging our data format expertise and our existing and very popular science and orbit data web-based services, such as Coordinated Data Analysis Web [CDAWeb] and Satellite Situation Center Web [SSCweb]. We have developed web services APIs for orbit location, data finding across FTP sites and in CDAWeb, data file format translation, and data visualizations that tie together existing data holdings, standardize and simplify their use, and enable much enhanced interoperability and data analysis. We describe the technologies we've developed, our experiences and lessons-learned in implementing them, why we chose some technologies over others (web services vs. CORBA or simple CGI, Java vs. JavaScript or Flash), what we might do differently now, and our future direction. We discuss the difficulties in maintaining compatibility and inter-operability through various versions of web services and in merging various client projects, while adding extended functionality such as sonification interfaces in a modular fashion.

  10. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    NASA Astrophysics Data System (ADS)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

    Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be kept in their own local environments, making it easy for modelers to maintain and update them. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) model metadata describing the external features of a model (e.g., variable name, unit, computational grid), and (3) a model integration framework. We present an architecture for coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing, uncovering their metadata through BMI functions. After a BMI-enabled model is deployed as a service, a client can initialize, execute, and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing the model interface using BMI, as well as providing a set of utilities smoothing the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute, and find the dependencies of the BMI-enabled web service models. Using the revised EMELI, an example is presented of integrating a set of TopoFlow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. (2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014
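
    The coupling pattern above works because every component answers the same small set of BMI calls, so a framework like EMELI can drive any model, local or remote, without knowing its internals. A hedged sketch of that calling pattern using a toy linear-reservoir component (the class and its physics are illustrative, not a CSDMS component; only the method names follow the BMI convention):

```python
# Toy component exposing a minimal subset of the CSDMS Basic Model Interface.
class ToyBmiModel:
    """A draining reservoir driven through BMI-style calls."""
    def initialize(self, storage=10.0, k=0.5):
        self.storage, self.k, self.time = storage, k, 0.0
    def update(self):
        self.storage -= self.k * self.storage  # drain a fraction per step
        self.time += 1.0
    def get_output_var_names(self):
        return ("reservoir_storage",)
    def get_value(self, name):
        assert name == "reservoir_storage"
        return self.storage
    def finalize(self):
        pass

# The framework's driving loop looks the same for any BMI component:
m = ToyBmiModel()
m.initialize(storage=8.0, k=0.5)
for _ in range(2):
    m.update()
result = m.get_value("reservoir_storage")  # 8.0 -> 4.0 -> 2.0
m.finalize()
```

    Exposing such a component as a web service amounts to forwarding each of these calls over HTTP, which is what lets the revised EMELI orchestrate remote models exactly as it would local ones.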

  11. Open, Modular Services for Large, Multi-Dimensional Raster Coverages: The OGC Web Coverage Service (WCS) Standards Suite

    NASA Astrophysics Data System (ADS)

    Baumann, P.

    2009-04-01

    Recent progress in hardware and software technology opens up vistas where flexible services on large, multi-dimensional coverage data become a commodity. Interactive data browsing as with Virtual Globes, selective download, and ad-hoc analysis services are about to become routinely available, as several sites already demonstrate. However, for easy access and true machine-machine communication, Semantic Web concepts, currently being investigated for vector and meta data, need to be extended to raster data and other coverage types. Even more important, then, is reliance on open standards for data and service interoperability. The Open Geospatial Consortium (OGC), following a modular approach to specifying geo service interfaces, has issued the Web Coverage Service (WCS) Implementation Standard for accessing coverages or parts thereof. In contrast to the Web Map Service (WMS), which delivers imagery, WCS preserves data semantics and thus allows further processing. Together with the Web Catalog Service (CS-W) and the Web Feature Service (WFS), WCS completes the classical triad of meta, vector, and raster data. As such, these represent the core data services on which other services build. The current version of WCS is 1.1 with Corrigendum 2, also referred to as WCS 1.1.2. The WCS Standards Working Group (WCS.SWG) is continuing development of WCS in various directions. One work item is to extend WCS, which currently is confined to regularly gridded data, with support for further coverage types, such as those specified in ISO 19123. Two recently released extensions to WCS are WCS-T ("T" standing for "transactional"), which adds upload capabilities to coverage servers, and WCPS (Web Coverage Processing Service), which offers a coverage processing language, thereby bridging the gap to the generic WPS (Web Processing Service). All this is embedded into OGC's current initiative to achieve modular topical specification suites through so-called "extensions" which add focused…
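
    The WMS/WCS distinction drawn above shows up directly in the request: a WCS GetCoverage asks for the data values themselves rather than a rendered picture. A sketch of a GetCoverage request in key-value-pair form (the endpoint and coverage name are hypothetical; the parameter names follow the WCS 1.0.0 KVP encoding, an earlier and simpler encoding than the WCS 1.1.2 version discussed in the abstract):

```python
# Build a WCS 1.0.0 GetCoverage request URL in key-value-pair (KVP) form.
from urllib.parse import urlencode

def wcs_getcoverage(endpoint, coverage, bbox, width, height,
                    crs="EPSG:4326", fmt="GeoTIFF"):
    """bbox is (minx, miny, maxx, maxy) in the given CRS."""
    params = {
        "SERVICE": "WCS", "VERSION": "1.0.0", "REQUEST": "GetCoverage",
        "COVERAGE": coverage, "CRS": crs,
        "BBOX": ",".join(map(str, bbox)),
        "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
    }
    return f"{endpoint}?{urlencode(params)}"

url = wcs_getcoverage("https://example.org/wcs", "elevation",
                      (-180, -90, 180, 90), 720, 360)
```

    Because the response is a semantically intact raster (e.g. GeoTIFF) rather than a styled image, the result can feed directly into further processing, which is the gap WCPS then widens into a full processing language.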

  12. Mobile Device Intervention for Student Support Services in Distance Education Context--FRAME Model Perspective

    ERIC Educational Resources Information Center

    Kumar, Lalita S.; Jamatia, Biplab; Aggarwal, A. K.; Kannan, S.

    2011-01-01

    This paper reports the findings of a study conducted to analyse the effect of mobile device intervention for student support services and to gauge its use for enhancing teaching--learning process as a future study in the context of offer of Distance Education programmes. The study was conducted with the learners of the coveted Post Graduate…

  13. Special Education Related Services and Distance Education in the 21st Century Classroom

    ERIC Educational Resources Information Center

    Pantazis, Mary Ellen

    2013-01-01

    This exploratory study addresses special education related services and the requirements when a school-aged student with a disability attends school using synchronous distance education tools to access the least restrictive environment. The researcher examines these placements to explore the implications virtual schooling has on students receiving…

  14. Gazing into the Crystal Ball: Using Scenarios for Future Visioning of a Distance Learning Library Service

    ERIC Educational Resources Information Center

    Casey, Anne Marie; Cawthorne, Jon E.; Citro, Kathleen

    2014-01-01

    This article describes the use of scenarios as a tool to assist a large distance learning library service in its strategic planning. Through a description of the scenario process from beginning to end, the authors detail the steps that the library director and the consultant took initially; their missteps; and the successful conclusion. This study…

  15. A Meta-Ethnographic Synthesis of Support Services for Adult Learners in Distance Learning Programs

    ERIC Educational Resources Information Center

    Tuquero, Jean M.

    2010-01-01

    This qualitative research study utilized Noblit and Hare's (1988) meta-ethnographic approach to synthesize findings of five dissertations that focused on distance learning support services for adult learners. Noblit and Hare's (1988) meta-ethnographic approach consists of seven phases. Each meta-ethnographic phase guided the identification process…

  16. A Meta-Ethnographic Synthesis of Support Services in Distance Learning Programs

    ERIC Educational Resources Information Center

    Tuquero, Jean Marie

    2011-01-01

    This qualitative study utilized a meta-ethnographic approach to synthesize findings of five dissertations that focused on distance learning support services. The history of ethnographical studies stemmed from an anthropological background. Researchers have applied ethnographical approaches to examine and document various phenomena. This…

  17. Should Tutoring Services Be Added to Our High-Enrolling Distance Education Courses?

    ERIC Educational Resources Information Center

    Williams, Peter B.; Howell, Scott L.; Laws, R. Dwight; Metheny, Emily

    2006-01-01

    The researchers of this study selected four pragmatic research questions that distance learning administrators with high-enrolling Independent Study courses, similar to those that Brigham Young University offers through its Department of Independent Study, may be interested in exploring. These questions included: (1) When tutoring services are…

  18. Hunting and Gathering: Attempting to Assess Services to Distance Learning Students

    ERIC Educational Resources Information Center

    Hebert, Andrea

    2016-01-01

    This case study presents the experiences of a newly hired distance learning librarian at a large academic library. Faced with taking over a position that lacked a dedicated staff member for two years, the librarian wanted to understand the current state of services. In the course of investigating and collecting data and while working through…

  19. A Naturalistic Inquiry of a Distance Learning University TESOL Program for In-Service Teachers

    ERIC Educational Resources Information Center

    Summers-Rocha, Lonna

    2015-01-01

    In this naturalistic inquiry, I explore a professional development program which provided Teaching English to Speakers of Other Languages (TESOL) graduate coursework from a university in Northeast Kansas to in-service teachers in Southwest Kansas through distance learning. Data sources included interviews, participant observation, and document and…

  20. Evolution or Integration: What Is the Current State of Library Services for Distance Learners?

    ERIC Educational Resources Information Center

    Behr, Michele D.; Hayward, Julie L.

    2016-01-01

    Are services that were once intended to be exclusively available to the distance learning population now typically available for all users in a university community? This article seeks to investigate this question using two different methods. First, an unobtrusive study of 100 library Websites was conducted to determine whether these libraries…

  1. A study on heterogeneous distributed spatial information platform based on semantic Web services

    NASA Astrophysics Data System (ADS)

    Peng, Shuang-yun; Yang, Kun; Xu, Quan-li; Huang, Bang-mei

    2008-10-01

    With the development of Semantic Web technology, ontology-based spatial information services are an effective way to share and interoperate heterogeneous information resources in a distributed network environment. This paper discusses spatial information sharing and interoperability in the Semantic Web Services architecture. By using ontologies to record spatial information in a shared knowledge system, implicit and hidden semantic information can be expressed explicitly and formally, which is a prerequisite for spatial information sharing and interoperability. Semantic Web Services technology is then used to parse the ontologies and intelligently compose services in the network environment, forming a network of services. To realize practical spatial information sharing and interoperation among different branches of the CDC system, a prototype system for HIV/AIDS information sharing based on geo-ontology has also been developed using the methods described above.

  2. Web Services Implementations at Land Process and Goddard Earth Sciences Distributed Active Archive Centers

    NASA Astrophysics Data System (ADS)

    Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.

    2007-12-01

    NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable to both traditional research scientists and the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis, and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, and reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems for ingest into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES and LP DAAC data through OGC protocols. This functionality encourages interdisciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS), at the GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and the development of data analysis web applications.
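
    The WMS access described above boils down to an HTTP GET with well-known query parameters. A minimal sketch of building an OGC WMS 1.1.1 GetMap request with the Python standard library; the endpoint and layer name here are hypothetical, not actual DAAC values:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   fmt="image/png", srs="EPSG:4326", version="1.1.1"):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (minx, miny, maxx, maxy) in the given SRS.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer, for illustration only.
url = wms_getmap_url("https://example.gov/wms", "MOD11_LST",
                     (-180, -90, 180, 90), 720, 360)
```

    Any WMS-capable client (or a plain HTTP fetch) can then retrieve the rendered map image from the resulting URL.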

  3. Investigating metrics of geospatial web services: The case of a CEOS federated catalog service for earth observation data

    NASA Astrophysics Data System (ADS)

    Han, Weiguo; Di, Liping; Yu, Genong; Shao, Yuanzheng; Kang, Lingjun

    2016-07-01

    Geospatial Web Services (GWS) make geospatial information and computing resources discoverable and accessible over the Web. Among them, Open Geospatial Consortium (OGC) standards-compliant data, catalog and processing services are most popular, and have been widely adopted and leveraged in geospatial research and applications. GWS metrics, such as visit count, average processing time, and user distribution, are important for evaluating overall performance and impact. However, these metrics, especially those of federated catalog services, have not been systematically evaluated and reported to relevant stakeholders from the point of view of service providers. Taking an integrated catalog service for earth observation data as an example, this paper describes metrics information retrieval, organization, and representation for a catalog service federation. An extensible and efficient log file analyzer is implemented to retrieve a variety of service metrics from the log file and store analysis results in an easily programmable format. An Ajax-powered Web portal is built to provide stakeholders, sponsors, developers, partners, and other types of users with specific and relevant insights into metrics information in an interactive and informative form. The deployed system has provided useful information for periodical reports, service delivery, and decision support. The proposed measurement strategy and analytics framework can serve as guidance to help GWS providers evaluate their services.
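
    The core of such a log file analyzer is parsing each access-log line and accumulating per-service counters. A minimal sketch over synthetic Common Log Format lines (the paths and IPs are made up; the paper's actual analyzer is more elaborate):

```python
import re
from collections import Counter

# Extract client IP and request path from a Common Log Format line.
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)')

def service_metrics(lines):
    """Count requests per service path and collect unique client IPs."""
    visits = Counter()
    clients = set()
    for line in lines:
        m = LOG_LINE.match(line)
        if m:
            ip, path = m.groups()
            visits[path.split("?")[0]] += 1  # drop the query string
            clients.add(ip)
    return visits, clients

sample = [
    '10.0.0.1 - - [01/Jul/2016:00:00:01 +0000] "GET /csw?request=GetRecords HTTP/1.1" 200 512',
    '10.0.0.2 - - [01/Jul/2016:00:00:02 +0000] "GET /csw?request=GetCapabilities HTTP/1.1" 200 2048',
    '10.0.0.1 - - [01/Jul/2016:00:00:03 +0000] "POST /wps HTTP/1.1" 200 128',
]
visits, clients = service_metrics(sample)
```

    From these counters, visit counts and user-distribution summaries can be exported in any easily programmable format (CSV, JSON) for the reporting portal.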

  4. An Automated End-To-End Multi-Agent QoS-Based Architecture for Selection of Geospatial Web Services

    NASA Astrophysics Data System (ADS)

    Shah, M.; Verma, Y.; Nandakumar, R.

    2012-07-01

    Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits like reduced integration expense, better asset reuse, higher business agility, and reduction of business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain, and geospatial Web services solve this problem. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users in a better way than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web; such services are often both computation- and data-intensive and involve diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated, end-to-end, multi-agent based solution that provides the best-fit web service to the service requester based on QoS.
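
    The abstract's multi-agent architecture is not spelled out here, but the underlying idea of QoS-based differentiation can be sketched with simple additive weighting: normalize each non-functional property, then score candidates against user-supplied weights. The service names and QoS figures below are illustrative, not from the paper:

```python
def normalize(values, benefit=True):
    """Min-max normalize a list of QoS values to [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    if benefit:  # higher is better (e.g. availability)
        return [(v - lo) / (hi - lo) for v in values]
    return [(hi - v) / (hi - lo) for v in values]  # lower is better (e.g. latency)

def rank_services(services, weights):
    """services: {name: {"latency": ms, "availability": fraction}}."""
    names = list(services)
    lat = normalize([services[n]["latency"] for n in names], benefit=False)
    avail = normalize([services[n]["availability"] for n in names], benefit=True)
    scores = {n: weights["latency"] * l + weights["availability"] * a
              for n, l, a in zip(names, lat, avail)}
    best = max(scores, key=scores.get)
    return best, scores

best, scores = rank_services(
    {"WMS-A": {"latency": 120, "availability": 0.99},
     "WMS-B": {"latency": 300, "availability": 0.97}},
    {"latency": 0.5, "availability": 0.5})
```

    In a full system, an agent would collect these QoS values by monitoring the candidate services rather than taking them as given.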

  5. The Evolution of Distance Education: Implications for Instructional Design on the Potential of the Web

    ERIC Educational Resources Information Center

    Moller, Leslie; Foshay, Wellesley R.; Huett, Jason

    2008-01-01

    In this article, the authors offer the first of a three-part series on distance education. The authors discuss key trends in training and development which have five profound impacts on the field of Instruction Design (ID). These five effects concern: (1) quality; (2) needs assessment, ROI and measurement of outcomes; (3) the influence and fusion…

  6. Real Time with the Librarian: Using Web Conferencing Software to Connect to Distance Students

    ERIC Educational Resources Information Center

    Riedel, Tom; Betty, Paul

    2013-01-01

    A pilot program to provide real-time library webcasts to Regis University distance students using Adobe Connect software was initiated in fall of 2011. Previously, most interaction between librarians and online students had been accomplished by asynchronous discussion threads in the Learning Management System. Library webcasts were offered in…

  7. PACS through web compatible with DICOM standard and WADO service: advantages and implementation.

    PubMed

    Koutelakis, George V; Lymperopoulos, Dimitrios K

    2006-01-01

    All users of informatics applications need rapid and reliable access to the kind of information they are interested in, and Web technology provides these capabilities. The DICOM standard committees recognized the necessity of a Web medical standard and specified the WADO (Web Access to DICOM Objects) service, so that system interaction takes place through the Web in a standardized way, allowing interoperability and proper information management inside a PACS. The advantages of a Web-based PACS compared with a conventional PACS are multiple and appear in different areas of functionality. The authors have run a project to develop a WADO-compatible Web PACS. A Web portal platform with enhanced security has been implemented, and DICOM applications have been developed on top of it. JavaServer Pages (JSP) technology is mainly used to satisfy the design specifications and handle dynamic data exchanges. Furthermore, Java applets have been developed and introduced into the project to serve specific demands. Evaluation results confirmed our expectations about the improvement of DICOM services when they are provided through the Web.
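
    A WADO-URI retrieval is just an HTTP GET whose query parameters (requestType, studyUID, seriesUID, objectUID, contentType) are defined in DICOM PS3.18. A minimal sketch of building such a request; the server address and UIDs below are hypothetical:

```python
from urllib.parse import urlencode

def wado_uri(base, study_uid, series_uid, object_uid,
             content_type="application/dicom"):
    """Build a WADO-URI retrieval request (DICOM PS3.18 parameter names)."""
    params = {
        "requestType": "WADO",
        "studyUID": study_uid,
        "seriesUID": series_uid,
        "objectUID": object_uid,
        "contentType": content_type,
    }
    return base + "?" + urlencode(params)

# Hypothetical server and UIDs, for illustration only.
url = wado_uri("https://pacs.example.org/wado",
               "1.2.840.113619.2.1",
               "1.2.840.113619.2.1.1",
               "1.2.840.113619.2.1.1.1")
```

    Requesting `contentType=image/jpeg` instead would ask the server for a rendered preview rather than the native DICOM object.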

  8. Maternal mortality and accessibility to health services by means of transit-network estimated traveled distances.

    PubMed

    Simões, Patricia Passos; Almeida, Renan Moritz V R

    2014-08-01

    This study analyzed the relationship between maternal mortality and variables related to the use of health services (especially residence-hospital traveled distances estimated through transit networks). Deaths were identified for Rio de Janeiro and adjacent cities, from 2000 to 2002, and were matched by age and socio-economic level to birth admissions without maternal deaths (1 case to 3 controls). The variables used were: type of hospital (general × specialized maternity services), number of hospital beds, nature of hospital ownership (public × private-associated), main admission diagnostic, residence-hospital distance, age, income, and education. Distances were estimated by a geographic information system, and were based on most probable itineraries through the urban transit networks. The probability of death was estimated by conditional logistic regression models. 226 maternal deaths were studied, and another 10 were excluded due to incompleteness of information. The ROC area for the final model was 0.89 [95% CI (0.87-0.92)]. This model retained statistical significance for the variables admission diagnostic, type of hospital and residence-hospital distance. The death odds ratio for women who traveled 5-10 km (reference category: <5 km) was 3.84 [95% CI (1.96-7.55)]. The traveled distance measured through transit networks was an important risk factor for death in the studied population.
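
    The study's odds ratios come from conditional logistic regression on matched case-control sets, which is beyond a short sketch; but the basic odds-ratio arithmetic it generalizes can be shown on a crude (unmatched) 2x2 table. The counts below are invented for illustration and are not the study's data:

```python
import math

def odds_ratio(exposed_cases, exposed_controls,
               unexposed_cases, unexposed_controls):
    """Crude odds ratio with a 95% CI (Woolf's method) for a 2x2 table."""
    a, b = exposed_cases, exposed_controls
    c, d = unexposed_cases, unexposed_controls
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Illustrative counts only: deaths vs. controls, split by whether the
# estimated residence-hospital distance exceeded 5 km.
or_, ci = odds_ratio(60, 90, 40, 230)
```

    Conditional logistic regression refines this by comparing each case only against its own matched controls, which is what allows the matching on age and socio-economic level to be respected.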

  9. IAServ: An Intelligent Home Care Web Services Platform in a Cloud for Aging-in-Place

    PubMed Central

    Su, Chuan-Jun; Chiang, Chang-Yu

    2013-01-01

    As the elderly population has been rapidly expanding and the core tax-paying population has been shrinking, the need for adequate elderly health and housing services continues to grow while the resources to provide such services are becoming increasingly scarce. Thus, increasing the efficiency of the delivery of healthcare services through the use of modern technology is a pressing issue. The seamless integration of such enabling technologies as ontology, intelligent agents, web services, and cloud computing is transforming healthcare from hospital-based treatments to home-based self-care and preventive care. A ubiquitous healthcare platform based on this technological integration, which synergizes service providers with patients’ needs, needs to be developed to provide personalized healthcare services at the right time, in the right place, and in the right manner. This paper presents the development and overall architecture of IAServ (the Intelligent Aging-in-place Home care Web Services Platform), which provides personalized healthcare service ubiquitously in a cloud computing setting to support the most desirable and cost-efficient method of care for the aged: aging in place. IAServ is expected to offer intelligent, pervasive, accurate and contextually-aware personal care services. Architecturally, the implemented IAServ leverages web services and cloud computing to provide economic, scalable, and robust healthcare services over the Internet. PMID:24225647

  10. IAServ: an intelligent home care web services platform in a cloud for aging-in-place.

    PubMed

    Su, Chuan-Jun; Chiang, Chang-Yu

    2013-11-12

    As the elderly population has been rapidly expanding and the core tax-paying population has been shrinking, the need for adequate elderly health and housing services continues to grow while the resources to provide such services are becoming increasingly scarce. Thus, increasing the efficiency of the delivery of healthcare services through the use of modern technology is a pressing issue. The seamless integration of such enabling technologies as ontology, intelligent agents, web services, and cloud computing is transforming healthcare from hospital-based treatments to home-based self-care and preventive care. A ubiquitous healthcare platform based on this technological integration, which synergizes service providers with patients' needs, needs to be developed to provide personalized healthcare services at the right time, in the right place, and in the right manner. This paper presents the development and overall architecture of IAServ (the Intelligent Aging-in-place Home care Web Services Platform), which provides personalized healthcare service ubiquitously in a cloud computing setting to support the most desirable and cost-efficient method of care for the aged: aging in place. IAServ is expected to offer intelligent, pervasive, accurate and contextually-aware personal care services. Architecturally, the implemented IAServ leverages web services and cloud computing to provide economic, scalable, and robust healthcare services over the Internet.

  11. Pilot Evaluation of a Web-Based Intervention Targeting Sexual Health Service Access

    ERIC Educational Resources Information Center

    Brown, K. E.; Newby, K.; Caley, M.; Danahay, A.; Kehal, I.

    2016-01-01

    Sexual health service access is fundamental to good sexual health, yet interventions designed to address this have rarely been implemented or evaluated. In this article, pilot evaluation findings for a targeted public health behavior change intervention, delivered via a website and web-app, aiming to increase uptake of sexual health services among…

  12. A SCORM Thin Client Architecture for E-Learning Systems Based on Web Services

    ERIC Educational Resources Information Center

    Casella, Giovanni; Costagliola, Gennaro; Ferrucci, Filomena; Polese, Giuseppe; Scanniello, Giuseppe

    2007-01-01

    In this paper we propose an architecture of e-learning systems characterized by the use of Web services and a suitable middleware component. These technical infrastructures allow us to extend the system with new services as well as to integrate and reuse heterogeneous software e-learning components. Moreover, they let us better support the…

  13. 76 FR 14034 - Proposed Collection; Comment Request; NCI Cancer Genetics Services Directory Web-Based...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... HUMAN SERVICES National Institutes of Health Proposed Collection; Comment Request; NCI Cancer Genetics... Management and Budget (OMB) for review and approval. Proposed Collection: Title: NCI Cancer Genetics Services... application form and the Web-based update mailer is to collect information about genetics professionals to...

  14. Use and Evaluation of Web-Based Professional Development Services across Participant Levels of Support

    ERIC Educational Resources Information Center

    Whitaker, Steve; Kinzie, Mable; Kraft-Sayre, Marcia E.; Mashburn, Andrew; Pianta, Robert C.

    2007-01-01

    When participating in a large-scale, web-based professional development program, to what degree do teachers participate? How useful do they find the program? To what degree do they feel supported in their efforts? What are the associations between participation, evaluation of services, and the level of service teachers receive? MyTeachingPartner…

  15. Leveraging Web Technologies in Student Support Self-Services

    ERIC Educational Resources Information Center

    Herndon, M. Craig

    2011-01-01

    The use of Web technologies to connect with and disperse information to prospective and current students can be effective for students as well as efficient for colleges. Early results of the use of such technologies in a statewide system point to high rates of satisfaction among students when information is delivered, provide clues on how various…

  16. "Just the Answers, Please": Choosing a Web Search Service.

    ERIC Educational Resources Information Center

    Feldman, Susan

    1997-01-01

    Presents guidelines for selecting World Wide Web search engines. Real-life questions were used to test six search engines. Queries sought company information, product reviews, medical information, foreign information, technical reports, and current events. Compares performance and features of AltaVista, Excite, HotBot, Infoseek, Lycos, and Open…

  17. Final Report for DOE Project: Portal Web Services: Support of DOE SciDAC Collaboratories

    SciTech Connect

    Mary Thomas, PI; Geoffrey Fox, Co-PI; Gannon, D; Pierce, M; Moore, R; Schissel, D; Boisseau, J

    2007-10-01

    Grid portals provide the scientific community with familiar and simplified interfaces to the Grid and Grid services, and it is important to deploy grid portals onto the SciDAC grids and collaboratories. The goal of this project is the research, development and deployment of interoperable portal and web services that can be used on SciDAC National Collaboratory grids. This project has four primary task areas: development of portal systems; management of data collections; DOE science application integration; and development of web and grid services in support of the above activities.

  18. Using Web Services and XML Harvesting to Achieve a Dynamic Web Site. Computers in Small Libraries

    ERIC Educational Resources Information Center

    Roberts, Gary

    2005-01-01

    Exploiting and contextualizing free information is a natural part of library culture. In this column, Gary Roberts, the information systems and reference librarian at Herrick Library, Alfred University in Alfred, NY, describes how to use XML content on a Web site to link to hundreds of free and useful resources. He gives a general overview of the…

  19. AMBIT RESTful web services: an implementation of the OpenTox application programming interface

    PubMed Central

    2011-01-01

    The AMBIT web services package is one of the several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation, or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions without installing any software, as well as to share online datasets and models.
The downloadable web application

  20. AMBIT RESTful web services: an implementation of the OpenTox application programming interface.

    PubMed

    Jeliazkova, Nina; Jeliazkov, Vedrin

    2011-05-16

    The AMBIT web services package is one of the several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation, or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions without installing any software, as well as to share online datasets and models.
The downloadable web application
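
    The "read data from a web address, perform processing, write to a web address" paradigm above can be sketched as a small pipeline. To keep the sketch self-contained, the HTTP transport is injected as plain callables and backed by a dict; the URLs and the SMILES string are hypothetical, not actual AMBIT resources:

```python
def process_resource(read_url, write_url, transform, get, post):
    """GET a resource, transform it, POST the result; return the response."""
    data = get(read_url)
    result = transform(data)
    return post(write_url, result)

# Stand-in transport backed by a dict instead of real HTTP.
store = {"http://host/dataset/1": "c1ccccc1 benzene"}

def fake_get(url):
    return store[url]

def fake_post(url, body):
    store[url] = body
    return url

out = process_resource("http://host/dataset/1",
                       "http://host/model/1/result",
                       lambda s: s.upper(), fake_get, fake_post)
```

    With a real REST service, `get` and `post` would be thin wrappers around an HTTP client, and `transform` would be replaced by the remote calculation the POSTed address triggers.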

  1. Persistent identifiers for web service requests relying on a provenance ontology design pattern

    NASA Astrophysics Data System (ADS)

    Car, Nicholas; Wang, Jingbo; Wyborn, Lesley; Si, Wei

    2016-04-01

    Delivering provenance information for datasets produced from static inputs is relatively straightforward: we represent the processing actions and data flow using provenance ontologies and link to copies of the inputs stored in repositories. If appropriate detail is given, the provenance information can then describe what actions have occurred (transparency) and enable reproducibility. When web service-generated data is used by a process to create a dataset instead of static inputs, we need more sophisticated provenance representations of the web service request, as we can no longer just link to data stored in a repository. A graph-based provenance representation, such as the W3C's PROV standard, can be used to model the web service request both as a single conceptual dataset and as a small workflow with a number of components within the same provenance report. This dual representation does more than just allow simplified or detailed views of a dataset's production to be used where appropriate. It also allows persistent identifiers to be assigned to instances of web service requests, thus enabling one form of dynamic data citation, and allows those identifiers to resolve to whatever level of detail implementers think appropriate in order for that web service request to be reproduced. In this presentation we detail our reasoning in representing web service requests as small workflows. In outline, this stems from the idea that web service requests are perdurant things, and in order to most easily persist knowledge of them for provenance, we should represent them as a nexus of relationships between endurant things, such as datasets and knowledge of particular system types, since these endurant things are far easier to persist. We also describe the ontology design pattern that we use to represent workflows in general and how we apply it to different types of web service requests.
We give examples of specific web service requests instances that were made by systems
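
    The dual representation described above can be sketched with plain triples using PROV-O term names: the same request appears both as the generator of a single dataset entity and as an activity with its own inputs. All identifiers below are hypothetical:

```python
def describe_request(req_id, service_uri, params, result_id):
    """Return PROV-style triples giving both views of one web service request."""
    triples = [
        # Simplified view: the request's output as one dataset entity.
        (result_id, "rdf:type", "prov:Entity"),
        (result_id, "prov:wasGeneratedBy", req_id),
        # Detailed view: the request as an activity that used a service.
        (req_id, "rdf:type", "prov:Activity"),
        (req_id, "prov:used", service_uri),
    ]
    # Record each request parameter so the call can be reconstructed.
    for key, value in params.items():
        triples.append((req_id, "ex:param_" + key, value))
    return triples

g = describe_request("ex:request/42", "ex:service/wfs",
                     {"bbox": "110,-45,155,-10"}, "ex:dataset/42")
```

    Persisting the parameter triples alongside the activity is what lets a persistent identifier for the request resolve to enough detail to replay it.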

  2. Sensor Webs with a Service-Oriented Architecture for On-demand Science Products

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Ungar, Stephen; Ames, Troy; Justice, Chris; Frye, Stuart; Chien, Steve; Tran, Daniel; Cappelaere, Patrice; Derezinsfi, Linda; Paules, Granville; Di, Liping; Kolitz, Stephan

    2007-01-01

    This paper describes the work being managed by the NASA Goddard Space Flight Center (GSFC) Information System Division (ISD) under a NASA Earth Science Technology Office (ESTO) Advanced Information System Technology (AIST) grant to develop a modular sensor web architecture which enables discovery of sensors and workflows that can create customized science via a high-level service-oriented architecture based on Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) web service standards. These capabilities serve as a prototype of a user-centric architecture for the Global Earth Observing System of Systems (GEOSS). This work builds on and extends previous sensor web efforts conducted at NASA/GSFC using the Earth Observing 1 (EO-1) satellite and other low-earth orbiting satellites.

  3. Technical note: Harmonizing met-ocean model data via standard web services within small research groups

    NASA Astrophysics Data System (ADS)

    Signell, R. P.; Camossi, E.

    2015-11-01

    Work over the last decade has resulted in standardized web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by: (1) making it simple for providers to enable web service access to existing output files; (2) using technology that is free, and that is easy to deploy and configure; and (3) providing tools to communicate with web services that work in existing research environments. We present a simple, local brokering approach that lets modelers continue producing custom data, but virtually aggregates and standardizes the data using the NetCDF Markup Language. The THREDDS Data Server is used for data delivery, pycsw for data search, NCTOOLBOX (Matlab®1) and Iris (Python) for data access, and the Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS.1 Mention of trade names or commercial products does not constitute endorsement or recommendation for use by the US Government.
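
    The "virtual aggregation" step above is done with NetCDF Markup Language (NcML), which can join a run of per-month output files along their record dimension without touching the files themselves. A minimal sketch that emits such an NcML document; the file names are hypothetical:

```python
import xml.etree.ElementTree as ET

NCML = "http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2"

def ncml_join_existing(files, dim="time"):
    """Emit an NcML document aggregating files along a record dimension."""
    ET.register_namespace("", NCML)  # serialize NcML as the default namespace
    root = ET.Element("{%s}netcdf" % NCML)
    agg = ET.SubElement(root, "{%s}aggregation" % NCML,
                        dimName=dim, type="joinExisting")
    for f in files:
        ET.SubElement(agg, "{%s}netcdf" % NCML, location=f)
    return ET.tostring(root, encoding="unicode")

# Hypothetical model output files, one per month.
doc = ncml_join_existing(["ocean_2015_01.nc", "ocean_2015_02.nc"])
```

    Dropping a document like this into a THREDDS Data Server catalog is what makes the collection appear to clients as one continuous dataset.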

  4. Research on registry centre for geospatial web service based on CSW specification

    NASA Astrophysics Data System (ADS)

    Chen, Yumin; Song, Chunqiao; Shen, Shengyu; Yang, Qing

    2008-12-01

    With the increase of geospatial data and services, how to utilize and share geographic information more efficiently becomes a crucial problem. To effectively integrate and enhance abundant geographic information anywhere, this paper presents a Registry Centre for Geospatial Web Service (RCGWS) based on the Open Geospatial Consortium (OGC) Catalogue Service for the Web and the ebXML Registry Information Model (ebRIM), which provides registration and discovery portals for metadata about geospatial datasets and services. The design ideology and architecture of RCGWS are introduced, and the techniques of appending a GIS services classification in extended ebRIM, and the external interfaces of RCGWS based on OGC CSW, are discussed. The implementation of the RCGWS platform shows that this Registry Centre can satisfy the requirements of geospatial dataset and service sharing.
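
    The discovery portal of such a registry is queried with CSW GetRecords requests. A minimal sketch of a CSW 2.0.2 GetRecords body with a PropertyIsLike filter on AnyText, of the kind a client would POST to the registry's CSW endpoint; the search term is illustrative:

```python
# Template for a CSW 2.0.2 GetRecords POST body (full-text search).
GETRECORDS_TEMPLATE = """<?xml version="1.0"?>
<csw:GetRecords xmlns:csw="http://www.opengis.net/cat/csw/2.0.2"
                xmlns:ogc="http://www.opengis.net/ogc"
                service="CSW" version="2.0.2" resultType="results"
                startPosition="1" maxRecords="{max_records}">
  <csw:Query typeNames="csw:Record">
    <csw:ElementSetName>summary</csw:ElementSetName>
    <csw:Constraint version="1.1.0">
      <ogc:Filter>
        <ogc:PropertyIsLike wildCard="%" singleChar="_" escapeChar="\\">
          <ogc:PropertyName>AnyText</ogc:PropertyName>
          <ogc:Literal>%{term}%</ogc:Literal>
        </ogc:PropertyIsLike>
      </ogc:Filter>
    </csw:Constraint>
  </csw:Query>
</csw:GetRecords>"""

def getrecords_body(term, max_records=10):
    """Fill the template with a search term and a result-page size."""
    return GETRECORDS_TEMPLATE.format(term=term, max_records=max_records)

body = getrecords_body("landsat")
```

    The registry answers with a GetRecordsResponse listing matching metadata records, from which service endpoints can be extracted for binding.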

  5. Chaining of web services and its application in geographic information integration and visualization

    NASA Astrophysics Data System (ADS)

    Chen, Chuanbin; Wu, Qunyong; Chen, Chongcheng; Chen, Han

    2005-11-01

    The rapid development of web services technology ushers Geographical Information Systems (GISs) into the era of geographic information web services (GIWS), which requires a scalable and extensible GIS model to deliver distributed geographic information and GIS functions integrated as independently provided, interoperable services in a distributed computing environment. Several distributed services can be dynamically chained into a new service to accomplish a specific task, and such a service-chaining model is one of the most important research topics of next-generation GISs. The paper highlights the issues of service chaining, the process of dynamically combining several distributed, interoperable GIWS to construct customized applications, and analyses the characteristics of each chaining pattern. Then, based on the client-coordinated chaining pattern, we design a service chain, developed in a J2EE environment using web services technology, and construct a web services-oriented geospatial data integration and visualization platform, in order to integrate multi-source, heterogeneous geospatial data using Geography Markup Language (GML) and geospatial data integration technology, and to visualize geographic information using Scalable Vector Graphics (SVG) and JavaScript technology. During the design process, several GIWS are defined, and the functions, interfaces and related methods of these services are discussed in detail. The paper focuses on the method for chaining distributed GIWS, the mechanism for geographic information dissemination, and error handling. Finally, forest geospatial data in two typical formats, E00 and Shapefile (SHP), were used to test the platform. The results indicate that using service chaining for the integration and visualization of multi-source, heterogeneous geospatial data can efficiently meet customized needs, but further research is needed for better application.
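
    Client-coordinated chaining, as described above, means the client invokes each service in turn and passes intermediate results along. A minimal sketch mirroring a fetch-features, GML-conversion, SVG-rendering pipeline; the three service bodies are hypothetical stubs standing in for remote calls:

```python
def chain(*services):
    """Compose service callables into a single pipeline run by the client."""
    def run(payload):
        for service in services:
            payload = service(payload)  # each output feeds the next service
        return payload
    return run

def fetch_features(source):   # stand-in for a data-access service
    return {"source": source, "features": ["road", "river"]}

def to_gml(data):             # stand-in for a format-conversion service
    return ("<gml:FeatureCollection>%d features</gml:FeatureCollection>"
            % len(data["features"]))

def render_svg(gml):          # stand-in for a portrayal service
    return gml.replace("gml:FeatureCollection", "svg")

pipeline = chain(fetch_features, to_gml, render_svg)
svg = pipeline("forest.shp")
```

    In the real pattern each stub would be an HTTP call to a remote GIWS, and the client-side coordinator would also be responsible for propagating errors from any link in the chain.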

  6. Satellite-Based Distance Courses for In-Service Training: The Case of HeadsUp! Reading

    ERIC Educational Resources Information Center

    Morrison, Johnetta Wade; Raya-Carlton, Pamela; Henk, Jennifer K.; Thornburg, Kathy R.

    2007-01-01

    This article discusses the use of distance courses as an in-service training mechanism for early childhood personnel. The authors evaluated the efficacy of the in-service, satellite based distance course HeadsUp! Reading (HU!R). The analysis of HU!R data revealed that there were no initial differences in the Language and Literacy Early Childhood:…

  7. Exploring Distance Learning Experiences of In-Service Music Teachers from Puerto Rico in a Master's Program

    ERIC Educational Resources Information Center

    Vega-Martinez, Juan Carlos

    2013-01-01

    The purpose of this study was to explore the experiences of in-service music teachers who chose to pursue a master's degree in music education through distance learning. In this study, I examined the motivations of in-service music teachers for choosing to pursue a master's degree in music education through distance learning; the benefits teachers…

  8. Integrating semantic web technologies and geospatial catalog services for geospatial information discovery and processing in cyberinfrastructure

    SciTech Connect

    Yue, Peng; Gong, Jianya; Di, Liping; He, Lianlian; Wei, Yaxing

    2011-04-01

    A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered through extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.

  9. An Approach for Web Service Selection Based on Confidence Level of Decision Maker

    PubMed Central

    Khezrian, Mojtaba; Jahan, Ali; Wan Kadir, Wan Mohd Nasir; Ibrahim, Suhaimi

    2014-01-01

    Web services today are among the most widely used groups for Service Oriented Architecture (SOA). Service selection is one of the most significant current discussions in SOA: it evaluates discovered services and chooses the best candidate among them. Although a majority of service selection techniques apply Quality of Service (QoS), QoS-based service selection amounts to a Multi-Criteria Decision Making (MCDM) problem. In the existing works, the confidence level of decision makers is neglected, and their expertise in assessing Web services is not considered. In this paper, we employ the VIKOR (VIšekriterijumsko KOmpromisno Rangiranje) method, which is absent from the service selection literature but well known in other research areas. We propose a QoS-based approach that deals with service selection by applying VIKOR with improved features. This research determines the weights of criteria based on user preference and accounts for the confidence level of decision makers. The proposed approach is illustrated by an example in order to demonstrate and validate the model. The results of this research may help service consumers reach a more efficient decision when selecting the appropriate service. PMID:24897426
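The core VIKOR ranking is compact enough to sketch. The minimal implementation below (the plain method, not the paper's extended variant with confidence levels; the QoS numbers and weights are invented) computes the compromise score Q for each candidate service, lower being better:

```python
def vikor(matrix, weights, benefit, v=0.5):
    """Rank alternatives with the VIKOR compromise method.

    matrix  : one row per alternative (QoS values per criterion)
    weights : criterion weights (summing to 1)
    benefit : True if larger is better for that criterion, False for cost
    v       : weight of the 'majority of criteria' strategy (typically 0.5)
    Returns the Q scores (lower = better compromise).
    """
    cols = list(zip(*matrix))
    f_best = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    f_worst = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    S, R = [], []
    for row in matrix:
        terms = []
        for j, w in enumerate(weights):
            span = f_best[j] - f_worst[j]
            terms.append(0.0 if span == 0 else w * (f_best[j] - row[j]) / span)
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    s_best, s_worst = min(S), max(S)
    r_best, r_worst = min(R), max(R)
    Q = []
    for i in range(len(matrix)):
        qs = 0.0 if s_worst == s_best else (S[i] - s_best) / (s_worst - s_best)
        qr = 0.0 if r_worst == r_best else (R[i] - r_best) / (r_worst - r_best)
        Q.append(v * qs + (1 - v) * qr)
    return Q

# Three candidate services scored on response time (cost) and availability (benefit)
q = vikor([[120, 0.99], [80, 0.95], [200, 0.999]],
          weights=[0.5, 0.5], benefit=[False, True])
best = min(range(len(q)), key=q.__getitem__)   # index of the compromise choice
```

On this toy matrix the first service wins: it is near-best on both criteria, while the other two each excel on one criterion but pay heavily on the other.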

  10. Prototype of Partial Cutting Tool of Geological Map Images Distributed by Geological Web Map Service

    NASA Astrophysics Data System (ADS)

    Nonogaki, S.; Nemoto, T.

    2014-12-01

    Geological maps and topographical maps play an important role in disaster assessment, resource management, and environmental preservation. This map information has recently been distributed in accordance with Web service standards such as Web Map Service (WMS) and Web Map Tile Service (WMTS). In this study, a partial cutting tool for geological map images distributed by geological WMTS was implemented with Free and Open Source Software. The tool mainly consists of two functions: a display function and a cutting function. The former was implemented using OpenLayers; the latter using the Geospatial Data Abstraction Library (GDAL). All other small functions were implemented in PHP and Python. As a result, this tool allows not only displaying a WMTS layer in a web browser but also generating a geological map image of the intended area and zoom level. At this moment, available WMTS layers are limited to the ones distributed by the WMTS for the Seamless Digital Geological Map of Japan. The geological map image can be saved in GeoTIFF format and WebGL format. GeoTIFF is a georeferenced raster format available in many kinds of Geographical Information Systems. WebGL is useful for confirming the relationship between geology and geography in 3D. In conclusion, the partial cutting tool developed in this study should help create better conditions for promoting the utilization of geological information. Future work is to increase the number of available WMTS layers and the types of output file format.
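The heart of a partial-cutting tool like this is working out which tiles cover the requested area at the requested zoom level, before GDAL fetches and mosaics them. A minimal sketch of that tile arithmetic, assuming the common Web Mercator (GoogleMapsCompatible) tile matrix set rather than the specific matrix set the paper's service uses:

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Convert WGS84 lon/lat to XYZ tile indices at a zoom level
    (Web Mercator tiling, as used by typical WMTS tile matrix sets)."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

def tiles_for_bbox(min_lon, min_lat, max_lon, max_lat, zoom):
    """List the (x, y) tiles covering a bounding box -- the set a
    partial-cutting tool would fetch and mosaic into one image."""
    x0, y_top = lonlat_to_tile(min_lon, max_lat, zoom)    # top-left tile
    x1, y_bot = lonlat_to_tile(max_lon, min_lat, zoom)    # bottom-right tile
    return [(x, y) for y in range(y_top, y_bot + 1) for x in range(x0, x1 + 1)]

# Tiles covering a 1-degree box around Tokyo at zoom 10
tiles = tiles_for_bbox(139.0, 35.0, 140.0, 36.0, 10)
```

Each `(x, y)` pair maps directly into a WMTS GetTile request (or an XYZ URL template); GDAL can then georeference the mosaic and write it out as GeoTIFF.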

  11. Automated web service composition supporting conditional branch structures

    NASA Astrophysics Data System (ADS)

    Wang, Pengwei; Ding, Zhijun; Jiang, Changjun; Zhou, Mengchu

    2014-01-01

    The creation of value-added services by automatic composition of existing ones is gaining significant momentum as the potential silver bullet in service-oriented architecture. However, service composition faces two difficulties. First, users' needs present such characteristics as diversity, uncertainty and personalisation; second, the existing services run in a real-world environment that is highly complex and dynamically changing. These difficulties may cause nondeterministic choices to emerge in the process of service composition, which goes beyond what existing automated service composition techniques can handle. In most existing methods, the process model of a composite service includes sequence constructs only. This article presents a method to introduce conditional branch structures into the process model of a composite service when needed, in order to satisfy users' diverse and personalised needs and adapt to the dynamic changes of the real-world environment. UML activity diagrams are used to represent dependencies in a composite service. Two types of user preferences, ignored by previous work, are considered, and a simple programming-language-style expression is adopted to describe them. Two different algorithms are presented to deal with different situations. A real-life case is provided to illustrate the proposed concepts and methods.
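The distinction between sequence-only and branching process models can be sketched with a toy workflow engine. The model below supports exactly the two construct types discussed above, sequences and conditional branches; the service names, condition, and travel scenario are invented for illustration, not taken from the article:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Invoke:
    """A single service invocation: reads and updates a shared context."""
    name: str
    action: Callable[[dict], dict]
    def run(self, ctx: dict) -> dict:
        ctx.update(self.action(ctx))
        return ctx

@dataclass
class Branch:
    """A conditional branch node: the nondeterministic choice is resolved
    at run time, when the condition can be evaluated on real data."""
    condition: Callable[[dict], bool]
    then_path: List[Invoke]
    else_path: List[Invoke]
    def run(self, ctx: dict) -> dict:
        for step in (self.then_path if self.condition(ctx) else self.else_path):
            ctx = step.run(ctx)
        return ctx

def execute(process: list, ctx: dict) -> dict:
    for node in process:
        ctx = node.run(ctx)
    return ctx

# A travel-booking composite: quote a flight, then branch on the price.
process = [
    Invoke("quote", lambda c: {"price": 480 if c["dest"] == "Paris" else 90}),
    Branch(lambda c: c["price"] > 300,
           then_path=[Invoke("book_train", lambda c: {"mode": "train"})],
           else_path=[Invoke("book_flight", lambda c: {"mode": "flight"})]),
]
result = execute(process, {"dest": "Paris"})
```

A sequence-only planner would have to commit to one booking service in advance; the branch node defers that choice until the quote is known, which is the behaviour the article's method adds.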

  12. US Geoscience Information Network, Web Services for Geoscience Information Discovery and Access

    NASA Astrophysics Data System (ADS)

    Richard, S.; Allison, L.; Clark, R.; Coleman, C.; Chen, G.

    2012-04-01

    The US Geoscience Information Network has developed metadata profiles for interoperable catalog services based on ISO 19139 and OGC CSW 2.0.2. Currently, data services are being deployed for the US Department of Energy-funded National Geothermal Data System. These services utilize OGC Web Map Services, Web Feature Services, and THREDDS-served NetCDF for gridded datasets. Services and underlying datasets (along with a wide variety of other information and non-information resources) are registered in the catalog system. Metadata for registration is produced by various workflows, including harvest from OGC capabilities documents, Drupal-based web applications, and transformation from tabular compilations. Catalog search is implemented using the ESRI Geoportal open-source server. We are pursuing various client applications to demonstrate discovery and utilization of the data services. Currently operational applications include an ESRI ArcMap extension for catalog search and data acquisition from map services, and a catalog browse-and-search application built on OpenLayers and Django. We are developing use cases and requirements for other applications to utilize geothermal data services for resource exploration and evaluation.

  13. A Design of a Network Model to the Electric Power Trading System Using Web Services

    NASA Astrophysics Data System (ADS)

    Maruo, Tomoaki; Matsumoto, Keinosuke; Mori, Naoki; Kitayama, Masashi; Izumi, Yoshio

    Web services are regarded as a new application paradigm in the world of the Internet. Meanwhile, many business models for power trading systems have been proposed that aim at load reduction through consumers cooperating with electric power suppliers in an electric power market. In this paper, we propose a network model of a power trading system using Web services. The adaptability of Web services to the power trading system was checked in a prototype of our network model, with good results. Each server provides its functions as a SOAP server, and servers are loosely coupled with each other through SOAP. Storing the SOAP message in an HTTP packet establishes a communication path that passes through firewalls transparently. Dynamic server switching is possible by rewriting the server endpoint information in the WSDL when a failure occurs.
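The firewall-friendly messaging the paper relies on is just SOAP carried in an HTTP POST body. The sketch below serialises a hypothetical load-reduction bid as a SOAP 1.1 envelope; the `SubmitBid` operation, the `urn:example` namespace, and the field names are invented stand-ins for whatever the actual trading service's WSDL would define:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
TRADE_NS = "urn:example:power-trading"   # hypothetical service namespace

def build_bid_envelope(consumer_id: str, kwh: float, price: float) -> bytes:
    """Serialise a load-reduction bid as a SOAP 1.1 message; carried in an
    HTTP POST body, it traverses firewalls like ordinary web traffic."""
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    bid = ET.SubElement(body, f"{{{TRADE_NS}}}SubmitBid")
    for tag, value in (("ConsumerId", consumer_id),
                       ("EnergyKWh", kwh),
                       ("PriceYenPerKWh", price)):
        ET.SubElement(bid, f"{{{TRADE_NS}}}{tag}").text = str(value)
    return ET.tostring(env, encoding="utf-8", xml_declaration=True)

payload = build_bid_envelope("C-1024", 12.5, 18.0)
# An HTTP client would POST `payload` with these headers:
headers = {"Content-Type": "text/xml; charset=utf-8",
           "SOAPAction": "urn:example:SubmitBid"}
```

The server-switching trick in the abstract then amounts to repointing the endpoint address in the published WSDL, with clients re-reading it before their next POST.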

  14. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal Phase III

    DTIC Science & Technology

    2015-04-30

    Annual Acquisition Research Symposium, Thursday Sessions, Volume II. Report date: 30 April 2015; dates covered: 2015. Title: Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal. Acquisition Research Program: Creating Synergy for Informed Change (www.acquisitionresearch.net).

  15. High-performance web services for querying gene and variant annotation.

    PubMed

    Xin, Jiwen; Mark, Adam; Afrasiabi, Cyrus; Tsueng, Ginger; Juchler, Moritz; Gopal, Nikhil; Stupp, Gregory S; Putman, Timothy E; Ainscough, Benjamin J; Griffith, Obi L; Torkamani, Ali; Whetzel, Patricia L; Mungall, Christopher J; Mooney, Sean D; Su, Andrew I; Wu, Chunlei

    2016-05-06

    Efficient tools for data management and integration are essential for many aspects of high-throughput biology. In particular, annotations of genes and human genetic variants are commonly used but highly fragmented across many resources. Here, we describe MyGene.info and MyVariant.info, high-performance web services for querying gene and variant annotation information. These web services are currently accessed more than three million times per month. They also demonstrate a generalizable cloud-based model for organizing and querying biological annotation information. MyGene.info and MyVariant.info are provided as high-performance web services, accessible at http://mygene.info and http://myvariant.info. Both are offered free of charge to the research community.
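A typical interaction with these services is a simple HTTP GET returning JSON. The sketch below builds a MyGene.info query URL and extracts the top hit; to keep it runnable offline, it parses a canned, abbreviated response in the service's JSON shape rather than making a live call, and the exact query parameters should be checked against the service's documentation:

```python
import json
from urllib.parse import urlencode

def mygene_query_url(symbol: str, species: str = "human") -> str:
    """Build a MyGene.info /v3/query GET URL for a gene symbol."""
    return "http://mygene.info/v3/query?" + urlencode(
        {"q": f"symbol:{symbol}", "species": species,
         "fields": "symbol,name,entrezgene"})

def top_hit(response_text: str) -> dict:
    """Return the highest-scoring hit from a query response (hits are
    returned ranked; an empty dict means no match)."""
    hits = json.loads(response_text).get("hits", [])
    return hits[0] if hits else {}

# Canned (abbreviated) response so the example runs without a network;
# a live call would fetch mygene_query_url("CDK2") with urllib or requests.
canned = ('{"hits": [{"_id": "1017", "symbol": "CDK2", '
          '"name": "cyclin dependent kinase 2", "_score": 88.9}]}')
hit = top_hit(canned)
```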

  16. Programmatic access to data and information at the IRIS DMC via web services

    NASA Astrophysics Data System (ADS)

    Weertman, B. R.; Trabant, C.; Karstens, R.; Suleiman, Y. Y.; Ahern, T. K.; Casey, R.; Benson, R. B.

    2011-12-01

    The IRIS Data Management Center (DMC) has developed a suite of web services that provide access to the DMC's time series holdings, their related metadata, and earthquake catalogs. In addition, services are available to perform simple, on-demand time series processing at the DMC before data are shipped to the user. The primary goal is to provide programmatic access to data and processing services in a manner usable by and useful to the research community. The web services are relatively simple to understand and use, and will form the foundation on which future DMC access tools will be built. Based on standard Web technologies, they can be accessed programmatically with a wide range of programming languages (e.g. Perl, Python, Java), command line utilities such as wget and curl, or with any web browser. We anticipate these services being used for everything from simple command line access, through use in shell scripts and higher-level programming languages, to integration within complex data processing software. In addition to improving access to our data by the seismological community, the web services will also make our data more accessible to other disciplines. The web services available from the DMC include ws-bulkdataselect for the retrieval of large volumes of miniSEED data, ws-timeseries for the retrieval of individual segments of time series data in a variety of formats (miniSEED, SAC, ASCII, audio WAVE, and PNG plots) with optional signal processing, ws-station for station metadata in StationXML format, ws-resp for the retrieval of instrument response in RESP format, ws-sacpz for the retrieval of sensor response in the SAC poles and zeros convention, and ws-event for the retrieval of earthquake catalogs. To make the services even easier to use, the DMC is developing a library that allows Java programmers to seamlessly retrieve and integrate DMC information into their own programs. The library will handle all aspects of dealing with the services and will parse the returned
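Since these services are plain HTTP, a request is just a URL. The sketch below composes a time-series request; the endpoint path and parameter names reflect the current public IRIS timeseries service rather than the 2011-era ws-* naming, so treat them as illustrative and verify against the service documentation before use:

```python
from urllib.parse import urlencode

# Current home of the DMC time-series service (successor to ws-timeseries)
IRIS_TIMESERIES = "http://service.iris.edu/irisws/timeseries/1/query"

def timeseries_request(net, sta, loc, cha, start, end,
                       fmt="miniseed", filters=()):
    """Compose a time-series request URL; optional on-demand processing
    (e.g. 'demean', 'lpfilter=1.0') is expressed as extra query params."""
    params = [("net", net), ("sta", sta), ("loc", loc), ("cha", cha),
              ("starttime", start), ("endtime", end), ("output", fmt)]
    params += [tuple(f.split("=", 1)) if "=" in f else (f, "true")
               for f in filters]
    return IRIS_TIMESERIES + "?" + urlencode(params)

# One hour of broadband data from station ANMO, demeaned, as SAC binary
url = timeseries_request("IU", "ANMO", "00", "BHZ",
                         "2010-02-27T06:30:00", "2010-02-27T07:30:00",
                         fmt="sacbl", filters=("demean",))
```

Exactly this kind of URL can be pasted into a browser, fetched with wget or curl, or issued from any language's HTTP client, which is the point of the DMC's web-service design.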

  17. Designing web services in health information systems: from process to application level.

    PubMed

    Mykkänen, Juha; Riekkinen, Annamari; Sormunen, Marko; Karhunen, Harri; Laitinen, Pertti

    2007-01-01

    Service-oriented architectures (SOAs) and web service technologies have been proposed to respond to some central interoperability challenges of heterogeneous health information systems (HIS). We propose a model which we are using to define services and solutions for healthcare applications from the requirements in the healthcare processes. Focusing on the transition from the process level of the model to the application level, we also present some central design considerations, which can be used to guide the design of service-based interoperability. We illustrate these aspects with examples from our current work from the service-enabled HIS.

  18. Designing web services in health information systems: from process to application level.

    PubMed

    Mykkänen, Juha; Riekkinen, Annamari; Laitinen, Pertti; Karhunen, Harri; Sormunen, Marko

    2005-01-01

    Service-oriented architectures (SOA) and web service technologies have been proposed to respond to some central interoperability challenges of heterogeneous health information systems (HIS). We propose a model, which we are using to define services and solutions for healthcare applications from the requirements in the healthcare processes. Focusing on the transition from the process level of the model to the application level, we also present some central design considerations, which can be used to guide the design of service-based interoperability and illustrate these aspects with examples from our current work in service-enabled HIS.

  19. Wysiwyg Geoprocessing: Coupling Sensor Web and Geoprocessing Services in Virtual Globes

    NASA Astrophysics Data System (ADS)

    Zhai, X.; Gong, J.; Yue, P.; Sun, Z.; Lu, X.

    2011-08-01

    We propose to advance the scientific understanding and applications of geospatial data by coupling Sensor Web and geoprocessing services in Virtual Globes for higher-education teaching and research. The vision is the concept of "What You See Is What You Get" geoprocessing, known for short as WYSIWYG geoprocessing. Virtual Globes offer tremendous opportunities, such as providing a learning tool that helps educational users and researchers digest global-scale geospatial information about the world, and acting as WYSIWYG platforms, where domain experts can see the effects of their actions in an interactive three-dimensional virtual environment. In the meantime, Sensor Web and Web Service technologies make a large number of Earth observing sensors and geoprocessing functionalities as easily accessible to educational users and researchers as their local resources. Coupling Sensor Web and geoprocessing services in Virtual Globes will bring a virtual learning and research environment to the desktops of students and professors, empowering them with WYSIWYG geoprocessing capabilities. The implementation combines the visualization and communication power of Virtual Globes with the on-demand data collection and analysis functionalities of Sensor Web and geoprocessing services, to help students and researchers investigate various scientific problems in an environment with natural and intuitive user experiences. The work will contribute to the scientific and educational activities of geoinformatics communities in that they will have a platform that is easily accessible and helps them perceive world space and perform live geoscientific processes.

  20. Description and testing of the Geo Data Portal: Data integration framework and Web processing services for environmental science collaboration

    USGS Publications Warehouse

    Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.

    2011-01-01

    Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information science software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.
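The spatial-statistics components mentioned above reduce gridded data over a polygon of interest. A minimal sketch of the core calculation, an area-weighted mean (the grid values and overlap areas here are invented; a real Geo Data Portal run derives the overlap areas from the polygon/grid intersection):

```python
def area_weighted_mean(values, areas):
    """Area-weighted mean of grid-cell values over a polygon; 'areas'
    holds the overlap area between each cell and the polygon, so cells
    only partially inside contribute proportionally."""
    total = sum(areas)
    if total == 0:
        raise ValueError("polygon does not intersect the grid")
    return sum(v * a for v, a in zip(values, areas)) / total

# Three cells: two fully inside (weight 1.0) and one clipped to half
mean = area_weighted_mean([10.0, 12.0, 20.0], [1.0, 1.0, 0.5])
```

Weighting by overlap area rather than averaging whole cells is what makes the statistic robust to coordinate reference system and grid-resolution differences, which is why the project verified these components against ArcGIS.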

  1. Applying Semantic Web Services and Wireless Sensor Networks for System Integration

    NASA Astrophysics Data System (ADS)

    Berkenbrock, Gian Ricardo; Hirata, Celso Massaki; de Oliveira Júnior, Frederico Guilherme Álvares; de Oliveira, José Maria Parente

    In environments like factories, buildings, and homes, automation services tend to change often during their lifetime. Changes concern business rules, process optimization, cost reduction, and so on. It is important to provide a smooth and straightforward way to deal with these changes so that they can be handled in a faster and lower-cost manner. Some prominent solutions use the flexibility of Wireless Sensor Networks and the meaningful descriptions of Semantic Web Services to provide service integration. In this work, we give an overview of current solutions for machinery integration that combine both technologies, as well as a discussion of some perspectives and open issues in applying Wireless Sensor Networks and Semantic Web Services to automation service integration.

  2. Performance Issues Related to Web Service Usage for Remote Data Access

    SciTech Connect

    Pais, V. F.; Stancalie, V.; Mihailescu, F. A.; Totolici, M. C.

    2008-04-07

    Web services are starting to be widely used in applications for remotely accessing data. This is of special interest for research based on small and medium scale fusion devices, since scientists participating remotely in experiments access large amounts of data over the Internet. Recent tests were conducted to see how the new network traffic generated by the use of web services can be integrated into the existing infrastructure, and what the impact would be on existing applications, especially those used in a remote participation scenario.

  3. Implementing a Web-based clinical information system using EMR middle layer services.

    PubMed Central

    Kittredge, R. L.; Estey, G.; Pappas, J. J.; Barnett, G. O.

    1996-01-01

    The Clinical Summary is a Web-based application for accessing the clinical database at the Massachusetts General Hospital. The application has been developed to give physicians in our health care community access to clinical information for patients they refer to our hospital. "Middle layer" services, written previously for the hospital's clinical workstation, supply much of the application's functionality. Employment of reusable services together with a Web-based front end has afforded a rapid and inexpensive means for developing a new clinical information system. This paper discusses the system's design, function, and methods of implementation. PMID:8947742

  4. Distance Education Survey, 2007. A Report on Course Structure and Educational Services in Distance Education and Training Council Member Institutions

    ERIC Educational Resources Information Center

    Distance Education and Training Council, 2007

    2007-01-01

    In April 2007 the Distance Education and Training Council (DETC) surveyed 67 of its accredited institutions to determine current aspects of the distance study educational practice. (Military and international institutions were omitted.) This report is a collection and summary of the data received. This survey contained questions in the following…

  5. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis

    PubMed Central

    Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740

  6. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    PubMed

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  7. Integrated web system of geospatial data services for climate research

    NASA Astrophysics Data System (ADS)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander

    2016-04-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required. An approach for integrated analysis of georeferenced climatological data sets, based on a combination of web and GIS technologies in the framework of the spatial data infrastructure paradigm, is presented. Following this approach, a dedicated data-processing web system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is being developed. It is based on Open Geospatial Consortium (OGC) standards and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and OpenLayers software. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement #14.613.21.0037.

  8. Technical Note: Harmonizing met-ocean model data via standard web services within small research groups

    USGS Publications Warehouse

    Signell, Richard; Camossi, E.

    2016-01-01

    Work over the last decade has resulted in standardised web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by (1) making it simple for providers to enable web service access to existing output files; (2) using free technologies that are easy to deploy and configure; and (3) providing standardised, service-based tools that work in existing research environments. We present a simple, local brokering approach that lets modellers continue to use their existing files and tools, while serving virtual data sets that can be used with standardised tools. The goal of this paper is to convince modellers that a standardised framework is not only useful but can be implemented with modest effort using free software components. We use NetCDF Markup language for data aggregation and standardisation, the THREDDS Data Server for data delivery, pycsw for data search, NCTOOLBOX (MATLAB®) and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS.
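The NcML aggregation step described above is a small XML document that presents many per-timestep files as one virtual dataset. A sketch of generating a "joinExisting" aggregation programmatically (the directory path is a placeholder; a real deployment would drop this file into the THREDDS Data Server catalog configuration):

```python
import xml.etree.ElementTree as ET

NCML_NS = "http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2"

def ncml_join_existing(directory: str, suffix: str = ".nc",
                       dim: str = "time") -> str:
    """Emit an NcML 'joinExisting' aggregation that presents a directory
    of model output files as a single virtual dataset along one
    dimension, ready for the THREDDS Data Server to serve."""
    ET.register_namespace("", NCML_NS)
    root = ET.Element(f"{{{NCML_NS}}}netcdf")
    agg = ET.SubElement(root, f"{{{NCML_NS}}}aggregation",
                        {"dimName": dim, "type": "joinExisting"})
    ET.SubElement(agg, f"{{{NCML_NS}}}scan",
                  {"location": directory, "suffix": suffix})
    return ET.tostring(root, encoding="unicode")

ncml = ncml_join_existing("/data/roms/output")   # placeholder path
```

Because the aggregation is declared rather than computed, modellers keep writing their existing per-run files while clients such as NCTOOLBOX and Iris see one continuous time axis.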

  9. Technical note: Harmonising metocean model data via standard web services within small research groups

    NASA Astrophysics Data System (ADS)

    Signell, Richard P.; Camossi, Elena

    2016-05-01

    Work over the last decade has resulted in standardised web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by (1) making it simple for providers to enable web service access to existing output files; (2) using free technologies that are easy to deploy and configure; and (3) providing standardised, service-based tools that work in existing research environments. We present a simple, local brokering approach that lets modellers continue to use their existing files and tools, while serving virtual data sets that can be used with standardised tools. The goal of this paper is to convince modellers that a standardised framework is not only useful but can be implemented with modest effort using free software components. We use NetCDF Markup language for data aggregation and standardisation, the THREDDS Data Server for data delivery, pycsw for data search, NCTOOLBOX (MATLAB®) and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS.

  10. Clearing your Desk! Software and Data Services for Collaborative Web Based GIS Analysis

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Gichamo, T.; Yildirim, A. A.; Liu, Y.

    2015-12-01

    Can your desktop computer crunch the large GIS datasets that are becoming increasingly common across the geosciences? Do you have access to or the know-how to take advantage of advanced high performance computing (HPC) capability? Web based cyberinfrastructure takes work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This talk will describe the HydroShare collaborative environment and web based services being developed to support the sharing and processing of hydrologic data and models. HydroShare supports the upload, storage, and sharing of a broad class of hydrologic data including time series, geographic features and raster datasets, multidimensional space-time data, and other structured collections of data. Web service tools and a Python client library provide researchers with access to HPC resources without requiring them to become HPC experts. This reduces the time and effort spent in finding and organizing the data required to prepare the inputs for hydrologic models and facilitates the management of online data and execution of models on HPC systems. This presentation will illustrate the use of web based data and computation services from both the browser and desktop client software. These web-based services implement the Terrain Analysis Using Digital Elevation Model (TauDEM) tools for watershed delineation, generation of hydrology-based terrain information, and preparation of hydrologic model inputs. They allow users to develop scripts on their desktop computer that call analytical functions that are executed completely in the cloud, on HPC resources using input datasets stored in the cloud, without installing specialized software, learning how to use HPC, or transferring large datasets back to the user's desktop. These cases serve as examples for how this approach can be extended to other models to enhance the use of web and data services in the geosciences.
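To make the kind of terrain analysis being offloaded concrete, here is a toy version of the first step TauDEM runs when delineating watersheds: computing D8 flow directions, where each cell drains toward its steepest-descent neighbour. This is a simplified illustration (no edge contamination handling, tiny in-memory grid), not the TauDEM implementation or its web-service API:

```python
# Eight neighbours in the D8 convention TauDEM uses:
# 1 = East, then counter-clockwise (2 = NE, 3 = N, ... 8 = SE).
D8 = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]

def d8_flow_direction(dem):
    """D8 flow directions for a DEM grid: each cell gets the code of its
    steepest-descent neighbour, or 0 if it has no lower neighbour (a pit).
    Diagonal drops are divided by sqrt(2) for the longer travel distance."""
    rows, cols = len(dem), len(dem[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            best, best_drop = 0, 0.0
            for code, (dr, dc) in enumerate(D8, start=1):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = 2 ** 0.5 if dr and dc else 1.0
                    drop = (dem[r][c] - dem[rr][cc]) / dist
                    if drop > best_drop:
                        best, best_drop = code, drop
            out[r][c] = best
    return out

dem = [[3.0, 2.0, 1.0],
       [4.0, 3.0, 0.5],
       [5.0, 4.0, 2.0]]
directions = d8_flow_direction(dem)
```

On a continental DEM this loop is exactly the kind of computation worth sending to the HPC-backed web service instead of a desktop: the script stays this simple, but the heavy arrays never leave the cloud.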

  11. The BCube Crawler: Web Scale Data and Service Discovery for EarthCube.

    NASA Astrophysics Data System (ADS)

    Lopez, L. A.; Khalsa, S. J. S.; Duerr, R.; Tayachow, A.; Mingo, E.

    2014-12-01

    Web-crawling, a core component of the NSF-funded BCube project, is researching and applying the use of big data technologies to find and characterize different types of web services, catalog interfaces, and data feeds, such as ESIP OpenSearch, OGC W*S, THREDDS, and OAI-PMH, that describe or provide access to scientific datasets. Given the scale of the Internet, which challenges even large search providers such as Google, the BCube plan for discovering these web-accessible services is to subdivide the problem into three smaller, more tractable issues: first, to discover likely sites where relevant data and data services might be found; second, to deeply crawl the discovered sites to find any data and services present; and last, to leverage semantic technologies to characterize the services and data found, and to filter out everything but those relevant to the geosciences. To address the first two challenges BCube uses an adapted version of Apache Nutch (from which Hadoop originated), a web-scale crawler, and Amazon's ElasticMapReduce service for flexibility and cost effectiveness. For characterization of the services found, BCube is examining existing web service ontologies for their applicability to our needs and will re-use and/or extend these in order to query for services with specific well-defined characteristics in scientific datasets, such as the use of geospatial namespaces. The original proposal for the crawler won a grant from Amazon's academic program, which allowed us to become operational; we successfully tested the BCube Crawler at web scale, obtaining a corpus sizeable enough to enable work on characterizing the services and data found. There is still plenty of work to be done: doing "smart crawls" by managing the frontier, developing and enhancing our scoring algorithms, and fully implementing the semantic characterization technologies. We describe the current status of the project
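A first-pass characterisation of crawled documents can be as simple as matching each response against known service signatures. The sketch below is a deliberately naive stand-in for the semantic characterisation step described above: the signatures are simplified markers I chose for illustration, and a real crawler would parse XML namespaces rather than substring-match:

```python
# (label, tell-tale marker) pairs for the service types named above.
SIGNATURES = [
    ("OGC-WMS", "WMS_Capabilities"),
    ("OGC-WFS", "WFS_Capabilities"),
    ("OGC-CSW", "csw:Capabilities"),
    ("THREDDS", "unidata.ucar.edu/namespaces/thredds"),
    ("OpenSearch", "OpenSearchDescription"),
    ("OAI-PMH", "<OAI-PMH"),
]

def classify_service(body: str) -> str:
    """Tag a crawled document with the first service type whose
    signature appears in it; 'unknown' documents fall through to the
    geoscience-relevance filter (or are discarded)."""
    for label, signature in SIGNATURES:
        if signature in body:
            return label
    return "unknown"
```

In a Nutch-style pipeline this function would run as a parse/index filter over each fetched page, so the expensive ontology-based characterisation only ever sees documents that already look like service descriptions.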

  12. Applying Web Service on the Simulation and Demonstration for Groundwater Resources

    NASA Astrophysics Data System (ADS)

    Liu, H.; Wang, H.; Chang, L.

    2013-12-01

    This study uses web service technology to reach the following goals: (1) a unified data format, (2) cross-platform data input/output, (3) linkage of a previously developed groundwater simulation model with a database, and (4) a friendly cross-platform user interface. With this web service technology, users can remotely (1) read, extract, manage, and display groundwater-related data and (2) run the selected groundwater simulation model. Three web formats are provided for different needs: (1) a website, (2) a format for mobile devices, and (3) an interface for researchers. For the first two formats, users can visualize the selected data or model results. The website can be browsed with commonly available browsers. The Android system is used for the second format, simplifying the data demonstration process: investigators can use mobile devices to show spatial information, groundwater levels, and hydrogeological parameters. The third format provides an API through which professional researchers can access massive amounts of groundwater-related data and parameters. The developed web service is applied to the Pingtong groundwater area to simulate multiple groundwater flow scenarios and show the simulation results.

  13. Request Interface for Operations: A Web Service and Workflow Prototype

    NASA Astrophysics Data System (ADS)

    Monrozier, F. Jocteur; Pesquet, T.

    This paper presents the approach retained in a CNES R&D development to provide a configurable and generic request interface for operations, using new modeling and programming techniques (standards and tools) at the core of the resulting "Request Interface for Operations" (RIO) framework. This prototype is based on object-oriented and Internet technologies and standards such as SOAP with Attachments, UML State diagrams, and Java. The advantage of the approach presented in this paper is a customizable tool that can be configured and deployed according to the target needs in order to provide a cross-support "request interface for operations". Once this work is carried out and validated, it should be submitted for approval to the CCSDS Cross Support Services Area in order to extend the current SLE Service Request work and provide a recommendation for a "Cross-support Request Interface for Operations". As this approach also provides a methodology for defining a complete and pragmatic service interface specification (with static and dynamic views) focused on the user's point of view, it will be proposed to the CCSDS Systems Architecture Working Group to complete the Reference Architecture methodology. Keywords: UML State diagrams, dynamic service interface description, formal notation, code generation, SOAP, CCSDS SLE Service Management, cross-support.

  14. Estimating and Presenting Individualized Earthquake Risk Using Web-Based Information Services

    NASA Astrophysics Data System (ADS)

    Holliday, J. R.; Rundle, J. B.; Donnellan, A.

    2009-12-01

    Great natural disasters have occurred many times throughout human history. Events such as the San Francisco earthquake of 1906, the 2004 Sumatra earthquake and tsunami, and Hurricane Katrina in 2005 have caused massive destruction and suffering. With the modern tools of risk analysis, forecasting, and the World Wide Web available, human societies should no longer tolerate the human and economic losses these disasters produce. Thanks to new technologies and web-based applications, it will soon be possible to enable a more sustainable human society in the face of severe, recurring natural disasters in the complex Earth system. Web-based information services make it easy to specify geographical locations and describe specific building structures. Coupling this with publicly available earthquake forecasts and web-based mapping tools allows the public to make more informed choices about how to manage their personal exposure to risk from natural catastrophes.

  15. Flexible Web service infrastructure for the development and deployment of predictive models.

    PubMed

    Guha, Rajarshi

    2008-02-01

    The development of predictive statistical models is a common task in the field of drug design. The process of developing such models involves two main steps: building the model and then deploying the model. Traditionally such models have been deployed using Web page interfaces. This approach restricts the user to using the specified Web page, and using the model in other ways can be cumbersome. In this paper we present a flexible and generalizable approach to the deployment of predictive models, based on a Web service infrastructure using R. The infrastructure described allows one to access the functionality of these models using a variety of approaches ranging from Web pages to workflow tools. We highlight the advantages of this infrastructure by developing and subsequently deploying random forest models for two data sets.

  16. Do Altmetrics Work? Twitter and Ten Other Social Web Services

    PubMed Central

    Thelwall, Mike; Haustein, Stefanie; Larivière, Vincent; Sugimoto, Cassidy R.

    2013-01-01

    Altmetric measurements derived from the social web are increasingly advocated and used as early indicators of article impact and usefulness. Nevertheless, there is a lack of systematic scientific evidence that altmetrics are valid proxies of either impact or utility although a few case studies have reported medium correlations between specific altmetrics and citation rates for individual journals or fields. To fill this gap, this study compares 11 altmetrics with Web of Science citations for 76 to 208,739 PubMed articles with at least one altmetric mention in each case and up to 1,891 journals per metric. It also introduces a simple sign test to overcome biases caused by different citation and usage windows. Statistically significant associations were found between higher metric scores and higher citations for articles with positive altmetric scores in all cases with sufficient evidence (Twitter, Facebook wall posts, research highlights, blogs, mainstream media and forums) except perhaps for Google+ posts. Evidence was insufficient for LinkedIn, Pinterest, question and answer sites, and Reddit, and no conclusions should be drawn about articles with zero altmetric scores or the strength of any correlation between altmetrics and citations. Nevertheless, comparisons between citations and metric values for articles published at different times, even within the same year, can remove or reverse this association and so publishers and scientometricians should consider the effect of time when using altmetrics to rank articles. Finally, the coverage of all the altmetrics except for Twitter seems to be low and so it is not clear if they are prevalent enough to be useful in practice. PMID:23724101
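The simple sign test introduced above can be illustrated as follows: for matched pairs of articles, count how often the altmetric-mentioned article has more citations than its control and compute an exact two-sided binomial p-value. This is a sketch of the general idea under invented data, not the authors' exact procedure:

```python
from math import comb

def sign_test(pairs):
    """pairs: list of (citations_with_altmetric, citations_control).
    Ties are dropped, as is conventional for the sign test."""
    wins = sum(a > b for a, b in pairs)
    losses = sum(a < b for a, b in pairs)
    n = wins + losses
    k = max(wins, losses)
    # Two-sided exact binomial p-value under H0: P(win) = 0.5
    p = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n * 2
    return wins, losses, min(p, 1.0)

# Invented citation counts for eight matched article pairs.
pairs = [(12, 5), (30, 14), (7, 7), (9, 2), (4, 6), (18, 11), (25, 9), (3, 1)]
wins, losses, p = sign_test(pairs)
print(wins, losses, round(p, 3))  # 6 1 0.125
```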

  17. ChEMBL web services: streamlining access to drug discovery data and utilities

    PubMed Central

    Davies, Mark; Nowotka, Michał; Papadatos, George; Dedman, Nathan; Gaulton, Anna; Atkinson, Francis; Bellis, Louisa; Overington, John P.

    2015-01-01

    ChEMBL is now a well-established resource in the fields of drug discovery and medicinal chemistry research. The ChEMBL database curates and stores standardized bioactivity, molecule, target and drug data extracted from multiple sources, including the primary medicinal chemistry literature. Programmatic access to ChEMBL data has been improved by a recent update to the ChEMBL web services (version 2.0.x, https://www.ebi.ac.uk/chembl/api/data/docs), which exposes significantly more data from the underlying database and introduces new functionality. To complement the data-focused services, a utility service (version 1.0.x, https://www.ebi.ac.uk/chembl/api/utils/docs), which provides RESTful access to commonly used cheminformatics methods, has also been concurrently developed. The ChEMBL web services can be used together or independently to build applications and data processing workflows relevant to drug discovery and chemical biology. PMID:25883136
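As a sketch of how the data service above might be consumed programmatically, the snippet below builds RESTful request URLs against the documented base path and parses a canned JSON fragment; the specific resource names and filter parameter are illustrative and should be checked against the service documentation:

```python
import json
from urllib.parse import urlencode

DATA_BASE = "https://www.ebi.ac.uk/chembl/api/data"

def molecule_url(chembl_id: str, fmt: str = "json") -> str:
    """URL for a single molecule record from the data service."""
    return f"{DATA_BASE}/molecule/{chembl_id}.{fmt}"

def activity_query_url(target_chembl_id: str, limit: int = 20) -> str:
    """URL for bioactivities filtered by target (query-string filter style)."""
    qs = urlencode({"target_chembl_id": target_chembl_id,
                    "limit": limit, "format": "json"})
    return f"{DATA_BASE}/activity?{qs}"

# A canned fragment in the shape of a ChEMBL JSON molecule record
# (fields abridged for illustration; a real client would fetch the URL).
sample = '{"molecule_chembl_id": "CHEMBL25", "pref_name": "ASPIRIN"}'
record = json.loads(sample)
print(molecule_url(record["molecule_chembl_id"]))
```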

  18. Semantic-based web service discovery and chaining for building an Arctic spatial data infrastructure

    NASA Astrophysics Data System (ADS)

    Li, W.; Yang, C.; Nebert, D.; Raskin, R.; Houser, P.; Wu, H.; Li, Z.

    2011-11-01

    Increasing interest in the global environment and climate change has led to studies focused on changes in the multinational Arctic region. To facilitate Arctic research, a spatial data infrastructure (SDI), where Arctic data, information, and services are shared and integrated in a seamless manner, particularly in light of today's climate change scenarios, is urgently needed. In this paper, we utilize the knowledge-based approach and the spatial web portal technology to prototype an Arctic SDI (ASDI) by proposing (1) a hybrid approach for efficient service discovery from distributed web catalogs and the dynamic Internet; (2) a domain knowledge base to model the latent semantic relationships among scientific data and services; and (3) an intelligent logic reasoning mechanism for (semi-)automatic service selection and chaining. A study of the influence of solid water dynamics on the bio-habitat of the Arctic region is used as an example to demonstrate the prototype.
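The role of the domain knowledge base in service discovery can be sketched as query expansion: a small thesaurus of latent term relationships lets a request match services described with related vocabulary. The terms, relations, and service catalog below are invented for illustration:

```python
# Toy knowledge base: each concept maps to semantically related terms.
KNOWLEDGE_BASE = {
    "sea ice": {"solid water", "ice extent", "ice concentration"},
    "habitat": {"bio-habitat", "species distribution"},
}

# Toy service catalog: identifier -> free-text description.
SERVICES = {
    "wms-ice": "Daily Arctic ice concentration maps (WMS)",
    "wfs-seals": "Ringed seal species distribution features (WFS)",
    "wcs-sst": "Sea surface temperature coverages (WCS)",
}

def expand(query: str) -> set:
    """Expand a query term with its semantically related terms."""
    terms = {query}
    for concept, related in KNOWLEDGE_BASE.items():
        if query == concept or query in related:
            terms |= {concept} | related
    return terms

def discover(query: str):
    """Return services whose descriptions mention any expanded term."""
    terms = expand(query.lower())
    return sorted(sid for sid, desc in SERVICES.items()
                  if any(t in desc.lower() for t in terms))

print(discover("sea ice"))   # matches via the related term "ice concentration"
print(discover("habitat"))   # matches via "species distribution"
```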

  19. Design, Implementation and Applications of 3D Web-Services in DB4GeO

    NASA Astrophysics Data System (ADS)

    Breunig, M.; Kuper, P. V.; Dittrich, A.; Wild, P.; Butwilowski, E.; Al-Doori, M.

    2013-09-01

    The object-oriented database architecture DB4GeO was originally designed to support sub-surface applications in the geo-sciences. This is reflected in DB4GeO's geometric data model as well as in its import and export functions. Initially, these functions were designed for communication with 3D geological modeling and visualization tools such as GOCAD or MeshLab. However, it soon became clear that DB4GeO was suitable for a much wider range of applications. It is therefore natural to move away from a standalone solution and to open access to DB4GeO data via standardized OGC web services. Though REST and OGC services seem incompatible at first sight, the implementation in DB4GeO shows that an OGC-based implementation of web services may reuse parts of the DB4GeO REST implementation. Starting with initial solutions in the history of DB4GeO, this paper introduces the design, adaptation (i.e. model transformation), and first steps in the implementation of OGC Web Feature Services (WFS) and Web Processing Services (WPS) as new interfaces to DB4GeO data and operations. Among its capabilities, DB4GeO can provide data in different formats such as GML, GOCAD, or DB3D XML through a WFS, and can run operations such as a 3D-to-2D service or mesh simplification (Progressive Meshes) through a WPS. We then demonstrate an Android-based mobile 3D augmented reality viewer for DB4GeO that uses the Web Feature Service to visualize 3D geo-database query results. Finally, we explore future research work considering DB4GeO in the framework of the research group "Computer-Aided Collaborative Subway Track Planning in Multi-Scale 3D City and Building Models".
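A client retrieving data through a Web Feature Service interface such as the one described above would issue standard OGC requests. The sketch below assembles a WFS 2.0.0 GetFeature request as a KVP-encoded URL with the standard library; the endpoint and feature type name are placeholders, not an actual DB4GeO deployment:

```python
from urllib.parse import urlencode

def wfs_getfeature_url(endpoint: str, type_name: str,
                       output_format: str = "application/gml+xml; version=3.2",
                       count: int = 100) -> str:
    """Build an OGC WFS 2.0.0 GetFeature request as a KVP-encoded URL."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "outputFormat": output_format,
        "count": count,
    }
    return f"{endpoint}?{urlencode(params)}"

# Hypothetical endpoint and feature type for illustration:
url = wfs_getfeature_url("http://example.org/db4geo/wfs", "geo:BoreholeSurface")
print(url)
```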

  20. Study on dynamic services composition of web services based on BPEL

    NASA Astrophysics Data System (ADS)

    Gao, Jinyue; Huang, Fei; Zhang, Gongxuan

    2013-12-01

    Starting from "service", the core concept of SOA (Service-Oriented Architecture), service composition is discussed in detail. Based on modeling of the service-relationship network, a dynamic service composition approach built on the Business Process Execution Language (BPEL) is proposed in this paper. Two further concepts, service agent and service quality, are also described, which together enable dynamic execution of the service process.
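A BPEL process is itself an XML document, and dynamic composition ultimately produces or rewrites such documents. The sketch below parses a minimal, hand-written BPEL fragment (illustrative, not a complete executable process) and lists the partner-link operations it invokes:

```python
import xml.etree.ElementTree as ET

# Minimal illustrative BPEL fragment (not a complete executable process).
BPEL = """<process xmlns="http://docs.oasis-open.org/wsbpel/2.0/process/executable"
         name="QuoteFlow">
  <sequence>
    <receive partnerLink="client" operation="requestQuote"/>
    <invoke partnerLink="pricing" operation="getPrice"/>
    <invoke partnerLink="shipping" operation="getShippingCost"/>
    <reply partnerLink="client" operation="requestQuote"/>
  </sequence>
</process>"""

NS = "http://docs.oasis-open.org/wsbpel/2.0/process/executable"

def invoked_operations(bpel_xml: str):
    """List (partnerLink, operation) pairs for every <invoke> activity."""
    root = ET.fromstring(bpel_xml)
    return [(e.get("partnerLink"), e.get("operation"))
            for e in root.iter("{%s}invoke" % NS)]

print(invoked_operations(BPEL))
```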

  1. Enabling Interoperability and Servicing Multiple User Segments Through Web Services, Standards, and Data Tools

    NASA Astrophysics Data System (ADS)

    Palanisamy, Giriprakash; Wilson, Bruce E.; Cook, Robert B.; Lenhardt, Chris W.; Santhana Vannan, Suresh; Pan, Jerry; McMurry, Ben F.; Devarakonda, Ranjeet

    2010-12-01

    The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) is one of the science-oriented data centers in EOSDIS, aligned primarily with terrestrial ecology. The ORNL DAAC archives and serves data from NASA-funded field campaigns (such as BOREAS, FIFE, and LBA), regional and global data sets relevant to biogeochemical cycles, land validation studies for remote sensing, and source code for some terrestrial ecology models. Users of the ORNL DAAC include field ecologists, remote sensing scientists, modelers at various scales, synthesis science groups, a range of educational users (particularly baccalaureate and graduate instruction), and decision support analysts. It is clear that the wide range of users served by the ORNL DAAC have differing needs and differing capabilities for accessing and using data. It is also not possible for the ORNL DAAC, or the other data centers in EOSDIS, to directly develop all of the tools and interfaces needed to support even most of the potential uses of the data. As is typical of information technology supporting a research enterprise, user needs will continue to evolve rapidly over time, and users themselves cannot predict future needs, as those needs depend on the results of current investigation. The ORNL DAAC is addressing these needs through targeted implementation of web services and tools that can be consumed by other applications, so that a modeler can retrieve data in netCDF format following the Climate and Forecast (CF) convention while a field ecologist can retrieve subsets of that same data in a comma-separated-value format suitable for use in Excel or R. Tools such as our MODIS subsetting capability, the Spatial Data Access Tool (SDAT; based on OGC web services), and OPeNDAP-compliant servers such as THREDDS particularly enable such diverse means of access. We also seek interoperability of metadata, recognizing that terrestrial ecology is a field with a very large number of relevant data repositories. ORNL DAAC…
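Serving the same archived dataset to modelers and field ecologists, as described above, amounts to format dispatch in the service layer. A minimal sketch follows, with CSV and JSON standing in for the full list of formats and invented flux-tower records standing in for an archived dataset:

```python
import csv, io, json

ROWS = [  # toy flux-tower records standing in for an archived dataset
    {"site": "US-Ha1", "date": "2009-07-01", "nee_gC_m2_d": -4.2},
    {"site": "US-Ha1", "date": "2009-07-02", "nee_gC_m2_d": -3.8},
]

def render(rows, fmt: str) -> str:
    """Dispatch one dataset to the representation a client asked for."""
    if fmt == "csv":          # spreadsheet/R-friendly
        buf = io.StringIO()
        w = csv.DictWriter(buf, fieldnames=rows[0].keys())
        w.writeheader()
        w.writerows(rows)
        return buf.getvalue()
    if fmt == "json":         # programmatic clients
        return json.dumps(rows)
    raise ValueError(f"unsupported format: {fmt}")

print(render(ROWS, "csv").splitlines()[0])  # header row
```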

  2. Tampa Bay Ecosystem Services Demonstration Pilot Phase 2 web site

    EPA Science Inventory

    The value of nature's benefits is difficult to consider in environmental decision-making since ecosystem goods and services are usually not well measured or quantified in economic terms. The Tampa Bay Estuary Program, Tampa Bay Regional Planning Council, the U.S. Environmental Pr...

  3. New Web-Monitoring Service Worries Some Legal Experts

    ERIC Educational Resources Information Center

    Sander, Libby

    2008-01-01

    A software program that searches for offensive content on college athletes' social-networking sites has drawn skeptical reactions from legal experts, who say it could threaten students' constitutional rights. Billed as a "social-network monitoring service" and marketed exclusively to college athletics departments, YouDiligence was on display at…

  4. Web-Based Academic Support Services: Guidelines for Extensibility

    ERIC Educational Resources Information Center

    McCracken, Holly

    2005-01-01

    Using the experience of the University of Illinois at Springfield's College of Liberal Arts and Sciences as a foundation for discussion, this paper addresses the provision of student support services to distant students within the context of development and expansion. Specific issues for consideration include: integrating student support…

  5. Context-Adaptive Learning Designs by Using Semantic Web Services

    ERIC Educational Resources Information Center

    Dietze, Stefan; Gugliotta, Alessio; Domingue, John

    2007-01-01

    IMS Learning Design (IMS-LD) is a promising technology aimed at supporting learning processes. IMS-LD packages contain the learning process metadata as well as the learning resources. However, the allocation of resources--whether data or services--within the learning design is done manually at design-time on the basis of the subjective appraisals…

  6. The Use of Video-Taped Lectures and Web-Based Communications in Teaching: A Distance-Teaching and Cross-Atlantic Collaboration Experiment.

    ERIC Educational Resources Information Center

    Herder, P. M.; Subrahmanian, E.; Talukdar, S.; Turk, A. L.; Westerberg, A. W.

    2002-01-01

    Explains distance education approach applied to the 'Engineering Design Problem Formulation' course simultaneously at the Delft University of Technology (the Netherlands) and at Carnegie Mellon University (CMU, Pittsburgh, USA). Uses video taped lessons, video conferencing, electronic mails and web-accessible document management system LIRE in the…

  7. Software architecture and design of the web services facilitating climate model diagnostic analysis

    NASA Astrophysics Data System (ADS)

    Pan, L.; Lee, S.; Zhang, J.; Tang, B.; Zhai, C.; Jiang, J. H.; Wang, W.; Bao, Q.; Qi, M.; Kubar, T. L.; Teixeira, J.

    2015-12-01

    Climate model diagnostic analysis is a computationally- and data-intensive task because it involves multiple numerical model outputs and satellite observation data that can both be high resolution. We have built an online tool that facilitates this process, called the Climate Model Diagnostic Analyzer (CMDA). It employs web service technology and provides a web-based user interface. The benefits of these choices include: (1) no installation of any software other than a browser, making the tool platform independent; (2) co-location of the computation and the big data on the server side, with only small results and plots downloaded to the client side, hence high data efficiency; (3) a multi-threaded implementation that achieves parallel performance on multi-core servers; and (4) cloud deployment, so each user has a dedicated virtual machine. In this presentation, we will focus on the computer science aspects of this tool, namely the architectural design, the infrastructure of the web services, the implementation of the web-based user interface, the mechanism of provenance collection, the approach to virtualization, and the Amazon Cloud deployment. As an example, we will describe our methodology for transforming an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). Another example is the use of Docker, a light-weight virtualization container, to distribute and deploy CMDA onto an Amazon EC2 instance. CMDA was successfully used in the 2014 Summer School hosted by the JPL Center for Climate Science. Student feedback was positive in general, and we will report their comments. An enhanced version of CMDA with several new features, some requested by the 2014 students, will be used in the upcoming 2015 Summer School.
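The wrapping step described above, turning an existing analysis routine into a web service, can be sketched with only the standard library's WSGI support (wsgiref standing in for Flask/Gunicorn; the analysis function and parameter names are invented for illustration):

```python
import json
from urllib.parse import parse_qs
from wsgiref.util import setup_testing_defaults

def anomaly(values):           # stand-in for an existing science routine
    mean = sum(values) / len(values)
    return [round(v - mean, 3) for v in values]

def app(environ, start_response):
    """WSGI wrapper: query-string parameters in, JSON result out."""
    qs = parse_qs(environ.get("QUERY_STRING", ""))
    values = [float(v) for v in qs.get("v", [])]
    body = json.dumps({"anomaly": anomaly(values)}).encode()
    start_response("200 OK", [("Content-Type", "application/json")])
    return [body]

# Exercise the app without opening a network socket:
environ = {}
setup_testing_defaults(environ)
environ["QUERY_STRING"] = "v=1.0&v=2.0&v=6.0"
captured = []
result = app(environ, lambda status, headers: captured.append(status))
print(result[0].decode())
```

A production deployment would serve `app` behind a WSGI server such as Gunicorn, much as the abstract describes for Flask-wrapped science codes.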

  8. 76 FR 28439 - Submission for OMB Review; Comment Request; NCI Cancer Genetics Services Directory Web-Based...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ... Genetics Services Directory Web-Based Application Form and Update Mailer Summary: Under the provisions of... Collection: Title: NCI Cancer Genetics Services Directory Web-based Application Form and Update Mailer. Type... collect information about genetics professionals to be included in the NCI Cancer Genetics...

  9. Oh! Web 2.0, Virtual Reference Service 2.0, Tools & Techniques (II)

    ERIC Educational Resources Information Center

    Arya, Harsh Bardhan; Mishra, J. K.

    2012-01-01

    The paper describes the theory and definition of the practice of librarianship, specifically addressing how Web 2.0 technologies (tools) such as synchronous messaging, collaborative reference service and streaming media, blogs, wikis, social networks, social bookmarking tools, tagging, RSS feeds, and mashups might intimate changes and how…

  10. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016
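Model integration by geospatial workflow, as described above, can be sketched as chaining service calls so that each step's output feeds the next. The service names and the trivial "models" below are invented; real published models would sit behind web service endpoints with the same uniform call shape:

```python
# Each "service" is a published model behind a uniform call interface.
def ndvi_service(payload):      # stand-in for an RS model service
    red, nir = payload["red"], payload["nir"]
    return {"ndvi": round((nir - red) / (nir + red), 3)}

def classify_service(payload):  # stand-in for a GIS model service
    return {"class": "vegetated" if payload["ndvi"] > 0.3 else "sparse"}

WORKFLOW = [ndvi_service, classify_service]

def run_workflow(steps, payload):
    """Chain services: each step consumes the previous step's output."""
    for step in steps:
        payload = step(payload)
    return payload

print(run_workflow(WORKFLOW, {"red": 0.12, "nir": 0.52}))
```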

  11. Freshman Admissions Predictor: An Interactive Self-Help Web Counseling Service

    ERIC Educational Resources Information Center

    Head, Joe F.; Hughes, Thomas M.

    2004-01-01

    Colleges and universities must seek or develop the most competitive enrollment management tools in order to reach and admit qualified students. However, institutions that utilize transactional Web features are more effective if they can personalize services by providing useful customized information in real time for the prospect. Well crafted high…

  12. The Experiences of Older Students' Use of Web-Based Student Services

    ERIC Educational Resources Information Center

    Ho, Katy W.

    2012-01-01

    The purpose of this phenomenological case study was to understand the experiences of older students' use of web-based student services in a community college setting. For the purpose of this study the term "older student" was defined as people born between the years 1943 and 1960. This group of people, often described as the Baby Boomer…

  13. A Semantic Web Service and Simulation Framework to Intelligent Distributed Manufacturing

    SciTech Connect

    Son, Young Jun; Kulvatunyou, Boonserm; Cho, Hyunbo; Feng, Shaw

    2005-11-01

    To cope with today's fluctuating markets, a virtual enterprise (VE) concept can be employed to achieve the cooperation among independently operating enterprises. The success of VE depends on reliable interoperation among trading partners. This paper proposes a framework based on semantic web of manufacturing and simulation services to enable business and engineering collaborations between VE partners, particularly a design house and manufacturing suppliers.

  14. Using Forecasting to Predict Long-Term Resource Utilization for Web Services

    ERIC Educational Resources Information Center

    Yoas, Daniel W.

    2013-01-01

    Researchers have spent years understanding resource utilization to improve scheduling, load balancing, and system management through short-term prediction of resource utilization. Early research focused primarily on single operating systems; later, interest shifted to distributed systems and, finally, into web services. In each case researchers…

  15. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    PubMed

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.

  16. A Service Oriented Web Application for Learner Knowledge Representation, Management and Sharing Conforming to IMS LIP

    ERIC Educational Resources Information Center

    Lazarinis, Fotis

    2014-01-01

    iLM is a Web-based application for the representation, management, and sharing of IMS LIP conformant user profiles. The tool is developed using a service-oriented architecture with emphasis on easy data sharing. Data elicitation from user profiles is based on the utilization of XQuery scripts, and sharing with other applications is achieved through…
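Eliciting fields from an IMS LIP-style XML profile, as the XQuery scripts above do, can be sketched in Python with ElementTree standing in for XQuery. The profile fragment and element names below are simplified for illustration and do not follow the actual IMS LIP schema:

```python
import xml.etree.ElementTree as ET

# Simplified, illustrative profile fragment (not the real IMS LIP schema).
PROFILE = """<learnerinformation>
  <identification><name><fn>Ada Learner</fn></name></identification>
  <activity><title>Intro to XQuery</title><status>completed</status></activity>
  <activity><title>Web Services</title><status>in-progress</status></activity>
</learnerinformation>"""

def completed_activities(xml_text: str):
    """Return titles of activities whose status is 'completed'."""
    root = ET.fromstring(xml_text)
    return [a.findtext("title") for a in root.iter("activity")
            if a.findtext("status") == "completed"]

print(completed_activities(PROFILE))  # ['Intro to XQuery']
```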

  17. Effects of light reduction on food webs and associated ecosystem services of Yaquina Bay

    EPA Science Inventory

    Reduced water clarity can affect estuarine primary production but little is known of its subsequent effects to consumer guilds or ecosystem services. We investigated those effects using inverse analysis of modeled food webs of the lower (polyhaline) and upper (mesohaline) reache...

  18. Prototype of a Mobile Social Network for Education Using Dynamic Web Service

    ERIC Educational Resources Information Center

    Hoentsch, Sandra Costa Pinto; Carvalho, Felipe Oliveira; Santos, Luiz Marcus Monteiro de Almeida; Ribeiro, Admilson de Ribamar Lima

    2012-01-01

    This article presents the proposal of a social network site SocialNetLab that belongs to the Department of Computing-Federal University of Sergipe and which aims to locate and notify users of a nearby friend independently of the location technology available in the equipment through dynamic Web Service; to serve as a laboratory for research in…

  19. NDU Knowledge Net: A Web-Enabled Just-In-Time Information Service for Continuing Education.

    ERIC Educational Resources Information Center

    Alden, Jay

    This paper describes the development of a web-enabled information service for constituents of the Information Resources Management College (National Defense University, Washington, DC). The constituents of the College, who include graduates, current students, and prospective students, typically work in the Chief Information Officer (CIO) office of…

  20. Electronic Resources for Youth Services: A Print Bibliography and Web Site.

    ERIC Educational Resources Information Center

    Amey, Larry; Segal, Erez

    1996-01-01

    This article evaluates 57 World Wide Web sites related to children's literature and youth-oriented library services, in categories including award-winning books; book reviews; reading and storytelling; writing resources; online children's literature; educational entertainment; and authors, publishers, and booksellers. Also included is information…

  1. Issues in implementing services for a wireless web-enabled digital camera

    NASA Astrophysics Data System (ADS)

    Venkataraman, Shyam; Sampat, Nitin; Fisher, Yoram; Canosa, John; Noel, Nicholas

    2001-05-01

    The competition in the exploding digital photography market has caused vendors to explore new ways to increase their return on investment. A common view among industry analysts is that increasingly it will be services provided by these cameras, and not the cameras themselves, that will provide the revenue stream. These services will be coupled to e-appliance-based communities. In addition, the rapidly increasing need to upload images to the Internet for photo-finishing services, as well as the need to download software upgrades to the camera, is driving many camera OEMs to evaluate the benefits of using the wireless web to extend their enterprise systems. Currently, creating a viable e-appliance such as a digital camera coupled with a wireless web service requires more than just a competency in product development. This paper will evaluate the system implications of deploying recurring-revenue services and enterprise connectivity for a wireless, web-enabled digital camera. These include, among other things, an architectural design approach for services such as device management, synchronization, billing, connectivity, and security. Such an evaluation will assist, we hope, anyone designing or connecting a digital camera to enterprise systems.

  2. UltiMatch-NL: a Web service matchmaker based on multiple semantic filters.

    PubMed

    Mohebbi, Keyvan; Ibrahim, Suhaimi; Zamani, Mazdak; Khezrian, Mojtaba

    2014-01-01

    In this paper, a Semantic Web service matchmaker called UltiMatch-NL is presented. UltiMatch-NL applies two filters, namely Signature-based and Description-based, to different abstraction levels of a service profile to achieve more accurate results. More specifically, the proposed filters rely on semantic knowledge to extract the similarity between a given pair of service descriptions. It is thus a further step towards fully automated Web service discovery, making this process more semantic-aware. In addition, a new technique is proposed to automatically weight and combine the results of the different filters of UltiMatch-NL. Moreover, an innovative approach is introduced to predict the relevance of requests and Web services, eliminating the need to set a similarity threshold. UltiMatch-NL is evaluated against the OWLS-TC repository. The performance evaluation, based on standard measures from the information retrieval field, shows that semantic matching of OWL-S services can be significantly improved by incorporating the designed matching filters.
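Combining the scores of the Signature-based and Description-based filters, as described above, can be sketched as a weighted aggregation. The weights and similarity scores below are invented for illustration; the paper's actual technique determines the weights automatically:

```python
def combine(scores: dict, weights: dict) -> float:
    """Weighted average of per-filter similarity scores in [0, 1]."""
    total_w = sum(weights[f] for f in scores)
    return sum(scores[f] * weights[f] for f in scores) / total_w

def rank(candidates, weights):
    """Order candidate services by combined similarity, best first."""
    return sorted(candidates,
                  key=lambda c: combine(c["scores"], weights),
                  reverse=True)

weights = {"signature": 0.6, "description": 0.4}
candidates = [
    {"service": "BookPriceService",
     "scores": {"signature": 0.9, "description": 0.7}},
    {"service": "CarRentalService",
     "scores": {"signature": 0.5, "description": 0.95}},
]
best = rank(candidates, weights)[0]["service"]
print(best)
```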

  3. Microarray oligonucleotide probe designer (MOPeD): A web service.

    PubMed

    Patel, Viren C; Mondal, Kajari; Shetty, Amol Carl; Horner, Vanessa L; Bedoyan, Jirair K; Martin, Donna; Caspary, Tamara; Cutler, David J; Zwick, Michael E

    2010-11-01

    Methods of genomic selection that combine high-density oligonucleotide microarrays with next-generation DNA sequencing allow investigators to characterize genomic variation in selected portions of complex eukaryotic genomes. Yet choosing which specific oligonucleotides to use can pose a major technical challenge. To address this issue, we have developed a software package called MOPeD (Microarray Oligonucleotide Probe Designer), which automates the process of designing genomic selection microarrays. This web-based software allows individual investigators to design custom genomic selection microarrays optimized for synthesis with Roche NimbleGen's maskless photolithography. Design parameters include uniqueness of the probe sequences, melting temperature, hairpin formation, and the presence of single nucleotide polymorphisms. We generated probe databases for the human, mouse, and rhesus macaque genomes and conducted experimental validation of MOPeD-designed microarrays in human samples by sequencing the human X chromosome exome, where relevant sequence metrics indicated superior performance relative to a microarray designed by the Roche NimbleGen proprietary algorithm. We also performed validation in the mouse to identify known mutations contained within a 487-kb region from mouse chromosome 16, the mouse chromosome 16 exome (1.7 Mb), and the mouse chromosome 12 exome (3.3 Mb). Our results suggest that the open-source MOPeD software package and website (http://moped.genetics.emory.edu/) will be a valuable resource for investigators in their sequence-based studies of complex eukaryotic genomes.
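The design parameters listed above (melting temperature, GC content, and so on) can be illustrated with a simple probe filter. The Wallace rule used here for Tm and the numeric thresholds are textbook simplifications for short oligos, not MOPeD's actual criteria:

```python
def wallace_tm(probe: str) -> int:
    """Wallace rule: Tm ~ 2*(A+T) + 4*(G+C), valid for short oligos."""
    p = probe.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def gc_fraction(probe: str) -> float:
    """Fraction of G/C bases in the probe."""
    p = probe.upper()
    return (p.count("G") + p.count("C")) / len(p)

def passes_filters(probe: str, tm_range=(52, 60), gc_range=(0.4, 0.6)) -> bool:
    """Keep probes whose Tm and GC content fall in the target windows
    (illustrative thresholds)."""
    return (tm_range[0] <= wallace_tm(probe) <= tm_range[1]
            and gc_range[0] <= gc_fraction(probe) <= gc_range[1])

probes = ["ATGCGTACGTTAGCATCG", "ATATATATATATATATAT", "GCGCGCGCGCGCGCGCGC"]
print([p for p in probes if passes_filters(p)])
```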

  4. AWSCS-A System to Evaluate Different Approaches for the Automatic Composition and Execution of Web Services Flows.

    PubMed

    Tardiole Kuehne, Bruno; Estrella, Julio Cezar; Nunes, Luiz Henrique; Martins de Oliveira, Edvard; Hideo Nakamura, Luis; Gomes Ferreira, Carlos Henrique; Carlucci Santana, Regina Helena; Reiff-Marganiec, Stephan; Santana, Marcos José

    2015-01-01

    This paper proposes a system named AWSCS (Automatic Web Service Composition System) to evaluate different approaches for automatic composition of Web services, based on QoS parameters that are measured at execution time. The AWSCS is a system to implement different approaches for automatic composition of Web services and also to execute the resulting flows from these approaches. Aiming at demonstrating the results of this paper, a scenario was developed in which empirical flows were built to demonstrate the operation of AWSCS, since algorithms for automatic composition are not readily available for testing. The results allow us to study the behaviour of running composite Web services when flows with the same functionality but different problem-solving strategies are compared. Furthermore, we observed that both the amount and the type of load submitted to the running system are important factors in determining which approach to Web service composition achieves the best performance in production.

  5. Virtualization of open-source secure web services to support data exchange in a pediatric critical care research network

    PubMed Central

    Sward, Katherine A; Newth, Christopher JL; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael

    2015-01-01

    Objectives To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Material and Methods Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Results Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data consistently transferred using the data dictionary and 1% needed human curation. Conclusions Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. PMID:25796596

  6. Model-Driven Reengineering Legacy Software Systems to Web Services

    DTIC Science & Technology

    2005-01-01

  7. Framework for ReSTful Web Services in OSGi

    NASA Technical Reports Server (NTRS)

    Shams, Khawaja S.; Norris, Jeffrey S.; Powell, Mark W.; Crockett, Thomas M.; Mittman, David S.; Fox, Jason M.; Joswig, Joseph C.; Wallick, Michael N.; Torres, Recaredo J.; Rabe, Kenneth

    2009-01-01

    Ensemble ReST is a software system that eases the development, deployment, and maintenance of server-side application programs to perform functions that would otherwise be performed by client software. Ensemble ReST takes advantage of the proven disciplines of ReST (Representational State Transfer). ReST leverages the standardized HTTP protocol to enable developers to offer services to a diverse variety of clients, from shell scripts to sophisticated Java application suites.
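
    The ReST idea described above can be sketched with a toy HTTP resource server: any client that speaks HTTP, from a curl-based shell script to a Java suite, can consume the same resource. The example below starts a server on an ephemeral port and fetches one resource; it is illustrative only and not part of Ensemble ReST.

```python
# Toy ReST-style resource server: a named resource exposed over plain HTTP
# as JSON. Illustrative sketch only; unrelated to the Ensemble ReST code.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

RESOURCES = {"/status": {"service": "demo", "state": "ok"}}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = RESOURCES.get(self.path)
        if body is None:
            self.send_error(404)
            return
        payload = json.dumps(body).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Serve on an ephemeral port in a background thread, then fetch the
# resource exactly as a curl-based shell script would.
server = HTTPServer(("localhost", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://localhost:%d/status" % server.server_address[1]
reply = json.loads(urllib.request.urlopen(url).read())
server.shutdown()
```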

  8. Supporting NATO C2-Simulation Experimentation with Scripted Web Services

    DTIC Science & Technology

    2011-06-01

  9. Planning With Incomplete Knowledge for the Composition of Web Services

    DTIC Science & Technology

    2005-08-01

  10. A study of an adaptive replication framework for orchestrated composite web services.

    PubMed

    Mohamed, Marwa F; Elyamany, Hany F; Nassar, Hamed M

    2013-01-01

    Replication is considered one of the most important techniques for improving the Quality of Service (QoS) of published Web services. It has achieved impressive success in managing resource sharing and usage in order to moderate the energy consumed in IT environments. For a robust and successful replication process, attention should be paid to suitable timing as well as to the constraints and capabilities under which the process runs. The replication process is time-consuming, since outsourcing new replicas to other hosts is lengthy. Furthermore, most of the business processes implemented over the Web today are composed of multiple Web services working together in two main styles: orchestration and choreography. Accomplishing replication over such business processes is another challenge due to the complexity and flexibility involved. In this paper, we present an adaptive replication framework for regular and orchestrated composite Web services. The suggested framework includes a number of components for detecting unexpected and undesirable events, such as failure or overloading, that might occur when consuming the originally published Web services. It also includes a replication controller to manage the replication process and select the best host to encapsulate a new replica. In addition, it includes a component for predicting the incoming load in order to decrease the time needed for outsourcing new replicas, greatly enhancing performance. A simulation environment has been created to measure the performance of the suggested framework. The results indicate that adaptive replication with prediction is the best option for enhancing the performance of the replication process in an online business environment.
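
    The load-prediction idea described above can be sketched as follows; the moving-average forecast, capacity figure and headroom factor are invented for illustration and are not the framework's actual components.

```python
# Illustrative sketch of a replication controller that provisions replicas
# ahead of demand: forecast the next-interval load from recent history and
# size the replica pool before the running copies saturate (provisioning a
# new replica is slow). All names and numbers here are assumptions.

def predict_next_load(history, window=3):
    """Forecast the next interval's load as the mean of the last samples."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def replicas_needed(history, capacity_per_replica=100, headroom=0.8):
    """Replicas required to keep predicted load under the headroom limit."""
    predicted = predict_next_load(history)
    target = int(capacity_per_replica * headroom)
    return max(1, -(-int(predicted) // target))  # ceiling division

# Request rate (req/s) ramping up over the last four intervals:
needed = replicas_needed([50, 120, 180, 240])
```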

  11. BioModels.net Web Services, a free and integrated toolkit for computational modelling software.

    PubMed

    Li, Chen; Courtot, Mélanie; Le Novère, Nicolas; Laibe, Camille

    2010-05-01

    Exchanging and sharing scientific results are essential for researchers in the field of computational modelling. BioModels.net defines agreed-upon standards for model curation. A fundamental one, MIRIAM (Minimum Information Requested in the Annotation of Models), standardises the annotation and curation process of quantitative models in biology. To support this standard, MIRIAM Resources maintains a set of standard data types for annotating models, and provides services for manipulating these annotations. Furthermore, BioModels.net creates controlled vocabularies, such as SBO (Systems Biology Ontology), which strictly indexes, defines and links terms used in Systems Biology. Finally, BioModels Database provides a free, centralised, publicly accessible database for storing, searching and retrieving curated and annotated computational models. Each resource provides a web interface to submit, search, retrieve and display its data. In addition, the BioModels.net team provides a set of Web Services which allows the community to programmatically access the resources. A user is then able to perform remote queries, such as retrieving a model and resolving all its MIRIAM Annotations, as well as getting the details about the associated SBO terms. These web services use established standards. Communications rely on SOAP (Simple Object Access Protocol) messages and the available queries are described in a WSDL (Web Services Description Language) file. Several libraries are provided in order to simplify the development of client software. BioModels.net Web Services bring researchers one step closer to simulating and understanding a biological system in its entirety, by allowing them to retrieve biological models into their own tools, combine queries in workflows and efficiently analyse models.

  12. Soil food web properties explain ecosystem services across European land use systems

    PubMed Central

    de Vries, Franciska T.; Thébault, Elisa; Liiri, Mira; Birkhofer, Klaus; Tsiafouli, Maria A.; Bjørnlund, Lisa; Bracht Jørgensen, Helene; Brady, Mark Vincent; Christensen, Søren; de Ruiter, Peter C.; d’Hertefeldt, Tina; Frouz, Jan; Hedlund, Katarina; Hemerik, Lia; Hol, W. H. Gera; Hotes, Stefan; Mortimer, Simon R.; Setälä, Heikki; Sgardelis, Stefanos P.; Uteseny, Karoline; van der Putten, Wim H.; Wolters, Volkmar; Bardgett, Richard D.

    2013-01-01

    Intensive land use reduces the diversity and abundance of many soil biota, with consequences for the processes that they govern and the ecosystem services that these processes underpin. Relationships between soil biota and ecosystem processes have mostly been found in laboratory experiments and rarely are found in the field. Here, we quantified, across four countries of contrasting climatic and soil conditions in Europe, how differences in soil food web composition resulting from land use systems (intensive wheat rotation, extensive rotation, and permanent grassland) influence the functioning of soils and the ecosystem services that they deliver. Intensive wheat rotation consistently reduced the biomass of all components of the soil food web across all countries. Soil food web properties strongly and consistently predicted processes of C and N cycling across land use systems and geographic locations, and they were a better predictor of these processes than land use. Processes of carbon loss increased with soil food web properties that correlated with soil C content, such as earthworm biomass and fungal/bacterial energy channel ratio, and were greatest in permanent grassland. In contrast, processes of N cycling were explained by soil food web properties independent of land use, such as arbuscular mycorrhizal fungi and bacterial channel biomass. Our quantification of the contribution of soil organisms to processes of C and N cycling across land use systems and geographic locations shows that soil biota need to be included in C and N cycling models and highlights the need to map and conserve soil biodiversity across the world. PMID:23940339

  13. Visualization and analysis of 3D gene expression patterns in zebrafish using web services

    NASA Astrophysics Data System (ADS)

    Potikanond, D.; Verbeek, F. J.

    2012-01-01

    The analysis of gene expression patterns plays an important role in developmental biology and molecular genetics. Visualizing both quantitative and spatio-temporal aspects of gene expression patterns together with referenced anatomical structures of a model organism in 3D can help identify how a group of genes is expressed at a certain location at a particular developmental stage of an organism. In this paper, we present an approach to providing online visualization of gene expression data in zebrafish (Danio rerio) within a 3D reconstruction model of zebrafish at different developmental stages. We developed web services that provide programmable access to the 3D reconstruction data and spatio-temporal gene expression data maintained in our local repositories. To demonstrate this work, we developed a web application that uses these web services to retrieve data from our local information systems. The web application also retrieves relevant analyses of microarray gene expression data from an external community resource, i.e. the ArrayExpress Atlas. All the relevant gene expression pattern data are subsequently integrated with the reconstruction data of the zebrafish atlas using ontology-based mapping. The resulting visualization provides quantitative and spatial information on patterns of gene expression in a 3D graphical representation of the zebrafish atlas at a certain developmental stage. To deliver the visualization to the user, we developed a Java-based 3D viewer client that can be integrated in a web interface, allowing the user to visualize the integrated information over the Internet.

  14. Soil food web properties explain ecosystem services across European land use systems.

    PubMed

    de Vries, Franciska T; Thébault, Elisa; Liiri, Mira; Birkhofer, Klaus; Tsiafouli, Maria A; Bjørnlund, Lisa; Bracht Jørgensen, Helene; Brady, Mark Vincent; Christensen, Søren; de Ruiter, Peter C; d'Hertefeldt, Tina; Frouz, Jan; Hedlund, Katarina; Hemerik, Lia; Hol, W H Gera; Hotes, Stefan; Mortimer, Simon R; Setälä, Heikki; Sgardelis, Stefanos P; Uteseny, Karoline; van der Putten, Wim H; Wolters, Volkmar; Bardgett, Richard D

    2013-08-27

    Intensive land use reduces the diversity and abundance of many soil biota, with consequences for the processes that they govern and the ecosystem services that these processes underpin. Relationships between soil biota and ecosystem processes have mostly been found in laboratory experiments and rarely are found in the field. Here, we quantified, across four countries of contrasting climatic and soil conditions in Europe, how differences in soil food web composition resulting from land use systems (intensive wheat rotation, extensive rotation, and permanent grassland) influence the functioning of soils and the ecosystem services that they deliver. Intensive wheat rotation consistently reduced the biomass of all components of the soil food web across all countries. Soil food web properties strongly and consistently predicted processes of C and N cycling across land use systems and geographic locations, and they were a better predictor of these processes than land use. Processes of carbon loss increased with soil food web properties that correlated with soil C content, such as earthworm biomass and fungal/bacterial energy channel ratio, and were greatest in permanent grassland. In contrast, processes of N cycling were explained by soil food web properties independent of land use, such as arbuscular mycorrhizal fungi and bacterial channel biomass. Our quantification of the contribution of soil organisms to processes of C and N cycling across land use systems and geographic locations shows that soil biota need to be included in C and N cycling models and highlights the need to map and conserve soil biodiversity across the world.

  15. Development of a Dynamic Web Mapping Service for Vegetation Productivity Using Earth Observation and in situ Sensors in a Sensor Web Based Approach

    PubMed Central

    Kooistra, Lammert; Bergsma, Aldo; Chuma, Beatus; de Bruin, Sytze

    2009-01-01

    This paper describes the development of a sensor web based approach which combines earth observation and in situ sensor data to derive typical information offered by a dynamic web mapping service (WMS). A prototype has been developed which provides daily maps of vegetation productivity for the Netherlands with a spatial resolution of 250 m. Daily available MODIS surface reflectance products and meteorological parameters obtained through a Sensor Observation Service (SOS) were used as input for a vegetation productivity model. This paper presents the vegetation productivity model, the sensor data sources and the implementation of the automated processing facility. Finally, an evaluation is made of the opportunities and limitations of sensor web based approaches for the development of web services which combine both satellite and in situ sensor sources. PMID:22574019

  16. Experimental evaluation of the impact of packet capturing tools for web services.

    SciTech Connect

    Choe, Yung Ryn; Mohapatra, Prasant; Chuah, Chen-Nee; Chen, Chao-Chih

    2011-10-01

    Network measurement is a discipline that provides the techniques to collect data that are fundamental to many branches of computer science. While many capturing tools and comparisons have been made available in the literature and elsewhere, the impact of these packet capturing tools on existing processes has not been thoroughly studied. While not a concern for collection methods in which dedicated servers are used, many usage scenarios of packet capturing now require the capturing tool to run concurrently with operational processes. In this work we perform experimental evaluations of the performance impact that packet capturing processes have on web-based services; in particular, we observe the impact on web servers. We find that packet capturing processes indeed impact the performance of web servers, but on a multi-core system the impact varies depending on whether the packet capturing and web hosting processes are co-located or not. In addition, the architecture and behavior of the web server and process scheduling are coupled with the behavior of the packet capturing process, which in turn also affects the web server's performance.

  17. Development of Semantic Web - Markup Languages, Web Services, Rules, Explanation, Querying, Proof and Reasoning

    DTIC Science & Technology

    2008-07-01

  18. Automating DAML-S Web Services Composition Using SHOP2

    DTIC Science & Technology

    2006-01-01

  19. Composition of web services using Markov decision processes and dynamic programming.

    PubMed

    Uc-Cetina, Víctor; Moo-Mena, Francisco; Hernandez-Ucan, Rafael

    2015-01-01

    We propose a Markov decision process model for solving the Web service composition (WSC) problem. Iterative policy evaluation, value iteration, and policy iteration algorithms are used to experimentally validate our approach, with artificial and real data. The experimental results show the reliability of the model and the methods employed, with policy iteration being the best one in terms of the minimum number of iterations needed to estimate an optimal policy, with the highest Quality of Service attributes. Our experimental work shows how the solution of a WSC problem involving a set of 100,000 individual Web services, where a valid composition requires the selection of 1,000 services from the available set, can be computed in the worst case in less than 200 seconds, using an Intel Core i5 computer with 6 GB RAM. Moreover, a real WSC problem involving only 7 individual Web services requires less than 0.08 seconds, using the same computational power. Finally, a comparison with two popular reinforcement learning algorithms, Sarsa and Q-learning, shows that these algorithms require one to two orders of magnitude more time than policy iteration, iterative policy evaluation, and value iteration to handle WSC problems of the same complexity.

  20. Composition of Web Services Using Markov Decision Processes and Dynamic Programming

    PubMed Central

    Uc-Cetina, Víctor; Moo-Mena, Francisco; Hernandez-Ucan, Rafael

    2015-01-01

    We propose a Markov decision process model for solving the Web service composition (WSC) problem. Iterative policy evaluation, value iteration, and policy iteration algorithms are used to experimentally validate our approach, with artificial and real data. The experimental results show the reliability of the model and the methods employed, with policy iteration being the best one in terms of the minimum number of iterations needed to estimate an optimal policy, with the highest Quality of Service attributes. Our experimental work shows how the solution of a WSC problem involving a set of 100,000 individual Web services, where a valid composition requires the selection of 1,000 services from the available set, can be computed in the worst case in less than 200 seconds, using an Intel Core i5 computer with 6 GB RAM. Moreover, a real WSC problem involving only 7 individual Web services requires less than 0.08 seconds, using the same computational power. Finally, a comparison with two popular reinforcement learning algorithms, Sarsa and Q-learning, shows that these algorithms require one to two orders of magnitude more time than policy iteration, iterative policy evaluation, and value iteration to handle WSC problems of the same complexity. PMID:25874247
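
    A minimal sketch of the approach: states are composition stages, actions are candidate services, rewards stand in for QoS utilities, and value iteration (one of the dynamic programming methods the paper evaluates) extracts the best service per stage. The tiny model and scores below are invented for illustration, not taken from the paper.

```python
# Toy MDP for service composition: a two-stage workflow with two candidate
# services per stage. Rewards are hypothetical QoS utilities.

def value_iteration(states, actions, transition, reward, gamma=0.95, eps=1e-6):
    """Deterministic-transition value iteration; returns values and the
    greedy policy (best candidate service per composition stage)."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            if not actions[s]:
                continue  # terminal state
            best = max(reward[s][a] + gamma * V[transition[s][a]] for a in actions[s])
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            break
    policy = {
        s: max(actions[s], key=lambda a: reward[s][a] + gamma * V[transition[s][a]])
        for s in states if actions[s]
    }
    return V, policy

states = ["start", "mid", "done"]
actions = {"start": ["svcA", "svcB"], "mid": ["svcC", "svcD"], "done": []}
transition = {"start": {"svcA": "mid", "svcB": "mid"},
              "mid": {"svcC": "done", "svcD": "done"}}
reward = {"start": {"svcA": 0.9, "svcB": 0.6}, "mid": {"svcC": 0.4, "svcD": 0.7}}
V, policy = value_iteration(states, actions, transition, reward)
```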

  1. Thomson Scientific's expanding Web of Knowledge: beyond citation databases and current awareness services.

    PubMed

    London, Sue; Brahmi, Frances A

    2005-01-01

    As end-user demand for easy access to electronic full text continues to climb, an increasing number of information providers are combining that access with their other products and services, making navigation of their Web sites more daunting than ever for librarians seeking information on a given product or service. One such provider of a complex array of products and services is Thomson Scientific. This paper looks at some of the many products and tools available from two of Thomson Scientific's businesses, Thomson ISI and Thomson ResearchSoft. Among the items of most interest to health sciences and veterinary librarians and their users are the variety of databases available via the ISI Web of Knowledge platform and the information management products available from ResearchSoft.

  2. Business Models of E-Government: Research on Dynamic E-Government Based on Web Services

    NASA Astrophysics Data System (ADS)

    Li, Yan; Yang, Jiumin

    Government transcends all sectors in a society. It provides not only the legal, political and economic infrastructure to support other sectors, but also exerts significant influence on the social factors that contribute to their development. With its maturing technologies and management, e-government will eventually enter the era of 'one-stop' services. Among others, the technology of Web services is the major contributor to this achievement. Web services provide a new kind of standards-based software technology, letting programmers combine existing computer systems in new ways over the Internet, within one business or across many, and thereby bring about profound and far-reaching impacts on e-government. This paper introduces the business models of e-government, the architecture of dynamic e-government and its key technologies. Finally, the future prospects of dynamic e-government are briefly discussed.

  3. Development and process evaluation of a web-based responsible beverage service training program

    PubMed Central

    2012-01-01

    Background Responsible beverage service (RBS) training designed to improve the appropriate service of alcohol in commercial establishments is typically delivered in workshops. Recently, Web-based RBS training programs have emerged. This report describes the formative development and subsequent design of an innovative Web-delivered RBS program, and evaluation of the impact of the program on servers’ knowledge, attitudes, and self-efficacy. Methods Formative procedures using focus groups and usability testing were used to develop a Web-based RBS training program. Professional alcohol servers (N = 112) who worked as servers and/or managers in alcohol service settings were recruited to participate. A pre-post assessment design was used to assess changes associated with using the program. Results Participants who used the program showed significant improvements in their RBS knowledge, attitudes, and self-efficacy. Conclusions Although the current study did not directly observe and determine impact of the intervention on server behaviors, it demonstrated that the development process incorporating input from a multidisciplinary team in conjunction with feedback from end-users resulted in the creation of a Web-based RBS program that was well-received by servers and that changed relevant knowledge, attitudes, and self-efficacy. The results also help to establish a needed evidence base in support of the use of online RBS training, which has been afforded little research attention. PMID:22999419

  4. MALINA: a web service for visual analytics of human gut microbiota whole-genome metagenomic reads.

    PubMed

    Tyakht, Alexander V; Popenko, Anna S; Belenikin, Maxim S; Altukhov, Ilya A; Pavlenko, Alexander V; Kostryukova, Elena S; Selezneva, Oksana V; Larin, Andrei K; Karpova, Irina Y; Alexeev, Dmitry G

    2012-12-07

    MALINA is a web service for bioinformatic analysis of whole-genome metagenomic data obtained from human gut microbiota sequencing. As input data, it accepts metagenomic reads from various sequencing technologies, including long reads (such as Sanger and 454 sequencing) and next-generation reads (including SOLiD and Illumina). To the authors' knowledge, it is the first metagenomic web service capable of processing SOLiD color-space reads. The web service allows phylogenetic and functional profiling of metagenomic samples using the coverage depth resulting from alignment of the reads to the catalogue of reference sequences, which is built into the pipeline and contains prevalent microbial genomes and genes of the human gut microbiota. The resulting metagenomic composition vectors are processed by the statistical analysis and visualization module, which contains methods for clustering, dimension reduction and group comparison. Additionally, the MALINA database includes vectors of bacterial and functional composition for human gut microbiota samples from a large number of existing studies, allowing their comparative analysis together with user samples, namely datasets from the Russian Metagenome project, MetaHIT and the Human Microbiome Project (downloaded from http://hmpdacc.org). MALINA is made freely available on the web at http://malina.metagenome.ru. The website is implemented in JavaScript (using Ext JS), Microsoft .NET Framework, MS SQL and Python, with all major browsers supported.

  5. Using Transactional Distance Theory to Redesign an Online Mathematics Education Course for Pre-Service Primary Teachers

    ERIC Educational Resources Information Center

    Larkin, Kevin; Jamieson-Proctor, Romina

    2015-01-01

    This paper examines the impact of a series of design changes to an online mathematics education course in terms of transactional distance between learner and teachers, pre-service education students' attitudes towards mathematics, and their development of mathematical pedagogical knowledge. Transactional distance theory (TDT) was utilised to…

  6. Analysing the primacy of distance in the utilization of health services in the Ahafo-Ano South district, Ghana.

    PubMed

    Buor, Daniel

    2003-01-01

    Although the distance factor has been identified as key in the utilization of health services in rural areas of developing countries, it has been analysed without recourse to the related factors of travel time and transport cost. Also, the influence of distance on vulnerable groups' utilization has not been an object of survey by researchers. This paper addresses the impact of distance on utilization, and how distance compares with the related factors of travel time and transport cost in the utilization of health services in the Ahafo-Ano South (rural) district in Ghana. The study, a cross-sectional survey, also identifies the position of distance among other important factors of utilization. A sample of 400, drawn through a systematic random technique, was used for the survey. Data were analysed using the regression model and some graphic techniques. The main instruments used in data collection were formal (face-to-face) interviews and a questionnaire. The survey finds that distance is the most important factor influencing the utilization of health services in the Ahafo-Ano South district. Other key factors are income, service cost and education. The effect of travel time on utilization mirrors that of distance. Recommendations to reduce distances travelled, improve formal education and reduce poverty have been made.
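
    The kind of comparison the study performs, relating utilization to distance, travel time and transport cost, can be sketched with a Pearson correlation over household records; the data points below are invented for illustration and do not reproduce the study's data.

```python
# Illustrative sketch: how strongly do distance, travel time and transport
# cost each correlate with service utilization? The records are synthetic.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical household records: (distance km, travel min, cost, visits/yr)
records = [(1, 10, 0.5, 9), (3, 25, 1.0, 7), (6, 45, 2.0, 5),
           (10, 70, 3.5, 3), (15, 110, 5.0, 2), (20, 150, 6.5, 1)]
distance, time, cost, visits = (list(col) for col in zip(*records))

for name, predictor in [("distance", distance), ("travel time", time), ("cost", cost)]:
    print("%s: r = %.2f" % (name, pearson(predictor, visits)))
```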

  7. OneGeology-Europe: architecture, portal and web services to provide a European geological map

    NASA Astrophysics Data System (ADS)

    Tellez-Arenas, Agnès; Serrano, Jean-Jacques; Tertre, François; Laxton, John

    2010-05-01

    OneGeology-Europe is a large, ambitious project to make geological spatial data more widely known and accessible. The OneGeology-Europe project develops an integrated system of data to create and make accessible for the first time through the internet the geological map of the whole of Europe. The architecture implemented by the project is web services oriented, based on the OGC standards: the geological map is not a centralized database but is composed of several web services, each of them hosted by a European country involved in the project. Since geological data are compiled differently from country to country, they are difficult to share. OneGeology-Europe, while providing more detailed and complete information, will foster, even beyond the geological community, an easier exchange of data within Europe and globally. This implies important work on the harmonization of the data, both the model and the content. OneGeology-Europe is characterised by the high technological capacity of the EU Member States, and has the final goal of harmonising European geological survey data according to common standards. As a direct consequence Europe will make a further step in terms of innovation and information dissemination, continuing to play a world leading role in the development of geosciences information. The scope of the common harmonized data model was defined primarily by the requirements of the geological map of Europe, but in addition users were consulted and the requirements of both INSPIRE and 'high-resolution' geological maps were considered. The data model is based on GeoSciML, developed since 2006 by a group of Geological Surveys. The data providers involved in the project implemented a new component that allows the web services to deliver the geological map expressed in GeoSciML. In order to capture the information describing the geological units of the map of Europe the scope of the data model needs to include lithology; age; genesis and

  8. Research on sudden environmental pollution public service platform construction based on WebGIS

    NASA Astrophysics Data System (ADS)

    Bi, T. P.; Gao, D. Y.; Zhong, X. Y.

    2016-08-01

    In order to realize social sharing and service of emergency-response information for sudden pollution accidents, so that the public can access risk source information, dangerous goods control technology services and so on, SQL Server and ArcSDE are used to establish a spatial database storing all kinds of information, including risk sources, hazardous chemicals and handling methods in case of accidents. Combined with Chinese atmospheric environmental assessment standards, the SCREEN3 atmospheric dispersion model and a one-dimensional liquid diffusion model are implemented to support queries of related information and display of the diffusion effect under a B/S structure. Based on WebGIS technology, the C#.NET language is used to develop the sudden environmental pollution public service platform. As a result, the public service platform can make risk assessments and provide the best emergency processing services.
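    Screening dispersion models such as SCREEN3 estimate worst-case ground-level concentrations from a steady-state Gaussian plume. As an illustration of the kind of calculation such a platform runs (this is not the platform's code; the function name is hypothetical, and the dispersion coefficients are supplied by the caller rather than derived from stability classes as SCREEN3 does):

```python
import math

def ground_level_conc(q, u, y, sigma_y, sigma_z, h):
    """Ground-level (z = 0) concentration of a steady-state Gaussian plume
    with total reflection at the ground.

    q        emission rate (g/s)
    u        wind speed at stack height (m/s)
    y        crosswind distance from the plume centerline (m)
    sigma_y, sigma_z
             dispersion coefficients at the receptor's downwind
             distance (m); SCREEN3 derives these from stability-class curves
    h        effective stack height (m)
    Returns concentration in g/m^3.
    """
    crosswind = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = 2 * math.exp(-h ** 2 / (2 * sigma_z ** 2))  # image-source reflection term at z = 0
    return q / (2 * math.pi * u * sigma_y * sigma_z) * crosswind * vertical
```

    Concentration falls off with crosswind distance and scales inversely with wind speed, which is why screening tools report the low-wind, stable-atmosphere case as the worst case.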

  9. HydroViewer: Utilizing Web-Based Hydrologic Data And Analytical Services

    NASA Astrophysics Data System (ADS)

    Ye, Z.; Djokic, D.; Armstrong, L.

    2011-12-01

    To conduct a hydrologic study in an area, the hydrologist needs to define the area of interest, collect the spatial, hydrological, and meteorological data for the area, and finally perform the desired analysis. Service-oriented architecture holds promise that these activities can be done using distributed services combined in a single lightweight application. Hydrologic time series data are collected, stored, and served by multiple agencies in different formats and are often difficult to find, acquire, and mobilize for analysis. CUAHSI's WaterOneFlow web service API provides functions for querying and collecting the temporal data in a consistent manner. Many agencies and universities publish their time series data using WaterOneFlow services and thus make them available to a broad range of users. ArcGIS Server allows publishing of spatial data and mapping services and is widely used in government, academia, and industry. ArcGIS Server can also be used to serve analytical services. By combining spatial services provided by ArcGIS Server and temporal data provided through WaterOneFlow services, it is now possible to create web applications to explore the hydrologic time series data available in a given spatial area served by multiple agencies and to perform analysis on them. A web application, HydroViewer, was developed using the ArcGIS Silverlight API to allow users to mobilize ArcGIS Server and WaterOneFlow services in an integrated fashion. It performs the following tasks: (1) delineating the watershed for a user-specified point of interest using an Arc Hydro based watershed delineation service, (2) exploring data collection sites and data collected and served by different agencies for a given spatial area (watershed or viewing spatial extent) and time domain, (3) viewing/graphing the data collected by these sites, (4) collecting the metadata for the data variables in the form of a data cart, (5) downloading both spatial and time series data to create an Arc Hydro

  10. Technology Development, Implementation and Assessment: K-16 Pre-Service, In-Service and Distance Learning Initiatives

    NASA Technical Reports Server (NTRS)

    Williams, William B., Jr.

    1999-01-01

    The technologies associated with distance learning are evolving rapidly, giving educators a potential tool for enhancing the educational experiences of large numbers of students simultaneously. This enhancement, in order to be effective, must take into account the various agendas of teachers, administrators, state systems, and of course students. It must also make use of the latest research on effective pedagogy. This combination, effective pedagogy and robust information technology, is a powerful vehicle for communicating to a large audience of school children the excitement of mathematics and science--an excitement that for the most part is now well hidden. This project, "Technology Development, Implementation and Assessment," proposed to bring to bear on the education of learners in grades 3-8 in science and mathematics both advances in information technology and effective pedagogy. Specifically, the project developed components of the NASA CONNECT video series--problem-based learning modules that focus on the scientific method and that incorporate problem-based learning scenarios tied to national mathematics and science standards. These videos serve two purposes: they engage students in the excitement of hands-on learning, and they model for the teachers of these students the problem-based learning practices that are proving to be excellent ways to teach science and mathematics to school students. Another component of NASA CONNECT is the accompanying web site.

  11. EnviroAtlas - Ecosystem Services Market-Based Programs Web Service, U.S., 2016, Forest Trends' Ecosystem Marketplace

    EPA Pesticide Factsheets

    This EnviroAtlas web service contains layers depicting market-based programs and projects addressing ecosystem services protection in the United States. Layers include data collected via surveys and desk research conducted by Forest Trends' Ecosystem Marketplace from 2008 to 2016 on biodiversity (i.e., imperiled species/habitats; wetlands and streams), carbon, and water markets and enabling conditions that facilitate, directly or indirectly, market-based approaches to protecting and investing in those ecosystem services. This dataset was produced by Forest Trends' Ecosystem Marketplace for EnviroAtlas in order to support public access to and use of information related to environmental markets. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

  12. Increasing access to terrestrial ecology and remote sensing (MODIS) data through Web services and visualization tools

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S.; Cook, R. B.; Wei, Y.

    2012-12-01

    In recent years, user access to data and information has increasingly been handled through tools, services, and applications. Standards-based services have facilitated this development. These service-based methods of accessing data have boosted the use of data in increasingly complex ways. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) has taken the approach of service-based access to data and visualization for the distribution and visualization of its terrestrial ecology data, including MODIS (Moderate Resolution Imaging Spectroradiometer) remote sensing data products. The MODIS data products are highly useful for field research. The spectral, spatial and temporal characteristics of MODIS products have made them an important data source for analyzing key science questions relating to Earth system processes at multiple spatial and temporal scales. However, MODIS data volume and the complexity of the data format make it less usable in some cases. To solve this usability issue, the ORNL DAAC has developed a system that prepares and distributes subsets of selected MODIS land products at a scale and in a format useful for field researchers. Web and Web service tools provide MODIS subsets in comma-delimited text format and in GIS-compatible GeoTIFF format. Users can download and visualize MODIS subsets for a set of pre-defined locations, order MODIS subsets for any land location, or automate the process of subset extraction using a SOAP-based Web service. The MODIS tools and services can be extended to support the large volume of data that would be produced by the various decadal survey missions. http://daac.ornl.gov/MODIS . The ORNL DAAC has also created a Web-based Spatial Data Access Tool (SDAT) that enables users to browse, visualize, and download a wide variety of geospatial data in various user-selected spatial/temporal extents, formats, and projections. SDAT is based on Open Geospatial Consortium (OGC) Web service standards that allow users to

  13. The Taverna workflow suite: designing and executing workflows of Web Services on the desktop, web or in the cloud.

    PubMed

    Wolstencroft, Katherine; Haines, Robert; Fellows, Donal; Williams, Alan; Withers, David; Owen, Stuart; Soiland-Reyes, Stian; Dunlop, Ian; Nenadic, Aleksandra; Fisher, Paul; Bhagat, Jiten; Belhajjame, Khalid; Bacall, Finn; Hardisty, Alex; Nieva de la Hidalga, Abraham; Balcazar Vargas, Maria P; Sufi, Shoaib; Goble, Carole

    2013-07-01

    The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud environments), using the Taverna Server. In bioinformatics, Taverna workflows are typically used in the areas of high-throughput omics analyses (for example, proteomics or transcriptomics), or for evidence gathering methods involving text mining or data mining. Through Taverna, scientists have access to several thousand different tools and resources that are freely available from a large range of life science institutions. Once constructed, the workflows are reusable, executable bioinformatics protocols that can be shared, reused and repurposed. A repository of public workflows is available at http://www.myexperiment.org. This article provides an update to the Taverna tool suite, highlighting new features and developments in the workbench and the Taverna Server.

  14. Share and enjoy: anatomical models database--generating and sharing cardiovascular model data using web services.

    PubMed

    Kerfoot, Eric; Lamata, Pablo; Niederer, Steve; Hose, Rod; Spaan, Jos; Smith, Nic

    2013-11-01

    Sharing data between scientists and with clinicians in cardiac research has been facilitated significantly by the use of web technologies. The potential of this technology has meant that information sharing has been routinely promoted through databases that have encouraged stakeholder participation in communities around these services. In this paper we discuss the Anatomical Model Database (AMDB) (Gianni et al. Functional imaging and modeling of the heart. Springer, Heidelberg, 2009; Gianni et al. Phil Trans Ser A Math Phys Eng Sci 368:3039-3056, 2010), which both facilitates a database-centric approach to collaboration and extends this framework with new capabilities for creating new mesh data. AMDB currently stores cardiac geometric models described in Gianni et al. (Functional imaging and modelling of the heart. Springer, Heidelberg, 2009), a number of additional cardiac models describing geometry and functional properties, and most recently models generated using a web service. The functional models represent data from simulations in geometric form, such as electrophysiology or mechanics, many of which are present in AMDB as part of a benchmark study. Finally, the heartgen service has been added for producing left- or bi-ventricle models derived from binary image data using the methods described in Lamata et al. (Med Image Anal 15:801-813, 2011). The results can optionally be hosted on AMDB alongside other community-provided anatomical models. AMDB is, therefore, a unique database storing geometric data (rather than abstract models or image data) combined with a powerful web service for generating new geometric models.

  15. Gbrowse Moby: a Web-based browser for BioMoby Services

    PubMed Central

    Wilkinson, Mark

    2006-01-01

    Background The BioMoby project aims to identify and deploy standards and conventions that aid in the discovery, execution, and pipelining of distributed bioinformatics Web Services. As of August, 2006, approximately 680 bioinformatics resources were available through the BioMoby interoperability platform. There are a variety of clients that can interact with BioMoby-style services. Here we describe a Web-based browser-style client – Gbrowse Moby – that allows users to discover and "surf" from one bioinformatics service to the next using a semantically-aided browsing interface. Results Gbrowse Moby is a low-throughput, exploratory tool specifically aimed at non-informaticians. It provides a straightforward, minimal interface that enables a researcher to query the BioMoby Central web service registry for data retrieval or analytical tools of interest, and then select and execute their chosen tool with a single mouse-click. The data is preserved at each step, thus allowing the researcher to manually "click" the data from one service to the next, with the Gbrowse Moby application managing all data formatting and interface interpretation on their behalf. The path of manual exploration is preserved and can be downloaded for import into automated, high-throughput tools such as Taverna. Gbrowse Moby also includes a robust data rendering system to ensure that all new data-types that appear in the BioMoby registry can be properly displayed in the Web interface. Conclusion Gbrowse Moby is a robust, yet facile entry point for both newcomers to the BioMoby interoperability project who wish to manually explore what is known about their data of interest, as well as experienced users who wish to observe the functionality of their analytical workflows prior to running them in a high-throughput environment. PMID:17147784

  16. NASA's ECS Data Pool: OGC Compliant Web Services for Every User and Every Pocket

    NASA Astrophysics Data System (ADS)

    Bories, C.; Marley, S. R.

    2005-12-01

    The NASA Earth Observing System (EOS) supports operations for several satellites, including Landsat 7, Terra, and Aqua. ECS (EOSDIS Core System) is a vast archival and distribution system and includes several Distributed Active Archive Centers (DAACs) located around the United States, whose combined holdings now exceed 3.5 petabytes, with a daily distribution of 3.5 TB. In response to evolutionary changes in technology, ECS has been moving a substantial part of its distribution capability away from near-line tape archives to large on-line disk caches that hold several tens of terabytes of high-value data, allowing users to obtain products via electronic download using web or FTP clients. Although these basic access services are valuable, the need for more advanced services such as data reformatting and subsetting was seen as key to the interoperability and broader adoption of NASA's data with current Decision Support and Geographical Information Systems. Therefore, in 2003, Raytheon was funded to initiate the development of an in-house demonstration prototype that integrated OGC web services (Mapping and Coverage) with reformatting capability (HDF-EOS to GeoTIFF). The experience obtained from that first prototype led to the formulation of a generalized interoperable architecture, which incorporated a catalog service. Two operational prototypes are now deployed for NASA. The first, utilizing IONIC Software's OGC services, is designed to serve large data volumes (up to 50,000 pieces of inventory of 10 MODIS data types) and to offer faster access performance. The second prototype was developed from a combination of open-source web services and freeware, hosted on commodity platforms (Linux-based PCs), with the main objective of providing low-entry-cost services for potential new data providers; for example, a small university research team that could find it difficult to afford the elevated cost of COTS licenses or

  17. A Mediator-Based Approach to Resolving Interface Heterogeneity of Web Services

    NASA Astrophysics Data System (ADS)

    Leitner, Philipp; Rosenberg, Florian; Michlmayr, Anton; Huber, Andreas; Dustdar, Schahram

    In theory, service-oriented architectures are based on the idea of increasing flexibility in the selection of internal and external business partners using loosely coupled services. In practice, however, this flexibility is limited by the fact that partners need not only to provide the same service, but to do so via virtually the same interface in order to actually be easily interchangeable. Invocation-level mediation may be used to overcome this issue: by using mediation, interface differences can be resolved transparently at runtime. In this chapter we discuss the basic ideas of mediation, with a focus on interface-level mediation. We show how interface mediation is integrated into our dynamic Web service invocation framework DAIOS, and present three different mediation strategies: one based on structural message similarity, one based on semantically annotated WSDL, and one embedded into the VRESCo SOA runtime, a larger research project with explicit support for service mediation.

  18. ECHO - Leveraging Web Service Technologies to support a net-centric Earth Science Enterprise

    NASA Astrophysics Data System (ADS)

    Burnett, M. T.; Wichmann, K.

    2005-12-01

    Today's world of Earth science has several challenges beyond that of simply increasing our understanding of planet Earth. Fundamentally, innovative research is being conducted in a widely distributed and dynamic environment, producing an ever-growing set of resources (data, services and clients). Beyond that, the need for and value of integrating or interoperating these resources is growing as the resource providers become increasingly diverse. In order to support the emerging 21st-century science model, a more mature, fluid and extensible cyber-infrastructure must emerge. A well-coordinated use of web service technologies (XML, WSDL, SOAP, UDDI) can form a foundational part of that enabling fabric. ECHO, a solution developed by NASA, provides a set of interoperable registries that supports this enterprise fabric. ECHO comprises a set of infrastructure services that allow the publication, discovery, understanding and access of earth science resources, all based on a web services model. ECHO services support both data and service registries. These registries are interoperable and based on industry and community standards.

  19. Service Quality and Students' Satisfaction with the Professional Teacher Development Programmes by Distance Mode in a South African University

    ERIC Educational Resources Information Center

    Oduaran, A. B.

    2011-01-01

    This article reports on the relationship between seven factors that described dimensions of education service quality and overall service quality on one hand, and students' satisfaction with the professional teacher development programmes by distance mode in a South African University on the other. We sought to find out whether students enrolled…

  20. DL-sQUAL: A Multiple-Item Scale for Measuring Service Quality of Online Distance Learning Programs

    ERIC Educational Resources Information Center

    Shaik, Naj; Lowe, Sue; Pinegar, Kem

    2006-01-01

    Education is a service with multiplicity of student interactions over time and across multiple touch points. Quality teaching needs to be supplemented by consistent quality supporting services for programs to succeed under the competitive distance learning landscape. ServQual and e-SQ scales have been proposed for measuring quality of traditional…

  1. 47 CFR 54.625 - Support for services beyond the maximum supported distance for rural health care providers.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... provider to the farthest point on the jurisdictional boundary of the city in that state with the largest... 47 Telecommunication 3 2011-10-01 2011-10-01 false Support for services beyond the maximum... Support for Health Care Providers § 54.625 Support for services beyond the maximum supported distance...

  2. 47 CFR 54.625 - Support for services beyond the maximum supported distance for rural health care providers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... provider to the farthest point on the jurisdictional boundary of the city in that state with the largest... 47 Telecommunication 3 2010-10-01 2010-10-01 false Support for services beyond the maximum... Support for Health Care Providers § 54.625 Support for services beyond the maximum supported distance...

  3. A Web Service Model for Providing Weather Information through Sensor Networks Using a Fermat Point Based Data Forwarding Scheme

    NASA Astrophysics Data System (ADS)

    Ghosh, Kaushik; Rawat, Manoj; Das, Pradip K.

    2010-11-01

    Web services providing weather information are not new. Existing web services in this field can provide more precise information if the data concerned is collected in a distributed fashion using a sensor network. The longer the lifetime of the sensor network, the longer the service can be provided without interruption. In this paper we propose a web service for providing weather information with a sensor network as its backbone. We use a Fermat point based forwarding technique to minimize the energy consumption of the sensor network, which helps the web service run uninterrupted for a longer duration as the lifetime of the network is prolonged.
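    The Fermat point of a set of node locations is the point minimizing the sum of distances to them, so relaying aggregated readings through it reduces total transmission distance. The abstract gives no code; as a sketch under that definition only (not the paper's scheme), Weiszfeld's iteration approximates the Fermat point (geometric median):

```python
import math

def fermat_point(points, iters=200):
    """Approximate the Fermat point (geometric median) of 2-D points
    with Weiszfeld's iterative reweighting, starting from the centroid."""
    x = sum(p[0] for p in points) / len(points)
    y = sum(p[1] for p in points) / len(points)
    for _ in range(iters):
        num_x = num_y = den = 0.0
        for px, py in points:
            d = math.hypot(x - px, y - py)
            if d < 1e-12:            # iterate landed on a vertex: it is optimal or a known degenerate case
                return (px, py)
            w = 1.0 / d              # inverse-distance weight
            num_x += w * px
            num_y += w * py
            den += w
        x, y = num_x / den, num_y / den
    return (x, y)
```

    For a triangle with all angles below 120 degrees, the result is the classical Fermat point; routing source-to-sink traffic through it minimizes the summed hop distance, which is what the energy argument in the abstract relies on.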

  4. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    PubMed Central

    2011-01-01

    Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods The data used in this study are based on Tuberculosis (TB) cases registered in the city of Barcelona during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision-making process for health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities for handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This
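    A density surface of the kind these geoprocessing services return can be computed with a Gaussian kernel density estimate over the geocoded case locations. A minimal sketch (not the paper's implementation; the grid layout, the fixed bandwidth, and the function name are illustrative assumptions):

```python
import math

def density_grid(points, xmin, ymin, xmax, ymax, nx, ny, bandwidth):
    """Gaussian kernel density surface over a regular nx-by-ny grid
    (returned as a list of rows, south to north), computed from (x, y)
    case locations -- the kind of raster a density-map service renders."""
    dx = (xmax - xmin) / nx
    dy = (ymax - ymin) / ny
    norm = 2 * math.pi * bandwidth ** 2 * max(len(points), 1)
    grid = []
    for j in range(ny):
        cy = ymin + (j + 0.5) * dy          # cell-center y
        row = []
        for i in range(nx):
            cx = xmin + (i + 0.5) * dx      # cell-center x
            z = sum(math.exp(-((cx - px) ** 2 + (cy - py) ** 2)
                             / (2 * bandwidth ** 2))
                    for px, py in points)
            row.append(z / norm)
        grid.append(row)
    return grid
```

    A REST geoprocessing endpoint would typically run this server-side against the backend database and return the grid styled as a heat-map tile or image.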

  5. Web services and model-data comparison for the Functional Test Platform

    NASA Astrophysics Data System (ADS)

    Krassovski, Misha; Wang, Dali

    2015-04-01

    The realistic representation of key biogeophysical and biogeochemical functions is fundamental to process-based ecosystem models. A Functional Test Platform is designed to create direct linkages between site measurements and process-based ecosystem models within the Community Earth System Model (CESM). The platform consists of three major parts: 1) interactive user interfaces, 2) functional test models and 3) observational datasets. The purpose of the observational datasets is to provide an interactive search and visualization capability for direct model-data comparison. The presentation shows how web services can be used to feed model-data comparison using the AmeriFlux data collection provided by the Carbon Dioxide Information Analysis Center (CDIAC) and how they are coupled with the Functional Test Platform for the Community Land Model.
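    Direct model-data comparison of this kind typically reduces to paired statistics between a simulated series and the observed flux series, skipping observation gaps. A minimal sketch (a hypothetical helper; the platform's actual metrics are not specified in the abstract):

```python
import math

def model_data_stats(model, obs):
    """Paired model-observation comparison: mean bias and RMSE over the
    time steps where both series have values (None marks a gap in the
    observations, as is common in flux-tower records)."""
    pairs = [(m, o) for m, o in zip(model, obs) if o is not None]
    n = len(pairs)
    bias = sum(m - o for m, o in pairs) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in pairs) / n)
    return bias, rmse
```

    Served behind a web service, the observed series would arrive from the AmeriFlux collection and the modeled series from a functional-test run, with the two aligned on timestamps before this step.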

  6. Decentralized Orchestration of Composite Ogc Web Processing Services in the Cloud

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Shea, G. Y. K.; Cao, J.

    2016-09-01

    Current web-based GIS or RS applications generally rely on a centralized structure, which has inherent drawbacks such as single points of failure, network congestion, and data inconsistency. These inherent disadvantages of traditional GISs need to be addressed for new applications on the Internet or Web. Decentralized orchestration offers performance improvements in terms of increased throughput, greater scalability and lower response time. This paper investigates build-time and runtime issues related to decentralized orchestration of composite geospatial processing services based on the OGC WPS standard specification. A case study of dust storm detection was carried out to evaluate the proposed method, and the experimental results indicate that the method is effective, producing high-quality solutions at a low communication cost for the geospatial processing service composition problem.

  7. TnpPred: A Web Service for the Robust Prediction of Prokaryotic Transposases

    PubMed Central

    Riadi, Gonzalo; Medina-Moenne, Cristobal; Holmes, David S.

    2012-01-01

    Transposases (Tnps) are enzymes that participate in the movement of insertion sequences (ISs) within and between genomes. Genes that encode Tnps are amongst the most abundant and widely distributed genes in nature. However, they are difficult to predict bioinformatically, and given the increasing availability of prokaryotic genomes and metagenomes, it is imperative to develop rapid, high-quality automatic annotation of ISs. This need prompted us to develop a web service, termed TnpPred, for Tnp discovery. It provides better sensitivity and specificity for Tnp predictions than currently available programs, as determined by ROC analysis. TnpPred should be useful for improving genome annotation. The TnpPred web service is freely available for noncommercial use. PMID:23251097
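    The ROC comparison mentioned above reduces, for any scoring classifier, to the probability that a randomly chosen true transposase scores above a randomly chosen non-transposase. A generic AUC sketch of that equivalence (illustrative only, not TnpPred's evaluation code):

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs where the positive
    scores higher, with ties counting half."""
    pos = [s for s, l in zip(scores, labels) if l]
    neg = [s for s, l in zip(scores, labels) if not l]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    An AUC of 1.0 means perfect separation of the two classes by the score; 0.5 means the score is uninformative.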

  8. Pitfalls in Persuasion: How Do Users Experience Persuasive Techniques in a Web Service?

    NASA Astrophysics Data System (ADS)

    Segerståhl, Katarina; Kotro, Tanja; Väänänen-Vainio-Mattila, Kaisa

    Persuasive technologies are designed by utilizing a variety of interactive techniques that are believed to promote target behaviors. This paper describes a field study in which the aim was to discover possible pitfalls of persuasion, i.e., situations in which persuasive techniques do not function as expected. The study investigated persuasive functionality of a web service targeting weight loss. A qualitative online questionnaire was distributed through the web service and a total of 291 responses were extracted for interpretative analysis. The Persuasive Systems Design model (PSD) was used for supporting systematic analysis of persuasive functionality. Pitfalls were identified through situations that evoked negative user experiences. The primary pitfalls discovered were associated with manual logging of eating and exercise behaviors, appropriateness of suggestions and source credibility issues related to social facilitation. These pitfalls, when recognized, can be addressed in design by applying functional and facilitative persuasive techniques in meaningful combinations.

  9. CT-Finder: A Web Service for CRISPR Optimal Target Prediction and Visualization

    PubMed Central

    Zhu, Houxiang; Misel, Lauren; Graham, Mitchell; Robinson, Michael L.; Liang, Chun

    2016-01-01

    The CRISPR system holds much promise for successful genome engineering, but therapeutic, industrial, and research applications will place high demand on improving the specificity and efficiency of this tool. CT-Finder (http://bioinfolab.miamioh.edu/ct-finder) is a web service to help users design guide RNAs (gRNAs) optimized for specificity. CT-Finder accommodates the original single-gRNA Cas9 system and two specificity-enhancing paired-gRNA systems: Cas9 D10A nickases (Cas9n) and dimeric RNA-guided FokI nucleases (RFNs). Optimal target candidates can be chosen based on the minimization of predicted off-target effects. Graphical visualization of on-target and off-target sites in the genome is provided for target validation. Major model organisms are covered by this web service. PMID:27210050
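    Off-target prediction of the kind CT-Finder performs amounts to scanning a genome for near-matches of the guide sequence adjacent to a PAM. A deliberately simplified sketch (forward strand only, NGG PAM, plain mismatch counting; CT-Finder's actual algorithm and scoring are more involved, and the function name is hypothetical):

```python
def find_offtargets(guide, genome, max_mismatches=2):
    """Toy scan: report (position, site, mismatches) for every genome
    window matching the guide with <= max_mismatches mismatches and
    followed by an NGG PAM. Forward strand only; real tools also scan
    the reverse complement and weight mismatches by position."""
    hits = []
    n = len(guide)
    for i in range(len(genome) - n - 2):
        site = genome[i:i + n]
        pam = genome[i + n:i + n + 3]
        if pam[1:] != "GG":                      # require NGG protospacer-adjacent motif
            continue
        mm = sum(a != b for a, b in zip(guide, site))
        if mm <= max_mismatches:
            hits.append((i, site, mm))
    return hits
```

    The zero-mismatch hit is the on-target site; everything else in the list is a candidate off-target to be weighed when ranking guide designs.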

  10. T-Check in Technologies for Interoperability: Business Process Management in a Web Services Context

    DTIC Science & Technology

    2008-09-01

    Business Process Execution Language (BPEL); Business Process Modeling Notation (BPMN); Web Service Choreography Description Language...(UML Sequence Diagram); Figure 3: BPMN Diagram of the Order Processing Business Process; Figure 4: T-Check Process for Technology Evaluation...Unified Modeling Language (UML), and more recently the Business Process Modeling Notation (BPMN) [OMG 2006, OMG 2007]. There are also many tools

  11. Migrating Department of Defense (DoD) Web Service Based Applications to Mobile Computing Platforms

    DTIC Science & Technology

    2012-03-01

    Application Archetype; MVC: Model View Controller; NDFD: National Digital Forecast Database; NOAA: National Oceanic and Atmospheric Administration; NSSC...for Java (GAE-J). GWT and GAE are not required to implement the COLD-T application; any web framework (PHP, Struts, JSF, Spring MVC, etc.) could be...2010. [3] S. D. K. Hyun Jung La, "Balanced MVC Architecture for Developing Service-based Mobile Applications," IEEE International Conference on E

  12. A web service and android application for the distribution of rainfall estimates and Earth observation data

    NASA Astrophysics Data System (ADS)

    Mantas, V. M.; Liu, Z.; Pereira, A. J. S. C.

    2015-04-01

    The full potential of Satellite Rainfall Estimates (SRE) can only be realized if timely access to the datasets is possible. Existing data distribution web portals are often focused on global products and offer limited customization options, especially for the purpose of routine regional monitoring. Furthermore, most online systems are designed to meet the needs of desktop users, limiting the compatibility with mobile devices. In response to the growing demand for SRE and to address the current limitations of available web portals, a project was devised to create a set of freely available applications and services, available at a common portal, that can: (1) simplify cross-platform access to Tropical Rainfall Measuring Mission Online Visualization and Analysis System (TOVAS) data (including from Android mobile devices), (2) provide customized and continuous monitoring of SRE in response to user demands and (3) combine data from different online data distribution services, including rainfall estimates, river gauge measurements or imagery from Earth Observation missions, at a single portal, known as the Tropical Rainfall Measuring Mission (TRMM) Explorer. The TRMM Explorer project suite includes a Python-based web service and Android applications capable of providing SRE and ancillary data in different intuitive formats with the focus on regional and continuous analysis. The outputs include dynamic plots, tables and data files that can also be used to feed downstream applications and services. A case study in Southern Angola is used to describe the potential of the TRMM Explorer for SRE distribution and analysis in the context of ungauged watersheds. The development of a collection of data distribution instances helped to validate the concept and identify the limitations of the program, in a real context and based on user feedback. The TRMM Explorer can successfully supplement existing web portals distributing SRE and provide a cost-efficient resource to small and medium

  13. A Real-Time Web Services Hub to Improve Situation Awareness during Flash Flood Events

    NASA Astrophysics Data System (ADS)

    Salas, F. R.; Liu, F.; Maidment, D. R.; Hodges, B. R.

    2011-12-01

    The central Texas corridor is one of the most flash flood-prone regions in the United States. Over the years, flash floods have resulted in hundreds of flood fatalities and billions of dollars in property damage. In order to mitigate risk to residents and infrastructure during flood events, both citizens and emergency responders need to exhibit proactive behavior instead of reactive. Real-time and forecasted flood information is fairly limited and hard to come by at varying spatial scales. The University of Texas at Austin has collaborated with IBM Research-Austin and ESRI to build a distributed real-time flood information system through a framework that leverages large scale data management and distribution, Open Geospatial Consortium standardized web services, and smart map applications. Within this paradigm, observed precipitation data encoded in WaterML is ingested into HEC-HMS and then delivered to a high performance hydraulic routing software package developed by IBM that utilizes the latest advancements in VLSI design, numerical linear algebra and numerical integration techniques on contemporary multicore architecture to solve fully dynamic Saint Venant equations at both small and large scales. In this paper we present a real-time flood inundation map application that in conjunction with a web services Hub, seamlessly integrates hydrologic information available through both public and private data services, model services and mapping services. As a case study for this project, we demonstrate how this system has been implemented in the City of Austin, Texas.

  14. Perspectives for Web Service Intermediaries: How Influence on Quality Makes the Difference

    NASA Astrophysics Data System (ADS)

    Scholten, Ulrich; Fischer, Robin; Zirpins, Christian

    In the service-oriented computing paradigm and the Web service architecture, the broker role is a key facilitator to leverage technical capabilities of loose coupling to achieve organizational capabilities of dynamic customer-provider-relationships. In practice, this role has quickly evolved into a variety of intermediary concepts that refine and extend the basic functionality of service brokerage with respect to various forms of added value like platform or market mechanisms. While this has initially led to a rich variety of Web service intermediaries, many of these are now going through a phase of stagnation or even decline in customer acceptance. In this paper we present a comparative study on insufficient service quality that is arguably one of the key reasons for this phenomenon. In search of a differentiation with respect to quality monitoring and management patterns, we categorize intermediaries into Infomediaries, e-Hubs, e-Markets and Integrators. A mapping of quality factors and control mechanisms to these categories depicts their respective strengths and weaknesses. The results show that Integrators have the highest overall performance, followed by e-Markets, e-Hubs and lastly Infomediaries. A comparative market survey confirms the conceptual findings.

  15. The Investigation of Pre-Service Teachers' Concerns about Integrating Web 2.0 Technologies into Instruction

    ERIC Educational Resources Information Center

    Hao, Yungwei; Wang, Shiou-ling; Chang, Su-jen; Hsu, Yin-hung; Tang, Ren-yen

    2013-01-01

    Studies indicated Web 2.0 technologies can support learning. Then, integration of innovation may create concerns among teachers because of the innovative features. In this study, the innovation refers to Web 2.0 technology integration into instruction. To help pre-service teachers make the best use of the innovation in their future instruction, it…

  16. Oh! Web 2.0, Virtual Reference Service 2.0, Tools and Techniques (I): A Basic Approach

    ERIC Educational Resources Information Center

    Arya, Harsh Bardhan; Mishra, J. K.

    2011-01-01

    This study targets librarians and information professionals who use Web 2.0 tools and applications with a view to providing snapshots on how Web 2.0 technologies are used. It also aims to identify values and impact that such tools have exerted on libraries and their services, as well as to detect various issues associated with the implementation…

  17. 78 FR 26664 - Submission for Review: CyberCorps®: Scholarship For Service (SFS) Registration Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-07

    ... From the Federal Register Online via the Government Publishing Office OFFICE OF PERSONNEL MANAGEMENT Submission for Review: CyberCorps : Scholarship For Service (SFS) Registration Web Site AGENCY: U... of the scholarship or one year, whichever is longer. Approval of the Web page is necessary...

  18. Distance and utilisation of out-of-hours services in a Norwegian urban/rural district: an ecological study

    PubMed Central

    2013-01-01

    Background Long travel distances limit the utilisation of health services. We wanted to examine the relationship between the utilisation of a Norwegian out-of-hours service and the distance from the municipality population centroid to the associated casualty clinic. Methods All first contacts from ten municipalities in the Arendal out-of-hours district were registered from 2007 through 2011. The main outcomes were contact and consultation rates for each municipality for each year. The associations between the main outcomes and the distance from the population centroid of each participating municipality to the casualty clinic were examined by linear regression. Demographic and socioeconomic factors were included in multivariate linear regression. Secondary endpoints included the association between distance and the rates of different first actions taken and priority grades assessed by triage nurses. Age- and gender-specific subgroup analyses were performed. Results 141 342 contacts were included in the analyses. Increasing distance was associated with markedly lower rates of all contact types except telephone consultations by a doctor. Moving 43 kilometres away from the casualty clinic led to a 50 per cent drop in the rate of face-to-face consultations with a doctor. Availability of primary care doctors and education level contributed to a limited extent to the variance in consultation rate. The rates of all priority grades decreased significantly with increasing distance. The rate of acute events was reduced by 22 per cent when moving 50 kilometres away. The proportion of patients above 66 years increased with increasing distance, while the proportion of 13- to 19-year-olds decreased. The proportion of female patients decreased with increasing distance. Conclusions The results confirm that increasing distance is associated with lower utilisation of out-of-hours services, even for the most acute cases. Extremely long distances might compromise patient safety. This must be taken into
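
    The distance-utilisation relationship in this study is estimated with linear regression. As a minimal sketch of such an ordinary least squares fit, the distance/rate pairs below are hypothetical illustrations, not the Arendal data.

```python
# Illustrative ordinary least squares fit of out-of-hours consultation rate
# against travel distance. The numbers are hypothetical, not the study's data.

def ols_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line y = slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Hypothetical municipalities: distance to casualty clinic (km) vs
# consultations per 1000 inhabitants per year.
distance_km = [2, 10, 20, 30, 43, 60]
consult_rate = [310, 280, 230, 190, 155, 120]

slope, intercept = ols_fit(distance_km, consult_rate)
print(f"rate = {slope:.2f} * km + {intercept:.1f}")  # slope is negative: utilisation falls with distance
```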

  19. TogoWS: integrated SOAP and REST APIs for interoperable bioinformatics Web services.

    PubMed

    Katayama, Toshiaki; Nakao, Mitsuteru; Takagi, Toshihisa

    2010-07-01

    Web services have become widely used in bioinformatics analysis, but incompatibilities in interfaces and data types prevent users from making full use of a combination of these services. Therefore, we have developed the TogoWS service to provide an integrated interface with advanced features. In the TogoWS REST (REpresentative State Transfer) API (application programming interface), we introduce a unified access method for major database resources through intuitive URIs that can be used to search, retrieve, parse and convert database entries. The TogoWS SOAP API resolves compatibility issues found in server- and client-side SOAP implementations. The TogoWS service is freely available at: http://togows.dbcls.jp/.
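
    The intuitive URI scheme can be sketched as follows. The path layout (/entry/&lt;database&gt;/&lt;id&gt;[.format] and /search/&lt;database&gt;/&lt;query&gt;/&lt;offset&gt;,&lt;limit&gt;) follows the pattern the abstract describes, but the exact database names and formats are assumptions to be checked against the live service.

```python
# Sketch of building TogoWS-style REST URIs for entry retrieval and search.
# Database names, fields and formats below are illustrative assumptions.
from urllib.parse import quote

BASE = "http://togows.dbcls.jp"

def entry_uri(database, entry_id, field=None, fmt=None):
    """Build an /entry/ URI, optionally narrowed to a field and output format."""
    parts = [BASE, "entry", quote(database), quote(entry_id)]
    if field:
        parts.append(quote(field))
    uri = "/".join(parts)
    return uri + ("." + fmt if fmt else "")

def search_uri(database, query, offset=1, limit=10):
    """Build a /search/ URI with paging expressed as offset,limit."""
    return f"{BASE}/search/{quote(database)}/{quote(query)}/{offset},{limit}"

print(entry_uri("ncbi-nucleotide", "X12345", fmt="fasta"))
print(search_uri("pubmed", "web services"))
```

    A client would then fetch these URIs with any HTTP library and parse the returned text or JSON.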

  20. Content-Based Discovery for Web Map Service using Support Vector Machine and User Relevance Feedback.

    PubMed

    Hu, Kai; Gui, Zhipeng; Cheng, Xiaoqiang; Qi, Kunlun; Zheng, Jie; You, Lan; Wu, Huayi

    2016-01-01

    Many discovery methods for geographic information services have been proposed. There are approaches for finding and matching geographic information services, methods for constructing geographic information service classification schemes, and automatic geographic information discovery. Overall, the efficiency of geographic information discovery keeps improving. There are, however, still two problems in Web Map Service (WMS) discovery that must be solved. Mismatches between the graphic contents of a WMS and the semantic descriptions in the metadata make discovery difficult for human users. End-users and computers comprehend WMSs differently, creating semantic gaps in human-computer interactions. To address these problems, we propose an improved query process for WMSs based on the graphic contents of WMS layers, combining a Support Vector Machine (SVM) and user relevance feedback. Our experiments demonstrate that the proposed method can improve the accuracy and efficiency of WMS discovery.
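
    The relevance-feedback loop can be illustrated without the SVM component. The sketch below substitutes a simpler Rocchio-style centroid update over toy content-feature vectors for WMS layers; all vectors and layer names are hypothetical.

```python
# Minimal sketch of query refinement by user relevance feedback
# (Rocchio-style). The paper combines an SVM with feedback; here a simpler
# centroid update over content feature vectors illustrates the loop itself.

def add(u, v, w=1.0):
    return [x + w * y for x, y in zip(u, v)]

def cosine(u, v):
    dot = sum(x * y for x, y in zip(u, v))
    nu = sum(x * x for x in u) ** 0.5
    nv = sum(y * y for y in v) ** 0.5
    return dot / (nu * nv) if nu and nv else 0.0

def refine_query(query, relevant, irrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Move the query vector toward relevant layers and away from irrelevant ones."""
    q = [alpha * x for x in query]
    for r in relevant:
        q = add(q, r, beta / len(relevant))
    for irr in irrelevant:
        q = add(q, irr, -gamma / len(irrelevant))
    return q

# Toy 3-feature content vectors for WMS layers (hypothetical).
layers = {"roads": [0.9, 0.1, 0.0], "rivers": [0.1, 0.9, 0.2], "relief": [0.0, 0.2, 0.9]}
query = [0.5, 0.5, 0.0]
query = refine_query(query, relevant=[layers["rivers"]], irrelevant=[layers["roads"]])
ranked = sorted(layers, key=lambda name: cosine(query, layers[name]), reverse=True)
print(ranked)
```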

  1. Measuring the impact of the approach to migration in the quality of web service interfaces

    NASA Astrophysics Data System (ADS)

    Mateos, Cristian; Crasso, Marco; Rodriguez, Juan M.; Zunino, Alejandro; Campo, Marcelo

    2015-01-01

    There is a good consensus on the strategic value of service-oriented architecture (SOA) as a way of structuring systems, and a common trend is to migrate legacy applications that use outdated technologies and architectures to SOA. We study the effects on the resulting Web Service interfaces of applying two traditional migration approaches combined with common ways of building services, namely direct migration with code-first and indirect migration with contract-first. The migrated system was a 35-year-old COBOL system of a government agency that serves several million users. In addition, we provide a deep explanation of the trade-offs involved in following either combination. Results confirm that the 'fast and cheap' approach to moving into SOA, which is commonplace in the industry, may deliver poor service interfaces, and that interface quality is also subject to the tools supporting the migration process.

  2. Content-Based Discovery for Web Map Service using Support Vector Machine and User Relevance Feedback

    PubMed Central

    Cheng, Xiaoqiang; Qi, Kunlun; Zheng, Jie; You, Lan; Wu, Huayi

    2016-01-01

    Many discovery methods for geographic information services have been proposed. There are approaches for finding and matching geographic information services, methods for constructing geographic information service classification schemes, and automatic geographic information discovery. Overall, the efficiency of geographic information discovery keeps improving. There are, however, still two problems in Web Map Service (WMS) discovery that must be solved. Mismatches between the graphic contents of a WMS and the semantic descriptions in the metadata make discovery difficult for human users. End-users and computers comprehend WMSs differently, creating semantic gaps in human-computer interactions. To address these problems, we propose an improved query process for WMSs based on the graphic contents of WMS layers, combining a Support Vector Machine (SVM) and user relevance feedback. Our experiments demonstrate that the proposed method can improve the accuracy and efficiency of WMS discovery. PMID:27861505

  3. TogoWS: integrated SOAP and REST APIs for interoperable bioinformatics Web services

    PubMed Central

    Katayama, Toshiaki; Nakao, Mitsuteru; Takagi, Toshihisa

    2010-01-01

    Web services have become widely used in bioinformatics analysis, but incompatibilities in interfaces and data types prevent users from making full use of a combination of these services. Therefore, we have developed the TogoWS service to provide an integrated interface with advanced features. In the TogoWS REST (REpresentative State Transfer) API (application programming interface), we introduce a unified access method for major database resources through intuitive URIs that can be used to search, retrieve, parse and convert database entries. The TogoWS SOAP API resolves compatibility issues found in server- and client-side SOAP implementations. The TogoWS service is freely available at: http://togows.dbcls.jp/. PMID:20472643

  4. I Help, Therefore, I Learn: Service Learning on Web 2.0 in an EFL Speaking Class

    ERIC Educational Resources Information Center

    Sun, Yu-Chih; Yang, Fang-Ying

    2015-01-01

    The present study integrates service learning into English as a Foreign Language (EFL) speaking class using Web 2.0 tools--YouTube and Facebook--as platforms. Fourteen undergraduate students participated in the study. The purpose of the service-learning project was to link service learning with oral communication training in an EFL speaking class…

  5. Symmetrical compression distance for arrhythmia discrimination in cloud-based big-data services.

    PubMed

    Lillo-Castellano, J M; Mora-Jiménez, I; Santiago-Mozos, R; Chavarría-Asso, F; Cano-González, A; García-Alberola, A; Rojo-Álvarez, J L

    2015-07-01

    The current development of cloud computing is completely changing the paradigm of knowledge extraction in huge databases. An example of this technology in the cardiac arrhythmia field is the SCOOP platform, a national-level scientific cloud-based big-data service for implantable cardioverter defibrillators. In this scenario, we propose a new methodology for automatic classification of intracardiac electrograms (EGMs) in a cloud computing system, designed for minimal signal preprocessing. A new compression-based similarity measure (CSM) with low computational burden, the so-called weighted fast compression distance, is created; it provides better performance than other CSMs in the literature. Using simple machine learning techniques, a set of 6848 EGMs extracted from the SCOOP platform were classified into seven cardiac arrhythmia classes and one noise class, reaching nearly 90% accuracy when previous patient arrhythmia information was available and 63% otherwise, in all cases exceeding the classification provided by the majority class. Results show that this methodology can be used as a high-quality cloud computing service, providing support to physicians for improving knowledge on patient diagnosis.
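
    The paper's weighted fast compression distance is not reproduced here; as a sketch of the compression-based similarity family it builds on, the classic normalized compression distance (NCD) can be computed with zlib over raw byte strings, with no other preprocessing.

```python
# Classic normalized compression distance:
#   NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))
# where C(.) is the compressed length. The byte strings below are toy
# stand-ins for electrogram signals, not real EGM data.
import zlib

def ncd(x: bytes, y: bytes) -> float:
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# A periodic "signal", a near-copy with two flipped bits, and an unrelated pattern.
a = bytes(range(0, 200, 2)) * 20
b = bytearray(a)
b[100] ^= 1
b[900] ^= 1
b = bytes(b)
noise = bytes((i * 97 + 13) % 251 for i in range(2000))

print(round(ncd(a, b), 3), round(ncd(a, noise), 3))
```

    A near-copy yields a much smaller distance than an unrelated pattern, which is the property the arrhythmia classifier exploits.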

  6. The Footprint Database and Web Services of the Herschel Space Observatory

    NASA Astrophysics Data System (ADS)

    Dobos, László; Varga-Verebélyi, Erika; Verdugo, Eva; Teyssier, David; Exter, Katrina; Valtchanov, Ivan; Budavári, Tamás; Kiss, Csaba

    2016-10-01

    Data from the Herschel Space Observatory is freely available to the public, but no uniformly processed catalogue of the observations has been published so far. To date, the Herschel Science Archive does not contain the exact sky coverage (footprint) of individual observations and supports searches for measurements based on bounding circles only. Drawing on previous experience in implementing footprint databases, we built the Herschel Footprint Database and Web Services for the Herschel Space Observatory to provide efficient search capabilities for typical astronomical queries. The database was designed with the following main goals in mind: (a) provide a unified data model for the meta-data of all instruments and observational modes, (b) quickly find observations covering a selected object and its neighbourhood, (c) quickly find every observation in a larger area of the sky, (d) allow for finding solar system objects crossing observation fields. As a first step, we developed a unified data model of observations of all three Herschel instruments for all pointing and instrument modes. Then, using telescope pointing information and observational meta-data, we compiled a database of footprints. As opposed to methods using pixellation of the sphere, we represent sky coverage in an exact geometric form allowing for precise area calculations. For easier handling of Herschel observation footprints with rather complex shapes, two algorithms were implemented to reduce the outline. Furthermore, a new visualisation tool to plot footprints with various spherical projections was developed. Indexing of the footprints using a Hierarchical Triangular Mesh makes it possible to quickly find observations based on sky coverage, time and meta-data. The database is accessible via a web site (http://herschel.vo.elte.hu) and also as a set of REST web service functions, which makes it readily usable from programming environments such as Python or IDL. The web service allows downloading footprint data

  7. Web servers and services for electrostatics calculations with APBS and PDB2PQR

    SciTech Connect

    Unni, Samir; Huang, Yong; Hanson, Robert M.; Tobias, Malcolm; Krishnan, Sriram; Li, Wilfred; Nielsen, Jens E.; Baker, Nathan A.

    2011-04-02

    APBS and PDB2PQR are widely utilized free software packages for biomolecular electrostatics calculations. Using the Opal toolkit, we have developed a web services framework for these software packages that enables the use of APBS and PDB2PQR by users who do not have local access to the necessary amount of computational capabilities. This not only increases accessibility of the software to a wider range of scientists, educators, and students but it also increases the availability of electrostatics calculations on portable computing platforms. Users can access this new functionality in two ways. First, an Opal-enabled version of APBS is provided in current distributions, available freely on the web. Second, we have extended the PDB2PQR web server to provide an interface for the setup, execution, and visualization electrostatics potentials as calculated by APBS. This web interface also uses the Opal framework which ensures the scalability needed to support the large APBS user community. Both of these resources are available from the APBS/PDB2PQR website: http://www.poissonboltzmann.org/.

  8. Design and Development of a Framework Based on Ogc Web Services for the Visualization of Three Dimensional Large-Scale Geospatial Data Over the Web

    NASA Astrophysics Data System (ADS)

    Roccatello, E.; Nozzi, A.; Rumor, M.

    2013-05-01

    This paper illustrates the key concepts behind the design and development of a framework, based on OGC services, capable of visualizing large-scale 3D geospatial data streamed over the web. WebGISes are traditionally bound to a simplified two-dimensional representation of reality, and although they successfully address the lack of flexibility and simplicity of traditional desktop clients, considerable effort is still needed to reach desktop GIS features such as 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, allowing traditional WebGIS features to be augmented with a 3D visualization framework. This work can be seen as an extension of the Cityvu project, started in 2008 with the aim of building a plug-in-free OGC CityGML viewer. The resulting framework has also been integrated into existing 3D GIS software products and will be made available in the coming months.

  9. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability.

    PubMed

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-08-31

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve humans' daily lives. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research proposes an interoperable solution called the tasking capability description, which allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner.
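
    A uniform tasking interface of this kind accepts task requests as JSON documents. The sketch below assumes a SensorThings-style model in which a Task references a TaskingCapability; the entity names follow the OGC SensorThings tasking model, but the capability id and the parameters are invented for illustration.

```python
# Hypothetical sketch of tasking a device through a SensorThings-style API.
# The capability id (1) and parameters (power, brightness) are made up.
import json

def build_task(capability_id, parameters):
    """Assemble a Task payload that references an existing TaskingCapability."""
    return {
        "taskingParameters": parameters,
        "TaskingCapability": {"@iot.id": capability_id},
    }

task = build_task(1, {"power": "on", "brightness": 80})
payload = json.dumps(task)
print(payload)
# A client would POST this payload to <service-root>/Tasks with
# Content-Type: application/json.
```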

  10. A Web Service Protocol Realizing Interoperable Internet of Things Tasking Capability

    PubMed Central

    Huang, Chih-Yuan; Wu, Cheng-Hung

    2016-01-01

    The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve humans' daily lives. In general, IoT devices provide two main capabilities: sensing and tasking. While the sensing capability is similar to the World-Wide Sensor Web, this research focuses on the tasking capability. Currently, however, IoT devices created by different manufacturers follow different proprietary protocols and are locked into many closed ecosystems. This heterogeneity issue impedes the interconnection between IoT devices and damages the potential of the IoT. To address this issue, this research proposes an interoperable solution called the tasking capability description, which allows users to control different IoT devices using a uniform web service interface. This paper demonstrates the contribution of the proposed solution by interconnecting different IoT devices for different applications. In addition, the proposed solution is integrated with the OGC SensorThings API standard, a Web service standard defined for the IoT sensing capability. Consequently, the Extended SensorThings API can realize both IoT sensing and tasking capabilities in an integrated and interoperable manner. PMID:27589759

  11. Study of an Open Web Mapping Service for ESA's Planetary Surface Data Sets

    NASA Astrophysics Data System (ADS)

    Manaud, N.; Hempen, B.; Heather, D.; Salgado, J.; Osuna, P.; Frigeri, A.

    2011-10-01

    The aim of this study is to understand how open GIS technology and standards can be used to manage, process, and stream the geospatial content of the PSA archive across the Internet into existing mapping applications and software libraries. To support this goal, we will design and develop an OGC-compliant web mapping service prototype providing a specific interface for client applications to search, process, retrieve and visualise the Mars Express/ OMEGA mapping spectrometer data [6] currently available in the PSA. The interface will be designed to accommodate future needs as additional planetary surface data become available or additional functionalities are required. It is anticipated that this web mapping service will provide a practical open framework contributing to the international efforts to define an extension of the PDAP data model for planetary surface data [7][8]. Although it has been demonstrated that OGC web mapping standards can be applied to the planetary science domain, they have limitations that currently prevent the use of their full capability. This study will document such limitations and feed potential recommendations into the Planetary OGC Interoperability Experiment (Planetary IE), an international initiative coordinated by USGS Astrogeology Science Center to ensure that the planetary community will properly leverage future OGC specifications [9].
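
    An OGC-compliant mapping service of this kind answers standard WMS GetMap requests. The sketch below builds one; the base URL and layer name are placeholders, not actual PSA/OMEGA identifiers.

```python
# Sketch of an OGC WMS 1.3.0 GetMap request URL. The endpoint and layer
# name are hypothetical; the query parameters are the standard ones.
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width=512, height=512,
               crs="CRS:84", fmt="image/png", version="1.3.0"):
    """Build a GetMap URL; bbox is (minx, miny, maxx, maxy) in the given CRS."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base + "?" + urlencode(params)

url = getmap_url("http://example.org/psa/wms", "omega_albedo", (-180, -90, 180, 90))
print(url)
```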

  12. A platform for exploration into chaining of web services for clinical data transformation and reasoning

    PubMed Central

    Maldonado, José Alberto; Marcos, Mar; Fernández-Breis, Jesualdo Tomás; Parcero, Estíbaliz; Boscá, Diego; Legaz-García, María del Carmen; Martínez-Salvador, Begoña; Robles, Montserrat

    2016-01-01

    The heterogeneity of clinical data is a key problem in the sharing and reuse of Electronic Health Record (EHR) data. We approach this problem through the combined use of EHR standards and semantic web technologies, concretely by means of clinical data transformation applications that convert EHR data in proprietary format, first into clinical information models based on archetypes, and then into RDF/OWL extracts which can be used for automated reasoning. In this paper we describe a proof-of-concept platform to facilitate the (re)configuration of such clinical data transformation applications. The platform is built upon a number of web services dealing with transformations at different levels (such as normalization or abstraction), and relies on a collection of reusable mappings designed to solve specific transformation steps in a particular clinical domain. The platform has been used in the development of two different data transformation applications in the area of colorectal cancer. PMID:28269882
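
    The chaining idea can be sketched as composable stage functions, one per transformation level (normalization, abstraction, triple generation). The stage names and record fields below are invented for illustration; the real platform operates on EHR extracts, archetype-based models and RDF/OWL.

```python
# Hypothetical sketch of chaining data-transformation services as stages.
# Field names, the code-to-concept table and the triple vocabulary are toy
# stand-ins, not the platform's actual interfaces.

def normalize(record):
    """Proprietary field names -> a canonical (archetype-like) model."""
    return {"patient_id": record["pid"], "diagnosis_code": record["dx"]}

def abstract(canonical):
    """Canonical model -> a higher-level concept (toy mapping)."""
    concepts = {"C18.9": "colorectal-cancer"}
    return {**canonical, "concept": concepts.get(canonical["diagnosis_code"], "unknown")}

def to_triples(abstracted):
    """Abstracted record -> RDF-style (subject, predicate, object) triples."""
    subject = f"patient:{abstracted['patient_id']}"
    return [(subject, "has_concept", abstracted["concept"])]

def chain(*stages):
    """Compose stages left to right into a single pipeline function."""
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

pipeline = chain(normalize, abstract, to_triples)
print(pipeline({"pid": "P001", "dx": "C18.9"}))
```

    Reconfiguring an application then amounts to choosing which stages (and which reusable mappings) to chain, which is the point of the platform.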

  13. A platform for exploration into chaining of web services for clinical data transformation and reasoning.

    PubMed

    Maldonado, José Alberto; Marcos, Mar; Fernández-Breis, Jesualdo Tomás; Parcero, Estíbaliz; Boscá, Diego; Legaz-García, María Del Carmen; Martínez-Salvador, Begoña; Robles, Montserrat

    2016-01-01

    The heterogeneity of clinical data is a key problem in the sharing and reuse of Electronic Health Record (EHR) data. We approach this problem through the combined use of EHR standards and semantic web technologies, concretely by means of clinical data transformation applications that convert EHR data in proprietary format, first into clinical information models based on archetypes, and then into RDF/OWL extracts which can be used for automated reasoning. In this paper we describe a proof-of-concept platform to facilitate the (re)configuration of such clinical data transformation applications. The platform is built upon a number of web services dealing with transformations at different levels (such as normalization or abstraction), and relies on a collection of reusable mappings designed to solve specific transformation steps in a particular clinical domain. The platform has been used in the development of two different data transformation applications in the area of colorectal cancer.

  14. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    PubMed

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem which occurs when working with medical forms from different information systems or institutions. Standards like ODM or CDA assure structural homogenization, but in order to compare elements from different data models it is necessary to use semantic concepts and codes at the item level of those structures. We developed and implemented a web-based tool which enables a domain expert to perform semi-automated coding of ODM files. For each item it is possible to query web services which return unique concept codes, without leaving the context of the document. Although fully automated coding was not feasible, we implemented a dialog-based method to perform efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.

  15. Semantic-JSON: a lightweight web service interface for Semantic Web contents integrating multiple life science databases.

    PubMed

    Kobayashi, Norio; Ishii, Manabu; Takahashi, Satoshi; Mochizuki, Yoshiki; Matsushima, Akihiro; Toyoda, Tetsuro

    2011-07-01

    Global cloud frameworks for bioinformatics research databases become huge and heterogeneous; solutions face various diametric challenges comprising cross-integration, retrieval, security and openness. To address this, as of March 2011 organizations including RIKEN published 192 mammalian, plant and protein life sciences databases having 8.2 million data records, integrated as Linked Open or Private Data (LOD/LPD) using SciNetS.org, the Scientists' Networking System. The huge quantity of linked data this database integration framework covers is based on the Semantic Web, where researchers collaborate by managing metadata across public and private databases in a secured data space. This outstripped the data query capacity of existing interface tools like SPARQL. Actual research also requires specialized tools for data analysis using raw original data. To solve these challenges, in December 2009 we developed the lightweight Semantic-JSON interface to access each fragment of linked and raw life sciences data securely under the control of programming languages popularly used by bioinformaticians such as Perl and Ruby. Researchers successfully used the interface across 28 million semantic relationships for biological applications including genome design, sequence processing, inference over phenotype databases, full-text search indexing and human-readable contents like ontology and LOD tree viewers. Semantic-JSON services of SciNetS.org are provided at http://semanticjson.org.

  16. Cross-Dataset Analysis and Visualization Driven by Expressive Web Services

    NASA Astrophysics Data System (ADS)

    Alexandru Dumitru, Mircea; Catalin Merticariu, Vlad

    2015-04-01

    The deluge of data that is hitting us every day from satellite and airborne sensors is changing the workflow of environmental data analysts and modelers. Web geo-services play now a fundamental role, and are no longer needed to preliminary download and store the data, but rather they interact in real-time with GIS applications. Due to the very large amount of data that is curated and made available by web services, it is crucial to deploy smart solutions for optimizing network bandwidth, reducing duplication of data and moving the processing closer to the data. In this context we have created a visualization application for analysis and cross-comparison of aerosol optical thickness datasets. The application aims to help researchers identify and visualize discrepancies between datasets coming from various sources, having different spatial and time resolutions. It also acts as a proof of concept for integration of OGC Web Services under a user-friendly interface that provides beautiful visualizations of the explored data. The tool was built on top of the World Wind engine, a Java based virtual globe built by NASA and the open source community. For data retrieval and processing we exploited the OGC Web Coverage Service potential: the most exciting aspect being its processing extension, a.k.a. the OGC Web Coverage Processing Service (WCPS) standard. A WCPS-compliant service allows a client to execute a processing query on any coverage offered by the server. By exploiting a full grammar, several different kinds of information can be retrieved from one or more datasets together: scalar condensers, cross-sectional profiles, comparison maps and plots, etc. This combination of technology made the application versatile and portable. As the processing is done on the server-side, we ensured that the minimal amount of data is transferred and that the processing is done on a fully-capable server, leaving the client hardware resources to be used for rendering the visualization
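
    A WCPS query of the kind described is assembled as a string and sent to the server, which evaluates it and returns only the condensed result. In this sketch the coverage and axis names (AerosolOpticalThickness, Lat, Long, ansi) are placeholders; a real server's capabilities document defines the valid ones.

```python
# Sketch of assembling a WCPS query: a spatio-temporal subset of a
# hypothetical aerosol coverage, averaged server-side and encoded as CSV.

def wcps_avg_subset(coverage, lat, lon, time_range):
    """Build a WCPS expression averaging a lat/lon/time subset of a coverage."""
    return (
        f'for c in ({coverage}) '
        f'return encode(avg(c[Lat({lat[0]}:{lat[1]}), '
        f'Long({lon[0]}:{lon[1]}), '
        f'ansi("{time_range[0]}":"{time_range[1]}")]), "csv")'
    )

query = wcps_avg_subset("AerosolOpticalThickness", (30, 60), (-10, 40),
                        ("2014-01-01", "2014-12-31"))
print(query)
# The query is typically sent to the server's WCPS endpoint as the value
# of a "query" parameter in a GET or POST request.
```

    Because only the scalar result crosses the network, this is what lets the application move processing close to the data.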

  17. Data Access and Web Services at the EarthScope Plate Boundary Observatory

    NASA Astrophysics Data System (ADS)

    Matykiewicz, J.; Anderson, G.; Henderson, D.; Hodgkinson, K.; Hoyt, B.; Lee, E.; Persson, E.; Torrez, D.; Smith, J.; Wright, J.; Jackson, M.

    2007-12-01

    The EarthScope Plate Boundary Observatory (PBO) at UNAVCO, Inc., part of the NSF-funded EarthScope project, is designed to study the three-dimensional strain field resulting from deformation across the active boundary zone between the Pacific and North American plates in the western United States. To meet these goals, PBO will install 880 continuous GPS stations, 103 borehole strainmeter stations, and five laser strainmeters, as well as manage data for 209 previously existing continuous GPS stations and one previously existing laser strainmeter. UNAVCO provides access to data products from these stations, as well as general information about the PBO project, via the PBO web site (http://pboweb.unavco.org). GPS and strainmeter data products can be found using a variety of access methods, including map searches, text searches, and station-specific data retrieval. In addition, the PBO construction status is available via multiple mapping interfaces, including custom web-based map widgets and Google Earth. Additional construction details can be accessed from PBO operational pages and station-specific home pages. The current state of health of the PBO network is available through a statistical snapshot, full map interfaces, tabular web-based reports, and automatic data mining and alerts. UNAVCO is currently working to enhance community access to this information by developing a web service framework for the discovery of data products, interfacing with operational engineers, and exposing data services to third-party participants. In addition, UNAVCO, through the PBO project, provides advanced data management and monitoring systems for use by the community in operating geodetic networks in the United States and beyond. We will demonstrate these systems during the AGU meeting, and we welcome inquiries from the community at any time.

  18. SPIDR III: A Web Services Based System for Managing and Accessing Solar Terrestrial Physics Data

    NASA Astrophysics Data System (ADS)

    Redmon, R.; Kihn, E.; Zhizhin, M.

    2005-05-01

    We present SPIDR III, a web-based data access, visualization, and data management system for the space environment community, allowing a solar terrestrial physics customer to intelligently access and manage historical space physics data for integration with environmental models and space weather forecasts. SPIDR III is the newly redesigned Space Physics Interactive Data Resource (SPIDR) web application and was redesigned with input from its user community via an intensive usability study. We will present SPIDR III's new features and improved usability, and lessons learned in usability engineering and in federating multi-source data. In 2004, SPIDR II underwent extensive rework, yielding a completely redesigned interface for improved user interaction and the addition of many enhanced and complex features. The usability alterations were motivated in large part by a usability study performed by outside professional site reviewers and involving key data managers and current SPIDR II users. SPIDR III is built following the application-direct-to-data-archive paradigm, using Web Services for both internal and external exchange of data and information. It is now a framework and an application set of Web Services. This application suite is fully open source and is designed to operate as a standalone VO as well as to integrate seamlessly with other existing VOs. This extensible and open design allows easy mirroring worldwide for free and open exchange of scientific data and information. Data managed by SPIDR include geomagnetic indices, GOES, ionospheric, and DMSP data, which are archived and ingested from many data providers including WDC, IIWG, SAO, HDF, AFCCC, SEC, and NASA; this list is easily extendable. SPIDR III may be accessed via http://spidr.ngdc.noaa.gov/spidr/. A guest login is provided for convenience. Becoming a full-access user is free and requires only completing a short registration form.

  19. Sharing environmental models: An Approach using GitHub repositories and Web Processing Services

    NASA Astrophysics Data System (ADS)

    Stasch, Christoph; Nuest, Daniel; Pross, Benjamin

    2016-04-01

    The GLUES (Global Assessment of Land Use Dynamics, Greenhouse Gas Emissions and Ecosystem Services) project established a spatial data infrastructure for scientific geospatial data and metadata (http://geoportal-glues.ufz.de), where different regional collaborative projects researching the impacts of climate and socio-economic changes on sustainable land management can share their underlying base scenarios and datasets. One goal of the project is to ease the sharing of computational models between institutions and to make them easily executable in Web-based infrastructures. In this work, we present such an approach for sharing computational models relying on GitHub repositories (http://github.com) and Web Processing Services. First, model providers upload their model implementations to GitHub repositories in order to share them with others. The GitHub platform allows users to submit changes to the model code; the changes can be discussed and reviewed before being merged. However, while GitHub supports sharing and collaborating on model source code, it does not actually allow running these models, which requires effort to transfer the implementation to a model execution framework. We have therefore extended an existing implementation of the OGC Web Processing Service standard (http://www.opengeospatial.org/standards/wps), the 52°North Web Processing Service (http://52north.org/wps) platform, to retrieve model implementations from a git (http://git-scm.com) repository and add them to the collection of published geoprocesses. The current implementation is restricted to models implemented as R scripts using WPS4R annotations (Hinz et al.) and to Java algorithms using the 52°North WPS Java API. The models hence become executable through a standardized Web API by multiple clients such as desktop or browser GIS and modelling frameworks. If the model code is changed on the GitHub platform, the changes are retrieved by the service and the published processes are updated.
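
    The standardized Web API through which the published models become executable can be sketched in Python; the endpoint and process identifier below are placeholders, while the key-value-pair parameters follow the OGC WPS 1.0.0 Execute request.

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint and process identifier -- placeholders only;
# the KVP parameter names follow the OGC WPS 1.0.0 specification.
WPS_ENDPOINT = "http://example.org/wps/WebProcessingService"

def build_execute_request(process_id, inputs):
    """Build a WPS 1.0.0 Execute request in key-value-pair (GET) encoding."""
    data_inputs = ";".join("%s=%s" % (k, v) for k, v in inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process_id,
        "datainputs": data_inputs,
    }
    return WPS_ENDPOINT + "?" + urlencode(params)

url = build_execute_request("org.example.landuse.model",
                            {"scenario": "A1", "year": "2030"})
print(url)
```

    Any HTTP-capable client (a desktop GIS, a browser, or a modelling framework) can issue such a request, which is what makes the shared models broadly accessible.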

  20. No Clinic Left Behind: Providing Cost-Effective In-Services Via Distance Learning.

    PubMed

    Knapp, Herschel; Fletcher, Michael; Taylor, Anne; Chan, Kee; Goetz, Matthew Bidwell

    2010-11-23

    Based on the successful pilot implementation of a Veterans Affairs Quality Enhancement Research Initiative (QUERI) aimed at increasing HIV testing throughout four local facilities and their associated satellite clinics, we expanded our efforts to deploy our methods to substantially more sites spanning six states. Our goal was to implement and examine the cost-effectiveness of a distance-learning model to offer provider education to geographically remote (sub)facilities. We developed and implemented an equivalent interactive online version of our in-person presentation. Handouts were shipped to each site before the day of the in-service. Remote participants were receptive to this cost-effective form of provider activation. The technology functioned dependably; no presentation anomalies were encountered. Participants rated in-person presentations higher than online ones; however, mean scores for both methods were >80%. Online presentations were found to be considerably more affordable than in-person ones. These findings suggest that this alternate approach may offer a feasible alternative for a variety of subjects.

  1. No clinic left behind: providing cost-effective in-services via distance learning.

    PubMed

    Knapp, Herschel; Fletcher, Michael; Taylor, Anne; Chan, Kee; Goetz, Matthew Bidwell

    2011-09-01

    Based on the successful pilot implementation of a Veterans Affairs Quality Enhancement Research Initiative (QUERI) aimed at increasing HIV testing throughout four local facilities and their associated satellite clinics, we expanded our efforts to deploy our methods to substantially more sites spanning six states. Our goal was to implement and examine the cost-effectiveness of a distance-learning model to offer provider education to geographically remote (sub)facilities. We developed and implemented an equivalent interactive online version of our in-person presentation. Handouts were shipped to each site before the day of the in-service. Remote participants were receptive to this cost-effective form of provider activation. The technology functioned dependably; no presentation anomalies were encountered. Participants rated in-person presentations higher than online ones; however, mean scores for both methods were >80%. Online presentations were found to be considerably more affordable than in-person ones. These findings suggest that this alternate approach may offer a feasible alternative for a variety of subjects.

  2. EnviroAtlas -Milwaukee, WI- One Meter Resolution Urban Land Cover Data (2010) Web Service

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The EnviroAtlas Milwaukee, WI land cover data and map were generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2010 at 1 m spatial resolution. Nine land cover classes were mapped: water, impervious surfaces (dark and light), soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, agriculture, and wetlands (woody and emergent). An accuracy assessment using a completely random sampling of 600 samples yielded an overall accuracy of 85.39 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Milwaukee. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-
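
    The overall accuracy figure quoted in these assessments is the share of validation samples whose mapped class matches the reference class, i.e. the trace of the confusion matrix divided by its grand total. A minimal Python sketch with illustrative counts (not the Milwaukee data):

```python
def overall_accuracy(confusion):
    """Overall accuracy: correctly classified samples / total samples,
    i.e. the trace of the confusion matrix over its grand total."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# Illustrative 3-class confusion matrix (rows = mapped, cols = reference).
cm = [
    [50,  3,  2],
    [ 4, 45,  1],
    [ 1,  2, 42],
]
print("{:.2%}".format(overall_accuracy(cm)))  # -> 91.33%
```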

  3. EnviroAtlas -Portland, ME- One Meter Resolution Urban Land Cover (2010) Web Service

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The Portland, ME land cover map was generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2010 at 1 m spatial resolution. Nine land cover classes were mapped: water, impervious surfaces (dark and light), soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, agriculture, and wetlands (woody and emergent). An accuracy assessment using a stratified random sampling of 600 samples yielded an overall accuracy of 87.5 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Portland. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

  4. EnviroAtlas -- Woodbine, IA -- One Meter Resolution Urban Land Cover Data (2011) Web Service

    EPA Pesticide Factsheets

    This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The EnviroAtlas Woodbine, IA land cover (LC) data and map were generated from USDA NAIP (National Agricultural Imagery Program) four band (red, green, blue and near infrared) aerial photography from Late Summer 2011 at 1 m spatial resolution. Six land cover classes were mapped: water, impervious surfaces (dark and light), soil and barren land, trees and forest, grass and herbaceous non-woody vegetation, and agriculture. An accuracy assessment using a completely random sampling of 600 samples yielded an overall accuracy of 87.03 percent using a minimum mapping unit of 9 pixels (3x3 pixel window). The area mapped is defined by the US Census Bureau's 2010 Urban Statistical Area for Woodbine. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

  5. Web-based data delivery services in support of disaster-relief applications

    USGS Publications Warehouse

    Jones, B.K.; Risty, R.R.; Buswell, M.

    2003-01-01

    The U.S. Geological Survey Earth Resources Observation Systems Data Center responds to emergencies in support of various government agencies for human-induced and natural disasters. This response consists of satellite tasking and acquisitions, satellite image registrations, disaster-extent map analysis and creation, base image provision and support, Web-based mapping services for product delivery, and predisaster and postdisaster data archiving. The emergency response staff are on call 24 hours a day, 7 days a week, and have access to many commercial and government satellite and aerial photography tasking authorities. They have access to value-added data processing and photographic laboratory services for off-hour emergency requests. They work with various Federal agencies for preparedness planning, which includes providing base imagery. These data may include digital elevation models, hydrographic models, base satellite images, vector data layers such as roads, aerial photographs, and other predisaster data. These layers are incorporated into a Web-based browser and data delivery service that is accessible either to the general public or to select customers. As usage declines, the data are moved to a postdisaster nearline archive that is still accessible, but not in real time.

  6. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification.

    PubMed

    Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated.
Top balanced F-scores for gene, chemical and
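
    The balanced F-score used above to compare systems is the harmonic mean of precision and recall. A short Python sketch of the standard formulas, with illustrative counts:

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and balanced (F1) score from counts of true
    positives, false positives, and false negatives."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Illustrative counts for one hypothetical NER run (not challenge results).
p, r, f = precision_recall_f1(tp=80, fp=20, fn=40)
print("P=%.3f R=%.3f F1=%.3f" % (p, r, f))  # -> P=0.800 R=0.667 F1=0.727
```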

  7. The National 3-D Geospatial Information Web-Based Service of Korea

    NASA Astrophysics Data System (ADS)

    Lee, D. T.; Kim, C. W.; Kang, I. G.

    2013-09-01

    3D geospatial information systems should provide efficient spatial analysis tools and be able to exploit all capabilities of the third dimension, including visualization. Currently, many human activities are moving toward the third dimension, such as land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, military applications, etc. To reflect this trend, the Korean government has started to construct a 3D geospatial data and service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led 3D geospatial information web-based service for people interested in this industry; we introduce not only the present state of the constructed 3D geospatial data but also the methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data was constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, data for six metropolitan cities and Dokdo (an island belonging to Korea) were constructed in 2012 at level of detail (LOD) 4, i.e. photo-realistic textured 3D models with corresponding ortho photographs. In this paper, we present the composition and infrastructure of the web-based 3D map service system and a comparison of V-World with the Google Earth service. We also present Open API based service cases and discuss the protection of location privacy when constructing 3D indoor building models. In order to prevent invasions of privacy, we applied image blurring, elimination, and camouflage. The importance of public-private cooperation and an advanced geospatial information policy is emphasized in Korea. 
Thus, the progress of

  8. Provision of Student Learning Support Services in a Large-Scale Distance Education System at Universitas Terbuka, Indonesia

    ERIC Educational Resources Information Center

    Zuhairi, Aminudin; Adnan, Irma; Thaib, Dina

    2007-01-01

    This paper addresses the practice and experience of Universitas Terbuka (UT) in the provision of learning support services for students in a large-scale distance education system. The UT, which has a network of 37 regional offices and participating institutions, faces challenges in providing and managing an effective learning support system for more than…

  9. Service-Learning from a Distance: Partnering Multiple Universities and Local Governments in a Large Scale Initiative

    ERIC Educational Resources Information Center

    Poindexter, Sandra; Arnold, Pamela; Osterhout, Christopher

    2009-01-01

    Service-learning can be academically effective even when the distances between students and client organizations prevent face-to-face interchanges and site visits. Working with the State of Michigan and Michigan Townships Association, Michigan students from five universities learned about local government while helping Michigan townships develop…

  10. Listen to What They Have to Say! Assessing Distance Learners' Satisfaction with Library Services Using a Transactional Survey

    ERIC Educational Resources Information Center

    Alewine, Michael C.

    2012-01-01

    This paper examines the evolution and findings of an on-going longitudinal study that is assessing the satisfaction of distance education students with library reference services through the use of a transaction-level survey. The survey's purpose is two-fold: first, it is used to garner valuable input from these students; and second, it also…

  11. Designing and Implementing WebCT-Based Courses Online: Distance English Language Teacher Training Program (DELTT) Model

    ERIC Educational Resources Information Center

    Kurubacak, Gulsun

    2002-01-01

    Many undergraduate and graduate courses at the Open Education Faculty (OEF) of Anadolu University in Turkey have been delivered for over two decades. Most distance programs have been distributed via traditional distance education approaches and philosophies, such as TV and radio programs, printed materials, etc. Today, however, OEF has been…

  12. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*.

    PubMed

    Katayama, Toshiaki; Arakawa, Kazuharu; Nakao, Mitsuteru; Ono, Keiichiro; Aoki-Kinoshita, Kiyoko F; Yamamoto, Yasunori; Yamaguchi, Atsuko; Kawashima, Shuichi; Chun, Hong-Woo; Aerts, Jan; Aranda, Bruno; Barboza, Lord Hendrix; Bonnal, Raoul Jp; Bruskiewich, Richard; Bryne, Jan C; Fernández, José M; Funahashi, Akira; Gordon, Paul Mk; Goto, Naohisa; Groscurth, Andreas; Gutteridge, Alex; Holland, Richard; Kano, Yoshinobu; Kawas, Edward A; Kerhornou, Arnaud; Kibukawa, Eri; Kinjo, Akira R; Kuhn, Michael; Lapp, Hilmar; Lehvaslaiho, Heikki; Nakamura, Hiroyuki; Nakamura, Yasukazu; Nishizawa, Tatsuya; Nobata, Chikashi; Noguchi, Tamotsu; Oinn, Thomas M; Okamoto, Shinobu; Owen, Stuart; Pafilis, Evangelos; Pocock, Matthew; Prins, Pjotr; Ranzinger, René; Reisinger, Florian; Salwinski, Lukasz; Schreiber, Mark; Senger, Martin; Shigemoto, Yasumasa; Standley, Daron M; Sugawara, Hideaki; Tashiro, Toshiyuki; Trelles, Oswaldo; Vos, Rutger A; Wilkinson, Mark D; York, William; Zmasek, Christian M; Asai, Kiyoshi; Takagi, Toshihisa

    2010-08-21

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that avoid transferring entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project, and researchers of emerging areas where a standard exchange data format is not well established to an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and the Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues that arose from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security, are discussed. As a result, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for effective advances in bioinformatics web service technologies.

  13. Exploring Pre-Service Teachers' Beliefs about Using Web 2.0 Technologies in K-12 Classroom

    ERIC Educational Resources Information Center

    Sadaf, Ayesha; Newby, Timothy J.; Ertmer, Peggy A.

    2012-01-01

    This qualitative study explored pre-service teachers' behavioral, normative, and control beliefs regarding their intentions to use Web 2.0 technologies in their future classrooms. The Theory of Planned Behavior (TPB) was used as the theoretical framework (Ajzen, 1991) to understand these beliefs and pre-service teachers' intentions for why they…

  14. The DBCLS BioHackathon: standardization and interoperability for bioinformatics web services and workflows. The DBCLS BioHackathon Consortium*

    PubMed Central

    2010-01-01

    Web services have become a key technology for bioinformatics, since life science databases are globally decentralized and the exponential increase in the amount of available data demands efficient systems that avoid transferring entire databases for every step of an analysis. However, various incompatibilities among database resources and analysis services make it difficult to connect and integrate these into interoperable workflows. To resolve this situation, we invited domain specialists from web service providers, client software developers, Open Bio* projects, the BioMoby project, and researchers of emerging areas where a standard exchange data format is not well established to an intensive collaboration entitled the BioHackathon 2008. The meeting was hosted by the Database Center for Life Science (DBCLS) and the Computational Biology Research Center (CBRC) and was held in Tokyo from February 11th to 15th, 2008. In this report we highlight the work accomplished and the common issues that arose from this event, including the standardization of data exchange formats and services in the emerging fields of glycoinformatics, biological interaction networks, text mining, and phyloinformatics. In addition, common shared object development based on BioSQL, as well as technical challenges in large data management, asynchronous services, and security, are discussed. As a result, we improved the interoperability of web services in several fields; however, further cooperation among major database centers and continued collaborative efforts between service providers and software developers are still necessary for effective advances in bioinformatics web service technologies. PMID:20727200

  15. Technical Services on the Net: Where Are We Now? A Comparative Study of Sixty Web Sites of Academic Libraries

    ERIC Educational Resources Information Center

    Wang, Jianrong; Gao, Vera

    2004-01-01

    This study examines sixty academic libraries' Web sites and finds that 80 percent of them do not have a technical services homepage. Data reveal that an institution's status might be a factor in whether a library has such a page. Further content analysis suggests there is an appropriate and useful public service role that technical services…

  16. Effective electron-density map improvement and structure validation on a Linux multi-CPU web cluster: The TB Structural Genomics Consortium Bias Removal Web Service.

    PubMed

    Reddy, Vinod; Swanson, Stanley M; Segelke, Brent; Kantardjieff, Katherine A; Sacchettini, James C; Rupp, Bernhard

    2003-12-01

    Anticipating a continuing increase in the number of structures solved by molecular replacement in high-throughput crystallography and drug-discovery programs, a user-friendly web service for automated molecular replacement, map improvement, bias removal and real-space correlation structure validation has been implemented. The service is based on an efficient bias-removal protocol, Shake&wARP, and implemented using EPMR and the CCP4 suite of programs, combined with various shell scripts and Fortran90 routines. The service returns improved maps, converted data files and real-space correlation and B-factor plots. User data are uploaded through a web interface and the CPU-intensive iteration cycles are executed on a low-cost Linux multi-CPU cluster using the Condor job-queuing package. Examples of map improvement at various resolutions are provided and include model completion and reconstruction of absent parts, sequence correction, and ligand validation in drug-target structures.

  17. A flexible statistics web processing service--added value for information systems for experiment data.

    PubMed

    Heimann, Dennis; Nieschulze, Jens; König-Ries, Birgitta

    2010-04-20

    Data management in the life sciences has evolved from simple storage of data to complex information systems providing additional functionalities like analysis and visualization capabilities, demanding the integration of statistical tools. In many cases the statistical tools used are hard-coded within the system. This makes integrating, substituting, or extending tools expensive, because all changes have to be made in program code. Other systems use generic solutions for tool integration, but adapting them to another system usually requires extensive work. This paper shows a way to provide statistical functionality through a statistics web service, which can be easily integrated in any information system and set up using XML configuration files. The statistical functionality is extendable by simply adding the description of a new application to a configuration file. The service architecture, the data exchange process between client and service, and the addition of analysis applications to the underlying service provider are described. Furthermore, a practical example demonstrates the functionality of the service.
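
    The configuration-file mechanism can be pictured with a short Python sketch; the XML element names here are invented for illustration and are not the schema actually used by the service.

```python
import xml.etree.ElementTree as ET

# Hypothetical configuration describing one statistical application; the
# real element names and structure of the service's files may differ.
CONFIG = """
<applications>
  <application id="linear-regression">
    <script>lm.R</script>
    <input name="x" type="numeric-vector"/>
    <input name="y" type="numeric-vector"/>
    <output name="coefficients" type="numeric-vector"/>
  </application>
</applications>
"""

def list_applications(xml_text):
    """Return the ids of all analysis applications declared in the config."""
    root = ET.fromstring(xml_text)
    return [app.get("id") for app in root.findall("application")]

print(list_applications(CONFIG))  # -> ['linear-regression']
```

    Adding a new analysis then amounts to appending another application element to the file, with no change to program code.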

  18. Middleware and Web Services for the Collaborative Information Portal of NASA's Mars Exploration Rovers Mission

    NASA Technical Reports Server (NTRS)

    Sinderson, Elias; Magapu, Vish; Mak, Ronald

    2004-01-01

    We describe the design and deployment of the middleware for the Collaborative Information Portal (CIP), a mission-critical J2EE application developed for NASA's 2003 Mars Exploration Rover mission. CIP enabled mission personnel to access data and images sent back from Mars, staff and event schedules, broadcast messages, and clocks displaying various Earth and Mars time zones. We developed the CIP middleware in less than two years' time using cutting-edge technologies, including EJBs, servlets, JDBC, JNDI, and JMS. The middleware was designed as a collection of independent, hot-deployable web services, providing secure access to back-end file systems and databases. Throughout the middleware we enabled crosscutting capabilities such as runtime service configuration, security, logging, and remote monitoring. This paper presents our approach to mitigating the challenges we faced, concluding with a review of the lessons we learned from this project and noting what we would do differently and why.

  19. A Proposal for a Thesaurus for Web Services in Solar Radiation

    NASA Technical Reports Server (NTRS)

    Gschwind, Benoit; Menard, Lionel; Ranchin, Thierry; Wald, Lucien; Stackhouse, Paul W., Jr.

    2007-01-01

    Metadata are necessary to discover, describe and exchange any type of information, resource and service at a large scale. A significant amount of effort has been made in the fields of geography and the environment to establish standards. Efforts still remain to address more specific domains such as renewable energies. This communication focuses on solar energy, and more specifically on aspects of solar radiation that relate to geography and meteorology. A thesaurus is proposed for the key elements in solar radiation, namely time, space and radiation types. The importance of time series in solar radiation is outlined and the attributes of the key elements are discussed. An XML schema for encoding metadata is proposed. The exploitation of such a schema in web services is discussed. This proposal is a first attempt at establishing a thesaurus for describing data and applications in solar radiation.
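
    To make the idea concrete, a metadata record organized around the three key elements (time, space, radiation type) might be encoded as XML like this. The element and attribute names below are invented for the example; they are not the schema actually proposed in the communication.

```python
import xml.etree.ElementTree as ET

def make_record(station, lat, lon, start, end, quantity):
    """Encode a solar-radiation time-series description as XML metadata."""
    root = ET.Element("solarRecord")
    ET.SubElement(root, "space", latitude=str(lat), longitude=str(lon),
                  station=station)
    ET.SubElement(root, "time", start=start, end=end)
    ET.SubElement(root, "radiation", quantity=quantity)
    return ET.tostring(root, encoding="unicode")

xml_text = make_record("Carpentras", 44.08, 5.06,
                       "2005-01-01", "2005-12-31",
                       "global horizontal irradiance")
print(xml_text)
```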

  20. Grid enablement of OpenGeospatial Web Services: the G-OWS Working Group

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo

    2010-05-01

    In recent decades two main paradigms for resource sharing have emerged and reached maturity: the Web and the Grid. Both have proved suitable for building Distributed Computing Infrastructures (DCIs) supporting the coordinated sharing of resources (i.e. data, information, services, etc.) on the Internet. Grid and Web DCIs have much in common as a result of their underlying Internet technology (protocols, models and specifications). However, being based on different requirements and architectural approaches, they show some differences as well. The Web's "major goal was to be a shared information space through which people and machines could communicate" [Berners-Lee 1996]. The success of the Web, and its consequent pervasiveness, made it appealing for building specialized systems such as Spatial Data Infrastructures (SDIs). In these systems the introduction of Web-based geo-information technologies enables specialized services for geospatial data sharing and processing. The Grid was born to achieve "flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" [Foster 2001]. It specifically focuses on large-scale resource sharing, innovative applications, and, in some cases, high-performance orientation. In the Earth and Space Sciences (ESS), most of the information handled is geo-referenced (geo-information), since spatial and temporal meta-information is of primary importance in many application domains: Earth Sciences, Disasters Management, Environmental Sciences, etc. On the other hand, several application areas need to run complex models which require the large processing and storage capabilities that Grids are able to provide. Therefore the integration of geo-information and Grid technologies could be a valuable approach to enabling advanced ESS applications. Currently both geo-information and Grid technologies have reached a high level of maturity, making it possible to build such an

  1. QuakeSim: a Web Service Environment for Productive Investigations with Earth Surface Sensor Data

    NASA Astrophysics Data System (ADS)

    Parker, J. W.; Donnellan, A.; Granat, R. A.; Lyzenga, G. A.; Glasscoe, M. T.; McLeod, D.; Al-Ghanmi, R.; Pierce, M.; Fox, G.; Grant Ludwig, L.; Rundle, J. B.

    2011-12-01

    The QuakeSim science gateway environment includes a visually rich portal interface, web service access to data and data processing operations, and the QuakeTables ontology-based database of fault models and sensor data. The integrated tools and services are designed to assist investigators by covering the entire earthquake cycle of strain accumulation and release. The Web interface now includes Drupal-based access to diverse and changing content, with the new ability to access data and data processing directly from the public page, as well as the traditional project management areas that require password access. The system is designed to make initial browsing of fault models and deformation data particularly engaging for new users. Popular data and data processing include GPS time series with data mining techniques to find anomalies in time and space, experimental forecasting methods based on catalogue seismicity, faulted deformation models (both half-space and finite element), and model-based inversion of sensor data. The fault models include the CGS and UCERF 2.0 faults of California and are easily augmented with self-consistent fault models from other regions. The QuakeTables deformation data include the comprehensive set of UAVSAR interferograms as well as a growing collection of satellite InSAR data. Fault interaction simulations are also being incorporated in the web environment based on Virtual California. A sample usage scenario is presented which follows an investigation of UAVSAR data from viewing as an overlay in Google Maps, to selection of an area of interest via a polygon tool, to fast extraction of the relevant correlation and phase information from large data files, to a model inversion of fault slip followed by calculation and display of a synthetic model interferogram.

  2. The NORM technology connection web site : streamlined access to NORM-related service company and regulatory information.

    SciTech Connect

    Smith, K. P.; Richmond, P.; LePoire, D. J.; Arnish, J. J.; Johnson, R.

    2000-11-08

    Argonne National Laboratory has developed an Internet web site providing access to critical information needed to support decisions on the management and disposal of wastes containing naturally occurring radioactive material (NORM). The NORM Technology Connection web site provides current information on (1) service companies that provide support on NORM issues (e.g., site characterization and remediation, sample analysis, radiation safety training, disposal) and (2) existing applicable NORM regulations and guidelines. A third element of the site is an electronic mail list that allows users to post or respond to questions about the management of NORM. Development of the NORM Technology Connection web site was funded by the U.S. Department of Energy, Office of Fossil Energy. It is hosted and maintained by the Interstate Oil and Gas Compact Commission. The web site is publicly available; access is free, as is participation by any of the service companies.

  3. AWSCS-A System to Evaluate Different Approaches for the Automatic Composition and Execution of Web Services Flows

    PubMed Central

    Tardiole Kuehne, Bruno; Estrella, Julio Cezar; Nunes, Luiz Henrique; Martins de Oliveira, Edvard; Hideo Nakamura, Luis; Gomes Ferreira, Carlos Henrique; Carlucci Santana, Regina Helena; Reiff-Marganiec, Stephan; Santana, Marcos José

    2015-01-01

    This paper proposes a system named AWSCS (Automatic Web Service Composition System) to evaluate different approaches for the automatic composition of Web services, based on QoS parameters that are measured at execution time. AWSCS is a system to implement different approaches for automatic composition of Web services and also to execute the resulting flows from these approaches. To demonstrate the results of this paper, a scenario was developed in which empirical flows were built to demonstrate the operation of AWSCS, since algorithms for automatic composition are not readily available to test. The results allow us to study the behaviour of running composite Web services when flows with the same functionality but different problem-solving strategies are compared. Furthermore, we observed that the load applied on the running system, as well as the type of load submitted to the system, is an important factor in defining which approach to Web service composition can achieve the best performance in production. PMID:26068216
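
    The core selection step can be sketched as follows: given several candidate flows with the same functionality, pick the one whose QoS, measured at execution time, is best. The flow names, the timings, and the single-metric ranking (mean response time) are all invented for this sketch; AWSCS considers QoS parameters more generally.

```python
def best_flow(measurements):
    """measurements: {flow_name: [response times in ms measured under load]}.
    Return the flow with the lowest mean response time."""
    return min(measurements,
               key=lambda flow: sum(measurements[flow]) / len(measurements[flow]))

# Two hypothetical flows with identical functionality but different
# problem-solving strategies, timed under the same load.
measured = {
    "sequential_flow": [120.0, 130.0, 125.0],
    "parallel_flow":   [80.0, 95.0, 90.0],
}
print(best_flow(measured))  # -> parallel_flow
```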

  4. Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.

    2006-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data
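
    The "distributed computing flow (tree of operators)" can be illustrated with a minimal local evaluator: a flow is a nested tuple of operators, evaluated bottom-up. In SciFlo each operator would be dispatched to a remote SOAP/REST service or native executable; here the operators run locally, and their names (`subset`, `regrid`, `mean`) are invented for the sketch.

```python
def subset(data, lo, hi):
    """Keep values inside [lo, hi] (stand-in for spatial/temporal subsetting)."""
    return [x for x in data if lo <= x <= hi]

def regrid(data, factor):
    """Scale values (stand-in for re-gridding a physical parameter)."""
    return [x * factor for x in data]

def mean(data):
    return sum(data) / len(data)

def run_flow(node):
    """node = (operator, *children); a child is a literal or another node."""
    op, *args = node
    evaluated = [run_flow(a) if isinstance(a, tuple) else a for a in args]
    return op(*evaluated)

# mean(regrid(subset([1..6], 2, 5), 10.0))
flow = (mean, (regrid, (subset, [1, 2, 3, 4, 5, 6], 2, 5), 10.0))
print(run_flow(flow))  # -> 35.0
```

    A dataflow engine like SciFlo adds what this sketch omits: locating the operators, moving data between them, and optimizing execution of the tree.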

  5. Enhancing the Teaching of Digital Processing of Remote Sensing Image Course through Geospatial Web Processing Services

    NASA Astrophysics Data System (ADS)

    di, L.; Deng, M.

    2010-12-01

    Remote sensing (RS) is an essential method for collecting data for Earth science research. Huge amounts of remote sensing data, most of them in image form, have been acquired. Almost all geography departments in the world offer courses in digital processing of remote sensing images. Such courses place emphasis on how to digitally process large amounts of multi-source images to solve real-world problems. However, due to the diversity and complexity of RS images and the shortcomings of current data and processing infrastructure, obstacles to effectively teaching such courses still remain. The major obstacles include 1) difficulties in finding, accessing, integrating and using massive RS images by students and educators, and 2) inadequate processing functions and computing facilities for students to freely explore the massive data. Recent developments in geospatial Web processing service systems, which make massive data, computing power, and processing capabilities available to average Internet users anywhere in the world, promise the removal of these obstacles. The GeoBrain system developed by CSISS is an example of such systems. All functions available in the GRASS Open Source GIS have been implemented as Web services in GeoBrain. Petabytes of remote sensing images in NASA data centers, the USGS Landsat data archive, and NOAA CLASS are accessible transparently and processable through GeoBrain. The GeoBrain system is operated on a high-performance cluster server with large disk storage and a fast Internet connection. All GeoBrain capabilities can be accessed by any Internet-connected Web browser. Dozens of universities have used GeoBrain as an ideal platform to support data-intensive remote sensing education. This presentation gives a specific example of using GeoBrain geoprocessing services to enhance the teaching of GGS 588, Digital Remote Sensing, taught at the Department of Geography and Geoinformation Science, George Mason University.
The course uses the textbook "Introductory

  6. The Role of Student Affairs in Distance Education: Cyber-Services or Virtual Communities

    ERIC Educational Resources Information Center

    Kretovics, Mark

    2003-01-01

    As distance education technology enables institutions of higher education to offer courses to students throughout the country, it is important for student affairs to offer opportunities for these students to connect with the institution. This article reviews the relevant literature on distance education and discusses differences between providing…

  7. IRRIMET: a web 2.0 advisory service for irrigation water management

    NASA Astrophysics Data System (ADS)

    De Michele, Carlo; Anzano, Enrico; Colandrea, Marco; Marotta, Luigi; Mula, Ileana; Pelosi, Anna; D'Urso, Guido; Battista Chirico, Giovanni

    2016-04-01

    Irrigated agriculture is one of the biggest consumers of water in Europe, especially in southern regions, where it accounts for up to 70% of total water consumption. The EU Common Agricultural Policy, combined with the Water Framework Directive, requires farmers and irrigation managers to substantially increase the efficiency of agricultural water use over the next decade. Irrigating according to reliable estimates of crop water requirements is one of the most convincing ways to decrease agricultural water use. Here we present an innovative irrigation advisory service, applied in the Campania region (Southern Italy), where a satellite-assisted irrigation advisory service has been operating since 2006. The advisory service is based on the optimal combination of VIS-NIR high-resolution satellite images (Landsat, Deimos, Rapideye) to map crop vigour, and high-resolution numerical weather prediction for assessing the meteorological variables driving crop water needs in the short-medium range. The advisory service is delivered through a simple and intuitive web app interface which makes daily real-time irrigation and evapotranspiration maps and customized weather forecasts (based on the Cosmo Leps model) accessible from desktop computers, tablets and smartphones.

  8. Implementation of Web Processing Services (WPS) over IPSL Earth System Grid Federation (ESGF) node

    NASA Astrophysics Data System (ADS)

    Kadygrov, Nikolay; Denvil, Sebastien; Carenton, Nicolas; Levavasseur, Guillaume; Hempelmann, Nils; Ehbrecht, Carsten

    2016-04-01

    The Earth System Grid Federation (ESGF) aims to provide access to climate data for the international climate community. ESGF is a system of distributed and federated nodes that dynamically interact with each other. ESGF users may search and download climate data, geographically distributed over the world, from one common web interface and through a standardized API. With the continuous development of climate models and the beginning of the sixth phase of the Coupled Model Intercomparison Project (CMIP6), the amount of data available from ESGF will continue to increase over the next 5 years. IPSL holds replicas of the output of different global and regional climate models, observations and reanalysis data (CMIP5, CORDEX, obs4MIPs, etc.) that are available on the IPSL ESGF node. To let scientists perform analyses of the models without downloading vast amounts of data, Web Processing Services (WPS) were installed at the IPSL compute node. The work is part of the CONVERGENCE project funded by the French National Research Agency (ANR). The PyWPS implementation of the Web Processing Service standard from the Open Geospatial Consortium (OGC), in the framework of the birdhouse software, is used. The processes can be run remotely by users through a web-based WPS client or by using a command-line tool. All calculations are performed on the server side, close to the data. If the models/observations are not available at IPSL, they will be downloaded and cached by the WPS process from the ESGF network using the synda tool. The outputs of the WPS processes are available for download as plots, tar archives or NetCDF files. We present the architecture of WPS at IPSL along with processes for evaluation of model performance, on-site diagnostics and post-analysis processing of model output, e.g.: - regridding/interpolation/aggregation - ocgis (OpenClimateGIS) based polygon subsetting of the data - average seasonal cycle, multimodel mean, multimodel mean bias - calculation of the
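
    A client invokes such a process through an OGC WPS Execute request. As a sketch, the key-value-pair form of a WPS 1.0.0 Execute request can be assembled like this; the endpoint URL and the `subset_polygon` process identifier are placeholders, not the actual IPSL service.

```python
from urllib.parse import urlencode

def wps_execute_url(endpoint, identifier, inputs):
    """Build a WPS 1.0.0 Execute request URL in key-value-pair encoding."""
    # DataInputs are semicolon-separated key=value pairs per the KVP encoding.
    datainputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    query = urlencode({
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "DataInputs": datainputs,
    })
    return f"{endpoint}?{query}"

url = wps_execute_url("https://example.org/wps", "subset_polygon",
                      {"variable": "tas", "region": "europe"})
print(url)
```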

  9. Integration between solar and space science data for space weather forecast using web services

    NASA Astrophysics Data System (ADS)

    Kato, S.

    2007-08-01

    As technology develops, human activity in space increases, and it is well understood that solar activity (especially solar flares) affects airline communications, ship communications, the power generators of electric power companies, etc. Against this background, forecasting the effects of solar activity is becoming very important. Our goal is to construct a detailed model from the Sun to the Earth's magnetosphere and to simulate solar activity and its effects. As a basis for constructing this model, we try to integrate existing observational data, including ground-based and satellite observations, using web service technology. We introduce our activities to combine solar and space science data in Japan. Methods: Generally, it is difficult to develop a virtual common database, but web services make interconnection among different databases comparatively easy. We connect several databases in a portal site, where the different data objects are aggregated into a common data object, so that more complex services can be developed. We use RELAX NG in order to develop these applications easily. We have begun a trial of interconnecting solar and space science data in Japan. For solar observational data there are virtual observatory (VO) activities such as VSO and EGSO, but space science data seem to be much more complex. In addition, there is a time lag before solar activity has an effect on the Earth's magnetosphere. We discuss these characteristics in the analysis of solar and space data. This work was supported by the Grant-in-Aid for Creative Scientific Research `The Basic Study of Space Weather Prediction' (17GS0208) from the Ministry of Education, Science, Sports, Technology, and Culture of Japan
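
    The aggregation step described above, mapping records from heterogeneous databases into a common data object, can be sketched as a simple field renaming. The field names and sample values below are invented for illustration; a real deployment would derive the mappings from each database's schema.

```python
def to_common(record, mapping):
    """Rename source-specific fields to a common vocabulary.
    mapping: {common_field: source_field}."""
    return {common: record[src] for common, src in mapping.items() if src in record}

# Hypothetical records from a solar database and a space-science database.
solar = {"obs_time": "2007-08-01T00:00Z", "flux": 120.0}
space = {"epoch": "2007-08-01T00:05Z", "dst": -35}

common = [
    to_common(solar, {"time": "obs_time", "value": "flux"}),
    to_common(space, {"time": "epoch", "value": "dst"}),
]
print(common[0]["time"])  # both records now share the same field names
```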

  10. Distance learning on the Web supported by Javascript: a critical appraisal with examples from clay mineralogy and knowledge-based tests

    NASA Astrophysics Data System (ADS)

    Krumm, S.; Thum, I.

    1998-08-01

    The hypertext markup language (HTML) is used to create hypertext documents on the World Wide Web (WWW), which is built on a client/server model. In this paper we discuss the enhancement of HTML documents with JavaScript, a scripting language understood by most common browsers. JavaScript is considered an easy means of bringing interactivity and answer checking to educational Web pages. It is faster to learn than a programming language like PERL and has the advantage of high portability between different operating systems. Because all actions are performed on the client side, it reduces net traffic and pages can be used offline. Educational uses, including tests and operations in future distance learning, are outlined. Examples of JavaScript-supported documents are drawn from clay mineralogy and knowledge-based tests. A critical review of this relatively new technology reveals some compatibility problems, but these seem to be offset by the possibility of making Web pages more attractive.

  11. Increased demand for E-mail health consultation service: analysis of a Web survey.

    PubMed

    Klinar, Ivana; Balazin, Ana; Basić, Martina; Plantas, Igor; Biskupić, Kresimir

    2010-06-01

    The aim of the study was to explore the characteristics of the users of the interactive service "Your Questions", which is based on e-mail health consultations. We wanted to find out what motivated users to use it, whether they were satisfied with it, and what impact it had on their health behavior. We therefore developed a Web survey and invited 2,747 users to take part in it; 919 (33.5%) responded. Results showed that the majority of respondents were women (79.1%) and that most held at least a college degree (52.4%). The Service was mostly used for obtaining information about certain medical symptoms or medical conditions (50.1%), for a second opinion on a diagnosis (18.6%) and for more information about medical treatment (14.4%). In terms of Service features, it was used because of its convenience with regard to time (38.7%) and a sense of privacy (25.7%). Before posting a question to the Service, 93.2% of the respondents searched for health articles on the PLIVAzdravlje portal, while 90% of them read the Questions and Answers database. Over half of them (61.8%) posted their question after they had already visited their physicians about that particular issue. Nevertheless, 48% of them were encouraged to visit their physicians after they received the answer. The results show an important trend of increasing demand for e-mail health consultation and a need for reliable medical information, with one thousand questions submitted to the Service in the observed period of 40 days. If the source of medical information is reliable, as in the case of our Service and other forms of e-mail health consultation, it can have a positive impact on valuable physician-patient communication based on knowledge and mutual understanding.

  12. Evaluation of flood hazard maps in print and web mapping services as information tools in flood risk communication

    NASA Astrophysics Data System (ADS)

    Hagemeier-Klose, M.; Wagner, K.

    2009-04-01

    Flood risk communication with the general public and the population at risk is becoming increasingly important for flood risk management, especially as a precautionary measure. This is also underlined by the EU Flood Directive. The flood-related authorities therefore have to develop adjusted information tools which meet the demands of different user groups. This article presents a formative evaluation of flood hazard maps and web mapping services against the specific requirements and needs of the general public, using the dynamic-transactional approach as a theoretical framework. The evaluation was done by a mixture of methods: an analysis of existing tools, a creative workshop with experts and laymen, and an online survey. Currently existing flood hazard maps, web mapping services and web GIS still lack a good balance between simplicity and complexity, with adequate readability and usability for the public. Well-designed and associative maps (e.g. using blue colours for water depths) which can be compared with past local flood events and which can create empathy in viewers can help to raise awareness, heighten activity and knowledge levels, or lead to further information seeking. Concerning web mapping services, a linkage between general flood information, such as flood extents of different scenarios and corresponding water depths, and real-time information, such as gauge levels, is an important user demand. Gauge levels for these scenarios are easier to understand than the scientifically correct return periods or annualities. The recently developed Bavarian web mapping service tries to integrate these requirements.

  13. Building web service interfaces to geoscience data sets: EarthCube GeoWS project activities at the IRIS DMC

    NASA Astrophysics Data System (ADS)

    Trabant, C. M.; Ahern, T. K.; Stults, M.

    2015-12-01

    At the IRIS Data Management Center (DMC) we have been developing web service data access interfaces for our, primarily seismological, repositories for five years. These interfaces have become the primary access mechanisms for all data extraction from the DMC. For the last two years the DMC has been a principal participant in the GeoWS project, which aims to develop common web service interfaces for data access across hydrology, geodesy, seismology, marine geophysics, atmospheric and other geoscience disciplines. By extending our approach we have converged, along with other project members, on a web service interface and presentation design appropriate for geoscience and other data. The key principles of the approach include using a simple subset of RESTful concepts, common calling conventions whenever possible, a common tabular text data set convention, human-readable documentation and tools to help scientific end users learn how to use the interfaces. The common tabular text format, called GeoCSV, has been incorporated into the DMC's seismic station and event (earthquake) services. In addition to modifying our existing services, we have developed prototype GeoCSV web services for data sets managed by external (unfunded) collaborators. These prototype services include interfaces for data sets at NGDC/NCEI (water level tides and meteorological satellite measurements), INTERMAGNET repository and UTEP gravity and magnetic measurements. In progress are interfaces for WOVOdat (volcano observatory measurements), NEON (ecological observatory measurements) and more. An important goal of our work is to build interfaces usable by non-technologist end users. We find direct usability by researchers to be a major factor in cross-discipline data use, which itself is a key to solving complex research questions. 
In addition to data discovery and collection by end users, these interfaces provide a foundation upon which federated data access and brokering systems are already being
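
    The GeoCSV convention mentioned above pairs comment lines carrying dataset metadata with an ordinary CSV table, which makes it readable both by humans and by a few lines of code. The field names and values in this sample document are invented for illustration.

```python
import csv
import io

# A small GeoCSV-style document: "#" lines carry metadata, the rest is CSV.
GEOCSV = """\
# dataset: GeoCSV 2.0
# field_unit: ISO-8601, degrees, degrees
time,latitude,longitude
2015-01-01T00:00:00Z,46.8,-121.7
2015-01-02T00:00:00Z,46.9,-121.6
"""

def read_geocsv(text):
    """Return (metadata_lines, rows) from a GeoCSV document."""
    lines = text.splitlines()
    meta = [l[1:].strip() for l in lines if l.startswith("#")]
    body = "\n".join(l for l in lines if not l.startswith("#"))
    rows = list(csv.DictReader(io.StringIO(body)))
    return meta, rows

meta, rows = read_geocsv(GEOCSV)
print(len(rows), rows[0]["latitude"])  # -> 2 46.8
```

    Because the metadata lines are ordinary comments, any stock CSV tool that skips "#" lines can still read the table, which is part of what makes the format usable by non-technologist end users.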

  14. Provision of Distance Learning Services over Interactive Digital TV with MHP

    ERIC Educational Resources Information Center

    Pazos-Arias, Jose J.; Lopez-Nores, Martin; Garcia-Duque, Jorge; Diaz-Redondo, Rebeca P.; Blanco-Fernandez, Yolanda; Ramos-Cabrer, Manuel; Gil-Solla, Alberto; Fernandez-Vilas, Ana

    2008-01-01

    E-learning technologies have developed greatly in recent years, with considerable success. However, there is increasing evidence that web-based learning is not reaching the social sectors which are more reluctant to contact with the new technologies, thus leading to inequalities in the access to education and knowledge in the Information Society.…

  15. Food-web structure and ecosystem services: insights from the Serengeti.

    PubMed

    Dobson, Andy

    2009-06-27

    is likely to be central to the stability of the whole web. If the Serengeti is to be successfully conserved as a fully functioning ecosystem, then it is essential that the full diversity of natural habitats be maintained within the greater Serengeti ecosystem. The best way to do this is by controlling the external forces that threaten the boundaries of the ecosystem and by balancing the economic services the park provides between local, national and international needs. I conclude by discussing how the ecosystem services provided by the Serengeti are driven by species on different trophic levels. Tourism provides the largest financial revenue to the national economy, but it could be better organized to provide more sustained revenue to the park. Ultimately, ecotourism needs to be developed in ways that take lessons from the structure of the Serengeti food webs, and in ways that provide tangible benefits to people living around the park while also improving the experience of all visitors.

  17. The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications

    PubMed Central

    2011-01-01

    Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP

  18. GALPROP WebRun: An internet-based service for calculating galactic cosmic ray propagation and associated photon emissions

    NASA Astrophysics Data System (ADS)

    Vladimirov, A. E.; Digel, S. W.; Jóhannesson, G.; Michelson, P. F.; Moskalenko, I. V.; Nolan, P. L.; Orlando, E.; Porter, T. A.; Strong, A. W.

    2011-05-01

    GALPROP is a numerical code for calculating the galactic propagation of relativistic charged particles and the diffuse emissions produced during their propagation. The code incorporates as much realistic astrophysical input as possible together with the latest theoretical developments, and has become a de facto standard in cosmic-ray astrophysics. We present GALPROP WebRun, a service to the scientific community enabling easy use of the freely available GALPROP code via web browsers. In addition, we introduce the latest GALPROP version 54, available through this service.

  19. Who Are the Young People Choosing Web-based Mental Health Support? Findings From the Implementation of Australia's National Web-based Youth Mental Health Service, eheadspace

    PubMed Central

    Rickwood, Debra; Webb, Marianne; Telford, Nic

    2016-01-01

    Background The adolescent and early adult years are periods of peak prevalence and incidence for most mental disorders. Despite the rapid expansion of Web-based mental health care, and increasing evidence of its effectiveness, there is little research investigating the characteristics of young people who access Web-based mental health care. headspace, Australia’s national youth mental health foundation, is ideally placed to explore differences between young people who seek Web-based mental health care and in-person mental health care as it offers both service modes for young people, and collects corresponding data from each service type. Objective The objective of this study was to provide a comprehensive profile of young people seeking Web-based mental health care through eheadspace (the headspace Web-based counseling platform), and to compare this with the profile of those accessing help in-person through a headspace center. Methods Demographic and clinical presentation data were collected from all eheadspace clients aged 12 to 25 years (the headspace target age range) who received their first counseling session between November 1, 2014 and April 30, 2015 via online chat or email (n=3414). These Web-based clients were compared with all headspace clients aged 12 to 25 who received their first center-based counseling service between October 1, 2014 and March 31, 2015 (n=20,015). Results More eheadspace than headspace center clients were female (78.1% compared with 59.1%), and they tended to be older. A higher percentage of eheadspace clients presented with high or very high levels of psychological distress (86.6% compared with 73.2%), but they were at an earlier stage of illness on other indicators of clinical presentation compared with center clients. Conclusions The findings of this study suggest that eheadspace is reaching a unique client group who may not otherwise seek help or who might wait longer before seeking help if in-person mental health support was

  20. GEO Label Web Services for Dynamic and Effective Communication of Geospatial Metadata Quality

    NASA Astrophysics Data System (ADS)

    Lush, Victoria; Nüst, Daniel; Bastin, Lucy; Masó, Joan; Lumsden, Jo

    2014-05-01

We present demonstrations of the GEO label Web services and their integration into a prototype extension of the GEOSS portal (http://scgeoviqua.sapienzaconsulting.com/web/guest/geo_home), the GMU portal (http://gis.csiss.gmu.edu/GADMFS/) and a GeoNetwork catalog application (http://uncertdata.aston.ac.uk:8080/geonetwork/srv/eng/main.home). The GEO label is designed to communicate, and facilitate interrogation of, geospatial quality information with a view to supporting efficient and effective dataset selection on the basis of quality, trustworthiness and fitness for use. The GEO label which we propose was developed and evaluated according to a user-centred design (UCD) approach in order to maximise the likelihood of user acceptance once deployed. The resulting label is dynamically generated from producer metadata in ISO or FGDC format, and incorporates user feedback on dataset usage, ratings and discovered issues, in order to supply a highly informative summary of metadata completeness and quality. The label was easily incorporated into a community portal as part of the GEO Architecture Implementation Programme (AIP-6) and has been successfully integrated into a prototype extension of the GEOSS portal, as well as the popular metadata catalog and editor, GeoNetwork. The design of the GEO label was based on four user studies conducted to: (1) elicit initial user requirements; (2) investigate initial user views on the concept of a GEO label and its potential role; (3) evaluate prototype label visualizations; and (4) evaluate and validate physical GEO label prototypes. The results of these studies indicated that users and producers support the concept of a label with drill-down interrogation facility, combining eight geospatial data informational aspects, namely: producer profile, producer comments, lineage information, standards compliance, quality information, user feedback, expert reviews, and citations information. These are delivered as eight facets of a wheel

  1. Web-Based Distance Learning: Substitute or Alternative to the Traditional Classroom--Making the Delivery Method Decision

    ERIC Educational Resources Information Center

    Hunt, David Marshall

    2005-01-01

    When a distance learning program administrator makes the critical choice of delivery methods, she/he needs to consider factors such as program developer centrism, international experience, cultural similarity, and desired level of control which will all be elaborated on in this article. The aim of this manuscript is to assist international…

  2. Are Accessible Distance Learning Systems Useful for All Students?: Our Experience with IMES, an Accessible Web-Based Learning System

    ERIC Educational Resources Information Center

    Iglesias, Ana; Moreno, Lourdes; Castro, Elena; Cuadra, Dolores

    2014-01-01

    Nowadays the use of distance learning systems is widely extended in engineering education. Moreover, most of them use multimedia resources that sometimes are the only educational material available to provide certain educational knowledge to the students. Unfortunately, most of the current educational systems and their educational content present…

  3. Are Accessible Distance Learning Systems Useful for All Students?: Our Experience with IMES, an Accessible Web-Based Learning System

    ERIC Educational Resources Information Center

    Iglesias, Ana; Moreno, Lourdes; Cuadra, Dolores; Castro, Elena

    2013-01-01

    Nowadays the use of distance learning systems is widely extended in engineering education. Moreover, most of them use multimedia resources that sometimes are the only educational material available to provide certain educational knowledge to the students. Unfortunately, most of the current educational systems and their educational content present…

  4. Web Services for Astronomical Databases: Connecting AIPS++ to the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Douthit, M. C.

    2002-12-01

In the year 2010, the NRAO will be operating four of the world's most powerful radio telescopes: GBT, EVLA, VLBA, and ALMA (with international partnership). Multi-terabyte data sets will quickly accumulate, with ALMA and the EVLA each generating twenty-five to fifty megabytes of data per second. It will be imperative for scientists to possess software capable of automated data reduction, image synthesis, and archiving. With the evolution of AIPS++ and the recently developed concepts of the image pipeline, participation of the NRAO in the virtual observatories of the future is now on the horizon, creating the need for fast archive access and web service development in AIPS++. When the software package began over 10 years ago, it was not designed for data transfer via the web. In response to the demands of the NVO, we have designed and implemented an application layer that will allow our system to communicate with others. Sponsored by the NRAO and California State University, San Marcos.

  5. Exploring Multidisciplinary Data Sets through Database Driven Search Capabilities and Map-Based Web Services

    NASA Astrophysics Data System (ADS)

    O'Hara, S.; Ferrini, V.; Arko, R.; Carbotte, S. M.; Leung, A.; Bonczkowski, J.; Goodwillie, A.; Ryan, W. B.; Melkonian, A. K.

    2008-12-01

Relational databases containing geospatially referenced data enable the construction of robust data access pathways that can be customized to suit the needs of a diverse user community. Web-based search capabilities driven by radio buttons and pull-down menus can be generated on-the-fly, leveraging the power of the relational database and providing specialists a means of discovering specific data and data sets. While these data access pathways are sufficient for many scientists, map-based data exploration can also be an effective means of data discovery and integration by allowing users to rapidly assess the spatial co-registration of several data types. We present a summary of data access tools currently provided by the Marine Geoscience Data System (www.marine-geo.org) that are intended to serve a diverse community of users and promote data integration. Basic search capabilities allow users to discover data based on data type, device type, geographic region, research program, expedition parameters, personnel and references. In addition, web services are used to create database-driven map interfaces that provide live access to metadata and data files.

  6. A web-based library consult service for evidence-based medicine: Technical development

    PubMed Central

    Schwartz, Alan; Millam, Gregory

    2006-01-01

    Background Incorporating evidence based medicine (EBM) into clinical practice requires clinicians to learn to efficiently gain access to clinical evidence and effectively appraise its validity. Even using current electronic systems, selecting literature-based data to solve a single patient-related problem can require more time than practicing physicians or residents can spare. Clinical librarians, as informationists, are uniquely suited to assist physicians in this endeavor. Results To improve support for evidence-based practice, we have developed a web-based EBM library consult service application (LCS). Librarians use the LCS system to provide full text evidence-based literature with critical appraisal in response to a clinical question asked by a remote physician. LCS uses an entirely Free/Open Source Software platform and will be released under a Free Software license. In the first year of the LCS project, the software was successfully developed and a reference implementation put into active use. Two years of evaluation of the clinical, educational, and attitudinal impact on physician-users and librarian staff are underway, and expected to lead to refinement and wide dissemination of the system. Conclusion A web-based EBM library consult model may provide a useful way for informationists to assist clinicians, and is feasible to implement. PMID:16542453

  7. Coastal Ocean Observing Network - Open Source Architecture for Data Management and Web-Based Data Services

    NASA Astrophysics Data System (ADS)

    Pattabhi Rama Rao, E.; Venkat Shesu, R.; Udaya Bhaskar, T. V. S.

    2012-07-01

The observations from the oceans are the backbone for any kind of operational services, viz. potential fishing zone advisory services, ocean state forecast, storm surges, cyclones, monsoon variability, tsunami, etc. Though it is important to monitor the open ocean, it is equally important to acquire sufficient data in the coastal ocean through coastal ocean observing systems for re-analysis, analysis and forecast of the coastal ocean by assimilating different ocean variables, especially sub-surface information; for validation of remote sensing data and ocean and atmosphere models/analyses; and to understand the processes related to air-sea interaction and ocean physics. Accurate information and forecasts of the state of the coastal ocean at different time scales are vital for the wellbeing of the coastal population as well as for the socio-economic development of the country through shipping, offshore oil and energy, etc. Considering the importance of ocean observations in terms of understanding our ocean environment and utilizing them for operational oceanography, a large number of platforms were deployed in the Indian Ocean, including coastal observatories, to acquire data on ocean variables in and around the Indian Seas. The coastal observation network includes HF radars, wave rider buoys, sea level gauges, etc. The surface meteorological and oceanographic data generated by these observing networks are being translated into ocean information services through analysis and modelling. A centralized data management system is a critical component in providing timely delivery of ocean information and advisory services. In this paper, we describe the development of an open-source architecture for real-time data reception from the coastal observation network, processing, quality control, database generation and web-based data services that include on-line data visualization and data downloads by various means.
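The abstract above lists quality control as one stage of the real-time processing chain but gives no details. A minimal sketch of what a range-based QC check might look like; the variable names and physical limits below are invented for illustration, not taken from the described system:

```python
# Hypothetical range-based quality-control check of the kind applied to
# incoming coastal observations before database loading. Limits are
# illustrative assumptions, not the operational thresholds.
def qc_flag(variable: str, value: float) -> str:
    """Flag an observation as 'good' or 'suspect' by plausible-range test."""
    limits = {
        "sst": (-2.0, 40.0),         # sea surface temperature, deg C
        "wave_height": (0.0, 25.0),  # significant wave height, m
    }
    lo, hi = limits.get(variable, (float("-inf"), float("inf")))
    return "good" if lo <= value <= hi else "suspect"
```

Operational systems typically layer several such tests (range, spike, stuck-value) and record a flag per test rather than a single verdict.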

  8. Development of Web Mapping Service Capabilities to Support NASA Disasters Applications / App Development

    NASA Technical Reports Server (NTRS)

    Burks, Jason E.; Molthan, Andrew L.; McGrath, Kevin M.

    2014-01-01

During the last year several significant disasters have occurred, such as Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines, along with several others. In support of these disasters NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery of the affected areas. To better support the decision makers responding to the disasters SPoRT quickly developed several solutions to provide the data using open Geographical Information Service (GIS) formats. Providing the data in open GIS standard formats allowed the end user to easily integrate the data into existing Decision Support Systems (DSS). Both Tile Mapping Service (TMS) and Web Mapping Service (WMS) were leveraged to quickly provide the data to the end-user. Development of the delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.

  9. Development of Web Mapping Service Capabilities to Support NASA Disasters Applications/App Development

    NASA Technical Reports Server (NTRS)

    Burks, Jason E.; Molthan, Andrew L.; McGrath, Kevin M.

    2014-01-01

During the last year several significant disasters have occurred, such as Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines, along with several others. In support of these disasters NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery of the affected areas. To better support the decision makers responding to the disasters SPoRT quickly developed several solutions to provide the data using open Geographical Information Service (GIS) formats. Providing the data in open GIS standard formats allowed the end user to easily integrate the data into existing Decision Support Systems (DSS). Both Tile Mapping Service (TMS) and Web Mapping Service (WMS) were leveraged to quickly provide the data to the end-user. Development of the delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.

  10. Semantic querying of relational data for clinical intelligence: a semantic web services-based approach

    PubMed Central

    2013-01-01

    Background Clinical Intelligence, as a research and engineering discipline, is dedicated to the development of tools for data analysis for the purposes of clinical research, surveillance, and effective health care management. Self-service ad hoc querying of clinical data is one desirable type of functionality. Since most of the data are currently stored in relational or similar form, ad hoc querying is problematic as it requires specialised technical skills and the knowledge of particular data schemas. Results A possible solution is semantic querying where the user formulates queries in terms of domain ontologies that are much easier to navigate and comprehend than data schemas. In this article, we are exploring the possibility of using SADI Semantic Web services for semantic querying of clinical data. We have developed a prototype of a semantic querying infrastructure for the surveillance of, and research on, hospital-acquired infections. Conclusions Our results suggest that SADI can support ad-hoc, self-service, semantic queries of relational data in a Clinical Intelligence context. The use of SADI compares favourably with approaches based on declarative semantic mappings from data schemas to ontologies, such as query rewriting and RDFizing by materialisation, because it can easily cope with situations when (i) some computation is required to turn relational data into RDF or OWL, e.g., to implement temporal reasoning, or (ii) integration with external data sources is necessary. PMID:23497556

  11. Detection and Prevention of Insider Threats in Database Driven Web Services

    NASA Astrophysics Data System (ADS)

    Chumash, Tzvi; Yao, Danfeng

    In this paper, we take the first step to address the gap between the security needs in outsourced hosting services and the protection provided in the current practice. We consider both insider and outsider attacks in the third-party web hosting scenarios. We present SafeWS, a modular solution that is inserted between server side scripts and databases in order to prevent and detect website hijacking and unauthorized access to stored data. To achieve the required security, SafeWS utilizes a combination of lightweight cryptographic integrity and encryption tools, software engineering techniques, and security data management principles. We also describe our implementation of SafeWS and its evaluation. The performance analysis of our prototype shows the overhead introduced by security verification is small. SafeWS will allow business owners to significantly reduce the security risks and vulnerabilities of outsourcing their sensitive customer data to third-party providers.
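The abstract mentions "lightweight cryptographic integrity and encryption tools" without detailing them. One common building block for detecting unauthorized modification of stored rows is a keyed MAC computed over each record; the sketch below illustrates that general technique with invented field names, and is not the actual SafeWS implementation:

```python
# Illustrative integrity tagging for database rows: a keyed MAC is stored
# alongside each record so later reads can detect tampering. This is a
# generic sketch of the technique, not SafeWS itself.
import hmac
import hashlib

SECRET = b"per-deployment-key"  # in practice, kept outside the database

def seal(record: dict) -> str:
    """Compute an integrity tag over a record's canonical serialization."""
    msg = "|".join(f"{k}={record[k]}" for k in sorted(record)).encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def verify(record: dict, tag: str) -> bool:
    """Recompute the tag and compare in constant time to detect tampering."""
    return hmac.compare_digest(seal(record), tag)
```

An insider who alters a row without knowing the key cannot produce a matching tag, so the modification is detected on the next verified read.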

  12. Emotion-Bracelet: A Web Service for Expressing Emotions through an Electronic Interface.

    PubMed

    Martinez, Alicia; Estrada, Hugo; Molina, Alejandra; Mejia, Manuel; Perez, Joaquin

    2016-11-24

The mechanisms to communicate emotions have dramatically changed in the last 10 years with social networks, where users massively communicate their emotional states by using the Internet. However, people with socialization problems have difficulty expressing their emotions verbally or interpreting the environment and providing an appropriate emotional response. In this paper, a novel solution called the Emotion-Bracelet is presented that combines a hardware device and a software system. The proposed approach identifies the polarity and emotional intensity of texts published on a social network site by performing real-time processing using a web service. It also shows emotions with an LED matrix using five emoticons that represent positive, very positive, negative, very negative, and neutral states. The Emotion-Bracelet is designed to help people express their emotions in a non-intrusive way, thereby expanding the social aspect of human emotions.

  13. Emotion-Bracelet: A Web Service for Expressing Emotions through an Electronic Interface

    PubMed Central

    Martinez, Alicia; Estrada, Hugo; Molina, Alejandra; Mejia, Manuel; Perez, Joaquin

    2016-01-01

The mechanisms to communicate emotions have dramatically changed in the last 10 years with social networks, where users massively communicate their emotional states by using the Internet. However, people with socialization problems have difficulty expressing their emotions verbally or interpreting the environment and providing an appropriate emotional response. In this paper, a novel solution called the Emotion-Bracelet is presented that combines a hardware device and a software system. The proposed approach identifies the polarity and emotional intensity of texts published on a social network site by performing real-time processing using a web service. It also shows emotions with an LED matrix using five emoticons that represent positive, very positive, negative, very negative, and neutral states. The Emotion-Bracelet is designed to help people express their emotions in a non-intrusive way, thereby expanding the social aspect of human emotions. PMID:27886130
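The mapping from a text's polarity score to the five emoticon states the bracelet displays could be as simple as thresholding; the cutoff values below are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch: classify a sentiment polarity score in [-1, 1] into
# the five display states the Emotion-Bracelet's LED matrix supports.
# Threshold values are invented for the example.
def emotion_state(polarity: float) -> str:
    """Map a polarity score to one of five emoticon states."""
    if polarity >= 0.6:
        return "very positive"
    if polarity >= 0.2:
        return "positive"
    if polarity <= -0.6:
        return "very negative"
    if polarity <= -0.2:
        return "negative"
    return "neutral"
```

In the described system this classification would run server-side in the web service, with only the resulting state sent to the bracelet hardware.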

  14. Protein function prediction and annotation in an integrated environment powered by web services (AFAWE).

    PubMed

    Jöcker, Anika; Hoffmann, Fabian; Groscurth, Andreas; Schoof, Heiko

    2008-10-15

    Many sequenced genes are mainly annotated through automatic transfer of annotation from similar sequences. Manual comparison of results or intermediate results from different tools can help avoid wrong annotations and give hints to the function of a gene even if none of the automated tools could return any result. AFAWE simplifies the task of manual functional annotation by running different tools and workflows for automatic function prediction and displaying the results in a way that facilitates comparison. Because all programs are executed as web services, AFAWE is easily extensible and can directly query primary databases, thereby always using the most up-to-date data sources. Visual filters help to distinguish trustworthy results from non-significant results. Furthermore, an interface to add detailed manual annotation to each gene is provided, which can be displayed to other users.

  15. Using the RxNorm web services API for quality assurance purposes.

    PubMed

    Peters, Lee; Bodenreider, Olivier

    2008-11-06

    Auditing large, rapidly evolving terminological systems is still a challenge. In the case of RxNorm, a standardized nomenclature for clinical drugs, we argue that quality assurance processes can benefit from the recently released application programming interface (API) provided by RxNav. We demonstrate the usefulness of the API by performing a systematic comparison of alternative paths in the RxNorm graph, over several thousands of drug entities. This study revealed potential errors in RxNorm, currently under review. The results also prompted us to modify the implementation of RxNav to navigate the RxNorm graph more accurately. The RxNav web services API used in this experiment is robust and fast.
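RxNav also exposes a REST interface for resolving drug names to RxNorm concept identifiers (RxCUIs); the endpoint pattern below follows RxNav's published documentation but should be verified against the current API reference before use. The sketch only builds the request URL, without issuing the network call:

```python
# Sketch of composing an RxNav REST request that resolves a drug name to
# its RxNorm concept identifier (RxCUI). The endpoint path is taken from
# RxNav's public documentation; verify against the current API reference.
from urllib.parse import urlencode

RXNAV_BASE = "https://rxnav.nlm.nih.gov/REST"

def rxcui_lookup_url(drug_name: str) -> str:
    """Build the URL for a name-to-RxCUI lookup."""
    return f"{RXNAV_BASE}/rxcui.json?{urlencode({'name': drug_name})}"
```

A quality-assurance script of the kind the paper describes would issue thousands of such requests and compare the concept graphs reached by alternative navigation paths.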

  16. Integrating Socioeconomic and Earth Science Data Using Geobrowsers and Web Services: A Demonstration

    NASA Astrophysics Data System (ADS)

    Schumacher, J. A.; Yetman, G. G.

    2007-12-01

    The societal benefit areas identified as the focus for the Global Earth Observing System of Systems (GEOSS) 10- year implementation plan are an indicator of the importance of integrating socioeconomic data with earth science data to support decision makers. To aid this integration, CIESIN is delivering its global and U.S. demographic data to commercial and open source Geobrowsers and providing open standards based services for data access. Currently, data on population distribution, poverty, and detailed census data for the U.S. are available for visualization and access in Google Earth, NASA World Wind, and a browser-based 2-dimensional mapping client. The mapping client allows for the creation of web map documents that pull together layers from distributed servers and can be saved and shared. Visualization tools with Geobrowsers, user-driven map creation and sharing via browser-based clients, and a prototype for characterizing populations at risk to predicted precipitation deficits will be demonstrated.

  17. Expedition Memory: Towards Agent-based Web Services for Creating and Using Mars Exploration Data.

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten; Briggs, Geoff; Sims, Mike

    2005-01-01

Explorers ranging over kilometers of rugged, sometimes "feature-less" terrain for over a year could be overwhelmed by tracking and sharing what they have done and learned. An automated system based on the existing Mobile Agents design [1] and Mars Exploration Rover experience [2] could serve as an "expedition memory" that would be indexed by voice as well as a web interface, linking people, places, activities, records (voice notes, photographs, samples), and a descriptive scientific ontology. This database would be accessible during EVAs by astronauts, annotated by the remote science team, linked to EVA plans, and allow cross indexing between sites and expeditions. We consider the basic problem, our philosophical approach, technical methods, and uses of the expedition memory for facilitating long-term collaboration between Mars crews and Earth support teams. We emphasize that a "memory" does not mean a database per se, but an interactive service that combines different resources, and ultimately could be like a helpful librarian.

  18. Development of a Web Service for Analysis in a Distributed Network

    PubMed Central

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

Objective: We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. Background: We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. Methods: We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as proof of concept, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support cross-platform deployment. Discussion: During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Conclusion: Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among
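The core GLORE idea, aggregating partial estimates rather than patient-level records, can be illustrated with a single distributed gradient computation for logistic regression. This is a deliberately simplified sketch: the published GLORE method iterates Newton-Raphson updates using per-site gradients and Hessians, whereas here only one gradient aggregation step is shown:

```python
# Simplified illustration of distributed logistic regression in the GLORE
# spirit: each site computes an aggregate statistic (a gradient) on its own
# data and sends only that to the coordinator; no patient rows are shared.
import math

def local_gradient(X, y, beta):
    """Per-site logistic-regression log-likelihood gradient on local data."""
    p = len(beta)
    grad = [0.0] * p
    for xi, yi in zip(X, y):
        z = sum(b * x for b, x in zip(beta, xi))
        mu = 1.0 / (1.0 + math.exp(-z))   # predicted probability
        for j in range(p):
            grad[j] += (yi - mu) * xi[j]
    return grad

def aggregate(gradients):
    """Coordinator-side element-wise sum of per-site gradients."""
    return [sum(g[j] for g in gradients) for j in range(len(gradients[0]))]
```

Because the log-likelihood of the pooled data decomposes as a sum over sites, the aggregated gradient equals the gradient that would have been computed on centralized data, which is what makes the privacy-preserving decomposition exact.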

  19. Recent innovations in using Web Map Services to display gridded and non-gridded ocean data

    NASA Astrophysics Data System (ADS)

    Griffiths, Guy; Blower, Jon; López, Alejandro; Polo, Isabel; Romero, Laia; Loubrieu, Thomas; Brégent, Sophie

    2014-05-01

The University of Reading has developed techniques for fast visualisation of gridded data, such that they can be used in a WMS (web-map server) system. The aim is to provide data visualisation which is quick enough to be used interactively (e.g. on a website) even with very large underlying datasets. The two main tools which have come out of this effort are ncWMS (a WMS server for displaying NetCDF data) and its accompanying web client, Godiva2. This software is very widely used by oceanographic (and other) institutions and this presentation describes some of the most recent advances, together with plans for the future. For the MyOcean View Service, the University of Reading has extended ncWMS to allow it to display in situ measurement data. This accesses a supporting system, Oceanotron, created by Ifremer, which performs spatial indexing to retrieve observations from a database. By incorporating Oceanotron into the widely-used ncWMS software, map images of such point data can be accessed in a manner consistent with open standards. Intelligent grouping of variables combined with use of the WMS standard GetFeatureInfo request allows the display of in-situ measurements in a way that makes it simple to investigate the parameters required, when each single point may contain a lot of information. By providing various request parameters, the vertical or time dimensions of the data can be selected in a straightforward manner. Combined with extensions to Godiva, this allows for in-situ (e.g. buoy) data to be easily visualised and explored in a web browser, alongside other data sources such as model data. Using these tools, Altamira has developed extended functionalities for ocean data visualization in operational portals. These include the visualization of multiple layers of data simultaneously, integration with authentication and authorization systems (in order to display different data depending on user rights, a key requirement for many operational systems) and
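Clients such as Godiva2 retrieve map images through standard OGC WMS GetMap requests. A sketch of composing such a request follows the WMS 1.3.0 parameter names; the server URL and layer name are placeholders, and the TIME parameter shows how the time dimension mentioned above is selected:

```python
# Sketch of an OGC WMS 1.3.0 GetMap request of the kind a web client like
# Godiva2 issues against an ncWMS server. Server and layer names are
# placeholders for the example.
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width=512, height=512, time=None):
    """Compose a WMS 1.3.0 GetMap URL for a single layer."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "CRS:84",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    if time is not None:
        params["TIME"] = time  # WMS time dimension, ISO 8601 string
    return base + "?" + urlencode(params)
```

The same pattern extends to GetFeatureInfo (the request the abstract mentions for probing in-situ points) by swapping the REQUEST value and adding query-pixel parameters.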

  20. An automated and integrated framework for dust storm detection based on ogc web processing services

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with the scientific computation advancement, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models and EO data together to enable the dust storm detection and tracking processes in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Services (WPS) initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detection and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detection and tracking component combines three Earth science models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating the dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 Apr 2012, is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data