Sample records for the search term "web tool called"

  1. Web 2.0 in Computer-Assisted Language Learning: A Research Synthesis and Implications for Instructional Design and Educational Practice

    ERIC Educational Resources Information Center

    Parmaxi, Antigoni; Zaphiris, Panayiotis

    2017-01-01

    This study explores the research development pertaining to the use of Web 2.0 technologies in the field of Computer-Assisted Language Learning (CALL). Published research manuscripts related to the use of Web 2.0 tools in CALL have been explored, and the following research foci have been determined: (1) Web 2.0 tools that dominate second/foreign…

  2. Teaching Web Security Using Portable Virtual Labs

    ERIC Educational Resources Information Center

    Chen, Li-Chiou; Tao, Lixin

    2012-01-01

    We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…

  3. 77 FR 72830 - Request for Comments on Request for Continued Examination (RCE) Practice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-06

    ... the submission of written comments using a Web-based collaboration tool called IdeaScale®; and... collaboration tool called IdeaScale®. The tool allows users to post comments on a topic, and view and...

  4. Mining Hidden Gems Beneath the Surface: A Look At the Invisible Web.

    ERIC Educational Resources Information Center

    Carlson, Randal D.; Repman, Judi

    2002-01-01

    Describes resources for researchers called the Invisible Web that are hidden from the usual search engines and other tools and contrasts them with those resources available on the surface Web. Identifies specialized search tools, databases, and strategies that can be used to locate credible in-depth information. (Author/LRW)

  5. Web-Based Course Delivery and Administration Using Scheme.

    ERIC Educational Resources Information Center

    Salustri, Filippo A.

    This paper discusses the use at the University of Windsor (Ontario) of a small World Wide Web-based tool for course delivery and administration called HAL (HTML-based Administrative Lackey), written in the Scheme programming language. This tool was developed by the author to provide Web-based services for a large first-year undergraduate course in…

  6. Biotool2Web: creating simple Web interfaces for bioinformatics applications.

    PubMed

    Shahid, Mohammad; Alam, Intikhab; Fuellen, Georg

    2006-01-01

    Currently there are many bioinformatics applications being developed, but there is no easy way to publish them on the World Wide Web. We have developed a Perl script, called Biotool2Web, which makes the task of creating web interfaces for simple ('home-made') bioinformatics applications quick and easy. Biotool2Web uses an XML document containing the parameters to run the tool on the Web, and generates the corresponding HTML and common gateway interface (CGI) files ready to be published on a web server. This tool is available for download at http://www.uni-muenster.de/Bioinformatics/services/biotool2web/ (contact: Georg Fuellen, fuellen@alum.mit.edu).
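
The record above describes generating a web interface from an XML parameter description. A minimal sketch of that idea in Python (the actual Biotool2Web is a Perl script; the XML element and attribute names below are hypothetical, not its real schema):

```python
# Hypothetical sketch: turn a small XML parameter spec into an HTML form
# that would submit to a CGI wrapper, in the spirit of Biotool2Web.
import xml.etree.ElementTree as ET

SPEC = """
<tool name="blast_wrapper">
  <param name="sequence" label="Input sequence" type="textarea"/>
  <param name="evalue" label="E-value cutoff" type="text"/>
</tool>
"""

def xml_to_form(spec_xml: str) -> str:
    tool = ET.fromstring(spec_xml)
    rows = []
    for p in tool.findall("param"):
        if p.get("type") == "textarea":
            field = f'<textarea name="{p.get("name")}"></textarea>'
        else:
            field = f'<input type="text" name="{p.get("name")}">'
        rows.append(f'<label>{p.get("label")}: {field}</label>')
    body = "\n".join(rows)
    return (f'<form action="/cgi-bin/{tool.get("name")}.cgi" method="post">\n'
            f'{body}\n<input type="submit" value="Run"></form>')

print(xml_to_form(SPEC))
```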

  7. WEBCAP: Web Scheduler for Distance Learning Multimedia Documents with Web Workload Considerations

    ERIC Educational Resources Information Center

    Habib, Sami; Safar, Maytham

    2008-01-01

    In many web applications, such as distance learning, the frequency of refreshing multimedia web documents places a heavy burden on WWW resources. Moreover, the updated web documents may encounter inordinate delays, which make it difficult to retrieve web documents in time. Here, we present an Internet tool called WEBCAP that can schedule…

  8. Web Development Simplified

    ERIC Educational Resources Information Center

    Becker, Bernd W.

    2010-01-01

    The author has discussed the Multimedia Educational Resource for Teaching and Online Learning site, MERLOT, in a recent Electronic Roundup column. In this article, he discusses an entirely new Web page development tool that MERLOT has added for its members. The new tool is called the MERLOT Content Builder and is directly integrated into the…

  9. Introduction to the Watershed Central Web Site and Watershed Wiki Mini-Workshop

    EPA Science Inventory

    Many communities across the country struggle to find the right approaches, tools and data to include in their watershed plans. EPA recently posted a new web site called "Watershed Central," a "one-stop" tool, to help watershed organizations and others find key resources to protec...

  10. 45 CFR 155.205 - Consumer assistance tools and programs of an Exchange.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Exchange. (a) Call center. The Exchange must provide for operation of a toll-free call center that...)(1), (c)(2)(i), and (c)(3) of this section. (b) Internet Web site. The Exchange must maintain an up-to-date Internet Web site that meets the requirements outlined in paragraph (c) of this section and...

  11. 45 CFR 155.205 - Consumer assistance tools and programs of an Exchange.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Exchange. (a) Call center. The Exchange must provide for operation of a toll-free call center that...)(1), (c)(2)(i), and (c)(3) of this section. (b) Internet Web site. The Exchange must maintain an up-to-date Internet Web site that meets the requirements outlined in paragraph (c) of this section and...

  12. 45 CFR 155.205 - Consumer assistance tools and programs of an Exchange.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Exchange. (a) Call center. The Exchange must provide for operation of a toll-free call center that...)(1), (c)(2)(i), and (c)(3) of this section. (b) Internet Web site. The Exchange must maintain an up-to-date Internet Web site that meets the requirements outlined in paragraph (c) of this section and...

  13. Web-based decision support and visualization tools for water quality management in the Chesapeake Bay watershed

    USGS Publications Warehouse

    Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.

    2009-01-01

    Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select the appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool is being developed by USGS. The NYM tool is a web application that uses watershed yield estimates from the USGS SPAtially Referenced Regressions On Watershed attributes (SPARROW) model (Schwarz et al., 2006) [6] to allow water quality managers to identify important sources of nitrogen and phosphorus within the Chesapeake Bay watershed. The NYM tool utilizes new open-source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high and low nutrient yield areas; web map engines; and data visualization and data management techniques.

  14. Assisting Tutors to Utilize Web 2.0 Tools in Education

    ERIC Educational Resources Information Center

    Perikos, Isidoros; Grivokostopoulou, Foteini; Kovas, Konstantinos; Hatzilygeroudis, Ioannis

    2015-01-01

    Over the last decade, the Web has changed the way that educational procedures are delivered to students and has brought innovative learning technologies and possibilities that were not available before. The Web has evolved to a worldwide platform for collaboration, sharing and innovation, constituting what is called Web 2.0. Social media are an…

  15. A Good Teaching Technique: WebQuests

    ERIC Educational Resources Information Center

    Halat, Erdogan

    2008-01-01

    In this article, the author first introduces and describes a new teaching tool called WebQuests to practicing teachers. He then provides detailed information about the structure of a good WebQuest. Third, the author shows the strengths and weaknesses of using WebQuests in teaching and learning. Last, he points out the challenges for practicing…

  16. Web-based metabolic network visualization with a zooming user interface

    PubMed Central

    2011-01-01

    Background Displaying complex metabolic-map diagrams in Web browsers, and allowing users to interact with them to query and overlay expression data, is challenging. Description We present a Web-based metabolic-map diagram, which can be interactively explored by the user, called the Cellular Overview. The main characteristic of this application is the zooming user interface enabling the user to focus on appropriate granularities of the network at will. Various searching commands are available to visually highlight sets of reactions, pathways, enzymes, metabolites, and so on. Expression data from single or multiple experiments can be overlaid on the diagram, which we call the Omics Viewer capability. The application provides Web services to highlight the diagram and to invoke the Omics Viewer. This application is entirely written in JavaScript for the client browsers and connects to a Pathway Tools Web server to retrieve data and diagrams. It uses the OpenLayers library to display tiled diagrams. Conclusions This new online tool is capable of displaying large and complex metabolic-map diagrams in a very interactive manner. This application is available as part of the Pathway Tools software that powers multiple metabolic databases including BioCyc.org: the Cellular Overview is accessible under the Tools menu. PMID:21595965
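
The zooming interface described above relies on tiling: the diagram is cut into a grid of tiles per zoom level and the viewer fetches only the tiles covering the viewport. A generic sketch of the tile-index arithmetic (a common slippy-map convention, not necessarily the exact scheme Pathway Tools uses):

```python
# At zoom level z the diagram is divided into 2^z x 2^z tiles; a point's
# tile index is found by integer division of its coordinate by tile size.
def tile_for_point(x: float, y: float, zoom: int, world: float = 1.0):
    """Map a point in [0, world)^2 to its (col, row) tile index at `zoom`."""
    n = 2 ** zoom
    size = world / n
    return int(x // size), int(y // size)

print(tile_for_point(0.7, 0.2, 3))  # tile (5, 1) of an 8x8 grid
```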

  17. VIPER: a web application for rapid expert review of variant calls.

    PubMed

    Wöste, Marius; Dugas, Martin

    2018-06-01

    With the rapid development in next-generation sequencing, cost and time requirements for genomic sequencing are decreasing, enabling applications in many areas such as cancer research. Many tools have been developed to analyze genomic variation ranging from single nucleotide variants to whole chromosomal aberrations. As sequencing throughput increases, the number of variants called by such tools also grows. The manual inspection often applied to such calls is thus becoming a time-consuming procedure. We developed the Variant InsPector and Expert Rating tool (VIPER) to speed up this process by integrating the Integrative Genomics Viewer into a web application. Analysts can then quickly iterate through variants, apply filters and make decisions based on the generated images and variant metadata. VIPER was successfully employed in analyses with manual inspection of more than 10 000 calls. VIPER is implemented in Java and JavaScript and is freely available at https://github.com/MarWoes/viper. marius.woeste@uni-muenster.de. Supplementary data are available at Bioinformatics online.
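
The filtering step mentioned above can be pictured with a small sketch. The record fields and thresholds here are hypothetical illustrations of the kind of criteria an analyst might apply, not VIPER's actual data model:

```python
# Hypothetical variant-call records: keep only calls with enough variant
# allele fraction (vaf) and read depth to be worth manual review.
calls = [
    {"chrom": "chr1", "pos": 1044, "ref": "A", "alt": "G", "vaf": 0.41, "depth": 220},
    {"chrom": "chr2", "pos": 5002, "ref": "C", "alt": "T", "vaf": 0.02, "depth": 35},
    {"chrom": "chr7", "pos": 98231, "ref": "A", "alt": "T", "vaf": 0.18, "depth": 900},
]

def passes(call, min_vaf=0.05, min_depth=100):
    """True if the call clears both filter thresholds."""
    return call["vaf"] >= min_vaf and call["depth"] >= min_depth

review_queue = [c for c in calls if passes(c)]
print(len(review_queue))  # 2 calls survive the filter
```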

  18. Easy Web Interfaces to IDL Code for NSTX Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    W.M. Davis

    Reusing code is a well-known Software Engineering practice to substantially increase the efficiency of code production, as well as to reduce errors and debugging time. A variety of "Web Tools" for the analysis and display of raw and analyzed physics data are in use on NSTX [1], and new ones can be produced quickly from existing IDL [2] code. A Web Tool with only a few inputs, and which calls an IDL routine written in the proper style, can be created in less than an hour; more typical Web Tools with dozens of inputs, and the need for some adaptation of existing IDL code, can be working in a day or so. Efficiency is also increased for users of Web Tools because of the familiar interface of the web browser, and not needing X-windows, accounts, passwords, etc. Web Tools were adapted for use by PPPL physicists accessing EAST data stored in MDSplus with only a few man-weeks of effort; adapting to additional sites should now be even easier. An overview of Web Tools in use on NSTX, and a list of the most useful features, is also presented.

  19. SMARTE: SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (BELFAST, IRELAND)

    EPA Science Inventory

    The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...

  20. SMARTE: IMPROVING REVITALIZATION DECISIONS (BERLIN, GERMANY)

    EPA Science Inventory

    The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...

  1. "Heart of Darkness" Webquest: Using Technology To Teach Literary Criticism.

    ERIC Educational Resources Information Center

    Rozema, Robert

    This paper shows how literary criticism can enrich the high school English classroom. Specifically, the paper focuses on how an Internet teaching tool called the WebQuest helped one educator's students learn about literary criticism and apply it to "Heart of Darkness" by Joseph Conrad. The WebQuest homepage defines a WebQuest as "an…

  2. REopt Lite Web Tool Evaluates Photovoltaics and Battery Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Building on the success of the REopt renewable energy integration and optimization platform, NREL has developed a free, publicly available web version of REopt called REopt Lite. REopt Lite evaluates the economics of grid-connected photovoltaics (PV) and battery storage at a site. It allows building owners to identify the system sizes and battery dispatch strategy that minimize their life cycle cost of energy. This web tool also estimates the amount of time a PV and storage system can sustain the site's critical load during a grid outage.

  3. Using Web Logs in the Science Classroom

    ERIC Educational Resources Information Center

    Duplichan, Staycle C.

    2009-01-01

    As educators we must ask ourselves if we are meeting the needs of today's students. The science world is adapting to our ever-changing society; are the methodology and philosophy of our educational system keeping up? In this article, you'll learn why web logs (also called blogs) are an important Web 2.0 tool in your science classroom and how they…

  4. Automated MeSH indexing of the World-Wide Web.

    PubMed Central

    Fowler, J.; Kouramajian, V.; Maram, S.; Devadhar, V.

    1995-01-01

    To facilitate networked discovery and information retrieval in the biomedical domain, we have designed a system for automatic assignment of Medical Subject Headings to documents retrieved from the World-Wide Web. Our prototype implementations show significant promise. We describe our methods and discuss the further development of a completely automated indexing tool called the "Web-MeSH Medibot." PMID:8563421

  5. Rapid Response to Decision Making for Complex Issues - How Technologies of Cooperation Can Help

    DTIC Science & Technology

    2005-11-01

    creating bottom-up taxonomies (called folksonomies) using metadata tools like del.icio.us (in which users create their own tags for bookmarking Web...tools such as RSS, tagging (and the consequent development of folksonomies), wikis, and group visualization tools all help multiply the individual

  6. Developing the E-Delphi System: A Web-Based Forecasting Tool for Educational Research.

    ERIC Educational Resources Information Center

    Chou, Chien

    2002-01-01

    Discusses use of the Delphi technique and describes the development of an electronic version, called e-Delphi, in which questionnaire construction and communication with panel members was accomplished using the Web. Explains system function and interface and discusses evaluation of the e-Delphi system. (Author/LRW)

  7. ToTem: a tool for variant calling pipeline optimization.

    PubMed

    Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka

    2018-06-26

    High-throughput bioinformatics analyses of next generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process, with the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are interpreted as interactive graphs and tables allowing an optimal pipeline to be selected, based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software.
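
The cross-validation idea in the abstract above can be sketched briefly: score a pipeline setting by its mean F-measure across folds rather than on the whole dataset at once, so a setting that only shines on one slice of data is penalized. The per-fold counts below are synthetic, purely for illustration:

```python
# Score one hypothetical pipeline setting by mean F-measure over folds.
def f_measure(tp, fp, fn):
    """Harmonic mean of precision and recall from raw counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# (true positives, false positives, false negatives), one tuple per fold
folds = [(90, 10, 5), (85, 12, 8), (88, 9, 6)]

cv_score = sum(f_measure(*f) for f in folds) / len(folds)
print(round(cv_score, 3))
```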

  8. A tool for improving the Web accessibility of visually handicapped persons.

    PubMed

    Fujiki, Tadayoshi; Hanada, Eisuke; Yamada, Tomomi; Noda, Yoshihiro; Antoku, Yasuaki; Nakashima, Naoki; Nose, Yoshiaki

    2006-04-01

    Much has been written concerning the difficulties faced by visually handicapped persons when they access the internet. To solve some of the problems and to make web pages more accessible, we developed a tool we call the "Easy Bar," which works as a toolbar on the web browser. The functions of the Easy Bar are to change the size of web texts and images, to adjust the color, and to clear cached data that is automatically saved by the web browser. These functions are executed with ease by clicking buttons and operating a pull-down list. Since the icons built into Easy Bar are quite large, it is not necessary for the user to deal with delicate operations. The functions of Easy Bar run on any web page without increasing the processing time. For the visually handicapped, Easy Bar would contribute greatly to improved web accessibility to medical information.

  9. Challenging Google, Microsoft Unveils a Search Tool for Scholarly Articles

    ERIC Educational Resources Information Center

    Carlson, Scott

    2006-01-01

    Microsoft has introduced a new search tool to help people find scholarly articles online. The service, which includes journal articles from prominent academic societies and publishers, puts Microsoft in direct competition with Google Scholar. The new free search tool, which should work on most Web browsers, is called Windows Live Academic Search…

  10. CNTRO: A Semantic Web Ontology for Temporal Relation Inferencing in Clinical Narratives.

    PubMed

    Tao, Cui; Wei, Wei-Qi; Solbrig, Harold R; Savova, Guergana; Chute, Christopher G

    2010-11-13

    Using Semantic-Web specifications to represent temporal information in clinical narratives is an important step for temporal reasoning and answering time-oriented queries. Existing temporal models are either not compatible with the powerful reasoning tools developed for the Semantic Web, or designed only for structured clinical data and therefore are not ready to be applied on natural-language-based clinical narrative reports directly. We have developed a Semantic-Web ontology called the Clinical Narrative Temporal Relation Ontology (CNTRO). Using this ontology, temporal information in clinical narratives can be represented as RDF (Resource Description Framework) triples. More temporal information and relations can then be inferred by Semantic-Web based reasoning tools. Experimental results show that this ontology can represent temporal information in real clinical narratives successfully.
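
A toy illustration of the triple representation and inference step described above: temporal statements as subject-predicate-object triples, with new "before" relations inferred by transitivity. The event names and predicate spelling are invented for illustration; they are not CNTRO's actual terms, and real reasoning would use a Semantic Web reasoner rather than hand-rolled Python:

```python
# Temporal facts from a hypothetical clinical note, as (s, p, o) triples.
triples = {
    ("chest_pain_onset", "before", "admission"),
    ("admission", "before", "ct_scan"),
    ("ct_scan", "before", "discharge"),
}

def infer_before(triples):
    """Compute the transitive closure of the 'before' relation."""
    inferred = set(triples)
    changed = True
    while changed:
        changed = False
        for (a, p1, b) in list(inferred):
            for (c, p2, d) in list(inferred):
                if p1 == p2 == "before" and b == c:
                    new = (a, "before", d)
                    if new not in inferred:
                        inferred.add(new)
                        changed = True
    return inferred

closure = infer_before(triples)
print(("chest_pain_onset", "before", "discharge") in closure)  # True
```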

  11. Using Firefly Tools to Enhance Archive Web Pages

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Ly, L.; Goldina, T.

    2013-10-01

    Astronomy web developers are looking for fast and powerful HTML 5/AJAX tools to enhance their web archives. We are exploring ways to make this easier for the developer. How could you have a full FITS visualizer or a Web 2.0 table that supports paging, sorting, and filtering in your web page in 10 minutes? Can it be done without even installing any software or maintaining a server? Firefly is a powerful, configurable system for building web-based user interfaces to access astronomy science archives. It has been in production for the past three years. Recently, we have made some of the advanced components available through very simple JavaScript calls. This allows a web developer, without any significant knowledge of Firefly, to have FITS visualizers, advanced table display, and spectrum plots on their web pages with minimal learning curve. Because we use cross-site JSONP, installing a server is not necessary. Web sites that use these tools can be created in minutes. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). We are using Firefly to serve many projects including Spitzer, Planck, WISE, PTF, LSST and others.
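
The cross-site JSONP technique mentioned above is what lets a page embed Firefly components without running its own server: the data provider wraps a JSON payload in a caller-supplied callback name, so a plain script tag on any page can receive the data. A minimal server-side sketch of that wrapping (the handler and parameter names are illustrative, not Firefly's API):

```python
# Wrap a JSON payload in the callback name supplied by the requesting
# page (the "callback=" query parameter in a typical JSONP request).
import json

def jsonp_response(payload: dict, callback: str) -> str:
    """Return a JavaScript expression invoking `callback` with the data."""
    return f"{callback}({json.dumps(payload)});"

body = jsonp_response({"table": "wise_sources", "rows": 120}, "fireflyCallback1")
print(body)
```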

  12. The Development of a Web-Based Virtual Environment for Teaching Qualitative Analysis of Structures

    ERIC Educational Resources Information Center

    O'Dwyer, D. W.; Logan-Phelan, T. M.; O'Neill, E. A.

    2007-01-01

    The current paper describes the design and development of a qualitative analysis course and an interactive web-based teaching and assessment tool called VSE (virtual structural environment). The widespread reliance on structural analysis programs requires engineers to be able to verify computer output by carrying out qualitative analyses.…

  13. Monitoring Global Crop Condition Indicators Using a Web-Based Visualization Tool

    Treesearch

    Bob Tetrault; Bob Baldwin

    2006-01-01

    Global crop condition information for major agricultural regions in the world can be monitored using the web-based application called Crop Explorer. With this application, U.S. and international producers, traders, researchers, and the public can access remote sensing information used by agricultural economists and scientists who predict crop production worldwide. For...

  14. Creative Commons: A New Tool for Schools

    ERIC Educational Resources Information Center

    Pitler, Howard

    2006-01-01

    Technology-savvy instructors often require students to create Web pages or videos, tasks that require finding materials such as images, music, or text on the Web, reusing them, and then republishing them in a technique that author Howard Pitler calls "remixing." However, this requires both the student and the instructor to deal with often thorny…

  15. Preparing for the Real Web Generation

    ERIC Educational Resources Information Center

    Pence, Harry E.

    2007-01-01

    Some have called the current generation of college students the Web generation. They are wrong! The pace of technology change continues to quicken. The effects of globalization and social networking have not yet had their full impact. At best, the present students represent a transitional group. The tools we use define us, and the media revolution…

  16. Holy COW: Scaffolding Case Based Conferencing on the Web with Preservice Teachers.

    ERIC Educational Resources Information Center

    Bonk, Curtis J.; Angeli, Charoula; Malikowski, Steve R.; Supplee, Lauren

    2001-01-01

    This study on the effects of scaffolding electronic case-based learning on preservice teacher education explored the use of an asynchronous computer conferencing tool called COW (Conferencing on the Web) to determine whether open-ended learning environments that encouraged critical thinking could foster a greater degree of course connections and…

  17. Using WebQuests as Idea Banks for Fostering Autonomy in Online Language Courses

    ERIC Educational Resources Information Center

    Sadaghian, Shirin; Marandi, S. Susan

    2016-01-01

    The concept of language learner autonomy has influenced Computer-Assisted Language Learning (CALL) to the extent that Schwienhorst (2012) informs us of a paradigm change in CALL design in the light of learner autonomy. CALL is not considered a tool anymore, but a learner environment available to language learners anywhere in the world. Based on a…

  18. Engineering Analysis Using a Web-based Protocol

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.; Claus, Russell W.

    2002-01-01

    This paper reviews the development of a web-based framework for engineering analysis. A one-dimensional, high-speed analysis code called LAPIN was used in this study, but the approach can be generalized to any engineering analysis tool. The web-based framework enables users to store, retrieve, and execute an engineering analysis from a standard web browser. We review the encapsulation of the engineering data into the eXtensible Markup Language (XML) and various design considerations in the storage and retrieval of application data.
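
The XML encapsulation described above can be sketched as a simple round trip: analysis inputs serialized to XML for storage, then parsed back before execution. The element names are hypothetical, not LAPIN's actual schema:

```python
# Serialize hypothetical analysis-case inputs to XML and parse them back.
import xml.etree.ElementTree as ET

def to_xml(case_name, params):
    root = ET.Element("analysisCase", name=case_name)
    for key, value in params.items():
        ET.SubElement(root, "input", name=key).text = str(value)
    return ET.tostring(root, encoding="unicode")

def from_xml(doc):
    root = ET.fromstring(doc)
    return root.get("name"), {e.get("name"): float(e.text) for e in root.findall("input")}

doc = to_xml("inlet_mach2", {"mach": 2.0, "altitude_m": 12000.0})
name, params = from_xml(doc)
print(name, params["mach"])
```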

  19. Webpress: An Internet Outreach from NASA Dryden

    NASA Technical Reports Server (NTRS)

    Biezad, Daniel J.

    1996-01-01

    The Technology and Commercialization Office at NASA Dryden has developed many educational outreach programs for K-12 educators. This project concentrates on the internet portion of that effort, specifically focusing on the development of an internet tool for educators called Webpress. This tool will not only provide a user-friendly access to aeronautical topics and interesting individuals on the world wide web (web), but will also enable teachers to rapidly submit and display their own materials and links for use in the classroom.

  20. Watershed Central: A New Gateway to Watershed Information

    EPA Science Inventory

    Many communities across the country struggle to find the right approaches, tools and data to include in their watershed plans. EPA recently posted a new Web site called "Watershed Central," a "one-stop" tool, to help watershed organizations and others find key resources to protect their ...

  1. An overview of new video coding tools under consideration for VP10: the successor to VP9

    NASA Astrophysics Data System (ADS)

    Mukherjee, Debargha; Su, Hui; Bankoski, James; Converse, Alex; Han, Jingning; Liu, Zoe; Xu, Yaowu

    2015-09-01

    Google started an open-source project, entitled the WebM Project, in 2010 to develop royalty-free video codecs for the web. The present generation codec developed in the WebM project, called VP9, was finalized in mid-2013 and is currently being served extensively by YouTube, resulting in billions of views per day. Even though adoption of VP9 outside Google is still in its infancy, the WebM project has already embarked on an ambitious project to develop a next edition codec VP10 that achieves at least a generational bitrate reduction over the current generation codec VP9. Although the project is still in early stages, a set of new experimental coding tools have already been added to baseline VP9 to achieve modest coding gains over a large enough test set. This paper provides a technical overview of these coding tools.

  2. The Design of Modular Web-Based Collaboration

    NASA Astrophysics Data System (ADS)

    Intapong, Ploypailin; Settapat, Sittapong; Kaewkamnerdpong, Boonserm; Achalakul, Tiranee

    Online collaborative systems are popular communication channels as the systems allow people from various disciplines to interact and collaborate with ease. The systems provide communication tools and services that can be integrated on the web; consequently, the systems are more convenient to use and easier to install. Nevertheless, most of the currently available systems are designed according to some specific requirements and cannot be straightforwardly integrated into various applications. This paper provides the design of a new collaborative platform, which is component-based and re-configurable. The platform is called the Modular Web-based Collaboration (MWC). MWC shares the same concept as computer supported collaborative work (CSCW) and computer-supported collaborative learning (CSCL), but it provides configurable tools for online collaboration. Each tool module can be integrated into users' web applications freely and easily. This makes collaborative system flexible, adaptable and suitable for online collaboration.

  3. Collaboratively Conceived, Designed and Implemented: Matching Visualization Tools with Geoscience Data Collections and Geoscience Data Collections with Visualization Tools via the ToolMatch Service.

    NASA Astrophysics Data System (ADS)

    Hoebelheinrich, N. J.; Lynnes, C.; West, P.; Ferritto, M.

    2014-12-01

    Two problems common to many geoscience domains are the difficulties in finding tools to work with a given dataset collection, and conversely, the difficulties in finding data for a known tool. A collaborative team from the Earth Science Information Partnership (ESIP) has come together to design and create a web service, called ToolMatch, to address these problems. The team began their efforts by defining an initial, relatively simple conceptual model that addressed the two use cases briefly described above. The conceptual model is expressed as an ontology using OWL (Web Ontology Language) and DCterms (Dublin Core Terms), and utilizing standard ontologies such as DOAP (Description of a Project), FOAF (Friend of a Friend), SKOS (Simple Knowledge Organization System) and DCAT (Data Catalog Vocabulary). The ToolMatch service will be taking advantage of various Semantic Web and Web standards, such as OpenSearch, RESTful web services, SWRL (Semantic Web Rule Language) and SPARQL (Simple Protocol and RDF Query Language). The first version of the ToolMatch service was deployed in early fall 2014. While more complete testing is required, a number of communities besides ESIP member organizations have expressed interest in collaborating to create, test and use the service and incorporate it into their own web pages, tools and/or services including the USGS Data Catalog service, DataONE, the Deep Carbon Observatory, Virtual Solar Terrestrial Observatory (VSTO), and the U.S. Global Change Research Program. In this session, presenters will discuss the inception and development of the ToolMatch service, the collaborative process used to design, refine, and test the service, and future plans for the service.
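
The two use cases in the record above (tools for a dataset, datasets for a tool) reduce to matching shared vocabulary terms. A deliberately simplified sketch, with an invented format vocabulary; the real service models this in OWL and queries it via SPARQL rather than with Python sets:

```python
# Describe tools and dataset collections with small controlled-vocabulary
# term sets and answer both use cases by set intersection.
tools = {
    "Panoply": {"reads": {"netCDF", "HDF5"}},
    "QGIS": {"reads": {"GeoTIFF", "Shapefile"}},
}
datasets = {
    "MODIS L3 SST": {"formats": {"netCDF", "HDF5"}},
    "USGS Land Cover": {"formats": {"GeoTIFF"}},
}

def tools_for_dataset(name):
    """Use case 1: which tools can work with this dataset collection?"""
    fmts = datasets[name]["formats"]
    return sorted(t for t, d in tools.items() if d["reads"] & fmts)

def datasets_for_tool(name):
    """Use case 2: which dataset collections can this tool read?"""
    reads = tools[name]["reads"]
    return sorted(d for d, v in datasets.items() if v["formats"] & reads)

print(tools_for_dataset("MODIS L3 SST"))   # ['Panoply']
print(datasets_for_tool("QGIS"))           # ['USGS Land Cover']
```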

  4. Novel inter and intra prediction tools under consideration for the emerging AV1 video codec

    NASA Astrophysics Data System (ADS)

    Joshi, Urvang; Mukherjee, Debargha; Han, Jingning; Chen, Yue; Parker, Sarah; Su, Hui; Chiang, Angie; Xu, Yaowu; Liu, Zoe; Wang, Yunqing; Bankoski, Jim; Wang, Chen; Keyder, Emil

    2017-09-01

Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, AV1, within a consortium of major tech companies called the Alliance for Open Media, that achieves at least a generational improvement in coding efficiency over VP9. In this paper, we focus primarily on new tools in AV1 that improve the prediction of pixel blocks before transforms, quantization and entropy coding are invoked. Specifically, we describe tools and coding modes that improve intra, inter and combined inter-intra prediction. Results are presented on standard test sets.
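
As a rough illustration of the family of prediction tools discussed here, the sketch below implements a plain DC intra predictor, one of the basic intra modes in VP9-class codecs: every pixel of a block is predicted as the mean of the reconstructed neighbors above and to the left. The function name and block layout are illustrative only, not AV1's actual implementation.

```python
def dc_intra_predict(above, left):
    """DC intra prediction sketch: fill a len(left) x len(above) block
    with the rounded mean of the neighboring reconstructed pixels."""
    vals = list(above) + list(left)
    dc = round(sum(vals) / len(vals))
    return [[dc] * len(above) for _ in left]
```

An encoder would subtract this prediction from the source block and hand the residue to transform coding and quantization.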

  5. REopt Lite Web Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NREL developed a free, publicly available web version of the REopt (TM) renewable energy integration and optimization platform called REopt Lite. REopt Lite recommends the optimal size and dispatch strategy for grid-connected photovoltaics (PV) and battery storage at a site. It also allows users to explore how PV and storage can increase a site's resiliency during a grid outage.

  6. An Online Image Analysis Tool for Science Education

    ERIC Educational Resources Information Center

    Raeside, L.; Busschots, B.; Waddington, S.; Keating, J. G.

    2008-01-01

    This paper describes an online image analysis tool developed as part of an iterative, user-centered development of an online Virtual Learning Environment (VLE) called the Education through Virtual Experience (EVE) Portal. The VLE provides a Web portal through which schoolchildren and their teachers create scientific proposals, retrieve images and…

  7. CCDST: A free Canadian climate data scraping tool

    NASA Astrophysics Data System (ADS)

    Bonifacio, Charmaine; Barchyn, Thomas E.; Hugenholtz, Chris H.; Kienzle, Stefan W.

    2015-02-01

In this paper we present a new software tool that automatically fetches, downloads and consolidates climate data from a Web database where the data are spread across multiple Web pages. The tool is called the Canadian Climate Data Scraping Tool (CCDST) and was developed to enhance access to, and simplify analysis of, climate data from Canada's National Climate Data and Information Archive (NCDIA). The CCDST deconstructs the URL for a particular climate station in the NCDIA and then iteratively modifies the date parameters to download large volumes of data, remove individual file headers, and merge the data files into one output file. This automated sequence enhances access to climate data by substantially reducing the time needed to manually download data from multiple Web pages. To illustrate, we present a case study of the temporal dynamics of blowing snow events in which the tool yielded ~3.1 weeks of time savings. Without the CCDST, the time involved in manually downloading climate data limits access and restrains researchers and students from exploring climate trends. The tool is coded as a Microsoft Excel macro and is available to researchers and students for free. The main concept and structure of the tool can be modified for other Web databases hosting geophysical data.
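
The URL-iteration idea is simple enough to sketch. The snippet below (in Python rather than the tool's Excel macro) builds one download URL per month by varying the date parameters around a fixed station, then merges the resulting CSV files after stripping repeated headers. The base URL and parameter names are hypothetical, not the NCDIA's actual ones.

```python
from urllib.parse import urlencode

# Hypothetical endpoint and query-string fields, for illustration only.
BASE_URL = "https://example.org/climate_data/bulk_data_e.html"

def build_urls(station_id, start_year, end_year):
    """Keep the station fixed and iterate the date parameters to cover
    a multi-year span, one URL per month."""
    urls = []
    for year in range(start_year, end_year + 1):
        for month in range(1, 13):
            query = urlencode({"stationID": station_id,
                               "Year": year, "Month": month,
                               "format": "csv"})
            urls.append(f"{BASE_URL}?{query}")
    return urls

def merge_csv_texts(csv_texts):
    """Strip the per-file header from every file after the first and
    concatenate the rows into one output table."""
    header, rows = None, []
    for text in csv_texts:
        lines = text.strip().splitlines()
        if header is None:
            header = lines[0]
        rows.extend(lines[1:])
    return "\n".join([header] + rows)
```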

  8. Unveiling causal activity of complex networks

    NASA Astrophysics Data System (ADS)

    Williams-García, Rashid V.; Beggs, John M.; Ortiz, Gerardo

    2017-07-01

    We introduce a novel tool for analyzing complex network dynamics, allowing for cascades of causally-related events, which we call causal webs (c-webs), to be separated from other non-causally-related events. This tool shows that traditionally-conceived avalanches may contain mixtures of spatially-distinct but temporally-overlapping cascades of events, and dynamical disorder or noise. In contrast, c-webs separate these components, unveiling previously hidden features of the network and dynamics. We apply our method to mouse cortical data with resulting statistics which demonstrate for the first time that neuronal avalanches are not merely composed of causally-related events. The original version of this article was uploaded to the arXiv on March 17th, 2016 [1].
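
To make the distinction concrete, here is a toy sketch (not the authors' algorithm) that groups events into causally linked cascades: an event is linked to an earlier one if the earlier event's node projects to its node and the delay falls within a window, and connected components of these links form the candidate webs. Spatially distinct cascades that overlap in time stay separate, which a purely time-binned avalanche definition would merge.

```python
from collections import defaultdict

def causal_webs(events, edges, delta=1):
    """Toy c-web grouping. events: list of (time, node) pairs;
    edges: set of directed (src, dst) node pairs. Event j is linked to
    event i if node_i projects to node_j and j fires within `delta`
    time steps after i; linked components are returned as webs."""
    n = len(events)
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    def union(a, b):
        parent[find(a)] = find(b)
    for i, (ti, ni) in enumerate(events):
        for j, (tj, nj) in enumerate(events):
            if (ni, nj) in edges and 0 < tj - ti <= delta:
                union(i, j)
    webs = defaultdict(list)
    for k in range(n):
        webs[find(k)].append(events[k])
    return list(webs.values())
```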

  9. Encourage Students to Read through the Use of Data Visualization

    ERIC Educational Resources Information Center

    Bandeen, Heather M.; Sawin, Jason E.

    2012-01-01

    Instructors are always looking for new ways to engage students in reading assignments. The authors present a few techniques that rely on a web-based data visualization tool called Wordle (wordle.net). Wordle creates word frequency representations called word clouds. The larger a word appears within a cloud, the more frequently it occurs within a…
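
The word-cloud principle is easy to sketch: count word frequencies and scale display sizes with the counts. The snippet below is a minimal illustration of that idea, not Wordle's own layout algorithm; the size bounds are arbitrary.

```python
from collections import Counter
import re

def word_cloud_weights(text, max_size=96, min_size=12):
    """Map each word to a font size that grows linearly with its
    frequency in the text (most frequent word gets max_size)."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    if not counts:
        return {}
    peak = counts.most_common(1)[0][1]
    return {w: min_size + (max_size - min_size) * c / peak
            for w, c in counts.items()}
```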

  10. deepTools: a flexible platform for exploring deep-sequencing data.

    PubMed

    Ramírez, Fidel; Dündar, Friederike; Diehl, Sarah; Grüning, Björn A; Manke, Thomas

    2014-07-01

We present a Galaxy-based web server for processing and visualizing deeply sequenced data. The web server's core functionality consists of a suite of newly developed tools, called deepTools, that enable users with little bioinformatic background to explore the results of their sequencing experiments in a standardized setting. Users can upload pre-processed files with continuous data in standard formats and generate heatmaps and summary plots in a straightforward, yet highly customizable manner. In addition, we offer several tools for the analysis of files containing aligned reads and enable efficient and reproducible generation of normalized coverage files. As a modular and open-source platform, deepTools can easily be expanded and customized to future demands and developments. The deepTools web server is freely available at http://deeptools.ie-freiburg.mpg.de and is accompanied by extensive documentation and tutorials aimed at conveying the principles of deep-sequencing data analysis. The web server can be used without registration. deepTools can be installed locally either stand-alone or as part of Galaxy. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
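
As a toy model of what a normalized coverage track contains, the sketch below bins per-base read coverage and applies a "1x" scaling so the genome-wide mean coverage equals one. Real deepTools programs operate on BAM alignments and offer several normalization strategies, so the functions and signatures here are illustrative assumptions, not the deepTools API.

```python
def binned_coverage(read_starts, read_len, genome_len, bin_size):
    """Count covered bases per fixed-size bin for fixed-length reads."""
    nbins = (genome_len + bin_size - 1) // bin_size
    cov = [0.0] * nbins
    for start in read_starts:
        for pos in range(start, min(start + read_len, genome_len)):
            cov[pos // bin_size] += 1
    return cov

def normalize_to_1x(cov, bin_size, genome_len):
    """Scale so the genome-wide mean per-base coverage equals 1."""
    mean_cov = sum(cov) / genome_len
    return [c / bin_size / mean_cov for c in cov]
```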

  11. Using Web-Based Collaborative Forecasting to Enhance Information Literacy and Disciplinary Knowledge

    ERIC Educational Resources Information Center

    Buckley, Patrick; Doyle, Elaine

    2016-01-01

    This paper outlines how an existing collaborative forecasting tool called a prediction market (PM) can be integrated into an educational context to enhance information literacy skills and cognitive disciplinary knowledge. The paper makes a number of original contributions. First, it describes how this tool can be packaged as a pedagogical…

  12. WebSat--a web software for microsatellite marker development.

    PubMed

    Martins, Wellington Santos; Lucas, Divino César Soares; Neves, Kelligton Fabricio de Souza; Bertioli, David John

    2009-01-01

    Simple sequence repeats (SSR), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed a simple to use web software, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet, requiring no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, visualization of microsatellites and the design of primers suitable for their amplification. The program allows full control of parameters and the easy export of the resulting data, thus facilitating the development of microsatellite markers. The web tool may be accessed at http://purl.oclc.org/NET/websat/
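
The core of microsatellite detection can be sketched with a back-referencing regular expression: find a short motif repeated in tandem a minimum number of times. The snippet below illustrates the idea only; WebSat's actual detection criteria and its primer-design step are not reproduced here.

```python
import re

def find_ssrs(seq, min_repeats=3, max_motif=6):
    """Return (start, motif, repeat_count) for tandem repeats of motifs
    1..max_motif bases long, repeated at least min_repeats times."""
    seq = seq.upper()
    hits = []
    for k in range(1, max_motif + 1):
        pat = re.compile(r"([ACGT]{%d})\1{%d,}" % (k, min_repeats - 1))
        for m in pat.finditer(seq):
            motif = m.group(1)
            # Skip motifs that are themselves repeats of a shorter unit,
            # e.g. report AAAA as an A-repeat, not an AA-repeat.
            if any(motif == motif[:d] * (k // d)
                   for d in range(1, k) if k % d == 0):
                continue
            hits.append((m.start(), motif, len(m.group(0)) // k))
    return hits
```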

  13. ProGeRF: Proteome and Genome Repeat Finder Utilizing a Fast Parallel Hash Function

    PubMed Central

    Moraes, Walas Jhony Lopes; Rodrigues, Thiago de Souza; Bartholomeu, Daniella Castanheira

    2015-01-01

Repetitive element sequences are adjacent, repeating patterns, also called motifs, and can be of different lengths; repetitions can involve their exact or approximate copies. They have been widely used as molecular markers in population biology. Given the sizes of sequenced genomes, various bioinformatics tools have been developed for the extraction of repetitive elements from DNA sequences. However, currently available tools do not provide options for identifying repetitive elements in the genome or proteome, displaying a user-friendly web interface, and performing exhaustive searches. ProGeRF is a web site for extracting repetitive regions from genome and proteome sequences. It was designed to be an efficient, fast, accurate, and above all user-friendly web tool, allowing many ways to view and analyse the results. ProGeRF (Proteome and Genome Repeat Finder) is freely available as a stand-alone program, from which users can download the source code, and as a web tool. It was developed using a hash table approach to extract perfect and imperfect repetitive regions from a (multi)FASTA file in linear time complexity. PMID:25811026
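
The hash table approach mentioned above can be sketched in a few lines: index every k-mer by content in a single pass, then report the k-mers that occur more than once. This finds perfect repeats in expected linear time; ProGeRF's handling of imperfect repeats is more involved and is not shown here.

```python
from collections import defaultdict

def perfect_repeats(seq, k):
    """Return {kmer: [positions]} for every k-mer occurring more than
    once, using one O(n) pass over the sequence."""
    table = defaultdict(list)
    for i in range(len(seq) - k + 1):
        table[seq[i:i + k]].append(i)
    return {kmer: pos for kmer, pos in table.items() if len(pos) > 1}
```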

  14. Capturing citation activity in three health sciences departments: a comparison study of Scopus and Web of Science.

    PubMed

    Sarkozy, Alexandra; Slyman, Alison; Wu, Wendy

    2015-01-01

    Scopus and Web of Science are the two major citation databases that collect and disseminate bibliometric statistics about research articles, journals, institutions, and individual authors. Liaison librarians are now regularly called upon to utilize these databases to assist faculty in finding citation activity on their published works for tenure and promotion, grant applications, and more. But questions about the accuracy, scope, and coverage of these tools deserve closer scrutiny. Discrepancies in citation capture led to a systematic study on how Scopus and Web of Science compared in a real-life situation encountered by liaisons: comparing three different disciplines at a medical school and nursing program. How many articles would each database retrieve for each faculty member using the author-searching tools provided? How many cited references for each faculty member would each tool generate? Results demonstrated troubling differences in publication and citation activity capture between Scopus and Web of Science. Implications for librarians are discussed.

  15. CASAS: A tool for composing automatically and semantically astrophysical services

    NASA Astrophysics Data System (ADS)

    Louge, T.; Karray, M. H.; Archimède, B.; Knödlseder, J.

    2017-07-01

Multiple astronomical datasets are available through the internet and the astrophysical Distributed Computing Infrastructure (DCI) called the Virtual Observatory (VO). Some scientific workflow technologies exist for retrieving and combining data from those sources; however, the selection of relevant services, the automation of workflow composition, and the lack of user-friendly platforms remain concerns. This paper presents CASAS, a tool for semantic web service composition in astrophysics. The tool provides automatic, semantics-based composition of astrophysical web services into workflows; it widens the choice of services and eases the use of heterogeneous services. Semantic web service composition relies on ontologies for elaborating the composition; this work is based on the Astrophysical Services ONtology (ASON). ASON's structure is largely inherited from the capabilities of VO services. Nevertheless, our approach is not limited to the VO and brings VO and non-VO services together without the need for premade recipes. CASAS is available for use through a simple web interface.

  16. A Web Tool for Research in Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Prikhod'ko, Nikolay V.; Abramovsky, Viktor A.; Abramovskaya, Natalia V.; Demichev, Andrey P.; Kryukov, Alexandr P.; Polyakov, Stanislav P.

    2016-02-01

This paper presents a project to develop a web platform called WebNLO for computer modeling of nonlinear optics phenomena. We discuss the general scheme of the platform and a model for interaction between the platform modules. The platform is built as a set of interacting RESTful web services (SaaS approach). Users can interact with the platform through a web browser or a command-line interface. Such a resource has no analogue in the field of nonlinear optics; by giving researchers access to high-performance computing resources, it will significantly reduce the cost of the research and development process.

  17. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    PubMed Central

    2010-01-01

Background: The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results: Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions: Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/. PMID:20482791
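
The regression idea can be sketched as follows: in a growth competition assay, the log ratio of mutant to wild-type grows linearly in time, so the slope estimated from all time points (rather than just two) gives the relative fitness. This is a deliberate simplification of vFitness, with no measurement-error model, and the `dilution` argument is only a placeholder for a per-sample correction factor.

```python
import math

def relative_fitness(times, mutant_frac, dilution=1.0):
    """Least-squares slope of ln(mutant/wild-type) versus time, using
    all observations. mutant_frac holds the mutant fraction (0..1)
    at each sampling time."""
    xs, ys = [], []
    for t, f in zip(times, mutant_frac):
        ratio = (f / (1.0 - f)) * dilution
        xs.append(t)
        ys.append(math.log(ratio))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```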

  18. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.

    PubMed

    Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin

    2010-05-18

    The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.

  19. Identification and Illustration of Insecure Direct Object References and their Countermeasures

    NASA Astrophysics Data System (ADS)

    KumarShrestha, Ajay; Singh Maharjan, Pradip; Paudel, Santosh

    2015-03-01

An insecure direct object reference is a flaw in system design in which sensitive system resources or data lack a full protection mechanism. It occurs when a web application developer provides direct access to objects based on user input, so an attacker can exploit this vulnerability and gain access to privileged information by bypassing authorization. The main aim of this paper is to demonstrate the real effect and identification of insecure direct object references and then to provide feasible preventive solutions so that web applications do not allow direct object references to be manipulated by attackers. The experiment on insecure direct object referencing is carried out using the insecure J2EE web application WebGoat, and its security testing is performed using another Java-based tool called Burp Suite. The experimental results show that the access control check for gaining access to privileged information is a very simple problem, but at the same time its correct implementation is a tricky task. The paper finally presents some ways to overcome this web vulnerability.
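
The vulnerability and its countermeasure can be shown side by side in a few lines. The sketch below uses a hypothetical in-memory document store; the point is that the secure variant verifies the requesting user's authorization for the specific object instead of trusting the user-supplied identifier.

```python
# Hypothetical document store; names and records are illustrative only.
DOCUMENTS = {
    101: {"owner": "alice", "body": "alice's salary report"},
    102: {"owner": "bob", "body": "bob's salary report"},
}

def get_document_insecure(user, doc_id):
    """Vulnerable pattern: the object is fetched directly from the
    user-supplied identifier, so any user can read any document."""
    return DOCUMENTS[doc_id]["body"]

def get_document_secure(user, doc_id):
    """Countermeasure: check that the requesting user is authorized
    for this specific object before returning it."""
    doc = DOCUMENTS.get(doc_id)
    if doc is None or doc["owner"] != user:
        raise PermissionError("access denied")
    return doc["body"]
```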

  20. WebSat ‐ A web software for microsatellite marker development

    PubMed Central

    Martins, Wellington Santos; Soares Lucas, Divino César; de Souza Neves, Kelligton Fabricio; Bertioli, David John

    2009-01-01

Simple sequence repeats (SSR), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed a simple to use web software, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet, requiring no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, visualization of microsatellites and the design of primers suitable for their amplification. The program allows full control of parameters and the easy export of the resulting data, thus facilitating the development of microsatellite markers. Availability: The web tool may be accessed at http://purl.oclc.org/NET/websat/ PMID:19255650

  1. Filtering SPAM in P2PSIP Communities with Web of Trust

    NASA Astrophysics Data System (ADS)

    Heikkilä, Juho; Gurtov, Andrei

Spam is a dominant problem on email systems today. One of the reasons is the lack of an infrastructure for security and trust. As Voice over IP (VoIP) communication becomes increasingly popular, the proliferation of spam calls is only a matter of time. Since the SIP identity scheme is practically the same as email's, the two share the same threats. We utilized the Host Identity Protocol (HIP) to provide basic security, such as end-to-end encryption. To provide call filtering, however, other tools are needed. In this paper, we suggest applying trust paths familiar from the PGP web of trust to prevent unwanted communication in P2PSIP communities.

  2. On transform coding tools under development for VP10

    NASA Astrophysics Data System (ADS)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools have already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.
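
To illustrate why transform flexibility matters, the sketch below implements an orthonormal 1-D DCT-II and a toy per-block choice between it and the identity transform, keeping whichever compacts energy better (smaller sum of absolute coefficients). This is a didactic simplification: VP10/AV1 select among several 2-D transform types by rate-distortion cost, not this heuristic.

```python
import math

def dct2(xs):
    """Orthonormal 1-D DCT-II of a list of samples."""
    n = len(xs)
    out = []
    for k in range(n):
        s = sum(x * math.cos(math.pi * (i + 0.5) * k / n)
                for i, x in enumerate(xs))
        scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
        out.append(scale * s)
    return out

def pick_transform(residue):
    """Toy per-block transform selection: smooth residues compact well
    under the DCT; spiky residues are left untransformed."""
    coeffs = dct2(residue)
    if sum(map(abs, coeffs)) < sum(map(abs, residue)):
        return "DCT", coeffs
    return "IDENTITY", residue
```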

  3. Identification of immunoglobulins using Chou's pseudo amino acid composition with feature selection technique.

    PubMed

    Tang, Hua; Chen, Wei; Lin, Hao

    2016-04-01

Immunoglobulins, also called antibodies, are a group of cell surface proteins which are produced by the immune system in response to the presence of a foreign substance (called an antigen). They play key roles in many medical, diagnostic and biotechnological applications. Correct identification of immunoglobulins is crucial to the comprehension of humoral immune function. With the avalanche of protein sequences identified in the postgenomic age, it is highly desirable to develop computational methods to identify immunoglobulins in a timely manner. In view of this, we designed a predictor called "IGPred" by formulating protein sequences with the pseudo amino acid composition, into which nine physiochemical properties of amino acids were incorporated. Jackknife cross-validated results showed that 96.3% of immunoglobulins and 97.5% of non-immunoglobulins can be correctly predicted, indicating that IGPred holds very high potential to become a useful tool for antibody analysis. For the convenience of most experimental scientists, a web server for IGPred was established at http://lin.uestc.edu.cn/server/IGPred. We believe that the web server will become a powerful tool to study immunoglobulins and to guide related experimental validations.
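
Chou's pseudo amino acid composition can be sketched as the 20 composition frequencies augmented with a few sequence-order correlation factors derived from a physicochemical property. The property table below is a made-up placeholder (IGPred incorporates nine real physiochemical properties), so the numbers are illustrative only.

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
# Placeholder property table standing in for a normalized real scale
# such as hydrophobicity; the values here carry no biological meaning.
PROP = {aa: i / 19.0 for i, aa in enumerate(AMINO_ACIDS)}

def pseaac(seq, lam=2, weight=0.05):
    """Return the 20 composition frequencies plus `lam` sequence-order
    correlation factors, normalized so the feature vector sums to 1."""
    n = len(seq)
    comp = [seq.count(aa) / n for aa in AMINO_ACIDS]
    thetas = []
    for k in range(1, lam + 1):
        theta = sum((PROP[seq[i]] - PROP[seq[i + k]]) ** 2
                    for i in range(n - k)) / (n - k)
        thetas.append(theta)
    denom = 1.0 + weight * sum(thetas)
    return [c / denom for c in comp] + [weight * t / denom for t in thetas]
```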

  4. Interactive Visualization of Computational Fluid Dynamics using Mosaic

    NASA Technical Reports Server (NTRS)

    Clucas, Jean; Watson, Velvin; Chancellor, Marisa K. (Technical Monitor)

    1994-01-01

The Web provides new methods for accessing information world-wide, but the current text-and-pictures approach neither utilizes all the Web's possibilities nor provides for its limitations. While the inclusion of pictures and animations in a paper communicates more effectively than text alone, it is essentially an extension of the concept of "publication." Also, as use of the Web increases, putting images and animations online will quickly overload even the "Information Superhighway." We need to find forms of communication that take advantage of the special nature of the Web. This paper presents one approach: the use of the Internet and the Mosaic interface for data sharing and collaborative analysis. We will describe (and, in the presentation, demonstrate) our approach: using FAST (Flow Analysis Software Toolkit), a scientific visualization package, as a data viewer and interactive tool called from Mosaic. Our intent is to stimulate the development of other tools that utilize the unique nature of electronic communication.

  5. Arachne—A web-based event viewer for MINERνA

    NASA Astrophysics Data System (ADS)

    Tagg, N.; Brangham, J.; Chvojka, J.; Clairemont, M.; Day, M.; Eberly, B.; Felix, J.; Fields, L.; Gago, A. M.; Gran, R.; Harris, D. A.; Kordosky, M.; Lee, H.; Maggi, G.; Maher, E.; Mann, W. A.; Marshall, C. M.; McFarland, K. S.; McGowan, A. M.; Mislivec, A.; Mousseau, J.; Osmanov, B.; Osta, J.; Paolone, V.; Perdue, G.; Ransome, R. D.; Ray, H.; Schellman, H.; Schmitz, D. W.; Simon, C.; Solano Salinas, C. J.; Tice, B. G.; Walding, J.; Walton, T.; Wolcott, J.; Zhang, D.; Ziemer, B. P.; MinerνA Collaboration

    2012-06-01

    Neutrino interaction events in the MINERνA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERνA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  6. Arachne - A web-based event viewer for MINERvA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tagg, N.; /Otterbein Coll.; Brangham, J.

    2011-11-01

    Neutrino interaction events in the MINERvA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERvA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  7. Teachers' Emotional Responses to New Pedagogical Tools in High Challenge Settings: Illustrations from the Northern Territory

    ERIC Educational Resources Information Center

    Harper, Helen

    2012-01-01

    This paper explores the role of teachers' emotions in adopting new pedagogical tools in urban and remote schools in the Northern Territory. The discussion is illustrated through case study material of Northern Territory teachers who had taken up using a web-based early childhood literacy resource called ABRACADABRA. A sociocultural perspective is…

  8. Designing Web 2.0 Based Constructivist-Oriented E-Learning Units

    ERIC Educational Resources Information Center

    Chai, Ching Sing; Woo, Huay Lit; Wang, Qiyun

    2010-01-01

    Purpose: The main purpose of this paper is to present how meaningful e-learning units can be created by using an online tool called Meaningful E-learning Units (MeLU). The paper also aims to describe how created e-learning units can be shared by teachers and students. Design/methodology/approach: This tool can help to produce e-learning units that…

  9. The Geogenomic Mutational Atlas of Pathogens (GoMAP) Web System

    PubMed Central

    Sargeant, David P.; Hedden, Michael W.; Deverasetty, Sandeep; Strong, Christy L.; Alaniz, Izua J.; Bartlett, Alexandria N.; Brandon, Nicholas R.; Brooks, Steven B.; Brown, Frederick A.; Bufi, Flaviona; Chakarova, Monika; David, Roxanne P.; Dobritch, Karlyn M.; Guerra, Horacio P.; Levit, Kelvy S.; Mathew, Kiran R.; Matti, Ray; Maza, Dorothea Q.; Mistry, Sabyasachy; Novakovic, Nemanja; Pomerantz, Austin; Rafalski, Timothy F.; Rathnayake, Viraj; Rezapour, Noura; Ross, Christian A.; Schooler, Steve G.; Songao, Sarah; Tuggle, Sean L.; Wing, Helen J.; Yousif, Sandy; Schiller, Martin R.

    2014-01-01

    We present a new approach for pathogen surveillance we call Geogenomics. Geogenomics examines the geographic distribution of the genomes of pathogens, with a particular emphasis on those mutations that give rise to drug resistance. We engineered a new web system called Geogenomic Mutational Atlas of Pathogens (GoMAP) that enables investigation of the global distribution of individual drug resistance mutations. As a test case we examined mutations associated with HIV resistance to FDA-approved antiretroviral drugs. GoMAP-HIV makes use of existing public drug resistance and HIV protein sequence data to examine the distribution of 872 drug resistance mutations in ∼502,000 sequences for many countries in the world. We also implemented a broadened classification scheme for HIV drug resistance mutations. Several patterns for geographic distributions of resistance mutations were identified by visual mining using this web tool. GoMAP-HIV is an open access web application available at http://www.bio-toolkit.com/GoMap/project/ PMID:24675726
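
The aggregation underlying such a geographic view can be sketched simply: for each country, compute the fraction of sequences that carry each resistance mutation. The record format below is hypothetical; the real system parses public HIV drug-resistance and protein sequence databases.

```python
from collections import defaultdict

def mutation_map(records):
    """records: iterable of (country, set_of_mutations), one entry per
    sequence. Returns {country: {mutation: fraction_of_sequences}}."""
    totals = defaultdict(int)
    counts = defaultdict(lambda: defaultdict(int))
    for country, mutations in records:
        totals[country] += 1
        for mut in mutations:
            counts[country][mut] += 1
    return {c: {m: n / totals[c] for m, n in muts.items()}
            for c, muts in counts.items()}
```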

  10. The Geogenomic Mutational Atlas of Pathogens (GoMAP) web system.

    PubMed

    Sargeant, David P; Hedden, Michael W; Deverasetty, Sandeep; Strong, Christy L; Alaniz, Izua J; Bartlett, Alexandria N; Brandon, Nicholas R; Brooks, Steven B; Brown, Frederick A; Bufi, Flaviona; Chakarova, Monika; David, Roxanne P; Dobritch, Karlyn M; Guerra, Horacio P; Levit, Kelvy S; Mathew, Kiran R; Matti, Ray; Maza, Dorothea Q; Mistry, Sabyasachy; Novakovic, Nemanja; Pomerantz, Austin; Rafalski, Timothy F; Rathnayake, Viraj; Rezapour, Noura; Ross, Christian A; Schooler, Steve G; Songao, Sarah; Tuggle, Sean L; Wing, Helen J; Yousif, Sandy; Schiller, Martin R

    2014-01-01

    We present a new approach for pathogen surveillance we call Geogenomics. Geogenomics examines the geographic distribution of the genomes of pathogens, with a particular emphasis on those mutations that give rise to drug resistance. We engineered a new web system called Geogenomic Mutational Atlas of Pathogens (GoMAP) that enables investigation of the global distribution of individual drug resistance mutations. As a test case we examined mutations associated with HIV resistance to FDA-approved antiretroviral drugs. GoMAP-HIV makes use of existing public drug resistance and HIV protein sequence data to examine the distribution of 872 drug resistance mutations in ∼ 502,000 sequences for many countries in the world. We also implemented a broadened classification scheme for HIV drug resistance mutations. Several patterns for geographic distributions of resistance mutations were identified by visual mining using this web tool. GoMAP-HIV is an open access web application available at http://www.bio-toolkit.com/GoMap/project/

  11. Semantic annotation of Web data applied to risk in food.

    PubMed

    Hignette, Gaëlle; Buche, Patrice; Couvert, Olivier; Dibie-Barthélemy, Juliette; Doussot, David; Haemmerlé, Ollivier; Mettler, Eric; Soler, Lydie

    2008-11-30

A preliminary step in the assessment of risk in food is the gathering of experimental data. In the framework of the Sym'Previus project (http://www.symprevius.org), a complete data integration system has been designed, grouping data provided by industrial partners and data extracted from papers published in the main scientific journals of the domain. Those data have been classified by means of a predefined vocabulary, called an ontology. Our aim is to complement the database with data extracted from the Web. In the framework of the WebContent project (www.webcontent.fr), we have designed a semi-automatic acquisition tool, called @WEB, which retrieves scientific documents from the Web. During the @WEB process, data tables are extracted from the documents and then annotated with the ontology. We focus on the data tables as they contain, in general, a synthesis of the data published in the documents. In this paper, we explain how the columns of the data tables are automatically annotated with data types of the ontology and how the relations represented by the table are recognised. We also give the results of our experimentation to assess the quality of such an annotation.
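
The column-annotation step can be illustrated with a toy matcher: score each column against the term lists of each ontology data type and keep the best-scoring type. The mini-ontology below is invented for the example; the actual @WEB annotation is driven by the Sym'Previus ontology and more sophisticated matching.

```python
# Invented mini-ontology mapping data types to known terms.
ONTOLOGY = {
    "Microorganism": {"listeria", "salmonella", "e. coli"},
    "FoodProduct": {"milk", "cheese", "ham"},
}

def annotate_columns(table):
    """table: list of rows. Assign each column the ontology data type
    whose term list matches the most cells (None if nothing matches)."""
    annotations = []
    for col in zip(*table):
        best, best_score = None, 0
        for dtype, terms in ONTOLOGY.items():
            score = sum(1 for cell in col if cell.lower() in terms)
            if score > best_score:
                best, best_score = dtype, score
        annotations.append(best)
    return annotations
```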

  12. Evaluating the Utility of Web-Based Consumer Support Tools Using Rough Sets

    NASA Astrophysics Data System (ADS)

    Maciag, Timothy; Hepting, Daryl H.; Slezak, Dominik; Hilderman, Robert J.

On the Web, many popular e-commerce sites provide consumers with decision support tools to assist them in their commerce-related decision-making. Many consumers rank the utility of these tools quite highly. Data obtained from web usage mining analyses, which may provide knowledge about a user's online experiences, could help indicate the utility of these tools. This type of analysis could provide insight into whether the provided tools adequately assist consumers in conducting their online shopping activities or whether new or additional enhancements need consideration. Although some research in this regard has been described in previous literature, there is still much that can be done. The authors of this paper hypothesize that a measurement of consumer decision accuracy, i.e. a measure of how well decision outcomes match user preferences, could help indicate the utility of these tools. This paper describes a procedure developed towards this goal using elements of rough set theory. The authors evaluated the procedure using two support tools, one based on a tool developed by the US-EPA and the other, called cogito, developed by one of the authors. Results from the evaluation provided interesting insights into the utility of both support tools. Although the cogito tool obtained slightly higher decision accuracy, both tools could be improved by additional enhancements. Details of the procedure and of the results obtained from the evaluation are provided. Opportunities for future work are also discussed.

  13. A randomized controlled study about the use of eHealth in the home health care of premature infants.

    PubMed

    Gund, Anna; Sjöqvist, Bengt Arne; Wigert, Helena; Hentz, Elisabet; Lindecrantz, Kaj; Bry, Kristina

    2013-02-09

    One area where the use of information and communication technology (ICT), or eHealth, could be developed is the home health care of premature infants. The aim of this randomized controlled study was to investigate whether the use of video conferencing or a web application improves parents' satisfaction in taking care of a premature infant at home and decreases the need for home visits. In addition, nurses' attitudes regarding the use of these tools were examined. Thirty-four families were randomized to one of three groups before their premature infant was discharged from the hospital to home health care: a control group receiving standard home health care (13 families); a web group receiving home health care supplemented with the use of a web application (12 families); a video group with home health care supplemented with video conferencing using Skype (9 families). Families and nursing staff answered questionnaires about the usefulness of ICT. In addition, semi-structured interviews were conducted with 16 families. All the parents in the web group found the web application easy to use. 83% of the families thought it was good to have access to their child's data through the application. All the families in the video group found Skype easy to use and were satisfied with the video calls. 88% of the families thought that video calls were better than ordinary phone calls. 33% of the families in the web group and 75% of those in the video group thought the need for home visits was decreased by the web application or Skype. 50% of the families in the web group and 100% of those in the video group thought the web application or the video calls had helped them feel more confident in caring for their child. Most of the nurses were motivated to use ICT but some were reluctant and avoided using the web application and video conferencing. The families were satisfied with both the web application and video conferencing. The families readily embraced the use of ICT, whereas motivating some of the nurses to accept and use ICT was a major challenge.

  14. A randomized controlled study about the use of eHealth in the home health care of premature infants

    PubMed Central

    2013-01-01

    Background One area where the use of information and communication technology (ICT), or eHealth, could be developed is the home health care of premature infants. The aim of this randomized controlled study was to investigate whether the use of video conferencing or a web application improves parents’ satisfaction in taking care of a premature infant at home and decreases the need for home visits. In addition, nurses’ attitudes regarding the use of these tools were examined. Method Thirty-four families were randomized to one of three groups before their premature infant was discharged from the hospital to home health care: a control group receiving standard home health care (13 families); a web group receiving home health care supplemented with the use of a web application (12 families); a video group with home health care supplemented with video conferencing using Skype (9 families). Families and nursing staff answered questionnaires about the usefulness of ICT. In addition, semi-structured interviews were conducted with 16 families. Results All the parents in the web group found the web application easy to use. 83% of the families thought it was good to have access to their child’s data through the application. All the families in the video group found Skype easy to use and were satisfied with the video calls. 88% of the families thought that video calls were better than ordinary phone calls. 33% of the families in the web group and 75% of those in the video group thought the need for home visits was decreased by the web application or Skype. 50% of the families in the web group and 100% of those in the video group thought the web application or the video calls had helped them feel more confident in caring for their child. Most of the nurses were motivated to use ICT but some were reluctant and avoided using the web application and video conferencing. Conclusion The families were satisfied with both the web application and video conferencing. The families readily embraced the use of ICT, whereas motivating some of the nurses to accept and use ICT was a major challenge. PMID:23394465

  15. Genomics for Everyone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chain, Patrick

    Genomics — the genetic mapping and DNA sequencing of sets of genes or the complete genomes of organisms, along with related genome analysis and database work — is emerging as one of the transformative sciences of the 21st century. But current bioinformatics tools are not accessible to most biological researchers. Now, a new computational and web-based tool called EDGE Bioinformatics is working to fulfill the promise of democratizing genomics.

  16. ICO amplicon NGS data analysis: a Web tool for variant detection in common high-risk hereditary cancer genes analyzed by amplicon GS Junior next-generation sequencing.

    PubMed

    Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi

    2014-03-01

    Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories.
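    The variant detection and filtering step mentioned above can be pictured as a threshold filter over per-position read counts. The thresholds and field names below are invented for illustration; the actual pipeline behind the Web tool is not described here:

```python
# Purely illustrative variant filter (assumed thresholds, not the paper's):
# a call is kept only if the position is covered by enough reads and a
# sufficient fraction of them support the variant, a common way to
# suppress false positives in amplicon NGS data.

MIN_COVERAGE = 50      # reads covering the position (assumed)
MIN_VAR_FREQ = 0.20    # fraction of reads supporting the variant (assumed)

def filter_variants(variants):
    kept = []
    for v in variants:
        freq = v["alt_reads"] / v["coverage"]
        if v["coverage"] >= MIN_COVERAGE and freq >= MIN_VAR_FREQ:
            kept.append({**v, "freq": round(freq, 3)})
    return kept

calls = [
    {"pos": 101, "coverage": 100, "alt_reads": 30},   # kept (freq 0.3)
    {"pos": 102, "coverage": 40,  "alt_reads": 30},   # dropped: low coverage
    {"pos": 103, "coverage": 200, "alt_reads": 10},   # dropped: low frequency
]
print(filter_variants(calls))
```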

  17. Is there a "net generation" in veterinary medicine? A comparative study on the use of the Internet and Web 2.0 by students and the veterinary profession.

    PubMed

    Tenhaven, Christoph; Tipold, Andrea; Fischer, Martin R; Ehlers, Jan P

    2013-01-01

    Informal and formal lifelong learning is essential at university and in the workplace. Apart from classical learning techniques, Web 2.0 tools can be used. It is controversial whether there is a so-called net generation amongst people under 30. To test the hypothesis that a net generation among students and young veterinarians exists, an online survey of students and veterinarians in the German-speaking countries was conducted and advertised via online media and traditional print media. 1780 people took part in the survey. Students and veterinarians have different usage patterns regarding social networks (91.9% vs. 69%) and IM (55.9% vs. 24.5%). All tools were predominantly used passively and in private, to a lesser extent also professionally and for studying. The use of Web 2.0 tools is beneficial; however, teaching information and media skills, preparing codes of conduct for the internet, and verifying user-generated content are essential.

  18. Science 2.0: Communicating Science Creatively

    ERIC Educational Resources Information Center

    Smith, Ben; Mader, Jared

    2017-01-01

    This column shares web tools that support learning. The authors have been covering the International Society for Technology in Education (ISTE) standards in every issue since September 2016. This article examines the final standard, called Creative Communicator, which requires students to communicate effectively and creatively express themselves…

  19. Paradigms, Citations, and Maps of Science: A Personal History.

    ERIC Educational Resources Information Center

    Small, Henry

    2003-01-01

    Discusses mapping science and Kuhn's theories of paradigms and scientific development. Highlights include cocitation clustering; bibliometric definition of a paradigm; specialty dynamics; pathways through science; a new Web tool called Essential Science Indicators (ESI) for studying the structure of science; and microrevolutions. (Author/LRW)

  20. Appreciating Hubble at Hyper-speed: A Web-tool for Students and Teachers

    NASA Astrophysics Data System (ADS)

    Will, Lisa M.; Mechtley, M.; Cohen, S.; Windhorst, R. A.; Malhotra, S.; Rhoads, J.; Pirzkal, N.; Summers, F.

    2006-12-01

    Even post-instruction, many high school students and non-science college majors lack a firm understanding of the basic concepts of physics and astronomy necessary to appreciate our expanding universe. To mitigate this trend, we are developing a state-of-the-art Web-tool called "Appreciating Hubble at Hyper-speed" (AHaH) that uses the HST Cycle 14 Treasury Project "PEARS" (Probing Evolution And Reionization through Spectra) data. AHaH will span the full 3-dimensional PEARS database of the GOODS/HUDF galaxy distribution from redshifts z = 0.05 to z = 6.5, covering nearly 90% of the history of the Universe. The web-tool AHaH will allow students to interactively zoom in/out of this PEARS database, rotate, accelerate/decelerate towards a specified target, and travel forward or backwards in time. Hence, students can make a complete interactive journey in look-back time. AHaH will help students learn and visually understand basic concepts of physics and astronomy, and at the same time allow them to explore how galaxies change when traveling back in time, how their light is redshifted, and how they are formed and clustered in the expanding Universe. This poster will describe the features of the web-tool and the services that will be offered to help teachers implement this tool in their classrooms.

  1. Combining Domain-driven Design and Mashups for Service Development

    NASA Astrophysics Data System (ADS)

    Iglesias, Carlos A.; Fernández-Villamor, José Ignacio; Del Pozo, David; Garulli, Luca; García, Boni

    This chapter presents the Romulus project approach to Service Development using Java-based web technologies. Romulus aims at improving productivity of service development by providing a tool-supported model to conceive Java-based web applications. This model follows a Domain Driven Design approach, which states that the primary focus of software projects should be the core domain and domain logic. Romulus proposes a tool-supported model, Roma Metaframework, that provides an abstraction layer on top of existing web frameworks and automates the application generation from the domain model. This metaframework follows an object centric approach, and complements Domain Driven Design by identifying the most common cross-cutting concerns (security, service, view, ...) of web applications. The metaframework uses annotations for enriching the domain model with these cross-cutting concerns, so-called aspects. In addition, the chapter presents the usage of mashup technology in the metaframework for service composition, using the web mashup editor MyCocktail. This approach is applied to a scenario of the Mobile Phone Service Portability case study for the development of a new service.

  2. Genomics for Everyone

    ScienceCinema

    Chain, Patrick

    2018-05-31

    Genomics — the genetic mapping and DNA sequencing of sets of genes or the complete genomes of organisms, along with related genome analysis and database work — is emerging as one of the transformative sciences of the 21st century. But current bioinformatics tools are not accessible to most biological researchers. Now, a new computational and web-based tool called EDGE Bioinformatics is working to fulfill the promise of democratizing genomics.

  3. WebChem Viewer: a tool for the easy dissemination of chemical and structural data sets

    PubMed Central

    2014-01-01

    Background Sharing sets of chemical data (e.g., chemical properties, docking scores, etc.) among collaborators with diverse skill sets is a common task in computer-aided drug design and medicinal chemistry. The ability to associate this data with images of the relevant molecular structures greatly facilitates scientific communication. There is a need for a simple, free, open-source program that can automatically export aggregated reports of entire chemical data sets to files viewable on any computer, regardless of the operating system and without requiring the installation of additional software. Results We here present a program called WebChem Viewer that automatically generates these types of highly portable reports. Furthermore, in designing WebChem Viewer we have also created a useful online web application for remotely generating molecular structures from SMILES strings. We encourage the direct use of this online application as well as its incorporation into other software packages. Conclusions With these features, WebChem Viewer enables interdisciplinary collaborations that require the sharing and visualization of small molecule structures and associated sets of heterogeneous chemical data. The program is released under the FreeBSD license and can be downloaded from http://nbcr.ucsd.edu/WebChemViewer. The associated web application (called “Smiley2png 1.0”) can be accessed through freely available web services provided by the National Biomedical Computation Resource at http://nbcr.ucsd.edu. PMID:24886360
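    The Smiley2png service described above renders a molecular structure from a SMILES string over the web. The sketch below only shows how a client might construct such a request; the endpoint path and parameter names are invented, since the actual service interface is not documented in this record:

```python
# Sketch of a client for a SMILES-to-image web service (the host is from
# the paper; the "/smiley2png" path and query parameters are invented
# for illustration and may not match the real service).
from urllib.parse import urlencode, quote

BASE = "http://nbcr.ucsd.edu"

def smiles_png_url(smiles, width=300, height=300):
    """Build the request URL for rendering one molecule."""
    query = urlencode({"smiles": smiles, "w": width, "h": height},
                      quote_via=quote)
    return f"{BASE}/smiley2png?{query}"

print(smiles_png_url("CCO"))  # ethanol
```

    A real client would then fetch the URL (e.g. with `urllib.request`) and embed the returned PNG bytes in the generated report.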

  4. NaviCell Web Service for network-based data visualization.

    PubMed

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P A; Barillot, Emmanuel; Zinovyev, Andrei

    2015-07-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
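    The server mode described above accepts visualization commands via plain HTTP calls. As a hedged sketch of what such a client-side call might look like (the command and parameter names below are invented for illustration and are not NaviCell's documented API):

```python
# Sketch of encoding a "server mode" command as an HTTP form body, the way
# a RESTful client (or the Python/R bindings) might POST it. All command
# and field names here are hypothetical.
import json
from urllib.parse import urlencode

def build_request(session_id, command, args):
    """Encode one visualization command as application/x-www-form-urlencoded."""
    return urlencode({
        "id": session_id,        # server-assigned session identifier
        "perform": command,      # e.g. load a heatmap onto the map
        "args": json.dumps(args),
    })

body = build_request("abc123", "import_heatmap",
                     {"dataset": "transcriptome", "scale": "log2"})
print(body)
```

    The appeal of this design is that any language with an HTTP client can automate visualization tasks, which is exactly what the provided Python and R bindings wrap.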

  5. NaviCell Web Service for network-based data visualization

    PubMed Central

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P. A.; Barillot, Emmanuel; Zinovyev, Andrei

    2015-01-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of ‘omics’ data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. PMID:25958393

  6. Software architecture and design of the web services facilitating climate model diagnostic analysis

    NASA Astrophysics Data System (ADS)

    Pan, L.; Lee, S.; Zhang, J.; Tang, B.; Zhai, C.; Jiang, J. H.; Wang, W.; Bao, Q.; Qi, M.; Kubar, T. L.; Teixeira, J.

    2015-12-01

    Climate model diagnostic analysis is a computationally- and data-intensive task because it involves multiple numerical model outputs and satellite observation data that can both be high resolution. We have built an online tool that facilitates this process. The tool is called Climate Model Diagnostic Analyzer (CMDA). It employs web service technology and provides a web-based user interface. The benefits of these choices include: (1) no installation of any software other than a browser, hence platform independence; (2) co-location of computation and big data on the server side, with only small results and plots downloaded on the client side, hence high data efficiency; (3) multi-threaded implementation to achieve parallel performance on multi-core servers; and (4) cloud deployment so each user has a dedicated virtual machine. In this presentation, we will focus on the computer science aspects of this tool, namely the architectural design, the infrastructure of the web services, the implementation of the web-based user interface, the mechanism of provenance collection, the approach to virtualization, and the Amazon Cloud deployment. As an example, we will describe our methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). Another example is the use of Docker, a light-weight virtualization container, to distribute and deploy CMDA onto an Amazon EC2 instance. CMDA has been successfully used in the 2014 Summer School hosted by the JPL Center for Climate Science. Students gave generally positive feedback, and we will report their comments. An enhanced version of CMDA with several new features, some requested by the 2014 students, will be used in the 2015 Summer School soon.
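    The wrapper methodology mentioned above exposes an existing science function as an HTTP endpoint. CMDA uses Flask/Gunicorn/Tornado for this; the stdlib-only WSGI sketch below (with an invented stand-in "science" function) just illustrates the pattern:

```python
# Minimal sketch of wrapping a science function as a web service. The
# anomaly() function and its parameters are invented placeholders; CMDA's
# real services wrap far larger analysis codes behind Flask.
import json
from urllib.parse import parse_qs

def anomaly(values, baseline):
    """Stand-in science code: deviation of each value from a baseline."""
    return [v - baseline for v in values]

def app(environ, start_response):
    """WSGI application: parse the query string, run the science code,
    and return the result as JSON."""
    qs = parse_qs(environ.get("QUERY_STRING", ""))
    values = [float(v) for v in qs.get("v", [])]
    baseline = float(qs.get("baseline", ["0"])[0])
    payload = json.dumps({"anomaly": anomaly(values, baseline)})
    start_response("200 OK", [("Content-Type", "application/json")])
    return [payload.encode()]

# To serve it: wsgiref.simple_server.make_server("", 8000, app).serve_forever()
```

    The same shape scales up: the computation and data stay server-side, and the browser only ever downloads the small JSON result.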

  7. The VO-Dance web application at the IA2 data center

    NASA Astrophysics Data System (ADS)

    Molinaro, Marco; Knapic, Cristina; Smareglia, Riccardo

    2012-09-01

    Italian center for Astronomical Archives (IA2, http://ia2.oats.inaf.it) is a national infrastructure project of the Italian National Institute for Astrophysics (Istituto Nazionale di AstroFisica, INAF) that provides services for the astronomical community. Besides data hosting for the Large Binocular Telescope (LBT) Corporation, the Galileo National Telescope (Telescopio Nazionale Galileo, TNG) Consortium and other telescopes and instruments, IA2 offers proprietary and public data access through user portals (both developed and mirrored) and deploys resources complying with the Virtual Observatory (VO) standards. Archiving systems and web interfaces are developed to be extremely flexible about adding new instruments from other telescopes. VO resource publishing, along with the data access portals, implements the International Virtual Observatory Alliance (IVOA) protocols, providing astronomers with new ways of analyzing data. Given the large variety of data flavours and IVOA standards, the need arises for tools that easily accomplish data ingestion and data publishing. This paper describes the VO-Dance tool, which IA2 started developing to address VO resource publishing in a dynamical way from already existent database tables or views. The tool consists of a Java web application, potentially DBMS and platform independent, that stores the services' metadata and information internally, exposes RESTful endpoints to accept VO queries for these services, and dynamically translates calls to these endpoints into SQL queries coherent with the published table or view. In response, VO-Dance translates the database answer back into a VO-compliant format.
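    The dynamic translation step can be pictured as rewriting a VO request's parameters into SQL against the published table. The sketch below (table and column names invented, and a crude bounding box standing in for a true angular-distance cone search) only illustrates the idea; VO-Dance's real mapping is metadata-driven and far richer:

```python
# Hypothetical sketch: turn cone-search parameters (RA, DEC, search
# radius SR, all in degrees) into a SQL query over a published table.
# A bounding box is used here instead of proper spherical geometry.

def cone_search_sql(table, ra, dec, sr):
    return (
        f"SELECT * FROM {table} "
        f"WHERE ra BETWEEN {ra - sr} AND {ra + sr} "
        f"AND dec BETWEEN {dec - sr} AND {dec + sr}"
    )

print(cone_search_sql("lbt_images", 150.0, 2.0, 0.5))
```

    In production such a query must be built with bound parameters rather than string interpolation, but the endpoint-to-SQL translation is the essence of publishing an existing table as a VO service.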

  8. TAPIR, a web server for the prediction of plant microRNA targets, including target mimics.

    PubMed

    Bonnet, Eric; He, Ying; Billiau, Kenny; Van de Peer, Yves

    2010-06-15

    We present a new web server called TAPIR, designed for the prediction of plant microRNA targets. The server offers the possibility to search for plant miRNA targets using a fast and a precise algorithm. The precise option is much slower but guarantees to find less perfectly paired miRNA-target duplexes. Furthermore, the precise option allows the prediction of target mimics, which are characterized by a miRNA-target duplex having a large loop, making them undetectable by traditional tools. The TAPIR web server can be accessed at: http://bioinformatics.psb.ugent.be/webtools/tapir. Supplementary data are available at Bioinformatics online.

  9. The Value of Interactive Assignments in the Online Learning Environment

    ERIC Educational Resources Information Center

    Florenthal, Bela

    2016-01-01

    The offerings of Web-based supplemental material for textbooks have been increasingly growing. When deciding to adopt a textbook, instructors examine the added value of the associated supplements, also called "e-learning tools," to enhance students' learning of course concepts. In this study, one such supplement, interactive assignments,…

  10. WebViz: A web browser based application for collaborative analysis of 3D data

    NASA Astrophysics Data System (ADS)

    Ruegg, C. S.

    2011-12-01

    In the age of high-speed Internet, where people can interact instantly, scientific tools have lacked technology that incorporates this style of communication on the web. To address this, a web application for geological studies has been created, tentatively titled WebViz. This web application utilizes tools provided by Google Web Toolkit to create an AJAX web application capable of features found in non-web-based software, so it can act as a piece of software usable from anywhere in the world with a reasonably fast Internet connection. An application of this technology can be seen with data regarding the tsunami from the major Japan earthquakes: after constructing the appropriate data to fit a rendering package called HVR, WebViz can request images of the tsunami data and display them to anyone who has access to the application. This convenience alone makes WebViz a viable solution, but the option to interact with this data with others around the world makes WebViz a serious computational tool. WebViz can also be used on any JavaScript-enabled browser, such as those found on modern tablets and smartphones, over a fast wireless connection. Because WebViz is built using Google Web Toolkit, the application is highly portable. Though many developers have been involved with the project, each person has contributed to the usability and speed of the application. The project's most recent form brings a dramatic speed increase as well as a more efficient user interface; the speed increase has been informally observed in recent uses of the application in China and Australia, with the hosting server located at the University of Minnesota. The user interface has been improved both visually and functionally: major functions of the application, such as rotating the 3D object, are driven by buttons, and these have been given a new layout that makes their function easier to understand and is easy to use on mobile devices. With these changes, WebViz is easier to control and use in general.

  11. Spitzer Space Telescope proposal process

    NASA Astrophysics Data System (ADS)

    Laine, S.; Silbermann, N. A.; Rebull, L. M.; Storrie-Lombardi, L. J.

    2006-06-01

    This paper discusses the Spitzer Space Telescope General Observer proposal process. Proposals, consisting of the scientific justification, basic contact information for the observer, and observation requests, are submitted electronically using a client-server Java package called Spot. The Spitzer Science Center (SSC) uses a one-phase proposal submission process, meaning that fully-planned observations are submitted for most proposals at the time of submission, not months after acceptance. Ample documentation and tools are available to the observers on SSC web pages to support the preparation of proposals, including an email-based Helpdesk. Upon submission proposals are immediately ingested into a database which can be queried at the SSC for program information, statistics, etc. at any time. Large proposals are checked for technical feasibility and all proposals are checked against duplicates of already approved observations. Output from these tasks is made available to the Time Allocation Committee (TAC) members. At the review meeting, web-based software is used to record reviewer comments and keep track of the voted scores. After the meeting, another Java-based web tool, Griffin, is used to track the approved programs as they go through technical reviews, duplication checks and minor modifications before the observations are released for scheduling. In addition to detailing the proposal process, lessons learned from the first two General Observer proposal calls are discussed.

  12. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, and (2) many of the CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of the time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of the difference between two variables. A web user interface was chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA is planned to be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. The requirements of the educational tool have been defined in consultation with the school organizers, and CMDA is being customized to meet them. The tool needs to be production quality for 30+ simultaneous users. The summer school will thus serve as a valuable testbed for the tool development, preparing CMDA to serve the Earth-science modeling and model-analysis community at the end of the project. This work was funded by the NASA Earth Science Program called Computational Modeling Algorithms and Cyberinfrastructure (CMAC).
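    Two of the analysis capabilities listed above, seasonal means and correlation between two variables, can be sketched on toy monthly data (the numbers below are invented; CMDA operates on gridded observational and model datasets):

```python
# Illustrative versions of two CMDA-style analyses on a toy monthly series:
# a seasonal mean over selected months, and a Pearson correlation.
from math import sqrt

def seasonal_mean(monthly, months):
    """Average monthly values over the given 1-12 month indices."""
    vals = [monthly[m - 1] for m in months]
    return sum(vals) / len(vals)

def pearson(x, y):
    """Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

temps = [2, 3, 7, 12, 17, 21, 24, 23, 19, 13, 7, 3]   # invented monthly data
print(seasonal_mean(temps, [12, 1, 2]))                # DJF mean
print(round(pearson([1, 2, 3, 4], [3, 5, 7, 9]), 3))   # -> 1.0
```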

  13. Is there a “net generation” in veterinary medicine? A comparative study on the use of the Internet and Web 2.0 by students and the veterinary profession

    PubMed Central

    Tenhaven, Christoph; Tipold, Andrea; Fischer, Martin R.; Ehlers, Jan P.

    2013-01-01

    Introduction: Informal and formal lifelong learning is essential at university and in the workplace. Apart from classical learning techniques, Web 2.0 tools can be used. It is controversial whether there is a so-called net generation amongst people under 30. Aims: To test the hypothesis that a net generation among students and young veterinarians exists. Methods: An online survey of students and veterinarians in the German-speaking countries was conducted and advertised via online media and traditional print media. Results: 1780 people took part in the survey. Students and veterinarians have different usage patterns regarding social networks (91.9% vs. 69%) and IM (55.9% vs. 24.5%). All tools were predominantly used passively and in private, to a lesser extent also professionally and for studying. Outlook: The use of Web 2.0 tools is beneficial; however, teaching information and media skills, preparing codes of conduct for the internet, and verifying user-generated content are essential. PMID:23467682

  14. Web-based volume slicer for 3D electron-microscopy data from EMDB

    PubMed Central

    Salavert-Torres, José; Iudin, Andrii; Lagerstedt, Ingvar; Sanz-García, Eduardo; Kleywegt, Gerard J.; Patwardhan, Ardan

    2016-01-01

    We describe the functionality and design of the Volume slicer – a web-based slice viewer for EMDB entries. This tool uniquely provides the facility to view slices from 3D EM reconstructions along the three orthogonal axes and to rapidly switch between them and navigate through the volume. We have employed multiple rounds of user-experience testing with members of the EM community to ensure that the interface is easy and intuitive to use and the information provided is relevant. The impetus to develop the Volume slicer has been calls from the EM community to provide web-based interactive visualisation of 2D slice data. This would be useful for quick initial checks of the quality of a reconstruction. Again in response to calls from the community, we plan to further develop the Volume slicer into a fully-fledged Volume browser that provides integrated visualisation of EMDB and PDB entries from the molecular to the cellular scale. PMID:26876163
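    The viewer's core operation, serving 2D slices from a 3D reconstruction along the three orthogonal axes, can be sketched with plain Python lists (a stand-in for the EM map volumes the real tool reads from EMDB entries):

```python
# Sketch of orthogonal slicing of a 3D volume stored as vol[z][y][x].
# A web slice viewer serves exactly such planes, one per axis and index.

def slice_volume(vol, axis, index):
    """Return the 2D slice at `index` along axis 0 (z), 1 (y), or 2 (x)."""
    if axis == 0:                       # xy-plane at a given z
        return [row[:] for row in vol[index]]
    if axis == 1:                       # xz-plane at a given y
        return [plane[index][:] for plane in vol]
    return [[row[index] for row in plane] for plane in vol]  # yz-plane

# Toy 3x3x3 volume whose voxel value encodes its coordinates (zyx).
vol = [[[z * 100 + y * 10 + x for x in range(3)]
        for y in range(3)] for z in range(3)]
print(slice_volume(vol, 0, 1)[0])   # -> [100, 101, 102]
```

    Rapidly switching axes, as the Volume slicer does, amounts to re-indexing the same volume three different ways; the server-side work is mapping intensities to pixels for the requested plane.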

  15. webPOISONCONTROL: can poison control be automated?

    PubMed

    Litovitz, Toby; Benson, Blaine E; Smolinske, Susan

    2016-08-01

    A free webPOISONCONTROL app allows the public to determine the appropriate triage of poison ingestions without calling poison control. If accepted and safe, this alternative expands access to reliable poison control services to those who prefer the Internet over the telephone. This study assesses feasibility, safety, and user-acceptance of automated online triage of asymptomatic, nonsuicidal poison ingestion cases. The user provides substance name, amount, age, and weight in an automated online tool or downloadable app, and is given a specific triage recommendation to stay home, go to the emergency department, or call poison control for further guidance. Safety was determined by assessing outcomes of consecutive home-triaged cases with follow-up and by confirming the correct application of algorithms. Case completion times and user perceptions of speed and ease of use were measures of user-acceptance. Of 9256 cases, 73.3% were triaged to home, 2.1% to an emergency department, and 24.5% directed to call poison control. Children younger than 6 years were involved in 75.2% of cases. Automated follow-up was done in 31.2% of home-triaged cases; 82.3% of these had no effect. No major or fatal outcomes were reported. More than 91% of survey respondents found the tool quick and easy to use. Median case completion time was 4.1 minutes. webPOISONCONTROL augments traditional poison control services by providing automated, accurate online access to case-specific triage and first aid guidance for poison ingestions. It is safe, quick, and easy to use. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  16. If You Build It, They Will Scan: Oxford University's Exploration of Community Collections

    ERIC Educational Resources Information Center

    Lee, Stuart D.; Lindsay, Kate

    2009-01-01

    Traditional large digitization projects demand massive resources from the central unit (library, museum, or university) that has acquired funding for them. Another model, enabled by easy access to cameras, scanners, and web tools, calls for public contributions to community collections of artifacts. In 2009, the University of Oxford ran a…

  17. Generic Service Integration in Adaptive Learning Experiences Using IMS Learning Design

    ERIC Educational Resources Information Center

    de-la-Fuente-Valentin, Luis; Pardo, Abelardo; Kloos, Carlos Delgado

    2011-01-01

    IMS Learning Design is a specification to capture the orchestration taking place in a learning scenario. This paper presents an extension called Generic Service Integration. This paradigm allows a bidirectional communication between the course engine in charge of the orchestration and conventional Web 2.0 tools. This communication allows the…

  18. Integration of e-Management, e-Development and e-Learning Technologies for Blended Course Delivery

    ERIC Educational Resources Information Center

    Johnson, Lynn E.; Tang, Michael

    2005-01-01

    This paper describes and assesses a pre-engineering curriculum development project called Foundations of Engineering, Science and Technology (FEST). FEST integrates web-based technologies into an inter-connected system to enable delivery of a blended program at multiple institutions. Tools and systems described include 1) technologies to deliver…

  19. Usability of Browser-Based Tools for Web-Search Privacy

    DTIC Science & Technology

    2010-03-01

    (Abstract not indexed; the excerpt consists of sample search queries from the study's user tasks, e.g. "What might Italians call maize? — Polenta"; "From which country do french fries originate? — Belgium"; "What is kartofflen? — Potato Dumplings".)

  20. Examining A Health Care Price Transparency Tool: Who Uses It, And How They Shop For Care.

    PubMed

    Sinaiko, Anna D; Rosenthal, Meredith B

    2016-04-01

    Calls for transparency in health care prices are increasing, in an effort to encourage and enable patients to make value-based decisions. Yet there is very little evidence of whether and how patients use health care price transparency tools. We evaluated the experiences, in the period 2011-12, of an insured population of nonelderly adults with Aetna's Member Payment Estimator, a web-based tool that provides real-time, personalized, episode-level price estimates. Overall, use of the tool increased during the study period but remained low. Nonetheless, for some procedures the number of people searching for prices of services (called searchers) was high relative to the number of people who received the service (called patients). Among Aetna patients who had an imaging service, childbirth, or one of several outpatient procedures, searchers for price information were significantly more likely to be younger and healthier and to have incurred higher annual deductible spending than patients who did not search for price information. A campaign to deliver price information to consumers may be important to increase patients' engagement with price transparency tools. Project HOPE—The People-to-People Health Foundation, Inc.

  1. The PartnerWeb Project: a component-based approach to enterprise-wide information integration and dissemination.

    PubMed Central

    Karson, T. H.; Perkins, C.; Dixon, C.; Ehresman, J. P.; Mammone, G. L.; Sato, L.; Schaffer, J. L.; Greenes, R. A.

    1997-01-01

    A component-based health information resource, delivered on an intranet and the Internet using World Wide Web (WWW) technology, has been built to meet the needs of a large integrated delivery network (IDN). Called PartnerWeb, this resource is intended to provide a variety of health care and reference information to both practitioners and consumers/patients. The initial target audience has been providers. Content management for the numerous departments, divisions, and other organizational entities within the IDN is accomplished by a distributed authoring and editing environment. Structured entry into databases via a set of form tools promotes consistency of presentation while allowing designated authors and editors in the various entities to take responsibility for their own materials, without requiring them to be technically skilled. Each form tool manages an encapsulated component. The output of each component can be a dynamically generated display on WWW platforms, or an appropriate interface to other presentation environments. The PartnerWeb project lays the foundation for an internal and external communication infrastructure for the enterprise that can facilitate information dissemination. PMID:9357648

  2. C-SPADE: a web-tool for interactive analysis and visualization of drug screening experiments through compound-specific bioactivity dendrograms

    PubMed Central

    Alam, Zaid; Peddinti, Gopal

    2017-01-01

    The advent of the polypharmacology paradigm in drug discovery calls for novel chemoinformatic tools for analyzing compounds’ multi-targeting activities. Such tools should provide an intuitive representation of the chemical space by capturing and visualizing underlying patterns of compound similarities linked to their polypharmacological effects. Most existing compound-centric chemoinformatics tools lack the interactive options and user interfaces that are critical for the real-time needs of chemical biologists carrying out compound screening experiments. Toward that end, we introduce C-SPADE, an open-source exploratory web-tool for interactive analysis and visualization of drug profiling assays (biochemical, cell-based or cell-free) using compound-centric similarity clustering. C-SPADE allows users to visually map the chemical diversity of a screening panel, explore investigational compounds in terms of their similarity to the screening panel, perform polypharmacological analyses and guide drug-target interaction predictions. C-SPADE requires only the raw drug profiling data as input, and it automatically retrieves the structural information and constructs the compound clusters in real-time, thereby reducing the time required for manual analysis in drug development or repurposing applications. The web-tool provides a customizable visual workspace that can either be downloaded as a figure or a Newick tree file, or shared as a hyperlink with other users. C-SPADE is freely available at http://cspade.fimm.fi/. PMID:28472495

  3. ClustVis: a web tool for visualizing clustering of multivariate data using Principal Component Analysis and heatmap

    PubMed Central

    Metsalu, Tauno; Vilo, Jaak

    2015-01-01

    Principal Component Analysis (PCA) is a widely used method for reducing the dimensionality of high-dimensional data, often followed by visualizing two of the components in a scatterplot. Despite its popularity, the method lacks an easy-to-use web interface that scientists with little programming skill could use to plot their own data. The same applies to creating heatmaps: conditional formatting in Excel can produce colored heatmaps, but more advanced features such as clustering and experimental annotations require more sophisticated analysis tools. We present a web tool called ClustVis that aims to provide an intuitive user interface. Users can upload data as a simple delimited text file created in a spreadsheet program. Data processing methods and the final appearance of the PCA and heatmap plots can be modified using drop-down menus, text boxes, sliders, etc. Appropriate defaults are given to reduce the time the user needs to specify input parameters. As output, users can download the PCA plot and heatmap in one of the preferred file formats. This web server is freely available at http://biit.cs.ut.ee/clustvis/. PMID:25969447
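    The dimensionality reduction ClustVis performs can be illustrated with a minimal, dependency-free sketch: for 2-D data the leading principal component falls out of the 2x2 covariance matrix in closed form. This is a generic PCA illustration, not ClustVis code.

    ```python
    import math

    def leading_pc(points):
        """First principal-component direction (unit vector) of 2-D points,
        via the closed-form leading eigenvector of the 2x2 covariance matrix."""
        n = len(points)
        mx = sum(x for x, _ in points) / n
        my = sum(y for _, y in points) / n
        # Covariance entries (population form is fine for a direction).
        sxx = sum((x - mx) ** 2 for x, _ in points) / n
        syy = sum((y - my) ** 2 for _, y in points) / n
        sxy = sum((x - mx) * (y - my) for x, y in points) / n
        # Largest eigenvalue of [[sxx, sxy], [sxy, syy]].
        tr, det = sxx + syy, sxx * syy - sxy ** 2
        lam = tr / 2 + math.sqrt(max(tr * tr / 4 - det, 0.0))
        # Corresponding eigenvector: (lam - syy, sxy), handling the diagonal case.
        if abs(sxy) > 1e-12:
            v = (lam - syy, sxy)
        else:
            v = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
        norm = math.hypot(v[0], v[1])
        return (v[0] / norm, v[1] / norm)
    ```

    For points lying on the line y = 2x, the returned direction is proportional to (1, 2), i.e. the axis of maximal variance.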

  4. AnnoLnc: a web server for systematically annotating novel human lncRNAs.

    PubMed

    Hou, Mei; Tang, Xing; Tian, Feng; Shi, Fangyuan; Liu, Fenglin; Gao, Ge

    2016-11-16

    Long noncoding RNAs (lncRNAs) have been shown to play essential roles in almost every important biological process through multiple mechanisms. Although the repertoire of human lncRNAs has rapidly expanded, their biological function and regulation remain largely elusive, calling for a systematic and integrative annotation tool. Here we present AnnoLnc ( http://annolnc.cbi.pku.edu.cn ), a one-stop portal for systematically annotating novel human lncRNAs. Based on more than 700 data sources and various tool chains, AnnoLnc enables a systematic annotation covering genomic location, secondary structure, expression patterns, transcriptional regulation, miRNA interaction, protein interaction, genetic association and evolution. An intuitive web interface is available for interactive analysis through both desktops and mobile devices, and programmers can further integrate AnnoLnc into their pipeline through standard JSON-based Web Service APIs. To the best of our knowledge, AnnoLnc is the only web server to provide on-the-fly and systematic annotation for newly identified human lncRNAs. Compared with similar tools, the annotation generated by AnnoLnc covers a much wider spectrum with intuitive visualization. Case studies demonstrate the power of AnnoLnc in not only rediscovering known functions of human lncRNAs but also inspiring novel hypotheses.
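    A pipeline integration of the kind the abstract mentions (JSON-based Web Service APIs) might look like the following sketch; the endpoint path, parameter names, and response fields here are hypothetical placeholders, not the documented AnnoLnc API.

    ```python
    import json
    from urllib.parse import urlencode

    # Hypothetical API prefix and route -- consult the real AnnoLnc
    # documentation for the actual endpoints and schema.
    BASE = "http://annolnc.cbi.pku.edu.cn/api"

    def build_request_url(sequence_id):
        """Compose a query URL for a submitted lncRNA (illustrative only)."""
        return f"{BASE}/annotation?{urlencode({'id': sequence_id, 'format': 'json'})}"

    def summarize(payload):
        """Pull per-module hit counts out of a JSON reply with an assumed shape."""
        doc = json.loads(payload)
        return {m["category"]: m["hits"] for m in doc.get("modules", [])}
    ```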

  5. Formalization of treatment guidelines using Fuzzy Cognitive Maps and semantic web tools.

    PubMed

    Papageorgiou, Elpiniki I; Roo, Jos De; Huszka, Csaba; Colaert, Dirk

    2012-02-01

    Therapy decision making and support in medicine must deal with uncertainty, taking into account the patient's clinical parameters, the context of the illness, the physician's medical knowledge, and guidelines in order to recommend a treatment. This study focuses on formalizing medical knowledge using a cognitive modeling technique called Fuzzy Cognitive Maps (FCMs) together with a semantic web approach. The FCM technique can handle situations with uncertain descriptions using a procedure similar to human reasoning; it was therefore selected for modeling and integrating knowledge from clinical practice guidelines. Semantic web tools were used to implement the FCM approach. The knowledge base was constructed from the clinical guidelines in the form of if-then fuzzy rules. These fuzzy rules were transferred to the FCM modeling technique and, through the semantic web tools, the whole formalization was accomplished. The approach was examined on the problem of urinary tract infection (UTI) in the adult community. Forty-seven clinical concepts and eight therapy concepts were identified for the antibiotic treatment problem of UTIs. A preliminary pilot evaluation with 55 patient cases showed interesting findings: 91% of the antibiotic treatments proposed by the implemented approach were in full agreement with the guidelines and physicians' opinions. The results show that the suggested approach formalizes medical knowledge efficiently and provides a front-end antibiotic recommendation for cystitis. In conclusion, modeling medical knowledge and therapeutic guidelines using cognitive methods and semantic web tools is both reliable and useful. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. QoS measurement of workflow-based web service compositions using Colored Petri net.

    PubMed

    Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra

    2014-01-01

    Workflow-based web service composition (WB-WSC) is one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in this category. As web services mature, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. Businesses should strive to meet customers' quality requirements for a composed web service. Quality of service (QoS), which refers to nonfunctional parameters, is therefore important to measure, so that the quality level of a given web service composition can be determined. This paper seeks a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPNs) with explicit routing constructs and the theory of probability. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.
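    The nonfunctional aggregation such an analysis rests on can be sketched for the two most common workflow patterns. The latency/reliability rules below are the standard aggregation formulas for sequential and parallel composition, not WSET's actual implementation.

    ```python
    def seq_qos(services):
        """QoS of a sequential composition: latencies add, reliabilities multiply."""
        latency = sum(s["latency"] for s in services)
        reliability = 1.0
        for s in services:
            reliability *= s["reliability"]
        return {"latency": latency, "reliability": reliability}

    def par_qos(services):
        """QoS of a parallel (AND-split/AND-join) composition:
        latency is set by the slowest branch; every branch must succeed."""
        latency = max(s["latency"] for s in services)
        reliability = 1.0
        for s in services:
            reliability *= s["reliability"]
        return {"latency": latency, "reliability": reliability}
    ```

    For example, two sequential services of 100 ms at 0.9 reliability and 200 ms at 0.95 reliability compose to 300 ms at 0.855.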

  7. Visualization and interaction tools for aerial photograph mosaics

    NASA Astrophysics Data System (ADS)

    Fernandes, João Pedro; Fonseca, Alexandra; Pereira, Luís; Faria, Adriano; Figueira, Helder; Henriques, Inês; Garção, Rita; Câmara, António

    1997-05-01

    This paper describes the development of a digital spatial library based on mosaics of digital orthophotos, called Interactive Portugal, that will enable users both to retrieve geospatial information existing in the Portuguese National System for Geographic Information World Wide Web server, and to develop local databases connected to the main system. A set of navigation, interaction, and visualization tools are proposed and discussed. They include sketching, dynamic sketching, and navigation capabilities over the digital orthophotos mosaics. Main applications of this digital spatial library are pointed out and discussed, namely for education, professional, and tourism markets. Future developments are considered. These developments are related to user reactions, technological advancements, and projects that also aim at delivering and exploring digital imagery on the World Wide Web. Future capabilities for site selection and change detection are also considered.

  8. Acquiring geographical data with web harvesting

    NASA Astrophysics Data System (ADS)

    Dramowicz, K.

    2016-04-01

    Many websites contain very attractive and up-to-date geographical information. This information can be extracted, stored, analyzed, and mapped using web harvesting techniques. Web harvesting transforms poorly organized data from websites into a more structured format that can be stored in a database and analyzed. Almost 25% of web traffic is related to web harvesting, mostly through search engines. This paper presents how to harvest geographic information from web documents using the free tool Beautiful Soup, one of the most commonly used Python libraries for pulling data out of HTML and XML files. Processing one static HTML table is a relatively easy task; the more challenging task is to extract and save information from tables located in multiple, poorly organized websites. Legal and ethical aspects of web harvesting are discussed as well. The paper demonstrates two case studies. The first shows how to extract various types of information about the Good Country Index from multiple web pages, load it into one attribute table, and map the results. The second shows how script tools and GIS can be used to extract information from one hundred thirty-six websites about Nova Scotia wines. In a little more than three minutes, a database containing the one hundred and six liquor stores selling these wines is created. The availability and spatial distribution of various types of wines (by grape type, by winery, and by liquor store) are then mapped and analyzed.
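    The core of such a harvesting script is pulling rows out of an HTML table. The paper uses Beautiful Soup; as a dependency-free sketch of the same idea, Python's standard-library HTMLParser suffices.

    ```python
    from html.parser import HTMLParser

    class TableRows(HTMLParser):
        """Collect the text of each <td>/<th> cell, grouped by <tr>."""
        def __init__(self):
            super().__init__()
            self.rows, self._row, self._in_cell = [], None, False

        def handle_starttag(self, tag, attrs):
            if tag == "tr":
                self._row = []
            elif tag in ("td", "th"):
                self._in_cell = True
                self._row.append("")  # start a new cell

        def handle_endtag(self, tag):
            if tag == "tr" and self._row is not None:
                self.rows.append(self._row)
                self._row = None
            elif tag in ("td", "th"):
                self._in_cell = False

        def handle_data(self, data):
            if self._in_cell and self._row:
                self._row[-1] += data.strip()

    def harvest_table(html):
        parser = TableRows()
        parser.feed(html)
        return parser.rows
    ```

    The harvested rows can then be loaded into a database table or GIS attribute table as the paper describes.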

  9. Website Analysis as a Tool for Task-Based Language Learning and Higher Order Thinking in an EFL Context

    ERIC Educational Resources Information Center

    Roy, Debopriyo

    2014-01-01

    Besides focusing on grammar, writing skills, and web-based language learning, researchers in "CALL" and second language acquisition have also argued for the importance of promoting higher-order thinking skills in ESL (English as Second Language) and EFL (English as Foreign Language) classrooms. There is solid evidence supporting the…

  10. Web 2.0 Reflective Inquiry: A Transformative Literacy Teacher Education Tool

    ERIC Educational Resources Information Center

    Stevens, Elizabeth Years

    2013-01-01

    The author describes a graduate class she teaches called Perspectives on Literacy and Technology, whose first goal is to provide a space for teachers enrolled in a literacy education MS program to reflect on current theory and research related to digital technologies and literacy. A secondary goal is to foster their use of "cutting-edge"…

  11. An Unexpected Ally: Using Microsoft's SharePoint to Create a Departmental Intranet

    ERIC Educational Resources Information Center

    Dahl, David

    2010-01-01

    In September 2008, the Albert S. Cook Library at Towson University implemented an intranet to support the various functions of the library's Reference Department. This intranet is called the RefPortal. After exploring open source options and other Web 2.0 tools, the department (under the guidance of the library technology coordinator) chose…

  12. A Smart Curriculum for Middle-School Science Instruction: A Web-Based Curriculum Integrating Assessment and Instruction.

    ERIC Educational Resources Information Center

    1996

    This paper discusses a model of integrated instruction and assessment called SMART (Special Multimedia Arenas for Refining Thinking). SMART involves interactive use of the Internet and multimedia software. The Internet serves three important functions: it acts as a formative assessment tool by providing individualized feedback to students, creates…

  13. Pediatric Price Transparency: Still Opaque With Opportunities for Improvement.

    PubMed

    Faherty, Laura J; Wong, Charlene A; Feingold, Jordyn; Li, Joan; Town, Robert; Fieldston, Evan; Werner, Rachel M

    2017-10-01

    Price transparency is gaining importance as families' share of health care costs rises. We describe (1) online price transparency data for pediatric care on children's hospital Web sites and state-based price transparency Web sites, and (2) the consumer experience of obtaining an out-of-pocket estimate from children's hospitals for a common procedure. From 2015 to 2016, we audited 45 children's hospital Web sites and 38 state-based price transparency Web sites, describing the availability and characteristics of health care prices and personalized cost estimate tools. Using secret shopper methodology, we called children's hospitals and submitted online estimate requests posing as a self-paying family requesting an out-of-pocket estimate for a tonsillectomy-adenoidectomy. Eight children's hospital Web sites (18%) listed prices. Twelve (27%) provided a personalized cost estimate tool (online form, n = 5, and/or phone number, n = 9). All 9 hospitals with a phone number for estimates provided the estimated patient liability for a tonsillectomy-adenoidectomy (mean $6008, range $2622-$9840). Of the remaining 36 hospitals without a dedicated price estimate phone number, 21 (58%) provided estimates (mean $7144, range $1200-$15,360). Two of 4 hospitals with online forms provided estimates. Fifteen state-based Web sites (39%) distinguished between prices for pediatric and adult care. One had a personalized cost estimate tool. Meaningful prices for pediatric care were not widely available online through children's hospital or state-based price transparency Web sites. A phone line or an online form for price estimates was an effective strategy for hospitals to provide out-of-pocket price information. Opportunities exist to improve pediatric price transparency. Copyright © 2017 by the American Academy of Pediatrics.

  14. UnCover on the Web: search hints and applications in library environments.

    PubMed

    Galpern, N F; Albert, K M

    1997-01-01

    Among the huge maze of resources available on the Internet, UnCoverWeb stands out as a valuable tool for medical libraries. This up-to-date, free-access, multidisciplinary database of periodical references is searched through an easy-to-learn graphical user interface that is a welcome improvement over the telnet version. This article reviews the basic and advanced search techniques for UnCoverWeb, as well as providing information on the document delivery functions and table of contents alerting service called Reveal. UnCover's currency is evaluated and compared with other current awareness resources. System deficiencies are discussed, with the conclusion that although UnCoverWeb lacks the sophisticated features of many commercial database search services, it is nonetheless a useful addition to the repertoire of information sources available in a library.

  15. Translation, cross-cultural adaptation and validation of the Diabetes Empowerment Scale – Short Form

    PubMed Central

    Chaves, Fernanda Figueredo; Reis, Ilka Afonso; Pagano, Adriana Silvina; Torres, Heloísa de Carvalho

    2017-01-01

    ABSTRACT OBJECTIVE To translate, cross-culturally adapt and validate the Diabetes Empowerment Scale – Short Form for assessment of psychosocial self-efficacy in diabetes care within the Brazilian cultural context. METHODS Assessment of the instrument's conceptual equivalence, as well as its translation and cross-cultural adaptation, was performed following international standards. The Expert Committee's assessment of the translated version was conducted through a web questionnaire developed and applied via the web tool e-Surv. The cross-culturally adapted version was used for the pre-test, which was carried out via phone call with a group of eleven health care service users diagnosed with type 2 diabetes mellitus. The pre-test results were examined by a group of experts, composed of health care consultants, applied linguists and statisticians, aiming at an adequate version of the instrument, which was subsequently used for test and retest in a sample of 100 users diagnosed with type 2 diabetes mellitus via phone call, their answers being recorded with the web tool e-Surv. Internal consistency and reproducibility analyses were carried out within the statistical programming environment R. RESULTS Face and content validity were attained and the Brazilian Portuguese version, entitled Escala de Autoeficácia em Diabetes – Versão Curta, was established. The scale had acceptable internal consistency, with a Cronbach's alpha of 0.634 (95%CI 0.494–0.737), while the correlation of the total score between the two test periods was considered moderate (0.47). The intraclass correlation coefficient was 0.50. CONCLUSIONS The translated and cross-culturally adapted version of the instrument in Brazilian Portuguese was considered valid and reliable for assessment within the Brazilian population diagnosed with type 2 diabetes mellitus.
The use of a web tool (e-Surv) for recording the Expert Committee responses as well as the responses in the validation tests proved to be a reliable, safe and innovative method. PMID:28355337
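    The internal-consistency statistic reported above was computed in R; as an illustration, Cronbach's alpha for a k-item scale can be computed directly from its definition, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). This is the generic formula, not the authors' script.

    ```python
    def variance(xs):
        """Sample variance (n - 1 denominator)."""
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    def cronbach_alpha(item_scores):
        """item_scores: one list per item, each holding the respondents' scores
        in the same respondent order across items."""
        k = len(item_scores)
        totals = [sum(col) for col in zip(*item_scores)]  # per-respondent totals
        item_var = sum(variance(item) for item in item_scores)
        return k / (k - 1) * (1 - item_var / variance(totals))
    ```

    Two perfectly correlated items yield an alpha of exactly 1.0, which is a quick sanity check for any implementation.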

  16. RefEx, a reference gene expression dataset as a web tool for the functional analysis of genes.

    PubMed

    Ono, Hiromasa; Ogasawara, Osamu; Okubo, Kosaku; Bono, Hidemasa

    2017-08-29

    Gene expression data are accumulating exponentially; thus, functional annotation of such sequence data from metadata is urgently required. However, life scientists have difficulty utilizing the available data due to its sheer magnitude and complicated access. We have developed a web tool for browsing reference gene expression patterns of mammalian tissues and cell lines measured using different methods, which should facilitate the reuse of the precious data archived in several public databases. The web tool, called Reference Expression dataset (RefEx), allows users to search by gene name, various types of IDs, chromosomal regions in genetic maps, gene family based on InterPro, gene expression patterns, or biological categories based on Gene Ontology. RefEx also provides information about genes with tissue-specific expression, and the relative gene expression values are shown as choropleth maps on 3D human body images from BodyParts3D. Combined with the newly incorporated Functional Annotation of Mammals (FANTOM) dataset, RefEx provides insight regarding the functional interpretation of unfamiliar genes. RefEx is publicly available at http://refex.dbcls.jp/.

  17. TogoTable: cross-database annotation system using the Resource Description Framework (RDF) data model.

    PubMed

    Kawano, Shin; Watanabe, Tsutomu; Mizuguchi, Sohei; Araki, Norie; Katayama, Toshiaki; Yamaguchi, Atsuko

    2014-07-01

    TogoTable (http://togotable.dbcls.jp/) is a web tool that adds user-specified annotations to a table that a user uploads. Annotations are drawn from several biological databases that use the Resource Description Framework (RDF) data model. TogoTable uses database identifiers (IDs) in the table as query keys for searching. RDF data, which form a network called Linked Open Data (LOD), can be searched from SPARQL endpoints using the SPARQL query language. Because TogoTable uses RDF, it can integrate annotations not only from the reference database to which the IDs originally belong, but also from externally linked databases via the LOD network. For example, annotations in the Protein Data Bank can be retrieved using a GeneID through links provided by the UniProt RDF. Because RDF has been standardized by the World Wide Web Consortium, any database with annotations based on the RDF data model can be easily incorporated into this tool. We believe that TogoTable is a valuable Web tool, particularly for experimental biologists who need to process huge amounts of data such as high-throughput experimental output. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
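    The kind of endpoint lookup TogoTable performs can be sketched as a templated SPARQL query keyed on a database ID. The prefix and predicate below follow the public UniProt RDF vocabulary, but the query is illustrative, not TogoTable internals.

    ```python
    from string import Template

    # Illustrative query skeleton: fetch annotations attached to one
    # UniProt entry. Adapt the predicate for other LOD databases.
    QUERY = Template("""\
    PREFIX up: <http://purl.uniprot.org/core/>
    SELECT ?annotation
    WHERE {
      <http://purl.uniprot.org/uniprot/$acc> up:annotation ?annotation .
    }
    LIMIT $limit
    """)

    def build_query(accession, limit=10):
        """Substitute a UniProt accession (e.g. from an uploaded table column)
        into the query template; the result is POSTed to a SPARQL endpoint."""
        return QUERY.substitute(acc=accession, limit=limit)
    ```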

  18. RefEx, a reference gene expression dataset as a web tool for the functional analysis of genes

    PubMed Central

    Ono, Hiromasa; Ogasawara, Osamu; Okubo, Kosaku; Bono, Hidemasa

    2017-01-01

    Gene expression data are exponentially accumulating; thus, the functional annotation of such sequence data from metadata is urgently required. However, life scientists have difficulty utilizing the available data due to its sheer magnitude and complicated access. We have developed a web tool for browsing reference gene expression pattern of mammalian tissues and cell lines measured using different methods, which should facilitate the reuse of the precious data archived in several public databases. The web tool is called Reference Expression dataset (RefEx), and RefEx allows users to search by the gene name, various types of IDs, chromosomal regions in genetic maps, gene family based on InterPro, gene expression patterns, or biological categories based on Gene Ontology. RefEx also provides information about genes with tissue-specific expression, and the relative gene expression values are shown as choropleth maps on 3D human body images from BodyParts3D. Combined with the newly incorporated Functional Annotation of Mammals (FANTOM) dataset, RefEx provides insight regarding the functional interpretation of unfamiliar genes. RefEx is publicly available at http://refex.dbcls.jp/. PMID:28850115

  19. MySQL/PHP web database applications for IPAC proposal submission

    NASA Astrophysics Data System (ADS)

    Crane, Megan K.; Storrie-Lombardi, Lisa J.; Silbermann, Nancy A.; Rebull, Luisa M.

    2008-07-01

    The Infrared Processing and Analysis Center (IPAC) is NASA's multi-mission center of expertise for long-wavelength astrophysics. Proposals for various IPAC missions and programs are ingested via MySQL/PHP web database applications. Proposers use web forms to enter coversheet information and upload PDF files related to the proposal. Upon proposal submission, a unique directory is created on the webserver into which all of the uploaded files are placed. The coversheet information is converted into a PDF file using a PHP extension called FPDF. The files are concatenated into one PDF file using the command-line tool pdftk and then forwarded to the review committee. This work was performed at the California Institute of Technology under contract to the National Aeronautics and Space Administration.
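    The concatenation step described above can be sketched in a few lines; the file names below are hypothetical, but `pdftk in1.pdf in2.pdf cat output out.pdf` is the tool's standard join syntax.

    ```python
    def concat_command(cover_pdf, uploaded_pdfs, out_pdf):
        """Build the pdftk invocation that joins the generated coversheet
        with the uploaded proposal files into a single PDF."""
        return ["pdftk", cover_pdf, *uploaded_pdfs, "cat", "output", out_pdf]
    ```

    The returned argument list would be handed to something like `subprocess.run(..., check=True)` from the submission handler, with the merged file then forwarded to the review committee.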

  20. Integration of modern statistical tools for the analysis of climate extremes into the web-GIS “CLIMATE”

    NASA Astrophysics Data System (ADS)

    Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The frequency of occurrence and magnitude of extreme precipitation and temperature events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrence, and mitigate their effects. For this purpose, we augmented the web-GIS “CLIMATE” with a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new, powerful methods of time-dependent statistics of extremes, quantile regression, and the copula approach for the detailed analysis of various climate extreme events. In particular, the very promising copula approach allows one to obtain the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.

  1. search GenBank: interactive orchestration and ad-hoc choreography of Web services in the exploration of the biomedical resources of the National Center For Biotechnology Information

    PubMed Central

    2013-01-01

    Background Due to the growing number of biomedical entries in data repositories of the National Center for Biotechnology Information (NCBI), it is difficult for third-party software developers to collect, manage and process all of these entries in one place without significant investment in hardware and software infrastructure, its maintenance and administration. Web services allow development of software applications that integrate in one place the functionality and processing logic of distributed software components, without integrating the components themselves and without integrating the resources to which they have access. This is achieved by appropriate orchestration or choreography of available Web services and their shared functions. After the successful application of Web services in the business sector, this technology can now be used to build composite software tools that are oriented towards biomedical data processing. Results We have developed a new tool for efficient and dynamic data exploration in GenBank and other NCBI databases. A dedicated search GenBank system makes use of NCBI Web services and a package of Entrez Programming Utilities (eUtils) in order to provide extended searching capabilities in NCBI data repositories. In search GenBank users can follow one of three exploration paths: simple data searching based on the specified user’s query, advanced data searching based on the specified user’s query, and advanced data exploration with the use of macros. search GenBank orchestrates calls of particular tools available through the NCBI Web service providing the requested functionality, while users interactively browse selected records in search GenBank and traverse between NCBI databases using available links. On the other hand, by building macros in the advanced data exploration mode, users create choreographies of eUtils calls, which can lead to the automatic discovery of related data in the specified databases.
Conclusions search GenBank extends the standard capabilities of the NCBI Entrez search engine in querying biomedical databases. The possibility of creating and saving macros in search GenBank is a unique feature with great potential, which will grow further with the increasing density of networks of relationships between data stored in particular databases. search GenBank is available for public use at http://sgb.biotools.pl/. PMID:23452691
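A typical eUtils choreography, as described above, chains esearch (find record IDs for a query) into efetch (retrieve the full records). A minimal Python sketch using only the publicly documented eUtils endpoints and parameters; the helper functions are hypothetical and no network request is made here:

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def esearch_url(db, term, retmax=20):
    """Step 1: build an esearch request that returns IDs matching a query."""
    return f"{EUTILS}/esearch.fcgi?" + urlencode(
        {"db": db, "term": term, "retmax": retmax})

def efetch_url(db, ids, rettype="gb"):
    """Step 2: build an efetch request for the IDs returned by esearch."""
    return f"{EUTILS}/efetch.fcgi?" + urlencode(
        {"db": db, "id": ",".join(ids), "rettype": rettype})

print(esearch_url("nucleotide", "BRCA1[Gene] AND human[Organism]"))
print(efetch_url("nucleotide", ["1234", "5678"]))
```

A macro in the sense used above would be a saved sequence of such calls, where the IDs harvested from one response parameterize the next request (possibly against a different database via elink).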

  2. search GenBank: interactive orchestration and ad-hoc choreography of Web services in the exploration of the biomedical resources of the National Center For Biotechnology Information.

    PubMed

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Siążnik, Artur

    2013-03-01

    Due to the growing number of biomedical entries in data repositories of the National Center for Biotechnology Information (NCBI), it is difficult for third-party software developers to collect, manage and process all of these entries in one place without significant investment in hardware and software infrastructure, its maintenance and administration. Web services allow development of software applications that integrate in one place the functionality and processing logic of distributed software components, without integrating the components themselves and without integrating the resources to which they have access. This is achieved by appropriate orchestration or choreography of available Web services and their shared functions. After the successful application of Web services in the business sector, this technology can now be used to build composite software tools that are oriented towards biomedical data processing. We have developed a new tool for efficient and dynamic data exploration in GenBank and other NCBI databases. A dedicated search GenBank system makes use of NCBI Web services and a package of Entrez Programming Utilities (eUtils) in order to provide extended searching capabilities in NCBI data repositories. In search GenBank users can follow one of three exploration paths: simple data searching based on the specified user's query, advanced data searching based on the specified user's query, and advanced data exploration with the use of macros. search GenBank orchestrates calls of particular tools available through the NCBI Web service providing the requested functionality, while users interactively browse selected records in search GenBank and traverse between NCBI databases using available links. On the other hand, by building macros in the advanced data exploration mode, users create choreographies of eUtils calls, which can lead to the automatic discovery of related data in the specified databases.
search GenBank extends the standard capabilities of the NCBI Entrez search engine in querying biomedical databases. The possibility of creating and saving macros in search GenBank is a unique feature with great potential, which will grow further with the increasing density of networks of relationships between data stored in particular databases. search GenBank is available for public use at http://sgb.biotools.pl/.

  3. An advanced web query interface for biological databases

    PubMed Central

    Latendresse, Mario; Karp, Peter D.

    2010-01-01

    Although most web-based biological databases (DBs) offer some type of web-based form to allow users to author DB queries, these query forms are quite restricted in the complexity of the DB queries they can formulate. They can typically query only one DB, and can query only a single type of object at a time (e.g. genes) with no possible interaction between the objects; that is, in SQL parlance, no joins are allowed between DB objects. Writing precise queries against biological DBs is usually left to a programmer skillful enough in complex DB query languages like SQL. We present a web interface for building precise queries for biological DBs that can construct much more precise queries than most web-based query forms, yet is user-friendly enough to be used by biologists. It supports queries containing multiple conditions and connecting multiple object types, without using the join concept, which is unintuitive to biologists. This interactive web interface is called the Structured Advanced Query Page (SAQP). Users interactively build up a wide range of query constructs. Interactive documentation within the SAQP describes the schema of the queried DBs. The SAQP is based on BioVelo, a query language based on list comprehension. The SAQP is part of the Pathway Tools software and is available as part of several bioinformatics web sites powered by Pathway Tools, including the BioCyc.org site that contains more than 500 Pathway/Genome DBs. PMID:20624715
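Since BioVelo is based on list comprehension, the flavor of a multi-object query without an explicit join can be suggested with an ordinary Python list comprehension over toy data. The records and field names below are hypothetical illustrations, not the actual PGDB schema:

```python
# Toy records standing in for two DB object types (hypothetical data).
genes = [
    {"name": "trpA", "product": "tryptophan synthase alpha chain"},
    {"name": "lacZ", "product": "beta-galactosidase"},
]
pathways = [
    {"name": "tryptophan biosynthesis", "genes": ["trpA", "trpB"]},
    {"name": "lactose degradation", "genes": ["lacZ", "lacY"]},
]

# Connect two object types the list-comprehension way: pair each gene with
# every pathway whose gene list contains it -- no explicit join keyword.
hits = [
    (g["name"], p["name"])
    for g in genes
    for p in pathways
    if g["name"] in p["genes"]
]
print(hits)  # [('trpA', 'tryptophan biosynthesis'), ('lacZ', 'lactose degradation')]
```

The condition clause plays the role of the join predicate, which is what makes this style of query more approachable for biologists than SQL.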

  4. Creation and utilization of a World Wide Web based space radiation effects code: SIREST

    NASA Technical Reports Server (NTRS)

    Singleterry, R. C. Jr; Wilson, J. W.; Shinn, J. L.; Tripathi, R. K.; Thibeault, S. A.; Noor, A. K.; Cucinotta, F. A.; Badavi, F. F.; Chang, C. K.; Qualls, G. D.; hide

    2001-01-01

    In order for humans and electronics to fully and safely operate in the space environment, codes like HZETRN (High Charge and Energy Transport) must be included in any designer's toolbox for design evaluation with respect to radiation damage. Currently, spacecraft designers do not have easy access to accurate radiation codes like HZETRN to evaluate their designs for radiation effects on humans and electronics. Today, the World Wide Web is sophisticated enough to support the entire HZETRN code and all of the associated pre- and post-processing tools. This package is called SIREST (Space Ionizing Radiation Effects and Shielding Tools). There are many advantages to SIREST. The most important is the instant update capability of the web. Another major advantage is the modularity that the web imposes on the code. At present, the major disadvantage of SIREST is its modularity inside the designer's system. This stems mainly from the fact that a consistent interface between the designer and the computer system used to evaluate the design is incomplete. This, however, is to be solved in the Intelligent Synthesis Environment (ISE) program currently being funded by NASA.

  5. The Impact of Oral and Written Feedback on EFL Writers with the Use of Screencasts

    ERIC Educational Resources Information Center

    Alvira, Roberto

    2016-01-01

    This article, based on an action research study performed at a Colombian middle-sized private university, proposes specific strategies to provide feedback to English as a foreign language learners and uses a Web 2.0 tool called "screencasting." The findings of the study suggest that the use of coded, written, and oral feedback is widely…

  6. The Electron Microscopy Outreach Program: A Web-based resource for research and education.

    PubMed

    Sosinsky, G E; Baker, T S; Hand, G; Ellisman, M H

    1999-01-01

    We have developed a centralized World Wide Web (WWW)-based environment that serves as a resource of software tools and expertise for biological electron microscopy. A major focus is molecular electron microscopy, but the site also includes information and links on structural biology at all levels of resolution. This site serves to help integrate or link structural biology techniques in accordance with user needs. The WWW site, called the Electron Microscopy (EM) Outreach Program (URL: http://emoutreach.sdsc.edu), provides scientists with computational and educational tools for their research and edification. In particular, we have set up a centralized resource containing course notes, references, and links to image analysis and three-dimensional reconstruction software for investigators wanting to learn about EM techniques either within or outside of their fields of expertise. Copyright 1999 Academic Press.

  7. Pathway Tools version 13.0: integrated software for pathway/genome informatics and systems biology

    PubMed Central

    Paley, Suzanne M.; Krummenacker, Markus; Latendresse, Mario; Dale, Joseph M.; Lee, Thomas J.; Kaipa, Pallavi; Gilham, Fred; Spaulding, Aaron; Popescu, Liviu; Altman, Tomer; Paulsen, Ian; Keseler, Ingrid M.; Caspi, Ron

    2010-01-01

    Pathway Tools is a production-quality software environment for creating a type of model-organism database called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc integrates the evolving understanding of the genes, proteins, metabolic network and regulatory network of an organism. This article provides an overview of Pathway Tools capabilities. The software performs multiple computational inferences including prediction of metabolic pathways, prediction of metabolic pathway hole fillers and prediction of operons. It enables interactive editing of PGDBs by DB curators. It supports web publishing of PGDBs, and provides a large number of query and visualization tools. The software also supports comparative analyses of PGDBs, and provides several systems biology analyses of PGDBs including reachability analysis of metabolic networks, and interactive tracing of metabolites through a metabolic network. More than 800 PGDBs have been created using Pathway Tools by scientists around the world, many of which are curated DBs for important model organisms. Those PGDBs can be exchanged using a peer-to-peer DB sharing system called the PGDB Registry. PMID:19955237

  8. The semantic web and computer vision: old AI meets new AI

    NASA Astrophysics Data System (ADS)

    Mundy, J. L.; Dong, Y.; Gilliam, A.; Wagner, R.

    2018-04-01

    There has been vast progress in linking semantic information across the billions of web pages through the use of ontologies encoded in the Web Ontology Language (OWL) based on the Resource Description Framework (RDF). A prime example is Wikipedia, where the knowledge contained in its more than four million pages is encoded in an ontological database called DBPedia http://wiki.dbpedia.org/. Web-based query tools can retrieve semantic information from DBPedia encoded in interlinked ontologies that can be accessed using natural language. This paper shows how this vast context can be used to automate the process of querying images and other geospatial data in support of reporting changes in structures and activities. Computer vision algorithms are selected and provided with context based on natural language requests for monitoring and analysis. The resulting reports provide semantically linked observations from images and 3D surface models.

  9. miRNAFold: a web server for fast miRNA precursor prediction in genomes.

    PubMed

    Tav, Christophe; Tempel, Sébastien; Poligny, Laurent; Tahi, Fariza

    2016-07-08

    Computational methods are required for the prediction of non-coding RNAs (ncRNAs), which are involved in many biological processes, especially at the post-transcriptional level. Among these ncRNAs, miRNAs have been widely studied, and biologists need efficient and fast tools for their identification. In particular, ab initio methods are usually required when predicting novel miRNAs. Here we present a web server dedicated to large-scale miRNA precursor identification in genomes. It is based on an algorithm called miRNAFold that predicts miRNA hairpin structures quickly and with high sensitivity. miRNAFold is implemented as a web server with an intuitive and user-friendly interface, as well as a standalone version. The web server is freely available at: http://EvryRNA.ibisc.univ-evry.fr/miRNAFold. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
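As a toy illustration of what a hairpin is structurally (miRNAFold's actual algorithm is far more sophisticated than this), here is a minimal Python check for a perfect stem-loop: the 3' stem must be the reverse complement of the 5' stem. The function and its parameters are hypothetical:

```python
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def is_perfect_hairpin(seq, stem_len, loop_len):
    """Return True when an RNA sequence of length stem+loop+stem folds
    into a perfect hairpin, i.e. the 3' stem is the reverse complement
    of the 5' stem. Real predictors allow mismatches, bulges, and use
    folding energies; this is only a structural sketch."""
    if len(seq) != 2 * stem_len + loop_len:
        return False
    left = seq[:stem_len]
    right = seq[-stem_len:]
    return all(COMPLEMENT[b] == rb for b, rb in zip(left, reversed(right)))

# 5' GCGC stem, AAAA loop, GCGC closing stem (reverse complement of GCGC)
print(is_perfect_hairpin("GCGCAAAAGCGC", stem_len=4, loop_len=4))  # True
```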

  10. The relation between media promotions and service volume for a statewide tobacco quitline and a web-based cessation program

    PubMed Central

    2011-01-01

    Background This observational study assessed the relation between mass media campaigns and service volume for a statewide tobacco cessation quitline and stand-alone web-based cessation program. Methods Multivariate regression analysis was used to identify how weekly calls to a cessation quitline and weekly registrations to a web-based cessation program are related to levels of broadcast media, media campaigns, and media types, controlling for the impact of external and earned media events. Results There was a positive relation between weekly broadcast targeted rating points and the number of weekly calls to a cessation quitline and the number of weekly registrations to a web-based cessation program. Additionally, print secondhand smoke ads and online cessation ads were positively related to weekly quitline calls. Television and radio cessation ads and radio smoke-free law ads were positively related to web program registration levels. There was a positive relation between the number of web registrations and the number of calls to the cessation quitline, with increases in registrations to the web in 1 week corresponding to increases in calls to the quitline in the subsequent week. Web program registration levels were more highly influenced by earned media and other external events than were quitline call volumes. Conclusion Overall, broadcast advertising had a greater impact on registrations for the web program than calls to the quitline. Furthermore, registrations for the web program influenced calls to the quitline. These two findings suggest the evolving roles of web-based cessation programs and Internet-use practices should be considered when creating cessation programs and media campaigns to promote them. 
Additionally, because different types of media and campaigns were positively associated with calls to the quitline and web registrations, developing mass media campaigns that offer a variety of messages and communicate through different types of media to motivate tobacco users to seek services appears important to reach tobacco users. Further research is needed to better understand the complexities and opportunities involved in simultaneous promotion of quitline and web-based cessation services. PMID:22177237

  11. The relation between media promotions and service volume for a statewide tobacco quitline and a web-based cessation program.

    PubMed

    Schillo, Barbara A; Mowery, Andrea; Greenseid, Lija O; Luxenberg, Michael G; Zieffler, Andrew; Christenson, Matthew; Boyle, Raymond G

    2011-12-16

    This observational study assessed the relation between mass media campaigns and service volume for a statewide tobacco cessation quitline and stand-alone web-based cessation program. Multivariate regression analysis was used to identify how weekly calls to a cessation quitline and weekly registrations to a web-based cessation program are related to levels of broadcast media, media campaigns, and media types, controlling for the impact of external and earned media events. There was a positive relation between weekly broadcast targeted rating points and the number of weekly calls to a cessation quitline and the number of weekly registrations to a web-based cessation program. Additionally, print secondhand smoke ads and online cessation ads were positively related to weekly quitline calls. Television and radio cessation ads and radio smoke-free law ads were positively related to web program registration levels. There was a positive relation between the number of web registrations and the number of calls to the cessation quitline, with increases in registrations to the web in 1 week corresponding to increases in calls to the quitline in the subsequent week. Web program registration levels were more highly influenced by earned media and other external events than were quitline call volumes. Overall, broadcast advertising had a greater impact on registrations for the web program than calls to the quitline. Furthermore, registrations for the web program influenced calls to the quitline. These two findings suggest the evolving roles of web-based cessation programs and Internet-use practices should be considered when creating cessation programs and media campaigns to promote them. 
Additionally, because different types of media and campaigns were positively associated with calls to the quitline and web registrations, developing mass media campaigns that offer a variety of messages and communicate through different types of media to motivate tobacco users to seek services appears important to reach tobacco users. Further research is needed to better understand the complexities and opportunities involved in simultaneous promotion of quitline and web-based cessation services.

  12. [EEQ in clinical embryology: a starting program].

    PubMed

    Boyer, Pierre; Brugnon, Florence; Levy, Rachel; Pfeffer, Jérôme; Siest, Jean-Pascal

    2014-01-01

    Every laboratory, including those working in assisted reproductive technologies (ART), has to be accredited to EN ISO 15189 before 2020. This standardisation includes an external quality evaluation (EQE). In order to work out an EQE tool, we used images extracted from our own database developed during daily practice. We produced an easy-to-use online tool called "EEQ en embryologie clinique", developed on the Biologie prospective web site with the expertise of the French ART biologists association (Blefco) in the evaluation of early human embryonic stages. In 2013, 38 ART laboratories participated in the first programme, with more than 90% appropriate results. The present article aims at describing this tool and discussing its limits.

  13. GDSCalc: A Web-Based Application for Evaluating Discrete Graph Dynamical Systems

    PubMed Central

    Elmeligy Abdelhamid, Sherif H.; Kuhlman, Chris J.; Marathe, Madhav V.; Mortveit, Henning S.; Ravi, S. S.

    2015-01-01

    Discrete dynamical systems are used to model various realistic systems in network science, from social unrest in human populations to regulation in biological networks. A common approach is to model the agents of a system as vertices of a graph, and the pairwise interactions between agents as edges. Agents are in one of a finite set of states at each discrete time step and are assigned functions that describe how their states change based on neighborhood relations. Full characterization of state transitions of one system can give insights into fundamental behaviors of other dynamical systems. In this paper, we describe a discrete graph dynamical systems (GDSs) application called GDSCalc for computing and characterizing system dynamics. It is an open access system that is used through a web interface. We provide an overview of GDS theory. This theory is the basis of the web application; i.e., an understanding of GDS provides an understanding of the software features, while abstracting away implementation details. We present a set of illustrative examples to demonstrate its use in education and research. Finally, we compare GDSCalc with other discrete dynamical system software tools. Our perspective is that no single software tool will perform all computations that may be required by all users; tools typically have particular features that are more suitable for some tasks. We situate GDSCalc within this space of software tools. PMID:26263006
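The synchronous update described above can be sketched in a few lines of Python. Threshold functions are a standard example of GDS vertex functions (GDSCalc itself supports the general framework; the graph, states, and function here are hypothetical):

```python
def gds_step(adjacency, state, threshold=2):
    """One synchronous step of a toy graph dynamical system: a vertex
    becomes 1 when at least `threshold` of its neighbours are 1, else 0."""
    return {
        v: 1 if sum(state[u] for u in nbrs) >= threshold else 0
        for v, nbrs in adjacency.items()
    }

# A 4-cycle (hypothetical example graph) with two active vertices.
adjacency = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
state = {0: 1, 1: 0, 2: 1, 3: 0}
print(gds_step(adjacency, state))  # {0: 0, 1: 1, 2: 0, 3: 1}
```

Iterating `gds_step` and recording each state visited traces the system's trajectory; characterizing all such trajectories over all initial states is the kind of full state-transition analysis the paper refers to.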

  14. GDSCalc: A Web-Based Application for Evaluating Discrete Graph Dynamical Systems.

    PubMed

    Elmeligy Abdelhamid, Sherif H; Kuhlman, Chris J; Marathe, Madhav V; Mortveit, Henning S; Ravi, S S

    2015-01-01

    Discrete dynamical systems are used to model various realistic systems in network science, from social unrest in human populations to regulation in biological networks. A common approach is to model the agents of a system as vertices of a graph, and the pairwise interactions between agents as edges. Agents are in one of a finite set of states at each discrete time step and are assigned functions that describe how their states change based on neighborhood relations. Full characterization of state transitions of one system can give insights into fundamental behaviors of other dynamical systems. In this paper, we describe a discrete graph dynamical systems (GDSs) application called GDSCalc for computing and characterizing system dynamics. It is an open access system that is used through a web interface. We provide an overview of GDS theory. This theory is the basis of the web application; i.e., an understanding of GDS provides an understanding of the software features, while abstracting away implementation details. We present a set of illustrative examples to demonstrate its use in education and research. Finally, we compare GDSCalc with other discrete dynamical system software tools. Our perspective is that no single software tool will perform all computations that may be required by all users; tools typically have particular features that are more suitable for some tasks. We situate GDSCalc within this space of software tools.

  15. Epidemiology of operative burns at Kijabe Hospital from 2006 to 2010: Pilot study of a web-based tool for creation of the Kenya Burn Repository

    PubMed Central

    Dale, Elizabeth L.; Mueller, Melissa A.; Wang, Li; Fogerty, Mary D.; Guy, Jeffrey S.; Nthumba, Peter M.

    2013-01-01

    Introduction In order to implement effective burn prevention strategies, the WHO has called for improved data collection to better characterize burn injuries in low and middle income countries (LMIC). This study was designed to gather information on burn injury in Kenya and to test a model for such data collection. Methods The study was designed as a retrospective case series study utilizing an electronic data collection tool to assess the scope of burn injuries requiring operation at Kijabe Hospital from January 2006 to May 2010. Data were entered into a web-based tool to test its utility as the potential Kenya Burn Repository (KBR). Results 174 patients were included. The median age was 10 years. There was a male predominance (59% vs. 41%). Findings included that timing of presentation was associated with burn etiology (p = 0.009). Length of stay (LOS) was associated with burn etiology (p < 0.001). Etiology differed depending on the age group, with scald being most prominent in children (p = 0.002). Conclusions Burn injuries in Kenya show similarities with other LMIC in etiology and pediatric predominance. Late presentation for care and prolonged LOS are areas for further investigation. The web-based database is an effective tool for data collection and international collaboration. PMID:23040425

  16. Epidemiology of operative burns at Kijabe Hospital from 2006 to 2010: pilot study of a web-based tool for creation of the Kenya Burn Repository.

    PubMed

    Dale, Elizabeth L; Mueller, Melissa A; Wang, Li; Fogerty, Mary D; Guy, Jeffrey S; Nthumba, Peter M

    2013-06-01

    In order to implement effective burn prevention strategies, the WHO has called for improved data collection to better characterize burn injuries in low and middle income countries (LMIC). This study was designed to gather information on burn injury in Kenya and to test a model for such data collection. The study was designed as a retrospective case series study utilizing an electronic data collection tool to assess the scope of burn injuries requiring operation at Kijabe Hospital from January 2006 to May 2010. Data were entered into a web-based tool to test its utility as the potential Kenya Burn Repository (KBR). 174 patients were included. The median age was 10 years. There was a male predominance (59% vs. 41%). Findings included that timing of presentation was associated with burn etiology (p=0.009). Length of stay (LOS) was associated with burn etiology (p<0.001). Etiology differed depending on the age group, with scald being most prominent in children (p=0.002). Burn injuries in Kenya show similarities with other LMIC in etiology and pediatric predominance. Late presentation for care and prolonged LOS are areas for further investigation. The web-based database is an effective tool for data collection and international collaboration. Copyright © 2012 Elsevier Ltd and ISBI. All rights reserved.

  17. Using Collaborative Web Technology to Construct the Health Information National Trends Survey (HINTS)

    PubMed Central

    MOSER, RICHARD P.; BECKJORD, ELLEN BURKE; RUTTEN, LILA J. FINNEY; BLAKE, KELLY; HESSE, BRADFORD W.

    2012-01-01

    Scientists are taking advantage of web-based technology to work in new collaborative environments, a phenomenon known as Science 2.0. The National Cancer Institute (NCI) created a web-based tool called HINTS-GEM that allows a diverse group of stakeholders to collaborate in a virtual environment by providing input on content for the Health Information National Trends Survey (HINTS). This involved stakeholders providing new suggested content and commenting on and rating existing content. HINTS is a nationally representative survey of the US non-institutionalized adult population (see Finney Rutten et al. [this journal] for more information about the HINTS program). This paper describes the conceptual development of HINTS-GEM and provides results of its use by stakeholders in creating an improved survey instrument. PMID:23020764

  18. DRIFTER Web App Development Support

    NASA Technical Reports Server (NTRS)

    Davis, Derrick D.; Armstrong, Curtis D.

    2015-01-01

    During my 2015 internship at Stennis Space Center (SSC) I supported the development of a web-based tool to enable user interaction with a low-cost environmental monitoring buoy called the DRIFTER. DRIFTERs are designed by SSC's Applied Science and Technology Projects branch and are used to measure parameters such as water temperature and salinity. Data collected by the buoys help verify measurements by NASA satellites, which contributes to NASA's mission to advance understanding of the Earth by developing technologies to improve the quality of life on our home planet. My main objective during this internship was to support the development of the DRIFTER by writing web-based software that allows the public to view and access data collected by the buoys. In addition, this software would enable DRIFTER owners to configure and control the devices.

  19. A Reflective E-Learning Journey from the Dawn of CALL to Web 2.0 Intercultural Communicative Competence (ICC)

    ERIC Educational Resources Information Center

    Orsini-Jones, Marina

    2015-01-01

    The author reflects on how she has used technology as a language teacher since the mid-1980s. She describes the evolution of technology in language learning before and after the Internet, from "blended learning" to social media. She concludes with the telecollaboration projects she has recently been working on and the issues they have found…

  20. A New Tool for Managing Students' Self-Evaluations in Traditional and Distance Education Courses.

    ERIC Educational Resources Information Center

    Contreras-Castillo, Juan; Block, Arthur Edwards

    This paper describes a Web-based question-answering system called E-teacher that can be used in both traditional and distance learning courses to review academic contents. E-teacher is a self-editing template-based system that consists of a set of PHP scripts that generate the HTML code dynamically, or "on the fly." The entire E-teacher…

  1. Faculty Usage of Social Media and Mobile Devices: Analysis of Advantages and Concerns

    ERIC Educational Resources Information Center

    Roebuck, Deborah Britt; Siha, Samia; Bell, Reginald L.

    2013-01-01

    This study seeks to understand the perceptions of professors using social media (also called Web 2.0 tools) in the classroom, the kinds of mobile devices used to access that social media, and what drives individuals to use them. In addition, it seeks to identify the advantages and concerns faculty have with the use of social media for…

  2. The national web-based outbreak rapid alert system in Norway: eight years of experience, 2006-2013.

    PubMed

    Guzman-Herrador, B; Vold, L; Berg, T; Berglund, T M; Heier, B; Kapperud, G; Lange, H; Nygård, K

    2016-01-01

    In 2005, the Norwegian Institute of Public Health established a web-based outbreak rapid alert system called Vesuv. The system is used for mandatory outbreak alerts from municipal medical officers, healthcare institutions, and food safety authorities. As of 2013, 1426 outbreaks had been reported, involving 32,913 cases. More than half of the outbreaks occurred in healthcare institutions (759 outbreaks, 53.2%). A total of 474 (33.2%) outbreaks were associated with food or drinking water. The web-based rapid alert system has proved to be a helpful tool by enhancing reporting and enabling rapid and efficient information sharing between different authorities at both the local and national levels. It is also an important tool for event-based reporting, as required by the International Health Regulations (IHR) 2005. Collecting information from all the outbreak alerts and reports in a national database is also useful for analysing trends, such as the occurrence of certain microorganisms, places or sources of infection, or routes of transmission. This can facilitate the identification of specific areas where more general preventive measures are needed.

  3. MAPI: towards the integrated exploitation of bioinformatics Web Services.

    PubMed

    Ramirez, Sergio; Karlsson, Johan; Trelles, Oswaldo

    2011-10-27

    Bioinformatics is commonly presented as a well-assorted list of available web resources. Although diversity of services is positive in general, the proliferation of tools, their dispersion and their heterogeneity complicate the integrated exploitation of such data processing capacity. To facilitate the construction of software clients and make integrated use of this variety of tools, we present a modular programmatic application interface (MAPI) that provides the necessary functionality for the uniform representation of Web Services metadata descriptors, including the management and invocation protocols of the services they represent. This document describes the main functionality of the framework and how it can be used to facilitate the deployment of new software under a unified structure of bioinformatics Web Services. A notable feature of MAPI is the modular organization of the functionality into different modules associated with specific tasks. This means that only the modules needed by the client have to be installed, and that the module functionality can be extended without the need to re-write the software client. The potential utility and versatility of the software library have been demonstrated by the implementation of several currently available clients that cover different aspects of integrated data processing, ranging from service discovery to service invocation, with advanced features such as workflow composition and asynchronous service calls to multiple types of Web Services, including those registered in repositories (e.g. GRID-based, SOAP, BioMOBY, R-bioconductor, and others).

  4. A web GIS based integrated flood assessment modeling tool for coastal urban watersheds

    NASA Astrophysics Data System (ADS)

    Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.

    2014-03-01

    Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area and to run the model within this framework. The integrated flood model consists of a mass-balance-based 1-D overland flow model, a 1-D finite-element channel flow model based on the diffusion wave approximation, and a quasi-2-D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and jQuery technologies. Its user interface is developed using OpenLayers, and the attribute data are stored in the MySQL open source DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment in facilitating data access and visualization of GIS datasets and simulation results.
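    As a rough illustration of the quasi-2-D raster inundation idea (the continuity equation routing water between neighbouring cells), here is a minimal sketch; the grid values and the spill rule are invented for illustration and are far simpler than the actual IFAM formulation:

    ```python
    # Toy quasi-2-D raster inundation step (illustrative only, not IFAM):
    # water moves to lower neighbours in proportion to the water-surface
    # difference, conserving total volume (the continuity constraint).

    def inundation_step(ground, depth, dt=1.0, k=0.5):
        """One explicit step: each wet cell spills to its 4 neighbours."""
        rows, cols = len(ground), len(ground[0])
        new = [row[:] for row in depth]
        for i in range(rows):
            for j in range(cols):
                if depth[i][j] <= 0:
                    continue
                h = ground[i][j] + depth[i][j]          # water-surface level
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < rows and 0 <= nj < cols:
                        hn = ground[ni][nj] + depth[ni][nj]
                        if h > hn:
                            # transfer limited by the water still available
                            q = min(k * dt * (h - hn), new[i][j])
                            new[i][j] -= q
                            new[ni][nj] += q
        return new

    # Ground sloping down to the right; water ponded in the left column.
    ground = [[2.0, 1.0, 0.5],
              [2.0, 1.0, 0.5],
              [2.0, 1.0, 0.5]]
    depth = [[1.0, 0.0, 0.0],
             [1.0, 0.0, 0.0],
             [1.0, 0.0, 0.0]]
    depth = inundation_step(ground, depth)
    ```

    Because every transfer subtracts and adds the same volume `q`, total water volume is conserved across the step, which is the essence of a continuity-equation scheme.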

  5. EAGLE: 'EAGLE' Is an Algorithmic Graph Library for Exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-01-16

    The Resource Description Framework (RDF) and SPARQL Protocol and RDF Query Language (SPARQL) were introduced about a decade ago to enable flexible schema-free data interchange on the Semantic Web. Today data scientists use the framework as a scalable graph representation for integrating, querying, exploring and analyzing data sets hosted at different sources. With increasing adoption, the need for graph mining capabilities for the Semantic Web has emerged, yet there are no tools to conduct "graph mining" on RDF-standard data sets. We address that need through an implementation of popular iterative graph mining algorithms (triangle count, connected component analysis, degree distribution, diversity degree, PageRank, etc.). We implement these algorithms as SPARQL queries wrapped within Python scripts, and we call the resulting software tool EAGLE. In RDF style, EAGLE stands for "EAGLE 'Is an' Algorithmic Graph Library for Exploration." EAGLE is like MATLAB for Linked Data.
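    A hedged sketch of the kind of computation EAGLE expresses as a SPARQL query -- here, the degree distribution -- mimicked over an in-memory list of RDF-style triples rather than a real SPARQL endpoint:

    ```python
    # Illustrative only: the degree-distribution computation that EAGLE
    # would express as a SPARQL aggregation, e.g. roughly:
    #   SELECT ?s (COUNT(?o) AS ?deg) WHERE { ?s ?p ?o } GROUP BY ?s
    # Here the "graph" is just a Python list of (subject, predicate, object).

    from collections import Counter

    triples = [
        ("a", "knows", "b"),
        ("a", "knows", "c"),
        ("b", "knows", "c"),
        ("c", "knows", "a"),
    ]

    # Out-degree per subject, i.e. the GROUP BY ?s / COUNT step.
    out_degree = Counter(s for s, p, o in triples)

    # Degree distribution: how many nodes have each degree value.
    distribution = Counter(out_degree.values())
    ```

    Wrapping such queries in scripts, as EAGLE does, lets iterative algorithms (e.g. PageRank) re-issue the query per iteration against the endpoint.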

  6. SLIDE - a web-based tool for interactive visualization of large-scale -omics data.

    PubMed

    Ghosh, Soumita; Datta, Abhik; Tan, Kaisen; Choi, Hyungwon

    2018-06-28

    Data visualization is often regarded as a post hoc step for verifying statistically significant results in the analysis of high-throughput data sets. This common practice leaves a large amount of raw data behind, from which more information can be extracted. However, existing solutions do not provide capabilities to explore large-scale raw datasets using biologically sensible queries, nor do they allow real-time customization of graphics through user interaction. To address these drawbacks, we have designed an open-source, web-based tool called Systems-Level Interactive Data Exploration, or SLIDE, to visualize large-scale -omics data interactively. SLIDE's interface makes it easier for scientists to explore quantitative expression data at multiple resolutions on a single screen. SLIDE is publicly available under the BSD license, both as an online version and as a stand-alone version, at https://github.com/soumitag/SLIDE. Supplementary information is available at Bioinformatics online.

  7. Protein Data Bank Japan (PDBj): updated user interfaces, resource description framework, analysis tools for large structures

    PubMed Central

    Kinjo, Akira R.; Bekker, Gert-Jan; Suzuki, Hirofumi; Tsuchiya, Yuko; Kawabata, Takeshi; Ikegawa, Yasuyo; Nakamura, Haruki

    2017-01-01

    The Protein Data Bank Japan (PDBj, http://pdbj.org), a member of the worldwide Protein Data Bank (wwPDB), accepts and processes the deposited data of experimentally determined macromolecular structures. While maintaining the archive in collaboration with other wwPDB partners, PDBj also provides a wide range of services and tools for analyzing structures and functions of proteins. We herein outline the updated web user interfaces together with the RESTful web services and the backend relational database that supports them. To enhance the interoperability of the PDB data, we previously developed PDB/RDF, PDB data in the Resource Description Framework (RDF) format, which is now a wwPDB standard called wwPDB/RDF. We have enhanced the connectivity of the wwPDB/RDF data by incorporating various external data resources. Services for searching, comparing and analyzing the ever-increasing number of large structures determined by hybrid methods are also described. PMID:27789697

  8. Cyberinfrastructure at IRIS: Challenges and Solutions Providing Integrated Data Access to EarthScope and Other Earth Science Data

    NASA Astrophysics Data System (ADS)

    Ahern, T. K.; Barga, R.; Casey, R.; Kamb, L.; Parastatidis, S.; Stromme, S.; Weertman, B. T.

    2008-12-01

    While mature methods of accessing seismic data from the IRIS DMC have existed for decades, the demands of improved interdisciplinary data integration call for new approaches. Talented software teams at the IRIS DMC, UNAVCO and the ICDP in Germany have been developing web services for all EarthScope data, including data from USArray, PBO and SAFOD. These web services are based upon SOAP and WSDL. The EarthScope Data Portal was the first external system to access data holdings from the IRIS DMC using web services. EarthScope will also draw more heavily upon products to aid in cross-disciplinary data reuse. A product management system called SPADE allows archiving of and access to heterogeneous data products, presented as XML documents, at the IRIS DMC. Searchable metadata are extracted from the XML and enable powerful searches for products from EarthScope and other data sources. IRIS is teaming with the External Research Group at Microsoft Research to leverage a powerful scientific workflow engine (Trident) and interact with the web services developed at centers such as IRIS, enabling access to data services as well as computational services. We believe that this approach will allow web-based control of workflows and the invocation of computational services that transform data. This capability will greatly improve access to data across scientific disciplines. This presentation will review some of the traditional access tools as well as many of the newer approaches that use web services and scientific workflows to improve interdisciplinary data access.
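    To make the SOAP/WSDL approach concrete, the sketch below only constructs a SOAP request envelope with the Python standard library; the operation name and the service namespace are placeholders, not taken from IRIS's actual WSDL:

    ```python
    # Build a SOAP 1.1 request envelope with the standard library.
    # The operation (getWaveform) and its namespace are hypothetical
    # placeholders; a real client would take them from the service's WSDL.

    import xml.etree.ElementTree as ET

    SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
    SVC_NS = "http://example.org/waveform"       # placeholder namespace

    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    request = ET.SubElement(body, f"{{{SVC_NS}}}getWaveform")
    ET.SubElement(request, f"{{{SVC_NS}}}network").text = "IU"
    ET.SubElement(request, f"{{{SVC_NS}}}station").text = "ANMO"

    xml_bytes = ET.tostring(envelope)            # body of the HTTP POST
    ```

    The resulting bytes would be POSTed to the service endpoint with a `SOAPAction` header; the response is another envelope, parsed the same way.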

  9. Web-based assessments of physical activity in youth: considerations for design and scale calibration.

    PubMed

    Saint-Maurice, Pedro F; Welk, Gregory J

    2014-12-01

    This paper describes the design and methods involved in calibrating a Web-based self-report instrument to estimate physical activity behavior. The limitations of self-report measures are well known, but calibration methods enable the reported information to be equated to estimates obtained from objective data. This paper summarizes design considerations for effective development and calibration of physical activity self-report measures. Each of the design considerations is put into context and followed by a practical application based on our ongoing calibration research with a promising online self-report tool called the Youth Activity Profile (YAP). We first describe the overall concept of calibration and how this influences the selection of appropriate self-report tools for this population. We point out the advantages and disadvantages of different monitoring devices since the choice of the criterion measure and the strategies used to minimize error in the measure can dramatically improve the quality of the data. We summarize strategies to ensure quality control in data collection and discuss analytical considerations involved in group- vs individual-level inference. For cross-validation procedures, we describe the advantages of equivalence testing procedures that directly test and quantify agreement. Lastly, we introduce the unique challenges encountered when transitioning from paper to a Web-based tool. The Web offers considerable potential for broad adoption but an iterative calibration approach focused on continued refinement is needed to ensure that estimates are generalizable across individuals, regions, seasons and countries.
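    The equivalence-testing idea mentioned for cross-validation can be sketched as a simple two one-sided tests (TOST) check under a normal approximation; the paired differences and the equivalence margin below are invented for illustration, and a real calibration study would use proper t-based procedures:

    ```python
    # Toy TOST-style equivalence check (normal approximation, invented data):
    # self-report and monitor estimates are declared "equivalent" if the
    # confidence interval for their mean difference lies inside +/- delta.

    from statistics import NormalDist, mean, stdev
    import math

    def tost_equivalent(diffs, delta, alpha=0.05):
        """True if the 90% CI (for alpha=0.05 TOST) for the mean
        difference falls entirely within (-delta, +delta)."""
        n = len(diffs)
        m = mean(diffs)
        se = stdev(diffs) / math.sqrt(n)
        z = NormalDist().inv_cdf(1 - alpha)      # one-sided critical value
        lo, hi = m - z * se, m + z * se
        return -delta < lo and hi < delta

    # Invented paired differences (self-report minus monitor, minutes/day).
    diffs = [3, -2, 1, 4, -1, 2, 0, -3, 2, 1]
    equivalent = tost_equivalent(diffs, delta=10.0)
    ```

    Unlike a conventional significance test, this directly quantifies agreement: shrinking `delta` makes the equivalence claim stricter.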

  10. The Future of the Internet in Science

    NASA Technical Reports Server (NTRS)

    Guice, Jon; Duffy, Robert

    2000-01-01

    How are scientists going to make use of the Internet several years from now? This is a case study of a leading-edge experiment in building a 'virtual institute' -- using electronic communication tools to foster collaboration among geographically dispersed scientists. Our experience suggests that scientists will want to use web-based document management systems; that there will be a demand for Internet-enabled meeting support tools; that while Internet videoconferencing will have limited value for scientists, webcams will be in great demand as a tool for transmitting pictures of objects and settings rather than "talking heads"; and that a significant share of scientists who do fieldwork will embrace mobile voice, data and video communication tools. The setting for these findings is a research consortium called the NASA Astrobiology Institute.

  11. Developing a Web-Based PPGIS as an Environmental Reporting Service

    NASA Astrophysics Data System (ADS)

    Ranjbar Nooshery, N.; Taleai, M.; Kazemi, R.; Ebadi, K.

    2017-09-01

    Today municipalities are searching for new tools to empower locals to change the future of their own areas by increasing their participation at different levels of urban planning. Such tools should involve the community in the planning process through participatory approaches, instead of long traditional top-down planning models, and help municipalities obtain proper insight into the major problems of urban neighborhoods from the residents' point of view. To this end, public participation GIS (PPGIS), which enables citizens to record and follow up their perceptions and spatial knowledge of the city's problems in the form of maps, has been introduced. In this research, a tool entitled CAER (Collecting & Analyzing of Environmental Reports) is developed. In the first step, a software framework based on a Web-GIS tool, called EPGIS (Environmental Participatory GIS), has been designed to support public participation in reporting urban environmental problems and to facilitate data flow between citizens and the municipality. A web-based cartography tool was employed for geo-visualization and dissemination of map-based reports. In the second step of CAER, a subsystem was developed based on SOLAP (Spatial On-Line Analytical Processing), as a data mining tool, to elicit the local knowledge that facilitates bottom-up urban planning practices and to help urban managers find hidden relations among the recorded reports. The system was implemented in a case study area in Boston, Massachusetts, and its usability was evaluated. CAER should be considered a bottom-up planning tool that collects people's problems and views about their neighborhood and transmits them to city officials. It also helps urban planners find solutions for better management from the citizens' viewpoint and gives them the chance to develop plans for the neighborhoods that satisfy the citizens.

  12. Opal web services for biomedical applications.

    PubMed

    Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W

    2010-07-01

    Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has grown since to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the servers are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web forms-based access and programmatic access of all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we have successfully executed 239,814 jobs. The number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications with Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.
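    A hypothetical example of the "simple XML configuration files" used to wrap a scientific application as a web service; the element names here are illustrative of the approach, not necessarily Opal's exact schema:

    ```xml
    <!-- Hypothetical Opal-style service configuration: wraps a command-line
         tool (MEME, mentioned above) as a service without code changes.
         Element names are illustrative, not guaranteed to match Opal. -->
    <appConfig>
      <metadata appName="MEME">
        <usage>meme input.fasta -dna -nmotifs 3</usage>
      </metadata>
      <binaryLocation>/usr/local/bin/meme</binaryLocation>
      <parallel>false</parallel>
    </appConfig>
    ```

    The key design point is that the scientific code itself is untouched: the toolkit reads a descriptor like this and exposes both a web form and a programmatic (SOAP) interface around the binary.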

  13. Reusable science tools for analog exploration missions: xGDS Web Tools, VERVE, and Gigapan Voyage

    NASA Astrophysics Data System (ADS)

    Lee, Susan Y.; Lees, David; Cohen, Tamar; Allan, Mark; Deans, Matthew; Morse, Theodore; Park, Eric; Smith, Trey

    2013-10-01

    The Exploration Ground Data Systems (xGDS) project led by the Intelligent Robotics Group (IRG) at NASA Ames Research Center creates software tools to support multiple NASA-led planetary analog field experiments. The two primary tools that fall under the xGDS umbrella are the xGDS Web Tools (xGDS-WT) and the Visual Environment for Remote Virtual Exploration (VERVE). IRG has also developed a hardware and software system called Gigapan Voyage that is closely integrated with our xGDS tools and is used in multiple field experiments. xGDS-WT, VERVE, and Gigapan Voyage are examples of IRG projects that improve the ratio of science return to development effort by creating generic, reusable tools that leverage existing technologies in both hardware and software. xGDS Web Tools provides software for gathering and organizing mission data for science and engineering operations, including tools for planning traverses, monitoring autonomous or piloted vehicles, visualization, documentation, analysis, and search. VERVE provides high-performance three-dimensional (3D) user interfaces used by scientists, robot operators, and mission planners to visualize robot data in real time. Gigapan Voyage is a gigapixel image capturing and processing tool that improves situational awareness and scientific exploration in human and robotic analog missions. All of these technologies emphasize software reuse and leverage open source and/or commercial off-the-shelf tools to greatly improve the utility and reduce the development and operational cost of future similar technologies. Over the past several years these technologies have been used in many NASA-led robotic field campaigns, including the Desert Research and Technology Studies (DRATS), the Pavilion Lake Research Project (PLRP), the K10 Robotic Follow-Up tests, and most recently the NASA Extreme Environment Mission Operations (NEEMO) field experiments.
A major objective of these joint robot and crew experiments is to improve NASA's understanding of how to most effectively execute and increase science return from exploration missions. This paper focuses on an integrated suite of xGDS software and compatible hardware tools: xGDS Web Tools, VERVE, and Gigapan Voyage; how they are used; and the design decisions that were made to allow them to be easily developed, integrated, tested, and reused by multiple NASA field experiments and robotic platforms.

  14. GI-POP: a combinational annotation and genomic island prediction pipeline for ongoing microbial genome projects.

    PubMed

    Lee, Chi-Ching; Chen, Yi-Ping Phoebe; Yao, Tzu-Jung; Ma, Cheng-Yu; Lo, Wei-Cheng; Lyu, Ping-Chiang; Tang, Chuan Yi

    2013-04-10

    Sequencing of microbial genomes is important because microbes carry genes for antibiotic and pathogenetic activities. However, even with the help of new assembly software, finishing a whole genome is a time-consuming task. In most bacteria, pathogenetic or antibiotic genes are carried in genomic islands. Therefore, a quick genomic island (GI) prediction method is useful for genomes that are still being sequenced. In this work, we built a web server called GI-POP (http://gipop.life.nthu.edu.tw) which integrates a sequence assembly tool, a functional annotation pipeline, and a high-performance GI prediction module based on a support vector machine (SVM) method called genomic island genomic profile scanning (GI-GPS). The draft genomes of ongoing genome projects, in contigs or scaffolds, can be submitted to our web server, which returns functional annotation and highly probable GI predictions. GI-POP is a comprehensive annotation web server designed for ongoing genome project analysis. Researchers can perform annotation and obtain pre-analytic information, including possible GIs, coding/non-coding sequences and functional analysis, from their draft genomes. This pre-analytic system can provide useful information for finishing a genome sequencing project. Copyright © 2012 Elsevier B.V. All rights reserved.
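    As a toy illustration of the genomic-profile-scanning idea behind GI-GPS, the sketch below computes one windowed composition feature (GC content) and flags atypical windows; the simple threshold stands in for the trained SVM, and the sequence is invented:

    ```python
    # Toy genomic-profile scan (illustrative stand-in for GI-GPS): islands
    # often differ in sequence composition from the host genome, so windows
    # whose GC content deviates strongly from the average are candidates.

    def gc_windows(seq, size=8, step=4):
        """GC fraction for each sliding window, as (start, fraction)."""
        out = []
        for start in range(0, len(seq) - size + 1, step):
            win = seq[start:start + size]
            out.append((start, sum(b in "GC" for b in win) / size))
        return out

    genome = "ATATATATGCGCGCGCATATATAT"   # invented mini-genome
    profile = gc_windows(genome)

    # Flag windows deviating from the genome-wide average; a real pipeline
    # would feed many such features per window to the trained SVM instead.
    avg = sum(gc for _, gc in profile) / len(profile)
    candidates = [start for start, gc in profile if abs(gc - avg) > 0.3]
    ```

    Here the GC-rich stretch in the middle (and, by symmetry, the flanks) stand out against the average, which is the signal a composition-based classifier exploits.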

  15. An Educational Innovation project applying new Information and Communications Technologies (ICT): CyberAula 2.0

    NASA Astrophysics Data System (ADS)

    Mendiola, M. A.; Aguado, P. L.; Espejo, R.

    2012-04-01

    The main objective of the CyberAula 2.0 project is to support, record and validate videoconferencing and lecture recording services by integrating the Universidad Politécnica de Madrid (UPM) Moodle platform with both the GlobalPlaza platform and the Isabel videoconference tool. All the software used in the project is open source. GlobalPlaza (Barra et al., 2011) is the web platform used to schedule, perform, stream, record and publish videoconferences automatically; it was developed in the context of the GLOBAL project, a research project supported by the European Commission's Seventh Framework Programme. It is integrated with the videoconferencing tool Isabel (Quemada et al., 2005), a real-time collaboration tool for the Internet that supports advanced collaborative web/videoconferencing with application sharing and TV-like media integration. Both are open source solutions developed at UPM and specifically designed for educational purposes. Each class session is broadcast on the Internet and then recorded, enabling geographically distant students to participate live in on-campus classes, asking questions through a chat tool or through the videoconference itself. Students can review the recorded lectures through Moodle whenever needed.
In this paper we present the project developed at the Escuela Técnica Superior de Ingenieros Agrónomos (ETSIA) for a Secondary Cycle free-elective subject, which is currently expiring with the introduction of new curricula within the framework of the European Higher Education Area. Students participate in this subject with outstanding interest, acquiring transversal competences as they must prepare and present a report in the last week of the semester. The background of the project was the inclusion of the subject Plants of Agro-Alimentary Interest, which has a remarkable, attractive practical component within the subjects offered by the Center and is one of the free-elective subjects most demanded by students, whose active participation can be highlighted, whether in practical workshops or in the individual presentation of reports. In the workshops students must identify, describe, classify and even taste several species of agro-alimentary interest (fruits of temperate or tropical regions, aromatic plants and spices, edible mushrooms, and cereals and pseudocereals), many of them previously unknown to the majority. They are asked to fill in questionnaires in order to consolidate concepts and to evaluate their personal participation in the development of the subject.

  16. Web Based GIS System for Monitoring the Implementation of the County Integrated Development Plan: a Case Study of Nandi County, Kenya.

    NASA Astrophysics Data System (ADS)

    Kurui, P. K.; Mutua, F.

    2016-12-01

    The paper proposes a conceptual framework for monitoring and evaluating the implementation of the projects listed in the County Integrated Development Plan (CIDP) by applying geospatial technologies. The focus is on developing a web-based GIS system with which the public can track and monitor the progress of project implementation in various parts of the county. Although the guidelines developed by the Ministry of Devolution provide for an implementation monitoring and evaluation framework, specifying the projects to be implemented during the plan period along with verifiable indicators, they do not provide mechanisms for public participation during the implementation and monitoring stage. This calls for a system the public can interact with. The web-based GIS system has been developed using free and open source software, with development projects categorized as ongoing, completed or planned. The study shows that a web-based GIS system can serve as an important tool for monitoring the implementation of the CIDP: it is easy to use, and because the Internet is easily accessed by the majority of the public, it is a powerful tool for public participation.

  17. HydroDesktop: An Open Source GIS-Based Platform for Hydrologic Data Discovery, Visualization, and Analysis

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.

    2010-12-01

    A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers are compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and are searchable through a set of predefined web-service queries. Together, these servers and the central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly introduce the CUAHSI HIS system, with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side desktop application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series, and exporting them to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit, which provides it with map-based data interaction and visualization, and a plug-in interface that third-party developers and researchers can use to easily extend the software using Microsoft .NET programming languages.
HydroDesktop plug-ins that are presently available or currently under development within the project and by third party collaborators include functions for data search and discovery, extensive graphing, data editing and export, HydroServer exploration, integration with the OpenMI workflow and modeling system, and an interface for data analysis through the R statistical package.
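    Reading a WaterML-style series on the client side can be sketched as follows; the XML fragment is a simplified, hypothetical stand-in (real WaterML responses carry namespaces and far richer site and method metadata):

    ```python
    # Parse a simplified WaterML-like time-series response. The structure
    # below only echoes the WaterML idea (variable + timestamped values);
    # it is not the actual schema, which is namespaced and far richer.

    import xml.etree.ElementTree as ET

    waterml = """
    <timeSeriesResponse>
      <timeSeries>
        <variable><variableName>Discharge</variableName></variable>
        <values>
          <value dateTime="2010-12-01T00:00:00">1.42</value>
          <value dateTime="2010-12-01T01:00:00">1.38</value>
        </values>
      </timeSeries>
    </timeSeriesResponse>
    """

    root = ET.fromstring(waterml)
    name = root.findtext(".//variableName")
    series = [(v.get("dateTime"), float(v.text))
              for v in root.findall(".//value")]
    ```

    A client like HydroDesktop performs essentially this step after each web-service call, then hands the (time, value) pairs to its plotting and export layers.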

  18. 76 FR 51965 - National Fuel Gas Supply Corporation; Notice of Intent To Prepare an Environmental Assessment for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-19

    ... crossings south of the Buffalo Compressor Station. \\1\\ A ``pig'' is a tool that is inserted into and moves... receiving this notice in the mail and are available at http://www.ferc.gov using the link called ``eLibrary... Commission's Web site at http://www.ferc.gov under the link to Documents and Filings. An eComment is an easy...

  19. Internet and World Wide Web-based tools for neuropathology practice and education.

    PubMed

    Fung, Kar-Ming; Tihan, Tarik

    2009-04-01

    The Internet and the World Wide Web (www) serve as a source of information and a communication network. Together they form a so-called web or network that allows for the transmission and dissemination of information at unprecedented speed, volume and detail. This article presents an overview of the current status of neuropathology content on the www. As well as considering the Internet as a resource for neuropathology practice, education and research, we address the issue of quality assurance when evaluating Internet and www content. Four major categories of websites (archival, broker, news and blog) are discussed, and resources relevant to neuropathology of each type are highlighted. We believe that our report and similar attempts can provide an opportunity to discuss appropriate and effective use of the Internet by the neuropathology community.

  20. Illinois hospital using Web to build database for relationship marketing.

    PubMed

    Rees, T

    2000-01-01

    Silver Cross Hospital and Medical Centers, Joliet, Ill., is promoting its Web site as a tool for gathering health information about patients and prospective patients in order to build a relationship marketing database. The database will enable the hospital to identify health care needs of consumers in Joliet, Will County and many southwestern suburbs of Chicago. The Web site is promoted in a multimedia advertising campaign that invites residents to participate in a Healthy Living Quiz that rewards respondents with free health screenings. The effort is part of a growing planning and marketing strategy in the health care industry called customer relationship management (CRM). Not only does a total CRM plan offer health care organizations the chance to discover the potential for meeting consumers' needs; it also helps find any marketplace gaps that may exist.

  1. The Pathway Tools software.

    PubMed

    Karp, Peter D; Paley, Suzanne; Romero, Pedro

    2002-01-01

    Bioinformatics requires reusable software tools for creating model-organism databases (MODs). The Pathway Tools is a reusable, production-quality software environment for creating a type of MOD called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc (see http://ecocyc.org) integrates our evolving understanding of the genes, proteins, metabolic network, and genetic network of an organism. This paper provides an overview of the four main components of the Pathway Tools: The PathoLogic component supports creation of new PGDBs from the annotated genome of an organism. The Pathway/Genome Navigator provides query, visualization, and Web-publishing services for PGDBs. The Pathway/Genome Editors support interactive updating of PGDBs. The Pathway Tools ontology defines the schema of PGDBs. The Pathway Tools makes use of the Ocelot object database system for data management services for PGDBs. The Pathway Tools has been used to build PGDBs for 13 organisms within SRI and by external users.

  2. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool

    PubMed Central

    Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi

    2016-01-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405
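    The geometric quantity at the heart of PAEA can be illustrated in its simplest form: the principal angle between two one-dimensional subspaces. Real PAEA works with higher-dimensional subspaces (typically via an SVD); this toy sketch, with invented vectors, only shows the angle computation itself:

    ```python
    # Principal angle between two 1-D subspaces (spans of single vectors).
    # This is the simplest case of the quantity PAEA uses for enrichment;
    # the vectors below are invented for illustration.

    import math

    def principal_angle(u, v):
        """Angle in radians between span(u) and span(v)."""
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        # |cos| because v and -v span the same line; min() guards rounding.
        return math.acos(min(1.0, abs(dot) / (nu * nv)))

    expression_axis = [1.0, 0.0, 0.0]    # invented "differential expression" direction
    gene_set_axis = [1.0, 1.0, 0.0]      # invented gene-set direction
    angle = principal_angle(expression_axis, gene_set_axis)   # pi/4
    ```

    A small principal angle means the gene set's subspace nearly contains the differential-expression signal, which is the geometric notion of enrichment the method ranks.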

  3. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    PubMed

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.

  4. The future of almanac services --- an HMNAO perspective

    NASA Astrophysics Data System (ADS)

    Bell, S.; Nelmes, S.; Prema, P.; Whittaker, J.

    2015-08-01

    This talk will explore the means for delivering almanac data currently under consideration by HM Nautical Almanac Office for the near to medium term. While there will be a continuing need for printed almanacs, almanac data must be available in a variety of forms ranging from paper almanacs to traditional web services through to applications for mobile devices and smartphones. The supply of data using applications may call for a different philosophy in supplying ephemeris data, one that differentiates between an application that calls on a web server for its data and one that has built-in ephemerides. These ephemerides need to be of a reasonably high precision while maintaining a modest machine footprint. These services also need to provide a wide range of applications ranging from traditional sunrise/set data through to more specialized services such as celestial navigation. The work necessary to meet these goals involves efficient programming, intuitive user interfaces, compact and efficient ephemerides and a suitable range of tools to meet the user's needs.
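One common way to meet the "high precision, modest footprint" requirement for built-in ephemerides is to ship Chebyshev polynomial coefficients per coordinate and time interval and evaluate them with the Clenshaw recurrence. The abstract does not specify HMNAO's representation, so this is a generic sketch:

```python
def chebyshev_eval(coeffs, t):
    """Evaluate a Chebyshev series at t in [-1, 1] via the Clenshaw
    recurrence -- the usual way a compact ephemeris reconstructs a
    coordinate from stored coefficients."""
    b1 = b2 = 0.0
    for c in reversed(coeffs[1:]):
        b1, b2 = c + 2.0 * t * b1 - b2, b1
    return coeffs[0] + t * b1 - b2

def scaled_time(t, t0, t1):
    """Map a time inside a coefficient window [t0, t1] to [-1, 1]."""
    return 2.0 * (t - t0) / (t1 - t0) - 1.0

# Sanity checks: T1(t) = t and T2(t) = 2t^2 - 1
assert abs(chebyshev_eval([0.0, 1.0], 0.3) - 0.3) < 1e-12
assert abs(chebyshev_eval([0.0, 0.0, 1.0], 0.5) - (-0.5)) < 1e-12
assert scaled_time(5.0, 0.0, 10.0) == 0.0
```

A mobile app with built-in ephemerides would carry only a small table of such coefficients per body and interval, trading a few polynomial evaluations for network independence.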

  5. Tools, Services & Support of NASA Salinity Mission Data Archival Distribution through PO.DAAC

    NASA Astrophysics Data System (ADS)

    Tsontos, V. M.; Vazquez, J.

    2017-12-01

    The Physical Oceanography Distributed Active Archive Center (PO.DAAC) serves as the designated NASA repository and distribution node for all Aquarius/SAC-D and SMAP sea surface salinity (SSS) mission data products in close collaboration with the projects. In addition to these official mission products, which by December 2017 will include the Aquarius V5.0 end-of-mission data, PO.DAAC archives and distributes high-value, principal investigator led satellite SSS products, and also datasets from NASA's "Salinity Processes in the Upper Ocean Regional Study" (SPURS 1 & 2) field campaigns in the N. Atlantic salinity maximum and high rainfall E. Tropical Pacific regions. Here we report on the status of these data holdings at PO.DAAC, and the range of data services and access tools that are provided in support of NASA salinity. These include user support and data discovery services, OPeNDAP and THREDDS web services for subsetting/extraction, and visualization via LAS and SOTO. Emphasis is placed on newer capabilities, including PO.DAAC's consolidated web services (CWS) and the advanced L2 subsetting tool called HiTIDE.
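An OPeNDAP subsetting service of the kind mentioned here is driven by a constraint expression appended to the dataset URL, so only the requested hyperslab is transferred. The dataset path and variable name below are placeholders, not real PO.DAAC identifiers:

```python
def opendap_subset_url(base, variable, slices):
    """Build an OPeNDAP constraint-expression URL asking the server
    to subset a variable before transfer. Each (start, stop) pair
    becomes a [start:stop] index range in the hyperslab request."""
    ranges = "".join("[%d:%d]" % (a, b) for a, b in slices)
    return "%s.ascii?%s%s" % (base, variable, ranges)

# Hypothetical request: time step 0, a small lat/lon window of SSS
url = opendap_subset_url(
    "https://example.org/opendap/sss_dataset", "sss",
    [(0, 0), (100, 120), (200, 240)],
)
assert url.endswith("sss[0:0][100:120][200:240]")
```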

  6. A Web-based cost-effective training tool with possible application to brain injury rehabilitation.

    PubMed

    Wang, Peijun; Kreutzer, Ina Anna; Bjärnemo, Robert; Davies, Roy C

    2004-06-01

    Virtual reality (VR) has provoked enormous interest in the medical community. In particular, VR offers therapists new approaches for improving rehabilitation effects. However, most of these VR assistant tools are not very portable, extensible or economical. Due to the vast amount of 3D data, they are not suitable for Internet transfer. Furthermore, in order to run these VR systems smoothly, special hardware devices are needed. As a result, existing VR assistant tools tend to be available in hospitals but not in patients' homes. To overcome these disadvantages, as a case study, this paper proposes a Web-based Virtual Ticket Machine, called WBVTM, using VRML [VRML Consortium, The Virtual Reality Modeling Language: International Standard ISO/IEC DIS 14772-1, 1997, available at ], Java and EAI (External Authoring Interface) [Silicon Graphics, Inc., The External Authoring Interface (EAI), available at ], to help people with acquired brain injury (ABI) to relearn basic living skills at home at a low cost. As these technologies are open standard and feature usability on the Internet, WBVTM achieves the goals of portability, easy accessibility and cost-effectiveness.

  7. NASA Webworldwind: Multidimensional Virtual Globe for Geo Big Data Visualization

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Hogan, P.; Prestifilippo, G.; Zamboni, G.

    2016-06-01

    In this paper, we present a web application created using the NASA WebWorldWind framework. The application is capable of visualizing n-dimensional data using a Voxel model. In this case study, we handled social media data and Call Detailed Records (CDR) of telecommunication networks. These were retrieved from the "BigData Challenge 2015" of Telecom Italia. We focused on the visualization process for a suitable way to show this geo-data in a 3D environment, incorporating more than three dimensions. This provides an interactive way to browse the data in their real context and understand them quickly. Users are able to handle several varieties of data, import their own datasets using a particular data structure, and then mash them up in the WebWorldWind virtual globe. Thanks to the intuitive user interface of this web app, a broad public can use this tool for diverse purposes without much experience in the field.

  8. Standardized data sharing in a paediatric oncology research network--a proof-of-concept study.

    PubMed

    Hochedlinger, Nina; Nitzlnader, Michael; Falgenhauer, Markus; Welte, Stefan; Hayn, Dieter; Koumakis, Lefteris; Potamias, George; Tsiknakis, Manolis; Saraceno, Davide; Rinaldi, Eugenia; Ladenstein, Ruth; Schreier, Günter

    2015-01-01

    Data that have been collected in the course of clinical trials are potentially valuable for additional scientific research questions in so-called secondary use scenarios. This is of particular importance in rare disease areas like paediatric oncology. If data from several research projects need to be connected, so-called Core Datasets can be used to define which information needs to be extracted from every involved source system. In this work, the utility of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM) as a format for Core Datasets was evaluated, and a web tool was developed which received source ODM XML files and, via Extensible Stylesheet Language Transformation (XSLT), generated standardized Core Dataset ODM XML files. Using this tool, data from different source systems were extracted and pooled for joint analysis in a proof-of-concept study, facilitating both basic syntactic and semantic interoperability.
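The Core Dataset extraction step, performed in the study by an XSLT over ODM XML, can be illustrated with a much-simplified, namespace-free stand-in for the ODM format (real ODM is namespaced and far richer):

```python
import xml.etree.ElementTree as ET

# Simplified, invented stand-in for a CDISC ODM export
SOURCE_ODM = """
<ODM>
  <ItemData ItemOID="IT.AGE" Value="7"/>
  <ItemData ItemOID="IT.DIAGNOSIS" Value="neuroblastoma"/>
  <ItemData ItemOID="IT.LOCAL_NOTE" Value="not shared"/>
</ODM>
"""

# The Core Dataset defines which items every source must provide
CORE_DATASET = {"IT.AGE", "IT.DIAGNOSIS"}

def extract_core(odm_xml, core_oids):
    """Keep only the items named in the Core Dataset, mirroring the
    role of the XSLT that generates standardized Core Dataset ODM."""
    root = ET.fromstring(odm_xml)
    return {item.get("ItemOID"): item.get("Value")
            for item in root.iter("ItemData")
            if item.get("ItemOID") in core_oids}

core = extract_core(SOURCE_ODM, CORE_DATASET)
assert core == {"IT.AGE": "7", "IT.DIAGNOSIS": "neuroblastoma"}
```

Running the same extraction against each source system yields structurally identical records that can be pooled for joint analysis.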

  9. Using Web Ontology Language to Integrate Heterogeneous Databases in the Neurosciences

    PubMed Central

    Lam, Hugo Y.K.; Marenco, Luis; Shepherd, Gordon M.; Miller, Perry L.; Cheung, Kei-Hoi

    2006-01-01

    Integrative neuroscience involves the integration and analysis of diverse types of neuroscience data involving many different experimental techniques. This data will increasingly be distributed across many heterogeneous databases that are web-accessible. Currently, these databases do not expose their schemas (database structures) and their contents to web applications/agents in a standardized, machine-friendly way. This limits database interoperation. To address this problem, we describe a pilot project that illustrates how neuroscience databases can be expressed using the Web Ontology Language, which is a semantically-rich ontological language, as a common data representation language to facilitate complex cross-database queries. In this pilot project, an existing tool called “D2RQ” was used to translate two neuroscience databases (NeuronDB and CoCoDat) into OWL, and the resulting OWL ontologies were then merged. An OWL-based reasoner (Racer) was then used to provide a sophisticated query language (nRQL) to perform integrated queries across the two databases based on the merged ontology. This pilot project is one step toward exploring the use of semantic web technologies in the neurosciences. PMID:17238384
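The merge-then-query pattern described here can be reduced to its essentials by treating each translated database as a set of subject-predicate-object triples. The entities and properties below are invented, and the pattern matcher is only a toy stand-in for a reasoner query language like nRQL:

```python
# Toy triple stores standing in for the D2RQ-translated databases;
# names and properties are illustrative, not the real schemas.
neuron_db = {
    ("MitralCell", "hasNeurotransmitter", "Glutamate"),
    ("MitralCell", "locatedIn", "OlfactoryBulb"),
}
cocodat_db = {
    ("MitralCell", "hasIonChannel", "NaV1.1"),
}

# Ontology merge, greatly simplified to a set union
merged = neuron_db | cocodat_db

def query(store, subject=None, predicate=None, obj=None):
    """Match triples against a pattern; None acts as a wildcard."""
    return [(s, p, o) for (s, p, o) in store
            if subject in (None, s)
            and predicate in (None, p)
            and obj in (None, o)]

# Cross-database question: everything known about MitralCell
facts = query(merged, subject="MitralCell")
assert len(facts) == 3
```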

  10. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    NASA Technical Reports Server (NTRS)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage a program/project's risk management processes. This presentation will briefly cover the standard risk management procedures, then thoroughly cover NASA's risk management tool called ePORT. The electronic Project Online Risk Tool (ePORT) is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program/project's size and budget. It covers the full risk management paradigm, providing standardized evaluation criteria for common management reporting. ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.
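A common core of such risk tools is a 5x5 likelihood-consequence matrix with standardized reporting bands. The abstract does not give ePORT's exact scheme, so the cut-offs below are illustrative:

```python
def risk_score(likelihood, consequence):
    """Score on a standard 5x5 risk matrix (1-25); both inputs are
    integers 1-5. The exact ePORT scheme may differ."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("likelihood and consequence must be 1-5")
    return likelihood * consequence

def risk_level(score):
    """Bucket a score into reporting bands (illustrative cut-offs)."""
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

assert risk_level(risk_score(5, 4)) == "high"
assert risk_level(risk_score(2, 2)) == "low"
```

Standardized scoring like this is what makes risks comparable across programs of different size and budget.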

  11. Depression and work performance: an ecological study using web-based screening.

    PubMed

    Harvey, S B; Glozier, N; Henderson, M; Allaway, S; Litchfield, P; Holland-Elliott, K; Hotopf, M

    2011-05-01

    Depression is reported to be a major cause of illness-related sub-optimal work performance (presenteeism). However, the majority of studies examining presenteeism have relied on self-report measures of work performance. Furthermore, employers currently face a number of practical challenges in attempting to facilitate early identification of depression. To test whether a web-based screening tool for depression could be used successfully in the workplace and whether it was possible to detect an association between rates of depression and objective measures of impaired workgroup performance. All permanent employees of a telecommunications company with UK-based call centres were encouraged to complete a web-based psychological assessment using the Patient Health Questionnaire depression scale (PHQ-9). In addition to confidential individual level results, the tool was able to provide anonymized summary statistics for each workgroup. Four objective measures of work performance were collected for each workgroup. During the study period, 1161 web-based PHQ-9 questionnaires were completed. There was a negative linear relationship between rates of depressive symptoms and the overall performance of a workgroup (P < 0.001). The linear relationship between depression and workgroup performance remained after controlling for gender balance, percent of temporary staff, employees' perceived level of engagement and satisfaction with their line manager (P < 0.01). Workgroups with high levels of depressive symptoms tend to perform poorly. Computer-aided web-based screening for symptoms of depression is feasible in a work setting.
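The screening pipeline combines individual PHQ-9 totals (nine items, each scored 0-3) with anonymized workgroup-level summaries. A minimal sketch, assuming the widely used cut-off of 10 for moderate symptoms; the scores below are invented:

```python
def phq9_total(item_scores):
    """Total PHQ-9 score: nine items, each 0-3, summed to 0-27."""
    assert len(item_scores) == 9 and all(0 <= s <= 3 for s in item_scores)
    return sum(item_scores)

def workgroup_summary(totals, cutoff=10):
    """Anonymized group-level statistic of the kind the tool reported:
    the percentage of respondents at or above the cut-off."""
    flagged = sum(1 for t in totals if t >= cutoff)
    return 100.0 * flagged / len(totals)

group = [phq9_total(s) for s in [
    [0] * 9,
    [1] * 9,
    [2] * 9,
    [1, 2, 1, 2, 1, 2, 1, 2, 1],
]]
# Totals are 0, 9, 18 and 13: two of four at or above the cut-off
assert workgroup_summary(group) == 50.0
```

Reporting only the group percentage keeps individual results confidential while still supporting the workgroup-level performance comparison.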

  12. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast.

    PubMed

    Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng

    2015-01-01

    Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of the lack of sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put a lot of effort to construct it first. To save researchers time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses.
Allowing users to select eight existing performance indices and 15 existing algorithms for comparison, our web tool benefits researchers who are eager to comprehensively and objectively evaluate the performance of their newly developed algorithm. Thus, our tool greatly expedites the progress in the research of computational identification of cooperative TF pairs.
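An overall performance score of the kind described can be built, for example, by averaging each algorithm's rank across the selected indices. This is an illustrative scheme with invented numbers, not necessarily either of the two scores PCTFPeval implements:

```python
def average_rank_score(index_values):
    """Combine several performance indices into one overall score by
    averaging each algorithm's rank across indices (rank 1 = best,
    so a lower mean rank means better overall performance)."""
    algorithms = list(next(iter(index_values.values())).keys())
    ranks = {a: [] for a in algorithms}
    for per_algo in index_values.values():
        ordered = sorted(algorithms, key=lambda a: per_algo[a], reverse=True)
        for rank, a in enumerate(ordered, start=1):
            ranks[a].append(rank)
    return {a: sum(r) / len(r) for a, r in ranks.items()}

# Two hypothetical indices (higher is better) for three algorithms
indices = {
    "precision_like": {"algoA": 0.9, "algoB": 0.7, "algoC": 0.5},
    "recall_like":    {"algoA": 0.6, "algoB": 0.8, "algoC": 0.4},
}
scores = average_rank_score(indices)
assert scores["algoA"] < scores["algoC"]   # A outranks C overall
```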

  13. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast

    PubMed Central

    2015-01-01

    Background Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of the lack of sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put a lot of effort to construct it first. To save researchers time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. Results The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses.
Conclusions Allowing users to select eight existing performance indices and 15 existing algorithms for comparison, our web tool benefits researchers who are eager to comprehensively and objectively evaluate the performance of their newly developed algorithm. Thus, our tool greatly expedites the progress in the research of computational identification of cooperative TF pairs. PMID:26677932

  14. The Effect of Animated Banner Advertisements on a Visual Search Task

    DTIC Science & Technology

    2001-01-01

    experimental result calls into question previous advertising tips suggested by WebWeek, cited in [17]. In 1996, the online magazine recommended that site...prone in the presence of animated banners. Keywords Animation, visual search, banner advertisements , flashing INTRODUCTION As processor and Internet...is the best way to represent the selection tool in a toolbar, where each icon must fit in a small area? Photoshop and other popular painting programs

  15. Training Aide: Research and Guidance for Effective Training User Guide

    DTIC Science & Technology

    2013-12-01

    Research Product 2014-02 Training Aide: Research and Guidance for Effective Training User Guide Beth Plott Shaun...Effective Training User Guide 5a. CONTRACT OR GRANT NUMBER W91WAW-07-C-0081 5b. PROGRAM ELEMENT NUMBER 611102 6. AUTHOR(S) Beth Plott...Representative and Subject Matter POC: Karin A. Orvis 14. ABSTRACT: This is a user guide for the web-based tool called Training Aide: Research and Guidance

  16. Creating a Pilot Educational Psychiatry Website: Opportunities, Barriers, and Next Steps.

    PubMed

    Torous, John; O'Connor, Ryan; Franzen, Jamie; Snow, Caitlin; Boland, Robert; Kitts, Robert

    2015-11-05

    While medical students and residents may be utilizing websites as online learning resources, medical trainees and educators now have the opportunity to create such educational websites and digital tools on their own. However, the process and theory of building educational websites for medical education have not yet been fully explored. To understand the opportunities, barriers, and process of creating a novel medical educational website. We created a pilot psychiatric educational website to better understand the options, opportunities, challenges, and processes involved in the creation of a psychiatric educational website. We sought to integrate visual and interactive Web design elements to underscore the potential of such Web technology. A pilot website (PsychOnCall) was created to demonstrate the potential of Web technology in medical and psychiatric education. Creating an educational website is now technically easier than ever before, and the primary challenge no longer is technology but rather the creation, validation, and maintenance of information for such websites as well as translating text-based didactics into visual and interactive tools. Medical educators can influence the design and implementation of online educational resources through creating their own websites and engaging medical students and residents in the process.

  17. Creating a Pilot Educational Psychiatry Website: Opportunities, Barriers, and Next Steps

    PubMed Central

    O'Connor, Ryan; Franzen, Jamie; Snow, Caitlin; Boland, Robert; Kitts, Robert

    2015-01-01

    Background While medical students and residents may be utilizing websites as online learning resources, medical trainees and educators now have the opportunity to create such educational websites and digital tools on their own. However, the process and theory of building educational websites for medical education have not yet been fully explored. Objective To understand the opportunities, barriers, and process of creating a novel medical educational website. Methods We created a pilot psychiatric educational website to better understand the options, opportunities, challenges, and processes involved in the creation of a psychiatric educational website. We sought to integrate visual and interactive Web design elements to underscore the potential of such Web technology. Results A pilot website (PsychOnCall) was created to demonstrate the potential of Web technology in medical and psychiatric education. Conclusions Creating an educational website is now technically easier than ever before, and the primary challenge no longer is technology but rather the creation, validation, and maintenance of information for such websites as well as translating text-based didactics into visual and interactive tools. Medical educators can influence the design and implementation of online educational resources through creating their own websites and engaging medical students and residents in the process. PMID:27731837

  18. Use of Web Technology to Access and Update College Plans

    ERIC Educational Resources Information Center

    Valeau, Edward J.; Luan, Jing

    2007-01-01

    In this study, the process and outcome of a web-based planning application, called Ports of Call, are discussed. The application allows college management to create, edit, and report out activities relating to college plans, all through a web browser. Its design was based on best practices in modern web technology and the application can be easily…

  19. Web-based training: a new paradigm in computer-assisted instruction in medicine.

    PubMed

    Haag, M; Maylein, L; Leven, F J; Tönshoff, B; Haux, R

    1999-01-01

    Computer-assisted instruction (CAI) programs based on internet technologies, especially on the world wide web (WWW), provide new opportunities in medical education. The aim of this paper is to examine different aspects of such programs, which we call 'web-based training (WBT) programs', and to differentiate them from conventional CAI programs. First, we will distinguish five different interaction types: presentation; browsing; tutorial dialogue; drill and practice; and simulation. In contrast to conventional CAI, there are four architectural types of WBT programs: client-based; remote data and knowledge; distributed teaching; and server-based. We will discuss the implications of the different architectures for developing WBT software. WBT programs have to meet other requirements than conventional CAI programs. The most important tools and programming languages for developing WBT programs will be listed and assigned to the architecture types. For the future, we expect a trend from conventional CAI towards WBT programs.

  20. ProtDec-LTR2.0: an improved method for protein remote homology detection by combining pseudo protein and supervised Learning to Rank.

    PubMed

    Chen, Junjie; Guo, Mingyue; Li, Shumin; Liu, Bin

    2017-11-01

    As one of the most important tasks in protein sequence analysis, protein remote homology detection is critical for both basic research and practical applications. Here, we present an effective web server for protein remote homology detection called ProtDec-LTR2.0 by combining ProtDec-Learning to Rank (LTR) and pseudo protein representation. Experimental results showed that the detection performance is obviously improved. The web server provides a user-friendly interface to explore the sequence and structure information of candidate proteins and find their conserved domains by launching a multiple sequence alignment tool. The web server is free and open to all users with no login requirement at http://bioinformatics.hitsz.edu.cn/ProtDec-LTR2.0/. bliu@hit.edu.cn. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
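ProtDec-LTR2.0 combines the outputs of several basic homology-detection methods through supervised Learning to Rank. As a rough untrained stand-in for that combination step, here is a simple Borda-count rank fusion over hypothetical hit lists:

```python
def borda_fusion(rankings):
    """Fuse several ranked candidate lists into one consensus ranking
    by Borda count: a candidate at position i of an n-item list earns
    n - i points, and candidates are sorted by total points. This is
    an untrained substitute for the supervised LTR combination."""
    scores = {}
    for ranking in rankings:
        n = len(ranking)
        for position, protein in enumerate(ranking):
            scores[protein] = scores.get(protein, 0) + (n - position)
    # Ties broken alphabetically for a deterministic result
    return sorted(scores, key=lambda p: (-scores[p], p))

# Hypothetical hit lists from three basic homology-detection methods
fused = borda_fusion([
    ["p1", "p2", "p3"],
    ["p2", "p1", "p3"],
    ["p2", "p3", "p1"],
])
assert fused[0] == "p2"   # p2 is ranked highly by most methods
```

A trained LTR model differs in that it learns per-method weights from labeled data rather than treating every input ranking equally.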

  1. Oceans 2.0: a Data Management Infrastructure as a Platform

    NASA Astrophysics Data System (ADS)

    Pirenne, B.; Guillemot, E.

    2012-04-01

    Oceans 2.0: a Data Management Infrastructure as a Platform. Benoît Pirenne, Associate Director, IT, NEPTUNE Canada; Eric Guillemot, Manager, Software Development, NEPTUNE Canada. The Data Management and Archiving System (DMAS) serving the needs of a number of undersea observing networks such as VENUS and NEPTUNE Canada was conceived from the beginning as a Service-Oriented Infrastructure. Its core functional elements (data acquisition, transport, archiving, retrieval and processing) can interact with the outside world using Web Services. Those Web Services can be exploited by a variety of higher level applications. Over the years, DMAS has developed Oceans 2.0: an environment where these techniques are implemented. The environment thereby becomes a platform in that it allows for easy addition of new and advanced features that build upon the tools at the core of the system. The applications that have been developed include: data search and retrieval, including options such as data product generation, data decimation or averaging; dynamic infrastructure description (search of all observatory metadata) and visualization; and data visualization, including dynamic scalar data plots and integrated fast video segment search and viewing. Building upon these basic applications are new concepts, coming from the Web 2.0 world, that DMAS has added: they allow people equipped only with a web browser to collaborate and contribute their findings or work results to the wider community.
Examples include: the addition of metadata tags to any part of the infrastructure or to any data item (annotations); the ability to edit, execute, share and distribute Matlab code on-line from a simple web browser, with specific calls within the code to access data; the ability to interactively and graphically build pipeline processing jobs that can be executed on the cloud; web-based, interactive instrument control tools that allow users to truly share the use of the instruments and communicate with each other; and, last but not least, a public tool in the form of a game that crowd-sources the inventory of the underwater video archive content, thereby adding tremendous amounts of metadata. Beyond those tools, which represent the functionality presently available to users, a number of the Web Services dedicated to data access are being exposed for anyone to use. This allows not only for ad hoc data access by individuals who need non-interactive access, but will also foster the development of new applications in a variety of areas.

  2. Skylign: a tool for creating informative, interactive logos representing sequence alignments and profile hidden Markov models

    PubMed Central

    2014-01-01

    Background Logos are commonly used in molecular biology to provide a compact graphical representation of the conservation pattern of a set of sequences. They render the information contained in sequence alignments or profile hidden Markov models by drawing a stack of letters for each position, where the height of the stack corresponds to the conservation at that position, and the height of each letter within a stack depends on the frequency of that letter at that position. Results We present a new tool and web server, called Skylign, which provides a unified framework for creating logos for both sequence alignments and profile hidden Markov models. In addition to static image files, Skylign creates a novel interactive logo plot for inclusion in web pages. These interactive logos enable scrolling, zooming, and inspection of underlying values. Skylign can avoid sampling bias in sequence alignments by down-weighting redundant sequences and by combining observed counts with informed priors. It also simplifies the representation of gap parameters, and can optionally scale letter heights based on alternate calculations of the conservation of a position. Conclusion Skylign is available as a website, a scriptable web service with a RESTful interface, and as a software package for download. Skylign’s interactive logos are easily incorporated into a web page with just a few lines of HTML markup. Skylign may be found at http://skylign.org. PMID:24410852
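The stack and letter heights Skylign draws follow the classic sequence-logo recipe: a column's stack height is its information content, and each letter gets a share proportional to its frequency. A minimal DNA version of that recipe (Skylign adds sequence weighting, informed priors and gap handling on top of it):

```python
import math
from collections import Counter

def logo_column(residues, alphabet_size=4):
    """Letter heights for one logo column. Stack height is the
    information content (maximum entropy minus observed entropy, in
    bits), and each letter's height is its frequency times the
    stack height."""
    n = len(residues)
    freqs = {r: c / n for r, c in Counter(residues).items()}
    entropy = -sum(f * math.log2(f) for f in freqs.values())
    info = math.log2(alphabet_size) - entropy
    return {r: f * info for r, f in freqs.items()}

# A perfectly conserved DNA column carries the full 2 bits
assert logo_column("AAAA")["A"] == 2.0

# A half-conserved column: entropy 1.5 bits, so 0.5 bits of stack
col = logo_column("AACG")
assert abs(sum(col.values()) - 0.5) < 1e-12
```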

  3. Web-based Tools for Educators: Outreach Activities of the Polar Radar for Ice Sheet Measurements (PRISM) Project

    NASA Astrophysics Data System (ADS)

    Braaten, D. A.; Holvoet, J. F.; Gogineni, S.

    2003-12-01

    The Radar Systems and Remote Sensing Laboratory at the University of Kansas (KU) has implemented extensive outreach activities focusing on Polar Regions as part of the Polar Radar for Ice Sheet Measurements (PRISM) project. The PRISM project is developing advanced intelligent remote sensing technology that involves radar systems, an autonomous rover, and communications systems to measure detailed ice sheet characteristics, and to determine bed conditions (frozen or wet) below active ice sheets in both Greenland and Antarctica. These measurements will provide a better understanding of the response of polar ice sheets to global climate change and the resulting impact the ice sheets will have on sea level rise. Many of the research and technological development aspects of the PRISM project, such as robotics, radar systems, climate change and exploration of harsh environments, can kindle an excitement and interest in students about science and technology. These topics form the core of our K-12 education and training outreach initiatives, which are designed to capture the imagination of young students, and prompt them to consider an educational path that will lead them to scientific or engineering careers. The K-12 PRISM outreach initiatives are being developed and implemented in a collaboration with the Advanced Learning Technology Program (ALTec) of the High Plains Regional Technology in Education Consortium (HPR*TEC). ALTec is associated with the KU School of Education, and is a well-established educational research center that develops and hosts web tools to enable teachers nationwide to network, collaborate, and share resources with other teachers. An example of an innovative and successful web interface developed by ALTec is called TrackStar. Teachers can use TrackStar over the Web to develop interactive, resource-based lessons (called tracks) on-line for their students. 
Once developed, tracks are added to the TrackStar database and can be accessed and modified (if necessary) by teachers everywhere. The PRISM project has added a search engine for polar related tracks, and has developed numerous new tracks on robotics, polar exploration, and climate change under the guidance of a K-12 teacher advisory group. The PRISM project is also developing and hosting several other web-based lesson design tools and resources for K-12 educators and students on the PRISM project web page (http://www.ku-prism.org). These tools and resources include: i) "Polar Scientists and Explorers, Past and Present" covering the travels and/or unknown fate of polar explorers and scientists; ii) "Polar News" providing links to current news articles related to polar regions; iii) "Letter of Global Concern", which is a tool to help students draft a letter to a politician, government official, or business leader; iv) "Graphic Sleuth", which is an online utility that allows teachers to make lessons for student use; v) "Bears on Ice" for students in grades K - 6 that can follow the adventures of two stuffed bears that travel with scientists into polar regions; and vi) "K-12 Polar Resources," which provides teachers with images, information, TrackStar lessons, and a search engine designed to identify polar related lessons. In our presentation, we will describe and show examples of these tools and resources, and provide an assessment of their popularity with teachers nationwide.

  4. ESAP plus: a web-based server for EST-SSR marker development.

    PubMed

    Ponyared, Piyarat; Ponsawat, Jiradej; Tongsima, Sissades; Seresangtakul, Pusadee; Akkasaeng, Chutipong; Tantisuwichwong, Nathpapat

    2016-12-22

    Simple sequence repeats (SSRs) have become widely used as molecular markers in plant genetic studies due to their abundance, high allelic variation at each locus and simplicity to analyze using conventional PCR amplification. To study plants with unknown genome sequence, SSR markers from Expressed Sequence Tags (ESTs), which can be obtained from the plant mRNA (converted to cDNA), must be utilized. With the advent of high-throughput sequencing technology, huge EST sequence data have been generated and are now accessible from many public databases. However, SSR marker identification from a large in-house or public EST collection requires a computational pipeline that makes use of several standard bioinformatic tools to design high quality EST-SSR primers. Some of these computational tools are not user-friendly and must be tightly integrated with reference genomic databases. A web-based bioinformatic pipeline, called EST Analysis Pipeline Plus (ESAP Plus), was constructed for assisting researchers to develop SSR markers from a large EST collection. ESAP Plus incorporates several bioinformatic scripts and some useful standard software tools necessary for the four main procedures of EST-SSR marker development, namely 1) pre-processing, 2) clustering and assembly, 3) SSR mining and 4) SSR primer design. The proposed pipeline also provides two alternative steps for reducing EST redundancy and identifying SSR loci. Using public sugarcane ESTs, ESAP Plus automatically executed the aforementioned computational pipeline via a simple web user interface, which was implemented using standard PHP, HTML, CSS and JavaScript. With ESAP Plus, users can upload raw EST data and choose various filtering options and parameters to analyze each of the four main procedures through this web interface. All input EST data and their predicted SSR results will be stored in the ESAP Plus MySQL database.
Users are notified via e-mail when the automatic process is complete and can download all results through the web interface. ESAP Plus is a comprehensive and convenient web-based bioinformatic tool for SSR marker development. It offers all necessary EST-SSR development processes, with various adjustable options that users can easily apply to identify SSR markers from a large EST collection. With a familiar web interface, users can upload raw ESTs using the data submission page and visualize/download the corresponding EST-SSR information from within ESAP Plus. ESAP Plus can handle very large EST datasets. This EST-SSR discovery tool can be accessed directly from: http://gbp.kku.ac.th/esap_plus/ .

  5. vMon-mobile provides wireless connection to the electronic patient record

    NASA Astrophysics Data System (ADS)

    Oliveira, Pedro P., Jr.; Rebelo, Marina; Pilon, Paulo E.; Gutierrez, Marco A.; Tachinardi, Umberto

    2002-05-01

    This work presents the development of a set of tools to help doctors continuously monitor critical patients. Real-time monitoring signals are displayed via a Web-based Electronic Patient Record (Web-EPR) developed at the Heart Institute. Any computer on the Hospital's Intranet can access the Web-EPR, which opens a browser plug-in called vMon. Recently, vMon was adapted to wireless mobile devices, providing the same real-time visualization of vital signals as its desktop counterpart. The monitoring network communicates with the hospital network through a gateway using HL7 messages and can export waveforms in real time using the multicast protocol through an API library. A dedicated ActiveX component was built that establishes the streaming of the biomedical signals under monitoring and displays them in an Internet Explorer 5.x browser. The mobile version, called vMon-mobile, parses the browser window and delivers it to a PDA device connected to a local area network. The result is a virtual monitor presenting real-time data on a mobile device. All parameters and signals acquired from the moment the patient is connected to the monitors are stored for a few days. The most clinically relevant information is added to the patient's EPR.

  6. STITCHER 2.0: primer design for overlapping PCR applications

    PubMed Central

    O’Halloran, Damien M.; Uriagereka-Herburger, Isabel; Bode, Katrin

    2017-01-01

    Overlapping polymerase chain reaction (PCR) is a common technique used by researchers in very diverse fields that enables the user to ‘stitch’ individual pieces of DNA together. Previously, we have reported a web based tool called STITCHER that provides a platform for researchers to automate the design of primers for overlapping PCR applications. Here we present STITCHER 2.0, which represents a substantial update to STITCHER. STITCHER 2.0 is a newly designed web tool that automates the design of primers for overlapping PCR. Unlike STITCHER, STITCHER 2.0 considers diverse algorithmic parameters, and returns multiple result files that include a facility for the user to draw their own primers as well as comprehensive visual guides to the user’s input, output, and designed primers. These result files provide greater control and insight during experimental design and troubleshooting. STITCHER 2.0 is freely available to all users without signup or login requirements and can be accessed at the following webpage: www.ohalloranlab.net/STITCHER2.html. PMID:28358011

  7. STITCHER 2.0: primer design for overlapping PCR applications.

    PubMed

    O'Halloran, Damien M; Uriagereka-Herburger, Isabel; Bode, Katrin

    2017-03-30

    Overlapping polymerase chain reaction (PCR) is a common technique used by researchers in very diverse fields that enables the user to 'stitch' individual pieces of DNA together. Previously, we have reported a web based tool called STITCHER that provides a platform for researchers to automate the design of primers for overlapping PCR applications. Here we present STITCHER 2.0, which represents a substantial update to STITCHER. STITCHER 2.0 is a newly designed web tool that automates the design of primers for overlapping PCR. Unlike STITCHER, STITCHER 2.0 considers diverse algorithmic parameters, and returns multiple result files that include a facility for the user to draw their own primers as well as comprehensive visual guides to the user's input, output, and designed primers. These result files provide greater control and insight during experimental design and troubleshooting. STITCHER 2.0 is freely available to all users without signup or login requirements and can be accessed at the following webpage: www.ohalloranlab.net/STITCHER2.html.
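
    The core idea behind stitching primers can be illustrated as follows: the forward primer for the downstream fragment carries a 5' tail copied from the 3' end of the upstream fragment, so the two PCR products share sequence at the junction. This is a hedged sketch, not STITCHER's actual algorithm, and the length defaults are invented:

```python
def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def junction_primers(upstream, downstream, anneal_len=18, tail_len=15):
    """Design a primer pair for one overlap-PCR junction.

    The downstream forward primer gets a 5' tail from the upstream 3'
    end, so the two products overlap and can be fused. Lengths are
    illustrative defaults, not STITCHER's parameters.
    """
    fwd = upstream[-tail_len:] + downstream[:anneal_len]
    # Reverse primer for the upstream fragment, tailed with the
    # start of the downstream fragment.
    rev = revcomp(upstream[-anneal_len:] + downstream[:tail_len])
    return fwd, rev
```

    STITCHER 2.0 additionally evaluates diverse algorithmic parameters (not modeled here) before reporting primers.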

  8. CRISPRFinder: a web tool to identify clustered regularly interspaced short palindromic repeats.

    PubMed

    Grissa, Ibtissem; Vergnaud, Gilles; Pourcel, Christine

    2007-07-01

    Clustered regularly interspaced short palindromic repeats (CRISPRs) constitute a particular family of tandem repeats found in a wide range of prokaryotic genomes (half of eubacteria and almost all archaea). They consist of a succession of highly conserved regions (DR) varying in size from 23 to 47 bp, separated by similarly sized unique sequences (spacer) of usually viral origin. A CRISPR cluster is flanked on one side by an AT-rich sequence called the leader and assumed to be a transcriptional promoter. Recent studies suggest that this structure represents a putative RNA-interference-based immune system. Here we describe CRISPRFinder, a web service offering tools to (i) detect CRISPRs including the shortest ones (one or two motifs); (ii) define DRs and extract spacers; (iii) get the flanking sequences to determine the leader; (iv) BLAST spacers against the GenBank database and (v) check if the DR is found elsewhere in prokaryotic sequenced genomes. CRISPRFinder is freely accessible at http://crispr.u-psud.fr/Server/CRISPRfinder.php.
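
    The DR-spacer array structure described above can be illustrated with a naive exact-match scanner. CRISPRFinder itself tolerates degenerate repeats and detects the short one- or two-motif arrays, which this sketch does not; all size parameters below are toy values:

```python
def find_crispr_like(seq, dr_len=8, min_copies=3, min_spacer=5, max_spacer=20):
    """Scan for a direct repeat (DR): an exact k-mer occurring at least
    min_copies times, separated by spacer-sized gaps of unique sequence.
    Exact-match toy only; real tools allow mismatches in the DR.
    """
    for i in range(len(seq) - dr_len + 1):
        dr = seq[i:i + dr_len]
        positions = [i]
        j = i + dr_len
        while True:
            k = seq.find(dr, j)
            if k == -1:
                break
            gap = k - (positions[-1] + dr_len)
            if min_spacer <= gap <= max_spacer:
                positions.append(k)
                j = k + dr_len
            else:
                j = k + 1
        if len(positions) >= min_copies:
            spacers = [seq[positions[n] + dr_len:positions[n + 1]]
                       for n in range(len(positions) - 1)]
            return dr, positions, spacers
    return None

demo = "GTTTTAGA" + "ACGTACGTAC" + "GTTTTAGA" + "CCGGAATTCC" + "GTTTTAGA"
print(find_crispr_like(demo))
```

    The extracted spacers are what step (iv) would then BLAST against GenBank to look for a viral origin.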

  9. Extracting scientific articles from a large digital archive: BioStor and the Biodiversity Heritage Library.

    PubMed

    Page, Roderic D M

    2011-05-23

    The Biodiversity Heritage Library (BHL) is a large digital archive of legacy biological literature, comprising over 31 million pages scanned from books, monographs, and journals. During the digitisation process basic metadata about the scanned items is recorded, but not article-level metadata. Given that the article is the standard unit of citation, this makes it difficult to locate cited literature in BHL. Adding the ability to easily find articles in BHL would greatly enhance the value of the archive. A service was developed to locate articles in BHL based on matching article metadata to BHL metadata using approximate string matching, regular expressions, and string alignment. This article locating service is exposed as a standard OpenURL resolver on the BioStor web site http://biostor.org/openurl/. This resolver can be used on the web, or called by bibliographic tools that support OpenURL. BioStor provides tools for extracting, annotating, and visualising articles from the Biodiversity Heritage Library. BioStor is available from http://biostor.org/.
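
    The approximate-string-matching step can be sketched with Python's difflib as a stand-in for the combination of methods BioStor actually uses (approximate matching, regular expressions, and string alignment); the threshold is an invented parameter:

```python
from difflib import SequenceMatcher

def best_title_match(query, candidates, threshold=0.8):
    """Return the candidate most similar to the query title, or None.

    Tolerates OCR noise in scanned metadata. This sketch scores with
    difflib's ratio only; BioStor also applies regular expressions
    and string alignment.
    """
    def norm(s):
        return " ".join(s.lower().split())  # case/whitespace-insensitive
    scored = [(SequenceMatcher(None, norm(query), norm(c)).ratio(), c)
              for c in candidates]
    score, best = max(scored)
    return (best, score) if score >= threshold else (None, score)
```

    For example, a cited title still matches a page heading in which OCR has misread a character, because the similarity ratio stays high.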

  10. tRF2Cancer: A web server to detect tRNA-derived small RNA fragments (tRFs) and their expression in multiple cancers.

    PubMed

    Zheng, Ling-Ling; Xu, Wei-Lin; Liu, Shun; Sun, Wen-Ju; Li, Jun-Hao; Wu, Jie; Yang, Jian-Hua; Qu, Liang-Hu

    2016-07-08

    tRNA-derived small RNA fragments (tRFs) are one class of small non-coding RNAs derived from transfer RNAs (tRNAs). tRFs play important roles in cellular processes and are involved in multiple cancers. High-throughput small RNA (sRNA) sequencing experiments can detect all the cellular expressed sRNAs, including tRFs. However, distinguishing genuine tRFs from RNA fragments generated by random degradation remains a major challenge. In this study, we developed an integrated web-based computing system, tRF2Cancer, to accurately identify tRFs from sRNA deep-sequencing data and evaluate their expression in multiple cancers. The binomial test was introduced to evaluate whether reads from a small RNA-seq data set represent tRFs or degraded fragments. A classification method was then used to annotate the types of tRFs based on their sites of origin in pre-tRNA or mature tRNA. We applied the pipeline to analyze 10 991 data sets from 32 types of cancers and identified thousands of expressed tRFs. A tool called 'tRFinCancer' was developed to facilitate the users to inspect the expression of tRFs across different types of cancers. Another tool called 'tRFBrowser' shows both the sites of origin and the distribution of chemical modification sites in tRFs on their source tRNA. The tRF2Cancer web server is available at http://rna.sysu.edu.cn/tRFfinder/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
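
    The binomial test mentioned above can be sketched as follows, assuming a uniform random-degradation null in which reads fall on each tRNA position with equal probability (the paper's exact null model may differ):

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def looks_like_trf(reads_at_site, total_reads, n_positions, alpha=0.01):
    """Call a tRF if reads concentrate at one site more than a uniform
    degradation model predicts. The uniform null and the cutoff are
    illustrative assumptions, not the paper's exact procedure.
    """
    p0 = 1.0 / n_positions
    return binom_sf(reads_at_site, total_reads, p0) < alpha
```

    Under this null, 30 of 100 reads at one of 20 positions is called a genuine tRF, while 6 of 100 is consistent with random degradation.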

  11. MosquitoMap and the Mal-area calculator: new web tools to relate mosquito species distribution with vector borne disease

    PubMed Central

    2010-01-01

    Background Mosquitoes are important vectors of diseases but, in spite of various mosquito faunistic surveys globally, there is a need for a spatial online database of mosquito collection data and distribution summaries. Such a resource could provide entomologists with the results of previous mosquito surveys, and vector disease control workers, preventative medicine practitioners, and health planners with information relating mosquito distribution to vector-borne disease risk. Results A web application called MosquitoMap was constructed comprising mosquito collection point data stored in an ArcGIS 9.3 Server/SQL geodatabase that includes administrative area and vector species x country lookup tables. In addition to the layer containing mosquito collection points, other map layers were made available including environmental, and vector and pathogen/disease distribution layers. An application within MosquitoMap called the Mal-area calculator (MAC) was constructed to quantify the area of overlap, for any area of interest, of vector, human, and disease distribution models. Data standards for mosquito records were developed for MosquitoMap. Conclusion MosquitoMap is a public domain web resource that maps and compares georeferenced mosquito collection points to other spatial information, in a geographical information system setting. The MAC quantifies the Mal-area, i.e. the area where it is theoretically possible for vector-borne disease transmission to occur, thus providing a useful decision tool where other disease information is limited. The Mal-area approach emphasizes the independent but cumulative contribution to disease risk of the vector species predicted present. MosquitoMap adds value to, and makes accessible, the results of past collecting efforts, as well as providing a template for other arthropod spatial databases. PMID:20167090

  12. MosquitoMap and the Mal-area calculator: new web tools to relate mosquito species distribution with vector borne disease.

    PubMed

    Foley, Desmond H; Wilkerson, Richard C; Birney, Ian; Harrison, Stanley; Christensen, Jamie; Rueda, Leopoldo M

    2010-02-18

    Mosquitoes are important vectors of diseases but, in spite of various mosquito faunistic surveys globally, there is a need for a spatial online database of mosquito collection data and distribution summaries. Such a resource could provide entomologists with the results of previous mosquito surveys, and vector disease control workers, preventative medicine practitioners, and health planners with information relating mosquito distribution to vector-borne disease risk. A web application called MosquitoMap was constructed comprising mosquito collection point data stored in an ArcGIS 9.3 Server/SQL geodatabase that includes administrative area and vector species x country lookup tables. In addition to the layer containing mosquito collection points, other map layers were made available including environmental, and vector and pathogen/disease distribution layers. An application within MosquitoMap called the Mal-area calculator (MAC) was constructed to quantify the area of overlap, for any area of interest, of vector, human, and disease distribution models. Data standards for mosquito records were developed for MosquitoMap. MosquitoMap is a public domain web resource that maps and compares georeferenced mosquito collection points to other spatial information, in a geographical information system setting. The MAC quantifies the Mal-area, i.e. the area where it is theoretically possible for vector-borne disease transmission to occur, thus providing a useful decision tool where other disease information is limited. The Mal-area approach emphasizes the independent but cumulative contribution to disease risk of the vector species predicted present. MosquitoMap adds value to, and makes accessible, the results of past collecting efforts, as well as providing a template for other arthropod spatial databases.
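
    The MAC's overlap computation can be illustrated on aligned boolean rasters; the real calculator works on GIS distribution layers in an ArcGIS server, not Python lists:

```python
def mal_area(grids, cell_area_km2=1.0):
    """Area where all distribution layers overlap (e.g. vector AND host
    AND pathogen) on aligned boolean rasters. A toy stand-in for the
    Mal-area calculator; cell size is an illustrative parameter.
    """
    rows, cols = len(grids[0]), len(grids[0][0])
    cells = sum(
        1
        for r in range(rows)
        for c in range(cols)
        if all(g[r][c] for g in grids)
    )
    return cells * cell_area_km2

# Hypothetical 2x3 presence/absence layers for one area of interest.
vector = [[1, 1, 0], [0, 1, 1]]
human  = [[1, 0, 0], [1, 1, 1]]
dengue = [[1, 1, 1], [0, 1, 0]]
print(mal_area([vector, human, dengue]))  # 2.0
```

    Because each vector species contributes its own layer, adding further predicted-present species only grows the cumulative Mal-area, matching the independent-but-cumulative framing above.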

  13. m6ASNP: a tool for annotating genetic variants by m6A function.

    PubMed

    Jiang, Shuai; Xie, Yubin; He, Zhihao; Zhang, Ya; Zhao, Yuli; Chen, Li; Zheng, Yueyuan; Miao, Yanyan; Zuo, Zhixiang; Ren, Jian

    2018-05-01

    Large-scale genome sequencing projects have identified many genetic variants for diverse diseases. A major goal of these projects is to characterize these genetic variants to provide insight into their function and roles in diseases. N6-methyladenosine (m6A) is one of the most abundant RNA modifications in eukaryotes. Recent studies have revealed that aberrant m6A modifications are involved in many diseases. In this study, we present a user-friendly web server called "m6ASNP" that is dedicated to the identification of genetic variants that target m6A modification sites. A random forest model was implemented in m6ASNP to predict whether the methylation status of an m6A site is altered by the variants that surround the site. In m6ASNP, genetic variants in a standard variant call format (VCF) are accepted as the input data, and the output includes an interactive table that contains the genetic variants annotated by m6A function. In addition, statistical diagrams and a genome browser are provided to visualize the characteristics and to annotate the genetic variants. We believe that m6ASNP is a very convenient tool that can be used to boost further functional studies investigating genetic variants. The web server "m6ASNP" is implemented in JAVA and PHP and is freely available at [60].
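
    The input step, selecting VCF variants that fall near annotated m6A sites, can be sketched as below; the window size is an invented parameter, and the trained random forest that predicts methylation-status change is not reproduced here:

```python
def variants_near_sites(vcf_lines, sites, window=10):
    """Select VCF records within `window` bp of an annotated m6A site.

    `sites` maps chromosome -> list of m6A positions (1-based). Only
    mimics m6ASNP's input filtering; the prediction model is separate.
    """
    hits = []
    for line in vcf_lines:
        if line.startswith("#"):
            continue  # skip header lines
        chrom, pos, vid, ref, alt = line.rstrip("\n").split("\t")[:5]
        pos = int(pos)
        if any(abs(pos - s) <= window for s in sites.get(chrom, [])):
            hits.append((chrom, pos, ref, alt))
    return hits
```

    Each retained variant would then be scored for whether it flips the predicted methylation status of the nearby site.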

  14. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. 
The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same through web-browser interfaces. The summer school will serve as a valuable testbed for the tool development, preparing CMDA to serve its target community: Earth-science modeling and model-analysis community.
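
    Analysis capability (3), the correlation between two variables, reduces to a Pearson correlation over the sampled values; a pure-Python sketch (the service itself wraps existing science application codes rather than reimplementing them):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation of two equal-length, non-constant series,
    e.g. two physical variables sampled over a region or time span.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

    In the web-service setting, a Python wrapper of this kind is what exposes such a calculation behind an HTTP endpoint.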

  15. VCS: Tool for Visualizing Copy Number Variation and Single Nucleotide Polymorphism.

    PubMed

    Kim, HyoYoung; Sung, Samsun; Cho, Seoae; Kim, Tae-Hun; Seo, Kangseok; Kim, Heebal

    2014-12-01

    Copy number variation (CNV) and single nucleotide polymorphism (SNP) data are useful genetic resources for understanding complex phenotypes and disease susceptibility. Although thousands of CNVs and SNPs are currently available in public databases, they are somewhat difficult to use for analyses without visualization tools. We developed a web-based tool called VCS (visualization of CNV or SNP) to visualize detected CNVs or SNPs. The VCS tool can help users easily interpret the biological meaning of numerical CNV and SNP values. The VCS provides six visualization tools: i) the enrichment of genome contents in CNV; ii) the physical distribution of CNVs or SNPs on chromosomes; iii) the distribution of log2 ratios of CNVs with criteria of interest; iv) the number of CNVs or SNPs per binning unit; v) the distribution of homozygosity of SNP genotypes; and vi) a cytomap of genes within CNV or SNP regions.
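
    Visualization (iv), counting variants per binning unit, starts from a simple windowed count like the following; the bin size is a user-chosen parameter in VCS:

```python
def counts_per_bin(positions, chrom_len, bin_size):
    """Number of variants (CNVs or SNPs) per fixed-size window along a
    chromosome. Positions are 0-based and less than chrom_len.
    """
    n_bins = (chrom_len + bin_size - 1) // bin_size  # ceiling division
    counts = [0] * n_bins
    for p in positions:
        counts[p // bin_size] += 1
    return counts
```

    The resulting vector is what a tool like VCS would render as a histogram along the chromosome.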

  16. Designing Crop Simulation Web Service with Service Oriented Architecture Principle

    NASA Astrophysics Data System (ADS)

    Chinnachodteeranun, R.; Hung, N. D.; Honda, K.

    2015-12-01

    Crop simulation models are efficient tools for simulating crop growth processes and yield. Running crop models requires data from various sources as well as time-consuming data processing, such as data quality checking and data formatting, before those data can be input to the model. This limits the use of crop modeling to crop modelers. We aim to make running crop models convenient for various users so that the utilization of crop models will expand, which will directly improve agricultural applications. As a first step, we developed a prototype that runs DSSAT on the Web, called Tomorrow's Rice (v. 1). It predicts rice yields based on a planting date, the rice variety and soil characteristics using the DSSAT crop model. A user only needs to select a planting location on the Web GUI; the system then queries historical weather data from available sources and returns the expected yield. Currently, we are working on weather data connection via the Sensor Observation Service (SOS) interface defined by the Open Geospatial Consortium (OGC). Weather data can be automatically connected to a weather generator for generating weather scenarios for running the crop model. In order to expand these services further, we are designing a web service framework consisting of layers of web services to support composition and execution of crop simulations. This framework allows a third-party application to call and cascade each service as needed for data preparation and running the DSSAT model using a dynamic web service mechanism. The framework has a module to manage data format conversion, which means users do not need to spend their time curating the data inputs. Dynamic linking of data sources and services is implemented using the Service Component Architecture (SCA).
This agriculture web service platform demonstrates interoperability of weather data using SOS interface, convenient connections between weather data sources and weather generator, and connecting various services for running crop models for decision support.
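
    The cascaded service calls described above behave like function composition: each stage consumes the previous stage's output after format conversion. A toy sketch with hypothetical stage names (the real framework dispatches SCA-linked web services, not local functions):

```python
def compose(*stages):
    """Chain data-preparation stages into one callable, mimicking the
    framework's cascaded calls (weather source -> format conversion ->
    crop model). Stage names and numbers below are hypothetical.
    """
    def pipeline(data):
        for stage in stages:
            data = stage(data)
        return data
    return pipeline

# Hypothetical stand-ins for an SOS fetch, a format converter, and a model run.
fetch_weather = lambda loc: {"loc": loc, "rain_mm": [5, 0, 12]}
to_model_format = lambda d: {**d, "total_rain": sum(d["rain_mm"])}
run_model = lambda d: {"loc": d["loc"], "yield_kg_ha": 3000 + 10 * d["total_rain"]}

predict_yield = compose(fetch_weather, to_model_format, run_model)
print(predict_yield("Khon Kaen"))  # {'loc': 'Khon Kaen', 'yield_kg_ha': 3170}
```

    A third-party application would compose such stages dynamically rather than hard-coding the chain.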

  17. Biological data integration: wrapping data and tools.

    PubMed

    Lacroix, Zoé

    2002-06-01

    Nowadays scientific data is inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced data accessing, analyzing, and visualization tools. Building a digital library for scientific data requires accessing and manipulating data extracted from flat files or databases, documents retrieved from the Web, as well as data generated by software. We present an approach to wrapping web data sources, databases, flat files, or data generated by tools through a database view mechanism. Generally, a wrapper has two tasks: it first sends a query to the source to retrieve data and, second, builds the expected output with respect to the virtual structure. Our wrappers are composed of a retrieval component based on an intermediate object view mechanism called search views, mapping the source capabilities to attributes, and an eXtensible Markup Language (XML) engine, respectively, to perform these two tasks. The originality of the approach consists of: 1) a generic view mechanism to access seamlessly data sources with limited capabilities and 2) the ability to wrap data sources as well as the useful specific tools they may provide. Our approach has been developed and demonstrated as part of the multidatabase system supporting queries via uniform object protocol model (OPM) interfaces.
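
    The two wrapper tasks described above, retrieving from the source and then rebuilding the result under the virtual structure, can be sketched like this; the source, field names, and XML layout are hypothetical:

```python
import xml.etree.ElementTree as ET

class SourceWrapper:
    """Two-task wrapper: (1) retrieve raw records from a source via an
    injected fetch function, (2) rebuild them under a virtual XML
    structure using a search-view-style field mapping. Any source
    (flat file, database, web page) can plug in as `fetch`.
    """
    def __init__(self, fetch, field_map):
        self.fetch = fetch          # task 1: query the source
        self.field_map = field_map  # source key -> virtual attribute

    def query(self, **params):
        root = ET.Element("records")
        for raw in self.fetch(**params):  # task 2: build virtual output
            rec = ET.SubElement(root, "record")
            for src_key, attr in self.field_map.items():
                ET.SubElement(rec, attr).text = str(raw[src_key])
        return ET.tostring(root, encoding="unicode")

# Hypothetical source returning one record.
fetch = lambda **p: [{"id": 1, "nm": "BRCA1"}]
wrapper = SourceWrapper(fetch, {"id": "accession", "nm": "gene"})
print(wrapper.query())
```

    The fetch function isolates source capabilities, so a source that only supports limited queries still presents the same virtual view.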

  18. A life scientist's gateway to distributed data management and computing: the PathPort/ToolBus framework.

    PubMed

    Eckart, J Dana; Sobral, Bruno W S

    2003-01-01

    The emergent needs of the bioinformatics community challenge current information systems. The pace of biological data generation far outstrips Moore's Law. Therefore, a gap continues to widen between the capability to produce biological (molecular and cell) data sets and the capability to manage and analyze these data sets. As a result, Federal investments in large data set generation produce diminishing returns in terms of the community's capability to understand biology and leverage that understanding to make scientific and technological advances that improve society. We are building an open framework to address various data management issues including data and tool interoperability, nomenclature and data communication standardization, and database integration. PathPort, short for Pathogen Portal, employs a generic, web-services based framework to deal with some of the problems identified by the bioinformatics community. The motivating research goal of a scalable system to provide data management and analysis for key pathosystems, especially relating to molecular data, has resulted in a generic framework using two major components. On the server-side, we employ web-services. On the client-side, a Java application called ToolBus acts as a client-side "bus" for contacting data and tools and viewing results through a single, consistent user interface.

  19. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed Central

    Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate their conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of the clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub, as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed, notably regarding its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation. PMID:12463921

  20. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed

    Staccini, Pascal M; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate their conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of the clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub, as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed, notably regarding its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation.

  1. Dynamic selection mechanism for quality of service aware web services

    NASA Astrophysics Data System (ADS)

    D'Mello, Demian Antony; Ananthanarayana, V. S.

    2010-02-01

    A web service is an interface of the software component that can be accessed by standard Internet protocols. The web service technology enables application-to-application communication and interoperability. The increasing number of web service providers throughout the globe has produced numerous web services providing the same or similar functionality. This necessitates the use of tools and techniques to search for suitable services available over the Web. UDDI (universal description, discovery and integration) is the first initiative to find suitable web services based on the requester's functional demands. However, the requester's requirements may also include non-functional aspects like quality of service (QoS). In this paper, the authors define a QoS model for QoS aware and business driven web service publishing and selection. The authors propose a QoS requirement format for the requesters, to specify their complex demands on QoS for the web service selection. The authors define a tree structure called quality constraint tree (QCT) to represent the requester's variety of requirements on QoS properties having varied preferences. The paper proposes a QoS broker based architecture for web service selection, which facilitates the requesters to specify their QoS requirements to select a qualitatively optimal web service. A web service selection algorithm is presented, which ranks the functionally similar web services based on the degree of satisfaction of the requester's QoS requirements and preferences. The paper defines web service provider qualities to distinguish qualitatively competitive web services. The paper also presents the modelling and selection mechanism for the requester's alternative constraints defined on the QoS. The authors implement the QoS broker based system to prove the correctness of the proposed web service selection mechanism.
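
    The ranking step can be illustrated with a flat weighted satisfaction score; the paper's quality constraint tree (QCT) expresses richer, nested preferences and alternative constraints that a flat sum cannot:

```python
def rank_services(services, weights):
    """Rank functionally similar services by a weighted degree of
    satisfaction of the requester's QoS preferences. QoS values are
    assumed normalized to [0, 1] with higher = better (so a latency
    score of 0.9 means low latency). A simplification of the QCT.
    """
    def score(qos):
        return sum(weights[k] * qos[k] for k in weights)
    ranked = sorted(services.items(), key=lambda kv: score(kv[1]), reverse=True)
    return [name for name, _ in ranked]
```

    A broker would apply such a ranking only to the candidates that already satisfy the requester's hard constraints.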

  2. Visualization of Atmospheric Water Vapor Data for SAGE

    NASA Technical Reports Server (NTRS)

    Kung, Mou-Liang; Chu, W. P. (Technical Monitor)

    2000-01-01

    The goal of this project was to develop visualization tools to study water vapor dynamics using the Stratospheric Aerosol and Gas Experiment II (SAGE II) water vapor data. During the past years, we completed the development of a visualization tool called EZSAGE and various gridded water vapor plots, tools deployed on the web to provide users with new insight into water vapor dynamics. Results and experiences from this project, including papers, tutorials and reviews, were published on the main Web page. An additional publishing effort has been initiated to package the EZSAGE software for CD production and distribution. There have been some major personnel changes since Fall 1998. Dr. Mou-Liang Kung, a Professor of Computer Science, assumed the PI position vacated by Dr. Waldo Rodriguez, who was on leave. However, former PI Dr. Rodriguez continued to serve as a research adviser to this project to assure a smooth transition and project completion. Typically in each semester, five student research assistants were hired and trained. Weekly group meetings were held to discuss problems, progress, new research directions, and activity planning. Other small group meetings were also held regularly for different objectives of this project. All student research assistants were required to submit reports for conference submission.

  3. SieveSifter: a web-based tool for visualizing the sieve analyses of HIV-1 vaccine efficacy trials.

    PubMed

    Fiore-Gartland, Andrew; Kullman, Nicholas; deCamp, Allan C; Clenaghan, Graham; Yang, Wayne; Magaret, Craig A; Edlefsen, Paul T; Gilbert, Peter B

    2017-08-01

    Analysis of HIV-1 virions from participants infected in a randomized controlled preventive HIV-1 vaccine efficacy trial can help elucidate mechanisms of partial protection. By comparing the genetic sequence of viruses from vaccine and placebo recipients to the sequence of the vaccine itself, a technique called 'sieve analysis', one can identify functional specificities of vaccine-induced immune responses. We have created an interactive web-based visualization and data access tool for exploring the results of sieve analyses performed on four major preventive HIV-1 vaccine efficacy trials: (i) the HIV Vaccine Trial Network (HVTN) 502/Step trial, (ii) the RV144/Thai trial, (iii) the HVTN 503/Phambili trial and (iv) the HVTN 505 trial. The tool acts simultaneously as a platform for rapid reinterpretation of sieve effects and as a portal for organizing and sharing the viral sequence data. Access to these valuable datasets also enables the development of novel methodology for future sieve analyses. Visualization: http://sieve.fredhutch.org/viz . Source code: https://github.com/nkullman/SIEVE . Data API: http://sieve.fredhutch.org/data . agartlan@fredhutch.org. © The Author(s) 2017. Published by Oxford University Press.
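
    The counting behind a sieve analysis, per-site mismatch rates of breakthrough viruses against the vaccine insert, can be sketched as follows; the statistical comparison between vaccine and placebo arms, which is where sieve effects are actually detected, is omitted:

```python
def site_mismatch_rates(vaccine_insert, sequences):
    """Fraction of aligned breakthrough sequences mismatching the
    vaccine insert at each site. Assumes pre-aligned, equal-length
    sequences; real analyses handle alignment and indels.
    """
    n = len(sequences)
    rates = []
    for i, ref in enumerate(vaccine_insert):
        mismatches = sum(1 for s in sequences if s[i] != ref)
        rates.append(mismatches / n)
    return rates
```

    Sites where the vaccine arm shows systematically higher mismatch rates than the placebo arm point to vaccine-induced immune pressure.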

  4. GOTree Machine (GOTM): a web-based platform for interpreting sets of interesting genes using Gene Ontology hierarchies

    PubMed Central

    Zhang, Bing; Schmoyer, Denise; Kirov, Stefan; Snoddy, Jay

    2004-01-01

    Background Microarray and other high-throughput technologies are producing large sets of interesting genes that are difficult to analyze directly. Bioinformatics tools are needed to interpret the functional information in the gene sets. Results We have created a web-based tool for data analysis and data visualization for sets of genes called GOTree Machine (GOTM). This tool was originally intended to analyze sets of co-regulated genes identified from microarray analysis but is adaptable for use with other gene sets from other high-throughput analyses. GOTree Machine generates a GOTree, a tree-like structure to navigate the Gene Ontology Directed Acyclic Graph for input gene sets. This system provides user friendly data navigation and visualization. Statistical analysis helps users to identify the most important Gene Ontology categories for the input gene sets and suggests biological areas that warrant further study. GOTree Machine is available online at . Conclusion GOTree Machine has a broad application in functional genomic, proteomic and other high-throughput methods that generate large sets of interesting genes; its primary purpose is to help users sort for interesting patterns in gene sets. PMID:14975175
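
    The statistical analysis that flags important Gene Ontology categories is typically a hypergeometric enrichment test; a sketch under that assumption (GOTM's own statistics and any multiple-testing corrections may differ):

```python
from math import comb

def hypergeom_pval(k, n, K, N):
    """P(X >= k) for X ~ Hypergeometric(N, K, n): the chance of drawing
    at least k genes from a GO category of size K when sampling n
    interesting genes from a genome of N. Standard enrichment test.
    """
    total = sum(comb(K, i) * comb(N - K, n - i)
                for i in range(k, min(n, K) + 1))
    return total / comb(N, n)
```

    A small p-value for a category means the input gene set hits it far more often than random sampling would, which is what "most important categories" refers to above.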

  5. Shiny-phyloseq: Web application for interactive microbiome analysis with provenance tracking.

    PubMed

    McMurdie, Paul J; Holmes, Susan

    2015-01-15

    We have created a Shiny-based Web application, called Shiny-phyloseq, for dynamic interaction with microbiome data that runs on any modern Web browser and requires no programming, increasing the accessibility and decreasing the entrance requirement to using phyloseq and related R tools. Along with a data- and context-aware dynamic interface for exploring the effects of parameter and method choices, Shiny-phyloseq also records the complete user input and subsequent graphical results of a user's session, allowing the user to archive, share and reproduce the sequence of steps that created their result, without writing any new code themselves. Shiny-phyloseq is implemented entirely in the R language. It can be hosted/launched by any system with R installed, including Windows, Mac OS and most Linux distributions. Information technology administrators can also host Shiny-phyloseq from a remote server, in which case users need only have a Web browser installed. Shiny-phyloseq is provided free of charge under a GPL-3 open-source license through GitHub at http://joey711.github.io/shiny-phyloseq/. © The Author 2014. Published by Oxford University Press.

  6. CartograTree: connecting tree genomes, phenotypes and environment.

    PubMed

    Vasquez-Gross, Hans A; Yu, John J; Figueroa, Ben; Gessler, Damian D G; Neale, David B; Wegrzyn, Jill L

    2013-05-01

    Today, researchers spend a tremendous amount of time gathering, formatting, filtering and visualizing data collected from disparate sources. Under the umbrella of forest tree biology, we seek to provide a platform and leverage modern technologies to connect biotic and abiotic data. Our goal is to provide an integrated web-based workspace that connects environmental, genomic and phenotypic data via geo-referenced coordinates. Here, we connect the genomic query web-based workspace DiversiTree and a novel geographical interface called CartograTree to data housed in the TreeGenes database. To accomplish this goal, we implemented Simple Semantic Web Architecture and Protocol to enable the primary genomics database, TreeGenes, to communicate with semantic web services regardless of platform or back-end technologies. The novelty of CartograTree lies in the interactive workspace that allows for geographical visualization and engagement of high performance computing (HPC) resources. The application provides a unique tool set to facilitate research on the ecology, physiology and evolution of forest tree species. CartograTree can be accessed at: http://dendrome.ucdavis.edu/cartogratree. © 2013 Blackwell Publishing Ltd.

  7. Tool independence for the Web Accessibility Quantitative Metric.

    PubMed

    Vigo, Markel; Brajnik, Giorgio; Arrue, Myriam; Abascal, Julio

    2009-07-01

    The Web Accessibility Quantitative Metric (WAQM) aims at accurately measuring the accessibility of web pages. One of its main features is that it is evaluation-tool independent in ranking and accessibility-monitoring scenarios. This article proposes a method to attain evaluation-tool independence for all foreseeable scenarios. After demonstrating that homepages have a more similar error profile than any other web page in a given web site, 15 homepages were measured with 10,000 different values of WAQM parameters using EvalAccess and LIFT, two automatic evaluation tools for accessibility. A similar procedure was followed with random pages and with several test files, obtaining several parameter tuples that minimise the difference between both tools. Then, 1,449 web pages from 15 web sites were measured with these tuples, and those values that minimised the difference between the tools were selected. Once the WAQM was tuned, the accessibility of 15 web sites was measured with two metrics for web sites, concluding that even if similar values can be produced, obtaining the same scores is undesirable since evaluation tools behave in different ways.
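    The tuning procedure can be illustrated with a toy parameterized metric. This is not the published WAQM formula; it is a simplified sketch showing how a score built from checkpoint failure rates can be tuned so that two tools' reports agree as closely as possible.

```python
def accessibility_score(failures, potentials, weight):
    """Map a failure rate into a 0-100 score; 'weight' is the tunable parameter."""
    rate = failures / potentials if potentials else 0.0
    return 100.0 * (1.0 - rate) ** weight

# Hypothetical reports from two evaluation tools for the same page:
# (failed checks, potential failure points).
tool_a = (12, 200)
tool_b = (30, 400)

# Grid-search the weight that minimizes the disagreement between the tools.
best = min((abs(accessibility_score(*tool_a, w) - accessibility_score(*tool_b, w)), w)
           for w in [0.5, 1.0, 2.0, 4.0])
print(best[1])  # weight giving the closest agreement
```

The article's method is the same in spirit but searches 10,000 parameter values across many pages and two real tools (EvalAccess and LIFT).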

  8. BingEO: Enable Distributed Earth Observation Data for Environmental Research

    NASA Astrophysics Data System (ADS)

    Wu, H.; Yang, C.; Xu, Y.

    2010-12-01

    Our planet is facing great environmental challenges including global climate change, environmental vulnerability, extreme poverty, and a shortage of clean cheap energy. To address these problems, scientists are developing various models to analyze, forecast, and simulate geospatial phenomena in support of critical decision making. These models not only challenge our computing technology, but also challenge us to meet their huge demand for earth observation data. Through various policies and programs, open and free sharing of earth observation data is advocated in earth science. Currently, thousands of data sources are freely available online through open standards such as Web Map Service (WMS), Web Feature Service (WFS) and Web Coverage Service (WCS). Seamless sharing and access to these resources call for a spatial Cyberinfrastructure (CI) to enable the use of spatial data for the advancement of related applied sciences including environmental research. Based on Microsoft Bing Search Engine and Bing Map, a seamlessly integrated and visual tool is under development to bridge the gap between researchers/educators and earth observation data providers. With this tool, earth science researchers/educators can easily and visually find the best data sets for their research and education. The tool includes a registry and its related supporting module at server-side and an integrated portal as its client. The proposed portal, Bing Earth Observation (BingEO), is based on Bing Search and Bing Map to: 1) Use Bing Search to discover Web Map Services (WMS) resources available over the internet; 2) Develop and maintain a registry to manage all the available WMS resources and constantly monitor their service quality; 3) Allow users to manually register data services; 4) Provide a Bing Maps-based Web application to visualize the data on a high-quality and easy-to-manipulate map platform and enable users to select the best data layers online.
Given the amount of observation data already accumulated and still growing, BingEO will allow these resources to be used more widely, intensively, efficiently and economically in earth science applications.
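    The WMS standard mentioned above is just HTTP with well-known query parameters. A sketch of building an OGC WMS 1.1.1 GetMap request, of the kind a portal like BingEO would issue against a registered service (the endpoint and layer name are placeholders, not a real service):

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, size=(512, 512)):
    """Build a WMS 1.1.1 GetMap URL; bbox is (minx, miny, maxx, maxy)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return f"{endpoint}?{urlencode(params)}"

url = wms_getmap_url("http://example.org/wms", "sst_anomaly", (-180, -90, 180, 90))
print(url)
```

Because every WMS server answers the same request grammar, a registry only needs to store the endpoint and layer names to make thousands of heterogeneous services browsable from one map client.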

  9. AstrodyToolsWeb an e-Science project in Astrodynamics and Celestial Mechanics fields

    NASA Astrophysics Data System (ADS)

    López, R.; San-Juan, J. F.

    2013-05-01

    Astrodynamics Web Tools, AstrodyToolsWeb (http://tastrody.unirioja.es), is an ongoing collaborative Web Tools computing infrastructure project which has been specially designed to support scientific computation. AstrodyToolsWeb provides project collaborators with the technical and human resources needed to wrap, manage, and use specialized noncommercial software tools in the Astrodynamics and Celestial Mechanics fields, with the aim of optimizing the use of both human and material resources. However, this project is open to collaboration from the whole scientific community in order to create a library of useful tools and their corresponding theoretical backgrounds. AstrodyToolsWeb offers a user-friendly web interface for choosing applications, introducing data, and selecting appropriate constraints in an intuitive and easy way. After that, the application is executed in real time whenever possible; the critical information about program behavior (errors and logs) and output, including the postprocessing and interpretation of its results (graphical representation of data, statistical analysis or other manipulations), is then shown via the same web interface or can be downloaded to the user's computer.

  10. A Java API for working with PubChem datasets.

    PubMed

    Southern, Mark R; Griffin, Patrick R

    2011-03-01

    PubChem is a public repository of chemical structures and associated biological activities. The PubChem BioAssay database contains assay descriptions, conditions and readouts and biological screening results that have been submitted by the biomedical research community. The PubChem web site and Power User Gateway (PUG) web service allow users to interact with the data, and raw files are available via FTP. These resources are helpful to many, but there can also be great benefit in using a software API to manipulate the data. Here, we describe a Java API with entity objects mapped to the PubChem Schema and with wrapper functions for calling the NCBI eUtilities and PubChem PUG web services. PubChem BioAssays and associated chemical compounds can then be queried and manipulated in a local relational database. Features include chemical structure searching and generation and display of curve fits from stored dose-response experiments, something that is not yet available within PubChem itself. The aim is to provide researchers with a fast, consistent, queryable local resource from which to manipulate PubChem BioAssays in a database-agnostic manner. It is not intended as an end-user tool but as a platform for further automation and tools development. http://code.google.com/p/pubchemdb.
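    The Java API above wraps the PUG and eUtilities services; the same records are also reachable over plain HTTP through PubChem's PUG REST interface. A sketch that only constructs (and does not send) a property-query URL, following the documented PUG REST path layout:

```python
# PubChem PUG REST base URL (per NCBI's public documentation).
BASE = "https://pubchem.ncbi.nlm.nih.gov/rest/pug"

def compound_property_url(cid, properties, fmt="JSON"):
    """Build a PUG REST URL requesting properties for a compound by CID."""
    prop_list = ",".join(properties)
    return f"{BASE}/compound/cid/{cid}/property/{prop_list}/{fmt}"

# CID 2244 is aspirin in PubChem.
url = compound_property_url(2244, ["MolecularFormula", "MolecularWeight"])
print(url)
```

Fetching that URL (e.g. with `urllib.request.urlopen`) returns a JSON document; the Java API described in the article provides the same access pattern with typed entity objects and local database storage on top.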

  11. Increasing the availability and usability of terrestrial ecology data through geospatial Web services and visualization tools (Invited)

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.

    2010-12-01

    Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data, and limit data users’ abilities to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create the data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web Service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools allows for publicizing the data sets and makes them more available across the Internet. 
The ORNL DAAC has also created a Web-based graphical user interface called Spatial Data Access Tool (SDAT) that utilizes OGC Web service standards and allows data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize a data set prior to download. Google Earth visualizations of the data sets are also provided through SDAT. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set saw a ~10-fold increase in downloads through OGC Web services in comparison to the conventional FTP and WWW methods of access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.

  12. Delivering Electronic Resources with Web OPACs and Other Web-based Tools: Needs of Reference Librarians.

    ERIC Educational Resources Information Center

    Bordeianu, Sever; Carter, Christina E.; Dennis, Nancy K.

    2000-01-01

    Describes Web-based online public access catalogs (Web OPACs) and other Web-based tools as gateway methods for providing access to library collections. Addresses solutions for overcoming barriers to information, such as through the implementation of proxy servers and other authentication tools for remote users. (Contains 18 references.)…

  13. Clearing your Desk! Software and Data Services for Collaborative Web Based GIS Analysis

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Gichamo, T.; Yildirim, A. A.; Liu, Y.

    2015-12-01

    Can your desktop computer crunch the large GIS datasets that are becoming increasingly common across the geosciences? Do you have access to or the know-how to take advantage of advanced high performance computing (HPC) capability? Web based cyberinfrastructure takes work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This talk will describe the HydroShare collaborative environment and web based services being developed to support the sharing and processing of hydrologic data and models. HydroShare supports the upload, storage, and sharing of a broad class of hydrologic data including time series, geographic features and raster datasets, multidimensional space-time data, and other structured collections of data. Web service tools and a Python client library provide researchers with access to HPC resources without requiring them to become HPC experts. This reduces the time and effort spent in finding and organizing the data required to prepare the inputs for hydrologic models and facilitates the management of online data and execution of models on HPC systems. This presentation will illustrate the use of web based data and computation services from both the browser and desktop client software. These web-based services implement the Terrain Analysis Using Digital Elevation Model (TauDEM) tools for watershed delineation, generation of hydrology-based terrain information, and preparation of hydrologic model inputs. They allow users to develop scripts on their desktop computer that call analytical functions that are executed completely in the cloud, on HPC resources using input datasets stored in the cloud, without installing specialized software, learning how to use HPC, or transferring large datasets back to the user's desktop. These cases serve as examples for how this approach can be extended to other models to enhance the use of web and data services in the geosciences.
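    The talk describes desktop scripts that invoke analytical functions executing in the cloud. A generic sketch of that request-building pattern; the endpoint and payload fields below are hypothetical placeholders, not the actual HydroShare/TauDEM API.

```python
import json
from urllib.request import Request

def build_job_request(endpoint, payload):
    """Prepare (but do not send) a JSON POST request for a remote analysis job."""
    return Request(endpoint,
                   data=json.dumps(payload).encode(),
                   headers={"Content-Type": "application/json"})

# Hypothetical watershed-delineation job: a DEM resource identifier plus an
# outlet coordinate; the server would run the computation on HPC resources.
req = build_job_request("https://example.org/api/delineate",
                        {"dem": "my_dem_resource_id", "outlet": [-111.9, 41.7]})
print(req.get_method(), req.full_url)
```

The point of the design is visible even in this sketch: the script ships only identifiers and parameters, while the large datasets and the computation stay server-side.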

  14. Development and formative evaluation of a visual e-tool to help decision makers navigate the evidence around health financing.

    PubMed

    Skordis-Worrall, Jolene; Pulkki-Brännström, Anni-Maria; Utley, Martin; Kembhavi, Gayatri; Bricki, Nouria; Dutoit, Xavier; Rosato, Mikey; Pagel, Christina

    2012-12-21

    There are calls for low and middle income countries to develop robust health financing policies to increase service coverage. However, existing evidence around financing options is complex and often difficult for policy makers to access. We aimed to summarize the evidence on the impact of financing health systems and to develop an e-tool to help decision makers navigate the findings. After reviewing the literature, we used thematic analysis to summarize the impact of 7 common health financing mechanisms on 5 common health system goals. Information on the relevance of each study to a user's context was provided by 11 country indicators. A Web-based e-tool was then developed to assist users in navigating the literature review. This tool was evaluated using feedback from early users, collected using an online survey and in-depth interviews with key informants. The e-tool provides graphical summaries that allow a user to assess the following parameters with a single snapshot: the number of relevant studies available in the literature, the heterogeneity of evidence, where key evidence is lacking, and how closely the evidence matches their own context. Users particularly liked the visual display and found navigating the tool intuitive. However, there was concern that a lack of evidence on positive impact might be construed as evidence against a financing option, and that the tool might over-simplify the available financing options. Complex evidence can be made more easily accessible and potentially more understandable using basic Web-based technology and innovative graphical representations that match findings to the users' goals and context.

  15. ballaxy: web services for structural bioinformatics.

    PubMed

    Hildebrandt, Anna Katharina; Stöckel, Daniel; Fischer, Nina M; de la Garza, Luis; Krüger, Jens; Nickels, Stefan; Röttig, Marc; Schärfe, Charlotta; Schumann, Marcel; Thiel, Philipp; Lenhof, Hans-Peter; Kohlbacher, Oliver; Hildebrandt, Andreas

    2015-01-01

    Web-based workflow systems have gained considerable momentum in sequence-oriented bioinformatics. In structural bioinformatics, however, such systems are still relatively rare; while commercial stand-alone workflow applications are common in the pharmaceutical industry, academic researchers often still rely on command-line scripting to glue individual tools together. In this work, we address the problem of building a web-based system for workflows in structural bioinformatics. For the underlying molecular modelling engine, we opted for the BALL framework because of its extensive and well-tested functionality in the field of structural bioinformatics. The large number of molecular data structures and algorithms implemented in BALL allows for elegant and sophisticated development of new approaches in the field. We hence connected the versatile BALL library and its visualization and editing front end BALLView with the Galaxy workflow framework. The result, which we call ballaxy, enables the user to simply and intuitively create sophisticated pipelines for applications in structure-based computational biology, integrated into a standard tool for molecular modelling.  ballaxy consists of three parts: some minor modifications to the Galaxy system, a collection of tools and an integration into the BALL framework and the BALLView application for molecular modelling. Modifications to Galaxy will be submitted to the Galaxy project, and the BALL and BALLView integrations will be integrated in the next major BALL release. After acceptance of the modifications into the Galaxy project, we will publish all ballaxy tools via the Galaxy toolshed. In the meantime, all three components are available from http://www.ball-project.org/ballaxy. Also, docker images for ballaxy are available at https://registry.hub.docker.com/u/anhi/ballaxy/dockerfile/. ballaxy is licensed under the terms of the GPL. © The Author 2014. Published by Oxford University Press. All rights reserved. 

  16. iMeteo: a web-based weather visualization tool

    NASA Astrophysics Data System (ADS)

    Tuni San-Martín, Max; San-Martín, Daniel; Cofiño, Antonio S.

    2010-05-01

    iMeteo is a web-based weather visualization tool. Designed with an extensible J2EE architecture, it is capable of displaying information from heterogeneous data sources such as gridded data from numerical models (in NetCDF format) or databases of local predictions. All this information is presented in a user-friendly way, allowing the user to choose the specific tool to display data (maps, graphs, information tables) and customize it to desired locations. *Modular Display System* Visualization of the data is achieved through a set of mini tools called widgets. A user can add them at will and arrange them around the screen easily with a drag and drop movement. They can be of various types and each can be configured separately, forming a really powerful and configurable system. The "Map" is the most complex widget, since it can show several variables simultaneously (either gridded or point-based) through a layered display. Other useful widgets are the "Histogram", which generates a graph with the frequency characteristics of a variable, and the "Timeline", which shows the time evolution of a variable at a given location in an interactive way. *Customization and security* Following the trends in web development, the user can easily customize the way data is displayed. Due to programming on the client side with technologies like AJAX, the interaction with the application is similar to that of desktop applications because response times are rapid. If a user is registered, he can also save his settings in the database, allowing access to his particular setup from any system with Internet access. There is particular emphasis on application security. The administrator can define a set of user profiles, which may have associated restrictions on access to certain data sources, geographic areas or time intervals.

  17. DMPwerkzeug - A tool to support the planning, implementation, and organization of research data management.

    NASA Astrophysics Data System (ADS)

    Klar, Jochen; Engelhardt, Claudia; Neuroth, Heike; Enke, Harry

    2016-04-01

    Following the call to make the results of publicly funded research openly accessible, more and more funding agencies demand the submission of a data management plan (DMP) as part of the application process. These documents specify how the data management of the project is organized and what datasets will be published when. Of particular importance for European researchers is the Open Research Data Pilot of Horizon 2020, which requires data management plans for a set of 9 selected research fields from social sciences to nanotechnology. In order to assist the researchers creating these documents, several institutions have developed dedicated software tools. The best-known are DMPonline by the Digital Curation Centre (DCC) and DMPtool by the California Digital Library (CDL), both extensive and well received web applications. The core functionality of these tools is the assisted editing of the DMP templates provided by the particular funding agency. While this is certainly helpful, especially in an environment with a plethora of different funding agencies like the UK or the USA, these tools are somewhat limited to this particular task and don't utilise the full potential of DMPs. Beyond the purpose of fulfilling funder requirements, DMPs can be useful for a number of additional tasks. In the initial conception phase of a project, they can be used as a planning tool to determine which data management activities and measures are necessary throughout the research process, to assess which resources are needed, and which institutions (computing centers, libraries, data centers) should be involved. During the project, they can act as a constant reference or guideline for the handling of research data. They also determine where the data will be stored after the project has ended and whether it can be accessed by the public, helping to take into account resulting requirements of the data center or actions necessary to ensure re-usability by others from early on.
Ideally, a DMP acts as an information source for all stakeholders involved during the complete life cycle of the project. The aim of our project is the development of a web application called DMPwerkzeug, which enables the structured planning, implementation and administration of research data management in a scientific project and, in addition, provides the scientist with a textual DMP. Building upon a generic set of content, DMPwerkzeug will be customizable to serve specific disciplinary and institutional requirements. The tool will not only be available at a central web site, but can also be installed and integrated into the existing infrastructure of a university or research institution. The tool will be multilingual, with a first version being published in English and German.

  18. CoVaCS: a consensus variant calling system.

    PubMed

    Chiara, Matteo; Gioiosa, Silvia; Chillemi, Giovanni; D'Antonio, Mattia; Flati, Tiziano; Picardi, Ernesto; Zambelli, Federico; Horner, David Stephen; Pesole, Graziano; Castrignanò, Tiziana

    2018-02-05

    The advent and ongoing development of next generation sequencing technologies (NGS) has led to a rapid increase in the rate of human genome re-sequencing data, paving the way for personalized genomics and precision medicine. The body of genome resequencing data is progressively increasing, underlining the need for accurate and time-effective bioinformatics systems for genotyping, a crucial prerequisite for identification of candidate causal mutations in diagnostic screens. Here we present CoVaCS, a fully automated, highly accurate system with a web-based graphical interface for genotyping and variant annotation. Extensive tests on a gold standard benchmark data set (the NA12878 Illumina platinum genome) confirm that call-sets based on our consensus strategy are completely in line with those attained by similar command-line based approaches, and far more accurate than call-sets from any individual tool. Importantly, our system exhibits better sensitivity and higher specificity than equivalent commercial software. CoVaCS offers optimized pipelines integrating state-of-the-art tools for variant calling and annotation for whole genome sequencing (WGS), whole-exome sequencing (WES) and target-gene sequencing (TGS) data. The system is currently hosted at Cineca, and offers the speed of an HPC computing facility, a crucial consideration when large numbers of samples must be analysed. Importantly, all the analyses are performed automatically, allowing high reproducibility of the results. As such, we believe that CoVaCS can be a valuable tool for the analysis of human genome resequencing studies. CoVaCS is available at: https://bioinformatics.cineca.it/covacs .
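    A common simple form of consensus variant calling, sketched here with toy data (this is an illustration of the general idea, not the CoVaCS implementation), keeps only variants reported by a majority of the individual callers:

```python
def consensus_calls(callsets, min_support=2):
    """Keep variants (chrom, pos, ref, alt) seen in at least min_support call-sets."""
    support = {}
    for calls in callsets:
        for variant in calls:
            support[variant] = support.get(variant, 0) + 1
    return sorted(v for v, n in support.items() if n >= min_support)

# Hypothetical call-sets from three variant callers on the same sample.
caller_a = {("chr1", 1000, "A", "G"), ("chr1", 2000, "C", "T")}
caller_b = {("chr1", 1000, "A", "G"), ("chr2", 500, "G", "A")}
caller_c = {("chr1", 1000, "A", "G"), ("chr1", 2000, "C", "T")}

print(consensus_calls([caller_a, caller_b, caller_c]))
```

Requiring agreement among callers trades a little sensitivity for a large gain in specificity, which matches the accuracy claims the abstract makes for consensus call-sets over any single tool.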

  19. The Joy of Playing with Oceanographic Data

    NASA Astrophysics Data System (ADS)

    Smith, A. T.; Xing, Z.; Armstrong, E. M.; Thompson, C. K.; Huang, T.

    2013-12-01

    The web is no longer just an afterthought. It is no longer just a presentation layer filled with HTML, CSS, JavaScript, Frameworks, 3D, and more. It has become the medium of our communication. It is the database of all databases. It is the computing platform of all platforms. It has transformed the way we do science. Web service is the de facto method for communication between machines over the web. Representational State Transfer (REST) has standardized the way we architect services and their interfaces. In the Earth Science domain, we are familiar with tools and services such as the Open-source Project for a Network Data Access Protocol (OPeNDAP), Thematic Realtime Environmental Distributed Data Services (THREDDS), and Live Access Server (LAS). We are also familiar with various data formats such as NetCDF3/4, HDF4/5, GRIB, TIFF, etc. One of the challenges for the Earth Science community is accessing information within these data. There are community-accepted readers that our users can download and install. However, the Application Programming Interface (API) between these readers is not standardized, which leads to non-portable applications. Webification (w10n) is an emerging technology, developed at the Jet Propulsion Laboratory, which exploits the hierarchical nature of a science data artifact to assign a URL to each element within the artifact (e.g. a granule file). By embracing standards such as JSON, XML, and HTML5 and predictable URLs, w10n provides a simple interface that enables tool-builders and researchers to develop portable tools/applications to interact with artifacts of various formats. The NASA Physical Oceanographic Distributed Active Archive Center (PO.DAAC) is the designated data center for observational products relevant to the physical state of the ocean.
Over the past year PO.DAAC has been evaluating w10n technology by webifying its archive holdings to provide simplified access to oceanographic science artifacts and as a service to enable future tools and services development. In this talk, we will focus on a w10n-based system called Distributed Oceanographic Webification Service (DOWS) being developed at PO.DAAC to provide a newer and simpler method for working with observational data artifacts. As a continued effort at PO.DAAC to provide better tools and services to visualize our data, the talk will discuss the latest in web-based data visualization tools/frameworks (such as d3.js, Three.js, Leaflet.js, and more) and techniques for working with webified oceanographic science data in both a 2D and 3D web approach.
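    The core idea of webification is that every node inside a hierarchical science artifact gets its own predictable URL. A toy sketch of that mapping (the URL layout and granule contents below are illustrative, not the exact w10n convention):

```python
def assign_urls(node, base):
    """Walk a nested structure and give every node a URL under 'base'."""
    urls = {base: node if not isinstance(node, dict) else None}
    if isinstance(node, dict):
        for name, child in node.items():
            urls.update(assign_urls(child, f"{base}/{name}"))
    return urls

# A toy "granule": one variable with data and an attribute.
granule = {"sea_surface_temp": {"data": [280.1, 281.3], "units": "K"}}
urls = assign_urls(granule, "http://example.org/granule.nc")
for url in sorted(urls):
    print(url)
```

Because the URLs are predictable, a client can reach one variable or attribute without downloading the whole file or knowing its on-disk format, which is what makes format-agnostic, portable tools possible.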

  20. InCHlib - interactive cluster heatmap for web applications.

    PubMed

    Skuta, Ctibor; Bartůněk, Petr; Svozil, Daniel

    2014-12-01

    Hierarchical clustering is an exploratory data analysis method that reveals the groups (clusters) of similar objects. The result of the hierarchical clustering is a tree structure called a dendrogram that shows the arrangement of individual clusters. To investigate the row/column hierarchical cluster structure of a data matrix, a visualization tool called 'cluster heatmap' is commonly employed. In the cluster heatmap, the data matrix is displayed as a heatmap, a 2-dimensional array in which the colour of each element corresponds to its value. The rows/columns of the matrix are ordered such that similar rows/columns are near each other. The ordering is given by the dendrogram, which is displayed on the side of the heatmap. We developed InCHlib (Interactive Cluster Heatmap Library), a highly interactive and lightweight JavaScript library for cluster heatmap visualization and exploration. InCHlib enables the user to select individual or clustered heatmap rows, to zoom in and out of clusters or to flexibly modify heatmap appearance. The cluster heatmap can be augmented with additional metadata displayed in a different colour scale. In addition, to further enhance the visualization, the cluster heatmap can be interconnected with external data sources or analysis tools. Data clustering and the preparation of the input file for InCHlib is facilitated by the Python utility script inchlib_clust.
Though InCHlib is primarily intended for the analysis of chemical or biological data, it is a versatile tool whose application domain is not limited to the life sciences.
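    The dendrogram a cluster heatmap is ordered by simply records the merge order of hierarchical clustering. A minimal single-linkage sketch on a toy 1-D dataset (real tools like inchlib_clust operate on full distance matrices over many dimensions):

```python
def single_linkage(points):
    """Merge clusters bottom-up; return the sequence of merges (the dendrogram)."""
    clusters = [[p] for p in points]
    merges = []
    while len(clusters) > 1:
        # Pick the pair of clusters with the smallest minimum inter-point distance.
        i, j = min(((a, b) for a in range(len(clusters))
                    for b in range(a + 1, len(clusters))),
                   key=lambda ab: min(abs(x - y)
                                      for x in clusters[ab[0]]
                                      for y in clusters[ab[1]]))
        merges.append((clusters[i], clusters[j]))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return merges

merges = single_linkage([1.0, 1.1, 5.0, 5.3])
for left, right in merges:
    print(left, right)
```

Each merge becomes one internal node of the dendrogram; drawing those nodes beside the reordered heatmap rows is exactly the layout InCHlib renders in the browser.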

  1. WASP: a Web-based Allele-Specific PCR assay designing tool for detecting SNPs and mutations

    PubMed Central

    Wangkumhang, Pongsakorn; Chaichoompu, Kridsadakorn; Ngamphiw, Chumpol; Ruangrit, Uttapong; Chanprasert, Juntima; Assawamakin, Anunchai; Tongsima, Sissades

    2007-01-01

    Background Allele-specific (AS) Polymerase Chain Reaction is a convenient and inexpensive method for genotyping Single Nucleotide Polymorphisms (SNPs) and mutations. It is applied in many recent studies including population genetics, molecular genetics and pharmacogenomics. Using existing AS primer design tools can be a cumbersome process for inexperienced users, since information about the SNP/mutation must be acquired from public databases prior to the design. Furthermore, most of these tools do not offer mismatch enhancement of the designed primers. The available web applications do not provide a user-friendly graphical input interface or intuitive visualization of their primer results. Results This work presents a web-based AS primer design application called WASP. This tool can efficiently design AS primers for human SNPs as well as mutations. To assist scientists with collecting necessary information about target polymorphisms, this tool provides a local SNP database containing over 10 million SNPs of various populations from the public domain databases NCBI dbSNP, HapMap and JSNP. This database is tightly integrated with the tool so that users can perform the design for existing SNPs without going off the site. To guarantee specificity of AS primers, the proposed system incorporates a primer specificity enhancement technique widely used in experimental protocols. In particular, WASP makes use of different destabilizing effects by introducing one deliberate 'mismatch' at the penultimate (second to last of the 3'-end) base of AS primers to improve the resulting AS primers. Furthermore, WASP offers a graphical user interface based on Scalable Vector Graphics (SVG) that allows users to select SNPs and graphically visualize designed primers and their conditions. Conclusion WASP offers a tool for designing AS primers for both SNPs and mutations.
    By integrating a database of known SNPs (searchable by gene ID or rs number), this tool simplifies the otherwise awkward process of retrieving flanking sequences and other related information from public SNP databases. It takes the underlying destabilizing effect into account to ensure the effectiveness of the designed primers. With its user-friendly SVG interface, WASP intuitively presents the designed primers and assists users in exporting them or making further adjustments to the design. This software can be freely accessed at . PMID:17697334
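    The penultimate-mismatch idea described above can be sketched in a few lines of Python. This is an illustration of the technique only, not WASP's implementation; the template sequence, primer length, and the transversion-based choice of substitute base are all hypothetical.

```python
# Illustrative sketch only -- not WASP's actual code. An allele-specific
# primer places the SNP allele at its 3' terminus; one extra, deliberate
# mismatch at the penultimate base further destabilizes extension on the
# non-target allele. The transversion table is a simple stand-in for the
# thermodynamics-aware choice a real design tool would make.
TRANSVERSION = {"A": "C", "C": "A", "G": "T", "T": "G"}

def allele_specific_primer(template_5to3: str, snp_index: int,
                           allele: str, length: int = 20) -> str:
    """Build a forward AS primer ending at snp_index, with the given
    allele at the 3' end and a deliberate penultimate mismatch."""
    start = snp_index - length + 1
    if start < 0:
        raise ValueError("template too short for requested primer length")
    primer = list(template_5to3[start:snp_index]) + [allele]
    # The deliberate mismatch: swap the second-to-last (penultimate) base.
    primer[-2] = TRANSVERSION[primer[-2]]
    return "".join(primer)

template = "ATGCGTACGTTAGCCATGCAGTTCA"  # hypothetical flanking sequence
print(allele_specific_primer(template, snp_index=20, allele="G", length=10))
```

    In a real design, the substitute base would be chosen by its destabilizing effect on duplex stability rather than by a fixed lookup table.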

  2. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7798 ● SEP 2016. US Army Research Laboratory. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server, by Christian D Schlesiger, Computational and Information Sciences Directorate, ARL.

  3. Climate Model Diagnostic Analyzer

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. Given the exploratory nature of climate data analyses and the explosive growth of datasets and service tools, scientists struggle to keep track of their datasets, tools, and execution/study history, let alone share them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share it with others. CMDA is built on many state-of-the-art software packages for web services, provenance, and semantic search.

  4. Extracting scientific articles from a large digital archive: BioStor and the Biodiversity Heritage Library

    PubMed Central

    2011-01-01

    Background The Biodiversity Heritage Library (BHL) is a large digital archive of legacy biological literature, comprising over 31 million pages scanned from books, monographs, and journals. During the digitisation process basic metadata about the scanned items is recorded, but not article-level metadata. Given that the article is the standard unit of citation, this makes it difficult to locate cited literature in BHL. Adding the ability to easily find articles in BHL would greatly enhance the value of the archive. Description A service was developed to locate articles in BHL based on matching article metadata to BHL metadata using approximate string matching, regular expressions, and string alignment. This article locating service is exposed as a standard OpenURL resolver on the BioStor web site http://biostor.org/openurl/. This resolver can be used on the web, or called by bibliographic tools that support OpenURL. Conclusions BioStor provides tools for extracting, annotating, and visualising articles from the Biodiversity Heritage Library. BioStor is available from http://biostor.org/. PMID:21605356
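    As a rough illustration of calling such a resolver, an OpenURL request is just a base URL plus key/value citation metadata. The citation below is hypothetical, and the parameter names (genre/title/volume/spage/date) follow common OpenURL usage; the exact set the BioStor resolver accepts should be checked against its documentation.

```python
from urllib.parse import urlencode

# A hypothetical citation. The parameter names follow common OpenURL
# usage -- check the resolver's documentation for the exact set it accepts.
citation = {
    "genre": "article",
    "title": "On the mammals of Borneo",
    "volume": "1907",
    "spage": "234",
    "date": "1907",
}
openurl = "http://biostor.org/openurl/?" + urlencode(citation)
print(openurl)
```

    A bibliographic tool with OpenURL support would construct and send the same kind of request automatically.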

  5. Advancing Collaboration through Hydrologic Data and Model Sharing

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.

    2015-12-01

    HydroShare is an online, collaborative system for open sharing of hydrologic data, analytical tools, and models. It supports the sharing of and collaboration around "resources" which are defined primarily by standardized metadata, content data models for each resource type, and an overarching resource data model based on the Open Archives Initiative's Object Reuse and Exchange (OAI-ORE) standard and a hierarchical file packaging system called "BagIt". HydroShare expands the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated to include geospatial and multidimensional space-time datasets commonly used in hydrology. HydroShare also includes new capability for sharing models, model components, and analytical tools and will take advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. It also supports web services and server/cloud based computation operating on resources for the execution of hydrologic models and analysis and visualization of hydrologic data. HydroShare uses iRODS as a network file system for underlying storage of datasets and models. Collaboration is enabled by casting datasets and models as "social objects". Social functions include both private and public sharing, formation of collaborative groups of users, and value-added annotation of shared datasets and models. The HydroShare web interface and social media functions were developed using the Django web application framework coupled to iRODS. Data visualization and analysis is supported through the Tethys Platform web GIS software stack. Links to external systems are supported by RESTful web service interfaces to HydroShare's content. This presentation will introduce the HydroShare functionality developed to date and describe ongoing development of functionality to support collaboration and integration of data and models.
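    The BagIt packaging mentioned above is simple enough to sketch: a bag is a directory holding a `bagit.txt` declaration, a `data/` payload directory, and a checksum manifest. The sketch below is not HydroShare code; it creates a minimal bag with the Python standard library, and the full BagIt specification defines further tag files.

```python
import hashlib
import tempfile
from pathlib import Path

def make_minimal_bag(bag_dir: Path, payload: dict) -> None:
    """Create a minimal BagIt-style bag: a bagit.txt declaration, a data/
    payload directory, and an MD5 manifest of the payload. (Layout sketch
    only; the full BagIt specification defines further tag files.)"""
    data_dir = bag_dir / "data"
    data_dir.mkdir(parents=True)
    (bag_dir / "bagit.txt").write_text(
        "BagIt-Version: 0.97\nTag-File-Character-Encoding: UTF-8\n")
    manifest = []
    for name, content in payload.items():
        (data_dir / name).write_bytes(content)
        manifest.append(f"{hashlib.md5(content).hexdigest()}  data/{name}")
    (bag_dir / "manifest-md5.txt").write_text("\n".join(manifest) + "\n")

# Package one small, made-up "resource" file.
bag_root = Path(tempfile.mkdtemp()) / "example_bag"
make_minimal_bag(bag_root, {"readme.txt": b"hydrologic resource\n"})
print(sorted(p.name for p in bag_root.iterdir()))
```

    The manifest lets a recipient verify every payload file, which is what makes the bag a self-describing unit for archival exchange.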

  6. Measuring engagement effectiveness in social media

    NASA Astrophysics Data System (ADS)

    Li, Lei; Sun, Tong; Peng, Wei; Li, Tao

    2012-03-01

    Social media is becoming increasingly prevalent with the advent of web 2.0 technologies. Popular social media websites, such as Twitter and Facebook, are attracting a gigantic number of online users to post and share information. An interesting phenomenon under this trend is that more and more users share their experiences or issues with a product, and product service agents then use commercial social media listening and engagement tools (e.g. Radian6, Sysomos) to respond to users' complaints or issues and help them tackle their problems. This is often called customer care in social media, or social customer relationship management (CRM). However, existing commercial social media tools only provide aggregated trends, patterns and sentiment analysis based on keyword-centric, brand-relevant data, which offer little insight for answering one of the key questions in a social CRM system: how effective is our social customer care engagement? In this paper, we focus on measuring the effectiveness of engagement for service agents in customer care. Traditional CRM effectiveness measurements are defined for the call-center scenario, where effectiveness is mostly based on the duration of each call and/or the number of calls answered per day. Unlike in a call center, in social media we can obtain the detailed conversations between agents and customers, so effectiveness can be measured by analyzing the content of the conversations and the sentiment of customers.

  7. The BioCyc collection of microbial genomes and metabolic pathways.

    PubMed

    Karp, Peter D; Billington, Richard; Caspi, Ron; Fulcher, Carol A; Latendresse, Mario; Kothari, Anamika; Keseler, Ingrid M; Krummenacker, Markus; Midford, Peter E; Ong, Quang; Ong, Wai Kit; Paley, Suzanne M; Subhraveti, Pallavi

    2017-08-17

    BioCyc.org is a microbial genome Web portal that combines thousands of genomes with additional information inferred by computer programs, imported from other databases and curated from the biomedical literature by biologist curators. BioCyc also provides an extensive range of query tools, visualization services and analysis software. Recent advances in BioCyc include an expansion in the content of BioCyc in terms of both the number of genomes and the types of information available for each genome; an expansion in the amount of curated content within BioCyc; and new developments in the BioCyc software tools including redesigned gene/protein pages and metabolite pages; new search tools; a new sequence-alignment tool; a new tool for visualizing groups of related metabolic pathways; and a facility called SmartTables, which enables biologists to perform analyses that previously would have required a programmer's assistance. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. Web Tools: The Second Generation

    ERIC Educational Resources Information Center

    Pascopella, Angela

    2008-01-01

    Web 2.0 tools and technologies, or second generation tools, help districts to save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, which falls under the 21st-century skills. The second-generation tools are growing in popularity…

  9. SporeWeb: an interactive journey through the complete sporulation cycle of Bacillus subtilis.

    PubMed

    Eijlander, Robyn T; de Jong, Anne; Krawczyk, Antonina O; Holsappel, Siger; Kuipers, Oscar P

    2014-01-01

    Bacterial spores are a continuous problem for both food-based and health-related industries. Decades of scientific research dedicated towards understanding molecular and gene regulatory aspects of sporulation, spore germination and spore properties have resulted in a wealth of data and information. To facilitate obtaining a complete overview as well as new insights concerning this complex and tightly regulated process, we have developed a database-driven knowledge platform called SporeWeb (http://sporeweb.molgenrug.nl) that focuses on gene regulatory networks during sporulation in the Gram-positive bacterium Bacillus subtilis. Dynamic features allow the user to navigate through all stages of sporulation with review-like descriptions, schematic overviews on transcriptional regulation and detailed information on all regulators and the genes under their control. The Web site supports data acquisition on sporulation genes and their expression, regulon network interactions and direct links to other knowledge platforms or relevant literature. The information found on SporeWeb (including figures and tables) can and will be updated as new information becomes available in the literature. In this way, SporeWeb offers a novel, convenient and timely reference, an information source and a data acquisition tool that will aid in the general understanding of the dynamics of the complete sporulation cycle.

  10. Infant Gastroesophageal Reflux Information on the World Wide Web.

    PubMed

    Balgowan, Regina; Greer, Leah C; D'Auria, Jennifer P

    2016-01-01

    The purpose of this study was to describe the type and quality of health information about infant gastroesophageal reflux (GER) that a parent may find on the World Wide Web. The data collection tool included evaluation of Web site quality and infant GER-specific content on the 30 sites that met the inclusion criteria. The most commonly found content categories in order of frequency were management strategies, when to call a primary care provider, definition, and clinical features. The most frequently mentioned strategies included feeding changes, infant positioning, and medications. Thirteen of the 30 Web sites included information on both GER and gastroesophageal reflux disease. Mention of the use of medication to lessen infant symptoms was found on 15 of the 30 sites. Only 10 of the 30 sites included information about parent support and coping strategies. Pediatric nurse practitioners (PNPs) should utilize well-child visits to address the normalcy of physiologic infant GER and clarify any misperceptions parents may have about diagnosis and the role of medication from information they may have found on the Internet. It is critical for PNPs to assist in the development of Web sites with accurate content, advise parents on how to identify safe and reliable information, and provide examples of high-quality Web sites about child health topics such as infant GER. Copyright © 2016 National Association of Pediatric Nurse Practitioners. Published by Elsevier Inc. All rights reserved.

  11. Unraveling transcriptional control and cis-regulatory codes using the software suite GeneACT

    PubMed Central

    Cheung, Tom Hiu; Kwan, Yin Lam; Hamady, Micah; Liu, Xuedong

    2006-01-01

    Deciphering gene regulatory networks requires the systematic identification of functional cis-acting regulatory elements. We present a suite of web-based bioinformatics tools, called GeneACT , that can rapidly detect evolutionarily conserved transcription factor binding sites or microRNA target sites that are either unique or over-represented in differentially expressed genes from DNA microarray data. GeneACT provides graphic visualization and extraction of common regulatory sequence elements in the promoters and 3'-untranslated regions that are conserved across multiple mammalian species. PMID:17064417

  12. Redesigning Instruction through Web-based Course Authoring Tools.

    ERIC Educational Resources Information Center

    Dabbagh, Nada H.; Schmitt, Jeff

    1998-01-01

    Examines the pedagogical implications of redesigning instruction for Web-based delivery through a case study of an undergraduate computer science course. Initially designed for a traditional learning environment, this course transformed to a Web-based course using WebCT, a Web-based course authoring tool. Discusses the specific features of WebCT.…

  13. A survey of motif finding Web tools for detecting binding site motifs in ChIP-Seq data

    PubMed Central

    2014-01-01

    Abstract ChIP-Seq (chromatin immunoprecipitation sequencing) has provided an advantage for motif finding, as ChIP-Seq experiments narrow the search down to binding site locations. Recent motif finding tools facilitate motif detection by providing user-friendly Web interfaces. In this work, we reviewed nine motif finding Web tools that are capable of detecting binding site motifs in ChIP-Seq data. We showed that each motif finding Web tool has its own advantages for detecting motifs that other tools may not discover. We recommend that users apply multiple motif finding Web tools that implement different algorithms, in order to obtain significant motifs, overlapping similar motifs, and non-overlapping motifs. Finally, we provide our suggestions for the future development of motif finding Web tools that better assist researchers in finding motifs in ChIP-Seq data. Reviewers This article was reviewed by Prof. Sandor Pongor, Dr. Yuriy Gusev, and Dr. Shyam Prabhakar (nominated by Prof. Limsoon Wong). PMID:24555784

  14. Teaching the principles of health management to first year veterinary students.

    PubMed

    Duffield, Todd; Lissemore, Kerry; Sandals, David

    2003-01-01

    A course called Health Management 1 was created as part of a new DVM curriculum at the Ontario Veterinary College. This full-year course was designed to introduce students to basic concepts of health management, integrating the disciplines of epidemiology, ethology, and public health in the context of selected animal industries. The course comprised 60 lecture hours and four two-hour laboratories. A common definition of health management, incorporating five principles, was used throughout the course in order to reinforce the concepts and maintain continuity between lecture blocks. Unlike in the years prior to the introduction of the new curriculum, epidemiology was presented as a tool of health management rather than as a separate discipline. To supplement the lecture and laboratory material, a Web-based resource was created, and the students were required to review the appropriate section prior to each lecture block. Small quizzes within WebCT, consisting of 10 questions each, were used to stimulate self-directed learning. Overall, the course was well received by the students. The Web resources combined with the WebCT quizzes proved to be an effective method of stimulating students to prepare for lecture.

  15. Outcomes and Utilization of a Low Intensity Workplace Weight Loss Program

    PubMed Central

    Carpenter, Kelly M.; Lovejoy, Jennifer C.; Lange, Jane M.; Hapgood, Jenny E.; Zbikowski, Susan M.

    2014-01-01

    Obesity is related to high health care costs and lost productivity in the workplace. Employers are increasingly sponsoring weight loss and wellness programs to ameliorate these costs. We evaluated weight loss outcomes, treatment utilization, and health behavior change in a low intensity phone- and web-based, employer-sponsored weight loss program. The intervention included three proactive counseling phone calls with a registered dietician and a behavioral health coach as well as a comprehensive website. At six months, one third of those who responded to the follow-up survey had lost a clinically significant amount of weight (≥5% of body weight). Clinically significant weight loss was predicted by the use of both the counseling calls and the website. When examining specific features of the web site, the weight tracking tool was the most predictive of weight loss. Health behavior changes such as eating more fruits and vegetables, increasing physical activity, and reducing stress were all predictive of clinically significant weight loss. Although limited by the low follow-up rate, this evaluation suggests that even low intensity weight loss programs can lead to clinical weight loss for a significant number of participants. PMID:24688791

  16. A Java API for working with PubChem datasets

    PubMed Central

    Southern, Mark R.; Griffin, Patrick R.

    2011-01-01

    Summary: PubChem is a public repository of chemical structures and associated biological activities. The PubChem BioAssay database contains assay descriptions, conditions, readouts, and biological screening results that have been submitted by the biomedical research community. The PubChem web site and Power User Gateway (PUG) web service allow users to interact with the data, and raw files are available via FTP. These resources are helpful to many, but there can also be great benefit in using a software API to manipulate the data. Here, we describe a Java API with entity objects mapped to the PubChem Schema and with wrapper functions for calling the NCBI eUtilities and PubChem PUG web services. PubChem BioAssays and associated chemical compounds can then be queried and manipulated in a local relational database. Features include chemical structure searching and generation and display of curve fits from stored dose–response experiments, something that is not yet available within PubChem itself. The aim is to provide researchers with a fast, consistent, queryable local resource from which to manipulate PubChem BioAssays in a database-agnostic manner. It is not intended as an end user tool but to provide a platform for further automation and tools development. Availability: http://code.google.com/p/pubchemdb Contact: southern@scripps.edu PMID:21216779
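    The dose-response curve fits mentioned above are conventionally based on the four-parameter logistic (Hill) model. The API itself is Java; the sketch below simply illustrates the model in Python, with made-up parameter values.

```python
def four_param_logistic(conc: float, bottom: float, top: float,
                        ec50: float, hill: float) -> float:
    """Four-parameter logistic (Hill) dose-response model: the response
    at a given concentration, rising from `bottom` toward `top`."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill)

# At conc == ec50 the response is halfway between bottom and top.
print(four_param_logistic(1.0, bottom=0.0, top=100.0, ec50=1.0, hill=1.0))
```

    Curve fitting then means estimating bottom, top, EC50, and the Hill slope from the measured readouts at each tested concentration.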

  17. NASA GSFC Space Weather Center - Innovative Space Weather Dissemination: Web-Interfaces, Mobile Applications, and More

    NASA Technical Reports Server (NTRS)

    Maddox, Marlo; Zheng, Yihua; Rastaetter, Lutz; Taktakishvili, A.; Mays, M. L.; Kuznetsova, M.; Lee, Hyesook; Chulaki, Anna; Hesse, Michael; Mullinix, Richard; hide

    2012-01-01

    The NASA GSFC Space Weather Center (http://swc.gsfc.nasa.gov) is committed to providing forecasts, alerts, research, and educational support to address NASA's space weather needs - in addition to the needs of the general space weather community. We provide a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, custom space weather alerts and products, weekly summaries and reports, and most recently - video casts. There are many challenges in providing accurate descriptions of past, present, and expected space weather events - and the Space Weather Center at NASA GSFC employs several innovative solutions to provide access to a comprehensive collection of both observational data, as well as space weather model/simulation data. We'll describe the challenges we've faced with managing hundreds of data streams, running models in real time, storing data, and disseminating it. We'll also highlight several systems and tools that are utilized by the Space Weather Center in our daily operations, all of which are available to the general community as well. These systems and services include a web-based application called the Integrated Space Weather Analysis System (iSWA http://iswa.gsfc.nasa.gov), two mobile space weather applications for both iOS and Android devices, an external API for web-service style access to data, Google Earth-compatible data products, and a downloadable client-based visualization tool.

  18. Scribl: an HTML5 Canvas-based graphics library for visualizing genomic data over the web.

    PubMed

    Miller, Chase A; Anthony, Jon; Meyer, Michelle M; Marth, Gabor

    2013-02-01

    High-throughput biological research requires simultaneous visualization as well as analysis of genomic data, e.g. read alignments, variant calls and genomic annotations. Traditionally, such integrative analysis required desktop applications operating on locally stored data. Many current terabyte-size datasets generated by large public consortia projects, however, are already only feasibly stored at specialist genome analysis centers. As even small laboratories can afford very large datasets, local storage and analysis are becoming increasingly limiting, and it is likely that most such datasets will soon be stored remotely, e.g. in the cloud. These developments will require web-based tools that enable users to access, analyze and view vast remotely stored data with a level of sophistication and interactivity that approximates desktop applications. As rapidly dropping cost enables researchers to collect data intended to answer questions in very specialized contexts, developers must also provide software libraries that empower users to implement customized data analyses and data views for their particular application. Such specialized, yet lightweight, applications would empower scientists to better answer specific biological questions than possible with general-purpose genome browsers currently available. Using recent advances in core web technologies (HTML5), we developed Scribl, a flexible genomic visualization library specifically targeting coordinate-based data such as genomic features, DNA sequence and genetic variants. Scribl simplifies the development of sophisticated web-based graphical tools that approach the dynamism and interactivity of desktop applications. Software is freely available online at http://chmille4.github.com/Scribl/ and is implemented in JavaScript with all modern browsers supported.

  19. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse project called the Accessibility Tools Framework (ACTF), the aim of which is the development of an extensible infrastructure upon which developers can build a variety of utilities that help evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  20. Analysis Tool Web Services from the EMBL-EBI.

    PubMed

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
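    As a rough sketch of REST-style access to one of these services, a dbfetch request is an HTTP GET carrying database, identifier, format, and style parameters. The accession below is an arbitrary example, and the parameter set should be checked against the current dbfetch documentation.

```python
from urllib.parse import urlencode

# Build (but do not send) a dbfetch-style request URL. The db/id/format/
# style parameters follow the dbfetch convention; the UniProtKB accession
# is an arbitrary example.
params = {"db": "uniprotkb", "id": "P05067", "format": "fasta", "style": "raw"}
dbfetch_url = "https://www.ebi.ac.uk/Tools/dbfetch/dbfetch?" + urlencode(params)
print(dbfetch_url)
```

    The sample clients provided by the EMBL-EBI wrap exactly this kind of request, adding retries, polling, and output handling.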

  1. Analysis Tool Web Services from the EMBL-EBI

    PubMed Central

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-01-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods. PMID:23671338

  2. Teaching Web 2.0 technologies using Web 2.0 technologies.

    PubMed

    Rethlefsen, Melissa L; Piorun, Mary; Prince, J Dale

    2009-10-01

    The research evaluated participant satisfaction with the content and format of the "Web 2.0 101: Introduction to Second Generation Web Tools" course and measured the impact of the course on participants' self-evaluated knowledge of Web 2.0 tools. The "Web 2.0 101" online course was based loosely on the Learning 2.0 model. Content was provided through a course blog and covered a wide range of Web 2.0 tools. All Medical Library Association members were invited to participate. Participants were asked to complete a post-course survey. Respondents who completed the entire course or who completed part of the course self-evaluated their knowledge of nine social software tools and concepts prior to and after the course using a Likert scale. Additional qualitative information about course strengths and weaknesses was also gathered. Respondents' self-ratings showed a significant change in perceived knowledge for each tool, using a matched pair Wilcoxon signed rank analysis (P<0.0001 for each tool/concept). Overall satisfaction with the course appeared high. Hands-on exercises were the most frequently identified strength of the course; the length and time-consuming nature of the course were considered weaknesses by some. Learning 2.0-style courses, though demanding time and self-motivation from participants, can increase knowledge of Web 2.0 tools.
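    A matched-pair Wilcoxon signed-rank statistic like the one reported can be computed without any statistics package. The sketch below uses hypothetical pre/post Likert ratings, not the study's data; the study itself would have used a standard statistical tool to obtain the P value as well.

```python
def wilcoxon_w(pre, post):
    """Wilcoxon signed-rank statistic W = min(W+, W-) for matched
    pre/post scores: zero differences are dropped and tied absolute
    differences receive their average rank."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while (j + 1 < len(order)
               and abs(diffs[order[j + 1]]) == abs(diffs[order[i]])):
            j += 1
        for k in range(i, j + 1):          # average of 1-based ranks i+1..j+1
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)

# Hypothetical 1-5 Likert self-ratings before and after the course.
pre = [2, 1, 3, 2, 2, 1, 3, 2]
post = [4, 3, 4, 4, 3, 2, 3, 4]
print(wilcoxon_w(pre, post))  # every change is an improvement, so W = 0
```

    A small W (relative to the number of non-zero pairs) indicates that nearly all changes went in one direction, which is what drives the very small P values reported.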

  3. 75 FR 51058 - Web-Distributed Labeling User Acceptance Pilot

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-18

    ... ENVIRONMENTAL PROTECTION AGENCY [EPA-HQ-OPP-2010-0632; FRL-8840-1] Web-Distributed Labeling User... Pesticide Programs (OPP) is exploring a new initiative called ``web-distributed labeling'' (web-distributed... Internet. Through this Federal Register Notice, OPP is announcing its intention to conduct a web...

  4. Web-based routing assistance tool to reduce pavement damage by overweight and oversize vehicles.

    DOT National Transportation Integrated Search

    2016-10-30

    This report documents the results of a completed project titled Web-Based Routing Assistance Tool to Reduce Pavement Damage by Overweight and Oversize Vehicles. The tasks involved developing a Web-based GIS routing assistance tool and evaluating ...

  5. Designing and Implementing Web-Based Scaffolding Tools for Technology-Enhanced Socioscientific Inquiry

    ERIC Educational Resources Information Center

    Shin, Suhkyung; Brush, Thomas A.; Glazewski, Krista D.

    2017-01-01

    This study explores how web-based scaffolding tools provide instructional support while implementing a socio-scientific inquiry (SSI) unit in a science classroom. This case study focused on how students used web-based scaffolding tools during SSI activities, and how students perceived the SSI unit and the scaffolding tools embedded in the SSI…

  6. iCOSSY: An Online Tool for Context-Specific Subnetwork Discovery from Gene Expression Data

    PubMed Central

    Saha, Ashis; Jeon, Minji; Tan, Aik Choon; Kang, Jaewoo

    2015-01-01

    Pathway analyses help reveal underlying molecular mechanisms of complex biological phenotypes. Biologists tend to perform multiple pathway analyses on the same dataset, as there is no single answer. It is often inefficient for them to implement and/or install all the algorithms by themselves. Online tools can help the community in this regard. Here we present an online gene expression analytical tool called iCOSSY which implements a novel pathway-based COntext-specific Subnetwork discoverY (COSSY) algorithm. iCOSSY also includes a few modifications of COSSY to increase its reliability and interpretability. Users can upload their gene expression datasets, and discover important subnetworks of closely interacting molecules to differentiate between two phenotypes (context). They can also interactively visualize the resulting subnetworks. iCOSSY is a web server that finds subnetworks that are differentially expressed in two phenotypes. Users can visualize the subnetworks to understand the biology of the difference. PMID:26147457

  7. Electrophysiological signal analysis and visualization using Cloudwave for epilepsy clinical research.

    PubMed

    Jayapandian, Catherine P; Chen, Chien-Hung; Bozorgi, Alireza; Lhatoo, Samden D; Zhang, Guo-Qiang; Sahoo, Satya S

    2013-01-01

    Epilepsy is the most common serious neurological disorder, affecting 50-60 million persons worldwide. Electrophysiological data recordings, such as electroencephalogram (EEG), are the gold standard for diagnosis and pre-surgical evaluation in epilepsy patients. The increasing trend towards multi-center clinical studies requires signal visualization and analysis tools that support real-time interaction with signal data in a collaborative environment, which cannot be supported by traditional desktop-based standalone applications. As part of the Prevention and Risk Identification of SUDEP Mortality (PRISM) project, we have developed a Web-based electrophysiology data visualization and analysis platform called Cloudwave using highly scalable open source cloud computing infrastructure. Cloudwave is integrated with the PRISM patient cohort identification tool called MEDCIS (Multi-modality Epilepsy Data Capture and Integration System). The Epilepsy and Seizure Ontology (EpSO) underpins both Cloudwave and MEDCIS to support query composition and result retrieval. Cloudwave is being used by clinicians and research staff at the University Hospital - Case Medical Center (UH-CMC) Epilepsy Monitoring Unit (EMU) and will be progressively deployed at four EMUs in the United States and the United Kingdom as part of the PRISM project.

  8. Real-time access of large volume imagery through low-bandwidth links

    NASA Astrophysics Data System (ADS)

    Phillips, James; Grohs, Karl; Brower, Bernard; Kelly, Lawrence; Carlisle, Lewis; Pellechia, Matthew

    2010-04-01

    Providing current, time-sensitive imagery and geospatial information to deployed tactical military forces or first responders continues to be a challenge. This challenge is compounded through rapid increases in sensor collection volumes, both with larger arrays and higher temporal capture rates. Focusing on the needs of these military forces and first responders, ITT developed a system called AGILE (Advanced Geospatial Imagery Library Enterprise) Access as an innovative approach based on standard off-the-shelf techniques to solving this problem. The AGILE Access system is based on commercial software called Image Access Solutions (IAS) and incorporates standard JPEG 2000 processing. Our solution system is implemented in an accredited, deployable form, incorporating a suite of components, including an image database, a web-based search and discovery tool, and several software tools that act in concert to process, store, and disseminate imagery from airborne systems and commercial satellites. Currently, this solution is operational within the U.S. Government tactical infrastructure and supports disadvantaged imagery users in the field. This paper presents the features and benefits of this system to disadvantaged users as demonstrated in real-world operational environments.

  9. ClimateWizard: A Framework and Easy-to-Use Web-Mapping Tool for Global, Regional, and Local Climate-Change Analysis

    NASA Astrophysics Data System (ADS)

    Girvetz, E. H.; Zganjar, C.; Raber, G. T.; Hoekstra, J.; Lawler, J. J.; Kareiva, P.

    2008-12-01

    Now that there is overwhelming evidence of global climate change, scientists, managers and planners (i.e. practitioners) need to assess the potential impacts of climate change on particular ecological systems, within specific geographic areas, and at spatial scales they care about, in order to make better land management, planning, and policy decisions. Unfortunately, this application of climate science to real world decisions and planning has proceeded too slowly because we lack tools for translating cutting-edge climate science and climate-model outputs into something managers and planners can work with at local or regional scales (CCSP 2008). To help increase the accessibility of climate information, we have developed a freely-available, easy-to-use, web-based climate-change analysis toolbox, called ClimateWizard, for assessing how climate has changed and how it is projected to change at specific geographic locations throughout the world. The ClimateWizard uses geographic information systems (GIS), web services (SOAP/XML), statistical analysis platforms (e.g. R-project), and web-based mapping services (e.g. Google Earth/Maps, KML/GML) to provide a variety of different analyses (e.g. trends and departures) and outputs (e.g. maps, graphs, tables, GIS layers). Because ClimateWizard analyzes large climate datasets stored remotely on powerful computers, users of the tool do not need to have fast computers or expensive software, but simply need access to the internet. The analysis results are then provided to users in a Google Maps webpage tailored to the specific climate-change question being asked. The ClimateWizard is not a static product, but rather a framework to be built upon and modified to suit the purposes of specific scientific, management, and policy questions. For example, it can be expanded to include bioclimatic variables (e.g. evapotranspiration) and marine data (e.g. sea surface temperature), as well as improved future climate projections, and climate-change impact analyses involving hydrology, vegetation, wildfire, disease, and food security. By harnessing the power of computer and web-based technologies, the ClimateWizard puts local, regional, and global climate-change analyses in the hands of a wider array of managers, planners, and scientists.
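The "trends and departures" analyses that the abstract mentions can be illustrated with a minimal sketch (an assumption about the kind of computation involved, not ClimateWizard's actual code): a least-squares trend in a climate variable over time, and departures (anomalies) from a baseline-period mean:

```python
def linear_trend(years, values):
    """Least-squares slope: change in the variable per year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

def departures(values, baseline):
    """Anomaly of each value relative to the baseline-period mean."""
    base_mean = sum(baseline) / len(baseline)
    return [v - base_mean for v in values]
```

A web tool like the one described would run such computations server-side against the full gridded dataset and return only maps, graphs, and tables to the browser.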

  10. KnowledgePuzzle: A Browsing Tool to Adapt the Web Navigation Process to the Learner's Mental Model

    ERIC Educational Resources Information Center

    AlAgha, Iyad

    2012-01-01

    This article presents KnowledgePuzzle, a browsing tool for knowledge construction from the web. It aims to adapt the structure of web content to the learner's information needs regardless of how the web content is originally delivered. Learners are provided with a meta-cognitive space (e.g., a concept mapping tool) that enables them to plan…

  11. Older Cancer Patients’ User Experiences With Web-Based Health Information Tools: A Think-Aloud Study

    PubMed Central

    Romijn, Geke; Smets, Ellen M A; Loos, Eugene F; Kunneman, Marleen; van Weert, Julia C M

    2016-01-01

    Background Health information is increasingly presented on the Internet. Several Web design guidelines for older Web users have been proposed; however, these guidelines are often not applied in website development. Furthermore, although we know that older individuals use the Internet to search for health information, we lack knowledge on how they use and evaluate Web-based health information. Objective This study evaluates user experiences with existing Web-based health information tools among older (≥ 65 years) cancer patients and survivors and their partners. The aim was to gain insight into usability issues and the perceived usefulness of cancer-related Web-based health information tools. Methods We conducted video-recorded think-aloud observations for 7 Web-based health information tools, specifically 3 websites providing cancer-related information, 3 Web-based question prompt lists (QPLs), and 1 values clarification tool, with colorectal cancer patients or survivors (n=15) and their partners (n=8) (median age: 73; interquartile range 70-79). Participants were asked to think aloud while performing search, evaluation, and application tasks using the Web-based health information tools. Results Overall, participants perceived Web-based health information tools as highly useful and indicated a willingness to use such tools. However, they experienced problems in terms of usability and perceived usefulness due to difficulties in using navigational elements, shortcomings in the layout, a lack of instructions on how to use the tools, difficulties with comprehensibility, and a large amount of variety in terms of the preferred amount of information. Although participants frequently commented that it was easy for them to find requested information, we observed that the large majority of the participants were not able to find it. Conclusions Overall, older cancer patients appreciate and are able to use cancer information websites. 
However, this study shows the importance of keeping age-related problems, such as cognitive and functional decline and navigation difficulties, in mind when designing for this target group. The results of this study can be used to design usable and useful Web-based health information tools for older (cancer) patients. PMID:27457709

  12. Developing a Computational Environment for Coupling MOR Data, Maps, and Models: The Virtual Research Vessel (VRV) Prototype

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; O'Dea, E.; Cushing, J. B.; Cuny, J. E.; Toomey, D. R.; Hackett, K.; Tikekar, R.

    2001-12-01

    The East Pacific Rise (EPR) from 9-10deg. N is currently our best-studied section of fast-spreading mid-ocean ridge. During several decades of investigation it has been explored by the full spectrum of ridge investigators, including chemists, biologists, geologists and geophysicists. These studies, and those that are ongoing, provide a wealth of observational data, results and data-driven theoretical (often numerical) studies that have not yet been fully utilized either by research scientists or by professional educators. While the situation is improving, a large amount of data, results, and related theoretical models still exist either in an inert, non-interactive form (e.g., journal publications) or as unlinked and currently incompatible computer data or algorithms. Infrastructure is needed not just for ready access to data, but for linkage of disparate data sets (data to data) as well as data to models, in order to quantitatively evaluate hypotheses, refine numerical simulations, and explore new relations between observables. The prototype of a computational environment and toolset, called the Virtual Research Vessel (VRV), is being developed to provide scientists and educators with ready access to data, results and numerical models. While this effort is focused on the EPR 9N region, the resulting software tools and infrastructure should be helpful in establishing similar systems for other sections of the global mid-ocean ridge. 
Work in progress includes efforts to develop: (1) a virtual database to incorporate diverse data types with domain-specific metadata into a global schema that allows web-query across different marine geology data sets, and an analogous declarative (database available) description of tools and models; (2) the ability to move data between GIS and the above DBMS, and tools to encourage data submission to archives; (3) tools for finding and viewing archives, and translating between formats; (4) support for "computational steering" (tool composition) and model coupling (e.g., ability to run tool composition locally but access input data from the web, APIs to support coupling such as invoking programs that are running remotely, and help in writing data wrappers to publish programs); (5) support of migration paths for prototyped model coupling; and (6) export of marine geological data and data analysis to the undergraduate classroom (VRV-ET, "Educational Tool"). See the main VRV web site at http://oregonstate.edu/dept/vrv and the VRV-ET web site at http://www.cs.uoregon.edu/research/vrv-et.

  13. World Wide Web Pages--Tools for Teaching and Learning.

    ERIC Educational Resources Information Center

    Beasley, Sarah; Kent, Jean

    Created to help educators incorporate World Wide Web pages into teaching and learning, this collection of Web pages presents resources, materials, and techniques for using the Web. The first page focuses on tools for teaching and learning via the Web, providing pointers to sites containing the following: (1) course materials for both distance and…

  14. Does your web site draw new patients?

    PubMed

    Wallin, Wendy S

    2009-11-01

    The absence of scientific data forces orthodontists to guess at how best to design Internet sites that persuade prospective patients to call for appointments. This study was conducted to identify the Web-site factors that lead prospective patients to make appointments or, conversely, to reject a practice. Ten participants actively looking online for an orthodontist were recruited to participate. They reviewed 64 orthodontic Web sites in their geographic areas and rated their likelihood of calling each practice for an appointment. The sessions were videotaped. Analysis of participant comments, navigation patterns, and ratings suggested 25 distinguishing factors. Statistical analysis showed 10 Web-site characteristics that predict the success of an orthodontic Web site in attracting new patients.

  15. Mobile Learning Approaches for U.S. Army Training

    DTIC Science & Technology

    2010-08-01

    2.0 tools on smartphones may promote student-centered learning pedagogies (e.g., Cochrane & Bateman, 2010) and provide learners with more fruitful...and effective relationships with their instructors and peers.1 That is, Web 2.0 tools facilitate learners' creative practices, participation...1 Web 1.0 tools focused on presenting information to users whereas Web 2.0 tools focused on providing social networking

  16. Using Collaborative Simulation Modeling to Develop a Web-Based Tool to Support Policy-Level Decision Making About Breast Cancer Screening Initiation Age

    PubMed Central

    Burnside, Elizabeth S.; Lee, Sandra J.; Bennette, Carrie; Near, Aimee M.; Alagoz, Oguzhan; Huang, Hui; van den Broek, Jeroen J.; Kim, Joo Yeon; Ergun, Mehmet A.; van Ravesteyn, Nicolien T.; Stout, Natasha K.; de Koning, Harry J.; Mandelblatt, Jeanne S.

    2017-01-01

    Background There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. Objective To use three established simulation models to develop a web-based tool called Mammo OUTPuT. Methods The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. Results The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. Conclusions This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms. PMID:29376135

  17. MetalS(3), a database-mining tool for the identification of structurally similar metal sites.

    PubMed

    Valasatava, Yana; Rosato, Antonio; Cavallaro, Gabriele; Andreini, Claudia

    2014-08-01

    We have developed a database search tool to identify metal sites having structural similarity to a query metal site structure within the MetalPDB database of minimal functional sites (MFSs) contained in metal-binding biological macromolecules. MFSs describe the local environment around the metal(s) independently of the larger context of the macromolecular structure. Such a local environment has a determinant role in tuning the chemical reactivity of the metal, ultimately contributing to the functional properties of the whole system. The database search tool, which we called MetalS(3) (Metal Sites Similarity Search), can be accessed through a Web interface at http://metalweb.cerm.unifi.it/tools/metals3/ . MetalS(3) uses a suitably adapted version of an algorithm that we previously developed to systematically compare the structure of the query metal site with each MFS in MetalPDB. For each MFS, the best superposition is kept. All these superpositions are then ranked according to the MetalS(3) scoring function and are presented to the user in tabular form. The user can interact with the output Web page to visualize the structural alignment or the sequence alignment derived from it. Options to filter the results are available. Test calculations show that the MetalS(3) output correlates well with expectations from protein homology considerations. Furthermore, we describe some usage scenarios that highlight the usefulness of MetalS(3) to obtain mechanistic and functional hints regardless of homology.

  18. Discovering Authorities and Hubs in Different Topological Web Graph Structures.

    ERIC Educational Resources Information Center

    Meghabghab, George

    2002-01-01

    Discussion of citation analysis on the Web considers Web hyperlinks as a source to analyze citations. Topics include basic graph theory applied to Web pages, including matrices, linear algebra, and Web topology; and hubs and authorities, including a search technique called HITS (Hyperlink Induced Topic Search). (Author/LRW)
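The HITS technique named in the record above can be illustrated with a short power-iteration sketch (a minimal illustration of Kleinberg's hubs-and-authorities idea, not the original implementation). Here `graph` maps each page to the list of pages it links to; authority scores flow in along hyperlinks, hub scores flow out:

```python
def hits(graph, iterations=50):
    """Compute hub and authority scores for a hyperlink graph.

    graph: dict mapping page -> list of pages it links to.
    Returns (hub, auth) dicts, each L2-normalized.
    """
    pages = set(graph) | {t for targets in graph.values() for t in targets}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # Authority score: sum of hub scores of pages linking to p
        auth = {p: sum(hub[q] for q in graph if p in graph.get(q, []))
                for p in pages}
        norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
        auth = {p: v / norm for p, v in auth.items()}
        # Hub score: sum of authority scores of pages p links to
        hub = {p: sum(auth[t] for t in graph.get(p, [])) for p in pages}
        norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
        hub = {p: v / norm for p, v in hub.items()}
    return hub, auth
```

For example, if pages `a` and `b` both link to `c`, the iteration converges with `c` as the sole authority and `a` and `b` as equal hubs.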

  19. The Availability of Web 2.0 Tools from Community College Libraries' Websites Serving Large Student Bodies

    ERIC Educational Resources Information Center

    Blummer, Barbara; Kenton, Jeffrey M.

    2014-01-01

    Web 2.0 tools offer academic libraries new avenues for delivering services and resources to students. In this research we report on a content analysis of 100 US community college libraries' Websites for the availability of Web 2.0 applications. We found Web 2.0 tools utilized by 97% of our sample population and many of these sites contained more…

  20. KymoKnot: A web server and software package to identify and locate knots in trajectories of linear or circular polymers.

    PubMed

    Tubiana, Luca; Polles, Guido; Orlandini, Enzo; Micheletti, Cristian

    2018-06-07

    The KymoKnot software package and web server identifies and locates physical knots or proper knots in a series of polymer conformations. It is mainly intended as an analysis tool for trajectories of linear or circular polymers, but it can be used on single instances too, e.g. protein structures in PDB format. A key element of the software package is the so-called minimally interfering chain closure algorithm that is used to detect physical knots in open chains and to locate the knotted region in both open and closed chains. The web server offers a user-friendly graphical interface that identifies the knot type and highlights the knotted region on each frame of the trajectory, which the user can visualize interactively from various viewpoints. The dynamical evolution of the knotted region along the chain contour is presented as a kymograph. All data can be downloaded in text format. The KymoKnot package is licensed under the BSD 3-Clause licence. The server is publicly available at http://kymoknot.sissa.it/kymoknot/interactive.php .

  1. A Java Program for the Viewing of Ambient Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisher, A.

    2004-09-03

    The BaBar detector system has a large number of sensors and data feeds, called ambient feeds. These feeds are vital to the operation and monitoring of the Pep rings and the BaBar system. In order to more easily monitor these systems, a simple web interface to display graphical representations of the data is needed. Towards this end it was decided that using a Java Web Servlet (Sun Microsystems) would be an effective and simple way to achieve this. Combining Web Servlets with Corba technology (OMG) provides a way for many people to access data from anywhere in the world. Using this type of program in conjunction with the HEP AIDA systems for graphing makes a powerful tool for monitoring the BaBar system. An overview of the BaBar system and of the ambient data is given. The uses and limitations of this method for viewing the data, as well as examples of other ways to access the data and potential other uses for the servlet, are also discussed.

  2. Zooniverse - A Platform for Data-Driven Citizen Science

    NASA Astrophysics Data System (ADS)

    Smith, A.; Lintott, C.; Bamford, S.; Fortson, L.

    2011-12-01

    In July 2007 a team of astrophysicists created a web-based astronomy project called Galaxy Zoo in which members of the public were asked to classify galaxies from the Sloan Digital Sky Survey by their shape. Over the following year a community of more than 150,000 people classified each of the 1 million galaxies more than 50 times. Four years later this community of 'citizen scientists' is more than 450,000 strong and is contributing their time and efforts to more than 10 Zooniverse projects, each with its own science team and research case. With projects ranging from transcribing ancient Greek texts (ancientlives.org) to lunar science (moonzoo.org), the challenges to the Zooniverse community have gone well beyond the relatively simple original Galaxy Zoo interface. Delivering a range of citizen science projects to a large web-based audience presents challenges on a number of fronts, including interface design, data architecture/modelling and reduction techniques, web infrastructure and software design. In this paper we describe how the Zooniverse team (a collaboration of scientists, software developers and educators) has developed tools and techniques to solve some of these issues.

  3. MetaSEEk: a content-based metasearch engine for images

    NASA Astrophysics Data System (ADS)

    Beigi, Mandis; Benitez, Ana B.; Chang, Shih-Fu

    1997-12-01

    Search engines are the most powerful resources for finding information on the rapidly expanding World Wide Web (WWW). Finding the desired search engines and learning how to use them, however, can be very time consuming. The integration of such search tools enables users to access information across the world in a transparent and efficient manner. These systems are called meta-search engines. The recent emergence of visual information retrieval (VIR) search engines on the web is leading to the same efficiency problem. This paper describes and evaluates MetaSEEk, a content-based meta-search engine used for finding images on the Web based on their visual information. MetaSEEk is designed to intelligently select and interface with multiple on-line image search engines by ranking their performance for different classes of user queries. User feedback is also integrated into the ranking refinement. We compare MetaSEEk with a baseline version of the meta-search engine, which does not use the past performance of the different search engines in recommending target search engines for future queries.
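The feedback-driven ranking idea described in this record can be sketched as follows. This is a hypothetical simplification (the `EngineRanker` name and the mean-feedback score are assumptions, not MetaSEEk's actual scheme): user feedback per query class accumulates into a per-engine average that orders future engine selections:

```python
class EngineRanker:
    """Rank target search engines per query class from accumulated user
    feedback. Illustrative sketch only, not MetaSEEk's actual algorithm."""

    def __init__(self):
        # (query_class, engine) -> [feedback_total, feedback_count]
        self.scores = {}

    def record_feedback(self, query_class, engine, rating):
        """Accumulate a user rating (e.g. +1 relevant, -1 irrelevant)."""
        total, count = self.scores.get((query_class, engine), [0.0, 0])
        self.scores[(query_class, engine)] = [total + rating, count + 1]

    def rank(self, query_class, engines):
        """Order candidate engines by mean past feedback for this class."""
        def mean(engine):
            total, count = self.scores.get((query_class, engine), [0.0, 0])
            return total / count if count else 0.0
        return sorted(engines, key=mean, reverse=True)
```

Because Python's sort is stable, engines with no recorded feedback keep their original order, which stands in for the "no past performance" baseline the paper compares against.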

  4. ROSA: Resource-Oriented Service Management Schemes for Web of Things in a Smart Home.

    PubMed

    Liao, Chun-Feng; Chen, Peng-Yu

    2017-09-21

    A pervasive-computing-enriched smart home environment, which contains many embedded and tiny intelligent devices and sensors coordinated by service management mechanisms, is capable of anticipating occupants' intentions and providing appropriate services accordingly. Although there is a wealth of research achievements in recent years, the degree of market acceptance is still low. The main reason is that most of the devices and services in such environments depend on a particular platform or technology, making it hard to develop an application by composing the devices or services. Meanwhile, the concept of the Web of Things (WoT) has recently become popular. Based on WoT, developers can build applications using popular web tools or technologies. Consequently, the objective of this paper is to propose a set of novel WoT-driven plug-and-play service management schemes for a smart home, called Resource-Oriented Service Administration (ROSA). We have implemented an application prototype, and experiments are performed to show the effectiveness of the proposed approach. The results of this research can be a foundation for realizing the vision of "end user programmable smart environments".

  5. Answering the Call of the Web: UVA Crafts an Innovative Web Certification Program for Its Staff.

    ERIC Educational Resources Information Center

    Lee, Sandra T.

    2000-01-01

    Describes the development of a Web Certification Program at the University of Virginia. This program offers certificates at three levels: Web Basics, Web Designer, and Web Master. The paper focuses on: determination of criteria for awarding certificates; program status; program evaluation and program effectiveness; and future plans for the Web…

  6. Gene calling and bacterial genome annotation with BG7.

    PubMed

    Tobes, Raquel; Pareja-Tobes, Pablo; Manrique, Marina; Pareja-Tobes, Eduardo; Kovach, Evdokim; Alekhin, Alexey; Pareja, Eduardo

    2015-01-01

    New massive sequencing technologies are providing many bacterial genome sequences from diverse taxa, but a refined annotation of these genomes is crucial for obtaining scientific findings and new knowledge. Thus, bacterial genome annotation has emerged as a key point to investigate in bacteria. Any efficient tool designed specifically to annotate bacterial genomes sequenced with massively parallel technologies has to consider the specific features of bacterial genomes (absence of introns and scarcity of nonprotein-coding sequence) and of next-generation sequencing (NGS) technologies (presence of errors and not perfectly assembled genomes). These features make it convenient to focus on coding regions and, hence, on protein sequences, which are the elements directly related with biological functions. In this chapter we describe how to annotate bacterial genomes with BG7, an open-source tool based on a protein-centered gene calling/annotation paradigm. BG7 is specifically designed for the annotation of bacterial genomes sequenced with NGS. The tool is sequence-error tolerant, maintaining its annotation capabilities for highly fragmented genomes or for mixed sequences coming from several genomes (such as those obtained from metagenomic samples). BG7 has been designed with scalability as a requirement, with a computing infrastructure completely based on cloud computing (Amazon Web Services).

  7. A journey to Semantic Web query federation in the life sciences.

    PubMed

    Cheung, Kei-Hoi; Frost, H Robert; Marshall, M Scott; Prud'hommeaux, Eric; Samwald, Matthias; Zhao, Jun; Paschke, Adrian

    2009-10-01

    As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing. A variety of technological approaches including triplestore technologies, SPARQL endpoints, Linked Data, and Vocabulary of Interlinked Datasets have emerged in recent years. In addition to the data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research. Particularly, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against the remote databases offering either SPARQL or SQL query interfaces. Finally, we have explored how to use the vocabulary of interlinked Datasets (voiD) to create metadata for describing datasets exposed as Linked Data URIs or SPARQL endpoints. 
We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community.

  8. A journey to Semantic Web query federation in the life sciences

    PubMed Central

    Cheung, Kei-Hoi; Frost, H Robert; Marshall, M Scott; Prud'hommeaux, Eric; Samwald, Matthias; Zhao, Jun; Paschke, Adrian

    2009-01-01

    Background: As interest in adopting the Semantic Web in the biomedical domain continues to grow, Semantic Web technology has been evolving and maturing. A variety of technological approaches including triplestore technologies, SPARQL endpoints, Linked Data, and Vocabulary of Interlinked Datasets have emerged in recent years. In addition to data warehouse construction, these technological approaches can be used to support dynamic query federation. As a community effort, the BioRDF task force, within the Semantic Web for Health Care and Life Sciences Interest Group, is exploring how these emerging approaches can be utilized to execute distributed queries across different neuroscience data sources. Methods and results: We have created two health care and life science knowledge bases. We have explored a variety of Semantic Web approaches to describe, map, and dynamically query multiple datasets. We have demonstrated several federation approaches that integrate diverse types of information about neurons and receptors that play an important role in basic, clinical, and translational neuroscience research. In particular, we have created a prototype receptor explorer which uses OWL mappings to provide an integrated list of receptors and executes individual queries against different SPARQL endpoints. We have also employed the AIDA Toolkit, which is directed at groups of knowledge workers who cooperatively search, annotate, interpret, and enrich large collections of heterogeneous documents from diverse locations. We have explored a tool called "FeDeRate", which enables a global SPARQL query to be decomposed into subqueries against the remote databases offering either SPARQL or SQL query interfaces. Finally, we have explored how to use the Vocabulary of Interlinked Datasets (voiD) to create metadata for describing datasets exposed as Linked Data URIs or SPARQL endpoints.
Conclusion: We have demonstrated the use of a set of novel and state-of-the-art Semantic Web technologies in support of a neuroscience query federation scenario. We have identified both the strengths and weaknesses of these technologies. While the Semantic Web offers a global data model including the use of Uniform Resource Identifiers (URIs), the proliferation of semantically equivalent URIs hinders large-scale data integration. Our work helps direct research and tool development, which will be of benefit to this community. PMID:19796394
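    The subquery decomposition idea behind a federation tool like "FeDeRate" can be sketched in a few lines. The endpoint URLs, triple patterns, and grouping rule below are invented for illustration; a real federator would also plan the join order and ship variable bindings between sources.

```python
# Hypothetical sketch of federated-query decomposition: a global set of
# triple patterns is partitioned by the endpoint that can answer each
# pattern, yielding one SPARQL subquery per remote source.
from collections import defaultdict

def decompose(patterns):
    """patterns: list of (endpoint_url, triple_pattern_string) pairs."""
    groups = defaultdict(list)
    for endpoint, pattern in patterns:
        groups[endpoint].append(pattern)
    # One SELECT * subquery per endpoint; join planning is omitted here.
    return {ep: "SELECT * WHERE { " + " . ".join(pats) + " }"
            for ep, pats in groups.items()}

patterns = [
    ("http://example.org/neurons/sparql", "?n a :Neuron"),
    ("http://example.org/neurons/sparql", "?n :expresses ?r"),
    ("http://example.org/receptors/sparql", "?r :bindsLigand ?l"),
]
subqueries = decompose(patterns)
```

Each subquery would then be posted to its endpoint, and the partial results joined on the shared variable (here `?r`).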

  9. Increasing efficiency of information dissemination and collection through the World Wide Web

    Treesearch

    Daniel P. Huebner; Malchus B. Baker; Peter F. Ffolliott

    2000-01-01

    Researchers, managers, and educators have access to revolutionary technology for information transfer through the World Wide Web (Web). Using the Web to effectively gather and distribute information is addressed in this paper. Tools, tips, and strategies are discussed. Companion Web sites are provided to guide users in selecting the most appropriate tool for searching...

  10. Developing a search engine for pharmacotherapeutic information that is not published in biomedical journals.

    PubMed

    Do Pazo-Oubiña, F; Calvo Pita, C; Puigventós Latorre, F; Periañez-Párraga, L; Ventayol Bosch, P

    2011-01-01

    To identify publishers of pharmacotherapeutic information not found in biomedical journals that focus on evaluating and providing advice on medicines, and to develop a search engine to access this information. Compiling web sites that publish information on the rational use of medicines and have no commercial interests. Free-access web sites in Spanish, Galician, Catalan or English. Designing a search engine using the Google "custom search" application. Overall, 159 internet addresses were compiled and classified into 9 labels. We were able to recover the information from the selected sources using a search engine called "AlquimiA", available at http://www.elcomprimido.com/FARHSD/AlquimiA.htm. The main sources of pharmacotherapeutic information not published in biomedical journals were identified. The search engine is a useful tool for searching and accessing "grey literature" on the internet. Copyright © 2010 SEFH. Published by Elsevier Espana. All rights reserved.

  11. Eight ways to go "e" without launching a Web service.

    PubMed

    Toth, C L; Goldstein, D

    2000-01-01

    With more than 100 million Americans on the Internet, the physician-patient relationship is evolving. The Net brings with it unlimited potential for communication and information dissemination, but these benefits do come with some side effects. Not everything on the Net is reliable or credible, but many patients don't understand that. For this reason and others, physicians must involve themselves with online communication and e-commerce. There are many realistic and practical ways for physicians to do this, and not all of them involve building a Web site. Sending patients clinical information in a cost-effective manner, improving receivables management, and decreasing the volume of phone calls are just some of the many ways the Net can enhance your practice. Think of the Net as just another business tool--not a replacement for physician-patient relationships. Like the telephone before it, the Net affords improved communication, patient education, and relationship-building with patients.

  12. ATLAS Live: Collaborative Information Streams

    NASA Astrophysics Data System (ADS)

    Goldfarb, Steven; ATLAS Collaboration

    2011-12-01

    I report on a pilot project launched in 2010 focusing on facilitating communication and information exchange within the ATLAS Collaboration, through the combination of digital signage software and webcasting. The project, called ATLAS Live, implements video streams of information, ranging from detailed detector and data status to educational and outreach material. The content, including text, images, video and audio, is collected, visualised and scheduled using digital signage software. The system is robust and flexible, utilizing scripts to input data from remote sources, such as the CERN Document Server, Indico, or any available URL, and to integrate these sources into professional-quality streams, including text scrolling, transition effects, and inter- and intra-screen divisibility. Information is published via the encoding and webcasting of standard video streams, viewable on all common platforms, using a web browser or other common video tool. Authorisation is enforced at the level of the streaming and at the web portals, using the CERN SSO system.

  13. Cloud-Based Tools to Support High-Resolution Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Swain, N.; Christensen, S.

    2013-12-01

    The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52°North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archival, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.

  14. Do We Really Know how Much it Costs to Construct High Performance Buildings?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livingston, Olga V.; Dillon, Heather E.; Halverson, Mark A.

    2012-08-31

    Understanding the cost of energy efficient construction is critical to decision makers in building design, code development, and energy analysis. How much does it cost to upgrade from R-13 to R-19 in a building wall? How much do low-e windows really cost? Can we put a dollar figure on commissioning? Answers to these questions have a fuzzy nature, based on educated guesses and industry lore. The response depends on location, perspective, bulk buying, and hand waving. This paper explores the development of a web tool intended to serve as a publicly available repository of building component costs. In 2011 the U.S. Department of Energy (DOE) funded the launch of a web tool called the Building Component Cost Community (BC3), dedicated to publishing building component costs from documented sources, actively gathering verifiable cost data from the users, and collecting feedback from a wide range of participants on the quality of the posted cost data. The updated BC3 database, available at http://bc3.pnnl.gov, went live on April 30, 2012. BC3 serves as the ultimate source of the energy-related component costs for DOE’s residential code development activities, including cost-effectiveness analyses. The paper discusses BC3 objectives, structure, functionality and the current content of the database. It aims to facilitate a dialog about the lack of verifiable transparent cost data, as well as introduce a web tool that helps to address the problem. The questions posed above will also be addressed by this paper, but they have to be resolved by the user community by providing feedback and cost data to the BC3 database, thus increasing transparency and removing information asymmetry.

  15. RCQ-GA: RDF Chain Query Optimization Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Hogenboom, Alexander; Milea, Viorel; Frasincar, Flavius; Kaymak, Uzay

    The application of Semantic Web technologies in an Electronic Commerce environment implies a need for good support tools. Fast query engines are needed for efficient querying of large amounts of data, usually represented using RDF. We focus on optimizing a special class of SPARQL queries, the so-called RDF chain queries. For this purpose, we devise a genetic algorithm called RCQ-GA that determines the order in which joins need to be performed for an efficient evaluation of RDF chain queries. The approach is benchmarked against a two-phase optimization algorithm previously proposed in the literature. The more complex a query is, the more RCQ-GA outperforms the benchmark in solution quality, execution time needed, and consistency of solution quality. When the algorithms are constrained by a time limit, the overall performance of RCQ-GA compared to the benchmark further improves.
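    The permutation-encoded search that a join-order genetic algorithm like RCQ-GA performs can be illustrated with a toy sketch. The cost model, fixed selectivity, operators, and cardinalities below are invented for illustration and are not the authors' implementation.

```python
# Toy genetic algorithm for join ordering: a chromosome is a permutation
# of relations; fitness is an (assumed) intermediate-result-size cost.
import random

def cost(order, card):
    # Toy cost: sum of intermediate result sizes under a fixed selectivity.
    total, size = 0.0, card[order[0]]
    for rel in order[1:]:
        size = size * card[rel] * 0.01  # assumed join selectivity of 0.01
        total += size
    return total

def evolve(card, pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    rels = list(card)
    pop = [rng.sample(rels, len(rels)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda o: cost(o, card))     # elitist selection
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(len(rels)), 2)  # swap mutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda o: cost(o, card))

cardinalities = {"A": 1000, "B": 10, "C": 500, "D": 2}
best = evolve(cardinalities)
```

Under this toy cost model the algorithm tends toward orderings that join the small relations first, which is the intuition behind searching the join-order space rather than enumerating it exhaustively.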

  16. STOP-IT: Windows executable software for the stop-signal paradigm.

    PubMed

    Verbruggen, Frederick; Logan, Gordon D; Stevens, Michaël A

    2008-05-01

    The stop-signal paradigm is a useful tool for the investigation of response inhibition. In this paradigm, subjects are instructed to respond as fast as possible to a stimulus unless a stop signal is presented after a variable delay. However, programming the stop-signal task is typically considered to be difficult. To overcome this issue, we present software called STOP-IT, for running the stop-signal task, as well as an additional analyzing program called ANALYZE-IT. The main advantage of both programs is that they are precompiled executables, and for basic use there is no need for additional programming. STOP-IT and ANALYZE-IT are completely based on free software, are distributed under the GNU General Public License, and are available at the personal Web sites of the first two authors or at expsy.ugent.be/tscope/stop.html.
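    The paradigm's key statistic, stop-signal reaction time (SSRT), is commonly estimated with the integration method: the nth percentile of the go-RT distribution (where n is the probability of responding on stop trials) minus the mean stop-signal delay. The sketch below uses invented data and a simplified quantile computation; it is not ANALYZE-IT's actual code.

```python
# Integration-method SSRT estimate (simplified):
# SSRT = go-RT at the p(respond|signal) quantile - mean stop-signal delay.
def ssrt_integration(go_rts, mean_ssd, p_respond):
    rts = sorted(go_rts)
    # RT at the p_respond quantile of the go distribution (simple index rule)
    index = min(int(p_respond * len(rts)), len(rts) - 1)
    return rts[index] - mean_ssd

go_rts = [380, 400, 410, 420, 430, 440, 450, 470, 490, 520]  # ms, invented
ssrt = ssrt_integration(go_rts, mean_ssd=200.0, p_respond=0.5)  # -> 240.0 ms
```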

  17. ESCAPE: database for integrating high-content published data collected from human and mouse embryonic stem cells.

    PubMed

    Xu, Huilei; Baroukh, Caroline; Dannenfelser, Ruth; Chen, Edward Y; Tan, Christopher M; Kou, Yan; Kim, Yujin E; Lemischka, Ihor R; Ma'ayan, Avi

    2013-01-01

    High content studies that profile mouse and human embryonic stem cells (m/hESCs) using various genome-wide technologies such as transcriptomics and proteomics are constantly being published. However, efforts to integrate such data to obtain a global view of the molecular circuitry in m/hESCs are lagging behind. Here, we present an m/hESC-centered database called Embryonic Stem Cell Atlas from Pluripotency Evidence integrating data from many recent diverse high-throughput studies including chromatin immunoprecipitation followed by deep sequencing, genome-wide inhibitory RNA screens, gene expression microarrays or RNA-seq after knockdown (KD) or overexpression of critical factors, immunoprecipitation followed by mass spectrometry proteomics and phosphoproteomics. The database provides web-based interactive search and visualization tools that can be used to build subnetworks and to identify known and novel regulatory interactions across various regulatory layers. The web-interface also includes tools to predict the effects of combinatorial KDs by additive effects controlled by sliders, or through simulation software implemented in MATLAB. Overall, the Embryonic Stem Cell Atlas from Pluripotency Evidence database is a comprehensive resource for the stem cell systems biology community. Database URL: http://www.maayanlab.net/ESCAPE

  18. HC Forum®: a web site based on an international human cytogenetic database

    PubMed Central

    Cohen, Olivier; Mermet, Marie-Ange; Demongeot, Jacques

    2001-01-01

    Familial structural rearrangements of chromosomes represent a factor of malformation risk that could vary over a large range, making genetic counseling difficult. However, they also represent a powerful tool for increasing knowledge of the genome, particularly by studying breakpoints and viable imbalances of the genome. We have developed a collaborative database that now includes data on more than 4100 families, from which we have developed a web site called HC Forum® (http://HCForum.imag.fr). It offers geneticists assistance in diagnosis and in genetic counseling by assessing the malformation risk with statistical models. For researchers, interactive interfaces exhibit the distribution of chromosomal breakpoints and of the genome regions observed at birth in trisomy or in monosomy. Dedicated tools including an interactive pedigree allow electronic submission of data, which will be anonymously shown in a forum for discussions. After validation, data are definitively registered in the database with the email of the sender, allowing direct location of biological material. Thus HC Forum® constitutes a link between diagnosis laboratories and genome research centers, and after 1 year, more than 700 users from about 40 different countries already exist. PMID:11125121

  19. Refining and validating a two-stage and web-based cancer risk assessment tool for village doctors in China.

    PubMed

    Shen, Xing-Rong; Chai, Jing; Feng, Rui; Liu, Tong-Zhu; Tong, Gui-Xian; Cheng, Jing; Li, Kai-Chun; Xie, Shao-Yu; Shi, Yong; Wang, De-Bin

    2014-01-01

    The big gap between the efficacy of population-level prevention and expectations, due to the heterogeneity and complexity of cancer etiologic factors, calls for selective yet personalized interventions based on effective risk assessment. This paper documents our research protocol aimed at refining and validating a two-stage and web-based cancer risk assessment tool, from a tentative one in use by an ongoing project, capable of identifying individuals at elevated risk for one or more types of the 80% leading cancers in rural China with adequate sensitivity and specificity and featuring low cost, easy application and cultural and technical sensitivity for farmers and village doctors. The protocol adopted a modified population-based case-control design using 72,000 non-patients as controls, 2,200 cancer patients as cases, and another 600 patients as cases for external validation. Factors taken into account comprised 8 domains including diet and nutrition, risk behaviors, family history, precancerous diseases, related medical procedures, exposure to environmental hazards, mood and feelings, physical activities and anthropologic and biologic factors. Modeling strategies explored various methodologies like empirical analysis, logistic regression, neural network analysis, decision theory and both internal and external validation using concordance statistics, predictive values, etc.
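    One common form such a risk model can take is a logistic regression over domain factors. The coefficients and factor names below are invented for illustration and are not the study's actual model.

```python
# Hypothetical logistic-regression risk score:
# p = 1 / (1 + exp(-(b0 + sum(b_i * x_i)))), with invented coefficients.
import math

def risk_probability(intercept, coefficients, answers):
    logit = intercept + sum(coefficients[k] * answers[k] for k in coefficients)
    return 1.0 / (1.0 + math.exp(-logit))

coefficients = {"smoker": 1.2, "family_history": 0.8, "age_over_60": 0.9}
p = risk_probability(-3.0, coefficients,
                     {"smoker": 1, "family_history": 0, "age_over_60": 1})
```

A two-stage tool would typically apply a cheap screening score like this first, then route higher-risk individuals to a more detailed assessment.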

  20. WebMeV | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Web MeV (Multiple-experiment Viewer) is a web/cloud-based tool for genomic data analysis. Web MeV is being built to meet the challenge of exploring large public genomic data sets with an intuitive graphical interface providing access to state-of-the-art analytical tools.

  1. Cataloguing and displaying Web feeds from French language health sites: a Web 2.0 add-on to a health gateway.

    PubMed

    Kerdelhué, G; Thirion, B; Dahamna, B; Darmoni, S J

    2008-01-01

    Among the numerous new functionalities of the Internet, commonly called Web 2.0, Web syndication illustrates the trend toward better and faster information sharing. Web feeds (a.k.a. RSS feeds), which were at first used mostly on weblogs, are now also widely used in academic, scientific and institutional websites such as PubMed. As very few French-language feeds were listed or catalogued in the health field by 2007, it was decided to implement them in the quality-controlled health gateway CISMeF ([French] acronym for Catalogue and Index of French Language Health Resources on the Internet). Furthermore, making full use of the nature of Web syndication, a Web feed aggregator was put online to provide a dynamic news gateway called "CISMeF actualités" (http://www.chu-rouen.fr/actualites/). This article describes the process to retrieve and implement the Web feeds in the catalogue and how its terminology was adjusted to describe this new content. It also describes how the aggregator was put online and the features of this news gateway. CISMeF actualités was built in accordance with the editorial policy of CISMeF. Only a part of the Web feeds of the catalogue were included, to display the most authoritative sources. Web feeds were also grouped by medical specialties and by countries using the prior indexing of websites with MeSH terms and the so-called metaterms. CISMeF actualités now displays 131 Web feeds across 40 different medical specialties, coming from 5 different countries. It is one example, among many, of how static hypertext links can now easily and beneficially be completed, or replaced, by dynamic display of Web content using syndication feeds.
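    Web-feed aggregation of the kind CISMeF actualités performs starts with parsing RSS items. A minimal standard-library sketch, with an invented feed, might look like this.

```python
# Parse item titles and links out of an RSS 2.0 feed (invented content).
import xml.etree.ElementTree as ET

RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Exemple actualites sante</title>
  <item><title>Nouvelle recommandation</title>
        <link>http://example.org/reco</link></item>
  <item><title>Alerte sanitaire</title>
        <link>http://example.org/alerte</link></item>
</channel></rss>"""

def parse_feed(xml_text):
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

entries = parse_feed(RSS)
```

An aggregator would fetch many such feeds on a schedule, merge the entries, and group them by the indexing terms already attached to each source site.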

  2. A web tool for STORET/WQX water quality data retrieval and Best Management Practice scenario suggestion.

    PubMed

    Park, Youn Shik; Engel, Bernie A; Kim, Jonggun; Theller, Larry; Chaubey, Indrajeet; Merwade, Venkatesh; Lim, Kyoung Jae

    2015-03-01

    Total Maximum Daily Load (TMDL) is a water quality standard to regulate the water quality of streams, rivers and lakes. A wide range of approaches are used currently to develop TMDLs for impaired streams and rivers. Flow and load duration curves (FDC and LDC) have been used in many states to evaluate the relationship between flow and pollutant loading along with other models and approaches. A web-based LDC Tool was developed to facilitate development of FDC and LDC as well as to support other hydrologic analyses. In this study, the FDC and LDC tool was enhanced to allow collection of water quality data via the web and to assist in establishing cost-effective Best Management Practice (BMP) implementations. The enhanced web-based tool provides use of water quality data not only from the US Geological Survey but also from the Water Quality Portal for the U.S. via web access. Moreover, the web-based tool identifies required pollutant reductions to meet standard loads and suggests a BMP scenario based on the ability of BMPs to reduce pollutant loads and on BMP establishment and maintenance costs. In the study, flow and water quality data were collected via web access to develop the LDC and to identify the required reduction. The suggested BMP scenario from the web-based tool was evaluated using the EPA Spreadsheet Tool for the Estimation of Pollutant Load model to attain the required pollutant reduction at least cost. Copyright © 2014 Elsevier Ltd. All rights reserved.
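    The FDC/LDC construction the tool automates can be sketched as ranking flows, assigning exceedance probabilities, and scaling by a concentration standard to get the allowable-load curve. The flow values, the Weibull plotting-position choice, and the standard below are illustrative assumptions, not the tool's actual method.

```python
# Build a flow duration curve and a matching load duration curve:
# rank flows high-to-low, assign exceedance probabilities, and multiply
# each flow by the water-quality standard (units left arbitrary here).
def duration_curves(flows, standard_mg_per_l):
    ranked = sorted(flows, reverse=True)
    n = len(ranked)
    curve = []
    for i, q in enumerate(ranked, start=1):
        exceedance = i / (n + 1)         # Weibull plotting position
        load = q * standard_mg_per_l     # allowable load at this flow
        curve.append((exceedance, q, load))
    return curve

daily_flows = [12.0, 3.5, 8.0, 20.0, 5.0, 15.0, 2.0]  # invented flows
ldc = duration_curves(daily_flows, standard_mg_per_l=0.3)
```

Observed loads plotted against this curve then show at which flow regimes (high-flow versus low-flow) the standard is exceeded, which is what drives BMP selection.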

  3. Using Web-Based Technologies and Tools in Future Choreographers' Training: British Experience

    ERIC Educational Resources Information Center

    Bidyuk, Dmytro

    2016-01-01

    In the paper the problem of using effective web-based technologies and tools in teaching choreography in British higher education institutions has been discussed. Researches on the usage of web-based technologies and tools for practical dance courses in choreographers' professional training at British higher education institutions by such British…

  4. Translation of SNOMED CT - strategies and description of a pilot project.

    PubMed

    Klein, Gunnar O; Chen, Rong

    2009-01-01

    The translation and localization of SNOMED CT (Systematized Nomenclature of Medicine - Clinical Terms) have been initiated in a few countries. In Sweden, we conducted the first evaluation of this terminology in a project called REFTERM in which we also developed a software tool which could handle a large scale translation with a number of translators and reviewers in a web-based environment. The system makes use of existing authorized English-Swedish translations of medical terminologies such as ICD-10. The paper discusses possible strategies for a national project to translate and adapt this terminology.

  5. DNAApp: a mobile application for sequencing data analysis

    PubMed Central

    Nguyen, Phi-Vu; Verma, Chandra Shekhar; Gan, Samuel Ken-En

    2014-01-01

    Summary: There have been numerous applications developed for decoding and visualization of ab1 DNA sequencing files for Windows and MAC platforms, yet none exists for the increasingly popular smartphone operating systems. The ability to decode sequencing files cannot easily be carried out using browser-accessed Web tools. To overcome this hurdle, we have developed a new native app called DNAApp that can decode and display ab1 sequencing files on Android and iOS. In addition to in-built analysis tools such as reverse complementation, protein translation and searching for specific sequences, we have incorporated convenient functions that would facilitate the harnessing of online Web tools for a full range of analysis. Given the high usage of Android/iOS tablets and smartphones, such bioinformatics apps would raise productivity and facilitate the high demand for analyzing sequencing data in biomedical research. Availability and implementation: The Android version of DNAApp is available in Google Play Store as ‘DNAApp’, and the iOS version is available in the App Store. More details on the app can be found at www.facebook.com/APDLab; www.bii.a-star.edu.sg/research/trd/apd.php. The DNAApp user guide is available at http://tinyurl.com/DNAAppuser, and a video tutorial is available on Google Play Store and App Store, as well as on the Facebook page. Contact: samuelg@bii.a-star.edu.sg PMID:25095882
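    The in-built analyses the abstract lists (reverse complementation and protein translation) reduce to short routines like the sketch below. The codon table is truncated to the few codons the example needs; this is an illustration, not DNAApp's code.

```python
# Reverse-complement a DNA sequence and translate codons to amino acids.
COMPLEMENT = str.maketrans("ACGT", "TGCA")
CODONS = {"ATG": "M", "AAA": "K", "TTT": "F", "TAA": "*"}  # truncated table

def reverse_complement(seq):
    return seq.translate(COMPLEMENT)[::-1]

def translate(seq):
    # Read non-overlapping codons; unknown codons map to "X".
    return "".join(CODONS.get(seq[i:i + 3], "X")
                   for i in range(0, len(seq) - 2, 3))

rc = reverse_complement("ATGAAA")      # -> "TTTCAT"
protein = translate("ATGAAATTTTAA")    # -> "MKF*"
```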

  6. DNAApp: a mobile application for sequencing data analysis.

    PubMed

    Nguyen, Phi-Vu; Verma, Chandra Shekhar; Gan, Samuel Ken-En

    2014-11-15

    There have been numerous applications developed for decoding and visualization of ab1 DNA sequencing files for Windows and MAC platforms, yet none exists for the increasingly popular smartphone operating systems. The ability to decode sequencing files cannot easily be carried out using browser-accessed Web tools. To overcome this hurdle, we have developed a new native app called DNAApp that can decode and display ab1 sequencing files on Android and iOS. In addition to in-built analysis tools such as reverse complementation, protein translation and searching for specific sequences, we have incorporated convenient functions that would facilitate the harnessing of online Web tools for a full range of analysis. Given the high usage of Android/iOS tablets and smartphones, such bioinformatics apps would raise productivity and facilitate the high demand for analyzing sequencing data in biomedical research. The Android version of DNAApp is available in Google Play Store as 'DNAApp', and the iOS version is available in the App Store. More details on the app can be found at www.facebook.com/APDLab; www.bii.a-star.edu.sg/research/trd/apd.php. The DNAApp user guide is available at http://tinyurl.com/DNAAppuser, and a video tutorial is available on Google Play Store and App Store, as well as on the Facebook page. samuelg@bii.a-star.edu.sg. © The Author 2014. Published by Oxford University Press.

  7. A WebGL Tool for Visualizing the Topology of the Sun's Coronal Magnetic Field

    NASA Astrophysics Data System (ADS)

    Duffy, A.; Cheung, C.; DeRosa, M. L.

    2012-12-01

    We present a web-based, topology-viewing tool that allows users to visualize the geometry and topology of the Sun's 3D coronal magnetic field in an interactive manner. The tool is implemented using open-source, mature, modern web technologies including WebGL, jQuery, HTML 5, and CSS 3, which are compatible with nearly all modern web browsers. As opposed to the traditional method of visualization, which involves the downloading and setup of various software packages (proprietary and otherwise), the tool presents a clean interface that allows the user to easily load and manipulate the model, while also offering great power to choose which topological features are displayed. The tool accepts data encoded in the JSON open format, which has libraries available for nearly every major programming language, making it simple to generate the data.

  8. Web3DMol: interactive protein structure visualization based on WebGL.

    PubMed

    Shi, Maoxiang; Gao, Juntao; Zhang, Michael Q

    2017-07-03

    A growing number of web-based databases and tools for protein research are being developed. There is now a widespread need for visualization tools to present the three-dimensional (3D) structure of proteins in web browsers. Here, we introduce our 3D modeling program, Web3DMol, a web application focusing on protein structure visualization in modern web browsers. Users submit a PDB identification code or select a PDB archive from their local disk, and Web3DMol will display and allow interactive manipulation of the 3D structure. Featured functions, such as sequence plot, fragment segmentation, measure tool and meta-information display, are offered for users to gain a better understanding of protein structure. Easy-to-use APIs are available for developers to reuse and extend Web3DMol. Web3DMol can be freely accessed at http://web3dmol.duapp.com/, and the source code is distributed under the MIT license. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
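    A viewer like Web3DMol ultimately reads atom coordinates out of the fixed-column ATOM records of a PDB file. The sketch below parses two invented records using the standard PDB column positions for x, y, and z; it is an illustration, not Web3DMol's code.

```python
# Extract (x, y, z) from PDB ATOM records; the two records are invented.
PDB_TEXT = """\
ATOM      1  N   MET A   1      11.104  13.207   2.100  1.00 20.00           N
ATOM      2  CA  MET A   1      12.560  13.329   2.300  1.00 20.00           C
"""

def atom_coordinates(pdb_text):
    coords = []
    for line in pdb_text.splitlines():
        if line.startswith("ATOM"):
            # x, y, z occupy columns 31-38, 39-46, 47-54 (1-based) in PDB
            coords.append((float(line[30:38]),
                           float(line[38:46]),
                           float(line[46:54])))
    return coords

coords = atom_coordinates(PDB_TEXT)
```

Once parsed, each coordinate triple becomes a vertex fed to the WebGL renderer.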

  9. 78 FR 54241 - Proposed Information Collection; Comment Request; BroadbandMatch Web Site Tool

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-03

    ... Information Collection; Comment Request; BroadbandMatch Web Site Tool AGENCY: National Telecommunications and... goal of increased broadband deployment and use in the United States. The BroadbandMatch Web site began... empowering technology effectively. II. Method of Collection BroadbandMatch users access the Web site through...

  10. Digital Discernment: An E-Commerce Web Site Evaluation Tool

    ERIC Educational Resources Information Center

    Sigman, Betsy Page; Boston, Brian J.

    2013-01-01

    Students entering the business workforce today may well share some responsibility for developing, revising, or evaluating their company's Web site. They may lack the experience, however, to critique their employer's Web presence effectively. The purpose of developing Digital Discernment, an e-commerce Web site evaluation tool, was to prepare…

  11. Ondex Web: web-based visualization and exploration of heterogeneous biological networks.

    PubMed

    Taubert, Jan; Hassani-Pak, Keywan; Castells-Brooke, Nathalie; Rawlings, Christopher J

    2014-04-01

    Ondex Web is a new web-based implementation of the network visualization and exploration tools from the Ondex data integration platform. New features such as context-sensitive menus and annotation tools provide users with intuitive ways to explore and manipulate the appearance of heterogeneous biological networks. Ondex Web is open source, written in Java and can be easily embedded into Web sites as an applet. Ondex Web supports loading data from a variety of network formats, such as XGMML, NWB, Pajek and OXL. http://ondex.rothamsted.ac.uk/OndexWeb.

  12. Architecture for biomedical multimedia information delivery on the World Wide Web

    NASA Astrophysics Data System (ADS)

    Long, L. Rodney; Goh, Gin-Hua; Neve, Leif; Thoma, George R.

    1997-10-01

    Research engineers at the National Library of Medicine are building a prototype system for the delivery of multimedia biomedical information on the World Wide Web. This paper discusses the architecture and design considerations for the system, which will be used initially to make images and text from the third National Health and Nutrition Examination Survey (NHANES) publicly available. We categorized our analysis as follows: (1) fundamental software tools: we analyzed trade-offs among use of conventional HTML/CGI, X Window Broadway, and Java; (2) image delivery: we examined the use of unconventional TCP transmission methods; (3) database manager and database design: we discuss the capabilities and planned use of the Informix object-relational database manager and the planned schema for the NHANES database; (4) storage requirements for our Sun server; (5) user interface considerations; (6) the compatibility of the system with other standard research and analysis tools; (7) image display: we discuss considerations for consistent image display for end users. Finally, we discuss the scalability of the system in terms of incorporating larger or more databases of similar data, and the extendibility of the system for supporting content-based retrieval of biomedical images. The system prototype is called the Web-based Medical Information Retrieval System. An early version was built as a Java applet and tested on Unix, PC, and Macintosh platforms. This prototype used the MiniSQL database manager to do text queries on a small database of records of participants in the second NHANES survey. The full records and associated x-ray images were retrievable and displayable on a standard Web browser. A second version has now been built, also a Java applet, using the MySQL database manager.

  13. Scribl: an HTML5 Canvas-based graphics library for visualizing genomic data over the web

    PubMed Central

    Miller, Chase A.; Anthony, Jon; Meyer, Michelle M.; Marth, Gabor

    2013-01-01

    Motivation: High-throughput biological research requires simultaneous visualization as well as analysis of genomic data, e.g. read alignments, variant calls and genomic annotations. Traditionally, such integrative analysis required desktop applications operating on locally stored data. Many current terabyte-size datasets generated by large public consortia projects, however, are already only feasibly stored at specialist genome analysis centers. As even small laboratories can afford very large datasets, local storage and analysis are becoming increasingly limiting, and it is likely that most such datasets will soon be stored remotely, e.g. in the cloud. These developments will require web-based tools that enable users to access, analyze and view vast remotely stored data with a level of sophistication and interactivity that approximates desktop applications. As rapidly dropping cost enables researchers to collect data intended to answer questions in very specialized contexts, developers must also provide software libraries that empower users to implement customized data analyses and data views for their particular application. Such specialized, yet lightweight, applications would empower scientists to better answer specific biological questions than possible with general-purpose genome browsers currently available. Results: Using recent advances in core web technologies (HTML5), we developed Scribl, a flexible genomic visualization library specifically targeting coordinate-based data such as genomic features, DNA sequence and genetic variants. Scribl simplifies the development of sophisticated web-based graphical tools that approach the dynamism and interactivity of desktop applications. Availability and implementation: Software is freely available online at http://chmille4.github.com/Scribl/ and is implemented in JavaScript with all modern browsers supported. 
Contact: gabor.marth@bc.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23172864
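Coordinate-based track rendering of the kind Scribl performs typically has to pack overlapping genomic features into separate lanes. A minimal Python sketch of such a greedy lane-assignment step (illustrative only; Scribl itself is a JavaScript library with its own API):

```python
def assign_lanes(features):
    """Greedily pack (start, end) genomic features into non-overlapping lanes."""
    lanes = []       # each lane records the end coordinate of its last feature
    placement = []   # (start, end, lane index) for each feature
    for start, end in sorted(features):
        for i, lane_end in enumerate(lanes):
            if start >= lane_end:   # feature fits after the last one in this lane
                lanes[i] = end
                placement.append((start, end, i))
                break
        else:
            lanes.append(end)       # no lane fits: open a new one
            placement.append((start, end, len(lanes) - 1))
    return placement
```

Overlapping features land in different lanes, so every feature stays visible in the rendered track.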

  14. DiAs Web Monitoring: A Real-Time Remote Monitoring System Designed for Artificial Pancreas Outpatient Trials

    PubMed Central

    Place, Jérôme; Robert, Antoine; Ben Brahim, Najib; Keith-Hynes, Patrick; Farret, Anne; Pelletier, Marie-Josée; Buckingham, Bruce; Breton, Marc; Kovatchev, Boris; Renard, Eric

    2013-01-01

    Background: Developments in an artificial pancreas (AP) for patients with type 1 diabetes have allowed a move toward performing outpatient clinical trials. A “home-like” environment implies specific protocol and system adaptations, among which the introduction of remote monitoring is meaningful. We present a novel tool allowing remote monitoring of multiple patients using an AP in home-like settings. Methods: We investigated existing systems, interviewed experienced clinical teams, listed required features, and drew several mockups of the user interface. The resulting application was bench-tested before it was used in three outpatient studies representing 3480 h of remote monitoring. Results: Our tool, called DiAs Web Monitoring (DWM), is a web-based application that ensures reception, storage, and display of data sent by AP systems. Continuous glucose monitoring (CGM) and insulin delivery data are presented in a colored chart to facilitate reading and interpretation. Several subjects can be monitored simultaneously on the same screen, and alerts are triggered to help detect events such as hypoglycemia or CGM failures. In the third trial, DWM received approximately 460 data points per subject per hour: 77% were log messages and 5% CGM data. More than 97% of transmissions were achieved in less than 5 min. Conclusions: Transition from a hospital setting to home-like conditions requires specific AP supervision, to which remote monitoring systems can contribute valuably. DiAs Web Monitoring worked properly when tested in our outpatient studies. It could facilitate subject monitoring and even accelerate medical and technical assessment of the AP. It should now be adapted for long-term studies with an enhanced notification feature. J Diabetes Sci Technol 2013;7(6):1427–1435 PMID:24351169

  15. SemanticSCo: A platform to support the semantic composition of services for gene expression analysis.

    PubMed

    Guardia, Gabriela D A; Ferreira Pires, Luís; da Silva, Eduardo G; de Farias, Cléver R G

    2017-02-01

    Gene expression studies often require the combined use of a number of analysis tools. However, manual integration of analysis tools can be cumbersome and error prone. To support a higher level of automation in the integration process, efforts have been made in the biomedical domain towards the development of semantic web services and supporting composition environments. Yet, most environments consider only the execution of simple service behaviours and require users to focus on technical details of the composition process. We propose a novel approach to the semantic composition of gene expression analysis services that addresses the shortcomings of the existing solutions. Our approach includes an architecture designed to support the service composition process for gene expression analysis, and a flexible strategy for the (semi-)automatic composition of semantic web services. Finally, we implement a supporting platform called SemanticSCo to realize the proposed composition approach and demonstrate its functionality by successfully reproducing a microarray study documented in the literature. The SemanticSCo platform provides support for the composition of RESTful web services semantically annotated using SAWSDL. Our platform also supports the definition of constraints/conditions regarding the order in which service operations should be invoked, thus enabling the definition of complex service behaviours. Our proposed solution for semantic web service composition takes into account the requirements of different stakeholders and addresses all phases of the service composition process. It also provides support for the definition of analysis workflows at a high level of abstraction, thus enabling users to focus on biological research issues rather than on the technical details of the composition process. The SemanticSCo source code is available at https://github.com/usplssb/SemanticSCo. Copyright © 2017 Elsevier Inc. All rights reserved.
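Ordering constraints of the kind SemanticSCo supports amount to invoking service operations in an order consistent with their declared dependencies, i.e. a topological sort. A minimal sketch (hypothetical operation names; this is not the SemanticSCo API):

```python
from collections import defaultdict, deque

def invocation_order(deps):
    """Order service operations so each runs after everything it depends on.

    deps maps each operation to the list of operations it depends on.
    """
    graph = defaultdict(list)     # dependency -> dependents
    indegree = defaultdict(int)   # unmet-dependency count per operation
    ops = set(deps)
    for op, before in deps.items():
        ops.update(before)
        for b in before:
            graph[b].append(op)
            indegree[op] += 1
    ready = deque(sorted(op for op in ops if indegree[op] == 0))
    order = []
    while ready:
        op = ready.popleft()
        order.append(op)
        for nxt in graph[op]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(ops):
        raise ValueError("cyclic dependency between operations")
    return order
```

A workflow such as load → normalize → cluster is then just a dependency map fed to this function.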

  16. DiAs web monitoring: a real-time remote monitoring system designed for artificial pancreas outpatient trials.

    PubMed

    Place, Jérôme; Robert, Antoine; Ben Brahim, Najib; Keith-Hynes, Patrick; Farret, Anne; Pelletier, Marie-Josée; Buckingham, Bruce; Breton, Marc; Kovatchev, Boris; Renard, Eric

    2013-11-01

    Developments in an artificial pancreas (AP) for patients with type 1 diabetes have allowed a move toward performing outpatient clinical trials. A "home-like" environment implies specific protocol and system adaptations, among which the introduction of remote monitoring is meaningful. We present a novel tool allowing remote monitoring of multiple patients using an AP in home-like settings. We investigated existing systems, interviewed experienced clinical teams, listed required features, and drew several mockups of the user interface. The resulting application was bench-tested before it was used in three outpatient studies representing 3480 h of remote monitoring. Our tool, called DiAs Web Monitoring (DWM), is a web-based application that ensures reception, storage, and display of data sent by AP systems. Continuous glucose monitoring (CGM) and insulin delivery data are presented in a colored chart to facilitate reading and interpretation. Several subjects can be monitored simultaneously on the same screen, and alerts are triggered to help detect events such as hypoglycemia or CGM failures. In the third trial, DWM received approximately 460 data points per subject per hour: 77% were log messages and 5% CGM data. More than 97% of transmissions were achieved in less than 5 min. Transition from a hospital setting to home-like conditions requires specific AP supervision, to which remote monitoring systems can contribute valuably. DiAs Web Monitoring worked properly when tested in our outpatient studies. It could facilitate subject monitoring and even accelerate medical and technical assessment of the AP. It should now be adapted for long-term studies with an enhanced notification feature. © 2013 Diabetes Technology Society.
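The alerting described here reduces to threshold and gap checks over the incoming CGM stream. A minimal sketch of that idea (the threshold, gap limit, and field layout are illustrative assumptions, not taken from DWM):

```python
HYPO_THRESHOLD = 70  # mg/dL; illustrative alert threshold, not DWM's actual value

def check_cgm(readings, gap_limit_min=15):
    """Flag hypoglycemia and CGM outages from (minute, glucose) samples."""
    alerts = []
    last_minute = None
    for minute, glucose in readings:
        if last_minute is not None and minute - last_minute > gap_limit_min:
            alerts.append(("cgm_gap", minute))       # possible sensor failure
        if glucose < HYPO_THRESHOLD:
            alerts.append(("hypoglycemia", minute))  # low-glucose event
        last_minute = minute
    return alerts
```

A monitoring screen would run such checks per subject and surface the returned events as alerts.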

  17. Content Documents Management

    NASA Technical Reports Server (NTRS)

    Muniz, R.; Hochstadt, J.; Boelke, J.; Dalton, A.

    2011-01-01

    The Content Documents are created and managed by the System Software group within the Launch Control System (LCS) project. The System Software product group is led by the NASA Engineering Control and Data Systems branch (NEC3) at Kennedy Space Center. The team is working on creating Operating System Images (OSI) for different platforms (i.e., AIX, Linux, Solaris and Windows). Before an OSI can be created, the team must create a Content Document, which provides the information for a workstation or server: the list of all the software to be installed on it, and the set to which the hardware belongs, for example the LDS, the ADS or the FR-1. The objective of this project is to create a user-interface Web application that can manage the information in the Content Documents, with all the correct validations and filters for administrator purposes. For this project we used one of the best-known tools for agile Web development, Ruby on Rails. This tool helps pragmatic programmers develop Web applications with the Rails framework and the Ruby programming language. It is remarkable how much a student can learn this way: OOP features with the Ruby language, user-interface design with HTML and CSS, associations and queries with gems, database management with MySQL, shell commands from the command prompt, and Web development with the Rails framework. All of this in a real-world project and in just fifteen weeks!

  18. w4CSeq: software and web application to analyze 4C-seq data.

    PubMed

    Cai, Mingyang; Gao, Fan; Lu, Wange; Wang, Kai

    2016-11-01

    Circularized Chromosome Conformation Capture followed by deep sequencing (4C-Seq) is a powerful technique to identify genome-wide partners interacting with a pre-specified genomic locus. Here, we present a computational and statistical approach to analyze 4C-Seq data generated from both enzyme digestion and sonication fragmentation-based methods. We implemented a command line software tool and a web interface called w4CSeq, which takes in the raw 4C sequencing data (FASTQ files) as input, performs automated statistical analysis and presents results in a user-friendly manner. Besides providing users with the list of candidate interacting sites/regions, w4CSeq generates figures showing genome-wide distribution of interacting regions, and sketches the enrichment of key features such as TSSs, TTSs, CpG sites and DNA replication timing around 4C sites. Users can establish their own web server by downloading source codes at https://github.com/WGLab/w4CSeq. Additionally, a demo web server is available at http://w4cseq.wglab.org. Contact: kaiwang@usc.edu or wangelu@usc.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
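At its core, 4C-Seq interaction calling counts reads per restriction fragment and flags fragments whose counts exceed a background expectation. A toy sketch of that counting step (this is a simplification, not the actual w4CSeq statistics):

```python
from collections import Counter

def flag_enriched(read_fragments, background_rate, fold=3):
    """Flag fragments whose read count exceeds fold x the background rate.

    read_fragments: one fragment ID per aligned read.
    background_rate: expected reads per fragment under no interaction.
    """
    counts = Counter(read_fragments)
    return sorted(f for f, c in counts.items() if c > fold * background_rate)
```

Real tools replace the fixed fold-change with a proper statistical model of local background.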

  19. The Osseus platform: a prototype for advanced web-based distributed simulation

    NASA Astrophysics Data System (ADS)

    Franceschini, Derrick; Riecken, Mark

    2016-05-01

    Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, reducing the time and skill set needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data efficiently over local or wide area networks using modern techniques. Further, it provides Service Oriented Architecture capabilities such that finer-granularity components, such as individual models, can contribute to a simulation with minimal effort.

  20. PseKRAAC: a flexible web server for generating pseudo K-tuple reduced amino acids composition.

    PubMed

    Zuo, Yongchun; Li, Yuan; Chen, Yingli; Li, Guangpeng; Yan, Zhenhe; Yang, Lei

    2017-01-01

    Reduced amino acid alphabets offer a powerful means of both simplifying protein complexity and identifying functionally conserved regions. However, different protein problems may require different clustering methods. Encouraged by the success of the pseudo-amino acid composition algorithm, we developed a freely available web server called PseKRAAC (pseudo K-tuple reduced amino acid composition). By implementing reduced amino acid alphabets, protein complexity can be significantly simplified, which decreases the chance of overfitting, lowers computational cost and reduces information redundancy. PseKRAAC delivers more capability for protein research by incorporating three crucial parameters that describe protein composition. Users can easily generate many different modes of PseKRAAC tailored to their needs by selecting various reduced amino acid alphabets and other characteristic parameters. It is anticipated that the PseKRAAC web server will become a very useful tool in computational proteomics and protein sequence analysis. Freely available on the web at http://bigdata.imu.edu.cn/psekraac. Contacts: yczuo@imu.edu.cn or imu.hema@foxmail.com or yanglei_hmu@163.com. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
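The idea behind K-tuple reduced amino acid composition can be illustrated with a toy two-letter reduction (the two-cluster alphabet below is invented for illustration; the actual PseKRAAC alphabets and parameters differ):

```python
from collections import Counter
from itertools import product

HYDROPHOBIC = set("AVLIMFWYC")  # toy 2-cluster reduction, not a PseKRAAC alphabet

def reduce_seq(seq):
    """Map each residue to a reduced-alphabet letter: H(ydrophobic) or P(olar/other)."""
    return "".join("H" if aa in HYDROPHOBIC else "P" for aa in seq)

def ktuple_composition(seq, k=2):
    """Normalized frequencies of all k-tuples over the reduced alphabet."""
    reduced = reduce_seq(seq)
    tuples = [reduced[i:i + k] for i in range(len(reduced) - k + 1)]
    counts = Counter(tuples)
    total = len(tuples)
    return {"".join(t): counts["".join(t)] / total
            for t in product("HP", repeat=k)}
```

With a 2-letter alphabet and k=2 the feature vector has 4 entries instead of the 400 dipeptide frequencies over the full alphabet, which is exactly the dimensionality reduction the abstract describes.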

  1. Impact of a Temporary NRT Enhancement in a State Quitline and Web-Based Program.

    PubMed

    Cole, Sam; Suter, Casey; Nash, Chelsea; Pollard, Joseph

    2018-06-01

    To examine the impact of a nicotine replacement therapy (NRT) enhancement on quit outcomes. Observational study using an intent-to-treat, as-treated analysis. Not available. A total of 4022 Idaho tobacco users aged ≥18 years who received services from the Idaho Tobacco Quitline or Idaho's web-based program. One-call phone or web-based participants were sent a single 4- or 8-week NRT shipment. Multiple-call participants were sent NRT in a single 4-week shipment or two 4-week shipments (the second shipment sent only to those completing a second coaching call). North American Quitline Consortium-recommended Minimal Data Set items were collected at registration and follow-up. Thirty-day point-prevalence quit rates were assessed at 7-month follow-up. Multiple logistic regression models were used to examine the effects of program type and amount of NRT sent to participants while controlling for demographic and tobacco use characteristics. Abstinence rates were significantly higher among 8-week versus 4-week NRT recipients (42.5% vs 33.3%). Among multiple-call program participants, the effect was significant only for those who received both 4-week NRT shipments versus only the first of 2 possible shipments (51.1% vs 31.1%). Costs per quit were lowest among web-based participants who received 4 weeks of NRT (US$183 per quit) and highest among multiple-call participants who received only 1 of 2 possible NRT shipments (US$557 per quit). To better balance cost with clinical effectiveness, funders of state-based tobacco cessation services may want to consider (1) allowing tobacco users to choose between phone- and web-based programs while (2) limiting longer NRT benefits only to multiple-call program participants.

  2. System Testing of Desktop and Web Applications

    ERIC Educational Resources Information Center

    Slack, James M.

    2011-01-01

    We want our students to experience system testing of both desktop and web applications, but the cost of professional system-testing tools is far too high. We evaluate several free tools and find that AutoIt makes an ideal educational system-testing tool. We show several examples of desktop and web testing with AutoIt, starting with simple…

  3. Providing Knowledge Recommendations: An Approach for Informal Electronic Mentoring

    ERIC Educational Resources Information Center

    Colomo-Palacios, Ricardo; Casado-Lumbreras, Cristina; Soto-Acosta, Pedro; Misra, Sanjay

    2014-01-01

    The use of Web 2.0 technologies for knowledge management is invading the corporate sphere. The Web 2.0 is the most adopted knowledge transfer tool within knowledge intensive firms and is starting to be used for mentoring. This paper presents IM-TAG, a Web 2.0 tool, based on semantic technologies, for informal mentoring. The tool offers…

  4. On Recommending Web 2.0 Tools to Personalise Learning

    ERIC Educational Resources Information Center

    Juškeviciene, Anita; Kurilovas, Eugenijus

    2014-01-01

    The paper aims to present research results on using Web 2.0 tools for learning personalisation. In the work, personalised Web 2.0 tools selection method is presented. This method takes into account student's learning preferences for content and communication modes tailored to the learning activities with a view to help the learner to quickly and…

  5. Component Architectures and Web-Based Learning Environments

    ERIC Educational Resources Information Center

    Ferdig, Richard E.; Mishra, Punya; Zhao, Yong

    2004-01-01

    The Web has caught the attention of many educators as an efficient communication medium and content delivery system. But we feel there is another aspect of the Web that has not been given the attention it deserves. We call this aspect of the Web its "component architecture." Briefly it means that on the Web one can develop very complex…

  6. Selecting a Free Web-Hosted Survey Tool for Student Use

    ERIC Educational Resources Information Center

    Elbeck, Matt

    2014-01-01

    This study provides marketing educators a review of free web-based survey services and guidance for student use. A mixed methods approach started with online searches and metrics identifying 13 free web-hosted survey services, described as demonstration or project tools, and ranked using popularity and importance web-based metrics. For each…

  7. Student Inquiry and Web 2.0

    ERIC Educational Resources Information Center

    Berger, Pam

    2010-01-01

    Web 2.0 applications are changing how educators interact both with each other and with their students. Educators can use these new Web tools daily to create, share, socialize, and collaborate with students, colleagues, and newly developed network contacts. School librarians are finding that Web 2.0 tools are bringing them more ways to embrace and…

  8. Students' Reaction to WebCT: Implications for Designing On-Line Learning Environments

    ERIC Educational Resources Information Center

    Osman, Mohamed Eltahir

    2005-01-01

    There is a growing number of web-based and web-assisted course development tools and products that can be used to create on-line learning environment. The utility of these products, however, varies greatly depending on their feasibility, prerequisite infrastructure, technical features, interface, and course development and management tools. WebCT…

  9. Standards opportunities around data-bearing Web pages.

    PubMed

    Karger, David

    2013-03-28

    The evolving Web has seen ever-growing use of structured data, thanks to the way it enhances information authoring, querying, visualization and sharing. To date, however, most structured data authoring and management tools have been oriented towards programmers and Web developers. End users have been left behind, unable to leverage structured data for information management and communication as well as professionals. In this paper, I will argue that many of the benefits of structured data management can be provided to end users as well. I will describe an approach and tools that allow end users to define their own schemas (without knowing what a schema is), manage data and author (not program) interactive Web visualizations of that data using the Web tools with which they are already familiar, such as plain Web pages, blogs, wikis and WYSIWYG document editors. I will describe our experience deploying these tools and some lessons relevant to their future evolution.

  10. Myria: Scalable Analytics as a Service

    NASA Astrophysics Data System (ADS)

    Howe, B.; Halperin, D.; Whitaker, A.

    2014-12-01

    At the UW eScience Institute, we're working to empower non-experts, especially in the sciences, to write and use data-parallel algorithms. To this end, we are building Myria, a web-based platform for scalable analytics and data-parallel programming. Myria's internal model of computation is the relational algebra extended with iteration, such that every program is inherently data-parallel, just as every query in a database is inherently data-parallel. But unlike databases, iteration is a first-class concept, allowing us to express machine learning tasks, graph traversal tasks, and more. Programs can be expressed in a number of languages and can be executed on a number of execution environments, but we emphasize a particular language called MyriaL that supports both imperative and declarative styles and a particular execution engine called MyriaX that uses an in-memory column-oriented representation and asynchronous iteration. We deliver Myria over the web as a service, providing an editor, performance analysis tools, and catalog browsing features in a single environment. We find that this web-based "delivery vector" is critical in reaching non-experts: they are insulated from the irrelevant technical work associated with installation, configuration, and resource management. The MyriaX backend, one of several execution runtimes we support, is a main-memory, column-oriented, RDBMS-on-the-worker system that supports cyclic data flows as a first-class citizen and has been shown to outperform competitive systems on 100-machine cluster sizes. I will describe the Myria system, give a demo, and present some new results in large-scale oceanographic microbiology.
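Iteration as a first-class relational construct, as in MyriaL, can be emulated by repeating a join until a fixed point is reached. A minimal Python sketch of transitive closure over an edge relation (illustrative of the idea only; this is not MyriaL syntax):

```python
def transitive_closure(edges):
    """Fixed-point iteration: join reachable pairs with edges until no change."""
    edges = set(edges)
    reach = set(edges)
    while True:
        # Relational join: reach(a, b) JOIN edges(b, c) -> (a, c)
        new_pairs = {(a, c) for a, b in reach for b2, c in edges if b == b2}
        if new_pairs <= reach:      # fixed point: nothing new was derived
            return reach
        reach |= new_pairs
```

Each loop body is a plain relational query, so every iteration is itself data-parallel; the engine's job is to run this loop to convergence across workers.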

  11. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    PubMed

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties in getting locally installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools behind an easy interface for input and output. Web services, due to their universal nature and widely known interface, are a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS mediates access to registered tools by providing front-end and back-end web services. Programmers can install applications on HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that displays the registered applications and clients. Applications registered in BOWS can be accessed from virtually any programming language through web services, or using the standard Java clients. The back-end can run on HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
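The front-end/back-end split described here is a submit-then-poll protocol. A minimal in-memory sketch of that pattern (all class and method names are hypothetical; the real BOWS interface is a set of web services with generated Java clients):

```python
class JobService:
    """Toy stand-in for a BOWS-like front end: submit jobs, poll, fetch results."""

    def __init__(self):
        self.jobs = {}

    def submit(self, tool, params):
        """Front-end call: a client registers a new job and gets an ID back."""
        job_id = len(self.jobs) + 1
        self.jobs[job_id] = {"tool": tool, "params": params, "status": "queued"}
        return job_id

    def run_backend_once(self):
        """Back-end call: the cluster checks for queued jobs and posts results.

        Here the 'analysis' is just uppercasing the sequence, as a placeholder.
        """
        for job in self.jobs.values():
            if job["status"] == "queued":
                job["result"] = job["params"]["sequence"].upper()
                job["status"] = "done"

    def poll(self, job_id):
        return self.jobs[job_id]["status"]

    def result(self, job_id):
        return self.jobs[job_id]["result"]
```

The point of the split is that submitters and the cluster never talk directly: both sides only see the service, so either can be replaced independently.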

  12. Design of a self-administered online food frequency questionnaire (FFQ) to assess dietary intake among university population.

    PubMed

    González Carrascosa, R; Bayo Montó, J L; Meneu Barreira, T; García Segovia, P; Martínez-Monzó, J

    2011-01-01

    To introduce and describe a new tool called UPV-FFQ to evaluate dietary intake in the university population. The new tool consists principally of a self-administered online food frequency questionnaire (FFQ). UPV-FFQ was developed as a web application built with ASP.NET 2.0 and backed by a SQL Server 2005 database. The FFQ was modeled on the paper-and-pencil FFQ called "Dieta, salud y antropometría en la población universitaria". The tool has three parts: (1) a homepage, (2) a general questionnaire and (3) an FFQ. The FFQ has a closed list of 84 food items commonly consumed in the Valencia region. Respondents have to indicate the food items they consume (2 possible options), the frequency of consumption (9 response options) and the quantity consumed (7 response options). UPV-FFQ includes approximately 250 color photographs representing three portion sizes, which help respondents choose the portion size that best matches their habitual portions. The new tool provides quantitative information on habitual intake of 31 nutritional parameters and qualitative information from the general questionnaire. A pilot study was conducted with a total of 57 respondents. The mean time to complete the questionnaire was 15 minutes. The pilot study concluded that the questionnaire was easy to use, low-cost and time-effective, and that the format and sequence of the questions were easily understood.
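Behind such a questionnaire, daily intake of each nutritional parameter is typically estimated as frequency × portion size × nutrient content per gram, summed over food items. A toy sketch of that computation (the food values below are made up, not from the UPV-FFQ database):

```python
def daily_intake(responses, food_db):
    """Sum daily nutrient intake: servings/day x grams/serving x nutrient/gram."""
    totals = {}
    for item, servings_per_day, grams_per_serving in responses:
        for nutrient, per_gram in food_db[item].items():
            amount = servings_per_day * grams_per_serving * per_gram
            totals[nutrient] = totals.get(nutrient, 0.0) + amount
    return totals

FOOD_DB = {  # illustrative nutrient densities per gram, invented for this example
    "bread": {"kcal": 2.6, "protein_g": 0.09},
    "milk": {"kcal": 0.64, "protein_g": 0.033},
}
```

Frequency options ("twice a day") and photograph-based portion choices map onto the servings-per-day and grams-per-serving inputs respectively.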

  13. Integration of Web 2.0 Tools in Learning a Programming Course

    ERIC Educational Resources Information Center

    Majid, Nazatul Aini Abd

    2014-01-01

    Web 2.0 tools are expected to assist students to acquire knowledge effectively in their university environment. However, the lack of effort from lecturers in planning the learning process can make it difficult for the students to optimize their learning experiences. The aim of this paper is to integrate Web 2.0 tools with learning strategy in…

  14. Motivating Pre-Service Teachers in Technology Integration of Web 2.0 for Teaching Internships

    ERIC Educational Resources Information Center

    Kim, Hye Jeong; Jang, Hwan Young

    2015-01-01

    The aim of this study was to examine the predictors of pre-service teachers' use of Web 2.0 tools during a teaching internship, after a course that emphasized the use of the tools for instructional activities. Results revealed that integrating Web 2.0 tools during their teaching internship was strongly predicted by participants' perceived…

  15. K-12 Student Use of Web 2.0 Tools: A Global Study

    ERIC Educational Resources Information Center

    Toledo, Cheri; Shepard, MaryFriend

    2011-01-01

    Over the past decade, Internet use has increased 445% worldwide. This boom has enabled widespread access to online tools and digital spaces for educational practices. The results of this study of Web 2.0 tool use in kindergarten through high school (K-12) classrooms around the world will be presented. A web-based survey was sent out through online…

  16. Prototyping Tool for Web-Based Multiuser Online Role-Playing Game

    NASA Astrophysics Data System (ADS)

    Okamoto, Shusuke; Kamada, Masaru; Yonekura, Tatsuhiro

    This letter proposes a prototyping tool for Web-based Multiuser Online Role-Playing Games (MORPG). The design goal is to make the tool simple yet powerful. It comprises a GUI editor, a translator and a runtime environment. The GUI editor is used to edit state-transition diagrams, each of which defines the behavior of a fictional character. The state-transition diagrams are translated into C program code, which plays the role of a game engine in the RPG system. The runtime environment is built on PHP, JavaScript with Ajax, and HTML, so the prototype system can be played in an ordinary Web browser such as Firefox, Safari or IE. On a click or key press by a player, the browser sends the event to the Web server so that its consequence is reflected on the screens that other players are viewing. Prospective users of this tool include programming novices and schoolchildren. No knowledge of any specific programming language is required to create state-transition diagrams, whose structure is not only suitable for defining character behavior but also intuitive enough for novices to understand. Users can therefore easily create a Web-based MORPG system with the tool.
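A state-transition diagram of the kind the tool compiles can be modelled directly as a transition table. A minimal Python sketch of a character behaviour defined this way (the states and events are invented for illustration; the tool itself emits C code):

```python
class Character:
    """Character behaviour driven by a state-transition table."""

    TRANSITIONS = {  # (current state, event) -> next state; an illustrative diagram
        ("idle", "player_near"): "greet",
        ("greet", "player_leaves"): "idle",
        ("greet", "attacked"): "flee",
    }

    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        # Unknown (state, event) pairs leave the character in its current state.
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

Each diagram edge drawn in the GUI editor corresponds to one entry in such a table, which is why no programming knowledge is needed to author behaviours.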

  17. Group dynamics and social interaction in a South Asian online learning forum for faculty development of medical teachers.

    PubMed

    Anshu; Sharma, M; Burdick, W P; Singh, T

    2010-04-01

    Group dynamics of online medical faculty development programs have not been analyzed and reported in the literature. Knowing the types of content in posted messages helps in understanding group dynamics and promoting participation in an asynchronous learning environment. This paper assesses group dynamics and social interactivity in an online learning environment for medical teachers in the South Asian context. Participants of a medical education fellowship program conducted by the Foundation for Advancement of International Medical Education and Research (FAIMER) Regional Institute at Christian Medical College, Ludhiana (CMCL) in India interact on a listserv called the Mentoring-Learning Web (ML-Web). Monthly topics for online discussion are chosen by fellows through a standard tool called "multi-voting". Fellows volunteer to moderate sessions and direct the pace of the discussion. We analyzed the content and process of the discussion of one particular month. The emails were categorized as those that reflected cognitive presence (dealing with construction and exploration of knowledge), teacher presence (dealing with instructional material and learning resources), and social presence, or were administrative in nature. Social emails were further classified as affective, cohesive and interactive. Social emails constituted one-third of the total emails. Another one-quarter of the emails dealt with sharing of resources and teacher presence, while cognitive emails comprised 36.2% of the total. More than half of the social emails were affective, while a little less than one-third were cohesive. Social posts are an inevitable part of online learning. These posts promote bonding between learners and contribute to better interaction and collaboration in online learning. Moderators should be aware of their presence and use them as tools to promote interactivity.

  18. The ChIP-Seq tools and web server: a resource for analyzing ChIP-seq and other types of genomic data.

    PubMed

    Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp

    2016-11-18

    ChIP-seq and related high-throughput chromatin profiling assays generate ever-increasing volumes of highly valuable biological data. To make sense of it, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis of such data. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting with a read alignment file. The tools are optimized for memory efficiency and speed, thus allowing the processing of large data volumes on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design in that the output from one application can serve as input to another one. Complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/.
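The modular, cascade-style design described here is essentially function composition over a shared intermediate format. A small sketch of that pattern (the step names and data are hypothetical, not actual ChIP-Seq tool names):

```python
from functools import reduce

def cascade(*steps):
    """Compose analysis steps so each step's output feeds the next one."""
    return lambda data: reduce(lambda d, step: step(d), steps, data)

# Hypothetical steps over a list of per-position read counts
dedupe = lambda counts: sorted(set(counts))         # drop duplicate values
threshold = lambda counts: [c for c in counts if c >= 5]  # keep enriched positions

pipeline = cascade(dedupe, threshold)
```

Because every step consumes and produces the same kind of data, any sequence of tools can be chained without glue code, which is the interoperability property the abstract emphasizes.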

  19. Enhancing Thematic Units Using the World Wide Web: Tools and Strategies for Students with Mild Disabilities.

    ERIC Educational Resources Information Center

    Gardner, J. Emmett; Wissick, Cheryl A.

    2002-01-01

    This article presents principles for using Web-based activities to support curriculum accommodations for students with mild disabilities. Tools, resources, and strategies are identified to help teachers construct meaningful and Web-enhanced thematic units. Web sites are listed in the areas of math, science, language arts, and social studies;…

  20. Finding, Browsing and Getting Data Easily Using SPDF Web Services

    NASA Technical Reports Server (NTRS)

    Candey, R.; Chimiak, R.; Harris, B.; Johnson, R.; Kovalick, T.; Lal, N.; Leckner, H.; Liu, M.; McGuire, R.; Papitashvili, N.; hide

    2010-01-01

    The NASA GSFC Space Physics Data Facility (SPDF) provides heliophysics science-enabling information services for enhancing scientific research and enabling integration of these services into the Heliophysics Data Environment paradigm, via standards-based Simple Object Access Protocol (SOAP) and Representational State Transfer (REST) web services in addition to web browser, FTP, and OPeNDAP interfaces. We describe these interfaces and the philosophies behind these web services, and show how to call them from various languages, such as IDL and Perl. We are working towards a "one simple line to call" philosophy extolled in the recent VxO discussions. Combining data from many instruments and missions enables broad research analysis and correlation and coordination with other experiments and missions.
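
    Because the REST interfaces are plain URL requests, they can be called from essentially any language. As a sketch of the general pattern only (the service root, route layout, and dataset name below are hypothetical, not actual SPDF endpoints), in Python:

```python
from urllib.parse import urljoin

def build_rest_query(base_url, dataset, start, stop, variables):
    """Build a REST-style data request URL.

    The endpoint layout, dataset name, and parameter encoding here are
    illustrative only; they are not the actual SPDF routes.
    """
    path = f"datasets/{dataset}/data/{start},{stop}/{','.join(variables)}"
    return urljoin(base_url, path)

url = build_rest_query(
    "https://example.gov/ws/",                 # hypothetical service root
    "OMNI_HRO_1MIN",
    "20100101T000000Z", "20100102T000000Z",
    ["BX_GSE", "BY_GSE"],
)
print(url)
```

    Fetching the resulting URL (e.g. with the standard library's urllib.request) is then the "one simple line to call" that the abstract describes.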

  1. Gobe: an interactive, web-based tool for comparative genomic visualization.

    PubMed

    Pedersen, Brent S; Tang, Haibao; Freeling, Michael

    2011-04-01

    Gobe is a web-based tool for viewing comparative genomic data. It supports viewing multiple genomic regions simultaneously. Its simple text format and flash-based rendering make it an interactive, exploratory research tool. Gobe can be used without installation through our web service, or downloaded and customized with stylesheets and javascript callback functions. Gobe is a flash application that runs in all modern web-browsers. The full source-code, including that for the online web application is available under the MIT license at: http://github.com/brentp/gobe. Sample applications are hosted at http://try-gobe.appspot.com/ and http://synteny.cnr.berkeley.edu/gobe-app/.

  2. Informal Learning through Expertise Mining in the Social Web

    ERIC Educational Resources Information Center

    Valencia-Garcia, Rafael; Garcia-Sanchez, Francisco; Casado-Lumbreras, Cristina; Castellanos-Nieves, Dagoberto; Fernandez-Breis, Jesualdo Tomas

    2012-01-01

    The advent of Web 2.0, also called the Social Web, has changed the way people interact with the Web. Assisted by the technologies associated with this new trend, users now play a much more active role as content providers. This Web paradigm shift has also changed how companies operate and interact with their employees, partners and customers. The…

  3. orthoFind Facilitates the Discovery of Homologous and Orthologous Proteins.

    PubMed

    Mier, Pablo; Andrade-Navarro, Miguel A; Pérez-Pulido, Antonio J

    2015-01-01

    Finding homologous and orthologous protein sequences is often the first step in evolutionary studies, annotation projects, and experiments of functional complementation. Despite all currently available computational tools, there is a requirement for easy-to-use tools that provide functional information. Here, a new web application called orthoFind is presented, which allows a quick search for homologous and orthologous proteins given one or more query sequences, allowing a recurrent and exhaustive search against reference proteomes, and being able to include user databases. It addresses the protein multidomain problem, searching for homologs with the same domain architecture, and gives a simple functional analysis of the results to help in the annotation process. orthoFind is easy to use and has been proven to provide accurate results with different datasets. Availability: http://www.bioinfocabd.upo.es/orthofind/.

  4. Component Technology for High-Performance Scientific Simulation Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epperly, T; Kohn, S; Kumfert, G

    2000-11-09

    We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.

  5. Wet Lab Accelerator: A Web-Based Application Democratizing Laboratory Automation for Synthetic Biology.

    PubMed

    Bates, Maxwell; Berliner, Aaron J; Lachoff, Joe; Jaschke, Paul R; Groban, Eli S

    2017-01-20

    Wet Lab Accelerator (WLA) is a cloud-based tool that allows a scientist to conduct biology via robotic control without the need for any programming knowledge. A drag and drop interface provides a convenient and user-friendly method of generating biological protocols. Graphically developed protocols are turned into programmatic instruction lists required to conduct experiments at the cloud laboratory Transcriptic. Prior to the development of WLA, biologists were required to write in a programming language called "Autoprotocol" in order to work with Transcriptic. WLA relies on a new abstraction layer we call "Omniprotocol" to convert the graphical experimental description into lower level Autoprotocol language, which then directs robots at Transcriptic. While WLA has only been tested at Transcriptic, the conversion of graphically laid out experimental steps into Autoprotocol is generic, allowing extension of WLA into other cloud laboratories in the future. WLA hopes to democratize biology by bringing automation to general biologists.
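
    The conversion WLA performs can be pictured as emitting structured JSON instructions from graphical input. The snippet below is an illustration of that idea only: the field names are a simplified, hypothetical subset, not the full Autoprotocol schema.

```python
import json

def transfer_step(src, dest, volume_ul):
    """Emit one simplified, Autoprotocol-style pipette instruction.

    Field names are illustrative; the real schema is richer.
    """
    return {
        "op": "pipette",
        "groups": [{"transfer": [{
            "from": src, "to": dest, "volume": f"{volume_ul}:microliter"
        }]}],
    }

# A toy "protocol": one plate reference and one liquid transfer.
protocol = {"refs": {"plate_1": {"new": "96-flat"}},
            "instructions": [transfer_step("plate_1/A1", "plate_1/B1", 50)]}
print(json.dumps(protocol, indent=2))
```

    An abstraction layer like Omniprotocol would sit one level above this, so the same graphical description could be lowered to different cloud laboratories' instruction formats.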

  6. WebProtégé: A Collaborative Ontology Editor and Knowledge Acquisition Tool for the Web

    PubMed Central

    Tudorache, Tania; Nyulas, Csongor; Noy, Natalya F.; Musen, Mark A.

    2012-01-01

    In this paper, we present WebProtégé—a lightweight ontology editor and knowledge acquisition tool for the Web. With the wide adoption of Web 2.0 platforms and the gradual adoption of ontologies and Semantic Web technologies in the real world, we need ontology-development tools that are better suited for the novel ways of interacting, constructing and consuming knowledge. Users today take Web-based content creation and online collaboration for granted. WebProtégé integrates these features as part of the ontology development process itself. We tried to lower the entry barrier to ontology development by providing a tool that is accessible from any Web browser, has extensive support for collaboration, and a highly customizable and pluggable user interface that can be adapted to any level of user expertise. The declarative user interface enabled us to create custom knowledge-acquisition forms tailored for domain experts. We built WebProtégé using the existing Protégé infrastructure, which supports collaboration on the back end side, and the Google Web Toolkit for the front end. The generic and extensible infrastructure allowed us to easily deploy WebProtégé in production settings for several projects. We present the main features of WebProtégé and its architecture and describe briefly some of its uses for real-world projects. WebProtégé is free and open source. An online demo is available at http://webprotege.stanford.edu. PMID:23807872

  7. WebGeocalc and Cosmographia: Modern Tools to Access SPICE Archives

    NASA Astrophysics Data System (ADS)

    Semenov, B. V.; Acton, C. H.; Bachman, N. J.; Ferguson, E. W.; Rose, M. E.; Wright, E. D.

    2017-06-01

    The WebGeocalc (WGC) web client-server tool and the SPICE-enhanced Cosmographia visualization program are two new ways for accessing space mission geometry data provided in the PDS SPICE kernel archives and by mission operational SPICE kernel sets.

  8. Use Cases for Combining Web Services with ArcPython Tools for Enabling Quality Control of Land Remote Sensing Data Products.

    NASA Astrophysics Data System (ADS)

    Krehbiel, C.; Maiersperger, T.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.

    2016-12-01

    Three major obstacles facing big Earth data users include data storage, management, and analysis. As the amount of satellite remote sensing data increases, so does the need for better data storage and management strategies to exploit the plethora of data now available. Standard GIS tools can help big Earth data users who interact with and analyze increasingly large and diverse datasets. In this presentation we highlight how NASA's Land Processes Distributed Active Archive Center (LP DAAC) is tackling these big Earth data challenges. We provide a real-life use case example to describe three tools and services provided by the LP DAAC to more efficiently exploit big Earth data in a GIS environment. First, we describe the Open-source Project for a Network Data Access Protocol (OPeNDAP), which enables requests for specific subsets of data, minimizing the amount of data that a user downloads and improving the efficiency of data downloading and processing. Next, we cover the LP DAAC's Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), a web application interface for extracting and analyzing land remote sensing data. From there, we review an ArcPython toolbox that was developed to provide quality control services to land remote sensing data products. Locating and extracting specific subsets of larger big Earth datasets improves data storage and management efficiency for the end user, and quality control services provide a straightforward interpretation of big Earth data. These tools and services are beneficial to the GIS user community in terms of standardizing workflows and improving data storage, management, and analysis tactics.

  9. a Web-Based Interactive Tool for Multi-Resolution 3d Models of a Maya Archaeological Site

    NASA Astrophysics Data System (ADS)

    Agugiaro, G.; Remondino, F.; Girardi, G.; von Schwerin, J.; Richards-Rissetto, H.; De Amicis, R.

    2011-09-01

    Continuous technological advances in surveying, computing and digital-content delivery are strongly contributing to a change in the way Cultural Heritage is "perceived": new tools and methodologies for documentation, reconstruction and research are being created to assist not only scholars, but also to reach more potential users (e.g. students and tourists) willing to access more detailed information about art history and archaeology. 3D computer-simulated models, sometimes set in virtual landscapes, offer for example the chance to explore possible hypothetical reconstructions, while on-line GIS resources can help interactive analyses of relationships and change over space and time. While for some research purposes a traditional 2D approach may suffice, this is not the case for more complex analyses concerning spatial and temporal features of architecture, such as the relationship of architecture and landscape, visibility studies, etc. The project therefore aims at creating a tool, called "QueryArch3D", which enables the web-based visualisation and queries of an interactive, multi-resolution 3D model in the framework of Cultural Heritage. More specifically, a complete Maya archaeological site, located in Copan (Honduras), has been chosen as case study to test and demonstrate the platform's capabilities. Much of the site has been surveyed and modelled at different levels of detail (LoD) and the geometric model has been semantically segmented and integrated with attribute data gathered from several external data sources. The paper describes the characteristics of the research work, along with its implementation issues and the initial results of the developed prototype.

  10. Cold Calling and Web Postings: Do They Improve Students' Preparation and Learning in Statistics?

    ERIC Educational Resources Information Center

    Levy, Dan

    2014-01-01

    Getting students to prepare well for class is a common challenge faced by instructors all over the world. This study investigates the effects that two frequently used techniques to increase student preparation--web postings and cold calling--have on student outcomes. The study is based on two experiments and a qualitative study conducted in a…

  11. Evaluating the Effectiveness of Web-based Climate Resilience Decision Support Tools: Insights from Coastal New Jersey

    NASA Astrophysics Data System (ADS)

    Brady, M.; Lathrop, R.; Auermuller, L. M.; Leichenko, R.

    2016-12-01

    Despite the recent surge of Web-based decision support tools designed to promote resiliency in U.S. coastal communities, to-date there has been no systematic study of their effectiveness. This study demonstrates a method to evaluate important aspects of effectiveness of four Web map tools designed to promote consideration of climate risk information in local decision-making and planning used in coastal New Jersey. In summer 2015, the research team conducted in-depth phone interviews with users of one regulatory and three non-regulatory Web map tools using a semi-structured questionnaire. The interview and analysis design drew from a combination of effectiveness evaluation approaches developed in software and information usability, program evaluation, and management information system (MIS) research. Effectiveness assessment results were further analyzed and discussed in terms of conceptual hierarchy of system objectives defined by respective tool developer and user organizations represented in the study. Insights from the interviews suggest that users rely on Web tools as a supplement to desktop and analog map sources because they provide relevant and up-to-date information in a highly accessible and mobile format. The users also reported relying on multiple information sources and comparison between digital and analog sources for decision support. However, with respect to this decision support benefit, users were constrained by accessibility factors such as lack of awareness and training with some tools, lack of salient information such as planning time horizons associated with future flood scenarios, and environmental factors such as mandates restricting some users to regulatory tools. Perceptions of Web tool credibility seem favorable overall, but factors including system design imperfections and inconsistencies in data and information across platforms limited trust, highlighting a need for better coordination between tools. 
    Contributions of the study include user feedback on web-tool system designs consistent with collaborative methods for enhancing usability and a systematic look at effectiveness that includes both user perspectives and consideration of developer and organizational objectives.

  12. The Impact of Self-Efficacy and Professional Development on Implementation of Web 2.0 Tools in Elementary Classrooms

    ERIC Educational Resources Information Center

    Ward, Stephen

    2015-01-01

    This study sought to understand the impact of self-efficacy and professional development on the implementation of specific Web 2.0 tools in the elementary classroom. There were three research questions addressed in this QUAN-Qual study. Quantitative data were collected through three surveys with 48 total participants: the Web 2.0 tools Utilization…

  13. Oh! Web 2.0, Virtual Reference Service 2.0, Tools and Techniques (I): A Basic Approach

    ERIC Educational Resources Information Center

    Arya, Harsh Bardhan; Mishra, J. K.

    2011-01-01

    This study targets librarians and information professionals who use Web 2.0 tools and applications with a view to providing snapshots on how Web 2.0 technologies are used. It also aims to identify values and impact that such tools have exerted on libraries and their services, as well as to detect various issues associated with the implementation…

  14. Education and Technology in the 21st Century Experiences of Adult Online Learners Using Web 2.0

    ERIC Educational Resources Information Center

    Bryant, Wanda L.

    2014-01-01

    The emergence of a knowledge-based and technology-driven economy has prompted adults to seek additional knowledge and skills that will enable them to participate effectively in society. The rapid growth and popularity of the internet tools such as Web 2.0 tools have revolutionized adult learning. Through the rich support of Web 2.0 tools, adult…

  15. Interpreting User's Choice of Technologies: A Quantitative Research on Choosing the Best Web-Based Communication Tools

    ERIC Educational Resources Information Center

    Adebiaye, Richmond

    2010-01-01

    The proliferation of web-based communication tools like email clients vis-a-vis Yahoo mail, Gmail, and Hotmail have led to new innovations in web-based communication. Email users benefit greatly from this technology, but lack of security of these tools can put users at risk of loss of privacy, including identity theft, corporate espionage, and…

  16. SVAw - a web-based application tool for automated surrogate variable analysis of gene expression studies

    PubMed Central

    2013-01-01

    Background Surrogate variable analysis (SVA) is a powerful method to identify, estimate, and utilize the components of gene expression heterogeneity due to unknown and/or unmeasured technical, genetic, environmental, or demographic factors. These sources of heterogeneity are common in gene expression studies, and failing to incorporate them into the analysis can obscure results. Using SVA increases the biological accuracy and reproducibility of gene expression studies by identifying these sources of heterogeneity and correctly accounting for them in the analysis. Results Here we have developed a web application called SVAw (Surrogate variable analysis Web app) that provides a user-friendly interface for SVA analyses of genome-wide expression studies. The software has been developed based on the open-source Bioconductor SVA package. In our software, we have extended the SVA program functionality in three aspects: (i) the SVAw performs a fully automated and user-friendly analysis workflow; (ii) it calculates probe/gene statistics for both pre- and post-SVA analysis and provides a table of results for the regression of gene expression on the primary variable of interest before and after correcting for surrogate variables; and (iii) it generates a comprehensive report file, including graphical comparison of the outcome for the user. Conclusions SVAw is a freely accessible web server solution for the surrogate variable analysis of high-throughput datasets and facilitates removing all unwanted and unknown sources of variation. It is freely available for use at http://psychiatry.igm.jhmi.edu/sva. The executable packages for both web and standalone application and instructions for installation can be downloaded from our web site. PMID:23497726

  17. A Randomized Controlled Trial of COMPASS Web-Based and Face-to-Face Teacher Coaching in Autism

    PubMed Central

    Ruble, Lisa A.; McGrew, John H.; Toland, Michael D.; Dalrymple, Nancy J.; Jung, Lee Ann

    2013-01-01

    Objective Most children with autism rely on schools as their primary source of intervention, yet research has suggested that teachers rarely use evidence-based practices. To address the need for improved educational outcomes, a previously tested consultation intervention called the Collaborative Model for Promoting Competence and Success (COMPASS; Ruble, Dalrymple, & McGrew, 2010; Ruble, Dalrymple, & McGrew, 2012) was evaluated in a 2nd randomized controlled trial, with the addition of a web-based group. Method Forty-nine teacher–child dyads were randomized into 1 of 3 groups: (1) a placebo control (PBO) group, (2) COMPASS followed by face-to-face (FF) coaching sessions, and (3) COMPASS followed by web-based (WEB) coaching sessions. Three individualized goals (social, communication, and independence skills) were selected for intervention for each child. The primary outcome of independent ratings of child goal attainment and several process measures (e.g., consultant and teacher fidelity) were evaluated. Results Using an intent-to-treat approach, findings replicated earlier results with a very large effect size (d = 1.41) for the FF group and a large effect size (d = 1.12) for the WEB group relative to the PBO group. There were no differences in overall change across goal domains between the FF and WEB groups, suggesting the efficacy of videoconferencing technology. Conclusions COMPASS is effective and results in improved educational outcomes for young children with autism. Videoconferencing technology, as a scalable tool, has promise for facilitating access to autism specialists and bridging the research-to-practice gap. PMID:23438314

  18. Web tools for predictive toxicology model building.

    PubMed

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has a history of more than 15 years. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms, offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  19. An online planning tool for designing terrace layouts

    USDA-ARS?s Scientific Manuscript database

    A web-based conservation planning tool, WebTERLOC (web-based Terrace Location Program), was developed to provide multiple terrace layout options using digital elevation model (DEM) and geographic information systems (GIS). Development of a terrace system is complicated by the time-intensive manual ...

  20. The Sargassum Early Advisory System (SEAS)

    NASA Astrophysics Data System (ADS)

    Armstrong, D.; Gallegos, S. C.

    2016-02-01

    The Sargassum Early Advisory System (SEAS) web-app was designed to automatically detect Sargassum at sea, forecast movement of the seaweed, and alert users of potential landings. Inspired to help address the economic hardships caused by large landings of Sargassum, the web app automates and enhances the manual tasks conducted by the SEAS group of Texas A&M University at Galveston. The SEAS web app is a modular, mobile-friendly tool that automates the entire workflow from data acquisition to user management. The modules include: 1) an Imagery Retrieval Module to automatically download Landsat-8 Operational Land Imager (OLI) imagery from the United States Geological Survey (USGS); 2) a Processing Module for automatic detection of Sargassum in the OLI imagery, and subsequent mapping of these patches in the HYCOM grid, producing maps that show Sargassum clusters; 3) a Forecasting engine fed by HYbrid Coordinate Ocean Model (HYCOM) currents and winds from weather buoys; and 4) a mobile phone optimized geospatial user interface. The user can view the last known position of Sargassum clusters, trajectory and location projections for the next 24, 72 and 168 hrs. Users can also subscribe to alerts generated for particular areas. Currently, the SEAS web app produces advisories for Texas beaches. The forecasted Sargassum landing locations are validated by reports from Texas beach managers. However, the SEAS web app was designed to easily expand to other areas, and future plans call for extending the SEAS web app to Mexico and the Caribbean islands. The SEAS web app development is led by NASA, with participation by ASRC Federal/Computer Science Corporation, and the Naval Research Laboratory, all at Stennis Space Center, and Texas A&M University at Galveston.

  1. Teaching with technology: automatically receiving information from the internet and web.

    PubMed

    Wink, Diane M

    2010-01-01

    In this bimonthly series, the author examines how nurse educators can use the Internet and Web-based computer technologies such as search, communication, and collaborative writing tools, social networking and social bookmarking sites, virtual worlds, and Web-based teaching and learning programs. This article presents information and tools related to automatically receiving information from the Internet and Web.

  2. Experimental evaluation of the impact of packet capturing tools for web services.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choe, Yung Ryn; Mohapatra, Prasant; Chuah, Chen-Nee

    Network measurement is a discipline that provides the techniques to collect data that are fundamental to many branches of computer science. While many capturing tools and comparisons have been made available in the literature and elsewhere, the impact of these packet capturing tools on existing processes has not been thoroughly studied. While not a concern for collection methods in which dedicated servers are used, many usage scenarios of packet capturing now require the packet capturing tool to run concurrently with operational processes. In this work we perform experimental evaluations of the performance impact that packet capturing processes have on web-based services; in particular, we observe the impact on web servers. We find that packet capturing processes indeed impact the performance of web servers, but on a multi-core system the impact varies depending on whether the packet capturing and web hosting processes are co-located or not. In addition, the architecture and behavior of the web server and process scheduling are coupled with the behavior of the packet capturing process, which in turn also affects the web server's performance.

  3. SG-ADVISER mtDNA: a web server for mitochondrial DNA annotation with data from 200 samples of a healthy aging cohort.

    PubMed

    Rueda, Manuel; Torkamani, Ali

    2017-08-18

    Whole genome and exome sequencing usually include reads containing mitochondrial DNA (mtDNA). Yet, state-of-the-art pipelines and services for human nuclear genome variant calling and annotation do not handle mitochondrial genome data appropriately. As a consequence, any researcher desiring to add mtDNA variant analysis to their investigations is forced to explore the literature for mtDNA pipelines, evaluate them, and implement their own instance of the desired tool. This task is far from trivial, and can be prohibitive for non-bioinformaticians. We have developed SG-ADVISER mtDNA, a web server to facilitate the analysis and interpretation of mtDNA genomic data coming from next generation sequencing (NGS) experiments. The server was built in the context of our SG-ADVISER framework and on top of the MtoolBox platform (Calabrese et al., Bioinformatics 30(21):3115-3117, 2014), and includes most of its functionalities (i.e., assembly of mitochondrial genomes, heteroplasmic fractions, haplogroup assignment, functional and prioritization analysis of mitochondrial variants) as well as a back-end and a front-end interface. The server has been tested with unpublished data from 200 individuals of a healthy aging cohort (Erikson et al., Cell 165(4):1002-1011, 2016) and their data is made publicly available here along with a preliminary analysis of the variants. We observed that individuals over ~90 years old carried low levels of heteroplasmic variants in their genomes. SG-ADVISER mtDNA is a fast and functional tool that allows for variant calling and annotation of human mtDNA data coming from NGS experiments. The server was built with simplicity in mind, and builds on our own experience in interpreting mtDNA variants in the context of sudden death and rare diseases. Our objective is to provide an interface for non-bioinformaticians aiming to acquire (or contrast) mtDNA annotations via MToolBox. 
    SG-ADVISER web server is freely available to all users at https://genomics.scripps.edu/mtdna .
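
    One of the quantities reported above, the heteroplasmic fraction, is simply the proportion of reads supporting the variant allele at a given mtDNA position. A minimal sketch of that computation (a hypothetical helper, not code from SG-ADVISER or MToolBox):

```python
def heteroplasmic_fraction(allele_depths, alt_allele):
    """Fraction of reads supporting alt_allele at one mtDNA position.

    allele_depths: mapping of allele -> read count, e.g. {"A": 95, "G": 5}.
    Returns 0.0 when there is no coverage at the position.
    """
    total = sum(allele_depths.values())
    if total == 0:
        return 0.0
    return allele_depths.get(alt_allele, 0) / total

# A low-level heteroplasmy, like those reported for the oldest individuals:
print(heteroplasmic_fraction({"A": 95, "G": 5}, "G"))  # 0.05
```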

  4. A Performance-Based Web Budget Tool

    ERIC Educational Resources Information Center

    Abou-Sayf, Frank K.; Lau, Wilson

    2007-01-01

    A web-based formula-driven tool has been developed for the purpose of performing two distinct academic department budgeting functions: allocation funding to the department, and budget management by the department. The tool's major features are discussed and its uses demonstrated. The tool's advantages are presented. (Contains 10 figures.)

  5. Virus-Clip: a fast and memory-efficient viral integration site detection tool at single-base resolution with annotation capability.

    PubMed

    Ho, Daniel W H; Sze, Karen M F; Ng, Irene O L

    2015-08-28

    Viral integration into the human genome upon infection is an important risk factor for various human malignancies. We developed a viral integration site detection tool, called Virus-Clip, which makes use of information extracted from soft-clipped sequencing reads to identify exact positions of human and virus breakpoints of integration events. With initial read alignment to the virus reference genome and streamlined procedures, Virus-Clip delivers a simple, fast and memory-efficient solution to viral integration site detection. Moreover, it can also automatically annotate the integration events with the corresponding affected human genes. Virus-Clip has been verified using whole-transcriptome sequencing data and its detection was validated to have satisfactory sensitivity and specificity. Marked advancement in performance was observed compared to existing tools. It is applicable to versatile types of data including whole-genome sequencing, whole-transcriptome sequencing, and targeted sequencing. Virus-Clip is available at http://web.hku.hk/~dwhho/Virus-Clip.zip.
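
    The soft-clip information such a tool exploits lives in the SAM CIGAR string: a leading or trailing S operation marks where the read stops matching the reference, i.e. a candidate breakpoint. A minimal sketch of recovering breakpoint positions from a CIGAR (an illustration of the idea, not Virus-Clip's actual code):

```python
import re

def softclip_breakpoints(pos, cigar):
    """Reference breakpoints implied by soft clips in a CIGAR string.

    pos: 1-based leftmost mapped position of the read. A leading S clip
    puts a breakpoint at `pos`; a trailing S clip puts one at the end of
    the aligned span. Returns a list of (position, side) tuples.
    """
    ops = re.findall(r"(\d+)([MIDNSHP=X])", cigar)
    breakpoints = []
    if ops and ops[0][1] == "S":
        breakpoints.append((pos, "left"))
    # Reference bases are consumed by M, D, N, = and X operations only.
    ref_len = sum(int(n) for n, op in ops if op in "MDN=X")
    if ops and ops[-1][1] == "S":
        breakpoints.append((pos + ref_len - 1, "right"))
    return breakpoints

print(softclip_breakpoints(1000, "20S80M"))  # [(1000, 'left')]
print(softclip_breakpoints(1000, "80M20S"))  # [(1079, 'right')]
```

    Pairing the clipped-off portion of the read (realigned to the other genome) with these positions is what yields the matched human and virus breakpoints at single-base resolution.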

  6. Googling DNA sequences on the World Wide Web.

    PubMed

    Hajibabaei, Mehrdad; Singer, Gregory A C

    2009-11-10

    New web-based technologies provide an excellent opportunity for sharing and accessing information and for using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm and implemented it for searching species-specific genomic sequences, DNA barcodes, by using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm based on dividing a sequence library (DNA barcodes) and the query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages. We developed pre- and post-processing software to provide customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data. It provides a robust and efficient solution for sequence search on the web. The integration of our search method for large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
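
    The word-division idea can be sketched as follows: decompose each library sequence and the query into overlapping k-letter words, then rank library entries by how many words they share with the query. This is a minimal illustration of the principle (word length and scoring are arbitrary choices here, not the published implementation):

```python
def words(seq, k=8):
    """Set of overlapping k-letter words in a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def search(library, query, k=8):
    """Rank library entries by the number of k-words shared with the query."""
    qwords = words(query, k)
    scores = {name: len(qwords & words(seq, k)) for name, seq in library.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy "barcode" library; a real one would hold full DNA barcode records.
library = {
    "speciesA": "ACGTACGTGGCCTTAA",
    "speciesB": "TTTTCCCCGGGGAAAA",
}
query = "ACGTACGTGGCC"
print(search(library, query, k=8))  # [('speciesA', 5), ('speciesB', 0)]
```

    In the published approach the word matching itself is delegated to a conventional text-search engine; the pre-processing step that turns sequences into indexable words is the key move.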

  7. SeqMule: automated pipeline for analysis of human exome/genome sequencing data.

    PubMed

    Guo, Yunfei; Ding, Xiaolei; Shen, Yufeng; Lyon, Gholson J; Wang, Kai

    2015-09-18

    Next-generation sequencing (NGS) technology has greatly helped us identify disease-contributory variants for Mendelian diseases. However, users are often faced with issues such as software compatibility, complicated configuration, and lack of access to high-performance computing facilities. Discrepancies exist among aligners and variant callers. We developed a computational pipeline, SeqMule, to perform automated variant calling from NGS data on human genomes and exomes. SeqMule integrates computational-cluster-free parallelization capability built on top of the variant callers, and facilitates normalization/intersection of variant calls to generate a high-confidence consensus set. SeqMule integrates 5 alignment tools and 5 variant calling algorithms and accepts various combinations via a single one-line command, allowing highly flexible yet fully automated variant calling. On a modern machine (2 Intel Xeon X5650 CPUs, 48 GB memory), when fast turn-around is needed, SeqMule generates annotated VCF files in a day from a 30X whole-genome sequencing data set; when more accurate calling is needed, SeqMule generates a consensus call set that improves over single callers, as measured by both Mendelian error rate and consistency. SeqMule supports Sun Grid Engine for parallel processing, offers a turn-key solution for deployment on Amazon Web Services, and provides quality checks, Mendelian error checks, consistency evaluation, and HTML-based reports. SeqMule is available at http://seqmule.openbioinformatics.org.
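    The consensus step described above reduces to a majority-style intersection over the call sets produced by different variant callers. A hedged sketch of that idea (the variant key and support threshold are assumptions; SeqMule's real normalization of VCF records is more involved):

```python
from collections import Counter

def consensus_calls(call_sets, min_support=2):
    """Keep variants reported by at least `min_support` callers.
    Each call set is a set of (chrom, pos, ref, alt) tuples after
    normalization. Sketch of consensus-by-intersection, not SeqMule."""
    counts = Counter(v for calls in call_sets for v in calls)
    return {v for v, n in counts.items() if n >= min_support}
```

    Variants seen by only one caller, which are the most likely false positives, are filtered out, which is why the consensus set tends to improve on any single caller.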

  8. indCAPS: A tool for designing screening primers for CRISPR/Cas9 mutagenesis events.

    PubMed

    Hodgens, Charles; Nimchuk, Zachary L; Kieber, Joseph J

    2017-01-01

    Genetic manipulation of organisms using CRISPR/Cas9 technology generally produces small insertions/deletions (indels) that can be difficult to detect. Here, we describe a technique to easily and rapidly identify such indels. Sequence-identified mutations that alter a restriction enzyme recognition site can be readily distinguished from wild-type alleles using a cleaved amplified polymorphic sequence (CAPS) technique. If a restriction site is created or altered by the mutation such that only one allele contains the restriction site, a polymerase chain reaction (PCR) followed by a restriction digest can be used to distinguish the two alleles. However, in the case of most CRISPR-induced alleles, no such restriction sites are present in the target sequences. In this case, a derived CAPS (dCAPS) approach can be used, in which mismatches are purposefully introduced in the oligonucleotide primers to create a restriction site in one, but not both, of the amplified templates. Web-based tools exist to aid dCAPS primer design, but when supplied with sequences that include indels, the current tools often fail to suggest appropriate primers. Here, we report the development of indCAPS, a Python-based, species-agnostic web tool for the design of PCR primers used in dCAPS assays that is compatible with indels. This tool should have wide utility for screening editing events following CRISPR/Cas9 mutagenesis as well as for identifying specific editing events in a pool of CRISPR-mediated mutagenesis events. The tool was field-tested in a CRISPR mutagenesis experiment targeting a cytokinin receptor (AHK3) in Arabidopsis thaliana. It suggested primers that successfully distinguished between wild-type and edited alleles of a target locus and facilitated the isolation of two novel ahk3 null alleles. Users can access indCAPS and design PCR primers to employ dCAPS to identify CRISPR/Cas9 alleles at http://indcaps.kieber.cloudapps.unc.edu/.
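    The CAPS screening logic that indCAPS builds on is easy to state: a digest distinguishes two alleles only when the enzyme's recognition site occurs in exactly one of the two amplicons. A minimal sketch of that check (indCAPS itself goes further, introducing primer mismatches to create such a site when none exists):

```python
def distinguishes(site, wt_amplicon, mut_amplicon):
    """True if the recognition site occurs in exactly one of the two
    amplicon sequences, so a restriction digest separates the alleles
    on a gel. Sketch of the CAPS screening idea only; real designs
    must also check cut-fragment sizes and strand orientation."""
    return (site in wt_amplicon) != (site in mut_amplicon)
```

    For example, an edit that disrupts an EcoRI site (GAATTC) in the mutant amplicon leaves the wild-type band cut and the mutant band uncut.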

  9. Teaching Tectonics to Undergraduates with Web GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curricula can support spatial learning. We present Web-based visualization and analysis tools developed with Javascript APIs to enhance tectonics curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in the exploration of earthquake and volcano data sets, a subduction and elevation profile tool which facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS platform is independent and can be implemented on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation using two- and three-dimensional visualization and analytical software. Coverages that allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  10. WebPresent: a World Wide Web-based telepresentation tool for physicians

    NASA Astrophysics Data System (ADS)

    Sampath-Kumar, Srihari; Banerjea, Anindo; Moshfeghi, Mehran

    1997-05-01

    In this paper, we present the design architecture and implementation status of WebPresent, a World Wide Web-based telepresentation tool. This tool allows a physician to use a conference server workstation to present patient cases to a geographically distributed audience. The audience consists of other physicians collaborating on patients' health care management and physicians participating in continuing medical education. These physicians are at several locations, connected by networks of different bandwidths and capabilities. Audiences also receive the patient case information on different computers, ranging from high-end display workstations to laptops with low-resolution displays. WebPresent is a scalable networked multimedia tool which supports the presentation of hypertext, images, audio, video, and a whiteboard to remote physicians with hospital Intranet access. WebPresent allows the audience to receive customized information; the data received can differ in resolution and bandwidth, depending on the availability of resources such as display resolution and network bandwidth.

  11. ROSA: Resource-Oriented Service Management Schemes for Web of Things in a Smart Home

    PubMed Central

    Chen, Peng-Yu

    2017-01-01

    A pervasive-computing-enriched smart home environment, which contains many embedded and tiny intelligent devices and sensors coordinated by service management mechanisms, is capable of anticipating the intentions of occupants and providing appropriate services accordingly. Although there is a wealth of research achievements in recent years, the degree of market acceptance is still low. The main reason is that most of the devices and services in such environments depend on a particular platform or technology, making it hard to develop an application by composing the devices or services. Meanwhile, the concept of the Web of Things (WoT) has recently become popular. Based on WoT, developers can build applications using popular web tools or technologies. Consequently, the objective of this paper is to propose a set of novel WoT-driven plug-and-play service management schemes for a smart home, called Resource-Oriented Service Administration (ROSA). We have implemented an application prototype, and experiments are performed to show the effectiveness of the proposed approach. The results of this research can be a foundation for realizing the vision of “end user programmable smart environments”. PMID:28934159

  12. MAPI: a software framework for distributed biomedical applications

    PubMed Central

    2013-01-01

    Background The amount of web-based resources (databases, tools etc.) in biomedicine has increased, but the integrated usage of those resources is complex due to differences in access protocols and data formats. However, distributed data processing is becoming inevitable in several domains, in particular in biomedicine, where researchers face rapidly increasing data sizes. This big data is difficult to process locally because of the large processing, memory and storage capacity required. Results This manuscript describes a framework, called MAPI, which provides a uniform representation of resources available over the Internet, in particular for Web Services. The framework enhances their interoperability and collaborative use by enabling a uniform and remote access. The framework functionality is organized in modules that can be combined and configured in different ways to fulfil concrete development requirements. Conclusions The framework has been tested in the biomedical application domain where it has been a base for developing several clients that are able to integrate different web resources. The MAPI binaries and documentation are freely available at http://www.bitlab-es.com/mapi under the Creative Commons Attribution-No Derivative Works 2.5 Spain License. The MAPI source code is available by request (GPL v3 license). PMID:23311574

  13. Web-accessible molecular modeling with Rosetta: The Rosetta Online Server that Includes Everyone (ROSIE).

    PubMed

    Moretti, Rocco; Lyskov, Sergey; Das, Rhiju; Meiler, Jens; Gray, Jeffrey J

    2018-01-01

    The Rosetta molecular modeling software package provides a large number of experimentally validated tools for modeling and designing proteins, nucleic acids, and other biopolymers, with new protocols being added continually. While the software is freely available to academic users, external usage is limited by the need for expertise in the Unix command line environment. To make Rosetta protocols available to a wider audience, we previously created a web server called Rosetta Online Server that Includes Everyone (ROSIE), which provides a common environment for hosting web-accessible Rosetta protocols. Here we describe a simplification of the ROSIE protocol specification format that permits easier implementation of Rosetta protocols. Whereas the previous format required creating multiple separate files in different locations, the new format allows specification of the protocol in a single file. This new, simplified protocol specification has more than doubled the number of Rosetta protocols available under ROSIE. These new applications include pKa determination, lipid accessibility calculation, ribonucleic acid redesign, protein-protein docking, protein-small molecule docking, symmetric docking, antibody docking, cyclic toxin docking, critical binding peptide determination, and mapping of small-molecule binding sites. ROSIE is freely available to academic users at http://rosie.rosettacommons.org. © 2017 The Protein Society.

  14. An Examination of Teachers' Integration of Web 2.0 Technologies in Secondary Classrooms: A Phenomenological Study

    ERIC Educational Resources Information Center

    Wang, Ling

    2013-01-01

    Web 2.0 tools may be able to close the digital gap between teachers and students if teachers can integrate the tools and change their pedagogy. The TPACK framework has outlined the elements needed to effect change, and research on Web 2.0 tools shows its potential as a change agent, but little research has looked at how the two interrelate. Using…

  15. MICRA: an automatic pipeline for fast characterization of microbial genomes from high-throughput sequencing data.

    PubMed

    Caboche, Ségolène; Even, Gaël; Loywick, Alexandre; Audebert, Christophe; Hot, David

    2017-12-19

    The increase in available sequence data has advanced the field of microbiology; however, making sense of these data without bioinformatics skills is still problematic. We describe MICRA, an automatic pipeline, available as a web interface, for microbial identification and characterization through reads analysis. MICRA uses iterative mapping against reference genomes to identify genes and variations. Additional modules allow prediction of antibiotic susceptibility and resistance and comparing the results of several samples. MICRA is fast, producing few false-positive annotations and variant calls compared to current methods, making it a tool of great interest for fully exploiting sequencing data.

  16. Using a Learning Styles Inventory to Examine Student Satisfaction with Web-Based Instruction: A 15-Year Study of One Professor's Web-Based Course Instruction

    ERIC Educational Resources Information Center

    Olliges, Ralph

    2017-01-01

    This article examines Active Engagement, Active Communication, and Peer Engagement learning practices among various student groups. It examines which tools are most important for increasing student satisfaction with web-based and web-enhanced instruction. Second, it looks at how different tools lead to greater satisfaction among different types of…

  17. TrawlerWeb: an online de novo motif discovery tool for next-generation sequencing datasets.

    PubMed

    Dang, Louis T; Tondl, Markus; Chiu, Man Ho H; Revote, Jerico; Paten, Benedict; Tano, Vincent; Tokolyi, Alex; Besse, Florence; Quaife-Ryan, Greg; Cumming, Helen; Drvodelic, Mark J; Eichenlaub, Michael P; Hallab, Jeannette C; Stolper, Julian S; Rossello, Fernando J; Bogoyevitch, Marie A; Jans, David A; Nim, Hieu T; Porrello, Enzo R; Hudson, James E; Ramialison, Mirana

    2018-04-05

    A strong focus of the post-genomic era is mining of the non-coding regulatory genome in order to unravel the function of regulatory elements that coordinate gene expression (Nat 489:57-74, 2012; Nat 507:462-70, 2014; Nat 507:455-61, 2014; Nat 518:317-30, 2015). Whole-genome approaches based on next-generation sequencing (NGS) have provided insight into the genomic location of regulatory elements throughout different cell types, organs and organisms. These technologies are now widespread and commonly used in laboratories from various fields of research. This highlights the need for fast and user-friendly software tools dedicated to extracting cis-regulatory information contained in these regulatory regions; for instance transcription factor binding site (TFBS) composition. Ideally, such tools should not require prior programming knowledge to ensure they are accessible for all users. We present TrawlerWeb, a web-based version of the Trawler_standalone tool (Nat Methods 4:563-5, 2007; Nat Protoc 5:323-34, 2010), to allow for the identification of enriched motifs in DNA sequences obtained from next-generation sequencing experiments in order to predict their TFBS composition. TrawlerWeb is designed for online queries with standard options common to web-based motif discovery tools. In addition, TrawlerWeb provides three unique new features: 1) TrawlerWeb allows the input of BED files directly generated from NGS experiments, 2) it automatically generates an input-matched biologically relevant background, and 3) it displays resulting conservation scores for each instance of the motif found in the input sequences, which assists the researcher in prioritising the motifs to validate experimentally. Finally, to date, this web-based version of Trawler_standalone remains the fastest online de novo motif discovery tool compared to other popular web-based software, while generating predictions with high accuracy. 
TrawlerWeb provides users with a fast, simple and easy-to-use web interface for de novo motif discovery. This will assist in rapidly analysing NGS datasets that are now being routinely generated. TrawlerWeb is freely available and accessible at: http://trawler.erc.monash.edu.au .
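    The heart of any such de novo motif tool is scoring candidate words by over-representation in the input relative to a matched background, which is why TrawlerWeb's automatic background generation matters. A toy sketch of the enrichment idea for a single fixed motif (a deliberate simplification; Trawler itself enumerates and clusters many candidate words, and the scoring here is an assumption):

```python
def motif_enrichment(motif, input_seqs, background_seqs):
    """Crude enrichment score: frequency of a motif in the input
    sequences relative to a matched background. A ratio well above 1
    suggests over-representation. Illustrative only."""
    def freq(seqs):
        hits = sum(s.count(motif) for s in seqs)
        positions = sum(max(len(s) - len(motif) + 1, 1) for s in seqs)
        return hits / positions
    bg = freq(background_seqs)
    return freq(input_seqs) / bg if bg else float("inf")
```

    A background matched to the input (in length, composition, and origin) keeps this ratio from being inflated by trivial sequence biases, which is the point of TrawlerWeb's input-matched background feature.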

  18. Assessing the Effect of Web-Based Learning Tools on Student Understanding of Stoichiometry Using Knowledge Space Theory

    ERIC Educational Resources Information Center

    Arasasingham, Ramesh D.; Taagepera, Mare; Potter, Frank; Martorell, Ingrid; Lonjers, Stacy

    2005-01-01

    Student achievement with web-based learning tools is assessed using in-class examinations, pretests, and posttests. The study reveals that using mastering chemistry web software in large-scale instruction provides an overall benefit to introductory chemistry students.

  19. READY: a web-based geographical information system for enhanced flood resilience through raising awareness in citizens

    NASA Astrophysics Data System (ADS)

    Albano, R.; Sole, A.; Adamowski, J.

    2015-02-01

    As evidenced by the EU Floods Directive (2007/60/EC), flood management strategies in Europe have undergone a shift in focus in recent years. The goal of flood prevention using structural measures has been replaced by an emphasis on the management of flood risks using non-structural measures. One implication of this is that it is no longer public authorities alone who take responsibility for flood management. A broader range of stakeholders, who may experience the negative effects of flooding, also take on responsibility to protect themselves. Therefore, it is vital that information concerning flood risks are conveyed to those who may be affected in order to facilitate the self-protection of citizens. Experience shows that even where efforts have been made to communicate flood risks, problems persist. There is a need for the development of new tools, which are able to rapidly disseminate flood risk information to the general public. To be useful, these tools must be able to present information relevant to the location of the user. Moreover, the content and design of the tool need to be adjusted to laypeople's needs. Dissemination and communication influences both people's access to and understanding of natural risk information. Such a tool could be a useful aid to effective management of flood risks. To address this gap, a Web-based Geographical Information System, (WebGIS), has been developed through the collaborative efforts of a group of scientists, hazard and risk analysts and managers, GIS analysts, system developers and communication designers. This tool, called "READY: Risk, Extreme Events, Adaptation, Defend Yourself", aims to enhance the general public knowledge of flood risk, making them more capable of responding appropriately during a flood event. The READY WebGIS has allowed for the visualization and easy querying of a complex hazard and risk database thanks to a high degree of interactivity and its easily readable maps. 
In this way, READY has enabled fast exploration of alternative flood scenarios or past calamitous events. Combined also with a system of graphic symbols designed ad hoc for communication of self-protection behaviors, it is believed READY could lead to an increase in citizen participation, informed discussion and consensus building. The platform has been developed for a site-specific application, i.e. the Basilicata Region, Italy, has been selected as pilot application area. The goal of the prototype is to raise citizen awareness of flood risks, and to build social capacity and enhanced resilience to flood events.

  20. READY: a web-based geographical information system for enhanced flood resilience through raising awareness in citizens

    NASA Astrophysics Data System (ADS)

    Albano, R.; Sole, A.; Adamowski, J.

    2015-07-01

    As evidenced by the EU Floods Directive (2007/60/EC), flood management strategies in Europe have undergone a shift in focus in recent years. The goal of flood prevention using structural measures has been replaced by an emphasis on the management of flood risks using non-structural measures. One implication of this is that it is no longer public authorities alone who take responsibility for flood management. A broader range of stakeholders, who may personally experience the negative effects of flooding, also take on responsibility for protecting themselves. Therefore, it is vital that information concerning flood risks is conveyed to those who may be affected in order to facilitate the self-protection of citizens. Experience shows that problems persist even where efforts have been made to communicate flood risks. There is a need for the development of new tools that are able to rapidly disseminate flood-risk information to the general public. To be useful, these tools must be able to present information relevant to the location of the user. Moreover, the content and design of the tool need to be adjusted to laypeople's needs. Dissemination and communication influence both people's access to and understanding of natural risk information. Such a tool could be a useful aid to effective management of flood risks. To address this gap, a web-based geographical information system (WebGIS) has been developed through the collaborative efforts of a group of scientists, hazard and risk analysts and managers, GIS analysts, system developers and communication designers. This tool, called "READY: Risk, Extreme Events, Adaptation, Defend Yourself", aims to enhance the general public's knowledge of flood risk, making citizens more capable of responding appropriately during a flood event. The READY WebGIS has allowed for the visualization and easy querying of a complex hazard and risk database thanks to a high degree of interactivity and easily read maps. 
In this way, READY has enabled fast exploration of alternative flood scenarios or past calamitous events. Combined also with a system of graphic symbols designed ad hoc for communication of self-protection behaviours, it is believed READY could lead to an increase in citizen participation, informed discussion and consensus building. The platform has been developed for a site-specific application: the Basilicata region, Italy, has been selected as pilot application area. The goal of the prototype is to raise citizen awareness of flood risks and to build social capacity and enhanced resilience to flood events.

  1. Web 2 Technologies for Net Native Language Learners: A "Social CALL"

    ERIC Educational Resources Information Center

    Karpati, Andrea

    2009-01-01

    In order to make optimal educational use of social spaces offered by thousands of international communities in the second generation web applications termed Web 2 or Social Web, ICT competences as well as social skills are needed for both teachers and learners. The paper outlines differences in competence structures of Net Natives (who came of age…

  2. Learning in a Sheltered Internet Environment: The Use of WebQuests

    ERIC Educational Resources Information Center

    Segers, Eliane; Verhoeven, Ludo

    2009-01-01

    The present study investigated the effects on learning in a sheltered Internet environment using so-called WebQuests in elementary school classrooms in the Netherlands. A WebQuest is an assignment presented together with a series of web pages to help guide children's learning. The learning gains and quality of the work of 229 sixth graders…

  3. A Framework for Transparently Accessing Deep Web Sources

    ERIC Educational Resources Information Center

    Dragut, Eduard Constantin

    2010-01-01

    An increasing number of Web sites expose their content via query interfaces, many of them offering the same type of products/services (e.g., flight tickets, car rental/purchasing). They constitute the so-called "Deep Web". Accessing the content on the Deep Web has been a long-standing challenge for the database community. For a user interested in…

  4. User Interface Requirements for Web-Based Integrated Care Pathways: Evidence from the Evaluation of an Online Care Pathway Investigation Tool.

    PubMed

    Balatsoukas, Panos; Williams, Richard; Davies, Colin; Ainsworth, John; Buchan, Iain

    2015-11-01

    Integrated care pathways (ICPs) define a chronological sequence of steps, most commonly diagnostic or treatment, to be followed in providing care for patients. Care pathways help to ensure quality standards are met and to reduce variation in practice. Although research on the computerisation of ICPs progresses, there is still little knowledge of the requirements for designing user-friendly and usable electronic care pathways, or of how users (normally health care professionals) interact with interfaces that support the design, analysis and visualisation of ICPs. The purpose of the study reported in this paper was to address this gap by evaluating the usability of a novel web-based tool called COCPIT (Collaborative Online Care Pathway Investigation Tool). COCPIT supports the design, analysis and visualisation of ICPs at the population level. To address the aim of this study, an evaluation methodology was designed based on heuristic evaluations and a mixed-method usability test. The results showed that modular visualisation and direct manipulation of information related to the design and analysis of ICPs is useful for engaging and stimulating users. However, designers should pay attention to issues related to the visibility of the system status and the match between the system and the real world, especially in relation to the display of statistical information about care pathways and the editing of clinical information within a care pathway. The paper concludes with recommendations for interface design.

  5. Secure, web-accessible call rosters for academic radiology departments.

    PubMed

    Nguyen, A V; Tellis, W M; Avrin, D E

    2000-05-01

    Traditionally, radiology department call rosters have been posted via paper and bulletin boards. Frequently, changes to these lists are made by multiple people independently, but often not synchronized, resulting in confusion among the house staff and technical staff as to who is on call and when. In addition, multiple and disparate copies exist in different sections of the department, and changes made would not be propagated to all the schedules. To eliminate such difficulties, a paperless call scheduling application was developed. Our call scheduling program allowed Java-enabled web access to a database by designated personnel from each radiology section who have privileges to make the necessary changes. Once a person made a change, everyone accessing the database would see the modification. This eliminates the chaos resulting from people swapping shifts at the last minute and not having the time to record or broadcast the change. Furthermore, all changes to the database were logged. Users are given a log-in name and password and can only edit their section; however, all personnel have access to all sections' schedules. Our applet was written in Java 2 using the latest technology in database access. We access our Interbase database through the DataExpress and DB Swing (Borland, Scotts Valley, CA) components. The result is secure access to the call rosters via the web. There are many advantages to the web-enabled access, mainly the ability for people to make changes and have the changes recorded and propagated in a single virtual location and available to all who need to know.
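    The design described above amounts to a single shared data store with per-section write permissions and an append-only change log. A toy sketch of that logic (class and field names are hypothetical; the actual system used Java applets over an Interbase database):

```python
import datetime

class CallRoster:
    """Toy model of the paper's scheme: one shared roster, per-section
    edit rights, universal read access, and a log of every change."""
    def __init__(self):
        self.shifts = {}   # (section, date) -> radiologist on call
        self.log = []      # append-only audit trail of edits

    def assign(self, user, user_section, section, date, radiologist):
        # Users may edit only their own section's schedule.
        if user_section != section:
            raise PermissionError(f"{user} may only edit {user_section}")
        self.shifts[(section, date)] = radiologist
        self.log.append((datetime.datetime.now().isoformat(),
                         user, section, date, radiologist))
```

    Because every change goes through one store and is logged, a last-minute shift swap is immediately visible to everyone who reads the roster, which is the chaos the paper set out to eliminate.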

  6. Web Audio/Video Streaming Tool

    NASA Technical Reports Server (NTRS)

    Guruvadoo, Eranna K.

    2003-01-01

    In order to promote a NASA-wide educational outreach program to educate and inform the public about space exploration, NASA, at Kennedy Space Center, is seeking efficient ways to add more content to the web by streaming audio/video files. This project proposes a high-level overview of a framework for the creation, management, and scheduling of audio/video assets over the web. To support short-term goals, the prototype of a web-based tool is designed and demonstrated to automate the process of streaming audio/video files. The tool provides web-enabled user interfaces to manage video assets, create publishable schedules of video assets for streaming, and schedule the streaming events. These operations are performed on user-defined and system-derived metadata of audio/video assets stored in a relational database, while the assets reside in a separate repository. The prototype tool is designed using ColdFusion 5.0.

  7. A decade of Web Server updates at the Bioinformatics Links Directory: 2003-2012.

    PubMed

    Brazas, Michelle D; Yim, David; Yeung, Winston; Ouellette, B F Francis

    2012-07-01

    The 2012 Bioinformatics Links Directory update marks the 10th special Web Server issue from Nucleic Acids Research. Beginning with content from their 2003 publication, the Bioinformatics Links Directory in collaboration with Nucleic Acids Research has compiled and published a comprehensive list of freely accessible, online tools, databases and resource materials for the bioinformatics and life science research communities. The past decade has exhibited significant growth and change in the types of tools, databases and resources being put forth, reflecting both technology changes and the nature of research over that time. With the addition of 90 web server tools and 12 updates from the July 2012 Web Server issue of Nucleic Acids Research, the Bioinformatics Links Directory at http://bioinformatics.ca/links_directory/ now contains an impressive 134 resources, 455 databases and 1205 web server tools, mirroring the continued activity and efforts of our field.

  8. Testing simple deceptive honeypot tools

    NASA Astrophysics Data System (ADS)

    Yahyaoui, Aymen; Rowe, Neil C.

    2015-05-01

    Deception can be a useful defensive technique against cyber-attacks; it has the advantage of unexpectedness to attackers and offers a variety of tactics. Honeypots are a good tool for deception. They act as decoy computers to confuse attackers and exhaust their time and resources. This work tested the effectiveness of two free honeypot tools in real networks by varying their location and virtualization, and the effects of adding more deception to them. We tested a Web honeypot tool, Glastopf, and an SSH honeypot tool, Kippo. We deployed the Web honeypot in both a residential network and our organization's network, and as both real and virtual machines; the organization honeypot attracted more attackers starting in the third week. Results also showed that the virtual honeypots received attacks from more unique IP addresses. They also showed that adding deception to the Web honeypot, in the form of additional linked Web pages and interactive features, generated more interest by attackers. For the purpose of comparison, we examined log files of a legitimate Web site, www.cmand.org. The traffic distributions for the Web honeypot and the legitimate Web site showed similarities (with much malicious traffic from Brazil), but the SSH honeypot was different (with much malicious traffic from China). Contrary to previous experiments where traffic to static honeypots decreased quickly, our honeypots received increasing traffic over a period of three months. It appears that both honeypot tools are useful for providing intelligence about cyber-attack methods, and that additional deception is helpful.

  9. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications

    PubMed Central

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-01-01

    Background Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. Results The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. Conclusion The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. 
Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools. PMID:18021453

  10. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications.

    PubMed

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-11-19

    Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. 
Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools.
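The microformat mechanism described in this record can be illustrated with a short sketch: structured data is embedded in ordinary HTML via a marker class and recovered by a scraper. The `gaggle-data` class name and the page snippet below are assumptions for illustration, not the actual Firegoose microformat.

```python
from html.parser import HTMLParser

class MicroformatScraper(HTMLParser):
    """Collect the text content of elements tagged with a marker class."""
    def __init__(self, marker="gaggle-data"):
        super().__init__()
        self.marker = marker
        self._capturing = False
        self.records = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if self.marker in classes:
            self._capturing = True
            self.records.append("")

    def handle_data(self, data):
        if self._capturing:
            self.records[-1] += data

    def handle_endtag(self, tag):
        self._capturing = False

html = ('<p>Genes: <span class="gaggle-data">VNG1179C</span>'
        ' and <span class="gaggle-data">VNG2094G</span></p>')
scraper = MicroformatScraper()
scraper.feed(html)
print(scraper.records)  # ['VNG1179C', 'VNG2094G']
```

A browser extension like Firegoose can hand records recovered this way directly to desktop tools, which is what makes embedded structured data more robust than screen scraping free-form text.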

  11. WebQuests as Language-Learning Tools

    ERIC Educational Resources Information Center

    Aydin, Selami

    2016-01-01

    This study presents a review of the literature that examines WebQuests as tools for second-language acquisition and foreign language-learning processes to guide teachers in their teaching activities and researchers in further research on the issue. The study first introduces the theoretical background behind WebQuest use in the mentioned…

  12. Revel8or: Model Driven Capacity Planning Tool Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Liming; Liu, Yan; Bui, Ngoc B.

    2007-05-31

    Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8tor. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of designmore » diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.« less

  13. EPEPT: A web service for enhanced P-value estimation in permutation tests

    PubMed Central

    2011-01-01

    Background In computational biology, permutation tests have become a widely used tool to assess the statistical significance of an event under investigation. However, the common way of computing the P-value, which expresses the statistical significance, requires a very large number of permutations when small (and thus interesting) P-values are to be accurately estimated. This is computationally expensive and often infeasible. Recently, we proposed an alternative estimator, which requires far fewer permutations compared to the standard empirical approach while still reliably estimating small P-values [1]. Results The proposed P-value estimator has been enriched with additional functionalities and is made available to the general community through a public website and web service, called EPEPT. This means that the EPEPT routines can be accessed not only via a website, but also programmatically using any programming language that can interact with the web. Examples of web service clients in multiple programming languages can be downloaded. Additionally, EPEPT accepts data of various common experiment types used in computational biology. For these experiment types EPEPT first computes the permutation values and then performs the P-value estimation. Finally, the source code of EPEPT can be downloaded. Conclusions Different types of users, such as biologists, bioinformaticians and software engineers, can use the method in an appropriate and simple way. Availability http://informatics.systemsbiology.net/EPEPT/ PMID:22024252
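The standard empirical estimator that EPEPT improves upon can be sketched in a few lines; the enhanced tail estimator from the paper is not reproduced here. A minimal sketch, assuming a two-sample difference-of-means statistic:

```python
import random

def permutation_p_value(x, y, n_perm=10000, seed=0):
    """Two-sample permutation test on the difference of means.

    Returns the standard empirical estimator (b + 1) / (n + 1), where b is
    the number of permuted statistics at least as extreme as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    b = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        px, py = pooled[:len(x)], pooled[len(x):]
        stat = abs(sum(px) / len(px) - sum(py) / len(py))
        if stat >= observed:
            b += 1
    return (b + 1) / (n_perm + 1)

x = [2.9, 3.0, 2.5, 2.6, 3.2]
y = [3.8, 2.7, 4.0, 2.4]
print(permutation_p_value(x, y))
```

The smallest P-value this estimator can report is 1/(n_perm + 1), which is exactly why very small P-values require prohibitively many permutations and motivate the enhanced estimation the record describes.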

  14. MultiSETTER: web server for multiple RNA structure comparison.

    PubMed

    Čech, Petr; Hoksza, David; Svozil, Daniel

    2015-08-12

    Understanding the architecture and function of RNA molecules requires methods for comparing and analyzing their tertiary and quaternary structures. While structural superposition of short RNAs is achievable in a reasonable time, large structures represent a much bigger challenge. Therefore, we have developed a fast and accurate algorithm for RNA pairwise structure superposition called SETTER and implemented it in the SETTER web server. However, though biological relationships can be inferred by a pairwise structure alignment, key features preserved by evolution can be identified only from a multiple structure alignment. Thus, we extended the SETTER algorithm to the alignment of multiple RNA structures and developed the MultiSETTER algorithm. In this paper, we present the updated version of the SETTER web server that implements a user-friendly interface to the MultiSETTER algorithm. The server accepts RNA structures either as a list of PDB IDs or as user-defined PDB files. After the superposition is computed, structures are visualized in 3D and several reports and statistics are generated. To the best of our knowledge, the MultiSETTER web server is the first publicly available tool for multiple RNA structure alignment. The MultiSETTER server offers visual inspection of an alignment in 3D space, which may reveal structural and functional relationships not captured by other multiple alignment methods based either on a sequence or on secondary structure motifs.
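Pairwise structure superposition is typically scored by RMSD after an optimal rigid-body fit. The sketch below uses the generic Kabsch algorithm on bare coordinate arrays; SETTER itself uses its own secondary-structure-based method, so this illustrates only the underlying superposition idea, not the server's algorithm.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (N, 3) coordinate sets after optimal superposition
    (Kabsch algorithm): center both sets, find the best rotation via SVD."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])          # fix a possible reflection
    R = Vt.T @ D @ U.T
    P_rot = P @ R.T
    return float(np.sqrt(((P_rot - Q) ** 2).sum() / len(P)))

# A rigidly rotated copy should superpose exactly (RMSD ~ 0).
rng = np.random.default_rng(1)
P = rng.normal(size=(10, 3))
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
print(round(kabsch_rmsd(P @ Rz.T, P), 6))  # → 0.0
```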

  15. Free and Easy to Use Web Based Presentation and Classroom Tools

    ERIC Educational Resources Information Center

    Jensen, Jennifer; Tunon, Johanna

    2012-01-01

    A number of free Web-based tools are available for distance librarians to create presentations and online assignments. The relative merits of presentation tools like Dabbleboard, Jing, Prezi, Tildee, 280 Slides, and Glogster, and classroom tools like Make Beliefs Comix, Picviewer, Photopeach, and Wordle are assessed for ease of use by distance…

  16. A web-based resource for promoting equity in midwifery education and training: Towards meaningful diversity and inclusion.

    PubMed

    Effland, Kristin J; Hays, Karen

    2018-06-01

    Increasing the midwifery workforce requires that aspiring midwives complete education and training, but structural racism and microaggressions impact the lives of underrepresented midwifery students and apprentices, adding stressors and disparities to the usual demanding educational challenges. In order to be resilient, students rely on preceptors, faculty, administrators and institutions to promote equity. Equity-focused learning environments improve student experiences and success rates, and better prepare all students to provide culturally humble and sensitive care to diverse childbearing persons and to attain other essential competencies outlined by the International Confederation of Midwives. The comprehensive web-based resource, www.equitymidwifery.org, is designed to support midwifery educators in promoting equity and social justice in midwifery education and training. The website highlights examples, provides tools including original webinar content, and encourages visitors to attend virtual strategy and collaboration calls. It offers a model of continuous professional development that is easily accessible. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. LifeWatchGreece Portal development: architecture, implementation and challenges for a biodiversity research e-infrastructure.

    PubMed

    Gougousis, Alexandros; Bailly, Nicolas

    2016-01-01

    Biodiversity data is characterized by its cross-disciplinary character, the extremely broad range of data types and structures, and the plethora of different data sources providing resources for the same piece of information in a heterogeneous way. Since the web's inception two decades ago, there have been multiple initiatives to connect, aggregate, share, and publish biodiversity data, and to establish data and work flows in order to analyze them. The European program LifeWatch aims at establishing a distributed network of nodes implementing virtual research environments in Europe to facilitate the work of biodiversity researchers and managers. LifeWatchGreece is one of these nodes, where a portal was developed offering access to a suite of virtual laboratories and e-services. Despite its strict definition in information technology, in practice "portal" is a fairly broad term that embraces many web architectures. In the biodiversity domain, the term "portal" is usually used to indicate either a web site that provides access to a single data repository or an aggregation of data repositories (like http://indiabiodiversity.org/, http://www.mountainbiodiversity.org/, http://data.freshwaterbiodiversity.eu), a web site that gathers information about various online biodiversity tools (like http://test-eubon.ebd.csic.es/, http://marine.lifewatch.eu/), or a web site that simply gathers information and news about the biodiversity domain (like http://chm.moew.government.bg). LifeWatchGreece's portal takes the concept of a portal a step further. In strict IT terms, LifeWatchGreece's portal is partly a portal, partly a platform and partly an aggregator. It includes a number of biodiversity-related web tools integrated into a centrally-controlled software ecosystem. This ecosystem includes subsystems for access control, traffic monitoring, user notifications and web tool management.
These subsystems are shared by all the web tools that have been integrated into the portal and are thereby part of this ecosystem. Unlike in most other portals, these web tools are not external, completely independent web applications. A quite obvious (to the user) indication of this is the Single Sign-On (SSO) functionality for all tools and the common user interface wrapper that most of these tools use. A less obvious example is the common user profile that is shared and can be utilized by all tools (e.g. the user's time zone).

  18. Isomorphic semantic mapping of variant call format (VCF2RDF).

    PubMed

    Penha, Emanuel Diego S; Iriabho, Egiebade; Dussaq, Alex; de Oliveira, Diana Magalhães; Almeida, Jonas S

    2017-02-15

    The move of computational genomics workflows to Cloud Computing platforms is associated with a new level of integration and interoperability that challenges existing data representation formats. The Variant Calling Format (VCF) is in a particularly sensitive position in that regard, with both clinical and consumer-facing analysis tools relying on this self-contained description of genomic variation in Next Generation Sequencing (NGS) results. In this report we identify an isomorphic map between VCF and the reference Resource Description Framework. RDF is advanced by the World Wide Web Consortium (W3C) to enable representations of linked data that are both distributed and discoverable. The resulting ability to decompose VCF reports of genomic variation without loss of context addresses the need to modularize and govern NGS pipelines for Precision Medicine. Specifically, it provides the flexibility (i.e. the indexing) needed to support the wide variety of clinical scenarios and patient-facing governance where only part of the VCF data is fitting. Software libraries with a claim to be both domain-facing and consumer-facing have to pass the test of portability across the variety of devices that those consumers in fact adopt. That is, ideally the implementation should itself take place within the space defined by web technologies. Consequently, the isomorphic mapping function was implemented in JavaScript, and was tested in a variety of environments and devices, client and server side alike. These range from web browsers in mobile phones to the most popular micro service platform, NodeJS. The code is publicly available at https://github.com/ibl/VCFr , with a live deployment at: http://ibl.github.io/VCFr/ . jonas.almeida@stonybrookmedicine.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
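The idea of a context-preserving decomposition can be sketched by mapping a single VCF data line to subject-predicate-object triples. The base URI and predicate names below are invented for illustration and do not reproduce the VCF2RDF vocabulary (the actual mapping is the JavaScript implementation at the repository cited in the record).

```python
def vcf_line_to_triples(line, base="http://example.org/vcf/"):
    """Map one VCF data line to (subject, predicate, object) triples.
    The predicate vocabulary here is illustrative, not the VCF2RDF schema."""
    fields = ["CHROM", "POS", "ID", "REF", "ALT", "QUAL", "FILTER", "INFO"]
    values = line.rstrip("\n").split("\t")
    chrom, pos = values[0], values[1]
    subject = f"{base}{chrom}:{pos}"       # one resource per variant site
    triples = [(subject, f"{base}{k.lower()}", v)
               for k, v in zip(fields, values)]
    # INFO is itself key=value pairs; decompose it without losing context,
    # since every triple still points back to the same variant subject.
    for pair in values[7].split(";"):
        key, _, val = pair.partition("=")
        triples.append((subject, f"{base}info/{key}", val or "true"))
    return triples

line = "20\t14370\trs6054257\tG\tA\t29\tPASS\tNS=3;DP=14"
for t in vcf_line_to_triples(line):
    print(t)
```

Because each triple carries the variant's subject URI, any part of the record can be shared or indexed on its own, which is the modularity argument the abstract makes.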

  19. The Virtual Learning Commons (VLC): Enabling Co-Innovation Across Disciplines

    NASA Astrophysics Data System (ADS)

    Pennington, D. D.; Gandara, A.; Del Rio, N.

    2014-12-01

    A key challenge for scientists addressing grand-challenge problems is identifying, understanding, and integrating potentially relevant methods, models and tools that are rapidly evolving in the informatics community. Such tools are essential for effectively integrating data and models in complex research projects, yet it is often difficult to know what tools are available, and it is not easy to understand or evaluate how they might be used in a given research context. The goal of the National Science Foundation-funded Virtual Learning Commons (VLC) is to improve awareness and understanding of emerging methodologies and technologies, facilitate individual and group evaluation of these, and trace the impact of innovations within and across teams, disciplines, and communities. The VLC is a Web-based social bookmarking site designed specifically to support knowledge exchange in research communities. It is founded on well-developed models of technology adoption, diffusion of innovation, and experiential learning. The VLC makes use of Web 2.0 (Social Web) and Web 3.0 (Semantic Web) approaches. Semantic Web approaches enable discovery of potentially relevant methods, models, and tools, while Social Web approaches enable collaborative learning about their function. The VLC is under development and the first release is expected Fall 2014.

  20. VOTable JAVA Streaming Writer and Applications.

    NASA Astrophysics Data System (ADS)

    Kulkarni, P.; Kembhavi, A.; Kale, S.

    2004-07-01

    Virtual Observatory related tools use a new standard for data transfer called the VOTable format. This is a variant of the XML format that enables easy transfer of data over the web. We describe a streaming interface that can bridge the VOTable format, through a user-friendly graphical interface, with the FITS and ASCII formats, which are commonly used by astronomers. A streaming interface is important for efficient use of memory because of the large size of catalogues. The tools are developed in JAVA to provide a platform independent interface. We have also developed a stand-alone version that can be used to convert data stored in ASCII or FITS format on a local machine. The streaming writer is successfully being used in VOPlot (see Kale et al. 2004 for a description of VOPlot). We present the test results of converting huge FITS and ASCII data into the VOTable format on machines that have only limited memory.
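The streaming idea is that rows are written out as they are read, so memory use stays constant regardless of catalogue size. A minimal sketch (in Python rather than the JAVA of the original tool, and emitting only a small subset of the VOTable schema):

```python
def stream_votable(rows, fields):
    """Yield a minimal VOTable document chunk by chunk, so arbitrarily
    large catalogues never need to be held in memory at once."""
    yield '<?xml version="1.0"?>\n<VOTABLE version="1.3">\n<RESOURCE>\n<TABLE>\n'
    for name, datatype in fields:
        yield f'<FIELD name="{name}" datatype="{datatype}"/>\n'
    yield '<DATA><TABLEDATA>\n'
    for row in rows:                       # rows may be a lazy iterator
        cells = "".join(f"<TD>{cell}</TD>" for cell in row)
        yield f"<TR>{cells}</TR>\n"
    yield '</TABLEDATA></DATA>\n</TABLE>\n</RESOURCE>\n</VOTABLE>\n'

fields = [("RA", "double"), ("DEC", "double")]
rows = iter([(10.68, 41.27), (83.82, -5.39)])   # could be a FITS/ASCII reader
doc = "".join(stream_votable(rows, fields))
print(doc)
```

In practice each yielded chunk would be written straight to a socket or file instead of joined into one string, which is what keeps the converter usable on machines with limited memory.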

  1. MycoCosm, an Integrated Fungal Genomics Resource

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shabalov, Igor; Grigoriev, Igor

    2012-03-16

    MycoCosm is a web-based interactive fungal genomics resource, which was first released in March 2010, in response to an urgent call from the fungal community for integration of all fungal genomes and analytical tools in one place (Pan-fungal data resources meeting, Feb 21-22, 2010, Alexandria, VA). MycoCosm integrates genomics data and analysis tools to navigate through over 100 fungal genomes sequenced at JGI and elsewhere. This resource allows users to explore fungal genomes in the context of both genome-centric analysis and comparative genomics, and promotes user community participation in data submission, annotation and analysis. MycoCosm has over 4500 unique visitors/month or 35000+ visitors/year, as well as hundreds of registered users contributing their data and expertise to this resource. Its scalable architecture allows significant expansion of the data expected from the JGI Fungal Genomics Program, its users, and integration with external resources used by the fungal community.

  2. An effective automatic procedure for testing parameter identifiability of HIV/AIDS models.

    PubMed

    Saccomani, Maria Pia

    2011-08-01

    Realistic HIV models tend to be rather complex and many recent models proposed in the literature could not yet be analyzed by traditional identifiability testing techniques. In this paper, we check a priori global identifiability of some of these nonlinear HIV models taken from the recent literature, by using a differential algebra algorithm based on previous work of the author. The algorithm is implemented in a software tool, called DAISY (Differential Algebra for Identifiability of SYstems), which has been recently released (DAISY is freely available on the web site http://www.dei.unipd.it/~pia/ ). The software can be used to automatically check global identifiability of (linear and) nonlinear models described by polynomial or rational differential equations, thus providing a general and reliable tool to test global identifiability of several HIV models proposed in the literature. It can be used by researchers with a minimum of mathematical background.
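DAISY's differential-algebra test is not reproduced here, but the problem it solves can be illustrated numerically: in the toy model dy/dt = -(a*b)*y, the parameters enter only through their product, so distinct pairs (a, b) generate identical output and are not globally identifiable from output data alone.

```python
# Toy illustration (not DAISY's differential-algebra algorithm): because a and
# b appear only as the product a*b, (a, b) = (2, 3) and (6, 1) produce the
# same trajectory and cannot be distinguished by any output measurement.
import math

def simulate(a, b, y0=1.0, times=(0.0, 0.5, 1.0, 2.0)):
    """Closed-form solution y(t) = y0 * exp(-(a*b) * t) sampled at `times`."""
    return [y0 * math.exp(-(a * b) * t) for t in times]

out1 = simulate(2.0, 3.0)
out2 = simulate(6.0, 1.0)
print(all(abs(u - v) < 1e-12 for u, v in zip(out1, out2)))  # → True
```

A priori identifiability analysis detects exactly this kind of structural degeneracy before any expensive parameter fitting is attempted.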

  3. A user-friendly tool for medical-related patent retrieval.

    PubMed

    Pasche, Emilie; Gobeill, Julien; Teodoro, Douglas; Gaudinat, Arnaud; Vishnyakova, Dina; Lovis, Christian; Ruch, Patrick

    2012-01-01

    Health-related information retrieval is complicated by the variety of nomenclatures available to name entities, since different communities of users will use different ways to name the same entity. We present in this report the development and evaluation of a user-friendly interactive Web application aimed at facilitating health-related patent search. Our tool, called TWINC, relies on a search engine tuned during several patent retrieval competitions, enhanced with intelligent interaction modules, such as chemical query, normalization and expansion. While the related-article search functionality showed promising performance, the ad hoc search produced fairly mixed results. Nonetheless, TWINC performed well during the PatOlympics competition and was appreciated by intellectual property experts. This result should be weighed against the limited evaluation sample. We can also assume that the tool can be customized for corporate search environments to process domain- and company-specific vocabularies, including non-English literature and patent reports.

  4. What if Undergraduate Students Designed Their Own Web Learning Environment? Exploring Students' Web 2.0 Mentality through Participatory Design

    ERIC Educational Resources Information Center

    Palaigeorgiou, G.; Triantafyllakos, G.; Tsinakos, A.

    2011-01-01

    Following the increasing calls for a more skeptical analysis of web 2.0 and the empowerment of learners' voices in formulating upcoming technologies, this paper elaborates on the participatory design of a web learning environment. A total of 117 undergraduate students from two Greek Informatics Departments participated in 25 participatory design…

  5. Web Accessibility in Europe and the United States: What We Are Doing to Increase Inclusion

    ERIC Educational Resources Information Center

    Wheaton, Joseph; Bertini, Patrizia

    2007-01-01

    Accessibility is hardly a new problem and certainly did not originate with the Web. Lack of access to buildings long preceded the call for accessible Web content. Although it is unlikely that rehabilitation educators look at Web page accessibility with indifference, many may also find it difficult to implement. The authors posit three reasons why…

  6. The Dockstore: enabling modular, community-focused sharing of Docker-based genomics tools and workflows

    PubMed Central

    O'Connor, Brian D.; Yuen, Denis; Chung, Vincent; Duncan, Andrew G.; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent

    2017-01-01

    As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore ( https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH). PMID:28344774

  7. The Dockstore: enabling modular, community-focused sharing of Docker-based genomics tools and workflows.

    PubMed

    O'Connor, Brian D; Yuen, Denis; Chung, Vincent; Duncan, Andrew G; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent

    2017-01-01

    As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore ( https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH).

  8. Using component technologies for web based wavelet enhanced mammographic image visualization.

    PubMed

    Sakellaropoulos, P; Costaridou, L; Panayiotakis, G

    2000-01-01

    The poor contrast detectability of mammography can be dealt with by domain specific software visualization tools. Remote desktop client access and time performance limitations of a previously reported visualization tool are addressed, aiming at more efficient visualization of mammographic image resources existing in web or PACS image servers. This effort is also motivated by the fact that at present, web browsers do not support domain-specific medical image visualization. To deal with desktop client access the tool was redesigned by exploring component technologies, enabling the integration of stand alone domain specific mammographic image functionality in a web browsing environment (web adaptation). The integration method is based on ActiveX Document Server technology. ActiveX Document is a part of Object Linking and Embedding (OLE) extensible systems object technology, offering new services in existing applications. The standard DICOM 3.0 part 10 compatible image-format specification Papyrus 3.0 is supported, in addition to standard digitization formats such as TIFF. The visualization functionality of the tool has been enhanced by including a fast wavelet transform implementation, which allows for real time wavelet based contrast enhancement and denoising operations. Initial use of the tool with mammograms of various breast structures demonstrated its potential in improving visualization of diagnostic mammographic features. Web adaptation and real time wavelet processing enhance the potential of the previously reported tool in remote diagnosis and education in mammography.
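Wavelet-based denoising of the kind described works by transforming the signal, shrinking small detail coefficients, and inverting. A one-level Haar sketch in 1-D (the tool itself operates on 2-D mammograms, and the threshold value here is arbitrary):

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet soft-threshold denoising (1-D sketch;
    a mammography tool would apply the 2-D analogue to images)."""
    x = np.asarray(signal, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)       # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)       # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft threshold
    out = np.empty_like(x)
    out[0::2] = (a + d) / np.sqrt(2)           # inverse Haar transform
    out[1::2] = (a - d) / np.sqrt(2)
    return out

noisy = np.array([1.0, 1.1, 4.0, 3.9, 2.0, 2.05, 0.0, 0.1])
print(haar_denoise(noisy, threshold=0.2))
```

With a zero threshold the transform reconstructs the input exactly; raising the threshold suppresses small pairwise fluctuations while preserving the larger structure, which is the trade-off a real-time contrast-enhancement tool tunes interactively.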

  9. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry.

    PubMed

    Schutyser, M A I; Straatsma, J; Keijzer, P M; Verschueren, M; De Jong, P

    2008-11-30

    In the framework of a cooperative EU research project (MILQ-QC-TOOL), a web-based modelling tool (WebSim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. It can be applied to existing products and processes, but also to reduce time to market for new products. Important aspects of the tool are its user-friendliness and its specifications customised to the needs of small dairy companies. To challenge the web-based tool, it was applied for optimisation of thermal treatments in 16 dairy companies producing yoghurt, fresh cream, chocolate milk and cheese. Optimisation with WebSim-MILQ resulted in concrete improvements with respect to risk of microbial contamination, cheese yield, fouling and production costs. In this paper we illustrate the use of WebSim-MILQ for optimisation of a cheese milk pasteurisation process, where we could increase the cheese yield (1 extra cheese for every 100 cheeses produced from the same amount of milk) and reduced the risk of contamination of pasteurised cheese milk with thermoresistant streptococci from critical to negligible. In another case we demonstrate the advantage of changing from an indirect to a direct heating method for a UHT process, resulting in 80% less fouling while improving product quality and maintaining product safety.
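The safety side of thermal-treatment optimisation rests on first-order microbial inactivation kinetics. A sketch with illustrative D- and z-values (the numbers below are assumptions for demonstration, not tuned to thermoresistant streptococci or any specific organism):

```python
def log_reduction(hold_time_s, temp_c, d_ref_s=30.0, t_ref_c=72.0, z_c=7.0):
    """Decimal (log10) reductions achieved by an isothermal hold, using
    first-order inactivation: D(T) = D_ref * 10**((T_ref - T) / z).
    D_ref, T_ref and z here are illustrative, not organism-specific values."""
    d_t = d_ref_s * 10 ** ((t_ref_c - temp_c) / z_c)
    return hold_time_s / d_t

# 15 s at the 72 °C reference with a 30 s D-value gives 0.5 log reductions;
# raising the temperature by one z (to 79 °C) gives ten times more.
print(log_reduction(15.0, 72.0))            # → 0.5
print(round(log_reduction(15.0, 79.0), 6))  # → 5.0
```

A modelling tool of this kind trades off such lethality calculations against fouling and quality-loss models to find the mildest treatment that still meets the safety target.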

  10. Vocabulary Learning on Learner-Created Content by Using Web 2.0 Tools

    ERIC Educational Resources Information Center

    Eren, Omer

    2015-01-01

    The present research examined the use of Web 2.0 tools to improve students' vocabulary knowledge at the School of Foreign Languages, Gaziantep University. Current studies in literature mostly deal with descriptions of students' attitudes towards the reasons for the use of web-based platforms. However, integrating usual classroom environment with…

  11. Evaluating the Effectiveness of College Web Sites for Prospective Students

    ERIC Educational Resources Information Center

    Ford, Wendy G.

    2011-01-01

    College Web sites are often the first structured encounter a student has with a prospective college or university. Outside of serving as a marketing tool (Williams 2000), very little literature exists on the functional purpose of a college's Web site. Almost all college sites show an informational and transactional tool for currently enrolled…

  12. Web 2.0 in the Mathematics Classroom

    ERIC Educational Resources Information Center

    McCoy, Leah P.

    2014-01-01

    A key characteristic of successful mathematics teachers is that they are able to provide varied activities that promote student learning and assessment. Web 2.0 applications can provide an assortment of tools to help produce creative activities. A Web 2.0 tool enables the student to enter data and create multimedia products using text, graphics,…

  13. Change Management Meets Web 2.0

    ERIC Educational Resources Information Center

    Gale, Doug

    2008-01-01

    Web 2.0 is the term used to describe a group of web-based creativity, information-sharing, and collaboration tools including wikis, blogs, social networks, and folksonomies. The common thread in all of these tools is twofold: They enable collaboration and information sharing, and their impact on higher education has been dramatic. A recent study…

  14. Web Database Development: Implications for Academic Publishing.

    ERIC Educational Resources Information Center

    Fernekes, Bob

    This paper discusses the preliminary planning, design, and development of a pilot project to create an Internet accessible database and search tool for locating and distributing company data and scholarly work. Team members established four project objectives: (1) to develop a Web accessible database and decision tool that creates Web pages on the…

  15. Comprehensive Analysis of Semantic Web Reasoners and Tools: A Survey

    ERIC Educational Resources Information Center

    Khamparia, Aditya; Pandey, Babita

    2017-01-01

    Ontologies are emerging as the best representation techniques for knowledge-based context domains. The continuing need for interoperation, collaboration and effective information retrieval has led to the creation of the semantic web with the help of tools and reasoners which manage personalized information. The future of the semantic web lies in an ontology…

  16. AAVSO Target Tool: A Web-Based Service for Tracking Variable Star Observations (Abstract)

    NASA Astrophysics Data System (ADS)

    Burger, D.; Stassun, K. G.; Barnes, C.; Kafka, S.; Beck, S.; Li, K.

    2018-06-01

    (Abstract only) The AAVSO Target Tool is a web-based interface for bringing stars in need of observation to the attention of AAVSO's network of amateur and professional astronomers. The site currently tracks over 700 targets of interest, collecting data from them on a regular basis from AAVSO's servers and sorting them based on priority. While the target tool does not require a login, users can obtain visibility times for each target by signing up and entering a telescope location. Other key features of the site include filtering by AAVSO observing section, sorting by different variable types, formatting the data for printing, and exporting the data to a CSV file. The AAVSO Target Tool builds upon seven years of experience developing web applications for astronomical data analysis, most notably on Filtergraph (Burger, D., et al. 2013, Astronomical Data Analysis Software and Systems XXII, Astronomical Society of the Pacific, San Francisco, 399), and is built using the web2py web framework based on the Python programming language. The target tool is available at http://filtergraph.com/aavso.

  17. Examining Web 2.0 Tools Usage of Science Teacher Candidates

    ERIC Educational Resources Information Center

    Balkan Kiyici, Fatime

    2012-01-01

    Using technology in science teaching is important, and only teachers who can use these tools at an expert level can employ them in their teaching activities. This research aims, first, to identify science teacher candidates' Web 2.0 tool usage experience levels and the factors affecting those levels. In this research, the survey method…

  18. Sharik 1.0: User Needs and System Requirements for a Web-Based Tool to Support Collaborative Sensemaking

    DTIC Science & Technology

    2016-05-01

    Sharik 1.0: User Needs and System Requirements for a Web-Based Tool to Support Collaborative Sensemaking. Shadi Ghajar-Khosravi...share the new intelligence items with their peers. In this report, the authors describe Sharik (SHAring Resources, Information, and Knowledge), a web ...SHAring Resources, Information and Knowledge, that is, the sharing of resources, information, and knowledge), a Web tool that facilitates the

  19. Utilizing Web 2.0 Technologies for Library Web Tutorials: An Examination of Instruction on Community College Libraries' Websites Serving Large Student Bodies

    ERIC Educational Resources Information Center

    Blummer, Barbara; Kenton, Jeffrey M.

    2015-01-01

    This is the second part of a series on Web 2.0 tools available from community college libraries' Websites. The first article appeared in an earlier volume of this journal and it illustrated the wide variety of Web 2.0 tools on community college libraries' Websites serving large student bodies (Blummer and Kenton 2014). The research found many of…

  20. Emerging Instructional Technologies: Exploring the Extent of Faculty Use of Web 2.0 Tools at a Midwestern Community College

    ERIC Educational Resources Information Center

    Daher, Tareq; Lazarevic, Bojan

    2014-01-01

    The purpose of this research is to provide insight into the several aspects of instructional use of emerging web-based technologies. The study first explores the extent of Web 2.0 technology integration into face-to-face classroom activities. In this phase, the main focus of research interests was on the types and dynamics of Web 2.0 tools used by…

  1. Contributions of Traditional Web 1.0 Tools e.g. Email and Web 2.0 Tools e.g. Weblog towards Knowledge Management

    ERIC Educational Resources Information Center

    Dehinbo, Johnson

    2010-01-01

    The use of email utilizes the power of Web 1.0 to enable users to access their email from any computer or mobile device that is connected to the Internet, making email valuable in acquiring and transferring knowledge. But the advent of Web 2.0 and social networking seems to indicate certain limitations of email. The use of social networking seems…

  2. How To Succeed in Promoting Your Web Site: The Impact of Search Engine Registration on Retrieval of a World Wide Web Site.

    ERIC Educational Resources Information Center

    Tunender, Heather; Ervin, Jane

    1998-01-01

    Character strings were planted in a World Wide Web site (Project Whistlestop) to test indexing and retrieval rates of five Web search tools (Lycos, Infoseek, AltaVista, Yahoo, Excite). It was found that the search tools indexed few of the planted character strings, none indexed the META descriptor tag, and only Excite indexed into the 3rd-4th site…

  3. SOAP based web services and their future role in VO projects

    NASA Astrophysics Data System (ADS)

    Topf, F.; Jacquey, C.; Génot, V.; Cecconi, B.; André, N.; Zhang, T. L.; Kallio, E.; Lammer, H.; Facsko, G.; Stöckler, R.; Khodachenko, M.

    2011-10-01

    Modern state-of-the-art web services are of crucial importance for the interoperability of the different VO tools existing in the planetary community. SOAP-based web services ensure the interconnection of different data sources and tools by providing a common protocol for communication. This paper will point out a best-practice approach with the Automated Multi-Dataset Analysis Tool (AMDA) developed by CDPP, Toulouse, and the provision of VEX/MAG data from a remote database located at IWF, Graz. Furthermore, a new FP7 project, IMPEx, will be introduced with a potential usage example of AMDA web services in conjunction with simulation models.
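    The common protocol the record above refers to is the SOAP envelope. As a concrete illustration, the sketch below builds a minimal SOAP 1.1 request body with Python's standard library; the `getParameter` operation name, parameter names, and `urn:example:amda` namespace are hypothetical stand-ins, since the real AMDA services define their own WSDL-specified names.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation, params, target_ns="urn:example:amda"):
    """Build a minimal SOAP 1.1 request envelope as a string.

    `operation`, `params`, and `target_ns` are illustrative only; a real
    client would take these names from the service's WSDL.
    """
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{target_ns}}}{operation}")
    for name, value in params.items():
        child = ET.SubElement(op, f"{{{target_ns}}}{name}")
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# Hypothetical request for magnetometer data over a time interval.
xml_request = build_soap_request(
    "getParameter",
    {"parameterID": "vex_mag_1s",
     "startTime": "2008-01-01T00:00:00",
     "stopTime": "2008-01-02T00:00:00"},
)
```

    In practice the resulting XML would be POSTed to the service endpoint with an appropriate SOAPAction header; the envelope structure shown here is what gives heterogeneous VO tools a common wire format.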

  4. NOAA Miami Regional Library > Home

    Science.gov Websites

    Services & Education | Social Networking & Other Web Tools for Earth Science | Library Catalog (AOML). 4301 Rickenbacker Causeway, Miami, FL

  5. Telecommunications Relay Service

    MedlinePlus

    ... used with an existing voice telephone and a computer or other Web-enabled device without requiring any ... based TRS call. The user may use a computer or other web-enabled device to communicate with ...

  6. WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides

    NASA Astrophysics Data System (ADS)

    Ma, Xiuzeng; Li, Yingkui; Bourgeois, Mike; Caffee, Marc; Elmore, David; Granger, Darryl; Muzikar, Paul; Smith, Preston

    2007-06-01

    Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this it is critical to establish an effective, easily accessible and well defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on the nuclide concentrations measured by the accelerator mass spectrometry. WebCN for 10Be and 26Al has been finished and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for 36Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open source PostgreSQL for the database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
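    The core computation a tool like WebCN automates can be sketched for the simplest case. Assuming zero erosion and a constant, site-scaled production rate P, the nuclide concentration follows N = (P/λ)(1 − e^(−λt)), which inverts to an exposure age t = −ln(1 − Nλ/P)/λ. The function below is a minimal sketch of that inversion, not WebCN's actual code; the example production rate is an assumed value for illustration.

```python
import math

# 10Be decay constant (1/yr), from a half-life of ~1.387 Myr.
BE10_DECAY = math.log(2) / 1.387e6

def exposure_age(concentration, production_rate, decay=BE10_DECAY):
    """Simple exposure age (years) assuming zero erosion and constant production.

    Inverts N = (P / lambda) * (1 - exp(-lambda * t)). The production rate
    must already be scaled to the sample site; WebCN itself handles the
    spatial and temporal scaling of the cosmic-ray flux.
    """
    ratio = concentration * decay / production_rate
    if ratio >= 1.0:
        raise ValueError("concentration at or above secular equilibrium")
    return -math.log(1.0 - ratio) / decay

# Hypothetical sample: 3.0e5 atoms/g of 10Be, site production rate 5 at/g/yr.
age = exposure_age(3.0e5, 5.0)  # roughly 61 kyr
```

    For young samples (Nλ/P ≪ 1) the age reduces to the intuitive N/P, since decay is negligible over the exposure time.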

  7. 2B-Alert Web: An Open-Access Tool for Predicting the Effects of Sleep/Wake Schedules and Caffeine Consumption on Neurobehavioral Performance.

    PubMed

    Reifman, Jaques; Kumar, Kamal; Wesensten, Nancy J; Tountas, Nikolaos A; Balkin, Thomas J; Ramakrishnan, Sridhar

    2016-12-01

    Computational tools that predict the effects of daily sleep/wake amounts on neurobehavioral performance are critical components of fatigue management systems, allowing for the identification of periods during which individuals are at increased risk for performance errors. However, none of the existing computational tools is publicly available, and the commercially available tools do not account for the beneficial effects of caffeine on performance, limiting their practical utility. Here, we introduce 2B-Alert Web, an open-access tool for predicting neurobehavioral performance, which accounts for the effects of sleep/wake schedules, time of day, and caffeine consumption, while incorporating the latest scientific findings in sleep restriction, sleep extension, and recovery sleep. We combined our validated Unified Model of Performance and our validated caffeine model to form a single, integrated modeling framework instantiated as a Web-enabled tool. 2B-Alert Web allows users to input daily sleep/wake schedules and caffeine consumption (dosage and time) to obtain group-average predictions of neurobehavioral performance based on psychomotor vigilance tasks. 2B-Alert Web is accessible at: https://2b-alert-web.bhsai.org. The 2B-Alert Web tool allows users to obtain predictions for mean response time, mean reciprocal response time, and number of lapses. The graphing tool allows for simultaneous display of up to seven different sleep/wake and caffeine schedules. The schedules and corresponding predicted outputs can be saved as a Microsoft Excel file; the corresponding plots can be saved as an image file. The schedules and predictions are erased when the user logs off, thereby maintaining privacy and confidentiality. 
The publicly accessible 2B-Alert Web tool is available for operators, schedulers, and neurobehavioral scientists as well as the general public to determine the impact of any given sleep/wake schedule, caffeine consumption, and time of day on performance of a group of individuals. This evidence-based tool can be used as a decision aid to design effective work schedules, guide the design of future sleep restriction and caffeine studies, and increase public awareness of the effects of sleep amounts, time of day, and caffeine on alertness. © 2016 Associated Professional Sleep Societies, LLC.

  8. Hydrogen Financial Analysis Scenario Tool (H2FAST) Documentation

    Science.gov Websites

    User's manuals are available for the web and spreadsheet versions of H2FAST: the H2FAST Web Tool User's Manual and the H2FAST Spreadsheet Tool User's Manual (DRAFT). Send questions or feedback about H2FAST to H2FAST@nrel.gov.

  9. Educational and Scientific Applications of Climate Model Diagnostic Analyzer

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.

    2016-12-01

    Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threading computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed the educational and scientific applications of CMDA. Educationally, CMDA has supported the summer school of the JPL Center for Climate Sciences for three years, beginning in 2014. In the summer school, the students work on group research projects where CMDA provides datasets and analysis tools. Each student is assigned to a virtual machine with CMDA installed in Amazon Web Services. A provenance management system for CMDA has been developed to keep track of students' usage of CMDA, and to recommend datasets and analysis tools for their research topics. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case developed is described and listed in terms of a scientific goal, the datasets used, the analysis tools used, scientific results discovered from the use case, an analysis result such as output plots and data files, and a link to the exact analysis service call with all the input arguments filled. For example, one science use case is the evaluation of the NCAR CAM5 model with MODIS total cloud fraction. 
The analysis service used is Difference Plot Service of Two Variables, and the datasets used are NCAR CAM total cloud fraction and MODIS total cloud fraction. The scientific highlight of the use case is that the CAM5 model overall does a fairly decent job at simulating total cloud cover, though simulates too few clouds especially near and offshore of the eastern ocean basins where low clouds are dominant.

  10. Document Exploration and Automatic Knowledge Extraction for Unstructured Biomedical Text

    NASA Astrophysics Data System (ADS)

    Chu, S.; Totaro, G.; Doshi, N.; Thapar, S.; Mattmann, C. A.; Ramirez, P.

    2015-12-01

    We describe our work on building a web-browser based document reader with a built-in exploration tool and automatic concept extraction of medical entities for biomedical text. Vast amounts of biomedical information are offered in unstructured text form through scientific publications and R&D reports. Utilizing text mining can help us to mine information and extract relevant knowledge from a plethora of biomedical text. The ability to employ such technologies to aid researchers in coping with information overload is greatly desirable. In recent years, there has been an increased interest in automatic biomedical concept extraction [1, 2] and intelligent PDF reader tools with the ability to search on content and find related articles [3]. Such reader tools are typically desktop applications and are limited to specific platforms. Our goal is to provide researchers with a simple tool to aid them in finding, reading, and exploring documents. Thus, we propose a web-based document explorer, which we called Shangri-Docs, which combines a document reader with automatic concept extraction and highlighting of relevant terms. Shangri-Docs also provides the ability to evaluate a wide variety of document formats (e.g. PDF, Word, PPT, text, etc.) and to exploit the linked nature of the Web and personal content by performing searches on content from public sites (e.g. Wikipedia, PubMed) and private cataloged databases simultaneously. Shangri-Docs utilizes Apache cTAKES (clinical Text Analysis and Knowledge Extraction System) [4] and the Unified Medical Language System (UMLS) to automatically identify and highlight terms and concepts, such as specific symptoms, diseases, drugs, and anatomical sites, mentioned in the text. cTAKES was originally designed specifically to extract information from clinical medical records. 
    Our investigation leads us to extend the automatic knowledge extraction process of cTAKES for the biomedical research domain by improving the ontology-guided information extraction process. We will describe our experience and implementation of our system and share lessons learned from our development. We will also discuss ways in which this could be adapted to other science fields. [1] Funk et al., 2014. [2] Kang et al., 2014. [3] Utopia Documents, http://utopiadocs.com [4] Apache cTAKES, http://ctakes.apache.org

  11. Patient-oriented interactive E-health tools on U.S. hospital Web sites.

    PubMed

    Huang, Edgar; Chang, Chiu-Chi Angela

    2012-01-01

    The purpose of this study is to provide evidence for strategic planning regarding e-health development in U.S. hospitals. A content analysis of a representative sample of U.S. hospital Web sites has revealed how U.S. hospitals have taken advantage of the 21 patient-oriented interactive tools identified in this study. Significant gaps between various types of hospitals have also been found. It is concluded that although the majority of U.S. hospitals have adopted traditional functional tools, they need to make significant inroads in implementing the core e-business tools to serve their patients/users, making their Web sites more efficient marketing tools.

  12. WIRM: An Open Source Toolkit for Building Biomedical Web Applications

    PubMed Central

    Jakobovits, Rex M.; Rosse, Cornelius; Brinkley, James F.

    2002-01-01

    This article describes an innovative software toolkit that allows the creation of web applications that facilitate the acquisition, integration, and dissemination of multimedia biomedical data over the web, thereby reducing the cost of knowledge sharing. There is a lack of high-level web application development tools suitable for use by researchers, clinicians, and educators who are not skilled programmers. Our Web Interfacing Repository Manager (WIRM) is a software toolkit that reduces the complexity of building custom biomedical web applications. WIRM’s visual modeling tools enable domain experts to describe the structure of their knowledge, from which WIRM automatically generates full-featured, customizable content management systems. PMID:12386108

  13. Global Connections: Web Conferencing Tools Help Educators Collaborate Anytime, Anywhere

    ERIC Educational Resources Information Center

    Forrester, Dave

    2009-01-01

    Web conferencing tools help educators from around the world collaborate in real time. Teachers, school counselors, and administrators need only to put on their headsets, check the time zone, and log on to meet and learn from educators across the globe. In this article, the author discusses how educators can use Web conferencing at their schools.…

  14. Faculty Recommendations for Web Tools: Implications for Course Management Systems

    ERIC Educational Resources Information Center

    Oliver, Kevin; Moore, John

    2008-01-01

    A gap analysis of web tools in Engineering was undertaken as one part of the Digital Library Network for Engineering and Technology (DLNET) grant funded by NSF (DUE-0085849). DLNET represents a Web portal and an online review process to archive quality knowledge objects in Engineering and Technology disciplines. The gap analysis coincided with the…

  15. Teachers' Perceptions and Attitudes toward the Implementation of Web 2.0 Tools in Secondary Education

    ERIC Educational Resources Information Center

    Quadri, Lekan Kamil

    2014-01-01

    Researchers have concluded that Web 2.0 technologies offered many educational benefits. However, many secondary teachers in a large northwestern school district were not using Web 2.0 tools in spite of its possibilities for teaching and learning. The purpose of this quantitative correlational research was to explore the relationships between the…

  16. The Effectiveness of Web-Based Learning Environment: A Case Study of Public Universities in Kenya

    ERIC Educational Resources Information Center

    Kirui, Paul A.; Mutai, Sheila J.

    2010-01-01

    Web mining is emerging in many aspects of e-learning, aiming at improving online learning and teaching processes and making them more transparent and effective. Researchers using Web mining tools and techniques are challenged to learn more about online students, reshape online courses and educational websites, and create tools for…

  17. Mixed Sequence Reader: A Program for Analyzing DNA Sequences with Heterozygous Base Calling

    PubMed Central

    Chang, Chun-Tien; Tsai, Chi-Neu; Tang, Chuan Yi; Chen, Chun-Houh; Lian, Jang-Hau; Hu, Chi-Yu; Tsai, Chia-Lung; Chao, Angel; Lai, Chyong-Huey; Wang, Tzu-Hao; Lee, Yun-Shien

    2012-01-01

    The direct sequencing of PCR products generates heterozygous base-calling fluorescence chromatograms that are useful for identifying single-nucleotide polymorphisms (SNPs), insertion-deletions (indels), short tandem repeats (STRs), and paralogous genes. Indels and STRs can be easily detected using the currently available Indelligent or ShiftDetector programs, which do not search reference sequences. However, the detection of other genomic variants remains a challenge due to the lack of appropriate tools for heterozygous base-calling fluorescence chromatogram data analysis. In this study, we developed a free web-based program, Mixed Sequence Reader (MSR), which can directly analyze heterozygous base-calling fluorescence chromatogram data in .abi file format using comparisons with reference sequences. The heterozygous sequences are identified as two distinct sequences and aligned with reference sequences. Our results showed that MSR may be used to (i) physically locate indel and STR sequences and determine STR copy number by searching NCBI reference sequences; (ii) predict combinations of microsatellite patterns using the Federal Bureau of Investigation Combined DNA Index System (CODIS); (iii) determine human papilloma virus (HPV) genotypes by searching current viral databases in cases of double infections; (iv) estimate the copy number of paralogous genes, such as β-defensin 4 (DEFB4) and its paralog HSPDP3. PMID:22778697

  18. FireMap: A Web Tool for Dynamic Data-Driven Predictive Wildfire Modeling Powered by the WIFIRE Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Block, J.; Crawl, D.; Artes, T.; Cowart, C.; de Callafon, R.; DeFanti, T.; Graham, J.; Smarr, L.; Srivas, T.; Altintas, I.

    2016-12-01

    The NSF-funded WIFIRE project has designed a web-based wildfire modeling simulation and visualization tool called FireMap. The tool executes FARSITE to model fire propagation using dynamic weather and fire data, configuration settings provided by the user, and static topography and fuel datasets already built-in. Using GIS capabilities combined with scalable big data integration and processing, FireMap enables simple execution of the model with options for running ensembles by taking the information uncertainty into account. The results are easily viewable, sharable, repeatable, and can be animated as a time series. From these capabilities, users can model real-time fire behavior, analyze what-if scenarios, and keep a history of model runs over time for sharing with collaborators. Firemap runs FARSITE with national and local sensor networks for real-time weather data ingestion and High-Resolution Rapid Refresh (HRRR) weather for forecasted weather. The HRRR is a NOAA/NCEP operational weather prediction system comprised of a numerical forecast model and an analysis/assimilation system to initialize the model. It is run with a horizontal resolution of 3 km, has 50 vertical levels, and has a temporal resolution of 15 minutes. The HRRR requires an Environmental Data Exchange (EDEX) server to receive the feed and generate secondary products out of it for the modeling. UCSD's EDEX server, funded by NSF, makes high-resolution weather data available to researchers worldwide and enables visualization of weather systems and weather events lasting months or even years. The high-speed server aggregates weather data from the University Consortium for Atmospheric Research by way of a subscription service from the Consortium called the Internet Data Distribution system. These features are part of WIFIRE's long term goals to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. 
    Although FireMap is a research product of WIFIRE, developed in collaboration with a number of fire departments, the tool is operational in pilot form, providing big data-driven predictive fire spread modeling. Most recently, FireMap was used for situational awareness during the July 2016 Sand Fire by the LA City and LA County Fire Departments.

  19. Attitudes, Perceptions, and Behavioral Intentions of Engineering Workers toward Web 2.0 Tools in the Workplace

    ERIC Educational Resources Information Center

    Krause, Jaclyn A.

    2010-01-01

    As Web 2.0 tools and technologies increase in popularity in consumer markets, enterprises are seeking ways to take advantage of the rich social knowledge exchanges that these tools offer. The problem this study addresses is that it remains unknown whether employees perceive that these tools offer value to the organization and therefore will be…

  20. The Use of Technology in Participant Tracking and Study Retention: Lessons Learned From a Clinical Trials Network Study.

    PubMed

    Mitchell, Shannon Gwin; Schwartz, Robert P; Alvanzo, Anika A H; Weisman, Monique S; Kyle, Tiffany L; Turrigiano, Eva M; Gibson, Martha L; Perez, Livangelie; McClure, Erin A; Clingerman, Sara; Froias, Autumn; Shandera, Danielle R; Walker, Robrina; Babcock, Dean L; Bailey, Genie L; Miele, Gloria M; Kunkel, Lynn E; Norton, Michael; Stitzer, Maxine L

    2015-01-01

    The growing use of newer communication and Internet technologies, even among low-income and transient populations, requires research staff to update their outreach strategies to ensure high follow-up and participant retention rates. This paper presents the views of research assistants on the use of cell phones and the Internet to track participants in a multisite randomized trial of substance use disorder treatment. Preinterview questionnaires exploring tracking and other study-related activities were collected from 21 research staff across the 10 participating US sites. Data were then used to construct a semistructured interview guide that, in turn, was used to interview 12 of the same staff members. The questionnaires and interview data were entered in Atlas.ti and analyzed for emergent themes related to the use of technology for participant-tracking purposes. Study staff reported that most participants had cell phones, despite having unstable physical addresses and landlines. The incoming call feature of most cell phones was useful for participants and research staff alike, and texting proved to have additional benefits. However, reliance on participants' cell phones also proved problematic. Even homeless participants were found to have access to the Internet through public libraries and could respond to study staff e-mails. Some study sites opened generic social media accounts, through which study staff sent private messages to participants. However, the institutional review board (IRB) approval process for tracking participants using social media at some sites was prohibitively lengthy. Internet searches through Google, national paid databases, obituaries, and judiciary Web sites were also helpful tools. Research staff perceived that cell phones, Internet searches, and social networking sites were effective tools to achieve high follow-up rates in drug abuse research. 
Studies should incorporate cell phone, texting, and social network Web site information on locator forms; obtain IRB approval for contacting participants using social networking Web sites; and include Web searches, texting, and the use of social media in staff training as standard operating procedures.

  1. Oh! Web 2.0, Virtual Reference Service 2.0, Tools & Techniques (II)

    ERIC Educational Resources Information Center

    Arya, Harsh Bardhan; Mishra, J. K.

    2012-01-01

    The paper describes the theory and definition of the practice of librarianship, specifically addressing how Web 2.0 technologies (tools) such as synchronous messaging, collaborative reference service and streaming media, blogs, wikis, social networks, social bookmarking tools, tagging, RSS feeds, and mashups might intimate changes and how…

  2. Enhancing e-Learning Content by Using Semantic Web Technologies

    ERIC Educational Resources Information Center

    García-González, Herminio; Gayo, José Emilio Labra; del Puerto Paule-Ruiz, María

    2017-01-01

    We describe a new educational tool that relies on Semantic Web technologies to enhance lessons content. We conducted an experiment with 32 students whose results demonstrate better performance when exposed to our tool in comparison with a plain native tool. Consequently, this prototype opens new possibilities in lessons content enhancement.

  3. Web Surveys to Digital Movies: Technological Tools of the Trade.

    ERIC Educational Resources Information Center

    Fetterman, David M.

    2002-01-01

    Highlights some of the technological tools used by educational researchers today, focusing on data collection related tools such as Web surveys, digital photography, voice recognition and transcription, file sharing and virtual office, videoconferencing on the Internet, instantaneous chat and chat rooms, reporting and dissemination, and digital…

  4. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification.

    PubMed

    Wiegers, Thomas C; Davis, Allan Peter; Mattingly, Carolyn J

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer /BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. 
Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions-of-a-second to over a minute per article. We present a description of the challenge and summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ © The Author(s) 2014. Published by Oxford University Press.
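    The BioC format used for inter-process communication in the record above can be sketched concretely. The following is a minimal, hypothetical example of the payload shape a client might POST to such an NER web service: one collection holding one document with one passage. The document id, source string, and passage text are illustrative stand-ins; a production pipeline would fill in metadata per the BioC DTD and send the XML to the service's endpoint (URL omitted here).

```python
import xml.etree.ElementTree as ET

def make_bioc_collection(doc_id, text):
    """Build a minimal BioC collection holding one document with one passage.

    This sketches the payload shape only: collection -> source/date/key,
    then document -> id, then passage -> offset/text.
    """
    collection = ET.Element("collection")
    ET.SubElement(collection, "source").text = "CTD-example"  # illustrative
    ET.SubElement(collection, "date").text = "2014-01-01"
    ET.SubElement(collection, "key").text = "BioC.key"
    document = ET.SubElement(collection, "document")
    ET.SubElement(document, "id").text = doc_id
    passage = ET.SubElement(document, "passage")
    ET.SubElement(passage, "offset").text = "0"
    ET.SubElement(passage, "text").text = text
    return ET.tostring(collection, encoding="unicode")

# Hypothetical single-passage document to send for entity recognition.
payload = make_bioc_collection("PMID-12345", "Aspirin reduces inflammation.")
```

    The service would return a BioC document of the same shape with `annotation` elements (entity type, character offset, length) added to each passage, which is what makes the REST/BioC arrangement interoperable across the twelve participating groups' tools.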

  5. Web services-based text-mining demonstrates broad impacts for interoperability and process simplification

    PubMed Central

    Wiegers, Thomas C.; Davis, Allan Peter; Mattingly, Carolyn J.

    2014-01-01

    The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms, using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer (REST)/BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. 
Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions of a second to over a minute per article. We present a description of the challenge and a summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ PMID:24919658

  6. SequenceCEROSENE: a computational method and web server to visualize spatial residue neighborhoods at the sequence level.

    PubMed

    Heinke, Florian; Bittrich, Sebastian; Kaiser, Florian; Labudde, Dirk

    2016-01-01

    To understand the molecular function of biopolymers, studying their structural characteristics is of central importance. Graphics programs are often used to examine these properties, but with the increasing number of structures available in databases, or of structure models produced by automated modeling frameworks, this process requires assistance from tools that allow automated structure visualization. In this paper a web server and its underlying method for generating graphical sequence representations of molecular structures is presented. The method, called SequenceCEROSENE (color encoding of residues obtained by spatial neighborhood embedding), retrieves the sequence of each amino acid or nucleotide chain in a given structure and produces a color coding for each residue based on three-dimensional structure information. From this, color-highlighted sequences are obtained, in which the coloring of a residue represents its three-dimensional location in the structure. This color encoding thus provides a one-dimensional representation from which spatial interactions, proximity and relations between residues or entire chains can be deduced quickly and solely from color similarity. Furthermore, additional heteroatoms and chemical compounds bound to the structure, such as ligands or coenzymes, are processed and reported as well. To provide free access to SequenceCEROSENE, a web server has been implemented that generates color codings for structures deposited in the Protein Data Bank or for structure models uploaded by the user. Besides retrieving visualizations in popular graphics formats, the underlying raw data can be downloaded as well. In addition, the server provides user interactivity with generated visualizations and with the three-dimensional structure in question. 
Color encoded sequences generated by SequenceCEROSENE can aid to quickly perceive the general characteristics of a structure of interest (or entire sets of complexes), thus supporting the researcher in the initial phase of structure-based studies. In this respect, the web server can be a valuable tool, as users are allowed to process multiple structures, quickly switch between results, and interact with generated visualizations in an intuitive manner. The SequenceCEROSENE web server is available at https://biosciences.hs-mittweida.de/seqcerosene.
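As a rough illustration of the core idea, coloring residues so that spatial proximity appears as color similarity, the sketch below min-max-normalizes 3-D residue coordinates into RGB channels. This is a deliberately simplified stand-in for the spatial neighborhood embedding SequenceCEROSENE actually uses:

```python
def coords_to_rgb(coords):
    """Map 3-D residue coordinates to RGB so that residues close in
    space receive similar colors. Each axis is min-max normalized
    into 0..255 independently (illustrative scheme only)."""
    mins = [min(c[i] for c in coords) for i in range(3)]
    maxs = [max(c[i] for c in coords) for i in range(3)]
    spans = [(mx - mn) or 1.0 for mn, mx in zip(mins, maxs)]
    return [
        tuple(int(255 * (c[i] - mins[i]) / spans[i]) for i in range(3))
        for c in coords
    ]

# two nearby residues get similar colors, a distant one does not
colors = coords_to_rgb([(0.0, 0.0, 0.0), (1.0, 1.0, 1.0), (10.0, 10.0, 10.0)])
```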

  7. Prototype of Partial Cutting Tool of Geological Map Images Distributed by Geological Web Map Service

    NASA Astrophysics Data System (ADS)

    Nonogaki, S.; Nemoto, T.

    2014-12-01

    Geological maps and topographical maps play an important role in disaster assessment, resource management, and environmental preservation. Such map information has recently been distributed in accordance with Web service standards such as Web Map Service (WMS) and Web Map Tile Service (WMTS). In this study, a partial cutting tool for geological map images distributed by a geological WMTS was implemented with Free and Open Source Software. The tool consists mainly of two functions: a display function and a cutting function. The former was implemented using OpenLayers; the latter using the Geospatial Data Abstraction Library (GDAL). All other small functions were implemented in PHP and Python. As a result, the tool not only displays WMTS layers in a web browser but also generates a geological map image of the intended area and zoom level. At this moment, the available WMTS layers are limited to those distributed by the WMTS for the Seamless Digital Geological Map of Japan. The generated geological map image can be saved in GeoTIFF and WebGL formats. GeoTIFF is a georeferenced raster format available in many kinds of Geographical Information System, and WebGL is useful for examining the relationship between geology and geography in 3D. In conclusion, the partial cutting tool developed in this study should help promote the utilization of geological information. Future work will increase the number of available WMTS layers and the types of output file format.
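A cutting tool of this kind must translate the requested area and zoom level into WMTS tile indices. A minimal sketch of the standard Web Mercator (slippy-map) tile arithmetic; the tool itself delegates such work to OpenLayers and GDAL:

```python
import math

def lonlat_to_tile(lon_deg: float, lat_deg: float, zoom: int) -> tuple[int, int]:
    """Column/row of the Web Mercator tile containing a point."""
    n = 2 ** zoom  # tiles per axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.log(math.tan(lat) + 1.0 / math.cos(lat)) / math.pi) / 2.0 * n)
    return x, y

# tile containing Tokyo at zoom level 10
col, row = lonlat_to_tile(139.76, 35.68, 10)
```

A cutting function would then fetch every tile between the indices of the area's corners and mosaic them into one image.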

  8. Web Search Studies: Multidisciplinary Perspectives on Web Search Engines

    NASA Astrophysics Data System (ADS)

    Zimmer, Michael

    Perhaps the most significant tool of our internet age is the web search engine, providing a powerful interface for accessing the vast amount of information available on the world wide web and beyond. While still in its infancy compared to the knowledge tools that precede it - such as the dictionary or encyclopedia - the impact of web search engines on society and culture has already received considerable attention from a variety of academic disciplines and perspectives. This article aims to organize a meta-discipline of “web search studies,” centered around a nucleus of major research on web search engines from five key perspectives: technical foundations and evaluations; transaction log analyses; user studies; political, ethical, and cultural critiques; and legal and policy analyses.

  9. The Next Linear Collider Program

    Science.gov Websites

    posted to the new SLAC ILC web site http://www-project.slac.stanford.edu/ilc/. The NLC web site will remain accessible as an archive of important work done on the many systems.

  10. WebCHECK: The Website Evaluation Instrument

    ERIC Educational Resources Information Center

    Small, Ruth V.; Arnone, Marilyn P.

    2014-01-01

    Just as with print resources, as the number of Web-based resources continues to soar, the need to evaluate them has become a critical information skill for both children and adults. This is particularly true for schools where librarians often are called on to recommend Web resources to classroom teachers, parents, and students, and to support…

  11. Web-Based Teaching: The Beginning of the End for Universities?

    ERIC Educational Resources Information Center

    Wyatt, Ray

    This paper describes a World Wide Web-based, generic, inter-disciplinary subject called computer-aided policymaking. It has been offered at Melbourne University (Australia) from the beginning of 2001. It has generated some salutary lessons in marketing and pedagogy, but overall it is concluded that Web-based teaching has a rosy future.…

  12. Web-Based Quiz-Game-Like Formative Assessment: Development and Evaluation

    ERIC Educational Resources Information Center

    Wang, Tzu-Hua

    2008-01-01

    This research aims to develop a multiple-choice Web-based quiz-game-like formative assessment system, named GAM-WATA. The unique design of "Ask-Hint Strategy" turns the Web-based formative assessment into an online quiz game. "Ask-Hint Strategy" is composed of "Prune Strategy" and "Call-in Strategy".…

  13. MendelWeb: An Electronic Science/Math/History Resource for the WWW.

    ERIC Educational Resources Information Center

    Blumberg, Roger B.

    This paper describes a hypermedia resource called MendelWeb that integrates elementary biology, discrete mathematics, and the history of science. MendelWeb is constructed from Gregor Mendel's 1865 paper, "Experiments in Plant Hybridization". An English translation of Mendel's paper, which is considered to mark the birth of classical and…

  14. A Tutorial in Creating Web-Enabled Databases with Inmagic DB/TextWorks through ODBC.

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2000-01-01

    Explains how to create Web-enabled databases. Highlights include Inmagic's DB/Text WebPublisher product called DB/TextWorks; ODBC (Open Database Connectivity) drivers; Perl programming language; HTML coding; Structured Query Language (SQL); Common Gateway Interface (CGI) programming; and examples of HTML pages and Perl scripts. (LRW)

  15. Development and Evaluation of Mechatronics Learning System in a Web-Based Environment

    ERIC Educational Resources Information Center

    Shyr, Wen-Jye

    2011-01-01

    The development of remote laboratories suitable for reinforcing undergraduate-level teaching of mechatronics is important. For this reason, a Web-based mechatronics learning system, called RECOLAB (REmote COntrol LABoratory), for remote learning in engineering education has been developed in this study. The web-based environment is an…

  16. Informatics in radiology: radiology gamuts ontology: differential diagnosis for the Semantic Web.

    PubMed

    Budovec, Joseph J; Lam, Cesar A; Kahn, Charles E

    2014-01-01

    The Semantic Web is an effort to add semantics, or "meaning," to empower automated searching and processing of Web-based information. The overarching goal of the Semantic Web is to enable users to more easily find, share, and combine information. Critical to this vision are knowledge models called ontologies, which define a set of concepts and formalize the relations between them. Ontologies have been developed to manage and exploit the large and rapidly growing volume of information in biomedical domains. In diagnostic radiology, lists of differential diagnoses of imaging observations, called gamuts, provide an important source of knowledge. The Radiology Gamuts Ontology (RGO) is a formal knowledge model of differential diagnoses in radiology that includes 1674 differential diagnoses, 19,017 terms, and 52,976 links between terms. Its knowledge is used to provide an interactive, freely available online reference of radiology gamuts ( www.gamuts.net ). A Web service allows its content to be discovered and consumed by other information systems. The RGO integrates radiologic knowledge with other biomedical ontologies as part of the Semantic Web. © RSNA, 2014.
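At its core, a gamuts ontology links imaging findings to candidate diagnoses; the toy lookup below illustrates the kind of query such a knowledge model supports (the terms and links are invented for the example, not taken from the RGO):

```python
# finding -> differential diagnoses (toy data, not actual RGO content)
GAMUTS = {
    "cavitary lung lesion": ["abscess", "tuberculosis", "squamous cell carcinoma"],
    "ring-enhancing brain lesion": ["abscess", "glioblastoma", "metastasis"],
}

def differential(finding: str) -> list[str]:
    """Diagnoses linked to a finding (the gamut for that finding)."""
    return GAMUTS.get(finding, [])

def findings_for(diagnosis: str) -> list[str]:
    """Inverse lookup: findings whose differential includes the diagnosis."""
    return [f for f, dx in GAMUTS.items() if diagnosis in dx]

# findings that share "abscess" in their differential
shared = findings_for("abscess")
```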

  17. A World-Wide Net of Solar Radio Spectrometers: e-CALLISTO

    NASA Astrophysics Data System (ADS)

    Benz, A. O.; Monstein, C.; Meyer, H.; Manoharan, P. K.; Ramesh, R.; Altyntsev, A.; Lara, A.; Paez, J.; Cho, K.-S.

    2009-04-01

    Radio spectrometers of the CALLISTO type for observing solar flares have been distributed to nine locations around the globe. The instruments observe automatically; their data are collected every day via the internet and stored in a central database. A public web interface exists through which the data can be browsed and retrieved. The nine instruments form a network called e-CALLISTO. It is still growing in the number of stations, as redundancy is desirable for full 24 h coverage of the solar radio emission in the meter and low decimeter band. The e-CALLISTO system has already proven to be a valuable new tool for monitoring solar activity and for space weather research.

  18. A Web Based Collaborative Design Environment for Spacecraft

    NASA Technical Reports Server (NTRS)

    Dunphy, Julia

    1998-01-01

    In this era of shrinking federal budgets in the USA, we need to dramatically improve the efficiency of the spacecraft engineering design process. We have developed a method that captures much of the experts' expertise in a dataflow design graph: a seamlessly connectable set of local and remote design tools; seamlessly connectable web-based design tools; and a web browser interface to the developing spacecraft design. We have recently completed our first web browser interface and demonstrated its utility in the design of an aeroshell using design tools located at web sites at three NASA facilities. Multiple design engineers and managers are now able to interrogate the design engine simultaneously and find out what the design looks like at any point in the design cycle, what its parameters are, and how it reacts to adverse space environments.
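A dataflow design graph of this kind evaluates each design quantity once its upstream inputs are known; a toy sketch using topological ordering (the parameter names and formulas are invented, not the actual aeroshell design graph):

```python
from graphlib import TopologicalSorter

# toy dataflow: each design quantity depends on upstream quantities
deps = {
    "area": {"diameter"},
    "mass": {"area", "thickness"},
}
rules = {
    "diameter": lambda v: 2.0,                                # m (given)
    "thickness": lambda v: 0.01,                              # m (given)
    "area": lambda v: 3.14159 * (v["diameter"] / 2) ** 2,     # disc area
    "mass": lambda v: 2700 * v["area"] * v["thickness"],      # aluminium density
}

def evaluate(deps, rules):
    """Evaluate every quantity in dependency order."""
    values = {}
    order = TopologicalSorter({k: deps.get(k, set()) for k in rules}).static_order()
    for name in order:
        values[name] = rules[name](values)
    return values

design = evaluate(deps, rules)
```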

  19. Social Web mining and exploitation for serious applications: Technosocial Predictive Analytics and related technologies for public health, environmental and national security surveillance.

    PubMed

    Kamel Boulos, Maged N; Sanfilippo, Antonio P; Corley, Courtney D; Wheeler, Steve

    2010-10-01

    This paper explores Technosocial Predictive Analytics (TPA) and related methods for Web "data mining" where users' posts and queries are garnered from Social Web ("Web 2.0") tools such as blogs, micro-blogging and social networking sites to form coherent representations of real-time health events. The paper includes a brief introduction to commonly used Social Web tools such as mashups and aggregators, and maps their exponential growth as an open architecture of participation for the masses and an emerging way to gain insight about people's collective health status of whole populations. Several health related tool examples are described and demonstrated as practical means through which health professionals might create clear location specific pictures of epidemiological data such as flu outbreaks. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  20. New approaches in assessing food intake in epidemiology.

    PubMed

    Conrad, Johanna; Koch, Stefanie A J; Nöthlings, Ute

    2018-06-22

    A promising direction for improving dietary intake measurement in epidemiologic studies is the combination of short-term and long-term dietary assessment methods using statistical methods. Web-based instruments are particularly interesting here, as their application offers several potential advantages such as self-administration and a shorter completion time. The objective of this review is to provide an overview of new web-based short-term instruments and to describe their features. A number of web-based short-term dietary assessment tools for application in different countries and age groups have been developed so far. Particular attention should be paid to the underlying database and the search function of a tool. Moreover, web-based instruments can improve the estimation of portion sizes by offering several options to the user. Web-based dietary assessment methods are associated with lower costs and reduced burden for participants and researchers, and show validity comparable with traditional instruments. When there is a need for a web-based tool, researchers should consider adapting existing tools rather than developing new instruments. The combination of short-term and long-term instruments seems more feasible with the use of new technology.

  1. Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology.

    PubMed

    Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan

    2017-10-01

    Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 ( www.comparativego.com ). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.
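GO over-representation analyses of the kind such tools perform commonly rest on the hypergeometric test; a minimal sketch of that statistic (this is the standard test, not necessarily the web tool's exact implementation):

```python
from math import comb

def enrichment_p(N: int, K: int, n: int, k: int) -> float:
    """Hypergeometric upper-tail p-value for GO term enrichment.

    N: genes in the background (e.g. the zebrafish genome annotation)
    K: background genes annotated with the GO term
    n: genes in the study set (e.g. differentially expressed)
    k: study genes annotated with the term
    """
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(n, K) + 1)
    ) / comb(N, n)

# all 5 study genes carry a term held by 5 of 10 background genes
p = enrichment_p(N=10, K=5, n=5, k=5)
```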

  2. Technology Integration in Science Classrooms: Framework, Principles, and Examples

    ERIC Educational Resources Information Center

    Kim, Minchi C.; Freemyer, Sarah

    2011-01-01

    A great number of technologies and tools have been developed to support science learning and teaching. However, science teachers and researchers point out numerous challenges to implementing such tools in science classrooms. For instance, guidelines, lesson plans, Web links, and tools teachers can easily find through Web-based search engines often…

  3. SETAC Short Course: Introduction to interspecies toxicity extrapolation using EPA’s Web-ICE tool

    EPA Science Inventory

    The Web-ICE tool is a user friendly interface that contains modules to predict acute toxicity to over 500 species of aquatic (algae, invertebrates, fish) and terrestrial (birds and mammals) taxa. The tool contains a suite of over 3000 ICE models developed from a database of over ...
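ICE models generally take the form of log-log linear regressions relating the toxicity of a surrogate species to that of a predicted species; a sketch of fitting and applying such a model (toy data only; Web-ICE's models are fit on curated toxicity databases):

```python
import math

def fit_ice(surrogate_tox, predicted_tox):
    """Least-squares fit of log10(predicted) = a + b * log10(surrogate),
    the general form of an interspecies correlation estimation model."""
    xs = [math.log10(v) for v in surrogate_tox]
    ys = [math.log10(v) for v in predicted_tox]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def predict(a, b, surrogate_value):
    """Predicted toxicity for a new surrogate measurement."""
    return 10 ** (a + b * math.log10(surrogate_value))

# exact log-linear toy data: log10(y) = 0.5 + 1.2 * log10(x)
a, b = fit_ice([1.0, 10.0, 100.0], [10 ** 0.5, 10 ** 1.7, 10 ** 2.9])
```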

  4. The Web-Database Connection Tools for Sharing Information on the Campus Intranet.

    ERIC Educational Resources Information Center

    Thibeault, Nancy E.

    This paper evaluates four tools for creating World Wide Web pages that interface with Microsoft Access databases: DB Gateway, Internet Database Assistant (IDBA), Microsoft Internet Database Connector (IDC), and Cold Fusion. The system requirements and features of each tool are discussed. A sample application, "The Virtual Help Desk"…

  5. Aligning Web-Based Tools to the Research Process Cycle: A Resource for Collaborative Research Projects

    ERIC Educational Resources Information Center

    Price, Geoffrey P.; Wright, Vivian H.

    2012-01-01

    Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…

  6. Simulation Platform: a cloud-based online simulation environment.

    PubMed

    Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro

    2011-09-01

    For multi-scale and multi-modal neural modeling, one needs to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have been designed solely to archive model files, but databases should also give users a chance to validate models before downloading them. In this paper, we report our ongoing project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On each virtual machine, various software packages are pre-installed, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without needing to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.
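The request-to-virtual-machine assignment described above can be sketched as a simple pool allocator (illustrative only; the abstract does not detail the actual scheduler):

```python
class VMPool:
    """Minimal on-demand VM assignment: each user gets one free machine."""

    def __init__(self, vm_names):
        self.free = list(vm_names)
        self.assigned = {}  # user -> vm

    def request(self, user):
        """Assign a free virtual machine, reuse an existing assignment,
        or return None when the pool is exhausted."""
        if user in self.assigned:
            return self.assigned[user]
        if not self.free:
            return None
        vm = self.free.pop()
        self.assigned[user] = vm
        return vm

    def release(self, user):
        """Return the user's machine to the pool."""
        self.free.append(self.assigned.pop(user))

pool = VMPool(["vm1", "vm2"])
first = pool.request("alice")  # alice gets a machine from the pool
```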

  7. Reprint of: Simulation Platform: a cloud-based online simulation environment.

    PubMed

    Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro

    2011-11-01

    For multi-scale and multi-modal neural modeling, one needs to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have been designed solely to archive model files, but databases should also give users a chance to validate models before downloading them. In this paper, we report our ongoing project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On each virtual machine, various software packages are pre-installed, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without needing to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. HELIOGate, a Portal for the Heliophysics Community

    NASA Astrophysics Data System (ADS)

    Pierantoni, Gabriele; Carley, Eoin

    2014-10-01

    Heliophysics is the branch of physics that investigates the interactions between the Sun and the other bodies of the solar system. Heliophysicists rely on data collected from numerous sources scattered across the Solar System. The data collected from these sources are processed to extract metadata, and the metadata extracted in this fashion are then used to build indexes of features and events called catalogues. Heliophysicists also develop conceptual and mathematical models of the phenomena and the environment of the Solar System. More specifically, they investigate the physical characteristics of the phenomena and simulate how they propagate throughout the Solar System with mathematical and physical abstractions called propagation models. HELIOGate aims at addressing the need to combine and orchestrate existing web services in a flexible and easily configurable fashion to tackle different scientific questions. HELIOGate also offers a tool capable of connecting to sizeable computation and storage infrastructures to execute the data processing codes needed to calibrate raw data and to extract metadata.

  9. The ALICE analysis train system

    NASA Astrophysics Data System (ADS)

    Zimmermann, Markus; ALICE Collaboration

    2015-05-01

    In the ALICE experiment hundreds of users are analyzing big datasets on a Grid system. High throughput and short turn-around times are achieved by a centralized system called the LEGO trains. This system combines analysis from different users in so-called analysis trains which are then executed within the same Grid jobs thereby reducing the number of times the data needs to be read from the storage systems. The centralized trains improve the performance, the usability for users and the bookkeeping in comparison to single user analysis. The train system builds upon the already existing ALICE tools, i.e. the analysis framework as well as the Grid submission and monitoring infrastructure. The entry point to the train system is a web interface which is used to configure the analysis and the desired datasets as well as to test and submit the train. Several measures have been implemented to reduce the time a train needs to finish and to increase the CPU efficiency.
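The core idea of the train system, combining analyses that share a dataset into one job so the data is read only once, can be sketched as a simple grouping step (task and dataset names are invented for illustration; the LEGO system itself also handles configuration, testing and Grid submission):

```python
from collections import defaultdict

def build_trains(tasks):
    """Group user analysis tasks by dataset so each dataset is read once.

    tasks: iterable of (task_name, dataset) pairs.
    Returns a mapping dataset -> list of tasks riding that train.
    """
    trains = defaultdict(list)
    for task, dataset in tasks:
        trains[dataset].append(task)
    return dict(trains)

trains = build_trains([
    ("jet-spectra", "LHC10h"),
    ("flow", "LHC10h"),
    ("strangeness", "LHC11a"),
])
# three analyses now require only two passes over the data
```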

  10. Spliceman2: a computational web server that predicts defects in pre-mRNA splicing.

    PubMed

    Cygan, Kamil Jan; Sanford, Clayton Hendrick; Fairbrother, William Guy

    2017-09-15

    Most pre-mRNA transcripts in eukaryotic cells must undergo splicing to remove introns and join exons, and splicing elements present a large mutational target for disease-causing mutations. Splicing elements are strongly position dependent with respect to the transcript annotations. In 2012, we presented Spliceman, an online tool that used positional dependence to predict how likely distant mutations around annotated splice sites were to disrupt splicing. Here, we present an improved version of the previous tool that will be more useful for predicting the likelihood of splicing mutations. We have added industry-standard input options (i.e. Spliceman now accepts variant call format files), which allow much larger inputs than previously available. The tool can also visualize the locations of the sequence variants to be analyzed, within exons and introns, and the predicted effects on splicing of the pre-mRNA transcript. In addition, Spliceman2 integrates with RNAcompete motif libraries to provide a prediction of which trans-acting factor binding sites are disrupted/created and links out to the UCSC genome browser. In summary, the new features in Spliceman2 will allow scientists and physicians to better understand the effects of single nucleotide variations on splicing. Freely available on the web at http://fairbrother.biomed.brown.edu/spliceman2 . Website implemented in the PHP framework Laravel 5, with PostgreSQL, Apache, and Perl, and all major browsers supported. william_fairbrother@brown.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
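Accepting variant call format (VCF) input means extracting, at minimum, each variant's chromosome, position and ref/alt alleles, and relating the position to annotated splice sites; a minimal sketch (not Spliceman2's actual parser, and real VCF handling needs more care with headers, INFO fields and multi-allelic sites):

```python
def parse_vcf_line(line: str):
    """Minimal parse of one VCF data line into (chrom, pos, ref, alt)."""
    chrom, pos, _vid, ref, alt = line.rstrip("\n").split("\t")[:5]
    return chrom, int(pos), ref, alt

def distance_to_splice_site(pos: int, splice_site_pos: int) -> int:
    """Signed offset of a variant from an annotated splice site; tools
    like Spliceman weight predictions by this positional dependence."""
    return pos - splice_site_pos

# positions and alleles below are invented for illustration
chrom, pos, ref, alt = parse_vcf_line("chr1\t101442\t.\tG\tA")
offset = distance_to_splice_site(pos, 101450)
```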

  11. Mendel,MD: A user-friendly open-source web tool for analyzing WES and WGS in the diagnosis of patients with Mendelian disorders

    PubMed Central

    Linhares, Natália D.; Pena, Sérgio D. J.

    2017-01-01

    Whole exome and whole genome sequencing have both become widely adopted methods for investigating and diagnosing human Mendelian disorders. As pangenomic agnostic tests, they are capable of more accurate and agile diagnosis compared to traditional sequencing methods. This article describes new software called Mendel,MD, which combines multiple types of filter options and makes use of regularly updated databases to facilitate exome and genome annotation, the filtering process and the selection of candidate genes and variants for experimental validation and possible diagnosis. This tool offers a user-friendly interface, and leads clinicians through simple steps by limiting the number of candidates to achieve a final diagnosis of a medical genetics case. A useful innovation is the “1-click” method, which enables listing all the relevant variants in genes present at OMIM for perusal by clinicians. Mendel,MD was experimentally validated using clinical cases from the literature and was tested by students at the Universidade Federal de Minas Gerais, at GENE–Núcleo de Genética Médica in Brazil and at the Children’s University Hospital in Dublin, Ireland. We show in this article how it can simplify and increase the speed of identifying the culprit mutation in each of the clinical cases that were received for further investigation. Mendel,MD proved to be a reliable web-based tool, being open-source and time efficient for identifying the culprit mutation in different clinical cases of patients with Mendelian Disorders. It is also freely accessible for academic users on the following URL: https://mendelmd.org. PMID:28594829
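The "1-click" shortlisting described above amounts to chaining filters over annotated variants; a toy sketch (fields, thresholds and variant records are invented for illustration, not Mendel,MD's actual schema):

```python
# toy annotated variant records
variants = [
    {"gene": "FBN1", "frequency": 0.0001, "impact": "missense", "in_omim": True},
    {"gene": "TTN",  "frequency": 0.20,   "impact": "missense", "in_omim": True},
    {"gene": "ABC1", "frequency": 0.0002, "impact": "synonymous", "in_omim": False},
]

def filter_candidates(variants, max_frequency=0.01):
    """Keep rare, non-synonymous variants in OMIM-listed genes --
    the kind of shortlist produced for clinician review."""
    return [
        v for v in variants
        if v["frequency"] <= max_frequency
        and v["impact"] != "synonymous"
        and v["in_omim"]
    ]

candidates = filter_candidates(variants)
```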

  12. Tailored and Integrated Web-Based Tools for Improving Psychosocial Outcomes of Cancer Patients: The DoTTI Development Framework

    PubMed Central

    Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William

    2014-01-01

    Background Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. Objective The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. Methods The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. Results The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. 
Conclusions This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases. PMID:24641991

  13. Tailored and integrated Web-based tools for improving psychosocial outcomes of cancer patients: the DoTTI development framework.

    PubMed

    Smits, Rochelle; Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William

    2014-03-14

    Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases.

  14. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipelines. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C, which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA- or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  15. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipelines. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C, which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA- or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  16. WHIDE—a web tool for visual data mining colocation patterns in multivariate bioimages

    PubMed Central

    Kölling, Jan; Langenkämper, Daniel; Abouna, Sylvie; Khan, Michael; Nattkemper, Tim W.

    2012-01-01

    Motivation: Bioimaging techniques are rapidly developing toward higher resolution and dimension. The increase in dimension is achieved by techniques such as multitag fluorescence imaging, Matrix-Assisted Laser Desorption/Ionization (MALDI) imaging or Raman imaging, which record for each pixel an N-dimensional intensity array representing local abundances of molecules, residues or interaction patterns. The analysis of such multivariate bioimages (MBIs) calls for new approaches to support users in the analysis of both feature domains: space (i.e. sample morphology) and molecular colocation or interaction. In this article, we present our approach WHIDE (Web-based Hyperbolic Image Data Explorer), which combines principles from computational learning, dimension reduction and visualization in a free web application. Results: We applied WHIDE to a set of MBIs recorded using the multitag fluorescence imaging Toponome Imaging System. The MBIs show fields of view of tissue sections from a colon cancer study, and we compare tissue from normal/healthy colon with tissue classified as tumor. Our results show that WHIDE efficiently reduces the complexity of the data by mapping each pixel to a cluster, referred to as a Molecular Co-Expression Phenotype, and provides a structural basis for a sophisticated multimodal visualization, which combines topology-preserving pseudocoloring with information visualization. The wide range of WHIDE's applicability is demonstrated with examples from toponome imaging, high-content screens and MALDI imaging (shown in the Supplementary Material). Availability and implementation: The WHIDE tool can be accessed via the BioIMAX website http://ani.cebitec.uni-bielefeld.de/BioIMAX/; Login: whidetestuser; Password: whidetest. Supplementary information: Supplementary data are available at Bioinformatics online. Contact: tim.nattkemper@uni-bielefeld.de PMID:22390938
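    The core step described above, assigning each pixel's N-dimensional intensity vector to a cluster, can be pictured with a minimal sketch. Plain k-means stands in here for WHIDE's actual learning and visualization pipeline, and all names are illustrative, not from the tool itself:

    ```python
    import numpy as np

    # Toy sketch of multivariate-bioimage pixel clustering: each pixel's
    # N-dimensional intensity vector is mapped to a cluster label, a stand-in
    # for the "Molecular Co-Expression Phenotypes" described in the abstract.
    # Plain k-means is used for illustration only; it is not WHIDE's method.
    def cluster_pixels(image, k=2, iters=20):
        """image: (H, W, N) array -> (H, W) array of cluster labels."""
        h, w, n = image.shape
        X = image.reshape(-1, n).astype(float)
        # deterministic init: k pixel vectors spread across the pixel list
        centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
        for _ in range(iters):
            # squared Euclidean distance of every pixel to every center
            dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            labels = dists.argmin(axis=1)
            for j in range(k):
                members = X[labels == j]
                if len(members):
                    centers[j] = members.mean(axis=0)
        return labels.reshape(h, w)
    ```

    Pseudocoloring the resulting label image then gives a topology-preserving overview of where each phenotype occurs in the tissue section.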

  17. RAP: RNA-Seq Analysis Pipeline, a new cloud-based NGS web application.

    PubMed

    D'Antonio, Mattia; D'Onorio De Meo, Paolo; Pallocca, Matteo; Picardi, Ernesto; D'Erchia, Anna Maria; Calogero, Raffaele A; Castrignanò, Tiziana; Pesole, Graziano

    2015-01-01

    The study of RNA has been dramatically improved by the introduction of Next Generation Sequencing platforms allowing massive and cheap sequencing of selected RNA fractions, also providing information on strand orientation (RNA-Seq). The complexity of transcriptomes and of their regulatory pathways makes RNA-Seq one of the most complex fields of NGS applications, addressing several aspects of the expression process (e.g. identification and quantification of expressed genes and transcripts, alternative splicing and polyadenylation, fusion genes and trans-splicing, post-transcriptional events, etc.). In order to provide researchers with an effective and friendly resource for analyzing RNA-Seq data, we present here RAP (RNA-Seq Analysis Pipeline), a cloud computing web application implementing a complete but modular analysis workflow. This pipeline integrates both state-of-the-art bioinformatics tools for RNA-Seq analysis and in-house developed scripts to offer the user a comprehensive strategy for data analysis. RAP is able to perform quality checks (adopting FastQC and NGS QC Toolkit), identify and quantify expressed genes and transcripts (with Tophat, Cufflinks and HTSeq), detect alternative splicing events (using SpliceTrap) and chimeric transcripts (with ChimeraScan). The pipeline is also able to identify splicing junctions and constitutive or alternative polyadenylation sites (implementing custom analysis modules) and to call statistically significant differences in gene and transcript expression, splicing pattern and polyadenylation site usage (using Cuffdiff2 and DESeq). Through a user-friendly web interface, the RAP workflow can be suitably customized by the user and is automatically executed on our cloud computing environment. This strategy allows access to bioinformatics tools and computational resources without requiring specific bioinformatics or IT skills. RAP provides a set of tabular and graphical results that help the user browse, filter and export the analyzed data as needed.

  18. iDNA-Prot: Identification of DNA Binding Proteins Using Random Forest with Grey Model

    PubMed Central

    Lin, Wei-Zhong; Fang, Jian-An; Xiao, Xuan; Chou, Kuo-Chen

    2011-01-01

    DNA-binding proteins play crucial roles in various cellular processes. Developing high-throughput tools for rapidly and effectively identifying DNA-binding proteins is one of the major challenges in the field of genome annotation. Although many efforts have been made in this regard, further effort is needed to enhance the prediction power. By incorporating features extracted from protein sequences via the “grey model” into the general form of pseudo amino acid composition, and by adopting the random forest operation engine, we proposed a new predictor, called iDNA-Prot, for identifying uncharacterized proteins as DNA-binding or non-DNA-binding based on their amino acid sequence information alone. The overall success rate of iDNA-Prot was 83.96%, obtained via jackknife tests on a newly constructed stringent benchmark dataset in which none of the included proteins shares pairwise sequence identity with any other in the same subset. In addition to achieving a high success rate, the computational time for iDNA-Prot is remarkably short in comparison with the relevant existing predictors. Hence it is anticipated that iDNA-Prot may become a useful high-throughput tool for large-scale analysis of DNA-binding proteins. As a user-friendly web server, iDNA-Prot is freely accessible to the public at http://icpr.jci.edu.cn/bioinfo/iDNA-Prot or http://www.jci-bioinfo.cn/iDNA-Prot. Moreover, for the convenience of the vast majority of experimental scientists, a step-by-step guide is provided on how to use the web server to get the desired results. PMID:21935457

  19. An offline-online Web-GIS Android application for fast data acquisition of landslide hazard and risk

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; Sudmeier-Rieux, Karen; Jaboyedoff, Michel; Derron, Marc-Henri; Devkota, Sanjaya

    2017-04-01

    Regional landslide assessments and mapping have been effectively pursued by research institutions, national and local governments, non-governmental organizations (NGOs), and different stakeholders for some time, and a wide range of methodologies and technologies have consequently been proposed. Land-use maps and hazard event inventories are mostly created from remote-sensing data, which is subject to difficulties, such as accessibility and terrain, that need to be overcome. Likewise, landslide data acquisition during field navigation can improve the accuracy of databases and analyses. Open-source web and mobile GIS tools can be used for improved ground-truthing of critical areas to improve the analysis of hazard patterns and triggering factors. This paper reviews the implementation and selected results of a secure mobile-map application called ROOMA (Rapid Offline-Online Mapping Application) for the rapid collection of landslide hazard and risk data. This prototype assists the quick creation of landslide inventory maps (LIMs) by collecting information on the type, feature, volume, date, and patterns of landslides using open-source Web-GIS technologies such as Leaflet maps, Cordova, GeoServer, PostgreSQL as the relational DBMS (database management system), and PostGIS as its plug-in for spatial database management. The application comprises Leaflet maps coupled with satellite images as a base layer, drawing tools, geolocation (using GPS and the Internet), photo mapping, and event clustering. All features and information are recorded into a GeoJSON text file in the offline version (Android) and subsequently uploaded to the online mode (accessible from any browser) once an Internet connection is available. Finally, the events can be accessed and edited after approval by an administrator and then visualized by the general public.
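    The offline record format described above can be pictured as a plain GeoJSON feature waiting on the device until a connection is available. The property names below are illustrative guesses, not ROOMA's actual schema:

    ```python
    import json

    # Hypothetical sketch of an offline landslide report stored as a GeoJSON
    # Feature before upload; field names are illustrative, not ROOMA's schema.
    def landslide_feature(lon, lat, ls_type, volume_m3, date):
        return {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"type": ls_type, "volume_m3": volume_m3, "date": date},
        }

    # One report, wrapped in a FeatureCollection and serialized to the text
    # file that would be synced to the server once the device is online.
    record = landslide_feature(85.3, 27.7, "debris flow", 1200, "2017-04-01")
    offline_text = json.dumps({"type": "FeatureCollection", "features": [record]})
    ```

    Storing reports as standard GeoJSON is what lets the same file be consumed later by GeoServer/PostGIS on the online side without format conversion.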

  20. Teaching a Foreign Language to Deaf People via Vodcasting & Web 2.0 Tools

    NASA Astrophysics Data System (ADS)

    Drigas, Athanasios; Vrettaros, John; Tagoulis, Alexandors; Kouremenos, Dimitris

    This paper presents the design and development of an e-learning course for teaching a foreign language to deaf people whose first language is sign language. The course is based on e-material, vodcasting, and Web 2.0 tools such as social networking and blogs. It has been designed especially for deaf people and explores the possibilities that e-learning material, vodcasting, and Web 2.0 tools offer to enhance the learning process and achieve more effective learning results.

  1. Developing creativity and problem-solving skills of engineering students: a comparison of web- and pen-and-paper-based approaches

    NASA Astrophysics Data System (ADS)

    Valentine, Andrew; Belski, Iouri; Hamilton, Margaret

    2017-11-01

    Problem-solving is a key engineering skill, yet is an area in which engineering graduates underperform. This paper investigates the potential of using web-based tools to teach students problem-solving techniques without the need to make use of class time. An idea generation experiment involving 90 students was designed. Students were surveyed about their study habits and reported they use electronic-based materials more than paper-based materials while studying, suggesting students may engage with web-based tools. Students then generated solutions to a problem task using either a paper-based template or an equivalent web interface. Students who used the web-based approach performed as well as students who used the paper-based approach, suggesting the technique can be successfully adopted and taught online. Web-based tools may therefore be adopted as supplementary material in a range of engineering courses as a way to increase students' options for enhancing problem-solving skills.

  2. Distributed data collection and supervision based on web sensor

    NASA Astrophysics Data System (ADS)

    He, Pengju; Dai, Guanzhong; Fu, Lei; Li, Xiangjun

    2006-11-01

    As a node in the Internet/Intranet, the web sensor has been promoted in recent years and widely applied in remote manufacturing, workshop measurement, and control fields. However, because of the limited resources of the microprocessor in the sensor, the conventional scheme can only support the HTTP protocol, and remote users supervise and control the collected data published on the web through a standard browser; moreover, only one data-acquisition node can be supervised and controlled at any instant, so the requirement of centralized remote supervision, control, and data processing cannot be satisfied in some fields. In this paper, centralized remote supervision, control, and data processing with web sensors are proposed and implemented on the principle of a device driver program. The useless information in every collected web page embedded in a sensor is filtered out and the useful data are transmitted to a real-time database on the workstation, with different filter algorithms designed for different sensors possessing independent web pages. Every sensor node has its own web filter program, called a "web data collection driver program"; the collection details are shielded, and the supervision, control, and configuration software can be implemented by calling the web data collection driver program, just as with an I/O driver program. The proposed technology can be applied to data acquisition where relatively low real-time performance is required.
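    The filtering idea can be sketched as a tiny "driver" that strips a sensor's embedded web page down to its measurement. The page layout and element id below are hypothetical, not taken from the paper:

    ```python
    import re

    # Illustrative "web data collection driver": each sensor type gets its
    # own filter that discards page markup and extracts only the reading.
    # The page structure and the id="value" element are hypothetical.
    def filter_sensor_page(html):
        match = re.search(r'<span id="value">([\d.]+)</span>', html)
        if match is None:
            raise ValueError("reading not found in sensor page")
        return float(match.group(1))

    page = '<html><body>Temp: <span id="value">23.5</span> C</body></html>'
    print(filter_sensor_page(page))  # → 23.5
    ```

    Supervision software can then call one such filter per sensor node, exactly as it would call an I/O driver, and push the returned values into the workstation's real-time database.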

  3. A Consensus Method for the Prediction of ‘Aggregation-Prone’ Peptides in Globular Proteins

    PubMed Central

    Tsolis, Antonios C.; Papandreou, Nikos C.; Iconomidou, Vassiliki A.; Hamodrakas, Stavros J.

    2013-01-01

    The purpose of this work was to construct a consensus prediction algorithm for ‘aggregation-prone’ peptides in globular proteins, combining existing tools. This allows comparison of the different algorithms and the production of more objective and accurate results. Eleven (11) individual methods are combined to produce AMYLPRED2, a web tool that is publicly and freely available to academic users (http://biophysics.biol.uoa.gr/AMYLPRED2) for the consensus prediction of amyloidogenic determinants/‘aggregation-prone’ peptides in proteins from sequence alone. The performance of AMYLPRED2 indicates that it functions better than individual aggregation-prediction algorithms, as perhaps expected. AMYLPRED2 is a useful tool for identifying amyloid-forming regions in proteins that are associated with several conformational diseases, called amyloidoses, such as Alzheimer's, Parkinson's, prion diseases and type II diabetes. It may also be useful for understanding the properties of protein folding and misfolding and for helping to control protein aggregation/solubility in biotechnology (recombinant proteins forming bacterial inclusion bodies) and biotherapeutics (monoclonal antibodies and biopharmaceutical proteins). PMID:23326595
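    The consensus step can be sketched as simple per-residue voting across the individual predictors. The voting threshold below is an illustrative choice, not AMYLPRED2's published rule:

    ```python
    # Sketch of consensus prediction over several per-residue predictors.
    # Each predictor contributes a 0/1 call per residue; a residue counts as
    # 'aggregation-prone' when at least min_agree methods flag it. The
    # threshold is illustrative, not AMYLPRED2's actual rule.
    def consensus_regions(predictions, min_agree=5):
        """predictions: list of equal-length 0/1 lists, one per method.
        Returns (start, end) index pairs of consensus regions, end exclusive."""
        n = len(predictions[0])
        votes = [sum(p[i] for p in predictions) for i in range(n)]
        flagged = [v >= min_agree for v in votes]
        regions, start = [], None
        for i, f in enumerate(flagged):
            if f and start is None:
                start = i
            elif not f and start is not None:
                regions.append((start, i))
                start = None
        if start is not None:
            regions.append((start, n))
        return regions
    ```

    Raising `min_agree` trades sensitivity for specificity, which is the usual lever in consensus predictors of this kind.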

  4. ContEst16S: an algorithm that identifies contaminated prokaryotic genomes using 16S RNA gene sequences.

    PubMed

    Lee, Imchang; Chalita, Mauricio; Ha, Sung-Min; Na, Seong-In; Yoon, Seok-Hwan; Chun, Jongsik

    2017-06-01

    Thanks to the recent advancement of DNA sequencing technology, the cost and time of prokaryotic genome sequencing have dramatically decreased. It has repeatedly been reported that genome sequencing using high-throughput next-generation sequencing is prone to contamination due to its high depth of sequencing coverage. Although a few bioinformatics tools are available to detect potential contamination, these have inherent limitations as they only use protein-coding genes. Here we introduce a new algorithm, called ContEst16S, to detect potential contamination using 16S rRNA genes from genome assemblies. We screened 69,745 prokaryotic genomes from the NCBI Assembly Database using ContEst16S and found that 594 were contaminated by bacteria, humans, or plants. Of the predicted contaminated genomes, 8% were not flagged by the existing protein-coding gene-based tool, implying that both methods can be complementary in the detection of contamination. A web-based service of the algorithm is available at www.ezbiocloud.net/tools/contest16s.

  5. Executing SADI services in Galaxy.

    PubMed

    Aranguren, Mikel Egaña; González, Alejandro Rodríguez; Wilkinson, Mark D

    2014-01-01

    In recent years Galaxy has become a popular workflow management system in bioinformatics, due to its ease of installation, use, and extension. The availability of Semantic Web-oriented tools in Galaxy, however, is limited. This is also the case for Semantic Web Services such as those provided by the SADI project, i.e. services that consume and produce RDF. Here we present SADI-Galaxy, a tool generator that deploys selected SADI services as typical Galaxy tools. Through SADI-Galaxy, any SADI-compliant service becomes a Galaxy tool that can participate in other outstanding features of Galaxy such as data storage, history, workflow creation, and publication. Galaxy can also be used to execute and combine SADI services as it does other Galaxy tools. Finally, we have semi-automated the packing and unpacking of data into RDF such that other Galaxy tools can easily be combined with SADI services, plugging the rich SADI Semantic Web Service environment into the popular Galaxy ecosystem. SADI-Galaxy bridges the gap between Galaxy, an easy-to-use but "static" workflow system with a wide user base, and SADI, a sophisticated, semantic, discovery-based framework for Web Services, thus benefiting both user communities.

  6. NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware

    NASA Astrophysics Data System (ADS)

    Johnson, V. L.; Teuben, P. J.; Penprase, B. E.

    An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods which can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special-purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antennae Galaxies. The testbed framework is available as open source to assist other researchers and educators. Recommendations are made for testbed enhancements.

  7. NeuronDepot: keeping your colleagues in sync by combining modern cloud storage services, the local file system, and simple web applications

    PubMed Central

    Rautenberg, Philipp L.; Kumaraswamy, Ajayrama; Tejero-Cantero, Alvaro; Doblander, Christoph; Norouzian, Mohammad R.; Kai, Kazuki; Jacobsen, Hans-Arno; Ai, Hiroyuki; Wachtler, Thomas; Ikeno, Hidetoshi

    2014-01-01

    Neuroscience today deals with a “data deluge” derived from the availability of high-throughput sensors of brain structure and brain activity, and increased computational resources for detailed simulations with complex output. We report here (1) a novel approach to data sharing between collaborating scientists that brings together file system tools and cloud technologies, (2) a service implementing this approach, called NeuronDepot, and (3) an example application of the service to a complex use case in the neurosciences. The main drivers for our approach are to facilitate collaborations with a transparent, automated data flow that shields scientists from having to learn new tools or data structuring paradigms. Using NeuronDepot is simple: one-time data assignment from the originator and cloud-based syncing, thus making experimental and modeling data available across the collaboration with minimum overhead. Since data sharing is cloud based, our approach opens up the possibility of using the new software developments and hardware scalability associated with elastic cloud computing. We provide an implementation that relies on existing synchronization services and is usable from all devices via a reactive web interface. We motivate our solution by solving the practical problems of the GinJang project, a collaboration of three universities across eight time zones with a complex workflow encompassing data from electrophysiological recordings, imaging, morphological reconstructions, and simulations. PMID:24971059

  8. QuadBase2: web server for multiplexed guanine quadruplex mining and visualization

    PubMed Central

    Dhapola, Parashar; Chowdhury, Shantanu

    2016-01-01

    DNA guanine quadruplexes, or G4s, are non-canonical DNA secondary structures which affect genomic processes like replication, transcription and recombination. G4s are computationally identified by specific nucleotide motifs, also called putative G4 (PG4) motifs. Despite the general relevance of these structures, there is currently no other tool available that allows batch queries and genome-wide analysis of these motifs in a user-friendly interface. QuadBase2 (quadbase.igib.res.in) presents a completely reinvented web server version of the previously published QuadBase database. QuadBase2 enables users to mine PG4 motifs in up to 178 eukaryotes through the EuQuad module. This module interfaces with the Ensembl Compara database to allow users to mine PG4 motifs in the orthologues of genes of interest across eukaryotes. PG4 motifs can be mined across genes and their promoter sequences in 1719 prokaryotes through the ProQuad module. This module includes a feature that allows genome-wide mining of PG4 motifs and their visualization as circular histograms. TetraplexFinder, the module for mining PG4 motifs in user-provided sequences, is now capable of handling up to 20 MB of data. QuadBase2 is a comprehensive PG4 motif mining tool that further expands the configurations and algorithms for mining PG4 motifs in a user-friendly way. PMID:27185890
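    The kind of PG4 motif such tools search for can be matched with a short regular expression: four runs of at least three guanines separated by short loops. This is a common textbook formulation of the motif, not QuadBase2's exact algorithm:

    ```python
    import re

    # Canonical putative G-quadruplex (PG4) motif: four G-runs of length >= 3
    # separated by loops of 1-7 arbitrary bases. A widely used textbook
    # pattern, shown for illustration; not QuadBase2's actual implementation.
    PG4_PATTERN = re.compile(r"(?:G{3,}[ATGC]{1,7}){3}G{3,}")

    def find_pg4_motifs(sequence):
        """Return (start, matched_motif) tuples for PG4 motifs in a DNA string."""
        return [(m.start(), m.group()) for m in PG4_PATTERN.finditer(sequence.upper())]

    seq = "TTGGGAGGGTAGGGCAGGGTT"  # toy sequence containing one PG4 motif
    print(find_pg4_motifs(seq))  # → [(2, 'GGGAGGGTAGGGCAGGG')]
    ```

    Genome-wide mining is then a matter of streaming chromosome sequences through such a matcher on both strands (the complementary strand is scanned for the C-run equivalent).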

  9. NeuronDepot: keeping your colleagues in sync by combining modern cloud storage services, the local file system, and simple web applications.

    PubMed

    Rautenberg, Philipp L; Kumaraswamy, Ajayrama; Tejero-Cantero, Alvaro; Doblander, Christoph; Norouzian, Mohammad R; Kai, Kazuki; Jacobsen, Hans-Arno; Ai, Hiroyuki; Wachtler, Thomas; Ikeno, Hidetoshi

    2014-01-01

    Neuroscience today deals with a "data deluge" derived from the availability of high-throughput sensors of brain structure and brain activity, and increased computational resources for detailed simulations with complex output. We report here (1) a novel approach to data sharing between collaborating scientists that brings together file system tools and cloud technologies, (2) a service implementing this approach, called NeuronDepot, and (3) an example application of the service to a complex use case in the neurosciences. The main drivers for our approach are to facilitate collaborations with a transparent, automated data flow that shields scientists from having to learn new tools or data structuring paradigms. Using NeuronDepot is simple: one-time data assignment from the originator and cloud-based syncing, thus making experimental and modeling data available across the collaboration with minimum overhead. Since data sharing is cloud based, our approach opens up the possibility of using the new software developments and hardware scalability associated with elastic cloud computing. We provide an implementation that relies on existing synchronization services and is usable from all devices via a reactive web interface. We motivate our solution by solving the practical problems of the GinJang project, a collaboration of three universities across eight time zones with a complex workflow encompassing data from electrophysiological recordings, imaging, morphological reconstructions, and simulations.

  10. Official crime data versus collaborative crime mapping at a Brazilian city

    NASA Astrophysics Data System (ADS)

    Brito, P. L.; Jesus, E. G. V.; Sant'Ana, R. M. S.; Martins, C.; Delgado, J. P. M.; Fernandes, V. O.

    2014-11-01

    In July 2013 a group of undergraduate students from the Federal University of Bahia, Brazil, published a collaborative web map called "Where I Was Robbed". Their initial efforts to publicize the web map were restricted to announcing it on a local radio station as a tool of social interest. In two months the map had almost 10,000 reports, 155 reports per day, and people from more than 350 cities had already reported a crime. The present study investigates the spatial correlation between this collaborative web map and the official robbery data registered in the Secretary of Public Safety database for the city of Salvador, Bahia. A kernel density estimator combined with map algebra was used for the investigation. Spatial correlations with official robbery data for the city of Salvador were not found initially, but after standardizing the collaborative data and mining the official registers, both datasets pointed at very similar areas as the main hot spots for pedestrian robbery. Both areas are located in two of the most economically active areas of the city, although the web map's crime reports were more concentrated in an area with a higher-income population. These results indicate that the collaborative application is being used mainly by the middle- and upper-class segments of the city population, but it can still provide significant information on public safety priority areas. Therefore, extended divulgation of the collaborative crime map application in local papers and on radio and TV, together with partnerships with official agencies, is strongly recommended.
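    The hot-spot step can be sketched as a Gaussian kernel density surface over report coordinates, with map algebra then reduced to comparing two such surfaces cell by cell. Bandwidth and grid size below are illustrative choices, not the study's parameters:

    ```python
    import numpy as np

    # Minimal Gaussian kernel density sketch for hot-spot mapping: point
    # coordinates -> density surface on a regular grid. Bandwidth and grid
    # resolution are illustrative, not the parameters used in the study.
    def kde_grid(points, xmin, xmax, ymin, ymax, n=50, bandwidth=1.0):
        xs = np.linspace(xmin, xmax, n)
        ys = np.linspace(ymin, ymax, n)
        gx, gy = np.meshgrid(xs, ys)
        density = np.zeros_like(gx)
        for px, py in points:
            # accumulate one isotropic Gaussian kernel per reported event
            density += np.exp(-((gx - px) ** 2 + (gy - py) ** 2)
                              / (2 * bandwidth ** 2))
        return density / (2 * np.pi * bandwidth ** 2 * len(points))
    ```

    Subtracting or correlating the surface built from collaborative reports with the one built from official registers is the map-algebra comparison the abstract describes.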

  11. Services for Emodnet-Chemistry Data Products

    NASA Astrophysics Data System (ADS)

    Santinelli, Giorgio; Hendriksen, Gerrit; Barth, Alexander

    2016-04-01

    In the framework of the Emodnet Chemistry lot, data products from regional leaders were made available in order to transform the information into a database. This has been done using functions and scripts that read so-called enriched ODV files and insert the data directly into a cloud relational geodatabase. The main table is the observations table, which contains the main data and metadata associated with the enriched ODV files. A particular implementation of data loading is used in order to improve on-the-fly computational speed. Data from the Baltic Sea, North Sea, Mediterranean, Black Sea and part of the Atlantic region have been entered into the geodatabase, and are consequently instantly available from the OceanBrowser Emodnet portal. Furthermore, Deltares has developed an application that provides additional visualisation services for the aggregated and validated data collections. The visualisations are produced by making use of part of the OpenEarth tool stack (http://www.openearth.eu), by the integration of Web Feature Services and by the implementation of Web Processing Services. The goal is the generation of server-side plots of time series, profiles, time profiles and maps of selected parameters from the datasets of selected stations. Regional data collections are retrieved using the Emodnet Chemistry cloud relational geodatabase. The spatial resolution in time and the intensity of data availability for selected parameters are shown using Web Service requests via the OceanBrowser Emodnet web portal. OceanBrowser also shows station reference codes, which are used to establish a link to additional metadata, further data shopping and download.

  12. Interoperability Between Coastal Web Atlases Using Semantic Mediation: A Case Study of the International Coastal Atlas Network (ICAN)

    NASA Astrophysics Data System (ADS)

    Wright, D. J.; Lassoued, Y.; Dwyer, N.; Haddad, T.; Bermudez, L. E.; Dunne, D.

    2009-12-01

    Coastal mapping plays an important role in informing marine spatial planning, resource management, maritime safety, hazard assessment and even national sovereignty. As such, there is now a plethora of data/metadata catalogs, pre-made maps, tabular and text information on resource availability and exploitation, and decision-making tools. A recent trend has been to encapsulate these in a special class of web-enabled geographic information systems called a coastal web atlas (CWA). While multiple benefits are derived from tailor-made atlases, there is great value added from the integration of disparate CWAs. CWAs linked to one another support more successful queries, optimizing planning and decision-making. If a dataset is missing in one atlas, it may be immediately located in another. Similar datasets in two atlases may be combined to enhance study in either region. But how best to achieve semantic interoperability to mitigate vague data queries, concepts or natural language semantics when retrieving and integrating data and information? We report on the development of a new prototype seeking to interoperate between two initial CWAs: the Marine Irish Digital Atlas (MIDA) and the Oregon Coastal Atlas (OCA). These two mature atlases are used as a testbed for more regional connections, with the intent for the OCA to use lessons learned to develop a regional network of CWAs along the west coast, and for MIDA to do the same in building and strengthening atlas networks with the UK, Belgium, and other parts of Europe. Our prototype achieves semantic interoperability via services harmonization and ontology mediation, allowing local atlases to use their own data structures and vocabularies (ontologies). We use standard technologies such as OGC Web Map Services (WMS) for delivering maps, and the OGC Catalogue Service for the Web (CSW) for delivering and querying ISO-19139 metadata. The metadata records of a given CWA use a given ontology of terms, called the local ontology. Human or machine users formulate their requests using a common ontology of metadata terms, called the global ontology. A CSW mediator rewrites the user's request into CSW requests over local CSWs using their own (local) ontologies, collects the results and sends them back to the user. To extend the system, we have recently added global maritime boundaries and are also considering nearshore ocean observing system data. Ongoing work includes adding WFS, error management, and exception handling, enabling Smart Searches, and writing full documentation. This prototype is a central research project of the new International Coastal Atlas Network (ICAN), a group of 30+ organizations from 14 nations (and growing) dedicated to seeking interoperability approaches to CWAs in support of coastal zone management and the translation of coastal science to coastal decision-making.
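
    The global/local ontology mediation described above can be reduced to a minimal sketch: a request phrased in global-ontology terms is rewritten into each atlas's local vocabulary before being forwarded to its CSW. The term mappings below are invented for illustration and are not the actual MIDA or OCA ontologies:

```python
# Minimal sketch of a mediator's term-rewriting step. Each atlas keeps
# its own (local) vocabulary; the mediator maps global terms onto it.
# All atlas names and term mappings here are hypothetical examples.
GLOBAL_TO_LOCAL = {
    "mida": {"shoreline": "coastline", "bathymetry": "depth_grid"},
    "oca": {"shoreline": "coast_line", "bathymetry": "bathy"},
}

def rewrite_query(atlas: str, global_terms: list) -> list:
    """Translate global-ontology terms into one atlas's local terms,
    dropping any term the local ontology does not cover."""
    mapping = GLOBAL_TO_LOCAL[atlas]
    return [mapping[t] for t in global_terms if t in mapping]

# One global request fans out to per-atlas local requests; the mediator
# would then collect and merge the CSW results for the user.
query = ["shoreline", "bathymetry", "kelp_beds"]
for atlas in GLOBAL_TO_LOCAL:
    print(atlas, "->", rewrite_query(atlas, query))
```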

  13. Hospital-based nurses' perceptions of the adoption of Web 2.0 tools for knowledge sharing, learning, social interaction and the production of collective intelligence.

    PubMed

    Lau, Adela S M

    2011-11-11

    Web 2.0 provides a platform or a set of tools such as blogs, wikis, really simple syndication (RSS), podcasts, tags, social bookmarks, and social networking software for knowledge sharing, learning, social interaction, and the production of collective intelligence in a virtual environment. Web 2.0 is also becoming increasingly popular in e-learning and e-social communities. The objectives were to investigate how Web 2.0 tools can be applied for knowledge sharing, learning, social interaction, and the production of collective intelligence in the nursing domain and to investigate what behavioral perceptions are involved in the adoption of Web 2.0 tools by nurses. The decomposed technology acceptance model was applied to construct the research model on which the hypotheses were based. A questionnaire was developed based on the model and data from nurses (n = 388) were collected from late January 2009 until April 30, 2009. Pearson's correlation analysis and t tests were used for data analysis. Intention toward using Web 2.0 tools was positively correlated with usage behavior (r = .60, P < .05). Behavioral intention was positively correlated with attitude (r = .72, P < .05), perceived behavioral control (r = .58, P < .05), and subjective norm (r = .45, P < .05). In their decomposed constructs, perceived usefulness (r = .7, P < .05), relative advantage (r = .64, P < .05), and compatibility (r = .60, P < .05) were positively correlated with attitude, but perceived ease of use was not significantly correlated (r = .004, P < .05) with it. Peer (r = .47, P < .05), senior management (r = .24, P < .05), and hospital (r = .45, P < .05) influences had positive correlations with subjective norm. Resource (r = .41, P < .05) and technological (r = .69, P < .05) conditions were positively correlated with perceived behavioral control. 
The identified behavioral perceptions may further health policy makers' understanding of nurses' concerns regarding and barriers to the adoption of Web 2.0 tools and enable them to better plan the strategy of implementation of Web 2.0 tools for knowledge sharing, learning, social interaction, and the production of collective intelligence.
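
    The analysis reported above is a set of Pearson correlations between questionnaire constructs; a minimal sketch of that computation, using synthetic scores in place of the nurses' survey responses (the study's n = 388):

```python
# Sketch of a construct-to-construct Pearson correlation, as in the
# decomposed TAM analysis above. The scores are simulated stand-ins,
# not the study's survey data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 388  # sample size matching the study
attitude = rng.normal(3.5, 0.8, n)           # e.g. 5-point-scale construct score
# Simulate behavioral intention as partly driven by attitude plus noise
intention = 0.7 * attitude + rng.normal(0, 0.5, n)

r, p = pearsonr(attitude, intention)
print(f"r = {r:.2f}, P = {p:.3g}")
```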

  14. Hospital-Based Nurses’ Perceptions of the Adoption of Web 2.0 Tools for Knowledge Sharing, Learning, Social Interaction and the Production of Collective Intelligence

    PubMed Central

    2011-01-01

    Background Web 2.0 provides a platform or a set of tools such as blogs, wikis, really simple syndication (RSS), podcasts, tags, social bookmarks, and social networking software for knowledge sharing, learning, social interaction, and the production of collective intelligence in a virtual environment. Web 2.0 is also becoming increasingly popular in e-learning and e-social communities. Objectives The objectives were to investigate how Web 2.0 tools can be applied for knowledge sharing, learning, social interaction, and the production of collective intelligence in the nursing domain and to investigate what behavioral perceptions are involved in the adoption of Web 2.0 tools by nurses. Methods The decomposed technology acceptance model was applied to construct the research model on which the hypotheses were based. A questionnaire was developed based on the model and data from nurses (n = 388) were collected from late January 2009 until April 30, 2009. Pearson’s correlation analysis and t tests were used for data analysis. Results Intention toward using Web 2.0 tools was positively correlated with usage behavior (r = .60, P < .05). Behavioral intention was positively correlated with attitude (r = .72, P < .05), perceived behavioral control (r = .58, P < .05), and subjective norm (r = .45, P < .05). In their decomposed constructs, perceived usefulness (r = .7, P < .05), relative advantage (r = .64, P < .05), and compatibility (r = .60, P < .05) were positively correlated with attitude, but perceived ease of use was not significantly correlated (r = .004, P < .05) with it. Peer (r = .47, P < .05), senior management (r = .24, P < .05), and hospital (r = .45, P < .05) influences had positive correlations with subjective norm. Resource (r = .41, P < .05) and technological (r = .69, P < .05) conditions were positively correlated with perceived behavioral control. 
Conclusions The identified behavioral perceptions may further health policy makers’ understanding of nurses’ concerns regarding and barriers to the adoption of Web 2.0 tools and enable them to better plan the strategy of implementation of Web 2.0 tools for knowledge sharing, learning, social interaction, and the production of collective intelligence. PMID:22079851

  15. Navigating complex patients using an innovative tool: the MTM Spider Web.

    PubMed

    Morello, Candis M; Hirsch, Jan D; Lee, Kelly C

    2013-01-01

    To introduce a teaching tool that can be used to assess the complexity of medication therapy management (MTM) patients, prioritize appropriate interventions, and design patient-centered care plans for each encounter. MTM patients are complex as a result of multiple comorbidities, medications, and socioeconomic and behavioral issues. Pharmacists who provide MTM services are required to synthesize a plethora of information (medical and nonmedical), evaluate and prioritize the clinical problems, and design a comprehensive patient-centered care plan. The MTM Spider Web is a visual tool to facilitate this process. A description is provided regarding how to build the MTM Spider Web using case-based scenarios. This model can be used to teach pharmacists, health professional students, and patients. The MTM Spider Web is an innovative teaching tool that can be used to teach pharmacists and students how to assess complex patients and design a patient-centered care plan to deliver the most appropriate medication therapy.

  16. IDS plot tools for time series of DORIS station positions and orbit residuals

    NASA Astrophysics Data System (ADS)

    Soudarin, L.; Ferrage, P.; Moreaux, G.; Mezerette, A.

    2012-12-01

    DORIS (Doppler Orbitography and Radiopositioning Integrated by Satellite) is a Doppler satellite tracking system developed for precise orbit determination and precise ground location. It is onboard the Cryosat-2, Jason-1, Jason-2 and HY-2A altimetric satellites and the remote sensing satellites SPOT-4 and SPOT-5. It also flew with SPOT-2, SPOT-3, TOPEX/POSEIDON and ENVISAT. Since 1994, thanks to its worldwide distributed network of more than fifty permanent stations, DORIS has contributed to the realization and maintenance of the ITRS (International Terrestrial Reference System). 3D positions and velocities of the reference sites, at cm and mm/yr accuracy, support scientific studies in geodesy and geophysics. The primary objective of the International DORIS Service (IDS) is to provide support, through DORIS data and products, to research and operational activities. In order to promote the use of the DORIS products, the IDS has made available on its web site (ids-doris.org) a new set of tools, called Plot tools, to interactively build and display graphs of DORIS station coordinate time series and orbit residuals. These web tools are STCDtool, providing station coordinate time series (North, East, Up position evolution) from the IDS Analysis Centers, and POEtool, providing statistics time series (orbit residuals and number of measurements for the DORIS stations) from the CNES (the French Space Agency) Precise Orbit Determination processing. Complementary data about station and satellite events can also be displayed (e.g. antenna changes, system failures, degraded data). Information about earthquakes obtained from the USGS can also be superimposed on the position time series. All these events can help in interpreting discontinuities in the time series. The purpose of this presentation is to show the functionalities of these tools and their interest for monitoring crustal deformation at DORIS sites.

  17. Analysis of Sea Level Rise in Action

    NASA Astrophysics Data System (ADS)

    Gill, K. M.; Huang, T.; Quach, N. T.; Boening, C.

    2016-12-01

    NASA's Sea Level Change Portal provides scientists and the general public with a "one-stop" source for current sea level change information and data. Sea level rise research is multidisciplinary, and in order to understand its causes, scientists must be able to access different measurements and compare them. The portal includes an interactive tool, called the Data Analysis Tool (DAT), for accessing, visualizing, and analyzing observations and models relevant to the study of sea level rise. Using NEXUS, an open source, big data analytic technology developed at the Jet Propulsion Laboratory, the DAT is able to provide users with on-the-fly analysis of all relevant parameters. DAT is composed of three major components: a dedicated instance of OnEarth (a WMTS service), the NEXUS deep data analytic platform, and the JPL Common Mapping Client (CMC) for the web browser based user interface (UI). Utilizing the global imagery, a user can browse the data visually and isolate areas of interest for further study. The interface's "Analysis" tool provides tools for area or point selection, single and/or comparative dataset selection, and a range of options, algorithms, and plotting. This analysis component utilizes the NEXUS cloud computing platform to provide on-demand processing of the data within the user-selected parameters and immediate display of the results. A RESTful web API is exposed for users comfortable with other interfaces who may want to take advantage of the cloud computing capabilities. This talk discusses how DAT enables on-the-fly sea level research. The talk will introduce the DAT with an end-to-end tour of the tool, with exploration and animation of available imagery, a demonstration of comparative analysis and plotting, and how to share and export data along with images for use in publications/presentations. The session will cover what kind of data is available, what kind of analysis is possible, and what the outputs are.

  18. War Gamers Handbook: A Guide for Professional War Gamers

    DTIC Science & Technology

    2015-11-01

    more complex games led us to integrate knowledge management, web tools, and multitouch, multiuser technologies in order to more efficiently and... Multitouch multiuser (MTMU) and communications operating picture (COP) interfaces ◊ Web development—Web tools and player interfaces Now that the game...hurricane or flood scenario to provide a plausible backdrop to facilitate player interaction toward game objectives. Scenarios should include only the

  19. Using WebCT as a Supplemental Tool to Enhance Critical Thinking and Engagement among Developmental Reading Students

    ERIC Educational Resources Information Center

    Burgess, Melissa L.

    2009-01-01

    The purpose of this research was to examine possible outcomes of developmental students' critical thinking and motivation to read when the online learning community, WebCT, was implemented. My role, in addition to instructor, was that of participant-observer. I implemented WebCT tools, such as discussion board and chat, over a four-month period…

  20. Exploring the Relationship between Web 2.0 Tools Self-Efficacy and Teachers' Use of These Tools in Their Teaching

    ERIC Educational Resources Information Center

    Alhassan, Riyadh

    2017-01-01

    The purpose of this study was to examine the relationship between teachers' self-efficacy in using Web 2.0 tools and some demographic variables, and their use of those tools in their teaching. The study data was collected from a random sample of public school teachers in Riyadh, Saudi Arabia. The results showed a strong positive relationship…

  1. Analysis of Utility and Use of a Web-Based Tool for Digital Signal Processing Teaching by Means of a Technological Acceptance Model

    ERIC Educational Resources Information Center

    Toral, S. L.; Barrero, F.; Martinez-Torres, M. R.

    2007-01-01

    This paper presents an exploratory study about the development of a structural and measurement model for the technological acceptance (TAM) of a web-based educational tool. The aim consists of measuring not only the use of this tool, but also the external variables with a significant influence in its use for planning future improvements. The tool,…

  2. Share2Quit: Web-Based Peer-Driven Referrals for Smoking Cessation

    PubMed Central

    2013-01-01

    Background Smoking is the number one preventable cause of death in the United States. Effective Web-assisted tobacco interventions are often underutilized and require new and innovative engagement approaches. Web-based peer-driven chain referrals, successfully used outside health care, have the potential to increase the reach of Internet interventions. Objective The objective of our study was to describe the protocol for the development and testing of proactive Web-based chain-referral tools for increasing access to Decide2Quit.org, a Web-assisted tobacco intervention system. Methods We will build and refine proactive chain-referral tools, including email and Facebook referrals. In addition, we will implement respondent-driven sampling (RDS), a controlled chain-referral sampling technique designed to remove inherent biases in chain referrals and obtain a representative sample. We will begin our chain referrals with an initial recruitment of former and current smokers as seeds (initial participants), who will be trained to refer current smokers from their social network using the developed tools. In turn, these newly referred smokers will also be provided the tools to refer other smokers from their social networks. We will model predictors of referral success using sample weights from the RDS to estimate the success of the system in the targeted population. Results This protocol describes the evaluation of proactive Web-based chain-referral tools, which can be used in tobacco interventions to increase access to hard-to-reach populations, for promoting smoking cessation. Conclusions Share2Quit represents an innovative advancement by capitalizing on naturally occurring technology trends to recruit smokers to Web-assisted tobacco interventions. PMID:24067329
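
    The respondent-driven sampling weighting mentioned above can be sketched with the Volz-Heckathorn inverse-degree estimator, one common way RDS corrects for the over-recruitment of well-connected participants; the degrees and outcomes below are illustrative, not study data:

```python
# Sketch of Volz-Heckathorn RDS weighting: each respondent is weighted
# by the inverse of their reported network size (degree), since people
# with larger networks are more likely to be reached by chain referral.
# Degrees and the binary outcome below are invented for illustration.
def vh_weights(degrees):
    """Inverse-degree sampling weights, normalized to sum to n."""
    inv = [1.0 / d for d in degrees]
    scale = len(degrees) / sum(inv)
    return [w * scale for w in inv]

def weighted_proportion(outcomes, weights):
    """Weighted estimate of a binary outcome (e.g. made a quit attempt)."""
    return sum(o * w for o, w in zip(outcomes, weights)) / sum(weights)

degrees = [2, 5, 10, 20, 50]   # reported network sizes
quit = [1, 1, 0, 0, 0]         # hypothetical binary outcome per respondent
w = vh_weights(degrees)
naive = sum(quit) / len(quit)
print(f"naive = {naive:.3f}, RDS-weighted = {weighted_proportion(quit, w):.3f}")
```

    The weighted estimate up-weights low-degree respondents, which is the bias correction the protocol relies on when modeling referral success.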

  3. Social Web mining and exploitation for serious applications: Technosocial Predictive Analytics and related technologies for public health, environmental and national security surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamel Boulos, Maged; Sanfilippo, Antonio P.; Corley, Courtney D.

    2010-03-17

    This paper explores techno-social predictive analytics (TPA) and related methods for Web "data mining" where users' posts and queries are garnered from Social Web ("Web 2.0") tools such as blogs, microblogging and social networking sites to form coherent representations of real-time health events. The paper includes a brief introduction to commonly used Social Web tools such as mashups and aggregators, and maps their exponential growth as an open architecture of participation for the masses and an emerging way to gain insight about the collective health status of whole populations. Several health related tool examples are described and demonstrated as practical means through which health professionals might create clear location specific pictures of epidemiological data such as flu outbreaks.

  4. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP Web...

  5. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP Web...

  6. 5 CFR 1640.6 - Methods of providing information.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... part to participants by making it available on the TSP Web site. A participant can request paper copies of that information from the TSP by calling the ThriftLine, submitting a request through the TSP Web...

  7. Application of ESE Data and Tools to Air Quality Management: Services for Helping the Air Quality Community use ESE Data (SHAirED)

    NASA Technical Reports Server (NTRS)

    Falke, Stefan; Husar, Rudolf

    2011-01-01

    The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US States and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and a service-oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions in the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications and established community-oriented infrastructure were: developing access to distributed data (surface and satellite); building Web infrastructure to support data access, processing and analysis; creating tools for data processing and analysis; and fostering air quality community collaboration and interoperability.

  8. Implementing Web 2.0 Tools in the Classroom: Four Teachers' Accounts

    ERIC Educational Resources Information Center

    Kovalik, Cindy; Kuo, Chia-Ling; Cummins, Megan; Dipzinski, Erin; Joseph, Paula; Laskey, Stephanie

    2014-01-01

    In this paper, four teachers shared their experiences using the following free Web 2.0 tools with their students: Jing, Wix, Google Sites, and Blogger. The teachers found that students reacted positively to lessons in which these tools were used, and also noted improvements they could make when using them in the future.

  9. eCDRweb User Guide–Primary Support

    EPA Pesticide Factsheets

    This document presents the user guide for the Office of Pollution Prevention and Toxics’ (OPPT) e-CDR web tool. E-CDRweb is the electronic, web-based tool provided by the Environmental Protection Agency (EPA) for the submission of Chemical Data Reporting (CDR) information. This document is the user guide for the Primary Support user of the e-CDRweb tool.

  10. Webquest 2.0: An Instructional Model for Digital Learners

    ERIC Educational Resources Information Center

    Dell, Diana F. Abernathy

    2012-01-01

    Teaching and learning tools such as Moodle and Web 2.0 tools are appearing in K-12 classrooms; however, there is a lack of scholarly research to guide the implementation of these tools. The WebQuest model, a widely adopted inquiry-based model for online instruction, has instructional inadequacies and does not make the most of emerging…

  11. Web-Based Machine Translation as a Tool for Promoting Electronic Literacy and Language Awareness

    ERIC Educational Resources Information Center

    Williams, Lawrence

    2006-01-01

    This article addresses a pervasive problem of concern to teachers of many foreign languages: the use of Web-Based Machine Translation (WBMT) by students who do not understand the complexities of this relatively new tool. Although networked technologies have greatly increased access to many language and communication tools, WBMT is still…

  12. eCDRweb User Guide–Secondary Support

    EPA Pesticide Factsheets

    This document presents the user guide for the Office of Pollution Prevention and Toxics’ (OPPT) e-CDR web tool. E-CDRweb is the electronic, web-based tool provided by the Environmental Protection Agency (EPA) for the submission of Chemical Data Reporting (CDR) information. This document is the user guide for the Secondary Support user of the e-CDRweb tool.

  13. Web Based Personal Nutrition Management Tool

    NASA Astrophysics Data System (ADS)

    Bozkurt, Selen; Zayim, Neşe; Gülkesen, Kemal Hakan; Samur, Mehmet Kemal

    The Internet is being used increasingly as a resource for accessing health-related information because of its several advantages, and Internet tailoring has therefore recently become quite popular in health education and personal health management. Today, there are many web based health programs designed for individuals. Among these, nutrition and weight management programs are popular because obesity has become a heavy burden for populations worldwide. In this study, we designed a web based personal nutrition education and management tool, The Nutrition Web Portal, in order to enhance patients' nutrition knowledge and promote behavioral change against obesity. The present paper reports the analysis, design and development processes of The Nutrition Web Portal.

  14. A Software Engineering Approach based on WebML and BPMN to the Mediation Scenario of the SWS Challenge

    NASA Astrophysics Data System (ADS)

    Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina

    Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).

  15. Charting Our Path with a Web Literacy Map

    ERIC Educational Resources Information Center

    Dalton, Bridget

    2015-01-01

    Being a literacy teacher today means being a teacher of Web literacies. This article features the "Web Literacy Map", an open source tool from Mozilla's Webmaker project. The map focuses on Exploring (navigating the Web), Building (creating for the Web), and Connecting (participating on the Web). Readers are invited to use resources,…

  16. The Role of the Web Server in a Capstone Web Application Course

    ERIC Educational Resources Information Center

    Umapathy, Karthikeyan; Wallace, F. Layne

    2010-01-01

    Web applications have become commonplace in the Information Systems curriculum. Much of the discussion about Web development for capstone courses has centered on the scripting tools. Very little has been discussed about different ways to incorporate the Web server into Web application development courses. In this paper, three different ways of…

  17. Genomic Enzymology: Web Tools for Leveraging Protein Family Sequence-Function Space and Genome Context to Discover Novel Functions.

    PubMed

    Gerlt, John A

    2017-08-22

    The exponentially increasing number of protein and nucleic acid sequences provides opportunities to discover novel enzymes, metabolic pathways, and metabolites/natural products, thereby adding to our knowledge of biochemistry and biology. The challenge has evolved from generating sequence information to mining the databases to integrating and leveraging the available information, i.e., the availability of "genomic enzymology" web tools. Web tools that allow identification of biosynthetic gene clusters are widely used by the natural products/synthetic biology community, thereby facilitating the discovery of novel natural products and the enzymes responsible for their biosynthesis. However, many novel enzymes with interesting mechanisms participate in uncharacterized small-molecule metabolic pathways; their discovery and functional characterization also can be accomplished by leveraging information in protein and nucleic acid databases. This Perspective focuses on two genomic enzymology web tools that assist the discovery of novel metabolic pathways: (1) Enzyme Function Initiative-Enzyme Similarity Tool (EFI-EST) for generating sequence similarity networks to visualize and analyze sequence-function space in protein families and (2) Enzyme Function Initiative-Genome Neighborhood Tool (EFI-GNT) for generating genome neighborhood networks to visualize and analyze the genome context in microbial and fungal genomes. Both tools have been adapted to other applications to facilitate target selection for enzyme discovery and functional characterization. As the natural products community has demonstrated, the enzymology community needs to embrace the essential role of web tools that allow the protein and genome sequence databases to be leveraged for novel insights into enzymological problems.

  18. Genomic Enzymology: Web Tools for Leveraging Protein Family Sequence–Function Space and Genome Context to Discover Novel Functions

    PubMed Central

    2017-01-01

    The exponentially increasing number of protein and nucleic acid sequences provides opportunities to discover novel enzymes, metabolic pathways, and metabolites/natural products, thereby adding to our knowledge of biochemistry and biology. The challenge has evolved from generating sequence information to mining the databases to integrating and leveraging the available information, i.e., the availability of “genomic enzymology” web tools. Web tools that allow identification of biosynthetic gene clusters are widely used by the natural products/synthetic biology community, thereby facilitating the discovery of novel natural products and the enzymes responsible for their biosynthesis. However, many novel enzymes with interesting mechanisms participate in uncharacterized small-molecule metabolic pathways; their discovery and functional characterization also can be accomplished by leveraging information in protein and nucleic acid databases. This Perspective focuses on two genomic enzymology web tools that assist the discovery of novel metabolic pathways: (1) Enzyme Function Initiative-Enzyme Similarity Tool (EFI-EST) for generating sequence similarity networks to visualize and analyze sequence–function space in protein families and (2) Enzyme Function Initiative-Genome Neighborhood Tool (EFI-GNT) for generating genome neighborhood networks to visualize and analyze the genome context in microbial and fungal genomes. Both tools have been adapted to other applications to facilitate target selection for enzyme discovery and functional characterization. As the natural products community has demonstrated, the enzymology community needs to embrace the essential role of web tools that allow the protein and genome sequence databases to be leveraged for novel insights into enzymological problems. PMID:28826221
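
    The sequence similarity network (SSN) idea behind EFI-EST can be sketched minimally: proteins become nodes, an edge joins any pair whose pairwise similarity score exceeds a chosen threshold, and the connected components of the resulting graph approximate putative isofunctional clusters. The protein names and scores below are invented for illustration:

```python
# Minimal sketch of an SSN: threshold pairwise similarity scores into a
# graph, then cluster by connected components. All data are hypothetical.
scores = {  # hypothetical pairwise similarity scores (0-100)
    ("A", "B"): 92, ("A", "C"): 88, ("B", "C"): 90,
    ("C", "D"): 35, ("D", "E"): 95,
}

def ssn_components(scores, threshold):
    """Connected components of the graph whose edges are score pairs
    at or above `threshold`."""
    nodes = {n for pair in scores for n in pair}
    adj = {n: set() for n in nodes}
    for (a, b), s in scores.items():
        if s >= threshold:
            adj[a].add(b)
            adj[b].add(a)
    seen, comps = set(), []
    for n in sorted(nodes):
        if n in seen:
            continue
        stack, comp = [n], set()
        while stack:             # depth-first traversal of one component
            v = stack.pop()
            if v not in comp:
                comp.add(v)
                stack.extend(adj[v] - comp)
        seen |= comp
        comps.append(comp)
    return comps

print(ssn_components(scores, threshold=80))
```

    Raising the threshold splits clusters into finer groups, which mirrors how EFI-EST users tune the alignment-score cutoff when exploring sequence-function space.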

  19. preAssemble: a tool for automatic sequencer trace data processing.

    PubMed

    Adzhubei, Alexei A; Laerdahl, Jon K; Vlasova, Anna V

    2006-01-17

    Trace or chromatogram files (raw data) are produced by automatic nucleic acid sequencing equipment, or sequencers. Each file contains information which can be interpreted by specialised software to reveal the sequence (base calling). This is done by the sequencer's proprietary software or by publicly available programs. Depending on the size of a sequencing project, the number of trace files can vary from just a few to thousands. Sequence quality assessment on various criteria is important at the stage preceding clustering and contig assembly. Two major publicly available packages, Phred and Staden, are used by preAssemble to perform sequence quality processing. The preAssemble pre-assembly sequence processing pipeline has been developed for small to large scale automatic processing of DNA sequencer chromatogram (trace) data. The Staden Package Pregap4 module and the base-calling program Phred are utilized in the pipeline, which produces detailed and self-explanatory output that can be displayed with a web browser. preAssemble can be used successfully with very little previous experience; however, options for parameter tuning are provided for advanced users. preAssemble runs under UNIX and LINUX operating systems. It is available for downloading and will run as stand-alone software. It can also be accessed on the Norwegian Salmon Genome Project web site, where preAssemble jobs can be run on the project server. preAssemble is a tool for performing quality assessment of sequences generated by automatic sequencing equipment. preAssemble is flexible, since both interactive jobs on the preAssemble server and the stand-alone downloadable version are available. Virtually no previous experience is necessary to run a default preAssemble job; on the other hand, options for parameter tuning are provided. Consequently, preAssemble can be used as efficiently for just a few trace files as for large-scale sequence processing.

  20. Criteria for Comparing Children's Web Search Tools.

    ERIC Educational Resources Information Center

    Kuntz, Jerry

    1999-01-01

    Presents criteria for evaluating and comparing Web search tools designed for children. Highlights include database size; accountability; categorization; search access methods; help files; spell check; URL searching; links to alternative search services; advertising; privacy policy; and layout and design. (LRW)
