Sample records for google web services

  1. Taking advantage of Google's Web-based applications and services.

    PubMed

    Brigham, Tara J

    2014-01-01

Google is a company that is constantly expanding and growing its services and products. While most librarians have a "love/hate" relationship with Google, there are a number of reasons to consider exploring some of the tools Google has created and made freely available. Applications and services such as Google Docs, Slides, and Google+ are functional and dynamic without the cost of comparable products. This column addresses some of the issues users should be aware of before signing up to use Google's tools, describes some of Google's Web applications and services, and explains how they can be useful to librarians in health care.

2. Boverhof's App Earns Honorable Mention in Amazon's Web Services Competition

    Science.gov Websites

Boverhof's app earned an honorable mention in the Amazon Web Services (AWS) EC2 Spotathon competition. Amazon officially announced the winners of its EC2 Spotathon on Monday.

3. Systems and Methods for Decoy Routing and Covert Channel Bonding

    DTIC Science & Technology

    2013-11-26

34 Proc. R. Soc. A, vol. 463, Jan. 12, 2007, pp. 1-16. "Stupid Censorship Web Proxy," http://www.stupidcensorship.com/, retrieved from the internet on...services such as those offered by Google or Skype, web or microblogs such as Twitter, various social media services such as Facebook, and file...device (e.g., Skype, Google, Jabber, Firefox) to be directed to the proprietary software for processing. For instance, the proprietary software of

  4. Feature Positioning on Google Street View Panoramas

    NASA Astrophysics Data System (ADS)

    Tsai, V. J. D.; Chang, C.-T.

    2012-07-01

Location-based services (LBS) on web-based maps and images have proliferated since Google launched its Street View imaging service in 2007. This research employs the Google Maps API and Web Service, GAE for Java, AJAX, Proj4js, CSS, and HTML in developing an internet platform for accessing the orientation parameters of Google Street View (GSV) panoramas in order to determine the three-dimensional position of interest features that appear on two overlapping panoramas by geometric intersection. A pair of GSV panoramas was examined using known points located on the Library Building of National Chung Hsing University (NCHU), with root-mean-squared errors of ±0.522 m, ±1.230 m, and ±5.779 m for intersection and ±0.142 m, ±1.558 m, and ±5.733 m for resection in X, Y, and h (elevation), respectively. Potential error sources in GSV positioning were analyzed, illustrating that the errors in the Google-provided GSV positional parameters dominate the errors in geometric intersection. The developed system is suitable for data collection in establishing LBS applications integrated with Google Maps and Google Earth for traffic sign and infrastructure inventory, by adding automatic extraction and matching techniques for points of interest (POI) from GSV panoramas.
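
    The intersection step lends itself to a compact illustration: ignoring elevation, two view rays from known panorama centres and azimuths meet at the solution of a 2-by-2 linear system. The sketch below is a hypothetical planar simplification, not the authors' code; positions, angles, and the coordinate convention are illustrative assumptions.

    ```python
    import numpy as np

    def intersect_bearings(p1, az1, p2, az2):
        """Planar intersection of two view rays.

        p1, p2   : (x, y) panorama centres in a local metric frame
        az1, az2 : azimuths to the feature, radians clockwise from north
        Returns the (x, y) intersection, or None for near-parallel rays.
        """
        p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
        d1 = np.array([np.sin(az1), np.cos(az1)])   # ray 1 direction
        d2 = np.array([np.sin(az2), np.cos(az2)])   # ray 2 direction
        A = np.column_stack([d1, -d2])              # solve p1+t1*d1 == p2+t2*d2
        if abs(np.linalg.det(A)) < 1e-9:            # rays (nearly) parallel
            return None
        t = np.linalg.solve(A, p2 - p1)
        return p1 + t[0] * d1

    # Two panoramas 20 m apart sighting the same building corner:
    print(intersect_bearings((0, 0), np.radians(45), (20, 0), np.radians(-45)))
    # -> [10. 10.]
    ```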

  5. Google Scholar Usage: An Academic Library's Experience

    ERIC Educational Resources Information Center

    Wang, Ya; Howard, Pamela

    2012-01-01

    Google Scholar is a free service that provides a simple way to broadly search for scholarly works and to connect patrons with the resources libraries provide. The researchers in this study analyzed Google Scholar usage data from 2006 for three library tools at San Francisco State University: SFX link resolver, Web Access Management proxy server,…

  6. With Free Google Alert Services

    ERIC Educational Resources Information Center

    Gunn, Holly

    2005-01-01

    Alert services are a great way of keeping abreast of topics that interest you. Rather than searching the Web regularly to find new content about your areas of interest, an alert service keeps you informed by sending you notices when new material is added to the Web that matches your registered search criteria. Alert services are examples of push…

  7. Deep Web video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None Available

    To make the web work better for science, OSTI has developed state-of-the-art technologies and services including a deep web search capability. The deep web includes content in searchable databases available to web users but not accessible by popular search engines, such as Google. This video provides an introduction to the deep web search engine.

  8. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-03-16

of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services, Google Compute Engine, Rackspace, et al. means that...Implementation We implemented keylime in ∼3.2k lines of Python in four components: registrar, node, CV, and tenant. The registrar offers a REST-based web...bootstrap key K. It provides an unencrypted REST-based web service for these two functions. As described earlier, the protocols for exchanging data

  9. Deep Web video

    ScienceCinema

    None Available

    2018-02-06

    To make the web work better for science, OSTI has developed state-of-the-art technologies and services including a deep web search capability. The deep web includes content in searchable databases available to web users but not accessible by popular search engines, such as Google. This video provides an introduction to the deep web search engine.

  10. Google Is Not the Net: Social Networks Are Surging and Present the Real Service Challenge--And Opportunity--For Libraries

    ERIC Educational Resources Information Center

    Albanese, Andrew Richard

    2006-01-01

    This article observes that it's not hard to understand why Google creates such unease among librarians. The profession, however, can't afford to be myopic when it comes to Google. As inescapable as it is, Google is not the Internet. And as the web evolves, new opportunities and challenges loom larger for libraries than who's capturing the bulk of…

  11. Multi-Resource Fair Queueing for Packet Processing

    DTIC Science & Technology

    2012-06-19

Huawei, Intel, MarkLogic, Microsoft, NetApp, Oracle, Quanta, Splunk, VMware and by DARPA (contract #FA8650-11-C-7136). Multi-Resource Fair Queueing for...Google PhD Fellowship, gifts from Amazon Web Services, Google, SAP, Blue Goji, Cisco, Cloudera, Ericsson, General Electric, Hewlett Packard, Huawei

  12. Beyond Google: The Invisible Web in the Academic Library

    ERIC Educational Resources Information Center

    Devine, Jane; Egger-Sider, Francine

    2004-01-01

    This article analyzes the concept of the Invisible Web and its implication for academic librarianship. It offers a guide to tools that can be used to mine the Invisible Web and discusses the benefits of using the Invisible Web to promote interest in library services. In addition, the article includes an expanded definition, a literature review,…

  13. Challenging Google, Microsoft Unveils a Search Tool for Scholarly Articles

    ERIC Educational Resources Information Center

    Carlson, Scott

    2006-01-01

    Microsoft has introduced a new search tool to help people find scholarly articles online. The service, which includes journal articles from prominent academic societies and publishers, puts Microsoft in direct competition with Google Scholar. The new free search tool, which should work on most Web browsers, is called Windows Live Academic Search…

  14. A Case Study in Web 2.0 Application Development

    NASA Astrophysics Data System (ADS)

    Marganian, P.; Clark, M.; Shelton, A.; McCarty, M.; Sessoms, E.

    2010-12-01

Recent web technologies focusing on languages, frameworks, and tools are discussed, using the Robert C. Byrd Green Bank Telescope's (GBT) new Dynamic Scheduling System as the primary example. Within that example, we use a popular Python web framework, Django, to build the extensive web services for our users. We also use a second, complementary server, written in Haskell, to incorporate the core scheduling algorithms. We provide a desktop-quality experience across all the popular browsers for our users with the Google Web Toolkit and judicious use of jQuery in Django templates. Single sign-on and authentication throughout all NRAO web services is accomplished via the Central Authentication Service (CAS) protocol.

  15. Googling DNA sequences on the World Wide Web.

    PubMed

    Hajibabaei, Mehrdad; Singer, Gregory A C

    2009-11-10

New web-based technologies provide an excellent opportunity for sharing and accessing information and for using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm, and implemented it, for searching species-specific genomic sequences, DNA barcodes, using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm based on dividing a sequence library (DNA barcodes) and a query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages. We developed pre- and post-processing software to provide customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data. It provides a robust and efficient solution for sequence search on the web. The integration of our search method for large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
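
    As a toy sketch of the word-based idea (not the authors' implementation), each barcode and the query can be decomposed into overlapping k-letter words, after which a conventional keyword search amounts to counting shared words; the sequences below are made up.

    ```python
    def to_words(seq, k=8):
        """Overlapping k-letter 'words' -- the indexable units a conventional
        text search engine (e.g. Google Desktop Search) would see."""
        seq = seq.upper()
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def score(query, barcode, k=8):
        """Fraction of the query's words found in the barcode."""
        q = to_words(query, k)
        return len(q & to_words(barcode, k)) / len(q) if q else 0.0

    library = {  # toy DNA 'barcodes', not real data
        "species_A": "ACGTACGTTGCAGGCTTACGATCG",
        "species_B": "TTGCAAGGCTCGATCGGATCCTAG",
    }
    query = "ACGTACGTTGCAGGCT"
    best = max(library, key=lambda name: score(query, library[name]))
    print(best, score(query, library[best]))   # -> species_A 1.0
    ```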

  16. [Google Scholar and the h-index in biomedicine: the popularization of bibliometric assessment].

    PubMed

    Cabezas-Clavijo, A; Delgado-López-Cózar, E

    2013-01-01

The aim of this study is to review the features, benefits and limitations of the new scientific evaluation products derived from Google Scholar, such as Google Scholar Metrics and Google Scholar Citations, as well as the h-index, which is the standard bibliometric indicator adopted by these services. The study also outlines the potential of this new database as a source for studies in Biomedicine, and compares the h-index obtained by the most relevant journals and researchers in the field of intensive care medicine, based on data extracted from the Web of Science, Scopus and Google Scholar. Results show that although the average h-index values in Google Scholar are almost 30% higher than those obtained in Web of Science, and about 15% higher than those collected by Scopus, there are no substantial changes in the rankings generated from one data source or the other. Despite some technical problems, it is concluded that Google Scholar is a valid tool for researchers in Health Sciences, both for purposes of information retrieval and for the computation of bibliometric indicators.
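
    The h-index these services report is simple to recompute from per-paper citation counts: h is the largest number such that h papers have at least h citations each. A minimal sketch with made-up counts:

    ```python
    def h_index(citations):
        """Largest h such that h papers have >= h citations each."""
        h = 0
        for rank, c in enumerate(sorted(citations, reverse=True), start=1):
            if c < rank:
                break
            h = rank
        return h

    print(h_index([42, 18, 11, 7, 5, 5, 2, 1]))   # -> 5
    ```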

  17. A Web-Based Interactive Mapping System of State Wide School Performance: Integrating Google Maps API Technology into Educational Achievement Data

    ERIC Educational Resources Information Center

    Wang, Kening; Mulvenon, Sean W.; Stegman, Charles; Anderson, Travis

    2008-01-01

    Google Maps API (Application Programming Interface), released in late June 2005 by Google, is an amazing technology that allows users to embed Google Maps in their own Web pages with JavaScript. Google Maps API has accelerated the development of new Google Maps based applications. This article reports a Web-based interactive mapping system…

  18. Humans Do It Better: Inside the Open Directory Project.

    ERIC Educational Resources Information Center

    Sherman, Chris

    2000-01-01

    Explains the Open Directory Project (ODP), an attempt to catalog the World Wide Web by creating a human-compiled Web directory. Discusses the history of the project; open source models; the use of volunteer editors; quality control; problems and complaints; and use of ODP data by commercial services such as Google. (LRW)

  19. a Map Mash-Up Application: Investigation the Temporal Effects of Climate Change on Salt Lake Basin

    NASA Astrophysics Data System (ADS)

    Kirtiloglu, O. S.; Orhan, O.; Ekercin, S.

    2016-06-01

The main purpose of this paper is to investigate climate change effects that have occurred at the beginning of the twenty-first century in the Konya Closed Basin (KCB), located in the semi-arid central Anatolian region of Turkey, and particularly in the Salt Lake region, where many major wetlands are situated, and to share the analysis results online in a Web Geographical Information System (GIS) environment. 71 Landsat 5-TM, 7-ETM+, and 8-OLI images and meteorological data obtained from 10 meteorological stations have been used in the scope of this work. 56 of the Landsat images have been used for extraction of the Salt Lake surface area through multi-temporal Landsat imagery collected from 2000 to 2014 in the Salt Lake basin. 15 of the Landsat images have been used to make thematic maps of the Normalised Difference Vegetation Index (NDVI) in the KCB, and data from the 10 meteorological stations have been used to generate the Standardized Precipitation Index (SPI), which is used in drought studies. For the purpose of visualizing and sharing the results, a Web GIS-like environment has been established by using Google Maps and its data storage and manipulation product, Fusion Tables, both free-of-charge Google Web service elements. The infrastructure of the web application includes HTML5, CSS3, JavaScript, Google Maps API V3, and Google Fusion Tables API technologies. These technologies make it possible to build effective "map mash-ups" involving an embedded Google Map in a Web page, storing spatial or tabular data in Fusion Tables, and adding these data as a map layer on the embedded map. The analysis process and the map mash-up application are discussed in detail in the main sections of this paper.
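
    Of the indices mapped here, NDVI is the easiest to reproduce: NDVI = (NIR - Red) / (NIR + Red), computed per pixel. A minimal numpy sketch, assuming reflectance arrays for the Landsat TM near-infrared band (band 4) and red band (band 3); the values are illustrative.

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Per-pixel NDVI = (NIR - Red) / (NIR + Red); zero-sum pixels -> 0."""
        nir, red = nir.astype(float), red.astype(float)
        denom = nir + red
        out = np.zeros_like(denom)
        ok = denom != 0
        out[ok] = (nir[ok] - red[ok]) / denom[ok]
        return out

    nir = np.array([[0.50, 0.40], [0.30, 0.20]])   # toy 2x2 NIR reflectances
    red = np.array([[0.10, 0.20], [0.25, 0.20]])   # toy 2x2 red reflectances
    print(ndvi(nir, red))
    ```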

  20. Spatiotemporal-Thematic Data Processing for the Semantic Web

    NASA Astrophysics Data System (ADS)

    Hakimpour, Farshad; Aleman-Meza, Boanerges; Perry, Matthew; Sheth, Amit

    This chapter presents practical approaches to data processing in the space, time and theme dimensions using existing Semantic Web technologies. It describes how we obtain geographic and event data from Internet sources and also how we integrate them into an RDF store. We briefly introduce a set of functionalities in space, time and semantics. These functionalities are implemented based on our existing technology for main-memory-based RDF data processing developed at the LSDIS Lab. A number of these functionalities are exposed as REST Web services. We present two sample client-side applications that are developed using a combination of our services with Google Maps service.

  1. Stopping Web Plagiarists from Stealing Your Content

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

This article gives tips on how to avoid having content stolen by plagiarists. Suggestions include: using a Web search service such as Google to search for unique strings of text from the individual's site to uncover other sites with the same content; buying an infringement-detection program; or hiring a public relations firm to do the work. There are…

  2. Croatian Medical Journal citation score in Web of Science, Scopus, and Google Scholar.

    PubMed

    Sember, Marijan; Utrobicić, Ana; Petrak, Jelka

    2010-04-01

To analyze the 2007 citation count of articles published by the Croatian Medical Journal in 2005-2006 based on data from the Web of Science, Scopus, and Google Scholar. Web of Science and Scopus were searched for the articles published in 2005-2006. As all articles returned by Scopus were included in Web of Science, the latter list was the sample for further analysis. Total citation counts for each article on the list were retrieved from Web of Science, Scopus, and Google Scholar. The overlap and unique citations were compared and analyzed. Proportions were compared using the chi-squared test. Google Scholar returned the greatest proportion of articles with citations (45%), followed by Scopus (42%), and Web of Science (38%). Almost half (49%) of the articles had no citations and 11% had an equal number of identical citations in all 3 databases. The greatest overlap was found between Web of Science and Scopus (54%), followed by Scopus and Google Scholar (51%), and Web of Science and Google Scholar (44%). The greatest number of unique citations was found by Google Scholar (n=86). The majority of these citations (64%) came from journals, followed by books and PhD theses. Approximately 55% of all citing documents were full-text resources in open access. The language of citing documents was mostly English, but as many as 25 citing documents (29%) were in Chinese. Google Scholar shares a total of 42% of the citations returned by the two other, more influential bibliographic resources. The list of unique citations in Google Scholar is predominantly journal based, but these journals are mainly of local character. Citations received by internationally recognized medical journals are crucial for increasing the visibility of small medical journals but Google Scholar may serve as an alternative bibliometric tool for an orientational citation insight.
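
    The overlap figures can be recomputed from per-database citation sets; the sketch below uses toy citation identifiers and a union-based overlap, which is an assumption, as the study's exact overlap definition may differ.

    ```python
    # Citations to one article as seen by each database (toy identifiers).
    wos    = {"c1", "c2", "c3", "c4"}
    scopus = {"c2", "c3", "c4", "c5"}
    gs     = {"c3", "c4", "c5", "c6", "c7"}

    def overlap(a, b):
        """Shared citations as a percentage of the combined set."""
        return 100 * len(a & b) / len(a | b)

    print(f"WoS/Scopus {overlap(wos, scopus):.0f}%, "
          f"Scopus/GS {overlap(scopus, gs):.0f}%, "
          f"WoS/GS {overlap(wos, gs):.0f}%")
    unique_gs = gs - wos - scopus      # citations only Google Scholar found
    print(sorted(unique_gs))
    ```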

  3. Curating the Web: Building a Google Custom Search Engine for the Arts

    ERIC Educational Resources Information Center

    Hennesy, Cody; Bowman, John

    2008-01-01

    Google's first foray onto the web made search simple and results relevant. With its Co-op platform, Google has taken another step toward dramatically increasing the relevancy of search results, further adapting the World Wide Web to local needs. Google Custom Search Engine, a tool on the Co-op platform, puts one in control of his or her own search…

  4. Some Features of "Alt" Texts Associated with Images in Web Pages

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2006-01-01

    Introduction: This paper extends a series on summaries of Web objects, in this case, the alt attribute of image files. Method: Data were logged from 1894 pages from Yahoo!'s random page service and 4703 pages from the Google directory; an img tag was extracted randomly from each where present; its alt attribute, if any, was recorded; and the…

  5. IRIS Earthquake Browser with Integration to the GEON IDV for 3-D Visualization of Hypocenters.

    NASA Astrophysics Data System (ADS)

    Weertman, B. R.

    2007-12-01

We present a new generation of web-based earthquake query tool: the IRIS Earthquake Browser (IEB). The IEB combines the DMC's large set of earthquake catalogs (provided by USGS/NEIC, ISC, and the ANF) with the popular Google Maps web interface. With the IEB you can quickly and easily find earthquakes in any region of the globe. Using Google's detailed satellite images, earthquakes can be easily co-located with natural geographic features such as volcanoes as well as man-made features such as commercial mines. A set of controls allows earthquakes to be filtered by time, magnitude, and depth range as well as by catalog name, contributor name, and magnitude type. Displayed events can be easily exported in NetCDF format into the GEON Integrated Data Viewer (IDV), where hypocenters may be visualized in three dimensions. Looking "under the hood", the IEB is based on AJAX technology and utilizes REST-style web services hosted at the IRIS DMC. The IEB is part of a broader effort at the DMC aimed at making our data holdings available via web services. The IEB is useful both educationally and as a research tool.
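
    The REST-style queries behind such an AJAX client reduce to HTTP GETs with time, magnitude, and depth filters. The sketch below targets the FDSN event web service at the IRIS DMC, used here as an assumed stand-in for the IEB's internal services; the parameter names follow the FDSN convention.

    ```python
    import urllib.parse
    import urllib.request

    params = {
        "starttime": "2007-01-01", "endtime": "2007-12-31",
        "minmagnitude": "6.0",
        "maxdepth": "100",        # kilometres
        "format": "text",
    }
    url = ("https://service.iris.edu/fdsnws/event/1/query?"
           + urllib.parse.urlencode(params))
    with urllib.request.urlopen(url) as resp:
        print(resp.read().decode()[:400])   # first few catalog rows
    ```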

  6. Differences in the quality of information on the internet about lung cancer between the United States and Japan.

    PubMed

    Goto, Yasushi; Sekine, Ikuo; Sekiguchi, Hiroshi; Yamada, Kazuhiko; Nokihara, Hiroshi; Yamamoto, Noboru; Kunitoh, Hideo; Ohe, Yuichiro; Tamura, Tomohide

    2009-07-01

The quality of information available over the Internet has been a cause for concern. Our goal was to evaluate the quality of information available on lung cancer in the United States and Japan and to assess the differences between the two. We conducted a prospective, observational Web review by searching the term "lung cancer" in Japanese and English, using Google Japan (Google-J), Google United States (Google-U), and Yahoo Japan (Yahoo-J). The first 50 Web sites displayed were evaluated from the ethical perspective and for the validity of the information. The administrator of each Web site was also investigated. Ethical policies were generally well described in the Web sites displayed by Google-U but less well so in the sites displayed by Google-J and Yahoo-J. The differences in the validity of the available information were more striking, in that 80% of the Web sites generated by Google-U described the most appropriate treatment methods, whereas less than 50% of the Web sites displayed by Google-J and Yahoo-J recommended the standard therapy, and more than 10% advertised alternative therapy. Nonprofit organizations and public institutions were the primary Web site administrators in the United States, whereas commercial or personal Web sites were more frequent in Japan. Differences in the quality of information on lung cancer available over the Internet were apparent between Japan and the United States. The reasons for such differences might be traced to the administrators of the Web sites. Nonprofit organizations and public institutions are the up-and-coming Web site administrators for relaying reliable medical information.

  7. The Effects of Online Homework on First Year Pre-Service Science Teachers' Learning Achievements of Introductory Organic Chemistry

    ERIC Educational Resources Information Center

    Ratniyom, Jadsada; Boonphadung, Suttipong; Unnanantn, Thassanant

    2016-01-01

    This study examined the effects of the introductory organic chemistry online homework on first year pre-service science teachers' learning achievements. The online homework was created using a web-based Google form in order to enhance the pre-service science teachers' learning achievements. The steps for constructing online homework were…

  8. Google Scholar Goes to School: The Presence of Google Scholar on College and University Web Sites

    ERIC Educational Resources Information Center

    Neuhaus, Chris; Neuhaus, Ellen; Asher, Alan

    2008-01-01

    This study measured the degree of Google Scholar adoption within academia by analyzing the frequency of Google Scholar appearances on 948 campus and library Web sites, and by ascertaining the establishment of link resolution between Google Scholar and library resources. Results indicate a positive correlation between the implementation of Google…

  9. Available, intuitive and free! Building e-learning modules using web 2.0 services.

    PubMed

    Tam, Chun Wah Michael; Eastwood, Anne

    2012-01-01

E-learning is part of the mainstream in medical education and often provides the most efficient and effective means of engaging learners in a particular topic. However, translating design and content ideas into a useable product can be technically challenging, especially in the absence of information technology (IT) support. There is little published literature on the use of web 2.0 services to build e-learning activities. To describe the web 2.0 tools and solutions employed to build the GP Synergy evidence-based medicine and critical appraisal online course. We used and integrated a number of free web 2.0 services including: Prezi, a web-based presentation platform; YouTube, a video sharing service; Google Docs, an online document platform; Tiny.cc, a URL shortening service; and WordPress, a blogging platform. The course, consisting of five multimedia-rich, tutorial-like modules, was built without IT specialist assistance or specialised software. The web 2.0 services used were free. The course can be accessed with a modern web browser. Modern web 2.0 services remove many of the technical barriers for creating and sharing content on the internet. When used synergistically, these services can be a flexible and low-cost platform for building e-learning activities. They were a pragmatic solution in our context.

  10. Croatian Medical Journal Citation Score in Web of Science, Scopus, and Google Scholar

    PubMed Central

    Šember, Marijan; Utrobičić, Ana; Petrak, Jelka

    2010-01-01

Aim To analyze the 2007 citation count of articles published by the Croatian Medical Journal in 2005-2006 based on data from the Web of Science, Scopus, and Google Scholar. Methods Web of Science and Scopus were searched for the articles published in 2005-2006. As all articles returned by Scopus were included in Web of Science, the latter list was the sample for further analysis. Total citation counts for each article on the list were retrieved from Web of Science, Scopus, and Google Scholar. The overlap and unique citations were compared and analyzed. Proportions were compared using the χ2 test. Results Google Scholar returned the greatest proportion of articles with citations (45%), followed by Scopus (42%), and Web of Science (38%). Almost half (49%) of the articles had no citations and 11% had an equal number of identical citations in all 3 databases. The greatest overlap was found between Web of Science and Scopus (54%), followed by Scopus and Google Scholar (51%), and Web of Science and Google Scholar (44%). The greatest number of unique citations was found by Google Scholar (n = 86). The majority of these citations (64%) came from journals, followed by books and PhD theses. Approximately 55% of all citing documents were full-text resources in open access. The language of citing documents was mostly English, but as many as 25 citing documents (29%) were in Chinese. Conclusion Google Scholar shares a total of 42% of the citations returned by the two other, more influential bibliographic resources. The list of unique citations in Google Scholar is predominantly journal based, but these journals are mainly of local character. Citations received by internationally recognized medical journals are crucial for increasing the visibility of small medical journals but Google Scholar may serve as an alternative bibliometric tool for an orientational citation insight. PMID:20401951

  11. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-12-01

simultaneous cloud nodes. 1. INTRODUCTION The proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as...Amazon Web Services and Google Compute Engine means more cloud tenants are hosting sensitive, private, and business-critical data and applications in the...thousands of IaaS resources as they are elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features

  12. Media Use in Higher Education from a Cross-National Perspective

    ERIC Educational Resources Information Center

    Grosch, Michael

    2013-01-01

    The web 2.0 has already penetrated the learning environment of students ubiquitously. This dissemination of online services into tertiary education has led to constant changes in students' learning and study behaviour. Students use services such as Google and Wikipedia most often not only during free time but also for learning. At the same…

  13. FastLane: An Agile Congestion Signaling Mechanism for Improving Datacenter Performance

    DTIC Science & Technology

    2013-05-20

Cloudera, Ericsson, Facebook, General Electric, Hortonworks, Huawei, Intel, Microsoft, NetApp, Oracle, Quanta, Samsung, Splunk, VMware and Yahoo...Web Services, Google, SAP, Blue Goji, Cisco, Clearstory Data, Cloudera, Ericsson, Facebook, General Electric, Hortonworks, Huawei, Intel, Microsoft

  14. How to Optimize Your Web Site

    ERIC Educational Resources Information Center

    Dysart, Joe

    2008-01-01

    Given Google's growing market share--69% of all searches by the close of 2007--it's absolutely critical for any school on the Web to ensure its site is Google-friendly. A Google-optimized site ensures that students and parents can quickly find one's district on the Web even if they don't know the address. Plus, good search optimization simply…

  15. Flipping the Online Classroom with Web 2.0: The Asynchronous Workshop

    ERIC Educational Resources Information Center

    Cummings, Lance

    2016-01-01

    This article examines how Web 2.0 technologies can be used to "flip" the online classroom by creating asynchronous workshops in social environments where immediacy and social presence can be maximized. Using experience teaching several communication and writing classes in Google Apps (Google+, Google Hangouts, Google Drive, etc.), I…

  16. Web GIS in practice III: creating a simple interactive map of England's Strategic Health Authorities using Google Maps API, Google Earth KML, and MSN Virtual Earth Map Control

    PubMed Central

    Boulos, Maged N Kamel

    2005-01-01

    This eye-opener article aims at introducing the health GIS community to the emerging online consumer geoinformatics services from Google and Microsoft (MSN), and their potential utility in creating custom online interactive health maps. Using the programmable interfaces provided by Google and MSN, we created three interactive demonstrator maps of England's Strategic Health Authorities. These can be browsed online at – Google Maps API (Application Programming Interface) version, – Google Earth KML (Keyhole Markup Language) version, and – MSN Virtual Earth Map Control version. Google and MSN's worldwide distribution of "free" geospatial tools, imagery, and maps is to be commended as a significant step towards the ultimate "wikification" of maps and GIS. A discussion is provided of these emerging online mapping trends, their expected future implications and development directions, and associated individual privacy, national security and copyrights issues. Although ESRI have announced their planned response to Google (and MSN), it remains to be seen how their envisaged plans will materialize and compare to the offerings from Google and MSN, and also how Google and MSN mapping tools will further evolve in the near future. PMID:16176577

  17. Development of a Web-Based Visualization Platform for Climate Research Using Google Earth

    NASA Technical Reports Server (NTRS)

    Sun, Xiaojuan; Shen, Suhung; Leptoukh, Gregory G.; Wang, Panxing; Di, Liping; Lu, Mingyue

    2011-01-01

Recently, it has become easier to access climate data from satellites, ground measurements, and models from various data centers. However, searching, accessing, and processing heterogeneous data from different sources are very time-consuming tasks. There is a lack of a comprehensive visual platform to acquire distributed and heterogeneous scientific data and to render processed images from a single access point for climate studies. This paper documents the design and implementation of a Web-based visual, interoperable, and scalable platform that is able to access climatological fields from models, satellites, and ground stations from a number of data sources using Google Earth (GE) as a common graphical interface. The development is based on the TCP/IP protocol and various data-sharing open sources, such as OPeNDAP, GDS, Web Processing Service (WPS), and Web Mapping Service (WMS). The visualization capability of integrating various measurements into GE extends dramatically the awareness and visibility of scientific results. Using embedded geographic information in GE, the designed system improves our understanding of the relationships of different elements in a four-dimensional domain. The system enables easy and convenient synergistic research on a virtual platform for professionals and the general public, greatly advancing global data sharing and scientific research collaboration.

  18. Bootstrapping and Maintaining Trust in the Cloud

    DTIC Science & Technology

    2016-12-01

proliferation and popularity of infrastructure-as-a-service (IaaS) cloud computing services such as Amazon Web Services and Google Compute Engine means...IaaS trusted computing system: • Secure Bootstrapping – the system should enable the tenant to securely install an initial root secret into each cloud...elastically instantiated and terminated. Prior cloud trusted computing solutions address a subset of these features, but none achieve all. Excalibur [31] sup

  19. OntoMaton: a bioportal powered ontology widget for Google Spreadsheets.

    PubMed

    Maguire, Eamonn; González-Beltrán, Alejandra; Whetzel, Patricia L; Sansone, Susanna-Assunta; Rocca-Serra, Philippe

    2013-02-15

    Data collection in spreadsheets is ubiquitous, but current solutions lack support for collaborative semantic annotation that would promote shared and interdisciplinary annotation practices, supporting geographically distributed players. OntoMaton is an open source solution that brings ontology lookup and tagging capabilities into a cloud-based collaborative editing environment, harnessing Google Spreadsheets and the NCBO Web services. It is a general purpose, format-agnostic tool that may serve as a component of the ISA software suite. OntoMaton can also be used to assist the ontology development process. OntoMaton is freely available from Google widgets under the CPAL open source license; documentation and examples at: https://github.com/ISA-tools/OntoMaton.
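
    An ontology term lookup of the kind OntoMaton performs can be sketched against the NCBO BioPortal REST API. The endpoint, the apikey parameter, and the response fields below are assumptions based on BioPortal's public documentation (a free key is required); this is not OntoMaton's own code.

    ```python
    import json
    import urllib.parse
    import urllib.request

    API_KEY = "YOUR-NCBO-API-KEY"    # placeholder; register for a free key
    q = urllib.parse.urlencode({"q": "melanoma", "apikey": API_KEY})
    with urllib.request.urlopen(f"https://data.bioontology.org/search?{q}") as resp:
        hits = json.load(resp)["collection"]
    for h in hits[:3]:
        print(h.get("prefLabel"), h.get("@id"))   # term label and ontology IRI
    ```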

  20. Googling trends in conservation biology.

    PubMed

    Proulx, Raphaël; Massicotte, Philippe; Pépino, Marc

    2014-02-01

Web-crawling approaches, that is, automated programs data mining the internet to obtain information about a particular process, have recently been proposed for monitoring early signs of ecosystem degradation or for establishing crop calendars. However, lack of a clear conceptual and methodological framework has prevented the development of such approaches within the field of conservation biology. Our objective was to illustrate how Google Trends, a freely accessible web-crawling engine, can be used to track changes in timing of biological processes, spatial distribution of invasive species, and level of public awareness about key conservation issues. Google Trends returns the number of internet searches that were made for a keyword in a given region of the world over a defined period. Using data retrieved online for 13 countries, we exemplify how Google Trends can be used to study the timing of biological processes, such as the seasonal recurrence of pollen release or mosquito outbreaks across a latitudinal gradient. We mapped the spatial extent of results from Google Trends for 5 invasive species in the United States and found geographic patterns in invasions that are consistent with their coarse-grained distribution at state levels. From 2004 through 2012, Google Trends showed that the level of public interest and awareness about conservation issues related to ecosystem services, biodiversity, and climate change increased, decreased, and followed both trends, respectively. Finally, to further the development of research approaches at the interface of conservation biology, collective knowledge, and environmental management, we developed an algorithm that allows the rapid retrieval of Google Trends data.
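
    The authors' retrieval algorithm is not reproduced here, but for readers who want comparable data, the third-party pytrends package (an assumed, unofficial Google Trends client; pip install pytrends) wraps the retrieval in a few calls:

    ```python
    from pytrends.request import TrendReq   # unofficial Google Trends client

    pytrends = TrendReq(hl="en-US", tz=0)
    pytrends.build_payload(["mosquito"],
                           timeframe="2004-01-01 2012-12-31", geo="US")
    weekly = pytrends.interest_over_time()   # pandas DataFrame, 0-100 scale
    print(weekly["mosquito"].idxmax())       # week of peak search interest
    ```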

  1. EntrezAJAX: direct web browser access to the Entrez Programming Utilities.

    PubMed

    Loman, Nicholas J; Pallen, Mark J

    2010-06-21

Web applications for biology and medicine often need to integrate data from Entrez services provided by the National Center for Biotechnology Information. However, direct access to Entrez from a web browser is not possible due to 'same-origin' security restrictions. The use of "Asynchronous JavaScript and XML" (AJAX) to create rich, interactive web applications is now commonplace. The ability to access Entrez via AJAX would be advantageous in the creation of integrated biomedical web resources. We describe EntrezAJAX, which provides access to Entrez eUtils and is able to circumvent same-origin browser restrictions. EntrezAJAX is easily implemented by JavaScript developers and provides identical functionality to Entrez eUtils as well as enhanced functionality to ease development. We provide easy-to-understand developer examples written in JavaScript to illustrate potential uses of this service. For the purposes of speed, reliability and scalability, EntrezAJAX has been deployed on Google App Engine, a freely available cloud service. The EntrezAJAX webpage is located at http://entrezajax.appspot.com/
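
    The same-origin workaround is, in essence, a proxy pattern: a server the browser is allowed to call performs the Entrez request on the page's behalf. EntrezAJAX itself is a JavaScript/App Engine service; the minimal Python sketch below only illustrates the proxied call, with the endpoint and parameters following NCBI's public eUtils conventions.

    ```python
    import json
    import urllib.parse
    import urllib.request

    def esearch(term, db="pubmed", retmax=5):
        """Fetch PubMed IDs matching a term via the eUtils esearch service."""
        base = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
        query = urllib.parse.urlencode(
            {"db": db, "term": term, "retmax": retmax, "retmode": "json"})
        with urllib.request.urlopen(f"{base}?{query}") as resp:
            return json.load(resp)["esearchresult"]["idlist"]

    print(esearch("lung cancer review"))
    ```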

  2. Medicine 2.0: social networking, collaboration, participation, apomediation, and openness.

    PubMed

    Eysenbach, Gunther

    2008-08-25

    In a very significant development for eHealth, broad adoption of Web 2.0 technologies and approaches coincides with the more recent emergence of Personal Health Application Platforms and Personally Controlled Health Records such as Google Health, Microsoft HealthVault, and Dossia. "Medicine 2.0" applications, services and tools are defined as Web-based services for health care consumers, caregivers, patients, health professionals, and biomedical researchers, that use Web 2.0 technologies and/or semantic web and virtual reality approaches to enable and facilitate specifically 1) social networking, 2) participation, 3) apomediation, 4) openness and 5) collaboration, within and between these user groups. The Journal of Medical Internet Research (JMIR) publishes a Medicine 2.0 theme issue and sponsors a conference on "How Social Networking and Web 2.0 changes Health, Health Care, Medicine and Biomedical Research", to stimulate and encourage research in these five areas.

  3. Medicine 2.0: Social Networking, Collaboration, Participation, Apomediation, and Openness

    PubMed Central

    2008-01-01

    In a very significant development for eHealth, a broad adoption of Web 2.0 technologies and approaches coincides with the more recent emergence of Personal Health Application Platforms and Personally Controlled Health Records such as Google Health, Microsoft HealthVault, and Dossia. “Medicine 2.0” applications, services, and tools are defined as Web-based services for health care consumers, caregivers, patients, health professionals, and biomedical researchers, that use Web 2.0 technologies and/or semantic web and virtual reality approaches to enable and facilitate specifically 1) social networking, 2) participation, 3) apomediation, 4) openness, and 5) collaboration, within and between these user groups. The Journal of Medical Internet Research (JMIR) publishes a Medicine 2.0 theme issue and sponsors a conference on “How Social Networking and Web 2.0 changes Health, Health Care, Medicine, and Biomedical Research”, to stimulate and encourage research in these five areas. PMID:18725354

  4. mORCA: sailing bioinformatics world with mobile devices.

    PubMed

    Díaz-Del-Pino, Sergio; Falgueras, Juan; Perez-Wohlfeil, Esteban; Trelles, Oswaldo

    2018-03-01

Nearly 10 years have passed since the first mobile apps appeared. Given that bioinformatics is a web-based world and that mobile devices are endowed with web browsers, it seemed natural that bioinformatics would transition from personal computers to mobile devices, but nothing could be further from the truth. The transition demands new paradigms, designs, and novel implementations. Through an in-depth analysis of the requirements of existing bioinformatics applications, we designed and deployed an easy-to-use, web-based, lightweight mobile client. Such a client is able to browse, select, automatically compose interface parameters, invoke services, and monitor the execution of Web Services using the service's metadata stored in catalogs or repositories. mORCA is available at http://bitlab-es.com/morca/app as a web-app. It is also available in the App Store by Apple and the Play Store by Google. The software will be available for at least 2 years. ortrelles@uma.es. Source code, final web-app, training material and documentation are available at http://bitlab-es.com/morca.

  5. Using secure web services to visualize poison center data for nationwide biosurveillance: a case study.

    PubMed

    Savel, Thomas G; Bronstein, Alvin; Duck, William; Rhodes, M Barry; Lee, Brian; Stinn, John; Worthen, Katherine

    2010-01-01

Real-time surveillance systems are valuable for timely response to public health emergencies. It has been challenging to leverage existing surveillance systems in state and local communities, and, using a centralized architecture, add new data sources and analytical capacity. Because this centralized model has proven to be difficult to maintain and enhance, the US Centers for Disease Control and Prevention (CDC) has been examining the ability to use a federated model based on secure web services architecture, with data stewardship remaining with the data provider. As a case study for this approach, the American Association of Poison Control Centers and the CDC extended an existing data warehouse via a secure web service, and shared aggregate clinical effects and case counts data by geographic region and time period. To visualize these data, CDC developed a web browser-based interface, Quicksilver, which leveraged the Google Maps API and Flot, a JavaScript plotting library. Two iterations of the NPDS web service were completed in 12 weeks. The visualization client, Quicksilver, was developed in four months. This implementation of web services combined with a visualization client represents incremental positive progress in transitioning national data sources like BioSense and NPDS to a federated data exchange model. Quicksilver effectively demonstrates how the use of secure web services in conjunction with a lightweight, rapidly deployed visualization client can easily integrate isolated data sources for biosurveillance.

  6. Radon

    MedlinePlus


  7. Mercury

    MedlinePlus


  8. Asbestos

    MedlinePlus


  9. Beryllium Toxicity

    MedlinePlus


  10. ToxFAQs

    MedlinePlus


  11. Side by Side: What a Comparative Usability Study Told Us about a Web Site Redesign

    ERIC Educational Resources Information Center

    Dougan, Kirstin; Fulton, Camilla

    2009-01-01

    Library Web sites must compete against easy-to-use sites, such as Google Scholar, Google Books, and Wikipedia, for students' time and attention. Library Web sites must therefore be designed with aesthetics and user perceptions at the forefront. The Music and Performing Arts Library at Urbana-Champaign's Web site was overcrowded and in much need of…

  12. 78 FR 60876 - Advisory Committee to the Director (ACD), Centers for Disease Control and Prevention (CDC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-02

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention Advisory... by teleconference. Please dial (877) 930-8819 and enter code 1579739. Web links: Windows Connection-2: http://wm.onlinevideoservice.com/CDC2 Flash Connection-4 (For Safari and Google Chrome Users): http...

  14. EntrezAJAX: direct web browser access to the Entrez Programming Utilities

    PubMed Central

    2010-01-01

Web applications for biology and medicine often need to integrate data from Entrez services provided by the National Center for Biotechnology Information. However, direct access to Entrez from a web browser is not possible due to 'same-origin' security restrictions. The use of "Asynchronous JavaScript and XML" (AJAX) to create rich, interactive web applications is now commonplace. The ability to access Entrez via AJAX would be advantageous in the creation of integrated biomedical web resources. We describe EntrezAJAX, which provides access to Entrez eUtils and is able to circumvent same-origin browser restrictions. EntrezAJAX is easily implemented by JavaScript developers and provides identical functionality to Entrez eUtils as well as enhanced functionality to ease development. We provide easy-to-understand developer examples written in JavaScript to illustrate potential uses of this service. For the purposes of speed, reliability and scalability, EntrezAJAX has been deployed on Google App Engine, a freely available cloud service. The EntrezAJAX webpage is located at http://entrezajax.appspot.com/ PMID:20565938

  15. Interfaces to PeptideAtlas: a case study of standard data access systems

    PubMed Central

    Handcock, Jeremy; Robinson, Thomas; Deutsch, Eric W.; Boyle, John

    2012-01-01

    Access to public data sets is important to the scientific community as a resource to develop new experiments or validate new data. Projects such as the PeptideAtlas, Ensembl and The Cancer Genome Atlas (TCGA) offer both access to public data and a repository to share their own data. Access to these data sets is often provided through a web page form and a web service API. Access technologies based on web protocols (e.g. http) have been in use for over a decade and are widely adopted across the industry for a variety of functions (e.g. search, commercial transactions, and social media). Each architecture adapts these technologies to provide users with tools to access and share data. Both commonly used web service technologies (e.g. REST and SOAP), and custom-built solutions over HTTP are utilized in providing access to research data. Providing multiple access points ensures that the community can access the data in the simplest and most effective manner for their particular needs. This article examines three common access mechanisms for web accessible data: BioMart, caBIG, and Google Data Sources. These are illustrated by implementing each over the PeptideAtlas repository and reviewed for their suitability based on specific usages common to research. BioMart, Google Data Sources, and caBIG are each suitable for certain uses. The tradeoffs made in the development of the technology are dependent on the uses each was designed for (e.g. security versus speed). This means that an understanding of specific requirements and tradeoffs is necessary before selecting the access technology. PMID:22941959

  16. Total Petroleum Hydrocarbons (TPH): ToxFAQs

    MedlinePlus


  17. Efficiently Communicating Rich Heterogeneous Geospatial Data from the FeMO2008 Dive Cruise with FlashMap on EarthRef.org

    NASA Astrophysics Data System (ADS)

    Minnett, R. C.; Koppers, A. A.; Staudigel, D.; Staudigel, H.

    2008-12-01

EarthRef.org is a comprehensive and convenient resource for Earth Science reference data and models. It encompasses four main portals: the Geochemical Earth Reference Model (GERM), the Magnetics Information Consortium (MagIC), the Seamount Biogeosciences Network (SBN), and the Enduring Resources for Earth Science Education (ERESE). Their underlying databases are publicly available, and the scientific community has contributed widely and is urged to continue to do so. However, the net result is a vast and largely heterogeneous warehouse of geospatial data, ranging from carefully prepared maps of seamounts to geochemical data/metadata, daily reports from seagoing expeditions, large volumes of raw and processed multibeam data, images of paleomagnetic sampling sites, etc. This presents a considerable obstacle for integrating other rich media content, such as videos, images, data files, cruise tracks, and interoperable database results, without overwhelming the web user. The four EarthRef.org portals clearly lend themselves to a more intuitive user interface and have, therefore, been an invaluable test bed for the design and implementation of FlashMap, a versatile KML-driven geospatial browser written for reliability and speed in Adobe Flash. FlashMap allows layers of content to be loaded and displayed over a streaming high-resolution map which can be zoomed and panned similarly to Google Maps and Google Earth. Many organizations, from National Geographic to the USGS, have begun using Google Earth software to display geospatial content. However, Google Earth, as a desktop application, does not integrate cleanly with existing websites, requiring the user to navigate away from the browser and focus on a separate application, and Google Maps, written in JavaScript, does not scale up reliably to large datasets. FlashMap remedies these problems as a web-based application that allows for seamless integration of the real-time display power of Google Earth and the flexibility of the web without losing scalability and control of the base maps. Our Flash-based application is fully compatible with KML (Keyhole Markup Language) 2.2, the most recent iteration of KML, allowing users with existing Google Earth KML files to effortlessly display their geospatial content embedded in a web page. As a test case for FlashMap, the annual Iron-Oxidizing Microbial Observatory (FeMO) dive cruise to the Loihi Seamount, in conjunction with data available from ongoing and published FeMO laboratory studies, showcases the flexibility of this single web-based application. With a KML 2.2-compatible web service providing the content, any database can display results in FlashMap. The user can then hide and show multiple layers of content, potentially from several data sources, and rapidly digest a vast quantity of information to narrow the search results. This flexibility gives experienced users the ability to drill down to exactly the record they are looking for (see SERC at Carleton College's educational application of FlashMap at http://serc.carleton.edu/sp/erese/activities/22223.html) and allows users familiar with Google Earth the ability to load and view geospatial data content within a browser from any computer with an internet connection.
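
    Because FlashMap consumes standard KML 2.2, any web service can feed it by emitting placemarks. A minimal generator sketch follows; the function, name, and coordinates are illustrative, not FeMO data.

    ```python
    def placemark(name, lon, lat, description=""):
        """One KML placemark; KML coordinates are lon,lat[,alt]."""
        return (f"  <Placemark>\n    <name>{name}</name>\n"
                f"    <description>{description}</description>\n"
                f"    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
                f"  </Placemark>")

    kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
           + placemark("Dive site (illustrative)", -155.27, 18.92,
                       "Loihi Seamount region") +
           '\n</Document>\n</kml>')
    with open("dive_sites.kml", "w") as f:
        f.write(kml)   # loadable by FlashMap or Google Earth
    ```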

  18. Oyster Fisheries App

    NASA Technical Reports Server (NTRS)

    Perez Guerrero, Geraldo A.; Armstrong, Duane; Underwood, Lauren

    2015-01-01

This project is creating a cloud-enabled HTML5 web application to help oyster fishermen and state agencies apply Earth science to improve the management of this important natural and economic resource. The Oyster Fisheries app gathers and analyzes environmental and water quality information, and alerts fishermen and resources managers about problems in oyster fishing waters. An intuitive interface based on Google Maps displays the geospatial information and provides familiar interactive controls to the users. Alerts can be tailored to notify users when conditions in specific leases or public fishing areas require attention. The app is hosted on the Amazon Web Services cloud. It is being developed and tested using some of the latest web development tools such as web components and Polymer.

  19. Next Generation Landsat Products Delivered Using Virtual Globes and OGC Standard Services

    NASA Astrophysics Data System (ADS)

    Neiers, M.; Dwyer, J.; Neiers, S.

    2008-12-01

    The Landsat Data Continuity Mission (LDCM) is the next in the series of Landsat satellite missions and is tasked with the objective of delivering data acquired by the Operational Land Imager (OLI). The OLI instrument will provide data continuity to over 30 years of global multispectral data collected by the Landsat series of satellites. The U.S. Geological Survey Earth Resources Observation and Science (USGS EROS) Center has responsibility for the development and operation of the LDCM ground system. One of the mission objectives of the LDCM is to distribute OLI data products electronically over the Internet to the general public on a nondiscriminatory basis and at no cost. To ensure the user community and general public can easily access LDCM data from multiple clients, the User Portal Element (UPE) of the LDCM ground system will use OGC standards and services such as Keyhole Markup Language (KML), Web Map Service (WMS), Web Coverage Service (WCS), and Geographic encoding of Really Simple Syndication (GeoRSS) feeds for both access to and delivery of LDCM products. The USGS has developed and tested the capabilities of several successful UPE prototypes for delivery of Landsat metadata, full resolution browse, and orthorectified (L1T) products from clients such as Google Earth, Google Maps, ESRI ArcGIS Explorer, and Microsoft's Virtual Earth. Prototyping efforts included the following services: using virtual globes to search the historical Landsat archive by dynamic generation of KML; notification of and access to new Landsat acquisitions and L1T downloads from GeoRSS feeds; Google indexing of KML files containing links to full resolution browse and data downloads; WMS delivery of reduced resolution browse, full resolution browse, and cloud mask overlays; and custom data downloads using WCS clients. These various prototypes will be demonstrated and LDCM service implementation plans will be discussed during this session.
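
    A WMS delivery like the one prototyped here reduces to a GetMap request with standardized parameters. The sketch below assembles one; the host and layer name are placeholder assumptions, while the parameter set is the one WMS 1.1.1 defines.

    ```python
    import urllib.parse

    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": "landsat_browse",       # hypothetical layer name
        "SRS": "EPSG:4326",               # WMS 1.1.1 uses SRS (1.3.0 uses CRS)
        "BBOX": "-180,-90,180,90",        # minx,miny,maxx,maxy
        "WIDTH": "1024", "HEIGHT": "512",
        "FORMAT": "image/png",
    }
    print("https://example.usgs.gov/wms?" + urllib.parse.urlencode(params))
    ```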

  20. Global Precipitation Measurement (GPM) Mission Products and Services at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC)

    NASA Technical Reports Server (NTRS)

    Liu, Z.; Ostrenga, D.; Vollmer, B.; Kempler, S.; Deshong, B.; Greene, M.

    2015-01-01

The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) hosts and distributes GPM data within the NASA Earth Observation System Data Information System (EOSDIS). The GES DISC is also home to the data archive for the GPM predecessor, the Tropical Rainfall Measuring Mission (TRMM). Over the past 17 years, the GES DISC has served the scientific as well as other communities with TRMM data and user-friendly services. During the GPM era, the GES DISC will continue to provide user-friendly data services and customer support to users around the world. GPM products currently or soon to be available include: Level-1 GPM Microwave Imager (GMI) and partner radiometer products and DPR products; Level-2 Goddard Profiling Algorithm (GPROF) GMI and partner products and DPR products; Level-3 daily and monthly products and DPR products; and Integrated Multi-satellitE Retrievals for GPM (IMERG) products (early, late, and final). A dedicated Web portal (including user guides, etc.) has been developed for GPM data (http://disc.sci.gsfc.nasa.gov/gpm). Data services that are currently or soon to be available include the Google-like Mirador (http://mirador.gsfc.nasa.gov/) for data search and access; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion into various formats (e.g., netCDF, HDF, KML (for Google Earth), ASCII); exploration, visualization, and statistical online analysis through Giovanni (http://giovanni.gsfc.nasa.gov); generation of value-added products; parameter and spatial subsetting; time aggregation; regridding; data version control and provenance; documentation; science support for proper data usage, FAQ, and a help desk; and monitoring services (e.g., Current Conditions) for applications. The Unified User Interface (UUI) is the next step in the evolution of the GES DISC web site. It attempts to provide seamless access to data, information, and services through a single interface without sending the user to different applications or URLs (e.g., search, access, subset, Giovanni, documents).

  1. ToxGuides: Quick Reference Pocket Guide for Toxicological Profiles

    MedlinePlus

  2. Using Secure Web Services to Visualize Poison Center Data for Nationwide Biosurveillance: A Case Study

    PubMed Central

    Savel, Thomas G; Bronstein, Alvin; Duck, William; Rhodes, M. Barry; Lee, Brian; Stinn, John; Worthen, Katherine

    2010-01-01

    Objectives: Real-time surveillance systems are valuable for timely response to public health emergencies. It has been challenging to leverage existing surveillance systems in state and local communities and, within a centralized architecture, to add new data sources and analytical capacity. Because this centralized model has proven to be difficult to maintain and enhance, the US Centers for Disease Control and Prevention (CDC) has been examining the ability to use a federated model based on a secure web services architecture, with data stewardship remaining with the data provider. Methods: As a case study for this approach, the American Association of Poison Control Centers and the CDC extended an existing data warehouse via a secure web service, and shared aggregate clinical effects and case counts data by geographic region and time period. To visualize these data, CDC developed a web browser-based interface, Quicksilver, which leveraged the Google Maps API and Flot, a JavaScript plotting library. Results: Two iterations of the NPDS web service were completed in 12 weeks. The visualization client, Quicksilver, was developed in four months. Discussion: This implementation of web services combined with a visualization client represents incremental positive progress in transitioning national data sources like BioSense and NPDS to a federated data exchange model. Conclusion: Quicksilver effectively demonstrates how the use of secure web services in conjunction with a lightweight, rapidly deployed visualization client can easily integrate isolated data sources for biosurveillance. PMID:23569581
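
    The federated pattern the case study describes, aggregate data served over a secure web service while raw records remain with the data steward, can be sketched with a generic HTTP client. Everything below (endpoint, parameters, response shape) is hypothetical, not the actual NPDS interface.

        import requests

        ENDPOINT = "https://npds.example.org/api/aggregate"   # hypothetical service

        resp = requests.get(
            ENDPOINT,
            params={"clinical_effect": "respiratory", "region": "HHS-4",
                    "start": "2010-01-01", "end": "2010-01-31"},
            headers={"Authorization": "Bearer <token>"},      # steward-controlled access
            timeout=30,
        )
        resp.raise_for_status()
        for row in resp.json()["counts"]:                     # assumed response shape
            print(row["date"], row["case_count"])             # aggregates only, no raw records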

  3. Cyber Exercise Playbook

    DTIC Science & Technology

    2014-11-01

    unclassified tools and techniques that can be shared with PNs, to include social engineering, spear phishing, fake web sites, physical access attempts, and... and instead rely on commercial services such as Yahoo or Google. Some nations have quite advanced cyber security practices, but may take vastly... unauthorized access to data/systems. Inject external network scanning, email phishing, malicious website access, social engineering. Sample

  4. Comparisons of citations in Web of Science, Scopus, and Google Scholar for articles published in general medical journals.

    PubMed

    Kulkarni, Abhaya V; Aziz, Brittany; Shams, Iffat; Busse, Jason W

    2009-09-09

    Until recently, Web of Science was the only database available to track citation counts for published articles. Other databases are now available, but their relative performance has not been established. Objective: To compare the citation count profiles of articles published in general medical journals among the citation databases of Web of Science, Scopus, and Google Scholar. Design: Cohort study of 328 articles published in JAMA, Lancet, or the New England Journal of Medicine between October 1, 1999, and March 31, 2000. Total citation counts for each article up to June 2008 were retrieved from Web of Science, Scopus, and Google Scholar. Article characteristics were analyzed in linear regression models to determine interaction with the databases. Main outcome measures: Number of citations received by an article since publication and article characteristics associated with citation in the databases. Results: Google Scholar and Scopus retrieved more citations per article, with medians of 160 (interquartile range [IQR], 83 to 324) and 149 (IQR, 78 to 289), respectively, than Web of Science (median, 122; IQR, 66 to 241) (P < .001 for both comparisons). Compared with Web of Science, Scopus retrieved more citations from non-English-language sources (median, 10.2% vs 4.1%) and reviews (30.8% vs 18.2%), and fewer citations from articles (57.2% vs 70.5%), editorials (2.1% vs 5.9%), and letters (0.8% vs 2.6%) (all P < .001). On a log(10)-transformed scale, fewer citations were found in Google Scholar to articles with declared industry funding (nonstandardized regression coefficient, -0.09; 95% confidence interval [CI], -0.15 to -0.03), reporting a study of a drug or medical device (-0.05; 95% CI, -0.11 to 0.01), or with group authorship (-0.29; 95% CI, -0.35 to -0.23). In multivariable analysis, group authorship was the only characteristic that differed among the databases; Google Scholar had significantly fewer citations to group-authored articles (-0.30; 95% CI, -0.36 to -0.23) compared with Web of Science. Conclusions: Web of Science, Scopus, and Google Scholar produced quantitatively and qualitatively different citation counts for articles published in 3 general medical journals.

  5. Going beyond Google for Faster and Smarter Web Searching

    ERIC Educational Resources Information Center

    Vine, Rita

    2004-01-01

    With more than 4 billion web pages in its database, Google is suitable for many different kinds of searches. When you know what you are looking for, Google can be a pretty good first choice, as long as you want to search a word pattern that can be expected to appear on the pages you seek. The problem starts when you don't know exactly what you're…

  6. The quality of patient-orientated Internet information on oral lichen planus: a pilot study.

    PubMed

    López-Jornet, Pía; Camacho-Alonso, Fabio

    2010-10-01

    This study examines the accessibility and quality of Web pages related to oral lichen planus. Sites were identified using two search engines (Google and Yahoo!) and the search terms 'oral lichen planus' and 'oral lesion lichenoid'. The first 100 sites in each search were visited and classified. The web sites were evaluated for content quality using the validated DISCERN rating instrument, the JAMA benchmarks, and the 'Health on the Net' (HON) seal. A total of 109,000 sites were recorded in Google using the search terms and 520,000 in Yahoo! A total of 19 Web pages considered relevant were examined on Google and 20 on Yahoo! As regards the JAMA benchmarks, only two pages satisfied the four criteria in Google (10%), and only three (15%) in Yahoo! As regards DISCERN, the overall quality of web site information was poor, with no site reaching the maximum score. In Google, 78.94% of sites had important deficiencies, as did 50% in Yahoo!, the difference between the two search engines being statistically significant (P = 0.031). Only five pages (17.2%) on Google and eight (40%) on Yahoo! showed the HON code. Based on our review, doctors must assume primary responsibility for educating and counselling their patients. © 2010 Blackwell Publishing Ltd.

  7. Google it: obtaining information about local STD/HIV testing services online.

    PubMed

    Habel, Melissa A; Hood, Julia; Desai, Sheila; Kachur, Rachel; Buhi, Eric R; Liddon, Nicole

    2011-04-01

    Although the Internet is one of the most commonly accessed resources for health information, finding information on local sexual health services, such as sexually transmitted disease (STD) testing, can be challenging. Recognizing that most quests for online health information begin with search engines, the purpose of this exploratory study was to examine the extent to which online information about local STD/HIV testing services can be found using Google. Queries on STD and HIV testing services were executed in Google for 6 geographically unique locations across the United States. The first 3 websites that resulted from each query were coded for the following characteristics: (1) relevancy to the search topic, (2) domain and purpose, (3) rank in Google results, and (4) content. Websites hosted at .com (57.3%), .org (25.7%), and .gov (10.5%) domains were retrieved most frequently. Roughly half of all websites (n = 376) provided information relevant to the query, and about three-quarters (77.0%) of all queries yielded at least 1 relevant website within the first 3 results. Searches for larger cities were more likely to yield relevant results compared with smaller cities (odds ratio [OR] = 10.0, 95% confidence interval [CI] = 5.6, 17.9). Compared with .com domains, .gov (OR = 2.9, 95% CI = 1.4, 5.6) and .org domains (OR = 2.9, 95% CI = 1.7, 4.8) were more likely to provide information on where to get tested. Ease of online access to information about sexual health services varies by search topic and locale. Sexual health service providers must optimize their website placement so as to reach a greater proportion of the sexually active population who use web search engines.
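
    The study coded the first three Google results per query by hand. As a rough sketch of how a similar harvest could be scripted today, the following uses Google's Custom Search JSON API; the API key and search engine ID are placeholders, and results from this API are not identical to interactive Google searches.

        import requests

        API_KEY, CX = "<your-api-key>", "<your-cse-id>"    # placeholders

        def top3(query):
            """Return (title, link) for the first three results of a query."""
            r = requests.get(
                "https://www.googleapis.com/customsearch/v1",
                params={"key": API_KEY, "cx": CX, "q": query, "num": 3},
                timeout=30,
            )
            r.raise_for_status()
            return [(item["title"], item["link"]) for item in r.json().get("items", [])]

        for title, link in top3("STD testing Houston TX"):
            # Coding step from the paper: classify the domain (.com/.org/.gov)
            # and judge the relevance of each of the first three results.
            print(link.split("/")[2], "-", title)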

  8. Citations and the h index of soil researchers and journals in the Web of Science, Scopus, and Google Scholar

    PubMed Central

    Hartemink, Alfred E.; McBratney, Alex; Jang, Ho-Jun

    2013-01-01

    Citation metrics and h indices differ using different bibliometric databases. We compiled the number of publications, number of citations, h index and year since the first publication from 340 soil researchers from all over the world. On average, Google Scholar has the highest h index, number of publications and citations per researcher, and the Web of Science the lowest. The number of papers in Google Scholar is on average 2.3 times higher and the number of citations is 1.9 times higher compared to the data in the Web of Science. Scopus metrics are slightly higher than those of the Web of Science. The h index in Google Scholar is on average 1.4 times larger than in Web of Science, and the h index in Scopus is on average 1.1 times larger than in Web of Science. Over time, the metrics increase in all three databases but fastest in Google Scholar. The h index of an individual soil scientist is about 0.7 times the number of years since his/her first publication. There is a large difference between the number of citations, number of publications and the h index using the three databases. From this analysis it can be concluded that the choice of the database affects widely-used citation and evaluation metrics but that bibliometric transfer functions exist to relate the metrics from these three databases. We also investigated the relationship between a journal’s impact factor and Google Scholar’s h5-index. The h5-index is a better measure of a journal’s citations than the 2- or 5-year window impact factor. PMID:24167778
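
    For reference, the h index compared above is the largest h such that a researcher has h papers each cited at least h times. A minimal Python sketch with invented citation counts, applying the study's Google Scholar ≈ 1.4 × Web of Science ratio as an illustrative transfer function:

        def h_index(citations):
            """Largest h such that at least h papers have at least h citations."""
            h = 0
            for rank, c in enumerate(sorted(citations, reverse=True), start=1):
                if c >= rank:
                    h = rank
                else:
                    break
            return h

        wos_counts = [45, 33, 30, 22, 18, 12, 9, 6, 4, 1]         # invented WoS counts
        h_wos = h_index(wos_counts)
        print("Web of Science h:", h_wos)                         # -> 7
        print("predicted Google Scholar h:", round(1.4 * h_wos))  # -> 10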

  9. Citations and the h index of soil researchers and journals in the Web of Science, Scopus, and Google Scholar.

    PubMed

    Minasny, Budiman; Hartemink, Alfred E; McBratney, Alex; Jang, Ho-Jun

    2013-01-01

    Citation metrics and h indices differ using different bibliometric databases. We compiled the number of publications, number of citations, h index and year since the first publication from 340 soil researchers from all over the world. On average, Google Scholar has the highest h index, number of publications and citations per researcher, and the Web of Science the lowest. The number of papers in Google Scholar is on average 2.3 times higher and the number of citations is 1.9 times higher compared to the data in the Web of Science. Scopus metrics are slightly higher than those of the Web of Science. The h index in Google Scholar is on average 1.4 times larger than in Web of Science, and the h index in Scopus is on average 1.1 times larger than in Web of Science. Over time, the metrics increase in all three databases but fastest in Google Scholar. The h index of an individual soil scientist is about 0.7 times the number of years since his/her first publication. There is a large difference between the number of citations, number of publications and the h index using the three databases. From this analysis it can be concluded that the choice of the database affects widely-used citation and evaluation metrics but that bibliometric transfer functions exist to relate the metrics from these three databases. We also investigated the relationship between a journal's impact factor and Google Scholar's h5-index. The h5-index is a better measure of a journal's citations than the 2- or 5-year window impact factor.

  10. International use of an academic nephrology World Wide Web site: from medical information resource to business tool.

    PubMed

    Abbott, Kevin C; Oliver, David K; Boal, Thomas R; Gadiyak, Grigorii; Boocks, Carl; Yuan, Christina M; Welch, Paul G; Poropatich, Ronald K

    2002-04-01

    Studies of the use of the World Wide Web to obtain medical knowledge have largely focused on patients. In particular, neither the international use of academic nephrology World Wide Web sites (websites) as primary information sources nor the use of search engines (and search strategies) to obtain medical information have been described. Visits ("hits") to the Walter Reed Army Medical Center (WRAMC) Nephrology Service website from April 30, 2000, to March 14, 2001, were analyzed for the location of originating source using Webtrends, and search engines (Google, Lycos, etc.) were analyzed manually for search strategies used. From April 30, 2000 to March 14, 2001, the WRAMC Nephrology Service website received 1,007,103 hits and 12,175 visits. These visits were from 33 different countries, and the most frequent regions were Western Europe, Asia, Australia, the Middle East, Pacific Islands, and South America. The most frequent organization using the site was the military Internet system, followed by America Online and automated search programs of online search engines, most commonly Google. The online lecture series was the most frequently visited section of the website. Search strategies used in search engines were extremely technical. The use of "robots" by standard Internet search engines to locate websites, which may be blocked by mandatory registration, has allowed users worldwide to access the WRAMC Nephrology Service website to answer very technical questions. This suggests that it is being used as an alternative to other primary sources of medical information and that the use of mandatory registration may hinder users from finding valuable sites. With current Internet technology, even a single service can become a worldwide information resource without sacrificing its primary customers.

  11. Comparison of PubMed, Scopus, Web of Science, and Google Scholar: strengths and weaknesses.

    PubMed

    Falagas, Matthew E; Pitsouni, Eleni I; Malietzis, George A; Pappas, Georgios

    2008-02-01

    The evolution of the electronic age has led to the development of numerous medical databases on the World Wide Web, offering search facilities on a particular subject and the ability to perform citation analysis. We compared the content coverage and practical utility of PubMed, Scopus, Web of Science, and Google Scholar. The official Web pages of the databases were used to extract information on the range of journals covered, search facilities and restrictions, and update frequency. We used the example of a keyword search to evaluate the usefulness of these databases in biomedical information retrieval and a specific published article to evaluate their utility in performing citation analysis. All databases were practical in use and offered numerous search facilities. PubMed and Google Scholar are accessed for free. The keyword search with PubMed offers optimal update frequency and includes online early articles; other databases can rate articles by number of citations, as an index of importance. For citation analysis, Scopus offers about 20% more coverage than Web of Science, whereas Google Scholar offers results of inconsistent accuracy. PubMed remains an optimal tool in biomedical electronic research. Scopus covers a wider journal range, of help both in keyword searching and citation analysis, but it is currently limited to recent articles (published after 1995) compared with Web of Science. Google Scholar, as for the Web in general, can help in the retrieval of even the most obscure information but its use is marred by inadequate, less often updated, citation information.

  12. A Web Portal-Based Time-Aware KML Animation Tool for Exploring Spatiotemporal Dynamics of Hydrological Events

    NASA Astrophysics Data System (ADS)

    Bao, X.; Cai, X.; Liu, Y.

    2009-12-01

    Understanding spatiotemporal dynamics of hydrological events such as storms and droughts is highly valuable for decision making on disaster mitigation and recovery. Virtual Globe-based technologies such as Google Earth and Open Geospatial Consortium KML standards show great promise for collaborative exploration of such events using visual analytical approaches. However, there are currently two barriers to wider usage of such approaches. First, there is no easy way to use open source tools to convert legacy or existing data formats such as shapefiles, GeoTIFF, or web services-based data sources to KML and to produce time-aware KML files. Second, an integrated web portal-based time-aware animation tool is currently not available; users usually share their files in the portal but have no means to visually explore them without leaving the portal environment they are familiar with. We have developed a web portal-based time-aware KML animation tool for viewing extreme hydrologic events. The tool is based on the Google Earth JavaScript API and the Java Portlet standard 2.0 (JSR-286), and it is currently deployable in Liferay, one of the most popular open source portal frameworks. We have also developed an open source toolkit, kml-soc-ncsa (http://code.google.com/p/kml-soc-ncsa/), to facilitate the conversion of multiple formats into KML and the creation of time-aware KML files. We illustrate our tool with some example cases, in which drought and storm events can be explored in both their time and space dimensions in this web-based KML animation portlet. The tool provides an easy-to-use web browser-based portal environment for multiple users to collaboratively share and explore their time-aware KML files and improve their understanding of the spatiotemporal dynamics of hydrological events.
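
    The time-aware KML the portlet animates relies on the standard KML TimeSpan element, which virtual globes map onto a time slider. A minimal Python sketch that generates such a file (event names, coordinates, and timestamps are invented):

        def timespan_placemark(name, lon, lat, begin, end):
            # Each Placemark carries a TimeSpan so the Google Earth time
            # slider shows it only within [begin, end].
            return f"""  <Placemark>
            <name>{name}</name>
            <TimeSpan><begin>{begin}</begin><end>{end}</end></TimeSpan>
            <Point><coordinates>{lon},{lat},0</coordinates></Point>
          </Placemark>"""

        events = [
            ("storm cell A", -88.2, 40.1, "2008-06-01T00:00:00Z", "2008-06-01T06:00:00Z"),
            ("storm cell B", -88.0, 40.3, "2008-06-01T06:00:00Z", "2008-06-01T12:00:00Z"),
        ]
        body = "\n".join(timespan_placemark(*e) for e in events)
        kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
               '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
               f"{body}\n</Document>\n</kml>")
        print(kml)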

  13. 78 FR 2398 - Motorola Mobility LLC and Google Inc.; Analysis of Proposed Consent Order to Aid Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-11

    ... responsible for making sure that your comment does not include any sensitive health information, like medical records or other individually identifiable health information. In addition, do not include any ``[t]rade... overnight service. Visit the Commission Web site at http://www.ftc.gov to read this Notice and the news...

  14. [An evaluation of the quality of health web pages using a validated questionnaire].

    PubMed

    Conesa Fuentes, Maria del Carmen; Aguinaga Ontoso, Enrique; Hernández Morante, Juan José

    2011-01-01

    The objective of the present study was to evaluate the quality of general health information on Spanish-language web pages and on the official web pages of the Health Services of the different Autonomous Regions. It is a cross-sectional study. We used a previously validated questionnaire to study the present state of health information on the Internet from a lay-user point of view. By means of PageRank (Google®), we obtained a group of 65 health web pages. We applied some exclusion criteria and finally obtained a total of 36 webs. We also analyzed the official web pages of the different Health Services in Spain (19 webs), making a total of 54 health web pages. In the light of our data, we observed that the quality of the general health web pages was generally rather low, especially regarding information quality. Not one page reached the maximum score (19 points). The mean score of the web pages was 9.8±2.8. In conclusion, to avoid the problems arising from this lack of quality, health professionals should design advertising campaigns and other media to teach the lay-user how to evaluate information quality. Copyright © 2009 Elsevier España, S.L. All rights reserved.

  15. NaviCell Web Service for network-based data visualization.

    PubMed

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P A; Barillot, Emmanuel; Zinovyev, Andrei

    2015-07-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of 'omics' data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
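
    The server mode mentioned above is driven by plain HTTP calls. The sketch below illustrates only that RESTful pattern with a generic Python client; the server URL and command vocabulary are placeholders and do not reproduce the actual NaviCell API, which ships its own Python and R bindings.

        import requests

        SERVER = "https://navicell.example.org/map/api"    # hypothetical server-mode URL

        session = requests.Session()
        table = "gene\tvalue\nTP53\t2.1\nMYC\t-1.3\n"      # toy expression table

        # Hypothetical two-step dialogue: upload a data table, then ask the
        # server to render it as map staining on the pathway map.
        session.post(SERVER, data={"cmd": "import_datatable",
                                   "name": "transcriptome", "data": table}, timeout=30)
        session.post(SERVER, data={"cmd": "display", "mode": "map_staining",
                                   "datatable": "transcriptome"}, timeout=30)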

  16. NaviCell Web Service for network-based data visualization

    PubMed Central

    Bonnet, Eric; Viara, Eric; Kuperstein, Inna; Calzone, Laurence; Cohen, David P. A.; Barillot, Emmanuel; Zinovyev, Andrei

    2015-01-01

    Data visualization is an essential element of biological research, required for obtaining insights and formulating new hypotheses on mechanisms of health and disease. NaviCell Web Service is a tool for network-based visualization of ‘omics’ data which implements several data visual representation methods and utilities for combining them together. NaviCell Web Service uses Google Maps and semantic zooming to browse large biological network maps, represented in various formats, together with different types of the molecular data mapped on top of them. For achieving this, the tool provides standard heatmaps, barplots and glyphs as well as the novel map staining technique for grasping large-scale trends in numerical values (such as whole transcriptome) projected onto a pathway map. The web service provides a server mode, which allows automating visualization tasks and retrieving data from maps via RESTful (standard HTTP) calls. Bindings to different programming languages are provided (Python and R). We illustrate the purpose of the tool with several case studies using pathway maps created by different research groups, in which data visualization provides new insights into molecular mechanisms involved in systemic diseases such as cancer and neurodegenerative diseases. PMID:25958393

  17. Improving Land Cover Mapping: a Mobile Application Based on ESA Sentinel 2 Imagery

    NASA Astrophysics Data System (ADS)

    Melis, M. T.; Dessì, F.; Loddo, P.; La Mantia, C.; Da Pelo, S.; Deflorio, A. M.; Ghiglieri, G.; Hailu, B. T.; Kalegele, K.; Mwasi, B. N.

    2018-04-01

    The increasing availability of satellite data is of real value for the enhancement of environmental knowledge and land management. Possibilities to integrate different sources of geo-data are growing, and methodologies to create thematic databases are becoming very sophisticated. Moreover, access to internet services and, in particular, to web mapping services is well developed and widespread among both expert users and citizens. Web map services such as Google Maps or OpenStreetMap give access to up-to-date optical imagery and topographic maps, but information on land cover/use is still not provided. Therefore, there are many shortcomings in the general, non-specialized use of and access to such maps. This issue is particularly felt where digital (web) maps could form the basis for land use management, as they are more economical and accessible than paper maps. These conditions are well known in many African countries where, while internet access is becoming open to all, the local map agencies and their products are not widespread.

  18. UNAVCO Software and Services for Visualization and Exploration of Geoscience Data

    NASA Astrophysics Data System (ADS)

    Meertens, C.; Wier, S.

    2007-12-01

    UNAVCO has been involved in visualization of geoscience data to support education and research for several years. An early and ongoing service is the Jules Verne Voyager, a web browser applet built on the GMT that displays any area on Earth, with many data set choices, including maps, satellite images, topography, geoid heights, sea-floor ages, strain rates, political boundaries, rivers and lakes, earthquake and volcano locations, focal mechanisms, stress axes, and observed and modeled plate motion and deformation velocity vectors from geodetic measurements around the world. As part of the GEON project, UNAVCO has developed the GEON IDV, a research-level, 4D (earth location, depth and/or altitude, and time), Java application for interactive display and analysis of geoscience data. The GEON IDV is designed to meet the challenge of investigating complex, multi-variate, time-varying, three-dimensional geoscience data anywhere on earth. The GEON IDV supports simultaneous displays of data sets from differing sources, with complete control over colors, time animation, map projection, map area, point of view, and vertical scale. The GEON IDV displays gridded and point data, images, GIS shape files, and several other types of data. The GEON IDV has symbols and displays for GPS velocity vectors, seismic tomography, earthquake focal mechanisms, earthquake locations with magnitude or depth, seismic ray paths in 3D, seismic anisotropy, convection model visualization, earth strain axes and strain field imagery, and high-resolution 3D topographic relief maps. Multiple data sources and display types may appear in one view. As an example of GEON IDV utility, it can display hypocenters under a volcano, a surface geology map of the volcano draped over 3D topographic relief, town locations and political boundaries, and real-time 3D weather radar clouds of volcanic ash in the atmosphere, with time animation. The GEON IDV can drive a GeoWall or other 3D stereo system. IDV output includes imagery, movies, and KML files for Google Earth use of IDV static images, where Google Earth can handle the display. The IDV can be scripted to create display images on user request or automatically on data arrival, offering the use of the IDV as a back end to support a data web site. We plan to extend the power of the IDV by accepting new data types and data services, such as GeoSciML. An active program of online and video training in GEON IDV use is planned. UNAVCO will support users who need assistance converting their data to the standard formats used by the GEON IDV. The UNAVCO Facility provides web-accessible support for Google Earth and Google Maps display of any of more than 9500 GPS stations and survey points, including metadata for each installation. UNAVCO provides corresponding Open Geospatial Consortium (OGC) web services with the same data. UNAVCO's goal is to facilitate data access, interoperability, and efficient searches, exploration, and use of data by promoting web services, standards for GEON IDV data formats and metadata, and software able to simultaneously read and display multiple data sources, formats, and map locations or projections. Retention and propagation of semantics and metadata with observational and experimental values is essential for interoperability and understanding diverse data sources.

  19. PhyloGeoViz: a web-based program that visualizes genetic data on maps.

    PubMed

    Tsai, Yi-Hsin E

    2011-05-01

    The first step of many population genetic studies is the simple visualization of allele frequencies on a landscape. This basic data exploration can be challenging without proprietary software, and the manual plotting of data is cumbersome and unfeasible at large sample sizes. I present an open source, web-based program that plots any kind of frequency or count data as pie charts in Google Maps (Google Inc., Mountain View, CA). Pie polygons are then exportable to Google Earth (Google Inc.), a free Geographic Information Systems platform. Import of genetic data into Google Earth allows phylogeographers access to a wealth of spatial information layers integral to forming hypotheses and understanding patterns in the data. © 2010 Blackwell Publishing Ltd.
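
    The central geometric step in this kind of tool is turning cumulative frequencies into pie-wedge polygons around a sampling site. Here is a small self-contained sketch of that step (not PhyloGeoViz's actual code), treating longitude/latitude as locally planar, which is adequate for the small radii used in map overlays:

        import math

        def pie_wedge(center_lon, center_lat, radius_deg, start_frac, end_frac, steps=16):
            """Approximate one pie slice as a closed polygon in lon/lat degrees.

            start_frac and end_frac are cumulative frequencies in [0, 1].
            """
            pts = [(center_lon, center_lat)]
            for i in range(steps + 1):
                frac = start_frac + (end_frac - start_frac) * i / steps
                theta = 2 * math.pi * frac          # fraction of the full circle
                pts.append((center_lon + radius_deg * math.sin(theta),
                            center_lat + radius_deg * math.cos(theta)))
            pts.append((center_lon, center_lat))    # close the ring
            return pts

        # One site with two alleles at 70% / 30%:
        cum = 0.0
        for freq in [0.7, 0.3]:
            print(pie_wedge(-84.5, 39.1, 0.2, cum, cum + freq)[:3], "...")
            cum += freq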

  20. The Adversarial Route Analysis Tool: A Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casson, William H. Jr.

    2012-08-02

    The Adversarial Route Analysis Tool is a type of Google maps for adversaries. It's a web-based Geospatial application similar to Google Maps. It helps the U.S. government plan operations that predict where an adversary might be. It's easily accessible and maintainble and it's simple to use without much training.

  1. The Privilege of Ranking: Google Plays Ball.

    ERIC Educational Resources Information Center

    Wiggins, Richard

    2003-01-01

    Discussion of ranking systems used in various settings, including college football and academic admissions, focuses on the Google search engine. Explains the PageRank mathematical formula that scores Web pages by connecting the number of links; limitations, including authenticity and accuracy of ranked Web pages; relevancy; adjusting algorithms;…

  2. Accelerating North American rangeland conservation with earth observation data and user driven web applications.

    NASA Astrophysics Data System (ADS)

    Allred, B. W.; Naugle, D.; Donnelly, P.; Tack, J.; Jones, M. O.

    2016-12-01

    In 2010, the USDA Natural Resources Conservation Service (NRCS) launched the Sage Grouse Initiative (SGI) to voluntarily reduce threats facing sage-grouse and rangelands on private lands. Over the past five years, SGI has matured into a primary catalyst for rangeland and wildlife conservation across the North American west, focusing on the shared vision of wildlife conservation through sustainable working landscapes and providing win-win solutions for producers, sage grouse, and 350 other sagebrush obligate species. SGI and its partners have invested a total of $750 million into rangeland and wildlife conservation. Moving forward, SGI continues to focus on rangeland conservation. Partnering with Google Earth Engine, SGI has developed outcome monitoring and conservation planning tools at continental scales. The SGI science team is currently developing assessment and monitoring algorithms of key conservation indicators. The SGI web application utilizes Google Earth Engine for user-defined analysis and planning, putting the appropriate information directly into the hands of managers and conservationists.
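
    As an illustration of the kind of user-driven analysis Google Earth Engine supports, the sketch below computes mean NDVI over a rectangle with the Earth Engine Python API. The region and image collection are illustrative choices, not SGI's actual workflow, and running it requires an authenticated Earth Engine account.

        import ee

        ee.Initialize()  # assumes prior ee.Authenticate() with a registered account

        # Illustrative area of interest (not an SGI planning unit).
        region = ee.Geometry.Rectangle([-108.5, 43.0, -107.5, 44.0])

        # Landsat 8 Collection 2 surface reflectance, summer 2016.
        col = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
               .filterBounds(region)
               .filterDate("2016-06-01", "2016-09-01"))

        def ndvi(img):
            # Apply the Collection 2 scale/offset before the band ratio, then
            # compute NDVI from near-infrared (SR_B5) and red (SR_B4).
            sr = img.select(["SR_B5", "SR_B4"]).multiply(0.0000275).add(-0.2)
            return sr.normalizedDifference(["SR_B5", "SR_B4"]).rename("NDVI")

        mean_ndvi = col.map(ndvi).mean().reduceRegion(
            reducer=ee.Reducer.mean(), geometry=region, scale=30, maxPixels=1e9)
        print(mean_ndvi.getInfo())  # computation runs server-side; only the result returns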

  3. About Student's Media Use for Learning in Tertiary Education Influence Factors and Structures of Usage Behavior

    ERIC Educational Resources Information Center

    Grosch, Michael

    2014-01-01

    The rise of Web 2.0 led to dramatic changes in the media usage behavior of students in tertiary education. Services such as Google and Facebook are widely accepted among students, not only for leisure but also for learning. A representative survey was made at the Karlsruhe Institute of Technology (KIT). About 1,400 students were asked 150 questions to…

  4. Through the Google Goggles: Sociopolitical Bias in Search Engine Design

    NASA Astrophysics Data System (ADS)

    Diaz, A.

    Search engines like Google are essential to navigating the Web's endless supply of news, political information, and citizen discourse. The mechanisms and conditions under which search results are selected should therefore be of considerable interest to media scholars, political theorists, and citizens alike. In this chapter, I adopt a "deliberative" ideal for search engines and examine whether Google exhibits the "same old" media biases of mainstreaming, hypercommercialism, and industry consolidation. In the end, serious objections to Google are raised: Google may favor popularity over richness; it provides advertising that competes directly with "editorial" content; it so overwhelmingly dominates the industry that users seldom get a second opinion, and this is unlikely to change. Ultimately, however, the results of this analysis may speak less about Google than about contradictions in the deliberative ideal and the so-called "inherently democratic" nature of the Web.

  5. Arctic Glass: Innovative Consumer Technology in Support of Arctic Research

    NASA Astrophysics Data System (ADS)

    Ruthkoski, T.

    2015-12-01

    The advancement of cyberinfrastructure on the North Slope of Alaska is drastically limited by location-specific conditions, including unique geophysical features, remoteness, and a harsh climate. The associated cost of maintaining this unique cyberinfrastructure also becomes a limiting factor. As a result, field experiments conducted in this region have historically been at a technological disadvantage. The Arctic Glass project explored a variety of scenarios in which innovative consumer-grade technology was leveraged as a lightweight, rapidly deployable, sustainable alternative to traditional large-scale Arctic cyberinfrastructure installations. Google Glass, cloud computing services, Internet of Things (IoT) microcontrollers, miniature LIDAR, CO2 sensors designed for HVAC systems, and portable network kits are several of the components field-tested at the Toolik Field Station as part of this project. Region-specific software was also developed, including a multi-featured, voice-controlled Google Glass application named "Arctic Glass". Additionally, real-time sensor monitoring and remote control capability were evaluated through the deployment of a small cluster of microcontroller devices. Network robustness was analyzed as the devices delivered streams of abiotic data to a web-based dashboard monitoring service in near real time. The same data were also uploaded synchronously by the devices to Amazon Web Services. A detailed overview of solutions deployed during the 2015 field season, results from experiments utilizing consumer sensors, and potential roles consumer technology could play in support of Arctic science will be discussed.

  6. Use of Openly Available Satellite Images for Remote Sensing Education

    NASA Astrophysics Data System (ADS)

    Wang, C.-K.

    2011-09-01

    With the advent of Google Earth, Google Maps, and Microsoft Bing Maps, high-resolution satellite imagery is becoming more easily accessible than ever. College students may already have extensive experience with high-resolution satellite imagery from using these software packages and web services prior to any formal remote sensing education. It is obvious that remote sensing education should be adjusted to the fact that the audience is already a consumer of remote sensing products (through the use of the above-mentioned services). This paper reports the use of openly available satellite imagery in an introductory-level remote sensing course in the Department of Geomatics of National Cheng Kung University as a term project. The experience from the fall of 2009 and 2010 shows that this term project effectively aroused the students' enthusiasm toward remote sensing.

  7. Finding research information on the web: how to make the most of Google and other free search tools.

    PubMed

    Blakeman, Karen

    2013-01-01

    The Internet and the World Wide Web have had a major impact on the accessibility of research information. The move towards open access and the development of institutional repositories have resulted in increasing amounts of information being made available free of charge. Many of these resources are not included in conventional subscription databases, and Google is not always the best way to ensure that one is picking up all relevant material on a topic. This article looks at how Google's search engine works, how to use Google more effectively for identifying research information, and alternatives to Google, and reviews some of the specialist tools that have evolved to cope with the diverse forms of information that now exist in electronic form.

  8. Design and Implementation of Surrounding Transaction Plotting and Management System Based on Google Map API

    NASA Astrophysics Data System (ADS)

    Cao, Y. B.; Hua, Y. X.; Zhao, J. X.; Guo, S. M.

    2013-11-01

    With China's rapid economic development and growing comprehensive national strength, border work has become a long-term and important task in China's diplomatic work. How to implement rapid plotting, real-time sharing, and mapping of surrounding affairs has taken on great significance for government policy makers and diplomatic staff. At present, however, existing boundary information systems have several problems: geospatial data updates involve a heavy workload, plotting tools are seriously lacking, and geographic events are difficult to share. These problems have seriously hampered the smooth progress of border tasks. The development of geographic information system technology, especially Web GIS, offers the possibility of solving the above problems. This paper adopts a four-layer B/S architecture which, with the support of the Google Maps service, uses the free API offered by Google Maps and its openness, ease of use, sharing features, and high-resolution imagery to design and implement a surrounding transaction plotting and management system based on the web development technologies of ASP.NET, C#, and Ajax. The system can provide decision support for government policy makers as well as real-time plotting and sharing of surrounding information for diplomatic staff. Practice has proved that the system has good usability and strong real-time performance.

  9. Quality of Public Hospitals Websites: A Cross-Sectional Analytical Study in Iran.

    PubMed

    Salarvand, Shahin; Samadbeik, Mahnaz; Tarrahi, Mohammad Javad; Salarvand, Hamed

    2016-04-01

    Nowadays, hospitals are turning increasingly towards the Internet and developing their own web presence. Hospital websites can operate as effective web resources of information and as interactive communication mediums that enhance hospital services to the public. Therefore, the aim of this study was to assess the quality of the websites of Tehran's public hospitals. This cross-sectional analysis involved all public hospitals in Iran's capital city, Tehran, with a working website or subsites between April and June 2014 (N=59). The websites were evaluated using three validated instruments: a localized checklist, Google PageRank, and the Alexa traffic ranking. The checklist consisted of 112 items divided into five sections: technical characteristics, hospital information and facilities, medical services, interactive online services, and external activities. Data were analyzed using descriptive and analytical statistics. The mean website evaluation score was 45.7 out of 224 for the selected public hospitals. All the studied websites fell into the weak category based on their quality scores. There was no statistically significant association between the website evaluation score and Google PageRank (P=0.092), Alexa global traffic rank, or Alexa traffic rank in Iran (P>0.05). The hospital websites had lower quality scores in the interactive online services and external activities criteria compared with the other criteria. Given the low quality of the studied websites and the importance of hospital portals in providing information and services on the Internet, the authorities should plan carefully to appreciably improve the quality of hospital websites.

  10. Quality of Public Hospitals Websites: A Cross-Sectional Analytical Study in Iran

    PubMed Central

    Salarvand, Shahin; Samadbeik, Mahnaz; Tarrahi, Mohammad Javad; Salarvand, Hamed

    2016-01-01

    Introduction: Nowadays, hospitals are turning increasingly towards the Internet and developing their own web presence. Hospital websites can operate as effective web resources of information and as interactive communication mediums that enhance hospital services to the public. Aim: The aim of this study was to assess the quality of the websites of Tehran’s public hospitals. Material and methods: This cross-sectional analysis involved all public hospitals in Iran’s capital city, Tehran, with a working website or subsites between April and June 2014 (N=59). The websites were evaluated using three validated instruments: a localized checklist, Google PageRank, and the Alexa traffic ranking. The checklist consisted of 112 items divided into five sections: technical characteristics, hospital information and facilities, medical services, interactive online services, and external activities. Data were analyzed using descriptive and analytical statistics. Results: The mean website evaluation score was 45.7 out of 224 for the selected public hospitals. All the studied websites fell into the weak category based on their quality scores. There was no statistically significant association between the website evaluation score and Google PageRank (P=0.092), Alexa global traffic rank, or Alexa traffic rank in Iran (P>0.05). The hospital websites had lower quality scores in the interactive online services and external activities criteria compared with the other criteria. Given the low quality of the studied websites and the importance of hospital portals in providing information and services on the Internet, the authorities should plan carefully to appreciably improve the quality of hospital websites. PMID:27147806

  11. Confessions of a Librarian or: How I Learned to Stop Worrying and Love Google

    ERIC Educational Resources Information Center

    Gunnels, Claire B.; Sisson, Amy

    2009-01-01

    Have you ever stopped to think about life before Google? We will make the argument that Google is the first manifestation of Web 2.0, of the power and promise of social networking and the ubiquitous wiki. We will discuss the positive influence of Google and how Google and other social networking tools afford librarians leading-edge technologies…

  12. Breast cancer on the world wide web: cross sectional survey of quality of information and popularity of websites

    PubMed Central

    Meric, Funda; Bernstam, Elmer V; Mirza, Nadeem Q; Hunt, Kelly K; Ames, Frederick C; Ross, Merrick I; Kuerer, Henry M; Pollock, Raphael E; Musen, Mark A; Singletary, S Eva

    2002-01-01

    Objectives: To determine the characteristics of popular breast cancer related websites and whether more popular sites are of higher quality. Design: The search engine Google was used to generate a list of websites about breast cancer. Google ranks search results by measures of link popularity, that is, the number of links to a site from other sites. The top 200 sites returned in response to the query "breast cancer" were divided into "more popular" and "less popular" subgroups by three different measures of link popularity: Google rank and number of links reported independently by Google and by AltaVista (another search engine). Main outcome measures: Type and quality of content. Results: More popular sites according to Google rank were more likely than less popular ones to contain information on ongoing clinical trials (27% v 12%, P=0.01), results of trials (12% v 3%, P=0.02), and opportunities for psychosocial adjustment (48% v 23%, P<0.01). These characteristics were also associated with a higher number of links as reported by Google and AltaVista. More popular sites by number of linking sites were also more likely to provide updates on other breast cancer research, information on legislation and advocacy, and a message board service. Measures of quality such as display of authorship, attribution or references, currency of information, and disclosure did not differ between groups. Conclusions: Popularity of websites is associated with type rather than quality of content. Sites that include content correlated with popularity may best meet the public's desire for information about breast cancer. What is already known on this topic: Patients are using the world wide web to search for health information; breast cancer is one of the most popular search topics; characteristics of popular websites may reflect the information needs of patients. What this study adds: Type rather than quality of content correlates with popularity of websites; measures of quality correlate with accuracy of medical information. PMID:11884322

  13. MaRGEE: Move and Rotate Google Earth Elements

    NASA Astrophysics Data System (ADS)

    Dordevic, Mladen M.; Whitmeyer, Steven J.

    2015-12-01

    Google Earth is recognized as a highly effective visualization tool for geospatial information. However, there remain serious limitations that have hindered its acceptance as a tool for research and education in the geosciences. One significant limitation is the inability to translate or rotate geometrical elements on the Google Earth virtual globe. Here we present a new JavaScript web application to "Move and Rotate Google Earth Elements" (MaRGEE). MaRGEE includes tools to simplify, translate, and rotate elements, add intermediate steps to a transposition, and batch process multiple transpositions. The transposition algorithm uses spherical geometry calculations, such as the haversine formula, to accurately reposition groups of points, paths, and polygons on the Google Earth globe without distortion. Due to the imminent deprecation of the Google Earth API and browser plugin, MaRGEE uses a Google Maps interface to facilitate and illustrate the transpositions. However, the inherent spatial distortions that result from the Google Maps Web Mercator projection are not apparent once the transposed elements are saved as a KML file and opened in Google Earth. Potential applications of the MaRGEE toolkit include tectonic reconstructions, the movements of glaciers or thrust sheets, and time-based animations of other large- and small-scale geologic processes.
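
    For reference, the haversine formula named above computes great-circle distance, which is preserved under the rigid rotations MaRGEE applies, so pairwise distances before and after a transposition should match. A standard Python implementation (not MaRGEE's source):

        import math

        def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
            """Great-circle distance via the haversine formula (Earth radius in km)."""
            phi1, phi2 = math.radians(lat1), math.radians(lat2)
            dphi = math.radians(lat2 - lat1)
            dlmb = math.radians(lon2 - lon1)
            a = (math.sin(dphi / 2) ** 2
                 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
            return 2 * r * math.asin(math.sqrt(a))

        # One degree of longitude at latitude 38 is roughly 87.7 km:
        print(round(haversine_km(38.0, -78.5, 38.0, -77.5), 1))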

  14. Google Analytics: Single Page Traffic Reports

    EPA Pesticide Factsheets

    These are pages that live outside of Google Analytics (GA) but allow you to view GA data for any individual page on either the public EPA web or EPA intranet. You do need to log in to Google Analytics to view them.

  15. 100 Colleges Sign Up with Google to Speed Access to Library Resources

    ERIC Educational Resources Information Center

    Young, Jeffrey R.

    2005-01-01

    More than 100 colleges and universities have arranged to give people using the Google Scholar search engine on their campuses more-direct access to library materials. Google Scholar is a free tool that searches scholarly materials on the Web and in academic databases. The new arrangements essentially let Google know which online databases the…

  16. Semantic Web Data Discovery of Earth Science Data at NASA Goddard Earth Sciences Data and Information Services Center (GES DISC)

    NASA Technical Reports Server (NTRS)

    Hegde, Mahabaleshwara; Strub, Richard F.; Lynnes, Christopher S.; Fang, Hongliang; Teng, William

    2008-01-01

    Mirador is a web interface for searching Earth Science data archived at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). Mirador provides keyword-based search and guided navigation for efficient search of and access to Earth Science data. Mirador employs the power of Google's universal search technology for fast metadata keyword searches, augmented by additional capabilities such as event searches (e.g., hurricanes), searches based on a location gazetteer, and data services like format converters and data sub-setters. The objective of guided data navigation is to present users with multiple navigation paths; the basis of guided navigation in Mirador is an ontology based on the Global Change Master Directory (GCMD) Directory Interchange Format (DIF). The current implementation includes the project ontology, covering various instruments and model data. Additional capabilities in the pipeline include Earth Science parameter and applications ontologies.

  17. Internet Presentation of Departments of Pediatric Surgery in Germany and Their Compliance with Recommended Criteria for Promoting Services and Offering Professional Information for Patients.

    PubMed

    Farhat, Naim; Zoeller, Christoph; Petersen, Claus; Ure, Benno

    2016-08-01

    Introduction: The presentation of health institutions on the internet is highly variable with respect to marketing features and medical information. We aimed to investigate the structure and the kind of information provided on the Web sites of all departments of pediatric surgery in Germany. Furthermore, we aimed to identify the degree to which these Web sites comply with internet marketing recommendations for generating business. Method: The Web sites of all pediatric surgery units referred to as departments on the official Web site of the German Society of Pediatric Surgery (GSPS) were assessed. The search engine Google was used by entering the terms "pediatric surgery" and the name of the city. Besides general data, eight content characteristics were evaluated according to published recommendations, focusing on ranking, accessibility, use of social media, multilingual sites, navigation options, selected images, contact details, and medical information. Results: A total of 85 departments of pediatric surgery were included. In Google search results, 44 (52%) ranked number one, and 34 (40%) of the departments' homepages were accessible directly through the homepage link of the GSPS. A link to the department's own digital and/or social media was offered on 11 (13%) homepages. Nine sites were multilingual. The most common navigation bar item was clinical services, on 74 (87%) homepages. Overall, 76 (89%) departments presented their doctors and 17 (20%) presented other staff members; images of doctors appeared on 53 (62%) Web sites, and contact data were accessible from the homepage on 68 (80%). On 25 (29%) Web sites, information on the medical conditions treated was presented; on 17 (20%), details of treatment concepts; and on 4 (5%), the number of patients with specific conditions treated in the department per year. Conclusion: We conclude that many of the investigated online presentations do not comply with recommended criteria for offering professional information to patients and for promoting services. Fewer than one-third of the departments of pediatric surgery in Germany offer information about the medical conditions they treat. Features that may influence the decision of patients and parents, such as ranking, accessibility, use of social media, multilingual sites, navigation options, selected images, and contact information, were lacking to varying degrees on many Web sites. Georg Thieme Verlag KG Stuttgart · New York.

  18. The Library in Your Toolbar: You Can Make It Easy to Search Library Resources from Your Own Browser

    ERIC Educational Resources Information Center

    Webster, Peter

    2007-01-01

    For years, patrons have been able to access library services from home and in the library building, but in the world of Google, Yahoo, YouTube, MySpace, and Facebook, library web sites and catalogs are too often not the first place people go to look for information. The innovative use of toolbars could change this. Toolbars have been popular for…

  19. Utility of Web search query data in testing theoretical assumptions about mephedrone.

    PubMed

    Kapitány-Fövény, Máté; Demetrovics, Zsolt

    2017-05-01

    With growing access to the Internet, people who use drugs and traffickers have started to obtain information about novel psychoactive substances (NPS) via online platforms. This paper aims to analyze whether decreasing Web interest in formerly banned substances (cocaine, heroin, and MDMA) and the legislative status of mephedrone predict Web interest in this NPS. Google Trends was used to measure changes in Web interest in cocaine, heroin, MDMA, and mephedrone. Google search results for mephedrone within the same time frame were analyzed and categorized. Web interest in classic drugs was found to be more persistent. Regarding geographical distribution, the location of Web searches for heroin and cocaine was less centralized. The illicit status of mephedrone was a negative predictor of its Web search query rates. The connection between mephedrone-related Web search rates and the legislative status of this substance was significantly mediated by ecstasy-related Web search queries, the number of documentaries, and forum/blog entries about mephedrone. The results might support the hypothesis that mephedrone's popularity was highly correlated with its legal status and that it functioned as a potential substitute for MDMA. Google Trends was found to be a useful tool for testing theoretical assumptions about NPS. Copyright © 2017 John Wiley & Sons, Ltd.
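
    Trend series of the kind analyzed here can also be pulled programmatically. The sketch below uses pytrends, an unofficial third-party Python client for Google Trends; the library choice is an assumption of this sketch, not a tool the paper reports using.

        from pytrends.request import TrendReq   # third-party: pip install pytrends

        pytrends = TrendReq(hl="en-US", tz=0)
        # Relative weekly Web interest (0-100) in mephedrone vs. a reference drug.
        pytrends.build_payload(["mephedrone", "mdma"], timeframe="2007-01-01 2017-01-01")
        df = pytrends.interest_over_time()
        print(df[["mephedrone", "mdma"]].corr())  # crude look at co-movement of interest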

  20. A Knowledge Portal and Collaboration Environment for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    D'Agnese, F. A.

    2008-12-01

    Earth Knowledge is developing a web-based 'Knowledge Portal and Collaboration Environment' that will serve as the information-technology-based foundation of a modular Internet-based Earth-Systems Monitoring, Analysis, and Management Tool. This 'Knowledge Portal' is essentially a 'mash-up' of web-based and client-based tools and services that support on-line collaboration, community discussion, and broad public dissemination of earth and environmental science information in a wide-area distributed network. In contrast to specialized knowledge-management or geographic-information systems developed for long-term and incremental scientific analysis, this system will exploit familiar software tools using industry standard protocols, formats, and APIs to discover, process, fuse, and visualize existing environmental datasets using Google Earth and Google Maps. An early form of these tools and services is being used by Earth Knowledge to facilitate the investigations and conversations of scientists, resource managers, and citizen-stakeholders addressing water resource sustainability issues in the Great Basin region of the desert southwestern United States. These ongoing projects will serve as use cases for the further development of this information-technology infrastructure. This 'Knowledge Portal' will accelerate the deployment of Earth-system data and information into an operational knowledge management system that may be used by decision-makers concerned with stewardship of water resources in the American Desert Southwest.

  1. Visualization of Client-Side Web Browsing and Email Activity

    DTIC Science & Technology

    2009-06-01

  2. A Geospatial Database that Supports Derivation of Climatological Features of Severe Weather

    NASA Astrophysics Data System (ADS)

    Phillips, M.; Ansari, S.; Del Greco, S.

    2007-12-01

    The Severe Weather Data Inventory (SWDI) at NOAA's National Climatic Data Center (NCDC) provides user access to archives of several datasets critical to the detection and evaluation of severe weather. These datasets include archives of:
    · NEXRAD Level-III point features describing general storm structure, hail, mesocyclone, and tornado signatures
    · the National Weather Service Storm Events Database
    · National Weather Service Local Storm Reports collected from storm spotters
    · National Weather Service Warnings
    · lightning strikes from Vaisala's National Lightning Detection Network (NLDN)
    SWDI archives all of these datasets in a spatial database that allows for convenient searching and subsetting. These data are accessible via the NCDC web site, Web Feature Services (WFS), or automated web services. The results of interactive web page queries may be saved in a variety of formats, including plain text, XML, Google Earth's KMZ, standards-based NetCDF, and Shapefile. NCDC's Storm Risk Assessment Project (SRAP) uses data from the SWDI database to derive gridded climatology products that show the spatial distribution of the frequency of various events. SRAP can also relate SWDI events to other spatial data such as roads, population, watersheds, and other geographic, sociological, or economic data to derive products that are useful in municipal planning, emergency management, the insurance industry, and other areas where there is a need to quantify and qualify how severe weather patterns affect people and property.
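
    Automated access to a WFS such as the one SWDI exposes follows the standard OGC GetFeature request. In the Python sketch below, the endpoint and layer name are placeholders rather than NCDC's actual service, but the query parameters are standard WFS 1.1.0.

        import requests

        WFS = "https://swdi.example.noaa.gov/wfs"     # hypothetical endpoint

        params = {
            "service": "WFS", "version": "1.1.0", "request": "GetFeature",
            "typeName": "swdi:hail",                  # assumed layer name
            "bbox": "-100.0,30.0,-95.0,35.0",         # minx,miny,maxx,maxy
            "outputFormat": "GML2",
        }
        r = requests.get(WFS, params=params, timeout=60)
        r.raise_for_status()
        print(r.text[:500])                           # GML features for the bounding box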

  3. Lecturers’ Understanding on Indexing Databases of SINTA, DOAJ, Google Scholar, SCOPUS, and Web of Science: A Study of Indonesians

    NASA Astrophysics Data System (ADS)

    Saleh Ahmar, Ansari; Kurniasih, Nuning; Irawan, Dasapta Erwin; Utami Sutiksno, Dian; Napitupulu, Darmawan; Ikhsan Setiawan, Muhammad; Simarmata, Janner; Hidayat, Rahmat; Busro; Abdullah, Dahlan; Rahim, Robbi; Abraham, Juneman

    2018-01-01

    The Ministry of Research, Technology and Higher Education of Indonesia has introduced several national and international indexers of scientific works. This policy serves as a guideline for lecturers and researchers in choosing reputable publications. This study aimed to describe the level of understanding of Indonesian lecturers regarding indexing databases, i.e. SINTA, DOAJ, Scopus, Web of Science, and Google Scholar. This research used a descriptive design and survey method. The population in this study comprised Indonesian lecturers and researchers. The primary data were obtained from a questionnaire filled in by 316 lecturers and researchers from 33 provinces in Indonesia, recruited with a convenience sampling technique in October-November 2017. The data analysis was performed using frequency distribution tables, cross tabulation, and descriptive analysis. The results showed that, on average, 66.5% of Indonesian lecturers and researchers were aware of SINTA, DOAJ, Scopus, Web of Science, and Google Scholar. However, 76% of them had never published in journals or proceedings indexed in Scopus.

  4. Analysis of Web Spam for Non-English Content: Toward More Effective Language-Based Classifiers

    PubMed Central

    Alsaleh, Mansour; Alarifi, Abdulrahman

    2016-01-01

    Web spammers aim to obtain higher ranks for their web pages by including spam contents that deceive search engines into listing their pages in search results even when they are not related to the search terms. Search engines continue to develop new web spam detection mechanisms, but spammers also aim to improve their tools to evade detection. In this study, we first explore the effect of the page language on spam detection features and demonstrate how the best set of detection features varies according to the page language. We also study the performance of Google Penguin, a recently developed anti-web-spam technique for Google's search engine. Using spam pages in Arabic as a case study, we show that, unlike for similar English pages, Google's anti-spam techniques are ineffective against a high proportion of Arabic spam pages. We then explore multiple detection features for spam pages to identify an appropriate set of features that yields a high detection accuracy compared with the integrated Google Penguin technique. In order to build and evaluate our classifier, as well as to help researchers conduct consistent measurement studies, we collected and manually labeled a corpus of Arabic web pages, including both benign and spam pages. Furthermore, we developed a browser plug-in that utilizes our classifier to warn users about spam pages after clicking on a URL and by filtering out search engine results. Using Google Penguin as a benchmark, we provide an illustrative example to show that language-based web spam classifiers are more effective for capturing spam contents. PMID:27855179

  5. Analysis of Web Spam for Non-English Content: Toward More Effective Language-Based Classifiers.

    PubMed

    Alsaleh, Mansour; Alarifi, Abdulrahman

    2016-01-01

    Web spammers aim to obtain higher ranks for their web pages by including spam contents that deceive search engines into listing their pages in search results even when they are not related to the search terms. Search engines continue to develop new web spam detection mechanisms, but spammers also aim to improve their tools to evade detection. In this study, we first explore the effect of the page language on spam detection features and demonstrate how the best set of detection features varies according to the page language. We also study the performance of Google Penguin, a recently developed anti-web-spam technique for Google's search engine. Using spam pages in Arabic as a case study, we show that, unlike for similar English pages, Google's anti-spam techniques are ineffective against a high proportion of Arabic spam pages. We then explore multiple detection features for spam pages to identify an appropriate set of features that yields a high detection accuracy compared with the integrated Google Penguin technique. In order to build and evaluate our classifier, as well as to help researchers conduct consistent measurement studies, we collected and manually labeled a corpus of Arabic web pages, including both benign and spam pages. Furthermore, we developed a browser plug-in that utilizes our classifier to warn users about spam pages after clicking on a URL and by filtering out search engine results. Using Google Penguin as a benchmark, we provide an illustrative example to show that language-based web spam classifiers are more effective for capturing spam contents.
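
    To make the idea of feature-based spam detection concrete, here is a toy Python sketch that trains a classifier on two hand-crafted page features. The features and the miniature training data are invented for illustration and are not the authors' feature set.

        # Toy web-spam classifier: two hand-crafted features plus logistic
        # regression. Features and data are invented for illustration.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def features(text):
            words = text.split()
            if not words:
                return [0.0, 0.0]
            unique_ratio = len(set(words)) / len(words)  # low => keyword stuffing
            avg_len = sum(map(len, words)) / len(words)
            return [unique_ratio, avg_len]

        benign = ["weather forecast for the great basin region updated hourly"]
        spam = ["cheap cheap cheap watches watches watches buy buy buy now now"]

        X = np.array([features(t) for t in benign + spam])
        y = np.array([0] * len(benign) + [1] * len(spam))
        clf = LogisticRegression().fit(X, y)

        # Likely classified as spam (label 1) given the repeated keywords.
        print(clf.predict([features("buy buy buy cheap cheap watches now")]))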

  6. How To: Maximize Google

    ERIC Educational Resources Information Center

    Branzburg, Jeffrey

    2004-01-01

    Google has emerged as the leading Web search engine, with recent research from Nielsen NetRatings reporting that about 40 percent of all U.S. households used the tool at least once in January 2004. This brief article discusses how teachers and students can maximize their use of Google.

  7. The BCube Crawler: Web Scale Data and Service Discovery for EarthCube.

    NASA Astrophysics Data System (ADS)

    Lopez, L. A.; Khalsa, S. J. S.; Duerr, R.; Tayachow, A.; Mingo, E.

    2014-12-01

    Web-crawling, a core component of the NSF-funded BCube project, is researching and applying big data technologies to find and characterize different types of web services, catalog interfaces, and data feeds, such as ESIP OpenSearch, OGC W*S, THREDDS, and OAI-PMH, that describe or provide access to scientific datasets. Given the scale of the Internet, which challenges even large search providers such as Google, the BCube plan for discovering these web-accessible services is to subdivide the problem into three smaller, more tractable issues: first, to discover likely sites where relevant data and data services might be found; second, to deeply crawl the discovered sites to find any data and services that might be present; and last, to leverage semantic technologies to characterize the services and data found, and to filter out everything but those relevant to the geosciences. To address the first two challenges BCube uses an adapted version of Apache Nutch (which originated Hadoop), a web-scale crawler, and Amazon's ElasticMapReduce service for flexibility and cost effectiveness. For characterization of the services found, BCube is examining existing web service ontologies for their applicability to our needs and will re-use and/or extend these in order to query for services with specific well-defined characteristics in scientific datasets, such as the use of geospatial namespaces. The original proposal for the crawler won a grant from Amazon's academic program, which allowed us to become operational; we successfully tested the BCube Crawler at web scale, obtaining a corpus sizeable enough to enable work on characterization of the services and data found. There is still plenty of work to be done: performing "smart crawls" by managing the frontier, developing and enhancing our scoring algorithms, and fully implementing the semantic characterization technologies. We describe the current status of the project, our successes, and issues encountered. The final goal of the BCube Crawler project is to provide relevant data services to other projects on the EarthCube stack and third-party partners so they can be brokered and used by a wider scientific community.
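
    The characterization step can be pictured with a small endpoint probe like the Python sketch below, which checks well-known paths for signature strings. The probe paths and response markers are heuristic assumptions for illustration, not BCube's actual detection rules.

        # Probe a host for common geoscience service endpoints by checking
        # well-known paths for signature strings. Paths and markers are
        # heuristic assumptions, not BCube's detection rules.
        import requests

        PROBES = {
            "OGC WMS": ("?service=WMS&request=GetCapabilities", "WMS_Capabilities"),
            "OAI-PMH": ("?verb=Identify", "OAI-PMH"),
            "THREDDS": ("/thredds/catalog.xml", "catalog"),
        }

        def classify(base_url):
            found = []
            for name, (suffix, marker) in PROBES.items():
                try:
                    r = requests.get(base_url + suffix, timeout=10)
                    if r.ok and marker in r.text:
                        found.append(name)
                except requests.RequestException:
                    pass  # unreachable endpoints are skipped
            return found

        print(classify("http://example.org/services"))  # hypothetical host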

  8. Finding Citations to Social Work Literature: The Relative Benefits of Using "Web of Science," "Scopus," or "Google Scholar"

    ERIC Educational Resources Information Center

    Bergman, Elaine M. Lasda

    2012-01-01

    Past studies of citation coverage of "Web of Science," "Scopus," and "Google Scholar" do not demonstrate a consistent pattern that can be applied to the interdisciplinary mix of resources used in social work research. To determine the utility of these tools to social work researchers, an analysis of citing references to well-known social work…

  9. Identifying Evidence for Public Health Guidance: A Comparison of Citation Searching with Web of Science and Google Scholar

    ERIC Educational Resources Information Center

    Levay, Paul; Ainsworth, Nicola; Kettle, Rachel; Morgan, Antony

    2016-01-01

    Aim: To examine how effectively forwards citation searching with Web of Science (WOS) or Google Scholar (GS) identified evidence to support public health guidance published by the National Institute for Health and Care Excellence. Method: Forwards citation searching was performed using GS on a base set of 46 publications and replicated using WOS.…

  10. Estimating search engine index size variability: a 9-year longitudinal study.

    PubMed

    van den Bosch, Antal; Bogers, Toine; de Kunder, Maurice

    One of the determining factors of the quality of Web search engines is the size of their index. In addition to its influence on search result quality, the size of the indexed Web can also tell us something about which parts of the WWW are directly accessible to the everyday user. We propose a novel method of estimating the size of a Web search engine's index by extrapolating from document frequencies of words observed in a large static corpus of Web pages. In addition, we provide a unique longitudinal perspective on the size of Google and Bing's indices over a nine-year period, from March 2006 until January 2015. We find that index size estimates of these two search engines tend to vary dramatically over time, with Google generally possessing a larger index than Bing. This result raises doubts about the reliability of previous one-off estimates of the size of the indexed Web. We find that much, if not all of this variability can be explained by changes in the indexing and ranking infrastructure of Google and Bing. This casts further doubt on whether Web search engines can be used reliably for cross-sectional webometric studies.
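
    The extrapolation at the heart of this method can be stated in a few lines: if a word occurs in a fraction f of documents in a representative corpus, and a search engine reports N documents containing that word, the index holds roughly N / f documents; averaging over many words smooths word-specific bias. A Python sketch of that arithmetic, with invented counts:

        # Index-size extrapolation from word document frequencies.
        # All counts below are invented for illustration.
        corpus_docs = 1_000_000

        # word -> (docs containing it in the corpus, engine-reported hit count)
        observations = {
            "the":     (950_000, 47_000_000_000),
            "weather": (120_000,  5_800_000_000),
            "suburb":  (  9_000,    430_000_000),
        }

        estimates = [hits / (df / corpus_docs)  # N / f
                     for df, hits in observations.values()]
        print(f"estimated index size: {sum(estimates) / len(estimates):,.0f} docs")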

  11. Leveraging Google Geo Tools for Interactive STEM Education: Insights from the GEODE Project

    NASA Astrophysics Data System (ADS)

    Dordevic, M.; Whitmeyer, S. J.; De Paor, D. G.; Karabinos, P.; Burgin, S.; Coba, F.; Bentley, C.; St John, K. K.

    2016-12-01

    Web-based imagery and geospatial tools have transformed our ability to immerse students in global virtual environments. Google's suite of geospatial tools, such as Google Earth (± Engine), Google Maps, and Street View, allows developers and instructors to create interactive and immersive environments, where students can investigate and resolve common misconceptions in STEM concepts and natural processes. The GEODE (.net) project is developing digital resources to enhance STEM education. These include virtual field experiences (VFEs), such as an interactive visualization of the breakup of the Pangaea supercontinent, a "Grand Tour of the Terrestrial Planets," and GigaPan-based VFEs of sites like the Canadian Rockies. Web-based challenges, such as EarthQuiz (.net) and the "Fold Analysis Challenge," incorporate scaffolded investigations of geoscience concepts. EarthQuiz features web-hosted imagery, such as Street View, Photo Spheres, GigaPans, and Satellite View, as the basis for guided inquiry. In the Fold Analysis Challenge, upper-level undergraduates use Google Earth to evaluate a doubly-plunging fold at Sheep Mountain, WY. GEODE.net also features: "Reasons for the Seasons," a Google Earth-based visualization that addresses misconceptions that abound amongst students, teachers, and the public, many of whom believe that seasonality is caused by large variations in Earth's distance from the Sun; "Plate Euler Pole Finder," which helps students understand rotational motion of tectonic plates on the globe; and "Exploring Marine Sediments Using Google Earth," an exercise that uses empirical data to explore the surficial distribution of marine sediments in the modern ocean. The GEODE research team includes the authors and: Heather Almquist, Cinzia Cervato, Gene Cooper, Helen Crompton, Terry Pavlis, Jen Piatek, Bill Richards, Jeff Ryan, Ron Schott, Barb Tewksbury, and their students and collaborating colleagues. We are supported by NSF DUE 1323419 and a Google Geo Curriculum Award.

  12. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui

    PubMed Central

    2012-01-01

    Background The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. Methods This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Results Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. Conclusions This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications. PMID:22998945

  13. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui.

    PubMed

    Newton, Richard; Deonarine, Andrew; Wernisch, Lorenz

    2012-09-24

    The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications.

  14. Using Mobile App Development Tools to Build a GIS Application

    NASA Astrophysics Data System (ADS)

    Mital, A.; Catchen, M.; Mital, K.

    2014-12-01

    Our group designed and built working web, Android, and iOS applications using different mapping libraries as bases on which to overlay fire data from NASA. The group originally planned to make app versions for Google Maps, Leaflet, and OpenLayers. However, because the Leaflet library did not load properly on Android, the group focused its efforts on the other two mapping libraries. For Google Maps, the group first designed a UI for the web app and made a working version of the app. After updating the source of fire data to one that also provided historical fire data, the design had to be modified to include the extra data. After completing a working version of the web app, the group used WebView in Android, a built-in component that allowed porting the web app to Android without rewriting the code. Upon completing this, the group found that Apple iOS devices had a similar capability, and so decided to add an iOS app to the project using a function similar to WebView. Alongside this effort, the group began implementing an OpenLayers fire map using a simpler UI. This web app was completed fairly quickly relative to the Google Maps version; however, it did not include functionality such as satellite imagery or searchable locations. The group finished the project with a working Android version of the Google Maps-based app supporting API levels 14-19 and an OpenLayers-based app supporting API levels 8-19, as well as a Google Maps-based iOS app supporting both old and new screen formats. This project was implemented by high school and college students under an SGT Inc. STEM internship program.

  15. KML Super Overlay to WMS Translator

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2007-01-01

    This translator is a server-based application that automatically generates the KML super-overlay configuration files required by Google Earth for map data access via the Open Geospatial Consortium WMS (Web Map Service) standard. The translator uses a set of URL parameters that mirror the WMS parameters as much as possible, and it can also generate a super-overlay subdivision of any given area that is only loaded when needed, enabling very large areas of coverage at very high resolutions. It can make almost any dataset that is available as a WMS service visible and usable in any KML application, without the need to reformat the data.
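
    A single tile of such a super overlay can be pictured as a GroundOverlay whose image source is a WMS GetMap URL, wrapped in a Region so Google Earth loads it only when it is in view. The Python sketch below emits one such tile; the WMS endpoint and layer name are hypothetical placeholders (ampersands are escaped for XML validity).

        # Emit one super-overlay tile as KML: a Region-gated GroundOverlay
        # whose image comes from a WMS GetMap request. Endpoint and layer
        # are hypothetical placeholders.
        WMS = ("http://example.org/wms?service=WMS&amp;version=1.1.1"
               "&amp;request=GetMap&amp;layers=elevation&amp;styles="
               "&amp;srs=EPSG:4326&amp;format=image/png"
               "&amp;width=256&amp;height=256&amp;bbox={w},{s},{e},{n}")

        def tile_kml(n, s, e, w):
            box = (f"<north>{n}</north><south>{s}</south>"
                   f"<east>{e}</east><west>{w}</west>")
            return f"""<?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2"><Document>
          <Region>
            <LatLonAltBox>{box}</LatLonAltBox>
            <Lod><minLodPixels>128</minLodPixels></Lod>
          </Region>
          <GroundOverlay>
            <Icon><href>{WMS.format(w=w, s=s, e=e, n=n)}</href></Icon>
            <LatLonBox>{box}</LatLonBox>
          </GroundOverlay>
        </Document></kml>"""

        print(tile_kml(n=40.0, s=35.0, e=-110.0, w=-115.0))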

  16. Evaluating Google, Twitter, and Wikipedia as Tools for Influenza Surveillance Using Bayesian Change Point Analysis: A Comparative Analysis.

    PubMed

    Sharpe, J Danielle; Hopkins, Richard S; Cook, Robert L; Striley, Catherine W

    2016-10-20

    Traditional influenza surveillance relies on influenza-like illness (ILI) syndrome that is reported by health care providers. It primarily captures individuals who seek medical care and misses those who do not. Recently, Web-based data sources have been studied for application to public health surveillance, as there is a growing number of people who search, post, and tweet about their illnesses before seeking medical care. Existing research has shown some promise of using data from Google, Twitter, and Wikipedia to complement traditional surveillance for ILI. However, past studies have evaluated these Web-based sources individually or dually without comparing all 3 of them, and it would be beneficial to know which of the Web-based sources performs best in order to be considered to complement traditional methods. The objective of this study is to comparatively analyze Google, Twitter, and Wikipedia by examining which best corresponds with Centers for Disease Control and Prevention (CDC) ILI data. It was hypothesized that Wikipedia will best correspond with CDC ILI data as previous research found it to be least influenced by high media coverage in comparison with Google and Twitter. Publicly available, deidentified data were collected from the CDC, Google Flu Trends, HealthTweets, and Wikipedia for the 2012-2015 influenza seasons. Bayesian change point analysis was used to detect seasonal changes, or change points, in each of the data sources. Change points in Google, Twitter, and Wikipedia that occurred during the exact week, 1 preceding week, or 1 week after the CDC's change points were compared with the CDC data as the gold standard. All analyses were conducted using the R package "bcp" version 4.0.0 in RStudio version 0.99.484 (RStudio Inc). In addition, sensitivity and positive predictive values (PPV) were calculated for Google, Twitter, and Wikipedia. During the 2012-2015 influenza seasons, a high sensitivity of 92% was found for Google, whereas the PPV for Google was 85%. A low sensitivity of 50% was calculated for Twitter; a low PPV of 43% was found for Twitter also. Wikipedia had the lowest sensitivity of 33% and lowest PPV of 40%. Of the 3 Web-based sources, Google had the best combination of sensitivity and PPV in detecting Bayesian change points in influenza-related data streams. Findings demonstrated that change points in Google, Twitter, and Wikipedia data occasionally aligned well with change points captured in CDC ILI data, yet these sources did not detect all changes in CDC data and should be further studied and developed.
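
    The matching rule described above (a web-source change point counts as a true positive if it falls within one week of a CDC change point) reduces to a few lines of Python; the example week numbers below are invented.

        # Sensitivity and PPV under the +/- 1 week matching rule described
        # above. Week numbers are invented for illustration.
        def sensitivity_ppv(detected, gold, tol=1):
            matched_gold = [g for g in gold
                            if any(abs(d - g) <= tol for d in detected)]
            true_pos = [d for d in detected
                        if any(abs(d - g) <= tol for g in gold)]
            return len(matched_gold) / len(gold), len(true_pos) / len(detected)

        cdc_weeks = [48, 52, 6, 10]          # gold-standard change points
        google_weeks = [48, 53, 6, 11, 20]   # detections from one web source

        sens, ppv = sensitivity_ppv(google_weeks, cdc_weeks)
        print(f"sensitivity={sens:.0%}  PPV={ppv:.0%}")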

  17. Using Google AdWords in the MBA MIS Course

    ERIC Educational Resources Information Center

    Rosso, Mark A.; McClelland, Marilyn K.; Jansen, Bernard J.; Fleming, Sundar W.

    2009-01-01

    From February to June 2008, Google ran its first ever student competition in sponsored Web search, the 2008 Google Online Marketing Challenge (GOMC). The 2008 GOMC was based on registrations from 61 countries: 629 course sections from 468 universities participated, fielding over 4000 student teams of approximately 21,000 students. Working with a…

  18. How Accurately Can the Google Web Speech API Recognize and Transcribe Japanese L2 English Learners' Oral Production?

    ERIC Educational Resources Information Center

    Ashwell, Tim; Elam, Jesse R.

    2017-01-01

    The ultimate aim of our research project was to use the Google Web Speech API to automate scoring of elicited imitation (EI) tests. However, in order to achieve this goal, we had to take a number of preparatory steps. We needed to assess how accurate this speech recognition tool is in recognizing native speakers' production of the test items; we…

  19. Shifting Sands: Science Researchers on Google Scholar, Web of Science, and PubMed, with Implications for Library Collections Budgets

    ERIC Educational Resources Information Center

    Hightower, Christy; Caldwell, Christy

    2010-01-01

    Science researchers at the University of California Santa Cruz were surveyed about their article database use and preferences in order to inform collection budget choices. Web of Science was the single most used database, selected by 41.6%. Statistically there was no difference between PubMed (21.5%) and Google Scholar (18.7%) as the second most…

  20. Dr Google

    PubMed Central

    Pías-Peleteiro, Leticia; Cortés-Bordoy, Javier; Martinón-Torres, Federico

    2013-01-01

    Objectives: To assess and analyze the information and recommendations provided by Google Web Search™ (Google) in relation to web searches on the HPV vaccine, indications for females and males and possible adverse effects. Materials and Methods: Descriptive cross-sectional study of the results of 14 web searches. Comprehensive analysis of results based on general recommendation given (favorable/dissuasive), as well as compliance with pre-established criteria, namely design, content and credibility. Sub-analysis of results according to site category: general information, blog / forum and press. Results: In the comprehensive analysis of results, 72.2% of websites offer information favorable to HPV vaccination, with varying degrees of content detail, vs. 27.8% with highly dissuasive content in relation to HPV vaccination. The most frequent type of site is the blog or forum. The information found is frequently incomplete, poorly structured, and often lacking in updates, bibliography and adequate citations, as well as sound credibility criteria (scientific association accreditation and/or trust mark system). Conclusions: Google, as a tool which users employ to locate medical information and advice, is not specialized in providing information that is necessarily rigorous or valid from a scientific perspective. Search results and ranking based on Google's generalized algorithms can lead users to poorly grounded opinions and statements, which may impact HPV vaccination perception and subsequent decision making. PMID:23744505

  1. The White-hat Bot: A Novel Botnet Defense Strategy

    DTIC Science & Technology

    2012-06-14

    etc. I will briefly discuss one common exploit here. One fraudulent activity perpetrated by botnets involves ad services such as Google's AdSense, which pays website owners revenue for posting the AdSense banner on their web site (Google, 2012). The AdSense banner displays messages from... The botmaster creates a bot that is programmed to visit the botmaster's own websites to click on the advertisements displayed in the AdSense banners. Since

  2. Google Analytics – Index of Resources

    EPA Pesticide Factsheets

    Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.

  3. Dark Web 101

    DTIC Science & Technology

    2016-07-21

    Today's internet has multiple webs. The surface web is what Google and other search engines index and pull based on links. Essentially, the surface...financial records, research and development), and personal data (medical records or legal documents). These are all deep web. Standard search engines don't

  4. WebViz:A Web-based Collaborative Interactive Visualization System for large-Scale Data Sets

    NASA Astrophysics Data System (ADS)

    Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.

    2010-12-01

    WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota's Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been developed over the last 3 1/2 years. The motivation behind WebViz lies primarily in the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to market, including the use of general-purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data 'on the fly', wherever he or she may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE's custom hierarchical volume rendering software provides high-resolution visualizations on the order of 15 million pixels and has been employed for visualizing data from simulations ranging from astrophysics to geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web- and JavaScript-enabled cell phones. Features in the current version include the ability for users to (1) securely login (2) launch multiple visualizations (3) conduct collaborative visualization sessions (4) delegate control aspects of a visualization to others and (5) engage in collaborative chats with other users within the user interface of the web application. These features are all in addition to a full range of essential visualization functions including 3-D camera and object orientation, position manipulation, time-stepping control, and custom color/alpha mapping.

  5. Teaching Google Search Techniques in an L2 Academic Writing Context

    ERIC Educational Resources Information Center

    Han, Sumi; Shin, Jeong-Ah

    2017-01-01

    This mixed-method study examines the effectiveness of teaching Google search techniques (GSTs) to Korean EFL college students in an intermediate-level academic English writing course. 18 students participated in a 4-day GST workshop consisting of an overview session of the web as corpus and Google as a concordancer, and three training sessions…

  6. The Effects of Collaborative Writing Activity Using Google Docs on Students' Writing Abilities

    ERIC Educational Resources Information Center

    Suwantarathip, Ornprapat; Wichadee, Saovapa

    2014-01-01

    Google Docs, a free web-based version of Microsoft Word, offers collaborative features which can be used to facilitate collaborative writing in a foreign language classroom. The current study compared writing abilities of students who collaborated on writing assignments using Google Docs with those working in groups in a face-to-face classroom.…

  7. Web-scale discovery in an academic health sciences library: development and implementation of the EBSCO Discovery Service.

    PubMed

    Thompson, Jolinda L; Obrig, Kathe S; Abate, Laura E

    2013-01-01

    Funds made available at the close of the 2010-11 fiscal year allowed purchase of the EBSCO Discovery Service (EDS) for a year-long trial. The appeal of this web-scale discovery product that offers a Google-like interface to library resources was counter-balanced by concerns about quality of search results in an academic health science setting and the challenge of configuring an interface that serves the needs of a diverse group of library users. After initial configuration, usability testing with library users revealed the need for further work before general release. Of greatest concern were continuing issues with the relevance of items retrieved, appropriateness of system-supplied facet terms, and user difficulties with navigating the interface. EBSCO has worked with the library to better understand and identify problems and solutions. External roll-out to users occurred in June 2012.

  8. Integrating web 2.0 in clinical research education in a developing country.

    PubMed

    Amgad, Mohamed; AlFaar, Ahmad Samir

    2014-09-01

    The use of Web 2.0 tools in education and health care has received heavy attention in recent years. Over two consecutive years, Children's Cancer Hospital - Egypt 57357 (CCHE 57357), in collaboration with Egyptian universities, student bodies, and NGOs, conducted a summer course that supports undergraduate medical students to cross the gap between clinical practice and clinical research. This time, there was a greater emphasis on reaching out to the students using social media and other Web 2.0 tools, which were heavily used in the course, including Google Drive, Facebook, Twitter, YouTube, Mendeley, Google Hangout, Live Streaming, Research Electronic Data Capture (REDCap), and Dropbox. We wanted to investigate the usefulness of integrating Web 2.0 technologies into formal educational courses and modules. The evaluation survey was filled in by 156 respondents, 134 of whom were course candidates (response rate = 94.4%) and 22 of whom were course coordinators (response rate = 81.5%). The course participants came from 14 different universities throughout Egypt. Students' feedback was positive and supported the integration of Web 2.0 tools in academic courses and modules. Google Drive, Facebook, and Dropbox were found to be most useful.

  9. EpiCollect: linking smartphones to web applications for epidemiology, ecology and community data collection.

    PubMed

    Aanensen, David M; Huntley, Derek M; Feil, Edward J; al-Own, Fada'a; Spratt, Brian G

    2009-09-16

    Epidemiologists and ecologists often collect data in the field and, on returning to their laboratory, enter their data into a database for further analysis. The recent introduction of mobile phones that utilise the open source Android operating system, and which include (among other features) both GPS and Google Maps, provides new opportunities for developing mobile phone applications, which, in conjunction with web applications, allow two-way communication between field workers and their project databases. Here we describe a generic framework, consisting of mobile phone software, EpiCollect, and a web application located within www.spatialepidemiology.net. Data collected by multiple field workers can be submitted by phone, together with GPS data, to a common web database and can be displayed and analysed, along with previously collected data, using Google Maps (or Google Earth). Similarly, data from the web database can be requested and displayed on the mobile phone, again using Google Maps. Data filtering options allow the display of data submitted by individual field workers or, for example, those data within certain values of a measured variable or a time period. Data collection frameworks utilising mobile phones with data submission to and from central databases are widely applicable and can give a field worker similar display and analysis tools on their mobile phone to those they would have if viewing the data in their laboratory via the web. We demonstrate their utility for epidemiological data collection and display, and briefly discuss their application in ecological and community data collection. Furthermore, such frameworks offer great potential for recruiting 'citizen scientists' to contribute data easily to central databases through their mobile phone.
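
    The submission half of this two-way pattern is essentially a record bundled with its GPS fix and POSTed to the project database. A minimal Python sketch follows; the endpoint URL and field names are hypothetical placeholders.

        # POST a field record, with its GPS fix, to a project web database.
        # Endpoint URL and field names are hypothetical placeholders.
        import requests

        record = {
            "project": "cholera_survey",      # hypothetical project
            "observer": "field_worker_07",
            "lat": -1.2921, "lon": 36.8219,   # GPS fix from the phone
            "note": "3 suspected cases",
        }
        resp = requests.post("https://example.org/api/records",
                             json=record, timeout=30)
        print(resp.status_code)  # the server can later plot this point on Google Maps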

  10. An Android based location service using GSMCellID and GPS to obtain a graphical guide to the nearest cash machine

    NASA Astrophysics Data System (ADS)

    Jacobsen, Jurma; Edlich, Stefan

    2009-02-01

    There is a broad range of potentially useful mobile location-based applications. One crucial point is to make them available to the public at large. This case illuminates the ability of Android - the operating system for mobile devices - to fulfill this demand in the mashup way, by use of special geocoding web services and one integrated web service for retrieving data on the nearest cash machines. It shows an exemplary approach for building mobile location-based mashups for everyone: 1. As a basis for reaching as many people as possible, the open source Android OS is assumed to spread widely. 2. Everyone also means that the handset does not have to be an expensive GPS device. This is realized by re-utilization of the existing GSM infrastructure with the Cell of Origin (COO) method, which looks up the CellID in one of the growing number of web-available CellID databases. Some of these databases are still undocumented and not yet published. Furthermore, the Google Maps API for Mobile (GMM) and the open source counterpart OpenCellID are used. Localizing the user's current position via lookup of the closest cell to which the handset is currently connected (COO) is not as precise as GPS, but appears to be sufficient for many applications. For this reason the GPS user is the most pleased one - for this user the system is fully automated. By contrast, users who do not own a GPS-capable handset should refine their location with one click on the map inside the determined circular region. Users are then shown and guided along a path to the nearest cash machine via a Google Maps overlay. Additionally, the GPS user can keep track of him- or herself through a frequently updated view driven by constantly requested precise GPS positions.
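
    A Cell-of-Origin lookup boils down to sending the serving cell's identifiers (MCC, MNC, LAC, CellID) to a web-available cell database and reading back the tower's approximate coordinates. The Python sketch below targets OpenCellID's public lookup; the endpoint and parameter names follow its public API but should be treated as assumptions here, and a real API key is required.

        # Cell-of-Origin lookup against a CellID database (OpenCellID).
        # Endpoint and parameter names are assumptions based on its public
        # API; replace the key with a real one.
        import requests

        params = {
            "key": "YOUR_API_KEY",  # placeholder
            "mcc": 262,             # mobile country code
            "mnc": 2,               # mobile network code
            "lac": 434,             # location area code
            "cellid": 23456,        # serving cell ID
            "format": "json",
        }
        resp = requests.get("https://opencellid.org/cell/get",
                            params=params, timeout=30)
        print(resp.json())  # expected: approximate lat/lon of the cell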

  11. Using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.

    2016-12-01

    Cloud-based infrastructure may offer several key benefits of scalability, built-in redundancy, and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided through all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost-effective, but also shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API-compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
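
    One concrete advantage of object storage in this setting is ranged reads: a client can fetch just the bytes holding one data chunk instead of downloading a whole granule. A minimal boto3 sketch, with a hypothetical bucket and object key:

        # Ranged read from an object store (S3 via boto3): fetch one chunk
        # of a large file without downloading it all. Bucket and key are
        # hypothetical placeholders.
        import boto3

        s3 = boto3.client("s3")
        resp = s3.get_object(
            Bucket="earth-science-archive",        # hypothetical bucket
            Key="modis/ndvi_2016_tile_h09v05.nc",  # hypothetical key
            Range="bytes=4096-8191",               # one data chunk
        )
        chunk = resp["Body"].read()
        print(len(chunk), "bytes fetched, not the whole file")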

  12. Evaluating Google, Twitter, and Wikipedia as Tools for Influenza Surveillance Using Bayesian Change Point Analysis: A Comparative Analysis

    PubMed Central

    Hopkins, Richard S; Cook, Robert L; Striley, Catherine W

    2016-01-01

    Background Traditional influenza surveillance relies on influenza-like illness (ILI) syndrome that is reported by health care providers. It primarily captures individuals who seek medical care and misses those who do not. Recently, Web-based data sources have been studied for application to public health surveillance, as there is a growing number of people who search, post, and tweet about their illnesses before seeking medical care. Existing research has shown some promise of using data from Google, Twitter, and Wikipedia to complement traditional surveillance for ILI. However, past studies have evaluated these Web-based sources individually or dually without comparing all 3 of them, and it would be beneficial to know which of the Web-based sources performs best in order to be considered to complement traditional methods. Objective The objective of this study is to comparatively analyze Google, Twitter, and Wikipedia by examining which best corresponds with Centers for Disease Control and Prevention (CDC) ILI data. It was hypothesized that Wikipedia will best correspond with CDC ILI data as previous research found it to be least influenced by high media coverage in comparison with Google and Twitter. Methods Publicly available, deidentified data were collected from the CDC, Google Flu Trends, HealthTweets, and Wikipedia for the 2012-2015 influenza seasons. Bayesian change point analysis was used to detect seasonal changes, or change points, in each of the data sources. Change points in Google, Twitter, and Wikipedia that occurred during the exact week, 1 preceding week, or 1 week after the CDC’s change points were compared with the CDC data as the gold standard. All analyses were conducted using the R package “bcp” version 4.0.0 in RStudio version 0.99.484 (RStudio Inc). In addition, sensitivity and positive predictive values (PPV) were calculated for Google, Twitter, and Wikipedia. Results During the 2012-2015 influenza seasons, a high sensitivity of 92% was found for Google, whereas the PPV for Google was 85%. A low sensitivity of 50% was calculated for Twitter; a low PPV of 43% was found for Twitter also. Wikipedia had the lowest sensitivity of 33% and lowest PPV of 40%. Conclusions Of the 3 Web-based sources, Google had the best combination of sensitivity and PPV in detecting Bayesian change points in influenza-related data streams. Findings demonstrated that change points in Google, Twitter, and Wikipedia data occasionally aligned well with change points captured in CDC ILI data, yet these sources did not detect all changes in CDC data and should be further studied and developed. PMID:27765731

  13. Increasing the availability and usability of terrestrial ecology data through geospatial Web services and visualization tools (Invited)

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.

    2010-12-01

    Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis, and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data, and limit data users' abilities to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create the data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web Service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools allows for publicizing the data sets and makes them more available across the Internet. The ORNL DAAC has also created a Web-based graphical user interface called Spatial Data Access Tool (SDAT) that utilizes OGC Web services standards and allows data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize the data set prior to download. Google Earth visualizations of the data set are also provided through SDAT. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set saw a ~10-fold increase in downloads through OGC Web services in comparison to conventional FTP and WWW access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
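
    Part of what the OGC standards buy is that the same request works in any compliant client. A GetMap query is just a URL with spec-defined parameters, as in the Python sketch below; the endpoint and layer name are hypothetical.

        # A standard OGC WMS 1.1.1 GetMap request. Parameter names come from
        # the WMS spec; the endpoint and layer are hypothetical.
        import requests

        params = {
            "service": "WMS", "version": "1.1.1", "request": "GetMap",
            "layers": "ornl:npp_global",   # hypothetical layer
            "styles": "",
            "srs": "EPSG:4326",
            "bbox": "-180,-90,180,90",
            "width": 720, "height": 360,
            "format": "image/png",
        }
        resp = requests.get("https://example.org/wms", params=params, timeout=60)
        with open("npp_global.png", "wb") as f:
            f.write(resp.content)  # the identical query works in any WMS client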

  14. EPA Web Training Classes

    EPA Pesticide Factsheets

    Scheduled webinars can help you better manage EPA web content. Class topics include Drupal basics, creating different types of pages in the WebCMS such as document pages and forms, using Google Analytics, and best practices for metadata and accessibility.

  15. Return of the Google Game: More Fun Ideas to Transform Students into Skilled Researchers

    ERIC Educational Resources Information Center

    Watkins, Katrine

    2008-01-01

    Teens are impatient and unsophisticated online researchers who are often limited by their poor reading skills. Because they are attracted to clean and simple Web interfaces, they often turn to Google--and now Wikipedia--to help meet their research needs. The Google Game, co-authored by this author, teaches kids that there is a well-thought-out…

  16. iAnn: an event sharing platform for the life sciences.

    PubMed

    Jimenez, Rafael C; Albar, Juan P; Bhak, Jong; Blatter, Marie-Claude; Blicher, Thomas; Brazas, Michelle D; Brooksbank, Cath; Budd, Aidan; De Las Rivas, Javier; Dreyer, Jacqueline; van Driel, Marc A; Dunn, Michael J; Fernandes, Pedro L; van Gelder, Celia W G; Hermjakob, Henning; Ioannidis, Vassilios; Judge, David P; Kahlem, Pascal; Korpelainen, Eija; Kraus, Hans-Joachim; Loveland, Jane; Mayer, Christine; McDowall, Jennifer; Moran, Federico; Mulder, Nicola; Nyronen, Tommi; Rother, Kristian; Salazar, Gustavo A; Schneider, Reinhard; Via, Allegra; Villaveces, Jose M; Yu, Ping; Schneider, Maria V; Attwood, Teresa K; Corpas, Manuel

    2013-08-01

    We present iAnn, an open source community-driven platform for dissemination of life science events, such as courses, conferences and workshops. iAnn allows automatic visualisation and integration of customised event reports. A central repository lies at the core of the platform: curators add submitted events, and these are subsequently accessed via web services. Thus, once an iAnn widget is incorporated into a website, it permanently shows timely relevant information as if it were native to the remote site. At the same time, announcements submitted to the repository are automatically disseminated to all portals that query the system. To facilitate the visualization of announcements, iAnn provides powerful filtering options and views, integrated in Google Maps and Google Calendar. All iAnn widgets are freely available. Availability: http://iann.pro/iannviewer. Contact: manuel.corpas@tgac.ac.uk.

  17. Development of RESTful services and map-based user interface tools for access and delivery of data and metadata from the Marine-Geo Digital Library

    NASA Astrophysics Data System (ADS)

    Morton, J. J.; Ferrini, V. L.

    2015-12-01

    The Marine Geoscience Data System (MGDS, www.marine-geo.org) operates an interactive digital data repository and metadata catalog that provides access to a variety of marine geology and geophysical data from throughout the global oceans. Its Marine-Geo Digital Library includes common marine geophysical data types and supporting data and metadata, as well as complementary long-tail data. The Digital Library also includes community data collections and custom data portals for the GeoPRISMS, MARGINS and Ridge2000 programs, for active source reflection data (Academic Seismic Portal), and for marine data acquired by the US Antarctic Program (Antarctic and Southern Ocean Data Portal). Ensuring that these data are discoverable not only through our own interfaces but also through standards-compliant web services is critical for enabling investigators to find data of interest. Over the past two years, MGDS has developed several new RESTful web services that enable programmatic access to metadata and data holdings. These web services are compliant with the EarthCube GeoWS Building Blocks specifications and are currently used to drive our own user interfaces. New web applications have also been deployed to provide a more intuitive user experience for searching, accessing and browsing metadata and data. Our new map-based search interface combines components of the Google Maps API with our web services for dynamic searching and exploration of geospatially constrained data sets. Direct introspection of nearly all data formats for hundreds of thousands of data files curated in the Marine-Geo Digital Library has allowed for precise geographic bounds, which allow geographic searches to an extent not previously possible. All MGDS map interfaces utilize the web services of the Global Multi-Resolution Topography (GMRT) synthesis for displaying global basemap imagery and for dynamically providing depth values at the cursor location.

  18. Quality of consumer-targeted internet guidance on home firearm and ammunition storage.

    PubMed

    Freundlich, Katherine L; Skoczylas, Maria Shakour; Schmidt, John P; Keshavarzi, Nahid R; Mohr, Bethany Anne

    2016-10-01

    Four storage practices protect against unintentional and/or self-inflicted firearm injury among children and adolescents: keeping guns locked (1) and unloaded (2) and keeping ammunition locked up (3) and in a separate location from the guns (4). Our aim was to mimic common Google search strategies on firearm/ammunition storage and assess whether the resulting web pages provided recommendations consistent with those supported by the literature. We identified 87 web pages by Google search of the 10 most commonly used search terms in the USA related to firearm/ammunition storage. Two non-blinded independent reviewers analysed web page technical quality according to a 17-item checklist derived from previous studies. A single reviewer analysed readability by US grade level assigned by Flesch-Kincaid Grade Level Index. Two separate, blinded, independent reviewers analysed deidentified web page content for accuracy and completeness describing the four accepted storage practices. Reviewers resolved disagreements by consensus. The web pages described, on average, less than one of four accepted storage practices (mean 0.2 (95% CL 0.1 to 0.4)). Only two web pages (2%) identified all four practices. Two web pages (2%) made assertions inconsistent with recommendations; both implied that loaded firearms could be stored safely. Flesch-Kincaid Grade Level Index averaged 8.0 (95% CL 7.3 to 8.7). The average technical quality score was 7.1 (95% CL 6.8 to 7.4) out of an available score of 17. There was a high degree of agreement between reviewers regarding completeness (weighted κ 0.78 (95% CL 0.61 to 0.97)). The internet currently provides incomplete information about safe firearm storage. Understanding existing deficiencies may inform future strategies for improvement.

  19. Using GeoRSS feeds to distribute house renting and selling information based on Google map

    NASA Astrophysics Data System (ADS)

    Nong, Yu; Wang, Kun; Miao, Lei; Chen, Fei

    2007-06-01

    Geographically Encoded Objects RSS (GeoRSS) is a way to encode location in RSS feeds. RSS is a widely supported format for syndication of news and weblogs, and is extendable to publish any sort of itemized data. As weblogs exploded and RSS feeds became new portals, geo-tagged feeds became necessary to show the location each story tells about. GeoRSS adopts the core of the RSS framework, specifying map annotations in the RSS XML format. The case studied illustrates that GeoRSS can be maximally concise in representation and conception, making it simple to generate feeds and then mash them up with Google Maps through the API, showing real-estate information along with other attributes in the information window. After subscribing to feeds on subjects of interest, users can easily check for new bulletins shown on the map through syndication. The primary design goal of GeoRSS is to make spatial data creation as easy as regular Web content development. It does more, however: thanks to its simplicity and effectiveness, it successfully bridges the gap between traditional GIS professionals and the amateurs, Web map hackers, and numerous services that enable location-based content.
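
    Concretely, a GeoRSS-Simple entry is an ordinary RSS item carrying one extra element in the GeoRSS namespace, a georss:point holding "lat lon". The Python sketch below prints such a feed; the listing details are invented.

        # A GeoRSS-Simple feed: plain RSS plus <georss:point> ("lat lon").
        # Listing details are invented for illustration.
        feed = """<rss version="2.0" xmlns:georss="http://www.georss.org/georss">
          <channel>
            <title>House listings</title>
            <item>
              <title>2-bedroom apartment for rent</title>
              <link>http://example.org/listings/1024</link>
              <description>85 sq m, 3rd floor, near the metro</description>
              <georss:point>31.2304 121.4737</georss:point>
            </item>
          </channel>
        </rss>"""
        print(feed)  # a Google Maps mashup reads the point and drops a marker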

  20. ClimateWizard: A Framework and Easy-to-Use Web-Mapping Tool for Global, Regional, and Local Climate-Change Analysis

    NASA Astrophysics Data System (ADS)

    Girvetz, E. H.; Zganjar, C.; Raber, G. T.; Hoekstra, J.; Lawler, J. J.; Kareiva, P.

    2008-12-01

    Now that there is overwhelming evidence of global climate change, scientists, managers and planners (i.e. practitioners) need to assess the potential impacts of climate change on particular ecological systems, within specific geographic areas, and at spatial scales they care about, in order to make better land management, planning, and policy decisions. Unfortunately, this application of climate science to real world decisions and planning has proceeded too slowly because we lack tools for translating cutting-edge climate science and climate-model outputs into something managers and planners can work with at local or regional scales (CCSP 2008). To help increase the accessibility of climate information, we have developed a freely-available, easy-to-use, web-based climate-change analysis toolbox, called ClimateWizard, for assessing how climate has and is projected to change at specific geographic locations throughout the world. The ClimateWizard uses geographic information systems (GIS), web services (SOAP/XML), statistical analysis platforms (e.g. R-project), and web-based mapping services (e.g. Google Earth/Maps, KML/GML) to provide a variety of different analyses (e.g. trends and departures) and outputs (e.g. maps, graphs, tables, GIS layers). Because ClimateWizard analyzes large climate datasets stored remotely on powerful computers, users of the tool do not need to have fast computers or expensive software, but simply need access to the internet. The analysis results are then provided to users in a Google Maps webpage tailored to the specific climate-change question being asked. The ClimateWizard is not a static product, but rather a framework to be built upon and modified to suit the purposes of specific scientific, management, and policy questions. For example, it can be expanded to include bioclimatic variables (e.g. evapotranspiration) and marine data (e.g. sea surface temperature), as well as improved future climate projections, and climate-change impact analyses involving hydrology, vegetation, wildfire, disease, and food security. By harnessing the power of computer and web-based technologies, the ClimateWizard puts local, regional, and global climate-change analyses in the hands of a wider array of managers, planners, and scientists.

  1. Moving beyond a Google Search: Google Earth, SketchUp, Spreadsheet, and More

    ERIC Educational Resources Information Center

    Siegle, Del

    2007-01-01

    Google has been the search engine of choice for most Web surfers for the past half decade. More recently, the creative founders of the popular search engine have been busily creating and testing a variety of useful products that will appeal to gifted learners of varying ages. The purpose of this paper is to share information about three of these…

  2. Web GIS in practice VI: a demo playlist of geo-mashups for public health neogeographers

    PubMed Central

    Boulos, Maged N Kamel; Scotch, Matthew; Cheung, Kei-Hoi; Burden, David

    2008-01-01

    'Mashup' was originally used to describe the mixing together of musical tracks to create a new piece of music. The term now refers to Web sites or services that weave data from different sources into a new data source or service. Using a musical metaphor that builds on the origin of the word 'mashup', this paper presents a demonstration "playlist" of four geo-mashup vignettes that make use of a range of Web 2.0, Semantic Web, and 3-D Internet methods, with outputs/end-user interfaces spanning the flat Web (two-dimensional – 2-D maps), a three-dimensional – 3-D mirror world (Google Earth) and a 3-D virtual world (Second Life®). The four geo-mashup "songs" in this "playlist" are: 'Web 2.0 and GIS (Geographic Information Systems) for infectious disease surveillance', 'Web 2.0 and GIS for molecular epidemiology', 'Semantic Web for GIS mashup', and 'From Yahoo! Pipes to 3-D, avatar-inhabited geo-mashups'. It is hoped that this showcase of examples and ideas, and the pointers we are providing to the many online tools that are freely available today for creating, sharing and reusing geo-mashups with minimal or no coding, will ultimately spark the imagination of many public health practitioners and stimulate them to start exploring the use of these methods and tools in their day-to-day practice. The paper also discusses how today's Web is rapidly evolving into a much more intensely immersive, mixed-reality and ubiquitous socio-experiential Metaverse that is heavily interconnected through various kinds of user-created mashups. PMID:18638385

  3. Hand Society and Matching Program Web Sites Provide Poor Access to Information Regarding Hand Surgery Fellowship.

    PubMed

    Hinds, Richard M; Klifto, Christopher S; Naik, Amish A; Sapienza, Anthony; Capo, John T

    2016-08-01

    The Internet is a common resource for applicants to hand surgery fellowships; however, the quality and accessibility of fellowship information online are unknown. The objectives of this study were to evaluate the accessibility of hand surgery fellowship Web sites and to assess the quality of information provided via program Web sites. Hand fellowship Web site accessibility was evaluated by reviewing the American Society for Surgery of the Hand (ASSH) fellowship directory on November 16, 2014, and the National Resident Matching Program (NRMP) fellowship directory on February 12, 2015, and by performing an independent Google search on November 25, 2014. Accessible Web sites were then assessed for quality of the presented information. A total of 81 programs were identified; the ASSH directory featured direct links to 32% of program Web sites and the NRMP directory directly linked to 0%. A Google search yielded direct links to 86% of program Web sites. The quality of presented information varied greatly among the 72 accessible Web sites. Program description (100%), fellowship application requirements (97%), program contact email address (85%), and research requirements (75%) were the most commonly presented components of fellowship information. Hand fellowship program Web sites can be accessed from the ASSH directory and, to a lesser extent, the NRMP directory. However, a Google search is the most reliable method to access online fellowship information. Of assessable programs, all featured a program description, though the quality of the remaining information was variable. Hand surgery fellowship applicants may face some difficulties when attempting to gather program information online. Future efforts should focus on improving the accessibility and content quality on hand surgery fellowship program Web sites.

  4. Creating Web Area Segments with Google Analytics

    EPA Pesticide Factsheets

    Segments allow you to quickly access data for a predefined set of Sessions or Users, such as government or education users, or sessions in a particular state. You can then apply this segment to any report within the Google Analytics (GA) interface.
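
    Segments can also be applied outside the GA interface. The sketch below assumes the Universal Analytics Reporting API v4 (the API generation that matched this GA interface) and the google-api-python-client package; the key file and view ID are placeholders.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Apply a segment to a report programmatically; credentials and viewId are
# placeholders. Note that ga:segment must be listed as a dimension whenever
# a segment is applied.
creds = service_account.Credentials.from_service_account_file(
    "key.json", scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analyticsreporting", "v4", credentials=creds)

report = analytics.reports().batchGet(body={"reportRequests": [{
    "viewId": "123456789",
    "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
    "metrics": [{"expression": "ga:sessions"}],
    "dimensions": [{"name": "ga:segment"}],
    "segments": [{"segmentId": "gaid::-1"}],   # built-in "All Users" segment
}]}).execute()
print(report["reports"][0]["data"]["totals"])
```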

  5. Global Precipitation Measurement (GPM) Mission Products and Services at the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC)

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Ostrenga, D.; Vollmer, B.; Deshong, B.; Greene, M.; Teng, W.; Kempler, S. J.

    2015-01-01

    On February 27, 2014, the NASA Global Precipitation Measurement (GPM) mission was launched to provide the next-generation global observations of rain and snow (http://pmm.nasa.gov/GPM). The GPM mission consists of an international network of satellites in which a GPM Core Observatory satellite carries both active and passive microwave instruments to measure precipitation and serve as a reference standard, to unify precipitation measurements from a constellation of other research and operational satellites. The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) hosts and distributes GPM data within the NASA Earth Observation System Data Information System (EOSDIS). The GES DISC is home to the data archive for the GPM predecessor, the Tropical Rainfall Measuring Mission (TRMM). Over the past 16 years, the GES DISC has served the scientific as well as other communities with TRMM data and user-friendly services. During the GPM era, the GES DISC will continue to provide user-friendly data services and customer support to users around the world. GPM products currently or soon to be available include the following: 1. Level-1 GPM Microwave Imager (GMI) and partner radiometer products; 2. Goddard Profiling Algorithm (GPROF) GMI and partner products; 3. Integrated Multi-satellitE Retrievals for GPM (IMERG) products (early, late, and final). A dedicated Web portal (including user guides, etc.) has been developed for GPM data (http://disc.sci.gsfc.nasa.gov/gpm). Data services that are currently or soon to be available include the Google-like Mirador (http://mirador.gsfc.nasa.gov) for data search and access; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion into various formats (e.g., netCDF, HDF, KML (for Google Earth), ASCII); exploration, visualization, and statistical online analysis through Giovanni (http://giovanni.gsfc.nasa.gov); generation of value-added products; parameter and spatial subsetting; time aggregation; regridding; data version control and provenance; documentation; science support for proper data usage, FAQ, help desk; and monitoring services (e.g. Current Conditions) for applications. In this presentation, we will present GPM data products and services with examples.
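
    Of the access methods listed, OPeNDAP is the one that maps most directly onto a few lines of code: it lets a client subset a remote archive file without downloading it. A sketch follows, assuming a netCDF4-python build with OPeNDAP support; the URL and variable name are placeholders to be replaced with real paths from the GES DISC portal.

```python
from netCDF4 import Dataset  # requires a netCDF4 build with OPeNDAP (DAP) support

# Placeholder URL and variable name; real OPeNDAP paths are listed on the
# GES DISC portal. Only the requested subset is transferred over the network.
url = "https://gesdisc.example.nasa.gov/opendap/GPM/IMERG/example.nc4"
ds = Dataset(url)
precip = ds.variables["precipitationCal"][0, 100:200, 100:200]  # spatial subset
print(precip.mean())
ds.close()
```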

  6. Global Precipitation Measurement (GPM) Mission Products and Services at the NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC)

    NASA Astrophysics Data System (ADS)

    Ostrenga, D.; Liu, Z.; Vollmer, B.; Teng, W. L.; Kempler, S. J.

    2014-12-01

    On February 27, 2014, the NASA Global Precipitation Measurement (GPM) mission was launched to provide the next-generation global observations of rain and snow (http://pmm.nasa.gov/GPM). The GPM mission consists of an international network of satellites in which a GPM "Core Observatory" satellite carries both active and passive microwave instruments to measure precipitation and serve as a reference standard, to unify precipitation measurements from a constellation of other research and operational satellites. The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) hosts and distributes GPM data within the NASA Earth Observation System Data Information System (EOSDIS). The GES DISC is home to the data archive for the GPM predecessor, the Tropical Rainfall Measuring Mission (TRMM). Over the past 16 years, the GES DISC has served the scientific as well as other communities with TRMM data and user-friendly services. During the GPM era, the GES DISC will continue to provide user-friendly data services and customer support to users around the world. GPM products currently or soon to be available include the following: Level-1 GPM Microwave Imager (GMI) and partner radiometer products; Goddard Profiling Algorithm (GPROF) GMI and partner products; and Integrated Multi-satellitE Retrievals for GPM (IMERG) products (early, late, and final). A dedicated Web portal (including user guides, etc.) has been developed for GPM data (http://disc.sci.gsfc.nasa.gov/gpm). Data services that are currently or soon to be available include the Google-like Mirador (http://mirador.gsfc.nasa.gov/) for data search and access; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion into various formats (e.g., netCDF, HDF, KML (for Google Earth), ASCII); exploration, visualization, and statistical online analysis through Giovanni (http://giovanni.gsfc.nasa.gov); generation of value-added products; parameter and spatial subsetting; time aggregation; regridding; data version control and provenance; documentation; science support for proper data usage, FAQ, help desk; and monitoring services (e.g. Current Conditions) for applications. In this presentation, we will present GPM data products and services with examples.

  7. [Health information on the Internet and trust marks as quality indicators: vaccines case study].

    PubMed

    Mayer, Miguel Angel; Leis, Angela; Sanz, Ferran

    2009-10-01

    To determine the prevalence of quality trust marks on websites and to analyse the quality of websites displaying trust marks compared with those that do not, in order to put forward trust marks as a quality indicator. Cross-sectional study. Internet. Websites on vaccines. Using "vacunas OR vaccines" as key words, the features of 40 web pages were analysed. These web pages were selected from the results of two search engines, Google and Yahoo! Based on a total of 9 criteria, the average score of criteria fulfilled was 7 (95% CI 3.96-10.04) points for the web pages offered by Yahoo! and 7.3 (95% CI 3.86-10.74) for those offered by Google. Amongst web pages offered by Yahoo!, three contained clearly inaccurate information, as did four of the pages offered by Google. Trust marks were displayed on 20% and 30% of medical web pages, respectively, and web pages displaying trust marks fulfilled significantly more quality criteria (P=0.033) than those that did not. The search engines returned a wide variety of web pages, a large number of them with useless information. Although the websites analysed were of good quality overall, between 15% and 20% showed inaccurate information. Websites displaying trust marks had higher quality than those that did not, and none of them were amongst those where inaccurate information was found.

  8. Lightweight Advertising and Scalable Discovery of Services, Datasets, and Events Using Feedcasts

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Ramachandran, R.; Movva, S.

    2010-12-01

    Broadcast feeds (Atom or RSS) are a mechanism for advertising the existence of new data objects on the web, with metadata and links to further information. Users then subscribe to the feed to receive updates. This concept has already been used to advertise the new granules of science data as they are produced (datacasting), with browse images and metadata, and to advertise bundles of web services (service casting). Structured metadata is introduced into the XML feed format by embedding new XML tags (in defined namespaces), using typed links, and reusing built-in Atom feed elements. This “infocasting” concept can be extended to include many other science artifacts, including data collections, workflow documents, topical geophysical events (hurricanes, forest fires, etc.), natural hazard warnings, and short articles describing a new science result. The common theme is that each infocast contains machine-readable, structured metadata describing the object and enabling further manipulation. For example, service casts contain typed links pointing to the service interface description (e.g., WSDL for SOAP services), the service endpoint, and human-readable documentation. Our Infocasting project has three main goals: (1) define and evangelize micro-formats (metadata standards) so that providers can easily advertise their web services, datasets, and topical geophysical events by adding structured information to broadcast feeds; (2) develop authoring tools so that anyone can easily author such service advertisements, data casts, and event descriptions; and (3) provide a one-stop, Google-like search box in the browser that allows discovery of service, data and event casts visible on the web, and services & data registered in the GEOSS repository and other NASA repositories (GCMD & ECHO). To demonstrate the event casting idea, a series of micro-articles, with accompanying event casts containing links to relevant datasets, web services, and science analysis workflows, will be authored for several kinds of geophysical events, such as hurricanes, smoke plume events, tsunamis, etc. The talk will describe our progress so far, and some of the issues with leveraging existing metadata standards to define lightweight micro-formats.
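
    A minimal sketch of a "service cast" entry follows, using the third-party feedgen package; the feed IDs, URLs, and the choice of link relation are illustrative, not the project's actual micro-format.

```python
from feedgen.feed import FeedGenerator  # third-party package: pip install feedgen

# Build an Atom feed whose entry advertises a web service via a typed link;
# every identifier and URL below is invented for illustration.
fg = FeedGenerator()
fg.id("http://example.org/servicecasts")
fg.title("Service casts: precipitation services")
fg.link(href="http://example.org/servicecasts.atom", rel="self")

fe = fg.add_entry()
fe.id("http://example.org/services/precip-wms")
fe.title("Precipitation map service")
fe.summary("Daily precipitation maps as an OGC WMS")
fe.link(href="http://example.org/precip?wsdl", rel="alternate",
        type="application/wsdl+xml")   # typed link to the interface description

print(fg.atom_str(pretty=True).decode())
```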

  9. The 2nd Generation Real Time Mission Monitor (RTMM) Development

    NASA Technical Reports Server (NTRS)

    Blakeslee, Richard; Goodman, Michael; Meyer, Paul; Hardin, Danny; Hall, John; He, Yubin; Regner, Kathryn; Conover, Helen; Smith, Tammy; Lu, Jessica; hide

    2009-01-01

    The NASA Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources to enable real-time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery and orbit data, radar and other surface observations (e.g., lightning location network data), airborne navigation and instrument data sets, model output parameters, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. In order to improve the usefulness and efficiency of the RTMM system, capabilities are being developed to allow the end-user to easily configure RTMM applications based on their mission-specific requirements and objectives. This second-generation RTMM is being redesigned to take advantage of the Google Earth plug-in capabilities to run multiple applications in a web browser rather than the original single-application Google Earth approach. Currently RTMM employs a limited Service Oriented Architecture approach to enable discovery of mission-specific resources. We are expanding the RTMM architecture such that it will more effectively utilize the Open Geospatial Consortium Sensor Web Enablement services and other new technology software tools and components. These modifications and extensions will result in a robust, versatile RTMM system that will greatly increase the flexibility of the user to choose which science data sets and support applications to view and/or use. The improvements brought about by the second-generation RTMM system will provide mission planners and airborne scientists with enhanced decision-making tools and capabilities to more efficiently plan, prepare and execute missions, as well as to play back and review past mission data. To paraphrase the old television commercial: RTMM doesn't make the airborne science, it makes the airborne science easier.

  10. Ajax Architecture Implementation Techniques

    NASA Astrophysics Data System (ADS)

    Hussaini, Syed Asadullah; Tabassum, S. Nasira; Baig, Tabassum, M. Khader

    2012-03-01

    Today's rich Web applications use a mix of JavaScript and asynchronous communication with the application server. This mechanism is also known as Ajax: Asynchronous JavaScript and XML. The intent of Ajax is to exchange small pieces of data between the browser and the application server, and in doing so, use partial page refresh instead of reloading the entire Web page. AJAX (Asynchronous JavaScript and XML) is a powerful Web development model for browser-based Web applications. Technologies that form the AJAX model, such as XML, JavaScript, HTTP, and XHTML, are individually widely used and well known. However, AJAX combines these technologies to let Web pages retrieve small amounts of data from the server without having to reload the entire page. This capability makes Web pages more interactive and lets them behave like local applications. Web 2.0, enabled by the Ajax architecture, has given rise to a new level of user interactivity through web browsers. Many new and extremely popular Web applications have been introduced, such as Google Maps, Google Docs, Flickr, and so on. Ajax toolkits such as Dojo allow web developers to build Web 2.0 applications quickly and with little effort.

  11. Robotic Prostatectomy on the Web: A Cross-Sectional Qualitative Assessment.

    PubMed

    Borgmann, Hendrik; Mager, René; Salem, Johannes; Bründl, Johannes; Kunath, Frank; Thomas, Christian; Haferkamp, Axel; Tsaur, Igor

    2016-08-01

    Many patients diagnosed with prostate cancer search for information on robotic prostatectomy (RobP) on the Web. We aimed to evaluate the qualitative characteristics of the most frequently visited Web sites on RobP, with a particular emphasis on provider-dependent issues. Google was searched for the term "robotic prostatectomy" in Europe and North America. The most frequently visited Web sites were selected and classified as physician-provided or publicly provided. Quality was measured using Journal of the American Medical Association (JAMA) benchmark criteria, the DISCERN score, and whether the Trifecta surgical outcomes were addressed. Popularity was analyzed using Google PageRank and the Alexa tool. Accessibility, usability, and reliability were investigated using the LIDA tool, and readability was assessed using readability indices. Twenty-eight Web sites were physician-provided and 15 publicly provided. For all Web sites, 88% of JAMA benchmark criteria were fulfilled, the DISCERN quality score was high, and 81% of Trifecta outcome measurements were addressed. Popularity was average according to Google PageRank (mean 2.9 ± 1.5) and Alexa Traffic Rank (median, 49,109; minimum, 7; maximum, 8,582,295). Accessibility (85 ± 7%), usability (92 ± 3%), and reliability scores (88 ± 8%) were moderate to high. The Automated Readability Index was 7.2 ± 2.1 and the Flesch-Kincaid Grade Level was 9 ± 2, rating the Web sites as difficult to read. Physician-provided Web sites had higher quality scores and lower readability compared with publicly provided Web sites. Web sites providing information on RobP obtained medium to high ratings in all domains of quality in the current assessment. In contrast, readability needs to be significantly improved so that this content can become accessible to the general public. Copyright © 2015 Elsevier Inc. All rights reserved.
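
    The two readability measures cited are simple closed-form formulas over word, sentence, syllable, and character counts. The sketch below implements both; the syllable counter is a rough vowel-group heuristic, so scores will only approximate those from dedicated readability tools.

```python
import re

def counts(text):
    """Naive word/sentence/syllable/character counts for readability scoring."""
    words = re.findall(r"[A-Za-z']+", text)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    # Rough heuristic: count vowel groups as syllables, minimum one per word.
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
    chars = sum(len(w) for w in words)
    return len(words), sentences, syllables, chars

def flesch_kincaid_grade(text):
    w, s, syl, _ = counts(text)
    return 0.39 * (w / s) + 11.8 * (syl / w) - 15.59

def automated_readability_index(text):
    w, s, _, c = counts(text)
    return 4.71 * (c / w) + 0.5 * (w / s) - 21.43

sample = "Robotic prostatectomy is a minimally invasive operation for prostate cancer."
print(flesch_kincaid_grade(sample), automated_readability_index(sample))
```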

  12. Global Precipitation Measurement (GPM) Mission Products and Services at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC)

    NASA Technical Reports Server (NTRS)

    Ostrenga, D.; Liu, Z.; Vollmer, B.; Teng, W.; Kempler, S.

    2014-01-01

    On February 27, 2014, the NASA Global Precipitation Measurement (GPM) mission was launched to provide the next-generation global observations of rain and snow (http://pmm.nasa.gov/GPM). The GPM mission consists of an international network of satellites in which a GPM Core Observatory satellite carries both active and passive microwave instruments to measure precipitation and serve as a reference standard, to unify precipitation measurements from a constellation of other research and operational satellites. The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) hosts and distributes GPM data within the NASA Earth Observation System Data Information System (EOSDIS). The GES DISC is home to the data archive for the GPM predecessor, the Tropical Rainfall Measuring Mission (TRMM). Over the past 16 years, the GES DISC has served the scientific as well as other communities with TRMM data and user-friendly services. During the GPM era, the GES DISC will continue to provide user-friendly data services and customer support to users around the world. GPM products currently or soon to be available include the following: Level-1 GPM Microwave Imager (GMI) and partner radiometer products; Level-2 Goddard Profiling Algorithm (GPROF) GMI and partner products; Level-3 daily and monthly products; and Integrated Multi-satellitE Retrievals for GPM (IMERG) products (early, late, and final). A dedicated Web portal (including user guides, etc.) has been developed for GPM data (http://disc.sci.gsfc.nasa.gov/gpm). Data services that are currently or soon to be available include the Google-like Mirador (http://mirador.gsfc.nasa.gov) for data search and access; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion into various formats (e.g., netCDF, HDF, KML (for Google Earth), ASCII); exploration, visualization, and statistical online analysis through Giovanni (http://giovanni.gsfc.nasa.gov); generation of value-added products; parameter and spatial subsetting; time aggregation; regridding; data version control and provenance; documentation; science support for proper data usage, FAQ, help desk; and monitoring services (e.g. Current Conditions) for applications.

  13. Prospective analysis of the quality of Spanish health information web sites after 3 years.

    PubMed

    Conesa-Fuentes, Maria C; Hernandez-Morante, Juan J

    2016-12-01

    Although the Internet has become an essential source of health information, our study conducted 3 years ago provided evidence of the low quality of Spanish health web sites. The objective of the present study was to evaluate the quality of Spanish health information web sites now, and to compare these results with those obtained 3 years ago. For the original study, the most visited health information web sites were selected through the PageRank® (Google®) system. The present study evaluated the quality of the same web sites from February to May 2013, using the method developed by Bermúdez-Tamayo et al. and HONCode® criteria. The mean quality of the selected web sites was low and has deteriorated since the previous evaluation, especially in regional health services and institutions' web sites. The quality of private web sites remained broadly similar. Compliance with privacy and update criteria also improved in the intervening period. The results indicate that, even in the case of health web sites, design or appearance is more relevant to developers than quality of information. It is recommended that responsible institutions should increase their efforts to eliminate low-quality health information that may further contribute to health problems.

  14. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  15. Exploring an increased role for Australian community pharmacy in mental health professional service delivery: evaluation of the literature.

    PubMed

    Hattingh, H Laetitia; Scahill, Shane; Fowler, Jane L; Wheeler, Amanda J

    2016-12-01

    Australian general practitioners primarily treat mental health problems by prescribing medication dispensed by community pharmacists. Pharmacists therefore have regular interactions with mental health consumers and carers. This narrative review explored the potential role of community pharmacy in mental health services. Medline, CINAHL, ProQuest, Emerald, PsycINFO, Science Direct, PubMed, Web of Knowledge and IPA were utilised. The Cochrane Library as well as grey literature and "lay" search engines such as Google Scholar were also searched. Four systematic reviews and ten community pharmacy randomised controlled trials were identified. Various relevant reviews outlining the impact of community pharmacy based disease state or medicines management services were also identified. International studies involving professional service interventions for mental health consumers could be contextualised for the Australian setting. Australian studies of pharmacy professional services for chronic physical health conditions provided further guidance for the expansion of community pharmacy mental health professional services.

  16. Accredited hand surgery fellowship Web sites: analysis of content and accessibility.

    PubMed

    Trehan, Samir K; Morrell, Nathan T; Akelman, Edward

    2015-04-01

    To assess the accessibility and content of accredited hand surgery fellowship Web sites. A list of all accredited hand surgery fellowships was obtained from the online database of the American Society for Surgery of the Hand (ASSH). Fellowship program information on the ASSH Web site was recorded. All fellowship program Web sites were located via Google search. Fellowship program Web sites were analyzed for accessibility and content in 3 domains: program overview, application information/recruitment, and education. At the time of this study, there were 81 accredited hand surgery fellowships with 169 available positions. Thirty of 81 programs (37%) had a functional link on the ASSH online hand surgery fellowship directory; however, Google search identified 78 Web sites. Three programs did not have a Web site. Analysis of content revealed that most Web sites contained contact information, whereas information regarding the anticipated clinical, research, and educational experiences during fellowship was less often present. Furthermore, information regarding past and present fellows, salary, application process/requirements, call responsibilities, and case volume was frequently lacking. Overall, 52 of 81 programs (64%) had the minimal online information required for residents to independently complete the fellowship application process. Hand fellowship program Web sites could be accessed either via the ASSH online directory or Google search, except for 3 programs that did not have Web sites. Although most fellowship program Web sites contained contact information, other content such as application information/recruitment and education, was less frequently present. This study provides comparative data regarding the clinical and educational experiences outlined on hand fellowship program Web sites that are of relevance to residents, fellows, and academic hand surgeons. This study also draws attention to various ways in which the hand surgery fellowship application process can be made more user-friendly and efficient. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  17. Analysis of Orthopaedic Research Produced During the Wars in Iraq and Afghanistan.

    PubMed

    Balazs, George C; Dickens, Jonathan F; Brelin, Alaina M; Wolfe, Jared A; Rue, John-Paul H; Potter, Benjamin K

    2015-09-01

    Military orthopaedic surgeons have published a substantial amount of original research based on our care of combat-wounded service members and related studies during the wars in Iraq and Afghanistan. However, to our knowledge, the influence of this body of work has not been evaluated bibliometrically, and doing so is important to determine the modern impact of combat casualty research in the wider medical community. We sought to identify the 20 most commonly cited works from military surgeons published during the Iraq and Afghanistan conflicts and analyze them to answer the following questions: (1) What were the subject areas of these 20 articles and what was the 2013 Impact Factor of each journal that published them? (2) How many citations did they receive and what were the characteristics of the journals that cited them? (3) Do the citation analysis results obtained from Google Scholar mirror the results obtained from Thomson Reuters' Web of Science? We searched the Web of Science Citation Index Expanded for relevant original research performed by US military orthopaedic surgeons related to Operation Iraqi Freedom and Operation Enduring Freedom between 2001 and 2014. Articles citing these studies were reviewed using both Web of Science and Google Scholar data. The 20 most cited articles meeting inclusion criteria were identified and analyzed by content domain, frequency of citation, and sources in which they were cited. Nine of these studies examined the epidemiology and outcome of combat injury. Six studies dealt with wound management, wound dehiscence, and formation of heterotopic ossification. Five studies examined infectious complications of combat trauma. The median number of citations garnered by these 20 articles was 41 (range, 28-264) in Web of Science. Other research citing these studies has appeared in 279 different journals, covering 26 different medical and surgical subspecialties, from authors in 31 different countries. Google Scholar contained 97% of the Web of Science citations, but also had 31 duplicate entries and 29 citations with defective links. Modern combat casualty research by military orthopaedic surgeons is widely cited by researchers in a diverse range of subspecialties and geographic locales. This suggests that the military continues to be a source of innovation that is broadly applicable to civilian medical and surgical practice and should encourage expansion of military-civilian collaboration to maximize the utility of the knowledge gained in the treatment of war trauma. Level IV, therapeutic study.

  18. Commercial products that convey personal health information in emergencies.

    PubMed

    Potini, Vishnu C; Weerasuriya, Dilani N; Lowery-North, Douglas W; Kellermann, Arthur L

    2011-12-01

    Describe commercially available products and services designed to convey personal health information in emergencies. The search engine Google®, supplemented by print ads, was used to identify companies and organizations that offer relevant products and services to the general market. Disease-specific, health system, and health plan-specific offerings were excluded. Vendor web sites were the primary sources of information, supplemented by telephone and e-mail queries to sales representatives. Perfect inter-rater agreement was achieved. Thirty-nine unique vendors were identified. Eight sell engraved jewelry. Three offer an embossed card or pamphlet. Twelve supply USB drives with various features. Eleven support password-protected web sites. Five maintain national call centers. Available media differed markedly with respect to capacity and accessibility. Quoted prices ranged from a one-time expenditure of $3.50 to an annual fee of $200. Associated features and annual fees varied widely. A wide range of products and services exist to help patients convey personal health information. Health care providers should be familiar with their features, so they can access the information in a disaster or emergency.

  19. Exploring U.S Cropland - A Web Service based Cropland Data Layer Visualization, Dissemination and Querying System (Invited)

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Han, W.; di, L.

    2010-12-01

    The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, which is a raster-formatted, geo-referenced, U.S. crop-specific land cover classification. These digital data layers are widely used for a variety of applications by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production & transportation planning, environmental health research and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL product is only available by CD/DVD delivery or online bulk file downloading via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (for external users) or in a printed paper map format. There is no online geospatial information access and dissemination, no crop visualization & browsing, no geospatial query capability, nor online analytics. To facilitate the application of this data layer and to help disseminate the data, a web-service-based CDL interactive map visualization, dissemination, and querying system is proposed. It uses a Web-service-based service-oriented architecture; adopts open-standard geospatial information science technology and OGC specifications and standards; and re-uses functions/algorithms from GeoBrain technology (developed at George Mason University). This system provides on-line geospatial crop information access, query, and on-line analytics via interactive maps. It disseminates all data to decision makers and users via real-time retrieval, processing and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup or downloaded for use with other desktop applications. This web-service-based system greatly improves equal accessibility, interoperability, usability, and data visualization, facilitates crop geospatial information usage, and enables US cropland online exploring capability without any client-side software installation. It also greatly reduces the need for paper map and analysis report printing and media usage, and thus enhances low-carbon Agro-geoinformation dissemination for decision support.
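
    Because the proposed system exposes standards-based OGC services, any generic client could pull a CDL map image. A sketch using the third-party OWSLib package follows; the endpoint URL and layer name are placeholders, not the system's published identifiers.

```python
from owslib.wms import WebMapService  # third-party package: pip install OWSLib

# Placeholder endpoint and layer; a real deployment would publish both in its
# WMS capabilities document.
wms = WebMapService("https://cropscape.example.gov/wms", version="1.1.1")
img = wms.getmap(layers=["cdl_2009"],
                 srs="EPSG:4326",
                 bbox=(-98.6, 41.0, -97.6, 42.0),  # a 1-degree box in Nebraska
                 size=(512, 512),
                 format="image/png",
                 transparent=True)
with open("cdl_subset.png", "wb") as f:
    f.write(img.read())
```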

  20. Getting to the top of Google: search engine optimization.

    PubMed

    Maley, Catherine; Baum, Neil

    2010-01-01

    Search engine optimization is the process of making your Web site appear at or near the top of popular search engines such as Google, Yahoo, and MSN. This is achieved not by luck or by knowing someone who works for the search engines, but by understanding how search engines select Web sites for placement at or near the top of the first page. This article will review the process and provide methods and techniques you can use to have your site ranked at or very near the top.

  1. A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.

    2017-12-01

    Cloud-based infrastructure may offer several key benefits of scalability, built-in redundancy, security mechanisms and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), and can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable with in-house solutions. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost effective, but shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API-compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
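
    The HDF Group's h5pyd package is an example of the kind of API-compatible client library described: it mirrors the h5py interface while talking to HSDS over HTTP. A sketch follows; the endpoint, domain path, and dataset name are placeholders.

```python
import h5pyd  # h5py-compatible HSDS client: pip install h5pyd

# Placeholder endpoint/domain; slicing a dataset moves only the requested
# bytes over HTTP, which is what makes object-storage backends practical.
f = h5pyd.File("/shared/earth/ndvi_monthly.h5", "r",
               endpoint="http://hsds.example.org")
dset = f["ndvi"]                        # behaves like an h5py dataset
print(dset.shape, dset.dtype)
tile = dset[0, 1000:1100, 1000:1100]    # server-side subsetting
f.close()
```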

  2. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is one of the most commonly used spatial analyses. Many online map providers, such as Google Maps, Bing Maps and Yahoo Maps, offer geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when using them. In existing geocoding services, the concept of proximity and nearness is not modelled appropriately, and these services search only by address matching based on descriptive data. In addition, there are limitations in how search results are displayed. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes integrating fuzzy techniques with the geocoding process to resolve them. To implement the proposed method, a web-based system was designed. In the proposed method, nearness to places is defined by fuzzy membership functions, and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides several capabilities, such as searching multi-part addresses, searching for places based on their location, non-point representation of results, and displaying search results ranked by priority.
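
    A toy version of the fuzzy-distance-map idea is sketched below with NumPy: each grid cell gets a fuzzy membership in "near place A" and "near place B", and the maps are combined with a fuzzy AND (minimum) overlay. The grid, landmark positions, and membership breakpoints are all invented.

```python
import numpy as np

# Grid of cell-centre coordinates in metres (invented 2 km x 2 km study area).
x, y = np.meshgrid(np.arange(0, 2000, 10.0), np.arange(0, 2000, 10.0))

def fuzzy_near(px, py, full=200.0, zero=1000.0):
    """Membership 1 within `full` metres of (px, py), falling linearly to 0 at `zero`."""
    d = np.hypot(x - px, y - py)
    return np.clip((zero - d) / (zero - full), 0.0, 1.0)

near_school = fuzzy_near(500.0, 600.0)             # invented landmark positions
near_park = fuzzy_near(1400.0, 900.0)
suitability = np.minimum(near_school, near_park)   # fuzzy overlay (AND = minimum)
print(np.unravel_index(suitability.argmax(), suitability.shape))  # best cell index
```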

  3. Global Imagery Browse Services (GIBS) - Rapidly Serving NASA Imagery for Applications and Science Users

    NASA Astrophysics Data System (ADS)

    Schmaltz, J. E.; Ilavajhala, S.; Plesea, L.; Hall, J. R.; Boller, R. A.; Chang, G.; Sadaqathullah, S.; Kim, R.; Murphy, K. J.; Thompson, C. K.

    2012-12-01

    Expedited processing of imagery from NASA satellites for near-real time use by non-science applications users has a long history, especially since the beginning of the Terra and Aqua missions. Several years ago, the Land Atmosphere Near-real-time Capability for EOS (LANCE) was created to greatly expand the range of near-real time data products from a variety of Earth Observing System (EOS) instruments. NASA's Earth Observing System Data and Information System (EOSDIS) began exploring methods to distribute these data as imagery in an intuitive, geo-referenced format, which would be available within three hours of acquisition. Toward this end, EOSDIS has developed the Global Imagery Browse Services (GIBS, http://earthdata.nasa.gov/gibs) to provide highly responsive, scalable, and expandable imagery services. The baseline technology chosen for GIBS was a Tiled Web Mapping Service (TWMS) developed at the Jet Propulsion Laboratory. Using this, global images and mosaics are divided into tiles with fixed bounding boxes for a pyramid of fixed resolutions. Initially, the satellite imagery is created at the existing data systems for each sensor, ensuring the oversight of those most knowledgeable about the science. There, the satellite data is geolocated and converted to an image format such as JPEG, TIFF, or PNG. The GIBS ingest server retrieves imagery from the various data systems and converts them into image tiles, which are stored in a highly-optimized raster format named Meta Raster Format (MRF). The image tiles are then served to users via HTTP by means of an Apache module. Services are available for the entire globe (lat-long projection) and for both polar regions (polar stereographic projection). Requests to the services can be made with the non-standard, but widely known, TWMS format or via the well-known OGC Web Map Tile Service (WMTS) standard format. Standard OGC Web Map Service (WMS) access to the GIBS server is also available. In addition, users may request a KML pyramid. This variety of access methods allows stakeholders to develop visualization/browse clients for a diverse variety of specific audiences. Currently, EOSDIS is providing an OpenLayers web client, Worldview (http://earthdata.nasa.gov/worldview), as an interface to GIBS. A variety of other existing clients can also be developed using such tools as Google Earth, Google Earth browser Plugin, ESRI's Adobe Flash/Flex Client Library, NASA World Wind, Perceptive Pixel Client, Esri's iOS Client Library, and OpenLayers for Mobile. The imagery browse capabilities from GIBS can be combined with other EOSDIS services (i.e. ECHO OpenSearch) via a client that ties them both together to provide an interface that enables data download from the onscreen imagery. Future plans for GIBS include providing imagery based on science quality data from the entire data record of these EOS instruments.
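
    Because GIBS tiles are addressable by a fixed URL scheme, fetching one requires no special client. The sketch below uses the REST-style WMTS pattern from NASA's public GIBS documentation; the layer name, date, matrix set, and tile indices are examples, and the URL template should be verified against the current documentation.

```python
import requests

# REST-style WMTS tile request (URL template per public GIBS docs; verify
# before relying on it). Layer, date, matrix set, and indices are examples.
URL = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
       "{layer}/default/{date}/{matrix_set}/{z}/{row}/{col}.jpg")

tile = requests.get(URL.format(
    layer="MODIS_Terra_CorrectedReflectance_TrueColor",
    date="2012-07-09", matrix_set="250m", z=3, row=2, col=5))
tile.raise_for_status()
with open("gibs_tile.jpg", "wb") as f:
    f.write(tile.content)
```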

  4. How good is Google? The quality of otolaryngology information on the internet.

    PubMed

    Pusz, Max D; Brietzke, Scott E

    2012-09-01

    To assess the quality of the information a patient (parent) may encounter using a Google search for typical otolaryngology ailments. Cross-sectional study. Tertiary care center. A Google keyword search was performed for 10 common otolaryngology problems including ear infection, hearing loss, tonsillitis, and so on. The top 10 search results for each were critically examined using the 16-item (1-5 scale) standardized DISCERN instrument. The DISCERN instrument was developed to assess the quality and comprehensiveness of patient treatment choice literature. A total of 100 Web sites were assessed. Of these, 19 (19%) were primarily advertisements for products and were excluded from DISCERN scoring. Searches for more typically chronic otolaryngic problems (eg, tinnitus, sleep apnea, etc) resulted in more biased, advertisement-type results than those for typically acute problems (eg, ear infection, sinus infection, P = .03). The search for "sleep apnea treatment" produced the highest scoring results (mean overall DISCERN score = 3.49, range = 1.81-4.56), and the search for "hoarseness treatment" produced the lowest scores (mean = 2.49, range = 1.56-3.56). Results from major comprehensive Web sites (WebMD, EMedicinehealth.com, Wikipedia, etc.) scored higher than other Web sites (mean DISCERN score = 3.46 vs 2.48, P < .001). There is marked variability in the quality of Web site information for the treatment of common otolaryngologic problems. Searches on more chronic problems resulted in a higher proportion of biased advertisement Web sites. Larger, comprehensive Web sites generally provided better information but were less than perfect in presenting complete information on treatment options.

  5. Google Earth as a (Not Just) Geography Education Tool

    ERIC Educational Resources Information Center

    Patterson, Todd C.

    2007-01-01

    The implementation of Geographic Information Science (GIScience) applications and discussion of GIScience-related themes are useful for teaching fundamental geographic and technological concepts. As one of the newest geographic information tools available on the World Wide Web, Google Earth has considerable potential to enhance methods for…

  6. Operational Use of OGC Web Services at the Met Office

    NASA Astrophysics Data System (ADS)

    Wright, Bruce

    2010-05-01

    The Met Office has adopted the Service-Orientated Architecture paradigm to deliver services to a range of customers through Rich Internet Applications (RIAs). The approach uses standard Open Geospatial Consortium (OGC) web services to provide information to web-based applications through a range of generic data services. "Invent", the Met Office beta site, is used to showcase Met Office future plans for presenting web-based weather forecasts, products and information to the public. This currently hosts a freely accessible Weather Map Viewer, written in JavaScript, which accesses a Web Map Service (WMS) to deliver innovative web-based visualizations of weather and its potential impacts to the public. The intention is to engage the public in the development of new web-based services that more accurately meet their needs. As the service is intended for public use within the UK, it has been designed to support a user base of 5 million, the analysed level of UK web traffic reaching the Met Office's public weather information site. The required scalability has been realised through the use of multi-tier tile caching: (1) WMS requests are made for 256x256 tiles for fixed areas and zoom levels; (2) a Tile Cache, developed in house, efficiently serves tiles on demand, managing WMS requests for new tiles; (3) Edge Servers, externally hosted by Akamai, provide a highly scalable (UK-centric) service for pre-cached tiles, passing new requests to the Tile Cache; (4) the Invent Weather Map Viewer uses the Google Maps API to request tiles from the Edge Servers. (We would expect to make use of the Web Map Tiling Service, when it becomes an OGC standard.) The Met Office delivers specialist commercial products to market sectors such as transport, utilities and defence, which exploit a Web Feature Service (WFS) for data relating forecasts and observations to specific geographic features, and a Web Coverage Service (WCS) for sub-selections of gridded data. These are locally rendered as maps or graphs, and combined with the WMS pre-rendered images and text, in a FLEX application, to provide a sophisticated, user-impact-based view of the weather. The OGC web services supporting these applications have been developed in collaboration with commercial companies. Visual Weather was originally a desktop application for forecasters, but IBL have developed it to expose the full range of forecast and observation data through standard web services (WCS and WMS). Forecasts and observations relating to specific locations and geographic features are held in an Oracle Database, and exposed as a WFS using Snowflake Software's GO-Publisher application. The Met Office has worked closely with both IBL and Snowflake Software to ensure that the web services provided strike a balance between conformance to the standards and performance in an operational environment. This has proved challenging in areas where the standards are rapidly evolving (e.g. WCS) or do not allow adequate description of the Met-Ocean domain (e.g. multiple time coordinates and parametric vertical coordinates). It has also become clear that careful selection of the features to expose, based on the way in which you expect users to query those features, is necessary in order to deliver adequate performance. These experiences are providing useful 'real-world' input into the recently launched OGC MetOcean Domain Working Group and World Meteorological Organisation (WMO) initiatives in this area.
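
    The entry does not publish the viewer's exact tiling arithmetic, but the standard Web Mercator scheme used by the Google Maps API it builds on maps a point and zoom level to a 256x256 tile index as follows (a sketch, assuming that standard scheme):

```python
import math

def tile_indices(lat, lon, zoom):
    """Standard Web Mercator (slippy-map) tile index for a lat/lon at a zoom level."""
    n = 2 ** zoom                                   # tiles per axis at this zoom
    xtile = int((lon + 180.0) / 360.0 * n)
    lat_r = math.radians(lat)
    ytile = int((1.0 - math.log(math.tan(lat_r) + 1.0 / math.cos(lat_r)) / math.pi) / 2.0 * n)
    return xtile, ytile

print(tile_indices(51.5072, -0.1276, 7))            # tile containing London at zoom 7
```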

  7. Web-based surveillance of public information needs for informing preconception interventions.

    PubMed

    D'Ambrosio, Angelo; Agricola, Eleonora; Russo, Luisa; Gesualdo, Francesco; Pandolfi, Elisabetta; Bortolus, Renata; Castellani, Carlo; Lalatta, Faustina; Mastroiacovo, Pierpaolo; Tozzi, Alberto Eugenio

    2015-01-01

    The risk of adverse pregnancy outcomes can be minimized through the adoption of healthy lifestyles before pregnancy by women of childbearing age. Initiatives for promotion of preconception health may be difficult to implement. The Internet can be used to build tailored health interventions through identification of the public's information needs. To this aim, we developed a semi-automatic web-based system for monitoring Google searches, web pages and activity on social networks, regarding preconception health. Based on the American College of Obstetricians and Gynecologists guidelines and on the actual search behaviors of Italian Internet users, we defined a set of keywords targeting preconception care topics. Using these keywords, we analyzed the usage of the Google search engine and identified web pages containing preconception care recommendations. We also monitored how the selected web pages were shared on social networks. We analyzed discrepancies between searched and published information and the sharing pattern of the topics. We identified 1,807 Google search queries which generated a total of 1,995,030 searches during the study period. Less than 10% of the reviewed pages contained preconception care information, and in 42.8% the information was consistent with ACOG guidelines. Facebook was the most used social network for sharing. Nutrition, Chronic Diseases and Infectious Diseases were the most published and searched topics. Regarding Genetic Risk and Folic Acid, a high search volume was not associated with high web page production, while Medication pages were more frequently published than searched. Vaccinations elicited high sharing although web page production was low; this effect was quite variable over time. Our study represents a resource to prioritize communication on specific topics on the web, to address misconceptions, and to tailor interventions to specific populations.

  8. Web-Based Surveillance of Public Information Needs for Informing Preconception Interventions

    PubMed Central

    D’Ambrosio, Angelo; Agricola, Eleonora; Russo, Luisa; Gesualdo, Francesco; Pandolfi, Elisabetta; Bortolus, Renata; Castellani, Carlo; Lalatta, Faustina; Mastroiacovo, Pierpaolo; Tozzi, Alberto Eugenio

    2015-01-01

    Background The risk of adverse pregnancy outcomes can be minimized through the adoption of healthy lifestyles before pregnancy by women of childbearing age. Initiatives for promotion of preconception health may be difficult to implement. The Internet can be used to build tailored health interventions through identification of the public's information needs. To this aim, we developed a semi-automatic web-based system for monitoring Google searches, web pages and activity on social networks, regarding preconception health. Methods Based on the American College of Obstetricians and Gynecologists guidelines and on the actual search behaviors of Italian Internet users, we defined a set of keywords targeting preconception care topics. Using these keywords, we analyzed the usage of the Google search engine and identified web pages containing preconception care recommendations. We also monitored how the selected web pages were shared on social networks. We analyzed discrepancies between searched and published information and the sharing pattern of the topics. Results We identified 1,807 Google search queries which generated a total of 1,995,030 searches during the study period. Less than 10% of the reviewed pages contained preconception care information, and in 42.8% the information was consistent with ACOG guidelines. Facebook was the most used social network for sharing. Nutrition, Chronic Diseases and Infectious Diseases were the most published and searched topics. Regarding Genetic Risk and Folic Acid, a high search volume was not associated with high web page production, while Medication pages were more frequently published than searched. Vaccinations elicited high sharing although web page production was low; this effect was quite variable over time. Conclusion Our study represents a resource to prioritize communication on specific topics on the web, to address misconceptions, and to tailor interventions to specific populations. PMID:25879682

  9. Integrating Socioeconomic and Earth Science Data Using Geobrowsers and Web Services: A Demonstration

    NASA Astrophysics Data System (ADS)

    Schumacher, J. A.; Yetman, G. G.

    2007-12-01

    The societal benefit areas identified as the focus for the Global Earth Observing System of Systems (GEOSS) 10-year implementation plan are an indicator of the importance of integrating socioeconomic data with earth science data to support decision makers. To aid this integration, CIESIN is delivering its global and U.S. demographic data to commercial and open source Geobrowsers and providing open standards based services for data access. Currently, data on population distribution, poverty, and detailed census data for the U.S. are available for visualization and access in Google Earth, NASA World Wind, and a browser-based 2-dimensional mapping client. The mapping client allows for the creation of web map documents that pull together layers from distributed servers and can be saved and shared. Visualization tools with Geobrowsers, user-driven map creation and sharing via browser-based clients, and a prototype for characterizing populations at risk to predicted precipitation deficits will be demonstrated.

  10. Novel data sources for women's health research: mapping breast screening online information seeking through Google trends.

    PubMed

    Fazeli Dehkordy, Soudabeh; Carlos, Ruth C; Hall, Kelli S; Dalton, Vanessa K

    2014-09-01

    Millions of people use online search engines every day to find health-related information, and voluntarily share their personal health status and behaviors on various Web sites. Thus, data from tracking online information seekers' behavior offer potential opportunities for use in public health surveillance and research. Google Trends is a feature of Google which allows Internet users to graph the frequency of searches for a single term or phrase over time or by geographic region. We used Google Trends to describe patterns of information-seeking behavior on the subject of dense breasts and to examine their correlation with the passage or introduction of dense breast notification legislation. To capture the temporal variations of information seeking about dense breasts, the Web search query "dense breast" was entered in the Google Trends tool. We then mapped the dates of legislative actions regarding dense breasts that received widespread coverage in the lay media to information-seeking trends about dense breasts over time. Newsworthy events and legislative actions appear to correlate well with peaks in search volume for "dense breast". Geographic regions with the highest search volumes have passed, denied, or are currently considering dense breast legislation. Our study demonstrated that legislative actions and the respective news coverage correlate with increases in information seeking for "dense breast" on Google, suggesting that Google Trends has the potential to serve as a data source for policy-relevant research. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
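
    Queries like the study's can be scripted with the third-party pytrends package, an unofficial wrapper around Google Trends rather than the web tool the authors used; the keyword and time frame below mirror the study's topic but are otherwise illustrative.

```python
from pytrends.request import TrendReq  # unofficial client: pip install pytrends

# Relative search interest in "dense breast" over time and by US state.
pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(["dense breast"],
                       timeframe="2012-01-01 2014-06-30", geo="US")

over_time = pytrends.interest_over_time()           # weekly relative volume, 0-100
by_state = pytrends.interest_by_region(resolution="REGION")
print(over_time.tail())
print(by_state.sort_values("dense breast", ascending=False).head())
```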

  11. Sally Ride EarthKAM - Automated Image Geo-Referencing Using Google Earth Web Plug-In

    NASA Technical Reports Server (NTRS)

    Andres, Paul M.; Lazar, Dennis K.; Thames, Robert Q.

    2013-01-01

    Sally Ride EarthKAM is an educational program funded by NASA that aims to provide the public the ability to picture Earth from the perspective of the International Space Station (ISS). A computer-controlled camera is mounted on the ISS in a nadir-pointing window; however, timing limitations in the system cause inaccurate positional metadata. Manually correcting images within an orbit allows the positional metadata to be improved using mathematical regressions. The manual correction process is time-consuming and thus, unfeasible for a large number of images. The standard Google Earth program allows for the importing of KML (keyhole markup language) files that previously were created. These KML file-based overlays could then be manually manipulated as image overlays, saved, and then uploaded to the project server where they are parsed and the metadata in the database is updated. The new interface eliminates the need to save, download, open, re-save, and upload the KML files. Everything is processed on the Web, and all manipulations go directly into the database. Administrators also have the control to discard any single correction that was made and validate a correction. This program streamlines a process that previously required several critical steps and was probably too complex for the average user to complete successfully. The new process is theoretically simple enough for members of the public to make use of and contribute to the success of the Sally Ride EarthKAM project. Using the Google Earth Web plug-in, EarthKAM images, and associated metadata, this software allows users to interactively manipulate an EarthKAM image overlay, and update and improve the associated metadata. The Web interface uses the Google Earth JavaScript API along with PHP-PostgreSQL to present the user the same interface capabilities without leaving the Web. The simpler graphical user interface will allow the public to participate directly and meaningfully with EarthKAM. The use of similar techniques is being investigated to place ground-based observations in a Google Mars environment, allowing the MSL (Mars Science Laboratory) Science Team a means to visualize the rover and its environment.
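
    The overlays the tool manipulates are ordinary KML ground overlays: an image draped over a latitude/longitude box whose corners and rotation can be nudged to correct the positional metadata. A sketch using the third-party simplekml package follows; the image URL and coordinates are invented.

```python
import simplekml  # third-party package: pip install simplekml

# One image draped over a lat/lon box; correcting the metadata amounts to
# adjusting these box edges and the rotation. All values are illustrative.
kml = simplekml.Kml()
ov = kml.newgroundoverlay(name="EarthKAM frame (example)")
ov.icon.href = "http://example.org/earthkam/frame_0001.jpg"
ov.latlonbox.north, ov.latlonbox.south = 35.80, 34.95
ov.latlonbox.west, ov.latlonbox.east = -118.30, -117.10
ov.latlonbox.rotation = -8.0    # degrees; tweaked during manual correction
kml.save("earthkam_overlay.kml")
```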

  12. Detecting Runtime Anomalies in AJAX Applications through Trace Analysis

    DTIC Science & Technology

    2011-08-10

    …statements by adding the instrumentation to the GWT UI classes, leaving the user code untouched. Some content management frameworks, such as Drupal [12], … References: “Google Web Toolkit,” http://code.google.com/webtoolkit/; [12] “Form generation – Drupal API,” http://api.drupal.org/api/group/form_api/6.

  13. Scan This Book!

    ERIC Educational Resources Information Center

    Albanese, Andrew Richard

    2007-01-01

    In this article, the author presents an interview with Brewster Kahle, leader of the Open Content Alliance (OCA). OCA book scan program is an alternative to Google's library project that aims to make books accessible online. In this interview, Kahle discusses his views on the challenges of getting books on the Web, on Google's library…

  14. (Meta)Search like Google

    ERIC Educational Resources Information Center

    Rochkind, Jonathan

    2007-01-01

    The ability to search and receive results in more than one database through a single interface--or metasearch--is something many users want. Google Scholar--the search engine of specifically scholarly content--and library metasearch products like Ex Libris's MetaLib, Serials Solution's Central Search, WebFeat, and products based on MuseGlobal used…

  15. Webulous and the Webulous Google Add-On--a web service and application for ontology building from templates.

    PubMed

    Jupp, Simon; Burdett, Tony; Welter, Danielle; Sarntivijai, Sirarat; Parkinson, Helen; Malone, James

    2016-01-01

    Authoring bio-ontologies is a task that has traditionally been undertaken by skilled experts trained in understanding complex languages such as the Web Ontology Language (OWL), in tools designed for such experts. As requests for new terms are made, the need for expert ontologists represents a bottleneck in the development process. Furthermore, the ability to rigorously enforce ontology design patterns in large, collaboratively developed ontologies is difficult with existing ontology authoring software. We present Webulous, an application suite for supporting ontology creation by design patterns. Webulous provides infrastructure to specify templates for populating ontology design patterns that get transformed into OWL assertions in a target ontology. Webulous provides programmatic access to the template server and a client application has been developed for Google Sheets that allows templates to be loaded, populated and resubmitted to the Webulous server for processing. The development and delivery of ontologies to the community requires software support that goes beyond the ontology editor. Building ontologies by design patterns and providing simple mechanisms for the addition of new content helps reduce the overall cost and effort required to develop an ontology. The Webulous system provides support for this process and is used as part of the development of several ontologies at the European Bioinformatics Institute.
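
    A sketch of what programmatic template submission could look like. The endpoint, payload shape, and field names below are placeholders invented for illustration, not the published Webulous API, which ships with its own documentation.

        import json
        import urllib.request

        # All names below are illustrative stand-ins, not the real Webulous REST API.
        SERVER = "https://webulous.example.org/api/templates"

        def submit_rows(template_id, rows):
            """POST populated template rows; the server turns them into OWL assertions."""
            body = json.dumps({"template": template_id, "rows": rows}).encode("utf-8")
            req = urllib.request.Request(
                SERVER, data=body, headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)

        # Each row populates one design-pattern instance, e.g. a new cell-type term.
        result = submit_rows("cell-type-pattern",
                             [{"label": "example cell", "parent": "CL:0000000"}])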

  16. Beyond Description: Converting Web Site Usage Statistics into Concrete Site Improvement Ideas

    ERIC Educational Resources Information Center

    Arendt, Julie; Wagner, Cassie

    2010-01-01

    Web site usage statistics are a widely used tool for Web site development, but libraries are still learning how to use them successfully. This case study summarizes how Morris Library at Southern Illinois University Carbondale implemented Google Analytics on its Web site and used the reports to inform a site redesign. As the main campus library at…

  17. In-field Access to Geoscientific Metadata through GPS-enabled Mobile Phones

    NASA Astrophysics Data System (ADS)

    Hobona, Gobe; Jackson, Mike; Jordan, Colm; Butchart, Ben

    2010-05-01

    Fieldwork is an integral part of much geosciences research. But while geoscientists have physical or online access to data collections in the laboratory or at base stations, equivalent in-field access is not standard or straightforward. The increasing availability of mobile internet and GPS-supported mobile phones, however, now provides the basis for addressing this issue. The SPACER project was commissioned by the Rapid Innovation initiative of the UK Joint Information Systems Committee (JISC) to explore the potential for GPS-enabled mobile phones to access geoscientific metadata collections. Metadata collections within the geosciences and the wider geospatial domain can be disseminated through web services based on the Catalogue Service for the Web (CSW) standard of the Open Geospatial Consortium (OGC) - a global grouping of over 380 private, public and academic organisations aiming to improve interoperability between geospatial technologies. CSW offers an XML-over-HTTP interface for querying and retrieval of geospatial metadata. By default, the metadata returned by CSW is based on the ISO 19115 standard and encoded in XML conformant to ISO 19139. The SPACER project has created a prototype application that enables mobile phones to send CSW queries containing user-defined keywords and coordinates acquired from the GPS devices built into the phones. The prototype has been developed using the free and open-source Google Android platform. The mobile application offers views for listing titles, presenting multiple metadata elements and a Google Map with an overlay of the bounding coordinates of datasets. The presentation will describe the architecture and approach applied in the development of the prototype.
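
    A CSW 2.0.2 GetRecords query of the kind the prototype sends can be posted as plain XML over HTTP; the catalogue endpoint below is a placeholder, and a GPS-derived <ogc:BBOX> constraint would sit alongside the keyword filter.

        import requests

        # GetRecords request per CSW 2.0.2; keyword search over the AnyText queryable.
        GETRECORDS = """<?xml version="1.0"?>
        <csw:GetRecords xmlns:csw="http://www.opengis.net/cat/csw/2.0.2"
            xmlns:ogc="http://www.opengis.net/ogc"
            service="CSW" version="2.0.2" resultType="results">
          <csw:Query typeNames="csw:Record">
            <csw:ElementSetName>summary</csw:ElementSetName>
            <csw:Constraint version="1.1.0">
              <ogc:Filter>
                <ogc:PropertyIsLike wildCard="*" singleChar="?" escapeChar="\\">
                  <ogc:PropertyName>AnyText</ogc:PropertyName>
                  <ogc:Literal>*geology*</ogc:Literal>
                </ogc:PropertyIsLike>
              </ogc:Filter>
            </csw:Constraint>
          </csw:Query>
        </csw:GetRecords>"""

        resp = requests.post("http://catalogue.example.org/csw", data=GETRECORDS,
                             headers={"Content-Type": "application/xml"})
        print(resp.text)  # ISO 19139-encoded metadata records as XML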

  18. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    NASA Astrophysics Data System (ADS)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web-application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back-end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine grained" data access is performed by back-end services that may utilize JDBC for data base access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google Earth layers using KML; generation of maps via WMS or ArcIMS protocols; and data manipulation with Unix utilities.
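
    The abstract does not give the LAS request grammar, so the sketch below only illustrates the general pattern it describes: an XML request document URL-encoded into an HTTP GET against the product server. The element names, dataset path, and endpoint are invented for illustration.

        from urllib.parse import urlencode

        # Illustrative request document; the real LAS XML protocol differs.
        xml_req = ('<lasRequest>'
                   '<link match="/lasdata/operations/shade"/>'
                   '<properties><ferret><view>xy</view></ferret></properties>'
                   '<args><link match="/lasdata/datasets/sst/variables/temp"/>'
                   '<region><range type="t" low="2000-01-01" high="2000-01-01"/></region>'
                   '</args></lasRequest>')

        # The whole XML document rides inside a single GET parameter.
        url = "http://las.example.org/ProductServer.do?" + urlencode({"xml": xml_req})
        print(url)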

  19. Head Lice Surveillance on a Deregulated OTC-Sales Market: A Study Using Web Query Data

    PubMed Central

    Lindh, Johan; Magnusson, Måns; Grünewald, Maria; Hulth, Anette

    2012-01-01

    The head louse, Pediculus humanus capitis, is an obligate ectoparasite that causes infestations of humans. Studies have demonstrated a correlation between sales figures for over-the-counter (OTC) treatment products and the number of humans with head lice. The deregulation of the Swedish pharmacy market on July 1, 2009, decreased the possibility of obtaining complete sales figures, and thereby of deriving yearly trends of head lice infestations. In the present study we investigated whether web queries on head lice can be used as a substitute for OTC sales figures. Via Google Insights for Search and the Vårdguiden medical web site, the numbers of queries on “huvudlöss” (head lice) and “hårlöss” (lice in hair) were obtained. The analysis showed that both the Vårdguiden series and the Google series were statistically significant (p<0.001) when added separately, but if the Google series was already included in the model, the Vårdguiden series was not statistically significant (p = 0.5689). In conclusion, web queries can detect whether there is an increase or decrease of head-lice-infested humans in Sweden over a period of years, and can be as reliable a proxy as the OTC sales figures. PMID:23144923

  20. Head lice surveillance on a deregulated OTC-sales market: a study using web query data.

    PubMed

    Lindh, Johan; Magnusson, Måns; Grünewald, Maria; Hulth, Anette

    2012-01-01

    The head louse, Pediculus humanus capitis, is an obligate ectoparasite that causes infestations of humans. Studies have demonstrated a correlation between sales figures for over-the-counter (OTC) treatment products and the number of humans with head lice. The deregulation of the Swedish pharmacy market on July 1, 2009, decreased the possibility of obtaining complete sales figures, and thereby of deriving yearly trends of head lice infestations. In the present study we investigated whether web queries on head lice can be used as a substitute for OTC sales figures. Via Google Insights for Search and the Vårdguiden medical web site, the numbers of queries on "huvudlöss" (head lice) and "hårlöss" (lice in hair) were obtained. The analysis showed that both the Vårdguiden series and the Google series were statistically significant (p<0.001) when added separately, but if the Google series was already included in the model, the Vårdguiden series was not statistically significant (p = 0.5689). In conclusion, web queries can detect whether there is an increase or decrease of head-lice-infested humans in Sweden over a period of years, and can be as reliable a proxy as the OTC sales figures.
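
    The reported comparison (each series significant alone; Vårdguiden adding nothing once the Google series is in the model) is a standard nested-regression F-test. A minimal sketch with synthetic data, assuming statsmodels; the variable names and values are made up, not the study's.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        np.random.seed(0)
        # Synthetic stand-ins: OTC sales as response, two query series as predictors.
        df = pd.DataFrame({
            "otc_sales": np.random.poisson(100, 52).astype(float),
            "google": np.random.normal(50, 10, 52),
            "vardguiden": np.random.normal(30, 8, 52),
        })

        full = sm.OLS(df["otc_sales"],
                      sm.add_constant(df[["google", "vardguiden"]])).fit()
        restricted = sm.OLS(df["otc_sales"],
                            sm.add_constant(df[["google"]])).fit()

        # Does the Vårdguiden series improve on the Google-only model?
        f_stat, p_value, df_diff = full.compare_f_test(restricted)
        print(f_stat, p_value, df_diff)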

  1. An overview of new video coding tools under consideration for VP10: the successor to VP9

    NASA Astrophysics Data System (ADS)

    Mukherjee, Debargha; Su, Hui; Bankoski, James; Converse, Alex; Han, Jingning; Liu, Zoe; Xu, Yaowu

    2015-09-01

    Google started an open-source project, entitled the WebM Project, in 2010 to develop royalty-free video codecs for the web. The present-generation codec developed in the WebM project, called VP9, was finalized in mid-2013 and is currently being served extensively by YouTube, resulting in billions of views per day. Even though adoption of VP9 outside Google is still in its infancy, the WebM project has already embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational bitrate reduction over the current-generation codec VP9. Although the project is still in its early stages, a set of new experimental coding tools has already been added to baseline VP9 to achieve modest coding gains over a large enough test set. This paper provides a technical overview of these coding tools.

  2. The Number of Scholarly Documents on the Public Web

    PubMed Central

    Khabsa, Madian; Giles, C. Lee

    2014-01-01

    The number of scholarly documents available on the web is estimated using capture/recapture methods by studying the coverage of two major academic search engines: Google Scholar and Microsoft Academic Search. Our estimates show that at least 114 million English-language scholarly documents are accessible on the web, of which Google Scholar has nearly 100 million. Of these, we estimate that at least 27 million (24%) are freely available since they do not require a subscription or payment of any kind. In addition, at a finer scale, we also estimate the number of scholarly documents on the web for fifteen fields: Agricultural Science, Arts and Humanities, Biology, Chemistry, Computer Science, Economics and Business, Engineering, Environmental Sciences, Geosciences, Material Science, Mathematics, Medicine, Physics, Social Sciences, and Multidisciplinary, as defined by Microsoft Academic Search. In addition, we show that among these fields the percentage of documents defined as freely available varies significantly, i.e., from 12 to 50%. PMID:24817403

  3. The number of scholarly documents on the public web.

    PubMed

    Khabsa, Madian; Giles, C Lee

    2014-01-01

    The number of scholarly documents available on the web is estimated using capture/recapture methods by studying the coverage of two major academic search engines: Google Scholar and Microsoft Academic Search. Our estimates show that at least 114 million English-language scholarly documents are accessible on the web, of which Google Scholar has nearly 100 million. Of these, we estimate that at least 27 million (24%) are freely available since they do not require a subscription or payment of any kind. In addition, at a finer scale, we also estimate the number of scholarly documents on the web for fifteen fields: Agricultural Science, Arts and Humanities, Biology, Chemistry, Computer Science, Economics and Business, Engineering, Environmental Sciences, Geosciences, Material Science, Mathematics, Medicine, Physics, Social Sciences, and Multidisciplinary, as defined by Microsoft Academic Search. In addition, we show that among these fields the percentage of documents defined as freely available varies significantly, i.e., from 12 to 50%.
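
    Capture/recapture estimation from two overlapping samples is classically done with the Lincoln-Petersen estimator, N ≈ n1*n2/m, where n1 and n2 are the two sample sizes and m is their overlap. A sketch with placeholder counts, not the paper's actual measurements:

        def lincoln_petersen(n1, n2, m):
            """Estimate total population size from two sample sizes and their overlap."""
            return n1 * n2 / m

        # Placeholder sample sizes and overlap for two search engines' coverage.
        estimate = lincoln_petersen(n1=100e6, n2=49e6, m=43e6)
        print(f"estimated scholarly documents: {estimate / 1e6:.0f} million")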

  4. HCLS 2.0/3.0: health care and life sciences data mashup using Web 2.0/3.0.

    PubMed

    Cheung, Kei-Hoi; Yip, Kevin Y; Townsend, Jeffrey P; Scotch, Matthew

    2008-10-01

    We describe the potential of current Web 2.0 technologies to achieve data mashup in the health care and life sciences (HCLS) domains, and compare that potential to the nascent trend of performing semantic mashup. After providing an overview of Web 2.0, we demonstrate two scenarios of data mashup, facilitated by the following Web 2.0 tools and sites: Yahoo! Pipes, Dapper, Google Maps and GeoCommons. In the first scenario, we exploited Dapper and Yahoo! Pipes to implement a challenging data integration task in the context of DNA microarray research. In the second scenario, we exploited Yahoo! Pipes, Google Maps, and GeoCommons to create a geographic information system (GIS) interface that allows visualization and integration of diverse categories of public health data, including cancer incidence and pollution prevalence data. Based on these two scenarios, we discuss the strengths and weaknesses of these Web 2.0 mashup technologies. We then describe Semantic Web, the mainstream Web 3.0 technology that enables more powerful data integration over the Web. We discuss the areas of intersection of Web 2.0 and Semantic Web, and describe the potential benefits that can be brought to HCLS research by combining these two sets of technologies.

  5. HCLS 2.0/3.0: Health Care and Life Sciences Data Mashup Using Web 2.0/3.0

    PubMed Central

    Cheung, Kei-Hoi; Yip, Kevin Y.; Townsend, Jeffrey P.; Scotch, Matthew

    2010-01-01

    We describe the potential of current Web 2.0 technologies to achieve data mashup in the health care and life sciences (HCLS) domains, and compare that potential to the nascent trend of performing semantic mashup. After providing an overview of Web 2.0, we demonstrate two scenarios of data mashup, facilitated by the following Web 2.0 tools and sites: Yahoo! Pipes, Dapper, Google Maps and GeoCommons. In the first scenario, we exploited Dapper and Yahoo! Pipes to implement a challenging data integration task in the context of DNA microarray research. In the second scenario, we exploited Yahoo! Pipes, Google Maps, and GeoCommons to create a geographic information system (GIS) interface that allows visualization and integration of diverse categories of public health data, including cancer incidence and pollution prevalence data. Based on these two scenarios, we discuss the strengths and weaknesses of these Web 2.0 mashup technologies. We then describe Semantic Web, the mainstream Web 3.0 technology that enables more powerful data integration over the Web. We discuss the areas of intersection of Web 2.0 and Semantic Web, and describe the potential benefits that can be brought to HCLS research by combining these two sets of technologies. PMID:18487092

  6. rasdaman Array Database: current status

    NASA Astrophysics Data System (ADS)

    Merticariu, George; Toader, Alexandru

    2015-04-01

    rasdaman (Raster Data Manager) is a free open-source array database management system which provides functionality for storing and processing massive amounts of raster data in the form of multidimensional arrays. The user can access, process and delete the data using SQL. The key features of rasdaman are: flexibility (datasets of any dimensionality can be processed with the help of SQL queries), scalability (rasdaman's distributed architecture enables it to seamlessly run on cloud infrastructures while offering an increase in performance with the increase of computation resources), performance (real-time access, processing, mixing and filtering of arrays of any dimensionality) and reliability (the legacy communication protocol was replaced with a new one based on cutting-edge technology: Google Protocol Buffers and ZeroMQ). The data handled by the system include 1D time series, 2D remote sensing imagery, 3D image time series, 3D geophysical data, and 4D atmospheric and climate data. Most of these representations cannot be stored only in the form of raw arrays, as the location information of the contents is also important for a correct geoposition on Earth; this is defined by ISO 19123 as coverage data. rasdaman provides coverage data support through the Petascope service. Extensions were added on top of rasdaman in order to provide support for the geoscience community. The following OGC standards are currently supported: Web Map Service (WMS), Web Coverage Service (WCS), and Web Coverage Processing Service (WCPS). The Web Map Service is an extension which provides zoom and pan navigation over images provided by a map server. Starting with version 9.1, rasdaman supports WMS version 1.3. The Web Coverage Service provides capabilities for downloading multi-dimensional coverage data. Support is also provided for several extensions of this service: the Subsetting Extension, the Scaling Extension, and, starting with version 9.1, the Transaction Extension, which defines request types for inserting, updating and deleting coverages. A web client, designed for both novice and experienced users, is also available for the service and its extensions. The client offers an intuitive interface that allows users to work with multi-dimensional coverages by abstracting away the specifics of the standard definitions of the requests. The Web Coverage Processing Service defines a language for on-the-fly processing and filtering of multi-dimensional raster coverages. rasdaman exposes this service through the WCS Processing Extension. Demonstrations are provided online via the Earthlook website (earthlook.org), which presents use cases from a wide variety of application domains, using the rasdaman system as the processing engine.
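
    As an illustration of the WCPS language mentioned above, the snippet below posts a WCPS expression to a Petascope endpoint via the WCS Processing Extension; the endpoint URL and coverage name are placeholders.

        import requests

        # Placeholder endpoint and coverage; WCPS syntax per the OGC WCPS standard.
        ENDPOINT = "http://rasdaman.example.org/rasdaman/ows"
        query = ('for c in (AvgLandTemp) '
                 'return encode(c[Lat(53.08), Long(8.80), '
                 'ansi("2014-01":"2014-12")], "csv")')

        resp = requests.post(ENDPOINT, data={"service": "WCS", "version": "2.0.1",
                                             "request": "ProcessCoverages",
                                             "query": query})
        print(resp.text)  # one CSV value per month at the given point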

  7. Biographer: web-based editing and rendering of SBGN compliant biochemical networks.

    PubMed

    Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas

    2013-06-01

    The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows and Linux based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL.

  8. Usability evaluation of cloud-based mapping tools for the display of very large datasets

    NASA Astrophysics Data System (ADS)

    Stotz, Nicole Marie

    The elasticity and on-demand nature of cloud services have made it easier to create web maps. Users only need access to a web browser and the Internet to use cloud-based web maps, eliminating the need for specialized software. To encourage a wide variety of users, a map must be well designed; usability is a very important concept in designing a web map. Fusion Tables, a new product from Google, is one example of newer cloud-based distributed GIS services. It allows for easy spatial data manipulation and visualization within the Google Maps framework. ESRI has also introduced a cloud-based version of its software, called ArcGIS Online, built on Amazon's EC2 cloud. Utilizing a user-centered design framework, two prototype maps were created with data from the San Diego East County Economic Development Council: one built on Fusion Tables, and another on ESRI's ArcGIS Online. A usability analysis was conducted and used to compare both map prototypes in terms of design and functionality. Load tests were also run, and performance metrics were gathered on both map prototypes. The usability analysis was taken by 25 geography students and consisted of time-based tasks and questions on map design and functionality. Survey participants completed the time-based tasks for the Fusion Tables map prototype quicker than those of the ArcGIS Online map prototype. While responses were generally positive towards the design and functionality of both prototypes, overall the Fusion Tables map prototype was preferred. For the load tests, the data set was broken into 22 groups for a total of 44 tests. While the Fusion Tables map prototype performed more efficiently than the ArcGIS Online prototype, the differences are almost unnoticeable. A SWOT analysis was conducted for each prototype. The results from this research point to the Fusion Tables map prototype as the stronger option. A redesign of this prototype would incorporate design suggestions from the usability survey, while some functionality would need to be dropped. Fusion Tables is a free product and would therefore be the best option if cost is an issue, but the map may not be supported in the future.

  9. Exploring the Relationship between Self-Regulated Vocabulary Learning and Web-Based Collaboration

    ERIC Educational Resources Information Center

    Liu, Sarah Hsueh-Jui; Lan, Yu-Ju; Ho, Cloudia Ya-Yu

    2014-01-01

    Collaborative learning has placed an emphasis on co-constructing knowledge by sharing and negotiating meaning for problem-solving activities, and this cannot be accomplished without governing the self-regulatory processes of students. This study employed a Web-based tool, Google Docs, to determine the effects of Web-based collaboration on…

  10. Google Wave: Collaboration Reworked

    ERIC Educational Resources Information Center

    Rethlefsen, Melissa L.

    2010-01-01

    Over the past several years, Internet users have become accustomed to Web 2.0 and cloud computing-style applications. It's commonplace and even intuitive to drag and drop gadgets on personalized start pages, to comment on a Facebook post without reloading the page, and to compose and save documents through a web browser. The web paradigm has…

  11. Novel Data Sources for Women’s Health Research: Mapping Breast Screening Online Information Seeking Through Google Trends

    PubMed Central

    Dehkordy, Soudabeh Fazeli; Carlos, Ruth C.; Hall, Kelli S.; Dalton, Vanessa K.

    2015-01-01

    Rationale and Objectives: Millions of people use online search engines every day to find health-related information and voluntarily share their personal health status and behaviors on various Web sites. Thus, data from tracking online information seekers' behavior offer potential opportunities for use in public health surveillance and research. Google Trends is a feature of Google which allows internet users to graph the frequency of searches for a single term or phrase over time or by geographic region. We used Google Trends to describe patterns of information-seeking behavior on the subject of dense breasts and to examine their correlation with the passage or introduction of dense breast notification legislation. Materials and Methods: To capture the temporal variations of information seeking about dense breasts, the web search query "dense breast" was entered in the Google Trends tool. We then mapped the dates of legislative actions regarding dense breasts that received widespread coverage in the lay media to information-seeking trends about dense breasts over time. Results: Newsworthy events and legislative actions appear to correlate well with peaks in search volume for "dense breast". Geographic regions with the highest search volumes have passed, rejected, or are currently considering dense breast legislation. Conclusions: Our study demonstrated that legislative actions and the respective news coverage correlate with increases in information seeking for "dense breast" on Google, suggesting that Google Trends has the potential to serve as a data source for policy-relevant research. PMID:24998689
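
    Search-interest series like the one used here are commonly pulled programmatically through the unofficial pytrends library rather than the Google Trends web page. A sketch, assuming pytrends is installed and its recent interface:

        from pytrends.request import TrendReq

        pytrends = TrendReq(hl="en-US", tz=0)
        pytrends.build_payload(["dense breast"],
                               timeframe="2012-01-01 2014-12-31", geo="US")

        interest = pytrends.interest_over_time()   # weekly relative search volume
        by_region = pytrends.interest_by_region()  # relative volume per US state
        print(interest.head())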

  12. From Analysis to Impact: Challenges and Outcomes from Google's Cloud-based Platforms for Analyzing and Leveraging Petapixels of Geospatial Data

    NASA Astrophysics Data System (ADS)

    Thau, D.

    2017-12-01

    For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze it, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools. [Image credit: Tyler A. Erickson]
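
    A minimal Earth Engine Python sketch of the pattern described, reduction over many Landsat scenes in Google's cloud with only the small result returned, assuming an authenticated Earth Engine account:

        import ee

        ee.Initialize()  # assumes prior ee.Authenticate() or configured credentials

        point = ee.Geometry.Point([-122.29, 37.90])
        landsat = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
                   .filterBounds(point)
                   .filterDate("2020-01-01", "2020-12-31"))

        # The composite is computed server-side; no pixels are downloaded here.
        composite = landsat.median()
        print(landsat.size().getInfo(), "scenes reduced in Google's cloud")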

  13. Coverage of Google Scholar, Scopus, and Web of Science: a case study of the h-index in nursing.

    PubMed

    De Groote, Sandra L; Raszewski, Rebecca

    2012-01-01

    This study compares the articles cited in CINAHL, Scopus, Web of Science (WOS), and Google Scholar and the h-index ratings provided by Scopus, WOS, and Google Scholar. The publications of 30 College of Nursing faculty at a large urban university were examined. Searches by author name were executed in Scopus, WOS, and POP (Publish or Perish, which searches Google Scholar), and the h-index for each author from each database was recorded. In addition, the citing articles of their published articles were imported into a bibliographic management program. This data was used to determine an aggregated h-index for each author. Scopus, WOS, and Google Scholar provided different h-index ratings for authors and each database found unique and duplicate citing references. More than one tool should be used to calculate the h-index for nursing faculty because one tool alone cannot be relied on to provide a thorough assessment of a researcher's impact. If researchers are interested in a comprehensive h-index, they should aggregate the citing references located by WOS and Scopus. Because h-index rankings differ among databases, comparisons between researchers should be done only within a specified database.
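
    Once citing references from the databases are merged and deduplicated, the aggregated h-index reduces to a few lines: an author has index h if h of their papers have at least h citations each. A sketch with made-up counts:

        def h_index(citation_counts):
            """Largest h such that h papers have at least h citations each."""
            h = 0
            for rank, cites in enumerate(sorted(citation_counts, reverse=True),
                                         start=1):
                if cites >= rank:
                    h = rank
                else:
                    break
            return h

        # Deduplicated counts merged from WOS and Scopus (values are made up).
        print(h_index([24, 18, 12, 9, 7, 5, 2, 1]))  # -> 5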

  14. Security Risks of Cloud Computing and Its Emergence as 5th Utility Service

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

    Cloud computing is being projected by the major cloud service provider IT companies, such as IBM, Google, Yahoo, Amazon and others, as a fifth utility, whereby clients will have access to processing for applications and software projects that need very high processing speed and huge data capacity, for compute-intensive scientific and engineering research problems as well as e-business and data content network applications. These services for different types of clients are provided under DASM (Direct Access Service Management), based on virtualization of hardware, software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments for cloud computing and the hardware/software configuration of the cloud paradigm. The paper also examines the vital aspects of security risks projected by IT industry experts and cloud clients, and highlights the cloud providers' response to cloud security risks.

  15. Study of medicine 2.0 due to Web 2.0?! - Risks and opportunities for the curriculum in Leipzig

    PubMed Central

    Hempel, Gunther; Neef, Martin; Rotzoll, Daisy; Heinke, Wolfgang

    2013-01-01

    Web 2.0 is changing the study of medicine by opening up totally new ways of learning and teaching in an ongoing process. Global social networking services like Facebook, YouTube, Flickr, Google Drive and Xing already play an important part in communication both among students and between students and teaching staff. Moreover, local portals (such as the platform [http://www.leipzig-medizin.de] established in 2003) have also caught on and in some cases eclipsed the use of the well-known location-independent social media. The many possibilities and rapid changes brought about by social networks need to be publicized within medical faculties. Therefore, an E-learning and New Media Working Group was set up at the Faculty of Medicine of Universität Leipzig in order to harness the opportunities of Web 2.0, analyse the resulting processes of change in the study of medicine, and curb the risks of the Internet. With Web 2.0 and the social web already influencing the study of medicine, the opportunities of the Internet now need to be utilized to improve the teaching of medicine. PMID:23467440

  16. NASA GSFC Space Weather Center - Innovative Space Weather Dissemination: Web-Interfaces, Mobile Applications, and More

    NASA Technical Reports Server (NTRS)

    Maddox, Marlo; Zheng, Yihua; Rastaetter, Lutz; Taktakishvili, A.; Mays, M. L.; Kuznetsova, M.; Lee, Hyesook; Chulaki, Anna; Hesse, Michael; Mullinix, Richard; et al.

    2012-01-01

    The NASA GSFC Space Weather Center (http://swc.gsfc.nasa.gov) is committed to providing forecasts, alerts, research, and educational support to address NASA's space weather needs - in addition to the needs of the general space weather community. We provide a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, custom space weather alerts and products, weekly summaries and reports, and most recently - video casts. There are many challenges in providing accurate descriptions of past, present, and expected space weather events - and the Space Weather Center at NASA GSFC employs several innovative solutions to provide access to a comprehensive collection of both observational data, as well as space weather model/simulation data. We'll describe the challenges we've faced with managing hundreds of data streams, running models in real-time, data storage, and data dissemination. We'll also highlight several systems and tools that are utilized by the Space Weather Center in our daily operations, all of which are available to the general community as well. These systems and services include a web-based application called the Integrated Space Weather Analysis System (iSWA, http://iswa.gsfc.nasa.gov), two mobile space weather applications for both iOS and Android devices, an external API for web-service-style access to data, Google Earth-compatible data products, and a downloadable client-based visualization tool.

  17. Data Access and Web Services at the EarthScope Plate Boundary Observatory

    NASA Astrophysics Data System (ADS)

    Matykiewicz, J.; Anderson, G.; Henderson, D.; Hodgkinson, K.; Hoyt, B.; Lee, E.; Persson, E.; Torrez, D.; Smith, J.; Wright, J.; Jackson, M.

    2007-12-01

    The EarthScope Plate Boundary Observatory (PBO) at UNAVCO, Inc., part of the NSF-funded EarthScope project, is designed to study the three-dimensional strain field resulting from deformation across the active boundary zone between the Pacific and North American plates in the western United States. To meet these goals, PBO will install 880 continuous GPS stations, 103 borehole strainmeter stations, and five laser strainmeters, as well as manage data for 209 previously existing continuous GPS stations and one previously existing laser strainmeter. UNAVCO provides access to data products from these stations, as well as general information about the PBO project, via the PBO web site (http://pboweb.unavco.org). GPS and strainmeter data products can be found using a variety of access methods, including map searches, text searches, and station-specific data retrieval. In addition, the PBO construction status is available via multiple mapping interfaces, including custom web based map widgets and Google Earth. Additional construction details can be accessed from PBO operational pages and station specific home pages. The current state of health for the PBO network is available with the statistical snap-shot, full map interfaces, tabular web based reports, and automatic data mining and alerts. UNAVCO is currently working to enhance the community access to this information by developing a web service framework for the discovery of data products, interfacing with operational engineers, and exposing data services to third party participants. In addition, UNAVCO, through the PBO project, provides advanced data management and monitoring systems for use by the community in operating geodetic networks in the United States and beyond. We will demonstrate these systems during the AGU meeting, and we welcome inquiries from the community at any time.

  18. The Plate Boundary Observatory: Community Focused Web Services

    NASA Astrophysics Data System (ADS)

    Matykiewicz, J.; Anderson, G.; Lee, E.; Hoyt, B.; Hodgkinson, K.; Persson, E.; Wright, J.; Torrez, D.; Jackson, M.

    2006-12-01

    The Plate Boundary Observatory (PBO), part of the NSF-funded EarthScope project, is designed to study the three-dimensional strain field resulting from deformation across the active boundary zone between the Pacific and North American plates in the western United States. To meet these goals, PBO will install 852 continuous GPS stations, 103 borehole strainmeter stations, 28 tiltmeters, and five laser strainmeters, as well as manage data for 209 previously existing continuous GPS stations. UNAVCO provides access to data products from these stations, as well as general information about the PBO project, via the PBO web site (http://pboweb.unavco.org). GPS and strainmeter data products can be found using a variety of channels, including map searches, text searches, and station specific data retrieval. In addition, the PBO construction status is available via multiple mapping interfaces, including custom web based map widgets and Google Earth. Additional construction details can be accessed from PBO operational pages and station specific home pages. The current state of health for the PBO network is available with the statistical snap-shot, full map interfaces, tabular web based reports, and automatic data mining and alerts. UNAVCO is currently working to enhance the community access to this information by developing a web service framework for the discovery of data products, interfacing with operational engineers, and exposing data services to third party participants. In addition, UNAVCO, through the PBO project, provides advanced data management and monitoring systems for use by the community in operating geodetic networks in the United States and beyond. We will demonstrate these systems during the AGU meeting, and we welcome inquiries from the community at any time.

  19. Connecting long-tail scientists with big data centers using SaaS

    NASA Astrophysics Data System (ADS)

    Percivall, G. S.; Bermudez, L. E.

    2012-12-01

    Big data centers and long-tail scientists represent two extremes in the geoscience research community. Interoperability and inter-use based on software-as-a-service (SaaS) increase access to big data holdings by this underserved community of scientists. Large, institutional data centers have long been recognized as vital resources in the geoscience community. Permanent data archiving and dissemination centers provide "access to the data and (are) a critical source of people who have experience in the use of the data and can provide advice and counsel for new applications." [NRC] The "long tail of science" consists of the geoscience researchers who work separately from institutional data centers [Heidorn]. Long-tail scientists need to be efficient consumers of data from large, institutional data centers. Discussions in NSF EarthCube capture the challenges: "Like the vast majority of NSF-funded researchers, Alice (a long-tail scientist) works with limited resources. In the absence of suitable expertise and infrastructure, the apparently simple task that she assigns to her graduate student becomes an information discovery and management nightmare. Downloading and transforming datasets takes weeks." [Foster, et.al.] The long-tail metaphor points to methods to bridge the gap, i.e., the Web. A decade ago, OGC began building a geospatial information space using open web standards for geoprocessing [ORM]. Recently, [Foster, et.al.] accurately observed that "by adopting, adapting, and applying semantic web and SaaS technologies, we can make the use of geoscience data as easy and convenient as consumption of online media." SaaS places web services into cloud computing. SaaS for geospatial is emerging rapidly, building on the first-generation geospatial web, e.g., the OGC Web Coverage Service [WCS] and the Data Access Protocol [DAP]. Several recent examples show progress in applying SaaS to geosciences: - NASA's Earth Data Coherent Web has a goal to improve the science user experience, using Web services (e.g., W*S, SOAP, RESTful) to reduce barriers to using EOSDIS data [ECW]. - NASA's LANCE provides direct access to vast amounts of satellite data using the OGC Web Map Tile Service (WMTS). - NOAA's Unified Access Framework for Gridded Data (UAF Grid) is a web-service-based capability for direct access to a variety of datasets using netCDF, OPeNDAP, THREDDS, WMS and WCS. [UAF] Tools for accessing SaaS offerings are many and varied: some proprietary, others open source; some run in browsers, others are stand-alone applications. What is required is interoperability through the web interfaces offered by the data centers. NOAA's UAF service stack supports Matlab, ArcGIS, Ferret, GrADS, Google Earth, IDV, and LAS. Any SaaS that offers OGC Web Services (WMS, WFS, WCS) can be accessed by scores of clients [OGC]. While there has been much progress in the recent year toward offering web services for the long tail of scientists, more needs to be done. Web services offer data access, but more than access is needed for inter-use of data, e.g., defining data schemas that allow for data fusion, and addressing coordinate systems, spatial geometry, and semantics for observations. Connecting long-tail scientists with large data centers using SaaS and, in the future, the semantic web will address this large and currently underserved user community.
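
    As one concrete instance of the standards-based access the paper advocates, a WMS 1.3.0 GetMap call needs only standard key-value parameters; the server URL and layer name below are placeholders.

        import requests

        params = {
            "service": "WMS", "version": "1.3.0", "request": "GetMap",
            "layers": "sea_surface_temperature",   # placeholder layer name
            "styles": "",
            "crs": "EPSG:4326",
            "bbox": "-90,-180,90,180",   # WMS 1.3.0 axis order for EPSG:4326
            "width": "1024", "height": "512",
            "format": "image/png",
        }
        resp = requests.get("http://wms.example.org/wms", params=params)
        with open("sst.png", "wb") as f:
            f.write(resp.content)   # rendered map, usable by scores of clients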

  20. Automating Information Discovery Within the Invisible Web

    NASA Astrophysics Data System (ADS)

    Sweeney, Edwina; Curran, Kevin; Xie, Ermai

    A Web crawler or spider crawls through the Web looking for pages to index, and when it locates a new page it passes the page on to an indexer. The indexer identifies links, keywords, and other content and stores these within its database. This database is searched by entering keywords through an interface and suitable Web pages are returned in a results page in the form of hyperlinks accompanied by short descriptions. The Web, however, is increasingly moving away from being a collection of documents to a multidimensional repository for sounds, images, audio, and other formats. This is leading to a situation where certain parts of the Web are invisible or hidden. The term known as the "Deep Web" has emerged to refer to the mass of information that can be accessed via the Web but cannot be indexed by conventional search engines. The concept of the Deep Web makes searches quite complex for search engines. Google states that the claim that conventional search engines cannot find such documents as PDFs, Word, PowerPoint, Excel, or any non-HTML page is not fully accurate and steps have been taken to address this problem by implementing procedures to search items such as academic publications, news, blogs, videos, books, and real-time information. However, Google still only provides access to a fraction of the Deep Web. This chapter explores the Deep Web and the current tools available in accessing it.
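
    The crawl-and-index loop described above can be sketched in a few lines. This toy version (surface Web only, with no politeness delays or robots.txt handling, and an invented start URL) shows exactly the mechanism that Deep Web content escapes:

        import urllib.request
        from html.parser import HTMLParser
        from urllib.parse import urljoin

        class LinkExtractor(HTMLParser):
            """Collect href targets, the step an indexer performs on each page."""
            def __init__(self):
                super().__init__()
                self.links = []

            def handle_starttag(self, tag, attrs):
                if tag == "a":
                    for name, value in attrs:
                        if name == "href" and value:
                            self.links.append(value)

        def crawl(start_url, max_pages=10):
            """Breadth-first fetch-and-extract loop over hyperlinked HTML."""
            seen, frontier = set(), [start_url]
            while frontier and len(seen) < max_pages:
                url = frontier.pop(0)
                if url in seen:
                    continue
                seen.add(url)
                try:
                    html = urllib.request.urlopen(url, timeout=5).read() \
                                         .decode("utf-8", "replace")
                except Exception:
                    continue  # dynamic or non-HTML resources: the "invisible" part
                parser = LinkExtractor()
                parser.feed(html)
                frontier.extend(urljoin(url, link) for link in parser.links)
            return seen

        print(crawl("http://www.example.org/"))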

  1. Critical Reading of the Web

    ERIC Educational Resources Information Center

    Griffin, Teresa; Cohen, Deb

    2012-01-01

    The ubiquity and familiarity of the world wide web means that students regularly turn to it as a source of information. In doing so, they "are said to rely heavily on simple search engines, such as Google to find what they want." Researchers have also investigated how students use search engines, concluding that "the young web users tended to…

  2. Microsoft or Google Web 2.0 Tools for Course Management

    ERIC Educational Resources Information Center

    Rienzo, Thomas; Han, Bernard

    2009-01-01

    While Web 2.0 has no universal definition, it always refers to online interactions in which user groups both provide and receive content with the aim of collective intelligence. Since 2005, online software has provided Web 2.0 collaboration technologies, for little or no charge, that were formerly available only to wealthy organizations. Academic…

  3. Using Google Scholar to Estimate the Impact of Journal Articles in Education

    ERIC Educational Resources Information Center

    van Aalst, Jan

    2010-01-01

    This article discusses the potential of Google Scholar as an alternative or complement to the Web of Science and Scopus for measuring the impact of journal articles in education. Three handbooks on research in science education, language education, and educational technology were used to identify a sample of 112 accomplished scholars. Google…

  4. Center for Adaptive Optics | Search

    Science.gov Websites

    Center for Adaptive Optics, a University of California Science and Technology Center. The site's search page offers a Google-powered search of the CfAO site, all of UCOLick.org, or the whole Web, plus a link to recent adaptive optics news at Google News. Last modified: Sep 21, 2010.

  5. "Google Reigns Triumphant"?: Stemming the Tide of Googlitis via Collaborative, Situated Information Literacy Instruction

    ERIC Educational Resources Information Center

    Leibiger, Carol A.

    2011-01-01

    Googlitis, the overreliance on search engines for research and the resulting development of poor searching skills, is a recognized problem among today's students. Google is not an effective research tool because, in addition to encouraging keyword searching at the expense of more powerful subject searching, it only accesses the Surface Web and is…

  6. Complaint go: an online complaint registration system using web services and android

    NASA Astrophysics Data System (ADS)

    Mareeswari, V.; Gopalakrishnan, V.

    2017-11-01

    In many countries, local governing bodies help maintain and run cities; these bodies are generally called MCs (Municipal Corporations). An MC may need to install CCTV cameras and other surveillance devices to ensure the city is running smoothly and efficiently, and it is important for an MC to know about faults occurring within the city. At present, this is practically possible only by installing sensors/cameras or by enabling citizens to address the authorities directly. The day-to-day operations and working of the city are handled by governing bodies known as Government Authorities (GAs). Maintaining a large city requires that the GA be aware of any problem or fault, either through sensors/CCTV cameras or by enabling citizens to complain about these issues. The second option is generally preferred because it provides properly substantiated information, and the GA typically allows its residents to register grievances through several channels. In this application, citizens can send complaints directly from their smartphones to the responsible officials. Several APIs function as web services that make it easier to register a complaint, such as the Google Places API to detect the user's current location and show it on a map. A Web portal, well supported by different web services, is used to process the various complaints.
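
    The location step can be sketched against Google's geocoding web service (the abstract names the Google Places API; reverse geocoding is used here as a closely related illustration). The API key is a placeholder, and the exact fields the app attaches to a complaint are not specified in the abstract.

        import requests

        def reverse_geocode(lat, lng, api_key):
            """Turn a GPS fix into a human-readable address via Google's API."""
            resp = requests.get(
                "https://maps.googleapis.com/maps/api/geocode/json",
                params={"latlng": f"{lat},{lng}", "key": api_key})
            results = resp.json().get("results", [])
            return results[0]["formatted_address"] if results else None

        # The resolved address would be attached to the complaint payload.
        print(reverse_geocode(12.9716, 77.5946, api_key="YOUR_KEY"))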

  7. The Earth Observation Monitor - Automated monitoring and alerting for spatial time-series data based on OGC web services

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Hüttich, C.; Schmullius, C.

    2014-12-01

    Spatial time series data have been freely available around the globe from earth observation satellites and meteorological stations for many years. They provide useful and important information for detecting ongoing changes of the environment, but for end users it is often too complex to extract this information from the original time series datasets. This issue led to the development of the Earth Observation Monitor (EOM), an operational framework and research project providing simple access, analysis and monitoring tools for global spatial time series data. A multi-source data processing middleware in the backend is linked to MODIS data from the Land Processes Distributed Active Archive Center (LP DAAC) and Google Earth Engine, as well as daily climate station data from the NOAA National Climatic Data Center. OGC Web Processing Services are used to integrate datasets from linked data providers or external OGC-compliant interfaces into the EOM. Users can use either the web portal (webEOM) or the mobile application (mobileEOM) to execute these processing services and to retrieve the requested data for a given point or polygon in user-friendly file formats (CSV, GeoTIFF). Beyond simple data access, users can run further time series analyses such as trend calculations, breakpoint detections or the derivation of phenological parameters from vegetation time series data. Furthermore, data from climate stations can be aggregated over a given time interval. Calculated results can be visualized in the client and downloaded for offline usage. Automated monitoring and alerting for the time series data integrated by the user is provided by an OGC Sensor Observation Service with a coupled OGC Web Notification Service. Users can decide which datasets and parameters are monitored with a given filter expression (e.g., a precipitation value higher than x millimeters per day, the occurrence of a MODIS fire point, or the detection of a time series anomaly). Datasets integrated in the SOS service are updated in near real time based on the linked data providers mentioned above. An alert is automatically pushed to the user if new data meet the conditions of the registered filter expression. This monitoring service is available on the web portal with alerting by email, and within the mobile app with alerting by email and push notification.
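
    An OGC WPS Execute request of the kind EOM chains together can be issued as a simple key-value GET; the endpoint, process identifier, and inputs below are illustrative, not EOM's actual process names.

        import requests

        params = {
            "service": "WPS", "version": "1.0.0", "request": "Execute",
            "identifier": "timeseries.modis.ndvi",   # illustrative process id
            "datainputs": "lat=50.93;lon=11.59;start=2010-01-01;end=2014-01-01",
        }
        resp = requests.get("http://eom.example.org/wps", params=params)
        print(resp.text)  # ExecuteResponse XML with status and result references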

  8. Three options for citation tracking: Google Scholar, Scopus and Web of Science.

    PubMed

    Bakkalbasi, Nisa; Bauer, Kathleen; Glover, Janis; Wang, Lei

    2006-06-29

    Researchers turn to citation tracking to find the most influential articles for a particular topic and to see how often their own published papers are cited. For years researchers looking for this type of information had only one resource to consult: the Web of Science from Thomson Scientific. In 2004 two competitors emerged--Scopus from Elsevier and Google Scholar from Google. The research reported here uses citation analysis in an observational study examining these three databases; comparing citation counts for articles from two disciplines (oncology and condensed matter physics) and two years (1993 and 2003) to test the hypothesis that the different scholarly publication coverage provided by the three search tools will lead to different citation counts from each. Eleven journal titles with varying impact factors were selected from each discipline (oncology and condensed matter physics) using the Journal Citation Reports (JCR). All articles published in the selected titles were retrieved for the years 1993 and 2003, and a stratified random sample of articles was chosen, resulting in four sets of articles. During the week of November 7-12, 2005, the citation counts for each research article were extracted from the three sources. The actual citing references for a subset of the articles published in 2003 were also gathered from each of the three sources. For oncology 1993 Web of Science returned the highest average number of citations, 45.3. Scopus returned the highest average number of citations (8.9) for oncology 2003. Web of Science returned the highest number of citations for condensed matter physics 1993 and 2003 (22.5 and 3.9 respectively). The data showed a significant difference in the mean citation rates between all pairs of resources except between Google Scholar and Scopus for condensed matter physics 2003. For articles published in 2003 Google Scholar returned the largest amount of unique citing material for oncology and Web of Science returned the most for condensed matter physics. This study did not identify any one of these three resources as the answer to all citation tracking needs. Scopus showed strength in providing citing literature for current (2003) oncology articles, while Web of Science produced more citing material for 2003 and 1993 condensed matter physics, and 1993 oncology articles. All three tools returned some unique material. Our data indicate that the question of which tool provides the most complete set of citing literature may depend on the subject and publication year of a given article.

  9. Three options for citation tracking: Google Scholar, Scopus and Web of Science

    PubMed Central

    Bakkalbasi, Nisa; Bauer, Kathleen; Glover, Janis; Wang, Lei

    2006-01-01

    Background Researchers turn to citation tracking to find the most influential articles for a particular topic and to see how often their own published papers are cited. For years researchers looking for this type of information had only one resource to consult: the Web of Science from Thomson Scientific. In 2004 two competitors emerged – Scopus from Elsevier and Google Scholar from Google. The research reported here uses citation analysis in an observational study examining these three databases; comparing citation counts for articles from two disciplines (oncology and condensed matter physics) and two years (1993 and 2003) to test the hypothesis that the different scholarly publication coverage provided by the three search tools will lead to different citation counts from each. Methods Eleven journal titles with varying impact factors were selected from each discipline (oncology and condensed matter physics) using the Journal Citation Reports (JCR). All articles published in the selected titles were retrieved for the years 1993 and 2003, and a stratified random sample of articles was chosen, resulting in four sets of articles. During the week of November 7–12, 2005, the citation counts for each research article were extracted from the three sources. The actual citing references for a subset of the articles published in 2003 were also gathered from each of the three sources. Results For oncology 1993 Web of Science returned the highest average number of citations, 45.3. Scopus returned the highest average number of citations (8.9) for oncology 2003. Web of Science returned the highest number of citations for condensed matter physics 1993 and 2003 (22.5 and 3.9 respectively). The data showed a significant difference in the mean citation rates between all pairs of resources except between Google Scholar and Scopus for condensed matter physics 2003. For articles published in 2003 Google Scholar returned the largest amount of unique citing material for oncology and Web of Science returned the most for condensed matter physics. Conclusion This study did not identify any one of these three resources as the answer to all citation tracking needs. Scopus showed strength in providing citing literature for current (2003) oncology articles, while Web of Science produced more citing material for 2003 and 1993 condensed matter physics, and 1993 oncology articles. All three tools returned some unique material. Our data indicate that the question of which tool provides the most complete set of citing literature may depend on the subject and publication year of a given article. PMID:16805916

  10. Web of Science, Scopus, and Google Scholar citation rates: a case study of medical physics and biomedical engineering: what gets cited and what doesn't?

    PubMed

    Trapp, Jamie

    2016-12-01

    There are often differences in a publication's citation count depending on the database accessed. Here, aspects of citation counts for medical physics and biomedical engineering papers are studied using papers published in the journal Australasian Physical & Engineering Sciences in Medicine. Comparison is made between the Web of Science, Scopus, and Google Scholar. Papers are categorised by subject matter, and citation trends are examined. It is shown that review papers as a group tend to receive more citations on average; however, the highest-cited individual papers are more likely to be research papers.

  11. Trends in access of plant biodiversity data revealed by Google Analytics

    PubMed Central

    Baxter, David G.; Hagedorn, Gregor; Legler, Ben; Gilbert, Edward; Thiele, Kevin; Vargas-Rodriguez, Yalma; Urbatsch, Lowell E.

    2014-01-01

    The amount of plant biodiversity data available via the web has exploded in the last decade, but making these data available requires a considerable investment of time and work, both vital considerations for organizations and institutions looking to validate the impact factors of these online works. Here we used Google Analytics (GA) to measure the value of this digital presence. In this paper we examine usage trends using 15 different GA accounts, spread across 451 institutions or botanical projects that comprise over five percent of the world's herbaria, studied over both a single year and the accounts' full lifetimes. User data from the sample reveal: 1) over 17 million web sessions, 2) on five primary operating systems, 3) that search and direct traffic dominate, with minimal impact from social media, 4) that mobile and new device types have doubled each year for the past three years, and 5) that web browsers, the tools we use to interact with the web, are changing. Server-side analytics differ from site to site, making the comparison of their data sets difficult. However, use of Google Analytics erases the reporting heterogeneity of unique server-side analytics, as they can now be examined with a standard that provides clarity for data-driven decisions. The knowledge gained here empowers any collection-based environment, regardless of size, with metrics about usability, design, and possible directions for future development. PMID:25425933

  12. Trends in access of plant biodiversity data revealed by Google Analytics.

    PubMed

    Jones, Timothy Mark; Baxter, David G; Hagedorn, Gregor; Legler, Ben; Gilbert, Edward; Thiele, Kevin; Vargas-Rodriguez, Yalma; Urbatsch, Lowell E

    2014-01-01

    The amount of plant biodiversity data available via the web has exploded in the last decade, but making these data available requires a considerable investment of time and work, both vital considerations for organizations and institutions looking to validate the impact factors of these online works. Here we used Google Analytics (GA) to measure the value of this digital presence. In this paper we examine usage trends using 15 different GA accounts, spread across 451 institutions or botanical projects that comprise over five percent of the world's herbaria. Accounts were studied over both a single year and all years of available data. User data from the sample reveal: 1) over 17 million web sessions, 2) on five primary operating systems, 3) search and direct traffic dominate with minimal impact from social media, 4) mobile and new device types have doubled each year for the past three years, 5) and web browsers, the tools we use to interact with the web, are changing. Server-side analytics differ from site to site, making the comparison of their data sets difficult. However, use of Google Analytics erases the reporting heterogeneity of unique server-side analytics, as they can now be examined with a standard that provides clarity for data-driven decisions. The knowledge gained here empowers any collection-based environment, regardless of size, with metrics about usability, design, and possible directions for future development.

  13. 76 FR 18762 - Google, Inc.; Analysis of Proposed Consent Order To Aid Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-05

    ... final the agreement's proposed order. On February 9, 2010, Google launched a social networking service... street name and city or town; (c) email address or other online contact information, such as a user... networking service (``Google Buzz'') it used personal information previously collected for other purposes...

  14. SECURE INTERNET OF THINGS-BASED CLOUD FRAMEWORK TO CONTROL ZIKA VIRUS OUTBREAK.

    PubMed

    Sareen, Sanjay; Sood, Sandeep K; Gupta, Sunil Kumar

    2017-01-01

    Zika virus (ZikaV) is currently one of the most important emerging viruses in the world; it has caused outbreaks and epidemics and has also been associated with severe clinical manifestations and congenital malformations. Traditional approaches to combat the ZikaV outbreak are not effective for detection and control. The aim of this study is to propose a cloud-based system to prevent and control the spread of Zika virus disease using the integration of mobile phones and the Internet of Things (IoT). A Naive Bayesian Network (NBN) is used to diagnose possibly infected users, and the Google Maps Web service is used to provide geographic positioning system (GPS)-based risk assessment to prevent the outbreak. Each ZikaV-infected user, mosquito-dense site, and breeding site is represented on the Google map, which helps government healthcare authorities control such risk-prone areas effectively and efficiently. The performance and accuracy of the proposed system are evaluated using a dataset of 2 million users. Our system provides high accuracy for the initial diagnosis of different users according to their symptoms, together with appropriate GPS-based risk assessment. The cloud-based proposed system contributed to the accurate NBN-based classification of infected users and accurate identification of risk-prone areas using Google Maps.
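
    As a rough illustration of the Naive-Bayes-style symptom classification described above, the following Python sketch trains a Bernoulli naive Bayes model on binary symptom vectors. The symptom set, training rows, and labels are illustrative assumptions, not values from the paper.

    ```python
    # A minimal, illustrative Bernoulli naive Bayes classifier over binary
    # symptom vectors, in the spirit of the NBN diagnosis step described
    # above. Symptoms, training rows, and labels are invented for the sketch.
    from sklearn.naive_bayes import BernoulliNB

    # Feature order: [fever, rash, joint_pain, conjunctivitis]
    X_train = [
        [1, 1, 1, 1],   # presentation consistent with ZikaV infection
        [1, 1, 0, 1],
        [1, 0, 1, 0],   # other febrile illness
        [0, 0, 1, 0],
        [0, 0, 0, 0],   # no relevant symptoms
    ]
    y_train = ["possibly_infected", "possibly_infected",
               "other", "other", "healthy"]

    model = BernoulliNB().fit(X_train, y_train)

    new_user = [[1, 1, 0, 1]]              # symptoms reported via the mobile app
    print(model.predict(new_user))         # e.g. ['possibly_infected']
    print(model.predict_proba(new_user))   # class probabilities for triage
    ```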

  15. The GLIMS Glacier Database

    NASA Astrophysics Data System (ADS)

    Raup, B. H.; Khalsa, S. S.; Armstrong, R.

    2007-12-01

    The Global Land Ice Measurements from Space (GLIMS) project has built a geospatial and temporal database of glacier data, composed of glacier outlines and various scalar attributes. These data are being derived primarily from satellite imagery, such as from ASTER and Landsat. Each "snapshot" of a glacier is from a specific time, and the database is designed to store multiple snapshots representative of different times. We have implemented two web-based interfaces to the database; one enables exploration of the data via interactive maps (web map server), while the other allows searches based on text-field constraints. The web map server is an Open Geospatial Consortium (OGC) compliant Web Map Server (WMS) and Web Feature Server (WFS). This means that other web sites can display glacier layers from our site over the Internet, or retrieve glacier features in vector format. All components of the system are implemented using Open Source software: Linux, PostgreSQL, PostGIS (geospatial extensions to the database), MapServer (WMS and WFS), and several supporting components such as Proj.4 (a geographic projection library) and PHP. These tools are robust and provide a flexible and powerful framework for web mapping applications. As a service to the GLIMS community, the database contains metadata on all ASTER imagery acquired over glacierized terrain. Reduced-resolution versions of the images (browse imagery) can be viewed either as a layer in the MapServer application or overlaid on the virtual globe within Google Earth. The interactive map application allows the user to constrain by time what data appear on the map. For example, ASTER scenes or glacier outlines from 2002 only, or from autumn of any year, can be displayed. The system allows users to download their selected glacier data in a choice of formats. The results of a query based on spatial selection (using a mouse) or text-field constraints can be downloaded in any of these formats: ESRI shapefiles, KML (Google Earth), MapInfo, GML (Geography Markup Language) and GMT (Generic Mapping Tools). This "clip-and-ship" function allows users to download only the data they are interested in. Our flexible web interfaces to the database, which include various support layers (e.g. a layer to help collaborators identify satellite imagery over their region of expertise), will facilitate enhanced analysis of glacier systems, their distribution, and their impacts on other Earth systems.
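
    Because the GLIMS map server speaks standard OGC WMS, a map layer can be fetched with an ordinary HTTP GetMap request. The sketch below shows the general shape of such a request in Python; the endpoint URL and layer name are hypothetical placeholders, not the real GLIMS values.

    ```python
    # Shape of an OGC WMS 1.1.1 GetMap request; the base URL and layer name
    # below are hypothetical placeholders, not the real GLIMS endpoint.
    import requests

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "glacier_outlines",      # hypothetical layer name
        "SRS": "EPSG:4326",
        "BBOX": "86.0,27.5,87.5,28.5",     # lon/lat box (min_x,min_y,max_x,max_y)
        "WIDTH": "512",
        "HEIGHT": "512",
        "FORMAT": "image/png",
    }
    resp = requests.get("https://example.org/glims/wms", params=params, timeout=30)
    resp.raise_for_status()
    with open("glaciers.png", "wb") as f:
        f.write(resp.content)              # the rendered map image
    ```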

  16. A Java API for working with PubChem datasets.

    PubMed

    Southern, Mark R; Griffin, Patrick R

    2011-03-01

    PubChem is a public repository of chemical structures and associated biological activities. The PubChem BioAssay database contains assay descriptions, conditions and readouts, and biological screening results that have been submitted by the biomedical research community. The PubChem web site and Power User Gateway (PUG) web service allow users to interact with the data, and raw files are available via FTP. These resources are helpful to many, but there can also be great benefit in using a software API to manipulate the data. Here, we describe a Java API with entity objects mapped to the PubChem Schema and with wrapper functions for calling the NCBI eUtilities and PubChem PUG web services. PubChem BioAssays and associated chemical compounds can then be queried and manipulated in a local relational database. Features include chemical structure searching and the generation and display of curve fits from stored dose-response experiments, something that is not yet available within PubChem itself. The aim is to provide researchers with a fast, consistent, queryable local resource from which to manipulate PubChem BioAssays in a database-agnostic manner. It is not intended as an end-user tool but as a platform for further automation and tools development. http://code.google.com/p/pubchemdb.
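
    The library described above is Java-based, but the underlying idea of scripted PubChem access is easy to illustrate. The sketch below instead uses PubChem's PUG-REST interface (a later, REST-style counterpart to the XML-based PUG service named in the abstract) to pull computed properties for a compound.

    ```python
    # Fetch computed properties for aspirin (CID 2244) via PubChem PUG-REST,
    # a REST-style counterpart to the XML-based PUG service described above.
    import requests

    BASE = "https://pubchem.ncbi.nlm.nih.gov/rest/pug"
    url = f"{BASE}/compound/cid/2244/property/MolecularFormula,MolecularWeight/JSON"

    props = requests.get(url, timeout=30).json()
    # The response nests records under PropertyTable -> Properties.
    print(props["PropertyTable"]["Properties"][0])
    ```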

  17. Are Google or Yahoo a good portal for getting quality healthcare web information?

    PubMed

    Chang, Polun; Hou, I-Ching; Hsu, Chiao-Ling; Lai, Hsiang-Fen

    2006-01-01

    We examined the ranks of 50 award-winning health websites in Taiwan against the search results of two popular portals for 6 common diseases. The results showed that the portals' search results do not rank the quality websites reasonably.

  18. An integrated WebGIS framework for volunteered geographic information and social media in soil and water conservation.

    PubMed

    Werts, Joshua D; Mikhailova, Elena A; Post, Christopher J; Sharp, Julia L

    2012-04-01

    Volunteered geographic information and social networking in a WebGIS has the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.

  19. An Integrated WebGIS Framework for Volunteered Geographic Information and Social Media in Soil and Water Conservation

    NASA Astrophysics Data System (ADS)

    Werts, Joshua D.; Mikhailova, Elena A.; Post, Christopher J.; Sharp, Julia L.

    2012-04-01

    Volunteered geographic information and social networking in a WebGIS has the potential to increase public participation in soil and water conservation, promote environmental awareness and change, and provide timely data that may be otherwise unavailable to policymakers in soil and water conservation management. The objectives of this study were: (1) to develop a framework for combining current technologies, computing advances, data sources, and social media; and (2) develop and test an online web mapping interface. The mapping interface integrates Microsoft Silverlight, Bing Maps, ArcGIS Server, Google Picasa Web Albums Data API, RSS, Google Analytics, and Facebook to create a rich user experience. The website allows the public to upload photos and attributes of their own subdivisions or sites they have identified and explore other submissions. The website was made available to the public in early February 2011 at http://www.AbandonedDevelopments.com and evaluated for its potential long-term success in a pilot study.

  20. Lexicon Sextant: Modeling a Mnemonic System for Customizable Browser Information Organization and Management

    ERIC Educational Resources Information Center

    Shen, Siu-Tsen

    2016-01-01

    This paper presents an ongoing study of the development of a customizable web browser information organization and management system, which the author has named Lexicon Sextant (LS). LS is a user friendly, graphical web based add-on to the latest generation of web browsers, such as Google Chrome, making it easier and more intuitive to store and…

  1. Why We Are Not Google: Lessons from a Library Web Site Usability Study

    ERIC Educational Resources Information Center

    Swanson, Troy A.; Green, Jeremy

    2011-01-01

    In the Fall of 2009, the Moraine Valley Community College Library, using guidelines developed by Jakob Nielsen, conducted a usability study to determine how students were using the library Web site and to inform the redesign of the Web site. The authors found that Moraine Valley's current gateway design was a more effective access point to library…

  2. Moving Forward: The Next-Gen Catalog and the New Discovery Tools

    ERIC Educational Resources Information Center

    Weare, William H., Jr.; Toms, Sue; Breeding, Marshall

    2011-01-01

    Do students prefer to use Google instead of the library catalog? Ever wondered why? Google is easier to use and delivers plenty of "good enough" resources to meet their needs. The current generation of online catalogs has two main problems. First, the look and feel of the interface doesn't reflect the conventions adhered to elsewhere on the web,…

  3. Development of an Innovative Interactive Virtual Classroom System for K-12 Education Using Google App Engine

    ERIC Educational Resources Information Center

    Mumba, Frackson; Zhu, Mengxia

    2013-01-01

    This paper presents a Simulation-based interactive Virtual ClassRoom web system (SVCR: www.vclasie.com) powered by state-of-the-art cloud computing technology from Google. SVCR integrates popular free open-source math, science, and engineering simulations and provides functions such as secure user access control and management of courses,…

  4. Social Constructivist Approach to Web-Based EFL Learning: Collaboration, Motivation, and Perception on the Use of Google Docs

    ERIC Educational Resources Information Center

    Liu, Sarah Hsueh-Jui; Lan, Yu-Ju

    2016-01-01

    This study reports on the differences in motivation, vocabulary gain, and perceptions of using Google Docs between individual and collaborative learning at a tertiary level. Two classes of English-as-a-Foreign-Language (EFL) students were recruited, and each class was randomly assigned into one of the two groups--individuals or…

  5. Why do people google movement disorders? An infodemiological study of information seeking behaviors.

    PubMed

    Brigo, Francesco; Erro, Roberto

    2016-05-01

    Millions of people worldwide search Google or Wikipedia every day to look for health-related information. The aim of this study was to evaluate and interpret web search queries for terms related to movement disorders (MD) in English-speaking countries and their changes over time. We analyzed information on the volume of online searches in Google and Wikipedia for the most common MD and their treatments. We determined the highest search volume peaks to identify possible relations with online news headlines. The volume of searches for some MD-related queries entered in Google increased enormously over time. Most queries were related to definitions, subtypes, symptoms, and treatment (mostly to adverse effects or, alternatively, to possible alternative treatments). The highest peaks of MD search queries were temporally related to news about celebrities suffering from MD, to specific mass-media events, or to news concerning pharmaceutical companies or scientific discoveries on MD. An increasing number of people use Google and Wikipedia to look for terms related to MD to obtain information on definitions, causes, and symptoms, possibly to aid initial self-diagnosis. MD information demand and the actual prevalence of different MDs do not travel together: web search volume may mirror patients' fears and worries about particular disorders perceived as more serious than others, or may be driven by the release of news about celebrities suffering from MD, "breaking news", or specific mass-media events regarding MD.
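
    One concrete way to retrieve the kind of Wikipedia interest data such infodemiological studies draw on is the Wikimedia pageviews REST API. The sketch below is a minimal example; the article title and date range are illustrative, and the exact path format should be checked against the current API documentation.

    ```python
    # Monthly page views for one article via the Wikimedia pageviews REST
    # API. Article and date range are illustrative; verify the path format
    # against the current API docs before relying on it.
    import requests
    from urllib.parse import quote

    article = quote("Parkinson's disease".replace(" ", "_"), safe="")
    url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
           f"en.wikipedia/all-access/all-agents/{article}/monthly/20160101/20161231")

    resp = requests.get(url, headers={"User-Agent": "infodemiology-demo/0.1"},
                        timeout=30)
    for item in resp.json().get("items", []):
        print(item["timestamp"], item["views"])
    ```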

  6. "Publish or Perish" as citation metrics used to analyze scientific output in the humanities: International case studies in economics, geography, social sciences, philosophy, and history.

    PubMed

    Baneyx, Audrey

    2008-01-01

    Traditionally, the most commonly used source of bibliometric data is the Thomson ISI Web of Knowledge, in particular the (Social) Science Citation Index and the Journal Citation Reports, which provide the yearly Journal Impact Factors. This database, used for the evaluation of researchers, is not advantageous for the humanities, mainly because books, conference papers, and non-English journals, which are an important part of scientific activity, are not (well) covered. This paper presents the use of an alternative source of data, Google Scholar, and its benefits in calculating citation metrics in the humanities. Because of its broader range of data sources, the use of Google Scholar generally results in more comprehensive citation coverage in the humanities. This presentation compares and analyzes some international case studies with ISI Web of Knowledge and Google Scholar. The fields of economics, geography, social sciences, philosophy, and history are focused on to illustrate the differences in results between these two databases. To search for relevant publications in the Google Scholar database, the use of "Publish or Perish" and of CleanPoP, a tool the author developed to clean the results, is compared.

  7. Enabling Tools and Methods for International, Inter-disciplinary and Educational Collaboration

    NASA Astrophysics Data System (ADS)

    Robinson, E. M.; Hoijarvi, K.; Falke, S.; Fialkowski, E.; Kieffer, M.; Husar, R. B.

    2008-05-01

    In the past, collaboration has taken place in tightly-knit workgroups whose members had direct connections to each other. Such collaboration was confined to small workgroups and person-to-person communication. Recent developments on the Internet foster virtual workgroups and organizations where dynamic, 'just-in-time' collaboration can take place on a much larger scale. The emergence of virtual workgroups has strongly influenced the interaction of international and interdisciplinary activities, as well as educational ones. In this paper we present an array of enabling tools and methods that incorporate new technologies including web services, software mashups, tag-based structuring and searching, and wikis for collaborative writing and content organization. Large monolithic, 'do-it-all' software tools are giving way to web service modules combined through service chaining. Application software can now be created using a Service Oriented Architecture (SOA). In the air quality community, data providers and users are distributed in space and time, creating barriers to data access. Exposing the data on the internet lessens these space and time barriers. The federated data system DataFed, developed at Washington University, accesses data from autonomous, distributed providers. Through data "wrappers", DataFed provides uniform and standards-based access services to heterogeneous, distributed data. Service orientation not only lowers the entry resistance for service providers, but also allows the creation of user-defined applications and/or mashups. For example, Google Earth's open API allowed many groups to mash their content with Google Earth. Ad hoc tagging gives a rich description of internet resources, but it has the disadvantage of providing a fuzzy schema. The semantic uniformity of internet resources can be improved by controlled tagging, which applies a consistent namespace and tag combinations to diverse objects. One example of this is the photo-sharing web application Flickr. Just like data, photos exposed through the internet can be reused in ways unknown and unanticipated by the provider. For air quality applications, Flickr allowed a rich collection of images of forest fire smoke, wind-blown dust, and haze events to be tagged with controlled tags and used for evaluating subtle features of those events. Wikis, originally used just for collaboratively writing and discussing documents, now also serve as social workflow managers. For air quality data, wikis provide the means to collaboratively create rich metadata. Wikis become a virtual meeting place to discuss ideas before a workshop or conference, display tagged internet resources, and collaboratively work on documents. Wikis are also useful in the classroom. For instance, in class projects the wiki displays harvested resources, maintains collaborative documents and discussions, and serves as the organizational memory for the project.

  8. Biographer: web-based editing and rendering of SBGN compliant biochemical networks

    PubMed Central

    Krause, Falko; Schulz, Marvin; Ripkens, Ben; Flöttmann, Max; Krantz, Marcus; Klipp, Edda; Handorf, Thomas

    2013-01-01

    Motivation: The rapid accumulation of knowledge in the field of Systems Biology during the past years requires advanced, but simple-to-use, methods for the visualization of information in a structured and easily comprehensible manner. Results: We have developed biographer, a web-based renderer and editor for reaction networks, which can be integrated as a library into tools dealing with network-related information. Our software enables visualizations based on the emerging standard Systems Biology Graphical Notation. It is able to import networks encoded in various formats such as SBML, SBGN-ML and jSBGN, a custom lightweight exchange format. The core package is implemented in HTML5, CSS and JavaScript and can be used within any kind of web-based project. It features interactive graph-editing tools and automatic graph layout algorithms. In addition, we provide a standalone graph editor and a web server, which contains enhanced features like web services for the import and export of models and visualizations in different formats. Availability: The biographer tool can be used at and downloaded from the web page http://biographer.biologie.hu-berlin.de/. The different software packages, including a server-independent version as well as a web server for Windows- and Linux-based systems, are available at http://code.google.com/p/biographer/ under the open-source license LGPL. Contact: edda.klipp@biologie.hu-berlin.de or handorf@physik.hu-berlin.de PMID:23574737

  9. Using a Web-based GIS to Teach Problem-based Science in High School and College

    NASA Astrophysics Data System (ADS)

    Metzger, E.; Lenkeit Meezan, , K. A.; Schmidt, C.; Taketa, R.; Carter, J.; Iverson, R.

    2008-12-01

    Foothill College has partnered with San Jose State University to bring GIS web mapping technology to the high school and college classroom. The project consists of two parts. In the first part, Foothill and San Jose State University have teamed up to offer classes on building and maintaining Web-based Geographic Information Systems (GIS). Web-based GIS such as Google Maps, MapQuest, and Yahoo Maps have become ubiquitous, and the skills to build and maintain these systems are in high demand from many employers. In the second part of the project, high school students will be able to learn about Web GIS as a real-world tool used by scientists. The students in the Foothill College/San Jose State class will build their Web GIS using scientific data related to the San Francisco/San Joaquin Delta region, with a focus on watersheds, biodiversity, and earthquake hazards. This project includes high-school-level curriculum development that will tie in to No Child Left Behind and National Curriculum Standards in both Science and Geography, and provide workshops for both pre- and in-service teachers in the use of Web GIS-driven course material in the high school classroom. The project will bring the work of professional scientists into any high school classroom with an internet connection, while simultaneously providing workforce training in high-demand, technology-based jobs.

  10. Search engine as a diagnostic tool in difficult immunological and allergologic cases: is Google useful?

    PubMed

    Lombardi, C; Griffiths, E; McLeod, B; Caviglia, A; Penagos, M

    2009-07-01

    Web search engines are an important tool in communication and diffusion of knowledge. Among these, Google appears to be the most popular one: in August 2008, it accounted for 87% of all web searches in the UK, compared with Yahoo's 3.3%. Google's value as a diagnostic guide in general medicine was recently reported. The aim of this comparative cross-sectional study was to evaluate whether searching Google with disease-related terms was effective in the identification and diagnosis of complex immunological and allergic cases. Forty-five case reports were randomly selected by an independent observer from peer-reviewed medical journals. Clinical data were presented separately to three investigators, blinded to the final diagnoses. Investigator A was a Consultant with an expert knowledge in Internal Medicine and Allergy (IM&A) and basic computing skills. Investigator B was a Registrar in IM&A. Investigator C was a Research Nurse. Both Investigators B and C were familiar with computers and search engines. For every clinical case presented, each investigator independently carried out an Internet search using Google to provide a final diagnosis. Their results were then compared with the published diagnoses. Correct diagnoses were provided in 30/45 (66%) cases, 39/45 (86%) cases, and in 29/45 (64%) cases by investigator A, B, and C, respectively. All of the three investigators achieved the correct diagnosis in 19 cases (42%), and all of them failed in two cases. This Google-based search was useful to identify an appropriate diagnosis in complex immunological and allergic cases. Computing skills may help to get better results.

  11. Exploring Google to Enhance Reference Services

    ERIC Educational Resources Information Center

    Jia, Peijun

    2011-01-01

    Google is currently recognized as the world's most powerful search engine. Google is so powerful and intuitive that one does not need to possess many skills to use it. However, Google is more than just simple search. For those who have special search skills and know Google's superior search features, it becomes an extraordinary tool. To understand…

  12. Using NASA's Giovanni Web Portal to Access and Visualize Satellite-based Earth Science Data in the Classroom

    NASA Technical Reports Server (NTRS)

    Lloyd, Steven; Acker, James G.; Prados, Ana I.; Leptoukh, Gregory G.

    2008-01-01

    One of the biggest obstacles for the average Earth science student today is locating and obtaining satellite-based remote sensing data sets in a format that is accessible and optimal for their data analysis needs. At the Goddard Earth Sciences Data and Information Services Center (GES-DISC) alone, on the order of hundreds of Terabytes of data are available for distribution to scientists, students and the general public. The single biggest and time-consuming hurdle for most students when they begin their study of the various datasets is how to slog through this mountain of data to arrive at a properly sub-setted and manageable data set to answer their science question(s). The GES DISC provides a number of tools for data access and visualization, including the Google-like Mirador search engine and the powerful GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) web interface.

  13. Wilber 3: A Python-Django Web Application For Acquiring Large-scale Event-oriented Seismic Data

    NASA Astrophysics Data System (ADS)

    Newman, R. L.; Clark, A.; Trabant, C. M.; Karstens, R.; Hutko, A. R.; Casey, R. E.; Ahern, T. K.

    2013-12-01

    Since 2001, the IRIS Data Management Center (DMC) WILBER II system has provided a convenient web-based interface for locating seismic data related to a particular event, and requesting a subset of that data for download. Since its launch, both the scale of available data and the technology of web-based applications have developed significantly. Wilber 3 is a ground-up redesign that leverages a number of public and open-source projects to provide an event-oriented data request interface with a high level of interactivity and scalability for multiple data types. Wilber 3 uses the IRIS/Federation of Digital Seismic Networks (FDSN) web services for event data, metadata, and time-series data. Combining a carefully optimized Google Map with the highly scalable SlickGrid data API, the Wilber 3 client-side interface can load tens of thousands of events or networks/stations in a single request, and provide instantly responsive browsing, sorting, and filtering of event and meta data in the web browser, without further reliance on the data service. The server-side of Wilber 3 is a Python-Django application, one of over a dozen developed in the last year at IRIS, whose common framework, components, and administrative overhead represent a massive savings in developer resources. Requests for assembled datasets, which may include thousands of data channels and gigabytes of data, are queued and executed using the Celery distributed Python task scheduler, giving Wilber 3 the ability to operate in parallel across a large number of nodes.
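
    The FDSN event web service that Wilber 3 builds on can also be queried directly over HTTP. A minimal sketch follows: the parameter names come from the FDSN web service specification, and format=text is an IRIS convenience output (XML is the specification default).

    ```python
    # Query the IRIS FDSN event service for large events in 2013. Parameter
    # names follow the FDSN web service specification; format=text is an
    # IRIS convenience output.
    import requests

    params = {
        "starttime": "2013-01-01",
        "endtime": "2013-12-31",
        "minmagnitude": 7.0,
        "format": "text",
    }
    resp = requests.get("http://service.iris.edu/fdsnws/event/1/query",
                        params=params, timeout=60)
    resp.raise_for_status()
    for line in resp.text.splitlines()[:5]:   # header row plus a few events
        print(line)
    ```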

  14. An assessment of the visibility of MeSH-indexed medical web catalogs through search engines.

    PubMed

    Zweigenbaum, P; Darmoni, S J; Grabar, N; Douyère, M; Benichou, J

    2002-01-01

    Manually indexed Internet health catalogs such as CliniWeb or CISMeF provide resources for retrieving high-quality health information. Users of these quality-controlled subject gateways are most often referred to them by general search engines such as Google, AltaVista, etc. This raises several questions, among which the following: what is the relative visibility of medical Internet catalogs through search engines? This study addresses this issue by measuring and comparing the visibility of six major, MeSH-indexed health catalogs through four different search engines (AltaVista, Google, Lycos, Northern Light) in two languages (English and French). Over half a million queries were sent to the search engines; for most of these search engines, according to our measures at the time the queries were sent, the most visible catalog for English MeSH terms was CliniWeb and the most visible one for French MeSH terms was CISMeF.

  15. [Electronic poison information management system].

    PubMed

    Kabata, Piotr; Waldman, Wojciech; Kaletha, Krystian; Sein Anand, Jacek

    2013-01-01

    We describe the deployment of an electronic toxicological information database at the poison control center of the Pomeranian Center of Toxicology. The system was based on Google Apps technology, by Google Inc., using electronic, web-based forms and data tables. During the first 6 months after deployment, we used it to archive 1471 poisoning cases, prepare monthly poisoning reports, and facilitate statistical analysis of the data. Use of the electronic database made the Poison Center's work much easier.

  16. Next-Gen Search Engines

    ERIC Educational Resources Information Center

    Gupta, Amardeep

    2005-01-01

    Current search engines--even the constantly surprising Google--seem unable to leap the next big barrier in search: the trillions of bytes of dynamically generated data created by individual web sites around the world, or what some researchers call the "deep web." The challenge now is not information overload, but information overlook.…

  17. Decision support system for emergency management of oil spill accidents in the Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Liubartseva, Svitlana; Coppini, Giovanni; Pinardi, Nadia; De Dominicis, Michela; Lecci, Rita; Turrisi, Giuseppe; Cretì, Sergio; Martinelli, Sara; Agostini, Paola; Marra, Palmalisa; Palermo, Francesco

    2016-08-01

    This paper presents an innovative web-based decision support system to facilitate emergency management in the case of oil spill accidents, called WITOIL (Where Is The Oil). The system can be applied to create a forecast of oil spill events, evaluate uncertainty of the predictions, and calculate hazards based on historical meteo-oceanographic datasets. To compute the oil transport and transformation, WITOIL uses the MEDSLIK-II oil spill model forced by operational meteo-oceanographic services. Results of the modeling are visualized through Google Maps. A special application for Android is designed to provide mobile access for competent authorities, technical and scientific institutions, and citizens.

  18. NOAA's Big Data Partnership and Applications to Ocean Sciences

    NASA Astrophysics Data System (ADS)

    Kearns, E. J.

    2016-02-01

    New opportunities for the distribution of NOAA's oceanographic and other environmental data are being explored through NOAA's Big Data Partnership (BDP) with Amazon Web Services, Google Cloud Platform, IBM, Microsoft Corp. and the Open Cloud Consortium. This partnership was established in April 2015 through Cooperative Research and Development Agreements, and is seeking new, financially self-sustaining collaborations between the Partners and the federal government centered upon NOAA's data and their potential value in the information marketplace. We will discuss emerging opportunities for collaboration among businesses and NOAA, progress in making NOAA's ocean data more widely accessible through the Partnerships, and applications based upon this access to NOAA's data.

  19. Students' Google Drive Intended Usage: A Case Study of Mathematics Courses in Bangkok University

    ERIC Educational Resources Information Center

    Prasertsith, Krisawan; Kanthawongs, Penjira; Limpachote, Tan

    2016-01-01

    Many technologies have changed the way individuals live and learn. Google Inc. has played significant roles in the business and academic worlds. Google Apps for Education and Google Classroom have been offered to higher institutions around the globe. Although large cloud service providers such as Google do not encrypt all their stored electronic data…

  20. Can people find patient decision aids on the Internet?

    PubMed

    Morris, Debra; Drake, Elizabeth; Saarimaki, Anton; Bennett, Carol; O'Connor, Annette

    2008-12-01

    To determine if people could find patient decision aids (PtDAs) on the Internet using the most popular general search engines. We chose five medical conditions for which English language PtDAs were available from at least three different developers. The search engines used were: Google (www.google.com), Yahoo! (www.yahoo.com), and MSN (www.msn.com). For each condition and search engine we ran six searches using a combination of search terms. We coded all non-sponsored Web pages that were linked from the first page of the search results. Most first page results linked to informational Web pages about the condition, only 16% linked to PtDAs. PtDAs were more readily found for the breast cancer surgery decision (our searches found seven of the nine developers). The searches using Yahoo and Google search engines were more likely to find PtDAs. The following combination of search terms: condition, treatment, decision (e.g. breast cancer surgery decision) was most successful across all search engines (29%). While some terms and search engines were more successful, few resulted in direct links to PtDAs. Finding PtDAs would be improved with use of standardized labelling, providing patients with specific Web site addresses or access to an independent PtDA clearinghouse.

  1. Facilitating Semantic Interoperability Among Ocean Data Systems: ODIP-R2R Student Outcomes

    NASA Astrophysics Data System (ADS)

    Stocks, K. I.; Chen, Y.; Shepherd, A.; Chandler, C. L.; Dockery, N.; Elya, J. L.; Smith, S. R.; Ferreira, R.; Fu, L.; Arko, R. A.

    2014-12-01

    With informatics providing an increasingly important set of tools for geoscientists, it is critical to train the next generation of scientists in information and data techniques. The NSF-supported Rolling Deck to Repository (R2R) Program works with the academic fleet community to routinely document, assess, and preserve the underway sensor data from U.S. research vessels. The Ocean Data Interoperability Platform (ODIP) is an EU-US-Australian collaboration fostering interoperability among regional e-infrastructures through workshops and joint prototype development. The need to align terminology between systems is a common challenge across all of the ODIP prototypes. Five R2R students were supported to address aspects of semantic interoperability within ODIP: (1) developing a vocabulary matching service that links terms from different vocabularies with similar concepts; the service implements the Google Refine reconciliation service interface so that users can leverage the Google Refine application as a friendly user interface while linking different vocabulary terms; (2) developing Resource Description Framework (RDF) resources that map Shipboard Automated Meteorological Oceanographic System (SAMOS) vocabularies to internationally served vocabularies; each SAMOS vocabulary term (data parameter and quality control flag) is described as an RDF resource page, enabling enhanced discoverability and retrieval of SAMOS data through searches based on parameter; (3) improving data retrieval and interoperability by exposing data and mapped vocabularies using Semantic Web technologies; we have collaborated with ODIP participating organizations to build a generalized data model that will be used to populate a SPARQL endpoint providing expressive querying over our data files (see the sketch below); (4) mapping local and regional vocabularies used by R2R to those used by ODIP partners, work described more fully in a companion poster; and (5) making published Linked Data web-developer-friendly with a RESTful service, achieved by defining a proxy layer on top of the existing SPARQL endpoint that translates HTTP requests into SPARQL queries and renders the returned results as required by the request sender using content negotiation, suffixes, and parameters.
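
    As a rough illustration of item (3), the sketch below issues a SPARQL query of the kind the proxy layer in item (5) would generate from an HTTP request. The endpoint URL is a hypothetical placeholder, and the use of skos:exactMatch as the mapping predicate is an assumption rather than the project's actual data model.

    ```python
    # Query a (hypothetical) vocabulary-mapping SPARQL endpoint. skos:exactMatch
    # is assumed here as the mapping predicate; the real data model may differ.
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://example.org/r2r/sparql")   # placeholder URL
    sparql.setQuery("""
        PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
        SELECT ?local ?remote
        WHERE { ?local skos:exactMatch ?remote . }
        LIMIT 10
    """)
    sparql.setReturnFormat(JSON)

    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["local"]["value"], "->", row["remote"]["value"])
    ```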

  2. QuakeSim: a Web Service Environment for Productive Investigations with Earth Surface Sensor Data

    NASA Astrophysics Data System (ADS)

    Parker, J. W.; Donnellan, A.; Granat, R. A.; Lyzenga, G. A.; Glasscoe, M. T.; McLeod, D.; Al-Ghanmi, R.; Pierce, M.; Fox, G.; Grant Ludwig, L.; Rundle, J. B.

    2011-12-01

    The QuakeSim science gateway environment includes a visually rich portal interface, web service access to data and data processing operations, and the QuakeTables ontology-based database of fault models and sensor data. The integrated tools and services are designed to assist investigators by covering the entire earthquake cycle of strain accumulation and release. The Web interface now includes Drupal-based access to diverse and changing content, with the new ability to access data and data processing directly from the public page, as well as the traditional project management areas that require password access. The system is designed to make initial browsing of fault models and deformation data particularly engaging for new users. Popular data and data processing include GPS time series with data mining techniques to find anomalies in time and space, experimental forecasting methods based on catalogue seismicity, faulted deformation models (both half-space and finite element), and model-based inversion of sensor data. The fault models include the CGS and UCERF 2.0 faults of California and are easily augmented with self-consistent fault models from other regions. The QuakeTables deformation data include the comprehensive set of UAVSAR interferograms as well as a growing collection of satellite InSAR data. Fault interaction simulations are also being incorporated into the web environment based on Virtual California. A sample usage scenario is presented which follows an investigation of UAVSAR data from viewing as an overlay in Google Maps, to selection of an area of interest via a polygon tool, to fast extraction of the relevant correlation and phase information from large data files, to a model inversion of fault slip followed by calculation and display of a synthetic model interferogram.

  3. Toward Exposing Timing-Based Probing Attacks in Web Applications †

    PubMed Central

    Mao, Jian; Chen, Yue; Shi, Futian; Jia, Yaoqi; Liang, Zhenkai

    2017-01-01

    Web applications have become the foundation of many types of systems, ranging from cloud services to Internet of Things (IoT) systems. Due to the large amount of sensitive data processed by web applications, user privacy emerges as a major concern in web security. Existing protection mechanisms in modern browsers, e.g., the same origin policy, prevent the users’ browsing information on one website from being directly accessed by another website. However, web applications executed in the same browser share the same runtime environment. Such shared states provide side channels for malicious websites to indirectly figure out the information of other origins. Timing is a classic side channel and the root cause of many recent attacks, which rely on the variations in the time taken by the systems to process different inputs. In this paper, we propose an approach to expose the timing-based probing attacks in web applications. It monitors the browser behaviors and identifies anomalous timing behaviors to detect browser probing attacks. We have prototyped our system in the Google Chrome browser and evaluated the effectiveness of our approach by using known probing techniques. We have applied our approach on a large number of top Alexa sites and reported the suspicious behavior patterns with corresponding analysis results. Our theoretical analysis illustrates that the effectiveness of the timing-based probing attacks is dramatically limited by our approach. PMID:28245610

  4. Toward Exposing Timing-Based Probing Attacks in Web Applications.

    PubMed

    Mao, Jian; Chen, Yue; Shi, Futian; Jia, Yaoqi; Liang, Zhenkai

    2017-02-25

    Web applications have become the foundation of many types of systems, ranging from cloud services to Internet of Things (IoT) systems. Due to the large amount of sensitive data processed by web applications, user privacy emerges as a major concern in web security. Existing protection mechanisms in modern browsers, e.g., the same origin policy, prevent the users' browsing information on one website from being directly accessed by another website. However, web applications executed in the same browser share the same runtime environment. Such shared states provide side channels for malicious websites to indirectly figure out the information of other origins. Timing is a classic side channel and the root cause of many recent attacks, which rely on the variations in the time taken by the systems to process different inputs. In this paper, we propose an approach to expose the timing-based probing attacks in web applications. It monitors the browser behaviors and identifies anomalous timing behaviors to detect browser probing attacks. We have prototyped our system in the Google Chrome browser and evaluated the effectiveness of our approach by using known probing techniques. We have applied our approach on a large number of top Alexa sites and reported the suspicious behavior patterns with corresponding analysis results. Our theoretical analysis illustrates that the effectiveness of the timing-based probing attacks is dramatically limited by our approach.
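
    The detection idea in the two records above (monitor browser behaviors and flag anomalous timings) can be caricatured with a simple statistical filter. The sketch below flags timing measurements far outside a baseline distribution; the z-score threshold and sample values are illustrative assumptions, not the authors' method.

    ```python
    # Flag a timing measurement that falls far outside a baseline
    # distribution. Threshold and sample values are illustrative
    # assumptions, not the detection rule used by the authors.
    import statistics

    def is_anomalous(baseline_ms, candidate_ms, z_threshold=4.0):
        """Return True if candidate_ms is a timing outlier vs. the baseline."""
        mean = statistics.mean(baseline_ms)
        stdev = statistics.stdev(baseline_ms)
        if stdev == 0:
            return candidate_ms != mean
        return abs(candidate_ms - mean) / stdev > z_threshold

    baseline = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2]   # normal load times (ms)
    print(is_anomalous(baseline, 12.3))   # False: within normal variation
    print(is_anomalous(baseline, 85.0))   # True: probe-like timing anomaly
    ```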

  5. Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with the development of an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze, and annotate the data. The infrastructure and Portal are based on a web service-oriented architecture. We designed the system to support solar system bodies in general, including asteroids, Earth, and the planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware, and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch displays with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). Its data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT), an open-source data catalog, archive, file management, and data grid framework; OpenSSO, an open-source access management and federation platform; Solr, an open-source enterprise search platform; Redmine, an open-source project collaboration and management framework; GDAL, an open-source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.

  6. Mapping and Modeling Web Portal to Advance Global Monitoring and Climate Research

    NASA Astrophysics Data System (ADS)

    Chang, G.; Malhotra, S.; Bui, B.; Sadaqathulla, S.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Rodriguez, L.; Law, E.

    2011-12-01

    Today, the principal investigators of NASA Earth Science missions develop their own software to manipulate, visualize, and analyze the data collected from Earth, space, and airborne observation instruments. There is very little, if any, collaboration among these principal investigators due to the lack of collaborative tools, which would allow these scientists to share data and results. At NASA's Jet Propulsion Laboratory (JPL), under the Lunar Mapping and Modeling Project (LMMP), we have built a web portal that exposes a set of common services allowing users to search, visualize, subset, and download lunar science data. Users also have access to a set of tools that visualize, analyze, and annotate the data. These services are developed according to industry standards for data access and manipulation, such as REST and Open Geospatial Consortium (OGC) web services. As a result, users can access the datasets through custom-written applications or off-the-shelf applications such as Google Earth. Even though it is currently used to store and process lunar data, this web portal infrastructure has been designed to support other solar system bodies such as asteroids and planets, including Earth. The infrastructure uses a combination of custom, commercial, and open-source software as well as off-the-shelf hardware and pay-by-use cloud computing services. The use of standardized web service interfaces facilitates platform- and application-independent access to the services and data. For instance, we have software clients for the LMMP portal that provide a rich browsing and analysis experience from a variety of platforms including iOS and Android mobile platforms and large-screen multi-touch displays with 3-D terrain viewing functions. The service-oriented architecture and design principles utilized in the implementation of the portal lend themselves to reuse and scaling and could naturally be extended to include a collaborative environment that enables scientists and principal investigators to share their research and analysis seamlessly. In addition, this extension will allow users to easily share their tools and data, and to enrich their mapping and analysis experiences. In this talk, we will describe the advanced data management and portal technologies used to power this collaborative environment. We will further illustrate how this environment can enable, enhance, and advance global monitoring and climate research.

  7. Focus Group in Community Mental Health Research: Need for Adaption.

    PubMed

    Zupančič, Vesna; Pahor, Majda; Kogovšek, Tina

    2018-04-27

    The article presents an analysis of the use of focus groups in researching community mental health service users, starting with the reasons for using them, their implementation in mental health service user research, and the adaptations of focus group use when researching the experiences of users. Based on personal research experience and a review of scientific publications in the Google Scholar, Web of Science, ProQuest, EBSCOhost, and Scopus databases, 20 articles published between 2010 and 2016 were selected for targeted content analysis. A checklist for reporting on the use of focus groups with community mental health service users, aiming to improve comparability, verifiability, and validity, was developed. Adaptations of the implementation of focus groups in relation to participants' characteristics were suggested. Focus groups are not only useful as a scientific research technique, but also for ensuring service users' participation in decision-making in community mental health and for evaluating the quality of the mental health system and services.

  8. Start Your Search Engines. Part One: Taming Google--and Other Tips to Master Web Searches

    ERIC Educational Resources Information Center

    Adam, Anna; Mowers, Helen

    2008-01-01

    There are a lot of useful tools on the Web, all those social applications, and the like. Still most people go online for one thing--to perform a basic search. For most fact-finding missions, the Web is there. But--as media specialists well know--the sheer wealth of online information can hamper efforts to focus on a few reliable references.…

  9. Virtual Field Trips: Using Google Maps to Support Online Learning and Teaching of the History of Astronomy

    ERIC Educational Resources Information Center

    Fluke, Christopher J.

    2009-01-01

    I report on a pilot study on the use of Google Maps to provide virtual field trips as a component of a wholly online graduate course on the history of astronomy. The Astronomical Tourist Web site (http://astronomy.swin.edu.au/sao/tourist), themed around the role that specific locations on Earth have contributed to the development of astronomical…

  10. Where Did Google Get Its Value?

    ERIC Educational Resources Information Center

    Caufield, James

    2005-01-01

    Google's extraordinary success is usually attributed to innovative technology and new business models. By contrast, this paper argues that Google's success is mostly due to its adoption of certain library values. First, Google has refused to adopt the standard practices of the search engine business, practices that compromised service to the user…

  11. Podcast 1 2 3

    ERIC Educational Resources Information Center

    Griffey, Jason

    2007-01-01

    The University of Tennessee at Chattanooga (UTC) offers student workshops that range from Cool New Web Stuff (what is on the web that can help make research, or just plain life, easier) to How To Use Google Scholar. These workshops are brilliant fodder for podcasting. In fact, the initial idea for its podcast project came from a student plagiarism…

  12. Using Web Speech Technology with Language Learning Applications

    ERIC Educational Resources Information Center

    Daniels, Paul

    2015-01-01

    In this article, the author presents the history of human-to-computer interaction based upon the design of sophisticated computerized speech recognition algorithms. Advancements such as the arrival of cloud-based computing and software like Google's Web Speech API allows anyone with an Internet connection and Chrome browser to take advantage of…

  13. Tags Help Make Libraries Del.icio.us: Social Bookmarking and Tagging Boost Participation

    ERIC Educational Resources Information Center

    Rethlefsen, Melissa L.

    2007-01-01

    Traditional library web products, whether online public access catalogs, library databases, or even library web sites, have long been rigidly controlled and difficult to use. Patrons regularly prefer Google's simple interface. Now social bookmarking and tagging tools help librarians bridge the gap between the library's need to offer authoritative,…

  14. Web Analytics Reveal User Behavior: TTU Libraries' Experience with Google Analytics

    ERIC Educational Resources Information Center

    Barba, Ian; Cassidy, Ryan; De Leon, Esther; Williams, B. Justin

    2013-01-01

    Proper planning and assessment surveys of projects for academic library Web sites will not always be predictive of real world use, no matter how many responses they might receive. In this case, multiple-phase development, librarian focus groups, and patron surveys performed before implementation of such a project inaccurately overrated utility and…

  16. Teaching Lab Science Courses Online: Resources for Best Practices, Tools, and Technology

    ERIC Educational Resources Information Center

    Jeschofnig, Linda; Jeschofnig, Peter

    2011-01-01

    "Teaching Lab Science Courses Online" is a practical resource for educators developing and teaching fully online lab science courses. First, it provides guidance for using learning management systems and other web 2.0 technologies such as video presentations, discussion boards, Google apps, Skype, video/web conferencing, and social media…

  17. A systematic narrative review of consumer-directed care for older people: implications for model development.

    PubMed

    Ottmann, Goetz; Allen, Jacqui; Feldman, Peter

    2013-11-01

    Consumer-directed care is increasingly becoming a mainstream option in community-based aged care. However, a systematic review describing how the current evaluation research translates into practice has not been published to date. This review aimed to systematically establish an evidence base of user preferences for, and satisfaction with, services associated with consumer-directed care programmes for older people. Twelve databases were searched, including MedLine, BioMed Central, Cinahl, Expanded Academic ASAP, PsychInfo, ProQuest, Age Line, Science Direct, Social Citation Index, Sociological Abstracts, Web of Science and the Cochrane Library. Google Scholar and Google were also searched. Eligible studies were those reporting on choice, user preferences and service satisfaction outcomes regarding a programme or model of home-based care in the United States or United Kingdom. This systematic narrative review retrieved literature published from January 1992 to August 2011. A total of 277 references were identified. Of these, 17 met the selection criteria and were reviewed. Findings indicate that older people report varying preferences for consumer-directed care, with some demonstrating limited interest. Clients and carers reported good service satisfaction. However, research comparing user preferences across countries or investigating how ecological factors shape user preferences has received limited attention. Policy-makers and practitioners need to carefully consider the diverse contexts, needs and preferences of older adults in adopting consumer-directed care approaches in community aged care. The review calls for the development of consumer-directed care programmes offering a broad range of options that allow for personalisation and greater control over services without necessarily transferring administrative responsibilities to service users. Review findings suggest that consumer-directed care approaches have the potential to empower older people. © 2013 Blackwell Publishing Ltd.

  18. Searching for cochlear implant information on the internet maze: implications for parents and professionals.

    PubMed

    Zaidman-Zait, Anat; Jamieson, Janet R

    2004-01-01

    The present study has three purposes: (a) to determine who disseminates information on cochlear implants on the Web; (b) to describe a representative sample of Web sites that disseminate information on cochlear implants, with a focus on the content topics and their relevance to parents of deaf children; and (c) to discuss the practical issues of Web-based information and its implications for professionals working with parents of deaf children. Using the terms "cochlear implants" and "children," the first 10 sites generated by the four most popular search engines (Google, Yahoo, Microsoft's MSN, and America Online) at two points in time were selected for analysis, resulting in a sample of 31 Web sites. The majority of Web sites represented medically oriented academic departments and government organizations, although a wide variety of other sources containing information about cochlear implants were also located. Qualitative analysis revealed that the content tended to fall into eight categories; however, the important issues of educational concerns, habilitation following surgery, and communication methods were either addressed minimally or neglected completely. Using analytical tools that had been developed to evaluate "user friendliness" in other domains, each Web site was assessed for its stability, service/design features and ease of use. In general, wide variability was noted across the Web sites for each of these factors. The strong recommendation is made that professionals understand and enhance their knowledge of both the advantages and limitations of incorporating the new technology into their work with parents.

  19. Google Maps for Crowdsourced Emergency Routing

    NASA Astrophysics Data System (ADS)

    Nedkov, S.; Zlatanova, S.

    2012-08-01

    Gathering infrastructure data in emergency situations is challenging. The areas affected by a disaster are often large and the needed observations numerous. Spaceborne remote sensing techniques cover large areas, but they are of limited use as their field of view may be blocked by clouds, smoke, buildings, highways, etc. Remote sensing products furthermore require specialists to collect and analyze the data. This contrasts with the nature of the damage detection problem: almost everyone is capable of observing whether a street is usable or not. The crowd is fit for solving these challenges: its members are numerous, they are willing to help, and they are often in the vicinity of the disaster, thereby forming a highly dispersed sensor network. This paper proposes and implements a small WebGIS application for performing shortest path calculations based on crowdsourced information about infrastructure health. The application is built on top of Google Maps and uses its routing service to calculate the shortest distance between two locations. Impassable areas are indicated on a map by people performing in-situ observations on a mobile device, and by users on a desktop machine who consult a multitude of information sources.
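
    A minimal sketch of the kind of routing request such an application makes, here using Python's requests library against the public Google Directions web service (the paper itself calls the Maps routing service from JavaScript, so the endpoint choice and placeholder API key are illustrative assumptions, not the authors' code):

      # Hedged sketch: query the Google Directions web service for a route.
      # "YOUR_API_KEY" is a placeholder, not a value from the paper.
      import requests

      def shortest_route(origin, destination, api_key):
          resp = requests.get(
              "https://maps.googleapis.com/maps/api/directions/json",
              params={"origin": origin, "destination": destination, "key": api_key},
              timeout=10,
          )
          leg = resp.json()["routes"][0]["legs"][0]
          return leg["distance"]["text"], leg["duration"]["text"]

      # e.g. shortest_route("Delft, NL", "Rotterdam, NL", "YOUR_API_KEY")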

  20. Interacting With A Near Real-Time Urban Digital Watershed Using Emerging Geospatial Web Technologies

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Fazio, D. J.; Abdelzaher, T.; Minsker, B.

    2007-12-01

    The value of real-time hydrologic data dissemination, including river stage, streamflow, and precipitation, for operational stormwater management efforts is particularly high for communities where flash flooding is common and costly. Ideally, such data would be presented within a watershed-scale geospatial context to portray a holistic view of the watershed. Local hydrologic sensor networks usually lack comprehensive integration with sensor networks managed by other agencies sharing the same watershed, due to administrative, political, but mostly technical barriers. Recent efforts on providing unified access to hydrological data have concentrated on creating new SOAP-based web services and common data formats (e.g., WaterML and the Observation Data Model) for users to access the data (e.g., HIS and HydroSeek). Geospatial Web technology, including OGC Sensor Web Enablement (SWE), GeoRSS, geotags, geospatial browsers such as Google Earth and Microsoft Virtual Earth, and other location-based service tools, makes it possible to interact with a digital watershed in near-real-time. OGC SWE proposes a revolutionary concept towards web-connected/controllable sensor networks. However, these efforts have not provided the capability to allow dynamic data integration/fusion among heterogeneous sources, data filtering, and support for workflows or domain-specific applications where both push and pull modes of retrieving data may be needed. We propose a lightweight integration framework that extends SWE with an open source Enterprise Service Bus (e.g., Mule) as a backbone component to dynamically transform, transport, and integrate both heterogeneous sensor data sources and simulation model outputs. We will report our progress on building such a framework, in which multi-agency sensor data and hydro-model outputs (with map layers) will be integrated and disseminated in a geospatial browser (e.g., Microsoft Virtual Earth). This is a collaborative project among NCSA, the USGS Illinois Water Science Center, and the Computer Science Department at UIUC, funded by the Adaptive Environmental Infrastructure Sensing and Information Systems initiative at UIUC.

  1. A Java API for working with PubChem datasets

    PubMed Central

    Southern, Mark R.; Griffin, Patrick R.

    2011-01-01

    Summary: PubChem is a public repository of chemical structures and associated biological activities. The PubChem BioAssay database contains assay descriptions, conditions and readouts and biological screening results that have been submitted by the biomedical research community. The PubChem web site and Power User Gateway (PUG) web service allow users to interact with the data and raw files are available via FTP. These resources are helpful to many but there can also be great benefit by using a software API to manipulate the data. Here, we describe a Java API with entity objects mapped to the PubChem Schema and with wrapper functions for calling the NCBI eUtilities and PubChem PUG web services. PubChem BioAssays and associated chemical compounds can then be queried and manipulated in a local relational database. Features include chemical structure searching and generation and display of curve fits from stored dose–response experiments, something that is not yet available within PubChem itself. The aim is to provide researchers with a fast, consistent, queryable local resource from which to manipulate PubChem BioAssays in a database agnostic manner. It is not intended as an end user tool but to provide a platform for further automation and tools development. Availability: http://code.google.com/p/pubchemdb Contact: southern@scripps.edu PMID:21216779
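
    For readers who want a quick scripted query rather than the full Java API, a minimal Python sketch against PubChem's related PUG REST interface (the paper wraps the older PUG web service; the REST endpoint below is the standard public one, and aspirin's CID 2244 is just an example):

      # Hedged sketch: fetch basic compound properties from PubChem PUG REST.
      import requests

      BASE = "https://pubchem.ncbi.nlm.nih.gov/rest/pug"

      def compound_properties(cid):
          url = (f"{BASE}/compound/cid/{cid}"
                 "/property/MolecularFormula,MolecularWeight/JSON")
          data = requests.get(url, timeout=10).json()
          return data["PropertyTable"]["Properties"][0]

      print(compound_properties(2244))  # aspirin: {'CID': 2244, 'MolecularFormula': 'C9H8O4', ...}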

  2. Google searches help with diagnosis in dermatology.

    PubMed

    Amri, Montassar; Feroz, Kaliyadan

    2014-01-01

    Several previous studies have tried to assess the usefulness of Google search as a diagnostic aid. The results were discordant and have led to controversies. To investigate how often Google search is helpful in reaching correct diagnoses in dermatology. Two fifth-year students (A and B) and one demonstrator (C) participated as investigators in this study. Twenty-five diagnostic dermatological cases were selected from all the clinical cases published in the Web-only Images in Clinical Medicine from March 2005 to November 2009. The main outcome measure was to compare the number of correct diagnoses provided by the investigators without, and with, Google search. Investigator A gave correct diagnoses in 9/25 (36%) cases without Google search; his diagnostic success after Google search was 18/25 (72%). Investigator B's results were 11/25 (44%) correct diagnoses without Google search, and 19/25 (76%) after this search. For investigator C, the results were 12/25 (48%) without Google search, and 18/25 (72%) after the use of this tool. Thus, the total of correct diagnoses provided by the three investigators was 32 (42.6%) without Google search, and 55 (73.3%) when using this facility. The difference between the total number of correct diagnoses given by the three investigators without, and with, Google search was statistically significant (p = 0.0002). In the light of our study, Google search appears to be an interesting diagnostic aid in dermatology. However, we emphasize that diagnosis is primarily an art based on clinical skills and experience.

  3. Successful participant recruitment strategies for an online smokeless tobacco cessation program.

    PubMed

    Gordon, Judith S; Akers, Laura; Severson, Herbert H; Danaher, Brian G; Boles, Shawn M

    2006-12-01

    An estimated 22% of Americans currently use smokeless tobacco (ST). Most live in small towns and rural areas that offer few ST cessation resources. Approximately 94 million Americans use the Internet for health-related information, and on-line access is growing among lower-income and less-educated groups. As part of a randomized clinical trial to assess the reach and effectiveness of Web-based programs for delivering an ST cessation intervention, the authors developed and evaluated several methods for overcoming the recruitment challenges associated with Web-based research. This report describes and evaluates these methods. Participants were recruited through: (a) Thematic promotional "releases" to print and broadcast media, (b) Google ads, (c) placement of a link on other Web sites, (d) limited purchase of paid advertising, (e) direct mailings to ST users, and (f) targeted mailings to health care and tobacco control professionals. Combined recruitment activities resulted in more than 23,500 hits on our recruitment website from distinct IP addresses over 15 months, which yielded 2,523 eligible ST users who completed the registration process and enrolled in the study. Self-reports revealed that at least 1,276 (50.6%) of these participants were recruited via mailings, 874 (34.6%) from Google ads or via search engines or links on another Web site, and 373 (14.8%) from all other methods combined. The use of thematic mailings is novel in research settings. Recruitment of study participants went quickly and smoothly. Google ads and mailings to media outlets were the methods that recruited the highest number of participants.

  4. Online transit trip planner for small agencies using Google Transit : final deployment package.

    DOT National Transportation Integrated Search

    2011-09-01

    Google Transit is a public transportation trip planner that enables travelers to obtain information regarding available transit services between a : given origin and given destination. While transit agencies can publish their service information onto...
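
    Agencies publish their service information to Google Transit in the General Transit Feed Specification (GTFS), a set of CSV files. A minimal sketch of reading the required stops.txt file from such a feed (the file path is a placeholder, not from the report):

      # Hedged sketch: load GTFS stops into a dict keyed by stop_id.
      import csv

      def load_stops(path="gtfs/stops.txt"):  # placeholder path
          with open(path, newline="", encoding="utf-8") as f:
              return {row["stop_id"]: (row["stop_name"],
                                       float(row["stop_lat"]),
                                       float(row["stop_lon"]))
                      for row in csv.DictReader(f)}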

  5. Pre-Service Teachers' Opinions on Cloud Supported Social Network

    ERIC Educational Resources Information Center

    Ozcan, Seher; Gokcearslan, Sahin; Kukul, Volkan

    2015-01-01

    Pre-service teachers are expected to make effective use, in their lessons, of new technologies such as Google+, which facilitates making contact, sharing in certain environments, and working collaboratively with the help of cloud support. This study aims to examine pre-service teachers' opinions regarding the use of Google+ to support lesson activities. In this…

  6. There comes a baby! What should I do? Smartphones' pregnancy-related applications: A web-based overview.

    PubMed

    Bert, Fabrizio; Passi, Stefano; Scaioli, Giacomo; Gualano, Maria R; Siliquini, Roberta

    2016-09-01

    Our article aims to give an overview of the most mentioned smartphone pregnancy-related applications (apps). A string of selected keywords was entered both in a general search engine (Google®) and in PubMed. While PubMed returned no pertinent results, a total of 370 web pages were found on Google®, and 146 of them were selected. All the pregnancy-related apps cited at least eight times were included. Information about each app's producer, price, contents, privacy policy, and the presence of a scientific board was collected. Finally, nine apps were considered. The majority of them were free and available in the two main online markets (the Apple® App Store and Android® Google Play). Five apps presented a privacy policy statement, while a scientific board was mentioned in only three. Further studies are needed in order to deepen the knowledge regarding the main risks of these devices, such as privacy loss, content control concerns, the digital divide and a potential reduction in humanization. © The Author(s) 2015.

  7. Implementing Web 2.0 Tools in the Classroom: Four Teachers' Accounts

    ERIC Educational Resources Information Center

    Kovalik, Cindy; Kuo, Chia-Ling; Cummins, Megan; Dipzinski, Erin; Joseph, Paula; Laskey, Stephanie

    2014-01-01

    In this paper, four teachers shared their experiences using the following free Web 2.0 tools with their students: Jing, Wix, Google Sites, and Blogger. The teachers found that students reacted positively to lessons in which these tools were used, and also noted improvements they could make when using them in the future.

  8. Development of Web-Based Learning Application for Generation Z

    ERIC Educational Resources Information Center

    Hariadi, Bambang; Dewiyani Sunarto, M. J.; Sudarmaningtyas, Pantjawati

    2016-01-01

    This study aimed to develop a web-based learning application as a form of learning revolution. This revolution includes the provision of unlimited teaching materials and real-time class organization, and is not limited by time or place. The implementation of this application is in the form of hybrid learning by using Google Apps for…

  9. Collaborative Writing with Web 2.0 Technologies: Education Students' Perceptions

    ERIC Educational Resources Information Center

    Brodahl, Cornelia; Hadjerrouit, Said; Hansen, Nils Kristian

    2011-01-01

    Web 2.0 technologies are becoming popular in teaching and learning environments. Among them several online collaborative writing tools, like wikis and blogs, have been integrated into educational settings. Research has been carried out on a wide range of subjects related to wikis, while other, comparable tools like Google Docs and EtherPad remain…

  10. 76 FR 20054 - Self-Regulatory Organizations; the NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-11

    ... over 50,000,000 investors on Web sites operated by Google, Interactive Data, and Dow Jones, among... systems (``ATSs''), including dark pools and electronic communication networks (``ECNs''). Each SRO market..., Attain, TracECN, BATS Trading and Direct Edge. Today, BATS publishes its data at no charge on its Web...

  11. A profile of anti-vaccination lobbying on the South African internet, 2011-2013.

    PubMed

    Burnett, Rosemary Joyce; von Gogh, Lauren Jennifer; Moloi, Molelekeng H; François, Guido

    2015-11-01

    The South African Vaccination and Immunisation Centre receives many requests to explain the validity of internet-based anti-vaccination claims. Previous global studies on internet-based anti-vaccination lobbying had not identified anti-vaccination web pages originating in South Africa (SA). To characterise SA internet-based anti-vaccination lobbying. In 2011, searches for anti-vaccination content were performed using Google, Yahoo and MSN-Bing, limited to English-language SA web pages. Content analysis was performed on web pages expressing anti-vaccination sentiment about infant vaccination. This was repeated in 2012 and 2013 using Google, with the first 700 web pages per search being analysed. Blogs/forums, articles and e-shops constituted 40.3%, 55.2% and 4.5% of web pages, respectively. Authors were lay people (63.5%), complementary/alternative medicine (CAM) practitioners (23.1%), medical professionals practising CAM (7.7%) and medical professionals practising only allopathic medicine (5.8%). Advertisements appeared on 55.2% of web pages. Of these, 67.6% were sponsored by or linked to organisations with financial interests in discrediting vaccines, with 80.0% and 24.0% of web pages sponsored by these organisations claiming respectively that vaccines are ineffective and that vaccination is profit driven. The vast majority of web pages (92.5%) claimed that vaccines are not safe, and 77.6% of anti-vaccination claims originated from the USA. South Africans are creating web pages or blogs for local anti-vaccination lobbying. Research is needed to understand what influence internet-based anti-vaccination lobbying has on the uptake of infant vaccination in SA.

  12. An assessment of the visibility of MeSH-indexed medical web catalogs through search engines.

    PubMed Central

    Zweigenbaum, P.; Darmoni, S. J.; Grabar, N.; Douyère, M.; Benichou, J.

    2002-01-01

    Manually indexed Internet health catalogs such as CliniWeb or CISMeF provide resources for retrieving high-quality health information. Users of these quality-controlled subject gateways are most often referred to them by general search engines such as Google, AltaVista, etc. This raises several questions, among which the following: what is the relative visibility of medical Internet catalogs through search engines? This study addresses this issue by measuring and comparing the visibility of six major, MeSH-indexed health catalogs through four different search engines (AltaVista, Google, Lycos, Northern Light) in two languages (English and French). Over half a million queries were sent to the search engines; for most of these search engines, according to our measures at the time the queries were sent, the most visible catalog for English MeSH terms was CliniWeb and the most visible one for French MeSH terms was CISMeF. PMID:12463965

  13. Greater freedom of speech on Web 2.0 correlates with dominance of views linking vaccines to autism.

    PubMed

    Venkatraman, Anand; Garg, Neetika; Kumar, Nilay

    2015-03-17

    It is suspected that Web 2.0 web sites, with a lot of user-generated content, often support viewpoints that link autism to vaccines. We assessed the prevalence of the views supporting a link between vaccines and autism online by comparing YouTube, Google and Wikipedia with PubMed. Freedom of speech is highest on YouTube and progressively decreases for the others. Support for a link between vaccines and autism is most prominent on YouTube, followed by Google search results. It is far lower on Wikipedia and PubMed. Anti-vaccine activists use scientific arguments, certified physicians and official-sounding titles to gain credibility, while also leaning on celebrity endorsement and personalized stories. Online communities with greater freedom of speech lead to a dominance of anti-vaccine voices. Moderation of content by editors can offer balance between free expression and factual accuracy. Health communicators and medical institutions need to step up their activity on the Internet. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Knowledge-based personalized search engine for the Web-based Human Musculoskeletal System Resources (HMSR) in biomechanics.

    PubMed

    Dao, Tien Tuan; Hoang, Tuan Nha; Ta, Xuan Hien; Tho, Marie Christine Ho Ba

    2013-02-01

    Human musculoskeletal system resources of the human body are valuable for learning and medical purposes. Internet-based information from conventional search engines such as Google or Yahoo cannot respond to the need for useful, accurate, reliable and good-quality human musculoskeletal resources related to medical processes, pathological knowledge and practical expertise. In the present work, an advanced knowledge-based personalized search engine was developed. Our search engine was based on a client-server, multi-layer, multi-agent architecture and the principle of semantic web services, acquiring dynamically accurate and reliable HMSR information through a semantic processing and visualization approach. A security-enhanced mechanism was applied to protect the medical information. A multi-agent crawler was implemented to develop a content-based database of HMSR information. A new semantic-based PageRank score, with related mathematical formulas, was also defined and implemented. As a result, semantic web service descriptions were presented in OWL, WSDL and OWL-S formats. Operational scenarios with related web-based interfaces for personal computers and mobile devices were presented and analyzed. A functional comparison between our knowledge-based search engine, a conventional search engine and a semantic search engine showed the originality and the robustness of our knowledge-based personalized search engine. In fact, our knowledge-based personalized search engine allows different users, such as orthopedic patients and experts, healthcare system managers or medical students, to access remotely useful, accurate, reliable and good-quality HMSR information for their learning and medical purposes. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. The National 3-D Geospatial Information Web-Based Service of Korea

    NASA Astrophysics Data System (ADS)

    Lee, D. T.; Kim, C. W.; Kang, I. G.

    2013-09-01

    3D geospatial information systems should provide efficient spatial analysis tools and be able to use all capabilities of the third dimension, including visualization. Currently, many human activities are taking steps toward the third dimension, such as land use, urban and landscape planning, cadastre, environmental monitoring, transportation monitoring, the real estate market, and military applications. To reflect this trend, the Korean government has started to construct 3D geospatial data and a service platform. Since geospatial information was introduced in Korea, the construction of geospatial information (3D geospatial information, digital maps, aerial photographs, ortho photographs, etc.) has been led by the central government. The purpose of this study is to introduce the Korean government-led 3D geospatial information web-based service for people interested in this industry; we introduce not only the present conditions of the constructed 3D geospatial data but also methodologies and applications of 3D geospatial information. About 15% (about 3,278.74 km2) of the total urban area's 3D geospatial data was constructed by the National Geographic Information Institute (NGII) of Korea from 2005 to 2012. In particular, level of detail (LOD) 4 data, photo-realistic textured 3D models with corresponding ortho photographs, were constructed for six metropolitan cities and Dokdo (an island belonging to Korea) in 2012. In this paper, we describe the composition and infrastructure of the web-based 3D map service system and compare the V-World service with Google Earth. We also present Open API-based service cases and discuss the protection of location privacy when constructing 3D indoor building models. In order to prevent an invasion of privacy, we applied image blurring, elimination and camouflage. The importance of public-private cooperation and an advanced geospatial information policy is emphasized in Korea. Thus, progress in the spatial information industry of Korea is expected in the near future.

  16. Google in a Quantum Network

    PubMed Central

    Paparo, G. D.; Martin-Delgado, M. A.

    2012-01-01

    We introduce the characterization of a class of quantum PageRank algorithms in a scenario in which some kind of quantum network is realizable out of the current classical internet web, but no quantum computer is yet available. This class represents a quantization of the PageRank protocol currently employed to list web pages according to their importance. We have found an instance of this class of quantum protocols that outperforms its classical counterpart and may break the classical hierarchy of web pages depending on the topology of the web. PMID:22685626
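
    For context, the classical protocol being quantized is the PageRank power iteration; a toy Python version (not the paper's quantum algorithm) is:

      # Toy classical PageRank by power iteration on the "Google matrix".
      import numpy as np

      def pagerank(adj, d=0.85, tol=1e-9):
          """adj[i][j] = 1 if page j links to page i."""
          A = np.array(adj, dtype=float)
          A /= np.maximum(A.sum(axis=0), 1)      # column-normalize out-links
          n = A.shape[0]
          r = np.full(n, 1.0 / n)
          while True:
              r_new = d * A @ r + (1 - d) / n    # damped random-surfer step
              if np.abs(r_new - r).sum() < tol:
                  return r_new
              r = r_new

      # A 3-page cycle (0 -> 1 -> 2 -> 0) yields equal ranks of ~1/3 each.
      print(pagerank([[0, 0, 1], [1, 0, 0], [0, 1, 0]]))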

  17. Web-based visualization of gridded datasets using OceanBrowser

    NASA Astrophysics Data System (ADS)

    Barth, Alexander; Watelet, Sylvain; Troupin, Charles; Beckers, Jean-Marie

    2015-04-01

    OceanBrowser is a web-based visualization tool for gridded oceanographic data sets. Those data sets are typically four-dimensional (longitude, latitude, depth and time). OceanBrowser allows one to visualize horizontal sections at a given depth and time to examine the horizontal distribution of a given variable. It also offers the possibility to display the results on an arbitrary vertical section. To study the evolution of the variable in time, the horizontal and vertical sections can also be animated. Vertical sections can be generated using a fixed distance from the coast or a fixed ocean depth. The user can customize the plot by changing the color-map, the range of the color-bar and the type of the plot (linearly interpolated color, simple contours, filled contours), and can download the current view as a simple image or as a Keyhole Markup Language (KML) file for visualization in applications such as Google Earth. The data products can also be accessed as NetCDF files and through OPeNDAP. Third-party layers from a web map service can also be integrated. OceanBrowser is used in the framework of the SeaDataNet project (http://gher-diva.phys.ulg.ac.be/web-vis/) and EMODNET Chemistry (http://oceanbrowser.net/emodnet/) to distribute gridded data sets interpolated from in situ observations using DIVA (Data-Interpolating Variational Analysis).

  18. 3D Orbit Visualization for Earth-Observing Missions

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Plesea, Lucian; Chafin, Brian G.; Weiss, Barry H.

    2011-01-01

    This software visualizes orbit paths for the Orbiting Carbon Observatory (OCO), but was designed to be general and applicable to any Earth-observing mission. The software uses the Google Earth user interface to provide a visual mechanism to explore spacecraft orbit paths, ground footprint locations, and local cloud cover conditions. In addition, a drill-down capability allows users to point and click on a particular observation frame to pop up ancillary information such as data product filenames and directory paths, latitude, longitude, time stamp, column-average dry air mole fraction of carbon dioxide, and solar zenith angle. This software can be integrated with the ground data system for any Earth-observing mission to automatically generate daily orbit path data products in Google Earth KML format. These KML data products can be directly loaded into the Google Earth application for interactive 3D visualization of the orbit paths for each mission day. Each time the application runs, the daily orbit paths are encapsulated in a KML file for each mission day since the last time the application ran. Alternatively, the daily KML for a specified mission day may be generated. The application automatically extracts the spacecraft position and ground footprint geometry as a function of time from a daily Level 1B data product created and archived by the mission's ground data system software. In addition, ancillary data, such as the column-averaged dry air mole fraction of carbon dioxide and solar zenith angle, are automatically extracted from a Level 2 mission data product. Zoom, pan, and rotate capabilities are provided through the standard Google Earth interface. Cloud cover is indicated with an image layer from the MODIS (Moderate Resolution Imaging Spectroradiometer) aboard the Aqua satellite, which is automatically retrieved from JPL's OnEarth Web service.
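
    A minimal sketch of the KML-generation step, using the third-party simplekml package (the mission software's actual implementation is not described at this level, and the coordinates below are placeholders, not OCO data):

      # Hedged sketch: write an orbit ground track as a KML LineString
      # that Google Earth can load directly.
      import simplekml

      def orbit_to_kml(samples, out_path="orbit_day.kml"):
          """samples: iterable of (lon, lat, altitude_m) positions."""
          kml = simplekml.Kml()
          track = kml.newlinestring(name="Orbit path", coords=list(samples))
          track.altitudemode = simplekml.AltitudeMode.absolute
          kml.save(out_path)

      orbit_to_kml([(-120.0, 30.0, 705000.0), (-118.5, 35.0, 705000.0)])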

  19. 76 FR 21017 - United States v. Google Inc. and ITA Software Inc., Proposed Final Judgment and Competitive...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-14

    .... Google seeks to expand its search services by launching an Internet travel site to offer comparative... and other companies offering travel-related products and services. 14. Metas enable consumers to search for flights but do not offer booking services. When a consumer on a Meta travel site enters a...

  20. On transform coding tools under development for VP10

    NASA Astrophysics Data System (ADS)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second-generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next-edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools has already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.
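
    As a generic illustration of transform coding of a prediction residue (a plain 2-D DCT with uniform quantization, not the VP10 transform set itself):

      # Toy residue coding: forward DCT -> quantize -> inverse DCT.
      import numpy as np
      from scipy.fft import dctn, idctn

      rng = np.random.default_rng(0)
      residue = rng.normal(scale=4.0, size=(8, 8))   # fake prediction residue

      coeffs = dctn(residue, norm="ortho")           # forward transform
      step = 8.0
      quantized = np.round(coeffs / step) * step     # uniform quantizer
      recon = idctn(quantized, norm="ortho")         # inverse transform

      print("MSE:", np.mean((residue - recon) ** 2))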

  1. Infodemiology of status epilepticus: A systematic validation of the Google Trends-based search queries.

    PubMed

    Bragazzi, Nicola Luigi; Bacigaluppi, Susanna; Robba, Chiara; Nardone, Raffaele; Trinka, Eugen; Brigo, Francesco

    2016-02-01

    People increasingly use Google to look for health-related information. We previously demonstrated that in English-speaking countries most people use this search engine to obtain information on the definition, types/subtypes, and treatment of status epilepticus (SE). Here, we aimed to provide a quantitative analysis of SE-related web queries. This analysis represents an advancement with respect to what was previously discussed, in that the Google Trends (GT) algorithm has been further refined and correlational analyses have been carried out to validate the GT-based query volumes. Google Trends-based SE-related query volumes were well correlated with information concerning causes and pharmacological and nonpharmacological treatments. Google Trends can provide both researchers and clinicians with data on realities and contexts that are generally overlooked and underexplored by classic epidemiology. In this way, GT can foster new epidemiological studies in the field and can complement traditional epidemiological tools. Copyright © 2015 Elsevier Inc. All rights reserved.
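
    Researchers who want to reproduce this kind of query-volume pull programmatically often use the unofficial third-party pytrends package (not the authors' tool; the keyword and timeframe below are illustrative):

      # Hedged sketch: fetch Google Trends interest-over-time with pytrends.
      from pytrends.request import TrendReq

      pytrends = TrendReq(hl="en-US", tz=0)
      pytrends.build_payload(["status epilepticus"],
                             timeframe="2004-01-01 2015-12-31")
      volumes = pytrends.interest_over_time()  # pandas DataFrame of weekly interest
      print(volumes.head())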

  2. An overview of the web-based Google Earth coincident imaging tool

    USGS Publications Warehouse

    Chander, Gyanesh; Kilough, B.; Gowda, S.

    2010-01-01

    The Committee on Earth Observing Satellites (CEOS) Visualization Environment (COVE) tool is a browser-based application that leverages Google Earth web to display satellite sensor coverage areas. The analysis tool can also be used to identify near-simultaneous surface observation locations for two or more satellites. The National Aeronautics and Space Administration (NASA) CEOS System Engineering Office (SEO) worked with the CEOS Working Group on Calibration and Validation (WGCV) to develop the COVE tool. The CEOS member organizations are currently operating and planning hundreds of Earth Observation (EO) satellites. Standard cross-comparison exercises between multiple sensors, to compare near-simultaneous surface observations and to identify corresponding image pairs, are time-consuming and labor-intensive. COVE is a suite of tools that has been developed to make such tasks easier.

  3. ETDEWEB versus the World-Wide-Web: a specific database/web comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cutler, Debbie

    2010-06-28

    A study was performed comparing user search results from the specialized scientific database on energy-related information, ETDEWEB, with search results from the internet search engines Google and Google Scholar. The primary objective of the study was to determine if ETDEWEB (the Energy Technology Data Exchange – World Energy Base) continues to bring the user search results that are not being found by Google and Google Scholar. As a multilateral information exchange initiative, ETDE's member countries and partners contribute cost- and task-sharing resources to build the largest database of energy-related information in the world. As of early 2010, the ETDEWEB database has 4.3 million citations to world-wide energy literature. One of ETDEWEB's strengths is its focused scientific content and direct access to full text for its grey literature (over 300,000 documents in PDF available for viewing from the ETDE site and over a million additional links to where the documents can be found at research organizations and major publishers globally). Google and Google Scholar are well-known for the wide breadth of the information they search, with Google bringing in news, factual and opinion-related information, and Google Scholar also emphasizing scientific content across many disciplines. The analysis compared the results of 15 energy-related queries performed on all three systems using identical words/phrases. A variety of subjects was chosen, although the topics were mostly in renewable energy areas due to broad international interest. Over 40,000 search result records from the three sources were evaluated. The study concluded that ETDEWEB is a significant resource to energy experts for discovering relevant energy information. For the 15 topics in this study, ETDEWEB was shown to bring the user unique results not shown by Google or Google Scholar 86.7% of the time. Much was learned from the study beyond just metric comparisons. Observations about the strengths of each system and factors impacting the search results are also shared, along with background information and summary tables of the results. If a user knows a very specific title of a document, all three systems are helpful in finding the user a source for the document. But if the user is looking to discover relevant documents on a specific topic, each of the three systems will bring back a considerable volume of data, but quite different in focus. Google is certainly a highly-used and valuable tool to find significant 'non-specialist' information, and Google Scholar does help the user focus on scientific disciplines. But if a user's interest is scientific and energy-specific, ETDEWEB continues to hold a strong position in the energy research, technology and development (RTD) information field and adds considerable value in knowledge discovery. (auth)

  4. 21st Century Senior Leader Education: Ubiquitous Open Access Learning Environment

    DTIC Science & Technology

    2011-02-22

    Failures: "It's the content, stupid,"22 because agencies focus on systems rather than substance, and access to the content is critical. The access to Army...Resource Capabilities. 18 As an example to demonstrate how a civilian capability provides learning value to the PLE, the "Google Alerts"® web...technology pushed content to the author for review in the development of this paper. The technology consists of a user creating a Google account, logging

  5. Web GIS in practice V: 3-D interactive and real-time mapping in Second Life

    PubMed Central

    Boulos, Maged N Kamel; Burden, David

    2007-01-01

    This paper describes technologies from Daden Limited for geographically mapping and accessing live news stories/feeds, as well as other real-time, real-world data feeds (e.g., Google Earth KML feeds and GeoRSS feeds) in the 3-D virtual world of Second Life, by plotting and updating the corresponding Earth location points on a globe or some other suitable form (in-world), and further linking those points to relevant information and resources. This approach enables users to visualise, interact with, and even walk or fly through, the plotted data in 3-D. Users can also do the reverse: put pins on a map in the virtual world, and then view the data points on the Web in Google Maps or Google Earth. The technologies presented thus serve as a bridge between mirror worlds like Google Earth and virtual worlds like Second Life. We explore the geo-data display potential of virtual worlds and their likely convergence with mirror worlds in the context of the future 3-D Internet or Metaverse, and reflect on the potential of such technologies and their future possibilities, e.g. their use to develop emergency/public health virtual situation rooms to effectively manage emergencies and disasters in real time. The paper also covers some of the issues associated with these technologies, namely user interface accessibility and individual privacy. PMID:18042275

  6. Quantifying the effect of media limitations on outbreak data in a global online web-crawling epidemic intelligence system, 2008–2011

    PubMed Central

    Scales, David; Zelenev, Alexei; Brownstein, John S.

    2013-01-01

    Background This is the first study quantitatively evaluating the effect that media-related limitations have on data from an automated epidemic intelligence system. Methods We modeled time series of HealthMap's two main data feeds, Google News and Moreover, to test for evidence of two potential limitations: first, human resources constraints, and second, high-profile outbreaks “crowding out” coverage of other infectious diseases. Results Google News events declined by 58.3%, 65.9%, and 14.7% on Saturday, Sunday and Monday, respectively, relative to other weekdays. Events were reduced by 27.4% during Christmas/New Years weeks and 33.6% lower during American Thanksgiving week than during an average week for Google News. Moreover data yielded similar results with the addition of Memorial Day (US) being associated with a 36.2% reduction in events. Other holiday effects were not statistically significant. We found evidence for a crowd out phenomenon for influenza/H1N1, where a 50% increase in influenza events corresponded with a 4% decline in other disease events for Google News only. Other prominent diseases in this database – avian influenza (H5N1), cholera, or foodborne illness – were not associated with a crowd out phenomenon. Conclusions These results provide quantitative evidence for the limited impact of editorial biases on HealthMap's web-crawling epidemic intelligence. PMID:24206612

  7. Sentiment Analysis of Web Sites Related to Vaginal Mesh Use in Pelvic Reconstructive Surgery.

    PubMed

    Hobson, Deslyn T G; Meriwether, Kate V; Francis, Sean L; Kinman, Casey L; Stewart, J Ryan

    2018-05-02

    The purpose of this study was to utilize sentiment analysis to describe online opinions toward vaginal mesh. We hypothesized that sentiment in legal Web sites would be more negative than that in medical and reference Web sites. We generated a list of relevant key words related to vaginal mesh and searched Web sites using the Google search engine. Each unique uniform resource locator (URL) was sorted into 1 of 6 categories: "medical", "legal", "news/media", "patient generated", "reference", or "unrelated". Sentiment of relevant Web sites, the primary outcome, was scored on a scale of -1 to +1, and mean sentiment was compared across all categories using 1-way analysis of variance. The Tukey test evaluated differences between category pairs. Google searches of 464 unique key words resulted in 11,405 URLs. Sentiment analysis was performed on 8029 relevant URLs (3472 "legal", 1625 "medical", 1774 "reference", 666 "news/media", 492 "patient generated"). The mean sentiment for all relevant Web sites was +0.01 ± 0.16; analysis of variance revealed significant differences between categories (P < 0.001). Web sites categorized as "legal" and "news/media" had a slightly negative mean sentiment, whereas those categorized as "medical", "reference", and "patient generated" had slightly positive mean sentiments. The Tukey test showed differences between all category pairs except "medical" versus "reference", with the largest mean difference (-0.13) seen in the "legal" versus "reference" comparison. Web sites related to vaginal mesh have an overall mean neutral sentiment, and Web sites categorized as "medical," "reference," and "patient generated" have significantly higher sentiment scores than related Web sites in the "legal" and "news/media" categories.
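
    The paper does not name its scoring engine; one common open-source option that produces sentiment on the same -1 to +1 scale is VADER, sketched here purely as an illustrative stand-in:

      # Hedged sketch: score page text with VADER's compound score in [-1, +1].
      from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

      analyzer = SentimentIntensityAnalyzer()

      def page_sentiment(text):
          return analyzer.polarity_scores(text)["compound"]

      print(page_sentiment("Vaginal mesh complications led to painful revision surgery."))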

  8. Community-Based Services that Facilitate Interoperability and Intercomparison of Precipitation Datasets from Multiple Sources

    NASA Technical Reports Server (NTRS)

    Liu, Zhong; Kempler, Steven; Teng, William; Leptoukh, Gregory; Ostrenga, Dana

    2010-01-01

    Over the past 12 years, large volumes of precipitation data have been generated from space-based observatories (e.g., TRMM), merging of data products (e.g., gridded 3B42), models (e.g., GMAO), climatologies (e.g., Chang SSM/I derived rain indices), field campaigns, and ground-based measuring stations. The science research, applications, and education communities have greatly benefited from the unrestricted availability of these data from the Goddard Earth Sciences Data and Information Services Center (GES DISC) and, in particular, the services tailored toward precipitation data access and usability. In addition, tools and services that are responsive to the expressed evolving needs of the precipitation data user communities have been developed at the Precipitation Data and Information Services Center (PDISC) (http://disc.gsfc.nasa.gov/precipitation or google NASA PDISC), located at the GES DISC, to provide users with quick data exploration and access capabilities. In recent years, data management and access services have become increasingly sophisticated, such that they now afford researchers, particularly those interested in multi-data set science analysis and/or data validation, the ability to homogenize data sets, in order to apply multi-variant, comparison, and evaluation functions. Included in these services is the ability to capture data quality and data provenance. These interoperability services can be directly applied to future data sets, such as those from the Global Precipitation Measurement (GPM) mission. This presentation describes the data sets and services at the PDISC that are currently used by precipitation science and applications researchers, and which will be enhanced in preparation for GPM and associated multi-sensor data research. Specifically, the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) will be illustrated. Giovanni enables scientific exploration of Earth science data without researchers having to perform the complicated data access and match-up processes. In addition, PDISC tool and service capabilities being adapted for GPM data will be described, including the Google-like Mirador data search and access engine; semantic technology to help manage large amounts of multi-sensor data and their relationships; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion to various formats (e.g., netCDF, HDF, KML (for Google Earth)); visualization and analysis of Level 2 data profiles and maps; parameter and spatial subsetting; time and temporal aggregation; regridding; data version control and provenance; continuous archive verification; and expertise in data-related standards and interoperability. The goal of providing these services is to further the progress towards a common framework by which data analysis/validation can be more easily accomplished.

  9. [Who Hits the Mark? A Comparative Study of the Free Geocoding Services of Google and OpenStreetMap].

    PubMed

    Lemke, D; Mattauch, V; Heidinger, O; Hense, H W

    2015-09-01

    Geocoding, the process of converting textual information (addresses) into geographic coordinates, is increasingly used in public health and epidemiological research and practice. To date, little attention has been paid to geocoding quality and its impact on different types of spatially related health studies. The primary aim of this study was to compare 2 freely available geocoding services (Google and OpenStreetMap) with regard to matching rate (percentage of address records capable of being geocoded) and positional accuracy (distance between geocodes and the ground truth locations). Residential addresses were geocoded by the NRW state office for information and technology and were considered as reference data (gold standard). The gold standard included the coordinates, the quality of the addresses (4 categories), and a binary urbanity indicator based on the CORINE land cover data. 2,500 addresses were randomly sampled per stratum of address quality and urbanity indicator (approximately 20,000 addresses in total). These address samples were geocoded using the geocoding services from Google and OSM. In general, both geocoding services showed a decrease in the matching rate with decreasing address quality and urbanity. Google consistently showed a higher completeness than OSM (>93 vs. >82%). Also, the cartographic confounding between urban and rural regions was less distinct with Google's geocoding API. Regarding the positional accuracy of the geo-coordinates, Google also showed the smallest deviations from the reference coordinates, with a median of <9 m vs. <175.8 m. The cumulative density function derived from the positional accuracy showed that nearly 95% of the addresses for Google, and 50% for OSM, were geocoded within 50 m of their reference coordinates. The geocoding API from Google is superior to OSM regarding completeness and positional accuracy of the geocoded addresses. On the other hand, Google has several restrictions, such as the limitation of requests to 2,500 addresses per 24 h and the presentation of the results exclusively on Google Maps, which may complicate its use for scientific purposes. © Georg Thieme Verlag KG Stuttgart · New York.
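
    A minimal sketch of the positional-accuracy measure using the third-party geopy package, with OSM's Nominatim standing in for the OSM geocoder (the study's exact tooling and any reference coordinate passed in are assumptions):

      # Hedged sketch: geocode an address and measure the geodesic distance
      # to a gold-standard reference coordinate, in meters.
      from geopy.geocoders import Nominatim
      from geopy.distance import geodesic

      geocoder = Nominatim(user_agent="geocoding-accuracy-demo")  # OSM-backed

      def positional_error(address, reference_latlon):
          loc = geocoder.geocode(address, timeout=10)
          if loc is None:          # no match counts against the matching rate
              return None
          return geodesic((loc.latitude, loc.longitude), reference_latlon).meters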

  10. Southern California Earthquake Center Geologic Vertical Motion Database

    NASA Astrophysics Data System (ADS)

    Niemi, Nathan A.; Oskin, Michael; Rockwell, Thomas K.

    2008-07-01

    The Southern California Earthquake Center Geologic Vertical Motion Database (VMDB) integrates disparate sources of geologic uplift and subsidence data at 104- to 106-year time scales into a single resource for investigations of crustal deformation in southern California. Over 1800 vertical deformation rate data points in southern California and northern Baja California populate the database. Four mature data sets are now represented: marine terraces, incised river terraces, thermochronologic ages, and stratigraphic surfaces. An innovative architecture and interface of the VMDB exposes distinct data sets and reference frames, permitting user exploration of this complex data set and allowing user control over the assumptions applied to convert geologic and geochronologic information into absolute uplift rates. Online exploration and download tools are available through all common web browsers, allowing the distribution of vertical motion results as HTML tables, tab-delimited GIS-compatible text files, or via a map interface through the Google Maps™ web service. The VMDB represents a mature product for research of fault activity and elastic deformation of southern California.

  11. MapMyFlu: visualizing spatio-temporal relationships between related influenza sequences

    PubMed Central

    Nolte, Nicholas; Kurzawa, Nils; Eils, Roland; Herrmann, Carl

    2015-01-01

    Understanding the molecular dynamics of viral spreading is crucial for anticipating the epidemiological implications of disease outbreaks. In the case of influenza, reassortments or point mutations affect the adaptation to new hosts or resistance to anti-viral drugs, and can determine whether a new strain will result in a pandemic infection or a less severe progression. To this end, tools integrating molecular information with epidemiological parameters are important for understanding how molecular characteristics are reflected in infection dynamics. We present a new web tool, MapMyFlu, which allows users to spatially and temporally display influenza viruses related to a query sequence on a Google Map, based on BLAST results against the NCBI Influenza Database. Temporal and geographical trends appear clearly and may help in reconstructing the evolutionary history of a particular sequence. The tool is accessible through a web server, hence without the need for local installation. The website has an intuitive design and provides an easy-to-use service, and is available at http://mapmyflu.ipmb.uni-heidelberg.de PMID:25940623

  12. RCSB PDB Mobile: iOS and Android mobile apps to provide data access and visualization to the RCSB Protein Data Bank.

    PubMed

    Quinn, Gregory B; Bi, Chunxiao; Christie, Cole H; Pang, Kyle; Prlić, Andreas; Nakane, Takanori; Zardecki, Christine; Voigt, Maria; Berman, Helen M; Bourne, Philip E; Rose, Peter W

    2015-01-01

    The Research Collaboratory for Structural Bioinformatics Protein Data Bank (RCSB PDB) resource provides tools for query, analysis and visualization of the 3D structures in the PDB archive. As the mobile Web is starting to surpass desktop and laptop usage, scientists and educators are beginning to integrate mobile devices into their research and teaching. In response, we have developed the RCSB PDB Mobile app for the iOS and Android mobile platforms to enable fast and convenient access to RCSB PDB data and services. Using the app, users from the general public to expert researchers can quickly search and visualize biomolecules, and add personal annotations via the RCSB PDB's integrated MyPDB service. RCSB PDB Mobile is freely available from the Apple App Store and Google Play (http://www.rcsb.org). © The Author 2014. Published by Oxford University Press.

  13. RCSB PDB Mobile: iOS and Android mobile apps to provide data access and visualization to the RCSB Protein Data Bank

    PubMed Central

    Quinn, Gregory B.; Bi, Chunxiao; Christie, Cole H.; Pang, Kyle; Prlić, Andreas; Nakane, Takanori; Zardecki, Christine; Voigt, Maria; Berman, Helen M.; Rose, Peter W.

    2015-01-01

    Summary: The Research Collaboratory for Structural Bioinformatics Protein Data Bank (RCSB PDB) resource provides tools for query, analysis and visualization of the 3D structures in the PDB archive. As the mobile Web is starting to surpass desktop and laptop usage, scientists and educators are beginning to integrate mobile devices into their research and teaching. In response, we have developed the RCSB PDB Mobile app for the iOS and Android mobile platforms to enable fast and convenient access to RCSB PDB data and services. Using the app, users from the general public to expert researchers can quickly search and visualize biomolecules, and add personal annotations via the RCSB PDB’s integrated MyPDB service. Availability and implementation: RCSB PDB Mobile is freely available from the Apple App Store and Google Play (http://www.rcsb.org). Contact: pwrose@ucsd.edu PMID:25183487

  14. What to do with a Dead Research Code

    NASA Astrophysics Data System (ADS)

    Nemiroff, Robert J.

    2016-01-01

    The project has ended -- should all of the computer codes that enabled the project be deleted? No. Like research papers, research codes typically carry valuable information past project end dates. Several possible end states for the life of research codes are reviewed. Historically, codes are typically left dormant on an increasingly obscure local disk directory until forgotten. These codes will likely become any or all of: lost, impossible to compile and run, difficult to decipher, and likely deleted when the code's proprietor moves on or dies. It is argued here, though, that it would be better for both code authors and astronomy generally if project codes were archived after use in some way. Archiving is advantageous for code authors because archived codes might increase the author's ADS citable publications, while astronomy as a science gains transparency and reproducibility. Paper-specific codes should be included in the publication of the journal papers they support, just like figures and tables. General codes that support multiple papers, possibly written by multiple authors, including their supporting websites, should be registered with a code registry such as the Astrophysics Source Code Library (ASCL). Codes developed on GitHub can be archived with a third-party service such as, currently, BackHub. An important code version might be uploaded to a web archiving service like, currently, Zenodo or Figshare, so that this version receives a Digital Object Identifier (DOI), enabling it to be found at a stable address into the future. Similar archiving services that are not DOI-dependent include perma.cc and the Internet Archive Wayback Machine at archive.org. Perhaps most simply, copies of important codes with lasting value might be kept on a cloud service like, for example, Google Drive, while activating Google's Inactive Account Manager.

  15. Post-Web 2.0 Pedagogy: From Student-Generated Content to International Co-Production Enabled by Mobile Social Media

    ERIC Educational Resources Information Center

    Cochrane, Thomas; Antonczak, Laurent; Wagner, Daniel

    2013-01-01

    The advent of web 2.0 has enabled new forms of collaboration centred upon user-generated content, however, mobile social media is enabling a new wave of social collaboration. Mobile devices have disrupted and reinvented traditional media markets and distribution: iTunes, Google Play and Amazon now dominate music industry distribution channels,…

  16. What Major Search Engines Like Google, Yahoo and Bing Need to Know about Teachers in the UK?

    ERIC Educational Resources Information Center

    Seyedarabi, Faezeh

    2014-01-01

    This article briefly outlines the current major search engines' approach to teachers' web searching. The aim of this article is to make Web searching easier for teachers when searching for relevant online teaching materials, in general, and UK teacher practitioners at primary, secondary and post-compulsory levels, in particular. Therefore, major…

  17. Measuring Link-Resolver Success: Comparing 360 Link with a Local Implementation of WebBridge

    ERIC Educational Resources Information Center

    Herrera, Gail

    2011-01-01

    This study reviewed link resolver success comparing 360 Link and a local implementation of WebBridge. Two methods were used: (1) comparing article-level access and (2) examining technical issues for 384 randomly sampled OpenURLs. Google Analytics was used to collect user-generated OpenURLs. For both methods, 360 Link out-performed the local…

  18. Fast segmentation of satellite images using SLIC, WebGL and Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Donchyts, Gennadii; Baart, Fedor; Gorelick, Noel; Eisemann, Elmar; van de Giesen, Nick

    2017-04-01

    Google Earth Engine (GEE) is a parallel geospatial processing platform that harmonizes access to petabytes of freely available satellite images. It provides a very rich API, allowing development of dedicated algorithms to extract useful geospatial information from these images. At the same time, modern GPUs provide thousands of computing cores, which are mostly not utilized in this context. In recent years, WebGL has become a popular and well-supported API, allowing fast image processing directly in web browsers. In this work, we will evaluate the applicability of WebGL for fast segmentation of satellite images. A new implementation of the Simple Linear Iterative Clustering (SLIC) algorithm using GPU shaders will be presented. SLIC is a simple and efficient method to decompose an image into visually homogeneous regions; it adapts a k-means clustering approach to generate superpixels efficiently. While this approach will be hard to scale, due to the significant amount of data to be transferred to the client, it should significantly improve exploratory possibilities and simplify the development of dedicated algorithms for geoscience applications. Our prototype implementation will be used to improve surface water detection of reservoirs using multispectral satellite imagery.
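
    A CPU reference point for the superpixel step is scikit-image's SLIC implementation (the paper's contribution is the WebGL/GPU shader version; the sample image below is a stand-in for a satellite scene):

      # Hedged sketch: SLIC superpixels on a sample image via scikit-image.
      from skimage import data
      from skimage.segmentation import slic

      image = data.astronaut()                 # placeholder for satellite imagery
      segments = slic(image, n_segments=200, compactness=10)
      print(segments.shape, int(segments.max()) + 1, "superpixels")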

  19. A Web-Based Information System for Field Data Management

    NASA Astrophysics Data System (ADS)

    Weng, Y. H.; Sun, F. S.

    2014-12-01

    A web-based field data management system has been designed and developed to allow field geologists to store, organize, manage, and share field data online. System requirements were analyzed and clearly defined first regarding what data are to be stored, who the potential users are, and what system functions are needed in order to deliver the right data in the right way to the right user. A 3-tiered architecture was adopted to create this secure, scalable system, which consists of a web browser at the front end, a database at the back end, and a functional logic server in the middle. Specifically, HTML, CSS, and JavaScript were used to implement the user interface in the front-end tier, the Apache web server runs PHP scripts in the middle tier, and a MySQL server is used for the back-end database. The system accepts various types of field information, including image, audio, video, numeric, and text. It allows users to select data and display them on either Google Earth or Google Maps for the examination of spatial relations. It also makes the sharing of field data easy by converting them into XML format that is both human-readable and machine-readable, and thus ready for reuse.
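
    The XML export step lends itself to a short sketch. The following is one plausible serialization of a single field record, with hypothetical field names (the actual system performs this server-side in PHP):

        import xml.etree.ElementTree as ET

        record = {"site": "Outcrop-12", "lat": "40.19", "lon": "-85.41",
                  "lithology": "sandstone", "note": "cross-bedding, 15 cm sets"}

        root = ET.Element("field_record")
        for key, value in record.items():
            ET.SubElement(root, key).text = value

        ET.indent(root)  # pretty-print (Python 3.9+) so the file stays human-readable
        ET.ElementTree(root).write("record.xml", encoding="utf-8",
                                   xml_declaration=True)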

  20. Decision support system for the response to infectious disease emergencies based on WebGIS and mobile services in China.

    PubMed

    Li, Ya-pin; Fang, Li-qun; Gao, Su-qing; Wang, Zhen; Gao, Hong-wei; Liu, Peng; Wang, Ze-Rui; Li, Yan-Li; Zhu, Xu-Guang; Li, Xin-Lou; Xu, Bo; Li, Yin-Jun; Yang, Hong; de Vlas, Sake J; Shi, Tao-Xing; Cao, Wu-Chun

    2013-01-01

    For years, emerging infectious diseases have appeared worldwide and threatened the health of people. The emergence and spread of an infectious-disease outbreak are usually unforeseen, and have the features of suddenness and uncertainty. Timely understanding of basic information in the field, and the collection and analysis of epidemiological information, is helpful in making rapid decisions and responding to an infectious-disease emergency. Therefore, it is necessary to have an unobstructed channel and convenient tool for the collection and analysis of epidemiologic information in the field. Baseline information for each county in mainland China was collected and a database was established by geo-coding information on a digital map of county boundaries throughout the country. Google Maps was used to display geographic information and to conduct calculations related to maps, and the 3G wireless network was used to transmit information collected in the field to the server. This study established a decision support system for the response to infectious-disease emergencies based on WebGIS and mobile services (DSSRIDE). The DSSRIDE provides functions including data collection, communication and analyses in real time, epidemiological detection, the provision of customized epidemiological questionnaires and guides for handling infectious-disease emergencies, and the querying of professional knowledge in the field. These functions of the DSSRIDE could be helpful for epidemiological investigations in the field and the handling of infectious-disease emergencies. The DSSRIDE provides a geographic information platform based on the Google Maps application programming interface to display information on infectious-disease emergencies, and transfers information between workers in the field and decision makers through wireless transmission based on personal computers, mobile phones and personal digital assistants. After two years of practice and application in infectious-disease emergencies, the DSSRIDE has become a useful platform and tool for field investigations carried out by response sections and individuals. The system is suitable for use in developing countries and low-income districts.

  1. Citation Analysis of the Korean Journal of Urology From Web of Science, Scopus, Korean Medical Citation Index, KoreaMed Synapse, and Google Scholar

    PubMed Central

    2013-01-01

    The Korean Journal of Urology began to be published exclusively in English in 2010 and is indexed in PubMed Central/PubMed. This study analyzed a variety of citation indicators of the Korean Journal of Urology before and after 2010 to clarify the present position of the journal among the urology category journals. The impact factor, SCImago Journal Rank (SJR), impact index, Z-impact factor (ZIF, impact factor excluding self-citation), and Hirsch Index (H-index) were referenced or calculated from Web of Science, Scopus, SCImago Journal & Country Ranking, Korean Medical Citation Index (KoMCI), KoreaMed Synapse, and Google Scholar. Both the impact factor and the total citations rose rapidly beginning in 2011. The 2012 impact factor corresponded to the upper 84.9% in the nephrology-urology category, whereas the 2011 SJR was in the upper 58.5%. The ZIF in KoMCI was one fifth of the impact factor because there are only two other urology journals in KoMCI. Up to 2009, more than half of the citations in the Web of Science were from Korean researchers, but from 2010 to 2012, more than 85% of the citations were from international researchers. The H-indexes from Web of Science, Scopus, KoMCI, KoreaMed Synapse, and Google Scholar were 8, 10, 12, 9, and 18, respectively. The strategy of the language change in 2010 was successful from the perspective of citation indicators. The values of the citation indicators will continue to increase rapidly and consistently as the research achievement of authors of the Korean Journal of Urology increases. PMID:23614057

  2. Citation Analysis of the Korean Journal of Urology From Web of Science, Scopus, Korean Medical Citation Index, KoreaMed Synapse, and Google Scholar.

    PubMed

    Huh, Sun

    2013-04-01

    The Korean Journal of Urology began to be published exclusively in English in 2010 and is indexed in PubMed Central/PubMed. This study analyzed a variety of citation indicators of the Korean Journal of Urology before and after 2010 to clarify the present position of the journal among the urology category journals. The impact factor, SCImago Journal Rank (SJR), impact index, Z-impact factor (ZIF, impact factor excluding self-citation), and Hirsch Index (H-index) were referenced or calculated from Web of Science, Scopus, SCImago Journal & Country Ranking, Korean Medical Citation Index (KoMCI), KoreaMed Synapse, and Google Scholar. Both the impact factor and the total citations rose rapidly beginning in 2011. The 2012 impact factor corresponded to the upper 84.9% in the nephrology-urology category, whereas the 2011 SJR was in the upper 58.5%. The ZIF in KoMCI was one fifth of the impact factor because there are only two other urology journals in KoMCI. Up to 2009, more than half of the citations in the Web of Science were from Korean researchers, but from 2010 to 2012, more than 85% of the citations were from international researchers. The H-indexes from Web of Science, Scopus, KoMCI, KoreaMed Synapse, and Google Scholar were 8, 10, 12, 9, and 18, respectively. The strategy of the language change in 2010 was successful from the perspective of citation indicators. The values of the citation indicators will continue to increase rapidly and consistently as the research achievement of authors of the Korean Journal of Urology increases.

  3. Electronic Health Records: An Enhanced Security Paradigm to Preserve Patient's Privacy

    NASA Astrophysics Data System (ADS)

    Slamanig, Daniel; Stingl, Christian

    In recent years, demographic change and increasing treatment costs demand the adoption of more cost-efficient, highly qualitative, and integrated health care processes. The rapid growth and availability of the Internet facilitate the development of eHealth services and especially of electronic health records (EHRs), which are promising solutions to meet the aforementioned requirements. Considering actual web-based EHR systems, patient-centric and patient-moderated approaches are widely deployed. Besides, there is an emerging market of so-called personal health record platforms, e.g., Google Health. Both concepts provide central, web-based access to highly sensitive medical data. Additionally, the fact that these systems may be hosted by not fully trustworthy providers necessitates thorough consideration of privacy issues. In this paper we define security and privacy objectives that play an important role in the context of web-based EHRs. Furthermore, we discuss deployed solutions as well as concepts proposed in the literature with respect to these objectives and point out several weaknesses. Finally, we introduce a system which overcomes the drawbacks of existing solutions by taking a holistic approach to preserving patients' privacy, and discuss the applied methods.

  4. Saint: a lightweight integration environment for model annotation.

    PubMed

    Lister, Allyson L; Pocock, Matthew; Taschuk, Morgan; Wipat, Anil

    2009-11-15

    Saint is a web application which provides a lightweight annotation integration environment for quantitative biological models. The system enables modellers to rapidly mark up models with biological information derived from a range of data sources. Saint is freely available for use on the web at http://www.cisban.ac.uk/saint. The web application is implemented in Google Web Toolkit and Tomcat, with all major browsers supported. The Java source code is freely available for download at http://saint-annotate.sourceforge.net. The Saint web server requires an installation of libSBML and has been tested on Linux (32-bit Ubuntu 8.10 and 9.04).

  5. Brandenburg 3D - a comprehensive 3D Subsurface Model, Conception of an Infrastructure Node and a Web Application

    NASA Astrophysics Data System (ADS)

    Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim

    2014-05-01

    The Energiewende and the increasing scarcity of raw materials will lead to an intensified utilization of the subsurface in Germany. Within this context, geological 3D modeling is a fundamental approach for integrated decision and planning processes. Initiated by the development of the European Geospatial Infrastructure INSPIRE, the German State Geological Offices started digitizing their predominantly analog archive inventory. Until now, a comprehensive 3D subsurface model of Brandenburg did not exist. Therefore, the project B3D strove to develop a new 3D model as well as a subsequent infrastructure node to integrate all geological and spatial data within the Geodaten-Infrastruktur Brandenburg (Geospatial Infrastructure, GDI-BB) and provide it to the public through an interactive 2D/3D web application. The functionality of the web application is based on a client-server architecture. On the server side, all available spatial data are published through GeoServer. GeoServer is designed for interoperability and acts as the reference implementation of the Open Geospatial Consortium (OGC) Web Feature Service (WFS) standard that provides the interface that allows requests for geographical features. In addition, GeoServer implements, among others, the high-performance, certified-compliant Web Map Service (WMS) that serves geo-referenced map images. For publishing 3D data, the OGC Web 3D Service (W3DS), a portrayal service for three-dimensional geo-data, is used. The W3DS displays elements representing the geometry, appearance, and behavior of geographic objects. On the client side, the web application is solely based on Free and Open Source Software and leans on the JavaScript API WebGL that allows the interactive rendering of 2D and 3D graphics by means of GPU accelerated usage of physics and image processing as part of the web page canvas without the use of plug-ins. WebGL is supported by most web browsers (e.g., Google Chrome, Mozilla Firefox, Safari, and Opera). The web application enables intuitive navigation through all available information and allows the visualization of geological maps (2D), seismic transects (2D/3D), wells (2D/3D), and the 3D model. These achievements will alleviate spatial and geological data management within the German State Geological Offices and foster the interoperability of heterogeneous systems. They will provide guidance for systematic subsurface management across system, domain and administrative boundaries on the basis of a federated spatial data infrastructure, and include the public in the decision processes (e-Governance). Yet, the interoperability of the systems has to be strongly propelled forward through agreements on standards that need to be decided upon in responsible committees. The project B3D is funded with resources from the European Fund for Regional Development (EFRE).
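
    To make the WMS role concrete, here is a minimal client-side sketch that requests a georeferenced map image from a GeoServer WMS endpoint using OWSLib; the service URL and layer name are hypothetical placeholders, not the project's actual endpoint.

        from owslib.wms import WebMapService

        wms = WebMapService("https://example.org/geoserver/ows", version="1.1.1")
        img = wms.getmap(layers=["b3d:geological_map"],
                         srs="EPSG:4326",
                         bbox=(11.2, 51.3, 14.8, 53.6),  # approx. lon/lat extent of Brandenburg
                         size=(800, 600),
                         format="image/png",
                         transparent=True)
        with open("map.png", "wb") as out:
            out.write(img.read())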

  6. Spectral properties of Google matrix of Wikipedia and other networks

    NASA Astrophysics Data System (ADS)

    Ermann, Leonardo; Frahm, Klaus M.; Shepelyansky, Dima L.

    2013-05-01

    We study the properties of eigenvalues and eigenvectors of the Google matrix of the Wikipedia articles hyperlink network and other real networks. With the help of the Arnoldi method, we analyze the distribution of eigenvalues in the complex plane and show that eigenstates with significant eigenvalue modulus are located on well defined network communities. We also show that the correlator between PageRank and CheiRank vectors distinguishes different organizations of information flow on BBC and Le Monde web sites.
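
    For readers who want to experiment, a minimal sketch of constructing the Google matrix G = alpha*S + (1-alpha)/N for a toy directed network and inspecting its spectrum follows; the Arnoldi method of the abstract is replaced here by a dense eigensolver, which only scales to small examples.

        import numpy as np

        # Toy adjacency matrix: A[i, j] = 1 if node j links to node i.
        A = np.array([[0, 1, 1, 0],
                      [1, 0, 0, 1],
                      [0, 1, 0, 0],
                      [1, 0, 1, 0]], dtype=float)
        N = A.shape[0]

        S = A.copy()
        out_deg = S.sum(axis=0)
        S[:, out_deg > 0] /= out_deg[out_deg > 0]  # column-stochastic link matrix
        S[:, out_deg == 0] = 1.0 / N               # dangling nodes link everywhere

        alpha = 0.85
        G = alpha * S + (1 - alpha) / N            # Google matrix (damping alpha)

        eigvals = np.linalg.eigvals(G)
        print(sorted(np.abs(eigvals), reverse=True))  # leading modulus 1: PageRank mode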

  7. Content and Accessibility of Shoulder and Elbow Fellowship Web Sites in the United States.

    PubMed

    Young, Bradley L; Oladeji, Lasun O; Cichos, Kyle; Ponce, Brent

    2016-01-01

    Increasing numbers of training physicians are using the Internet to gather information about graduate medical education programs. The content and accessibility of web sites that provide this information have been demonstrated to influence applicants' decisions. Assessments of orthopedic fellowship web sites, including sports medicine, pediatrics, hand, and spine, have found varying degrees of accessibility and material. The purpose of this study was to evaluate the accessibility and content of the American Shoulder and Elbow Surgeons (ASES) fellowship web sites (SEFWs). A complete list of ASES programs was obtained from a database on the ASES web site. The accessibility of each SEFW was assessed by the existence of a functioning link found in the database and through Google®. Then, the following content areas of each SEFW were evaluated: fellow education, faculty/previous fellow information, and recruitment. At the time of the study, 17 of the 28 (60.7%) ASES programs had web sites accessible through Google®, and only five (17.9%) had functioning links in the ASES database. Nine programs lacked a web site. Concerning web site content, the majority of SEFWs contained information regarding research opportunities, research requirements, case descriptions, meetings and conferences, teaching responsibilities, attending faculty, the application process, and a program description. Fewer than half of the SEFWs provided information regarding rotation schedules, current fellows, previous fellows, on-call expectations, journal clubs, medical school of current fellows, residency of current fellows, employment of previous fellows, current research, and previous research. A large portion of ASES fellowship programs lacked functioning web sites, and fewer still provided functioning links through the ASES database. Valuable information for potential applicants was largely inadequate across present SEFWs.

  8. Should we Google it? Resource use by internal medicine residents for point-of-care clinical decision making.

    PubMed

    Duran-Nelson, Alisa; Gladding, Sophia; Beattie, Jim; Nixon, L James

    2013-06-01

    To determine which resources residents use at the point-of-care (POC) for decision making, the drivers for selection of these resources, and how residents use Google/Google Scholar to answer clinical questions at the POC. In January 2012, 299 residents from three internal medicine residencies were sent an electronic survey regarding resources used for POC decision making. Resource use frequency and factors influencing choice were determined using descriptive statistics. Binary logistic regression analysis was performed to determine relationships between the independent variables. A total of 167 residents (56%) responded; similar numbers responded at each level of training. Residents most frequently reported using UpToDate and Google at the POC at least daily (85% and 63%, respectively), with speed and trust in the quality of information being the primary drivers of selection. Google, used by 68% of residents, was used primarily to locate Web sites and general information about diseases, whereas Google Scholar, used by 30% of residents, tended to be used for treatment and management decisions or locating a journal article. The findings suggest that internal medicine residents use UpToDate most frequently, followed by consultation with faculty and the search engines Google and Google Scholar; speed, trust, and portability are the biggest drivers for resource selection; and time and information overload appear to be the biggest barriers to resources such as Ovid MEDLINE. Residents frequently used Google and may benefit from further training in information management skills.

  9. Netwar

    NASA Astrophysics Data System (ADS)

    Keen, Arthur A.

    2006-04-01

    This paper describes technology being developed at 21st Century Technologies to automate Computer Network Operations (CNO). CNO refers to DoD activities related to Attacking and Defending Computer Networks (CNA & CND). Next generation cyber threats are emerging in the form of powerful Internet services and tools that automate intelligence gathering, planning, testing, and surveillance. We will focus on "Search-Engine Hacks", queries that can retrieve lists of router/switch/server passwords, control panels, accessible cameras, software keys, VPN connection files, and vulnerable web applications. Examples include "Titan Rain" attacks against DoD facilities and the Santy worm, which identifies vulnerable sites by searching Google for URLs containing application-specific strings. This trend will result in increasingly sophisticated and automated intelligence-driven cyber attacks coordinated across multiple domains that are difficult to defeat or even understand with current technology. One traditional method of CNO relies on surveillance detection as an attack predictor. Unfortunately, surveillance detection is difficult because attackers can perform search engine-driven surveillance such as with Google Hacks, and avoid touching the target site. Therefore, attack observables represent only about 5% of the attacker's total attack time, and are inadequate to provide warning. In order to predict attacks and defend against them, CNO must also employ more sophisticated techniques and work to understand the attacker's Motives, Means and Opportunities (MMO). CNO must use automated reconnaissance tools, such as Google, to identify information vulnerabilities, and then utilize Internet tools to observe the intelligence gathering, planning, testing, and collaboration activities that represent 95% of the attacker's effort.

  10. Discrepancies Between Classic and Digital Epidemiology in Searching for the Mayaro Virus: Preliminary Qualitative and Quantitative Analysis of Google Trends

    PubMed Central

    Adawi, Mohammad; Watad, Abdulla; Sharif, Kassem; Amital, Howard; Mahroum, Naim

    2017-01-01

    Background: Mayaro virus (MAYV), first discovered in Trinidad in 1954, is spread by the Haemagogus mosquito. Small outbreaks have been described in the past in the Amazon jungles of Brazil and other parts of South America. Recently, a case was reported in rural Haiti. Objective: Given the emerging importance of MAYV, we aimed to explore the feasibility of exploiting a Web-based tool for monitoring and tracking MAYV cases. Methods: Google Trends is an online tracking system. A Google-based approach is particularly useful for monitoring infectious-disease epidemics. We searched Google Trends from its inception (January 2004 through May 2017) for MAYV-related Web searches worldwide. Results: We noted a burst in search volumes in the period from July 2016 (relative search volume [RSV]=13%) to December 2016 (RSV=18%), with a peak in September 2016 (RSV=100%). Before this burst, the average search activity related to MAYV was very low (median 1%). MAYV-related queries were concentrated in the Caribbean. Scientific interest from the research community and media coverage affected digital seeking behavior. Conclusions: MAYV has always circulated in South America. Its recent appearance in the Caribbean has been a source of concern, which resulted in a burst of Internet queries. While Google Trends cannot be used to perform real-time epidemiological surveillance of MAYV, it can be exploited to capture the public’s reaction to outbreaks. Public health workers should be aware of this, in that information and communication technologies could be used to communicate with users, reassure them about their concerns, and empower them in making decisions affecting their health. PMID:29196278

  11. Discrepancies Between Classic and Digital Epidemiology in Searching for the Mayaro Virus: Preliminary Qualitative and Quantitative Analysis of Google Trends.

    PubMed

    Adawi, Mohammad; Bragazzi, Nicola Luigi; Watad, Abdulla; Sharif, Kassem; Amital, Howard; Mahroum, Naim

    2017-12-01

    Mayaro virus (MAYV), first discovered in Trinidad in 1954, is spread by the Haemagogus mosquito. Small outbreaks have been described in the past in the Amazon jungles of Brazil and other parts of South America. Recently, a case was reported in rural Haiti. Given the emerging importance of MAYV, we aimed to explore the feasibility of exploiting a Web-based tool for monitoring and tracking MAYV cases. Google Trends is an online tracking system. A Google-based approach is particularly useful for monitoring infectious-disease epidemics. We searched Google Trends from its inception (January 2004 through May 2017) for MAYV-related Web searches worldwide. We noted a burst in search volumes in the period from July 2016 (relative search volume [RSV]=13%) to December 2016 (RSV=18%), with a peak in September 2016 (RSV=100%). Before this burst, the average search activity related to MAYV was very low (median 1%). MAYV-related queries were concentrated in the Caribbean. Scientific interest from the research community and media coverage affected digital seeking behavior. MAYV has always circulated in South America. Its recent appearance in the Caribbean has been a source of concern, which resulted in a burst of Internet queries. While Google Trends cannot be used to perform real-time epidemiological surveillance of MAYV, it can be exploited to capture the public's reaction to outbreaks. Public health workers should be aware of this, in that information and communication technologies could be used to communicate with users, reassure them about their concerns, and empower them in making decisions affecting their health. ©Mohammad Adawi, Nicola Luigi Bragazzi, Abdulla Watad, Kassem Sharif, Howard Amital, Naim Mahroum. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 01.12.2017.
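
    The same kind of relative-search-volume series can be pulled programmatically. A minimal sketch with the unofficial pytrends package follows; note this is an assumption about tooling (the authors describe using Google Trends itself), and the library's interface changes as Google's endpoints change.

        from pytrends.request import TrendReq

        pytrends = TrendReq(hl="en-US")
        pytrends.build_payload(["Mayaro virus"],
                               timeframe="2004-01-01 2017-05-31")
        rsv = pytrends.interest_over_time()   # relative search volume, 0-100
        print(rsv["Mayaro virus"].idxmax())   # expect the September 2016 burst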

  12. Citizen Science in the Classroom: Perils and Promise of the New Web

    NASA Astrophysics Data System (ADS)

    Loughran, T.; Dirksen, R.

    2010-12-01

    Classroom citizen science projects invite students to generate, curate, post, query, and analyze data, publishing and discussing results in potentially large collaborative contexts. The new web offers a rich palette of such projects for any STEM educator to select from or create. This easy access to citizen science in the classroom is full of both promise and peril for science education. By offering examples of classroom citizen science projects in particle physics, earth and environmental sciences, each supported by a common mashup of technologies available to ordinary users, we will illustrate something of the promise of these projects for science education, and point to some of the challenges and failure modes--the peril--raised by easy access and particularly easy publication of data. How one sensibly responds to this promise and peril depends on how one views the goals of science (or more broadly, STEM) education: either as the equipping of individual students with STEM knowledge and skills so as to empower them for future options, or as the issuing of effective invitations into STEM communities. Building on the claim that these are complementary perspectives, both of value, we will provide an example of a classroom citizen science project analyzed from both perspectives. The BOSCO classroom-to-classroom water source mapping project provides students both in Northern Uganda and in South Dakota a collaborative platform for analyzing and responding to local water quality concerns. Students gather water quality data, use Google Forms embedded in a project wiki to enter data in a spreadsheet, which then automatically (through Mapalist, a free web service) gets posted to a Google Map, itself embedded in the project wiki. Using these technologies, data is thus collected and posted for analysis in a collaborative environment: the stage is set for classroom citizen science. In the context of this project we will address the question of how teachers can take advantage of the new web to encourage students to become creative problem-solvers in online collaborative contexts without looking past the foundation of careful preparation and the standards of reliability associated with publication in the STEM disciplines.

  13. Implications of Web of Science journal impact factor for scientific output evaluation in 16 institutions and investigators' opinion.

    PubMed

    Wáng, Yì-Xiáng J; Arora, Richa; Choi, Yongdoo; Chung, Hsiao-Wen; Egorov, Vyacheslav I; Frahm, Jens; Kudo, Hiroyuki; Kuyumcu, Suleyman; Laurent, Sophie; Loffroy, Romaric; Maurea, Simone; Morcos, Sameh K; Ni, Yicheng; Oei, Edwin H G; Sabarudin, Akmal; Yu, Xin

    2014-12-01

    Journal-based metrics are known not to be ideal for measuring the quality of an individual researcher's scientific output. In the current report, 16 contributors from Hong Kong SAR, India, Korea, Taiwan, Russia, Germany, Japan, Turkey, Belgium, France, Italy, the UK, The Netherlands, Malaysia, and the USA were invited. The following six questions were asked: (I) Are the Web of Science journal impact factor (IF) and Institute for Scientific Information (ISI) citation counts the main academic output performance evaluation tools in your institution and your country? (II) How do Google citation counts figure in your institution and your country? (III) If a paper is published in a non-SCI journal but is included in PubMed and searchable by Google Scholar, how is it valued compared with a paper published in a journal with an IF? (IV) Do you value publishing a piece of your work in a non-SCI journal as much as a paper published in a journal with an IF? (V) What is your personal view on the metric measurement of scientific output? (VI) Overall, do you think the Web of Science journal IF is beneficial, or is it actually doing more harm? The results show that IF and ISI citations heavily affect academic life in most of the institutions. Google citation evaluation, while being used and both convenient and speedy, has not gained wide 'official' recognition as a tool for scientific output evaluation.

  14. Google Maps: You Are Here

    ERIC Educational Resources Information Center

    Jacobsen, Mikael

    2008-01-01

    Librarians use online mapping services such as Google Maps, MapQuest, Yahoo Maps, and others to check traffic conditions, find local businesses, and provide directions. However, few libraries are using one of Google Maps most outstanding applications, My Maps, for the creation of enhanced and interactive multimedia maps. My Maps is a simple and…

  15. Tapir: A web interface for transit/eclipse observability

    NASA Astrophysics Data System (ADS)

    Jensen, Eric

    2013-06-01

    Tapir is a set of tools, written in Perl, that provides a web interface for showing the observability of periodic astronomical events, such as exoplanet transits or eclipsing binaries. The package provides tools for creating finding charts for each target and airmass plots for each event. The code can access target lists that are stored on-line in a Google spreadsheet or in a local text file.
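
    Tapir itself is Perl, but the core observability calculation is easy to sketch. Below, airmass (sec z) for a target over one night is computed with astropy; the site coordinates and target are hypothetical examples, and SkyCoord.from_name needs network access to resolve the name.

        import numpy as np
        from astropy import units as u
        from astropy.coordinates import AltAz, EarthLocation, SkyCoord
        from astropy.time import Time

        target = SkyCoord.from_name("HD 189733")      # online name resolution
        site = EarthLocation(lat=31.96 * u.deg, lon=-111.60 * u.deg,
                             height=2096 * u.m)       # hypothetical observatory

        times = Time("2013-06-15 03:00:00") + np.linspace(0, 8, 17) * u.hour
        altaz = target.transform_to(AltAz(obstime=times, location=site))
        for t, secz in zip(times.iso, altaz.secz):
            if secz > 0:                              # target above the horizon
                print(t, f"airmass = {secz:.2f}")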

  16. Using Google Analytics to evaluate the impact of the CyberTraining project.

    PubMed

    McGuckin, Conor; Crowley, Niall

    2012-11-01

    A focus on results and impact should be at the heart of every project's approach to research and dissemination. This article discusses the potential of Google Analytics (GA: http://google.com/analytics) as an effective resource for measuring the impact of academic research output and understanding the geodemographics of users of specific Web 2.0 content (e.g., intervention and prevention materials, health promotion and advice). This article presents the results of GA analyses as a resource used in measuring the impact of the EU-funded CyberTraining project, which provided a well-grounded, research-based training manual on cyberbullying for trainers through the medium of a Web-based eBook (www.cybertraining-project.org). The training manual includes review information on cyberbullying, its nature and extent across Europe, analyses of current projects, and provides resources for trainers working with the target groups of pupils, parents, teachers, and other professionals. Results illustrate the promise of GA as an effective tool for measuring the impact of academic research and project output with real potential for tracking and understanding intra- and intercountry regional variations in the uptake of prevention and intervention materials, thus enabling precision focusing of attention to those regions.

  17. Development of a web service for analysis in a distributed network.

    PubMed

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.

  18. Development of a Web Service for Analysis in a Distributed Network

    PubMed Central

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    Objective: We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. Background: We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. Methods: We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. Discussion: During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Conclusion: Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes. PMID:25848586
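
    The aggregation idea at the heart of GLORE can be sketched compactly: each site reports only the gradient and Hessian of its local logistic log-likelihood, and a central Newton-Raphson loop combines them, so patient-level rows never leave a site. This is an illustrative re-implementation, not the published web service.

        import numpy as np

        def local_contribution(X, y, beta):
            """Computed at a site: gradient and Hessian of the local
            logistic log-likelihood at the current coefficients beta."""
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            grad = X.T @ (y - p)
            hess = -(X.T * (p * (1 - p))) @ X
            return grad, hess

        def glore_fit(sites, n_features, iters=25):
            beta = np.zeros(n_features)
            for _ in range(iters):
                parts = [local_contribution(X, y, beta) for X, y in sites]
                grad = sum(g for g, _ in parts)
                hess = sum(h for _, h in parts)
                beta = beta - np.linalg.solve(hess, grad)  # Newton-Raphson step
            return beta

        # Two hypothetical sites holding disjoint patient data:
        rng = np.random.default_rng(0)
        X1, X2 = rng.normal(size=(100, 3)), rng.normal(size=(150, 3))
        true_beta = np.array([0.5, -1.0, 2.0])
        y1 = rng.binomial(1, 1 / (1 + np.exp(-X1 @ true_beta)))
        y2 = rng.binomial(1, 1 / (1 + np.exp(-X2 @ true_beta)))
        print(glore_fit([(X1, y1), (X2, y2)], n_features=3))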

  19. GIS Technologies For The New Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Docasal, R.; Barbarisi, I.; Rios, C.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; De Marchi, G.; Martinez, S.; Grotheer, E.; Lim, T.; Besse, S.; Heather, D.; Fraga, D.; Barthelemy, M.

    2015-12-01

    Geographical information systems (GIS) are becoming increasingly used in planetary science. GIS are computerised systems for the storage, retrieval, manipulation, analysis, and display of geographically referenced data. Some data stored in the Planetary Science Archive (PSA), for instance a set of Mars Express/Venus Express data, have spatial metadata associated with them. To facilitate users in handling and visualising spatial data in GIS applications, the new PSA should support interoperability with interfaces implementing the standards approved by the Open Geospatial Consortium (OGC). These standards are followed in order to develop open interfaces and encodings that allow data to be exchanged with GIS client applications, well-known examples of which are Google Earth and NASA World Wind, as well as open source tools such as OpenLayers. The technology already exists within PostgreSQL databases to store searchable geometrical data in the form of the PostGIS extension. GeoServer is an existing open source map server; an instance deployed for the new PSA uses the OGC standards to allow, among other things, the sharing, processing, and editing of spatial data through the Web Feature Service (WFS) standard, as well as the serving of georeferenced map images through the Web Map Service (WMS). The final goal of the new PSA, being developed by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is to create an archive which enables science exploitation of ESA's planetary mission datasets. This can be facilitated through the GIS framework, offering interfaces (both web GUI and scriptable APIs) that can be used more easily and scientifically by the community, and that will also enable the community to build added-value services on top of the PSA.
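
    As a client-side illustration of the WFS interface mentioned above, the following sketch lists the feature types a server advertises and fetches features inside a bounding box with OWSLib; the endpoint URL and type name are hypothetical placeholders.

        from owslib.wfs import WebFeatureService

        wfs = WebFeatureService("https://example.org/geoserver/ows",
                                version="2.0.0")
        print(list(wfs.contents))             # feature types the server advertises

        resp = wfs.getfeature(typename=["psa:footprints"],
                              bbox=(-10, -5, 10, 5))
        print(resp.read()[:400])              # beginning of the GML response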

  20. The impact of the web and social networks on vaccination. New challenges and opportunities offered to fight against vaccine hesitancy.

    PubMed

    Stahl, J-P; Cohen, R; Denis, F; Gaudelus, J; Martinot, A; Lery, T; Lepetit, H

    2016-05-01

    Vaccine hesitancy is a growing and threatening trend, increasing the risk of disease outbreaks and potentially defeating health authorities' strategies. We aimed to describe the significant role of social networks and the Internet on vaccine hesitancy, and more generally on vaccine attitudes and behaviors. Presentation and discussion of lessons learnt from: (i) the monitoring and analysis of web and social network contents on vaccination; (ii) the tracking of Google search terms used by web users; (iii) the analysis of Google search suggestions related to vaccination; (iv) results from the Vaccinoscopie(©) study, online annual surveys of representative samples of 6500 to 10,000 French mothers, monitoring vaccine behaviors and attitude of French parents as well as vaccination coverage of their children, since 2008; and (v) various studies published in the scientific literature. Social networks and the web play a major role in disseminating information about vaccination. They have modified the vaccination decision-making process and, more generally, the doctor/patient relationship. The Internet may fuel controversial issues related to vaccination and durably impact public opinion, but it may also provide new tools to fight against vaccine hesitancy. Vaccine hesitancy should be fought on the Internet battlefield, and for this purpose, communication strategies should take into account new threats and opportunities offered by the web and social networks. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  1. An Evaluation of Web- and Print-Based Methods to Attract People to a Physical Activity Intervention

    PubMed Central

    Jennings, Cally; Plotnikoff, Ronald C; Vandelanotte, Corneel

    2016-01-01

    Background Cost-effective and efficient methods to attract people to Web-based health behavior interventions need to be identified. Traditional print methods including leaflets, posters, and newspaper advertisements remain popular despite the expanding range of Web-based advertising options that have the potential to reach larger numbers at lower cost. Objective This study evaluated the effectiveness of multiple Web-based and print-based methods to attract people to a Web-based physical activity intervention. Methods A range of print-based (newspaper advertisements, newspaper articles, letterboxing, leaflets, and posters) and Web-based (Facebook advertisements, Google AdWords, and community calendars) methods were applied to attract participants to a Web-based physical activity intervention in Australia. The time investment, cost, number of first time website visits, the number of completed sign-up questionnaires, and the demographics of participants were recorded for each advertising method. Results A total of 278 people signed up to participate in the physical activity program. Of the print-based methods, newspaper advertisements totaled AUD $145, letterboxing AUD $135, leaflets AUD $66, posters AUD $52, and newspaper article AUD $3 per sign-up. Of the Web-based methods, Google AdWords totaled AUD $495, non-targeted Facebook advertisements AUD $68, targeted Facebook advertisements AUD $42, and community calendars AUD $12 per sign-up. Although the newspaper article and community calendars cost the least per sign-up, they resulted in only 17 and 6 sign-ups respectively. The targeted Facebook advertisements were the next most cost-effective method and reached a large number of sign-ups (n=184). The newspaper article and the targeted Facebook advertisements required the lowest time investment per sign-up (5 and 7 minutes respectively). People reached through the targeted Facebook advertisements were on average older (60 years vs 50 years, P<.001) and had a higher body mass index (32 vs 30, P<.05) than people reached through the other methods. Conclusions Overall, our results demonstrate that targeted Facebook advertising is the most cost-effective and efficient method at attracting moderate numbers to physical activity interventions in comparison to the other methods tested. Newspaper advertisements, letterboxing, and Google AdWords were not effective. The community calendars and newspaper articles may be effective for small community interventions. ClinicalTrial Australian New Zealand Clinical Trials Registry: ACTRN12614000339651; https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=363570&isReview=true (Archived by WebCite at http://www.webcitation.org/6hMnFTvBt) PMID:27235075

  2. An Evaluation of Web- and Print-Based Methods to Attract People to a Physical Activity Intervention.

    PubMed

    Alley, Stephanie; Jennings, Cally; Plotnikoff, Ronald C; Vandelanotte, Corneel

    2016-05-27

    Cost-effective and efficient methods to attract people to Web-based health behavior interventions need to be identified. Traditional print methods including leaflets, posters, and newspaper advertisements remain popular despite the expanding range of Web-based advertising options that have the potential to reach larger numbers at lower cost. This study evaluated the effectiveness of multiple Web-based and print-based methods to attract people to a Web-based physical activity intervention. A range of print-based (newspaper advertisements, newspaper articles, letterboxing, leaflets, and posters) and Web-based (Facebook advertisements, Google AdWords, and community calendars) methods were applied to attract participants to a Web-based physical activity intervention in Australia. The time investment, cost, number of first time website visits, the number of completed sign-up questionnaires, and the demographics of participants were recorded for each advertising method. A total of 278 people signed up to participate in the physical activity program. Of the print-based methods, newspaper advertisements totaled AUD $145, letterboxing AUD $135, leaflets AUD $66, posters AUD $52, and newspaper article AUD $3 per sign-up. Of the Web-based methods, Google AdWords totaled AUD $495, non-targeted Facebook advertisements AUD $68, targeted Facebook advertisements AUD $42, and community calendars AUD $12 per sign-up. Although the newspaper article and community calendars cost the least per sign-up, they resulted in only 17 and 6 sign-ups respectively. The targeted Facebook advertisements were the next most cost-effective method and reached a large number of sign-ups (n=184). The newspaper article and the targeted Facebook advertisements required the lowest time investment per sign-up (5 and 7 minutes respectively). People reached through the targeted Facebook advertisements were on average older (60 years vs 50 years, P<.001) and had a higher body mass index (32 vs 30, P<.05) than people reached through the other methods. Overall, our results demonstrate that targeted Facebook advertising is the most cost-effective and efficient method at attracting moderate numbers to physical activity interventions in comparison to the other methods tested. Newspaper advertisements, letterboxing, and Google AdWords were not effective. The community calendars and newspaper articles may be effective for small community interventions. Australian New Zealand Clinical Trials Registry: ACTRN12614000339651; https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=363570&isReview=true (Archived by WebCite at http://www.webcitation.org/6hMnFTvBt).

  3. Has the American Public's Interest in Information Related to Relationships Beyond "The Couple" Increased Over Time?

    PubMed

    Moors, Amy C

    2017-01-01

    Finding romance, love, and sexual intimacy is a central part of our life experience. Although people engage in romance in a variety of ways, alternatives to "the couple" are largely overlooked in relationship research. Scholars and the media have recently argued that the rules of romance are changing, suggesting that interest in consensual departures from monogamy may become popular as people navigate their long-term coupling. This study utilizes Google Trends to assess Americans' interest in seeking out information related to consensual nonmonogamous relationships across a 10-year period (2006-2015). Using anonymous Web queries from hundreds of thousands of Google search engine users, results show that searches for words related to polyamory and open relationships (but not swinging) have significantly increased over time. Moreover, the magnitude of the correlation between consensual nonmonogamy Web queries and time was significantly higher than popular Web queries over the same time period, indicating this pattern of increased interest in polyamory and open relationships is unique. Future research avenues for incorporating consensual nonmonogamous relationships into relationship science are discussed.

  4. The Role of Google Scholar in Evidence Reviews and Its Applicability to Grey Literature Searching

    PubMed Central

    Haddaway, Neal Robert; Collins, Alexandra Mary; Coughlin, Deborah; Kirk, Stuart

    2015-01-01

    Google Scholar (GS), a commonly used web-based academic search engine, catalogues between 2 and 100 million records of both academic and grey literature (articles not formally published by commercial academic publishers). Google Scholar collates results from across the internet and is free to use. As a result it has received considerable attention as a method for searching for literature, particularly in searches for grey literature, as required by systematic reviews. The reliance on GS as a standalone resource has been greatly debated, however, and its efficacy in grey literature searching has not yet been investigated. Using systematic review case studies from environmental science, we investigated the utility of GS in systematic reviews and in searches for grey literature. Our findings show that GS results contain moderate amounts of grey literature, with the majority found on average at page 80. We also found that, when searched for specifically, the majority of literature identified using Web of Science was also found using GS. However, our findings showed moderate/poor overlap in results when similar search strings were used in Web of Science and GS (10–67%), and that GS missed some important literature in five of six case studies. Furthermore, a general GS search failed to find any grey literature from a case study that involved manual searching of organisations’ websites. If used in systematic reviews for grey literature, we recommend that searches of article titles focus on the first 200 to 300 results. We conclude that whilst Google Scholar can find much grey literature and specific, known studies, it should not be used alone for systematic review searches. Rather, it forms a powerful addition to other traditional search methods. In addition, we advocate the use of tools to transparently document and catalogue GS search results to maintain high levels of transparency and the ability to be updated, critical to systematic reviews. PMID:26379270

  5. The Role of Google Scholar in Evidence Reviews and Its Applicability to Grey Literature Searching.

    PubMed

    Haddaway, Neal Robert; Collins, Alexandra Mary; Coughlin, Deborah; Kirk, Stuart

    2015-01-01

    Google Scholar (GS), a commonly used web-based academic search engine, catalogues between 2 and 100 million records of both academic and grey literature (articles not formally published by commercial academic publishers). Google Scholar collates results from across the internet and is free to use. As a result it has received considerable attention as a method for searching for literature, particularly in searches for grey literature, as required by systematic reviews. The reliance on GS as a standalone resource has been greatly debated, however, and its efficacy in grey literature searching has not yet been investigated. Using systematic review case studies from environmental science, we investigated the utility of GS in systematic reviews and in searches for grey literature. Our findings show that GS results contain moderate amounts of grey literature, with the majority found on average at page 80. We also found that, when searched for specifically, the majority of literature identified using Web of Science was also found using GS. However, our findings showed moderate/poor overlap in results when similar search strings were used in Web of Science and GS (10-67%), and that GS missed some important literature in five of six case studies. Furthermore, a general GS search failed to find any grey literature from a case study that involved manual searching of organisations' websites. If used in systematic reviews for grey literature, we recommend that searches of article titles focus on the first 200 to 300 results. We conclude that whilst Google Scholar can find much grey literature and specific, known studies, it should not be used alone for systematic review searches. Rather, it forms a powerful addition to other traditional search methods. In addition, we advocate the use of tools to transparently document and catalogue GS search results to maintain high levels of transparency and the ability to be updated, critical to systematic reviews.

  6. Tweeting links to Cochrane Schizophrenia Group reviews: a randomised controlled trial.

    PubMed

    Adams, C E; Jayaram, M; Bodart, A Y M; Sampson, S; Zhao, S; Montgomery, A A

    2016-03-08

    To assess the effects of using health social media on web activity. Individually randomised controlled parallel-group superiority trial. Twitter and Weibo. 170 Cochrane Schizophrenia Group full reviews with an abstract and plain language summary web page. Three randomly ordered, slightly different messages of 140 characters or fewer, each containing a short URL to the freely accessible summary page, sent at specific times on a single day. This was compared with no messaging. The primary outcome was web page visits at 1 week. Secondary outcomes were other metrics of web activity at 1 week. 85 reviews were randomised to each of the intervention and control arms. Google Analytics allowed 100% follow-up within 1 week of completion. Intervention and control reviews received a total of 1162 and 449 visits, respectively (IRR 2.7, 95% CI 2.2 to 3.3). Fewer intervention reviews had single-page-only visits (16% vs 31%, OR 0.41, 0.19 to 0.88), and users spent more time viewing intervention reviews (geometric mean 76 vs 31 s, ratio 2.5, 1.3 to 4.6). Other secondary metrics of web activity all showed strong evidence in favour of the intervention. Tweeting in this limited area of healthcare increases 'product placement' of evidence with the potential for that to influence care. ISRCTN84658943. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  7. Virtual Sensors in a Web 2.0 Digital Watershed

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Hill, D. J.; Marini, L.; Kooper, R.; Rodriguez, A.; Myers, J. D.

    2008-12-01

    The lack of rainfall data in many watersheds is one of the major barriers to modeling and studying many environmental and hydrological processes and to supporting decision making. There are simply not enough rain gages on the ground. To overcome this data scarcity issue, a Web 2.0 digital watershed has been developed at NCSA (National Center for Supercomputing Applications), where users can point-and-click on a web-based Google Maps interface and create new precipitation virtual sensors at any location within the same coverage region as a NEXRAD station. A set of scientific workflows is implemented to perform spatial, temporal, and thematic transformations on the near-real-time NEXRAD Level II data. Such workflows can be triggered by the users' actions and generate either rainfall-rate or rainfall-accumulation streaming data at a user-specified time interval. We will discuss some underlying components of this digital watershed, which consists of a semantic content management middleware, a semantically enhanced streaming data toolkit, virtual sensor management functionality, and a RESTful (REpresentational State Transfer) web service that can trigger workflow execution. Such a loosely coupled architecture presents a generic framework for constructing a Web 2.0 style digital watershed. An implementation of this architecture at the Upper Illinois River Basin will be presented. We will also discuss the implications of the virtual sensor concept for the broad environmental observatory community and how this concept will help us move towards a participatory digital watershed.
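
    A minimal sketch of the point-and-click registration step follows: a RESTful endpoint that records a new precipitation virtual sensor at a latitude/longitude and would, in the real system, trigger the NEXRAD transformation workflow. All names are hypothetical (the NCSA middleware itself is not a public API), and Flask stands in for whatever service container is actually used.

        from flask import Flask, jsonify, request

        app = Flask(__name__)
        sensors = {}

        @app.post("/virtual-sensors")          # Flask 2.0+ shorthand for POST routes
        def create_sensor():
            spec = request.get_json()
            sensor_id = len(sensors) + 1
            sensors[sensor_id] = {"lat": spec["lat"], "lon": spec["lon"],
                                  "interval_min": spec.get("interval_min", 15)}
            # The real system would launch the spatial/temporal/thematic
            # workflow against near-real-time NEXRAD Level II data here.
            return jsonify(id=sensor_id, **sensors[sensor_id]), 201

        if __name__ == "__main__":
            app.run()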

  8. Plus One More?

    ERIC Educational Resources Information Center

    Gross, Liz

    2012-01-01

    As a new year begins, higher education professionals who manage social media are getting to know the latest social network, Google+, and how they can best use Google+ Pages to advance their institutions. When Google+ first came on the scene in late June 2011, several institutions signed up and began using the service. Given the popularity of other…

  9. The value and impact of information provided through library services for patient care: developing guidance for best practice.

    PubMed

    Weightman, Alison; Urquhart, Christine; Spink, Siân; Thomas, Rhian

    2009-03-01

    Previous impact tool-kits for UK health libraries required updating to reflect recent evidence and changes in library services. The National Knowledge Service funded development of updated guidance. Survey tools were developed based on previous impact studies and a systematic review. The resulting draft questionnaire survey was tested at four sites, and the interview schedule was investigated in a fifth area. A literature search in ASSIA, Google Scholar, INTUTE, LISA, LISTA, SCIRUS, Social Sciences Citation Index (Web of Knowledge), and the major UK University and National Libraries Catalogue (COPAC), identified ways to improve response rates. Other expert advice contributed to the guidance. The resulting guidance contains evidence-based advice and a planning pathway for conducting an impact survey as a service audit. The survey tools (critical incident questionnaire and interview schedule) are available online. The evidence-based advice recommends personalizing the request, assuring confidentiality, and using follow-up reminders. Questionnaires should be brief, and small incentives, such as a lottery draw should be considered. Bias is minimized if the survey is conducted and analysed by independent researchers. The guidance is a starting point for a pragmatic survey to assess the impact of health library services.

  10. Evaluation of the content and accessibility of web sites for accredited orthopaedic sports medicine fellowships.

    PubMed

    Mulcahey, Mary K; Gosselin, Michelle M; Fadale, Paul D

    2013-06-19

    The Internet is a common source of information for orthopaedic residents applying for sports medicine fellowships, with the web sites of the American Orthopaedic Society for Sports Medicine (AOSSM) and the San Francisco Match serving as central databases. We sought to evaluate the web sites for accredited orthopaedic sports medicine fellowships with regard to content and accessibility. We reviewed the existing web sites of the ninety-five accredited orthopaedic sports medicine fellowships included in the AOSSM and San Francisco Match databases from February to March 2012. A Google search was performed to determine the overall accessibility of program web sites and to supplement information obtained from the AOSSM and San Francisco Match web sites. The study sample consisted of the eighty-seven programs whose web sites connected to information about the fellowship. Each web site was evaluated for its informational value. Of the ninety-five programs, fifty-one (54%) had links listed in the AOSSM database. Three (3%) of all accredited programs had web sites that were linked directly to information about the fellowship. Eighty-eight (93%) had links listed in the San Francisco Match database; however, only five (5%) had links that connected directly to information about the fellowship. Of the eighty-seven programs analyzed in our study, all eighty-seven web sites (100%) provided a description of the program and seventy-six web sites (87%) included information about the application process. Twenty-one web sites (24%) included a list of current fellows. Fifty-six web sites (64%) described the didactic instruction, seventy (80%) described team coverage responsibilities, forty-seven (54%) included a description of cases routinely performed by fellows, forty-one (47%) described the role of the fellow in seeing patients in the office, eleven (13%) included call responsibilities, and seventeen (20%) described a rotation schedule. Two Google searches identified direct links for 67% to 71% of all accredited programs. Most accredited orthopaedic sports medicine fellowships lack easily accessible or complete web sites in the AOSSM or San Francisco Match databases. Improvement in the accessibility and quality of information on orthopaedic sports medicine fellowship web sites would facilitate the ability of applicants to obtain useful information.

  11. Engaging the YouTube Google-Eyed Generation: Strategies for Using Web 2.0 in Teaching and Learning

    ERIC Educational Resources Information Center

    Duffy, Peter

    2008-01-01

    YouTube, Podcasting, Blogs, Wikis and RSS are buzz words currently associated with the term Web 2.0 and represent a shifting pedagogical paradigm for the use of a new set of tools within education. The implication here is a possible shift from the basic archetypical vehicles used for (e)learning today (lecture notes, printed material, PowerPoint,…

  12. Extensible Probabilistic Repository Technology (XPRT)

    DTIC Science & Technology

    2004-10-01

    …projects, such as Centaurus and the Evidence Data Base (EDB); others were fabricated, such as INS and FED; while others contain data from the open… The simulated sources included Google (web reports, unlimited, SOAP API), BBC News (news, unlimited, Web/RSS 1.0), and Centaurus (person demographics, 204,402 people from 240 countries). …objects of the domain ontology map to the various simulated data sources. For example, the PersonDemographics are stored in the Centaurus database, while…

  13. Being There is Only the Beginning: Toward More Effective Web 2.0 Use in Academic Libraries

    DTIC Science & Technology

    2010-01-02

    …“Google is Our Friend,” and “Plagiarism 101.” Also unlike the hard-to-find blogs, many academic libraries, including both Hollins University and Urbana… “Being There is Only the Beginning: Toward More Effective Web 2.0 Use in Academic Libraries,” by Hanna C. Bachrach, Pratt Institute.

  14. Integrating Radar Image Data with Google Maps

    NASA Technical Reports Server (NTRS)

    Chapman, Bruce D.; Gibas, Sarah

    2010-01-01

    A public Web site has been developed as a method for displaying the multitude of radar imagery collected by NASA's Airborne Synthetic Aperture Radar (AIRSAR) instrument during its 16-year mission. Utilizing NASA's internal AIRSAR site, the new Web site features more sophisticated visualization tools that enable the general public to have access to these images. The site was originally maintained at NASA on six computers: one that held the Oracle database, two that took care of the software for the interactive map, and three that were for the Web site itself. Several tasks were involved in moving this complicated setup to just one computer. First, the AIRSAR database was migrated from Oracle to MySQL. Then the back-end of the AIRSAR Web site was updated in order to access the MySQL database. To do this, a few of the scripts needed to be modified; specifically, three Perl scripts that query the database. The database connections were then updated from Oracle to MySQL, numerous syntax errors were corrected, and a query was implemented that replaced one of the stored Oracle procedures. Lastly, the interactive map was designed, implemented, and tested so that users could easily browse and access the radar imagery through the Google Maps interface.
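
    The query-migration step described above can be illustrated concretely; the sketch below (Python rather than the project's Perl, with hypothetical table, column, and connection names) shows the kind of rewrite involved when moving from Oracle's ROWNUM idiom to MySQL's LIMIT:

        # A minimal sketch of an Oracle-to-MySQL query migration, with
        # hypothetical table/column names; the real AIRSAR scripts were Perl.
        import mysql.connector

        # Oracle paginated results with ROWNUM (shown for comparison):
        #   SELECT * FROM (SELECT * FROM images ORDER BY acq_date) WHERE ROWNUM <= 10
        # MySQL expresses the same thing with LIMIT:
        query = "SELECT image_id, acq_date FROM images ORDER BY acq_date LIMIT 10"

        conn = mysql.connector.connect(
            host="localhost", user="airsar", password="secret", database="airsar"
        )  # connection parameters are placeholders
        cur = conn.cursor()
        cur.execute(query)
        for image_id, acq_date in cur:
            print(image_id, acq_date)
        conn.close()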

  15. Vocabulary services to support scientific data interoperability

    NASA Astrophysics Data System (ADS)

    Cox, Simon; Mills, Katie; Tan, Florence

    2013-04-01

    Shared vocabularies are a core element in interoperable systems. Vocabularies need to be available at run-time, and where the vocabularies are shared by a distributed community this implies the use of web technology to provide vocabulary services. Given the ubiquity of vocabularies or classifiers in systems, vocabulary services are effectively the base of the interoperability stack. In contemporary knowledge organization systems, a vocabulary item is considered a concept, with the "terms" denoting it appearing as labels. The Simple Knowledge Organization System (SKOS) formalizes this as an RDF Schema (RDFS) application, with a bridge to formal logic in Web Ontology Language (OWL). For maximum utility, a vocabulary should be made available through the following interfaces: * the vocabulary as a whole - at an ontology URI corresponding to a vocabulary document * each item in the vocabulary - at the item URI * summaries, subsets, and resources derived by transformation * through the standard RDF web API - i.e. a SPARQL endpoint * through a query form for human users. However, the vocabulary data model may be leveraged directly in a standard vocabulary API that uses the semantics provided by SKOS. SISSvoc3 [1] accomplishes this as a standard set of URI templates for a vocabulary. Any URI conforming to the template selects a vocabulary subset based on the SKOS properties, including labels (skos:prefLabel, skos:altLabel, rdfs:label) and a subset of the semantic relations (skos:broader, skos:narrower, etc). SISSvoc3 thus provides a RESTful SKOS API to query a vocabulary while hiding the complexity of SPARQL. It has been implemented using the Linked Data API (LDA) [2], which connects to a SPARQL endpoint. By using LDA, we also get content negotiation, alternative views, paging, metadata and other functionality provided in a standard way. A number of vocabularies have been formalized in SKOS and deployed by CSIRO, the Australian Bureau of Meteorology (BOM) and their collaborators using SISSvoc3, including: * geologic timescale (multiple versions) * soils classification * definitions from OGC standards * geosciml vocabularies * mining commodities * hyperspectral scalars. Several other agencies in Australia have adopted SISSvoc3 for their vocabularies. SISSvoc3 differs from other SKOS-based vocabulary-access APIs such as GEMET [3] and NVS [4] in that (a) the service is decoupled from the content store, and (b) the service URI is independent of the content URIs. This means that a SISSvoc3 interface can be deployed over any SKOS vocabulary which is available at a SPARQL endpoint. As an example, a SISSvoc3 query and presentation interface has been deployed over the NERC vocabulary service hosted by the BODC, providing a search interface which is not available natively. We use vocabulary services to populate menus in user interfaces, to support data validation, and to configure data conversion routines. Related services built on LDA have also been used as a generic registry interface, and extended for serving gazetteer information. ACKNOWLEDGEMENTS The CSIRO SISSvoc3 implementation is built using the Epimorphics ELDA platform http://code.google.com/p/elda/. We thank Jacqui Githaiga and Terry Rankine for their contributions to SISSvoc design and implementation. REFERENCES 1. SISSvoc3 Specification https://www.seegrid.csiro.au/wiki/Siss/SISSvoc30Specification 2. Linked Data API http://code.google.com/p/linked-data-api/wiki/Specification 3. GEMET https://svn.eionet.europa.eu/projects/Zope/wiki/GEMETWebServiceAPI 4. NVS 2.0 http://vocab.nerc.ac.uk/
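
    Since SISSvoc3 is a facade over a SPARQL endpoint, the kind of query it hides from users can be shown directly. A minimal sketch, assuming a hypothetical endpoint URL; the SKOS property names are standard:

        # Query a SKOS vocabulary for concept labels via a SPARQL endpoint.
        # The endpoint URL is a placeholder; the SKOS properties are standard.
        import requests

        ENDPOINT = "http://example.org/sparql"  # hypothetical endpoint
        QUERY = """
        PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
        SELECT ?concept ?label WHERE {
          ?concept skos:prefLabel ?label .
        } LIMIT 20
        """

        resp = requests.get(
            ENDPOINT,
            params={"query": QUERY},
            headers={"Accept": "application/sparql-results+json"},
        )
        for row in resp.json()["results"]["bindings"]:
            print(row["concept"]["value"], "->", row["label"]["value"])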

  16. Directed network modules

    NASA Astrophysics Data System (ADS)

    Palla, Gergely; Farkas, Illés J.; Pollner, Péter; Derényi, Imre; Vicsek, Tamás

    2007-06-01

    A search technique for locating network modules, i.e., internally densely connected groups of nodes in directed networks, is introduced by extending the clique percolation method originally proposed for undirected networks. After giving a suitable definition for directed modules we investigate their percolation transition in the Erdős-Rényi graph both analytically and numerically. We also analyse four real-world directed networks, including Google's own web-pages, an email network, a word association graph and the transcriptional regulatory network of the yeast Saccharomyces cerevisiae. The obtained directed modules are validated by additional information available for the nodes. We find that directed modules of real-world graphs inherently overlap and the investigated networks can be classified into two major groups in terms of the overlaps between the modules. Accordingly, in the word-association network and Google's web-pages, overlaps are likely to contain in-hubs, whereas the modules in the email and transcriptional regulatory network tend to overlap via out-hubs.
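
    For readers who want to experiment, the original undirected clique percolation method that this work extends is implemented in NetworkX; a minimal sketch on a random test graph:

        # Undirected clique percolation (the method the paper extends to
        # directed graphs), using NetworkX's k-clique community finder.
        import networkx as nx
        from networkx.algorithms.community import k_clique_communities

        G = nx.gnp_random_graph(100, 0.06, seed=42)  # Erdős-Rényi test graph
        modules = list(k_clique_communities(G, k=3))  # chains of adjacent 3-cliques
        print(f"{len(modules)} modules found")
        # Overlaps: nodes belonging to more than one module
        if len(modules) > 1:
            print("overlap of first two modules:",
                  set(modules[0]) & set(modules[1]))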

  17. Google matrix analysis of directed networks

    NASA Astrophysics Data System (ADS)

    Ermann, Leonardo; Frahm, Klaus M.; Shepelyansky, Dima L.

    2015-10-01

    In the past decade modern societies have developed enormous communication and social networks. Their classification and information retrieval processing has become a formidable task for the society. Because of the rapid growth of the World Wide Web, and social and communication networks, new mathematical methods have been invented to characterize the properties of these networks in a more detailed and precise way. Various search engines extensively use such methods. It is highly important to develop new tools to classify and rank a massive amount of network information in a way that is adapted to internal network structures and characteristics. This review describes the Google matrix analysis of directed complex networks demonstrating its efficiency using various examples including the World Wide Web, Wikipedia, software architectures, world trade, social and citation networks, brain neural networks, DNA sequences, and Ulam networks. The analytical and numerical matrix methods used in this analysis originate from the fields of Markov chains, quantum chaos, and random matrix theory.
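
    The central object of this review, the Google matrix G = alpha*S + (1-alpha)*E/N, can be constructed for a toy directed network and its PageRank computed by power iteration; a minimal numpy sketch (alpha = 0.85 is the conventional damping factor):

        # Build the Google matrix G = alpha*S + (1-alpha)/N * ones, where S is
        # the column-stochastic adjacency matrix (dangling nodes replaced by
        # uniform columns), then find PageRank by power iteration.
        import numpy as np

        A = np.array([[0, 0, 1, 0],   # A[i, j] = 1 if node j links to node i
                      [1, 0, 0, 0],
                      [1, 1, 0, 0],
                      [0, 1, 0, 0]], dtype=float)
        N = A.shape[0]
        col_sums = A.sum(axis=0)
        S = np.where(col_sums > 0, A / np.where(col_sums == 0, 1, col_sums), 1.0 / N)
        alpha = 0.85
        G = alpha * S + (1 - alpha) / N * np.ones((N, N))

        p = np.full(N, 1.0 / N)
        for _ in range(100):          # power iteration converges geometrically
            p = G @ p
        print("PageRank:", p / p.sum())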

  18. Google and Microsoft[R] Go to School

    ERIC Educational Resources Information Center

    Dessoff, Alan

    2010-01-01

    Steve Nelson, chief IT strategist for the Oregon Department of Education needed a new e-mail service that would provide security and privacy to users and found what he was looking for with Google Apps Education Edition. The Kentucky Department of Education wanted an improved capability to provide e-mail services to the more than 700,000 students,…

  19. Free or Open Access to Scholarly Documentation: Google Scholar or Academic Libraries

    ERIC Educational Resources Information Center

    Burns, C. Sean

    2013-01-01

    Soon after the university movement started in the late 1800s, academic libraries became the dominant providers of the tools and services required to locate and access scholarly information. However, with the advent of alternate discovery services, such as Google Scholar, in conjunction with open access scholarly content, researchers now have the…

  20. Paths of Discovery: Comparing the Search Effectiveness of EBSCO Discovery Service, Summon, Google Scholar, and Conventional Library Resources

    ERIC Educational Resources Information Center

    Asher, Andrew D.; Duke, Lynda M.; Wilson, Suzanne

    2013-01-01

    In 2011, researchers at Bucknell University and Illinois Wesleyan University compared the search efficacy of Serial Solutions Summon, EBSCO Discovery Service, Google Scholar, and conventional library databases. Using a mixed-methods approach, qualitative and quantitative data were gathered on students' usage of these tools. Regardless of the…

  1. Participating in the Geospatial Web: Collaborative Mapping, Social Networks and Participatory GIS

    NASA Astrophysics Data System (ADS)

    Rouse, L. Jesse; Bergeron, Susan J.; Harris, Trevor M.

    In 2005, Google, Microsoft and Yahoo! released free Web mapping applications that opened up digital mapping to mainstream Internet users. Importantly, these companies also released free APIs for their platforms, allowing users to geo-locate and map their own data. These initiatives have spurred the growth of the Geospatial Web and represent spatially aware online communities and new ways of enabling communities to share information from the bottom up. This chapter explores how the emerging Geospatial Web can meet some of the fundamental needs of Participatory GIS projects to incorporate local knowledge into GIS, as well as promote public access and collaborative mapping.

  2. Greek Academic Librarians' Perceptions of the Impact of Google on Their Role as Information Providers

    ERIC Educational Resources Information Center

    Garoufallou, Emmanouel; Balatsoukas, Panos; Siatri, Rania; Zafeiriou, Georgia; Asderi, S.; Ekizoglou; P.

    2008-01-01

    The increased popularity of Google search engine in the daily routine in one's workplace and in the academic information seeking process is undeniable. "Googling" challenges the traditional skills of librarians as information providers and the role of library and information service provision in the digital era. This paper reports on the…

  3. Seeing Google through the Eyes of Turkish Academicians

    ERIC Educational Resources Information Center

    Alakurt, Turgay; Bardakci, Salih

    2017-01-01

    With its new variety of IT products and services created in the last decade for students, teachers and schools, Google has changed the face of education. Google technologies that can be used completely free of charge via a single account in any device offer innovative alternatives to meet the needs of education. These technologies also help…

  4. Change and Anomaly Detection in Real-Time GPS Data

    NASA Astrophysics Data System (ADS)

    Granat, R.; Pierce, M.; Gao, X.; Bock, Y.

    2008-12-01

    The California Real-Time Network (CRTN) is currently generating real-time GPS position data at a rate of 1-2 Hz at over 80 locations. The CRTN data presents the possibility of studying dynamical solid earth processes in a way that complements existing seismic networks. To realize this possibility we have developed a prototype system for detecting changes and anomalies in the real-time data. Through this system, we can correlate changes in multiple stations in order to detect signals with geographical extent. Our approach involves developing a statistical model for each GPS station in the network, and then using those models to segment the time series into a number of discrete states described by the model. We use a hidden Markov model (HMM) to describe the behavior of each station; fitting the model to the data requires neither labeled training examples nor a priori information about the system. As such, HMMs are well suited to this problem domain, in which the data remains largely uncharacterized. There are two main components to our approach. The first is the model fitting algorithm, regularized deterministic annealing expectation-maximization (RDAEM), which provides robust, high-quality results. The second is a web service infrastructure that connects the data to the statistical modeling analysis and allows us to easily present the results of that analysis through a web portal interface. This web service approach facilitates the automatic updating of station models to keep pace with dynamical changes in the data. Our web portal interface is critical to the process of interpreting the data. A Google Maps interface allows users to visually interpret state changes not only on individual stations but across the entire network. Users can drill down from the map interface to inspect detailed results for individual stations, download the time series data, and inspect fitted models. Alternatively, users can use the web portal to look at the evolution of changes on the network by moving backwards and forwards in time.
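
    The segmentation idea, fitting a per-station model and decoding discrete states, can be sketched with the hmmlearn package; note that this uses ordinary EM (Baum-Welch) rather than the RDAEM algorithm described above, and synthetic data rather than CRTN positions:

        # Segment a (synthetic) GPS-like position time series into discrete
        # states with a Gaussian HMM. Uses standard EM, not RDAEM.
        import numpy as np
        from hmmlearn.hmm import GaussianHMM

        rng = np.random.default_rng(0)
        quiet = rng.normal(0.0, 1.0, size=(500, 2))    # background noise
        offset = rng.normal(5.0, 1.0, size=(200, 2))   # a step change (e.g., slip event)
        X = np.vstack([quiet, offset, quiet])

        model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=100)
        model.fit(X)
        states = model.predict(X)                      # one discrete state per epoch
        changes = np.flatnonzero(np.diff(states)) + 1
        print("detected state changes at epochs:", changes)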

  5. VegScape: U.S. Crop Condition Monitoring Service

    NASA Astrophysics Data System (ADS)

    mueller, R.; Yang, Z.; Di, L.

    2013-12-01

    Since 1995, the US Department of Agriculture (USDA)/National Agricultural Statistics Service (NASS) has provided qualitative biweekly vegetation condition indices to USDA policymakers and the public on a weekly basis during the growing season. Vegetation indices have proven useful for assessing crop condition and identifying the areal extent of floods, drought, major weather anomalies, and vulnerabilities of early/late season crops. With growing emphasis on more extreme weather events and food security issues rising to the forefront of national interest, a new vegetation condition monitoring system was developed. The new vegetation condition portal, named VegScape, was initiated at the start of the 2013 growing season. VegScape delivers interactive vegetation indices based on web mapping services. Users can use an interactive map to explore, query and disseminate current crop conditions. Vegetation indices like the Normalized Difference Vegetation Index (NDVI), the Vegetation Condition Index (VCI), and mean, median, and ratio comparisons to prior years can be constructed for analytical purposes and on-demand crop statistics. The NASA MODIS satellite, with 250 meter (15 acres) resolution and thirteen years of data history, provides improved spatial and temporal resolutions and delivers detailed, timely (i.e., daily) crop-specific condition information and dynamics. VegScape thus provides supplemental information to support NASS' weekly crop reports. VegScape delivers an agricultural cultivated crop mask and the most recent Cropland Data Layer (CDL) product to exploit the agricultural domain and visualize prior years' planted crops. Additionally, the data can be directly exported to Google Earth for web mashups or delivered via web mapping services for use in other applications. VegScape supports the ethos of data democracy by providing free and open access to digital geospatial data layers using open geospatial standards, thereby supporting transparent and collaborative government initiatives. NASS developed VegScape in cooperation with the Center for Spatial Information Science and Systems, George Mason University, Fairfax, VA. (Figure: VegScape ratio-to-median NDVI.)
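
    For reference, NDVI is a simple per-pixel band ratio, and the portal's ratio-to-prior-years products are derived statistics of it; a minimal numpy sketch with synthetic band values:

        # NDVI = (NIR - Red) / (NIR + Red), plus a ratio-to-median-year
        # comparison of the kind VegScape exposes. Bands here are synthetic.
        import numpy as np

        nir = np.random.default_rng(1).uniform(0.2, 0.6, size=(100, 100))
        red = np.random.default_rng(2).uniform(0.05, 0.3, size=(100, 100))
        ndvi = (nir - red) / (nir + red)

        # Compare the current NDVI to the pixelwise median of prior years:
        prior_years = np.stack([ndvi * f for f in (0.9, 1.0, 1.1)])  # stand-in history
        ratio_to_median = ndvi / np.median(prior_years, axis=0)
        print("mean NDVI:", ndvi.mean(), "mean ratio to median:", ratio_to_median.mean())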

  6. Use of Web 2.0 tools by hospital pharmacists.

    PubMed

    Bonaga Serrano, B; Aldaz Francés, R; Garrigues Sebastiá, M R; Hernández San Salvador, M

    2014-04-01

    Web 2.0 tools are transforming the ways health professionals communicate among themselves and with their patients, a situation that demands a change of mindset to implement them. The aim of our study was to assess the state of knowledge of the main Web 2.0 applications and how they are used in a sample of hospital pharmacists. The study was carried out through an anonymous survey of all members of the Spanish Society of Hospital Pharmacy (SEFH) by means of a questionnaire sent via the Google Drive® application. After the 3-month study period was completed, collected data were compiled and then analyzed using SPSS v15.0. The response rate was 7.3%; 70.5% of respondents were female and 76.3% were specialists. The majority of respondents (54.2%) were aged 20 to 35. PubMed was the main way of accessing published articles. 65.2% of pharmacists knew the term "Web 2.0". 45.3% of pharmacists were Twitter users, and over 58.9% used it mainly for professional purposes. Most pharmacists believed that Twitter was a good tool to interact with professionals and patients. 78.7% did not use an aggregator, but when one was used, Google Reader was the most common. Although Web 2.0 applications are gaining mainstream popularity, some health professionals may resist using them. In fact, more than half of the surveyed pharmacists reported a lack of knowledge about Web 2.0 tools. It would be positive for pharmacists to use them properly during their professional practice to get the best out of them.

  7. Basic GA Tools to Evaluate Your Web Area

    EPA Pesticide Factsheets

    Learn steps and tips for creating these Google Analytics (GA) reports, so you can learn which pages are popular or unpopular, which PDFs are getting looked at, who is using your pages, what search terms they used, and more.

  8. From the Director: Surfing the Web for Health Information

    MedlinePlus

    ... Reliable Results Most Internet users first visit a search engine — like Google or Yahoo! — when seeking health information. ... medical terms like "cancer" or "diabetes" into a search engine, the top-ten results will likely include authoritative ...

  9. Multigraph: Interactive Data Graphs on the Web

    NASA Astrophysics Data System (ADS)

    Phillips, M. B.

    2010-12-01

    Many aspects of geophysical science involve time-dependent data that is often presented in the form of a graph. Considering that the web has become a primary means of communication, there are surprisingly few good tools and techniques available for presenting time-series data on the web. The most common solution is to use a desktop tool such as Excel or Matlab to create a graph which is saved as an image and then included in a web page like any other image. This technique is straightforward, but it limits the user to one particular view of the data, and disconnects the graph from the data in a way that makes updating a graph with new data an often cumbersome manual process. This situation is somewhat analogous to the state of mapping before the advent of GIS. Maps existed only in printed form, and creating a map was a laborious process. In the last several years, however, the world of mapping has experienced a revolution in the form of web-based and other interactive computer technologies, so that it is now commonplace for anyone to easily browse through gigabytes of geographic data. Multigraph seeks to bring a similar ease of access to time series data. Multigraph is a program for displaying interactive time-series data graphs in web pages that includes a simple way of configuring the appearance of the graph and the data to be included. It allows multiple data sources to be combined into a single graph, and allows the user to explore the data interactively. Multigraph lets users explore and visualize "data space" in the same way that interactive mapping applications such as Google Maps facilitate exploring and visualizing geography. Viewing a Multigraph graph is extremely simple and intuitive, and requires no instructions. Creating a new graph for inclusion in a web page involves writing a simple XML configuration file and requires no programming. Multigraph can read data in a variety of formats, and can display data from a web service, allowing users to "surf" through large data sets, downloading only those parts of the data that are needed for display. Multigraph is currently in use on several web sites including the US Drought Portal (www.drought.gov), the NOAA Climate Services Portal (www.climate.gov), the Climate Reference Network (www.ncdc.noaa.gov/crn), NCDC's State of the Climate Report (www.ncdc.noaa.gov/sotc), and the US Forest Service's Forest Change Assessment Viewer (ews.forestthreats.org/NPDE/NPDE.html). More information about Multigraph is available from the web site www.multigraph.org. (Figure: Interactive Graph of Global Temperature Anomalies from ClimateWatch Magazine, http://www.climatewatch.noaa.gov/2009/articles/climate-change-global-temperature)

  10. Visualization of seismic tomography on Google Earth: Improvement of KML generator and its web application to accept the data file in European standard format

    NASA Astrophysics Data System (ADS)

    Yamagishi, Y.; Yanaka, H.; Tsuboi, S.

    2009-12-01

    We have developed a tool for converting seismic tomography data into KML, called the KML generator, and made it available on the web site (http://www.jamstec.go.jp/pacific21/google_earth). The KML generator enables us to display vertical and horizontal cross sections of the model on Google Earth in a three-dimensional manner, which is useful for understanding the Earth's interior. The previous generator accepts text files of grid-point data having longitude, latitude, and seismic velocity anomaly. Each data file contains the data for one depth. Metadata, such as bibliographic reference, grid-point interval, and depth, are described in a separate information file. We did not allow users to upload their own tomographic models to the web application, because there was no standard format for representing a tomographic model. Recently, the European seismology research project NERIES (Network of Research Infrastructures for European Seismology) has advocated that seismic tomography data should be standardized. It proposes a new format based on JSON (JavaScript Object Notation), one of the common data-interchange formats, as a standard for tomography. This format consists of two parts: metadata and grid-point data values. The JSON format is powerful for handling and analyzing tomographic models, because the structure of the format is fully defined by JavaScript objects, so the elements are directly accessible by a script. In addition, JSON libraries exist for several programming languages. The International Federation of Digital Seismograph Networks (FDSN) adopted this format as an FDSN standard for seismic tomographic models. This format may therefore be accepted not only by European seismologists but also as a world standard. We have thus improved our KML generator for seismic tomography to also accept data files in JSON format, and improved the web application so that JSON-formatted data files can be uploaded. Users can convert any tomographic model data to KML. The KML obtained through the new generator should provide an arena for comparing various tomographic models and other geophysical observations on Google Earth, which may act as a common platform for a geoscience browser.
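
    A minimal sketch of the JSON-to-KML conversion is shown below; the JSON field names are assumptions for illustration, not the published FDSN format:

        # Convert a (hypothetically structured) JSON tomography file into
        # KML placemarks, one per grid point.
        import json

        doc = json.loads("""
        {
          "metadata": {"model": "example-model", "depth_km": 100},
          "grid": [
            {"lon": 135.0, "lat": 35.0, "dvs": -1.2},
            {"lon": 136.0, "lat": 35.0, "dvs": 0.8}
          ]
        }
        """)

        placemarks = []
        for p in doc["grid"]:
            placemarks.append(
                f"<Placemark><name>dVs {p['dvs']}%</name>"
                f"<Point><coordinates>{p['lon']},{p['lat']},0</coordinates></Point>"
                f"</Placemark>"
            )

        kml = (
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
            f"<name>{doc['metadata']['model']} at {doc['metadata']['depth_km']} km</name>"
            + "".join(placemarks)
            + "</Document></kml>"
        )
        print(kml)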

  11. Assessing Ebola-related web search behaviour: insights and implications from an analytical study of Google Trends-based query volumes.

    PubMed

    Alicino, Cristiano; Bragazzi, Nicola Luigi; Faccio, Valeria; Amicizia, Daniela; Panatto, Donatella; Gasparini, Roberto; Icardi, Giancarlo; Orsi, Andrea

    2015-12-10

    The 2014 Ebola epidemic in West Africa has attracted public interest worldwide, leading to millions of Ebola-related Internet searches being performed during the period of the epidemic. This study aimed to evaluate and interpret Google search queries for terms related to the Ebola outbreak both at the global level and in all countries where primary cases of Ebola occurred. The study also endeavoured to look at the correlation between the number of overall and weekly web searches and the number of overall and weekly new cases of Ebola. Google Trends (GT) was used to explore Internet activity related to Ebola. The study period was from 29 December 2013 to 14 June 2015. Pearson's correlation was performed to correlate Ebola-related relative search volumes (RSVs) with the number of weekly and overall Ebola cases. Multivariate regression was performed using Ebola-related RSV as a dependent variable, and the overall number of Ebola cases and the Human Development Index were used as predictor variables. The greatest RSV was registered in the three West African countries mainly affected by the Ebola epidemic. The queries varied in the different countries. Both quantitative and qualitative differences between the affected African countries and other Western countries with primary cases were noted, in relation to the different flux volumes and different time courses. In the affected African countries, web query search volumes were mostly concentrated in the capital areas. However, in Western countries, web queries were uniformly distributed over the national territory. In terms of the three countries mainly affected by the Ebola epidemic, the correlation between the number of new weekly cases of Ebola and the weekly GT index varied from weak to moderate. The correlation between the number of Ebola cases registered in all countries during the study period and the GT index was very high. Google Trends showed a coarse-grained nature, strongly correlating with global epidemiological data, but was weaker at country level, as it was prone to distortions induced by unbalanced media coverage and the digital divide. Global and local health agencies could usefully exploit GT data to identify disease-related information needs and plan proper communication strategies, particularly in the case of health-threatening events.
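
    The core statistical step, correlating relative search volumes with case counts, is a one-liner with scipy; the series below are illustrative numbers, not the study data:

        # Pearson correlation between weekly Google Trends RSV and weekly new
        # Ebola cases; both series are made-up stand-ins.
        from scipy.stats import pearsonr

        rsv = [2, 5, 11, 40, 85, 100, 70, 45, 30, 18]         # relative search volume
        weekly_cases = [1, 4, 9, 35, 70, 90, 60, 40, 22, 12]  # new confirmed cases

        r, p = pearsonr(rsv, weekly_cases)
        print(f"r = {r:.2f}, p = {p:.4f}")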

  12. User-driven Cloud Implementation of environmental models and data for all

    NASA Astrophysics Data System (ADS)

    Gurney, R. J.; Percy, B. J.; Elkhatib, Y.; Blair, G. S.

    2014-12-01

    Environmental data and models come from disparate sources over a variety of geographical and temporal scales with different resolutions and data standards, often including terabytes of data and model simulations. Unfortunately, these data and models tend to remain solely within the custody of the private and public organisations which create the data, and the scientists who build models and generate results. Although many models and datasets are theoretically available to others, the lack of ease of access tends to keep them out of reach of many. We have developed an intuitive web-based tool that utilises environmental models and datasets located in a cloud to produce results that are appropriate to the user. Storyboards showing the interfaces and visualisations have been created for each of several exemplars. A library of virtual machine images has been prepared to serve these exemplars. Each virtual machine image has been tailored to run computer models appropriate to the end user. Two approaches have been used; first as RESTful web services conforming to the Open Geospatial Consortium (OGC) Web Processing Service (WPS) interface standard using the Python-based PyWPS; second, a MySQL database interrogated using PHP code. In all cases, the web client sends the server an HTTP GET request to execute the process with a number of parameter values and, once execution terminates, an XML or JSON response is sent back and parsed at the client side to extract the results. All web services are stateless, i.e. application state is not maintained by the server, reducing its operational overheads and simplifying infrastructure management tasks such as load balancing and failure recovery. A hybrid cloud solution has been used with models and data sited on both private and public clouds. The storyboards have been transformed into intuitive web interfaces at the client side using HTML, CSS and JavaScript, utilising plug-ins such as jQuery and Flot (for graphics), and Google Maps APIs. We have demonstrated that a cloud infrastructure can be used to assemble a virtual research environment that, coupled with a user-driven development approach, is able to cater to the needs of a wide range of user groups, from domain experts to concerned members of the general public.
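
    The request/response exchange described above, an HTTP GET carrying parameter values and a JSON body parsed at the client, can be sketched as follows (the endpoint and parameter names are placeholders, not the project's actual API):

        # Invoke a stateless model-execution web service with an HTTP GET and
        # parse the JSON response. URL and parameters are hypothetical.
        import requests

        resp = requests.get(
            "http://example.org/wps/run-model",        # placeholder endpoint
            params={"model": "hydrology", "start": "2014-01-01", "end": "2014-06-30"},
            timeout=60,
        )
        resp.raise_for_status()
        result = resp.json()   # stateless: all context travels in the request
        print(result)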

  13. Exploratory Visual Analytics of a Dynamically Built Network of Nodes in a WebGL-Enabled Browser

    DTIC Science & Technology

    2014-01-01

    dimensionality reduction, feature extraction, high-dimensional data, t-distributed stochastic neighbor embedding, neighbor retrieval visualizer, visual...WebGL-enabled rendering is supported natively by browsers such as the latest Mozilla Firefox , Google Chrome, and Microsoft Internet Explorer 11. At the...appropriate names. The resultant 26-node network is displayed in a Mozilla Firefox browser in figure 2 (also see appendix B). 3 Figure 1. The

  14. Design of Deformation Monitoring System for Volcano Mitigation

    NASA Astrophysics Data System (ADS)

    Islamy, M. R. F.; Salam, R. A.; Munir, M. M.; Irsyam, M.; Khairurrijal

    2016-08-01

    Indonesia has many active volcanoes that are potentially disastrous. Good mitigation systems are needed to prevent victims and reduce casualties from potential disasters caused by volcanic eruptions. Therefore, a system to monitor the deformation of volcanoes was built. This system employs telemetry combining Radio Frequency (RF) communication via XBee modules and General Packet Radio Service (GPRS) communication via the SIM900. There are two types of modules in this system: the coordinator, acting as a parent, and the nodes, acting as children. Each node is connected to the coordinator, forming a Wireless Sensor Network (WSN) with a star topology, and carries an inclinometer-based sensor, a Global Positioning System (GPS) receiver, and an XBee module. The coordinator collects data from each node, one at a time, to prevent data collisions between nodes, saves the data to an SD card, and transmits the data to a web server via GPRS. The inclinometer was calibrated with a self-built calibrator and tested in a high-temperature environment to check its durability. The GPS was tested by displaying its position in the web server via the Google Maps API (v3). The coordinator was shown to receive and transmit data from every node to the web server very well, and the system works well in a high-temperature environment.
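
    The collision-avoiding polling protocol can be sketched as a serial loop on the coordinator; this is a generic illustration with pyserial, where the port name and the one-byte request framing are assumptions, not the system's actual firmware protocol:

        # Coordinator-side polling loop: query each node in turn over the
        # radio link so only one node transmits at a time.
        import serial  # pyserial

        NODE_IDS = [1, 2, 3]

        with serial.Serial("/dev/ttyUSB0", 9600, timeout=2) as radio:
            for node_id in NODE_IDS:
                radio.write(bytes([node_id]))   # request data from this node only
                reply = radio.readline()        # e.g. b"1,tilt=0.42,lat=...,lon=...\n"
                if reply:
                    print("node", node_id, "->", reply.decode(errors="replace").strip())
                else:
                    print("node", node_id, "timed out")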

  15. GGRNA: an ultrafast, transcript-oriented search engine for genes and transcripts

    PubMed Central

    Naito, Yuki; Bono, Hidemasa

    2012-01-01

    GGRNA (http://GGRNA.dbcls.jp/) is a Google-like, ultrafast search engine for genes and transcripts. The web server accepts arbitrary words and phrases, such as gene names, IDs, gene descriptions, annotations of gene and even nucleotide/amino acid sequences through one simple search box, and quickly returns relevant RefSeq transcripts. A typical search takes just a few seconds, which dramatically enhances the usability of routine searching. In particular, GGRNA can search sequences as short as 10 nt or 4 amino acids, which cannot be handled easily by popular sequence analysis tools. Nucleotide sequences can be searched allowing up to three mismatches, or the query sequences may contain degenerate nucleotide codes (e.g. N, R, Y, S). Furthermore, Gene Ontology annotations, Enzyme Commission numbers and probe sequences of catalog microarrays are also incorporated into GGRNA, which may help users to conduct searches by various types of keywords. GGRNA web server will provide a simple and powerful interface for finding genes and transcripts for a wide range of users. All services at GGRNA are provided free of charge to all users. PMID:22641850

  16. GGRNA: an ultrafast, transcript-oriented search engine for genes and transcripts.

    PubMed

    Naito, Yuki; Bono, Hidemasa

    2012-07-01

    GGRNA (http://GGRNA.dbcls.jp/) is a Google-like, ultrafast search engine for genes and transcripts. The web server accepts arbitrary words and phrases, such as gene names, IDs, gene descriptions, annotations of gene and even nucleotide/amino acid sequences through one simple search box, and quickly returns relevant RefSeq transcripts. A typical search takes just a few seconds, which dramatically enhances the usability of routine searching. In particular, GGRNA can search sequences as short as 10 nt or 4 amino acids, which cannot be handled easily by popular sequence analysis tools. Nucleotide sequences can be searched allowing up to three mismatches, or the query sequences may contain degenerate nucleotide codes (e.g. N, R, Y, S). Furthermore, Gene Ontology annotations, Enzyme Commission numbers and probe sequences of catalog microarrays are also incorporated into GGRNA, which may help users to conduct searches by various types of keywords. GGRNA web server will provide a simple and powerful interface for finding genes and transcripts for a wide range of users. All services at GGRNA are provided free of charge to all users.
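
    GGRNA can also be queried programmatically; a minimal sketch follows, with the REST URL pattern treated as an assumption for illustration rather than documented fact:

        # Search GGRNA for human RefSeq transcripts matching a keyword.
        # The /api/hs/<query>.json URL pattern is an assumption.
        import requests

        query = "NM_001518"
        resp = requests.get(f"http://GGRNA.dbcls.jp/api/hs/{query}.json", timeout=30)
        resp.raise_for_status()
        print(resp.json())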

  17. Demonstrating the use of web analytics and an online survey to understand user groups of a national network of river level data

    NASA Astrophysics Data System (ADS)

    Macleod, Christopher Kit; Braga, Joao; Arts, Koen; Ioris, Antonio; Han, Xiwu; Sripada, Yaji; van der Wal, Rene

    2016-04-01

    The number of local, national and international networks of online environmental sensors are rapidly increasing. Where environmental data are made available online for public consumption, there is a need to advance our understanding of the relationships between the supply of and the different demands for such information. Understanding how individuals and groups of users are using online information resources may provide valuable insights into their activities and decision making. As part of the 'dot.rural wikiRivers' project we investigated the potential of web analytics and an online survey to generate insights into the use of a national network of river level data from across Scotland. These sources of online information were collected alongside phone interviews with volunteers sampled from the online survey, and interviews with providers of online river level data; as part of a larger project that set out to help improve the communication of Scotland's online river data. Our web analytics analysis was based on over 100 online sensors which are maintained by the Scottish Environmental Protection Agency (SEPA). Through use of Google Analytics data accessed via the R Ganalytics package we assessed: if the quality of data provided by Google Analytics free service is good enough for research purposes; if we could demonstrate what sensors were being used, when and where; how the nature and pattern of sensor data may affect web traffic; and whether we can identify and profile these users based on information from traffic sources. Web analytics data consists of a series of quantitative metrics which capture and summarize various dimensions of the traffic to a certain web page or set of pages. Examples of commonly used metrics include the number of total visits to a site and the number of total page views. Our analyses of the traffic sources from 2009 to 2011 identified several different major user groups. To improve our understanding of how the use of this national network of river level data may provide insights into the interactions between individuals and their usage of hydrological information, we ran an online survey linked to the SEPA river level pages for one year. We collected over 2000 complete responses to the survey. The survey included questions on user activities and the importance of river level information for their activities; alongside questions on what additional information they used in their decision making e.g. precipitation, and when and what river pages they visited. In this presentation we will present results from our analysis of the web analytics and online survey, and the insights they provide to understanding user groups of this national network of river level data.

  18. TouchTerrain: A simple web-tool for creating 3D-printable topographic models

    NASA Astrophysics Data System (ADS)

    Hasiuk, Franciszek J.; Harding, Chris; Renner, Alex Raymond; Winer, Eliot

    2017-12-01

    An open-source web-application, TouchTerrain, was developed to simplify the production of 3D-printable terrain models. Direct Digital Manufacturing (DDM) using 3D printers can change how geoscientists, students, and stakeholders interact with 3D data, with the potential to improve geoscience communication and environmental literacy. No other manufacturing technology can convert digital data into tangible objects quickly at relatively low cost; however, the expertise necessary to produce a 3D-printed terrain model can be a substantial burden: knowledge of geographical information systems, computer aided design (CAD) software, and 3D printers may all be required. Furthermore, printing models larger than the build volume of a 3D printer can pose further technical hurdles. The TouchTerrain web-application simplifies DDM for elevation data by generating digital 3D models customized for a specific 3D printer's capabilities. The only required user input is the selection of a region-of-interest using the provided web-application with a Google Maps-style interface. Publicly available digital elevation data is processed via the Google Earth Engine API. To allow the manufacture of 3D terrain models larger than a 3D printer's build volume, the selected area can be split into multiple tiles without third-party software. This application significantly reduces the time and effort required for a non-expert like an educator to obtain 3D terrain models for use in class. The web application is deployed at http://touchterrain.geol.iastate.edu/.
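
    The server-side flow, pulling elevation for a selected region from Earth Engine and tiling it, can be sketched with the Earth Engine Python API; a minimal sketch assuming an authenticated session (the bounding box is arbitrary):

        # Fetch a small elevation grid for a bounding box via the Google
        # Earth Engine Python API (requires prior ee.Authenticate()).
        import ee
        import numpy as np

        ee.Initialize()
        dem = ee.Image("USGS/SRTMGL1_003")                  # public 30 m SRTM DEM
        region = ee.Geometry.Rectangle([-93.7, 42.0, -93.5, 42.1])  # lon/lat box

        sample = dem.sampleRectangle(region=region)
        elev = np.array(sample.get("elevation").getInfo())  # 2D array of meters
        print(elev.shape, elev.min(), elev.max())

        # Splitting into printer-sized tiles is then just array slicing:
        tiles = [elev[:, : elev.shape[1] // 2], elev[:, elev.shape[1] // 2 :]]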

  19. News trends and web search query of HIV/AIDS in Hong Kong.

    PubMed

    Chiu, Alice P Y; Lin, Qianying; He, Daihai

    2017-01-01

    The HIV epidemic in Hong Kong has worsened in recent years, with major contributions from the high-risk subgroup of men who have sex with men (MSM). Internet use is prevalent among the majority of the local population, who seek health information online. This study examines the impact of HIV/AIDS and MSM news coverage on web search queries in Hong Kong. Relevant news coverage about HIV/AIDS and MSM from January 1st, 2004 to December 31st, 2014 was obtained from the WiseNews database. News trends were created by computing the number of relevant articles by type, topic, place of origin and sub-populations. We then obtained relevant search volumes from Google and analysed causality between news trends and Google Trends using the Granger causality test and orthogonal impulse response functions. We found that editorial news has an impact on "HIV" Google searches, with the search term popularity peaking at an average of two weeks after the news is published. Similarly, editorial news has an impact on the frequency of "AIDS" searches two weeks later. MSM-related news trends have a more fluctuating impact on "MSM" Google searches, with the time lag varying from one to ten weeks. This infodemiological study shows that there is a positive impact of news trends on the online search behavior around HIV/AIDS or MSM-related issues for up to ten weeks afterwards. Health promotion professionals could make use of this brief time window to tailor the timing of HIV awareness campaigns and public health interventions to maximise their reach and effectiveness.
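
    The Granger step can be sketched with statsmodels on synthetic weekly series; note that grangercausalitytests asks whether the second column helps predict the first:

        # Test whether a news-coverage series Granger-causes a search-volume
        # series. Data are synthetic stand-ins for the WiseNews/Google series.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(3)
        news = rng.poisson(5, size=120).astype(float)
        searches = np.roll(news, 2) * 3 + rng.normal(0, 1, size=120)  # lags news

        data = pd.DataFrame({"searches": searches, "news": news})
        # Column order matters: this asks whether 'news' Granger-causes 'searches'.
        results = grangercausalitytests(data[["searches", "news"]], maxlag=4)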

  20. eHealth Interventions for HIV Prevention in High-Risk Men Who Have Sex With Men: A Systematic Review

    PubMed Central

    Travers, Jasmine; Rojas, Marlene; Carballo-Diéguez, Alex

    2014-01-01

    Background While the human immunodeficiency virus (HIV) incidence rate has remained steady in most groups, the overall incidence of HIV among men who have sex with men (MSM) has been steadily increasing in the United States. eHealth is a platform for health behavior change interventions and provides new opportunities for the delivery of HIV prevention messages. Objective The purpose of this systematic review was to examine the use of eHealth interventions for HIV prevention in high-risk MSM. Methods We systematically searched PubMed, OVID, ISI Web of Knowledge, Google Scholar, and Google for articles and grey literature reporting the original results of any studies related to HIV prevention in MSM and developed a standard data collection form to extract information on study characteristics and outcome data. Results In total, 13 articles met the inclusion criteria, of which five articles targeted HIV testing behaviors and eight focused on decreasing HIV risk behaviors. Interventions included Web-based education modules, text messaging (SMS, short message service), chat rooms, and social networking. The methodological quality of articles ranged from 49.4-94.6%. Wide variation in the interventions meant synthesis of the results using meta-analysis would not be appropriate. Conclusions This review shows evidence that eHealth for HIV prevention in high-risk MSM has the potential to be effective in the short term for reducing HIV risk behaviors and increasing testing rates. Given that many of these studies were short term and had other limitations, but showed strong preliminary evidence of improving outcomes, additional work needs to rigorously assess the use of eHealth strategies for HIV prevention in high-risk MSM. PMID:24862459

  1. NOAA's Big Data Partnership at the National Centers for Environmental Information

    NASA Astrophysics Data System (ADS)

    Kearns, E. J.

    2015-12-01

    In April of 2015, the U.S. Department of Commerce announced NOAA's Big Data Partnership (BDP) with Amazon Web Services, Google Cloud Platform, IBM, Microsoft Corp., and the Open Cloud Consortium through Cooperative Research and Development Agreements. Recent progress on the activities with these Partners at the National Centers for Environmental Information (NCEI) will be presented. These activities include the transfer of over 350 TB of NOAA's archived data from NCEI's tape-based archive system to BDP cloud providers; new opportunities for data mining and investigation; application of NOAA's data maturity and stewardship concepts to the BDP; and integration of both archived and near-realtime data streams into a synchronized, distributed data system. Both lessons learned and future opportunities for the environmental data community will be presented.

  2. Access High Quality Imagery from the NOAA View Portal

    NASA Astrophysics Data System (ADS)

    Pisut, D.; Powell, A. M.; Loomis, T.; Goel, V.; Mills, B.; Cowan, D.

    2013-12-01

    NOAA curates a vast treasure trove of environmental data, but one that is sometimes not easily accessed, especially for education, outreach, and media purposes. Traditional data portals in NOAA require extensive knowledge of the specific names of observation platforms, models, and analyses, along with the nomenclature for variable outputs. A new website and web mapping service (WMS) from NOAA attempts to remedy such issues. The NOAA View data imagery portal provides a seamless entry point into data from across the agency: satellite, models, in-situ analysis, etc. The system provides the user with the ability to browse, animate, and download high-resolution (e.g., 4,000 x 2,000 pixel) imagery, Google Earth files, and even proxy data files. The WMS architecture also allows the resources to be ingested into other software systems or applications.
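
    Because NOAA View exposes a WMS, imagery can be retrieved with a standard GetMap request; in the sketch below the endpoint and layer name are placeholders, while the parameters are the generic WMS 1.3.0 set:

        # Standard OGC WMS GetMap request; endpoint and layer name are
        # placeholders, but the parameter set is generic WMS 1.3.0.
        import requests

        params = {
            "service": "WMS", "version": "1.3.0", "request": "GetMap",
            "layers": "sea_surface_temperature",     # placeholder layer
            "crs": "EPSG:4326", "bbox": "-90,-180,90,180",
            "width": 4000, "height": 2000, "format": "image/png",
        }
        resp = requests.get("https://example.noaa.gov/wms", params=params, timeout=60)
        with open("sst.png", "wb") as f:
            f.write(resp.content)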

  3. WebViz: A web browser based application for collaborative analysis of 3D data

    NASA Astrophysics Data System (ADS)

    Ruegg, C. S.

    2011-12-01

    In the age of high-speed Internet, where people can interact instantly, scientific tools have lacked technology that incorporates this concept of communication using the web. To address this issue, a web application for geological studies has been created, tentatively titled WebViz. This web application utilizes tools provided by the Google Web Toolkit to create an AJAX web application capable of features found in non-web-based software. Using these tools, a web application can act as a piece of software accessible from anywhere in the globe with a reasonably speedy Internet connection. An application of this technology can be seen with data regarding the recent tsunami from the major Japan earthquakes. After constructing the appropriate data to fit a computer rendering package called HVR, WebViz can request images of the tsunami data and display them to anyone who has access to the application. This convenience alone makes WebViz a viable solution, but the option to interact with this data with others around the world establishes WebViz as a serious computational tool. WebViz can also be used on any JavaScript-enabled browser, such as those found on modern tablets and smart phones, over a fast wireless connection. Because WebViz is built using the Google Web Toolkit, the application is highly portable. Though many developers have been involved with the project, each person has contributed to increasing the usability and speed of the application. In the project's most recent form, a dramatic speed increase has been achieved as well as a more efficient user interface. The speed increase has been informally noticed in recent uses of the application in China and Australia, with the hosting server located at the University of Minnesota. The user interface has been improved to not only look better but also function better. Major functions of the application include rotating the 3D object using buttons. These buttons have been replaced with a new layout that makes their function easier to understand and is also easy to use with mobile devices. With these changes, WebViz is easier to control and use in general.

  4. Quality of Web-based Information for the 10 Most Common Fractures.

    PubMed

    Memon, Muzammil; Ginsberg, Lydia; Simunovic, Nicole; Ristevski, Bill; Bhandari, Mohit; Kleinlugtenbelt, Ydo Vincent

    2016-06-17

    In today's technologically advanced world, 75% of patients have used Google to search for health information. As a result, health care professionals fear that patients may be misinformed. Currently, there is a paucity of data on the quality and readability of Web-based health information on fractures. In this study, we assessed the quality and readability of Web-based health information related to the 10 most common fractures. Using the Google search engine, we assessed websites from the first results page for the 10 most common fractures using lay search terms. Website quality was measured using the DISCERN instrument, which scores websites as very poor (15-22.5), poor (22.5-37.5), fair (37.5-52.5), good (52.5-67.5), or excellent (67.5-75). The presence of Health on the Net code (HONcode) certification was assessed for all websites. Website readability was measured using the Flesch Reading Ease Score (0-100), where 60-69 is ideal for the general public, and the Flesch-Kincaid Grade Level (FKGL; -3.4 to ∞), where the mean FKGL of the US adult population is 8. Overall, website quality was "fair" for all fractures, with a mean (standard deviation) DISCERN score of 50.3 (5.8). The DISCERN score correlated positively with a higher website position on the search results page (r(2)=0.1, P=.002) and with HONcode certification (P=.007). The mean (standard deviation) Flesch Reading Ease Score and FKGL for all fractures were 62.2 (9.1) and 6.7 (1.6), respectively. The quality of Web-based health information on fracture care is fair, and its readability is appropriate for the general public. To obtain higher quality information, patients should select HONcode-certified websites. Furthermore, patients should select websites that are positioned higher on the results page because the Google ranking algorithms appear to rank the websites by quality.

  5. Quality of Web-based Information for the 10 Most Common Fractures

    PubMed Central

    Ginsberg, Lydia; Simunovic, Nicole; Ristevski, Bill; Bhandari, Mohit; Kleinlugtenbelt, Ydo Vincent

    2016-01-01

    Background In today's technologically advanced world, 75% of patients have used Google to search for health information. As a result, health care professionals fear that patients may be misinformed. Currently, there is a paucity of data on the quality and readability of Web-based health information on fractures. Objectives In this study, we assessed the quality and readability of Web-based health information related to the 10 most common fractures. Methods Using the Google search engine, we assessed websites from the first results page for the 10 most common fractures using lay search terms. Website quality was measured using the DISCERN instrument, which scores websites as very poor (15-22.5), poor (22.5-37.5), fair (37.5-52.5), good (52.5-67.5), or excellent (67.5-75). The presence of Health on the Net code (HONcode) certification was assessed for all websites. Website readability was measured using the Flesch Reading Ease Score (0-100), where 60-69 is ideal for the general public, and the Flesch-Kincaid Grade Level (FKGL; −3.4 to ∞), where the mean FKGL of the US adult population is 8. Results Overall, website quality was “fair” for all fractures, with a mean (standard deviation) DISCERN score of 50.3 (5.8). The DISCERN score correlated positively with a higher website position on the search results page (r2=0.1, P=.002) and with HONcode certification (P=.007). The mean (standard deviation) Flesch Reading Ease Score and FKGL for all fractures were 62.2 (9.1) and 6.7 (1.6), respectively. Conclusion The quality of Web-based health information on fracture care is fair, and its readability is appropriate for the general public. To obtain higher quality information, patients should select HONcode-certified websites. Furthermore, patients should select websites that are positioned higher on the results page because the Google ranking algorithms appear to rank the websites by quality. PMID:27317159
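
    Both readability measures used in this study are closed-form functions of word, sentence, and syllable counts; a minimal sketch follows, with a crude vowel-group syllable counter that is adequate only for illustration:

        # Flesch Reading Ease and Flesch-Kincaid Grade Level from their
        # standard formulas; the syllable counter is a rough heuristic.
        import re

        def count_syllables(word: str) -> int:
            return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

        def readability(text: str) -> tuple[float, float]:
            sentences = max(1, len(re.findall(r"[.!?]+", text)))
            words = re.findall(r"[A-Za-z']+", text)
            syllables = sum(count_syllables(w) for w in words)
            wps = len(words) / sentences          # words per sentence
            spw = syllables / len(words)          # syllables per word
            fre = 206.835 - 1.015 * wps - 84.6 * spw
            fkgl = 0.39 * wps + 11.8 * spw - 15.59
            return fre, fkgl

        print(readability("The bone is broken. Rest the arm and see a doctor."))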

  6. High correlation of Middle East respiratory syndrome spread with Google search and Twitter trends in Korea.

    PubMed

    Shin, Soo-Yong; Seo, Dong-Woo; An, Jisun; Kwak, Haewoon; Kim, Sung-Han; Gwack, Jin; Jo, Min-Woo

    2016-09-06

    The Middle East respiratory syndrome coronavirus (MERS-CoV) was exported to Korea in 2015, resulting in a threat to neighboring nations. We evaluated the possibility of using a digital surveillance system based on web searches and social media data to monitor this MERS outbreak. We collected the number of daily laboratory-confirmed MERS cases and quarantined cases from May 11, 2015 to June 26, 2015 using the Korean government MERS portal. The daily trends observed via Google search and Twitter during the same time period were also ascertained using Google Trends and Topsy. Correlations among the data were then examined using Spearman correlation analysis. We found high correlations (>0.7) between Google search and Twitter results and the number of confirmed MERS cases for the previous three days using only four simple keywords: "MERS", "메르스" ("MERS" in Korean), "메르스 증상" ("MERS symptoms" in Korean), and "메르스 병원" ("MERS hospital" in Korean). Additionally, we found high correlations between the Google search and Twitter results and the number of quarantined cases using the above keywords. This study demonstrates the possibility of using a digital surveillance system to monitor the outbreak of MERS.

  7. 76 FR 34124 - Civil Supersonic Aircraft Panel Discussion

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-10

    ... and continuing to the second line in the second column, the Web site address should read as follows: https://spreadsheets.google.com/spreadsheet/viewform?formkey=dEFEdlRnYzBiaHZtTUozTHVtbkF4d0E6MQ . [FR...

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gwyn, Stephen D. J., E-mail: Stephen.Gwyn@nrc-cnrc.gc.ca

    This paper describes the image stacks and catalogs of the Canada-France-Hawaii Telescope Legacy Survey produced using the MegaPipe data pipeline at the Canadian Astronomy Data Centre. The Legacy Survey is divided into two parts. The Deep Survey consists of four fields, each of 1 deg², with magnitude limits (50% completeness for point sources) of u = 27.5, g = 27.9, r = 27.7, i = 27.4, and z = 26.2. It contains 1.6 × 10⁶ sources. The Wide Survey consists of 150 deg² split over four fields, with magnitude limits of u = 26.0, g = 26.5, r = 25.9, i = 25.7, and z = 24.6. It contains 3 × 10⁷ sources. This paper describes the calibration, image stacking, and catalog generation process. The images and catalogs are available on the web through several interfaces: normal image and text file catalog downloads, a 'Google Sky' interface, an image cutout service, and a catalog database query service.

  9. Understanding User Preferences and Awareness: Privacy Mechanisms in Location-Based Services

    NASA Astrophysics Data System (ADS)

    Burghardt, Thorben; Buchmann, Erik; Müller, Jens; Böhm, Klemens

    Location-based services (LBS) let people retrieve and share information related to their current position. Examples are Google Latitude and Panoramio. Since LBS share user-related content, location information etc., they put user privacy at risk. Literature has proposed various privacy mechanisms for LBS. However, it is unclear which mechanisms humans really find useful, and how they make use of them. We present a user study that addresses these issues. To obtain realistic results, we have implemented a geotagging application on the web and on GPS cellphones, and our study participants use this application in their daily lives. We test five privacy mechanisms that differ in the awareness, mental effort and degree of informedness required from the users. Among other findings, we have observed that in situations where a single simple mechanism does not meet all privacy needs, people want to use simple and sophisticated mechanisms in combination. Further, individuals are concerned about the privacy of others, even when they do not value privacy for themselves.

  10. Matsu: An Elastic Cloud Connected to a SensorWeb for Disaster Response

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel

    2011-01-01

    This slide presentation reviews the use of cloud computing combined with the SensorWeb in aiding disaster response planning. Included are an overview of the architecture of the SensorWeb and overviews of phase 1 of the EO-1 system and the steps taken to transform it into an on-demand product cloud as part of the Open Cloud Consortium (OCC). The effectiveness of this system was demonstrated with the SensorWeb response to the Namibia flood in 2010: by blending information from MODIS, TRMM, and river gauge data with the Google Earth view of Namibia, the system enabled river surge predictions and could support planning for future disaster responses.

  11. Disease Monitoring and Health Campaign Evaluation Using Google Search Activities for HIV and AIDS, Stroke, Colorectal Cancer, and Marijuana Use in Canada: A Retrospective Observational Study.

    PubMed

    Ling, Rebecca; Lee, Joon

    2016-10-12

    Infodemiology can offer practical and feasible health research applications through the practice of studying information available on the Web. Google Trends provides publicly accessible information regarding search behaviors in a population, which may be studied and used for health campaign evaluation and disease monitoring. Additional studies examining the use and effectiveness of Google Trends for these purposes remain warranted. The objective of our study was to explore the use of infodemiology in the context of health campaign evaluation and chronic disease monitoring. It was hypothesized that following a launch of a campaign, there would be an increase in information seeking behavior on the Web. Second, increasing and decreasing disease patterns in a population would be associated with search activity patterns. This study examined 4 different diseases: human immunodeficiency virus (HIV) infection, stroke, colorectal cancer, and marijuana use. Using Google Trends, relative search volume data were collected throughout the period of February 2004 to January 2015. Campaign information and disease statistics were obtained from governmental publications. Search activity trends were graphed and assessed with disease trends and the campaign interval. Pearson product correlation statistics and joinpoint methodology analyses were used to determine significance. Disease patterns and online activity across all 4 diseases were significantly correlated: HIV infection (r=.36, P<.001), stroke (r=.40, P<.001), colorectal cancer (r= -.41, P<.001), and substance use (r=.64, P<.001). Visual inspection and the joinpoint analysis showed significant correlations for the campaigns on colorectal cancer and marijuana use in stimulating search activity. No significant correlations were observed for the campaigns on stroke and HIV regarding search activity. The use of infoveillance shows promise as an alternative and inexpensive solution to disease surveillance and health campaign evaluation. Further research is needed to understand Google Trends as a valid and reliable tool for health research.
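
    The joinpoint methodology mentioned above locates points where a trend's slope changes. As a toy illustration of the idea (the study used the full joinpoint regression methodology; this one-breakpoint brute-force search is only a sketch on invented data):

    ```python
    # Hypothetical sketch: locate a single joinpoint by fitting two line
    # segments and choosing the breakpoint that minimizes total squared error.
    import numpy as np

    def best_joinpoint(t, y):
        best_k, best_sse = None, np.inf
        for k in range(2, len(t) - 2):  # leave at least 2 points per segment
            sse = 0.0
            for seg_t, seg_y in ((t[:k], y[:k]), (t[k:], y[k:])):
                coef = np.polyfit(seg_t, seg_y, 1)  # slope, intercept
                sse += ((np.polyval(coef, seg_t) - seg_y) ** 2).sum()
            if sse < best_sse:
                best_k, best_sse = k, sse
        return best_k, best_sse

    # Invented monthly search-volume series with a slope change near index 7.
    t = np.arange(12.0)
    y = np.array([10, 11, 12, 13, 14, 15, 16, 20, 24, 28, 32, 36], float)
    k, sse = best_joinpoint(t, y)
    print(f"estimated joinpoint at index {k} (SSE {sse:.2f})")
    ```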

  12. Keemei: cloud-based validation of tabular bioinformatics file formats in Google Sheets.

    PubMed

    Rideout, Jai Ram; Chase, John H; Bolyen, Evan; Ackermann, Gail; González, Antonio; Knight, Rob; Caporaso, J Gregory

    2016-06-13

    Bioinformatics software often requires human-generated tabular text files as input and has specific requirements for how those data are formatted. Users frequently manage these data in spreadsheet programs, which is convenient for researchers who are compiling the requisite information because the spreadsheet programs can easily be used on different platforms including laptops and tablets, and because they provide a familiar interface. It is increasingly common for many different researchers to be involved in compiling these data, including study coordinators, clinicians, lab technicians and bioinformaticians. As a result, many research groups are shifting toward using cloud-based spreadsheet programs, such as Google Sheets, which support the concurrent editing of a single spreadsheet by different users working on different platforms. Most of the researchers who enter data are not familiar with the formatting requirements of the bioinformatics programs that will be used, so validating and correcting file formats is often a bottleneck prior to beginning bioinformatics analysis. We present Keemei, a Google Sheets Add-on, for validating tabular files used in bioinformatics analyses. Keemei is available free of charge from Google's Chrome Web Store. Keemei can be installed and run on any web browser supported by Google Sheets. Keemei currently supports the validation of two widely used tabular bioinformatics formats, the Quantitative Insights into Microbial Ecology (QIIME) sample metadata mapping file format and the Spatially Referenced Genetic Data (SRGD) format, but is designed to easily support the addition of others. Keemei will save researchers time and frustration by providing a convenient interface for tabular bioinformatics file format validation. By allowing everyone involved with data entry for a project to easily validate their data, it will reduce the validation and formatting bottlenecks that are commonly encountered when human-generated data files are first used with a bioinformatics system. Simplifying the validation of essential tabular data files, such as sample metadata, will reduce common errors and thereby improve the quality and reliability of research outcomes.
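
    Keemei itself runs inside Google Sheets (it is an Apps Script add-on), but the kind of cell-level validation it performs can be sketched in Python. The rules below are a simplified, assumed subset of QIIME 1 mapping-file requirements, for illustration only.

    ```python
    # Hypothetical sketch of tabular mapping-file validation in the spirit of
    # Keemei; the rules are a simplified subset and not Keemei's actual code.
    import csv
    import re

    SAMPLE_ID_OK = re.compile(r"^[a-zA-Z0-9.]+$")  # alphanumerics and '.' only

    def validate_mapping(path):
        errors = []
        with open(path, newline="") as fh:
            rows = list(csv.reader(fh, delimiter="\t"))
        if not rows or not rows[0] or rows[0][0] != "#SampleID":
            return ["row 1: header must begin with '#SampleID'"]
        seen = set()
        for i, row in enumerate(rows[1:], start=2):
            sid = row[0] if row else ""
            if not SAMPLE_ID_OK.match(sid):
                errors.append(f"row {i}: invalid sample ID {sid!r}")
            if sid in seen:
                errors.append(f"row {i}: duplicate sample ID {sid!r}")
            seen.add(sid)
        return errors

    # Expects a local tab-separated file; report problems rather than
    # silently feeding bad data downstream.
    for problem in validate_mapping("mapping.tsv"):
        print(problem)
    ```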

  13. Disease Monitoring and Health Campaign Evaluation Using Google Search Activities for HIV and AIDS, Stroke, Colorectal Cancer, and Marijuana Use in Canada: A Retrospective Observational Study

    PubMed Central

    2016-01-01

    Background Infodemiology can offer practical and feasible health research applications through the practice of studying information available on the Web. Google Trends provides publicly accessible information regarding search behaviors in a population, which may be studied and used for health campaign evaluation and disease monitoring. Additional studies examining the use and effectiveness of Google Trends for these purposes remain warranted. Objective The objective of our study was to explore the use of infodemiology in the context of health campaign evaluation and chronic disease monitoring. It was hypothesized that following a launch of a campaign, there would be an increase in information seeking behavior on the Web. Second, increasing and decreasing disease patterns in a population would be associated with search activity patterns. This study examined 4 different diseases: human immunodeficiency virus (HIV) infection, stroke, colorectal cancer, and marijuana use. Methods Using Google Trends, relative search volume data were collected throughout the period of February 2004 to January 2015. Campaign information and disease statistics were obtained from governmental publications. Search activity trends were graphed and assessed with disease trends and the campaign interval. Pearson product correlation statistics and joinpoint methodology analyses were used to determine significance. Results Disease patterns and online activity across all 4 diseases were significantly correlated: HIV infection (r=.36, P<.001), stroke (r=.40, P<.001), colorectal cancer (r= −.41, P<.001), and substance use (r=.64, P<.001). Visual inspection and the joinpoint analysis showed significant correlations for the campaigns on colorectal cancer and marijuana use in stimulating search activity. No significant correlations were observed for the campaigns on stroke and HIV regarding search activity. Conclusions The use of infoveillance shows promise as an alternative and inexpensive solution to disease surveillance and health campaign evaluation. Further research is needed to understand Google Trends as a valid and reliable tool for health research. PMID:27733330

  14. Making Your Tools Useful to a Broader Audience

    NASA Astrophysics Data System (ADS)

    Lyness, M. D.; Broten, M. J.

    2006-12-01

    With the increasing growth of Web Services and SOAP, the ability to connect and reuse computational and visualization tools from all over the world via web interfaces that can be displayed in any current browser has provided the means to construct an ideal online research environment. The age-old question of usability is a major factor in determining whether a particular tool finds success in its community. An interface that can be understood purely by a user's intuition is desirable and more closely attainable than ever before. Through the use of increasingly sophisticated web-oriented technologies including JavaScript, AJAX, and the DOM, web interfaces are able to harness the advantages of the Internet along with the functional capabilities of native applications, such as menus, partial page changes, background processing, and visual effects, to name a few. Also, with computers becoming a normal part of the educational process, companies such as Google and Microsoft give us a synthetic intuition that serves as a foundation for new designs. Understanding the way earth science researchers know how to use computers will allow the VLab portal (http://vlab.msi.umn.edu) and other projects to create interfaces that will get used. To provide detailed communication with the users of VLab's computational tools, projects like the Porky Portlet (http://www.gorerle.com/vlab-wiki/index.php?title=Porky_Portlet) were spawned to empower users with a fully-detailed, interactive visual representation of progressing workflows. With the well-thought-out design of such tools and interfaces, researchers around the world will become accustomed to new, highly engaging, visual web-based research environments.

  15. Implementation of Simple and Functional Web Applications at the Alaska Volcano Observatory Remote Sensing Group

    NASA Astrophysics Data System (ADS)

    Skoog, R. A.

    2007-12-01

    Web pages are ubiquitous and accessible, but when compared to stand-alone applications they are limited in capability. The Alaska Volcano Observatory (AVO) Remote Sensing Group has implemented web pages and supporting server software that provide relatively advanced features to any user able to meet basic requirements. Anyone in the world with access to a modern web browser (such as Mozilla Firefox 1.5 or Internet Explorer 6) and a reasonable internet connection can fully use the tools, with no software installation or configuration. This allows faculty, staff, and students at AVO to perform many aspects of volcano monitoring from home or the road as easily as from the office. Additionally, AVO collaborators such as the National Weather Service and the Anchorage Volcanic Ash Advisory Center are able to use these web tools to quickly assess volcanic events. Capabilities of this web software include (1) the ability to obtain accurate measured remote sensing data values on a semi-quantitative compressed image of a large area, (2) the ability to view data from a wide time range of data swaths, (3) the ability to view many different satellite remote sensing spectral bands and combinations and to adjust color range thresholds, and (4) the ability to export to KML files viewable in virtual globes such as Google Earth. The technologies behind this implementation are primarily JavaScript, PHP, and MySQL, which are free to use and well documented, in addition to Terascan, a commercial software package used to extract data from level-0 data files. These technologies will be presented in conjunction with the techniques used to combine them into the final product used by AVO and its collaborators for operational volcanic monitoring.

  16. Multigraph: Reusable Interactive Data Graphs

    NASA Astrophysics Data System (ADS)

    Phillips, M. B.

    2010-12-01

    There are surprisingly few good software tools available for presenting time series data on the internet. The most common practice is to use a desktop program such as Excel or Matlab to save a graph as an image which can be included in a web page like any other image. This disconnects the graph from the data in a way that makes updating a graph with new data a cumbersome manual process, and it limits the user to one particular view of the data. The Multigraph project defines an XML format for describing interactive data graphs, and software tools for creating and rendering those graphs in web pages and other internet-connected applications. Viewing a Multigraph graph is extremely simple and intuitive, and requires no instructions; the user can pan and zoom by clicking and dragging, in a familiar "Google Maps" kind of way. Creating a new graph for inclusion in a web page involves writing a simple XML configuration file. Multigraph can read data in a variety of formats, and can display data from a web service, allowing users to "surf" through large data sets, downloading only those parts of the data that are needed for display. The Multigraph XML format, or "MUGL" for short, provides a concise description of the visual properties of a graph, such as axes, plot styles, data sources, and labels, as well as interactivity properties such as how and whether the user can pan or zoom along each axis. Multigraph reads a file in this format, draws the described graph, and allows the user to interact with it. Multigraph software currently includes a Flash application for embedding graphs in web pages, a Flex component for embedding graphs in larger Flex/Flash applications, and a plugin for creating graphs in the WordPress content management system. Plans for the future include a Java version for desktop viewing and editing, a command line version for batch and server-side rendering, and possibly Android and iPhone versions. Multigraph is currently in use on several web sites including the US Drought Portal (www.drought.gov), the NOAA Climate Services Portal (www.climate.gov), the Climate Reference Network (www.ncdc.noaa.gov/crn), NCDC's State of the Climate Report (www.ncdc.noaa.gov/sotc), and the US Forest Service's Forest Change Assessment Viewer (ews.forestthreats.org/NPDE/NPDE.html). More information about Multigraph is available from the web site www.multigraph.org. (Figure: Interactive Multigraph Display of Real Time Weather Data)

  17. 78 FR 41178 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... performance of the market. In May 2008, the internet portal Yahoo! began offering its Web site viewers real... products that must be obtained in tandem. For example, while Yahoo! and Google now both disseminate NASDAQ...

  18. 78 FR 19772 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-02

    ... performance of the market. In May 2008, the internet portal Yahoo! began offering its Web site viewers real... products that must be obtained in tandem. For example, while Yahoo! and Google now both disseminate NASDAQ...

  19. Accuracy of Geographically Targeted Internet Advertisements on Google Adwords for Recruitment in a Randomized Trial

    PubMed Central

    Goldsmith, Lesley; Williams, Christopher J; Kamel Boulos, Maged N

    2012-01-01

    Background Google AdWords are increasingly used to recruit people into research studies and clinical services. They offer the potential to recruit from targeted control areas in cluster randomized controlled trials (RCTs), but little is known about the feasibility of accurately targeting ads by location and comparing with control areas. Objective To examine the accuracy and contamination of control areas by a location-targeted online intervention using Google AdWords in a pilot cluster RCT. Methods Based on previous use of online cognitive behavioral therapy for depression and population size, we purposively selected 16 of the 121 British postcode areas and randomized them to three intervention and one (do-nothing) control arms. Two intervention arms included use of location-targeted AdWords, and we compared these with the do-nothing control arm. We did not raise the visibility of our research website to normal Web searches. Users who clicked on the ad were directed to our project website, which collected the computer Internet protocol (IP) address, date, and time. Visitors were asked for their postcode area and to complete the Patient Health Questionnaire (depression). They were then offered links to several online depression resources. Google Analytics largely uses IP methods to estimate location, but AdWords uses additional information. We compared locations assessed by (1) Analytics, and (2) as self-identified by users. Results Ads were shown 300,523 times with 4207 click-throughs. There were few site visits except through AdWord click-throughs. Both methods of location assessment agreed there was little contamination of control areas. According to Analytics, 69.75% (2617/3752) of participants were in intervention areas, only 0% (8/3752) in control areas, but 30.04% (1127/3752) in other areas. However, according to user-stated postcodes, only 20.7% (463/2237) were in intervention areas, 1% (22/2236) in control areas, but 78.31% (1751/2236) in other areas. Both location assessments suggested most leakage from the intervention arms was to nearby postcode areas. Analytics data differed from postcodes reported by participants. Analysis of a subset of 200/2236 records over 10 days comparing IP-estimated location with stated postcode suggested that Google AdWords targeted correctly in just half the cases. Analytics agreed with our assessment that, overall, one-third were wrongly targeted by AdWords. There appeared little evidence that people who bothered to give their postcode did not answer truthfully. Conclusions Although there is likely to be substantial leakage from the targeted areas, if intervention and control areas are a sufficient distance apart, it is feasible to conduct a cluster RCT using online ads to target British postcode areas without significant contamination. Trial Registration Clinicaltrials.gov NCT01469689; http://clinicaltrials.gov/ct2/show/NCT01469689 (Archived by WebCite at http://www.webcitation.org/681iro5OU) PMID:22718043

  20. Accuracy of geographically targeted internet advertisements on Google AdWords for recruitment in a randomized trial.

    PubMed

    Jones, Ray B; Goldsmith, Lesley; Williams, Christopher J; Kamel Boulos, Maged N

    2012-06-20

    Google AdWords are increasingly used to recruit people into research studies and clinical services. They offer the potential to recruit from targeted control areas in cluster randomized controlled trials (RCTs), but little is known about the feasibility of accurately targeting ads by location and comparing with control areas. To examine the accuracy and contamination of control areas by a location-targeted online intervention using Google AdWords in a pilot cluster RCT. Based on previous use of online cognitive behavioral therapy for depression and population size, we purposively selected 16 of the 121 British postcode areas and randomized them to three intervention and one (do-nothing) control arms. Two intervention arms included use of location-targeted AdWords, and we compared these with the do-nothing control arm. We did not raise the visibility of our research website to normal Web searches. Users who clicked on the ad were directed to our project website, which collected the computer Internet protocol (IP) address, date, and time. Visitors were asked for their postcode area and to complete the Patient Health Questionnaire (depression). They were then offered links to several online depression resources. Google Analytics largely uses IP methods to estimate location, but AdWords uses additional information. We compared locations assessed by (1) Analytics, and (2) as self-identified by users. Ads were shown 300,523 times with 4207 click-throughs. There were few site visits except through AdWord click-throughs. Both methods of location assessment agreed there was little contamination of control areas. According to Analytics, 69.75% (2617/3752) of participants were in intervention areas, only 0% (8/3752) in control areas, but 30.04% (1127/3752) in other areas. However, according to user-stated postcodes, only 20.7% (463/2237) were in intervention areas, 1% (22/2236) in control areas, but 78.31% (1751/2236) in other areas. Both location assessments suggested most leakage from the intervention arms was to nearby postcode areas. Analytics data differed from postcodes reported by participants. Analysis of a subset of 200/2236 records over 10 days comparing IP-estimated location with stated postcode suggested that Google AdWords targeted correctly in just half the cases. Analytics agreed with our assessment that, overall, one-third were wrongly targeted by AdWords. There appeared little evidence that people who bothered to give their postcode did not answer truthfully. Although there is likely to be substantial leakage from the targeted areas, if intervention and control areas are a sufficient distance apart, it is feasible to conduct a cluster RCT using online ads to target British postcode areas without significant contamination. Clinicaltrials.gov NCT01469689; http://clinicaltrials.gov/ct2/show/NCT01469689 (Archived by WebCite at http://www.webcitation.org/681iro5OU).
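
    The core comparison in this trial reduces to an agreement rate between two location assessments over paired records, plus classification of each area into a trial arm. A sketch with invented postcode areas (the actual intervention/control assignments are not reproduced here):

    ```python
    # Hypothetical sketch: compare IP-estimated vs user-stated postcode areas
    # and classify each into a trial arm; all area codes below are invented.
    INTERVENTION = {"PL", "EX"}
    CONTROL = {"TR"}

    def arm(area):
        return ("intervention" if area in INTERVENTION
                else "control" if area in CONTROL else "other")

    pairs = [("PL", "PL"), ("EX", "PL"), ("TQ", "TQ"), ("PL", "TR"), ("EX", "EX")]
    matches = sum(ip == stated for ip, stated in pairs)
    print(f"location agreement: {matches}/{len(pairs)} = {matches/len(pairs):.0%}")
    for ip, stated in pairs:
        print(f"IP says {arm(ip):12s} user says {arm(stated)}")
    ```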

  1. An intelligent and secure system for predicting and preventing Zika virus outbreak using Fog computing

    NASA Astrophysics Data System (ADS)

    Sareen, Sanjay; Gupta, Sunil Kumar; Sood, Sandeep K.

    2017-10-01

    Zika virus is a mosquito-borne disease that spreads very quickly in different parts of the world. In this article, we propose a system to prevent and control the spread of Zika virus disease using an integration of Fog computing, cloud computing, mobile phones and Internet of things (IoT)-based sensor devices. Fog computing is used as an intermediary layer between the cloud and end users to reduce the latency time and extra communication cost that are usually high in cloud-based systems. A fuzzy k-nearest neighbour classifier is used to diagnose possibly infected users, and the Google Maps web service is used to provide geographic positioning system (GPS)-based risk assessment to prevent the outbreak. It represents each Zika virus (ZikaV)-infected user, mosquito-dense site and breeding site on the Google map, which helps government healthcare authorities to control such risk-prone areas effectively and efficiently. The proposed system is deployed on the Amazon EC2 cloud to evaluate its performance and accuracy using a data set for 2 million users. Our system provides a high accuracy of 94.5% for the initial diagnosis of different users according to their symptoms and appropriate GPS-based risk assessment.
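
    The diagnosis step above uses a fuzzy k-nearest-neighbour classifier, which assigns class-membership degrees rather than a hard label. A generic sketch (the paper's actual features, data and parameters are not given here; the symptom vectors below are invented):

    ```python
    # Hypothetical sketch of fuzzy k-NN classification of possibly infected
    # users from symptom vectors; features and data invented for illustration.
    import numpy as np

    def fuzzy_knn(train_X, train_y, x, k=3, m=2.0):
        """Return class-membership degrees for sample x (Keller-style fuzzy k-NN)."""
        d = np.linalg.norm(train_X - x, axis=1)          # distances to all training points
        nn = np.argsort(d)[:k]                           # k nearest neighbours
        w = 1.0 / np.maximum(d[nn], 1e-9) ** (2.0 / (m - 1.0))  # inverse-distance weights
        return {c: w[train_y[nn] == c].sum() / w.sum() for c in np.unique(train_y)}

    # Toy symptom vectors: [fever, rash, joint_pain, conjunctivitis], 0/1 scale.
    X = np.array([[1, 1, 1, 1], [1, 1, 0, 1], [0, 0, 1, 0], [0, 0, 0, 0]], float)
    y = np.array(["possible_zika", "possible_zika", "uninfected", "uninfected"])
    print(fuzzy_knn(X, y, np.array([1, 1, 1, 0], float)))
    ```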

  2. A Web-Based Earth-Systems Knowledge Portal and Collaboration Platform

    NASA Astrophysics Data System (ADS)

    D'Agnese, F. A.; Turner, A. K.

    2010-12-01

    In support of complex water-resource sustainability projects in the Great Basin region of the United States, Earth Knowledge, Inc. has developed several web-based data management and analysis platforms that have been used by its scientists, clients, and public to facilitate information exchanges, collaborations, and decision making. These platforms support accurate water-resource decision-making by combining second-generation internet (Web 2.0) technologies with traditional 2D GIS and web-based 2D and 3D mapping systems such as Google Maps, and Google Earth. Most data management and analysis systems use traditional software systems to address the data needs and usage behavior of the scientific community. In contrast, these platforms employ more accessible open-source and “off-the-shelf” consumer-oriented, hosted web-services. They exploit familiar software tools using industry standard protocols, formats, and APIs to discover, process, fuse, and visualize earth, engineering, and social science datasets. Thus, they respond to the information needs and web-interface expectations of both subject-matter experts and the public. Because the platforms continue to gather and store all the contributions of their broad-spectrum of users, each new assessment leverages the data, information, and expertise derived from previous investigations. In the last year, Earth Knowledge completed a conceptual system design and feasibility study for a platform, which has a Knowledge Portal providing access to users wishing to retrieve information or knowledge developed by the science enterprise and a Collaboration Environment Module, a framework that links the user-access functions to a Technical Core supporting technical and scientific analyses including Data Management, Analysis and Modeling, and Decision Management, and to essential system administrative functions within an Administrative Module. The over-riding technical challenge is the design and development of a single technical platform that is accessed through a flexible series of knowledge portal and collaboration environment styles reflecting the information needs and user expectations of a diverse community of users. Recent investigations have defined the information needs and expectations of the major end-users and also have reviewed and assessed a wide variety of modern web-based technologies. Combining these efforts produced design specifications and recommendations for the selection and integration of web- and client-based tools. When fully developed, the resulting platform will: -Support new, advanced information systems and decision environments that take full advantage of multiple data sources and platforms; -Provide a distribution network tailored to the timely delivery of products to a broad range of users that are needed to support applications in disaster management, resource management, energy, and urban sustainability; -Establish new integrated multiple-user requirements and knowledge databases that support researchers and promote infusion of successful technologies into existing processes; and -Develop new decision support strategies and presentation methodologies for applied earth science applications to reduce risk, cost, and time.

  3. Creation of a Web-Based GIS Server and Custom Geoprocessing Tools for Enhanced Hydrologic Applications

    NASA Astrophysics Data System (ADS)

    Welton, B.; Chouinard, K.; Sultan, M.; Becker, D.; Milewski, A.; Becker, R.

    2010-12-01

    Rising populations in the arid and semi-arid parts of the world are increasing the demand for fresh water supplies worldwide. Many data sets needed for assessment of hydrologic applications across vast regions of the world are expensive, unpublished, difficult to obtain, or at varying scales, which complicates their use. Fortunately, this situation is changing with the development of global remote sensing datasets and web-based platforms such as GIS Server. GIS provides a cost-effective vehicle for comparing, analyzing, and querying a variety of spatial datasets as geographically referenced layers. We have recently constructed a web-based GIS that incorporates all relevant geological, geochemical, geophysical, and remote sensing data sets that were readily used to identify reservoir types and potential well locations on local and regional scales in various tectonic settings, including: (1) an extensional environment (Red Sea rift), (2) a transcurrent fault system (Najd Fault in the Arabian-Nubian Shield), and (3) compressional environments (Himalayas). The web-based GIS could also be used to detect spatial and temporal trends in precipitation, recharge, and runoff in large watersheds on local, regional, and continental scales. These applications were enabled through the construction of a web-based ArcGIS Server with a Google Maps interface and the development of customized geoprocessing tools. ArcGIS Server provides out-of-the-box setups that are generic in nature. This platform includes all of the standard web-based GIS tools (e.g. pan, zoom, identify, search, data querying, and measurement). In addition to the standard suite of tools provided by ArcGIS Server, an additional set of advanced data manipulation and display tools was also developed to allow for a more complete and customizable view of the area of interest. The most notable addition to the standard GIS Server tools is the set of custom on-demand geoprocessing tools (e.g., graph, statistical functions, custom raster creation, profile, TRMM). The generation of a wide range of derivative maps (e.g., buffer zone, contour map, graphs, temporal rainfall distribution maps) from various map layers (e.g., geologic maps, geophysics, satellite images) allows for more user flexibility. The use of these tools, along with the Google Maps API, which enables the website user to utilize high-quality GeoEye 2 images provided by Google in conjunction with our data, creates a more complete image of the area being observed and allows custom derivative maps to be created in the field and viewed immediately on the web, processes that were previously restricted to offline databases.

  4. Preparing Precipitation Data Access, Value-added Services and Scientific Exploration Tools for the Integrated Multi-satellitE Retrievals for GPM (IMERG)

    NASA Astrophysics Data System (ADS)

    Ostrenga, D.; Liu, Z.; Kempler, S. J.; Vollmer, B.; Teng, W. L.

    2013-12-01

    The Precipitation Data and Information Services Center (PDISC) (http://disc.gsfc.nasa.gov/precipitation or google: NASA PDISC), located at the NASA Goddard Space Flight Center (GSFC) Earth Sciences (GES) Data and Information Services Center (DISC), is home of the Tropical Rainfall Measuring Mission (TRMM) data archive. For over 15 years, the GES DISC has served not only TRMM, but also other space-based, airborne-based, field campaign and ground-based precipitation data products to the precipitation community and other disciplinary communities as well. The TRMM Multi-Satellite Precipitation Analysis (TMPA) products are the most popular products in the TRMM product family in terms of data download and access through Mirador, the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) and other services. The next generation of TMPA, the Integrated Multi-satellitE Retrievals for GPM (IMERG), to be released in 2014 after the launch of GPM, will be significantly improved in terms of spatial and temporal resolution. To better serve the user community, we are preparing data services, examples of which are listed below. To enable scientific exploration of Earth science data products without going through complicated and often time-consuming processes, such as data downloading and data processing, the GES DISC has developed Giovanni in consultation with members of the user community, who requested quick search, subset, analysis and display capabilities for their specific data of interest. For example, the TRMM Online Visualization and Analysis System (TOVAS, http://disc2.nascom.nasa.gov/Giovanni/tovas/) has proven extremely popular, especially as additional datasets have been added upon request. Giovanni will continue to evolve to accommodate GPM data and the multi-sensor data inter-comparisons that will be sure to follow. Additional PDISC tool and service capabilities being adapted for GPM data include: an on-line PDISC Portal (includes user guide, etc.); data ingest, processing, and distribution from an on-line archive; the Google-like Mirador data search and access engine; electronic distribution and subscriptions; semantic technology to help manage large amounts of multi-sensor data and their relationships; data drill-down and search capabilities; data access through various web services, i.e., OPeNDAP, GDS, WMS, WCS; conversion into various formats, e.g., netCDF, HDF, KML (for Google Earth), ascii; exploration, visualization and statistical online analysis through Giovanni; visualization and analysis of L2 data profiles and maps; generation of derived products, such as daily products; parameter and spatial subsetting; time and temporal aggregation; regridding; data version control and provenance; continuous archive verification (data stewardship); documentation; science support for proper data usage, help desk; monitoring services for applications; and expertise in data-related standards and interoperability. This presentation will further describe the data services at the PDISC that are currently being utilized by precipitation science and application researchers, and the preparation plan for IMERG. Comments and feedback are welcome.

  5. A cognitive evaluation of four online search engines for answering definitional questions posed by physicians.

    PubMed

    Yu, Hong; Kaufman, David

    2007-01-01

    The Internet is having a profound impact on physicians' medical decision making. One recent survey of 277 physicians showed that 72% of physicians regularly used the Internet to research medical information and 51% admitted that information from web sites influenced their clinical decisions. This paper describes the first cognitive evaluation of four state-of-the-art Internet search engines: Google (i.e., Google and Scholar.Google), MedQA, Onelook, and PubMed for answering definitional questions (i.e., questions with the format of "What is X?") posed by physicians. Onelook is a portal for online definitions, and MedQA is a question answering system that automatically generates short texts to answer specific biomedical questions. Our evaluation criteria include quality of answer, ease of use, time spent, and number of actions taken. Our results show that MedQA outperforms Onelook and PubMed in most of the criteria, and that MedQA surpasses Google in time spent and number of actions, two important efficiency criteria. Our results show that Google is the best system for quality of answer and ease of use. We conclude that Google is an effective search engine for medical definitions, and that MedQA exceeds the other search engines in that it provides users direct answers to their questions; while the users of the other search engines have to visit several sites before finding all of the pertinent information.

  6. Web Content Accessibility of Consumer Health Information Web Sites for People with Disabilities: A Cross Sectional Evaluation

    PubMed Central

    Parmanto, Bambang

    2004-01-01

    Background The World Wide Web (WWW) has become an increasingly essential resource for health information consumers. The ability to obtain accurate medical information online quickly, conveniently and privately provides health consumers with the opportunity to make informed decisions and participate actively in their personal care. Little is known, however, about whether the content of this online health information is equally accessible to people with disabilities who must rely on special devices or technologies to process online information due to their visual, hearing, mobility, or cognitive limitations. Objective To construct a framework for an automated Web accessibility evaluation; to evaluate the state of accessibility of consumer health information Web sites; and to investigate the possible relationships between accessibility and other features of the Web sites, including function, popularity and importance. Methods We carried out a cross-sectional study of the state of accessibility of health information Web sites to people with disabilities. We selected 108 consumer health information Web sites from the directory service of a Web search engine. A measurement framework was constructed to automatically measure the level of Web Accessibility Barriers (WAB) of Web sites following Web accessibility specifications. We investigated whether there was a difference between WAB scores across various functional categories of the Web sites, and also evaluated the correlation between the WAB and Alexa traffic rank and Google Page Rank of the Web sites. Results We found that none of the Web sites we looked at are completely accessible to people with disabilities, i.e., there were no sites that had no violation of Web accessibility rules. However, governmental and educational health information Web sites do exhibit better Web accessibility than the other categories of Web sites (P < 0.001). We also found that the correlation between the WAB score and the popularity of a Web site is statistically significant (r = 0.28, P < 0.05), although there is no correlation between the WAB score and the importance of the Web sites (r = 0.15, P = 0.111). Conclusions Evaluation of health information Web sites shows that no Web site scrupulously abides by Web accessibility specifications, even for entities mandated under relevant laws and regulations. Government and education Web sites show better performance than Web sites among other categories. Accessibility of a Web site may have a positive impact on its popularity in general. However, the Web accessibility of a Web site may not have a significant relationship with its importance on the Web. PMID:15249268
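
    The WAB measurement framework aggregates rule violations into a per-site barrier score. One plausible formulation, weighting each violation rate by the inverse of its accessibility priority and averaging over pages, is sketched below; the paper's exact weights and aggregation may differ, and the audit numbers are invented.

    ```python
    # Hypothetical sketch of a WAB-style accessibility score: per page, each
    # checkpoint contributes (violations / potential violations) weighted by
    # the inverse of its priority; the site score averages over pages.
    def wab_score(pages):
        """pages: list of dicts {checkpoint: (violations, potential, priority)}."""
        total = 0.0
        for page in pages:
            total += sum((v / p) * (1.0 / priority)
                         for v, p, priority in page.values() if p)
        return total / len(pages)

    site = [  # invented audit results for two pages
        {"img-alt": (3, 10, 1), "contrast": (2, 8, 2)},
        {"img-alt": (0, 5, 1), "form-label": (4, 4, 2)},
    ]
    print(f"WAB score: {wab_score(site):.2f}")  # higher = more barriers
    ```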

  7. Decision Support System for the Response to Infectious Disease Emergencies Based on WebGIS and Mobile Services in China

    PubMed Central

    Gao, Su-qing; Wang, Zhen; Gao, Hong-wei; Liu, Peng; Wang, Ze-rui; Li, Yan-li; Zhu, Xu-guang; Li, Xin-lou; Xu, Bo; Li, Yin-jun; Yang, Hong; de Vlas, Sake J.; Shi, Tao-xing; Cao, Wu-chun

    2013-01-01

    Background For years, emerging infectious diseases have appeared worldwide and threatened the health of people. The emergence and spread of an infectious-disease outbreak are usually unforeseen, and have the features of suddenness and uncertainty. Timely understanding of basic information in the field, and the collection and analysis of epidemiological information, is helpful in making rapid decisions and responding to an infectious-disease emergency. Therefore, it is necessary to have an unobstructed channel and convenient tool for the collection and analysis of epidemiologic information in the field. Methodology/Principal Findings Baseline information for each county in mainland China was collected and a database was established by geo-coding information on a digital map of county boundaries throughout the country. Google Maps was used to display geographic information and to conduct calculations related to maps, and the 3G wireless network was used to transmit information collected in the field to the server. This study established a decision support system for the response to infectious-disease emergencies based on WebGIS and mobile services (DSSRIDE). The DSSRIDE provides functions including data collection, communication and analyses in real time, epidemiological detection, the provision of customized epidemiological questionnaires and guides for handling infectious disease emergencies, and the querying of professional knowledge in the field. These functions of the DSSRIDE could be helpful for epidemiological investigations in the field and the handling of infectious-disease emergencies. Conclusions/Significance The DSSRIDE provides a geographic information platform based on the Google Maps application programming interface to display information of infectious disease emergencies, and transfers information between workers in the field and decision makers through wireless transmission based on personal computers, mobile phones and personal digital assistants. After a 2-year practice and application in infectious disease emergencies, the DSSRIDE is becoming a useful platform and is a useful tool for investigations in the field carried out by response sections and individuals. The system is suitable for use in developing countries and low-income districts. PMID:23372780

  8. The Top 50 Articles on Minimally Invasive Spine Surgery.

    PubMed

    Virk, Sohrab S; Yu, Elizabeth

    2017-04-01

    Bibliometric study of current literature. To catalog the most important minimally invasive spine (MIS) surgery articles using the number of citations as a marker of relevance. MIS surgery is a relatively new tool used by spinal surgeons. There is a dynamic and evolving field of research related to MIS techniques, clinical outcomes, and basic science research. To date, there is no comprehensive review of the most cited articles related to MIS surgery. A systematic search was performed over three widely used literature databases: Web of Science, Scopus, and Google Scholar. There were four searches performed using the terms "minimally invasive spine surgery," "endoscopic spine surgery," "percutaneous spinal surgery," and "lateral interbody surgery." The number of citations was averaged amongst the three databases to rank each article. The query of the three databases was performed in November 2015. Fifty articles were selected based upon the number of citations each averaged amongst the three databases. The most cited article was titled "Extreme Lateral Interbody Fusion (XLIF): a novel surgical technique for anterior lumbar interbody fusion" by Ozgur et al. and was credited with 447, 239, and 279 citations in Google Scholar, Web of Science, and Scopus, respectively. Citations ranged from 27 to 239 for Web of Science, 60 to 279 for Scopus, and 104 to 462 for Google Scholar. There was a large variety of articles written, spanning over 14 different topics, with the majority dealing with clinical outcomes related to MIS surgery. The majority of the most cited articles were level III and level IV studies. This is likely due to the relatively recent nature of technological advances in the field. Furthermore, level I and level II studies are required in MIS surgery in the years ahead. Level of evidence: 5.
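
    The ranking method reduces to averaging each article's citation counts across the three databases. A sketch using the Ozgur et al. figures quoted above plus invented entries:

    ```python
    # Hypothetical sketch: rank articles by mean citation count across
    # Google Scholar, Web of Science, and Scopus. Only the first row's
    # counts come from the abstract; the others are invented.
    records = {
        "Ozgur et al., XLIF": (447, 239, 279),
        "Article B": (120, 60, 75),
        "Article C": (300, 150, 180),
    }
    ranked = sorted(records.items(),
                    key=lambda kv: sum(kv[1]) / len(kv[1]), reverse=True)
    for title, counts in ranked:
        print(f"{sum(counts) / len(counts):7.1f}  {title}")
    ```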

  9. Using Google Applications as Part of Cloud Computing to Improve Knowledge and Teaching Skills of Faculty Members at the University of Bisha, Bisha, Saudi Arabia

    ERIC Educational Resources Information Center

    Alshihri, Bandar A.

    2017-01-01

    Cloud computing is a recent computing paradigm that has been integrated into the educational system. It provides numerous opportunities for delivering a variety of computing services in a way that has not been experienced before. The Google Company is among the top business companies that afford their cloud services by launching a number of…

  10. Analysis of the Capacity of Google Trends to Measure Interest in Conservation Topics and the Role of Online News

    PubMed Central

    Nghiem, Le T. P.; Papworth, Sarah K.; Lim, Felix K. S.; Carrasco, Luis R.

    2016-01-01

    With the continuous growth of internet usage, Google Trends has emerged as a source of information to investigate how social trends evolve over time. Knowing how the level of interest in conservation topics—approximated using Google search volume—varies over time can help support targeted conservation science communication. However, the evolution of search volume over time and the mechanisms that drive peaks in searches are poorly understood. We conducted time series analyses on Google search data from 2004 to 2013 to investigate: (i) whether interests in selected conservation topics have declined and (ii) the effect of news reporting and academic publishing on search volume. Although trends were sensitive to the term used as benchmark, we did not find that public interest towards conservation topics such as climate change, ecosystem services, deforestation, orangutan, invasive species and habitat loss was declining. We found, however, a robust downward trend for endangered species and an upward trend for ecosystem services. The quantity of news articles was related to patterns in Google search volume, whereas the number of research articles was not a good predictor but lagged behind Google search volume, indicating the role of news in the transfer of conservation science to the public. PMID:27028399

  11. Analysis of the Capacity of Google Trends to Measure Interest in Conservation Topics and the Role of Online News.

    PubMed

    Nghiem, Le T P; Papworth, Sarah K; Lim, Felix K S; Carrasco, Luis R

    2016-01-01

    With the continuous growth of internet usage, Google Trends has emerged as a source of information to investigate how social trends evolve over time. Knowing how the level of interest in conservation topics--approximated using Google search volume--varies over time can help support targeted conservation science communication. However, the evolution of search volume over time and the mechanisms that drive peaks in searches are poorly understood. We conducted time series analyses on Google search data from 2004 to 2013 to investigate: (i) whether interests in selected conservation topics have declined and (ii) the effect of news reporting and academic publishing on search volume. Although trends were sensitive to the term used as benchmark, we did not find that public interest towards conservation topics such as climate change, ecosystem services, deforestation, orangutan, invasive species and habitat loss was declining. We found, however, a robust downward trend for endangered species and an upward trend for ecosystem services. The quantity of news articles was related to patterns in Google search volume, whereas the number of research articles was not a good predictor but lagged behind Google search volume, indicating the role of news in the transfer of conservation science to the public.

  12. JournalMap: Geo-semantic searching for relevant knowledge

    USDA-ARS?s Scientific Manuscript database

    Ecologists struggling to understand rapidly changing environments and evolving ecosystem threats need quick access to relevant research and documentation of natural systems. The advent of semantic and aggregation searching (e.g., Google Scholar, Web of Science) has made it easier to find useful lite...

  13. Google Search Mastery Basics

    ERIC Educational Resources Information Center

    Hill, Paul; MacArthur, Stacey; Read, Nick

    2014-01-01

    Effective Internet search skills are essential with the continually increasing amount of information available on the Web. Extension personnel are required to find information to answer client questions and to conduct research on programs. Unfortunately, many lack the skills necessary to effectively navigate the Internet and locate needed…

  14. Using the Browser for Science: A Collaborative Toolkit for Astronomy

    NASA Astrophysics Data System (ADS)

    Connolly, A. J.; Smith, I.; Krughoff, K. S.; Gibson, R.

    2011-07-01

    Astronomical surveys have yielded hundreds of terabytes of catalogs and images that span many decades of the electromagnetic spectrum. Even when observatories provide user-friendly web interfaces, exploring these data resources remains a complex and daunting task. In contrast, gadgets and widgets have become popular in social networking (e.g. iGoogle, Facebook). They provide a simple way to make complex data easily accessible that can be customized based on the interest of the user. With ASCOT (an AStronomical COllaborative Toolkit) we expand on these concepts to provide a customizable and extensible gadget framework for use in science. Unlike iGoogle, where all of the gadgets are independent, the gadgets we develop communicate and share information, enabling users to visualize and interact with data through multiple, simultaneous views. With this approach, web-based applications for accessing and visualizing data can be generated easily and, by linking these tools together, integrated and powerful data analysis and discovery tools can be constructed.

  15. Service user and caregiver involvement in mental health system strengthening in low- and middle-income countries: systematic review.

    PubMed

    Semrau, Maya; Lempp, Heidi; Keynejad, Roxanne; Evans-Lacko, Sara; Mugisha, James; Raja, Shoba; Lamichhane, Jagannath; Alem, Atalay; Thornicroft, Graham; Hanlon, Charlotte

    2016-03-01

    The involvement of mental health service users and their caregivers in health system policy and planning, service monitoring and research can contribute to mental health system strengthening, but as yet there have been very few efforts to do so in low- and middle-income countries (LMICs). This systematic review examined the evidence and experience of service user and caregiver involvement in mental health system strengthening, as well as models of best practice for evaluation of capacity-building activities that facilitate their greater participation. Both the peer-reviewed and the grey literature were included in the review, which were identified through database searches (MEDLINE, Embase, PsycINFO, Web of Knowledge, Web of Science, Scopus, CINAHL, LILACS, SciELO, Google Scholar and Cochrane), as well as hand-searching of reference lists and the internet, and a snowballing process of contacting experts active in the area. This review included any kind of study design that described or evaluated service user, family or caregiver (though not community) involvement in LMICs (including service users with intellectual disabilities, dementia, or child and adolescent mental health problems) and that were relevant to mental health system strengthening across five categories. Data were extracted and summarised as a narrative review. Twenty papers matched the inclusion criteria. Overall, the review found that although there were examples of service user and caregiver involvement in mental health system strengthening in numerous countries, there was a lack of high-quality research and a weak evidence base for the work that was being conducted across countries. However, there was some emerging research on the development of policies and strategies, including advocacy work, and to a lesser extent the development of services, service monitoring and evaluation, with most service user involvement having taken place within advocacy and service delivery. Research was scarce within the other health system strengthening areas. Further research on service user and caregiver involvement in mental health system strengthening in LMICs is recommended, in particular research that includes more rigorous evaluation. A series of specific recommendations are provided based on the review.

  16. News trends and web search query of HIV/AIDS in Hong Kong

    PubMed Central

    Chiu, Alice P. Y.; Lin, Qianying

    2017-01-01

    Background The HIV epidemic in Hong Kong has worsened in recent years, with major contributions from the high-risk subgroup of men who have sex with men (MSM). Internet use is prevalent among the majority of the local population, who seek health information online. This study examines the impacts of HIV/AIDS and MSM news coverage on web search queries in Hong Kong. Methods Relevant news coverage about HIV/AIDS and MSM from January 1st, 2004 to December 31st, 2014 was obtained from the WiseNews database. News trends were created by computing the number of relevant articles by type, topic, place of origin and sub-populations. We then obtained relevant search volumes from Google and analysed causality between news trends and Google Trends using the Granger causality test and orthogonal impulse functions. Results We found that editorial news has an impact on "HIV" Google searches, with the search term's popularity peaking an average of two weeks after the news is published. Similarly, editorial news has an impact on the frequency of "AIDS" searches two weeks later. MSM-related news trends have a more fluctuating impact on "MSM" Google searches, with the time lag varying anywhere from one week to ten weeks. Conclusions This infodemiological study shows that news trends have a positive impact on the online search behavior around HIV/AIDS or MSM-related issues for up to ten weeks afterwards. Health promotion professionals could make use of this brief time window to tailor the timing of HIV awareness campaigns and public health interventions to maximise their reach and effectiveness. PMID:28922376
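
    A sketch of the Granger-causality step, using statsmodels on invented weekly series in place of the WiseNews counts and Google Trends data (the two-week lead built into the toy data is an assumption chosen to echo the reported result):

    ```python
    # Hypothetical sketch: does the news series help predict the search series?
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(0)
    news = rng.poisson(5, 120).astype(float)               # weekly article counts
    search = 3 * np.roll(news, 2) + rng.normal(0, 1, 120)  # searches ~2 weeks later

    # Column order matters: the test asks whether column 2 helps predict column 1.
    data = np.column_stack([search, news])
    for lag, (tests, _) in grangercausalitytests(data, maxlag=4).items():
        f_stat, p_value, _, _ = tests["ssr_ftest"]
        print(f"lag {lag}: F = {f_stat:.1f}, p = {p_value:.4f}")
    ```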

  17. FindZebra: a search engine for rare diseases.

    PubMed

    Dragusin, Radu; Petcu, Paula; Lioma, Christina; Larsen, Birger; Jørgensen, Henrik L; Cox, Ingemar J; Hansen, Lars Kai; Ingwersen, Peter; Winther, Ole

    2013-06-01

    The web has become a primary information resource about illnesses and treatments for both medical and non-medical users. Standard web search is by far the most common interface to this information. It is therefore of interest to find out how well web search engines work for diagnostic queries and what factors contribute to successes and failures. Among diseases, rare (or orphan) diseases represent an especially challenging and thus interesting class to diagnose, as each is rare, diverse in symptoms, and usually has scattered resources associated with it. We design an evaluation approach for web search engines for rare disease diagnosis which includes 56 real-life diagnostic cases, performance measures, information resources and guidelines for customising Google Search to this task. In addition, we introduce FindZebra, a specialized (vertical) rare disease search engine. FindZebra is powered by open source search technology and uses curated, freely available online medical information. FindZebra outperforms Google Search both in its default set-up and when customised to the resources used by FindZebra. We extend FindZebra with specialized functionalities exploiting medical ontological information and UMLS medical concepts to demonstrate different ways of displaying the retrieved results to medical experts. Our results indicate that a specialized search engine can improve diagnostic quality without compromising the ease of use of the currently widely popular standard web search. The proposed evaluation approach can be valuable for future development and benchmarking. The FindZebra search engine is available at http://www.findzebra.com/.

  18. The sources and popularity of online drug information: an analysis of top search engine results and web page views.

    PubMed

    Law, Michael R; Mintzes, Barbara; Morgan, Steven G

    2011-03-01

    The Internet has become a popular source of health information. However, there is little information on what drug information and which Web sites are being searched. To investigate the sources of online information about prescription drugs by assessing the most common Web sites returned in online drug searches and to assess the comparative popularity of Web pages for particular drugs. This was a cross-sectional study of search results for the most commonly dispensed drugs in the US (n=278 active ingredients) on 4 popular search engines: Bing, Google (both US and Canada), and Yahoo. We determined the number of times a Web site appeared as the first result. A linked retrospective analysis counted Wikipedia page hits for each of these drugs in 2008 and 2009. About three quarters of the first result on Google USA for both brand and generic names linked to the National Library of Medicine. In contrast, Wikipedia was the first result for approximately 80% of generic name searches on the other 3 sites. On these other sites, over two thirds of brand name searches led to industry-sponsored sites. The Wikipedia pages with the highest number of hits were mainly for opiates, benzodiazepines, antibiotics, and antidepressants. Wikipedia and the National Library of Medicine rank highly in online drug searches. Further, our results suggest that patients most often seek information on drugs with the potential for dependence, for stigmatized conditions, that have received media attention, and for episodic treatments. Quality improvement efforts should focus on these drugs.

  19. Profile-IQ: Web-based data query system for local health department infrastructure and activities.

    PubMed

    Shah, Gulzar H; Leep, Carolyn J; Alexander, Dayna

    2014-01-01

    To demonstrate the use of National Association of County & City Health Officials' Profile-IQ, a Web-based data query system, and how policy makers, researchers, the general public, and public health professionals can use the system to generate descriptive statistics on local health departments. This article is a descriptive account of an important health informatics tool based on information from the project charter for Profile-IQ and the authors' experience and knowledge in design and use of this query system. Profile-IQ is a Web-based data query system that is based on open-source software: MySQL 5.5, Google Web Toolkit 2.2.0, Apache Commons Math library, Google Chart API, and Tomcat 6.0 Web server deployed on an Amazon EC2 server. It supports dynamic queries of National Profile of Local Health Departments data on local health department finances, workforce, and activities. Profile-IQ's customizable queries provide a variety of statistics not available in published reports and support the growing information needs of users who do not wish to work directly with data files for lack of staff skills or time, or to avoid a data use agreement. Profile-IQ also meets the growing demand of public health practitioners and policy makers for data to support quality improvement, community health assessment, and other processes associated with voluntary public health accreditation. It represents a step forward in the recent health informatics movement of data liberation and use of open source information technology solutions to promote public health.
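
    The query system's core idea, user-customizable aggregate statistics over Profile data, can be sketched with a parameterized SQL query. The schema and column names below are invented (Profile-IQ itself runs on MySQL 5.5 behind a Google Web Toolkit front end), and sqlite3 stands in for the database:

    ```python
    # Hypothetical sketch of a Profile-IQ-style dynamic aggregate query;
    # the schema is invented and sqlite3 stands in for MySQL.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE lhd (state TEXT, pop_served INTEGER, ftes REAL)")
    conn.executemany("INSERT INTO lhd VALUES (?, ?, ?)",
                     [("GA", 50000, 12.5), ("GA", 250000, 80.0), ("OH", 75000, 20.0)])

    def summarize(group_by="state"):
        # Whitelist the grouping column: never interpolate raw user input into SQL.
        if group_by not in {"state"}:
            raise ValueError(f"unsupported grouping column: {group_by}")
        sql = f"SELECT {group_by}, COUNT(*), AVG(ftes) FROM lhd GROUP BY {group_by}"
        return conn.execute(sql).fetchall()

    print(summarize())  # e.g. [('GA', 2, 46.25), ('OH', 1, 20.0)]
    ```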

  20. The New USGS Volcano Hazards Program Web Site

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Graham, S. E.; Parker, T. J.; Snedigar, S. F.

    2008-12-01

    The U.S. Geological Survey's (USGS) Volcano Hazards Program (VHP) has launched a revised web site that uses a map-based interface to display hazards information for U.S. volcanoes. The web site is focused on better communication of hazards and background volcano information to our varied user groups by reorganizing content based on user needs and improving data display. The Home Page provides a synoptic view of the activity level of all volcanoes for which updates are written using a custom Google® Map. Updates are accessible by clicking on one of the map icons or clicking on the volcano of interest in the adjacent color-coded list of updates. The new navigation provides rapid access to volcanic activity information, background volcano information, images and publications, volcanic hazards, information about VHP, and the USGS volcano observatories. The Volcanic Activity section was tailored for emergency managers but provides information for all our user groups. It includes a Google® Map of the volcanoes we monitor, an Elevated Activity Page, a general status page, information about our Volcano Alert Levels and Aviation Color Codes, monitoring information, and links to monitoring data from VHP's volcano observatories: Alaska Volcano Observatory (AVO), Cascades Volcano Observatory (CVO), Long Valley Observatory (LVO), Hawaiian Volcano Observatory (HVO), and Yellowstone Volcano Observatory (YVO). The YVO web site was the first to move to the new navigation system and we are working on integrating the Long Valley Observatory web site next. We are excited to continue to implement new geospatial technologies to better display our hazards and supporting volcano information.

  1. Collaborative writing: Tools and tips.

    PubMed

    Eapen, Bell Raj

    2007-01-01

    The majority of technical writing is done by groups of experts, and various web-based applications have made this collaboration easy. Email exchange of word processor documents with tracked changes used to be the standard technique for collaborative writing. However, web-based tools like Google Docs and Spreadsheets have made the process fast and efficient. Various versioning tools and synchronous editors are available for those who need additional functionality. Having a group leader who decides the scheduling, communication and conflict-resolution protocols is important for successful collaboration.

  2. Using Web and Social Media for Influenza Surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corley, Courtney D.; Cook, Diane; Mikler, Armin R.

    2010-01-04

    Analysis of Google influenza-like-illness (ILI) search queries has shown a strongly correlated pattern with Centers for Disease Control and Prevention (CDC) seasonal ILI reporting data. Web and social media provide another resource to detect increases in ILI. This paper evaluates trends in blog posts that discuss influenza. Our key finding is that from 5 October 2008 to 31 January 2009 a high correlation exists between the weekly frequency of posts containing influenza keywords and CDC influenza-like-illness surveillance data.
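
    A toy sketch of the reported analysis, correlating weekly counts of influenza-keyword posts with ILI surveillance values; the numbers below are invented for illustration:

        # Correlating weekly blog-post counts with CDC ILI values
        # (made-up data, not the paper's).
        from scipy.stats import pearsonr

        blog_posts_per_week = [120, 150, 210, 340, 520, 610, 580, 430]
        cdc_ili_percent     = [1.1, 1.3, 1.9, 2.8, 4.0, 4.6, 4.4, 3.2]

        r, p = pearsonr(blog_posts_per_week, cdc_ili_percent)
        print(f"Pearson r = {r:.2f} (p = {p:.3g})")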

  3. Integrated Web-Based Access to and use of Satellite Remote Sensing Data for Improved Decision Making in Hydrologic Applications

    NASA Astrophysics Data System (ADS)

    Teng, W.; Chiu, L.; Kempler, S.; Liu, Z.; Nadeau, D.; Rui, H.

    2006-12-01

    Using NASA satellite remote sensing data from multiple sources for hydrologic applications can be a daunting task and requires a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. In order to facilitate such investigations, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure or "Giovanni," which supports a family of Web interfaces (instances) that allow users to perform interactive visualization and analysis online without downloading any data. Two such Giovanni instances are particularly relevant to hydrologic applications: the Tropical Rainfall Measuring Mission (TRMM) Online Visualization and Analysis System (TOVAS) and the Agricultural Online Visualization and Analysis System (AOVAS), both highly popular and widely used for a variety of applications, including those related to several NASA Applications of National Priority, such as Agricultural Efficiency, Disaster Management, Ecological Forecasting, Homeland Security, and Public Health. Dynamic, context-sensitive Web services provided by TOVAS and AOVAS enable users to seamlessly access NASA data from within, and deeply integrate the data into, their local client environments. One example is between TOVAS and Florida International University's TerraFly, a Web-enabled system that serves a broad segment of the research and applications community, by facilitating access to various textual, remotely sensed, and vector data. Another example is between AOVAS and the U.S. Department of Agriculture Foreign Agricultural Service (USDA FAS)'s Crop Explorer, the primary decision support tool used by FAS to monitor the production, supply, and demand of agricultural commodities worldwide. AOVAS is also part of GES DISC's Agricultural Information System (AIS), which can operationally provide satellite remote sensing data products (e.g., near-real-time rainfall) and analysis services to agricultural users. AIS enables the remote, interoperable access to distributed data, by using the GrADS Data Server (GDS) and the Open Geospatial Consortium (OGC)-compliant MapServer. The latter allows the access of AIS data from any OGC-compliant client, such as the Earth-Sun System Gateway (ESG) or Google Earth. The Giovanni system is evolving towards a Service-Oriented Architecture and is highly customizable (e.g., adding new products or services), thus availing the hydrologic applications user community of Giovanni's simple-to-use and powerful capabilities to improve decision-making.
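
    As an illustration of how such OGC-compliant services are typically consumed, the following sketch requests a map image with standard WMS GetMap parameters; the endpoint URL and layer name are placeholders, not the actual GES DISC service addresses:

        # Fetching a map image from an OGC WMS endpoint with standard
        # GetMap parameters; URL and layer are hypothetical.
        import requests

        params = {
            "service": "WMS", "version": "1.1.1", "request": "GetMap",
            "layers": "precipitation",              # hypothetical layer name
            "srs": "EPSG:4326", "bbox": "-180,-90,180,90",
            "width": 720, "height": 360, "format": "image/png",
        }
        resp = requests.get("https://example.org/wms", params=params, timeout=30)
        with open("precip.png", "wb") as f:
            f.write(resp.content)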

  4. Googling endometriosis: a systematic review of information available on the Internet.

    PubMed

    Hirsch, Martin; Aggarwal, Shivani; Barker, Claire; Davis, Colin J; Duffy, James M N

    2017-05-01

    The demand for health information online is increasing rapidly without clear governance. We aim to evaluate the credibility, quality, readability, and accuracy of online patient information concerning endometriosis. We searched 5 popular Internet search engines: aol.com, ask.com, bing.com, google.com, and yahoo.com. We developed a search strategy in consultation with patients with endometriosis, to identify relevant World Wide Web pages. Pages containing information related to endometriosis for women with endometriosis or the public were eligible. Two independent authors screened the search results. World Wide Web pages were evaluated using validated instruments across 3 of the 4 following domains: (1) credibility (White Paper instrument; range 0-10); (2) quality (DISCERN instrument; range 0-85); and (3) readability (Flesch-Kincaid instrument; range 0-100); and (4) accuracy (assessed by a prioritized criteria developed in consultation with health care professionals, researchers, and women with endometriosis based on the European Society of Human Reproduction and Embryology guidelines [range 0-30]). We summarized these data in diagrams, tables, and narratively. We identified 750 World Wide Web pages, of which 54 were included. Over a third of Web pages did not attribute authorship and almost half the included pages did not report the sources of information or academic references. No World Wide Web page provided information assessed as being written in plain English. A minority of web pages were assessed as high quality. A single World Wide Web page provided accurate information: evidentlycochrane.net. Available information was, in general, skewed toward the diagnosis of endometriosis. There were 16 credible World Wide Web pages, however the content limitations were infrequently discussed. No World Wide Web page scored highly across all 4 domains. In the unlikely event that a World Wide Web page reports high-quality, accurate, and credible health information it is typically challenging for a lay audience to comprehend. Health care professionals, and the wider community, should inform women with endometriosis of the risk of outdated, inaccurate, or even dangerous information online. The implementation of an information standard will incentivize providers of online information to establish and adhere to codes of conduct. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Scientific Datasets: Discovery and Aggregation for Semantic Interpretation.

    NASA Astrophysics Data System (ADS)

    Lopez, L. A.; Scott, S.; Khalsa, S. J. S.; Duerr, R.

    2015-12-01

    One of the biggest challenges that interdisciplinary researchers face is finding suitable datasets in order to advance their science; this problem remains consistent across multiple disciplines. A surprising number of scientists, when asked what tool they use for data discovery, reply "Google", which is an acceptable solution in some cases, but not even Google can find (or cares to compile) all the data that is relevant for science, and particularly the geosciences. If a dataset is not discoverable through a well-known search provider it will remain dark data to the scientific world. For the past year, BCube, an EarthCube Building Block project, has been developing, testing and deploying a technology stack capable of data discovery at web scale using the ultimate dataset: the Internet. This stack has 2 principal components: a web-scale crawling infrastructure and a semantic aggregator. The web crawler is a modified version of Apache Nutch (the originator of Hadoop and other big data technologies) that has been improved and tailored for data and data service discovery. The second component is semantic aggregation, carried out by a Python-based workflow that extracts valuable metadata and stores it in the form of triples through the use of semantic technologies. While implementing the BCube stack we have run into several challenges, such as (a) scaling the project to cover big portions of the Internet at a reasonable cost, (b) making sense of very diverse and non-homogeneous data, and lastly, (c) extracting facts about these datasets using semantic technologies in order to make them usable for the geosciences community. Despite all these challenges we have proven that we can discover and characterize data that otherwise would have remained in the dark corners of the Internet. Having all this data indexed and 'triplelized' will enable scientists to access a trove of information relevant to their work in a more natural way. An important characteristic of the BCube stack is that all the code we have developed is open sourced and available to anyone who wants to experiment and collaborate with the project at: http://github.com/b-cube/
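
    A small sketch of the semantic-aggregation step, storing extracted dataset metadata as RDF triples with rdflib; the vocabulary and values are illustrative, not BCube's actual schema:

        # Storing dataset metadata as RDF triples with rdflib
        # (illustrative vocabulary and values only).
        from rdflib import Graph, Literal, Namespace, URIRef
        from rdflib.namespace import DCAT, DCTERMS, RDF

        EX = Namespace("http://example.org/dataset/")
        g = Graph()
        ds = URIRef(EX["sea-ice-extent-v2"])        # hypothetical dataset id
        g.add((ds, RDF.type, DCAT.Dataset))
        g.add((ds, DCTERMS.title, Literal("Sea Ice Extent, Version 2")))
        g.add((ds, DCTERMS.spatial, Literal("Arctic Ocean")))
        print(g.serialize(format="turtle"))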

  6. A Participatory Agent-Based Simulation for Indoor Evacuation Supported by Google Glass.

    PubMed

    Sánchez, Jesús M; Carrera, Álvaro; Iglesias, Carlos Á; Serrano, Emilio

    2016-08-24

    Indoor evacuation systems are needed for rescue and safety management. One of the challenges is to provide users with personalized evacuation routes in real time. To this end, this project aims at exploring the possibilities of Google Glass technology for participatory multiagent indoor evacuation simulations. Participatory multiagent simulation combines scenario-guided agents and humans equipped with Google Glass that coexist in a shared virtual space and jointly perform simulations. The paper proposes an architecture for participatory multiagent simulation in order to combine devices (Google Glass and/or smartphones) with an agent-based social simulator and indoor tracking services.

  7. USGS Coastal and Marine Geology Survey Data in Google Earth

    NASA Astrophysics Data System (ADS)

    Reiss, C.; Steele, C.; Ma, A.; Chin, J.

    2006-12-01

    The U.S. Geological Survey (USGS) Coastal and Marine Geology (CMG) program has a rich data catalog of geologic field activities and metadata called InfoBank, which has been a standard tool for researchers within and outside of the agency. Along with traditional web maps, the data are now accessible in Google Earth, which greatly expands the possible user audience. The Google Earth interface provides geographic orientation and panning/zooming capabilities to locate data relative to topography, bathymetry, and coastal areas. Viewing navigation with Google Earth's background imagery allows queries such as why certain areas were not surveyed (answer: the presence of islands, shorelines, cliffs, etc.). Detailed box core subsample photos from selected sampling activities, published geotechnical data, and sample descriptions are now viewable on Google Earth (for example, M-1-95-MB, P-2-95-MB, and P-1-97-MB box core samples). One example of the use of Google Earth is CMG's surveys of San Francisco's Ocean Beach since 2004. The surveys are conducted with an all-terrain vehicle (ATV) and shallow-water personal watercraft (PWC) equipped with Global Positioning System (GPS) receivers and elevation and echo sounder data collectors. 3D topographic models with centimeter accuracy have been produced from these surveys to monitor beach and nearshore processes, including sand transport, sedimentation patterns, and seasonal trends. Using Google Earth, multiple track line data (examples: OB-1-05-CA and OB-2-05-CA) can be overlaid on beach imagery. The images also help explain the shape of track lines as objects are encountered.
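
    As a sketch of how such track lines can be packaged for Google Earth, the following uses the simplekml package with invented coordinates (not the actual OB-1-05-CA navigation):

        # Publishing a survey track line as KML with simplekml;
        # coordinates are invented for illustration.
        import simplekml

        kml = simplekml.Kml()
        line = kml.newlinestring(
            name="OB-1-05-CA (illustrative)",
            coords=[(-122.510, 37.760), (-122.508, 37.756), (-122.507, 37.751)],
        )  # (lon, lat) pairs
        line.style.linestyle.width = 3
        kml.save("track.kml")  # open in Google Earth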

  8. Could we do better? Behavioural tracking on recommended consumer health websites.

    PubMed

    Burkell, Jacquelyn; Fortier, Alexandre

    2015-09-01

    This study examines behavioural tracking practices on consumer health websites, contrasting tracking on sites recommended by information professionals with tracking on sites returned by Google. Two lists of consumer health websites were constructed: sites recommended by information professionals and sites returned by Google searches. Sites were divided into three groups according to source (Recommended-Only, Google-Only or both) and type (Government, Not-for-Profit or Commercial). Behavioural tracking practices on each website were documented using a protocol that detected cookies, Web beacons and Flash cookies. The presence and the number of trackers that collect personal information were contrasted across source and type of site; a second set of analyses specifically examined Advertising trackers. Recommended-Only sites show lower levels of tracking - especially tracking by advertisers - than do Google-Only sites or sites found through both sources. Government and Not-for-Profit sites have fewer trackers, particularly from advertisers, than do Commercial sites. Recommended sites, especially those from Government or Not-for-Profit organisations, present a lower privacy threat than sites returned by Google searches. Nonetheless, most recommended websites include some trackers, and half include at least one Advertising tracker. To protect patron privacy, information professionals should examine the tracking practices of the websites they recommend. © 2015 Health Libraries Group.

  9. An Introduction to Science Education in Rural Australia

    ERIC Educational Resources Information Center

    Lyons, Terry

    2008-01-01

    Here's a challenge. Try searching "Google" for the phrase "rural science teachers" in Australian web content. Surprisingly, my attempts returned only two hits, neither of which actually referred to Australian teachers. Searches for "rural science education" fare little better. On this evidence one could be forgiven…

  10. Rainfall erosivity in Brazil: A Review

    USDA-ARS's Scientific Manuscript database

    In this paper, we review the erosivity studies conducted in Brazil to verify the quality and representativeness of the results generated and to provide a greater understanding of the rainfall erosivity (R-factor) in Brazil. We searched the ISI Web of Science, Scopus, SciELO, and Google Scholar datab...

  11. A web-based platform to support an evidence-based mental health intervention: lessons from the CBITS web site.

    PubMed

    Vona, Pamela; Wilmoth, Pete; Jaycox, Lisa H; McMillen, Janey S; Kataoka, Sheryl H; Wong, Marleen; DeRosier, Melissa E; Langley, Audra K; Kaufman, Joshua; Tang, Lingqi; Stein, Bradley D

    2014-11-01

    To explore the role of Web-based platforms in behavioral health, the study examined usage of a Web site for supporting training and implementation of an evidence-based intervention. Using data from an online registration survey and Google Analytics, the investigators examined user characteristics and Web site utilization. Site engagement was substantial across user groups. Visit duration differed by registrants' characteristics. Less experienced clinicians spent more time on the Web site. The training section accounted for most page views across user groups. Individuals previously trained in the Cognitive-Behavioral Intervention for Trauma in Schools intervention viewed more implementation assistance and online community pages than did other user groups. Web-based platforms have the potential to support training and implementation of evidence-based interventions for clinicians of varying levels of experience and may facilitate more rapid dissemination. Web-based platforms may be promising for trauma-related interventions, because training and implementation support should be readily available after a traumatic event.

  12. Visualizing Mars data and imagery with Google Earth

    NASA Astrophysics Data System (ADS)

    Beyer, R. A.; Broxton, M.; Gorelick, N.; Hancher, M.; Lundy, M.; Kolb, E.; Moratto, Z.; Nefian, A.; Scharff, T.; Weiss-Malik, M.

    2009-12-01

    There is a vast store of planetary geospatial data that has been collected by NASA but is difficult to access and visualize. Virtual globes have revolutionized the way we visualize and understand the Earth, but other planetary bodies including Mars and the Moon can be visualized in similar ways. Extraterrestrial virtual globes are poised to revolutionize planetary science, bring an exciting new dimension to science education, and allow ordinary users to explore imagery being sent back to Earth by planetary science satellites. The original Google Mars Web site allowed users to view base maps of Mars via the Web, but it did not have the full features of the 3D Google Earth client. We have previously demonstrated the use of Google Earth to display Mars imagery, but now with the launch of Mars in Google Earth, there is a base set of Mars data available for anyone to work from and add to. There are a variety of global maps to choose from and display. The Terrain layer has the MOLA gridded data topography, and where available, HRSC terrain models are mosaicked into the topography. In some locations there is also meter-scale terrain derived from HiRISE stereo imagery. There is rich information in the form of the IAU nomenclature database, data for the rovers and landers on the surface, and a Spacecraft Imagery layer which contains the image outlines for all HiRISE, CTX, CRISM, HRSC, and MOC image data released to the PDS and links back to their science data. There are also features like the Traveler's Guide to Mars, Historic Maps, Guided Tours, as well as the 'Live from Mars' feature, which shows the orbital tracks of both the Mars Odyssey and Mars Reconnaissance Orbiter for a few days in the recent past. It shows where they have acquired imagery, and also some preview image data. These capabilities have obvious public outreach and education benefits, but the potential benefits of allowing planetary scientists to rapidly explore these large and varied data collections—in geological context and within a single user interface—are also becoming evident. Because anyone can produce additional KML content for use in Google Earth, scientists can customize the environment to their needs as well as publish their own processed data and results for others to use. Many scientists and organizations have begun to do this already, resulting in a useful and growing collection of planetary-science-oriented Google Earth layers.
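
    For example, a scientist wishing to share a processed map for others to view in Google Earth could generate a KML ground overlay; the sketch below uses the simplekml package with a placeholder image URL and bounds:

        # Sharing a processed map as a KML ground overlay with simplekml;
        # the image URL and bounding box are placeholders.
        import simplekml

        kml = simplekml.Kml()
        overlay = kml.newgroundoverlay(name="Processed mosaic (example)")
        overlay.icon.href = "http://example.org/mosaic.png"   # hypothetical image
        overlay.latlonbox.north, overlay.latlonbox.south = 10.0, -10.0
        overlay.latlonbox.east, overlay.latlonbox.west = 140.0, 120.0
        kml.save("mosaic_overlay.kml")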

  13. Data visualization in interactive maps and time series

    NASA Astrophysics Data System (ADS)

    Maigne, Vanessa; Evano, Pascal; Brockmann, Patrick; Peylin, Philippe; Ciais, Philippe

    2014-05-01

    State-of-the-art data visualization has nothing to do with the plots and maps we used a few years ago. Many open-source tools are now available to provide access to scientific data and implement accessible, interactive, and flexible web applications. Here we present a web site, opened in November 2013, to create custom global and regional maps and time series from research models and datasets. For maps, we explore and access data sources from a THREDDS Data Server (TDS) with the OGC WMS protocol (using the ncWMS implementation), then create interactive maps with the OpenLayers JavaScript library and extra information layers from a GeoServer. Maps become dynamic, zoomable, synchronously connected to each other, and exportable to Google Earth. For time series, we extract data from a TDS with the NetCDF Subset Service (NCSS), then display interactive graphs with a custom library based on the Data-Driven Documents JavaScript library (D3.js). This time series application provides dynamic functionalities such as interpolation, interactive zoom on different axes, display of point values, and export to different formats. These tools were implemented for the Global Carbon Atlas (http://www.globalcarbonatlas.org): a web portal to explore, visualize, and interpret global and regional carbon fluxes from various model simulations arising from both human activities and natural processes, a work led by the Global Carbon Project.
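
    A sketch of the NCSS extraction step, pulling a point time series as CSV with the standard NCSS query parameters; the server URL and variable name are placeholders, not the Global Carbon Atlas' actual endpoint:

        # Requesting a point time series from a THREDDS NetCDF Subset
        # Service (NCSS) as CSV; URL and variable are hypothetical.
        import requests

        params = {
            "var": "co2_flux",                      # hypothetical variable
            "latitude": 48.85, "longitude": 2.35,
            "time_start": "2000-01-01T00:00:00Z",
            "time_end": "2010-12-31T00:00:00Z",
            "accept": "csv",
        }
        url = "https://example.org/thredds/ncss/grid/fluxes.nc"
        csv_text = requests.get(url, params=params, timeout=60).text
        print(csv_text.splitlines()[:3])  # header plus first rows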

  14. Searching for Real-World Effectiveness of Health Care Innovations: Scoping Study of Social Prescribing for Diabetes

    PubMed Central

    Loef, Martin; Polley, Marie

    2017-01-01

    Background Social prescribing is a process whereby primary care patients are linked or referred to nonmedical sources of support in the community and voluntary sector. It is a concept that has arisen in practice and implemented widely in the United Kingdom and has been evaluated by various organizations. Objective The aim of our study was to characterize, collate, and analyze the evidence from evaluation of social prescribing for type 2 diabetes in the United Kingdom and Ireland, comparing information available on publicly available websites with the published literature. Methods We used a broad, pragmatic definition of social prescribing and conducted Web-based searches for websites of organizations providing potentially relevant services. We also explored linked information. In parallel, we searched Medline, PubMed, Cochrane Library, Google Scholar, and reference lists for relevant studies published in peer-reviewed journals. We extracted the data systematically on the characteristics, any reported evaluation, outcomes measured and results, and terminology used to describe each service. Results We identified 40 UK- or Ireland-based projects that referred people with type 2 diabetes and prediabetes to nonmedical interventions or services provided in the community. We located evaluations of 24 projects; 11 as published papers, 12 as Web-based reports, and 1 as both a paper and a Web-based report. The interventions and services identified included structured group educational programs, exercise referral schemes, and individualized advice and support with signposting of health-related activities in the community. Although specific interventions such as community-based group educational programs and exercise referral have been evaluated in randomized controlled trials, evaluation of individualized social prescribing services involving people with type 2 diabetes has, in most cases, used pre-post and mixed methods approaches. These evaluations report generic improvement in a broad range of outcomes and provide an insight into the criteria for the success of social prescribing services. Conclusions Our study revealed the varied models of social prescribing and nonmedical, community-based services available to people with type 2 diabetes and the extent of evaluation of these, which would not have been achieved by searching databases alone. The findings of this scoping study do not prove that social prescribing is an effective measure for people with type 2 diabetes in the United Kingdom, but can be used to inform future evaluation and contribute to the development of the evidence base for social prescribing. Accessing Web-based information provides a potential method for investigating how specific innovative health concepts, such as social prescribing, have been translated, implemented, and evaluated in practice. Several challenges were encountered including defining the concept, focusing on process plus intervention, and searching diverse, evolving Web-based sources. Further exploration of this approach will inform future research on the application of innovative health care concepts into practice. PMID:28153817

  15. Using open-source programs to create a web-based portal for hydrologic information

    NASA Astrophysics Data System (ADS)

    Kim, H.

    2013-12-01

    Some hydrologic data sets, such as basin climatology, precipitation, and terrestrial water storage, are not easily obtainable and distributable due to their size and complexity. We present a Hydrologic Information Portal (HIP) that has been implemented at the University of California Center for Hydrologic Modeling (UCCHM) and organized around the large river basins of North America. This portal can be easily accessed through a modern web browser that enables easy access and visualization of such hydrologic data sets. Some of the main features of our HIP include a set of data visualization features so that users can search, retrieve, analyze, integrate, organize, and map data within large river basins. Recent information technologies such as Google Maps, Tornado (a Python asynchronous web server), NumPy/SciPy (scientific libraries for Python) and d3.js (a visualization library for JavaScript) were incorporated into the HIP to ease navigation of large data sets. With such open source libraries, HIP can give public users a way to combine and explore various data sets by generating multiple chart types (line, bar, pie, scatter plot) directly from the Google Maps viewport. Every rendered object, such as a basin shape, on the viewport is clickable, and this is the first step to accessing the visualization of data sets.
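
    As an illustration of the server side, a minimal Tornado handler returning basin statistics as JSON might look as follows; the route and values are invented, not UCCHM's actual code:

        # Minimal Tornado endpoint serving basin statistics as JSON
        # (hard-coded placeholder data).
        import tornado.ioloop
        import tornado.web

        class BasinStatsHandler(tornado.web.RequestHandler):
            def get(self, basin):
                # The real portal would subset precipitation/storage data here.
                self.write({"basin": basin, "mean_precip_mm": 612.4})

        app = tornado.web.Application([(r"/basin/([A-Za-z]+)", BasinStatsHandler)])

        if __name__ == "__main__":
            app.listen(8888)
            tornado.ioloop.IOLoop.current().start()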

  16. An Interactive Web System for Field Data Sharing and Collaboration

    NASA Astrophysics Data System (ADS)

    Weng, Y.; Sun, F.; Grigsby, J. D.

    2010-12-01

    A Web 2.0 system is designed and developed to facilitate data collection for the field studies in the Geological Sciences department at Ball State University. The system provides a student-centered learning platform that enables the users to first upload their collected data in various formats, interact and collaborate dynamically online, and ultimately create a shared digital repository of field experiences. The data types considered for the system and their corresponding formats and requirements are summarized in a table (not reproduced here). The system has six main functionalities, as follows. (1) Only registered users can access the system, with a confidential identification and password. (2) Each user can upload/revise/delete data in various formats, such as image, audio, video, and text files. (3) Interested users are allowed to co-edit the contents and join the collaboration whiteboard for further discussion. (4) The system integrates with Google, Yahoo, or Flickr to search for similar photos with the same tags. (5) Users can search the web system using specific keywords. (6) Photos with recorded GPS readings can be mashed up and mapped onto Google Maps/Earth for visualization. Application of the system to geology field trips at Ball State University will be demonstrated to assess the usability of the system.

  17. A web search on environmental topics: what is the role of ranking?

    PubMed

    Covolo, Loredana; Filisetti, Barbara; Mascaretti, Silvia; Limina, Rosa Maria; Gelatti, Umberto

    2013-12-01

    Although the Internet is easy to use, the mechanisms and logic behind a Web search are often unknown. Reliable information can be obtained, but it may not be visible if the Web site is not located in the first positions of search results. The possible risks of adverse health effects arising from environmental hazards are issues of increasing public interest, and therefore the information about these risks, particularly on topics for which there is no scientific evidence, is crucial. The aim of this study was to investigate whether the presentation of information on some environmental health topics differed among various search engines, assuming that the most reliable information should come from institutional Web sites. Five search engines were used: Google, Yahoo!, Bing, Ask, and AOL. The following topics were searched in combination with the word "health": "nuclear energy," "electromagnetic waves," "air pollution," "waste," and "radon." For each topic three key words were used. The first 30 search results for each query were considered. The ranking variability among the search engines and the type of search results were analyzed for each topic and for each key word. The ranking of institutional Web sites was given particular consideration. Variable results were obtained when surfing the Internet on different environmental health topics. Multivariate logistic regression analysis showed that institutional Web sites were more likely to appear in the first 10 positions when searching for radon and air pollution than for nuclear power (odds ratio=3.4, 95% confidence interval 2.1-5.4 and odds ratio=2.9, 95% confidence interval 1.8-4.7, respectively), and when using Google compared with Bing (odds ratio=3.1, 95% confidence interval 1.9-5.1). The increasing use of online information could play an important role in forming opinions. Web users should become more aware of the importance of finding reliable information, and health institutions should be able to make that information more visible.
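
    A sketch of how odds ratios and confidence intervals like those reported are obtained from a multivariate logistic regression, here with synthetic data rather than the study's:

        # Odds ratios from a logistic regression (synthetic data).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        is_radon_topic = rng.integers(0, 2, 300)   # 1 = radon/air pollution query
        is_google      = rng.integers(0, 2, 300)   # 1 = Google, 0 = Bing
        logit_p = -1.0 + 1.2 * is_radon_topic + 1.1 * is_google
        top10_institutional = (rng.random(300) < 1 / (1 + np.exp(-logit_p))).astype(float)

        X = sm.add_constant(np.column_stack([is_radon_topic, is_google]))
        fit = sm.Logit(top10_institutional, X).fit(disp=0)
        print(np.exp(fit.params))       # odds ratios
        print(np.exp(fit.conf_int()))   # 95% confidence intervals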

  18. Reusable Client-Side JavaScript Modules for Immersive Web-Based Real-Time Collaborative Neuroimage Visualization.

    PubMed

    Bernal-Rusiel, Jorge L; Rannou, Nicolas; Gollub, Randy L; Pieper, Steve; Murphy, Shawn; Robertson, Richard; Grant, Patricia E; Pienaar, Rudolph

    2017-01-01

    In this paper we present a web-based software solution to the problem of implementing real-time collaborative neuroimage visualization. In both clinical and research settings, simple and powerful access to imaging technologies across multiple devices is becoming increasingly useful. Prior technical solutions have used a server-side rendering and push-to-client model wherein only the server has the full image dataset. We propose a rich client solution in which each client has all the data and uses the Google Drive Realtime API for state synchronization. We have developed a small set of reusable client-side object-oriented JavaScript modules that make use of the XTK toolkit, a popular open-source JavaScript library also developed by our team, for the in-browser rendering and visualization of brain image volumes. Efficient real-time communication among the remote instances is achieved by using just a small JSON object, comprising a representation of the XTK image renderers' state, as the Google Drive Realtime collaborative data model. The developed open-source JavaScript modules have already been instantiated in a web app called MedView, a distributed collaborative neuroimage visualization application that is delivered to users over the web without requiring the installation of any extra software or browser plugin. This responsive application allows multiple physically distant physicians or researchers to cooperate in real time to reach a diagnosis or scientific conclusion. It also serves as a proof of concept for the capabilities of the presented technological solution.
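
    The essence of the approach, a small JSON document representing renderer state that each client applies when a collaborator edits it, can be sketched as follows (in Python rather than the paper's JavaScript, with invented field names):

        # Idea sketch of JSON-based state synchronization: local renderer
        # state is updated whenever the realtime service reports a remote
        # change. Field names are invented for illustration.
        import json

        local_state = {"volume": "brain.nii", "camera": [0.0, 0.0, 5.0], "slice": 42}

        def rerender(state):
            print("re-rendering with", state)

        def on_remote_change(serialized):
            """Callback the realtime service would invoke on remote edits."""
            remote = json.loads(serialized)
            local_state.update(remote)          # adopt the collaborator's view
            rerender(local_state)

        # Simulate a collaborator moving to slice 57:
        on_remote_change(json.dumps({"slice": 57}))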

  19. InChI in the wild: an assessment of InChIKey searching in Google

    PubMed Central

    2013-01-01

    While chemical databases can be queried using the InChI string and InChIKey (IK), the latter was designed for open-web searching. It is becoming increasingly effective for this, since more sources enhance crawling of their websites by the Googlebot, with consequent IK indexing. Searchers who use Google as an adjunct to database access may be less familiar with the advantages of using the IK, as explored in this review. As an example, the IK for atorvastatin retrieves ~200 low-redundancy links from a Google search in 0.3 seconds. These include most major databases and a very low false-positive rate. Results encompass less familiar but potentially useful sources and can be extended to isomer capture by using just the skeleton layer of the IK. Google Advanced Search can be used to filter large result sets. Image searching with the IK is also effective and complementary to open-web queries. Results can be particularly useful for less-common structures as exemplified by a major metabolite of atorvastatin giving only three hits. Testing also demonstrated document-to-document and document-to-database joins via structure matching. The necessary generation of an IK from chemical names can be accomplished using open tools and resources for patents, papers, abstracts or other text sources. Active global sharing of local IK-linked information can be accomplished via surfacing in open laboratory notebooks, blogs, Twitter, figshare and other routes. While information-rich chemistry (e.g. approved drugs) can exhibit swamping and redundancy effects, the much smaller IK result sets for link-poor structures become a transformative first-pass option. The IK indexing has therefore turned Google into a de facto open global chemical information hub by merging links to most significant sources, including over 50 million PubChem and ChemSpider records. The simplicity, specificity and speed of matching make it a useful option for biologists or others less familiar with chemical searching. However, compared to rigorously maintained major databases, users need to be circumspect about the consistency of Google results and provenance of retrieved links. In addition, community engagement may be necessary to ameliorate possible future degradation of utility. PMID:23399051
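
    A sketch of generating an IK and its skeleton-layer variant with the open-source RDKit toolkit, using aspirin as the example structure:

        # Generating an InChIKey and its skeleton (first) block with RDKit.
        from rdkit import Chem

        mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")   # aspirin
        ik = Chem.MolToInchiKey(mol)
        print(ik)                        # BSYNRYMUTXBXSQ-UHFFFAOYSA-N
        print(ik.split("-")[0])          # skeleton block, for isomer capture
        print(f"https://www.google.com/search?q={ik}")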

  20. ARM Climate Research Facility: Outreach Tools and Strategies

    NASA Astrophysics Data System (ADS)

    Roeder, L.; Jundt, R.

    2009-12-01

    Sponsored by the Department of Energy, the ARM Climate Research Facility is a global scientific user facility for the study of climate change. To publicize progress and achievements and to reach new users, the ACRF uses a variety of Web 2.0 tools and strategies that build on the program’s comprehensive and well established News Center (www.arm.gov/news). These strategies include: an RSS subscription service for specific news categories; an email “newsletter” distribution to the user community that compiles the latest News Center updates into a short summary with links; and a Facebook page that pulls information from the News Center and links to relevant information in other online venues, including those of our collaborators. The ACRF also interacts with users through field campaign blogs, like Discovery Channel’s EarthLive, to share research experiences from the field. Increasingly, field campaign Wikis are established to help ACRF researchers collaborate during the planning and implementation phases of their field studies and include easy-to-use logs and image libraries to help record the campaigns. This vital reference information is used in developing outreach material that is shared in highlights, news, and Facebook. Other Web 2.0 tools that ACRF uses include Google Maps to help users visualize facility locations and aircraft flight patterns. Easy-to-use comment boxes are also available on many of the data-related web pages on www.arm.gov to encourage feedback. To provide additional opportunities for increased interaction with the public and user community, future Web 2.0 plans under consideration for ACRF include: evaluating field campaigns for Twitter and microblogging opportunities, adding public discussion forums to research highlight web pages, moving existing photos into albums on Flickr or Facebook, and building online video archives through YouTube.

  1. Development of Visualizations and Loggable Activities for the Geosciences. Results from Recent TUES Sponsored Projects

    NASA Astrophysics Data System (ADS)

    De Paor, D. G.; Bailey, J. E.; Whitmeyer, S. J.

    2012-12-01

    Our TUES research centers on the role of digital data, visualizations, animations, and simulations in undergraduate geoscience education. Digital hardware (smartphones, tablets, GPS units, GigaPan robotic camera mounts, etc.) is revolutionizing field data collection. Software products (GIS, 3-D scanning and modeling programs, virtual globes, etc.) have truly transformed the way geoscientists teach, learn, and do research. Whilst Google-Earth-style visualizations are famously user-friendly for the person browsing, they can be notoriously unfriendly for the content creator. Therefore, we developed tools to help educators create and share visualizations as easily as if posting on Facebook. Anyone who wishes to display geological cross sections on Google Earth can go to digitalplanet.org, upload image files, position them on a line of section, and share them with the world through our KMZ hosting service. Other tools facilitate screen overlay and 3-D map symbol generation. We advocate use of such technology to enable undergraduate students to 'publish' their first mapping efforts even while they are working in the field. A second outcome of our TUES projects merges Second-Life-style interaction with Google Earth. We created games in which students act as first responders for natural hazard mitigation, prospectors for natural resource exploration, and structural geologists for map-making. Students are represented by avatars and collaborate by exchange of text messages - the natural mode of communication for the current generation. Teachers view logs showing student movements as well as transcripts of text messages and can scaffold student learning and geofence students to prevent wandering. Early results of in-class testing show positive learning outcomes. The third aspect of our program emphasizes dissemination. Experience shows that great effort is required to overcome activation energy and ensure adoption of new technology into the curriculum. We organized a GSA Penrose Conference, a GSA Pardee Keynote Symposium, an AGU Town Hall Meeting, and numerous workshops at annual and regional meetings, and set up a web site dedicated to dissemination of program products. Future plans include development of augmented reality teaching resources, hosting of community mapping services, and creation of a truly 4-D virtual globe.

  2. Learning Geomorphology Using Aerial Photography in a Web-Facilitated Class

    ERIC Educational Resources Information Center

    Palmer, R. Evan

    2013-01-01

    General education students taking freshman-level physical geography and geomorphology classes at Arizona State University completed an online laboratory whose main tool was Google Earth. Early in the semester, oblique and planimetric views introduced students to a few volcanic, tectonic, glacial, karst, and coastal landforms. Semi-quantitative…

  3. The New Digital Awareness

    ERIC Educational Resources Information Center

    Bohle, Shannon

    2008-01-01

    With all the new advances in library technology--including metadata, social networking, and Web 2.0, along with the advent of nonlibrary and for-profit digital information companies like Wikisource and Google Print--librarians have barely had time to reflect on the nontechnical implications of these innovations. They need to take a step back and…

  4. Is It Cheating if Everybody Does It?

    ERIC Educational Resources Information Center

    Gustafon, Chris

    2004-01-01

    A teacher brings you a paper he suspects is not the student's own work, and a Google search confirms it was copied right off a web page. Intellectual honesty issues are impossible to duck in the library, but plagiarism lessons are often met with yawns and eye rolls from students.

  5. Information Portals: The Next Generation Catalog

    ERIC Educational Resources Information Center

    Allison, DeeAnn

    2010-01-01

    Libraries today face an increasing challenge: to provide relevant information to diverse populations with differing needs while competing with Web search engines like Google. In 2009, a large group of libraries, including the University of Nebraska-Lincoln Libraries, joined with Innovative Interfaces as development partners to design a new type of…

  6. OntoCAT -- simple ontology search and integration in Java, R and REST/JavaScript

    PubMed Central

    2011-01-01

    Background Ontologies have become an essential asset in the bioinformatics toolbox and a number of ontology access resources are now available, for example, the EBI Ontology Lookup Service (OLS) and the NCBO BioPortal. However, these resources differ substantially in mode, ease of access, and ontology content. This makes it relatively difficult to access each ontology source separately and map their contents to research data, and much of this effort is replicated across different research groups. Results OntoCAT provides a seamless programming interface to query heterogeneous ontology resources including OLS and BioPortal, as well as user-specified local OWL and OBO files. Each resource is wrapped behind easy-to-learn Java, Bioconductor/R and REST web service commands enabling reuse and integration of ontology software efforts despite variation in technologies. It is also available as a stand-alone MOLGENIS database and a Google App Engine application. Conclusions OntoCAT provides a robust, configurable solution for accessing ontology terms specified locally and from remote services, is available as a stand-alone tool and has been tested thoroughly in the ArrayExpress, MOLGENIS, EFO and Gen2Phen phenotype use cases. Availability http://www.ontocat.org PMID:21619703

  7. OntoCAT--simple ontology search and integration in Java, R and REST/JavaScript.

    PubMed

    Adamusiak, Tomasz; Burdett, Tony; Kurbatova, Natalja; Joeri van der Velde, K; Abeygunawardena, Niran; Antonakaki, Despoina; Kapushesky, Misha; Parkinson, Helen; Swertz, Morris A

    2011-05-29

    Ontologies have become an essential asset in the bioinformatics toolbox and a number of ontology access resources are now available, for example, the EBI Ontology Lookup Service (OLS) and the NCBO BioPortal. However, these resources differ substantially in mode, ease of access, and ontology content. This makes it relatively difficult to access each ontology source separately and map their contents to research data, and much of this effort is replicated across different research groups. OntoCAT provides a seamless programming interface to query heterogeneous ontology resources including OLS and BioPortal, as well as user-specified local OWL and OBO files. Each resource is wrapped behind easy-to-learn Java, Bioconductor/R and REST web service commands enabling reuse and integration of ontology software efforts despite variation in technologies. It is also available as a stand-alone MOLGENIS database and a Google App Engine application. OntoCAT provides a robust, configurable solution for accessing ontology terms specified locally and from remote services, is available as a stand-alone tool and has been tested thoroughly in the ArrayExpress, MOLGENIS, EFO and Gen2Phen phenotype use cases. http://www.ontocat.org.

  8. The Qatar National Historic Environment Record: a Platform for the Development of a Fully-Integrated Cultural Heritage Management Application

    NASA Astrophysics Data System (ADS)

    Cuttler, R. T. H.; Tonner, T. W. W.; Al-Naimi, F. A.; Dingwall, L. M.; Al-Hemaidi, N.

    2013-07-01

    The development of the Qatar National Historic Environment Record (QNHER) by the Qatar Museums Authority and the University of Birmingham in 2008 was based on a customised, bilingual Access database and ArcGIS. While both platforms are stable and well supported, neither was designed for the documentation and retrieval of cultural heritage data. As a result it was decided to develop a custom application using Open Source code. The core module of this application is now completed and is orientated towards the storage and retrieval of geospatial heritage data for the curation of heritage assets. Based on MIDAS Heritage data standards and regionally relevant thesauri, it is a truly bilingual system. Significant attention has been paid to the user interface, which is user-friendly and intuitive. Based on a suite of web services and accessed through a web browser, the system makes full use of internet resources such as Google Maps and Bing Maps. The application avoids long-term vendor "tie-ins" and, as a fully integrated data management system, is now an important tool for both cultural resource managers and heritage researchers in Qatar.

  9. A Participatory Agent-Based Simulation for Indoor Evacuation Supported by Google Glass

    PubMed Central

    Sánchez, Jesús M.; Carrera, Álvaro; Iglesias, Carlos Á.; Serrano, Emilio

    2016-01-01

    Indoor evacuation systems are needed for rescue and safety management. One of the challenges is to provide users with personalized evacuation routes in real time. To this end, this project aims at exploring the possibilities of Google Glass technology for participatory multiagent indoor evacuation simulations. Participatory multiagent simulation combines scenario-guided agents and humans equipped with Google Glass that coexist in a shared virtual space and jointly perform simulations. The paper proposes an architecture for participatory multiagent simulation in order to combine devices (Google Glass and/or smartphones) with an agent-based social simulator and indoor tracking services. PMID:27563911

  10. FwWebViewPlus: integration of web technologies into WinCC OA based Human-Machine Interfaces at CERN

    NASA Astrophysics Data System (ADS)

    Golonka, Piotr; Fabian, Wojciech; Gonzalez-Berges, Manuel; Jasiun, Piotr; Varela-Rodriguez, Fernando

    2014-06-01

    The rapid growth in popularity of web applications gives rise to a plethora of reusable graphical components, such as Google Chart Tools and JQuery Sparklines, implemented in JavaScript and run inside a web browser. In the paper we describe the tool that allows for seamless integration of web-based widgets into WinCC Open Architecture, the SCADA system used commonly at CERN to build complex Human-Machine Interfaces. Reuse of widely available widget libraries and pushing the development efforts to a higher abstraction layer based on a scripting language allow for significant reduction in maintenance of the code in multi-platform environments compared to those currently used in C++ visualization plugins. Adequately designed interfaces allow for rapid integration of new web widgets into WinCC OA. At the same time, the mechanisms familiar to HMI developers are preserved, making the use of new widgets "native". Perspectives for further integration between the realms of WinCC OA and Web development are also discussed.

  11. Health and medication information resources on the World Wide Web.

    PubMed

    Grossman, Sara; Zerilli, Tina

    2013-04-01

    Health care practitioners have increasingly used the Internet to obtain health and medication information. The vast number of Internet Web sites providing such information and concerns with their reliability makes it essential for users to carefully select and evaluate Web sites prior to use. To this end, this article reviews the general principles to consider in this process. Moreover, as cost may limit access to subscription-based health and medication information resources with established reputability, freely accessible online resources that may serve as an invaluable addition to one's reference collection are highlighted. These include government- and organization-sponsored resources (eg, US Food and Drug Administration Web site and the American Society of Health-System Pharmacists' Drug Shortage Resource Center Web site, respectively) as well as commercial Web sites (eg, Medscape, Google Scholar). Familiarity with such online resources can assist health care professionals in their ability to efficiently navigate the Web and may potentially expedite the information gathering and decision-making process, thereby improving patient care.

  12. Googling suicide: surfing for suicide information on the Internet.

    PubMed

    Recupero, Patricia R; Harms, Samara E; Noble, Jeffrey M

    2008-06-01

    This study examined the types of resources a suicidal person might find through search engines on the Internet. We were especially interested in determining the accessibility of potentially harmful resources, such as pro-suicide forums, as such resources have been implicated in completed suicides and are known to exist on the Web. Using 5 popular search engines (Google, Yahoo!, Ask.com, Lycos, and Dogpile) and 4 suicide-related search terms (suicide, how to commit suicide, suicide methods, and how to kill yourself), we collected quantitative and qualitative data about the search results. The searches were conducted in August and September 2006. Several co-raters assigned codes and characterizations to the first 30 Web sites per search term combination (and "sponsored links" on those pages), which were then confirmed by consensus ratings. Search results were classified as being pro-suicide, anti-suicide, suicide-neutral, not a suicide site, or error (i.e., the page would not load). Additional information was collected to further characterize the nature of the information on these Web sites. Suicide-neutral and anti-suicide pages occurred most frequently (of 373 unique Web pages, 115 were coded as suicide-neutral and 109 as anti-suicide). While pro-suicide resources were less frequent (41 Web pages), they were nonetheless easily accessible. Detailed how-to instructions for unusual and lethal suicide methods were likewise easily located through the searches. Mental health professionals should ask patients about their Internet use. Depressed, suicidal, or potentially suicidal patients who use the Internet may be especially at risk. Clinicians may wish to assist patients in locating helpful, supportive resources online so that patients' Internet use may be more therapeutic than harmful.

  13. Injury surveillance in low-resource settings using Geospatial and Social Web technologies

    PubMed Central

    2010-01-01

    Background Extensive public health gains have benefited high-income countries in recent decades, however, citizens of low and middle-income countries (LMIC) have largely not enjoyed the same advancements. This is in part due to the fact that public health data - the foundation for public health advances - are rarely collected in many LMIC. Injury data are particularly scarce in many low-resource settings, despite the huge associated burden of morbidity and mortality. Advances in freely accessible and easy-to-use information and communication technology (ICT) may provide the impetus for increased public health data collection in settings with limited financial and personnel resources. Methods and Results A pilot study was conducted at a hospital in Cape Town, South Africa to assess the utility and feasibility of using free (non-licensed), and easy-to-use Social Web and GeoWeb tools for injury surveillance in low-resource settings. Data entry, geocoding, data exploration, and data visualization were successfully conducted using these technologies, including Google Spreadsheet, Mapalist, BatchGeocode, and Google Earth. Conclusion This study examined the potential for Social Web and GeoWeb technologies to contribute to public health data collection and analysis in low-resource settings through an injury surveillance pilot study conducted in Cape Town, South Africa. The success of this study illustrates the great potential for these technologies to be leveraged for public health surveillance in resource-constrained environments, given their ease-of-use and low cost, and the sharing and collaboration capabilities they afford. The possibilities and potential limitations of these technologies are discussed in relation to the study, and to the field of public health in general. PMID:20497570

  14. Electronic Biomedical Literature Search for Budding Researcher

    PubMed Central

    Thakre, Subhash B.; Thakre S, Sushama S.; Thakre, Amol D.

    2013-01-01

    Searching for specific and well-defined literature related to the subject of interest is the foremost step in research. Once we are familiar with the topic or subject, we can frame an appropriate research question. An appropriate research question is the basis for study objectives and hypotheses. The Internet provides quick access to an overabundance of the medical literature, in the form of primary, secondary and tertiary literature. It is accessible through journals, databases, dictionaries, textbooks, indexes, and e-journals, thereby allowing access to more varied, individualised, and systematic educational opportunities. A web search engine is a tool designed to search for information on the World Wide Web, which may be in the form of web pages, images, information, and other types of files. Search engines for internet-based search of medical literature include Google, Google Scholar, Scirus, Yahoo, etc., and databases include MEDLINE, PubMed, MEDLARS, etc. Several web libraries (National Library of Medicine, Cochrane, Web of Science, Medical Matrix, Emory Libraries) have been developed as meta-sites, providing useful links to health resources globally. A researcher must keep in mind the strengths and limitations of a particular search engine/database while searching for a particular type of data. Knowledge about the types of literature, levels of evidence, and the features of a search engine, such as its user interface, ease of access, reputable content, and period of time covered, allows their optimal use and maximal utility in the field of medicine. Literature search is a dynamic and interactive process; there is no one way to conduct a search and there are many variables involved. It is suggested that a systematic search of the literature that uses available electronic resources effectively is more likely to produce quality research. PMID:24179937

  15. Electronic biomedical literature search for budding researcher.

    PubMed

    Thakre, Subhash B; Thakre S, Sushama S; Thakre, Amol D

    2013-09-01

    Searching for specific and well-defined literature related to the subject of interest is the foremost step in research. Once we are familiar with the topic or subject, we can frame an appropriate research question. An appropriate research question is the basis for study objectives and hypotheses. The Internet provides quick access to an overabundance of the medical literature, in the form of primary, secondary and tertiary literature. It is accessible through journals, databases, dictionaries, textbooks, indexes, and e-journals, thereby allowing access to more varied, individualised, and systematic educational opportunities. A web search engine is a tool designed to search for information on the World Wide Web, which may be in the form of web pages, images, information, and other types of files. Search engines for internet-based search of medical literature include Google, Google Scholar, Scirus, Yahoo, etc., and databases include MEDLINE, PubMed, MEDLARS, etc. Several web libraries (National Library of Medicine, Cochrane, Web of Science, Medical Matrix, Emory Libraries) have been developed as meta-sites, providing useful links to health resources globally. A researcher must keep in mind the strengths and limitations of a particular search engine/database while searching for a particular type of data. Knowledge about the types of literature, levels of evidence, and the features of a search engine, such as its user interface, ease of access, reputable content, and period of time covered, allows their optimal use and maximal utility in the field of medicine. Literature search is a dynamic and interactive process; there is no one way to conduct a search and there are many variables involved. It is suggested that a systematic search of the literature that uses available electronic resources effectively is more likely to produce quality research.

  16. Brave New Media World: Science Communication Voyages through the Global Seas

    NASA Astrophysics Data System (ADS)

    Clark, C. L.; Reisewitz, A.

    2010-12-01

    By leveraging online tools such as blogs, Twitter, Facebook, Google Earth, flickr, web-based discussion boards, and a bi-monthly electronic magazine for the non-scientist, Scripps Institution of Oceanography is taking science communication beyond the static webpage to create interactive journeys that spark social dialogue and raise awareness of science-based research on global marine environmental issues. Several new initiatives are being chronicled through popular blogs and expedition web sites as researchers share interesting scientific facts and unusual findings in near real-time.

  17. Impact of different cloud deployments on real-time video applications for mobile video cloud users

    NASA Astrophysics Data System (ADS)

    Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos

    2015-02-01

    The latest trend of accessing mobile cloud services through wireless network connectivity has grown globally among both entrepreneurs and home end users. Although existing public cloud service vendors such as Google and Microsoft Azure provide on-demand cloud services at affordable cost for mobile users, there are still a number of challenges in achieving high-quality mobile cloud based video applications, especially due to the bandwidth-constrained and error-prone mobile network connectivity, which is the communication bottleneck for end-to-end video delivery. In addition, existing accessible cloud networking architectures differ in terms of their implementation, services, resources, storage, pricing, support and so on, and these differences have a varied impact on the performance of cloud-based real-time video applications. Nevertheless, these challenges and impacts have not been thoroughly investigated in the literature. In our previous work, we implemented a mobile cloud network model that integrates localized and decentralized cloudlets (mini-clouds) and wireless mesh networks. In this paper, we deploy a real-time framework consisting of various existing Internet cloud networking architectures (Google Cloud, Microsoft Azure and Eucalyptus Cloud) and a cloudlet based on Ubuntu Enterprise Cloud over wireless mesh networking technology for mobile cloud end users. It is noted that the increasing trend of accessing real-time video streaming over HTTP/HTTPS is gaining popularity among both research and industrial communities, to leverage the existing web services and HTTP infrastructure in the Internet. To study the performance under different deployments using different public and private cloud service providers, we employ real-time video streaming over the HTTP/HTTPS standard, and conduct experimental evaluation and in-depth comparative analysis of the impact of different deployments on the quality of service for mobile video cloud users. Empirical results are presented and discussed to quantify and explain the different impacts resulting from various cloud deployments, video applications, wireless/mobile network settings, and user mobility. Additionally, this paper analyses the advantages, disadvantages, limitations and optimization techniques of various cloud networking deployments, in particular the cloudlet approach compared with the Internet cloud approach, with recommendations for optimized deployments highlighted. Finally, federated clouds and inter-cloud collaboration challenges and opportunities are discussed in the context of supporting real-time video applications for mobile users.

  18. Locum tenens and telepsychiatry: trends in psychiatric care.

    PubMed

    Thiele, Jonathan S; Doarn, Charles R; Shore, Jay H

    2015-06-01

    There is a national shortage of psychiatrists, and according to nationally available data, it is projected to get worse. Locum tenens psychiatry and telepsychiatry are two ways to fill the shortages of psychiatric providers that exist in many areas of the United States. Employment and salary data in these areas can be used to illuminate current trends and anticipate future solutions to the problem of increasing demand for, and decreasing supply of, psychiatrists in the United States. A search was conducted of the literature and relevant Web sites, including PubMed, Google Scholar, and www.google.com, supplemented by information obtained from locum tenens and telepsychiatry organizations. There is a dearth of data on the use of locum tenens in the field of psychiatry, with little available prior to 2000 and few published studies since then. The majority of the available data are survey data from commercial entities. These data show trends toward increasing demand for psychiatry along with increasing salaries, and indicate that the utilization of telepsychiatry and locum tenens telepsychiatry is increasing. The published academic data that are available show that although locum tenens psychiatry is slightly inferior to routine psychiatric care, telepsychiatry is generally equivalent to face-to-face care. As the national shortage of psychiatrists is expected to accelerate, one can anticipate that the use of both locum tenens and telepsychiatry will continue to increase. Telepsychiatry offers several possible advantages, including lower cost, longer-term services, quality of care, and models that can extend psychiatric services. If current trends continue, systems that demand face-to-face psychiatry may find themselves paying higher fees for locum tenens psychiatrists, whereas others may employ psychiatrists more efficiently with telepsychiatry.

  19. Using food-web theory to conserve ecosystems

    PubMed Central

    McDonald-Madden, E.; Sabbadin, R.; Game, E. T.; Baxter, P. W. J.; Chadès, I.; Possingham, H. P.

    2016-01-01

    Food-web theory can be a powerful guide to the management of complex ecosystems. However, we show that indices of species importance common in food-web and network theory can be a poor guide to ecosystem management, resulting in significantly more extinctions than necessary. We use Bayesian Networks and Constrained Combinatorial Optimization to find optimal management strategies for a wide range of real and hypothetical food webs. This Artificial Intelligence approach provides the ability to test the performance of any index for prioritizing species management in a network. While no single network theory index provides an appropriate guide to management for all food webs, a modified version of the Google PageRank algorithm reliably minimizes the chance and severity of negative outcomes. Our analysis shows that by prioritizing ecosystem management based on the network-wide impact of species protection rather than species loss, we can substantially improve conservation outcomes. PMID:26776253
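
    As a rough illustration of the ranking idea behind this entry, the sketch below runs standard PageRank, via power iteration over the Google matrix, on a toy four-species food web; the paper's modified index is not specified in the abstract, and the toy web, matrix orientation, and damping factor are assumptions made here for illustration.

```python
import numpy as np

def pagerank(adj, alpha=0.85, tol=1e-10):
    """Power-iteration PageRank on a directed adjacency matrix.

    adj[i, j] = 1 encodes a link i -> j (here: energy flowing from
    resource i to consumer j), an orientation assumed for illustration.
    """
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    # Rows with no outgoing links are spread uniformly over all nodes.
    S = np.where(out > 0, adj / np.maximum(out, 1), 1.0 / n)
    G = alpha * S + (1 - alpha) / n           # the Google matrix
    r = np.full(n, 1.0 / n)
    while True:
        r_next = r @ G
        if np.abs(r_next - r).sum() < tol:
            return r_next
        r = r_next

# Toy web: 0 = plant, 1 = herbivore, 2 = predator, 3 = scavenger.
adj = np.array([[0, 1, 0, 0],
                [0, 0, 1, 1],
                [0, 0, 0, 1],
                [0, 0, 0, 0]], dtype=float)
print(pagerank(adj))  # higher score = more central in the web
```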

  20. Regional early flood warning system: design and implementation

    NASA Astrophysics Data System (ADS)

    Chang, L. C.; Yang, S. N.; Kuo, C. L.; Wang, Y. F.

    2017-12-01

    This study proposes a prototype regional early flood inundation warning system for Tainan City, Taiwan. AI technology is used to forecast multi-step-ahead regional flood inundation maps during storm events. The computing time is only a few seconds, which enables real-time regional flood inundation forecasting. A database is built to organize data and information for building the real-time forecasting models, maintaining the relations of forecasted points, and displaying forecasted results, while real-time data acquisition is another key task, since the model requires immediate access to rain gauge information to provide forecast services. All database-related programs are built on Microsoft SQL Server using Visual C# to extract real-time hydrological data, manage data, store the forecasted results, and feed the visual map-based display. The regional early flood inundation warning system uses up-to-date Web technologies, driven by the database and real-time data acquisition, to display the on-line forecasted flood inundation depths in the study area. The user-friendly interface sequentially shows the inundated area on Google Maps together with the maximum inundation depth and its location, and provides a KMZ file download of the results that can be viewed in Google Earth. The developed system provides all the relevant information and on-line forecast results, helping city authorities to make decisions during typhoon events and take actions to mitigate losses.
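
    The abstract mentions delivering forecast results as a KMZ file for Google Earth. Below is a minimal sketch of that last step using only the Python standard library; the point names, coordinates, and depths are hypothetical stand-ins for the model's output.

```python
import zipfile

def depths_to_kmz(points, path="forecast.kmz"):
    """Write forecasted inundation depths as KML placemarks inside a KMZ.

    points: iterable of (name, lon, lat, depth_m) tuples; a hypothetical
    output format for the inundation forecasting model.
    """
    placemarks = "\n".join(
        f"<Placemark><name>{name}: {depth:.2f} m</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for name, lon, lat, depth in points
    )
    kml = ('<?xml version="1.0" encoding="UTF-8"?>'
           '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
           f"{placemarks}</Document></kml>")
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr("doc.kml", kml)  # a KMZ is simply zipped KML

depths_to_kmz([("Tainan gauge A", 120.21, 22.99, 0.45)])
```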

  1. Interoperability In The New Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Rios, C.; Barbarisi, I.; Docasal, R.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; Grotheer, E.; Besse, S.; Martinez, S.; Heather, D.; De Marchi, G.; Lim, T.; Fraga, D.; Barthelemy, M.

    2015-12-01

    As the world becomes increasingly interconnected, there is a greater need to provide interoperability with software and applications that are commonly used globally. For this purpose, the development of the new Planetary Science Archive (PSA) by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC) is focused on building a modern science archive that takes into account internationally recognised standards in order to provide access to the archive through tools from third parties, for example the NASA Planetary Data System (PDS), the VESPA project from the Virtual Observatory of Paris, and other international institutions. The protocols and standards currently supported by the new Planetary Science Archive are the Planetary Data Access Protocol (PDAP), the EuroPlanet-Table Access Protocol (EPN-TAP) and Open Geospatial Consortium (OGC) standards. The architecture of the PSA includes a GeoServer (an open-source map server), the goal of which is to support use cases such as the distribution of search results and the sharing and processing of data through an OGC Web Feature Service (WFS) and a Web Map Service (WMS). This server also allows the retrieval of requested information in several standard output formats such as Keyhole Markup Language (KML), Geography Markup Language (GML), shapefile, JavaScript Object Notation (JSON) and Comma Separated Values (CSV), among others. The provision of these various output formats enables end users to transfer retrieved data into popular applications such as Google Mars and NASA World Wind.
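
    To make the WFS/JSON interplay concrete, here is a minimal sketch of a standard GeoServer WFS 2.0.0 GetFeature request asking for GeoJSON output; the endpoint URL and layer name are hypothetical, since the abstract does not give the PSA's actual service addresses.

```python
import requests

WFS_URL = "https://psa.example.org/geoserver/wfs"  # hypothetical endpoint

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "psa:footprints",       # hypothetical layer name
    "outputFormat": "application/json",  # GeoServer's GeoJSON output
    "count": 10,                         # cap the number of features
}
resp = requests.get(WFS_URL, params=params, timeout=30)
resp.raise_for_status()
for feature in resp.json()["features"]:
    print(feature["id"], feature["properties"])
```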

  2. Medical student appraisal: searching on smartphones.

    PubMed

    Khalifian, S; Markman, T; Sampognaro, P; Mitchell, S; Weeks, S; Dattilo, J

    2013-01-01

    The rapidly growing industry for mobile medical applications provides numerous smartphone resources designed for healthcare professionals. However, not all applications are equally useful in addressing the questions of early medical trainees. Three popular, free, mobile healthcare applications were evaluated along with a Google™ web search on both Apple™ and Android™ devices. Six medical students at a large academic hospital evaluated each application for a one-week period while on various clinical rotations. Google™ was the most frequently used search method and presented multimedia resources, but was inefficient for obtaining clinical management information. The Epocrates™ Pill ID feature was praised for its clinical utility. Medscape™ had the highest search satisfaction and excelled through interactive educational features. Micromedex™ offered both FDA and off-label dosing for drugs. Google™ was the preferred search method for questions related to basic disease processes and multimedia resources, but was inadequate for clinical management. Caution should also be exercised when using Google™ in front of patients. Medscape™ was the most appealing application due to its broad scope of content and educational features relevant to medical trainees. Students should also be cognizant of how mobile technology may be perceived by their evaluators to avoid false impressions.

  3. Evaluation of the Content and Accessibility of Web Sites for Accredited Orthopaedic Trauma Surgery Fellowships.

    PubMed

    Shaath, M Kareem; Yeranosian, Michael G; Ippolito, Joseph A; Adams, Mark R; Sirkin, Michael S; Reilly, Mark C

    2018-05-02

    Orthopaedic trauma fellowship applicants use online-based resources when researching information on potential U.S. fellowship programs. The 2 primary sources for identifying programs are the Orthopaedic Trauma Association (OTA) database and the San Francisco Match (SF Match) database. Previous studies in other orthopaedic subspecialty areas have demonstrated considerable discrepancies among fellowship programs. The purpose of this study was to analyze content and availability of information on orthopaedic trauma surgery fellowship web sites. The online databases of the OTA and SF Match were reviewed to determine the availability of embedded program links or external links for the included programs. Thereafter, a Google search was performed for each program individually by typing the program's name, followed by the term "orthopaedic trauma fellowship." All identified fellowship web sites were analyzed for accessibility and content. Web sites were evaluated for comprehensiveness in mentioning key components of the orthopaedic trauma surgery curriculum. By consensus, we refined the final list of variables utilizing the methodology of previous studies on the topic. We identified 54 OTA-accredited fellowship programs, offering 87 positions. The majority (94%) of programs had web sites accessible through a Google search. Of the 51 web sites found, all (100%) described their program. Most commonly, hospital affiliation (88%), operative experiences (76%), and rotation overview (65%) were listed, and, least commonly, interview dates (6%), selection criteria (16%), on-call requirements (20%), and fellow evaluation criteria (20%) were listed. Programs with ≥2 fellows provided more information with regard to education content (p = 0.0001) and recruitment content (p = 0.013). Programs with Accreditation Council for Graduate Medical Education (ACGME) accreditation status also provided greater information with regard to education content (odds ratio, 4.0; p = 0.0001). Otherwise, no differences were seen by region, residency affiliation, medical school affiliation, or hospital affiliation. The SF Match and OTA databases provide few direct links to fellowship web sites. Individual program web sites do not effectively and completely convey information about the programs. The Internet is an underused resource for fellow recruitment. The lack of information on these sites allows for future opportunity to optimize this resource.

  4. Accessibility and quality of online information for pediatric orthopaedic surgery fellowships.

    PubMed

    Davidson, Austin R; Murphy, Robert F; Spence, David D; Kelly, Derek M; Warner, William C; Sawyer, Jeffrey R

    2014-12-01

    Pediatric orthopaedic fellowship applicants commonly use online-based resources for information on potential programs. Two primary sources are the San Francisco Match (SF Match) database and the Pediatric Orthopaedic Society of North America (POSNA) database. We sought to determine the accessibility and quality of information that could be obtained by using these 2 sources. The online databases of the SF Match and POSNA were reviewed to determine the availability of embedded program links or external links for the included programs. If not available in the SF Match or POSNA data, Web sites for listed programs were located with a Google search. All identified Web sites were analyzed for accessibility, content volume, and content quality. At the time of online review, 50 programs, offering 68 positions, were listed in the SF Match database. Although 46 programs had links included with their information, 36 (72%) of them simply listed http://www.sfmatch.org as their unique Web site. Ten programs (20%) had external links listed, but only 2 (4%) linked directly to the fellowship web page. The POSNA database does not list any links to the 47 programs it lists, which offer 70 positions. On the basis of a Google search of the 50 programs listed in the SF Match database, web pages were found for 35. Of programs with independent web pages, all had a description of the program and 26 (74%) described their application process. Twenty-nine (83%) listed research requirements, 22 (63%) described the rotation schedule, and 12 (34%) discussed the on-call expectations. A contact telephone number and/or email address was provided by 97% of programs. Twenty (57%) listed both the coordinator and fellowship director, 9 (26%) listed the coordinator only, 5 (14%) listed the fellowship director only, and 1 (3%) had no contact information given. The SF Match and POSNA databases provide few direct links to fellowship Web sites, and individual program Web sites either do not exist or do not effectively convey information about the programs. Improved accessibility and accurate information online would allow potential applicants to obtain information about pediatric fellowships in a more efficient manner.

  5. Paying Your Way to the Top: Search Engine Advertising.

    ERIC Educational Resources Information Center

    Scott, David M.

    2003-01-01

    Explains how organizations can buy listings on major Web search engines, making it the fastest growing form of advertising. Highlights include two network models, Google and Overture; bidding on phrases to buy as links to use with ads; ad ranking; benefits for small businesses; and paid listings versus regular search results. (LRW)

  6. Prevalence of purulent vaginal discharge in dairy herds depends on timing but not method of detection

    USDA-ARS?s Scientific Manuscript database

    A review of existing literature was conducted to determine the prevalence of purulent vaginal discharge (PVD) in dairy herds around the world and detection methodologies that influence prevalence estimates. Four databases (PubMed, Google Scholar, Web of Science, and Scopus) were queried with the sea...

  7. 76 FR 74776 - Forum-Trends in Extreme Winds, Waves, and Extratropical Storms Along the Coasts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-01

    ... Winds, Waves, and Extratropical Storms Along the Coasts AGENCY: National Environmental Satellite, Data... information, please check the forum Web site at https://sites.google.com/a/noaa.gov/extreme-winds-waves.../noaa.gov/extreme-winds-waves-extratropical-storms/home . Topics To Be Addressed This forum will address...

  8. Make Your Own Mashup Maps

    ERIC Educational Resources Information Center

    Lucking, Robert A.; Christmann, Edwin P.; Whiting, Mervyn J.

    2008-01-01

    "Mashup" is a new technology term used to describe a web application that combines data or technology from several different sources. You can apply this concept in your classroom by having students create their own mashup maps. Google Maps provides you with the simple tools, map databases, and online help you'll need to quickly master this…

  9. Concordancers and Dictionaries as Problem-Solving Tools for ESL Academic Writing

    ERIC Educational Resources Information Center

    Yoon, Choongil

    2016-01-01

    The present study investigated how 6 Korean ESL graduate students in Canada used a suite of freely available reference resources, consisting of Web-based corpus tools, Google search engines, and dictionaries, for solving linguistic problems while completing an authentic academic writing assignment in English. Using a mixed methods design, the…

  10. Web-Based Collaborative Writing in L2 Contexts: Methodological Insights from Text Mining

    ERIC Educational Resources Information Center

    Yim, Soobin; Warschauer, Mark

    2017-01-01

    The increasingly widespread use of social software (e.g., Wikis, Google Docs) in second language (L2) settings has brought a renewed attention to collaborative writing. Although the current methodological approaches to examining collaborative writing are valuable to understand L2 students' interactional patterns or perceived experiences, they can…

  11. Voice-Recognition Augmented Performance Tools in Performance Poetry Pedagogy

    ERIC Educational Resources Information Center

    Devanny, David; McGowan, Jack

    2016-01-01

    This provocation shares findings from the use of bespoke voice-recognition performance software in a number of seminars (which took place in the 2014-2016 academic years at Glasgow School of Art, University of Warwick, and Falmouth University). The software, made available through this publication, is a web-app which uses Google Chrome's native…

  12. Factors Influencing Consent to Having Videotaped Mental Health Sessions

    ERIC Educational Resources Information Center

    Ko, Kenton; Goebert, Deborah

    2011-01-01

    Objective: The authors critically reviewed the literature regarding factors influencing consent to having videotaped mental health sessions. Methods: The authors searched the literature in PubMed, PsycINFO, Google Scholar, and Web of Science from the mid-1950s through February 2009. Results: The authors identified 27 studies, of which 19 (73%)…

  13. Teaching in Educational Leadership Using Web 2.0 Applications: Perspectives on What Works

    ERIC Educational Resources Information Center

    Shinsky, E. John; Stevens, Hans A.

    2011-01-01

    To prepare 21st Century school leaders, educational leadership professors need to learn and teach the utilization of increasingly sophisticated technologies in their courses. The co-authors, a professor and an educational specialist degree candidate, describe how the use of advanced technologies--such as Wikis, Google Docs, Wimba Classroom, and…

  14. Ready for Their Close-Ups

    ERIC Educational Resources Information Center

    Foster, Andrea L.

    2006-01-01

    American college students are increasingly posting videos of their lives online, due to Web sites like Vimeo and Google Video that host video material free and the ubiquity of camera phones and other devices that can take video-clips. However, the growing popularity of online socializing has many safety experts worried that students could be…

  15. Teaching Undergraduate Software Engineering Using Open Source Development Tools

    DTIC Science & Technology

    2012-01-01

    ware. Some example appliances are: a LAMP stack, Redmine, MySQL database, Moodle, Tomcat on Apache, and Bugzilla. Some of the important features...Ada, C, C++, PHP, Python, etc., and also supports a wide range of SDKs such as Google's Android SDK and the Google Web Toolkit SDK. Additionally

  16. Assessing Journal Quality in Mathematics Education

    ERIC Educational Resources Information Center

    Nivens, Ryan Andrew; Otten, Samuel

    2017-01-01

    In this Research Commentary, we describe 3 journal metrics--the Web of Science's Impact Factor, Scopus's SCImago Journal Rank, and Google Scholar Metrics' h5-index--and compile the rankings (if they exist) for 69 mathematics education journals. We then discuss 2 paths that the mathematics education community should consider with regard to these…

  17. Web Searching: A Process-Oriented Experimental Study of Three Interactive Search Paradigms.

    ERIC Educational Resources Information Center

    Dennis, Simon; Bruza, Peter; McArthur, Robert

    2002-01-01

    Compares search effectiveness when using query-based Internet search via the Google search engine, directory-based search via Yahoo, and phrase-based query reformulation-assisted search via the Hyperindex browser by means of a controlled, user-based experimental study of undergraduates at the University of Queensland. Discusses cognitive load,…

  18. Where Do I Find It?--An Internet Glossary.

    ERIC Educational Resources Information Center

    Del Monte, Erin; Manso, Angela

    2001-01-01

    Lists 13 different Internet search engines that might be of interest to educators, including: AOL Search, Alta Vista, Google, Lycos, Northern Light, and Yahoo. Gives a brief description of each search engine's capabilities, strengths, and weaknesses and includes Web addresses of U.S. government offices, including the U.S. Department of Education.…

  19. School Librarians: Vital Educational Leaders

    ERIC Educational Resources Information Center

    Martineau, Pamela

    2010-01-01

    In the new millennium, school librarians are more likely to be found sitting behind a computer as they update the library web page or create a wiki on genetically modified organisms. Or they might be seen in the library computer lab as they lead students through tutorials on annotated bibliographies or Google docs. If adequately supported, school…

  20. The Physlet Approach to Simulation Design

    ERIC Educational Resources Information Center

    Christian, Wolfgang; Belloni, Mario; Esquembre, Francisco; Mason, Bruce A.; Barbato, Lyle; Riggsbee, Matt

    2015-01-01

    Over the past two years, the AAPT/ComPADRE staff and the Open Source Physics group have published the second edition of "Physlet Physics" and "Physlet Quantum Physics," delivered as interactive web pages on AAPT/ComPADRE and as free eBooks available through iTunes and Google Play. These two websites, and their associated books,…

  1. Modern Amphibious Operations: Why the United States Must Maintain a Joint Amphibious Forcible Entry Capability

    DTIC Science & Technology

    2012-03-23

    be reminded that the aforementioned movies depicted events that happened nearly 70 years ago. These films neither represent modern amphibious...contemporary sources. Few are more contemporary than those in this genre. Nothing can substitute a simple Google web search to get ideas about where

  2. 78 FR 62820 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-22

    ... on Web sites operated by Google, Interactive Data, and Dow Jones, among others. The text of the... and various forms of alternative trading systems (``ATSs''), including dark pools and electronic..., BATS Trading and Direct Edge. A proliferation of dark pools and other ATSs operate profitably with...

  3. Map of Life - A Dashboard for Monitoring Planetary Species Distributions

    NASA Astrophysics Data System (ADS)

    Jetz, W.

    2016-12-01

    Geographic information about biodiversity is vital for understanding the many services nature provides and their potential changes, yet it remains unreliable and often insufficient. By integrating a wide range of knowledge about species distributions and their dynamics over time, Map of Life supports global biodiversity education, monitoring, research and decision-making. Built on a scalable web platform geared for large biodiversity and environmental data, Map of Life provides species range information globally and species lists for any area. With data and technology provided by NASA and Google Earth Engine, tools under development use remote sensing-based environmental layers to enable on-the-fly predictions of species distributions, range changes, and early warning signals for threatened species. The ultimate vision is a globally connected, collaborative knowledge- and tool-base for regional and local biodiversity decision-making, education, monitoring, and projection. For currently available tools, more information, and to follow progress, go to MOL.org.

  4. Development of a cloud-based system for remote monitoring of a PVT panel

    NASA Astrophysics Data System (ADS)

    Saraiva, Luis; Alcaso, Adérito; Vieira, Paulo; Ramos, Carlos Figueiredo; Cardoso, Antonio Marques

    2016-10-01

    The paper presents a monitoring system developed for a sun-based energy conversion system known as a photovoltaic-thermal (PVT) panel. The project was implemented using two embedded microcontroller platforms (Arduino Leonardo and Arduino Yún), wireless transmission systems (Wi-Fi and XBee) and network computing, commonly known as the cloud (Google Cloud). The main objective of the project is to provide remote access and real-time monitoring of data such as electrical current, electrical voltage, input fluid temperature, output fluid temperature, backward fluid temperature, upper PV glass temperature, lower PV glass temperature, ambient temperature, solar radiation, wind speed, wind direction and fluid mass flow. The project demonstrates the feasibility of using inexpensive microcontroller platforms and free internet services on the Web to support the remote study of renewable energy systems, eliminating the need to acquire dedicated systems that are typically more expensive and limited in the kind of processing proposed.
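
    As a sketch of the cloud-upload step described above (for example, from the Linux side of an Arduino Yún), the snippet below posts one JSON reading to an ingestion endpoint; the URL and field names are hypothetical, as the abstract does not detail the actual Google Cloud interface used.

```python
import json
import time
import urllib.request

ENDPOINT = "https://cloud.example.org/pvt/readings"  # hypothetical

reading = {
    "timestamp": time.time(),
    "voltage_v": 31.8,
    "current_a": 5.2,
    "fluid_in_c": 24.1,
    "fluid_out_c": 38.6,
    "solar_radiation_w_m2": 812.0,
}
req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(reading).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=10) as resp:
    print(resp.status)  # 200/201 expected from the ingestion service
```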

  5. Predicting Ambulance Time of Arrival to the Emergency Department Using Global Positioning System and Google Maps

    PubMed Central

    Fleischman, Ross J.; Lundquist, Mark; Jui, Jonathan; Newgard, Craig D.; Warden, Craig

    2014-01-01

    Objective To derive and validate a model that accurately predicts ambulance arrival time that could be implemented as a Google Maps web application. Methods This was a retrospective study of all scene transports in Multnomah County, Oregon, from January 1 through December 31, 2008. Scene and destination hospital addresses were converted to coordinates. ArcGIS Network Analyst was used to estimate transport times based on street network speed limits. We then created a linear regression model to improve the accuracy of these street network estimates using weather, patient characteristics, use of lights and sirens, daylight, and rush-hour intervals. The model was derived from a 50% sample and validated on the remainder. Significance of the covariates was determined by p < 0.05 for a t-test of the model coefficients. Accuracy was quantified by the proportion of estimates that were within 5 minutes of the actual transport times recorded by computer-aided dispatch. We then built a Google Maps-based web application to demonstrate application in real-world EMS operations. Results There were 48,308 included transports. Street network estimates of transport time were accurate within 5 minutes of actual transport time less than 16% of the time. Actual transport times were longer during daylight and rush-hour intervals and shorter with use of lights and sirens. Age under 18 years, gender, wet weather, and trauma system entry were not significant predictors of transport time. Our model predicted arrival time within 5 minutes 73% of the time. For lights and sirens transports, accuracy was within 5 minutes 77% of the time. Accuracy was identical in the validation dataset. Lights and sirens saved an average of 3.1 minutes for transports under 8.8 minutes, and 5.3 minutes for longer transports. Conclusions An estimate of transport time based only on a street network significantly underestimated transport times. A simple model incorporating few variables can predict ambulance time of arrival to the emergency department with good accuracy. This model could be linked to global positioning system data and an automated Google Maps web application to optimize emergency department resource use. Use of lights and sirens had a significant effect on transport times. PMID:23865736
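
    The model form described here, a linear regression that corrects the street-network estimate with a handful of binary covariates, can be sketched in a few lines; the synthetic data below merely mimics the study's covariates and is not the Multnomah County dataset.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
network_est = rng.uniform(3, 25, n)            # street-network minutes
lights_sirens = rng.integers(0, 2, n).astype(float)
rush_hour = rng.integers(0, 2, n).astype(float)
daylight = rng.integers(0, 2, n).astype(float)
# Synthetic "actual" times: network estimates bias low, as in the study.
actual = (1.1 * network_est - 3.0 * lights_sirens + 2.0 * rush_hour
          + 1.0 * daylight + 2.5 + rng.normal(0, 2, n))

# Ordinary least squares fit of the correction model.
X = np.column_stack([np.ones(n), network_est,
                     lights_sirens, rush_hour, daylight])
coef, *_ = np.linalg.lstsq(X, actual, rcond=None)
pred = X @ coef
print(coef.round(2),
      f"within 5 min: {np.mean(np.abs(pred - actual) <= 5):.0%}")
```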

  6. Development of a Google-based search engine for data mining radiology reports.

    PubMed

    Erinjeri, Joseph P; Picus, Daniel; Prior, Fred W; Rubin, David A; Koppel, Paul

    2009-08-01

    The aim of this study is to develop a secure, Google-based data-mining tool for radiology reports using free and open source technologies and to explore its use within an academic radiology department. A Health Insurance Portability and Accountability Act (HIPAA)-compliant data repository, search engine and user interface were created to facilitate treatment, operations, and reviews preparatory to research. The Institutional Review Board waived review of the project, and informed consent was not required. Comprising 7.9 GB of disk space, 2.9 million text reports were downloaded from our radiology information system to a fileserver. Extensible markup language (XML) representations of the reports were indexed using Google Desktop Enterprise search engine software. A hypertext markup language (HTML) form allowed users to submit queries to Google Desktop, and Google's XML response was interpreted by a practical extraction and report language (PERL) script, presenting ranked results in a web browser window. The query, reason for search, results, and documents visited were logged to maintain HIPAA compliance. Indexing averaged approximately 25,000 reports per hour. Keyword search of a common term like "pneumothorax" yielded the first ten most relevant results of 705,550 total results in 1.36 s. Keyword search of a rare term like "hemangioendothelioma" yielded the first ten most relevant results of 167 total results in 0.23 s; retrieval of all 167 results took 0.26 s. Data mining tools for radiology reports will improve the productivity of academic radiologists in clinical, educational, research, and administrative tasks. By leveraging existing knowledge of Google's interface, radiologists can quickly perform useful searches.
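
    The pipeline described, query in, ranked XML out, with every search logged for HIPAA compliance, can be sketched as follows; the XML element names and log fields are assumptions, since the article's actual Google Desktop schema and PERL script are not reproduced here.

```python
import time
import xml.etree.ElementTree as ET

# Hypothetical response shape standing in for the search engine's XML.
xml_response = """<results>
  <result rank="1"><title>CT chest 2008-03-01</title>
    <snippet>...small apical pneumothorax...</snippet></result>
</results>"""

def search_and_log(user, query, reason, xml_text, logfile="audit.log"):
    """Parse ranked hits and append an audit record (who, when, what, why)."""
    hits = [(r.get("rank"), r.findtext("title"))
            for r in ET.fromstring(xml_text).iter("result")]
    with open(logfile, "a") as log:
        log.write(f"{time.ctime()}\t{user}\t{query}\t{reason}\n")
    return hits

print(search_and_log("jdoe", "pneumothorax", "ops review", xml_response))
```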

  7. Reusable Client-Side JavaScript Modules for Immersive Web-Based Real-Time Collaborative Neuroimage Visualization

    PubMed Central

    Bernal-Rusiel, Jorge L.; Rannou, Nicolas; Gollub, Randy L.; Pieper, Steve; Murphy, Shawn; Robertson, Richard; Grant, Patricia E.; Pienaar, Rudolph

    2017-01-01

    In this paper we present a web-based software solution to the problem of implementing real-time collaborative neuroimage visualization. In both clinical and research settings, simple and powerful access to imaging technologies across multiple devices is becoming increasingly useful. Prior technical solutions have used a server-side rendering and push-to-client model wherein only the server has the full image dataset. We propose a rich client solution in which each client has all the data and uses the Google Drive Realtime API for state synchronization. We have developed a small set of reusable client-side object-oriented JavaScript modules that make use of the XTK toolkit, a popular open-source JavaScript library also developed by our team, for the in-browser rendering and visualization of brain image volumes. Efficient realtime communication among the remote instances is achieved by using just a small JSON object, comprising a representation of the XTK image renderers' state, as the Google Drive Realtime collaborative data model. The developed open-source JavaScript modules have already been instantiated in a web-app called MedView, a distributed collaborative neuroimage visualization application that is delivered to the users over the web without requiring the installation of any extra software or browser plugin. This responsive application allows multiple physically distant physicians or researchers to cooperate in real time to reach a diagnosis or scientific conclusion. It also serves as a proof of concept for the capabilities of the presented technological solution. PMID:28507515

  8. Survey of publications and the H-index of Academic Emergency Medicine Professors.

    PubMed

    Babineau, Matthew; Fischer, Christopher; Volz, Kathryn; Sanchez, Leon D

    2014-05-01

    The number of publications and how often they have been cited play a role in academic promotion. Bibliometrics that attempt to quantify the relative impact of scholarly work have been proposed. The h-index is defined as the number (h) of publications by an individual that have been cited at least h times. We calculated the h-index and number of publications for academic emergency physicians at the rank of professor. We accessed the Society for Academic Emergency Medicine professor list in January 2012. We calculated the number of publications through Web of Science and PubMed, and the h-index using Google Scholar and Web of Science. We identified 299 professors of emergency medicine. The number of professors per institution ranged from 1 to 13. The median h-index in Web of Science was 11 (interquartile range [IQR] 6-17, range 0-51); in Google Scholar the median h-index was 14 (IQR 9-22, range 0-63). The median number of publications reported in Web of Science was 36 (IQR 18-73, range 0-359). Total number of publications had a high correlation with the h-index (r=0.884). The h-index is only a partial measure of academic productivity. As a measure of the impact of an individual's publications, it provides a simple way to compare and measure academic progress and a metric that can be used when evaluating a person for academic promotion. [West J Emerg Med. 2014;15(3):290-292.]
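
    The h-index definition used in this study is simple enough to state in code; a minimal sketch with a made-up citation list:

```python
def h_index(citations):
    """Largest h such that h publications have at least h citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# A professor whose papers are cited 25, 8, 5, 3 and 3 times has h = 3.
print(h_index([25, 8, 5, 3, 3]))  # -> 3
```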

  9. Breast reconstruction post mastectomy- Let's Google it. Accessibility, readability and quality of online information.

    PubMed

    Lynch, Noel P; Lang, Bronagh; Angelov, Sophia; McGarrigle, Sarah A; Boyle, Terence J; Al-Azawi, Dhafir; Connolly, Elizabeth M

    2017-04-01

    This study evaluated the readability, accessibility and quality of English-language information pertaining to breast reconstruction post mastectomy on the Internet. Using the Google© search engine, the keywords "Breast reconstruction post mastectomy" were searched for, and the top 75 sites were analyzed. The Flesch Reading Ease Score and Gunning Fog Index were calculated to assess readability. Web site quality was assessed objectively using the University of Michigan Consumer Health Web site Evaluation Checklist, and accessibility was determined using an automated accessibility tool. In addition, the country of origin, the type of organisation producing the site and the presence of Health on the Net (HoN) certification were recorded. The Web sites were difficult to read and comprehend: the mean Flesch Reading Ease score was 55.5 and the mean Gunning Fog Index score was 8.6. The mean Michigan score was 34.8, indicating weak website quality. Websites with HoN certification ranked higher in the search results (p = 0.007). Website quality was influenced by organisation type (p < 0.0001), with academic/healthcare, not-for-profit and government sites having higher Michigan scores. Only 20% of sites met the minimum accessibility criteria. Internet information on breast reconstruction post mastectomy is poorly written, and Web pages providing such information should be made more readable and accessible. Health professionals should recommend Web sites that are easy to read and contain high-quality surgical information; medical information on the Internet should be readable, accessible, reliable and of a consistent quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
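
    For reference, the two readability formulas used in this study are standard; the sketch below evaluates them from raw counts (the example counts are invented, not taken from the 75 analyzed sites).

```python
def flesch_reading_ease(words, sentences, syllables):
    # Higher scores mean easier text; 60-70 is roughly plain English.
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def gunning_fog(words, sentences, complex_words):
    # "Complex" words have three or more syllables; result ~ school grade.
    return 0.4 * (words / sentences + 100 * complex_words / words)

# A 100-word passage in 6 sentences, with 150 syllables and 9 complex words:
print(round(flesch_reading_ease(100, 6, 150), 1))  # 63.0
print(round(gunning_fog(100, 6, 9), 1))            # 10.3
```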

  10. The Google Online Marketing Challenge and Distributed Learning

    ERIC Educational Resources Information Center

    Brown, Ron T.; Albright, Kendra S.

    2013-01-01

    Stagnant perceptions continue to persist in the general public regarding the services libraries offer. LIS research suggests an increased need for marketing, yet LIS programs and students may not view marketing as core to the degree. The Google Online Marketing Challenge (GOMC), a global competition for online marketing, was incorporated into two…

  11. An initial log analysis of usage patterns on a research networking system.

    PubMed

    Boland, Mary Regina; Trembowelski, Sylvia; Bakken, Suzanne; Weng, Chunhua

    2012-08-01

    Usage data for research networking systems (RNSs) are valuable but generally unavailable for understanding scientific professionals' information needs and online collaborator seeking behaviors. This study contributes a method for evaluating RNSs and initial usage knowledge of one RNS obtained from using this method. We designed a log for an institutional RNS, defined categories of users and tasks, and analyzed correlations between usage patterns and user and query types. Our results show that scientific professionals spend more time performing deep Web searching on RNSs than generic Google users and we also show that retrieving scientist profiles is faster on an RNS than on Google (3.5 seconds vs. 34.2 seconds) whereas organization-specific browsing on a RNS takes longer than on Google (117.0 seconds vs. 34.2 seconds). Usage patterns vary by user role, e.g., faculty performed more informational queries than administrators, which implies role-specific user support is needed for RNSs. © 2012 Wiley Periodicals, Inc.

  12. An Initial Log Analysis of Usage Patterns on a Research Networking System

    PubMed Central

    Boland, Mary Regina; Trembowelski, Sylvia; Bakken, Suzanne; Weng, Chunhua

    2012-01-01

    Usage data for research networking systems (RNSs) are valuable but generally unavailable for understanding scientific professionals’ information needs and online collaborator seeking behaviors. This study contributes a method for evaluating RNSs and initial usage knowledge of one RNS obtained from using this method. We designed a log for an institutional RNS, defined categories of users and tasks, and analyzed correlations between usage patterns and user and query types. Our results show that scientific professionals spend more time performing deep Web searching on RNSs than generic Google users and we also show that retrieving scientist profiles is faster on an RNS than on Google (3.5 seconds vs. 34.2 seconds) whereas organization-specific browsing on a RNS takes longer than on Google (117.0 seconds vs. 34.2 seconds). Usage patterns vary by user role, e.g., faculty performed more informational queries than administrators, which implies role-specific user support is needed for RNSs. Clin Trans Sci 2012; Volume 5: 340–347 PMID:22883612

  13. Google matrix of business process management

    NASA Astrophysics Data System (ADS)

    Abel, M. W.; Shepelyansky, D. L.

    2011-12-01

    Development of efficient business process models and determination of their characteristic properties are subject of intense interdisciplinary research. Here, we consider a business process model as a directed graph. Its nodes correspond to the units identified by the modeler and the link direction indicates the causal dependencies between units. It is of primary interest to obtain the stationary flow on such a directed graph, which corresponds to the steady-state of a firm during the business process. Following the ideas developed recently for the World Wide Web, we construct the Google matrix for our business process model and analyze its spectral properties. The importance of nodes is characterized by PageRank and recently proposed CheiRank and 2DRank, respectively. The results show that this two-dimensional ranking gives a significant information about the influence and communication properties of business model units. We argue that the Google matrix method, described here, provides a new efficient tool helping companies to make their decisions on how to evolve in the exceedingly dynamic global market.
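
    A minimal sketch of the construction described here: build the Google matrix of a small directed business-process graph, then obtain PageRank from the graph and CheiRank from the same graph with all links inverted (the toy graph and damping factor are assumptions made for illustration).

```python
import numpy as np

def google_matrix(adj, alpha=0.85):
    """Row-stochastic Google matrix G = alpha*S + (1 - alpha)/N."""
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    # Dangling rows (no outgoing links) are spread uniformly.
    S = np.where(out > 0, adj / np.maximum(out, 1), 1.0 / n)
    return alpha * S + (1 - alpha) / n

def stationary(G, iters=500):
    r = np.full(G.shape[0], 1.0 / G.shape[0])
    for _ in range(iters):
        r = r @ G        # power iteration toward the steady-state flow
    return r

# Toy process graph: adj[i, j] = 1 means unit i feeds unit j.
adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [0, 0, 0, 1],
                [1, 0, 0, 0]], dtype=float)
pr = stationary(google_matrix(adj))      # PageRank: incoming influence
cr = stationary(google_matrix(adj.T))    # CheiRank: outgoing communication
print(pr.round(3), cr.round(3))
```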

  14. An Automatic Web Service Composition Framework Using QoS-Based Web Service Ranking Algorithm.

    PubMed

    Mallayya, Deivamani; Ramachandran, Baskaran; Viswanathan, Suganya

    2015-01-01

    Web service has become the technology of choice for service oriented computing to meet the interoperability demands in web applications. In the Internet era, the exponential addition of web services nominates the "quality of service" as essential parameter in discriminating the web services. In this paper, a user preference based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and QoS aspect of the web service. When the user's request cannot be fulfilled by a single atomic service, several existing services should be composed and delivered as a composition. The proposed framework allows the user to specify the local and global constraints for composite web services which improves flexibility. UPWSR algorithm identifies best fit services for each task in the user request and, by choosing the number of candidate services for each task, reduces the time to generate the composition plans. To tackle the problem of web service composition, QoS aware automatic web service composition (QAWSC) algorithm proposed in this paper is based on the QoS aspects of the web services and user preferences. The proposed framework allows user to provide feedback about the composite service which improves the reputation of the services.
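
    The core of any QoS-based ranking like UPWSR is a preference-weighted score over normalized attributes. The sketch below shows that kernel only; the attribute set, weights, and normalization are illustrative assumptions, not the paper's exact algorithm.

```python
# Hypothetical candidate services: (name, response_ms, availability, cost)
services = [
    ("svcA", 120, 0.999, 0.05),
    ("svcB",  80, 0.990, 0.09),
    ("svcC", 200, 0.995, 0.02),
]
weights = {"response": 0.5, "availability": 0.3, "cost": 0.2}  # user prefs

def normalize(values, lower_is_better=False):
    """Min-max normalize to [0, 1], flipping attributes to minimize."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [(hi - v) / span if lower_is_better else (v - lo) / span
            for v in values]

rt = normalize([s[1] for s in services], lower_is_better=True)
av = normalize([s[2] for s in services])
co = normalize([s[3] for s in services], lower_is_better=True)
scores = {s[0]: weights["response"] * rt[i]
          + weights["availability"] * av[i]
          + weights["cost"] * co[i]
          for i, s in enumerate(services)}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(name, round(score, 3))
```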

  15. The GEON Integrated Data Viewer (IDV) and IRIS DMC Services Illustrate CyberInfrastructure Support for Seismic Data Visualization and Interpretation

    NASA Astrophysics Data System (ADS)

    Meertens, C.; Wier, S.; Ahern, T.; Casey, R.; Weertman, B.; Laughbon, C.

    2008-12-01

    UNAVCO and the IRIS DMC are data service partners for seismic visualization, particularly for hypocentral data and tomography. UNAVCO provides the GEON Integrated Data Viewer (IDV), an extension of the Unidata IDV, a free, interactive, research-level, software display and analysis tool for data in 3D (latitude, longitude, depth) and 4D (with time), located on or inside the Earth. The GEON IDV is designed to meet the challenge of investigating complex, multi-variate, time-varying, three- dimensional geoscience data in the context of new remote and shared data sources. The GEON IDV supports data access from data sources using HTTP and FTP servers, OPeNDAP servers, THREDDS catalogs, RSS feeds, and WMS (web map) servers. The IRIS DMC (Data Management System) has developed web services providing data for earthquake hypocentral data and seismic tomography model grids. These services can be called by the GEON IDV to access data at IRIS without copying files. The IRIS Earthquake Browser (IEB) is a web-based query tool for hypocentral data. The IEB combines the DMC's large database of more than 1,900,000 earthquakes with the Google Maps web interface. With the IEB you can quickly find earthquakes in any region of the globe and then import this information into the GEON Integrated Data Viewer where the hypocenters may be visualized. You can select earthquakes by location region, time, depth, and magnitude. The IEB gives the IDV a URL to the selected data. The IDV then shows the data as maps or 3D displays, with interactive control of vertical scale, area, map projection, with symbol size and color control by magnitude or depth. The IDV can show progressive time animation of, for example, aftershocks filling a source region. The IRIS Tomoserver converts seismic tomography model output grids to NetCDF for use in the IDV. The Tomoserver accepts a tomographic model file as input from a user and provides an equivalent NetCDF file as output. The service supports NA04, S3D, A1D and CUB input file formats, contributed by their respective creators. The NetCDF file is saved to a location that can be referenced with a URL on an IRIS server. The URL for the NetCDF file is provided to the user. The user can download the data from IRIS, or copy the URL into IDV directly for interpretation, and the IDV will access the data at IRIS. The Tomoserver conversion software was developed by Instrumental Software Technologies, Inc. Use cases with the GEON IDV and IRIS DMC data services will be shown.

  16. An automated and integrated framework for dust storm detection based on ogc web processing services

    NASA Astrophysics Data System (ADS)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advances in scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models and EO data to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) standard initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detection and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detection and tracking component combines three earth scientific models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 April 2012, is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and scientific workflow approach together. The experimental result shows that this newly automated and integrated framework can be used to give advance, near real-time warning of dust storms to both environmental authorities and the public. The methods presented in this paper might also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.
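
    For concreteness, a WPS Execute call in the style this framework builds on is just a key-value HTTP request; the endpoint, process identifier, and inputs below are hypothetical, since the paper's deployment details are not given in the abstract.

```python
import requests

WPS_URL = "https://wps.example.org/wps"  # hypothetical endpoint

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "DustStormDetection",             # hypothetical process
    "datainputs": "date=2012-04-26;region=EastAsia",
}
resp = requests.get(WPS_URL, params=params, timeout=60)
resp.raise_for_status()
# The ExecuteResponse XML carries status and links to outputs (e.g. KML).
print(resp.text[:500])
```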

  17. A Web-based Google-Earth Coincident Imaging Tool for Satellite Calibration and Validation

    NASA Astrophysics Data System (ADS)

    Killough, B. D.; Chander, G.; Gowda, S.

    2009-12-01

    The Group on Earth Observations (GEO) is coordinating international efforts to build a Global Earth Observation System of Systems (GEOSS) to meet the needs of its nine "Societal Benefit Areas", of which the most demanding, in terms of accuracy, is climate. To accomplish this vision, on-orbit and ground-based calibration and validation (Cal/Val) of Earth observation measurements are critical to our scientific understanding of the Earth system. Existing tools supporting space mission Cal/Val are often developed for specific campaigns or events with little intention of broad application. This paper describes a web-based, Google Earth-based tool for the calculation of coincident satellite observations, intended to support a diverse international group of satellite missions and improve data continuity, interoperability and data fusion. The Committee on Earth Observing Satellites (CEOS), which includes 28 space agencies and 20 other national and international organizations, is currently operating, or planning over the next 15 years, more than 240 Earth observation satellites. The technology described here will better enable the use of multiple sensors and promote increased coordination toward a GEOSS. The CEOS Systems Engineering Office (SEO) and the Working Group on Calibration and Validation (WGCV) support the development of the CEOS Visualization Environment (COVE) tool to enhance international coordination of data exchange, mission planning and Cal/Val events. The objective is to develop a simple and intuitive application that leverages the capabilities of Google Earth on the web to display satellite sensor coverage areas and identify coincident scene locations, with dynamic menus for flexibility and content display. Key features and capabilities include user-defined evaluation periods (start and end dates), regions of interest (rectangular areas), and multi-user collaboration. Users can select two or more CEOS missions from a database that includes Satellite Tool Kit (STK) generated orbit information and perform rapid calculations to identify coincident scenes where the ground tracks of the CEOS mission instrument fields-of-view intersect. Calculated results are displayed on a customized Google Earth web interface showing location and time information, with optional output in Excel table format. In addition, multiple viewports can be used for comparisons. COVE was first introduced to the CEOS WGCV community in May 2009. Since that time, the development of a prototype version has progressed. It is anticipated that the capabilities and applications of COVE can support a variety of international Cal/Val activities as well as provide general information on Earth observation coverage for education and societal benefit. This project demonstrates the utility of a systems engineering tool with broad international appeal for enhanced communication and data evaluation opportunities among international CEOS agencies. The COVE tool is publicly accessible via NASA servers.
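
    At its simplest, coincidence finding of the kind COVE performs reduces to intersecting per-instrument observation windows over a region of interest. A toy sketch follows (the pass windows are invented; real ground-track geometry is far more involved):

```python
def coincident_windows(a, b):
    """Intersect two sorted lists of (start, end) pass windows in seconds."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        start, end = max(a[i][0], b[j][0]), min(a[i][1], b[j][1])
        if start <= end:                 # the two passes overlap in time
            out.append((start, end))
        if a[i][1] < b[j][1]:            # advance whichever ends first
            i += 1
        else:
            j += 1
    return out

passes_a = [(0, 600), (5400, 6000)]      # sensor A over the region
passes_b = [(300, 900), (5700, 6300)]    # sensor B over the region
print(coincident_windows(passes_a, passes_b))  # [(300, 600), (5700, 6000)]
```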

  18. Keywords to Recruit Spanish- and English-Speaking Participants: Evidence From an Online Postpartum Depression Randomized Controlled Trial

    PubMed Central

    Kelman, Alex R; Muñoz, Ricardo F

    2014-01-01

    Background One of the advantages of Internet-based research is the ability to efficiently recruit large, diverse samples of international participants. Currently, there is a dearth of information on the behind-the-scenes process to setting up successful online recruitment tools. Objective The objective of the study was to examine the comparative impact of Spanish- and English-language keywords for a Google AdWords campaign to recruit pregnant women to an Internet intervention and to describe the characteristics of those who enrolled in the trial. Methods Spanish- and English-language Google AdWords campaigns were created to advertise and recruit pregnant women to a Web-based randomized controlled trial for the prevention of postpartum depression, the Mothers and Babies/Mamás y Bebés Internet Project. Search engine users who clicked on the ads in response to keyword queries (eg, pregnancy, depression and pregnancy) were directed to the fully automated study website. Data on the performance of keywords associated with each Google ad reflect Web user queries from February 2009 to June 2012. Demographic information, self-reported depression symptom scores, major depressive episode status, and Internet use data were collected from enrolled participants before randomization in the intervention study. Results The Google ads received high exposure (12,983,196 impressions) and interest (176,295 clicks) from a global sample of Web users; 6745 pregnant women consented to participate and 2575 completed enrollment in the intervention study. Keywords that were descriptive of pregnancy and distress or pregnancy and health resulted in higher consent and enrollment rates (ie, high-performing ads). In both languages, broad keywords (eg, pregnancy) had the highest exposure, more consented participants, and greatest cost per consent (up to US $25.77 per consent). The online ads recruited a predominantly Spanish-speaking sample from Latin America of Mestizo racial identity. The English-speaking sample was also diverse with most participants residing in regions of Asia and Africa. Spanish-speaking participants were significantly more likely to be of Latino ethnic background, not married, completed fewer years of formal education, and were more likely to have accessed the Internet for depression information (P<.001). Conclusions The Internet is an effective method for reaching an international sample of pregnant women interested in online interventions to manage changes in their mood during the perinatal period. To increase efficiency, Internet advertisements need to be monitored and tailored to reflect the target population’s conceptualization of health issues being studied. Trial Registration ClinicalTrials.gov NCT00816725; http://clinicaltrials.gov/show/NCT00816725 (Archived by WebCite at http://www.webcitation.org/6LumonjZP). PMID:24407163
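
    The campaign metrics discussed above (cost per click, consent rate, cost per consent) are simple ratios; a sketch with invented numbers, not the study's per-keyword data:

```python
# keyword: (clicks, cost_usd, consents); illustrative values only
keywords = {
    "pregnancy":                (90_000, 27_000.0, 1_200),
    "depression and pregnancy": (12_000,  5_400.0,   700),
}
for kw, (clicks, cost, consents) in keywords.items():
    print(f"{kw}: CPC=${cost / clicks:.2f}, "
          f"consent rate={consents / clicks:.1%}, "
          f"cost per consent=${cost / consents:.2f}")
```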

  19. KSC-2013-3236

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – As seen on Google Maps, the Rotating Service Structure at Launch Complex 39A at NASA's Kennedy Space Center housed space shuttle payloads temporarily so they could be loaded inside the 60-foot-long cargo bay of a shuttle before launch. The RSS, as the structure was known, was hinged to the Fixed Service Structure on one side and rolled on a rail on the other. As its name suggests, the enclosed facility would rotate into place around the shuttle as it stood at the launch pad. Once in place, the RSS protected the shuttle and its cargo. Google precisely mapped the space center and some of its historical facilities for the company's map page. The work allows Internet users to see inside buildings at Kennedy as they were used during the space shuttle era. Photo credit: Google/Wendy Wang

  20. Flexible Web services integration: a novel personalised social approach

    NASA Astrophysics Data System (ADS)

    Metrouh, Abdelmalek; Mokhati, Farid

    2018-05-01

    Dynamic composition or integration remains one of the key objectives of Web services technology. This paper proposes an innovative approach to dynamic Web service composition based on functional and non-functional attributes and individual preferences. In this approach, social networks of Web services are used to maintain interactions between Web services in order to select and compose the Web services most closely matched to the user's preferences. We use the concept of a Web service community within a social network of Web services to considerably reduce the search space. These communities are created through the direct involvement of Web service providers.
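    As a rough illustration of the community idea, the sketch below restricts candidate selection to a community of functionally similar services and then scores members by user-weighted non-functional attributes. The communities, attribute names, and weights are all invented for illustration; the paper's actual model is richer.

```python
# Hedged sketch: community-scoped, preference-weighted service selection.
# Community membership and attribute values below are hypothetical.
communities = {
    "payment": ["payA", "payB"],
    "shipping": ["shipA", "shipB", "shipC"],
}

# Non-functional attributes per service (hypothetical values).
attrs = {
    "payA": {"reliability": 0.99, "latency_ms": 120},
    "payB": {"reliability": 0.95, "latency_ms": 40},
    "shipA": {"reliability": 0.90, "latency_ms": 200},
    "shipB": {"reliability": 0.97, "latency_ms": 300},
    "shipC": {"reliability": 0.93, "latency_ms": 90},
}

def pick(community: str, weights: dict) -> str:
    """Score only the services inside the requested community."""
    def score(name: str) -> float:
        a = attrs[name]
        # Map latency into a 0..1 "speed" value for the toy example.
        speed = 1.0 / (1.0 + a["latency_ms"] / 100.0)
        return weights["reliability"] * a["reliability"] + weights["speed"] * speed
    return max(communities[community], key=score)

# A user who values speed over reliability.
prefs = {"reliability": 0.3, "speed": 0.7}
composition = [pick("payment", prefs), pick("shipping", prefs)]
print(composition)  # ['payB', 'shipC']
```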

  1. An Automatic Web Service Composition Framework Using QoS-Based Web Service Ranking Algorithm

    PubMed Central

    Mallayya, Deivamani; Ramachandran, Baskaran; Viswanathan, Suganya

    2015-01-01

    Web services have become the technology of choice for service-oriented computing to meet the interoperability demands of web applications. In the Internet era, the exponential growth in the number of web services makes "quality of service" an essential parameter for discriminating among them. In this paper, a user preference based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and the QoS aspects of each web service. When the user's request cannot be fulfilled by a single atomic service, several existing services must be composed and delivered as a composition. The proposed framework allows the user to specify local and global constraints for composite web services, which improves flexibility. The UPWSR algorithm identifies the best-fit services for each task in the user request and, by limiting the number of candidate services per task, reduces the time needed to generate composition plans. To tackle the web service composition problem, a QoS-aware automatic web service composition (QAWSC) algorithm is also proposed, based on the QoS aspects of the web services and user preferences. The framework additionally allows the user to provide feedback about the composite service, which improves the reputation of the services. PMID:26504894
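    A minimal sketch of the general pattern described here, not the UPWSR/QAWSC algorithms themselves: rank candidates per task by weighted QoS, keep the top k to shrink the plan space, then enforce a global constraint across whole compositions. All service names, QoS values, and budgets below are hypothetical.

```python
# Hedged sketch: QoS ranking with a local and a global constraint.
from itertools import product

# Candidate services per task: (name, response_time_ms, availability 0..1).
candidates = {
    "task1": [("s11", 200, 0.99), ("s12", 90, 0.95)],
    "task2": [("s21", 150, 0.98), ("s22", 400, 0.999)],
}

LOCAL_MAX_TIME = 350      # local constraint: per-service response time
GLOBAL_TIME_BUDGET = 300  # global constraint: whole composition

def qos_score(time_ms, availability, w_time=0.5, w_avail=0.5):
    # Lower time is better; normalize it against the local bound.
    return w_time * (1 - time_ms / LOCAL_MAX_TIME) + w_avail * availability

# 1) Local filtering + ranking: keep the top 2 candidates per task.
ranked = {
    task: sorted(
        (c for c in cands if c[1] <= LOCAL_MAX_TIME),
        key=lambda c: qos_score(c[1], c[2]),
        reverse=True,
    )[:2]
    for task, cands in candidates.items()
}

# 2) Global check: enumerate remaining plans, keep those within budget,
#    and pick the plan with the best aggregate QoS score.
plans = [
    plan for plan in product(*ranked.values())
    if sum(svc[1] for svc in plan) <= GLOBAL_TIME_BUDGET
]
best = max(plans, key=lambda p: sum(qos_score(s[1], s[2]) for s in p))
print([s[0] for s in best])  # ['s12', 's21']
```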

  2. Online palliative care and oncology patient education resources through Google: Do they meet national health literacy recommendations?

    PubMed

    Prabhu, Arpan V; Crihalmeanu, Tudor; Hansberry, David R; Agarwal, Nitin; Glaser, Christine; Clump, David A; Heron, Dwight E; Beriwal, Sushil

    The Google search engine is a resource commonly used by patients to access health-related patient education information. The American Medical Association and National Institutes of Health recommend that patient education resources be written at a level between the third and seventh grade reading levels. We assessed the readability levels of online palliative care patient education resources using 10 readability algorithms widely accepted in the medical literature. In October 2016, searches were conducted for 10 individual terms pertaining to palliative care and oncology using the Google search engine; the first 10 articles written for the public for each term were downloaded, for a total of 100 articles. The terms included palliative care, hospice, advance directive, cancer pain management, treatment of metastatic disease, treatment of brain metastasis, treatment of bone metastasis, palliative radiation therapy, palliative chemotherapy, and end-of-life care. We determined the average reading level of the articles by readability scale and Web site domain. Nine readability assessments with scores equivalent to academic grade level found that the 100 palliative care education articles were collectively written at a 12.1 reading level (standard deviation, 2.1; range, 7.6-17.3). Zero articles were written below a seventh grade level. Forty-nine (49%) articles were written above a high school graduate reading level. The Flesch Reading Ease scale classified the articles as "difficult" to read with a score of 45.6 of 100. The articles were collected from 62 Web site domains. Seven domains were accessed 3 or more times; among these, www.mskcc.org had the highest average reading level at a 14.5 grade level (standard deviation, 1.4; range, 13.4-16.1). Most palliative care education articles readily available on Google are written above national health literacy recommendations. There is a need to revise these resources to allow patients and their families to derive the most benefit from these materials.
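    Two of the measures used in such studies, the Flesch Reading Ease score and the Flesch-Kincaid grade level, are closed-form formulas over sentence, word, and syllable counts. The sketch below implements both with a crude syllable heuristic, so its output is only approximate.

```python
# Hedged sketch: Flesch Reading Ease and Flesch-Kincaid grade level.
# The formulas are standard; the syllable counter is a rough heuristic.
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of vowels, drop a common silent 'e'.
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_scores(text: str):
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0, 0.0
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences  # words per sentence
    spw = syllables / len(words)  # syllables per word
    reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade_level = 0.39 * wps + 11.8 * spw - 15.59
    return reading_ease, grade_level

ease, grade = flesch_scores(
    "Palliative care focuses on comfort. It supports patients and families."
)
print(f"Flesch Reading Ease: {ease:.1f}, grade level: {grade:.1f}")
```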

  3. Assessing species habitat using Google Street View: a case study of cliff-nesting vultures.

    PubMed

    Olea, Pedro P; Mateo-Tomás, Patricia

    2013-01-01

    The assessment of a species' habitat is a crucial issue in ecology and conservation. While the collection of habitat data has been boosted by the availability of remote sensing technologies, data on certain habitat types must still be collected through costly, on-ground surveys, limiting study over large areas. Cliffs are ecosystems that provide habitat for a rich biodiversity, especially raptors. Because of their principally vertical structure, however, cliffs are not easy to study by remote sensing technologies, posing a challenge for many researchers and managers working with cliff-related biodiversity. We explore the feasibility of Google Street View, a freely available on-line tool, to remotely identify and assess the nesting habitat of two cliff-nesting vultures (the griffon vulture and the globally endangered Egyptian vulture) in northwestern Spain. Two main uses of Google Street View for ecologists and conservation biologists were evaluated: i) remotely identifying a species' potential habitat and ii) extracting fine-scale habitat information. Google Street View imagery covered 49% (1,907 km) of the roads of our study area (7,000 km²). The potential visibility covered by on-ground surveys was significantly greater (mean: 97.4%) than that of Google Street View (48.1%). However, incorporating Google Street View into the vulture habitat survey would save, on average, 36% in time and 49.5% in funds with respect to the on-ground survey alone. The ability of Google Street View to identify cliffs (overall accuracy = 100%) outperformed the classification maps derived from digital elevation models (DEMs) (62-95%). Nonetheless, high-performance DEM maps may be useful to compensate for Google Street View's coverage limitations. Through Google Street View we could examine 66% of the vultures' nesting-cliffs existing in the study area (n = 148): 64% from griffon vultures and 65% from Egyptian vultures. It also allowed us to extract fine-scale features of cliffs. This World Wide Web-based methodology may be a useful, complementary tool to remotely map and assess the potential habitat of cliff-dependent biodiversity over large geographic areas, saving survey-related costs.

  5. Proposal for a Web Encoding Service (WES) for Spatial Data Transaction

    NASA Astrophysics Data System (ADS)

    Siew, C. B.; Peters, S.; Rahman, A. A.

    2015-10-01

    Web service utilization in Spatial Data Infrastructure (SDI) has been well established and standardized by the Open Geospatial Consortium (OGC). Similar web services for 3D SDI have also been established in recent years, with extended capabilities to handle 3D spatial data. The increasing popularity of City Geography Markup Language (CityGML) for 3D city modelling applications leads to the need to handle large spatial datasets for data delivery. This paper revisits the available OGC Web Services (OWS) and proposes the background concepts and requirements for encoding spatial data via a Web Encoding Service (WES). Furthermore, the paper discusses the data flow of the encoder within the web service, e.g. possible integration with the Web Processing Service (WPS) or Web 3D Service (W3DS). This integration could be extended to other available web services for efficient handling of spatial data, especially 3D spatial data.
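    Since WES is only proposed here, any concrete interface is speculative; the sketch below imagines one possible shape for an encoding endpoint, a Flask handler that gzip-compresses a CityGML payload before delivery. The route, parameter name, and the choice of gzip are assumptions for illustration, not part of the proposal.

```python
# Hedged sketch: a hypothetical WES-style encoding endpoint.
# Endpoint name, parameters, and gzip encoding are all assumptions.
import gzip
from flask import Flask, request, Response

app = Flask(__name__)

@app.route("/wes/encode", methods=["POST"])
def encode():
    # Hypothetical parameter: which encoding to apply to the payload.
    encoding = request.args.get("encoding", "gzip")
    citygml = request.get_data()  # raw CityGML bytes from the client

    if encoding != "gzip":
        return Response("unsupported encoding", status=400)

    # Encode (compress) the spatial data before delivery; a real WES
    # might instead hand the result to a WPS or W3DS for processing.
    payload = gzip.compress(citygml)
    return Response(payload, mimetype="application/gzip")

if __name__ == "__main__":
    app.run(port=8080)
```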

  6. Assessing quality of health services with the SERVQUAL model in Iran. A systematic review and meta-analysis.

    PubMed

    Teshnizi, Saeed Hosseini; Aghamolaei, Teamur; Kahnouji, Kobra; Teshnizi, Seyyed Mehrdad Hosseini; Ghani, Jalil

    2018-03-01

    The five-dimension service quality (SERVQUAL) scale is one of the most common tools for evaluating gaps between clients' perceptions and expectations. This study aimed to assess the quality of health services in Iran through a meta-analysis of all Iranian studies that used the SERVQUAL tool. A systematic literature review was performed in Web of Science, PubMed, Scopus, Google Scholar, Iran Medex, Magiran, and the Scientific Information Database. All relevant English- or Persian-language studies published between January 2009 and April 2016 were selected. Papers were considered if they addressed all five dimensions of the SERVQUAL tool for assessing the quality of health care services. Two reviewers independently extracted the mean and standard deviation of the five dimensions and the characteristics of each study. The quality of the included studies was assessed using the STROBE checklist. Of 315 studies initially identified, 12 were included in the meta-analysis. All analyses were performed in Stata MP v. 14. Patients' perceptions were lower than their expectations (gap = -1.64). Responsibility (-1.22) and reliability (-1.15) had the lowest gaps, and tangibility and empathy (-1.03) had the largest gaps. Except for gender, no variables had significant associations with the gaps. Patients in the cities of Arak (-3.47) and Shiraz (-3.02) had the largest gaps. All dimensions of service quality were negative, which implies that the quality of health services in Iran has not been satisfying to patients and needs to be improved.

  7. Recent Advances in Geospatial Visualization with the New Google Earth

    NASA Astrophysics Data System (ADS)

    Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.

    2017-12-01

    Google Earth's detailed, world-wide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite its ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets using KML in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension to simplify the display of multi-resolution map tile pyramids - which can be created by analysis platforms like Google Earth Engine, or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well-suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.
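    The tile-pyramid display that the new extension simplifies has traditionally been expressed in KML as a "super-overlay": each tile is a GroundOverlay gated by a Region so that finer tiles load only as the viewer zooms in. The Python sketch below emits one such tile document; the tile URL and bounds are placeholders.

```python
# Hedged sketch: emit a single super-overlay tile as KML.
# The URL template and bounding box below are placeholder values.
def tile_kml(url: str, north: float, south: float, east: float, west: float,
             min_lod: int = 128, max_lod: int = -1) -> str:
    box = (f"<north>{north}</north><south>{south}</south>"
           f"<east>{east}</east><west>{west}</west>")
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Region>
      <LatLonAltBox>{box}</LatLonAltBox>
      <!-- Tile becomes visible once it spans min_lod screen pixels;
           max_lod of -1 means "visible at any larger size". -->
      <Lod><minLodPixels>{min_lod}</minLodPixels>
           <maxLodPixels>{max_lod}</maxLodPixels></Lod>
    </Region>
    <GroundOverlay>
      <Icon><href>{url}</href></Icon>
      <LatLonBox>{box}</LatLonBox>
    </GroundOverlay>
  </Document>
</kml>"""

print(tile_kml("https://example.com/tiles/0/0/0.png",
               north=45.0, south=0.0, east=90.0, west=45.0))
```

    In a full pyramid, each tile document would also carry NetworkLink elements pointing at its four child tiles, each gated by its own Region, which is exactly the bookkeeping the proposed extension aims to eliminate.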

  8. Using Blogging to Enhance the Initiation of Students into Academic Research

    ERIC Educational Resources Information Center

    Chong, Eddy K. M.

    2010-01-01

    For the net-generation students learning in a Web 2.0 world, research is often equated with Googling and approached with a mindset accustomed to cut-and-paste practices. Recognizing educators' concern over such students' learning dispositions on the one hand, and the educational affordances of blogging on the other, this study examines the use of…

  9. A Mathematical and Sociological Analysis of Google Search Algorithm

    DTIC Science & Technology

    2013-01-16

    through the collective intelligence of the web to determine a page’s importance. Let v be a vector of R^N with N ≥ 8 billion. Any unit vector in R^N is...scrolled up by some artificial hits. Acknowledgment: The authors would like to thank Dr. John Lavery for his encouragement and support which enabled them to
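    The vector v alluded to in this record is the PageRank importance vector: the stationary distribution of a damped random walk on the web graph. A toy power-iteration sketch (N = 4 pages rather than billions) follows; the link structure is invented.

```python
# Hedged sketch: power-iteration PageRank on a toy 4-page web graph.
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}  # page -> pages it links to
N, d = 4, 0.85  # d is the usual damping factor

# Column-stochastic link matrix M: M[j, i] = 1/outdegree(i) if i links to j.
M = np.zeros((N, N))
for i, outs in links.items():
    for j in outs:
        M[j, i] = 1.0 / len(outs)

v = np.full(N, 1.0 / N)  # start from the uniform distribution
for _ in range(100):
    v = (1 - d) / N + d * M @ v  # damped random-walk update

print(v / v.sum())  # importance of each page, summing to 1
```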

  10. The Effect of Creative Drama as a Method on Skills: A Meta-Analysis Study

    ERIC Educational Resources Information Center

    Ulubey, Özgür

    2018-01-01

    The aim of the current study was to synthesize the findings of experimental studies addressing the effect of the creative drama method on the skills of students. Research data were derived from ProQuest Citations, Web of Science, Google Academic, National Thesis Center, EBSCO, ERIC, Taylor & Francis Online, and ScienceDirect databases using…

  11. Machine Translation-Assisted Language Learning: Writing for Beginners

    ERIC Educational Resources Information Center

    Garcia, Ignacio; Pena, Maria Isabel

    2011-01-01

    The few studies that deal with machine translation (MT) as a language learning tool focus on its use by advanced learners, never by beginners. Yet, freely available MT engines (i.e. Google Translate) and MT-related web initiatives (i.e. Gabble-on.com) position themselves to cater precisely to the needs of learners with a limited command of a…

  12. Opening Up Access to Open Access

    ERIC Educational Resources Information Center

    Singer, Ross

    2008-01-01

    As the corpus of gray literature grows and the price of serials rises, it becomes increasingly important to explore ways to integrate the free and open Web seamlessly into one's collections. Users, after all, are discovering these materials all the time via sites such as Google Scholar and Scirus or by searching arXiv.org or CiteSeer directly.…

  13. Overview of the TREC 2014 Federated Web Search Track

    DTIC Science & Technology

    2014-11-01

    [Table excerpt: federated search resources with IDs and categories (e.g., e021 Dailymotion – Video, e022 YouTube – Video, e023 Google Blogs – Blogs, e123 Picsearch – Photo/Pictures, e124 Wikimedia – Photo/Pictures) and sample query topics (e.g., 7045 "song of ice and fire", 7072 "Natural Parks America", 7092 "price gibson howard roberts custom", 7111 "How much was a gallon of gas during depression").]

  14. So, You Want to Be a Leader

    ERIC Educational Resources Information Center

    Wager, J. James

    2012-01-01

    Thousands--if not tens of thousands--of books, monographs, and articles have been written on the subject of leadership. A Google search of the word returns nearly a half-billion Web sites. As a professional who has spent nearly 40 years in the higher education sector, the author has been blessed with opportunities to view and practice leadership…

  15. Leveraging Learning Technologies for Collaborative Writing in an Online Pharmacotherapy Course

    ERIC Educational Resources Information Center

    Pittenger, Amy L.; Olson-Kellogg, Becky

    2012-01-01

    The purpose of this project was to evaluate the development and delivery of a hypertext case scenario document to be used as the capstone assessment tool for doctoral-level physical therapy students. The integration of Web-based collaborative tools (PBworks[TM] and Google Sites[TM]) allowed students in this all-online course to apply their…

  16. Spaces for Interactive Engagement or Technology for Differential Academic Participation? Google Groups for Collaborative Learning at a South African University

    ERIC Educational Resources Information Center

    Rambe, Patient

    2017-01-01

    The rhetoric on the potential of Web 2.0 technologies to democratize online engagement of students often overlooks the discomforting, differential participation and asymmetrical engagement that accompanies student adoption of emerging technologies. This paper, therefore, constitutes a critical reality check for student adoption of technology to…

  17. A Content Analysis of Online HPV Immunization Information

    ERIC Educational Resources Information Center

    Pappa, Sara T.

    2016-01-01

    The Human Papillomavirus (HPV) can cause some types of cancer and is the most common sexually transmitted infection in the US. Because most people turn to the internet for health information, this study analyzed HPV information found online. A content analysis was conducted on 69 web search results (URLs) from Google, Yahoo, Bing and Ask. The…

  18. Exploring Writing Individually and Collaboratively Using Google Docs in EFL Contexts

    ERIC Educational Resources Information Center

    Alsubaie, Jawaher; Ashuraidah, Ali

    2017-01-01

    Online teaching and learning have become popular with the evolution of the World Wide Web. Implementing online learning tools within EFL contexts will help better address the multitude of teaching and learning styles. Difficulty in academic writing can be considered one of the common problems that students face in and outside their classrooms.…

  19. Web-Based Interactive Steel Sculpture for the Google Generation

    ERIC Educational Resources Information Center

    Chou, Karen C.; Moaveni, Saeed

    2009-01-01

    In almost all the civil engineering programs in the United States, a student is required to take at least one design course in either steel or reinforced concrete. One of the topics covered in an introductory steel design course is the design of connections. Steel connections play important roles in the integrity of a structure, and many…

  20. Evidence-Based Intervention for Individuals with Acquired Apraxia of Speech. EBP Briefs. Volume 11, Issue 2

    ERIC Educational Resources Information Center

    Van Sickle, Angela

    2016-01-01

    Clinical Question: Would individuals with acquired apraxia of speech (AOS) demonstrate greater improvements for speech production with an articulatory kinematic approach or a rate/rhythm approach? Method: EBP Intervention Comparison Review. Study Sources: ASHA journal, Google Scholar, PubMed, CINAHL Plus with Full Text, Web of Science, Ovid, and…
