Fu, Linda Y; Zook, Kathleen; Spoehr-Labutta, Zachary; Hu, Pamela; Joseph, Jill G
2016-01-01
Online information can influence attitudes toward vaccination. The aim of the present study was to provide a systematic evaluation of the search engine ranking, quality, and content of Web pages that are critical versus noncritical of human papillomavirus (HPV) vaccination. We identified HPV vaccine-related Web pages with the Google search engine by entering 20 terms. We then assessed each Web page for critical versus noncritical bias and for the following quality indicators: authorship disclosure, source disclosure, attribution of at least one reference, currency, exclusion of testimonial accounts, and readability level less than ninth grade. We also determined Web page comprehensiveness in terms of mention of 14 HPV vaccine-relevant topics. Twenty searches yielded 116 unique Web pages. HPV vaccine-critical Web pages comprised roughly a third of the top-, top 5-, and top 10-ranking Web pages. The prevalence of HPV vaccine-critical Web pages was higher for queries that included term modifiers in addition to root terms. Web pages critical of HPV vaccine had a lower overall quality score than those with a noncritical bias (p < .01) and covered fewer important HPV-related topics (p < .001). Critical Web pages required viewers to have higher reading skills, were less likely to include an author byline, and were more likely to include testimonial accounts. They also were more likely to raise unsubstantiated concerns about vaccination. Web pages critical of HPV vaccine may be frequently returned and highly ranked by search engine queries despite being of lower quality and less comprehensive than noncritical Web pages. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Online nutrition information for pregnant women: a content analysis.
Storr, Tayla; Maher, Judith; Swanepoel, Elizabeth
2017-04-01
Pregnant women actively seek health information online, including nutrition and food-related topics. However, the accuracy and readability of this information have not been evaluated. The aim of this study was to describe and evaluate pregnancy-related food and nutrition information available online. Four search engines were used to search for pregnancy-related nutrition web pages. Content analysis of web pages was performed. Web pages were assessed against the 2013 Australian Dietary Guidelines to assess accuracy. Flesch-Kincaid (F-K), Simple Measure of Gobbledygook (SMOG), Gunning Fog Index (FOG) and Flesch reading ease (FRE) formulas were used to assess readability. Data were analysed descriptively. Spearman's correlation was used to assess the relationship between web page characteristics. The Kruskal-Wallis test was used to check for differences among readability and other web page characteristics. A total of 693 web pages were included. Web page types included commercial (n = 340), not-for-profit (n = 113), blogs (n = 112), government (n = 89), personal (n = 36) and educational (n = 3). The accuracy of online nutrition information varied, with 39.7% of web pages containing accurate information, 22.8% containing mixed information and 37.5% containing inaccurate information. The average reading grade of all pages analysed measured by F-K, SMOG and FOG was 11.8. The mean FRE was 51.6, a 'fairly difficult to read' score. Only 0.5% of web pages were written at or below grade 6 according to F-K, SMOG and FOG. The findings suggest that the accuracy of pregnancy-related nutrition information is a problem on the internet. Web page readability is generally difficult, meaning that the information may not be accessible to those who cannot read at a sophisticated level. © 2016 John Wiley & Sons Ltd.
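For readers unfamiliar with the readability measures this abstract relies on, the two Flesch formulas can be computed directly from word, sentence, and syllable counts. Below is a minimal Python sketch using the standard published coefficients; the counts in the example are invented, not taken from the study.

```python
def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    """Flesch-Kincaid grade level from raw counts (standard coefficients)."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    """Flesch Reading Ease: higher is easier; around 50 reads 'fairly difficult'."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

# Invented example: a 300-word page with 15 sentences and 480 syllables.
print(round(flesch_kincaid_grade(300, 15, 480), 1))  # 11.1
print(round(flesch_reading_ease(300, 15, 480), 1))   # 51.2
```

SMOG and the Gunning FOG index follow a similar pattern, combining sentence length with counts of polysyllabic words.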
ERIC Educational Resources Information Center
Gilstrap, Donald L.
1998-01-01
Explains how to build World Wide Web home pages using frames-based HTML so that librarians can manage Web-based information and improve their home pages. Provides descriptions and 15 examples for writing frames-HTML code, including advanced concepts and additional techniques for home-page design. (Author/LRW)
Classroom Web Pages: A "How-To" Guide for Educators.
ERIC Educational Resources Information Center
Fehling, Eric E.
This manual provides teachers with very little or no technology experience a step-by-step guide to developing the necessary skills for creating a class Web page. The first part of the manual is devoted to the thought processes preceding the actual creation of the Web page. These include looking at other Web pages, deciding what should be…
Narcissism and social networking Web sites.
Buffardi, Laura E; Campbell, W Keith
2008-10-01
The present research examined how narcissism is manifested on a social networking Web site (i.e., Facebook.com). Narcissistic personality self-reports were collected from social networking Web page owners. Then their Web pages were coded for both objective and subjective content features. Finally, strangers viewed the Web pages and rated their impression of the owner on agentic traits, communal traits, and narcissism. Narcissism predicted (a) higher levels of social activity in the online community and (b) more self-promoting content in several aspects of the social networking Web pages. Strangers who viewed the Web pages judged more narcissistic Web page owners to be more narcissistic. Finally, mediational analyses revealed several Web page content features that were influential in raters' narcissistic impressions of the owners, including quantity of social interaction, main photo self-promotion, and main photo attractiveness. Implications of the expression of narcissism in social networking communities are discussed.
Analysis of Web Spam for Non-English Content: Toward More Effective Language-Based Classifiers
Alsaleh, Mansour; Alarifi, Abdulrahman
2016-01-01
Web spammers aim to obtain higher ranks for their web pages by including spam contents that deceive search engines in order to include their pages in search results even when they are not related to the search terms. Search engines continue to develop new web spam detection mechanisms, but spammers also aim to improve their tools to evade detection. In this study, we first explore the effect of the page language on spam detection features and we demonstrate how the best set of detection features varies according to the page language. We also study the performance of Google Penguin, a newly developed anti-web spamming technique for their search engine. Using spam pages in Arabic as a case study, we show that unlike similar English pages, Google anti-spamming techniques are ineffective against a high proportion of Arabic spam pages. We then explore multiple detection features for spam pages to identify an appropriate set of features that yields a high detection accuracy compared with the integrated Google Penguin technique. In order to build and evaluate our classifier, as well as to help researchers to conduct consistent measurement studies, we collected and manually labeled a corpus of Arabic web pages, including both benign and spam pages. Furthermore, we developed a browser plug-in that utilizes our classifier to warn users about spam pages after clicking on a URL and by filtering out search engine results. Using Google Penguin as a benchmark, we provide an illustrative example to show that language-based web spam classifiers are more effective for capturing spam contents. PMID:27855179
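The study's exact detection features are not reproduced in this abstract. As a rough, hedged illustration of what a "language-based" classifier can look like, the sketch below trains a character n-gram model over labeled page text with scikit-learn; the corpus, labels, and pipeline choices are assumptions for illustration, not the authors' feature set.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled corpus: extracted page text plus 1 = spam, 0 = benign.
pages = ["... extracted Arabic page text ...", "... another page ..."]
labels = [1, 0]

# Character n-grams sidestep tokenization differences across languages.
clf = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
clf.fit(pages, labels)
print(clf.predict(["... text of a page to check ..."]))
```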
[An evaluation of the quality of health web pages using a validated questionnaire].
Conesa Fuentes, Maria del Carmen; Aguinaga Ontoso, Enrique; Hernández Morante, Juan José
2011-01-01
The objective of the present study was to evaluate the quality of general health information in Spanish-language web pages, and of the official web pages of the Regional Health Services of the different Autonomous Regions. It is a cross-sectional study. We used a previously validated questionnaire to study the present state of health information on the Internet from a lay user's point of view. By means of PageRank (Google®), we obtained a group of 65 health web pages. After applying exclusion criteria, 36 web pages remained. We also analyzed the official web pages of the different Health Services in Spain (19 pages), making a total of 54 health web pages. In the light of our data, we observed that the quality of the general health information web pages was rather low, especially regarding information quality. Not one page reached the maximum score (19 points). The mean score of the web pages was 9.8±2.8. In conclusion, to avoid the problems arising from this lack of quality, health professionals should design advertising campaigns and other media to teach lay users how to evaluate information quality. Copyright © 2009 Elsevier España, S.L. All rights reserved.
The impact of visual layout factors on performance in Web pages: a cross-language study.
Parush, Avi; Shwarts, Yonit; Shtub, Avy; Chandra, M Jeya
2005-01-01
Visual layout has a strong impact on performance and is a critical factor in the design of graphical user interfaces (GUIs) and Web pages. Many design guidelines employed in Web page design were inherited from human performance literature and GUI design studies and practices. However, few studies have investigated the more specific patterns of performance with Web pages that may reflect some differences between Web page and GUI design. We investigated interactions among four visual layout factors in Web page design (quantity of links, alignment, grouping indications, and density) in two experiments: one with pages in Hebrew, entailing right-to-left reading, and the other with English pages, entailing left-to-right reading. Some performance patterns (measured by search times and eye movements) were similar between languages. Performance was particularly poor in pages with many links and variable densities, but it improved with the presence of uniform density. Alignment was not shown to be a performance-enhancing factor. The findings are discussed in terms of the similarities and differences in the impact of layout factors between GUIs and Web pages. Actual or potential applications of this research include specific guidelines for Web page design.
Educational use of World Wide Web pages on CD-ROM.
Engel, Thomas P; Smith, Michael
2002-01-01
The World Wide Web is increasingly important for medical education. Internet served pages may also be used on a local hard disk or CD-ROM without a network or server. This allows authors to reuse existing content and provide access to users without a network connection. CD-ROM offers several advantages over network delivery of Web pages for several applications. However, creating Web pages for CD-ROM requires careful planning. Issues include file names, relative links, directory names, default pages, server created content, image maps, other file types and embedded programming. With care, it is possible to create server based pages that can be copied directly to CD-ROM. In addition, Web pages on CD-ROM may reference Internet served pages to provide the best features of both methods.
Thompson, Andrew E; Graydon, Sara L
2009-01-01
With continuing use of the Internet, rheumatologists are referring patients to various websites to gain information about medications and diseases. Our goal was to develop and evaluate a Medication Website Assessment Tool (MWAT) for use by health professionals, and to explore the overall quality of methotrexate information presented on common English-language websites. Identification of websites was performed using a search strategy on the search engine Google. The first 250 hits were screened. Inclusion criteria comprised English-language websites from authoritative sources, trusted medical and physicians' websites, and common health-related websites. Websites from pharmaceutical companies, online pharmacies, and sites whose purpose seemed to be primarily advertising were also included. Product monographs, technically oriented web pages, and web pages where the information was clearly directed at patients with cancer were excluded. Two reviewers independently scored each included web page for completeness and accuracy, format, readability, reliability, and credibility. An overall ranking was provided for each methotrexate information page. Twenty-eight web pages were included in the analysis. The average score for completeness and accuracy was 15.48 ± 3.70 (maximum 24), with 10 out of 28 pages scoring 18 (75%) or higher. The average format score was 6.00 ± 1.46 (maximum 8). The Flesch-Kincaid Grade Level revealed an average grade level of 10.07 ± 1.84, with 5 out of 28 websites written at a reading level below grade 8; however, no web page scored at a grade 5 to 6 level. An overall ranking was calculated, identifying 8 web pages as appropriate sources of accurate and reliable methotrexate information. With the enormous amount of information available on the Internet, it is important to direct patients to web pages that are complete, accurate, readable, and credible sources of information. We identified web pages that may serve the interests of both rheumatologists and patients.
77 FR 70454 - Proposed Flood Hazard Determinations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-26
... which included a Web page address through which the Preliminary Flood Insurance Rate Map (FIRM), and... be accessed. The information available through the Web page address has subsequently been updated... through the web page address listed in the table has been updated to reflect the Revised Preliminary...
Methodologies for Crawler Based Web Surveys.
ERIC Educational Resources Information Center
Thelwall, Mike
2002-01-01
Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62…
World Wide Web Page Design: A Structured Approach.
ERIC Educational Resources Information Center
Gregory, Gwen; Brown, M. Marlo
1997-01-01
Describes how to develop a World Wide Web site based on structured programming concepts. Highlights include flowcharting, first page design, evaluation, page titles, documenting source code, text, graphics, and browsers. Includes a template for HTML writers, tips for using graphics, a sample homepage, guidelines for authoring structured HTML, and…
The Privilege of Ranking: Google Plays Ball.
ERIC Educational Resources Information Center
Wiggins, Richard
2003-01-01
Discussion of ranking systems used in various settings, including college football and academic admissions, focuses on the Google search engine. Explains the PageRank mathematical formula that scores Web pages by connecting the number of links; limitations, including authenticity and accuracy of ranked Web pages; relevancy; adjusting algorithms;…
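For reference, the PageRank formula the article explains gives each page a share of rank from the pages linking to it, damped by a factor d: PR(p) = (1 - d)/N + d * Σ PR(q)/outdegree(q) over pages q that link to p. A small power-iteration sketch on a toy three-page graph follows; the damping factor 0.85 is the conventional choice, not something stated in this entry, and dangling pages are ignored for brevity.

```python
def pagerank(links: dict[str, list[str]], d: float = 0.85, iters: int = 50) -> dict[str, float]:
    """Power iteration for PageRank on a dict of page -> outgoing links."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {}
        for p in pages:
            # Rank flowing into p from every page q that links to it.
            inbound = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * inbound
        pr = new
    return pr

toy = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(toy))
```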
NASA Technical Reports Server (NTRS)
Steeman, Gerald; Connell, Christopher
2000-01-01
Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and committing resources, from managing time to populate the database to training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals and lessons learned in the Web-to-database process (including setting up Data Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Structured Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.
Environment: General; Grammar & Usage; Money Management; Music History; Web Page Creation & Design.
ERIC Educational Resources Information Center
Web Feet, 2001
2001-01-01
Describes Web site resources for elementary and secondary education in the topics of: environment, grammar, money management, music history, and Web page creation and design. Each entry includes an illustration of a sample page on the site and an indication of the grade levels for which it is appropriate. (AEF)
A step-by-step solution for embedding user-controlled cines into educational Web pages.
Cornfeld, Daniel
2008-03-01
The objective of this article is to introduce a simple method for embedding user-controlled cines into a Web page using a simple JavaScript. Step-by-step instructions are included and the source code is made available. This technique allows the creation of portable Web pages that allow the user to scroll through cases as if seated at a PACS workstation. A simple JavaScript allows scrollable image stacks to be included on Web pages. With this technique, you can quickly and easily incorporate entire stacks of CT or MR images into online teaching files. This technique has the potential for use in case presentations, online didactics, teaching archives, and resident testing.
Fels, Deborah I; Richards, Jan; Hardman, Jim; Lee, Daniel G
2006-01-01
The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, accessing the Web can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The present article describes a system that allows sign language-only Web pages to be created and linked through a video-based technique called sign-linking. In two studies, 14 Deaf participants examined two iterations of sign-linked Web pages to gauge the usability and learnability of a signing Web page interface. The first study indicated that signing Web pages were usable by sign language users but that some interface features required improvement. The second study showed increased usability for those features; users consequently could navigate sign language information with ease and pleasure.
Kozlov, Elissa; Carpenter, Brian D
2017-04-01
Americans rely on the Internet for health information, and people are likely to turn to online resources to learn about palliative care as well. The purpose of this study was to analyze online palliative care information pages to evaluate the breadth of their content. We also compared the frequency with which basic facts about palliative care appeared on the Web pages with expert rankings of the importance of those facts to understanding palliative care. Twenty-six pages were identified. Two researchers independently coded each page for content. Palliative care professionals (n = 20) rated the importance of content domains for comparison with content frequency in the Web pages. We identified 22 recurring broad concepts about palliative care. Each information page included, on average, 9.2 of these broad concepts (standard deviation [SD] = 3.36, range = 5-15). Similarly, each broad concept was present in an average of 45% of the Web pages (SD = 30.4%, range = 8%-96%). Significant discrepancies emerged between expert ratings of the importance of the broad concepts and the frequency of their appearance in the Web pages (rτ = .25, P > .05). This study demonstrates that palliative care information pages available online vary considerably in their content coverage. Furthermore, information that palliative care professionals rate as important for consumers to know is not always included in Web pages. We developed guidelines for information pages for the purpose of educating consumers in a consistent way about palliative care.
Googling endometriosis: a systematic review of information available on the Internet.
Hirsch, Martin; Aggarwal, Shivani; Barker, Claire; Davis, Colin J; Duffy, James M N
2017-05-01
The demand for health information online is increasing rapidly without clear governance. We aim to evaluate the credibility, quality, readability, and accuracy of online patient information concerning endometriosis. We searched 5 popular Internet search engines: aol.com, ask.com, bing.com, google.com, and yahoo.com. We developed a search strategy in consultation with patients with endometriosis, to identify relevant World Wide Web pages. Pages containing information related to endometriosis for women with endometriosis or the public were eligible. Two independent authors screened the search results. World Wide Web pages were evaluated using validated instruments across 3 of the 4 following domains: (1) credibility (White Paper instrument; range 0-10); (2) quality (DISCERN instrument; range 0-85); and (3) readability (Flesch-Kincaid instrument; range 0-100); and (4) accuracy (assessed by prioritized criteria developed in consultation with health care professionals, researchers, and women with endometriosis based on the European Society of Human Reproduction and Embryology guidelines [range 0-30]). We summarized these data in diagrams, tables, and narrative form. We identified 750 World Wide Web pages, of which 54 were included. Over a third of Web pages did not attribute authorship and almost half the included pages did not report the sources of information or academic references. No World Wide Web page provided information assessed as being written in plain English. A minority of web pages were assessed as high quality. A single World Wide Web page provided accurate information: evidentlycochrane.net. Available information was, in general, skewed toward the diagnosis of endometriosis. There were 16 credible World Wide Web pages; however, the content limitations were infrequently discussed. No World Wide Web page scored highly across all 4 domains. In the unlikely event that a World Wide Web page reports high-quality, accurate, and credible health information, it is typically challenging for a lay audience to comprehend. Health care professionals, and the wider community, should inform women with endometriosis of the risk of outdated, inaccurate, or even dangerous information online. The implementation of an information standard will incentivize providers of online information to establish and adhere to codes of conduct. Copyright © 2016 Elsevier Inc. All rights reserved.
How To Build a Web Site in Six Easy Steps.
ERIC Educational Resources Information Center
Yaworski, JoAnn
2002-01-01
Gives instructions in nontechnical terms for building a simple web site using Netscape Navigator or Communicator's web editor. Presents six steps that include: organizing information, creating a page and a background, linking files, linking to Internet web pages, linking images, and linking an email address. Gives advice for sending the web page…
Scheduled webinars can help you better manage EPA web content. Class topics include Drupal basics, creating different types of pages in the WebCMS such as document pages and forms, using Google Analytics, and best practices for metadata and accessibility.
The Potential of CGI: Using Pre-Built CGI Scripts to Make Interactive Web Pages.
ERIC Educational Resources Information Center
Nackerud, Shane A.
1998-01-01
Describes CGI (Common Gateway Interface) scripts that are available on the Web and explains how librarians can use them to make Web pages more interactive. Topics include CGI security; Perl scripts; UNIX; and HTML. (LRW)
Stockburger, D W
1999-05-01
Active server pages permit a software developer to customize the Web experience for users by inserting server-side script and database access into Web pages. This paper describes applications of these techniques and provides a primer on the use of these methods. Applications include a system that generates and grades individualized homework assignments and tests for statistics students. The student accesses the system as a Web page, prints out the assignment, does the assignment, and enters the answers on the Web page. The server, running on NT Server 4.0, grades the assignment, updates the grade book (on a database), and returns the answer key to the student.
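The system described here used classic ASP on NT Server 4.0 with a database-backed grade book. As a hedged sketch of the same submit-grade-return-key flow in a modern stack, the snippet below uses Python/Flask as a stand-in for ASP; the route, form field names, and answer-key table are invented for illustration and are not the author's implementation.

```python
from flask import Flask, request

app = Flask(__name__)

# Hypothetical per-student answer keys; the paper generates these from a database.
ANSWER_KEYS = {"student42": {"q1": "3.14", "q2": "reject H0"}}
GRADEBOOK = {}  # stands in for the database-backed grade book

@app.route("/submit/<student_id>", methods=["POST"])
def submit(student_id: str):
    key = ANSWER_KEYS[student_id]
    answers = request.form                     # form fields q1, q2, ... from the web page
    score = sum(answers.get(q, "").strip() == v for q, v in key.items())
    GRADEBOOK[student_id] = score              # the original updates a server-side database
    return {"score": score, "answer_key": key}  # answer key returned to the student

if __name__ == "__main__":
    app.run()
```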
Yes! You Can Build a Web Site.
ERIC Educational Resources Information Center
Holzberg, Carol
2001-01-01
With specially formatted templates or simple Web page editors, teachers can lay out text and graphics in a work space resembling the interface of a word processor. Several options are presented to help teachers build Web sites. Free templates include Class Homepage Builder, AppliTools: HomePage, MySchoolOnline.com, and BigChalk.com. Web design…
The Four Levels of Web Site Development Expertise.
ERIC Educational Resources Information Center
Ingram, Albert L.
2000-01-01
Discusses the design of Web pages and sites and proposes a four-level model of Web development expertise that can serve as a curriculum overview or as a plan for an individual's professional development. Highlights include page design, media use, client-side processing, server-side processing, and site structure. (LRW)
Avoiding Pornography Landmines while Traveling the Information Superhighway.
ERIC Educational Resources Information Center
Lehmann, Kay
2002-01-01
Discusses how to avoid pornographic sites when using the Internet in classrooms. Highlights include re-setting the Internet home page; putting appropriate links in a Word document; creating a Web page with appropriate links; downloading the content of a Web site; educating the students; and re-checking all Web addresses. (LRW)
World Wide Web home page for the South Platte NAWQA
Qi, Sharon L.; Dennehy, Kevin F.
1997-01-01
A World Wide Web home page for the U.S. Geological Survey's (USGS) National Water-Quality Assessment (NAWQA) Program, South Platte River Basin study is now online. The home page includes information about the basinwide investigation and provides viewing and downloading access to physical, chemical, and biological data collected by the study team.
Assessing Greek Public Hospitals' Websites.
Tsirintani, Maria; Binioris, Spyros
2015-01-01
Following a previous (2011) survey, this study assesses the web pages of Greek public hospitals according to specific criteria, which are included in the same web page evaluation model. Our purpose is to demonstrate the evolution of hospitals' web pages and document e-health applications trends. Using descriptive methods we found that public hospitals have made significant steps towards establishing and improving their web presence but there is still a lot of work that needs to be carried out in order to take advantage of the benefits of new technologies in the e-health ecosystem.
Cleanups In My Community (CIMC) - Incidents of National Significance, National Layer
This data layer provides access to Incidents of National Significance as part of the CIMC web service. Incidents of National Significance include all Presidentially-declared emergencies, major disasters, and catastrophes. Multiple federal departments and agencies, including EPA, coordinate actions to help prevent, prepare for, respond to, and recover from Incidents of National Significance. The Incidents of National Significance shown in this web service are derived from the epa.gov website and include links to the relevant web pages within the attribute table. Data about Incidents of National Significance are located on their own EPA web pages, and CIMC links to those pages. The CIMC web service was initially published in 2013, but the data are updated on the 18th of each month. The full schedule for data updates in CIMC is located here: https://iaspub.epa.gov/enviro/data_update_v2.
[Health information on the Internet and trust marks as quality indicators: vaccines case study].
Mayer, Miguel Angel; Leis, Angela; Sanz, Ferran
2009-10-01
To determine the prevalence of quality trust marks on websites, and to compare the quality of websites displaying trust marks with that of websites that do not, in order to put forward trust marks as a quality indicator. Cross-sectional study. Internet. Websites on vaccines. Using "vacunas OR vaccines" as key words, the features of 40 web pages were analysed. These web pages were selected from the page results of two search engines, Google and Yahoo! Based on a total of 9 criteria, the average score of criteria fulfilled was 7 (95% CI 3.96-10.04) points for the web pages offered by Yahoo! and 7.3 (95% CI 3.86-10.74) for those offered by Google. Among the web pages offered by Yahoo!, three contained clearly inaccurate information, as did four of the pages offered by Google. Trust marks were displayed on 20% and 30% of the medical web pages, respectively, and pages displaying them fulfilled significantly more quality criteria than pages without trust marks (P=0.033). The search engines returned a wide variety of web pages, many of them with useless information. Although the websites analysed were of good quality overall, between 15% and 20% showed inaccurate information. Websites displaying trust marks were of higher quality than those that did not display one, and none of them were among those where inaccurate information was found.
12 CFR 309.4 - Publicly available records.
Code of Federal Regulations, 2010 CFR
2010-01-01
... INFORMATION § 309.4 Publicly available records. (a) Records available on the FDIC's World Wide Web page—(1... on the FDIC's World Wide Web page, located at: http://www.fdic.gov. The FDIC has elected to publish a broad range of materials on its World Wide Web page, including consumer guides; financial and...
Is Domain Highlighting Actually Helpful in Identifying Phishing Web Pages?
Xiong, Aiping; Proctor, Robert W; Yang, Weining; Li, Ninghui
2017-06-01
To evaluate the effectiveness of domain highlighting in helping users identify whether Web pages are legitimate or spurious. As a component of the URL, a domain name can be overlooked. Consequently, browsers highlight the domain name to help users identify which Web site they are visiting. Nevertheless, few studies have assessed the effectiveness of domain highlighting, and the only formal study confounded highlighting with instructions to look at the address bar. We conducted two phishing detection experiments. Experiment 1 was run online: Participants judged the legitimacy of Web pages in two phases. In Phase 1, participants were to judge the legitimacy based on any information on the Web page, whereas in Phase 2, they were to focus on the address bar. Whether the domain was highlighted was also varied. Experiment 2 was conducted similarly but with participants in a laboratory setting, which allowed tracking of fixations. Participants differentiated the legitimate and fraudulent Web pages better than chance. There was some benefit of attending to the address bar, but domain highlighting did not provide effective protection against phishing attacks. Analysis of eye-gaze fixation measures was in agreement with the task performance, but heat-map results revealed that participants' visual attention was attracted by the highlighted domains. Failure to detect many fraudulent Web pages even when the domain was highlighted implies that users lacked knowledge of Web page security cues or how to use those cues. Potential applications include development of phishing prevention training incorporating domain highlighting with other methods to help users identify phishing Web pages.
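As a concrete illustration of the cue being tested, the sketch below pulls the hostname out of a URL and marks it, mimicking what domain highlighting emphasizes in the address bar. Real browsers highlight only the registrable domain, which requires the Public Suffix List; this simplified version brackets the whole hostname, and the example URL is fabricated.

```python
from urllib.parse import urlparse

def highlighted_domain(url: str) -> str:
    """Return the URL with its hostname wrapped in brackets, mimicking
    the emphasis a browser's domain highlighting applies in the address bar."""
    host = urlparse(url).hostname or ""
    return url.replace(host, f"[{host}]", 1) if host else url

# A lookalike host: the page lives on evil.example, not paypal.com.
print(highlighted_domain("http://paypal.com.evil.example/login"))
# http://[paypal.com.evil.example]/login
```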
Eng, J
1997-01-01
Java is a programming language that runs on a "virtual machine" built into World Wide Web (WWW)-browsing programs on multiple hardware platforms. Web pages were developed with Java to enable Web-browsing programs to overlay transparent graphics and text on displayed images so that the user could control the display of labels and annotations on the images, a key feature not available with standard Web pages. This feature was extended to include the presentation of normal radiologic anatomy. Java programming was also used to make Web browsers compatible with the Digital Imaging and Communications in Medicine (DICOM) file format. By enhancing the functionality of Web pages, Java technology should provide greater incentive for using a Web-based approach in the development of radiology teaching material.
Web server for priority ordered multimedia services
NASA Astrophysics Data System (ADS)
Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund
2001-10-01
In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions for CM services. The types of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of the distributed network with load balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for an improved disk access and a higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS output is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority ordered buffering of the retrieved Web pages and CM data streams that are fed into an auto regressive moving average (ARMA) based traffic shaping circuitry before being transmitted through the network.
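Below is a minimal sketch of the priority-ordered dispatch idea, using the priority ranking listed in the abstract. The class name, payloads, and FIFO tie-breaking are illustrative and do not model the paper's SM/IDS/ITS pipeline.

```python
import heapq
from itertools import count

# Priority order taken from the abstract (0 = highest).
PRIORITY = {
    "admin_rw": 0, "hot_multicast": 1, "cm_read": 2,
    "web_read": 3, "cm_write": 4, "web_write": 5,
}

class PriorityServerQueue:
    """Toy priority-ordered request queue for a Web/CM server."""
    def __init__(self):
        self._heap, self._seq = [], count()  # seq keeps FIFO order within a level

    def submit(self, kind: str, payload: str) -> None:
        heapq.heappush(self._heap, (PRIORITY[kind], next(self._seq), kind, payload))

    def dispatch(self) -> tuple[str, str]:
        _, _, kind, payload = heapq.heappop(self._heap)
        return kind, payload

q = PriorityServerQueue()
q.submit("web_read", "/index.html")
q.submit("cm_read", "movie-042.ts")
q.submit("admin_rw", "update hot list")
print([q.dispatch() for _ in range(3)])  # admin first, then CM read, then Web read
```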
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2011 CFR
2011-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2013 CFR
2013-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2014 CFR
2014-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2012 CFR
2012-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
Discovering Authorities and Hubs in Different Topological Web Graph Structures.
ERIC Educational Resources Information Center
Meghabghab, George
2002-01-01
Discussion of citation analysis on the Web considers Web hyperlinks as a source to analyze citations. Topics include basic graph theory applied to Web pages, including matrices, linear algebra, and Web topology; and hubs and authorities, including a search technique called HITS (Hyperlink Induced Topic Search). (Author/LRW)
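The HITS technique mentioned here alternates two mutually reinforcing scores: a page's authority is the sum of the hub scores of the pages linking to it, and its hub score is the sum of the authority scores of the pages it links to, with normalization each round. A compact sketch on a toy link graph follows; the graph and iteration count are illustrative.

```python
def hits(links: dict[str, list[str]], iters: int = 20) -> tuple[dict, dict]:
    """Kleinberg's HITS: authority and hub scores on a page -> outlinks graph."""
    pages = list(links)
    auth = {p: 1.0 for p in pages}
    hub = {p: 1.0 for p in pages}
    for _ in range(iters):
        # Authority: sum of hub scores of pages pointing to p.
        auth = {p: sum(hub[q] for q in pages if p in links[q]) for p in pages}
        # Hub: sum of authority scores of pages p points to.
        hub = {p: sum(auth[t] for t in links[p]) for p in pages}
        # Normalize so scores stay bounded.
        na = sum(v * v for v in auth.values()) ** 0.5 or 1.0
        nh = sum(v * v for v in hub.values()) ** 0.5 or 1.0
        auth = {p: v / na for p, v in auth.items()}
        hub = {p: v / nh for p, v in hub.items()}
    return auth, hub

toy = {"portal": ["a", "b"], "a": ["b"], "b": []}
print(hits(toy))
```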
The Faculty Web Page: Contrivance or Continuation?
ERIC Educational Resources Information Center
Lennex, Lesia
2007-01-01
In an age of Internet education, what does it mean for tenured or tenure-track faculty to have a web page? How many professors have web pages? If they have a page, what does it look like? Do they really need a web page at all? Many universities have faculty web pages. What do those collective pages look like? In what way do they represent the…
Policy-Aware Content Reuse on the Web
NASA Astrophysics Data System (ADS)
Seneviratne, Oshani; Kagal, Lalana; Berners-Lee, Tim
The Web allows users to share their work very effectively, leading to the rapid re-use and remixing of content on the Web, including text, images, and videos. Scientific research data, social networks, blogs, photo sharing sites and other such applications, known collectively as the Social Web, hold increasingly large and complex bodies of information. Such information from several Web pages can be very easily aggregated, mashed up and presented in other Web pages. Content generation of this nature inevitably leads to many copyright and license violations, motivating research into effective methods to detect and prevent such violations.
Age differences in search of web pages: the effects of link size, link number, and clutter.
Grahame, Michael; Laberge, Jason; Scialfa, Charles T
2004-01-01
Reaction time, eye movements, and errors were measured during visual search of Web pages to determine age-related differences in performance as a function of link size, link number, link location, and clutter. Participants (15 young adults, M = 23 years; 14 older adults, M = 57 years) searched Web pages for target links that varied from trial to trial. During one half of the trials, links were enlarged from 10-point to 12-point font. Target location was distributed among the left, center, and bottom portions of the screen. Clutter was manipulated according to the percentage of used space, including graphics and text, and the number of potentially distracting nontarget links was varied. Increased link size improved performance, whereas increased clutter and links hampered search, especially for older adults. Results also showed that links located in the left region of the page were found most easily. Actual or potential applications of this research include Web site design to increase usability, particularly for older adults.
Electronic doors to education: study of high school website accessibility in Iowa.
Klein, David; Myhill, William; Hansen, Linda; Asby, Gary; Michaelson, Susan; Blanck, Peter
2003-01-01
The Americans with Disabilities Act (ADA), and Sections 504 and 508 of the Rehabilitation Act, prohibit discrimination against people with disabilities in all aspects of daily life, including education, work, and access to places of public accommodations. Increasingly, these antidiscrimination laws are used by persons with disabilities to ensure equal access to e-commerce, and to private and public Internet websites. To help assess the impact of the anti-discrimination mandate for educational communities, this study examined 157 website home pages of Iowa public high schools (52% of high schools in Iowa) in terms of their electronic accessibility for persons with disabilities. We predicted that accessibility problems would limit students and others in obtaining information from the web pages as well as limiting ability to navigate to other web pages. Findings show that although many web pages examined included information in accessible formats, none of the home pages met World Wide Web Consortium (W3C) standards for accessibility. The most frequent accessibility problem was lack of alternative text (ALT tags) for graphics. Technical sophistication built into pages was found to reduce accessibility. Implications are discussed for schools and educational institutions, and for laws, policies, and procedures on website accessibility. Copyright 2003 John Wiley & Sons, Ltd.
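The most frequent problem reported, missing ALT text, is also the easiest to check automatically. A small sketch using BeautifulSoup to flag images without a non-empty alt attribute is shown below; it checks only that single criterion, not the full W3C accessibility audit the study performed, and the sample HTML is fabricated.

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def images_missing_alt(html: str) -> list[str]:
    """Return src attributes of <img> tags lacking a non-empty alt attribute."""
    soup = BeautifulSoup(html, "html.parser")
    return [img.get("src", "?") for img in soup.find_all("img")
            if not (img.get("alt") or "").strip()]

sample = '<img src="logo.gif"><img src="map.png" alt="Campus map">'
print(images_missing_alt(sample))  # ['logo.gif']
```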
Using Firefly Tools to Enhance Archive Web Pages
NASA Astrophysics Data System (ADS)
Roby, W.; Wu, X.; Ly, L.; Goldina, T.
2013-10-01
Astronomy web developers are looking for fast and powerful HTML 5/AJAX tools to enhance their web archives. We are exploring ways to make this easier for the developer. How could you have a full FITS visualizer or a Web 2.0 table that supports paging, sorting, and filtering in your web page in 10 minutes? Can it be done without even installing any software or maintaining a server? Firefly is a powerful, configurable system for building web-based user interfaces to access astronomy science archives. It has been in production for the past three years. Recently, we have made some of the advanced components available through very simple JavaScript calls. This allows a web developer, without any significant knowledge of Firefly, to have FITS visualizers, advanced table display, and spectrum plots on their web pages with minimal learning curve. Because we use cross-site JSONP, installing a server is not necessary. Web sites that use these tools can be created in minutes. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). We are using Firefly to serve many projects including Spitzer, Planck, WISE, PTF, LSST and others.
Contextual advertisement placement in printed media
NASA Astrophysics Data System (ADS)
Liu, Sam; Joshi, Parag
2010-02-01
Advertisements today provide the necessary revenue model supporting the WWW ecosystem. Targeted or contextual ad insertion plays an important role in optimizing the financial return of this model. Nearly all the current ads that appear on web sites are geared toward on-screen display, such as banner and "pay-per-click" ads. Little attention, however, is focused on deriving additional ad revenues when the content is repurposed for an alternative means of presentation, e.g., being printed. Although more and more content is moving to the Web, there are still many occasions where printed output of web content is desirable, such as maps and articles; thus printed ad insertion can potentially be lucrative. In this paper, we describe a contextual ad insertion network aimed at realizing new revenue for print service providers for web printing. We introduce a cloud print service that enables contextual ad insertion, with respect to the main web page content, when a printout of the page is requested. To encourage service utilization, it would provide higher quality printouts than is possible with current browser print drivers, which generally produce poor output, e.g., ill-formatted pages. At this juncture we limit the scope to article-related web pages, although the concept can be extended to arbitrary web pages. The key components of this system include (1) the extraction of the article from web pages, (2) the extraction of semantics from the article, (3) querying the ad database for a matching advertisement or coupon, and (4) joint content and ad layout for print output.
77 FR 31917 - Energy Conservation Program: Energy Conservation Standards for Residential Dishwashers
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-30
... the docket Web page can be found at: http://www.regulations.gov/#!docketDetail ;D=EERE-2011-BT-STD-0060. The regulations.gov Web page contains instructions on how to access all documents, including...: (202) 586-7796. Email: [email protected] . SUPPLEMENTARY INFORMATION: Table of Contents I...
Electronic Ramp to Success: Designing Campus Web Pages for Users with Disabilities.
ERIC Educational Resources Information Center
Coombs, Norman
2002-01-01
Discusses key issues in addressing the challenge of Web accessibility for people with disabilities, including tools for Web authoring, repairing, and accessibility validation, and relevant legal issues. Presents standards for Web accessibility, including the Section 508 Standards from the Federal Access Board, and the World Wide Web Consortium's…
Going, going, still there: using the WebCite service to permanently archive cited web pages.
Eysenbach, Gunther; Trudel, Mathieu
2005-12-30
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information including the WebCite link on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. In addition, WebCite can process publisher-submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focused crawler, retrospectively caching references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics.
Description Meta Tags in Public Home and Linked Pages.
ERIC Educational Resources Information Center
Craven, Timothy C.
2001-01-01
Random samples of 1,872 Web pages registered with Yahoo! and 1,638 pages reachable from Yahoo!-registered pages were analyzed for use of meta tags, specifically those containing descriptions. Results: 727 (38.8%) of the Yahoo!-registered pages and 442 (27%) of the other pages included descriptions in meta tags. Some descriptions greatly…
Modeling Traffic on the Web Graph
NASA Astrophysics Data System (ADS)
Meiss, Mark R.; Gonçalves, Bruno; Ramasco, José J.; Flammini, Alessandro; Menczer, Filippo
Analysis of aggregate and individual Web requests shows that PageRank is a poor predictor of traffic. We use empirical data to characterize properties of Web traffic not reproduced by Markovian models, including both aggregate statistics such as page and link traffic, and individual statistics such as entropy and session size. As no current model reconciles all of these observations, we present an agent-based model that explains them through realistic browsing behaviors: (1) revisiting bookmarked pages; (2) backtracking; and (3) seeking out novel pages of topical interest. The resulting model can reproduce the behaviors we observe in empirical data, especially heterogeneous session lengths, reconciling the narrowly focused browsing patterns of individual users with the extreme variance in aggregate traffic measurements. We can thereby identify a few salient features that are necessary and sufficient to interpret Web traffic data. Beyond the descriptive and explanatory power of our model, these results may lead to improvements in Web applications such as search and crawling.
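A toy version of the agent described can be written in a few lines: at each click the agent either jumps to a bookmark, backtracks, or follows an unvisited link of interest. The probabilities, bookmarking rule, and link graph below are invented for illustration and are not the calibrated parameters of the paper's model.

```python
import random

def browse_session(graph: dict[str, list[str]], start: str,
                   p_bookmark: float = 0.15, p_back: float = 0.35,
                   max_clicks: int = 50, seed: int = 0) -> list[str]:
    """Toy agent combining the three behaviors named in the abstract."""
    rng = random.Random(seed)
    bookmarks, history, page = {start}, [start], start
    for _ in range(max_clicks):
        r = rng.random()
        if r < p_bookmark:                                   # (1) revisit a bookmarked page
            page = rng.choice(sorted(bookmarks))
        elif r < p_bookmark + p_back and len(history) > 1:   # (2) backtrack one step
            history.pop()
            page = history[-1]
            continue
        else:                                                # (3) follow a novel outlink
            novel = [q for q in graph.get(page, []) if q not in history]
            if not novel:
                break
            page = rng.choice(novel)
            if rng.random() < 0.1:                           # occasionally bookmark the new page
                bookmarks.add(page)
        history.append(page)
    return history

toy = {"home": ["a", "b"], "a": ["b", "c"], "b": ["c"], "c": ["home"]}
print(browse_session(toy, "home"))
```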
Outreach to International Students and Scholars Using the World Wide Web.
ERIC Educational Resources Information Center
Wei, Wei
1998-01-01
Describes the creation of a World Wide Web site for the Science Library International Outreach Program at the University of California, Santa Cruz. Discusses design elements, content, and promotion of the site. Copies of the home page and the page containing the outreach program's statement of purpose are included. (AEF)
World Wide Web Pages--Tools for Teaching and Learning.
ERIC Educational Resources Information Center
Beasley, Sarah; Kent, Jean
Created to help educators incorporate World Wide Web pages into teaching and learning, this collection of Web pages presents resources, materials, and techniques for using the Web. The first page focuses on tools for teaching and learning via the Web, providing pointers to sites containing the following: (1) course materials for both distance and…
Tozzi, Alberto Eugenio; Buonuomo, Paola Sabrina; Ciofi degli Atti, Marta Luisa; Carloni, Emanuela; Meloni, Marco; Gamba, Fiorenza
2010-01-01
Information available on the Internet about immunizations may influence parents' perception about human papillomavirus (HPV) immunization and their attitude toward vaccinating their daughters. We hypothesized that the quality of information on HPV available on the Internet may vary with language and with the level of knowledge of parents. To this end we compared the quality of a sample of Web pages in Italian with a sample of Web pages in English. Five reviewers assessed the quality of Web pages retrieved with popular search engines using criteria adapted from the Good Information Practice Essential Criteria for Vaccine Safety Web Sites recommended by the World Health Organization. Quality of Web pages was assessed in the domains of accessibility, credibility, content, and design. Scores in these domains were compared through nonparametric statistical tests. We retrieved and reviewed 74 Web sites in Italian and 117 in English. The largest share of retrieved Web pages (33.5%) came from private agencies. Median scores were higher in Web pages in English compared with those in Italian in the domains of accessibility (p < .01), credibility (p < .01), and content (p < .01). The highest credibility and content scores were those of Web pages from governmental agencies or universities. Accessibility scores were positively associated with content scores (p < .01) and with credibility scores (p < .01). A total of 16.2% of Web pages in Italian opposed HPV immunization compared with 6.0% of those in English (p < .05). Quality of information and the number of Web pages opposing HPV immunization may vary with the Web site language. High-quality Web pages on HPV, especially from public health agencies and universities, should be easily accessible and retrievable with common Web search engines. Copyright 2010 Society for Adolescent Medicine. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Lindsay, Lorin
Designing a web home page involves many decisions that affect how the page will look, the kind of technology required to use the page, the links the page will provide, and kinds of patrons who can use the page. The theme of information literacy needs to be built into every web page; users need to be taught the skills of sorting and applying…
ERIC Educational Resources Information Center
Block, Marylaine
2002-01-01
Discusses how to teach students to evaluate information they find on the Internet. Highlights include motivation of Web site owners; link-checking; having student create Web pages to help with their evaluation skills of other Web sites; critical thinking skills; and helpful Web sites. (LRW)
2016-06-01
an effective system monitoring and display capability. The SOM, C-SSE, and resource managers access MUOS via a web portal called the MUOS Planning...and Provisioning Application (PlanProvApp). This web portal is their window into MUOS and is designed to provide them with a shared understanding of...including page loading errors, partially loaded web pages, incomplete reports, and inaccurate reports. For example, MUOS reported that there were
Chilet-Rosell, Elisa; Martín Llaguno, Marta; Ruiz Cantero, María Teresa; Alonso-Coello, Pablo
2010-03-16
The balance of the benefits and risks of long term use of hormone replacement therapy (HRT) has been a matter of debate for decades. In Europe, HRT requires a medical prescription and its advertising is only permitted when aimed at health professionals (direct-to-consumer advertising is allowed in some non-European countries). The objective of this study is to analyse the appropriateness and quality of Internet advertising about HRT in Spain. A search was carried out on the Internet (January 2009) using the eight best-selling HRT drugs in Spain. The brand name of each drug was entered into Google's search engine. The web sites appearing on the first page of results and the corresponding companies were analysed using the European Code of Good Practice as the reference point. Five corporate web pages were found: none of them included bibliographic references or measures to ensure that the advertising was only accessible to health professionals. Regarding non-corporate web pages (n = 27): 41% did not include the company name or address, 44% made no distinction between patient and health professional information, 7% contained bibliographic references, 26% provided unspecific information for the use of HRT for osteoporosis, and 19% included menstrual cycle regulation or boosting femininity as an indication. Two online pharmacies sold HRT drugs that could be bought online from Spain; they did not include the name or contact details of the registered company, nor did they stipulate the need for a medical prescription or differentiate between patient and health professional information. Even though pharmaceutical companies have committed themselves to compliance with codes of good practice, deficiencies were observed regarding the identification, information and promotion of HRT medications on their web pages. Unaffected by legislation, non-corporate web pages are an ideal place for indirect HRT advertising, but they often contain misleading information. HRT can be bought online from Spain without a medical consultation or prescription, constituting a serious issue for public health. In our information society, it is the right and obligation of public health bodies to ensure that such information is not misleading.
Optimizing Crawler4j using MapReduce Programming Model
NASA Astrophysics Data System (ADS)
Siddesh, G. M.; Suresh, Kavya; Madhuri, K. Y.; Nijagal, Madhushree; Rakshitha, B. R.; Srinivasa, K. G.
2017-06-01
The World Wide Web is a decentralized system consisting of a repository of information in the form of web pages. These web pages act as a source of information or data in the present analytics world. Web crawlers are used for extracting useful information from web pages for different purposes. Firstly, they are used in web search engines, where web pages are indexed to form a corpus of information that users can query. Secondly, they are used for web archiving, where web pages are stored for later analysis phases. Thirdly, they can be used for web mining, where web pages are monitored for copyright purposes. The amount of information processed by a web crawler needs to be increased by using the capabilities of modern parallel processing technologies. In order to address the parallelism and throughput of crawling, this work proposes to optimize Crawler4j using the Hadoop MapReduce programming model by parallelizing the processing of large input data. Crawler4j is a web crawler that retrieves useful information about the pages that it visits. Crawler4j coupled with the data and computational parallelism of the Hadoop MapReduce programming model improves the throughput and accuracy of web crawling. The experimental results demonstrate that the proposed solution achieves significant improvements in performance and throughput. Hence the proposed approach carves out a new methodology for optimizing web crawling by achieving a significant performance gain.
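To make the map/reduce idea concrete, here is a minimal, hypothetical sketch in Python. It does not use Crawler4j or Hadoop (both Java frameworks); the seed URLs, the word-count payload, and the thread-pool parallelism are stand-ins chosen only to show how crawl work can be split into a map phase (fetch a page and emit a partial result) and a reduce phase (merge partial results).

```python
# Hypothetical sketch of map/reduce-style crawling in Python; it does NOT use
# Crawler4j or Hadoop. Seed URLs, the word-count payload, and the thread pool
# are illustrative stand-ins for distributed map and reduce tasks.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

SEEDS = ["https://example.org/", "https://example.com/"]  # placeholder URLs

def map_fetch(url):
    """Map phase: fetch one page and emit a partial word count."""
    try:
        html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    except OSError:
        return Counter()
    return Counter(html.split())

def reduce_counts(partials):
    """Reduce phase: merge the partial counts produced by all mappers."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=8) as pool:
        partial_counts = list(pool.map(map_fetch, SEEDS))
    print(reduce_counts(partial_counts).most_common(10))
```

In a real Hadoop deployment the map and reduce functions would run as distributed tasks over input splits rather than in local threads, but the division of labour is the same.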
Web technology for emergency medicine and secure transmission of electronic patient records.
Halamka, J D
1998-01-01
The American Heritage dictionary defines the word "web" as "something intricately contrived, especially something that ensnares or entangles." The wealth of medical resources on the World Wide Web is now so extensive, yet disorganized and unmonitored, that such a definition seems fitting. In emergency medicine, for example, a field in which accurate and complete information, including patients' records, is urgently needed, more than 5000 Web pages are available today, whereas fewer than 50 were available in December 1994. Most sites are static Web pages using the Internet to publish textbook material, but new technology is extending the scope of the Internet to include online medical education and secure exchange of clinical information. This article lists some of the best Web sites for use in emergency medicine and then describes a project in which the Web is used for transmission and protection of electronic medical records.
Going, Going, Still There: Using the WebCite Service to Permanently Archive Cited Web Pages
Trudel, Mathieu
2005-01-01
Scholars are increasingly citing electronic “web references” which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To “webcite” a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its “instructions for authors” accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) “prospectively” before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. WebCite can also process publisher-submitted “citing articles” (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, retrospectively caching references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics. PMID:16403724
Server-Side Includes Made Simple.
ERIC Educational Resources Information Center
Fagan, Jody Condit
2002-01-01
Describes server-side include (SSI) codes which allow Webmasters to insert content into Web pages without programming knowledge. Explains how to enable the codes on a Web server, provides a step-by-step process for implementing them, discusses tags and syntax errors, and includes examples of their use on the Web site for Southern Illinois…
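As a rough illustration of what an SSI directive does, the hypothetical Python snippet below emulates the expansion step a Web server performs: it replaces `<!--#include virtual="..." -->` directives with the contents of the referenced files. The document root and file names are placeholders; on a real server (for example Apache with mod_include) this substitution happens automatically.

```python
# Hypothetical emulation of server-side include expansion: the server replaces
# an include directive with the contents of another file before sending the
# page. On a real server (e.g., Apache mod_include) this is automatic; the
# document root and file names here are placeholders.
import re
from pathlib import Path

INCLUDE_RE = re.compile(r'<!--#include virtual="(?P<path>[^"]+)"\s*-->')

def expand_includes(html: str, docroot: Path) -> str:
    """Replace every SSI include directive with the referenced file's text."""
    def _replace(match: re.Match) -> str:
        target = docroot / match.group("path").lstrip("/")
        return target.read_text(encoding="utf-8")
    return INCLUDE_RE.sub(_replace, html)

# Example: expand_includes(Path("index.shtml").read_text(), Path("htdocs"))
```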
NASA Astrophysics Data System (ADS)
Lares, M.
The presence of institutions on the internet is nowadays very important to strengthen communication channels, both internal and with the general public. The Córdoba Observatory has several web portals, including the official web page, a blog and a presence on several social networks. These are among the fundamental pillars of outreach activities, and serve as a communication channel for events and scientific, academic, and outreach news. They are also a source of information for the staff, as well as of data related to the Observatory's internal organization and scientific production. Several statistical studies are presented, based on data taken from visits to the official web pages. I comment on some aspects of the role of web pages as a source of consultation and as a quick response to information needs. FULL TEXT IN SPANISH
10 CFR 9.35 - Duplication fees.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., Rockville, Maryland, may be found on the NRC's Web site at http://www.nrc.gov/reading-rm/pdr/copy-service...″ reduced). Pages 11″ × 17″ are $0.30 per page. Pages larger than 11″ × 17″, including engineering drawings...
10 CFR 9.35 - Duplication fees.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., Rockville, Maryland, may be found on the NRC's Web site at http://www.nrc.gov/reading-rm/pdr/copy-service...″ reduced). Pages 11″ × 17″ are $0.30 per page. Pages larger than 11″ × 17″, including engineering drawings...
Required Discussion Web Pages in Psychology Courses and Student Outcomes
ERIC Educational Resources Information Center
Pettijohn, Terry F., II; Pettijohn, Terry F.
2007-01-01
We conducted 2 studies that investigated student outcomes when using discussion Web pages in psychology classes. In Study 1, we assigned 213 students enrolled in Introduction to Psychology courses to either a mandatory or an optional Web page discussion condition. Students used the discussion Web page significantly more often and performed…
Continuing Education for Department of Defense Health Professionals
2015-11-24
American Pharmacists Association, 60 and American Nurses Association. 61 These associations and other health-focused organizations, including health...1298. Accessed May 29, 2014. 60. American Pharmacists Association. Learn [Web page]. 2014; http://www.pharmacist.com/node/26541. Accessed May 29...American Pharmacists Association. Learn [Web page]. 2014; http://www.pharmacist.com/node/26541. Accessed May 29, 2014. 61. American Nurses Association
How To Do Field Searching in Web Search Engines: A Field Trip.
ERIC Educational Resources Information Center
Hock, Ran
1998-01-01
Describes the field search capabilities of selected Web search engines (AltaVista, HotBot, Infoseek, Lycos, Yahoo!) and includes a chart outlining what fields (date, title, URL, images, audio, video, links, page depth) are searchable, where to go on the page to search them, the syntax required (if any), and how field search queries are entered.…
Classifying Web Pages by Using Knowledge Bases for Entity Retrieval
NASA Astrophysics Data System (ADS)
Kiritani, Yusuke; Ma, Qiang; Yoshikawa, Masatoshi
In this paper, we propose a novel method to classify Web pages by using knowledge bases for entity search, which is a kind of typical Web search for information related to a person, location or organization. First, we map a Web page to entities according to the similarities between the page and the entities. Various methods for computing such similarity are applied. For example, we can compute the similarity between a given page and a Wikipedia article describing a certain entity. The frequency of an entity appearing in the page is another factor used in computing the similarity. Second, we construct a directed acyclic graph, named PEC graph, based on the relations among Web pages, entities, and categories, by referring to YAGO, a knowledge base built on Wikipedia and WordNet. Finally, by analyzing the PEC graph, we classify Web pages into categories. The results of some preliminary experiments validate the methods proposed in this paper.
The ATLAS Public Web Pages: Online Management of HEP External Communication Content
NASA Astrophysics Data System (ADS)
Goldfarb, S.; Marcelloni, C.; Eli Phoboo, A.; Shaw, K.
2015-12-01
The ATLAS Education and Outreach Group is in the process of migrating its public online content to a professionally designed set of web pages built on the Drupal [1] content management system. Development of the front-end design passed through several key stages, including audience surveys, stakeholder interviews, usage analytics, and a series of fast design iterations, called sprints. Implementation of the web site involves application of the html design using Drupal templates, refined development iterations, and the overall population of the site with content. We present the design and development processes and share the lessons learned along the way, including the results of the data-driven discovery studies. We also demonstrate the advantages of selecting a back-end supported by content management, with a focus on workflow. Finally, we discuss usage of the new public web pages to implement outreach strategy through implementation of clearly presented themes, consistent audience targeting and messaging, and the enforcement of a well-defined visual identity.
Creating a Classroom Kaleidoscope with the World Wide Web.
ERIC Educational Resources Information Center
Quinlan, Laurie A.
1997-01-01
Discusses the elements of classroom Web presentations: planning; construction, including design tips; classroom use; and assessment. Lists 14 World Wide Web resources for K-12 teachers; Internet search tools (directories, search engines and meta-search engines); a Web glossary; and an example of HTML for a simple Web page. (PEN)
Going, going, still there: using the WebCite service to permanently archive cited Web pages.
Eysenbach, Gunther
2006-01-01
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page.
WebAlchemist: a Web transcoding system for mobile Web access in handheld devices
NASA Astrophysics Data System (ADS)
Whang, Yonghyun; Jung, Changwoo; Kim, Jihong; Chung, Sungkwon
2001-11-01
In this paper, we describe the design and implementation of WebAlchemist, a prototype web transcoding system, which automatically converts a given HTML page into a sequence of equivalent HTML pages that can be properly displayed on a hand-held device. The WebAlchemist system is based on a set of HTML transcoding heuristics managed by the Transcoding Manager (TM) module. In order to tackle difficult-to-transcode pages such as ones with large or complex table structures, we have developed several new transcoding heuristics that extract partial semantics from syntactic information such as the table width, font size and cascading style sheet. Subjective evaluation results using popular HTML pages (such as the CNN home page) show that WebAlchemist generates readable, structure-preserving transcoded pages, which can be properly displayed on hand-held devices.
Dynamic Web Pages: Performance Impact on Web Servers.
ERIC Educational Resources Information Center
Kothari, Bhupesh; Claypool, Mark
2001-01-01
Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)
An Extraction Method of an Informative DOM Node from a Web Page by Using Layout Information
NASA Astrophysics Data System (ADS)
Tsuruta, Masanobu; Masuyama, Shigeru
We propose a method for extracting the informative DOM node from a Web page as preprocessing for Web content mining. Our proposed method, LM, uses layout data of DOM nodes generated by a generic Web browser, and its learning set consists of hundreds of Web pages with annotations of their informative DOM nodes. Our method does not require large-scale crawling of the whole Web site to which the target Web page belongs. We design LM so that it uses the information in the learning set more efficiently than the existing method that uses the same learning set. In experiments, we evaluate methods obtained by combining an informative-DOM-node extraction method (either the proposed method or an existing one) with existing noise-elimination methods: Heur, which removes advertisements and link lists by heuristics, and CE, which removes DOM nodes that also appear in other Web pages of the same Web site as the target Web page. Experimental results show that 1) LM outperforms other methods for extracting the informative DOM node, and 2) the combination method (LM, {CE(10), Heur}) based on LM (precision: 0.755, recall: 0.826, F-measure: 0.746) outperforms other combination methods.
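The following Python fragment is only a rough stand-in for the idea of picking the most informative DOM node. The paper's LM method scores nodes using layout geometry from a rendered page plus a learned model; since layout data requires a browser, this sketch substitutes a much cruder proxy (plain-text length penalized by link text), and the tag list and weights are arbitrary assumptions.

```python
# Rough stand-in only: the paper's LM method scores DOM nodes with layout data
# from a rendered page plus a learned model. Layout requires a browser, so this
# sketch uses a much cruder proxy (text length penalized by link text); the tag
# list and the weight 3 are arbitrary assumptions.
from bs4 import BeautifulSoup  # assumes beautifulsoup4 is installed

def most_informative_node(html: str):
    soup = BeautifulSoup(html, "html.parser")
    best_node, best_score = None, float("-inf")
    for node in soup.find_all(["div", "article", "section", "td"]):
        text_len = len(node.get_text(" ", strip=True))
        link_len = sum(len(a.get_text(strip=True)) for a in node.find_all("a"))
        score = text_len - 3 * link_len  # penalize link-heavy (nav/ad) blocks
        if score > best_score:
            best_node, best_score = node, score
    return best_node
```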
Results from a Web Impact Factor Crawler.
ERIC Educational Resources Information Center
Thelwall, Mike
2001-01-01
Discusses Web impact factors (WIFs), Web versions of the impact factors for journals, and how they can be calculated by using search engines. Highlights include HTML and document indexing; Web page links; a Web crawler designed for calculating WIFs; and WIFs for United Kingdom universities that measured research profiles or capability. (Author/LRW)
Building Interactive Simulations in Web Pages without Programming.
Mailen Kootsey, J; McAuley, Grant; Bernal, Julie
2005-01-01
A software system is described for building interactive simulations and other numerical calculations in Web pages. The system is based on a new Java-based software architecture named NumberLinX (NLX) that isolates each function required to build the simulation so that a library of reusable objects could be assembled. The NLX objects are integrated into a commercial Web design program for coding-free page construction. The model description is entered through a wizard-like utility program that also functions as a model editor. The complete system permits very rapid construction of interactive simulations without coding. A wide range of applications are possible with the system beyond interactive calculations, including remote data collection and processing and collaboration over a network.
ERIC Educational Resources Information Center
Fitzgerald, Mary Ann; Gregory, Vicki L.; Brock, Kathy; Bennett, Elizabeth; Chen, Shu-Hsien Lai; Marsh, Emily; Moore, Joi L.; Kim, Kyung-Sun; Esser, Linda R.
2002-01-01
Chapters in this section of "Educational Media and Technology Yearbook" examine important trends prominent in the landscape of the school library media profession in 2001. Themes include mandated educational reform; diversity in school library resources; communication through image-text juxtaposition in Web pages; and professional development and…
A radiology department intranet: development and applications.
Willing, S J; Berland, L L
1999-01-01
An intranet is a "private Internet" that uses the protocols of the World Wide Web to share information resources within a company or with the company's business partners and clients. The hardware requirements for an intranet begin with a dedicated Web server permanently connected to the departmental network. The heart of a Web server is the hypertext transfer protocol (HTTP) service, which receives a page request from a client's browser and transmits the page back to the client. Although knowledge of hypertext markup language (HTML) is not essential for authoring a Web page, a working familiarity with HTML is useful, as is knowledge of programming and database management. Security can be ensured by using scripts to write information in hidden fields or by means of "cookies." Interfacing databases and database management systems with the Web server and conforming the user interface to HTML syntax can be achieved by means of the common gateway interface (CGI), Active Server Pages (ASP), or other methods. An intranet in a radiology department could include the following types of content: on-call schedules, work schedules and a calendar, a personnel directory, resident resources, memorandums and discussion groups, software for a radiology information system, and databases.
Recognition of pornographic web pages by classifying texts and images.
Hu, Weiming; Wu, Ou; Chen, Zhouyao; Fu, Zhouyu; Maybank, Steve
2007-06-01
With the rapid development of the World Wide Web, people benefit more and more from the sharing of information. However, Web pages with obscene, harmful, or illegal content can be easily accessed. It is important to recognize such unsuitable, offensive, or pornographic Web pages. In this paper, a novel framework for recognizing pornographic Web pages is described. A C4.5 decision tree is used to divide Web pages, according to content representations, into continuous text pages, discrete text pages, and image pages. These three categories of Web pages are handled, respectively, by a continuous text classifier, a discrete text classifier, and an algorithm that fuses the results from the image classifier and the discrete text classifier. In the continuous text classifier, statistical and semantic features are used to recognize pornographic texts. In the discrete text classifier, the naive Bayes rule is used to calculate the probability that a discrete text is pornographic. In the image classifier, the object's contour-based features are extracted to recognize pornographic images. In the text and image fusion algorithm, the Bayes theory is used to combine the recognition results from images and texts. Experimental results demonstrate that the continuous text classifier outperforms the traditional keyword-statistics-based classifier, the contour-based image classifier outperforms the traditional skin-region-based image classifier, the results obtained by our fusion algorithm outperform those by either of the individual classifiers, and our framework can be adapted to different categories of Web pages.
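As an illustration of the discrete-text stage only, the sketch below trains a naive Bayes classifier over word counts, in the spirit of the framework's discrete text classifier. The toy snippets and labels are invented, and the C4.5 page-type splitter, the image classifier, and the Bayesian fusion step are not reproduced.

```python
# Toy illustration of the discrete-text stage only: a naive Bayes classifier
# over word counts. The snippets and labels are invented; the page-type
# splitter, image classifier, and Bayesian fusion step are not reproduced.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = ["family holiday photo album", "explicit adult content example"]
train_labels = [0, 1]  # 0 = benign, 1 = pornographic (toy labels)

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_texts, train_labels)
print(model.predict_proba(["holiday photo gallery"]))  # P(benign), P(porn)
```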
ERIC Educational Resources Information Center
Ariga, T.; Watanabe, T.
2008-01-01
The explosive growth of the Internet has made the knowledge and skills for creating Web pages into general subjects that all students should learn. It is now common to teach the technical side of the production of Web pages and many teaching materials have been developed. However teaching the aesthetic side of Web page design has been neglected,…
NASA Technical Reports Server (NTRS)
1996-01-01
A World Wide Web page, Webpress, designed for K-12 teachers is described. The primary emphasis of Webpress is the science of aeronautics, and the page includes many links to various NASA facilities as well as many other scientific organizations.
Formal Features of Cyberspace: Relationships between Web Page Complexity and Site Traffic.
ERIC Educational Resources Information Center
Bucy, Erik P.; Lang, Annie; Potter, Robert F.; Grabe, Maria Elizabeth
1999-01-01
Examines differences between the formal features of commercial versus noncommercial Web sites, and the relationship between Web page complexity and amount of traffic a site receives. Findings indicate that, although most pages in this stage of the Web's development remain technologically simple and noninteractive, there are significant…
Webmail: an Automated Web Publishing System
NASA Astrophysics Data System (ADS)
Bell, David
A system for publishing frequently updated information to the World Wide Web will be described. Many documents now hosted by the NOAO Web server require timely posting and frequent updates, but need only minor changes in markup or are in a standard format requiring only conversion to HTML. These include information from outside the organization, such as electronic bulletins, and a number of internal reports, both human and machine generated. Webmail uses procmail and Perl scripts to process incoming email messages in a variety of ways. This processing may include wrapping or conversion to HTML, posting to the Web or internal newsgroups, updating search indices or links on related pages, and sending email notification of the new pages to interested parties. The Webmail system has been in use at NOAO since early 1997 and has steadily grown to include fourteen recipes that together handle about fifty messages per week.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-13
... sites, social media pages, and any comparable Internet presence, and on Web sites, social media pages... prescribed by FINRA, on their Web sites, social media pages, and any comparable Internet presence, and on Web sites, social media pages, and any comparable Internet presence relating to a member's investment...
Visual Design Principles Applied To World Wide Web Construction.
ERIC Educational Resources Information Center
Luck, Donald D.; Hunter, J. Mark
This paper describes basic types of World Wide Web pages and presents design criteria for page layout based on principles of visual literacy. Discussion focuses on pages that present information in the following styles: billboard; directory/index; textual; and graphics. Problems and solutions in Web page construction are explored according to…
Panatto, Donatella; Amicizia, Daniela; Arata, Lucia; Lai, Piero Luigi; Gasparini, Roberto
2018-04-03
Squalene-based adjuvants have been included in influenza vaccines since 1997. Despite several advantages of adjuvanted seasonal and pandemic influenza vaccines, laypeople's perception of such formulations may be hesitant or even negative under certain circumstances. Moreover, in Italian, the term "squalene" has the same root as such common words as "shark" (squalo), "squalid" and "squalidness" that tend to have negative connotations. This study aimed to quantitatively and qualitatively analyze a representative sample of Italian web pages mentioning squalene-based adjuvants used in influenza vaccines. Every effort was made to limit the subjectivity of judgments. Eighty-four unique web pages were assessed. A high prevalence (47.6%) of pages with negative or ambiguous attitudes toward squalene-based adjuvants was established. Compared with web pages reporting balanced information on squalene-based adjuvants, those categorized as negative/ambiguous had significantly lower odds of belonging to a professional institution [adjusted odds ratio (aOR) = 0.12, p = .004], and significantly higher odds of containing pictures (aOR = 1.91, p = .034) and being more readable (aOR = 1.34, p = .006). Some differences in wording between positive/neutral and negative/ambiguous web pages were also observed. The most common scientifically unsound claims concerned safety issues and, in particular, claims linking squalene-based adjuvants to the Gulf War Syndrome and autoimmune disorders. Italian users searching the web for information on vaccine adjuvants have a high likelihood of finding unbalanced and misleading material. Information provided by institutional websites should be not only evidence-based but also carefully targeted towards laypeople. Conversely, authors writing for non-institutional websites should avoid sensationalism and provide their readers with more balanced information.
An efficient scheme for automatic web pages categorization using the support vector machine
NASA Astrophysics Data System (ADS)
Bhalla, Vinod Kumar; Kumar, Neeraj
2016-07-01
In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages within a fraction of a second. To achieve this goal, an efficient categorization of web page contents is required. Manual categorization of these billions of web pages with high accuracy is a challenging task. Most of the existing techniques reported in the literature are semi-automatic, and a higher level of accuracy cannot be achieved with them. To achieve these goals, this paper proposes an automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific and relevant features of the web pages. In the proposed scheme, extraction and evaluation of features are done first, followed by filtering of the feature set for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page is developed in the proposed scheme. Feature extraction and weight assignment are based on a collection of domain-specific keywords developed by considering various domain pages. Moreover, the keyword list is reduced on the basis of keyword ids, and stemming of keywords and tag text is performed to achieve higher accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using a machine learning method in combination with feature extraction and statistical analysis, using a support vector machine kernel as the classification tool. The results obtained confirm the effectiveness of the proposed scheme in terms of its accuracy on different categories of web pages.
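A minimal sketch of the general pipeline described above, assuming scikit-learn is available: TF-IDF features stand in for the paper's DOM- and keyword-list-based features, and the four training "pages" and their domain labels are invented placeholders.

```python
# Hedged sketch of the overall pipeline: extract textual features from pages
# and train an SVM. TF-IDF stands in for the paper's DOM/keyword-list features;
# the four training "pages" and their domain labels are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

pages = [
    "stock market shares trading news",
    "football match league score",
    "python tutorial programming code",
    "election parliament vote results",
]
domains = ["finance", "sport", "technology", "politics"]

classifier = make_pipeline(TfidfVectorizer(), SVC(kernel="linear"))
classifier.fit(pages, domains)
print(classifier.predict(["new programming language released"]))
```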
Finding Specification Pages from the Web
NASA Astrophysics Data System (ADS)
Yoshinaga, Naoki; Torisawa, Kentaro
This paper presents a method of finding a specification page on the Web for a given object (e.g., "Ch. d'Yquem") and its class label (e.g., "wine"). A specification page for an object is a Web page which gives concise attribute-value information about the object (e.g., "county"-"Sauternes") in well-formatted structures. A simple unsupervised method using layout and symbolic decoration cues was applied to a large number of Web pages to acquire candidate attributes for each class (e.g., "county" for the class "wine"). We then filter out irrelevant words from the putative attributes through an author-aware scoring function that we call site frequency. We used the acquired attributes to select a representative specification page for a given object from the Web pages retrieved by a normal search engine. Experimental results revealed that our system greatly outperformed the normal search engine in terms of specification retrieval.
A profile of anti-vaccination lobbying on the South African internet, 2011-2013.
Burnett, Rosemary Joyce; von Gogh, Lauren Jennifer; Moloi, Molelekeng H; François, Guido
2015-11-01
The South African Vaccination and Immunisation Centre receives many requests to explain the validity of internet-based anti-vaccination claims. Previous global studies on internet-based anti-vaccination lobbying had not identified anti-vaccination web pages originating in South Africa (SA). To characterise SA internet-based anti-vaccination lobbying. In 2011, searches for anti-vaccination content were performed using Google, Yahoo and MSN-Bing, limited to English-language SA web pages. Content analysis was performed on web pages expressing anti-vaccination sentiment about infant vaccination. This was repeated in 2012 and 2013 using Google, with the first 700 web pages per search being analysed. Blogs/forums, articles and e-shops constituted 40.3%, 55.2% and 4.5% of web pages, respectively. Authors were lay people (63.5%), complementary/alternative medicine (CAM) practitioners (23.1%), medical professionals practising CAM (7.7%) and medical professionals practising only allopathic medicine (5.8%). Advertisements appeared on 55.2% of web pages. Of these, 67.6% were sponsored by or linked to organisations with financial interests in discrediting vaccines, with 80.0% and 24.0% of web pages sponsored by these organisations claiming respectively that vaccines are ineffective and that vaccination is profit driven. The vast majority of web pages (92.5%) claimed that vaccines are not safe, and 77.6% of anti-vaccination claims originated from the USA. South Africans are creating web pages or blogs for local anti-vaccination lobbying. Research is needed to understand what influence internet-based anti-vaccination lobbying has on the uptake of infant vaccination in SA.
ARL Physics Web Pages: An Evaluation by Established, Transitional and Emerging Benchmarks.
ERIC Educational Resources Information Center
Duffy, Jane C.
2002-01-01
Provides an overview of characteristics among Association of Research Libraries (ARL) physics Web pages. Examines current academic Web literature and from that develops six benchmarks to measure physics Web pages: ease of navigation; logic of presentation; representation of all forms of information; engagement of the discipline; interactivity of…
Authoring Educational Courseware Using OXYGEN.
ERIC Educational Resources Information Center
Ip, Albert
Engaging learners on the World Wide Web is more than sending Web pages to the user. However, for many course delivery software programs, the smallest unit of delivery is a Web page. How content experts can create engaging Web pages has largely been ignored or taken for granted. This paper reports on an authoring model for creating pedagogically…
Web accessibility support for visually impaired users using link content analysis.
Iwata, Hajime; Kobayashi, Naofumi; Tachibana, Kenji; Shirogane, Junko; Fukazawa, Yoshiaki
2013-12-01
Web pages are used for a variety of purposes. End users must understand dynamically changing content and sequentially follow page links to find desired material, requiring significant time and effort. However, for visually impaired users using screen readers, it can be difficult to find links to web pages when link text and alternative text descriptions are inappropriate. Our method supports the discovery of content by analyzing 8 categories of link types, and allows visually impaired users to be aware of the content represented by links in advance. This facilitates end users' access to necessary information on web pages. Our method of classifying web page links is therefore effective as a means of evaluating accessibility.
This data layer provides access to Base Realignment and Closure (BRAC) Superfund Sites as part of the CIMC web service. EPA works with DoD to facilitate the reuse and redevelopment of BRAC federal properties. When the BRAC program began in the early 1990s, EPA worked with DoD and the states to identify uncontaminated areas, and these parcels were immediately made available for reuse. Since then EPA has worked with DoD to clean up the contaminated portions of bases. These are usually parcels that were training ranges, landfills, maintenance facilities and other past waste-disposal areas. Superfund is a program administered by the EPA to locate, investigate, and clean up the worst hazardous waste sites throughout the United States. EPA administers the Superfund program in cooperation with individual states and tribal governments. These sites include abandoned warehouses, manufacturing facilities, processing plants, and landfills - the key word here being abandoned. This data layer shows Superfund Sites that are located at BRAC Federal Facilities. Additional Superfund sites and other BRAC sites (those that are not Superfund sites) are included in other data layers as part of this web service. BRAC Superfund Sites shown in this web service are derived from the epa.gov website and include links to the relevant web pages within the attribute table. Data about BRAC Superfund Sites are located on their own EPA web pages, and CIMC links to those pages. The CIMC web service
Macroscopic characterisations of Web accessibility
NASA Astrophysics Data System (ADS)
Lopes, Rui; Carriço, Luis
2010-12-01
The Web Science framework poses fundamental questions on the analysis of the Web, by focusing on how microscopic properties (e.g. at the level of a Web page or Web site) emerge into macroscopic properties and phenomena. One research topic on the analysis of the Web is Web accessibility evaluation, which centres on understanding how accessible a Web page is for people with disabilities. However, when framing Web accessibility evaluation on Web Science, we have found that existing research stays at the microscopic level. This article presents an experimental study on framing Web accessibility evaluation into Web Science's goals. This study resulted in novel accessibility properties of the Web not found at microscopic levels, as well as of Web accessibility evaluation processes themselves. We observed at large scale some of the empirical knowledge on how accessibility is perceived by designers and developers, such as the disparity of interpretations of accessibility evaluation tools warnings. We also found a direct relation between accessibility quality and Web page complexity. We provide a set of guidelines for designing Web pages, education on Web accessibility, as well as on the computational limits of large-scale Web accessibility evaluations.
Reporting on post-menopausal hormone therapy: an analysis of gynaecologists' web pages.
Bucksch, Jens; Kolip, Petra; Deitermann, Bernhilde
2004-01-01
The present study was designed to analyse Web pages of German gynaecologists with regard to postmenopausal hormone therapy (HT). There is a growing body of evidence that the overall health risks of HT exceed the benefits. Making one's own informed choice has become a central concern for menopausal women. The Internet is an important source of health information, but the quality is often dubious. The study focused on the analysis of basic criteria such as last modification date and the quality of the HT information content. The results of the Women's Health Initiative Study (WHI) were used as a benchmark. We searched for relevant Web pages by entering a combination of key words (9 x 13 = 117) into the search engine www.google.de. Each Web page was analysed using a standardized questionnaire. The basic criteria and the quality of content on each Web page were separately categorized by two evaluators. Disagreements were resolved by discussion. Of the 97 websites identified, basic criteria were not met by the majority. For example, the modification date was displayed by only 23 (23.7%) Web pages. The quality of content of most Web pages regarding HT was inaccurate and incomplete. Whilst only nine (9.3%) took up a balanced position, 66 (68%) recommended HT without any restrictions. In 22 cases the recommendation was indistinct, and none of the sites refused HT. With regard to basic criteria, there was no difference between HT-recommending Web pages and sites with a balanced position. Evidence-based information resulting from the WHI trial was insufficiently represented on gynaecologists' Web pages. Because of the growing number of consumers looking online for health information, the danger of obtaining harmful information has to be minimized. Web pages of gynaecologists do not appear to be recommendable for women because they do not provide recent evidence-based findings about HT.
Exploring Cultural Variation in Eye Movements on a Web Page between Americans and Koreans
ERIC Educational Resources Information Center
Yang, Changwoo
2009-01-01
This study explored differences in eye movement on a Web page between members of two different cultures to provide insight and guidelines for implementation of global Web site development. More specifically, the research examines whether differences of eye movement exist between the two cultures (American vs. Korean) when viewing a Web page, and…
Test Your Knowledge of Internet Vocabulary.
ERIC Educational Resources Information Center
Bigham, Vicki Smith
1998-01-01
Answers common questions about the Internet, i.e., what it is, its components, and the definitions of its various features. Questions include what Web pages and browsers are, and the definitions of URLs, ISPs, home pages, search engines, and hyperlinks. (GR)
Nuclear Science References (NSR)
The NSR database schema and Web applications have undergone some recent changes; this is a revised version of the NSR Web Interface. For more information, see the help page. Manager: Boris Pritychenko, NNDC, Brookhaven National Laboratory. Web programming: Boris Pritychenko, NNDC.
Allen, J W; Finch, R J; Coleman, M G; Nathanson, L K; O'Rourke, N A; Fielding, G A
2002-01-01
This study was undertaken to determine the quality of information on the Internet regarding laparoscopy. Four popular World Wide Web search engines were used with the key word "laparoscopy." Advertisements, patient- or physician-directed information, and controversial material were noted. A total of 14,030 Web pages were found, but only 104 were unique Web sites. The majority of the sites were duplicate pages, subpages within a main Web page, or dead links. Twenty-eight of the 104 pages had a medical product for sale, 26 were patient-directed, 23 were written by a physician or group of physicians, and six represented corporations. The remaining 21 were "miscellaneous." The 46 pages containing educational material were critically reviewed. At least one of the senior authors found that 32 of the pages contained controversial or misleading statements. All of the three senior authors (LKN, NAO, GAF) independently agreed that 17 of the 46 pages contained controversial information. The World Wide Web is not a reliable source for patient or physician information about laparoscopy. Authenticating medical information on the World Wide Web is a difficult task, and no government or surgical society has taken the lead in regulating what is presented as fact on the World Wide Web.
NASA Astrophysics Data System (ADS)
Dimopoulos, Kostas; Asimakopoulos, Apostolos
2010-06-01
This study aims to explore the navigation patterns and preferred-page characteristics of ten secondary school students searching the web for information about cloning. The students navigated the Web for as long as they wished, with minimal support from teaching staff. Their navigation patterns were analyzed using audit trail data software. The characteristics of their preferred Web pages were also analyzed using a scheme of analysis largely based on sociolinguistic and socio-semiotic approaches. Two distinct groups of students could be discerned. The first consisted of more competent students, who visited fewer relevant pages during their navigation, though of higher credibility and more specialized content. The second consisted of weaker students, who visited more pages, mainly of lower credibility and rather popularized content. Implications for designing educational web pages and teaching are discussed.
NASA Astrophysics Data System (ADS)
Preger, B.; Verrecchia, F.; Pittori, C.; Antonelli, L. A.; Giommi, P.; Lazzarotto, F.; Evangelista, Y.
2008-05-01
The Italian Space Agency Science Data Center (ASDC) is a facility with several responsibilities, including support to all ASI scientific missions in the management and archiving of data, acting as the interface between ASI and the scientific community, and providing on-line access to the hosted data. In this poster we describe the services that ASDC provides for SuperAGILE, in particular the ASDC public web pages devoted to the dissemination of SuperAGILE scientific results. SuperAGILE is the X-ray imager onboard the AGILE mission, and provides the scientific community with orbit-by-orbit information on the observed sources. Crucial source information, including position and flux in chosen energy bands, will be reported on the SuperAGILE public web page at ASDC. Given their particular interest, another web page will be dedicated entirely to GRBs and other transients, where new event alerts will be announced and where users will find all the available information on the GRBs detected by SuperAGILE.
FOCIH: Form-Based Ontology Creation and Information Harvesting
NASA Astrophysics Data System (ADS)
Tao, Cui; Embley, David W.; Liddle, Stephen W.
Creating an ontology and populating it with data are both labor-intensive tasks requiring a high degree of expertise. Thus, scaling ontology creation and population to the size of the web in an effort to create a web of data—which some see as Web 3.0—is prohibitive. Can we find ways to streamline these tasks and lower the barrier enough to enable Web 3.0? Toward this end we offer a form-based approach to ontology creation that provides a way to create Web 3.0 ontologies without the need for specialized training. And we offer a way to semi-automatically harvest data from the current web of pages for a Web 3.0 ontology. In addition to harvesting information with respect to an ontology, the approach also annotates web pages and links facts in web pages to ontological concepts, resulting in a web of data superimposed over the web of pages. Experience with our prototype system shows that mappings between conceptual-model-based ontologies and forms are sufficient for creating the kind of ontologies needed for Web 3.0, and experiments with our prototype system show that automatic harvesting, automatic annotation, and automatic superimposition of a web of data over a web of pages work well.
Students as Web Site Authors: Effects on Motivation and Achievement
ERIC Educational Resources Information Center
Jones, Brett D.
2003-01-01
This study examined the effects of a Web site design project on students' motivation and achievement. Tenth-grade biology students worked together in teams on an ecology project that required them to locate relevant information on the Internet, decide which information should be included on their Web site, organize the information into Web pages,…
Automating Information Discovery Within the Invisible Web
NASA Astrophysics Data System (ADS)
Sweeney, Edwina; Curran, Kevin; Xie, Ermai
A Web crawler or spider crawls through the Web looking for pages to index, and when it locates a new page it passes the page on to an indexer. The indexer identifies links, keywords, and other content and stores these within its database. This database is searched by entering keywords through an interface and suitable Web pages are returned in a results page in the form of hyperlinks accompanied by short descriptions. The Web, however, is increasingly moving away from being a collection of documents to a multidimensional repository for sounds, images, audio, and other formats. This is leading to a situation where certain parts of the Web are invisible or hidden. The term known as the "Deep Web" has emerged to refer to the mass of information that can be accessed via the Web but cannot be indexed by conventional search engines. The concept of the Deep Web makes searches quite complex for search engines. Google states that the claim that conventional search engines cannot find such documents as PDFs, Word, PowerPoint, Excel, or any non-HTML page is not fully accurate and steps have been taken to address this problem by implementing procedures to search items such as academic publications, news, blogs, videos, books, and real-time information. However, Google still only provides access to a fraction of the Deep Web. This chapter explores the Deep Web and the current tools available in accessing it.
49 CFR 573.9 - Address for submitting required reports and other information.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Internet Web page http://www.safercar.gov/Vehicle+Manufacturers. A manufacturer must use the templates provided at this Web page for all submissions required under this section. Defect and noncompliance... at this Web page. [78 FR 51421, Aug. 20, 2013] ...
Ajax Architecture Implementation Techniques
NASA Astrophysics Data System (ADS)
Hussaini, Syed Asadullah; Tabassum, S. Nasira; Baig, Tabassum, M. Khader
2012-03-01
Today's rich Web applications use a mix of JavaScript and asynchronous communication with the application server. This mechanism is also known as Ajax: Asynchronous JavaScript and XML. The intent of Ajax is to exchange small pieces of data between the browser and the application server, and in doing so, use partial page refresh instead of reloading the entire Web page. AJAX (Asynchronous JavaScript and XML) is a powerful Web development model for browser-based Web applications. Technologies that form the AJAX model, such as XML, JavaScript, HTTP, and XHTML, are individually widely used and well known. However, AJAX combines these technologies to let Web pages retrieve small amounts of data from the server without having to reload the entire page. This capability makes Web pages more interactive and lets them behave like local applications. Web 2.0, enabled by the Ajax architecture, has given rise to a new level of user interactivity through web browsers. Many new and extremely popular Web applications have been introduced such as Google Maps, Google Docs, Flickr, and so on. Ajax toolkits such as Dojo allow web developers to build Web 2.0 applications quickly and with little effort.
Tune in the Net with RealAudio.
ERIC Educational Resources Information Center
Buchanan, Larry
1997-01-01
Describes how to connect to the RealAudio Web site to download a player that provides sound from Web pages to the computer through streaming technology. Explains hardware and software requirements and provides addresses for other RealAudio Web sites, including weather information and current news. (LRW)
Changing Instructional Practices through Technology Training, Part 2 of 2.
ERIC Educational Resources Information Center
Seamon, Mary
2001-01-01
This second of a two-part article introducing the steps in a school district's teacher professional development model discusses steps three through six: Web page or project; Internet Discovery (with its five phases: question, search, interpretation, composition, sharing); Cyberinquiry; and WebQuests. Three examples are included: Web Page…
12 CFR 516.30 - What information must I provide with my application?
Code of Federal Regulations, 2010 CFR
2010-01-01
... on OTS's web page at www.ots.treas.gov. (b) Captions and exhibits. You must caption the original... signatures on copies if you include a copy of the signed signature page or the copy otherwise indicates that...
Risk markers for disappearance of pediatric Web resources
Hernández-Borges, Angel A.; Jiménez-Sosa, Alejandro; Torres-Álvarez de Arcaya, Maria L.; Macías-Cervi, Pablo; Gaspar-Guardado, Maria A.; Ruíz-Rabaza, Ana
2005-01-01
Objectives: The authors sought to find out whether certain Webometric indexes of a sample of pediatric Web resources, and some tests based on them, could be helpful predictors of their disappearance. Methods: The authors performed a retrospective study of a sample of 363 pediatric Websites and pages they had followed for 4 years. Main measurements included: number of resources that disappeared, number of inbound links and their annual increment, average daily visits to the resources in the sample, sample compliance with the quality criteria of 3 international organizations, and online time of the Web resources. Results: On average, 11% of the sample disappeared annually. However, 13% of these were available again at the end of follow up. Disappearing and surviving Websites did not show differences in the variables studied. However, surviving Web pages had a higher number of inbound links and higher annual increment in inbound links. Similarly, Web pages that survived showed higher compliance with recognized sets of quality criteria than those that disappeared. A subset of 14 quality criteria whose compliance accounted for 90% of the probability of online permanence was identified. Finally, a progressive increment of inbound links was found to be a marker of good prognosis, showing high specificity and positive predictive value (88% and 94%, respectively). Conclusions: The number of inbound links and annual increment of inbound links could be useful markers of the permanence probability for pediatric Web pages. Strategies that assure the Web editors' awareness of their Web resources' popularity could stimulate them to improve the quality of their Websites. PMID:16059427
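For readers unfamiliar with the two reported metrics, the snippet below computes specificity and positive predictive value from a hypothetical 2x2 table; the counts are invented and are not the study's data.

```python
# Illustration of the two reported metrics computed from a hypothetical 2x2
# table; the counts below are invented and are not the study's data.
tp, fp, fn, tn = 47, 3, 10, 22  # hypothetical confusion-matrix counts

specificity = tn / (tn + fp)  # true negatives among all actual negatives
ppv = tp / (tp + fp)          # true positives among all positive predictions
print(f"specificity = {specificity:.2f}, PPV = {ppv:.2f}")
```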
Accounting Data to Web Interface Using PERL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hargeaves, C
2001-08-13
This document will explain the process to create a web interface for the accounting information generated by the High Performance Storage Systems (HPSS) accounting report feature. The accounting report contains useful data but it is not easily accessed in a meaningful way. The accounting report is the only way to see summarized storage usage information. The first step is to take the accounting data, make it meaningful and store the modified data in persistent databases. The second step is to generate the various user interfaces, HTML pages, that will be used to access the data. The third step is to transfer all required files to the web server. The web pages pass parameters to Common Gateway Interface (CGI) scripts that generate dynamic web pages and graphs. The end result is a web page with specific information presented in text with or without graphs. The accounting report has a specific format that allows the use of regular expressions to verify if a line is storage data. Each storage data line is stored in a detailed database file with a name that includes the run date. The detailed database is used to create a summarized database file that also uses the run date in its name. The summarized database is used to create the group.html web page that includes a list of all storage users. Scripts that query the database folder to build a list of available databases generate two additional web pages. A master script that is run monthly as part of a cron job, after the accounting report has completed, manages all of these individual scripts. All scripts are written in the PERL programming language. Whenever possible, data manipulation scripts are written as filters. All scripts are written to be single source, which means they will function properly on both the open and closed networks at LLNL. The master script handles the command line inputs for all scripts, file transfers to the web server and records run information in a log file. The rest of the scripts manipulate the accounting data or use the files created to generate HTML pages. Each script will be described in detail herein. The following is a brief description of HPSS taken directly from an HPSS web site. "HPSS is a major development project, which began in 1993 as a Cooperative Research and Development Agreement (CRADA) between government and industry. The primary objective of HPSS is to move very large data objects between high performance computers, workstation clusters, and storage libraries at speeds many times faster than is possible with today's software systems. For example, HPSS can manage parallel data transfers from multiple network-connected disk arrays at rates greater than 1 Gbyte per second, making it possible to access high definition digitized video in real time." The HPSS accounting report is a canned report whose format is controlled by the HPSS developers.
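A hypothetical sketch (in Python rather than the Perl the authors used) of the first step described: match storage-data lines with a regular expression and roll them up into a per-user summary file. The line format, field names, and output layout are invented; the real HPSS accounting report format differs.

```python
# Hypothetical Python rendition (the authors used Perl) of the first step:
# match storage-data lines with a regular expression and roll them up into a
# per-user summary file. The line format, field names, and output layout are
# invented; the real HPSS accounting report format differs.
import csv
import re
from collections import defaultdict

LINE_RE = re.compile(r"^(?P<user>\w+)\s+(?P<files>\d+)\s+(?P<bytes>\d+)\s*$")

def summarize(report_path: str, out_path: str) -> None:
    totals = defaultdict(lambda: [0, 0])  # user -> [files, bytes]
    with open(report_path, encoding="utf-8") as report:
        for line in report:
            match = LINE_RE.match(line)
            if match:  # keep only lines that look like storage data
                totals[match["user"]][0] += int(match["files"])
                totals[match["user"]][1] += int(match["bytes"])
    with open(out_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(["user", "files", "bytes"])
        for user, (files, nbytes) in sorted(totals.items()):
            writer.writerow([user, files, nbytes])
```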
The efficacy of a Web-based counterargument tutor.
Wolfe, Christopher R; Britt, M Anne; Petrovic, Melina; Albrecht, Michael; Kopp, Kristopher
2009-08-01
In two experiments, we developed and tested an interactive Web-based tutor to help students identify and evaluate counterarguments. In Experiment 1, we determined the extent to which high- and low-argumentation-ability participants were able to identify counterarguments. We tested the effectiveness of having participants read didactic text regarding counterarguments and highlight claims. Both preparations had some positive effects that were often limited to high-ability participants. The Web-based intervention included interactive exercises on identifying and using counterarguments. Web-based presentation was state driven, using a JavaServer Pages (JSP) page. As participants progressively identified argument elements, the page changed display state and presented feedback by checking what the user clicked against elements that we had coded in XML beforehand. Instructions and feedback strings were indexed by state, so that changing state selected new text to display. In Experiment 2, the tutor was effective in teaching participants to identify counterarguments, recognize responses, and determine whether counterarguments were rebutted, dismissed, or conceded.
Home Page, Sweet Home Page: Creating a Web Presence.
ERIC Educational Resources Information Center
Falcigno, Kathleen; Green, Tim
1995-01-01
Focuses primarily on design issues and practical concerns involved in creating World Wide Web documents for use within an organization. Concerns for those developing Web home pages are: learning HyperText Markup Language (HTML); defining customer group; allocating staff resources for maintenance of documents; providing feedback mechanism for…
An Analysis of Academic Library Web Pages for Faculty
ERIC Educational Resources Information Center
Gardner, Susan J.; Juricek, John Eric; Xu, F. Grace
2008-01-01
Web sites are increasingly used by academic libraries to promote key services and collections to teaching faculty. This study analyzes the content, location, language, and technological features of fifty-four academic library Web pages designed especially for faculty to expose patterns in the development of these pages.
Paparo, G. D.; Martin-Delgado, M. A.
2012-01-01
We introduce the characterization of a class of quantum PageRank algorithms in a scenario in which some kind of quantum network is realizable out of the current classical internet web, but no quantum computer is yet available. This class represents a quantization of the PageRank protocol currently employed to list web pages according to their importance. We have found an instance of this class of quantum protocols that outperforms its classical counterpart and may break the classical hierarchy of web pages depending on the topology of the web. PMID:22685626
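For context, the classical protocol being quantized here can be summarized by the standard PageRank power iteration. The sketch below is a minimal, generic implementation (not the quantum algorithm of the paper); the damping factor and the toy link graph are chosen purely for illustration.

```python
import numpy as np

def pagerank(adjacency, damping=0.85, tol=1e-9):
    """Classical PageRank by power iteration on a column-stochastic matrix."""
    adjacency = np.asarray(adjacency, dtype=float)
    n = adjacency.shape[0]
    out_degree = adjacency.sum(axis=1)
    # Column j of M holds the probability of following a link from page j;
    # dangling pages (no out-links) are treated as linking to every page.
    M = np.where(out_degree > 0,
                 adjacency.T / np.where(out_degree == 0, 1, out_degree),
                 1.0 / n)
    rank = np.full(n, 1.0 / n)
    while True:
        new_rank = damping * M @ rank + (1 - damping) / n
        if np.abs(new_rank - rank).sum() < tol:
            return new_rank
        rank = new_rank

# Toy web of four pages: 0 -> 1,2; 1 -> 2; 2 -> 0; 3 -> 2
links = [[0, 1, 1, 0], [0, 0, 1, 0], [1, 0, 0, 0], [0, 0, 1, 0]]
print(pagerank(links).round(3))
```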
Atmospheric Science Data Center
2013-03-21
Web links to relevant CERES information, including CERES references, the Instrument Working Group home page, and the Aerosol Retrieval Web Page (Center for Satellite Applications and Research).
A cross disciplinary study of link decay and the effectiveness of mitigation techniques
2013-01-01
Background: The dynamic, decentralized world-wide-web has become an essential part of scientific research and communication. Researchers create thousands of web sites every year to share software, data and services. These valuable resources tend to disappear over time. The problem has been documented in many subject areas. Our goal is to conduct a cross-disciplinary investigation of the problem and test the effectiveness of existing remedies. Results: We accessed 14,489 unique web pages found in the abstracts within Thomson Reuters' Web of Science citation index that were published between 1996 and 2010 and found that the median lifespan of these web pages was 9.3 years with 62% of them being archived. Survival analysis and logistic regression were used to find significant predictors of URL lifespan. The availability of a web page is most dependent on the time it is published and the top-level domain names. Similar statistical analysis revealed biases in current solutions: the Internet Archive favors web pages with fewer layers in the Universal Resource Locator (URL) while WebCite is significantly influenced by the source of publication. We also created a prototype for a process to submit web pages to the archives and increased coverage of our list of scientific webpages in the Internet Archive and WebCite by 22% and 255%, respectively. Conclusion: Our results show that link decay continues to be a problem across different disciplines and that current solutions for static web pages are helping and can be improved. PMID:24266891
A cross disciplinary study of link decay and the effectiveness of mitigation techniques.
Hennessey, Jason; Ge, Steven
2013-01-01
The dynamic, decentralized world-wide-web has become an essential part of scientific research and communication. Researchers create thousands of web sites every year to share software, data and services. These valuable resources tend to disappear over time. The problem has been documented in many subject areas. Our goal is to conduct a cross-disciplinary investigation of the problem and test the effectiveness of existing remedies. We accessed 14,489 unique web pages found in the abstracts within Thomson Reuters' Web of Science citation index that were published between 1996 and 2010 and found that the median lifespan of these web pages was 9.3 years with 62% of them being archived. Survival analysis and logistic regression were used to find significant predictors of URL lifespan. The availability of a web page is most dependent on the time it is published and the top-level domain names. Similar statistical analysis revealed biases in current solutions: the Internet Archive favors web pages with fewer layers in the Universal Resource Locator (URL) while WebCite is significantly influenced by the source of publication. We also created a prototype for a process to submit web pages to the archives and increased coverage of our list of scientific webpages in the Internet Archive and WebCite by 22% and 255%, respectively. Our results show that link decay continues to be a problem across different disciplines and that current solutions for static web pages are helping and can be improved.
Growing and navigating the small world Web by local content
Menczer, Filippo
2002-01-01
Can we model the scale-free distribution of Web hypertext degree under realistic assumptions about the behavior of page authors? Can a Web crawler efficiently locate an unknown relevant page? These questions are receiving much attention due to their potential impact for understanding the structure of the Web and for building better search engines. Here I investigate the connection between the linkage and content topology of Web pages. The relationship between a text-induced distance metric and a link-based neighborhood probability distribution displays a phase transition between a region where linkage is not determined by content and one where linkage decays according to a power law. This relationship is used to propose a Web growth model that is shown to accurately predict the distribution of Web page degree, based on textual content and assuming only local knowledge of degree for existing pages. A qualitatively similar phase transition is found between linkage and semantic distance, with an exponential decay tail. Both relationships suggest that efficient paths can be discovered by decentralized Web navigation algorithms based on textual and/or categorical cues. PMID:12381792
Growing and navigating the small world Web by local content
NASA Astrophysics Data System (ADS)
Menczer, Filippo
2002-10-01
Can we model the scale-free distribution of Web hypertext degree under realistic assumptions about the behavior of page authors? Can a Web crawler efficiently locate an unknown relevant page? These questions are receiving much attention due to their potential impact for understanding the structure of the Web and for building better search engines. Here I investigate the connection between the linkage and content topology of Web pages. The relationship between a text-induced distance metric and a link-based neighborhood probability distribution displays a phase transition between a region where linkage is not determined by content and one where linkage decays according to a power law. This relationship is used to propose a Web growth model that is shown to accurately predict the distribution of Web page degree, based on textual content and assuming only local knowledge of degree for existing pages. A qualitatively similar phase transition is found between linkage and semantic distance, with an exponential decay tail. Both relationships suggest that efficient paths can be discovered by decentralized Web navigation algorithms based on textual and/or categorical cues.
Growing and navigating the small world Web by local content.
Menczer, Filippo
2002-10-29
Can we model the scale-free distribution of Web hypertext degree under realistic assumptions about the behavior of page authors? Can a Web crawler efficiently locate an unknown relevant page? These questions are receiving much attention due to their potential impact for understanding the structure of the Web and for building better search engines. Here I investigate the connection between the linkage and content topology of Web pages. The relationship between a text-induced distance metric and a link-based neighborhood probability distribution displays a phase transition between a region where linkage is not determined by content and one where linkage decays according to a power law. This relationship is used to propose a Web growth model that is shown to accurately predict the distribution of Web page degree, based on textual content and assuming only local knowledge of degree for existing pages. A qualitatively similar phase transition is found between linkage and semantic distance, with an exponential decay tail. Both relationships suggest that efficient paths can be discovered by decentralized Web navigation algorithms based on textual and/or categorical cues.
An ant colony optimization based feature selection for web page classification.
Saraç, Esra; Özel, Selma Ayşe
2014-01-01
The increased popularity of the web has led to the inclusion of a huge amount of information on the web, and as a result of this explosive information growth, automated web page classification systems are needed to improve search engines' performance. Web pages have a large number of features such as HTML/XML tags, URLs, hyperlinks, and text contents that should be considered during an automated classification process. The aim of this study is to reduce the number of features to be used to improve runtime and accuracy of the classification of web pages. In this study, we used an ant colony optimization (ACO) algorithm to select the best features, and then we applied the well-known C4.5, naive Bayes, and k nearest neighbor classifiers to assign class labels to web pages. We used the WebKB and Conference datasets in our experiments, and we showed that using the ACO for feature selection improves both accuracy and runtime performance of classification. We also showed that the proposed ACO-based algorithm can select better features with respect to the well-known information gain and chi-square feature selection methods.
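A minimal sketch of the idea, assuming a generic wrapper-style ACO rather than the authors' exact formulation: ants sample feature subsets with probability guided by pheromone, each subset is scored by cross-validated k-nearest-neighbor accuracy, and pheromone is reinforced on the best subset found so far. The synthetic data and the selection heuristic are illustrative assumptions, not the WebKB setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=40, n_informative=8, random_state=0)

def evaluate(mask):
    """Fitness of a feature subset: cross-validated k-NN accuracy."""
    if not mask.any():
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

n_features, n_ants, n_iterations, evaporation = X.shape[1], 15, 20, 0.2
pheromone = np.ones(n_features)          # desirability of selecting each feature
best_mask, best_score = None, -1.0

for _ in range(n_iterations):
    for _ in range(n_ants):
        # Each ant selects features with probability proportional to pheromone.
        probs = pheromone / pheromone.sum()
        mask = rng.random(n_features) < probs * n_features * 0.25
        score = evaluate(mask)
        if score > best_score:
            best_mask, best_score = mask, score
    # Evaporate, then reinforce features used by the best subset so far.
    pheromone = (1 - evaporation) * pheromone
    pheromone[best_mask] += best_score

print(f"selected {best_mask.sum()} features, CV accuracy {best_score:.3f}")
```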
Artieta-Pinedo, Isabel; Paz-Pascual, Carmen; Grandes, Gonzalo; Villanueva, Gemma
2018-03-01
The aim of this study is to evaluate the quality of web pages found by women when carrying out an exploratory search concerning pregnancy, childbirth, the postpartum period and breastfeeding. A descriptive study was conducted of the first 25 web pages that appear in the search engines Google, Yahoo and Bing, in October 2014 in the Basque Country (Spain), when entering eight Spanish words and seven English words related to pregnancy, childbirth, the postpartum period, breastfeeding and newborns. Web pages aimed at healthcare professionals and forums were excluded. The reliability was evaluated using the LIDA questionnaire, and the contents of the web pages with the highest scores were then described. A total of 126 web pages were found using the key search words. Of these, 14 scored in the top 30% for reliability. The content analysis of these found that the mean score for "references to the source of the information" was 3.4 (SD: 2.17), that for "up-to-date" was 4.30 (SD: 1.97) and the score for "conflict of interest statement" was 5.90 (SD: 2.16). The mean for web pages created by universities and official bodies was 13.64 (SD: 4.47), whereas the mean for those created by private bodies was 11.23 (SD: 4.51) (F(1,124) = 5.27, p = 0.02). The content analysis of these web pages found that the most commonly discussed topic was breastfeeding, followed by self-care during pregnancy and the onset of childbirth. In this study, web pages from established healthcare or academic institutions were found to contain the most reliable information. The significant number of web pages found in this study with poor quality information indicates the need for healthcare professionals to guide women when sourcing information online. As the origin of the web page has a direct effect on reliability, the involvement of healthcare professionals in the use, counselling and generation of new technologies as an intervention tool is increasingly essential. Copyright © 2017 Elsevier Ltd. All rights reserved.
Table Extraction from Web Pages Using Conditional Random Fields to Extract Toponym Related Data
NASA Astrophysics Data System (ADS)
Luthfi Hanifah, Hayyu'; Akbar, Saiful
2017-01-01
A table is one of the ways to present information on web pages. The abundance of web pages that compose the World Wide Web has motivated research in information extraction and information retrieval, including research on table extraction. In addition, there is a need for a system designed specifically to handle location-related information. Against this background, this research provides a way to extract location-related data from web tables so that it can be used in the development of a Geographic Information Retrieval (GIR) system. The location-related data are identified by the toponym (location name). In this research, a rule-based approach with a gazetteer is used to recognize toponyms in web tables. To extract data from a table, a combination of rule-based and statistical approaches is used. In the statistical approach, a Conditional Random Fields (CRF) model is used to infer the schema of the table. The result of table extraction is presented in JSON format. If a web table contains a toponym, a field is added to the JSON document to store the toponym values. This field can be used to index the table data by toponym, which in turn supports the development of a GIR system.
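A sketch of what such a JSON result might look like, built in Python. The field names (source_url, columns, rows, toponym) are illustrative assumptions, since the abstract does not publish the exact schema, but they show the extra toponym field that supports indexing for GIR.

```python
import json

# One extracted record per table row; the "toponym" field is added only when a
# location name is recognized in the table (field names here are illustrative).
extracted_table = {
    "source_url": "http://example.org/stations.html",
    "columns": ["station", "city", "visitors_2016"],
    "rows": [
        {"station": "Gambir", "city": "Jakarta", "visitors_2016": 14500},
        {"station": "Bandung", "city": "Bandung", "visitors_2016": 9800},
    ],
    "toponym": ["Jakarta", "Bandung"],   # enables indexing by location for GIR
}

print(json.dumps(extracted_table, indent=2))
```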
XML: A Language To Manage the World Wide Web. ERIC Digest.
ERIC Educational Resources Information Center
Davis-Tanous, Jennifer R.
This digest provides an overview of XML (Extensible Markup Language), a markup language used to construct World Wide Web pages. Topics addressed include: (1) definition of a markup language, including comparison of XML with SGML (Standard Generalized Markup Language) and HTML (HyperText Markup Language); (2) how XML works, including sample tags,…
A Tutorial in Creating Web-Enabled Databases with Inmagic DB/TextWorks through ODBC.
ERIC Educational Resources Information Center
Breeding, Marshall
2000-01-01
Explains how to create Web-enabled databases. Highlights include Inmagic's DB/Text WebPublisher product called DB/TextWorks; ODBC (Open Database Connectivity) drivers; Perl programming language; HTML coding; Structured Query Language (SQL); Common Gateway Interface (CGI) programming; and examples of HTML pages and Perl scripts. (LRW)
Vermont hospital's web site focuses on valuable healthcare information.
Rees, Tom
2005-01-01
Brattleboro Memorial Hospital's web site celebrates a century of caring in the region of Brattleboro, Vt. The web site, bmhvt.org, is loaded with information, including a local links page that enables site visitors to hook up with the Brattleboro Chamber of Commerce, the Area Health Education Council, Lifeline Personal Response Service, and more.
ERIC Educational Resources Information Center
Fels, Deborah I.; Richards, Jan; Hardman, Jim; Lee, Daniel G.
2006-01-01
The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, Web accessing can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The…
32 CFR 806.5 - Responsibilities.
Code of Federal Regulations, 2014 CFR
2014-07-01
... room (ERR) requirements by establishing a FOIA site on their installation public web page and making... a link to the Air Force FOIA web page at http://www.foia.af.mil. See § 806.12(c). (d) MAJCOM... installation public web page by updating or removing them when no longer needed. Software for tracking number...
32 CFR 806.5 - Responsibilities.
Code of Federal Regulations, 2012 CFR
2012-07-01
... room (ERR) requirements by establishing a FOIA site on their installation public web page and making... a link to the Air Force FOIA web page at http://www.foia.af.mil. See § 806.12(c). (d) MAJCOM... installation public web page by updating or removing them when no longer needed. Software for tracking number...
32 CFR 806.5 - Responsibilities.
Code of Federal Regulations, 2013 CFR
2013-07-01
... room (ERR) requirements by establishing a FOIA site on their installation public web page and making... a link to the Air Force FOIA web page at http://www.foia.af.mil. See § 806.12(c). (d) MAJCOM... installation public web page by updating or removing them when no longer needed. Software for tracking number...
Social Responsibility and Corporate Web Pages: Self-Presentation or Agenda-Setting?
ERIC Educational Resources Information Center
Esrock, Stuart L.; Leichty, Greg B.
1998-01-01
Examines how corporate entities use the Web to present themselves as socially responsible citizens and to advance policy positions. Samples randomly "Fortune 500" companies, revealing that, although 90% had Web pages and 82% of the sites addressed a corporate social responsibility issue, few corporations used their pages to monitor…
The Internet as a Reflective Mirror for a Company's Image.
ERIC Educational Resources Information Center
Fahrmann, Jennifer; Hartz, Kim; Wendling, Marijo; Yoder, Kevin
The Internet is becoming the primary way that businesses communicate and receive information. Corporate Web addresses and home pages have become a valuable tool for leaving a solid mark on potential clients, consumers, and competition. To determine how differences in Web pages design reflect corporate image, a study examined Web pages from two…
Web page sorting algorithm based on query keyword distance relation
NASA Astrophysics Data System (ADS)
Yang, Han; Cui, Hong Gang; Tang, Hao
2017-08-01
To improve web page ranking, this paper proposes clustering query keywords according to the relationships among the search keywords that appear within a web page, and converting those relationships into a measure of how strongly the search keywords aggregate in the page. Building on the PageRank algorithm, a clustering-degree factor for the query keywords is added so that it participates in the quantitative calculation. The paper thus proposes an improved PageRank algorithm based on the distance relations between search keywords. The experimental results show the feasibility and effectiveness of the method.
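The paper's exact formulas are not reproduced in the abstract, so the sketch below illustrates one plausible reading: measure how tightly the query keywords cluster within a page (a small average gap between occurrences gives a high aggregation score) and blend that factor with the page's PageRank score. The function names, gap-based measure, and blending weight are assumptions for illustration.

```python
import re

def keyword_aggregation(page_text, query_keywords):
    """One plausible aggregation measure: the closer the query keywords occur
    to each other in the page, the higher the score (range (0, 1])."""
    tokens = re.findall(r"\w+", page_text.lower())
    positions = [i for i, tok in enumerate(tokens) if tok in query_keywords]
    if len(positions) < 2:
        return 0.0 if not positions else 1.0 / len(tokens)
    gaps = [b - a for a, b in zip(positions, positions[1:])]
    return 1.0 / (1.0 + sum(gaps) / len(gaps))   # small average gap -> high score

def combined_score(pagerank_score, page_text, query_keywords, weight=0.5):
    """Blend the link-based PageRank score with the keyword-distance factor."""
    return (1 - weight) * pagerank_score + weight * keyword_aggregation(page_text, query_keywords)

print(combined_score(0.12, "solar power panels store solar energy cheaply", {"solar", "energy"}))
```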
... reviewed/quality-filtered. The primary purpose of the Web page is educational and not to sell a ... in the directories. Availability and maintenance of the Web page The Web site is available consistently and ...
A readability assessment of online stroke information.
Sharma, Nikhil; Tridimas, Andreas; Fitzsimmons, Paul R
2014-07-01
Patients and carers increasingly access the Internet as a source of health information. Poor health literacy is extremely common and frequently limits patients' comprehension of health care information literature. We aimed to assess the readability of online consumer-orientated stroke information using 2 validated readability measures. The 100 highest Google-ranked consumer-oriented stroke Web pages were assessed for reading difficulty using the Flesch-Kincaid and Simple Measure of Gobbledygook (SMOG) formulae. None of the included Web pages complied with the current readability guidelines when readability was measured using the gold standard SMOG formula. Mean Flesch-Kincaid grade level was 10.4 (95% confidence interval [CI] 9.97-10.9) and mean SMOG grade 12.1 (95% CI 11.7-12.4). Over half of the Web pages were produced at graduate reading levels or above. Not-for-profit Web pages were significantly easier to read (P=.0006). The Flesch-Kincaid formula significantly underestimated reading difficulty, with a mean underestimation of 1.65 grades (95% CI 1.49-1.81), P<.0001. Most consumer-orientated stroke information Web sites require major text revision to comply with readability guidelines and to be comprehensible to the average patient. The Flesch-Kincaid formula significantly underestimates reading difficulty, and SMOG should be used as the measure of choice. Copyright © 2014 National Stroke Association. Published by Elsevier Inc. All rights reserved.
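For reference, the two measures used above can be computed from simple text counts. The sketch below uses the standard published formulas and assumes the word, sentence, syllable, and polysyllable counts are obtained elsewhere; the example counts are invented.

```python
import math

def flesch_kincaid_grade(words, sentences, syllables):
    """Flesch-Kincaid grade level from raw text counts."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def smog_grade(polysyllables, sentences):
    """SMOG grade from the count of words with three or more syllables."""
    return 1.0430 * math.sqrt(polysyllables * (30.0 / sentences)) + 3.1291

# A 300-word page with 20 sentences, 480 syllables and 45 polysyllabic words:
print(round(flesch_kincaid_grade(300, 20, 480), 1))   # ~9.1
print(round(smog_grade(45, 20), 1))                    # ~11.7
```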
National Centers for Environmental Prediction
Links include: the Government's official Web portal to all Federal, state and local government Web resources and services; the MISSION Web Page ("Verification" section); HRRR Verification at NOAA ESRL; the HRRR Web Verification Web Page; and the NOAA / National Weather Service National Centers for Environmental Prediction.
SSE Transition to POWER is Now Complete
Atmospheric Science Data Center
2018-06-21
... A new POWER home page with enhanced responsive GIS-enabled web data services and mapping capabilities replaced the SSE site on June 13, 2018. This current set of SSE web applications and website is no longer accessible. The new POWER includes ...
Experience versus talent shapes the structure of the Web.
Kong, Joseph S; Sarshar, Nima; Roychowdhury, Vwani P
2008-09-16
We use sequential large-scale crawl data to empirically investigate and validate the dynamics that underlie the evolution of the structure of the web. We find that the overall structure of the web is defined by an intricate interplay between experience or entitlement of the pages (as measured by the number of inbound hyperlinks a page already has), inherent talent or fitness of the pages (as measured by the likelihood that someone visiting the page would give a hyperlink to it), and the continual high rates of birth and death of pages on the web. We find that the web is conservative in judging talent and the overall fitness distribution is exponential, showing low variability. The small variance in talent, however, is enough to lead to experience distributions with high variance: The preferential attachment mechanism amplifies these small biases and leads to heavy-tailed power-law (PL) inbound degree distributions over all pages, as well as over pages that are of the same age. The balancing act between experience and talent on the web allows newly introduced pages with novel and interesting content to grow quickly and surpass older pages. In this regard, it is much like what we observe in high-mobility and meritocratic societies: People with entitlement continue to have access to the best resources, but there is just enough screening for fitness that allows for talented winners to emerge and join the ranks of the leaders. Finally, we show that the fitness estimates have potential practical applications in ranking query results.
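A rough simulation of the mechanism described, under assumptions of our own choosing (exponentially distributed fitness, three links per new page, attachment probability proportional to fitness times current in-degree plus one): even though fitness varies little, the resulting in-degree distribution becomes strongly heavy-tailed.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pages, links_per_new_page = 5000, 3

# "Talent": exponentially distributed fitness; "experience": accumulated in-links.
fitness = rng.exponential(scale=1.0, size=n_pages)
in_degree = np.zeros(n_pages)

for new_page in range(links_per_new_page, n_pages):
    existing = np.arange(new_page)
    # Attachment probability grows with both fitness and current in-degree.
    weights = fitness[existing] * (in_degree[existing] + 1)
    targets = rng.choice(existing, size=links_per_new_page, replace=False,
                         p=weights / weights.sum())
    in_degree[targets] += 1

# A few pages accumulate most of the inbound links despite low fitness variance.
print("max in-degree:", int(in_degree.max()),
      "median in-degree:", int(np.median(in_degree)))
```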
"Ordinary People Do This": Rhetorical Examinations of Novice Web Design
ERIC Educational Resources Information Center
Karper, Erin
2005-01-01
Even as weblogs, content management systems, and other forms of automated Web posting and journals are changing the way people create and place content on the Web, new Web pages mushroom overnight. However, many new Web designers produce Web pages that seem to ignore fundamental principles of "good design": full of colored backgrounds, animated…
Connecting Families through Innovative Technology in an Early Childhood Gifted Program.
ERIC Educational Resources Information Center
Kristovich, Sharon; Hertzog, Nancy B.; Klein, Marjorie
University Primary School (UPS) is an early childhood gifted program affiliated with the University of Illinois at Urbana-Champaign. This paper highlights three innovative uses of technology at UPS: Knowledge Web pages, photo portfolios, and Chickscope. The Knowledge Web pages are a collection of Web pages that serve as a virtual bulletin board…
ERIC Educational Resources Information Center
Hall, Richard H.; Hanna, Patrick
2004-01-01
The purpose of this experiment was to examine the effect of web page text/background colour combination on readability, retention, aesthetics, and behavioural intention. One hundred and thirty-six participants studied two Web pages, one with educational content and one with commercial content, in one of four colour-combination conditions. Major…
Web Pages for Your Classroom: The Easy Way!
ERIC Educational Resources Information Center
McCorkle, Sandra K.
This book provides the classroom teacher or librarian with templates and instructions for creating Web pages for use with middle school or high school students. The pages can then be used for doing research projects or other types of projects that familiarize students with the power, flexibility, and usefulness of the Web. Part I, Technology in…
Teaching E-Commerce Web Page Evaluation and Design: A Pilot Study Using Tourism Destination Sites
ERIC Educational Resources Information Center
Susser, Bernard; Ariga, Taeko
2006-01-01
This study explores a teaching method for improving business students' skills in e-commerce page evaluation and making Web design majors aware of business content issues through cooperative learning. Two groups of female students at a Japanese university studying either tourism or Web page design were assigned tasks that required cooperation to…
ERIC Educational Resources Information Center
Kammerer, Yvonne; Kalbfell, Eva; Gerjets, Peter
2016-01-01
In two experiments we systematically examined whether contradictions between two web pages--of which one was commercially biased as stated in an "about us" section--stimulated university students' consideration of source information both during and after reading. In Experiment 1 "about us" information of the web pages was…
ARM Climate Research Facility: Outreach Tools and Strategies
NASA Astrophysics Data System (ADS)
Roeder, L.; Jundt, R.
2009-12-01
Sponsored by the Department of Energy, the ARM Climate Research Facility is a global scientific user facility for the study of climate change. To publicize progress and achievements and to reach new users, the ACRF uses a variety of Web 2.0 tools and strategies that build off of the program’s comprehensive and well established News Center (www.arm.gov/news). These strategies include: an RSS subscription service for specific news categories; an email “newsletter” distribution to the user community that compiles the latest News Center updates into a short summary with links; and a Facebook page that pulls information from the News Center and links to relevant information in other online venues, including those of our collaborators. The ACRF also interacts with users through field campaign blogs, like Discovery Channel’s EarthLive, to share research experiences from the field. Increasingly, field campaign Wikis are established to help ACRF researchers collaborate during the planning and implementation phases of their field studies and include easy to use logs and image libraries to help record the campaigns. This vital reference information is used in developing outreach material that is shared in highlights, news, and Facebook. Other Web 2.0 tools that ACRF uses include Google Maps to help users visualize facility locations and aircraft flight patterns. Easy-to-use comment boxes are also available on many of the data-related web pages on www.arm.gov to encourage feedback. To provide additional opportunities for increased interaction with the public and user community, future Web 2.0 plans under consideration for ACRF include: evaluating field campaigns for Twitter and microblogging opportunities, adding public discussion forums to research highlight web pages, moving existing photos into albums on FlickR or Facebook, and building online video archives through YouTube.
Information about epilepsy on the internet: An exploratory study of Arabic websites.
Alkhateeb, Jamal M; Alhadidi, Muna S
2018-01-01
The aim of this study was to explore information about epilepsy found on Arabic websites. The researchers collected information from the internet between November 2016 and January 2017. Information was obtained using Google and Yahoo search engines. Keywords used were the Arabic equivalents of the following two terms: epilepsy (Al-saraa) and convulsion (Tashanoj). A total of 144 web pages addressing epilepsy in Arabic were reviewed. The majority of web pages were websites of medical institutions and general health websites, followed by informational and educational websites, others, blogs and websites of individuals, and news and media sites. Topics most commonly addressed were medical treatments for epilepsy (50% of all pages) followed by epilepsy definition (41%) and epilepsy etiology (34.7%). The results also revealed that the vast majority of web pages did not mention the source of information. Many web pages also did not provide author information. Only a small proportion of the web pages provided adequate information. Relatively few web pages provided inaccurate information or made sweeping generalizations. The findings of the present study suggest that the development of more credible Arabic websites on epilepsy is needed. These websites need to go beyond basic information, offering more evidence-based and updated information about epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.
12 CFR Appendix A to Part 40 - Model Privacy Form
Code of Federal Regulations, 2011 CFR
2011-01-01
... the term “Social Security number” in the first bullet. (2) Institutions must use five (5) of the...; a Web site; or use of a mail-in opt-out form. Institutions may include the words “toll-free” before... specific Web address that takes consumers directly to the opt-out page or a general Web address that...
12 CFR Appendix A to Part 716 - Model Privacy Form
Code of Federal Regulations, 2011 CFR
2011-01-01
... the term “Social Security number” in the first bullet. (2) Institutions must use five (5) of the...; a Web site; or use of a mail-in opt-out form. Institutions may include the words “toll-free” before... specific Web address that takes consumers directly to the opt-out page or a general Web address that...
12 CFR Appendix A to Part 573 - Model Privacy Form
Code of Federal Regulations, 2011 CFR
2011-01-01
... the term “Social Security number” in the first bullet. (2) Institutions must use five (5) of the...; a Web site; or use of a mail-in opt-out form. Institutions may include the words “toll-free” before... specific Web address that takes consumers directly to the opt-out page or a general Web address that...
30 CFR 585.500 - How do I make payments under this part?
Code of Federal Regulations, 2012 CFR
2012-07-01
... credit card or automated clearing house payments through the Pay.gov Web site, and you must include one copy of the Pay.gov confirmation receipt page with your unsolicited request or signed lease instrument. You may access the Pay.gov Web site through links on the BOEM Offshore Web site at: http://www.boem...
30 CFR 285.500 - How do I make payments under this part?
Code of Federal Regulations, 2011 CFR
2011-07-01
... your lease, you must make credit card or automated clearing house payments through the Pay.gov Web site, and you must include one copy of the Pay.gov confirmation receipt page with your unsolicited request or signed lease instrument. You may access the Pay.gov Web site through links on the MMS Offshore Web...
30 CFR 585.500 - How do I make payments under this part?
Code of Federal Regulations, 2013 CFR
2013-07-01
... credit card or automated clearing house payments through the Pay.gov Web site, and you must include one copy of the Pay.gov confirmation receipt page with your unsolicited request or signed lease instrument. You may access the Pay.gov Web site through links on the BOEM Offshore Web site at: http://www.boem...
30 CFR 285.500 - How do I make payments under this part?
Code of Federal Regulations, 2010 CFR
2010-07-01
... or automated clearing house payments through the Pay.gov Web site, and you must include one copy of the Pay.gov confirmation receipt page with your unsolicited request or signed lease instrument. You may access the Pay.gov Web site through links on the MMS Offshore Web site at: http://www.mms.gov...
ERIC Educational Resources Information Center
Zeng, Xiaoming; Sligar, Steven R.
2008-01-01
Human resource development programs in various institutions communicate with their constituencies including persons with disabilities through websites. Web sites need to be accessible for legal, economic and ethical reasons. We used an automated web usability evaluation tool, aDesigner, to evaluate 205 home pages from the organizations of AHRD…
ERIC Educational Resources Information Center
Geelan, David R.; Taylor, Peter C.
2004-01-01
Computer mediated communication--including web pages, email and web-based bulletin boards--was used to support the development of a cooperative learning community among students in a web-based distance education unit for practicing science and mathematics educators. The students lived in several Australian states and a number of Pacific Rim…
An Ant Colony Optimization Based Feature Selection for Web Page Classification
2014-01-01
The increased popularity of the web has led to the inclusion of a huge amount of information on the web, and as a result of this explosive information growth, automated web page classification systems are needed to improve search engines' performance. Web pages have a large number of features such as HTML/XML tags, URLs, hyperlinks, and text contents that should be considered during an automated classification process. The aim of this study is to reduce the number of features to be used to improve runtime and accuracy of the classification of web pages. In this study, we used an ant colony optimization (ACO) algorithm to select the best features, and then we applied the well-known C4.5, naive Bayes, and k nearest neighbor classifiers to assign class labels to web pages. We used the WebKB and Conference datasets in our experiments, and we showed that using the ACO for feature selection improves both accuracy and runtime performance of classification. We also showed that the proposed ACO-based algorithm can select better features with respect to the well-known information gain and chi-square feature selection methods. PMID:25136678
ERIC Educational Resources Information Center
Brandt, D. Scott
1998-01-01
Examines Internet security risks and how users can protect themselves. Discusses inadvertent bugs in software; programming problems with Common Gateway Interface (CGI); viruses; tracking of Web users; and preventing access to selected Web pages and filtering software. A glossary of Internet security-related terms is included. (AEF)
MedlinePlus Connect: Technical Information
... Service Technical Information Page MedlinePlus Connect Implementation Options Web Application How does it work? Responds to requests ... examples of MedlinePlus Connect Web Application response pages. Web Service How does it work? Responds to requests ...
78 FR 42775 - CGI Federal, Inc., and Custom Applications Management; Transfer of Data
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-17
... develop applications, Web sites, Web pages, web-based applications and databases, in accordance with EPA policies and related Federal standards and procedures. The Contractor will provide ...
Explore the World of Particle Physics Measuring Single Photons The web pages that follow presume phenomenon and then return to our study of single photon measurement. Your choices include: These choices University of Colorado. A Java applet by Phillip Warner. Dive right into the single photon pages here
Network Update: Plug-Ins, Forms and All That Java.
ERIC Educational Resources Information Center
Higgins, Chris
1997-01-01
Notes that the desire to make World Wide Web (WWW) pages more interactive and laden with animation, sound, and video brings us to the threshold of the deeper levels of Web page creation. Lists and discusses resources available on the WWW that will aid in learning and using these dynamic functions for Web page development to assist in interactive…
Strong regularities in world wide web surfing
Huberman; Pirolli; Pitkow; Lukose
1998-04-03
One of the most common modes of accessing information in the World Wide Web is surfing from one document to another along hyperlinks. Several large empirical studies have revealed common patterns of surfing behavior. A model that assumes that users make a sequence of decisions to proceed to another page, continuing as long as the value of the current page exceeds some threshold, yields the probability distribution for the number of pages that a user visits within a given Web site. This model was verified by comparing its predictions with detailed measurements of surfing patterns. The model also explains the observed Zipf-like distributions in page hits observed at Web sites.
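A minimal simulation of the surfing model described above, with drift, noise, and threshold values chosen arbitrarily for illustration: each user's perceived page value follows a random walk, and surfing stops once it falls below the threshold, producing a strongly right-skewed distribution of pages visited per site.

```python
import numpy as np

rng = np.random.default_rng(2)

def pages_visited(start_value=1.0, drift=-0.05, noise=0.3, threshold=0.0, max_pages=10_000):
    """Simulate one user: the perceived value of the current page follows a
    random walk, and surfing stops once it drops below a threshold."""
    value, pages = start_value, 1
    while value > threshold and pages < max_pages:
        value += drift + noise * rng.standard_normal()
        pages += 1
    return pages

clicks = np.array([pages_visited() for _ in range(50_000)])
# The distribution of pages per visit is strongly right-skewed, as the model predicts.
print("median:", np.median(clicks), "95th percentile:", np.percentile(clicks, 95))
```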
Digital Ethnography: Library Web Page Redesign among Digital Natives
ERIC Educational Resources Information Center
Klare, Diane; Hobbs, Kendall
2011-01-01
Presented with an opportunity to improve Wesleyan University's dated library home page, a team of librarians employed ethnographic techniques to explore how its users interacted with Wesleyan's current library home page and web pages in general. Based on the data that emerged, a group of library staff and members of the campus' information…
Programming Methodology for High Performance Applications on Tiled Architectures
2009-06-01
members, both voting and non-voting; • A page of links, including links to all available PCA home pages, the home pages of other DARPA programs of ... HPEC-SI); and • Organizational information on the Morphware Forum, such as membership requirements and voting procedures. The web site was the ... The following organizations are voting members of the Morphware Forum at this writing: • Defense Advanced Research Projects Agency • Georgia
Wiki use in mental health practice: recognizing potential use of collaborative technology.
Bastida, Richard; McGrath, Ian; Maude, Phil
2010-04-01
Web 2.0, the second generation of the World Wide Web, differs from earlier versions of Web development and design in that it facilitates more user-friendly, interactive information sharing and mechanisms for greater collaboration between users. Examples of Web 2.0 include Web-based communities, hosted services, social networking sites, video sharing sites, blogs, mashups, and wikis. Users are able to interact with others across the world or to add to or change website content. This paper examines examples of wiki use in the Australian mental health sector. A wiki can be described as an online collaborative and interactive database that can be easily edited by users. Wikis are accessed via a standard Web browser, which has an interface similar to traditional Web pages and thus requires no special application or software for the user. Although there is a paucity of literature describing wiki use in mental health, other industries have developed uses, including a repository of knowledge, a platform for collaborative writing, a project management tool, and an alternative to traditional Web pages or Intranets. This paper discusses the application of wikis in other industries and offers suggestions by way of examples of how this technology could be used in the mental health sector.
Experience versus talent shapes the structure of the Web
Kong, Joseph S.; Sarshar, Nima; Roychowdhury, Vwani P.
2008-01-01
We use sequential large-scale crawl data to empirically investigate and validate the dynamics that underlie the evolution of the structure of the web. We find that the overall structure of the web is defined by an intricate interplay between experience or entitlement of the pages (as measured by the number of inbound hyperlinks a page already has), inherent talent or fitness of the pages (as measured by the likelihood that someone visiting the page would give a hyperlink to it), and the continual high rates of birth and death of pages on the web. We find that the web is conservative in judging talent and the overall fitness distribution is exponential, showing low variability. The small variance in talent, however, is enough to lead to experience distributions with high variance: The preferential attachment mechanism amplifies these small biases and leads to heavy-tailed power-law (PL) inbound degree distributions over all pages, as well as over pages that are of the same age. The balancing act between experience and talent on the web allows newly introduced pages with novel and interesting content to grow quickly and surpass older pages. In this regard, it is much like what we observe in high-mobility and meritocratic societies: People with entitlement continue to have access to the best resources, but there is just enough screening for fitness that allows for talented winners to emerge and join the ranks of the leaders. Finally, we show that the fitness estimates have potential practical applications in ranking query results. PMID:18779560
What Can Pictures Tell Us About Web Pages? Improving Document Search Using Images.
Rodriguez-Vaamonde, Sergio; Torresani, Lorenzo; Fitzgibbon, Andrew W
2015-06-01
Traditional Web search engines do not use the images in the HTML pages to find relevant documents for a given query. Instead, they typically operate by computing a measure of agreement between the keywords provided by the user and only the text portion of each page. In this paper we study whether the content of the pictures appearing in a Web page can be used to enrich the semantic description of an HTML document and consequently boost the performance of a keyword-based search engine. We present a Web-scalable system that exploits a pure text-based search engine to find an initial set of candidate documents for a given query. Then, the candidate set is reranked using visual information extracted from the images contained in the pages. The resulting system retains the computational efficiency of traditional text-based search engines with only a small additional storage cost needed to encode the visual information. We test our approach on one of the TREC Million Query Track benchmarks where we show that the exploitation of visual content yields improvement in accuracies for two distinct text-based search engines, including the system with the best reported performance on this benchmark. We further validate our approach by collecting document relevance judgements on our search results using Amazon Mechanical Turk. The results of this experiment confirm the improvement in accuracy produced by our image-based reranker over a pure text-based system.
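A sketch of the reranking step under simplifying assumptions (the paper's actual visual features and learning procedure are not reproduced): normalize the text-based and image-based scores and sort the candidate pages by a weighted blend of the two.

```python
import numpy as np

def rerank(text_scores, visual_scores, alpha=0.3):
    """Rerank the text-retrieved candidates by blending in a visual score
    computed from the images embedded in each candidate page."""
    text_scores = np.asarray(text_scores, dtype=float)
    visual_scores = np.asarray(visual_scores, dtype=float)

    def normalize(x):
        # Scale both signals to [0, 1] so the weight alpha is meaningful.
        span = x.max() - x.min()
        return (x - x.min()) / span if span else np.zeros_like(x)

    combined = (1 - alpha) * normalize(text_scores) + alpha * normalize(visual_scores)
    return np.argsort(-combined)          # candidate indices, best first

# Three candidate pages returned by the text engine, rescored with image evidence.
print(rerank(text_scores=[2.1, 1.9, 1.5], visual_scores=[0.2, 0.9, 0.4]))
```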
RCrawler: An R package for parallel web crawling and scraping
NASA Astrophysics Data System (ADS)
Khalil, Salim; Fakir, Mohamed
RCrawler is a contributed R package for domain-based web crawling and content scraping. As the first implementation of a parallel web crawler in the R environment, RCrawler can crawl, parse, store pages, extract contents, and produce data that can be directly employed for web content mining applications. However, it is also flexible, and could be adapted to other applications. The main features of RCrawler are multi-threaded crawling, content extraction, and duplicate content detection. In addition, it includes functionalities such as URL and content-type filtering, depth level controlling, and a robots.txt parser. Our crawler has a highly optimized system, and can download a large number of pages per second while being robust against certain crashes and spider traps. In this paper, we describe the design and functionality of RCrawler, and report on our experience of implementing it in an R environment, including different optimizations that handle the limitations of R. Finally, we discuss our experimental results.
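The sketch below is not RCrawler's R interface; it is a generic Python illustration of the crawl loop the package describes: breadth-first traversal, content-type filtering, depth control, and duplicate-URL detection. A production crawler would also honor robots.txt and run multiple download threads, which are omitted here for brevity.

```python
import re
from collections import deque
from urllib.parse import urljoin, urldefrag
from urllib.request import urlopen

LINK_RE = re.compile(r'href=["\'](.*?)["\']', re.IGNORECASE)

def crawl(seed, max_depth=2, max_pages=50):
    """Breadth-first crawl restricted to HTML pages, with duplicate-URL filtering."""
    seen, pages, queue = set(), {}, deque([(seed, 0)])
    while queue and len(pages) < max_pages:
        url, depth = queue.popleft()
        url, _ = urldefrag(url)                    # drop #fragments before deduplication
        if url in seen or depth > max_depth:
            continue
        seen.add(url)
        try:
            with urlopen(url, timeout=10) as resp:
                if "text/html" not in resp.headers.get("Content-Type", ""):
                    continue                       # content-type filtering
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue                               # skip unreachable pages
        pages[url] = html
        for link in LINK_RE.findall(html):
            queue.append((urljoin(url, link), depth + 1))
    return pages

# pages = crawl("https://example.org/")
```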
Tool independence for the Web Accessibility Quantitative Metric.
Vigo, Markel; Brajnik, Giorgio; Arrue, Myriam; Abascal, Julio
2009-07-01
The Web Accessibility Quantitative Metric (WAQM) aims at accurately measuring the accessibility of web pages. One of the main features of WAQM is that it is evaluation-tool independent for ranking and accessibility-monitoring scenarios. This article proposes a method to attain evaluation tool independence for all foreseeable scenarios. After demonstrating that homepages have a more similar error profile than any other web page in a given web site, 15 homepages were measured with 10,000 different values of WAQM parameters using EvalAccess and LIFT, two automatic evaluation tools for accessibility. A similar procedure was followed with random pages and with several test files, obtaining several tuples that minimise the difference between both tools. One thousand four hundred forty-nine web pages from 15 web sites were measured with these tuples and those values that minimised the difference between the tools were selected. Once the WAQM was tuned, the accessibility of 15 web sites was measured with two metrics for web sites, concluding that even if similar values can be produced, obtaining the same scores is undesirable since evaluation tools behave in a different way.
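A sketch of the tuning idea under stated assumptions: treat each evaluation tool's WAQM score as a black-box function of the metric's parameters, grid-search parameter tuples, and keep the tuple that minimizes the average score difference across a set of pages. The placeholder scoring functions and parameter grids below are invented; they are not the published WAQM formula.

```python
import itertools
import numpy as np

def tune_parameters(score_with_tool_a, score_with_tool_b, pages, a_values, b_values):
    """Grid-search the metric's parameter tuples and keep the tuple that
    minimizes the average score difference between the two evaluation tools."""
    best_params, best_gap = None, float("inf")
    for a, b in itertools.product(a_values, b_values):
        gaps = [abs(score_with_tool_a(p, a, b) - score_with_tool_b(p, a, b)) for p in pages]
        gap = float(np.mean(gaps))
        if gap < best_gap:
            best_params, best_gap = (a, b), gap
    return best_params, best_gap

# Placeholder scoring functions standing in for WAQM applied to each tool's report.
score_a = lambda page, a, b: 100 * np.exp(-a * page["errors_tool_a"] / (b + page["checks"]))
score_b = lambda page, a, b: 100 * np.exp(-a * page["errors_tool_b"] / (b + page["checks"]))
homepages = [{"errors_tool_a": 4, "errors_tool_b": 7, "checks": 120},
             {"errors_tool_a": 1, "errors_tool_b": 2, "checks": 80}]
print(tune_parameters(score_a, score_b, homepages,
                      a_values=np.linspace(1, 20, 10), b_values=[1, 5, 10]))
```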
NASA Astrophysics Data System (ADS)
Herrera, Francisco Javier, Jr.
This study set out to examine how a web-based tool embedded with vocabulary strategies, as part of the science curriculum in a third grade two-way immersion classroom, would aid students' academic vocabulary development. Fourteen students (seven boys, seven girls; ten of whom were English learners) participated in this study. Students utilized web pages as part of their science curriculum on the topic of ecology. The study documented students' use of the web pages as a data-gathering tool on the topic of ecology during science instruction. Students were video- and audio-recorded as they explored the web pages. Results indicated that through the use of the intervention web pages, students significantly improved their knowledge of academic English target words.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... process.'' 2. On Web page 15564, in the third column, first paragraph, remove the fourth sentence. 3. On Web page 15566, in the first column, fourth paragraph, second sentence, revise ``(b)(2)(ii) to read... relating to the authority of OPM's Office of Inspector General.'' Sec. 800.20 [Corrected] 0 13. On Web page...
NASA Technical Reports Server (NTRS)
Garcia, Joseph A.; Smith, Charles A. (Technical Monitor)
1998-01-01
The document consists of a publicly available web site (george.arc.nasa.gov) for Joseph A. Garcia's personal web pages in the AI division. Only general information will be posted and no technical material. All the information is unclassified.
NASA Astrophysics Data System (ADS)
Showstack, Randy
With the growing interest in extreme climate and weather events, the National Oceanic and Atmospheric Administration (NOAA) has set up a one-stop Web site. It includes data on tornadoes, hurricanes, and heavy rainfall, temperature extremes, global climate change, satellite images, and El Niño and La Niña. The Web address is http://www.ncdc.noaa.gov.Another good climate Web site is the La Niña Home Page. Set up by the Environmental and Societal Impacts Group of the National Center for Atmospheric Research, the site includes forecasts, data sources, impacts, and Internet links.
30 CFR 203.3 - Do I have to pay a fee to request royalty relief?
Code of Federal Regulations, 2010 CFR
2010-07-01
... administer royalty relief. (b) You must file all payments electronically through the Pay.gov Web site and you must include a copy of the Pay.gov confirmation receipt page with your application or assessment. The Pay.gov Web site may be accessed through a link on the MMS Offshore Web site at: http://www.mms.gov...
30 CFR 203.3 - Do I have to pay a fee to request royalty relief?
Code of Federal Regulations, 2012 CFR
2012-07-01
... administer royalty relief. (b) You must file all payments electronically through the Pay.gov Web site and you must include a copy of the Pay.gov confirmation receipt page with your application or assessment. The Pay.gov Web site may be accessed through a link on the BSEE Offshore Web site at: http://www.bsee.gov...
30 CFR 203.3 - Do I have to pay a fee to request royalty relief?
Code of Federal Regulations, 2013 CFR
2013-07-01
... administer royalty relief. (b) You must file all payments electronically through the Pay.gov Web site and you must include a copy of the Pay.gov confirmation receipt page with your application or assessment. The Pay.gov Web site may be accessed through a link on the BSEE Offshore Web site at: http://www.bsee.gov...
30 CFR 203.3 - Do I have to pay a fee to request royalty relief?
Code of Federal Regulations, 2014 CFR
2014-07-01
... administer royalty relief. (b) You must file all payments electronically through the Pay.gov Web site and you must include a copy of the Pay.gov confirmation receipt page with your application or assessment. The Pay.gov Web site may be accessed through a link on the BSEE Offshore Web site at: http://www.bsee.gov...
Include Your Patrons in Web Design. Computers in Small Libraries
ERIC Educational Resources Information Center
Roberts, Gary
2005-01-01
Successful Web publishing requires not only technical skills but also a refined sense of taste, a good understanding of design, and strong writing abilities. When designing a library Web page, a person must possess all of these talents and be able to market to a broad spectrum of patrons. As a result, library sites vary widely in their style and…
Informatics in radiology (infoRAD): HTML and Web site design for the radiologist: a primer.
Ryan, Anthony G; Louis, Luck J; Yee, William C
2005-01-01
A Web site has enormous potential as a medium for the radiologist to store, present, and share information in the form of text, images, and video clips. With a modest amount of tutoring and effort, designing a site can be as painless as preparing a Microsoft PowerPoint presentation. The site can then be used as a hub for the development of further offshoots (eg, Web-based tutorials, storage for a teaching library, publication of information about one's practice, and information gathering from a wide variety of sources). By learning the basics of hypertext markup language (HTML), the reader will be able to produce a simple and effective Web page that permits display of text, images, and multimedia files. The process of constructing a Web page can be divided into five steps: (a) creating a basic template with formatted text, (b) adding color, (c) importing images and multimedia files, (d) creating hyperlinks, and (e) uploading one's page to the Internet. This Web page may be used as the basis for a Web-based tutorial comprising text documents and image files already in one's possession. Finally, there are many commercially available packages for Web page design that require no knowledge of HTML.
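As an illustration of steps (a) through (d), the snippet below generates a single minimal page containing formatted text, a background color, an image reference, and a hyperlink. The case content and file names are hypothetical, and the page could equally be written by hand as the article describes.

```python
from pathlib import Path

# A minimal page touching the article's first four steps: formatted text,
# color, an embedded image, and a hyperlink. File names are placeholders.
page = """<!DOCTYPE html>
<html>
  <head>
    <title>Chest Radiograph Teaching Case</title>
    <style>body { background-color: #f5f5f5; color: #222; }</style>
  </head>
  <body>
    <h1>Case 1: Pneumothorax</h1>
    <p><strong>History:</strong> 24-year-old with sudden pleuritic chest pain.</p>
    <img src="case1_cxr.jpg" alt="Frontal chest radiograph" width="480">
    <p><a href="case2.html">Next case</a></p>
  </body>
</html>
"""
Path("case1.html").write_text(page, encoding="utf-8")
```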
Using the web to validate document recognition results: experiments with business cards
NASA Astrophysics Data System (ADS)
Oertel, Clemens; O'Shea, Shauna; Bodnar, Adam; Blostein, Dorothea
2004-12-01
The World Wide Web is a vast information resource which can be useful for validating the results produced by document recognizers. Three computational steps are involved, all of them challenging: (1) use the recognition results in a Web search to retrieve Web pages that contain information similar to that in the document, (2) identify the relevant portions of the retrieved Web pages, and (3) analyze these relevant portions to determine what corrections (if any) should be made to the recognition result. We have conducted exploratory implementations of steps (1) and (2) in the business-card domain: we use fields of the business card to retrieve Web pages and identify the most relevant portions of those Web pages. In some cases, this information appears suitable for correcting OCR errors in the business card fields. In other cases, the approach fails due to stale information: when business cards are several years old and the business-card holder has changed jobs, then websites (such as the home page or company website) no longer contain information matching that on the business card. Our exploratory results indicate that in some domains it may be possible to develop effective means of querying the Web with recognition results, and to use this information to correct the recognition results and/or detect that the information is stale.
Using the web to validate document recognition results: experiments with business cards
NASA Astrophysics Data System (ADS)
Oertel, Clemens; O'Shea, Shauna; Bodnar, Adam; Blostein, Dorothea
2005-01-01
The World Wide Web is a vast information resource which can be useful for validating the results produced by document recognizers. Three computational steps are involved, all of them challenging: (1) use the recognition results in a Web search to retrieve Web pages that contain information similar to that in the document, (2) identify the relevant portions of the retrieved Web pages, and (3) analyze these relevant portions to determine what corrections (if any) should be made to the recognition result. We have conducted exploratory implementations of steps (1) and (2) in the business-card domain: we use fields of the business card to retrieve Web pages and identify the most relevant portions of those Web pages. In some cases, this information appears suitable for correcting OCR errors in the business card fields. In other cases, the approach fails due to stale information: when business cards are several years old and the business-card holder has changed jobs, then websites (such as the home page or company website) no longer contain information matching that on the business card. Our exploratory results indicate that in some domains it may be possible to develop effective means of querying the Web with recognition results, and to use this information to correct the recognition results and/or detect that the information is stale.
Delivering Library Services to Remote Students.
ERIC Educational Resources Information Center
Casado, Margaret
2001-01-01
Discusses library services at the University of Tennessee to reach off-campus and distance education students. Topics include online research; email; library instruction for faculty and students; Web interfaces; fax; telephone service; chat technology; the library's Web page; virtual classrooms; library links from a course management system; and…
Web-based surveillance of public information needs for informing preconception interventions.
D'Ambrosio, Angelo; Agricola, Eleonora; Russo, Luisa; Gesualdo, Francesco; Pandolfi, Elisabetta; Bortolus, Renata; Castellani, Carlo; Lalatta, Faustina; Mastroiacovo, Pierpaolo; Tozzi, Alberto Eugenio
2015-01-01
The risk of adverse pregnancy outcomes can be minimized through the adoption of healthy lifestyles before pregnancy by women of childbearing age. Initiatives for promotion of preconception health may be difficult to implement. The Internet can be used to build tailored health interventions through identification of the public's information needs. To this aim, we developed a semi-automatic web-based system for monitoring Google searches, web pages and activity on social networks, regarding preconception health. Based on the American College of Obstetricians and Gynecologists guidelines and on the actual search behaviors of Italian Internet users, we defined a set of keywords targeting preconception care topics. Using these keywords, we analyzed the usage of the Google search engine and identified web pages containing preconception care recommendations. We also monitored how the selected web pages were shared on social networks. We analyzed discrepancies between searched and published information and the sharing pattern of the topics. We identified 1,807 Google search queries which generated a total of 1,995,030 searches during the study period. Less than 10% of the reviewed pages contained preconception care information and in 42.8% the information was consistent with ACOG guidelines. Facebook was the most used social network for sharing. Nutrition, Chronic Diseases and Infectious Diseases were the most published and searched topics. Regarding Genetic Risk and Folic Acid, a high search volume was not associated with a high web page production, while Medication pages were more frequently published than searched. Vaccinations elicited high sharing although web page production was low; this effect was quite variable in time. Our study represents a resource to prioritize communication on specific topics on the web, to address misconceptions, and to tailor interventions to specific populations.
Web-Based Surveillance of Public Information Needs for Informing Preconception Interventions
D’Ambrosio, Angelo; Agricola, Eleonora; Russo, Luisa; Gesualdo, Francesco; Pandolfi, Elisabetta; Bortolus, Renata; Castellani, Carlo; Lalatta, Faustina; Mastroiacovo, Pierpaolo; Tozzi, Alberto Eugenio
2015-01-01
Background The risk of adverse pregnancy outcomes can be minimized through the adoption of healthy lifestyles before pregnancy by women of childbearing age. Initiatives for promotion of preconception health may be difficult to implement. The Internet can be used to build tailored health interventions through identification of the public's information needs. To this end, we developed a semi-automatic web-based system for monitoring Google searches, web pages and activity on social networks, regarding preconception health. Methods Based on the American College of Obstetricians and Gynecologists guidelines and on the actual search behaviors of Italian Internet users, we defined a set of keywords targeting preconception care topics. Using these keywords, we analyzed the usage of the Google search engine and identified web pages containing preconception care recommendations. We also monitored how the selected web pages were shared on social networks. We analyzed discrepancies between searched and published information and the sharing pattern of the topics. Results We identified 1,807 Google search queries which generated a total of 1,995,030 searches during the study period. Less than 10% of the reviewed pages contained preconception care information, and in 42.8% the information was consistent with ACOG guidelines. Facebook was the most used social network for sharing. Nutrition, Chronic Diseases and Infectious Diseases were the most published and searched topics. Regarding Genetic Risk and Folic Acid, a high search volume was not associated with a high web page production, while Medication pages were more frequently published than searched. Vaccinations elicited high sharing although web page production was low; this effect was quite variable over time. Conclusion Our study represents a resource to prioritize communication on specific topics on the web, to address misconceptions, and to tailor interventions to specific populations. PMID:25879682
Reliability and type of consumer health documents on the World Wide Web: an annotation study.
Martin, Melanie J
2011-01-01
In this paper we present a detailed scheme for annotating medical web pages designed for health care consumers. The annotation is along two axes: first, by reliability (the extent to which the medical information on the page can be trusted); second, by the type of page (patient leaflet, commercial, link, medical article, testimonial, or support). We analyze inter-rater agreement among three judges for each axis. Inter-rater agreement was moderate (0.77 accuracy, 0.62 F-measure, 0.49 Kappa) on the page reliability axis and good (0.81 accuracy, 0.72 F-measure, 0.73 Kappa) along the page type axis. This study shows promising results, indicating that appropriate classes of pages can be defined and used by human annotators to annotate web pages with reasonable to good agreement.
Study on online community user motif using web usage mining
NASA Astrophysics Data System (ADS)
Alphy, Meera; Sharma, Ajay
2016-04-01
Web usage mining is an application of data mining used to extract useful information from online communities. The World Wide Web contained at least 4.73 billion pages according to the Indexed Web, and at least 228.52 million pages according to the Dutch Indexed Web, on Thursday, 6 August 2015. It is difficult to find the data one needs among these billions of web pages, which is where web usage mining becomes important. Personalizing the search engine helps web users identify the most used data easily; it reduces time consumption and supports automatic site search and automatic recall of useful sites. This study surveys techniques for pattern discovery and analysis in web usage mining, from the earliest approaches in 1996 to the latest in 2015. Analyzing user motifs helps improve business, e-commerce, personalisation and website design.
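As a concrete illustration of one basic web-usage-mining step, the sketch below counts page-to-page transitions per visitor from a simplified access log; the log format and pages are assumed for demonstration and are not drawn from the study.

```python
# Illustrative sketch of one basic web-usage-mining step: counting page-to-page
# transitions per visitor from a simplified access log. The log format below is
# an assumption for demonstration purposes.
from collections import Counter, defaultdict

# (visitor_id, requested_page) pairs in time order, hypothetical data.
log = [
    ("u1", "/home"), ("u1", "/forum"), ("u1", "/forum/thread-42"),
    ("u2", "/home"), ("u2", "/search"), ("u2", "/forum"),
    ("u3", "/home"), ("u3", "/forum"),
]

sessions = defaultdict(list)
for visitor, page in log:
    sessions[visitor].append(page)

transitions = Counter()
for pages in sessions.values():
    transitions.update(zip(pages, pages[1:]))  # consecutive page pairs

for (src, dst), count in transitions.most_common():
    print(f"{src} -> {dst}: {count}")
```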
Cleanups In My Community (CIMC) - Federal facilities that are also Superfund sites, National Layer
Federal facilities are properties owned by the federal government. This data layer provides access to Federal facilities that are Superfund sites as part of the CIMC web service. Data are collected using the Superfund Enterprise Management System (SEMS) and transferred to Envirofacts for access by the public. Data about Federal facility Superfund sites are located on their own EPA web pages, and CIMC links to those pages. Links to the relevant web pages for each site are provided within the attribute table. Federal facility sites can be either Superfund sites or RCRA Corrective Action sites, or they may have moved from one program to the other and back. In Cleanups in My Community, you can map or list any of these Federal Facility sites. This data layer shows only those facilities that are Superfund sites. RCRA federal facility sites and other Superfund NPL sites are included in other data layers as part of this web service. Superfund is a program administered by the EPA to locate, investigate, and clean up the worst hazardous waste sites throughout the United States. EPA administers the Superfund program in cooperation with individual states and tribal governments. These sites include abandoned warehouses, manufacturing facilities, processing plants, and landfills - the key word here being abandoned. The CIMC web service was initially published in 2013, but the data are updated on the 18th of each month. The full schedule for data updates in CIMC is located here:
ERIC Educational Resources Information Center
Fernandez-Cardenas, Juan Manuel
2008-01-01
This paper looks at the collaborative construction of web pages in History by a Year-4 group of children in a primary school in the UK. The aim of this paper is to find out: (a) How did children interpret their involvement in this literacy practice? (b) How was the construction of web pages interactionally accomplished? and (c) How can creativity…
Textual and visual content-based anti-phishing: a Bayesian approach.
Zhang, Haijun; Liu, Gang; Chow, Tommy W S; Liu, Wenyin
2011-10-01
A novel framework using a Bayesian approach for content-based phishing web page detection is presented. Our model takes into account textual and visual contents to measure the similarity between the protected web page and suspicious web pages. A text classifier, an image classifier, and an algorithm fusing the results from classifiers are introduced. An outstanding feature of this paper is the exploration of a Bayesian model to estimate the matching threshold. This is required in the classifier for determining the class of the web page and identifying whether the web page is phishing or not. In the text classifier, the naive Bayes rule is used to calculate the probability that a web page is phishing. In the image classifier, the earth mover's distance is employed to measure the visual similarity, and our Bayesian model is designed to determine the threshold. In the data fusion algorithm, the Bayes theory is used to synthesize the classification results from textual and visual content. The effectiveness of our proposed approach was examined in a large-scale dataset collected from real phishing cases. Experimental results demonstrated that the text classifier and the image classifier we designed deliver promising results, the fusion algorithm outperforms either of the individual classifiers, and our model can be adapted to different phishing cases. © 2011 IEEE
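A rough sense of the classifier-plus-fusion idea can be given with a toy sketch: a naive Bayes text score and a visual-similarity score are combined into a single phishing probability. The word likelihoods, prior, threshold, and the simple averaging fusion rule below are illustrative assumptions; they stand in for the paper's EMD-based image classifier and Bayesian fusion rather than reproduce them.

```python
# Hedged sketch of the general idea only (not the paper's implementation):
# combine a naive-Bayes text score and a visual-similarity score into a single
# phishing score. All probabilities below are made-up illustrative numbers.
import math

# P(word | phishing) and P(word | legitimate), toy likelihoods.
likelihoods = {
    "verify":  (0.08, 0.01),
    "account": (0.10, 0.03),
    "login":   (0.09, 0.04),
    "weather": (0.01, 0.05),
}
PRIOR_PHISH = 0.1  # assumed prior probability of phishing

def text_posterior(words):
    """Naive Bayes posterior P(phishing | words) over the toy vocabulary."""
    log_phish = math.log(PRIOR_PHISH)
    log_legit = math.log(1 - PRIOR_PHISH)
    for w in words:
        if w in likelihoods:
            p_ph, p_le = likelihoods[w]
            log_phish += math.log(p_ph)
            log_legit += math.log(p_le)
    m = max(log_phish, log_legit)
    p = math.exp(log_phish - m)
    return p / (p + math.exp(log_legit - m))

def fuse(p_text, visual_similarity):
    """Simple average fusion of the two evidence sources (a stand-in for the
    paper's Bayesian fusion); visual_similarity plays the role of the image
    classifier's output."""
    return 0.5 * p_text + 0.5 * visual_similarity

suspect_words = ["verify", "account", "login"]
p = fuse(text_posterior(suspect_words), visual_similarity=0.9)
print(f"phishing score = {p:.2f}", "-> phishing" if p > 0.5 else "-> benign")
```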
ERIC Educational Resources Information Center
Mitsuhara, Hiroyuki; Kurose, Yoshinobu; Ochi, Youji; Yano, Yoneo
The authors developed a Web-based Adaptive Educational System (Web-based AES) named ITMS (Individualized Teaching Material System). ITMS adaptively integrates knowledge on the distributed Web pages and generates individualized teaching material that has various contents. ITMS also presumes the learners' knowledge levels from the states of their…
A Neophyte Constructs a Web Site: Lessons Learned.
ERIC Educational Resources Information Center
Bent, Devin
1998-01-01
A political science professor at James Madison University (VA) constructed a Web page to support an undergraduate course in government. This article defines Web-site goals and audience, reviews other sites, and discusses organization of Web links and technical choices for HTML editor, page layout and use of image, audio, and video files. Stresses…
EZStream: Distributing Live ISS Experiment Telemetry via Internet
NASA Technical Reports Server (NTRS)
Myers, Gerry; Welch, Clara L. (Technical Monitor)
2002-01-01
This paper presents the high-level architecture and components of the current version of EZStream, as well as the product direction and enhancements to be incorporated through a Phase II grant. Security topics such as data encryption and user login will be addressed. Remote user devices will be discussed, including web browsers on PCs and displays on PDAs and smart cell phones. The interaction between EZStream and TReK will be covered, as well as the eventual ability of EZStream to receive and parse binary data streams directly, which makes EZStream beneficial to both the International Partners and non-NASA applications. Also addressed are the options for developing client-side display web pages and the development of new tools that allow non-programmers to create display web pages.
Some Features of "Alt" Texts Associated with Images in Web Pages
ERIC Educational Resources Information Center
Craven, Timothy C.
2006-01-01
Introduction: This paper extends a series on summaries of Web objects, in this case, the alt attribute of image files. Method: Data were logged from 1894 pages from Yahoo!'s random page service and 4703 pages from the Google directory; an img tag was extracted randomly from each where present; its alt attribute, if any, was recorded; and the…
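The extraction step described above (pull a random img tag from a page and record its alt attribute) can be sketched with the standard-library HTML parser; the sample markup is hypothetical and this is not the author's logging setup.

```python
# Sketch of the extraction step described above: pick one img tag at random
# from a page and record its alt attribute (if any). Uses only the standard
# library; this is not the author's original logging setup.
import random
from html.parser import HTMLParser

class ImgCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.imgs = []
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.imgs.append(dict(attrs))

html = '<p>Map <img src="map.png" alt="Campus map"> and <img src="spacer.gif"></p>'
collector = ImgCollector()
collector.feed(html)

if collector.imgs:
    img = random.choice(collector.imgs)
    print("alt attribute:", img.get("alt", "(none)"))
```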
Enriching the trustworthiness of health-related web pages.
Gaudinat, Arnaud; Cruchet, Sarah; Boyer, Celia; Chrawdhry, Pravir
2011-06-01
We present an experimental mechanism for enriching web content with quality metadata. This mechanism is based on a simple and well-known initiative in the field of the health-related web, the HONcode. The Resource Description Framework (RDF) format and the Dublin Core Metadata Element Set were used to formalize these metadata. The model of trust proposed is based on a quality model for health-related web pages that has been tested in practice over a period of thirteen years. Our model has been explored in the context of a project to develop a research tool that automatically detects the occurrence of quality criteria in health-related web pages.
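A minimal sketch of expressing such quality metadata in RDF with Dublin Core terms is shown below using the rdflib library; the HONcode criterion URI, namespace, and property choices are assumptions for illustration rather than the project's actual vocabulary.

```python
# Illustrative sketch of expressing page-quality metadata in RDF using Dublin
# Core terms with rdflib. The HONcode criterion URI and property choices are
# assumptions for demonstration, not the project's actual vocabulary.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DC, DCTERMS

HON = Namespace("http://example.org/honcode#")  # hypothetical namespace

g = Graph()
page = URIRef("http://example.org/health/asthma-advice.html")  # hypothetical page

g.add((page, DC.creator, Literal("Example Health Foundation")))
g.add((page, DCTERMS.modified, Literal("2011-01-15")))
g.add((page, DCTERMS.conformsTo, HON.AuthorityCriterion))  # e.g. authorship disclosed

print(g.serialize(format="turtle"))
```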
Technological Minimalism: A Cost-Effective Alternative for Course Design and Development.
ERIC Educational Resources Information Center
Lorenzo, George
2001-01-01
Discusses the use of minimum levels of technology, or technological minimalism, for Web-based multimedia course content. Highlights include cost effectiveness; problems with video streaming, the use of XML for Web pages, and Flash and Java applets; listservs instead of proprietary software; and proper faculty training. (LRW)
E-Texts, Mobile Browsing, and Rich Internet Applications
ERIC Educational Resources Information Center
Godwin-Jones, Robert
2007-01-01
Online reading is evolving beyond the perusal of static documents with Web pages inviting readers to become commentators, collaborators, and critics. The much-ballyhooed Web 2.0 is essentially a transition from online consumer to consumer/producer/participant. An online document may well include embedded multimedia or contain other forms of…
A Literature Review of Academic Library Web Page Studies
ERIC Educational Resources Information Center
Blummer, Barbara
2007-01-01
In the early 1990s, numerous academic libraries adopted the web as a communication tool with users. The literature on academic library websites includes research on both design and navigation. Early studies typically focused on design characteristics, since websites initially merely provided information on the services and collections available in…
Content Delivery in the "Blogosphere"
ERIC Educational Resources Information Center
Ferdig, Richard E.; Trammell, Kaye D.
2004-01-01
The interest in new media for teaching and learning has highlighted the potential of innovative software and hardware for education. This has included laptops, hand-helds, wireless systems and Web-based learning environments. Most recently, however, this interest has focused on blogs and blogging. Weblogs, or blogs, are Web pages often likened to…
Untangling the Tangled Webs We Weave: A Team Approach to Cyberspace.
ERIC Educational Resources Information Center
Broidy, Ellen; And Others
Working in a cooperative team environment across libraries and job classifications, librarians and support staff at the University of California at Irvine (UCI) have mounted several successful web projects, including two versions of the Libraries' home page, a virtual reference collection, and Science Library "ANTswer Machine." UCI's…
Net Survey: "Top Ten Mistakes" in Academic Web Design.
ERIC Educational Resources Information Center
Petrik, Paula
2000-01-01
Highlights the top ten mistakes in academic Web design: (1) bloated graphics; (2) scaling images; (3) dense text; (4) lack of contrast; (5) font size; (6) looping animations; (7) courseware authoring software; (8) scrolling/long pages; (9) excessive download; and (10) the nothing site. Includes resources. (CMK)
Manole, Bogdan-Alexandru; Wakefield, Daniel V; Dove, Austin P; Dulaney, Caleb R; Marcrom, Samuel R; Schwartz, David L; Farmer, Michael R
2017-12-24
The purpose of this study was to survey the accessibility and quality of prostate-specific antigen (PSA) screening information from National Cancer Institute (NCI) cancer center and public health organization Web sites. We surveyed the December 1, 2016, version of all 63 NCI-designated cancer center public Web sites and 5 major online clearinghouses from allied public/private organizations (cancer.gov, cancer.org, PCF.org, USPSTF.org, and CDC.gov). Web sites were analyzed according to a 50-item list of validated health care information quality measures. Web sites were graded by 2 blinded reviewers. Interrater agreement was confirmed by Cohen kappa coefficient. Ninety percent of Web sites addressed PSA screening. Cancer center sites covered 45% of topics surveyed, whereas organization Web sites addressed 70%. All organizational Web pages addressed the possibility of false-positive screening results; 41% of cancer center Web pages did not. Forty percent of cancer center Web pages also did not discuss next steps if a PSA test was positive. Only 6% of cancer center Web pages were rated by our reviewers as "superior" (eg, addressing >75% of the surveyed topics) versus 20% of organizational Web pages. Interrater agreement between our reviewers was high (kappa coefficient = 0.602). NCI-designated cancer center Web sites publish lower quality public information about PSA screening than sites run by major allied organizations. Nonetheless, information and communication deficiencies were observed across all surveyed sites. In an age of increasing patient consumerism, prospective prostate cancer patients would benefit from improved online PSA screening information from provider and advocacy organizations. Validated cancer patient Web educational standards remain an important, understudied priority. Copyright © 2018. Published by Elsevier Inc.
30 CFR 203.3 - Do I have to pay a fee to request royalty relief?
Code of Federal Regulations, 2011 CFR
2011-07-01
... electronically through the Pay.gov Web site and you must include a copy of the Pay.gov confirmation receipt page with your application or assessment. The Pay.gov Web site may be accessed through a link on the MMS Offshore Web site at: http://www.mms.gov/offshore/ homepage or directly through Pay.gov at: https://www.pay...
NASA Astrophysics Data System (ADS)
Gross, M. B.; Mayernik, M. S.; Rowan, L. R.; Khan, H.; Boler, F. M.; Maull, K. E.; Stott, D.; Williams, S.; Corson-Rikert, J.; Johns, E. M.; Daniels, M. D.; Krafft, D. B.
2015-12-01
UNAVCO, UCAR, and Cornell University are working together to leverage semantic web technologies to enable discovery of people, datasets, publications and other research products, as well as the connections between them. The EarthCollab project, an EarthCube Building Block, is enhancing an existing open-source semantic web application, VIVO, to address connectivity gaps across distributed networks of researchers and resources related to the following two geoscience-based communities: (1) the Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory (EOL), and (2) UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy. People, publications, datasets and grant information have been mapped to an extended version of the VIVO-ISF ontology and ingested into VIVO's database. Data is ingested using a custom set of scripts that include the ability to perform basic automated and curated disambiguation. VIVO can display a page for every object ingested, including connections to other objects in the VIVO database. A dataset page, for example, includes the dataset type, time interval, DOI, related publications, and authors. The dataset type field provides a connection to all other datasets of the same type. The author's page will show, among other information, related datasets and co-authors. Information previously spread across several unconnected databases is now stored in a single location. In addition to VIVO's default display, the new database can also be queried using SPARQL, a query language for semantic data. EarthCollab will also extend the VIVO web application. One such extension is the ability to cross-link separate VIVO instances across institutions, allowing local display of externally curated information. For example, Cornell's VIVO faculty pages will display UNAVCO's dataset information and UNAVCO's VIVO will display Cornell faculty member contact and position information. Additional extensions, including enhanced geospatial capabilities, will be developed following task-centered usability testing.
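Since the ingested data can be queried with SPARQL, a small example using the SPARQLWrapper library is sketched below; the endpoint URL and the ontology class and properties are assumptions for illustration, not EarthCollab's actual configuration.

```python
# Hedged example of querying a VIVO-style triple store with SPARQL via the
# SPARQLWrapper library. The endpoint URL and the exact ontology terms are
# assumptions for illustration, not EarthCollab's actual configuration.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("http://example.org/vivo/sparql")  # hypothetical endpoint
sparql.setQuery("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    PREFIX vivo: <http://vivoweb.org/ontology/core#>
    SELECT ?dataset ?label WHERE {
        ?dataset a vivo:Dataset ;
                 rdfs:label ?label .
    } LIMIT 10
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["label"]["value"])
```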
NASA Astrophysics Data System (ADS)
Ahlers, Dirk; Boll, Susanne
In recent years, the relation of Web information to a physical location has gained much attention. However, Web content today often carries only an implicit relation to a location. In this chapter, we present a novel location-based search engine that automatically derives spatial context from unstructured Web resources and allows for location-based search: our focused crawler applies heuristics to crawl and analyze Web pages that have a high probability of carrying a spatial relation to a certain region or place; the location extractor identifies the actual location information from the pages; our indexer assigns a geo-context to the pages and makes them available for a later spatial Web search. We illustrate the usage of our spatial Web search for location-based applications that provide information not only right-in-time but also right-on-the-spot.
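The location-extractor idea can be sketched as gazetteer matching over page text; the gazetteer entries and the frequency-based scoring below are illustrative assumptions, not the chapter's actual heuristics.

```python
# Minimal sketch of the location-extraction idea: match page text against a
# small gazetteer and assign the page the best-supported place. The gazetteer
# entries and scoring rule are illustrative assumptions.
import re

GAZETTEER = {  # place name -> (latitude, longitude), hypothetical subset
    "Oldenburg": (53.14, 8.21),
    "Bremen": (53.08, 8.80),
    "Hamburg": (53.55, 9.99),
}

def geo_context(page_text):
    """Count gazetteer mentions and return the most frequent place, if any."""
    counts = {}
    for place in GAZETTEER:
        hits = len(re.findall(rf"\b{re.escape(place)}\b", page_text))
        if hits:
            counts[place] = hits
    if not counts:
        return None
    place = max(counts, key=counts.get)
    return place, GAZETTEER[place]

text = "Our cafe in Oldenburg is near the Oldenburg station; a second shop opened in Bremen."
print(geo_context(text))  # -> ('Oldenburg', (53.14, 8.21)) for this toy text
```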
CrazyEgg Reports for Single Page Analysis
CrazyEgg provides an in-depth look at visitor behavior on one page. While you can use Google Analytics (GA) to do trend analysis of your web area, CrazyEgg helps diagnose the design of a single Web page by visually displaying all visitor clicks during a specified time.
Radiology teaching file cases on the World Wide Web.
Scalzetti, E M
1997-08-01
The presentation of a radiographic teaching file on the World Wide Web can be enhanced by attending to principles of web design. Chief among these are appropriate control of page layout, minimization of the time required to download a page from the remote server, and provision for navigation within and among the web pages that constitute the site. Page layout is easily accomplished by the use of tables; column widths can be fixed to maintain an acceptable line length for text. Downloading time is minimized by rigorous editing and by optimal compression of image files; beyond this, techniques like preloading of images and specification of image width and height are also helpful. Navigation controls should be clear, consistent, and readily available.
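Two of the tactics mentioned (compressing images for web delivery and recording dimensions so width and height can be specified in the page) can be sketched with the Pillow imaging library; the filenames and quality setting are assumptions.

```python
# Sketch of two tactics mentioned above: compressing an image for web delivery
# and recording its dimensions so width/height can be specified in the page.
# Filenames and the quality setting are illustrative assumptions.
from PIL import Image

im = Image.open("chest_pa.jpg")          # hypothetical teaching-file image
width, height = im.size

# Re-save with JPEG compression tuned for web delivery.
im.save("chest_pa_web.jpg", "JPEG", quality=70, optimize=True)

# The dimensions can then be written into the page's img tag, e.g.
# <img src="chest_pa_web.jpg" width="..." height="...">
print(f"width={width} height={height}")
```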
Introduction to the world wide web.
Downes, P K
2007-05-12
The World Wide Web used to be nicknamed the 'World Wide Wait'. Now, thanks to high speed broadband connections, browsing the web has become a much more enjoyable and productive activity. Computers need to know where web pages are stored on the Internet, in just the same way as we need to know where someone lives in order to post them a letter. This section explains how the World Wide Web works and how web pages can be viewed using a web browser.
Does content affect whether users remember that Web pages were hyperlinked?
Jones, Keith S; Ballew, Timothy V; Probst, C Adam
2008-10-01
We determined whether memory for hyperlinks improved when they represented relations between the contents of the Web pages. J. S. Farris (2003) found that memory for hyperlinks improved when they represented relations between the contents of the Web pages. However, Farris's (2003) participants could have used their knowledge of site content to answer questions about relations that were instantiated via the site's content and its hyperlinks. In Experiment 1, users navigated a Web site and then answered questions about relations that were instantiated only via content, only via hyperlinks, and via content and hyperlinks. Unlike Farris (2003), we split the latter into two sets. One asked whether certain content elements were related, and the other asked whether certain Web pages were hyperlinked. Experiment 2 replicated Experiment 1 with one modification: The questions that were asked about relations instantiated via content and hyperlinks were changed so that each question's wrong answer was also related to the question's target. Memory for hyperlinks improved when they represented relations instantiated within the content of the Web pages. This was true when (a) questions about content and hyperlinks were separated (Experiment 1) and (b) each question's wrong answer was also related to the question's target (Experiment 2). The accuracy of users' mental representations of local architecture depended on whether hyperlinks were related to the site's content. Designers who want users to remember hyperlinks should associate those hyperlinks with content that reflects the relation between the contents on the Web pages.
Parents on the web: risks for quality management of cough in children.
Pandolfini, C; Impicciatore, P; Bonati, M
2000-01-01
Health information on the Internet, with respect to common, self-limited childhood illnesses, has been found to be unreliable. Therefore, parents navigating on the Internet risk finding advice that is incomplete or, more importantly, not evidence-based. The importance of the Internet as a resource for quality health information for consumers should, however, be taken into consideration. For this reason, studies need to be performed regarding the quality of material provided. Various strategies have been proposed that would allow parents to distinguish trustworthy web documents from unreliable ones. One of these strategies is the use of a checklist for the appraisal of web pages based on their technical aspects. The purpose of this study was to assess the quality of information present on the Internet regarding the home management of cough in children and to examine the applicability of a checklist strategy that would allow consumers to select more trustworthy web pages. The Internet was searched for web pages regarding the home treatment of cough in children with the use of different search engines. Medline and the Cochrane database were searched for available evidence concerning the management of cough in children. Three checklists were created to assess different aspects of the web documents. The first checklist was designed to allow for a technical appraisal of the web pages and was based on components such as the name of the author and references used. The second was constructed to examine the completeness of the health information contained in the documents, such as causes and mechanism of cough, and pharmacological and nonpharmacological treatment. The third checklist assessed the quality of the information by measuring it against a gold standard document. This document was created by combining the policy statement issued by the American Academy of Pediatrics regarding the pharmacological treatment of cough in children with the guide of the World Health Organization on drugs for children. For each checklist, the web page contents were analyzed and quantitative measurements were assigned. Of the 19 web pages identified, 9 explained the purpose and/or mechanism of cough and 14 the causes. The most frequently mentioned pharmacological treatments were single-ingredient suppressant preparations, followed by single-ingredient expectorants. Dextromethorphan was the most commonly referred-to suppressant and guaifenesin the most common expectorant. No documents discouraged the use of suppressants, although 4 of the 10 web documents that addressed expectorants discouraged their use. Sixteen web pages addressed nonpharmacological treatment, 14 of which suggested exposure to a humid environment and/or extra fluid. In most cases, the criteria in the technical appraisal checklist were not present in the web documents; moreover, 2 web pages did not provide any of the items. Regarding content completeness, 3 web pages satisfied all the requirements considered in the checklist and 2 documents did not meet any of the criteria. Of the 3 web pages that scored highest in technical aspect, 2 also supplied complete information. No relationship was found, however, between the technical aspect and the content completeness. Concerning the quality of the health information supplied, 10 pages received a negative score because they contained more incorrect than correct information, and 1 web page received a high score.
This document was 1 of the 2 that also scored high in technical aspect and content completeness. No relationship was found, however, among quality of information, technical aspect, and content completeness. As the results of this study show, a parent navigating the Internet for information on the home management of cough in children will no doubt find incorrect advice among the search results. (ABSTRACT TRUNCATED)
Transactional interactive multimedia banner
NASA Astrophysics Data System (ADS)
Shae, Zon-Yin; Wang, Xiping; von Kaenel, Juerg
2000-05-01
Advertising in TV broadcasting has shown that multimedia is a very effective means to present merchandise and attract shoppers. This has been applied to the Web by including animated multimedia banner ads on web pages. However, the issues of coupling interactive browsing, shopping, and secure transactions, e.g. from inside a multimedia banner, have only recently started to be explored. Currently there is a rapidly growing number of back-end services available on the Internet (e.g., business-to-business (B2B) commerce, business-to-consumer (B2C) commerce, and infomercial services). These services are mostly accessible through static HTML web pages at a few specific web portals. In this paper, we investigate the feasibility of using interactive multimedia banners as pervasive access points for B2C, B2B, and infomercial services. We present a system architecture that involves a layer of middleware agents functioning as the bridge between the interactive multimedia banners and back-end services.
ERIC Educational Resources Information Center
Carpi, Anthony
2001-01-01
Explains the advantages of using the World Wide Web as an educational tool and describes the Natural Science Pages project which is a teaching module involving Internet access and Web use and aiming to improve student achievement. (Contains 13 references.) (YDS)
NASA Astrophysics Data System (ADS)
Bloom, Jeffrey A.; Alonso, Rafael
2003-06-01
There are two primary challenges to monitoring the Web for steganographic media: finding suspect media and examining those found. The challenge that has received a great deal of attention is the second of these, the steganalysis problem. The other challenge, and one that has received much less attention, is the search problem. How does the steganalyzer get the suspect media in the first place? This paper describes an innovative method and architecture to address this search problem. The typical approaches to searching the web for covert communications are often based on the concept of "crawling" the Web via a smart "spider." Such spiders find new pages by following ever-expanding chains of links from one page to many next pages. Rather than seek pages by chasing links from other pages, we find candidate pages by identifying requests to access pages. To do this we monitor traffic on Internet backbones, identify and log HTTP requests, and use this information to guide our process. Our approach has the advantages that we examine pages to which no links exist, we examine pages as soon as they are requested, and we concentrate resources only on active pages, rather than examining pages that are never viewed.
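The request-driven search idea can be sketched as follows: extract requested URLs from logged HTTP request lines and queue candidate media for later steganalysis; the log-line format and the media-type filter are assumptions for illustration.

```python
# Illustrative sketch of the request-driven approach: pull requested URLs out
# of logged HTTP request lines and queue candidate media for later
# steganalysis. The log line format and the media-type filter are assumptions.
import re
from collections import deque

LOG_LINES = [  # hypothetical captured request lines
    "GET http://example.org/photos/holiday.jpg HTTP/1.1",
    "GET http://example.org/index.html HTTP/1.1",
    "GET http://another.example/banner.png HTTP/1.1",
]

REQUEST_RE = re.compile(r"^GET\s+(\S+)\s+HTTP/\d\.\d$")
MEDIA_SUFFIXES = (".jpg", ".jpeg", ".png", ".gif", ".bmp")

candidates = deque()
for line in LOG_LINES:
    m = REQUEST_RE.match(line)
    if m and m.group(1).lower().endswith(MEDIA_SUFFIXES):
        candidates.append(m.group(1))  # hand off to the steganalyzer later

print(list(candidates))
```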
A Study towards Building An Optimal Graph Theory Based Model For The Design of Tourism Website
NASA Astrophysics Data System (ADS)
Panigrahi, Goutam; Das, Anirban; Basu, Kajla
2010-10-01
An effective tourism website is key to attracting tourists from different parts of the world. Here we identify the factors that improve the effectiveness of a website by considering it as a graph, where the web pages, including the homepage, are the nodes and the hyperlinks are the edges between the nodes. In this model, the design constraints for building a tourism website are taken into consideration. Our objectives are to build a framework for an effective tourism website that provides an adequate level of information and service, and to enable users to reach the desired page with minimal loading time. This paper also proposes an information hierarchy specifying the upper limit on outgoing links per page; following the hierarchy, a web developer can prepare an effective tourism website. Loading time depends on page size and network traffic; we assume network traffic is uniform, so loading time is directly proportional to page size. The approach works by quantifying the link structure of a tourism website, and we also propose a page size distribution pattern for a tourism website.
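The graph model can be sketched with networkx: pages as nodes with assumed sizes, hyperlinks as directed edges, an assumed out-degree limit, and load time estimated as proportional to the total size of the pages along a navigation path; all numbers below are illustrative.

```python
# Sketch of the graph model described above using networkx: pages are nodes
# (with assumed sizes in KB), hyperlinks are directed edges. We check an
# assumed out-degree limit and estimate the load time of a navigation path
# under the assumption that load time is proportional to page size.
import networkx as nx

MAX_OUT_LINKS = 4          # assumed upper limit of outgoing links per page
KB_PER_SECOND = 100.0      # assumed uniform network throughput

G = nx.DiGraph()
pages = {"home": 120, "destinations": 80, "hotels": 90, "booking": 60}
for page, size_kb in pages.items():
    G.add_node(page, size_kb=size_kb)
G.add_edges_from([("home", "destinations"), ("home", "hotels"),
                  ("destinations", "hotels"), ("hotels", "booking")])

# Check the information-hierarchy constraint on outgoing links.
for page in G.nodes:
    assert G.out_degree(page) <= MAX_OUT_LINKS, f"{page} has too many links"

# Estimated time for a user to reach the booking page from the homepage.
path = nx.shortest_path(G, "home", "booking")
load_time = sum(G.nodes[p]["size_kb"] for p in path) / KB_PER_SECOND
print(path, f"~{load_time:.1f} s")
```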
Multigraph: Reusable Interactive Data Graphs
NASA Astrophysics Data System (ADS)
Phillips, M. B.
2010-12-01
There are surprisingly few good software tools available for presenting time series data on the internet. The most common practice is to use a desktop program such as Excel or Matlab to save a graph as an image which can be included in a web page like any other image. This disconnects the graph from the data in a way that makes updating a graph with new data a cumbersome manual process, and it limits the user to one particular view of the data. The Multigraph project defines an XML format for describing interactive data graphs, and software tools for creating and rendering those graphs in web pages and other internet connected applications. Viewing a Multigraph graph is extremely simple and intuitive, and requires no instructions; the user can pan and zoom by clicking and dragging, in a familiar "Google Maps" kind of way. Creating a new graph for inclusion in a web page involves writing a simple XML configuration file. Multigraph can read data in a variety of formats, and can display data from a web service, allowing users to "surf" through large data sets, downloading only those parts of the data that are needed for display. The Multigraph XML format, or "MUGL" for short, provides a concise description of the visual properties of a graph, such as axes, plot styles, data sources, labels, etc., as well as interactivity properties such as how and whether the user can pan or zoom along each axis. Multigraph reads a file in this format, draws the described graph, and allows the user to interact with it. Multigraph software currently includes a Flash application for embedding graphs in web pages, a Flex component for embedding graphs in larger Flex/Flash applications, and a plugin for creating graphs in the WordPress content management system. Plans for the future include a Java version for desktop viewing and editing, a command line version for batch and server side rendering, and possibly Android and iPhone versions. Multigraph is currently in use on several web sites including the US Drought Portal (www.drought.gov), the NOAA Climate Services Portal (www.climate.gov), the Climate Reference Network (www.ncdc.noaa.gov/crn), NCDC's State of the Climate Report (www.ncdc.noaa.gov/sotc), and the US Forest Service's Forest Change Assessment Viewer (ews.forestthreats.org/NPDE/NPDE.html). More information about Multigraph is available from the web site www.multigraph.org. Interactive Multigraph Display of Real Time Weather Data
User preference as quality markers of paediatric web sites.
Hernández-Borges, Angel A; Macías-Cervi, Pablo; Gaspar-Guardado, Asunción; Torres-Alvarez De Arcaya, María Luisa; Ruíz-Rabaza, Ana; Jiménez-Sosa, Alejandro
2003-09-01
Little is known about the ability of internet users to distinguish the best medical resources online, and how their preferences, measured by usage and popularity indexes, correlate with established quality criteria. Our objective was to analyse whether the number of inbound links and/or daily visits to a sample of paediatric web pages are reliable quality markers of the pages. Two-year follow-up study of 363 web pages with paediatric information. The number of inbound links and the average number of daily visits to the pages were calculated on a yearly basis. In addition, their rates of compliance with the codes of conduct, guidelines and/or principles of three international organizations were evaluated. The quality code most widely met by the sample web pages was the Health on the Net Foundation Code of Conduct (overall rate, 60.2%). Sample pages showed a low degree of compliance with principles related to privacy, confidentiality and electronic commerce (overall rate less than 45%). Most importantly, we observed a moderate, significant correlation between compliance with quality criteria and the number of inbound links (p < 0.001). However, no correlation was found between the number of daily visits to a page and its degree of compliance with the principles. Some indexes derived from the analysis of webmasters' hyperlinks could be reliable quality markers of medical web resources.
New York Times Current News Physics Applications
NASA Astrophysics Data System (ADS)
Cise, John
2010-03-01
Since 2007 I have been using NYTimes news articles rich in graphics and physics variables to develop edited one-page web physics questions (http://CisePhysics.homestead.com/files/NYT.htm) based on current events in the news. The home page listed above currently contains ten pages, each with about 40 one-page edited news-related physics articles containing rich graphics, graphic edits by the author, edited article text, an introduction to a question, questions, and answers. I use these web pages to introduce new physics concepts to students with current applications of the concepts in the news. I also use these one-page physics applications as pop quizzes and extra credit for students. As news happens (e.g., the 2010 Vancouver Olympics), I find the physics applications in the NYTimes articles and generate applications and questions. These new one-page applications with questions are added to the home page: http://CisePhysics.homestead.com/files/NYT.htm The newest pages start with page 10 and work back in time to 9, 8, etc. The ten web pages, with about 40 news articles per page, are arranged in the traditional manner: vectors, kinematics, projectiles, Newton, work and energy, properties of matter, fluids, temperature, heat, waves, and sound. This site is listed as a resource on AAPT's ComPADRE site.
A web-based approach for electrocardiogram monitoring in the home.
Magrabi, F; Lovell, N H; Celler, B G
1999-05-01
A Web-based electrocardiogram (ECG) monitoring service, in which a longitudinal clinical record is used for the management of patients, is described. The Web application is used to collect clinical data from the patient's home. A database on the server acts as a central repository where this clinical information is stored. A Web browser provides access to the patient's records and ECG data. We discuss the technologies used to automate the retrieval and storage of clinical data from a patient database, and the recording and reviewing of clinical measurement data. On the client's Web browser, ActiveX controls embedded in the Web pages provide a link between the various components, including the Web server, the Web page, the specialised client-side ECG review and acquisition software, and the local file system. The ActiveX controls also implement FTP functions to retrieve and submit clinical data to and from the server. An intelligent software agent on the server is activated whenever new ECG data is sent from the home. The agent compares historical data with newly acquired data. Using this method, an optimum patient care strategy can be evaluated, and a summarised report, along with reminders and suggestions for action, is sent to the doctor and patient by email.
Integrated and Applied Curricula Discussion Group and Data Base Project. Final Report.
ERIC Educational Resources Information Center
Wisconsin Univ. - Stout, Menomonie. Center for Vocational, Technical and Adult Education.
A project was conducted to compile integrated and applied curriculum resources, develop databases on the World Wide Web, and encourage networking for high school and technical college educators through an Internet discussion group. Activities conducted during the project include the creation of a web page to guide users to resource banks…
Digital Content: The Babel of Cyberspace.
ERIC Educational Resources Information Center
Bruce, Bertram
1999-01-01
Takes a fanciful journey into the digital library imagined by Jorge Luis Borges, and uses it as a metaphor to examine what sort of library the World Wide Web is. Examines how digital libraries are growing and what they mean for literacy education. Includes a description of a particular Web page, and a glossary. (SR)
Federal Register 2010, 2011, 2012, 2013, 2014
2017-12-12
... be in the form of a web page for a fictitious drug targeted toward consumers who have chronic pain or... prescription drug information in multiple formats--including print, television, web, and other modes--would be... framed items. Moreover, the slider questions referenced by the commenter are semantic differentials...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-20
..., authorized the National Senior Center under 49 U.S.C. 5314(c). In recognition of the fundamental importance..., Capacity and experience for conducting face-to-face and Web-based training. IV. Proposal Submission... tasks, including capacity and experience for conducting face-to-face and Web- based [[Page 78973...
Future Trends in Children's Web Pages: Probing Hidden Biases for Information Quality
ERIC Educational Resources Information Center
Kurubacak, Gulsun
2007-01-01
As global digital communication continues to flourish, Children's Web pages become more critical for children to realize not only the surface but also breadth and deeper meanings in presenting these milieus. These pages not only are very diverse and complex but also enable intense communication across social, cultural and political restrictions…
76 FR 77203 - Notice of Intent To Seek Approval To Collect Information
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-12
... Web pages created and maintained by component organizations of the NAL. On average, 2 million people... interest collections have established a Web presence with a home page and links to sub-pages that provide... Semantic Differential Scale or multiple choice questions, and no more than 4 open-ended response questions...
Evaluating Information Quality: Hidden Biases on the Children's Web Pages
ERIC Educational Resources Information Center
Kurubacak, Gulsun
2006-01-01
As global digital communication continues to flourish, the Children's Web pages become more critical for children to realize not only the surface but also breadth and deeper meanings in presenting these milieus. These pages not only are very diverse and complex but also enable intense communication across social, cultural and political…
Collaborative Design of World Wide Web Pages: A Case Study.
ERIC Educational Resources Information Center
Andrew, Paige G; Musser, Linda R.
1997-01-01
This case study of the collaborative design of an earth science World Wide Web page at Pennsylvania State University highlights the role of librarians. Discusses the original Web site and links, planning, the intended audience, and redesign and recommended changes; and considers the potential contributions of librarians. (LRW)
ERIC Educational Resources Information Center
Galica, Carol
1997-01-01
Provides an annotated bibliography of selected NASA Web sites for K-12 math and science teachers: the NASA Lewis Research Center Learning Technologies K-12 Home Page, Spacelink, NASA Quest, Basic Aircraft Design Page, International Space Station, NASA Shuttle Web Site, LIFTOFF to Space Education, Telescopes in Education, and Space Educator's…
Knowledge-driven enhancements for task composition in bioinformatics.
Sutherland, Karen; McLeod, Kenneth; Ferguson, Gus; Burger, Albert
2009-10-01
A key application area of semantic technologies is the fast-developing field of bioinformatics. Sealife was a project within this field with the aim of creating semantics-based web browsing capabilities for the Life Sciences. This includes meaningfully linking significant terms from the text of a web page to executable web services. It also involves the semantic mark-up of biological terms, linking them to biomedical ontologies, then discovering and executing services based on terms that interest the user. A system was produced which allows a user to identify terms of interest on a web page and subsequently connects these to a choice of web services which can make use of these inputs. Elements of Artificial Intelligence Planning build on this to present a choice of higher level goals, which can then be broken down to construct a workflow. An Argumentation System was implemented to evaluate the results produced by three different gene expression databases. An evaluation of these modules was carried out on users from a variety of backgrounds. Users with little knowledge of web services were able to achieve tasks that used several services in much less time than they would have taken to do this manually. The Argumentation System was also considered a useful resource and feedback was collected on the best way to present results. Overall the system represents a move forward in helping users to both construct workflows and analyse results by incorporating specific domain knowledge into the software. It also provides a mechanism by which web pages can be linked to web services. However, this work covers a specific domain and much co-ordinated effort is needed to make all web services available for use in such a way, i.e. the integration of underlying knowledge is a difficult but essential task.
Content and Design Features of Academic Health Sciences Libraries' Home Pages.
McConnaughy, Rozalynd P; Wilson, Steven P
2018-01-01
The goal of this content analysis was to identify commonly used content and design features of academic health sciences library home pages. After developing a checklist, data were collected from 135 academic health sciences library home pages. The core components of these library home pages included a contact phone number, a contact email address, an Ask-a-Librarian feature, the physical address listed, a feedback/suggestions link, subject guides, a discovery tool or database-specific search box, multimedia, social media, a site search option, a responsive web design, and a copyright year or update date.
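A checklist of this kind can be applied mechanically; the sketch below flags a few of the listed features in a home page's HTML with simple pattern matching, where the patterns and sample markup are crude assumptions rather than the authors' coding scheme.

```python
# Rough sketch of checklist-style coding: flag a few of the features listed
# above in a home page's HTML with simple pattern matching. The patterns and
# sample markup are crude assumptions, not the authors' actual coding scheme.
import re

CHECKLIST = {
    "contact phone number": r"\(?\d{3}\)?[-. ]\d{3}[-. ]\d{4}",
    "contact email address": r"[\w.+-]+@[\w-]+\.[\w.]+",
    "ask-a-librarian feature": r"ask[- ]a[- ]librarian",
    "social media": r"facebook|twitter|instagram",
    "site search option": r"<input[^>]*type=[\"']search[\"']",
}

html = """<html><body>
  <a href="mailto:reference@example.edu">reference@example.edu</a>
  <p>Call us: 803-555-0123</p>
  <a href="/ask">Ask-a-Librarian</a>
</body></html>"""  # hypothetical home page

for feature, pattern in CHECKLIST.items():
    present = bool(re.search(pattern, html, flags=re.IGNORECASE))
    print(f"{feature}: {'yes' if present else 'no'}")
```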
30 CFR 585.500 - How do I make payments under this part?
Code of Federal Regulations, 2014 CFR
2014-07-01
... your electronic payments through the Fees for Services page on the BOEM Web site at http://www.boem.gov, and you must include one copy of the Pay.gov confirmation receipt page with your unsolicited request... Sale Notice/depends on bid With bid Pay.Gov § 585.501. Bonus Balance Lease issuance 30 CFR 1218.51 (2...
Class Projects on the Internet.
ERIC Educational Resources Information Center
Nicholson, Danny
1996-01-01
Discusses the use of the Internet in the classroom. Presents a project on renewable energy sources in which students produce web pages. Provides the web page address of the project completed by students. (ASK)
Frank, M S; Dreyer, K
2001-06-01
We describe a working software technology that enables educators to incorporate their expertise and teaching style into highly interactive and Socratic educational material for distribution on the world wide web. A graphically oriented interactive authoring system was developed to enable the computer novice to create and store within a database his or her domain expertise in the form of electronic knowledge. The authoring system supports and facilitates the input and integration of several types of content, including free-form, stylized text, miniature and full-sized images, audio, and interactive questions with immediate feedback. The system enables the choreography and sequencing of these entities for display within a web page as well as the sequencing of entire web pages within a case-based or thematic presentation. Images or segments of text can be hyperlinked with point-and-click to other entities such as adjunctive web pages, audio, or other images, cases, or electronic chapters. Miniature (thumbnail) images are automatically linked to their full-sized counterparts. The authoring system contains a graphically oriented word processor, an image editor, and capabilities to automatically invoke and use external image-editing software such as Photoshop. The system works in both local area network (LAN) and internet-centric environments. An internal metalanguage (invisible to the author but stored with the content) was invented to represent the choreographic directives that specify the interactive delivery of the content on the world wide web. A database schema was developed to objectify and store both this electronic knowledge and its associated choreographic metalanguage. A database engine was combined with page-rendering algorithms in order to retrieve content from the database and deliver it on the web in a Socratic style, assess the recipient's current fund of knowledge, and provide immediate feedback, thus stimulating in-person interaction with a human expert. This technology enables the educator to choreograph a stylized, interactive delivery of his or her message using multimedia components assembled in virtually any order, spanning any number of web pages for a given case or theme. An educator can thus exercise precise influence on specific learning objectives, embody his or her personal teaching style within the content, and ultimately enhance its educational impact. The described technology amplifies the efforts of the educator and provides a more dynamic and enriching learning environment for web-based education.
Making EPA's PDF documents accessible (by Section 508 standards) and user-friendly includes steps such as adding bookmarks, using electronic conversion rather than scanning pages, and adding metadata.
Creating and Maintaining Data-Driven Course Web Sites.
ERIC Educational Resources Information Center
Heines, Jesse M.
This paper deals with techniques for reducing the amount of work that needs to be redone each semester when one prepares an existing course Web site for a new class. The key concept is algorithmic generation of common page elements while still allowing full control over page content via WYSIWYG tools like Microsoft FrontPage and Macromedia…
Google Wave: Collaboration Reworked
ERIC Educational Resources Information Center
Rethlefsen, Melissa L.
2010-01-01
Over the past several years, Internet users have become accustomed to Web 2.0 and cloud computing-style applications. It's commonplace and even intuitive to drag and drop gadgets on personalized start pages, to comment on a Facebook post without reloading the page, and to compose and save documents through a web browser. The web paradigm has…
81 FR 40262 - Notice of Intent To Seek Approval To Collect Information
Federal Register 2010, 2011, 2012, 2013, 2014
2016-06-21
... their level of satisfaction with existing services. The NAL Internet sites are a vast collection of Web pages. NAL Web pages are visited by an average of 8.6 million people per month. All NAL Information Centers have an established web presence that provides information to their respective audiences...
An Expertise Recommender using Web Mining
NASA Technical Reports Server (NTRS)
Joshi, Anupam; Chandrasekaran, Purnima; ShuYang, Michelle; Ramakrishnan, Ramya
2001-01-01
This report explored techniques to mine the web pages of scientists to extract information regarding their expertise, build expertise chains and referral webs, and semi-automatically combine this information with directory information services to create a recommender system that permits query by expertise. The approach included experimenting with existing techniques reported in the research literature in the recent past and adapting them as needed. In addition, software tools were developed to capture and use this information.
Saunders, Brian; Lyon, Stephen; Day, Matthew; Riley, Brenda; Chenette, Emily; Subramaniam, Shankar; Vadivelu, Ilango
2008-01-01
The UCSD-Nature Signaling Gateway Molecule Pages (http://www.signaling-gateway.org/molecule) provides essential information on more than 3800 mammalian proteins involved in cellular signaling. The Molecule Pages contain expert-authored and peer-reviewed information based on the published literature, complemented by regularly updated information derived from public data source references and sequence analysis. The expert-authored data includes both a full-text review about the molecule, with citations, and highly structured data for bioinformatics interrogation, including information on protein interactions and states, transitions between states and protein function. The expert-authored pages are anonymously peer reviewed by the Nature Publishing Group. The Molecule Pages data is present in an object-relational database format and is freely accessible to the authors, the reviewers and the public from a web browser that serves as a presentation layer. The Molecule Pages are supported by several applications that along with the database and the interfaces form a multi-tier architecture. The Molecule Pages and the Signaling Gateway are routinely accessed by a very large research community.
A Virtual Tour of the Radio Astronomy Process
NASA Astrophysics Data System (ADS)
Conrad, S. B.; Finley, D. G.; Claussen, M. J.; Ulvestad, J. S.
2000-12-01
In the summer of 2000, two teachers working on a Masters of Science Teaching Degree at New Mexico Tech and participating in the Research Experience for Teachers (RET) program sponsored by the National Science Foundation spent eight weeks as interns researching and working on projects at the National Radio Astronomy Observatory (NRAO) that will directly benefit students in their classrooms and also impact other science educators. One of the products of the internships is a set of web pages for NRAO's web page educational section. The purpose of these web pages is to familiarize students, teachers, and other people with the process that a radio astronomer goes through to do radio astronomy science. A virtual web tour was created of this process. This required interviewing radio astronomers and other professionals involved with this process at the NRAO (e.g. engineers, data analysts, and operations people), and synthesizing the interviews into a descriptive, visual-based set of web pages. These pages meet the National as well as New Mexico Standards and Benchmarks for Science Education. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc. The NSF's RET program is gratefully acknowledged.
Total Petroleum Hydrocarbons (TPH): ToxFAQs
Web Spam, Social Propaganda and the Evolution of Search Engine Rankings
NASA Astrophysics Data System (ADS)
Metaxas, Panagiotis Takis
Search Engines have greatly influenced the way we experience the web. Since the early days of the web, users have been relying on them to get informed and make decisions. When the web was relatively small, web directories were built and maintained using human experts to screen and categorize pages according to their characteristics. By the mid 1990's, however, it was apparent that the human expert model of categorizing web pages does not scale. The first search engines appeared and they have been evolving ever since, taking over the role that web directories used to play.
Grouping of Items in Mobile Web Questionnaires
ERIC Educational Resources Information Center
Mavletova, Aigul; Couper, Mick P.
2016-01-01
There is some evidence that a scrolling design may reduce breakoffs in mobile web surveys compared to a paging design, but there is little empirical evidence to guide the choice of the optimal number of items per page. We investigate the effect of the number of items presented on a page on data quality in two types of questionnaires: with or…
Network and User-Perceived Performance of Web Page Retrievals
NASA Technical Reports Server (NTRS)
Kruse, Hans; Allman, Mark; Mallasch, Paul
1998-01-01
The development of the HTTP protocol has been driven by the need to improve the network performance of the protocol by allowing the efficient retrieval of multiple parts of a web page without the need for multiple simultaneous TCP connections between a client and a server. We suggest that the retrieval of multiple page elements sequentially over a single TCP connection may result in a degradation of the perceived performance experienced by the user. We attempt to quantify this perceived degradation through the use of a model which combines a web retrieval simulation and an analytical model of TCP operation. Starting with the current HTTP/1.1 specification, we first suggest a client-side heuristic to improve the perceived transfer performance. We show that the perceived speed of the page retrieval can be increased without sacrificing data transfer efficiency. We then propose a new client/server extension to the HTTP/1.1 protocol to allow for the interleaving of page element retrievals. We finally address the issue of the display of advertisements on web pages, and in particular suggest a number of mechanisms which can make efficient use of IP multicast to send advertisements to a number of clients within the same network.
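A minimal sketch of the sequential, single-connection retrieval the abstract discusses, using Python's standard http.client over one persistent HTTP/1.1 connection; the host and element paths are hypothetical, and this is an illustration rather than the authors' simulation model.

```python
# Minimal sketch: fetch several page elements sequentially over one
# persistent HTTP/1.1 connection, timing each retrieval. The host and
# element paths are hypothetical placeholders.
import http.client
import time

HOST = "www.example.com"                               # hypothetical server
ELEMENTS = ["/index.html", "/style.css", "/logo.png"]  # hypothetical page parts

conn = http.client.HTTPConnection(HOST, timeout=10)
try:
    for path in ELEMENTS:
        start = time.monotonic()
        conn.request("GET", path)      # reuses the same TCP connection
        resp = conn.getresponse()
        body = resp.read()             # drain the body before the next request
        elapsed = time.monotonic() - start
        print(f"{path}: {resp.status}, {len(body)} bytes in {elapsed:.3f}s")
finally:
    conn.close()
```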
Quality of consumer-targeted internet guidance on home firearm and ammunition storage.
Freundlich, Katherine L; Skoczylas, Maria Shakour; Schmidt, John P; Keshavarzi, Nahid R; Mohr, Bethany Anne
2016-10-01
Four storage practices protect against unintentional and/or self-inflicted firearm injury among children and adolescents: keeping guns locked (1) and unloaded (2) and keeping ammunition locked up (3) and in a separate location from the guns (4). Our aim was to mimic common Google search strategies on firearm/ammunition storage and assess whether the resulting web pages provided recommendations consistent with those supported by the literature. We identified 87 web pages by Google search of the 10 most commonly used search terms in the USA related to firearm/ammunition storage. Two non-blinded independent reviewers analysed web page technical quality according to a 17-item checklist derived from previous studies. A single reviewer analysed readability by US grade level assigned by Flesch-Kincaid Grade Level Index. Two separate, blinded, independent reviewers analysed deidentified web page content for accuracy and completeness describing the four accepted storage practices. Reviewers resolved disagreements by consensus. The web pages described, on average, less than one of four accepted storage practices (mean 0.2 (95% CL 0.1 to 0.4)). Only two web pages (2%) identified all four practices. Two web pages (2%) made assertions inconsistent with recommendations; both implied that loaded firearms could be stored safely. Flesch-Kincaid Grade Level Index averaged 8.0 (95% CL 7.3 to 8.7). The average technical quality score was 7.1 (95% CL 6.8 to 7.4) out of an available score of 17. There was a high degree of agreement between reviewers regarding completeness (weighted κ 0.78 (95% CL 0.61 to 0.97)). The internet currently provides incomplete information about safe firearm storage. Understanding existing deficiencies may inform future strategies for improvement. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
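Several studies in this collection report Flesch-Kincaid Grade Level scores. A minimal sketch of the standard formula, 0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59, is shown below; the syllable counter is a crude vowel-group heuristic rather than the dictionary-based counting a readability tool would use.

```python
# Minimal sketch of the Flesch-Kincaid Grade Level formula used in several
# of the studies above. The syllable counter is a crude vowel-group
# heuristic, not a dictionary-based count.
import re

def count_syllables(word: str) -> int:
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n_words = max(1, len(words))
    return 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59

sample = "Keep guns locked and unloaded. Store ammunition locked and separate."
print(round(flesch_kincaid_grade(sample), 1))
```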
A New and Improved MPB Web Site
NASA Astrophysics Data System (ADS)
Warner, Brian D.
2018-01-01
The Minor Planet Bulletin home page has a new URL: http://www.MinorPlanet.info/MPB/mpb.php. The new home page features free access (data rates may apply) to almost all papers from Volume 1 (1973) to present. Also included are a basic search feature that allows finding papers by title/abstract and/or authors and links to download the MPB authors guide and cumulative indices.
Semantic Advertising for Web 3.0
NASA Astrophysics Data System (ADS)
Thomas, Edward; Pan, Jeff Z.; Taylor, Stuart; Ren, Yuan; Jekjantuk, Nophadol; Zhao, Yuting
Advertising on the World Wide Web is based around automatically matching web pages with appropriate advertisements, in the form of banner ads, interactive adverts, or text links. Traditionally this has been done by manual classification of pages, or more recently using information retrieval techniques to find the most important keywords from the page, and match these to keywords being used by adverts. In this paper, we propose a new model for online advertising, based around lightweight embedded semantics. This will improve the relevancy of adverts on the World Wide Web and help to kick-start the use of RDFa as a mechanism for adding lightweight semantic attributes to the Web. Furthermore, we propose a system architecture for the proposed new model, based on our scalable ontology reasoning infrastructure TrOWL.
ERIC Educational Resources Information Center
Dimopoulos, Kostas; Asimakopoulos, Apostolos
2010-01-01
This study aims to explore navigation patterns and preferred pages' characteristics of ten secondary school students searching the web for information about cloning. The students navigated the Web for as long as they wished in a context of minimum support of teaching staff. Their navigation patterns were analyzed using audit trail data software.…
Web Site On a Budget: How to Find an Affordable Home for Your Pages.
ERIC Educational Resources Information Center
Callihan, Steven E.
1996-01-01
Offers advice for choosing an Internet provider: consider the amount of time, effort, and expertise one has, coupled with the complexity of the Web page, which impact price and choice of provider; and question providers about server speed, ports, architecture, traffic levels, fee structures, and registration of domain names. Lists 33 Web presence…
Building an Ajax Application from Scratch
ERIC Educational Resources Information Center
Clark, Jason A.
2006-01-01
The author of this article suggests that to refresh Web pages and online library catalogs in a more pleasing way, Ajax, an acronym for Asynchronous JavaScript and XML, should be used. Ajax is the way to use Web technologies that work together to refresh sections of Web pages to allow almost instant responses to user input. This article describes…
Castillo-Ortiz, Jose Dionisio; de Jesus Valdivia-Nuno, Jose; Ramirez-Gomez, Andrea; Garagarza-Mariscal, Heber; Gallegos-Rios, Carlos; Flores-Hernandez, Gabriel; Hernandez-Sanchez, Luis; Brambila-Barba, Victor; Castaneda-Sanchez, Jose Juan; Barajas-Ochoa, Zalathiel; Suarez-Rico, Angel; Sanchez-Gonzalez, Jorge Manuel; Ramos-Remus, Cesar
2016-09-01
The aim of this study was to assess the changes in the characteristics of rheumatoid arthritis information on the Internet over a 15-year period and the positioning of Web sites posted by universities, hospitals, and medical associations. We replicated the methods of a 2001 study assessing rheumatoid arthritis information on the Internet using WebCrawler. All Web sites and pages were critically assessed for relevance, scope, authorship, type of publication, and financial objectives. Differences between studies were considered significant if 95 % confidence intervals did not overlap. Additionally, we added a Google search with assessments of the quality of content of web pages and of the Web sites posted by medical institutions. There were significant differences between the present study's WebCrawler search and the 2001-referent study. There were increases in information sites (82 vs 36 %) and rheumatoid arthritis-specific discussion pages (59 vs 8 %), and decreases in advertisements (2 vs 48 %) and alternative therapies (27 vs 45 %). The quality of content of web pages is still dispersed; just 37 % were rated as good. Among the first 300 hits, 30 (10 %) were posted by medical institutions, 17 of them in the USA. Regarding readability, 7 % of these 30 web pages required 6 years, 27 % required 7-9 years, 27 % required 10-12 years, and 40 % required 12 or more years of schooling. The Internet has evolved in the last 15 years. Medical institutions are also better positioned. However, there are still areas for improvement, such as the quality of the content, leadership of medical institutions, and readability of information.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-30
... prescribed by FINRA, on their Web sites, social media pages, and any comparable Internet presence, and on Web sites, social media pages, and any comparable Internet presence relating to a member's investment...
ToxGuides: Quick Reference Pocket Guide for Toxicological Profiles
WebWatcher: Machine Learning and Hypertext
1995-05-29
WebWatcher: Machine Learning and Hypertext. Thorsten Joachims, Tom Mitchell, Dayne Freitag, and Robert Armstrong, School of Computer Science, Carnegie... an HTML page about machine learning in which we inserted a hyperlink to WebWatcher. The user follows this hyperlink and gets to a page which...
Proteopedia: Exciting Advances in the 3D Encyclopedia of Biomolecular Structure
NASA Astrophysics Data System (ADS)
Prilusky, Jaime; Hodis, Eran; Sussman, Joel L.
Proteopedia is a collaborative, 3D web-encyclopedia of protein, nucleic acid and other structures. Proteopedia ( http://www.proteopedia.org ) presents 3D biomolecule structures in a broadly accessible manner to a diverse scientific audience through easy-to-use molecular visualization tools integrated into a wiki environment that anyone with a user account can edit. We describe recent advances in the web resource in the areas of content and software. In terms of content, we describe a large growth in user-added content as well as improvements in automatically-generated content for all PDB entry pages in the resource. In terms of software, we describe new features ranging from the capability to create pages hidden from public view to the capability to export pages for offline viewing. New software features also include an improved file-handling system and availability of biological assemblies of protein structures alongside their asymmetric units.
Space Weather Services and Products of RWC Russia in 2007
NASA Astrophysics Data System (ADS)
Burov, Viatcheslav; Avdyushin, Sergei; Denisova, Valentina
RWC Russia (Institute of Applied Geophysics) is the forecasting center that unites the activity of the National Heliogeophysic Service of Russia and the Regional Warning Center of ISES. The Center has been operating since 1974. Several services carry out the gathering, processing and dissemination of the total information data flow (including both Russian and foreign-exchange data) and of forecasts. Forecasting results are issued in the form of special messages, the major part of which corresponds to standard codes. The current data and forecasts are presented on our Web page, www.geospace.ru. At present both a weekly 7-day geomagnetic forecast and the actual disturbance activity information for the previous week are available on the Web page, as are the data of some ionosphere and magnetic stations. Various types of our forecast alerts and routine observations are considered in the report.
Guide to the Internet. The world wide web.
Pallen, M.
1995-01-01
The world wide web provides a uniform, user friendly interface to the Internet. Web pages can contain text and pictures and are interconnected by hypertext links. The addresses of web pages are recorded as uniform resource locators (URLs), transmitted by hypertext transfer protocol (HTTP), and written in hypertext markup language (HTML). Programs that allow you to use the web are available for most operating systems. Powerful on line search engines make it relatively easy to find information on the web. Browsing through the web--"net surfing"--is both easy and enjoyable. Contributing to the web is not difficult, and the web opens up new possibilities for electronic publishing and electronic journals. PMID:8520402
The use of the World Wide Web by medical journals in 2003 and 2005: an observational study.
Schriger, David L; Ouk, Sripha; Altman, Douglas G
2007-01-01
The 2- to 6-page print journal article has been the standard for 200 years, yet this format severely limits the amount of detailed information that can be conveyed. The World Wide Web provides a low-cost option for posting extended text and supplementary information. It also can enhance the experience of journal editors, reviewers, readers, and authors through added functionality (eg, online submission and peer review, postpublication critique, and e-mail notification of table of contents.) Our aim was to characterize ways that journals were using the World Wide Web in 2005 and note changes since 2003. We analyzed the Web sites of 138 high-impact print journals in 3 ways. First, we compared the print and Web versions of March 2003 and 2005 issues of 28 journals (20 of which were randomly selected from the 138) to determine how often articles were published Web only and how often print articles were augmented by Web-only supplements. Second, we examined what functions were offered by each journal Web site. Third, for journals that offered Web pages for reader commentary about each article, we analyzed the number of comments and characterized these comments. Fifty-six articles (7%) in 5 journals were Web only. Thirteen of the 28 journals had no supplementary online content. By 2005, several journals were including Web-only supplements in >20% of their papers. Supplementary methods, tables, and figures predominated. The use of supplementary material increased by 5% from 2% to 7% in the 20-journal random sample from 2003 to 2005. Web sites had similar functionality with an emphasis on linking each article to related material and e-mailing readers about activity related to each article. There was little evidence of journals using the Web to provide readers an interactive experience with the data or with each other. Seventeen of the 138 journals offered rapid-response pages. Only 18% of eligible articles had any comments after 5 months. Journal Web sites offer similar functionality. The use of online-only articles and online-only supplements is increasing.
Time Series Data Visualization in World Wide Telescope
NASA Astrophysics Data System (ADS)
Fay, J.
WorldWide Telescope provides a rich set of time-series visualizations for both archival and real-time data. WWT consists of both interactive desktop tools for immersive visualization and HTML5 web-based controls that can be utilized in customized web pages. WWT supports a range of display options including full dome, power walls, stereo and virtual reality headsets.
Reese Sorenson's Individual Professional Page
NASA Technical Reports Server (NTRS)
Sorenson, Reese; Nixon, David (Technical Monitor)
1998-01-01
The subject document is a World Wide Web (WWW) page entitled "Reese Sorenson's Individual Professional Page." It can be accessed at "http://george.arc.nasa.gov/sorenson/personal/index.html". The purpose of this page is to make the reader aware of me, who I am, and what I do. It lists my work assignments, my computer experience, my place in the NASA hierarchy, publications by me, awards received by me, my education, and how to contact me. Writing this page was a learning experience, pursuant to an element in my Job Description which calls for me to be able to use the latest computers. This web page contains very little technical information, none of which is classified or sensitive.
ERIC Educational Resources Information Center
Rocha, Tania; Bessa, Maximino; Goncalves, Martinho; Cabral, Luciana; Godinho, Francisco; Peres, Emanuel; Reis, Manuel C.; Magalhaes, Luis; Chalmers, Alan
2012-01-01
Background: One of the most mentioned problems of web accessibility, as recognized in several different studies, is related to the difficulty regarding the perception of what is or is not clickable in a web page. In particular, a key problem is the recognition of hyperlinks by a specific group of people, namely those with intellectual…
SurveyWiz and factorWiz: JavaScript Web pages that make HTML forms for research on the Internet.
Birnbaum, M H
2000-05-01
SurveyWiz and factorWiz are Web pages that act as wizards to create HTML forms that enable one to collect data via the Web. SurveyWiz allows the user to enter survey questions or personality test items with a mixture of text boxes and scales of radio buttons. One can add demographic questions of age, sex, education, and nationality with the push of a button. FactorWiz creates the HTML for within-subjects, two-factor designs as large as 9 x 9, or higher order factorial designs up to 81 cells. The user enters levels of the row and column factors, which can be text, images, or other multimedia. FactorWiz generates the stimulus combinations, randomizes their order, and creates the page. In both programs HTML is displayed in a window, and the user copies it to a text editor to save it. When uploaded to a Web server and supported by a CGI script, the created Web pages allow data to be collected, coded, and saved on the server. These programs are intended to assist researchers and students in quickly creating studies that can be administered via the Web.
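A minimal sketch of the kind of factorial-form generation factorWiz performs: crossing the levels of two factors, randomizing the order of the resulting cells, and emitting a radio-button item for each cell. The factor levels, rating scale, and HTML layout below are invented for illustration and do not reproduce the actual factorWiz output.

```python
# Minimal sketch of factorial-form generation in the spirit of factorWiz:
# cross two factors, shuffle the resulting cells, and emit a radio-button
# item for each cell. Levels, scale, and layout are illustrative only.
import html
import itertools
import random

rows = ["low price", "medium price", "high price"]           # hypothetical factor A
cols = ["1-star review", "3-star review", "5-star review"]   # hypothetical factor B
scale = range(1, 10)                                          # 9-point rating scale

cells = list(itertools.product(rows, cols))
random.shuffle(cells)                                         # randomize presentation order

items = []
for i, (a, b) in enumerate(cells, start=1):
    buttons = " ".join(
        f'<input type="radio" name="item{i}" value="{v}">{v}' for v in scale
    )
    items.append(f"<p>{i}. {html.escape(a)} with a {html.escape(b)}<br>{buttons}</p>")

form = ('<form method="post" action="/cgi-bin/save">\n'
        + "\n".join(items)
        + '\n<input type="submit" value="Submit">\n</form>')
print(form)
```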
Future View: Web Navigation based on Learning User's Browsing Strategy
NASA Astrophysics Data System (ADS)
Nagino, Norikatsu; Yamada, Seiji
In this paper, we propose a Future View system that assists a user's usual Web browsing. The Future View prefetches Web pages based on the user's browsing strategies and presents them to the user in order to assist Web browsing. To learn the user's browsing strategy, the Future View uses two types of learning classifier systems: a content-based classifier system for content change patterns and an action-based classifier system for user action patterns. The results of learning are applied to crawling by Web robots, and the gathered Web pages are presented to the user through a Web browser interface. We experimentally show the effectiveness of navigation using the Future View.
ERIC Educational Resources Information Center
Metz, Ray E.; Junion-Metz, Gail
This book provides basic information about the World Wide Web and serves as a guide to the tools and techniques needed to browse the Web, integrate it into library services, or build an attractive, user-friendly home page for the library. Chapter 1 provides an overview of Web basics and chapter 2 discusses some of the big issues related to…
Key-phrase based classification of public health web pages.
Dolamic, Ljiljana; Boyer, Célia
2013-01-01
This paper describes and evaluates a public health web page classification model based on key-phrase extraction and matching. Easily extensible both to new classes and to new languages, this method proves to be a good solution for text classification in the face of a total lack of training data. To evaluate the proposed solution we used a small collection of public health related web pages created by a double-blind manual classification. Our experiments have shown that by choosing an adequate threshold value the desired value for either precision or recall can be achieved.
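A minimal sketch of key-phrase matching with a score threshold, the general approach the abstract describes; the classes, phrase lists, and threshold are invented for illustration and are not the paper's actual model.

```python
# Minimal sketch of key-phrase classification with a score threshold.
# Classes, key phrases, and threshold are invented for illustration.
CLASS_PHRASES = {
    "vaccination": ["vaccine", "immunization", "hpv", "measles"],
    "nutrition": ["diet", "vitamin", "pregnancy nutrition", "folate"],
    "firearm safety": ["gun storage", "ammunition", "firearm", "gun lock"],
}

def classify(text: str, threshold: int = 2) -> list[str]:
    text = text.lower()
    labels = []
    for label, phrases in CLASS_PHRASES.items():
        score = sum(1 for p in phrases if p in text)
        if score >= threshold:   # the threshold trades precision against recall
            labels.append(label)
    return labels

page = "Advice on gun storage: keep the firearm unloaded and ammunition locked away."
print(classify(page))            # -> ['firearm safety']
```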
NASA Astrophysics Data System (ADS)
Macleod, Christopher Kit; Braga, Joao; Arts, Koen; Ioris, Antonio; Han, Xiwu; Sripada, Yaji; van der Wal, Rene
2016-04-01
The number of local, national and international networks of online environmental sensors is rapidly increasing. Where environmental data are made available online for public consumption, there is a need to advance our understanding of the relationships between the supply of and the different demands for such information. Understanding how individuals and groups of users are using online information resources may provide valuable insights into their activities and decision making. As part of the 'dot.rural wikiRivers' project we investigated the potential of web analytics and an online survey to generate insights into the use of a national network of river level data from across Scotland. These sources of online information were collected alongside phone interviews with volunteers sampled from the online survey, and interviews with providers of online river level data, as part of a larger project that set out to help improve the communication of Scotland's online river data. Our web analytics analysis was based on over 100 online sensors which are maintained by the Scottish Environmental Protection Agency (SEPA). Through use of Google Analytics data accessed via the R Ganalytics package we assessed: whether the quality of data provided by Google Analytics' free service is good enough for research purposes; whether we could demonstrate what sensors were being used, when and where; how the nature and pattern of sensor data may affect web traffic; and whether we can identify and profile these users based on information from traffic sources. Web analytics data consist of a series of quantitative metrics which capture and summarize various dimensions of the traffic to a certain web page or set of pages. Examples of commonly used metrics include the number of total visits to a site and the number of total page views. Our analyses of the traffic sources from 2009 to 2011 identified several different major user groups. To improve our understanding of how the use of this national network of river level data may provide insights into the interactions between individuals and their usage of hydrological information, we ran an online survey linked to the SEPA river level pages for one year. We collected over 2000 complete responses to the survey. The survey included questions on user activities and the importance of river level information for those activities, alongside questions on what additional information they used in their decision making (e.g. precipitation) and when and what river pages they visited. In this presentation we will present results from our analysis of the web analytics and online survey, and the insights they provide to understanding user groups of this national network of river level data.
ERIC Educational Resources Information Center
Jackson, Mary E.
2002-01-01
Explains portals as tools that gather a variety of electronic information resources, including local library resources, into a single Web page. Highlights include cross-database searching; integration with university portals and course management software; the ARL (Association of Research Libraries) Scholars Portal Initiative; and selected vendors…
An experiment with content distribution methods in touchscreen mobile devices.
Garcia-Lopez, Eva; Garcia-Cabot, Antonio; de-Marcos, Luis
2015-09-01
This paper compares the usability of three different content distribution methods (scrolling, paging and internal links) in touchscreen mobile devices as means to display web documents. Usability is operationalized in terms of effectiveness, efficiency and user satisfaction. These dimensions are then measured in an experiment (N = 23) in which users are required to find words in regular-length web documents. Results suggest that scrolling is statistically better in terms of efficiency and user satisfaction. It is also found to be more effective but results were not significant. Our findings are also compared with existing literature to propose the following guideline: "try to use vertical scrolling in web pages for mobile devices instead of paging or internal links, except when the content is too large, then paging is recommended". With an ever increasing number of touchscreen web-enabled mobile devices, this new guideline can be relevant for content developers targeting the mobile web as well as institutions trying to improve the usability of their content for mobile platforms. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Containers and Packaging: Product-Specific Data
This web page provides data on the different containers and packaging products in our municipal solid waste, including containers of all types, such as glass, steel, plastic, aluminum, wood, and other types of packaging.
Martin-Facklam, Meret; Kostrzewa, Michael; Martin, Peter; Haefeli, Walter E
2004-01-01
The generally poor quality of health information on the world wide web (WWW) has caused preventable adverse outcomes. Quality management of information on the internet is therefore critical given its widespread use. In order to develop strategies for the safe use of drugs, we scored the general and content quality of pages about sildenafil and performed an intervention to improve their quality. The internet was searched with Yahoo and AltaVista for pages about sildenafil, and 303 pages were included. For assessment of content quality, a score based on accuracy and completeness of essential drug information was assigned. For assessment of general quality, four criteria were evaluated and their association with high content quality was determined by multivariate logistic regression analysis. The pages were randomly allocated to either the control or the intervention group. Evaluation took place before, as well as 7 and 22 weeks after, an intervention which consisted of two letters with individualized feedback on the respective page, sent electronically to the address mentioned on the page. Providing references to scientific publications or prescribing information was significantly associated with high content quality (odds ratio: 8.2, 95% CI 3.2, 20.5). The intervention had no influence on general or content quality. Individualized feedback to the address mentioned on the page was thus ineffective in preventing adverse outcomes caused by misinformation on the WWW. Currently, the most straightforward approach is probably to inform lay persons about indicators of high information quality, i.e. the provision of references.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The system is developed to collect, process, store and present the information provided by radio frequency identification (RFID) devices. The system contains three parts: the application software, the database and the web page. The application software manages multiple RFID devices, such as readers and portals, simultaneously. It communicates with the devices through an application programming interface (API) provided by the device vendor. The application software converts data collected by the RFID readers and portals to readable information. It is capable of encrypting data using the 256-bit advanced encryption standard (AES). The application software has a graphical user interface (GUI). The GUI mimics the configurations of the nuclear material storage sites or transport vehicles. The GUI gives the user and system administrator an intuitive way to read the information and/or configure the devices. The application software is capable of sending the information to a remote, dedicated and secured web and database server. Two captured screen samples, one for storage and one for transport, are attached. The database is constructed to handle a large number of RFID tag readers and portals. A SQL server is employed for this purpose. An XML script is used to update the database once the information is sent from the application software. The design of the web page imitates the design of the application software. The web page retrieves data from the database and presents it in different panels. The user needs a user name combined with a password to access the web page. The web page is capable of sending e-mail and text messages based on preset criteria, such as when alarm thresholds are exceeded. A captured screen sample is attached. The application software is designed to be installed on a local computer. The local computer is directly connected to the RFID devices and can be controlled locally or remotely. There are multiple local computers managing different sites or transport vehicles. Control from remote sites and transmission of information to a central database server take place over a secured internet connection. The information stored in the central database server is shown on the web page. Users can view the web page on the internet. A dedicated and secured web and database server (https) is used to provide information security.
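The record above states that the application software encrypts data with 256-bit AES before transmission to the central server. A minimal sketch follows, assuming AES-GCM from the Python cryptography package; the record does not state which AES mode or key-management scheme was used, and the payload fields are illustrative.

```python
# Minimal sketch of 256-bit AES encryption of an RFID reader payload before
# transmission, using AES-GCM from the "cryptography" package. The mode,
# key management, and field names are assumptions for illustration.
import json
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # in practice, provisioned per site
aead = AESGCM(key)

reading = {"tag_id": "E200-1234", "portal": "storage-bay-3", "ts": "2014-07-01T12:00:00Z"}
plaintext = json.dumps(reading).encode()

nonce = os.urandom(12)                      # unique nonce per message
ciphertext = aead.encrypt(nonce, plaintext, None)

# ...send nonce + ciphertext to the central database server over HTTPS...
recovered = json.loads(aead.decrypt(nonce, ciphertext, None))
print(recovered["tag_id"])
```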
Accessibility and quality of online information for pediatric orthopaedic surgery fellowships.
Davidson, Austin R; Murphy, Robert F; Spence, David D; Kelly, Derek M; Warner, William C; Sawyer, Jeffrey R
2014-12-01
Pediatric orthopaedic fellowship applicants commonly use online-based resources for information on potential programs. Two primary sources are the San Francisco Match (SF Match) database and the Pediatric Orthopaedic Society of North America (POSNA) database. We sought to determine the accessibility and quality of information that could be obtained by using these 2 sources. The online databases of the SF Match and POSNA were reviewed to determine the availability of embedded program links or external links for the included programs. If not available in the SF Match or POSNA data, Web sites for listed programs were located with a Google search. All identified Web sites were analyzed for accessibility, content volume, and content quality. At the time of online review, 50 programs, offering 68 positions, were listed in the SF Match database. Although 46 programs had links included with their information, 36 (72%) of them simply listed http://www.sfmatch.org as their unique Web site. Ten programs (20%) had external links listed, but only 2 (4%) linked directly to the fellowship web page. The POSNA database does not list any links to the 47 programs it lists, which offer 70 positions. On the basis of a Google search of the 50 programs listed in the SF Match database, web pages were found for 35. Of programs with independent web pages, all had a description of the program and 26 (74%) described their application process. Twenty-nine (83%) listed research requirements, 22 (63%) described the rotation schedule, and 12 (34%) discussed the on-call expectations. A contact telephone number and/or email address was provided by 97% of programs. Twenty (57%) listed both the coordinator and fellowship director, 9 (26%) listed the coordinator only, 5 (14%) listed the fellowship director only, and 1 (3%) had no contact information given. The SF Match and POSNA databases provide few direct links to fellowship Web sites, and individual program Web sites either do not exist or do not effectively convey information about the programs. Improved accessibility and accurate information online would allow potential applicants to obtain information about pediatric fellowships in a more efficient manner.
Software tools for developing an acoustics multimedia CD-ROM
NASA Astrophysics Data System (ADS)
Bigelow, Todd W.; Wheeler, Paul A.
2003-10-01
A multimedia CD-ROM was developed to accompany the textbook, Science of Sound, by Tom Rossing. This paper discusses the multimedia elements included in the CD-ROM and the various software packages used to create them. PowerPoint presentations with an audio-track background were converted to web pages using Impatica. Animations of acoustic examples and quizzes were developed using Flash by Macromedia. Vegas Video and Sound Forge by Sonic Foundry were used for editing video and audio clips while Cleaner by Discreet was used to compress the clips for use over the internet. Math tutorials were presented as whiteboard presentations using Hitachi's Starboard to create the graphics and TechSmith's Camtasia Studio to record the presentations. The CD-ROM is in a web-page format created with Macromedia's Dreamweaver. All of these elements are integrated into a single course supplement that can be viewed by any computer with a web browser.
Collaborative GIS for flood susceptibility mapping: An example from Mekong river basin of Viet Nam
NASA Astrophysics Data System (ADS)
Thanh, B.
2016-12-01
Flooding is one of the most dangerous natural disasters in Vietnam. Floods have caused serious damage to people and have had adverse impacts on socio-economic development across the country, especially in lower river basins where there is a high risk of flooding as a consequence of climate change and social activities. This paper presents a collaborative platform combining an interactive web-GIS framework and a multi-criteria evaluation (MCE) tool. MCE is carried out on the server side through a web interface, in which the parameters used for evaluation are grouped into three major categories: (1) climatic factors: precipitation, typhoon frequency, temperature, humidity; (2) physiographic data: DEM, topographic wetness index, NDVI, stream power index, soil texture, distance to river; (3) social factors: NDBI, land use pattern. The web-based GIS is built on open-source technology and includes an information page, a page for the MCE tool where users can interactively alter parameters in flood susceptibility mapping, and a discussion page. The system is designed for local participation in prediction of the flood risk magnitude under the impacts of natural processes and human intervention. The proposed flood susceptibility assessment prototype was implemented in the Mekong river basin, Viet Nam. Index images were calculated using Landsat data, and other data were collected from authorized agencies. This study shows the potential of combining web-GIS and spatial analysis tools for flood hazard risk assessment. The combination can be a supportive solution that assists interaction between stakeholders in information exchange and in disaster management, thus providing for better analysis, control and decision-making.
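A minimal sketch of the weighted linear combination at the heart of many MCE tools: each criterion grid is normalized to 0-1, inverted where lower values mean higher susceptibility, and summed with user-adjustable weights. The grids, weights, and inversion choices below are illustrative, not the study's parameters.

```python
# Minimal sketch of weighted linear combination for multi-criteria evaluation:
# normalize each criterion raster to 0-1 and sum with adjustable weights.
# The criteria grids, weights, and inversion choices are illustrative only.
import numpy as np

def normalize(grid: np.ndarray) -> np.ndarray:
    lo, hi = grid.min(), grid.max()
    return (grid - lo) / (hi - lo) if hi > lo else np.zeros_like(grid, dtype=float)

rng = np.random.default_rng(0)
criteria = {                                      # stand-ins for real rasters
    "precipitation": rng.random((4, 4)) * 2500,   # mm/year
    "elevation": rng.random((4, 4)) * 30,         # m
    "distance_to_river": rng.random((4, 4)) * 5,  # km
}
weights = {"precipitation": 0.5, "elevation": 0.3, "distance_to_river": 0.2}
inverted = {"elevation", "distance_to_river"}     # lower value = more susceptible

susceptibility = np.zeros((4, 4))
for name, grid in criteria.items():
    score = normalize(grid)
    if name in inverted:
        score = 1.0 - score
    susceptibility += weights[name] * score

print(np.round(susceptibility, 2))                # 0 (low) .. 1 (high) index
```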
ERIC Educational Resources Information Center
Moskowitz, Steven
2004-01-01
In fall 2002 the Brewster Central School District introduced teacher Web pages to a teaching staff of more than 300. One of the major goals of the project was to improve teacher computer literacy. Approximately one year prior to this project, the professional staff was asked by the district technology committee to complete a technology survey so…
Romano, Ron; Baum, Neil
2014-01-01
Having a Web page and a blog site are the minimum requirements for an Internet presence in the new millennium. However, a Web page that loads on a personal computer or a laptop will be ineffective on a mobile or cellular phone. Today, with more existing and potential patients having access to cellular technology, it is necessary to reconfigure the appearance of your Web site that appears on a mobile phone. This article discusses mobile computing and suggestions for improving the appearance of your Web site on a mobile or cellular phone.
Interstellar Initiative Web Page Design
NASA Technical Reports Server (NTRS)
Mehta, Alkesh
1999-01-01
This summer at NASA/MSFC, I have contributed to two projects: Interstellar Initiative Web Page Design and Lenz's Law Relative Motion Demonstration. In the Web Design Project, I worked on an Outline. The Web Design Outline was developed to provide a foundation for a Hierarchy Tree Structure. The Outline would help design a Website information base for future and near-term missions. The Website would give in-depth information on Propulsion Systems and Interstellar Travel. The Lenz's Law Relative Motion Demonstrator is discussed in this volume by Russell Lee.
ERIC Educational Resources Information Center
Kupersmith, John
2003-01-01
Examines special-purpose entry points to library Web sites. Discusses in-house homepages; branch-specific pages or single library system-wide pages; staff use pages; versions in different languages; "MyLibrary" pages where users can customize the menu; standalone "branded" sites; publicly accessible pages; and best practices.…
ERIC Educational Resources Information Center
Corson-Finnerty, Adam
2000-01-01
Discusses how the Internet can be a valuable new fund-raising tool for libraries and other non-profit organizations. Topics include designing effective Web pages, the use of e-mail, and the use of permission marketing to build a constituency. (Author/LRW)
Referencing web pages and e-journals.
Bryson, David
2013-12-01
One of the areas that can confuse students and authors alike is how to reference web pages and electronic journals (e-journals). The aim of this professional development article is to go back to first principles for referencing and to show, with examples, how these should be referenced.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-29
... answers to 5 questions. NIOSH has also created a new NIOSH Cancer and RELs Policy Web Topic Page [see http... be available on the NIOSH Web page at http://www.cdc.gov/niosh/docket , and comments will be...
McDonough, Brianna; Felter, Elizabeth; Downes, Amia; Trauth, Jeanette
2015-04-01
Pregnant and postpartum women have special needs during public health emergencies but often have inadequate levels of disaster preparedness. Thus, improving maternal emergency preparedness is a public health priority. More research is needed to identify the strengths and weaknesses of various approaches to how preparedness information is communicated to these women. A sample of web pages from the Centers for Disease Control and Prevention intended to address the preparedness needs of pregnant and postpartum populations was examined for suitability for this audience. Five of the 7 web pages examined were considered adequate. One web page was considered not suitable, and for one the raters were split between not suitable and adequate. None of the resources examined were considered superior. If these resources are considered some of the best available to pregnant and postpartum women, more work is needed to improve the suitability of educational resources, especially for audiences with low literacy and low incomes.
Userscripts for the life sciences.
Willighagen, Egon L; O'Boyle, Noel M; Gopalakrishnan, Harini; Jiao, Dazhi; Guha, Rajarshi; Steinbeck, Christoph; Wild, David J
2007-12-21
The web has seen an explosion of chemistry and biology related resources in the last 15 years: thousands of scientific journals, databases, wikis, blogs and resources are available with a wide variety of types of information. There is a huge need to aggregate and organise this information. However, the sheer number of resources makes it unrealistic to link them all in a centralised manner. Instead, search engines to find information in those resources flourish, and formal languages like Resource Description Framework and Web Ontology Language are increasingly used to allow linking of resources. A recent development is the use of userscripts to change the appearance of web pages, by on-the-fly modification of the web content. This opens possibilities to aggregate information and computational results from different web resources into the web page of one of those resources. Several userscripts are presented that enrich biology and chemistry related web resources by incorporating or linking to other computational or data sources on the web. The scripts make use of Greasemonkey-like plugins for web browsers and are written in JavaScript. Information from third-party resources is extracted using open Application Programming Interfaces, while common Universal Resource Locator schemes are used to make deep links to related information in those external resources. The userscripts presented here use a variety of techniques and resources, and show the potential of such scripts. This paper discusses a number of userscripts that aggregate information from two or more web resources. Examples are shown that enrich web pages with information from other resources, and show how information from web pages can be used to link to, search, and process information in other resources. Due to the nature of userscripts, scientists are able to select those scripts they find useful on a daily basis, as the scripts run directly in their own web browser rather than on the web server. This flexibility allows scientists to tune the features of web resources to optimise their productivity.
Multigraph: Interactive Data Graphs on the Web
NASA Astrophysics Data System (ADS)
Phillips, M. B.
2010-12-01
Many aspects of geophysical science involve time dependent data that is often presented in the form of a graph. Considering that the web has become a primary means of communication, there are surprisingly few good tools and techniques available for presenting time-series data on the web. The most common solution is to use a desktop tool such as Excel or Matlab to create a graph which is saved as an image and then included in a web page like any other image. This technique is straightforward, but it limits the user to one particular view of the data, and disconnects the graph from the data in a way that makes updating a graph with new data an often cumbersome manual process. This situation is somewhat analogous to the state of mapping before the advent of GIS. Maps existed only in printed form, and creating a map was a laborious process. In the last several years, however, the world of mapping has experienced a revolution in the form of web-based and other interactive computer technologies, so that it is now commonplace for anyone to easily browse through gigabytes of geographic data. Multigraph seeks to bring a similar ease of access to time series data. Multigraph is a program for displaying interactive time-series data graphs in web pages that includes a simple way of configuring the appearance of the graph and the data to be included. It allows multiple data sources to be combined into a single graph, and allows the user to explore the data interactively. Multigraph lets users explore and visualize "data space" in the same way that interactive mapping applications such as Google Maps facilitate exploring and visualizing geography. Viewing a Multigraph graph is extremely simple and intuitive, and requires no instructions. Creating a new graph for inclusion in a web page involves writing a simple XML configuration file and requires no programming. Multigraph can read data in a variety of formats, and can display data from a web service, allowing users to "surf" through large data sets, downloading only those parts of the data that are needed for display. Multigraph is currently in use on several web sites including the US Drought Portal (www.drought.gov), the NOAA Climate Services Portal (www.climate.gov), the Climate Reference Network (www.ncdc.noaa.gov/crn), NCDC's State of the Climate Report (www.ncdc.noaa.gov/sotc), and the US Forest Service's Forest Change Assessment Viewer (ews.forestthreats.org/NPDE/NPDE.html). More information about Multigraph is available from the web site www.multigraph.org. [Figure: Interactive Graph of Global Temperature Anomalies from ClimateWatch Magazine (http://www.climatewatch.noaa.gov/2009/articles/climate-change-global-temperature)]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carnes, E.T.; Truett, D.F.; Truett, L.F.
In the handful of years since the World Wide Web (WWW or Web) came into being, Web sites have developed at an astonishing rate. With the influx of Web pages comes a disparity of site types, including personal homepages, commercial sales sites, and educational data. The variety of sites and the deluge of information contained on the Web exemplify the individual nature of the WWW. Whereas some people argue that it is this eclecticism which gives the Web its charm, we propose that sites which are repositories of technical data would benefit from standardization. This paper proffers a methodology for publishing ecological research on the Web. The template we describe uses capabilities of HTML (the HyperText Markup Language) to enhance the value of the traditional scientific paper.
Identification of Malicious Web Pages by Inductive Learning
NASA Astrophysics Data System (ADS)
Liu, Peishun; Wang, Xuefang
Malicious web pages have become an increasing threat to computer systems in recent years. Traditional anti-virus techniques typically focus on detection of the static signatures of malware and are ineffective against these new threats because they cannot deal with zero-day attacks. In this paper, a novel classification method for detecting malicious web pages is presented. The method performs generalization and specialization of attack patterns based on inductive learning, which can be used for updating and expanding the knowledge database. An attack pattern is established from an example and generalized by inductive learning, and can then be used to detect unknown attacks whose behavior is similar to the example.
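A minimal sketch of inductive generalization of an attack pattern: attributes shared by all known malicious examples are kept, attributes that differ become wildcards, and a new page matches if every non-wildcard attribute agrees. The feature names and values are invented; the paper's actual pattern representation is not reproduced here.

```python
# Minimal sketch of generalizing an attack pattern from examples: attributes
# that vary across examples become wildcards; a page matches the pattern if
# all non-wildcard attributes agree. Feature names and values are invented.
WILDCARD = "*"

def generalize(examples: list[dict]) -> dict:
    pattern = dict(examples[0])
    for ex in examples[1:]:
        for key, value in pattern.items():
            if ex.get(key) != value:
                pattern[key] = WILDCARD      # drop attributes that vary
    return pattern

def matches(pattern: dict, page: dict) -> bool:
    return all(v == WILDCARD or page.get(k) == v for k, v in pattern.items())

known_attacks = [
    {"has_hidden_iframe": True, "obfuscated_js": True, "redirect_chain": 4},
    {"has_hidden_iframe": True, "obfuscated_js": True, "redirect_chain": 2},
]
pattern = generalize(known_attacks)          # redirect_chain generalizes to "*"

suspect = {"has_hidden_iframe": True, "obfuscated_js": True, "redirect_chain": 7}
print(pattern, matches(pattern, suspect))    # suspect matches the generalized pattern
```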
Web pages: What can you see in a single fixation?
Jahanian, Ali; Keshvari, Shaiyan; Rosenholtz, Ruth
2018-01-01
Research in human vision suggests that in a single fixation, humans can extract a significant amount of information from a natural scene, e.g. the semantic category, spatial layout, and object identities. This ability is useful, for example, for quickly determining location, navigating around obstacles, detecting threats, and guiding eye movements to gather more information. In this paper, we ask a new question: What can we see at a glance at a web page - an artificial yet complex "real world" stimulus? Is it possible to notice the type of website, or where the relevant elements are, with only a glimpse? We find that observers, fixating at the center of a web page shown for only 120 milliseconds, are well above chance at classifying the page into one of ten categories. Furthermore, this ability is supported in part by text that they can read at a glance. Users can also understand the spatial layout well enough to reliably localize the menu bar and to detect ads, even though the latter are often camouflaged among other graphical elements. We discuss the parallels between web page gist and scene gist, and the implications of our findings for both vision science and human-computer interaction.
Automatic Hidden-Web Table Interpretation by Sibling Page Comparison
NASA Astrophysics Data System (ADS)
Tao, Cui; Embley, David W.
The longstanding problem of automatic table interpretation still eludes us. Its solution would not only be an aid to table processing applications such as large volume table conversion, but would also be an aid in solving related problems such as information extraction and semi-structured data management. In this paper, we offer a conceptual modeling solution for the common special case in which so-called sibling pages are available. The sibling pages we consider are pages on the hidden web, commonly generated from underlying databases. We compare them to identify and connect nonvarying components (category labels) and varying components (data values). We tested our solution using more than 2,000 tables in source pages from three different domains—car advertisements, molecular biology, and geopolitical information. Experimental results show that the system can successfully identify sibling tables, generate structure patterns, interpret tables using the generated patterns, and automatically adjust the structure patterns, if necessary, as it processes a sequence of hidden-web pages. For these activities, the system was able to achieve an overall F-measure of 94.5%.
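A minimal sketch of the core sibling-page comparison idea: cells that are identical across sibling pages are treated as nonvarying category labels, while cells that differ are treated as varying data values. The flattened "table cells" below are invented examples from the car-advertisement domain.

```python
# Minimal sketch of sibling-page comparison: identical cells across sibling
# pages are treated as category labels, differing cells as data values.
# The flattened cell lists are invented examples.
def split_labels_and_values(sibling_a: list[str], sibling_b: list[str]):
    labels, values = [], []
    for cell_a, cell_b in zip(sibling_a, sibling_b):
        if cell_a == cell_b:
            labels.append(cell_a)              # nonvarying -> category label
        else:
            values.append((cell_a, cell_b))    # varying -> data values
    return labels, values

page_1 = ["Make", "Honda", "Model", "Civic", "Price", "$7,000"]
page_2 = ["Make", "Toyota", "Model", "Corolla", "Price", "$8,500"]

labels, values = split_labels_and_values(page_1, page_2)
print(labels)   # ['Make', 'Model', 'Price']
print(values)   # [('Honda', 'Toyota'), ('Civic', 'Corolla'), ('$7,000', '$8,500')]
```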
2008-09-01
Berners-Lee, Tim. (1999). Weaving the Web. New York: HarperCollins Publishers, Inc. Berners-Lee, T., Hendler, J., & Lassila, O. (2001). The Semantic Web... an environment where software agents roaming from page to page can readily carry out sophisticated tasks for users.
Commentary: Building Web Research Strategies for Teachers and Students
ERIC Educational Resources Information Center
Maloy, Robert W.
2016-01-01
This paper presents web research strategies for teachers and students to use in building Dramatic Event, Historical Biography, and Influential Literature wiki pages for history/social studies learning. Dramatic Events refer to milestone or turning point moments in history. Historical Biographies and Influential Literature pages feature…
Impact of a Physician-Led Social Media Sharing Program on a Medical Journal's Web Traffic.
Trueger, N Seth; Bokarius, Andrew V; Carroll, Stephen; April, Michael D; Thoma, Brent
2018-01-01
The use of social media by health professionals and medical journals is increasing. The aim of this study was to compare online views of articles in press (AIPs) released by Annals of Emergency Medicine before and after a nine-person social media team started actively posting links to AIPs using their personal Twitter accounts. An observational before-and-after study was conducted. Web traffic data for Annals were obtained from the publisher (Elsevier), detailing the number of page views to annemergmed.com by referring websites during the study period. The preintervention time period was defined as January 1, 2013, to June 30, 2014, and the postintervention period as July 1, 2014, to July 31, 2015. The primary outcome was page views from Twitter per AIP released each month to account for the number of articles published each month. Secondary outcomes included page views from Facebook (on which there was no article-sharing intervention) and total article views per month. The median page views from Twitter per individual AIP released each month increased from 33 in the preintervention period to 130, for an effect size of 97 (95% confidence interval, 56-111; P < .001). There was a smaller increase in median page views from Facebook per individual AIP of 21 (95% confidence interval, 10-32). There was no significant increase in these median values for total page views per AIP. Twitter sharing of AIPs increased the number of page views that came from Twitter but did not increase the overall number of page views. Copyright © 2017 American College of Radiology. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-16
... activities and included the Banana Island Sanctuary (including King Spring), the Sunset Shores Sanctuary, and... manatees (Nature Coast Coalition 2010 Web site). In addition to viewing manatees, area recreationists... 24654; May 12, 1994). This expansion was...
Citations to Web pages in scientific articles: the permanence of archived references.
Thorp, Andrea W; Schriger, David L
2011-02-01
We validate the use of archiving Internet references by comparing the accessibility of published uniform resource locators (URLs) with corresponding archived URLs over time. We scanned the "Articles in Press" section in Annals of Emergency Medicine from March 2009 through June 2010 for Internet references in research articles. If an Internet reference produced the authors' expected content, the Web page was archived with WebCite (http://www.webcitation.org). Because the archived Web page does not change, we compared it with the original URL to determine whether the original Web page had changed. We attempted to access each original URL and archived Web site URL at 3-month intervals from the time of online publication during an 18-month study period. Once a URL no longer existed or failed to contain the original authors' expected content, it was excluded from further study. The number of original URLs and archived URLs that remained accessible over time was totaled and compared. A total of 121 articles were reviewed and 144 Internet references were found within 55 articles. Of the original URLs, 15% (21/144; 95% confidence interval [CI] 9% to 21%) were inaccessible at publication. During the 18-month observation period, there was no loss of archived URLs (apart from the 4% [5/123; 95% CI 2% to 9%] that could not be archived), whereas 35% (49/139) of the original URLs were lost (46% loss; 95% CI 33% to 61% by the Kaplan-Meier method; difference between curves P<.0001, log rank test). Archiving a referenced Web page at publication can help preserve the authors' expected information. Copyright © 2010 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.
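A minimal sketch of the kind of periodic accessibility check the study performed on original and archived URLs, using the requests library; the URL pairs are placeholders, and a status-code check alone does not verify that the authors' expected content is still present, which the study also assessed.

```python
# Minimal sketch of checking whether original and archived URLs still resolve.
# The URL pairs are placeholders; the study also verified that the returned
# page still contained the expected content, which a status check cannot do.
import requests

REFERENCE_PAIRS = [
    ("http://example.org/guideline.html", "http://www.webcitation.org/EXAMPLE1"),
]

def is_accessible(url: str) -> bool:
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
        return resp.status_code == 200
    except requests.RequestException:
        return False

for original, archived in REFERENCE_PAIRS:
    print(f"original {'up' if is_accessible(original) else 'down'}: {original}")
    print(f"archived {'up' if is_accessible(archived) else 'down'}: {archived}")
```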
MedlinePlus Connect: How it Works
... it looks depends on how it is implemented. Web Application The Web application returns a formatted response ... for more examples of Web Application response pages. Web Service The MedlinePlus Connect REST-based Web service ...
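As a rough illustration of the REST-based Web service mentioned above, the sketch below sends a coded diagnosis to MedlinePlus Connect and prints the JSON response. The base URL, parameter names, and the ICD-10-CM code-system OID reflect the publicly documented interface as best recalled and should be verified against the current MedlinePlus Connect documentation.

# A minimal sketch of calling the MedlinePlus Connect REST Web service mentioned
# above. The base URL, parameter names, and the ICD-10-CM code-system OID are
# taken from the public documentation as recalled; verify before relying on them.
import requests

BASE_URL = "https://connect.medlineplus.gov/service"  # assumed service endpoint

def lookup_topic(code: str, code_system_oid: str = "2.16.840.1.113883.6.90") -> dict:
    """Ask MedlinePlus Connect for consumer-health links for a diagnosis code."""
    params = {
        "mainSearchCriteria.v.cs": code_system_oid,   # code system (ICD-10-CM assumed)
        "mainSearchCriteria.v.c": code,               # the diagnosis code itself
        "knowledgeResponseType": "application/json",  # request the JSON response form
    }
    resp = requests.get(BASE_URL, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Hypothetical example: type 2 diabetes mellitus (ICD-10-CM E11.9).
    print(lookup_topic("E11.9"))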
75 FR 29466 - Prohibition Against Certain Flights Within the Territory and Airspace of Afghanistan
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-26
... access to and the use of intelligence; Operational security (OPSEC), including handling, storage, and...://www.faa.gov/regulations_policies or Accessing the Government Printing Office's Web page at: http://www...
VisBOL: Web-Based Tools for Synthetic Biology Design Visualization.
McLaughlin, James Alastair; Pocock, Matthew; Mısırlı, Göksel; Madsen, Curtis; Wipat, Anil
2016-08-19
VisBOL is a Web-based application that allows the rendering of genetic circuit designs, enabling synthetic biologists to visually convey designs in SBOL visual format. VisBOL designs can be exported to formats including PNG and SVG images to be embedded in Web pages, presentations and publications. The VisBOL tool enables the automated generation of visualizations from designs specified using the Synthetic Biology Open Language (SBOL) version 2.0, as well as a range of well-known bioinformatics formats including GenBank and Pigeoncad notation. VisBOL is provided both as a user accessible Web site and as an open-source (BSD) JavaScript library that can be used to embed diagrams within other content and software.
Characterization of topological structure on complex networks.
Nakamura, Ikuo
2003-10-01
Characterizing the topological structure of complex networks is a significant problem, especially from the viewpoint of data mining on the World Wide Web. "Page rank", used in the commercial search engine Google, is such a measure of authority for ranking all the nodes matching a given query. We have investigated the page-rank distribution of the real Web and of a growing network model, both of which have directed links and exhibit power-law distributions of in-degree (the number of incoming links to a node) and out-degree (the number of outgoing links from a node), respectively. We find a concentration of page rank on a small number of nodes and low page rank in high-degree regimes in the real Web, which can be explained by topological properties of the network, e.g., network motifs and the connectivities of nearest neighbors.
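For readers unfamiliar with the page-rank measure discussed above, the following short power-iteration sketch computes it on a toy directed graph; the damping factor of 0.85 is the conventional choice, not a value taken from this paper.

# A compact power-iteration sketch of the PageRank measure discussed above,
# on a toy directed graph given as an adjacency list.
def pagerank(graph: dict, damping: float = 0.85, iterations: int = 100) -> dict:
    nodes = list(graph)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, out_links in graph.items():
            if out_links:
                share = damping * rank[node] / len(out_links)
                for target in out_links:
                    new_rank[target] += share
            else:
                # Dangling node: spread its rank uniformly over all nodes.
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

if __name__ == "__main__":
    toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    print(pagerank(toy_web))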
Eysenbach, Gunther; Powell, John; Kuss, Oliver; Sa, Eun-Ryoung
The quality of consumer health information on the World Wide Web is an important issue for medicine, but to date no systematic and comprehensive synthesis of the methods and evidence has been performed. Our objectives were to establish a methodological framework for how quality on the Web is evaluated in practice, to determine the heterogeneity of the results and conclusions, and to compare the methodological rigor of these studies, to determine to what extent the conclusions depend on the methodology used, and to suggest future directions for research. We searched MEDLINE and PREMEDLINE (1966 through September 2001), Science Citation Index (1997 through September 2001), Social Sciences Citation Index (1997 through September 2001), Arts and Humanities Citation Index (1997 through September 2001), LISA (1969 through July 2001), CINAHL (1982 through July 2001), PsychINFO (1988 through September 2001), EMBASE (1988 through June 2001), and SIGLE (1980 through June 2001). We also conducted hand searches, general Internet searches, and a personal bibliographic database search. We included published and unpublished empirical studies in any language in which investigators searched the Web systematically for specific health information, evaluated the quality of Web sites or pages, and reported quantitative results. We screened 7830 citations and retrieved 170 potentially eligible full articles. A total of 79 distinct studies met the inclusion criteria, evaluating 5941 health Web sites and 1329 Web pages, and reporting 408 evaluation results for 86 different quality criteria. Two reviewers independently extracted study characteristics, medical domains, search strategies used, methods and criteria of quality assessment, results (percentage of sites or pages rated as inadequate pertaining to a quality criterion), and quality and rigor of study methods and reporting. The most frequently used quality criteria included accuracy, completeness, readability, design, disclosures, and references provided. Fifty-five studies (70%) concluded that quality is a problem on the Web, 17 (22%) remained neutral, and 7 studies (9%) came to a positive conclusion. Positive studies scored significantly lower in search (P =.02) and evaluation (P =.04) methods. Due to differences in study methods and rigor, quality criteria, study population, and topic chosen, study results and conclusions on health-related Web sites vary widely. Operational definitions of quality criteria are needed.
Martin-Facklam, Meret; Kostrzewa, Michael; Martin, Peter; Haefeli, Walter E
2004-01-01
Aims: The generally poor quality of health information on the World Wide Web (WWW) has caused preventable adverse outcomes. Quality management of information on the internet is therefore critical given its widespread use. In order to develop strategies for the safe use of drugs, we scored general and content quality of pages about sildenafil and performed an intervention to improve their quality. Methods: The internet was searched with Yahoo and AltaVista for pages about sildenafil, and 303 pages were included. For assessment of content quality, a score based on accuracy and completeness of essential drug information was assigned. For assessment of general quality, four criteria were evaluated and their association with high content quality was determined by multivariate logistic regression analysis. The pages were randomly allocated to either a control or an intervention group. Evaluation took place before, as well as 7 and 22 weeks after, an intervention consisting of two letters with individualized feedback on the respective page, sent electronically to the address mentioned on the page. Results: Providing references to scientific publications or prescribing information was significantly associated with high content quality (odds ratio: 8.2, 95% CI 3.2, 20.5). The intervention had no influence on general or content quality. Conclusions: Individualized feedback sent to the address mentioned on the page was ineffective in preventing adverse outcomes caused by misinformation on the WWW. Currently, the most straightforward approach is probably to inform lay persons about indicators of high information quality, i.e. the provision of references. PMID:14678344
Searching for Bill and Jane: Electronic Full-Text Literature.
ERIC Educational Resources Information Center
Still, Julie; Kassabian, Vibiana
1998-01-01
Examines electronic full-text literature available on the World Wide Web and on CD-ROM. Discusses authors and genres, electronic texts, and fees. Highlights Shakespeare, Jane Austen, and nature writing. Provides a bibliography of Web guides, specialized Shakespeare pages, and pages dealing with the Shakespeare authorship debate and secondary…
Ecosystem Food Web Lift-The-Flap Pages
ERIC Educational Resources Information Center
Atwood-Blaine, Dana; Rule, Audrey C.; Morgan, Hannah
2016-01-01
In the lesson on which this practical article is based, third grade students constructed a "lift-the-flap" page to explore food webs on the prairie. The moveable papercraft focused student attention on prairie animals' external structures and how the inferred functions of those structures could support further inferences about the…
12 CFR 708a.3 - Board of directors' approval and members' opportunity to comment.
Code of Federal Regulations, 2010 CFR
2010-01-01
... fashion in the lobby of the credit union's home office and branch offices and on the credit union's Web site, if it has one. If the notice is not on the home page of the Web site, the home page must have a...
[Improving vaccination social marketing by monitoring the web].
Ferro, A; Bonanni, P; Castiglia, P; Montante, A; Colucci, M; Miotto, S; Siddu, A; Murrone, L; Baldo, V
2014-01-01
Immunisation is one of the most important and cost-effective interventions in public health because of its significant positive impact on population health. However, since Jenner's discovery there has always been a lively debate between supporters and opponents of vaccination; today the antivaccination movement spreads its message mostly on the web, disseminating inaccurate data through blogs and forums and increasing vaccine rejection. In this context, the Società Italiana di Igiene (SItI) created a web project to fight misinformation about vaccination on the web through a series of information tools, including scientific articles, educational material, and video and multimedia presentations. The web portal (http://www.vaccinarsi.org) was launched in May 2013 and now offers over one hundred web pages related to vaccination. Recently a forum, a periodic newsletter and a Twitter page have been created. There has been an average of 10,000 hits per month. Currently our users are mostly healthcare professionals. The visibility of the site is very good: it currently ranks first in Google's search engine when typing the word "vaccinarsi". The results of the first four months of activity are extremely encouraging and show the importance of this project; furthermore, an application for quality certification by independent international organizations has been submitted.
JSME: a free molecule editor in JavaScript.
Bienfait, Bruno; Ertl, Peter
2013-01-01
A molecule editor, i.e. a program facilitating graphical input and interactive editing of molecules, is an indispensable part of every cheminformatics or molecular processing system. Today, when a web browser has become the universal scientific user interface, a tool to edit molecules directly within the web browser is essential. One of the most popular tools for molecular structure input on the web is the JME applet. Since its release nearly 15 years ago, however, the web environment has changed and Java applets are facing increasing implementation hurdles due to their maintenance and support requirements, as well as security issues. This prompted us to update the JME editor and port it to a modern Internet programming language - JavaScript. The actual molecule editing Java code of the JME editor was translated into JavaScript with the help of the Google Web Toolkit compiler and a custom library that emulates a subset of the GUI features of the Java runtime environment. In this process, the editor was enhanced with additional functionalities, including a substituent menu, copy/paste, drag and drop, undo/redo capabilities and integrated help. In addition to desktop computers, the editor supports molecule editing on touch devices, including iPhone, iPad and Android phones and tablets. In analogy to JME, the new editor is named JSME. This new molecule editor is compact, easy to use and easy to incorporate into web pages. A free molecule editor written in JavaScript was developed and is released under the terms of the permissive BSD license. The editor is compatible with JME and has practically the same user interface as well as the same web application programming interface. The JSME editor is available for download from the project web page http://peter-ertl.com/jsme/
2000-01-01
horoscope page (for Scorpio). Although this particular combination might be unique or unpopular, if we decompose the page into four WebViews, one for metro...news, one for international news, one for the weather and one for the horoscope, then these WebViews can be accessed frequently enough to merit...query results, the cost of accessing them is about the same as the cost of generating them from scratch, using the virt policy. This will also be true
The New USGS Volcano Hazards Program Web Site
NASA Astrophysics Data System (ADS)
Venezky, D. Y.; Graham, S. E.; Parker, T. J.; Snedigar, S. F.
2008-12-01
The U.S. Geological Survey's (USGS) Volcano Hazard Program (VHP) has launched a revised web site that uses a map-based interface to display hazards information for U.S. volcanoes. The web site is focused on better communication of hazards and background volcano information to our varied user groups by reorganizing content based on user needs and improving data display. The Home Page provides a synoptic view of the activity level of all volcanoes for which updates are written using a custom Google® Map. Updates are accessible by clicking on one of the map icons or clicking on the volcano of interest in the adjacent color-coded list of updates. The new navigation provides rapid access to volcanic activity information, background volcano information, images and publications, volcanic hazards, information about VHP, and the USGS volcano observatories. The Volcanic Activity section was tailored for emergency managers but provides information for all our user groups. It includes a Google® Map of the volcanoes we monitor, an Elevated Activity Page, a general status page, information about our Volcano Alert Levels and Aviation Color Codes, monitoring information, and links to monitoring data from VHP's volcano observatories: Alaska Volcano Observatory (AVO), Cascades Volcano Observatory (CVO), Long Valley Observatory (LVO), Hawaiian Volcano Observatory (HVO), and Yellowstone Volcano Observatory (YVO). The YVO web site was the first to move to the new navigation system and we are working on integrating the Long Valley Observatory web site next. We are excited to continue to implement new geospatial technologies to better display our hazards and supporting volcano information.
Marketing on the World Wide Web.
ERIC Educational Resources Information Center
Teague, John H.
1995-01-01
Discusses the World Wide Web, its importance for marketing, its advantages, non-commercial promotions on the Web, how businesses use the Web, the Web market, resistance to Internet commercialization, getting on the Web, creating Web pages, rising above the noise, and some of the Web's problems and limitations. (SR)
Suzuki, Lalita K; Beale, Ivan L
2006-01-01
The content of personal Web home pages created by adolescents with cancer is a new source of information about this population, of potential benefit to oncology nurses and psychologists. Individual Internet elements found on 21 home pages created by youths with cancer (14-22 years old) were rated for cancer-related self-presentation, information dissemination, and interpersonal connection. Examples of adolescents' online narratives were also recorded. Adolescents with cancer used various Internet elements on their home pages for cancer-related self-presentation (e.g., welcome messages, essays, personal history and diary pages, news articles, and poetry), information dissemination (e.g., through personal interest pages, multimedia presentations, lists, charts, and hyperlinks), and interpersonal connection (e.g., guestbook entries). Results suggest that various elements found on personal home pages are being used by a limited number of young patients with cancer for self-expression, information access, and contact with peers.
Extracting knowledge from the World Wide Web
Henzinger, Monika; Lawrence, Steve
2004-01-01
The World Wide Web provides an unprecedented opportunity to automatically analyze a large sample of interests and activity in the world. We discuss methods for extracting knowledge from the web by randomly sampling and analyzing hosts and pages, and by analyzing the link structure of the web and how links accumulate over time. A variety of interesting and valuable information can be extracted, such as the distribution of web pages over domains, the distribution of interest in different areas, communities related to different topics, the nature of competition in different categories of sites, and the degree of communication between different communities or countries. PMID:14745041
MyFreePACS: a free web-based radiology image storage and viewing tool.
de Regt, David; Weinberger, Ed
2004-08-01
We developed an easy-to-use method for central storage and subsequent viewing of radiology images for use on any PC equipped with Internet Explorer. We developed MyFreePACS, a program that uses a DICOM server to receive and store images and transmit them over the Web to the MyFreePACS Web client. The MyFreePACS Web client is a Web page that uses an ActiveX control for viewing and manipulating images. The client contains many of the tools found in modern image viewing stations including 3D localization and multiplanar reformation. The system is built entirely with free components and is freely available for download and installation from the Web at www.myfreepacs.com.
Enhancing Geoscience Research Discovery Through the Semantic Web
NASA Astrophysics Data System (ADS)
Rowan, Linda R.; Gross, M. Benjamin; Mayernik, Matthew; Khan, Huda; Boler, Frances; Maull, Keith; Stott, Don; Williams, Steve; Corson-Rikert, Jon; Johns, Erica M.; Daniels, Michael; Krafft, Dean B.; Meertens, Charles
2016-04-01
UNAVCO, UCAR, and Cornell University are working together to leverage semantic web technologies to enable discovery of people, datasets, publications and other research products, as well as the connections between them. The EarthCollab project, a U.S. National Science Foundation EarthCube Building Block, is enhancing an existing open-source semantic web application, VIVO, to enhance connectivity across distributed networks of researchers and resources related to the following two geoscience-based communities: (1) the Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory (EOL), and (2) UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy. People, publications, datasets and grant information have been mapped to an extended version of the VIVO-ISF ontology and ingested into VIVO's database. Much of the VIVO ontology was built for the life sciences, so we have added some components of existing geoscience-based ontologies and a few terms from a local ontology that we created. The UNAVCO VIVO instance, connect.unavco.org, utilizes persistent identifiers whenever possible; for example using ORCIDs for people, publication DOIs, data DOIs and unique NSF grant numbers. Data is ingested using a custom set of scripts that include the ability to perform basic automated and curated disambiguation. VIVO can display a page for every object ingested, including connections to other objects in the VIVO database. A dataset page, for example, includes the dataset type, time interval, DOI, related publications, and authors. The dataset type field provides a connection to all other datasets of the same type. The author's page shows, among other information, related datasets and co-authors. Information previously spread across several unconnected databases is now stored in a single location. In addition to VIVO's default display, the new database can be queried using SPARQL, a query language for semantic data. EarthCollab is extending the VIVO web application. One such extension is the ability to cross-link separate VIVO instances across institutions, allowing local display of externally curated information. For example, Cornell's VIVO faculty pages will display UNAVCO's dataset information and UNAVCO's VIVO will display Cornell faculty member contact and position information. About half of UNAVCO's membership is international and we hope to connect our data to institutions in other countries with a similar approach. Additional extensions, including enhanced geospatial capabilities, will be developed based on task-centered usability testing.
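As an illustration of the kind of SPARQL access described above, the sketch below queries a VIVO endpoint for datasets and their labels. The endpoint path and the vivo:Dataset class are assumptions made for illustration; the actual connect.unavco.org instance may expose a different endpoint and different ontology terms.

# A hedged sketch of querying a VIVO SPARQL endpoint such as the one the
# EarthCollab project describes. The endpoint URL and the vivo:Dataset class
# are illustrative assumptions, not confirmed details of connect.unavco.org.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://connect.unavco.org/vivo/api/sparqlQuery"  # assumed path

QUERY = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX vivo: <http://vivoweb.org/ontology/core#>
SELECT ?dataset ?label WHERE {
  ?dataset a vivo:Dataset ;
           rdfs:label ?label .
} LIMIT 10
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)
results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["dataset"]["value"], "-", binding["label"]["value"])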
Assessment and revision of clinical pharmacy practice internet web sites.
Edwards, Krystal L; Salvo, Marissa C; Ward, Kristina E; Attridge, Russell T; Kiser, Katie; Pinner, Nathan A; Gallegos, Patrick J; Kesteloot, Lori Lynn; Hylton, Ann; Bookstaver, P Brandon
2014-02-01
Health care professionals, trainees, and patients use the Internet extensively. Editable Web sites may contain inaccurate, incomplete, and/or outdated information that may distort the public's perception of the topic. The objective was to evaluate the editable, online descriptions of clinical pharmacy and the clinical pharmacist and to attempt to improve their accuracy. The authors identified key areas within clinical pharmacy to evaluate for accuracy and appropriateness on the Internet. Current descriptions that were reviewed on public domain Web sites included: (1) clinical pharmacy and the clinical pharmacist, (2) pharmacy education, (3) clinical pharmacy and development and provision for reimbursement, (4) clinical pharmacists and advanced specialty certifications/training opportunities, (5) pharmacists and advocacy, and (6) clinical pharmacists and interdisciplinary/interprofessional content. The authors assessed each content area to determine accuracy and prioritized the need for updating, when applicable, to achieve consistency in descriptions and relevancy. The authors found that Wikipedia, a publicly editable site, was consistently the most common Web site produced in search results. The authors' evaluation resulted in the creation or revision of 14 Wikipedia Web pages. However, rejection of 3 proposed newly created Web pages affected the authors' ability to address identified content areas with deficiencies and/or inaccuracies. Through assessing and updating editable Web sites, the authors strengthened the online representation of clinical pharmacy in a clear, cohesive, and accurate manner. However, ongoing assessments of the Internet are continually needed to ensure accuracy and appropriateness.
A web site on lung cancer: who are the users and what are they looking for?
Linssen, Cilia; Schook, Romane M; The, Anne-Mei; Lammers, Ernst; Festen, Jan; Postmus, Pieter E
2007-09-01
The Dutch Lung Cancer Information Centre launched the Web site www.longkanker.info in November 2003. The purpose of this article is to describe the launching of the Web site, its development, the type of visitors to the Web site, what they were looking for, and whether they found what they requested. Supervised by a panel (pulmonologists, patients, communication specialists), a large amount of material about lung cancer has been collected and edited into accessible language by health care providers, and the Web site has been divided into special categories following the different stages that lung cancer patients, relatives, and health care providers go through during the illness. The Web site is updated regularly. Search engines have been used to check the position of the Web site as a "hit." Pulmonologists have been informed about the founding of the Web site, and all lung cancer outpatient clinics in The Netherlands have received posters, folders, and cards to inform their patients. Visitor numbers, page views, and visitor numbers per page view have been registered continuously. Visitor satisfaction polls were placed in the second half of 2004 and the second half of 2005. The Web site appeared as the first hit when using search engines immediately after its launch. Half of the visitors came to the Web site via search engines or links found at other sites. The number of visitors started at 4600 in the first month, doubled in the next months, and reached 18,000 per month 2 years after its launch. The number of visited pages increased to 87,000 per month, with an average of five pages per visitor. Thirty percent of the visitors return within the same month. The most popular pages are the interactive pages, with the overview of all "ask the doctor" questions at the top, along with forum messages, survival figures for all forms of lung cancer, and information about the disease. The first satisfaction poll obtained 650 respondents and the second 382. The visitors to the Web site are caregivers (57%), patients (8%), and others (students, people fearing lung cancer). Of the visitors, 895 found what they were looking for, and satisfaction was highest among nurses and caregivers (91% and 95%, respectively) and lowest among physicians and patients (85% and 83%). Given the number of visitors to the lung cancer Web site, it can be concluded that there is a great need for additional information among patients and caregivers. The launched Web site www.longkanker.info has reached its goal of providing a dependable source of information about lung cancer and satisfying its visitors.
... page: //medlineplus.gov/ency/article/002844.htm Funnel-web spider bite ... the effects of a bite from the funnel-web spider. Male funnel-web spiders are more poisonous ...
Gamage, Deepa G; Fuller, Candice A; Cummings, Rosey; Tomnay, Jane E; Chung, Mark; Chen, Marcus; Garrett, Cameryn C; Hocking, Jane S; Bradshaw, Catriona S; Fairley, Christopher K
2011-09-01
'TESTme' is a sexually transmissible infection (STI) screening service for Victorian young people living in rural areas. We evaluated the effectiveness of advertising for this service over an 11-month pilot period. The advertising that was used included websites, a Facebook page, posters, flyers, business cards, wrist bands and professional development sessions for health nurses that occurred throughout the pilot period. We also used once-off methods including advertisements in newspapers, student diaries and short messages to mobile phones. Twenty-eight clients had a consultation through TESTme. Twenty found the service through health professionals, six through the Melbourne Sexual Health Centre (MSHC) web page, one through the Facebook page and one through the student diary. The total direct costs incurred by the centre for advertising were $20,850. The advertising cost per client reached for each advertising method was $26 for health professionals, $80 for the MSHC web advertisement, $1408 for Facebook and $790 for the student diary. Other advertising methods cost $12,248 and did not attract any clients. Advertising of STI health services for rural young people would best focus on referrals from other health services or health care websites.
Incorporating a Rich Media Presentation Format into a Lecture-Based Course Structure
ERIC Educational Resources Information Center
Moss, Nicholas
2005-01-01
The e-syllabus is a set of Web pages in which each course in the curriculum is assigned a "course page" and multiple "session pages." Course pages have a standardized format that provides course objectives, course policies, required or recommended textbooks, grading scales, and faculty listings. A separate session page is…
The Impact of Salient Advertisements on Reading and Attention on Web Pages
ERIC Educational Resources Information Center
Simola, Jaana; Kuisma, Jarmo; Oorni, Anssi; Uusitalo, Liisa; Hyona, Jukka
2011-01-01
Human vision is sensitive to salient features such as motion. Therefore, animation and onset of advertisements on Websites may attract visual attention and disrupt reading. We conducted three eye tracking experiments with authentic Web pages to assess whether (a) ads are efficiently ignored, (b) ads attract overt visual attention and disrupt…
Teaching Learning Theories Via the Web.
ERIC Educational Resources Information Center
Schnackenberg, Heidi L.
This paper describes a World Wide Web site on learning theories, developed as a class assignment for a course on learning and instructional theories at Concordia University (Quebec). Groups of two to four students developed pages on selected theories of learning that were then linked to a main page developed by the instructor and a doctoral…
User Perceptions of the Library's Web Pages: A Focus Group Study at Texas A&M University.
ERIC Educational Resources Information Center
Crowley, Gwyneth H.; Leffel, Rob; Ramirez, Diana; Hart, Judith L.; Armstrong, Tommy S., II
2002-01-01
This focus group study explored library patrons' opinions about Texas A&M library's Web pages. Discusses information seeking behavior which indicated that patrons are confused when trying to navigate the Public Access Menu and suggests the need for a more intuitive interface. (Author/LRW)
Some Thoughts on Free Textbooks
ERIC Educational Resources Information Center
Stewart, Robert
2009-01-01
The author publishes and freely distributes three online textbooks. "Introduction to Physical Oceanography" is available as a typeset book in Portable Document Format (PDF) or as web pages. "Our Ocean Planet: Oceanography in the 21st Century" and "Environmental Science in the 21st Century" are both available as web pages. All three books, which…
This page contains the August 2002 final rule fact sheet on the NESHAP for Paper and Other Web Coating. Also on this page is an April 2004 presentation on the NESHAP, designed to be used for basic education.
The Status of African Studies Digitized Content: Three Metadata Schemes.
ERIC Educational Resources Information Center
Kuntz, Patricia S.
The proliferation of Web pages and digitized material mounted on Internet servers has become unmanageable. Librarians and users are concerned that documents and information are being lost in cyberspace as a result of few bibliographic controls and common standards. Librarians in cooperation with software creators and Web page designers are…
Teaching Intrapersonal Communication with the World-Wide Web: Cognitive Technology.
ERIC Educational Resources Information Center
Shedletsky, Leonard J.; Aitken, Joan E.
This paper offers a brief description of a course on intrapersonal communication with a home page approach using the World Wide Web. The paper notes that students use the home page for completing assignments, readings, posting responses, self-evaluation testing, research, and displaying some of their papers for the course. The paper contains…
2014-05-01
developed techniques for building better IP geolocation systems. Geolocation has many applications, such as presenting advertisements for local business establishments on web pages, to debugging network performance issues, to attributing attack traffic to...Pennsylvania.”
Making the World Wide Web Accessible to All Students.
ERIC Educational Resources Information Center
Guthrie, Sally A.
2000-01-01
Examines the accessibility of Web sites belonging to 80 colleges of communications and schools of journalism by examining the hypertext markup language (HTML) used to format the pages. Suggests ways to revise the markup of pages to make them more accessible to students with vision, hearing, and mobility problems. Lists resources of the latest…
A Prototype HTML Training System for Graphic Communication Majors
ERIC Educational Resources Information Center
Runquist, Roger L.
2010-01-01
This design research demonstrates a prototype content management system capable of training graphic communication students in the creation of basic HTML web pages. The prototype serves as a method of helping students learn basic HTML structure and commands earlier in their academic careers. Exposure to the concepts of web page creation early in…
Key Spatial Relations-based Focused Crawling (KSRs-FC) for Borderlands Situation Analysis
NASA Astrophysics Data System (ADS)
Hou, D. Y.; Wu, H.; Chen, J.; Li, R.
2013-11-01
Place names play an important role in borderlands situation topics, yet current focused crawling methods treat them in the same way as other common keywords, which may lead to the omission of many useful web pages. In this paper, place names in web pages and their spatial relations are first discussed. Then, a focused crawling method named KSRs-FC is proposed for collecting situation information about borderlands. In this method, place names and common keywords are represented separately, and some of the spatial relations relevant to web page crawling are used in the relevance calculation between the given topic and web pages. Furthermore, an information collection system for borderlands situation analysis was developed based on KSRs-FC. Finally, the F-Score was adopted to quantitatively evaluate this method against a traditional method. Experimental results showed that the F-Score value of the proposed method increased by 11% compared with the traditional method on the same sample data. The KSRs-FC method can thus effectively reduce the misjudgement of relevant web pages.
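Purely as an illustration of the ideas above, and not the KSRs-FC method itself, the sketch below combines a toy relevance score that weights place-name hits more heavily than ordinary keywords with the F-Score used for evaluation; the weights and the scoring form are assumptions.

# Illustrative only: a toy relevance score in the spirit of KSRs-FC, weighting
# place-name hits separately from ordinary keywords, plus the F-Score used to
# evaluate crawl results. The weights are assumptions, not values from the paper.
def relevance(text: str, keywords: list, place_names: list,
              keyword_weight: float = 1.0, place_weight: float = 2.0) -> float:
    text = text.lower()
    score = sum(keyword_weight for k in keywords if k.lower() in text)
    score += sum(place_weight for p in place_names if p.lower() in text)
    return score

def f_score(retrieved_relevant: int, retrieved_total: int, relevant_total: int) -> float:
    """Harmonic mean of precision and recall over a crawl's result set."""
    if retrieved_total == 0 or relevant_total == 0:
        return 0.0
    precision = retrieved_relevant / retrieved_total
    recall = retrieved_relevant / relevant_total
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

if __name__ == "__main__":
    page = "Border trade news from a town near the Tumen River crossing"
    print(relevance(page, ["border", "trade"], ["Tumen River"]))
    print(f_score(retrieved_relevant=40, retrieved_total=60, relevant_total=80))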
Academic medical center libraries on the Web.
Tannery, N H; Wessel, C B
1998-01-01
Academic medical center libraries are moving towards publishing electronically, utilizing networked technologies, and creating digital libraries. The catalyst for this movement has been the Web. An analysis of academic medical center library Web pages was undertaken to assess the information created and communicated in early 1997. A summary of present uses and suggestions for future applications is provided. A method for evaluating and describing the content of library Web sites was designed. The evaluation included categorizing basic information such as description and access to library services, access to commercial databases, and use of interactive forms. The main goal of the evaluation was to assess original resources produced by these libraries. PMID:9803298
The aware toolbox for the detection of law infringements on web pages
NASA Astrophysics Data System (ADS)
Shahab, Asif; Kieninger, Thomas; Dengel, Andreas
2010-01-01
In the project Aware we aim to develop an automatic assistant for the detection of law infringements on web pages. The motivation for this project is that many authors of web pages are at some point infringing copyright or other laws, mostly without being aware of that fact, and are more and more often confronted with costly legal warnings. As the legal environment is constantly changing, an important requirement of Aware is that the domain knowledge can be maintained (and initially defined) by numerous legal experts working remotely without further assistance from the computer scientists. Consequently, the software platform was chosen to be a web-based generic toolbox that can be configured to suit individual analysis experts, definitions of analysis flow, information gathering and report generation. The report generated by the system summarizes all critical elements of a given web page and provides case-specific hints to the page author, and thus forms a new type of service. Regarding the analysis subsystems, Aware mainly builds on existing state-of-the-art technologies. Their usability has been evaluated for each intended task. In order to control the heterogeneous analysis components and to gather the information, a lightweight scripting shell has been developed. This paper describes the analysis technologies, ranging from text-based information extraction, through optical character recognition and phonetic fuzzy string matching, to a set of image analysis and retrieval tools, as well as the scripting language used to define the analysis flow.
Indicators of Accuracy of Consumer Health Information on the Internet
Fallis, Don; Frické, Martin
2002-01-01
Objectives: To identify indicators of accuracy for consumer health information on the Internet. The results will help lay people distinguish accurate from inaccurate health information on the Internet. Design: Several popular search engines (Yahoo, AltaVista, and Google) were used to find Web pages on the treatment of fever in children. The accuracy and completeness of these Web pages was determined by comparing their content with that of an instrument developed from authoritative sources on treating fever in children. The presence on these Web pages of a number of proposed indicators of accuracy, taken from published guidelines for evaluating the quality of health information on the Internet, was noted. Main Outcome Measures: Correlation between the accuracy of Web pages on treating fever in children and the presence of proposed indicators of accuracy on these pages. Likelihood ratios for the presence (and absence) of these proposed indicators. Results: One hundred Web pages were identified and characterized as “more accurate” or “less accurate.” Three indicators correlated with accuracy: displaying the HONcode logo, having an organization domain, and displaying a copyright. Many proposed indicators taken from published guidelines did not correlate with accuracy (e.g., the author being identified and the author having medical credentials) or inaccuracy (e.g., lack of currency and advertising). Conclusions: This method provides a systematic way of identifying indicators that are correlated with the accuracy (or inaccuracy) of health information on the Internet. Three such indicators have been identified in this study. Identifying such indicators and informing the providers and consumers of health information about them would be valuable for public health care. PMID:11751805
Effect of font size, italics, and colour count on web usability.
Bhatia, Sanjiv K; Samal, Ashok; Rajan, Nithin; Kiviniemi, Marc T
2011-04-01
Web usability measures the ease of use of a website. This study attempts to find the effect of three factors - font size, italics, and colour count - on web usability. The study was performed using a set of tasks and developing a survey questionnaire. We performed the study using a set of human subjects, selected from the undergraduate students taking courses in psychology. The data computed from the tasks and survey questionnaire were statistically analysed to find if there was any effect of font size, italics, and colour count on the three web usability dimensions. We found that for the student population considered, there was no significant effect of font size on usability. However, the manipulation of italics and colour count did influence some aspects of usability. The subjects performed better for pages with no italics and high italics compared to moderate italics. The subjects rated the pages that contained only one colour higher than the web pages with four or six colours. This research will help web developers better understand the effect of font size, italics, and colour count on web usability in general, and for young adults, in particular.
Effect of font size, italics, and colour count on web usability
Samal, Ashok; Rajan, Nithin; Kiviniemi, Marc T.
2013-01-01
Web usability measures the ease of use of a website. This study attempts to find the effect of three factors – font size, italics, and colour count – on web usability. The study was performed using a set of tasks and developing a survey questionnaire. We performed the study using a set of human subjects, selected from the undergraduate students taking courses in psychology. The data computed from the tasks and survey questionnaire were statistically analysed to find if there was any effect of font size, italics, and colour count on the three web usability dimensions. We found that for the student population considered, there was no significant effect of font size on usability. However, the manipulation of italics and colour count did influence some aspects of usability. The subjects performed better for pages with no italics and high italics compared to moderate italics. The subjects rated the pages that contained only one colour higher than the web pages with four or six colours. This research will help web developers better understand the effect of font size, italics, and colour count on web usability in general, and for young adults, in particular. PMID:24358055
StreamStats in Georgia: a water-resources web application
Gotvald, Anthony J.; Musser, Jonathan W.
2015-07-31
StreamStats is being implemented on a State-by-State basis to allow for customization of the data development and underlying datasets to address each State's specific needs, issues, and objectives. The USGS, in cooperation with the Georgia Environmental Protection Division and Georgia Department of Transportation, has implemented StreamStats for Georgia. The Georgia StreamStats Web site is available through the national StreamStats Web-page portal at http://streamstats.usgs.gov. Links are provided on this Web page for individual State applications, instructions for using StreamStats, definitions of basin characteristics and streamflow statistics, and other supporting information.
NASA Astrophysics Data System (ADS)
McGibbney, L. J.; Armstrong, E. M.
2016-12-01
Figuratively speaking, scientific datasets (SD) are shared by data producers in a multitude of shapes, sizes and flavors. Primarily, however, they exist as machine-independent manifestations supporting the creation, access, and sharing of array-oriented SD that can on occasion be spread across multiple files. Within the Earth sciences, the most notable general examples include the HDF family, NetCDF, etc., with other formats such as GRIB being used pervasively within specific domains such as the oceanographic, atmospheric and meteorological sciences. Such file formats contain coverage data, i.e., a digital representation of some spatio-temporal phenomenon. A challenge for large data producers such as NASA and NOAA, as well as for consumers of coverage datasets (particularly surrounding visualization and interactive use within web clients), is that this is still not a straightforward issue due to size, serialization and inherent complexity. Additionally, existing data formats are either unsuitable for the Web (like netCDF files) or hard to interpret independently due to missing standard structures and metadata (e.g. the OPeNDAP protocol). Therefore, alternative, Web-friendly manifestations of such datasets are required. CoverageJSON is an emerging data format for publishing coverage data to the web in a web-friendly way that fits the linked data publication paradigm, hence lowering the barrier to interpretation for consumers via mobile devices and client applications, as well as for data producers who can build next-generation, Web-friendly Web services around datasets. This work will detail how CoverageJSON is being evaluated at NASA JPL's PO.DAAC as an enabling data representation format for publishing SD as Linked Open Data embedded within SD landing pages as well as via semantic data repositories. We are also evaluating how the use of CoverageJSON within SD landing pages addresses the long-standing acknowledgement that SD producers have not optimized the content of their landing pages for crawlability by commercial search engines.
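To give a concrete sense of the format discussed above, the sketch below assembles a tiny gridded coverage as a Python dict and serializes it to JSON. The field names follow the CoverageJSON specification as best recalled and should be checked against the current spec; the values are made up, and this is not PO.DAAC output.

# A rough sketch of the shape of a CoverageJSON document for a tiny gridded
# field, assembled as a plain Python dict. Field names reflect a reading of the
# CoverageJSON specification and should be verified against it; illustrative only.
import json

coverage = {
    "type": "Coverage",
    "domain": {
        "type": "Domain",
        "domainType": "Grid",
        "axes": {
            "x": {"values": [-130.0, -129.5]},
            "y": {"values": [30.0, 30.5]},
            "t": {"values": ["2016-01-01T00:00:00Z"]},
        },
    },
    "parameters": {
        "sst": {
            "type": "Parameter",
            "observedProperty": {"label": {"en": "Sea surface temperature"}},
            "unit": {"symbol": "K"},
        }
    },
    "ranges": {
        "sst": {
            "type": "NdArray",
            "dataType": "float",
            "axisNames": ["t", "y", "x"],
            "shape": [1, 2, 2],
            "values": [290.1, 290.3, 289.8, 290.0],
        }
    },
}

print(json.dumps(coverage, indent=2))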
Modeling the Webgraph: How Far We Are
NASA Astrophysics Data System (ADS)
Donato, Debora; Laura, Luigi; Leonardi, Stefano; Millozzi, Stefano
The following sections are included: * Introduction * Preliminaries * WebBase * In-degree and out-degree * PageRank * Bipartite cliques * Strongly connected components * Stochastic models of the webgraph * Models of the webgraph * A multi-layer model * Large scale simulation * Algorithmic techniques for generating and measuring webgraphs * Data representation and multifiles * Generating webgraphs * Traversal with two bits for each node * Semi-external breadth first search * Semi-external depth first search * Computation of the SCCs * Computation of the bow-tie regions * Disjoint bipartite cliques * PageRank * Summary and outlook
2016-04-01
the DOD will put DOD systems and data at a risk level comparable to that of their neighbors in the cloud. Just as a user browses a Web page on the...proxy servers for controlling user access to Web pages, and large-scale storage for data management. Each of these devices allows access to the...user to develop applications. Acunetics.com describes Web applications as “computer programs allowing Website visitors to submit and retrieve data
ERIC Educational Resources Information Center
Bird, Bruce
This paper discusses the development of two World Wide Web sites at Anne Arundel Community College (Maryland). The criteria for the selection of hardware and software for Web site development that led to the decision to use Microsoft FrontPage 98 are described along with its major components and features. The discussion of the Science Division Web…
ERIC Educational Resources Information Center
Kim, Youngsoo; Lee, Jong Yeon
1999-01-01
Provides an overview of cyber education in Korea that is meeting the educational needs of corporations as well as universities. Topics include Web pages; plans for performance evaluation; system architecture; and future possibilities, including service providers, information infrastructure, and rules and regulations. (Contains 23 references.) (LRW)
32 CFR 806b.53 - Training tools.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 6 2011-07-01 2011-07-01 false Training tools. 806b.53 Section 806b.53 National... Training § 806b.53 Training tools. Helpful resources include: (a) The Air Force Freedom of Information Act Web page which includes a Privacy Overview, Privacy Act training slides, the Air Force systems of...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-14
...? What Do I Need To Know?" is available for viewing on the FERC Web site ( www.ferc.gov ). This fact...; Installation of four new meter and regulation (M&R) stations including: Central M&R Receipt Station--a new M&R receipt station, including...
Rebooting the East: Automation in University Libraries of the Former German Democratic Republic.
ERIC Educational Resources Information Center
Seadle, Michael
1996-01-01
Provides a history of the automation efforts at former East German libraries that have made their resources available for the first time. Highlights include World Wide Web home page addresses; library problems, including censorship; automation guidelines, funding, and cooperation; online catalogs; and specific examples of university libraries'…
A Study of HTML Title Tag Creation Behavior of Academic Web Sites
ERIC Educational Resources Information Center
Noruzi, Alireza
2007-01-01
The HTML title tag information should identify and describe exactly what a Web page contains. This paper analyzes the "Title element" and raises a significant question: "Why is the title tag important?" Search engines base search results and page rankings on certain criteria. Among the most important criteria is the presence of the search keywords…
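A title element of the kind discussed above can be inspected with a few lines of standard-library Python; the sketch below is illustrative and ignores complications such as encoding detection and badly formed markup.

# A small stand-alone sketch of pulling the <title> element the record above
# discusses, using only the standard library.
from html.parser import HTMLParser
from urllib.request import urlopen

class TitleParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def page_title(url: str) -> str:
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()

if __name__ == "__main__":
    print(page_title("https://example.org/"))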
ERIC Educational Resources Information Center
Ludlow, John B.; Platin, Enrique
2000-01-01
Compared self-guided slide/tape (ST) and Web page (WP) instruction in normal radiographic anatomy of periapical and panoramic images using objective test performance and subjective preferences of 74 freshman dental students. Test performance was not different between image types or presentation technologies, but students preferred WP for…
Web Pages as an Interdisciplinary Tool in English for Architects Classes.
ERIC Educational Resources Information Center
Mansilla, Paloma Ubeda
2002-01-01
Proposes the use of web pages as an interdisciplinary tool in classes of English for professional and academic purposes. Languages and computing are two areas of knowledge that the graduate of the Polytechnic University of Madrid and its School of architecture need to study in order to supplement the education received during their degree with the…
Code of Federal Regulations, 2010 CFR
2010-07-01
.... This hourly rate is listed on the Commission's Web site at http://www.fmshrc.gov. Fees for searches of... listed on the Commission's Web site at http://www.fmshrc.gov. (c) Duplicating fee. The copy fee for each page of paper up to 8 1/2″ × 14″ shall be $.15 per copy per page. Any private sector services required...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-11
... deep saline geologic formations for permanent geologic storage. DATES: DOE invites the public to...; or by fax (304) 285-4403. The Draft EIS is available on DOE's NEPA Web page at: http://nepa.energy.gov/DOE_NEPA_documents.htm ; and on the National Energy Technology Laboratory's Web page at: http...
A Guide to Fast and Simple Web Site Development. Using Microsoft FrontPage.
ERIC Educational Resources Information Center
La, Minh; Beachler, Judith
Designed by California's Los Rios Community College District for use in instructional workshops, this guide is intended to help institutional researchers create World Wide Web sites using Microsoft FrontPage (MF) software. The first part of the guide presents practical suggestions for working with the software to create a site, covering the…
Rocha, Tânia; Bessa, Maximino; Gonçalves, Martinho; Cabral, Luciana; Godinho, Francisco; Peres, Emanuel; Reis, Manuel C; Magalhães, Luís; Chalmers, Alan
2012-11-01
One of the most frequently mentioned problems of web accessibility, recognized in several different studies, is the difficulty of perceiving what is or is not clickable on a web page. In particular, a key problem is the recognition of hyperlinks by a specific group of people, namely those with intellectual disabilities. This experiment investigated a methodology based on direct observation, video recording, interviews and data obtained from an eye-tracking device. Ten participants took part in this study. They were divided into two groups and asked to perform two tasks, 'Sing a song' and 'Listen to a story', on two websites. These websites were developed to include specific details. The first website presented an image navigation menu (INM), whereas the other showed a text navigation menu (TNM). There was a general improvement in the participants' performance when using the INM. The analysis shows that these participants not only gained a better understanding of the task demanded of them but also showed improved perception of the content of a navigation menu whose hyperlinks included images. © 2012 Blackwell Publishing Ltd.
Code of Federal Regulations, 2014 CFR
2014-10-01
... taken. Confidential commercial information means trade secrets and confidential, privileged, and/or... be taken. Department or DOT means the Department of Transportation, including the Office of the... electronically through its FOIA Web pages (http://www.dot.gov/foia) and at the physical locations identified in...
Enabling Scientists: Serving Sci-Tech Library Users with Disabilities.
ERIC Educational Resources Information Center
Coonin, Bryna
2001-01-01
Discusses how librarians in scientific and technical libraries can contribute to an accessible electronic library environment for users with disabilities to ensure independent access to information. Topics include relevant assistive technologies; creating accessible Web pages; monitoring accessibility of electronic databases; preparing accessible…
California: Library Information Technologies.
ERIC Educational Resources Information Center
Will, Barbara, Ed.
1996-01-01
Describes six information technology projects in California libraries, including Internet access in public libraries; digital library developments at the University of California, Berkeley; the World Wide Web home page for the state library; Pacific Bell's role in statewide connectivity; state government initiatives; and services of the state…
Web mining for topics defined by complex and precise predicates
NASA Astrophysics Data System (ADS)
Lee, Ching-Cheng; Sampathkumar, Sushma
2004-04-01
The enormous growth of the World Wide Web has made it important to perform resource discovery efficiently for any given topic. Several new techniques have been proposed in recent years for this kind of topic-specific web mining, among them a key technique called focused crawling, which is able to crawl topic-specific portions of the web without having to explore all pages. Most existing research on focused crawling considers a simple topic definition that typically consists of one or more keywords connected by an OR operator. However, this kind of simple topic definition may result in too many irrelevant pages in which the same keyword appears in the wrong context. In this research we explore new strategies for crawling topic-specific portions of the web using complex and precise predicates. A complex predicate allows the user to precisely specify a topic using Boolean operators such as "AND", "OR" and "NOT". Our work concentrates, first, on defining a format for specifying this kind of complex topic definition and, second, on devising a crawl strategy to crawl the topic-specific portions of the web defined by the complex predicate efficiently and with minimal overhead. The new crawl strategy improves the performance of topic-specific web crawling by reducing the number of irrelevant pages crawled. To demonstrate the effectiveness of this approach, we have built a complete focused crawler called "Eureka" with complex predicate support, and a search engine that indexes and supports end-user searches on the crawled pages.
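The complex-predicate idea above can be illustrated with a small recursive evaluator; the sketch below is not the Eureka implementation, just a toy decision function a crawler could apply to a fetched page before following its links.

# Not the Eureka implementation: just a sketch of evaluating a complex topic
# predicate with AND / OR / NOT over a page's text, which is the decision a
# focused crawler makes before following a page's outlinks.
def matches(predicate, text: str) -> bool:
    """predicate is either a keyword string or a tuple ('AND'|'OR'|'NOT', [sub-predicates])."""
    if isinstance(predicate, str):
        return predicate.lower() in text.lower()
    op, subs = predicate
    if op == "AND":
        return all(matches(s, text) for s in subs)
    if op == "OR":
        return any(matches(s, text) for s in subs)
    if op == "NOT":
        return not matches(subs[0], text)
    raise ValueError(f"unknown operator: {op}")

if __name__ == "__main__":
    topic = ("AND", ["lung cancer",
                     ("OR", ["treatment", "therapy"]),
                     ("NOT", ["veterinary"])])
    page_text = "New treatment options for early-stage lung cancer patients."
    print(matches(topic, page_text))  # True: crawl this page's outlinks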
Water fluoridation and the quality of information available online.
Frangos, Zachary; Steffens, Maryke; Leask, Julie
2018-02-13
The Internet has transformed the way in which people approach their health care, with online resources becoming a primary source of health information. Little work has assessed the quality of online information regarding community water fluoridation. This study sought to assess the information available to individuals searching online for information, with emphasis on the credibility and quality of websites. We identified the top 10 web pages returned from different search engines, using common fluoridation search terms (identified in Google Trends). Web pages were scored using a credibility, quality and health literacy tool based on Global Advisory Committee on Vaccine Safety (GACVS) and Centers for Disease Control and Prevention (CDC) criteria. Scores were compared according to their fluoridation stance and domain type, then ranked by quality. The functionality of the scoring tool was analysed via a Bland-Altman plot of inter-rater reliability. Five hundred web pages were returned, of which 55 were scored following removal of duplicates and irrelevant pages. Of these, 28 (51%) were pro-fluoridation, 16 (29%) were neutral and 11 (20%) were anti-fluoridation. Pro-, neutral and anti-fluoridation pages scored well against health literacy standards (0.91, 0.90 and 0.81/1, respectively). Neutral and pro-fluoridation web pages showed strong credibility, with mean scores of 0.80 and 0.85 respectively, while anti-fluoridation pages scored 0.62/1. Most pages scored poorly for content quality, providing a moderate amount of superficial information. Those seeking online information regarding water fluoridation are faced with comprehensible, yet poorly referenced, superficial information. Sites were credible and user friendly; however, our results suggest that online resources need to focus on providing more transparent information with appropriate figures to consolidate the information. © 2018 FDI World Dental Federation.
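For readers unfamiliar with the Bland-Altman computation used above for inter-rater reliability, the sketch below derives the mean difference and 95% limits of agreement from two raters' scores; the numbers are invented for illustration, not data from the study.

# A brief numpy sketch of the Bland-Altman inter-rater computation mentioned
# above: mean difference between two raters' scores and the 95% limits of
# agreement (mean difference +/- 1.96 standard deviations of the differences).
import numpy as np

rater_a = np.array([0.85, 0.60, 0.91, 0.72, 0.55, 0.80])  # made-up scores
rater_b = np.array([0.80, 0.65, 0.88, 0.70, 0.60, 0.78])  # made-up scores

diff = rater_a - rater_b
mean_diff = diff.mean()
sd_diff = diff.std(ddof=1)
upper, lower = mean_diff + 1.96 * sd_diff, mean_diff - 1.96 * sd_diff

print(f"mean difference: {mean_diff:.3f}")
print(f"95% limits of agreement: {lower:.3f} to {upper:.3f}")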
Checking an integrated model of web accessibility and usability evaluation for disabled people.
Federici, Stefano; Micangeli, Andrea; Ruspantini, Irene; Borgianni, Stefano; Corradi, Fabrizio; Pasqualotto, Emanuele; Olivetti Belardinelli, Marta
2005-07-08
A combined objective-oriented and subjective-oriented method for evaluating the accessibility and usability of web pages for students with disability was tested. The objective-oriented approach verifies the conformity of interfaces to standard rules stated by national and international organizations responsible for web technology standardization, such as the W3C. Conversely, the subjective-oriented approach assesses how final users interact with the artificial system, capturing levels of user satisfaction based on personal factors and environmental barriers. Five kinds of measurements were applied as objective-oriented and subjective-oriented tests. Objective-oriented evaluations were performed on the Help Desk web page for students with disability, included in the website of a large Italian state university. Subjective-oriented tests were administered to 19 students identified as disabled on the basis of their own declaration at university enrolment: 13 students were tested by means of the SUMI test and six students by means of the 'Cooperative evaluation'. The objective-oriented and subjective-oriented methods highlighted different and sometimes conflicting results. Both methods pointed out much more consistency regarding levels of accessibility than of usability. Since usability is largely affected by individual differences in the user's own (dis)abilities, the subjective-oriented measures underscored the fact that blind students encountered far greater difficulty surfing the web.
Automatic generation of Web mining environments
NASA Astrophysics Data System (ADS)
Cibelli, Maurizio; Costagliola, Gennaro
1999-02-01
The main problem related to the retrieval of information from the world wide web is the enormous number of unstructured documents and resources, i.e., the difficulty of locating and tracking appropriate sources. This paper presents a web mining environment (WME), which is capable of finding, extracting and structuring information related to a particular domain from web documents, using general purpose indices. The WME architecture includes a web engine filter (WEF), to sort and reduce the answer set returned by a web engine, a data source pre-processor (DSP), which processes html layout cues in order to collect and qualify page segments, and a heuristic-based information extraction system (HIES), to finally retrieve the required data. Furthermore, we present a web mining environment generator, WMEG, that allows naive users to generate a WME specific to a given domain by providing a set of specifications.
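The WEF/DSP/HIES pipeline described above can be pictured as a filter-segment-extract chain; the sketch below is a loose illustration under assumed heuristics and invented function names, not the authors' implementation.

```python
# Illustrative pipeline in the spirit of the WME stages; names and
# heuristics are invented, not the authors' implementation.
import re

def web_engine_filter(results, domain_terms):
    """WEF: sort and reduce a search-engine answer set by crude term overlap."""
    def score(r):
        text = (r["title"] + " " + r["snippet"]).lower()
        return sum(t in text for t in domain_terms)
    ranked = sorted(results, key=score, reverse=True)
    return [r for r in ranked if score(r) > 0]

def data_source_preprocessor(html):
    """DSP: use layout cues (here, just heading/paragraph tags) to collect segments."""
    return re.findall(r"<(?:h\d|p)[^>]*>(.*?)</(?:h\d|p)>", html, flags=re.S | re.I)

def heuristic_extractor(segments, pattern):
    """HIES: pull the required data out of qualified segments with a heuristic regex."""
    return [m.group(1) for s in segments if (m := re.search(pattern, s))]

html = "<h1>Gene therapy labs</h1><p>Contact: lab@example.org</p>"
segments = data_source_preprocessor(html)
print(heuristic_extractor(segments, r"([\w.]+@[\w.]+)"))  # ['lab@example.org']
```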
The quality and readability of online consumer information about gynecologic cancer.
Sobota, Aleksandra; Ozakinci, Gozde
2015-03-01
The Internet has become an important source of health-related information for consumers, among whom younger women constitute a notable group. The aims of this study were (1) to evaluate the quality and readability of online information about gynecologic cancer using validated instruments and (2) to relate the quality of information to its readability. Using the Alexa Rank, we obtained a list of 35 Web pages providing information about 7 gynecologic malignancies. These were assessed using the Health on the Net (HON) seal of approval, the Journal of the American Medical Association (JAMA) benchmarks, and the DISCERN instrument. Flesch readability score was calculated for sections related to symptoms and signs and treatment. Less than 30% of the Web pages displayed the HON seal or achieved all JAMA benchmarks. The majority of the treatment sections were of moderate to high quality according to the DISCERN. There was no significant relationship between the presence of the HON seal and readability. Web pages achieving all JAMA benchmarks were significantly more difficult to read and understand than Web pages that missed any of the JAMA benchmarks. Treatment-related content of moderate to high quality as assessed by the DISCERN had a significantly better readability score than the low-quality content. The online information about gynecologic cancer provided by the most frequently visited Web pages is of variable quality and in general difficult to read and understand. The relationship between the quality and readability remains unclear. Health care providers should direct their patients to reliable material online because patients consider the Internet as an important source of information.
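The Flesch readability scoring used in such assessments follows a fixed formula (206.835 minus 1.015 times words-per-sentence minus 84.6 times syllables-per-word); the sketch below applies it with a crude syllable heuristic of our own, not the validated tooling a published study would use.

```python
# Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
# The syllable counter below is a rough vowel-group heuristic for illustration.
import re

def count_syllables(word):
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

sample = "Treatment options depend on the stage of the disease. Ask your doctor."
print(round(flesch_reading_ease(sample), 1))  # higher scores read more easily
```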
Web party effect: a cocktail party effect in the web environment.
Rigutti, Sara; Fantoni, Carlo; Gerbino, Walter
2015-01-01
In goal-directed web navigation, labels compete for selection: this process often involves knowledge integration and requires selective attention to manage the dizziness of web layouts. Here we ask whether the competition for selection depends on all web navigation options or only on those options that are more likely to be useful for information seeking, and provide evidence in favor of the latter alternative. Participants in our experiment navigated a representative set of real websites of variable complexity, in order to reach an information goal located two clicks away from the starting home page. The time needed to reach the goal was accounted for by a novel measure of home page complexity based on a part of (not all) web options: the number of links embedded within web navigation elements weighted by the number and type of embedding elements. Our measure fully mediated the effect of several standard complexity metrics (the overall number of links, words, images, graphical regions, the JPEG file size of home page screenshots) on information seeking time and usability ratings. Furthermore, it predicted the cognitive demand of web navigation, as revealed by the duration judgment ratio (i.e., the ratio of subjective to objective duration of information search). Results demonstrate that focusing on relevant links while ignoring other web objects optimizes the deployment of attentional resources necessary to navigation. This is in line with a web party effect (i.e., a cocktail party effect in the web environment): users tune into web elements that are relevant for the achievement of their navigation goals and tune out all others.
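A rough sketch of the kind of weighted-link complexity measure and duration judgment ratio described above is given below; the element types and weights are invented placeholders, not the calibrated values from the study.

```python
# Hedged sketch: a home-page complexity score that counts only links embedded
# in navigation elements, weighted by element type, plus the duration judgment
# ratio. Weights and element names are illustrative, not the study's values.

NAV_WEIGHTS = {"menu": 1.0, "navbar": 1.2, "sidebar": 0.8}  # assumed weights

def navigation_complexity(nav_elements):
    """nav_elements: list of (element_type, number_of_embedded_links)."""
    return sum(NAV_WEIGHTS.get(kind, 1.0) * links for kind, links in nav_elements)

def duration_judgment_ratio(subjective_seconds, objective_seconds):
    """Ratio of perceived to actual search duration (a cognitive-demand proxy)."""
    return subjective_seconds / objective_seconds

home_page = [("navbar", 12), ("menu", 25), ("sidebar", 8)]
print(navigation_complexity(home_page))      # weighted link count
print(duration_judgment_ratio(40.0, 32.5))   # > 1: the search felt longer than it was
```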
Blues for the Lecture Theatre--The Pharmacology Songbook
ERIC Educational Resources Information Center
MacDonald, Ewen; Saarti, Jarmo
2006-01-01
In 2005, we were able to digitally record the so-called pharmacology songbook; a set of songs with lyrics devoted to pharmacological topics. A CD was prepared entitled The Beta-blocker Blues and its contents are now all freely available in mp3 format from our web page (Ewen MacDonald & friends, 2005). The web page also contains the lyrics and…
ERIC Educational Resources Information Center
Snyder, Robin M.
HTML provides a platform-independent way of creating and making multimedia presentations for classroom instruction and making that content available on the Internet. However, time in class is very valuable, so that any way to automate or otherwise assist the presenter in Web page navigation during class can save valuable seconds. This paper…
ERIC Educational Resources Information Center
Chou, Huey-Wen; Wang, Yu-Fang
1999-01-01
Compares the effects of two training methods on computer attitude and performance in a World Wide Web page design program in a field experiment with high school students in Taiwan. Discusses individual differences, Kolb's Experiential Learning Theory and Learning Style Inventory, Computer Attitude Scale, and results of statistical analyses.…
Satellite Imagery Products - Office of Satellite and Product Operations
ERIC Educational Resources Information Center
Slowinski, Joseph
1999-01-01
Offers suggestions on how to add the power of a free online translator, links, and multicultural search engines to a teacher's classroom home page. Describes the Alta Vista Babelfish online translation service that can be used to translate Web pages on a variety of topics written in German, Spanish, Italian, French, or Portuguese. (SLD)
Self-presentation on the Web: agencies serving abused and assaulted women.
Sorenson, Susan B; Shi, Rui; Zhang, Jingwen; Xue, Jia
2014-04-01
We examined the content and usability of the Web sites of agencies serving women victims of violence. We entered the names of a systematic 10% sample of 3774 agencies listed in 2 national directories into a search engine. We captured (in April 2012) and analyzed screenshots of the 261 resulting home pages and assessed the readability of 193 home and first-level pages. Victims (94%) and donors (68%) were the primary intended audiences. About one half used social media and one third provided cues to action. Almost all (96.4%) of the Web pages were rated "fairly difficult" to "very confusing" to read, and 81.4% required more than a ninth-grade education to understand. The service and marketing functions were met fairly well by the agency home pages, but usability (particularly readability and the offer of a mobile version) and efforts to increase user safety could be improved. Internet technologies are an essential platform for public health. They are particularly useful for reaching people with stigmatized health conditions because of the anonymity they allow. The one third of agencies that lack a Web site will not reach the substantial portion of the population that uses the Internet to find health information and other resources.
Business Systems Branch Abilities, Capabilities, and Services Web Page
NASA Technical Reports Server (NTRS)
Cortes-Pena, Aida Yoguely
2009-01-01
During the INSPIRE summer internship I acted as the Business Systems Branch Capability Owner for the Kennedy Web-based Initiative for Communicating Capabilities System (KWICC), with the responsibility of creating a portal that describes the services provided by this Branch. This project will help others achieve a clear view of the services that the Business Systems Branch provides to NASA and the Kennedy Space Center. After collecting data through interviews with subject matter experts and the literature in Business World and other web sites, I identified discrepancies, made the necessary corrections to the sites, and placed the information from the report into the KWICC web page.
Lifting Events in RDF from Interactions with Annotated Web Pages
NASA Astrophysics Data System (ADS)
Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad
In this paper we present a method and an implementation for creating and processing semantic events from interaction with Web pages, which opens possibilities to build event-driven applications for the (Semantic) Web. Events, simple or complex, are models for things that happen, e.g., when a user interacts with a Web page. Events are consumed in some meaningful way, e.g., for monitoring reasons or to trigger actions such as responses. In order for receiving parties to understand events, e.g., to comprehend what has led to an event, we propose a general event schema using RDFS. In this schema we cover the composition of complex events and event-to-event relationships. These events can then be used to route semantic information about an occurrence to different recipients, helping to make the Semantic Web active. Additionally, we present an architecture for detecting and composing events in Web clients. For the contents of events we show how they are enriched with semantic information about the context in which they occurred. The paper is presented in conjunction with the use case of Semantic Advertising, which extends traditional clickstream analysis by introducing semantic short-term profiling, enabling discovery of the current interest of a Web user and therefore supporting advertisement providers in responding with more relevant advertisements.
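As a loose illustration of lifting an interaction into RDF, the sketch below builds a click event as triples with the rdflib library; the namespace, class, and property names are invented for this example and are not the authors' published schema.

```python
# Hedged sketch: representing a click interaction as RDF triples with rdflib.
# The vocabulary (namespace, ClickEvent, onPage, onElement, occurredAt) is
# invented for illustration and is not the authors' event schema.
import uuid
from datetime import datetime, timezone

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

EV = Namespace("http://example.org/events#")

def lift_click_event(page_url, element_id):
    g = Graph()
    event = URIRef(f"http://example.org/events/{uuid.uuid4()}")
    g.add((event, RDF.type, EV.ClickEvent))
    g.add((event, EV.onPage, URIRef(page_url)))
    g.add((event, EV.onElement, Literal(element_id)))
    g.add((event, EV.occurredAt,
           Literal(datetime.now(timezone.utc).isoformat(), datatype=XSD.dateTime)))
    return g

graph = lift_click_event("http://example.org/landing-page", "buy-button")
print(graph.serialize(format="turtle"))
```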
Challenges in Managing Information Extraction
ERIC Educational Resources Information Center
Shen, Warren H.
2009-01-01
This dissertation studies information extraction (IE), the problem of extracting structured information from unstructured data. Example IE tasks include extracting person names from news articles, product information from e-commerce Web pages, street addresses from emails, and names of emerging music bands from blogs. IE is an increasingly…
Metadata: Pure and Simple, or Is It?
ERIC Educational Resources Information Center
Chalmers, Marilyn
2002-01-01
Discusses issues concerning metadata in Web pages based on experiences in a vocational education center library in Queensland (Australia). Highlights include Dublin Core elements; search engines; controlled vocabulary; performance measurement to assess usage patterns and provide quality control over the vocabulary; and considerations given the…
Legal Issues in Educational Technology: Implications for School Leaders.
ERIC Educational Resources Information Center
Quinn, David M.
2003-01-01
Discusses several legal issues involving the use of educational technology: Freedom of speech, regulation of Internet material harmful to minors, student-developed Web pages, harassment and hostile work environment, staff and student privacy, special education, plagiarism, and copyright issues. Includes recommendations for addressing technology…
Analysis of Technique to Extract Data from the Web for Improved Performance
NASA Astrophysics Data System (ADS)
Gupta, Neena; Singh, Manish
2010-11-01
The World Wide Web is rapidly leading the world into an amazing new electronic world, where everyone can publish anything in electronic form and extract almost any information. Extraction of information from semi-structured or unstructured documents, such as web pages, is a useful yet complex task. Data extraction, which is important for many applications, extracts records from HTML files automatically. Ontologies can achieve a high degree of accuracy in data extraction. We analyze a method for data extraction, OBDE (Ontology-Based Data Extraction), which automatically extracts query result records from the web with the help of agents. OBDE first constructs an ontology for a domain according to information matching between the query interfaces and query result pages from different web sites within the same domain. Then, the constructed domain ontology is used during data extraction to identify the query result section in a query result page and to align and label the data values in the extracted records. The ontology-assisted data extraction method is fully automatic and overcomes many of the deficiencies of current automatic data extraction methods.
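The ontology-assisted labeling idea can be illustrated with a toy example: a small attribute-to-synonym map is matched against the label/value fragments of a result record. Everything below (the ontology, the record, the matching heuristic) is an assumption for illustration, not the OBDE system itself.

```python
# Hedged sketch of the ontology-assisted idea: use a small domain "ontology"
# (attribute -> synonymous labels) to locate and label values in a query
# result record. Invented for illustration; not the OBDE implementation.
import re

BOOK_ONTOLOGY = {
    "title":  ["title", "book"],
    "author": ["author", "by"],
    "price":  ["price", "cost"],
}

def label_record(record_text, ontology):
    """Scan 'label: value' fragments and map them onto ontology attributes."""
    labeled = {}
    for fragment in re.split(r"[;\n]", record_text):
        if ":" not in fragment:
            continue
        label, value = (part.strip() for part in fragment.split(":", 1))
        for attribute, synonyms in ontology.items():
            if label.lower() in synonyms:
                labeled[attribute] = value
    return labeled

record = "Book: Data Extraction on the Web; By: N. Gupta; Cost: 35 USD"
print(label_record(record, BOOK_ONTOLOGY))
# {'title': 'Data Extraction on the Web', 'author': 'N. Gupta', 'price': '35 USD'}
```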
The effects of link format and screen location on visual search of web pages.
Ling, Jonathan; Van Schaik, Paul
2004-06-22
Navigation of web pages is of critical importance to the usability of web-based systems such as the World Wide Web and intranets. The primary means of navigation is through the use of hyperlinks. However, few studies have examined the impact of the presentation format of these links on visual search. The present study used a two-factor mixed measures design to investigate whether there was an effect of link format (plain text, underlined, bold, or bold and underlined) upon speed and accuracy of visual search and subjective measures in both the navigation and content areas of web pages. An effect of link format on speed of visual search for both hits and correct rejections was found. This effect was observed in the navigation and the content areas. Link format did not influence accuracy in either screen location. Participants showed highest preference for links that were in bold and underlined, regardless of screen area. These results are discussed in the context of visual search processes and design recommendations are given.
Singer, Philipp; Helic, Denis; Taraghi, Behnam; Strohmaier, Markus
2014-01-01
One of the most frequently used models for understanding human navigation on the Web is the Markov chain model, where Web pages are represented as states and hyperlinks as probabilities of navigating from one page to another. Predominantly, human navigation on the Web has been thought to satisfy the memoryless Markov property stating that the next page a user visits only depends on her current page and not on previously visited ones. This idea has found its way in numerous applications such as Google's PageRank algorithm and others. Recently, new studies suggested that human navigation may better be modeled using higher order Markov chain models, i.e., the next page depends on a longer history of past clicks. Yet, this finding is preliminary and does not account for the higher complexity of higher order Markov chain models which is why the memoryless model is still widely used. In this work we thoroughly present a diverse array of advanced inference methods for determining the appropriate Markov chain order. We highlight strengths and weaknesses of each method and apply them for investigating memory and structure of human navigation on the Web. Our experiments reveal that the complexity of higher order models grows faster than their utility, and thus we confirm that the memoryless model represents a quite practical model for human navigation on a page level. However, when we expand our analysis to a topical level, where we abstract away from specific page transitions to transitions between topics, we find that the memoryless assumption is violated and specific regularities can be observed. We report results from experiments with two types of navigational datasets (goal-oriented vs. free form) and observe interesting structural differences that make a strong argument for more contextual studies of human navigation in future work. PMID:25013937
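One common way to weigh model fit against the complexity of higher-order chains, in the spirit of the methods surveyed above, is an information criterion such as BIC; the sketch below applies it to a toy click sequence and is illustrative only.

```python
# Hedged sketch: compare Markov chain orders on a click sequence using
# log-likelihood plus a BIC-style complexity penalty. The toy sequence and
# the specific penalty are illustrative; the paper evaluates a wider range
# of inference methods on real navigation data.
import math
from collections import Counter

def log_likelihood(sequence, order):
    """MLE log-likelihood of an order-k Markov chain fitted to the sequence."""
    context_counts, transition_counts = Counter(), Counter()
    for i in range(order, len(sequence)):
        context = tuple(sequence[i - order:i])
        context_counts[context] += 1
        transition_counts[(context, sequence[i])] += 1
    return sum(n * math.log(n / context_counts[ctx])
               for (ctx, _), n in transition_counts.items())

def bic(sequence, order):
    states = set(sequence)
    n_params = (len(states) ** order) * (len(states) - 1)  # free parameters
    n_obs = len(sequence) - order
    return n_params * math.log(n_obs) - 2 * log_likelihood(sequence, order)

clicks = list("ABABABCABABABCABABABC")  # toy page-visit sequence
for k in (1, 2, 3):
    print(f"order {k}: BIC = {bic(clicks, k):.1f}")  # lower is better
```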
Students using visual thinking to learn science in a Web-based environment
NASA Astrophysics Data System (ADS)
Plough, Jean Margaret
United States students' science test scores are low, especially in problem solving, and traditional science instruction could be improved. Consequently, visual thinking, constructing science structures, and problem solving in a web-based environment may be valuable strategies for improving science learning. This ethnographic study examined the science learning of fifteen fourth grade students in an after school computer club involving diverse students at an inner city school. The investigation was done from the perspective of the students, and it described the processes of visual thinking, web page construction, and problem solving in a web-based environment. The study utilized informal group interviews, field notes, Visual Learning Logs, and student web pages, and incorporated a Standards-Based Rubric which evaluated students' performance on eight science and technology standards. The Visual Learning Logs were drawings done on the computer to represent science concepts related to the Food Chain. Students used the internet to search for information on a plant or animal of their choice. Next, students used this internet information, with the information from their Visual Learning Logs, to make web pages on their plant or animal. Later, students linked their web pages to form Science Structures. Finally, students linked their Science Structures with the structures of other students, and used these linked structures as models for solving problems. Further, during informal group interviews, students answered questions about visual thinking, problem solving, and science concepts. The results of this study showed clearly that (1) making visual representations helped students understand science knowledge, (2) making links between web pages helped students construct Science Knowledge Structures, and (3) students themselves said that visual thinking helped them learn science. In addition, this study found that when using Visual Learning Logs, the main overall ideas of the science concepts were usually represented accurately. Further, looking for information on the internet may cause new problems in learning. Likewise, being absent, starting late, and/or dropping out all may negatively influence students' proficiency on the standards. Finally, the way Science Structures are constructed and linked may provide insights into the way individual students think and process information.
Google Analytics: Single Page Traffic Reports
These are pages that live outside of Google Analytics (GA) but allow you to view GA data for any individual page on either the public EPA web or EPA intranet. You do need to log in to Google Analytics to view them.
Working with WebQuests: Making the Web Accessible to Students with Disabilities.
ERIC Educational Resources Information Center
Kelly, Rebecca
2000-01-01
This article describes how students with disabilities in regular classes are using the WebQuest lesson format to access the Internet. It explains essential WebQuest principles, creating a draft Web page, and WebQuest components. It offers an example of a WebQuest about salvaging the sunken ships, Titanic and Lusitania. A WebQuest planning form is…
A tool for improving the Web accessibility of visually handicapped persons.
Fujiki, Tadayoshi; Hanada, Eisuke; Yamada, Tomomi; Noda, Yoshihiro; Antoku, Yasuaki; Nakashima, Naoki; Nose, Yoshiaki
2006-04-01
Much has been written concerning the difficulties faced by visually handicapped persons when they access the internet. To solve some of the problems and to make web pages more accessible, we developed a tool we call the "Easy Bar," which works as a toolbar on the web browser. The functions of the Easy Bar are to change the size of web texts and images, to adjust the color, and to clear cached data that is automatically saved by the web browser. These functions are executed with ease by clicking buttons and operating a pull-down list. Since the icons built into Easy Bar are quite large, it is not necessary for the user to deal with delicate operations. The functions of Easy Bar run on any web page without increasing the processing time. For the visually handicapped, Easy Bar would contribute greatly to improved web accessibility to medical information.
Thompson, Terrill; Burgstahler, Sheryl; Moore, Elizabeth J
2010-01-01
This article reports on a follow-up assessment to Thompson et al. (Proceedings of The First International Conference on Technology-based Learning with Disability, July 19-20, Dayton, Ohio, USA; 2007. pp 127-136), in which higher education home pages were evaluated over a 5-year period on their accessibility to individuals with disabilities. The purpose of this article is to identify trends in web accessibility and long-term impact of outreach and education. Home pages from 127 higher education institutions in the Northwest were evaluated for accessibility three times over a 6-month period in 2004-2005 (Phase I), and again in 2009 (Phase II). Schools in the study were offered varying degrees of training and/or support on web accessibility during Phase I. Pages were evaluated for accessibility using a set of manual checkpoints developed by the researchers. Over the 5-year period reported in this article, significant positive gains in accessibility were revealed on some measures, but accessibility declined on other measures. The areas of improvement are arguably the more basic, easy-to-implement accessibility features, while the area of decline is keyboard accessibility, which is likely associated with the emergence of dynamic new technologies on web pages. Even on those measures where accessibility is improving, it is still strikingly low. In Phase I of the study, institutions that received extensive training and support were more likely than other institutions to show improved accessibility on the measures where institutions improved overall, but were equally or more likely than others to show a decline on measures where institutions showed an overall decline. In Phase II, there was no significant difference between institutions who had received support earlier in the study, and those who had not. Results suggest that growing numbers of higher education institutions in the Northwest are motivated to add basic accessibility features to their home pages, and that outreach and education may have a positive effect on these measures. However, the results also reveal negative trends in accessibility, and outreach and education may not be strong enough to counter the factors that motivate institutions to deploy inaccessible emerging technologies. Further research is warranted toward identifying the motivational factors that are associated with increased and decreased web accessibility, and much additional work is needed to ensure that higher education web pages are accessible to individuals with disabilities.
CH5M3D: an HTML5 program for creating 3D molecular structures.
Earley, Clarke W
2013-11-18
While a number of programs and web-based applications are available for the interactive display of 3-dimensional molecular structures, few of these provide the ability to edit these structures. For this reason, we have developed a library written in JavaScript to allow for the simple creation of web-based applications that should run on any browser capable of rendering HTML5 web pages. While our primary interest in developing this application was for educational use, it may also prove useful to researchers who want a light-weight application for viewing and editing small molecular structures. Molecular compounds are drawn on the HTML5 Canvas element, with the JavaScript code making use of standard techniques to allow display of three-dimensional structures on a two-dimensional canvas. Information about the structure (bond lengths, bond angles, and dihedral angles) can be obtained using a mouse or other pointing device. Both atoms and bonds can be added or deleted, and rotation about bonds is allowed. Routines are provided to read structures either from the web server or from the user's computer, and creation of galleries of structures can be accomplished with only a few lines of code. Documentation and examples are provided to demonstrate how users can access all of the molecular information for creation of web pages with more advanced features. A light-weight (≈ 75 kb) JavaScript library has been made available that allows for the simple creation of web pages containing interactive 3-dimensional molecular structures. Although this library is designed to create web pages, a web server is not required. Installation on a web server is straightforward and does not require any server-side modules or special permissions. The ch5m3d.js library has been released under the GNU GPL version 3 open-source license and is available from http://sourceforge.net/projects/ch5m3d/.
Web-based pathology practice examination usage.
Klatt, Edward C
2014-01-01
General and subject-specific practice examinations for students in the health sciences studying pathology were placed onto a free public internet web site, entitled WebPath, and were accessed four clicks from the home web site menu. Multiple choice questions were coded into .html files with JavaScript functions for web browser viewing in a timed format. A Perl programming language script with a common gateway interface for web page forms scored examinations and placed results into a log file on an internet computer server. The four general review examinations of 30 questions each could be completed in up to 30 min. The 17 subject-specific examinations of 10 questions each, with accompanying images, could be completed in up to 15 min each. The results of scores and user educational field of study from log files were compiled from June 2006 to January 2014. The four general review examinations had 31,639 accesses with completion of all questions, for a completion rate of 54% and an average score of 75%. A score of 100% was achieved by 7% of users, ≥90% by 21%, and ≥50% by 95% of users. In top-to-bottom web page menu order, review examination usage was 44%, 24%, 17%, and 15% of all accessions. The 17 subject-specific examinations had 103,028 completions, with a completion rate of 73% and an average score of 74%. Scoring at 100% was 20% overall, ≥90% by 37%, and ≥50% by 90% of users. The first three menu items on the web page accounted for 12.6%, 10.0%, and 8.2% of all completions, and the bottom three accounted for no more than 2.2% each. Completion rates were higher for the shorter 10-question subject examinations. Users identifying themselves as MD/DO scored higher than other users, averaging 75%. Usage was higher for examinations at the top of the web page menu. Scores achieved suggest that a cohort of serious users fully completing the examinations had sufficient preparation to use them to support their pathology education.
Patients' rights on the World Wide Web.
Taylor, M K
2001-01-01
Managed care reform, commonly referred to as "patients' rights" legislation, has become a hot topic. Many groups, including consumers, health care professionals, employers, managed care organizations, political parties, and government agencies, have strong opinions about measures that should be taken and what the outcomes of these measures might be. Those investigating this multidisciplinary topic will want to examine health care administration, ethics, health services research, and political science sources. Web resources covered in this article include: clearinghouses; government agencies; federal legislative and legal sites; and home pages of professional and trade associations, policy research institutes, and consumer advocacy organizations.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-31
... today's final rule will be required on March 7, 2015 and January 1, 2018, as set forth in Table I.1 in... information that is exempt from public disclosure. A link to the docket web page can be found at: www.regulations.gov/#!docketDetail;D=EERE-2008-BT-STD-0019. The regulations.gov web page contains instructions on...
THUIR at TREC 2009 Web Track: Finding Relevant and Diverse Results for Large Scale Web Search
2009-11-01
Porn-word filtering is also one of the anti-spam techniques used in real-world search engines. A list of porn words was found on the internet [2]. ... When the number of porn words in a page is larger than α, the page is taken as spam. In our experiments, the threshold is set to 16.
ERIC Educational Resources Information Center
Gallo, Gail; Wichowski, Chester P.
This second of two guides on Netscape Communicator 4.5 contains six lessons on advanced searches, multimedia, and composing a World Wide Web page. Lesson 1 is a review of the Navigator window, toolbars, and menus. Lesson 2 covers AltaVista's advanced search tips, searching for information excluding certain text, and advanced and nested Boolean…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-15
....gov . To view a copy of this information collection request (ICR) submitted to OMB: (1) Go to the web page http://reginfo.gov/public/do/PRAMain , (2) look for the section of the web page called ``Currently..., digital, LPTV and TV translator stations. The CBPA directs that Class A stations must comply with the...
Lafrenière, Darquise; Hurlimann, Thierry; Menuz, Vincent; Godard, Béatrice
2014-10-01
The push for knowledge translation on the part of health research funding agencies is significant in Canada, and many strategies have been adopted to promote the conversion of knowledge into action. In recent years, an increasing number of health researchers have been studying arts-based interventions to transform knowledge into action. This article reports on the results of an online questionnaire aimed at evaluating the effectiveness of a knowledge dissemination intervention (KDI) conveying findings from a study on the scientific and ethical challenges raised by nutrigenomics-nutrigenetics (NGx) research. The KDI was based on the use of four Web pages combining original, interactive cartoon-like illustrations accompanied by text to disseminate findings to Canadian Research Ethics Boards members, as well as to NGx researchers and researchers in ethics worldwide. Between May and October 2012, the links to the Web pages were sent in a personal email to target audience members, one thematic Web page at a time. On each thematic Web page, members of the target audience were invited to answer nine evaluation questions assessing the effectiveness of the KDI on four criteria, (i) acquisition of knowledge; (ii) change in initial understanding; (iii) generation of questions from the findings; and (iv) intent to change own practice. Response rate was low; results indicate that: (i) content of the four Web pages did not bring new knowledge to a majority of the respondents, (ii) initial understanding of the findings did not change for a majority of NGx researchers and a minority of ethics respondents, (iii) although the KDI did raise questions for respondents, it did not move them to change their practice. While target end-users may not feel that they actually learned from the KDI, it seems that the findings conveyed encouraged reflection and raised useful and valuable questions for them. Moreover, the evaluation of the KDI proved to be useful to gain knowledge about our target audiences' views since respondents' comments allowed us to improve our understanding of the disseminated knowledge as well as to modify (and hopefully improve) the content of the Web pages used for dissemination. Copyright © 2014 Elsevier Ltd. All rights reserved.
77 FR 44306 - Service Delivery Plan
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-27
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2012-0048] Service Delivery Plan AGENCY: Social... publicly available. Do not include in your comments any personal information, such as Social Security... function of the Web page to find docket number SSA-2012-0048. The system will issue you a tracking number...
NASA Technical Reports Server (NTRS)
1999-01-01
This viewgraph presentation gives an overview of the Access to Space website, including information on the 'tool boxes' available on the website for access opportunities, performance, interfaces, volume, environments, 'wish list' entry, and educational outreach.
Project outputs will include: 1) the sustainability network and associated web pages; 2) sustainability indicators and associated maps representing the current values of the metrics; 3) an integrated assessment model of the impacts of electricity generation alternatives on a ...
Promoting Creative Thinking through the Use of ICT.
ERIC Educational Resources Information Center
Wheeler, Steve; Waite, S. J.; Bromfield, C.
2002-01-01
Reports on a pilot study that investigated the creative impact of information and communication technology (ICT) in a grade 6 class in England. Presents a model of creativity that includes problem solving, creative cognition, and social interaction; and discusses the creation of personal Web pages. (Author/LRW)
Tools for Creating Mobile Applications for Extension
ERIC Educational Resources Information Center
Drill, Sabrina L.
2012-01-01
Considerations and tools for developing mobile applications for Extension include evaluating the topic, purpose, and audience. Different computing platforms may be used, and apps designed as modified Web pages or implicitly programmed for a particular platform. User privacy is another important consideration, especially for data collection apps.…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-14
..., including through the use of information technology. Please note that written comments received in response... Information for Financial Aid Professionals Web page. Only foreign schools who are registered with Federal... DEPARTMENT OF EDUCATION Proposed Information Collection Requests; Federal Student Aid; Foreign...
Lemaire, Edward; Greene, G
2003-01-01
We produced continuing education material in physical rehabilitation using a variety of electronic media. We compared four methods of delivering the learning modules: in person with a computer projector, desktop videoconferencing, Web pages and CD-ROM. Health-care workers at eight community hospitals and two nursing homes were asked to participate in the project. A total of 394 questionnaires were received for all modalities: 73 for in-person sessions, 50 for desktop conferencing, 227 for Web pages and 44 for CD-ROM. This represents a 100% response rate from the in-person, desktop conferencing and CD-ROM groups; the response rate for the Web group is unknown, since the questionnaires were completed online. Almost all participants found the modules to be helpful in their work. The CD-ROM group gave significantly higher ratings than the Web page group, although all four learning modalities received high ratings. A combination of all four modalities would be required to provide the best possible learning opportunity.
Bruendl, Johannes; Rothbauer, Clemens; Ludwig, Bernd; Dotzler, Bernhard; Wolff, Christian; Reimann, Sandra; Borgmann, Hendrik; Burger, Maximilian; Breyer, Johannes
2018-01-01
The internet is an emerging source of information for prostate cancer (PCa) patients. Since little is known about the quality of information on PCa provided online, we investigated its accordance to the latest European Association of Urology (EAU) guidelines. A total of 89 German web pages were included for analysis. A quality model classifying the provider of information and its expertise was introduced. Correctness of provided information was systematically compared to the EAU guidelines. Information was provided by medical experts (41%), media (11%), and pharmaceutical companies (6%). Certificates were found in 23% with a significantly higher rate if provided by medical experts (p = 0.003). The minority of web pages showed information in accordance with the EAU guidelines regarding screening (63%), diagnosis (32%), classification (39%), therapy (36%), complications (8%), and follow-up (27%). Web pages by medical experts as well as websites with any kind of certification showed a significantly higher guideline conformity regarding diagnosis (p = 0.027, p = 0.002), therapy (p = 0.010, p = 0.011), follow-up (p = 0.005, p < 0.001), and availability of references (p = 0.017, p = 0.003). The present study reveals that online health information on PCa lacks concordance to current guidelines. Certified websites or websites provided by medical experts showed a significantly higher quality and accordance with guidelines. © 2018 S. Karger AG, Basel.
2008-08-15
... running the real-time application we used in our previous study on IBM WebSphere Real Time. IBM WebSphere Real Time automatically sets Metronome, its ... The experiments show that the modified code for the Shadow Design Pattern runs well under Metronome. ... includes the real-time garbage collector called Metronome. Unlike the Sun RTGC, we cannot change the priority of the Metronome RTGC. Metronome is ...
QNAP 1263U Network Attached Storage (NAS)/ Storage Area Network (SAN) Device Users Guide
2016-11-01
standard Ethernet network. Operating either a NAS or SAN is vital for the integrity of the data stored on the drives found in the device. Redundant ... speed of the network itself. Many standards are in place for transferring data, including more standard ones such as File Transfer Protocol and Server ... following are the procedures for connecting to the NAS administrative web page: 1) Open a web browser and browse to 192.168.40.8:8080. 2) Enter the
2013-03-01
... communication experts and graphic designers, to create 4 pilot web intervention pages. This included 2 pilot versions of the Home Page and a section landing ...
A multilingual assessment of melanoma information quality on the Internet.
Bari, Lilla; Kemeny, Lajos; Bari, Ferenc
2014-06-01
This study aims to assess and compare melanoma information quality in Hungarian, Czech, and German languages on the Internet. We used country-specific Google search engines to retrieve the first 25 uniform resource locators (URLs) by searching the word "melanoma" in the given language. Using the automated toolbar of Health On the Net Foundation (HON), we assessed each Web site for HON certification based on the Health On the Net Foundation Code of Conduct (HONcode). Information quality was determined using a 35-point checklist created by Bichakjian et al. (J Clin Oncol 20:134-141, 2002), with the NCCN melanoma guideline as control. After excluding duplicate and link-only pages, a total of 24 Hungarian, 18 Czech, and 21 German melanoma Web sites were evaluated and rated. The amount of HON certified Web sites was the highest among the German Web pages (19%). One of the retrieved Hungarian and none of the Czech Web sites were HON certified. We found the highest number of Web sites containing comprehensive, correct melanoma information in German language, followed by Czech and Hungarian pages. Although the majority of the Web sites lacked data about incidence, risk factors, prevention, treatment, work-up, and follow-up, at least one comprehensive, high-quality Web site was found in each language. Several Web sites contained incorrect information in each language. While a small amount of comprehensive, quality melanoma-related Web sites was found, most of the retrieved Web content lacked basic disease information, such as risk factors, prevention, and treatment. A significant number of Web sites contained malinformation. In case of melanoma, primary and secondary preventions are of especially high importance; therefore, the improvement of disease information quality available on the Internet is necessary.
Googling suicide: surfing for suicide information on the Internet.
Recupero, Patricia R; Harms, Samara E; Noble, Jeffrey M
2008-06-01
This study examined the types of resources a suicidal person might find through search engines on the Internet. We were especially interested in determining the accessibility of potentially harmful resources, such as pro-suicide forums, as such resources have been implicated in completed suicides and are known to exist on the Web. Using 5 popular search engines (Google, Yahoo!, Ask.com, Lycos, and Dogpile) and 4 suicide-related search terms (suicide, how to commit suicide, suicide methods, and how to kill yourself), we collected quantitative and qualitative data about the search results. The searches were conducted in August and September 2006. Several co-raters assigned codes and characterizations to the first 30 Web sites per search term combination (and "sponsored links" on those pages), which were then confirmed by consensus ratings. Search results were classified as pro-suicide, anti-suicide, suicide-neutral, not a suicide site, or error (i.e., the page would not load). Additional information was collected to further characterize the nature of the information on these Web sites. Suicide-neutral and anti-suicide pages occurred most frequently (of 373 unique Web pages, 115 were coded as suicide-neutral, and 109 were anti-suicide). While pro-suicide resources were less frequent (41 Web pages), they were nonetheless easily accessible. Detailed how-to instructions for unusual and lethal suicide methods were likewise easily located through the searches. Mental health professionals should ask patients about their Internet use. Depressed, suicidal, or potentially suicidal patients who use the Internet may be especially at risk. Clinicians may wish to assist patients in locating helpful, supportive resources online so that patients' Internet use may be more therapeutic than harmful.
Modelling Safe Interface Interactions in Web Applications
NASA Astrophysics Data System (ADS)
Brambilla, Marco; Cabot, Jordi; Grossniklaus, Michael
Current Web applications embed sophisticated user interfaces and business logic. The original interaction paradigm of the Web based on static content pages that are browsed by hyperlinks is, therefore, not valid anymore. In this paper, we advocate a paradigm shift for browsers and Web applications, that improves the management of user interaction and browsing history. Pages are replaced by States as basic navigation nodes, and Back/Forward navigation along the browsing history is replaced by a full-fledged interactive application paradigm, supporting transactions at the interface level and featuring Undo/Redo capabilities. This new paradigm offers a safer and more precise interaction model, protecting the user from unexpected behaviours of the applications and the browser.
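A minimal way to picture the proposed shift from page history to application states with Undo/Redo is a two-stack state model; the sketch below is our own simplification, not the authors' browser or application design.

```python
# Hedged sketch: interaction states with Undo/Redo instead of plain Back/Forward.
# A minimal two-stack model for illustration; not the authors' actual design.

class StatefulSession:
    def __init__(self, initial_state):
        self.current = initial_state
        self._undo, self._redo = [], []

    def apply(self, new_state):
        """Move to a new interaction state (e.g., 'cart with 2 items')."""
        self._undo.append(self.current)
        self._redo.clear()          # a fresh action invalidates the redo branch
        self.current = new_state

    def undo(self):
        if self._undo:
            self._redo.append(self.current)
            self.current = self._undo.pop()
        return self.current

    def redo(self):
        if self._redo:
            self._undo.append(self.current)
            self.current = self._redo.pop()
        return self.current

session = StatefulSession("empty cart")
session.apply("1 item in cart")
session.apply("2 items in cart")
print(session.undo())   # '1 item in cart'
print(session.redo())   # '2 items in cart'
```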
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... Request; NCI Cancer Genetics Services Directory Web-Based Application Form and Update Mailer Summary: In... Cancer Genetics Services Directory Web-based Application Form and Update Mailer.
Computer and Voice Network Management Through Low Earth Orbiting Satellites
2006-03-01
Vona, Pamela; Wilmoth, Pete; Jaycox, Lisa H; McMillen, Janey S; Kataoka, Sheryl H; Wong, Marleen; DeRosier, Melissa E; Langley, Audra K; Kaufman, Joshua; Tang, Lingqi; Stein, Bradley D
2014-11-01
To explore the role of Web-based platforms in behavioral health, the study examined usage of a Web site for supporting training and implementation of an evidence-based intervention. Using data from an online registration survey and Google Analytics, the investigators examined user characteristics and Web site utilization. Site engagement was substantial across user groups. Visit duration differed by registrants' characteristics. Less experienced clinicians spent more time on the Web site. The training section accounted for most page views across user groups. Individuals previously trained in the Cognitive-Behavioral Intervention for Trauma in Schools intervention viewed more implementation assistance and online community pages than did other user groups. Web-based platforms have the potential to support training and implementation of evidence-based interventions for clinicians of varying levels of experience and may facilitate more rapid dissemination. Web-based platforms may be promising for trauma-related interventions, because training and implementation support should be readily available after a traumatic event.
A Multi-User Model for Effectively Communicating Research Through Electronic Media
NASA Astrophysics Data System (ADS)
Hinds, J. J.; Fairley, J. P.
2003-12-01
Electronic media have demonstrated potential for data exchange, dissemination of results to other scientists, communication with community interest groups, and education of the general public regarding scientific advances. Few researchers, however, receive training in the skills required to capture the attention of the broad spectrum of Internet users. Because different people assimilate information in different ways, effective communication is best accomplished using an appropriate mix of photographs, graphics, tables, and text. In addition, effective web page design requires a clear, consistent organizational structure, easily-navigated layout, and attention to details such as page printability, downloading time, and minimal page scrolling. One of the strengths of electronic media is that the user can chose an appropriate level of involvement for his or her interest. In designing a web page for the multidisciplinary NSF/EPSCoR "Biocomplexity in Extreme Environments" project, we divided potential users into three categories based on our perception of the level of detail they required: 1) project participants, 2) non-participants with technical backgrounds, and 3) the general public. By understanding the needs and expectations of potential viewers, it was possible to present each group with an appropriate balance of visual and textural elements. For example, project participants are often most interested in raw data, which can be effectively presented in tabular format. Non-participants with technical backgrounds are more interested in analyzed data, while a project overview, presented through photographs and graphics with minimal text, will be most effective for communicating with the general public. The completed web page illustrates one solution for effectively communicating with a diverse audience, and provides examples for meeting many of the challenges of web page design.
Code of Federal Regulations, 2011 CFR
2011-07-01
... information that were involved in the data breach (e.g., full name, Social Security number, date of birth... conspicuous posting on the home page of VA's Web site and notification in major print and broadcast media, including major media in geographic areas where the affected individuals likely reside. Such a notice in...
Code of Federal Regulations, 2014 CFR
2014-07-01
... information that were involved in the data breach (e.g., full name, Social Security number, date of birth... conspicuous posting on the home page of VA's Web site and notification in major print and broadcast media, including major media in geographic areas where the affected individuals likely reside. Such a notice in...
Code of Federal Regulations, 2012 CFR
2012-07-01
... information that were involved in the data breach (e.g., full name, Social Security number, date of birth... conspicuous posting on the home page of VA's Web site and notification in major print and broadcast media, including major media in geographic areas where the affected individuals likely reside. Such a notice in...
Code of Federal Regulations, 2013 CFR
2013-07-01
... information that were involved in the data breach (e.g., full name, Social Security number, date of birth... conspicuous posting on the home page of VA's Web site and notification in major print and broadcast media, including major media in geographic areas where the affected individuals likely reside. Such a notice in...
ERIC Educational Resources Information Center
Herron, Terri L.
1998-01-01
Discusses ways to use the Internet as a pedagogical tool in higher education, with illustrations from techniques and resources used in a graduate course in accounting information systems. Examples include use of an online textbook, an Internet-based project, electronic mail, a class Web page, and Internet searching to find course-related…
The Traveler's Guide to E-mail Access.
ERIC Educational Resources Information Center
Clyde, Anne
1999-01-01
Presents options that travelers can use to keep in e-mail contact. Discusses equipment/access issues related to traveling with a laptop; Internet cafes; free e-mail services; accessing home mail via a Web page; and new options for e-mail access for travelers. Includes Internet resources on Internet access providers. (AEF)
Code of Federal Regulations, 2010 CFR
2010-07-01
... information that were involved in the data breach (e.g., full name, Social Security number, date of birth... conspicuous posting on the home page of VA's Web site and notification in major print and broadcast media, including major media in geographic areas where the affected individuals likely reside. Such a notice in...
CyberHunt: Head Off to Antarctica.
ERIC Educational Resources Information Center
Kloza, Brad
2001-01-01
Explains how to take an elementary class on a cyber visit to the continent of Antarctica, the highest, driest, and coldest continent on earth. A student reproducible page presents eight web sites to visit in this quest as well as questions to answer about each site. Answers to the questions are included. (SM)
78 FR 63747 - Public Housing Capital Fund Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-24
... comment on the proposed rule and further consideration of issues by HUD. I. Executive Summary A. Purpose..., including development of public housing units, and buildings, facilities, and/or related appurtenances (i.e... Department of Labor's wage rate site, http://www.wdol.gov/ . HUD also has a Web page with Davis-Bacon...
A Review of Research Ethics in Internet-Based Research
ERIC Educational Resources Information Center
Convery, Ian; Cox, Diane
2012-01-01
Internet-based research methods can include: online surveys, web page content analysis, videoconferencing for online focus groups and/or interviews, analysis of "e-conversations" through social networking sites, email, chat rooms, discussion boards and/or blogs. Over the last ten years, an upsurge in internet-based research (IBR) has led…
New Campus Crime Prevention Resources Available
ERIC Educational Resources Information Center
Campus Law Enforcement Journal, 2012
2012-01-01
The Campus Crime Prevention Committee has compiled a list of university and college crime prevention agencies and resources, which includes contact information, links to agency crime prevention web pages, and a list of resources they offer (i.e., brochures, guides, PowerPoint programs, videos, etc.) as well as a spreadsheet showing organizations…
78 FR 42072 - Consumer Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-15
... people with disabilities. You can listen to the audio and use a screen reader to read displayed documents...://accessibleevent.com . The Web page prompts for an Event Code which is 005202376. To learn about the features of... accommodations for people with disabilities are available upon request. The request should include a detailed...
Wyoming: Open Range for Library Technology.
ERIC Educational Resources Information Center
Maul, Helen Meadors
1996-01-01
Describes the development of library technology and the need for telecommunications in a state with a lack of population density. Topics include the state library's role; shared library resources and library networks; government information; the Wyoming State Home Page on the World Wide Web; Ariel software; network coordinating; and central…
Improving Web Accessibility in a University Setting
ERIC Educational Resources Information Center
Olive, Geoffrey C.
2010-01-01
Improving Web accessibility for disabled users visiting a university's Web site is explored following the World Wide Web Consortium (W3C) guidelines and Section 508 of the Rehabilitation Act rules for Web page designers to ensure accessibility. The literature supports the view that accessibility is sorely lacking, not only in the USA, but also…
Data base for early postfire succession in Northern Rocky Mountain forests
Peter F. Stickney; Robert B. Campbell
2000-01-01
Web site and CD-ROM include 21 pages of text plus electronic data for 55 succession sites including color plates, tables, and figures. Provides data on quantitative postfire changes of plant species and forest vegetation components for up to the first 25 years of secondary plant succession for 55 forest sites in northern Idaho and northwestern Montana. Cover (aerial...
Going beyond Google for Faster and Smarter Web Searching
ERIC Educational Resources Information Center
Vine, Rita
2004-01-01
With more than 4 billion web pages in its database, Google is suitable for many different kinds of searches. When you know what you are looking for, Google can be a pretty good first choice, as long as you want to search for a word pattern that can be expected to appear on the resulting pages. The problem starts when you don't know exactly what you're…
NASA Astrophysics Data System (ADS)
Zhang, Xiaowen; Chen, Bingfeng
2017-08-01
Based on a frequent sub-tree mining algorithm, this paper proposes a scheme for constructing a web page comment information extraction system, referred to as the FSM system. The paper briefly introduces the overall system architecture and its modules, describes the core of the system in detail, and finally presents a system prototype.
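The abstract gives no implementation details, so the following is only a rough sketch of the underlying idea: templated comment blocks tend to share the same repeated HTML sub-structure, which a frequent sub-tree (here simplified to a repeated structural signature) can pick out. The sample HTML, tag names, and signature function are hypothetical and are not taken from the FSM system.

```python
# Crude stand-in for frequent sub-tree mining: hash each element's tag
# structure, find the most frequently repeated (and deepest) structure, and
# extract its text as candidate comments. Sample HTML is made up.
from collections import Counter
from html.parser import HTMLParser

class TreeBuilder(HTMLParser):
    """Builds a simple nested (tag, children, text) tree from HTML."""
    def __init__(self):
        super().__init__()
        self.root = ("root", [], [])
        self.stack = [self.root]
    def handle_starttag(self, tag, attrs):
        node = (tag, [], [])
        self.stack[-1][1].append(node)
        self.stack.append(node)
    def handle_endtag(self, tag):
        if len(self.stack) > 1:
            self.stack.pop()
    def handle_data(self, data):
        self.stack[-1][2].append(data.strip())

def signature(node):
    """Structural signature: the tag plus the signatures of its children."""
    tag, children, _ = node
    return tag + "(" + ",".join(signature(c) for c in children) + ")"

def text_of(node):
    _, children, text = node
    parts = [t for t in text if t] + [text_of(c) for c in children]
    return " ".join(p for p in parts if p)

def walk(node):
    yield node
    for child in node[1]:
        yield from walk(child)

html = """
<div class="comment"><span>alice</span><p>Nice article!</p></div>
<div class="comment"><span>bob</span><p>Very helpful, thanks.</p></div>
<div class="sidebar"><ul><li>About this site</li></ul></div>
"""

builder = TreeBuilder()
builder.feed(html)
nodes = [n for n in walk(builder.root) if n[0] != "root"]
counts = Counter(signature(n) for n in nodes)
max_count = max(counts.values())
# Among the most frequent signatures, prefer the deepest (longest) one.
best = max((s for s, c in counts.items() if c == max_count), key=len)
print([text_of(n) for n in nodes if signature(n) == best])
```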
Bates, Benjamin R; Romina, Sharon; Ahmed, Rukhsana; Hopson, Danielle
2006-03-01
Recent use of the Internet as a source of health information has raised concerns about consumers' ability to tell 'good' information from 'bad' information. Although consumers report that they use source credibility to judge information quality, several observational studies suggest that consumers make little use of source credibility. This study examines consumer evaluations of web pages attributed to a credible source as compared to generic web pages on measures of message quality. In spring 2005, a community-wide convenience survey was distributed in a regional hub city in Ohio, USA. 519 participants were randomly assigned one of six messages discussing lung cancer prevention: three messages each attributed to a highly credible national organization and three identical messages each attributed to a generic web page. Independent sample t-tests were conducted to compare each attributed message to its counterpart attributed to a generic web page on measures of trustworthiness, truthfulness, readability, and completeness. The results demonstrated that differences in attribution to a source did not have a significant effect on consumers' evaluations of the quality of the information. The authors offer suggestions for national organizations to promote credibility to consumers as a heuristic for choosing better online health information through the use of media co-channels to emphasize credibility.
The quality of patient-orientated Internet information on oral lichen planus: a pilot study.
López-Jornet, Pía; Camacho-Alonso, Fabio
2010-10-01
This study examines the accessibility and quality of Web pages related to oral lichen planus. Sites were identified using two search engines (Google and Yahoo!) and the search terms 'oral lichen planus' and 'oral lesion lichenoid'. The first 100 sites in each search were visited and classified. The web sites were evaluated for content quality using the validated DISCERN rating instrument, the JAMA benchmarks, and the 'Health on the Net' (HON) seal. A total of 109,000 sites were recorded in Google using the search terms and 520,000 in Yahoo! A total of 19 Web pages considered relevant were examined on Google and 20 on Yahoo! As regards the JAMA benchmarks, only two pages satisfied all four criteria in Google (10%), and only three (15%) in Yahoo! As regards DISCERN, the overall quality of web site information was poor, with no site reaching the maximum score. In Google, 78.94% of sites had important deficiencies, compared with 50% in Yahoo!, the difference between the two search engines being statistically significant (P = 0.031). Only five pages (17.2%) on Google and eight (40%) on Yahoo! showed the HON code. Based on our review, doctors must assume primary responsibility for educating and counselling their patients. © 2010 Blackwell Publishing Ltd.
Self-Presentation on the Web: Agencies Serving Abused and Assaulted Women
Shi, Rui; Zhang, Jingwen; Xue, Jia
2014-01-01
Objectives. We examined the content and usability of the Web sites of agencies serving women victims of violence. Methods. We entered the names of a systematic 10% sample of 3774 agencies listed in 2 national directories into a search engine. In April 2012, we took and analyzed screenshots of the 261 resulting home pages and assessed the readability of 193 home and first-level pages. Results. Victims (94%) and donors (68%) were the primary intended audiences. About one half used social media and one third provided cues to action. Almost all (96.4%) of the Web pages were rated “fairly difficult” to “very confusing” to read, and 81.4% required more than a ninth-grade education to understand. Conclusions. The service and marketing functions were met fairly well by the agency home pages, but usability (particularly readability and offer of a mobile version) and efforts to increase user safety could be improved. Internet technologies are an essential platform for public health. They are particularly useful for reaching people with stigmatized health conditions because of the anonymity allowed. The one third of agencies that lack a Web site will not reach the substantial portion of the population that uses the Internet to find health information and other resources. PMID:24524489
Krogmann, David W
2002-01-01
The history of photosynthesis research can be found in original papers and books. However, a special history is available from the prefatory chapters and the personal perspectives of various researchers who published them in several journals over the last 40 years. We have compiled a list of such perspectives published since 1964. Selection is not easy, especially of authors who were not directly engaged in photosynthesis research; some are included for their special insights related to central issues in the study of photosynthesis. Our journal, Photosynthesis Research, contains other valuable historic data in the occasional tributes, obituaries and historical notes that have been published. Lists of these items are included. This article ends by listing the Nobel prizes related to photosynthesis and the Kettering Awards for Excellence in Photosynthesis Research. Wherever possible, a web page address is provided. The web page addresses have been taken from the article 'Photosynthesis and the Web: 2001' by Larry Orr and Govindjee, available at http://www.life.uiuc.edu/govindjee/photoweb and at http://photoscience.la.asu.edu/photosyn/photoweb/default.html. 'When I find a bit of leisure I trifle with my papers. This is one of the lesser frailties.' - Horace, Satires I, IV.
Non-indexed medical journals in the Web: new perspectives in the medical literature.
Germenis, A E; Kokkinides, P A; Stavropoulos-Giokas, C
1997-11-01
Many medical journals, publishing in national languages, face serious financial problems and difficulties when they attempt to become indexed in the international indices. Obviously, this not only affects the scientific quality of non-indexed medical journals (NIMJs) but also affects the awareness of the scientific community of topics with apparently local but potentially broader scientific significance. This is a reality for over 100 Greek medical journals, none of which has a life longer than 30 years or more than 2000 subscribers. Among them, the 'Archives of Hellenic Medicine' (AHM) is published and sponsored by the Athens Medical Society (the oldest medical society in Greece, founded in 1835). This peer-reviewed journal has been published bimonthly, in Greek, for 13 years. Attempting to overcome the above-mentioned problems and to become involved in the process of discovering the most effective way of scientific 'skywriting', the AHM went full-text on the Web two years ago, and it was decided that up to 50% of its volume should be covered by English-language papers. As a result, the AHM is now included in the main Web lists of medical journals, and its home page, linked from many academic pages, receives approximately 500 hits/month. Furthermore, 45 retrievals of AHM's English-language papers or English abstracts of Greek-language articles were reported by e-mail responses from abroad. Considered apart from paper publishing, the expenses of digital publishing of the AHM are about half those of paper publishing as they stood before the appearance of the Journal on the Web. Up to now, about 40% of the Journal's digital publishing cost has been covered by advertisements included in its pages and by a modification of its paper-publishing policy. It is concluded that the international scientific community is not indifferent to information published in NIMJs. Medical national minorities working abroad express special interest in this type of information. The Web makes NIMJs accessible to these potential readers, who would never have the chance to acquire them in their printed form.
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Quinn, J. D.; Larour, E. Y.; Halkides, D. J.
2017-12-01
The Virtual Earth System Laboratory (VESL) is a Web application, under continued development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. As with any project of its size, we have encountered both successes and challenges during the course of development. Our principal point of success is the fact that VESL users can interact seamlessly with our earth science simulations within their own Web browser. Some of the challenges we have faced include retrofitting the VESL Web application to respond to touch gestures, reducing page load time (especially as the application has grown), and accounting for the differences between the various Web browsers and computing platforms.
World Wide Web Home Page Design: Patterns and Anomalies of Higher Education Library Home Pages.
ERIC Educational Resources Information Center
Stover, Mark; Zink, Steven D.
1996-01-01
A review of college and university library home pages concluded that many higher education home pages are badly designed, difficult to navigate, and a poor reflection on the institution. The most common shortcoming was the tendency to create too many links or overly large graphics. An appendix lists points to consider when constructing a home…
NASA Astrophysics Data System (ADS)
Palla, Gergely; Farkas, Illés J.; Pollner, Péter; Derényi, Imre; Vicsek, Tamás
2007-06-01
A search technique locating network modules, i.e. internally densely connected groups of nodes in directed networks, is introduced by extending the clique percolation method originally proposed for undirected networks. After giving a suitable definition for directed modules we investigate their percolation transition in the Erdős-Rényi graph both analytically and numerically. We also analyse four real-world directed networks, including Google's own web-pages, an email network, a word association graph and the transcriptional regulatory network of the yeast Saccharomyces cerevisiae. The obtained directed modules are validated by additional information available for the nodes. We find that directed modules of real-world graphs inherently overlap and the investigated networks can be classified into two major groups in terms of the overlaps between the modules. Accordingly, in the word-association network and Google's web-pages, overlaps are likely to contain in-hubs, whereas the modules in the email and transcriptional regulatory network tend to overlap via out-hubs.
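The directed variant described above is not available off the shelf, but the undirected clique percolation method it extends is implemented in NetworkX; a minimal sketch on a toy graph (both the package dependency and the example graph are assumptions, not part of the paper) looks like this:

```python
# Undirected clique percolation (k = 3) on a toy graph, as a baseline for the
# directed extension discussed above. Requires the networkx package; the graph
# below is made up for illustration.
import networkx as nx
from networkx.algorithms.community import k_clique_communities

G = nx.Graph()
G.add_edges_from([
    ("a", "b"), ("a", "c"), ("b", "c"),   # triangle 1
    ("c", "d"), ("c", "e"), ("d", "e"),   # triangle 2, sharing node "c"
    ("f", "g"),                           # edge that belongs to no 3-clique
])

# Each community is a union of adjacent k-cliques; node "c" ends up in two
# overlapping modules, illustrating the overlap property analysed in the paper.
modules = [set(c) for c in k_clique_communities(G, 3)]
print(modules)
```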
An Energy Overview of the Kingdom of Thailand
DOE Office of Scientific and Technical Information (OSTI.GOV)
anon.
The DOE Office of Fossil Energy is maintaining a web site that is meant to provide useful business- and energy-related information about countries and regions of the world for exporters, project developers, and researchers. The site consists of more than 130 country pages (organized into seven different world regions), with each country page having its own set of links to information sources about that country. There are also more than 30 Country Energy Overviews at the web site -- each of these is a comprehensive review of a specific country's entire energy situation, including sections on Energy Policy, Oil, Natural Gas, Coal, Hydroelectric/Renewables, Nuclear Power, Energy Transmission Infrastructure, Electricity, Electric Industry Overview, Environmental Activities, Privatization, Trade, and Economic Situation. The specific country highlighted in this Country Energy Overview is Thailand. The site is designed to be dynamic. Updates to the overviews will be made as need and resources permit.
An Energy Overview of the Republic of Egypt
DOE Office of Scientific and Technical Information (OSTI.GOV)
anon.
2003-10-17
The DOE Office of Fossil Energy is maintaining a web site that is meant to provide useful business- and energy-related information about countries and regions of the world for exporters, project developers, and researchers. The site consists of more than 130 country pages (organized into seven different world regions), with each country page having its own set of links to information sources about that country. There are also more than 30 Country Energy Overviews at the web site -- each of these is a comprehensive review of a specific country's entire energy situation, including sections on Energy Policy, Oil, Natural Gas, Coal, Hydroelectric/Renewables, Nuclear Power, Energy Transmission Infrastructure, Electricity, Electric Industry Overview, Environmental Activities, Privatization, Trade, and Economic Situation. The specific country highlighted in this Country Energy Overview is Egypt. The site is designed to be dynamic. Updates to the overviews will be made as need and resources permit.
DOE Office of Scientific and Technical Information (OSTI.GOV)
anon.
2003-10-20
The DOE Office of Fossil Energy is maintaining a web site that is meant to provide useful business- and energy-related information about countries and regions of the world for exporters, project developers, and researchers. The site consists of more than 130 country pages (organized into seven different world regions), with each country page having its own set of links to information sources about that country. There are also more than 30 Country Energy Overviews at the web site -- each of these is a comprehensive review of a specific country's entire energy situation, including sections on Energy Policy, Oil, Natural Gas, Coal, Hydroelectric/Renewables, Nuclear Power, Energy Transmission Infrastructure, Electricity, Electric Industry Overview, Environmental Activities, Privatization, Trade, and Economic Situation. The specific country highlighted in this Country Energy Overview is Romania. The site is designed to be dynamic. Updates to the overviews will be made as need and resources permit.
An Energy Overview of Venezuela
DOE Office of Scientific and Technical Information (OSTI.GOV)
anon.
2003-10-20
The DOE Office of Fossil Energy is maintaining a web site that is meant to provide useful business- and energy-related information about countries and regions of the world for exporters, project developers, and researchers. The site consists of more than 130 country pages (organized into seven different world regions), with each country page having its own set of links to information sources about that country. There are also more than 30 Country Energy Overviews at the web site -- each of these is a comprehensive review of a specific country's entire energy situation, including sections on Energy Policy, Oil, Natural Gas, Coal, Hydroelectric/Renewables, Nuclear Power, Energy Transmission Infrastructure, Electricity, Electric Industry Overview, Environmental Activities, Privatization, Trade, and Economic Situation. The specific country highlighted in this Country Energy Overview is Venezuela. The site is designed to be dynamic. Updates to the overviews will be made as need and resources permit.
An Energy Overview of the Czech Republic
DOE Office of Scientific and Technical Information (OSTI.GOV)
anon.
2003-10-17
The DOE Office of Fossil Energy is maintaining a web site that is meant to provide useful business- and energy-related information about countries and regions of the world for exporters, project developers, and researchers. The site consists of more than 130 country pages (organized into seven different world regions), with each country page having its own set of links to information sources about that country. There are also more than 30 Country Energy Overviews at the web site -- each of these is a comprehensive review of a specific country's entire energy situation, including sections on Energy Policy, Oil, Natural Gas, Coal, Hydroelectric/Renewables, Nuclear Power, Energy Transmission Infrastructure, Electricity, Electric Industry Overview, Environmental Activities, Privatization, Trade, and Economic Situation. The specific country highlighted in this Country Energy Overview is the Czech Republic. The site is designed to be dynamic. Updates to the overviews will be made as need and resources permit.
An Energy Overview of Argentina
DOE Office of Scientific and Technical Information (OSTI.GOV)
anon.
2003-10-20
The DOE Office of Fossil Energy is maintaining a web site that is meant to provide useful business- and energy-related information about countries and regions of the world for exporters, project developers, and researchers. The site consists of more than 130 country pages (organized into seven different world regions), with each country page having its own set of links to information sources about that country. There are also more than 30 Country Energy Overviews at the web site -- each of these is a comprehensive review of a specific country's entire energy situation, including sections on Energy Policy, Oil, Natural Gas, Coal, Hydroelectric/Renewables, Nuclear Power, Energy Transmission Infrastructure, Electricity, Electric Industry Overview, Environmental Activities, Privatization, Trade, and Economic Situation. The specific country highlighted in this Country Energy Overview is Argentina. The site is designed to be dynamic. Updates to the overviews will be made as need and resources permit.
DOE Office of Scientific and Technical Information (OSTI.GOV)
anon.
The DOE Office of Fossil Energy is maintaining a web site that is meant to provide useful business- and energy-related information about countries and regions of the world for exporters, project developers, and researchers. The site consists of more than 130 country pages (organized into seven different world regions), with each country page having its own set of links to information sources about that country. There are also more than 30 Country Energy Overviews at the web site -- each of these is a comprehensive review of a specific country's entire energy situation, including sections on Energy Policy, Oil, Natural Gas, Coal, Hydroelectric/Renewables, Nuclear Power, Energy Transmission Infrastructure, Electricity, Electric Industry Overview, Environmental Activities, Privatization, Trade, and Economic Situation. The specific country highlighted in this Country Energy Overview is Colombia. The site is designed to be dynamic. Updates to the overviews will be made as need and resources permit.
An Energy Overview of the Republic of Poland
DOE Office of Scientific and Technical Information (OSTI.GOV)
anon.
The DOE Office of Fossil Energy is maintaining a web site that is meant to provide useful business- and energy-related information about countries and regions of the world for exporters, project developers, and researchers. The site consists of more than 130 country pages (organized into seven different world regions), with each country page having its own set of links to information sources about that country. There are also more than 30 Country Energy Overviews at the web site -- each of these is a comprehensive review of a specific country's entire energy situation, including sections on Energy Policy, Oil, Natural Gas, Coal, Hydroelectric/Renewables, Nuclear Power, Energy Transmission Infrastructure, Electricity, Electric Industry Overview, Environmental Activities, Privatization, Trade, and Economic Situation. The specific country highlighted in this Country Energy Overview is Poland. The site is designed to be dynamic. Updates to the overviews will be made as need and resources permit.
A novel visualization model for web search results.
Nguyen, Tien N; Zhang, Jin
2006-01-01
This paper presents an interactive visualization system, named WebSearchViz, for visualizing Web search results and facilitating users' navigation and exploration. The metaphor in our model is the solar system with its planets and asteroids revolving around the sun. Location, color, movement, and spatial distance of objects in the visual space are used to represent the semantic relationships between a query and relevant Web pages. Especially, the movement of objects and their speeds add a new dimension to the visual space, illustrating the degree of relevance among a query and Web search results in the context of users' subjects of interest. By interacting with the visual space, users are able to observe the semantic relevance between a query and a resulting Web page with respect to their subjects of interest, context information, or concern. Users' subjects of interest can be dynamically changed, redefined, added, or deleted from the visual space.
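The paper does not publish its mapping functions, but the core idea (relevance drives both orbital distance and speed) can be sketched with a simple, made-up mapping; the relevance scores, radii, and speed scaling below are illustrative only, not values from WebSearchViz.

```python
# Solar-system style mapping: more relevant results orbit closer to the query
# ("sun") and revolve faster. All numbers here are placeholders.
import math

results = {"page_a": 0.92, "page_b": 0.55, "page_c": 0.20}  # relevance in (0, 1]

def orbit(relevance, t, r_min=1.0, r_max=10.0):
    """Return the (x, y) position at time t for a page with the given relevance."""
    radius = r_min + (1.0 - relevance) * (r_max - r_min)  # high relevance -> tight orbit
    angular_speed = relevance * 2.0 * math.pi             # high relevance -> fast revolution
    angle = angular_speed * t
    return radius * math.cos(angle), radius * math.sin(angle)

for name, rel in results.items():
    x, y = orbit(rel, t=0.25)
    print(f"{name}: relevance={rel:.2f}, position=({x:.2f}, {y:.2f})")
```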
Standards opportunities around data-bearing Web pages.
Karger, David
2013-03-28
The evolving Web has seen ever-growing use of structured data, thanks to the way it enhances information authoring, querying, visualization and sharing. To date, however, most structured data authoring and management tools have been oriented towards programmers and Web developers. End users have been left behind, unable to leverage structured data for information management and communication as well as professionals. In this paper, I will argue that many of the benefits of structured data management can be provided to end users as well. I will describe an approach and tools that allow end users to define their own schemas (without knowing what a schema is), manage data and author (not program) interactive Web visualizations of that data using the Web tools with which they are already familiar, such as plain Web pages, blogs, wikis and WYSIWYG document editors. I will describe our experience deploying these tools and some lessons relevant to their future evolution.
Shuter, Jonathan; Morales, Daniela A; Considine-Dunn, Shannon E; An, Lawrence C; Stanton, Cassandra A
2014-09-01
To evaluate the feasibility and preliminary efficacy of a Web-based tobacco treatment for persons living with HIV (PLWH). Prospective, randomized controlled trial. HIV-care center in the Bronx, New York. Eligibility criteria included HIV infection, current tobacco usage, interest in quitting, and access to a computer with internet. One hundred thirty-eight subjects enrolled, and 134 completed the study. Positively Smoke Free on the Web (PSFW), an 8-session, 7-week targeted tobacco treatment program for PLWH, was compared with standard care (brief advice to quit and self-help brochure). All subjects were offered nicotine patches. The main feasibility outcomes were number of sessions logged into, number of Web pages visited, number of interactive clicks, and total time logged in. The main efficacy outcome was biochemically verified, 7-day point prevalence abstinence 3 months after intervention. PSFW subjects logged into a mean of 5.5 of 8 sessions and 26.2 of 41 pages. They executed a mean of 10 interactive clicks during a mean total of 59.8 minutes logged in. Most required reminder phone calls to complete the sessions. Educational level, anxiety score, and home access of the Web site were associated with Web site usage. Ten percent of the PSFW group vs. 4.3% of controls achieved the abstinence end point. Among those who completed all 8 sessions, 17.9% were abstinent, and among women completers, 30.8% were abstinent. Web-based treatment is a feasible strategy for PLWH smokers, and preliminary findings suggest therapeutic efficacy.
2009-06-01
search engines are not up to this task, as they have been optimized to catalog information quickly and efficiently for user ease of access while promoting retail commerce at the same time. This thesis presents a performance analysis of a new search engine algorithm designed to help find IED education networks using the Nutch open-source search engine architecture. It reveals which web pages are more important via references from other web pages regardless of domain. In addition, this thesis discusses potential evaluation and monitoring techniques to be used in conjunction
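The thesis's Nutch-based algorithm is not reproduced in this snippet; as a generic illustration of scoring page importance from cross-domain references, a standard PageRank over a toy link graph can be used (the NetworkX dependency and all pages and links below are assumptions for illustration).

```python
# Generic link-based importance (standard PageRank) over a made-up cross-domain
# link graph. This is not the thesis's algorithm, only an illustration of
# ranking pages by references from other pages regardless of domain.
import networkx as nx

links = [
    ("siteA.org/page1", "siteB.net/guide"),
    ("siteA.org/page2", "siteB.net/guide"),
    ("siteC.com/post",  "siteB.net/guide"),
    ("siteB.net/guide", "siteC.com/post"),
]
G = nx.DiGraph(links)

scores = nx.pagerank(G, alpha=0.85)
for page, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{score:.3f}  {page}")
```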
Application of Project Portfolio Management
NASA Astrophysics Data System (ADS)
Pankowska, Malgorzata
The main goal of the chapter is the presentation of the application of the project portfolio management approach to support the development of e-Municipality and public administration information systems. The models of how people publish and utilize information on the web have been transformed continually. Instead of simply viewing static web pages, users publish their own content through blogs and photo- and video-sharing sites. The ICT (Information and Communication Technology) projects for municipalities analysed in this chapter cover a mixture of static web pages, e-Government information systems, and wikis, so the project portfolio management approach is proposed for managing this mixture of projects.
ERIC Educational Resources Information Center
Snider, Jean; Martin, Florence
2012-01-01
Web usability focuses on design elements and processes that make web pages easy to use. A website for college students was evaluated for underutilization. One-on-one testing, focus groups, web analytics, peer university review and marketing focus group and demographic data were utilized to conduct usability evaluation. The results indicated that…
A Structural and Content-Based Analysis for Web Filtering.
ERIC Educational Resources Information Center
Lee, P. Y.; Hui, S. C.; Fong, A. C. M.
2003-01-01
Presents an analysis of the distinguishing features of pornographic Web pages so that effective filtering techniques can be developed. Surveys the existing techniques for Web content filtering and describes the implementation of a Web content filtering system that uses an artificial neural network. (Author/LRW)
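The abstract does not detail the features or network architecture used; the following is only a minimal sketch, assuming scikit-learn, TF-IDF text features, and invented training snippets, of how a small neural-network classifier for content filtering can be wired up (the published system also used structural page features not shown here).

```python
# Minimal content-filtering sketch: TF-IDF text features feeding a small
# multi-layer perceptron. Training snippets and labels are invented; the
# published system's actual features and network are not reproduced here.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

pages = [
    "family recipes and cooking tips for weeknight dinners",
    "university library opening hours and study rooms",
    "explicit adult content warning members only gallery",
    "adult xxx explicit photos join now free preview",
]
labels = [0, 0, 1, 1]  # 1 = block, 0 = allow

clf = make_pipeline(
    TfidfVectorizer(),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
clf.fit(pages, labels)
print(clf.predict(["explicit gallery join now", "campus library study guide"]))
```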
Tweeting links to Cochrane Schizophrenia Group reviews: a randomised controlled trial.
Adams, C E; Jayaram, M; Bodart, A Y M; Sampson, S; Zhao, S; Montgomery, A A
2016-03-08
To assess the effects of using health social media on web activity. Individually randomised controlled parallel group superiority trial. Twitter and Weibo. 170 Cochrane Schizophrenia Group full reviews with an abstract and plain language summary web page. Three randomly ordered, slightly different messages of 140 characters or less, each containing a short URL to the freely accessible summary page, were sent at specific times on a single day. This was compared with no messaging. The primary outcome was web page visits at 1 week. Secondary outcomes were other metrics of web activity at 1 week. 85 reviews were randomised to each of the intervention and control arms. Google Analytics allowed 100% follow-up within 1 week of completion. Intervention and control reviews received a total of 1162 and 449 visits, respectively (IRR 2.7, 95% CI 2.2 to 3.3). Fewer intervention reviews had single page only visits (16% vs 31%, OR 0.41, 0.19 to 0.88) and users spent more time viewing intervention reviews (geometric mean 76 vs 31 s, ratio 2.5, 1.3 to 4.6). Other secondary metrics of web activity all showed strong evidence in favour of the intervention. Tweeting in this limited area of healthcare increases 'product placement' of evidence with the potential for that to influence care. ISRCTN84658943. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Index to Print Version of EPA Stylebook
This EPA Communication Product Standards index provides page numbers for topics such as Ampersands, Bitmapped Graphics, Exhibits and Displays, Podium Signage, Proofing, Sentence Length, Title Page, and Web Forms.
Semantic similarity measure in biomedical domain leverage web search engine.
Chen, Chi-Huang; Hsieh, Sheau-Ling; Weng, Yung-Ching; Chang, Wen-Yung; Lai, Feipei
2010-01-01
Semantic similarity measures play an essential role in Information Retrieval and Natural Language Processing. In this paper we propose a page-count-based semantic similarity measure and apply it in biomedical domains. Previous research in semantic web related applications has deployed various semantic similarity measures. Despite the usefulness of the measurements in those applications, measuring semantic similarity between two terms remains a challenging task. The proposed method exploits page counts returned by a Web search engine. We define various similarity scores for two given terms P and Q, using the page counts for querying P, Q and P AND Q. Moreover, we propose a novel approach to compute semantic similarity using lexico-syntactic patterns with page counts. These different similarity scores are integrated by adapting support vector machines, to leverage the robustness of semantic similarity measures. Experimental results on two datasets achieve correlation coefficients of 0.798 on the dataset provided by A. Hliaoutakis, 0.705 on the dataset provided by T. Pedersen with physician scores and 0.496 on the dataset provided by T. Pedersen et al. with expert scores.
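The exact score definitions and the SVM integration are not reproduced in the abstract; the sketch below shows page-count-based measures commonly used in this line of work (WebJaccard, WebDice, WebPMI), computed from hypothetical hit counts for P, Q and P AND Q, and should be read as an assumption rather than the paper's precise formulas.

```python
# Page-count-based similarity scores from hypothetical hit counts for two
# terms and their conjunction. The cutoff, the total-page constant N, and the
# example counts are placeholders; the paper's SVM fusion with lexico-syntactic
# patterns is not shown.
import math

def web_jaccard(p, q, pq, cutoff=5):
    return 0.0 if pq <= cutoff else pq / (p + q - pq)

def web_dice(p, q, pq, cutoff=5):
    return 0.0 if pq <= cutoff else 2 * pq / (p + q)

def web_pmi(p, q, pq, n=1e10, cutoff=5):
    return 0.0 if pq <= cutoff else math.log2((pq / n) / ((p / n) * (q / n)))

# Hypothetical page counts for "aspirin", "ibuprofen", and "aspirin AND ibuprofen".
p, q, pq = 42_000_000, 18_000_000, 1_200_000
print(round(web_jaccard(p, q, pq), 4),
      round(web_dice(p, q, pq), 4),
      round(web_pmi(p, q, pq), 4))
```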
The Library as Information Provider: The Home Page.
ERIC Educational Resources Information Center
Clyde, Laurel A.
1996-01-01
Discusses ways in which libraries are using the World Wide Web to provide information via a home page, based on information from a survey in Iceland as well as a larger study that conducted content analyses of home pages of public and school libraries in 13 countries. (Author/LRW)
Web usage mining at an academic health sciences library: an exploratory study.
Bracke, Paul J
2004-10-01
This paper explores the potential of multinomial logistic regression analysis to perform Web usage mining for an academic health sciences library Website. Usage of database-driven resource gateway pages was logged for a six-month period, including information about users' network addresses, referring uniform resource locators (URLs), and types of resource accessed. It was found that referring URL did vary significantly by two factors: whether a user was on-campus and what type of resource was accessed. Although the data available for analysis are limited by the nature of the Web and concerns for privacy, this method demonstrates the potential for gaining insight into Web usage that supplements Web log analysis. It can be used to improve the design of static and dynamic Websites today and could be used in the design of more advanced Web systems in the future.
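The paper's actual variables come from six months of gateway logs; as a minimal sketch (fabricated log records, simplified coding, scikit-learn and pandas assumed), a multinomial logistic regression relating referring-URL category to on-campus status and resource type might look like this:

```python
# Multinomial logistic regression for web-log mining, in the spirit of the
# study above. All log records and category codings are fabricated.
import pandas as pd
from sklearn.linear_model import LogisticRegression

log = pd.DataFrame({
    "on_campus":     [1, 1, 0, 0, 1, 0, 1, 0, 1, 0],
    "resource_type": ["database", "ejournal", "database", "catalog", "catalog",
                      "ejournal", "ejournal", "database", "database", "catalog"],
    "referrer":      ["gateway", "search", "direct", "gateway", "search",
                      "direct", "gateway", "search", "gateway", "direct"],
})

X = pd.get_dummies(log[["on_campus", "resource_type"]], columns=["resource_type"])
y = log["referrer"]  # multinomial outcome: category of referring URL

model = LogisticRegression(max_iter=1000).fit(X, y)  # lbfgs fits a multinomial model
probs = model.predict_proba(X.iloc[[0]])[0]
print(dict(zip(model.classes_, probs.round(2))))
```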
Automatic page layout using genetic algorithms for electronic albuming
NASA Astrophysics Data System (ADS)
Geigel, Joe; Loui, Alexander C. P.
2000-12-01
In this paper, we describe a flexible system for automatic page layout that makes use of genetic algorithms for albuming applications. The system is divided into two modules, a page creator module which is responsible for distributing images amongst various album pages, and an image placement module which positions images on individual pages. Final page layouts are specified in a textual form using XML for printing or viewing over the Internet. The system makes use of genetic algorithms, a class of search and optimization algorithms that are based on the concepts of biological evolution, for generating solutions with fitness based on graphic design preferences supplied by the user. The genetic page layout algorithm has been incorporated into a web-based prototype system for interactive page layout over the Internet. The prototype system is built using client-server architecture and is implemented in java. The system described in this paper has demonstrated the feasibility of using genetic algorithms for automated page layout in albuming and web-based imaging applications. We believe that the system adequately proves the validity of the concept, providing creative layouts in a reasonable number of iterations. By optimizing the layout parameters of the fitness function, we hope to further improve the quality of the final layout in terms of user preference and computation speed.
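The genetic operators and the design-preference fitness are only described at a high level above, so the following is a bare-bones sketch of the page-creator idea under assumed, simplified terms: chromosomes assign images to pages, and a stand-in fitness rewards evenly filled pages.

```python
# Skeleton genetic algorithm for distributing images across album pages.
# The fitness below (prefer evenly filled pages) is a placeholder for the
# graphic-design preferences used in the published system.
import random

N_IMAGES, N_PAGES, POP, GENS = 12, 4, 30, 60
random.seed(0)

def fitness(assign):
    counts = [assign.count(p) for p in range(N_PAGES)]
    return 1.0 / (1.0 + max(counts) - min(counts))  # 1.0 when perfectly balanced

def crossover(a, b):
    cut = random.randrange(1, N_IMAGES)              # single-point crossover
    return a[:cut] + b[cut:]

def mutate(assign, rate=0.1):
    return [random.randrange(N_PAGES) if random.random() < rate else p for p in assign]

population = [[random.randrange(N_PAGES) for _ in range(N_IMAGES)] for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]                 # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print("fitness:", round(fitness(best), 3), "pages:", best)
```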
Web page quality: can we measure it and what do we find? A report of exploratory findings.
Abbott, V P
2000-06-01
The aim of this study was to report exploratory findings from an attempt to quantify the quality of a sample of World Wide Web (WWW) pages relating to MMR vaccine that a typical user might locate. Forty pages obtained from a search of the WWW using two search engines and the search expression 'mmr vaccine' were analysed using a standard proforma. The proforma looked at the information the pages contained in terms of three categories: content, authorship and aesthetics. The information from each category was then quantified into a summary statistic, and receiver operating characteristic (ROC) curves were generated using a 'gold standard' of quality derived from the published literature. Optimal cut-off points for each of the three sections were calculated that best discriminated 'good' from 'bad' pages. Pages were also assessed as to whether they were pro- or anti-vaccination. For this sample, the combined contents and authorship score, with a cut-off of five, was a good discriminator, having 88 per cent sensitivity and 92 per cent specificity. Aesthetics was not a good discriminator. In the sample, 32.5 per cent of pages were pro-vaccination; 42.5 per cent were anti-vaccination and 25 per cent were neutral. The relative risk of being of poor quality if anti-vaccination was 3.3 (95 per cent confidence interval 1.8, 6.1). The sample of Web pages did contain some quality information on MMR vaccine. It also contained a great deal of misleading, inaccurate data. The proforma, combined with a knowledge of the literature, may help to distinguish between the two.
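The proforma items and the gold standard used above are not reproduced here, but the ROC step generalizes; a minimal sketch with fabricated per-page scores and labels (scikit-learn and NumPy assumed) shows how a cut-off can be chosen, for example by Youden's index.

```python
# Choosing a quality-score cut-off from an ROC curve. The per-page summary
# scores and 'gold standard' labels below are fabricated for illustration.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

scores  = np.array([2, 3, 4, 4, 5, 5, 6, 6, 7, 8, 8, 9])   # content + authorship score
is_good = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1])   # 1 = 'good' page

fpr, tpr, thresholds = roc_curve(is_good, scores)
best = np.argmax(tpr - fpr)                                 # Youden's J statistic
print("AUC:", round(roc_auc_score(is_good, scores), 2),
      "cut-off:", thresholds[best],
      "sensitivity:", round(tpr[best], 2),
      "specificity:", round(1 - fpr[best], 2))
```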
Content and Workflow Management for Library Websites: Case Studies
ERIC Educational Resources Information Center
Yu, Holly, Ed.
2005-01-01
Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…
Ultrabroadband photonic internet: safety aspects
NASA Astrophysics Data System (ADS)
Kalicki, Arkadiusz; Romaniuk, Ryszard
2008-11-01
Web applications have become the most popular medium on the Internet. Their popularity and the ease of use of web application frameworks, combined with careless development, result in large numbers of vulnerabilities and attacks. Several types of attack are possible because of improper input validation. SQL injection is the ability to execute arbitrary SQL queries in a database through an existing application. Cross-site scripting is a vulnerability that allows malicious web users to inject code into the web pages viewed by other users. Cross-Site Request Forgery (CSRF) is an attack that tricks the victim into loading a page that contains a malicious request. Web spam in blogs is a further problem. There are several techniques to mitigate attacks. The most important are strong web application design, correct input validation, defined data types for each field, and parameterized statements in SQL queries. Server hardening with a firewall, modern security policy systems, and safe web framework interpreter configuration is essential. It is advised to maintain a proper security level on the client side, keep software updated, and install personal web firewalls or IDS/IPS systems. Good habits include logging out from services just after finishing work and using a separate web browser for the most important sites, such as e-banking.
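As a concrete illustration of the parameterized-statement mitigation mentioned above, the sketch below uses Python's built-in sqlite3 module with an invented table and a classic injection payload; it is an assumption for illustration, not code from the paper.

```python
# Parameterized statements versus naive string concatenation, using the
# standard-library sqlite3 module. Table, rows, and input are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.org')")

user_input = "alice' OR '1'='1"  # classic injection payload

# Unsafe: attacker-controlled text becomes part of the SQL statement.
unsafe = f"SELECT email FROM users WHERE name = '{user_input}'"
print("string-built query:", conn.execute(unsafe).fetchall())      # returns every row

# Safe: the value is passed separately from the SQL text and never parsed as SQL.
safe = "SELECT email FROM users WHERE name = ?"
print("parameterized query:", conn.execute(safe, (user_input,)).fetchall())
```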
EnviroAtlas One Meter Resolution Urban Land Cover Data (2008-2012) Web Service
This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas ). The EnviroAtlas One Meter-scale Urban Land Cover (MULC) Data were generated individually for each EnviroAtlas community. Source imagery varies by community. Land cover classes mapped also vary by community and include the following: water, impervious surfaces, soil and barren land, trees, shrub, grass and herbaceous, agriculture, orchards, woody wetlands, and emergent wetlands. Accuracy assessments were completed for each community's classification. For specific information about methods and accuracy of each community's land cover classification, consult their individual metadata records: Austin, TX (https://edg.epa.gov/metadata/catalog/search/resource/details.page?uuid=%7B91A32A9D-96F5-4FA0-BC97-73BAD5D1F158%7D); Cleveland, OH (https://edg.epa.gov/metadata/catalog/search/resource/details.page?uuid=%7B82ab1edf-8fc8-4667-9c52-5a5acffffa34%7D); Des Moines, IA (https://edg.epa.gov/metadata/catalog/search/resource/details.page?uuid=%7BA4152198-978D-4C0B-959F-42EABA9C4E1B%7D); Durham, NC (https://edg.epa.gov/metadata/catalog/search/resource/details.page?uuid=%7B2FF66877-A037-4693-9718-D1870AA3F084%7D); Fresno, CA (https://edg.epa.gov/metadata/catalog/search/resource/details.page?uuid=%7B87041CF3-05BC-43C3-82DA-F066267C9871%7D); Green Bay, WI (https://edg.epa.gov/metadata/catalog/search/resource/details.page?uuid=%7BD602E7C9-7F53-4C24
Lim, Cherry; Wannapinij, Prapass; White, Lisa; Day, Nicholas P J; Cooper, Ben S; Peacock, Sharon J; Limmathurotsakul, Direk
2013-01-01
Estimates of the sensitivity and specificity for new diagnostic tests based on evaluation against a known gold standard are imprecise when the accuracy of the gold standard is imperfect. Bayesian latent class models (LCMs) can be helpful under these circumstances, but the necessary analysis requires expertise in computational programming. Here, we describe open-access web-based applications that allow non-experts to apply Bayesian LCMs to their own data sets via a user-friendly interface. Applications for Bayesian LCMs were constructed on a web server using R and WinBUGS programs. The models provided (http://mice.tropmedres.ac) include two Bayesian LCMs: the two-tests in two-population model (Hui and Walter model) and the three-tests in one-population model (Walter and Irwig model). Both models are available with simplified and advanced interfaces. In the former, all settings for Bayesian statistics are fixed as defaults. Users input their data set into a table provided on the webpage. Disease prevalence and accuracy of diagnostic tests are then estimated using the Bayesian LCM, and provided on the web page within a few minutes. With the advanced interfaces, experienced researchers can modify all settings in the models as needed. These settings include correlation among diagnostic test results and prior distributions for all unknown parameters. The web pages provide worked examples with both models using the original data sets presented by Hui and Walter in 1980, and by Walter and Irwig in 1988. We also illustrate the utility of the advanced interface using the Walter and Irwig model on a data set from a recent melioidosis study. The results obtained from the web-based applications were comparable to those published previously. The newly developed web-based applications are open-access and provide an important new resource for researchers worldwide to evaluate new diagnostic tests.
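The web applications above run the full Bayesian analysis in WinBUGS, which is not reproduced here; the heart of the Hui and Walter two-test, two-population model, however, is the set of cross-classified cell probabilities sketched below (parameter values are placeholders), which any MCMC or maximum-likelihood engine can fit to observed 2x2 counts.

```python
# Cell probabilities of the Hui-Walter two-test latent class model for one
# population, assuming the two tests are conditionally independent given true
# disease status. Parameter values below are placeholders for illustration.
def hui_walter_cells(prev, se1, sp1, se2, sp2):
    """Return P(T1+,T2+), P(T1+,T2-), P(T1-,T2+), P(T1-,T2-)."""
    pp = prev * se1 * se2             + (1 - prev) * (1 - sp1) * (1 - sp2)
    pn = prev * se1 * (1 - se2)       + (1 - prev) * (1 - sp1) * sp2
    np = prev * (1 - se1) * se2       + (1 - prev) * sp1 * (1 - sp2)
    nn = prev * (1 - se1) * (1 - se2) + (1 - prev) * sp1 * sp2
    return pp, pn, np, nn

# Two populations share the test accuracies but differ in prevalence.
for prev in (0.10, 0.40):
    cells = hui_walter_cells(prev, se1=0.85, sp1=0.95, se2=0.90, sp2=0.80)
    print(prev, [round(c, 3) for c in cells], "sum =", round(sum(cells), 3))
```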
16 CFR 1130.8 - Requirements for Web site registration or alternative e-mail registration.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Requirements for Web site registration or... PRODUCTS (Eff. June 28, 2010) § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the...
Online Literacy Is a Lesser Kind
ERIC Educational Resources Information Center
Bauerlein, Mark
2008-01-01
Web skimming may be a kind of literacy but it's not the kind that matters most. In this article, the author contends that web skimming indicates a decline of literacy. The author discusses research conducted by Jakob Nielsen, a Web researcher, on how users skim web pages. He shows how the web is damaging the right way to read.
EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's Web-based information assets. EPA's Web Taxonomy is being provided in Simple Knowledge Organization System (SKOS) format. SKOS is a standard for sharing and linking knowledge organization systems that promises to make Federal terminology resources more interoperable.
16 CFR 1130.8 - Requirements for Web site registration or alternative e-mail registration.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Requirements for Web site registration or... PRODUCTS § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the purpose of registration...
16 CFR 1130.8 - Requirements for Web site registration or alternative e-mail registration.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Requirements for Web site registration or... PRODUCTS § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the purpose of registration...
16 CFR 1130.7 - Requirements for Web site registration or alternative e-mail registration.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Requirements for Web site registration or... PRODUCTS § 1130.7 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the purpose of registration...
16 CFR § 1130.8 - Requirements for Web site registration or alternative e-mail registration.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Requirements for Web site registration or... OR TODDLER PRODUCTS § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the...
A Web Server for MACCS Magnetometer Data
NASA Technical Reports Server (NTRS)
Engebretson, Mark J.
1998-01-01
NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.
Multimedia Data Capture with Multicast Dissemination for Online Distance Learning
2001-12-01
Juan Gril and Dr. Don Brutzman to wrap the multiple videos in a user-friendly environment. The web pages also contain the original PowerPoint...this CD, Juan Gril, a volunteer for the Siggraph 2001 Online Committee, created web pages that match the style and functionality desired by the...leader. The Committee for 2001 consisted of Don Brutzman, Stephen Matsuba, Mike Collins, Allen Dutton, Juan Gril, Mike Hunsberger, Jerry Isdale
Effects of picture amount on preference, balance, and dynamic feel of Web pages.
Chiang, Shu-Ying; Chen, Chien-Hsiung
2012-04-01
This study investigates the effects of picture amount on subjective evaluation. The experiment herein adopted two variables to define picture amount: column ratio and picture size. Six column ratios were employed: 7:93, 15:85, 24:76, 33:67, 41:59, and 50:50. Five picture sizes were examined: 140 x 81, 220 x 127, 300 x 173, 380 x 219, and 460 x 266 pixels. The experiment implemented a within-subject design; 104 participants were asked to evaluate 30 web page layouts. Repeated measurements revealed that the column ratio and picture size have significant effects on preference, balance, and dynamic feel. The results indicated the most appropriate picture amount for display: column ratios of 15:85 and 24:76, and picture sizes of 220 x 127, 300 x 173, and 380 x 219. The research findings can serve as the basis for the application of design guidelines for future web page interface design.
Adherence to a web-based intervention program for traumatized persons in mainland China.
Wang, Zhiyun
2014-01-01
This paper investigated adherence to a self-help web-based intervention for PTSD (Chinese My Trauma Recovery, CMTR) in mainland China and evaluated the association between adherence measures and potential predictors, for example, traumatic symptoms and self-efficacy. Data were reported from 56 urban and 90 rural trauma survivors who used at least one of the seven recovery modules of CMTR. The results showed that 80% of urban users visited CMTR on four or fewer days, and 87% of rural users visited CMTR for 5 or 6 days. On average, urban users visited 2.54 (SD=1.99) modules on the first visiting day and fewer from the second day onward; rural users visited 1.10 (SD=0.54) modules on the first visiting day, and this remained stable over the following days. In both samples, depression scores at pre-test were significantly or marginally significantly associated with the number of visited web pages in the relaxation and professional help modules (r=0.20-0.26, all p<0.14); traumatic symptom scores at pre-test significantly or marginally significantly correlated with the number of visited web pages in the relaxation, professional help, and mastery tools modules (r=0.20-0.26, all p<0.10). Moreover, urban users' coping self-efficacy scores at pre-test were significantly or marginally significantly related to the number of visited web pages in the relaxation, professional help, social support, and mastery tool modules (r=0.20-0.33, all p<0.16). These findings suggest that individuals tend to focus on one or two recovery modules when they visit CMTR, and that the number of web pages visited during the intervention period relates to users' traumatic and depressive symptoms and self-efficacy before intervention.
Who Do You Think You Are? Personal Home Pages and Self-Presentation on the World Wide Web.
ERIC Educational Resources Information Center
Dominick, Joseph R.
1999-01-01
Analyzes 319 personal home pages. Finds the typical page had a brief biography, a counter or guest book, and links to other pages but did not contain much personal information. Finds that strategies of self-presentation were employed with the same frequency as they were in interpersonal settings, and gender differences in self-presentation were…
Plant Genome Resources at the National Center for Biotechnology Information
Wheeler, David L.; Smith-White, Brian; Chetvernin, Vyacheslav; Resenchuk, Sergei; Dombrowski, Susan M.; Pechous, Steven W.; Tatusova, Tatiana; Ostell, James
2005-01-01
The National Center for Biotechnology Information (NCBI) integrates data from more than 20 biological databases through a flexible search and retrieval system called Entrez. A core Entrez database, Entrez Nucleotide, includes GenBank and is tightly linked to the NCBI Taxonomy database, the Entrez Protein database, and the scientific literature in PubMed. A suite of more specialized databases for genomes, genes, gene families, gene expression, gene variation, and protein domains dovetails with the core databases to make Entrez a powerful system for genomic research. Linked to the full range of Entrez databases is the NCBI Map Viewer, which displays aligned genetic, physical, and sequence maps for eukaryotic genomes including those of many plants. A specialized plant query page allows maps from all plant genomes covered by the Map Viewer to be searched in tandem to produce a display of aligned maps from several species. PlantBLAST searches against the sequences shown in the Map Viewer allow BLAST alignments to be viewed within a genomic context. In addition, precomputed sequence similarities, such as those for proteins offered by BLAST Link, enable fluid navigation from unannotated to annotated sequences, quickening the pace of discovery. NCBI Web pages for plants, such as Plant Genome Central, complete the system by providing centralized access to NCBI's genomic resources as well as links to organism-specific Web pages beyond NCBI. PMID:16010002
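Entrez can also be queried programmatically through NCBI's E-utilities; the sketch below uses Biopython's Entrez wrapper, which is an assumption (the article does not mention Biopython), and requires network access plus a contact e-mail address.

```python
# Programmatic Entrez query via Biopython's E-utility wrapper. Biopython,
# network access, and the example search term are assumptions for illustration.
from Bio import Entrez

Entrez.email = "you@example.org"  # NCBI asks callers to identify themselves

handle = Entrez.esearch(db="nucleotide",
                        term="Arabidopsis thaliana[Organism] AND chloroplast[Title]",
                        retmax=5)
record = Entrez.read(handle)
handle.close()
print(record["Count"], record["IdList"])
```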
ERIC Educational Resources Information Center
Larson, Ray R.
1996-01-01
Examines the bibliometrics of the World Wide Web based on analysis of Web pages collected by the Inktomi "Web Crawler" and on the use of the DEC AltaVista search engine for cocitation analysis of a set of Earth Science related Web sites. Looks at the statistical characteristics of Web documents and their hypertext links, and the…
A Practical Ontology Query Expansion Algorithm for Semantic-Aware Learning Objects Retrieval
ERIC Educational Resources Information Center
Lee, Ming-Che; Tsai, Kun Hua; Wang, Tzone I.
2008-01-01
Following the rapid development of the Internet, particularly web page interaction technology, distance e-learning has become increasingly realistic and popular. To solve the problems associated with sharing and reusing teaching materials in different e-learning systems, several standard formats, including SCORM, IMS, LOM, and AICC, etc., recently have…
Picturing Service-Learning: Defining the Field, Setting Expectations, Shaping Learning
ERIC Educational Resources Information Center
Donahue, David M.; Fenner, Derek; Mitchell, Tania D.
2015-01-01
This study used content analysis and audiencing to understand how service-learning is presented visually by institutions of higher education and interpreted by college students. Data included 834 photographs from the service-learning web pages of 63 four-year institutions in California. The majority showed a narrow range of direct service…
EPA Pesticide Chemical Search allows a user to easily find the pesticide chemical or active ingredient that they are interested in by using an array of simple to advanced search options. Chemical Search provides a single point of reference for easy access to information previously published in a variety of locations, including various EPA web pages and Regulations.gov.
Multi-Filter String Matching and Human-Centric Entity Matching for Information Extraction
ERIC Educational Resources Information Center
Sun, Chong
2012-01-01
More and more information is being generated in text documents, such as Web pages, emails and blogs. To effectively manage this unstructured information, one broadly used approach includes locating relevant content in documents, extracting structured information and integrating the extracted information for querying, mining or further analysis. In…
Determine Baseline Energy Consumption | Climate Neutral Research Campuses
... the campus boundary and any off-site energy impacts you will be calculating. For example, the fuel ... usually included in the baseline. However, the impacts of joint ventures that take place off-site are ... Web page. Scope 3: Transportation impacts from commuters and business travel, which can be derived ...