Educational use of World Wide Web pages on CD-ROM.
Engel, Thomas P; Smith, Michael
2002-01-01
The World Wide Web is increasingly important for medical education. Internet-served pages may also be used from a local hard disk or CD-ROM without a network or server. This allows authors to reuse existing content and reach users who lack a network connection. For several applications, CD-ROM offers advantages over network delivery of Web pages. However, creating Web pages for CD-ROM requires careful planning. Issues include file names, relative links, directory names, default pages, server-created content, image maps, other file types, and embedded programming. With care, it is possible to create server-based pages that can be copied directly to CD-ROM. In addition, Web pages on CD-ROM may reference Internet-served pages, combining the best features of both methods.
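The portability issues the abstract lists (file names, relative links, server-created content) can be checked mechanically before burning a disc. A minimal sketch of such a checker, where the file-name rule and link pattern are my own illustrative assumptions, not the authors' rules:

```python
import re

# Conservative file-name rule for CD-ROM file systems (assumption: ASCII,
# no spaces, modest length). Real constraints depend on the disc format.
SAFE_NAME = re.compile(r"^[A-Za-z0-9_][A-Za-z0-9_.-]{0,30}$")
# Absolute or protocol-relative href/src attributes break offline browsing.
ABSOLUTE_LINK = re.compile(r"""(?:href|src)\s*=\s*["'](?:https?:)?//""", re.I)

def cdrom_issues(filename: str, html: str) -> list[str]:
    """Return portability problems that would break a page copied to CD-ROM."""
    issues = []
    if not SAFE_NAME.match(filename):
        issues.append(f"unsafe file name: {filename}")
    if ABSOLUTE_LINK.search(html):
        issues.append("absolute link found; use relative links so pages work offline")
    return issues
```

A page with only relative links and a plain file name passes cleanly; one with spaces in its name or hard-coded `http://` URLs is flagged.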
Authoring Educational Courseware Using OXYGEN.
ERIC Educational Resources Information Center
Ip, Albert
Engaging learners on the World Wide Web involves more than sending Web pages to the user. However, for many course delivery software programs, the smallest unit of delivery is a Web page. How content experts can create engaging Web pages has largely been ignored or taken for granted. This paper reports on an authoring model for creating pedagogically…
Home Page, Sweet Home Page: Creating a Web Presence.
ERIC Educational Resources Information Center
Falcigno, Kathleen; Green, Tim
1995-01-01
Focuses primarily on design issues and practical concerns involved in creating World Wide Web documents for use within an organization. Concerns for those developing Web home pages are: learning HyperText Markup Language (HTML); defining customer group; allocating staff resources for maintenance of documents; providing feedback mechanism for…
FOCIH: Form-Based Ontology Creation and Information Harvesting
NASA Astrophysics Data System (ADS)
Tao, Cui; Embley, David W.; Liddle, Stephen W.
Creating an ontology and populating it with data are both labor-intensive tasks requiring a high degree of expertise. Thus, scaling ontology creation and population to the size of the web in an effort to create a web of data—which some see as Web 3.0—is prohibitive. Can we find ways to streamline these tasks and lower the barrier enough to enable Web 3.0? Toward this end we offer a form-based approach to ontology creation that provides a way to create Web 3.0 ontologies without the need for specialized training. And we offer a way to semi-automatically harvest data from the current web of pages for a Web 3.0 ontology. In addition to harvesting information with respect to an ontology, the approach also annotates web pages and links facts in web pages to ontological concepts, resulting in a web of data superimposed over the web of pages. Experience with our prototype system shows that mappings between conceptual-model-based ontologies and forms are sufficient for creating the kind of ontologies needed for Web 3.0, and experiments with our prototype system show that automatic harvesting, automatic annotation, and automatic superimposition of a web of data over a web of pages work well.
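The core idea above, that a filled-in form can both define concepts and harvest facts, can be sketched in a few lines. Field and concept names here are invented for illustration and are not FOCIH's actual schema:

```python
# Sketch: each filled form instance becomes a typed entity plus one
# (subject, predicate, object) fact per form field, i.e. a tiny web of data
# superimposed over a web page.
def harvest(form_name: str, instance: str, fields: dict[str, str]) -> list[tuple]:
    """Turn one filled form into (subject, predicate, object) triples."""
    triples = [(instance, "rdf:type", form_name)]
    for concept, value in fields.items():
        triples.append((instance, concept, value))
    return triples

# Hypothetical example: annotating a car ad found on a web page.
facts = harvest("Car", "page17#ad3", {"hasPrice": "11995", "hasYear": "2003"})
```

Linking each triple back to the page region it came from is what produces the annotation layer the abstract describes.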
World Wide Web Pages--Tools for Teaching and Learning.
ERIC Educational Resources Information Center
Beasley, Sarah; Kent, Jean
Created to help educators incorporate World Wide Web pages into teaching and learning, this collection of Web pages presents resources, materials, and techniques for using the Web. The first page focuses on tools for teaching and learning via the Web, providing pointers to sites containing the following: (1) course materials for both distance and…
Classroom Web Pages: A "How-To" Guide for Educators.
ERIC Educational Resources Information Center
Fehling, Eric E.
This manual provides teachers, with very little or no technology experience, with a step-by-step guide for developing the necessary skills for creating a class Web Page. The first part of the manual is devoted to the thought processes preceding the actual creation of the Web Page. These include looking at other Web Pages, deciding what should be…
SurveyWiz and factorWiz: JavaScript Web pages that make HTML forms for research on the Internet.
Birnbaum, M H
2000-05-01
SurveyWiz and factorWiz are Web pages that act as wizards to create HTML forms that enable one to collect data via the Web. SurveyWiz allows the user to enter survey questions or personality test items with a mixture of text boxes and scales of radio buttons. One can add demographic questions of age, sex, education, and nationality with the push of a button. FactorWiz creates the HTML for within-subjects, two-factor designs as large as 9 x 9, or higher order factorial designs up to 81 cells. The user enters levels of the row and column factors, which can be text, images, or other multimedia. FactorWiz generates the stimulus combinations, randomizes their order, and creates the page. In both programs HTML is displayed in a window, and the user copies it to a text editor to save it. When uploaded to a Web server and supported by a CGI script, the created Web pages allow data to be collected, coded, and saved on the server. These programs are intended to assist researchers and students in quickly creating studies that can be administered via the Web.
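The wizard pattern described here, generating repetitive form HTML from a few parameters, is easy to illustrate. A minimal sketch of one survey item as a radio-button scale; the markup details are generic HTML, not SurveyWiz's exact output:

```python
def radio_scale(name: str, prompt: str, levels: int) -> str:
    """Emit one survey item as a row of numbered radio buttons, wizard-style."""
    buttons = "".join(
        f'<input type="radio" name="{name}" value="{v}">{v} '
        for v in range(1, levels + 1)
    )
    return f"<p>{prompt}<br>{buttons}</p>"

# Hypothetical item: a 5-point rating scale named q1.
html = radio_scale("q1", "How easy was the site to use?", 5)
```

As in SurveyWiz, the generated markup would be pasted into a page and paired with a server-side script to record submissions.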
ERIC Educational Resources Information Center
Ariga, T.; Watanabe, T.
2008-01-01
The explosive growth of the Internet has made the knowledge and skills for creating Web pages into general subjects that all students should learn. It is now common to teach the technical side of the production of Web pages, and many teaching materials have been developed. However, teaching the aesthetic side of Web page design has been neglected,…
Scheduled webinars can help you better manage EPA web content. Class topics include Drupal basics, creating different types of pages in the WebCMS such as document pages and forms, using Google Analytics, and best practices for metadata and accessibility.
"Ordinary People Do This": Rhetorical Examinations of Novice Web Design
ERIC Educational Resources Information Center
Karper, Erin
2005-01-01
Even as weblogs, content management systems, and other forms of automated Web posting and journals are changing the way people create and place content on the Web, new Web pages mushroom overnight. However, many new Web designers produce Web pages that seem to ignore fundamental principles of "good design": full of colored backgrounds, animated…
Web Pages for Your Classroom: The Easy Way!
ERIC Educational Resources Information Center
McCorkle, Sandra K.
This book provides the classroom teacher or librarian with templates and instructions for creating Web pages for use with middle school or high school students. The pages can then be used for doing research projects or other types of projects that familiarize students with the power, flexibility, and usefulness of the Web. Part I, Technology in…
How To Build a Web Site in Six Easy Steps.
ERIC Educational Resources Information Center
Yaworski, JoAnn
2002-01-01
Gives instructions in nontechnical terms for building a simple web site using Netscape Navigator or Communicator's web editor. Presents six steps that include: organizing information, creating a page and a background, linking files, linking to Internet web pages, linking images, and linking an email address. Gives advice for sending the web page…
NASA Technical Reports Server (NTRS)
Steeman, Gerald; Connell, Christopher
2000-01-01
Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and committing resources, from the time to populate the database to training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals and lessons learned in the Web-to-database process, including setting up Data Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Structured Query Language (SQL). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.
A cross disciplinary study of link decay and the effectiveness of mitigation techniques
Hennessey, Jason; Ge, Steven
2013-01-01
Background The dynamic, decentralized World Wide Web has become an essential part of scientific research and communication. Researchers create thousands of web sites every year to share software, data and services. These valuable resources tend to disappear over time. The problem has been documented in many subject areas. Our goal is to conduct a cross-disciplinary investigation of the problem and test the effectiveness of existing remedies. Results We accessed 14,489 unique web pages found in the abstracts within Thomson Reuters' Web of Science citation index that were published between 1996 and 2010 and found that the median lifespan of these web pages was 9.3 years with 62% of them being archived. Survival analysis and logistic regression were used to find significant predictors of URL lifespan. The availability of a web page is most dependent on the time it is published and the top-level domain names. Similar statistical analysis revealed biases in current solutions: the Internet Archive favors web pages with fewer layers in the Universal Resource Locator (URL) while WebCite is significantly influenced by the source of publication. We also created a prototype for a process to submit web pages to the archives and increased coverage of our list of scientific webpages in the Internet Archive and WebCite by 22% and 255%, respectively. Conclusion Our results show that link decay continues to be a problem across different disciplines and that current solutions for static web pages are helping and can be improved. PMID:24266891
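The lifespan statistic reported above can be illustrated with a toy computation. The dates below are invented, and the paper's actual survival analysis also handles pages that are still alive (right-censoring), which this sketch ignores:

```python
from datetime import date
from statistics import median

def lifespans_in_years(records: list[tuple[date, date]]) -> list[float]:
    """Years between a URL's publication and the last date it resolved."""
    return [(died - published).days / 365.25 for published, died in records]

# Hypothetical (published, last-seen-alive) pairs for three URLs.
sample = [(date(1996, 1, 1), date(2005, 1, 1)),
          (date(2000, 6, 1), date(2012, 6, 1)),
          (date(2003, 3, 1), date(2010, 3, 1))]
print(round(median(lifespans_in_years(sample)), 1))
```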
A Guide to Fast and Simple Web Site Development. Using Microsoft FrontPage.
ERIC Educational Resources Information Center
La, Minh; Beachler, Judith
Designed by California's Los Rios Community College District for use in instructional workshops, this guide is intended to help institutional researchers create World Wide Web sites using Microsoft FrontPage (MF) software. The first part of the guide presents practical suggestions for working with the software to create a site, covering the…
Informatics in radiology (infoRAD): HTML and Web site design for the radiologist: a primer.
Ryan, Anthony G; Louis, Luck J; Yee, William C
2005-01-01
A Web site has enormous potential as a medium for the radiologist to store, present, and share information in the form of text, images, and video clips. With a modest amount of tutoring and effort, designing a site can be as painless as preparing a Microsoft PowerPoint presentation. The site can then be used as a hub for the development of further offshoots (eg, Web-based tutorials, storage for a teaching library, publication of information about one's practice, and information gathering from a wide variety of sources). By learning the basics of hypertext markup language (HTML), the reader will be able to produce a simple and effective Web page that permits display of text, images, and multimedia files. The process of constructing a Web page can be divided into five steps: (a) creating a basic template with formatted text, (b) adding color, (c) importing images and multimedia files, (d) creating hyperlinks, and (e) uploading one's page to the Internet. This Web page may be used as the basis for a Web-based tutorial comprising text documents and image files already in one's possession. Finally, there are many commercially available packages for Web page design that require no knowledge of HTML.
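Steps (a) through (d) of the primer can be condensed into a single string-building sketch. The tag choices here are generic HTML, not the article's exact markup, and the tutorial content is a hypothetical example:

```python
def basic_page(title: str, body_html: str, bg_color: str = "#ffffff") -> str:
    """Assemble a minimal page covering steps (a)-(d) of the primer."""
    return (
        "<html><head>"
        f"<title>{title}</title></head>"
        f'<body bgcolor="{bg_color}">'        # (b) add color
        f"<h1>{title}</h1>{body_html}"        # (a) template with formatted text
        "</body></html>"
    )

page = basic_page(
    "Chest Radiograph Tutorial",
    '<img src="cxr.jpg" alt="case 1">'        # (c) import an image
    '<a href="cases.html">More cases</a>',    # (d) create a hyperlink
)
```

Step (e), uploading, is just copying the resulting file to a web server.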
Avoiding Pornography Landmines while Traveling the Information Superhighway.
ERIC Educational Resources Information Center
Lehmann, Kay
2002-01-01
Discusses how to avoid pornographic sites when using the Internet in classrooms. Highlights include re-setting the Internet home page; putting appropriate links in a Word document; creating a Web page with appropriate links; downloading the content of a Web site; educating the students; and re-checking all Web addresses. (LRW)
Using Firefly Tools to Enhance Archive Web Pages
NASA Astrophysics Data System (ADS)
Roby, W.; Wu, X.; Ly, L.; Goldina, T.
2013-10-01
Astronomy web developers are looking for fast and powerful HTML 5/AJAX tools to enhance their web archives. We are exploring ways to make this easier for the developer. How could you have a full FITS visualizer or a Web 2.0 table that supports paging, sorting, and filtering in your web page in 10 minutes? Can it be done without even installing any software or maintaining a server? Firefly is a powerful, configurable system for building web-based user interfaces to access astronomy science archives. It has been in production for the past three years. Recently, we have made some of the advanced components available through very simple JavaScript calls. This allows a web developer, without any significant knowledge of Firefly, to have FITS visualizers, advanced table display, and spectrum plots on their web pages with minimal learning curve. Because we use cross-site JSONP, installing a server is not necessary. Web sites that use these tools can be created in minutes. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). We are using Firefly to serve many projects including Spitzer, Planck, WISE, PTF, LSST and others.
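The reason cross-site JSONP removes the need for a local server is that the archive replies with JavaScript that invokes a callback the embedding page defines. A sketch of the server side of that handshake; the payload fields are invented, not Firefly's real API:

```python
import json

def jsonp_response(callback: str, payload: dict) -> str:
    """Wrap a JSON payload in the caller-named function, per the JSONP pattern."""
    return f"{callback}({json.dumps(payload)});"

# Hypothetical reply to a table-paging request from an embedded component.
reply = jsonp_response("handleTable", {"rows": 25, "sorted_by": "ra"})
```

The browser executes the reply as a script, so the data crosses site boundaries without any server-side proxy on the developer's end.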
Fels, Deborah I; Richards, Jan; Hardman, Jim; Lee, Daniel G
2006-01-01
The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, accessing the Web can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The present article describes a system that allows sign language-only Web pages to be created and linked through a video-based technique called sign-linking. In two studies, 14 Deaf participants examined two iterations of sign-linked Web pages to gauge the usability and learnability of a signing Web page interface. The first study indicated that signing Web pages were usable by sign language users but that some interface features required improvement. The second study showed increased usability for those features; users could consequently navigate sign language information with ease and pleasure.
76 FR 77203 - Notice of Intent To Seek Approval To Collect Information
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-12
... Web pages created and maintained by component organizations of the NAL. On average, 2 million people... interest collections have established a Web presence with a home page and links to sub-pages that provide... Semantic Differential Scale or multiple choice questions, and no more than 4 open-ended response questions...
Creating and Maintaining Data-Driven Course Web Sites.
ERIC Educational Resources Information Center
Heines, Jesse M.
This paper deals with techniques for reducing the amount of work that needs to be redone each semester when one prepares an existing course Web site for a new class. The key concept is algorithmic generation of common page elements while still allowing full control over page content via WYSIWYG tools like Microsoft FrontPage and Macromedia…
Artieta-Pinedo, Isabel; Paz-Pascual, Carmen; Grandes, Gonzalo; Villanueva, Gemma
2018-03-01
The aim of this study is to evaluate the quality of web pages found by women when carrying out an exploratory search concerning pregnancy, childbirth, the postpartum period and breastfeeding. A descriptive study of the first 25 web pages that appeared in the search engines Google, Yahoo and Bing, in October 2014 in the Basque Country (Spain), when entering eight Spanish words and seven English words related to pregnancy, childbirth, the postpartum period, breastfeeding and newborns. Web pages aimed at healthcare professionals and forums were excluded. Reliability was evaluated using the LIDA questionnaire, and the contents of the web pages with the highest scores were then described. A total of 126 web pages were found using the key search words. Of these, 14 scored in the top 30% for reliability. The content analysis of these found that the mean score for "references to the source of the information" was 3.4 (SD: 2.17), that for "up-to-date" was 4.30 (SD: 1.97), and that for "conflict of interest statement" was 5.90 (SD: 2.16). The mean for web pages created by universities and official bodies was 13.64 (SD: 4.47), whereas the mean for those created by private bodies was 11.23 (SD: 4.51) (F(1,124) = 5.27, p = 0.02). The content analysis of these web pages found that the most commonly discussed topic was breastfeeding, followed by self-care during pregnancy and the onset of childbirth. In this study, web pages from established healthcare or academic institutions were found to contain the most reliable information. The significant number of web pages with poor-quality information found in this study indicates the need for healthcare professionals to guide women when sourcing information online. As the origin of a web page has a direct effect on reliability, the involvement of healthcare professionals in the use, counselling and generation of new technologies as an intervention tool is increasingly essential. Copyright © 2017 Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-29
... answers to 5 questions. NIOSH has also created a new NIOSH Cancer and RELs Policy Web Topic Page [see http... be available on the NIOSH Web page at http://www.cdc.gov/niosh/docket , and comments will be...
Suzuki, Lalita K; Beale, Ivan L
2006-01-01
The content of personal Web home pages created by adolescents with cancer is a new source of information about this population, of potential benefit to oncology nurses and psychologists. Individual Internet elements found on 21 home pages created by youths with cancer (14-22 years old) were rated for cancer-related self-presentation, information dissemination, and interpersonal connection. Examples of adolescents' online narratives were also recorded. Adolescents with cancer used various Internet elements on their home pages for cancer-related self-presentation (e.g., welcome messages, essays, personal history and diary pages, news articles, and poetry), information dissemination (e.g., through personal interest pages, multimedia presentations, lists, charts, and hyperlinks), and interpersonal connection (e.g., guestbook entries). Results suggest that various elements found on personal home pages are being used by a limited number of young patients with cancer for self-expression, information access, and contact with peers.
Marketing on the World Wide Web.
ERIC Educational Resources Information Center
Teague, John H.
1995-01-01
Discusses the World Wide Web, its importance for marketing, its advantages, non-commercial promotions on the Web, how businesses use the Web, the Web market, resistance to Internet commercialization, getting on the Web, creating Web pages, rising above the noise, and some of the Web's problems and limitations. (SR)
A Tutorial in Creating Web-Enabled Databases with Inmagic DB/TextWorks through ODBC.
ERIC Educational Resources Information Center
Breeding, Marshall
2000-01-01
Explains how to create Web-enabled databases. Highlights include Inmagic's DB/Text WebPublisher product called DB/TextWorks; ODBC (Open Database Connectivity) drivers; Perl programming language; HTML coding; Structured Query Language (SQL); Common Gateway Interface (CGI) programming; and examples of HTML pages and Perl scripts. (LRW)
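The Web-to-database pattern this tutorial covers, a CGI script running a SQL query and emitting HTML, can be sketched generically. Table and column names below are made up, and DB/TextWorks itself is not a SQLite database; SQLite just stands in for any ODBC-reachable source:

```python
import sqlite3

def records_as_html(db: sqlite3.Connection, term: str) -> str:
    """Run a parameterized query and render the hits as HTML table rows."""
    rows = db.execute(
        "SELECT title, author FROM records WHERE title LIKE ?", (f"%{term}%",)
    ).fetchall()
    return "".join(f"<tr><td>{t}</td><td>{a}</td></tr>" for t, a in rows)

# Hypothetical catalog with one record, queried the way a CGI script would.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE records (title TEXT, author TEXT)")
db.execute("INSERT INTO records VALUES ('Web Databases', 'Breeding')")
print(records_as_html(db, "Web"))
```

In the tutorial's setup, the equivalent role is played by a Perl CGI script talking to DB/TextWorks through an ODBC driver.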
Web Database Development: Implications for Academic Publishing.
ERIC Educational Resources Information Center
Fernekes, Bob
This paper discusses the preliminary planning, design, and development of a pilot project to create an Internet accessible database and search tool for locating and distributing company data and scholarly work. Team members established four project objectives: (1) to develop a Web accessible database and decision tool that creates Web pages on the…
ERIC Educational Resources Information Center
Metz, Ray E.; Junion-Metz, Gail
This book provides basic information about the World Wide Web and serves as a guide to the tools and techniques needed to browse the Web, integrate it into library services, or build an attractive, user-friendly home page for the library. Chapter 1 provides an overview of Web basics and chapter 2 discusses some of the big issues related to…
ERIC Educational Resources Information Center
Block, Marylaine
2002-01-01
Discusses how to teach students to evaluate information they find on the Internet. Highlights include motivation of Web site owners; link-checking; having student create Web pages to help with their evaluation skills of other Web sites; critical thinking skills; and helpful Web sites. (LRW)
Designing and Implementing a Unique Website Design Project in an Undergraduate Course
ERIC Educational Resources Information Center
Kontos, George
2016-01-01
The following paper describes a distinctive collaborative service-learning project done in an undergraduate class on web design. In this project, students in a web design class contacted local community non-profit organizations to create websites (collections of web pages) to benefit these organizations. The two phases of creating a website,…
A profile of anti-vaccination lobbying on the South African internet, 2011-2013.
Burnett, Rosemary Joyce; von Gogh, Lauren Jennifer; Moloi, Molelekeng H; François, Guido
2015-11-01
The South African Vaccination and Immunisation Centre receives many requests to explain the validity of internet-based anti-vaccination claims. Previous global studies on internet-based anti-vaccination lobbying had not identified anti-vaccination web pages originating in South Africa (SA). To characterise SA internet-based anti-vaccination lobbying. In 2011, searches for anti-vaccination content were performed using Google, Yahoo and MSN-Bing, limited to English-language SA web pages. Content analysis was performed on web pages expressing anti-vaccination sentiment about infant vaccination. This was repeated in 2012 and 2013 using Google, with the first 700 web pages per search being analysed. Blogs/forums, articles and e-shops constituted 40.3%, 55.2% and 4.5% of web pages, respectively. Authors were lay people (63.5%), complementary/alternative medicine (CAM) practitioners (23.1%), medical professionals practising CAM (7.7%) and medical professionals practising only allopathic medicine (5.8%). Advertisements appeared on 55.2% of web pages. Of these, 67.6% were sponsored by or linked to organisations with financial interests in discrediting vaccines, with 80.0% and 24.0% of web pages sponsored by these organisations claiming respectively that vaccines are ineffective and that vaccination is profit driven. The vast majority of web pages (92.5%) claimed that vaccines are not safe, and 77.6% of anti-vaccination claims originated from the USA. South Africans are creating web pages or blogs for local anti-vaccination lobbying. Research is needed to understand what influence internet-based anti-vaccination lobbying has on the uptake of infant vaccination in SA.
Key-phrase based classification of public health web pages.
Dolamic, Ljiljana; Boyer, Célia
2013-01-01
This paper describes and evaluates the public health web pages classification model based on key phrase extraction and matching. Easily extendible both in terms of new classes as well as the new language this method proves to be a good solution for text classification faced with the total lack of training data. To evaluate the proposed solution we have used a small collection of public health related web pages created by a double blind manual classification. Our experiments have shown that by choosing the adequate threshold value the desired value for either precision or recall can be achieved.
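The key-phrase matching with a threshold described above reduces to counting phrase hits per class. A minimal sketch, with invented phrase lists rather than the authors' actual key phrases:

```python
def classify(text: str, phrases_by_class: dict[str, list[str]], threshold: int):
    """Return classes whose matched-phrase count meets the threshold."""
    text = text.lower()
    scores = {
        cls: sum(1 for p in phrases if p in text)
        for cls, phrases in phrases_by_class.items()
    }
    return [cls for cls, s in scores.items() if s >= threshold]

# Hypothetical classes; raising the threshold trades recall for precision,
# as the abstract notes.
classes = {"vaccination": ["vaccine", "immunisation"],
           "nutrition": ["diet", "vitamin"]}
print(classify("Vaccine safety and immunisation schedules", classes, 2))
```

Because no training data is needed, adding a class or a language is just a matter of supplying a new phrase list.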
Building the Service-Based Library Web Site: A Step-by-Step Guide to Design and Options.
ERIC Educational Resources Information Center
Garlock, Kristen L.; Piontek, Sherry
The World Wide Web, with its captivating multimedia features and hypertext capabilities, has brought millions of new users to the Internet. Library staff who can create a home page on the Web can present basic information about the library and its services, showcase its resources, and create links to quality material inside and outside the…
Creating a Classroom Kaleidoscope with the World Wide Web.
ERIC Educational Resources Information Center
Quinlan, Laurie A.
1997-01-01
Discusses the elements of classroom Web presentations: planning; construction, including design tips; classroom use; and assessment. Lists 14 World Wide Web resources for K-12 teachers; Internet search tools (directories, search engines and meta-search engines); a Web glossary; and an example of HTML for a simple Web page. (PEN)
ERIC Educational Resources Information Center
Snyder, Robin M.
HTML provides a platform-independent way of creating and making multimedia presentations for classroom instruction and making that content available on the Internet. However, time in class is very valuable, so that any way to automate or otherwise assist the presenter in Web page navigation during class can save valuable seconds. This paper…
Accounting Data to Web Interface Using PERL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hargeaves, C
2001-08-13
This document will explain the process to create a web interface for the accounting information generated by the High Performance Storage Systems (HPSS) accounting report feature. The accounting report contains useful data but it is not easily accessed in a meaningful way. The accounting report is the only way to see summarized storage usage information. The first step is to take the accounting data, make it meaningful and store the modified data in persistent databases. The second step is to generate the various user interfaces, HTML pages, that will be used to access the data. The third step is to transfer all required files to the web server. The web pages pass parameters to Common Gateway Interface (CGI) scripts that generate dynamic web pages and graphs. The end result is a web page with specific information presented in text with or without graphs. The accounting report has a specific format that allows the use of regular expressions to verify if a line is storage data. Each storage data line is stored in a detailed database file with a name that includes the run date. The detailed database is used to create a summarized database file that also uses run date in its name. The summarized database is used to create the group.html web page that includes a list of all storage users. Scripts that query the database folder to build a list of available databases generate two additional web pages. A master script that is run monthly as part of a cron job, after the accounting report has completed, manages all of these individual scripts. All scripts are written in the PERL programming language. Whenever possible, data manipulation scripts are written as filters. All scripts are written to be single source, which means they will function properly on both the open and closed networks at LLNL. The master script handles the command line inputs for all scripts, file transfers to the web server and records run information in a log file.
The rest of the scripts manipulate the accounting data or use the files created to generate HTML pages. Each script will be described in detail herein. The following is a brief description of HPSS taken directly from an HPSS web site: "HPSS is a major development project, which began in 1993 as a Cooperative Research and Development Agreement (CRADA) between government and industry. The primary objective of HPSS is to move very large data objects between high performance computers, workstation clusters, and storage libraries at speeds many times faster than is possible with today's software systems. For example, HPSS can manage parallel data transfers from multiple network-connected disk arrays at rates greater than 1 Gbyte per second, making it possible to access high definition digitized video in real time." The HPSS accounting report is a canned report whose format is controlled by the HPSS developers.
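The filtering step described above, using a regular expression to decide whether a report line is storage data, looks like this in miniature. The line format here is invented; HPSS's real accounting format differs, and the original scripts are Perl rather than Python:

```python
import re

# Assumed toy format: "<user> <file-count> <byte-count>" per data line.
DATA_LINE = re.compile(r"^(?P<user>\w+)\s+(?P<files>\d+)\s+(?P<bytes>\d+)$")

def parse_report(lines: list[str]) -> list[dict]:
    """Keep only lines that look like storage data; skip headers and footers."""
    out = []
    for line in lines:
        m = DATA_LINE.match(line.strip())
        if m:
            out.append({"user": m["user"],
                        "files": int(m["files"]),
                        "bytes": int(m["bytes"])})
    return out

report = ["HPSS Accounting Report", "alice 120 5000000", "--- end ---"]
print(parse_report(report))
```

Written as a filter like this, the same logic can feed either the detailed or the summarized database file.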
Working with WebQuests: Making the Web Accessible to Students with Disabilities.
ERIC Educational Resources Information Center
Kelly, Rebecca
2000-01-01
This article describes how students with disabilities in regular classes are using the WebQuest lesson format to access the Internet. It explains essential WebQuest principles, creating a draft Web page, and WebQuest components. It offers an example of a WebQuest about salvaging the sunken ships, Titanic and Lusitania. A WebQuest planning form is…
Web server for priority ordered multimedia services
NASA Astrophysics Data System (ADS)
Celenk, Mehmet; Godavari, Rakesh K.; Vetnes, Vermund
2001-10-01
In this work, our aim is to provide finer priority levels in the design of a general-purpose Web multimedia server with provisions of the CM services. The types of services provided include reading/writing a web page, downloading/uploading an audio/video stream, navigating the Web through browsing, and interactive video teleconferencing. The selected priority encoding levels for such operations follow the order of admin read/write, hot page CM and Web multicasting, CM read, Web read, CM write and Web write. Hot pages are the most requested CM streams (e.g., the newest movies, video clips, and HDTV channels) and Web pages (e.g., portal pages of the commercial Internet search engines). Maintaining a list of these hot Web pages and CM streams in a content-addressable buffer enables a server to multicast hot streams with lower latency and higher system throughput. Cold Web pages and CM streams are treated as regular Web and CM requests. Interactive CM operations such as pause (P), resume (R), fast-forward (FF), and rewind (RW) have to be executed without allocation of extra resources. The proposed multimedia server model is a part of a distributed network with load-balancing schedulers. The SM is connected to an integrated disk scheduler (IDS), which supervises an allocated disk manager. The IDS follows the same priority handling as the SM, and implements a SCAN disk-scheduling method for improved disk access and higher throughput. Different disks are used for the Web and CM services in order to meet the QoS requirements of CM services. The IDS output is forwarded to an Integrated Transmission Scheduler (ITS). The ITS creates a priority-ordered buffering of the retrieved Web pages and CM data streams that are fed into auto-regressive moving average (ARMA) based traffic-shaping circuitry before being transmitted through the network.
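The priority order given in the abstract can be sketched with a heap, where a lower number is served first. The numeric levels below are my own encoding of the order stated in the text, not values from the paper:

```python
import heapq

PRIORITY = {"admin": 0, "hot": 1, "cm_read": 2,
            "web_read": 3, "cm_write": 4, "web_write": 5}

class PriorityServer:
    """Serve requests strictly by priority level, FIFO within a level."""
    def __init__(self):
        self._q, self._n = [], 0   # counter preserves arrival order per level

    def submit(self, kind: str, request: str):
        heapq.heappush(self._q, (PRIORITY[kind], self._n, request))
        self._n += 1

    def next_request(self) -> str:
        return heapq.heappop(self._q)[2]

s = PriorityServer()
s.submit("web_read", "GET /index.html")
s.submit("hot", "STREAM movie.mpg")
first = s.next_request()
print(first)
```

The hot-page request jumps ahead of the earlier Web read, mirroring how the server multicasts hot streams with lower latency.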
World Wide Web Home Page Design: Patterns and Anomalies of Higher Education Library Home Pages.
ERIC Educational Resources Information Center
Stover, Mark; Zink, Steven D.
1996-01-01
A review of college and university library home pages concluded that many higher education home pages are badly designed, difficult to navigate, and a poor reflection on the institution. The most common shortcoming was the tendency to create too many links or overly large graphics. An appendix lists points to consider when constructing a home…
Software tools for developing an acoustics multimedia CD-ROM
NASA Astrophysics Data System (ADS)
Bigelow, Todd W.; Wheeler, Paul A.
2003-10-01
A multimedia CD-ROM was developed to accompany the textbook The Science of Sound by Tom Rossing. This paper discusses the multimedia elements included in the CD-ROM and the various software packages used to create them. PowerPoint presentations with an audio-track background were converted to web pages using Impatica. Animations of acoustic examples and quizzes were developed using Flash by Macromedia. Vegas Video and Sound Forge by Sonic Foundry were used for editing video and audio clips, while Cleaner by Discreet was used to compress the clips for use over the internet. Math tutorials were presented as whiteboard presentations, using Hitachi's Starboard to create the graphics and TechSmith's Camtasia Studio to record the presentations. The CD-ROM is in a web-page format created with Macromedia's Dreamweaver. All of these elements are integrated into a single course supplement that can be viewed on any computer with a web browser.
Basic GA Tools to Evaluate Your Web Area
Learn steps and tips for creating these Google Analytics (GA) reports so you can see which pages are popular or unpopular, which PDFs are being viewed, who is using your pages, what search terms they used, and more.
A Course Evolves-Physical Anthropology.
ERIC Educational Resources Information Center
O'Neil, Dennis
2001-01-01
Describes the development of an online physical anthropology course at Palomar College (California) that evolved from online tutorials. Discusses the ability to update materials on the Web more quickly than in traditional textbooks; creating Web pages that are readable by most Web browsers; test security issues; and clarifying ownership of online…
Multimedia Data Capture with Multicast Dissemination for Online Distance Learning
2001-12-01
Juan Gril and Dr. Don Brutzman to wrap the multiple videos in a user-friendly environment. The web pages also contain the original PowerPoint...this CD, Juan Gril, a volunteer for the Siggraph 2001 Online Committee, created web pages that match the style and functionality desired by the...leader. The Committee for 2001 consisted of Don Brutzman, Stephen Matsuba, Mike Collins, Allen Dutton, Juan Gril, Mike Hunsberger, Jerry Isdale
A Virtual Tour of the Radio Astronomy Process
NASA Astrophysics Data System (ADS)
Conrad, S. B.; Finley, D. G.; Claussen, M. J.; Ulvestad, J. S.
2000-12-01
In the summer of 2000, two teachers working on a Masters of Science Teaching degree at New Mexico Tech and participating in the Research Experience for Teachers (RET) program sponsored by the National Science Foundation spent eight weeks as interns researching and working on projects at the National Radio Astronomy Observatory (NRAO) that will directly benefit students in their classrooms and also impact other science educators. One of the products of the internships is a set of web pages for the educational section of NRAO's web site. The purpose of these web pages is to familiarize students, teachers, and others with the process that a radio astronomer goes through to do radio astronomy science. A virtual web tour of this process was created. This required interviewing radio astronomers and other professionals involved in this process at the NRAO (e.g., engineers, data analysts, and operations people) and synthesizing the interviews into a descriptive, visual-based set of web pages. These pages meet both the National and the New Mexico Standards and Benchmarks for Science Education. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc. The NSF's RET program is gratefully acknowledged.
Business Systems Branch Abilities, Capabilities, and Services Web Page
NASA Technical Reports Server (NTRS)
Cortes-Pena, Aida Yoguely
2009-01-01
During the INSPIRE summer internship I acted as the Business Systems Branch Capability Owner for the Kennedy Web-based Initiative for Communicating Capabilities System (KWICC), with the responsibility of creating a portal that describes the services provided by this Branch. This project will help others achieve a clear view of the services that the Business Systems Branch provides to NASA and the Kennedy Space Center. After collecting data through interviews with subject matter experts and from the literature in Business World and other web sites, I identified discrepancies, made the necessary corrections to the sites, and placed the information from the report into the KWICC web page.
ERIC Educational Resources Information Center
Block, Marylaine
2001-01-01
Discusses Web sites and Weblogs (or blogs) created by librarians as informal, interactive zines. Considers Web publishing software which makes it easier to start, motivation for self-publishing, and differences from trade publications; and provides a list of librarian blogs and zines. (LRW)
Going, going, still there: using the WebCite service to permanently archive cited web pages.
Eysenbach, Gunther; Trudel, Mathieu
2005-12-30
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information, including the WebCite link, on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. In addition, WebCite can process publisher-submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focused crawler, retrospectively caching the references of already-published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations.
The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics.
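The robot-exclusion check the abstract mentions can be sketched with Python's standard library. The user-agent name and rules below are illustrative (not WebCite's real agent), and a real archiver would also honour no-cache and no-archive meta tags:

```python
from urllib import robotparser

def may_archive(robots_txt_lines, url, agent="ExampleArchiver"):
    """Check a site's robots.txt rules before caching a cited page.

    Applies the standard robot-exclusion parser to already-fetched
    robots.txt lines; no network access is performed here.
    """
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt_lines)
    return rp.can_fetch(agent, url)

rules = ["User-agent: *", "Disallow: /private/"]
may_archive(rules, "https://example.org/paper.html")  # allowed
may_archive(rules, "https://example.org/private/x")   # disallowed
```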
Home Page: The Mode of Transport through the Information Superhighway
NASA Technical Reports Server (NTRS)
Lujan, Michelle R.
1995-01-01
The purpose of the project with the Aeroacoustics Branch was to create and publish a home page of branch information on the internet. In order to do this, one must also become familiar with the way the internet operates. Learning HyperText Markup Language (HTML) and gaining the ability to create a document using this language were the final objectives required to place a home page on the internet (World Wide Web). A manual of instructions for maintaining the home page and keeping it up to date was also necessary in order to give branch members the opportunity to make any pertinent changes.
Parents on the web: risks for quality management of cough in children.
Pandolfini, C; Impicciatore, P; Bonati, M
2000-01-01
Health information on the Internet, with respect to common, self-limited childhood illnesses, has been found to be unreliable. Therefore, parents navigating on the Internet risk finding advice that is incomplete or, more importantly, not evidence-based. The importance of a resource such as the Internet as a source of quality health information for consumers should, however, be taken into consideration. For this reason, studies need to be performed regarding the quality of material provided. Various strategies have been proposed that would allow parents to distinguish trustworthy web documents from unreliable ones. One of these strategies is the use of a checklist for the appraisal of web pages based on their technical aspects. The purpose of this study was to assess the quality of information present on the Internet regarding the home management of cough in children and to examine the applicability of a checklist strategy that would allow consumers to select more trustworthy web pages. The Internet was searched for web pages regarding the home treatment of cough in children with the use of different search engines. Medline and the Cochrane database were searched for available evidence concerning the management of cough in children. Three checklists were created to assess different aspects of the web documents. The first checklist was designed to allow for a technical appraisal of the web pages and was based on components such as the name of the author and references used. The second was constructed to examine the completeness of the health information contained in the documents, such as causes and mechanism of cough, and pharmacological and nonpharmacological treatment. The third checklist assessed the quality of the information by measuring it against a gold standard document.
This document was created by combining the policy statement issued by the American Academy of Pediatrics regarding the pharmacological treatment of cough in children with the guide of the World Health Organization on drugs for children. For each checklist, the web page contents were analyzed and quantitative measurements were assigned. Of the 19 web pages identified, 9 explained the purpose and/or mechanism of cough and 14 the causes. The most frequently mentioned pharmacological treatments were single-ingredient suppressant preparations, followed by single-ingredient expectorants. Dextromethorphan was the most commonly referred to suppressant and guaifenesin the most common expectorant. No documents discouraged the use of suppressants, although 4 of the 10 web documents that addressed expectorants discouraged their use. Sixteen web pages addressed nonpharmacological treatment, 14 of which suggested exposure to a humid environment and/or extra fluid. In most cases, the criteria in the technical appraisal checklist were not present in the web documents; moreover, 2 web pages did not provide any of the items. Regarding content completeness, 3 web pages satisfied all the requirements considered in the checklist and 2 documents did not meet any of the criteria. Of the 3 web pages that scored highest in technical aspect, 2 also supplied complete information. No relationship was found, however, between the technical aspect and the content completeness. Concerning the quality of the health information supplied, 10 pages received a negative score because they contained more incorrect than correct information, and 1 web page received a high score. This document was 1 of the 2 that also scored high in technical aspect and content completeness. No relationship was found, however, among quality of information, technical aspect, and content completeness. 
As the results of this study show, a parent navigating the Internet for information on the home management of cough in children will no doubt find incorrect advice among the search results. (ABSTRACT TRUNCATED)
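The checklist scoring used in the study can be sketched as a simple tally of satisfied items. The item names below are illustrative placeholders, not the study's actual criteria wording:

```python
def checklist_score(page_features, checklist):
    """Score a web document against a quality checklist.

    Each checklist item the page satisfies contributes one point,
    giving the kind of quantitative measurement the study assigned
    to each page for each checklist.
    """
    return sum(1 for item in checklist if page_features.get(item, False))

# Hypothetical technical-appraisal checklist and one appraised page.
technical = ["author named", "references given", "date of update"]
page = {"author named": True, "references given": False, "date of update": True}
checklist_score(page, technical)  # 2 of 3 items met
```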
CH5M3D: an HTML5 program for creating 3D molecular structures.
Earley, Clarke W
2013-11-18
While a number of programs and web-based applications are available for the interactive display of 3-dimensional molecular structures, few of these provide the ability to edit these structures. For this reason, we have developed a library written in JavaScript to allow for the simple creation of web-based applications that should run on any browser capable of rendering HTML5 web pages. While our primary interest in developing this application was for educational use, it may also prove useful to researchers who want a light-weight application for viewing and editing small molecular structures. Molecular compounds are drawn on the HTML5 Canvas element, with the JavaScript code making use of standard techniques to allow display of three-dimensional structures on a two-dimensional canvas. Information about the structure (bond lengths, bond angles, and dihedral angles) can be obtained using a mouse or other pointing device. Both atoms and bonds can be added or deleted, and rotation about bonds is allowed. Routines are provided to read structures either from the web server or from the user's computer, and creation of galleries of structures can be accomplished with only a few lines of code. Documentation and examples are provided to demonstrate how users can access all of the molecular information for creation of web pages with more advanced features. A light-weight (≈ 75 kb) JavaScript library has been made available that allows for the simple creation of web pages containing interactive 3-dimensional molecular structures. Although this library is designed to create web pages, a web server is not required. Installation on a web server is straightforward and does not require any server-side modules or special permissions. The ch5m3d.js library has been released under the GNU GPL version 3 open-source license and is available from http://sourceforge.net/projects/ch5m3d/.
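The "standard techniques" for displaying 3-D structures on a 2-D canvas come down to rotating the coordinates and dropping the depth axis (orthographic projection). A minimal Python sketch of that idea (the library itself is JavaScript; the scale and canvas size here are arbitrary assumptions):

```python
import math

def project(points, yaw=0.0, pitch=0.0, scale=40.0, cx=150.0, cy=150.0):
    """Rotate 3-D atom coordinates and project them into 2-D canvas space."""
    out = []
    for x, y, z in points:
        # yaw: rotation about the vertical (y) axis
        x, z = (x * math.cos(yaw) + z * math.sin(yaw),
                -x * math.sin(yaw) + z * math.cos(yaw))
        # pitch: rotation about the horizontal (x) axis
        y, z = (y * math.cos(pitch) - z * math.sin(pitch),
                y * math.sin(pitch) + z * math.cos(pitch))
        # orthographic projection: drop z; canvas y grows downward
        out.append((cx + scale * x, cy - scale * y))
    return out

# A diatomic molecule along x, viewed head-on and then rotated 90 degrees.
pts = project([(0, 0, 0), (1.5, 0, 0)])
rot = project([(0, 0, 0), (1.5, 0, 0)], yaw=math.pi / 2)
```

Sorting atoms by the rotated z before drawing (painter's algorithm) is the usual next step, so nearer atoms overdraw farther ones.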
NASA Astrophysics Data System (ADS)
Allebach, J. P.; Ortiz Segovia, Maria; Atkins, C. Brian; O'Brien-Strain, Eamonn; Damera-Venkata, Niranjan; Bhatti, Nina; Liu, Jerry; Lin, Qian
2010-02-01
Businesses have traditionally relied on different types of media to communicate with existing and potential customers. With the emergence of the Web, the relation between the use of print and electronic media has continually evolved. In this paper, we investigate one possible scenario that combines the use of the Web and print. Specifically, we consider the scenario where a small- or medium-sized business (SMB) has an existing web site from which they wish to pull content to create a print piece. Our assumption is that the web site was developed by a professional designer, working in conjunction with the business owner or marketing team, and that it contains a rich assembly of content that is presented in an aesthetically pleasing manner. Our goal is to understand the process that a designer would follow to create an effective and aesthetically pleasing print piece. We are particularly interested to understand the choices made by the designer with respect to placement and size of the text and graphic elements on the page. Toward this end, we conducted an experiment in which professional designers worked with SMBs to create print pieces from their respective web pages. In this paper, we report our findings from this experiment, and examine the underlying conclusions regarding the resulting document aesthetics in the context of the existing design, engineering, and computer science literatures that address this topic.
An Administrative Model for Virtual Website Hosting.
ERIC Educational Resources Information Center
Kandies, Jerry
The process of creating and maintaining a World Wide Web homepage for a national organization--the Association of Collegiate Business Schools and Programs (ACBSP)--is detailed in this paper. The logical design defines the conceptual relationships among the components of the Web pages and their hyperlinks, whereas the physical design concerns…
Academic medical center libraries on the Web.
Tannery, N H; Wessel, C B
1998-01-01
Academic medical center libraries are moving towards publishing electronically, utilizing networked technologies, and creating digital libraries. The catalyst for this movement has been the Web. An analysis of academic medical center library Web pages was undertaken to assess the information created and communicated in early 1997. A summary of present uses and suggestions for future applications is provided. A method for evaluating and describing the content of library Web sites was designed. The evaluation included categorizing basic information such as description and access to library services, access to commercial databases, and use of interactive forms. The main goal of the evaluation was to assess original resources produced by these libraries. PMID:9803298
Making EPA's PDF documents accessible (by Section 508 standards) and user-friendly includes steps such as adding bookmarks, using electronic conversion rather than scanning pages, and adding metadata.
Creative Commons: A New Tool for Schools
ERIC Educational Resources Information Center
Pitler, Howard
2006-01-01
Technology-savvy instructors often require students to create Web pages or videos, tasks that require finding materials such as images, music, or text on the Web, reusing them, and then republishing them in a technique that author Howard Pitler calls "remixing." However, this requires both the student and the instructor to deal with often thorny…
The Web-Database Connection Tools for Sharing Information on the Campus Intranet.
ERIC Educational Resources Information Center
Thibeault, Nancy E.
This paper evaluates four tools for creating World Wide Web pages that interface with Microsoft Access databases: DB Gateway, Internet Database Assistant (IDBA), Microsoft Internet Database Connector (IDC), and Cold Fusion. The system requirements and features of each tool are discussed. A sample application, "The Virtual Help Desk"…
[Improving vaccination social marketing by monitoring the web].
Ferro, A; Bonanni, P; Castiglia, P; Montante, A; Colucci, M; Miotto, S; Siddu, A; Murrone, L; Baldo, V
2014-01-01
Immunisation is one of the most important and cost-effective interventions in public health because of its significant positive impact on population health. However, since Jenner's discovery there has always been a lively debate between supporters and opponents of vaccination; today the anti-vaccination movement spreads its message mostly on the web, disseminating inaccurate data through blogs and forums and increasing vaccine rejection. In this context, the Società Italiana di Igiene (SItI) created a web project to fight misinformation on the web regarding vaccinations through a series of information tools, including scientific articles, educational information, videos, and multimedia presentations. The web portal (http://www.vaccinarsi.org) was published in May 2013, and over one hundred web pages related to vaccinations are already available. Recently a forum, a periodic newsletter, and a Twitter page have been created. There has been an average of 10,000 hits per month. Currently our users are mostly healthcare professionals. The visibility of the site is very good, and it currently ranks first in Google's search engine when typing the word "vaccinarsi". The results of the first four months of activity are extremely encouraging and show the importance of this project; furthermore, an application for quality certification by independent international organizations has been submitted.
Lifting Events in RDF from Interactions with Annotated Web Pages
NASA Astrophysics Data System (ADS)
Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad
In this paper we present a method and an implementation for creating and processing semantic events from interaction with Web pages, which opens possibilities to build event-driven applications for the (Semantic) Web. Events, simple or complex, are models for things that happen, e.g., when a user interacts with a Web page. Events are consumed in some meaningful way, e.g., for monitoring reasons or to trigger actions such as responses. In order for receiving parties to understand events, e.g., to comprehend what has led to an event, we propose a general event schema using RDFS. In this schema we cover the composition of complex events and event-to-event relationships. These events can then be used to route semantic information about an occurrence to different recipients, helping to make the Semantic Web active. Additionally, we present an architecture for detecting and composing events in Web clients. For the contents of events, we show how they are enriched with semantic information about the context in which they occurred. The paper is presented in conjunction with the use case of Semantic Advertising, which extends traditional clickstream analysis by introducing semantic short-term profiling, enabling discovery of the current interest of a Web user and therefore supporting advertisement providers in responding with more relevant advertisements.
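Representing a page interaction as an event resource with typed properties might look like the following sketch. The `ex:` vocabulary and property names are hypothetical, standing in for the paper's actual RDFS event schema:

```python
def click_event_triples(event_id, page_url, element_id, timestamp):
    """Describe a user-interaction event as subject-predicate-object triples.

    A minimal, library-free sketch: the event is a resource whose type
    and context (page, element, time) are recorded so that a consumer
    can comprehend what led to the event.
    """
    e = f"ex:{event_id}"
    return [
        (e, "rdf:type", "ex:ClickEvent"),
        (e, "ex:onPage", page_url),
        (e, "ex:onElement", element_id),
        (e, "ex:occurredAt", timestamp),
    ]

triples = click_event_triples("evt1", "http://example.org/ad", "buyButton",
                              "2009-06-01T12:00:00Z")
```

Complex events would add event-to-event relationships, e.g. a triple linking a composite event to the member events it was composed from.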
The Impact on Education of the World Wide Web.
ERIC Educational Resources Information Center
Hobbs, D. J.; Taylor, R. J.
This paper describes a project which created a set of World Wide Web (WWW) pages documenting the state of the art in educational multimedia design; a prototype WWW-based multimedia teaching tool--a podiatry test using HTML forms, 24-bit color images and MPEG video--was also designed, developed, and evaluated. The project was conducted between…
An Expertise Recommender using Web Mining
NASA Technical Reports Server (NTRS)
Joshi, Anupam; Chandrasekaran, Purnima; ShuYang, Michelle; Ramakrishnan, Ramya
2001-01-01
This report explored techniques to mine the web pages of scientists to extract information regarding their expertise, build expertise chains and referral webs, and semi-automatically combine this information with directory information services to create a recommender system that permits query by expertise. The approach included experimenting with existing techniques that have been reported in the research literature in the recent past, adapting them as needed. In addition, software tools were developed to capture and use this information.
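The mining-plus-recommendation pipeline can be caricatured in a few lines. The vocabulary, names, and matching rule below are illustrative and far simpler than the techniques the report adapted:

```python
import re
from collections import Counter

def expertise_profile(page_text, vocabulary):
    """Extract an expertise profile from a scientist's web page by
    counting occurrences of terms from a controlled vocabulary."""
    words = Counter(re.findall(r"[a-z]+", page_text.lower()))
    return {term: words[term] for term in vocabulary if words[term] > 0}

def recommend(query_term, profiles):
    """Rank people by how strongly their profile matches the query term."""
    ranked = sorted(profiles.items(),
                    key=lambda kv: kv[1].get(query_term, 0), reverse=True)
    return [name for name, prof in ranked if prof.get(query_term, 0) > 0]

# Hypothetical mined pages and a query by expertise.
pages = {"Dr. A": "robotics robotics vision", "Dr. B": "databases vision"}
profiles = {n: expertise_profile(t, ["robotics", "vision", "databases"])
            for n, t in pages.items()}
recommend("robotics", profiles)  # Dr. A first
```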
Information Portals: A New Tool for Teaching Information Literacy Skills
ERIC Educational Resources Information Center
Kolah, Debra; Fosmire, Michael
2010-01-01
Librarians at Rice and Purdue Universities created novel assignments to teach students important information literacy skills. The assignments required the students to use third-party web sites, PageFlakes and NetVibes, respectively, to create a dynamically updated portal to information they needed for their research and class projects. The use of…
2014-01-01
Background Logos are commonly used in molecular biology to provide a compact graphical representation of the conservation pattern of a set of sequences. They render the information contained in sequence alignments or profile hidden Markov models by drawing a stack of letters for each position, where the height of the stack corresponds to the conservation at that position, and the height of each letter within a stack depends on the frequency of that letter at that position. Results We present a new tool and web server, called Skylign, which provides a unified framework for creating logos for both sequence alignments and profile hidden Markov models. In addition to static image files, Skylign creates a novel interactive logo plot for inclusion in web pages. These interactive logos enable scrolling, zooming, and inspection of underlying values. Skylign can avoid sampling bias in sequence alignments by down-weighting redundant sequences and by combining observed counts with informed priors. It also simplifies the representation of gap parameters, and can optionally scale letter heights based on alternate calculations of the conservation of a position. Conclusion Skylign is available as a website, a scriptable web service with a RESTful interface, and as a software package for download. Skylign’s interactive logos are easily incorporated into a web page with just a few lines of HTML markup. Skylign may be found at http://skylign.org. PMID:24410852
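The stack-height rule the abstract describes (stack height reflects conservation; each letter's height reflects its frequency) is conventionally computed from a column's information content. A sketch of that conventional calculation, not Skylign's exact weighted-and-prior-adjusted version:

```python
import math

def stack_heights(column_freqs, alphabet_size=20):
    """Height of each letter in one sequence-logo column.

    The column's information content (maximum entropy minus observed
    entropy, in bits) gives the total stack height, which is shared
    among the letters in proportion to their frequencies.
    """
    entropy = -sum(f * math.log2(f) for f in column_freqs.values() if f > 0)
    info = math.log2(alphabet_size) - entropy
    return {letter: f * info for letter, f in column_freqs.items()}

# A perfectly conserved amino-acid column carries log2(20) ~ 4.32 bits.
h = stack_heights({"W": 1.0})
m = stack_heights({"A": 0.5, "V": 0.5})
```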
Multigraph: Reusable Interactive Data Graphs
NASA Astrophysics Data System (ADS)
Phillips, M. B.
2010-12-01
There are surprisingly few good software tools available for presenting time series data on the internet. The most common practice is to use a desktop program such as Excel or Matlab to save a graph as an image which can be included in a web page like any other image. This disconnects the graph from the data in a way that makes updating a graph with new data a cumbersome manual process, and it limits the user to one particular view of the data. The Multigraph project defines an XML format for describing interactive data graphs, and software tools for creating and rendering those graphs in web pages and other internet connected applications. Viewing a Multigraph graph is extremely simple and intuitive, and requires no instructions; the user can pan and zoom by clicking and dragging, in a familiar "Google Maps" kind of way. Creating a new graph for inclusion in a web page involves writing a simple XML configuration file. Multigraph can read data in a variety of formats, and can display data from a web service, allowing users to "surf" through large data sets, downloading only those parts of the data that are needed for display. The Multigraph XML format, or "MUGL" for short, provides a concise description of the visual properties of a graph, such as axes, plot styles, data sources, and labels, as well as interactivity properties such as how and whether the user can pan or zoom along each axis. Multigraph reads a file in this format, draws the described graph, and allows the user to interact with it. Multigraph software currently includes a Flash application for embedding graphs in web pages, a Flex component for embedding graphs in larger Flex/Flash applications, and a plugin for creating graphs in the WordPress content management system. Plans for the future include a Java version for desktop viewing and editing, a command line version for batch and server side rendering, and possibly Android and iPhone versions.
Multigraph is currently in use on several web sites including the US Drought Portal (www.drought.gov), the NOAA Climate Services Portal (www.climate.gov), the Climate Reference Network (www.ncdc.noaa.gov/crn), NCDC's State of the Climate Report (www.ncdc.noaa.gov/sotc), and the US Forest Service's Forest Change Assessment Viewer (ews.forestthreats.org/NPDE/NPDE.html). More information about Multigraph is available from the web site www.multigraph.org.
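A graph description in the spirit of the abstract's "simple XML configuration file" can be generated with a few lines of Python. The element and attribute names below are invented for illustration and are not the real MUGL schema:

```python
import xml.etree.ElementTree as ET

def simple_graph_xml(xmin, xmax, ymin, ymax, csv_url):
    """Build a minimal MUGL-style graph description: two axes, a CSV
    data source, and one plot tying the axes together."""
    graph = ET.Element("graph")
    ET.SubElement(graph, "horizontalaxis", id="x", min=str(xmin), max=str(xmax))
    ET.SubElement(graph, "verticalaxis", id="y", min=str(ymin), max=str(ymax))
    data = ET.SubElement(graph, "data")
    ET.SubElement(data, "csv", location=csv_url)
    ET.SubElement(graph, "plot", horizontalaxis="x", verticalaxis="y")
    return ET.tostring(graph, encoding="unicode")

xml_text = simple_graph_xml(0, 100, -5, 40, "https://example.org/temps.csv")
```

Keeping the data source as a URL in the configuration, rather than baking values into the file, is what lets the renderer fetch only the slice of data currently in view.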
The Faculty Web Page: Contrivance or Continuation?
ERIC Educational Resources Information Center
Lennex, Lesia
2007-01-01
In an age of Internet education, what does it mean for a tenure/tenure-track faculty member to have a web page? How many professors have web pages? If they have a page, what does it look like? Do they really need a web page at all? Many universities have faculty web pages. What do those collective pages look like? In what way do they represent the…
Creating a Mobile Library Website
ERIC Educational Resources Information Center
Cutshall, Tom C.; Blake, Lindsay; Bandy, Sandra L.
2011-01-01
The overwhelming results were iPhones and Android devices. Since the library wasn't equipped technologically to develop an in-house application platform and because we wanted the content to work across all mobile platforms, we decided to focus on creating a mobile web-based platform. From the NLM page of mobile sites we chose the basic PubMed/…
Developing an internet presence for your practice.
Maley, Catherine; Baum, Neil
2009-01-01
Yesterday, it was the Yellow Pages that informed the public where and how to reach their physicians. Today, it is the Internet. With the Internet, patients have 24/7 access to your practice that will do far more than any Yellow Pages or advertising could possibly do. This article discusses the importance of the Internet for the contemporary physician and how to create a useful and interactive Web site.
Fu, Linda Y; Zook, Kathleen; Spoehr-Labutta, Zachary; Hu, Pamela; Joseph, Jill G
2016-01-01
Online information can influence attitudes toward vaccination. The aim of the present study was to provide a systematic evaluation of the search engine ranking, quality, and content of Web pages that are critical versus noncritical of human papillomavirus (HPV) vaccination. We identified HPV vaccine-related Web pages with the Google search engine by entering 20 terms. We then assessed each Web page for critical versus noncritical bias and for the following quality indicators: authorship disclosure, source disclosure, attribution of at least one reference, currency, exclusion of testimonial accounts, and readability level less than ninth grade. We also determined Web page comprehensiveness in terms of mention of 14 HPV vaccine-relevant topics. Twenty searches yielded 116 unique Web pages. HPV vaccine-critical Web pages comprised roughly a third of the top-, top 5-, and top 10-ranking Web pages. The prevalence of HPV vaccine-critical Web pages was higher for queries that included term modifiers in addition to root terms. Compared with noncritical Web pages, those critical of HPV vaccine had a lower overall quality score (p < .01) and covered fewer important HPV-related topics (p < .001). Critical Web pages required viewers to have higher reading skills, were less likely to include an author byline, and were more likely to include testimonial accounts. They also were more likely to raise unsubstantiated concerns about vaccination. Web pages critical of HPV vaccine may be frequently returned and highly ranked by search engine queries despite being of lower quality and less comprehensive than noncritical Web pages. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
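A checklist evaluation like the one in this study reduces, at its simplest, to counting how many indicators a page satisfies. The sketch below does that for the six quality indicators named in the abstract; the indicator keys and the equal weighting are assumptions for illustration, not the study's actual scoring protocol.

```python
# The six quality indicators named in the abstract, as dictionary keys.
QUALITY_INDICATORS = [
    "authorship_disclosed",
    "sources_disclosed",
    "has_reference",
    "is_current",
    "no_testimonials",
    "readability_below_9th_grade",
]

def quality_score(page):
    """Count how many of the six indicators a page satisfies (0-6).

    `page` is a dict of indicator -> bool; missing keys count as unmet.
    Equal weighting is an assumption of this sketch.
    """
    return sum(1 for ind in QUALITY_INDICATORS if page.get(ind, False))

# A hypothetical vaccine-critical page meeting only one indicator.
critical_page = {"authorship_disclosed": False, "has_reference": True,
                 "no_testimonials": False, "readability_below_9th_grade": False}
```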
NASA Technical Reports Server (NTRS)
1997-01-01
IntelliWeb and IntelliPrint, products from MicroMass Communications, utilize the C Language Integrated Production System (CLIPS), a development and delivery expert systems tool developed at Johnson Space Center. IntelliWeb delivers personalized messages by dynamically creating single web pages or entire web sites based on information provided by each website visitor. IntelliPrint is a product designed to create tailored, individualized messages via printed media. The software uses proprietary technology to generate printed messages that are personally relevant and tailored to meet each individual's needs. IntelliPrint is in use in many operations, including Bristol-Myers Squibb's personalized newsletter, "Living at Your Best," geared to each recipient based on a health and lifestyle survey taken earlier; and SmithKline Beecham's "Nicorette Committed Quitters Program," in which customized motivational materials support participants in their attempts to quit smoking.
A multilingual assessment of melanoma information quality on the Internet.
Bari, Lilla; Kemeny, Lajos; Bari, Ferenc
2014-06-01
This study aims to assess and compare the quality of melanoma information available on the Internet in the Hungarian, Czech, and German languages. We used country-specific Google search engines to retrieve the first 25 uniform resource locators (URLs) returned by searching the word "melanoma" in the given language. Using the automated toolbar of the Health On the Net Foundation (HON), we assessed each Web site for HON certification based on the Health On the Net Foundation Code of Conduct (HONcode). Information quality was determined using a 35-point checklist created by Bichakjian et al. (J Clin Oncol 20:134-141, 2002), with the NCCN melanoma guideline as control. After excluding duplicate and link-only pages, a total of 24 Hungarian, 18 Czech, and 21 German melanoma Web sites were evaluated and rated. The proportion of HON-certified Web sites was highest among the German Web pages (19%). One of the retrieved Hungarian Web sites and none of the Czech Web sites were HON certified. We found the highest number of Web sites containing comprehensive, correct melanoma information in the German language, followed by the Czech and Hungarian pages. Although the majority of the Web sites lacked data about incidence, risk factors, prevention, treatment, work-up, and follow-up, at least one comprehensive, high-quality Web site was found in each language. Several Web sites in each language contained incorrect information. While a small number of comprehensive, high-quality melanoma-related Web sites was found, most of the retrieved Web content lacked basic disease information, such as risk factors, prevention, and treatment, and a significant number of Web sites contained misinformation. In the case of melanoma, primary and secondary prevention are of especially high importance; therefore, improving the quality of disease information available on the Internet is necessary.
Clefts of the lip and palate: is the Internet a trustworthy source of information for patients?
Karamitros, G A; Kitsos, N A
2018-04-02
Large numbers of patients use the Internet to obtain information and familiarize themselves with medical conditions. However, the quality of Internet-based information on clefts of the lip and palate has not yet been examined. The goal of this study was to assess the quality of Internet-based patient information on orofacial clefts. Websites were evaluated based on the modified Ensuring Quality Information for Patients (EQIP) instrument (36 items). Three hundred websites were identified using the most popular search engines. Of these, 146 were assessed after the exclusion of duplicates, irrelevant sites, and web pages in languages other than English. Thirty-four (23.2%) web pages, designed mostly by academic centres and hospitals, covered more than 22 items and were classified as high-score websites. The EQIP score achieved by websites ranged between 4 and 30, out of a total possible 36 points; the median score was 19 points. The top five high-scoring web pages are highlighted. The overall quality of Internet-based patient information on orofacial clefts is low. Moreover, the majority of web pages created by medical practitioners have a marketing perspective and, in order to attract more patients/customers, avoid mentioning the risks of the reconstructive procedures needed. Copyright © 2018 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Neri, E; Laghi, A; Regge, D; Sacco, P; Gallo, T; Turini, F; Talini, E; Ferrari, R; Mellaro, M; Rengo, M; Marchi, S; Caramella, D; Bartolozzi, C
2008-12-01
The aim of this paper is to describe the Web site of the Italian Project on CT Colonography (Research Project of High National Interest, PRIN No. 2005062137) and present the prototype of the online database. The Web site was created with Microsoft Office Publisher 2003 software, which allows the realisation of multiple Web pages linked through a main menu located on the home page. The Web site contains a database of computed tomography (CT) colonography studies in the Digital Imaging and Communications in Medicine (DICOM) standard, all acquired with multidetector-row CT according to the parameters defined by the European Society of Abdominal and Gastrointestinal Radiology (ESGAR). The cases present different bowel-cleansing and tagging methods, and each case has been anonymised and classified according to the Colonography Reporting and Data System (C-RADS). The Web site is available at www.ctcolonography.org and is composed of eight pages. Download times for a 294-Mbyte file were 33 min from a residential ADSL (6 Mbit/s) network, 200 s from a local university network (100 Mbit/s) and 2 h and 50 min from a remote academic site in the USA. The Web site received 256 accesses in the 22 days since it went online. The Web site is an immediate and up-to-date tool for publicising the activity of the research project and a valuable learning resource for CT colonography.
[An evaluation of the quality of health web pages using a validated questionnaire].
Conesa Fuentes, Maria del Carmen; Aguinaga Ontoso, Enrique; Hernández Morante, Juan José
2011-01-01
The objective of the present study was to evaluate the quality of general health information in Spanish-language web pages, together with the official web pages of the Regional Health Services of the different Autonomous Regions. It is a cross-sectional study. We used a previously validated questionnaire to study the present state of health information on the Internet from a lay user's point of view. By means of PageRank (Google®), we obtained a group of webs comprising a total of 65 health web pages. We applied some exclusion criteria and finally obtained a total of 36 webs. We also analyzed the official web pages of the different Health Services in Spain (19 webs), making a total of 54 health web pages. In the light of our data, we observed that the quality of the general health information web pages was generally rather low, especially regarding information quality. Not one page reached the maximum score (19 points). The mean score of the web pages was 9.8±2.8. In conclusion, to avoid the problems arising from this lack of quality, health professionals should design advertising campaigns and other media to teach the lay user how to evaluate information quality. Copyright © 2009 Elsevier España, S.L. All rights reserved.
Proteopedia: Exciting Advances in the 3D Encyclopedia of Biomolecular Structure
NASA Astrophysics Data System (ADS)
Prilusky, Jaime; Hodis, Eran; Sussman, Joel L.
Proteopedia is a collaborative, 3D web-encyclopedia of protein, nucleic acid and other structures. Proteopedia ( http://www.proteopedia.org ) presents 3D biomolecule structures in a broadly accessible manner to a diverse scientific audience through easy-to-use molecular visualization tools integrated into a wiki environment that anyone with a user account can edit. We describe recent advances in the web resource in the areas of content and software. In terms of content, we describe a large growth in user-added content as well as improvements in automatically-generated content for all PDB entry pages in the resource. In terms of software, we describe new features ranging from the capability to create pages hidden from public view to the capability to export pages for offline viewing. New software features also include an improved file-handling system and availability of biological assemblies of protein structures alongside their asymmetric units.
Enabling Scientists: Serving Sci-Tech Library Users with Disabilities.
ERIC Educational Resources Information Center
Coonin, Bryna
2001-01-01
Discusses how librarians in scientific and technical libraries can contribute to an accessible electronic library environment for users with disabilities to ensure independent access to information. Topics include relevant assistive technologies; creating accessible Web pages; monitoring accessibility of electronic databases; preparing accessible…
Bringing Control System User Interfaces to the Web
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Xihui; Kasemir, Kay
With the evolution of web-based technologies, especially HTML5 [1], it becomes possible to create web-based control system user interfaces (UI) that are cross-browser and cross-device compatible. This article describes two technologies that facilitate this goal. The first one is the WebOPI [2], which can seamlessly display CSS BOY [3] Operator Interfaces (OPI) in web browsers without modification to the original OPI file. The WebOPI leverages the powerful graphical editing capabilities of BOY and provides the convenience of re-using existing OPI files. On the other hand, it uses generic JavaScript and a generic communication mechanism between the web browser and web server. It is not optimized for a control system, which results in unnecessary network traffic and resource usage. Our second technology is the WebSocket-based Process Data Access (WebPDA) [4]. It is a protocol that provides efficient control system data communication using WebSocket [5], so that users can create web-based control system UIs using standard web page technologies such as HTML, CSS and JavaScript. WebPDA is control system independent, potentially supporting any type of control system.
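A protocol like WebPDA amounts to agreeing on message shapes that both the browser and the server understand. The sketch below encodes a subscription request and decodes a value update for a process variable (PV); the JSON field names ("type", "pv", "id", "value") are invented for this sketch and are not the actual WebPDA wire format.

```python
import json

def make_subscribe(pv_name, msg_id):
    """Encode a subscription request for a process variable.

    Field names are hypothetical; the real WebPDA protocol may differ.
    """
    return json.dumps({"type": "subscribe", "pv": pv_name, "id": msg_id})

def parse_update(raw):
    """Decode a value-update message and return (pv, value)."""
    msg = json.loads(raw)
    return msg["pv"], msg["value"]

# A server-pushed update as it might arrive over the WebSocket.
pv, value = parse_update('{"type": "update", "pv": "ring:current", "value": 102.5}')
```

In a browser client, `make_subscribe(...)` would be sent over an open WebSocket and `parse_update(...)` applied to each incoming frame; the point is that standard web technologies suffice once the message vocabulary is fixed.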
Narcissism and social networking Web sites.
Buffardi, Laura E; Campbell, W Keith
2008-10-01
The present research examined how narcissism is manifested on a social networking Web site (i.e., Facebook.com). Narcissistic personality self-reports were collected from social networking Web page owners. Then their Web pages were coded for both objective and subjective content features. Finally, strangers viewed the Web pages and rated their impression of the owner on agentic traits, communal traits, and narcissism. Narcissism predicted (a) higher levels of social activity in the online community and (b) more self-promoting content in several aspects of the social networking Web pages. Strangers who viewed the Web pages judged more narcissistic Web page owners to be more narcissistic. Finally, mediational analyses revealed several Web page content features that were influential in raters' narcissistic impressions of the owners, including quantity of social interaction, main photo self-promotion, and main photo attractiveness. Implications of the expression of narcissism in social networking communities are discussed.
2012-09-01
When creating version 1, it was necessary to enter raw Hypertext Markup Language (HTML) tags in text boxes using a third-party commercial software component. Authors create procedures using the Procedure Editor; users run procedures in an HTML web page, where each step presents instructions to the user using formatted text and graphics specified in HTML. (Figure 12: Authors create procedures using the Procedure Editor.)
ERIC Educational Resources Information Center
Luterbach, Kenneth J.; Rodriguez, Diane; Love, Lakecia
2012-01-01
This paper describes an instructional development effort to create effective and compelling instruction for eCommerce students. Results from a small field study inform the development project. Four high school students in an eCommerce course completed the standalone tutorial developed to teach them how to create a web page in the HyperText Markup…
ERIC Educational Resources Information Center
Maxymuk, John
This guide provides desktop publishing basics and instructions for specific library applications, enabling any librarian to function as the writer, editor, designer, proofreader, and printer of a variety of different publications. The guide is designed to help librarians create publications that are attractive, effective, useful, and easily read.…
The impact of visual layout factors on performance in Web pages: a cross-language study.
Parush, Avi; Shwarts, Yonit; Shtub, Avy; Chandra, M Jeya
2005-01-01
Visual layout has a strong impact on performance and is a critical factor in the design of graphical user interfaces (GUIs) and Web pages. Many design guidelines employed in Web page design were inherited from human performance literature and GUI design studies and practices. However, few studies have investigated the more specific patterns of performance with Web pages that may reflect some differences between Web page and GUI design. We investigated interactions among four visual layout factors in Web page design (quantity of links, alignment, grouping indications, and density) in two experiments: one with pages in Hebrew, entailing right-to-left reading, and the other with English pages, entailing left-to-right reading. Some performance patterns (measured by search times and eye movements) were similar between languages. Performance was particularly poor in pages with many links and variable densities, but it improved with the presence of uniform density. Alignment was not shown to be a performance-enhancing factor. The findings are discussed in terms of the similarities and differences in the impact of layout factors between GUIs and Web pages. Actual or potential applications of this research include specific guidelines for Web page design.
Tozzi, Alberto Eugenio; Buonuomo, Paola Sabrina; Ciofi degli Atti, Marta Luisa; Carloni, Emanuela; Meloni, Marco; Gamba, Fiorenza
2010-01-01
Information available on the Internet about immunizations may influence parents' perception of human papillomavirus (HPV) immunization and their attitude toward vaccinating their daughters. We hypothesized that the quality of information on HPV available on the Internet may vary with language and with the level of knowledge of parents. To this end we compared the quality of a sample of Web pages in Italian with a sample of Web pages in English. Five reviewers assessed the quality of Web pages retrieved with popular search engines using criteria adapted from the Good Information Practice Essential Criteria for Vaccine Safety Web Sites recommended by the World Health Organization. Quality of Web pages was assessed in the domains of accessibility, credibility, content, and design. Scores in these domains were compared through nonparametric statistical tests. We retrieved and reviewed 74 Web sites in Italian and 117 in English. The largest share of retrieved Web pages (33.5%) came from private agencies. Median scores were higher for Web pages in English compared with those in Italian in the domains of accessibility (p < .01), credibility (p < .01), and content (p < .01). The highest credibility and content scores were those of Web pages from governmental agencies or universities. Accessibility scores were positively associated with content scores (p < .01) and with credibility scores (p < .01). A total of 16.2% of Web pages in Italian opposed HPV immunization, compared with 6.0% of those in English (p < .05). Quality of information and the number of Web pages opposing HPV immunization may vary with the Web site language. High-quality Web pages on HPV, especially from public health agencies and universities, should be easily accessible and retrievable with common Web search engines. Copyright 2010 Society for Adolescent Medicine. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Lindsay, Lorin
Designing a web home page involves many decisions that affect how the page will look, the kind of technology required to use the page, the links the page will provide, and kinds of patrons who can use the page. The theme of information literacy needs to be built into every web page; users need to be taught the skills of sorting and applying…
FlaME: Flash Molecular Editor - a 2D structure input tool for the web.
Dallakian, Pavel; Haider, Norbert
2011-02-01
So far, there have been no Flash-based web tools available for chemical structure input. The authors herein present a feasibility study, aiming at the development of a compact and easy-to-use 2D structure editor, using Adobe's Flash technology and its programming language, ActionScript. As a reference model application from the Java world, we selected the Java Molecular Editor (JME). In this feasibility study, we made an attempt to realize a subset of JME's functionality in the Flash Molecular Editor (FlaME) utility. These basic capabilities are: structure input, editing and depiction of single molecules, data import and export in molfile format. The result of molecular diagram sketching in FlaME is accessible in V2000 molfile format. By integrating the molecular editor into a web page, its communication with the HTML elements on this page is established using the two JavaScript functions, getMol() and setMol(). In addition, structures can be copied to the system clipboard. A first attempt was made to create a compact single-file application for 2D molecular structure input/editing on the web, based on Flash technology. With the application examples presented in this article, it could be demonstrated that the Flash methods are principally well-suited to provide the requisite communication between the Flash object (application) and the HTML elements on a web page, using JavaScript functions.
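The V2000 molfile format that FlaME exports is a fixed-width text format: after three header lines, line 4 is the "counts" line, whose first three characters give the atom count and next three the bond count. The minimal reader below illustrates just that counts line; the example molfile header is a hypothetical fragment, not output from FlaME.

```python
def parse_counts_line(molfile_text):
    """Read the atom and bond counts from a V2000 molfile.

    Line 4 of a V2000 molfile is the counts line; atoms occupy the
    first three characters and bonds the next three (fixed-width
    fields, right-justified).
    """
    counts = molfile_text.splitlines()[3]
    return int(counts[0:3]), int(counts[3:6])

# A hypothetical molfile fragment: title, program line, comment line,
# then the counts line for a 3-atom, 2-bond sketch of ethanol's skeleton.
ethanol_header = "ethanol\n  sketch\n\n  3  2  0  0  0  0  0  0  0  0999 V2000\n"
```

A full reader would continue with the atom block (one line per atom) and bond block (one line per bond), which is what `getMol()` and `setMol()` shuttle between the Flash object and the surrounding HTML page.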
Knowledge-driven enhancements for task composition in bioinformatics.
Sutherland, Karen; McLeod, Kenneth; Ferguson, Gus; Burger, Albert
2009-10-01
A key application area of semantic technologies is the fast-developing field of bioinformatics. Sealife was a project within this field with the aim of creating semantics-based web browsing capabilities for the Life Sciences. This includes meaningfully linking significant terms from the text of a web page to executable web services. It also involves the semantic mark-up of biological terms, linking them to biomedical ontologies, then discovering and executing services based on terms that interest the user. A system was produced which allows a user to identify terms of interest on a web page and subsequently connects these to a choice of web services which can make use of these inputs. Elements of Artificial Intelligence Planning build on this to present a choice of higher level goals, which can then be broken down to construct a workflow. An Argumentation System was implemented to evaluate the results produced by three different gene expression databases. An evaluation of these modules was carried out on users from a variety of backgrounds. Users with little knowledge of web services were able to achieve tasks that used several services in much less time than they would have taken to do this manually. The Argumentation System was also considered a useful resource and feedback was collected on the best way to present results. Overall the system represents a move forward in helping users to both construct workflows and analyse results by incorporating specific domain knowledge into the software. It also provides a mechanism by which web pages can be linked to web services. However, this work covers a specific domain and much co-ordinated effort is needed to make all web services available for use in such a way, i.e. the integration of underlying knowledge is a difficult but essential task.
Optimizing Crawler4j using MapReduce Programming Model
NASA Astrophysics Data System (ADS)
Siddesh, G. M.; Suresh, Kavya; Madhuri, K. Y.; Nijagal, Madhushree; Rakshitha, B. R.; Srinivasa, K. G.
2017-06-01
The World Wide Web is a decentralized system consisting of a repository of information in the form of web pages. These web pages act as a source of information or data in the present analytics world. Web crawlers are used for extracting useful information from web pages for different purposes. Firstly, they are used in web search engines, where web pages are indexed to form a corpus of information that allows users to query the web pages. Secondly, they are used for web archiving, where web pages are stored for later analysis phases. Thirdly, they can be used for web mining, where web pages are monitored for copyright purposes. The amount of information processed by a web crawler needs to be improved by using the capabilities of modern parallel processing technologies. In order to address parallelism and the throughput of crawling, this work proposes to optimize Crawler4j using the Hadoop MapReduce programming model by parallelizing the processing of large input data. Crawler4j is a web crawler that retrieves useful information about the pages that it visits. Crawler4j coupled with the data and computational parallelism of the Hadoop MapReduce programming model improves the throughput and accuracy of web crawling. The experimental results demonstrate that the proposed solution achieves significant improvements in performance and throughput. Hence the proposed approach carves out a new methodology for optimizing web crawling by achieving significant performance gains.
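The MapReduce model the authors apply can be illustrated with a toy in-memory stand-in (not Hadoop): a map phase emits key-value pairs from each crawled page, and a reduce phase aggregates them per key. Here the pairs count incoming links per URL; the page records are hypothetical.

```python
from collections import defaultdict

def map_phase(page):
    """Map: emit (url, 1) for every outgoing link found on a page."""
    return [(url, 1) for url in page["links"]]

def reduce_phase(mapped):
    """Reduce: sum the counts emitted for each link target."""
    counts = defaultdict(int)
    for url, n in mapped:
        counts[url] += n
    return dict(counts)

# Two hypothetical crawled pages and their outgoing links.
pages = [
    {"url": "a.html", "links": ["b.html", "c.html"]},
    {"url": "b.html", "links": ["c.html"]},
]
inlinks = reduce_phase([pair for p in pages for pair in map_phase(p)])
```

In Hadoop the same two functions run on many machines at once, with the framework handling the shuffle between them; that distribution, not the logic, is where the throughput gain comes from.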
The rendering context for stereoscopic 3D web
NASA Astrophysics Data System (ADS)
Chen, Qinshui; Wang, Wenmin; Wang, Ronggang
2014-03-01
3D technologies on the Web have been studied for many years, but they are basically monoscopic 3D. With stereoscopic technology gradually maturing, we are researching how to integrate binocular 3D technology into the Web, creating a stereoscopic 3D browser that will provide users with a brand new experience of human-computer interaction. In this paper, we propose a novel approach to applying stereoscopic technologies to the CSS3 3D Transforms. Under our model, each element can create or participate in a stereoscopic 3D rendering context, in which 3D Transforms such as scaling, translation and rotation can be applied and perceived in a truly 3D space. We first discuss the underlying principles of stereoscopy. After that, we discuss how these principles can be applied to the Web. A stereoscopic 3D browser with backward compatibility was also created for demonstration purposes. We take advantage of the open-source WebKit project, integrating 3D display ability into the rendering engine of the web browser. For each 3D web page, our 3D browser creates two slightly different images, each representing the left-eye or right-eye view, to be combined on the 3D display to generate the illusion of depth. As the results show, elements can be manipulated in a truly 3D space.
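The "two slightly different images" come from horizontal parallax: a point with depth is projected to slightly different screen positions for the two eyes. The sketch below shows one common parallax formula; the function, its sign convention, and the parameter values are illustrative assumptions, not taken from the paper.

```python
def stereo_views(x, z, eye_separation=6.0, viewer_distance=60.0):
    """Project one scene point into left- and right-eye screen x positions.

    A point at depth z behind the screen plane is shifted horizontally
    by half the eye separation scaled by z / (z + viewer_distance).
    Units and sign convention are arbitrary choices for this sketch.
    """
    shift = (eye_separation / 2.0) * z / (z + viewer_distance)
    return x - shift, x + shift  # (left-eye x, right-eye x)

# A point 60 units behind the screen plane at screen x = 100.
left, right = stereo_views(x=100.0, z=60.0)
```

A point on the screen plane (z = 0) gets zero parallax and looks identical in both images, which is why flat, backward-compatible pages render unchanged in such a browser.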
Tools for Creating Mobile Applications for Extension
ERIC Educational Resources Information Center
Drill, Sabrina L.
2012-01-01
Considerations and tools for developing mobile applications for Extension include evaluating the topic, purpose, and audience. Different computing platforms may be used, and apps designed as modified Web pages or implicitly programmed for a particular platform. User privacy is another important consideration, especially for data collection apps.…
Multigraph: Interactive Data Graphs on the Web
NASA Astrophysics Data System (ADS)
Phillips, M. B.
2010-12-01
Many aspects of geophysical science involve time dependent data that is often presented in the form of a graph. Considering that the web has become a primary means of communication, there are surprisingly few good tools and techniques available for presenting time-series data on the web. The most common solution is to use a desktop tool such as Excel or Matlab to create a graph which is saved as an image and then included in a web page like any other image. This technique is straightforward, but it limits the user to one particular view of the data, and disconnects the graph from the data in a way that makes updating a graph with new data an often cumbersome manual process. This situation is somewhat analogous to the state of mapping before the advent of GIS. Maps existed only in printed form, and creating a map was a laborious process. In the last several years, however, the world of mapping has experienced a revolution in the form of web-based and other interactive computer technologies, so that it is now commonplace for anyone to easily browse through gigabytes of geographic data. Multigraph seeks to bring a similar ease of access to time series data. Multigraph is a program for displaying interactive time-series data graphs in web pages that includes a simple way of configuring the appearance of the graph and the data to be included. It allows multiple data sources to be combined into a single graph, and allows the user to explore the data interactively. Multigraph lets users explore and visualize "data space" in the same way that interactive mapping applications such as Google Maps facilitate exploring and visualizing geography. Viewing a Multigraph graph is extremely simple and intuitive, and requires no instructions. Creating a new graph for inclusion in a web page involves writing a simple XML configuration file and requires no programming. 
Multigraph can read data in a variety of formats, and can display data from a web service, allowing users to "surf" through large data sets, downloading only those parts of the data that are needed for display. Multigraph is currently in use on several web sites including the US Drought Portal (www.drought.gov), the NOAA Climate Services Portal (www.climate.gov), the Climate Reference Network (www.ncdc.noaa.gov/crn), NCDC's State of the Climate Report (www.ncdc.noaa.gov/sotc), and the US Forest Service's Forest Change Assessment Viewer (ews.forestthreats.org/NPDE/NPDE.html). More information about Multigraph is available from the web site www.multigraph.org.
Interactive Graph of Global Temperature Anomalies from ClimateWatch Magazine (http://www.climatewatch.noaa.gov/2009/articles/climate-change-global-temperature)
Required Discussion Web Pages in Psychology Courses and Student Outcomes
ERIC Educational Resources Information Center
Pettijohn, Terry F., II; Pettijohn, Terry F.
2007-01-01
We conducted 2 studies that investigated student outcomes when using discussion Web pages in psychology classes. In Study 1, we assigned 213 students enrolled in Introduction to Psychology courses to either a mandatory or an optional Web page discussion condition. Students used the discussion Web page significantly more often and performed…
Classifying Web Pages by Using Knowledge Bases for Entity Retrieval
NASA Astrophysics Data System (ADS)
Kiritani, Yusuke; Ma, Qiang; Yoshikawa, Masatoshi
In this paper, we propose a novel method to classify Web pages by using knowledge bases for entity search, a typical kind of Web search for information related to a person, location, or organization. First, we map a Web page to entities according to the similarities between the page and the entities. Various methods for computing such similarity are applied. For example, we can compute the similarity between a given page and a Wikipedia article describing a certain entity. The frequency with which an entity appears in the page is another factor used in computing the similarity. Second, we construct a directed acyclic graph, named the PEC graph, based on the relations among Web pages, entities, and categories, by referring to YAGO, a knowledge base built on Wikipedia and WordNet. Finally, by analyzing the PEC graph, we classify Web pages into categories. The results of some preliminary experiments validate the methods proposed in this paper.
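The page-entity-category (PEC) idea can be reduced to following two layers of edges: pages link to the entities they mention, and entities link to their knowledge-base categories. The sketch below propagates membership along those edges; it omits the similarity scores the paper uses to weight page-entity edges, and the example pages, entities, and categories are hypothetical.

```python
def classify_pages(page_entity, entity_category):
    """Follow page -> entity -> category edges of a small PEC-style graph.

    Each page is labeled with every category reachable through its
    entities. The paper additionally scores page-entity edges by
    similarity; this sketch only propagates membership.
    """
    result = {}
    for page, entities in page_entity.items():
        cats = set()
        for entity in entities:
            cats.update(entity_category.get(entity, []))
        result[page] = sorted(cats)
    return result

# Hypothetical edges: one page mentioning two YAGO-style entities.
page_entity = {"page1.html": ["Kyoto", "Kyoto University"]}
entity_category = {"Kyoto": ["City"], "Kyoto University": ["University"]}
labels = classify_pages(page_entity, entity_category)
```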
Going, going, still there: using the WebCite service to permanently archive cited Web pages.
Eysenbach, Gunther
2006-01-01
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page.
Technology Resources: Mathematics Accessibility for All Not Accommodation for Some
ERIC Educational Resources Information Center
Duranczyk, Irene M.
2009-01-01
When faculty and learning assistance staff create teaching documents and web pages envisioning the widest range of users, they can save time while achieving access for all. There are tools and techniques available to make mathematics visually, orally, and dynamically more accessible through multimodal presentation forms. Resources from Design…
ERIC Educational Resources Information Center
Gilstrap, Donald L.
1998-01-01
Explains how to build World Wide Web home pages using frames-based HTML so that librarians can manage Web-based information and improve their home pages. Provides descriptions and 15 examples for writing frames-HTML code, including advanced concepts and additional techniques for home-page design. (Author/LRW)
WebAlchemist: a Web transcoding system for mobile Web access in handheld devices
NASA Astrophysics Data System (ADS)
Whang, Yonghyun; Jung, Changwoo; Kim, Jihong; Chung, Sungkwon
2001-11-01
In this paper, we describe the design and implementation of WebAlchemist, a prototype web transcoding system, which automatically converts a given HTML page into a sequence of equivalent HTML pages that can be properly displayed on a handheld device. The WebAlchemist system is based on a set of HTML transcoding heuristics managed by the Transcoding Manager (TM) module. In order to tackle difficult-to-transcode pages such as ones with large or complex table structures, we have developed several new transcoding heuristics that extract partial semantics from syntactic information such as the table width, font size and cascading style sheet. Subjective evaluation results using popular HTML pages (such as the CNN home page) show that WebAlchemist generates readable, structure-preserving transcoded pages, which can be properly displayed on handheld devices.
Dynamic Web Pages: Performance Impact on Web Servers.
ERIC Educational Resources Information Center
Kothari, Bhupesh; Claypool, Mark
2001-01-01
Discussion of Web servers and requests for dynamic pages focuses on experimentally measuring and analyzing the performance of the three dynamic Web page generation technologies: CGI, FastCGI, and Servlets. Develops a multivariate linear regression model and predicts Web server performance under some typical dynamic requests. (Author/LRW)
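The abstract mentions a multivariate linear regression model for predicting server performance but gives neither its features nor its coefficients. As a generic illustration of the technique, the sketch below fits ordinary least squares via the normal equations in pure Python; the feature layout (an intercept column plus request-level features) is an assumption.

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y.
    X rows are feature vectors such as [1, requests_per_sec, payload_kb];
    the leading 1 provides the intercept."""
    n = len(X[0])
    # Build the normal-equation system X'X and X'y.
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coef = [0.0] * n
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef
```

Fitting such a model to measured response times for CGI, FastCGI, and Servlet workloads would yield per-technology coefficients that can then predict performance for new request mixes.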
Online nutrition information for pregnant women: a content analysis.
Storr, Tayla; Maher, Judith; Swanepoel, Elizabeth
2017-04-01
Pregnant women actively seek health information online, including nutrition and food-related topics. However, the accuracy and readability of this information have not been evaluated. The aim of this study was to describe and evaluate pregnancy-related food and nutrition information available online. Four search engines were used to search for pregnancy-related nutrition web pages. Content analysis of web pages was performed. Web pages were assessed against the 2013 Australian Dietary Guidelines to assess accuracy. Flesch-Kincaid (F-K), Simple Measure of Gobbledygook (SMOG), Gunning Fog Index (FOG) and Flesch reading ease (FRE) formulas were used to assess readability. Data were analysed descriptively. Spearman's correlation was used to assess the relationship between web page characteristics. Kruskal-Wallis test was used to check for differences among readability and other web page characteristics. A total of 693 web pages were included. Web page types included commercial (n = 340), not-for-profit (n = 113), blogs (n = 112), government (n = 89), personal (n = 36) and educational (n = 3). The accuracy of online nutrition information varied with 39.7% of web pages containing accurate information, 22.8% containing mixed information and 37.5% containing inaccurate information. The average reading grade of all pages analysed measured by F-K, SMOG and FOG was 11.8. The mean FRE was 51.6, a 'fairly difficult to read' score. Only 0.5% of web pages were written at or below grade 6 according to F-K, SMOG and FOG. The findings suggest that accuracy of pregnancy-related nutrition information is a problem on the internet. Web page readability is generally difficult and means that the information may not be accessible to those who cannot read at a sophisticated level. © 2016 John Wiley & Sons Ltd.
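The four readability instruments named in the abstract are standard published formulas over word, sentence, and syllable counts. A minimal sketch (the counting of syllables and "complex" words from raw text is left out, since that step varies by implementation):

```python
import math

def flesch_kincaid_grade(words, sentences, syllables):
    """F-K grade level: higher means harder to read."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def flesch_reading_ease(words, sentences, syllables):
    """FRE score 0-100: higher means easier; 50-60 is 'fairly difficult'."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def gunning_fog(words, sentences, complex_words):
    """FOG index; complex_words = words of three or more syllables."""
    return 0.4 * ((words / sentences) + 100 * (complex_words / words))

def smog(sentences, polysyllables):
    """SMOG grade from polysyllable count, normalized to 30 sentences."""
    return 1.0430 * math.sqrt(polysyllables * (30 / sentences)) + 3.1291
```

For example, a 100-word, 5-sentence passage with 160 syllables scores an F-K grade of about 11.1 and an FRE of about 51.2, close to the averages reported above.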
An Extraction Method of an Informative DOM Node from a Web Page by Using Layout Information
NASA Astrophysics Data System (ADS)
Tsuruta, Masanobu; Masuyama, Shigeru
We propose an informative DOM node extraction method from a Web page for preprocessing of Web content mining. Our proposed method LM uses layout data of DOM nodes generated by a generic Web browser, and the learning set consists of hundreds of Web pages and the annotations of informative DOM nodes of those Web pages. Our method does not require large scale crawling of the whole Web site to which the target Web page belongs. We design LM so that it uses the information of the learning set more efficiently in comparison to the existing method that uses the same learning set. In experiments, we evaluate combinations of an informative-DOM-node extraction method (either the proposed method LM or an existing method) with existing noise elimination methods: Heur, which removes advertisements and link lists using heuristics, and CE, which removes DOM nodes that also appear in other pages of the Web site to which the target page belongs. Experimental results show that 1) LM outperforms other methods for extracting the informative DOM node, 2) the combination method (LM, {CE(10), Heur}) based on LM (precision: 0.755, recall: 0.826, F-measure: 0.746) outperforms other combination methods.
Frank, M S; Dreyer, K
2001-06-01
We describe a working software technology that enables educators to incorporate their expertise and teaching style into highly interactive and Socratic educational material for distribution on the world wide web. A graphically oriented interactive authoring system was developed to enable the computer novice to create and store within a database his or her domain expertise in the form of electronic knowledge. The authoring system supports and facilitates the input and integration of several types of content, including free-form, stylized text, miniature and full-sized images, audio, and interactive questions with immediate feedback. The system enables the choreography and sequencing of these entities for display within a web page as well as the sequencing of entire web pages within a case-based or thematic presentation. Images or segments of text can be hyperlinked with point-and-click to other entities such as adjunctive web pages, audio, or other images, cases, or electronic chapters. Miniature (thumbnail) images are automatically linked to their full-sized counterparts. The authoring system contains a graphically oriented word processor, an image editor, and capabilities to automatically invoke and use external image-editing software such as Photoshop. The system works in both local area network (LAN) and internet-centric environments. An internal metalanguage (invisible to the author but stored with the content) was invented to represent the choreographic directives that specify the interactive delivery of the content on the world wide web. A database schema was developed to objectify and store both this electronic knowledge and its associated choreographic metalanguage. 
A database engine was combined with page-rendering algorithms in order to retrieve content from the database and deliver it on the web in a Socratic style, assess the recipient's current fund of knowledge, and provide immediate feedback, thus stimulating in-person interaction with a human expert. This technology enables the educator to choreograph a stylized, interactive delivery of his or her message using multimedia components assembled in virtually any order, spanning any number of web pages for a given case or theme. An educator can thus exercise precise influence on specific learning objectives, embody his or her personal teaching style within the content, and ultimately enhance its educational impact. The described technology amplifies the efforts of the educator and provides a more dynamic and enriching learning environment for web-based education.
FlaME: Flash Molecular Editor - a 2D structure input tool for the web
2011-01-01
Background So far, there have been no Flash-based web tools available for chemical structure input. The authors herein present a feasibility study, aiming at the development of a compact and easy-to-use 2D structure editor, using Adobe's Flash technology and its programming language, ActionScript. As a reference model application from the Java world, we selected the Java Molecular Editor (JME). In this feasibility study, we made an attempt to realize a subset of JME's functionality in the Flash Molecular Editor (FlaME) utility. These basic capabilities are: structure input, editing and depiction of single molecules, data import and export in molfile format. Implementation The result of molecular diagram sketching in FlaME is accessible in V2000 molfile format. By integrating the molecular editor into a web page, its communication with the HTML elements on this page is established using the two JavaScript functions, getMol() and setMol(). In addition, structures can be copied to the system clipboard. Conclusion A first attempt was made to create a compact single-file application for 2D molecular structure input/editing on the web, based on Flash technology. With the application examples presented in this article, it could be demonstrated that the Flash methods are principally well-suited to provide the requisite communication between the Flash object (application) and the HTML elements on a web page, using JavaScript functions. PMID:21284863
Aladin Lite: Embed your Sky in the Browser
NASA Astrophysics Data System (ADS)
Boch, T.; Fernique, P.
2014-05-01
I will introduce and describe Aladin Lite, a lightweight interactive sky viewer running natively in the browser. The past five years have seen the emergence of powerful and complex web applications, thanks to major improvements in JavaScript engines and the advent of HTML5. At the same time, browser plugins (Java applets, Flash, Silverlight) that were commonly used to run rich Internet applications are declining and are not well suited for mobile devices. The Aladin team took this opportunity to develop Aladin Lite, a lightweight version of Aladin geared towards simple visualization of a sky region. Relying on the widely supported HTML5 canvas element, it provides an intuitive user interface running on desktops and tablets. This first version allows one to interactively visualize multi-resolution HEALPix images and superimpose tabular data and footprints. Aladin Lite is easily embeddable on any web page and may be of interest for data providers which will be able to use it as an interactive previewer for their own image surveys, previously pre-processed as explained in details in the poster "Create & publish your Hierarchical Progressive Survey". I will present the main features of Aladin Lite as well as the JavaScript API which gives the building blocks to create rich interactions between a web page and Aladin Lite.
2013-03-01
Working with communication experts and graphic designers, the project created 4 pilot web intervention pages. This included 2 pilot versions of the Home Page and a section landing…
A DICOM Based Collaborative Platform for Real-Time Medical Teleconsultation on Medical Images.
Maglogiannis, Ilias; Andrikos, Christos; Rassias, Georgios; Tsanakas, Panayiotis
2017-01-01
The paper deals with the design of a Web-based platform for real-time medical teleconsultation on medical images. The proposed platform combines the principles of heterogeneous Workflow Management Systems (WfMSs), the peer-to-peer networking architecture and the SPA (Single-Page Application) concept, to facilitate medical collaboration among geographically distributed healthcare professionals. The presented work leverages state-of-the-art features of the web to support peer-to-peer communication using the WebRTC (Web Real Time Communication) protocol and client-side data processing for creating an integrated collaboration environment. The paper discusses the technical details of implementation and presents the operation of the platform in practice along with some initial results.
Technical development of PubMed interact: an improved interface for MEDLINE/PubMed searches.
Muin, Michael; Fontelo, Paul
2006-11-03
The project aims to create an alternative search interface for MEDLINE/PubMed that may provide assistance to the novice user and added convenience to the advanced user. An earlier version of the project was the 'Slider Interface for MEDLINE/PubMed searches' (SLIM) which provided JavaScript slider bars to control search parameters. In this new version, recent developments in Web-based technologies were implemented. These changes may prove to be even more valuable in enhancing user interactivity through client-side manipulation and management of results. PubMed Interact is a Web-based MEDLINE/PubMed search application built with HTML, JavaScript and PHP. It is implemented on a Windows Server 2003 with Apache 2.0.52, PHP 4.4.1 and MySQL 4.1.18. PHP scripts provide the backend engine that connects with E-Utilities and parses XML files. JavaScript manages client-side functionalities and converts Web pages into interactive platforms using dynamic HTML (DHTML), Document Object Model (DOM) tree manipulation and Ajax methods. With PubMed Interact, users can limit searches with JavaScript slider bars, preview result counts, delete citations from the list, display and add related articles and create relevance lists. Many interactive features occur at client-side, which allow instant feedback without reloading or refreshing the page resulting in a more efficient user experience. PubMed Interact is a highly interactive Web-based search application for MEDLINE/PubMed that explores recent trends in Web technologies like DOM tree manipulation and Ajax. It may become a valuable technical development for online medical search applications.
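The backend described above parses XML returned by NCBI's E-Utilities. A minimal offline sketch of that parsing step in Python's standard library is shown below; the sample document mimics the shape of an ESearch response, and the PMIDs in it are illustrative placeholders.

```python
import xml.etree.ElementTree as ET

# Hypothetical sample mimicking an E-Utilities ESearch response.
SAMPLE = """<eSearchResult>
  <Count>2</Count>
  <IdList><Id>16895598</Id><Id>17238435</Id></IdList>
</eSearchResult>"""

def parse_esearch(xml_text):
    """Extract the total hit count and the list of PMIDs
    from an ESearch-style XML document."""
    root = ET.fromstring(xml_text)
    count = int(root.findtext("Count"))
    ids = [e.text for e in root.findall(".//Id")]
    return count, ids
```

In a live system, the XML would be fetched from the E-Utilities endpoint and the extracted PMIDs passed on to the client-side result list.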
ERIC Educational Resources Information Center
Lally, Carolyn
1998-01-01
Provides a background to the development of the Internet; discusses Web sites as foreign-language-learning tools; and describes the Nicenet Internet Classroom Assistant that can be used as a software template for teachers to create their own Internet pages for foreign-language instruction. (Author/LRW)
Internet Resources: Using Web Pages in Social Studies.
ERIC Educational Resources Information Center
Dale, Jack
1999-01-01
Contends that students in social studies classes can utilize Hypertext Markup Language (HTML) as a presentation and collaborative tool by developing websites. Presents two activities where students submitted webpages for country case studies and created a timeline for the French Revolution. Describes how to use HTML by discussing the various tags.…
Heteronarrative Analysis: Examining Online Photographic Narratives
ERIC Educational Resources Information Center
Kaufmann, Jodi Jan
2011-01-01
Millions of young people are using personal web pages and social networking sites to "deliberately create an identity to be presented to others". One of the primary means of presenting oneself on these sites is through a collection of photographs. Photographic narratives can be critically analyzed for the gender and sexual stories they tell.…
Facilitating Student Experimentation with Statistical Concepts.
ERIC Educational Resources Information Center
Smith, Patricia K.
2002-01-01
Offers a Web page with seven Java applets allowing students to experiment with key concepts in an introductory statistics course. Indicates the applets can be used in three ways: to place links to the applets, to create in-class demonstrations of statistical concepts, and to lead students through experiments and discover statistical relationships.…
School Librarians: Vital Educational Leaders
ERIC Educational Resources Information Center
Martineau, Pamela
2010-01-01
In the new millennium, school librarians are more likely to be found sitting behind a computer as they update the library web page or create a wiki on genetically modified organisms. Or they might be seen in the library computer lab as they lead students through tutorials on annotated bibliographies or Google docs. If adequately supported, school…
Recognition of pornographic web pages by classifying texts and images.
Hu, Weiming; Wu, Ou; Chen, Zhouyao; Fu, Zhouyu; Maybank, Steve
2007-06-01
With the rapid development of the World Wide Web, people benefit more and more from the sharing of information. However, Web pages with obscene, harmful, or illegal content can be easily accessed. It is important to recognize such unsuitable, offensive, or pornographic Web pages. In this paper, a novel framework for recognizing pornographic Web pages is described. A C4.5 decision tree is used to divide Web pages, according to content representations, into continuous text pages, discrete text pages, and image pages. These three categories of Web pages are handled, respectively, by a continuous text classifier, a discrete text classifier, and an algorithm that fuses the results from the image classifier and the discrete text classifier. In the continuous text classifier, statistical and semantic features are used to recognize pornographic texts. In the discrete text classifier, the naive Bayes rule is used to calculate the probability that a discrete text is pornographic. In the image classifier, the object's contour-based features are extracted to recognize pornographic images. In the text and image fusion algorithm, the Bayes theory is used to combine the recognition results from images and texts. Experimental results demonstrate that the continuous text classifier outperforms the traditional keyword-statistics-based classifier, the contour-based image classifier outperforms the traditional skin-region-based image classifier, the results obtained by our fusion algorithm outperform those by either of the individual classifiers, and our framework can be adapted to different categories of Web pages.
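The discrete text classifier above applies the naive Bayes rule. The sketch below is a generic multinomial naive Bayes with Laplace smoothing, shown here with neutral spam/ham labels; the paper's actual features and training corpus are not reproduced.

```python
import math
from collections import Counter

def train_nb(docs, labels):
    """Multinomial naive Bayes. docs: list of token lists;
    labels: parallel list of class names."""
    classes = set(labels)
    vocab = {t for d in docs for t in d}
    prior, counts, totals = {}, {}, {}
    for c in classes:
        cdocs = [d for d, l in zip(docs, labels) if l == c]
        prior[c] = math.log(len(cdocs) / len(docs))
        counts[c] = Counter(t for d in cdocs for t in d)
        totals[c] = sum(counts[c].values())
    return prior, counts, totals, vocab

def classify_nb(model, doc):
    """Return the class maximizing log P(c) + sum log P(t|c),
    with add-one (Laplace) smoothing of term probabilities."""
    prior, counts, totals, vocab = model
    def score(c):
        return prior[c] + sum(
            math.log((counts[c][t] + 1) / (totals[c] + len(vocab)))
            for t in doc)
    return max(prior, key=score)
```

The same probabilistic scores can then be fused with an image classifier's output, as the framework above does via Bayes' theorem.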
Formal Features of Cyberspace: Relationships between Web Page Complexity and Site Traffic.
ERIC Educational Resources Information Center
Bucy, Erik P.; Lang, Annie; Potter, Robert F.; Grabe, Maria Elizabeth
1999-01-01
Examines differences between the formal features of commercial versus noncommercial Web sites, and the relationship between Web page complexity and amount of traffic a site receives. Findings indicate that, although most pages in this stage of the Web's development remain technologically simple and noninteractive, there are significant…
Lin, Jimmy
2008-01-01
Background Graph analysis algorithms such as PageRank and HITS have been successful in Web environments because they are able to extract important inter-document relationships from manually-created hyperlinks. We consider the application of these techniques to biomedical text retrieval. In the current PubMed® search interface, a MEDLINE® citation is connected to a number of related citations, which are in turn connected to other citations. Thus, a MEDLINE record represents a node in a vast content-similarity network. This article explores the hypothesis that these networks can be exploited for text retrieval, in the same manner as hyperlink graphs on the Web. Results We conducted a number of reranking experiments using the TREC 2005 genomics track test collection in which scores extracted from PageRank and HITS analysis were combined with scores returned by an off-the-shelf retrieval engine. Experiments demonstrate that incorporating PageRank scores yields significant improvements in terms of standard ranked-retrieval metrics. Conclusion The link structure of content-similarity networks can be exploited to improve the effectiveness of information retrieval systems. These results generalize the applicability of graph analysis algorithms to text retrieval in the biomedical domain. PMID:18538027
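The abstract says PageRank scores were "combined with" retrieval-engine scores but does not spell out the combination. A common approach, sketched here as an assumption rather than the paper's exact formula, is linear interpolation of min-max-normalized scores:

```python
def normalize(scores):
    """Min-max normalize a dict of doc -> score into [0, 1]."""
    lo, hi = min(scores.values()), max(scores.values())
    span = (hi - lo) or 1.0  # avoid division by zero when all equal
    return {k: (v - lo) / span for k, v in scores.items()}

def rerank(retrieval, pagerank, alpha=0.8):
    """Rerank docs by alpha * retrieval + (1 - alpha) * PageRank;
    docs missing a PageRank score contribute zero for that term."""
    r, p = normalize(retrieval), normalize(pagerank)
    combined = {d: alpha * r[d] + (1 - alpha) * p.get(d, 0.0) for d in r}
    return sorted(combined, key=combined.get, reverse=True)
```

The interpolation weight alpha would be tuned on a held-out portion of the test collection.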
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-13
... sites, social media pages, and any comparable Internet presence, and on Web sites, social media pages... prescribed by FINRA, on their Web sites, social media pages, and any comparable Internet presence, and on Web sites, social media pages, and any comparable Internet presence relating to a member's investment...
Visual Design Principles Applied To World Wide Web Construction.
ERIC Educational Resources Information Center
Luck, Donald D.; Hunter, J. Mark
This paper describes basic types of World Wide Web pages and presents design criteria for page layout based on principles of visual literacy. Discussion focuses on pages that present information in the following styles: billboard; directory/index; textual; and graphics. Problems and solutions in Web page construction are explored according to…
An efficient scheme for automatic web pages categorization using the support vector machine
NASA Astrophysics Data System (ADS)
Bhalla, Vinod Kumar; Kumar, Neeraj
2016-07-01
In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages from the Internet within a fraction of a second. To achieve this goal, an efficient categorization of web page contents is required. Manual categorization of these billions of web pages to achieve high accuracy is a challenging task. Most of the existing techniques reported in the literature are semi-automatic, and a higher level of accuracy cannot be achieved using them. To achieve these goals, this paper proposes automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific and relevant features of the web pages. In the proposed scheme, extraction and evaluation of features are done first, followed by filtering of the feature set for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page is developed in the proposed scheme. Feature extraction and weight assignment are based on a collection of domain-specific keywords developed by considering various domain pages. Moreover, the keyword list is reduced on the basis of the ids of keywords in the keyword list. Also, stemming of keywords and tag text is done to achieve higher accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using a machine learning method in combination with feature extraction and statistical analysis, using a support vector machine kernel as the classification tool. The results obtained confirm the effectiveness of the proposed scheme in terms of its accuracy in different categories of web pages.
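The feature-extraction step above weights domain-specific keywords found in a page's HTML. A minimal stdlib sketch of that idea is below; the keyword list and its weights are hypothetical, and the paper's full pipeline (stemming, id-based list reduction, SVM training) is not reproduced.

```python
from html.parser import HTMLParser

# Hypothetical domain keyword list with weights (not from the paper).
KEYWORDS = {"loan": 2.0, "credit": 2.0, "interest": 1.0}

class TextExtractor(HTMLParser):
    """Collect visible text tokens, skipping script/style content."""
    def __init__(self):
        super().__init__()
        self.words = []
        self._skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.words += data.lower().split()

def keyword_score(html, keywords=KEYWORDS):
    """Sum the weights of domain keywords appearing in the page text."""
    p = TextExtractor()
    p.feed(html)
    return sum(keywords.get(w.strip(".,!?"), 0.0) for w in p.words)
```

In a full system, per-domain scores like this would become entries in the feature vector handed to the SVM classifier.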
Illness and Internet empowerment: writing and reading breast cancer in cyberspace.
Pitts, Victoria
2004-01-01
The Internet is now a site where women with breast cancer both read and write about the illness, and in doing so negotiate identity and definitions of situation in disembodied space. Cyberspace has been imagined as a liberatory realm where women can transgress gender roles, invent selves and create new forms of knowledge. This study explores the personal web pages of women with breast cancer with an interest in exploring the issue of 'cyber-agency' or empowerment in cyberspace. I suggest here that women's web pages might offer potentially critical opportunities for women's knowledge-making in relation to what are often highly political aspects of the body, gender and illness. However, the Internet is not an inherently empowering technology, and it can be a medium for affirming norms of femininity, consumerism, individualism and other powerful social messages.
Finding Specification Pages from the Web
NASA Astrophysics Data System (ADS)
Yoshinaga, Naoki; Torisawa, Kentaro
This paper presents a method of finding a specification page on the Web for a given object (e.g., "Ch. d'Yquem") and its class label (e.g., "wine"). A specification page for an object is a Web page which gives concise attribute-value information about the object (e.g., "county"-"Sauternes") in well-formatted structures. A simple unsupervised method using layout and symbolic decoration cues was applied to a large number of Web pages to acquire candidate attributes for each class (e.g., "county" for a class "wine"). We then filter out irrelevant words from the putative attributes through an author-aware scoring function that we called site frequency. We used the acquired attributes to select a representative specification page for a given object from the Web pages retrieved by a normal search engine. Experimental results revealed that our system greatly outperformed the normal search engine in terms of this specification retrieval.
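The "site frequency" idea above favors candidate attributes that many independent sites mention, discounting terms pushed by a single author. The exact scoring function is not given in the abstract; a simplified sketch that counts distinct hostnames per attribute is:

```python
from collections import defaultdict
from urllib.parse import urlparse

def site_frequency(occurrences):
    """occurrences: iterable of (attribute, page_url) pairs.
    Returns attribute -> number of distinct sites (hostnames)
    on which the attribute was observed."""
    sites = defaultdict(set)
    for attr, url in occurrences:
        sites[attr].add(urlparse(url).hostname)
    return {a: len(s) for a, s in sites.items()}
```

Attributes with a site frequency below some threshold would then be filtered out of the candidate list.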
ARL Physics Web Pages: An Evaluation by Established, Transitional and Emerging Benchmarks.
ERIC Educational Resources Information Center
Duffy, Jane C.
2002-01-01
Provides an overview of characteristics among Association of Research Libraries (ARL) physics Web pages. Examines current academic Web literature and from that develops six benchmarks to measure physics Web pages: ease of navigation; logic of presentation; representation of all forms of information; engagement of the discipline; interactivity of…
Assessment and revision of clinical pharmacy practice internet web sites.
Edwards, Krystal L; Salvo, Marissa C; Ward, Kristina E; Attridge, Russell T; Kiser, Katie; Pinner, Nathan A; Gallegos, Patrick J; Kesteloot, Lori Lynn; Hylton, Ann; Bookstaver, P Brandon
2014-02-01
Health care professionals, trainees, and patients use the Internet extensively. Editable Web sites may contain inaccurate, incomplete, and/or outdated information that may mislead the public's perception of the topic. To evaluate the editable, online descriptions of clinical pharmacy and pharmacist and attempt to improve their accuracy. The authors identified key areas within clinical pharmacy to evaluate for accuracy and appropriateness on the Internet. Current descriptions that were reviewed on public domain Web sites included: (1) clinical pharmacy and the clinical pharmacist, (2) pharmacy education, (3) clinical pharmacy and development and provision for reimbursement, (4) clinical pharmacists and advanced specialty certifications/training opportunities, (5) pharmacists and advocacy, and (6) clinical pharmacists and interdisciplinary/interprofessional content. The authors assessed each content area to determine accuracy and prioritized the need for updating, when applicable, to achieve consistency in descriptions and relevancy. The authors found that Wikipedia, a public domain that allows users to update, was consistently the most common Web site produced in search results. The authors' evaluation resulted in the creation or revision of 14 Wikipedia Web pages. However, rejection of 3 proposed newly created Web pages affected the authors' ability to address identified content areas with deficiencies and/or inaccuracies. Through assessing and updating editable Web sites, the authors strengthened the online representation of clinical pharmacy in a clear, cohesive, and accurate manner. However, ongoing assessments of the Internet are continually needed to ensure accuracy and appropriateness.
Web accessibility support for visually impaired users using link content analysis.
Iwata, Hajime; Kobayashi, Naofumi; Tachibana, Kenji; Shirogane, Junko; Fukazawa, Yoshiaki
2013-12-01
Web pages are used for a variety of purposes. End users must understand dynamically changing content and sequentially follow page links to find desired material, requiring significant time and effort. However, for visually impaired users using screen readers, it can be difficult to find links to web pages when link text and alternative text descriptions are inappropriate. Our method supports the discovery of content by analyzing 8 categories of link types, and allows visually impaired users to be aware of the content represented by links in advance. This facilitates end users' access to necessary information on web pages. Our method of classifying web page links is therefore effective as a means of evaluating accessibility.
Googling endometriosis: a systematic review of information available on the Internet.
Hirsch, Martin; Aggarwal, Shivani; Barker, Claire; Davis, Colin J; Duffy, James M N
2017-05-01
The demand for health information online is increasing rapidly without clear governance. We aim to evaluate the credibility, quality, readability, and accuracy of online patient information concerning endometriosis. We searched 5 popular Internet search engines: aol.com, ask.com, bing.com, google.com, and yahoo.com. We developed a search strategy in consultation with patients with endometriosis, to identify relevant World Wide Web pages. Pages containing information related to endometriosis for women with endometriosis or the public were eligible. Two independent authors screened the search results. World Wide Web pages were evaluated using validated instruments across 3 of the 4 following domains: (1) credibility (White Paper instrument; range 0-10); (2) quality (DISCERN instrument; range 0-85); (3) readability (Flesch-Kincaid instrument; range 0-100); and (4) accuracy (assessed using prioritized criteria developed in consultation with health care professionals, researchers, and women with endometriosis based on the European Society of Human Reproduction and Embryology guidelines [range 0-30]). We summarized these data in diagrams, tables, and narrative form. We identified 750 World Wide Web pages, of which 54 were included. Over a third of Web pages did not attribute authorship and almost half the included pages did not report the sources of information or academic references. No World Wide Web page provided information assessed as being written in plain English. A minority of web pages were assessed as high quality. A single World Wide Web page provided accurate information: evidentlycochrane.net. Available information was, in general, skewed toward the diagnosis of endometriosis. There were 16 credible World Wide Web pages; however, the content limitations were infrequently discussed. No World Wide Web page scored highly across all 4 domains.
In the unlikely event that a World Wide Web page reports high-quality, accurate, and credible health information it is typically challenging for a lay audience to comprehend. Health care professionals, and the wider community, should inform women with endometriosis of the risk of outdated, inaccurate, or even dangerous information online. The implementation of an information standard will incentivize providers of online information to establish and adhere to codes of conduct. Copyright © 2016 Elsevier Inc. All rights reserved.
Analysis of Web Spam for Non-English Content: Toward More Effective Language-Based Classifiers
Alsaleh, Mansour; Alarifi, Abdulrahman
2016-01-01
Web spammers aim to obtain higher ranks for their web pages by including spam content that deceives search engines into including their pages in search results even when they are not related to the search terms. Search engines continue to develop new web spam detection mechanisms, but spammers also keep improving their tools to evade detection. In this study, we first explore the effect of the page language on spam detection features and demonstrate how the best set of detection features varies with the page language. We also study the performance of Google Penguin, a recently developed anti-web-spam technique for Google's search engine. Using spam pages in Arabic as a case study, we show that, unlike for similar English pages, Google's anti-spam techniques are ineffective against a high proportion of Arabic spam pages. We then explore multiple detection features for spam pages to identify a set of features that yields high detection accuracy compared with the integrated Google Penguin technique. To build and evaluate our classifier, and to help researchers conduct consistent measurement studies, we collected and manually labeled a corpus of Arabic web pages, including both benign and spam pages. Furthermore, we developed a browser plug-in that uses our classifier to warn users about spam pages after they click on a URL and to filter spam out of search engine results. Using Google Penguin as a benchmark, we provide an illustrative example showing that language-based web spam classifiers are more effective at capturing spam content. PMID:27855179
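One family of content features such classifiers use can be sketched concisely. The feature below (the density of the single most repeated term, a classic keyword-stuffing signal) is illustrative only, not the paper's actual feature set; note that the whitespace tokenizer is a simplification, and per-language tokenization is exactly where, as the study argues, English-tuned detectors fall short on Arabic text.

```typescript
// Illustrative spam-content feature: the share of a page's words taken up
// by its single most repeated term. Keyword-stuffed pages score far higher
// than ordinary prose. Whitespace tokenization is a simplification; a
// language-aware classifier would tokenize per script.
function topKeywordDensity(text: string): number {
  const words = text.toLowerCase().split(/\s+/).filter(w => w.length > 0);
  if (words.length === 0) return 0;
  const counts = new Map<string, number>();
  let top = 0;
  for (const w of words) {
    const c = (counts.get(w) ?? 0) + 1;
    counts.set(w, c);
    if (c > top) top = c;
  }
  return top / words.length;
}

// A stuffed page versus a normal sentence (the 0.2 threshold is hypothetical):
const stuffed = topKeywordDensity("cheap pills cheap pills buy cheap pills now");
const normal = topKeywordDensity("we evaluate detection features for web spam pages");
console.log(stuffed > 0.2 && normal < 0.2); // true
```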
The Fountain of Stem Cell-Based Youth? Online Portrayals of Anti-Aging Stem Cell Technologies.
Rachul, Christen M; Percec, Ivona; Caulfield, Timothy
2015-08-01
The hype surrounding stem cell science has created a market opportunity for the cosmetic industry. Cosmetic and anti-aging products and treatments that make claims regarding stem cell technology are increasingly popular, despite a lack of evidence for safety and efficacy of such products. This study explores how stem cell-based products and services are portrayed to the public through online sources, in order to gain insight into the key messages available to consumers. A content analysis of 100 web pages was conducted to examine the portrayals of stem cell-based cosmetic and anti-aging products and treatments. A qualitative discourse analysis of one web page further examined how language contributes to the portrayals of these products and treatments to public audiences. The majority of web pages portrayed stem cell-based products as ready for public use. Very few web pages substantiated claims with scientific evidence, and even fewer mentioned any risks or limitations associated with stem cell science. The discourse analysis revealed that the framing and use of metaphor obscures the certainty of the efficacy of and length of time for stem cell-based anti-aging technology to be publicly available. This study highlights the need to educate patients and the public on the current limits of stem cell applications in this context. In addition, generating scientific evidence for stem cell-based anti-aging and aesthetic applications is needed for optimizing benefits and minimizing adverse effects for the public. Having more evidence on efficacy and risks will help to protect patients who are eagerly seeking out these treatments. © 2015 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com.
Blogging as Public Pedagogy: Creating Alternative Educational Futures
ERIC Educational Resources Information Center
Dennis, Carol Azumah
2015-01-01
In this study, I explore "blogging", the use of a regularly updated website or web page, authored and curated by an individual or small group, written in a conversational style, as a form of public pedagogy. I analyse blogs as pre-figurative spaces where people go to learn with/in a public sphere, through collaboration with interested…
The Cyber Sisters Club: Using the Internet To Bridge the Technology Gap with Inner City Girls.
ERIC Educational Resources Information Center
Lichtman, Judy
1998-01-01
Describes a program developed at Penn State Berks-Lehigh Valley College (PA) for inner-city minority girls to use computer technology that was otherwise unavailable to them. Highlights include access issues, gender issues, girls' preferences in a learning environment, making technology relevant, introducing new skills, creating a Web page, and…
The Legacy of the Baroque in Virtual Representations of Library Space
ERIC Educational Resources Information Center
Garrett, Jeffrey
2004-01-01
Library home pages and digital library sites have many properties and purposes in common with the Baroque wall-system libraries of seventeenth- and eighteenth-century Europe. Like their Baroque antecedents, contemporary library Web sites exploit the moment of entrance and the experience of the threshold to create and sustain the illusion of a…
Casting the Net: The Development of a Resource Collection for an Internet Database.
ERIC Educational Resources Information Center
McKiernan, Gerry
CyberStacks(sm), a demonstration prototype World Wide Web information service, was established on the home page server at Iowa State University with the intent of facilitating identification and use of significant Internet resources in science and technology. CyberStacks(sm) was created in response to perceived deficiencies in early efforts to…
Integrating Databases with Maps: The Delivery of Cultural Data through TimeMap.
ERIC Educational Resources Information Center
Johnson, Ian
TimeMap is a unique integration of database management, metadata and interactive maps, designed to contextualise and deliver cultural data through maps. TimeMap extends conventional maps with the time dimension, creating and animating maps "on-the-fly"; delivers them as a kiosk application or embedded in Web pages; links flexibly to…
How to Break through Techno-Shock and Build Multi-Media Units.
ERIC Educational Resources Information Center
Sutz, Rachel; Warren, Maria W.; Williams, Holly
1998-01-01
Describes how three teachers learned about using Hyperstudio (presentation software), constructed a Web page, and created an original film as part of a unit on Florida writers. Recommends three major strategies for learning a new technology: choose the literary works, then the technology; build on your strengths; and learn to talk to the…
Macroscopic characterisations of Web accessibility
NASA Astrophysics Data System (ADS)
Lopes, Rui; Carriço, Luis
2010-12-01
The Web Science framework poses fundamental questions about the analysis of the Web, by focusing on how microscopic properties (e.g. at the level of a Web page or Web site) emerge into macroscopic properties and phenomena. One research topic in the analysis of the Web is Web accessibility evaluation, which centres on understanding how accessible a Web page is for people with disabilities. However, when framing Web accessibility evaluation within Web Science, we found that existing research stays at the microscopic level. This article presents an experimental study framing Web accessibility evaluation in terms of Web Science's goals. The study uncovered novel accessibility properties of the Web not visible at microscopic levels, as well as properties of Web accessibility evaluation processes themselves. We observed at large scale some of the empirical knowledge on how accessibility is perceived by designers and developers, such as the disparity of interpretations of accessibility evaluation tools' warnings. We also found a direct relation between accessibility quality and Web page complexity. We provide a set of guidelines for designing Web pages and for education on Web accessibility, as well as on the computational limits of large-scale Web accessibility evaluations.
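A microscopic accessibility property of the kind such studies aggregate can be checked mechanically. The sketch below flags `<img>` elements lacking alt text (WCAG's text-alternative requirement); the regex stands in for the HTML parser a real evaluation tool would use, and is not the study's actual tooling.

```typescript
// Count <img> tags with no alt attribute — one page-level (microscopic)
// accessibility check. Large-scale evaluation aggregates many such checks
// across many pages. Regex-based parsing is for illustration only.
function imagesMissingAlt(html: string): number {
  const imgs = html.match(/<img\b[^>]*>/gi) ?? [];
  return imgs.filter(tag => !/\balt\s*=/i.test(tag)).length;
}

const page = '<img src="chart.png"><img src="logo.png" alt="Site logo">';
console.log(imagesMissingAlt(page)); // 1 — only the chart lacks alt text
```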
Reporting on post-menopausal hormone therapy: an analysis of gynaecologists' web pages.
Bucksch, Jens; Kolip, Petra; Deitermann, Bernhilde
2004-01-01
The present study was designed to analyse Web pages of German gynaecologists with regard to postmenopausal hormone therapy (HT). There is a growing body of evidence that the overall health risks of HT exceed the benefits. Making one's own informed choice has become a central concern for menopausal women. The Internet is an important source of health information, but its quality is often dubious. The study focused on the analysis of basic criteria, such as the last modification date, and on the quality of the HT information content. The results of the Women's Health Initiative (WHI) study were used as a benchmark. We searched for relevant Web pages by entering a combination of key words (9 x 13 = 117) into the search engine www.google.de. Each Web page was analysed using a standardized questionnaire. The basic criteria and the quality of content on each Web page were categorized separately by two evaluators, and disagreements were resolved by discussion. Of the 97 websites identified, the majority did not meet the basic criteria. For example, the modification date was displayed by only 23 (23.7%) Web pages. The content of most Web pages regarding HT was inaccurate and incomplete. Whilst only nine (9.3%) took up a balanced position, 66 (68%) recommended HT without any restrictions; in 22 cases the recommendation was indistinct, and none of the sites advised against HT. With regard to basic criteria, there was no difference between HT-recommending Web pages and sites with a balanced position. Evidence-based information resulting from the WHI trial was insufficiently represented on gynaecologists' Web pages. Because of the growing number of consumers looking online for health information, the danger of obtaining harmful information has to be minimized. Gynaecologists' Web pages cannot currently be recommended to women because they do not provide recent evidence-based findings about HT.
Exploring Cultural Variation in Eye Movements on a Web Page between Americans and Koreans
ERIC Educational Resources Information Center
Yang, Changwoo
2009-01-01
This study explored differences in eye movement on a Web page between members of two different cultures to provide insight and guidelines for implementation of global Web site development. More specifically, the research examines whether differences of eye movement exist between the two cultures (American vs. Korean) when viewing a Web page, and…
Allen, J W; Finch, R J; Coleman, M G; Nathanson, L K; O'Rourke, N A; Fielding, G A
2002-01-01
This study was undertaken to determine the quality of information on the Internet regarding laparoscopy. Four popular World Wide Web search engines were used with the key word "laparoscopy." Advertisements, patient- or physician-directed information, and controversial material were noted. A total of 14,030 Web pages were found, but only 104 were unique Web sites. The majority of the sites were duplicate pages, subpages within a main Web page, or dead links. Twenty-eight of the 104 pages had a medical product for sale, 26 were patient-directed, 23 were written by a physician or group of physicians, and six represented corporations. The remaining 21 were "miscellaneous." The 46 pages containing educational material were critically reviewed. At least one of the senior authors found that 32 of the pages contained controversial or misleading statements. All of the three senior authors (LKN, NAO, GAF) independently agreed that 17 of the 46 pages contained controversial information. The World Wide Web is not a reliable source for patient or physician information about laparoscopy. Authenticating medical information on the World Wide Web is a difficult task, and no government or surgical society has taken the lead in regulating what is presented as fact on the World Wide Web.
NASA Astrophysics Data System (ADS)
Dimopoulos, Kostas; Asimakopoulos, Apostolos
2010-06-01
This study aims to explore the navigation patterns and preferred-page characteristics of ten secondary school students searching the Web for information about cloning. The students navigated the Web for as long as they wished, with minimal support from teaching staff. Their navigation patterns were analyzed using audit-trail data software. The characteristics of their preferred Web pages were also analyzed using a scheme of analysis largely based on socio-linguistic and socio-semiotic approaches. Two distinct groups of students could be discerned. The first consisted of more competent students, who during their navigation visited fewer relevant pages, but ones of higher credibility and more specialized content. The second group consisted of weaker students, who visited more pages, mainly of lower credibility and rather popularized content. Implications for designing educational Web pages and for teaching are discussed.
Automating Information Discovery Within the Invisible Web
NASA Astrophysics Data System (ADS)
Sweeney, Edwina; Curran, Kevin; Xie, Ermai
A Web crawler, or spider, crawls through the Web looking for pages to index; when it locates a new page, it passes the page on to an indexer. The indexer identifies links, keywords, and other content and stores these within its database. This database is searched by entering keywords through an interface, and suitable Web pages are returned in a results page as hyperlinks accompanied by short descriptions. The Web, however, is increasingly moving away from being a collection of documents to a multidimensional repository for sounds, images, audio, and other formats. This is leading to a situation where certain parts of the Web are invisible or hidden. The term "Deep Web" has emerged to refer to the mass of information that can be accessed via the Web but cannot be indexed by conventional search engines. The Deep Web makes searches quite complex for search engines. Google states that the claim that conventional search engines cannot find such documents as PDFs, Word, PowerPoint, Excel, or any non-HTML page is not fully accurate, and steps have been taken to address this problem by implementing procedures to search items such as academic publications, news, blogs, videos, books, and real-time information. However, Google still provides access to only a fraction of the Deep Web. This chapter explores the Deep Web and the current tools available for accessing it.
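The crawl-and-index loop described above hinges on extracting links from each fetched page and resolving them against the page's own URL before queueing them. A minimal sketch, assuming regex-based parsing (a production crawler would use a real HTML parser, honor robots.txt, and deduplicate URLs):

```typescript
// Extract hyperlinks from fetched HTML and resolve them against the page
// URL, so relative links become absolute before being queued for crawling.
function extractLinks(html: string, baseUrl: string): string[] {
  const links: string[] = [];
  const re = /<a\s[^>]*href="([^"]+)"/gi;
  let m: RegExpExecArray | null;
  while ((m = re.exec(html)) !== null) {
    links.push(new URL(m[1], baseUrl).href); // resolve relative links
  }
  return links;
}

const html = '<a href="/about">About</a> <a href="https://other.example/">Out</a>';
console.log(extractLinks(html, "https://example.com/index.html"));
// ["https://example.com/about", "https://other.example/"]
```

Deep Web content is precisely what this loop misses: pages reachable only through form submissions or dynamic queries expose no `href` for the crawler to follow.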
Toward automated assessment of health Web page quality using the DISCERN instrument.
Allam, Ahmed; Schulz, Peter J; Krauthammer, Michael
2017-05-01
As the Internet becomes the number one destination for obtaining health-related information, there is an increasing need to identify health Web pages that convey an accurate and current view of medical knowledge. In response, the research community has created multicriteria instruments for reliably assessing online medical information quality. One such instrument is DISCERN, which measures health Web page quality by assessing an array of features. In order to scale up use of the instrument, there is interest in automating the quality evaluation process by building machine learning (ML)-based DISCERN Web page classifiers. The paper addresses 2 key issues that are essential before constructing automated DISCERN classifiers: (1) generation of a robust DISCERN training corpus useful for training classification algorithms, and (2) assessment of the usefulness of the current DISCERN scoring schema as a metric for evaluating the performance of these algorithms. Using DISCERN, 272 Web pages discussing treatment options in breast cancer, arthritis, and depression were evaluated and rated by trained coders. First, different consensus models were compared to obtain a robust aggregated rating among the coders, suitable for a DISCERN ML training corpus. Second, a new DISCERN scoring criterion was proposed (features-based score) as an ML performance metric that is more reflective of the score distribution across different DISCERN quality criteria. First, we found that a probabilistic consensus model applied to the DISCERN instrument was robust against noise (random ratings) and superior to other approaches for building a training corpus. Second, we found that the established DISCERN scoring schema (overall score) is ill-suited to measure ML performance for automated classifiers. 
Use of a probabilistic consensus model is advantageous for building a training corpus for the DISCERN instrument, and use of a features-based score is an appropriate ML metric for automated DISCERN classifiers. The code for the probabilistic consensus model is available at https://bitbucket.org/A_2/em_dawid/ . © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
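The probabilistic consensus model (the linked repository implements a Dawid-Skene-style estimator) weighs coders by their inferred reliability. The simplest baseline it improves on is a per-item majority vote, sketched here with hypothetical ratings:

```typescript
// Baseline consensus for a multi-coder training corpus: take each item's
// most frequent rating. A probabilistic model such as Dawid–Skene instead
// learns per-coder reliability, which is what makes it robust to a coder
// who rates randomly (the "noise" the paper tests against).
function majorityLabel(ratings: number[]): number {
  const counts = new Map<number, number>();
  for (const r of ratings) counts.set(r, (counts.get(r) ?? 0) + 1);
  let best = ratings[0];
  counts.forEach((c, label) => {
    if (c > (counts.get(best) ?? 0)) best = label;
  });
  return best;
}

// Three coders rate one DISCERN item on its 1–5 scale (ratings hypothetical):
console.log(majorityLabel([4, 4, 2])); // 4
```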
49 CFR 573.9 - Address for submitting required reports and other information.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Internet Web page http://www.safercar.gov/Vehicle+Manufacturers. A manufacturer must use the templates provided at this Web page for all submissions required under this section. Defect and noncompliance... at this Web page. [78 FR 51421, Aug. 20, 2013] ...
Ajax Architecture Implementation Techniques
NASA Astrophysics Data System (ADS)
Hussaini, Syed Asadullah; Tabassum, S. Nasira; Baig, Tabassum, M. Khader
2012-03-01
Today's rich Web applications use a mix of JavaScript and asynchronous communication with the application server. This mechanism is known as Ajax: Asynchronous JavaScript and XML. The intent of Ajax is to exchange small pieces of data between the browser and the application server, and in doing so, use partial page refresh instead of reloading the entire Web page. AJAX is a powerful Web development model for browser-based Web applications. The technologies that form the AJAX model, such as XML, JavaScript, HTTP, and XHTML, are individually widely used and well known. However, AJAX combines these technologies to let Web pages retrieve small amounts of data from the server without having to reload the entire page. This capability makes Web pages more interactive and lets them behave like local applications. Web 2.0, enabled by the Ajax architecture, has given rise to a new level of user interactivity through web browsers. Many new and extremely popular Web applications have been introduced, such as Google Maps, Google Docs, and Flickr. Ajax toolkits such as Dojo allow web developers to build Web 2.0 applications quickly and with little effort.
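The partial-refresh exchange Ajax enables can be sketched as a pure state patch plus a (commented) asynchronous request; the endpoint and region names below are hypothetical.

```typescript
// The Ajax pattern: fetch a small payload and patch only the affected
// region of the page state, instead of reloading the whole document.
interface PageState { [region: string]: string; }

// Pure helper: merge the small piece of data the server returned into the
// current page state; untouched regions survive unchanged.
function applyPartialUpdate(state: PageState, patch: PageState): PageState {
  return { ...state, ...patch };
}

// In a browser, the patch would arrive asynchronously, e.g.:
//   const patch = await (await fetch("/api/headlines")).json(); // hypothetical endpoint
//   pageState = applyPartialUpdate(pageState, patch);
//   document.getElementById("headlines")!.innerHTML = pageState.headlines;
const state: PageState = { headlines: "loading…", footer: "© Example" };
const updated = applyPartialUpdate(state, { headlines: "Fresh headlines" });
console.log(updated.headlines, "/", updated.footer);
```

The point of the pattern is visible in the last line: only the `headlines` region changed, while the rest of the page state (and in a browser, the rest of the DOM) was left alone.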
Technical development of PubMed Interact: an improved interface for MEDLINE/PubMed searches
Muin, Michael; Fontelo, Paul
2006-01-01
Background: The project aims to create an alternative search interface for MEDLINE/PubMed that may provide assistance to the novice user and added convenience to the advanced user. An earlier version of the project was the Slider Interface for MEDLINE/PubMed searches (SLIM), which provided JavaScript slider bars to control search parameters. In this new version, recent developments in Web-based technologies were implemented. These changes may prove to be even more valuable in enhancing user interactivity through client-side manipulation and management of results. Results: PubMed Interact is a Web-based MEDLINE/PubMed search application built with HTML, JavaScript and PHP. It is implemented on Windows Server 2003 with Apache 2.0.52, PHP 4.4.1 and MySQL 4.1.18. PHP scripts provide the backend engine that connects with E-Utilities and parses XML files. JavaScript manages client-side functionality and converts Web pages into interactive platforms using dynamic HTML (DHTML), Document Object Model (DOM) tree manipulation and Ajax methods. With PubMed Interact, users can limit searches with JavaScript slider bars, preview result counts, delete citations from the list, display and add related articles and create relevance lists. Many interactive features occur on the client side, allowing instant feedback without reloading or refreshing the page and resulting in a more efficient user experience. Conclusion: PubMed Interact is a highly interactive Web-based search application for MEDLINE/PubMed that explores recent trends in Web technologies like DOM tree manipulation and Ajax. It may become a valuable technical development for online medical search applications. PMID:17083729
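The backend step described above (PHP in the paper, sketched here in TypeScript for consistency) boils down to composing an E-utilities request and parsing the returned XML. ESearch's `db`, `term`, and `retmax` parameters belong to the public NCBI E-utilities API; the query itself is only an example.

```typescript
// Build an NCBI E-utilities ESearch URL for a PubMed query. The server-side
// engine requests this URL, then parses the returned XML into result IDs.
function buildESearchUrl(term: string, retmax = 20): string {
  const base = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi";
  const params = new URLSearchParams({
    db: "pubmed",          // search the PubMed database
    term,                  // the user's query
    retmax: String(retmax) // cap on the number of returned IDs
  });
  return `${base}?${params.toString()}`;
}

console.log(buildESearchUrl("endometriosis readability", 5));
```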
Is Domain Highlighting Actually Helpful in Identifying Phishing Web Pages?
Xiong, Aiping; Proctor, Robert W; Yang, Weining; Li, Ninghui
2017-06-01
To evaluate the effectiveness of domain highlighting in helping users identify whether Web pages are legitimate or spurious. As a component of the URL, a domain name can be overlooked. Consequently, browsers highlight the domain name to help users identify which Web site they are visiting. Nevertheless, few studies have assessed the effectiveness of domain highlighting, and the only formal study confounded highlighting with instructions to look at the address bar. We conducted two phishing detection experiments. Experiment 1 was run online: Participants judged the legitimacy of Web pages in two phases. In Phase 1, participants were to judge the legitimacy based on any information on the Web page, whereas in Phase 2, they were to focus on the address bar. Whether the domain was highlighted was also varied. Experiment 2 was conducted similarly but with participants in a laboratory setting, which allowed tracking of fixations. Participants differentiated the legitimate and fraudulent Web pages better than chance. There was some benefit of attending to the address bar, but domain highlighting did not provide effective protection against phishing attacks. Analysis of eye-gaze fixation measures was in agreement with the task performance, but heat-map results revealed that participants' visual attention was attracted by the highlighted domains. Failure to detect many fraudulent Web pages even when the domain was highlighted implies that users lacked knowledge of Web page security cues or how to use those cues. Potential applications include development of phishing prevention training incorporating domain highlighting with other methods to help users identify phishing Web pages.
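The cue that domain highlighting emphasizes is the hostname buried inside a deceptive URL; the trusted brand name in the path is what phishers count on users reading instead. A minimal sketch (the URLs are hypothetical):

```typescript
// The security-relevant part of a URL is the hostname, not the path.
// Phishing URLs often plant a trusted brand in the path to draw the eye
// away from the actual (attacker-controlled) domain.
function domainOf(url: string): string {
  return new URL(url).hostname;
}

// The path says "paypal.com", but the page is actually served by evil.example:
console.log(domainOf("http://evil.example/paypal.com/login")); // "evil.example"
```

The study's finding is precisely that highlighting this hostname in the address bar is not enough: users who do not know *why* the domain matters still misjudge the page.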
[Health information on the Internet and trust marks as quality indicators: vaccines case study].
Mayer, Miguel Angel; Leis, Angela; Sanz, Ferran
2009-10-01
To determine the prevalence of quality trust marks on websites and to compare the quality of websites displaying trust marks with those that do not, in order to assess trust marks as a quality indicator. Cross-sectional study. Internet. Websites on vaccines. Using "vacunas OR vaccines" as key words, the features of 40 web pages were analysed, selected from the results pages of two search engines, Google and Yahoo!. Based on a total of 9 criteria, the average number of criteria fulfilled was 7 (95% CI 3.96-10.04) for the web pages found via Yahoo! and 7.3 (95% CI 3.86-10.74) for those found via Google. Among the web pages found via Yahoo!, three contained clearly inaccurate information, as did four of the pages found via Google. Trust marks were displayed on 20% and 30% of the medical web pages, respectively, and pages displaying trust marks fulfilled significantly more quality criteria (P=0.033) than pages without them. The search engines returned a wide variety of web pages, a large number of them with useless information. Although the websites analysed were of good quality overall, between 15% and 20% showed inaccurate information. Websites displaying trust marks were of higher quality than those without, and none of them were among the pages where inaccurate information was found.
77 FR 70454 - Proposed Flood Hazard Determinations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-26
... which included a Web page address through which the Preliminary Flood Insurance Rate Map (FIRM), and... be accessed. The information available through the Web page address has subsequently been updated... through the web page address listed in the table has been updated to reflect the Revised Preliminary...
An Analysis of Academic Library Web Pages for Faculty
ERIC Educational Resources Information Center
Gardner, Susan J.; Juricek, John Eric; Xu, F. Grace
2008-01-01
Web sites are increasingly used by academic libraries to promote key services and collections to teaching faculty. This study analyzes the content, location, language, and technological features of fifty-four academic library Web pages designed especially for faculty to expose patterns in the development of these pages.
Paparo, G. D.; Martin-Delgado, M. A.
2012-01-01
We introduce the characterization of a class of quantum PageRank algorithms in a scenario in which some kind of quantum network is realizable out of the current classical internet web, but no quantum computer is yet available. This class represents a quantization of the PageRank protocol currently employed to list web pages according to their importance. We have found an instance of this class of quantum protocols that outperforms its classical counterpart and may break the classical hierarchy of web pages depending on the topology of the web. PMID:22685626
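The classical PageRank protocol that this work quantizes is a power iteration over the link graph with a damping factor d; a compact sketch (dangling nodes handled naively for brevity):

```typescript
// Classical PageRank: repeatedly redistribute rank along outgoing links,
// mixing in a uniform (1−d)/N teleport term with damping factor d ≈ 0.85.
// outLinks[i] lists the nodes that page i links to. Nodes with no outgoing
// links simply leak rank here; real implementations redistribute it.
function pageRank(outLinks: number[][], d = 0.85, iters = 50): number[] {
  const n = outLinks.length;
  let rank: number[] = new Array(n).fill(1 / n);
  for (let it = 0; it < iters; it++) {
    const next: number[] = new Array(n).fill((1 - d) / n);
    for (let i = 0; i < n; i++) {
      for (const j of outLinks[i]) next[j] += (d * rank[i]) / outLinks[i].length;
    }
    rank = next;
  }
  return rank;
}

// Nodes 1 and 2 both link to node 0, which links back to node 1:
const ranks = pageRank([[1], [0], [0]]);
console.log(ranks[0] > ranks[1] && ranks[1] > ranks[2]); // true
```

It is this link-topology-dependent ordering that the quantum variant can reshuffle, which is what the abstract means by "breaking the classical hierarchy of web pages".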
Methodologies for Crawler Based Web Surveys.
ERIC Educational Resources Information Center
Thelwall, Mike
2002-01-01
Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62…
Promoting pedagogical experimentation: using a wiki in graduate level education.
Martin, Carolyn Thompson
2012-12-01
Learning to write in a scholarly manner is often a challenge for graduate students. This study describes nursing students' use of a wiki to encourage writing collaboration among students by allowing them to cocreate, review, and edit each other's material as it is created. Students are introduced to the online wiki site the first week of the course. A technology representative assists students with a short introduction and class visits. All students participate in making decisions related to the overall character of the site. They create pages on topics related to their clinical placements. Student pages are peer and content expert reviewed for accuracy and comprehensiveness. Students include pictures, YouTube links, attachments, videos, and Web site links into their pages. Evidence-based content includes pharmacology, diagnostic criteria, pathophysiology, history, genetics, and references. Students present their pages, and feedback questionnaires are collected at the end of the semester. The wiki writing assignment introduces students, faculty, and the community to graduate student projects while exposing students to new technology. Areas explored include issues and best practices regarding classroom pedagogy, as well as student support and technical challenges in the use of a wiki. Suggestions for improvement are discussed.
Atmospheric Science Data Center
2013-03-21
... Web Links to Relevant CERES Information Relevant information about CERES, CERES references, ... Instrument Working Group Home Page Aerosol Retrieval Web Page (Center for Satellite Applications and Research) ...
Creating Dynamic Websites Using jQuery
ERIC Educational Resources Information Center
Miller-Francisco, Emily
2010-01-01
As e-resource systems and web coordinator for Southern Oregon University, the author is deeply involved with the university library's website. In the latest revision of this website, the author knew she needed to jazz it up a little. With screen real estate on the main page at a premium, the author hoped to use a tabbed box and an accordion-style…
Benefits and Pitfalls of Using HTML as a CD-ROM Development Tool.
ERIC Educational Resources Information Center
Misanchuk, Earl R.; Schwier, Richard A.
The hypertext markup language (HTML) used to develop pages for the world wide web also has potential for use in creating some types of multimedia instruction destined for CD-ROMs. After providing a brief overview of HTML, this document presents pros and cons relevant to CD-ROM production. HTML can offer compatibility to both Windows and Macintosh…
ERIC Educational Resources Information Center
Wang, Tzone I; Tsai, Kun Hua; Lee, Ming Che; Chiu, Ti Kai
2007-01-01
With vigorous development of the Internet, especially the web page interaction technology, distant E-learning has become more and more realistic and popular. Digital courses may consist of many learning units or learning objects and, currently, many learning objects are created according to SCORM standard. It can be seen that, in the near future,…
The Wiki as a Virtual Space for Qualitative Data Collection
ERIC Educational Resources Information Center
Castanos, Carolina; Piercy, Fred P.
2010-01-01
The authors make a case for using wiki technology in qualitative research. A wiki is an online database that allows users to create, edit, and/or reflect on the content of a web page. Thus, wiki technology can support qualitative research that attempts to understand the shared thinking of participants. To illustrate the use of the wiki for this…
Projects made with the Berkeley Lab Circuit Board
dependence of cosmic rays. Greg Poe, a student at Travis High School in Richmond, Texas, received an the journal Physics Education. He used the Berkeley Lab circuit board together with spare parts from New York Schools Cosmic Particle Telescope workshop. Ken Cecire has created a web page which describes
PC-based web authoring: How to learn as little unix as possible while getting on the Web
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gennari, L.T.; Breaux, M.; Minton, S.
1996-09-01
This document is a general guide for creating Web pages, using commonly available word processing and file transfer applications. It is not a full guide to HTML, nor does it provide an introduction to the many WYSIWYG HTML editors available. The viability of the authoring method it describes will not be affected by changes in the HTML specification or the rapid release-and-obsolescence cycles of commercial WYSIWYG HTML editors. This document provides a gentle introduction to HTML for the beginner, and as the user gains confidence and experience, encourages greater familiarity with HTML through continued exposure to and hands-on usage of HTML code.
Mac-based Web authoring: How to learn as little Unix as possible while getting on the Web.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gennari, L.T.
1996-06-01
This document is a general guide for creating Web pages, using commonly available word processing and file transfer applications. It is not a full guide to HTML, nor does it provide an introduction to the many WYSIWYG HTML editors available. The viability of the authoring method it describes will not be affected by changes in the HTML specification or the rapid release-and-obsolescence cycles of commercial WYSIWYG HTML editors. This document provides a gentle introduction to HTML for the beginner and, as the user gains confidence and experience, encourages greater familiarity with HTML through continued exposure to and hands-on usage of HTML code.
Adding Audio Supported Smartboard Lectures to an Introductory Astronomy Online Laboratory
NASA Astrophysics Data System (ADS)
Lahaise, U. G. L.
2003-12-01
SMART Board(TM) and RealProducer(R) Plus technologies were used to develop a series of narrated pre-lab introductory online lectures. Smartboard slides were created by capturing images from Internet pages and PowerPoint slides, then annotating them and saving them as web pages using Smartboard technology. Short audio files were recorded using the RealProducer Plus software and then linked to individual slides. WebCT was used to deliver the online laboratory. Students in an Introductory Astronomy of the Solar System online laboratory used the lectures to prepare for laboratory exercises. The narrated pre-lab lectures were added to six out of eight suitable laboratory exercises. A survey was given to the students to investigate their online laboratory experience in general, and the impact of the narrated Smartboard lectures on their learning success in particular. Data were collected for two accelerated sessions. Results show that students find the online laboratory as hard as or harder than a separate online lecture. The accelerated format created great time pressure, which negatively affected their study habits. About half of the students used the narrated pre-lab lectures consistently. Preliminary findings show that lab scores in the accelerated sessions were brought up to the level of full-semester courses.
Kozlov, Elissa; Carpenter, Brian D
2017-04-01
Americans rely on the Internet for health information, and people are likely to turn to online resources to learn about palliative care as well. The purpose of this study was to analyze online palliative care information pages to evaluate the breadth of their content. We also compared how frequently basic facts about palliative care appeared on the Web pages to expert rankings of the importance of those facts to understanding palliative care. Twenty-six pages were identified. Two researchers independently coded each page for content. Palliative care professionals (n = 20) rated the importance of content domains for comparison with content frequency in the Web pages. We identified 22 recurring broad concepts about palliative care. Each information page included, on average, 9.2 of these broad concepts (standard deviation [SD] = 3.36, range = 5-15). Similarly, each broad concept was present in an average of 45% of the Web pages (SD = 30.4%, range = 8%-96%). Significant discrepancies emerged between expert ratings of the importance of the broad concepts and the frequency of their appearance in the Web pages (rτ = .25, P > .05). This study demonstrates that palliative care information pages available online vary considerably in their content coverage. Furthermore, information that palliative care professionals rate as important for consumers to know is not always included in Web pages. We developed guidelines for information pages for the purpose of educating consumers in a consistent way about palliative care.
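The discrepancy statistic reported in this abstract (rτ) is Kendall's rank correlation between expert importance ratings and content frequency. A minimal sketch of how such a comparison can be computed, using made-up ratings and frequencies (the study's actual data are not given in the abstract):

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) / total pairs."""
    assert len(x) == len(y)
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n_pairs = len(x) * (len(x) - 1) // 2
    return (concordant - discordant) / n_pairs

# Hypothetical data: expert importance ratings for five concepts vs.
# the fraction of Web pages on which each concept appeared
expert_rating = [5, 4, 3, 2, 1]
page_frequency = [0.96, 0.30, 0.45, 0.08, 0.20]
tau = kendall_tau(expert_rating, page_frequency)
```

A tau near 1 would mean the pages emphasize what experts consider important; the modest value reported in the study indicates a weak match.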
Growing and navigating the small world Web by local content
Menczer, Filippo
2002-01-01
Can we model the scale-free distribution of Web hypertext degree under realistic assumptions about the behavior of page authors? Can a Web crawler efficiently locate an unknown relevant page? These questions are receiving much attention due to their potential impact for understanding the structure of the Web and for building better search engines. Here I investigate the connection between the linkage and content topology of Web pages. The relationship between a text-induced distance metric and a link-based neighborhood probability distribution displays a phase transition between a region where linkage is not determined by content and one where linkage decays according to a power law. This relationship is used to propose a Web growth model that is shown to accurately predict the distribution of Web page degree, based on textual content and assuming only local knowledge of degree for existing pages. A qualitatively similar phase transition is found between linkage and semantic distance, with an exponential decay tail. Both relationships suggest that efficient paths can be discovered by decentralized Web navigation algorithms based on textual and/or categorical cues. PMID:12381792
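The growth model described in this abstract predicts heavy-tailed degree distributions from local attachment behavior. The sketch below illustrates only the degree-based (preferential attachment) ingredient of such a model; the paper's content-conditioned linking is omitted, and all parameters are arbitrary:

```python
import random

def grow_network(n_pages, links_per_page=3, seed=0):
    """Simplified growth: each new page links to existing pages with
    probability proportional to their current in-degree.  The actual
    model in the paper additionally conditions linkage on lexical
    distance between pages; that part is not reproduced here."""
    rng = random.Random(seed)
    indegree = [1] * links_per_page  # small seed network (degree-1 smoothing)
    for _ in range(links_per_page, n_pages):
        targets = rng.choices(range(len(indegree)),
                              weights=indegree, k=links_per_page)
        for t in set(targets):
            indegree[t] += 1
        indegree.append(1)
    return indegree

degrees = grow_network(2000)
```

Plotting a histogram of `degrees` on log-log axes would show the characteristic straight-line (power-law) tail that the paper predicts for Web page in-degree.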
An ant colony optimization based feature selection for web page classification.
Saraç, Esra; Özel, Selma Ayşe
2014-01-01
The increased popularity of the web has led to the inclusion of a huge amount of information on the web, and as a result of this explosive information growth, automated web page classification systems are needed to improve search engines' performance. Web pages have a large number of features such as HTML/XML tags, URLs, hyperlinks, and text contents that should be considered during an automated classification process. The aim of this study is to reduce the number of features to be used to improve runtime and accuracy of the classification of web pages. In this study, we used an ant colony optimization (ACO) algorithm to select the best features, and then we applied the well-known C4.5, naive Bayes, and k nearest neighbor classifiers to assign class labels to web pages. We used the WebKB and Conference datasets in our experiments, and we showed that using the ACO for feature selection improves both accuracy and runtime performance of classification. We also showed that the proposed ACO based algorithm can select better features with respect to the well-known information gain and chi-square feature selection methods.
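The ACO search itself is too involved for a short sketch, but the information gain baseline the authors compare against is compact. The toy data below (binary term-presence features for hypothetical "course" and "faculty" pages) are invented for illustration:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """Information gain of one (binary) feature, e.g. an HTML tag or
    term being present on the page."""
    base = entropy(labels)
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    return base - cond

# Toy data: rows are pages, columns are binary features (term present/absent)
X = [[1, 0, 1], [1, 1, 0], [0, 0, 1], [0, 1, 1]]
y = ["course", "course", "faculty", "faculty"]
gains = [information_gain([row[j] for row in X], y) for j in range(3)]
best_feature = max(range(3), key=lambda j: gains[j])
```

Here feature 0 splits the classes perfectly (gain of 1 bit), feature 1 carries no information, and a classifier would keep only the highest-gain columns.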
SLIM: an alternative Web interface for MEDLINE/PubMed searches – a preliminary study
Muin, Michael; Fontelo, Paul; Liu, Fang; Ackerman, Michael
2005-01-01
Background With the rapid growth of medical information and the pervasiveness of the Internet, online search and retrieval systems have become indispensable tools in medicine. The progress of Web technologies can provide expert searching capabilities to non-expert information seekers. The objective of the project is to create an alternative search interface for MEDLINE/PubMed searches using JavaScript slider bars. SLIM, or Slider Interface for MEDLINE/PubMed searches, was developed with PHP and JavaScript. Interactive slider bars in the search form controlled search parameters such as limits, filters and MeSH terminologies. Connections to PubMed were done using the Entrez Programming Utilities (E-Utilities). Custom scripts were created to mimic the automatic term mapping process of Entrez. Page generation times for both local and remote connections were recorded. Results Alpha testing by developers showed SLIM to be functionally stable. Page generation times to simulate loading times were recorded the first week of alpha and beta testing. Average page generation times for the index page, previews and searches were 2.94 milliseconds, 0.63 seconds and 3.84 seconds, respectively. Eighteen physicians from the US, Australia and the Philippines participated in the beta testing and provided feedback through an online survey. Most users found the search interface user-friendly and easy to use. Information on MeSH terms and the ability to instantly hide and display abstracts were identified as distinctive features. Conclusion SLIM can be an interactive time-saving tool for online medical literature research that improves user control and capability to instantly refine and refocus search strategies. With continued development and by integrating search limits, methodology filters, MeSH terms and levels of evidence, SLIM may be useful in the practice of evidence-based medicine. PMID:16321145
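SLIM itself was written in PHP and JavaScript; as a language-neutral illustration, the sketch below composes a PubMed ESearch request URL through the E-Utilities interface the paper mentions. The mapping of slider positions to query parameters is an assumption for illustration, not SLIM's actual logic:

```python
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def build_esearch_url(term, retmax=20, publication_types=None):
    """Compose a PubMed ESearch URL.  In a slider interface like SLIM,
    slider positions would control values such as retmax (result limit)
    and filter terms appended to the query."""
    query = term
    if publication_types:
        filters = " OR ".join(f'"{pt}"[Publication Type]'
                              for pt in publication_types)
        query = f"({term}) AND ({filters})"
    params = {"db": "pubmed", "term": query, "retmax": retmax}
    return f"{EUTILS}/esearch.fcgi?{urlencode(params)}"

url = build_esearch_url("asthma", retmax=10, publication_types=["Review"])
```

Fetching the resulting URL returns an XML list of PMIDs, which a front end like SLIM would then expand into citation previews.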
Stockburger, D W
1999-05-01
Active server pages permit a software developer to customize the Web experience for users by inserting server-side script and database access into Web pages. This paper describes applications of these techniques and provides a primer on the use of these methods. Applications include a system that generates and grades individualized homework assignments and tests for statistics students. The student accesses the system as a Web page, prints out the assignment, does the assignment, and enters the answers on the Web page. The server, running on NT Server 4.0, grades the assignment, updates the grade book (on a database), and returns the answer key to the student.
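The original system used server-side script in Active Server Pages on NT Server 4.0; the Python sketch below illustrates the same underlying idea of seeding each student's assignment from an identifier, so the server can deterministically regenerate the data and grade the submission. All function names and the scoring rule are illustrative:

```python
import random

def make_assignment(student_id, n_values=10):
    """Generate an individualized dataset from the student ID, so each
    student sees different numbers but the same problem structure, and
    the server can re-create the answer key at grading time."""
    rng = random.Random(student_id)
    data = [rng.randint(1, 100) for _ in range(n_values)]
    answer_key = {"mean": sum(data) / len(data), "maximum": max(data)}
    return data, answer_key

def grade(answer_key, submitted, tolerance=0.01):
    """Score submitted answers against the key; return fraction correct."""
    correct = sum(
        1 for name, value in answer_key.items()
        if abs(submitted.get(name, float("inf")) - value) <= tolerance
    )
    return correct / len(answer_key)

data, key = make_assignment(student_id=42)
# One correct answer (mean) and one wrong answer (maximum off by 5)
score = grade(key, {"mean": key["mean"], "maximum": key["maximum"] + 5})
```

Because the assignment is a pure function of the student ID, nothing but the grade needs to be stored server-side between the "print the assignment" and "enter the answers" steps.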
Outreach for Outreach: Targeting social media audiences to promote a NASA kids’ web site
NASA Astrophysics Data System (ADS)
Pham, C. C.
2009-12-01
The Space Place is a successful NASA web site that benefits upper elementary school students and educators by providing games, activities, and resources to stimulate interest in science, technology, engineering, and mathematics, as well as to inform the audience of NASA’s contributions. As online social networking grows to be a central component of modern communication, The Space Place has explored the benefits of integrating social networks with the web site to increase awareness of materials the web site offers. This study analyzes the capabilities of social networks, and specifically the demographics of Twitter and Facebook. It then compares these results with the content, audience, and perceived demographics of The Space Place web site. Based upon the demographic results, we identified a target constituency that would benefit from the integration of social networks into The Space Place web site. As a result of this study, a Twitter feed has been established that releases a daily tweet from The Space Place. In addition, a Facebook page has been created to showcase new content and prompt interaction among fans of The Space Place. Currently, plans are under way to populate the Space Place Facebook page. Each social network has been utilized in an effort to spark excitement about the content on The Space Place, as well as to attract followers to the main NASA Space Place web site. To pursue this idea further, a plan has been developed to promote NASA Space Place’s social media tools among the target audience.
32 CFR 806.5 - Responsibilities.
Code of Federal Regulations, 2014 CFR
2014-07-01
... room (ERR) requirements by establishing a FOIA site on their installation public web page and making... a link to the Air Force FOIA web page at http://www.foia.af.mil. See § 806.12(c). (d) MAJCOM... installation public web page by updating or removing them when no longer needed. Software for tracking number...
32 CFR 806.5 - Responsibilities.
Code of Federal Regulations, 2012 CFR
2012-07-01
... room (ERR) requirements by establishing a FOIA site on their installation public web page and making... a link to the Air Force FOIA web page at http://www.foia.af.mil. See § 806.12(c). (d) MAJCOM... installation public web page by updating or removing them when no longer needed. Software for tracking number...
32 CFR 806.5 - Responsibilities.
Code of Federal Regulations, 2013 CFR
2013-07-01
... room (ERR) requirements by establishing a FOIA site on their installation public web page and making... a link to the Air Force FOIA web page at http://www.foia.af.mil. See § 806.12(c). (d) MAJCOM... installation public web page by updating or removing them when no longer needed. Software for tracking number...
Social Responsibility and Corporate Web Pages: Self-Presentation or Agenda-Setting?
ERIC Educational Resources Information Center
Esrock, Stuart L.; Leichty, Greg B.
1998-01-01
Examines how corporate entities use the Web to present themselves as socially responsible citizens and to advance policy positions. Samples randomly "Fortune 500" companies, revealing that, although 90% had Web pages and 82% of the sites addressed a corporate social responsibility issue, few corporations used their pages to monitor…
Environment: General; Grammar & Usage; Money Management; Music History; Web Page Creation & Design.
ERIC Educational Resources Information Center
Web Feet, 2001
2001-01-01
Describes Web site resources for elementary and secondary education in the topics of: environment, grammar, money management, music history, and Web page creation and design. Each entry includes an illustration of a sample page on the site and an indication of the grade levels for which it is appropriate. (AEF)
The Internet as a Reflective Mirror for a Company's Image.
ERIC Educational Resources Information Center
Fahrmann, Jennifer; Hartz, Kim; Wendling, Marijo; Yoder, Kevin
The Internet is becoming the primary way that businesses communicate and receive information. Corporate Web addresses and home pages have become a valuable tool for leaving a solid mark on potential clients, consumers, and competition. To determine how differences in Web pages design reflect corporate image, a study examined Web pages from two…
Intelligent medical information filtering.
Quintana, Y
1998-01-01
This paper describes an intelligent information filtering system to assist users to be notified of updates to new and relevant medical information. Among the major problems users face is the large volume of medical information that is generated each day, and the need to filter and retrieve relevant information. The Internet has dramatically increased the amount of electronically accessible medical information and reduced the cost and time needed to publish. The Internet gives the medical profession and consumers access to more information for making decisions, which could potentially lead to better medical decisions and outcomes. However, without assistance from professional medical librarians, retrieving new and relevant information from databases and the Internet remains a challenge. Many physicians do not have access to the services of a medical librarian. Most physicians indicate on surveys that they do not prefer to retrieve the literature themselves, or visit libraries, because of the lack of recent materials, poor organisation and indexing of materials, lack of appropriate and available material, and lack of time. The information filtering system described in this paper records the online web browsing behaviour of each user and creates a user profile of the index terms found on the web pages visited by the user. A relevance-ranking algorithm then matches the user profiles to the index terms of new health care web pages that are added each day. The system creates customised summaries of new information for each user. A user can then connect to the web site to read the new information. Relevance feedback buttons on each page ask the user to rate the usefulness of the page to their immediate information needs. Errors in relevance ranking are reduced in this system by having both the user profile and medical information represented in the same representation language using a controlled vocabulary.
This system also updates the user profiles automatically, relieving the user of this burden while still allowing the user to explicitly state preferences. An initial evaluation of this system was done with health consumers using a web site on consumer health. It was found that users often modified their criteria for what they considered relevant not only between browsing sessions but also during a session. A user's criteria for what is relevant are constantly changing as they interact with the information. New revised metrics of recall and precision are needed to account for partially relevant judgements and the dynamically changing criteria of users. Future research, development, and evaluation of interactive information retrieval systems will need to take into account users' dynamically changing criteria of relevance.
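The profile-matching step described above can be sketched as term aggregation over browsed pages plus cosine-style relevance ranking of new pages. This is a simplified stand-in, not the system's actual relevance-ranking algorithm (which used a controlled vocabulary):

```python
import math
from collections import Counter

def build_profile(visited_pages):
    """Aggregate index terms from the pages a user has browsed
    into a single term-frequency profile."""
    profile = Counter()
    for terms in visited_pages:
        profile.update(terms)
    return profile

def relevance(profile, page_terms):
    """Cosine similarity between the user profile and a new page's terms."""
    page = Counter(page_terms)
    dot = sum(profile[t] * page[t] for t in page)
    norm = (math.sqrt(sum(v * v for v in profile.values()))
            * math.sqrt(sum(v * v for v in page.values())))
    return dot / norm if norm else 0.0

profile = build_profile([["diabetes", "insulin"], ["diabetes", "diet"]])
new_pages = [["insulin", "diabetes"], ["sports", "injury"]]
ranked = sorted(new_pages, key=lambda p: relevance(profile, p), reverse=True)
```

A daily batch job could score each newly added page this way and include the top-ranked ones in the user's customised summary; relevance-feedback clicks would then adjust the profile counts.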
Thompson, Andrew E; Graydon, Sara L
2009-01-01
With continuing use of the Internet, rheumatologists are referring patients to various websites to gain information about medications and diseases. Our goal was to develop and evaluate a Medication Website Assessment Tool (MWAT) for use by health professionals, and to explore the overall quality of methotrexate information presented on common English-language websites. Identification of websites was performed using a search strategy on the search engine Google. The first 250 hits were screened. Inclusion criteria included those English-language websites from authoritative sources, trusted medical, physicians', and common health-related websites. Websites from pharmaceutical companies, online pharmacies, and where the purpose seemed to be primarily advertisements were also included. Product monographs or technical-based web pages and web pages where the information was clearly directed at patients with cancer were excluded. Two reviewers independently scored each included web page for completeness and accuracy, format, readability, reliability, and credibility. An overall ranking was provided for each methotrexate information page. Twenty-eight web pages were included in the analysis. The average score for completeness and accuracy was 15.48+/-3.70 (maximum 24) with 10 out of 28 pages scoring 18 (75%) or higher. The average format score was 6.00+/-1.46 (maximum 8). The Flesch-Kincaid Grade Level revealed an average grade level of 10.07+/-1.84, with 5 out of 28 websites written at a reading level less than grade 8; however, no web page scored at a grade 5 to 6 level. An overall ranking was calculated identifying 8 web pages as appropriate sources of accurate and reliable methotrexate information. With the enormous amount of information available on the Internet, it is important to direct patients to web pages that are complete, accurate, readable, and credible sources of information. 
We identified web pages that may serve the interests of both rheumatologists and patients.
Web page sorting algorithm based on query keyword distance relation
NASA Astrophysics Data System (ADS)
Yang, Han; Cui, Hong Gang; Tang, Hao
2017-08-01
To optimize the ranking of web pages, we propose a query-keyword clustering idea based on the relationships among the search keywords within a web page, expressed as the degree of aggregation of the search keywords in the page. Building on the PageRank algorithm, a clustering-degree factor for the query keywords is added so that it can participate in the quantitative calculation. This paper thus proposes an improved PageRank algorithm based on the distance relations between search keywords. The experimental results show the feasibility and effectiveness of the method.
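As a rough illustration of the idea, the sketch below runs standard power-iteration PageRank and then reweights each page by a per-page keyword-aggregation factor. The abstract does not specify how the factor enters the calculation, so the multiplicative blend here is an assumption:

```python
def pagerank_with_keyword_factor(links, keyword_factor, d=0.85, iters=50):
    """Power-iteration PageRank, with each page's final score reweighted
    by a keyword-aggregation factor (how tightly the query keywords
    cluster on that page).  links[i] lists the pages that page i links to."""
    n = len(links)
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1 - d) / n] * n
        for src, outs in enumerate(links):
            if outs:
                share = d * rank[src] / len(outs)
                for dst in outs:
                    new[dst] += share
            else:  # dangling page: spread its rank evenly
                for dst in range(n):
                    new[dst] += d * rank[src] / n
        rank = new
    scored = [r * keyword_factor[i] for i, r in enumerate(rank)]
    total = sum(scored)
    return [s / total for s in scored]

# Toy graph: 0 -> 1 -> 2 -> 0; page 1 has tightly clustered query keywords
links = [[1], [2], [0]]
scores = pagerank_with_keyword_factor(links, keyword_factor=[1.0, 2.0, 1.0])
```

On this symmetric cycle, plain PageRank would rank all three pages equally; the keyword factor breaks the tie in favor of the page where the query terms cluster.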
Experience versus talent shapes the structure of the Web.
Kong, Joseph S; Sarshar, Nima; Roychowdhury, Vwani P
2008-09-16
We use sequential large-scale crawl data to empirically investigate and validate the dynamics that underlie the evolution of the structure of the web. We find that the overall structure of the web is defined by an intricate interplay between experience or entitlement of the pages (as measured by the number of inbound hyperlinks a page already has), inherent talent or fitness of the pages (as measured by the likelihood that someone visiting the page would give a hyperlink to it), and the continual high rates of birth and death of pages on the web. We find that the web is conservative in judging talent and the overall fitness distribution is exponential, showing low variability. The small variance in talent, however, is enough to lead to experience distributions with high variance: The preferential attachment mechanism amplifies these small biases and leads to heavy-tailed power-law (PL) inbound degree distributions over all pages, as well as over pages that are of the same age. The balancing act between experience and talent on the web allows newly introduced pages with novel and interesting content to grow quickly and surpass older pages. In this regard, it is much like what we observe in high-mobility and meritocratic societies: People with entitlement continue to have access to the best resources, but there is just enough screening for fitness that allows for talented winners to emerge and join the ranks of the leaders. Finally, we show that the fitness estimates have potential practical applications in ranking query results.
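The experience-talent interplay can be illustrated with a toy growth process in which the probability of receiving a new link is proportional to (in-degree + 1) × fitness, with fitness drawn from an exponential distribution as the crawl study reports. This omits the paper's birth/death dynamics and measured parameters:

```python
import random

def grow_with_fitness(n_pages, seed=0):
    """Each new page links to one existing page chosen with probability
    proportional to (in-degree + 1) * fitness.  In-degree is the page's
    accumulated "experience"; fitness is its inherent "talent"."""
    rng = random.Random(seed)
    fitness = [rng.expovariate(1.0)]   # exponential fitness, low variability
    indegree = [0]
    for _ in range(1, n_pages):
        weights = [(k + 1) * f for k, f in zip(indegree, fitness)]
        target = rng.choices(range(len(indegree)), weights=weights, k=1)[0]
        indegree[target] += 1
        fitness.append(rng.expovariate(1.0))
        indegree.append(0)
    return indegree, fitness

indegree, fitness = grow_with_fitness(1000)
```

Even with the small spread of an exponential fitness distribution, the preferential mechanism amplifies early advantages into a heavy-tailed in-degree distribution, while high-fitness latecomers can still overtake older pages.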
NASA Astrophysics Data System (ADS)
Esparza, Javier
In many areas of computer science entities can “reproduce”, “replicate”, or “create new instances”. Paramount examples are threads in multithreaded programs, processes in operating systems, and computer viruses, but many others exist: procedure calls create new incarnations of the callees, web crawlers discover new pages to be explored (and so “create” new tasks), divide-and-conquer procedures split a problem into subproblems, and leaves of tree-based data structures become internal nodes with children. For lack of a better name, I use the generic term systems with process creation to refer to all these entities.
CytoscapeRPC: a plugin to create, modify and query Cytoscape networks from scripting languages.
Bot, Jan J; Reinders, Marcel J T
2011-09-01
CytoscapeRPC is a plugin for Cytoscape which allows users to create, query and modify Cytoscape networks from any programming language which supports XML-RPC. This enables them to access Cytoscape functionality and visualize their data interactively without leaving the programming environment with which they are familiar. Install through the Cytoscape plugin manager or visit the web page http://wiki.nbic.nl/index.php/CytoscapeRPC for the user tutorial and download. j.j.bot@tudelft.nl.
Effective Web and Desktop Retrieval with Enhanced Semantic Spaces
NASA Astrophysics Data System (ADS)
Daoud, Amjad M.
We describe the design and implementation of the NETBOOK prototype system for collecting, structuring and efficiently creating semantic vectors for concepts, noun phrases, and documents from a corpus of free full text ebooks available on the World Wide Web. Automatic generation of concept maps from correlated index terms and extracted noun phrases are used to build a powerful conceptual index of individual pages. To ensure the scalability of our system, dimension reduction is performed using Random Projection [13]. Furthermore, we present a complete evaluation of the relative effectiveness of the NETBOOK system versus the Google Desktop [8].
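The Random Projection step cited above can be sketched directly: multiply each high-dimensional term vector by a random Gaussian matrix, which approximately preserves pairwise distances (the Johnson-Lindenstrauss lemma). The dimensions below are arbitrary, not NETBOOK's actual settings:

```python
import math
import random

def random_projection_matrix(d_in, d_out, seed=0):
    """Gaussian random projection matrix with entries N(0, 1/d_out),
    so vector norms are approximately preserved after projection."""
    rng = random.Random(seed)
    scale = 1.0 / math.sqrt(d_out)
    return [[rng.gauss(0.0, scale) for _ in range(d_out)]
            for _ in range(d_in)]

def project(vec, matrix):
    d_out = len(matrix[0])
    return [sum(vec[i] * matrix[i][j] for i in range(len(vec)))
            for j in range(d_out)]

def norm(v):
    return math.sqrt(sum(x * x for x in v))

# Project a sparse 1000-dimensional term vector down to 100 dimensions
R = random_projection_matrix(1000, 100)
v = [0.0] * 1000
for i in (3, 17, 256, 999):  # four terms present on the page
    v[i] = 1.0
low = project(v, R)
```

Unlike latent semantic indexing, the projection needs no training pass over the corpus, which is what makes it attractive for scalability.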
Connecting Families through Innovative Technology in an Early Childhood Gifted Program.
ERIC Educational Resources Information Center
Kristovich, Sharon; Hertzog, Nancy B.; Klein, Marjorie
University Primary School (UPS) is an early childhood gifted program affiliated with the University of Illinois at Urbana-Champaign. This paper highlights three innovative uses of technology at UPS: Knowledge Web pages, photo portfolios, and Chickscope. The Knowledge Web pages are a collection of Web pages that serve as a virtual bulletin board…
ERIC Educational Resources Information Center
Hall, Richard H.; Hanna, Patrick
2004-01-01
The purpose of this experiment was to examine the effect of web page text/background colour combination on readability, retention, aesthetics, and behavioural intention. One hundred and thirty-six participants studied two Web pages, one with educational content and one with commercial content, in one of four colour-combination conditions. Major…
Teaching E-Commerce Web Page Evaluation and Design: A Pilot Study Using Tourism Destination Sites
ERIC Educational Resources Information Center
Susser, Bernard; Ariga, Taeko
2006-01-01
This study explores a teaching method for improving business students' skills in e-commerce page evaluation and making Web design majors aware of business content issues through cooperative learning. Two groups of female students at a Japanese university studying either tourism or Web page design were assigned tasks that required cooperation to…
ERIC Educational Resources Information Center
Kammerer, Yvonne; Kalbfell, Eva; Gerjets, Peter
2016-01-01
In two experiments we systematically examined whether contradictions between two web pages--of which one was commercially biased as stated in an "about us" section--stimulated university students' consideration of source information both during and after reading. In Experiment 1 "about us" information of the web pages was…
Dental practice websites: creating a Web presence.
Miller, Syrene A; Forrest, Jane L
2002-07-01
Web technology provides an opportunity for dentists to showcase their practice philosophy, quality of care, office setting, and staff in a creative manner. Having a Website provides a practice with innovative and cost-effective communications and marketing tools for current and potential patients who use the Internet. The main benefits of using a Website to promote one's practice are: making office time more productive, tasks more timely, and follow-up less necessary; engaging patients in an interactive and visual learning process; providing online forms and procedure examples for patients; projecting a competent and current image; and tracking the usage of Web pages. Several options are available when considering the development of a Website. These options range in cost based on customization of the site and ongoing support services, such as site updates, technical assistance, and Web usage statistics. In most cases, Websites are less expensive than advertising in the phone book. Options in creating a Website include building one's own, employing a company that offers Website templates, and employing a company that offers customized sites. These development options and benefits will continue to grow as individuals access the Web and more information and sites become available.
Information about epilepsy on the internet: An exploratory study of Arabic websites.
Alkhateeb, Jamal M; Alhadidi, Muna S
2018-01-01
The aim of this study was to explore information about epilepsy found on Arabic websites. The researchers collected information from the internet between November 2016 and January 2017. Information was obtained using Google and Yahoo search engines. Keywords used were the Arabic equivalent of the following two keywords: epilepsy (Al-saraa) and convulsion (Tashanoj). A total of 144 web pages addressing epilepsy in Arabic were reviewed. The majority of web pages were websites of medical institutions and general health websites, followed by informational and educational websites, others, blogs and websites of individuals, and news and media sites. Topics most commonly addressed were medical treatments for epilepsy (50% of all pages) followed by epilepsy definition (41%) and epilepsy etiology (34.7%). The results also revealed that the vast majority of web pages did not mention the source of information. Many web pages also did not provide author information. Only a small proportion of the web pages provided adequate information. Relatively few web pages provided inaccurate information or made sweeping generalizations. As a result, it is concluded that the findings of the present study suggest that development of more credible Arabic websites on epilepsy is needed. These websites need to go beyond basic information, offering more evidence-based and updated information about epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.
An Ant Colony Optimization Based Feature Selection for Web Page Classification
2014-01-01
The increased popularity of the web has caused the inclusion of a huge amount of information on the web, and as a result of this explosive information growth, automated web page classification systems are needed to improve search engines' performance. Web pages have a large number of features, such as HTML/XML tags, URLs, hyperlinks, and text contents, that should be considered during an automated classification process. The aim of this study is to reduce the number of features used, to improve the runtime and accuracy of web page classification. In this study, we used an ant colony optimization (ACO) algorithm to select the best features, and then we applied the well-known C4.5, naive Bayes, and k-nearest-neighbor classifiers to assign class labels to web pages. We used the WebKB and Conference datasets in our experiments, and we showed that using ACO for feature selection improves both accuracy and runtime performance of classification. We also showed that the proposed ACO-based algorithm can select better features compared with the well-known information gain and chi-square feature selection methods. PMID:25136678
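A minimal sketch of the ant-colony idea behind such a feature selector (not the paper's exact algorithm): pheromone trails bias which features each ant picks, and trails on features found in good subsets are reinforced. The fitness function and all parameters here are illustrative stand-ins for classifier accuracy.

```javascript
// Small deterministic PRNG so the sketch is reproducible.
function lcg(seed) {
  let s = seed >>> 0;
  return () => ((s = (s * 1664525 + 1013904223) >>> 0) / 2 ** 32);
}

function acoFeatureSelect({ numFeatures, numAnts, iterations, subsetSize, fitness, seed = 42 }) {
  const rand = lcg(seed);
  const pheromone = new Array(numFeatures).fill(1.0);
  let best = { subset: [], score: -Infinity };

  for (let it = 0; it < iterations; it++) {
    for (let ant = 0; ant < numAnts; ant++) {
      // Roulette-wheel selection of `subsetSize` distinct features,
      // weighted by pheromone level.
      const subset = [];
      const avail = [...pheromone.keys()];
      while (subset.length < subsetSize) {
        const total = avail.reduce((s, f) => s + pheromone[f], 0);
        let r = rand() * total;
        let picked = avail[avail.length - 1];
        for (const f of avail) {
          r -= pheromone[f];
          if (r <= 0) { picked = f; break; }
        }
        subset.push(picked);
        avail.splice(avail.indexOf(picked), 1);
      }
      const score = fitness(subset);
      if (score > best.score) best = { subset, score };
    }
    // Evaporate all trails, then deposit on the best subset found so far.
    for (let f = 0; f < numFeatures; f++) pheromone[f] *= 0.9;
    for (const f of best.subset) pheromone[f] += best.score;
  }
  return best;
}

// Toy fitness: pretend features 0 and 3 are the informative ones.
const best = acoFeatureSelect({
  numFeatures: 6, numAnts: 5, iterations: 20, subsetSize: 2,
  fitness: (s) => s.filter((f) => f === 0 || f === 3).length,
});
```

In the paper's setting the fitness would instead be the accuracy of a classifier (C4.5, naive Bayes, or k-NN) trained on the candidate feature subset.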
Creating library tutorials for nursing students.
Schroeder, Heidi
2010-04-01
This article describes one librarian's experiences with creating, promoting, and assessing online library tutorials. Tutorials were designed to provide on-demand and accessible library instruction to nursing students at Michigan State University. Topics for tutorials were chosen based on the librarian's liaison experiences and suggestions from nursing faculty. The tutorials were created using Camtasia and required the application of several tools and techniques. Tutorials were promoted through Web pages, the ANGEL course management system, blog posts, librarian interactions, e-mails, and more. In order to assess the tutorials' perceived effectiveness, feedback was gathered using a short survey. Future plans for the nursing tutorials project are also discussed.
Using JavaScript and the FDSN web service to create an interactive earthquake information system
NASA Astrophysics Data System (ADS)
Fischer, Kasper D.
2015-04-01
The FDSN web service provides a web interface to access earthquake metadata (e.g. event or station information) and waveform data over the internet. Requests are sent to a server as URLs and the output is either XML or miniSEED. This makes it hard for humans to read but easy to process with different software. Several data centers already support the FDSN web service, e.g. USGS, IRIS, and ORFEUS. The FDSN web service is also part of the SeisComP3 (http://www.seiscomp3.org) software. The Seismological Observatory of the Ruhr-University switched to SeisComP3 as its standard software for the analysis of mining-induced earthquakes at the beginning of 2014. This made it necessary to create a new web-based earthquake information service for the publication of results to the general public. This has been done by processing the output of an FDSN web service query with JavaScript running in a standard browser. The result is an interactive map presenting the observed events, with further information on events and stations, on a single web page as a table and on a map. In addition, the user can download event information, waveform data, and station data in different formats such as miniSEED, QuakeML, or FDSN StationXML. The developed code and all libraries used are open source and freely available.
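A sketch of how such a browser script might assemble an fdsnws-event query URL; the host name here is a placeholder (real deployments at USGS, IRIS, or ORFEUS expose the same interface under their own hosts), and the rendering step is only indicated in a comment.

```javascript
// Build a query URL for the standard FDSN event web service.
function buildEventQuery(baseUrl, params) {
  const query = new URLSearchParams({ format: 'text', ...params });
  return `${baseUrl}/fdsnws/event/1/query?${query.toString()}`;
}

const url = buildEventQuery('https://service.example.org', {
  starttime: '2014-01-01',
  endtime: '2014-12-31',
  minmagnitude: '1.0',
});

// In the browser, the response could then be fetched and rendered into
// a table or map, e.g.:
//   fetch(url).then((r) => r.text()).then(renderEventTable);
```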
MedlinePlus Connect: Technical Information
... Service Technical Information Page MedlinePlus Connect Implementation Options Web Application How does it work? Responds to requests ... examples of MedlinePlus Connect Web Application response pages. Web Service How does it work? Responds to requests ...
78 FR 42775 - CGI Federal, Inc., and Custom Applications Management; Transfer of Data
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-17
... develop applications, Web sites, Web pages, web-based applications and databases, in accordance with EPA policies and related Federal standards and procedures. The Contractor will provide [[Page 42776
Network Update: Plug-Ins, Forms and All That Java.
ERIC Educational Resources Information Center
Higgins, Chris
1997-01-01
Notes that the desire to make World Wide Web (WWW) pages more interactive and laden with animation, sound, and video brings us to the threshold of the deeper levels of Web page creation. Lists and discusses resources available on the WWW that will aid in learning and using these dynamic functions for Web page development to assist in interactive…
Strong regularities in world wide web surfing
Huberman; Pirolli; Pitkow; Lukose
1998-04-03
One of the most common modes of accessing information in the World Wide Web is surfing from one document to another along hyperlinks. Several large empirical studies have revealed common patterns of surfing behavior. A model that assumes that users make a sequence of decisions to proceed to another page, continuing as long as the value of the current page exceeds some threshold, yields the probability distribution for the number of pages that a user visits within a given Web site. This model was verified by comparing its predictions with detailed measurements of surfing patterns. The model also explains the Zipf-like distributions in page hits observed at Web sites.
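The threshold model described above can be sketched as a simple random walk: the value of the next page is the current value plus a random increment, and the user stops surfing once it drops below a threshold. All parameters below are illustrative, not fitted values from the study.

```javascript
// Deterministic PRNG so the simulation is reproducible.
function lcg(seed) {
  let s = seed >>> 0;
  return () => ((s = (s * 1664525 + 1013904223) >>> 0) / 2 ** 32);
}

// One surfing session: keep clicking while perceived value stays above
// the threshold (capped at 1000 pages for safety).
function surfSession(rand, { start = 1.0, drift = -0.05, noise = 0.5, threshold = 0 } = {}) {
  let value = start;
  let pages = 1; // the user always sees at least the first page
  while (value > threshold && pages < 1000) {
    value += drift + noise * (rand() * 2 - 1); // random increment per click
    pages += 1;
  }
  return pages;
}

const rand = lcg(7);
const lengths = Array.from({ length: 5000 }, () => surfSession(rand));
const mean = lengths.reduce((a, b) => a + b, 0) / lengths.length;
// The distribution of `lengths` is skewed: most sessions are short, a few
// are very long, qualitatively matching the heavy-tailed fit in the paper.
```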
Digital Ethnography: Library Web Page Redesign among Digital Natives
ERIC Educational Resources Information Center
Klare, Diane; Hobbs, Kendall
2011-01-01
Presented with an opportunity to improve Wesleyan University's dated library home page, a team of librarians employed ethnographic techniques to explore how its users interacted with Wesleyan's current library home page and web pages in general. Based on the data that emerged, a group of library staff and members of the campus' information…
Experience versus talent shapes the structure of the Web
Kong, Joseph S.; Sarshar, Nima; Roychowdhury, Vwani P.
2008-01-01
We use sequential large-scale crawl data to empirically investigate and validate the dynamics that underlie the evolution of the structure of the web. We find that the overall structure of the web is defined by an intricate interplay between experience or entitlement of the pages (as measured by the number of inbound hyperlinks a page already has), inherent talent or fitness of the pages (as measured by the likelihood that someone visiting the page would give a hyperlink to it), and the continual high rates of birth and death of pages on the web. We find that the web is conservative in judging talent and the overall fitness distribution is exponential, showing low variability. The small variance in talent, however, is enough to lead to experience distributions with high variance: The preferential attachment mechanism amplifies these small biases and leads to heavy-tailed power-law (PL) inbound degree distributions over all pages, as well as over pages that are of the same age. The balancing act between experience and talent on the web allows newly introduced pages with novel and interesting content to grow quickly and surpass older pages. In this regard, it is much like what we observe in high-mobility and meritocratic societies: People with entitlement continue to have access to the best resources, but there is just enough screening for fitness that allows for talented winners to emerge and join the ranks of the leaders. Finally, we show that the fitness estimates have potential practical applications in ranking query results. PMID:18779560
Tool independence for the Web Accessibility Quantitative Metric.
Vigo, Markel; Brajnik, Giorgio; Arrue, Myriam; Abascal, Julio
2009-07-01
The Web Accessibility Quantitative Metric (WAQM) aims at accurately measuring the accessibility of web pages. One of the main features of WAQM is that it is evaluation-tool independent in ranking and accessibility-monitoring scenarios. This article proposes a method to attain evaluation-tool independence for all foreseeable scenarios. After demonstrating that homepages have a more similar error profile than any other web page in a given web site, 15 homepages were measured with 10,000 different values of WAQM parameters using EvalAccess and LIFT, two automatic evaluation tools for accessibility. A similar procedure was followed with random pages and with several test files, obtaining several tuples that minimise the difference between both tools. One thousand four hundred forty-nine web pages from 15 web sites were measured with these tuples, and those values that minimised the difference between the tools were selected. Once the WAQM was tuned, the accessibility of 15 web sites was measured with two metrics for web sites, concluding that even if similar values can be produced, obtaining the same scores is not feasible since evaluation tools behave in different ways.
NASA Astrophysics Data System (ADS)
Herrera, Francisco Javier, Jr.
This study set out to examine how a web-based tool embedded with vocabulary strategies, as part of the science curriculum in a third grade two-way immersion classroom, would aid students' academic vocabulary development. Fourteen students (seven boys, seven girls; ten of which were English learners) participated in this study. Students utilized web pages as part of their science curriculum on the topic of ecology. The study documented students' use of the web pages as a data-gathering tool on the topic of ecology during science instruction. Students were video and audio taped as they explored the web pages. Results indicated that through the use of the intervention web pages students significantly improved their knowledge of academic English target words.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... process.'' 2. On Web page 15564, in the third column, first paragraph, remove the fourth sentence. 3. On Web page 15566, in the first column, fourth paragraph, second sentence, revise ``(b)(2)(ii) to read... relating to the authority of OPM's Office of Inspector General.'' Sec. 800.20 [Corrected] 0 13. On Web page...
NASA Technical Reports Server (NTRS)
Garcia, Joseph A.; Smith, Charles A. (Technical Monitor)
1998-01-01
The document consists of a publicly available web site (george.arc.nasa.gov) for Joseph A. Garcia's personal web pages in the AI division. Only general information is posted, with no technical material. All the information is unclassified.
Eng, J
1997-01-01
Java is a programming language that runs on a "virtual machine" built into World Wide Web (WWW)-browsing programs on multiple hardware platforms. Web pages were developed with Java to enable Web-browsing programs to overlay transparent graphics and text on displayed images so that the user could control the display of labels and annotations on the images, a key feature not available with standard Web pages. This feature was extended to include the presentation of normal radiologic anatomy. Java programming was also used to make Web browsers compatible with the Digital Imaging and Communications in Medicine (DICOM) file format. By enhancing the functionality of Web pages, Java technology should provide greater incentive for using a Web-based approach in the development of radiology teaching material.
Contextual advertisement placement in printed media
NASA Astrophysics Data System (ADS)
Liu, Sam; Joshi, Parag
2010-02-01
Advertisements today provide the necessary revenue model supporting the WWW ecosystem. Targeted or contextual ad insertion plays an important role in optimizing the financial return of this model. Nearly all current ads that appear on web sites are geared for display purposes such as banners and "pay-per-click". Little attention, however, is focused on deriving additional ad revenue when the content is repurposed for an alternative means of presentation, e.g. being printed. Although more and more content is moving to the Web, there are still many occasions where printed output of web content is desirable, such as maps and articles; thus printed ad insertion can potentially be lucrative. In this paper, we describe a contextual ad insertion network aimed at realizing new revenue for print service providers for web printing. We introduce a cloud print service that enables contextual ad insertion, with respect to the main web page content, when a printout of the page is requested. To encourage service utilization, it would provide higher quality printouts than what is possible from current browser print drivers, which generally produce poor outputs, e.g. ill-formatted pages. At this juncture we will limit the scope to article-related web pages, although the concept can be extended to arbitrary web pages. The key components of this system include (1) the extraction of the article from web pages, (2) the extraction of semantics from the article, (3) querying the ad database for matching advertisements or coupons, and (4) joint content and ad layout for print outputs.
Using the web to validate document recognition results: experiments with business cards
NASA Astrophysics Data System (ADS)
Oertel, Clemens; O'Shea, Shauna; Bodnar, Adam; Blostein, Dorothea
2004-12-01
The World Wide Web is a vast information resource which can be useful for validating the results produced by document recognizers. Three computational steps are involved, all of them challenging: (1) use the recognition results in a Web search to retrieve Web pages that contain information similar to that in the document, (2) identify the relevant portions of the retrieved Web pages, and (3) analyze these relevant portions to determine what corrections (if any) should be made to the recognition result. We have conducted exploratory implementations of steps (1) and (2) in the business-card domain: we use fields of the business card to retrieve Web pages and identify the most relevant portions of those Web pages. In some cases, this information appears suitable for correcting OCR errors in the business card fields. In other cases, the approach fails due to stale information: when business cards are several years old and the business-card holder has changed jobs, then websites (such as the home page or company website) no longer contain information matching that on the business card. Our exploratory results indicate that in some domains it may be possible to develop effective means of querying the Web with recognition results, and to use this information to correct the recognition results and/or detect that the information is stale.
A step-by-step solution for embedding user-controlled cines into educational Web pages.
Cornfeld, Daniel
2008-03-01
The objective of this article is to introduce a simple method for embedding user-controlled cines into a Web page using a simple JavaScript. Step-by-step instructions are included and the source code is made available. This technique allows the creation of portable Web pages that allow the user to scroll through cases as if seated at a PACS workstation. A simple JavaScript allows scrollable image stacks to be included on Web pages. With this technique, you can quickly and easily incorporate entire stacks of CT or MR images into online teaching files. This technique has the potential for use in case presentations, online didactics, teaching archives, and resident testing.
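The core index-handling logic of such a scrollable stack might look like the sketch below (with hypothetical file names, not the article's actual script): wheel events change an index into an array of image URLs, and only the image at the current index is shown.

```javascript
// Minimal cine controller: holds the current slice index and clamps it
// so the user stops at the first/last slice, as on a PACS workstation,
// rather than wrapping around.
function makeCine(imageUrls) {
  let index = 0;
  return {
    current: () => imageUrls[index],
    scroll(delta) {
      index = Math.min(imageUrls.length - 1, Math.max(0, index + delta));
      return imageUrls[index];
    },
  };
}

const cine = makeCine(['ct_001.jpg', 'ct_002.jpg', 'ct_003.jpg']);
cine.scroll(1); // advance one slice
cine.scroll(5); // clamps at the last slice
```

In a web page this would be wired to an `<img>` element, roughly: `img.addEventListener('wheel', (e) => { img.src = cine.scroll(Math.sign(e.deltaY)); e.preventDefault(); });`.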
Creating a course-based web site in a university environment
NASA Astrophysics Data System (ADS)
Robin, Bernard R.; Mcneil, Sara G.
1997-06-01
The delivery of educational materials is undergoing a remarkable change from the traditional lecture method to dissemination of courses via the World Wide Web. This paradigm shift from a paper-based structure to an electronic one has profound implications for university faculty. Students are enrolling in classes with the expectation of using technology and logging on to the Internet, and professors are realizing that the potential of the Web can have a significant impact on classroom activities. An effective method of integrating electronic technologies into teaching and learning is to publish classroom materials on the World Wide Web. Already, many faculty members are creating their own home pages and Web sites for courses that include syllabi, handouts, and student work. Additionally, educators are finding value in adding hypertext links to a wide variety of related Web resources from online research and electronic journals to government and commercial sites. A number of issues must be considered when developing course-based Web sites. These include meeting the needs of a target audience, designing effective instructional materials, and integrating graphics and other multimedia components. There are also numerous technical issues that must be addressed in developing, uploading and maintaining HTML documents. This article presents a model for a university faculty who want to begin using the Web in their teaching and is based on the experiences of two College of Education professors who are using the Web as an integral part of their graduate courses.
Web-based surveillance of public information needs for informing preconception interventions.
D'Ambrosio, Angelo; Agricola, Eleonora; Russo, Luisa; Gesualdo, Francesco; Pandolfi, Elisabetta; Bortolus, Renata; Castellani, Carlo; Lalatta, Faustina; Mastroiacovo, Pierpaolo; Tozzi, Alberto Eugenio
2015-01-01
The risk of adverse pregnancy outcomes can be minimized through the adoption of healthy lifestyles before pregnancy by women of childbearing age. Initiatives for promotion of preconception health may be difficult to implement. The Internet can be used to build tailored health interventions through identification of the public's information needs. To this aim, we developed a semi-automatic web-based system for monitoring Google searches, web pages, and activity on social networks regarding preconception health. Based on the American College of Obstetricians and Gynecologists guidelines and on the actual search behaviors of Italian Internet users, we defined a set of keywords targeting preconception care topics. Using these keywords, we analyzed the usage of the Google search engine and identified web pages containing preconception care recommendations. We also monitored how the selected web pages were shared on social networks. We analyzed discrepancies between searched and published information and the sharing pattern of the topics. We identified 1,807 Google search queries which generated a total of 1,995,030 searches during the study period. Less than 10% of the reviewed pages contained preconception care information, and in 42.8% the information was consistent with ACOG guidelines. Facebook was the most used social network for sharing. Nutrition, Chronic Diseases and Infectious Diseases were the most published and searched topics. Regarding Genetic Risk and Folic Acid, a high search volume was not associated with a high web page production, while Medication pages were more frequently published than searched. Vaccinations elicited high sharing although web page production was low; this effect was quite variable in time. Our study represents a resource to prioritize communication on specific topics on the web, to address misconceptions, and to tailor interventions to specific populations.
Virtual Brain Bank a public collection of classified head MRI
NASA Astrophysics Data System (ADS)
Barrios, Fernando A.
2000-10-01
In this work I present the effort at the Neurobiology Center to create a digital Brain Bank, a collection of well-classified human brains used for teaching and research. The bank will be based on a collection of high-resolution three-dimensional head MRI. For this reason the bank is being named "virtual" and will eventually be publicly accessible through a Web page on the Internet.
Using Social Media Tools to Enhance Tacit Knowledge Sharing Within the USMC
2013-09-01
Officer xviii THIS PAGE INTENTIONALLY LEFT BLANK xix ACKNOWLEDGMENTS I would like to offer my deepest gratitude to my beautiful ...KM and organizational behavior blogger , emphasizes PKM as a critical piece to any organization’s KM architecture (Figure 11) (Jarche, 2013...publishing, and opened the floodgates to bloggers eager to create content for the masses. The enormous collection of blogs currently on the World Wide Web
Occupational Survey Report. Visual Information, AFSC 3V0X1
2000-04-01
of the career ladder include: Scan artwork using flatbed scanners Convert graphic file formats Design layouts Letter certificates using laser...Design layouts Scan artwork using flatbed scanners Produce artwork using mouse or digitizing tablets Design and produce imagery for web pages Produce...DAFSC 3V031 PERSONNEL TASKS A0034 Scan artwork using flatbed scanners C0065 Design layouts A0004 Convert graphic file formats A0006 Create
Reliability and type of consumer health documents on the World Wide Web: an annotation study.
Martin, Melanie J
2011-01-01
In this paper we present a detailed scheme for annotating medical web pages designed for health care consumers. The annotation is along two axes: first, by reliability (the extent to which the medical information on the page can be trusted); second, by the type of page (patient leaflet, commercial, link, medical article, testimonial, or support). We analyze inter-rater agreement among three judges for each axis. Inter-rater agreement was moderate (0.77 accuracy, 0.62 F-measure, 0.49 Kappa) on the page reliability axis and good (0.81 accuracy, 0.72 F-measure, 0.73 Kappa) along the page type axis. This study shows promising results: appropriate classes of pages can be developed and used by human annotators to annotate web pages with reasonable to good agreement.
Study on online community user motif using web usage mining
NASA Astrophysics Data System (ADS)
Alphy, Meera; Sharma, Ajay
2016-04-01
Web usage mining is the application of data mining to extract useful information from online communities. The World Wide Web contained at least 4.73 billion pages according to the Indexed Web, and at least 228.52 million pages according to the Dutch Indexed Web, as of Thursday, 6 August 2015. It is difficult to find the data one needs among these billions of web pages; hence the importance of web usage mining. Personalizing the search engine helps web users identify the most-used data in an easy way: it reduces time consumption and enables automatic site search and automatic restoration of useful sites. This study surveys the techniques used in pattern discovery and analysis in web usage mining, from early work in 1996 to the latest techniques of 2015. Analyzing user motifs helps in the improvement of business, e-commerce, personalisation, and websites.
ERIC Educational Resources Information Center
Fernandez-Cardenas, Juan Manuel
2008-01-01
This paper looks at the collaborative construction of web pages in History by a Year-4 group of children in a primary school in the UK. The aim of this paper is to find out: (a) How did children interpret their involvement in this literacy practice? (b) How the construction of web pages was interactionally accomplished? and (c) How can creativity…
Textual and visual content-based anti-phishing: a Bayesian approach.
Zhang, Haijun; Liu, Gang; Chow, Tommy W S; Liu, Wenyin
2011-10-01
A novel framework using a Bayesian approach for content-based phishing web page detection is presented. Our model takes into account textual and visual contents to measure the similarity between the protected web page and suspicious web pages. A text classifier, an image classifier, and an algorithm fusing the results from classifiers are introduced. An outstanding feature of this paper is the exploration of a Bayesian model to estimate the matching threshold. This is required in the classifier for determining the class of the web page and identifying whether the web page is phishing or not. In the text classifier, the naive Bayes rule is used to calculate the probability that a web page is phishing. In the image classifier, the earth mover's distance is employed to measure the visual similarity, and our Bayesian model is designed to determine the threshold. In the data fusion algorithm, the Bayes theory is used to synthesize the classification results from textual and visual content. The effectiveness of our proposed approach was examined in a large-scale dataset collected from real phishing cases. Experimental results demonstrated that the text classifier and the image classifier we designed deliver promising results, the fusion algorithm outperforms either of the individual classifiers, and our model can be adapted to different phishing cases. © 2011 IEEE
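The text-classification step can be illustrated with a minimal naive Bayes sketch over word counts, with Laplace smoothing (the training data below is a toy stand-in; the paper's actual features and Bayesian threshold estimation are richer).

```javascript
// Train a naive Bayes model from labelled word lists.
// docs: [{ words: [...], label: 'phish' | 'legit' }, ...]
function trainNaiveBayes(docs) {
  const model = { counts: {}, totals: {}, docs: {}, vocab: new Set(), n: docs.length };
  for (const { words, label } of docs) {
    model.docs[label] = (model.docs[label] || 0) + 1;
    model.counts[label] = model.counts[label] || {};
    for (const w of words) {
      model.counts[label][w] = (model.counts[label][w] || 0) + 1;
      model.totals[label] = (model.totals[label] || 0) + 1;
      model.vocab.add(w);
    }
  }
  return model;
}

// Log prior plus log likelihood of each word under the label.
function logPosterior(model, words, label) {
  const V = model.vocab.size;
  let lp = Math.log(model.docs[label] / model.n);
  for (const w of words) {
    const c = (model.counts[label][w] || 0) + 1; // Laplace smoothing
    lp += Math.log(c / (model.totals[label] + V));
  }
  return lp;
}

function classify(model, words) {
  const labels = Object.keys(model.docs);
  return labels.reduce((a, b) =>
    logPosterior(model, words, a) >= logPosterior(model, words, b) ? a : b);
}

const model = trainNaiveBayes([
  { words: ['verify', 'account', 'password'], label: 'phish' },
  { words: ['urgent', 'account', 'suspended'], label: 'phish' },
  { words: ['quarterly', 'report', 'attached'], label: 'legit' },
  { words: ['meeting', 'agenda', 'attached'], label: 'legit' },
]);
```

The paper's framework additionally fuses this textual score with a visual-similarity score (earth mover's distance) before deciding whether a page is phishing.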
Fast access to the CMS detector condition data employing HTML5 technologies
NASA Astrophysics Data System (ADS)
Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo
2011-12-01
This paper focuses on using HTML version 5 (HTML5) for accessing condition data for the CMS experiment, evaluating the benefits and risks posed by the use of this technology. According to the authors of HTML5, this technology attempts to solve issues found in previous iterations of HTML and addresses the needs of web applications, an area previously not adequately covered by HTML. We demonstrate that employing HTML5 brings important benefits in terms of access performance to the CMS condition data. The combined use of web storage and web sockets increases performance and reduces costs in terms of computational power, memory usage, and network bandwidth for client and server. Above all, web workers allow scripts to be executed in multiple threads, exploiting multi-core microprocessors. Web workers have been employed to substantially decrease the time needed to render the web page that displays the condition data stored in the CMS condition database.
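The web-storage benefit can be sketched as a simple cache in front of the condition-data request: a repeated page load serves the data locally and skips the network round trip. A Map-backed object stands in for `window.localStorage` so the sketch runs anywhere, and all names (the fetch function, the key) are illustrative.

```javascript
// Stand-in for window.localStorage with the same getItem/setItem shape.
function makeStorage() {
  const m = new Map();
  return {
    getItem: (k) => (m.has(k) ? m.get(k) : null),
    setItem: (k, v) => m.set(k, String(v)),
  };
}

let fetchCount = 0;
function fetchConditionData(key) { // stand-in for the real server request
  fetchCount += 1;
  return JSON.stringify({ key, payload: 'condition-data' });
}

// Serve from web storage when possible; otherwise fetch and cache.
function cachedFetch(storage, key) {
  const hit = storage.getItem(key);
  if (hit !== null) return JSON.parse(hit); // cache hit: no network
  const raw = fetchConditionData(key);
  storage.setItem(key, raw);
  return JSON.parse(raw);
}

const storage = makeStorage();
cachedFetch(storage, 'run-180252'); // goes to the "server"
cachedFetch(storage, 'run-180252'); // served from storage; no second request
```

In the browser the same pattern applies directly with `window.localStorage`, and the parsing/rendering work can be pushed into a web worker so the page stays responsive.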
Currò, Vincenzo; Buonuomo, Paola Sabrina; Zambiano, Annaluce; Vituzzi, Andrea; Onesimo, Roberta; D'Atri, Alessandro
2007-01-01
The aim of this study is to verify the usefulness for parents of a web evaluation framework composed of ten quality criteria intended to improve their ability to assess the quality of medical web sites. We conducted a randomised controlled trial that included two groups of parents who independently evaluated five paediatric web sites by filling out two distinct questionnaires: group A with the evaluation framework, group B without it. Forty volunteers were recruited from parents attending the General Paediatrics Out-patients Department who satisfied the following eligibility criteria: Internet users, at least one child under 12 months old, and no professional skills in Internet use or medicine. The survey was conducted between February 2, 2000 and March 22, 2000. Parents evaluated each web site and assigned a score, which was compared with a gold standard created by a group of experts. Suggesting evaluation criteria to parents seems useful for improving their ability to evaluate web sites.
G6PDdb, an integrated database of glucose-6-phosphate dehydrogenase (G6PD) mutations.
Kwok, Colin J; Martin, Andrew C R; Au, Shannon W N; Lam, Veronica M S
2002-03-01
G6PDdb (http://www.rubic.rdg.ac.uk/g6pd/ or http://www.bioinf.org.uk/g6pd/) is a newly created web-accessible locus-specific mutation database for the human Glucose-6-phosphate dehydrogenase (G6PD) gene. The relational database integrates up-to-date mutational and structural data from various databanks (GenBank, Protein Data Bank, etc.) with biochemically characterized variants and their associated phenotypes obtained from published literature and the Favism website. An automated analysis of the mutations likely to have a significant impact on the structure of the protein has been performed using a recently developed procedure. The database may be queried online and the full results of the analysis of the structural impact of mutations are available. The web page provides a form for submitting additional mutation data and is linked to resources such as the Favism website, OMIM, HGMD, HGVBASE, and the PDB. This database provides insights into the molecular aspects and clinical significance of G6PD deficiency for researchers and clinicians and the web page functions as a knowledge base relevant to the understanding of G6PD deficiency and its management. Copyright 2002 Wiley-Liss, Inc.
ERIC Educational Resources Information Center
Mitsuhara, Hiroyuki; Kurose, Yoshinobu; Ochi, Youji; Yano, Yoneo
The authors developed a Web-based Adaptive Educational System (Web-based AES) named ITMS (Individualized Teaching Material System). ITMS adaptively integrates knowledge on the distributed Web pages and generates individualized teaching material that has various contents. ITMS also presumes the learners' knowledge levels from the states of their…
A Neophyte Constructs a Web Site: Lessons Learned.
ERIC Educational Resources Information Center
Bent, Devin
1998-01-01
A political science professor at James Madison University (VA) constructed a Web page to support an undergraduate course in government. This article defines Web-site goals and audience, reviews other sites, and discusses organization of Web links and technical choices for HTML editor, page layout and use of image, audio, and video files. Stresses…
Assessing Greek Public Hospitals' Websites.
Tsirintani, Maria; Binioris, Spyros
2015-01-01
Following a previous (2011) survey, this study assesses the web pages of Greek public hospitals according to specific criteria drawn from the same web-page evaluation model. Our purpose is to demonstrate the evolution of hospitals' web pages and to document trends in e-health applications. Using descriptive methods, we found that public hospitals have made significant steps towards establishing and improving their web presence, but much work remains before they can take full advantage of new technologies in the e-health ecosystem.
The Potential of CGI: Using Pre-Built CGI Scripts to Make Interactive Web Pages.
ERIC Educational Resources Information Center
Nackerud, Shane A.
1998-01-01
Describes CGI (Common Gateway Interface) scripts that are available on the Web and explains how librarians can use them to make Web pages more interactive. Topics include CGI security; Perl scripts; UNIX; and HTML. (LRW)
Some Features of "Alt" Texts Associated with Images in Web Pages
ERIC Educational Resources Information Center
Craven, Timothy C.
2006-01-01
Introduction: This paper extends a series on summaries of Web objects, in this case, the alt attribute of image files. Method: Data were logged from 1894 pages from Yahoo!'s random page service and 4703 pages from the Google directory; an img tag was extracted randomly from each where present; its alt attribute, if any, was recorded; and the…
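The logging step this abstract describes (extracting a random img tag from each page and recording its alt attribute, if any) can be sketched with Python's standard html.parser; the sample page and helper names below are illustrative assumptions, not the author's instrumentation.

```python
import random
from html.parser import HTMLParser

class ImgAltCollector(HTMLParser):
    """Collect the alt attribute (if any) of every <img> tag on a page."""
    def __init__(self):
        super().__init__()
        self.alts = []  # one entry per img tag; None when alt is absent

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.alts.append(dict(attrs).get("alt"))

def random_img_alt(page_html):
    """Return the alt text of one randomly chosen img tag, or None."""
    parser = ImgAltCollector()
    parser.feed(page_html)
    if not parser.alts:
        return None
    return random.choice(parser.alts)

# Hypothetical page: one image with alt text, one without.
page = '<p>Hi</p><img src="a.png" alt="logo"><img src="b.png">'
```

Recording None when the attribute is missing (as opposed to an empty string for alt="") preserves the distinction between absent and empty alt texts that such a study would need.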
News from Online: What's New with Chime?
NASA Astrophysics Data System (ADS)
Dorland, Liz
2002-07-01
The Chime plugin (pronounced like the bells) provides a simple route to presenting interactive molecular structures to students via the Internet or in classroom presentations. Small inorganic molecules, ionic structures, organic molecules and giant macromolecules can all be viewed in several formats including ball and stick and spacefilling. Extensive Chime resources on the Internet allow chemistry and biochemistry instructors to create their own Web pages or to use some of the many tutorials for students already online. This article describes about twenty Chime-based Web sites in three categories: Chime Resources, Materials for Student and Classroom Use, and Structure Databases. A list of links is provided.
Enriching the trustworthiness of health-related web pages.
Gaudinat, Arnaud; Cruchet, Sarah; Boyer, Celia; Chrawdhry, Pravir
2011-06-01
We present an experimental mechanism for enriching web content with quality metadata. This mechanism is based on a simple and well-known initiative in the field of the health-related web, the HONcode. The Resource Description Framework (RDF) format and the Dublin Core Metadata Element Set were used to formalize these metadata. The model of trust proposed is based on a quality model for health-related web pages that has been tested in practice over a period of thirteen years. Our model has been explored in the context of a project to develop a research tool that automatically detects the occurrence of quality criteria in health-related web pages.
Manole, Bogdan-Alexandru; Wakefield, Daniel V; Dove, Austin P; Dulaney, Caleb R; Marcrom, Samuel R; Schwartz, David L; Farmer, Michael R
2017-12-24
The purpose of this study was to survey the accessibility and quality of prostate-specific antigen (PSA) screening information from National Cancer Institute (NCI) cancer center and public health organization Web sites. We surveyed the December 1, 2016, version of all 63 NCI-designated cancer center public Web sites and 5 major online clearinghouses from allied public/private organizations (cancer.gov, cancer.org, PCF.org, USPSTF.org, and CDC.gov). Web sites were analyzed according to a 50-item list of validated health care information quality measures. Web sites were graded by 2 blinded reviewers. Interrater agreement was confirmed by Cohen kappa coefficient. Ninety percent of Web sites addressed PSA screening. Cancer center sites covered 45% of topics surveyed, whereas organization Web sites addressed 70%. All organizational Web pages addressed the possibility of false-positive screening results; 41% of cancer center Web pages did not. Forty percent of cancer center Web pages also did not discuss next steps if a PSA test was positive. Only 6% of cancer center Web pages were rated by our reviewers as "superior" (eg, addressing >75% of the surveyed topics) versus 20% of organizational Web pages. Interrater agreement between our reviewers was high (kappa coefficient = 0.602). NCI-designated cancer center Web sites publish lower quality public information about PSA screening than sites run by major allied organizations. Nonetheless, information and communication deficiencies were observed across all surveyed sites. In an age of increasing patient consumerism, prospective prostate cancer patients would benefit from improved online PSA screening information from provider and advocacy organizations. Validated cancer patient Web educational standards remain an important, understudied priority. Copyright © 2018. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Ahlers, Dirk; Boll, Susanne
In recent years, the relation of Web information to a physical location has gained much attention. However, Web content today often carries only an implicit relation to a location. In this chapter, we present a novel location-based search engine that automatically derives spatial context from unstructured Web resources and allows for location-based search: our focused crawler applies heuristics to crawl and analyze Web pages that have a high probability of carrying a spatial relation to a certain region or place; the location extractor identifies the actual location information from the pages; our indexer assigns a geo-context to the pages and makes them available for a later spatial Web search. We illustrate the usage of our spatial Web search for location-based applications that provide information not only right-in-time but also right-on-the-spot.
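As a toy illustration of the location-extractor stage described above, a gazetteer lookup can assign a geo-context to unstructured page text; the three-entry gazetteer, the most-mentioned-place scoring rule, and the sample sentence are invented for illustration and are not the authors' heuristics.

```python
# Tiny illustrative gazetteer mapping place names to (lat, lon).
GAZETTEER = {
    "oldenburg": (53.14, 8.21),
    "bremen": (53.08, 8.80),
    "hamburg": (53.55, 9.99),
}

def extract_locations(page_text):
    """Return gazetteer places mentioned in a page, with mention counts."""
    hits = {}
    for word in page_text.lower().split():
        word = word.strip(".,;:!?")
        if word in GAZETTEER:
            hits[word] = hits.get(word, 0) + 1
    return hits

def geo_context(page_text):
    """Assign the page the coordinates of its most-mentioned place, if any."""
    hits = extract_locations(page_text)
    if not hits:
        return None
    best = max(hits, key=hits.get)
    return GAZETTEER[best]

# Hypothetical page text with an implicit spatial relation.
page = "Opening hours of the Oldenburg city museum. Oldenburg is near Bremen."
```

A real system would of course combine such lookups with the crawling heuristics and disambiguation the chapter describes; the sketch only shows the indexing idea of attaching one geo-context per page.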
CrazyEgg Reports for Single Page Analysis
CrazyEgg provides an in-depth look at visitor behavior on a single page. While you can use GA to do trend analysis of your web area, CrazyEgg helps diagnose the design of a single Web page by visually displaying all visitor clicks during a specified time period.
Radiology teaching file cases on the World Wide Web.
Scalzetti, E M
1997-08-01
The presentation of a radiographic teaching file on the World Wide Web can be enhanced by attending to principles of web design. Chief among these are appropriate control of page layout, minimization of the time required to download a page from the remote server, and provision for navigation within and among the web pages that constitute the site. Page layout is easily accomplished by the use of tables; column widths can be fixed to maintain an acceptable line length for text. Downloading time is minimized by rigorous editing and by optimal compression of image files; beyond this, techniques like preloading of images and specification of image width and height are also helpful. Navigation controls should be clear, consistent, and readily available.
Introduction to the world wide web.
Downes, P K
2007-05-12
The World Wide Web used to be nicknamed the 'World Wide Wait'. Now, thanks to high speed broadband connections, browsing the web has become a much more enjoyable and productive activity. Computers need to know where web pages are stored on the Internet, in just the same way as we need to know where someone lives in order to post them a letter. This section explains how the World Wide Web works and how web pages can be viewed using a web browser.
Yes! You Can Build a Web Site.
ERIC Educational Resources Information Center
Holzberg, Carol
2001-01-01
With specially formatted templates or simple Web page editors, teachers can lay out text and graphics in a work space resembling the interface of a word processor. Several options are presented to help teachers build Web sites. Free templates include Class Homepage Builder, AppliTools: HomePage, MySchoolOnline.com, and BigChalk.com. Web design…
Does content affect whether users remember that Web pages were hyperlinked?
Jones, Keith S; Ballew, Timothy V; Probst, C Adam
2008-10-01
We determined whether memory for hyperlinks improved when they represented relations between the contents of the Web pages. J. S. Farris (2003) found that memory for hyperlinks improved when they represented relations between the contents of the Web pages. However, Farris's (2003) participants could have used their knowledge of site content to answer questions about relations that were instantiated via the site's content and its hyperlinks. In Experiment 1, users navigated a Web site and then answered questions about relations that were instantiated only via content, only via hyperlinks, and via content and hyperlinks. Unlike Farris (2003), we split the latter into two sets. One asked whether certain content elements were related, and the other asked whether certain Web pages were hyperlinked. Experiment 2 replicated Experiment 1 with one modification: The questions that were asked about relations instantiated via content and hyperlinks were changed so that each question's wrong answer was also related to the question's target. Memory for hyperlinks improved when they represented relations instantiated within the content of the Web pages. This was true when (a) questions about content and hyperlinks were separated (Experiment 1) and (b) each question's wrong answer was also related to the question's target (Experiment 2). The accuracy of users' mental representations of local architecture depended on whether hyperlinks were related to the site's content. Designers who want users to remember hyperlinks should associate those hyperlinks with content that reflects the relation between the contents on the Web pages.
Wormhole: A Powerful Data Mashup
NASA Technical Reports Server (NTRS)
Widen, David
2011-01-01
The mobile platform is quickly becoming the standard way that users interact with online resources. The iOS operating system allows iPhone and iPad users to seamlessly access highly interactive web applications that until recently were only available via a desktop or laptop. Wormhole is an AJAX application implemented as a smart web widget that allows users to easily supplement web pages with data directly from the Instrument Operations Subsystems division (IOS) at JPL. It creates an interactive mashup from a website's core content, enhanced by dynamically retrieved images and metadata supplied by IOS using the webification API. Currently, this technology is limited in scope to NASA data; however, it can easily be augmented to serve many other needs. This web widget can be delivered in various ways, including as a bookmarklet. The underlying technology that powers Wormhole also has applications for other divisions as they run current missions.
World Wide Web Page Design: A Structured Approach.
ERIC Educational Resources Information Center
Gregory, Gwen; Brown, M. Marlo
1997-01-01
Describes how to develop a World Wide Web site based on structured programming concepts. Highlights include flowcharting, first page design, evaluation, page titles, documenting source code, text, graphics, and browsers. Includes a template for HTML writers, tips for using graphics, a sample homepage, guidelines for authoring structured HTML, and…
The Privilege of Ranking: Google Plays Ball.
ERIC Educational Resources Information Center
Wiggins, Richard
2003-01-01
Discussion of ranking systems used in various settings, including college football and academic admissions, focuses on the Google search engine. Explains the PageRank mathematical formula that scores Web pages by connecting the number of links; limitations, including authenticity and accuracy of ranked Web pages; relevancy; adjusting algorithms;…
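The PageRank idea summarized here (scoring a page by the links that point to it) is usually presented as an iterative computation; the sketch below, with an invented four-page link graph and the customary damping factor of 0.85, shows the mechanics, not Google's production algorithm.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank scores for a link graph.

    links maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets a base share, plus the damped rank
        # passed along by each page that links to it.
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * ranks[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        ranks = new
    return ranks

# Hypothetical four-page site: most pages link to 'home'.
graph = {
    "home": ["about"],
    "about": ["home"],
    "news": ["home"],
    "links": ["home", "news"],
}
ranks = pagerank(graph)
```

The limitations the article raises (authenticity, accuracy, relevancy) are precisely the things this formula does not see: it counts link structure only.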
XMM-Newton Remote Interface to Science Analysis Software: First Public Version
NASA Astrophysics Data System (ADS)
Ibarra, A.; Gabriel, C.
2011-07-01
We present the first public beta release of the XMM-Newton Remote Interface to Science Analysis (RISA) software, available through the official XMM-Newton web pages. In a nutshell, RISA is a web-based application that encapsulates the XMM-Newton data analysis software. The client identifies observations and creates XMM-Newton workflows. The server processes the client request, creates job templates and sends the jobs to a computer. RISA has been designed to help non-expert and professional XMM-Newton users alike. Thanks to the predefined threads, non-expert users can easily produce light curves and spectra; expert users, on the other hand, can use the full parameter interface to tune their own analysis. In both cases, the VO-compliant client/server design frees users from having to install any specific software to analyze XMM-Newton data.
Web-based X-ray quality control documentation.
David, George; Burnett, Lou Ann; Schenkel, Robert
2003-01-01
The department of radiology at the Medical College of Georgia Hospital and Clinics has developed an equipment quality control web site. Our goal is to provide immediate access to virtually all medical physics survey data. The web site is designed to assist equipment engineers, department management and technologists. By improving communications and access to equipment documentation, we believe productivity is enhanced. The creation of the quality control web site was accomplished in three distinct steps. First, survey data had to be placed in a computer format. The second step was to convert these various computer files to a format supported by commercial web browsers. Third, a comprehensive home page had to be designed to provide convenient access to the multitude of surveys done in the various x-ray rooms. Because we had spent years previously fine-tuning the computerization of the medical physics quality control program, most survey documentation was already in spreadsheet or database format. A major technical decision was the method of conversion of survey spreadsheet and database files into documentation appropriate for the web. After an unsatisfactory experience with a HyperText Markup Language (HTML) converter (packaged with spreadsheet and database software), we tried creating Portable Document Format (PDF) files using Adobe Acrobat software. This process preserves the original formatting of the document and takes no longer than conventional printing; therefore, it has been very successful. Although the PDF file generated by Adobe Acrobat is a proprietary format, it can be displayed through a conventional web browser using the freely distributed Adobe Acrobat Reader program that is available for virtually all platforms. Once a user installs the software, it is automatically invoked by the web browser whenever the user follows a link to a file with a PDF extension. 
Although no confidential patient information is available on the web site, our legal department recommended that we secure the site in order to keep out those wishing to make mischief. Our interim solution has been not to password-protect the page, which we feared would hinder access for occasional legitimate users, but simply not to provide links to it from other hospital and department pages. Utility and productivity were improved, and time and money were saved, by making radiological equipment quality control documentation instantly available on-line.
ERIC Educational Resources Information Center
Carpi, Anthony
2001-01-01
Explains the advantages of using the World Wide Web as an educational tool and describes the Natural Science Pages project which is a teaching module involving Internet access and Web use and aiming to improve student achievement. (Contains 13 references.) (YDS)
NASA Astrophysics Data System (ADS)
Bloom, Jeffrey A.; Alonso, Rafael
2003-06-01
There are two primary challenges to monitoring the Web for steganographic media: finding suspect media and examining those found. The challenge that has received a great deal of attention is the second of these, the steganalysis problem. The other challenge, and one that has received much less attention, is the search problem. How does the steganalyzer get the suspect media in the first place? This paper describes an innovative method and architecture to address this search problem. The typical approaches to searching the web for covert communications are often based on the concept of "crawling" the Web via a smart "spider." Such spiders find new pages by following ever-expanding chains of links from one page to many next pages. Rather than seek pages by chasing links from other pages, we find candidate pages by identifying requests to access pages. To do this we monitor traffic on Internet backbones, identify and log HTTP requests, and use this information to guide our process. Our approach has the advantages that we examine pages to which no links exist, we examine pages as soon as they are requested, and we concentrate resources only on active pages, rather than examining pages that are never viewed.
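The request-driven discovery the authors describe can be illustrated by turning logged HTTP request lines into candidate URLs for the steganalyzer; the log format and sample entries below are assumptions for illustration, not the paper's actual backbone-monitoring architecture.

```python
def requested_urls(log_lines):
    """Extract candidate page URLs from logged HTTP GET request lines.

    Each log line is assumed (for illustration) to look like:
        GET /path HTTP/1.1 host=example.com
    Only GET requests become candidates; other methods are skipped.
    """
    urls = []
    for line in log_lines:
        parts = line.split()
        if len(parts) == 4 and parts[0] == "GET" and parts[3].startswith("host="):
            host = parts[3][len("host="):]
            urls.append("http://" + host + parts[1])
    return urls

# Hypothetical traffic log: two page fetches and one form post.
log = [
    "GET /hidden/page.html HTTP/1.1 host=example.com",
    "POST /form HTTP/1.1 host=example.com",
    "GET /img/photo.jpg HTTP/1.jpg host=media.example.org".replace(".jpg h", ".1 h"),
]
candidates = requested_urls(log)
```

Note how this captures the paper's key advantage: /hidden/page.html becomes a candidate the moment someone requests it, even if no page anywhere links to it.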
User preference as quality markers of paediatric web sites.
Hernández-Borges, Angel A; Macías-Cervi, Pablo; Gaspar-Guardado, Asunción; Torres-Alvarez De Arcaya, María Luisa; Ruíz-Rabaza, Ana; Jiménez-Sosa, Alejandro
2003-09-01
Little is known about the ability of internet users to distinguish the best medical resources online, and how their preferences, measured by usage and popularity indexes, correlate with established quality criteria. Our objective was to analyse whether the number of inbound links and/or daily visits to a sample of paediatric web pages are reliable quality markers of the pages. Two-year follow-up study of 363 web pages with paediatric information. The number of inbound links and the average number of daily visits to the pages were calculated on a yearly basis. In addition, their rates of compliance with the codes of conduct, guidelines and/or principles of three international organizations were evaluated. The quality code most widely met by the sample web pages was the Health on the Net Foundation Code of Conduct (overall rate, 60.2%). Sample pages showed a low degree of compliance with principles related to privacy, confidentiality and electronic commerce (overall rate less than 45%). Most importantly, we observed a moderate, significant correlation between compliance with quality criteria and the number of inbound links (p < 0.001). However, no correlation was found between the number of daily visits to a page and its degree of compliance with the principles. Some indexes derived from the analysis of webmasters' hyperlinks could be reliable quality markers of medical web resources.
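The reported link–quality relationship can be illustrated with a plain Pearson correlation coefficient computed from scratch; the six (compliance score, inbound links) pairs below are invented for illustration and are not the study's data.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: per-page quality-criteria compliance vs. inbound links.
compliance = [3, 5, 2, 8, 7, 4]
inbound_links = [10, 25, 8, 60, 40, 15]
r = pearson(compliance, inbound_links)
```

A positive r, as in the study's inbound-link finding, says only that pages meeting more criteria tend to attract more links; the abstract's null result for daily visits would correspond to r near zero.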
New York Times Current News Physics Applications
NASA Astrophysics Data System (ADS)
Cise, John
2010-03-01
Since 2007 I have been using NYTimes current news articles, rich in graphics and physics variables, to develop edited one-page web physics questions (http://CisePhysics.homestead.com/files/NYT.htm) based on current events in the news. The home page listed above currently contains ten pages, each with about 40 one-page edited news-related physics articles containing rich graphics, graphic editions by the author, edited articles, an introduction to a question, questions, and answers. I use these web pages to introduce new physics concepts to students with current applications of those concepts in the news. I also use these one-page physics applications as pop quizzes and extra credit for students. As news happens (e.g., the 2010 Vancouver Olympics), I find the physics applications in the NYTimes articles and generate applications and questions. These new one-page applications with questions are added to the home page; the newest pages start with page 10 and work back in time to 9, 8, etc. The ten web pages, with about 40 news articles per page, are arranged in the traditional order: vectors, kinematics, projectiles, Newton, work and energy, properties of matter, fluids, temperature, heat, waves, and sound. This site is listed as a resource in AAPT's ComPADRE site.
A strategy for providing electronic library services to members of the AGATE Consortium
NASA Technical Reports Server (NTRS)
Thompson, J. Garth
1995-01-01
In November 1992, NASA Administrator Daniel Goldin established a Task Force to evaluate conditions that have led to the precipitous decline of the US General Aviation System and to recommend actions needed to re-establish US leadership in General Aviation. The Task Force Report and a report by Dr. Bruce J. Holmes, Manager of the General Aviation/Commuter Office at NASA Langley Research Center, provided the direction for the formation of the Advanced General Aviation Transport Experiments (AGATE), a consortium of government, industry and universities committed to the revitalization of the US General Aviation Industry. One of the recommendations of the Task Force Report was that 'a central repository of information should be created to disseminate NASA research as well as other domestic and foreign aeronautical research that has been accomplished, is ongoing or is planned... A user friendly environment should be created.' This paper describes technical and logistic issues and recommends a plan for providing technical information to members of the AGATE Consortium. It is recommended that the General Aviation office establish and maintain an electronic literature page on the AGATE server. This page should provide a user-friendly interface to the existing technical report and index servers identified in the report and listed in the Recommendations section. A page should also be provided that gives links to Web resources; a list of specific resources is provided in the Recommendations section. Links should also be provided to a page with tips on searching and to a form for feedback and suggestions from users about other resources. Finally, a page should be maintained that provides pointers to other resources, such as the LaRCsim workstation simulation software that is available from LaRC at no cost. Development of the Web is very dynamic.
These developments should be monitored regularly by the GA staff, and links to additional resources should be added to the server as they become available. A recommendation should be made to NASA Headquarters to establish logically central access to all of the NASA Technical Libraries, making these resources available to all NASA employees and to the AGATE Consortium.
Future Trends in Children's Web Pages: Probing Hidden Biases for Information Quality
ERIC Educational Resources Information Center
Kurubacak, Gulsun
2007-01-01
As global digital communication continues to flourish, Children's Web pages become more critical for children to realize not only the surface but also breadth and deeper meanings in presenting these milieus. These pages not only are very diverse and complex but also enable intense communication across social, cultural and political restrictions…
World Wide Web home page for the South Platte NAWQA
Qi, Sharon L.; Dennehy, Kevin F.
1997-01-01
A World Wide Web home page for the U.S. Geological Survey's (USGS) National Water-Quality Assessment (NAWQA) Program, South Platte River Basin study is now online. The home page includes information about the basinwide investigation and provides viewing and downloading access to physical, chemical, and biological data collected by the study team.
Evaluating Information Quality: Hidden Biases on the Children's Web Pages
ERIC Educational Resources Information Center
Kurubacak, Gulsun
2006-01-01
As global digital communication continues to flourish, the Children's Web pages become more critical for children to realize not only the surface but also breadth and deeper meanings in presenting these milieus. These pages not only are very diverse and complex but also enable intense communication across social, cultural and political…
The Four Levels of Web Site Development Expertise.
ERIC Educational Resources Information Center
Ingram, Albert L.
2000-01-01
Discusses the design of Web pages and sites and proposes a four-level model of Web development expertise that can serve as a curriculum overview or as a plan for an individual's professional development. Highlights include page design, media use, client-side processing, server-side processing, and site structure. (LRW)
Collaborative Design of World Wide Web Pages: A Case Study.
ERIC Educational Resources Information Center
Andrew, Paige G; Musser, Linda R.
1997-01-01
This case study of the collaborative design of an earth science World Wide Web page at Pennsylvania State University highlights the role of librarians. Discusses the original Web site and links, planning, the intended audience, and redesign and recommended changes; and considers the potential contributions of librarians. (LRW)
ERIC Educational Resources Information Center
Galica, Carol
1997-01-01
Provides an annotated bibliography of selected NASA Web sites for K-12 math and science teachers: the NASA Lewis Research Center Learning Technologies K-12 Home Page, Spacelink, NASA Quest, Basic Aircraft Design Page, International Space Station, NASA Shuttle Web Site, LIFTOFF to Space Education, Telescopes in Education, and Space Educator's…
Beryllium Toxicity Patient Education Care Instruction Sheet
Class Projects on the Internet.
ERIC Educational Resources Information Center
Nicholson, Danny
1996-01-01
Discusses the use of the Internet in the classroom. Presents a project on renewable energy sources in which students produce web pages. Provides the web page address of the project completed by students. (ASK)
Creating a Facebook Page for the Seismological Society of America
NASA Astrophysics Data System (ADS)
Newman, S. B.
2009-12-01
In August 2009 I created a Facebook “fan” page for the Seismological Society of America. We had been exploring cost-effective options for providing forums for two-way communication for some months. We knew that a number of larger technical societies had invested significant sums of money to create customized social networking sites, but that a small society would need to use existing low-cost software options. The first thing I discovered when I began to set up the fan page was that an unofficial SSA Facebook group already existed, established by Steven J. Gibbons, a member in Norway. Steven had done an excellent job of posting material about SSA. Partly because of the existing group, the official SSA fan page gained fans rapidly. We began by posting information about our own activities and then added links to activities in the broader geoscience community. While much of this material also appeared on our website and in our publication, Seismological Research Letters (SRL), the tone on the Facebook page is different: it is less formal, with more emphasis on photos and links to other sites, including our own. Fans who are active on Facebook see the posts as part of their social network and do not need to take the initiative to go to the SSA site. Although the goal was to provide a forum for two-way communication, our initial experience was that people were clearly reading the page but not contributing content. This appears to be the case with the fan pages of sister geoscience societies as well. Facebook offers some demographic information to fan-site administrators; an initial review suggested that fans were younger than the overall demographics of the Society, and that a few fans are not members or even scientists. Open questions are: what content will be most useful to fans? How will the existence of the page benefit the membership as a whole? Will the page ultimately encourage two-way communication as hoped?
Web 2.0 is generating a series of new communications outlets (FB, Twitter, wikis). As each new communication forum is added without generating additional income, small societies must also confront the need to staff them with existing resources.
Google Wave: Collaboration Reworked
ERIC Educational Resources Information Center
Rethlefsen, Melissa L.
2010-01-01
Over the past several years, Internet users have become accustomed to Web 2.0 and cloud computing-style applications. It's commonplace and even intuitive to drag and drop gadgets on personalized start pages, to comment on a Facebook post without reloading the page, and to compose and save documents through a web browser. The web paradigm has…
12 CFR 309.4 - Publicly available records.
Code of Federal Regulations, 2010 CFR
2010-01-01
... INFORMATION § 309.4 Publicly available records. (a) Records available on the FDIC's World Wide Web page—(1... on the FDIC's World Wide Web page, located at: http://www.fdic.gov. The FDIC has elected to publish a broad range of materials on its World Wide Web page, including consumer guides; financial and...
81 FR 40262 - Notice of Intent To Seek Approval To Collect Information
Federal Register 2010, 2011, 2012, 2013, 2014
2016-06-21
... their level of satisfaction with existing services. The NAL Internet sites are a vast collection of Web pages. NAL Web pages are visited by an average of 8.6 million people per month. All NAL Information Centers have an established web presence that provides information to their respective audiences...
Semantic photo books: leveraging blogs and social media for photo book creation
NASA Astrophysics Data System (ADS)
Rabbath, Mohamad; Sandhaus, Philipp; Boll, Susanne
2011-03-01
Recently, we have observed a substantial increase in users' interest in sharing their photos online in travel blogs, social communities, and photo sharing websites. An interesting aspect of these web platforms is their high level of user-media interaction, which makes them a high-quality source of semantic annotations: users comment on each other's photos, add external links to their travel blogs, tag each other in social communities, and add captions and descriptions to their photos. However, while these media assets are shared online, many users still highly appreciate the representation of this media in appealing physical photo books, where the semantics are represented in the form of descriptive text, maps, and external elements in addition to the related photos. In this paper we aim to fulfill this need and provide an approach for creating photo books from Web 2.0 resources. We concentrate on two kinds of online shared media as resources for printable photo books: (a) blogs, especially travel blogs, and (b) social community websites such as Facebook, which host a rapidly growing number of shared media elements, including photos. We introduce an approach to semi-automatically select media elements, including photos, geographical maps, and texts, from both blogs and social networks, and then use these elements to create a printable photo book with an appealing layout. Because the selected media elements can be too numerous for the resulting book, we choose the most suitable ones by exploiting content-based, social-based, and interaction-based criteria. Additionally, we add external media elements such as geographical maps, texts, and externally hosted photos from linked resources. Having selected the important media, our approach uses a genetic algorithm to create an appealing layout using aesthetic rules, such as positioning a photo with its related text or map in a way that respects the golden ratio and symmetry.
Media are distributed over the pages by optimizing the distribution according to several rules, such that no pages with purely textual elements and no photos are produced. For the page layout, appropriate photos are chosen for the background based on their salience. Other media assets, such as texts, photos, and geographical maps, are positioned in the foreground by a dynamic page layout algorithm that respects the content of the photos, the background, and common rules for visual layout. The result of our system is a photo book in a printable format. We implemented our approach as web services that analyze the media elements, enrich them, and create the layout in order to finally publish a photo book. The connection to these services is implemented in two interfaces: the first is a tool to select entries from personal blogs, and the second is a Facebook application that allows users to select photos from their albums.
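The golden-ratio placement rule mentioned above can be illustrated with a tiny scoring function. This is a speculative sketch of one aesthetic criterion, not the authors' genetic-algorithm implementation; the candidate positions and page width are invented for the example:

```python
# Hypothetical sketch of one aesthetic rule from the abstract: prefer photo
# placements whose division of the page falls near the golden ratio. The
# candidate positions and page width are invented for the example.
GOLDEN = (5 ** 0.5 - 1) / 2      # ~0.618

def golden_ratio_score(x: float, page_width: float) -> float:
    """Score close to 1.0 when a vertical division at x is near the golden ratio."""
    ratio = x / page_width
    # A division is "golden" at ~0.382 or ~0.618 of the width.
    return 1.0 - min(abs(ratio - GOLDEN), abs(ratio - (1 - GOLDEN)))

def best_position(candidates, page_width):
    """Pick the candidate x-position with the highest golden-ratio score."""
    return max(candidates, key=lambda x: golden_ratio_score(x, page_width))

print(best_position([100, 310, 500], 800))  # 310 (310/800 ≈ 0.39, near 0.382)
```

In a genetic-algorithm setting, a term like this would be one component of the layout fitness function, combined with other rules such as symmetry.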
ANTP Protocol Suite Software Implementation Architecture in Python
2011-06-03
a popular platform for network programming, an area in which C has traditionally dominated. 2 NetController AeroRP AeroNP AeroNP API AeroTP...visualisation of the running system. For example, using the Google Maps API, the main logging web page can show all the running nodes in the system. By...communication between AeroNP and AeroRP and runs on the operating system as a daemon. Furthermore, it creates an API interface to manage the communication between
Modeling Traffic on the Web Graph
NASA Astrophysics Data System (ADS)
Meiss, Mark R.; Gonçalves, Bruno; Ramasco, José J.; Flammini, Alessandro; Menczer, Filippo
Analysis of aggregate and individual Web requests shows that PageRank is a poor predictor of traffic. We use empirical data to characterize properties of Web traffic not reproduced by Markovian models, including both aggregate statistics such as page and link traffic, and individual statistics such as entropy and session size. As no current model reconciles all of these observations, we present an agent-based model that explains them through realistic browsing behaviors: (1) revisiting bookmarked pages; (2) backtracking; and (3) seeking out novel pages of topical interest. The resulting model can reproduce the behaviors we observe in empirical data, especially heterogeneous session lengths, reconciling the narrowly focused browsing patterns of individual users with the extreme variance in aggregate traffic measurements. We can thereby identify a few salient features that are necessary and sufficient to interpret Web traffic data. Beyond the descriptive and explanatory power of our model, these results may lead to improvements in Web applications such as search and crawling.
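The three browsing behaviors the model is built on (revisiting bookmarks, backtracking, and seeking novel pages) can be sketched as a toy agent-based surfer. All probabilities and the ten-page toy web are illustrative assumptions, not parameters from the paper:

```python
import random

# Toy agent-based surfer implementing the three behaviors named in the
# abstract: (1) revisit a bookmarked page, (2) backtrack, (3) seek out a
# novel page. The probabilities and the ten-page toy web are illustrative
# assumptions, not parameters from the paper.
def browse_session(steps, p_bookmark=0.3, p_back=0.2, seed=0):
    rng = random.Random(seed)
    pages = list(range(10))
    bookmarks = [0]              # the agent starts with page 0 bookmarked
    history = [0]
    for _ in range(steps):
        r = rng.random()
        if r < p_bookmark:
            history.append(rng.choice(bookmarks))      # revisit a bookmark
        elif r < p_bookmark + p_back and len(history) > 1:
            history.pop()                              # backtrack
        else:
            page = rng.choice(pages)                   # seek a novel page
            if rng.random() < 0.1:                     # sometimes bookmark it
                bookmarks.append(page)
            history.append(page)
    return history

print(browse_session(50)[:5])
```

Running many such sessions and varying the probabilities per agent is one way a model like this can produce the heterogeneous session lengths the abstract describes.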
Policy-Aware Content Reuse on the Web
NASA Astrophysics Data System (ADS)
Seneviratne, Oshani; Kagal, Lalana; Berners-Lee, Tim
The Web allows users to share their work very effectively, leading to the rapid reuse and remixing of content on the Web, including text, images, and videos. Scientific research data, social networks, blogs, photo sharing sites, and other such applications, known collectively as the Social Web, contain large amounts of increasingly complex information. Such information from several Web pages can very easily be aggregated, mashed up, and presented in other Web pages. Content generation of this nature inevitably leads to many copyright and license violations, motivating research into effective methods to detect and prevent such violations.
Total Petroleum Hydrocarbons (TPH): ToxFAQs
... Page last reviewed: February 4, 2014 ... Contact Us: Agency for Toxic Substances and ...
Web Spam, Social Propaganda and the Evolution of Search Engine Rankings
NASA Astrophysics Data System (ADS)
Metaxas, Panagiotis Takis
Search engines have greatly influenced the way we experience the web. Since the early days of the web, users have relied on them to get informed and make decisions. When the web was relatively small, web directories were built and maintained by human experts who screened and categorized pages according to their characteristics. By the mid-1990s, however, it was apparent that the human-expert model of categorizing web pages did not scale. The first search engines appeared, and they have been evolving ever since, taking over the role that web directories used to play.
Khawaja, Zain-Ul-Abdin; Ali, Khudejah Iqbal; Khan, Shanze
2017-02-01
Social marketing related to sexual health is a problematic task, especially in religiously and/or culturally conservative countries. Social media presents a possible alternative channel for sexual health efforts to disseminate information and engage new users. In an effort to understand how well sexual health campaigns and organizations have leveraged this opportunity, this study presents a systematic examination of ongoing Facebook-based sexual health efforts in conservative Asian countries. It was discovered that out of hundreds of sexual health organizations identified in the region, less than half had created a Facebook page. Of those that had, only 31 were found to have posted sexual health-relevant content at least once a month. Many of these 31 organizations were also unsuccessful in maintaining regular official and user activity on their page. In order to assess the quality of the Facebook pages as Web-based information resources, the sexual health-related official activity on each page was analyzed for information (a) value, (b) reliability, (c) currency, and (d) system accessibility. User responsiveness to official posts on the pages was also used to discuss the potential of Facebook as a sexual health information delivery platform.
Grouping of Items in Mobile Web Questionnaires
ERIC Educational Resources Information Center
Mavletova, Aigul; Couper, Mick P.
2016-01-01
There is some evidence that a scrolling design may reduce breakoffs in mobile web surveys compared to a paging design, but there is little empirical evidence to guide the choice of the optimal number of items per page. We investigate the effect of the number of items presented on a page on data quality in two types of questionnaires: with or…
Network and User-Perceived Performance of Web Page Retrievals
NASA Technical Reports Server (NTRS)
Kruse, Hans; Allman, Mark; Mallasch, Paul
1998-01-01
The development of the HTTP protocol has been driven by the need to improve the network performance of the protocol by allowing the efficient retrieval of multiple parts of a web page without the need for multiple simultaneous TCP connections between a client and a server. We suggest that the retrieval of multiple page elements sequentially over a single TCP connection may result in a degradation of the perceived performance experienced by the user. We attempt to quantify this perceived degradation through the use of a model which combines a web retrieval simulation and an analytical model of TCP operation. Starting with the current HTTP/1.1 specification, we first suggest a client-side heuristic to improve the perceived transfer performance. We show that the perceived speed of the page retrieval can be increased without sacrificing data transfer efficiency. We then propose a new client/server extension to the HTTP/1.1 protocol to allow for the interleaving of page element retrievals. We finally address the issue of the display of advertisements on web pages, and in particular suggest a number of mechanisms which can make efficient use of IP multicast to send advertisements to a number of clients within the same network.
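The perceived-performance argument can be made concrete with a back-of-the-envelope model: with sequential retrieval, later elements cannot begin rendering until everything before them has finished, whereas interleaving lets every element receive its first bytes almost immediately. The element sizes, bandwidth, and round-robin quantum below are invented for illustration and are not the paper's TCP model:

```python
# Back-of-the-envelope model of perceived performance: the time at which each
# page element receives its first bytes, for sequential versus interleaved
# delivery over a single connection. Element sizes (KB), bandwidth (KB/s),
# and the round-robin quantum are invented values, not the paper's TCP model.
def sequential_first_byte(sizes, bandwidth):
    t, starts = 0.0, []
    for size in sizes:
        starts.append(t)        # an element starts only after all before it finish
        t += size / bandwidth
    return starts

def interleaved_first_byte(sizes, bandwidth, quantum=1.0):
    # Round-robin `quantum` KB per element: every element starts within the
    # first cycle (valid while each element is at least one quantum long).
    return [i * quantum / bandwidth for i in range(len(sizes))]

sizes, bw = [10, 10, 10], 10.0  # three 10 KB elements over a 10 KB/s link
print(sequential_first_byte(sizes, bw))   # [0.0, 1.0, 2.0]
print(interleaved_first_byte(sizes, bw))  # [0.0, 0.1, 0.2]
```

Total transfer time is identical in both cases; the difference is entirely in when each element can start rendering, which is the "perceived" dimension the paper quantifies.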
Quality of consumer-targeted internet guidance on home firearm and ammunition storage.
Freundlich, Katherine L; Skoczylas, Maria Shakour; Schmidt, John P; Keshavarzi, Nahid R; Mohr, Bethany Anne
2016-10-01
Four storage practices protect against unintentional and/or self-inflicted firearm injury among children and adolescents: keeping guns locked (1) and unloaded (2) and keeping ammunition locked up (3) and in a separate location from the guns (4). Our aim was to mimic common Google search strategies on firearm/ammunition storage and assess whether the resulting web pages provided recommendations consistent with those supported by the literature. We identified 87 web pages by Google search of the 10 most commonly used search terms in the USA related to firearm/ammunition storage. Two non-blinded independent reviewers analysed web page technical quality according to a 17-item checklist derived from previous studies. A single reviewer analysed readability by US grade level assigned by Flesch-Kincaid Grade Level Index. Two separate, blinded, independent reviewers analysed deidentified web page content for accuracy and completeness describing the four accepted storage practices. Reviewers resolved disagreements by consensus. The web pages described, on average, less than one of four accepted storage practices (mean 0.2 (95% CL 0.1 to 0.4)). Only two web pages (2%) identified all four practices. Two web pages (2%) made assertions inconsistent with recommendations; both implied that loaded firearms could be stored safely. Flesch-Kincaid Grade Level Index averaged 8.0 (95% CL 7.3 to 8.7). The average technical quality score was 7.1 (95% CL 6.8 to 7.4) out of an available score of 17. There was a high degree of agreement between reviewers regarding completeness (weighted κ 0.78 (95% CL 0.61 to 0.97)). The internet currently provides incomplete information about safe firearm storage. Understanding existing deficiencies may inform future strategies for improvement.
Semantic Advertising for Web 3.0
NASA Astrophysics Data System (ADS)
Thomas, Edward; Pan, Jeff Z.; Taylor, Stuart; Ren, Yuan; Jekjantuk, Nophadol; Zhao, Yuting
Advertising on the World Wide Web is based around automatically matching web pages with appropriate advertisements, in the form of banner ads, interactive adverts, or text links. Traditionally this has been done by manual classification of pages, or more recently using information retrieval techniques to find the most important keywords from the page, and match these to keywords being used by adverts. In this paper, we propose a new model for online advertising, based around lightweight embedded semantics. This will improve the relevancy of adverts on the World Wide Web and help to kick-start the use of RDFa as a mechanism for adding lightweight semantic attributes to the Web. Furthermore, we propose a system architecture for the proposed new model, based on our scalable ontology reasoning infrastructure TrOWL.
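The traditional keyword-matching baseline that the abstract contrasts with its semantic approach can be sketched in a few lines. This toy matcher (invented page text, ad inventory, and stop-word list) simply scores adverts by their overlap with the page's most frequent terms:

```python
from collections import Counter

# Toy keyword matcher: score each advert by the overlap between its keywords
# and the most frequent terms on the page. This mirrors the "traditional"
# information-retrieval baseline the abstract describes; the page text, ad
# inventory, and stop-word list are invented for the example.
def top_terms(page_text, n=5):
    words = [w.strip(".,:;!?").lower() for w in page_text.split()]
    stop = {"the", "a", "of", "and", "to", "in", "is", "for"}
    return {w for w, _ in Counter(w for w in words if w not in stop).most_common(n)}

def best_advert(page_text, adverts):
    """adverts maps ad name -> keyword set; returns the best-matching ad name."""
    terms = top_terms(page_text)
    return max(adverts, key=lambda ad: len(adverts[ad] & terms))

page = "Guitar lessons for beginners: learn guitar chords and guitar scales."
ads = {"music-store": {"guitar", "amplifier"}, "car-hire": {"car", "rental"}}
print(best_advert(page, ads))  # music-store
```

A semantic approach of the kind the paper proposes would instead match typed entities extracted from embedded RDFa annotations, rather than bare term overlap.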
ERIC Educational Resources Information Center
Dimopoulos, Kostas; Asimakopoulos, Apostolos
2010-01-01
This study aims to explore navigation patterns and preferred pages' characteristics of ten secondary school students searching the web for information about cloning. The students navigated the Web for as long as they wished in a context of minimum support of teaching staff. Their navigation patterns were analyzed using audit trail data software.…
Web Site On a Budget: How to Find an Affordable Home for Your Pages.
ERIC Educational Resources Information Center
Callihan, Steven E.
1996-01-01
Offers advice for choosing an Internet provider: consider the amount of time, effort, and expertise one has, coupled with the complexity of the Web page, which impact price and choice of provider; and question providers about server speed, ports, architecture, traffic levels, fee structures, and registration of domain names. Lists 33 Web presence…
Building an Ajax Application from Scratch
ERIC Educational Resources Information Center
Clark, Jason A.
2006-01-01
The author of this article suggests that to refresh Web pages and online library catalogs in a more pleasing way, Ajax, an acronym for Asynchronous JavaScript and XML, should be used. Ajax is the way to use Web technologies that work together to refresh sections of Web pages to allow almost instant responses to user input. This article describes…
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2011 CFR
2011-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2013 CFR
2013-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2014 CFR
2014-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2012 CFR
2012-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
Castillo-Ortiz, Jose Dionisio; de Jesus Valdivia-Nuno, Jose; Ramirez-Gomez, Andrea; Garagarza-Mariscal, Heber; Gallegos-Rios, Carlos; Flores-Hernandez, Gabriel; Hernandez-Sanchez, Luis; Brambila-Barba, Victor; Castaneda-Sanchez, Jose Juan; Barajas-Ochoa, Zalathiel; Suarez-Rico, Angel; Sanchez-Gonzalez, Jorge Manuel; Ramos-Remus, Cesar
2016-09-01
The aim of this study was to assess the changes in the characteristics of rheumatoid arthritis information on the Internet over a 15-year period and the positioning of Web sites posted by universities, hospitals, and medical associations. We replicated the methods of a 2001 study assessing rheumatoid arthritis information on the Internet using WebCrawler. All Web sites and pages were critically assessed for relevance, scope, authorship, type of publication, and financial objectives. Differences between studies were considered significant if 95 % confidence intervals did not overlap. Additionally, we added a Google search with assessments of the quality of content of web pages and of the Web sites posted by medical institutions. There were significant differences between the present study's WebCrawler search and the 2001 reference study. There were increases in information sites (82 vs 36 %) and rheumatoid arthritis-specific discussion pages (59 vs 8 %), and decreases in advertisements (2 vs 48 %) and alternative therapies (27 vs 45 %). The quality of content of web pages is still dispersed; just 37 % were rated as good. Among the first 300 hits, 30 (10 %) were posted by medical institutions, 17 of them in the USA. Regarding readability, 7 % of these 30 web pages required 6 years of schooling, 27 % required 7-9 years, 27 % required 10-12 years, and 40 % required 12 or more years. The Internet has evolved in the last 15 years. Medical institutions are also better positioned. However, there are still areas for improvement, such as the quality of the content, the leadership of medical institutions, and the readability of information.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-30
... prescribed by FINRA, on their Web sites, social media pages, and any comparable Internet presence, and on Web sites, social media pages, and any comparable Internet presence relating to a member's investment...
ToxGuides: Quick Reference Pocket Guide for Toxicological Profiles
... Get email updates: To receive email updates ... Page last reviewed: January 21, 2015 ...
WebWatcher: Machine Learning and Hypertext
1995-05-29
WebWatcher: Machine Learning and Hypertext. Thorsten Joachims, Tom Mitchell, Dayne Freitag, and Robert Armstrong, School of Computer Science, Carnegie...an HTML page about machine learning in which we inserted a hyperlink to WebWatcher (line 6). The user follows this hyperlink and gets to a page which...
Guide to the Internet. The world wide web.
Pallen, M.
1995-01-01
The world wide web provides a uniform, user friendly interface to the Internet. Web pages can contain text and pictures and are interconnected by hypertext links. The addresses of web pages are recorded as uniform resource locators (URLs), transmitted by hypertext transfer protocol (HTTP), and written in hypertext markup language (HTML). Programs that allow you to use the web are available for most operating systems. Powerful on line search engines make it relatively easy to find information on the web. Browsing through the web--"net surfing"--is both easy and enjoyable. Contributing to the web is not difficult, and the web opens up new possibilities for electronic publishing and electronic journals. PMID: 8520402
BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.
Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel
2015-06-02
Bioinformaticians face a range of difficulties in getting locally installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools, using an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics Open Web Services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS intermediates access to registered tools by providing front-end and back-end web services. Programmers can install applications on HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then send the results to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that presents the registered applications and clients. BOWS-registered applications can be accessed from virtually any programming language through web services, or using the standard Java clients. The back-end can run on HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
DOORS to the semantic web and grid with a PORTAL for biomedical computing.
Taswell, Carl
2008-03-01
The semantic web remains in the early stages of development. It has not yet achieved the goals envisioned by its founders as a pervasive web of distributed knowledge and intelligence. Success will be attained when a dynamic synergism can be created between people and a sufficient number of infrastructure systems and tools for the semantic web in analogy with those for the original web. The domain name system (DNS), web browsers, and the benefits of publishing web pages motivated many people to register domain names and publish web sites on the original web. An analogous resource label system, semantic search applications, and the benefits of collaborative semantic networks will motivate people to register resource labels and publish resource descriptions on the semantic web. The Domain Ontology Oriented Resource System (DOORS) and Problem Oriented Registry of Tags and Labels (PORTAL) are proposed as infrastructure systems for resource metadata within a paradigm that can serve as a bridge between the original web and the semantic web. The Internet Registry Information Service (IRIS) registers domain names while DNS publishes domain addresses with mapping of names to addresses for the original web. Analogously, PORTAL registers resource labels and tags while DOORS publishes resource locations and descriptions with mapping of labels to locations for the semantic web. BioPORT is proposed as a prototype PORTAL registry specific for the problem domain of biomedical computing.
A Template Engine for Parsing Objects from Textual Representations
NASA Astrophysics Data System (ADS)
Rajković, Milan; Stanković, Milena; Marković, Ivica
2011-09-01
Template engines are widely used for separation of business and presentation logic. They are commonly used in web applications for clean rendering of HTML pages. Another area of usage is message formatting in distributed applications, where they transform objects to appropriate representations. This paper explores the possibility of using templates for the reverse process: creating objects from their textual representations. We present the prototype engine that we have developed and describe the benefits and drawbacks of this approach.
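The reverse use of templates described in the abstract, recovering an object from its textual representation, can be illustrated with a minimal template-to-regex engine. This is a generic sketch of the idea, not the authors' prototype; the template syntax with {field} placeholders is an assumption:

```python
import re

# Minimal reverse-template engine: a template with {field} placeholders is
# compiled to a regex with named groups, and matching a concrete text yields
# a dict of field values. A generic illustration of reverse templating, not
# the prototype described in the paper.
def compile_template(template: str) -> re.Pattern:
    pattern, last = "", 0
    for m in re.finditer(r"\{(\w+)\}", template):
        pattern += re.escape(template[last:m.start()])  # literal text
        pattern += f"(?P<{m.group(1)}>.+?)"             # captured field
        last = m.end()
    pattern += re.escape(template[last:])
    return re.compile(pattern + r"$")

def parse(template: str, text: str) -> dict:
    m = compile_template(template).match(text)
    return m.groupdict() if m else {}

tmpl = "Order {order_id} shipped to {city} on {date}"
print(parse(tmpl, "Order 42 shipped to Oslo on 2011-09-01"))
# {'order_id': '42', 'city': 'Oslo', 'date': '2011-09-01'}
```

The forward direction (rendering) and this reverse direction (parsing) can share the same template, which is the symmetry the paper exploits.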
Reese Sorenson's Individual Professional Page
NASA Technical Reports Server (NTRS)
Sorenson, Reese; Nixon, David (Technical Monitor)
1998-01-01
The subject document is a World Wide Web (WWW) page entitled "Reese Sorenson's Individual Professional Page." It can be accessed at "http://george.arc.nasa.gov/sorenson/personal/index.html". The purpose of this page is to make the reader aware of me, who I am, and what I do. It lists my work assignments, my computer experience, my place in the NASA hierarchy, publications by me, awards received by me, my education, and how to contact me. Writing this page was a learning experience, pursuant to an element in my Job Description which calls for me to be able to use the latest computers. This web page contains very little technical information, none of which is classified or sensitive.
ERIC Educational Resources Information Center
Rocha, Tania; Bessa, Maximino; Goncalves, Martinho; Cabral, Luciana; Godinho, Francisco; Peres, Emanuel; Reis, Manuel C.; Magalhaes, Luis; Chalmers, Alan
2012-01-01
Background: One of the most mentioned problems of web accessibility, as recognized in several different studies, is related to the difficulty regarding the perception of what is or is not clickable in a web page. In particular, a key problem is the recognition of hyperlinks by a specific group of people, namely those with intellectual…
Future View: Web Navigation based on Learning User's Browsing Strategy
NASA Astrophysics Data System (ADS)
Nagino, Norikatsu; Yamada, Seiji
In this paper, we propose the Future View system, which assists the user's everyday Web browsing. Future View prefetches Web pages based on the user's browsing strategies and presents them to the user in order to assist Web browsing. To learn the user's browsing strategy, Future View uses two types of learning classifier systems: a content-based classifier system for content change patterns and an action-based classifier system for user action patterns. The results of learning are applied to crawling by Web robots, and the gathered Web pages are presented to the user through a Web browser interface. We experimentally show the effectiveness of navigation using Future View.
Ensemble: a web-based system for psychology survey and experiment management.
Tomic, Stefan T; Janata, Petr
2007-08-01
We provide a description of Ensemble, a suite of Web-integrated modules for managing and analyzing data associated with psychology experiments in a small research lab. The system delivers interfaces via a Web browser for creating and presenting simple surveys without the need to author Web pages and with little or no programming effort. The surveys may be extended by selecting and presenting auditory and/or visual stimuli with MATLAB and Flash to enable a wide range of psychophysical and cognitive experiments which do not require the recording of precise reaction times. Additionally, one is provided with the ability to administer and present experiments remotely. The software technologies employed by the various modules of Ensemble are MySQL, PHP, MATLAB, and Flash. The code for Ensemble is open source and available to the public, so that its functions can be readily extended by users. We describe the architecture of the system, the functionality of each module, and provide basic examples of the interfaces.
Thermal Protection System Imagery Inspection Management System -TIIMS
NASA Technical Reports Server (NTRS)
Goza, Sharon; Melendrez, David L.; Henningan, Marsha; LaBasse, Daniel; Smith, Daniel J.
2011-01-01
TIIMS is used during the inspection phases of every mission to provide quick visual feedback, detailed inspection data, and determinations to the mission management team. The system consists of a visual Web page interface, an SQL database, and a graphical image generator. These combine to allow a user to quickly ascertain the status of the inspection process and the current determination for any problem zones. TIIMS allows inspection engineers to enter their determinations into a database and to link pertinent images and video to those database entries. The database then assigns criteria to each zone and tile and, via query, sends the information to a graphical image generation program. Using the official TIPS database tile positions and sizes, the graphical image generation program creates images of the current status of the orbiter, coloring zones and tiles based on a predefined key code. These images are then displayed on a Web page using customized JavaScript to show the appropriate zone of the orbiter based on the location of the user's cursor. The close-up graphic and database entry for a particular zone can then be seen by selecting that zone. This page contains links into the database to access the images used by the inspection engineers when they made the determinations entered into the database. The status of the inspection zones changes as determinations are refined and is shown by the appropriate color code.
An experiment with content distribution methods in touchscreen mobile devices.
Garcia-Lopez, Eva; Garcia-Cabot, Antonio; de-Marcos, Luis
2015-09-01
This paper compares the usability of three different content distribution methods (scrolling, paging and internal links) in touchscreen mobile devices as means to display web documents. Usability is operationalized in terms of effectiveness, efficiency and user satisfaction. These dimensions are then measured in an experiment (N = 23) in which users are required to find words in regular-length web documents. Results suggest that scrolling is statistically better in terms of efficiency and user satisfaction. It is also found to be more effective but results were not significant. Our findings are also compared with existing literature to propose the following guideline: "try to use vertical scrolling in web pages for mobile devices instead of paging or internal links, except when the content is too large, then paging is recommended". With an ever increasing number of touchscreen web-enabled mobile devices, this new guideline can be relevant for content developers targeting the mobile web as well as institutions trying to improve the usability of their content for mobile platforms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The system is developed to collect, process, store and present the information provided by radio frequency identification (RFID) devices. The system contains three parts: the application software, the database and the web page. The application software manages multiple RFID devices, such as readers and portals, simultaneously. It communicates with the devices through an application programming interface (API) provided by the device vendor. The application software converts data collected by the RFID readers and portals to readable information. It is capable of encrypting data using the 256-bit advanced encryption standard (AES). The application software has a graphical user interface (GUI). The GUI mimics the configurations of the nuclear material storage sites or transport vehicles. The GUI gives the user and system administrator an intuitive way to read the information and/or configure the devices. The application software is capable of sending the information to a remote, dedicated and secured web and database server. Two captured screen samples, one each for storage and transport, are attached. The database is constructed to handle a large number of RFID tag readers and portals. A SQL server is employed for this purpose. An XML script is used to update the database once the information is sent from the application software. The design of the web page imitates the design of the application software. The web page retrieves data from the database and presents it in different panels. The user needs a user name combined with a password to access the web page. The web page is capable of sending e-mail and text messages based on preset criteria, such as when alarm thresholds are exceeded. A captured screen sample is attached. The application software is designed to be installed on a local computer. The local computer is directly connected to the RFID devices and can be controlled locally or remotely.
There are multiple local computers managing different sites or transport vehicles. Control from remote sites and transmission of information to a central database server take place over a secured Internet connection. The information stored in the central database server is shown on the web page. Users can view the web page on the Internet. A dedicated and secured web and database server (https) is used to provide information security.
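The XML-to-SQL update path described above can be sketched as follows. This is a minimal illustration, not the authors' code: all element names, column names and sample values are invented, and SQLite stands in for the SQL server.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML update message as might be sent by the
# application software to the database server (field names invented).
UPDATE_XML = """
<update>
  <read tag="TAG-0001" reader="PORTAL-A" time="2010-05-01T12:00:00"/>
  <read tag="TAG-0002" reader="PORTAL-A" time="2010-05-01T12:00:05"/>
</update>
"""

def apply_update(conn, xml_text):
    """Parse an XML update and store each tag read in the database."""
    conn.execute("""CREATE TABLE IF NOT EXISTS reads
                    (tag TEXT, reader TEXT, time TEXT)""")
    for read in ET.fromstring(xml_text).iter("read"):
        conn.execute("INSERT INTO reads VALUES (?, ?, ?)",
                     (read.get("tag"), read.get("reader"), read.get("time")))
    conn.commit()

conn = sqlite3.connect(":memory:")
apply_update(conn, UPDATE_XML)
rows = conn.execute("SELECT tag, reader FROM reads ORDER BY tag").fetchall()
```

The web page layer would then query the same table to render its panels.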
Software ``Best'' Practices: Agile Deconstructed
NASA Astrophysics Data System (ADS)
Fraser, Steven
Software “best” practices depend entirely on context - in terms of the problem domain, the system constructed, the software designers, and the “customers” ultimately deriving value from the system. Agile practices no longer have the luxury of “choosing” small non-mission critical projects with co-located teams. Project stakeholders are selecting and adapting practices based on a combination of interest, need and staffing. For example, growing product portfolios through a merger or the acquisition of a company exposes legacy systems to new staff, new software integration challenges, and new ideas. Innovation in communications (tools and processes) to span the growth and contraction of both information and organizations, while managing the adoption of changing software practices, is imperative for success. Traditional web-based tools such as web pages, document libraries, and forums are not sufficient. A blend of tweeting, blogs, wikis, instant messaging, web-based conferencing, and telepresence creates a new dimension of communication “best” practices.
NASA Technical Reports Server (NTRS)
Harper, R. Stephen
1999-01-01
COSS (Crew On-Orbit System Support) is changing. Designed as computer based in-flight refresher training, it is getting good reviews and the demands on the product can be expected to increase. Last year, the lessons were written using Authorware, which had a number of limitations. The most important one was that the navigation and the layout functions were both in one package that was not easy to learn. The lesson creator had to be good at both programming and design. There were also a number of other problems, as detailed in my report last year. This year the COSS unit made the switch to embrace modularity. The navigation function is handled by a player that was custom-written using Delphi. The layout pages are now standard HTML files that can be created using any number of products. This new system gives new flexibility and unties the process from one product (and one company). The player can be re-written by a programmer without affecting the lesson pages. It is also now possible for anybody with a word-processor to make part of the HTML lesson pages and to use many of the new commercially available tools that are being designed for web pages. This summer I created a computer-based training (CBT) lesson on the IBM ThinkPad 760 ED and 760XD laptop computers that should fly on the International Space Station. I also examined the COSS system, the new player and the other new software products.
Customizable scientific web-portal for DIII-D nuclear fusion experiment
NASA Astrophysics Data System (ADS)
Abla, G.; Kim, E. N.; Schissel, D. P.
2010-04-01
Increasing utilization of the Internet and convenient web technologies has made the web-portal a major application interface for remote participation and control of scientific instruments. While web-portals have provided a centralized gateway for multiple computational services, the amount of visual output often is overwhelming due to the high volume of data generated by complex scientific instruments and experiments. Since each scientist may have different priorities and areas of interest in the experiment, filtering and organizing information based on the individual user's need can increase the usability and efficiency of a web-portal. DIII-D is the largest magnetic nuclear fusion device in the US. A web-portal has been designed to support the experimental activities of DIII-D researchers worldwide. It offers a customizable interface with personalized page layouts and lists of services for users to select from. Each individual user can create a unique working environment to fit his own needs and interests. Customizable services are: real-time experiment status monitoring, diagnostic data access, interactive data analysis and visualization. The web-portal also supports interactive collaborations by providing a collaborative logbook and online instant announcement services. The DIII-D web-portal development utilizes a multi-tier software architecture and Web 2.0 technologies and tools, such as AJAX and Django, to develop a highly interactive and customizable user interface.
Electronic doors to education: study of high school website accessibility in Iowa.
Klein, David; Myhill, William; Hansen, Linda; Asby, Gary; Michaelson, Susan; Blanck, Peter
2003-01-01
The Americans with Disabilities Act (ADA), and Sections 504 and 508 of the Rehabilitation Act, prohibit discrimination against people with disabilities in all aspects of daily life, including education, work, and access to places of public accommodation. Increasingly, these antidiscrimination laws are used by persons with disabilities to ensure equal access to e-commerce, and to private and public Internet websites. To help assess the impact of the anti-discrimination mandate for educational communities, this study examined 157 website home pages of Iowa public high schools (52% of high schools in Iowa) in terms of their electronic accessibility for persons with disabilities. We predicted that accessibility problems would limit students and others in obtaining information from the web pages, as well as limiting their ability to navigate to other web pages. Findings show that although many web pages examined included information in accessible formats, none of the home pages met World Wide Web Consortium (W3C) standards for accessibility. The most frequent accessibility problem was lack of alternative text (ALT tags) for graphics. Technical sophistication built into pages was found to reduce accessibility. Implications are discussed for schools and educational institutions, and for laws, policies, and procedures on website accessibility. Copyright 2003 John Wiley & Sons, Ltd.
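The ALT-tag check that dominated the accessibility findings above can be sketched with Python's stdlib HTML parser. This is an illustrative audit, not the study's tooling; the sample page is invented.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Count <img> elements that lack the ALT text required by
    W3C accessibility guidelines."""
    def __init__(self):
        super().__init__()
        self.total_images = 0
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.total_images += 1
            if not dict(attrs).get("alt"):  # absent or empty alt attribute
                self.missing_alt += 1

    def handle_startendtag(self, tag, attrs):
        self.handle_starttag(tag, attrs)  # treat <img .../> the same way

# Invented sample home page: one image with ALT text, one without.
page = ('<html><body><img src="logo.gif">'
        '<img src="map.gif" alt="Campus map"></body></html>')
checker = AltTextChecker()
checker.feed(page)
```

A real audit would fetch each home page and aggregate these counts across schools.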
ERIC Educational Resources Information Center
Moskowitz, Steven
2004-01-01
In fall 2002 the Brewster Central School District introduced teacher Web pages to a teaching staff of more than 300. One of the major goals of the project was to improve teacher computer literacy. Approximately one year prior to this project, the professional staff was asked by the district technology committee to complete a technology survey so…
Romano, Ron; Baum, Neil
2014-01-01
Having a Web page and a blog site are the minimum requirements for an Internet presence in the new millennium. However, a Web page that loads on a personal computer or a laptop will be ineffective on a mobile or cellular phone. Today, with more existing and potential patients having access to cellular technology, it is necessary to reconfigure the appearance of your Web site that appears on a mobile phone. This article discusses mobile computing and suggestions for improving the appearance of your Web site on a mobile or cellular phone.
Interstellar Initiative Web Page Design
NASA Technical Reports Server (NTRS)
Mehta, Alkesh
1999-01-01
This summer at NASA/MSFC, I have contributed to two projects: Interstellar Initiative Web Page Design and Lenz's Law Relative Motion Demonstration. In the Web Design Project, I worked on an Outline. The Web Design Outline was developed to provide a foundation for a Hierarchy Tree Structure. The Outline would help design a Website information base for future and near-term missions. The Website would give in-depth information on Propulsion Systems and Interstellar Travel. The Lenz's Law Relative Motion Demonstrator is discussed in this volume by Russell Lee.
From theater to the world wide web--a new online era for surgical education.
O'Leary, D Peter; Corrigan, Mark A; McHugh, Seamus M; Hill, A D; Redmond, H Paul
2012-01-01
Traditionally, surgical education has been confined to operating and lecture theaters. Access to the World Wide Web and services such as YouTube and iTunes has expanded enormously. Each week throughout Ireland, nonconsultant hospital doctors work hard to create presentations for surgical teaching. Once presented, these valuable presentations are often never used again. We aimed to compile surgical presentations online and establish a new online surgical education tool. We also sought to measure the effect of this educational tool on surgical presentation quality. Surgical presentations from Cork University Hospital and Beaumont Hospital presented between January 2010 and April 2011 were uploaded to http://www.pilgrimshospital.com/presentations. A YouTube channel and iTunes application were created. Web site hits were monitored. Quality of presentations was assessed by 4 independent senior surgical judges using a validated PowerPoint assessment form. Judges were randomly given 6 presentations; 3 presentations were pre-web site setup and 3 were post-web site setup. Once uploading commenced, presenters were informed. A total of 89 presentations have been uploaded to date. This includes 55 cases, 17 journal club, and 17 short bullet presentations. This has been associated with 46,037 web site page views. Establishment of the web site was associated with a significant improvement in the quality of presentations. Mean scores for the pre- and post-web site groups were 6.2 vs 7.7 out of 9, respectively, p = 0.037. This novel educational tool provides a unique method to enable surgical education to become more accessible to trainees, while also improving the overall quality of surgical teaching PowerPoint presentations. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Panatto, Donatella; Amicizia, Daniela; Arata, Lucia; Lai, Piero Luigi; Gasparini, Roberto
2018-04-03
Squalene-based adjuvants have been included in influenza vaccines since 1997. Despite several advantages of adjuvanted seasonal and pandemic influenza vaccines, laypeople's perception of such formulations may be hesitant or even negative under certain circumstances. Moreover, in Italian, the term "squalene" has the same root as such common words as "shark" (squalo), "squalid" and "squalidness" that tend to have negative connotations. This study aimed to quantitatively and qualitatively analyze a representative sample of Italian web pages mentioning squalene-based adjuvants used in influenza vaccines. Every effort was made to limit the subjectivity of judgments. Eighty-four unique web pages were assessed. A high prevalence (47.6%) of pages with negative or ambiguous attitudes toward squalene-based adjuvants was established. Compared with web pages reporting balanced information on squalene-based adjuvants, those categorized as negative/ambiguous had significantly lower odds of belonging to a professional institution [adjusted odds ratio (aOR) = 0.12, p = .004], and significantly higher odds of containing pictures (aOR = 1.91, p = .034) and being more readable (aOR = 1.34, p = .006). Some differences in wording between positive/neutral and negative/ambiguous web pages were also observed. The most common scientifically unsound claims concerned safety issues and, in particular, claims linking squalene-based adjuvants to the Gulf War Syndrome and autoimmune disorders. Italian users searching the web for information on vaccine adjuvants have a high likelihood of finding unbalanced and misleading material. Information provided by institutional websites should be not only evidence-based but also carefully targeted towards laypeople. Conversely, authors writing for non-institutional websites should avoid sensationalism and provide their readers with more balanced information.
ERIC Educational Resources Information Center
Kupersmith, John
2003-01-01
Examines special-purpose entry points to library Web sites. Discusses in-house homepages; branch-specific pages or single library system-wide pages; staff use pages; versions in different languages; "MyLibrary" pages where users can customize the menu; standalone "branded" sites; publicly accessible pages; and best practices.…
Referencing web pages and e-journals.
Bryson, David
2013-12-01
One of the areas that can confuse students and authors alike is how to reference web pages and electronic journals (e-journals). The aim of this professional development article is to go back to first principles for referencing and, with examples, show how these sources should be referenced.
Age differences in search of web pages: the effects of link size, link number, and clutter.
Grahame, Michael; Laberge, Jason; Scialfa, Charles T
2004-01-01
Reaction time, eye movements, and errors were measured during visual search of Web pages to determine age-related differences in performance as a function of link size, link number, link location, and clutter. Participants (15 young adults, M = 23 years; 14 older adults, M = 57 years) searched Web pages for target links that varied from trial to trial. During one half of the trials, links were enlarged from 10-point to 12-point font. Target location was distributed among the left, center, and bottom portions of the screen. Clutter was manipulated according to the percentage of used space, including graphics and text, and the number of potentially distracting nontarget links was varied. Increased link size improved performance, whereas increased clutter and links hampered search, especially for older adults. Results also showed that links located in the left region of the page were found most easily. Actual or potential applications of this research include Web site design to increase usability, particularly for older adults.
McDonough, Brianna; Felter, Elizabeth; Downes, Amia; Trauth, Jeanette
2015-04-01
Pregnant and postpartum women have special needs during public health emergencies but often have inadequate levels of disaster preparedness. Thus, improving maternal emergency preparedness is a public health priority. More research is needed to identify the strengths and weaknesses of various approaches to how preparedness information is communicated to these women. A sample of web pages from the Centers for Disease Control and Prevention intended to address the preparedness needs of pregnant and postpartum populations was examined for suitability for this audience. Five of the 7 web pages examined were considered adequate. One web page was considered not suitable, and on one the raters were split between not suitable and adequate. None of the resources examined were considered superior. If these resources are considered some of the best available to pregnant and postpartum women, more work is needed to improve the suitability of educational resources, especially for audiences with low literacy and low incomes.
Userscripts for the life sciences.
Willighagen, Egon L; O'Boyle, Noel M; Gopalakrishnan, Harini; Jiao, Dazhi; Guha, Rajarshi; Steinbeck, Christoph; Wild, David J
2007-12-21
The web has seen an explosion of chemistry and biology related resources in the last 15 years: thousands of scientific journals, databases, wikis, blogs and resources are available with a wide variety of types of information. There is a huge need to aggregate and organise this information. However, the sheer number of resources makes it unrealistic to link them all in a centralised manner. Instead, search engines to find information in those resources flourish, and formal languages like Resource Description Framework and Web Ontology Language are increasingly used to allow linking of resources. A recent development is the use of userscripts to change the appearance of web pages, by on-the-fly modification of the web content. This opens possibilities to aggregate information and computational results from different web resources into the web page of one of those resources. Several userscripts are presented that enrich biology and chemistry related web resources by incorporating or linking to other computational or data sources on the web. The scripts make use of Greasemonkey-like plugins for web browsers and are written in JavaScript. Information from third-party resources is extracted using open Application Programming Interfaces, while common Uniform Resource Locator (URL) schemes are used to make deep links to related information in that external resource. The userscripts presented here use a variety of techniques and resources, and show the potential of such scripts. This paper discusses a number of userscripts that aggregate information from two or more web resources. Examples are shown that enrich web pages with information from other resources, and show how information from web pages can be used to link to, search, and process information in other resources. Due to the nature of userscripts, scientists are able to select those scripts they find useful on a daily basis, as the scripts run directly in their own web browser rather than on the web server.
This flexibility allows the scientists to tune the features of web resources to optimise their productivity.
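Userscripts themselves are JavaScript run by a browser plugin, but the core idea of turning identifiers found in page text into deep links via common URL schemes can be sketched language-neutrally. The PubMed URL pattern below is real; the `enrich` helper and its output format are invented for illustration.

```python
import re

# Deep-link URL templates keyed by identifier prefix.
# The PubMed pattern is real; additional entries would follow the same shape.
LINK_TEMPLATES = {
    "PMID": "https://pubmed.ncbi.nlm.nih.gov/{id}/",
}

def enrich(text):
    """Find identifiers such as 'PMID:18154664' in page text and return
    (identifier, deep link) pairs, much as a userscript would when
    rewriting a page on the fly."""
    links = []
    for prefix, template in LINK_TEMPLATES.items():
        for match in re.finditer(prefix + r":?\s*(\d+)", text):
            links.append((match.group(0), template.format(id=match.group(1))))
    return links

links = enrich("...tune web resources to optimise productivity. PMID:18154664")
```

A browser userscript would take the extra step of replacing the matched text with an anchor element in the live DOM.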
NASA Astrophysics Data System (ADS)
Lares, M.
The presence of institutions on the internet is nowadays very important to strengthen communication channels, both internal and with the general public. The Córdoba Observatory has several web portals, including the official web page, a blog and presence on several social networks. These are among the fundamental pillars for outreach activities, and serve as communication channels for events and scientific, academic, and outreach news. They are also a source of information for the staff, as well as data related to the Observatory's internal organization and scientific production. Several statistical studies are presented, based on data taken from the visits to the official web pages. I comment on some aspects of the role of web pages as a source of consultation and as a quick response to information needs. FULL TEXT IN SPANISH
Cleanups In My Community (CIMC) - Incidents of National Significance, National Layer
This data layer provides access to Incidents of National Significance as part of the CIMC web service. Incidents of National Significance include all Presidentially-declared emergencies, major disasters, and catastrophes. Multiple federal departments and agencies, including EPA, coordinate actions to help prevent, prepare for, respond to, and recover from Incidents of National Significance. The Incidents of National Significance shown in this web service are derived from the epa.gov website and include links to the relevant web pages within the attribute table. Data about Incidents of National Significance are located on their own EPA web pages, and CIMC links to those pages. The CIMC web service was initially published in 2013, but the data are updated on the 18th of each month. The full schedule for data updates in CIMC is located here: https://iaspub.epa.gov/enviro/data_update_v2.
Forensic Science and the Internet - Current Utilization and Future Potential.
Chamakura, R P
1997-12-01
The Internet has become a very powerful and inexpensive tool for the free distribution of knowledge and information. It is a learning and research tool, a virtual library without borders and membership requirements, a help desk, and a publication house providing newspapers with current information and journals with instant publication. Very soon, when live audio and video transmission is perfected, the Internet (popularly referred to as the Net) also will be a live classroom and everyday conference site. This article provides a brief overview of the basic structure and essential components of the Internet. A limited number of home pages/Web sites that are already made available on the Net by scientists, laboratories, and colleges in the forensic science community are presented in table forms. Home pages/Web sites containing useful information pertinent to different disciplines of forensic science are also categorized in various tables. The ease and benefits of the Internet use are exemplified by the author's personal experience. Currently, only a few forensic scientists and institutions have made their presence felt. More participation and active contribution and the creation of on-line searchable databases in all specialties of forensic science are urgently needed. Leading forensic journals should take the lead and create on-line searchable indexes with abstracts. Creating Internet repositories of unpublished papers is an idea worth looking into. Leading forensic science institutions should also develop use of the Net to provide training and retraining opportunities for forensic scientists. Copyright © 1997 Central Police University.
Identification of Malicious Web Pages by Inductive Learning
NASA Astrophysics Data System (ADS)
Liu, Peishun; Wang, Xuefang
Malicious web pages have become an increasing threat to computer systems in recent years. Traditional anti-virus techniques typically focus on detecting the static signatures of malware and are ineffective against these new threats because they cannot deal with zero-day attacks. In this paper, a novel classification method for detecting malicious web pages is presented. The method generalizes and specializes attack patterns through inductive learning, which can be used to update and expand the knowledge database. An attack pattern is established from an example and generalized by inductive learning, so that it can be used to detect unknown attacks whose behavior is similar to the example.
Web pages: What can you see in a single fixation?
Jahanian, Ali; Keshvari, Shaiyan; Rosenholtz, Ruth
2018-01-01
Research in human vision suggests that in a single fixation, humans can extract a significant amount of information from a natural scene, e.g. the semantic category, spatial layout, and object identities. This ability is useful, for example, for quickly determining location, navigating around obstacles, detecting threats, and guiding eye movements to gather more information. In this paper, we ask a new question: What can we see at a glance at a web page - an artificial yet complex "real world" stimulus? Is it possible to notice the type of website, or where the relevant elements are, with only a glimpse? We find that observers, fixating at the center of a web page shown for only 120 milliseconds, are well above chance at classifying the page into one of ten categories. Furthermore, this ability is supported in part by text that they can read at a glance. Users can also understand the spatial layout well enough to reliably localize the menu bar and to detect ads, even though the latter are often camouflaged among other graphical elements. We discuss the parallels between web page gist and scene gist, and the implications of our findings for both vision science and human-computer interaction.
Automatic Hidden-Web Table Interpretation by Sibling Page Comparison
NASA Astrophysics Data System (ADS)
Tao, Cui; Embley, David W.
The longstanding problem of automatic table interpretation still eludes us. Its solution would not only be an aid to table processing applications such as large volume table conversion, but would also be an aid in solving related problems such as information extraction and semi-structured data management. In this paper, we offer a conceptual modeling solution for the common special case in which so-called sibling pages are available. The sibling pages we consider are pages on the hidden web, commonly generated from underlying databases. We compare them to identify and connect nonvarying components (category labels) and varying components (data values). We tested our solution using more than 2,000 tables in source pages from three different domains—car advertisements, molecular biology, and geopolitical information. Experimental results show that the system can successfully identify sibling tables, generate structure patterns, interpret tables using the generated patterns, and automatically adjust the structure patterns, if necessary, as it processes a sequence of hidden-web pages. For these activities, the system was able to achieve an overall F-measure of 94.5%.
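The F-measure quoted above is the harmonic mean of precision and recall. As a quick illustration (the individual precision and recall values here are invented, not taken from the paper):

```python
def f_measure(precision, recall):
    """F1 score: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Hypothetical values chosen to land near the paper's overall 94.5%.
score = f_measure(0.94, 0.95)
```

The harmonic mean penalizes imbalance, so a system cannot reach a high F-measure by trading recall away for precision or vice versa.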
2008-09-01
Commentary: Building Web Research Strategies for Teachers and Students
ERIC Educational Resources Information Center
Maloy, Robert W.
2016-01-01
This paper presents web research strategies for teachers and students to use in building Dramatic Event, Historical Biography, and Influential Literature wiki pages for history/social studies learning. Dramatic Events refer to milestone or turning point moments in history. Historical Biographies and Influential Literature pages feature…
Citations to Web pages in scientific articles: the permanence of archived references.
Thorp, Andrea W; Schriger, David L
2011-02-01
We validate the use of archiving Internet references by comparing the accessibility of published uniform resource locators (URLs) with corresponding archived URLs over time. We scanned the "Articles in Press" section in Annals of Emergency Medicine from March 2009 through June 2010 for Internet references in research articles. If an Internet reference produced the authors' expected content, the Web page was archived with WebCite (http://www.webcitation.org). Because the archived Web page does not change, we compared it with the original URL to determine whether the original Web page had changed. We attempted to access each original URL and archived Web site URL at 3-month intervals from the time of online publication during an 18-month study period. Once a URL no longer existed or failed to contain the original authors' expected content, it was excluded from further study. The number of original URLs and archived URLs that remained accessible over time was totaled and compared. A total of 121 articles were reviewed and 144 Internet references were found within 55 articles. Of the original URLs, 15% (21/144; 95% confidence interval [CI] 9% to 21%) were inaccessible at publication. During the 18-month observation period, there was no loss of archived URLs (apart from the 4% [5/123; 95% CI 2% to 9%] that could not be archived), whereas 35% (49/139) of the original URLs were lost (46% loss; 95% CI 33% to 61% by the Kaplan-Meier method; difference between curves P<.0001, log rank test). Archiving a referenced Web page at publication can help preserve the authors' expected information. Copyright © 2010 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.
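The 95% confidence intervals quoted above can be approximated with a normal (Wald) interval for a proportion. The paper's exact method isn't stated, so this sketch only roughly reproduces the published 9% to 21% bound for the 21/144 URLs inaccessible at publication.

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) confidence interval for a proportion.
    z=1.96 gives the conventional 95% level."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# 21 of 144 original URLs were inaccessible at publication.
lo, hi = wald_ci(21, 144)
```

For survival over time, the abstract uses the Kaplan-Meier method instead, since URLs drop out of observation at different points.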
MedlinePlus Connect: How it Works
... it looks depends on how it is implemented. Web Application The Web application returns a formatted response ... for more examples of Web Application response pages. Web Service The MedlinePlus Connect REST-based Web service ...
Characterization of topological structure on complex networks.
Nakamura, Ikuo
2003-10-01
Characterizing the topological structure of complex networks is a significant problem, especially from the viewpoint of data mining on the World Wide Web. "PageRank," used by the commercial search engine Google, is one such measure of authority for ranking all the nodes matching a given query. We have investigated the page-rank distribution of the real Web and of a growing network model, both of which have directed links and exhibit power-law distributions of in-degree (the number of incoming links to a node) and out-degree (the number of outgoing links from a node). We find a concentration of page rank on a small number of nodes and low page rank in high-degree regimes of the real Web, which can be explained by topological properties of the network, e.g., network motifs and the connectivities of nearest neighbors.
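The page-rank measure discussed here can be computed with a few lines of power iteration. A toy sketch (not the authors' code) that reproduces the qualitative finding that rank concentrates on heavily linked nodes:

```python
def pagerank(links, damping=0.85, iters=50):
    """Power-iteration PageRank. `links` maps each node to the list of
    nodes it points to; every node should appear as a key."""
    nodes = set(links) | {t for ts in links.values() for t in ts}
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - damping) / n for v in nodes}
        for src, targets in links.items():
            if targets:
                share = damping * rank[src] / len(targets)
                for t in targets:
                    new[t] += share
            else:  # dangling node: spread its rank uniformly
                for v in nodes:
                    new[v] += damping * rank[src] / n
        rank = new
    return rank

# Hub-and-spoke toy graph: every page links to "d"
r = pagerank({"a": ["d"], "b": ["d"], "c": ["d"], "d": []})
```

In this tiny graph the rank mass concentrates on the single in-link hub, the same concentration effect the abstract reports for the real Web.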
A radiology department intranet: development and applications.
Willing, S J; Berland, L L
1999-01-01
An intranet is a "private Internet" that uses the protocols of the World Wide Web to share information resources within a company or with the company's business partners and clients. The hardware requirements for an intranet begin with a dedicated Web server permanently connected to the departmental network. The heart of a Web server is the hypertext transfer protocol (HTTP) service, which receives a page request from a client's browser and transmits the page back to the client. Although knowledge of hypertext markup language (HTML) is not essential for authoring a Web page, a working familiarity with HTML is useful, as is knowledge of programming and database management. Security can be ensured by using scripts to write information in hidden fields or by means of "cookies." Interfacing databases and database management systems with the Web server and conforming the user interface to HTML syntax can be achieved by means of the common gateway interface (CGI), Active Server Pages (ASP), or other methods. An intranet in a radiology department could include the following types of content: on-call schedules, work schedules and a calendar, a personnel directory, resident resources, memorandums and discussion groups, software for a radiology information system, and databases.
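The CGI mechanism the authors mention boils down to a script that turns database rows into an HTML page on each request. A hypothetical sketch of such server-created content for the on-call schedule (the names are illustrative, not from the article):

```python
from html import escape

def render_on_call_page(schedule):
    """Build the HTML a CGI script would return for the intranet's
    on-call schedule; `schedule` is a list of (day, radiologist) pairs."""
    rows = "\n".join(
        f"  <tr><td>{escape(day)}</td><td>{escape(name)}</td></tr>"
        for day, name in schedule
    )
    return (
        "<html><head><title>On-Call Schedule</title></head><body>\n"
        "<table>\n" + rows + "\n</table>\n</body></html>"
    )

page = render_on_call_page([("Mon", "Dr. Willing"), ("Tue", "Dr. Berland")])
```

Whether the glue is classic CGI or ASP, the pattern is the same: the HTTP service receives the request, the script queries the database, and the response is plain HTML conforming to the syntax the browser expects.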
Searching for Bill and Jane: Electronic Full-Text Literature.
ERIC Educational Resources Information Center
Still, Julie; Kassabian, Vibiana
1998-01-01
Examines electronic full-text literature available on the World Wide Web and on CD-ROM. Discusses authors and genres, electronic texts, and fees. Highlights Shakespeare, Jane Austen, and nature writing. Provides a bibliography of Web guides, specialized Shakespeare pages, and pages dealing with the Shakespeare authorship debate and secondary…
77 FR 31917 - Energy Conservation Program: Energy Conservation Standards for Residential Dishwashers
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-30
... the docket Web page can be found at: http://www.regulations.gov/#!docketDetail ;D=EERE-2011-BT-STD-0060. The regulations.gov Web page contains instructions on how to access all documents, including...: (202) 586-7796. Email: [email protected] . SUPPLEMENTARY INFORMATION: Table of Contents I...
Ecosystem Food Web Lift-The-Flap Pages
ERIC Educational Resources Information Center
Atwood-Blaine, Dana; Rule, Audrey C.; Morgan, Hannah
2016-01-01
In the lesson on which this practical article is based, third grade students constructed a "lift-the-flap" page to explore food webs on the prairie. The moveable papercraft focused student attention on prairie animals' external structures and how the inferred functions of those structures could support further inferences about the…
12 CFR 708a.3 - Board of directors' approval and members' opportunity to comment.
Code of Federal Regulations, 2010 CFR
2010-01-01
... fashion in the lobby of the credit union's home office and branch offices and on the credit union's Web site, if it has one. If the notice is not on the home page of the Web site, the home page must have a...
Visualization Tools for Lattice QCD - Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Massimo Di Pierro
2012-03-15
Our research project is about the development of visualization tools for Lattice QCD. We developed various tools by extending existing libraries, adding new algorithms, exposing new APIs, and creating web interfaces (including the new NERSC gauge connection web site). Our tools cover the full stack of operations, from automating the download of data, to generating VTK files (topological charge, plaquette, Polyakov lines, quark and meson propagators, currents), to turning the VTK files into images, movies, and web pages. Some of the tools have their own web interfaces. Some Lattice QCD visualizations have been created in the past but, to our knowledge, our tools are the only ones of their kind, since they are general purpose, customizable, and relatively easy to use. We believe they will be valuable to physicists working in the field. They can be used to better teach Lattice QCD concepts to new graduate students; to observe the changes in topological charge density and detect possible sources of bias in computations; to observe the convergence of the algorithms at a local level and determine possible problems; to probe heavy-light mesons with currents and determine their spatial distribution; and to detect corrupted gauge configurations. There are some indirect results of this grant that will benefit a broader audience than Lattice QCD physicists.
NASA Astrophysics Data System (ADS)
Hoebelheinrich, N. J.; Lynnes, C.; West, P.; Ferritto, M.
2014-12-01
Two problems common to many geoscience domains are the difficulty of finding tools to work with a given dataset collection and, conversely, the difficulty of finding data for a known tool. A collaborative team from the Earth Science Information Partnership (ESIP) has come together to design and create a web service, called ToolMatch, to address these problems. The team began by defining an initial, relatively simple conceptual model that addresses these two use cases. The conceptual model is expressed as an ontology using OWL (Web Ontology Language) and DCterms (Dublin Core Terms), and it draws on standard ontologies such as DOAP (Description of a Project), FOAF (Friend of a Friend), SKOS (Simple Knowledge Organization System), and DCAT (Data Catalog Vocabulary). The ToolMatch service takes advantage of various Semantic Web and Web standards, such as OpenSearch, RESTful web services, SWRL (Semantic Web Rule Language), and SPARQL (SPARQL Protocol and RDF Query Language). The first version of the ToolMatch service was deployed in early fall 2014. While more complete testing is required, a number of communities besides ESIP member organizations have expressed interest in collaborating to create, test, and use the service and to incorporate it into their own web pages, tools, and/or services, including the USGS Data Catalog service, DataONE, the Deep Carbon Observatory, the Virtual Solar Terrestrial Observatory (VSTO), and the U.S. Global Change Research Program. In this session, presenters will discuss the inception and development of the ToolMatch service, the collaborative process used to design, refine, and test the service, and future plans for the service.
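An OpenSearch-style RESTful interface like the one described is typically queried with a plain GET URL whose parameters describe the dataset or tool. A sketch under assumed names (the endpoint and parameter names below are hypothetical, not the real ToolMatch API):

```python
from urllib.parse import urlencode

BASE = "https://example.org/toolmatch/search"  # hypothetical endpoint

def toolmatch_query(**params):
    """Compose an OpenSearch-style GET request URL for a
    ToolMatch-like service. Parameter names are illustrative only."""
    return BASE + "?" + urlencode(params)

# "Which tools can work with netCDF data?" as a search URL:
url = toolmatch_query(datasetFormat="netCDF", count=10)
```

The appeal of this design is that any client able to fetch a URL, from a web page widget to a data portal, can embed such a query without special libraries.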
2000-01-01
horoscope page (for Scorpio). Although this particular combination might be unique or unpopular, if we decompose the page into four WebViews, one for metro...news, one for international news, one for the weather and one for the horoscope , then these WebViews can be accessed frequently enough to merit...query results, the cost of accessing them is about the same as the cost of generating them from scratch, using the virt policy. This will also be true
Description Meta Tags in Public Home and Linked Pages.
ERIC Educational Resources Information Center
Craven, Timothy C.
2001-01-01
Random samples of 1,872 Web pages registered with Yahoo! and 1,638 pages reachable from Yahoo!-registered pages were analyzed for the use of meta tags, specifically those containing descriptions. Results: 727 (38.8%) of the Yahoo!-registered pages and 442 (27%) of the other pages included descriptions in meta tags. Some descriptions greatly…
Enhancing Geoscience Research Discovery Through the Semantic Web
NASA Astrophysics Data System (ADS)
Rowan, Linda R.; Gross, M. Benjamin; Mayernik, Matthew; Khan, Huda; Boler, Frances; Maull, Keith; Stott, Don; Williams, Steve; Corson-Rikert, Jon; Johns, Erica M.; Daniels, Michael; Krafft, Dean B.; Meertens, Charles
2016-04-01
UNAVCO, UCAR, and Cornell University are working together to leverage semantic web technologies to enable discovery of people, datasets, publications and other research products, as well as the connections between them. The EarthCollab project, a U.S. National Science Foundation EarthCube Building Block, is enhancing an existing open-source semantic web application, VIVO, to enhance connectivity across distributed networks of researchers and resources related to the following two geoscience-based communities: (1) the Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory (EOL), and (2) UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy. People, publications, datasets and grant information have been mapped to an extended version of the VIVO-ISF ontology and ingested into VIVO's database. Much of the VIVO ontology was built for the life sciences, so we have added some components of existing geoscience-based ontologies and a few terms from a local ontology that we created. The UNAVCO VIVO instance, connect.unavco.org, utilizes persistent identifiers whenever possible; for example using ORCIDs for people, publication DOIs, data DOIs and unique NSF grant numbers. Data is ingested using a custom set of scripts that include the ability to perform basic automated and curated disambiguation. VIVO can display a page for every object ingested, including connections to other objects in the VIVO database. A dataset page, for example, includes the dataset type, time interval, DOI, related publications, and authors. The dataset type field provides a connection to all other datasets of the same type. The author's page shows, among other information, related datasets and co-authors. Information previously spread across several unconnected databases is now stored in a single location. 
In addition to VIVO's default display, the new database can be queried using SPARQL, a query language for semantic data. EarthCollab is extending the VIVO web application. One such extension is the ability to cross-link separate VIVO instances across institutions, allowing local display of externally curated information. For example, Cornell's VIVO faculty pages will display UNAVCO's dataset information and UNAVCO's VIVO will display Cornell faculty member contact and position information. About half of UNAVCO's membership is international and we hope to connect our data to institutions in other countries with a similar approach. Additional extensions, including enhanced geospatial capabilities, will be developed based on task-centered usability testing.
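Querying the ingested graph "using SPARQL" looks roughly like the following. The ORCID predicate below is an illustrative placeholder; a real VIVO instance resolves people and datasets through the VIVO-ISF ontology, so the exact property paths would differ:

```python
def datasets_by_orcid(orcid):
    """Compose a SPARQL query for datasets created by the person with
    the given ORCID. Predicates are illustrative placeholders, not the
    actual VIVO-ISF terms."""
    return f"""
PREFIX dcterms: <http://purl.org/dc/terms/>
PREFIX ex:      <http://example.org/vocab/>
SELECT ?dataset ?title WHERE {{
  ?person  ex:orcidId "{orcid}" .
  ?dataset dcterms:creator ?person ;
           dcterms:title   ?title .
}}
"""

query = datasets_by_orcid("0000-0002-1825-0097")
```

Because identifiers such as ORCIDs and DOIs are stored as first-class data, a single query like this can traverse people, datasets, and publications that previously lived in unconnected databases.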
Extracting knowledge from the World Wide Web
Henzinger, Monika; Lawrence, Steve
2004-01-01
The World Wide Web provides an unprecedented opportunity to automatically analyze a large sample of interests and activity in the world. We discuss methods for extracting knowledge from the web by randomly sampling and analyzing hosts and pages, and by analyzing the link structure of the web and how links accumulate over time. A variety of interesting and valuable information can be extracted, such as the distribution of web pages over domains, the distribution of interest in different areas, communities related to different topics, the nature of competition in different categories of sites, and the degree of communication between different communities or countries. PMID:14745041
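One of the simplest statistics mentioned, the distribution of pages over domains, can be estimated directly from a random sample of URLs. A small sketch:

```python
from collections import Counter
from urllib.parse import urlparse

def domain_distribution(urls):
    """Estimate the share of sampled pages per top-level domain by
    tallying the last label of each URL's host name."""
    tlds = Counter(urlparse(u).netloc.rsplit(".", 1)[-1] for u in urls)
    total = sum(tlds.values())
    return {tld: n / total for tld, n in tlds.items()}

dist = domain_distribution([
    "http://example.com/a", "http://foo.com/b",
    "http://bar.org/c", "http://baz.edu/d",
])
```

With a large enough random sample, such relative frequencies approximate the true web-wide proportions without crawling the whole web.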
2013-01-01
Background Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The scientists involved must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for the specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions carry the risk of vendor lock-in and may require an expensive license for a proprietary relational database management system. To speed up and simplify development for applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore software developers do not require extensive knowledge about chemistry or the underlying database cartridge. This decreases application development time. Results Molecule Database Framework is written in Java, and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes:
• Support for multi-component compounds (mixtures)
• Import and export of SD-files
• Optional security (authorization)
For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo cartridge for PostgreSQL and provides type-safe searching, caching, transactions, and optional method-level security. Furthermore, the design of the entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and for the import and export of SD-files.
Conclusions By using a simple web application, it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-file import and export into simple method calls. The framework offers good search performance on a standard laptop without any database tuning, due in part to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the project's web page on Bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework. PMID:24325762
Kiener, Joos
2013-12-11
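The central claim is that structure search is reduced to a method call, with results paged and repeated queries served from a cache. A toy Python model of that abstraction (the real framework is Java and delegates matching to the Bingo cartridge; plain substring matching on SMILES strings stands in for substructure search here):

```python
class StructureSearch:
    """Toy model of the framework's abstraction: search is a method
    call, results are paged, and repeat queries hit an in-memory cache.
    Substring matching is a stand-in for real substructure matching."""

    def __init__(self, molecules, page_size=2):
        self.molecules = molecules
        self.page_size = page_size
        self._cache = {}
        self.db_hits = 0  # counts how often we "went to the database"

    def substructure(self, query, page=0):
        if query not in self._cache:
            self.db_hits += 1
            self._cache[query] = [m for m in self.molecules if query in m]
        hits = self._cache[query]
        start = page * self.page_size
        return hits[start:start + self.page_size]

db = StructureSearch(["c1ccccc1", "c1ccccc1O", "CCO", "CCN"])
first_page = db.substructure("c1ccccc1")   # benzene-containing entries
again = db.substructure("c1ccccc1")        # served from cache
```

The hit counter makes the performance claim concrete: only the first call for a given query touches the database, which is part of why the benchmarks look good even without database tuning.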
A web site on lung cancer: who are the users and what are they looking for?
Linssen, Cilia; Schook, Romane M; The, Anne-Mei; Lammers, Ernst; Festen, Jan; Postmus, Pieter E
2007-09-01
The Dutch Lung Cancer Information Centre launched the Web site www.longkanker.info in November 2003. The purpose of this article is to describe the launching of the Web site, its development, the type of visitors to the Web site, what they were looking for, and whether they found what they requested. Supervised by a panel (pulmonologists, patients, communication specialists), a large amount of material about lung cancer has been collected and edited into accessible language by health care providers, and the Web site has been divided into special categories following the different stages that lung cancer patients, relatives, and health care providers go through during the illness. The Web site is updated regularly. Search engines have been used to check the position of the Web site as a "hit." Pulmonologists have been informed about the founding of the Web site, and all lung cancer outpatient clinics in The Netherlands have received posters, folders, and cards to inform their patients. Visitor numbers, page views, and visitor numbers per page view have been registered continuously. Visitor satisfaction polls were placed in the second half of 2004 and the second half of 2005. The Web site appeared as first hit when using search engines immediately after launching it. Half of the visitors came to the Web site via search engines or links found at other sites. The number of visitors started at 4600 in the first month, doubled in the next months, and reached 18,000 per month 2 years after its launch. The number of visited pages increased to 87,000 per month, with an average number of five pages per visitor. Thirty percent of the visitors return within the same month. The most popular pages are interactive pages with the overview of all questions to "ask the doctor" at the top with forum messages, survival figures of all form of lung cancer, and information about the disease. The first satisfaction poll obtained 650 respondents and the second 382. 
The visitors to the Web site are caregivers (57%), patients (8%), and others (students, people fearing lung cancer). Of the visitors, 89% found what they were looking for; satisfaction is highest among nurses and caregivers (91% and 95%, respectively) and lowest among physicians and patients (85% and 83%). Given the number of visitors to the lung cancer Web site, it can be concluded that there is a great need for additional information among patients and caregivers. The launched Web site www.longkanker.info has reached its goal of providing a dependable source of information about lung cancer and satisfying its visitors.
... page: //medlineplus.gov/ency/article/002844.htm Funnel-web spider bite To use the sharing features on ... the effects of a bite from the funnel-web spider. Male funnel-web spiders are more poisonous ...
Risk markers for disappearance of pediatric Web resources
Hernández-Borges, Angel A.; Jiménez-Sosa, Alejandro; Torres-Álvarez de Arcaya, Maria L.; Macías-Cervi, Pablo; Gaspar-Guardado, Maria A.; Ruíz-Rabaza, Ana
2005-01-01
Objectives: The authors sought to find out whether certain Webometric indexes of a sample of pediatric Web resources, and some tests based on them, could be helpful predictors of their disappearance. Methods: The authors performed a retrospective study of a sample of 363 pediatric Websites and pages they had followed for 4 years. Main measurements included: number of resources that disappeared, number of inbound links and their annual increment, average daily visits to the resources in the sample, sample compliance with the quality criteria of 3 international organizations, and online time of the Web resources. Results: On average, 11% of the sample disappeared annually. However, 13% of these were available again at the end of follow up. Disappearing and surviving Websites did not show differences in the variables studied. However, surviving Web pages had a higher number of inbound links and higher annual increment in inbound links. Similarly, Web pages that survived showed higher compliance with recognized sets of quality criteria than those that disappeared. A subset of 14 quality criteria whose compliance accounted for 90% of the probability of online permanence was identified. Finally, a progressive increment of inbound links was found to be a marker of good prognosis, showing high specificity and positive predictive value (88% and 94%, respectively). Conclusions: The number of inbound links and annual increment of inbound links could be useful markers of the permanence probability for pediatric Web pages. Strategies that assure the Web editors' awareness of their Web resources' popularity could stimulate them to improve the quality of their Websites. PMID:16059427
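The reported specificity and positive predictive value follow from the usual 2x2 confusion-matrix definitions for a prognostic marker. A sketch with illustrative counts (not the study's raw data) that happen to reproduce the reported 88% and 94%:

```python
def specificity(tn, fp):
    """Among resources that actually disappeared, the fraction the
    marker correctly did NOT flag as likely survivors."""
    return tn / (tn + fp)

def positive_predictive_value(tp, fp):
    """Among resources the marker flagged as likely survivors, the
    fraction that actually stayed online."""
    return tp / (tp + fp)

# Illustrative counts only:
spec = specificity(tn=44, fp=6)               # 44 / 50 = 0.88
ppv = positive_predictive_value(tp=47, fp=3)  # 47 / 50 = 0.94
```

Here a "positive" is a resource whose inbound links progressively increased, the marker of good prognosis the abstract describes.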
Incorporating a Rich Media Presentation Format into a Lecture-Based Course Structure
ERIC Educational Resources Information Center
Moss, Nicholas
2005-01-01
The e-syllabus is a set of Web pages in which each course in the curriculum is assigned a "course page" and multiple "session pages." Course pages have a standardized format that provides course objectives, course policies, required or recommended textbooks, grading scales, and faculty listings. A separate session page is…
The Impact of Salient Advertisements on Reading and Attention on Web Pages
ERIC Educational Resources Information Center
Simola, Jaana; Kuisma, Jarmo; Oorni, Anssi; Uusitalo, Liisa; Hyona, Jukka
2011-01-01
Human vision is sensitive to salient features such as motion. Therefore, animation and onset of advertisements on Websites may attract visual attention and disrupt reading. We conducted three eye tracking experiments with authentic Web pages to assess whether (a) ads are efficiently ignored, (b) ads attract overt visual attention and disrupt…
Teaching Learning Theories Via the Web.
ERIC Educational Resources Information Center
Schnackenberg, Heidi L.
This paper describes a World Wide Web site on learning theories, developed as a class assignment for a course on learning and instructional theories at Concordia University (Quebec). Groups of two to four students developed pages on selected theories of learning that were then linked to a main page developed by the instructor and a doctoral…
User Perceptions of the Library's Web Pages: A Focus Group Study at Texas A&M University.
ERIC Educational Resources Information Center
Crowley, Gwyneth H.; Leffel, Rob; Ramirez, Diana; Hart, Judith L.; Armstrong, Tommy S., II
2002-01-01
This focus group study explored library patrons' opinions about Texas A&M library's Web pages. Discusses information seeking behavior which indicated that patrons are confused when trying to navigate the Public Access Menu and suggests the need for a more intuitive interface. (Author/LRW)
Some Thoughts on Free Textbooks
ERIC Educational Resources Information Center
Stewart, Robert
2009-01-01
The author publishes and freely distributes three online textbooks. "Introduction to Physical Oceanography" is available as a typeset book in Portable Document Format (PDF) or as web pages. "Our Ocean Planet: Oceanography in the 21st Century" and "Environmental Science in the 21st Century" are both available as web pages. All three books, which…
This page contains the August 2002 final rule fact sheet on the NESHAP for Paper and Other Web Coating. Also on this page is an April 2004 presentation on the NESHAP, designed to be used for basic education.
The Status of African Studies Digitized Content: Three Metadata Schemes.
ERIC Educational Resources Information Center
Kuntz, Patricia S.
The proliferation of Web pages and digitized material mounted on Internet servers has become unmanageable. Librarians and users are concerned that documents and information are being lost in cyberspace as a result of few bibliographic controls and common standards. Librarians in cooperation with software creators and Web page designers are…
Outreach to International Students and Scholars Using the World Wide Web.
ERIC Educational Resources Information Center
Wei, Wei
1998-01-01
Describes the creation of a World Wide Web site for the Science Library International Outreach Program at the University of California, Santa Cruz. Discusses design elements, content, and promotion of the site. Copies of the home page and the page containing the outreach program's statement of purpose are included. (AEF)
Teaching Intrapersonal Communication with the World-Wide Web: Cognitive Technology.
ERIC Educational Resources Information Center
Shedletsky, Leonard J.; Aitken, Joan E.
This paper offers a brief description of a course on intrapersonal communication with a home page approach using the World Wide Web. The paper notes that students use the home page for completing assignments, readings, posting responses, self-evaluation testing, research, and displaying some of their papers for the course. The paper contains…
2014-05-01
developed techniques for building better IP geolocation systems. Geolocation has many applications, such as presenting advertisements for local business establishments on web pages, to debugging network performance issues, to attributing attack traffic to...Pennsylvania.”
Making the World Wide Web Accessible to All Students.
ERIC Educational Resources Information Center
Guthrie, Sally A.
2000-01-01
Examines the accessibility of Web sites belonging to 80 colleges of communications and schools of journalism by examining the hypertext markup language (HTML) used to format the pages. Suggests ways to revise the markup of pages to make them more accessible to students with vision, hearing, and mobility problems. Lists resources of the latest…
A Prototype HTML Training System for Graphic Communication Majors
ERIC Educational Resources Information Center
Runquist, Roger L.
2010-01-01
This design research demonstrates a prototype content management system capable of training graphic communication students in the creation of basic HTML web pages. The prototype serves as a method of helping students learn basic HTML structure and commands earlier in their academic careers. Exposure to the concepts of web page creation early in…
Key Spatial Relations-based Focused Crawling (KSRs-FC) for Borderlands Situation Analysis
NASA Astrophysics Data System (ADS)
Hou, D. Y.; Wu, H.; Chen, J.; Li, R.
2013-11-01
Place names play an important role in borderlands situation topics, but current focused crawling methods treat them in the same way as other common keywords, which may lead to the omission of many useful web pages. In this paper, place names in web pages and their spatial relations are first discussed. Then, a focused crawling method named KSRs-FC is proposed to handle the collection of situation information about borderlands. In this method, place names and common keywords are represented separately, and some of the spatial relations relevant to web-page crawling are used in the relevance calculation between the given topic and web pages. Furthermore, an information collection system for borderlands situation analysis was developed based on KSRs-FC. Finally, the F-Score method was adopted to quantitatively evaluate this method by comparison with a traditional method. Experimental results showed that the F-Score value of the proposed method increased by 11% compared to the traditional method on the same sample data. Thus, the KSRs-FC method can effectively reduce the misjudgment of relevant web pages.
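The F-Score used for evaluation is the standard harmonic combination of precision and recall over a relevance-judged sample. A minimal sketch:

```python
def f_score(precision, recall, beta=1.0):
    """F-measure: harmonic combination of precision and recall.
    beta weights recall relative to precision; beta=1 gives F1."""
    if precision == 0 and recall == 0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# E.g. a crawler whose fetched set is 80% relevant (precision) and
# that finds 60% of all relevant pages (recall):
f1 = f_score(0.8, 0.6)
```

An 11% gain in this score means the spatial-relation weighting is retrieving more of the relevant borderland pages without flooding the crawl with irrelevant ones.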
2016-06-01
an effective system monitoring and display capability. The SOM, C-SSE, and resource managers access MUOS via a web portal called the MUOS Planning...and Provisioning Application (PlanProvApp). This web portal is their window into MUOS and is designed to provide them with a shared understanding of...including page loading errors, partially loaded web pages, incomplete reports, and inaccurate reports. For example, MUOS reported that there were
The aware toolbox for the detection of law infringements on web pages
NASA Astrophysics Data System (ADS)
Shahab, Asif; Kieninger, Thomas; Dengel, Andreas
2010-01-01
In the project Aware we aim to develop an automatic assistant for the detection of law infringements on web pages. The motivation for this project is that many authors of web pages infringe copyright or other laws at some point, mostly without being aware of that fact, and are more and more often confronted with costly legal warnings. As the legal environment is constantly changing, an important requirement of Aware is that the domain knowledge can be maintained (and initially defined) by numerous legal experts working remotely without further assistance from the computer scientists. Consequently, the software platform was chosen to be a web-based generic toolbox that can be configured to suit individual analysis experts, definitions of analysis flow, information gathering, and report generation. The report generated by the system summarizes all critical elements of a given web page and provides case-specific hints to the page author, and thus forms a new type of service. Regarding the analysis subsystems, Aware mainly builds on existing state-of-the-art technologies, whose usability has been evaluated for each intended task. In order to control the heterogeneous analysis components and to gather the information, a lightweight scripting shell has been developed. This paper describes the analysis technologies, ranging from text-based information extraction, over optical character recognition and phonetic fuzzy string matching, to a set of image analysis and retrieval tools, as well as the scripting language used to define the analysis flow.
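The phonetic fuzzy string matching component can be illustrated with classic American Soundex, which maps similar-sounding names to the same short code (the paper does not say which phonetic algorithm Aware uses; Soundex is just a representative choice):

```python
def soundex(name):
    """American Soundex: the first letter plus up to three digits
    encoding the consonant sounds that follow."""
    codes = {}
    for group, digit in [("BFPV", "1"), ("CGJKQSXZ", "2"), ("DT", "3"),
                         ("L", "4"), ("MN", "5"), ("R", "6")]:
        for ch in group:
            codes[ch] = digit
    name = name.upper()
    first = name[0]
    prev = codes.get(first, "")
    out = []
    for ch in name[1:]:
        if ch in "HW":          # H and W do not separate sounds
            continue
        if ch in "AEIOUY":      # vowels separate sounds but emit nothing
            prev = ""
            continue
        digit = codes.get(ch)
        if digit and digit != prev:
            out.append(digit)
        prev = digit
    return (first + "".join(out) + "000")[:4]
```

"Robert" and "Rupert" collapse to the same code, so a name recovered noisily by OCR from a page image can still be matched against a list of rights holders.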
Indicators of Accuracy of Consumer Health Information on the Internet
Fallis, Don; Frické, Martin
2002-01-01
Objectives: To identify indicators of accuracy for consumer health information on the Internet. The results will help lay people distinguish accurate from inaccurate health information on the Internet. Design: Several popular search engines (Yahoo, AltaVista, and Google) were used to find Web pages on the treatment of fever in children. The accuracy and completeness of these Web pages were determined by comparing their content with that of an instrument developed from authoritative sources on treating fever in children. The presence on these Web pages of a number of proposed indicators of accuracy, taken from published guidelines for evaluating the quality of health information on the Internet, was noted. Main Outcome Measures: Correlation between the accuracy of Web pages on treating fever in children and the presence of proposed indicators of accuracy on these pages. Likelihood ratios for the presence (and absence) of these proposed indicators. Results: One hundred Web pages were identified and characterized as “more accurate” or “less accurate.” Three indicators correlated with accuracy: displaying the HONcode logo, having an organization domain, and displaying a copyright. Many proposed indicators taken from published guidelines did not correlate with accuracy (e.g., the author being identified and the author having medical credentials) or inaccuracy (e.g., lack of currency and advertising). Conclusions: This method provides a systematic way of identifying indicators that are correlated with the accuracy (or inaccuracy) of health information on the Internet. Three such indicators have been identified in this study. Identifying such indicators and informing the providers and consumers of health information about them would be valuable for public health care. PMID:11751805
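The study's main outcome measure, a likelihood ratio for the presence or absence of an indicator, has a standard 2x2-table form. A minimal sketch, with invented counts purely for illustration (the paper's actual data are not reproduced here):

```python
# Hedged sketch: likelihood ratios for an accuracy indicator, treating
# "accurate page" as the condition of interest. Counts below are invented.

def likelihood_ratios(present_acc, absent_acc, present_inacc, absent_inacc):
    """LR+ and LR- for one indicator.

    present_acc:   accurate pages displaying the indicator
    absent_acc:    accurate pages lacking it
    present_inacc / absent_inacc: the same for inaccurate pages
    """
    sens = present_acc / (present_acc + absent_acc)       # P(indicator | accurate)
    spec = absent_inacc / (present_inacc + absent_inacc)  # P(no indicator | inaccurate)
    lr_pos = sens / (1 - spec)   # likelihood ratio for the indicator's presence
    lr_neg = (1 - sens) / spec   # likelihood ratio for its absence
    return lr_pos, lr_neg

# e.g. 30/50 "more accurate" and 10/50 "less accurate" pages display a copyright:
lr_pos, lr_neg = likelihood_ratios(30, 20, 10, 40)
print(round(lr_pos, 2), round(lr_neg, 2))  # 3.0 0.5
```

An LR+ above 1 means the indicator's presence raises the odds that a page is accurate; an LR- below 1 means its absence lowers them.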
Effect of font size, italics, and colour count on web usability.
Bhatia, Sanjiv K; Samal, Ashok; Rajan, Nithin; Kiviniemi, Marc T
2011-04-01
Web usability measures the ease of use of a website. This study attempts to find the effect of three factors - font size, italics, and colour count - on web usability. The study was performed using a set of tasks and developing a survey questionnaire. We performed the study using a set of human subjects, selected from the undergraduate students taking courses in psychology. The data computed from the tasks and survey questionnaire were statistically analysed to find if there was any effect of font size, italics, and colour count on the three web usability dimensions. We found that for the student population considered, there was no significant effect of font size on usability. However, the manipulation of italics and colour count did influence some aspects of usability. The subjects performed better for pages with no italics and high italics compared to moderate italics. The subjects rated the pages that contained only one colour higher than the web pages with four or six colours. This research will help web developers better understand the effect of font size, italics, and colour count on web usability in general, and for young adults, in particular.
Effect of font size, italics, and colour count on web usability
Samal, Ashok; Rajan, Nithin; Kiviniemi, Marc T.
2013-01-01
Web usability measures the ease of use of a website. This study attempts to find the effect of three factors – font size, italics, and colour count – on web usability. The study was performed using a set of tasks and developing a survey questionnaire. We performed the study using a set of human subjects, selected from the undergraduate students taking courses in psychology. The data computed from the tasks and survey questionnaire were statistically analysed to find if there was any effect of font size, italics, and colour count on the three web usability dimensions. We found that for the student population considered, there was no significant effect of font size on usability. However, the manipulation of italics and colour count did influence some aspects of usability. The subjects performed better for pages with no italics and high italics compared to moderate italics. The subjects rated the pages that contained only one colour higher than the web pages with four or six colours. This research will help web developers better understand the effect of font size, italics, and colour count on web usability in general, and for young adults, in particular. PMID:24358055
OntologyWidget – a reusable, embeddable widget for easily locating ontology terms
Beauheim, Catherine C; Wymore, Farrell; Nitzberg, Michael; Zachariah, Zachariah K; Jin, Heng; Skene, JH Pate; Ball, Catherine A; Sherlock, Gavin
2007-01-01
Background Biomedical ontologies are being widely used to annotate biological data in a computer-accessible, consistent and well-defined manner. However, due to their size and complexity, annotating data with appropriate terms from an ontology is often challenging for experts and non-experts alike, because there exist few tools that allow one to quickly find relevant ontology terms to easily populate a web form. Results We have produced a tool, OntologyWidget, which allows users to rapidly search for and browse ontology terms. OntologyWidget can easily be embedded in other web-based applications. OntologyWidget is written using AJAX (Asynchronous JavaScript and XML) and has two related elements. The first is a dynamic auto-complete ontology search feature. As a user enters characters into the search box, the appropriate ontology is queried remotely for terms that match the typed-in text, and the query results populate a drop-down list with all potential matches. Upon selection of a term from the list, the user can locate this term within a generic and dynamic ontology browser, which comprises the second element of the tool. The ontology browser shows the paths from a selected term to the root as well as parent/child tree hierarchies. We have implemented web services at the Stanford Microarray Database (SMD), which provide the OntologyWidget with access to over 40 ontologies from the Open Biological Ontology (OBO) website [1]. Each ontology is updated weekly. Adopters of the OntologyWidget can either use SMD's web services, or elect to rely on their own. Deploying the OntologyWidget can be accomplished in three simple steps: (1) install Apache Tomcat [2] on one's web server, (2) download and install the OntologyWidget servlet stub that provides access to the SMD ontology web services, and (3) create an html (HyperText Markup Language) file that refers to the OntologyWidget using a simple, well-defined format. 
Conclusion We have developed OntologyWidget, an easy-to-use ontology search and display tool that can be used on any web page by creating a simple html description. OntologyWidget provides a rapid auto-complete search function paired with an interactive tree display. We have developed a web service layer that communicates between the web page interface and a database of ontology terms. We currently store 40 of the ontologies from the OBO website [1], as well as several others. These ontologies are automatically updated on a weekly basis. OntologyWidget can be used in any web-based application to take advantage of the ontologies we provide via web services or any other ontology that is provided elsewhere in the correct format. The full source code for the JavaScript and description of the OntologyWidget is available from . PMID:17854506
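The widget's auto-complete loop queries the server for terms matching the typed-in text. A toy sketch of the server-side half of that interaction, prefix-matching against a stored term list (the term list and function names are invented, not part of OntologyWidget's API):

```python
# Hypothetical server-side sketch of an auto-complete ontology search:
# prefix-match the typed text against stored terms and return the candidates
# that would populate the widget's drop-down list.

TERMS = ["mitochondrion", "mitosis", "mitotic spindle", "membrane"]  # toy data

def autocomplete(prefix, terms=TERMS, limit=10):
    p = prefix.lower()
    # Case-insensitive prefix match, capped at the drop-down size.
    return [t for t in terms if t.lower().startswith(p)][:limit]

print(autocomplete("mito"))  # ['mitochondrion', 'mitosis', 'mitotic spindle']
```

A production service would query the ontology database instead of an in-memory list, but the request/response shape is the same.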
OntologyWidget - a reusable, embeddable widget for easily locating ontology terms.
Beauheim, Catherine C; Wymore, Farrell; Nitzberg, Michael; Zachariah, Zachariah K; Jin, Heng; Skene, J H Pate; Ball, Catherine A; Sherlock, Gavin
2007-09-13
Biomedical ontologies are being widely used to annotate biological data in a computer-accessible, consistent and well-defined manner. However, due to their size and complexity, annotating data with appropriate terms from an ontology is often challenging for experts and non-experts alike, because there exist few tools that allow one to quickly find relevant ontology terms to easily populate a web form. We have produced a tool, OntologyWidget, which allows users to rapidly search for and browse ontology terms. OntologyWidget can easily be embedded in other web-based applications. OntologyWidget is written using AJAX (Asynchronous JavaScript and XML) and has two related elements. The first is a dynamic auto-complete ontology search feature. As a user enters characters into the search box, the appropriate ontology is queried remotely for terms that match the typed-in text, and the query results populate a drop-down list with all potential matches. Upon selection of a term from the list, the user can locate this term within a generic and dynamic ontology browser, which comprises the second element of the tool. The ontology browser shows the paths from a selected term to the root as well as parent/child tree hierarchies. We have implemented web services at the Stanford Microarray Database (SMD), which provide the OntologyWidget with access to over 40 ontologies from the Open Biological Ontology (OBO) website 1. Each ontology is updated weekly. Adopters of the OntologyWidget can either use SMD's web services, or elect to rely on their own. Deploying the OntologyWidget can be accomplished in three simple steps: (1) install Apache Tomcat 2 on one's web server, (2) download and install the OntologyWidget servlet stub that provides access to the SMD ontology web services, and (3) create an html (HyperText Markup Language) file that refers to the OntologyWidget using a simple, well-defined format. 
We have developed OntologyWidget, an easy-to-use ontology search and display tool that can be used on any web page by creating a simple html description. OntologyWidget provides a rapid auto-complete search function paired with an interactive tree display. We have developed a web service layer that communicates between the web page interface and a database of ontology terms. We currently store 40 of the ontologies from the OBO website 1, as well as several others. These ontologies are automatically updated on a weekly basis. OntologyWidget can be used in any web-based application to take advantage of the ontologies we provide via web services or any other ontology that is provided elsewhere in the correct format. The full source code for the JavaScript and description of the OntologyWidget is available from http://smd.stanford.edu/ontologyWidget/.
StreamStats in Georgia: a water-resources web application
Gotvald, Anthony J.; Musser, Jonathan W.
2015-07-31
StreamStats is being implemented on a State-by-State basis so that the data development and underlying datasets can be customized to each State's specific needs, issues, and objectives. The USGS, in cooperation with the Georgia Environmental Protection Division and Georgia Department of Transportation, has implemented StreamStats for Georgia. The Georgia StreamStats Web site is available through the national StreamStats Web-page portal at http://streamstats.usgs.gov. Links are provided on this Web page for individual State applications, instructions for using StreamStats, definitions of basin characteristics and streamflow statistics, and other supporting information.
Frank, M S; Dreyer, K
2001-06-01
We describe a virtual web site hosting technology that enables educators in radiology to emblazon and make available for delivery on the world wide web their own interactive educational content, free from dependencies on in-house resources and policies. This suite of technologies includes a graphically oriented software application, designed for the computer novice, to facilitate the input, storage, and management of domain expertise within a database system. The database stores this expertise as choreographed and interlinked multimedia entities including text, imagery, interactive questions, and audio. Case-based presentations or thematic lectures can be authored locally, previewed locally within a web browser, then uploaded at will as packaged knowledge objects to an educator's (or department's) personal web site housed within a virtual server architecture. This architecture can host an unlimited number of unique educational web sites for individuals or departments in need of such service. Each virtual site's content is stored within that site's protected back-end database connected to Internet Information Server (Microsoft Corp, Redmond WA) using a suite of Active Server Page (ASP) modules that incorporate Microsoft's Active Data Objects (ADO) technology. Each person's or department's electronic teaching material appears as an independent web site with different levels of access--controlled by a username-password strategy--for teachers and students. There is essentially no static hypertext markup language (HTML). Rather, all pages displayed for a given site are rendered dynamically from case-based or thematic content that is fetched from that virtual site's database. The dynamically rendered HTML is displayed within a web browser in a Socratic fashion that can assess the recipient's current fund of knowledge while providing instantaneous user-specific feedback. Each site is emblazoned with the logo and identification of the participating institution. 
Individuals with teacher-level access can use a web browser to upload new content as well as manage content already stored on their virtual site. Each virtual site stores, collates, and scores participants' responses to the interactive questions posed on line. This virtual web site strategy empowers the educator with an end-to-end solution for creating interactive educational content and hosting that content within the educator's personalized and protected educational site on the world wide web, thus providing a valuable outlet that can magnify the impact of his or her talents and contributions.
2016-04-01
the DOD will put DOD systems and data at a risk level comparable to that of their neighbors in the cloud. Just as a user browses a Web page on the...proxy servers for controlling user access to Web pages, and large-scale storage for data management. Each of these devices allows access to the...user to develop applications. Acunetics.com describes Web applications as “computer programs allowing Website visitors to submit and retrieve data
ERIC Educational Resources Information Center
Bird, Bruce
This paper discusses the development of two World Wide Web sites at Anne Arundel Community College (Maryland). The criteria for the selection of hardware and software for Web site development that led to the decision to use Microsoft FrontPage 98 are described along with its major components and features. The discussion of the Science Division Web…
A Study of HTML Title Tag Creation Behavior of Academic Web Sites
ERIC Educational Resources Information Center
Noruzi, Alireza
2007-01-01
The HTML title tag information should identify and describe exactly what a Web page contains. This paper analyzes the "Title element" and raises a significant question: "Why is the title tag important?" Search engines base search results and page rankings on certain criteria. Among the most important criteria is the presence of the search keywords…
ERIC Educational Resources Information Center
Ludlow, John B.; Platin, Enrique
2000-01-01
Compared self-guided slide/tape (ST) and Web page (WP) instruction in normal radiographic anatomy of periapical and panoramic images using objective test performance and subjective preferences of 74 freshman dental students. Test performance was not different between image types or presentation technologies, but students preferred WP for…
Web Pages as an Interdisciplinary Tool in English for Architects Classes.
ERIC Educational Resources Information Center
Mansilla, Paloma Ubeda
2002-01-01
Proposes the use of web pages as an interdisciplinary tool in classes of English for professional and academic purposes. Languages and computing are two areas of knowledge that the graduate of the Polytechnic University of Madrid and its School of architecture need to study in order to supplement the education received during their degree with the…
Code of Federal Regulations, 2010 CFR
2010-07-01
.... This hourly rate is listed on the Commission's Web site at http://www.fmshrc.gov. Fees for searches of... listed on the Commission's Web site at http://www.fmshrc.gov. (c) Duplicating fee. The copy fee for each page of paper up to 8-1/2″ × 14″ shall be $.15 per copy per page. Any private sector services required...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-11
... deep saline geologic formations for permanent geologic storage. DATES: DOE invites the public to...; or by fax (304) 285-4403. The Draft EIS is available on DOE's NEPA Web page at: http://nepa.energy.gov/DOE_NEPA_documents.htm ; and on the National Energy Technology Laboratory's Web page at: http...
Building Interactive Simulations in Web Pages without Programming.
Mailen Kootsey, J; McAuley, Grant; Bernal, Julie
2005-01-01
A software system is described for building interactive simulations and other numerical calculations in Web pages. The system is based on a new Java-based software architecture named NumberLinX (NLX) that isolates each function required to build the simulation so that a library of reusable objects could be assembled. The NLX objects are integrated into a commercial Web design program for coding-free page construction. The model description is entered through a wizard-like utility program that also functions as a model editor. The complete system permits very rapid construction of interactive simulations without coding. A wide range of applications are possible with the system beyond interactive calculations, including remote data collection and processing and collaboration over a network.
Web mining for topics defined by complex and precise predicates
NASA Astrophysics Data System (ADS)
Lee, Ching-Cheng; Sampathkumar, Sushma
2004-04-01
The enormous growth of the World Wide Web has made it important to perform resource discovery efficiently for any given topic. Several new techniques have been proposed in recent years for this kind of topic-specific web mining; among them is a key technique called focused crawling, which crawls topic-specific portions of the web without having to explore all pages. Most existing research on focused crawling considers a simple topic definition that typically consists of one or more keywords connected by an OR operator. However, this kind of simple topic definition may yield too many irrelevant pages in which a keyword appears in the wrong context. In this research we explore new strategies for crawling topic-specific portions of the web using complex and precise predicates. A complex predicate allows the user to precisely specify a topic using Boolean operators such as "AND", "OR" and "NOT". Our work concentrates first on defining a format for specifying this kind of complex topic definition and second on devising a strategy to crawl the topic-specific portions of the web defined by the complex predicate, efficiently and with minimal overhead. Our new crawl strategy improves the performance of topic-specific web crawling by reducing the number of irrelevant pages crawled. To demonstrate the effectiveness of this approach, we have built a complete focused crawler called "Eureka" with complex predicate support, and a search engine that indexes and supports end-user searches on the crawled pages.
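The abstract's central device is a topic predicate built from AND/OR/NOT. A minimal sketch of how such a predicate could be represented and evaluated against a page's keyword set (this is an illustration of the idea, not Eureka's actual format or implementation):

```python
# Sketch of a complex topic predicate with AND/OR/NOT, evaluated against the
# set of keywords found on a page to decide whether the page is relevant.

def evaluate(pred, keywords):
    """pred is either a keyword string or a nested tuple ('AND'|'OR'|'NOT', operands...)."""
    if isinstance(pred, str):
        return pred in keywords
    op, *args = pred
    if op == "AND":
        return all(evaluate(a, keywords) for a in args)
    if op == "OR":
        return any(evaluate(a, keywords) for a in args)
    if op == "NOT":
        return not evaluate(args[0], keywords)
    raise ValueError(f"unknown operator: {op}")

# "jaguar AND (car OR engine) AND NOT animal"
pred = ("AND", "jaguar", ("OR", "car", "engine"), ("NOT", "animal"))
print(evaluate(pred, {"jaguar", "engine"}))  # True
print(evaluate(pred, {"jaguar", "animal"}))  # False
```

A focused crawler can use such a predicate both to score fetched pages and to decide which outgoing links are worth following, which is how irrelevant crawling is reduced.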
Water fluoridation and the quality of information available online.
Frangos, Zachary; Steffens, Maryke; Leask, Julie
2018-02-13
The Internet has transformed the way in which people approach their health care, with online resources becoming a primary source of health information. Little work has assessed the quality of online information regarding community water fluoridation. This study sought to assess the information available to individuals searching online for information, with emphasis on the credibility and quality of websites. We identified the top 10 web pages returned from different search engines, using common fluoridation search terms (identified in Google Trends). Web pages were scored using a credibility, quality and health literacy tool based on Global Advisory Committee on Vaccine Safety (GAVCS) and Center for Disease Control and Prevention (CDC) criteria. Scores were compared according to their fluoridation stance and domain type, then ranked by quality. The functionality of the scoring tool was analysed via a Bland-Altman plot of inter-rater reliability. Five-hundred web pages were returned, of which 55 were scored following removal of duplicates and irrelevant pages. Of these, 28 (51%) were pro-fluoridation, 16 (29%) were neutral and 11 (20%) were anti-fluoridation. Pro, neutral and anti-fluoridation pages scored well against health literacy standards (0.91, 0.90 and 0.81/1 respectively). Neutral and pro-fluoridation web pages showed strong credibility, with mean scores of 0.80 and 0.85 respectively, while anti-fluoridation scored 0.62/1. Most pages scored poorly for content quality, providing a moderate amount of superficial information. Those seeking online information regarding water fluoridation are faced with comprehensible, yet poorly referenced, superficial information. Sites were credible and user friendly; however, our results suggest that online resources need to focus on providing more transparent information with appropriate figures to consolidate the information. © 2018 FDI World Dental Federation.
Environmental Models as a Service: Enabling Interoperability ...
Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
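The verb-noun pattern described above (HTTP method as the action, URL as the resource) can be shown with a tiny in-memory dispatcher. This is a generic illustration of REST semantics, not any real modeling API; the endpoint names and the stand-in "model run" are invented:

```python
# Illustrative sketch of the verb-noun REST pattern: the HTTP method (verb)
# selects the action, the URL path (noun) names the resource it acts on.

MODELS = {}  # resource store: model name -> parameter dict

def handle(verb, path, body=None):
    name = path.strip("/").split("/")[-1]    # e.g. /models/streamflow -> streamflow
    if verb == "PUT":                        # create or replace the resource
        MODELS[name] = body
        return 201, body
    if verb == "GET":                        # read the resource
        return (200, MODELS[name]) if name in MODELS else (404, None)
    if verb == "POST":                       # act on the resource (run the model)
        params = MODELS.get(name, {})
        return 200, {"result": sum(params.values())}  # stand-in computation
    return 405, None                         # method not allowed

handle("PUT", "/models/streamflow", {"a": 2, "b": 3})
print(handle("GET", "/models/streamflow"))   # (200, {'a': 2, 'b': 3})
print(handle("POST", "/models/streamflow"))  # (200, {'result': 5})
```

Because the interface is only verbs and nouns, the same requests work regardless of the platform or language the model service is implemented in, which is the interoperability point the passage makes.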
The quality and readability of online consumer information about gynecologic cancer.
Sobota, Aleksandra; Ozakinci, Gozde
2015-03-01
The Internet has become an important source of health-related information for consumers, among whom younger women constitute a notable group. The aims of this study were (1) to evaluate the quality and readability of online information about gynecologic cancer using validated instruments and (2) to relate the quality of information to its readability. Using the Alexa Rank, we obtained a list of 35 Web pages providing information about 7 gynecologic malignancies. These were assessed using the Health on the Net (HON) seal of approval, the Journal of the American Medical Association (JAMA) benchmarks, and the DISCERN instrument. Flesch readability score was calculated for sections related to symptoms and signs and treatment. Less than 30% of the Web pages displayed the HON seal or achieved all JAMA benchmarks. The majority of the treatment sections were of moderate to high quality according to the DISCERN. There was no significant relationship between the presence of the HON seal and readability. Web pages achieving all JAMA benchmarks were significantly more difficult to read and understand than Web pages that missed any of the JAMA benchmarks. Treatment-related content of moderate to high quality as assessed by the DISCERN had a significantly better readability score than the low-quality content. The online information about gynecologic cancer provided by the most frequently visited Web pages is of variable quality and in general difficult to read and understand. The relationship between the quality and readability remains unclear. Health care providers should direct their patients to reliable material online because patients consider the Internet as an important source of information.
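The Flesch score used in the study above is a fixed formula over sentence length and syllable density: 206.835 − 1.015 × (words/sentences) − 84.6 × (syllables/words). A rough sketch with a naive vowel-group syllable counter (real readability tools use pronunciation dictionaries, so treat this as an approximation):

```python
# Sketch of the Flesch Reading Ease score with a naive syllable heuristic.

def count_syllables(word):
    # Count runs of vowels as syllables; crude but serviceable for a demo.
    vowels, count, prev = "aeiouy", 0, False
    for ch in word.lower():
        is_vowel = ch in vowels
        if is_vowel and not prev:
            count += 1
        prev = is_vowel
    return max(count, 1)

def flesch(text):
    sentences = max(text.count(".") + text.count("!") + text.count("?"), 1)
    words = text.split()
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

score = flesch("The cat sat on the mat.")
print(round(score, 1))  # higher scores mean easier reading
```

Scores near 100 correspond to very easy text; dense clinical prose typically lands far lower, which is how "difficult to read and understand" is quantified.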
Web party effect: a cocktail party effect in the web environment
Gerbino, Walter
2015-01-01
In goal-directed web navigation, labels compete for selection: this process often involves knowledge integration and requires selective attention to manage the dizziness of web layouts. Here we ask whether the competition for selection depends on all web navigation options or only on those options that are more likely to be useful for information seeking, and provide evidence in favor of the latter alternative. Participants in our experiment navigated a representative set of real websites of variable complexity, in order to reach an information goal located two clicks away from the starting home page. The time needed to reach the goal was accounted for by a novel measure of home page complexity based on a part of (not all) web options: the number of links embedded within web navigation elements weighted by the number and type of embedding elements. Our measure fully mediated the effect of several standard complexity metrics (the overall number of links, words, images, graphical regions, the JPEG file size of home page screenshots) on information seeking time and usability ratings. Furthermore, it predicted the cognitive demand of web navigation, as revealed by the duration judgment ratio (i.e., the ratio of subjective to objective duration of information search). Results demonstrate that focusing on relevant links while ignoring other web objects optimizes the deployment of attentional resources necessary to navigation. This is in line with a web party effect (i.e., a cocktail party effect in the web environment): users tune into web elements that are relevant for the achievement of their navigation goals and tune out all others. PMID:25802803
Web party effect: a cocktail party effect in the web environment.
Rigutti, Sara; Fantoni, Carlo; Gerbino, Walter
2015-01-01
In goal-directed web navigation, labels compete for selection: this process often involves knowledge integration and requires selective attention to manage the dizziness of web layouts. Here we ask whether the competition for selection depends on all web navigation options or only on those options that are more likely to be useful for information seeking, and provide evidence in favor of the latter alternative. Participants in our experiment navigated a representative set of real websites of variable complexity, in order to reach an information goal located two clicks away from the starting home page. The time needed to reach the goal was accounted for by a novel measure of home page complexity based on a part of (not all) web options: the number of links embedded within web navigation elements weighted by the number and type of embedding elements. Our measure fully mediated the effect of several standard complexity metrics (the overall number of links, words, images, graphical regions, the JPEG file size of home page screenshots) on information seeking time and usability ratings. Furthermore, it predicted the cognitive demand of web navigation, as revealed by the duration judgment ratio (i.e., the ratio of subjective to objective duration of information search). Results demonstrate that focusing on relevant links while ignoring other web objects optimizes the deployment of attentional resources necessary to navigation. This is in line with a web party effect (i.e., a cocktail party effect in the web environment): users tune into web elements that are relevant for the achievement of their navigation goals and tune out all others.
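The complexity measure described above counts only the links embedded within navigation elements, weighted by the embedding element. A toy sketch of that idea using the standard-library HTML parser (the element weights are invented; the paper's actual weighting scheme is not reproduced here):

```python
# Sketch in the spirit of the paper's complexity measure: count links inside
# navigation elements, weighted by the element that embeds them, and ignore
# links elsewhere on the page. Weights below are hypothetical.

from html.parser import HTMLParser

WEIGHTS = {"nav": 1.0, "ul": 0.5}  # hypothetical per-element weights

class NavLinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack, self.score = [], 0.0

    def handle_starttag(self, tag, attrs):
        if tag in WEIGHTS:
            self.stack.append(tag)
        elif tag == "a" and self.stack:
            # Weight each link by its innermost embedding navigation element.
            self.score += WEIGHTS[self.stack[-1]]

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

p = NavLinkCounter()
p.feed('<nav><a href="/">home</a><ul><a href="/x">x</a></ul></nav><a href="/y">y</a>')
print(p.score)  # 1.5 -- the link outside <nav> contributes nothing
```

The design choice mirrors the finding: a measure that deliberately ignores non-navigation objects predicts seeking time better than counting everything on the page.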
Continuing Education for Department of Defense Health Professionals
2015-11-24
American Pharmacists Association, 60 and American Nurses Association. 61 These associations and other health-focused organizations, including health...1298. Accessed May 29, 2014. 60. American Pharmacists Association. Learn [Web page]. 2014; http://www.pharmacist.com/node/26541. Accessed May 29...American Pharmacists Association. Learn [Web page]. 2014; http://www.pharmacist.com/node/26541. Accessed May 29, 2014. 61. American Nurses Association
Blues for the Lecture Theatre--The Pharmacology Songbook
ERIC Educational Resources Information Center
MacDonald, Ewen; Saarti, Jarmo
2006-01-01
In 2005, we were able to digitally record the so-called pharmacology songbook; a set of songs with lyrics devoted to pharmacological topics. A CD was prepared entitled The Beta-blocker Blues and its contents are now all freely available in mp3 format from our web page (Ewen MacDonald & friends, 2005). The web page also contains the lyrics and…
ERIC Educational Resources Information Center
Chou, Huey-Wen; Wang, Yu-Fang
1999-01-01
Compares the effects of two training methods on computer attitude and performance in a World Wide Web page design program in a field experiment with high school students in Taiwan. Discusses individual differences, Kolb's Experiential Learning Theory and Learning Style Inventory, Computer Attitude Scale, and results of statistical analyses.…
Satellite Imagery Products - Office of Satellite and Product Operations
ERIC Educational Resources Information Center
Slowinski, Joseph
1999-01-01
Offers suggestions on how to add the power of a free online translator, links, and multicultural search engines to a teacher's classroom home page. Describes the Alta Vista Babelfish online translation service that can be used to translate Web pages on a variety of topics written in German, Spanish, Italian, French, or Portuguese. (SLD)
How To Do Field Searching in Web Search Engines: A Field Trip.
ERIC Educational Resources Information Center
Hock, Ran
1998-01-01
Describes the field search capabilities of selected Web search engines (AltaVista, HotBot, Infoseek, Lycos, Yahoo!) and includes a chart outlining what fields (date, title, URL, images, audio, video, links, page depth) are searchable, where to go on the page to search them, the syntax required (if any), and how field search queries are entered.…
Self-presentation on the Web: agencies serving abused and assaulted women.
Sorenson, Susan B; Shi, Rui; Zhang, Jingwen; Xue, Jia
2014-04-01
We examined the content and usability of the Web sites of agencies serving women victims of violence. We entered the names of a systematic 10% sample of 3774 agencies listed in 2 national directories into a search engine. We captured screenshots (in April 2012) of the 261 resulting home pages and analyzed them, along with the readability of 193 home and first-level pages. Victims (94%) and donors (68%) were the primary intended audiences. About one half used social media and one third provided cues to action. Almost all (96.4%) of the Web pages were rated "fairly difficult" to "very confusing" to read, and 81.4% required more than a ninth-grade education to understand. The service and marketing functions were met fairly well by the agency home pages, but usability (particularly readability and the offer of a mobile version) and efforts to increase user safety could be improved. Internet technologies are an essential platform for public health. They are particularly useful for reaching people with stigmatized health conditions because of the anonymity allowed. The one third of agencies that lack a Web site will not reach the substantial portion of the population that uses the Internet to find health information and other resources.
Table Extraction from Web Pages Using Conditional Random Fields to Extract Toponym Related Data
NASA Astrophysics Data System (ADS)
Luthfi Hanifah, Hayyu'; Akbar, Saiful
2017-01-01
Tables are one of the ways to visualize information on web pages. The abundant number of web pages that compose the World Wide Web has been the motivation of information extraction and information retrieval research, including research on table extraction. In addition, there is a need for a system designed specifically to handle location-related information. Based on this background, this research is conducted to provide a way to extract location-related data from web tables so that it can be used in the development of a Geographic Information Retrieval (GIR) system. The location-related data is identified by the toponym (location name). In this research, a rule-based approach with a gazetteer is used to recognize toponyms in web tables. Meanwhile, to extract data from a table, a combination of a rule-based approach and a statistical approach is used. In the statistical approach, a Conditional Random Fields (CRF) model is used to understand the schema of the table. The result of table extraction is presented in JSON format. If a web table contains a toponym, a field is added to the JSON document to store the toponym values. This field can be used to index the table data in accordance with the toponym, which can then be used in the development of a GIR system.
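The pipeline the abstract describes, recognising toponyms with a gazetteer lookup and then emitting JSON with an extra toponym field, can be sketched as follows. The gazetteer entries, field names, and row format here are illustrative assumptions, not those of the actual system:

```python
import json

# Hypothetical miniature gazetteer; the paper uses a full gazetteer resource.
GAZETTEER = {"jakarta", "bandung", "surabaya"}

def extract_table(rows):
    """Convert extracted table rows to a JSON document, adding a
    'toponym' field only when a cell value matches the gazetteer."""
    doc = {"rows": rows}
    toponyms = sorted({cell for row in rows for cell in row
                       if cell.lower() in GAZETTEER})
    if toponyms:  # the extra field is present only for tables with toponyms
        doc["toponym"] = toponyms
    return json.dumps(doc)
```

A table whose first column holds city names would then be indexable by its `toponym` field, while a table with no gazetteer matches produces plain JSON with no extra field.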
Analysis of Technique to Extract Data from the Web for Improved Performance
NASA Astrophysics Data System (ADS)
Gupta, Neena; Singh, Manish
2010-11-01
The World Wide Web is rapidly leading the world into a new electronic era, where anyone can publish anything in electronic form and extract almost any information. Extraction of information from semi-structured or unstructured documents, such as web pages, is a useful yet complex task. Data extraction, which is important for many applications, extracts records from HTML files automatically. Ontologies can achieve a high degree of accuracy in data extraction. We analyze a data extraction method, OBDE (Ontology-Based Data Extraction), which automatically extracts query result records from the web with the help of agents. OBDE first constructs an ontology for a domain according to information matching between the query interfaces and query result pages from different web sites within the same domain. Then, the constructed domain ontology is used during data extraction to identify the query result section in a query result page and to align and label the data values in the extracted records. The ontology-assisted data extraction method is fully automatic and overcomes many of the deficiencies of current automatic data extraction methods.
The effects of link format and screen location on visual search of web pages.
Ling, Jonathan; Van Schaik, Paul
2004-06-22
Navigation of web pages is of critical importance to the usability of web-based systems such as the World Wide Web and intranets. The primary means of navigation is through the use of hyperlinks. However, few studies have examined the impact of the presentation format of these links on visual search. The present study used a two-factor mixed measures design to investigate whether there was an effect of link format (plain text, underlined, bold, or bold and underlined) upon speed and accuracy of visual search and subjective measures in both the navigation and content areas of web pages. An effect of link format on speed of visual search for both hits and correct rejections was found. This effect was observed in the navigation and the content areas. Link format did not influence accuracy in either screen location. Participants showed highest preference for links that were in bold and underlined, regardless of screen area. These results are discussed in the context of visual search processes and design recommendations are given.
Singer, Philipp; Helic, Denis; Taraghi, Behnam; Strohmaier, Markus
2014-01-01
One of the most frequently used models for understanding human navigation on the Web is the Markov chain model, where Web pages are represented as states and hyperlinks as probabilities of navigating from one page to another. Predominantly, human navigation on the Web has been thought to satisfy the memoryless Markov property stating that the next page a user visits only depends on her current page and not on previously visited ones. This idea has found its way in numerous applications such as Google's PageRank algorithm and others. Recently, new studies suggested that human navigation may better be modeled using higher order Markov chain models, i.e., the next page depends on a longer history of past clicks. Yet, this finding is preliminary and does not account for the higher complexity of higher order Markov chain models which is why the memoryless model is still widely used. In this work we thoroughly present a diverse array of advanced inference methods for determining the appropriate Markov chain order. We highlight strengths and weaknesses of each method and apply them for investigating memory and structure of human navigation on the Web. Our experiments reveal that the complexity of higher order models grows faster than their utility, and thus we confirm that the memoryless model represents a quite practical model for human navigation on a page level. However, when we expand our analysis to a topical level, where we abstract away from specific page transitions to transitions between topics, we find that the memoryless assumption is violated and specific regularities can be observed. We report results from experiments with two types of navigational datasets (goal-oriented vs. free form) and observe interesting structural differences that make a strong argument for more contextual studies of human navigation in future work. PMID:25013937
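The order-selection question the paper studies can be illustrated with a toy maximum-likelihood comparison of a memoryless (order 0) versus first-order model. This is a minimal sketch, not the authors' inference methods; the clickstream symbols are invented, and as the paper stresses, real model selection must also penalise the extra parameters of the higher-order model (e.g. via AIC/BIC):

```python
from collections import Counter
from math import log

def loglik(seq, order):
    """Max-likelihood log-likelihood of a Markov chain of the given order
    (order 0 = page frequencies, order 1 = page-to-page transitions)."""
    ctx = Counter()    # counts of each history (context)
    trans = Counter()  # counts of (history, next page) pairs
    for i in range(order, len(seq)):
        h = tuple(seq[i - order:i])
        ctx[h] += 1
        trans[(h, seq[i])] += 1
    return sum(n * log(n / ctx[h]) for (h, _), n in trans.items())

# Toy clickstream where the next page clearly depends on the previous one:
clicks = list("ABABABABCABABAB")
ll0, ll1 = loglik(clicks, 0), loglik(clicks, 1)
assert ll1 > ll0  # the first-order model fits this alternating stream far better
```

On data that is genuinely memoryless, the likelihood gain of order 1 would be too small to justify its quadratically larger parameter count, which is the paper's page-level finding.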
Students using visual thinking to learn science in a Web-based environment
NASA Astrophysics Data System (ADS)
Plough, Jean Margaret
United States students' science test scores are low, especially in problem solving, and traditional science instruction could be improved. Consequently, visual thinking, constructing science structures, and problem solving in a web-based environment may be valuable strategies for improving science learning. This ethnographic study examined the science learning of fifteen fourth grade students in an after school computer club involving diverse students at an inner city school. The investigation was done from the perspective of the students, and it described the processes of visual thinking, web page construction, and problem solving in a web-based environment. The study utilized informal group interviews, field notes, Visual Learning Logs, and student web pages, and incorporated a Standards-Based Rubric which evaluated students' performance on eight science and technology standards. The Visual Learning Logs were drawings done on the computer to represent science concepts related to the Food Chain. Students used the internet to search for information on a plant or animal of their choice. Next, students used this internet information, with the information from their Visual Learning Logs, to make web pages on their plant or animal. Later, students linked their web pages to form Science Structures. Finally, students linked their Science Structures with the structures of other students, and used these linked structures as models for solving problems. Further, during informal group interviews, students answered questions about visual thinking, problem solving, and science concepts. The results of this study showed clearly that (1) making visual representations helped students understand science knowledge, (2) making links between web pages helped students construct Science Knowledge Structures, and (3) students themselves said that visual thinking helped them learn science. 
In addition, this study found that when using Visual Learning Logs, the main overall ideas of the science concepts were usually represented accurately. Further, looking for information on the internet may cause new problems in learning. Likewise, being absent, starting late, and/or dropping out all may negatively influence students' proficiency on the standards. Finally, the way Science Structures are constructed and linked may provide insights into the way individual students think and process information.
Google Analytics: Single Page Traffic Reports
These are pages that live outside of Google Analytics (GA) but allow you to view GA data for any individual page on either the public EPA web or EPA intranet. You do need to log in to Google Analytics to view them.
A readability assessment of online stroke information.
Sharma, Nikhil; Tridimas, Andreas; Fitzsimmons, Paul R
2014-07-01
Patients and carers increasingly access the Internet as a source of health information. Poor health literacy is extremely common and frequently limits patients' comprehension of health care information. We aimed to assess the readability of online consumer-orientated stroke information using 2 validated readability measures. The 100 highest Google-ranked consumer-orientated stroke Web pages were assessed for reading difficulty using the Flesch-Kincaid and Simple Measure of Gobbledygook (SMOG) formulae. None of the included Web pages complied with the current readability guidelines when readability was measured using the gold standard SMOG formula. Mean Flesch-Kincaid grade level was 10.4 (95% confidence interval [CI] 9.97-10.9) and mean SMOG grade 12.1 (95% CI 11.7-12.4). Over half of the Web pages were produced at graduate reading levels or above. Not-for-profit Web pages were significantly easier to read (P=.0006). The Flesch-Kincaid formula significantly underestimated reading difficulty, with a mean underestimation of 1.65 grades (95% CI 1.49-1.81), P<.0001. Most consumer-orientated stroke information Web sites require major text revision to comply with readability guidelines and to be comprehensible to the average patient. The Flesch-Kincaid formula significantly underestimates reading difficulty, and SMOG should be used as the measure of choice. Copyright © 2014 National Stroke Association. Published by Elsevier Inc. All rights reserved.
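The two measures used in the study have published formulas: Flesch-Kincaid grade = 0.39(words/sentences) + 11.8(syllables/words) − 15.59, and SMOG grade = 1.0430·sqrt(polysyllables · 30/sentences) + 3.1291. A rough sketch follows; the vowel-group syllable counter is a deliberately naive stand-in for the dictionary-based counting real readability tools use:

```python
import re
from math import sqrt

def syllables(word):
    """Naive vowel-group syllable count (assumption; real tools use dictionaries)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    """Return (Flesch-Kincaid grade, SMOG grade) using the published constants."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = [syllables(w) for w in words]
    poly = sum(s >= 3 for s in syl)  # polysyllabic words (3+ syllables)
    fk = 0.39 * len(words) / sentences + 11.8 * sum(syl) / len(words) - 15.59
    smog = 1.0430 * sqrt(poly * 30 / sentences) + 3.1291
    return fk, smog
```

Because SMOG counts only polysyllabic words, text dense in long medical terms raises the SMOG grade faster than the Flesch-Kincaid grade, consistent with the underestimation the study reports for Flesch-Kincaid.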
ERIC Educational Resources Information Center
Fitzgerald, Mary Ann; Gregory, Vicki L.; Brock, Kathy; Bennett, Elizabeth; Chen, Shu-Hsien Lai; Marsh, Emily; Moore, Joi L.; Kim, Kyung-Sun; Esser, Linda R.
2002-01-01
Chapters in this section of "Educational Media and Technology Yearbook" examine important trends prominent in the landscape of the school library media profession in 2001. Themes include mandated educational reform; diversity in school library resources; communication through image-text juxtaposition in Web pages; and professional development and…
Web technology for emergency medicine and secure transmission of electronic patient records.
Halamka, J D
1998-01-01
The American Heritage dictionary defines the word "web" as "something intricately contrived, especially something that ensnares or entangles." The wealth of medical resources on the World Wide Web is now so extensive, yet disorganized and unmonitored, that such a definition seems fitting. In emergency medicine, for example, a field in which accurate and complete information, including patients' records, is urgently needed, more than 5000 Web pages are available today, whereas fewer than 50 were available in December 1994. Most sites are static Web pages using the Internet to publish textbook material, but new technology is extending the scope of the Internet to include online medical education and secure exchange of clinical information. This article lists some of the best Web sites for use in emergency medicine and then describes a project in which the Web is used for transmission and protection of electronic medical records.
A tool for improving the Web accessibility of visually handicapped persons.
Fujiki, Tadayoshi; Hanada, Eisuke; Yamada, Tomomi; Noda, Yoshihiro; Antoku, Yasuaki; Nakashima, Naoki; Nose, Yoshiaki
2006-04-01
Much has been written concerning the difficulties faced by visually handicapped persons when they access the internet. To solve some of the problems and to make web pages more accessible, we developed a tool we call the "Easy Bar," which works as a toolbar on the web browser. The functions of the Easy Bar are to change the size of web texts and images, to adjust the color, and to clear cached data that is automatically saved by the web browser. These functions are executed with ease by clicking buttons and operating a pull-down list. Since the icons built into Easy Bar are quite large, it is not necessary for the user to deal with delicate operations. The functions of Easy Bar run on any web page without increasing the processing time. For the visually handicapped, Easy Bar would contribute greatly to improved web accessibility to medical information.
Use of camera drive in stereoscopic display of learning contents of introductory physics
NASA Astrophysics Data System (ADS)
Matsuura, Shu
2011-03-01
Simple 3D physics simulations with stereoscopic display were created for a part of introductory physics e-Learning. First, the cameras viewing the 3D world were made controllable by the user. This enabled users to observe the system and the motions of objects from any position in the 3D world. Second, cameras were made attachable to one of the moving objects in the simulation so as to observe the relative motion of other objects. With this option, it was found that users perceive velocity and acceleration more sensibly on a stereoscopic display than on a non-stereoscopic 3D display. Simulations were made using Adobe Flash ActionScript, and the Papervision3D library was used to render the 3D models in the Flash web pages. To display the stereogram, two viewports from virtual cameras were displayed in parallel in the same web page. For observation of the stereogram, the images of the two viewports were superimposed using a 3D stereogram projection box (T&TS CO., LTD.) and projected on an 80-inch screen. The virtual cameras were controlled by keyboard and also by Nintendo Wii remote controller buttons. In conclusion, stereoscopic display offers learners more opportunities to play with the simulated models and to perceive the characteristics of motion better.
Recent El Niño brought downpour of media coverage
NASA Astrophysics Data System (ADS)
Hare, Steven R.
Media coverage of the 1997-1998 tropical ocean warming event made the term "El Niño" a household word. So pervasive was coverage of El Niño that it became the fodder of late night talk show monologues and an oft-invoked gremlin responsible for many of society's ailments. As a fisheries biologist studying climate impacts on marine resources, I followed the event very closely and created an El Niño Web site (http://www.iphc.washington.edu/PAGES/IPHC/Staff/hare/html/1997ENSO/1997ENSO.html) in the spring of 1997, when the magnitude of the event was becoming obvious. As part of my daily routine in updating the Web page, I began tracking El Niño media coverage over the Internet. Between June 1997 and July 1998, I accumulated links to stories about El Niño. I attempted to maintain a constant level of effort so that the number of stories accurately reflected the level of coverage given the event as it progressed. In fisheries lingo, this is known as a Catch Per Unit Effort (CPUE) index. Because Internet content is often removed after a period of time, a retrospective accumulation of daily stories would not yield as accurate a count as the contemporary CPUE index I maintained.
Student satisfaction with a Website designed for three nursing courses.
Zwolski, K
2000-01-01
The website described was not designed to replace classroom teaching, but to serve as an additional tool for students attending a traditional course. Based on my experience and the data obtained from the evaluation questionnaire, the following points can be made: students are enthusiastic about the Internet and will access a web page that accompanies a particular course or courses; a website can allow objectives not normally engendered by traditional methods to be achieved, which may include, for instance, fostering a sense of community, providing new means of communication between professor and student, and serving as a portal to the vast resources of the Internet; a single-theme website can effectively address the learning needs of students at different levels, in this case both undergraduate and graduate students; and a well-designed website can increase the visibility of the educational institution that sponsors it. It is not easy to measure a website's effectiveness in helping students achieve traditional course objectives or its impact on student learning. The questionnaire results confirm students' satisfaction with the website and their belief that it was an important and useful learning tool. This is significant and positive. Future research is needed to measure the degree to which a website can increase learning in a particular area. The site required about 150 hours to construct and about 6-8 hours per week to maintain. This is a considerable amount of faculty time. Although I cannot speak for others, I firmly believe that this is a worthwhile investment. The website is clearly appreciated by students, and it seems logical to conclude that it is fulfilling some learning needs that may not be met by other methods. In addition, it provides the educator with a new vehicle for communication. It is exhilarating to create with new formats and to use expertise in a given area to reach students, foster community, and establish a presence beyond the classroom.
Creating and maintaining a web page is labor intensive, but it is, in my opinion, worth the effort. I strongly urge nurse educators to explore the possibilities of developing websites to accompany individual courses and to consider even more interactive web pages that include online discussion groups and provide space for posting student work. An active website needs frequent maintenance and updates. I recommend that academic administrators recognize web authoring as a valid and legitimate activity and provide nursing faculty with necessary support. This might include workshops on web authoring or Internet use, released time or credited time for initial website design, and credit allocation for site maintenance. The Internet is the most extensive collection of information available. As webmaster and pathophysiology expert, I am guiding my students; as a teacher, I am both assuming and recognizing a new role. As a teacher, I need to assume the responsibility for guiding students to worthwhile resources in the subject area. The website is a portal to the world, but a portal that I oversee.
Thompson, Terrill; Burgstahler, Sheryl; Moore, Elizabeth J
2010-01-01
This article reports on a follow-up assessment to Thompson et al. (Proceedings of The First International Conference on Technology-based Learning with Disability, July 19-20, Dayton, Ohio, USA; 2007. pp 127-136), in which higher education home pages were evaluated over a 5-year period on their accessibility to individuals with disabilities. The purpose of this article is to identify trends in web accessibility and long-term impact of outreach and education. Home pages from 127 higher education institutions in the Northwest were evaluated for accessibility three times over a 6-month period in 2004-2005 (Phase I), and again in 2009 (Phase II). Schools in the study were offered varying degrees of training and/or support on web accessibility during Phase I. Pages were evaluated for accessibility using a set of manual checkpoints developed by the researchers. Over the 5-year period reported in this article, significant positive gains in accessibility were revealed on some measures, but accessibility declined on other measures. The areas of improvement are arguably the more basic, easy-to-implement accessibility features, while the area of decline is keyboard accessibility, which is likely associated with the emergence of dynamic new technologies on web pages. Even on those measures where accessibility is improving, it is still strikingly low. In Phase I of the study, institutions that received extensive training and support were more likely than other institutions to show improved accessibility on the measures where institutions improved overall, but were equally or more likely than others to show a decline on measures where institutions showed an overall decline. In Phase II, there was no significant difference between institutions who had received support earlier in the study, and those who had not. 
Results suggest that growing numbers of higher education institutions in the Northwest are motivated to add basic accessibility features to their home pages, and that outreach and education may have a positive effect on these measures. However, the results also reveal negative trends in accessibility, and outreach and education may not be strong enough to counter the factors that motivate institutions to deploy inaccessible emerging technologies. Further research is warranted toward identifying the motivational factors that are associated with increased and decreased web accessibility, and much additional work is needed to ensure that higher education web pages are accessible to individuals with disabilities.
Web-based pathology practice examination usage.
Klatt, Edward C
2014-01-01
General and subject specific practice examinations for students in health sciences studying pathology were placed onto a free public internet web site entitled WebPath and were accessed four clicks from the home web site menu. Multiple choice questions were coded into .html files with JavaScript functions for web browser viewing in a timed format. A Perl programming language script with common gateway interface for web page forms scored examinations and placed results into a log file on an internet computer server. The four general review examinations of 30 questions each could be completed in up to 30 min. The 17 subject specific examinations of 10 questions each with accompanying images could be completed in up to 15 min each. The results of scores and user educational field of study from log files were compiled from June 2006 to January 2014. The four general review examinations had 31,639 accesses with completion of all questions, for a completion rate of 54% and average score of 75%. A score of 100% was achieved by 7% of users, ≥90% by 21%, and ≥50% score by 95% of users. In top to bottom web page menu order, review examination usage was 44%, 24%, 17%, and 15% of all accessions. The 17 subject specific examinations had 103,028 completions, with completion rate 73% and average score 74%. Scoring at 100% was 20% overall, ≥90% by 37%, and ≥50% score by 90% of users. The first three menu items on the web page accounted for 12.6%, 10.0%, and 8.2% of all completions, and the bottom three accounted for no more than 2.2% each. Completion rates were higher for shorter 10 questions subject examinations. Users identifying themselves as MD/DO scored higher than other users, averaging 75%. Usage was higher for examinations at the top of the web page menu. Scores achieved suggest that a cohort of serious users fully completing the examinations had sufficient preparation to use them to support their pathology education.
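The study's Perl CGI script wrote each attempt to a log file that was later compiled into completion rates and average scores. A hedged sketch of that aggregation step follows; the log line format and field names are invented assumptions, not the actual WebPath log layout:

```python
from statistics import mean

# Hypothetical log: one line per attempt, comma-separated fields:
# exam_id, user field of study, questions answered, questions total, score percent
LOG = """general1,MD/DO,30,30,80
general1,student,12,30,0
subject3,MD/DO,10,10,70
subject3,nurse,10,10,90"""

def summarise(log, exam):
    """Completion rate and average score of completed attempts for one exam."""
    rows = [line.split(",") for line in log.splitlines()
            if line.startswith(exam + ",")]
    done = [r for r in rows if r[2] == r[3]]  # all questions were answered
    completion = len(done) / len(rows)
    avg = mean(float(r[4]) for r in done)
    return completion, avg
```

Running `summarise(LOG, "general1")` on this toy log gives a 50% completion rate, mirroring the kind of per-examination statistics the abstract reports.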
Discovering Authorities and Hubs in Different Topological Web Graph Structures.
ERIC Educational Resources Information Center
Meghabghab, George
2002-01-01
Discussion of citation analysis on the Web considers Web hyperlinks as a source to analyze citations. Topics include basic graph theory applied to Web pages, including matrices, linear algebra, and Web topology; and hubs and authorities, including a search technique called HITS (Hyperlink Induced Topic Search). (Author/LRW)
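HITS, the search technique the article discusses, iteratively reinforces two scores over the hyperlink graph: a page is a good authority if good hubs link to it, and a good hub if it links to good authorities. A compact sketch, with an illustrative graph and a simple sum-to-one normalisation choice:

```python
def hits(links, iters=50):
    """HITS on a directed graph given as {page: [pages it links to]}.
    Returns (hub, authority) score dicts, each normalised to sum to 1."""
    nodes = set(links) | {v for vs in links.values() for v in vs}
    hub = {n: 1.0 for n in nodes}
    auth = {n: 1.0 for n in nodes}
    for _ in range(iters):
        # authority score: sum of hub scores of pages linking in
        auth = {n: sum(hub[u] for u, vs in links.items() if n in vs)
                for n in nodes}
        # hub score: sum of authority scores of pages linked to
        hub = {n: sum(auth[v] for v in links.get(n, ())) for n in nodes}
        for d in (auth, hub):  # normalise so scores stay comparable
            s = sum(d.values()) or 1.0
            for n in d:
                d[n] /= s
    return hub, auth

# Pages a and b both cite c, so c should emerge as the authority.
hub, auth = hits({"a": ["c"], "b": ["c"], "c": []})
assert auth["c"] == max(auth.values())
```

In matrix terms this is power iteration on the adjacency matrix A: authorities converge to the principal eigenvector of AᵀA and hubs to that of AAᵀ, which is the linear-algebra view of Web topology the article covers.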
TIM, a ray-tracing program for METATOY research and its dissemination
NASA Astrophysics Data System (ADS)
Lambert, Dean; Hamilton, Alasdair C.; Constable, George; Snehanshu, Harsh; Talati, Sharvil; Courtial, Johannes
2012-03-01
TIM (The Interactive METATOY) is a ray-tracing program specifically tailored towards our research in METATOYs, which are optical components that appear to be able to create wave-optically forbidden light-ray fields. For this reason, TIM possesses features not found in other ray-tracing programs. TIM can either be used interactively or by modifying the openly available source code; in both cases, it can easily be run as an applet embedded in a web page. Here we describe the basic structure of TIM's source code and how to extend it, and we give examples of how we have used TIM in our own research. Program summary: Program title: TIM; Catalogue identifier: AEKY_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKY_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: GNU General Public License; No. of lines in distributed program, including test data, etc.: 124 478; No. of bytes in distributed program, including test data, etc.: 4 120 052; Distribution format: tar.gz; Programming language: Java; Computer: Any computer capable of running the Java Virtual Machine (JVM) 1.6; Operating system: Any (developed under Mac OS X Version 10.6); RAM: Typically 145 MB (interactive version running under Mac OS X Version 10.6); Classification: 14, 18; External routines: JAMA [1] (source code included); Nature of problem: Visualisation of scenes that include scene objects that create wave-optically forbidden light-ray fields; Solution method: Ray tracing; Unusual features: Specifically designed to visualise wave-optically forbidden light-ray fields; can visualise ray trajectories; can visualise geometric optic transformations; can create anaglyphs (for viewing with coloured "3D glasses") and random-dot autostereograms of the scene; integrable into web pages; Running time: Problem-dependent; typically seconds for a simple scene.
10 CFR 9.35 - Duplication fees.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., Rockville, Maryland, may be found on the NRC's Web site at http://www.nrc.gov/reading-rm/pdr/copy-service...″ reduced). Pages 11″ × 17″ are $0.30 per page. Pages larger than 11″ × 17″, including engineering drawings...
10 CFR 9.35 - Duplication fees.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., Rockville, Maryland, may be found on the NRC's Web site at http://www.nrc.gov/reading-rm/pdr/copy-service...″ reduced). Pages 11″ × 17″ are $0.30 per page. Pages larger than 11″ × 17″, including engineering drawings...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-31
... today's final rule will be required on March 7, 2015 and January 1, 2018, as set forth in Table I.1 in... information that is exempt from public disclosure. A link to the docket web page can be found at: www.regulations.gov/#!docketDetail ;D=EERE-2008-BT-STD-0019. The regulations.gov web page contains instructions on...
THUIR at TREC 2009 Web Track: Finding Relevant and Diverse Results for Large Scale Web Search
2009-11-01
"Porn word" filtering is also one of the anti-spam techniques in real world search engines. A list of porn words was found from the internet [2...When the number of porn words in a page is larger than α, the page is taken as spam. In our experiments, the threshold is set to 16
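The thresholded word-count filter described can be sketched directly. The word list here is a placeholder (the real list was harvested from the web), while α = 16 is the threshold reported in the abstract:

```python
import re

# Placeholder terms standing in for the harvested word list (assumption).
FILTER_WORDS = {"foo", "bar", "baz"}
ALPHA = 16  # threshold reported in the track notebook

def is_spam(page_text, alpha=ALPHA):
    """Flag a page as spam when its count of listed words exceeds alpha."""
    words = re.findall(r"[a-z]+", page_text.lower())
    return sum(w in FILTER_WORDS for w in words) > alpha

assert is_spam("foo " * 20)
assert not is_spam("clean page text")
```

A strict greater-than comparison means a page with exactly α listed words is kept; whether the original system used > or ≥ is not stated in the abstract.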
ERIC Educational Resources Information Center
Gallo, Gail; Wichowski, Chester P.
This second of two guides on Netscape Communicator 4.5 contains six lessons on advanced searches, multimedia, and composing a World Wide Web page. Lesson 1 is a review of the Navigator window, toolbars, and menus. Lesson 2 covers AltaVista's advanced search tips, searching for information excluding certain text, and advanced and nested Boolean…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-15
....gov . To view a copy of this information collection request (ICR) submitted to OMB: (1) Go to the web page http://reginfo.gov/public/do/PRAMain , (2) look for the section of the web page called ``Currently..., digital, LPTV and TV translator stations. The CBPA directs that Class A stations must comply with the...
Lafrenière, Darquise; Hurlimann, Thierry; Menuz, Vincent; Godard, Béatrice
2014-10-01
The push for knowledge translation on the part of health research funding agencies is significant in Canada, and many strategies have been adopted to promote the conversion of knowledge into action. In recent years, an increasing number of health researchers have been studying arts-based interventions to transform knowledge into action. This article reports on the results of an online questionnaire aimed at evaluating the effectiveness of a knowledge dissemination intervention (KDI) conveying findings from a study on the scientific and ethical challenges raised by nutrigenomics-nutrigenetics (NGx) research. The KDI was based on the use of four Web pages combining original, interactive cartoon-like illustrations accompanied by text to disseminate findings to Canadian Research Ethics Boards members, as well as to NGx researchers and researchers in ethics worldwide. Between May and October 2012, the links to the Web pages were sent in a personal email to target audience members, one thematic Web page at a time. On each thematic Web page, members of the target audience were invited to answer nine evaluation questions assessing the effectiveness of the KDI on four criteria, (i) acquisition of knowledge; (ii) change in initial understanding; (iii) generation of questions from the findings; and (iv) intent to change own practice. Response rate was low; results indicate that: (i) content of the four Web pages did not bring new knowledge to a majority of the respondents, (ii) initial understanding of the findings did not change for a majority of NGx researchers and a minority of ethics respondents, (iii) although the KDI did raise questions for respondents, it did not move them to change their practice. While target end-users may not feel that they actually learned from the KDI, it seems that the findings conveyed encouraged reflection and raised useful and valuable questions for them. 
Moreover, the evaluation of the KDI proved to be useful to gain knowledge about our target audiences' views since respondents' comments allowed us to improve our understanding of the disseminated knowledge as well as to modify (and hopefully improve) the content of the Web pages used for dissemination. Copyright © 2014 Elsevier Ltd. All rights reserved.
Chilet-Rosell, Elisa; Martín Llaguno, Marta; Ruiz Cantero, María Teresa; Alonso-Coello, Pablo
2010-03-16
The balance of the benefits and risks of long-term use of hormone replacement therapy (HRT) has been a matter of debate for decades. In Europe, HRT requires a medical prescription and its advertising is only permitted when aimed at health professionals (direct-to-consumer advertising is allowed in some non-European countries). The objective of this study is to analyse the appropriateness and quality of Internet advertising about HRT in Spain. A search was carried out on the Internet (January 2009) using the eight best-selling HRT drugs in Spain. The brand name of each drug was entered into Google's search engine. The web sites appearing on the first page of results and the corresponding companies were analysed using the European Code of Good Practice as the reference point. Of the five corporate web pages, none included bibliographic references or measures to ensure that the advertising was only accessible to health professionals. Regarding non-corporate web pages (n = 27): 41% did not include the company name or address, 44% made no distinction between patient and health professional information, 7% contained bibliographic references, 26% provided unspecific information for the use of HRT for osteoporosis and 19% included menstrual cycle regulation or boosting femininity as an indication. Two online pharmacies sold HRT drugs which could be bought online in Spain; they did not include the name or contact details of the registered company, nor did they stipulate the need for a medical prescription or differentiate between patient and health professional information. Even though pharmaceutical companies have committed themselves to compliance with codes of good practice, deficiencies were observed regarding the identification, information and promotion of HRT medications on their web pages. Unaffected by legislation, non-corporate web pages are an ideal place for indirect HRT advertising, but they often contain misleading information.
HRT can be bought online from Spain without a medical consultation or prescription, constituting a serious issue for public health. In our information society, it is the right and obligation of public health bodies to ensure that such information is not misleading.
Lemaire, Edward; Greene, G
2003-01-01
We produced continuing education material in physical rehabilitation using a variety of electronic media. We compared four methods of delivering the learning modules: in person with a computer projector, desktop videoconferencing, Web pages and CD-ROM. Health-care workers at eight community hospitals and two nursing homes were asked to participate in the project. A total of 394 questionnaires were received for all modalities: 73 for in-person sessions, 50 for desktop conferencing, 227 for Web pages and 44 for CD-ROM. This represents a 100% response rate from the in-person, desktop conferencing and CD-ROM groups; the response rate for the Web group is unknown, since the questionnaires were completed online. Almost all participants found the modules to be helpful in their work. The CD-ROM group gave significantly higher ratings than the Web page group, although all four learning modalities received high ratings. A combination of all four modalities would be required to provide the best possible learning opportunity.
The efficacy of a Web-based counterargument tutor.
Wolfe, Christopher R; Britt, M Anne; Petrovic, Melina; Albrecht, Michael; Kopp, Kristopher
2009-08-01
In two experiments, we developed and tested an interactive Web-based tutor to help students identify and evaluate counterarguments. In Experiment 1, we determined the extent to which high- and low-argumentation-ability participants were able to identify counterarguments. We tested the effectiveness of having participants read didactic text regarding counterarguments and highlight claims. Both preparations had some positive effects that were often limited to high-ability participants. The Web-based intervention included interactive exercises on identifying and using counterarguments. The Web-based presentation was state driven, using a JavaServer Pages (JSP) page. As participants progressively identified argument elements, the page changed display state and presented feedback by checking what the user clicked against elements that we had coded in XML beforehand. Instructions and feedback strings were indexed by state, so that changing state selected new text to display. In Experiment 2, the tutor was effective in teaching participants to identify counterarguments, recognize responses, and determine whether counterarguments were rebutted, dismissed, or conceded.
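The state-driven mechanism described above can be sketched as follows. This is a minimal illustrative Python model, not the authors' actual JSP/XML implementation; the exercise content, element names, and feedback strings are all invented for the sketch.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML coding of argument elements for one exercise
# (the real tutor coded argument elements in XML beforehand).
EXERCISE_XML = """
<exercise>
  <element id="claim">School uniforms improve focus.</element>
  <element id="counterargument">Uniforms suppress self-expression.</element>
  <element id="rebuttal">Expression is possible through other means.</element>
</exercise>
"""

# Feedback strings indexed by state, as the abstract describes.
FEEDBACK = {
    0: "Click the main claim.",
    1: "Good. Now click the counterargument.",
    2: "Correct. Finally, click the rebuttal.",
    3: "Done! You identified all argument elements.",
}
ORDER = ["claim", "counterargument", "rebuttal"]


class TutorState:
    """Tracks which argument elements the user has identified so far."""

    def __init__(self, xml_text):
        root = ET.fromstring(xml_text)
        self.coded = {e.get("id"): e.text for e in root.iter("element")}
        self.state = 0

    def click(self, element_id):
        """Check a click against the coded elements; advance state on a match."""
        if self.state < len(ORDER) and element_id == ORDER[self.state]:
            self.state += 1
        return FEEDBACK[self.state]


tutor = TutorState(EXERCISE_XML)
print(tutor.click("rebuttal"))         # wrong element at state 0: prompt repeats
print(tutor.click("claim"))            # correct: advances to state 1
print(tutor.click("counterargument"))  # correct: advances to state 2
```

Changing state simply selects a new feedback string, which mirrors the indexing scheme the abstract describes.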
The ATLAS Public Web Pages: Online Management of HEP External Communication Content
NASA Astrophysics Data System (ADS)
Goldfarb, S.; Marcelloni, C.; Eli Phoboo, A.; Shaw, K.
2015-12-01
The ATLAS Education and Outreach Group is in the process of migrating its public online content to a professionally designed set of web pages built on the Drupal [1] content management system. Development of the front-end design passed through several key stages, including audience surveys, stakeholder interviews, usage analytics, and a series of fast design iterations, called sprints. Implementation of the web site involves applying the HTML design using Drupal templates, refined development iterations, and the overall population of the site with content. We present the design and development processes and share the lessons learned along the way, including the results of the data-driven discovery studies. We also demonstrate the advantages of selecting a back-end supported by content management, with a focus on workflow. Finally, we discuss usage of the new public web pages to implement outreach strategy through clearly presented themes, consistent audience targeting and messaging, and the enforcement of a well-defined visual identity.
A Semantically Enabled Metadata Repository for Solar Irradiance Data Products
NASA Astrophysics Data System (ADS)
Wilson, A.; Cox, M.; Lindholm, D. M.; Nadiadi, I.; Traver, T.
2014-12-01
The Laboratory for Atmospheric and Space Physics, LASP, has been conducting research in atmospheric and space science for over 60 years and providing the associated data products to the public. LASP has a long history, in particular, of making space-based measurements of the solar irradiance, which serve as crucial input to several areas of scientific research, including solar-terrestrial interactions, atmospheric science, and climate. LISIRD, the LASP Interactive Solar Irradiance Data Center, serves these datasets to the public, including solar spectral irradiance (SSI) and total solar irradiance (TSI) data. The LASP extended metadata repository, LEMR, is a database of information about the datasets served by LASP, such as parameters, uncertainties, temporal and spectral ranges, current version, alerts, etc. It serves as the definitive, single source of truth for that information. The database is populated with information garnered via web forms and automated processes. Dataset owners keep the information current and verified for datasets under their purview. This information can be pulled dynamically for many purposes. Web sites such as LISIRD can include this information in web page content as it is rendered, ensuring users get current, accurate information. It can also be pulled to create metadata records in various metadata formats, such as SPASE (for heliophysics) and ISO 19115. Once these records are made available to the appropriate registries, our data will be discoverable by users coming in via those organizations. The database is implemented as an RDF triplestore, a collection of subject-predicate-object data entities identifiable with a URI. This capability, coupled with SPARQL-over-HTTP read access, enables semantic queries over the repository contents. To create the repository we leveraged VIVO, an open source semantic web application, to manage and create new ontologies and populate repository content.
A variety of ontologies were used in creating the triplestore, including ontologies that came with VIVO such as FOAF. Also, the W3C DCAT ontology was integrated and extended to describe properties of our data products that we needed to capture, such as spectral range. The presentation will describe the architecture, ontology issues, and tools used to create LEMR and plans for its evolution.
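The subject-predicate-object model behind such a triplestore can be illustrated with a toy in-memory store and wildcard pattern matching, the core idea of a SPARQL basic graph pattern. This is a sketch only: the dataset names and properties below are invented, and LEMR itself uses VIVO with SPARQL over HTTP rather than anything like this.

```python
# A toy subject-predicate-object store. Every fact is a triple; a query
# is a pattern in which None acts as a variable (wildcard).
triples = {
    ("lemr:SORCE_TSI", "dcat:title", "Total Solar Irradiance"),
    ("lemr:SORCE_TSI", "lemr:currentVersion", "v19"),
    ("lemr:SORCE_SSI", "dcat:title", "Solar Spectral Irradiance"),
    ("lemr:SORCE_SSI", "lemr:spectralRange", "115-2416 nm"),
}


def query(s=None, p=None, o=None):
    """Return triples matching a pattern, like a SPARQL basic graph pattern."""
    return sorted(
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    )


# "Which datasets declare a spectral range, and what is it?"
for subj, _, rng in query(p="lemr:spectralRange"):
    print(subj, rng)
```

The same pattern-matching idea, expressed in SPARQL, is what lets a rendering web site pull the current version or spectral range of a dataset at page-load time.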
Googling suicide: surfing for suicide information on the Internet.
Recupero, Patricia R; Harms, Samara E; Noble, Jeffrey M
2008-06-01
This study examined the types of resources a suicidal person might find through search engines on the Internet. We were especially interested in determining the accessibility of potentially harmful resources, such as prosuicide forums, as such resources have been implicated in completed suicides and are known to exist on the Web. Using 5 popular search engines (Google, Yahoo!, Ask.com, Lycos, and Dogpile) and 4 suicide-related search terms (suicide, how to commit suicide, suicide methods, and how to kill yourself), we collected quantitative and qualitative data about the search results. The searches were conducted in August and September 2006. Several coraters assigned codes and characterizations to the first 30 Web sites per search term combination (and "sponsored links" on those pages), which were then confirmed by consensus ratings. Search results were classified as being prosuicide, antisuicide, suicide-neutral, not a suicide site, or error (i.e., page would not load). Additional information was collected to further characterize the nature of the information on these Web sites. Suicide-neutral and antisuicide pages occurred most frequently (of 373 unique Web pages, 115 were coded as suicide-neutral, and 109 were antisuicide). While prosuicide resources were less frequent (41 Web pages), they were nonetheless easily accessible. Detailed how-to instructions for unusual and lethal suicide methods were likewise easily located through the searches. Mental health professionals should ask patients about their Internet use. Depressed, suicidal, or potentially suicidal patients who use the Internet may be especially at risk. Clinicians may wish to assist patients in locating helpful, supportive resources online so that patients' Internet use may be more therapeutic than harmful.
Modelling Safe Interface Interactions in Web Applications
NASA Astrophysics Data System (ADS)
Brambilla, Marco; Cabot, Jordi; Grossniklaus, Michael
Current Web applications embed sophisticated user interfaces and business logic. The original interaction paradigm of the Web based on static content pages that are browsed by hyperlinks is, therefore, not valid anymore. In this paper, we advocate a paradigm shift for browsers and Web applications, that improves the management of user interaction and browsing history. Pages are replaced by States as basic navigation nodes, and Back/Forward navigation along the browsing history is replaced by a full-fledged interactive application paradigm, supporting transactions at the interface level and featuring Undo/Redo capabilities. This new paradigm offers a safer and more precise interaction model, protecting the user from unexpected behaviours of the applications and the browser.
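The proposed shift from Back/Forward over pages to Undo/Redo over application states can be sketched with two stacks. This is a minimal illustrative model, not the paper's formal one; the state names below are invented.

```python
# Minimal sketch of state-based navigation: States replace pages as the
# navigation nodes, and Undo/Redo replaces plain Back/Forward.
class StateNavigator:
    def __init__(self, initial):
        self.current = initial
        self._undo = []   # states we can go back to
        self._redo = []   # undone states, available for redo

    def go(self, new_state):
        """An interaction moves the application to a new state."""
        self._undo.append(self.current)
        self._redo.clear()          # a fresh action invalidates the redo branch
        self.current = new_state

    def undo(self):
        if self._undo:
            self._redo.append(self.current)
            self.current = self._undo.pop()
        return self.current

    def redo(self):
        if self._redo:
            self._undo.append(self.current)
            self.current = self._redo.pop()
        return self.current


nav = StateNavigator("cart:empty")
nav.go("cart:1-item")
nav.go("checkout:address")
nav.undo()   # back to "cart:1-item" without losing application state
nav.redo()   # forward again to "checkout:address"
```

Because undo restores a full application state rather than reloading a page, the user is protected from the surprises that Back/Forward causes in stateful web applications.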
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
[Federal Register entry fragment: information collection request for the NCI Cancer Genetics Services Directory Web-Based Application Form and Update Mailer.]
What Can Pictures Tell Us About Web Pages? Improving Document Search Using Images.
Rodriguez-Vaamonde, Sergio; Torresani, Lorenzo; Fitzgibbon, Andrew W
2015-06-01
Traditional Web search engines do not use the images in the HTML pages to find relevant documents for a given query. Instead, they typically operate by computing a measure of agreement between the keywords provided by the user and only the text portion of each page. In this paper we study whether the content of the pictures appearing in a Web page can be used to enrich the semantic description of an HTML document and consequently boost the performance of a keyword-based search engine. We present a Web-scalable system that exploits a pure text-based search engine to find an initial set of candidate documents for a given query. Then, the candidate set is reranked using visual information extracted from the images contained in the pages. The resulting system retains the computational efficiency of traditional text-based search engines with only a small additional storage cost needed to encode the visual information. We test our approach on one of the TREC Million Query Track benchmarks where we show that the exploitation of visual content yields improvement in accuracies for two distinct text-based search engines, including the system with the best reported performance on this benchmark. We further validate our approach by collecting document relevance judgements on our search results using Amazon Mechanical Turk. The results of this experiment confirm the improvement in accuracy produced by our image-based reranker over a pure text-based system.
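The two-stage scheme described above can be sketched as a reranking step that mixes a text score with a visual score. The documents, scores, and mixing weight below are invented for illustration; the paper's actual visual features and combination model are more sophisticated.

```python
# Sketch of reranking: a text engine returns scored candidates, then
# visual scores extracted from the pages' images adjust the ranking.
def rerank(candidates, visual_scores, alpha=0.7):
    """Order documents by a convex mix of text and visual relevance.

    alpha weighs the text score; documents without a visual score
    contribute 0 from the visual term.
    """
    def combined(doc):
        return alpha * candidates[doc] + (1 - alpha) * visual_scores.get(doc, 0.0)
    return sorted(candidates, key=combined, reverse=True)


text_results = {"pageA.html": 0.90, "pageB.html": 0.85, "pageC.html": 0.40}
image_scores = {"pageB.html": 0.95, "pageA.html": 0.10}
print(rerank(text_results, image_scores))
# pageB overtakes pageA because its images match the query well
```

Setting alpha to 1.0 recovers the pure text-based ranking, which is why the approach retains the efficiency of a traditional engine: the visual term only reorders an already-small candidate set.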
Computer and Voice Network Management Through Low Earth Orbiting Satellites
2006-03-01
[Report citation fragment: references to AMSAT/ARISS web pages, including "Correction Chart" (29 July 2005; cited 01 DEC 05) and "Technical Specifications", available from the World Wide Web at http://www.amsat.org/amsat/ariss/news/ISS_frequencies_and_Doppler_correction.rtf]
Vona, Pamela; Wilmoth, Pete; Jaycox, Lisa H; McMillen, Janey S; Kataoka, Sheryl H; Wong, Marleen; DeRosier, Melissa E; Langley, Audra K; Kaufman, Joshua; Tang, Lingqi; Stein, Bradley D
2014-11-01
To explore the role of Web-based platforms in behavioral health, the study examined usage of a Web site for supporting training and implementation of an evidence-based intervention. Using data from an online registration survey and Google Analytics, the investigators examined user characteristics and Web site utilization. Site engagement was substantial across user groups. Visit duration differed by registrants' characteristics. Less experienced clinicians spent more time on the Web site. The training section accounted for most page views across user groups. Individuals previously trained in the Cognitive-Behavioral Intervention for Trauma in Schools intervention viewed more implementation assistance and online community pages than did other user groups. Web-based platforms have the potential to support training and implementation of evidence-based interventions for clinicians of varying levels of experience and may facilitate more rapid dissemination. Web-based platforms may be promising for trauma-related interventions, because training and implementation support should be readily available after a traumatic event.
Acquiring geographical data with web harvesting
NASA Astrophysics Data System (ADS)
Dramowicz, K.
2016-04-01
Many websites contain very attractive and up-to-date geographical information. This information can be extracted, stored, analyzed and mapped using web harvesting techniques. Poorly organized data from websites are transformed with web harvesting into a more structured format, which can be stored in a database and analyzed. Almost 25% of web traffic is related to web harvesting, mostly by search engines. This paper presents how to harvest geographic information from web documents using Beautiful Soup, a free tool and one of the most commonly used Python libraries for pulling data from HTML and XML files. It is a relatively easy task to process one static HTML table. The more challenging task is to extract and save information from tables located in multiple and poorly organized websites. Legal and ethical aspects of web harvesting are discussed as well. The paper demonstrates two case studies. The first shows how to extract various types of information about the Good Country Index from multiple web pages, load it into one attribute table and map the results. The second shows how script tools and GIS can be used to extract information from 136 websites about Nova Scotia wines. In a little more than three minutes, a database containing the 106 liquor stores selling these wines is created. Then the availability and spatial distribution of various types of wines (by grape type, by winery, and by liquor store) are mapped and analyzed.
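The abstract names Beautiful Soup; the following self-contained sketch uses only Python's standard-library html.parser to show the same basic idea, turning one static HTML table into structured rows. The table contents are invented for illustration.

```python
from html.parser import HTMLParser


class TableExtractor(HTMLParser):
    """Collect the text of each <td>/<th> cell, grouped by <tr> row."""

    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())


page = """<table>
<tr><th>Country</th><th>Rank</th></tr>
<tr><td>Ireland</td><td>1</td></tr>
<tr><td>Finland</td><td>2</td></tr>
</table>"""

parser = TableExtractor()
parser.feed(page)
print(parser.rows)  # [['Country', 'Rank'], ['Ireland', '1'], ['Finland', '2']]
```

Beautiful Soup makes the same job shorter and far more tolerant of the malformed markup that real, poorly organized sites serve, which is why the paper's multi-site case studies rely on it.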
A Multi-User Model for Effectively Communicating Research Through Electronic Media
NASA Astrophysics Data System (ADS)
Hinds, J. J.; Fairley, J. P.
2003-12-01
Electronic media have demonstrated potential for data exchange, dissemination of results to other scientists, communication with community interest groups, and education of the general public regarding scientific advances. Few researchers, however, receive training in the skills required to capture the attention of the broad spectrum of Internet users. Because different people assimilate information in different ways, effective communication is best accomplished using an appropriate mix of photographs, graphics, tables, and text. In addition, effective web page design requires a clear, consistent organizational structure, easily navigated layout, and attention to details such as page printability, downloading time, and minimal page scrolling. One of the strengths of electronic media is that the user can choose an appropriate level of involvement for his or her interest. In designing a web page for the multidisciplinary NSF/EPSCoR "Biocomplexity in Extreme Environments" project, we divided potential users into three categories based on our perception of the level of detail they required: 1) project participants, 2) non-participants with technical backgrounds, and 3) the general public. By understanding the needs and expectations of potential viewers, it was possible to present each group with an appropriate balance of visual and textual elements. For example, project participants are often most interested in raw data, which can be effectively presented in tabular format. Non-participants with technical backgrounds are more interested in analyzed data, while a project overview, presented through photographs and graphics with minimal text, will be most effective for communicating with the general public. The completed web page illustrates one solution for effectively communicating with a diverse audience, and provides examples for meeting many of the challenges of web page design.
Improving Web Accessibility in a University Setting
ERIC Educational Resources Information Center
Olive, Geoffrey C.
2010-01-01
Improving Web accessibility for disabled users visiting a university's Web site is explored following the World Wide Web Consortium (W3C) guidelines and Section 508 of the Rehabilitation Act rules for Web page designers to ensure accessibility. The literature supports the view that accessibility is sorely lacking, not only in the USA, but also…
Results from a Web Impact Factor Crawler.
ERIC Educational Resources Information Center
Thelwall, Mike
2001-01-01
Discusses Web impact factors (WIFs), Web versions of the impact factors for journals, and how they can be calculated by using search engines. Highlights include HTML and document indexing; Web page links; a Web crawler designed for calculating WIFs; and WIFs for United Kingdom universities that measured research profiles or capability. (Author/LRW)
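A Web impact factor of the kind discussed here is commonly computed as the number of pages linking to a site divided by the number of pages within the site (following Ingwersen's definition); the counts below are invented for illustration.

```python
# Sketch of a Web impact factor (WIF) calculation from crawl counts.
def web_impact_factor(inlinking_pages, site_pages):
    """WIF = pages linking to the site / pages in the site."""
    if site_pages == 0:
        raise ValueError("site has no pages")
    return inlinking_pages / site_pages


# A hypothetical university site: 12,400 external pages link to it,
# and a crawler found 3,100 pages on the site itself.
print(web_impact_factor(12400, 3100))  # 4.0
```

A dedicated crawler, rather than a commercial search engine, is used in the article precisely because both counts must come from a consistent, reproducible index.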
Electronic Ramp to Success: Designing Campus Web Pages for Users with Disabilities.
ERIC Educational Resources Information Center
Coombs, Norman
2002-01-01
Discusses key issues in addressing the challenge of Web accessibility for people with disabilities, including tools for Web authoring, repairing, and accessibility validation, and relevant legal issues. Presents standards for Web accessibility, including the Section 508 Standards from the Federal Access Board, and the World Wide Web Consortium's…
Marenco, Luis; Ascoli, Giorgio A; Martone, Maryann E; Shepherd, Gordon M; Miller, Perry L
2008-09-01
This paper describes the NIF LinkOut Broker (NLB) that has been built as part of the Neuroscience Information Framework (NIF) project. The NLB is designed to coordinate the assembly of links to neuroscience information items (e.g., experimental data, knowledge bases, and software tools) that are (1) accessible via the Web, and (2) related to entries in the National Center for Biotechnology Information's (NCBI's) Entrez system. The NLB collects these links from each resource and passes them to the NCBI which incorporates them into its Entrez LinkOut service. In this way, an Entrez user looking at a specific Entrez entry can LinkOut directly to related neuroscience information. The information stored in the NLB can also be utilized in other ways. A second approach, which is operational on a pilot basis, is for the NLB Web server to create dynamically its own Web page of LinkOut links for each NCBI identifier in the NLB database. This approach can allow other resources (in addition to the NCBI Entrez) to LinkOut to related neuroscience information. The paper describes the current NLB system and discusses certain design issues that arose during its implementation.
Establishing best practices to improve usability of web interfaces providing atmospheric data
NASA Astrophysics Data System (ADS)
Oakley, N.; Daudert, B.
2014-12-01
Accessing scientific data through an online portal can be a frustrating task. The concept of making web interfaces easy to use, known as "usability", has been thoroughly researched in the field of e-commerce but has not been explicitly addressed in the atmospheric sciences. As more observation stations are installed, satellite missions flown, models run, and field campaigns performed, large amounts of data are produced. Portals on the Internet have become the favored mechanisms to share this information and are ever increasing in number. Portals are often created without being tested for usability with the target audience, though the expenses of testing are low and the returns high. To remain competitive and relevant in the provision of atmospheric data, it is imperative that developers understand the design elements of a successful portal to make their product stand out among others. This presentation informs the audience of the benefits and basic principles of usability for web pages presenting atmospheric data. We will also share some of the best practices and recommendations we have formulated from the results of usability testing performed on two data provision web sites hosted by the Western Regional Climate Center.
Going beyond Google for Faster and Smarter Web Searching
ERIC Educational Resources Information Center
Vine, Rita
2004-01-01
With more than 4 billion web pages in its database, Google is suitable for many different kinds of searches. When you know what you are looking for, Google can be a pretty good first choice, as long as you want to search a word pattern that can be expected to appear on any results pages. The problem starts when you don't know exactly what you're…
NASA Astrophysics Data System (ADS)
Zhang, Xiaowen; Chen, Bingfeng
2017-08-01
Based on a frequent sub-tree mining algorithm, this paper proposes a scheme for constructing a web page comment information extraction system, referred to as the FSM system. The paper briefly introduces the overall system architecture and its modules, then describes the core of the system in detail, and finally presents a system prototype.
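The core idea of frequent-subtree mining can be illustrated by counting repeated subtrees across a forest of simplified DOM trees. This sketch counts only full subtrees rooted at each node, a simplification of what real frequent-subtree miners (TreeMiner-style algorithms) enumerate; the tags and support threshold are invented.

```python
from collections import Counter


def encode(tree):
    """Encode a (tag, children) tree as a hashable nested tuple."""
    tag, children = tree
    return (tag, tuple(encode(c) for c in children))


def all_subtrees(tree):
    """Yield the encoded full subtree rooted at every node."""
    yield encode(tree)
    for child in tree[1]:
        yield from all_subtrees(child)


# Two simplified DOM trees in which a comment block happens to be marked
# up as <div><span/><p/></div> (tags invented for illustration).
forest = [
    ("body", [("div", [("span", []), ("p", [])]),
              ("div", [("span", []), ("p", [])])]),
    ("body", [("div", [("span", []), ("p", [])]),
              ("ul", [("li", [])])]),
]

counts = Counter(s for tree in forest for s in all_subtrees(tree))
min_support = 3
frequent = {s for s, n in counts.items() if n >= min_support}
print(frequent)  # the repeated comment-block subtree and its leaf tags
```

A comment-extraction system of the kind the paper proposes would treat such frequently recurring subtrees as candidate templates for where comment text lives in a page.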
Bates, Benjamin R; Romina, Sharon; Ahmed, Rukhsana; Hopson, Danielle
2006-03-01
Recent use of the Internet as a source of health information has raised concerns about consumers' ability to tell 'good' information from 'bad' information. Although consumers report that they use source credibility to judge information quality, several observational studies suggest that consumers make little use of source credibility. This study examines consumer evaluations of web pages attributed to a credible source as compared to generic web pages on measures of message quality. In spring 2005, a community-wide convenience survey was distributed in a regional hub city in Ohio, USA. 519 participants were randomly assigned one of six messages discussing lung cancer prevention: three messages each attributed to a highly credible national organization and three identical messages each attributed to a generic web page. Independent-sample t-tests were conducted to compare each attributed message to its counterpart attributed to a generic web page on measures of trustworthiness, truthfulness, readability, and completeness. The results demonstrated that differences in attribution to a source did not have a significant effect on consumers' evaluations of the quality of the information. Conclusions. The authors offer suggestions for national organizations to promote credibility to consumers as a heuristic for choosing better online health information through the use of media co-channels to emphasize credibility.
The quality of patient-orientated Internet information on oral lichen planus: a pilot study.
López-Jornet, Pía; Camacho-Alonso, Fabio
2010-10-01
This study examines the accessibility and quality of Web pages related to oral lichen planus. Sites were identified using two search engines (Google and Yahoo!) and the search terms 'oral lichen planus' and 'oral lichenoid lesion'. The first 100 sites in each search were visited and classified. The web sites were evaluated for content quality using the validated DISCERN rating instrument, the JAMA benchmarks and the 'Health on the Net' (HON) seal. A total of 109,000 sites were recorded in Google using the search terms and 520,000 in Yahoo! A total of 19 Web pages considered relevant were examined on Google and 20 on Yahoo! As regards the JAMA benchmarks, only two pages satisfied the four criteria in Google (10%), and only three (15%) in Yahoo! As regards DISCERN, the overall quality of web site information was poor, with no site reaching the maximum score. In Google, 78.94% of sites had important deficiencies, compared with 50% in Yahoo!, the difference between the two search engines being statistically significant (P = 0.031). Only five pages (17.2%) on Google and eight (40%) on Yahoo! showed the HON code. Based on our review, doctors must assume primary responsibility for educating and counselling their patients. © 2010 Blackwell Publishing Ltd.
Self-Presentation on the Web: Agencies Serving Abused and Assaulted Women
Shi, Rui; Zhang, Jingwen; Xue, Jia
2014-01-01
Objectives. We examined the content and usability of the Web sites of agencies serving women victims of violence. Methods. We entered the names of a systematic 10% sample of 3774 agencies listed in 2 national directories into a search engine. We captured screenshots (in April 2012) of the 261 resulting home pages and analyzed them, along with the readability of 193 home and first-level pages. Results. Victims (94%) and donors (68%) were the primary intended audiences. About one half used social media and one third provided cues to action. Almost all (96.4%) of the Web pages were rated “fairly difficult” to “very confusing” to read, and 81.4% required more than a ninth-grade education to understand. Conclusions. The service and marketing functions were met fairly well by the agency home pages, but usability (particularly readability and the offer of a mobile version) and efforts to increase user safety could be improved. Internet technologies are an essential platform for public health. They are particularly useful for reaching people with stigmatized health conditions because of the anonymity allowed. The one third of agencies that lack a Web site will not reach the substantial portion of the population that uses the Internet to find health information and other resources. PMID:24524489
Modeling the customer in electronic commerce.
Helander, M G; Khalid, H M
2000-12-01
This paper reviews interface design of web pages for e-commerce. Different tasks in e-commerce are contrasted. A systems model is used to illustrate the information flow between three subsystems in e-commerce: store environment, customer, and web technology. A customer makes several decisions: to enter the store, to navigate, to purchase, to pay, and to keep the merchandise. This artificial environment must be designed so that it can support customer decision-making. To retain customers it must be pleasing and fun, and create a task with natural flow. Customers have different needs, competence and motivation, which affect decision-making. It may therefore be important to customize the design of the e-store environment. Future ergonomics research will have to investigate perceptual aspects, such as presentation of merchandise, and cognitive issues, such as product search and navigation, as well as decision-making under various economic parameters. Five theories on e-commerce research are presented.
A novel visualization model for web search results.
Nguyen, Tien N; Zhang, Jin
2006-01-01
This paper presents an interactive visualization system, named WebSearchViz, for visualizing Web search results and facilitating users' navigation and exploration. The metaphor in our model is the solar system, with its planets and asteroids revolving around the sun. Location, color, movement, and spatial distance of objects in the visual space are used to represent the semantic relationships between a query and relevant Web pages. In particular, the movement of objects and their speeds add a new dimension to the visual space, illustrating the degree of relevance between a query and Web search results in the context of users' subjects of interest. By interacting with the visual space, users are able to observe the semantic relevance between a query and a resulting Web page with respect to their subjects of interest, context information, or concern. Users' subjects of interest can be dynamically changed, redefined, added to, or deleted from the visual space.
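The solar-system metaphor can be sketched as a mapping from a page's relevance score to orbital distance and speed; the specific formulas below are illustrative assumptions, not the authors' model:

```python
import math

def orbit_params(relevance, max_radius=100.0, base_speed=1.0):
    """Map a relevance score in (0, 1] to solar-system visual cues:
    more relevant pages orbit closer to the 'sun' (the query) and
    move faster, per the metaphor described in the abstract."""
    if not 0.0 < relevance <= 1.0:
        raise ValueError("relevance must be in (0, 1]")
    radius = max_radius * (1.0 - relevance)   # distance from the query
    speed = base_speed * relevance            # angular speed (radians/tick)
    return radius, speed

def position(relevance, tick):
    """Cartesian position of a result at a given animation tick."""
    radius, speed = orbit_params(relevance)
    angle = speed * tick
    return radius * math.cos(angle), radius * math.sin(angle)

# A highly relevant page sits near the centre and circles quickly;
# a weakly relevant one drifts slowly at the periphery.
r_hi, s_hi = orbit_params(0.9)
r_lo, s_lo = orbit_params(0.2)
```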
Standards opportunities around data-bearing Web pages.
Karger, David
2013-03-28
The evolving Web has seen ever-growing use of structured data, thanks to the way it enhances information authoring, querying, visualization and sharing. To date, however, most structured data authoring and management tools have been oriented towards programmers and Web developers. End users have been left behind, unable to leverage structured data for information management and communication as well as professionals. In this paper, I will argue that many of the benefits of structured data management can be provided to end users as well. I will describe an approach and tools that allow end users to define their own schemas (without knowing what a schema is), manage data and author (not program) interactive Web visualizations of that data using the Web tools with which they are already familiar, such as plain Web pages, blogs, wikis and WYSIWYG document editors. I will describe our experience deploying these tools and some lessons relevant to their future evolution.
2009-06-01
search engines are not up to this task, as they have been optimized to catalog information quickly and efficiently for user ease of access while promoting retail commerce at the same time. This thesis presents a performance analysis of a new search engine algorithm designed to help find IED education networks using the Nutch open-source search engine architecture. It reveals which web pages are more important via references from other web pages regardless of domain. In addition, this thesis discusses potential evaluation and monitoring techniques to be used in conjunction
Application of Project Portfolio Management
NASA Astrophysics Data System (ADS)
Pankowska, Malgorzata
The main goal of this chapter is to present the application of the project portfolio management approach to support the development of e-Municipality and public administration information systems. The models of how people publish and utilize information on the web have been continually transformed. Instead of simply viewing static web pages, users publish their own content through blogs and photo- and video-sharing sites. The ICT (Information Communication Technology) projects for municipalities analysed in this chapter cover a mixture of static web pages, e-Government information systems, and wikis. For the management of such mixtures of ICT projects, the project portfolio management approach is proposed.
ERIC Educational Resources Information Center
Snider, Jean; Martin, Florence
2012-01-01
Web usability focuses on design elements and processes that make web pages easy to use. A website for college students was evaluated for underutilization. One-on-one testing, focus groups, web analytics, peer university review and marketing focus group and demographic data were utilized to conduct usability evaluation. The results indicated that…
A Structural and Content-Based Analysis for Web Filtering.
ERIC Educational Resources Information Center
Lee, P. Y.; Hui, S. C.; Fong, A. C. M.
2003-01-01
Presents an analysis of the distinguishing features of pornographic Web pages so that effective filtering techniques can be developed. Surveys the existing techniques for Web content filtering and describes the implementation of a Web content filtering system that uses an artificial neural network. (Author/LRW)
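As a rough illustration of the classification step such a filter performs, here is a minimal single-neuron (perceptron) sketch over hand-picked indicator terms. The feature set, vocabulary, and training rule are illustrative assumptions, not the artificial neural network described in the paper:

```python
# Hypothetical indicator terms; a real filter would use many more
# features, including the structural cues the paper analyzes.
FEATURES = ["adult", "explicit", "xxx"]

def extract(text):
    """Binary feature vector: does each indicator term occur?"""
    words = text.lower().split()
    return [1.0 if term in words else 0.0 for term in FEATURES]

def train(samples, epochs=20, lr=0.5):
    """Classic perceptron rule on (text, label) pairs; label 1 = block."""
    w = [0.0] * len(FEATURES)
    b = 0.0
    for _ in range(epochs):
        for text, label in samples:
            x = extract(text)
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = label - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(text, w, b):
    x = extract(text)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

training = [
    ("free xxx adult content", 1),
    ("explicit adult site", 1),
    ("weather forecast for tomorrow", 0),
    ("school science project pages", 0),
]
w, b = train(training)
```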
Tweeting links to Cochrane Schizophrenia Group reviews: a randomised controlled trial.
Adams, C E; Jayaram, M; Bodart, A Y M; Sampson, S; Zhao, S; Montgomery, A A
2016-03-08
To assess the effects of using health social media on web activity. Individually randomised controlled parallel group superiority trial. Twitter and Weibo. 170 Cochrane Schizophrenia Group full reviews with an abstract and plain language summary web page. Three slightly different messages of 140 characters or fewer, in random order, each containing a short URL to the freely accessible summary page, sent at specific times on a single day. This was compared with no messaging. The primary outcome was web page visits at 1 week. Secondary outcomes were other metrics of web activity at 1 week. 85 reviews were randomised to each of the intervention and control arms. Google Analytics allowed 100% follow-up within 1 week of completion. Intervention and control reviews received a total of 1162 and 449 visits, respectively (IRR 2.7, 95% CI 2.2 to 3.3). Fewer intervention reviews had single page only visits (16% vs 31%, OR 0.41, 0.19 to 0.88) and users spent more time viewing intervention reviews (geometric mean 76 vs 31 s, ratio 2.5, 1.3 to 4.6). Other secondary metrics of web activity all showed strong evidence in favour of the intervention. Tweeting in this limited area of healthcare increases 'product placement' of evidence with the potential for that to influence care. ISRCTN84658943. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Quan, Sherman D; Wu, Robert C; Rossos, Peter G; Arany, Teri; Groe, Silvi; Morra, Dante; Wong, Brian M; Cavalcanti, Rodrigo; Coke, William; Lau, Francis Y
2013-03-01
Institutions have tried to replace the use of numeric pagers for clinical communication by implementing health information technology (HIT) solutions. However, failing to account for the sociotechnical aspects of HIT or the interplay of technology with existing clinical workflow, culture, and social interactions may create other unintended consequences. To evaluate a Web-based messaging system that allows asynchronous communication between health providers and identify the unintended consequences associated with implementing such technology. Intervention: a Web-based messaging system implemented at the University Health Network in May 2010 to replace numeric paging practices. The system facilitated clinical communication on the medical wards for coordinating patient care. Study design: pre-post mixed methods utilizing both quantitative and qualitative measures. Five residents, 8 nurses, 2 pharmacists, and 2 social workers were interviewed. Pre-post interruption analysis: 15 residents from 5 clinical teams in both periods. The study compared the type of messages sent to physicians before and after implementation of the Web-based messaging system; a constant comparative analysis of semistructured interviews was used to generate key themes related to unintended consequences. Interruptions increased 233%, from 3 pages received per resident per day pre-implementation to 10 messages received per resident per day post-implementation. Key themes relating to unintended consequences that emerged from the interviews included increase in interruptions, accountability, and tactics to improve personal productivity. Meaningful improvements in clinical communication can occur but require more than just replacing pagers. Introducing HIT without addressing the sociotechnical aspects of HIT that underlie clinical communication can lead to unintended consequences. Copyright © 2013 Society of Hospital Medicine.
Index to Print Version of EPA Stylebook
This EPA Communication Product Standards index provides page numbers for topics such as Ampersands, Bitmapped Graphics, Exhibits and Displays, Podium Signage, Proofing, Sentence Length, Title Page, and Web Forms.
Semantic similarity measure in biomedical domain leverage web search engine.
Chen, Chi-Huang; Hsieh, Sheau-Ling; Weng, Yung-Ching; Chang, Wen-Yung; Lai, Feipei
2010-01-01
Semantic similarity measures play an essential role in Information Retrieval and Natural Language Processing. In this paper we propose a page-count-based semantic similarity measure and apply it in biomedical domains. Previous research in semantic-web-related applications has deployed various semantic similarity measures. Despite the usefulness of these measurements in those applications, measuring the semantic similarity between two terms remains a challenging task. The proposed method exploits page counts returned by a Web search engine. We define various similarity scores for two given terms P and Q, using the page counts for querying P, Q, and P AND Q. Moreover, we propose a novel approach to compute semantic similarity using lexico-syntactic patterns with page counts. These different similarity scores are integrated using support vector machines, to leverage the robustness of semantic similarity measures. Experimental results on two datasets achieve correlation coefficients of 0.798 on the dataset provided by A. Hliaoutakis, 0.705 on the dataset provided by T. Pedersen with physician scores, and 0.496 on the dataset provided by T. Pedersen et al. with expert scores.
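Page-count scores of the kind described above resemble standard co-occurrence measures. Here is a minimal sketch of two such measures, a WebJaccard overlap and a page-count pointwise mutual information; the exact formulas, the counts, and the index size N are assumptions, since the abstract does not give them:

```python
import math

def web_jaccard(count_p, count_q, count_pq):
    """Jaccard overlap estimated from search-engine page counts:
    hits(P AND Q) / (hits(P) + hits(Q) - hits(P AND Q))."""
    denom = count_p + count_q - count_pq
    return count_pq / denom if denom > 0 else 0.0

def web_pmi(count_p, count_q, count_pq, n_indexed):
    """Pointwise mutual information from page counts, where
    n_indexed approximates the number of pages in the index."""
    if count_pq == 0:
        return 0.0
    return math.log2((count_pq / n_indexed) /
                     ((count_p / n_indexed) * (count_q / n_indexed)))

# Hypothetical counts for two related biomedical terms.
sim_j = web_jaccard(count_p=120_000, count_q=80_000, count_pq=40_000)
```

Scores like these can then be combined (in the paper, via support vector machines) into a single similarity estimate.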
The Library as Information Provider: The Home Page.
ERIC Educational Resources Information Center
Clyde, Laurel A.
1996-01-01
Discusses ways in which libraries are using the World Wide Web to provide information via a home page, based on information from a survey in Iceland as well as a larger study that conducted content analyses of home pages of public and school libraries in 13 countries. (Author/LRW)
Automatic page layout using genetic algorithms for electronic albuming
NASA Astrophysics Data System (ADS)
Geigel, Joe; Loui, Alexander C. P.
2000-12-01
In this paper, we describe a flexible system for automatic page layout that makes use of genetic algorithms for albuming applications. The system is divided into two modules: a page creator module, which is responsible for distributing images among various album pages, and an image placement module, which positions images on individual pages. Final page layouts are specified in a textual form using XML for printing or viewing over the Internet. The system makes use of genetic algorithms, a class of search and optimization algorithms based on the concepts of biological evolution, to generate solutions whose fitness is based on graphic design preferences supplied by the user. The genetic page layout algorithm has been incorporated into a web-based prototype system for interactive page layout over the Internet. The prototype system is built on a client-server architecture and is implemented in Java. The system described in this paper has demonstrated the feasibility of using genetic algorithms for automated page layout in albuming and web-based imaging applications. We believe that the system adequately proves the validity of the concept, providing creative layouts in a reasonable number of iterations. By optimizing the layout parameters of the fitness function, we hope to further improve the quality of the final layout in terms of user preference and computation speed.
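The page creator step can be sketched as a small genetic algorithm that distributes images across album pages. The fitness function below (penalizing uneven page loads) is an illustrative stand-in for the user-supplied graphic design preferences the paper describes:

```python
# Minimal GA sketch: assign N_IMAGES images to N_PAGES album pages.
# An individual is a list mapping each image index to a page index.
import random

N_IMAGES, N_PAGES = 12, 3

def fitness(assignment):
    """Higher is better: negative variance of images per page."""
    counts = [assignment.count(p) for p in range(N_PAGES)]
    mean = sum(counts) / N_PAGES
    return -sum((c - mean) ** 2 for c in counts)

def evolve(pop_size=30, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(N_PAGES) for _ in range(N_IMAGES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N_IMAGES)       # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                 # occasional mutation
                child[rng.randrange(N_IMAGES)] = rng.randrange(N_PAGES)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
counts = [best.count(p) for p in range(N_PAGES)]
```

A production fitness function would score aesthetic criteria (balance, overlap, margins) rather than raw counts, but the evolutionary loop is the same.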
PUBLISHER'S ANNOUNCEMENT: Important changes for 2008
NASA Astrophysics Data System (ADS)
2008-02-01
As a result of reviewing several aspects of our content, both in print and online, we have made some changes for 2008. These changes are described below. Article numbering Inverse Problems has moved from sequential page numbering to an article numbering system, offering important advantages and flexibility by speeding up the publication process. Articles in different issues or sections can be published online as soon as they are ready, without having to wait for a whole issue or section to be allocated page numbers. The bibliographic citation will change slightly. Articles should be referenced using the six-digit article number in place of a page number, and this number must include any leading zeros. For instance: Surname X and Surname Y 2008 Inverse Problems 24 015001 Articles will continue to be published on the web in advance of the print edition. A new look and feel We have taken the opportunity to refresh the design of Inverse Problems' cover in order to modernise the typography and create a consistent look and feel across IOP Publishing's range of publications. We hope you like the new cover. If you have any questions or comments about any of these changes, please contact us at ip@iop.org Kate Watt Publisher, Inverse Problems
Web page quality: can we measure it and what do we find? A report of exploratory findings.
Abbott, V P
2000-06-01
The aim of this study was to report exploratory findings from an attempt to quantify the quality of a sample of World Wide Web (WWW) pages relating to MMR vaccine that a typical user might locate. Forty pages obtained from a search of the WWW using two search engines and the search expression 'mmr vaccine' were analysed using a standard proforma. The proforma looked at the information the pages contained in terms of three categories: content, authorship and aesthetics. The information from each category was then quantified into a summary statistic, and receiver operating characteristic (ROC) curves were generated using a 'gold standard' of quality derived from the published literature. Optimal cut-off points for each of the three sections were calculated that best discriminated 'good' from 'bad' pages. Pages were also assessed as to whether they were pro- or anti-vaccination. For this sample, the combined contents and authorship score, with a cut-off of five, was a good discriminator, having 88 per cent sensitivity and 92 per cent specificity. Aesthetics was not a good discriminator. In the sample, 32.5 per cent of pages were pro-vaccination; 42.5 per cent were anti-vaccination and 25 per cent were neutral. The relative risk of being of poor quality if anti-vaccination was 3.3 (95 per cent confidence interval 1.8, 6.1). The sample of Web pages did contain some quality information on MMR vaccine. It also contained a great deal of misleading, inaccurate data. The proforma, combined with a knowledge of the literature, may help to distinguish between the two.
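The cut-off selection described above can be illustrated by scanning candidate cut-offs and picking the one that best separates 'good' from 'bad' pages, here using Youden's J (sensitivity + specificity - 1). The scores below are toy data, not the study's:

```python
def roc_best_cutoff(scores_good, scores_bad):
    """For each candidate cutoff, compute sensitivity (good pages
    scoring at or above it) and specificity (bad pages scoring
    below it); return the cutoff maximizing Youden's J."""
    best = None
    for cutoff in sorted(set(scores_good + scores_bad)):
        sens = sum(s >= cutoff for s in scores_good) / len(scores_good)
        spec = sum(s < cutoff for s in scores_bad) / len(scores_bad)
        j = sens + spec - 1
        if best is None or j > best[1]:
            best = (cutoff, j, sens, spec)
    return best

# Hypothetical combined content-and-authorship scores:
good = [6, 7, 5, 8, 6, 9]   # pages judged 'good' by the gold standard
bad = [2, 3, 4, 5, 1, 3]    # pages judged 'bad'
cutoff, j, sens, spec = roc_best_cutoff(good, bad)
```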
Content and Workflow Management for Library Websites: Case Studies
ERIC Educational Resources Information Center
Yu, Holly, Ed.
2005-01-01
Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…
Ultrabroadband photonic internet: safety aspects
NASA Astrophysics Data System (ADS)
Kalicki, Arkadiusz; Romaniuk, Ryszard
2008-11-01
Web applications have become the most popular medium on the Internet. Their popularity and the ease of use of web application frameworks, combined with careless development, result in a high number of vulnerabilities and attacks. Several types of attacks are possible because of improper input validation. SQL injection is the ability to execute arbitrary SQL queries in a database through an existing application. Cross-site scripting (XSS) is a vulnerability that allows malicious web users to inject code into the web pages viewed by other users. Cross-site request forgery (CSRF) is an attack that tricks the victim into loading a page that contains a malicious request. Web spam in blogs is a further problem. There are several techniques to mitigate these attacks. Most important are strong web application design, correct input validation, defined data types for each field, and parameterized statements in SQL queries. Server hardening with a firewall, modern security policy systems, and a safe web framework interpreter configuration are essential. It is advised to keep a proper security level on the client side, keep software updated, and install personal web firewalls or IDS/IPS systems. Good habits include logging out from services just after finishing work and using a separate web browser for the most important sites, such as e-banking.
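Of the mitigations listed, parameterized statements are the easiest to demonstrate. A minimal sketch with the standard-library sqlite3 module (the table and data are made up) contrasts string concatenation, which lets crafted input rewrite the query, with parameter binding, which treats the input purely as data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

malicious = "nobody' OR '1'='1"

# Unsafe: concatenation lets the attacker's OR clause become part
# of the SQL, so the query matches every row in the table.
unsafe_rows = conn.execute(
    "SELECT * FROM users WHERE name = '" + malicious + "'"
).fetchall()

# Safe: the driver binds the value as data, never as SQL, so the
# crafted input matches no user at all.
safe_rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (malicious,)
).fetchall()
```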
Fordis, Michael; Alexander, J Douglas; McKellar, Julie
2007-08-01
In the wake of Hurricane Katrina's landfall on August 29, 2005, and the subsequent levee failures, operations of Tulane University School of Medicine became unsustainable. As New Orleans collapsed, faculty, students, residents, and staff were scattered nationwide. In response, four Texas medical schools created an alliance to assist Tulane in temporarily relocating operations to south Texas. Resuming operations in a three- to four-week time span required developing and implementing a coordinated communication plan in the face of widespread communication infrastructure disruptions. A keystone of the strategy involved rapidly creating a "recovery Web site" to provide essential information on immediate recovery plans, mechanisms for reestablishing communications with displaced persons, housing relocation options (over 200 students, faculty, and staff were relocated using Web site resources), classes and residency training, and other issues (e.g., financial services, counseling support) vitally important to affected individuals. The database-driven Web site was launched in four days on September 11, 2005, by modifying an existing system and completing new programming. Additional functions were added during the next week, and the site operated continuously until March 2006, providing about 890,000 pages of information in over 100,000 visitor sessions. The site proved essential in disseminating announcements, reestablishing communications among the Tulane family, and supporting relocation and recovery. This experience shows the importance of information technology in collaborative efforts of academic health centers in early disaster response and recovery, reinforcing recommendations published recently by the Association of Academic Health Centers and the National Academy of Sciences.
16 CFR 1130.8 - Requirements for Web site registration or alternative e-mail registration.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 16 Commercial Practices 2 (2010-01-01) Requirements for Web site registration or... PRODUCTS (Eff. June 28, 2010) § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the...
Online Literacy Is a Lesser Kind
ERIC Educational Resources Information Center
Bauerlein, Mark
2008-01-01
Web skimming may be a kind of literacy but it's not the kind that matters most. In this article, the author contends that web skimming indicates a decline of literacy. The author discusses research conducted by Jakob Nielsen, a Web researcher, on how users skim web pages. He shows how the web is damaging the right way to read.
EPA's Web Taxonomy is a faceted hierarchical vocabulary used to tag web pages with terms from a controlled vocabulary. Tagging enables search and discovery of EPA's Web-based information assets. EPA's Web Taxonomy is provided in Simple Knowledge Organization System (SKOS) format. SKOS is a standard for sharing and linking knowledge organization systems that promises to make Federal terminology resources more interoperable.
16 CFR 1130.8 - Requirements for Web site registration or alternative e-mail registration.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 16 Commercial Practices 2 (2012-01-01) Requirements for Web site registration or... PRODUCTS § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the purpose of registration...
16 CFR 1130.8 - Requirements for Web site registration or alternative e-mail registration.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 16 Commercial Practices 2 (2011-01-01) Requirements for Web site registration or... PRODUCTS § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the purpose of registration...
16 CFR 1130.7 - Requirements for Web site registration or alternative e-mail registration.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 16 Commercial Practices 2 (2014-01-01) Requirements for Web site registration or... PRODUCTS § 1130.7 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the purpose of registration...
16 CFR § 1130.8 - Requirements for Web site registration or alternative e-mail registration.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 16 Commercial Practices 2 (2013-01-01) Requirements for Web site registration or... OR TODDLER PRODUCTS § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the...
A Web Server for MACCS Magnetometer Data
NASA Technical Reports Server (NTRS)
Engebretson, Mark J.
1998-01-01
NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.
Effects of picture amount on preference, balance, and dynamic feel of Web pages.
Chiang, Shu-Ying; Chen, Chien-Hsiung
2012-04-01
This study investigates the effects of picture amount on subjective evaluation. The experiment herein adopted two variables to define picture amount: column ratio and picture size. Six column ratios were employed: 7:93, 15:85, 24:76, 33:67, 41:59, and 50:50. Five picture sizes were examined: 140 x 81, 220 x 127, 300 x 173, 380 x 219, and 460 x 266 pixels. The experiment implemented a within-subject design; 104 participants were asked to evaluate 30 web page layouts. Repeated measures analyses revealed that column ratio and picture size have significant effects on preference, balance, and dynamic feel. The results indicated the most appropriate picture amounts for display: column ratios of 15:85 and 24:76, and picture sizes of 220 x 127, 300 x 173, and 380 x 219. The research findings can serve as the basis for the application of design guidelines for future web page interface design.
Viewing ISS Data in Real Time via the Internet
NASA Technical Reports Server (NTRS)
Myers, Gerry; Chamberlain, Jim
2004-01-01
EZStream is a computer program that enables authorized users at diverse terrestrial locations to view, in real time, data generated by scientific payloads aboard the International Space Station (ISS). The only computation/communication resource needed for use of EZStream is a computer equipped with standard Web-browser software and a connection to the Internet. EZStream runs in conjunction with the TReK software, described in a prior NASA Tech Briefs article, that coordinates multiple streams of data for the ground communication system of the ISS. EZStream includes server components that interact with TReK within the ISS ground communication system and client components that reside in the users' remote computers. Once an authorized client has logged in, a server component of EZStream pulls the requested data from a TReK application-program interface and sends the data to the client. Future EZStream enhancements will include (1) extensions that enable the server to receive and process arbitrary data streams on its own and (2) a Web-based graphical-user-interface-building subprogram that enables a client who lacks programming expertise to create customized display Web pages.
Adherence to a web-based intervention program for traumatized persons in mainland China.
Wang, Zhiyun
2014-01-01
This paper investigated adherence to a self-help web-based intervention for PTSD (Chinese My Trauma Recovery, CMTR) in mainland China and evaluated the association between adherence measures and potential predictors, for example, traumatic symptoms and self-efficacy. Data were reported from 56 urban and 90 rural trauma survivors who used at least one of the seven recovery modules of CMTR. The results showed that 80% of urban users visited CMTR on four or fewer days and 87% of rural users visited CMTR for 5 or 6 days. On average, urban users visited 2.54 (SD=1.99) modules on the first visiting day and fewer from the second day on; rural users visited 1.10 (SD=0.54) modules on the first visiting day, and this remained stable in the following days. In both samples, depression scores at pre-test were significantly or trend significantly associated with the number of visited web pages in the relaxation and professional help modules (r=0.20-0.26, all p<0.14); traumatic symptom scores at pre-test significantly or trend significantly correlated with the number of visited web pages in the relaxation, professional help, and mastery tools modules (r=0.20-0.26, all p<0.10). Moreover, urban users' coping self-efficacy scores at pre-test were significantly or trend significantly related to the number of visited web pages in the relaxation, professional help, social support, and mastery tools modules (r=0.20-0.33, all p<0.16). These findings suggest that individuals tend to focus on one or two recovery modules when they visit CMTR, and that the number of web pages visited during the intervention period relates to users' traumatic and depressive symptoms and self-efficacy before intervention.
Who Do You Think You Are? Personal Home Pages and Self-Presentation on the World Wide Web.
ERIC Educational Resources Information Center
Dominick, Joseph R.
1999-01-01
Analyzes 319 personal home pages. Finds the typical page had a brief biography, a counter or guest book, and links to other pages but did not contain much personal information. Finds that strategies of self-presentation were employed with the same frequency as they were in interpersonal settings, and gender differences in self-presentation were…
ERIC Educational Resources Information Center
Larson, Ray R.
1996-01-01
Examines the bibliometrics of the World Wide Web based on analysis of Web pages collected by the Inktomi "Web Crawler" and on the use of the DEC AltaVista search engine for cocitation analysis of a set of Earth Science related Web sites. Looks at the statistical characteristics of Web documents and their hypertext links, and the…
How to Create a Navigational Wireframe in Word, With Site Map Example
Use Microsoft Word's graphic tools to create a wireframe: an organization chart showing the top three levels of HTML content (Home page, secondary pages, and tertiary pages). This is an important step in planning the structure of a website.
Best Practices for Searchable Collection Pages
Searchable Collection pages are stand-alone documents that do not have any web area navigation. They should not recreate existing content on other sites and should be tagged with quality metadata and taxonomy terms.
Avoid the Void: Quick and Easy Site Submission Strategies.
ERIC Educational Resources Information Center
Sullivan, Danny
2000-01-01
Explains how to submit Web sites and promote them to make them more findable by search engines. Discusses submitting to Yahoo!; the Open Directory and other human-powered directories; proper tagging with HTML; designing pages to improve the number indexed; and submitting additional pages as well as the home page. (LRW)
Synoptic reporting in tumor pathology: advantages of a web-based system.
Qu, Zhenhong; Ninan, Shibu; Almosa, Ahmed; Chang, K G; Kuruvilla, Supriya; Nguyen, Nghia
2007-06-01
The American College of Surgeons Commission on Cancer (ACS-CoC) mandates that pathology reports at ACS-CoC-approved cancer programs include all scientifically validated data elements for each site and tumor specimen. The College of American Pathologists (CAP) has produced cancer checklists in static text formats to assist reporting. To be inclusive, the CAP checklists are pages long, requiring extensive text editing and multiple intermediate steps. We created a set of dynamic tumor-reporting templates, using Microsoft Active Server Page (ASP.NET), with drop-down list and data-compile features, and added a reminder function to indicate missing information. Users can access this system on the Internet, prepare the tumor report by selecting relevant data from drop-down lists with an embedded tumor staging scheme, and directly transfer the final report into a laboratory information system by using the copy-and-paste function. By minimizing extensive text editing and eliminating intermediate steps, this system can reduce reporting errors, improve work efficiency, and increase compliance.
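The reminder function described above can be sketched as a simple completeness check against a list of required data elements before a report is finalized. The element names here are illustrative, not the actual CAP checklist:

```python
# Hypothetical required synoptic data elements for a tumor report.
REQUIRED_ELEMENTS = [
    "specimen_site", "histologic_type", "tumor_size_cm",
    "margin_status", "pT_stage", "pN_stage",
]

def missing_elements(report):
    """Return the required elements that are absent or left blank,
    so the user can be reminded before the report is transferred."""
    return [e for e in REQUIRED_ELEMENTS
            if report.get(e) in (None, "")]

draft = {
    "specimen_site": "colon",
    "histologic_type": "adenocarcinoma",
    "tumor_size_cm": 4.5,
    "margin_status": "",      # selected but not yet entered
    "pT_stage": "pT3",
}
reminders = missing_elements(draft)
```

In the web system, a check like this would run when the user compiles the report, flagging the blank drop-down fields before the copy-and-paste transfer into the laboratory information system.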
School Web Sites: Are They Accessible to All?
ERIC Educational Resources Information Center
Wells, Julie A.; Barron, Ann E.
2006-01-01
In 2002, the National Center for Educational Statistics reported that 99% of public schools had Internet access and 86% of those schools had a web site or web page (Kleiner & Lewis, 2003). This study examined accessibility issues on elementary school homepages. Using a random sample of elementary school web sites, the researchers documented…
Representation of the serial killer on the Italian Internet.
Villano, P; Bastianoni, P; Melotti, G
2001-10-01
The representation of serial killers was examined through an analysis of 317 Web pages in the Italian language to study how the psychological profiles of serial killers are described on the Italian Internet. The correspondence analysis of the content of these Web pages shows that in Italy the serial killer is associated with words such as "monster" and "horror," which suggest and imply psychological perversion and aberrant acts. These traits are peculiar to the Italian scenario.
2002-06-01
Migrate data to SQL Server... The Web Server is on the same server as the SWORD database in the current version. ...still be supported by Access. SQL Server would be a more viable tool for a fully developed application based on the number of potential users and
Modernizing the Mobility Air Force for Tomorrow’s Air Traffic Management System
2012-01-01
Decision Support System GLONASS Global'naya Navigatsionnaya Sputnikovaya Sistema [Global Navigation Satellite System] GPS Global Positioning System HF high...spreadsheet, November 2009. Eurocontrol, "Link 2000+ Programme: Frequently Asked Questions," web page, undated(a). As of June 5, 2012: http://www.eurocontrol.int/faq/link2000 ———, "Link 2000+ Programme," web page, undated(b). As of June 5, 2012: http://www.eurocontrol.int/programmes/link-2000-programme
Oh What a Tangled Biofilm Web Bacteria Weave
An Inside Life Science article on the tangled webs that biofilm-forming bacteria weave and on bacterial cell-to-cell communication; the article also appears on LiveScience.
Law, Michael R; Mintzes, Barbara; Morgan, Steven G
2011-03-01
The Internet has become a popular source of health information. However, little is known about which drugs users search for and which Web sites those searches return. To investigate the sources of online information about prescription drugs by assessing the most common Web sites returned in online drug searches and to assess the comparative popularity of Web pages for particular drugs. This was a cross-sectional study of search results for the most commonly dispensed drugs in the US (n=278 active ingredients) on 4 popular search engines: Bing, Google (both US and Canada), and Yahoo. We determined the number of times a Web site appeared as the first result. A linked retrospective analysis counted Wikipedia page hits for each of these drugs in 2008 and 2009. About three quarters of the first results on Google USA for both brand and generic names linked to the National Library of Medicine. In contrast, Wikipedia was the first result for approximately 80% of generic name searches on the other 3 sites. On these other sites, over two thirds of brand name searches led to industry-sponsored sites. The Wikipedia pages with the highest number of hits were mainly for opiates, benzodiazepines, antibiotics, and antidepressants. Wikipedia and the National Library of Medicine rank highly in online drug searches. Further, our results suggest that patients most often seek information on drugs with the potential for dependence, for stigmatized conditions, that have received media attention, and for episodic treatments. Quality improvement efforts should focus on these drugs.
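The tallying step this study describes (counting how often each Web site appears as the first result across drug searches) reduces to counting domains. A minimal sketch follows; the drug names and result URLs are invented for illustration and are not data from the study.

```python
# Count how often each domain appears as the first search result.
from collections import Counter
from urllib.parse import urlparse

# Hypothetical first-result URLs keyed by drug name (illustrative only).
first_results = {
    "metformin": "https://medlineplus.gov/druginfo/meds/a696005.html",
    "alprazolam": "https://en.wikipedia.org/wiki/Alprazolam",
    "atorvastatin": "https://en.wikipedia.org/wiki/Atorvastatin",
}

# Tally by domain rather than full URL, as the study compares Web sites.
domain_counts = Counter(urlparse(url).netloc for url in first_results.values())
print(domain_counts.most_common())
```

Applied over 278 active ingredients and four engines, the same tally yields the site-level rankings the abstract reports.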
Aviation Research and the Internet
NASA Technical Reports Server (NTRS)
Scott, Antoinette M.
1995-01-01
The Internet is a network of networks. It was originally funded by the Defense Advanced Research Projects Agency (DOD/DARPA) and evolved in part from the connection of supercomputer sites across the United States. The National Science Foundation (NSF) made the most of their supercomputers by connecting the sites to each other. This made the supercomputers more efficient and now allows scientists, engineers and researchers to access the supercomputers from their own labs and offices. The high speed networks that connect the NSF supercomputers form the backbone of the Internet. The World Wide Web (WWW) is a menu system. It gathers Internet resources from all over the world into a series of screens that appear on your computer. The WWW is also a distributed system: it stores data on many computers (servers), which can go out and get data when you ask for it. Hypermedia is the base of the WWW. One can 'click' on a section and visit other hypermedia (pages). Our approach to demonstrating the importance of aviation research through the Internet began with learning how to put pages on the Internet (on-line) ourselves. We were assigned two aviation companies: Vision Micro Systems Inc. and Innovative Aerodynamic Technologies (IAT). We developed home pages for these SBIR companies. The equipment used to create the pages consisted of UNIX and Macintosh machines. HTML Supertext software was used to write the pages and a Sharp JX600S scanner to scan the images. As a result, with the use of the UNIX, Macintosh, Sun, PC, and AXIL machines, we were able to present our home pages to over 800,000 visitors.
NASA Technical Reports Server (NTRS)
1996-01-01
A World Wide Web page, Webpress, designed for K-12 teachers is described. The primary emphasis of Webpress is the science of aeronautics, and the page includes many links to various NASA facilities as well as many other scientific organizations.
ERIC Educational Resources Information Center
Nagasinghe, Iranga
2010-01-01
This thesis investigates and develops a few acceleration techniques for the search engine algorithms used in PageRank and HITS computations. PageRank and HITS methods are two highly successful applications of modern Linear Algebra in computer science and engineering. They constitute essential technologies that account for the immense growth and…
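The first of the two algorithms named, PageRank, can be computed by straightforward power iteration on a link graph; the acceleration techniques the thesis develops speed up exactly this kind of iteration. The toy graph and damping factor d = 0.85 below are illustrative assumptions, not details from the thesis.

```python
# Minimal PageRank by power iteration on a small directed link graph.
def pagerank(links, d=0.85, iters=100):
    """links: dict mapping each node to the list of nodes it links to."""
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}  # start from the uniform distribution
    for _ in range(iters):
        new = {u: (1.0 - d) / n for u in nodes}  # teleportation term
        for u, outs in links.items():
            if outs:
                share = d * rank[u] / len(outs)
                for v in outs:
                    new[v] += share
            else:  # dangling node: spread its rank uniformly
                for v in nodes:
                    new[v] += d * rank[u] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
print(ranks)  # "c" ranks highest: it is linked to by both "a" and "b"
```

The ranks always sum to 1, since each iteration redistributes a probability distribution over the nodes.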
JavaScript: Convenient Interactivity for the Class Web Page.
ERIC Educational Resources Information Center
Gray, Patricia
This paper shows how JavaScript can be used within HTML pages to add interactive review sessions and quizzes incorporating graphics and sound files. JavaScript has the advantage of providing basic interactive functions without the use of separate software applications and players. Because it can be part of a standard HTML page, it is…
ERIC Educational Resources Information Center
Sun, Yanyan; Gao, Fei
2014-01-01
Web annotation is a Web 2.0 technology that allows learners to work collaboratively on web pages or electronic documents. This study explored the use of Web annotation as an online discussion tool by comparing it to a traditional threaded discussion forum. Ten graduate students participated in the study. Participants had access to both a Web…
Students as Web Site Authors: Effects on Motivation and Achievement
ERIC Educational Resources Information Center
Jones, Brett D.
2003-01-01
This study examined the effects of a Web site design project on students' motivation and achievement. Tenth-grade biology students worked together in teams on an ecology project that required them to locate relevant information on the Internet, decide which information should be included on their Web site, organize the information into Web pages,…
Focus Groups and Usability Testing in Redesigning an Academic Library's Web Site
ERIC Educational Resources Information Center
Oldham, Bonnie W.
2008-01-01
As the World Wide Web has advanced since its inception, librarians have endeavored to keep pace with this progress in the design of their library Web pages. User recommendations collected from focus groups and usability testing have indicated that the University of Scranton's Weinberg Memorial Library's Web site was not working as intended, and…
Learning in a Sheltered Internet Environment: The Use of WebQuests
ERIC Educational Resources Information Center
Segers, Eliane; Verhoeven, Ludo
2009-01-01
The present study investigated the effects on learning in a sheltered Internet environment using so-called WebQuests in elementary school classrooms in the Netherlands. A WebQuest is an assignment presented together with a series of web pages to help guide children's learning. The learning gains and quality of the work of 229 sixth graders…
A Web Browser Interface to Manage the Searching and Organizing of Information on the Web by Learners
ERIC Educational Resources Information Center
Li, Liang-Yi; Chen, Gwo-Dong
2010-01-01
Information Gathering is a knowledge construction process. Web learners make a plan for their Information Gathering task based on their prior knowledge. The plan is evolved with new information encountered and their mental model is constructed through continuously assimilating and accommodating new information gathered from different Web pages. In…
Web-based hydrodynamics computing
NASA Astrophysics Data System (ADS)
Shimoide, Alan; Lin, Luping; Hong, Tracie-Lynne; Yoon, Ilmi; Aragon, Sergio R.
2005-01-01
Proteins are long chains of amino acids that have a definite 3-d conformation, and the shape of each protein is vital to its function. Since proteins are normally in solution, hydrodynamics (which describes the movement of solvent around a protein as a function of the shape and size of the molecule) can be used to probe the size and shape of proteins and compare them to those derived from X-ray crystallography. The computation chain needed for these hydrodynamics calculations consists of several separate programs by different authors on various platforms and often requires 3D visualizations of intermediate results. Due to this complexity, tools developed by a particular research group are not readily available for use by other groups, nor even by the non-experts within the same research group. To alleviate this situation, and to foster the easy and wide distribution of computational tools worldwide, we developed a web-based interactive computational environment (WICE), including interactive 3D visualization, that can be used with any web browser. Java-based technologies were used to provide a platform-neutral, user-friendly solution. Java Server Pages (JSP), Java Servlets, Java Beans, JOGL (Java bindings for OpenGL), and Java Web Start were used to create a solution that simplifies the computing chain, allowing the user to focus on their scientific research. WICE hides complexity from the user and provides robust and sophisticated visualization through a web browser.
CBS Genome Atlas Database: a dynamic storage for bioinformatic results and sequence data.
Hallin, Peter F; Ussery, David W
2004-12-12
Currently, new bacterial genomes are being published on a monthly basis. With the growing amount of genome sequence data, there is a demand for a flexible and easy-to-maintain structure for storing sequence data and results from bioinformatic analysis. More than 150 sequenced bacterial genomes are now available, and comparisons of properties for taxonomically similar organisms are not readily available to many biologists. In addition to the most basic information, such as AT content, chromosome length, tRNA count and rRNA count, a large number of more complex calculations are needed to perform detailed comparative genomics. DNA structural calculations like curvature and stacking energy, and DNA compositions like base skews, oligo skews and repeats at the local and global level, are just a few of the analyses presented on the CBS Genome Atlas Web page. Complex analyses, changing methods and frequent addition of new models are factors that require a dynamic database layout. Using basic tools like the GNU Make system, csh, Perl and MySQL, we have created a flexible database environment for storing and maintaining such results for a collection of complete microbial genomes. Currently, these results amount to more than 220 pieces of information. The backbone of this solution consists of a program package written in Perl, which enables administrators to synchronize and update the database content. The MySQL database has been connected to the CBS web server via PHP4 to present dynamic web content for users outside the center. This solution is tightly fitted to existing server infrastructure, and the solutions proposed here can perhaps serve as a template for other research groups to solve database issues. A web-based user interface which is dynamically linked to the Genome Atlas Database can be accessed via www.cbs.dtu.dk/services/GenomeAtlas/.
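Two of the simplest per-genome statistics the Atlas stores, AT content and a base skew, can be sketched directly from a sequence string. This is a toy illustration (the Atlas pipeline is Perl-based, and the sequence below is invented, not a real genome).

```python
# Per-sequence composition statistics of the kind the Atlas reports.
def at_content(seq: str) -> float:
    """Fraction of the sequence that is A or T."""
    seq = seq.upper()
    return (seq.count("A") + seq.count("T")) / len(seq)

def gc_skew(seq: str) -> float:
    """(G - C) / (G + C); its sign often flips near the replication origin."""
    seq = seq.upper()
    g, c = seq.count("G"), seq.count("C")
    return (g - c) / (g + c) if g + c else 0.0

seq = "ATGCGGCCATAT"  # toy sequence for illustration
print(at_content(seq), gc_skew(seq))
```

Computed in windows along a chromosome rather than globally, the same skew becomes the local base-skew profile plotted on the Atlas pages.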
This paper has a supplemental information page which links to the examples presented: www.cbs.dtu.dk/services/GenomeAtlas/suppl/bioinfdatabase.
Morrissey, Karyn; Kinderman, Peter; Pontin, Eleanor; Tai, Sara; Schwannauer, Mathias
2016-08-01
In June 2011 the BBC Lab UK carried out a web-based survey on the causes of mental distress. The 'Stress Test' was launched on 'All in the Mind', a BBC Radio 4 programme, and the test's URL was publicised on radio and TV broadcasts and made available via BBC web pages and social media. Given the large amount of data created (over 32,800 participants, with corresponding diagnosis, demographic and socioeconomic characteristics), the dataset is potentially an important source of data for population-based research on depression and anxiety. However, as respondents self-selected to participate in the online survey, the survey may comprise a non-random sample. It may be that only individuals who listen to BBC Radio 4 and/or use its website participated in the survey. In this instance, using the Stress Test data for wider population-based research may create sample selection bias. Focusing on the depression component of the Stress Test, this paper presents an easy-to-use method, the Two Step Probit Selection Model, to detect and statistically correct selection bias in the Stress Test. Using a Two Step Probit Selection Model, this paper did not find a statistically significant selection on unobserved factors for participants of the Stress Test. That is, survey participants who accessed and completed an online survey are not systematically different from non-participants on the variables of substantive interest. Copyright © 2016 Elsevier Ltd. All rights reserved.
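The correction term at the heart of a two-step (Heckman-type) probit selection model is the inverse Mills ratio computed from the first-step probit index. A stdlib-only sketch follows, assuming a standard normal error; a real analysis would fit the probit itself with a statistics package.

```python
# Inverse Mills ratio: the selection-correction regressor added in step two.
import math

def inverse_mills(z: float) -> float:
    """phi(z) / Phi(z) for the standard normal density and CDF."""
    pdf = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return pdf / cdf

# Step two includes inverse_mills(z_i) for each selected observation,
# where z_i is that observation's fitted probit index from step one.
# A non-significant coefficient on this term indicates no selection on
# unobservables, which is the result the paper reports.
print(inverse_mills(0.0))
```

The ratio decreases in z: observations the probit deems very likely to participate carry little selection correction.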
Interactive 3d Landscapes on Line
NASA Astrophysics Data System (ADS)
Fanini, B.; Calori, L.; Ferdani, D.; Pescarin, S.
2011-09-01
The paper describes challenges identified while developing browser-embedded 3D landscape rendering applications, our current approach and work-flow, and how recent developments in browser technologies could affect them. All the data, even if processed by optimization and decimation tools, result in very large databases that require paging, streaming and level-of-detail techniques to be implemented to allow remote, web-based, real-time use. Our approach has been to select an open source scene-graph based visual simulation library with sufficient performance and flexibility and adapt it to the web by providing a browser plug-in. Within the current Montegrotto VR Project, content produced with new pipelines has been integrated. The whole Montegrotto town has been generated procedurally by CityEngine. We used this procedural approach, based on algorithms and procedures, because it is particularly functional for creating extensive and credible urban reconstructions. To create the archaeological sites we used optimized meshes acquired with laser scanning and photogrammetry techniques, whereas to realize the 3D reconstructions of the main historical buildings we adopted computer-graphics software like Blender and 3ds Max. At the final stage, semi-automatic tools have been developed and used to prepare and cluster 3D models and scene-graph routes for web publishing. Vegetation generators have also been used with the goal of populating the virtual scene to enhance the user's perceived realism during the navigation experience. After the description of 3D modelling and optimization techniques, the paper discusses its results and expectations.
Web-based training in German university eye hospitals - Education 2.0?
Handzel, Daniel M; Hesse, L
2011-01-01
To analyse web-based training in ophthalmology offered by German university eye hospitals. In January 2010 the websites of all 36 German university hospitals were searched for information provided for visitors, students and doctors alike. We evaluated the offerings in terms of quantity and quality. All websites could be accessed at the time of the study. 28 pages provided information for students and doctors, one page only for students, and three exclusively for doctors. Four pages didn't offer any information for these target groups. The websites offered information on events like congresses or students' curricular education; there was also material for download for these events or for other purposes. We found complex e-learning platforms on 9 pages. These dealt with special ophthalmological topics in a didactic arrangement. In spite of the extensive possibilities offered by the technology of Web 2.0, many conceivable tools were only rarely made available. It was not always possible to determine whether the information provided was up to date; very often the last update of the content was long ago. On one page the date of the last change was stated as 2004. Currently there are 9 functional e-learning applications offered by German university eye hospitals. Two additional hospitals present links to a project of the German Ophthalmological Society. There was considerable variation in quantity and quality. No website made use of crediting successful study, e.g. with CME points or OSCE credits. All German university eye hospitals present themselves in the World Wide Web. However, the lack of modern, technically as well as didactically state-of-the-art learning applications is alarming, as it leaves an essential medium of today's communication unused.