Sample records for web access statistics

  1. School Web Sites: Are They Accessible to All?

    ERIC Educational Resources Information Center

    Wells, Julie A.; Barron, Ann E.

    2006-01-01

    In 2002, the National Center for Educational Statistics reported that 99% of public schools had Internet access and 86% of those schools had a web site or web page (Kleiner & Lewis, 2003). This study examined accessibility issues on elementary school homepages. Using a random sample of elementary school web sites, the researchers documented…

  2. Ocean Drilling Program: Web Site Access Statistics

    Science.gov Websites

    Web site access statistics for the Ocean Drilling Program (ODP/TAMU) web site and its online Janus database, including separate statistics for JOIDES members. Statistics from October, November, and December 1997 onward are accessible only on www-odp.tamu.edu; the series ends at the transition from ODP to IODP.

  3. Evaluation of Web Accessibility of Consumer Health Information Websites

    PubMed Central

    Zeng, Xiaoming; Parmanto, Bambang

    2003-01-01

    The objectives of the study are to construct a comprehensive framework for web accessibility evaluation, to evaluate the current status of web accessibility of consumer health information websites, and to investigate the relationship between web accessibility and other properties of the websites. We selected 108 consumer health information websites from the directory service of a Web search engine. We used Web accessibility specifications to construct a framework for the measurement of Web Accessibility Barriers (WAB) of websites. We found that none of the websites is completely accessible to people with disabilities, but governmental and educational health information websites exhibit better performance on web accessibility than other categories of websites. We also found that the correlation between the WAB score and the popularity of a website is statistically significant. PMID:14728272

  4. Evaluation of web accessibility of consumer health information websites.

    PubMed

    Zeng, Xiaoming; Parmanto, Bambang

    2003-01-01

    The objectives of the study are to construct a comprehensive framework for web accessibility evaluation, to evaluate the current status of web accessibility of consumer health information websites, and to investigate the relationship between web accessibility and other properties of the websites. We selected 108 consumer health information websites from the directory service of a Web search engine. We used Web accessibility specifications to construct a framework for the measurement of Web Accessibility Barriers (WAB) of websites. We found that none of the websites is completely accessible to people with disabilities, but governmental and educational health information websites exhibit better performance on web accessibility than other categories of websites. We also found that the correlation between the WAB score and the popularity of a website is statistically significant.

  5. Making Spatial Statistics Service Accessible On Cloud Platform

    NASA Astrophysics Data System (ADS)

    Mu, X.; Wu, J.; Li, T.; Zhong, Y.; Gao, X.

    2014-04-01

    Web services can bring together applications running on diverse platforms, allowing users to access and share data, information, and models more effectively and conveniently from a given web service platform. Cloud computing has emerged as a paradigm of Internet computing in which dynamic, scalable, and often virtualized resources are provided as services. With the rapid growth of data volumes and the constraints of network bandwidth, traditional web service platforms face notable problems in computational efficiency, maintenance cost, and data security. In this paper, we offer a spatial statistics service based on the Microsoft cloud. An experiment was carried out to evaluate the availability and efficiency of this service. The results show that this spatial statistics service is conveniently accessible to the public, with high processing efficiency.

  6. Automated grading of homework assignments and tests in introductory and intermediate statistics courses using active server pages.

    PubMed

    Stockburger, D W

    1999-05-01

    Active server pages permit a software developer to customize the Web experience for users by inserting server-side script and database access into Web pages. This paper describes applications of these techniques and provides a primer on the use of these methods. Applications include a system that generates and grades individualized homework assignments and tests for statistics students. The student accesses the system as a Web page, prints out the assignment, does the assignment, and enters the answers on the Web page. The server, running on NT Server 4.0, grades the assignment, updates the grade book (on a database), and returns the answer key to the student.
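
    A hedged sketch of the generate/grade/record flow described above follows. The original system used Active Server Pages (server-side script) with a database-backed grade book; the function names, tolerance, and dictionary grade book below are illustrative assumptions, not the authors' code.

```python
# Hedged Python sketch of the assignment-generation / grading flow described above.
# The original used Active Server Pages and a database; this only illustrates the logic.
import random
import statistics

def generate_assignment(student_id: str, n: int = 20) -> list[float]:
    """Produce an individualized data set, seeded so each student gets a reproducible set."""
    rng = random.Random(student_id)
    return [round(rng.gauss(50, 10), 1) for _ in range(n)]

def answer_key(data: list[float]) -> dict[str, float]:
    """Compute the quantities the student is asked to report."""
    return {
        "mean": round(statistics.mean(data), 2),
        "sd": round(statistics.stdev(data), 2),
        "median": round(statistics.median(data), 2),
    }

def grade(submitted: dict[str, float], key: dict[str, float], tol: float = 0.05) -> float:
    """Score a submission: credit for each answer within tolerance of the key."""
    correct = sum(abs(submitted.get(k, float("inf")) - v) <= tol for k, v in key.items())
    return 100.0 * correct / len(key)

grade_book: dict[str, float] = {}  # stands in for the database table behind the web page

data = generate_assignment("student-042")
key = answer_key(data)
submission = {"mean": key["mean"], "sd": 0.0, "median": key["median"]}  # one wrong answer
grade_book["student-042"] = grade(submission, key)
print(grade_book)  # two of three answers match the key, so the score is 100 * 2/3
```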

  7. Web Content Accessibility of Consumer Health Information Web Sites for People with Disabilities: A Cross Sectional Evaluation

    PubMed Central

    Parmanto, Bambang

    2004-01-01

    Background The World Wide Web (WWW) has become an increasingly essential resource for health information consumers. The ability to obtain accurate medical information online quickly, conveniently and privately provides health consumers with the opportunity to make informed decisions and participate actively in their personal care. Little is known, however, about whether the content of this online health information is equally accessible to people with disabilities who must rely on special devices or technologies to process online information due to their visual, hearing, mobility, or cognitive limitations. Objective To construct a framework for an automated Web accessibility evaluation; to evaluate the state of accessibility of consumer health information Web sites; and to investigate the possible relationships between accessibility and other features of the Web sites, including function, popularity and importance. Methods We carried out a cross-sectional study of the state of accessibility of health information Web sites to people with disabilities. We selected 108 consumer health information Web sites from the directory service of a Web search engine. A measurement framework was constructed to automatically measure the level of Web Accessibility Barriers (WAB) of Web sites following Web accessibility specifications. We investigated whether there was a difference between WAB scores across various functional categories of the Web sites, and also evaluated the correlation between the WAB and Alexa traffic rank and Google Page Rank of the Web sites. Results We found that none of the Web sites we looked at are completely accessible to people with disabilities, i.e., there were no sites that had no violation of Web accessibility rules. However, governmental and educational health information Web sites do exhibit better Web accessibility than the other categories of Web sites (P < 0.001). We also found that the correlation between the WAB score and the popularity of a Web site is statistically significant (r = 0.28, P < 0.05), although there is no correlation between the WAB score and the importance of the Web sites (r = 0.15, P = 0.111). Conclusions Evaluation of health information Web sites shows that no Web site scrupulously abides by Web accessibility specifications, even for entities mandated under relevant laws and regulations. Government and education Web sites show better performance than Web sites among other categories. Accessibility of a Web site may have a positive impact on its popularity in general. However, the Web accessibility of a Web site may not have a significant relationship with its importance on the Web. PMID:15249268
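
    As a rough illustration of the analysis pattern reported above, the sketch below computes a simplified, unweighted WAB-style score per site and correlates it with a traffic rank. The published WAB formula weights violations by WCAG checkpoint priority; that weighting and the toy data here are simplifying assumptions, not the paper's implementation.

```python
# Simplified WAB-style score and its correlation with site popularity (illustrative only).
from scipy import stats

def wab_score(pages: list[dict]) -> float:
    """Average over a site's pages of (violations found / potential violation points)."""
    ratios = [p["violations"] / p["potential"] for p in pages if p["potential"] > 0]
    return sum(ratios) / len(ratios)

sites = {  # toy per-site page audits and traffic ranks (lower rank = more popular)
    "site_a": {"pages": [{"violations": 3, "potential": 40}, {"violations": 5, "potential": 55}],
               "traffic_rank": 1200},
    "site_b": {"pages": [{"violations": 12, "potential": 38}], "traffic_rank": 54000},
    "site_c": {"pages": [{"violations": 7, "potential": 60}, {"violations": 9, "potential": 45}],
               "traffic_rank": 8700},
}

scores = [wab_score(s["pages"]) for s in sites.values()]
ranks = [s["traffic_rank"] for s in sites.values()]
r, p = stats.pearsonr(scores, ranks)  # the study reports r = 0.28, P < 0.05 over 108 sites
print(f"WAB scores: {[round(s, 3) for s in scores]}, Pearson r = {r:.2f}, p = {p:.3f}")
```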

  8. Online Statistical Modeling (Regression Analysis) for Independent Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially for modelling the relationship between a response and explanatory variables. Statistical models have been developed in various directions to handle diverse types of data and complex relationships. A rich variety of advanced and recent statistical modelling techniques is available mostly in open source software (one of them being R). However, these advanced modelling techniques are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and shiny) so that the most recent and advanced statistical models are readily available, accessible, and applicable on the web. We previously built interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM, and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them into a single data analysis interface, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and easier to compare, in order to find the most appropriate model for the data.
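
    The authors' interface is built on R and shiny; purely as a language-neutral sketch of one model family it exposes (a GLM for an independent response), the following fits a Poisson regression with Python's statsmodels on simulated data.

```python
# Illustrative GLM fit (Poisson regression with a log link) on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(0, 2, size=200)                     # explanatory variable
y = rng.poisson(lam=np.exp(0.5 + 1.2 * x))          # Poisson response with a log link

X = sm.add_constant(x)                              # design matrix with an intercept
result = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(result.params)                                # estimates should be near 0.5 and 1.2
```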

  9. Statistical Reference Datasets

    National Institute of Standards and Technology Data Gateway

    Statistical Reference Datasets (Web, free access). The Statistical Reference Datasets project is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.
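
    A minimal sketch of how such certified results are intended to be used: fit the model with the software under test and compare the estimates against the certified values. The data and "certified" numbers below are placeholders, not actual StRD values.

```python
# Toy check of a least-squares fit against placeholder "certified" values.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

X = np.column_stack([np.ones_like(x), x])           # intercept + slope design matrix
(beta0, beta1), *_ = np.linalg.lstsq(X, y, rcond=None)

certified = {"intercept": 0.02, "slope": 2.0}       # placeholder "certified" results
tol = 1e-1                                          # agreement threshold for this toy check
ok = abs(beta0 - certified["intercept"]) < tol and abs(beta1 - certified["slope"]) < tol
print(f"intercept={beta0:.4f}, slope={beta1:.4f}, matches certified values: {ok}")
```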

  10. Web-TCGA: an online platform for integrated analysis of molecular cancer data sets.

    PubMed

    Deng, Mario; Brägelmann, Johannes; Schultze, Joachim L; Perner, Sven

    2016-02-06

    The Cancer Genome Atlas (TCGA) is a pool of molecular data sets publicly accessible and freely available to cancer researchers anywhere around the world. However, widespread use is limited, since advanced knowledge of statistics and statistical software is required. In order to improve accessibility we created Web-TCGA, a web-based, freely accessible online tool, which can also be run in a private instance, for integrated analysis of molecular cancer data sets provided by TCGA. In contrast to already available tools, Web-TCGA utilizes different methods for analysis and visualization of TCGA data, allowing users to generate global molecular profiles across different cancer entities simultaneously. In addition to global molecular profiles, Web-TCGA offers highly detailed gene and tumor entity centric analysis by providing interactive tables and views. As a supplement to other already available tools, such as cBioPortal (Sci Signal 6:pl1, 2013, Cancer Discov 2:401-4, 2012), Web-TCGA offers an analysis service for molecular data sets available at the TCGA that does not require any installation or configuration. Individual processing requests (queries) are generated by the user for mutation, methylation, expression and copy number variation (CNV) analyses. The user can focus analyses on results from single genes and cancer entities or perform a global analysis (multiple cancer entities and genes simultaneously).

  11. Effects of Web-Based Instruction on Math Anxiety, the Sense of Mastery, and Global Self-Esteem: A Quasi-Experimental Study of Undergraduate Statistics Students

    ERIC Educational Resources Information Center

    Van Gundy, Karen; Morton, Beth A.; Liu, Hope Q.; Kline, Jennifer

    2006-01-01

    To explore the effects of web-based instruction (WBI) on math anxiety, the sense of mastery, and global self-esteem, we use quasi-experimental data from undergraduate statistics students in classes assigned to three study conditions, each with varied access to, and incentive for, the use of online technologies. Results suggest that when statistics…

  12. The Many Faces of the Economic Bulletin Board.

    ERIC Educational Resources Information Center

    Boettcher, Jennifer

    1996-01-01

    The Economic Bulletin Board (EBB), a one-stop site for economic statistics and government-sponsored business information, can be accessed on the World Wide Web, gopher, telnet, file transfer protocol, dial-up, and fax. Each access method has advantages and disadvantages related to connections, pricing, depth of access, retrieval, and system…

  13. Review of Web-Based Technical Documentation Processes. FY07 NAEP-QA Special Study Report. TR-08-17

    ERIC Educational Resources Information Center

    Gribben, Monica; Wise, Lauress; Becker, D. E.

    2008-01-01

    Beginning with the 2000 and 2001 National Assessment of Educational Progress (NAEP) assessments, the National Center for Education Statistics (NCES) has made technical documentation available on the worldwide web at http://nces.ed.gov/nationsreportcard/tdw/. The web-based documentation is designed to be less dense and more accessible than prior…

  14. Description and testing of the Geo Data Portal: Data integration framework and Web processing services for environmental science collaboration

    USGS Publications Warehouse

    Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.

    2011-01-01

    Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information science software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.

  15. Web Access to Japanese Science and Technology Information.

    ERIC Educational Resources Information Center

    Takase, Emi

    1997-01-01

    Describes a project conducted by the Massachusetts Institute of Technology (MIT) Libraries in collaboration with the MIT Japan Program; its objectives are to increase information exchange and enhance cooperation between Japan and the United States through a World Wide Web page and an interactive listserv. Examines usage statistics and issues in…

  16. Library Web Proxy Use Survey Results.

    ERIC Educational Resources Information Center

    Murray, Peter E.

    2001-01-01

    Outlines the use of proxy Web servers by libraries and reports on a survey on their use in libraries. Highlights include proxy use for remote resource access, for filtering, for bandwidth conservation, and for gathering statistics; privacy policies regarding the use of proxy server log files; and a copy of the survey. (LRW)

  17. StreamStats in North Carolina: a water-resources Web application

    USGS Publications Warehouse

    Weaver, J. Curtis; Terziotti, Silvia; Kolb, Katharine R.; Wagner, Chad R.

    2012-01-01

    A statewide StreamStats application for North Carolina was developed in cooperation with the North Carolina Department of Transportation following completion of a pilot application for the upper French Broad River basin in western North Carolina (Wagner and others, 2009). StreamStats for North Carolina, available at http://water.usgs.gov/osw/streamstats/north_carolina.html, is a Web-based Geographic Information System (GIS) application developed by the U.S. Geological Survey (USGS) in consultation with Environmental Systems Research Institute, Inc. (Esri) to provide access to an assortment of analytical tools that are useful for water-resources planning and management (Ries and others, 2008). The StreamStats application provides an accurate and consistent process that allows users to easily obtain streamflow statistics, basin characteristics, and descriptive information for USGS data-collection sites and user-selected ungaged sites. In the North Carolina application, users can compute 47 basin characteristics and peak-flow frequency statistics (Weaver and others, 2009; Robbins and Pope, 1996) for a delineated drainage basin. Selected streamflow statistics and basin characteristics for data-collection sites have been compiled from published reports and also are immediately accessible by querying individual sites from the web interface. Examples of basin characteristics that can be computed in StreamStats include drainage area, stream slope, mean annual precipitation, and percentage of forested area (Ries and others, 2008). Examples of streamflow statistics that were previously available only through published documents include peak-flow frequency, flow-duration, and precipitation data. These data are valuable for making decisions related to bridge design, floodplain delineation, water-supply permitting, and sustainable stream quality and ecology. The StreamStats application also allows users to identify stream reaches upstream and downstream from user-selected sites and obtain information for locations along streams where activities occur that may affect streamflow conditions. This functionality can be accessed through a map-based interface with the user’s Web browser, or individual functions can be requested remotely through Web services (Ries and others, 2008).

  18. Comparison of quality of internet pages on human papillomavirus immunization in Italian and in English.

    PubMed

    Tozzi, Alberto Eugenio; Buonuomo, Paola Sabrina; Ciofi degli Atti, Marta Luisa; Carloni, Emanuela; Meloni, Marco; Gamba, Fiorenza

    2010-01-01

    Information available on the Internet about immunizations may influence parents' perception about human papillomavirus (HPV) immunization and their attitude toward vaccinating their daughters. We hypothesized that the quality of information on HPV available on the Internet may vary with language and with the level of knowledge of parents. To this end we compared the quality of a sample of Web pages in Italian with a sample of Web pages in English. Five reviewers assessed the quality of Web pages retrieved with popular search engines using criteria adapted from the Good Information Practice Essential Criteria for Vaccine Safety Web Sites recommended by the World Health Organization. Quality of Web pages was assessed in the domains of accessibility, credibility, content, and design. Scores in these domains were compared through nonparametric statistical tests. We retrieved and reviewed 74 Web sites in Italian and 117 in English. The largest proportion of retrieved Web pages (33.5%) came from private agencies. Median scores were higher in Web pages in English compared with those in Italian in the domains of accessibility (p < .01), credibility (p < .01), and content (p < .01). The highest credibility and content scores were those of Web pages from governmental agencies or universities. Accessibility scores were positively associated with content scores (p < .01) and with credibility scores (p < .01). A total of 16.2% of Web pages in Italian opposed HPV immunization compared with 6.0% of those in English (p < .05). Quality of information and number of Web pages opposing HPV immunization may vary with the Web site language. High-quality Web pages on HPV, especially from public health agencies and universities, should be easily accessible and retrievable with common Web search engines.
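
    The abstract reports nonparametric comparisons of domain scores between the two language samples without naming the specific test; the sketch below assumes a Mann-Whitney U test on invented scores purely to illustrate the comparison pattern.

```python
# Assumed Mann-Whitney U comparison of quality-domain scores between two language groups.
from scipy.stats import mannwhitneyu

scores_italian = [4, 5, 3, 6, 4, 5, 2, 6, 5, 4]   # toy accessibility-domain scores
scores_english = [6, 7, 5, 8, 6, 7, 6, 9, 7, 6]

stat, p = mannwhitneyu(scores_italian, scores_english, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")  # a small p supports a language-related score difference
```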

  19. TMFoldWeb: a web server for predicting transmembrane protein fold class.

    PubMed

    Kozma, Dániel; Tusnády, Gábor E

    2015-09-17

    Here we present TMFoldWeb, the web server implementation of TMFoldRec, a transmembrane protein fold recognition algorithm. TMFoldRec uses statistical potentials and utilizes topology filtering and a gapless threading algorithm. It ranks template structures, selects the most likely candidates, and estimates the reliability of the obtained lowest-energy model. The statistical potential was developed in a maximum likelihood framework on a representative set of the PDBTM database. According to the benchmark test, TMFoldRec correctly predicts the fold class for about 77% of transmembrane protein sequences. An intuitive web interface has been developed for the recently published TMFoldRec algorithm. The query sequence goes through a pipeline of topology prediction and a systematic sequence-to-structure alignment (threading). Resulting templates are ordered by energy and reliability values and are colored according to their significance level. Besides the graphical interface, programmatic access is available as well, via a direct interface for developers or for submitting genome-wide data sets. TMFoldWeb is currently the only web server able to predict the fold class of transmembrane proteins while assigning reliability scores to the predictions. The method is prepared for genome-wide analysis with its easy-to-use interface, informative results page and programmatic access. The web server, including its molecule viewer, is responsive and fully compatible with current tablets and mobile devices.

  20. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    PubMed

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. Multiple challenges in protein MS data analysis remain: management of large-scale and complex data sets; MS peak identification and indexing; and high-dimensional differential analysis of peaks with control of the false discovery rate (FDR) across concurrent statistical tests. "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The web application supports online uploading and analysis of large-scale MS data through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
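
    A hedged sketch of the peak-level differential testing with FDR control described above: one test per peak followed by Benjamini-Hochberg adjustment. The per-peak t-test, simulated intensities, and thresholds are illustrative assumptions, not the portal's actual pipeline.

```python
# Per-peak differential testing with Benjamini-Hochberg FDR control on simulated data.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
n_peaks, n_per_group = 500, 12
group_a = rng.normal(0, 1, size=(n_peaks, n_per_group))
group_b = rng.normal(0, 1, size=(n_peaks, n_per_group))
group_b[:25] += 1.5                                  # 25 peaks with a real intensity shift

pvals = stats.ttest_ind(group_a, group_b, axis=1).pvalue   # one test per peak
reject, qvals, *_ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} peaks significant at an FDR of 0.05")
```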

  1. Influence of Internet Accessibility and Demographic factors on utilization of Web-based Health Information Resources by Resident Doctors in Nigeria

    PubMed Central

    Ajuwon, GA; Popoola, SO

    2015-01-01

    Background The internet is a huge library with an avalanche of information resources, including healthcare information. There are numerous studies on the use of electronic resources by healthcare providers, including medical practitioners; however, there is a dearth of information on the patterns of use of web-based health information resources by resident doctors in Nigeria. This study therefore investigates the influence of internet accessibility and demographic factors on utilization of web-based health information resources by resident doctors in tertiary healthcare institutions in Nigeria. Methods A descriptive survey design was adopted for this study. The population of the study consisted of medical doctors undergoing residency training in 13 tertiary healthcare institutions in South-West Nigeria. The tertiary healthcare institutions were Federal Medical Centres, University Teaching Hospitals and Specialist Hospitals (Neuropsychiatric and Orthopaedic). A pre-tested, self-administered questionnaire was used for data collection. The Statistical Package for the Social Sciences (SPSS) was used for data analysis. Data were analyzed using descriptive statistics, Pearson product moment correlation and multiple regression analysis. Results The mean age of the respondents was 34 years and males were in the majority (69.0%). A total of 96.1% of respondents had access to the Internet. E-mail (X̄=5.40, SD=0.91), Google (X̄=5.26, SD=1.38) and Yahoo (X̄=5.15, SD=4.44) were used weekly by the respondents. Preparation for seminar/grand round presentations (X̄=8.4, SD=1.92), research (X̄=7.8, SD=2.70) and communication (X̄=7.6, SD=2.60) were ranked high as purposes for use of web-based information resources. There is a strong, positive and significant relationship between internet accessibility and utilization of web-based health information resources (r=0.628, p<0.05). Internet accessibility (B=0.911) and the demographic factors gender (B=−2.027), designation (B=−0.343) and educational qualification (B=2.411) significantly influence utilization of web-based health information resources by the respondents. Conclusion A great majority of the respondents have access to the Internet and used web-based health information resources more for academic purposes than for patient care. Training is required to promote the use of internet health information resources among resident doctors. The findings of this study will be useful to the management of the 13 healthcare institutions regarding provision of appropriate internet facilities that will enhance access to and use of web-based health information resources by resident doctors. PMID:26681825
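
    The reported analysis pattern (a Pearson correlation followed by multiple regression with demographic predictors) can be illustrated on simulated data; the variables, coding, and coefficients below are invented and do not reproduce the study's data.

```python
# Toy Pearson correlation plus multiple regression, mirroring the reported analysis pattern.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 150
df = pd.DataFrame({
    "accessibility": rng.normal(5, 1.5, n),
    "gender": rng.integers(0, 2, n),          # toy 0/1 coding
    "qualification": rng.integers(1, 4, n),
})
df["utilization"] = (0.9 * df.accessibility - 1.5 * df.gender
                     + 2.0 * df.qualification + rng.normal(0, 2, n))

r, p = pearsonr(df.accessibility, df.utilization)
print(f"Pearson r = {r:.3f}, p = {p:.4f}")

model = smf.ols("utilization ~ accessibility + gender + qualification", data=df).fit()
print(model.params)                            # regression coefficients (the B values)
```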

  2. Influence of Internet Accessibility and Demographic factors on utilization of Web-based Health Information Resources by Resident Doctors in Nigeria.

    PubMed

    Ajuwon, G A; Popoola, S O

    2014-09-01

    The internet is a huge library with an avalanche of information resources, including healthcare information. There are numerous studies on the use of electronic resources by healthcare providers, including medical practitioners; however, there is a dearth of information on the patterns of use of web-based health information resources by resident doctors in Nigeria. This study therefore investigates the influence of internet accessibility and demographic factors on utilization of web-based health information resources by resident doctors in tertiary healthcare institutions in Nigeria. A descriptive survey design was adopted for this study. The population of the study consisted of medical doctors undergoing residency training in 13 tertiary healthcare institutions in South-West Nigeria. The tertiary healthcare institutions were Federal Medical Centres, University Teaching Hospitals and Specialist Hospitals (Neuropsychiatric and Orthopaedic). A pre-tested, self-administered questionnaire was used for data collection. The Statistical Package for the Social Sciences (SPSS) was used for data analysis. Data were analyzed using descriptive statistics, Pearson product moment correlation and multiple regression analysis. The mean age of the respondents was 34 years and males were in the majority (69.0%). A total of 96.1% of respondents had access to the Internet. E-mail (X̄=5.40, SD=0.91), Google (X̄=5.26, SD=1.38) and Yahoo (X̄=5.15, SD=4.44) were used weekly by the respondents. Preparation for seminar/grand round presentations (X̄=8.4, SD=1.92), research (X̄=7.8, SD=2.70) and communication (X̄=7.6, SD=2.60) were ranked high as purposes for use of web-based information resources. There is a strong, positive and significant relationship between internet accessibility and utilization of web-based health information resources (r=0.628, p<0.05). Internet accessibility (B=0.911) and the demographic factors gender (B=-2.027), designation (B=-0.343) and educational qualification (B=2.411) significantly influence utilization of web-based health information resources by the respondents. A great majority of the respondents have access to the Internet and used web-based health information resources more for academic purposes than for patient care. Training is required to promote the use of internet health information resources among resident doctors. The findings of this study will be useful to the management of the 13 healthcare institutions regarding provision of appropriate internet facilities that will enhance access to and use of web-based health information resources by resident doctors.

  3. P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.

    PubMed

    Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D

    2017-11-01

    P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access to, and the capability to analyze, multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html), alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 American Association for Cancer Research.

  4. Quick Access: Find Statistical Data on the Internet.

    ERIC Educational Resources Information Center

    Su, Di

    1999-01-01

    Provides an annotated list of Internet sources (World Wide Web, ftp, and gopher sites) for current and historical statistical business data, including selected interest rates, the Consumer Price Index, the Producer Price Index, foreign currency exchange rates, noon buying rates, per diem rates, the special drawing right, stock quotes, and mutual…

  5. COMPARE/Radiology, an interactive Web-based radiology teaching program evaluation of user response.

    PubMed

    Wagner, Matthias; Heckemann, Rolf A; Nömayr, Anton; Greess, Holger; Bautz, Werner A; Grunewald, Markus

    2005-06-01

    The aim of this study is to assess user benefits of COMPARE/Radiology, a highly interactive World Wide Web-based training program for radiology, as perceived by its users. COMPARE/Radiology (http://www.idr.med.uni-erlangen.de/compare.htm), an interactive training program based on 244 teaching cases, was created by the authors and made publicly available on the Internet. An anonymous survey was conducted among users to investigate the composition of the program's user base and assess the acceptance of the training program. In parallel, Web access data were collected and analyzed using descriptive statistics. The group of responding users (n = 1370) consisted of 201 preclinical medical students (14.7%), 314 clinical medical students (22.9%), 359 residents in radiology (26.2%), and 205 users of other professions (14.9%). A majority of respondents (1230; 89%) rated the interactivity of COMPARE/Radiology as good or excellent. Many respondents use COMPARE/Radiology for self-study (971; 70%) and for teaching others (600; 43%). Web access statistics show an increase in number of site visits from 1248 in December 2002 to 4651 in April 2004. Users appreciate the benefits of COMPARE/Radiology. The interactive instructional design was rated positively by responding users. The popularity of the site is growing, evidenced by the number of network accesses during the observation period.

  6. Health Statistics NSW: getting the right balance between privacy and small numbers in a web-based reporting system.

    PubMed

    Scandol, James P; Moore, Helen A

    2012-01-01

    Health Statistics NSW is a new web-based application developed by the Centre for Epidemiology and Research at the NSW Ministry of Health. The application is designed to be an efficient vehicle for the timely delivery of health statistics to a diverse audience including the general public, health planners, researchers, students and policy analysts. The development and implementation of this web application required the consideration of a series of competing demands such as: the public interest in providing health data while maintaining the privacy interests of the individuals whose health is being reported; reporting data at spatial scales of relevance to health planners while maintaining the statistical integrity of any inferences drawn; the use of hardware and software systems which are publicly accessible, scalable and robust, while ensuring high levels of security. These three competing demands and the relationships between them are discussed in the context of Health Statistics NSW.
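
    One widely used safeguard for the privacy versus small-numbers trade-off discussed above is to suppress cells below a minimum count before publication. The threshold and suppression marker below are assumptions for illustration, not necessarily the rule adopted by Health Statistics NSW.

```python
# Assumed small-cell suppression rule applied to a count table before publication.
SUPPRESSION_THRESHOLD = 5

def publishable(table: dict[str, int]) -> dict[str, str]:
    """Replace small cells with a suppression marker instead of the raw count."""
    return {area: (str(count) if count >= SUPPRESSION_THRESHOLD else "<5")
            for area, count in table.items()}

admissions_by_area = {"Area A": 42, "Area B": 3, "Area C": 17, "Area D": 1}
print(publishable(admissions_by_area))  # small cells are reported as "<5"
```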

  7. Using Interactive Simulations in Assessment: The Use of Computer-Based Interactive Simulations in the Assessment of Statistical Concepts

    ERIC Educational Resources Information Center

    Neumann, David L.

    2010-01-01

    Interactive computer-based simulations have been applied in several contexts to teach statistical concepts in university level courses. In this report, the use of interactive simulations as part of summative assessment in a statistics course is described. Students accessed the simulations via the web and completed questions relating to the…

  8. Use of StreamStats in the Upper French Broad River Basin, North Carolina: A Pilot Water-Resources Web Application

    USGS Publications Warehouse

    Wagner, Chad R.; Tighe, Kirsten C.; Terziotti, Silvia

    2009-01-01

    StreamStats is a Web-based Geographic Information System (GIS) application that was developed by the U.S. Geological Survey (USGS) in cooperation with Environmental Systems Research Institute, Inc. (ESRI) to provide access to an assortment of analytical tools that are useful for water-resources planning and management. StreamStats allows users to easily obtain streamflow statistics, basin characteristics, and descriptive information for USGS data-collection sites and selected ungaged sites. StreamStats also allows users to identify stream reaches upstream and downstream from user-selected sites and obtain information for locations along streams where activities occur that can affect streamflow conditions. This functionality can be accessed through a map-based interface with the user's Web browser or through individual functions requested remotely through other Web applications.

  9. WebArray: an online platform for microarray data analysis

    PubMed Central

    Xia, Xiaoqin; McClelland, Michael; Wang, Yipeng

    2005-01-01

    Background Many cutting-edge microarray analysis tools and algorithms, including the commonly used limma and affy packages in Bioconductor, need sophisticated knowledge of mathematics, statistics and computer skills for implementation. Commercially available software can provide a user-friendly interface at considerable cost. To facilitate the use of these tools for microarray data analysis on an open platform we developed an online microarray data analysis platform, WebArray, for bench biologists to utilize these tools to explore data from single/dual color microarray experiments. Results The currently implemented functions were based on the limma and affy packages from Bioconductor, the spacings LOESS histogram (SPLOSH) method, a PCA-assisted normalization method and a genome mapping method. WebArray incorporates these packages and provides a user-friendly interface for accessing a wide range of key functions of limma and others, such as spot quality weights, background correction, graphical plotting, normalization, linear modeling, empirical Bayes statistical analysis, false discovery rate (FDR) estimation, and chromosomal mapping for genome comparison. Conclusion WebArray offers a convenient platform for bench biologists to access several cutting-edge microarray data analysis tools. The website is freely available at . It runs on a Linux server with Apache and MySQL. PMID:16371165

  10. [Evaluation of the Andalusia Public Health System hospital websites in the period 2010-2012].

    PubMed

    de la Torre Barbero, M J; Estepa Luna, M J; López-Pardo Martínez, M; León Márquez, M; Sánchez Laguna, F; Toledano Redondo, S

    2014-01-01

    This study evaluates the quality, accessibility and presence of Web 2.0 tools in the websites of Andalusia Public Health System hospitals. It was an observational, descriptive study carried out between 2010 and 2012. The variables analyzed were quality, accessibility and innovation. Quality was evaluated using a Bermudez-Tamayo questionnaire. Accessibility was measured using the Web Accessibility Test (TAW) tool. Web 2.0 tools were identified by direct observation. A total of 31 of the 45 hospitals (68.9%) had a website in the year 2010, increasing to 34 (75.5%) in 2012. The mean score ± standard deviation (SD) of the Bermudez-Tamayo quality questionnaire was 11.1±3.8 points in 2010 and 12.3±3.9 points in 2012, with a statistically significant difference of 0.25 between the means (P=.007; 95% CI, 0.00 to 0.50). In the accessibility evaluation only 7 websites (n=31) in 2010, and 10 (n=34) in 2012, fulfilled the legal criteria for accessibility. The use of Web 2.0 tools increased throughout the study: in 2010, 19.4% (n=6) of the hospital websites had this type of tool, compared with 58.8% (n=20) in 2012. In general, the quality of the websites studied is good. However, current legislation regarding accessibility is not fulfilled, and the websites must be revised and adapted to the current legal rules. There is an incipient use of Web 2.0 resources as education and communication strategies with regard to health.

  11. Bringing modeling to the masses: A web based system to predict potential species distributions

    USGS Publications Warehouse

    Graham, Jim; Newman, Greg; Kumar, Sunil; Jarnevich, Catherine S.; Young, Nick; Crall, Alycia W.; Stohlgren, Thomas J.; Evangelista, Paul

    2010-01-01

    Predicting current and potential species distributions and abundance is critical for managing invasive species, preserving threatened and endangered species, and conserving native species and habitats. Accurate predictive models are needed at local, regional, and national scales to guide field surveys, improve monitoring, and set priorities for conservation and restoration. Modeling capabilities, however, are often limited by access to software and environmental data required for predictions. To address these needs, we built a comprehensive web-based system that: (1) maintains a large database of field data; (2) provides access to field data and a wealth of environmental data; (3) accesses values in rasters representing environmental characteristics; (4) runs statistical spatial models; and (5) creates maps that predict the potential species distribution. The system is available online at www.niiss.org, and provides web-based tools for stakeholders to create potential species distribution models and maps under current and future climate scenarios.
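
    A minimal sketch of the modelling step such a system automates: environmental values sampled at occurrence and background points feed a presence model. The web system described above supports richer statistical spatial models; a plain logistic regression on simulated covariates stands in here.

```python
# Toy species-distribution model: logistic regression on simulated environmental covariates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 400
temperature = rng.uniform(0, 30, n)                 # "raster" values sampled at points
precipitation = rng.uniform(200, 2000, n)

# simulate presence/absence with higher suitability in warm, wet conditions
logit = -6 + 0.25 * temperature + 0.002 * precipitation
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([temperature, precipitation])
model = LogisticRegression().fit(X, presence)

print(model.predict_proba([[22.0, 1500.0]])[0, 1])  # predicted suitability at a new location
```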

  12. Web services in the U.S. geological survey streamstats web application

    USGS Publications Warehouse

    Guthrie, J.D.; Dartiguenave, C.; Ries, Kernell G.

    2009-01-01

    StreamStats is a U.S. Geological Survey Web-based GIS application developed as a tool for water-resources planning and management, engineering design, and other applications. StreamStats' primary functionality allows users to obtain drainage-basin boundaries, basin characteristics, and streamflow statistics for gaged and ungaged sites. Recently, Web services have been developed that give remote users and applications access to the comprehensive GIS tools that are available in StreamStats, including delineating drainage-basin boundaries, computing basin characteristics, estimating streamflow statistics for user-selected locations, and determining point features that coincide with a National Hydrography Dataset (NHD) reach address. For the state of Kentucky, a web service also has been developed that provides users the ability to estimate daily time series of drainage-basin average values of daily precipitation and temperature. The use of web services allows the user to take full advantage of the datasets and processes behind the StreamStats application without having to develop and maintain them.
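
    A hedged sketch of calling a StreamStats-style web service for basin delineation from a script follows. The endpoint, parameter names, and response fields shown are illustrative assumptions that postdate this paper; consult the current USGS StreamStats services documentation before use.

```python
# Assumed StreamStats-style watershed delineation request (endpoint and parameters illustrative).
import requests

BASE_URL = "https://streamstats.usgs.gov/streamstatsservices/watershed.geojson"  # assumed
params = {
    "rcode": "NC",             # state/region code
    "xlocation": -82.55,       # longitude of the pour point
    "ylocation": 35.59,        # latitude of the pour point
    "crs": 4326,               # coordinate reference system of the input point
    "includeparameters": "true",
}

resp = requests.get(BASE_URL, params=params, timeout=60)
resp.raise_for_status()
watershed = resp.json()
print(list(watershed.keys()))  # e.g. workspace ID, basin geometry, basin characteristics
```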

  13. NOBAI: a web server for character coding of geometrical and statistical features in RNA structure

    PubMed Central

    Knudsen, Vegeir; Caetano-Anollés, Gustavo

    2008-01-01

    The Numeration of Objects in Biology: Alignment Inferences (NOBAI) web server provides a web interface to the applications in the NOBAI software package. This software codes topological and thermodynamic information related to the secondary structure of RNA molecules as multi-state phylogenetic characters, builds character matrices directly in NEXUS format and provides sequence randomization options. The web server is an effective tool that facilitates the search for evolutionary history embedded in the structure of functional RNA molecules. The NOBAI web server is accessible at ‘http://www.manet.uiuc.edu/nobai/nobai.php’. This web site is free and open to all users and there is no login requirement. PMID:18448469

  14. P-MartCancer–Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webb-Robertson, Bobbie-Jo M.; Bramer, Lisa M.; Jensen, Jeffrey L.

    P-MartCancer is a new interactive web-based software environment that enables biomedical and biological scientists to perform in-depth analyses of global proteomics data without requiring direct interaction with the data or with statistical software. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access to multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium (CPTAC) at the peptide, gene and protein levels. P-MartCancer is deployed using Azure technologies (http://pmart.labworks.org/cptac.html), the web service is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/), and many statistical functions can be utilized directly from an R package available on GitHub (https://github.com/pmartR).

  15. ModelTest Server: a web-based tool for the statistical selection of models of nucleotide substitution online

    PubMed Central

    Posada, David

    2006-01-01

    ModelTest server is a web-based application for the selection of models of nucleotide substitution using the program ModelTest. The server takes as input a text file with likelihood scores for the set of candidate models. Models can be selected with hierarchical likelihood ratio tests, or with the Akaike or Bayesian information criteria. The output includes several statistics for the assessment of model selection uncertainty, for model averaging or to estimate the relative importance of model parameters. The server can be accessed at . PMID:16845102
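
    The server ranks candidate substitution models from their likelihood scores; the sketch below shows the AIC part of such a ranking, with made-up log-likelihoods and parameter counts rather than output from the actual ModelTest server.

```python
# AIC ranking and Akaike weights for a few candidate substitution models (toy values).
import math

candidates = {                      # log-likelihood and free-parameter count per model
    "JC69":  {"lnL": -3105.2, "k": 1},
    "HKY85": {"lnL": -3052.7, "k": 5},
    "GTR+G": {"lnL": -3049.9, "k": 9},
}

aic = {m: 2 * v["k"] - 2 * v["lnL"] for m, v in candidates.items()}
best = min(aic, key=aic.get)

deltas = {m: a - aic[best] for m, a in aic.items()}                 # AIC differences
denom = sum(math.exp(-d / 2) for d in deltas.values())
weights = {m: math.exp(-d / 2) / denom for m, d in deltas.items()}  # Akaike weights

print(f"best model by AIC: {best}")
print({m: round(w, 3) for m, w in weights.items()})
```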

  16. Development and Use of an Adaptive Learning Environment to Research Online Study Behaviour

    ERIC Educational Resources Information Center

    Jonsdottir, Anna Helga; Jakobsdottir, Audbjorg; Stefansson, Gunnar

    2015-01-01

    This paper describes a system for research on the behaviour of students taking online drills. The system is accessible and free to use for anyone with web access. Based on open source software, the teaching material is licensed under a Creative Commons License. The system has been used for computer-assisted education in statistics, mathematics and…

  17. Web site access statistics and delivery of research results

    Treesearch

    Daniel L. Schmoldt; Matthew F. Winn; Philip A. Araman

    1998-01-01

    For the past 2-1/2 years, our Research Work Unit (RWU) has been operating a Web site (http://www.se4702.forprod.vt.edu). The site became operational in October 1995. In the beginning, it was primarily designed to stake out a piece of the Internet with our name and our RWU’s organizational mission. Quickly, it became apparent to us, however, that our user community...

  18. Macroscopic characterisations of Web accessibility

    NASA Astrophysics Data System (ADS)

    Lopes, Rui; Carriço, Luis

    2010-12-01

    The Web Science framework poses fundamental questions on the analysis of the Web, by focusing on how microscopic properties (e.g. at the level of a Web page or Web site) emerge into macroscopic properties and phenomena. One research topic on the analysis of the Web is Web accessibility evaluation, which centres on understanding how accessible a Web page is for people with disabilities. However, when framing Web accessibility evaluation on Web Science, we have found that existing research stays at the microscopic level. This article presents an experimental study on framing Web accessibility evaluation into Web Science's goals. This study resulted in novel accessibility properties of the Web not found at microscopic levels, as well as of Web accessibility evaluation processes themselves. We observed at large scale some of the empirical knowledge on how accessibility is perceived by designers and developers, such as the disparity of interpretations of accessibility evaluation tools warnings. We also found a direct relation between accessibility quality and Web page complexity. We provide a set of guidelines for designing Web pages, education on Web accessibility, as well as on the computational limits of large-scale Web accessibility evaluations.

  19. FERMI/GLAST Integrated Trending and Plotting System Release 5.0

    NASA Technical Reports Server (NTRS)

    Ritter, Sheila; Brumer, Haim; Reitan, Denise

    2012-01-01

    An Integrated Trending and Plotting System (ITPS) is a trending, analysis, and plotting system used by space missions to determine the performance and status of a spacecraft and its instruments. ITPS supports several NASA mission operational control centers providing engineers, ground controllers, and scientists with access to the entire spacecraft telemetry data archive for the life of the mission, and includes a secure Web component for remote access. FERMI/GLAST ITPS Release 5.0 features include the option to display dates (yyyy/ddd) instead of orbit numbers along the orbital Long-Term Trend (LTT) plot axis, the ability to save statistics from daily production plots as image files, and removal of redundant edit/create Input Definition File (IDF) screens. Other features are a fix to address invalid packet lengths, a change in the naming convention of image files so that they can be used in scripts, the ability to save all ITPS plot images (from Windows or the Web) as GIF or PNG format, the ability to specify ymin and ymax on plots where previously only the desired range could be specified, Web interface capability to plot IDFs that contain out-of-order page and plot numbers, and a fix to change all default file names to show yyyydddhhmmss time stamps instead of hhmmssdddyyyy. A Web interface capability sorts files based on modification date (with the newest one at top), and the statistics block can be displayed via a Web interface. Via the Web, users can graphically view the volume of telemetry data from each day contained in the ITPS archive in the Web digest. The ITPS could also be used in non-space fields that need to plot data or trend data, including financial and banking systems, aviation and transportation systems, healthcare and educational systems, sales and marketing, and housing and construction.

  20. SIFTER search: a web server for accurate phylogeny-based protein function prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sahraeian, Sayed M.; Luo, Kevin R.; Brenner, Steven E.

    We are awash in proteins discovered through high-throughput sequencing projects. As only a minuscule fraction of these have been experimentally characterized, computational methods are widely used for automated annotation. Here, we introduce a user-friendly web interface for accurate protein function prediction using the SIFTER algorithm. SIFTER is a state-of-the-art sequence-based gene molecular function prediction algorithm that uses a statistical model of function evolution to incorporate annotations throughout the phylogenetic tree. Due to the resources needed by the SIFTER algorithm, running SIFTER locally is not trivial for most users, especially for large-scale problems. The SIFTER web server thus provides access to precomputed predictions on 16 863 537 proteins from 232 403 species. Users can explore SIFTER predictions with queries for proteins, species, functions, and homologs of sequences not in the precomputed prediction set. Lastly, the SIFTER web server is accessible at http://sifter.berkeley.edu/ and the source code can be downloaded.

  1. SIFTER search: a web server for accurate phylogeny-based protein function prediction

    DOE PAGES

    Sahraeian, Sayed M.; Luo, Kevin R.; Brenner, Steven E.

    2015-05-15

    We are awash in proteins discovered through high-throughput sequencing projects. As only a minuscule fraction of these have been experimentally characterized, computational methods are widely used for automated annotation. Here, we introduce a user-friendly web interface for accurate protein function prediction using the SIFTER algorithm. SIFTER is a state-of-the-art sequence-based gene molecular function prediction algorithm that uses a statistical model of function evolution to incorporate annotations throughout the phylogenetic tree. Due to the resources needed by the SIFTER algorithm, running SIFTER locally is not trivial for most users, especially for large-scale problems. The SIFTER web server thus provides access to precomputed predictions on 16 863 537 proteins from 232 403 species. Users can explore SIFTER predictions with queries for proteins, species, functions, and homologs of sequences not in the precomputed prediction set. Lastly, the SIFTER web server is accessible at http://sifter.berkeley.edu/ and the source code can be downloaded.

  2. SPSmart: adapting population based SNP genotype databases for fast and comprehensive web access.

    PubMed

    Amigo, Jorge; Salas, Antonio; Phillips, Christopher; Carracedo, Angel

    2008-10-10

    In the last five years large online resources of human variability have appeared, notably HapMap, Perlegen and the CEPH foundation. These databases of genotypes with population information act as catalogues of human diversity, and are widely used as reference sources for population genetics studies. Although many useful conclusions may be extracted by querying databases individually, the lack of flexibility for combining data from within and between each database does not allow the calculation of key population variability statistics. We have developed a novel tool for accessing and combining large-scale genomic databases of single nucleotide polymorphisms (SNPs) in widespread use in human population genetics: SPSmart (SNPs for Population Studies). A fast pipeline creates and maintains a data mart from the most commonly accessed databases of genotypes containing population information: data is mined, summarized into the standard statistical reference indices, and stored into a relational database that currently handles as many as 4 x 10^9 genotypes and that can be easily extended to new database initiatives. We have also built a web interface to the data mart that allows the browsing of underlying data indexed by population and the combining of populations, allowing intuitive and straightforward comparison of population groups. All the information served is optimized for web display, and most of the computations are already pre-processed in the data mart to speed up the data browsing and any computational treatment requested. In practice, SPSmart allows populations to be combined into user-defined groups, while multiple databases can be accessed and compared in a few simple steps from a single query. It performs the queries rapidly and gives straightforward graphical summaries of SNP population variability through visual inspection of allele frequencies outlined in standard pie-chart format. In addition, full numerical description of the data is output in statistical results panels that include common population genetics metrics such as heterozygosity, Fst and In.
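
    The summary statistics mentioned above can be illustrated for a single biallelic SNP: allele frequencies, expected heterozygosity, and a simple two-population Fst in Wright's Hs/Ht form. The genotype counts below are invented and SPSmart's exact estimators may differ.

```python
# Toy per-SNP summaries: allele frequency, expected heterozygosity, and a simple Fst.
def allele_freq(genotypes: dict[str, int]) -> float:
    """Frequency of allele A from genotype counts {AA, Aa, aa}."""
    n = sum(genotypes.values())
    return (2 * genotypes["AA"] + genotypes["Aa"]) / (2 * n)

def expected_het(p: float) -> float:
    """Expected heterozygosity for a biallelic locus under Hardy-Weinberg equilibrium."""
    return 2 * p * (1 - p)

pop1 = {"AA": 120, "Aa": 60, "aa": 20}   # toy genotype counts, population 1
pop2 = {"AA": 30,  "Aa": 90, "aa": 80}   # toy genotype counts, population 2

p1, p2 = allele_freq(pop1), allele_freq(pop2)
hs = (expected_het(p1) + expected_het(p2)) / 2   # mean within-population heterozygosity
ht = expected_het((p1 + p2) / 2)                 # heterozygosity at the pooled frequency
fst = (ht - hs) / ht

print(f"p1={p1:.3f}, p2={p2:.3f}, Fst={fst:.3f}")
```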

  3. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui

    PubMed Central

    2012-01-01

    Background The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. Methods This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Results Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. Conclusions This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications. PMID:22998945

  4. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui.

    PubMed

    Newton, Richard; Deonarine, Andrew; Wernisch, Lorenz

    2012-09-24

    The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications.

  5. Data Access and Web Services at the EarthScope Plate Boundary Observatory

    NASA Astrophysics Data System (ADS)

    Matykiewicz, J.; Anderson, G.; Henderson, D.; Hodgkinson, K.; Hoyt, B.; Lee, E.; Persson, E.; Torrez, D.; Smith, J.; Wright, J.; Jackson, M.

    2007-12-01

    The EarthScope Plate Boundary Observatory (PBO) at UNAVCO, Inc., part of the NSF-funded EarthScope project, is designed to study the three-dimensional strain field resulting from deformation across the active boundary zone between the Pacific and North American plates in the western United States. To meet these goals, PBO will install 880 continuous GPS stations, 103 borehole strainmeter stations, and five laser strainmeters, as well as manage data for 209 previously existing continuous GPS stations and one previously existing laser strainmeter. UNAVCO provides access to data products from these stations, as well as general information about the PBO project, via the PBO web site (http://pboweb.unavco.org). GPS and strainmeter data products can be found using a variety of access methods, including map searches, text searches, and station specific data retrieval. In addition, the PBO construction status is available via multiple mapping interfaces, including custom web based map widgets and Google Earth. Additional construction details can be accessed from PBO operational pages and station specific home pages. The current state of health for the PBO network is available with the statistical snap-shot, full map interfaces, tabular web based reports, and automatic data mining and alerts. UNAVCO is currently working to enhance the community access to this information by developing a web service framework for the discovery of data products, interfacing with operational engineers, and exposing data services to third party participants. In addition, UNAVCO, through the PBO project, provides advanced data management and monitoring systems for use by the community in operating geodetic networks in the United States and beyond. We will demonstrate these systems during the AGU meeting, and we welcome inquiries from the community at any time.
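
    As a hedged illustration of the kind of scripted data access such web services enable, the sketch below issues a plain HTTP request for a station data product; the URL, file layout and station code are assumptions made for illustration only, not the actual PBO/UNAVCO service interface.

        import urllib.request

        # Hypothetical endpoint and station code; the record does not document the
        # actual service URLs, so this only shows the general pattern of scripted access.
        STATION = "P123"
        url = f"https://example.unavco.org/data/gps/{STATION}/position.csv"

        with urllib.request.urlopen(url) as response:
            csv_text = response.read().decode("utf-8")

        # Assume each non-comment line holds "date,north_mm,east_mm,up_mm"; count the daily solutions.
        rows = [line for line in csv_text.splitlines() if line and not line.startswith("#")]
        print(f"{STATION}: {len(rows)} daily position solutions retrieved")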

  6. DATAFERRETT AND DATAWEB

    EPA Science Inventory

    DataFerrett is data extraction software and a data mining tool that accesses data stored in TheDataWeb through the Internet. It can be installed as an application on your desktop or used as a Java applet within an Internet browser. Census Bureau and Bureau of Labor Statistics release...

  7. Use of The Math You Need When You Need It website outside of introductory geoscience courses

    NASA Astrophysics Data System (ADS)

    Baer, E. M.; Wenner, J. M.

    2011-12-01

    Web usage statistics and a recent survey of visitors to The Math You Need, When You Need It (TMYN) suggest that these web resources serve a significant number of students beyond those for whom they were originally intended. The web-based modules of TMYN are asynchronous online resources designed to help undergraduates learn quantitative concepts essential in a concurrent introductory geoscience course. In the past year, approximately 1,000 students accessed TMYN through associated geoscience courses; however, in that same time period, more than 40 times that number interacted significantly with the site according to Google Analytics. Of the nearly 220,000 total visitors, ~15% stayed on the site for longer than one minute and ~20% visited two or more pages within the site, suggesting that the content is engaging and useful to many of the visitors. In a pop-up survey of users, 81% of the nearly 350 respondents reported that they found what they were looking for. Although the nature of TMYN website users is difficult to discern definitively, daily, weekly and monthly use patterns indicate a predominance of academic users. Access to the site is lowest during the summer months and on Friday and Saturday, and is elevated on Sunday through Thursday. Furthermore, in a pop-up survey of users who accessed more than one page, greater than half (56%) of the 346 respondents were students, 20% collegiate faculty and 9% K-12 teachers. Although the resources are specifically designed for geoscience students, 61% of survey respondents identified themselves as associated with other STEM disciplines. Thus, despite the decidedly geoscientific slant to these resources, survey data suggest that many STEM students and teachers are searching for the kinds of topics covered by TMYN. Furthermore, web use statistics indicate a substantial need for high quality web-based quantitative skill support materials for all STEM disciplines.
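
    The engagement figures quoted above translate into simple counts; the following sketch, using only the rounded numbers from the abstract, makes the arithmetic explicit.

        # Rounded figures taken from the abstract above.
        total_visitors = 220_000
        frac_over_one_minute = 0.15   # stayed longer than one minute
        frac_multi_page = 0.20        # visited two or more pages
        enrolled_students = 1_000     # students who reached TMYN via an associated course

        engaged_by_time = total_visitors * frac_over_one_minute
        engaged_by_depth = total_visitors * frac_multi_page

        print(f"Visitors staying > 1 minute: ~{engaged_by_time:,.0f}")
        print(f"Visitors viewing 2+ pages:  ~{engaged_by_depth:,.0f}")
        print(f"Total visitors per enrolled student: ~{total_visitors / enrolled_students:.0f}")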

  8. 3Drefine: an interactive web server for efficient protein structure refinement

    PubMed Central

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-01-01

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of hydrogen bonding network combined with atomic-level energy minimization on the optimized model using a composite physics and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. PMID:27131371

  9. Tools for Interdisciplinary Data Assimilation and Sharing in Support of Hydrologic Science

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Walker, J.; Suftin, I.; Warren, M.; Kunicki, T.

    2013-12-01

    Information consumed and produced in hydrologic analyses is interdisciplinary and massive. These factors put a heavy information management burden on the hydrologic science community. The U.S. Geological Survey (USGS) Office of Water Information Center for Integrated Data Analytics (CIDA) seeks to assist hydrologic science investigators with all components of their scientific data management life cycle. Ongoing data publication and software development projects will be presented demonstrating publicly available data access services and manipulation tools being developed with support from two Department of the Interior initiatives. The USGS-led National Water Census seeks to provide both data and tools in support of nationally consistent water availability estimates. Newly available data include national coverages of radar-indicated precipitation, actual evapotranspiration, water use estimates aggregated by county, and South East region estimates of streamflow for 12-digit hydrologic unit code watersheds. Web services making these data available and applications to access them will be demonstrated. Web-available processing services able to provide numerous streamflow statistics for any USGS daily flow record or model result time series and other National Water Census processing tools will also be demonstrated. The National Climate Change and Wildlife Science Center is a USGS center leading DOI-funded academic global change adaptation research. It has a mission goal to ensure data used and produced by funded projects is available via web services and tools that streamline data management tasks in interdisciplinary science. For example, collections of downscaled climate projections, typically large collections of files that must be downloaded to be accessed, are being published using web services that allow access to the entire dataset via simple web-service requests and numerous processing tools. Recent progress on this front includes data web services for Coupled Model Intercomparison Project Phase 5 (CMIP5)-based downscaled climate projections, EPA's Integrated Climate and Land Use Scenarios projections of population and land cover metrics, and MODIS-derived land cover parameters from NASA's Land Processes Distributed Active Archive Center. These new services and ways to discover others will be presented through a demonstration of a recently open-sourced project, from a web application or scripted workflow. Development and public deployment of server-based processing tools to subset and summarize these and other data is ongoing at the CIDA with partner groups such as 52 Degrees North and Unidata. The latest progress on subsetting, spatial summarization to areas of interest, and temporal summarization via common statistical methods will be presented.

  10. A web-based application for initial screening of living kidney donors: development, implementation and evaluation.

    PubMed

    Moore, D R; Feurer, I D; Zavala, E Y; Shaffer, D; Karp, S; Hoy, H; Moore, D E

    2013-02-01

    Most centers utilize phone or written surveys to screen candidates who self-refer to be living kidney donors. To increase efficiency and reduce resource utilization, we developed a web-based application to screen kidney donor candidates. The aim of this study was to evaluate the use of this web-based application. Method and time of referral were tabulated and descriptive statistics summarized demographic characteristics. Time series analyses evaluated use over time. Between January 1, 2011 and March 31, 2012, 1200 candidates self-referred to be living kidney donors at our center. Eight hundred one candidates (67%) completed the web-based survey and 399 (33%) completed a phone survey. Thirty-nine percent of donors accessed the application on nights and weekends. Post-implementation of the web-based application, there was a statistically significant increase (p < 0.001) in the number of self-referrals via the web-based application as opposed to telephone contact. Also, there was a significant increase (p = 0.025) in the total number of self-referrals post-implementation, from 61 to 116 per month. An interactive web-based application is an effective strategy for the initial screening of donor candidates. The web-based application increased the ability to interface with donors, process them efficiently and ultimately increased donor self-referral at our center. © Copyright 2012 The American Society of Transplantation and the American Society of Transplant Surgeons.
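
    The reported rise in monthly self-referrals (from 61 to 116 per month) illustrates the kind of pre/post comparison involved; the sketch below shows one simple way such monthly counts could be compared with a t-test. The individual monthly values are invented for illustration, and this is not the study's actual time series analysis.

        from scipy import stats

        # Hypothetical monthly self-referral counts; only the pre/post means (~61 vs ~116
        # per month) are taken from the abstract, the individual values are illustrative.
        pre_implementation = [55, 63, 58, 66, 62, 60]
        post_implementation = [98, 110, 121, 125, 119, 123]

        t_stat, p_value = stats.ttest_ind(pre_implementation, post_implementation)
        print(f"mean pre  = {sum(pre_implementation) / len(pre_implementation):.1f}")
        print(f"mean post = {sum(post_implementation) / len(post_implementation):.1f}")
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}")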

  11. National Assessment of Educational Progress.

    ERIC Educational Resources Information Center

    National Center for Education Statistics (ED), Washington, DC.

    The National Center for Education Statistics (NCES) recently unveiled a new Web site about the National Assessment of Educational Progress (NAEP), the "Nation's Report Card." This site (http://nces.ed.gov/nationsreportcard) provides easy access to a wealth of assessment information about the condition of education in the United States,…

  12. Inordinate Fondness: The Feds and the Internet.

    ERIC Educational Resources Information Center

    Morehead, Joe

    1997-01-01

    Examines the move to make U. S. government information available solely in an electronic format. Discusses inability of general purpose search engines to access the information; shift of cost to the consumer; the online version of the "Monthly Catalog of United States Government Publications"; federal statistics; Agency Web sites; and a…

  13. Integrated Medical Information Technology System (IMITS): Information and Clinical Technologies for the Advancement of Healthcare

    DTIC Science & Technology

    2010-08-31

    Teleaudiology
      o FY08: Remote access of cochlear implants
    Teleaudiology DIACAP / FDA certification
      o FY08: Teleaudiology DIACAP and FDA certification to conduct...remote access, monitor, and adjust cochlear implants
    ECMO
      o FY05: Extra Corporeal Membrane Oxygenation (ECMO)
      o FY07: Pacific Rim ECMO/VAD...
    These dashboards were developed for use by appointed AFMS radiologists to monitor the flow and statistics of teleradiology. The dashboards are web

  14. 3DNOW: Image-Based 3d Reconstruction and Modeling via Web

    NASA Astrophysics Data System (ADS)

    Tefera, Y.; Poiesi, F.; Morabito, D.; Remondino, F.; Nocerino, E.; Chippendale, P.

    2018-05-01

    This paper presents a web-based 3D imaging pipeline, namely 3Dnow, that can be used by anyone without the need of installing additional software other than a browser. By uploading a set of images through the web interface, 3Dnow can generate sparse and dense point clouds as well as mesh models. 3D reconstructed models can be downloaded with standard formats or previewed directly on the web browser through an embedded visualisation interface. In addition to reconstructing objects, 3Dnow offers the possibility to evaluate and georeference point clouds. Reconstruction statistics, such as minimum, maximum and average intersection angles, point redundancy and density can also be accessed. The paper describes all features available in the web service and provides an analysis of the computational performance using servers with different GPU configurations.
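
    The reconstruction statistics mentioned (minimum, maximum and average intersection angles, point density) are simple geometric quantities; as a sketch, the intersection angle for a point seen from two camera centres can be computed as below, with coordinates invented for illustration.

        import numpy as np

        def intersection_angle(camera_1, camera_2, point):
            """Angle (degrees) between the two viewing rays from the camera centres to a 3D point."""
            ray_1 = point - camera_1
            ray_2 = point - camera_2
            cos_angle = np.dot(ray_1, ray_2) / (np.linalg.norm(ray_1) * np.linalg.norm(ray_2))
            return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

        # Hypothetical camera centres and one reconstructed point (metres).
        cam_a = np.array([0.0, 0.0, 0.0])
        cam_b = np.array([1.0, 0.0, 0.0])
        pt = np.array([0.5, 0.0, 2.0])
        print(f"intersection angle: {intersection_angle(cam_a, cam_b, pt):.1f} degrees")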

  15. Federal Agency and Federal Library Reports: Library of Congress; Center for the Book; Federal Library and Information Center Committee; National Agricultural Library; National Library of Medicine; United States Government Printing Office; National Technical Information Service; National Archives and Records Administration; National Center for Education Statistics Library Statistics Program; National Commission on Libraries and Information Science; National Library of Education; Educational Resources Information Center.

    ERIC Educational Resources Information Center

    Fischer, Audrey; Cole, John Y.; Tarr, Susan M.; Carey, Len; Mehnert, Robert; Sherman, Andrew M.; Davis, Linda; Leahy, Debra W.; Chute, Adrienne; Willard, Robert S.; Dunn, Christina

    2003-01-01

    Includes annual reports from 12 federal agencies and libraries that discuss security, budgets, legislation, digital projects, preservation, government role, information management, personnel changes, collections, databases, financial issues, services, administration, Web sites, access to information, customer service, statistics, international…

  16. Publicly available hospital comparison web sites: determination of useful, valid, and appropriate information for comparing surgical quality.

    PubMed

    Leonardi, Michael J; McGory, Marcia L; Ko, Clifford Y

    2007-09-01

    To explore hospital comparison Web sites for general surgery based on: (1) a systematic Internet search, (2) Web site quality evaluation, and (3) exploration of possible areas of improvement. A systematic Internet search was performed to identify hospital quality comparison Web sites in September 2006. Publicly available Web sites were rated on accessibility, data/statistical transparency, appropriateness, and timeliness. A sample search was performed to determine ranking consistency. Six national hospital comparison Web sites were identified: 1 government (Hospital Compare [Centers for Medicare and Medicaid Services]), 2 nonprofit (Quality Check [Joint Commission on Accreditation of Healthcare Organizations] and Hospital Quality and Safety Survey Results [Leapfrog Group]), and 3 proprietary sites (names withheld). For accessibility and data transparency, the government and nonprofit Web sites were best. For appropriateness, the proprietary Web sites were best, comparing multiple surgical procedures using a combination of process, structure, and outcome measures. However, none of these sites explicitly defined terms such as complications. Two proprietary sites allowed patients to choose ranking criteria. Most data on these sites were 2 years old or older. A sample search of 3 surgical procedures at 4 hospitals demonstrated significant inconsistencies. Patients undergoing surgery are increasingly using the Internet to compare hospital quality. However, a review of available hospital comparison Web sites shows suboptimal measures of quality and inconsistent results. This may be partially because of a lack of complete and timely data. Surgeons should be involved with quality comparison Web sites to ensure appropriate methods and criteria.

  17. A quality evaluation methodology of health web-pages for non-professionals.

    PubMed

    Currò, Vincenzo; Buonuomo, Paola Sabrina; Onesimo, Roberta; de Rose, Paola; Vituzzi, Andrea; di Tanna, Gian Luca; D'Atri, Alessandro

    2004-06-01

    The proposal of an evaluation methodology for determining the quality of healthcare web sites for the dissemination of medical information to non-professionals. Three (macro) factors are considered for the quality evaluation: medical contents, accountability of the authors, and usability of the web site. Starting from two results in the literature, the problem of whether or not to introduce a weighting function has been investigated. This methodology has been validated on a specialized information content, i.e., sore throats, due to the large interest such a topic enjoys with target users. The World Wide Web was accessed using a meta-search system merging several search engines. A statistical analysis was made to compare the proposed methodology with the obtained ranks of the sample web pages. The statistical analysis confirms that the variables examined (per item and sub factor) show substantially similar ranks and are capable of contributing to the evaluation of the main quality macro factors. A comparison between the aggregation functions in the proposed methodology (non-weighted averages) and the weighting functions, derived from the literature, allowed us to verify the suitability of the method. The proposed methodology suggests a simple approach which can quickly award an overall quality score for medical web sites oriented to non-professionals.
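
    The comparison between non-weighted averages and weighting functions amounts to two ways of aggregating item scores into a macro-factor score; a minimal sketch follows, with scores and weights invented for illustration.

        def unweighted(scores):
            """Plain average, as in the proposed methodology (non-weighted averages)."""
            return sum(scores) / len(scores)

        def weighted(scores, weights):
            """Weighted average, as in the weighting functions taken from the literature."""
            return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

        # Hypothetical per-item scores for one macro factor (e.g. medical contents), on a 0-1 scale.
        item_scores = [0.8, 0.6, 0.9, 0.5]
        item_weights = [2.0, 1.0, 1.5, 1.0]   # illustrative importance weights

        print(f"unweighted macro-factor score: {unweighted(item_scores):.2f}")
        print(f"weighted macro-factor score:   {weighted(item_scores, item_weights):.2f}")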

  18. Implementing Recommendations From Web Accessibility Guidelines: Would They Also Provide Benefits to Nondisabled Users.

    PubMed

    Schmutz, Sven; Sonderegger, Andreas; Sauer, Juergen

    2016-06-01

    We examined the consequences of implementing Web accessibility guidelines for nondisabled users. Although there are Web accessibility guidelines for people with disabilities available, they are rarely used in practice, partly due to the fact that practitioners believe that such guidelines provide no benefits, or even have negative consequences, for nondisabled people, who represent the main user group of Web sites. Despite these concerns, there is a lack of empirical research on the effects of current Web accessibility guidelines on nondisabled users. Sixty-one nondisabled participants used one of three Web sites differing in levels of accessibility (high, low, and very low). Accessibility levels were determined by following established Web accessibility guidelines (WCAG 2.0). A broad methodological approach was used, including performance measures (e.g., task completion time) and user ratings (e.g., perceived usability). A high level of Web accessibility led to better performance (i.e., task completion time and task completion rate) than low or very low accessibility. Likewise, high Web accessibility improved user ratings (i.e., perceived usability, aesthetics, workload, and trustworthiness) compared to low or very low Web accessibility. There was no difference between the very low and low Web accessibility conditions for any of the outcome measures. Contrary to some concerns in the literature and among practitioners, high conformance with Web accessibility guidelines may provide benefits to users without disabilities. The findings may encourage more practitioners to implement WCAG 2.0 for the benefit of users with disabilities and nondisabled users. © 2016, Human Factors and Ergonomics Society.

  19. A National Crop Progress Monitoring System Based on NASA Earth Science Results

    NASA Astrophysics Data System (ADS)

    Di, L.; Yu, G.; Zhang, B.; Deng, M.; Yang, Z.

    2011-12-01

    Crop progress is an important piece of information for food security and agricultural commodities. Timely monitoring and reporting are mandated for the operation of agricultural statistical agencies. Traditionally, the weekly reporting issued by the National Agricultural Statistics Service (NASS) of the United States Department of Agriculture (USDA) is based on reports from knowledgeable state and county agricultural officials and farmers. The results are spatially coarse and subjective. In this project, a remote-sensing-supported crop progress monitoring system is being developed intensively using the data and derived products from NASA Earth Observing satellites. The Moderate Resolution Imaging Spectroradiometer (MODIS) Level 3 product MOD09 (Surface Reflectance) is used for deriving the daily normalized difference vegetation index (NDVI), vegetation condition index (VCI), and mean vegetation condition index (MVCI). Ratio change to the previous year and to the multiple year mean can also be produced on demand. The time-series vegetation condition indices are further combined with the NASS remote-sensing-derived Cropland Data Layer (CDL) to estimate crop condition and progress crop by crop. To facilitate the operational requirement and increase the accessibility of data and products by different users, each component of the system has been developed and implemented following open specifications under the Web Service reference model of Open Geospatial Consortium Inc. Sensor observations and data are accessed through Web Coverage Service (WCS), Web Feature Service (WFS), or Sensor Observation Service (SOS) if available. Products are also served through such open-specification-compliant services. For rendering and presentation, Web Map Service (WMS) is used. A Web-service based system is set up and deployed at dss.csiss.gmu.edu/NDVIDownload. Further development will adopt crop growth models, feed the models with remotely sensed precipitation and soil moisture information, and incorporate the model results with vegetation-index time series for crop progress stage estimation.
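
    The vegetation indices named above have standard definitions, e.g. NDVI = (NIR - Red) / (NIR + Red), with VCI rescaling NDVI between its multi-year extremes for the same period; the sketch below illustrates these formulas on toy arrays and is not the operational MODIS processing chain.

        import numpy as np

        def ndvi(nir, red):
            """Normalized difference vegetation index from surface reflectance bands."""
            return (nir - red) / (nir + red)

        def vci(ndvi_now, ndvi_min, ndvi_max):
            """Vegetation condition index: NDVI rescaled to 0-100 between the
            multi-year minimum and maximum for the same date/pixel."""
            return 100.0 * (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min)

        # Tiny illustrative arrays standing in for MOD09 reflectance values.
        red = np.array([0.08, 0.12, 0.10])
        nir = np.array([0.40, 0.30, 0.45])
        ndvi_now = ndvi(nir, red)

        # Hypothetical historical NDVI extremes for the same pixels and calendar date.
        ndvi_min, ndvi_max = np.array([0.1, 0.1, 0.2]), np.array([0.8, 0.7, 0.9])
        print("NDVI:", np.round(ndvi_now, 3))
        print("VCI:", np.round(vci(ndvi_now, ndvi_min, ndvi_max), 1))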

  20. 3Drefine: an interactive web server for efficient protein structure refinement.

    PubMed

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-07-08

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of hydrogen bonding network combined with atomic-level energy minimization on the optimized model using a composite physics and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. The International Mouse Phenotyping Consortium Web Portal, a unified point of access for knockout mice and related phenotyping data

    PubMed Central

    Koscielny, Gautier; Yaikhom, Gagarine; Iyer, Vivek; Meehan, Terrence F.; Morgan, Hugh; Atienza-Herrero, Julian; Blake, Andrew; Chen, Chao-Kung; Easty, Richard; Di Fenza, Armida; Fiegel, Tanja; Grifiths, Mark; Horne, Alan; Karp, Natasha A.; Kurbatova, Natalja; Mason, Jeremy C.; Matthews, Peter; Oakley, Darren J.; Qazi, Asfand; Regnart, Jack; Retha, Ahmad; Santos, Luis A.; Sneddon, Duncan J.; Warren, Jonathan; Westerberg, Henrik; Wilson, Robert J.; Melvin, David G.; Smedley, Damian; Brown, Steve D. M.; Flicek, Paul; Skarnes, William C.; Mallon, Ann-Marie; Parkinson, Helen

    2014-01-01

    The International Mouse Phenotyping Consortium (IMPC) web portal (http://www.mousephenotype.org) provides the biomedical community with a unified point of access to mutant mice and a rich collection of related emerging and existing mouse phenotype data. IMPC mouse clinics worldwide follow rigorous, highly structured and standardized protocols for the experimentation, collection and dissemination of data. Dedicated ‘data wranglers’ work with each phenotyping center to collate data and perform quality control of data. An automated statistical analysis pipeline has been developed to identify knockout strains with a significant change in the phenotype parameters. Annotation with biomedical ontologies allows biologists and clinicians to easily find mouse strains with phenotypic traits relevant to their research. Data integration with other resources will provide insights into mammalian gene function and human disease. As phenotype data become available for every gene in the mouse, the IMPC web portal will become an invaluable tool for researchers studying the genetic contributions of genes to human diseases. PMID:24194600

  2. WormQTL—public archive and analysis web portal for natural variation data in Caenorhabditis spp

    PubMed Central

    Snoek, L. Basten; Van der Velde, K. Joeri; Arends, Danny; Li, Yang; Beyer, Antje; Elvin, Mark; Fisher, Jasmin; Hajnal, Alex; Hengartner, Michael O.; Poulin, Gino B.; Rodriguez, Miriam; Schmid, Tobias; Schrimpf, Sabine; Xue, Feng; Jansen, Ritsert C.; Kammenga, Jan E.; Swertz, Morris A.

    2013-01-01

    Here, we present WormQTL (http://www.wormqtl.org), an easily accessible database enabling search, comparative analysis and meta-analysis of all data on variation in Caenorhabditis spp. Over the past decade, Caenorhabditis elegans has become instrumental for molecular quantitative genetics and the systems biology of natural variation. These efforts have resulted in a valuable amount of phenotypic, high-throughput molecular and genotypic data across different developmental worm stages and environments in hundreds of C. elegans strains. WormQTL provides a workbench of analysis tools for genotype–phenotype linkage and association mapping based on but not limited to R/qtl (http://www.rqtl.org). All data can be uploaded and downloaded using simple delimited text or Excel formats and are accessible via a public web user interface for biologists and R statistic and web service interfaces for bioinformaticians, based on open source MOLGENIS and xQTL workbench software. WormQTL welcomes data submissions from other worm researchers. PMID:23180786

  3. WormQTL--public archive and analysis web portal for natural variation data in Caenorhabditis spp.

    PubMed

    Snoek, L Basten; Van der Velde, K Joeri; Arends, Danny; Li, Yang; Beyer, Antje; Elvin, Mark; Fisher, Jasmin; Hajnal, Alex; Hengartner, Michael O; Poulin, Gino B; Rodriguez, Miriam; Schmid, Tobias; Schrimpf, Sabine; Xue, Feng; Jansen, Ritsert C; Kammenga, Jan E; Swertz, Morris A

    2013-01-01

    Here, we present WormQTL (http://www.wormqtl.org), an easily accessible database enabling search, comparative analysis and meta-analysis of all data on variation in Caenorhabditis spp. Over the past decade, Caenorhabditis elegans has become instrumental for molecular quantitative genetics and the systems biology of natural variation. These efforts have resulted in a valuable amount of phenotypic, high-throughput molecular and genotypic data across different developmental worm stages and environments in hundreds of C. elegans strains. WormQTL provides a workbench of analysis tools for genotype-phenotype linkage and association mapping based on but not limited to R/qtl (http://www.rqtl.org). All data can be uploaded and downloaded using simple delimited text or Excel formats and are accessible via a public web user interface for biologists and R statistic and web service interfaces for bioinformaticians, based on open source MOLGENIS and xQTL workbench software. WormQTL welcomes data submissions from other worm researchers.

  4. MDB: the Metalloprotein Database and Browser at The Scripps Research Institute

    PubMed Central

    Castagnetto, Jesus M.; Hennessy, Sean W.; Roberts, Victoria A.; Getzoff, Elizabeth D.; Tainer, John A.; Pique, Michael E.

    2002-01-01

    The Metalloprotein Database and Browser (MDB; http://metallo.scripps.edu) at The Scripps Research Institute is a web-accessible resource for metalloprotein research. It offers the scientific community quantitative information on geometrical parameters of metal-binding sites in protein structures available from the Protein Data Bank (PDB). The MDB also offers analytical tools for the examination of trends or patterns in the indexed metal-binding sites. A user can perform interactive searches, metal-site structure visualization (via a Java applet), and analysis of the quantitative data by accessing the MDB through a web browser without requiring an external application or platform-dependent plugin. The MDB also has a non-interactive interface with which other web sites and network-aware applications can seamlessly incorporate data or statistical analysis results from metal-binding sites. The information contained in the MDB is periodically updated with automated algorithms that find and index metal sites from new protein structures released by the PDB. PMID:11752342

  5. Why Does Attention to Web Articles Fall With Time?

    PubMed

    Simkin, Mikhail V; Roychowdhury, Vwani P

    2015-09-01

    We analyze access statistics of 150 blog entries and news articles for periods of up to 3 years. Access rate falls as an inverse power of time passed since publication. The power law holds for periods of up to 1,000 days. The exponents are different for different blogs and are distributed between 0.6 and 3.2. We argue that the decay of attention to a web article is caused by the link to it first dropping down the list of links on the website's front page, then disappearing from the front page, and subsequently moving further into the background. The other proposed explanations, which rely on a novelty factor that decays with time or on some intricate theory of human dynamics, cannot explain all of the experimental observations.
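
    The reported decay, with access rate proportional to time raised to a negative exponent between 0.6 and 3.2, can be estimated by a straight-line fit in log-log space; a sketch on synthetic data follows (the exponent used to generate the data is arbitrary and merely falls in the reported range).

        import numpy as np

        # Synthetic daily access counts following an inverse power law with multiplicative noise.
        rng = np.random.default_rng(0)
        days = np.arange(1, 1001)
        accesses = 500.0 * days ** -1.3 * rng.lognormal(0.0, 0.2, size=days.size)

        # Fit log(accesses) = log(C) - alpha * log(t); the slope of the fit gives -alpha.
        slope, intercept = np.polyfit(np.log(days), np.log(accesses), 1)
        print(f"estimated decay exponent alpha = {-slope:.2f}")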

  6. 77 FR 1728 - Privacy Act of 1974; Publication of Five New Systems of Records; Amendments to Five Existing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-11

    ... assistance to correspondents; to use Web site based programs; to provide usage statistics associated with the... of individuals for surveys. Among other things, maintaining the names, addresses, etc. of individuals... information in the system. Safeguards: Access by authorized personnel only. Computer security safeguards are...

  7. 77 FR 71201 - Order Extending Temporary Conditional Exemption for Nationally Recognized Statistical Rating...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-29

    ... following conflict of interest: Issuing or maintaining a credit rating for a security or money market... that was paid for by the issuer, sponsor, or underwriter of the security or money market instrument. 17...; Provide free and unlimited access to such password- protected Internet Web site during the applicable...

  8. Improving Web Accessibility in a University Setting

    ERIC Educational Resources Information Center

    Olive, Geoffrey C.

    2010-01-01

    Improving Web accessibility for disabled users visiting a university's Web site is explored following the World Wide Web Consortium (W3C) guidelines and Section 508 of the Rehabilitation Act rules for Web page designers to ensure accessibility. The literature supports the view that accessibility is sorely lacking, not only in the USA, but also…

  9. Electronic Ramp to Success: Designing Campus Web Pages for Users with Disabilities.

    ERIC Educational Resources Information Center

    Coombs, Norman

    2002-01-01

    Discusses key issues in addressing the challenge of Web accessibility for people with disabilities, including tools for Web authoring, repairing, and accessibility validation, and relevant legal issues. Presents standards for Web accessibility, including the Section 508 Standards from the Federal Access Board, and the World Wide Web Consortium's…

  10. Current state of web accessibility of Malaysian ministries websites

    NASA Astrophysics Data System (ADS)

    Ahmi, Aidi; Mohamad, Rosli

    2016-08-01

    Despite the fact that Malaysian public institutions have progressed considerably in website and portal usage, web accessibility has been reported as one of the issues that deserve special attention. Consistent with the government's moves to promote effective use of the web and portals, it is essential for government institutions to ensure compliance with established standards and guidelines on web accessibility. This paper evaluates the accessibility of 25 Malaysian ministries' websites using automated tools, i.e. WAVE and AChecker. Both tools are designed to objectively evaluate web accessibility in conformance with the Web Content Accessibility Guidelines 2.0 (WCAG 2.0) and the United States Rehabilitation Act 1973 (Section 508). The findings reported somewhat low compliance with web accessibility standards amongst the ministries. Further enhancement is needed for input elements, such as associating labels and checkboxes with text, as well as for image-related elements. These findings could be used as a mechanism for webmasters to locate and rectify errors pertaining to web accessibility and to ensure equal access to web information and services for all citizens.

  11. Streamstats: U.S. Geological Survey Web Application for Streamflow Statistics for Connecticut

    USGS Publications Warehouse

    Ahearn, Elizabeth A.; Ries, Kernell G.; Steeves, Peter A.

    2006-01-01

    Introduction: An important mission of the U.S. Geological Survey (USGS) is to provide information on streamflow in the Nation's rivers. Streamflow statistics are used by water managers, engineers, scientists, and others to protect people and property during floods and droughts, and to manage land, water, and biological resources. Common uses for streamflow statistics include dam, bridge, and culvert design; water-supply planning and management; water-use appropriations and permitting; wastewater and industrial discharge permitting; hydropower-facility design and regulation; and flood-plain mapping for establishing flood-insurance rates and land-use zones. In an effort to improve access to published streamflow statistics, and to make the process of computing streamflow statistics for ungaged stream sites easier, more accurate, and more consistent, the USGS and the Environmental Systems Research Institute, Inc. (ESRI) developed StreamStats (Ries and others, 2004). StreamStats is a Geographic Information System (GIS)-based Web application for serving previously published streamflow statistics and basin characteristics for USGS data-collection stations, and computing streamflow statistics and basin characteristics for ungaged stream sites. The USGS, in cooperation with the Connecticut Department of Environmental Protection and the Connecticut Department of Transportation, has implemented StreamStats for Connecticut.

  12. Integrating NASA Dryden Research Endeavors into the Teaching-Learning of Mathematics in the K-12 Classroom via the WWW

    NASA Technical Reports Server (NTRS)

    Ward, Robin A.

    2002-01-01

    The primary goal of this project was to continue populating the currently existing web site developed in 1998 in conjunction with the NASA Dryden Flight Research Center and California Polytechnic State University, with more mathematics lesson plans and activities that K-12 teachers, students, home-schoolers, and parents could access. All of the activities, while demonstrating some mathematical topic, also showcase the research endeavors of the NASA Dryden Flight Research Center. The website is located at: http://daniel.calpoly.edu/dfrc/Robin. The secondary goal of this project was to share the web-based activities with educators at various conferences and workshops. To address the primary goal of this project, over the past year, several new activities were posted on the web site and some of the existing activities were enhanced to contain more video clips, photos, and materials for teachers. To address the project's secondary goal, the web-based activities were showcased at several conferences and workshops. Additionally, in order to measure and assess the outreach impact of the web site, a link to the web site hitbox.com was established in April 2001, which allowed for the collection of traffic statistics for the web site (such as the domains of visitors, the frequency of visitors to this web site, etc.). Provided is a description of some of the newly created activities posted on the web site during the project period of 2001-2002, followed by a description of the conferences and workshops at which some of the web-based activities were showcased. Next is a brief summary of the web site's traffic statistics demonstrating its worldwide educational impact, followed by a listing of some of the awards and accolades the web site has received.

  13. Recognition of pornographic web pages by classifying texts and images.

    PubMed

    Hu, Weiming; Wu, Ou; Chen, Zhouyao; Fu, Zhouyu; Maybank, Steve

    2007-06-01

    With the rapid development of the World Wide Web, people benefit more and more from the sharing of information. However, Web pages with obscene, harmful, or illegal content can be easily accessed. It is important to recognize such unsuitable, offensive, or pornographic Web pages. In this paper, a novel framework for recognizing pornographic Web pages is described. A C4.5 decision tree is used to divide Web pages, according to content representations, into continuous text pages, discrete text pages, and image pages. These three categories of Web pages are handled, respectively, by a continuous text classifier, a discrete text classifier, and an algorithm that fuses the results from the image classifier and the discrete text classifier. In the continuous text classifier, statistical and semantic features are used to recognize pornographic texts. In the discrete text classifier, the naive Bayes rule is used to calculate the probability that a discrete text is pornographic. In the image classifier, the object's contour-based features are extracted to recognize pornographic images. In the text and image fusion algorithm, the Bayes theory is used to combine the recognition results from images and texts. Experimental results demonstrate that the continuous text classifier outperforms the traditional keyword-statistics-based classifier, the contour-based image classifier outperforms the traditional skin-region-based image classifier, the results obtained by our fusion algorithm outperform those by either of the individual classifiers, and our framework can be adapted to different categories of Web pages.
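
    For the discrete text classifier the abstract names the naive Bayes rule; a minimal sketch of a naive Bayes posterior over two page classes follows, with word likelihoods and priors invented for illustration (the paper's actual features and training data are not reproduced).

        import math

        # Hypothetical per-class word likelihoods P(word | class) and a uniform prior.
        likelihood = {
            "unsuitable": {"free": 0.02, "pics": 0.03, "adult": 0.05, "news": 0.001},
            "benign":     {"free": 0.01, "pics": 0.005, "adult": 0.002, "news": 0.02},
        }
        prior = {"unsuitable": 0.5, "benign": 0.5}
        unseen = 1e-4  # crude smoothing for words absent from a class model

        def posterior(tokens):
            """Naive Bayes: log P(class) + sum of log P(token | class), then normalize."""
            log_scores = {
                c: math.log(prior[c]) + sum(math.log(likelihood[c].get(t, unseen)) for t in tokens)
                for c in prior
            }
            z = max(log_scores.values())
            weights = {c: math.exp(s - z) for c, s in log_scores.items()}
            total = sum(weights.values())
            return {c: w / total for c, w in weights.items()}

        print(posterior(["free", "adult", "pics"]))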

  14. How Accessible Are Public Libraries' Web Sites? A Study of Georgia Public Libraries

    ERIC Educational Resources Information Center

    Ingle, Emma; Green, Ravonne A.; Huprich, Julia

    2009-01-01

    One issue that public librarians must consider when planning Web site design is accessibility for patrons with disabilities. This article reports a study of Web site accessibility of public libraries in Georgia. The focus of the report is whether public libraries use accessible guidelines and standards in making their Web sites accessible. An…

  15. Non-visual Web Browsing: Beyond Web Accessibility

    PubMed Central

    Ramakrishnan, I.V.; Ashok, Vikas

    2017-01-01

    People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant contents. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability. PMID:29202137

  16. Non-visual Web Browsing: Beyond Web Accessibility.

    PubMed

    Ramakrishnan, I V; Ashok, Vikas; Billah, Syed Masum

    2017-07-01

    People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant contents. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability.

  17. Teaching with Internet Telescopes: Some Lessons Learned

    NASA Astrophysics Data System (ADS)

    Stencel, Robert

    Observational astronomy is often difficult for pre-college students and teachers because: (1) school occurs in daytime and visual observing at night; (2) light pollution hides the stars from students living in cities; (3) few schools have teachers trained to use and maintain astronomy equipment; (4) there is a lack of access to expertise when needed; (5) physically disabled students cannot easily access a telescope eyepiece. Internet access to computer controlled telescopes with digital cameras can solve many of these difficulties. The Web enables students and teachers to access well-maintained internet-controllable telescopes at dark-site locations and to consult more readily with experts. This paper reports on a three-month pilot project exploring this situation, conducted Feb-May 2002, which allowed high school students to access a CCD-equipped, accurately pointing and tracking telescope located in New Mexico, controllable over the Web with a user-friendly skymap browser tool. User interest proved phenomenal and user statistics proved diverse. There were distinct lessons learned about how to enhance student participation in the research process. Details available at website www.du.edu/~rstencel/stn.htm. We thank the ICSRC for a grant to Denver University and acknowledge in-kind support from the estate of William Herschel Womble.

  18. Web Accessibility and Guidelines

    NASA Astrophysics Data System (ADS)

    Harper, Simon; Yesilada, Yeliz

    Access to, and movement around, complex online environments, of which the World Wide Web (Web) is the most popular example, has long been considered an important and major issue in the Web design and usability field. The commonly used slang phrase ‘surfing the Web’ implies rapid and free access, pointing to its importance among designers and users alike. It has also been long established that this potentially complex and difficult access is further complicated, and becomes neither rapid nor free, if the user is disabled. There are millions of people who have disabilities that affect their use of the Web. Web accessibility aims to help these people to perceive, understand, navigate, and interact with, as well as contribute to, the Web, and thereby the society in general. This accessibility is, in part, facilitated by the Web Content Accessibility Guidelines (WCAG) currently moving from version one to two. These guidelines are intended to encourage designers to make sure their sites conform to specifications, and in that conformance enable the assistive technologies of disabled users to better interact with the page content. In this way, it was hoped that accessibility could be supported. While this is in part true, guidelines do not solve all problems and the new WCAG version two guidelines are surrounded by controversy and intrigue. This chapter aims to establish the published literature related to Web accessibility and Web accessibility guidelines, and discuss limitations of the current guidelines and future directions.

  19. iRefWeb: interactive analysis of consolidated protein interaction data and their supporting evidence

    PubMed Central

    Turner, Brian; Razick, Sabry; Turinsky, Andrei L.; Vlasblom, James; Crowdy, Edgard K.; Cho, Emerson; Morrison, Kyle; Wodak, Shoshana J.

    2010-01-01

    We present iRefWeb, a web interface to protein interaction data consolidated from 10 public databases: BIND, BioGRID, CORUM, DIP, IntAct, HPRD, MINT, MPact, MPPI and OPHID. iRefWeb enables users to examine aggregated interactions for a protein of interest, and presents various statistical summaries of the data across databases, such as the number of organism-specific interactions, proteins and cited publications. Through links to source databases and supporting evidence, researchers may gauge the reliability of an interaction using simple criteria, such as the detection methods, the scale of the study (high- or low-throughput) or the number of cited publications. Furthermore, iRefWeb compares the information extracted from the same publication by different databases, and offers means to follow-up possible inconsistencies. We provide an overview of the consolidated protein–protein interaction landscape and show how it can be automatically cropped to aid the generation of meaningful organism-specific interactomes. iRefWeb can be accessed at: http://wodaklab.org/iRefWeb. Database URL: http://wodaklab.org/iRefWeb/ PMID:20940177

  20. Web-based triage in a college health setting.

    PubMed

    Sole, Mary Lou; Stuart, Patricia L; Deichen, Michael

    2006-01-01

    The authors describe the initiation and use of a Web-based triage system in a college health setting. During the first 4 months of implementation, the system recorded 1,290 encounters. More women accessed the system (70%); the average age was 21.8 years. The Web-based triage system advised the majority of students to seek care within 24 hours; however, it recommended self-care management in 22.7% of encounters. Sore throat was the most frequent chief complaint (14.2%). A subset of 59 students received treatment at student health services after requesting an appointment via e-mail. The authors used kappa statistics to compare congruence between chief complaint and 24/7 WebMed classification (kappa = .94), between chief complaint and student health center diagnosis (kappa = .91), and between 24/7 WebMed classification and student health center diagnosis (kappa = .89). Initial evaluation showed high use and good accuracy of Web-based triage. This service provides education and advice to students about their health care concerns.
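
    The agreement figures above are Cohen's kappa values; a small sketch of the computation from paired labels follows, with the labels invented for illustration.

        from collections import Counter

        def cohens_kappa(labels_a, labels_b):
            """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
            n = len(labels_a)
            observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
            freq_a, freq_b = Counter(labels_a), Counter(labels_b)
            chance = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
            return (observed - chance) / (1 - chance)

        # Hypothetical paired classifications of the same encounters
        # (e.g. chief complaint vs. triage category).
        rater_1 = ["throat", "throat", "rash", "cough", "throat", "rash"]
        rater_2 = ["throat", "throat", "rash", "throat", "throat", "rash"]
        print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")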

  1. Web Accessibility in Europe and the United States: What We Are Doing to Increase Inclusion

    ERIC Educational Resources Information Center

    Wheaton, Joseph; Bertini, Patrizia

    2007-01-01

    Accessibility is hardly a new problem and certainly did not originate with the Web. Lack of access to buildings long preceded the call for accessible Web content. Although it is unlikely that rehabilitation educators look at Web page accessibility with indifference, many may also find it difficult to implement. The authors posit three reasons why…

  2. Using Statistical Techniques and Web Search to Correct ESL Errors

    ERIC Educational Resources Information Center

    Gamon, Michael; Leacock, Claudia; Brockett, Chris; Dolan, William B.; Gao, Jianfeng; Belenko, Dmitriy; Klementiev, Alexandre

    2009-01-01

    In this paper we present a system for automatic correction of errors made by learners of English. The system has two novel aspects. First, machine-learned classifiers trained on large amounts of native data and a very large language model are combined to optimize the precision of suggested corrections. Second, the user can access real-life web…

  3. iSeq: Web-Based RNA-seq Data Analysis and Visualization.

    PubMed

    Zhang, Chao; Fan, Caoqi; Gan, Jingbo; Zhu, Ping; Kong, Lei; Li, Cheng

    2018-01-01

    Transcriptome sequencing (RNA-seq) is becoming a standard experimental methodology for genome-wide characterization and quantification of transcripts at single base-pair resolution. However, downstream analysis of massive amount of sequencing data can be prohibitively technical for wet-lab researchers. A functionally integrated and user-friendly platform is required to meet this demand. Here, we present iSeq, an R-based Web server, for RNA-seq data analysis and visualization. iSeq is a streamlined Web-based R application under the Shiny framework, featuring a simple user interface and multiple data analysis modules. Users without programming and statistical skills can analyze their RNA-seq data and construct publication-level graphs through a standardized yet customizable analytical pipeline. iSeq is accessible via Web browsers on any operating system at http://iseq.cbi.pku.edu.cn .

  4. Accessibility and content of individualized adult reconstructive hip and knee/musculoskeletal oncology fellowship web sites.

    PubMed

    Young, Bradley L; Cantrell, Colin K; Patt, Joshua C; Ponce, Brent A

    2018-06-01

    Accessible, adequate online information is important to fellowship applicants. Program web sites can affect which programs applicants apply to, subsequently altering interview costs incurred by both parties and ultimately impacting rank lists. Web site analyses have been performed for all orthopaedic subspecialties other than those involved in the combined adult reconstruction and musculoskeletal (MSK) oncology fellowship match. A complete list of active programs was obtained from the official adult reconstruction and MSK oncology society web sites. Web site accessibility was assessed using a structured Google search. Accessible web sites were evaluated based on 21 previously reported content criteria. Seventy-four adult reconstruction programs and 11 MSK oncology programs were listed on the official society web sites. Web sites were identified and accessible for 58 (78%) adult reconstruction and 9 (82%) MSK oncology fellowship programs. No web site contained all content criteria and more than half of both adult reconstruction and MSK oncology web sites failed to include 12 of the 21 criteria. Several programs participating in the combined Adult Reconstructive Hip and Knee/Musculoskeletal Oncology Fellowship Match did not have accessible web sites. Of the web sites that were accessible, none contained comprehensive information and the majority lacked information that has been previously identified as being important to prospective applicants.

  5. The climate4impact platform: Providing, tailoring and facilitating climate model data access

    NASA Astrophysics Data System (ADS)

    Pagé, Christian; Pagani, Andrea; Plieger, Maarten; Som de Cerff, Wim; Mihajlovski, Andrej; de Vreede, Ernst; Spinuso, Alessandro; Hutjes, Ronald; de Jong, Fokke; Bärring, Lars; Vega, Manuel; Cofiño, Antonio; d'Anca, Alessandro; Fiore, Sandro; Kolax, Michael

    2017-04-01

    One of the main objectives of climate4impact is to provide standardized web services and tools that are reusable in other portals. These services include web processing services, web coverage services and web mapping services (WPS, WCS and WMS). Tailored portals can be targeted to specific communities and/or countries/regions while making use of those services. Easier access to climate data is very important for the climate change impact communities. To fulfill this objective, the climate4impact (http://climate4impact.eu/) web portal and services have been developed, targeting climate change impact modellers, impact and adaptation consultants, as well as other experts using climate change data. It provides users with harmonized access to climate model data through tailored services. It features static and dynamic documentation, Use Cases and best practice examples, an advanced search interface, an integrated authentication and authorization system with the Earth System Grid Federation (ESGF), and a visualization interface with ADAGUC web mapping tools. In the latest version, statistical downscaling services, provided by the Santander Meteorology Group Downscaling Portal, were integrated. An innovative interface to integrate statistical downscaling services will be released in the upcoming version. The latter will be a big step in bridging the gap between climate scientists and the climate change impact communities. The climate4impact portal builds on the infrastructure of an international distributed database that has been set up to disseminate the global climate model results of the Coupled Model Intercomparison Project Phase 5 (CMIP5). This database, the ESGF, is an international collaboration that develops, deploys and maintains software infrastructure for the management, dissemination, and analysis of climate model data. The European FP7 project IS-ENES, Infrastructure for the European Network for Earth System modelling, supports the European contribution to ESGF and contributes to the ESGF open source effort, notably through the development of search, monitoring, quality control, and metadata services. In its second phase, IS-ENES2 supports the implementation of regional climate model results from the international Coordinated Regional Downscaling Experiments (CORDEX). These services were extended within the European FP7 Climate Information Portal for Copernicus (CLIPC) project, and some could be later integrated into the European Copernicus platform.
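
    The services named (WMS, WCS, WPS) follow ordinary OGC HTTP interfaces, so tailored portals can compose requests as plain URLs; the sketch below builds a standard WMS 1.3.0 GetMap request against a hypothetical endpoint and layer name, not the actual climate4impact services.

        from urllib.parse import urlencode

        # Hypothetical WMS endpoint and layer name; only the query parameters follow
        # the standard OGC WMS 1.3.0 GetMap interface.
        endpoint = "https://example.org/wms"
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.3.0",
            "REQUEST": "GetMap",
            "LAYERS": "tas_mean",          # illustrative layer: near-surface air temperature
            "STYLES": "",
            "CRS": "EPSG:4326",
            "BBOX": "30,-20,75,45",        # lat/lon bounding box roughly covering Europe
            "WIDTH": "800",
            "HEIGHT": "600",
            "FORMAT": "image/png",
            "TIME": "2050-07-01",
        }
        print(endpoint + "?" + urlencode(params))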

  6. How to Write Easy-to-Read Health Materials: MedlinePlus

    MedlinePlus

    ... practices. An accessible Web site helps people with reading and learning disabilities. For more information on Web accessibility, see the WebAIM (Web Accessibility in Mind) site from the Center for Persons with Disabilities ...

  7. Tool independence for the Web Accessibility Quantitative Metric.

    PubMed

    Vigo, Markel; Brajnik, Giorgio; Arrue, Myriam; Abascal, Julio

    2009-07-01

    The Web Accessibility Quantitative Metric (WAQM) aims at accurately measuring the accessibility of web pages. One of the main features of WAQM among others is that it is evaluation tool independent for ranking and accessibility monitoring scenarios. This article proposes a method to attain evaluation tool independence for all foreseeable scenarios. After demonstrating that homepages have a more similar error profile than any other web page in a given web site, 15 homepages were measured with 10,000 different values of WAQM parameters using EvalAccess and LIFT, two automatic evaluation tools for accessibility. A similar procedure was followed with random pages and with several test files obtaining several tuples that minimise the difference between both tools. One thousand four hundred forty-nine web pages from 15 web sites were measured with these tuples and those values that minimised the difference between the tools were selected. Once the WAQM was tuned, the accessibility of 15 web sites was measured with two metrics for web sites, concluding that even if similar values can be produced, obtaining the same scores is undesirable since evaluation tools behave in a different way.
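
    Tuning WAQM for tool independence amounts to searching for parameter tuples that minimize the score difference between two tools over a set of reference pages; the simplified sketch below illustrates that search with a stand-in scoring function and invented error counts (the real WAQM formula and its parameters are not reproduced).

        import itertools

        def waqm_like_score(page_report, a, b):
            """Stand-in for the WAQM scoring function; the real metric is more elaborate."""
            return 100.0 / (1.0 + a * page_report["errors"] + b * page_report["warnings"])

        # Hypothetical error/warning counts reported by two tools on the same homepages.
        tool_1 = [{"errors": 4, "warnings": 10}, {"errors": 1, "warnings": 3}]
        tool_2 = [{"errors": 6, "warnings": 7},  {"errors": 2, "warnings": 2}]

        grid = [x / 10 for x in range(1, 21)]   # candidate parameter values
        best = min(
            itertools.product(grid, grid),
            key=lambda ab: sum(
                abs(waqm_like_score(p1, *ab) - waqm_like_score(p2, *ab))
                for p1, p2 in zip(tool_1, tool_2)
            ),
        )
        print("parameter tuple minimizing the inter-tool difference:", best)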

  8. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    PubMed Central

    2010-01-01

    Background: The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results: Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions: Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/. PMID:20482791
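
    The regression idea can be sketched generically: instead of using only two time points, fit the logarithm of the ratio of the two variants against time and read the relative fitness off the slope. The code below is an illustration of that idea with invented numbers, not the vFitness implementation (which is C#/ASP.NET).

      # Relative fitness from the slope of ln(A/B) over time, using all time points.
      import numpy as np

      def relative_fitness(days, counts_a, counts_b):
          """Slope of ln(A/B) versus time; positive values favour variant A."""
          log_ratio = np.log(np.asarray(counts_a) / np.asarray(counts_b))
          slope, _intercept = np.polyfit(days, log_ratio, 1)
          return slope

      days = [0, 2, 4, 6, 8]
      variant_a = [100.0, 220.0, 470.0, 1000.0, 2100.0]
      variant_b = [100.0, 180.0, 330.0, 600.0, 1050.0]
      print(round(relative_fitness(days, variant_a, variant_b), 3))  # growth-rate difference per day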

  9. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.

    PubMed

    Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin

    2010-05-18

    The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.

  10. Moving toward a universally accessible web: Web accessibility and education.

    PubMed

    Kurt, Serhat

    2017-12-08

    The World Wide Web is an extremely powerful source of information, inspiration, ideas, and opportunities. As such, it has become an integral part of daily life for a great majority of people. Yet, for a significant number of others, the internet offers only limited value due to the existence of barriers which make accessing the Web difficult, if not impossible. This article illustrates some of the reasons that achieving equality of access to the online world of education is so critical, explores the current status of Web accessibility, discusses evaluative tools and methods that can help identify accessibility issues in educational websites, and provides practical recommendations and guidelines for resolving some of the obstacles that currently hinder the achievability of the goal of universal Web access.

  11. Web 2.0

    NASA Astrophysics Data System (ADS)

    Gibson, Becky

    The Web is growing and changing from a paradigm of static publishing to one of participation and interaction. This change has implications for people with disabilities who rely on access to the Web for employment, information, entertainment, and increased independence. The interactive and collaborative nature of Web 2.0 can present access problems for some users. There are some best practices which can be put in place today to improve access. New specifications such as Accessible Rich Internet Applications (ARIA) and IAccessible2 are opening the doors to increasing the accessibility of Web 2.0 and beyond.

  12. Calypso: a user-friendly web-server for mining and visualizing microbiome-environment interactions.

    PubMed

    Zakrzewski, Martha; Proietti, Carla; Ellis, Jonathan J; Hasan, Shihab; Brion, Marie-Jo; Berger, Bernard; Krause, Lutz

    2017-03-01

    Calypso is an easy-to-use online software suite that allows non-expert users to mine, interpret and compare taxonomic information from metagenomic or 16S rDNA datasets. Calypso has a focus on multivariate statistical approaches that can identify complex environment-microbiome associations. The software enables quantitative visualizations, statistical testing, multivariate analysis, supervised learning, factor analysis, multivariable regression, network analysis and diversity estimates. Comprehensive help pages, tutorials and videos are provided via a wiki page. The web-interface is accessible via http://cgenome.net/calypso/. The software is programmed in Java, PERL and R and the source code is available from Zenodo (https://zenodo.org/record/50931). The software is freely available for non-commercial users. Contact: l.krause@uq.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  13. Methods for estimating magnitude and frequency of floods in Arizona, developed with unregulated and rural peak-flow data through water year 2010

    USGS Publications Warehouse

    Paretti, Nicholas V.; Kennedy, Jeffrey R.; Turney, Lovina A.; Veilleux, Andrea G.

    2014-01-01

    The regional regression equations were integrated into the U.S. Geological Survey’s StreamStats program. The StreamStats program is a national map-based web application that allows the public to easily access published flood frequency and basin characteristic statistics. The interactive web application allows a user to select a point within a watershed (gaged or ungaged) and retrieve flood-frequency estimates derived from the current regional regression equations and geographic information system data within the selected basin. StreamStats provides users with an efficient and accurate means for retrieving the most up to date flood frequency and basin characteristic data. StreamStats is intended to provide consistent statistics, minimize user error, and reduce the need for large datasets and costly geographic information system software.
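
    Regional regression equations of this kind typically express a peak-flow quantile as a power function of basin characteristics. The sketch below shows only the general form; the coefficients and characteristics are invented for illustration and are not the published Arizona equations.

      # Illustrative regional regression equation: peak flow as a power function
      # of drainage area and mean annual precipitation (made-up coefficients).
      def peak_flow_estimate(drainage_area_mi2, mean_annual_precip_in,
                             a=45.0, b=0.65, c=0.8):
          """Return an estimated peak flow (cubic feet per second)."""
          return a * drainage_area_mi2 ** b * mean_annual_precip_in ** c

      print(round(peak_flow_estimate(120.0, 14.0)))  # estimate for a hypothetical basin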

  14. MUSTANG: A Community-Facing Web Service to Improve Seismic Data Quality Awareness Through Metrics

    NASA Astrophysics Data System (ADS)

    Templeton, M. E.; Ahern, T. K.; Casey, R. E.; Sharer, G.; Weertman, B.; Ashmore, S.

    2014-12-01

    IRIS DMC is engaged in a new effort to provide broad and deep visibility into the quality of data and metadata found in its terabyte-scale geophysical data archive. Taking advantage of large and fast disk capacity, modern advances in open database technologies, and nimble provisioning of virtual machine resources, we are creating an openly accessible treasure trove of data measurements for scientists and the general public to utilize in providing new insights into the quality of this data. We have branded this statistical gathering system MUSTANG, and have constructed it as a component of the web services suite that IRIS DMC offers. MUSTANG measures over forty data metrics addressing issues with archive status, data statistics and continuity, signal anomalies, noise analysis, metadata checks, and station state of health. These metrics could potentially be used both by network operators to diagnose station problems and by data users to sort suitable data from unreliable or unusable data. Our poster details what MUSTANG is, how users can access it, what measurements they can find, and how MUSTANG fits into the IRIS DMC's data access ecosystem. Progress in data processing, approaches to data visualization, and case studies of MUSTANG's use for quality assurance will be presented. We want to illustrate what is possible with data quality assurance, the need for data quality assurance, and how the seismic community will benefit from this freely available analytics service.
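
    Metrics exposed by a service like MUSTANG are retrieved over plain HTTP. The sketch below follows the general pattern of IRIS web-service queries, but the endpoint and parameter names should be treated as assumptions and checked against the service documentation.

      # Hedged sketch of querying a seismic data-quality metric over HTTP.
      import requests

      MUSTANG_URL = "http://service.iris.edu/mustang/measurements/1/query"  # assumed endpoint

      def fetch_metric(metric, network, station, start, end):
          params = {"metric": metric, "net": network, "sta": station,
                    "start": start, "end": end, "format": "text"}
          response = requests.get(MUSTANG_URL, params=params, timeout=60)
          response.raise_for_status()
          return response.text

      print(fetch_metric("sample_rms", "IU", "ANMO", "2014-01-01", "2014-01-31"))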

  15. BetaTPred: prediction of beta-TURNS in a protein using statistical algorithms.

    PubMed

    Kaur, Harpreet; Raghava, G P S

    2002-03-01

    beta-turns play an important role from a structural and functional point of view. beta-turns are the most common type of non-repetitive structures in proteins and comprise, on average, 25% of the residues. In the past, numerous methods have been developed to predict beta-turns in a protein. Most of these prediction methods are based on statistical approaches. In order to utilize the full potential of these methods, there is a need to develop a web server. This paper describes a web server called BetaTPred, developed for predicting beta-TURNS in a protein from its amino acid sequence. BetaTPred allows the user to predict turns in a protein using existing statistical algorithms. It also allows the user to predict different types of beta-TURNS, e.g., type I, I', II, II', VI, VIII and non-specific. This server assists the users in predicting the consensus beta-TURNS in a protein. The server is accessible from http://imtech.res.in/raghava/betatpred/

  16. Facilitating quality control for spectra assignments of small organic molecules: nmrshiftdb2--a free in-house NMR database with integrated LIMS for academic service laboratories.

    PubMed

    Kuhn, Stefan; Schlörer, Nils E

    2015-08-01

    With its laboratory information management system, nmrshiftdb2 supports the integration of electronic lab administration and management into academic NMR facilities. It also offers the setup of a local database while granting full access to nmrshiftdb2's World Wide Web database. For lab users, this freely available system allows the submission of orders for measurement, transfers recorded data automatically or manually, enables download of spectra via the web interface, and provides integrated access to the prediction, search, and assignment tools of the NMR database. For staff and lab administration, the flow of all orders can be supervised; administrative tools also include user and hardware management, a statistics function for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database are based on a web interface as front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.

  17. Web Accessibility and Accessibility Instruction

    ERIC Educational Resources Information Center

    Green, Ravonne A.; Huprich, Julia

    2009-01-01

    Section 508 of the Americans with Disabilities Act (ADA) mandates that programs and services be accessible to people with disabilities. While schools of library and information science (SLIS*) and university libraries should model accessible Web sites, this may not be the case. This article examines previous studies about the Web accessibility of…

  18. Emerging technologies and web accessibility: research challenges and opportunities focussing on vision issues.

    PubMed

    Harper, Simon; Yesilada, Yeliz

    2012-01-01

    This is a technological review paper focussed on identifying both the research challenges and opportunities for further investigation arising from emerging technologies, and it does not aim to propose any recommendation or standard. It is focussed on blind and partially sighted World Wide Web (Web) users along with others who use assistive technologies. The Web is a fast-moving interdisciplinary domain in which new technologies, techniques and research are in perpetual development. It is often difficult to maintain a holistic view of new developments within the multiple domains which together make up the Web. This suggests that knowledge of the current developments and predictions of future developments are additionally important for the accessibility community. Web accessibility has previously been characterised by the correction of our past mistakes to make the current Web fulfil the original vision of access for all. New technologies were not designed with accessibility in mind and technologies that could be useful for addressing accessibility issues were not identified or adopted by the accessibility community. We wish to enable the research community to undertake preventative measures and proactively address challenges, while recognising opportunities, before they become unpreventable or require retrospective technological enhancement. This article then reviews emerging trends within the Web and Web Accessibility domains.

  19. National Disability Registers Report on Causes of Intellectual Disability in Taiwan: 2000-2007

    ERIC Educational Resources Information Center

    Lin, Jin-Ding; Yen, Chia-Feng; Wu, Jia-Ling; Kang, Shih-Wan

    2009-01-01

    The main purposes of the present analysis were to describe the causes of intellectual disability (ID) and examine its change over time from 2000 to 2007 in Taiwan. Data of the present study mainly come from the public web-access information collected by the Department of Statistics, Ministry of the Interiors, Taipei, Taiwan. Data were…

  20. Administrative Prevalence of Autism Spectrum Disorders Based on National Disability Registers in Taiwan

    ERIC Educational Resources Information Center

    Lin, Jin-Ding; Lin, Lan-Ping; Wu, Jia-Ling

    2009-01-01

    The aim of this paper was to describe the prevalence of autism over time from 2000 to 2007 in Taiwan, with particular focus on age, gender, prevalence, and causes. We analyzed data from the public web-access information collected by the Department of Statistics, Ministry of the Interiors, Taipei, Taiwan. The data included: (1) the physically…

  1. Innovations in user-defined analysis: dynamic grouping and customized user datasets in VistaPHw.

    PubMed

    Solet, David; Glusker, Ann; Laurent, Amy; Yu, Tianji

    2006-01-01

    Flexible, ready access to community health assessment data is a feature of innovative Web-based data query systems. An example is VistaPHw, which provides access to Washington state data and statistics used in community health assessment. Because of its flexible analysis options, VistaPHw customizes local, population-based results to be relevant to public health decision-making. The advantages of two innovations, dynamic grouping and the Custom Data Module, are described. Dynamic grouping permits the creation of user-defined aggregations of geographic areas, age groups, race categories, and years. Standard VistaPHw measures such as rates, confidence intervals, and other statistics may then be calculated for the new groups. Dynamic grouping has provided data for major, successful grant proposals, building partnerships with local governments and organizations, and informing program planning for community organizations. The Custom Data Module allows users to prepare virtually any dataset so it may be analyzed in VistaPHw. Uses for this module may include datasets too sensitive to be placed on a Web server or datasets that are not standardized across the state. Limitations and other system needs are also discussed.

  2. Secure Web-Site Access with Tickets and Message-Dependent Digests

    USGS Publications Warehouse

    Donato, David I.

    2008-01-01

    Although there are various methods for restricting access to documents stored on a World Wide Web (WWW) site (a Web site), none of the widely used methods is completely suitable for restricting access to Web applications hosted on an otherwise publicly accessible Web site. A new technique, however, provides a mix of features well suited for restricting Web-site or Web-application access to authorized users, including the following: secure user authentication, tamper-resistant sessions, simple access to user state variables by server-side applications, and clean session terminations. This technique, called message-dependent digests with tickets, or MDDT, maintains secure user sessions by passing single-use nonces (tickets) and message-dependent digests of user credentials back and forth between client and server. Appendix 2 provides a working implementation of MDDT with PHP server-side code and JavaScript client-side code.
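
    A greatly simplified sketch of the ticket-plus-digest idea follows, written in Python rather than the PHP and JavaScript of the report's Appendix 2: the server issues a single-use nonce (ticket), and the client proves possession of a shared credential by returning a keyed digest computed over the ticket and the message.

      # Simplified illustration of single-use tickets with message-dependent digests.
      import hashlib
      import hmac
      import secrets

      class Server:
          def __init__(self, shared_credential):
              self.credential = shared_credential
              self.outstanding_tickets = set()

          def issue_ticket(self):
              ticket = secrets.token_hex(16)          # single-use nonce
              self.outstanding_tickets.add(ticket)
              return ticket

          def verify(self, ticket, message, digest):
              if ticket not in self.outstanding_tickets:
                  return False                        # unknown or already-used ticket
              self.outstanding_tickets.discard(ticket)
              expected = hmac.new(self.credential, ticket.encode() + message,
                                  hashlib.sha256).hexdigest()
              return hmac.compare_digest(expected, digest)

      def client_digest(credential, ticket, message):
          return hmac.new(credential, ticket.encode() + message, hashlib.sha256).hexdigest()

      secret = b"key-derived-from-user-credentials"
      server = Server(secret)
      ticket = server.issue_ticket()
      message = b"GET /protected/report"
      print(server.verify(ticket, message, client_digest(secret, ticket, message)))  # True
      print(server.verify(ticket, message, client_digest(secret, ticket, message)))  # False: ticket already spent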

  3. BioServices: a common Python package to access biological Web Services programmatically.

    PubMed

    Cokelaer, Thomas; Pultz, Dennis; Harder, Lea M; Serra-Musach, Jordi; Saez-Rodriguez, Julio

    2013-12-15

    Web interfaces provide access to numerous biological databases. Many can be accessed programmatically thanks to Web Services. Building applications that combine several of them would benefit from a single framework. BioServices is a comprehensive Python framework that provides programmatic access to major bioinformatics Web Services (e.g. KEGG, UniProt, BioModels, ChEMBLdb). Wrapping additional Web Services based either on Representational State Transfer or Simple Object Access Protocol/Web Services Description Language technologies is eased by the use of object-oriented programming. BioServices releases and documentation are available at http://pypi.python.org/pypi/bioservices under a GPL-v3 license.
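
    A hedged usage sketch follows; the class and method names reflect the package's documented style around the time of the paper, but exact signatures should be checked against the current BioServices release.

      # Hedged BioServices usage sketch (pip install bioservices).
      from bioservices import KEGG, UniProt

      kegg = KEGG()
      entry = kegg.get("hsa:7535")          # retrieve the KEGG record for human ZAP70
      print(entry[:200])

      uniprot = UniProt()
      hits = uniprot.search("zap70 AND organism:9606")   # query syntax may vary by release
      print(hits[:200])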

  4. The Plate Boundary Observatory: Community Focused Web Services

    NASA Astrophysics Data System (ADS)

    Matykiewicz, J.; Anderson, G.; Lee, E.; Hoyt, B.; Hodgkinson, K.; Persson, E.; Wright, J.; Torrez, D.; Jackson, M.

    2006-12-01

    The Plate Boundary Observatory (PBO), part of the NSF-funded EarthScope project, is designed to study the three-dimensional strain field resulting from deformation across the active boundary zone between the Pacific and North American plates in the western United States. To meet these goals, PBO will install 852 continuous GPS stations, 103 borehole strainmeter stations, 28 tiltmeters, and five laser strainmeters, as well as manage data for 209 previously existing continuous GPS stations. UNAVCO provides access to data products from these stations, as well as general information about the PBO project, via the PBO web site (http://pboweb.unavco.org). GPS and strainmeter data products can be found using a variety of channels, including map searches, text searches, and station specific data retrieval. In addition, the PBO construction status is available via multiple mapping interfaces, including custom web based map widgets and Google Earth. Additional construction details can be accessed from PBO operational pages and station specific home pages. The current state of health for the PBO network is available with the statistical snap-shot, full map interfaces, tabular web based reports, and automatic data mining and alerts. UNAVCO is currently working to enhance the community access to this information by developing a web service framework for the discovery of data products, interfacing with operational engineers, and exposing data services to third party participants. In addition, UNAVCO, through the PBO project, provides advanced data management and monitoring systems for use by the community in operating geodetic networks in the United States and beyond. We will demonstrate these systems during the AGU meeting, and we welcome inquiries from the community at any time.

  5. StreamStats: A water resources web application

    USGS Publications Warehouse

    Ries, Kernell G.; Guthrie, John G.; Rea, Alan H.; Steeves, Peter A.; Stewart, David W.

    2008-01-01

    Streamflow statistics, such as the 1-percent flood, the mean flow, and the 7-day 10-year low flow, are used by engineers, land managers, biologists, and many others to help guide decisions in their everyday work. For example, estimates of the 1-percent flood (the flow that is exceeded, on average, once in 100 years and has a 1-percent chance of being exceeded in any year, sometimes referred to as the 100-year flood) are used to create flood-plain maps that form the basis for setting insurance rates and land-use zoning. This and other streamflow statistics also are used for dam, bridge, and culvert design; water-supply planning and management; water-use appropriations and permitting; wastewater and industrial discharge permitting; hydropower facility design and regulation; and the setting of minimum required streamflows to protect freshwater ecosystems. In addition, researchers, planners, regulators, and others often need to know the physical and climatic characteristics of the drainage basins (basin characteristics) and the influence of human activities, such as dams and water withdrawals, on streamflow upstream from locations of interest to understand the mechanisms that control water availability and quality at those locations. Knowledge of the streamflow network and downstream human activities also is necessary to adequately determine whether an upstream activity, such as a water withdrawal, can be allowed without adversely affecting downstream activities.
Streamflow statistics could be needed at any location along a stream. Most often, streamflow statistics are needed at ungaged sites, where no streamflow data are available to compute the statistics. At U.S. Geological Survey (USGS) streamflow data-collection stations, which include streamgaging stations, partial-record stations, and miscellaneous-measurement stations, streamflow statistics can be computed from available data for the stations. Streamflow data are collected continuously at streamgaging stations. Streamflow measurements are collected systematically over a period of years at partial-record stations to estimate peak-flow or low-flow statistics. Streamflow measurements usually are collected at miscellaneous-measurement stations for specific hydrologic studies with various objectives.
StreamStats is a Web-based Geographic Information System (GIS) application that was created by the USGS, in cooperation with Environmental Systems Research Institute, Inc. (ESRI), to provide users with access to an assortment of analytical tools that are useful for water-resources planning and management. StreamStats functionality is based on ESRI’s ArcHydro Data Model and Tools, described on the Web at http://resources.arcgis.com/en/communities/hydro/01vn0000000s000000.htm. StreamStats allows users to easily obtain streamflow statistics, basin characteristics, and descriptive information for USGS data-collection stations and user-selected ungaged sites. It also allows users to identify stream reaches that are upstream and downstream from user-selected sites, and to identify and obtain information for locations along the streams where activities that may affect streamflow conditions are occurring. This functionality can be accessed through a map-based user interface that appears in the user’s Web browser, or individual functions can be requested remotely as Web services by other Web or desktop computer applications. StreamStats can perform these analyses much faster than historically used manual techniques.
StreamStats was designed so that each state would be implemented as a separate application, with a reliance on local partnerships to fund the individual applications, and a goal of eventual full national implementation. Idaho became the first state to implement StreamStats in 2003. By mid-2008, 14 states had applications available to the public, and 18 other states were in various stages of implementation.
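
    The definition of the 1-percent flood quoted above translates into a small annual-exceedance-probability calculation: assuming independent years, the chance of at least one exceedance over a design horizon follows directly. The snippet below is a worked illustration of that arithmetic, not part of StreamStats.

      # Probability of at least one exceedance of an annual-exceedance-probability
      # event over a horizon of n independent years.
      def prob_at_least_one_exceedance(annual_prob, years):
          return 1.0 - (1.0 - annual_prob) ** years

      # A "100-year" flood still has roughly a 26% chance of occurring at least
      # once during a 30-year period.
      print(round(prob_at_least_one_exceedance(0.01, 30), 3))  # ~0.26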

  6. The Effects of Implementing Web Accessibility Standards on the Success of Secondary Adolescents

    ERIC Educational Resources Information Center

    Savi, Christine Opitz; Savenye, Wilhelmina; Rowland, Cynthia

    2008-01-01

    Web accessibility has become a paramount concern in providing equal access to audiences of all abilities. Unless web accessibility is supported and employed, the internet does not deliver worldwide access as it was intended. This study engaged 60 students in a secondary school setting in order to identify the navigational effectiveness and…

  7. Adolescents and web porn: a new era of sexuality.

    PubMed

    Pizzol, Damiano; Bertoldo, Alessandro; Foresta, Carlo

    2016-05-01

    Pornography can affect the lifestyles of adolescents, especially in terms of their sexual habits and porn consumption, and may have a significant influence on their sexual attitudes and behaviors. The aim of this study was to understand and analyze the frequency, duration, and perception of web porn utilization by young Italians attending high school. A total of 1565 students attending the final year of high school were involved in the study, and 1492 agreed to fill out an anonymous survey. The questions representing the content of this study were: 1) How often do you access the web? 2) How much time do you remain connected? 3) Do you connect to pornographic sites? 4) How often do you access pornographic sites? 5) How much time do you spend on them? 6) How often do you masturbate? and 7) How do you rate the attendance of these sites? Statistical analysis was performed with Fisher's exact test. All young people, on an almost daily basis, have access to the Internet. Among those surveyed, 1163 (77.9%) of Internet users admit to the consumption of pornographic material, and of these, 93 (8%) access pornographic websites daily, 686 (59%) boys accessing these sites perceive the consumption of pornography as always stimulating, 255 (21.9%) define it as habitual, 116 (10%) report that it reduces sexual interest towards potential real-life partners, and the remaining 106 (9.1%) report a kind of addiction. In addition, 19% of overall pornography consumers report an abnormal sexual response, while the percentage rose to 25.1% among regular consumers. It is necessary to educate web users, especially young users, in the safe and responsible use of the Internet and its contents. Moreover, public education campaigns should be increased in number and frequency to help improve knowledge of Internet-related sexual issues among both adolescents and parents.
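
    As an aside on the analysis method, Fisher's exact test compares proportions in a contingency table. The snippet below illustrates the test on a 2x2 table; the counts are invented and are not the survey data.

      # Illustrative Fisher's exact test on made-up counts.
      from scipy.stats import fisher_exact

      #            reduced interest   no reduced interest
      table = [[25, 75],      # daily consumers (hypothetical counts)
               [60, 540]]     # occasional consumers (hypothetical counts)

      odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
      print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")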

  8. Monitoring and Evaluating Use of the World Wide Web in an Academic Library: An Exploratory Study.

    ERIC Educational Resources Information Center

    Abramson, Alicia D.

    1998-01-01

    Examines use of the World Wide Web on public-access computers at the American University Library (Washington, D.C.) to identify the most frequently accessed Web sites, the frequency with which library-owned Web resources were accessed, and Web-usage patterns in the library in relation to the time of day and day of the week. (Author/AEF)

  9. NeuroVault.org: A repository for sharing unthresholded statistical maps, parcellations, and atlases of the human brain.

    PubMed

    Gorgolewski, Krzysztof J; Varoquaux, Gael; Rivera, Gabriel; Schwartz, Yannick; Sochat, Vanessa V; Ghosh, Satrajit S; Maumet, Camille; Nichols, Thomas E; Poline, Jean-Baptiste; Yarkoni, Tal; Margulies, Daniel S; Poldrack, Russell A

    2016-01-01

    NeuroVault.org is dedicated to storing outputs of analyses in the form of statistical maps, parcellations and atlases, a unique strategy that contrasts with most neuroimaging repositories that store raw acquisition data or stereotaxic coordinates. Such maps are indispensable for performing meta-analyses, validating novel methodology, and deciding on precise outlines for regions of interest (ROIs). NeuroVault is open to maps derived from both healthy and clinical populations, as well as from various imaging modalities (sMRI, fMRI, EEG, MEG, PET, etc.). The repository uses modern web technologies such as interactive web-based visualization, cognitive decoding, and comparison with other maps to provide researchers with efficient, intuitive tools to improve the understanding of their results. Each dataset and map is assigned a permanent Uniform Resource Locator (URL), and all of the data is accessible through a REST Application Programming Interface (API). Additionally, the repository supports the NIDM-Results standard and has the ability to parse outputs from popular FSL and SPM software packages to automatically extract relevant metadata. This ease of use, modern web integration, and pioneering functionality hold promise to improve the workflow for making inferences about and sharing whole-brain statistical maps. Copyright © 2015 Elsevier Inc. All rights reserved.
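
    Since the abstract highlights REST access, here is a hedged sketch of reading from such an API over HTTP. The endpoint path and field names are assumptions to be checked against the live NeuroVault API documentation.

      # Hedged sketch of listing image records from a REST API.
      import requests

      API_ROOT = "https://neurovault.org/api"   # assumed base URL

      def list_recent_images(limit=5):
          response = requests.get(f"{API_ROOT}/images/", params={"limit": limit}, timeout=30)
          response.raise_for_status()
          payload = response.json()
          # Paginated responses typically expose a 'results' list of records.
          return [(item.get("id"), item.get("name")) for item in payload.get("results", [])]

      for image_id, name in list_recent_images():
          print(image_id, name)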

  10. Mobile Web and Accessibility

    NASA Astrophysics Data System (ADS)

    Hori, Masahiro; Kato, Takashi

    While focusing on the human-computer interaction side of the Web content delivery, this article discusses problems and prospects of the mobile Web and Web accessibility in terms of what lessons and experiences we have gained from Web accessibility and what they can say about the mobile Web. One aim is to draw particular attention to the importance of explicitly distinguishing between perceptual and cognitive aspects of the users’ interactions with the Web. Another is to emphasize the increased importance of scenario-based evaluation and remote testing for the mobile Web where the limited screen space and a variety of environmental factors of mobile use are critical design issues. A newly devised inspection type of evaluation method that focuses on the perceptual-cognitive distinction of accessibility and usability issues is presented as a viable means of scenario-based, remote testing for the Web.

  11. The Web as an educational tool for/in learning/teaching bioinformatics statistics.

    PubMed

    Oliver, J; Pisano, M E; Alonso, T; Roca, P

    2005-12-01

    Statistics provides essential tools in Bioinformatics for interpreting the results of a database search and for managing the enormous amounts of information produced by genomics, proteomics and metabolomics. The goal of this project was the development of a software tool, as simple as possible, to demonstrate the use of statistics in Bioinformatics. Computer Simulation Methods (CSMs) developed using Microsoft Excel were chosen for their broad range of applications, immediate and easy formula calculation, immediate testing, easy graphics representation, and general use and acceptance by the scientific community. The result of these endeavours is a set of utilities which can be accessed from the following URL: http://gmein.uib.es/bioinformatica/statistics. When tested on students with previous coursework using traditional statistical teaching methods, the general consensus was that Web-based instruction had numerous advantages, but that traditional methods with manual calculations were also needed for theory and practice. Once the basic statistical formulas had been mastered, Excel spreadsheets and graphics proved very useful for trying many parameters rapidly without having to perform tedious calculations. CSMs will be of great importance for the training of students and professionals in the field of bioinformatics, and for upcoming applications in self-directed learning and continuing education.

  12. Web Accessibility Policies at Land-Grant Universities

    ERIC Educational Resources Information Center

    Bradbard, David A.; Peters, Cara; Caneva, Yoana

    2010-01-01

    The Web has become an integral part of postsecondary education within the United States. There are specific laws that legally mandate postsecondary institutions to have Web sites that are accessible for students with disabilities (e.g., the Americans with Disabilities Act (ADA)). Web accessibility policies are a way for universities to provide a…

  13. A web-portal for interactive data exploration, visualization, and hypothesis testing

    PubMed Central

    Bartsch, Hauke; Thompson, Wesley K.; Jernigan, Terry L.; Dale, Anders M.

    2014-01-01

    Clinical research studies generate data that need to be shared and statistically analyzed by their participating institutions. The distributed nature of research and the different domains involved present major challenges to data sharing, exploration, and visualization. The Data Portal infrastructure was developed to support ongoing research in the areas of neurocognition, imaging, and genetics. Researchers benefit from the integration of data sources across domains, the explicit representation of knowledge from domain experts, and user interfaces providing convenient access to project specific data resources and algorithms. The system provides an interactive approach to statistical analysis, data mining, and hypothesis testing over the lifetime of a study and fulfills a mandate of public sharing by integrating data sharing into a system built for active data exploration. The web-based platform removes barriers for research and supports the ongoing exploration of data. PMID:24723882

  14. Working with WebQuests: Making the Web Accessible to Students with Disabilities.

    ERIC Educational Resources Information Center

    Kelly, Rebecca

    2000-01-01

    This article describes how students with disabilities in regular classes are using the WebQuest lesson format to access the Internet. It explains essential WebQuest principles, creating a draft Web page, and WebQuest components. It offers an example of a WebQuest about salvaging the sunken ships, Titanic and Lusitania. A WebQuest planning form is…

  15. SalanderMaps: A rapid overview about felt earthquakes through data mining of web-accesses

    NASA Astrophysics Data System (ADS)

    Kradolfer, Urs

    2013-04-01

    While seismological observatories detect and locate earthquakes based on measurements of the ground motion, they know neither a priori whether an earthquake has been felt by the public nor where it has been felt. Such information is usually gathered by evaluating feedback reported by the public through on-line forms on the web. However, after a felt earthquake in Switzerland, many people visit the webpages of the Swiss Seismological Service (SED) at ETH Zurich, and each such visit leaves traces in the logfiles on our web-servers. Data mining techniques applied to these logfiles, combined with mining of publicly available databases on the internet, open possibilities to obtain previously unknown information about our virtual visitors. In order to provide precise information to authorities and the media, it would be desirable to rapidly know from which locations these web-accesses originate. The method 'Salander' (Seismic Activity Linked to Area codes - Nimble Detection of Earthquake Rumbles) will be introduced, and it will be explained how the IP-addresses (each computer or router directly connected to the internet has a unique IP-address; an example would be 129.132.53.5) of a sufficient number of our virtual visitors were linked to their geographical area. This allows us to know unprecedentedly quickly whether and where an earthquake was felt in Switzerland. It will also be explained why the method Salander is superior to commercial so-called geolocation products. The corresponding products of the Salander method, animated SalanderMaps, which are routinely generated after each earthquake with a magnitude of M>2 in Switzerland (http://www.seismo.ethz.ch/prod/salandermaps/, available after March 2013), demonstrate how the wavefield of earthquakes propagates through Switzerland and where it was felt. Often, such information is available within less than 60 seconds after origin time, and we always get a clear picture within five minutes after origin time. Furthermore, the method allows earthquakes to be detected solely from the analysis of accesses to our web-servers. Analyzing more than 170 million web-accesses since 2003, all seismic events within or near Switzerland with magnitudes M>4 and most felt events with magnitudes between 3 and 4 were detected. The current system is very robust, as we had only one false alarm while re-processing the web-access logfiles of the past almost 10 years. We anticipate that this method will produce even faster results in the future, as the number of both commercial and private internet users is - according to the statistics of our logfiles - still increasing.
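
    The core of the approach, counting web-server hits per minute, attributing them to regions through an IP lookup, and flagging sudden spikes, can be sketched as follows. This is a heavily simplified illustration, not the SED implementation; the regular expression assumes an Apache-style combined log format, and lookup_region is a placeholder for a real IP-to-area mapping.

      # Simplified log-mining sketch: hits per minute per region, plus spike flagging.
      import re
      from collections import Counter, defaultdict

      LOG_LINE = re.compile(r'^(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\]')

      def lookup_region(ip):
          """Placeholder for an IP-address-to-region lookup (e.g. a GeoIP database)."""
          return "unknown"

      def hits_per_minute(log_lines):
          counts = defaultdict(Counter)            # minute -> Counter of regions
          for line in log_lines:
              match = LOG_LINE.match(line)
              if not match:
                  continue
              minute = match.group("ts")[:17]      # e.g. '01/Jan/2013:12:34'
              counts[minute][lookup_region(match.group("ip"))] += 1
          return counts

      def flag_spikes(counts, baseline=10.0, factor=5.0):
          """Return minutes whose total hit count exceeds factor * baseline."""
          return [minute for minute, c in counts.items() if sum(c.values()) > factor * baseline]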

  16. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  17. Hand Society and Matching Program Web Sites Provide Poor Access to Information Regarding Hand Surgery Fellowship.

    PubMed

    Hinds, Richard M; Klifto, Christopher S; Naik, Amish A; Sapienza, Anthony; Capo, John T

    2016-08-01

    The Internet is a common resource for applicants of hand surgery fellowships, however, the quality and accessibility of fellowship online information is unknown. The objectives of this study were to evaluate the accessibility of hand surgery fellowship Web sites and to assess the quality of information provided via program Web sites. Hand fellowship Web site accessibility was evaluated by reviewing the American Society for Surgery of the Hand (ASSH) on November 16, 2014 and the National Resident Matching Program (NRMP) fellowship directories on February 12, 2015, and performing an independent Google search on November 25, 2014. Accessible Web sites were then assessed for quality of the presented information. A total of 81 programs were identified with the ASSH directory featuring direct links to 32% of program Web sites and the NRMP directory directly linking to 0%. A Google search yielded direct links to 86% of program Web sites. The quality of presented information varied greatly among the 72 accessible Web sites. Program description (100%), fellowship application requirements (97%), program contact email address (85%), and research requirements (75%) were the most commonly presented components of fellowship information. Hand fellowship program Web sites can be accessed from the ASSH directory and, to a lesser extent, the NRMP directory. However, a Google search is the most reliable method to access online fellowship information. Of assessable programs, all featured a program description though the quality of the remaining information was variable. Hand surgery fellowship applicants may face some difficulties when attempting to gather program information online. Future efforts should focus on improving the accessibility and content quality on hand surgery fellowship program Web sites.

  18. The Development of Web-based Graphical User Interface for Unified Modeling Data with Multi (Correlated) Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian

    2018-04-01

    Statistical models have developed rapidly in various directions to accommodate various types of data. Data collected from longitudinal, repeated-measures, or clustered designs (whether continuous, binary, count, or ordinal) are likely to be correlated. Therefore, statistical models for independent responses, such as the Generalized Linear Model (GLM) and the Generalized Additive Model (GAM), are not appropriate. Several models are available for correlated responses, including GEEs (Generalized Estimating Equations) for marginal models, and various mixed-effect models such as GLMM (Generalized Linear Mixed Models) and HGLM (Hierarchical Generalized Linear Models) for subject-specific models. These models are available in the free open source software R, but they can only be accessed through a command line interface (using scripts). On the other hand, most practical researchers rely heavily on menu-based Graphical User Interfaces (GUIs). We develop, using the Shiny framework, a standard pull-down-menu Web-GUI that unifies most models for correlated responses. The Web-GUI accommodates almost all needed features. It enables users to run and compare various models for repeated-measures data (GEE, GLMM, HGLM, GEE for nominal responses) much more easily through online menus. This paper discusses the features of the Web-GUI and illustrates their use. In general, we find that GEE, GLMM, and HGLM gave very similar results.
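
    The portal itself wraps R packages behind a Shiny menu system; as an analogous, hedged illustration of the kind of marginal model it exposes, the Python sketch below fits a GEE with an exchangeable working correlation using statsmodels. The column names and data are invented.

      # Analogous GEE fit in Python (not the portal's R code).
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      df = pd.DataFrame({
          "subject": [1, 1, 1, 2, 2, 2, 3, 3, 3],
          "time":    [0, 1, 2, 0, 1, 2, 0, 1, 2],
          "y":       [2, 3, 5, 1, 2, 2, 4, 6, 7],
      })

      model = smf.gee("y ~ time", groups="subject", data=df,
                      family=sm.families.Poisson(),
                      cov_struct=sm.cov_struct.Exchangeable())
      print(model.fit().summary())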

  19. Exploring the SCOAP3 Research Contributions of the United States

    NASA Astrophysics Data System (ADS)

    Marsteller, Matthew

    2016-03-01

    The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) is a successful global partnership of libraries, funding agencies and research centers. This presentation will inform the audience about SCOAP3 and also delve into descriptive statistics of the United States' intellectual contribution to particle physics via these open access journals. Exploration of the SCOAP3 particle physics literature using a variety of metrics tools such as Web of Science™, InCites™, Scopus® and SciVal will be shared. ORA or Sci2 will be used to visualize author collaboration networks.

  20. Untangling Your Web.

    ERIC Educational Resources Information Center

    Coombs, Norman

    2000-01-01

    Provides an overview of universal Web design and discusses guidelines developed by the Web access initiative (WAI) that focus on the access needs of Web users with disabilities. Highlights include barriers for people with print disabilities or motor impairments; the role of libraries; and resources to assist Web designers. (LRW)

  1. Web Accessibility Knowledge and Skills for Non-Web Library Staff

    ERIC Educational Resources Information Center

    McHale, Nina

    2012-01-01

    Why do librarians and library staff other than Web librarians and developers need to know about accessibility? Web services staff do not--or should not--operate in isolation from the rest of the library staff. It is important to consider what areas of online accessibility are applicable to other areas of library work and to colleagues' regular job…

  2. Towards Web-based representation and processing of health information

    PubMed Central

    Gao, Sheng; Mioc, Darka; Yi, Xiaolun; Anton, Francois; Oldfield, Eddie; Coleman, David J

    2009-01-01

    Background: There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is now emerging as an important tool to enable health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Today several web-based health applications generate dynamic maps; however, for people to fully interpret the maps they need a description of the data sources and of the methods used for data analysis or statistical modeling. For the representation of health information through Web-mapping applications, there is still no standard format that accommodates all fixed (such as location) and variable (such as age, gender, or health outcome) indicators. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results: The authors of this study designed a HEalth Representation XML (HERXML) schema that consists of semantic (e.g., health activity description, data source description, and the statistical methodology used for analysis), geometric, and cartographical representations of health data. A case study was carried out on the development of web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It brought a new solution for better health data representation and an initial exploration of the Web-based processing of health information. Conclusion: The designed HERXML has proven to be an appropriate solution for supporting the Web representation of health information. It can be used by health practitioners, policy makers, and the public in disease etiology, health planning, health resource management, health promotion and health education. The utilization of Web-based processing services in this study provides a flexible way for users to select and use certain processing functions for health data processing and mapping via the Web. This research provides easy access to geospatial and health data for understanding the trends of diseases, and promotes the growth and enrichment of the CGDI in the public health sector. PMID:19159445

  3. [The Development and Application of the Orthopaedics Implants Failure Database Software Based on WEB].

    PubMed

    Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao

    2015-09-01

    This article describes the development of new Web-based failure database software for orthopaedic implants. The software uses the browser/server (B/S) model: ASP dynamic web technology is used as its main development language to achieve data interactivity, and Microsoft Access is used to create the database; these mature technologies make the software easy to extend or upgrade. The article presents the design and development concept of the software, its working process and functions, as well as relevant technical features. With this software, many different types of orthopaedic implant failure events can be stored and the failure data can be statistically analyzed; at the macroscopic level, the data can be used to evaluate the reliability of orthopaedic implants and operations, and ultimately to guide doctors in improving the level of clinical treatment.

  4. JavaScript Access to DICOM Network and Objects in Web Browser.

    PubMed

    Drnasin, Ivan; Grgić, Mislav; Gogić, Goran

    2017-10-01

    The Digital Imaging and Communications in Medicine (DICOM) 3.0 standard provides the baseline for picture archiving and communication systems (PACS). The development of the Internet and various communication media created demand for non-DICOM access to PACS systems. The ever-increasing use of web browsers, laptops and handheld devices, as opposed to desktop applications and static organizational computers, led to the development of different web technologies, which were subsequently accepted into the DICOM standard as tools for alternative access. This paper provides an overview of the current state of development of web access technology for DICOM repositories. It presents a different approach that uses HTML5 features of web browsers through the JavaScript language and the WebSocket protocol, enabling real-time communication with DICOM repositories. A JavaScript DICOM network library, a DICOM-to-WebSocket proxy and a proof-of-concept web application that qualifies as a DICOM 3.0 device were developed.

  5. GENESIS SciFlo: Choreographing Interoperable Web Services on the Grid using a Semantically-Enabled Dataflow Execution Environment

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.

    2007-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. SciFlo also publishes its own SOAP services for space/time query and subsetting of Earth Science datasets, and automated access to large datasets via lists of (FTP, HTTP, or DAP) URLs which point to on-line HDF or netCDF files. Typical distributed workflows obtain datasets by calling standard WMS/WCS servers or discovering and fetching data granules from ftp sites; invoke remote analysis operators available as SOAP services (interface described by a WSDL document); and merge results into binary containers (netCDF or HDF files) for further analysis using local executable operators. Naming conventions (HDFEOS and CF-1.0 for netCDF) are exploited to automatically understand and read on-line datasets. More interoperable conventions, and broader adoption of existing conventions, are vital if we are to "scale up" automated choreography of Web Services beyond toy applications. Recently, the ESIP Federation sponsored a collaborative activity in which several ESIP members developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. 
We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, the benefits of doing collaborative science analysis at the "touch of a button" once services are connected, and further collaborations that are being pursued.

  6. Accessing the SEED genome databases via Web services API: tools for programmers.

    PubMed

    Disz, Terry; Akhter, Sajia; Cuevas, Daniel; Olson, Robert; Overbeek, Ross; Vonstein, Veronika; Stevens, Rick; Edwards, Robert A

    2010-06-14

    The SEED integrates many publicly available genome sequences into a single resource. The database contains accurate and up-to-date annotations based on the subsystems concept that leverages clustering between genomes and other clues to accurately and efficiently annotate microbial genomes. The backend is used as the foundation for many genome annotation tools, such as the Rapid Annotation using Subsystems Technology (RAST) server for whole genome annotation, the metagenomics RAST server for random community genome annotations, and the annotation clearinghouse for exchanging annotations from different resources. In addition to a web user interface, the SEED also provides a Web-services-based API for programmatic access to the data in the SEED, allowing the development of third-party tools and mash-ups. The currently exposed Web services encompass over forty different methods for accessing data related to microbial genome annotations. The Web services provide comprehensive access to the database back end, allowing any programmer access to the most consistent and accurate genome annotations available. The Web services are deployed using a platform-independent service-oriented approach that allows the user to choose the most suitable programming platform for their application. Example code demonstrates that Web services can be used to access the SEED using common bioinformatics programming languages such as Perl, Python, and Java. We present a novel approach to access the SEED database. Using Web services, a robust API for access to genomics data is provided, without requiring large-volume downloads all at once. The API ensures timely access to the most current datasets available, including the new genomes as soon as they come online.
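
    The access pattern the paper describes, that is, requesting individual records from a service instead of downloading whole datasets, can be sketched as below. The endpoint URL and resource path are placeholders, not the actual SEED API; only the general request/response pattern is illustrated.

      # Hedged sketch of record-at-a-time access to a genome annotation web service.
      import requests

      SEED_ENDPOINT = "https://example.org/seed/api"   # hypothetical endpoint

      def get_genome_annotation(genome_id):
          """Request the annotation record for one genome identifier."""
          response = requests.get(f"{SEED_ENDPOINT}/genomes/{genome_id}", timeout=60)
          response.raise_for_status()
          return response.json()

      record = get_genome_annotation("83333.1")   # example identifier format
      print(sorted(record.keys()))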

  7. The Social Validation of Institutional Indicators to Promote System-Wide Web Accessibility in Postsecondary Institutions

    ERIC Educational Resources Information Center

    Mariger, Heather Ann

    2011-01-01

    The Internet is an integral part of higher education today. Students, faculty, and staff must have access to the institutional web for essential activities. For persons with disabilities, the web is a double-edged sword. While an accessibly designed website can mitigate or remove barriers, an inaccessible one can make access impossible. If…

  8. MAGI: a Node.js web service for fast microRNA-Seq analysis in a GPU infrastructure.

    PubMed

    Kim, Jihoon; Levy, Eric; Ferbrache, Alex; Stepanowsky, Petra; Farcas, Claudiu; Wang, Shuang; Brunner, Stefan; Bath, Tyler; Wu, Yuan; Ohno-Machado, Lucila

    2014-10-01

    MAGI is a web service for fast MicroRNA-Seq data analysis in a graphics processing unit (GPU) infrastructure. Using just a browser, users have access to results as web reports in just a few hours, a >600% end-to-end performance improvement over the state of the art. MAGI's salient features are (i) transfer of large input files in native FASTA with Qualities (FASTQ) format through drag-and-drop operations, (ii) rapid prediction of microRNA target genes leveraging parallel computing with GPU devices, (iii) all-in-one analytics with novel feature extraction, statistical testing for differential expression and diagnostic plot generation for quality control, and (iv) interactive visualization and exploration of results in web reports that are readily available for publication. MAGI relies on the Node.js JavaScript framework, along with NVIDIA CUDA C, PHP: Hypertext Preprocessor (PHP), Perl and R. It is freely available at http://magi.ucsd.edu. © The Author 2014. Published by Oxford University Press.

  9. Breaking and Fixing Origin-Based Access Control in Hybrid Web/Mobile Application Frameworks.

    PubMed

    Georgiev, Martin; Jana, Suman; Shmatikov, Vitaly

    2014-02-01

    Hybrid mobile applications (apps) combine the features of Web applications and "native" mobile apps. Like Web applications, they are implemented in portable, platform-independent languages such as HTML and JavaScript. Like native apps, they have direct access to local device resources-file system, location, camera, contacts, etc. Hybrid apps are typically developed using hybrid application frameworks such as PhoneGap. The purpose of the framework is twofold. First, it provides an embedded Web browser (for example, WebView on Android) that executes the app's Web code. Second, it supplies "bridges" that allow Web code to escape the browser and access local resources on the device. We analyze the software stack created by hybrid frameworks and demonstrate that it does not properly compose the access-control policies governing Web code and local code, respectively. Web code is governed by the same origin policy, whereas local code is governed by the access-control policy of the operating system (for example, user-granted permissions in Android). The bridges added by the framework to the browser have the same local access rights as the entire application, but are not correctly protected by the same origin policy. This opens the door to fracking attacks, which allow foreign-origin Web content included into a hybrid app (e.g., ads confined in iframes) to drill through the layers and directly access device resources. Fracking vulnerabilities are generic: they affect all hybrid frameworks, all embedded Web browsers, all bridge mechanisms, and all platforms on which these frameworks are deployed. We study the prevalence of fracking vulnerabilities in free Android apps based on the PhoneGap framework. Each vulnerability exposes sensitive local resources-the ability to read and write contacts list, local files, etc.-to dozens of potentially malicious Web domains. We also analyze the defenses deployed by hybrid frameworks to prevent resource access by foreign-origin Web content and explain why they are ineffectual. We then present NoFrak, a capability-based defense against fracking attacks. NoFrak is platform-independent, compatible with any framework and embedded browser, requires no changes to the code of the existing hybrid apps, and does not break their advertising-supported business model.

  10. Self-Directed Digital Learning: When Do Dental Students Study?

    PubMed

    Jackson, Tate H; Zhong, James; Phillips, Ceib; Koroluk, Lorne D

    2018-04-01

    The Growth and Development (G&D) curriculum at the University of North Carolina at Chapel Hill School of Dentistry uses self-directed web-based learning modules in the place of lectures and includes scheduled self-study times during the 8 am-5 pm school hours. The aim of this study was to use direct observation to evaluate dental students' access patterns with the self-directed, web-based learning modules in relation to planned self-study time allocated across the curriculum, proximity to course examinations, and course performance. Module access for all 80 students in the DDS Class of 2014 was recorded for date and time across the four G&D courses. Module access data were used to determine likelihood of usage during scheduled time and frequency of usage in three timeframes: >7, 3 to 7, and 0 to 2 days before the final exam. The results showed a statistically significant difference in the likelihood of module access during scheduled time across the curriculum (p<0.0001). Among the students, 64% accessed modules at least once during scheduled time in G&D1, but only 10%, 19%, and 18% in G&D2, G&D3, and G&D4, respectively. For all courses, the proportion of module accesses was significantly higher 0-2 days before an exam compared to the other two timeframes. Module access also differed significantly within each timeframe across all four courses (p<0.001). There was no association between module access and course performance. In this non-traditional, non-lecture, self-directed curriculum, students rarely accessed learning modules during syllabus-budgeted self-study time and accessed modules more frequently as course exams approached.

  11. Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.

    2006-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data Access Protocol (OpenDAP) servers. The scientist injects a distributed computation into the Grid by simply filling out an HTML form or directly authoring the underlying XML dataflow document, and results are returned directly to the scientist's desktop. Once an analysis has been specified for a chunk or day of data, it can be easily repeated with different control parameters or over months of data. Recently, the Earth Science Information Partners (ESIP) Federation sponsored a collaborative activity in which several ESIP members advertised their respective WMS/WCS and SOAP services, developed some collaborative science scenarios for atmospheric and aerosol science, and then choreographed services from multiple groups into demonstration workflows using the SciFlo engine and a Business Process Execution Language (BPEL) workflow engine. For several scenarios, the same collaborative workflow was executed in three ways: using hand-coded scripts, by executing a SciFlo document, and by executing a BPEL workflow document. We will discuss the lessons learned from this activity, the need for standardized interfaces (like WMS/WCS), the difficulty in agreeing on even simple XML formats and interfaces, and further collaborations that are being pursued.

  12. [Web accessibility of Internet appointment scheduling in primary care].

    PubMed

    Casasola Balsells, Luis Alejandro; Guerra González, Juan Carlos; Casasola Balsells, María Araceli; Pérez Chamorro, Vicente Antonio

    2017-12-16

    To assess the accessibility level of Internet appointment scheduling in primary care and the fulfilment of the requirements of Spanish legislation. Descriptive study of the accessibility of 18 web sites corresponding to the autonomous regional health services responsible for Internet appointment scheduling for primary health care services. The level of web accessibility was evaluated by means of five automated tools. Only six websites declared themselves to be in compliance with level AA of WCAG 2.0. The level of web accessibility according to the legal requirements in Spain is low. The evaluation tools identified the main errors to be corrected. Most of the autonomous regional health services responsible for Internet appointment scheduling in primary care need to improve their level of web accessibility and ensure that it complies with Spanish legislation. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  13. Total Access: Making College Web Sites Accessible to Students with Disabilities

    ERIC Educational Resources Information Center

    Bruyere, Susanne

    2008-01-01

    Colleges increasingly rely on the Web to attract, inform, and interact with students. This makes Web site accessibility and usability critical concerns, particularly for public community colleges, which educate sizable numbers of students with disabilities. As committed providers of postsecondary education to students with disabilities and thus a…

  14. A Web-Based Database for Nurse Led Outreach Teams (NLOT) in Toronto.

    PubMed

    Li, Shirley; Kuo, Mu-Hsing; Ryan, David

    2016-01-01

    A web-based system can provide access to real-time data and information. Healthcare is moving towards digitizing patients' medical information and securely exchanging it through web-based systems. In one of Ontario's health regions, Nurse Led Outreach Teams (NLOT) provide emergency mobile nursing services to help reduce unnecessary transfers from long-term care homes to emergency departments. Currently the NLOT team uses a Microsoft Access database to keep track of the health information on the residents that they serve. The Access database lacks scalability, portability, and interoperability. The objective of this study is the development of a web-based database using Oracle Application Express that is easily accessible from mobile devices. The web-based database will allow NLOT nurses to enter and access resident information anytime and from anywhere.

  15. Making It Work for Everyone: HTML5 and CSS Level 3 for Responsive, Accessible Design on Your Library's Web Site

    ERIC Educational Resources Information Center

    Baker, Stewart C.

    2014-01-01

    This article argues that accessibility and universality are essential to good Web design. A brief review of library science literature sets the issue of Web accessibility in context. The bulk of the article explains the design philosophies of progressive enhancement and responsive Web design, and summarizes recent updates to WCAG 2.0, HTML5, CSS…

  16. The Status of Web Accessibility of Canadian Universities and Colleges: A Charter of Rights and Freedoms Issue

    ERIC Educational Resources Information Center

    Zaparyniuk, Nicholas; Montgomerie, Craig

    2005-01-01

    The fundamental ideal that access to education and information is one of our basic human rights must not be neglected in the electronic information age. This ideal, however, is not being met in the area of postsecondary Web accessibility. This study surveyed 350 postsecondary institutions in Canada to evaluate their level of Web accessibility in…

  17. Mapping Norway - a Method to Register and Survey the Status of Accessibility

    NASA Astrophysics Data System (ADS)

    Michaelis, Sven; Bögelsack, Kathrin

    2018-05-01

    The Norwegian mapping authority has developed a standard method for mapping accessibility, mostly for people with limited or no walking abilities, in urban and recreational areas. We chose an object-oriented approach in which points, lines and polygons represent objects in the environment. All data are stored in a geospatial database, so they can be presented as a web map and analyzed using GIS software. By the end of 2016, more than 160 municipalities had been mapped using this method. The aim of this project is to establish a national standard for mapping and to provide a geodatabase that shows the status of accessibility throughout Norway. The data provide a useful tool for national statistics, local planning authorities and private users. First results show that accessibility is low and Norway still faces many challenges to meet the government's goals for Universal Design.

  18. Software Application Profile: Opal and Mica: open-source software solutions for epidemiological data management, harmonization and dissemination

    PubMed Central

    Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent

    2017-01-01

    Motivation: Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two inter-operable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Implementation: Opal and Mica are two standalone but inter-operable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. General features: Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real-time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Availability: Opal and Mica are open-source and freely available at [www.obiba.org] under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. PMID:29025122

  19. Implementing Recommendations From Web Accessibility Guidelines: A Comparative Study of Nondisabled Users and Users With Visual Impairments.

    PubMed

    Schmutz, Sven; Sonderegger, Andreas; Sauer, Juergen

    2017-09-01

    The present study examined whether implementing recommendations of Web accessibility guidelines would have different effects on nondisabled users than on users with visual impairments. The predominant approach for making Web sites accessible for users with disabilities is to apply accessibility guidelines. However, it has hardly been examined whether this approach has side effects for nondisabled users. A comparison of the effects on both user groups would contribute to a better understanding of possible advantages and drawbacks of applying accessibility guidelines. Participants from two matched samples, comprising 55 participants with visual impairments and 55 without impairments, took part in a synchronous remote testing of a Web site. Each participant was randomly assigned to one of three Web sites, which differed in the level of accessibility (very low, low, and high) according to recommendations of the well-established Web Content Accessibility Guidelines 2.0 (WCAG 2.0). Performance (i.e., task completion rate and task completion time) and a range of subjective variables (i.e., perceived usability, positive affect, negative affect, perceived aesthetics, perceived workload, and user experience) were measured. Higher conformance to Web accessibility guidelines resulted in increased performance and more positive user ratings (e.g., perceived usability or aesthetics) for both user groups. There was no interaction between user group and accessibility level. Higher conformance to WCAG 2.0 may result in benefits for nondisabled users and users with visual impairments alike. Practitioners may use the present findings as a basis for deciding whether and how best to implement accessibility.

  20. Brain Tumor Database, a free relational database for collection and analysis of brain tumor patient information.

    PubMed

    Bergamino, Maurizio; Hamilton, David J; Castelletti, Lara; Barletta, Laura; Castellan, Lucio

    2015-03-01

    In this study, we describe the development and utilization of a relational database designed to manage the clinical and radiological data of patients with brain tumors. The Brain Tumor Database was implemented using MySQL v.5.0, while the graphical user interface was created using PHP and HTML, thus making it easily accessible through a web browser. This web-based approach allows for multiple institutions to potentially access the database. The BT Database can record brain tumor patient information (e.g. clinical features, anatomical attributes, and radiological characteristics) and be used for clinical and research purposes. Analytic tools to automatically generate statistics and different plots are provided. The BT Database is a free and powerful user-friendly tool with a wide range of possible clinical and research applications in neurology and neurosurgery. The BT Database graphical user interface source code and manual are freely available at http://tumorsdatabase.altervista.org. © The Author(s) 2013.

  1. A tool for improving the Web accessibility of visually handicapped persons.

    PubMed

    Fujiki, Tadayoshi; Hanada, Eisuke; Yamada, Tomomi; Noda, Yoshihiro; Antoku, Yasuaki; Nakashima, Naoki; Nose, Yoshiaki

    2006-04-01

    Much has been written concerning the difficulties faced by visually handicapped persons when they access the internet. To solve some of the problems and to make web pages more accessible, we developed a tool we call the "Easy Bar," which works as a toolbar on the web browser. The functions of the Easy Bar are to change the size of web texts and images, to adjust the color, and to clear cached data that is automatically saved by the web browser. These functions are executed with ease by clicking buttons and operating a pull-down list. Since the icons built into Easy Bar are quite large, the user does not need to perform delicate operations. The functions of Easy Bar run on any web page without increasing the processing time. For the visually handicapped, Easy Bar would contribute greatly to improved web accessibility to medical information.

  2. Implementation of a near-real time cross-border web-mapping platform on airborne particulate matter (PM) concentration with open-source software

    NASA Astrophysics Data System (ADS)

    Knörchen, Achim; Ketzler, Gunnar; Schneider, Christoph

    2015-01-01

    Although Europe has been growing together for the past decades, cross-border information platforms on environmental issues are still scarce. With regard to the establishment of a web-mapping tool on airborne particulate matter (PM) concentration for the Euregio Meuse-Rhine located in the border region of Belgium, Germany and the Netherlands, this article describes the research on methodical and technical backgrounds implementing such a platform. An open-source solution was selected for presenting the data in a Web GIS (OpenLayers/GeoExt; both JavaScript-based), applying other free tools for data handling (Python), data management (PostgreSQL), geo-statistical modelling (Octave), geoprocessing (GRASS GIS/GDAL) and web mapping (MapServer). The multilingual, made-to-order online platform provides access to near-real time data on PM concentration as well as additional background information. In an open data section, commented configuration files for the Web GIS client are being made available for download. Furthermore, all geodata generated by the project is being published under public domain and can be retrieved in various formats or integrated into Desktop GIS as Web Map Services (WMS).
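
    As a rough illustration of how the published map services could be consumed programmatically, the Python snippet below issues a standard OGC WMS 1.1.1 GetMap request with the requests library; the server URL and layer name are placeholders rather than the project's actual endpoints.

      # Minimal sketch: fetching a map image from an OGC-compliant WMS with Python.
      # The base URL and layer name are hypothetical placeholders.
      import requests

      WMS_URL = "https://example.org/cgi-bin/mapserv"  # placeholder MapServer endpoint

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.1.1",
          "REQUEST": "GetMap",
          "LAYERS": "pm10_concentration",      # hypothetical layer name
          "SRS": "EPSG:4326",
          "BBOX": "5.5,50.3,6.5,51.0",         # lon/lat bounding box (roughly the Euregio)
          "WIDTH": "800",
          "HEIGHT": "600",
          "FORMAT": "image/png",
      }

      response = requests.get(WMS_URL, params=params, timeout=30)
      response.raise_for_status()
      with open("pm10_map.png", "wb") as f:
          f.write(response.content)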

  3. Bounce Back Now! Protocol of a population-based randomized controlled trial to examine the efficacy of a Web-based intervention with disaster-affected families.

    PubMed

    Ruggiero, Kenneth J; Davidson, Tatiana M; McCauley, Jenna; Gros, Kirstin Stauffacher; Welsh, Kyleen; Price, Matthew; Resnick, Heidi S; Danielson, Carla Kmett; Soltis, Kathryn; Galea, Sandro; Kilpatrick, Dean G; Saunders, Benjamin E; Nissenboim, Josh; Muzzy, Wendy; Fleeman, Anna; Amstadter, Ananda B

    2015-01-01

    Disasters have far-reaching and potentially long-lasting effects on youth and families. Research has consistently shown a clear increase in the prevalence of several mental health disorders after disasters, including depression and posttraumatic stress disorder. Widely accessible evidence-based interventions are needed to address this unmet need for youth and families, who are underrepresented in disaster research. Rapid growth in Internet and smartphone access, as well as several Web-based evaluation studies with various adult populations, has shown that Web-based interventions are likely to be feasible in this context and can improve clinical outcomes. Such interventions also are generally cost-effective, can be targeted or personalized, and can easily be integrated in a stepped care approach to screening and intervention delivery. This is a protocol paper that describes an innovative study design in which we evaluate a self-help Web-based resource, Bounce Back Now, with a population-based sample of disaster-affected adolescents and families. The paper includes description and justification for sampling selection and procedures, selection of assessment measures and methods, design of the intervention, and statistical evaluation of critical outcomes. Unique features of this study design include the use of address-based sampling to recruit a population-based sample of disaster-affected adolescents and parents, telephone and Web-based assessments, and development and evaluation of a highly individualized Web intervention for adolescents. Challenges related to large-scale evaluation of technology-delivered interventions with high-risk samples in time-sensitive research are discussed, as well as implications for future research and practice. Published by Elsevier Inc.

  4. Bounce Back Now! Protocol of a Population-Based Randomized Controlled Trial to Examine the Efficacy of a Web-based Intervention with Disaster-Affected Families

    PubMed Central

    Ruggiero, Kenneth J.; Davidson, Tatiana M.; McCauley, Jenna; Gros, Kirstin Stauffacher; Welsh, Kyleen; Price, Matthew; Resnick, Heidi S.; Danielson, Carla Kmett; Soltis, Kathryn; Galea, Sandro; Kilpatrick, Dean G.; Saunders, Benjamin E.; Nissenboim, Josh; Muzzy, Wendy; Fleeman, Anna; Amstadter, Ananda B.

    2014-01-01

    Disasters have far-reaching and potentially long-lasting effects on youth and families. Research has consistently shown a clear increase in the prevalence of several mental health disorders after disasters, including depression and posttraumatic stress disorder. Widely accessible evidence-based interventions are needed to address this unmet need for youth and families, who are underrepresented in disaster research. Rapid growth in Internet and smartphone access, as well as several web-based evaluation studies with various adult populations, has shown that web-based interventions are likely to be feasible in this context and can improve clinical outcomes. Such interventions also are generally cost-effective, can be targeted or personalized, and can easily be integrated in a stepped care approach to screening and intervention delivery. This is a protocol paper that describes an innovative study design in which we evaluate a self-help web-based resource, Bounce Back Now, with a population-based sample of disaster-affected adolescents and families. The paper includes description and justification for sampling selection and procedures, selection of assessment measures and methods, design of the intervention, and statistical evaluation of critical outcomes. Unique features of this study design include the use of address-based sampling to recruit a population-based sample of disaster-affected adolescents and parents, telephone and web-based assessments, and development and evaluation of a highly individualized web intervention for adolescents. Challenges related to large-scale evaluation of technology-delivered interventions with high-risk samples in time-sensitive research are discussed, as well as implications for future research and practice. PMID:25478956

  5. Delivering Electronic Resources with Web OPACs and Other Web-based Tools: Needs of Reference Librarians.

    ERIC Educational Resources Information Center

    Bordeianu, Sever; Carter, Christina E.; Dennis, Nancy K.

    2000-01-01

    Describes Web-based online public access catalogs (Web OPACs) and other Web-based tools as gateway methods for providing access to library collections. Addresses solutions for overcoming barriers to information, such as through the implementation of proxy servers and other authentication tools for remote users. (Contains 18 references.)…

  6. 22 CFR 502.6 - Terms of use for accessing program materials available on agency Web sites.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... available on agency Web sites. 502.6 Section 502.6 Foreign Relations BROADCASTING BOARD OF GOVERNORS... program materials available on agency Web sites. (a) By accessing Agency Web sites, Requestors agree to all the Terms of Use available on those Web sites. (b) All Requestors are advised that Agency program...

  7. How Public Is the Web?: Robots, Access, and Scholarly Communication.

    ERIC Educational Resources Information Center

    Snyder, Herbert; Rosenbaum, Howard

    1998-01-01

    Examines the use of Robot Exclusion Protocol (REP) to restrict the access of search engine robots to 10 major United States university Web sites. An analysis of Web site searching and interviews with Web server administrators shows that the decision to use this procedure is largely technical and is typically made by the Web server administrator.…

  8. Assessment of Web Content Accessibility Levels in Spanish Official Online Education Environments

    ERIC Educational Resources Information Center

    Roig-Vila, Rosabel; Ferrández, Sergio; Ferri-Miralles, Imma

    2014-01-01

    Diversity-based designing, or the goal of ensuring that web-based information is accessible to as many diverse users as possible, has received growing international acceptance in recent years, with many countries introducing legislation to enforce it. This paper analyses web content accessibility levels in Spanish education portals according to…

  9. A Comparison of Web Resource Access Experiments: Planning for the New Millennium.

    ERIC Educational Resources Information Center

    Greenberg, Jane

    This paper reports on research that compared five leading experiments that aim to improve access to the growing number of information resources on the World Wide Web. The objective was to identify characteristics of success and considerations for improvement in experiments providing access to Web resources via bibliographic control methods. The…

  10. Texting and accessing the web while driving: traffic citations and crashes among young adult drivers.

    PubMed

    Cook, Jerry L; Jones, Randall M

    2011-12-01

    We examined relations between young adult texting and accessing the web while driving with driving outcomes (viz. crashes and traffic citations). Our premise is that engaging in texting and accessing the web while driving is not only distracting but that these activities represent a pattern of behavior that leads to an increase in unwanted outcomes, such as crashes and citations. College students (N = 274) on 3 campuses (one in California and 2 in Utah) completed an electronic questionnaire regarding their driving experience and cell phone use. Our data indicate that 3 out of 4 (74.3%) young adults engage in texting while driving, over half on a weekly basis (51.8%), and some engage in accessing the web while driving (16.8%). Data analysis revealed a relationship between these cell phone behaviors and traffic citations and crashes. The findings support Jessor and Jessor's (1977) "problem behavior syndrome" by showing that traffic citations are related to texting and accessing the web while driving and that crashes are related to accessing the web while driving. Limitations and recommendations are discussed.

  11. Evaluating Web accessibility at different processing phases

    NASA Astrophysics Data System (ADS)

    Fernandes, N.; Lopes, R.; Carriço, L.

    2012-09-01

    Modern Web sites use several techniques (e.g. DOM manipulation) that allow for the injection of new content into their Web pages (e.g. AJAX), as well as manipulation of the HTML DOM tree. This has the consequence that the Web pages that are presented to users (i.e. after browser processing) are different from the original structure and content that is transmitted through HTTP communication (i.e. before browser processing). This poses a series of challenges for Web accessibility evaluation, especially on automated evaluation software. This article details an experimental study designed to understand the differences posed by accessibility evaluation after Web browser processing. We implemented a Javascript-based evaluator, QualWeb, which can perform WCAG 2.0-based accessibility evaluations in the two phases of browser processing. Our study shows that, in fact, there are considerable differences between the HTML DOM trees in both phases, which have the consequence of having distinct evaluation results. We discuss the impact of these results in the light of the potential problems that these differences can pose to designers and developers that use accessibility evaluators that function before browser processing.
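
    To make the two phases concrete, the Python sketch below applies a single WCAG-style check (images lacking alt text) to the HTML as transmitted over HTTP, i.e. before browser processing; evaluating the page after browser processing would instead require rendering it first, for example with a headless browser. The URL is a placeholder and the check stands in for only one of many WCAG 2.0 techniques.

      # Minimal sketch: one accessibility check (missing alt attributes on <img>)
      # applied to the HTML as transmitted over HTTP, i.e. before browser processing.
      # Checking the post-processing DOM would require a headless browser instead.
      import requests
      from bs4 import BeautifulSoup

      url = "https://example.org/"  # placeholder
      html = requests.get(url, timeout=30).text

      soup = BeautifulSoup(html, "html.parser")
      images = soup.find_all("img")
      missing_alt = [img for img in images if not img.get("alt")]

      print(f"{len(missing_alt)} of {len(images)} <img> elements lack alt text")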

  12. EPEPT: A web service for enhanced P-value estimation in permutation tests

    PubMed Central

    2011-01-01

    Background: In computational biology, permutation tests have become a widely used tool to assess the statistical significance of an event under investigation. However, the common way of computing the P-value, which expresses the statistical significance, requires a very large number of permutations when small (and thus interesting) P-values are to be accurately estimated. This is computationally expensive and often infeasible. Recently, we proposed an alternative estimator, which requires far fewer permutations compared to the standard empirical approach while still reliably estimating small P-values [1]. Results: The proposed P-value estimator has been enriched with additional functionalities and is made available to the general community through a public website and web service, called EPEPT. This means that the EPEPT routines can be accessed not only via a website, but also programmatically using any programming language that can interact with the web. Examples of web service clients in multiple programming languages can be downloaded. Additionally, EPEPT accepts data of various common experiment types used in computational biology. For these experiment types EPEPT first computes the permutation values and then performs the P-value estimation. Finally, the source code of EPEPT can be downloaded. Conclusions: Different types of users, such as biologists, bioinformaticians and software engineers, can use the method in an appropriate and simple way. Availability: http://informatics.systemsbiology.net/EPEPT/ PMID:22024252
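
    For context, the standard empirical estimator that EPEPT is designed to improve upon can be written in a few lines; the Python sketch below uses the common pseudocount form p = (1 + number of permuted statistics >= observed) / (1 + N), which is why very many permutations are needed to resolve small P-values. This illustrates the baseline approach only, not EPEPT's enhanced estimator.

      # Minimal sketch of the standard empirical permutation P-value,
      # i.e. the estimator whose permutation cost EPEPT is designed to reduce.
      import numpy as np

      rng = np.random.default_rng(0)

      def empirical_perm_pvalue(x, y, n_perm=10000):
          """Two-sample test on the difference of means via label permutation."""
          observed = abs(x.mean() - y.mean())
          pooled = np.concatenate([x, y])
          count = 0
          for _ in range(n_perm):
              rng.shuffle(pooled)
              stat = abs(pooled[:len(x)].mean() - pooled[len(x):].mean())
              if stat >= observed:
                  count += 1
          # Pseudocount keeps the estimate away from an impossible p = 0.
          return (count + 1) / (n_perm + 1)

      x = rng.normal(0.5, 1.0, size=30)
      y = rng.normal(0.0, 1.0, size=30)
      print(empirical_perm_pvalue(x, y))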

  13. Computational knowledge integration in biopharmaceutical research.

    PubMed

    Ficenec, David; Osborne, Mark; Pradines, Joel; Richards, Dan; Felciano, Ramon; Cho, Raymond J; Chen, Richard O; Liefeld, Ted; Owen, James; Ruttenberg, Alan; Reich, Christian; Horvath, Joseph; Clark, Tim

    2003-09-01

    An initiative to increase biopharmaceutical research productivity by capturing, sharing and computationally integrating proprietary scientific discoveries with public knowledge is described. This initiative involves both organisational process change and multiple interoperating software systems. The software components rely on mutually supporting integration techniques. These include a richly structured ontology, statistical analysis of experimental data against stored conclusions, natural language processing of public literature, secure document repositories with lightweight metadata, web services integration, enterprise web portals and relational databases. This approach has already begun to increase scientific productivity in our enterprise by creating an organisational memory (OM) of internal research findings, accessible on the web. Through bringing together these components it has also been possible to construct a very large and expanding repository of biological pathway information linked to this repository of findings which is extremely useful in analysis of DNA microarray data. This repository, in turn, enables our research paradigm to be shifted towards more comprehensive systems-based understandings of drug action.

  14. Access and completion of a Web-based treatment in a population-based sample of tornado-affected adolescents.

    PubMed

    Price, Matthew; Yuen, Erica K; Davidson, Tatiana M; Hubel, Grace; Ruggiero, Kenneth J

    2015-08-01

    Although Web-based treatments have significant potential to assess and treat difficult-to-reach populations, such as trauma-exposed adolescents, the extent to which such treatments are accessed and used is unclear. The present study evaluated the proportion of adolescents who accessed and completed a Web-based treatment for postdisaster mental health symptoms. Correlates of access and completion were examined. A sample of 2,000 adolescents living in tornado-affected communities was assessed via structured telephone interview and invited to a Web-based treatment. The modular treatment addressed symptoms of posttraumatic stress disorder, depression, and alcohol and tobacco use. Participants were randomized to experimental or control conditions after accessing the site. Overall access for the intervention was 35.8%. Module completion for those who accessed ranged from 52.8% to 85.6%. Adolescents with parents who used the Internet to obtain health-related information were more likely to access the treatment. Adolescent males were less likely to access the treatment. Future work is needed to identify strategies to further increase the reach of Web-based treatments to provide clinical services in a postdisaster context. (c) 2015 APA, all rights reserved.

  15. Access and Completion of a Web-Based Treatment in a Population-Based Sample of Tornado-Affected Adolescents

    PubMed Central

    Price, Matthew; Yuen, Erica; Davidson, Tatiana M.; Hubel, Grace; Ruggiero, Kenneth J.

    2015-01-01

    Although web-based treatments have significant potential to assess and treat difficult to reach populations, such as trauma-exposed adolescents, the extent that such treatments are accessed and used is unclear. The present study evaluated the proportion of adolescents who accessed and completed a web-based treatment for post-disaster mental health symptoms. Correlates of access and completion were examined. A sample of 2,000 adolescents living in tornado-affected communities was assessed via structured telephone interview and invited to a web-based treatment. The modular treatment addressed symptoms of PTSD, depression, and alcohol and tobacco use. Participants were randomized to experimental or control conditions after accessing the site. Overall access for the intervention was 35.8%. Module completion for those who accessed ranged from 52.8% to 85.6%. Adolescents with parents who used the Internet to obtain health-related information were more likely to access the treatment. Adolescent males were less likely to access the treatment. Future work is needed to identify strategies to further increase the reach of web-based treatments to provide clinical services in a post-disaster context. PMID:25622071

  16. A cross disciplinary study of link decay and the effectiveness of mitigation techniques

    PubMed Central

    2013-01-01

    Background: The dynamic, decentralized world-wide-web has become an essential part of scientific research and communication. Researchers create thousands of web sites every year to share software, data and services. These valuable resources tend to disappear over time. The problem has been documented in many subject areas. Our goal is to conduct a cross-disciplinary investigation of the problem and test the effectiveness of existing remedies. Results: We accessed 14,489 unique web pages found in the abstracts within Thomson Reuters' Web of Science citation index that were published between 1996 and 2010 and found that the median lifespan of these web pages was 9.3 years with 62% of them being archived. Survival analysis and logistic regression were used to find significant predictors of URL lifespan. The availability of a web page is most dependent on the time it is published and the top-level domain names. Similar statistical analysis revealed biases in current solutions: the Internet Archive favors web pages with fewer layers in the Universal Resource Locator (URL) while WebCite is significantly influenced by the source of publication. We also created a prototype for a process to submit web pages to the archives and increased coverage of our list of scientific webpages in the Internet Archive and WebCite by 22% and 255%, respectively. Conclusion: Our results show that link decay continues to be a problem across different disciplines and that current solutions for static web pages are helping and can be improved. PMID:24266891

  17. A cross disciplinary study of link decay and the effectiveness of mitigation techniques.

    PubMed

    Hennessey, Jason; Ge, Steven

    2013-01-01

    The dynamic, decentralized world-wide-web has become an essential part of scientific research and communication. Researchers create thousands of web sites every year to share software, data and services. These valuable resources tend to disappear over time. The problem has been documented in many subject areas. Our goal is to conduct a cross-disciplinary investigation of the problem and test the effectiveness of existing remedies. We accessed 14,489 unique web pages found in the abstracts within Thomson Reuters' Web of Science citation index that were published between 1996 and 2010 and found that the median lifespan of these web pages was 9.3 years with 62% of them being archived. Survival analysis and logistic regression were used to find significant predictors of URL lifespan. The availability of a web page is most dependent on the time it is published and the top-level domain names. Similar statistical analysis revealed biases in current solutions: the Internet Archive favors web pages with fewer layers in the Universal Resource Locator (URL) while WebCite is significantly influenced by the source of publication. We also created a prototype for a process to submit web pages to the archives and increased coverage of our list of scientific webpages in the Internet Archive and WebCite by 22% and 255%, respectively. Our results show that link decay continues to be a problem across different disciplines and that current solutions for static web pages are helping and can be improved.
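
    A hedged sketch of the kind of availability probe underlying such a survey is shown below in Python: each URL receives an HTTP HEAD request and the fraction still reachable is reported. The URLs are illustrative placeholders; a real survey would also follow up failures, retry over time, and consult web archives.

      # Minimal sketch: probing a list of scholarly URLs for link decay.
      # Real surveys would also retry, fall back to GET, and check web archives.
      import requests

      urls = [
          "https://example.org/software/tool1",   # illustrative placeholders
          "https://example.org/data/supplement2",
      ]

      alive = 0
      for url in urls:
          try:
              r = requests.head(url, allow_redirects=True, timeout=10)
              if r.status_code < 400:
                  alive += 1
          except requests.RequestException:
              pass

      print(f"{alive}/{len(urls)} URLs still reachable")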

  18. Browsing the World Wide Web from behind a firewall

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simons, R.W.

    1995-02-01

    The World Wide Web provides a unified method of access to various information services on the Internet via a variety of protocols. Mosaic and other browsers give users a graphical interface to the Web that is easier to use and more visually pleasing than any other common Internet information service today. The availability of information via the Web and the number of users accessing it have both grown rapidly in the last year. The interest and investment of commercial firms in this technology suggest that in the near future, access to the Web may become as necessary to doing business as a telephone. This is problematical for organizations that use firewalls to protect their internal networks from the Internet. Allowing all the protocols and types of information found in the Web to pass their firewall will certainly increase the risk of attack by hackers on the Internet. But not allowing access to the Web could be even more dangerous, as frustrated users of the internal network are either unable to do their jobs, or find creative new ways to get around the firewall. The solution to this dilemma adopted at Sandia National Laboratories is described. Discussion also covers risks of accessing the Web, design alternatives considered, and trade-offs used to find the proper balance between access and protection.

  19. End User Evaluations

    NASA Astrophysics Data System (ADS)

    Jay, Caroline; Lunn, Darren; Michailidou, Eleni

    As new technologies emerge, and Web sites become increasingly sophisticated, ensuring they remain accessible to disabled and small-screen users is a major challenge. While guidelines and automated evaluation tools are useful for informing some aspects of Web site design, numerous studies have demonstrated that they provide no guarantee that the site is genuinely accessible. The only reliable way to evaluate the accessibility of a site is to study the intended users interacting with it. This chapter outlines the processes that can be used throughout the design life cycle to ensure Web accessibility, describing their strengths and weaknesses, and discussing the practical and ethical considerations that they entail. The chapter also considers an important emerging trend in user evaluations: combining data from studies of “standard” Web use with data describing existing accessibility issues, to drive accessibility solutions forward.

  20. Statistical modeling of the Internet traffic dynamics: To which extent do we need long-term correlations?

    NASA Astrophysics Data System (ADS)

    Markelov, Oleg; Nguyen Duc, Viet; Bogachev, Mikhail

    2017-11-01

    Recently we have suggested a universal superstatistical model of user access patterns and aggregated network traffic. The model takes into account the irregular character of end user access patterns on the web via the non-exponential distributions of the local access rates, but neglects the long-term correlations between these rates. While the model is accurate for quasi-stationary traffic records, its performance under highly variable and especially non-stationary access dynamics remains questionable. In this paper, using an example of the traffic patterns from a highly loaded network cluster hosting the website of the 1998 FIFA World Cup, we suggest a generalization of the previously suggested superstatistical model by introducing long-term correlations between access rates. Using queueing system simulations, we show explicitly that this generalization is essential for modeling network nodes with highly non-stationary access patterns, where neglecting long-term correlations leads to the underestimation of the empirical average sojourn time by several decades under high throughput utilization.
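
    To illustrate the type of queueing-system simulation referred to, the toy Python sketch below runs a single-server FIFO queue in which each arrival's local rate is drawn from a gamma distribution (a superstatistical, non-exponential mixture) and reports the mean sojourn time; the long-term correlations between rates that the generalized model introduces are deliberately left out of this sketch.

      # Toy sketch: single-server FIFO queue with superstatistical (gamma-mixed)
      # arrival rates; long-term correlations between rates are deliberately omitted.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000
      service_rate = 1.0

      # Each inter-arrival time uses its own locally exponential rate,
      # with the rates themselves gamma-distributed (a superstatistical mixture).
      local_rates = rng.gamma(shape=2.0, scale=0.4, size=n)   # mean rate 0.8
      interarrival = rng.exponential(1.0 / local_rates)
      service = rng.exponential(1.0 / service_rate, size=n)

      # Lindley recursion for waiting times in a FIFO single-server queue.
      wait = np.zeros(n)
      for i in range(1, n):
          wait[i] = max(0.0, wait[i - 1] + service[i - 1] - interarrival[i])

      sojourn = wait + service
      print("mean sojourn time:", sojourn.mean())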

  1. elevatr: Access Elevation Data from Various APIs

    EPA Pesticide Factsheets

    Several web services are available that provide access to elevation data. This package provides access to several of those services and returns elevation data either as a SpatialPointsDataFrame from point elevation services or as a raster object from raster elevation services. Currently, the package supports access to the Mapzen Elevation Service, Mapzen Terrain Service, and the USGS Elevation Point Query Service. The R language for statistical computing is increasingly used for spatial data analysis. This R package, elevatr, was developed in response to this trend and provides access to elevation data from various sources directly in R. The impact of `elevatr` is that it will 1) facilitate spatial analysis in R by providing access to foundational datasets for many types of analyses (e.g. hydrology, limnology), 2) open up a new set of users and uses for APIs widely used outside of R, and 3) provide an excellent example of federal open-source development as promoted by the Federal Source Code Policy (https://sourcecode.cio.gov/).
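
    As a hedged illustration of the kind of point-elevation lookup that such packages wrap, the Python snippet below queries an elevation point service over HTTP; the endpoint URL and the JSON field name used here are assumptions and may not match the current USGS service exactly.

      # Minimal sketch: querying a point-elevation web service directly over HTTP.
      # The endpoint URL and the JSON field name are assumptions, not guaranteed
      # to match the current USGS Elevation Point Query Service exactly.
      import requests

      EPQS_URL = "https://epqs.nationalmap.gov/v1/json"  # assumed endpoint

      params = {"x": -122.33, "y": 47.61, "units": "Meters", "wkid": 4326}
      resp = requests.get(EPQS_URL, params=params, timeout=30)
      resp.raise_for_status()
      data = resp.json()
      print("elevation (m):", data.get("value"))  # assumed field name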

  2. Naval Oceanography Portal

    Science.gov Websites

    This web site aims to be accessible and usable, working in accordance with the Web Content Accessibility Guidelines (WCAG v1.0). Access keys are provided so that you can get around the web site using your keyboard, and the goal is to develop a web site that is clear and simple for everybody to use. For validation, we have used XHTML 1.0 and…

  3. Methodology for Localized and Accessible Image Formation and Elucidation

    ERIC Educational Resources Information Center

    Patil, Sandeep R.; Katiyar, Manish

    2009-01-01

    Accessibility is one of the key checkpoints in all software products, applications, and Web sites. Accessibility with digital images has always been a major challenge for the industry. Images form an integral part of certain type of documents and most Web 2.0-compliant Web sites. Individuals challenged with blindness and many dyslexics only make…

  4. Accessibility of School Districts' Web Sites: A Descriptive Study

    ERIC Educational Resources Information Center

    Bray, Marty; Flowers, Claudia; Gibson, Patricia

    2003-01-01

    Many school districts (SDs) use the World Wide Web (WWW or Web) to disseminate a wide variety of information, such as district events, policies, and student information. On-line barriers limit the accessibility of the WWW for persons and students with disabilities and thus can limit their access to vital information.…

  5. Breaking and Fixing Origin-Based Access Control in Hybrid Web/Mobile Application Frameworks

    PubMed Central

    Georgiev, Martin; Jana, Suman; Shmatikov, Vitaly

    2014-01-01

    Hybrid mobile applications (apps) combine the features of Web applications and “native” mobile apps. Like Web applications, they are implemented in portable, platform-independent languages such as HTML and JavaScript. Like native apps, they have direct access to local device resources—file system, location, camera, contacts, etc. Hybrid apps are typically developed using hybrid application frameworks such as PhoneGap. The purpose of the framework is twofold. First, it provides an embedded Web browser (for example, WebView on Android) that executes the app's Web code. Second, it supplies “bridges” that allow Web code to escape the browser and access local resources on the device. We analyze the software stack created by hybrid frameworks and demonstrate that it does not properly compose the access-control policies governing Web code and local code, respectively. Web code is governed by the same origin policy, whereas local code is governed by the access-control policy of the operating system (for example, user-granted permissions in Android). The bridges added by the framework to the browser have the same local access rights as the entire application, but are not correctly protected by the same origin policy. This opens the door to fracking attacks, which allow foreign-origin Web content included into a hybrid app (e.g., ads confined in iframes) to drill through the layers and directly access device resources. Fracking vulnerabilities are generic: they affect all hybrid frameworks, all embedded Web browsers, all bridge mechanisms, and all platforms on which these frameworks are deployed. We study the prevalence of fracking vulnerabilities in free Android apps based on the PhoneGap framework. Each vulnerability exposes sensitive local resources—the ability to read and write contacts list, local files, etc.—to dozens of potentially malicious Web domains. We also analyze the defenses deployed by hybrid frameworks to prevent resource access by foreign-origin Web content and explain why they are ineffectual. We then present NoFrak, a capability-based defense against fracking attacks. NoFrak is platform-independent, compatible with any framework and embedded browser, requires no changes to the code of the existing hybrid apps, and does not break their advertising-supported business model. PMID:25485311

  6. The clinical value of large neuroimaging data sets in Alzheimer's disease.

    PubMed

    Toga, Arthur W

    2012-02-01

    Rapid advances in neuroimaging and cyberinfrastructure technologies have brought explosive growth in the Web-based warehousing, availability, and accessibility of imaging data on a variety of neurodegenerative and neuropsychiatric disorders and conditions. There has been a prolific development and emergence of complex computational infrastructures that serve as repositories of databases and provide critical functionalities such as sophisticated image analysis algorithm pipelines and powerful three-dimensional visualization and statistical tools. The statistical and operational advantages of collaborative, distributed team science in the form of multisite consortia push this approach in a diverse range of population-based investigations. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. Information-Flow-Based Access Control for Web Browsers

    NASA Astrophysics Data System (ADS)

    Yoshihama, Sachiko; Tateishi, Takaaki; Tabuchi, Naoshi; Matsumoto, Tsutomu

    The emergence of Web 2.0 technologies such as Ajax and Mashup has revealed the weakness of the same-origin policy[1], the current de facto standard for the Web browser security model. We propose a new browser security model to allow fine-grained access control in the client-side Web applications for secure mashup and user-generated contents. We propose a browser security model that is based on information-flow-based access control (IBAC) to overcome the dynamic nature of the client-side Web applications and to accurately determine the privilege of scripts in the event-driven programming model.

  8. The Fermi Science Support Center Data Servers and Archive

    NASA Astrophysics Data System (ADS)

    Reustle, Alexander; Fermi Science Support Center

    2018-01-01

    The Fermi Science Support Center (FSSC) provides the scientific community with access to Fermi data and other products. The Gamma-Ray Burst Monitor (GBM) data is stored at NASA's High Energy Astrophysics Science Archive Research Center (HEASARC) and is accessible through their searchable Browse web interface. The Large Area Telescope (LAT) data is distributed through a custom FSSC interface where users can request all photons detected from a region on the sky over a specified time and energy range. Through its website the FSSC also provides planning and scheduling products, such as long and short term observing timelines, spacecraft position and attitude histories, and exposure maps. We present an overview of the different data products provided by the FSSC, how they can be accessed, and statistics on the archive usage since launch.

  9. Web-based, GPU-accelerated, Monte Carlo simulation and visualization of indirect radiation imaging detector performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Han; Sharma, Diksha; Badano, Aldo, E-mail: aldo.badano@fda.hhs.gov

    2014-12-15

    Purpose: Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridMANTIS, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webMANTIS and visualMANTIS, to facilitate the setup of computational experiments via hybridMANTIS. Methods: The visualization tools visualMANTIS and webMANTIS enable the user to control simulation properties through a user interface. In the case of webMANTIS, control via a web browser allows access through mobile devices such as smartphones or tablets. webMANTIS acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. Results: The output consists of point response and pulse-height spectrum, and optical transport statistics generated by hybridMANTIS. The users can download the output images and statistics through a zip file for future reference. In addition, webMANTIS provides a visualization window that displays a few selected optical photon paths as they are transported through the detector columns and allows the user to trace the history of the optical photons. Conclusions: The visualization tools visualMANTIS and webMANTIS provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying input parameters to receiving visual feedback for the model predictions.

  10. The Cardiac Atlas Project--an imaging database for computational modeling and statistical atlases of the heart.

    PubMed

    Fonseca, Carissa G; Backhaus, Michael; Bluemke, David A; Britten, Randall D; Chung, Jae Do; Cowan, Brett R; Dinov, Ivo D; Finn, J Paul; Hunter, Peter J; Kadish, Alan H; Lee, Daniel C; Lima, Joao A C; Medrano-Gracia, Pau; Shivkumar, Kalyanam; Suinesiaputra, Avan; Tao, Wenchao; Young, Alistair A

    2011-08-15

    Integrative mathematical and statistical models of cardiac anatomy and physiology can play a vital role in understanding cardiac disease phenotype and planning therapeutic strategies. However, the accuracy and predictive power of such models are dependent upon the breadth and depth of noninvasive imaging datasets. The Cardiac Atlas Project (CAP) has established a large-scale database of cardiac imaging examinations and associated clinical data in order to develop a shareable, web-accessible, structural and functional atlas of the normal and pathological heart for clinical, research and educational purposes. A goal of CAP is to facilitate collaborative statistical analysis of regional heart shape and wall motion and characterize cardiac function among and within population groups. Three main open-source software components were developed: (i) a database with a web interface; (ii) a modeling client for 3D + time visualization and parametric description of shape and motion; and (iii) open data formats for semantic characterization of models and annotations. The database was implemented using a three-tier architecture utilizing MySQL, JBoss and Dcm4chee, in compliance with the DICOM standard to provide compatibility with existing clinical networks and devices. Parts of Dcm4chee were extended to access image-specific attributes as search parameters. To date, approximately 3000 de-identified cardiac imaging examinations are available in the database. All software components developed by the CAP are open source and are freely available under the Mozilla Public License Version 1.1 (http://www.mozilla.org/MPL/MPL-1.1.txt). Availability: http://www.cardiacatlas.org. Contact: a.young@auckland.ac.nz. Supplementary data are available at Bioinformatics online.

  11. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    ERIC Educational Resources Information Center

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…
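
    A rough sketch of the two tiers in Python might look like the following: one pass over an Apache combined-format access log yields a usage-oriented measure for administrators (page requests per day, with image and stylesheet hits excluded) and a maintenance-oriented measure for the site manager (counts of error status codes). The log format and file name are assumptions.

      # Rough sketch of a two-tiered reading of one web server access log:
      # tier 1 (administrators): page views; tier 2 (site managers): error codes.
      # Assumes the standard Apache "combined" log format.
      import re
      from collections import Counter

      LOG_LINE = re.compile(
          r'(?P<host>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
          r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
      )
      ASSET_SUFFIXES = (".gif", ".jpg", ".png", ".css", ".js", ".ico")

      page_views = Counter()
      error_codes = Counter()

      with open("access.log") as log:
          for line in log:
              m = LOG_LINE.match(line)
              if not m:
                  continue
              day = m.group("date").split(":")[0]          # e.g. 12/Mar/2003
              path = m.group("path")
              status = m.group("status")
              if not path.lower().endswith(ASSET_SUFFIXES):
                  page_views[day] += 1                     # tier 1: library use
              if status.startswith(("4", "5")):
                  error_codes[status] += 1                 # tier 2: maintenance

      print("page views per day:", dict(page_views))
      print("error responses:", dict(error_codes))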

  12. Exploring U.S Cropland - A Web Service based Cropland Data Layer Visualization, Dissemination and Querying System (Invited)

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Han, W.; di, L.

    2010-12-01

    The National Agricultural Statistics Service (NASS) of the USDA produces the Cropland Data Layer (CDL) product, which is a raster-formatted, geo-referenced, U.S. crop-specific land cover classification. These digital data layers are widely used for a variety of applications by universities, research institutions, government agencies, and private industry in climate change studies, environmental ecosystem studies, bioenergy production & transportation planning, environmental health research and agricultural production decision making. The CDL is also used internally by NASS for crop acreage and yield estimation. Like most geospatial data products, the CDL product is only available by CD/DVD delivery or online bulk file downloading via the Natural Resources Conservation Service (NRCS) Geospatial Data Gateway (external users) or in a printed paper map format. There is no online geospatial information access and dissemination, no crop visualization & browsing, no geospatial query capability, nor online analytics. To facilitate the application of this data layer and to help disseminate the data, a Web-service-based CDL interactive map visualization, dissemination, and querying system is proposed. It uses a Web-service-based, service-oriented architecture, adopts open standard geospatial information science technology and OGC specifications and standards, and re-uses functions/algorithms from GeoBrain Technology (developed at George Mason University). This system provides capabilities of on-line geospatial crop information access, query and on-line analytics via interactive maps. It disseminates all data to the decision makers and users via real time retrieval, processing and publishing over the web through standards-based geospatial web services. A CDL region of interest can also be exported directly to Google Earth for mashup or downloaded for use with other desktop applications. This Web-service-based system greatly improves equal-accessibility, interoperability, usability, and data visualization, facilitates crop geospatial information usage, and enables online exploration of US cropland without any client-side software installation. It also greatly reduces the need for paper map and analysis report printing and media usages, and thus enhances low-carbon Agro-geoinformation dissemination for decision support.

  13. Enhancing UCSF Chimera through web services

    PubMed Central

    Huang, Conrad C.; Meng, Elaine C.; Morris, John H.; Pettersen, Eric F.; Ferrin, Thomas E.

    2014-01-01

    Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. PMID:24861624
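    As a separate illustration (not the example Python program referenced in the abstract), the following minimal sketch simply fetches the public service list from the dashboard URL quoted above. The structure of the returned document is not described here, so it is only printed, not parsed.

        # Minimal sketch: query the Opal dashboard's service list (URL taken from the
        # abstract above) and print the raw response.
        import requests

        DASHBOARD = "http://webservices.rbvi.ucsf.edu/opal2/dashboard"

        resp = requests.get(DASHBOARD, params={"command": "serviceList"}, timeout=30)
        resp.raise_for_status()
        print(resp.text)   # list of Opal-wrapped services hosted by the RBVI server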

  14. WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides

    NASA Astrophysics Data System (ADS)

    Ma, Xiuzeng; Li, Yingkui; Bourgeois, Mike; Caffee, Marc; Elmore, David; Granger, Darryl; Muzikar, Paul; Smith, Preston

    2007-06-01

    Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this it is critical to establish an effective, easily accessible and well-defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on nuclide concentrations measured by accelerator mass spectrometry. WebCN for 10Be and 26Al has been completed and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for 36Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open-source PostgreSQL for database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
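    For orientation, the following is a highly simplified sketch of the zero-erosion, constant-production exposure age calculation that tools like WebCN generalize (WebCN additionally models spatial and temporal variation of the cosmic-ray flux, erosion, and other effects). The numeric values are illustrative placeholders, not calibrated production rates.

        # Simplified sketch of the zero-erosion surface exposure age calculation.
        # Numbers below are illustrative placeholders, not calibrated values.
        import math

        def exposure_age(concentration, production_rate, half_life_yr):
            """Exposure age (yr) for constant production and zero erosion:
            N = (P / lam) * (1 - exp(-lam * t))  =>  t = -ln(1 - N*lam/P) / lam
            """
            lam = math.log(2.0) / half_life_yr
            ratio = concentration * lam / production_rate
            if ratio >= 1.0:
                raise ValueError("concentration at or above saturation for this production rate")
            return -math.log(1.0 - ratio) / lam

        # Illustrative 10Be-like example: concentration in atoms/g, production rate in atoms/g/yr.
        age = exposure_age(concentration=5.0e5, production_rate=5.0, half_life_yr=1.39e6)
        print(f"apparent exposure age ~ {age:,.0f} yr")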

  15. A web access script language to support clinical application development.

    PubMed

    O'Kane, K C; McColligan, E E

    1998-02-01

    This paper describes the development of a script language to support the implementation of decentralized, clinical information applications on the World Wide Web (Web). The goal of this work is to facilitate construction of low overhead, fully functional clinical information systems that can be accessed anywhere by low cost Web browsers to search, retrieve and analyze stored patient data. The Web provides a model of network access to databases on a global scale. Although it was originally conceived as a means to exchange scientific documents, Web browsers and servers currently support access to a wide variety of audio, video, graphical and text based data to a rapidly growing community. Access to these services is via inexpensive client software browsers that connect to servers by means of the open architecture of the Internet. In this paper, the design and implementation of a script language that supports the development of low cost, Web-based, distributed clinical information systems for both Internet and intranet use is presented. The language is based on the Mumps language and, consequently, supports many legacy applications with few modifications. Several enhancements, however, have been made to support modern programming practices and the Web interface. The interpreter for the language also supports standalone program execution on Unix, MS-Windows, OS/2 and other operating systems.

  16. Open to All? Nationwide Evaluation of High-Priority Web Accessibility Considerations among Higher Education Websites

    ERIC Educational Resources Information Center

    Kimmons, Royce

    2017-01-01

    This study seeks to evaluate the basic Priority 1 web accessibility of all college and university websites in the US (n = 3141). Utilizing web scraping and automated content analysis, the study establishes that even in the case of high-priority, simple-to-address accessibility requirements, colleges and universities generally fail to make their…

  17. Beyond Section 508: The Spectrum of Legal Requirements for Accessible e-Government Web Sites in the United States

    ERIC Educational Resources Information Center

    Jaeger, Paul T.

    2004-01-01

    In the United States, a number of federal laws establish requirements that electronic government (e-government) information and services be accessible to individuals with disabilities. These laws affect e-government Web sites at the federal, state, and local levels. To this point, research about the accessibility of e-government Web sites has…

  18. An Introduction to Web Accessibility, Web Standards, and Web Standards Makers

    ERIC Educational Resources Information Center

    McHale, Nina

    2011-01-01

    Librarians and libraries have long been committed to providing equitable access to information. In the past decade and a half, the growth of the Internet and the rapid increase in the number of online library resources and tools have added a new dimension to this core duty of the profession: ensuring accessibility of online resources to users with…

  19. Accessibility Trends among Academic Library and Library School Web Sites in the USA and Canada

    ERIC Educational Resources Information Center

    Schmetzke, Axel; Comeaux, David

    2009-01-01

    This paper focuses on the accessibility of North American library and library school Web sites for all users, including those with disabilities. Web accessibility data collected in 2006 are compared to those of 2000 and 2002. The findings of this follow-up study continue to give cause for concern: Despite improvements since 2002, library and…

  20. The GeoDataPortal: A Standards-based Environmental Modeling Data Access and Manipulation Toolkit

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Kunicki, T.; Booth, N.; Suftin, I.; Zoerb, R.; Walker, J.

    2010-12-01

    Environmental modelers from fields of study such as climatology, hydrology, geology, and ecology rely on many data sources and processing methods that are common across these disciplines. Interest in inter-disciplinary, loosely coupled modeling and data sharing is increasing among scientists from the USGS, other agencies, and academia. For example, hydrologic modelers need downscaled climate change scenarios and land cover data summarized for the watersheds they are modeling. Subsequently, ecological modelers are interested in soil moisture information for a particular habitat type as predicted by the hydrologic modeler. The USGS Center for Integrated Data Analytics Geo Data Portal (GDP) project seeks to facilitate this loosely coupled modeling and data sharing through broadly applicable open-source web processing services. These services simplify and streamline the time-consuming and resource-intensive tasks that are barriers to inter-disciplinary collaboration. The GDP framework includes a catalog describing projects, models, data, processes, and how they relate. Using newly introduced data, or sources already known to the catalog, the GDP facilitates access to sub-sets and common derivatives of data in numerous formats on disparate web servers. The GDP performs many of the critical functions needed to summarize data sources into modeling units regardless of scale or volume. A user can specify their analysis zones or modeling units as an Open Geospatial Consortium (OGC) standard Web Feature Service (WFS). Utilities to cache Shapefiles and other common GIS input formats have been developed to aid in making the geometry available for processing via WFS. Dataset access in the GDP relies primarily on the Unidata NetCDF-Java library’s common data model. Data transfer relies on methods provided by Unidata’s Thematic Real-time Environmental Data Distribution System Data Server (TDS). TDS services of interest include the Open-source Project for a Network Data Access Protocol (OPeNDAP) standard for gridded time series, the OGC’s Web Coverage Service for high-density static gridded data, and Unidata’s CDM-remote for point time series. OGC WFS and Sensor Observation Service (SOS) are being explored as mechanisms to serve and access static or time series data attributed to vector geometry. A set of standardized XML-based output formats allows easy transformation into a wide variety of “model-ready” formats. Interested users will have the option of submitting custom transformations to the GDP or transforming the XML output as a post-process. The GDP project aims to support simple, rapid development of thin user interfaces (like web portals) to commonly needed environmental modeling-related data access and manipulation tools. Standalone, service-oriented components of the GDP framework provide the metadata cataloging, data subset access, and spatial-statistics calculations needed to support interdisciplinary environmental modeling.
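    The OPeNDAP access pattern mentioned above can be illustrated with a minimal sketch of server-side subsetting of a gridded dataset. The dataset URL and variable names are hypothetical placeholders, and a netCDF4 build with OPeNDAP (DAP) support is assumed.

        # Minimal sketch of pulling a gridded subset over OPeNDAP, the access pattern
        # a portal such as the GDP builds on. URL and variable names are hypothetical.
        from netCDF4 import Dataset

        URL = "https://example.org/thredds/dodsC/prcp_downscaled.nc"  # hypothetical TDS endpoint

        ds = Dataset(URL)                       # opened lazily; no full download
        prcp = ds.variables["prcp"]             # hypothetical variable: (time, lat, lon)
        subset = prcp[0:12, 100:150, 200:260]   # server-side subsetting of one year, one tile
        print(subset.shape, subset.mean())
        ds.close()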

  1. The MetabolomeExpress Project: enabling web-based processing, analysis and transparent dissemination of GC/MS metabolomics datasets.

    PubMed

    Carroll, Adam J; Badger, Murray R; Harvey Millar, A

    2010-07-14

    Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas-chromatography/mass-spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (eg. metabolite, species, organ/biofluid etc.). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress https://www.metabolome-express.org provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
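    As a rough illustration of the per-metabolite comparisons such re-analysis tools offer through the browser, the following sketch runs an independent-samples t-test across a metabolite matrix. The data are randomly generated for illustration only (samples x metabolites); real analyses would use the processed response statistics described above.

        # Minimal sketch of per-metabolite t-tests between two groups on synthetic data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        control = rng.normal(loc=1.0, scale=0.2, size=(6, 50))     # 6 samples, 50 metabolites
        treated = rng.normal(loc=1.1, scale=0.2, size=(6, 50))

        t_stat, p_val = stats.ttest_ind(treated, control, axis=0)  # one test per metabolite
        for i in np.argsort(p_val)[:5]:
            print(f"metabolite {i:02d}: t = {t_stat[i]:+.2f}, p = {p_val[i]:.3g}")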

  2. EuroPhenome: a repository for high-throughput mouse phenotyping data

    PubMed Central

    Morgan, Hugh; Beck, Tim; Blake, Andrew; Gates, Hilary; Adams, Niels; Debouzy, Guillaume; Leblanc, Sophie; Lengger, Christoph; Maier, Holger; Melvin, David; Meziane, Hamid; Richardson, Dave; Wells, Sara; White, Jacqui; Wood, Joe; de Angelis, Martin Hrabé; Brown, Steve D. M.; Hancock, John M.; Mallon, Ann-Marie

    2010-01-01

    The broad aim of biomedical science in the postgenomic era is to link genomic and phenotype information to allow deeper understanding of the processes leading from genomic changes to altered phenotype and disease. The EuroPhenome project (http://www.EuroPhenome.org) is a comprehensive resource for raw and annotated high-throughput phenotyping data arising from projects such as EUMODIC. EUMODIC is gathering data from the EMPReSSslim pipeline (http://www.empress.har.mrc.ac.uk/) which is performed on inbred mouse strains and knock-out lines arising from the EUCOMM project. The EuroPhenome interface allows the user to access the data via the phenotype or genotype. It also allows the user to access the data in a variety of ways, including graphical display, statistical analysis and access to the raw data via web services. The raw phenotyping data captured in EuroPhenome is annotated by an annotation pipeline which automatically identifies statistically different mutants from the appropriate baseline and assigns ontology terms for that specific test. Mutant phenotypes can be quickly identified using two EuroPhenome tools: PhenoMap, a graphical representation of statistically relevant phenotypes, and mining for a mutant using ontology terms. To assist with data definition and cross-database comparisons, phenotype data is annotated using combinations of terms from biological ontologies. PMID:19933761

  3. Hera: Using NASA Astronomy Data in the Classroom

    NASA Astrophysics Data System (ADS)

    Lochner, James C.; Mitchell, S.; Pence, W. D.

    2006-12-01

    Hera is a free internet-based tool that provides students access to both analysis software and data for studying astronomical objects such as black holes, binary star systems, supernovae, and galaxies. Students use a subset of the same software, and experience the same analysis process, that an astronomer follows in analyzing data obtained from an orbiting satellite observatory. Hera is accompanied by a web-based tutorial which steps students through the science background, procedures for accessing the data, and using the Hera software. The web pages include a lesson plan in which students explore data from a binary star system containing a normal star and a black hole. The objective of the lesson is for students to use plotting, estimation, and statistical techniques to determine the orbital period. Students may then apply these techniques to a number of data sets and draw conclusions on the natures of the systems (for example, students discover that one system is an eclipsing binary). The web page tutorial is self-guided and contains a number of exercises; students can work independently or in groups. Hera has been used with high school students and in introductory astronomy classes in community colleges. This poster describes Hera and its web-based tutorial. We outline the underlying software architecture, the development process, and its testing and classroom applications. We also describe the benefits to students in developing skills which extend basic science and math concepts into real applications.
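    For context, the following is a minimal sketch of the kind of period estimation the lesson asks students to perform, using a Lomb-Scargle periodogram on unevenly sampled count-rate data. The light curve here is synthetic; real data would come from the Hera tools, and the lesson itself may use simpler folding and plotting techniques.

        # Minimal sketch: estimate a period from an unevenly sampled synthetic light curve.
        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(1)
        true_period = 3.2                                   # days (illustrative)
        t = np.sort(rng.uniform(0.0, 60.0, size=400))       # uneven sampling times
        rate = 10.0 + 2.0 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.5, t.size)

        periods = np.linspace(0.5, 10.0, 5000)
        ang_freqs = 2 * np.pi / periods
        power = lombscargle(t, rate - rate.mean(), ang_freqs)

        best = periods[np.argmax(power)]
        print(f"estimated period ~ {best:.2f} days (true value {true_period})")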

  4. Web service activities at the IRIS DMC to support federated and multidisciplinary access

    NASA Astrophysics Data System (ADS)

    Trabant, Chad; Ahern, Timothy K.

    2013-04-01

    At the IRIS Data Management Center (DMC) we have developed a suite of web service interfaces to access our large archive of, primarily seismological, time series data and related metadata. The goals of these web services include providing: a) next-generation and easily used access interfaces for our current users, b) access to data holdings in a form usable for non-seismologists, c) programmatic access to facilitate integration into data processing workflows and d) a foundation for participation in federated data discovery and access systems. To support our current users, our services provide access to the raw time series data and metadata or conversions of the raw data to commonly used formats. Our services also support simple, on-the-fly signal processing options that are common first steps in many workflows. Additionally, high-level data products derived from raw data are available via service interfaces. To support data access by researchers unfamiliar with seismic data we offer conversion of the data to broadly usable formats (e.g. ASCII text) and data processing to convert the data to Earth units. By their very nature, web services are programmatic interfaces. Combined with ubiquitous support for web technologies in programming & scripting languages and support in many computing environments, web services are very well suited for integrating data access into data processing workflows. As programmatic interfaces that can return data in both discipline-specific and broadly usable formats, our services are also well suited for participation in federated and brokered systems either specific to seismology or multidisciplinary. Working within the International Federation of Digital Seismograph Networks, the DMC collaborated on the specification of standardized web service interfaces for use at any seismological data center. These data access interfaces, when supported by multiple data centers, will form a foundation on which to build discovery and access mechanisms for data sets spanning multiple centers. To promote the adoption of these standardized services the DMC has developed portable implementations of the software needed to host these interfaces, minimizing the work required at each data center. Within the COOPEUS project framework, the DMC is working with EU partners to install web services implementations at multiple data centers in Europe.
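    The programmatic access described above can be sketched as a plain HTTP request against an FDSN-style dataselect web service. The endpoint below follows the FDSN web service convention used by the IRIS DMC; the network/station/channel and time window are illustrative only and may return no data.

        # Minimal sketch of requesting raw time series from an FDSN-style web service.
        import requests

        ENDPOINT = "https://service.iris.edu/fdsnws/dataselect/1/query"

        params = {
            "net": "IU", "sta": "ANMO", "loc": "00", "cha": "BHZ",   # illustrative channel
            "starttime": "2010-02-27T06:30:00",
            "endtime": "2010-02-27T07:30:00",
        }

        resp = requests.get(ENDPOINT, params=params, timeout=120)
        resp.raise_for_status()

        with open("waveform.mseed", "wb") as fh:      # raw time series in miniSEED format
            fh.write(resp.content)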

  5. Using a web-based application to define the accuracy of diagnostic tests when the gold standard is imperfect.

    PubMed

    Lim, Cherry; Wannapinij, Prapass; White, Lisa; Day, Nicholas P J; Cooper, Ben S; Peacock, Sharon J; Limmathurotsakul, Direk

    2013-01-01

    Estimates of the sensitivity and specificity for new diagnostic tests based on evaluation against a known gold standard are imprecise when the accuracy of the gold standard is imperfect. Bayesian latent class models (LCMs) can be helpful under these circumstances, but the necessary analysis requires expertise in computational programming. Here, we describe open-access web-based applications that allow non-experts to apply Bayesian LCMs to their own data sets via a user-friendly interface. Applications for Bayesian LCMs were constructed on a web server using R and WinBUGS programs. The models provided (http://mice.tropmedres.ac) include two Bayesian LCMs: the two-tests in two-population model (Hui and Walter model) and the three-tests in one-population model (Walter and Irwig model). Both models are available with simplified and advanced interfaces. In the former, all settings for Bayesian statistics are fixed as defaults. Users input their data set into a table provided on the webpage. Disease prevalence and accuracy of diagnostic tests are then estimated using the Bayesian LCM, and provided on the web page within a few minutes. With the advanced interfaces, experienced researchers can modify all settings in the models as needed. These settings include correlation among diagnostic test results and prior distributions for all unknown parameters. The web pages provide worked examples with both models using the original data sets presented by Hui and Walter in 1980, and by Walter and Irwig in 1988. We also illustrate the utility of the advanced interface using the Walter and Irwig model on a data set from a recent melioidosis study. The results obtained from the web-based applications were comparable to those published previously. The newly developed web-based applications are open-access and provide an important new resource for researchers worldwide to evaluate new diagnostic tests.
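    The following sketch is not the Bayesian latent class model itself (which the applications above fit with WinBUGS); it only illustrates why the correction matters, by computing the apparent accuracy of a new test evaluated against an imperfect gold standard under an assumed conditional independence between the two tests. All numbers are illustrative.

        # Minimal sketch: bias in apparent accuracy caused by an imperfect gold standard,
        # assuming the new test and the gold standard are conditionally independent.
        def apparent_accuracy(prev, se_new, sp_new, se_gold, sp_gold):
            # Joint probabilities of (new test result, gold standard result)
            both_pos = prev * se_new * se_gold + (1 - prev) * (1 - sp_new) * (1 - sp_gold)
            gold_pos = prev * se_gold + (1 - prev) * (1 - sp_gold)
            both_neg = prev * (1 - se_new) * (1 - se_gold) + (1 - prev) * sp_new * sp_gold
            gold_neg = 1 - gold_pos
            return both_pos / gold_pos, both_neg / gold_neg   # apparent Se, apparent Sp

        # A truly excellent new test (Se = Sp = 0.99) judged against an imperfect gold
        # standard (Se = 0.90, Sp = 0.95) at 20% prevalence looks considerably worse:
        se_app, sp_app = apparent_accuracy(0.20, 0.99, 0.99, 0.90, 0.95)
        print(f"apparent sensitivity {se_app:.3f}, apparent specificity {sp_app:.3f}")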

  6. A web-based program to increase knowledge and reduce cigarette and nargila smoking among Arab university students in Israel: mixed-methods study to test acceptability.

    PubMed

    Essa-Hadad, Jumanah; Linn, Shai; Rafaeli, Sheizaf

    2015-02-20

    Among Arab citizens in Israel, cigarette and nargila (hookah, waterpipe) smoking is a serious public health problem, particularly among the young adult population. With the dramatic increase of Internet and computer use among Arab college and university students, a Web-based program may provide an easy, accessible tool to reduce smoking rates without heavy resource demands required by traditional methods. The purpose of this research was to examine the acceptability and feasibility of a pilot Web-based program that provides tailored feedback to increase smoking knowledge and reduce cigarette and nargila smoking behaviors among Arab college/university students in Israel. A pilot Web-based program was developed, consisting of a self-administered questionnaire and feedback system on cigarette and nargila smoking. Arab university students were recruited to participate in a mixed-methods study, using both quantitative (pre-/posttest study design) and qualitative tools. A posttest was implemented at 1 month following participation in the intervention to assess any changes in smoking knowledge and behaviors. Focus group sessions were implemented to assess acceptability and preferences related to the Web-based program. A total of 225 participants-response rate of 63.2% (225/356)-completed the intervention at baseline and at 1-month poststudy, and were used for the comparative analysis. Statistically significant reductions in nargila smoking among participants (P=.001) were found. The intervention did not result in reductions in cigarette smoking. However, the tailored Web intervention resulted in statistically significant increases in the intention to quit smoking (P=.021). No statistically significant increases in knowledge were seen at 1-month poststudy. Participants expressed high satisfaction with the intervention and 93.8% (211/225) of those who completed the intervention at both time intervals reported that they would recommend the program to their friends, indicating excellent acceptability and feasibility of the intervention. This was further emphasized in the focus group sessions. A tailored Web-based program may be a promising tool to reduce nargila smoking among Arab university students in Israel. The tailored Web intervention was not successful at significantly reducing cigarette smoking or increasing knowledge. However, the intervention did increase participants' intention to quit smoking. Participants considered the Web-based tool to be an interesting, feasible, and highly acceptable strategy. ISRCTN registry ISRCTN59207794; http://www.isrctn.com/ISRCTN59207794 (Archived by WebCite at http://www.webcitation.org/6VkYOBNOJ).

  7. A Web-Based Program to Increase Knowledge and Reduce Cigarette and Nargila Smoking Among Arab University Students in Israel: Mixed-Methods Study to Test Acceptability

    PubMed Central

    Linn, Shai; Rafaeli, Sheizaf

    2015-01-01

    Background Among Arab citizens in Israel, cigarette and nargila (hookah, waterpipe) smoking is a serious public health problem, particularly among the young adult population. With the dramatic increase of Internet and computer use among Arab college and university students, a Web-based program may provide an easy, accessible tool to reduce smoking rates without heavy resource demands required by traditional methods. Objective The purpose of this research was to examine the acceptability and feasibility of a pilot Web-based program that provides tailored feedback to increase smoking knowledge and reduce cigarette and nargila smoking behaviors among Arab college/university students in Israel. Methods A pilot Web-based program was developed, consisting of a self-administered questionnaire and feedback system on cigarette and nargila smoking. Arab university students were recruited to participate in a mixed-methods study, using both quantitative (pre-/posttest study design) and qualitative tools. A posttest was implemented at 1 month following participation in the intervention to assess any changes in smoking knowledge and behaviors. Focus group sessions were implemented to assess acceptability and preferences related to the Web-based program. Results A total of 225 participants—response rate of 63.2% (225/356)—completed the intervention at baseline and at 1-month poststudy, and were used for the comparative analysis. Statistically significant reductions in nargila smoking among participants (P=.001) were found. The intervention did not result in reductions in cigarette smoking. However, the tailored Web intervention resulted in statistically significant increases in the intention to quit smoking (P=.021). No statistically significant increases in knowledge were seen at 1-month poststudy. Participants expressed high satisfaction with the intervention and 93.8% (211/225) of those who completed the intervention at both time intervals reported that they would recommend the program to their friends, indicating excellent acceptability and feasibility of the intervention. This was further emphasized in the focus group sessions. Conclusions A tailored Web-based program may be a promising tool to reduce nargila smoking among Arab university students in Israel. The tailored Web intervention was not successful at significantly reducing cigarette smoking or increasing knowledge. However, the intervention did increase participants’ intention to quit smoking. Participants considered the Web-based tool to be an interesting, feasible, and highly acceptable strategy. Trial Registration Trial Registration: ISRCTN registry ISRCTN59207794; http://www.isrctn.com/ISRCTN59207794 (Archived by WebCite at http://www.webcitation.org/6VkYOBNOJ). PMID:25707034

  8. Simple Enough--Even for Web Virgins: Lisa Mitten's Access to Native American Web Sites. Web Site Review Essay.

    ERIC Educational Resources Information Center

    Belgarde, Mary Jiron

    1998-01-01

    A mixed-blood Mohawk urban Indian and university librarian, Lisa Mitten provides access to Web sites with solid information about American Indians. Links are provided to 10 categories--Native nations, Native organizations, Indian education, Native media, powwows and festivals, Indian music, Native arts, Native businesses, and Indian-oriented home…

  9. A Framework for Transparently Accessing Deep Web Sources

    ERIC Educational Resources Information Center

    Dragut, Eduard Constantin

    2010-01-01

    An increasing number of Web sites expose their content via query interfaces, many of them offering the same type of products/services (e.g., flight tickets, car rental/purchasing). They constitute the so-called "Deep Web". Accessing the content on the Deep Web has been a long-standing challenge for the database community. For a user interested in…

  10. medplot: a web application for dynamic summary and analysis of longitudinal medical data based on R.

    PubMed

    Ahlin, Črt; Stupica, Daša; Strle, Franc; Lusa, Lara

    2015-01-01

    In biomedical studies the patients are often evaluated numerous times and a large number of variables are recorded at each time-point. Data entry and manipulation of longitudinal data can be performed using spreadsheet programs, which usually include some data plotting and analysis capabilities and are straightforward to use, but are not designed for the analyses of complex longitudinal data. Specialized statistical software offers more flexibility and capabilities, but first time users with biomedical background often find its use difficult. We developed medplot, an interactive web application that simplifies the exploration and analysis of longitudinal data. The application can be used to summarize, visualize and analyze data by researchers that are not familiar with statistical programs and whose knowledge of statistics is limited. The summary tools produce publication-ready tables and graphs. The analysis tools include features that are seldom available in spreadsheet software, such as correction for multiple testing, repeated measurement analyses and flexible non-linear modeling of the association of the numerical variables with the outcome. medplot is freely available and open source, it has an intuitive graphical user interface (GUI), it is accessible via the Internet and can be used within a web browser, without the need for installing and maintaining programs locally on the user's computer. This paper describes the application and gives detailed examples describing how to use the application on real data from a clinical study including patients with early Lyme borreliosis.

  11. WebGeocalc and Cosmographia: Modern Tools to Access SPICE Archives

    NASA Astrophysics Data System (ADS)

    Semenov, B. V.; Acton, C. H.; Bachman, N. J.; Ferguson, E. W.; Rose, M. E.; Wright, E. D.

    2017-06-01

    The WebGeocalc (WGC) web client-server tool and the SPICE-enhanced Cosmographia visualization program are two new ways for accessing space mission geometry data provided in the PDS SPICE kernel archives and by mission operational SPICE kernel sets.

  12. Web access and dissemination of Andalusian coastal erosion rates: viewers and standard/filtered map services.

    NASA Astrophysics Data System (ADS)

    Álvarez Francoso, Jose; Prieto Campos, Antonio; Ojeda Zujar, Jose; Guisado-Pintado, Emilia; Pérez Alcántara, Juan Pedro

    2017-04-01

    Access to environmental information via web viewers using map services (OGC or proprietary services) has become more common, since new information sources (orthophotos, LIDAR, GPS) are highly detailed and therefore generate large volumes of data that can barely be disseminated using either analogue (paper maps) or digital (PDF) formats. Moreover, governments and public institutions are concerned about the need to facilitate access to research results and to improve communication about natural hazards to citizens and stakeholders. Ultimately, if adequately disseminated, this information is crucial in decision-making processes and risk management approaches, and could help to increase social awareness of environmental issues (particularly climate change impacts). To address this, two strategies for wide dissemination and communication of the results of beach erosion calculations for the 640 km of the Andalusian coast (South Spain), based on web viewer technology, are presented. Each is oriented to different end users and is thus based on a different methodology. Erosion rates have been calculated at 50 m intervals for different periods (1956-1977-2001-2011) as part of a National Research Project on the spatialisation and web access of coastal vulnerability indicators for the Andalusian region. The first proposal generates WMS services (following OGC standards) that are made available by GeoServer, using a geoviewer client developed with Leaflet. This viewer is designed for the general public (citizens, politicians, etc.) and combines a set of tools that give access to related documents (PDFs) and visualisation tools (Panoramio pictures, geo-localisation with GPS), displayed within a user-friendly interface. Further, the use of WMS services (implemented on GeoServer) provides a detailed semiology (arrows and proportional symbols, using alongshore coastline buffers to represent data) which not only enhances access to erosion rates but also enables multi-scale data representation. The second proposal, intended for technicians and specialists in the field, includes a geoviewer with an innovative profile (including visualisation of time ranges, application of different uncertainty levels to the data, etc.) to fulfil the needs of these users. For its development, a set of JavaScript libraries combined with OpenLayers (or Leaflet) is implemented to guarantee all the functionalities of the basic geoviewer. Further to this, the viewer has been improved by (i) the generation of services on request through the application of a filter in ECQL (Extended Common Query Language), using the GeoServer vendor parameter CQL_FILTER; these dynamic filters allow the end user to predefine the visualised variable, its spatial and temporal domain, a range of specific values and other attributes, thus multiplying the generation of real-time cartography; and (ii) the use of the layer's WFS service, through which the JavaScript application exploits the alphanumeric data to generate related statistics in real time (e.g. mean rates, length of eroded coast, etc.) and interactive graphs (via the Highcharts.js library) which help in the interpretation of beach erosion rates (representing trends and bar diagrams, among others). As a result, two web-based approaches for communicating scientific results to different audiences, with a complete dataset of geo-information, services and functionalities, are implemented. The combination of standardised environmental data with tailor-made exploitation techniques (interactive maps and real-time statistics) ensures correct access to and interpretation of the information.
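    To illustrate the filtered-service idea, the following minimal sketch issues a WMS GetMap request carrying GeoServer's CQL_FILTER vendor parameter. The service URL, layer name and attribute names are hypothetical placeholders, not the project's actual ones.

        # Minimal sketch of a dynamically filtered WMS request using GeoServer's
        # CQL_FILTER vendor parameter; all names below are hypothetical.
        import requests

        WMS_URL = "https://example.org/geoserver/wms"   # hypothetical GeoServer endpoint

        params = {
            "service": "WMS",
            "version": "1.1.1",
            "request": "GetMap",
            "layers": "coast:erosion_rates",             # hypothetical layer
            "srs": "EPSG:4326",
            "bbox": "-7.5,36.0,-1.6,38.8",               # roughly the Andalusian coast
            "width": 1024,
            "height": 512,
            "format": "image/png",
            # Only 2001-2011 rates eroding faster than 0.5 m/yr (hypothetical fields)
            "CQL_FILTER": "period = '2001-2011' AND rate < -0.5",
        }

        resp = requests.get(WMS_URL, params=params, timeout=60)
        resp.raise_for_status()
        with open("erosion_filtered.png", "wb") as fh:
            fh.write(resp.content)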

  13. Dental practice websites: creating a Web presence.

    PubMed

    Miller, Syrene A; Forrest, Jane L

    2002-07-01

    Web technology provides an opportunity for dentists to showcase their practice philosophy, quality of care, office setting, and staff in a creative manner. Having a Website provides a practice with innovative and cost-effective communications and marketing tools for current and potential patients who use the Internet. The main benefits of using a Website to promote one's practice are: making office time more productive, tasks more timely, and follow-up less necessary; engaging patients in an interactive and visual learning process; providing online forms and procedure examples for patients; projecting a competent and current image; and tracking the usage of Web pages. Several options are available when considering the development of a Website. These options range in cost based on customization of the site and ongoing support services, such as site updates, technical assistance, and Web usage statistics. In most cases, Websites are less expensive than advertising in the phone book. Options in creating a Website include building one's own, employing a company that offers Website templates, and employing a company that offers customized sites. These development options and benefits will continue to grow as individuals access the Web and more information and sites become available.

  14. Website Accessibility for Users with Visual Impairment

    ERIC Educational Resources Information Center

    Smith, J. A.; Lind, M. R.

    2010-01-01

    In this web accessibility study of homepages of education departments in post-secondary educational institutions, the 1998 US Section 508 Law regarding webpage accessibility for people with disabilities was addressed. Along with the requirements of this legislation, there are growing demands for web accessibility resulting from age-related visual…

  15. DMINDA: an integrated web server for DNA motif identification and analyses

    PubMed Central

    Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying

    2014-01-01

    DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, which is accessible at http://csbl.bmb.uga.edu/DMINDA/. This web site is freely available to all users and there is no login requirement. This server provides a suite of cis-regulatory motif analysis functions on DNA sequences, which are important to elucidation of the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences along with statistical scores for the predicted motifs derived based on information extracted from a control set, (ii) scanning motif instances of a query motif in provided genomic sequences, (iii) motif comparison and clustering of identified motifs, and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes, and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. PMID:24753419

  16. Software Application Profile: Opal and Mica: open-source software solutions for epidemiological data management, harmonization and dissemination.

    PubMed

    Doiron, Dany; Marcon, Yannick; Fortier, Isabel; Burton, Paul; Ferretti, Vincent

    2017-10-01

    Improving the dissemination of information on existing epidemiological studies and facilitating the interoperability of study databases are essential to maximizing the use of resources and accelerating improvements in health. To address this, Maelstrom Research proposes Opal and Mica, two inter-operable open-source software packages providing out-of-the-box solutions for epidemiological data management, harmonization and dissemination. Opal and Mica are two standalone but inter-operable web applications written in Java, JavaScript and PHP. They provide web services and modern user interfaces to access them. Opal allows users to import, manage, annotate and harmonize study data. Mica is used to build searchable web portals disseminating study and variable metadata. When used conjointly, Mica users can securely query and retrieve summary statistics on geographically dispersed Opal servers in real-time. Integration with the DataSHIELD approach allows conducting more complex federated analyses involving statistical models. Opal and Mica are open-source and freely available at [www.obiba.org] under a General Public License (GPL) version 3, and the metadata models and taxonomies that accompany them are available under a Creative Commons licence. © The Author 2017; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association

  17. DHLAS: A web-based information system for statistical genetic analysis of HLA population data.

    PubMed

    Thriskos, P; Zintzaras, E; Germenis, A

    2007-03-01

    DHLAS (database HLA system) is a user-friendly, web-based information system for the analysis of human leukocyte antigen (HLA) data from population studies. DHLAS has been developed using Java and the R system; it runs on a Java Virtual Machine and its web-based user interface is powered by the servlet engine Tomcat. It utilizes Struts, a Model-View-Controller framework, and uses several GNU packages to perform several of its tasks. The database engine it relies upon for fast access is MySQL, but others can be used as well. The system estimates metrics, performs statistical testing and produces graphs required for HLA population studies: (i) Hardy-Weinberg equilibrium (calculated using both asymptotic and exact tests), (ii) genetic distances (Euclidean or Nei), (iii) phylogenetic trees using the unweighted pair group method with averages and the neighbor-joining method, (iv) linkage disequilibrium (pairwise and overall, including variance estimations), (v) haplotype frequencies (estimated using the expectation-maximization algorithm) and (vi) discriminant analysis. The main merit of DHLAS is the incorporation of a database; thus, the data can be stored and manipulated along with integrated genetic data analysis procedures. In addition, it has an open architecture allowing the inclusion of other functions and procedures.
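    As a small illustration of the first item in that list, the following sketch runs the asymptotic (chi-square) Hardy-Weinberg equilibrium test for a single biallelic locus. HLA loci are typically highly multi-allelic, so this is a simplification, and the genotype counts are purely illustrative.

        # Minimal sketch of the asymptotic Hardy-Weinberg equilibrium test, one biallelic locus.
        import numpy as np
        from scipy import stats

        n_AA, n_Aa, n_aa = 45, 38, 17                       # observed genotype counts
        n = n_AA + n_Aa + n_aa
        p = (2 * n_AA + n_Aa) / (2 * n)                     # allele frequency of A
        q = 1 - p

        observed = np.array([n_AA, n_Aa, n_aa])
        expected = n * np.array([p ** 2, 2 * p * q, q ** 2])

        # One degree of freedom: 3 classes - 1 - 1 estimated allele frequency
        chi2, p_value = stats.chisquare(observed, expected, ddof=1)
        print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")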

  18. BRepertoire: a user-friendly web server for analysing antibody repertoire data.

    PubMed

    Margreitter, Christian; Lu, Hui-Chun; Townsend, Catherine; Stewart, Alexander; Dunn-Walters, Deborah K; Fraternali, Franca

    2018-04-14

    Antibody repertoire analysis by high throughput sequencing is now widely used, but a persisting challenge is enabling immunologists to explore their data to discover discriminating repertoire features for their own particular investigations. Computational methods are necessary for large-scale evaluation of antibody properties. We have developed BRepertoire, a suite of user-friendly web-based software tools for large-scale statistical analyses of repertoire data. The software is able to use data preprocessed by IMGT, and performs statistical and comparative analyses with versatile plotting options. BRepertoire has been designed to operate in various modes, for example analysing sequence-specific V(D)J gene usage, discerning physico-chemical properties of the CDR regions and clustering of clonotypes. Those analyses are performed on the fly by a number of R packages and are deployed by a shiny web platform. The user can download the analysed data in different table formats and save the generated plots as image files ready for publication. We believe BRepertoire to be a versatile analytical tool that complements experimental studies of immune repertoires. To illustrate the server's functionality, we show use cases including differential gene usage in a vaccination dataset and analysis of CDR3H properties in old and young individuals. The server is accessible under http://mabra.biomed.kcl.ac.uk/BRepertoire.

  19. Web tools for predictive toxicology model building.

    PubMed

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history already. Powered by the advances in the Internet technologies, the current generation of web systems are starting to expand into areas, traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web is compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms, offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  20. Content and Accessibility of Shoulder and Elbow Fellowship Web Sites in the United States.

    PubMed

    Young, Bradley L; Oladeji, Lasun O; Cichos, Kyle; Ponce, Brent

    2016-01-01

    Increasing numbers of training physicians are using the Internet to gather information about graduate medical education programs. The content and accessibility of web sites that provide this information have been demonstrated to influence applicants' decisions. Assessments of orthopedic fellowship web sites including sports medicine, pediatrics, hand and spine have found varying degrees of accessibility and material. The purpose of this study was to evaluate the accessibility and content of the American Shoulder and Elbow Surgeons (ASES) fellowship web sites (SEFWs). A complete list of ASES programs was obtained from a database on the ASES web site. The accessibility of each SEFWs was assessed by the existence of a functioning link found in the database and through Google®. Then, the following content areas of each SEFWs were evaluated: fellow education, faculty/previous fellow information, and recruitment. At the time of the study, 17 of the 28 (60.7%) ASES programs had web sites accessible through Google®, and only five (17.9%) had functioning links in the ASES database. Nine programs lacked a web site. Concerning web site content, the majority of SEFWs contained information regarding research opportunities, research requirements, case descriptions, meetings and conferences, teaching responsibilities, attending faculty, the application process, and a program description. Fewer than half of the SEFWs provided information regarding rotation schedules, current fellows, previous fellows, on-call expectations, journal clubs, medical school of current fellows, residency of current fellows, employment of previous fellows, current research, and previous research. A large portion of ASES fellowship programs lacked functioning web sites, and even fewer provided functioning links through the ASES database. Valuable information for potential applicants was largely inadequate across present SEFWs.

  1. National Data Buoy Center

    Science.gov Websites


  2. NDBC DART® Program

    Science.gov Websites


  3. An Investigation into Web Content Accessibility Guideline Conformance for an Aging Population

    ERIC Educational Resources Information Center

    Curran, Kevin; Robinson, David

    2007-01-01

    Poor web site design can cause difficulties for specific groups of users. By applying the Web Content Accessibility Guidelines to a web site, the amount of possible users who can successfully view the content of that site will increase, especially for those who are in the disabled and older adult categories of online users. Older adults are coming…

  4. Enhancing UCSF Chimera through web services.

    PubMed

    Huang, Conrad C; Meng, Elaine C; Morris, John H; Pettersen, Eric F; Ferrin, Thomas E

    2014-07-01

    Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. Web Accessibility of the Higher Education Institute Websites Based on the World Wide Web Consortium and Section 508 of the Rehabilitation Act

    ERIC Educational Resources Information Center

    Alam, Najma H.

    2014-01-01

    The problem observed in this study is the low level of compliance of higher education website accessibility with Section 508 of the Rehabilitation Act of 1973. The literature supports the non-compliance of websites with the federal policy in general. Studies were performed to analyze the accessibility of fifty-four sample web pages using automated…

  6. Monitoring Web Site Usage of e-Bug: A Hygiene and Antibiotic Awareness Resource for Children

    PubMed Central

    Rajapandian, Vijayamaharaj; Eley, Charlotte V; Hoekstra, Beverley A; Lecky, Donna M; McNulty, Cliodna AM

    2015-01-01

    Background e-Bug is an educational resource which teaches children and young people about microbes, hygiene, infection, and prudent antibiotic use. The e-Bug resources are available in over 22 different languages and they are used widely across the globe. The resources can be accessed from the e-Bug website. Objective The objective of this study was to analyze the usage of the e-Bug website in order to understand how users access the website, where and when they access the site, and to review variation in use across the different areas of the site. Methods The usage statistics for the e-Bug website were monitored by Google Analytics between September 2010 and August 2013. Results The statistics show the website had over 324,000 visits during the three years, from just under 250,000 visitors, with the number of visitors increasing year after year. Visitors accessed the website from 211 different countries, with more than 267,000 documents downloaded. The majority of visitors were from the United Kingdom and visited the English website, although countries such as France and Portugal were also frequent visitors. Conclusions These website statistics confirm that e-Bug is frequently used across Europe and highlight that e-Bug use has expanded across the world. The findings from this report will be used to inform future modifications or updates to the materials, as well as the development of new educational resources. PMID:26567127

  7. Web-based GIS for spatial pattern detection: application to malaria incidence in Vietnam.

    PubMed

    Bui, Thanh Quang; Pham, Hai Minh

    2016-01-01

    There is great concern about how to build an interoperable health information system spanning public health and health information technology within the development of public information and health surveillance programmes. Technically, some major issues remain regarding health data visualization, spatial processing of health data, health information dissemination, data sharing and the access of local communities to health information. In combination with GIS, we propose a technical framework for web-based health data visualization and spatial analysis. Data were collected from open map servers and geocoded with the Open Data Kit package and data geocoding tools. The web-based system is designed on open-source frameworks and libraries. The system provides a web-based analyst tool for pattern detection through three spatial tests: nearest neighbour, the K function, and spatial autocorrelation. The result is a web-based GIS through which end users can detect disease patterns by selecting an area and spatial test parameters, and contribute findings to managers and decision makers. The end users can be health practitioners, educators, local communities, health sector authorities and decision makers. This web-based system allows for the improvement of health-related services to public sector users as well as citizens in a secure manner. The combination of spatial statistics and web-based GIS can be a solution that helps empower health practitioners in direct and specific intersectional actions, thus providing for better analysis, control and decision-making.
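    One of the pattern-detection statistics mentioned above, the nearest-neighbour test, can be sketched with a Clark-Evans-style ratio on a point pattern of case locations. The coordinates below are randomly generated within a unit square for illustration only, and edge effects are ignored.

        # Minimal sketch of a nearest-neighbour ratio for a point pattern of case locations.
        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(2)
        points = rng.uniform(0.0, 1.0, size=(200, 2))        # illustrative case locations
        area = 1.0                                            # study-area size (unit square)

        tree = cKDTree(points)
        dists, _ = tree.query(points, k=2)                    # k=1 is the point itself
        mean_observed = dists[:, 1].mean()

        density = len(points) / area
        mean_expected = 0.5 / np.sqrt(density)                # expectation under complete spatial randomness

        ratio = mean_observed / mean_expected
        print(f"nearest-neighbour ratio R = {ratio:.2f} "
              "(R < 1 suggests clustering, R > 1 dispersion)")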

  8. A Privacy Access Control Framework for Web Services Collaboration with Role Mechanisms

    NASA Astrophysics Data System (ADS)

    Liu, Linyuan; Huang, Zhiqiu; Zhu, Haibin

    With the popularity of Internet technology, web services are becoming the most promising paradigm for distributed computing. This increased use of web services has meant that more and more personal information of consumers is being shared with web service providers, leading to the need to guarantee the privacy of consumers. This paper proposes a role-based privacy access control framework for Web services collaboration; it utilizes roles to specify the privacy privileges of services, and considers the impact of services' historic experience in playing roles on their reputation degree. Compared to traditional privacy access control approaches, this framework can make fine-grained authorization decisions, thus efficiently protecting consumers' privacy.
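    The following toy sketch illustrates the general idea of role-based privacy authorization (a service may read only the consumer attributes granted to the role it currently plays). It is illustrative only and is not the framework defined in the paper; in particular, reputation-based role assignment is reduced here to a single hypothetical threshold.

        # Schematic sketch of role-based privacy authorization with a trust threshold.
        ROLE_PRIVILEGES = {
            "shipping_agent": {"name", "address"},
            "payment_agent": {"name", "card_token"},
            "marketing_agent": set(),                 # no personal attributes granted
        }

        MIN_REPUTATION = 0.7                          # hypothetical trust threshold

        def authorize(service_role, service_reputation, requested_attrs):
            """Return the subset of requested attributes the service may access."""
            if service_reputation < MIN_REPUTATION:
                return set()                          # untrusted services get nothing
            return set(requested_attrs) & ROLE_PRIVILEGES.get(service_role, set())

        granted = authorize("shipping_agent", 0.85, {"name", "address", "card_token"})
        print(granted)   # {'name', 'address'} - card_token is outside the role's privileges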

  9. Web accessibility support for visually impaired users using link content analysis.

    PubMed

    Iwata, Hajime; Kobayashi, Naofumi; Tachibana, Kenji; Shirogane, Junko; Fukazawa, Yoshiaki

    2013-12-01

    Web pages are used for a variety of purposes. End users must understand dynamically changing content and sequentially follow page links to find desired material, requiring significant time and effort. However, for visually impaired users using screen readers, it can be difficult to find links to web pages when link text and alternative text descriptions are inappropriate. Our method supports the discovery of content by analyzing 8 categories of link types, and allows visually impaired users to be aware of the content represented by links in advance. This facilitates end users' access to necessary information on web pages. Our method of classifying web page links is therefore effective as a means of evaluating accessibility.

  10. The potential of Web-based interventions for heart disease self-management: a mixed methods investigation.

    PubMed

    Kerr, Cicely; Murray, Elizabeth; Noble, Lorraine; Morris, Richard; Bottomley, Christian; Stevenson, Fiona; Patterson, David; Peacock, Richard; Turner, Indra; Jackson, Keith; Nazareth, Irwin

    2010-12-02

    Existing initiatives to support patient self-management of heart disease do not appear to be reaching patients most in need. Providing self-management programs over the Internet (web-based interventions) might help reduce health disparities by reaching a greater number of patients. However, it is unclear whether they can achieve this goal and whether their effectiveness might be limited by the digital divide. To explore the effectiveness of a web-based intervention in decreasing inequalities in access to self-management support in patients with coronary heart disease (CHD). Quantitative and qualitative methods were used to explore use made of a web-based intervention over a period of 9 months. Patients with CHD, with or without home Internet access or previous experience using the Internet, were recruited from primary care centers in diverse socioeconomic and ethnic areas of North London, UK. Patients without home Internet were supported in using the intervention at public Internet services. Only 10.6% of eligible patients chose to participate (N=168). Participants were predominantly Caucasian well-educated men, with greater proportions of male and younger CHD patients among participants than were registered at participating primary care practices. Most had been diagnosed with CHD a number of years prior to the study. Relatively few had been newly diagnosed or had experienced a cardiac event in the previous 5 years. Most had home Internet access and prior experience using the Internet. A greater use of the intervention was observed in older participants (for each 5-year age increase, OR 1.25 for no, low or high intervention use, 95% CI, 1.06-1.47) and in those that had home Internet access and prior Internet experience (OR 3.74, 95% CI, 1.52-9.22). Less use was observed in participants that had not recently experienced a cardiac event or diagnosis (≥ 5 years since cardiac event or diagnosis; OR 0.69, 95% CI, 0.50-0.95). Gender and level of education were not statistically related to level of use of the intervention. Data suggest that a recent cardiac event or diagnosis increased the need for information and advice in participants. However, participants that had been diagnosed several years ago showed little need for information and support. The inconvenience of public Internet access was a barrier for participants without home Internet access. The use of the intervention by participants with little or no Internet experience was limited by a lack of confidence with computers and discomfort with asking for assistance. It was also influenced by the level of participant need for information and by their perception of the intervention. The availability of a web-based intervention, with support for use at home or through public Internet services, did not result in a large number or all types of patients with CHD using the intervention for self-management support. The effectiveness of web-based interventions for patients with chronic diseases remains a significant challenge.

  11. Semantic Annotations and Querying of Web Data Sources

    NASA Astrophysics Data System (ADS)

    Hornung, Thomas; May, Wolfgang

    A large part of the Web, actually holding a significant portion of the useful information throughout the Web, consists of views on hidden databases, provided by numerous heterogeneous interfaces that are partly human-oriented via Web forms ("Deep Web"), and partly based on Web Services (only machine accessible). In this paper we present an approach for annotating these sources in a way that makes them citizens of the Semantic Web. We illustrate how queries can be stated in terms of the ontology, and how the annotations are used to select and access appropriate sources and to answer the queries.

  12. P-MartCancer: A New Online Platform to Access CPTAC Datasets and Enable New Analyses | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The November 1, 2017 issue of Cancer Research is dedicated to a collection of computational resource papers in genomics, proteomics, animal models, imaging, and clinical subjects for non-bioinformaticists looking to incorporate computing tools into their work. Scientists at Pacific Northwest National Laboratory have developed P-MartCancer, an open, web-based interactive software tool that enables statistical analyses of peptide or protein data generated from mass-spectrometry (MS)-based global proteomics experiments.

  13. Model collaboration: university library system and rehabilitation research team to advance telepractice knowledge.

    PubMed

    Deliyannides, Timothy S; Gabler, Vanessa

    2012-01-01

    This Publisher's Report describes the collaboration between a university library system's scholarly communication and publishing office and a federally funded research team, the Rehabilitation Engineering Research Center (RERC) on Telerehabilitation. This novel interdisciplinary collaboration engages librarians, information technologists, publishing professionals, clinicians, policy experts, and engineers and has produced a new Open Access journal, International Journal of Telerehabilitation, and a developing, interactive web-based product dedicated to disseminating information about telerehabilitation. Readership statistics are presented for March 1, 2011 - February 29, 2012.

  14. ShortRead: a bioconductor package for input, quality assessment and exploration of high-throughput sequence data

    PubMed Central

    Morgan, Martin; Anders, Simon; Lawrence, Michael; Aboyoun, Patrick; Pagès, Hervé; Gentleman, Robert

    2009-01-01

    Summary: ShortRead is a package for input, quality assessment, manipulation and output of high-throughput sequencing data. ShortRead is provided in the R and Bioconductor environments, allowing ready access to additional facilities for advanced statistical analysis, data transformation, visualization and integration with diverse genomic resources. Availability and Implementation: This package is implemented in R and available at the Bioconductor web site; the package contains a ‘vignette’ outlining typical work flows. Contact: mtmorgan@fhcrc.org PMID:19654119

  15. Interfaces to PeptideAtlas: a case study of standard data access systems

    PubMed Central

    Handcock, Jeremy; Robinson, Thomas; Deutsch, Eric W.; Boyle, John

    2012-01-01

    Access to public data sets is important to the scientific community as a resource to develop new experiments or validate new data. Projects such as the PeptideAtlas, Ensembl and The Cancer Genome Atlas (TCGA) offer both access to public data and a repository to share their own data. Access to these data sets is often provided through a web page form and a web service API. Access technologies based on web protocols (e.g. http) have been in use for over a decade and are widely adopted across the industry for a variety of functions (e.g. search, commercial transactions, and social media). Each architecture adapts these technologies to provide users with tools to access and share data. Both commonly used web service technologies (e.g. REST and SOAP), and custom-built solutions over HTTP are utilized in providing access to research data. Providing multiple access points ensures that the community can access the data in the simplest and most effective manner for their particular needs. This article examines three common access mechanisms for web accessible data: BioMart, caBIG, and Google Data Sources. These are illustrated by implementing each over the PeptideAtlas repository and reviewed for their suitability based on specific usages common to research. BioMart, Google Data Sources, and caBIG are each suitable for certain uses. The tradeoffs made in the development of the technology are dependent on the uses each was designed for (e.g. security versus speed). This means that an understanding of specific requirements and tradeoffs is necessary before selecting the access technology. PMID:22941959

  16. A web platform for landuse, climate, demography, hydrology and beach erosion in the Black Sea catchment

    PubMed Central

    Lehmann, Anthony; Guigoz, Yaniss; Ray, Nicolas; Mancosu, Emanuele; Abbaspour, Karim C.; Rouholahnejad Freund, Elham; Allenbach, Karin; De Bono, Andrea; Fasel, Marc; Gago-Silva, Ana; Bär, Roger; Lacroix, Pierre; Giuliani, Gregory

    2017-01-01

    The Black Sea catchment (BSC) is facing important demographic, climatic and landuse changes that may increase pollution, vulnerability and scarcity of water resources, as well as beach erosion through sea level rise. Limited access to reliable time-series monitoring data from environmental, statistical, and socio-economic sources is a major barrier to policy development and decision-making. To address these issues, a web-based platform was developed to enable discovery and access to key environmental information for the region. This platform covers: landuse, climate, and demographic scenarios; hydrology and related water vulnerability and scarcity; as well as beach erosion. Each data set has been obtained with state-of-the-art modelling tools from available monitoring data using appropriate validation methods. These analyses were conducted using global and regional data sets. The data sets are intended for national to regional assessments, for instance for prioritizing environmental protection projects and investments. Together they form a unique set of information, which lays out plausible future change scenarios for the BSC, both for scientific and policy purposes. PMID:28675383

  17. A web platform for landuse, climate, demography, hydrology and beach erosion in the Black Sea catchment.

    PubMed

    Lehmann, Anthony; Guigoz, Yaniss; Ray, Nicolas; Mancosu, Emanuele; Abbaspour, Karim C; Rouholahnejad Freund, Elham; Allenbach, Karin; De Bono, Andrea; Fasel, Marc; Gago-Silva, Ana; Bär, Roger; Lacroix, Pierre; Giuliani, Gregory

    2017-07-04

    The Black Sea catchment (BSC) is facing important demographic, climatic and landuse changes that may increase pollution, vulnerability and scarcity of water resources, as well as beach erosion through sea level rise. Limited access to reliable time-series monitoring data from environmental, statistical, and socio-economic sources is a major barrier to policy development and decision-making. To address these issues, a web-based platform was developed to enable discovery and access to key environmental information for the region. This platform covers: landuse, climate, and demographic scenarios; hydrology and related water vulnerability and scarcity; as well as beach erosion. Each data set has been obtained with state-of-the-art modelling tools from available monitoring data using appropriate validation methods. These analyses were conducted using global and regional data sets. The data sets are intended for national to regional assessments, for instance for prioritizing environmental protection projects and investments. Together they form a unique set of information, which lays out plausible future change scenarios for the BSC, both for scientific and policy purposes.

  18. Hyper-Fit: Fitting Linear Models to Multidimensional Data with Multivariate Gaussian Uncertainties

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Obreschkow, D.

    2015-09-01

    Astronomical data is often uncertain with errors that are heteroscedastic (different for each data point) and covariant between different dimensions. Assuming that a set of D-dimensional data points can be described by a (D - 1)-dimensional plane with intrinsic scatter, we derive the general likelihood function to be maximised to recover the best fitting model. Alongside the mathematical description, we also release the hyper-fit package for the R statistical language (http://github.com/asgr/hyper.fit) and a user-friendly web interface for online fitting (http://hyperfit.icrar.org). The hyper-fit package offers access to a large number of fitting routines, includes visualisation tools, and is fully documented in an extensive user manual. Most of the hyper-fit functionality is accessible via the web interface. In this paper, we include applications to toy examples and to real astronomical data from the literature: the mass-size, Tully-Fisher, Fundamental Plane, and mass-spin-morphology relations. In most cases, the hyper-fit solutions are in good agreement with published values, but uncover more information regarding the fitted model.
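    To give a flavour of the quantity being maximised in this kind of fit, the following LaTeX sketch writes down a generic Gaussian likelihood for a plane with unit normal n and offset c fitted to points x_i that carry covariance matrices C_i and an intrinsic orthogonal scatter. The notation and normalisation here are illustrative assumptions, not the exact expressions used by the hyper.fit package.

    ```latex
    % Hedged sketch: log-likelihood for a (D-1)-dimensional plane
    % \hat{n} \cdot x = c with intrinsic orthogonal scatter \sigma_\perp,
    % fitted to points x_i with per-point covariance matrices C_i.
    \ln \mathcal{L}
      = -\frac{1}{2} \sum_{i=1}^{N} \left[
          \ln\!\bigl( 2\pi \, (\sigma_\perp^{2} + \hat{n}^{\mathsf{T}} C_i \hat{n}) \bigr)
          + \frac{(\hat{n} \cdot x_i - c)^{2}}
                 {\sigma_\perp^{2} + \hat{n}^{\mathsf{T}} C_i \hat{n}}
        \right]
    ```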

  19. Guidelines for Making Web Content Accessible to All Users

    ERIC Educational Resources Information Center

    Thompson, Terrill; Primlani, Saroj; Fiedor, Lisa

    2009-01-01

    The main goal of accessibility standards and guidelines is to design websites everyone can use. The "IT Accessibility Constituent Group" developed this set of draft guidelines to help EQ authors, reviewers, and staff and the larger EDUCAUSE community ensure that web content is accessible to all users, including those with disabilities. This…

  20. Open Source Tools for Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Powers, P.

    2010-12-01

    The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori Law and the Gutenberg-Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The JavaScript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
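    As a concrete illustration of the kind of catalog statistic mentioned above, the sketch below estimates a Gutenberg-Richter b-value with the standard Aki (1965) maximum-likelihood formula, with the usual half-bin correction for binned magnitudes. The catalog values and completeness magnitude are hypothetical, and this is a minimal sketch rather than the USGS/OpenSHA implementation itself.

    ```python
    import math

    def b_value_mle(magnitudes, mag_complete, bin_width=0.1):
        """Aki-style maximum-likelihood b-value estimate.

        Only events at or above the assumed completeness magnitude are used;
        the half-bin correction accounts for magnitudes reported in bins.
        """
        mags = [m for m in magnitudes if m >= mag_complete]
        if not mags:
            raise ValueError("no events above the completeness magnitude")
        mean_mag = sum(mags) / len(mags)
        return math.log10(math.e) / (mean_mag - (mag_complete - bin_width / 2.0))

    # Hypothetical catalog of magnitudes, purely for illustration.
    catalog = [2.1, 2.4, 3.0, 2.2, 2.8, 3.5, 2.6, 4.1, 2.3, 2.9]
    print("estimated b-value:", round(b_value_mle(catalog, mag_complete=2.0), 2))
    ```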

  1. Getting To Know the "Invisible Web."

    ERIC Educational Resources Information Center

    Smith, C. Brian

    2001-01-01

    Discusses the portions of the World Wide Web that cannot be accessed via directories or search engines, explains why they can't be accessed, and offers suggestions for reference librarians to find these sites. Lists helpful resources and gives examples of invisible Web sites which are often databases. (LRW)

  2. Is Accessibility an Issue in the Knowledge Society? Modern Web Applications in the Light of Accessibility

    NASA Astrophysics Data System (ADS)

    Bártek, Luděk; Ošlejšek, Radek; Pitner, Tomáš

    Recent development of the Web shows a significant trend towards more user participation, massive use of new devices including portables, and high interactivity. User participation goes hand in hand with the inclusion of all potential user groups - also those with special needs. However, we claim that despite all the effort towards accessibility, it has not yet found an appropriate reflection among the stakeholders of the "Top Web Applications" or their users. This leads to undesired consequences - the business-driven Web without full user participation is not a truly democratic medium and, actually, does not comply with the original characteristics of Web 2.0. The paper tries to identify perspectives of further development, including standardization processes and the technical obstacles behind them. It also shows ways and techniques to cope with the challenge based on our own research and development in accessible graphics and dialog-based systems.

  3. Developing a Web Platform to Support a Community of Practice: A Mixed Methods Study in Pediatric Physiotherapy.

    PubMed

    Pratte, Gabrielle; Hurtubise, Karen; Rivard, Lisa; Berbari, Jade; Camden, Chantal

    2018-01-01

    Web platforms are increasingly used to support virtual interactions between members of communities of practice (CoP). However, little is known about how to develop these platforms to support the implementation of best practices for health care professionals. The aim of this article is to explore pediatric physiotherapists' (PTs) perspectives regarding the utility and usability of the characteristics of a web platform developed to support virtual communities of practice (vCoP). This study adopted an explanatory sequential mixed methods design. A web platform supporting the interactions of vCoP members was developed for PTs working with children with developmental coordination disorder. Specific strategies and features were created to support the effectiveness of the platform across three domains: social, information-quality, and system-quality factors. Quantitative data were collected from a cross-sectional survey (n = 41) after 5 months of access to the web platform, and descriptive statistics were calculated. Qualitative data were also collected from semistructured interviews (n = 9), which were coded, interpreted, and analyzed using Boucher's Web Ergonomics Conceptual Framework. The utility of the web platform characteristics targeting the three key domain factors was generally perceived positively by PTs. However, usability issues were noted by PTs, including problems with navigation and information retrieval. Web platforms aiming to support vCoPs should be carefully developed to target potential users' needs. Whenever possible, users should co-construct the web platform with vCoP developers. Moreover, each of the developed characteristics (eg, newsletter, search function) should be evaluated in terms of utility and usability for the users.

  4. 78 FR 67881 - Nondiscrimination on the Basis of Disability in Air Travel: Accessibility of Web Sites and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ... corresponding accessible pages on a mobile Web site by one year after the final rule's effective date; and (3... Mobile Web site conformant with any of the following standards: WCAG 1.0, WCAG 2.0 at Level A, existing Section 508 standards, or Mobile Web Best Practices (MWBP) 1.0 (if applicable). Two of the options they...

  5. Internet Usage by Low-Literacy Adults Seeking Health Information: An Observational Analysis

    PubMed Central

    Birru, Mehret S; Monaco, Valerie M; Charles, Lonelyss; Drew, Hadiya; Njie, Valerie; Bierria, Timothy; Detlefsen, Ellen

    2004-01-01

    Background Adults with low literacy may encounter informational obstacles on the Internet when searching for health information, in part because most health Web sites require at least a high-school reading proficiency for optimal access. Objective The purpose of this study was to 1) determine how low-literacy adults independently access and evaluate health information on the Internet, and 2) identify challenges and areas of proficiency in the Internet-searching skills of low-literacy adults. Methods Subjects (n=8) were enrolled in a reading assistance program at Bidwell Training Center in Pittsburgh, PA, and read at a 3rd to 8th grade level. Subjects conducted self-directed Internet searches for designated health topics while utilizing a think-aloud protocol. Subjects' keystrokes and comments were recorded using Camtasia Studio screen-capture software. The search terms used to find health information, the amount of time spent on each Web site, the number of Web sites accessed, the reading level of Web sites accessed, and the responses of subjects to questionnaires were assessed. Results Subjects collectively answered 8 out of 24 questions correctly. Seven out of 8 subjects selected "sponsored sites" (paid Web advertisements) over search engine-generated links when answering health questions. On average, subjects accessed health Web sites written at or above a 10th grade reading level. Standard methodologies used for measuring health literacy and for prompting subjects to verbalize responses to Web-site form and content had limited utility in this population. Conclusion This study demonstrates that Web health information requires a reading level that prohibits optimal access by some low-literacy adults. These results highlight the low-literacy adult population as a potential audience for Web health information, and indicate some areas of difficulty that these individuals face when using the Internet and health Web sites to find information on specific health topics. PMID:15471751

  6. A Web-based interface to calculate phonotactic probability for words and nonwords in Modern Standard Arabic.

    PubMed

    Aljasser, Faisal; Vitevitch, Michael S

    2018-02-01

    A number of databases (Storkel Behavior Research Methods, 45, 1159-1167, 2013) and online calculators (Vitevitch & Luce Behavior Research Methods, Instruments, and Computers, 36, 481-487, 2004) have been developed to provide statistical information about various aspects of language, and these have proven to be invaluable assets to researchers, clinicians, and instructors in the language sciences. The number of such resources for English is quite large and continues to grow, whereas the number of such resources for other languages is much smaller. This article describes the development of a Web-based interface to calculate phonotactic probability in Modern Standard Arabic (MSA). A full description of how the calculator can be used is provided. It can be freely accessed at http://phonotactic.drupal.ku.edu/ .

  7. 75 FR 32692 - Schools and Libraries Universal Service Support Mechanism

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-09

    ..., wireless Internet access applications, and web hosting. We propose to revise the Commission's rules to.../anti-spam software, scheduling services, wireless Internet access applications, and web hosting should... schools and libraries may receive discounts for eligible telecommunications services, Internet access, and...

  8. Web OPAC Interfaces: An Overview.

    ERIC Educational Resources Information Center

    Babu, B. Ramesh; O'Brien, Ann

    2000-01-01

    Discussion of Web-based online public access catalogs (OPACs) focuses on a review of six Web OPAC interfaces in use in academic libraries in the United Kingdom. Presents a checklist and guidelines of important features and functions that are currently available, including search strategies, access points, display, links, and layout. (Author/LRW)

  9. Connecting long-tail scientists with big data centers using SaaS

    NASA Astrophysics Data System (ADS)

    Percivall, G. S.; Bermudez, L. E.

    2012-12-01

    Big data centers and long-tail scientists represent two extremes in the geoscience research community. Interoperability and inter-use based on software-as-a-service (SaaS) increase access to big data holdings by this underserved community of scientists. Large, institutional data centers have long been recognized as vital resources in the geoscience community. Permanent data archiving and dissemination centers provide "access to the data and (are) a critical source of people who have experience in the use of the data and can provide advice and counsel for new applications." [NRC] The "long tail of science" comprises the geoscience researchers who work separately from institutional data centers [Heidorn]. Long-tail scientists need to be efficient consumers of data from large, institutional data centers. Discussions in NSF EarthCube capture the challenges: "Like the vast majority of NSF-funded researchers, Alice (a long-tail scientist) works with limited resources. In the absence of suitable expertise and infrastructure, the apparently simple task that she assigns to her graduate student becomes an information discovery and management nightmare. Downloading and transforming datasets takes weeks." [Foster, et al.] The long-tail metaphor points to methods to bridge the gap, i.e., the Web. A decade ago, OGC began building a geospatial information space using open web standards for geoprocessing [ORM]. Recently, [Foster, et al.] accurately observed that "by adopting, adapting, and applying semantic web and SaaS technologies, we can make the use of geoscience data as easy and convenient as consumption of online media." SaaS places web services into cloud computing. SaaS for geospatial data is emerging rapidly, building on the first-generation geospatial web, e.g., the OGC Web Coverage Service [WCS] and the Data Access Protocol [DAP]. Several recent examples show progress in applying SaaS to the geosciences:
    - NASA's Earth Data Coherent Web aims to improve the science user experience by using Web Services (e.g., W*S, SOAP, RESTful) to reduce barriers to using EOSDIS data [ECW].
    - NASA's LANCE provides direct access to vast amounts of satellite data using the OGC Web Map Tile Service (WMTS).
    - NOAA's Unified Access Framework for Gridded Data (UAF Grid) is a web-service-based capability for direct access to a variety of datasets using netCDF, OPeNDAP, THREDDS, WMS and WCS [UAF].
    Tools to access these SaaS offerings are many and varied: some proprietary, others open source; some run in browsers, others are stand-alone applications. What is required is interoperability through the web interfaces offered by the data centers. NOAA's UAF service stack supports Matlab, ArcGIS, Ferret, GrADS, Google Earth, IDV, and LAS. Any SaaS that offers OGC Web Services (WMS, WFS, WCS) can be accessed by scores of clients [OGC]. While there has been much progress in recent years toward offering web services for the long tail of scientists, more needs to be done. Web services offer data access, but more than access is needed for inter-use of data, e.g., defining data schemas that allow for data fusion and addressing coordinate systems, spatial geometry, and semantics for observations. Connecting long-tail scientists with large data centers using SaaS and, in the future, the semantic web will address this large and currently underserved user community.
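    To make the service-oriented access pattern concrete, here is a minimal, hedged Python sketch of the kind of request a thin client might issue against an OGC Web Map Service. The endpoint URL is a placeholder, and the parameters follow the generic WMS GetCapabilities pattern rather than any specific NASA or NOAA deployment mentioned above.

    ```python
    import requests

    # Placeholder endpoint; substitute a real OGC WMS base URL.
    WMS_BASE = "https://example.org/wms"

    def get_capabilities(base_url):
        """Ask a WMS endpoint to describe its layers (standard OGC GetCapabilities request)."""
        params = {"SERVICE": "WMS", "REQUEST": "GetCapabilities", "VERSION": "1.3.0"}
        resp = requests.get(base_url, params=params, timeout=30)
        resp.raise_for_status()
        return resp.text  # XML capabilities document

    if __name__ == "__main__":
        xml = get_capabilities(WMS_BASE)
        print(xml[:500])  # show the start of the capabilities document
    ```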

  10. Cyberinfrastructure at IRIS: Challenges and Solutions Providing Integrated Data Access to EarthScope and Other Earth Science Data

    NASA Astrophysics Data System (ADS)

    Ahern, T. K.; Barga, R.; Casey, R.; Kamb, L.; Parastatidis, S.; Stromme, S.; Weertman, B. T.

    2008-12-01

    While mature methods of accessing seismic data from the IRIS DMC have existed for decades, the demands for improved interdisciplinary data integration call for new approaches. Talented software teams at the IRIS DMC, UNAVCO, and the ICDP in Germany have been developing web services for all EarthScope data, including data from USArray, PBO and SAFOD. These web services are based upon SOAP and WSDL. The EarthScope Data Portal was the first external system to access data holdings from the IRIS DMC using Web Services. EarthScope will also draw more heavily upon products to aid in cross-disciplinary data reuse. A Product Management System called SPADE allows archive of and access to heterogeneous data products, presented as XML documents, at the IRIS DMC. Searchable metadata are extracted from the XML and enable powerful searches for products from EarthScope and other data sources. IRIS is teaming with the External Research Group at Microsoft Research to leverage a powerful Scientific Workflow Engine (Trident) and interact with the web services developed at centers such as IRIS to enable access to data services as well as computational services. We believe that this approach will allow web-based control of workflows and the invocation of computational services that transform data. This capability will greatly improve access to data across scientific disciplines. This presentation will review some of the traditional access tools as well as many of the newer approaches that use web services and scientific workflows to improve interdisciplinary data access.

  11. SWS: accessing SRS sites contents through Web Services.

    PubMed

    Romano, Paolo; Marra, Domenico

    2008-03-26

    Web Services and Workflow Management Systems can support the creation and deployment of network systems able to automate data analysis and retrieval processes in biomedical research. Web Services have been implemented at bioinformatics centres and workflow systems have been proposed for biological data analysis. New databanks are often developed by taking into account these technologies, but many existing databases do not allow programmatic access. Only a fraction of available databanks can thus be queried through programmatic interfaces. SRS is a well-known indexing and search engine for biomedical databanks offering public access to many databanks and analysis tools. Unfortunately, these data are not easily and efficiently accessible through Web Services. We have developed 'SRS by WS' (SWS), a tool that makes information available in SRS sites accessible through Web Services. Information on known sites is maintained in a database, srsdb. SWS consists of a suite of Web Services that can query both srsdb, for information on sites and databases, and SRS sites. SWS returns results in a text-only format and can be accessed through a WSDL compliant client. SWS enables interoperability between workflow systems and SRS implementations, also managing access to alternative sites in order to cope with network and maintenance problems and selecting the most up-to-date among available systems. The development and implementation of Web Services that provide programmatic access to an exhaustive set of biomedical databases can significantly improve the automation of in-silico analysis. SWS supports this activity by making biological databanks that are managed in public SRS sites available through a programmatic interface.

  12. Evaluation of the content and accessibility of web sites for accredited orthopaedic sports medicine fellowships.

    PubMed

    Mulcahey, Mary K; Gosselin, Michelle M; Fadale, Paul D

    2013-06-19

    The Internet is a common source of information for orthopaedic residents applying for sports medicine fellowships, with the web sites of the American Orthopaedic Society for Sports Medicine (AOSSM) and the San Francisco Match serving as central databases. We sought to evaluate the web sites for accredited orthopaedic sports medicine fellowships with regard to content and accessibility. We reviewed the existing web sites of the ninety-five accredited orthopaedic sports medicine fellowships included in the AOSSM and San Francisco Match databases from February to March 2012. A Google search was performed to determine the overall accessibility of program web sites and to supplement information obtained from the AOSSM and San Francisco Match web sites. The study sample consisted of the eighty-seven programs whose web sites connected to information about the fellowship. Each web site was evaluated for its informational value. Of the ninety-five programs, fifty-one (54%) had links listed in the AOSSM database. Three (3%) of all accredited programs had web sites that were linked directly to information about the fellowship. Eighty-eight (93%) had links listed in the San Francisco Match database; however, only five (5%) had links that connected directly to information about the fellowship. Of the eighty-seven programs analyzed in our study, all eighty-seven web sites (100%) provided a description of the program and seventy-six web sites (87%) included information about the application process. Twenty-one web sites (24%) included a list of current fellows. Fifty-six web sites (64%) described the didactic instruction, seventy (80%) described team coverage responsibilities, forty-seven (54%) included a description of cases routinely performed by fellows, forty-one (47%) described the role of the fellow in seeing patients in the office, eleven (13%) included call responsibilities, and seventeen (20%) described a rotation schedule. Two Google searches identified direct links for 67% to 71% of all accredited programs. Most accredited orthopaedic sports medicine fellowships lack easily accessible or complete web sites in the AOSSM or San Francisco Match databases. Improvement in the accessibility and quality of information on orthopaedic sports medicine fellowship web sites would facilitate the ability of applicants to obtain useful information.

  13. From Web accessibility to Web adaptability.

    PubMed

    Kelly, Brian; Nevile, Liddy; Sloan, David; Fanou, Sotiris; Ellison, Ruth; Herrod, Lisa

    2009-07-01

    This article asserts that current approaches to enhance the accessibility of Web resources fail to provide a solid foundation for the development of a robust and future-proofed framework. In particular, they fail to take advantage of new technologies and technological practices. The article introduces a framework for Web adaptability, which encourages the development of Web-based services that can be resilient to the diversity of uses of such services, the target audience, available resources, technical innovations, organisational policies and relevant definitions of 'accessibility'. The article refers to a series of author-focussed approaches to accessibility through which the authors and others have struggled to find ways to promote accessibility for people with disabilities. These approaches depend upon the resource author's determination of the anticipated users' needs and their provision. Through approaches labelled as 1.0, 2.0 and 3.0, the authors have widened their focus to account for contexts and individual differences in target audiences. Now, the authors want to recognise the role of users in determining their engagement with resources (including services). To distinguish this new approach, the term 'adaptability' has been used to replace 'accessibility'; new definitions of accessibility have been adopted, and the authors have reviewed their previous work to clarify how it is relevant to the new approach. Accessibility 1.0 is here characterised as a technical approach in which authors are told how to construct resources for a broadly defined audience. This is known as universal design. Accessibility 2.0 was introduced to point to the need to account for the context in which resources would be used, to help overcome inadequacies identified in the purely technical approach. Accessibility 3.0 moved the focus on users from a homogenised universal definition to recognition of the idiosyncratic needs and preferences of individuals and to cater for them. All of these approaches placed responsibility within the authoring/publishing domain without recognising the role the user might want to play, or the roles that other users in social networks, or even Web services might play. Adaptability shifts the emphasis and calls for greater freedom for the users to facilitate individual accessibility in the open Web environment.

  14. Creating Patient and Family Education Web Sites

    PubMed Central

    YADRICH, DONNA MACAN; FITZGERALD, SHARON A.; WERKOWITCH, MARILYN; SMITH, CAROL E.

    2013-01-01

    This article gives details about the methods and processes used to ensure that usability and accessibility were achieved during development of the Home Parenteral Nutrition Family Caregivers Web site, an evidence-based health education Web site for the family members and caregivers of chronically ill patients. This article addresses comprehensive definitions of usability and accessibility and illustrates Web site development according to Section 508 standards and the national Health and Human Services’ Research-Based Web Design and Usability Guidelines requirements. PMID:22024970

  15. CircadiOmics: circadian omic web portal.

    PubMed

    Ceglia, Nicholas; Liu, Yu; Chen, Siwei; Agostinelli, Forest; Eckel-Mahan, Kristin; Sassone-Corsi, Paolo; Baldi, Pierre

    2018-06-15

    Circadian rhythms play a fundamental role at all levels of biological organization. Understanding the mechanisms and implications of circadian oscillations continues to be the focus of intense research. However, there has been no comprehensive and integrated way to access and mine all circadian omic datasets. The latest release of CircadiOmics (http://circadiomics.ics.uci.edu) fills this gap by providing the most comprehensive web server for studying circadian data. The newly updated version contains 227 high-throughput omic datasets corresponding to over 74 million measurements sampled over 24 h cycles. Users can visualize and compare oscillatory trajectories across species, tissues and conditions. Periodicity statistics (e.g. period, amplitude, phase, P-value, q-value) obtained from BIO_CYCLE and other methods are provided for all samples in the repository and can easily be downloaded in the form of publication-ready figures and tables. New features and substantial improvements in performance and data volume make CircadiOmics a powerful web portal for the integrated analysis of circadian omic data.

  16. Design of a Web-tool for diagnostic clinical trials handling medical imaging research.

    PubMed

    Baltasar Sánchez, Alicia; González-Sistal, Angel

    2011-04-01

    New clinical studies in medicine are based on patients and controls using different imaging diagnostic modalities. Medical information systems are not designed for clinical trials employing clinical imaging. Although commercial software and communication systems focus on storage of image data, they are not suitable for storage and mining of new types of quantitative data. We sought to design a Web-tool to support diagnostic clinical trials involving different experts and hospitals or research centres. The image analysis of this project is based on skeletal X-ray imaging. It involves a computerised image method using quantitative analysis of regions of interest in healthy bone and skeletal metastases. The database is implemented with ASP.NET 3.5 and C# technologies for our Web-based application. For data storage, we chose MySQL v.5.0, one of the most popular open source databases. User logins were necessary, and access to patient data was logged for auditing. For security, all data transmissions were carried over encrypted connections. This Web-tool is available to users scattered at different locations; it allows an efficient organisation and storage of data (case report form) and images and allows each user to know precisely what his task is. The advantages of our Web-tool are as follows: (1) sustainability is guaranteed; (2) network locations for collection of data are secured; (3) all clinical information is stored together with the original images and the results derived from processed images and statistical analysis that enable us to perform retrospective studies; (4) changes are easily incorporated because of the modular architecture; and (5) assessment of trial data collected at different sites is centralised to reduce statistical variance.

  17. Efficient Access to Massive Amounts of Tape-Resident Data

    NASA Astrophysics Data System (ADS)

    Yu, David; Lauret, Jérôme

    2017-10-01

    Randomly restoring files from tapes degrades the read performance primarily due to frequent tape mounts. The high latency of time-consuming tape mounts and dismounts is a major issue when accessing massive amounts of data from tape storage. BNL's mass storage system currently holds more than 80 PB of data on tapes, managed by HPSS. To restore files from HPSS, we make use of a scheduler software called ERADAT. This scheduler system was originally based on code from Oak Ridge National Lab, developed in the early 2000s. After some major modifications and enhancements, ERADAT now provides advanced HPSS resource management, priority queuing, resource sharing, web-browser visibility of real-time staging activities and advanced real-time statistics and graphs. ERADAT is also integrated with ACSLS and HPSS for near real-time mount statistics and resource control in HPSS. ERADAT is also the interface between HPSS and other applications such as the locally developed Data Carousel, providing fair resource-sharing policies and related capabilities. ERADAT has demonstrated great performance at BNL.

  18. Providing Multi-Page Data Extraction Services with XWRAPComposer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ling; Zhang, Jianjun; Han, Wei

    2008-04-30

    Dynamic Web data sources – sometimes known collectively as the Deep Web – increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DYNABOT, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DYNABOT has three unique characteristics. First, DYNABOT utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DYNABOT employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DYNABOT incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.

  19. Opal web services for biomedical applications.

    PubMed

    Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W

    2010-07-01

    Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has grown since to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the servers are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web forms-based access and programmatic access of all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we have successfully executed 239,814 jobs. The number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications with Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.

  20. On Building a Search Interface Discovery System

    NASA Astrophysics Data System (ADS)

    Shestakov, Denis

    A huge portion of the Web, known as the deep Web, is accessible via search interfaces to myriads of databases on the Web. While relatively good approaches for querying the contents of web databases have recently been proposed, one cannot fully exploit them while most search interfaces remain unlocated. Thus, the automatic recognition of search interfaces to online databases is crucial for any application accessing the deep Web. This paper describes the architecture of the I-Crawler, a system for finding and classifying search interfaces. The I-Crawler is intentionally designed to be used in deep web characterization surveys and for constructing directories of deep web resources.

  1. Warrior Transition Leader: Medical Rehabilitation Handbook

    DTIC Science & Technology

    2011-01-01

    serve. REFERENCES 1. http://www.army.mil/warriorcarenews/. Accessed January 24, 2011. 2. Warrior Transition Command Web site. http...www.wtc.army.mil/about_us/ctp.html. Accessed January 24, 2011. 3. Leipold JD. Warrior Transition Command stands up at Pentagon. US Army Web site. Army...January 24, 2011. 4. Warrior Transition Command Web site. http://wtc.armylive.dodlive.mil/about-wtu/. Accessed January 24, 2011. 5. Leipold JD

  2. BioPortal: enhanced functionality via new Web services from the National Center for Biomedical Ontology to access and use ontologies in software applications.

    PubMed

    Whetzel, Patricia L; Noy, Natalya F; Shah, Nigam H; Alexander, Paul R; Nyulas, Csongor; Tudorache, Tania; Musen, Mark A

    2011-07-01

    The National Center for Biomedical Ontology (NCBO) is one of the National Centers for Biomedical Computing funded under the NIH Roadmap Initiative. Contributing to the national computing infrastructure, NCBO has developed BioPortal, a web portal that provides access to a library of biomedical ontologies and terminologies (http://bioportal.bioontology.org) via the NCBO Web services. BioPortal enables community participation in the evaluation and evolution of ontology content by providing features to add mappings between terms, to add comments linked to specific ontology terms and to provide ontology reviews. The NCBO Web services (http://www.bioontology.org/wiki/index.php/NCBO_REST_services) enable this functionality and provide a uniform mechanism to access ontologies from a variety of knowledge representation formats, such as Web Ontology Language (OWL) and Open Biological and Biomedical Ontologies (OBO) format. The Web services provide multi-layered access to the ontology content, from getting all terms in an ontology to retrieving metadata about a term. Users can easily incorporate the NCBO Web services into software applications to generate semantically aware applications and to facilitate structured data collection.
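    A hedged sketch of how an application might call a term-search style REST service such as the one described above; the base URL, endpoint path, and API-key mechanism shown here are assumptions for illustration and should be checked against the current NCBO documentation.

    ```python
    import requests

    # Assumed values for illustration only; consult the NCBO documentation
    # for the current base URL, endpoints, and authentication scheme.
    BASE_URL = "https://data.bioontology.org"   # assumed REST base
    API_KEY = "YOUR_API_KEY"                    # placeholder credential

    def search_terms(query):
        """Search ontology terms and return the parsed JSON response."""
        resp = requests.get(
            f"{BASE_URL}/search",
            params={"q": query, "apikey": API_KEY},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        results = search_terms("melanoma")
        print(results.get("totalCount", "no count field in response"))
    ```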

  3. SLiMSearch 2.0: biological context for short linear motifs in proteins

    PubMed Central

    Davey, Norman E.; Haslam, Niall J.; Shields, Denis C.

    2011-01-01

    Short, linear motifs (SLiMs) play a critical role in many biological processes. The SLiMSearch 2.0 (Short, Linear Motif Search) web server allows researchers to identify occurrences of a user-defined SLiM in a proteome, using conservation and protein disorder context statistics to rank occurrences. User-friendly output and visualizations of motif context allow the user to quickly gain insight into the validity of a putatively functional motif occurrence. For each motif occurrence, overlapping UniProt features and annotated SLiMs are displayed. Visualization also includes annotated multiple sequence alignments surrounding each occurrence, showing conservation and protein disorder statistics in addition to known and predicted SLiMs, protein domains and known post-translational modifications. In addition, enrichment of Gene Ontology terms and protein interaction partners are provided as indicators of possible motif function. All web server results are available for download. Users can search motifs against the human proteome or a subset thereof defined by Uniprot accession numbers or GO term. The SLiMSearch server is available at: http://bioware.ucd.ie/slimsearch2.html. PMID:21622654

  4. The Windows to the Universe Project: A Facility for Inter-American Geoscience Education and Outreach

    NASA Astrophysics Data System (ADS)

    Johnson, R. M.; Lagrave, M.; Araujo-Pradere, E.; Russell, R.; Gardiner, L.; Bergman, J.; Genyuk, J.; Henderson, S.; Dimarco, M.; Metcalfe, T.

    2005-05-01

    Windows to the Universe (http://www.windows.ucar.edu) is a popular and comprehensive Earth and space science education web site that uses an interdisciplinary approach to engage our global audience. The entire Windows to the Universe site (roughly 7,000 pages) is being translated into Spanish, with support from the National Science Foundation. Large portions have already been "published" to the web and have been in use since October 2003. Web site statistics indicate that use of the Spanish portion of the site has quickly ramped up to ~20% of total site traffic. Approximately 150,000 users per month have accessed the Spanish-language segments of the site over the past academic year, in addition to the visitors to the English version of the website. The largest fraction of non-US users of the Spanish website come from Mexico, with growing use from countries from Central and South America and Spain. A total of 6.7 million users from around the world accessed the educational resources on this comprehensive website in 2004. An exciting new web-based development interface utilizing templates and an image database allows scientists from around the world to collaborate with the Windows to the Universe team, becoming remote developers on the website. This approach has proven to work effectively for scientists eager to efficiently get their science research results out to the public, taking advantage of their specialized expertise and yet not requiring them to become specialists in informal or formal K-12 education.

  5. Science Initiatives of the US Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.

    2012-09-01

    The United States Virtual Astronomical Observatory program is the operational facility successor to the National Virtual Observatory development project. The primary goal of the US VAO is to build on the standards, protocols, and associated infrastructure developed by NVO and the International Virtual Observatory Alliance partners and to bring to fruition a suite of applications and web-based tools that greatly enhance the research productivity of professional astronomers. To this end, and guided by the advice of our Science Council (Fabbiano et al. 2011), we have focused on five science initiatives in the first two years of VAO operations: 1) scalable cross-comparisons between astronomical source catalogs, 2) dynamic spectral energy distribution construction, visualization, and model fitting, 3) integration and periodogram analysis of time series data from the Harvard Time Series Center and NASA Star and Exoplanet Database, 4) integration of VO data discovery and access tools into the IRAF data analysis environment, and 5) a web-based portal to VO data discovery, access, and display tools. We are also developing tools for data linking and semantic discovery, and have a plan for providing data mining and advanced statistical analysis resources for VAO users. Initial versions of these applications and web-based services are being released over the course of the summer and fall of 2011, with further updates and enhancements planned for throughout 2012 and beyond.

  6. Electronic doors to education: study of high school website accessibility in Iowa.

    PubMed

    Klein, David; Myhill, William; Hansen, Linda; Asby, Gary; Michaelson, Susan; Blanck, Peter

    2003-01-01

    The Americans with Disabilities Act (ADA), and Sections 504 and 508 of the Rehabilitation Act, prohibit discrimination against people with disabilities in all aspects of daily life, including education, work, and access to places of public accommodations. Increasingly, these antidiscrimination laws are used by persons with disabilities to ensure equal access to e-commerce, and to private and public Internet websites. To help assess the impact of the anti-discrimination mandate for educational communities, this study examined 157 website home pages of Iowa public high schools (52% of high schools in Iowa) in terms of their electronic accessibility for persons with disabilities. We predicted that accessibility problems would limit students and others in obtaining information from the web pages as well as limiting ability to navigate to other web pages. Findings show that although many web pages examined included information in accessible formats, none of the home pages met World Wide Web Consortium (W3C) standards for accessibility. The most frequent accessibility problem was lack of alternative text (ALT tags) for graphics. Technical sophistication built into pages was found to reduce accessibility. Implications are discussed for schools and educational institutions, and for laws, policies, and procedures on website accessibility. Copyright 2003 John Wiley & Sons, Ltd.
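    Because missing ALT text was the most frequent accessibility problem found, the sketch below shows one simple way such a check could be automated. It is an illustrative script using the requests and BeautifulSoup libraries against a hypothetical URL, not the instrument used in the study, and it does not distinguish decorative images, which may legitimately carry an empty alt attribute.

    ```python
    import requests
    from bs4 import BeautifulSoup

    def images_missing_alt(url):
        """Return the src of every <img> on a page that lacks a non-empty alt attribute."""
        html = requests.get(url, timeout=30).text
        soup = BeautifulSoup(html, "html.parser")
        missing = []
        for img in soup.find_all("img"):
            alt = (img.get("alt") or "").strip()
            if not alt:
                missing.append(img.get("src", "<no src>"))
        return missing

    if __name__ == "__main__":
        # Hypothetical school homepage used purely for illustration.
        for src in images_missing_alt("https://example-school.example.org/"):
            print("missing alt text:", src)
    ```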

  7. Accessible Collaborative Learning Using Mobile Devices

    ERIC Educational Resources Information Center

    Wald, Mike; Li, Yunjia; Draffan, E. A.

    2014-01-01

    This paper describes accessible collaborative learning using mobile devices with mobile enhancements to Synote, the freely available, award winning, open source, web based application that makes web hosted recordings easier to access, search, manage, and exploit for all learners, teachers and other users. Notes taken live during lectures using…

  8. J-Plus Web Portal

    NASA Astrophysics Data System (ADS)

    Civera Lorenzo, Tamara

    2017-10-01

    Brief presentation about the J-PLUS EDR data access web portal (http://archive.cefca.es/catalogues/jplus-edr), where the different services available to retrieve images and catalogue data are presented. The J-PLUS Early Data Release (EDR) archive includes two types of data: images, and dual and single catalogue data that include parameters measured from the images. The J-PLUS web portal offers catalogue data and images through several different online data access tools or services, each suited to a particular need. The services offered are: coverage map, sky navigator, object visualization, image search, cone search, object list search, and Virtual Observatory services (Simple Cone Search, Simple Image Access Protocol, Simple Spectral Access Protocol, and Table Access Protocol).
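    For readers unfamiliar with the Virtual Observatory protocols listed above, the following is a minimal sketch of an IVOA Simple Cone Search call. The service URL is a placeholder rather than the actual J-PLUS endpoint, and the RA/DEC/SR parameters follow the generic protocol.

    ```python
    import requests

    # Placeholder cone-search endpoint; the real J-PLUS service URL should be
    # taken from the archive documentation at archive.cefca.es.
    CONE_SEARCH_URL = "https://example.org/scs"

    def cone_search(ra_deg, dec_deg, radius_deg):
        """Issue an IVOA Simple Cone Search request and return the VOTable text."""
        params = {"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg}
        resp = requests.get(CONE_SEARCH_URL, params=params, timeout=60)
        resp.raise_for_status()
        return resp.text  # VOTable XML listing sources within the cone

    if __name__ == "__main__":
        votable = cone_search(ra_deg=150.1, dec_deg=2.2, radius_deg=0.05)
        print(votable[:300])
    ```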

  9. Development of new on-line statistical program for the Korean Society for Radiation Oncology

    PubMed Central

    Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Choi, Eun Kyung; Cho, Kwan Ho

    2015-01-01

    Purpose To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. Materials and Methods The statistical program is a web-based program. The directory was placed in a sub-folder of the homepage of KOSRO and its web address is http://www.kosro.or.kr/asda. The server operating system is Linux and the web server is the Apache HTTP server. MySQL is adopted as the database (DB) server and PHP is the dedicated scripting language. Each ID and password are controlled independently and all screen pages for data input or analysis are designed to be user-friendly. Scroll-down menus are used extensively for user convenience and for consistency of data analysis. Results Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input of each hospital. Backups of the data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. Conclusion The new on-line statistical program was developed to collect data from nationwide departments of radiation oncology. An intuitive screen and consistent input structure are expected to promote the entry of data by member hospitals, and annual statistics should be a cornerstone of advances in radiation oncology. PMID:26157684

  10. Development of new on-line statistical program for the Korean Society for Radiation Oncology.

    PubMed

    Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Shin, Kyung Hwan; Choi, Eun Kyung; Cho, Kwan Ho

    2015-06-01

    To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. The statistical program is a web-based program. The directory was placed in a sub-folder of the homepage of KOSRO and its web address is http://www.kosro.or.kr/asda. The server operating system is Linux and the web server is the Apache HTTP server. MySQL is adopted as the database (DB) server and PHP is the dedicated scripting language. Each ID and password are controlled independently and all screen pages for data input or analysis are designed to be user-friendly. Scroll-down menus are used extensively for user convenience and for consistency of data analysis. Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input of each hospital. Backups of the data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. The new on-line statistical program was developed to collect data from nationwide departments of radiation oncology. An intuitive screen and consistent input structure are expected to promote the entry of data by member hospitals, and annual statistics should be a cornerstone of advances in radiation oncology.

  11. The Effectiveness of Commercial Internet Web Sites: A User's Perspective.

    ERIC Educational Resources Information Center

    Bell, Hudson; Tang, Nelson K. H.

    1998-01-01

    A user survey of 60 company Web sites (electronic commerce, entertainment and leisure, financial and banking services, information services, retailing and travel, and tourism) determined that 30% had facilities for conducting online transactions and only 7% charged for site access. Overall, Web sites were rated high in ease of access, content, and…

  12. Web Database Development: Implications for Academic Publishing.

    ERIC Educational Resources Information Center

    Fernekes, Bob

    This paper discusses the preliminary planning, design, and development of a pilot project to create an Internet accessible database and search tool for locating and distributing company data and scholarly work. Team members established four project objectives: (1) to develop a Web accessible database and decision tool that creates Web pages on the…

  13. Sign Language Web Pages

    ERIC Educational Resources Information Center

    Fels, Deborah I.; Richards, Jan; Hardman, Jim; Lee, Daniel G.

    2006-01-01

    The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, accessing the Web can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The…

  14. Identification and Illustration of Insecure Direct Object References and their Countermeasures

    NASA Astrophysics Data System (ADS)

    KumarShrestha, Ajay; Singh Maharjan, Pradip; Paudel, Santosh

    2015-03-01

    An insecure direct object reference is a flaw in system design in which sensitive system resources or data lack a full protection mechanism. It occurs when the web application developer provides direct access to objects based on user input, so an attacker can exploit this web vulnerability and gain access to privileged information by bypassing authorization. The main aim of this paper is to demonstrate the real effect and the identification of insecure direct object references and then to provide feasible preventive solutions so that web applications do not allow direct object references to be manipulated by attackers. The experiment on insecure direct object referencing is carried out using the deliberately insecure J2EE web application WebGoat, and its security testing is performed using another Java-based tool, Burp Suite. The experimental results show that the access control check for gaining access to privileged information is a very simple problem, but at the same time its correct implementation is a tricky task. The paper finally presents some ways to overcome this web vulnerability.
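
    The access-control countermeasure described above can be made concrete with a short sketch. The following Python/Flask snippet is purely illustrative (the paper's experiments use WebGoat and Burp Suite, not this code), and the route, data, and field names are hypothetical: an insecure endpoint returns whatever record the client names, while the protected endpoint also verifies ownership.

      # Hypothetical illustration of an insecure direct object reference (IDOR)
      # and a minimal countermeasure; not taken from the paper or from WebGoat.
      from flask import Flask, abort, session

      app = Flask(__name__)
      app.secret_key = "replace-me"  # required for Flask sessions

      # toy "database": invoice id -> owner and contents
      INVOICES = {1: {"owner": "alice", "body": "invoice 1 ..."},
                  2: {"owner": "bob", "body": "invoice 2 ..."}}

      @app.route("/invoice/insecure/<int:invoice_id>")
      def insecure(invoice_id):
          # IDOR: the record is returned for whatever id the client supplies
          return INVOICES[invoice_id]["body"]

      @app.route("/invoice/secure/<int:invoice_id>")
      def secure(invoice_id):
          invoice = INVOICES.get(invoice_id)
          # countermeasure: check that the record exists AND belongs to the caller
          if invoice is None or invoice["owner"] != session.get("user"):
              abort(403)
          return invoice["body"]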

  15. The Importance of Process-Oriented Accessibility Guidelines for Web Developers.

    PubMed

    Steen-Hansen, Linn; Fagernes, Siri

    2016-01-01

    Current accessibility research shows that in web development the process itself may lead to inaccessible web sites and applications. Common practices typically do not allow sufficient testing. The focus is mainly on complying with minimum standards and treating accessibility compliance as a sort of bug-fixing process, missing the user perspective. In addition, there is an alarming lack of knowledge of and experience with accessibility issues. It has also been argued that bringing accessibility into the development process at all stages is the only way to achieve the highest possible level of accessibility. The work presented in this paper is based on a previous project focusing on guidelines for developing accessible rich Internet applications. The guidelines were classified as either process-oriented or technology-oriented. In this paper, we examine the process-oriented guidelines and give a practical perspective on how these guidelines will make the development process more accessibility-friendly.

  16. Web Based Data Access to the World Data Center for Climate

    NASA Astrophysics Data System (ADS)

    Toussaint, F.; Lautenschlager, M.

    2006-12-01

    The World Data Center for Climate (WDC-Climate, www.wdc-climate.de) is hosted by the Model & Data Group (M&D) of the Max Planck Institute for Meteorology. The M&D department is financed by the German government and uses the computers and mass storage facilities of the German Climate Computing Centre (Deutsches Klimarechenzentrum, DKRZ). The WDC-Climate provides web access to 200 Terabytes of climate data; the total mass storage archive contains nearly 4 Petabytes. Although the majority of the datasets concern model output data, some satellite and observational data are accessible as well. The underlying relational database is distributed over five servers. The CERA relational data model is used to integrate catalogue data and mass data. The flexibility of the model allows very different types of data and metadata to be stored and accessed. The CERA metadata catalogue provides easy access to the content of the CERA database as well as to other data in the web. Visit ceramodel.wdc-climate.de for additional information on the CERA data model. The majority of the users access data via the CERA metadata catalogue, which is open without registration. However, prior to retrieving data, users are required to check in and apply for a userid and password. The CERA metadata catalogue is servlet based, so it is accessible worldwide through any web browser at cera.wdc-climate.de. In addition to data and metadata access through the web catalogue, WDC-Climate offers a number of other forms of web based data access. All metadata are available via http request as xml files in various metadata formats (ISO, DC, etc., see wini.wdc-climate.de), which allows for easy data interchange with other catalogues. Model data can be retrieved in GRIB, ASCII, NetCDF, and binary (IEEE) format. WDC-Climate serves as data centre for various projects. Since xml files are accessible by http, the integration of data into applications of different projects is very easy. Projects supported by WDC-Climate include CEOP, IPCC, and CARIBIC. A script tool for data download (jblob) is offered on the web page to make retrieval of large data quantities more convenient.
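
    Because the catalogue exposes its metadata as XML over plain HTTP, a generic client can retrieve and inspect a record in a few lines. The sketch below (Python standard library only) uses a placeholder URL, not a documented WDC-Climate endpoint; consult the project pages for the actual request formats.

      # Generic sketch: fetch an XML metadata record over HTTP and list its
      # top-level elements. The URL is a placeholder, not a real WDC-Climate endpoint.
      import urllib.request
      import xml.etree.ElementTree as ET

      PLACEHOLDER_URL = "http://example.org/cera/metadata/12345.xml"  # hypothetical

      with urllib.request.urlopen(PLACEHOLDER_URL) as response:
          tree = ET.parse(response)

      # print the tag names and text of the record's top-level elements
      for element in tree.getroot():
          print(element.tag, (element.text or "").strip())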

  17. Information about liver transplantation on the World Wide Web.

    PubMed

    Hanif, F; Sivaprakasam, R; Butler, A; Huguet, E; Pettigrew, G J; Michael, E D A; Praseedom, R K; Jamieson, N V; Bradley, J A; Gibbs, P

    2006-09-01

    Orthotopic liver transplant (OLTx) has evolved into a successful surgical management for end-stage liver diseases. Awareness and information about OLTx is an important tool in assisting OLTx recipients and the people supporting them, including non-transplant clinicians. The study aimed to investigate the nature and quality of liver transplant-related patient information on the World Wide Web. Four common search engines were used to explore the Internet using the key words 'Liver transplant'. The URL (uniform resource locator) of the top 50 returns was chosen, as it was judged unlikely that the average user would search beyond the first 50 sites returned by a given search. Each Web site was assessed on the following categories: origin, language, accessibility and extent of the information. A weighted Information Score (IS) was created to assess the quality of clinical and educational value of each Web site and was scored independently by three transplant clinicians. The Internet search performed with the aid of the four search engines yielded a total of 2,255,244 Web sites. Of the 200 possible sites, only 58 Web sites were assessed because of repetition of the same Web sites and non-accessible links. The overall median weighted IS was 22 (IQR 1 - 42). Of the 58 Web sites analysed, 45 (77%) belonged to the USA, six (10%) were European, and seven (12%) were from the rest of the world. The median weighted IS of publications originating from Europe and the USA was 40 (IQR = 22 - 60) and 23 (IQR = 6 - 38), respectively. Although European Web sites produced a higher weighted IS [40 (IQR = 22 - 60)] than the USA publications [23 (IQR = 6 - 38)], this difference was not statistically significant (p = 0.07). Web sites belonging to academic institutions and professional organizations scored significantly higher, with median weighted IS of 28 (IQR = 16 - 44) and 24 (IQR = 12 - 35), respectively, compared with commercial Web sites (median = 6, IQR = 0 - 14; p = .001). There was an Intraclass Correlation Coefficient (ICC) of 0.89 with an associated 95% CI (0.83, 0.93) for the three observers on the 58 Web sites. The study highlights the need for a significant improvement in the information available on the World Wide Web about OLTx. It concludes that the educational material currently available on the World Wide Web about liver transplant is of poor quality and requires rigorous input from health care professionals. The authors suggest that clinicians should pay more attention to taking the necessary steps to improve the standard of information available on their relevant Web sites and must take an active role in helping their patients find Web sites that provide the best and most accurate information specifically applicable to the loco-regional circumstances.

  18. Focused Crawling of the Deep Web Using Service Class Descriptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rocco, D; Liu, L; Critchlow, T

    2004-06-21

    Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.

  19. Authoring Tools

    NASA Astrophysics Data System (ADS)

    Treviranus, Jutta

    Authoring tools that are accessible and that enable authors to produce accessible Web content play a critical role in web accessibility. Widespread use of authoring tools that comply with the W3C Authoring Tool Accessibility Guidelines (ATAG) would ensure that even authors who are neither knowledgeable about nor particularly motivated to produce accessible content do so by default. The principles and techniques of ATAG are discussed. Some examples of accessible authoring tools are described, including authoring tool content management components such as TinyMCE. Considerations for creating an accessible collaborative environment are also covered. As part of providing accessible content, the debate between system-based personal optimization and a single universally accessible site configuration is presented. The issues and potential solutions to address the accessibility crisis presented by the advent of rich internet applications are outlined. This challenge must be met to ensure that a large segment of the population is able to participate in the move toward the web as a two-way communication mechanism.

  20. Accessing Digital Libraries: A Study of ARL Members' Digital Projects

    ERIC Educational Resources Information Center

    Kahl, Chad M.; Williams, Sarah C.

    2006-01-01

    To ensure efficient access to and integrated searching capabilities for their institution's new digital library projects, the authors studied Web sites of the Association of Research Libraries' (ARL) 111 academic, English-language libraries. Data were gathered on 1117 digital projects, noting library Web site and project access, metadata, and…

  1. A Retrospective Look at Website Accessibility over Time

    ERIC Educational Resources Information Center

    Hackett, Stephanie; Parmanto, Bambang; Zeng, Xiaoming

    2005-01-01

    Websites were retrospectively analysed to study the effects that technological advances in web design have had on accessibility for persons with disabilities. A random sample of general websites and a convenience sample of US government websites were studied and compared for the years 1997-2002. Web accessibility barrier (WAB) and complexity…

  2. Recent Internet Use and Associations with Clinical Outcomes among Patients Entering Addiction Treatment Involved in a Web-Delivered Psychosocial Intervention Study.

    PubMed

    Tofighi, B; Campbell, A N C; Pavlicova, M; Hu, M C; Lee, J D; Nunes, E V

    2016-10-01

    The acceptability and clinical impact of a web-based intervention among patients entering addiction treatment who lack recent internet access are unclear. This secondary analysis of a national multisite treatment study (NIDA Clinical Trials Network-0044) assessed the acceptability and clinical impact of a web-based psychosocial intervention among participants enrolling in community-based, outpatient addiction treatment programs. Participants were randomly assigned to 12 weeks of a web-based therapeutic education system (TES) based on the community reinforcement approach plus contingency management versus treatment as usual (TAU). Demographic and clinical characteristics and treatment outcomes were compared among participants with recent internet access in the 90 days preceding enrollment (N = 374) and without internet access (N = 133). Primary outcome variables included (1) acceptability of TES (i.e., module completion; acceptability of web-based intervention) and (2) clinical impact (i.e., self-reported abstinence confirmed by urine drug/breath alcohol tests; retention measured as time to dropout). Internet use was common (74%) and was more likely among younger (18-49 years old) participants and those who completed high school (p < .001). Participants randomized to TES (n = 255) without baseline internet access rated the acceptability of TES modules significantly higher than those with internet access (t = 2.49, df = 218, p = .01). There was a near-significant interaction between treatment, baseline abstinence, and internet access on time to dropout (χ2(1) = 3.8089, p = .051). TES was associated with better retention among participants not abstinent at baseline who had internet access (χ2(1) = 6.69, p = .01). These findings demonstrate high acceptability of this web-based intervention among participants who lacked recent internet access.

  3. UceWeb: a web-based collaborative tool for collecting and sharing quality of life data.

    PubMed

    Parimbelli, E; Sacchi, L; Rubrichi, S; Mazzanti, A; Quaglini, S

    2015-01-01

    This work aims at building a platform where quality-of-life data, namely utility coefficients, can be elicited not only for immediate use, but also systematically stored together with patient profiles to build a public repository to be further exploited in studies on specific target populations (e.g. cost/utility analyses). We capitalized on utility theory and previous experience to define a set of desirable features such a tool should show to facilitate sound elicitation of quality of life. A set of visualization tools and algorithms has been developed to this purpose. To make it easily accessible for potential users, the software has been designed as a web application. A pilot validation study has been performed on 20 atrial fibrillation patients. A collaborative platform, UceWeb, has been developed and tested. It implements the standard gamble, time trade-off and rating-scale utility elicitation methods. It allows doctors and patients to choose the mode of interaction to maximize patients’ comfort in answering difficult questions. Every utility elicitation may contribute to the growth of the repository. UceWeb can become a unique source of data allowing researchers both to perform more reliable comparisons among healthcare interventions and build statistical models to gain deeper insight into quality of life data.
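
    The three elicitation methods named above reduce to simple arithmetic once the patient's indifference point is known. The sketch below shows the standard textbook formulas only; it is not UceWeb code and the numbers are invented.

      # Textbook formulas behind the three elicitation methods implemented by UceWeb.
      # Illustrative only; this is not the platform's own code.

      def time_trade_off(years_in_full_health, years_in_health_state):
          """Utility = x / t, where the patient is indifferent between x years
          in full health and t years in the health state being valued."""
          return years_in_full_health / years_in_health_state

      def standard_gamble(p_full_health):
          """Utility equals the probability p of full health (versus the worst
          outcome) at which the patient is indifferent to the health state."""
          return p_full_health

      def rating_scale(mark, scale_max=100.0):
          """Utility from a rating-scale mark between 0 and scale_max."""
          return mark / scale_max

      # Example: indifferent between 6 years healthy and 10 years with the condition
      print(time_trade_off(6, 10))   # 0.6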

  4. SCOPE: a web server for practical de novo motif discovery.

    PubMed

    Carlson, Jonathan M; Chakravarty, Arijit; DeZiel, Charles E; Gross, Robert H

    2007-07-01

    SCOPE is a novel parameter-free method for the de novo identification of potential regulatory motifs in sets of coordinately regulated genes. The SCOPE algorithm combines the output of three component algorithms, each designed to identify a particular class of motifs. Using an ensemble learning approach, SCOPE identifies the best candidate motifs from its component algorithms. In tests on experimentally determined datasets, SCOPE identified motifs with a significantly higher level of accuracy than a number of other web-based motif finders run with their default parameters. Because SCOPE has no adjustable parameters, the web server has an intuitive interface, requiring only a set of gene names or FASTA sequences and a choice of species. The most significant motifs found by SCOPE are displayed graphically on the main results page with a table containing summary statistics for each motif. Detailed motif information, including the sequence logo, PWM, consensus sequence and specific matching sites can be viewed through a single click on a motif. SCOPE's efficient, parameter-free search strategy has enabled the development of a web server that is readily accessible to the practising biologist while providing results that compare favorably with those of other motif finders. The SCOPE web server is at .

  5. Remote visual analysis of large turbulence databases at multiple scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pulido, Jesus; Livescu, Daniel; Kanov, Kalin

    The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.
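
    As a rough illustration of the multi-resolution analysis mentioned above, a 1D signal can be decomposed with an off-the-shelf wavelet transform. The snippet below uses PyWavelets on synthetic data and is unrelated to the Johns Hopkins database's own wavelet-compression code.

      # Multi-resolution decomposition of a 1D signal with PyWavelets,
      # as a stand-in illustration of wavelet-based analysis (not the JHU toolset).
      import numpy as np
      import pywt

      signal = np.random.default_rng(0).standard_normal(1024)  # placeholder data

      # three-level discrete wavelet transform: coarse approximation plus details
      coeffs = pywt.wavedec(signal, wavelet="db2", level=3)
      approx, details = coeffs[0], coeffs[1:]

      print("approximation length:", len(approx))
      for i, d in enumerate(details, start=1):
          print(f"detail level {i}: {len(d)} coefficients")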

  6. Remote visual analysis of large turbulence databases at multiple scales

    DOE PAGES

    Pulido, Jesus; Livescu, Daniel; Kanov, Kalin; ...

    2018-06-15

    The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.

  7. Using EMBL-EBI services via Web interface and programmatically via Web Services

    PubMed Central

    Lopez, Rodrigo; Cowley, Andrew; Li, Weizhong; McWilliam, Hamish

    2015-01-01

    The European Bioinformatics Institute (EMBL-EBI) provides access to a wide range of databases and analysis tools that are of key importance in bioinformatics. As well as providing Web interfaces to these resources, Web Services are available using SOAP and REST protocols that enable programmatic access to our resources and allow their integration into other applications and analytical workflows. This unit describes the various options available to a typical researcher or bioinformatician who wishes to use our resources via Web interface or programmatically via a range of programming languages. PMID:25501941
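
    Programmatic REST access of the kind described generally amounts to an HTTP POST followed by polling for results. The sketch below shows only the general calling pattern in Python; the tool name, parameters, and exact endpoint path are assumptions, so consult the EMBL-EBI Web Services documentation for the real interfaces.

      # Generic REST submission pattern; URL path and fields are assumptions,
      # not the documented EMBL-EBI interface.
      import urllib.parse
      import urllib.request

      BASE_URL = "https://www.ebi.ac.uk/Tools/services/rest"   # assumed base path
      params = urllib.parse.urlencode({
          "email": "user@example.org",        # hypothetical required field
          "sequence": ">seq1\nMKT...",        # truncated example input
      }).encode()

      request = urllib.request.Request(f"{BASE_URL}/example_tool/run", data=params)
      with urllib.request.urlopen(request) as response:
          job_id = response.read().decode()
      print("submitted job:", job_id)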

  8. Web accessibility: a longitudinal study of college and university home pages in the northwestern United States.

    PubMed

    Thompson, Terrill; Burgstahler, Sheryl; Moore, Elizabeth J

    2010-01-01

    This article reports on a follow-up assessment to Thompson et al. (Proceedings of The First International Conference on Technology-based Learning with Disability, July 19-20, Dayton, Ohio, USA; 2007. pp 127-136), in which higher education home pages were evaluated over a 5-year period on their accessibility to individuals with disabilities. The purpose of this article is to identify trends in web accessibility and long-term impact of outreach and education. Home pages from 127 higher education institutions in the Northwest were evaluated for accessibility three times over a 6-month period in 2004-2005 (Phase I), and again in 2009 (Phase II). Schools in the study were offered varying degrees of training and/or support on web accessibility during Phase I. Pages were evaluated for accessibility using a set of manual checkpoints developed by the researchers. Over the 5-year period reported in this article, significant positive gains in accessibility were revealed on some measures, but accessibility declined on other measures. The areas of improvement are arguably the more basic, easy-to-implement accessibility features, while the area of decline is keyboard accessibility, which is likely associated with the emergence of dynamic new technologies on web pages. Even on those measures where accessibility is improving, it is still strikingly low. In Phase I of the study, institutions that received extensive training and support were more likely than other institutions to show improved accessibility on the measures where institutions improved overall, but were equally or more likely than others to show a decline on measures where institutions showed an overall decline. In Phase II, there was no significant difference between institutions who had received support earlier in the study, and those who had not. Results suggest that growing numbers of higher education institutions in the Northwest are motivated to add basic accessibility features to their home pages, and that outreach and education may have a positive effect on these measures. However, the results also reveal negative trends in accessibility, and outreach and education may not be strong enough to counter the factors that motivate institutions to deploy inaccessible emerging technologies. Further research is warranted toward identifying the motivational factors that are associated with increased and decreased web accessibility, and much additional work is needed to ensure that higher education web pages are accessible to individuals with disabilities.

  9. WebSat--a web software for microsatellite marker development.

    PubMed

    Martins, Wellington Santos; Lucas, Divino César Soares; Neves, Kelligton Fabricio de Souza; Bertioli, David John

    2009-01-01

    Simple sequence repeats (SSR), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed simple-to-use web software, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet, requiring no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, visualization of microsatellites and the design of primers suitable for their amplification. The program allows full control of parameters and the easy export of the resulting data, thus facilitating the development of microsatellite markers. The web tool may be accessed at http://purl.oclc.org/NET/websat/
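
    Microsatellite detection of the kind WebSat performs is often illustrated with a back-referencing regular expression that finds perfect tandem repeats. The snippet below is a generic sketch with arbitrary thresholds, not WebSat's algorithm.

      # Generic microsatellite (SSR) finder using a back-referencing regex.
      # Illustrative only; thresholds are arbitrary and this is not WebSat's code.
      import re

      def find_ssrs(sequence, min_unit=1, max_unit=6, min_repeats=5):
          """Yield (start, unit, full_repeat) for perfect tandem repeats."""
          pattern = re.compile(
              r"(?=(([ACGT]{%d,%d})\2{%d,}))" % (min_unit, max_unit, min_repeats - 1)
          )
          for match in pattern.finditer(sequence.upper()):
              # overlapping candidates may be reported because of the lookahead
              yield match.start(), match.group(2), match.group(1)

      for start, unit, repeat in find_ssrs("TTTTAGAGAGAGAGAGCCGT"):
          print(start, unit, repeat)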

  10. WebVR: an interactive web browser for virtual environments

    NASA Astrophysics Data System (ADS)

    Barsoum, Emad; Kuester, Falko

    2005-03-01

    The pervasive nature of web-based content has led to the development of applications and user interfaces that port between a broad range of operating systems and databases, while providing intuitive access to static and time-varying information. However, the integration of this vast resource into virtual environments has remained elusive. In this paper we present an implementation of a 3D Web Browser (WebVR) that enables the user to search the internet for arbitrary information and to seamlessly augment this information into virtual environments. WebVR provides access to the standard data input and query mechanisms offered by conventional web browsers, with the difference that it generates active texture-skins of the web contents that can be mapped onto arbitrary surfaces within the environment. Once mapped, the corresponding texture functions as a fully integrated web-browser that will respond to traditional events such as the selection of links or text input. As a result, any surface within the environment can be turned into a web-enabled resource that provides access to user-definable data. In order to leverage the continuous advancement of browser technology and to support both static as well as streamed content, WebVR uses ActiveX controls to extract the desired texture skin from industry-strength browsers, providing a unique mechanism for data fusion and extensibility.

  11. EntrezAJAX: direct web browser access to the Entrez Programming Utilities.

    PubMed

    Loman, Nicholas J; Pallen, Mark J

    2010-06-21

    Web applications for biology and medicine often need to integrate data from Entrez services provided by the National Center for Biotechnology Information. However, direct access to Entrez from a web browser is not possible due to 'same-origin' security restrictions. The use of "Asynchronous JavaScript and XML" (AJAX) to create rich, interactive web applications is now commonplace. The ability to access Entrez via AJAX would be advantageous in the creation of integrated biomedical web resources. We describe EntrezAJAX, which provides access to Entrez eUtils and is able to circumvent same-origin browser restrictions. EntrezAJAX is easily implemented by JavaScript developers and provides the same functionality as Entrez eUtils, as well as enhanced functionality to ease development. We provide easy-to-understand developer examples written in JavaScript to illustrate potential uses of this service. For the purposes of speed, reliability and scalability, EntrezAJAX has been deployed on Google App Engine, a freely available cloud service. The EntrezAJAX webpage is located at http://entrezajax.appspot.com/
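
    The same-origin workaround amounts to relaying the browser's request through a server that the page is allowed to call, which then queries Entrez on its behalf. The sketch below shows that relay pattern in Python/Flask with an assumed route name; EntrezAJAX itself is a hosted service on Google App Engine with a JavaScript-facing API, not this code.

      # Minimal server-side relay illustrating how same-origin restrictions are
      # avoided: the browser calls this server, which calls Entrez on its behalf.
      # Hypothetical sketch only; EntrezAJAX is a separate hosted service.
      import urllib.parse
      import urllib.request

      from flask import Flask, request, Response

      app = Flask(__name__)
      EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"  # public eUtils base

      @app.route("/relay/<tool>")
      def relay(tool):
          # forward the browser's query string to the corresponding eUtils tool
          query = urllib.parse.urlencode(request.args)
          with urllib.request.urlopen(f"{EUTILS}/{tool}.fcgi?{query}") as upstream:
              body = upstream.read()
          # returning the payload from our own origin sidesteps the browser's
          # same-origin policy for the calling page
          return Response(body, mimetype="text/xml")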

  12. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Folkerts, M; University of California, San Diego, La Jolla, CA; Graves, Y

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command line-based GPU applications to perform MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run a MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a “delivered dose” calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.
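
    For orientation, the gamma index summarized in the QA report reduces, in its simplest global 1D form, to the minimization sketched below. This is a textbook illustration with made-up criteria, not the web application's GPU implementation.

      # Simplest global 1D gamma index (textbook form), for illustration only;
      # the web application described above computes this on the GPU in 3D.
      import numpy as np

      def gamma_1d(x_ref, dose_ref, x_eval, dose_eval, dta_mm=3.0, dd_frac=0.03):
          """Return the gamma value at each evaluation point."""
          dose_norm = dd_frac * dose_ref.max()          # global dose criterion
          gammas = np.empty(len(x_eval))
          for i, (x, d) in enumerate(zip(x_eval, dose_eval)):
              dist_term = ((x_ref - x) / dta_mm) ** 2
              dose_term = ((dose_ref - d) / dose_norm) ** 2
              gammas[i] = np.sqrt(np.min(dist_term + dose_term))
          return gammas

      # toy example: identical profiles give gamma == 0 everywhere
      x = np.linspace(0.0, 50.0, 101)
      profile = np.exp(-((x - 25.0) ** 2) / 50.0)
      print(gamma_1d(x, profile, x, profile).max())     # ~0.0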

  13. Missouri StreamStats—A water-resources web application

    USGS Publications Warehouse

    Ellis, Jarrett T.

    2018-01-31

    The U.S. Geological Survey (USGS) maintains and operates more than 8,200 continuous streamgages nationwide. Types of data that may be collected, computed, and stored for streamgages include streamgage height (water-surface elevation), streamflow, and water quality. The streamflow data allow scientists and engineers to calculate streamflow statistics at each streamgage location, such as the 1-percent annual exceedance probability flood (also known as the 100-year flood), the mean flow, and the 7-day, 10-year low flow, which are used by managers to make informed water resource management decisions. Researchers, regulators, and managers also commonly need physical characteristics (basin characteristics) that describe the unique properties of a basin. Common uses for streamflow statistics and basin characteristics include hydraulic design, water-supply management, water-use appropriations, and flood-plain mapping for establishing flood-insurance rates and land-use zones. The USGS periodically publishes reports that update the values of basin characteristics and streamflow statistics at selected gaged locations (locations with streamgages), but these studies usually only update a subset of streamgages, making data retrieval difficult. Additionally, streamflow statistics and basin characteristics are most often needed at ungaged locations (locations without streamgages), for which published streamflow statistics and basin characteristics do not exist. Missouri StreamStats is a web-based geographic information system that was created by the USGS in cooperation with the Missouri Department of Natural Resources to provide users with access to an assortment of tools that are useful for water-resources planning and management. StreamStats allows users to easily obtain the most recent published streamflow statistics and basin characteristics for streamgage locations and to automatically calculate selected basin characteristics and estimate streamflow statistics at ungaged locations.
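
    As a rough illustration of what an annual exceedance probability statistic expresses, the sketch below ranks made-up annual peak flows with Weibull plotting positions. The actual StreamStats estimates come from regional regression equations and formal flood-frequency analysis, not from this simplification.

      # Rough illustration of annual exceedance probability from annual peak flows
      # using Weibull plotting positions. This is NOT the StreamStats methodology.
      import numpy as np

      annual_peaks_cfs = np.array([1200, 940, 2210, 1750, 860, 3020, 1430,
                                   990, 2600, 1150], dtype=float)  # made-up data

      ranked = np.sort(annual_peaks_cfs)[::-1]                 # largest first
      n = len(ranked)
      exceedance_prob = np.arange(1, n + 1) / (n + 1)          # Weibull: m / (n + 1)

      for flow, p in zip(ranked, exceedance_prob):
          print(f"{flow:7.0f} cfs exceeded with probability {p:.2f} in a given year")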

  14. 48 CFR 311.7001 - Section 508 accessibility standards for HHS Web site content and communications materials.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... standards, and resolve any related issues. (c) Based on those discussions, the Project Officer shall provide... communication must meet the accessibility standards in 36 CFR 1194.22, “Web-based intranet and Internet... standards for HHS Web site content and communications materials. 311.7001 Section 311.7001 Federal...

  15. 48 CFR 311.7001 - Section 508 accessibility standards for HHS Web site content and communications materials.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... standards, and resolve any related issues. (c) Based on those discussions, the Project Officer shall provide... communication must meet the accessibility standards in 36 CFR 1194.22, “Web-based intranet and Internet... standards for HHS Web site content and communications materials. 311.7001 Section 311.7001 Federal...

  16. 48 CFR 311.7001 - Section 508 accessibility standards for HHS Web site content and communications materials.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... standards, and resolve any related issues. (c) Based on those discussions, the Project Officer shall provide... communication must meet the accessibility standards in 36 CFR 1194.22, “Web-based intranet and Internet... standards for HHS Web site content and communications materials. 311.7001 Section 311.7001 Federal...

  17. 48 CFR 311.7001 - Section 508 accessibility standards for HHS Web site content and communications materials.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... standards, and resolve any related issues. (c) Based on those discussions, the Project Officer shall provide... communication must meet the accessibility standards in 36 CFR 1194.22, “Web-based intranet and Internet... standards for HHS Web site content and communications materials. 311.7001 Section 311.7001 Federal...

  18. Web Accessibility and Usability of the Homepages from Academy of Human Resource Development Members' Institutions

    ERIC Educational Resources Information Center

    Zeng, Xiaoming; Sligar, Steven R.

    2008-01-01

    Human resource development programs in various institutions communicate with their constituencies including persons with disabilities through websites. Web sites need to be accessible for legal, economic and ethical reasons. We used an automated web usability evaluation tool, aDesigner, to evaluate 205 home pages from the organizations of AHRD…

  19. FUn: a framework for interactive visualizations of large, high-dimensional datasets on the web.

    PubMed

    Probst, Daniel; Reymond, Jean-Louis

    2018-04-15

    During the past decade, big data have become a major tool in scientific endeavors. Although statistical methods and algorithms are well-suited for analyzing and summarizing enormous amounts of data, the results do not allow for a visual inspection of the entire data. Current scientific software, including R packages and Python libraries such as ggplot2, matplotlib and plot.ly, does not support interactive visualizations of datasets exceeding 100 000 data points on the web. Other solutions enable the web-based visualization of big data only through data reduction or statistical representations. However, recent hardware developments, especially advancements in graphical processing units, allow for the rendering of millions of data points on a wide range of consumer hardware such as laptops, tablets and mobile phones. Similar to the challenges and opportunities brought to virtually every scientific field by big data, both the visualization of and interaction with copious amounts of data are demanding and hold great promise. Here we present FUn, a framework consisting of a client (Faerun) and server (Underdark) module, facilitating the creation of web-based, interactive 3D visualizations of large datasets, enabling record-level visual inspection. We also introduce a reference implementation providing access to SureChEMBL, a database containing patent information on more than 17 million chemical compounds. The source code and the most recent builds of Faerun and Underdark, Lore.js and the data preprocessing toolchain used in the reference implementation are available on the project website (http://doc.gdb.tools/fun/). Contact: daniel.probst@dcb.unibe.ch or jean-louis.reymond@dcb.unibe.ch.

  20. DMSP SSJ4 Data Restoration, Classification, and On-Line Data Access

    NASA Technical Reports Server (NTRS)

    Wing, Simon; Bredekamp, Joseph H. (Technical Monitor)

    2000-01-01

    Compress and clean raw data files for permanent storage: We have identified various error conditions/types and developed algorithms to remove these errors/noises, including the more complicated noise in the newer data sets (status = 100% complete). Internet access to compacted raw data: It is now possible to access the raw data via our web site, http://www.jhuapl.edu/Aurora/index.html. The software to read and plot the compacted raw data is also available from the same web site. Users can now download the raw data and read, plot, or manipulate the data as they wish on their own computer. Users are also able to access the cleaned data sets. Internet access to the color spectrograms: This task has also been completed. It is now possible to access the spectrograms from the web site mentioned above. Improve the particle precipitation region classification: The algorithm for this task has been developed and implemented, and as a result the accuracies improved. The web site now routinely distributes the results of applying the new algorithm to the cleaned data set. Mark the classification regions on the spectrograms: The software to mark the classification regions on the spectrograms has been completed and is also available from our web site.

  1. Internet use in pregnancy informs women's decision making: a web-based survey.

    PubMed

    Lagan, Briege M; Sinclair, Marlene; Kernohan, W George

    2010-06-01

    Internet access and usage is almost ubiquitous, providing new opportunities and increasing challenges for health care practitioners and users. With pregnant women reportedly turning to the Internet for information during pregnancy, a better understanding of this behavior is needed. The objective of this study was to ascertain why and how pregnant women use the Internet as a health information source, and the overall effect it had on their decision making. Kuhlthau's (1993) information-seeking model was adapted to provide the underpinning theoretical framework for the study. The design was exploratory and descriptive. Data were collected using a valid and reliable web-based questionnaire. Over a 12-week period, 613 women from 24 countries who had confirmed that they had used the Internet for pregnancy-related information during their pregnancy completed and submitted a questionnaire. Most women (97%) used search engines such as Google to identify online web pages to access a large variety of pregnancy-related information and to use the Internet for pregnancy-related social networking, support, and electronic commerce (i.e., e-commerce). Almost 94 percent of women used the Internet to supplement information already provided by health professionals and 83 percent used it to influence their pregnancy decision making. Nearly half of the respondents reported dissatisfaction with information given by health professionals (48.6%) and lack of time to ask health professionals questions (46.5%) as key factors influencing them to access the Internet. Statistically, women's confidence levels significantly increased with respect to making decisions about their pregnancy after Internet usage (p < 0.05). In this study, the Internet played a significant part in the respondents' health information seeking and decision making in pregnancy. Health professionals need to be ready to support pregnant women in online data retrieval, interpretation, and application.

  2. MelanomaDB: A Web Tool for Integrative Analysis of Melanoma Genomic Information to Identify Disease-Associated Molecular Pathways

    PubMed Central

    Trevarton, Alexander J.; Mann, Michael B.; Knapp, Christoph; Araki, Hiromitsu; Wren, Jonathan D.; Stones-Havas, Steven; Black, Michael A.; Print, Cristin G.

    2013-01-01

    Despite ongoing research, metastatic melanoma survival rates remain low and treatment options are limited. Researchers can now access a rapidly growing amount of molecular and clinical information about melanoma. This information is becoming difficult to assemble and interpret due to its dispersed nature, yet as it grows it becomes increasingly valuable for understanding melanoma. Integration of this information into a comprehensive resource to aid rational experimental design and patient stratification is needed. As an initial step in this direction, we have assembled a web-accessible melanoma database, MelanomaDB, which incorporates clinical and molecular data from publicly available sources and which will be regularly updated as new information becomes available. This database allows complex links to be drawn between many different aspects of melanoma biology: genetic changes (e.g., mutations) in individual melanomas revealed by DNA sequencing, associations between gene expression and patient survival, data concerning drug targets, biomarkers, druggability, and clinical trials, as well as our own statistical analysis of relationships between molecular pathways and clinical parameters that have been produced using these data sets. The database is freely available at http://genesetdb.auckland.ac.nz/melanomadb/about.html. A subset of the information in the database can also be accessed through a freely available web application in the Illumina genomic cloud computing platform BaseSpace at http://www.biomatters.com/apps/melanoma-profiler-for-research. The MelanomaDB database illustrates dysregulation of specific signaling pathways across 310 exome-sequenced melanomas and in individual tumors and identifies the distribution of somatic variants in melanoma. We suggest that MelanomaDB can provide a context in which to interpret the tumor molecular profiles of individual melanoma patients relative to biological information and available drug therapies. PMID:23875173

  3. MedlinePlus Milestones: 1998-present

    MedlinePlus

    ... page links and information daily and also offers access to this full XML content through its Web ... search-based Web service that allows developers to access MedlinePlus health topic data in XML format. MedlinePlus ...

  4. Global Precipitation Measurement (GPM) Mission Products and Services at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC)

    NASA Technical Reports Server (NTRS)

    Liu, Z.; Ostrenga, D.; Vollmer, B.; Kempler, S.; Deshong, B.; Greene, M.

    2015-01-01

    The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) hosts and distributes GPM data within the NASA Earth Observation System Data Information System (EOSDIS). The GES DISC is also home to the data archive for the GPM predecessor, the Tropical Rainfall Measuring Mission (TRMM). Over the past 17 years, the GES DISC has served the scientific as well as other communities with TRMM data and user-friendly services. During the GPM era, the GES DISC will continue to provide user-friendly data services and customer support to users around the world. GPM products currently or soon to be available include: Level-1 GPM Microwave Imager (GMI) and partner radiometer products and DPR products; Level-2 Goddard Profiling Algorithm (GPROF) GMI and partner products and DPR products; Level-3 daily and monthly products and DPR products; and Integrated Multi-satellitE Retrievals for GPM (IMERG) products (early, late, and final). A dedicated Web portal (including user guides, etc.) has been developed for GPM data (http://disc.sci.gsfc.nasa.gov/gpm). Data services that are currently or soon to be available include Google-like Mirador (http://mirador.gsfc.nasa.gov/) for data search and access; data access through various Web services (e.g., OPeNDAP, GDS, WMS, WCS); conversion into various formats (e.g., netCDF, HDF, KML (for Google Earth), ASCII); exploration, visualization, and statistical online analysis through Giovanni (http://giovanni.gsfc.nasa.gov); generation of value-added products; parameter and spatial subsetting; time aggregation; regridding; data version control and provenance; documentation; science support for proper data usage, FAQ, and help desk; and monitoring services (e.g., Current Conditions) for applications. The United User Interface (UUI) is the next step in the evolution of the GES DISC web site. It attempts to provide seamless access to data, information, and services through a single interface without sending the user to different applications or URLs (e.g., search, access, subset, Giovanni, documents).

  5. omiRas: a Web server for differential expression analysis of miRNAs derived from small RNA-Seq data.

    PubMed

    Müller, Sören; Rycak, Lukas; Winter, Peter; Kahl, Günter; Koch, Ina; Rotter, Björn

    2013-10-15

    Small RNA deep sequencing is widely used to characterize non-coding RNAs (ncRNAs) differentially expressed between two conditions, e.g. healthy and diseased individuals, and to reveal insights into molecular mechanisms underlying condition-specific phenotypic traits. The ncRNAome is composed of a multitude of RNAs, such as transfer RNA, small nucleolar RNA and microRNA (miRNA), to name a few. Here we present omiRas, a Web server for the annotation, comparison and visualization of interaction networks of ncRNAs derived from next-generation sequencing experiments of two different conditions. The Web tool allows the user to submit raw sequencing data and results are presented as: (i) static annotation results including length distribution, mapping statistics, alignments and quantification tables for each library, as well as lists of differentially expressed ncRNAs between conditions and (ii) an interactive network visualization of user-selected miRNAs and their target genes based on the combination of several miRNA-mRNA interaction databases. The omiRas Web server is implemented in Python, PostgreSQL, R and can be accessed at: http://tools.genxpro.net/omiras/.

  6. DMINDA: an integrated web server for DNA motif identification and analyses.

    PubMed

    Ma, Qin; Zhang, Hanyuan; Mao, Xizeng; Zhou, Chuan; Liu, Bingqiang; Chen, Xin; Xu, Ying

    2014-07-01

    DMINDA (DNA motif identification and analyses) is an integrated web server for DNA motif identification and analyses, which is accessible at http://csbl.bmb.uga.edu/DMINDA/. This web site is freely available to all users and there is no login requirement. This server provides a suite of cis-regulatory motif analysis functions on DNA sequences, which are important for elucidating the mechanisms of transcriptional regulation: (i) de novo motif finding for a given set of promoter sequences along with statistical scores for the predicted motifs derived based on information extracted from a control set, (ii) scanning motif instances of a query motif in provided genomic sequences, (iii) motif comparison and clustering of identified motifs, and (iv) co-occurrence analyses of query motifs in given promoter sequences. The server is powered by a backend computer cluster with over 150 computing nodes, and is particularly useful for motif prediction and analyses in prokaryotic genomes. We believe that DMINDA, as a new and comprehensive web server for cis-regulatory motif finding and analyses, will benefit the genomic research community in general and prokaryotic genome researchers in particular. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. COMAN: a web server for comprehensive metatranscriptomics analysis.

    PubMed

    Ni, Yueqiong; Li, Jun; Panagiotou, Gianni

    2016-08-11

    Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads and removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/. COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to facilitate researchers with less expertise in bioinformatics in answering microbiota-related biological questions and to increase the accessibility and interpretation of microbiota RNA-Seq data.

  8. Distribution of guidance models for cardiac resynchronization therapy in the setting of multi-center clinical trials

    NASA Astrophysics Data System (ADS)

    Rajchl, Martin; Abhari, Kamyar; Stirrat, John; Ukwatta, Eranga; Cantor, Diego; Li, Feng P.; Peters, Terry M.; White, James A.

    2014-03-01

    Multi-center trials provide the unique ability to investigate novel techniques across a range of geographical sites with sufficient statistical power, the inclusion of multiple operators determining feasibility under a wider array of clinical environments and work-flows. For this purpose, we introduce a new means of distributing pre-procedural cardiac models for image-guided interventions across a large scale multi-center trial. In this method, a single core facility is responsible for image processing, employing a novel web-based interface for model visualization and distribution. The requirements for such an interface, being WebGL-based, are minimal and well within the realms of accessibility for participating centers. We then demonstrate the accuracy of our approach using a single-center pacemaker lead implantation trial with generic planning models.

  9. Use of the World Wide Web for multisite data collection.

    PubMed

    Subramanian, A K; McAfee, A T; Getzinger, J P

    1997-08-01

    As access to the Internet becomes increasingly available, research applications in medicine will increase. This paper describes the use of the Internet, and, more specifically, the World Wide Web (WWW), as a channel of communication between EDs throughout the world and investigators who are interested in facilitating the collection of data from multiple sites. Data entered into user-friendly electronic surveys can be transmitted over the Internet to a database located at the site of the study, rendering geographic separation less of a barrier to the conduction of multisite studies. The electronic format of the data can enable real-time statistical processing while data are stored using existing database technologies. In theory, automated processing of variables within such a database enables early identification of data trends. Methods of ensuring validity, security, and compliance are discussed.
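
    The collection pattern the authors describe, a survey form posting over the Internet into a central study database, can be sketched in a few lines with current tools. The route, table, and field names below are hypothetical, and the 1997 study of course predates these libraries.

      # Minimal sketch of a central data-collection endpoint: a survey form posts
      # fields over HTTP and each submission is stored in a study database.
      # Field names and route are hypothetical; not the original study's software.
      import sqlite3
      from flask import Flask, request

      app = Flask(__name__)
      db = sqlite3.connect("multisite_study.db", check_same_thread=False)
      db.execute("CREATE TABLE IF NOT EXISTS submissions "
                 "(site TEXT, patient_age INTEGER, chief_complaint TEXT)")

      @app.route("/submit", methods=["POST"])
      def submit():
          # store one survey record sent by a participating emergency department
          db.execute("INSERT INTO submissions VALUES (?, ?, ?)",
                     (request.form["site"],
                      int(request.form["patient_age"]),
                      request.form["chief_complaint"]))
          db.commit()
          return "recorded", 201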

  10. 5 CFR 2606.201 - Requests for access.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... on OGE's Web site at http://www.usoge.gov, or upon request from OGE's Office of General Counsel and... Office of Federal Register at the GPO Access Web site (http://www.access.gpo.gov/su_docs/aces/PrivacyAct... individual's full name (including her maiden name, if pertinent), dates of employment, social security number...

  11. 6 CFR 5.21 - Requests for access to records.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... accessed electronically at the Government Printing Office's World Wide Web site (which can be found at http... Printing Office's World Wide Web site (which can be found at http://www.access.gpo.gov/su_docs). (c... requested records, you may also, at your option, include your social security number. (e) Verification of...

  12. 28 CFR 16.41 - Requests for access to records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the Government Printing Office's World Wide Web site (which can be found at http://www.access.gpo.gov... accessed electronically at the Government Printing Office's World Wide Web site (which can be found at http... requested records, you may also, at your option, include your social security number. (e) Verification of...

  13. 6 CFR 5.21 - Requests for access to records.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... accessed electronically at the Government Printing Office's World Wide Web site (which can be found at http... Printing Office's World Wide Web site (which can be found at http://www.access.gpo.gov/su_docs). (c... requested records, you may also, at your option, include your social security number. (e) Verification of...

  14. 28 CFR 16.41 - Requests for access to records.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the Government Printing Office's World Wide Web site (which can be found at http://www.access.gpo.gov... accessed electronically at the Government Printing Office's World Wide Web site (which can be found at http... requested records, you may also, at your option, include your social security number. (e) Verification of...

  15. A Web-Based Remote Access Laboratory Using SCADA

    ERIC Educational Resources Information Center

    Aydogmus, Z.; Aydogmus, O.

    2009-01-01

    The Internet provides an opportunity for students to access laboratories from outside the campus. This paper presents a Web-based remote access real-time laboratory using SCADA (supervisory control and data acquisition) control. The control of an induction motor is used as an example to demonstrate the effectiveness of this remote laboratory,…

  16. Binary Coded Web Access Pattern Tree in Education Domain

    ERIC Educational Resources Information Center

    Gomathi, C.; Moorthi, M.; Duraiswamy, K.

    2008-01-01

    A Web Access Pattern (WAP), which is a sequence of accesses pursued frequently by users, is a kind of interesting and useful knowledge in practice. Sequential pattern mining is the process of applying data mining techniques to a sequential database for the purposes of discovering the correlation relationships that exist among an ordered list of…

  17. Developing Guidelines for Evaluating the Adaptation of Accessible Web-Based Learning Materials

    ERIC Educational Resources Information Center

    Radovan, Marko; Perdih, Mojca

    2016-01-01

    E-learning is a rapidly developing form of education. One of the key characteristics of e-learning is flexibility, which enables easier access to knowledge for everyone. Information and communications technology (ICT), which is e-learning's main component, enables alternative means of accessing the web-based learning materials that comprise the…

  18. Making the World Wide Web Accessible to All Students.

    ERIC Educational Resources Information Center

    Guthrie, Sally A.

    2000-01-01

    Examines the accessibility of Web sites belonging to 80 colleges of communications and schools of journalism by examining the hypertext markup language (HTML) used to format the pages. Suggests ways to revise the markup of pages to make them more accessible to students with vision, hearing, and mobility problems. Lists resources of the latest…

  19. Real-time, continuous water-quality monitoring in Indiana and Kentucky

    USGS Publications Warehouse

    Shoda, Megan E.; Lathrop, Timothy R.; Risch, Martin R.

    2015-01-01

    Water-quality “super” gages (also known as “sentry” gages) provide real-time, continuous measurements of the physical and chemical characteristics of stream water at or near selected U.S. Geological Survey (USGS) streamgages in Indiana and Kentucky. A super gage includes streamflow and water-quality instrumentation and representative stream sample collection for laboratory analysis. USGS scientists can use statistical surrogate models to relate instrument values to analyzed chemical concentrations at a super gage. Real-time, continuous and laboratory-analyzed concentration and load data are publicly accessible on USGS Web pages.
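
    In its simplest form, a surrogate model of the kind described is a regression of laboratory-analyzed concentrations on a continuously measured field parameter. The sketch below fits such a relation to made-up data; it is not a USGS-approved model, which involves formal model selection, bias correction, and uncertainty reporting.

      # Simplest form of a water-quality surrogate model: regress lab-analyzed
      # concentrations on a continuously measured field parameter.
      # Made-up data; not a USGS-approved surrogate model.
      import numpy as np

      turbidity_fnu = np.array([5.0, 12.0, 30.0, 55.0, 80.0, 140.0])
      sample_ssc_mg_l = np.array([8.0, 20.0, 52.0, 95.0, 130.0, 240.0])

      # fit a log-log linear relation, a common choice for sediment surrogates
      slope, intercept = np.polyfit(np.log10(turbidity_fnu),
                                    np.log10(sample_ssc_mg_l), deg=1)

      def estimate_ssc(turbidity):
          """Estimate suspended-sediment concentration from real-time turbidity."""
          return 10 ** (intercept + slope * np.log10(turbidity))

      print(round(estimate_ssc(60.0), 1), "mg/L (estimated)")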

  20. Unifying Access to National Hydrologic Data Repositories via Web Services

    NASA Astrophysics Data System (ADS)

    Valentine, D. W.; Jennings, B.; Zaslavsky, I.; Maidment, D. R.

    2006-12-01

    The CUAHSI hydrologic information system (HIS) is designed to be a live, multiscale web portal system for accessing, querying, visualizing, and publishing distributed hydrologic observation data and models for any location or region in the United States. The HIS design follows the principles of open service-oriented architecture, i.e., system components are represented as web services with well-defined standard service APIs. WaterOneFlow web services are the main component of the design. The currently available services have been completely re-written compared to the previous version, and provide programmatic access to USGS NWIS (stream flow, groundwater, and water quality repositories), DAYMET daily observations, NASA MODIS, and Unidata NAM streams, with several additional web service wrappers being added (EPA STORET, NCDC, and others). Different repositories of hydrologic data use different vocabularies, and support different types of query access. Resolving semantic and structural heterogeneities across different hydrologic observation archives and distilling a generic set of service signatures is one of the main scalability challenges in this project, and a requirement in our web service design. To accomplish the uniformity of the web services API, data repositories are modeled following the CUAHSI Observation Data Model. The web service responses are document-based, and use an XML schema to express the semantics in a standard format. Access to station metadata is provided via the web service methods GetSites, GetSiteInfo, and GetVariableInfo. These methods form the foundation of the CUAHSI HIS discovery interface and may execute over locally stored metadata or request the information from remote repositories directly. Observation values are retrieved via a generic GetValues method which is executed against national data repositories. The service is implemented in ASP.Net, and other providers are implementing WaterOneFlow services in Java. A reference implementation of the WaterOneFlow web services is available. More information about the ongoing development of CUAHSI HIS is available from http://www.cuahsi.org/his/.
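
    The WaterOneFlow methods named above (GetSiteInfo, GetValues, and so on) are SOAP operations, so they can be called from a generic SOAP client. The sketch below is a hedged illustration using the Python zeep library; the WSDL URL, site code, and variable code are placeholders rather than verified endpoints.

```python
# Hedged sketch: calling WaterOneFlow SOAP methods with the zeep client.
# The WSDL URL, site code, and variable code are assumed placeholders.
from zeep import Client

WSDL = "http://hydroportal.cuahsi.org/nwisuv/cuahsi_1_1.asmx?WSDL"  # assumed endpoint
client = Client(WSDL)

# GetValues returns a WaterML document for one site/variable/time window.
waterml = client.service.GetValues(
    location="NWISUV:02089500",   # network:site code (assumed format)
    variable="NWISUV:00060",      # stream flow variable code (assumed)
    startDate="2016-01-01",
    endDate="2016-01-31",
    authToken="",
)
print(str(waterml)[:500])
```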

  1. Effects of a Web-based intervention for adults with chronic conditions on patient activation: online randomized controlled trial.

    PubMed

    Solomon, Michael; Wagner, Stephen L; Goes, James

    2012-02-21

    With almost one-half of Americans projected to have at least one chronic condition before 2020, a vital role of the health care system is to develop informed, engaged individuals who are effective self-managers of their health. Self-management interventions (SMIs) delivered face-to-face or by telephone (traditional SMIs) are associated with improved self-management knowledge, skills, and self-efficacy, which are expressed by the composite construct of patient activation, a predictor of health outcomes. Web-based interventions to support self-management across the spectrum of chronic diseases have the potential to reach a broader population of patients for extended periods than do traditional SMIs. However, evidence of the effectiveness of Web-based interventions on patient activation is sparse. High-quality studies featuring controlled comparisons of patients with different chronic conditions are needed to explore the interaction of Web-based interventions and patient activation. To explore the effect of a Web-based intervention on the patient activation levels of patients with chronic health conditions, measured as attitudes toward knowledge, skills, and confidence in self-managing health. For this 12-week study, prospective participants were selected from the patient panel of a regional health care system in the United States. The 201 eligible participants were randomly assigned to two groups. Intervention group participants had access to MyHealth Online, a patient portal featuring interactive health applications accessible via the Internet. Control participants had access to a health education website featuring various topics. Patient activation was assessed pre- and posttest using the 13-item patient activation measure. Parametric statistical models (t test, analysis of variance, analysis of covariance) were applied to draw inferences. The Web-based intervention demonstrated a positive and significant effect on the patient activation levels of participants in the intervention group. A significant difference in posttest patient activation scores was found between the two groups (F(1,123) = 4.438, P = .04, r = .196). Patients starting at the most advanced development of patient activation (stage 4) in the intervention group did not demonstrate significant change compared with participants beginning at earlier stages. To our knowledge, this is the first study to measure change in patient activation when a Web-based intervention is used by patients living with different chronic conditions. Results suggest that Web-based interventions increase patient activation and have the potential to enhance the self-management capabilities of the growing population of chronically ill people. Activated patients are more likely to adhere to recommended health care practices, which in turn leads to improved health outcomes. Designing Web-based interventions to target a specific stage of patient activation may optimize their effectiveness. For Web-based interventions to reach their potential as a key component of chronic disease management, evidence is needed that this technology produces benefits for a sustained period among a diverse population.
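
    As a rough illustration of the statistical comparison reported above (posttest patient activation compared between groups while adjusting for pretest scores), the following hedged sketch runs an ANCOVA with statsmodels on simulated data; all column names and values are illustrative only.

```python
# Hedged sketch of the reported group comparison: ANCOVA on posttest patient
# activation with pretest as covariate, run on simulated (illustrative) data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 100
df = pd.DataFrame({
    "group": ["intervention"] * n + ["control"] * n,
    "pretest": rng.normal(60, 10, 2 * n),
})
# Simulate a modest intervention effect on posttest activation scores.
df["posttest"] = df["pretest"] + rng.normal(0, 5, 2 * n) + np.where(
    df["group"] == "intervention", 4.0, 0.0)

model = smf.ols("posttest ~ C(group) + pretest", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F test for the group effect
```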

  2. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    PubMed

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties in getting locally installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools behind an easy interface for input and output. Web services, due to their universal nature and widely known interface, are a very good option for achieving this goal. Bioinformatics Open Web Services (BOWS) is a system based on generic web services built to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS mediates access to registered tools by providing front-end and back-end web services. Programmers can install applications on HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then send the results to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that lists the registered applications and clients. Applications registered with BOWS can be accessed from virtually any programming language through web services, or using the standard Java clients. The back-end can run on HPC clusters, allowing bioinformaticians to remotely run high-demand applications directly from their machines.
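
    The front-end/back-end division described above follows a familiar submit-and-poll pattern. The sketch below is a hypothetical illustration of that pattern in Python; the base URL, paths, and JSON fields are invented for illustration and are not the actual BOWS interface.

```python
# Hypothetical illustration of a submit-and-poll client; the URL, paths, and
# JSON fields are invented for this sketch and are not the real BOWS API.
import time
import requests

BASE = "https://example.org/bows"  # placeholder service base URL

job = requests.post(f"{BASE}/jobs", json={"tool": "blast", "query": ">seq\nACGT"}, timeout=30)
job_id = job.json()["id"]

while True:
    status = requests.get(f"{BASE}/jobs/{job_id}", timeout=30).json()
    if status["state"] in ("finished", "failed"):  # back-end workers on the HPC
        break                                      # cluster process the job meanwhile
    time.sleep(5)

print(status.get("result"))
```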

  3. Multi-tool accessibility assessment of government department websites: a case-study with JKGAD.

    PubMed

    Ismail, Abid; Kuppusamy, K S; Nengroo, Ab Shakoor

    2017-08-02

    Being accessible to all categories of users is one of the primary factors enabling the wider reach of resources published on the World Wide Web. The accessibility of websites has been analyzed against W3C guidelines with the help of various tools. This paper presents a multi-tool accessibility assessment of government department websites belonging to the Indian state of Jammu and Kashmir. A comparative analysis of six accessibility tools across 14 different parameters is also presented. The accessibility analysis tools used in this study are AChecker, Cynthia Says, Tenon, WAVE, Mauve, and Hera. These tools report the accessibility status of the selected websites against the Web Content Accessibility Guidelines (WCAG) 1.0 and 2.0. It was found that accessibility analysis results vary when different accessibility metrics are used to measure the accessibility of websites. In addition, we identified the guidelines that are most frequently violated. It was observed that the selected websites need to incorporate accessibility features. This paper presents a set of suggestions to improve the accessibility status of these sites so that the information and services they provide can reach a wider audience without barriers. Implications for rehabilitation: The following points indicate that this case study of JKGAD websites concerns rehabilitation focused on visually impaired users. Due to the universal nature of the web, it should be accessible to all according to the WCAG guidelines framed by the World Wide Web Consortium. In this paper we identify multiple accessibility barriers faced by persons with visual impairment while browsing the Jammu and Kashmir Government websites. A multi-tool analysis has been done to pinpoint the potential barriers for persons with visual impairment. A usability analysis has been performed to check whether these websites are suitable for persons with visual impairment. We provide suggestions that developers and designers can follow to minimize these potential accessibility barriers. Based on the aforementioned key points, this article helps persons with disabilities, especially visually impaired users, to access web resources better through the implementation of the identified suggestions.

  4. WebSat ‐ A web software for microsatellite marker development

    PubMed Central

    Martins, Wellington Santos; Soares Lucas, Divino César; de Souza Neves, Kelligton Fabricio; Bertioli, David John

    2009-01-01

    Simple sequence repeats (SSR), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed a simple-to-use web software, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet, requiring no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, visualization of microsatellites and the design of primers suitable for their amplification. The program allows full control of parameters and the easy export of the resulting data, thus facilitating the development of microsatellite markers. Availability: The web tool may be accessed at http://purl.oclc.org/NET/websat/ PMID:19255650

  5. MDWeb and MDMoby: an integrated web-based platform for molecular dynamics simulations.

    PubMed

    Hospital, Adam; Andrio, Pau; Fenollosa, Carles; Cicin-Sain, Damjan; Orozco, Modesto; Gelpí, Josep Lluís

    2012-05-01

    MDWeb and MDMoby constitute a web-based platform that helps provide access to molecular dynamics (MD) in the standard and high-throughput regimes. The platform provides tools to prepare systems from PDB structures, mimicking the procedures followed by human experts. It provides inputs and can send simulations for three of the most popular MD packages (Amber, NAMD, and Gromacs). Tools for analysis of trajectories, either provided by the user or retrieved from our MoDEL database (http://mmb.pcb.ub.es/MoDEL), are also incorporated. The platform has two modes of access: a set of web services based on the BioMoby framework (MDMoby), which is programmatically accessible, and a web portal (MDWeb) at http://mmb.irbbarcelona.org/MDWeb; additional information and methodology details can be found at http://mmb.irbbarcelona.org/MDWeb/help.php.

  6. Web-Based Online Public Access Catalogues of IIT Libraries in India: An Evaluative Study

    ERIC Educational Resources Information Center

    Madhusudhan, Margam; Aggarwal, Shalini

    2011-01-01

    Purpose: The purpose of the paper is to examine the various features and components of web-based online public access catalogues (OPACs) of IIT libraries in India with the help of a specially designed evaluation checklist. Design/methodology/approach: The various features of the web-based OPACs in six IIT libraries (IIT Delhi, IIT Bombay, IIT…

  7. NNDC Data Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuli, J. K.; Sonzogni, A.

    The National Nuclear Data Center (NNDC) has provided remote access to some of its resources since 1986. The major databases and other resources currently available through the NNDC Web site are summarized. The NNDC has provided remote access to the nuclear physics databases it maintains and to other resources since 1986. With considerable innovation, access is now mostly through the Web. The NNDC Web pages have been modernized to provide a consistent, state-of-the-art style. The improved database services and other resources available from the NNDC site at www.nndc.bnl.gov will be described.

  8. An evaluation of the process and initial impact of disseminating a nursing e-thesis.

    PubMed

    Macduff, Colin

    2009-05-01

    This paper is a report of a study conducted to evaluate product, process and outcome aspects of the dissemination of a nursing PhD thesis via an open-access electronic institutional repository. Despite the growth of university institutional repositories which make theses easily accessible via the world wide web, nursing has been very slow to evaluate related processes and outcomes. Drawing on Stake's evaluation research methods, a case study design was adopted. The case is described using a four-phase structure within which key aspects of process and impact are reflexively analysed. In the conceptualization/re-conceptualization phase, fundamental questions about the purpose, format and imagined readership for a published nursing PhD were considered. In the preparation phase, seven key practical processes were identified that are likely to be relevant to most e-theses. In the dissemination phase email invitations were primarily used to invite engagement. The evaluation phase involved quantitative indicators of initial impact, such as page viewing and download statistics and qualitative feedback on processes and product. Analysis of process and impact elements of e-thesis dissemination is likely to have more than intrinsic value. The advent of e-theses housed in web-based institutional repositories has the potential to transform thesis access and use. It also offers potential to transform the nature and scope of thesis production and dissemination. Nursing scholars can exploit and evaluate such opportunities.

  9. Using EMBL-EBI Services via Web Interface and Programmatically via Web Services.

    PubMed

    Lopez, Rodrigo; Cowley, Andrew; Li, Weizhong; McWilliam, Hamish

    2014-12-12

    The European Bioinformatics Institute (EMBL-EBI) provides access to a wide range of databases and analysis tools that are of key importance in bioinformatics. As well as providing Web interfaces to these resources, Web Services are available using SOAP and REST protocols that enable programmatic access to our resources and allow their integration into other applications and analytical workflows. This unit describes the various options available to a typical researcher or bioinformatician who wishes to use our resources via Web interface or programmatically via a range of programming languages. Copyright © 2014 John Wiley & Sons, Inc.
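
    For the REST style of programmatic access mentioned above, a minimal Python example is sketched below using the EMBL-EBI Dbfetch service; the endpoint and parameters reflect common usage but should be checked against current EMBL-EBI documentation.

```python
# Illustrative REST call to the EMBL-EBI Dbfetch service; verify the endpoint
# and parameters against current EMBL-EBI documentation before relying on them.
import requests

url = "https://www.ebi.ac.uk/Tools/dbfetch/dbfetch"
params = {"db": "uniprotkb", "id": "P05067", "format": "fasta", "style": "raw"}

resp = requests.get(url, params=params, timeout=30)
resp.raise_for_status()
print(resp.text.splitlines()[0])  # FASTA header of the retrieved entry
```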

  10. Access Control of Web- and Java-Based Applications

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.; Pajevski, Michael J.

    2013-01-01

    Cybersecurity has become a great concern as threats of service interruption, unauthorized access, stealing and altering of information, and spreading of viruses have become more prevalent and serious. Application layer access control of applications is a critical component in the overall security solution that also includes encryption, firewalls, virtual private networks, antivirus, and intrusion detection. An access control solution, based on an open-source access manager augmented with custom software components, was developed to provide protection to both Web-based and Java-based client and server applications. The DISA Security Service (DISA-SS) provides common access control capabilities for AMMOS software applications through a set of application programming interfaces (APIs) and network-accessible security services for authentication, single sign-on, authorization checking, and authorization policy management. The OpenAM access management technology designed for Web applications can be extended to meet the needs of Java thick clients and standalone servers that are commonly used in the JPL AMMOS environment. The DISA-SS reusable components have greatly reduced the effort for each AMMOS subsystem to develop its own access control strategy. The novelty of this work is that it leverages an open-source access management product that was designed for Web-based applications to provide access control for Java thick clients and Java standalone servers. Thick clients and standalone servers are still commonly used in businesses and government, especially for applications that require rich graphical user interfaces and high-performance visualization that cannot be met by thin clients running on Web browsers.

  11. Automating Information Discovery Within the Invisible Web

    NASA Astrophysics Data System (ADS)

    Sweeney, Edwina; Curran, Kevin; Xie, Ermai

    A Web crawler or spider crawls through the Web looking for pages to index, and when it locates a new page it passes the page on to an indexer. The indexer identifies links, keywords, and other content and stores these within its database. This database is searched by entering keywords through an interface and suitable Web pages are returned in a results page in the form of hyperlinks accompanied by short descriptions. The Web, however, is increasingly moving away from being a collection of documents to a multidimensional repository for sounds, images, audio, and other formats. This is leading to a situation where certain parts of the Web are invisible or hidden. The term known as the "Deep Web" has emerged to refer to the mass of information that can be accessed via the Web but cannot be indexed by conventional search engines. The concept of the Deep Web makes searches quite complex for search engines. Google states that the claim that conventional search engines cannot find such documents as PDFs, Word, PowerPoint, Excel, or any non-HTML page is not fully accurate and steps have been taken to address this problem by implementing procedures to search items such as academic publications, news, blogs, videos, books, and real-time information. However, Google still only provides access to a fraction of the Deep Web. This chapter explores the Deep Web and the current tools available in accessing it.

  12. Global Access to Library of Congress' Digital Resources: National Digital Library and Internet Resources.

    ERIC Educational Resources Information Center

    Chen, Ching-chih

    1996-01-01

    Summarizes how the Library of Congress' digital library collections can be accessed globally via the Internet and World Wide Web. Outlines the resources found in each of the various access points: gopher, online catalog, library and legislative Web sites, legal and copyright databases, and FTP (file transfer protocol) sites. (LAM)

  13. Publishing Accessible Materials on the Web and CD-ROM.

    ERIC Educational Resources Information Center

    Federal Resource Center for Special Education, Washington, DC.

    While it is generally simple to make electronic content accessible, it is also easy inadvertently to make it inaccessible. This guide covers the many formats of electronic documents and points out what to keep in mind and what procedures to follow to make documents accessible to all when disseminating information via the World Wide Web and on…

  14. Usage and Design Evaluation by Family Caregivers of a Stroke Intervention Website

    PubMed Central

    Pierce, Linda L.; Steiner, Victoria

    2013-01-01

    Background: Four out of 5 families are affected by stroke. Many caregivers access the Internet and gather healthcare information from web-based sources. Design: The purpose of this descriptive evaluation was to assess the usage and design of the Caring~Web© site, which provides education/support for family caregivers of persons with stroke residing in home settings. Sample and Setting: Thirty-six caregivers from two Midwest states accessed this intervention in a 1-year study. The average participant was fifty-four years of age, white, female, and the spouse of the care recipient. Methods: In a telephone interview, four website questions were asked twice monthly, and a 33-item survey at the conclusion of the study evaluated the website's usage and the design of its components. Descriptive analysis methods were used, and statistics were collected on the number of visits to the website. Results: On average, participants logged on to the website one to two hours per week, although usage declined after several months for some participants. Participants positively rated the website’s appearance and usability, including finding the training to be adequate. Conclusion: Website designers can replicate this intervention for other health conditions. PMID:24025464

  15. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
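
    The map-caching idea described above can be sketched as a simple render-once, serve-from-cache pattern. The code below is an illustrative assumption, not the NCCV implementation; the render_map function and cache key layout are placeholders.

```python
# Illustrative render-once cache; render_map and the key layout are placeholders.
import hashlib
from pathlib import Path

CACHE_DIR = Path("map_cache")
CACHE_DIR.mkdir(exist_ok=True)

def render_map(variable: str, model: str, period: str) -> bytes:
    """Stand-in for the expensive map-rendering step."""
    return f"PNG bytes for {variable}/{model}/{period}".encode()

def cached_map(variable: str, model: str, period: str) -> bytes:
    key = hashlib.sha1(f"{variable}|{model}|{period}".encode()).hexdigest()
    path = CACHE_DIR / f"{key}.png"
    if path.exists():                      # cache hit: skip rendering entirely
        return path.read_bytes()
    image = render_map(variable, model, period)
    path.write_bytes(image)                # cache miss: render once, store for reuse
    return image

print(len(cached_map("tasmax", "CCSM4", "2070-2099")))
```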

  16. Data from selected U.S. Geological Survey National Stream Water Quality Monitoring Networks

    USGS Publications Warehouse

    Alexander, Richard B.; Slack, James R.; Ludtke, Amy S.; Fitzgerald, Kathleen K.; Schertz, Terry L.

    1998-01-01

    A nationally consistent and well-documented collection of water quality and quantity data compiled during the past 30 years for streams and rivers in the United States is now available on CD-ROM and accessible over the World Wide Web. The data include measurements from two U.S. Geological Survey (USGS) national networks for 122 physical, chemical, and biological properties of water collected at 680 monitoring stations from 1962 to 1995, quality assurance information that describes the sample collection agencies, laboratories, analytical methods, and estimates of laboratory measurement error (bias and variance), and information on selected cultural and natural characteristics of the station watersheds. The data are easily accessed via user-supplied software including Web browser, spreadsheet, and word processor, or may be queried and printed according to user-specified criteria using the supplied retrieval software on CD-ROM. The water quality data serve a variety of scientific uses including research and educational applications related to trend detection, flux estimation, investigations of the effects of the natural environment and cultural sources on water quality, and the development of statistical methods for designing efficient monitoring networks and interpreting water resources data.

  17. Integrating Web-based technology into distance education for nurses in China: computer and Internet access and attitudes.

    PubMed

    Cragg, C E Betty; Edwards, Nancy; Yue, Zhao; Xin, Song Li; Hui, Zou Dao

    2003-01-01

    To increase continuing education accessibility, nurses around the world are turning to Web-based instruction. However, for Internet education to be successful, particularly in developing countries, nurses must have access to computers and the Internet as well as positive attitudes toward this form of learning. As part of a distance education project for nurses of the Tianjin Municipality in China, a survey of nurses was conducted to examine their sources of professional knowledge as well as their computer and Internet access and attitudes. The attitudes of the nurses were generally positive, and there was evidence of rapidly increasing use of and access to computers and the Internet. This article reports the results of that survey and their implications for Web-based teaching of Chinese nurses.

  18. Fast access to the CMS detector condition data employing HTML5 technologies

    NASA Astrophysics Data System (ADS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    This paper focuses on using HTML version 5 (HTML5) for accessing condition data for the CMS experiment, evaluating the benefits and risks posed by the use of this technology. According to the authors of HTML5, this technology attempts to solve issues found in previous iterations of HTML and addresses the needs of web applications, an area previously not adequately covered by HTML. We demonstrate that employing HTML5 brings important benefits in terms of access performance to the CMS condition data. The combined use of web storage and web sockets increases performance and reduces costs in terms of computation power, memory usage, and network bandwidth for both client and server. Above all, web workers allow creating scripts that execute in multi-threaded mode, exploiting multi-core microprocessors. Web workers have been employed to substantially decrease the page rendering time when displaying the condition data stored in the CMS condition database.

  19. The value of Web-based library services at Cedars-Sinai Health System.

    PubMed

    Halub, L P

    1999-07-01

    Cedars-Sinai Medical Library/Information Center has maintained Web-based services since 1995 on the Cedars-Sinai Health System network. In that time, the librarians have found the provision of Web-based services to be a very worthwhile endeavor. Library users value the services that they access from their desktops because the services save time. They also appreciate being able to access services at their convenience, without restriction by the library's hours of operation. The library values its Web site because it brings increased visibility within the health system, and it enables library staff to expand services when budget restrictions have forced reduced hours of operation. In creating and maintaining the information center Web site, the librarians have learned the following lessons: consider the design carefully; offer what services you can, but weigh the advantages of providing the services against the time required to maintain them; make the content as accessible as possible; promote your Web site; and make friends in other departments, especially information services.

  20. The value of Web-based library services at Cedars-Sinai Health System.

    PubMed Central

    Halub, L P

    1999-01-01

    Cedars-Sinai Medical Library/Information Center has maintained Web-based services since 1995 on the Cedars-Sinai Health System network. In that time, the librarians have found the provision of Web-based services to be a very worthwhile endeavor. Library users value the services that they access from their desktops because the services save time. They also appreciate being able to access services at their convenience, without restriction by the library's hours of operation. The library values its Web site because it brings increased visibility within the health system, and it enables library staff to expand services when budget restrictions have forced reduced hours of operation. In creating and maintaining the information center Web site, the librarians have learned the following lessons: consider the design carefully; offer what services you can, but weigh the advantages of providing the services against the time required to maintain them; make the content as accessible as possible; promote your Web site; and make friends in other departments, especially information services. PMID:10427423

  1. Source Update Capture in Information Agents

    NASA Technical Reports Server (NTRS)

    Ashish, Naveen; Kulkarni, Deepak; Wang, Yao

    2003-01-01

    In this paper we present strategies for successfully capturing updates at Web sources. Web-based information agents provide integrated access to autonomous Web sources that can get updated. For many information agent applications we are interested in knowing when a Web source to which the application provides access has been updated. We may also be interested in capturing all the updates at a Web source over a period of time, i.e., detecting the updates and, for each update, retrieving and storing the new version of the data. Previous work on update and change detection by polling does not adequately address this problem. We present strategies for intelligently polling a Web source to efficiently capture changes at the source.
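
    A minimal version of the polling strategy described above can be sketched as follows, assuming a fixed polling interval and content hashing to detect changes; the paper's intelligent polling schedules would replace the fixed interval.

```python
# Minimal polling sketch: detect updates by hashing page content and store a
# new version whenever the hash changes. Interval and URL are illustrative.
import hashlib
import time
import requests

def capture_updates(url: str, polls: int = 24, interval_s: int = 3600):
    versions = []          # list of (timestamp, content) pairs
    last_hash = None
    for _ in range(polls):
        body = requests.get(url, timeout=30).text
        digest = hashlib.sha256(body.encode()).hexdigest()
        if digest != last_hash:                   # update detected at the source
            versions.append((time.time(), body))  # retrieve and store new version
            last_hash = digest
        time.sleep(interval_s)  # an intelligent policy could adapt this interval
    return versions
```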

  2. An efficient scheme for automatic web pages categorization using the support vector machine

    NASA Astrophysics Data System (ADS)

    Bhalla, Vinod Kumar; Kumar, Neeraj

    2016-07-01

    In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages within a fraction of a second. To achieve this goal, efficient categorization of web page contents is required. Manually categorizing these billions of web pages with high accuracy is a challenging task. Most of the existing techniques reported in the literature are semi-automatic, and a high level of accuracy cannot be achieved with them. To address this, this paper proposes an automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific and relevant features of the web pages. In the proposed scheme, feature extraction and evaluation are performed first, followed by filtering of the feature set for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page is developed. Feature extraction and weight assignment are based on a domain-specific keyword list developed by considering various domain pages. Moreover, the keyword list is reduced on the basis of the IDs of keywords in the list, and stemming of keywords and tag text is applied to achieve higher accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using a machine learning method combining feature extraction and statistical analysis, with a support vector machine kernel as the classification tool. The results obtained confirm the effectiveness of the proposed scheme in terms of its accuracy across different categories of web pages.
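
    As a hedged sketch of the overall pipeline (text features plus a support vector machine classifier), the toy example below uses scikit-learn with TF-IDF features standing in for the paper's DOM-based feature extractor and domain keyword lists.

```python
# Toy pipeline: TF-IDF text features plus a linear-kernel SVM classifier.
# The corpus and labels are illustrative; the paper's DOM-based features and
# domain keyword lists would replace the plain TF-IDF step.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

pages = [
    "flight hotel booking travel deals airline",
    "laptop review processor benchmark graphics card",
    "symptoms treatment clinic doctor appointment",
    "cheap fares holiday packages tourism",
]
labels = ["travel", "technology", "health", "travel"]

clf = make_pipeline(TfidfVectorizer(), SVC(kernel="linear"))
clf.fit(pages, labels)

print(clf.predict(["budget airline tickets and resort offers"]))  # -> ['travel']
```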

  3. Rural and Urban/Suburban Families' Use of a Web-Based Mental Health Intervention.

    PubMed

    Bunnell, Brian E; Davidson, Tatiana M; Dewey, Daniel; Price, Matthew; Ruggiero, Kenneth J

    2017-05-01

    Background/Introduction: Access to mental healthcare among rural residents is a national concern because unique barriers (e.g., fewer providers, distance to services) create significant challenges for the 60 million Americans who live in these settings. There is now a large body of literature demonstrating the efficacy of a wide range of Internet-based interventions. However, little is known about the extent to which individuals in rural settings will use these approaches and find them acceptable. Research with youths and their caregivers within this scope is particularly limited and, therefore, of great importance. We examined access and completion of a Web-based disaster mental health intervention in a population-based sample of 1,997 rural (n = 676) and urban/suburban (n = 1,321) adolescents and their caregivers who were affected by the Spring 2011 tornadoes that touched down in parts of Missouri and Alabama. Results indicated no differences in the rate of access or completion of Web-based modules based on geographical location. Furthermore, for those who did not access the Web-based resource, no differences were observed with respect to reasons for not accessing modules based on geographical location. These data have promising implications for the reach of Web-based resources to both rural and urban/suburban communities, as well as the willingness of adolescents and their caregivers to access and complete such resources, regardless of geographical location.

  4. A Web Server for MACCS Magnetometer Data

    NASA Technical Reports Server (NTRS)

    Engebretson, Mark J.

    1998-01-01

    NASA Grant NAG5-3719 was provided to Augsburg College to support the development of a web server for the Magnetometer Array for Cusp and Cleft Studies (MACCS), a two-dimensional array of fluxgate magnetometers located at cusp latitudes in Arctic Canada. MACCS was developed as part of the National Science Foundation's GEM (Geospace Environment Modeling) Program, which was designed in part to complement NASA's Global Geospace Science programs during the decade of the 1990s. This report describes the successful use of these grant funds to support a working web page that provides both daily plots and file access to any user accessing the worldwide web. The MACCS home page can be accessed at http://space.augsburg.edu/space/MaccsHome.html.

  5. SIDECACHE: Information access, management and dissemination framework for web services.

    PubMed

    Doderer, Mark S; Burkhardt, Cory; Robbins, Kay A

    2011-06-14

    Many bioinformatics algorithms and data sets are deployed using web services so that the results can be explored via the Internet and easily integrated into other tools and services. These services often include data from other sites that is accessed either dynamically or through file downloads. Developers of these services face several problems because of the dynamic nature of the information from the upstream services. Many publicly available repositories of bioinformatics data frequently update their information. When such an update occurs, the developers of the downstream service may also need to update. For file downloads, this process is typically performed manually, followed by a web service restart. Requests for information obtained by dynamic access of upstream sources are sometimes subject to rate restrictions. SideCache provides a framework for deploying web services that integrate information extracted from other databases and from web sources that are periodically updated. This situation occurs frequently in biotechnology, where new information is being continuously generated and the latest information is important. SideCache provides several types of services, including proxy access and rate control, local caching, and automatic web service updating. We have used the SideCache framework to automate the deployment and updating of a number of bioinformatics web services and tools that extract information from remote primary sources such as NCBI, NCIBI, and Ensembl. The SideCache framework has also been used to share research results through the use of a SideCache-derived web service.
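
    The proxy-access, rate-control, and local-caching services described above can be approximated by the following hedged sketch; it is not the SideCache API, and the upstream URL is a placeholder.

```python
# Illustrative caching/rate-limited fetcher; not the SideCache API.
import time
import requests

class CachingProxy:
    def __init__(self, min_interval_s: float = 0.4):
        self.cache = {}                    # local cache of upstream responses
        self.min_interval_s = min_interval_s
        self._last_request = 0.0

    def get(self, url: str) -> str:
        if url in self.cache:              # serve locally cached copy
            return self.cache[url]
        wait = self.min_interval_s - (time.time() - self._last_request)
        if wait > 0:
            time.sleep(wait)               # respect upstream rate restrictions
        body = requests.get(url, timeout=30).text
        self._last_request = time.time()
        self.cache[url] = body
        return body

proxy = CachingProxy()
page = proxy.get("https://example.org/upstream/record/1")  # placeholder URL
```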

  6. Installing and Executing Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) and Its Dependencies

    DTIC Science & Technology

    2017-02-01

    Algorithms are made into client applications that can be accessed from an image processing web service developed following Representational State Transfer (REST) standards by a mobile app, laptop PC, and other devices. Similarly, weather tweets can be accessed via the Weather Digest Web Service.

  7. Metrological traceability in education: A practical online system for measuring and managing middle school mathematics instruction

    NASA Astrophysics Data System (ADS)

    Torres Irribarra, D.; Freund, R.; Fisher, W.; Wilson, M.

    2015-02-01

    Computer-based, online assessments modelled, designed, and evaluated for adaptively administered invariant measurement are uniquely suited to defining and maintaining traceability to standardized units in education. An assessment of this kind is embedded in the Assessing Data Modeling and Statistical Reasoning (ADM) middle school mathematics curriculum. Diagnostic information about middle school students' learning of statistics and modeling is provided via computer-based formative assessments for seven constructs that comprise a learning progression for statistics and modeling from late elementary through the middle school grades. The seven constructs are: Data Display, Meta-Representational Competence, Conceptions of Statistics, Chance, Modeling Variability, Theory of Measurement, and Informal Inference. The end product is a web-delivered system built with Ruby on Rails for use by curriculum development teams working with classroom teachers in designing, developing, and delivering formative assessments. The online accessible system allows teachers to accurately diagnose students' unique comprehension and learning needs in a common language of real-time assessment, logging, analysis, feedback, and reporting.

  8. EntrezAJAX: direct web browser access to the Entrez Programming Utilities

    PubMed Central

    2010-01-01

    Web applications for biology and medicine often need to integrate data from Entrez services provided by the National Center for Biotechnology Information. However, direct access to Entrez from a web browser is not possible due to 'same-origin' security restrictions. The use of "Asynchronous JavaScript and XML" (AJAX) to create rich, interactive web applications is now commonplace. The ability to access Entrez via AJAX would be advantageous in the creation of integrated biomedical web resources. We describe EntrezAJAX, which provides access to Entrez eUtils and is able to circumvent same-origin browser restrictions. EntrezAJAX is easily implemented by JavaScript developers and provides identical functionality as Entrez eUtils as well as enhanced functionality to ease development. We provide easy-to-understand developer examples written in JavaScript to illustrate potential uses of this service. For the purposes of speed, reliability and scalability, EntrezAJAX has been deployed on Google App Engine, a freely available cloud service. The EntrezAJAX webpage is located at http://entrezajax.appspot.com/ PMID:20565938

  9. R3D Align web server for global nucleotide to nucleotide alignments of RNA 3D structures.

    PubMed

    Rahrig, Ryan R; Petrov, Anton I; Leontis, Neocles B; Zirbel, Craig L

    2013-07-01

    The R3D Align web server provides online access to 'RNA 3D Align' (R3D Align), a method for producing accurate nucleotide-level structural alignments of RNA 3D structures. The web server provides a streamlined and intuitive interface, input data validation and output that is more extensive and easier to read and interpret than related servers. The R3D Align web server offers a unique Gallery of Featured Alignments, providing immediate access to pre-computed alignments of large RNA 3D structures, including all ribosomal RNAs, as well as guidance on effective use of the server and interpretation of the output. By accessing the non-redundant lists of RNA 3D structures provided by the Bowling Green State University RNA group, R3D Align connects users to structure files in the same equivalence class and the best-modeled representative structure from each group. The R3D Align web server is freely accessible at http://rna.bgsu.edu/r3dalign/.

  10. Optimal Access to NASA Water Cycle Data for Water Resources Management

    NASA Astrophysics Data System (ADS)

    Teng, W. L.; Arctur, D. K.; Espinoza, G. E.; Rui, H.; Strub, R. F.; Vollmer, B.

    2016-12-01

    A "Digital Divide" in data representation exists between the preferred way of data access by the hydrology community (i.e., as time series of discrete spatial objects) and the common way of data archival by earth science data centers (i.e., as continuous spatial fields, one file per time step). This Divide has been an obstacle, specifically, between the Consortium of Universities for the Advancement of Hydrologic Science, Inc. Hydrologic Information System (CUAHSI HIS) and NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). An optimal approach to bridging the Divide, developed by the GES DISC, is to reorganize data from the way they are archived to some way that is optimal for the desired method of data access. Specifically for CUAHSI HIS, selected data sets were reorganized into time series files, one per geographical "point." These time series files, termed "data rods," are pre-generated or virtual (generated on-the-fly). Data sets available as data rods include North American Land Data Assimilation System (NLDAS), Global Land Data Assimilation System (GLDAS), TRMM Multi-satellite Precipitation Analysis (TMPA), Land Parameter Retrieval Model (LPRM), Modern-Era Retrospective Analysis for Research and Applications (MERRA)-Land, and Groundwater and Soil Moisture Conditions from Gravity Recovery and Climate Experiment (GRACE) Data Assimilation drought indicators for North America Drought Monitor (GRACE-DA-DM). In order to easily avail the operational water resources community the benefits of optimally reorganized data, we have developed multiple methods of making these data more easily accessible and usable. These include direct access via RESTful Web services, a browser-based Web map and statistical tool for selected NLDAS variables for the U.S. (CONUS), a HydroShare app (Data Rods Explorer, under development) on the Tethys Platform, and access via the GEOSS Portal. Examples of drought-related applications of these data and data access methods are provided.

  11. One EPA Web Guidances and Checklists

    EPA Pesticide Factsheets

    These One EPA Web resources are available to editors with Web Guide access. Learn about content development, web council and EIC responsibilities, audiences and top tasks, website format and structure, and site review and approval.

  12. Finding Web-Based Anxiety Interventions on the World Wide Web: A Scoping Review

    PubMed Central

    Olander, Ellinor K; Ayers, Susan

    2016-01-01

    Background One relatively new and increasingly popular approach of increasing access to treatment is Web-based intervention programs. The advantage of Web-based approaches is the accessibility, affordability, and anonymity of potentially evidence-based treatment. Despite much research evidence on the effectiveness of Web-based interventions for anxiety found in the literature, little is known about what is publically available for potential consumers on the Web. Objective Our aim was to explore what a consumer searching the Web for Web-based intervention options for anxiety-related issues might find. The objectives were to identify currently publically available Web-based intervention programs for anxiety and to synthesize and review these in terms of (1) website characteristics such as credibility and accessibility; (2) intervention program characteristics such as intervention focus, design, and presentation modes; (3) therapeutic elements employed; and (4) published evidence of efficacy. Methods Web keyword searches were carried out on three major search engines (Google, Bing, and Yahoo—UK platforms). For each search, the first 25 hyperlinks were screened for eligible programs. Included were programs that were designed for anxiety symptoms, currently publically accessible on the Web, had an online component, a structured treatment plan, and were available in English. Data were extracted for website characteristics, program characteristics, therapeutic characteristics, as well as empirical evidence. Programs were also evaluated using a 16-point rating tool. Results The search resulted in 34 programs that were eligible for review. A wide variety of programs for anxiety, including specific anxiety disorders, and anxiety in combination with stress, depression, or anger were identified and based predominantly on cognitive behavioral therapy techniques. The majority of websites were rated as credible, secure, and free of advertisement. The majority required users to register and/or to pay a program access fee. Half of the programs offered some form of paid therapist or professional support. Programs varied in treatment length and number of modules and employed a variety of presentation modes. Relatively few programs had published research evidence of the intervention’s efficacy. Conclusions This review represents a snapshot of available Web-based intervention programs for anxiety that could be found by consumers in March 2015. The consumer is confronted with a diversity of programs, which makes it difficult to identify an appropriate program. Limited reports and existence of empirical evidence for efficacy make it even more challenging to identify credible and reliable programs. This highlights the need for consistent guidelines and standards on developing, providing, and evaluating Web-based interventions and platforms with reliable up-to-date information for professionals and consumers about the characteristics, quality, and accessibility of Web-based interventions. PMID:27251763

  13. Finding Web-Based Anxiety Interventions on the World Wide Web: A Scoping Review.

    PubMed

    Ashford, Miriam Thiel; Olander, Ellinor K; Ayers, Susan

    2016-06-01

    One relatively new and increasingly popular approach of increasing access to treatment is Web-based intervention programs. The advantage of Web-based approaches is the accessibility, affordability, and anonymity of potentially evidence-based treatment. Despite much research evidence on the effectiveness of Web-based interventions for anxiety found in the literature, little is known about what is publically available for potential consumers on the Web. Our aim was to explore what a consumer searching the Web for Web-based intervention options for anxiety-related issues might find. The objectives were to identify currently publically available Web-based intervention programs for anxiety and to synthesize and review these in terms of (1) website characteristics such as credibility and accessibility; (2) intervention program characteristics such as intervention focus, design, and presentation modes; (3) therapeutic elements employed; and (4) published evidence of efficacy. Web keyword searches were carried out on three major search engines (Google, Bing, and Yahoo-UK platforms). For each search, the first 25 hyperlinks were screened for eligible programs. Included were programs that were designed for anxiety symptoms, currently publically accessible on the Web, had an online component, a structured treatment plan, and were available in English. Data were extracted for website characteristics, program characteristics, therapeutic characteristics, as well as empirical evidence. Programs were also evaluated using a 16-point rating tool. The search resulted in 34 programs that were eligible for review. A wide variety of programs for anxiety, including specific anxiety disorders, and anxiety in combination with stress, depression, or anger were identified and based predominantly on cognitive behavioral therapy techniques. The majority of websites were rated as credible, secure, and free of advertisement. The majority required users to register and/or to pay a program access fee. Half of the programs offered some form of paid therapist or professional support. Programs varied in treatment length and number of modules and employed a variety of presentation modes. Relatively few programs had published research evidence of the intervention's efficacy. This review represents a snapshot of available Web-based intervention programs for anxiety that could be found by consumers in March 2015. The consumer is confronted with a diversity of programs, which makes it difficult to identify an appropriate program. Limited reports and existence of empirical evidence for efficacy make it even more challenging to identify credible and reliable programs. This highlights the need for consistent guidelines and standards on developing, providing, and evaluating Web-based interventions and platforms with reliable up-to-date information for professionals and consumers about the characteristics, quality, and accessibility of Web-based interventions.

  14. Tools for Integrating Data Access from the IRIS DMC into Research Workflows

    NASA Astrophysics Data System (ADS)

    Reyes, C. G.; Suleiman, Y. Y.; Trabant, C.; Karstens, R.; Weertman, B. R.

    2012-12-01

    Web service interfaces at the IRIS Data Management Center (DMC) provide access to a vast archive of seismological and related geophysical data. These interfaces are designed to easily incorporate data access into data processing workflows. Examples of data that may be accessed include: time series data, related metadata, and earthquake information. The DMC has developed command line scripts, MATLAB® interfaces and a Java library to support a wide variety of data access needs. Users of these interfaces do not need to concern themselves with web service details, networking, or even (in most cases) data conversion. Fetch scripts allow access to the DMC archive and are a comfortable fit for command line users. These scripts are written in Perl and are well suited for automation and integration into existing workflows on most operating systems. For metadata and event information, the Fetch scripts even parse the returned data into simple text summaries. The IRIS Java Web Services Library (IRIS-WS Library) allows Java developers the ability to create programs that access the DMC archives seamlessly. By returning the data and information as native Java objects the Library insulates the developer from data formats, network programming and web service details. The MATLAB interfaces leverage this library to allow users access to the DMC archive directly from within MATLAB (r2009b or newer), returning data into variables for immediate use. Data users and research groups are developing other toolkits that use the DMC's web services. Notably, the ObsPy framework developed at LMU Munich is a Python toolbox that allows seamless access to data and information via the DMC services. Another example is the MATLAB-based GISMO and Waveform Suite developments that can now access data via web services. In summary, there now exist a host of ways that researchers can bring IRIS DMC data directly into their workflows. MATLAB users can use irisFetch.m, command line users can use the various Fetch scripts, Java users can use the IRIS-WS library, and Python users may request data through ObsPy. To learn more about any of these clients see http://www.iris.edu/ws/wsclients/.
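
    As one concrete example of bringing DMC data into a Python workflow, the sketch below uses ObsPy's FDSN client (a recent ObsPy version is assumed; the 2012-era interface differed).

```python
# Waveforms, station metadata, and event information via ObsPy's FDSN client
# (a recent ObsPy version is assumed).
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")
t0 = UTCDateTime("2012-04-11T08:38:00")

stream = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 600)
inventory = client.get_stations(network="IU", station="ANMO", level="channel")
catalog = client.get_events(starttime=t0 - 3600, endtime=t0 + 3600, minmagnitude=7)

print(stream)       # retrieved traces, ready for further processing
print(len(catalog)) # number of matching events
```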

  15. R3D-2-MSA: the RNA 3D structure-to-multiple sequence alignment server

    PubMed Central

    Cannone, Jamie J.; Sweeney, Blake A.; Petrov, Anton I.; Gutell, Robin R.; Zirbel, Craig L.; Leontis, Neocles

    2015-01-01

    The RNA 3D Structure-to-Multiple Sequence Alignment Server (R3D-2-MSA) is a new web service that seamlessly links RNA three-dimensional (3D) structures to high-quality RNA multiple sequence alignments (MSAs) from diverse biological sources. In this first release, R3D-2-MSA provides manual and programmatic access to curated, representative ribosomal RNA sequence alignments from bacterial, archaeal, eukaryal and organellar ribosomes, using nucleotide numbers from representative atomic-resolution 3D structures. A web-based front end is available for manual entry and an Application Program Interface for programmatic access. Users can specify up to five ranges of nucleotides and 50 nucleotide positions per range. The R3D-2-MSA server maps these ranges to the appropriate columns of the corresponding MSA and returns the contents of the columns, either for display in a web browser or in JSON format for subsequent programmatic use. The browser output page provides a 3D interactive display of the query, a full list of sequence variants with taxonomic information and a statistical summary of distinct sequence variants found. The output can be filtered and sorted in the browser. Previous user queries can be viewed at any time by resubmitting the output URL, which encodes the search and re-generates the results. The service is freely available with no login requirement at http://rna.bgsu.edu/r3d-2-msa. PMID:26048960
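
    A hedged sketch of programmatic JSON access is shown below; only the base URL comes from the abstract, while the path and parameter names are hypothetical and would need to be taken from the server's documentation.

```python
# Hypothetical JSON query; only the base URL is taken from the abstract, and
# the parameter names below are invented for illustration.
import requests

base = "http://rna.bgsu.edu/r3d-2-msa"
params = {
    "structure": "1J5E",   # hypothetical: representative 3D structure ID
    "chain": "A",          # hypothetical: chain of interest
    "ranges": "10:25",     # hypothetical: nucleotide range specification
    "format": "json",
}

resp = requests.get(base, params=params, timeout=60)
print(resp.status_code, resp.headers.get("Content-Type"))
```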

  16. Accredited hand surgery fellowship Web sites: analysis of content and accessibility.

    PubMed

    Trehan, Samir K; Morrell, Nathan T; Akelman, Edward

    2015-04-01

    To assess the accessibility and content of accredited hand surgery fellowship Web sites. A list of all accredited hand surgery fellowships was obtained from the online database of the American Society for Surgery of the Hand (ASSH). Fellowship program information on the ASSH Web site was recorded. All fellowship program Web sites were located via Google search. Fellowship program Web sites were analyzed for accessibility and content in 3 domains: program overview, application information/recruitment, and education. At the time of this study, there were 81 accredited hand surgery fellowships with 169 available positions. Thirty of 81 programs (37%) had a functional link on the ASSH online hand surgery fellowship directory; however, Google search identified 78 Web sites. Three programs did not have a Web site. Analysis of content revealed that most Web sites contained contact information, whereas information regarding the anticipated clinical, research, and educational experiences during fellowship was less often present. Furthermore, information regarding past and present fellows, salary, application process/requirements, call responsibilities, and case volume was frequently lacking. Overall, 52 of 81 programs (64%) had the minimal online information required for residents to independently complete the fellowship application process. Hand fellowship program Web sites could be accessed either via the ASSH online directory or Google search, except for 3 programs that did not have Web sites. Although most fellowship program Web sites contained contact information, other content such as application information/recruitment and education, was less frequently present. This study provides comparative data regarding the clinical and educational experiences outlined on hand fellowship program Web sites that are of relevance to residents, fellows, and academic hand surgeons. This study also draws attention to various ways in which the hand surgery fellowship application process can be made more user-friendly and efficient. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  17. Architecture of the local spatial data infrastructure for regional climate change research

    NASA Astrophysics Data System (ADS)

    Titov, Alexander; Gordov, Evgeny

    2013-04-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, etc.) are actively used in the modeling and analysis of climate change at various spatial and temporal scales. Because environmental datasets are inherently heterogeneous and can reach tens of terabytes for a single dataset, studies of climate and environmental change require dedicated software support based on the SDI approach. A dedicated architecture of a local spatial data infrastructure aimed at regional climate change analysis using modern web mapping technologies is presented. A geoportal is a key element of any SDI: it allows searching of geoinformation resources (datasets and services) using metadata catalogs, produces geospatial data selections by their parameters (data access functionality), and manages services and applications for cartographic visualization. Because of large dataset volumes, the complexity of the data models used, and syntactic and semantic differences between datasets, developing services for environmental geodata access, processing and visualization is a complex task. These circumstances were taken into account while designing the architecture of the local spatial data infrastructure as a universal framework providing geodata services. The architecture presented therefore includes: (1) a storage model for large sets of regional georeferenced data that is efficient for search, access, retrieval and subsequent statistical processing, and that in particular stores frequently used values (such as monthly and annual climate change indices), thus providing different temporal views of the datasets; (2) a general architecture of the software components handling geospatial datasets within the storage model; (3) a metadata catalog, a basic element of the spatial data infrastructure, describing the datasets used in climate research in detail using the ISO 19115 and CF-convention standards, published according to the OGC CSW (Catalog Service for the Web) specification; (4) computational and mapping web services for working with geospatial datasets based on OWS (OGC Web Services) standards: WMS, WFS, WPS; and (5) a geoportal as the key element of the thematic regional spatial data infrastructure, also providing a software framework for developing dedicated web applications. To implement the web mapping services, GeoServer is used since it provides a native WPS implementation as a separate software module. For geospatial metadata services, GeoNetwork Opensource (http://geonetwork-opensource.org) is planned to be used because it supports the ISO 19115/ISO 19119/ISO 19139 metadata standards as well as the ISO CSW 2.0 profile for both client and server. To implement thematic applications based on geospatial web services within the local SDI geoportal, the following open source software has been selected: (1) the OpenLayers JavaScript library, providing basic web mapping functionality for a thin client such as a web browser, and (2) the GeoExt/ExtJS JavaScript libraries for building client-side web applications working with geodata services. The web interface developed will resemble that of popular desktop GIS applications such as uDig and QuantumGIS. The work is partially supported by RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2.1 and IP 131.
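
    To make the service layer above more concrete, here is a minimal sketch of a client-side request to a WMS endpoint of the kind GeoServer exposes in such an SDI. The service URL and layer name are hypothetical placeholders; only the GetMap parameters follow the standard OGC WMS interface.

        # Minimal sketch of an OGC WMS GetMap request against a GeoServer-style endpoint.
        # The URL and layer name are hypothetical; the parameters are standard WMS 1.3.0.
        import requests

        WMS_URL = "https://example-sdi.org/geoserver/wms"  # placeholder endpoint

        params = {
            "service": "WMS",
            "version": "1.3.0",
            "request": "GetMap",
            "layers": "climate:annual_temperature_index",  # hypothetical layer name
            "crs": "EPSG:4326",
            "bbox": "50,60,60,90",   # lat/lon bounding box (WMS 1.3.0 axis order for EPSG:4326)
            "width": "800",
            "height": "600",
            "format": "image/png",
        }

        response = requests.get(WMS_URL, params=params, timeout=60)
        response.raise_for_status()
        with open("temperature_index.png", "wb") as out:
            out.write(response.content)  # save the rendered map image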

  18. Innovative Instructional Tools from the AMS

    NASA Astrophysics Data System (ADS)

    Abshire, W. E.; Geer, I. W.; Mills, E. W.; Nugnes, K. A.; Stimach, A. E.

    2016-12-01

    Since 1996, the American Meteorological Society (AMS) has been developing online educational materials with dynamic features that engage students and encourage additional exploration of various concepts. Most recently, AMS transitioned its etextbooks to webBooks. Now accessible anywhere with internet access, webBooks can be read with any web browser. Prior versions of AMS etextbooks were difficult to use in a lab setting; however, webBooks are much easier to use and are no longer a hurdle to learning. Additionally, AMS eInvestigations Manuals, also in webBook format, include labs with innovative features and educational tools. One such example is the AMS Climate at a Glance (CAG) app, which draws data from NOAA's Climate at a Glance website. The user selects historical data for a given parameter and the app calculates various statistics revealing whether or not the results are consistent with climate change. These results allow users to distinguish between climate variability and climate change. This can be done for hundreds of locations across the U.S. and on multiple time scales. Another innovative educational tool used in AMS eInvestigations Manuals is the AMS Conceptual Climate Energy Model (CCEM). The CCEM is a computer simulation designed to enable users to track the paths that units of energy might follow as they enter, move through and exit an imaginary system according to simple rules applied to different scenarios. The purpose is to provide insight into the impacts of physical processes that operate in the real world. Finally, AMS educational materials take advantage of Google Earth imagery to reproduce the physical aspects of globes, allowing users to investigate spatial relationships in three dimensions. Google Earth imagery is used to explore tides, ocean bottom bathymetry, and El Niño and La Niña. AMS will continue to develop innovative educational materials and tools as technology advances, to attract more students to the Earth sciences.
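
    The kind of statistic such an app computes can be illustrated with a short, hedged sketch: a least-squares trend fitted to a historical temperature series. The numbers below are simulated stand-ins, not NOAA data, and the app's actual calculations are not reproduced here.

        # Illustrative only: fit a linear trend to a simulated annual temperature series
        # and report the slope and p-value, the sort of summary used to judge whether a
        # record reflects long-term change rather than year-to-year variability.
        import numpy as np
        from scipy import stats

        years = np.arange(1990, 2020)
        # Hypothetical annual mean temperatures (degrees F) for one location.
        temps = 52.0 + 0.03 * (years - 1990) + np.random.default_rng(0).normal(0, 0.5, years.size)

        slope, intercept, r_value, p_value, stderr = stats.linregress(years, temps)
        print(f"Trend: {slope * 10:.2f} degrees F per decade (p = {p_value:.3f})")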

  19. Continuing Education for Department of Defense Health Professionals

    DTIC Science & Technology

    2015-11-24

    American Pharmacists Association, 60 and American Nurses Association. 61 These associations and other health-focused organizations, including health...1298. Accessed May 29, 2014. 60. American Pharmacists Association. Learn [Web page]. 2014; http://www.pharmacist.com/node/26541. Accessed May 29, 2014. 61. American Nurses Association

  20. Accessibility and Use of Web-Based Electronic Resources by Physicians in a Psychiatric Institution in Nigeria

    ERIC Educational Resources Information Center

    Oduwole, Adebambo Adewale; Oyewumi, Olatundun

    2010-01-01

    Purpose: This study aims to examine the accessibility and use of web-based electronic databases on the Health InterNetwork Access to Research Initiative (HINARI) portal by physicians in the Neuropsychiatric Hospital, Aro--a psychiatry health institution in Nigeria. Design/methodology/approach: Collection of data was through the use of a three-part…

  1. A System for Web-based Access to the HSOS Database

    NASA Astrophysics Data System (ADS)

    Lin, G.

    Huairou Solar Observing Station's (HSOS) magnetogram and dopplergram are world-class instruments. Access to their data has been opened to the world. Web-based access will provide a powerful, convenient tool for data searching and solar physics research, so it is necessary to provide the data to users via the Web now that it is open to the world. In this presentation, the author describes the general design and programming of the system, which is built with PHP and MySQL. The author also introduces basic features of PHP and MySQL.

  2. Comparison of telephone with World Wide Web-based responses by parents and teens to a follow-up survey after injury.

    PubMed

    Rivara, Frederick P; Koepsell, Thomas D; Wang, Jin; Durbin, Dennis; Jaffe, Kenneth M; Vavilala, Monica; Dorsch, Andrea; Roper-Caldbeck, Maria; Houseknecht, Eileen; Temkin, Nancy

    2011-06-01

    To identify sociodemographic factors associated with completing a follow-up survey about health status on the web versus by telephone, and to examine differences in reported health-related quality of life by method of response. Survey about child health status of 896 parents of children aged 0-17 years treated in a hospital emergency department or admitted for a traumatic brain injury or arm injury, and 227 injured adolescents aged 14-17 years. The main outcomes were characteristics of those who completed a follow-up survey on the web versus by telephone and health-related quality of life by method of response. Email addresses were provided by 76.9 percent of parents and 56.5 percent of adolescents at baseline. Among those providing email addresses, the survey was completed on the web by 64.9 percent of parents and 40.2 percent of adolescents. Parents with email access who were Black or Hispanic, had lower incomes, or were not working were less likely to choose the web mode for completing the survey. Unlike for adolescents, the time for parents to complete the survey online was significantly shorter than completion by telephone. Differences by survey mode were small but statistically significant in some of the six functional outcome measures examined. Survey mode was associated with several sociodemographic characteristics. Sole use of web surveys could provide biased data. © Health Research and Educational Trust.

  3. Beyond Description: Converting Web Site Usage Statistics into Concrete Site Improvement Ideas

    ERIC Educational Resources Information Center

    Arendt, Julie; Wagner, Cassie

    2010-01-01

    Web site usage statistics are a widely used tool for Web site development, but libraries are still learning how to use them successfully. This case study summarizes how Morris Library at Southern Illinois University Carbondale implemented Google Analytics on its Web site and used the reports to inform a site redesign. As the main campus library at…

  4. Efficient exploration of pan-cancer networks by generalized covariance selection and interactive web content

    PubMed Central

    Kling, Teresia; Johansson, Patrik; Sanchez, José; Marinescu, Voichita D.; Jörnsten, Rebecka; Nelander, Sven

    2015-01-01

    Statistical network modeling techniques are increasingly important tools to analyze cancer genomics data. However, current tools and resources are not designed to work across multiple diagnoses and technical platforms, thus limiting their applicability to comprehensive pan-cancer datasets such as The Cancer Genome Atlas (TCGA). To address this, we describe a new data driven modeling method, based on generalized Sparse Inverse Covariance Selection (SICS). The method integrates genetic, epigenetic and transcriptional data from multiple cancers, to define links that are present in multiple cancers, a subset of cancers, or a single cancer. It is shown to be statistically robust and effective at detecting direct pathway links in data from TCGA. To facilitate interpretation of the results, we introduce a publicly accessible tool (cancerlandscapes.org), in which the derived networks are explored as interactive web content, linked to several pathway and pharmacological databases. To evaluate the performance of the method, we constructed a model for eight TCGA cancers, using data from 3900 patients. The model rediscovered known mechanisms and contained interesting predictions. Possible applications include prediction of regulatory relationships, comparison of network modules across multiple forms of cancer and identification of drug targets. PMID:25953855
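
    For readers unfamiliar with sparse inverse covariance selection, the short sketch below runs the standard (non-generalized) graphical lasso on simulated data with scikit-learn. It only illustrates the underlying idea of estimating direct, conditional links; the published method's extensions to mixed genetic, epigenetic and transcriptional data are not reproduced here.

        # Standard graphical lasso on simulated data: nonzero off-diagonal entries of the
        # estimated precision (inverse covariance) matrix indicate direct conditional links.
        import numpy as np
        from sklearn.covariance import GraphicalLasso

        rng = np.random.default_rng(0)
        X = rng.standard_normal((200, 10))   # 200 samples, 10 hypothetical molecular features
        X[:, 1] += 0.8 * X[:, 0]             # induce one direct dependency between features 0 and 1

        model = GraphicalLasso(alpha=0.1).fit(X)
        precision = model.precision_

        links = np.argwhere(np.abs(np.triu(precision, k=1)) > 1e-6)
        print("Estimated direct links between features:", links.tolist())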

  5. A Study of the Demographics of Web-Based Health-Related Social Media Users.

    PubMed

    Sadah, Shouq A; Shahbazi, Moloud; Wiley, Matthew T; Hristidis, Vagelis

    2015-08-06

    The rapid spread of Web-based social media in recent years has impacted how patients share health-related information. However, little work has studied the demographics of these users. Our aim was to study the demographics of users who participate in health-related Web-based social outlets to identify possible links to health care disparities. We analyze and compare three different types of health-related social outlets: (1) general Web-based social networks, Twitter and Google+, (2) drug review websites, and (3) health Web forums. We focus on the following demographic attributes: age, gender, ethnicity, location, and writing level. We build and evaluate domain-specific classifiers to infer missing data where possible. The estimated demographic statistics are compared against various baselines, such as Internet and social networks usage of the population. We found that (1) drug review websites and health Web forums are dominated by female users, (2) the participants of health-related social outlets are generally older with the exception of the 65+ years bracket, (3) blacks are underrepresented in health-related social networks, (4) users in areas with better access to health care participate more in Web-based health-related social outlets, and (5) the writing level of users in health-related social outlets is significantly lower than the reading level of the population. We identified interesting and actionable disparities in the participation of various demographic groups to various types of health-related social outlets. These disparities are significantly distinct from the disparities in Internet usage or general social outlets participation.

  6. Automated ocean color product validation for the Southern California Bight

    NASA Astrophysics Data System (ADS)

    Davis, Curtiss O.; Tufillaro, Nicholas; Jones, Burt; Arnone, Robert

    2012-06-01

    Automated match-ups allow us to maintain and improve the products of current satellite ocean color sensors (MODIS, MERIS) and new sensors (VIIRS). As part of the VIIRS mission preparation, we have created a web-based automated match-up tool that provides searchable fields for date, site, and products, and creates match-ups between satellite (MODIS, MERIS, VIIRS) and in-situ measurements (HyperPRO and SeaPRISM). The back end of the system is a MySQL database, and the front end is a PHP web portal with pull-down menus for the searchable fields. Based on the selections, graphics are generated showing match-ups and statistics, and ASCII files of the match-up data are created for download. Examples are shown for matching the satellite data with data from the Platform Eureka SeaPRISM off L.A. Harbor in the Southern California Bight.
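
    The core of such a tool is the match-up step itself: pairing each in-situ measurement with the nearest-in-time satellite retrieval within a tolerance window. The sketch below shows that step in Python with invented values and column names; it is not the MySQL/PHP implementation described in the abstract.

        # Pair each in-situ sample with the closest satellite overpass within three hours,
        # then compute a simple bias. Values and column names are invented for illustration.
        import pandas as pd

        in_situ = pd.DataFrame({
            "time": pd.to_datetime(["2012-06-01 18:55", "2012-06-02 19:10"]),
            "rrs_443_insitu": [0.0021, 0.0019],
        }).sort_values("time")

        satellite = pd.DataFrame({
            "time": pd.to_datetime(["2012-06-01 19:05", "2012-06-02 18:40"]),
            "rrs_443_sat": [0.0023, 0.0018],
        }).sort_values("time")

        matchups = pd.merge_asof(in_situ, satellite, on="time",
                                 direction="nearest", tolerance=pd.Timedelta("3h"))
        matchups["bias"] = matchups["rrs_443_sat"] - matchups["rrs_443_insitu"]
        print(matchups)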

  7. We Don't Train in Vain: A Dissemination Trial of Three Strategies of Training Clinicians in Cognitive–Behavioral Therapy

    PubMed Central

    Sholomskas, Diane E.; Syracuse-Siewert, Gia; Rounsaville, Bruce J.; Ball, Samuel A.; Nuro, Kathryn F.; Carroll, Kathleen M.

    2008-01-01

    There has been little research on the effectiveness of different training strategies or the impact of exposure to treatment manuals alone on clinicians' ability to effectively implement empirically supported therapies. Seventy-eight community-based clinicians were assigned to 1 of 3 training conditions: review of a cognitive–behavioral therapy (CBT) manual only, review of the manual plus access to a CBT training Web site, or review of the manual plus a didactic seminar followed by supervised casework. The primary outcome measure was the clinicians' ability to demonstrate key CBT interventions, as assessed by independent ratings of structured role plays. Statistically significant differences favoring the seminar plus supervision over the manual only condition were found for adherence and skill ratings for 2 of the 3 role plays, with intermediate scores for the Web condition. PMID:15709837

  8. The Virtual Solar-Terrestrial Observatory; access to and use of diverse solar and solar- terrestrial data.

    NASA Astrophysics Data System (ADS)

    Fox, P.; McGuinness, D.; Cinquini, L.; West, P.; Garcia, J.; Zednik, S.; Benedict, J.

    2008-05-01

    This presentation will demonstrate how users and other data providers can utilize the Virtual Solar-Terrestrial Observatory (VSTO) to find, access and use diverse data holdings from the disciplines of solar, solar-terrestrial and space physics. VSTO provides a web portal, web services and a native applications programming interface for various levels of users. Since these access methods are based on semantic web technologies and refer to the VSTO ontology, users also have the option of taking advantage of value added services when accessing and using the data. We present example of both conventional use of VSTO as well as the advanced semantics use. Finally, we present our future directions for VSTO and semantic data frameworks in general.

  9. The Virtual Solar-Terrestrial Observatory; access to and use of diverse solar and solar-terrestrial data

    NASA Astrophysics Data System (ADS)

    Fox, P.

    2007-05-01

    This presentation will demonstrate how users and other data providers can utilize the Virtual Solar-Terrestrial Observatory (VSTO) to find, access and use diverse data holdings from the disciplines of solar, solar-terrestrial and space physics. VSTO provides a web portal, web services and a native applications programming interface for various levels of users. Since these access methods are based on semantic web technologies and refer to the VSTO ontology, users also have the option of taking advantage of value added services when accessing and using the data. We present example of both conventional use of VSTO as well as the advanced semantics use. Finally, we present our future directions for VSTO and semantic data frameworks in general.

  10. ProteoSign: an end-user online differential proteomics statistical analysis platform.

    PubMed

    Efstathiou, Georgios; Antonakis, Andreas N; Pavlopoulos, Georgios A; Theodosiou, Theodosios; Divanach, Peter; Trudgian, David C; Thomas, Benjamin; Papanikolaou, Nikolas; Aivaliotis, Michalis; Acuto, Oreste; Iliopoulos, Ioannis

    2017-07-03

    Profiling of proteome dynamics is crucial for understanding cellular behavior in response to intrinsic and extrinsic stimuli and maintenance of homeostasis. Over the last 20 years, mass spectrometry (MS) has emerged as the most powerful tool for large-scale identification and characterization of proteins. Bottom-up proteomics, the most common MS-based proteomics approach, has always been challenging in terms of data management, processing, analysis and visualization, with modern instruments capable of producing several gigabytes of data from a single experiment. Here, we present ProteoSign, a freely available web application dedicated to allowing users to perform proteomics differential expression/abundance analysis in a user-friendly and self-explanatory way. Although several non-commercial standalone tools have been developed for post-quantification statistical analysis of proteomics data, most of them are not appealing to end users, as they often require stringent installation of programming environments, third-party software packages and sometimes further scripting or computer programming. To avoid this bottleneck, we have developed a user-friendly software platform accessible via a web interface in order to enable proteomics laboratories and core facilities to statistically analyse quantitative proteomics data sets in a resource-efficient manner. ProteoSign is available at http://bioinformatics.med.uoc.gr/ProteoSign and the source code at https://github.com/yorgodillo/ProteoSign. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. A statistical model and national data set for partitioning fish-tissue mercury concentration variation between spatiotemporal and sample characteristic effects

    USGS Publications Warehouse

    Wente, Stephen P.

    2004-01-01

    Many Federal, Tribal, State, and local agencies monitor mercury in fish-tissue samples to identify sites with elevated fish-tissue mercury (fish-mercury) concentrations, track changes in fish-mercury concentrations over time, and produce fish-consumption advisories. Interpretation of such monitoring data commonly is impeded by difficulties in separating the effects of sample characteristics (species, tissues sampled, and sizes of fish) from the effects of spatial and temporal trends on fish-mercury concentrations. Without such a separation, variation in fish-mercury concentrations due to differences in the characteristics of samples collected over time or across space can be misattributed to temporal or spatial trends; and/or actual trends in fish-mercury concentration can be misattributed to differences in sample characteristics. This report describes a statistical model that can separate spatiotemporal and sample characteristic effects in fish-mercury concentration data, together with a national data set (31,813 samples) for calibrating the model. This model could be useful for evaluating spatial and temporal trends in fish-mercury concentrations and developing fish-consumption advisories. The observed fish-mercury concentration data and model predictions can be accessed, displayed geospatially, and downloaded via the World Wide Web (http://emmma.usgs.gov). This report and the associated web site may assist in the interpretation of large amounts of data from widespread fish-mercury monitoring efforts.
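
    The partitioning idea can be sketched with a toy regression: model log mercury concentration as a function of sample characteristics (species, fish length) plus spatiotemporal terms (site, year), so each source of variation receives its own coefficients. The data below are invented and the USGS model itself is considerably more elaborate; this is only a conceptual illustration.

        # Toy illustration of separating sample-characteristic effects from site/year effects.
        import pandas as pd
        import statsmodels.formula.api as smf

        data = pd.DataFrame({
            "log_hg":    [-1.2, -0.9, -1.5, -0.7, -1.1, -0.8, -1.3, -0.6],
            "species":   ["walleye", "walleye", "bass", "bass", "walleye", "bass", "bass", "walleye"],
            "length_mm": [380, 450, 300, 520, 410, 480, 310, 500],
            "site":      ["A", "B", "A", "B", "A", "B", "A", "B"],
            "year":      [2000, 2002, 2000, 2002, 2001, 2001, 2002, 2000],
        })

        # Species and length capture sample characteristics; site and year capture space and time.
        model = smf.ols("log_hg ~ C(species) + length_mm + C(site) + year", data=data).fit()
        print(model.params)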

  12. ENGINES: exploring single nucleotide variation in entire human genomes.

    PubMed

    Amigo, Jorge; Salas, Antonio; Phillips, Christopher

    2011-04-19

    Next generation ultra-sequencing technologies are starting to produce extensive quantities of data from entire human genome or exome sequences, and therefore new software is needed to present and analyse this vast amount of information. The 1000 Genomes project has recently released raw data for 629 complete genomes representing several human populations through their Phase I interim analysis and, although there are certain public tools available that allow exploration of these genomes, to date there is no tool that permits comprehensive population analysis of the variation catalogued by such data. We have developed a genetic variant site explorer able to retrieve data for Single Nucleotide Variation (SNVs), population by population, from entire genomes without compromising future scalability and agility. ENGINES (ENtire Genome INterface for Exploring SNVs) uses data from the 1000 Genomes Phase I to demonstrate its capacity to handle large amounts of genetic variation (>7.3 billion genotypes and 28 million SNVs), as well as deriving summary statistics of interest for medical and population genetics applications. The whole dataset is pre-processed and summarized into a data mart accessible through a web interface. The query system allows the combination and comparison of each available population sample, while searching by rs-number list, chromosome region, or genes of interest. Frequency and FST filters are available to further refine queries, while results can be visually compared with other large-scale Single Nucleotide Polymorphism (SNP) repositories such as HapMap or Perlegen. ENGINES is capable of accessing large-scale variation data repositories in a fast and comprehensive manner. It allows quick browsing of whole genome variation, while providing statistical information for each variant site such as allele frequency, heterozygosity or FST values for genetic differentiation. Access to the data mart generating scripts and to the web interface is granted from http://spsmart.cesga.es/engines.php. © 2011 Amigo et al; licensee BioMed Central Ltd.
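
    The per-site statistics mentioned above are straightforward to compute from allele counts. The sketch below shows allele frequency, expected heterozygosity and a simple two-population Wright's Fst for one biallelic variant, using invented counts; it is unrelated to the ENGINES code base.

        # Per-site statistics for one biallelic SNV, computed from invented allele counts.
        def allele_frequency(n_ref, n_alt):
            return n_alt / (n_ref + n_alt)

        def expected_heterozygosity(p):
            return 2 * p * (1 - p)

        def fst_two_populations(p1, p2):
            """Wright's Fst as (Ht - Hs) / Ht for two equally weighted populations."""
            hs = (expected_heterozygosity(p1) + expected_heterozygosity(p2)) / 2
            p_bar = (p1 + p2) / 2
            ht = expected_heterozygosity(p_bar)
            return 0.0 if ht == 0 else (ht - hs) / ht

        p_pop1 = allele_frequency(n_ref=180, n_alt=20)   # alternate-allele frequency 0.10
        p_pop2 = allele_frequency(n_ref=120, n_alt=80)   # alternate-allele frequency 0.40
        print(f"Fst = {fst_two_populations(p_pop1, p_pop2):.3f}")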

  13. Deep Web video

    ScienceCinema

    None Available

    2018-02-06

    To make the web work better for science, OSTI has developed state-of-the-art technologies and services including a deep web search capability. The deep web includes content in searchable databases available to web users but not accessible by popular search engines, such as Google. This video provides an introduction to the deep web search engine.

  14. The Virtual Ramp to the Equivalent Experience in the Virtual Museum: Accessibility to Museums on the Web.

    ERIC Educational Resources Information Center

    Nevile, Liddy; McCathieNevile, Charles

    This paper argues that a range of forms and modalities of resources should be provided to ensure accessibility and richness on the World Wide Web for all users. Based on experiences in developing virtual exhibitions of Quinkan Aboriginal Rock Art, the authors present a brief overview of the technology available for accessibility. Then they explore…

  15. Web Content Accessibility Guidelines 2.0: A Further Step towards Accessible Digital Information

    ERIC Educational Resources Information Center

    Ribera, Mireia; Porras, Merce; Boldu, Marc; Termens, Miquel; Sule, Andreu; Paris, Pilar

    2009-01-01

    Purpose: The purpose of this paper is to explain the changes in the Web Content Accessibility Guidelines (WCAG) 2.0 compared with WCAG 1.0 within the context of its historical development. Design/methodology/approach: In order to compare WCAG 2.0 with WCAG 1.0 a diachronic analysis of the evolution of these standards is done. Known authors and…

  16. Design, Development and Testing of Web Services for Multi-Sensor Snow Cover Mapping

    NASA Astrophysics Data System (ADS)

    Kadlec, Jiri

    This dissertation presents the design, development and validation of new data integration methods for mapping the extent of snow cover based on open access ground station measurements, remote sensing images, volunteer observer snow reports, and cross-country ski track recordings from location-enabled mobile devices. The first step of the data integration procedure includes data discovery, data retrieval, and data quality control of snow observations at ground stations. The WaterML R package developed in this work enables hydrologists to retrieve and analyze data from multiple organizations that are listed in the Consortium of Universities for the Advancement of Hydrologic Sciences Inc (CUAHSI) Water Data Center catalog directly within the R statistical software environment. Use of the WaterML R package is demonstrated by running an energy balance snowpack model in R with data inputs from CUAHSI, and by automating uploads of real-time sensor observations to CUAHSI HydroServer. The second step of the procedure requires efficient access to multi-temporal remote sensing snow images. The Snow Inspector web application developed in this research enables users to retrieve a time series of fractional snow cover from the Moderate Resolution Imaging Spectroradiometer (MODIS) for any point on Earth. The time series retrieval method is based on automated data extraction from tile images provided by a Web Map Tile Service (WMTS). The average required time for retrieving 100 days of data using this technique is 5.4 seconds, which is significantly faster than other methods that require the download of large satellite image files. The presented data extraction technique and space-time visualization user interface can be used as a model for working with other multi-temporal hydrologic or climate data WMTS services. The third, final step of the data integration procedure is generating continuous daily snow cover maps. A custom inverse distance weighting method has been developed to combine volunteer snow reports, cross-country ski track reports and station measurements to fill cloud gaps in the MODIS snow cover product. The method is demonstrated by producing a continuous daily time step snow presence probability map dataset for the Czech Republic region. The ability of the presented methodology to reconstruct MODIS snow cover under cloud is validated by simulating cloud cover datasets and comparing estimated snow cover to actual MODIS snow cover. The percent correctly classified indicator showed accuracy between 80 and 90% using this method. Using crowdsourcing data (volunteer snow reports and ski tracks) improves the map accuracy by 0.7-1.2%. The output snow probability map data sets are published online using web applications and web services. Keywords: crowdsourcing, image analysis, interpolation, MODIS, R statistical software, snow cover, snowpack probability, Tethys platform, time series, WaterML, web services, winter sports.
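
    The gap-filling step rests on inverse distance weighting; a bare-bones version is sketched below to show the idea of estimating snow presence probability at a cloud-covered pixel from nearby ground reports. Coordinates and observations are invented, and the dissertation's custom weighting scheme is not reproduced.

        # Bare-bones inverse distance weighting (IDW) for a snow presence probability estimate.
        import numpy as np

        def idw(query_xy, obs_xy, obs_values, power=2.0):
            """IDW estimate at query_xy from observed points and values."""
            d = np.linalg.norm(obs_xy - query_xy, axis=1)
            if np.any(d == 0):                      # exact hit: return that observation
                return float(obs_values[np.argmin(d)])
            w = 1.0 / d ** power
            return float(np.sum(w * obs_values) / np.sum(w))

        # Ground reports: 1 = snow present, 0 = no snow (stations, ski tracks, volunteer reports).
        obs_xy = np.array([[0.0, 0.0], [5.0, 1.0], [2.0, 4.0], [6.0, 6.0]])
        obs_values = np.array([1.0, 1.0, 0.0, 0.0])

        cloudy_pixel = np.array([3.0, 3.0])
        print(f"Estimated snow probability: {idw(cloudy_pixel, obs_xy, obs_values):.2f}")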

  17. Design and Implementation of an Architectural Framework for Web Portals in a Ubiquitous Pervasive Environment

    PubMed Central

    Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol

    2009-01-01

    Web Portals function as a single point of access to information on the World Wide Web (WWW). The web portal always contacts the portal’s gateway for the information flow that causes network traffic over the Internet. Moreover, it provides real time/dynamic access to the stored information, but not access to the real time information. This inherent functionality of web portals limits their role for resource constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We have introduced the concept of Local Regions in the proposed framework, so that the local queries could be solved locally rather than having to route them over the Internet. Moreover, our framework enables one-to-one device communication for real time information flow. To provide an in-depth analysis, firstly, we provide an analytical model for query processing at the servers for our framework-oriented web portal. At the end, we have deployed a testbed, as one of the world’s largest IP based wireless sensor networks testbed, and real time measurements are observed that prove the efficacy and workability of the proposed framework. PMID:22346693

  18. Creating Web-Based Scientific Applications Using Java Servlets

    NASA Technical Reports Server (NTRS)

    Palmer, Grant; Arnold, James O. (Technical Monitor)

    2001-01-01

    There are many advantages to developing web-based scientific applications. Any number of people can access the application concurrently. The application can be accessed from a remote location. The application becomes essentially platform-independent because it can be run from any machine that has internet access and can run a web browser. Maintenance and upgrades to the application are simplified since only one copy of the application exists in a centralized location. This paper details the creation of web-based applications using Java servlets. Java is a powerful, versatile programming language that is well suited to developing web-based programs. A Java servlet provides the interface between the central server and the remote client machines. The servlet accepts input data from the client, runs the application on the server, and sends the output back to the client machine. The type of servlet that supports the HTTP protocol will be discussed in depth. Among the topics the paper will discuss are how to write an http servlet, how the servlet can run applications written in Java and other languages, and how to set up a Java web server. The entire process will be demonstrated by building a web-based application to compute stagnation point heat transfer.

  19. Design and implementation of an architectural framework for web portals in a ubiquitous pervasive environment.

    PubMed

    Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol

    2009-01-01

    Web Portals function as a single point of access to information on the World Wide Web (WWW). The web portal always contacts the portal's gateway for the information flow that causes network traffic over the Internet. Moreover, it provides real time/dynamic access to the stored information, but not access to the real time information. This inherent functionality of web portals limits their role for resource constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We have introduced the concept of Local Regions in the proposed framework, so that the local queries could be solved locally rather than having to route them over the Internet. Moreover, our framework enables one-to-one device communication for real time information flow. To provide an in-depth analysis, firstly, we provide an analytical model for query processing at the servers for our framework-oriented web portal. At the end, we have deployed a testbed, as one of the world's largest IP based wireless sensor networks testbed, and real time measurements are observed that prove the efficacy and workability of the proposed framework.

  20. Web based health surveys: Using a Two Step Heckman model to examine their potential for population health analysis.

    PubMed

    Morrissey, Karyn; Kinderman, Peter; Pontin, Eleanor; Tai, Sara; Schwannauer, Mathias

    2016-08-01

    In June 2011 the BBC Lab UK carried out a web-based survey on the causes of mental distress. The 'Stress Test' was launched on 'All in the Mind', a BBC Radio 4 programme, and the test's URL was publicised on radio and TV broadcasts, and made available via BBC web pages and social media. Given the large amount of data created (over 32,800 participants, with corresponding diagnosis, demographic and socioeconomic characteristics), the dataset is potentially an important source of data for population-based research on depression and anxiety. However, as respondents self-selected to participate in the online survey, the survey may comprise a non-random sample. It may be that only individuals who listen to BBC Radio 4 and/or use its website participated in the survey. In this instance, using the Stress Test data for wider population-based research may create sample selection bias. Focusing on the depression component of the Stress Test, this paper presents an easy-to-use method, the Two Step Probit Selection Model, to detect and statistically correct selection bias in the Stress Test. Using a Two Step Probit Selection Model, this paper did not find statistically significant selection on unobserved factors for participants of the Stress Test. That is, survey participants who accessed and completed an online survey are not systematically different from non-participants on the variables of substantive interest. Copyright © 2016 Elsevier Ltd. All rights reserved.
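
    For readers who want to see the mechanics, the sketch below works through a generic two-step (Heckman-style) selection correction on simulated data: a probit model of participation, an inverse Mills ratio derived from it, and an outcome regression that includes that ratio. A non-significant Mills-ratio coefficient corresponds to the "no selection on unobservables" result reported above. The data and variable names are simulated, not the Stress Test data.

        # Generic two-step selection correction on simulated data (not the Stress Test data).
        import numpy as np
        import statsmodels.api as sm
        from scipy.stats import norm

        rng = np.random.default_rng(1)
        n = 2000
        age = rng.normal(45, 12, n)
        radio_listener = rng.binomial(1, 0.3, n)       # selection-related covariate

        # Simulated participation and outcome (no selection on unobservables built in).
        select_latent = -1.0 + 0.8 * radio_listener + 0.01 * age + rng.standard_normal(n)
        selected = (select_latent > 0).astype(int)
        depression = 10 + 0.05 * age + rng.normal(0, 2, n)

        # Step 1: probit model of participation on the full sample.
        Z = sm.add_constant(np.column_stack([radio_listener, age]))
        probit = sm.Probit(selected, Z).fit(disp=False)
        xb = Z @ probit.params
        mills = norm.pdf(xb) / norm.cdf(xb)            # inverse Mills ratio

        # Step 2: outcome regression on participants only, adding the Mills ratio.
        mask = selected == 1
        X = sm.add_constant(np.column_stack([age[mask], mills[mask]]))
        outcome = sm.OLS(depression[mask], X).fit()
        print(outcome.summary())                        # inspect the Mills-ratio coefficient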

  1. Sign language Web pages.

    PubMed

    Fels, Deborah I; Richards, Jan; Hardman, Jim; Lee, Daniel G

    2006-01-01

    The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, accessing the Web can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The present article describes a system that allows sign language-only Web pages to be created and linked through a video-based technique called sign-linking. In two studies, 14 Deaf participants examined two iterations of sign-linked Web pages to gauge the usability and learnability of a signing Web page interface. The first study indicated that signing Web pages were usable by sign language users but that some interface features required improvement. The second study showed increased usability for those features; users consequently could navigate sign language information with ease and pleasure.

  2. Rural and Urban/Suburban Families' Use of a Web-Based Mental Health Intervention

    PubMed Central

    Davidson, Tatiana M.; Dewey, Daniel; Price, Matthew; Ruggiero, Kenneth J.

    2017-01-01

    Abstract Background/Introduction: Access to mental healthcare among rural residents is a national concern because unique barriers (e.g., fewer providers, distance to services) create significant challenges for the 60 million Americans who live in these settings. There is now a large body of literature demonstrating the efficacy of a wide range of Internet-based interventions. However, little is known about the extent to which individuals in rural settings will use these approaches and find them acceptable. Research with youths and their caregivers within this scope is particularly limited and, therefore, of great importance. Methods: We examined access and completion of a Web-based disaster mental health intervention in a population-based sample of 1,997 rural (n = 676) and urban/suburban (n = 1,321) adolescents and their caregivers who were affected by the Spring 2011 tornadoes that touched down in parts of Missouri and Alabama. Results: Results indicated no differences in the rate of access or completion of Web-based modules based on geographical location. Furthermore, for those who did not access the Web-based resource, no differences were observed with respect to reasons for not accessing modules based on geographical location. Discussion: These data have promising implications for the reach of Web-based resources to both rural and urban/suburban communities, as well as the willingness of adolescents and their caregivers to access and complete such resources, regardless of geographical location. PMID:27753542

  3. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    PubMed

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

    Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results focus on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision-making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.
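
    The central geoprocessing step, turning geocoded case points into a density surface, can be sketched with a Gaussian kernel density estimate. The coordinates below are invented, not Barcelona TB cases, and the REST services described in the article are not reproduced.

        # Kernel density estimate over invented geocoded points, evaluated on a regular grid.
        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(0)
        # Hypothetical projected x/y coordinates (metres) of geocoded cases.
        cases = np.vstack([rng.normal(430000, 800, 150), rng.normal(4582000, 600, 150)])

        kde = gaussian_kde(cases)

        xg = np.linspace(cases[0].min(), cases[0].max(), 100)
        yg = np.linspace(cases[1].min(), cases[1].max(), 100)
        xx, yy = np.meshgrid(xg, yg)
        density = kde(np.vstack([xx.ravel(), yy.ravel()])).reshape(xx.shape)
        print("Peak density cell value:", density.max())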

  4. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    PubMed Central

    2011-01-01

    Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results focus on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision-making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392

  5. Village Green Project: Web-accessible Database

    EPA Science Inventory

    The purpose of this web-accessible database is for the public to be able to view instantaneous readings from a solar-powered air monitoring station located in a public location (prototype pilot test is outside of a library in Durham County, NC). The data are wirelessly transmitte...

  6. How to Boost Engineering Support Via Web 2.0 - Seeds for the Ares Project...and/or Yours?

    NASA Technical Reports Server (NTRS)

    Scott, David W.

    2010-01-01

    The Mission Operations Laboratory (MOL) at Marshall Space Flight Center (MSFC) is responsible for Engineering Support capability for NASA's Ares launch system development. In pursuit of this, MOL is building the Ares Engineering and Operations Network (AEON), a web-based portal intended to provide a seamless interface to support and simplify two critical activities: a) Access and analyze Ares manufacturing, test, and flight performance data, with access to Shuttle data for comparison. b) Provide archive storage for engineering instrumentation data to support engineering design, development, and test. A mix of NASA-written and COTS software provides engineering analysis tools. A by-product of using a data portal to access and display data is access to collaborative tools inherent in a Web 2.0 environment. This paper discusses how Web 2.0 techniques, particularly social media, might be applied to the traditionally conservative and formal engineering support arena. A related paper by the author [1] considers use

  7. Combat Stories Map: A Historical Repository and After Action Tool for Capturing, Storing, and Analyzing Georeferenced Individual Combat Narratives

    DTIC Science & Technology

    2016-06-01

    Despite the proliferation of technology and near-global Internet accessibility, a web-based program incorporating interactive maps to record personal combat experiences does not exist. The Combat Stories Map addresses this deficiency. The Combat Stories Map is a web-based Geographic Information System specifically designed...

  8. WebView Materialization

    DTIC Science & Technology

    2000-01-01

    horoscope page (for Scorpio). Although this particular combination might be unique or unpopular, if we decompose the page into four WebViews, one for metro...news, one for international news, one for the weather and one for the horoscope, then these WebViews can be accessed frequently enough to merit...query results, the cost of accessing them is about the same as the cost of generating them from scratch, using the virt policy. This will also be true

  9. The quality of patient-orientated Internet information on oral lichen planus: a pilot study.

    PubMed

    López-Jornet, Pía; Camacho-Alonso, Fabio

    2010-10-01

    This study examines the accessibility and quality of Web pages related to oral lichen planus. Sites were identified using two search engines (Google and Yahoo!) and the search terms 'oral lichen planus' and 'oral lesion lichenoid'. The first 100 sites in each search were visited and classified. The web sites were evaluated for content quality using the validated DISCERN rating instrument, JAMA benchmarks and the 'Health on the Net' (HON) seal. A total of 109,000 sites were recorded in Google using the search terms and 520,000 in Yahoo! A total of 19 Web pages considered relevant were examined on Google and 20 on Yahoo! As regards the JAMA benchmarks, only two pages satisfied the four criteria in Google (10%), and only three (15%) in Yahoo! As regards DISCERN, the overall quality of web site information was poor, with no site reaching the maximum score. In Google, 78.94% of sites had important deficiencies, compared with 50% in Yahoo!, the difference between the two search engines being statistically significant (P = 0.031). Only five pages (17.2%) on Google and eight (40%) on Yahoo! showed the HON code. Based on our review, doctors must assume primary responsibility for educating and counselling their patients. © 2010 Blackwell Publishing Ltd.

  10. The meaning of web-based communication for support: from the patients' perspective within a hematological healthcare setting.

    PubMed

    Högberg, Karin M; Stockelberg, Dick; Sandman, Lars; Broström, Anders; Nyström, Maria

    2015-01-01

    Being critically ill with a hematological disease is a challenge, sometimes causing a need for support in the adjustment to the stressful life situation. By providing Web-based communication for support from a nurse, patients get access to an alternative and untraditional way to communicate their issues. The aim was to describe the meaning of using Web-based communication for support from a patient perspective. A comprehensive randomized pilot study (n = 30) was conducted to evaluate feasibility, allowing the 15 patients in the experimental group to have access to the Web-based communication. Of these 15 participants, 10 were interviewed, focusing on their experiences. An empirical hermeneutical approach was used and the interpretive analysis focused on the meanings. Web-based communication for support means a space for patients to have their say, consolidation of a matter, an extended caring relationship, access to individual medical assessment, and an opportunity for emotional processing. The main interpretation indicates that the patient's influence on the communication is strengthened by its asynchronous, faceless, and written nature. The increased, and in some sense constant, access to an individual medical and caring assessment, in turn, implies a feeling of safety. Web-based communication for support seems to have the potential to enhance patients' participation on their own terms. To achieve the possible advantages of Web-based communication for support, nurses must acquire knowledge about caring writing. It requires respect for the patient and articulated accuracy and attention in the response given.

  11. Web-based education in anesthesiology: a critical overview.

    PubMed

    Doyle, D John

    2008-12-01

    The purpose of this review is to discuss the rise of web-based educational resources available to the anesthesiology community. Recent developments of particular importance include the growth of 'Web 2.0' resources, the development of the concepts of 'open access' and 'information philanthropy', and the expansion of web-based medical simulation software products. In addition, peer review of online educational resources has now come of age. The World Wide Web has made available a large variety of valuable medical information and education resources only dreamed of two decades ago. To a large extent, these developments represent a shift in the focus of medical education resources to emphasize free access to materials and to encourage collaborative development efforts.

  12. The ChIP-Seq tools and web server: a resource for analyzing ChIP-seq and other types of genomic data.

    PubMed

    Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp

    2016-11-18

    ChIP-seq and related high-throughput chromatin profiling assays generate ever-increasing volumes of highly valuable biological data. To make sense out of it, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis of such data. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting with a read alignment file. The tools are optimized for memory efficiency and speed, thus allowing for processing of large data volumes on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design in that the output from one application can serve as input to another one. Complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None Available

    To make the web work better for science, OSTI has developed state-of-the-art technologies and services including a deep web search capability. The deep web includes content in searchable databases available to web users but not accessible by popular search engines, such as Google. This video provides an introduction to the deep web search engine.

  14. Health 2.0 and Implications for Nursing Education

    PubMed Central

    Nelson, Ramona

    2012-01-01

    Over the last 20 years the evolution of web browsers providing easy access to the Internet has initiated a revolution in access to healthcare-related information for both healthcare providers and patients. This access has changed both the process used to deliver education and the content of the nursing education curriculum worldwide. Our amazing ability to access information around the world is referred to as Web 1.0. Web 2.0 moves beyond access to a world where users are interactively creating information. With the advent of Health 2.0 we are confronting a second revolution that is challenging all aspects of healthcare, including all aspects of nursing. This paper explores the concept of Health 2.0, discusses a conceptual framework approach for integrating Health 2.0 content into the nursing curriculum, outlines examples of key concepts required in today's nursing curriculum and identifies selected issues arising from the impact of Health 2.0. PMID:24199108

  15. Standardized Access and Processing of Multi-Source Earth Observation Time-Series Data within a Regional Data Middleware

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Schmullius, C.

    2017-12-01

    Increasing archives of global satellite data present a new challenge to handle multi-source satellite data in a user-friendly way. Any user is confronted with different data formats and data access services. In addition the handling of time-series data is complex as an automated processing and execution of data processing steps is needed to supply the user with the desired product for a specific area of interest. In order to simplify the access to data archives of various satellite missions and to facilitate the subsequent processing, a regional data and processing middleware has been developed. The aim of this system is to provide standardized and web-based interfaces to multi-source time-series data for individual regions on Earth. For further use and analysis uniform data formats and data access services are provided. Interfaces to data archives of the sensor MODIS (NASA) as well as the satellites Landsat (USGS) and Sentinel (ESA) have been integrated in the middleware. Various scientific algorithms, such as the calculation of trends and breakpoints of time-series data, can be carried out on the preprocessed data on the basis of uniform data management. Jupyter Notebooks are linked to the data and further processing can be conducted directly on the server using Python and the statistical language R. In addition to accessing EO data, the middleware is also used as an intermediary between the user and external databases (e.g., Flickr, YouTube). Standardized web services as specified by OGC are provided for all tools of the middleware. Currently, the use of cloud services is being researched to bring algorithms to the data. As a thematic example, an operational monitoring of vegetation phenology is being implemented on the basis of various optical satellite data and validation data from the German Weather Service. Other examples demonstrate the monitoring of wetlands focusing on automated discovery and access of Landsat and Sentinel data for local areas.
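
    As an indication of the kind of time-series analysis the middleware exposes through its Python notebooks, the sketch below fits an overall linear trend and searches for a single breakpoint in a simulated vegetation-index series. The data are invented and the operational trend/breakpoint algorithms are more sophisticated than this.

        # Simulated series with a step change in 2010: report the linear trend and the split
        # year that minimizes the pooled residual sum of squares (a crude breakpoint search).
        import numpy as np

        rng = np.random.default_rng(2)
        t = np.arange(2000, 2020)
        series = np.where(t < 2010, 0.55, 0.45) + rng.normal(0, 0.02, t.size)

        slope, intercept = np.polyfit(t, series, 1)
        print(f"Trend: {slope:.4f} per year")

        def split_sse(k):
            left, right = series[:k], series[k:]
            return ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()

        best = min(range(2, t.size - 2), key=split_sse)
        print(f"Most likely breakpoint year: {t[best]}")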

  16. Getting Started with Drupal WebCMS

    EPA Pesticide Factsheets

    Drupal WebCMS is accessible to EPA employees, and to onsite and offsite contractors. There are several roles in Drupal WebCMS and each allows a certain set of actions in the system. Users can have different roles in different web areas.

  17. Programmatic access to data and information at the IRIS DMC via web services

    NASA Astrophysics Data System (ADS)

    Weertman, B. R.; Trabant, C.; Karstens, R.; Suleiman, Y. Y.; Ahern, T. K.; Casey, R.; Benson, R. B.

    2011-12-01

    The IRIS Data Management Center (DMC) has developed a suite of web services that provide access to the DMC's time series holdings, their related metadata and earthquake catalogs. In addition, services are available to perform simple, on-demand time series processing at the DMC prior to the data being shipped to the user. The primary goal is to provide programmatic access to data and processing services in a manner usable by and useful to the research community. The web services are relatively simple to understand and use and will form the foundation on which future DMC access tools will be built. Based on standard Web technologies, they can be accessed programmatically with a wide range of programming languages (e.g. Perl, Python, Java), with command line utilities such as wget and curl, or with any web browser. We anticipate these services being used for everything from simple command line access and use in shell scripts and higher programming languages to integration within complex data processing software. In addition to improving access to our data by the seismological community, the web services will also make our data more accessible to other disciplines. The web services available from the DMC include ws-bulkdataselect for the retrieval of large volumes of miniSEED data, ws-timeseries for the retrieval of individual segments of time series data in a variety of formats (miniSEED, SAC, ASCII, audio WAVE, and PNG plots) with optional signal processing, ws-station for station metadata in StationXML format, ws-resp for the retrieval of instrument response in RESP format, ws-sacpz for the retrieval of sensor response in the SAC poles and zeros convention, and ws-event for the retrieval of earthquake catalogs. To make the services even easier to use, the DMC is developing a library that allows Java programmers to seamlessly retrieve and integrate DMC information into their own programs. The library will handle all aspects of dealing with the services and will parse the returned data. By using this library a developer will not need to learn the details of the service interfaces or understand the data formats returned. This library will be used to build the software bridge needed to request data and information from within MATLAB. We also provide several client scripts written in Perl for the retrieval of waveform data, metadata and earthquake catalogs using command line programs. For more information on the DMC's web services please visit http://www.iris.edu/ws/
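
    Because the services are plain HTTP, a request can be issued from almost any language. The Python sketch below calls a DMC-style time series endpoint; the exact query URL and parameter names should be taken from the DMC documentation, as the ones used here are illustrative assumptions.

        # Illustrative HTTP request to a DMC-style time series service; check the current
        # service URL and parameter names against the DMC documentation before use.
        import requests

        SERVICE = "http://www.iris.edu/ws/timeseries/query"   # assumed query endpoint

        params = {
            "net": "IU", "sta": "ANMO", "loc": "00", "cha": "BHZ",   # SEED channel codes
            "starttime": "2011-08-23T17:51:00",
            "endtime": "2011-08-23T18:01:00",
            "output": "ascii",                                        # one of the listed formats
        }

        response = requests.get(SERVICE, params=params, timeout=120)
        response.raise_for_status()
        print(response.text[:200])   # first part of the returned ASCII time series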

  18. Integrated databanks access and sequence/structure analysis services at the PBIL.

    PubMed

    Perrière, Guy; Combet, Christophe; Penel, Simon; Blanchet, Christophe; Thioulouse, Jean; Geourjon, Christophe; Grassot, Julien; Charavay, Céline; Gouy, Manolo; Duret, Laurent; Deléage, Gilbert

    2003-07-01

    The World Wide Web server of the PBIL (Pôle Bioinformatique Lyonnais) provides on-line access to sequence databanks and to many tools for nucleic acid and protein sequence analysis. This server allows users to query nucleotide sequence banks in the EMBL and GenBank formats and protein sequence banks in the SWISS-PROT and PIR formats. The query engine on which our databank access is based is the ACNUC system. It makes it possible to build complex queries to access functional zones of biological interest and to retrieve large sequence sets. Of special interest are the unique features provided by this system for querying the databanks of gene families developed at the PBIL. The server also provides access to a wide range of sequence analysis methods: similarity search programs, multiple alignments, protein structure prediction and multivariate statistics. A distinctive feature of this server is the integration of these two aspects: sequence retrieval and sequence analysis. Indeed, thanks to the introduction of re-usable lists, it is possible to perform analyses on large sets of data. The PBIL server can be reached at: http://pbil.univ-lyon1.fr.

  19. Pathview Web: user friendly pathway visualization and data integration

    PubMed Central

    Pant, Gaurav; Bhavnasi, Yeshvant K.; Blanchard, Steven G.; Brouwer, Cory

    2017-01-01

    Pathway analysis is widely used in omics studies. Pathway-based data integration and visualization is a critical component of the analysis. To address this need, we recently developed a novel R package called Pathview. Pathview maps, integrates and renders a large variety of biological data onto molecular pathway graphs. Here we developed the Pathview Web server to make pathway visualization and data integration accessible to all scientists, including those without specialized computing skills or resources. Pathview Web features an intuitive graphical web interface and a user-centered design. The server not only expands the core functions of Pathview, but also provides many useful features not available in the offline R package. Importantly, the server presents a comprehensive workflow for both regular and integrated pathway analysis of multiple omics data. In addition, the server provides a RESTful API for programmatic access and convenient integration into third-party software or workflows. Pathview Web is openly and freely accessible at https://pathview.uncc.edu/. PMID:28482075

  20. Implementing a Web-Based Decision Support System to Spatially and Statistically Analyze Ecological Conditions of the Sierra Nevada

    NASA Astrophysics Data System (ADS)

    Nguyen, A.; Mueller, C.; Brooks, A. N.; Kislik, E. A.; Baney, O. N.; Ramirez, C.; Schmidt, C.; Torres-Perez, J. L.

    2014-12-01

    The Sierra Nevada is experiencing changes in hydrologic regimes, such as decreases in snowmelt and peak runoff, which affect forest health and the availability of water resources. Currently, the USDA Forest Service Region 5 is undergoing Forest Plan revisions to incorporate climate change impacts into mitigation and adaptation strategies. However, there are few processes in place to conduct quantitative assessments of forest conditions in relation to mountain hydrology while easily and effectively delivering that information to forest managers. To assist the USDA Forest Service, this study is the final phase of a three-term project to create a Decision Support System (DSS) that provides easy access to historical and forecasted hydrologic, climatic, and terrestrial conditions for the entire Sierra Nevada. These data are featured within three components of the DSS: the Mapping Viewer, Statistical Analysis Portal, and Geospatial Data Gateway. Utilizing ArcGIS Online, the Sierra DSS Mapping Viewer enables users to visually analyze and locate areas of interest. Once areas of interest are targeted, the Statistical Analysis Portal provides subbasin-level statistics for each variable over time by utilizing a recently developed web-based data analysis and visualization tool called Plotly. This tool allows users to generate graphs and conduct statistical analyses for the Sierra Nevada without the need to download the dataset of interest. For more comprehensive analysis, users are also able to download datasets via the Geospatial Data Gateway. The third phase of this project focused on Python-based data processing, the adaptation of the multiple capabilities of ArcGIS Online and Plotly, and the integration of the three Sierra DSS components within a website designed specifically for the USDA Forest Service.
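
    The abstract does not show how the Plotly graphs are produced; the short sketch below uses the current open-source Plotly Python API (which differs from the hosted Plotly service available when this work was done) to chart an illustrative subbasin statistic over time. The variable names and values are made up for the example.

      import plotly.graph_objects as go

      # Illustrative subbasin-level statistic (e.g., mean April snow-water equivalent, in cm).
      years = list(range(1980, 2015))
      swe = [60 - 0.4 * (y - 1980) + (5 if y % 7 == 0 else 0) for y in years]

      fig = go.Figure(go.Scatter(x=years, y=swe, mode="lines+markers", name="Subbasin mean SWE"))
      fig.update_layout(title="Illustrative subbasin statistic over time",
                        xaxis_title="Water year", yaxis_title="Mean April SWE (cm)")
      fig.show()  # renders an interactive graph in a browser or notebook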

  1. Application of ESE Data and Tools to Air Quality Management: Services for Helping the Air Quality Community use ESE Data (SHAirED)

    NASA Technical Reports Server (NTRS)

    Falke, Stefan; Husar, Rudolf

    2011-01-01

    The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US States and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project will develop access to distributed data, build Web infrastructure, and create tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and a service oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions through the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications and established community-oriented infrastructures were: developing access to distributed data (surface and satellite); building Web infrastructure to support data access, processing and analysis; creating tools for data processing and analysis; and fostering air quality community collaboration and interoperability.

  2. Library OPACs on the Web: Finding and Describing Directories.

    ERIC Educational Resources Information Center

    Henry, Marcia

    1997-01-01

    Provides current descriptions of some of the major directories that link to library catalogs on the World Wide Web. Highlights include LibWeb; Hytelnet; WebCats; WWW Library Directory; and techniques for finding new library OPAC (online public access catalog) directories. (LRW)

  3. 32 CFR 806.15 - FOIA exemptions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... safeguarding social security numbers (SSN). It states: “SSNs are personal and unique to each individual... accessible DoD web sites unless to do so is clearly authorized by law and implementing regulation and policy. Personal information should not be posted at nonpublicly accessible web sites unless it is mission...

  4. Preservice Teachers' Experiences on Accessing Course Materials Using Mobile Devices

    ERIC Educational Resources Information Center

    Unal, Zafer; Unal, Aslihan

    2014-01-01

    This study investigates and reports the first time experiences of mobile device users accessing the course materials on both the web and mobile version of course management system (Web Moodle & Mobile Moodle) during an online course offered at the University of South Florida, St. Petersburg College of Education.

  5. Washington Public Libraries Online: Collaborating in Cyberspace.

    ERIC Educational Resources Information Center

    Wildin, Nancy

    1997-01-01

    Discussion of public libraries, the Internet, and the World Wide Web focuses on development of a Web site in Washington. Highlights include access to the Internet through online public access catalogs; partnerships between various types of libraries; hardware and software; HTML training; content design; graphics design; marketing; evaluation; and…

  6. 1 CFR 301.1 - Establishment and location.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... information about the Conference either by accessing its Web site at http://www.acus.gov, by calling the Conference offices at (202) 480-2080, or by contacting [email protected]gov. The Conference's recommendations may be obtained by accessing its Web site or by visiting the reading room at its offices. ...

  7. A Web Site that Provides Resources for Assessing Students' Statistical Literacy, Reasoning and Thinking

    ERIC Educational Resources Information Center

    Garfield, Joan; delMas, Robert

    2010-01-01

    The Assessment Resource Tools for Improving Statistical Thinking (ARTIST) Web site was developed to provide high-quality assessment resources for faculty who teach statistics at the tertiary level but resources are also useful to statistics teachers at the secondary level. This article describes some of the numerous ARTIST resources and suggests…

  8. The LandCarbon Web Application: Advanced Geospatial Data Delivery and Visualization Tools for Communication about Ecosystem Carbon Sequestration and Greenhouse Gas Fluxes

    NASA Astrophysics Data System (ADS)

    Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.

    2015-12-01

    The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application programming interface (API), which will allow other organizations to build their own custom applications and tools. New features such as finer scale aggregation and an online carbon calculator are being added to the LandCarbon web application to continue to make the site interactive, visually compelling, and useful for a wide range of users.
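
    The assessment results are exposed through an open API, so other organizations can consume them programmatically. The sketch below shows the general pattern of such a request in Python; the endpoint path, query parameters, and response fields are placeholders invented for illustration, and the real interface is documented at http://landcarbon.org.

      import requests

      # Hypothetical request against the LandCarbon open API (placeholder endpoint and parameters).
      url = "http://landcarbon.org/api/summary"            # placeholder endpoint
      params = {"ecoregion": "6.2.12", "scenario": "A1B"}  # placeholder parameters

      response = requests.get(url, params=params, timeout=30)
      response.raise_for_status()
      summary = response.json()

      # Print whatever key/value pairs the (assumed) JSON summary contains.
      for key, value in summary.items():
          print(key, value)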

  9. No Longer Conveyor but Creator: Developing an Epistemology of the World Wide Web.

    ERIC Educational Resources Information Center

    Trombley, Laura E. Skandera; Flanagan, William G.

    2001-01-01

    Discusses the impact of the World Wide Web in terms of epistemology. Topics include technological innovations, including new dimensions of virtuality; the accessibility of information; tracking Web use via cookies; how the Web transforms the process of learning and knowing; linking information sources; and the Web as an information delivery…

  10. Surfing the World Wide Web to Education Hot-Spots.

    ERIC Educational Resources Information Center

    Dyrli, Odvard Egil

    1995-01-01

    Provides a brief explanation of Web browsers and their use, as well as technical information for those considering access to the WWW (World Wide Web). Curriculum resources and addresses to useful Web sites are included. Sidebars show sample searches using Yahoo and Lycos search engines, and a list of recommended Web resources. (JKP)

  11. Network of Research Infrastructures for European Seismology (NERIES)-Web Portal Developments for Interactive Access to Earthquake Data on a European Scale

    NASA Astrophysics Data System (ADS)

    Spinuso, A.; Trani, L.; Rives, S.; Thomy, P.; Euchner, F.; Schorlemmer, D.; Saul, J.; Heinloo, A.; Bossu, R.; van Eck, T.

    2009-04-01

    The Network of Research Infrastructures for European Seismology (NERIES) is a European Commission (EC) project whose focus is networking together seismological observatories and research institutes into one integrated European infrastructure that provides access to data and data products for research. Seismological institutes and organizations in European and Mediterranean countries maintain large, geographically distributed data archives; this scenario therefore suggested a design approach based on the concept of an internet service oriented architecture (SOA) to establish a cyberinfrastructure for distributed and heterogeneous data streams and services. Moreover, one of the goals of NERIES is to design and develop a Web portal that acts as the uppermost layer of the infrastructure and provides rendering capabilities for the underlying sets of data. The Web services that are currently being designed and implemented will deliver data that has been adapted to appropriate formats. The parametric information about a seismic event is delivered using a seismology-specific Extensible Markup Language (XML) format called QuakeML (https://quake.ethz.ch/quakeml), which has been formalized and implemented in coordination with global earthquake-information agencies. Uniform Resource Identifiers (URIs) are used to assign identifiers to (1) seismic-event parameters described by QuakeML, and (2) generic resources, for example, authorities, location providers, location methods, software adopted, and so on, described by use of a data model constructed with the resource description framework (RDF) and accessible as a service. The European-Mediterranean Seismological Center (EMSC) has implemented a unique event identifier (UNID) that will create the seismic event URI used by the QuakeML data model. Access to data such as broadband waveforms, accelerometric data and station inventories will also be provided through a set of Web services that will wrap the middleware used by the seismological observatory or institute that is supplying the data. Each single application of the portal consists of a Java-based JSR-168-standard portlet (often provided with interactive maps for data discovery). In specific cases, it will be possible to distribute the deployment of the portlets among the data providers, such as seismological agencies, because the distributed architecture of the NERIES portal adopts the Web Services for Remote Portlets (WSRP) standard for presentation-oriented web services. The purpose of the portal is to provide users with their own environment where they can browse and retrieve the data of interest, offering a set of shopping carts with storage and management facilities. This approach involves having the user interact with dedicated tools in order to compose personalized datasets that can be downloaded or combined with other information available either through the NERIES network of Web services or through the user's own carts. Administrative applications are also provided to perform monitoring tasks such as retrieving service statistics or scheduling submitted data requests. An administrative tool is included that allows the RDF model to be extended, within certain constraints, with new classes and properties.
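
    Since QuakeML is an XML dialect, event parameters delivered by such services can be read with any XML parser. The sketch below parses a minimal, hand-written QuakeML-style fragment with Python's standard library; the fragment is abbreviated for illustration and is not an actual NERIES or EMSC response.

      import xml.etree.ElementTree as ET

      # Abbreviated, hand-written QuakeML-style fragment (illustrative only).
      QUAKEML = """<q:quakeml xmlns:q="http://quakeml.org/xmlns/quakeml/1.2"
                   xmlns="http://quakeml.org/xmlns/bed/1.2">
        <eventParameters publicID="smi:example/catalog">
          <event publicID="smi:example/event/1">
            <origin><time><value>2008-12-01T10:15:30Z</value></time></origin>
            <magnitude><mag><value>4.7</value></mag></magnitude>
          </event>
        </eventParameters>
      </q:quakeml>"""

      ns = {"bed": "http://quakeml.org/xmlns/bed/1.2"}
      root = ET.fromstring(QUAKEML)
      for event in root.iter("{http://quakeml.org/xmlns/bed/1.2}event"):
          time = event.find("bed:origin/bed:time/bed:value", ns).text
          mag = event.find("bed:magnitude/bed:mag/bed:value", ns).text
          print(event.get("publicID"), time, mag)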

  12. Network of Research Infrastructures for European Seismology (NERIES) - Web Portal Developments for Interactive Access to Earthquake Data on a European Scale

    NASA Astrophysics Data System (ADS)

    Spinuso, A.; Trani, L.; Rives, S.; Thomy, P.; Euchner, F.; Schorlemmer, D.; Saul, J.; Heinloo, A.; Bossu, R.; van Eck, T.

    2008-12-01

    The Network of Research Infrastructures for European Seismology (NERIES) is a European Commission (EC) project whose focus is networking together seismological observatories and research institutes into one integrated European infrastructure that provides access to data and data products for research. Seismological institutes and organizations in European and Mediterranean countries maintain large, geographically distributed data archives; this scenario therefore suggested a design approach based on the concept of an internet service oriented architecture (SOA) to establish a cyberinfrastructure for distributed and heterogeneous data streams and services. Moreover, one of the goals of NERIES is to design and develop a Web portal that acts as the uppermost layer of the infrastructure and provides rendering capabilities for the underlying sets of data. The Web services that are currently being designed and implemented will deliver data that has been adapted to appropriate formats. The parametric information about a seismic event is delivered using a seismology-specific Extensible Markup Language (XML) format called QuakeML (https://quake.ethz.ch/quakeml), which has been formalized and implemented in coordination with global earthquake-information agencies. Uniform Resource Identifiers (URIs) are used to assign identifiers to (1) seismic-event parameters described by QuakeML, and (2) generic resources, for example, authorities, location providers, location methods, software adopted, and so on, described by use of a data model constructed with the resource description framework (RDF) and accessible as a service. The European-Mediterranean Seismological Center (EMSC) has implemented a unique event identifier (UNID) that will create the seismic event URI used by the QuakeML data model. Access to data such as broadband waveforms, accelerometric data and station inventories will also be provided through a set of Web services that will wrap the middleware used by the seismological observatory or institute that is supplying the data. Each single application of the portal consists of a Java-based JSR-168-standard portlet (often provided with interactive maps for data discovery). In specific cases, it will be possible to distribute the deployment of the portlets among the data providers, such as seismological agencies, because the distributed architecture of the NERIES portal adopts the Web Services for Remote Portlets (WSRP) standard for presentation-oriented web services. The purpose of the portal is to provide users with their own environment where they can browse and retrieve the data of interest, offering a set of shopping carts with storage and management facilities. This approach involves having the user interact with dedicated tools in order to compose personalized datasets that can be downloaded or combined with other information available either through the NERIES network of Web services or through the user's own carts. Administrative applications are also provided to perform monitoring tasks such as retrieving service statistics or scheduling submitted data requests. An administrative tool is included that allows the RDF model to be extended, within certain constraints, with new classes and properties.

  13. Create and Maintain Content

    EPA Pesticide Factsheets

    Find resources and guidance on writing for the web, keeping your content relevant, using social media, meeting accessibility standards, and how to transform your content into the WebCMS to meet One EPA Web standards.

  14. Assessing the Library Homepages of COPLAC Institutions for Section 508 Accessibility Errors: Who's Accessible, Who's Not, and How the Online WebXACT Assessment Tool Can Help

    ERIC Educational Resources Information Center

    Huprich, Julia; Green, Ravonne

    2007-01-01

    The websites of the Council on Public Liberal Arts Colleges (COPLAC) libraries were assessed for Section 508 errors using the online WebXACT tool. Only three of the twenty-one institutions (14%) had zero accessibility errors. Eighty-six percent of the COPLAC institutions had an average of 1.24 errors. Section 508 compliance is required for institutions…

  15. WEbcoli: an interactive and asynchronous web application for in silico design and analysis of genome-scale E.coli model.

    PubMed

    Jung, Tae-Sung; Yeo, Hock Chuan; Reddy, Satty G; Cho, Wan-Sup; Lee, Dong-Yup

    2009-11-01

    WEbcoli is a WEb application for in silico designing, analyzing and engineering Escherichia coli metabolism. It is devised and implemented using advanced web technologies, thereby leading to enhanced usability and dynamic web accessibility. As a main feature, the WEbcoli system provides a user-friendly rich web interface, allowing users to virtually design and synthesize mutant strains derived from the genome-scale wild-type E.coli model and to customize pathways of interest through a graph editor. In addition, constraints-based flux analysis can be conducted for quantifying metabolic fluxes and characterizing the physiological and metabolic states under various genetic and/or environmental conditions. WEbcoli is freely accessible at http://webcoli.org. cheld@nus.edu.sg.

  16. A revaluation of the cultural dimension of disability policy in the European Union: the impact of digitization and web accessibility.

    PubMed

    Ferri, Delia; Giannoumis, G Anthony

    2014-01-01

    Reflecting the commitments undertaken by the EU through the conclusion of the United Nations Convention on the Rights of Persons with Disabilities (UNCRPD), the European Disability Strategy 2010–2020 not only gives a prominent position to accessibility, broadly interpreted, but also suggests an examination of the obligations for access to cultural goods and services. The European Disability Strategy 2010–2020 expressly acknowledges that EU action will support national activities to make sports, leisure, cultural and recreational organizations and activities accessible, and use the possibilities for copyright exceptions in the Directive 2001/29/EC (Infosoc Directive). This article discusses to what extent the EU has realized the principle of accessibility and the right to access cultural goods and services envisaged in the UNCRPD. Previous research has yet to explore how web accessibility and digitization interact with the cultural dimension of disability policy in the European Union. This examination attempts to fill this gap by discussing to what extent the European Union has put this cultural dimension into effect and how web accessibility policies and the digitization of cultural materials influence these efforts.

  17. D-MATRIX: A web tool for constructing weight matrix of conserved DNA motifs

    PubMed Central

    Sen, Naresh; Mishra, Manoj; Khan, Feroz; Meena, Abha; Sharma, Ashok

    2009-01-01

    Despite considerable efforts to date, DNA motif prediction in whole genomes remains a challenge for researchers. Currently, genome-wide motif prediction tools require either a direct pattern sequence (for a single motif) or a weight matrix (for multiple motifs). Although there are known motif pattern databases and tools for genome-level prediction, there is no tool for weight matrix construction. Considering this, we developed the D-MATRIX tool, which generates different types of weight matrices based on a user-defined aligned motif sequence set and motif width. For retrieval of known motif sequences, users can access commonly used databases such as TFD, RegulonDB, DBTBS and Transfac. The D-MATRIX program uses a simple statistical approach for weight matrix construction, and the matrix can be converted into different file formats according to user requirements. It provides the possibility to identify conserved motifs in co-regulated genes or a whole genome. As an example, we successfully constructed the weight matrix of the LexA transcription factor binding site with the help of known sos-box cis-regulatory elements in the Deinococcus radiodurans genome. The algorithm is implemented in C-Sharp and wrapped in ASP.Net to maintain a user-friendly web interface. The D-MATRIX tool is accessible through the CIMAP domain network. Availability: http://203.190.147.116/dmatrix/ PMID:19759861
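
    The abstract describes a "simple statistical approach" without giving the formula; the sketch below shows a generic position weight matrix built from aligned motif sequences (per-column counts with pseudocounts, converted to log-odds against a uniform background). This is the standard construction, not necessarily the exact one implemented in D-MATRIX, and the input sites are invented for the example.

      import math

      # Aligned motif sequences of equal width (illustrative input, not real sos-box sites).
      sites = ["TACTGTATAT", "TACTGTACAT", "AACTGTATAC", "TGCTGTATAT"]
      alphabet = "ACGT"
      width = len(sites[0])
      pseudocount = 0.5
      background = 0.25                      # uniform background frequency per base

      weight_matrix = []
      for col in range(width):
          column = [s[col] for s in sites]
          row = {}
          for base in alphabet:
              freq = (column.count(base) + pseudocount) / (len(sites) + 4 * pseudocount)
              row[base] = math.log2(freq / background)   # log-odds score
          weight_matrix.append(row)

      # Print the matrix, one motif position per line.
      for pos, row in enumerate(weight_matrix, start=1):
          print(pos, {b: round(v, 2) for b, v in row.items()})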

  18. D-MATRIX: a web tool for constructing weight matrix of conserved DNA motifs.

    PubMed

    Sen, Naresh; Mishra, Manoj; Khan, Feroz; Meena, Abha; Sharma, Ashok

    2009-07-27

    Despite considerable efforts to date, DNA motif prediction in whole genomes remains a challenge for researchers. Currently, genome-wide motif prediction tools require either a direct pattern sequence (for a single motif) or a weight matrix (for multiple motifs). Although there are known motif pattern databases and tools for genome-level prediction, there is no tool for weight matrix construction. Considering this, we developed the D-MATRIX tool, which generates different types of weight matrices based on a user-defined aligned motif sequence set and motif width. For retrieval of known motif sequences, users can access commonly used databases such as TFD, RegulonDB, DBTBS and Transfac. The D-MATRIX program uses a simple statistical approach for weight matrix construction, and the matrix can be converted into different file formats according to user requirements. It provides the possibility to identify conserved motifs in co-regulated genes or a whole genome. As an example, we successfully constructed the weight matrix of the LexA transcription factor binding site with the help of known sos-box cis-regulatory elements in the Deinococcus radiodurans genome. The algorithm is implemented in C-Sharp and wrapped in ASP.Net to maintain a user-friendly web interface. The D-MATRIX tool is accessible through the CIMAP domain network. http://203.190.147.116/dmatrix/

  19. The Role of Internet Paleo Perspective Overviews in Making Data About Past Climate and Environmental Change More Accessible

    NASA Astrophysics Data System (ADS)

    Anderson, D. M.; Bauer, B. A.; Gille, E. P.; Gross, W. S.; Hartman, M. A.; Shah, A. M.; Woodhouse, C. A.

    2005-12-01

    The cornerstone of scientific discovery is the peer-reviewed journal article, yet for non-specialists these articles can be difficult to appreciate. Scientific writing and the sheer number of articles published each month compound the problem. At the World Data Center for Paleoclimatology, a primary goal is to make published scientific results more accessible to non-specialists. In partnership with scientists, we have created Paleo Perspectives, online essays that provide an introduction to the scientific literature on a topic, background needed to appreciate the results, figures with detailed captions, photographs, short movies and visualizations, summaries, glossaries, direct links to the data, and links to additional information. The power and flexibility of the Internet enables us to provide and update this rich array of material. We have produced three paleo perspectives (global warming, drought, abrupt climate change), with a fourth in review (arctic climate variability). Web statistics indicate these are some of the Data Center's most often-used web pages (more so for hot topics such as global warming), and awards and accolades indicate that the content is appreciated and on-target. Review by scientists assures the accuracy of the presentations, and newly-contributed data provide material for updates.

  20. R3D-2-MSA: the RNA 3D structure-to-multiple sequence alignment server.

    PubMed

    Cannone, Jamie J; Sweeney, Blake A; Petrov, Anton I; Gutell, Robin R; Zirbel, Craig L; Leontis, Neocles

    2015-07-01

    The RNA 3D Structure-to-Multiple Sequence Alignment Server (R3D-2-MSA) is a new web service that seamlessly links RNA three-dimensional (3D) structures to high-quality RNA multiple sequence alignments (MSAs) from diverse biological sources. In this first release, R3D-2-MSA provides manual and programmatic access to curated, representative ribosomal RNA sequence alignments from bacterial, archaeal, eukaryal and organellar ribosomes, using nucleotide numbers from representative atomic-resolution 3D structures. A web-based front end is available for manual entry and an Application Program Interface for programmatic access. Users can specify up to five ranges of nucleotides and 50 nucleotide positions per range. The R3D-2-MSA server maps these ranges to the appropriate columns of the corresponding MSA and returns the contents of the columns, either for display in a web browser or in JSON format for subsequent programmatic use. The browser output page provides a 3D interactive display of the query, a full list of sequence variants with taxonomic information and a statistical summary of distinct sequence variants found. The output can be filtered and sorted in the browser. Previous user queries can be viewed at any time by resubmitting the output URL, which encodes the search and re-generates the results. The service is freely available with no login requirement at http://rna.bgsu.edu/r3d-2-msa. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. SVAw - a web-based application tool for automated surrogate variable analysis of gene expression studies

    PubMed Central

    2013-01-01

    Background Surrogate variable analysis (SVA) is a powerful method to identify, estimate, and utilize the components of gene expression heterogeneity due to unknown and/or unmeasured technical, genetic, environmental, or demographic factors. These sources of heterogeneity are common in gene expression studies, and failing to incorporate them into the analysis can obscure results. Using SVA increases the biological accuracy and reproducibility of gene expression studies by identifying these sources of heterogeneity and correctly accounting for them in the analysis. Results Here we have developed a web application called SVAw (Surrogate Variable Analysis Web app) that provides a user-friendly interface for SVA analyses of genome-wide expression studies. The software has been developed based on the open-source Bioconductor SVA package. In our software, we have extended the SVA program's functionality in three respects: (i) SVAw performs a fully automated and user-friendly analysis workflow; (ii) it calculates probe/gene statistics for both pre- and post-SVA analysis and provides a table of results for the regression of gene expression on the primary variable of interest before and after correcting for surrogate variables; and (iii) it generates a comprehensive report file, including a graphical comparison of the outcome for the user. Conclusions SVAw is a freely accessible web server solution for the surrogate variable analysis of high-throughput datasets and facilitates removing all unwanted and unknown sources of variation. It is freely available for use at http://psychiatry.igm.jhmi.edu/sva. The executable packages for both the web and standalone application and the instructions for installation can be downloaded from our web site. PMID:23497726

  2. A Content Standard for Computational Models; Digital Rights Management (DRM) Architectures; A Digital Object Approach to Interoperable Rights Management: Finely-Grained Policy Enforcement Enabled by a Digital Object Infrastructure; LOCKSS: A Permanent Web Publishing and Access System; Tapestry of Time and Terrain.

    ERIC Educational Resources Information Center

    Hill, Linda L.; Crosier, Scott J.; Smith, Terrence R.; Goodchild, Michael; Iannella, Renato; Erickson, John S.; Reich, Vicky; Rosenthal, David S. H.

    2001-01-01

    Includes five articles. Topics include requirements for a content standard to describe computational models; architectures for digital rights management systems; access control for digital information objects; LOCKSS (Lots of Copies Keep Stuff Safe) that allows libraries to run Web caches for specific journals; and a Web site from the U.S.…

  3. The Department of Defense and the Power of Cloud Computing: Weighing Acceptable Cost Versus Acceptable Risk

    DTIC Science & Technology

    2016-04-01

    the DOD will put DOD systems and data at a risk level comparable to that of their neighbors in the cloud. Just as a user browses a Web page on the...proxy servers for controlling user access to Web pages, and large-scale storage for data management. Each of these devices allows access to the...user to develop applications. Acunetics.com describes Web applications as “computer programs allowing Website visitors to submit and retrieve data

  4. Apollo: Giving application developers a single point of access to public health models using structured vocabularies and Web services

    PubMed Central

    Wagner, Michael M.; Levander, John D.; Brown, Shawn; Hogan, William R.; Millett, Nicholas; Hanna, Josh

    2013-01-01

    This paper describes the Apollo Web Services and Apollo-SV, its related ontology. The Apollo Web Services give an end-user application a single point of access to multiple epidemic simulators. An end user can specify an analytic problem—which we define as a configuration and a query of results—exactly once and submit it to multiple epidemic simulators. The end user represents the analytic problem using a standard syntax and vocabulary, not the native languages of the simulators. We have demonstrated the feasibility of this design by implementing a set of Apollo services that provide access to two epidemic simulators and two visualizer services. PMID:24551417

  5. Apollo: giving application developers a single point of access to public health models using structured vocabularies and Web services.

    PubMed

    Wagner, Michael M; Levander, John D; Brown, Shawn; Hogan, William R; Millett, Nicholas; Hanna, Josh

    2013-01-01

    This paper describes the Apollo Web Services and Apollo-SV, its related ontology. The Apollo Web Services give an end-user application a single point of access to multiple epidemic simulators. An end user can specify an analytic problem-which we define as a configuration and a query of results-exactly once and submit it to multiple epidemic simulators. The end user represents the analytic problem using a standard syntax and vocabulary, not the native languages of the simulators. We have demonstrated the feasibility of this design by implementing a set of Apollo services that provide access to two epidemic simulators and two visualizer services.

  6. Mobile access to virtual randomization for investigator-initiated trials.

    PubMed

    Deserno, Thomas M; Keszei, András P

    2017-08-01

    Background/aims Randomization is indispensable in clinical trials in order to provide unbiased treatment allocation and a valid statistical inference. Improper handling of allocation lists can be avoided using central systems, for example, human-based services. However, central systems are unaffordable for investigator-initiated trials and might be inaccessible from some places where study subjects need allocations. We propose mobile access to virtual randomization, where the randomization lists are non-existent and the appropriate allocation is computed on demand. Methods The core of the system architecture is an electronic data capture system or a clinical trial management system, which is extended by an R interface connecting to the R server using the Java R Interface. Mobile devices communicate via representational state transfer (REST) web services. Furthermore, a simple web-based setup allows non-statisticians to configure the appropriate statistics. Our comprehensive R script supports simple randomization, restricted randomization using a random allocation rule, block randomization, and stratified randomization for un-blinded, single-blinded, and double-blinded trials. For each trial, the electronic data capture system or the clinical trial management system stores the randomization parameters and the subject assignments. Results Apps are provided for iOS and Android, and subjects are randomized using smartphones. After logging onto the system, the user selects the trial and the subject, and the allocation number and treatment arm are displayed instantaneously and stored in the core system. So far, 156 subjects have been allocated from mobile devices serving five investigator-initiated trials. Conclusion Transforming pre-printed allocation lists into virtual ones ensures the correct conduct of trials and guarantees strictly sequential processing in all trial sites. Covering 88% of all randomization models that are used in recent trials, virtual randomization becomes available for investigator-initiated trials and potentially for large multi-center trials.
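
    The allocations in this system are computed in R on the server; as a language-neutral illustration of one of the supported schemes, the sketch below implements permuted-block randomization in Python. The block size, arm labels, and seed are invented for the example, and this is not the authors' code.

      import random

      def permuted_block_allocation(n_subjects, arms=("A", "B"), block_size=4, seed=2017):
          """Allocate subjects to arms using permuted blocks (illustrative sketch)."""
          assert block_size % len(arms) == 0, "block size must be a multiple of the number of arms"
          rng = random.Random(seed)          # deterministic for a given trial configuration
          allocations = []
          while len(allocations) < n_subjects:
              block = list(arms) * (block_size // len(arms))
              rng.shuffle(block)             # each block contains equal numbers of every arm
              allocations.extend(block)
          return allocations[:n_subjects]

      print(permuted_block_allocation(10))   # e.g., ['B', 'A', 'A', 'B', ...]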

  7. A Bookmarking Service for Organizing and Sharing URLs

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Wolfe, Shawn R.; Chen, James R.; Mathe, Nathalie; Rabinowitz, Joshua L.

    1997-01-01

    Web browser bookmarking facilities predominate as the method of choice for managing URLs. In this paper, we describe some deficiencies of current bookmarking schemes, and examine an alternative to current approaches. We present WebTagger(TM), an implemented prototype of a personal bookmarking service that provides both individuals and groups with a customizable means of organizing and accessing Web-based information resources. In addition, the service enables users to supply feedback on the utility of these resources relative to their information needs, and provides dynamically-updated ranking of resources based on incremental user feedback. Individuals may access the service from anywhere on the Internet, and require no special software. This service greatly simplifies the process of sharing URLs within groups, in comparison with manual methods involving email. The underlying bookmark organization scheme is more natural and flexible than current hierarchical schemes supported by the major Web browsers, and enables rapid access to stored bookmarks.

  8. Academic medical center libraries on the Web.

    PubMed Central

    Tannery, N H; Wessel, C B

    1998-01-01

    Academic medical center libraries are moving towards publishing electronically, utilizing networked technologies, and creating digital libraries. The catalyst for this movement has been the Web. An analysis of academic medical center library Web pages was undertaken to assess the information created and communicated in early 1997. A summary of present uses and suggestions for future applications is provided. A method for evaluating and describing the content of library Web sites was designed. The evaluation included categorizing basic information such as description and access to library services, access to commercial databases, and use of interactive forms. The main goal of the evaluation was to assess original resources produced by these libraries. PMID:9803298

  9. DAVID-WS: a stateful web service to facilitate gene/protein list analysis

    PubMed Central

    Jiao, Xiaoli; Sherman, Brad T.; Huang, Da Wei; Stephens, Robert; Baseler, Michael W.; Lane, H. Clifford; Lempicki, Richard A.

    2012-01-01

    Summary: The database for annotation, visualization and integrated discovery (DAVID), which can be freely accessed at http://david.abcc.ncifcrf.gov/, is a web-based online bioinformatics resource that aims to provide tools for the functional interpretation of large lists of genes/proteins. It has been used by researchers from more than 5000 institutes worldwide, with a daily submission rate of ∼1200 gene lists from ∼400 unique researchers, and has been cited by more than 6000 scientific publications. However, the current web interface does not support programmatic access to DAVID, and the uniform resource locator (URL)-based application programming interface (API) has a limit on URL size and is stateless in nature as it uses URL request and response messages to communicate with the server, without keeping any state-related details. DAVID-WS (web service) has been developed to automate user tasks by providing stateful web services to access DAVID programmatically without the need for human interactions. Availability: The web service and sample clients (written in Java, Perl, Python and Matlab) are made freely available under the DAVID License at http://david.abcc.ncifcrf.gov/content.jsp?file=WS.html. Contact: xiaoli.jiao@nih.gov; rlempicki@nih.gov PMID:22543366

  10. DAVID-WS: a stateful web service to facilitate gene/protein list analysis.

    PubMed

    Jiao, Xiaoli; Sherman, Brad T; Huang, Da Wei; Stephens, Robert; Baseler, Michael W; Lane, H Clifford; Lempicki, Richard A

    2012-07-01

    The database for annotation, visualization and integrated discovery (DAVID), which can be freely accessed at http://david.abcc.ncifcrf.gov/, is a web-based online bioinformatics resource that aims to provide tools for the functional interpretation of large lists of genes/proteins. It has been used by researchers from more than 5000 institutes worldwide, with a daily submission rate of ∼1200 gene lists from ∼400 unique researchers, and has been cited by more than 6000 scientific publications. However, the current web interface does not support programmatic access to DAVID, and the uniform resource locator (URL)-based application programming interface (API) has a limit on URL size and is stateless in nature as it uses URL request and response messages to communicate with the server, without keeping any state-related details. DAVID-WS (web service) has been developed to automate user tasks by providing stateful web services to access DAVID programmatically without the need for human interactions. The web service and sample clients (written in Java, Perl, Python and Matlab) are made freely available under the DAVID License at http://david.abcc.ncifcrf.gov/content.jsp?file=WS.html.

  11. A Study of the Effectiveness of Web-Based Homework in Teaching Undergraduate Business Statistics

    ERIC Educational Resources Information Center

    Palocsay, Susan W.; Stevens, Scott P.

    2008-01-01

    Web-based homework (WBH) Technology can simplify the creation and grading of assignments as well as provide a feasible platform for assessment testing, but its effect on student learning in business statistics is unknown. This is particularly true of the latest software development of Web-based tutoring agents that dynamically evaluate individual…

  12. WebMeV | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Web MeV (Multiple-experiment Viewer) is a web/cloud-based tool for genomic data analysis. Web MeV is being built to meet the challenge of exploring large public genomic data sets with an intuitive graphical interface providing access to state-of-the-art analytical tools.

  13. Searching to Translate and Translating to Search: When Information Retrieval Meets Machine Translation

    ERIC Educational Resources Information Center

    Ture, Ferhan

    2013-01-01

    With the adoption of web services in daily life, people have access to tremendous amounts of information, beyond any human's reading and comprehension capabilities. As a result, search technologies have become a fundamental tool for accessing information. Furthermore, the web contains information in multiple languages, introducing another barrier…

  14. SciLinks

    Science.gov Websites

    SciLinks offers free sign-up access to sites in the SciLinks program. SciLinks: targeted, grade-specific web content for your books; free web content to extend and expand student…

  15. The Near Future Trend: Combining Web Access and Local CD Networks. Experience and a Few Suggestions.

    ERIC Educational Resources Information Center

    Ma, Wei

    1998-01-01

    Focuses on the trend to combine Web access and CD networks, benefits of considering the community network environment as a whole, and need for flexibility in considering new technologies. Describes the Occidental College Library (California) experience of building and sharing a network and network file server. (PEN)

  16. 77 FR 44475 - Final Definitions, Requirements, and Selection Criteria; Charter Schools Program (CSP)-Charter...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-30

    ... to the field of special education. Discussion: We agree that improving access to charter schools for..., standards, assessments, special education services and access to charter schools by students with.... Department of Education's Web site ( ed.gov ), data.ed.gov , the National Charter School Resource Center Web...

  17. 77 FR 20070 - Biweekly Notice of Applications and Amendments to Facility Operating Licenses and Combined...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-03

    ... Docket ID 2012-0078. You may submit comments by the following methods: Federal Rulemaking Web site: Go to.... SUPPLEMENTARY INFORMATION: I. Accessing Information and Submitting Comments A. Accessing Information Please... publicly available, by the following methods: Federal Rulemaking Web Site: Go to http://www.regulations.gov...

  18. ASK-LDT 2.0: A Web-Based Graphical Tool for Authoring Learning Designs

    ERIC Educational Resources Information Center

    Zervas, Panagiotis; Fragkos, Konstantinos; Sampson, Demetrios G.

    2013-01-01

    During the last decade, Open Educational Resources (OERs) have gained increased attention for their potential to support open access, sharing and reuse of digital educational resources. Therefore, a large amount of digital educational resources have become available worldwide through web-based open access repositories which are referred to as…

  19. Ionic Liquids Database- (ILThermo)

    National Institute of Standards and Technology Data Gateway

    SRD 147 NIST Ionic Liquids Database- (ILThermo) (Web, free access)   IUPAC Ionic Liquids Database, ILThermo, is a free web research tool that allows users worldwide to access an up-to-date data collection from the publications on experimental investigations of thermodynamic, and transport properties of ionic liquids as well as binary and ternary mixtures containing ionic liquids.

  20. FirstSearch and NetFirst--Web and Dial-up Access: Plus Ca Change, Plus C'est la Meme Chose?

    ERIC Educational Resources Information Center

    Koehler, Wallace; Mincey, Danielle

    1996-01-01

    Compares and evaluates the differences between OCLC's dial-up and World Wide Web FirstSearch access methods and their interfaces with the underlying databases. Also examines NetFirst, OCLC's new Internet catalog, the only Internet tracking database from a "traditional" database service. (Author/PEN)

  1. Policies and Procedures for Accessing Archived NASA Data via the Web

    NASA Technical Reports Server (NTRS)

    James, Nathan

    2011-01-01

    The National Space Science Data Center (NSSDC) was established by NASA to provide for the preservation and dissemination of scientific data from NASA missions. This white paper will address the NSSDC policies that govern data preservation and dissemination and the various methods of accessing NSSDC-archived data via the web.

  2. The Relationship between Web Accessibility Policy and Practice in Postsecondary Institutions

    ERIC Educational Resources Information Center

    Whitney, Michael P.

    2009-01-01

    From computer workstations to the world of the web, statutes and policies have afforded students with disabilities the right to participate in postsecondary education in a non-discriminatory manner. Automatic doors and adjustable tables are commonplace on campuses and represent prime examples of accessible policy adherence, but what effect do…

  3. World-Wide Web: Adding Multimedia to Cyberspace.

    ERIC Educational Resources Information Center

    Descy, Don E.

    1994-01-01

    Describes the World-Wide Web (WWW), a network information resource based on hypertext. How to access WWW browsers through remote login (telnet) or through free browser software, such as Mosaic, is provided. Eight information sources that can be accessed through the WWW are listed. The address of a listserv reporting on Internet developments is…

  4. The Case for Creating a Scholars Portal to the Web: A White Paper.

    ERIC Educational Resources Information Center

    Campbell, Jerry D.

    2001-01-01

    Considers the need for reliable, scholarly access to the Web and suggests that the Association for Research Libraries, in partnership with OCLC and the Library of Congress, develop a so-called scholar's portal. Topics include quality content; enhanced library services; and gateway functions, including access to commercial databases and focused…

  5. JABAWS 2.2 distributed web services for Bioinformatics: protein disorder, conservation and RNA secondary structure.

    PubMed

    Troshin, Peter V; Procter, James B; Sherstnev, Alexander; Barton, Daniel L; Madeira, Fábio; Barton, Geoffrey J

    2018-06-01

    JABAWS 2.2 is a computational framework that simplifies the deployment of web services for Bioinformatics. In addition to the five multiple sequence alignment (MSA) algorithms in JABAWS 1.0, JABAWS 2.2 includes three additional MSA programs (Clustal Omega, MSAprobs, GLprobs), four protein disorder prediction methods (DisEMBL, IUPred, Ronn, GlobPlot), 18 measures of protein conservation as implemented in AACon, and RNA secondary structure prediction by the RNAalifold program. JABAWS 2.2 can be deployed on a variety of in-house or hosted systems. JABAWS 2.2 web services may be accessed from the Jalview multiple sequence analysis workbench (Version 2.8 and later), as well as directly via the JABAWS command line interface (CLI) client. JABAWS 2.2 can be deployed on a local virtual server as a Virtual Appliance (VA) or simply as a Web Application Archive (WAR) for private use. Improvements in JABAWS 2.2 also include simplified installation and a range of utility tools for usage statistics collection, and web services querying and monitoring. The JABAWS CLI client has been updated to support all the new services and allow integration of JABAWS 2.2 services into conventional scripts. A public JABAWS 2 server has been in production since December 2011 and served over 800 000 analyses for users worldwide. JABAWS 2.2 is made freely available under the Apache 2 license and can be obtained from: http://www.compbio.dundee.ac.uk/jabaws. g.j.barton@dundee.ac.uk.

  6. Checking an integrated model of web accessibility and usability evaluation for disabled people.

    PubMed

    Federici, Stefano; Micangeli, Andrea; Ruspantini, Irene; Borgianni, Stefano; Corradi, Fabrizio; Pasqualotto, Emanuele; Olivetti Belardinelli, Marta

    2005-07-08

    A combined objective-oriented and subjective-oriented method for evaluating the accessibility and usability of web pages for students with disabilities was tested. The objective-oriented approach is devoted to verifying the conformity of interfaces to standard rules stated by national and international organizations responsible for web technology standardization, such as the W3C. Conversely, the subjective-oriented approach assesses how the final users interact with the artificial system, capturing levels of user satisfaction based on personal factors and environmental barriers. Five kinds of measurements were applied as objective-oriented and subjective-oriented tests. Objective-oriented evaluations were performed on the Help Desk web page for students with disability, included in the website of a large Italian state university. Subjective-oriented tests were administered to 19 students labeled as disabled on the basis of their own declaration at University enrolment: 13 students were tested by means of the SUMI test and six students by means of the 'Cooperative evaluation'. Objective-oriented and subjective-oriented methods highlighted different and sometimes conflicting results. Both methods pointed out much more consistency regarding levels of accessibility than of usability. Since usability is largely affected by individual differences in users' own (dis)abilities, subjective-oriented measures underscored the fact that blind students encountered many more web surfing difficulties.

  7. U.S. Geological Survey (USGS) Earthquake Web Applications

    NASA Astrophysics Data System (ADS)

    Fee, J.; Martinez, E.

    2015-12-01

    USGS Earthquake web applications provide access to earthquake information from USGS and other Advanced National Seismic System (ANSS) contributors. One of the primary goals of these applications is to provide a consistent experience for accessing both near-real time information as soon as it is available and historic information after it is thoroughly reviewed. Millions of people use these applications every month including people who feel an earthquake, emergency responders looking for the latest information about a recent event, and scientists researching historic earthquakes and their effects. Information from multiple catalogs and contributors is combined by the ANSS Comprehensive Catalog into one composite catalog, identifying the most preferred information from any source for each event. A web service and near-real time feeds provide access to all contributed data, and are used by a number of users and software packages. The Latest Earthquakes application displays summaries of many events, either near-real time feeds or custom searches, and the Event Page application shows detailed information for each event. Because all data is accessed through the web service, it can also be downloaded by users. The applications are maintained as open source projects on github, and use mobile-first and responsive-web-design approaches to work well on both mobile devices and desktop computers. http://earthquake.usgs.gov/earthquakes/map/
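
    The near-real time feeds mentioned above are published as GeoJSON, so they can be consumed by any software that speaks HTTP and JSON. A minimal sketch is shown below; the feed URL follows the commonly documented pattern and should be checked against the current USGS feed documentation before relying on it.

      import requests

      # GeoJSON summary feed for events in the past day (verify against current USGS docs).
      FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_day.geojson"

      catalog = requests.get(FEED, timeout=30).json()
      for feature in catalog["features"][:5]:
          props = feature["properties"]
          print(props["mag"], props["place"], props["time"])  # magnitude, location, epoch time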

  8. Linking the EarthScope Data Virtual Catalog to the GEON Portal

    NASA Astrophysics Data System (ADS)

    Lin, K.; Memon, A.; Baru, C.

    2008-12-01

    The EarthScope Data Portal provides a unified, single point of access to EarthScope data and products from the USArray, Plate Boundary Observatory (PBO), and San Andreas Fault Observatory at Depth (SAFOD) experiments. The portal features basic search and data access capabilities that allow users to discover and access EarthScope data using spatial, temporal, and other metadata-based (data type, station specific) search conditions. The portal search module is the user interface implementation of the EarthScope Data Search Web Service. This Web Service acts as a virtual catalog that in turn invokes Web services developed by IRIS (Incorporated Research Institutions for Seismology), UNAVCO (University NAVSTAR Consortium), and GFZ (German Research Center for Geosciences) to search for EarthScope data in the archives at each of these locations. These Web Services provide information about all resources (data) that match the specified search conditions. In this presentation we will describe how the EarthScope Data Search Web service can be integrated into the GEONsearch application in the GEON Portal (see http://portal.geongrid.org). Thus, a search request issued at the GEON Portal will also search the EarthScope virtual catalog, thereby providing users with seamless access to data in GEON as well as EarthScope via a common user interface.

  9. Technical Services and the World Wide Web.

    ERIC Educational Resources Information Center

    Scheschy, Virginia M.

    The World Wide Web and browsers such as Netscape and Mosaic have simplified access to electronic resources. Today, technical services librarians can share in the wealth of information available on the Web. One of the premier Web sites for acquisitions librarians is AcqWeb, a cousin of the AcqNet listserv. In addition to interesting news items,…

  10. Increasing efficiency of information dissemination and collection through the World Wide Web

    Treesearch

    Daniel P. Huebner; Malchus B. Baker; Peter F. Ffolliott

    2000-01-01

    Researchers, managers, and educators have access to revolutionary technology for information transfer through the World Wide Web (Web). Using the Web to effectively gather and distribute information is addressed in this paper. Tools, tips, and strategies are discussed. Companion Web sites are provided to guide users in selecting the most appropriate tool for searching...

  11. Integrating Mathematics, Science, and Language Arts Instruction Using the World Wide Web.

    ERIC Educational Resources Information Center

    Clark, Kenneth; Hosticka, Alice; Kent, Judi; Browne, Ron

    1998-01-01

    Addresses issues of access to World Wide Web sites, mathematics and science content-resources available on the Web, and methods for integrating mathematics, science, and language arts instruction. (Author/ASK)

  12. Leveraging Open Standard Interfaces in Accessing and Processing NASA Data Model Outputs

    NASA Astrophysics Data System (ADS)

    Falke, S. R.; Alameh, N. S.; Hoijarvi, K.; de La Beaujardiere, J.; Bambacus, M. J.

    2006-12-01

    An objective of NASA's Earth Science Division is to develop advanced information technologies for processing, archiving, accessing, visualizing, and communicating Earth Science data. To this end, NASA and other federal agencies have collaborated with the Open Geospatial Consortium (OGC) to research, develop, and test interoperability specifications within projects and testbeds benefiting the government, industry, and the public. This paper summarizes the results of a recent effort under the auspices of the OGC Web Services testbed phase 4 (OWS-4) to explore standardization approaches for accessing and processing the outputs of NASA models of physical phenomena. Within the OWS-4 context, experiments were designed to leverage the emerging OGC Web Processing Service (WPS) and Web Coverage Service (WCS) specifications to access, filter and manipulate the outputs of the NASA Goddard Earth Observing System (GEOS) and Goddard Chemistry Aerosol Radiation and Transport (GOCART) forecast models. In OWS-4, the intent is to provide users with more control over the subsets of data that they can extract from the model results, as well as over the final portrayal of those data. To meet that goal, experiments were designed to test the suitability of OGC's Web Processing Service (WPS) and Web Coverage Service (WCS) for filtering, processing and portraying the model results (including slices by height or by time), and to identify any enhancements to the specifications needed to meet the desired objectives. This paper summarizes the findings of the experiments, highlighting the value of the Web Processing Service in providing standard interfaces for accessing and manipulating model data within spatial and temporal frameworks. The paper also points out the key shortcomings of the WPS, especially in comparison with a SOAP/WSDL approach to solving the same problem.
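
    To make the subsetting idea concrete, here is a hedged sketch of a WCS 1.0-style GetCoverage request that extracts a time slice of a gridded model field. The server URL and coverage name are placeholders, not the actual OWS-4 endpoints; only the standard key-value parameters of the WCS specification are assumed.

```python
# Minimal sketch of an OGC WCS 1.0-style GetCoverage request used to subset a
# model output field by bounding box and time. The server URL and coverage
# name are hypothetical placeholders.
import requests

WCS_ENDPOINT = "https://example.org/wcs"            # hypothetical server

params = {
    "SERVICE": "WCS",
    "VERSION": "1.0.0",
    "REQUEST": "GetCoverage",
    "COVERAGE": "aerosol_optical_thickness",        # hypothetical coverage name
    "CRS": "EPSG:4326",
    "BBOX": "-180,-90,180,90",
    "TIME": "2006-08-01T00:00:00Z",                 # single time slice
    "WIDTH": "288",
    "HEIGHT": "181",
    "FORMAT": "GeoTIFF",
}

resp = requests.get(WCS_ENDPOINT, params=params, timeout=60)
resp.raise_for_status()
with open("model_subset.tif", "wb") as fh:
    fh.write(resp.content)
```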

  13. An online readability analysis of pathology-related patient education articles: an opportunity for pathologists to educate patients.

    PubMed

    Prabhu, Arpan V; Kim, Christopher; Crihalmeanu, Tudor; Hansberry, David R; Agarwal, Nitin; DeFrances, Marie C; Trejo Bittar, Humberto E

    2017-07-01

    Information for patients regarding their clinical conditions and treatment options is widely available online. The American Medical Association and National Institutes of Health recommend that online patient-oriented materials be written at no higher than a seventh-grade reading level to ensure full comprehension by the average American. This study sought to determine whether online patient-oriented materials explaining common pathology procedures are written at appropriate reading levels. Ten pathology procedures that patients would be likely to research were entered as Google search queries, and plain text from the first 10 Web sites containing patient education materials for each procedure was analyzed using 10 validated readability scales. We determined the mean reading levels of materials grouped by readability scale, procedure, and Web site domain, the overall average reading level of all resources, and the most popular Web site domains. One hundred Web sites were accessed; one was omitted for short length (<100 words). The average reading grade level of the 99 materials, none of which met national health literacy guidelines (range, 7.3-17.4), was 10.9. Twenty-nine articles (29%) required a high school education for full comprehension, and 4 (4%) required an undergraduate college education. The most frequently accessed Web site domains included medlineplus.gov, webmd.com (both accessed 7 times), and labtestsonline.org (accessed 6 times). Average reading levels of the 11 most commonly accessed Web sites ranged from 8.25 (patient.info) to 12.25 (mayoclinic.org). Readability levels of most online pathology-related patient education materials exceeded those recommended by national health literacy guidelines. These patient education materials should be revised to help patients fully understand them. Copyright © 2017 Elsevier Inc. All rights reserved.
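
    For readers unfamiliar with readability grading, the sketch below computes one commonly used scale, the Flesch-Kincaid Grade Level; it may or may not be among the ten scales the study used, and the syllable counter is a crude vowel-group heuristic rather than a validated counter.

```python
# Sketch of a readability grade calculation (Flesch-Kincaid Grade Level).
# The syllable counter is a rough heuristic, not a validated tool.
import re

def count_syllables(word: str) -> int:
    # Count groups of consecutive vowels as syllables; minimum of one.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / max(1, len(sentences))
    syllables_per_word = syllables / max(1, len(words))
    return 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59

sample = ("A biopsy removes a small piece of tissue. "
          "A pathologist examines the tissue under a microscope.")
print(round(flesch_kincaid_grade(sample), 1))
```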

  14. Secure, web-accessible call rosters for academic radiology departments.

    PubMed

    Nguyen, A V; Tellis, W M; Avrin, D E

    2000-05-01

    Traditionally, radiology department call rosters have been posted on paper and bulletin boards. Frequently, changes to these lists are made by multiple people independently and are often not synchronized, resulting in confusion among the house staff and technical staff as to who is on call and when. In addition, multiple and disparate copies exist in different sections of the department, and changes made are not propagated to all the schedules. To eliminate such difficulties, a paperless call scheduling application was developed. Our call scheduling program allows Java-enabled web access to a database by designated personnel from each radiology section who have privileges to make the necessary changes. Once a person makes a change, everyone accessing the database sees the modification. This eliminates the chaos resulting from people swapping shifts at the last minute and not having the time to record or broadcast the change. Furthermore, all changes to the database are logged. Users are given a log-in name and password and can edit only their own section; however, all personnel have access to all sections' schedules. Our applet was written in Java 2 using the latest technology in database access. We access our Interbase database through the DataExpress and DB Swing (Borland, Scotts Valley, CA) components. The result is secure access to the call rosters via the web. There are many advantages to web-enabled access, chiefly the ability for people to make changes and have those changes recorded and propagated in a single virtual location, available to all who need to know.

  15. The Effects of Web-Based Patient Access to Laboratory Results in British Columbia: A Patient Survey on Comprehension and Anxiety.

    PubMed

    Mák, Geneviève; Smith Fowler, Heather; Leaver, Chad; Hagens, Simon; Zelmer, Jennifer

    2015-08-04

    Web-based patient access to personal health information is limited but increasing in Canada and internationally. This exploratory study aimed to increase understanding of how Web-based access to laboratory test results in British Columbia (Canada), which has been broadly available since 2010, affects patients' experiences. In November 2013, we surveyed adults in British Columbia who had had a laboratory test in the previous 12 months. Using a retrospective cohort design, we compared reported wait-time for results, test result comprehension, and anxiety levels of "service users" who had Web-based access to their test results (n=2047) with those of a general population panel that did not have Web-based access (n=1245). The vast majority of service users (83.99%, 95% CI 82.31-85.67) said they received their results within "a few days", compared to just over a third of the comparison group (37.84%, 95% CI 34.96-40.73). Most in both groups said they understood their test results, but the rate was lower for service users than the comparison group (75.55%, 95% CI 73.58-77.49 vs 84.69%, 95% CI 82.59-86.81). There was no significant difference between groups in levels of reported anxiety after receiving test results. While most patients who received their laboratory test results online reported little anxiety after receiving their results and were satisfied with the service, there may be opportunities to improve comprehension of results.

  16. The experiences of working carers of older people regarding access to a web-based family care support network offered by a municipality.

    PubMed

    Andersson, Stefan; Erlingsson, Christen; Magnusson, Lennart; Hanson, Elizabeth

    2017-09-01

    Policy makers in Sweden and other European Member States pay increasing attention to how best to support working carers: carers who juggle providing unpaid family care for older family members with performing paid work. Exploring the perceived benefits and challenges of web-based information and communication technologies as a means of supporting working carers in their caregiving role, this paper draws on findings from a qualitative study. The study aimed to describe working carers' experiences of having access to the web-based family care support network 'A good place' (AGP), provided by the municipality to support those caring for an older family member. Content analysis of interviews with nine working carers revealed three themes: A support hub: connections to peers, personnel and knowledge; Experiencing ICT support as relevant in changing life circumstances; and Upholding one's personal firewall. Findings indicate that the web-based family care support network AGP is an accessible, complementary means of support. Utilising support while balancing caregiving, work obligations and responsibilities was made easier with access to AGP, enabling working carers to access information, psychosocial support and learning opportunities. In particular, it provided channels for carers to share experiences with others, to be informed, and to gain insights into medical and care issues. This reinforced working carers' sense of competence, helping them meet caregiving demands and see positive aspects in their situation. Carers' low levels of digital skills and anxieties about using computer-based support were barriers to utilising web-based support and could lead to deprioritising of this support. However, to help carers overcome these barriers and to better match web-based support to working carers' preferences and situations, web-based support must be introduced in a timely manner and must more accurately meet each working carer's unique caregiving needs. © 2016 Nordic College of Caring Science.

  17. Visualization on the Web of 20 Years of Crop Rotation and Wildlife Co-Evolutions

    NASA Astrophysics Data System (ADS)

    Plumejeaud-Perreau, Christine; Poitevin, Cyril; Bretagnolle, Vincent

    2018-05-01

    The accumulation of evidence of the effects of intensive agricultural practices on wildlife fauna and flora, and on biodiversity in general, has been widely published in scientific papers (Tildman, 1999). However, the data supporting these conclusions are often kept hidden within research institutions. This paper presents a data visualization system opened on the Web, allowing citizens to gain comprehensive access to data produced by such a research institution and collected for more than 20 years. The Web information system was designed to ease the comparison of data issued from various databases describing the same object, the agricultural landscape, at different scales and through different observation devices. An interactive visualization is proposed to examine the co-evolution of fauna and flora together with agricultural practices. It mixes aerial orthoimagery produced since 1950 with vector data showing the evolution of agricultural parcels alongside that of a few sentinel species such as the Montagu's harrier. This is done through a composition of maps, charts and timelines, and specific comparison tools. Particular attention is given to observation-effort bias in order to show meaningful statistical aggregates.

  18. Dynamic assessment of microbial ecology (DAME): a web app for interactive analysis and visualization of microbial sequencing data.

    PubMed

    Piccolo, Brian D; Wankhade, Umesh D; Chintapalli, Sree V; Bhattacharyya, Sudeepa; Chunqiao, Luo; Shankar, Kartik

    2018-03-15

    Dynamic assessment of microbial ecology (DAME) is a Shiny-based web application for interactive analysis and visualization of microbial sequencing data. DAME provides researchers not familiar with R programming the ability to access the most current R functions utilized for ecology and gene sequencing data analyses. Currently, DAME supports group comparisons of several ecological estimates of α-diversity and β-diversity, along with differential abundance analysis of individual taxa. Using the Shiny framework, the user has complete control of all aspects of the data analysis, including sample/experimental group selection and filtering, estimate selection, statistical methods and visualization parameters. Furthermore, graphical and tabular outputs are supported by R packages using D3.js and are fully interactive. DAME was implemented in R but can be modified by Hypertext Markup Language (HTML), Cascading Style Sheets (CSS), and JavaScript. It is freely available on the web at https://acnc-shinyapps.shinyapps.io/DAME/. Local installation and source code are available through Github (https://github.com/bdpiccolo/ACNC-DAME). Any system with R can launch DAME locally provided the shiny package is installed. bdpiccolo@uams.edu.

  19. g:Profiler-a web server for functional interpretation of gene lists (2016 update).

    PubMed

    Reimand, Jüri; Arak, Tambet; Adler, Priit; Kolberg, Liis; Reisberg, Sulev; Peterson, Hedi; Vilo, Jaak

    2016-07-08

    Functional enrichment analysis is a key step in interpreting gene lists discovered in diverse high-throughput experiments. g:Profiler studies flat and ranked gene lists and finds statistically significant Gene Ontology terms, pathways and other gene function related terms. Translation of hundreds of gene identifiers is another core feature of g:Profiler. Since its first publication in 2007, our web server has become a popular tool of choice among basic and translational researchers. Timeliness is a major advantage of g:Profiler as genome and pathway information is synchronized with the Ensembl database in quarterly updates. g:Profiler supports 213 species including mammals and other vertebrates, plants, insects and fungi. The 2016 update of g:Profiler introduces several novel features. We have added further functional datasets to interpret gene lists, including transcription factor binding site predictions, Mendelian disease annotations, information about protein expression and complexes and gene mappings of human genetic polymorphisms. Besides the interactive web interface, g:Profiler can be accessed in computational pipelines using our R package, Python interface and BioJS component. g:Profiler is freely available at http://biit.cs.ut.ee/gprofiler/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Improving Land Cover Mapping: a Mobile Application Based on ESA Sentinel 2 Imagery

    NASA Astrophysics Data System (ADS)

    Melis, M. T.; Dessì, F.; Loddo, P.; La Mantia, C.; Da Pelo, S.; Deflorio, A. M.; Ghiglieri, G.; Hailu, B. T.; Kalegele, K.; Mwasi, B. N.

    2018-04-01

    The increasing availability of satellite data is a real asset for improving environmental knowledge and land management. Possibilities to integrate different sources of geo-data are growing, and methodologies to create thematic databases are becoming very sophisticated. Moreover, access to internet services and, in particular, to web mapping services is well developed and widespread among both expert users and citizens. Web map services, such as Google Maps or OpenStreetMap, give access to updated optical imagery or topographic maps, but information on land cover/use is still not provided. Therefore, non-specialized users face many obstacles in making use of, and gaining access to, such maps. This issue is particularly felt where digital (web) maps could form the basis for land use management, as they are more economical and accessible than paper maps. These conditions are well known in many African countries where, while internet access is becoming open to all, the local map agencies and their products are not widespread.

  1. The Geospatial Web and Local Geographical Education

    ERIC Educational Resources Information Center

    Harris, Trevor M.; Rouse, L. Jesse; Bergeron, Susan J.

    2010-01-01

    Recent innovations in the Geospatial Web represent a paradigm shift in Web mapping by enabling educators to explore geography in the classroom by dynamically using a rapidly growing suite of impressive online geospatial tools. Coupled with access to spatial data repositories and User-Generated Content, the Geospatial Web provides a powerful…

  2. Web-Mediated Knowledge Synthesis for Educators

    ERIC Educational Resources Information Center

    DeSchryver, Michael

    2015-01-01

    Ubiquitous and instant access to information on the Web is challenging what constitutes 21st century literacies. This article explores the notion of Web-mediated knowledge synthesis, an approach to integrating Web-based learning that may result in generative synthesis of ideas. This article describes the skills and strategies that may support…

  3. Katherine Fleming | NREL

    Science.gov Websites

    Katherine Fleming, Database and Web Applications Engineer, works on web application development in the Commercial Buildings Research group. Previously, Katherine was pursuing a Ph.D. with a focus on robotics and working as a Web developer and Web accessibility…

  4. 78 FR 54241 - Proposed Information Collection; Comment Request; BroadbandMatch Web Site Tool

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-03

    ... Information Collection; Comment Request; BroadbandMatch Web Site Tool AGENCY: National Telecommunications and... goal of increased broadband deployment and use in the United States. The BroadbandMatch Web site began... empowering technology effectively. II. Method of Collection BroadbandMatch users access the Web site through...

  5. From Web 2.0 to Teacher 2.0

    ERIC Educational Resources Information Center

    Thomas, David A.; Li, Qing

    2008-01-01

    The World Wide Web is evolving in response to users who demand faster and more efficient access to information, portability, and reusability of digital objects between Web-based and computer-based applications and powerful communication, publication, collaboration, and teaching and learning tools. This article reviews current uses of Web-based…

  6. On Building a Web-Based University

    ERIC Educational Resources Information Center

    Constantinescu, Dana; Stefansson, Gunnar

    2010-01-01

    This paper describes some of the principles for building a freely available web-based university with open content. The "tutor-web" is an international project for web-assisted education providing such free and open access. This project was initiated by the University of Iceland in partnership with many universities around the world,…

  7. Designing, Implementing, and Evaluating Secure Web Browsers

    ERIC Educational Resources Information Center

    Grier, Christopher L.

    2009-01-01

    Web browsers are plagued with vulnerabilities, providing hackers with easy access to computer systems using browser-based attacks. Efforts that retrofit existing browsers have had limited success since modern browsers are not designed to withstand attack. To enable more secure web browsing, we design and implement new web browsers from the ground…

  8. Access to Space Interactive Design Web Site

    NASA Technical Reports Server (NTRS)

    Leon, John; Cutlip, William; Hametz, Mark

    2000-01-01

    The Access To Space (ATS) Group at NASA's Goddard Space Flight Center (GSFC) supports the science and technology community at GSFC by facilitating frequent and affordable opportunities for access to space. Through partnerships established with access mode suppliers, the ATS Group has developed an interactive Mission Design web site. The ATS web site provides both the information and the tools necessary to assist mission planners in selecting and planning their ride to space. This includes the evaluation of single payloads vs. ride-sharing opportunities to reduce the cost of access to space. Features of this site include the following: (1) Mission Database. Our mission database contains a listing of missions ranging from proposed missions to manifested. Missions can be entered by our user community through data input tools. Data is then accessed by users through various search engines: orbit parameters, ride-share opportunities, spacecraft parameters, other mission notes, launch vehicle, and contact information. (2) Launch Vehicle Toolboxes. The launch vehicle toolboxes provide the user a full range of information on vehicle classes and individual configurations. Topics include: general information, environments, performance, payload interface, available volume, and launch sites.

  9. Experiences with http/WebDAV protocols for data access in high throughput computing

    NASA Astrophysics Data System (ADS)

    Bernabeu, Gerard; Martinez, Francisco; Acción, Esther; Bria, Arnau; Caubet, Marc; Delfino, Manuel; Espinal, Xavier

    2011-12-01

    In the past, access to remote storage was considered to be at least one order of magnitude slower than local disk access. Improvements in network technologies now make remote disk a viable alternative: such accesses can today reach levels of throughput similar to, or exceeding, those of local disks. Common choices of access protocol in the WLCG collaboration are RFIO, [GSI]DCAP, GRIDFTP, XROOTD and NFS. The HTTP protocol is a promising alternative as it is simple and lightweight. It also enables the use of standard technologies such as HTTP caching or load balancing, which can be used to improve service resilience and scalability or to boost performance for some use cases seen in HEP, such as "hot files". WebDAV extensions allow writing data, giving HTTP enough functionality to work as a remote access protocol. This paper will show our experiences with the WebDAV door for dCache, in terms of functionality and performance, applied to some of the HEP workflows in the LHC Tier1 at PIC.
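
    As a hedged illustration of why HTTP/WebDAV is attractive, the sketch below lists a remote collection with a WebDAV PROPFIND and reads a byte range of a file with a plain HTTP Range request. The server URL and credentials are placeholders, not the PIC dCache endpoints.

```python
# Illustrative sketch of HTTP/WebDAV data access: listing a directory with
# PROPFIND and reading a byte range of a file. URLs and credentials are
# hypothetical placeholders.
import requests

BASE = "https://webdav.example.org/data"          # hypothetical WebDAV door
AUTH = ("user", "secret")                         # placeholder credentials

# WebDAV PROPFIND lists the contents of a collection (Depth: 1 = immediate children).
listing = requests.request("PROPFIND", BASE + "/run2011/",
                           headers={"Depth": "1"}, auth=AUTH, timeout=30)
listing.raise_for_status()
print(listing.text[:500])                         # raw multistatus XML

# A plain HTTP GET with a Range header reads part of a file; because this is
# standard HTTP, caches and load balancers can sit in front of the storage.
chunk = requests.get(BASE + "/run2011/events.root",
                     headers={"Range": "bytes=0-1023"}, auth=AUTH, timeout=30)
print(chunk.status_code, len(chunk.content), "bytes")
```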

  10. Usage of a generic web-based self-management intervention for breast cancer survivors: substudy analysis of the BREATH trial.

    PubMed

    van den Berg, Sanne W; Peters, Esmee J; Kraaijeveld, J Frank; Gielissen, Marieke F M; Prins, Judith B

    2013-08-19

    Generic fully automated Web-based self-management interventions are upcoming, for example, for the growing number of breast cancer survivors. It is hypothesized that the use of these interventions is more individualized and that users apply a large amount of self-tailoring. However, technical usage evaluations of these types of interventions are scarce and practical guidelines are lacking. To gain insight into meaningful usage parameters to evaluate the use of generic fully automated Web-based interventions by assessing how breast cancer survivors use a generic self-management website. The final aim is to propose practical recommendations for researchers and information and communication technology (ICT) professionals who aim to design and evaluate the use of similar Web-based interventions. The BREAst cancer ehealTH (BREATH) intervention is a generic unguided fully automated website with stepwise weekly access and a fixed 4-month structure containing 104 intervention ingredients (ie, texts, tasks, tests, videos). By monitoring https-server requests, technical usage statistics were recorded for the intervention group of the randomized controlled trial. Observed usage was analyzed by measures of frequency, duration, and activity. Intervention adherence was defined as continuous usage, or the proportion of participants who started using the intervention and continued to log in during all four phases. By comparing observed to minimal intended usage (frequency and activity), different user groups were defined. Usage statistics for 4 months were collected from 70 breast cancer survivors (mean age 50.9 years). Frequency of logins/person ranged from 0 to 45, total duration/person from 0 to 2324 minutes (38.7 hours), and activity from opening none to all intervention ingredients. Thirty-one participants continued logging in to all four phases, resulting in an intervention adherence rate of 44.3% (95% CI 33.2-55.9). Nine nonusers (13%), 30 low users (43%), and 31 high users (44%) were defined. Low and high users differed significantly on frequency (P<.001), total duration (P<.001), session duration (P=.009), and activity (P<.001). High users logged in an average of 21 times, had a mean session duration of 33 minutes, and opened on average 91% of all ingredients. Signing the self-help contract (P<.001), reporting usefulness of ingredients (P=.003), overall satisfaction (P=.028), and user friendliness evaluation (P=.003) were higher in high users. User groups did not differ on age, education, and baseline distress. By reporting the usage of a self-management website for breast cancer survivors, the present study gained first insight into the design of usage evaluations of generic fully automated Web-based interventions. It is recommended to (1) incorporate usage statistics that reflect the amount of self-tailoring applied by users, (2) combine technical usage statistics with self-reported usefulness, and (3) use qualitative measures. Also, (4) a pilot usage evaluation should be a fixed step in the development process of novel Web-based interventions, and (5) it is essential for researchers to gain insight into the rationale of recorded and nonrecorded usage statistics. Netherlands Trial Register (NTR): 2935; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=2935 (Archived by WebCite at http://www.webcitation.org/6IkX1ADEV).
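
    The sketch below illustrates, on made-up per-session records, the kind of usage metrics described above: login frequency, total duration, and adherence defined as logging in during all four phases. Field names and the sample data are assumptions, not the study's logging format.

```python
# Hedged sketch of usage metrics (frequency, duration, phase adherence)
# computed from hypothetical per-session records.
from collections import defaultdict

sessions = [
    # (participant_id, phase 1-4, session_minutes) -- synthetic examples
    ("p01", 1, 22), ("p01", 2, 35), ("p01", 3, 18), ("p01", 4, 40),
    ("p02", 1, 12), ("p02", 2, 9),
    ("p03", 1, 5),
]

by_user = defaultdict(list)
for user, phase, minutes in sessions:
    by_user[user].append((phase, minutes))

for user, recs in sorted(by_user.items()):
    logins = len(recs)
    total_minutes = sum(m for _, m in recs)
    adherent = {p for p, _ in recs} == {1, 2, 3, 4}   # logged in during all four phases
    print(f"{user}: {logins} logins, {total_minutes} min, adherent={adherent}")

adherence_rate = sum(
    1 for recs in by_user.values() if {p for p, _ in recs} == {1, 2, 3, 4}
) / len(by_user)
print(f"adherence rate: {adherence_rate:.1%}")
```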

  11. Web servers and services for electrostatics calculations with APBS and PDB2PQR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unni, Samir; Huang, Yong; Hanson, Robert M.

    APBS and PDB2PQR are widely utilized free software packages for biomolecular electrostatics calculations. Using the Opal toolkit, we have developed a web services framework for these software packages that enables the use of APBS and PDB2PQR by users who do not have local access to the necessary amount of computational capabilities. This not only increases accessibility of the software to a wider range of scientists, educators, and students but it also increases the availability of electrostatics calculations on portable computing platforms. Users can access this new functionality in two ways. First, an Opal-enabled version of APBS is provided in current distributions, available freely on the web. Second, we have extended the PDB2PQR web server to provide an interface for the setup, execution, and visualization of electrostatic potentials as calculated by APBS. This web interface also uses the Opal framework, which ensures the scalability needed to support the large APBS user community. Both of these resources are available from the APBS/PDB2PQR website: http://www.poissonboltzmann.org/.

  12. Pathview Web: user friendly pathway visualization and data integration.

    PubMed

    Luo, Weijun; Pant, Gaurav; Bhavnasi, Yeshvant K; Blanchard, Steven G; Brouwer, Cory

    2017-07-03

    Pathway analysis is widely used in omics studies. Pathway-based data integration and visualization is a critical component of the analysis. To address this need, we recently developed a novel R package called Pathview. Pathview maps, integrates and renders a large variety of biological data onto molecular pathway graphs. Here we developed the Pathview Web server to make pathway visualization and data integration accessible to all scientists, including those without special computing skills or resources. Pathview Web features an intuitive graphical web interface and a user-centered design. The server not only expands the core functions of Pathview, but also provides many useful features not available in the offline R package. Importantly, the server presents a comprehensive workflow for both regular and integrated pathway analysis of multiple omics data. In addition, the server also provides a RESTful API for programmatic access and convenient integration into third-party software or workflows. Pathview Web is openly and freely accessible at https://pathview.uncc.edu/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  13. Web servers and services for electrostatics calculations with APBS and PDB2PQR

    PubMed Central

    Unni, Samir; Huang, Yong; Hanson, Robert; Tobias, Malcolm; Krishnan, Sriram; Li, Wilfred W.; Nielsen, Jens E.; Baker, Nathan A.

    2011-01-01

    APBS and PDB2PQR are widely utilized free software packages for biomolecular electrostatics calculations. Using the Opal toolkit, we have developed a Web services framework for these software packages that enables the use of APBS and PDB2PQR by users who do not have local access to the necessary amount of computational capabilities. This not only increases accessibility of the software to a wider range of scientists, educators, and students but it also increases the availability of electrostatics calculations on portable computing platforms. Users can access this new functionality in two ways. First, an Opal-enabled version of APBS is provided in current distributions, available freely on the web. Second, we have extended the PDB2PQR web server to provide an interface for the setup, execution, and visualization of electrostatic potentials as calculated by APBS. This web interface also uses the Opal framework, which ensures the scalability needed to support the large APBS user community. Both of these resources are available from the APBS/PDB2PQR website: http://www.poissonboltzmann.org/. PMID:21425296

  14. Adapting Web content for low-literacy readers by using lexical elaboration and named entities labeling

    NASA Astrophysics Data System (ADS)

    Watanabe, W. M.; Candido, A.; Amâncio, M. A.; De Oliveira, M.; Pardo, T. A. S.; Fortes, R. P. M.; Aluísio, S. M.

    2010-12-01

    This paper presents an approach for assisting low-literacy readers in accessing Web online information. The "Educational FACILITA" tool is a Web content adaptation tool that provides innovative features and follows more intuitive interaction models regarding accessibility concerns. In particular, we propose an interaction model and a Web application that explore the natural language processing tasks of lexical elaboration and named entity labeling for improving Web accessibility. We report on the results obtained from a pilot study on usability analysis carried out with low-literacy users. The preliminary results show that "Educational FACILITA" improves the comprehension of text elements, although the assistance mechanisms might also confuse users when word sense ambiguity is introduced by gathering, for a complex word, a list of synonyms with multiple meanings. This points to a future solution in which the correct sense of a complex word in a sentence is identified, addressing this pervasive characteristic of natural languages. The pilot study also identified that experienced computer users find the tool more useful than novice computer users do.

  15. Web Extensible Display Manager

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slominski, Ryan; Larrieu, Theodore L.

    Jefferson Lab's Web Extensible Display Manager (WEDM) allows staff to access EDM control system screens from a web browser in remote offices and from mobile devices. Native browser technologies are leveraged to avoid installing and managing software on remote clients such as browser plugins, tunnel applications, or an EDM environment. Since standard network ports are used, firewall exceptions are minimized. To avoid security concerns from remote users modifying a control system, WEDM exposes read-only access, and basic web authentication can be used to further restrict access. Updates of monitored EPICS channels are delivered via a Web Socket using a web gateway. The software translates EDM description files (denoted with the edl suffix) to HTML with Scalable Vector Graphics (SVG), following EDM's edl file vector drawing rules to create faithful screen renderings. The WEDM server parses edl files and creates the HTML equivalent in real time, allowing existing screens to work without modification. Alternatively, the familiar drag-and-drop EDM screen creation tool can be used to create optimized screens sized specifically for smart phones and then rendered by WEDM.

  16. EPA Web Training Classes

    EPA Pesticide Factsheets

    Scheduled webinars can help you better manage EPA web content. Class topics include Drupal basics, creating different types of pages in the WebCMS such as document pages and forms, using Google Analytics, and best practices for metadata and accessibility.

  17. Web usage mining at an academic health sciences library: an exploratory study.

    PubMed

    Bracke, Paul J

    2004-10-01

    This paper explores the potential of multinomial logistic regression analysis to perform Web usage mining for an academic health sciences library Website. Usage of database-driven resource gateway pages was logged for a six-month period, including information about users' network addresses, referring uniform resource locators (URLs), and types of resource accessed. It was found that referring URL did vary significantly by two factors: whether a user was on-campus and what type of resource was accessed. Although the data available for analysis are limited by the nature of the Web and concerns for privacy, this method demonstrates the potential for gaining insight into Web usage that supplements Web log analysis. It can be used to improve the design of static and dynamic Websites today and could be used in the design of more advanced Web systems in the future.
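
    For readers unfamiliar with the technique named above, the sketch below fits a multinomial logistic regression relating a categorical log feature (referring-URL category) to on/off-campus status and resource type. The data are synthetic placeholders, not the study's logs, and scikit-learn is assumed.

```python
# Sketch of multinomial logistic regression for web usage mining on synthetic
# log records. With scikit-learn's default lbfgs solver and a three-category
# outcome, the fitted model is multinomial.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import OneHotEncoder

# Predictors: (on_campus, resource_type); outcome: referring-URL category.
X_raw = np.array([
    ["on",  "database"], ["on",  "ejournal"], ["off", "database"],
    ["off", "ejournal"], ["on",  "database"], ["off", "database"],
])
y = np.array(["catalog", "gateway", "search_engine",
              "search_engine", "catalog", "gateway"])

encoder = OneHotEncoder(handle_unknown="ignore")
X = encoder.fit_transform(X_raw)

model = LogisticRegression(max_iter=1000)
model.fit(X, y)
print(model.predict(encoder.transform([["off", "ejournal"]])))
```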

  18. What do web-use skill differences imply for online health information searches?

    PubMed

    Feufel, Markus A; Stahl, S Frederica

    2012-06-13

    Online health information is of variable and often low scientific quality. In particular, elderly less-educated populations are said to struggle in accessing quality online information (digital divide). Little is known about (1) how their online behavior differs from that of younger, more-educated, and more-frequent Web users, and (2) how the older population may be supported in accessing good-quality online health information. To specify the digital divide between skilled and less-skilled Web users, we assessed qualitative differences in technical skills, cognitive strategies, and attitudes toward online health information. Based on these findings, we identified educational and technological interventions to help Web users find and access good-quality online health information. We asked 22 native German-speaking adults to search for health information online. The skilled cohort consisted of 10 participants who were younger than 30 years of age, had a higher level of education, and were more experienced using the Web than 12 participants in the less-skilled cohort, who were at least 50 years of age. We observed online health information searches to specify differences in technical skills and analyzed concurrent verbal protocols to identify health information seekers' cognitive strategies and attitudes. Our main findings relate to (1) attitudes: health information seekers in both cohorts doubted the quality of information retrieved online; among poorly skilled seekers, this was mainly because they doubted their skills to navigate vast amounts of information; once a website was accessed, quality concerns disappeared in both cohorts, (2) technical skills: skilled Web users effectively filtered information according to search intentions and data sources; less-skilled users were easily distracted by unrelated information, and (3) cognitive strategies: skilled Web users searched to inform themselves; less-skilled users searched to confirm their health-related opinions such as "vaccinations are harmful." Independent of Web-use skills, most participants stopped a search once they had found the first piece of evidence satisfying search intentions, rather than according to quality criteria. Findings related to Web-use skills differences suggest two classes of interventions to facilitate access to good-quality online health information. Challenges related to findings (1) and (2) should be remedied by improving people's basic Web-use skills. In particular, Web users should be taught how to avoid information overload by generating specific search terms and to avoid low-quality information by requesting results from trusted websites only. Problems related to finding (3) may be remedied by visually labeling search engine results according to quality criteria.

  19. GOEAST: a web-based software toolkit for Gene Ontology enrichment analysis.

    PubMed

    Zheng, Qi; Wang, Xiu-Jie

    2008-07-01

    Gene Ontology (GO) analysis has become a commonly used approach for functional studies of large-scale genomic or transcriptomic data. Although many software tools with GO-related analysis functions exist, new tools are still needed to meet the requirements of data generated by newly developed technologies or of advanced analysis purposes. Here, we present the Gene Ontology Enrichment Analysis Software Toolkit (GOEAST), an easy-to-use web-based toolkit that identifies statistically overrepresented GO terms within given gene sets. Compared with available GO analysis tools, GOEAST has the following improved features: (i) GOEAST displays enriched GO terms in graphical format according to their relationships in the hierarchical tree of each GO category (biological process, molecular function and cellular component), and therefore provides a better understanding of the correlations among enriched GO terms; (ii) GOEAST supports analysis of data from various sources (probe or probe set IDs of Affymetrix, Illumina, Agilent or customized microarrays, as well as different gene identifiers) and multiple species (about 60 prokaryote and eukaryote species); (iii) one unique feature of GOEAST is to allow cross-comparison of the GO enrichment status of multiple experiments to identify functional correlations among them. GOEAST also provides rigorous statistical tests to enhance the reliability of analysis results. GOEAST is freely accessible at http://omicslab.genetics.ac.cn/GOEAST/
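
    The core of most GO enrichment tests is a hypergeometric (one-sided Fisher) calculation like the sketch below; this is a generic illustration, and GOEAST's exact statistical options may differ.

```python
# Generic hypergeometric enrichment test for a single GO term.
from scipy.stats import hypergeom

N = 20000   # genes in the background (e.g. whole genome or array)
K = 300     # background genes annotated with the GO term
n = 500     # genes in the submitted list
k = 25      # submitted genes annotated with the GO term

# P(X >= k): probability of seeing at least k annotated genes by chance.
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value: {p_value:.3e}")
```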

  20. CovalentDock Cloud: a web server for automated covalent docking.

    PubMed

    Ouyang, Xuchang; Zhou, Shuo; Ge, Zemei; Li, Runtao; Kwoh, Chee Keong

    2013-07-01

    Covalent binding is an important mechanism for many drugs to gain their function. We developed a computational algorithm to model this chemical event and extended it to a web server, the CovalentDock Cloud, to make it accessible directly online without any local installation and configuration. It provides a simple yet user-friendly web interface to perform covalent docking experiments and analysis online. The web server accepts the structures of both the ligand and the receptor uploaded by the user or retrieved from online databases with a valid access ID. It identifies the potential covalent binding patterns, carries out the covalent docking experiments and provides visualization of the results for user analysis. This web server is free and open to all users at http://docking.sce.ntu.edu.sg/.

  1. WIWS: a protein structure bioinformatics Web service collection.

    PubMed

    Hekkelman, M L; Te Beek, T A H; Pettifer, S R; Thorne, D; Attwood, T K; Vriend, G

    2010-07-01

    The WHAT IF molecular-modelling and drug design program is widely distributed in the world of protein structure bioinformatics. Although it was originally designed as an interactive application, its highly modular design and inbuilt control language have recently enabled its deployment as a collection of programmatically accessible web services. We report here a collection of WHAT IF-based protein structure bioinformatics web services: these relate to structure quality, the use of symmetry in crystal structures, structure correction and optimization, adding hydrogens and optimizing hydrogen bonds, and a series of geometric calculations. The freely accessible web services are based on the industry standard WS-I profile and the EMBRACE technical guidelines, and are available via both REST and SOAP paradigms. The web services run on a dedicated computational cluster; their function and availability are monitored daily.

  2. CovalentDock Cloud: a web server for automated covalent docking

    PubMed Central

    Ouyang, Xuchang; Zhou, Shuo; Ge, Zemei; Li, Runtao; Kwoh, Chee Keong

    2013-01-01

    Covalent binding is an important mechanism for many drugs to gain their function. We developed a computational algorithm to model this chemical event and extended it to a web server, the CovalentDock Cloud, to make it accessible directly online without any local installation and configuration. It provides a simple yet user-friendly web interface to perform covalent docking experiments and analysis online. The web server accepts the structures of both the ligand and the receptor uploaded by the user or retrieved from online databases with a valid access ID. It identifies the potential covalent binding patterns, carries out the covalent docking experiments and provides visualization of the results for user analysis. This web server is free and open to all users at http://docking.sce.ntu.edu.sg/. PMID:23677616

  3. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update

    PubMed Central

    Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy

    2016-01-01

    High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in the life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889

  4. LocusExplorer: a user-friendly tool for integrated visualization of human genetic association data and biological annotations.

    PubMed

    Dadaev, Tokhir; Leongamornlert, Daniel A; Saunders, Edward J; Eeles, Rosalind; Kote-Jarai, Zsofia

    2016-03-15

    In this article, we present LocusExplorer, a data visualization and exploration tool for genetic association data. LocusExplorer is written in R using the Shiny library, providing access to powerful R-based functions through a simple user interface. LocusExplorer allows users to simultaneously display genetic, statistical and biological data for humans in a single image and allows dynamic zooming and customization of the plot features. Publication quality plots may then be produced in a variety of file formats. LocusExplorer is open source and runs through R and a web browser. It is available at www.oncogenetics.icr.ac.uk/LocusExplorer/ or can be installed locally with the source code accessed from https://github.com/oncogenetics/LocusExplorer. Contact: tokhir.dadaev@icr.ac.uk. © The Author 2015. Published by Oxford University Press.

  5. Web-Based Statistical Sampling and Analysis

    ERIC Educational Resources Information Center

    Quinn, Anne; Larson, Karen

    2016-01-01

    Consistent with the Common Core State Standards for Mathematics (CCSSI 2010), the authors write that they have asked students to do statistics projects with real data. To obtain real data, their students use the free Web-based app, Census at School, created by the American Statistical Association (ASA) to help promote civic awareness among school…

  6. Developing Access Control Model of Web OLAP over Trusted and Collaborative Data Warehouses

    NASA Astrophysics Data System (ADS)

    Fugkeaw, Somchart; Mitrpanont, Jarernsri L.; Manpanpanich, Piyawit; Juntapremjitt, Sekpon

    This paper proposes the design and development of a Role-based Access Control (RBAC) model for Single Sign-On (SSO) Web-OLAP queries spanning multiple data warehouses (DWs). The model is based on PKI Authentication and Privilege Management Infrastructure (PMI); it presents a binding model of RBAC authorization based on dimension privileges specified in an attribute certificate (AC) and user identification. In particular, the mapping of attributes between DW user authentication and dimensional access privileges is illustrated. In our approach, we apply a multi-agent system to automate flexible and effective management of user authentication, role delegation, and system accountability. Finally, the paper culminates in the prototype system A-COLD (Access Control of web-OLAP over multiple DWs), which incorporates the OLAP features and authentication and authorization enforcement in a multi-user and multi-data warehouse environment.
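
    A minimal sketch of the RBAC idea described above: roles grant privileges on OLAP dimensions, and a query is authorized only if the user's roles cover every dimension it touches. All names are illustrative and not the A-COLD design.

```python
# Minimal role-based access control check for dimension-level OLAP privileges.
ROLE_PRIVILEGES = {
    "sales_analyst": {"time", "product", "region"},
    "hr_analyst": {"time", "employee"},
}

USER_ROLES = {
    "alice": {"sales_analyst"},
    "bob": {"hr_analyst"},
}

def allowed_dimensions(user: str) -> set[str]:
    # Union of dimension privileges over all of the user's roles.
    dims: set[str] = set()
    for role in USER_ROLES.get(user, set()):
        dims |= ROLE_PRIVILEGES.get(role, set())
    return dims

def authorize(user: str, query_dimensions: set[str]) -> bool:
    # A query is allowed only if every dimension it uses is covered.
    return query_dimensions <= allowed_dimensions(user)

print(authorize("alice", {"time", "region"}))      # True
print(authorize("bob", {"time", "product"}))       # False: no "product" privilege
```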

  7. The Digital Divide and Patient Portals: Internet Access Explained Differences in Patient Portal Use for Secure Messaging by Age, Race, and Income.

    PubMed

    Graetz, Ilana; Gordon, Nancy; Fung, Vick; Hamity, Courtnee; Reed, Mary E

    2016-08-01

    Online access to health records and the ability to exchange secure messages with physicians can improve patient engagement and outcomes; however, the digital divide could limit access to web-based portals among disadvantaged groups. To understand whether sociodemographic differences in patient portal use for secure messaging can be explained by differences in internet access and care preferences. Cross-sectional survey to examine the association between patient sociodemographic characteristics and internet access and care preferences; then, the association between sociodemographic characteristics and secure message use with and without adjusting for internet access and care preference. One thousand forty-one patients with chronic conditions in a large integrated health care delivery system (76% response rate). Internet access, portal use for secure messaging, preference for in-person or online care, and sociodemographic and health characteristics. Internet access and preference mediated some of the differences in secure message use by age, race, and income. For example, using own computer to access the internet explained 52% of the association between race and secure message use and 60% of the association between income and use (Sobel-Goodman mediation test, P<0.001 for both). Education and sex-related differences in portal use remained statistically significant when controlling for internet access and preference. As the availability and use of patient portals increase, it is important to understand which patients have limited access and the barriers they may face. Improving internet access and making portals available across multiple platforms, including mobile, may reduce some disparities in secure message use.
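
    The sketch below shows the Sobel test used in mediation analyses like the one above: a is the effect of the predictor on the mediator, b the effect of the mediator on the outcome controlling for the predictor. The coefficients are made-up placeholders, not the study's estimates.

```python
# Sobel test for a simple mediation path, with placeholder coefficients.
from math import sqrt
from scipy.stats import norm

a, se_a = 0.42, 0.08    # predictor -> mediator (e.g. income -> internet access)
b, se_b = 0.55, 0.10    # mediator -> outcome (internet access -> secure messaging)

sobel_z = (a * b) / sqrt(b**2 * se_a**2 + a**2 * se_b**2)
p_value = 2 * norm.sf(abs(sobel_z))
print(f"Sobel z = {sobel_z:.2f}, p = {p_value:.4f}")
```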

  8. Web client and ODBC access to legacy database information: a low cost approach.

    PubMed Central

    Sanders, N. W.; Mann, N. H.; Spengler, D. M.

    1997-01-01

    A new method has been developed for the Department of Orthopaedics of Vanderbilt University Medical Center to access departmental clinical data. Previously these data were stored only in the medical center's mainframe DB2 database; they are now additionally stored in a departmental SQL database. Access to these data is available via any ODBC-compliant front-end or a web client. With a small budget and no full-time staff, we were able to give our department on-line access to many years' worth of patient data that was previously inaccessible. PMID:9357735
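
    A hedged sketch of what ODBC-style access to such a departmental SQL database can look like. The driver name, server, credentials, and table are placeholders; the original work used an ODBC front-end rather than this exact code.

```python
# Hypothetical ODBC query against a departmental SQL database via pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=dept-sql.example.edu;DATABASE=ortho;UID=reader;PWD=secret"
)
cursor = conn.cursor()
cursor.execute(
    "SELECT patient_id, visit_date, diagnosis FROM clinic_visits WHERE visit_date >= ?",
    "1995-01-01",
)
for row in cursor.fetchmany(5):
    print(row.patient_id, row.visit_date, row.diagnosis)
conn.close()
```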

  9. A DICOM based radiotherapy plan database for research collaboration and reporting

    NASA Astrophysics Data System (ADS)

    Westberg, J.; Krogh, S.; Brink, C.; Vogelius, I. R.

    2014-03-01

    Purpose: To create a central radiotherapy (RT) plan database for dose analysis and reporting, capable of calculating and presenting statistics on user-defined patient groups. The goal is to facilitate multi-center research studies with easy and secure access to RT plans and statistics on protocol compliance. Methods: RT institutions are able to send data to the central database using DICOM communications on a secure computer network. The central system is composed of a number of DICOM servers, an SQL database and in-house developed software services to process the incoming data. A web site within the secure network allows users to manage their submitted data. Results: The RT plan database has been developed in Microsoft .NET, and users are able to send DICOM data between RT centers in Denmark. Dose-volume histogram (DVH) calculations performed by the system are comparable to those of conventional RT software. A permission system was implemented to ensure access control and easy, yet secure, data sharing across centers. The reports contain DVH statistics for structures in user-defined patient groups. The system currently contains over 2200 patients in 14 collaborations. Conclusions: A central RT plan repository for use in multi-center trials and quality assurance was created. The system provides an attractive alternative to dummy runs by enabling continuous monitoring of protocol conformity and plan metrics in a trial.
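
    To illustrate the kind of DVH statistics such a system reports, here is a short sketch of a cumulative dose-volume histogram: the fraction of a structure's voxels receiving at least each dose level. The dose values are synthetic and the code is not the system's actual implementation.

```python
# Cumulative DVH sketch from a synthetic per-voxel dose array.
import numpy as np

structure_dose = np.array([12.0, 45.5, 50.1, 49.8, 30.2, 51.0, 2.1])  # Gy per voxel

dose_bins = np.arange(0.0, 60.5, 0.5)                       # 0.5 Gy resolution
volume_fraction = [(structure_dose >= d).mean() for d in dose_bins]

# Typical summary statistics derived from the curve:
d_mean = structure_dose.mean()
v_50 = (structure_dose >= 50.0).mean() * 100                # % of volume >= 50 Gy
print(f"Dmean = {d_mean:.1f} Gy, V50Gy = {v_50:.1f}%")
```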

  10. Health and medication information resources on the World Wide Web.

    PubMed

    Grossman, Sara; Zerilli, Tina

    2013-04-01

    Health care practitioners have increasingly used the Internet to obtain health and medication information. The vast number of Internet Web sites providing such information and concerns with their reliability makes it essential for users to carefully select and evaluate Web sites prior to use. To this end, this article reviews the general principles to consider in this process. Moreover, as cost may limit access to subscription-based health and medication information resources with established reputability, freely accessible online resources that may serve as an invaluable addition to one's reference collection are highlighted. These include government- and organization-sponsored resources (eg, US Food and Drug Administration Web site and the American Society of Health-System Pharmacists' Drug Shortage Resource Center Web site, respectively) as well as commercial Web sites (eg, Medscape, Google Scholar). Familiarity with such online resources can assist health care professionals in their ability to efficiently navigate the Web and may potentially expedite the information gathering and decision-making process, thereby improving patient care.

  11. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.

  12. Pilot Evaluation of a Web-Based Intervention Targeting Sexual Health Service Access

    ERIC Educational Resources Information Center

    Brown, K. E.; Newby, K.; Caley, M.; Danahay, A.; Kehal, I.

    2016-01-01

    Sexual health service access is fundamental to good sexual health, yet interventions designed to address this have rarely been implemented or evaluated. In this article, pilot evaluation findings for a targeted public health behavior change intervention, delivered via a website and web-app, aiming to increase uptake of sexual health services among…

  13. Make That to Go: Re-Engineering a Web Portal for Mobile Access

    ERIC Educational Resources Information Center

    Spitzer, Stephan

    2012-01-01

    The fact that people now live in a world of abundant portable electronic devices is important to any organization that maintains a web presence, including libraries. No longer tied to a desktop, the patrons' netbooks, tablets, ebook readers, and, of course, cellphones all become potential tools for remote access to library content. About a year…

  14. WaveNet: A Web-Based Metocean Data Access, Processing and Analysis Tool; Part 5 - WW3 Database

    DTIC Science & Technology

    2015-02-01

    Program (CDIP); and Part 4 for the Great Lakes Observing System/Coastal Forecasting System (GLOS/GLCFS). Using step-by-step instructions, this Part 5... Demirbilek, Z., L. Lin, and D. Wilson. 2014a. WaveNet: A web-based metocean data access, processing, and analysis tool; Part 3 – CDIP database

  15. Essential Skills and Knowledge for Troubleshooting E-Resources Access Issues in a Web-Scale Discovery Environment

    ERIC Educational Resources Information Center

    Carter, Sunshine; Traill, Stacie

    2017-01-01

    Electronic resource access troubleshooting is familiar work in most libraries. The added complexity introduced when a library implements a web-scale discovery service, however, creates a strong need for well-organized, rigorous training to enable troubleshooting staff to provide the best service possible. This article outlines strategies, tools,…

  16. On2broker: Semantic-Based Access to Information Sources at the WWW.

    ERIC Educational Resources Information Center

    Fensel, Dieter; Angele, Jurgen; Decker, Stefan; Erdmann, Michael; Schnurr, Hans-Peter; Staab, Steffen; Studer, Rudi; Witt, Andreas

    On2broker provides brokering services to improve access to heterogeneous, distributed, and semistructured information sources as they are found on the World Wide Web. It relies on the use of ontologies to make explicit the semantics of Web pages. This paper discusses the general architecture and main components (i.e., query engine, information…
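    On2broker's own machinery (ontology language, query engine and inference engine) predates today's RDF tooling, but the core idea, attaching explicit machine-readable semantics to web pages and querying them structurally instead of by keyword, can be sketched with the rdflib library. The namespace, classes and properties below are invented for the illustration and are not On2broker's vocabulary.

        # Illustration of ontology-based page annotation and structured querying
        # (not On2broker's actual implementation); the vocabulary is hypothetical.
        from rdflib import Graph, Namespace, URIRef, Literal, RDF

        EX = Namespace("http://example.org/onto#")
        g = Graph()

        page = URIRef("http://example.org/staff/jane.html")
        g.add((page, RDF.type, EX.Homepage))
        g.add((page, EX.describesPerson, URIRef("http://example.org/id/jane")))
        g.add((URIRef("http://example.org/id/jane"), EX.affiliation, Literal("AIFB")))

        # A broker-style structured query instead of a keyword search:
        q = """
        PREFIX ex: <http://example.org/onto#>
        SELECT ?page ?org WHERE {
            ?page a ex:Homepage ;
                  ex:describesPerson ?person .
            ?person ex:affiliation ?org .
        }
        """
        for row in g.query(q):
            print(row.page, row.org)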

  17. 76 FR 71914 - Nondiscrimination on the Basis of Disability in Air Travel: Accessibility of Web Sites and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-21

    ... Disability in Air Travel: Accessibility of Web Sites and Automated Kiosks at U.S. Airports AGENCY: Office of... January 9, 2012. This extension is a result of requests from a number of parties for additional time to... constructive comments for the Department's consideration. The Interactive Travel Services Association requested...

  18. 78 FR 3470 - DTE Electric Company (Formerly the Detroit Edison Company), Notice of Availability of Final...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-16

    ...--specific Web page at http://www.nrc.gov/reactors/new-reactors/col/fermi.html . The Ellis Library and... possesses and are publicly-available, using any of the following methods: Federal Rulemaking Web site: Go to... Documents Access and Management System (ADAMS): You may access publicly-available documents online in the...

  19. Evaluating the Accessibility of Web-Based Instruction for Students with Disabilities.

    ERIC Educational Resources Information Center

    Hinn, D. Michelle

    This paper presents the methods and results of a year-long evaluation study, conducted for the purpose of determining disability accessibility barriers and potential solutions for those barriers found in four World Wide Web-based learning environments. The primary questions used to frame the evaluation study were: (1) Are there any features of the…

  20. 77 FR 63758 - Magnuson-Stevens Act Provisions; Fisheries Off West Coast States; Biennial Specifications and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-17

    ... Access This final rule is accessible via the Internet at the Office of the Federal Register's Web site at... Pacific Fishery Management Council's Web site at http://www.pcouncil.org/ . Background The Pacific Coast... fishery and the potential impacts on overall catch levels. The Council's Groundfish Management Team (GMT...

  1. Web-Based Dissemination System for the Trusted Computing Exemplar Project

    DTIC Science & Technology

    2005-06-01

    3. Fiasco Microkernel; 4. Apache Web Server... The next project examined was the Fiasco Microkernel, developed by the Dresden University of Technology. This dissemination...System,” 1999, http://www.eros-os.org, Accessed: May 2005. [5] “The Fiasco Microkernel,” February 2004, http://os.inf.tu-dresden.de/fiasco/, Accessed

  2. Towards an Approach of Semantic Access Control for Cloud Computing

    NASA Astrophysics Data System (ADS)

    Hu, Luokai; Ying, Shi; Jia, Xiangyang; Zhao, Kai

    With the development of cloud computing, mutual understandability among distributed Access Control Policies (ACPs) has become an important security issue. Semantic Web technology provides a solution for the semantic interoperability of heterogeneous applications. In this paper, we analyze existing access control methods and present a new Semantic Access Control Policy Language (SACPL) for describing ACPs in cloud computing environments. An Access Control Oriented Ontology System (ACOOS) is designed as the semantic basis of SACPL. The ontology-based SACPL can effectively solve the interoperability issue of distributed ACPs. This study enriches research on applying Semantic Web technology in the security field and offers a new way of thinking about access control in cloud computing.
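    SACPL and ACOOS are described in the paper but are not available here as reusable code, so the sketch below only illustrates the general idea of ontology-mediated policy evaluation: a shared concept mapping (standing in for the ontology) reconciles the different attribute vocabularies used by distributed policies. All names and attributes are invented.

        # Hypothetical illustration of ontology-mediated access control (not SACPL):
        # policies written against different vocabularies are normalized to shared
        # concepts before evaluation.
        CONCEPT_MAP = {              # stands in for an access-control ontology
            "role": "subject_role",
            "position": "subject_role",
            "doc_type": "resource_class",
            "category": "resource_class",
        }

        def normalize(attrs):
            """Rewrite provider-specific attribute names to shared concepts."""
            return {CONCEPT_MAP.get(k, k): v for k, v in attrs.items()}

        def permits(policy, request):
            """A policy permits a request if every normalized constraint matches."""
            req = normalize(request)
            return all(req.get(k) == v for k, v in normalize(policy).items())

        # Two providers express "doctors may read medical records" differently:
        policy_a = {"role": "doctor", "doc_type": "medical_record"}
        policy_b = {"position": "doctor", "category": "medical_record"}
        request = {"role": "doctor", "category": "medical_record"}
        print(permits(policy_a, request), permits(policy_b, request))   # True True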

  3. Smart Cities Intelligence System (SMACiSYS) Integrating Sensor Web with Spatial Data Infrastructures (sensdi)

    NASA Astrophysics Data System (ADS)

    Bhattacharya, D.; Painho, M.

    2017-09-01

    The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with a Spatial Data Infrastructure (SDI). The objective is the development of an automated smart cities intelligence system (SMACiSYS) with sensor-web access (SENSDI) that uses geomatics for sustainable societies. There is a need for an automated, integrated system that categorizes events and issues information that reaches users directly; at present, no web-enabled information system exists that can disseminate messages after evaluating events in real time. The research formalizes the notion of an integrated, independent, generalized and automated geo-event analysis system that makes use of geospatial data on a widely used platform. Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and an SDI. The converse benefit is the expansion of the spatial data infrastructure to use the sensor web dynamically and in real time for the smart applications that smarter cities demand. SENSDI thus augments existing smart city platforms by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on Geonode, QGIS and Java, which bind most of the functionality of the Internet, the sensor web and the Internet of Things (superseding the Internet of Sensors). In a nutshell, the project delivers a generalized, real-time, accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.
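    A concrete building block for this kind of sensor-web access is an OGC SensorThings API endpoint whose observations can be pulled and joined with SDI layers. The sketch below uses a hypothetical service URL and datastream id; it shows the style of request, not the SENSDI implementation itself.

        # Minimal sketch (hypothetical endpoint): fetch the latest observations
        # from an OGC SensorThings API datastream for downstream geospatial analysis.
        import requests

        STA_URL = "https://example.org/sensorthings/v1.0"    # hypothetical endpoint
        datastream_id = 42                                    # hypothetical datastream

        resp = requests.get(
            f"{STA_URL}/Datastreams({datastream_id})/Observations",
            params={"$orderby": "phenomenonTime desc", "$top": 5},
            timeout=30,
        )
        resp.raise_for_status()
        for obs in resp.json().get("value", []):
            print(obs["phenomenonTime"], obs["result"])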

  4. Enhancing Access to Patient Education Information: A Pilot Usability Study

    PubMed Central

    Beaudoin, Denise E.; Rocha, Roberto A.; Tse, Tony

    2005-01-01

    Health care organizations are developing Web-based portals to provide patient access to personal health information and enhance patient-provider communication. This pilot study investigates two navigation models (“serial” and “menu-driven”) for improving access to education materials available through a portal. There was a trend toward greater user satisfaction with the menu-driven model. Model preference was influenced by frequency of Web use. Results should aid in the improvement of existing portals and in the development of new ones. PMID:16779179

  5. PathJam: a new service for integrating biological pathway information.

    PubMed

    Glez-Peña, Daniel; Reboiro-Jato, Miguel; Domínguez, Rubén; Gómez-López, Gonzalo; Pisano, David G; Fdez-Riverola, Florentino

    2010-10-28

    Biological pathways are crucial to much of today's scientific research, including the study of specific biological processes related to human diseases. PathJam is a new, comprehensive and freely accessible web server application that integrates scattered human pathway annotation from several public sources. The tool is designed both (i) to be intuitive for wet-lab users, providing statistical enrichment analysis of pathway annotations, and (ii) to support the development of new integrative pathway applications. PathJam's features and advantages include interactive graphs linking pathways and genes of interest, downloadable results in fully compatible formats, GSEA-compatible output files and a standardized RESTful API.
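    The RESTful API means the same enrichment data can be pulled programmatically rather than through the web interface. The sketch below is only illustrative: the base URL, route and parameter names are assumed, so the actual PathJam documentation should be consulted for the real ones.

        # Hedged sketch of calling a pathway-annotation REST service; the endpoint
        # and parameters are hypothetical stand-ins, not PathJam's documented API.
        import requests

        BASE = "https://example.org/pathjam/api"      # hypothetical base URL
        genes = ["TP53", "BRCA1", "EGFR"]

        resp = requests.get(f"{BASE}/pathways",
                            params={"genes": ",".join(genes)},
                            timeout=30)
        resp.raise_for_status()
        for pathway in resp.json():
            print(pathway.get("name"), pathway.get("source"))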

  6. Usage Trends of Open Access and Local Journals: A Korean Case Study.

    PubMed

    Seo, Jeong-Wook; Chung, Hosik; Yun, Jungmin; Park, Jin Young; Park, Eunsun; Ahn, Yuri

    2016-01-01

    Articles from open access and local journals are important resources for research in Korea, and usage trends for these articles are important indicators for assessing current research practice. We analyzed an institutional collection of papers published from 1998 to 2014 by researchers from Seoul National University, and their references from papers published between 1998 and 2011. The published papers were collected from Web of Science or Scopus and were analyzed according to the proportion of articles from open access journals. Their cited references from papers in Web of Science were analyzed according to the proportion of local (South Korean) or open access journals. The proportion of open access papers was relatively stable until 2006 (2.5~5.2% in Web of Science and 2.7~4.2% in Scopus), but then increased to 15.9% (Web of Science) or 18.5% (Scopus) in 2014. We analyzed 2,750,485 cited references from 52,295 published papers. The overall proportion of cited articles from local journals was 1.8%, and that from open access journals was 3.0%. Citations of open access articles have increased since 2006, reaching 4.1% in 2011, although the increase in open access citations was smaller than the increase in open access publications. The proportion of citations from local journals was even lower. We propose the term "publishing/citing mismatch" to describe this difference, which is an issue at Seoul National University: the number of papers published in open access or local journals is increasing, but the number of citations of them is not. The cause of this discrepancy is multifactorial; governmental and institutional policies, social and cultural issues, and authors' citing behaviors help explain the mismatch. Additional measures are also necessary, such as the development of an institutional citation database and improved search capabilities for local and open access documents.

  7. Usage Trends of Open Access and Local Journals: A Korean Case Study

    PubMed Central

    Chung, Hosik; Yun, Jungmin; Park, Jin Young; Park, Eunsun; Ahn, Yuri

    2016-01-01

    Articles from open access and local journals are important resources for research in Korea, and usage trends for these articles are important indicators for assessing current research practice. We analyzed an institutional collection of papers published from 1998 to 2014 by researchers from Seoul National University, and their references from papers published between 1998 and 2011. The published papers were collected from Web of Science or Scopus and were analyzed according to the proportion of articles from open access journals. Their cited references from papers in Web of Science were analyzed according to the proportion of local (South Korean) or open access journals. The proportion of open access papers was relatively stable until 2006 (2.5~5.2% in Web of Science and 2.7~4.2% in Scopus), but then increased to 15.9% (Web of Science) or 18.5% (Scopus) in 2014. We analyzed 2,750,485 cited references from 52,295 published papers. The overall proportion of cited articles from local journals was 1.8%, and that from open access journals was 3.0%. Citations of open access articles have increased since 2006, reaching 4.1% in 2011, although the increase in open access citations was smaller than the increase in open access publications. The proportion of citations from local journals was even lower. We propose the term "publishing/citing mismatch" to describe this difference, which is an issue at Seoul National University: the number of papers published in open access or local journals is increasing, but the number of citations of them is not. The cause of this discrepancy is multifactorial; governmental and institutional policies, social and cultural issues, and authors' citing behaviors help explain the mismatch. Additional measures are also necessary, such as the development of an institutional citation database and improved search capabilities for local and open access documents. PMID:27195948

  8. Physical Impairment

    NASA Astrophysics Data System (ADS)

    Trewin, Shari

    Many health conditions can lead to physical impairments that impact computer and Web access. Musculoskeletal conditions such as arthritis and cumulative trauma disorders can make movement stiff and painful. Movement disorders such as tremor, Parkinsonism and dystonia affect the ability to control movement, or to prevent unwanted movements. Often, the same underlying health condition also has sensory or cognitive effects. People with dexterity impairments may use a standard keyboard and mouse, or any of a wide range of alternative input mechanisms. Examples are given of the diverse ways that specific dexterity impairments and input mechanisms affect the fundamental actions of Web browsing. As the Web becomes increasingly sophisticated, and physically demanding, new access features at the Web browser and page level will be necessary.

  9. High-performance web services for querying gene and variant annotation.

    PubMed

    Xin, Jiwen; Mark, Adam; Afrasiabi, Cyrus; Tsueng, Ginger; Juchler, Moritz; Gopal, Nikhil; Stupp, Gregory S; Putman, Timothy E; Ainscough, Benjamin J; Griffith, Obi L; Torkamani, Ali; Whetzel, Patricia L; Mungall, Christopher J; Mooney, Sean D; Su, Andrew I; Wu, Chunlei

    2016-05-06

    Efficient tools for data management and integration are essential for many aspects of high-throughput biology. In particular, annotations of genes and human genetic variants are commonly used but highly fragmented across many resources. Here, we describe MyGene.info and MyVariant.info, high-performance web services for querying gene and variant annotation information. These web services are currently accessed more than three million times per month. They also demonstrate a generalizable cloud-based model for organizing and querying biological annotation information. MyGene.info and MyVariant.info are provided as high-performance web services, accessible at http://mygene.info and http://myvariant.info. Both are offered free of charge to the research community.
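    Both services expose simple HTTP query endpoints, so a few lines of client code are enough to retrieve annotations (endpoint versions as documented at the time of writing; check the sites for the current API).

        # Query MyGene.info for a gene by symbol and MyVariant.info for a dbSNP id.
        import requests

        r = requests.get(
            "https://mygene.info/v3/query",
            params={"q": "symbol:CDK2", "species": "human",
                    "fields": "symbol,name,entrezgene"},
            timeout=30,
        )
        r.raise_for_status()
        print(r.json()["hits"][0])           # top-scoring gene hit

        r = requests.get("https://myvariant.info/v1/query",
                         params={"q": "rs58991260"}, timeout=30)
        r.raise_for_status()
        print(r.json()["total"])             # number of matching variant records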

  10. The ViennaRNA web services.

    PubMed

    Gruber, Andreas R; Bernhart, Stephan H; Lorenz, Ronny

    2015-01-01

    The ViennaRNA package is a widely used collection of programs for thermodynamic RNA secondary structure prediction. Over the years, many additional tools have been developed building on the core programs of the package to also address issues related to noncoding RNA detection, RNA folding kinetics, or efficient sequence design considering RNA-RNA hybridizations. The ViennaRNA web services provide easy and user-friendly web access to these tools. This chapter describes how to use this online platform to perform tasks such as prediction of minimum free energy structures, prediction of RNA-RNA hybrids, or noncoding RNA detection. The ViennaRNA web services can be used free of charge and can be accessed via http://rna.tbi.univie.ac.at.
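    For users who prefer scripted access over the web forms, the same core predictions are available through the ViennaRNA Python bindings, assuming the package is installed locally; a minimal example:

        # Minimum free energy secondary structure prediction with the ViennaRNA
        # Python bindings (requires a local ViennaRNA installation).
        import RNA

        seq = "GGGAAAUCCCGCGAAAGCGGGAUUUCCC"    # arbitrary example sequence
        structure, mfe = RNA.fold(seq)
        print(structure)
        print(f"MFE: {mfe:.2f} kcal/mol")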

  11. HTSstation: a web application and open-access libraries for high-throughput sequencing data analysis.

    PubMed

    David, Fabrice P A; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of high-throughput sequencing, including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation allows biologists to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. In addition, our programming framework lets developers design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch.

  12. HTSstation: A Web Application and Open-Access Libraries for High-Throughput Sequencing Data Analysis

    PubMed Central

    David, Fabrice P. A.; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J.; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of high-throughput sequencing, including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation allows biologists to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. In addition, our programming framework lets developers design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch. PMID:24475057

  13. JANIS 4: An Improved Version of the NEA Java-based Nuclear Data Information System

    NASA Astrophysics Data System (ADS)

    Soppera, N.; Bossant, M.; Dupont, E.

    2014-06-01

    JANIS is software developed to facilitate the visualization and manipulation of nuclear data, giving access to evaluated data libraries and to the EXFOR and CINDA databases. It is stand-alone Java software, downloadable from the web and distributed on DVD. Although it can be used offline, the system can also use an internet connection to access the NEA Data Bank database. It is now also offered as a full web application, requiring only a browser. The features added in the latest version of the software and this new web interface are described.

  14. JANIS 4: An Improved Version of the NEA Java-based Nuclear Data Information System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soppera, N., E-mail: nicolas.soppera@oecd.org; Bossant, M.; Dupont, E.

    JANIS is software developed to facilitate the visualization and manipulation of nuclear data, giving access to evaluated data libraries and to the EXFOR and CINDA databases. It is stand-alone Java software, downloadable from the web and distributed on DVD. Although it can be used offline, the system can also use an internet connection to access the NEA Data Bank database. It is now also offered as a full web application, requiring only a browser. The features added in the latest version of the software and this new web interface are described.

  15. A trial of e-simulation of sudden patient deterioration (FIRST2ACT WEB) on student learning.

    PubMed

    Bogossian, Fiona E; Cooper, Simon J; Cant, Robyn; Porter, Joanne; Forbes, Helen

    2015-10-01

    High-fidelity simulation pedagogy is of increasing importance in health professional education; however, face-to-face simulation programs are resource intensive and impractical to implement across large numbers of students. To investigate undergraduate nursing students' theoretical and applied learning in response to the e-simulation program FIRST2ACT WEB™, and to explore predictors of virtual clinical performance. Multi-center trial of FIRST2ACT WEB™, accessible to students at five Australian universities and colleges across 8 campuses. A population of 489 final-year nursing students in programs of study leading to licensure to practice. Participants proceeded through three phases: (i) pre-simulation briefing and assessment of clinical knowledge and experience; (ii) e-simulation, three interactive clinical scenarios that included video recordings of patients with deteriorating conditions, interactive clinical tasks, pop-up responses to tasks, and timed performance; and (iii) post-simulation feedback and evaluation. Descriptive statistics were followed by bivariate analysis to detect any associations, which were further tested using standard regression analysis. Of 409 students who commenced the program (83% response rate), 367 completed the web-based program in its entirety, yielding a completion rate of 89.7%; 38.1% of students achieved passing clinical performance across the three scenarios, and the proportion achieving passing clinical knowledge increased from 78.15% pre-simulation to 91.6% post-simulation. Knowledge was the main independent predictor of clinical performance in responding to a virtual deteriorating patient (R² = 0.090, F(7, 352) = 4.962, p < 0.001). The use of web-based technology allows simulation activities to be accessible to a large number of participants, and completion rates indicate that 'Net Generation' nursing students were highly engaged with this mode of learning. The web-based e-simulation program FIRST2ACT™ effectively enhanced knowledge, virtual clinical performance, and self-assessed knowledge, skills, confidence, and competence in final-year nursing students. Copyright © 2015 Elsevier Ltd. All rights reserved.
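    The regression reported above can be pictured as an ordinary least squares model of performance on knowledge and a handful of covariates. The sketch below is not the authors' analysis code: the data file and column names are invented, and the covariates are placeholders.

        # Hedged sketch of a multiple regression predicting virtual clinical
        # performance; file name and columns are hypothetical.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("first2act_scores.csv")   # one row per student (assumed)

        model = smf.ols(
            "performance ~ knowledge + experience_years + age + prior_simulation",
            data=df,
        ).fit()
        print(model.summary())                     # R-squared, F statistic, coefficients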

  16. ClimateWizard: A Framework and Easy-to-Use Web-Mapping Tool for Global, Regional, and Local Climate-Change Analysis

    NASA Astrophysics Data System (ADS)

    Girvetz, E. H.; Zganjar, C.; Raber, G. T.; Hoekstra, J.; Lawler, J. J.; Kareiva, P.

    2008-12-01

    Now that there is overwhelming evidence of global climate change, scientists, managers and planners (i.e. practitioners) need to assess the potential impacts of climate change on particular ecological systems, within specific geographic areas, and at spatial scales they care about, in order to make better land management, planning, and policy decisions. Unfortunately, this application of climate science to real-world decisions and planning has proceeded too slowly because we lack tools for translating cutting-edge climate science and climate-model outputs into something managers and planners can work with at local or regional scales (CCSP 2008). To help increase the accessibility of climate information, we have developed a freely available, easy-to-use, web-based climate-change analysis toolbox, called ClimateWizard, for assessing how climate has changed and is projected to change at specific geographic locations throughout the world. The ClimateWizard uses geographic information systems (GIS), web services (SOAP/XML), statistical analysis platforms (e.g. the R project), and web-based mapping services (e.g. Google Earth/Maps, KML/GML) to provide a variety of analyses (e.g. trends and departures) and outputs (e.g. maps, graphs, tables, GIS layers). Because ClimateWizard analyzes large climate datasets stored remotely on powerful computers, users of the tool do not need fast computers or expensive software, but simply need access to the internet. The analysis results are then provided to users in a Google Maps webpage tailored to the specific climate-change question being asked. The ClimateWizard is not a static product, but rather a framework to be built upon and modified to suit the purposes of specific scientific, management, and policy questions. For example, it can be expanded to include bioclimatic variables (e.g. evapotranspiration) and marine data (e.g. sea surface temperature), as well as improved future climate projections, and climate-change impact analyses involving hydrology, vegetation, wildfire, disease, and food security. By harnessing the power of computer and web-based technologies, the ClimateWizard puts local, regional, and global climate-change analyses in the hands of a wider array of managers, planners, and scientists.
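    The "trends" analysis mentioned above amounts, at a single location, to fitting a linear trend through a climate time series. A small stand-alone sketch of that calculation, with a hypothetical input file and column names, is:

        # Least-squares linear trend in annual mean temperature for one location.
        import pandas as pd
        from scipy import stats

        df = pd.read_csv("annual_mean_temp.csv")   # hypothetical columns: year, temp_c
        slope, intercept, r, p, stderr = stats.linregress(df["year"], df["temp_c"])

        print(f"Trend: {slope * 10:.2f} degC per decade (p = {p:.3f})")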

  17. Pilot Randomized Controlled Trial of Internet-delivered Cognitive-Behavioral Treatment for Pediatric Headache

    PubMed Central

    Law, Emily F.; Beals-Erickson, Sarah E.; Noel, Melanie; Claar, Robyn; Palermo, Tonya M.

    2015-01-01

    Objective: To evaluate the feasibility and preliminary effectiveness of an Internet-delivered cognitive-behavioral therapy (CBT) intervention for adolescents with chronic headache. Background: Headache is among the most common pain complaints of childhood. Cognitive-behavioral interventions are efficacious for improving pain among youth with headache. However, many youth do not receive psychological treatment for headache due to poor access, which has led to consideration of alternative delivery modalities such as the Internet. Methods: We used a parallel-arm randomized controlled trial design to evaluate the feasibility and preliminary effectiveness of an Internet-delivered family-based CBT intervention, Web-Based Management of Adolescent Pain (Web-MAP). Adolescents were eligible for the trial if they were new patients being evaluated in a specialized headache clinic, between 11 and 17 years of age, and had recurrent headache for three months or more as diagnosed by a pediatric neurologist. Eighty-three youth enrolled in the trial. An online random number generator was used to randomly assign participants to receive Internet CBT adjunctive to specialized headache treatment (n = 44) or specialized headache treatment alone (n = 39). The primary treatment outcome was headache days. Results: Youth and parents in the Internet CBT group demonstrated high levels of engagement with the web program and reported satisfaction with the intervention. Multilevel modeling was used to conduct hypothesis testing for continuous outcomes. For the primary treatment outcome of headache days, adolescents reported a statistically significant reduction in headache days from baseline to post-treatment and from baseline to three-month follow-up in both treatment conditions (main effect for time F(2, 136) = 19.70, p < .001). However, there was no statistically significant difference between the Internet CBT group and the specialized headache treatment group at post-treatment or follow-up (group × time interaction F(2, 134) = .94, p = .395). For the secondary treatment outcomes, findings from multilevel modeling (MLM) showed that adolescents in both groups demonstrated statistically significant improvement in headache pain intensity, activity limitations, depressive symptoms and parent protective behaviors from baseline to post-treatment, and these gains were maintained at three-month follow-up. Adolescent anxiety symptoms and sleep did not change during the study period for either group. There were no statistically significant group differences on any secondary outcomes at post-treatment or follow-up (p > 0.05 for all outcomes). No adverse events were reported. Conclusion: Although adjunctive Internet CBT did not lead to additional benefit in this population, future research should evaluate whether it is an effective intervention for adolescents with headache who are unable to access specialized headache treatment. PMID:26316194
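    The group-by-time analysis described above is a standard mixed-effects (multilevel) model on long-format data. The sketch below shows the shape of such a model with a random intercept per participant; the file and column names are invented, and it is not the trial's actual analysis code.

        # Hedged sketch of a multilevel model of headache days over time.
        import pandas as pd
        import statsmodels.formula.api as smf

        # long format: one row per participant per assessment
        # columns (assumed): subject_id, group ("cbt"/"control"),
        #                    time ("baseline"/"post"/"followup"), headache_days
        df = pd.read_csv("headache_trial_long.csv")

        m = smf.mixedlm("headache_days ~ group * time",
                        data=df, groups=df["subject_id"]).fit()
        print(m.summary())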

  18. Interactive Web Graphs with Fewer Restrictions

    NASA Technical Reports Server (NTRS)

    Fiedler, James

    2012-01-01

    There is growing popularity for interactive, statistical web graphs and programs to generate them. However, it seems that these programs tend to be somewhat restricted in which web browsers and statistical software are supported. For example, the software might use SVG (e.g., Protovis, gridSVG) or HTML canvas, both of which exclude most versions of Internet Explorer, or the software might be made specifically for R (gridSVG, CRanvas), thus excluding users of other stats software. There are more general tools (d3, RaphaëlJS) which are compatible with most browsers, but using one of these to make statistical graphs requires more coding than is probably desired, and requires learning a new tool. This talk will present a method for making interactive web graphs, which, by design, attempts to support as many browsers and as many statistical programs as possible, while also aiming to be relatively easy to use and relatively easy to extend.

  19. 78 FR 40820 - 60-Day Notice of Proposed Information Collection: Exchange Programs Alumni Web Site Registration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-08

    ...: Exchange Programs Alumni Web Site Registration ACTION: Notice of request for public comment. SUMMARY: The... following methods: Web: Persons with access to the Internet may use the Federal Docket Management System... Programs Alumni Web site Registration OMB Control Number: 1405-0192 Type of Request: Extension of an...

  20. Factors Affecting the Successful Use of Web Sites

    ERIC Educational Resources Information Center

    Matheus, Anne

    2009-01-01

    Every second of every minute of every day, many people search the World Wide Web for information. People are less likely to use hard copy telephone books, encyclopedias, libraries, or other traditional methods of research. They access these traditional resources through web-based portals. People use the web to find restaurants, go on vacation, to…
