Sample records for data-intensive web sites

  1. [Analysis of the web pages of the intensive care units of Spain].

    PubMed

    Navarro-Arnedo, J M

    2009-01-01

    In order to determine which Intensive Care Units (ICUs) of Spanish hospitals had a web site, to analyze the information they offered, and to establish what information they should offer according to a sample of ICU nurses, a cross-sectional, observational, descriptive study was carried out between January and September 2008. For each ICU web site, the information available on the unit and on its care, teaching, and research activity, as well as on nursing, was analyzed. Simultaneously, based on a sample of intensive care nurses, the information that should be contained on an ICU web site was determined. The results, expressed in absolute numbers and percentages, showed that 66 of the 292 hospitals with an ICU (22.6%) had a web site; 50.7% of the sites showed the number of beds, 19.7% the activity report, 11.3% the published articles/studies and ongoing research lines, and 9.9% the training courses organized. Fourteen sites (19.7%) displayed images of nurses; however, only 1 (1.4%) offered guides on the procedures followed. No web site offered a navigation section for nursing, the e-mail address of the head of nursing, the nursing documentation used, or whether any nursing model of its own was used. It is concluded that fewer than one-quarter of Spanish hospitals with an ICU have a web site; the number of beds was the item offered by the most sites, whereas information on care, teaching, and research activities was very limited, and information on nursing was practically absent from the web pages of intensive care units.

  2. Real-time estimation system for seismic-intensity exposed-population

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Nakamura, H.; Kunugi, T.; Suzuki, W.; Fujiwara, H.

    2013-12-01

    For an appropriate first response to an earthquake, risk (damage) information evaluated in real time is as important as hazard (ground motion) information. To meet this need, we are developing a real-time estimation system (J-RISQ) for exposed population and earthquake damage to buildings. We plan to open the web page of estimated exposed population to the public in autumn. When an earthquake occurs, seismic intensities are calculated at each observation station and sent to the DMC (Data Management Center) at different times. For rapid estimation, the system does not wait for data from all stations but begins the first estimation when the number of stations observing a seismic intensity of 2.5 or larger exceeds a threshold. Estimations are updated several times using all the data available at that moment. The spatial distribution of seismic intensity in 250 m meshes is estimated from the site amplification factors of surface layers and the observed data. Using this intensity distribution, the exposed population is estimated from the population data of each mesh. The exposed populations for municipalities and prefectures are estimated by summing the exposures of the meshes included in each area and are rounded appropriately, taking estimation precision into consideration. The estimated intensities for major cities are shown as histograms, which indicate the variation of the estimated values within the city together with the observed maximum intensity. The variation is mainly caused by differences in site amplification factors; the intensities estimated for meshes with large amplification factors are sometimes larger than the maximum value observed in the city. The estimated results can be viewed on the web site just after the earthquake. The results for past earthquakes can easily be searched by keywords such as date, magnitude, seismic intensity, and source area. A one-page summary report in Portable Document Format is also available. This system has been operated experimentally since 2010 and had performed real-time estimations for more than 670 earthquakes by July 2012. For about 75% of these earthquakes, it took less than one minute to send the e-mail of the first estimation after receiving data from the first triggered station; the rapidity of the system is therefore satisfactory. Uploading the PDF report to the web site takes approximately an additional 30 seconds.
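
    As a rough illustration of the mesh-based aggregation described above (a hypothetical Python sketch, not the J-RISQ code; the mesh values and the rounding rule are invented):

    ```python
    # Hypothetical sketch of the mesh-based exposure aggregation described above:
    # each 250 m mesh carries an estimated seismic intensity and a population,
    # and exposures are summed per municipality and per intensity level.
    from collections import defaultdict

    # (municipality, estimated_intensity, population) for a few hypothetical meshes
    meshes = [
        ("City A", 5.1, 320),
        ("City A", 4.6, 410),
        ("City B", 5.6, 150),
        ("City B", 4.2, 900),
    ]

    def intensity_bin(value):
        """Map a real-valued intensity estimate to a coarse reporting bin."""
        return int(round(value))

    exposure = defaultdict(lambda: defaultdict(int))
    for city, intensity, population in meshes:
        exposure[city][intensity_bin(intensity)] += population

    for city, bins in exposure.items():
        # Round to the nearest ten to reflect estimation precision, loosely
        # mirroring the rounding mentioned in the abstract (placeholder rule).
        summary = {b: int(round(p, -1)) for b, p in sorted(bins.items())}
        print(city, summary)
    ```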

  3. Creating the Web-based Intensive Care Unit Safety Reporting System

    PubMed Central

    Holzmueller, Christine G.; Pronovost, Peter J.; Dickman, Fern; Thompson, David A.; Wu, Albert W.; Lubomski, Lisa H.; Fahey, Maureen; Steinwachs, Donald M.; Engineer, Lilly; Jaffrey, Ali; Morlock, Laura L.; Dorman, Todd

    2005-01-01

    In an effort to improve patient safety, researchers at the Johns Hopkins University designed and implemented a comprehensive Web-based Intensive Care Unit Safety Reporting System (ICUSRS). The ICUSRS collects data about adverse events and near misses from all staff in the ICU. This report reflects data on 854 reports from 18 diverse ICUs across the United States. Reporting is voluntary, and data collected is confidential, with patient, provider, and reporter information deidentified. Preliminary data include system factors reported, degree of patient harm, reporting times, and evaluations of the system. Qualitative and quantitative data are reported back to the ICU site study teams and frontline staff through monthly reports, case discussions, and a quarterly newsletter. PMID:15561794

  4. Management intensity and vegetation complexity affect web-building spiders and their prey.

    PubMed

    Diehl, Eva; Mader, Viktoria L; Wolters, Volkmar; Birkhofer, Klaus

    2013-10-01

    Agricultural management and vegetation complexity affect arthropod diversity and may alter trophic interactions between predators and their prey. Web-building spiders are abundant generalist predators and important natural enemies of pests. We analyzed how management intensity (tillage, cutting of the vegetation, grazing by cattle, and synthetic and organic inputs) and vegetation complexity (plant species richness, vegetation height, coverage, and density) affect rarefied richness and composition of web-building spiders and their prey with respect to prey availability and aphid predation in 12 habitats, ranging from an uncut fallow to a conventionally managed maize field. Spiders and prey from webs were collected manually and the potential prey were quantified using sticky traps. The species richness of web-building spiders and the order richness of prey increased with plant diversity and vegetation coverage. Prey order richness was lower at tilled compared to no-till sites. Hemipterans (primarily aphids) were overrepresented, while dipterans, hymenopterans, and thysanopterans were underrepresented in webs compared to sticky traps. The per spider capture efficiency for aphids was higher at tilled than at no-till sites and decreased with vegetation complexity. After accounting for local densities, 1.8 times more aphids were captured at uncut compared to cut sites. Our results emphasize the functional role of web-building spiders in aphid predation, but suggest negative effects of cutting or harvesting. We conclude that reduced management intensity and increased vegetation complexity help to conserve local invertebrate diversity, and that web-building spiders at sites under low management intensity (e.g., semi-natural habitats) contribute to aphid suppression at the landscape scale.

  5. The Korean Neonatal Network: An Overview

    PubMed Central

    Chang, Yun Sil; Park, Hyun-Young

    2015-01-01

    Currently, in the Republic of Korea, despite a very low overall birth rate, the rate and number of preterm births are markedly increasing. Neonatal deaths and major complications mostly occur in premature infants, especially very-low-birth-weight infants (VLBWIs). VLBWIs weigh less than 1,500 g at birth and require intensive treatment in a neonatal intensive care unit (NICU). The Korean Neonatal Network (KNN) officially began operation on April 15, 2013, run by the Korean Society of Neonatology with support from the Korea Centers for Disease Control and Prevention. The KNN is a national multicenter neonatal network based on a prospective web-based registry for VLBWIs. About 2,000 VLBWIs from 60 participating hospital NICUs are registered annually in the KNN. The KNN has built unique systems such as a web-based real-time data display on its web site and a site-visit monitoring system for data quality surveillance. The KNN should be maintained and developed further in order to generate appropriate, population-based, data-driven health-care policies; facilitate active multicenter neonatal research, including quality improvement of neonatal care; and ultimately lead to improvement in the prognosis of high-risk newborns and a subsequent reduction in health-care costs through the development of evidence-based neonatal medicine in Korea. PMID:26566355

  6. The Korean Neonatal Network: An Overview.

    PubMed

    Chang, Yun Sil; Park, Hyun-Young; Park, Won Soon

    2015-10-01

    Currently, in the Republic of Korea, despite a very low overall birth rate, the rate and number of preterm births are markedly increasing. Neonatal deaths and major complications mostly occur in premature infants, especially very-low-birth-weight infants (VLBWIs). VLBWIs weigh less than 1,500 g at birth and require intensive treatment in a neonatal intensive care unit (NICU). The Korean Neonatal Network (KNN) officially began operation on April 15, 2013, run by the Korean Society of Neonatology with support from the Korea Centers for Disease Control and Prevention. The KNN is a national multicenter neonatal network based on a prospective web-based registry for VLBWIs. About 2,000 VLBWIs from 60 participating hospital NICUs are registered annually in the KNN. The KNN has built unique systems such as a web-based real-time data display on its web site and a site-visit monitoring system for data quality surveillance. The KNN should be maintained and developed further in order to generate appropriate, population-based, data-driven health-care policies; facilitate active multicenter neonatal research, including quality improvement of neonatal care; and ultimately lead to improvement in the prognosis of high-risk newborns and a subsequent reduction in health-care costs through the development of evidence-based neonatal medicine in Korea.

  7. A Password-Protected Web Site for Mothers Expressing Milk for Their Preterm Infants.

    PubMed

    Blatz, MaryAnn; Dowling, Donna; Underwood, Patricia W; Bieda, Amy; Graham, Gregory

    2017-06-01

    Research has demonstrated that breast milk significantly decreases morbidities that impact length of stay for preterm infants, but there is a need to test interventions to improve breastfeeding outcomes. Since many Americans use technologies such as the Internet and smartphones to find health information and manage their health, a Web site was developed for mothers who provide breast milk for their preterm hospitalized infants. This study examined the efficacy of a Web site for educating mothers about breast milk expression and assisting them in monitoring their breast milk supply. Quantitative and qualitative data were collected from mothers whose preterm infants were hospitalized in a level IV neonatal intensive care unit (NICU) or transitional care unit (TCU) in an urban academic medical center in the Midwest. Eighteen mothers participated in the evaluation of the Web site. Thirteen mothers consistently logged on to the password-protected Web site, a mean (standard deviation) of 13.3 (11.7) times. Most participants (69.2%) reported that they used the breast milk educational information, and most mothers indicated that using the Web site log helped in tracking their pumping. These findings can be used to direct the design and development of web-based resources for mothers of preterm infants. Implications for practice: NICU and TCU staff need to examine and establish approaches to actively involve mothers in monitoring the establishment and maintenance of an adequate supply of breast milk to improve neonatal health outcomes. An electronic health application that incorporates the features identified in this study should be developed and tested.

  8. The Modern Research Data Portal: a design pattern for networked, data-intensive science

    DOE PAGES

    Chard, Kyle; Dart, Eli; Foster, Ian; ...

    2018-01-15

    We describe best practices for providing convenient, high-speed, secure access to large data via research data portals. Here, we capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs. We introduce the design pattern; explain how it leverages high-performance data enclaves and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals.
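
    The Python APIs referred to above are those of the Globus platform; the sketch below is a hedged illustration using the globus-sdk package, not the authors' sample code (the client ID, endpoint UUIDs, and paths are placeholders):

    ```python
    # Hedged sketch of the kind of Python API usage the design pattern describes,
    # using the globus-sdk package. The client ID, endpoint UUIDs, and paths
    # below are placeholders, not values from the paper.
    import globus_sdk

    CLIENT_ID = "YOUR-NATIVE-APP-CLIENT-ID"  # placeholder

    # Authentication/authorization: obtain transfer tokens via a native-app flow.
    auth_client = globus_sdk.NativeAppAuthClient(CLIENT_ID)
    auth_client.oauth2_start_flow()
    print("Log in at:", auth_client.oauth2_get_authorize_url())
    tokens = auth_client.oauth2_exchange_code_for_tokens(input("Auth code: "))
    transfer_tokens = tokens.by_resource_server["transfer.api.globus.org"]

    # Data transfer: move a published dataset from the portal's endpoint to the user's.
    tc = globus_sdk.TransferClient(
        authorizer=globus_sdk.AccessTokenAuthorizer(transfer_tokens["access_token"])
    )
    task = globus_sdk.TransferData(tc, "PORTAL-ENDPOINT-UUID", "USER-ENDPOINT-UUID",
                                   label="portal download")
    task.add_item("/published/dataset1/", "/~/dataset1/", recursive=True)
    print("Submitted task:", tc.submit_transfer(task)["task_id"])
    ```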

  9. The Modern Research Data Portal: a design pattern for networked, data-intensive science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chard, Kyle; Dart, Eli; Foster, Ian

    We describe best practices for providing convenient, high-speed, secure access to large data via research data portals. Here, we capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs. We introduce the design pattern; explain how it leverages high-performance data enclaves and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals.

  10. The Modern Research Data Portal: A Design Pattern for Networked, Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chard, Kyle; Dart, Eli; Foster, Ian

    Here we describe best practices for providing convenient, high-speed, secure access to large data via research data portals. We capture these best practices in a new design pattern, the Modern Research Data Portal, that disaggregates the traditional monolithic web-based data portal to achieve orders-of-magnitude increases in data transfer performance, support new deployment architectures that decouple control logic from data storage, and reduce development and operations costs. We introduce the design pattern; explain how it leverages high-performance Science DMZs and cloud-based data management services; review representative examples at research laboratories and universities, including both experimental facilities and supercomputer sites; describe how to leverage Python APIs for authentication, authorization, data transfer, and data sharing; and use coding examples to demonstrate how these APIs can be used to implement a range of research data portal capabilities. Sample code at a companion web site, https://docs.globus.org/mrdp, provides application skeletons that readers can adapt to realize their own research data portals.

  11. 75 FR 42599 - Posting of Flight Delay Data on Web Sites

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ...] RIN No. 2105-AE02 Posting of Flight Delay Data on Web Sites AGENCY: Office of the Secretary (OST... performance information to a reporting air carrier's Web site from anytime between the 20th and 23rd day of... flight performance data onto their Web sites on Saturday, July 24, 2010, for June data, and all...

  12. 75 FR 34925 - Posting of Flight Delay Data on Web Sites

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-21

    ...] RIN No. 2105-AE02 Posting of Flight Delay Data on Web Sites AGENCY: Office of the Secretary (OST... carrier's Web site from anytime between the 20th and 23rd day of the month to the fourth Saturday of the... the requirement to post flight delay data on carriers' Web sites. Moreover, this change would further...

  13. The design and implementation of web mining in web sites security

    NASA Astrophysics Data System (ADS)

    Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li

    2003-06-01

    The backdoor or information leakage of Web servers can be detected by applying Web mining techniques to abnormal Web log and Web application log data, so that the security of Web servers can be enhanced and the damage of illegal access avoided. First, a system for discovering patterns of information leakage in CGI scripts from Web log data was proposed. Second, these patterns were provided to system administrators so that they could modify their code and enhance Web site security. Two aspects were described: one is to combine the web application log with the web log to extract more information, so that web data mining can discover information that a firewall and an intrusion detection system cannot find; the other is to propose an operation module for the web site to enhance Web site security. In the cluster server session, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.
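
    A hedged illustration of the density-based clustering step mentioned above, using scikit-learn's DBSCAN on hypothetical per-session log features (not the authors' system):

    ```python
    # Hedged illustration of density-based clustering of web-log sessions,
    # using scikit-learn's DBSCAN on hypothetical per-session features
    # (request count, error rate, mean inter-request gap). Not the authors' system.
    import numpy as np
    from sklearn.cluster import DBSCAN
    from sklearn.preprocessing import StandardScaler

    # One row per session: [requests, fraction of 4xx/5xx responses, mean gap (s)]
    sessions = np.array([
        [12, 0.00, 8.0],
        [15, 0.07, 6.5],
        [11, 0.00, 9.1],
        [240, 0.65, 0.2],   # burst of failing requests: a candidate anomaly
        [14, 0.00, 7.7],
    ])

    labels = DBSCAN(eps=0.8, min_samples=2).fit_predict(
        StandardScaler().fit_transform(sessions))
    print(labels)  # label -1 marks noise points, i.e. sessions worth inspecting
    ```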

  14. Publicly available hospital comparison web sites: determination of useful, valid, and appropriate information for comparing surgical quality.

    PubMed

    Leonardi, Michael J; McGory, Marcia L; Ko, Clifford Y

    2007-09-01

    To explore hospital comparison Web sites for general surgery based on: (1) a systematic Internet search, (2) Web site quality evaluation, and (3) exploration of possible areas of improvement. A systematic Internet search was performed to identify hospital quality comparison Web sites in September 2006. Publicly available Web sites were rated on accessibility, data/statistical transparency, appropriateness, and timeliness. A sample search was performed to determine ranking consistency. Six national hospital comparison Web sites were identified: 1 government (Hospital Compare [Centers for Medicare and Medicaid Services]), 2 nonprofit (Quality Check [Joint Commission on Accreditation of Healthcare Organizations] and Hospital Quality and Safety Survey Results [Leapfrog Group]), and 3 proprietary sites (names withheld). For accessibility and data transparency, the government and nonprofit Web sites were best. For appropriateness, the proprietary Web sites were best, comparing multiple surgical procedures using a combination of process, structure, and outcome measures. However, none of these sites explicitly defined terms such as complications. Two proprietary sites allowed patients to choose ranking criteria. Most data on these sites were 2 years old or older. A sample search of 3 surgical procedures at 4 hospitals demonstrated significant inconsistencies. Patients undergoing surgery are increasingly using the Internet to compare hospital quality. However, a review of available hospital comparison Web sites shows suboptimal measures of quality and inconsistent results. This may be partially because of a lack of complete and timely data. Surgeons should be involved with quality comparison Web sites to ensure appropriate methods and criteria.

  15. Streamlining Data for Cross-Platform Web Delivery

    ERIC Educational Resources Information Center

    Watkins, Sean; Battles, Jason; Vacek, Rachel

    2013-01-01

    Smartphone users expect the presentation of Web sites on their mobile browsers to look and feel like native applications. With the pressure on library Web developers to produce app-like mobile sites, there is often a rush to get a site up without considering the importance of reusing or even restructuring the data driving the Web sites. An…

  16. Produce and Consume Linked Data with Drupal!

    NASA Astrophysics Data System (ADS)

    Corlosquet, Stéphane; Delbru, Renaud; Clark, Tim; Polleres, Axel; Decker, Stefan

    Currently a large number of Web sites are driven by Content Management Systems (CMS) which manage textual and multimedia content but also - inherently - carry valuable information about a site's structure and content model. Exposing this structured information to the Web of Data has so far required considerable expertise in RDF and OWL modelling and additional programming effort. In this paper we tackle one of the most popular CMS: Drupal. We enable site administrators to export their site content model and data to the Web of Data without requiring extensive knowledge on Semantic Web technologies. Our modules create RDFa annotations and - optionally - a SPARQL endpoint for any Drupal site out of the box. Likewise, we add the means to map the site data to existing ontologies on the Web with a search interface to find commonly used ontology terms. We also allow a Drupal site administrator to include existing RDF data from remote SPARQL endpoints on the Web in the site. When brought together, these features allow networked RDF Drupal sites that reuse and enrich Linked Data. We finally discuss the adoption of our modules and report on a use case in the biomedical field and the current status of its deployment.
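
    A minimal, hedged sketch of consuming the kind of SPARQL endpoint these modules expose, using the SPARQLWrapper Python package (the endpoint URL and the queried property are illustrative assumptions, not the paper's site):

    ```python
    # Hedged sketch of querying a Drupal-style SPARQL endpoint with SPARQLWrapper.
    # The endpoint URL and the Dublin Core property choice are placeholders.
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://example.org/sparql")  # placeholder endpoint
    sparql.setQuery("""
        PREFIX dc: <http://purl.org/dc/terms/>
        SELECT ?post ?title WHERE {
            ?post dc:title ?title .
        } LIMIT 10
    """)
    sparql.setReturnFormat(JSON)
    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["post"]["value"], "-", row["title"]["value"])
    ```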

  17. Space Weather Studies Using Ground-based Experimental Complex in Kazakhstan

    NASA Astrophysics Data System (ADS)

    Kryakunova, O.; Yakovets, A.; Monstein, C.; Nikolayevskiy, N.; Zhumabayev, B.; Gordienko, G.; Andreyev, A.; Malimbayev, A.; Levin, Yu.; Salikhov, N.; Sokolova, O.; Tsepakina, I.

    2015-12-01

    The Kazakhstan ground-based experimental complex for space weather studies is situated near Almaty. Results of space environment monitoring are accessible in real time via the Internet on the web site of the Institute of Ionosphere (http://www.ionos.kz/?q=en/node/21). The complex maintains a database with hourly data of cosmic ray intensity, geomagnetic field intensity, and solar radio flux at 10.7 cm and 27.8 cm wavelengths. Several studies using those data are reported: an estimation of the speed of a coronal mass ejection, a study of large-scale traveling disturbances, an analysis of geomagnetically induced currents using the geomagnetic field data, and a solar energetic proton event on 27 January 2012.

  18. A Novel Database to Rank and Display Archeomagnetic Intensity Data

    NASA Astrophysics Data System (ADS)

    Donadini, F.; Korhonen, K.; Riisager, P.; Pesonen, L. J.; Kahma, K.

    2005-12-01

    To understand the content and the causes of the changes in the Earth's magnetic field beyond the observatory records, one has to rely on archeomagnetic and lake sediment paleomagnetic data. The regional archeointensity curves are often of different quality and temporally variable, which hampers the global analysis of the data in terms of dipole vs. non-dipole field. We have developed a novel archeointensity database application utilizing MySQL, PHP (PHP Hypertext Preprocessor), and the Generic Mapping Tools (GMT) for ranking and displaying geomagnetic intensity data from the last 12,000 years. Our application has the advantage that no specific software is required to query the database and view the results. Querying the database is performed using any Web browser; a fill-out form is used to enter the site location and a minimum ranking value to select the data points to be displayed. The form also features the possibility to select plotting of the data as an archeointensity curve with error bars, and a Virtual Axial Dipole Moment (VADM) or ancient field value (Ba) curve calculated using the CALS7K model (Continuous Archaeomagnetic and Lake Sediment geomagnetic model) of Korte and Constable (2005). The results of a query are displayed on a Web page containing a table summarizing the query parameters, a table showing the archeointensity values satisfying the query parameters, and a plot of VADM or Ba as a function of sample age. The database consists of eight related tables. The main one, INTENSITIES, stores the 3704 archeointensity measurements collected from 159 publications as VADM (and VDM when available) and Ba values, including their standard deviations and sampling locations. It also contains the number of samples and specimens measured from each site. The REFS table stores the references to a particular study. The names, latitudes, and longitudes of the regions where the samples were collected are stored in the SITES table. The MATERIALS, METHODS, SPECIMEN_TYPES and DATING_METHODS tables store information about the sample materials, intensity determination methods, specimen types and age determination methods. The SIGMA_COUNT table is used indirectly for ranking data according to the number of samples measured and their standard deviations. Each intensity measurement is assigned a score (0-2) depending on the number of specimens measured and their standard deviations, the intensity determination method, the type of specimens measured, and the materials. The ranking of each data point is calculated as the sum of the four scores and varies between 0 and 8. Additionally, users can select the parameters that will be included in the ranking.
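
    A hedged Python sketch of the 0-8 ranking scheme described above: four criteria, each scored 0-2 and summed per measurement. The scoring thresholds and tables below are illustrative placeholders, not the database authors' actual criteria:

    ```python
    # Hedged sketch of the 0-8 ranking described above: four criteria, each scored
    # 0-2, summed per measurement. The thresholds and tables are placeholders.
    def score_sampling(n_specimens, std_dev_pct):
        if n_specimens >= 5 and std_dev_pct <= 10:
            return 2
        if n_specimens >= 3 and std_dev_pct <= 20:
            return 1
        return 0

    METHOD_SCORES = {"thellier": 2, "shaw": 1}           # placeholder values
    SPECIMEN_SCORES = {"single_fragment": 2, "bulk": 1}  # placeholder values
    MATERIAL_SCORES = {"pottery": 2, "burnt_clay": 1}    # placeholder values

    def rank(measurement):
        return (score_sampling(measurement["n_specimens"], measurement["std_dev_pct"])
                + METHOD_SCORES.get(measurement["method"], 0)
                + SPECIMEN_SCORES.get(measurement["specimen_type"], 0)
                + MATERIAL_SCORES.get(measurement["material"], 0))

    example = {"n_specimens": 6, "std_dev_pct": 8, "method": "thellier",
               "specimen_type": "single_fragment", "material": "pottery"}
    print(rank(example))  # ranks range from 0 (poorest) to 8 (best constrained)
    ```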

  19. The Role of Virtual Reference in Library Web Site Design: A Qualitative Source for Usage Data

    ERIC Educational Resources Information Center

    Powers, Amanda Clay; Shedd, Julie; Hill, Clay

    2011-01-01

    Gathering qualitative information about usage behavior of library Web sites is a time-consuming process requiring the active participation of patron communities. Libraries that collect virtual reference transcripts, however, hold valuable data regarding how the library Web site is used that could benefit Web designers. An analysis of virtual…

  20. Efficacy of a Pilot Internet-Based Weight Management Program (H.E.A.L.T.H.) and Longitudinal Physical Fitness Data in Army Reserve Soldiers

    PubMed Central

    Newton, Robert L; Han, Hongmei; Stewart, Tiffany M; Ryan, Donna H; Williamson, Donald A

    2011-01-01

    Background: The primary aims of this article are to describe the utilization of an Internet-based weight management Web site [Healthy Eating, Activity, and Lifestyle Training Headquarters (H.E.A.L.T.H.)] over a 12–27 month period and to describe concurrent weight and fitness changes in Army Reserve soldiers. Methods: The H.E.A.L.T.H. Web site was marketed to Army Reserve soldiers via a Web site promotion program for 27 months (phase I) and its continued usage was observed over a subsequent 12-month period (phase II). Web site usage was obtained from the H.E.A.L.T.H. Web site. Weight and fitness data were extracted from the Regional Level Application Software (RLAS). Results: A total of 1499 Army Reserve soldiers registered on the H.E.A.L.T.H. Web site. There were 118 soldiers who returned to the H.E.A.L.T.H. Web site more than once. The registration rate declined significantly following the removal of the Web site promotion program. During phase I, 778 Army Reserve soldiers had longitudinal weight and fitness data in RLAS. Men exceeding the screening table weight gained less weight compared with men below it (p < .007). Percentage change in body weight was inversely associated with change in fitness scores. Conclusions: The Web site promotion program resulted in 52% of available Army Reserve soldiers registering on the H.E.A.L.T.H. Web site, and 7.9% used the Web site more than once. The H.E.A.L.T.H. Web site may be a viable population-based weight and fitness management tool for soldier use. PMID:22027327

  21. Using Web Server Logs in Evaluating Instructional Web Sites.

    ERIC Educational Resources Information Center

    Ingram, Albert L.

    2000-01-01

    Web server logs contain a great deal of information about who uses a Web site and how they use it. This article discusses the analysis of Web logs for instructional Web sites; reviews the data stored in most Web server logs; demonstrates what further information can be gleaned from the logs; and discusses analyzing that information for the…

  22. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, E.T.; Truett, D.F.; Truett, L.F.

    In the handful of years since the World Wide Web (WWW or Web) came into being, Web sites have developed at an astonishing rate. With the influx of Web pages comes a disparity of site types, including personal homepages, commercial sales sites, and educational data. The variety of sites and the deluge of information contained on the Web exemplify the individual nature of the WWW. Whereas some people argue that it is this eclecticism which gives the Web its charm, we propose that sites which are repositories of technical data would benefit from standardization. This paper proffers a methodology for publishing ecological research on the Web. The template we describe uses capabilities of HTML (the HyperText Markup Language) to enhance the value of the traditional scientific paper.

  23. A Multidisciplinary First-Year Seminar about Tuberculosis.

    ERIC Educational Resources Information Center

    Fluck, Richard A.

    2001-01-01

    Describes a writing-intensive seminar for college freshmen. Includes goals, reading assignments, writing assignments, and group projects. Provides web-based resources on tuberculosis along with an evaluation sheet for web site reviews. Concludes that students exhibited great interest in the topic and course feedback was positive. (DLH)

  24. 10 CFR 905.23 - What are the opportunities for using the Freedom of Information Act to request plan and report data?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...'s publicly available Web site or on Western's Web site. Customers posting their IRPs on their own Web site must notify Western of this decision when they submit their IRP. A hotlink on Western's Web site to IRPs posted on customer Web sites gives interested parties ready access to those IRPs. Western...

  25. 10 CFR 905.23 - What are the opportunities for using the Freedom of Information Act to request plan and report data?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...'s publicly available Web site or on Western's Web site. Customers posting their IRPs on their own Web site must notify Western of this decision when they submit their IRP. A hotlink on Western's Web site to IRPs posted on customer Web sites gives interested parties ready access to those IRPs. Western...

  26. 10 CFR 905.23 - What are the opportunities for using the Freedom of Information Act to request plan and report data?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...'s publicly available Web site or on Western's Web site. Customers posting their IRPs on their own Web site must notify Western of this decision when they submit their IRP. A hotlink on Western's Web site to IRPs posted on customer Web sites gives interested parties ready access to those IRPs. Western...

  27. 10 CFR 905.23 - What are the opportunities for using the Freedom of Information Act to request plan and report data?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...'s publicly available Web site or on Western's Web site. Customers posting their IRPs on their own Web site must notify Western of this decision when they submit their IRP. A hotlink on Western's Web site to IRPs posted on customer Web sites gives interested parties ready access to those IRPs. Western...

  28. 10 CFR 905.23 - What are the opportunities for using the Freedom of Information Act to request plan and report data?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...'s publicly available Web site or on Western's Web site. Customers posting their IRPs on their own Web site must notify Western of this decision when they submit their IRP. A hotlink on Western's Web site to IRPs posted on customer Web sites gives interested parties ready access to those IRPs. Western...

  29. Use of a secure Internet Web site for collaborative medical research.

    PubMed

    Marshall, W W; Haley, R W

    2000-10-11

    Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.
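
    A minimal, hedged sketch of two of the measures listed above (user-name/password authentication and an audit trail of accesses) using Flask; this is an illustration only, not the authors' system, and a real deployment would also need TLS, hashed passwords, a firewall, and the other measures described:

    ```python
    # Hedged, minimal sketch of password authentication plus an audit trail of
    # accesses, two of the measures listed in the record above. Illustration only.
    import sqlite3
    from datetime import datetime, timezone
    from flask import Flask, request, Response

    app = Flask(__name__)
    USERS = {"collaborator": "change-me"}  # placeholder credentials

    def audit(user, path, allowed):
        # Record every access attempt, allowed or not, with a UTC timestamp.
        with sqlite3.connect("audit.db") as db:
            db.execute("CREATE TABLE IF NOT EXISTS log "
                       "(ts TEXT, user TEXT, path TEXT, allowed INTEGER)")
            db.execute("INSERT INTO log VALUES (?, ?, ?, ?)",
                       (datetime.now(timezone.utc).isoformat(), user, path, int(allowed)))

    @app.route("/data", methods=["GET", "POST"])
    def data():
        auth = request.authorization
        ok = auth is not None and USERS.get(auth.username) == auth.password
        audit(auth.username if auth else "anonymous", request.path, ok)
        if not ok:
            return Response("Login required", 401,
                            {"WWW-Authenticate": 'Basic realm="study"'})
        return "Record accepted" if request.method == "POST" else "Data entry form"

    if __name__ == "__main__":
        app.run()
    ```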

  30. Real-time Shakemap implementation in Austria

    NASA Astrophysics Data System (ADS)

    Weginger, Stefan; Jia, Yan; Papi Isaba, Maria; Horn, Nikolaus

    2017-04-01

    ShakeMaps provide near-real-time maps of ground motion and shaking intensity following significant earthquakes; they are generated automatically within a few minutes of an earthquake's occurrence. We tested and integrated the Python-based USGS ShakeMap 4.0 (experimental code) into the Antelope real-time system, with locally modified GMPEs and site effects based on conditions in Austria. The ShakeMaps are provided in terms of intensity, PGA, PGV, and PSA. Future work includes the presentation of ShakeMap contour lines and ground motion parameters with interactive maps, and data exchange over web services.
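
    A hedged illustration of the kind of ground motion prediction equation (GMPE) evaluation such a system performs; the functional form, coefficients, and amplification factor below are toy placeholders, not the locally modified GMPE or site model used for Austria:

    ```python
    # Hedged illustration of evaluating a simple GMPE of the generic form
    # ln(PGA) = a + b*M - c*ln(R + d), with a site amplification factor applied.
    # The coefficients and amplification value are placeholders only.
    import math

    def pga_g(magnitude, hypocentral_distance_km, site_amplification=1.0,
              a=-3.5, b=0.9, c=1.2, d=10.0):
        """Median peak ground acceleration (in g) from a toy attenuation relation."""
        ln_pga = a + b * magnitude - c * math.log(hypocentral_distance_km + d)
        return site_amplification * math.exp(ln_pga)

    for r in (5, 20, 50, 100):
        print(f"M5.5 at {r:3d} km: PGA ~ {pga_g(5.5, r, site_amplification=1.4):.3f} g")
    ```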

  31. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research

    PubMed Central

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C.

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource interoperability in a transparent manner for the end user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as autism, Parkinson's and Alzheimer's diseases, and multiple sclerosis, as well as on normal brain structure and development. This technical report presents the CBRAIN platform, its current deployment and usage, and future directions. PMID:24904400

  32. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research.

    PubMed

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource interoperability in a transparent manner for the end user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as autism, Parkinson's and Alzheimer's diseases, and multiple sclerosis, as well as on normal brain structure and development. This technical report presents the CBRAIN platform, its current deployment and usage, and future directions.

  33. Web queries as a source for syndromic surveillance.

    PubMed

    Hulth, Anette; Rydevik, Gustaf; Linde, Annika

    2009-01-01

    In the field of syndromic surveillance, various sources are exploited for outbreak detection, monitoring and prediction. This paper describes a study on queries submitted to a medical web site, with influenza as a case study. The hypothesis of the work was that queries on influenza and influenza-like illness would provide a basis for estimating the timing of the peak and the intensity of the yearly influenza outbreaks that would be as good as the existing laboratory and sentinel surveillance. We calculated the occurrence of various queries related to influenza from search logs submitted to a Swedish medical web site for two influenza seasons. These figures were subsequently used to generate two models, one to estimate the number of laboratory-verified influenza cases and one to estimate the proportion of patients with influenza-like illness reported by selected general practitioners in Sweden. We applied an approach designed for highly correlated data, partial least squares regression. We found that certain web queries on influenza follow the same pattern as that obtained by the two other surveillance systems for influenza epidemics, and that they have equal power for estimating the influenza burden in society. Web queries give unique access to ill individuals who are not (yet) seeking care. This paper shows the potential of web queries as an accurate, cheap, and labour-saving source for syndromic surveillance.
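
    A hedged sketch of the partial least squares regression step described above, using scikit-learn on synthetic query-count data (not the study's data):

    ```python
    # Hedged sketch of partial least squares regression on synthetic data: weekly
    # counts of several correlated influenza-related queries (X) versus
    # laboratory-verified cases (y). Not the study's data or model coefficients.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    weeks = 60
    epidemic = 200 * np.exp(-((np.arange(weeks) - 30) ** 2) / 50.0)  # synthetic outbreak
    X = np.column_stack([epidemic * s + rng.normal(0, 10, weeks)     # correlated queries
                         for s in (0.8, 1.0, 1.3, 0.5)])
    y = epidemic + rng.normal(0, 15, weeks)

    pls = PLSRegression(n_components=2).fit(X[:40], y[:40])          # train on early weeks
    print("Predicted cases, weeks 41-45:", pls.predict(X[40:45]).ravel().round(1))
    ```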

  34. Cleanups In My Community (CIMC) - Federal facilities that are also Superfund sites, National Layer

    EPA Pesticide Factsheets

    Federal facilities are properties owned by the federal government. This data layer provides access to Federal facilities that are Superfund sites as part of the CIMC web service. Data are collected using the Superfund Enterprise Management System (SEMS) and transferred to Envirofacts for access by the public. Data about Federal facility Superfund sites are located on their own EPA web pages, and CIMC links to those pages. Links to the relevant web pages for each site are provided within the attribute table. Federal facility sites can be either Superfund sites or RCRA Corrective Action sites, or they may have moved from one program to the other and back. In Cleanups in My Community, you can map or list any of these Federal Facility sites. This data layer shows only those facilities that are Superfund sites. RCRA federal facility sites and other Superfund NPL sites are included in other data layers as part of this web service. Superfund is a program administered by the EPA to locate, investigate, and clean up the worst hazardous waste sites throughout the United States. EPA administers the Superfund program in cooperation with individual states and tribal governments. These sites include abandoned warehouses, manufacturing facilities, processing plants, and landfills - the key word here being abandoned. The CIMC web service was initially published in 2013, but the data are updated on the 18th of each month. The full schedule for data updates in CIMC is located here:

  35. Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    William J. Schroeder

    2011-11-13

    This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling, at Kitware Inc. in collaboration with the Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data as illustrated in the figure below. The solutions we proposed address the typical problems faced by geographically- and organizationally-separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data, and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies including WebGL, JavaScript, Java and Flash to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory. SLAC has a computationally-intensive problem important to the nation's scientific progress, as described shortly. Further, SLAC researchers routinely generate massive amounts of data, and frequently collaborate with other researchers located around the world. Thus SLAC is an ideal teammate through which to develop, test and deploy this technology. The nature of the datasets generated by simulations performed at SLAC presented unique visualization challenges, especially when dealing with higher-order elements, that were addressed during this Phase II. During this Phase II, we have developed a strong platform for collaborative visualization based on ParaView. We have developed and deployed a ParaView Web Visualization framework that can be used for effective collaboration over the Web. Collaborating and visualizing over the Web presents the community with unique opportunities for sharing and accessing visualization and HPC resources that hitherto were either inaccessible or difficult to use. The technology we developed here will alleviate both these issues as it becomes widely deployed and adopted.

  36. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Gaylord, A. G.; Tweedie, C. E.

    2013-12-01

    In 2013, the Barrow Area Information Database (BAID, www.baid.utep.edu) project resumed field operations in Barrow, AK. The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken in the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience for BAID is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 11,000 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, and save or print maps and query results. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL), where non-proprietary BAID data can be freely downloaded. Highlights for the 2013 season include the addition of more than 2000 additional research sites, providing differential global positioning system (dGPS) support to visiting scientists, surveying over 80 miles of coastline to document rates of erosion, training of local GIS personnel, deployment of a wireless sensor network, and substantial upgrades to the BAID website and web mapping applications.

  37. Climate Web sites

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    With the growing interest in extreme climate and weather events, the National Oceanic and Atmospheric Administration (NOAA) has set up a one-stop Web site. It includes data on tornadoes, hurricanes, and heavy rainfall, temperature extremes, global climate change, satellite images, and El Niño and La Niña. The Web address is http://www.ncdc.noaa.gov. Another good climate Web site is the La Niña Home Page. Set up by the Environmental and Societal Impacts Group of the National Center for Atmospheric Research, the site includes forecasts, data sources, impacts, and Internet links.

  38. Talking Trash on the Internet: Working Real Data into Your Classroom.

    ERIC Educational Resources Information Center

    Lynch, Maurice P.; Walton, Susan A.

    1998-01-01

    Describes how a middle school teacher used the Chesapeake Bay National Estuarine Research Reserve in Virginia (CBNERRVA) Web site to provide scientific data for a unit on recycling. Includes sample data sheets and tables, charts results of a Web search for marine debris using different search engines, and lists selected marine data Web sites. (PEN)

  39. Exploring the Pattern of Links between Chinese University Web Sites.

    ERIC Educational Resources Information Center

    Tang, Rong; Thelwall, Mike

    2002-01-01

    Compares links between 76 Chinese university Web sites with ranks obtained from the NetBig lists, using a specialized Web crawler to collect data. Provides a background to the higher education system in mainland China, describes the NetBig ranking scheme, and explains Web site crawling problems encountered. (Author/LRW)

  40. At Their Service

    ERIC Educational Resources Information Center

    Villano, Matt

    2006-01-01

    For years, doing laundry at Columbia University (New York) was just as labor-intensive as it is at most universities. Fortunately, as of last spring, laundry life at Columbia has changed dramatically. Today, with the help of a real-time Web-based service called LaundryView, students can log on to the system via the LaundryView Web site from a link…

  41. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Gaylord, A.; Brown, J.; Tweedie, C. E.

    2012-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic. The Barrow Area Information Database (BAID, www.baidims.org) is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken in the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience for BAID is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 9,600 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, and save or print maps and query results. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL), where non-proprietary BAID data can be freely downloaded. BAID has been used to: Optimize research site choice; Reduce duplication of science effort; Discover complementary and potentially detrimental research activities in an area of scientific interest; Re-establish historical research sites for resampling efforts assessing change in ecosystem structure and function over time; Exchange knowledge across disciplines and generations; Facilitate communication between western science and traditional ecological knowledge; Provide local residents access to science data that facilitates adaptation to arctic change; and Educate the next generation of environmental and computer scientists. This poster describes key activities that will be undertaken over the next three years to provide BAID users with novel software tools to interact with a current and diverse selection of information and data about the Barrow area. Key activities include: 1. Collecting data on research activities, generating geospatial data, and providing mapping support. 2. Maintaining, updating and innovating the existing suite of BAID geobrowsers. 3. Maintaining and updating aging server hardware supporting BAID. 4. Adding interoperability with other CI using workflows, controlled vocabularies and web services. 5. Linking BAID to data archives at the National Snow and Ice Data Center (NSIDC). 6. Developing a wireless sensor network that provides web-based interaction with near-real-time climate and other data. 7. Training the next generation of environmental and computer scientists and conducting outreach.

  42. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications

    PubMed Central

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-01-01

    Background: Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. Results: The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. Conclusion: The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools. PMID:18021453

  43. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications.

    PubMed

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-11-19

    Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools.

  4. Driving Ms. Data: Creating Data-Driven Possibilities

    ERIC Educational Resources Information Center

    Hoffman, Richard

    2005-01-01

    This article describes how data-driven Web sites help schools and districts maximize their IT resources by making online content more "self-service" for users. It shows how to set up the capacity to create data-driven sites. By definition, a data-driven Web site is one in which the content comes from some back-end data source, such as a…

  5. Cleanups In My Community (CIMC) - Base Realignment and Closure (BRAC) Superfund Sites, National Layer

    EPA Pesticide Factsheets

    This data layer provides access to Base Realignment and Closure (BRAC) Superfund Sites as part of the CIMC web service. EPA works with DoD to facilitate the reuse and redevelopment of BRAC federal properties. When the BRAC program began in the early 1990s, EPA worked with DoD and the states to identify uncontaminated areas, and these parcels were immediately made available for reuse. Since then EPA has worked with DoD to clean up the contaminated portions of bases. These are usually parcels that were training ranges, landfills, maintenance facilities, and other past waste-disposal areas. Superfund is a program administered by the EPA to locate, investigate, and clean up the worst hazardous waste sites throughout the United States. EPA administers the Superfund program in cooperation with individual states and tribal governments. These sites include abandoned warehouses, manufacturing facilities, processing plants, and landfills - the key word here being abandoned. This data layer shows Superfund Sites that are located at BRAC Federal Facilities. Additional Superfund sites and other BRAC sites (those that are not Superfund sites) are included in other data layers as part of this web service. BRAC Superfund Sites shown in this web service are derived from the epa.gov website and include links to the relevant web pages within the attribute table. Data about BRAC Superfund Sites are located on their own EPA web pages, and CIMC links to those pages. The CIMC web service

  6. Towards the Interoperability of Web, Database, and Mass Storage Technologies for Petabyte Archives

    NASA Technical Reports Server (NTRS)

    Moore, Reagan; Marciano, Richard; Wan, Michael; Sherwin, Tom; Frost, Richard

    1996-01-01

    At the San Diego Supercomputer Center, a massive data analysis system (MDAS) is being developed to support data-intensive applications that manipulate terabyte-sized data sets. The objective is to support scientific application access to data whether it is located at a Web site, stored as an object in a database, and/or stored in an archival storage system. We are developing a suite of demonstration programs which illustrate how Web, database (DBMS), and archival storage (mass storage) technologies can be integrated. An application presentation interface is being designed that integrates data access to all of these sources. We have developed a data movement interface between the Illustra object-relational database and the NSL UniTree archival storage system running in a production mode at the San Diego Supercomputer Center. With this interface, an Illustra client can transparently access data on UniTree under the control of the Illustra DBMS server. The current implementation is based on the creation of a new DBMS storage manager class, and a set of library functions that allow the manipulation and migration of data stored as Illustra 'large objects'. We have extended this interface to allow a Web client application to control data movement between its local disk, the Web server, the DBMS Illustra server, and the UniTree mass storage environment. This paper describes some of the current approaches to successfully integrating these technologies. This framework is measured against a representative sample of environmental data extracted from the San Diego Bay Environmental Data Repository. Practical lessons are drawn and critical research areas are highlighted.

  7. DMSP SSJ4 Data Restoration, Classification, and On-Line Data Access

    NASA Technical Reports Server (NTRS)

    Wing, Simon; Bredekamp, Joseph H. (Technical Monitor)

    2000-01-01

    Compress and clean raw data files for permanent storage: we have identified various error conditions/types and developed algorithms to remove these errors and noise, including the more complicated noise in the newer data sets (status = 100% complete). Internet access to the compacted raw data: it is now possible to access the raw data via our web site, http://www.jhuapl.edu/Aurora/index.html. The software to read and plot the compacted raw data is also available from the same web site. Users can now download the raw data and read, plot, or manipulate the data as they wish on their own computers. Users are also able to access the cleaned data sets. Internet access to the color spectrograms: this task has also been completed, and the spectrograms can now be accessed from the web site mentioned above. Improved particle precipitation region classification: the algorithm for this task has been developed and implemented, and the accuracy has improved as a result. The web site now routinely distributes the results of applying the new algorithm to the cleaned data set. Marking of the classification regions on the spectrograms: the software to mark the classification regions on the spectrograms has been completed and is also available from our web site.

  8. The effects of Web site structure: the role of personal difference.

    PubMed

    Chung, Hwiman; Ahn, Euijin

    2007-12-01

    This study examined the effects of Web site structures in terms of advertising effectiveness: memory, attitude, and behavioral intentions. The primary research question for this study is: What type of Web site (Web ad) structure is most effective? In the pilot study, we tested the difference between two Web site structures, linear and interactive, in terms of traditional advertising effectiveness. Results from the pilot study did not support our research expectations. However, differences in terms of memory were noted between the two structures. After re-creating the Web site based on subjects' comments, in the final experiment we examined the differences between the two structures and the moderating role of personality difference on the effects of Web site structure. The results confirm that participants' attitude, memory, and behavioral intentions were affected differently by the different Web site structures. However, some research hypotheses were not supported by the current data.

  9. Use of Social Media by Fathers of Premature Infants.

    PubMed

    Kim, Hyung Nam; Wyatt, Tami H; Li, Xueping; Gaylord, Mark

    Although parents of premature infants experience many challenges when transitioning home from the neonatal intensive care unit, healthcare providers and social support systems tend to focus on mothers and infants rather than fathers. Unfortunately, very little is known about paternal concerns and needs as compared with maternal ones. The lack of understanding about paternal needs may lead to inadequate designs of neonatal intensive care unit family support programs with less involved fathers, all of which contribute to increased burdens on mothers and poor health outcomes for their infants. Although information technology (IT) might have the potential to increase support for the fathers of preterm infants, only a few studies have examined systematically how IT applications can be beneficial. This study aims to advance the understanding of the needs and concerns of fathers with preterm infants and how fathers use IT applications (e.g., social networking Web sites) to support themselves. We qualitatively observed various social networking Web sites (i.e., 29 Web sites) where fathers share their experiences about preterm infants. We discovered that fathers used various social media to discuss their concerns and, in turn, obtained informational, companionship, and emotional support. On the basis of our analysis, we provide insights into a father-centered technology intervention design.

  10. Communication and collaboration technologies.

    PubMed

    Cheeseman, Susan E

    2012-01-01

    This is the third in a series of columns exploring health information technology (HIT) in the neonatal intensive care unit (NICU). The first column provided background information on the implementation of information technology throughout the health care delivery system, as well as the requisite informatics competencies needed for nurses to fully engage in the digital era of health care. The second column focused on information and resources to master basic computer competencies described by the TIGER initiative (Technology Informatics Guiding Education Reform) as learning about computers, computer networks, and the transfer of data.1 This column will provide additional information related to basic computer competencies, focusing on communication and collaboration technologies. Computers and the Internet have transformed the way we communicate and collaborate. Electronic communication is the ability to exchange information through the use of computer equipment and software.2 Broadly defined, any technology that facilitates linking one or more individuals together is a collaborative tool. Collaboration using technology encompasses an extensive range of applications that enable groups of individuals to work together, including e-mail, instant messaging (IM), and several web applications collectively referred to as Web 2.0 technologies. The term Web 2.0 refers to web applications where users interact and collaborate with each other in a collective exchange of ideas, generating content in a virtual community. Examples of Web 2.0 technologies include social networking sites, blogs, wikis, video sharing sites, and mashups. Many organizations are developing collaborative strategies and tools for employees to connect and interact using web-based social media technologies.3

  11. Characteristics That Predict Physician Participation in a Web-Based CME Activity: The MI-Plus Study

    PubMed Central

    Schoen, Michael J.; Tipton, Edmond F.; Houston, Thomas K.; Funkhouser, Ellen; Levine, Deborah A.; Estrada, Carlos A.; Allison, Jeroan J.; Williams, O. Dale; Kiefe, Catarina I.

    2011-01-01

    Introduction: Physician use of the Internet for practice improvement has increased dramatically over the last decade, but research shows that many physicians choose not to participate. The current study investigated the association of specific physician characteristics with enrollment rates and intensity of participation in a specific Internet-delivered educational intervention to improve care to post–myocardial infarction (MI) patients. Methods: Primary-care physicians were recruited for participation in a randomized controlled trial designed to compare effectiveness of an intervention Web site versus a control Web site in the management of adult chronic disease. Physicians were informed that the intervention focused on ambulatory post–myocardial infarction patients. Physician characteristics were obtained from a commercial vendor with data merged from the American Medical Association and Alabama State Licensing Board. Enrollment and Web use were tracked electronically. Results: Out of a sample of 1337 eligible physicians, 177 (13.2%) enrolled in the study. Enrollment was higher for physicians with more post-MI patients (≥20 vs < 20 patients, 15.3% vs 9.3%, P = .002) and for those practicing in rural compared to urban areas (16.3% vs 12.1%, P = .046). Intensity of use of the Internet courses after initial enrollment was not predicted by physician characteristics in the current sample. Discussion: Having more post-MI patients and practicing in a rural location predicted enrollment in an Internet-delivered continuing medical education (CME) intervention designed to improve care for post-MI patients. These factors predicted program interest but not program use. More research is needed to replicate these findings and to investigate variables that determine physician engagement in Internet CME. PMID:19998447

  12. Surface-water data and statistics from U.S. Geological Survey data-collection networks in New Jersey on the World Wide Web

    USGS Publications Warehouse

    Reiser, Robert G.; Watson, Kara M.; Chang, Ming; Nieswand, Steven P.

    2002-01-01

    The U.S. Geological Survey (USGS), in cooperation with other Federal, State, and local agencies, operates and maintains a variety of surface-water data-collection networks throughout the State of New Jersey. The networks include streamflow-gaging stations, low-flow sites, crest-stage gages, tide gages, tidal crest-stage gages, and water-quality sampling sites. Both real-time and historical surface-water data for many of the sites in these networks are available at the USGS, New Jersey District, web site (http://nj.usgs.gov/), and water-quality data are available at the USGS National Water Information System (NWIS) web site (http://waterdata.usgs.gov/nwis/). These data are an important source of information for water managers, engineers, environmentalists, and private citizens.

  13. Vaccines and Internet: characteristics of the vaccine safety net web sites and suggested improvements.

    PubMed

    Martínez-Mora, Marta; Alvarez-Pasquín, María José; Rodríguez-Salvanés, Francisco

    2008-12-09

    The Internet contains a large amount of useful information on many subjects, but also information of doubtful quality. To help identify Web sites on vaccine safety that follow good practice, the Global Advisory Committee on Vaccine Safety of the World Health Organization (WHO) has published criteria to which sites should adhere and a listing of Web sites that fulfil them. There are no studies describing the common attributes of these sites. The aim was to examine the attributes, design characteristics, and resources of Web sites belonging to the Vaccine Safety Net (VSN) of the WHO. A cross-sectional, descriptive observational study using an evaluation questionnaire was carried out on the VSN web sites listed in March-April 2007. Twenty-six Web sites accredited by the VSN by April 2007 were analysed. With respect to design and quality, all sites contained information about the site manager. Postal and e-mail addresses were available for 84.6% of the sites. Regarding privacy and personal data processing, 73.1% of sites specified the data protection procedure used and stated that data were not sold or passed to third parties. The most-used language was English (76.9%). 96.3% of sites had links to other pro-vaccination sites and 19.2% provided the addresses of vaccination centres. 63.6% of sites were aimed at the general public and health care workers, but there was no clear separation of documents or different entry routes. With respect to information on vaccine safety, 84.6% of sites had information on adverse effects. In the general information section, 92.3% of sites had a new developments section. Some sites had multiple sources of financing, and in 57% of sites the financing was public. The most important strengths found were the transparency of financing, the lack of links to the pharmaceutical industry, the transparency of site management and responsibility, and the proven scientific quality and constant updating of contents.

  14. 32 CFR Appendix A to Part 806b - Definitions

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... exemption for protecting the identity of confidential sources. Cookie: Data created by a Web server that is... (persistent cookie). It provides a way for the Web site to identify users and keep track of their preferences... or is sent to a Web site different from the one you are currently viewing. Defense Data Integrity...

  15. 32 CFR Appendix A to Part 806b - Definitions

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... exemption for protecting the identity of confidential sources. Cookie: Data created by a Web server that is... (persistent cookie). It provides a way for the Web site to identify users and keep track of their preferences... or is sent to a Web site different from the one you are currently viewing. Defense Data Integrity...

  16. BAID: The Barrow Area Information Database - an interactive web mapping portal and cyberinfrastructure for scientific activities in the vicinity of Barrow, Alaska.

    NASA Astrophysics Data System (ADS)

    Cody, R. P.; Kassin, A.; Kofoed, K. B.; Copenhaver, W.; Laney, C. M.; Gaylord, A. G.; Collins, J. A.; Tweedie, C. E.

    2014-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic, and the Barrow Area Information Database (BAID, www.barrowmapped.org) tracks and facilitates a gamut of research, management, and educational activities in the area. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken in the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 12,000 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, save or print maps and query results, and filter or view information by space, time, and/or other tags. Data are described with metadata that meet Federal Geographic Data Committee standards and are archived at the University Corporation for Atmospheric Research Earth Observing Laboratory (EOL), where non-proprietary BAID data can be freely downloaded. Recent advances include the addition of more than 2000 new research sites, provision of differential global positioning system (dGPS) and Unmanned Aerial Vehicle (UAV) support to visiting scientists, surveying of over 80 miles of coastline to document rates of erosion, training of local GIS personnel to better make use of science in local decision making, deployment of and near-real-time connectivity to a wireless micrometeorological sensor network, links to Barrow area datasets housed at national data archives, and substantial upgrades to the BAID website and web mapping applications.

  17. Analyzing Web Server Logs to Improve a Site's Usage. The Systems Librarian

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2005-01-01

    This column describes ways to streamline and optimize how a Web site works in order to improve both its usability and its visibility. The author explains how to analyze logs and other system data to measure the effectiveness of the Web site design and search engine.
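
    Illustration (not part of the record above): the kind of log analysis the column describes can be approximated by tallying successful requests per page from NCSA common-log-format lines. A minimal Python sketch, assuming an illustrative access.log path and regular expression:

    ```python
    # Count page views per path from an Apache/NCSA common-log-format file.
    import re
    from collections import Counter

    LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

    def page_counts(log_path):
        counts = Counter()
        with open(log_path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                m = LOG_LINE.search(line)
                if m and m.group("status").startswith("2"):   # successful requests only
                    counts[m.group("path")] += 1
        return counts

    # Example usage: print the ten most requested pages
    # for path, hits in page_counts("access.log").most_common(10):
    #     print(f"{hits:8d}  {path}")
    ```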

  18. TriNet "ShakeMaps": Rapid generation of peak ground motion and intensity maps for earthquakes in southern California

    USGS Publications Warehouse

    Wald, D.J.; Quitoriano, V.; Heaton, T.H.; Kanamori, H.; Scrivner, C.W.; Worden, C.B.

    1999-01-01

    Rapid (3-5 minutes) generation of maps of instrumental ground-motion and shaking intensity is accomplished through advances in real-time seismographic data acquisition combined with newly developed relationships between recorded ground-motion parameters and expected shaking intensity values. Estimation of shaking over the entire regional extent of southern California is obtained by spatial interpolation of the measured ground motions with geologically based, frequency- and amplitude-dependent site corrections. Production of the maps is automatic, triggered by any significant earthquake in southern California. Maps are now made available within several minutes of the earthquake for public and scientific consumption via the World Wide Web; they will be made available with dedicated communications for emergency response agencies and critical users.
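
    Illustration (not part of the record above): the actual TriNet/ShakeMap procedure uses empirical ground-motion and intensity relations with frequency- and amplitude-dependent site corrections; the Python sketch below only conveys the general flavor of interpolating recorded peak ground motions onto grid points and applying a site amplification factor, with all numbers invented:

    ```python
    # Toy inverse-distance interpolation of recorded PGA onto grid points.
    # Not the ShakeMap algorithm; values are fabricated for illustration.
    import numpy as np

    def interpolate_pga(station_xy, station_pga, grid_xy, power=2.0):
        """Inverse-distance-weighted PGA estimate at each grid point."""
        est = np.empty(len(grid_xy))
        for i, g in enumerate(grid_xy):
            d = np.linalg.norm(station_xy - g, axis=1)
            if np.any(d < 1e-6):              # grid point coincides with a station
                est[i] = station_pga[np.argmin(d)]
            else:
                w = 1.0 / d**power
                est[i] = np.sum(w * station_pga) / np.sum(w)
        return est

    stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])   # km
    pga      = np.array([0.12, 0.05, 0.08])                       # g
    grid     = np.array([[5.0, 5.0], [2.0, 1.0]])
    site_amp = np.array([1.4, 1.0])                               # geology-based factors
    print(interpolate_pga(stations, pga, grid) * site_amp)
    ```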

  19. High Plains Regional Ground-water Study web site

    USGS Publications Warehouse

    Qi, Sharon L.

    2000-01-01

    Now available on the Internet is a web site for the U.S. Geological Survey's (USGS) National Water-Quality Assessment (NAWQA) Program-High Plains Regional Ground-Water Study. The purpose of the web site is to provide public access to a wide variety of information on the USGS investigation of the ground-water resources within the High Plains aquifer system. Typical pages on the web site include the following: descriptions of the High Plains NAWQA, the National NAWQA Program, the study-area setting, current and past activities, significant findings, chemical and ancillary data (which can be downloaded), listing and access to publications, links to other sites about the High Plains area, and links to other web sites studying High Plains ground-water resources. The High Plains aquifer is a regional aquifer system that underlies 174,000 square miles in parts of eight States (Colorado, Kansas, Nebraska, New Mexico, Oklahoma, South Dakota, Texas, and Wyoming). Because the study area is so large, the Internet is an ideal way to provide project data and information on a near real-time basis. The web site will be a collection of living documents where project data and information are updated as they become available throughout the life of the project. If you have an interest in the High Plains area, you can check this site periodically to learn how the High Plains NAWQA activities are progressing over time and access new data and publications as they become available.

  20. SITEHOUND-web: a server for ligand binding site identification in protein structures.

    PubMed

    Hernandez, Marylens; Ghersi, Dario; Sanchez, Roberto

    2009-07-01

    SITEHOUND-web (http://sitehound.sanchezlab.org) is a binding-site identification server powered by the SITEHOUND program. Given a protein structure in PDB format, SITEHOUND-web will identify regions of the protein characterized by favorable interactions with a probe molecule. These regions correspond to putative ligand binding sites. Depending on the probe used in the calculation, sites with preference for different ligands will be identified. Currently, a carbon probe for identification of binding sites for drug-like molecules, and a phosphate probe for phosphorylated ligands (ATP, phosphopeptides, etc.) have been implemented. SITEHOUND-web will display the results in HTML pages including an interactive 3D representation of the protein structure and the putative sites using the Jmol Java applet. Various downloadable data files are also provided for offline data analysis.

  1. The Planetary Data System Distributed Inventory System

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; McMahon, Susan K.

    1996-01-01

    The advent of the World Wide Web (Web) and the ability to easily put data repositories on-line has resulted in a proliferation of digital libraries. The heterogeneity of the underlying systems, the autonomy of the individual sites, and the distributed nature of the technology have made both interoperability across the sites and the search for resources within a site major research topics. This article will describe a system that addresses both issues using standard Web protocols and meta-data labels to implement an inventory of on-line resources across a group of sites. The success of this system is strongly dependent on the existence of and adherence to a standards architecture that guides the management of meta-data within participating sites.

  2. Assessing the Integrity of Web Sites Providing Data and Information on Corporate Behavior

    ERIC Educational Resources Information Center

    McLaughlin, Josetta; Pavelka, Deborah; McLaughlin, Gerald

    2005-01-01

    A significant trend in higher education evolving from the wide accessibility to the Internet is the availability of an ever-increasing supply of data on Web sites for use by professors, students, and researchers. As this usage by a wider variety of users grows, the ability to judge the integrity of the data, the related findings, and the Web site…

  3. Does your web site draw new patients?

    PubMed

    Wallin, Wendy S

    2009-11-01

    The absence of scientific data forces orthodontists to guess at how best to design Internet sites that persuade prospective patients to call for appointments. This study was conducted to identify the Web-site factors that lead prospective patients to make appointments or, conversely, to reject a practice. Ten participants actively looking online for an orthodontist were recruited to participate. They reviewed 64 orthodontic Web sites in their geographic areas and rated their likelihood of calling each practice for an appointment. The sessions were videotaped. Analysis of participant comments, navigation patterns, and ratings suggested 25 distinguishing factors. Statistical analysis showed 10 Web-site characteristics that predict the success of an orthodontic Web site in attracting new patients.

  4. Testing of a prototype Web based intervention for adolescent mothers on postpartum depression.

    PubMed

    Logsdon, M Cynthia; Barone, Michael; Lynch, Tania; Robertson, Ashley; Myers, John; Morrison, David; York, Sara; Gregg, Jennifer

    2013-08-01

    This article describes testing of a prototype Web site for adolescent mothers with postpartum depression, providing proof of concept. Participants (N=138) were recruited from a public school-based program for adolescent parents and completed the Mental Health Acceptability Scale, Stigma Scale for Receiving Psychological Help, and Attitudes Towards Seeking Professional Psychological Help Scale before, and after, the Web site intervention. They also provided feedback on the usability of the Web site. Attitudes related to depression and treatment (ATSPPH) improved after viewing the Web site (p=.023). Feedback on the Web site indicated that it was easy to use (77%), reflecting a highly acceptable score for product usability. The data provide the foundation for the launch of the Web site from prototype to product and for more comprehensive testing. Informational text messages will be created, tested, and added to the Web site to increase the interactivity and dose of the intervention. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. The Path to Graduation: A Model Interactive Web Site Design Supporting Doctoral Students

    ERIC Educational Resources Information Center

    Simmons-Johnson, Nicole

    2012-01-01

    Objective. This 2-phase mixed method study assessed 2nd-year doctoral students' and dissertation students' perceptions of the current Graduate School of Education dissertation support Web site, with implications for designing a model dissertation support Web site. Methods. Phase 1 collected quantitative and qualitative data through an…

  6. NSLDS Training Workshop: Participant's Guide.

    ERIC Educational Resources Information Center

    Department of Education, Washington, DC. Student Financial Assistance.

    These training materials were designed to be used by participants at a National Student Loan Data System (NSLDS) workshop which explains how to use the NSLDS Web site. Following an overview, the guide is organized into four sessions: (1) students' access and use of the NSLDS Web site; (2) navigating the financial aid professional Web site; (3)…

  7. Web GIS in practice VI: a demo playlist of geo-mashups for public health neogeographers

    PubMed Central

    Boulos, Maged N Kamel; Scotch, Matthew; Cheung, Kei-Hoi; Burden, David

    2008-01-01

    'Mashup' was originally used to describe the mixing together of musical tracks to create a new piece of music. The term now refers to Web sites or services that weave data from different sources into a new data source or service. Using a musical metaphor that builds on the origin of the word 'mashup', this paper presents a demonstration "playlist" of four geo-mashup vignettes that make use of a range of Web 2.0, Semantic Web, and 3-D Internet methods, with outputs/end-user interfaces spanning the flat Web (two-dimensional – 2-D maps), a three-dimensional – 3-D mirror world (Google Earth) and a 3-D virtual world (Second Life ®). The four geo-mashup "songs" in this "playlist" are: 'Web 2.0 and GIS (Geographic Information Systems) for infectious disease surveillance', 'Web 2.0 and GIS for molecular epidemiology', 'Semantic Web for GIS mashup', and 'From Yahoo! Pipes to 3-D, avatar-inhabited geo-mashups'. It is hoped that this showcase of examples and ideas, and the pointers we are providing to the many online tools that are freely available today for creating, sharing and reusing geo-mashups with minimal or no coding, will ultimately spark the imagination of many public health practitioners and stimulate them to start exploring the use of these methods and tools in their day-to-day practice. The paper also discusses how today's Web is rapidly evolving into a much more intensely immersive, mixed-reality and ubiquitous socio-experiential Metaverse that is heavily interconnected through various kinds of user-created mashups. PMID:18638385

  8. Study on online community user motif using web usage mining

    NASA Astrophysics Data System (ADS)

    Alphy, Meera; Sharma, Ajay

    2016-04-01

    Web usage mining is the application of data mining used to extract useful information from online communities. The World Wide Web contained at least 4.73 billion pages according to the Indexed Web, and at least 228.52 million pages according to the Dutch Indexed Web, as of Thursday, 6 August 2015. It is difficult to find the needed data among these billions of web pages, which is where web usage mining becomes important. Personalizing the search engine helps web users identify the most-used data in an easy way; it reduces time consumption and supports automatic site search and automatic restoring of useful sites. This study surveys the techniques used for pattern discovery and analysis in web usage mining from 1996 to 2015, from the earliest to the latest. Analyzing user motifs helps improve business, e-commerce, personalization, and website design.
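
    Illustration (not part of the record above): one elementary pattern-discovery step in web usage mining is counting frequent page-to-page transitions in per-user clickstreams. A toy Python sketch with fabricated sessions:

    ```python
    # Count two-page transitions across sessions; real pipelines add
    # preprocessing, sessionization, and richer models than this toy example.
    from collections import Counter

    sessions = [
        ["/home", "/search", "/item/42", "/cart"],
        ["/home", "/search", "/item/17"],
        ["/home", "/item/42", "/cart"],
    ]

    transitions = Counter()
    for pages in sessions:
        transitions.update(zip(pages, pages[1:]))

    for (src, dst), n in transitions.most_common(3):
        print(f"{n} x  {src} -> {dst}")
    ```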

  9. 75 FR 17050 - Enhancing Airline Passenger Protections: Extension of Compliance Date for Posting of Flight Delay...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-05

    ... Flight Delay Data on Web Sites AGENCY: Office of the Secretary (OST), Department of Transportation (DOT... information on their Web sites. This extension is in response to requests by several carrier associations for... on Web sites in view of the extensive changes to carriers' reporting systems that are necessitated by...

  10. 78 FR 20983 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-08

    ... the proposed rule change is available on the Exchange's Web site at www.nyse.com , at the principal... in a manner to facilitate its distribution via Web sites or mobile devices. \\4\\ See Securities... broadcasters, Web site and mobile device service providers, and others to distribute this data product to their...

  11. 76 FR 1489 - Self-Regulatory Organizations; NYSE Arca, Inc.; Order Granting Approval of Proposed Rule Change...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-10

    ... disseminated on the CBOE Web site at http://www.cboe.com and through major market data vendors. An updated... Fund will provide Web site disclosure of portfolio holdings daily and will include, as applicable, the... Contracts are available from the Web sites of the CFE, automated quotation systems, published or other...

  12. Web-based interactive visualization in a Grid-enabled neuroimaging application using HTML5.

    PubMed

    Siewert, René; Specovius, Svenja; Wu, Jie; Krefting, Dagmar

    2012-01-01

    Interactive visualization and correction of intermediate results are required in many medical image analysis pipelines. To allow a degree of interaction in the remote execution of compute- and data-intensive applications, new features of HTML5 are used. They allow for transparent integration of user interaction into Grid- or Cloud-enabled scientific workflows. Both 2D and 3D visualization and data manipulation can be performed through a scientific gateway without the need to install specific software or web browser plugins. The possibilities of web-based visualization are presented using the FreeSurfer pipeline, a popular compute- and data-intensive software tool for quantitative neuroimaging.

  13. SnipViz: a compact and lightweight web site widget for display and dissemination of multiple versions of gene and protein sequences.

    PubMed

    Jaschob, Daniel; Davis, Trisha N; Riffle, Michael

    2014-07-23

    As high throughput sequencing continues to grow more commonplace, the need to disseminate the resulting data via web applications continues to grow. Particularly, there is a need to disseminate multiple versions of related gene and protein sequences simultaneously--whether they represent alleles present in a single species, variations of the same gene among different strains, or homologs among separate species. Often this is accomplished by displaying all versions of the sequence at once in a manner that is not intuitive or space-efficient and does not facilitate human understanding of the data. Web-based applications needing to disseminate multiple versions of sequences would benefit from a drop-in module designed to effectively disseminate these data. SnipViz is a client-side software tool designed to disseminate multiple versions of related gene and protein sequences on web sites. SnipViz has a space-efficient, interactive, and dynamic interface for navigating, analyzing and visualizing sequence data. It is written using standard World Wide Web technologies (HTML, Javascript, and CSS) and is compatible with most web browsers. SnipViz is designed as a modular client-side web component and may be incorporated into virtually any web site and be implemented without any programming. SnipViz is a drop-in client-side module for web sites designed to efficiently visualize and disseminate gene and protein sequences. SnipViz is open source and is freely available at https://github.com/yeastrc/snipviz.

  14. Reconstructing the temporal ordering of biological samples using microarray data.

    PubMed

    Magwene, Paul M; Lizardi, Paul; Kim, Junhyong

    2003-05-01

    Accurate time series for biological processes are difficult to estimate due to problems of synchronization, temporal sampling and rate heterogeneity. Methods are needed that can utilize multi-dimensional data, such as those resulting from DNA microarray experiments, in order to reconstruct time series from unordered or poorly ordered sets of observations. We present a set of algorithms for estimating temporal orderings from unordered sets of sample elements. The techniques we describe are based on modifications of a minimum-spanning tree calculated from a weighted, undirected graph. We demonstrate the efficacy of our approach by applying these techniques to an artificial data set as well as several gene expression data sets derived from DNA microarray experiments. In addition to estimating orderings, the techniques we describe also provide useful heuristics for assessing relevant properties of sample datasets such as noise and sampling intensity, and we show how a data structure called a PQ-tree can be used to represent uncertainty in a reconstructed ordering. Academic implementations of the ordering algorithms are available as source code (in the programming language Python) on our web site, along with documentation on their use. The artificial 'jelly roll' data set upon which the algorithm was tested is also available from this web site. The publicly available gene expression data may be found at http://genome-www.stanford.edu/cellcycle/ and http://caulobacter.stanford.edu/CellCycle/.
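
    Illustration (not part of the record above): the core idea of ordering samples with a minimum spanning tree can be sketched as follows in Python (using networkx and a fabricated two-dimensional "expression" matrix); this is a simplified stand-in, not the authors' published algorithm or their PQ-tree representation of ordering uncertainty:

    ```python
    # Build an MST over pairwise distances between samples and take the tree's
    # longest path as a candidate end-to-end ordering of the samples.
    import numpy as np
    import networkx as nx

    samples = np.array([[0.0, 0.1], [0.9, 1.1], [0.4, 0.5],
                        [1.4, 1.6], [0.2, 0.2]])            # rows = samples

    G = nx.Graph()
    n = len(samples)
    for i in range(n):
        for j in range(i + 1, n):
            G.add_edge(i, j, weight=float(np.linalg.norm(samples[i] - samples[j])))

    mst = nx.minimum_spanning_tree(G)

    # Longest path in a tree: farthest node from an arbitrary node, then the
    # farthest node from that one; the path between them is the tree diameter.
    d0 = nx.single_source_dijkstra_path_length(mst, 0)
    a = max(d0, key=d0.get)
    da = nx.single_source_dijkstra_path_length(mst, a)
    b = max(da, key=da.get)
    ordering = nx.shortest_path(mst, a, b, weight="weight")
    print(ordering)   # e.g. [3, 1, 2, 4, 0] -- one candidate ordering of samples
    ```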

  15. Web-based resources for critical care education.

    PubMed

    Kleinpell, Ruth; Ely, E Wesley; Williams, Ged; Liolios, Antonios; Ward, Nicholas; Tisherman, Samuel A

    2011-03-01

    To identify, catalog, and critically evaluate Web-based resources for critical care education. A multilevel search strategy was utilized. Literature searches were conducted (from 1996 to September 30, 2010) using OVID-MEDLINE, PubMed, and the Cumulative Index to Nursing and Allied Health Literature with the terms "Web-based learning," "computer-assisted instruction," "e-learning," "critical care," "tutorials," "continuing education," "virtual learning," and "Web-based education." The Web sites of relevant critical care organizations (American College of Chest Physicians, American Society of Anesthesiologists, American Thoracic Society, European Society of Intensive Care Medicine, Society of Critical Care Medicine, World Federation of Societies of Intensive and Critical Care Medicine, American Association of Critical Care Nurses, and World Federation of Critical Care Nurses) were reviewed for the availability of e-learning resources. Finally, Internet searches and e-mail queries to critical care medicine fellowship program directors and members of national and international acute/critical care listserves were conducted to 1) identify the use of and 2) review and critique Web-based resources for critical care education. To ensure credibility of Web site information, Web sites were reviewed by three independent reviewers on the basis of the criteria of authority, objectivity, authenticity, accuracy, timeliness, relevance, and efficiency in conjunction with suggested formats for evaluating Web sites in the medical literature. Literature searches using OVID-MEDLINE, PubMed, and the Cumulative Index to Nursing and Allied Health Literature resulted in >250 citations. Those pertinent to critical care provide examples of the integration of e-learning techniques, the development of specific resources, reports of the use of types of e-learning, including interactive tutorials, case studies, and simulation, and reports of student or learner satisfaction, among other general reviews of the benefits of utilizing e-learning. Review of the Web sites of relevant critical care organizations revealed the existence of a number of e-learning resources, including online critical care courses, tutorials, podcasts, webcasts, slide sets, and continuing medical education resources, some requiring membership or a fee to access. Respondents to listserve queries (>100) and critical care medicine fellowship director and advanced practice nursing educator e-mail queries (>50) identified the use of a number of tutorials, self-directed learning modules, and video-enhanced programs for critical care education and practice. In all, >135 Web-based education resources exist, including video Web resources for critical care education in a variety of e-learning formats, such as tutorials, self-directed learning modules, interactive case studies, webcasts, podcasts, and video-enhanced programs. As identified by critical care educators and practitioners, e-learning is actively being integrated into critical care medicine and nursing training programs for continuing medical education and competency training purposes. Knowledge of available Web-based educational resources may enhance critical care practitioners' ongoing learning and clinical competence, although this has not been objectively measured to date.

  16. Characteristics associated with use of public and private web sites as sources of health care information: results from a national survey.

    PubMed

    Miller, Edward Alan; West, Darrell M

    2007-03-01

    We sought to determine the frequency with which Americans access health information from governmental (public sector) and nongovernmental (private sector) web sites and to identify similarities and differences in the characteristics associated with use of each type. Data derive from 928 individuals who responded to a November 2005 national survey. In addition to forms of health communication, we asked about age, gender, race, income, education, insurance, lifestyle, residence, satisfaction, literacy, and health. We report the extent of web site use stratified by sponsorship type, public and private. We also use chi-square tests to examine bivariate associations. Logistic regression and multiple imputation of missing data were used to examine the correlates of use in a multivariate context. More than twice as many respondents visited private web sites (29.6%) as public web sites (13.2%). However, just 23.6% and 18.9% of private and public web site visitors, respectively, reported doing so once a month or more. Both public and private web site visitors were more likely to be better-educated respondents (odds ratio [OR]=0.83, OR=1.57) reporting greater concerns about health care access (OR=1.28, OR=1.20) than nonvisitors. Younger individuals (OR=0.83) living in urban areas (OR=1.59) with stronger health literacy (OR=1.24) and reporting greater concerns about health care affordability (OR=1.59) were more likely to visit privately sponsored web sites but not publicly sponsored ones. Relatively low utilization levels necessitate a concerted effort to improve the quality, accessibility, and relevance of Internet health information. Efforts to close the digital divide must recognize differences in user characteristics across governmental and nongovernmental web site providers.

  17. Design and Evaluation of a Web-Based Symptom Monitoring Tool for Heart Failure.

    PubMed

    Wakefield, Bonnie J; Alexander, Gregory; Dohrmann, Mary; Richardson, James

    2017-05-01

    Heart failure is a chronic condition in which symptom recognition and between-visit communication with providers are critical. Patients are encouraged to track disease-specific data, such as weight and shortness of breath. Use of a Web-based tool that facilitates data display in graph form may help patients recognize exacerbations and more easily communicate out-of-range data to clinicians. The purposes of this study were to (1) design a Web-based tool to facilitate symptom monitoring and symptom recognition in patients with chronic heart failure and (2) conduct a usability evaluation of the Web site. Patient participants generally had a positive view of the Web site and indicated it would support recording their health status and communicating with their doctors. Clinician participants generally had a positive view of the Web site and indicated it would be a potentially useful adjunct to electronic health delivery systems. Participants expressed a need to incorporate decision support within the site and wanted to add other data (for example, blood pressure) and to have the ability to adjust the font size. A few expressed concerns about data privacy and security. Technologies require careful design and testing to ensure they are useful, usable, and safe for patients and do not add to the burden of busy providers.

  18. 78 FR 69168 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ... available on the Exchange's Web site ( http://www.cboe.com/AboutCBOE/CBOELegalRegulatoryHome.aspx ), at the... set forth on the Price List on the MDX Web site ( www.marketdataexpress.com ). MDX currently charges a.... Market participants would be able to purchase Historical COPS Data through the MDX Web site. The proposed...

  19. 78 FR 20969 - Self-Regulatory Organizations; NYSE MKT LLC; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-08

    ... available on the Exchange's Web site at www.nyse.com , at the principal office of the Exchange, and at the... offered in a manner to facilitate its distribution via Web sites or mobile devices. \\5\\ See id. at 31501... data vendors, television broadcasters, Web site and mobile device service providers, and others to...

  20. 78 FR 20986 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-08

    ... the proposed rule change is available on the Exchange's Web site at www.nyse.com , at the principal... television screens. NYSE Arca Trades is not offered in a manner to facilitate its distribution via Web sites... in a new manner that will permit market data vendors, television broadcasters, Web site and mobile...

  1. Tampa Bay Study Data and Information Management System (DIMS)

    NASA Astrophysics Data System (ADS)

    Edgar, N. T.; Johnston, J. B.; Yates, K.; Smith, K. E.

    2005-05-01

    Providing easy access to data and information is an essential component of both science and management. The Tampa Bay Data and Information Management System (DIMS) catalogs and publicizes data and products which are generated through the Tampa Bay Integrated Science Study. The publicly accessible interface consists of a Web site (http://gulfsci.usgs.gov), a digital library, and an interactive map server (IMS). The Tampa Bay Study Web site contains information from scientists involved in the study, and is also the portal site for the digital library and IMS. Study information is highlighted on the Web site according to the estuarine component: geology and geomorphology, water and sediment quality, ecosystem structure and function, and hydrodynamics. The Tampa Bay Digital Library is a web-based clearinghouse for digital products on Tampa Bay, including documents, maps, spatial and tabular data sets, presentations, etc. New developments to the digital library include new search features, 150 new products over the past year, and partnerships to expand the offering of science products. The IMS is a Web-based geographic information system (GIS) used to store, analyze and display data pertaining to Tampa Bay. Upgrades to the IMS have improved performance and speed, as well as increased the number of data sets available for mapping. The Tampa Bay DIMS is a dynamic entity and will continue to evolve with the study. Beginning in 2005, the Tampa Bay Integrated Coastal Model will have a more prominent presence within the DIMS. The Web site will feature model projects and plans; the digital library will host model products and data sets; the IMS will display spatial model data sets and analyses. These tools will be used to increase communication of USGS efforts in Tampa Bay to the public, local managers, and scientists.

  2. A survey of motif finding Web tools for detecting binding site motifs in ChIP-Seq data

    PubMed Central

    2014-01-01

    ChIP-Seq (chromatin immunoprecipitation sequencing) has provided an advantage for finding motifs, as ChIP-Seq experiments narrow the motif search down to binding site locations. Recent motif finding tools facilitate motif detection by providing user-friendly Web interfaces. In this work, we reviewed nine motif finding Web tools that are capable of detecting binding site motifs in ChIP-Seq data. We show that each motif finding Web tool has its own advantages for detecting motifs that other tools may not discover. We recommend that users apply multiple motif finding Web tools that implement different algorithms in order to obtain significant motifs, overlapping similar motifs, and non-overlapping motifs. Finally, we provide our suggestions for the future development of motif finding Web tools that better assist researchers in finding motifs in ChIP-Seq data. Reviewers: This article was reviewed by Prof. Sandor Pongor, Dr. Yuriy Gusev, and Dr. Shyam Prabhakar (nominated by Prof. Limsoon Wong). PMID:24555784

  3. Price comparisons on the internet based on computational intelligence.

    PubMed

    Kim, Jun Woo; Ha, Sung Ho

    2014-01-01

    Information-intensive Web services such as price comparison sites have recently been gaining popularity. However, most users including novice shoppers have difficulty in browsing such sites because of the massive amount of information gathered and the uncertainty surrounding Web environments. Even conventional price comparison sites face various problems, which suggests the necessity of a new approach to address these problems. Therefore, for this study, an intelligent product search system was developed that enables price comparisons for online shoppers in a more effective manner. In particular, the developed system adopts linguistic price ratings based on fuzzy logic to accommodate user-defined price ranges, and personalizes product recommendations based on linguistic product clusters, which help online shoppers find desired items in a convenient manner.
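
    Illustration (not part of the record above): "linguistic price ratings" based on fuzzy logic can be sketched with triangular membership functions that map a price to degrees of "cheap", "moderate", and "expensive". The breakpoints below are invented and do not reproduce the paper's actual membership functions or its clustering scheme:

    ```python
    # Triangular fuzzy membership functions for illustrative price ratings.
    def triangular(x, a, b, c):
        """Membership of x in a triangular fuzzy set rising from a, peaking at b, falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def price_ratings(price):
        return {
            "cheap":     triangular(price,  -1,   0,  50),
            "moderate":  triangular(price,  30,  75, 120),
            "expensive": triangular(price, 100, 200, 300),
        }

    print(price_ratings(60))   # e.g. moderate ~0.67, cheap 0.0, expensive 0.0
    ```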

  4. Price Comparisons on the Internet Based on Computational Intelligence

    PubMed Central

    Kim, Jun Woo; Ha, Sung Ho

    2014-01-01

    Information-intensive Web services such as price comparison sites have recently been gaining popularity. However, most users including novice shoppers have difficulty in browsing such sites because of the massive amount of information gathered and the uncertainty surrounding Web environments. Even conventional price comparison sites face various problems, which suggests the necessity of a new approach to address these problems. Therefore, for this study, an intelligent product search system was developed that enables price comparisons for online shoppers in a more effective manner. In particular, the developed system adopts linguistic price ratings based on fuzzy logic to accommodate user-defined price ranges, and personalizes product recommendations based on linguistic product clusters, which help online shoppers find desired items in a convenient manner. PMID:25268901

  5. Who Goes There? Measuring Library Web Site Usage.

    ERIC Educational Resources Information Center

    Bauer, Kathleen

    2000-01-01

    Discusses how libraries can gather data on the use of their Web sites. Highlights include Web server log files, including the common log file, referrer log file, and agent log file; log file limitations; privacy concerns; and choosing log analysis software, both free and commercial. (LRW)

  6. A web-based platform to support an evidence-based mental health intervention: lessons from the CBITS web site.

    PubMed

    Vona, Pamela; Wilmoth, Pete; Jaycox, Lisa H; McMillen, Janey S; Kataoka, Sheryl H; Wong, Marleen; DeRosier, Melissa E; Langley, Audra K; Kaufman, Joshua; Tang, Lingqi; Stein, Bradley D

    2014-11-01

    To explore the role of Web-based platforms in behavioral health, the study examined usage of a Web site for supporting training and implementation of an evidence-based intervention. Using data from an online registration survey and Google Analytics, the investigators examined user characteristics and Web site utilization. Site engagement was substantial across user groups. Visit duration differed by registrants' characteristics. Less experienced clinicians spent more time on the Web site. The training section accounted for most page views across user groups. Individuals previously trained in the Cognitive-Behavioral Intervention for Trauma in Schools intervention viewed more implementation assistance and online community pages than did other user groups. Web-based platforms have the potential to support training and implementation of evidence-based interventions for clinicians of varying levels of experience and may facilitate more rapid dissemination. Web-based platforms may be promising for trauma-related interventions, because training and implementation support should be readily available after a traumatic event.

  7. End User Evaluations

    NASA Astrophysics Data System (ADS)

    Jay, Caroline; Lunn, Darren; Michailidou, Eleni

    As new technologies emerge, and Web sites become increasingly sophisticated, ensuring they remain accessible to disabled and small-screen users is a major challenge. While guidelines and automated evaluation tools are useful for informing some aspects of Web site design, numerous studies have demonstrated that they provide no guarantee that the site is genuinely accessible. The only reliable way to evaluate the accessibility of a site is to study the intended users interacting with it. This chapter outlines the processes that can be used throughout the design life cycle to ensure Web accessibility, describing their strengths and weaknesses, and discussing the practical and ethical considerations that they entail. The chapter also considers an important emerging trend in user evaluations: combining data from studies of “standard” Web use with data describing existing accessibility issues, to drive accessibility solutions forward.

  8. A Secure Web Application Providing Public Access to High-Performance Data Intensive Scientific Resources - ScalaBLAST Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.

    2008-05-04

    This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP scripting language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.
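
    Illustration (not part of the record above): the backend step described, turning a web submission into the files needed to run a parallel BLAST-style job on a cluster, is sketched below in Python for brevity (the actual SWA backend is written in PHP); the SLURM-style scheduler, the paths, and the scalablast command line shown are assumptions for illustration only:

    ```python
    # Write the per-job files (query sequence plus a batch script) that a
    # scheduler submission would reference; nothing here is the SWA code itself.
    from pathlib import Path

    def write_job(job_id: str, fasta_text: str, workdir: Path) -> Path:
        job_dir = workdir / job_id
        job_dir.mkdir(parents=True, exist_ok=True)
        (job_dir / "query.fasta").write_text(fasta_text)
        script = job_dir / "run.sh"
        script.write_text(
            "#!/bin/bash\n"
            "#SBATCH --nodes=4\n"
            f"#SBATCH --job-name=swa_{job_id}\n"
            "srun scalablast -i query.fasta -d nr -o results.out\n"   # assumed command line
        )
        return script

    # Submission would then be a single scheduler call, e.g.:
    # import subprocess
    # subprocess.run(["sbatch", str(write_job("12345", ">q1\nMSTA", Path("/srv/swa/jobs")))], check=True)
    ```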

  9. 78 FR 21469 - Self-Regulatory Organizations; NYSE MKT LLC; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-10

    ... on the Exchange's Web site at www.nyse.com , at the principal office of the Exchange, and at the... NYSE MKT RRP.\\7\\ NYSE MKT RRP is designed for Web site distribution and includes the real-time last... distribution of a last sale data product for reference purposes on Web sites at a low cost that would...

  10. 78 FR 21436 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-10

    ... available on the Exchange's Web site at www.nyse.com , at the principal office of the Exchange, and at the... Arca RRP.\\7\\ NYSE Arca RRP is designed for Web site distribution and includes the real-time last sale... distribution of a last sale data product for reference purposes on Web sites at a low cost that would...

  11. SSE Decommission Announcement

    Atmospheric Science Data Center

    2018-06-26

    ... We are pleased to announce that on June 13, 2018 the old SSE web site will be replaced with the new data web portal at https://power.larc.nasa.gov with improved solar and ... currently on SSE are now available at the new POWER web site although the parameters might be organized differently.  Also note ...

  12. NASA: Data on the Web.

    ERIC Educational Resources Information Center

    Galica, Carol

    1997-01-01

    Provides an annotated bibliography of selected NASA Web sites for K-12 math and science teachers: the NASA Lewis Research Center Learning Technologies K-12 Home Page, Spacelink, NASA Quest, Basic Aircraft Design Page, International Space Station, NASA Shuttle Web Site, LIFTOFF to Space Education, Telescopes in Education, and Space Educator's…

  13. Audiovisual Speech Web-Lab: an Internet teaching and research laboratory.

    PubMed

    Gordon, M S; Rosenblum, L D

    2001-05-01

    Internet resources now enable laboratories to make full-length experiments available on line. A handful of existing web sites offer users the ability to participate in experiments and generate usable data. We have integrated this technology into a web site that also provides full discussion of the theoretical and methodological aspects of the experiments using text and simple interactive demonstrations. The content of the web site (http://www.psych.ucr.edu/avspeech/lab) concerns audiovisual speech perception and its relation to face perception. The site is designed to be useful for users of multiple interests and levels of expertise.

  14. Focused sunlight factor of forest fire danger assessment using Web-GIS and RS technologies

    NASA Astrophysics Data System (ADS)

    Baranovskiy, Nikolay V.; Sherstnyov, Vladislav S.; Yankovich, Elena P.; Engel, Marina V.; Belov, Vladimir V.

    2016-08-01

    The Timiryazevskiy forestry of Tomsk region (Siberia, Russia) is the study area examined in the current research. Forest fire danger assessment is based on a unique technology using a probabilistic criterion, statistical data on forest fires, meteorological conditions, forest site classification, and remote sensing data. MODIS products are used for estimating some meteorological conditions and the current forest fire situation. Geoinformation technologies are used for geospatial analysis of the forest fire danger situation on controlled forested territories. The GIS engine provides opportunities to construct electronic maps with different levels of forest fire probability and supports a raster layer for satellite remote sensing data on current forest fires. A web interface is used for loading data onto a dedicated web site and for presenting forest fire danger data via the World Wide Web. Special web forms provide an interface for choosing the relevant input data in order to process the forest fire danger data and assess the forest fire probability.

  15. Cleanups in My Community

    EPA Pesticide Factsheets

    Cleanups in My Community (CIMC) is a public web application that enables integrated access through maps, lists and search filtering to site-specific information EPA has across all cleanup programs. CIMC taps into data publicly available from EPA's EnviroFacts (RCRA Corrective Action facilities, Brownfields properties and grant areas, Superfund NPL sites, other facility data) and web services (water monitoring stations, impaired waters, emergency responses, tribal boundaries, congressional districts, etc.) and connects to other applications (e.g., Superfund's CPAD) to provide easy seamless access to site-specific cleanup information with explanatory text and within the context of related data. Data can be filtered by cleanup program, geography, environmental indicators, controls, and cleanup stage. CIMC also provides some web services that integrate these data for others to use in their applications.

  16. Description and testing of the Geo Data Portal: Data integration framework and Web processing services for environmental science collaboration

    USGS Publications Warehouse

    Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.

    2011-01-01

    Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information science software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.
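
    As an illustration of the open-standard Web processing services mentioned above, the hedged sketch below issues a generic OGC WPS GetCapabilities request with the requests library; the endpoint URL is a placeholder, not the Geo Data Portal's actual address.

    import requests

    def list_wps_processes(endpoint: str) -> str:
        params = {"service": "WPS", "version": "1.0.0", "request": "GetCapabilities"}
        response = requests.get(endpoint, params=params, timeout=30)
        response.raise_for_status()
        return response.text  # XML describing the processes the server offers

    # print(list_wps_processes("https://example.org/wps"))  # placeholder endpoint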

  17. Handling Internet-Based Health Information: Improving Health Information Web Site Literacy Among Undergraduate Nursing Students.

    PubMed

    Wang, Weiwen; Sun, Ran; Mulvehill, Alice M; Gilson, Courtney C; Huang, Linda L

    2017-02-01

    Patient care problems arise when health care consumers and professionals find health information on the Internet because that information is often inaccurate. To mitigate this problem, nurses can develop Web literacy and share that skill with health care consumers. This study evaluated a Web-literacy intervention designed to help undergraduate nursing students find reliable Web-based health information. A pre- and postsurvey queried undergraduate nursing students in an informatics course; the intervention comprised a lecture, in-class practice, and assignments about health Web site evaluation tools. Data were analyzed using Wilcoxon signed-rank tests and ANOVA. Pre-intervention, 75.9% of participants reported using Web sites to obtain health information. Post-intervention, 87.9% displayed confidence in using an evaluation tool. Both the ability to critique health Web sites (p = .005) and confidence in finding reliable Internet-based health information (p = .058) increased. Web-literacy education guides nursing students to find, evaluate, and use reliable Web sites, which improves their ability to deliver safer patient care. [J Nurs Educ. 2017;56(2):110-114.]. Copyright 2017, SLACK Incorporated.
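
    For readers unfamiliar with the paired test used here, the short example below runs a Wilcoxon signed-rank test on invented pre/post scores with SciPy; the numbers are placeholders, not the study's data.

    from scipy.stats import wilcoxon

    # Invented 1-5 self-ratings for ten students, before and after the intervention.
    pre_scores  = [3, 2, 4, 3, 2, 3, 4, 2, 3, 2]
    post_scores = [5, 3, 5, 4, 3, 4, 5, 2, 4, 4]

    statistic, p_value = wilcoxon(pre_scores, post_scores)
    print(f"Wilcoxon statistic = {statistic}, p = {p_value:.3f}")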

  18. A multilingual assessment of melanoma information quality on the Internet.

    PubMed

    Bari, Lilla; Kemeny, Lajos; Bari, Ferenc

    2014-06-01

    This study aims to assess and compare the quality of melanoma information available on the Internet in the Hungarian, Czech, and German languages. We used country-specific Google search engines to retrieve the first 25 uniform resource locators (URLs) by searching the word "melanoma" in the given language. Using the automated toolbar of the Health On the Net Foundation (HON), we assessed each Web site for HON certification based on the Health On the Net Foundation Code of Conduct (HONcode). Information quality was determined using a 35-point checklist created by Bichakjian et al. (J Clin Oncol 20:134-141, 2002), with the NCCN melanoma guideline as control. After excluding duplicate and link-only pages, a total of 24 Hungarian, 18 Czech, and 21 German melanoma Web sites were evaluated and rated. The proportion of HON-certified Web sites was highest among the German Web pages (19%). One of the retrieved Hungarian Web sites and none of the Czech Web sites were HON certified. We found the highest number of Web sites containing comprehensive, correct melanoma information in the German language, followed by the Czech and Hungarian pages. Although the majority of the Web sites lacked data about incidence, risk factors, prevention, treatment, work-up, and follow-up, at least one comprehensive, high-quality Web site was found in each language. Several Web sites in each language contained incorrect information. While a small number of comprehensive, high-quality melanoma-related Web sites was found, most of the retrieved Web content lacked basic disease information, such as risk factors, prevention, and treatment. A significant number of Web sites contained misinformation. In the case of melanoma, primary and secondary prevention are of especially high importance; therefore, improvement of the quality of disease information available on the Internet is necessary.

  19. HammerCloud: A Stress Testing System for Distributed Analysis

    NASA Astrophysics Data System (ADS)

    van der Ster, Daniel C.; Elmsheuser, Johannes; Úbeda García, Mario; Paladin, Massimo

    2011-12-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress-testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain, at a steady state, a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large-scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running jobs and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).
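
    The sketch below illustrates the steady-state behaviour described in (b): keep a predefined number of jobs running at each site under test. The submit_job and count_running helpers are hypothetical placeholders, not the Ganga or HammerCloud APIs.

    import random
    import time

    def submit_job(site: str) -> None:
        # Placeholder: in a real system this would hand a job to the grid middleware.
        print(f"submitting analysis job to {site}")

    def count_running(site: str) -> int:
        # Placeholder for a status query against the site's job queue.
        return random.randint(0, 10)

    def keep_steady_state(sites: list[str], target: int, cycles: int = 3) -> None:
        for _ in range(cycles):
            for site in sites:
                deficit = target - count_running(site)
                for _ in range(max(deficit, 0)):
                    submit_job(site)
            time.sleep(1)  # polling interval, shortened here for illustration

    keep_steady_state(["SITE_A", "SITE_B"], target=10)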

  20. SoyBase Simple Semantic Web Architecture and Protocol (SSWAP) Services

    USDA-ARS?s Scientific Manuscript database

    Semantic web technologies offer the potential to link internet resources and data by shared concepts without having to rely on absolute lexical matches. Thus two web sites or web resources which are concerned with similar data types could be identified based on similar semantics. In the biological...

  1. 17 CFR 232.202 - Continuing hardship exemption.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... electronic format or post the Interactive Data File on its corporate Web site, as applicable, on the required... Interactive Data File, the electronic filer need not post on its Web site any statement with regard to the... submitted in electronic format or, in the case of an Interactive Data File (§ 232.11), to be posted on the...

  2. 17 CFR 232.202 - Continuing hardship exemption.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... electronic format or post the Interactive Data File on its corporate Web site, as applicable, on the required... Interactive Data File, the electronic filer need not post on its Web site any statement with regard to the... submitted in electronic format or, in the case of an Interactive Data File (§ 232.11), to be posted on the...

  3. 17 CFR 232.202 - Continuing hardship exemption.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... electronic format or post the Interactive Data File on its corporate Web site, as applicable, on the required... Interactive Data File, the electronic filer need not post on its Web site any statement with regard to the... submitted in electronic format or, in the case of an Interactive Data File (§ 232.11), to be posted on the...

  4. 17 CFR 232.202 - Continuing hardship exemption.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... electronic format or post the Interactive Data File on its corporate Web site, as applicable, on the required... Interactive Data File, the electronic filer need not post on its Web site any statement with regard to the... submitted in electronic format or, in the case of an Interactive Data File (§ 232.11), to be posted on the...

  5. 17 CFR 232.202 - Continuing hardship exemption.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... electronic format or post the Interactive Data File on its corporate Web site, as applicable, on the required... Interactive Data File, the electronic filer need not post on its Web site any statement with regard to the... submitted in electronic format or, in the case of an Interactive Data File (§ 232.11), to be posted on the...

  6. Do infertile women and government staff differ in the evaluation of infertility-related Web sites?

    PubMed

    Takabayashi, Chikako; Shimada, Keiko

    2011-01-01

    To investigate the evaluation of local government Web sites carrying information on infertility by infertile women and by government staff. In particular, the study investigated whether the women and staff differed with respect to the information they rate as important and their self-reported satisfaction with the Web sites. Cross-sectional descriptive study. Sixty-two local government staff members, of whom 46 were public health nurses managing subsidy programs for infertility treatment in the Hokuriku region of Japan, and 84 infertile women attending local clinics. We measured the level of satisfaction with the local government Web sites and perceptions about the importance of each type of content. Data were descriptively analyzed, as well as by factor analysis and multiple regression analysis. Local government Web sites were analyzed with respect to information about the treatment, details of the subsidy program, psychological support, and procedures for making a subsidy application. The women rated information on the treatment and details of the subsidy programs as important. There was no difference of satisfaction with the Web sites between the infertile women and the staff. Local government staff need to provide reliable data for women who are seeking information on infertility treatment. © 2011 Wiley Periodicals, Inc.

  7. An open source Java web application to build self-contained Web GIS sites

    NASA Astrophysics Data System (ADS)

    Zavala Romero, O.; Ahmed, A.; Chassignet, E.; Zavala-Hidalgo, J.

    2014-12-01

    This work describes OWGIS, an open source Java web application that creates Web GIS sites by automatically writing HTML and JavaScript code. OWGIS is configured by XML files that define which layers (geographic datasets) will be displayed on the websites. The project uses several Open Geospatial Consortium standards to request data from typical map servers, such as GeoServer, and is also able to request data from ncWMS servers. The latter allows the display of 4D data stored in the NetCDF file format (widely used for storing environmental model datasets). Features available on the sites built with OWGIS include multiple languages, animations, vertical profiles and vertical transects, color palettes, color ranges, and the ability to download data. OWGIS's main users are scientists, such as oceanographers or climate scientists, who store their data in NetCDF files and want to analyze, visualize, share, or compare their data using a website.
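
    As a hedged illustration of the kind of request a generated site sends to a map server, the sketch below issues a standard OGC WMS 1.3.0 GetMap request with the requests library; the endpoint, layer name, and bounding box are placeholders.

    import requests

    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": "ocean:sea_surface_temperature",  # placeholder layer name
        "crs": "EPSG:4326",
        "bbox": "-30,-100,30,-40",  # lat/lon axis order, per WMS 1.3.0 rules for EPSG:4326
        "width": "512",
        "height": "512",
        "format": "image/png",
    }
    response = requests.get("https://example.org/geoserver/wms", params=params, timeout=30)
    response.raise_for_status()
    with open("map.png", "wb") as f:
        f.write(response.content)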

  8. Data and Statistics: Heart Failure

    MedlinePlus

    ... commit" type="submit" value="Submit" /> Related CDC Web Sites Heart Disease Stroke High Blood Pressure Salt ... to Prevent and Control Chronic Diseases Million Hearts® Web Sites with More Information About Heart Failure For ...

  9. Data and Statistics: Women and Heart Disease

    MedlinePlus

    ... commit" type="submit" value="Submit" /> Related CDC Web Sites Heart Disease Stroke High Blood Pressure Salt ... commit" type="submit" value="Submit" /> Related CDC Web Sites Heart Disease Stroke High Blood Pressure Salt ...

  10. Public outreach and communications of the Alaska Volcano Observatory during the 2005-2006 eruption of Augustine Volcano: Chapter 27 in The 2006 eruption of Augustine Volcano, Alaska

    USGS Publications Warehouse

    Adleman, Jennifer N.; Cameron, Cheryl E.; Snedigar, Seth F.; Neal, Christina A.; Wallace, Kristi L.; Power, John A.; Coombs, Michelle L.; Freymueller, Jeffrey T.

    2010-01-01

    The AVO Web site, with its accompanying database, is the backbone of AVO's external and internal communications. This was the first Cook Inlet volcanic eruption with a public expectation of real-time access to data, updates, and hazards information over the Internet. In March 2005, AVO improved the Web site from individual static pages to a dynamic, database-driven site. This new system provided quick and straightforward access to the latest information for (1) staff within the observatory, (2) emergency managers from State and local governments and organizations, (3) the media, and (4) the public. From mid-December 2005 through April 2006, the AVO Web site served more than 45 million Web pages and about 5.5 terabytes of data.

  11. SSE Transition to POWER

    Atmospheric Science Data Center

    2018-06-15

    ... We are pleased to announce that on June 13, 2018 the old SSE web site will be replaced with the new data web portal at https://power.larc.nasa.gov with improved solar and ... currently on SSE are now available at the new POWER web site although the parameters might be organized differently.  Also note ...

  12. Replacement of SSE with NASA's POWER Announcement

    Atmospheric Science Data Center

    2018-06-11

    ... We are pleased to announce that on June 13, 2018 the old SSE web site will be replaced with the new data web portal at https://power.larc.nasa.gov with improved solar and ... currently on SSE are now available at the new POWER web site although the parameters might be organized differently.  Also note ...

  13. 76 FR 67456 - Common Formats for Patient Safety Data Collection and Event Reporting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-01

    ... Common Formats, can be accessed electronically at the following HHS Web site: http://www.PSO.AHRQ.gov... Thromboembolism (VTE), which includes Deep Vein Thrombosis (DVT) and Pulmonary Embolism (PE), will apply to both... available at the PSO Privacy Protection Center (PPC) Web site: https://www.psoppc.org/web/patientsafety...

  14. SWS: accessing SRS sites contents through Web Services.

    PubMed

    Romano, Paolo; Marra, Domenico

    2008-03-26

    Web Services and Workflow Management Systems can support the creation and deployment of network systems able to automate data analysis and retrieval processes in biomedical research. Web Services have been implemented at bioinformatics centres, and workflow systems have been proposed for biological data analysis. New databanks are often developed by taking these technologies into account, but many existing databases do not allow programmatic access; only a fraction of available databanks can thus be queried through programmatic interfaces. SRS is a well-known indexing and search engine for biomedical databanks offering public access to many databanks and analysis tools. Unfortunately, these data are not easily and efficiently accessible through Web Services. We have developed 'SRS by WS' (SWS), a tool that makes the information available in SRS sites accessible through Web Services. Information on known sites is maintained in a database, srsdb. SWS consists of a suite of Web Services that can query both srsdb, for information on sites and databases, and the SRS sites themselves. SWS returns results in a text-only format and can be accessed through a WSDL-compliant client. SWS enables interoperability between workflow systems and SRS implementations, by also managing access to alternative sites, in order to cope with network and maintenance problems, and by selecting the most up-to-date among the available systems. The development and implementation of Web Services that allow programmatic access to an exhaustive set of biomedical databases can significantly improve the automation of in-silico analysis. SWS supports this activity by making the biological databanks managed in public SRS sites available through a programmatic interface.
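
    A generic example of consuming a WSDL-described service from Python with the zeep library is sketched below; the WSDL URL and the commented-out operation are placeholders and do not reflect the actual SWS interface.

    from zeep import Client

    # Placeholder WSDL location; substitute the address published by the service.
    client = Client("https://example.org/sws/service?wsdl")

    # Operations advertised by the WSDL become callable methods on client.service.
    for service in client.wsdl.services.values():
        print("service:", service.name)

    # result = client.service.SomeOperation("embl", "AB000001")  # hypothetical call
    # print(result)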

  15. Accredited hand surgery fellowship Web sites: analysis of content and accessibility.

    PubMed

    Trehan, Samir K; Morrell, Nathan T; Akelman, Edward

    2015-04-01

    To assess the accessibility and content of accredited hand surgery fellowship Web sites. A list of all accredited hand surgery fellowships was obtained from the online database of the American Society for Surgery of the Hand (ASSH). Fellowship program information on the ASSH Web site was recorded. All fellowship program Web sites were located via Google search. Fellowship program Web sites were analyzed for accessibility and content in 3 domains: program overview, application information/recruitment, and education. At the time of this study, there were 81 accredited hand surgery fellowships with 169 available positions. Thirty of 81 programs (37%) had a functional link on the ASSH online hand surgery fellowship directory; however, Google search identified 78 Web sites. Three programs did not have a Web site. Analysis of content revealed that most Web sites contained contact information, whereas information regarding the anticipated clinical, research, and educational experiences during fellowship was less often present. Furthermore, information regarding past and present fellows, salary, application process/requirements, call responsibilities, and case volume was frequently lacking. Overall, 52 of 81 programs (64%) had the minimal online information required for residents to independently complete the fellowship application process. Hand fellowship program Web sites could be accessed either via the ASSH online directory or Google search, except for 3 programs that did not have Web sites. Although most fellowship program Web sites contained contact information, other content such as application information/recruitment and education, was less frequently present. This study provides comparative data regarding the clinical and educational experiences outlined on hand fellowship program Web sites that are of relevance to residents, fellows, and academic hand surgeons. This study also draws attention to various ways in which the hand surgery fellowship application process can be made more user-friendly and efficient. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  16. NNDC Data Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuli, J.K.; Sonzogni,A.

    The National Nuclear Data Center has provided remote access to some of its resources since 1986. The major databases and other resources currently available through the NNDC Web site are summarized. The National Nuclear Data Center (NNDC) has provided remote access to the nuclear physics databases it maintains and to other resources since 1986. With considerable innovation, access is now mostly through the Web. The NNDC Web pages have been modernized to provide a consistent, state-of-the-art style. The improved database services and other resources available from the NNDC site at www.nndc.bnl.gov will be described.

  17. Remote vibration monitoring system using wireless internet data transfer

    NASA Astrophysics Data System (ADS)

    Lemke, John

    2000-06-01

    Vibrations from construction activities can affect infrastructure projects in several ways. Within the general vicinity of a construction site, vibrations can result in damage to existing structures, disturbance to people, damage to sensitive machinery, and degraded performance of precision instrumentation or motion sensitive equipment. Current practice for monitoring vibrations in the vicinity of construction sites commonly consists of measuring free field or structural motions using velocity transducers connected to a portable data acquisition unit via cables. This paper describes an innovative way to collect, process, transmit, and analyze vibration measurements obtained at construction sites. The system described measures vibration at the sensor location, performs necessary signal conditioning and digitization, and sends data to a Web server using wireless data transmission and Internet protocols. A Servlet program running on the Web server accepts the transmitted data and incorporates it into a project database. Two-way interaction between the Web-client and the Web server is accomplished through the use of a Servlet program and a Java Applet running inside a browser located on the Web client's computer. Advantages of this system over conventional vibration data logging systems include continuous unattended monitoring, reduced costs associated with field data collection, instant access to data files and graphs by project team members, and the ability to remotely modify data sampling schemes.
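
    The hedged sketch below shows the client side of such a scheme in Python: a field unit posting a block of digitized samples to a web server over HTTP. The endpoint and payload fields are assumptions, not the protocol described in the paper (which used Java Servlets and an Applet).

    import json
    import time
    import urllib.request

    def post_samples(endpoint: str, sensor_id: str, samples: list[float], rate_hz: int) -> int:
        # Package one block of digitized velocity readings and send it to the server.
        payload = {
            "sensor_id": sensor_id,
            "timestamp": time.time(),
            "sample_rate_hz": rate_hz,
            "samples": samples,
        }
        req = urllib.request.Request(
            endpoint,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status  # 200 means the server accepted the block

    # post_samples("https://example.org/vibration/upload", "geophone-01", [0.10, 0.32, 0.21], 1000)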

  18. A Quick Guide to Education Data Resources. A Tabletop Reference to Selected NCES Web Site Tools. For Librarians, Students, and Parents.

    ERIC Educational Resources Information Center

    National Center for Education Statistics (ED), Washington, DC.

    This leaflet is a guide to data resources on the Internet related to education. The first Web site listed, http://nces.ed.gov/globallocator/, allows the user to search for public and private elementary and secondary schools by name, city, state, or zip code. The second site, "The Students' Classroom," offers information on a range of…

  19. Internet-accessible, near-real-time volcano monitoring data for geoscience education: the Volcanoes Exploration Project—Pu`u `O`o

    NASA Astrophysics Data System (ADS)

    Poland, M. P.; Teasdale, R.; Kraft, K.

    2010-12-01

    Internet-accessible real- and near-real-time Earth science datasets are an important resource for geoscience education, but relatively few comprehensive datasets are available, and background information to aid interpretation is often lacking. In response to this need, the U.S. Geological Survey’s (USGS) Hawaiian Volcano Observatory, in collaboration with the National Aeronautics and Space Administration and the University of Hawai‘i, Mānoa, established the Volcanoes Exploration Project: Pu‘u ‘O‘o (VEPP). The VEPP Web site provides access, in near-real time, to geodetic, seismic, and geologic data from the Pu‘u ‘O‘o eruptive vent on Kilauea Volcano, Hawai‘i. On the VEPP Web site, a time series query tool provides a means of interacting with continuous geophysical data. In addition, results from episodic kinematic GPS campaigns and lava flow field maps are posted as data are collected, and archived Webcam images from Pu‘u ‘O‘o crater are available as a tool for examining visual changes in volcanic activity over time. A variety of background information on volcano surveillance and the history of the 1983-present Pu‘u ‘O‘o-Kupaianaha eruption puts the available monitoring data in context. The primary goal of the VEPP Web site is to take advantage of high visibility monitoring data that are seldom suitably well-organized to constitute an established educational resource. In doing so, the VEPP project provides a geoscience education resource that demonstrates the dynamic nature of volcanoes and promotes excitement about the process of scientific discovery through hands-on learning. To support use of the VEPP Web site, a week-long workshop was held at Kilauea Volcano in July 2010, which included 25 participants from the United States and Canada. The participants represented a diverse cross-section of higher learning, from community colleges to research universities, and included faculty who teach both large introductory non-major classes and seminar-style upper division and graduate-level classes. Overall workshop goals were for participants to learn how to interpret each of the VEPP data types, become proficient in the use of the VEPP Web site, provide feedback on site content, and create teaching modules that integrate the site into college and university geoscience curriculum. By the end of the workshop, over 20 new teaching modules were developed and the VEPP Web site was modified based on participant feedback. Teaching activities are available via the VEPP Workshop section of the Science Education Resource Center (SERC) Web site (http://www.nagt.org/nagt/vepp/index.html).

  20. Usability Testing of the Indiana University Education Faculty Web Forms.

    ERIC Educational Resources Information Center

    Tuzun, Hakan; Lee, Sun Myung; Graham, Charles; Sluder, Kirk Job

    The usability test team examined design problems that limit the ability of instructors at the Indiana University to use data entry forms on the School of Education Web site. The forms permit instructors to publish information about themselves and about courses they teach on the School of Education Web site. Faculty and graduate student instructors…

  1. Robust multi-site MR data processing: iterative optimization of bias correction, tissue classification, and registration.

    PubMed

    Young Kim, Eun; Johnson, Hans J

    2013-01-01

    A robust multi-modal tool for automated registration, bias correction, and tissue classification has been implemented for large-scale, heterogeneous, multi-site longitudinal MR data analysis. This work focused on improving an iterative optimization framework between bias correction, registration, and tissue classification, inspired by previous work. The primary contributions are robustness improvements from the incorporation of the following four elements: (1) use of multi-modal and repeated scans, (2) incorporation of highly deformable registration, (3) use of an extended set of tissue definitions, and (4) use of multi-modal-aware intensity-context priors. The benefits of these enhancements were investigated by a series of experiments with both a simulated brain data set (BrainWeb) and highly heterogeneous data from a 32-site imaging study, with quality assessed through expert visual inspection. The implementation of this tool is tailored for, but not limited to, large-scale data processing with great data variation, and it provides a flexible interface. In this paper, we describe enhancements to joint registration, bias correction, and tissue classification that improve the generalizability and robustness of processing multi-modal longitudinal MR scans collected at multiple sites. The tool was evaluated using both simulated and human subject MRI images. With these enhancements, the results showed improved robustness for large-scale heterogeneous MRI processing.
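
    As a hedged illustration of one stage of such a pipeline, the sketch below applies SimpleITK's N4 bias-field correction to a single scan; it is not the authors' tool, and the file name and Otsu-based mask are placeholders.

    import SimpleITK as sitk

    # Placeholder input path; any T1-weighted volume would do.
    image = sitk.ReadImage("t1_subject01.nii.gz", sitk.sitkFloat32)
    mask = sitk.OtsuThreshold(image, 0, 1, 200)  # rough foreground mask

    corrector = sitk.N4BiasFieldCorrectionImageFilter()
    corrected = corrector.Execute(image, mask)

    sitk.WriteImage(corrected, "t1_subject01_n4.nii.gz")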

  2. Accuracy of Marketing Claims by Providers of Stereotactic Radiation Therapy

    PubMed Central

    Narang, Amol K.; Lam, Edwin; Makary, Martin A.; DeWeese, Theodore L.; Pawlik, Timothy M.; Pronovost, Peter J.; Herman, Joseph M.

    2013-01-01

    Purpose: Direct-to-consumer advertising by industry has been criticized for encouraging overuse of unproven therapies, but advertising by health care providers has not been as carefully scrutinized. Stereotactic radiation therapy is an emerging technology that has sparked controversy regarding the marketing campaigns of some manufacturers. Given that this technology is also being heavily advertised on the Web sites of health care providers, the accuracy of providers' marketing claims should be rigorously evaluated. Methods: We reviewed the Web sites of all US hospitals and private practices that provide stereotactic radiation using two leading brands of stereotactic radiosurgery technology. Centers were identified by using data from the manufacturers. Centers without Web sites were excluded. The final study population consisted of 212 centers with online advertisements for stereotactic radiation. Web sites were evaluated for advertisements that were inconsistent with advertising guidelines provided by the American Medical Association. Results: Most centers (76%) had individual pages dedicated to the marketing of their brand of stereotactic technology that frequently contained manufacturer-authored images (50%) or text (55%). Advertising for the treatment of tumors that have not been endorsed by professional societies was present on 66% of Web sites. Centers commonly claimed improved survival (22%), disease control (20%), quality of life (17%), and toxicity (43%) with stereotactic radiation. Although 40% of Web sites championed the center's regional expertise in delivering stereotactic treatments, only 15% of Web sites provided data to support their claims. Conclusion: Provider advertisements for stereotactic radiation were prominent and aggressive. Further investigation of provider advertising, its effects on quality of care, and potential oversight mechanisms is needed. PMID:23633973

  3. Accuracy of marketing claims by providers of stereotactic radiation therapy.

    PubMed

    Narang, Amol K; Lam, Edwin; Makary, Martin A; Deweese, Theodore L; Pawlik, Timothy M; Pronovost, Peter J; Herman, Joseph M

    2013-01-01

    Direct-to-consumer advertising by industry has been criticized for encouraging overuse of unproven therapies, but advertising by health care providers has not been as carefully scrutinized. Stereotactic radiation therapy is an emerging technology that has sparked controversy regarding the marketing campaigns of some manufacturers. Given that this technology is also being heavily advertised on the Web sites of health care providers, the accuracy of providers' marketing claims should be rigorously evaluated. We reviewed the Web sites of all U.S. hospitals and private practices that provide stereotactic radiation using two leading brands of stereotactic radiosurgery technology. Centers were identified by using data from the manufacturers. Centers without Web sites were excluded. The final study population consisted of 212 centers with online advertisements for stereotactic radiation. Web sites were evaluated for advertisements that were inconsistent with advertising guidelines provided by the American Medical Association. Most centers (76%) had individual pages dedicated to the marketing of their brand of stereotactic technology that frequently contained manufacturer-authored images (50%) or text (55%). Advertising for the treatment of tumors that have not been endorsed by professional societies was present on 66% of Web sites. Centers commonly claimed improved survival (22%), disease control (20%), quality of life (17%), and toxicity (43%) with stereotactic radiation. Although 40% of Web sites championed the center's regional expertise in delivering stereotactic treatments, only 15% of Web sites provided data to support their claims. Provider advertisements for stereotactic radiation were prominent and aggressive. Further investigation of provider advertising, its effects on quality of care, and potential oversight mechanisms is needed.

  4. The commercialization of robotic surgery: unsubstantiated marketing of gynecologic surgery by hospitals.

    PubMed

    Schiavone, Maria B; Kuo, Eugenia C; Naumann, R Wendel; Burke, William M; Lewin, Sharyn N; Neugut, Alfred I; Hershman, Dawn L; Herzog, Thomas J; Wright, Jason D

    2012-09-01

    We analyzed the content, quality, and accuracy of information provided on hospital web sites about robotic gynecologic surgery. An analysis of hospitals with more than 200 beds from a selection of states was performed. Hospital web sites were analyzed for the content and quality of data regarding robotic-assisted surgery. Among 432 hospitals, the web sites of 192 (44.4%) contained marketing for robotic gynecologic surgery. Stock images (64.1%) and text (24.0%) derived from the robot manufacturer were frequent. Although most sites reported improved perioperative outcomes, limitations of robotics including cost, complications, and operative time were discussed only 3.7%, 1.6%, and 3.7% of the time, respectively. Only 47.9% of the web sites described a comparison group. Marketing of robotic gynecologic surgery is widespread. Much of the content is not based on high-quality data, fails to present alternative procedures, and relies on stock text and images. Copyright © 2012 Mosby, Inc. All rights reserved.

  5. Food and beverage advertising on children's web sites.

    PubMed

    Ustjanauskas, A E; Harris, J L; Schwartz, M B

    2014-10-01

    Food marketing contributes to childhood obesity. Food companies commonly place display advertising on children's web sites, but few studies have investigated this form of advertising. Document the number of food and beverage display advertisements viewed on popular children's web sites, nutritional quality of advertised brands and proportion of advertising approved by food companies as healthier dietary choices for child-directed advertising. Syndicated Internet exposure data identified popular children's web sites and food advertisements viewed on these web sites from July 2009 through June 2010. Advertisements were classified according to food category and companies' participation in food industry self-regulation. The percent of advertisements meeting government-proposed nutrition standards was calculated. 3.4 billion food advertisements appeared on popular children's web sites; 83% on just four web sites. Breakfast cereals and fast food were advertised most often (64% of ads). Most ads (74%) promoted brands approved by companies for child-directed advertising, but 84% advertised products that were high in fat, sugar and/or sodium. Ads for foods designated by companies as healthier dietary choices appropriate for child-directed advertising were least likely to meet independent nutrition standards. Most foods advertised on popular children's web sites do not meet independent nutrition standards. Further improvements to industry self-regulation are required. © 2013 The Authors. Pediatric Obesity © 2013 International Association for the Study of Obesity.

  6. JBioWH: an open-source Java framework for bioinformatics data integration

    PubMed Central

    Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor

    2013-01-01

    The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database scheme and includes JAVA API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh PMID:23846595

  7. JBioWH: an open-source Java framework for bioinformatics data integration.

    PubMed

    Vera, Roberto; Perez-Riverol, Yasset; Perez, Sonia; Ligeti, Balázs; Kertész-Farkas, Attila; Pongor, Sándor

    2013-01-01

    The Java BioWareHouse (JBioWH) project is an open-source platform-independent programming framework that allows a user to build his/her own integrated database from the most popular data sources. JBioWH can be used for intensive querying of multiple data sources and the creation of streamlined task-specific data sets on local PCs. JBioWH is based on a MySQL relational database scheme and includes JAVA API parser functions for retrieving data from 20 public databases (e.g. NCBI, KEGG, etc.). It also includes a client desktop application for (non-programmer) users to query data. In addition, JBioWH can be tailored for use in specific circumstances, including the handling of massive queries for high-throughput analyses or CPU intensive calculations. The framework is provided with complete documentation and application examples and it can be downloaded from the Project Web site at http://code.google.com/p/jbiowh. A MySQL server is available for demonstration purposes at hydrax.icgeb.trieste.it:3307. Database URL: http://code.google.com/p/jbiowh.

  8. MDB: the Metalloprotein Database and Browser at The Scripps Research Institute

    PubMed Central

    Castagnetto, Jesus M.; Hennessy, Sean W.; Roberts, Victoria A.; Getzoff, Elizabeth D.; Tainer, John A.; Pique, Michael E.

    2002-01-01

    The Metalloprotein Database and Browser (MDB; http://metallo.scripps.edu) at The Scripps Research Institute is a web-accessible resource for metalloprotein research. It offers the scientific community quantitative information on geometrical parameters of metal-binding sites in protein structures available from the Protein Data Bank (PDB). The MDB also offers analytical tools for the examination of trends or patterns in the indexed metal-binding sites. A user can perform interactive searches, metal-site structure visualization (via a Java applet), and analysis of the quantitative data by accessing the MDB through a web browser without requiring an external application or platform-dependent plugin. The MDB also has a non-interactive interface with which other web sites and network-aware applications can seamlessly incorporate data or statistical analysis results from metal-binding sites. The information contained in the MDB is periodically updated with automated algorithms that find and index metal sites from new protein structures released by the PDB. PMID:11752342

  9. Cleanups In My Community (CIMC) - Removals/Responses, National Layer

    EPA Pesticide Factsheets

    This data layer provides access to Removal/Response sites as part of the CIMC web service. Removals are hazardous substance releases that require immediate or short-term response actions. These are generally addressed under the Emergency Response program and are initially tracked centrally by the federal government's National Reporting Center. Cleanups in My Community maps and lists removals that are included in EPA's epaosc.org site, and provides direct links to information on these sites. CIMC obtains updated removal data through a web service from epaosc.org just before the 18th of each month. The CIMC web service was initially published in 2013, but the data are updated on the 18th of each month. The full schedule for data updates in CIMC is located here: http://iaspub.epa.gov/enviro/data_update_v2.

  10. Increasing public understanding of transgenic crops through the World Wide Web.

    PubMed

    Byrne, Patrick F; Namuth, Deana M; Harrington, Judy; Ward, Sarah M; Lee, Donald J; Hain, Patricia

    2002-07-01

    Transgenic crops are among the most controversial "science and society" issues of recent years. Because of the complex techniques involved in creating these crops and the polarized debate over their risks and benefits, a critical need has arisen for accessible and balanced information on this technology. World Wide Web sites offer several advantages for disseminating information on a fast-changing technical topic, including their global accessibility and their ability to update information frequently, incorporate multimedia formats, and link to networks of other sites. An alliance between two complementary web sites at Colorado State University and the University of Nebraska-Lincoln takes advantage of the web environment to help fill the need for public information on crop genetic engineering. This article describes the objectives and features of each site. Viewership data and other feedback have shown these web sites to be effective means of reaching public audiences on a complex scientific topic.

  11. Behind the Numbers: Why Web Analytics Matter to Your Institution

    ERIC Educational Resources Information Center

    Thayer, Shelby

    2011-01-01

    Web analytics measure, collect, analyze, and report Internet data that help website managers improve the effectiveness of the site and its marketing efforts by allowing them to better understand how users interact with the site. Applying this data can help drive the right people to the website and keep them there. According to Joshua Dodson, Web…

  12. Disclaimer | NOAA Gulf Spill Restoration

    Science.gov Websites

    ... other information resources available on the World Wide Web, and NOAA does not control and cannot ... reliability, or completeness of furnished data. Non-federal sites are identified on this site with an icon indicating a link to a non-federal government web site; this link does not imply endorsement. ...

  13. VAAPA: a web platform for visualization and analysis of alternative polyadenylation.

    PubMed

    Guan, Jinting; Fu, Jingyi; Wu, Mingcheng; Chen, Longteng; Ji, Guoli; Quinn Li, Qingshun; Wu, Xiaohui

    2015-02-01

    Polyadenylation [poly(A)] is an essential process during the maturation of most mRNAs in eukaryotes. Alternative polyadenylation (APA) as an important layer of gene expression regulation has been increasingly recognized in various species. Here, a web platform for visualization and analysis of alternative polyadenylation (VAAPA) was developed. This platform can visualize the distribution of poly(A) sites and poly(A) clusters of a gene or a section of a chromosome. It can also highlight genes with switched APA sites among different conditions. VAAPA is an easy-to-use web-based tool that provides functions of poly(A) site query, data uploading, downloading, and APA sites visualization. It was designed in a multi-tier architecture and developed based on Smart GWT (Google Web Toolkit) using Java as the development language. VAAPA will be a valuable addition to the community for the comprehensive study of APA, not only by making the high quality poly(A) site data more accessible, but also by providing users with numerous valuable functions for poly(A) site analysis and visualization. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Rates and Determinants of Uptake and Use of an Internet Physical Activity and Weight Management Program in Office and Manufacturing Work Sites in England: Cohort Study

    PubMed Central

    Hurling, Robert; Bataveljic, Ogi; Fairley, Bruce W; Hurst, Tina L; Murray, Peter; Rennie, Kirsten L; Tomkins, Chris E; Finn, Anne; Cobain, Mark R; Pearson, Dympna A; Foreyt, John P

    2008-01-01

    Background Internet-based physical activity (PA) and weight management programs have the potential to improve employees’ health in large occupational health settings. To be successful, the program must engage a wide range of employees, especially those at risk of weight gain or ill health. Objective The aim of the study was to assess the use and nonuse (user attrition) of a Web-based and monitoring device–based PA and weight management program in a range of employees and to determine if engagement with the program was related to the employees’ baseline characteristics or measured outcomes. Methods Longitudinal observational study of a cohort of employees having access to the MiLife Web-based automated behavior change system. Employees were recruited from manufacturing and office sites in the North West and the South of England. Baseline health data were collected, and participants were given devices to monitor their weight and PA via data upload to the website. Website use, PA, and weight data were collected throughout the 12-week program. Results Overall, 12% of employees at the four sites (265/2302) agreed to participate in the program, with 130 men (49%) and 135 women (51%), and of these, 233 went on to start the program. During the program, the dropout rate was 5% (11/233). Of the remaining 222 Web program users, 173 (78%) were using the program at the end of the 12 weeks, with 69% (153/222) continuing after this period. Engagement with the program varied by site but was not significantly different between the office and factory sites. During the first 2 weeks, participants used the website, on average, 6 times per week, suggesting an initial learning period after which the frequency of website log-in was typically 2 visits per week and 7 minutes per visit. Employees who uploaded weight data had a significant reduction in weight (−2.6 kg, SD 3.2, P< .001). The reduction in weight was largest for employees using the program’s weight loss mode (−3.4 kg, SD 3.5). Mean PA level recorded throughout the program was 173 minutes (SE 12.8) of moderate/high intensity PA per week. Website interaction time was higher and attrition rates were lower (OR 1.38, P= .03) in those individuals with the greatest weight loss. Conclusions This Web-based PA and weight management program showed high levels of engagement across a wide range of employees, including overweight or obese workers, shift workers, and those who do not work with computers. Weight loss was observed at both office and manufacturing sites. The use of monitoring devices to capture and send data to the automated Web-based coaching program may have influenced the high levels of engagement observed in this study. When combined with objective monitoring devices for PA and weight, both use of the website and outcomes can be tracked, allowing the online coaching program to become more personalized to the individual. PMID:19117828

  15. Accessibility Trends among Academic Library and Library School Web Sites in the USA and Canada

    ERIC Educational Resources Information Center

    Schmetzke, Axel; Comeaux, David

    2009-01-01

    This paper focuses on the accessibility of North American library and library school Web sites for all users, including those with disabilities. Web accessibility data collected in 2006 are compared to those of 2000 and 2002. The findings of this follow-up study continue to give cause for concern: Despite improvements since 2002, library and…

  16. Smart caching based on mobile agent of power WebGIS platform.

    PubMed

    Wang, Xiaohui; Wu, Kehe; Chen, Fei

    2013-01-01

    Power information construction is developing in an intensive, platform-based, distributed direction with the expansion of the power grid and improvements in information technology. To meet this trend, a power WebGIS platform was designed and developed. In this paper, we first discuss the architecture and functionality of the power WebGIS, and then we study its caching technology in detail, which comprises a dynamic display cache model, a caching structure based on mobile agents, and a cache data model. We designed experiments with different data capacities to compare the performance of the WebGIS with the proposed caching model against a traditional WebGIS. The experimental results showed that, with the same hardware environment, the response time of the WebGIS both with and without the caching model increased as data capacity grew, and the larger the data, the greater the performance improvement of the WebGIS with the proposed caching model.
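
    The sketch below illustrates the general idea of caching display data in a WebGIS with a simple least-recently-used tile cache; it is only a conceptual stand-in and does not reproduce the paper's mobile-agent caching structure or cache data model.

    from collections import OrderedDict

    class TileCache:
        def __init__(self, capacity: int = 256):
            self.capacity = capacity
            self._store: OrderedDict[str, bytes] = OrderedDict()

        def get(self, key: str) -> bytes | None:
            if key not in self._store:
                return None                      # miss: caller fetches from the map server
            self._store.move_to_end(key)         # mark as most recently used
            return self._store[key]

        def put(self, key: str, tile: bytes) -> None:
            self._store[key] = tile
            self._store.move_to_end(key)
            if len(self._store) > self.capacity:
                self._store.popitem(last=False)  # evict the least recently used tile

    cache = TileCache(capacity=2)
    cache.put("z3/x1/y2", b"...png bytes...")
    print(cache.get("z3/x1/y2") is not None)     # True: served from cache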

  17. Ondex Web: web-based visualization and exploration of heterogeneous biological networks.

    PubMed

    Taubert, Jan; Hassani-Pak, Keywan; Castells-Brooke, Nathalie; Rawlings, Christopher J

    2014-04-01

    Ondex Web is a new web-based implementation of the network visualization and exploration tools from the Ondex data integration platform. New features such as context-sensitive menus and annotation tools provide users with intuitive ways to explore and manipulate the appearance of heterogeneous biological networks. Ondex Web is open source, written in Java and can be easily embedded into Web sites as an applet. Ondex Web supports loading data from a variety of network formats, such as XGMML, NWB, Pajek and OXL. http://ondex.rothamsted.ac.uk/OndexWeb.

  18. Internet food marketing strategies aimed at children and adolescents: a content analysis of food and beverage brand web sites.

    PubMed

    Weber, Kristi; Story, Mary; Harnack, Lisa

    2006-09-01

    Americans are spending an increasing amount of time using "new media" like the Internet. There has been little research examining food and beverage Web sites' content and marketing practices, especially those that attract children and adolescents. The purpose of this study was to conduct a content analysis of food- and beverage-brand Web sites and the marketing techniques and advertising strategies present on these sites. The top five brands in eight food and beverage categories, 40 brands in total, were selected based on annual sales data from Brandweek magazine's annual "Superbrands" report. Data were collected using a standardized coding form. The results show a wide variety of Internet marketing techniques and advertising strategies targeting children and adolescents. "Advergaming" (games in which the advertised product is part of the game) was present on 63% of the Web sites. Half or more of the Web sites used cartoon characters (50%) or spokescharacters (55%), or had a specially designated children's area (58%) with a direct link from the homepage. With interactive media still in its developmental stage, there is a need to develop safeguards for children. Food and nutrition professionals need to advocate for responsible marketing techniques that will support the health of children.

  19. ProBiS-ligands: a web server for prediction of ligands by examination of protein binding sites.

    PubMed

    Konc, Janez; Janežič, Dušanka

    2014-07-01

    The ProBiS-ligands web server predicts binding of ligands to a protein structure. Starting with a protein structure or binding site, ProBiS-ligands first identifies template proteins in the Protein Data Bank that share similar binding sites. Based on the superimpositions of the query protein and the similar binding sites found, the server then transposes the ligand structures from those sites to the query protein. Such ligand prediction supports many activities, e.g. drug repurposing. The ProBiS-ligands web server, an extension of the ProBiS web server, is open and free to all users at http://probis.cmm.ki.si/ligands. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Analyzing traffic source impact on returning visitors ratio in information provider website

    NASA Astrophysics Data System (ADS)

    Prasetio, A.; Sari, P. K.; Sharif, O. O.; Sofyan, E.

    2016-04-01

    Web site performance, especially returning visitors, is an important metric for an information provider web site. Since a high proportion of returning visitors is a good indication of a web site's visitor loyalty, it is important to find ways to improve this metric. This research investigated whether there is any difference in the returning visitor metric among three web traffic sources, namely direct, referral, and search. Monthly returning visitors and total visitors from each source were retrieved from the Google Analytics tool and then used to calculate the returning visitor ratio. The period of data observation is from July 2012 to June 2015, resulting in a total of 108 samples. These data were then analysed using one-way analysis of variance (ANOVA) to address our research question. The results showed that different traffic sources have significantly different returning visitor ratios, especially between the referral traffic source and the other two traffic sources. On the other hand, this research did not find any significant difference between the returning visitor ratios from the direct and search traffic sources. The owner of the web site can focus on multiplying referral links from other relevant sites.
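
    For illustration, the sketch below runs the same kind of one-way ANOVA with SciPy on invented monthly returning-visitor ratios; the values are placeholders, not the 108 observations used in the study.

    from scipy.stats import f_oneway

    # Invented monthly returning-visitor ratios for each traffic source.
    direct   = [0.41, 0.38, 0.44, 0.40, 0.39, 0.42]
    referral = [0.55, 0.57, 0.52, 0.58, 0.54, 0.56]
    search   = [0.40, 0.43, 0.39, 0.41, 0.42, 0.38]

    f_stat, p_value = f_oneway(direct, referral, search)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")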

  1. Site-characteristic and hydrologic data for selected wells and springs on Federal land in Clark County, Nevada

    USGS Publications Warehouse

    Pavelko, Michael T.

    2014-01-01

    Site-characteristic and hydrologic data for selected wells and springs on U.S. Bureau of Land Management, National Park Service, U.S. Fish and Wildlife Service, and U.S. Forest Service land in Clark County, Nevada, were updated in the U.S. Geological Survey’s National Water Information System (NWIS) to facilitate multi-agency research. Data were researched and reviewed, sites were visited, and NWIS data were updated for 231 wells and 198 springs, including 36 wells and 67 springs that were added to NWIS and 44 duplicate sites that were deleted. The site-characteristic and hydrologic data collected, reviewed, edited, and added to NWIS include locations, well water levels, spring discharges, and water chemistry. Site-characteristic and hydrologic data can be accessed from links to the NWIS web interface; data not available through the web interface are presented in appendixes to this report.

  2. Evaluation of Norwegian cancer hospitals' Web sites and explorative survey among cancer patients on their use of the Internet

    PubMed Central

    2001-01-01

    Background Hospital homepages should provide comprehensive information on the hospital's services, such as departments and treatments available, prices, waiting time, leisure facilities, and other information important for patients and their relatives. Norway, with its population of approximately 4.3 million, ranks among the top countries globally for its ability to absorb and use technology. It is unclear to what degree Norwegian hospitals and patients use the Internet for information about health services. Objectives This study was undertaken to evaluate the quality of the biggest Norwegian cancer hospitals' Web sites and to gather some preliminary data on patients' use of the Internet. Methods In January 2001, we analyzed Web sites of 5 of the 7 biggest Norwegian hospitals treating cancer patients using a scoring system. The scoring instrument was based on recommendations developed by the Norwegian Central Information Service for Web sites and reflects the scope and depth of service information offered on hospital Web pages. In addition, 31 cancer patients visiting one hospital-based medical oncologist were surveyed about their use of the Internet. Results Of the 7 hospitals, 5 had a Web site. The Web sites differed markedly in quality. Types of information included - and number of Web sites that included each type of information - were, for example: search option, 1; interpreter service, 2; date of last update, 2; postal address, phone number, and e-mail service, 3; information in English, 2. None of the Web sites included information on waiting time or prices. Of the 31 patients surveyed, 12 had personal experience using the Internet and 4 had searched for medical information. The Internet users were significantly younger (mean age 47.8 years, range 28.4-66.8 years) than the nonusers (mean age 61.8 years, range 33.1-90.0 years) ( P= 0.007). Conclusions The hospitals' Web sites offer cancer patients and relatives useful information, but the Web sites were not impressive. PMID:11772545

  3. Communication of Career Pathways Through Associate Degree Program Web Sites: A Baseline Assessment.

    PubMed

    Becker, Ellen A; Vargas, Jenny

    2018-05-08

    The American Association for Respiratory Care sponsored a series of conferences that addressed the competency of the future workforce of respiratory therapists (RTs). Based upon the findings from those conferences, several initiatives emerged that support RTs earning a baccalaureate (or bachelor's) degree. The objective of this study was to identify the ways that associate degree programs communicate career pathways toward a baccalaureate degree through their Web sites. This cross-sectional observational study used a random sample of 100 of the 362 associate degree programs approved by the Commission on Accreditation for Respiratory Care. Data were collected from 3 specific categories: demographic data, baccalaureate completion information, and the Web page location for the program. The presence of statements related to any pathway toward a bachelor's degree, transfer credits, articulation agreements, and links for baccalaureate completion were recorded. The descriptive statistics in this study were reported as total numbers and percentages. Of the 100 programs in the random sample, only 89 were included in the study. Only 39 (44%) programs had links on their program Web site that had any content related to bachelor's degrees, 16 (18%) identified college transfer courses toward a bachelor's degree, and 26 (29%) programs included baccalaureate articulation agreements on their Web site. A minority of associate degree programs communicated career pathway information to their prospective and current students through program Web sites. An informative Web site would make the path more transparent for entry-level students to meet their future educational needs as their careers progress. Copyright © 2018 by Daedalus Enterprises.

  4. Discovering How Students Search a University Web Site: A Comparative Usability Case Study for PC and Mobile Devices

    ERIC Educational Resources Information Center

    Sengel, Erhan

    2014-01-01

    This study investigates the usability of a university web site by observing 10 participants as they complete 11 tasks defined in advance by the researchers, gathering data about effectiveness, efficiency, and satisfaction. The System Usability Scale was used to collect data about satisfaction. The research…

  5. Distance Education Programs in Texas Community & Technical Colleges: Assessing Student Support Services in a Virtual Environment.

    ERIC Educational Resources Information Center

    Luedtke, Cherry Beth

    This project evaluates the status of distance learning at 54 public, two-year community, and technical colleges in Texas. Data were collected from the Web sites of each of the institutions. The Web site data indicated that 44 of the colleges refer specifically to distance education courses offered. To assess what student support services are…

  6. 78 FR 42775 - CGI Federal, Inc., and Custom Applications Management; Transfer of Data

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-17

    ... develop applications, Web sites, Web pages, web-based applications and databases, in accordance with EPA policies and related Federal standards and procedures. The Contractor will provide…

  7. Installing a Local Copy of the Reactome Web Site and Knowledgebase

    PubMed Central

    McKay, Sheldon J; Weiser, Joel

    2015-01-01

    The Reactome project builds, maintains, and publishes a knowledgebase of biological pathways. The information in the knowledgebase is gathered from the experts in the field, peer reviewed, and edited by Reactome editorial staff and then published to the Reactome Web site, http://www.reactome.org (see UNIT 8.7; Croft et al., 2013). The Reactome software is open source and builds on top of other open-source or freely available software. Reactome data and code can be freely downloaded in its entirety and the Web site installed locally. This allows for more flexible interrogation of the data and also makes it possible to add one’s own information to the knowledgebase. PMID:26087747

  8. Porting Social Media Contributions with SIOC

    NASA Astrophysics Data System (ADS)

    Bojars, Uldis; Breslin, John G.; Decker, Stefan

    Social media sites, including social networking sites, have captured the attention of millions of users as well as billions of dollars in investment and acquisition. To better enable a user's access to multiple sites, portability between social media sites is required in terms of both (1) the personal profiles and friend networks and (2) a user's content objects expressed on each site. This requires representation mechanisms to interconnect both people and objects on the Web in an interoperable, extensible way. The Semantic Web provides the required representation mechanisms for portability between social media sites: it links people and objects to record and represent the heterogeneous ties that bind each to the other. The FOAF (Friend-of-a-Friend) initiative provides a solution to the first requirement, and this paper discusses how the SIOC (Semantically-Interlinked Online Communities) project can address the latter. By using agreed-upon Semantic Web formats like FOAF and SIOC to describe people, content objects, and the connections that bind them together, social media sites can interoperate and provide portable data by appealing to some common semantics. In this paper, we will discuss the application of Semantic Web technology to enhance current social media sites with semantics and to address issues with portability between social media sites. It has been shown that social media sites can serve as rich data sources for SIOC-based applications such as the SIOC Browser, but in the other direction, we will now show how SIOC data can be used to represent and port the diverse social media contributions (SMCs) made by users on heterogeneous sites.
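
    As a concrete illustration of the representation the paper describes, the sketch below builds a tiny FOAF + SIOC graph with rdflib, linking a person, a site account, and a post. The URIs and literal values are invented for the example; only the FOAF and SIOC vocabularies themselves come from the abstract.

```python
# Illustrative sketch of describing a user and one of their posts with FOAF
# and SIOC using rdflib. The account/post URIs are made up for the example.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import FOAF, RDF

SIOC = Namespace("http://rdfs.org/sioc/ns#")

g = Graph()
g.bind("foaf", FOAF)
g.bind("sioc", SIOC)

person = URIRef("http://example.org/people/alice#me")
account = URIRef("http://example.org/site/users/alice")
post = URIRef("http://example.org/site/posts/42")

g.add((person, RDF.type, FOAF.Person))
g.add((person, FOAF.name, Literal("Alice")))
g.add((person, FOAF.account, account))        # links the person to a site account
g.add((account, RDF.type, SIOC.UserAccount))
g.add((post, RDF.type, SIOC.Post))
g.add((post, SIOC.has_creator, account))      # ties the content object to the account
g.add((post, SIOC.content, Literal("Example social media contribution")))

print(g.serialize(format="turtle"))
```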

  9. Cloud Computing for Protein-Ligand Binding Site Comparison

    PubMed Central

    2013-01-01

    The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross reactivity and toxicity. The well-known and commonly used software, SMAP, has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high availability, high performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computer-intensive questions in biology and drug discovery. PMID:23762824
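
    The map/reduce pattern mentioned above can be illustrated with a toy pairwise comparison. In the sketch below, the binding sites, the Jaccard similarity function, and the grouping step are placeholders chosen for brevity; they are not SMAP's 3D comparison algorithm or Cloud-PLBS's Hadoop implementation.

```python
# Toy illustration of the MapReduce pattern Cloud-PLBS is described as using:
# map each binding-site pair to a similarity record, then reduce (group) the
# records per query site. The similarity function here is a placeholder and
# bears no relation to SMAP's actual 3D comparison algorithm.
from collections import defaultdict
from itertools import combinations

sites = {
    "siteA": {"ALA", "HIS", "SER", "GLY"},
    "siteB": {"HIS", "SER", "ASP", "GLY"},
    "siteC": {"TRP", "PHE", "LEU"},
}

def mapper(pair):
    """Emit (site, (other_site, similarity)) records for one pair."""
    a, b = pair
    jaccard = len(sites[a] & sites[b]) / len(sites[a] | sites[b])
    yield a, (b, jaccard)
    yield b, (a, jaccard)

def reducer(records):
    """Group similarity records by site, sorted best-first."""
    grouped = defaultdict(list)
    for key, value in records:
        grouped[key].append(value)
    return {k: sorted(v, key=lambda x: -x[1]) for k, v in grouped.items()}

mapped = (rec for pair in combinations(sites, 2) for rec in mapper(pair))
print(reducer(mapped))
```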

  10. Cloud computing for protein-ligand binding site comparison.

    PubMed

    Hung, Che-Lun; Hua, Guan-Jie

    2013-01-01

    The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross reactivity and toxicity. The well-known and commonly used software, SMAP, has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high availability, high performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computer-intensive questions in biology and drug discovery.

  11. Informing child welfare policy and practice: using knowledge discovery and data mining technology via a dynamic Web site.

    PubMed

    Duncan, Dean F; Kum, Hye-Chung; Weigensberg, Elizabeth Caplick; Flair, Kimberly A; Stewart, C Joy

    2008-11-01

    Proper management and implementation of an effective child welfare agency requires the constant use of information about the experiences and outcomes of children involved in the system, emphasizing the need for comprehensive, timely, and accurate data. In the past 20 years, there have been many advances in technology that can maximize the potential of administrative data to promote better evaluation and management in the field of child welfare. Specifically, this article discusses the use of knowledge discovery and data mining (KDD), which makes it possible to create longitudinal data files from administrative data sources, extract valuable knowledge, and make the information available via a user-friendly public Web site. This article demonstrates a successful project in North Carolina where knowledge discovery and data mining technology was used to develop a comprehensive set of child welfare outcomes available through a public Web site to facilitate information sharing of child welfare data to improve policy and practice.

  12. Smart Caching Based on Mobile Agent of Power WebGIS Platform

    PubMed Central

    Wang, Xiaohui; Wu, Kehe; Chen, Fei

    2013-01-01

    Power information systems are developing in an intensive, platform-based, and distributed direction with the expansion of the power grid and improvements in information technology. To meet this trend, a power WebGIS was designed and developed. In this paper, we first discuss the architecture and functionality of the power WebGIS, and then we study its caching technology in detail, which comprises a dynamic display cache model, a caching structure based on mobile agents, and a cache data model. We designed experiments with different data capacities to compare the performance of the WebGIS with the proposed caching model against a traditional WebGIS. The experimental results showed that, in the same hardware environment, the response time of the WebGIS both with and without the caching model increased as data capacity grew, and the larger the data, the greater the performance improvement of the WebGIS with the proposed caching model. PMID:24288504
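
    The following sketch is a generic illustration of caching rendered map tiles so that repeated display requests are served from memory. It is not the paper's mobile-agent-based caching structure; the tile key layout and the rendering stub are assumptions made for the example.

```python
# Generic illustration of caching rendered map tiles so repeated display
# requests avoid recomputation. Tile keys and the rendering stub are assumed.
from collections import OrderedDict

class TileCache:
    """A small least-recently-used cache keyed by (layer, zoom, x, y)."""
    def __init__(self, capacity=256):
        self.capacity = capacity
        self._store = OrderedDict()

    def get(self, key, render):
        if key in self._store:
            self._store.move_to_end(key)       # mark as recently used
            return self._store[key]
        tile = render(key)                     # cache miss: render the tile
        self._store[key] = tile
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)    # evict the least recently used
        return tile

def render_tile(key):
    layer, zoom, x, y = key
    return f"<tile {layer} z={zoom} x={x} y={y}>"  # stand-in for real rendering

cache = TileCache(capacity=2)
print(cache.get(("grid", 12, 3, 5), render_tile))   # miss, rendered
print(cache.get(("grid", 12, 3, 5), render_tile))   # hit, served from cache
```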

  13. Food-web dynamics in a large river discontinuum

    USGS Publications Warehouse

    Cross, Wyatt F.; Baxter, Colden V.; Rosi-Marshall, Emma J.; Hall, Robert O.; Kennedy, Theodore A.; Donner, Kevin C.; Kelly, Holly A. Wellard; Seegert, Sarah E.Z.; Behn, Kathrine E.; Yard, Michael D.

    2013-01-01

    Nearly all ecosystems have been altered by human activities, and most communities are now composed of interacting species that have not co-evolved. These changes may modify species interactions, energy and material flows, and food-web stability. Although structural changes to ecosystems have been widely reported, few studies have linked such changes to dynamic food-web attributes and patterns of energy flow. Moreover, there have been few tests of food-web stability theory in highly disturbed and intensely managed freshwater ecosystems. Such synthetic approaches are needed for predicting the future trajectory of ecosystems, including how they may respond to natural or anthropogenic perturbations. We constructed flow food webs at six locations along a 386-km segment of the Colorado River in Grand Canyon (Arizona, USA) for three years. We characterized food-web structure and production, trophic basis of production, energy efficiencies, and interaction-strength distributions across a spatial gradient of perturbation (i.e., distance from Glen Canyon Dam), as well as before and after an experimental flood. We found strong longitudinal patterns in food-web characteristics that strongly correlated with the spatial position of large tributaries. Above tributaries, food webs were dominated by nonnative New Zealand mudsnails (62% of production) and nonnative rainbow trout (100% of fish production). The simple structure of these food webs led to few dominant energy pathways (diatoms to few invertebrate taxa to rainbow trout) and large energy inefficiencies. Below large tributaries, invertebrate production declined ∼18-fold, while fish production remained similar to upstream sites and comprised predominately native taxa (80–100% of production). Sites below large tributaries had increasingly reticulate and detritus-based food webs with a higher prevalence of omnivory, as well as interaction strength distributions more typical of theoretically stable food webs (i.e., nearly twofold higher proportion of weak interactions). Consistent with theory, downstream food webs were less responsive to the experimental flood than sites closest to the dam. We show how human-induced shifts to food-web structure can affect energy flow and interaction strengths, and we show that these changes have consequences for food-web function and response to perturbations.

  14. U.S. Geological Survey World Wide Web Information

    USGS Publications Warehouse

    ,

    2000-01-01

    The U.S. Geological Survey (USGS) invites you to explore an earth science virtual library of digital information, publications, and data. The USGS World Wide Web sites offer an array of information that reflects scientific research and monitoring programs conducted in the areas of natural hazards, environmental resources, and cartography. This list provides gateways to access a cross section of the digital information on the USGS World Wide Web sites.

  15. U.S. Geological Survey World Wide Web Information

    USGS Publications Warehouse

    ,

    2003-01-01

    The U.S. Geological Survey (USGS) invites you to explore an earth science virtual library of digital information, publications, and data. The USGS World Wide Web sites offer an array of information that reflects scientific research and monitoring programs conducted in the areas of natural hazards, environmental resources, and cartography. This list provides gateways to access a cross section of the digital information on the USGS World Wide Web sites.

  16. U.S. Geological Survey World Wide Web Information

    USGS Publications Warehouse

    ,

    1999-01-01

    The U.S. Geological Survey (USGS) invites you to explore an earth science virtual library of digital information, publications, and data. The USGS Internet World Wide Web sites offer an array of information that reflects scientific research and monitoring programs conducted in the areas of natural hazards, environmental resources, and cartography. This list provides gateways to access a cross section of the digital information on the USGS World Wide Web sites.

  17. U.S. Geological Survey World Wide Web information

    USGS Publications Warehouse

    ,

    1997-01-01

    The U.S. Geological Survey (USGS) invites you to explore an earth science virtual library of digital information, publications, and data. The USGS Internet World Wide Web sites offer an array of information that reflects scientific research and monitoring programs conducted in the areas of natural hazards, environmental resources, and cartography. This list provides gateways to access a cross section of the digital information on the USGS World Wide Web sites.

  18. Surfing the Web for Science: Early Data on the Users and Uses of The Why Files.

    ERIC Educational Resources Information Center

    Eveland, William P., Jr.; Dunwoody, Sharon

    1998-01-01

    This brief offers an initial look at one science site on the World Wide Web (The Why Files: http://whyfiles.news.wise.edu) in order to consider the educational potential of this technology. The long-term goal of the studies of this site is to understand how the World Wide Web can be used to enhance science, mathematics, engineering, and technology…

  19. Global Land Survey Impervious Mapping Project Web Site

    NASA Technical Reports Server (NTRS)

    DeColstoun, Eric Brown; Phillips, Jacqueline

    2014-01-01

    The Global Land Survey Impervious Mapping Project (GLS-IMP) aims to produce the first global maps of impervious cover at the 30m spatial resolution of Landsat. The project uses Global Land Survey (GLS) Landsat data as its base but incorporates training data generated from very high resolution commercial satellite data and using a Hierarchical segmentation program called Hseg. The web site contains general project information, a high level description of the science, examples of input and output data, as well as links to other relevant projects.

  20. Using focus groups to guide development of a public health Web site.

    PubMed

    Henner, Terry A; Charles, Patricia

    2002-01-01

    This paper explores a project funded through the National Network of Libraries of Medicine to enhance effective use of the Internet by public health professionals. The processes and outcome of an effort to develop a statewide Web site for public health professionals are described. A series of focus groups was conducted as a preliminary data-gathering tool to evaluate the information needs of the target population. Results of the focus group provided a valuable framework upon which to build a successful schema for Web site development.

  1. Validity and client use of information from the World Wide Web regarding veterinary anesthesia in dogs.

    PubMed

    Hofmeister, Erik H; Watson, Victoria; Snyder, Lindsey B C; Love, Emma J

    2008-12-15

    To determine the validity of the information on the World Wide Web concerning veterinary anesthesia in dogs and to determine the methods dog owners use to obtain that information. Web-based search and client survey. 73 Web sites and 92 clients. Web sites were scored on a 5-point scale for completeness and accuracy of information about veterinary anesthesia by 3 board-certified anesthesiologists. A search for anesthetic information regarding 49 specific breeds of dogs was also performed. A survey was distributed to the clients who visited the University of Georgia Veterinary Teaching Hospital during a 4-month period to solicit data about sources used by clients to obtain veterinary medical information and the manner in which information obtained from Web sites was used. The general search identified 73 Web sites that included information on veterinary anesthesia; these sites received a mean score of 3.4 for accuracy and 2.5 for completeness. Of 178 Web sites identified through the breed-specific search, 57 (32%) indicated that a particular breed was sensitive to anesthesia. Of 83 usable, completed surveys, 72 (87%) indicated the client used the Web for veterinary medical information. Fifteen clients (18%) indicated they believed their animal was sensitive to anesthesia because of its breed. Information available on the internet regarding anesthesia in dogs is generally not complete and may be misleading with respect to risks to specific breeds. Consequently, veterinarians should appropriately educate clients regarding anesthetic risk to their particular dog.

  2. Using USNO's API to Obtain Data

    NASA Astrophysics Data System (ADS)

    Lesniak, Michael V.; Pozniak, Daniel; Punnoose, Tarun

    2015-01-01

    The U.S. Naval Observatory (USNO) is in the process of modernizing its publicly available web services into APIs (Application Programming Interfaces). Services configured as APIs offer greater flexibility to the user and allow greater usage. Depending on the particular service, users who implement our APIs will receive either a PNG (Portable Network Graphics) image or data in JSON (JavaScript Object Notation) format. This raw data can then be embedded in third-party web sites or in apps. Part of the USNO's mission is to provide astronomical and timing data to government agencies and the general public. To this end, the USNO provides accurate computations of astronomical phenomena such as dates of lunar phases, rise and set times of the Moon and Sun, and lunar and solar eclipse times. Users who navigate to our web site and select one of our 18 services are prompted to complete a web form, specifying parameters such as date, time, location, and object. Many of our services work for years between 1700 and 2100, meaning that past, present, and future events can be computed. Upon form submission, our web server processes the request, computes the data, and outputs it to the user. Over recent years, the use of the web by the general public has vastly changed. In response to this, the USNO is modernizing its web-based data services. This includes making our computed data easier to embed within third-party web sites as well as easier to query from apps running on tablets and smart phones. To facilitate this, the USNO has begun converting its services into APIs. In addition to the existing web forms for the various services, users are able to make direct URL requests that return either an image or numerical data. To date, four of our web services have been configured to run with APIs. Two are image-producing services: "Apparent Disk of a Solar System Object" and "Day and Night Across the Earth." Two API data services are "Complete Sun and Moon Data for One Day" and "Dates of Primary Phases of the Moon." Instructions for how to use our API services as well as examples of their use can be found on one of our explanatory web pages and will be discussed here.
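
    A direct-URL request of the kind described above might look like the sketch below. The endpoint URL and parameter names are placeholders invented for illustration; the USNO documentation should be consulted for the actual API paths and parameters.

```python
# Sketch of the direct-URL request style described above. The endpoint path
# and parameter names are placeholders for illustration only; the real USNO
# API documentation defines the actual URL and parameters.
import requests

BASE_URL = "https://example.org/usno-api/moon/phases"  # placeholder, not the real endpoint

params = {
    "date": "2015-01-05",   # hypothetical parameter names
    "numphases": 4,
}

response = requests.get(BASE_URL, params=params, timeout=30)
response.raise_for_status()

data = response.json()  # data services return JSON per the abstract
print(data)
```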

  3. The Maryland power plant research program internet resource for precipitation chemistry data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corio, L.A.; Jones, W.B.; Sherwell, J.

    1999-07-01

    The Maryland Department of Natural Resources Power Plant Research Program (PPRP) initiated a project in 1998 to make available on the World Wide Web (WWW), precipitation chemistry data from monitoring sites located in the Chesapeake Bay watershed. To that end, PPRP obtained, from various organizations, background information on atmospheric deposition monitoring programs (some of which are still on-going), as well as special studies. For those programs and studies with available precipitation chemistry data of known quality (data were not available for all programs and studies), PPRP obtained, processed, and uploaded the data to its WWW site (www.versar.com/pprp/features/aciddep/aciddep.htm). These data can either be viewed on the web site or downloaded as a zipped file in either comma-delimited or Excel spreadsheet format. PPRP also provides descriptions of the monitoring programs/studies, including information on measurement methods and quality assurance procedures, where available. For the few monitoring programs (e.g., NADP) with existing web sites that allow on-line access to data, PPRP provides links to these sites. PPRP currently is working with the National Oceanic and Atmospheric Administration (NOAA) Air Resources Laboratory (ARL) in a cooperative effort to make more precipitation chemistry data easily available to the scientific community.

  4. A user-oriented web crawler for selectively acquiring online content in e-health research

    PubMed Central

    Xu, Songhua; Yoon, Hong-Jun; Tourassi, Georgia

    2014-01-01

    Motivation: Life stories of diseased and healthy individuals are abundantly available on the Internet. Collecting and mining such online content can offer many valuable insights into patients’ physical and emotional states throughout the pre-diagnosis, diagnosis, treatment and post-treatment stages of the disease compared with those of healthy subjects. However, such content is widely dispersed across the web. Using traditional query-based search engines to manually collect relevant materials is rather labor intensive and often incomplete due to resource constraints in terms of human query composition and result parsing efforts. The alternative option, blindly crawling the whole web, has proven inefficient and unaffordable for e-health researchers. Results: We propose a user-oriented web crawler that adaptively acquires user-desired content on the Internet to meet the specific online data source acquisition needs of e-health researchers. Experimental results on two cancer-related case studies show that the new crawler can substantially accelerate the acquisition of highly relevant online content compared with the existing state-of-the-art adaptive web crawling technology. For the breast cancer case study using the full training set, the new method achieves a cumulative precision between 74.7 and 79.4% after 5 h of execution till the end of the 20-h long crawling session as compared with the cumulative precision between 32.8 and 37.0% using the peer method for the same time period. For the lung cancer case study using the full training set, the new method achieves a cumulative precision between 56.7 and 61.2% after 5 h of execution till the end of the 20-h long crawling session as compared with the cumulative precision between 29.3 and 32.4% using the peer method. Using the reduced training set in the breast cancer case study, the cumulative precision of our method is between 44.6 and 54.9%, whereas the cumulative precision of the peer method is between 24.3 and 26.3%; for the lung cancer case study using the reduced training set, the cumulative precisions of our method and the peer method are, respectively, between 35.7 and 46.7% versus between 24.1 and 29.6%. These numbers clearly show a consistently superior accuracy of our method in discovering and acquiring user-desired online content for e-health research. Availability and implementation: The implementation of our user-oriented web crawler is freely available to non-commercial users via the following Web site: http://bsec.ornl.gov/AdaptiveCrawler.shtml. The Web site provides a step-by-step guide on how to execute the web crawler implementation. In addition, the Web site provides the two study datasets including manually labeled ground truth, initial seeds and the crawling results reported in this article. Contact: xus1@ornl.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24078710
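
    The best-first crawling idea described above can be sketched in a few lines: pages are scored against topic keywords and the frontier is a priority queue, so promising links are fetched first. This is a minimal illustration, not the authors' crawler; the keyword list, scoring rule, and seed URL are assumptions.

```python
# Minimal sketch of a best-first ("focused") crawler of the kind described
# above: pages are scored by keyword relevance and the frontier is a priority
# queue, so high-scoring pages are fetched first. This is an illustration,
# not the ORNL implementation.
import heapq
import re
import requests

KEYWORDS = {"breast", "cancer", "diagnosis", "treatment"}  # example topic terms
LINK_RE = re.compile(r'href="(https?://[^"]+)"')

def relevance(text):
    """Fraction of topic keywords that appear in the page text."""
    text = text.lower()
    return sum(kw in text for kw in KEYWORDS) / len(KEYWORDS)

def crawl(seeds, max_pages=20):
    frontier = [(-1.0, url) for url in seeds]   # negate score: heapq is a min-heap
    heapq.heapify(frontier)
    seen, results = set(seeds), []
    while frontier and len(results) < max_pages:
        neg_score, url = heapq.heappop(frontier)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        score = relevance(html)
        results.append((url, score))
        for link in LINK_RE.findall(html):
            if link not in seen:
                seen.add(link)
                # The parent page's score is used as a cheap prediction of link promise.
                heapq.heappush(frontier, (-score, link))
    return results

# Example usage with a hypothetical seed page:
# print(crawl(["https://example.org/patient-stories"]))
```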

  5. Facilitating Data-Intensive Education and Research in Earth Science through Geospatial Web Services

    ERIC Educational Resources Information Center

    Deng, Meixia

    2009-01-01

    The realm of Earth science (ES) is increasingly data-intensive. Geoinformatics research attempts to robustly smooth and accelerate the flow of data to information, information to knowledge, and knowledge to decisions and to supply necessary infrastructure and tools for advancing ES. Enabling easy access to and use of large volumes of ES data and…

  6. Globe Teachers Guide and Photographic Data on the Web

    NASA Technical Reports Server (NTRS)

    Kowal, Dan

    2004-01-01

    The task of managing the GLOBE Online Teacher's Guide during this time period focused on transforming the technology behind the delivery system of this document. The web application transformed from a flat file retrieval system to a dynamic database access approach. The new methodology utilizes Java Server Pages (JSP) on the front-end and an Oracle relational database on the backend. This new approach allows users of the web site, mainly teachers, to access content efficiently by grade level and/or by investigation or educational concept area. Moreover, teachers can gain easier access to data sheets and lab and field guides. The new online guide also included updated content for all GLOBE protocols. The GLOBE web management team was given documentation for maintaining the new application. Instructions for modifying the JSP templates and managing database content were included in this document. It was delivered to the team by the end of October, 2003. The National Geophysical Data Center (NGDC) continued to manage the school study site photos on the GLOBE website. 333 study site photo images were added to the GLOBE database and posted on the web during this same time period for 64 schools. Documentation for processing study site photos was also delivered to the new GLOBE web management team. Lastly, assistance was provided in transferring reference applications such as the Cloud and LandSat quizzes and Earth Systems Online Poster from NGDC servers to GLOBE servers along with documentation for maintaining these applications.

  7. Benchmarking to Identify Practice Variation in Test Ordering: A Potential Tool for Utilization Management.

    PubMed

    Signorelli, Heather; Straseski, Joely A; Genzen, Jonathan R; Walker, Brandon S; Jackson, Brian R; Schmidt, Robert L

    2015-01-01

    Appropriate test utilization is usually evaluated by adherence to published guidelines. In many cases, medical guidelines are not available. Benchmarking has been proposed as a method to identify practice variations that may represent inappropriate testing. This study investigated the use of benchmarking to identify sites with inappropriate utilization of testing for a particular analyte. We used a Web-based survey to compare 2 measures of vitamin D utilization: overall testing intensity (ratio of total vitamin D orders to blood-count orders) and relative testing intensity (ratio of 1,25(OH)2D to 25(OH)D test orders). A total of 81 facilities contributed data. The average overall testing intensity index was 0.165, or approximately 1 vitamin D test for every 6 blood-count tests. The average relative testing intensity index was 0.055, or one 1,25(OH)2D test for every 18 of the 25(OH)D tests. Both indexes varied considerably. Benchmarking can be used as a screening tool to identify outliers that may be associated with inappropriate test utilization. Copyright© by the American Society for Clinical Pathology (ASCP).
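
    The arithmetic behind the two indices is straightforward, as the short sketch below shows. The order counts are made-up values for a single hypothetical facility, chosen only so the resulting indices are close to the reported averages.

```python
# Arithmetic behind the two benchmarking indices described above, using
# made-up order counts for a single hypothetical facility.
blood_count_orders = 10000   # complete blood count orders (assumed value)
orders_25_oh_d = 1560        # 25(OH)D orders (assumed value)
orders_1_25_oh2_d = 86       # 1,25(OH)2D orders (assumed value)
total_vitamin_d_orders = orders_25_oh_d + orders_1_25_oh2_d

overall_testing_intensity = total_vitamin_d_orders / blood_count_orders
relative_testing_intensity = orders_1_25_oh2_d / orders_25_oh_d

print(f"Overall testing intensity:  {overall_testing_intensity:.3f} "
      f"(~1 vitamin D test per {blood_count_orders / total_vitamin_d_orders:.0f} blood-count tests)")
print(f"Relative testing intensity: {relative_testing_intensity:.3f} "
      f"(~one 1,25(OH)2D test per {orders_25_oh_d / orders_1_25_oh2_d:.0f} 25(OH)D tests)")
```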

  8. A Review of Statistical Disclosure Control Techniques Employed by Web-Based Data Query Systems.

    PubMed

    Matthews, Gregory J; Harel, Ofer; Aseltine, Robert H

    We systematically reviewed the statistical disclosure control techniques employed for releasing aggregate data in Web-based data query systems listed in the National Association for Public Health Statistics and Information Systems (NAPHSIS). Each Web-based data query system was examined to see whether (1) it employed any type of cell suppression, (2) it used secondary cell suppression, and (3) suppressed cell counts could be calculated. No more than 30 minutes was spent on each system. Of the 35 systems reviewed, no suppression was observed in more than half (n = 18); observed counts below the threshold were observed in 2 sites; and suppressed values were recoverable in 9 sites. Six sites effectively suppressed small counts. This inquiry has revealed substantial weaknesses in the protective measures used in data query systems containing sensitive public health data. Many systems utilized no disclosure control whatsoever, and the vast majority of those that did deployed it inconsistently or inadequately.

  9. Accessing Digital Libraries: A Study of ARL Members' Digital Projects

    ERIC Educational Resources Information Center

    Kahl, Chad M.; Williams, Sarah C.

    2006-01-01

    To ensure efficient access to and integrated searching capabilities for their institution's new digital library projects, the authors studied Web sites of the Association of Research Libraries' (ARL) 111 academic, English-language libraries. Data were gathered on 1117 digital projects, noting library Web site and project access, metadata, and…

  10. Virtual Models of Long-Term Care

    ERIC Educational Resources Information Center

    Phenice, Lillian A.; Griffore, Robert J.

    2012-01-01

    Nursing homes, assisted living facilities and home-care organizations, use web sites to describe their services to potential consumers. This virtual ethnographic study developed models representing how potential consumers may understand this information using data from web sites of 69 long-term-care providers. The content of long-term-care web…

  11. Introduction to the Watershed Central Web Site and Watershed Wiki Mini-Workshop

    EPA Science Inventory

    Many communities across the country struggle to find the right approaches, tools and data to include in their watershed plans. EPA recently posted a new web site called "Watershed Central," a "one-stop" tool, to help watershed organizations and others find key resources to protec...

  12. Catalytic site identification—a web server to identify catalytic site structural matches throughout PDB

    PubMed Central

    Kirshner, Daniel A.; Nilmeier, Jerome P.; Lightstone, Felice C.

    2013-01-01

    The catalytic site identification web server provides the innovative capability to find structural matches to a user-specified catalytic site among all Protein Data Bank proteins rapidly (in less than a minute). The server also can examine a user-specified protein structure or model to identify structural matches to a library of catalytic sites. Finally, the server provides a database of pre-calculated matches between all Protein Data Bank proteins and the library of catalytic sites. The database has been used to derive a set of hypothesized novel enzymatic function annotations. In all cases, matches and putative binding sites (protein structure and surfaces) can be visualized interactively online. The website can be accessed at http://catsid.llnl.gov. PMID:23680785

  13. Catalytic site identification--a web server to identify catalytic site structural matches throughout PDB.

    PubMed

    Kirshner, Daniel A; Nilmeier, Jerome P; Lightstone, Felice C

    2013-07-01

    The catalytic site identification web server provides the innovative capability to find structural matches to a user-specified catalytic site among all Protein Data Bank proteins rapidly (in less than a minute). The server also can examine a user-specified protein structure or model to identify structural matches to a library of catalytic sites. Finally, the server provides a database of pre-calculated matches between all Protein Data Bank proteins and the library of catalytic sites. The database has been used to derive a set of hypothesized novel enzymatic function annotations. In all cases, matches and putative binding sites (protein structure and surfaces) can be visualized interactively online. The website can be accessed at http://catsid.llnl.gov.

  14. Design Drivers of Water Data Services

    NASA Astrophysics Data System (ADS)

    Valentine, D.; Zaslavsky, I.

    2008-12-01

    The CUAHSI Hydrologic Information System (HIS) is being developed as a geographically distributed network of hydrologic data sources and functions that are integrated using web services so that they function as a connected whole. The core of the HIS service-oriented architecture is a collection of water web services, which provide uniform access to multiple repositories of observation data. These services use SOAP protocols communicating WaterML (Water Markup Language). When a client makes a data or metadata request using a CUAHSI HIS web service, these requests are made in a standard manner, following the CUAHSI HIS web service signatures - regardless of how the underlying data source may be organized. Also, regardless of the format in which the data are returned by the source, the web services respond to requests by returning the data in a standard format of WaterML. The goal of WaterML design has been to capture semantics of hydrologic observations discovery and retrieval and express the point observations information model as an XML schema. To a large extent, it follows the representation of the information model as adopted by the CUAHSI Observations Data Model (ODM) relational design. Another driver of WaterML design is specifications and metadata adopted by USGS NWIS, EPA STORET, and other federal agencies, as it seeks to provide a common foundation for exchanging both agency data and data collected in multiple academic projects. Another WaterML design principle was to create, in version 1 of HIS in particular, a fairly rigid and simple XML schema which is easy to generate and parse, thus creating the least barrier for adoption by hydrologists. WaterML includes a series of elements that reflect common notions used in describing hydrologic observations, such as site, variable, source, observation series, seriesCatalog, and data values. Each of the three main request methods in the water web services - GetSiteInfo, GetVariableInfo, and GetValues - has a corresponding response element in WaterML: SitesResponse, VariableResponse, and TimeSeriesResponse. The WaterML specification is being adopted by federal agencies. The experimental USGS NWIS Daily Values web service returns a WaterML-compliant TimeSeriesResponse. The National Climatic Data Center is also prototyping WaterML for data delivery, and has developed a REST-based service that generates WaterML-compliant output for the NCDC ASOS network. Such agency-supported web services coming online provide a much more efficient way to deliver agency data compared to the web site scraper services that the CUAHSI HIS project has developed initially. The CUAHSI water data web services will continue to serve as the main communication mechanism within CUAHSI HIS, connecting a variety of data sources with a growing set of web service clients being developed in both academia and the commercial sector. The driving forces for the development of web services continue to be: - Application experience and needs of the growing number of CUAHSI HIS users, who experiment with additional data types, analysis modes, data browsing and searching strategies, and provide feedback to WaterML developers; - Data description requirements posed by various federal and state agencies; - Harmonization with standards being adopted or developed in neighboring communities, in particular the relevant standards being explored within the Open Geospatial Consortium. CUAHSI WaterML is a standard output schema for CUAHSI HIS water web services.
Its formal specification is available as an OGC discussion paper at www.opengeospatial.org/standards/dp/.
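
    To make the request/response pairing concrete, the sketch below parses a simplified, WaterML-like GetValues (TimeSeriesResponse) document. The XML is a schematic stand-in that reuses element names mentioned above; it is not the actual WaterML schema.

```python
# Schematic illustration of consuming a GetValues-style response. The XML
# below is a simplified stand-in that reuses element names mentioned in the
# abstract (site, variable, values); it is not the actual WaterML schema.
import xml.etree.ElementTree as ET

sample_response = """
<timeSeriesResponse>
  <timeSeries>
    <sourceInfo><siteName>Example Creek near Somewhere</siteName></sourceInfo>
    <variable><variableName>Discharge</variableName><unit>cfs</unit></variable>
    <values>
      <value dateTime="2008-07-01T00:00:00">12.4</value>
      <value dateTime="2008-07-01T01:00:00">12.1</value>
    </values>
  </timeSeries>
</timeSeriesResponse>
"""

root = ET.fromstring(sample_response)
site = root.findtext(".//siteName")
variable = root.findtext(".//variableName")
unit = root.findtext(".//unit")
series = [(v.get("dateTime"), float(v.text)) for v in root.iter("value")]

print(f"{variable} ({unit}) at {site}")
for timestamp, value in series:
    print(timestamp, value)
```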

  15. Empowering radiologic education on the Internet: a new virtual website technology for hosting interactive educational content on the World Wide Web.

    PubMed

    Frank, M S; Dreyer, K

    2001-06-01

    We describe a virtual web site hosting technology that enables educators in radiology to emblazon and make available for delivery on the world wide web their own interactive educational content, free from dependencies on in-house resources and policies. This suite of technologies includes a graphically oriented software application, designed for the computer novice, to facilitate the input, storage, and management of domain expertise within a database system. The database stores this expertise as choreographed and interlinked multimedia entities including text, imagery, interactive questions, and audio. Case-based presentations or thematic lectures can be authored locally, previewed locally within a web browser, then uploaded at will as packaged knowledge objects to an educator's (or department's) personal web site housed within a virtual server architecture. This architecture can host an unlimited number of unique educational web sites for individuals or departments in need of such service. Each virtual site's content is stored within that site's protected back-end database connected to Internet Information Server (Microsoft Corp, Redmond WA) using a suite of Active Server Page (ASP) modules that incorporate Microsoft's Active Data Objects (ADO) technology. Each person's or department's electronic teaching material appears as an independent web site with different levels of access--controlled by a username-password strategy--for teachers and students. There is essentially no static hypertext markup language (HTML). Rather, all pages displayed for a given site are rendered dynamically from case-based or thematic content that is fetched from that virtual site's database. The dynamically rendered HTML is displayed within a web browser in a Socratic fashion that can assess the recipient's current fund of knowledge while providing instantaneous user-specific feedback. Each site is emblazoned with the logo and identification of the participating institution. Individuals with teacher-level access can use a web browser to upload new content as well as manage content already stored on their virtual site. Each virtual site stores, collates, and scores participants' responses to the interactive questions posed on line. This virtual web site strategy empowers the educator with an end-to-end solution for creating interactive educational content and hosting that content within the educator's personalized and protected educational site on the world wide web, thus providing a valuable outlet that can magnify the impact of his or her talents and contributions.

  16. Bringing Terra Science to the People: 10 years of education and public outreach

    NASA Astrophysics Data System (ADS)

    Riebeek, H.; Chambers, L. H.; Yuen, K.; Herring, D.

    2009-12-01

    The default image on Apple's iPhone is a blue, white, green and tan globe: the Blue Marble. The iconic image was produced using Terra data as part of the mission's education and public outreach efforts. As far-reaching and innovative as Terra science has been over the past decade, Terra education and public outreach efforts have been equally successful. This talk will provide an overview of Terra's crosscutting education and public outreach projects, which have reached into educational facilities—classrooms, museums, and science centers, across the Internet, and into everyday life. The Earth Observatory web site was the first web site designed for the public that told the unified story of what we can learn about our planet from all space-based platforms. Initially conceived as part of Terra mission outreach in 1999, the web site has won five Webby awards, the highest recognition a web site can receive. The Visible Earth image gallery is a catalogue of NASA Earth imagery that receives more than one million page views per month. The NEO (NASA Earth Observations) web site and WMS (web mapping service) tool serves global data sets to museums and science centers across the world. Terra educational products, including the My NASA Data web service and the Students' Cloud Observations Online (S'COOL) project, bring Terra data into the classroom. Both projects target multiple grade levels, ranging from elementary school to graduate school. S'COOL uses student observations of clouds to help validate Terra data. Students and their parents have puzzled over weekly "Where on Earth" geography quizzes published on line. Perhaps the most difficult group to reach is the large segment of the public that does not seek out science information online or in a science museum or classroom. To reach these people, EarthSky produced a series of podcasts and radio broadcasts that brought Terra science to more than 30 million people in 2009. Terra imagery, including the Blue Marble, have seen wide distribution in books like Our Changing Planet and films like An Inconvenient Truth. The Blue Marble, courtesy Reto Stockli and Rob Simmon, NASA's Earth Observatory.

  17. Ultrabroadband photonic internet: safety aspects

    NASA Astrophysics Data System (ADS)

    Kalicki, Arkadiusz; Romaniuk, Ryszard

    2008-11-01

    Web applications have become the most popular medium on the Internet. Their popularity and the ease of use of web application frameworks, combined with careless development, result in a high number of vulnerabilities and attacks. Several types of attacks are possible because of improper input validation. SQL injection is the ability to execute arbitrary SQL queries in a database through an existing application. Cross-site scripting is a vulnerability that allows malicious web users to inject code into the web pages viewed by other users. Cross-Site Request Forgery (CSRF) is an attack that tricks the victim into loading a page that contains a malicious request. Web spam in blogs is a further problem. Several techniques mitigate these attacks. The most important are strong web application design, correct input validation, defined data types for each field, and parameterized statements in SQL queries. Server hardening with a firewall, modern security policy systems, and safe configuration of the web framework interpreter are essential. It is advisable to maintain a proper security level on the client side as well: keep software updated and install personal web firewalls or IDS/IPS systems. Good habits include logging out of services immediately after finishing work and using a separate web browser for the most important sites, such as e-banking.
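
    The parameterized-statement mitigation named above can be shown in a few lines. The sketch below uses an in-memory SQLite table with invented data to contrast string concatenation with a parameterized query.

```python
# Illustration of the mitigation named above: parameterized statements keep
# user input out of the SQL text, so it cannot change the query's structure.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.org')")

user_input = "alice' OR '1'='1"   # a classic injection attempt

# Vulnerable pattern: the input is concatenated into the SQL string,
# so the attacker's quote characters become part of the query itself.
unsafe_query = "SELECT email FROM users WHERE name = '" + user_input + "'"
print(conn.execute(unsafe_query).fetchall())   # the OR clause matches every row

# Parameterized pattern: the driver passes the input as data, not SQL.
safe_query = "SELECT email FROM users WHERE name = ?"
print(conn.execute(safe_query, (user_input,)).fetchall())   # returns nothing
```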

  18. The role of social networking web sites in influencing residency decisions.

    PubMed

    Schweitzer, Justin; Hannan, Alexander; Coren, Joshua

    2012-10-01

    Social networking Web sites such as Facebook have grown rapidly in popularity. It is unknown how such sites affect the ways in which medical trainees investigate and interact with graduate medical education (GME) programs. To evaluate the use of social networking Web sites as a means for osteopathic medical students, interns, residents, and fellows to interact with GME programs and report the degree to which that interaction impacts a medical trainee's choice of GME program. An anonymous, 10-item electronic survey on social networking Web sites was e-mailed to osteopathic medical student, intern, resident, and fellow members of the American College of Osteopathic Family Physicians. The weighted least squares test and the Fisher exact test were used for data analysis. A total of 9606 surveys were distributed, and 992 (10%) were completed. Nine hundred twenty-eight (93%) of the respondents used social networking Web sites, with the most popular services being Facebook (891 [90%]; P=.03), the Student Doctor Network (278 [28%]), and LinkedIn (89 [9%]; P=.03). Three hundred fifty-three respondents (36%; P=.52) were connected with a professional organization and 673 (68%; P=.73) used social networking Web sites for job searching related to GME programs or postresidency employment. Within the population of 497 third-, fourth-, and fifth-year osteopathic medical students, 136 (27%) reported gleaning information about programs through social networking Web sites (P=.01). Within the total population, 100 of 992 (10%) reported that this information influenced their decisions (P=.07). Of note, 144 (14%) of the total 992 respondents reported that the programs they applied to did not have any presence on social networking Web sites (P=.05). Our results indicate that social networking Web sites have a present and growing influence on how osteopathic medical students, interns, residents, and fellows learn about and select a GME program.

  19. AAVSO Target Tool: A Web-Based Service for Tracking Variable Star Observations (Abstract)

    NASA Astrophysics Data System (ADS)

    Burger, D.; Stassun, K. G.; Barnes, C.; Kafka, S.; Beck, S.; Li, K.

    2018-06-01

    (Abstract only) The AAVSO Target Tool is a web-based interface for bringing stars in need of observation to the attention of AAVSO's network of amateur and professional astronomers. The site currently tracks over 700 targets of interest, collecting data from them on a regular basis from AAVSO's servers and sorting them based on priority. While the target tool does not require a login, users can obtain visibility times for each target by signing up and entering a telescope location. Other key features of the site include filtering by AAVSO observing section, sorting by different variable types, formatting the data for printing, and exporting the data to a CSV file. The AAVSO Target Tool builds upon seven years of experience developing web applications for astronomical data analysis, most notably on Filtergraph (Burger, D., et al. 2013, Astronomical Data Analysis Software and Systems XXII, Astronomical Society of the Pacific, San Francisco, 399), and is built using the web2py web framework based on the python programming language. The target tool is available at http://filtergraph.com/aavso.

  20. Development of grid-like applications for public health using Web 2.0 mashup techniques.

    PubMed

    Scotch, Matthew; Yip, Kevin Y; Cheung, Kei-Hoi

    2008-01-01

    Development of public health informatics applications often requires the integration of multiple data sources. This process can be challenging due to issues such as different file formats, schemas, naming systems, and having to scrape the content of web pages. A potential solution to these system development challenges is the use of Web 2.0 technologies. In general, Web 2.0 technologies are new internet services that encourage and value information sharing and collaboration among individuals. In this case report, we describe the development and use of Web 2.0 technologies including Yahoo! Pipes within a public health application that integrates animal, human, and temperature data to assess the risk of West Nile Virus (WNV) outbreaks. The results of development and testing suggest that while Web 2.0 applications are reasonable environments for rapid prototyping, they are not mature enough for large-scale public health data applications. The application, in fact a "systems of systems," often failed due to varied timeouts for application response across web sites and services, internal caching errors, and software added to web sites by administrators to manage the load on their servers. In spite of these concerns, the results of this study demonstrate the potential value of grid computing and Web 2.0 approaches in public health informatics.

  1. WebDMS: A Web-Based Data Management System for Environmental Data

    NASA Astrophysics Data System (ADS)

    Ekstrand, A. L.; Haderman, M.; Chan, A.; Dye, T.; White, J. E.; Parajon, G.

    2015-12-01

    DMS is an environmental Data Management System to manage, quality-control (QC), summarize, document chain-of-custody, and disseminate data from networks ranging in size from a few sites to thousands of sites, instruments, and sensors. The server-client desktop version of DMS is used by local and regional air quality agencies (including the Bay Area Air Quality Management District, the South Coast Air Quality Management District, and the California Air Resources Board), the EPA's AirNow Program, and the EPA's AirNow-International (AirNow-I) program, which offers countries the ability to run an AirNow-like system. As AirNow's core data processing engine, DMS ingests, QCs, and stores real-time data from over 30,000 active sensors at over 5,280 air quality and meteorological sites from over 130 air quality agencies across the United States. As part of the AirNow-I program, several instances of DMS are deployed in China, Mexico, and Taiwan. The U.S. Department of State's StateAir Program also uses DMS for five regions in China and plans to expand to other countries in the future. Recent development has begun to migrate DMS from an onsite desktop application to WebDMS, a web-based application designed to take advantage of cloud hosting and computing services to increase scalability and lower costs. WebDMS will continue to provide easy-to-use data analysis tools, such as time-series graphs, scatterplots, and wind- or pollution-rose diagrams, as well as allowing data to be exported to external systems such as the EPA's Air Quality System (AQS). WebDMS will also provide new GIS analysis features and a suite of web services through a RESTful web API. These changes will better meet air agency needs and allow for broader national and international use (for example, by the AirNow-I partners). We will talk about the challenges and advantages of migrating DMS to the web, modernizing the DMS user interface, and making it more cost-effective to enhance and maintain over time.
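
    As a hedged illustration of the automated quality control such a system applies to incoming observations, the sketch below flags values outside a plausibility range. The parameter names, limits, and flag labels are assumptions, not DMS's actual QC rules.

```python
# Hedged sketch of the kind of automated quality-control check a data
# management system like DMS might apply to incoming sensor values; the
# limits and flag names here are illustrative, not DMS's actual QC rules.
LIMITS = {"ozone_ppb": (0, 500), "temp_c": (-60, 60)}  # illustrative plausibility ranges

def qc_flag(parameter, value):
    """Return a simple validity flag for one observation."""
    if parameter not in LIMITS:
        return "unchecked"
    low, high = LIMITS[parameter]
    return "valid" if low <= value <= high else "suspect"

readings = [("ozone_ppb", 42.0), ("ozone_ppb", 812.0), ("temp_c", 21.5)]
for parameter, value in readings:
    print(parameter, value, qc_flag(parameter, value))
```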

  2. Designing Websites for Displaying Large Data Sets and Images on Multiple Platforms

    NASA Astrophysics Data System (ADS)

    Anderson, A.; Wolf, V. G.; Garron, J.; Kirschner, M.

    2012-12-01

    The desire to build websites that analyze and display ever-increasing amounts of scientific data and images pushes for web site designs that utilize large displays and use the display area as efficiently as possible. Yet, scientists and users of their data are increasingly wishing to access these websites in the field and on mobile devices. This results in the need to develop websites that can support a wide range of devices and screen sizes, and to optimally use whatever display area is available. Historically, designers have addressed this issue by building two websites: one for mobile devices and one for desktop environments, resulting in increased cost, duplication of work, and longer development times. Recent advancements in web design technology and techniques allow for the development of a single website that dynamically adjusts to the type of device being used to browse the website (smartphone, tablet, desktop). In addition, they provide the opportunity to truly optimize whatever display area is available. HTML5 and CSS3 give web designers media query statements that allow design style sheets to be aware of the size of the display being used, and to format web content differently based upon the queried response. Web elements can be rendered in a different size, position, or even removed from the display entirely, based upon the size of the display area. Using HTML5/CSS3 media queries in this manner is referred to as "Responsive Web Design" (RWD). RWD in combination with technologies such as LESS and Twitter Bootstrap allows the web designer to build web sites which not only dynamically respond to the browser display size being used, but do so in very controlled and intelligent ways, ensuring that good layout and graphic design principles are followed while doing so. At the University of Alaska Fairbanks, the Alaska Satellite Facility SAR Data Center (ASF) recently redesigned their popular Vertex application and converted it from a traditional, fixed-layout website into an RWD site built on HTML5, LESS and Twitter Bootstrap. Vertex is a data portal for remotely sensed imagery of the earth, offering Synthetic Aperture Radar (SAR) data products from the global ASF archive. By using Responsive Web Design, ASF is able to provide access to a massive collection of SAR imagery and allow the user to use mobile devices and desktops to maximum advantage. ASF's Vertex web site demonstrates that with increased interface flexibility, scientists, managers and users can increase their personal effectiveness by accessing data portals from their preferred device as their science dictates.

  3. Effects of extreme climatic events on small-scale spatial patterns: a 20-year study of the distribution of a desert spider.

    PubMed

    Birkhofer, Klaus; Henschel, Joh; Lubin, Yael

    2012-11-01

    Individuals of most animal species are non-randomly distributed in space. Extreme climatic events are often ignored as potential drivers of distribution patterns, and the role of such events is difficult to assess. Seothyra henscheli (Araneae, Eresidae) is a sedentary spider found in the Namib dunes in Namibia. The spider constructs a sticky-edged silk web on the sand surface, connected to a vertical, silk-lined burrow. Above-ground web structures can be damaged by strong winds or heavy rainfall, and during dispersal spiders are susceptible to environmental extremes. Locations of burrows were mapped in three field sites in 16 out of 20 years from 1987 to 2007, and these grid-based data were used to identify the relationship between spatial patterns, climatic extremes and sampling year. According to Morisita's index, individuals had an aggregated distribution in most years and field sites, and Geary's C suggests clustering up to scales of 2 m. Individuals were more aggregated in years with high maximum wind speed and low annual precipitation. Our results suggest that clustering is a temporally stable property of populations that holds even under fluctuating burrow densities. Climatic extremes, however, affect the intensity of clustering behaviour: individuals seem to be better protected in field sites with many conspecific neighbours. We suggest that burrow-site selection is driven at least partly by conspecific cuing, and this behaviour may protect populations from collapse during extreme climatic events.

  4. Four-dimensional characterization of a sheet-forming web

    DOEpatents

    Sari-Sarraf, Hamed; Goddard, James S.

    2003-04-22

    A method and apparatus are provided by which a sheet-forming web may be characterized in four dimensions. Light images of the web are recorded at a point adjacent to the initial stage of the web, for example, near the headbox in a paper-forming operation. The images are digitized, and the resulting data are processed by novel algorithms to provide a four-dimensional measurement of the web. The measurements include two-dimensional spatial information, the intensity profile of the web, and the depth profile of the web. These measurements can be used to characterize the web, predict its properties and monitor production events, and to analyze and quantify headbox flow dynamics.

  5. Graph Structure in Three National Academic Webs: Power Laws with Anomalies.

    ERIC Educational Resources Information Center

    Thelwall, Mike; Wilkinson, David

    2003-01-01

    Explains how the Web can be modeled as a mathematical graph and analyzes the graph structures of three national university publicly indexable Web sites from Australia, New Zealand, and the United Kingdom. Topics include commercial search engines and academic Web link research; method-analysis environment and data sets; and power laws. (LRW)

  6. CT colonography: Project of High National Interest No. 2005062137 of the Italian Ministry of Education, University and Research (MIUR).

    PubMed

    Neri, E; Laghi, A; Regge, D; Sacco, P; Gallo, T; Turini, F; Talini, E; Ferrari, R; Mellaro, M; Rengo, M; Marchi, S; Caramella, D; Bartolozzi, C

    2008-12-01

    The aim of this paper is to describe the Web site of the Italian Project on CT Colonography (Research Project of High National Interest, PRIN No. 2005062137) and present the prototype of the online database. The Web site was created with Microsoft Office Publisher 2003 software, which allows the realisation of multiple Web pages linked through a main menu located on the home page. The Web site contains a database of computed tomography (CT) colonography studies in the Digital Imaging and Communications in Medicine (DICOM) standard, all acquired with multidetector-row CT according to the parameters defined by the European Society of Abdominal and Gastrointestinal Radiology (ESGAR). The cases present different bowel-cleansing and tagging methods, and each case has been anonymised and classified according to the Colonography Reporting and Data System (C-RADS). The Web site is available at www.ctcolonography.org and is composed of eight pages. Download times for a 294-Mbyte file were 33 min from a residential ADSL (6 Mbit/s) network, 200 s from a local university network (100 Mbit/s) and 2 h and 50 min from a remote academic site in the USA. The Web site received 256 accesses in the first 22 days after going online. The Web site is an immediate and up-to-date tool for publicising the activity of the research project and a valuable learning resource for CT colonography.

  7. Web Content Accessibility of Consumer Health Information Web Sites for People with Disabilities: A Cross Sectional Evaluation

    PubMed Central

    Parmanto, Bambang

    2004-01-01

    Background The World Wide Web (WWW) has become an increasingly essential resource for health information consumers. The ability to obtain accurate medical information online quickly, conveniently and privately provides health consumers with the opportunity to make informed decisions and participate actively in their personal care. Little is known, however, about whether the content of this online health information is equally accessible to people with disabilities who must rely on special devices or technologies to process online information due to their visual, hearing, mobility, or cognitive limitations. Objective To construct a framework for an automated Web accessibility evaluation; to evaluate the state of accessibility of consumer health information Web sites; and to investigate the possible relationships between accessibility and other features of the Web sites, including function, popularity and importance. Methods We carried out a cross-sectional study of the state of accessibility of health information Web sites to people with disabilities. We selected 108 consumer health information Web sites from the directory service of a Web search engine. A measurement framework was constructed to automatically measure the level of Web Accessibility Barriers (WAB) of Web sites following Web accessibility specifications. We investigated whether there was a difference between WAB scores across various functional categories of the Web sites, and also evaluated the correlation between the WAB and Alexa traffic rank and Google Page Rank of the Web sites. Results We found that none of the Web sites we looked at are completely accessible to people with disabilities, i.e., there were no sites that had no violation of Web accessibility rules. However, governmental and educational health information Web sites do exhibit better Web accessibility than the other categories of Web sites (P < 0.001). We also found that the correlation between the WAB score and the popularity of a Web site is statistically significant (r = 0.28, P < 0.05), although there is no correlation between the WAB score and the importance of the Web sites (r = 0.15, P = 0.111). Conclusions Evaluation of health information Web sites shows that no Web site scrupulously abides by Web accessibility specifications, even for entities mandated under relevant laws and regulations. Government and education Web sites show better performance than Web sites among other categories. Accessibility of a Web site may have a positive impact on its popularity in general. However, the Web accessibility of a Web site may not have a significant relationship with its importance on the Web. PMID:15249268
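
    The WAB metric itself is not reproduced here; the sketch below is a hypothetical, simplified barrier score in the same spirit (violations normalized by potential barriers and weighted by checkpoint priority, then averaged over a site's pages). The published formula may differ in detail, and all numbers are invented.

    ```python
    # Hypothetical, simplified accessibility-barrier score; higher = more barriers.
    def page_barrier_score(violations, potential, priority_weight):
        """violations/potential: counts per accessibility checkpoint on one page."""
        return sum(
            (v / p) / w
            for v, p, w in zip(violations, potential, priority_weight)
            if p > 0
        )

    def site_score(pages):
        """Average the per-page scores over all crawled pages of a site."""
        return sum(page_barrier_score(*page) for page in pages) / len(pages)

    # Two pages, three checkpoints each (priority weight 1 = most severe checkpoint).
    pages = [
        ([2, 0, 1], [10, 5, 4], [1, 2, 3]),
        ([0, 1, 0], [8, 5, 6], [1, 2, 3]),
    ]
    print(round(site_score(pages), 3))   # -> 0.192
    ```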

  8. Implementing a low-cost web-based clinical trial management system for community studies: a case study.

    PubMed

    Geyer, John; Myers, Kathleen; Vander Stoep, Ann; McCarty, Carolyn; Palmer, Nancy; DeSalvo, Amy

    2011-10-01

    Clinical trials with multiple intervention locations and a single research coordinating center can be logistically difficult to implement. Increasingly, web-based systems are used to provide clinical trial support, with many commercial, open source, and proprietary systems in use. New web-based tools are available which can be customized without programming expertise to deliver web-based clinical trial management and data collection functions. Our aim was to demonstrate the feasibility of utilizing low-cost, configurable applications to create a customized web-based data collection and study management system for a five-intervention-site randomized clinical trial establishing the efficacy of providing evidence-based treatment via teleconferencing to children with attention-deficit hyperactivity disorder. The sites are small communities that would not usually be included in traditional randomized trials. A major goal was to develop a database that participants could access from computers in their home communities for direct data entry. We discuss the selection process that led to the identification and utilization of a cost-effective and user-friendly set of tools capable of customization for data collection and study management tasks. An online assessment collection application, a template-based web portal creation application, and a web-accessible Access 2007 database were selected and customized to provide the following features: schedule appointments, administer and monitor online secure assessments, issue subject incentives, and securely transmit electronic documents between sites. Each tool was configured by users with limited programming expertise. As of June 2011, the system has successfully been used by 125 participants in 5 communities (who have completed 536 sets of assessment questionnaires), 8 community therapists, and 11 research staff at the research coordinating center. Total automation of processes is not possible with the current set of tools, as each is only loosely affiliated with the others, creating some inefficiency. This system is best suited to investigations with a single data source, e.g., psychosocial questionnaires. New web-based applications can be used by investigators with limited programming experience to implement user-friendly, efficient, and cost-effective tools for multi-site clinical trials with small distant communities. Such systems allow the inclusion in research of populations that are not usually involved in clinical trials.

  9. The Westfield River Watershed Interactive Atlas: mapping recreation data on the web

    Treesearch

    Robert S. Bristow; Steven Riberdy

    2002-01-01

    Imagine searching the web to create a map to your house. You could use one of the many Internet mapping sites like MapBlast™ or MapQuest™ to create such a map. But maybe you wish to get a map of trails for the Grand Canyon. The National Park Service web site could serve that need. Or you may wish to get a map to show you the way from the Orlando...

  10. FOCIH: Form-Based Ontology Creation and Information Harvesting

    NASA Astrophysics Data System (ADS)

    Tao, Cui; Embley, David W.; Liddle, Stephen W.

    Creating an ontology and populating it with data are both labor-intensive tasks requiring a high degree of expertise. Thus, scaling ontology creation and population to the size of the web in an effort to create a web of data—which some see as Web 3.0—is prohibitive. Can we find ways to streamline these tasks and lower the barrier enough to enable Web 3.0? Toward this end we offer a form-based approach to ontology creation that provides a way to create Web 3.0 ontologies without the need for specialized training. And we offer a way to semi-automatically harvest data from the current web of pages for a Web 3.0 ontology. In addition to harvesting information with respect to an ontology, the approach also annotates web pages and links facts in web pages to ontological concepts, resulting in a web of data superimposed over the web of pages. Experience with our prototype system shows that mappings between conceptual-model-based ontologies and forms are sufficient for creating the kind of ontologies needed for Web 3.0, and experiments with our prototype system show that automatic harvesting, automatic annotation, and automatic superimposition of a web of data over a web of pages work well.

  11. Free Factories: Unified Infrastructure for Data Intensive Web Services

    PubMed Central

    Zaranek, Alexander Wait; Clegg, Tom; Vandewege, Ward; Church, George M.

    2010-01-01

    We introduce the Free Factory, a platform for deploying data-intensive web services using small clusters of commodity hardware and free software. Independently administered virtual machines called Freegols give application developers the flexibility of a general purpose web server, along with access to distributed batch processing, cache and storage services. Each cluster exploits idle RAM and disk space for cache, and reserves disks in each node for high bandwidth storage. The batch processing service uses a variation of the MapReduce model. Virtualization allows every CPU in the cluster to participate in batch jobs. Each 48-node cluster can achieve 4-8 gigabytes per second of disk I/O. Our intent is to use multiple clusters to process hundreds of simultaneous requests on multi-hundred terabyte data sets. Currently, our applications achieve 1 gigabyte per second of I/O with 123 disks by scheduling batch jobs on two clusters, one of which is located in a remote data center. PMID:20514356
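
    The batch service is described as using a variation of the MapReduce model; the following single-process Python sketch shows the pattern itself. The real service distributes map and reduce tasks across the cluster's virtual machines, and the word-count example is purely illustrative.

    ```python
    # Minimal, single-process sketch of the MapReduce pattern.
    from collections import defaultdict

    def map_reduce(records, map_fn, reduce_fn):
        # Map phase: emit (key, value) pairs from every input record.
        groups = defaultdict(list)
        for record in records:
            for key, value in map_fn(record):
                groups[key].append(value)
        # Reduce phase: combine all values collected for each key.
        return {key: reduce_fn(key, values) for key, values in groups.items()}

    # Example: word counts over a handful of text records.
    docs = ["free factory web service", "data intensive web service"]
    result = map_reduce(
        docs,
        map_fn=lambda doc: [(word, 1) for word in doc.split()],
        reduce_fn=lambda word, counts: sum(counts),
    )
    print(result["web"], result["service"])   # -> 2 2
    ```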

  12. Statistics, Structures & Satisfied Customers: Using Web Log Data to Improve Site Performance.

    ERIC Educational Resources Information Center

    Peacock, Darren

    This paper explores some of the ways in which the National Museum of Australia is using Web analysis tools to shape its future directions in the delivery of online services. In particular, it explores the potential of quantitative analysis, based on Web server log data, to convert these ephemeral traces of user experience into a strategic…

  13. Global drought watch from space at work: Crop losses and food security

    NASA Astrophysics Data System (ADS)

    Kogan, F.

    2012-12-01

    Drought is one of the most adverse environmental disasters. It affects countries' economies, the environment, and a very large number of people around the world. In the USA alone, drought costs taxpayers nearly $6 billion each year. Drought is a very unusual phenomenon because, unlike other environmental disasters, it starts unnoticed, develops cumulatively, and its impact is also cumulative; by the time the effect of drought is observable, it is too late to mitigate the consequences. It is therefore difficult to mitigate droughts using in situ data alone. The National Oceanic and Atmospheric Administration (NOAA) developed a new method for drought detection and monitoring from reflectance measured by the Advanced Very High Resolution Radiometer flown on NOAA polar-orbiting operational environmental satellites. The method calculates Vegetation Health (VH) indices, which estimate vegetation condition (health) on a scale from extreme stress to favorable conditions based on the intensity of greenness, vigor, and the thermal condition of the vegetation canopy. The VH is estimated every week for each 4 x 4 km area of the Earth's surface and is delivered to the NOAA/NESDIS web site in digital and color-coded form at http://www.star.nesdis.noaa.gov/smcd/emb/vci/VH/index.php. In addition to drought and vegetation health monitoring, the VH indices are applied in agriculture, forestry, mosquito-borne disease monitoring, climate studies, invasive species detection, and other areas. During the first seven months of 2009, drought was observed in the southern US (especially Texas), Argentina (a very intensive drought), some countries of sub-Saharan Africa, India (central and eastern), Kazakhstan, and Australia.
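
    The VH indices combine a greenness component with a thermal component. The sketch below follows the commonly published form of Kogan's Vegetation Condition Index (VCI), Temperature Condition Index (TCI), and their weighted combination; the 0.5 weight and the climatology values are illustrative assumptions, not NOAA's operational parameters.

    ```python
    # Sketch of a Vegetation Health style computation: greenness (VCI from NDVI),
    # thermal stress (TCI from brightness temperature), and a weighted combination.
    def vci(ndvi, ndvi_min, ndvi_max):
        """Vegetation Condition Index: 0 (extreme stress) to 100 (favorable)."""
        return 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

    def tci(bt, bt_min, bt_max):
        """Temperature Condition Index: hot canopies push the value toward 0."""
        return 100.0 * (bt_max - bt) / (bt_max - bt_min)

    def vegetation_health(ndvi, bt, climatology, weight=0.5):
        n_min, n_max, t_min, t_max = climatology   # multi-year extremes for this pixel/week
        return weight * vci(ndvi, n_min, n_max) + (1.0 - weight) * tci(bt, t_min, t_max)

    # One 4 x 4 km pixel in a drought week: low greenness, high canopy temperature.
    print(round(vegetation_health(ndvi=0.22, bt=306.0,
                                  climatology=(0.15, 0.65, 290.0, 310.0)), 1))   # -> 17.0 (stress)
    ```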

  14. Online Periodic Table: A Cautionary Note

    NASA Astrophysics Data System (ADS)

    Izci, Kemal; Barrow, Lloyd H.; Thornhill, Erica

    2013-08-01

    The purpose of this study was (a) to evaluate ten online periodic table sources for their accuracy and (b) to compare the types of information and links provided to users. Few studies have been reported on online periodic tables (Diener and Moore 2011; Slocum and Moore in J Chem Educ 86(10):1167, 2009). Chemistry students' understanding of the periodic table is vital for their success in chemistry, and online periodic tables have the potential to advance learners' understanding of chemical elements and fundamental chemistry concepts (Brito et al. in J Res Sci Teach 42(1):84-111, 2005). The ten sites were compared for accuracy of data with the Handbook of Chemistry and Physics (HCP, Haynes in CRC handbook of chemistry and physics: a ready-reference book of chemical and physical data. CRC Press, Boca Raton 2012). The 10 sites are the most visited periodic table Web sites available. Four different elements (carbon, gold, argon, and plutonium) were selected for comparison, and 11 different attributes of each element were identified for evaluating accuracy. A wide variation in accuracy was found among the 10 periodic table sources. Chemicool was the most accurate information provider, with 66.67% accuracy when compared to the HCP. The 22 types of information provided by these sites, including the meaning of each element's name and its use in industry and society, were also compared. "WebElements", "Chemicool", "Periodic Table Live", and "the Photographic Periodic Table of the Elements" provided the most information, covering 86.36% of the information types among the 10 Web sites. "WebElements" provides the most links among the 10 Web sites. It was concluded that if an individual teacher or student desires only raw physical data for an element, the Internet might not be the best choice.

  15. Evaluating IPv6 Adoption in the Internet

    NASA Astrophysics Data System (ADS)

    Colitti, Lorenzo; Gunderson, Steinar H.; Kline, Erik; Refice, Tiziana

    As IPv4 address space approaches exhaustion, large networks are deploying IPv6 or preparing for deployment. However, there is little data available about the quantity and quality of IPv6 connectivity. We describe a methodology to measure IPv6 adoption from the perspective of a Web site operator and to evaluate the impact that adding IPv6 to a Web site will have on its users. We apply our methodology to the Google Web site and present results collected over the last year. Our data show that IPv6 adoption, while growing significantly, is still low, varies considerably by country, and is heavily influenced by a small number of large deployments. We find that native IPv6 latency is comparable to IPv4 and provide statistics on IPv6 transition mechanisms used.
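
    A web site operator can obtain a first-order adoption estimate simply by classifying client addresses seen in access logs by IP version. The sketch below uses hypothetical log entries and is not the paper's measurement pipeline, which also evaluates connectivity quality and transition mechanisms.

    ```python
    # Classify client addresses from an access log by IP version and report the IPv6 share.
    import ipaddress
    from collections import Counter

    def ip_version_counts(client_addresses):
        counts = Counter()
        for addr in client_addresses:
            try:
                counts[f"ipv{ipaddress.ip_address(addr).version}"] += 1
            except ValueError:
                counts["invalid"] += 1
        return counts

    hits = ["203.0.113.7", "2001:db8::1", "198.51.100.23", "2001:db8::5"]
    counts = ip_version_counts(hits)
    print(counts["ipv6"] / (counts["ipv4"] + counts["ipv6"]))   # -> 0.5
    ```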

  16. Use of a web site to increase knowledge and awareness of hunger-related issues.

    PubMed Central

    Jennings, Sharla; Cotugna, Nancy; Vickery, Connie E.

    2003-01-01

    The purpose of this study was to determine the current level of knowledge and awareness of hunger-related issues among a convenience sample of Delawareans. We also assessed whether raising knowledge and awareness of the hunger problem through the FBD's newly designed web site would encourage participation in antihunger activities. Via e-mail, 1,719 individuals were invited to participate in a three-phase, online survey, and 392 agreed. Phase-I questions were answered prior to viewing the web site, phase II (n=217) immediately afterward, and phase III (n=61) six weeks later. Responses indicated a high level of awareness about general hunger issues but specific knowledge proved to be at a lower level. No statistically significant differences were noted when data were collapsed across gender, age, educational level, or work setting. In a six-week post-survey, 41% of subjects were motivated by the web site to engage in an antihunger activity; 34% had told others about the web site and indicated it may be a useful tool in antihunger outreach efforts for the FBD. PMID:14651376

  17. Infant Gastroesophageal Reflux Information on the World Wide Web.

    PubMed

    Balgowan, Regina; Greer, Leah C; D'Auria, Jennifer P

    2016-01-01

    The purpose of this study was to describe the type and quality of health information about infant gastroesophageal reflux (GER) that a parent may find on the World Wide Web. The data collection tool included evaluation of Web site quality and infant GER-specific content on the 30 sites that met the inclusion criteria. The most commonly found content categories in order of frequency were management strategies, when to call a primary care provider, definition, and clinical features. The most frequently mentioned strategies included feeding changes, infant positioning, and medications. Thirteen of the 30 Web sites included information on both GER and gastroesophageal reflux disease. Mention of the use of medication to lessen infant symptoms was found on 15 of the 30 sites. Only 10 of the 30 sites included information about parent support and coping strategies. Pediatric nurse practitioners (PNPs) should utilize well-child visits to address the normalcy of physiologic infant GER and clarify any misperceptions parents may have about diagnosis and the role of medication from information they may have found on the Internet. It is critical for PNPs to assist in the development of Web sites with accurate content, advise parents on how to identify safe and reliable information, and provide examples of high-quality Web sites about child health topics such as infant GER. Copyright © 2016 National Association of Pediatric Nurse Practitioners. Published by Elsevier Inc. All rights reserved.

  18. Real-time Data Access From Remote Observatories

    NASA Astrophysics Data System (ADS)

    Detrick, D. L.; Lutz, L. F.; Etter, J. E.; Rosenberg, T. J.; Weatherwax, A. T.

    2006-12-01

    Real-time access to solar-terrestrial data is becoming increasingly important, not only because it is now possible to acquire and access data rapidly via the internet, but also because of the need for timely publication of real-time data for analysis and modeling efforts. Currently, engineering-scaled summary data are available routinely on a daily basis from many observatories, but only when the observatories have continuous, or at least daily network access. Increasingly, the upgrading of remote data acquisition hardware makes it possible to provide data in real-time, and it is becoming normal to expect timely access to data products. The NSF- supported PENGUIn/AGO constellation of autonomous Antarctic research observatories has provided real-time data since December, 2002, when Iridium satellite modems were installed at three sites. The Iridium telecommunications links are maintained continuously, transferring data between the remote observatories and a U.S.-based data acquisition site. The time-limiting factor with this scenario is now the delay in completing a data record before transmission, which can be as short as minutes depending on the sampling rate. The single-channel data throughput of the current systems is 20-MB/day (megabytes per day), but planned installations will be capable of operating with multiple modem channels. The data records are currently posted immediately to a web site accessible by anonymous FTP client software, for use by the instruments' principal investigators, and survey plots of selected signals are published daily. The web publication facilities are being upgraded, in order to allow other interested researchers rapid access to engineering-scaled data products, in several common formats, as well as providing interactive plotting capabilities. The web site will provide access to data from other collaborating observatories (including South Pole and McMurdo Stations), as well as ancillary data accessible from public sites (e.g., Kp, AE, Dst). The site will be accessible via common HTML interface protocols, enabling access to the data products by browsers or other compatible application software. We describe details of the hardware and software components of the Iridium telecommunications linkage, as well as details of the current and planned web publication capabilities.

  19. 32 CFR 701.102 - Online resources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... online Web site (http://www.privacy.navy.mil). This Web site supplements this subpart and subpart G. It...) Web site (http://www.doncio.navy.mil). This Web site provides detailed guidance on PIAs. (c) DOD's PA Web site (http://www.defenselink.mil/privacy). This Web site is an excellent resource that contains a...

  20. Assessment of the Quality of Patient-Orientated Information on Surgery for Crohn's Disease on the Internet.

    PubMed

    Yeung, Trevor M; Sacchi, Matteo; Mortensen, Neil J; Spinelli, Antonino

    2015-09-01

    The Internet is a vast resource for patients to search for health information on the treatment of Crohn's disease. This study examines the quality of Web sites that provide information to adults regarding Crohn's disease, including treatment options and surgery. Two search engines (Google and Yahoo) and the search terms "surgery for Crohn's disease" were used. The first 50 sites of each search were assessed. Sites that fulfilled the inclusion criteria were evaluated for content and scored by using the DISCERN instrument, which evaluates the quality of health information on treatment choices. One hundred sites were examined, of which 13 were duplicates. Sixty-two sites provided patient-orientated information. The other sites included 7 scientific articles, 3 blogs, 2 links, 6 forums, 3 video links, and 4 dead links. Of the 62 Web sites that provided patient information for adults, only 15 (24.2%) had been updated within the past 2 years. Only 9 (14.5%) were affiliated with hospitals and clinics. The majority of sites (33, 53.2%) were associated with private companies with commercial interests. Only half of the Web sites provided details on treatment options, and most Web sites did not provide any information on symptoms and procedure details. Just 5 Web sites (8.1%) described the risks of surgery, and only 7 (11.3%) provided any information on the timescale for recovery. Overall, only 1 Web site (1.6%) was identified as being "good" or "excellent" with the use of the DISCERN criteria. Although the internet is constantly evolving, this study captures data at a specific time point. Search results may vary depending on geographical location. This study only assessed English language websites. The quality of patient information on surgery for Crohn's disease is highly variable and generally poor. There is potential for the Internet to provide valuable information, and clinicians should identify high-quality Web sites to guide their patients.

  1. A user-oriented web crawler for selectively acquiring online content in e-health research.

    PubMed

    Xu, Songhua; Yoon, Hong-Jun; Tourassi, Georgia

    2014-01-01

    Life stories of diseased and healthy individuals are abundantly available on the Internet. Collecting and mining such online content can offer many valuable insights into patients' physical and emotional states throughout the pre-diagnosis, diagnosis, treatment and post-treatment stages of the disease compared with those of healthy subjects. However, such content is widely dispersed across the web. Using traditional query-based search engines to manually collect relevant materials is rather labor intensive and often incomplete due to resource constraints in terms of human query composition and result parsing efforts. The alternative option, blindly crawling the whole web, has proven inefficient and unaffordable for e-health researchers. We propose a user-oriented web crawler that adaptively acquires user-desired content on the Internet to meet the specific online data source acquisition needs of e-health researchers. Experimental results on two cancer-related case studies show that the new crawler can substantially accelerate the acquisition of highly relevant online content compared with the existing state-of-the-art adaptive web crawling technology. For the breast cancer case study using the full training set, the new method achieves a cumulative precision between 74.7 and 79.4% after 5 h of execution till the end of the 20-h long crawling session as compared with the cumulative precision between 32.8 and 37.0% using the peer method for the same time period. For the lung cancer case study using the full training set, the new method achieves a cumulative precision between 56.7 and 61.2% after 5 h of execution till the end of the 20-h long crawling session as compared with the cumulative precision between 29.3 and 32.4% using the peer method. Using the reduced training set in the breast cancer case study, the cumulative precision of our method is between 44.6 and 54.9%, whereas the cumulative precision of the peer method is between 24.3 and 26.3%; for the lung cancer case study using the reduced training set, the cumulative precisions of our method and the peer method are, respectively, between 35.7 and 46.7% versus between 24.1 and 29.6%. These numbers clearly show a consistently superior accuracy of our method in discovering and acquiring user-desired online content for e-health research. The implementation of our user-oriented web crawler is freely available to non-commercial users via the following Web site: http://bsec.ornl.gov/AdaptiveCrawler.shtml. The Web site provides a step-by-step guide on how to execute the web crawler implementation. In addition, the Web site provides the two study datasets including manually labeled ground truth, initial seeds and the crawling results reported in this article.
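
    The crawler itself is available at the URL above; as a generic illustration of focused crawling (not the authors' adaptive algorithm), the following sketch fetches pages in best-first order according to a user-supplied relevance score, exercised here on a toy in-memory "web".

    ```python
    # Generic best-first (focused) crawling sketch driven by a relevance score.
    import heapq

    def focused_crawl(seeds, relevance, fetch, budget=100):
        """fetch(url) -> (page_text, outgoing_links); relevance(text) -> score in [0, 1]."""
        frontier = [(-1.0, url) for url in seeds]      # negated scores: max-priority queue
        heapq.heapify(frontier)
        seen, harvested = set(seeds), []
        while frontier and len(harvested) < budget:
            _, url = heapq.heappop(frontier)
            text, links = fetch(url)
            if relevance(text) > 0.5:                  # keep pages scored as on-topic
                harvested.append(url)
            for link in links:
                if link not in seen:
                    seen.add(link)
                    # a real crawler would score the link's anchor text or context
                    heapq.heappush(frontier, (-relevance(link), link))
        return harvested

    # Toy in-memory "web" to exercise the sketch.
    WEB = {
        "seed":   ("cancer survivor story index", ["storyA", "sports"]),
        "storyA": ("diagnosis treatment recovery cancer survivor", []),
        "sports": ("sports scores table", []),
    }
    KEYWORDS = {"cancer", "diagnosis", "treatment", "recovery", "survivor"}
    score = lambda text: len(KEYWORDS & set(text.split())) / len(KEYWORDS)
    print(focused_crawl(["seed"], score, fetch=WEB.get))   # -> ['storyA']
    ```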

  2. The comparative effectiveness of clinic, work-site, phone, and Web-based tobacco treatment programs.

    PubMed

    An, Lawrence C; Betzner, Anne; Schillo, Barbara; Luxenberg, Michael G; Christenson, Matthew; Wendling, Ann; Saul, Jessie E; Kavanaugh, Annette

    2010-10-01

    Tobacco treatment programs may be offered in clinical settings, at work-sites, via telephone helplines, or over the Internet. Little comparative data exist regarding the real-world effectiveness of these programs. This paper compares the reach, effectiveness, and costs of these different modes of cessation assistance. This is an observational study of cohorts of participants in Minnesota's QUITPLAN programs in 2004. Cessation assistance was provided in person at 9 treatment centers, using group counseling at 68 work-sites, via a telephone helpline, or via the Internet. The main outcomes of the study are enrollment by current smokers, self-reported 30-day abstinence, and cost per quit. Reach was calculated statewide for the helpline and Web site, regionally for the treatment centers, and for the employee population for work-site programs. Enrollment was greatest for the Web site (n = 4,698), followed by the helpline (n = 2,351), treatment centers (n = 616), and work-sites (n = 479). The Web site attracted younger smokers. Smokers at treatment centers had higher levels of nicotine dependence. The helpline reached more socially disadvantaged smokers. Responder 30-day abstinence rates were higher for the helpline (29.3%), treatment centers (25.8%), and work-sites (19.6%) compared with the online program (12.5%). These differences persisted after controlling for baseline differences in participant characteristics and use of pharmacological therapy. The cost per quit was lowest for the Web site program ($291 per quit, 95% CI = $229-$372). Treatment center, work-site, helpline, and Web site programs differ in their reach, effectiveness, and estimated cost per quit. Each program plays a part in assisting populations of tobacco users in quitting.

  3. Terrestrial Contributions to the Aquatic Food Web in the Middle Yangtze River

    PubMed Central

    Wang, Jianzhu; Gu, Binhe; Huang, Jianhui; Han, Xingguo; Lin, Guanghui; Zheng, Fawen; Li, Yuncong

    2014-01-01

    Understanding the carbon sources supporting aquatic consumers in large rivers is essential for the protection of ecological integrity and for wildlife management. The relative importance of terrestrial and algal carbon to the aquatic food webs is still under intensive debate. The Yangtze River is the largest river in China and the third longest river in the world. The completion of the Three Gorges Dam (TGD) in 2003 has significantly altered the hydrological regime of the middle Yangtze River, but its immediate impact on carbon sources supporting the river food web is unknown. In this study, potential production sources from riparian and the main river channel, and selected aquatic consumers (invertebrates and fish) at an upstream constricted-channel site (Luoqi), a midstream estuarine site (Huanghua) and a near dam limnetic site (Maoping) of the TGD were collected for stable isotope (δ13C and δ15N) and IsoSource analyses. Model estimates indicated that terrestrial plants were the dominant carbon sources supporting the consumer taxa at the three study sites. Algal production appeared to play a supplemental role in supporting consumer production. The contribution from C4 plants was more important than that of C3 plants at the upstream site while C3 plants were the more important carbon source to the consumers at the two impacted sites (Huanghua and Maoping), particularly at the midstream site. There was no trend of increase in the contribution of autochthonous production from the upstream to the downstream sites as the flow rate decreased dramatically along the main river channel due to the construction of TGD. Our findings, along with recent studies in rivers and lakes, are contradictory to studies that demonstrate the importance of algal carbon in the aquatic food web. Differences in system geomorphology, hydrology, habitat heterogeneity, and land use may account for these contradictory findings reported in various studies. PMID:25047656
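
    The IsoSource analysis mentioned above rests on linear isotope mass balance. For the exactly determined case of three sources and two tracers, the source fractions can be solved directly, as in the sketch below; IsoSource generalizes this to more sources by enumerating feasible mixtures. Trophic enrichment corrections are omitted, and the isotope values are illustrative, not the study's measurements.

    ```python
    # Two-tracer (d13C, d15N), three-source linear mixing model solved as a 3x3 system.
    import numpy as np

    def source_fractions(consumer, sources):
        """consumer: (d13C, d15N); sources: dict name -> (d13C, d15N). Returns fractions."""
        names = list(sources)
        A = np.array([
            [sources[n][0] for n in names],   # d13C mass balance
            [sources[n][1] for n in names],   # d15N mass balance
            [1.0, 1.0, 1.0],                  # fractions sum to one
        ])
        b = np.array([consumer[0], consumer[1], 1.0])
        return dict(zip(names, np.linalg.solve(A, b)))

    sources = {"C3 plants": (-28.0, 3.0), "C4 plants": (-13.0, 5.0), "algae": (-20.0, 8.0)}
    print(source_fractions(consumer=(-22.0, 5.0), sources=sources))
    ```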

  4. Terrestrial contributions to the aquatic food web in the middle Yangtze River.

    PubMed

    Wang, Jianzhu; Gu, Binhe; Huang, Jianhui; Han, Xingguo; Lin, Guanghui; Zheng, Fawen; Li, Yuncong

    2014-01-01

    Understanding the carbon sources supporting aquatic consumers in large rivers is essential for the protection of ecological integrity and for wildlife management. The relative importance of terrestrial and algal carbon to the aquatic food webs is still under intensive debate. The Yangtze River is the largest river in China and the third longest river in the world. The completion of the Three Gorges Dam (TGD) in 2003 has significantly altered the hydrological regime of the middle Yangtze River, but its immediate impact on carbon sources supporting the river food web is unknown. In this study, potential production sources from riparian and the main river channel, and selected aquatic consumers (invertebrates and fish) at an upstream constricted-channel site (Luoqi), a midstream estuarine site (Huanghua) and a near dam limnetic site (Maoping) of the TGD were collected for stable isotope (δ13C and δ15N) and IsoSource analyses. Model estimates indicated that terrestrial plants were the dominant carbon sources supporting the consumer taxa at the three study sites. Algal production appeared to play a supplemental role in supporting consumer production. The contribution from C4 plants was more important than that of C3 plants at the upstream site while C3 plants were the more important carbon source to the consumers at the two impacted sites (Huanghua and Maoping), particularly at the midstream site. There was no trend of increase in the contribution of autochthonous production from the upstream to the downstream sites as the flow rate decreased dramatically along the main river channel due to the construction of TGD. Our findings, along with recent studies in rivers and lakes, are contradictory to studies that demonstrate the importance of algal carbon in the aquatic food web. Differences in system geomorphology, hydrology, habitat heterogeneity, and land use may account for these contradictory findings reported in various studies.

  5. BAID: The Barrow Area Information Database - An Interactive Web Mapping Portal and Cyberinfrastructure Showcasing Scientific Activities in the Vicinity of Barrow, Arctic Alaska.

    NASA Astrophysics Data System (ADS)

    Escarzaga, S. M.; Cody, R. P.; Kassin, A.; Barba, M.; Gaylord, A. G.; Manley, W. F.; Mazza Ramsay, F. D.; Vargas, S. A., Jr.; Tarin, G.; Laney, C. M.; Villarreal, S.; Aiken, Q.; Collins, J. A.; Green, E.; Nelson, L.; Tweedie, C. E.

    2015-12-01

    The Barrow area of northern Alaska is one of the most intensely researched locations in the Arctic, and the Barrow Area Information Database (BAID, www.barrowmapped.org) tracks and facilitates a gamut of research, management, and educational activities in the area. BAID is a cyberinfrastructure (CI) that details much of the historic and extant research undertaken within the Barrow region in a suite of interactive web-based mapping and information portals (geobrowsers). The BAID user community and target audience is diverse and includes research scientists, science logisticians, land managers, educators, students, and the general public. BAID contains information on more than 12,000 Barrow area research sites that extend back to the 1940s and more than 640 remote sensing images and geospatial datasets. In a web-based setting, users can zoom, pan, query, measure distance, save or print maps and query results, and filter or view information by space, time, and/or other tags. Additionally, data are described with metadata that meet Federal Geographic Data Committee standards. Recent advances include the addition of more than 2,000 new research sites, the addition of a query builder user interface allowing rich and complex queries, and the provision of differential global positioning system (dGPS) and high-resolution aerial imagery support to visiting scientists. Recent field surveys cover over 80 miles of coastline to document rates of erosion and include the collection of high-resolution sonar data for bathymetric mapping of Elson Lagoon and the near-shore region of the Chukchi Sea. A network of five climate stations has been deployed across the peninsula to serve as a wireless net for the research community and to deliver near-real-time climatic data to the user community. Local GIS personnel have also been trained to make better use of scientific data for local decision making. Links to Barrow area datasets are housed at national data archives, and substantial upgrades have been made to the BAID website and web mapping applications, including the public release of a new multi-temporal Imagery Viewer that allows users to interact with and compare imagery of the Barrow area from 1949 to the present.

  6. Analysis of pathology department Web sites and practical recommendations.

    PubMed

    Nero, Christopher; Dighe, Anand S

    2008-09-01

    There are numerous customers for pathology departmental Web sites, including pathology department staff, clinical staff, residency applicants, job seekers, and other individuals outside the department seeking department information. Despite the increasing importance of departmental Web sites as a means of distributing information, no analysis has been done to date of the content and usage of pathology department Web sites. In this study, we analyzed pathology department Web sites to examine the elements present on each site and to evaluate the use of search technology on these sites. Further, we examined the usage patterns of our own departmental Internet and intranet Web sites to better understand the users of pathology Web sites. We reviewed selected departmental pathology Web sites and analyzed their content and functionality. Our institution's departmental pathology Web sites were modified to enable detailed information to be stored regarding users and usage patterns, and that information was analyzed. We demonstrate considerable heterogeneity in departmental Web sites, with many sites lacking basic content and search features. In addition, we demonstrate that increasing the traffic of a department's informational Web sites may result in reduced phone inquiries to the laboratory. We propose recommendations for pathology department Web sites to maximize promotion of a department's mission. A departmental pathology Web site is an essential communication tool for all pathology departments, and attention to the users and content of the site can have operational impact.

  7. The Great War: Online Resources.

    ERIC Educational Resources Information Center

    Duncanson, Bruce

    2002-01-01

    Presents an annotated bibliography of Web sites about World War I. Includes: (1) general Web sites; (2) Web sites with information during the war; (3) Web sites with information about post-World War I; (4) Web sites that provide photos, sound files of speeches, and propaganda posters; and (5) Web sites with lesson plans. (CMK)

  8. Creating and Maintaining Data-Driven Course Web Sites.

    ERIC Educational Resources Information Center

    Heines, Jesse M.

    This paper deals with techniques for reducing the amount of work that needs to be redone each semester when one prepares an existing course Web site for a new class. The key concept is algorithmic generation of common page elements while still allowing full control over page content via WYSIWYG tools like Microsoft FrontPage and Macromedia…

  9. Nursing Home Administrators' Opinions of the Nursing Home Compare Web Site

    ERIC Educational Resources Information Center

    Castle, Nicholas G.

    2005-01-01

    Purpose: In November of 2002 the Centers for Medicare and Medicaid Services publicly reported on a national basis the quality of nursing homes on the Nursing Home Compare (NHC) Web site. This study examines administrators' opinions of this initiative and whether it has fostered quality improvement. Design and Methods: Data used in this…

  10. The Sites Teachers Choose: A Gauge of Classroom Web Use

    ERIC Educational Resources Information Center

    Archambault, Leanna; Crippen, Kent

    2007-01-01

    The pervasive nature of the Internet, both in society and in America's schools, leads teacher educators to wonder how this dynamic tool is being utilized in the classroom and, especially, if it is benefiting students' understanding. This study analyzed 127 Web sites self-reported by in-service teachers as excellent for teaching. From these data, a…

  11. Commissions as information organizations: Meeting the information needs of an electronic society

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sevel, F.

    1997-11-01

    This paper describes how commission-sponsored web sites can effectively meet electronic information needs. Demographics of internet users are presented and analyzed. Online activities and user access data are also described. The implications of the characteristics of internet users for commission-sponsored web sites are discussed, and guidelines for determining marketing objectives are presented.

  12. Community-based randomized controlled trial of diabetes prevention study for high-risk individuals of type 2 diabetes: lifestyle intervention using web-based system.

    PubMed

    Cha, Seon-Ah; Lim, Sun-Young; Kim, Kook-Rye; Lee, Eun-Young; Kang, Borami; Choi, Yoon-Hee; Yoon, Kun-Ho; Ahn, Yu-Bae; Lee, Jin-Hee; Ko, Seung-Hyun

    2017-05-05

    The trend of increasing numbers of patients with type 2 diabetes emphasizes the need for active screening of high-risk individuals and intensive lifestyle modification (LSM). The community-based Korean Diabetes Prevention Study (C-KDPS) is a randomized controlled clinical trial to prevent type 2 diabetes by intensive LSM using a web-based program. Two public healthcare centers in Korea are involved, and 420 subjects are being recruited over 6 months and will be followed up for 22 months. The participants are randomly allocated to intensive LSM (18 individual sessions over 24 weeks) or usual care (control group). The major goals of the C-KDPS lifestyle intervention program are: 1) a minimum 5-7% loss of initial body weight within 6 months and maintenance of this weight loss, 2) increased physical activity (≥ 150 min/week of moderate-intensity activity), 3) balanced healthy eating, and 4) quitting smoking and alcohol, with stress management. The web-based program includes educational content, video files, visit schedules, and pages for tracking progress and communicating between participants and staff. The primary outcome is newly diagnosed diabetes. A 75-g oral glucose tolerance test with hemoglobin A1c determination and cardiovascular risk factor assessment is scheduled at 6, 12, 18, and 22 months. Active screening of high-risk individuals and an effective LSM program are essential prerequisites for successful diabetes prevention. We hope that our C-KDPS program can reduce the incidence of newly developed type 2 diabetes and be implemented throughout the country, merging community-based public healthcare resources with a web-based system. Clinical Research Information Service (CRIS), Republic of Korea (No. KCT0001981). Date of registration: July 28, 2016.

  13. IMAGESEER - IMAGEs for Education and Research

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Grubb, Thomas; Milner, Barbara

    2012-01-01

    IMAGESEER is a new Web portal that brings easy access to NASA image data for non-NASA researchers, educators, and students. The IMAGESEER Web site and database are specifically designed to be utilized by the university community, to enable teaching image processing (IP) techniques on NASA data, as well as to provide reference benchmark data to validate new IP algorithms. Along with the data and a Web user interface front-end, basic knowledge of the application domains, benchmark information, and specific NASA IP challenges (or case studies) are provided.

  14. Initial Data Release of the Kepler-INT Survey

    NASA Astrophysics Data System (ADS)

    Greiss, S.; Steeghs, D.; Gänsicke, B. T.; Martín, E. L.; Groot, P. J.; Irwin, M. J.; González-Solares, E.; Greimel, R.; Knigge, C.; Østensen, R. H.; Verbeek, K.; Drew, J. E.; Drake, J.; Jonker, P. G.; Ripepi, V.; Scaringi, S.; Southworth, J.; Still, M.; Wright, N. J.; Farnhill, H.; van Haaften, L. M.; Shah, S.

    2012-07-01

    This paper describes the first data release of the Kepler-INT Survey (KIS) that covers a 116 deg2 region of the Cygnus and Lyra constellations. The Kepler field is the target of the most intensive search for transiting planets to date. Despite the fact that the Kepler mission provides superior time-series photometry, with an enormous impact on all areas of stellar variability, its field lacks optical photometry complete to the confusion limit of the Kepler instrument necessary for selecting various classes of targets. For this reason, we follow the observing strategy and data reduction method used in the IPHAS and UVEX galactic plane surveys in order to produce a deep optical survey of the Kepler field. This initial release concerns data taken between 2011 May and August, using the Isaac Newton Telescope on the island of La Palma. Four broadband filters were used, U, g, r, i, as well as one narrowband one, Hα, reaching down to a 10σ limit of ~20th mag in the Vega system. Observations covering ~50 deg2, thus about half of the field, passed our quality control thresholds and constitute this first data release. We derive a global photometric calibration by placing the KIS magnitudes as close as possible to the Kepler Input Catalog (KIC) photometry. The initial data release catalog containing around 6 million sources from all the good photometric fields is available for download from the KIS Web site (www.astro.warwick.ac.uk/research/kis/) as well as via MAST (KIS magnitudes can be retrieved using the MAST enhanced target search page http://archive.stsci.edu/kepler/kepler_fov/search.php and also via Casjobs at MAST Web site http://mastweb.stsci.edu/kplrcasjobs/).

  15. INITIAL DATA RELEASE OF THE KEPLER-INT SURVEY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greiss, S.; Steeghs, D.; Gaensicke, B. T.

    2012-07-15

    This paper describes the first data release of the Kepler-INT Survey (KIS) that covers a 116 deg² region of the Cygnus and Lyra constellations. The Kepler field is the target of the most intensive search for transiting planets to date. Despite the fact that the Kepler mission provides superior time-series photometry, with an enormous impact on all areas of stellar variability, its field lacks optical photometry complete to the confusion limit of the Kepler instrument necessary for selecting various classes of targets. For this reason, we follow the observing strategy and data reduction method used in the IPHAS and UVEX galactic plane surveys in order to produce a deep optical survey of the Kepler field. This initial release concerns data taken between 2011 May and August, using the Isaac Newton Telescope on the island of La Palma. Four broadband filters were used, U, g, r, i, as well as one narrowband one, Hα, reaching down to a 10σ limit of ~20th mag in the Vega system. Observations covering ~50 deg², thus about half of the field, passed our quality control thresholds and constitute this first data release. We derive a global photometric calibration by placing the KIS magnitudes as close as possible to the Kepler Input Catalog (KIC) photometry. The initial data release catalog containing around 6 million sources from all the good photometric fields is available for download from the KIS Web site (www.astro.warwick.ac.uk/research/kis/) as well as via MAST (KIS magnitudes can be retrieved using the MAST enhanced target search page http://archive.stsci.edu/kepler/kepler_fov/search.php and also via Casjobs at MAST Web site http://mastweb.stsci.edu/kplrcasjobs/).

  16. LifeWatchGreece Portal development: architecture, implementation and challenges for a biodiversity research e-infrastructure.

    PubMed

    Gougousis, Alexandros; Bailly, Nicolas

    2016-01-01

    Biodiversity data is characterized by its cross-disciplinary character, the extremely broad range of data types and structures, and the plethora of different data sources providing resources for the same piece of information in heterogeneous ways. Since the web's inception two decades ago, there have been multiple initiatives to connect, aggregate, share, and publish biodiversity data, and to establish data and work flows in order to analyze them. The European program LifeWatch aims at establishing a distributed network of nodes implementing virtual research environments in Europe to facilitate the work of biodiversity researchers and managers. LifeWatchGreece is one of these nodes, where a portal was developed offering access to a suite of virtual laboratories and e-services. Despite its strict definition in information technology, in practice "portal" is a fairly broad term that embraces many web architectures. In the biodiversity domain, the term "portal" is usually used to indicate either a web site that provides access to a single data repository or an aggregation of data repositories (like http://indiabiodiversity.org/, http://www.mountainbiodiversity.org/, http://data.freshwaterbiodiversity.eu), a web site that gathers information about various online biodiversity tools (like http://test-eubon.ebd.csic.es/, http://marine.lifewatch.eu/), or a web site that just gathers information and news about the biodiversity domain (like http://chm.moew.government.bg). LifeWatchGreece's portal takes the concept of a portal a step further. In strict IT terms, LifeWatchGreece's portal is partly a portal, partly a platform, and partly an aggregator. It includes a number of biodiversity-related web tools integrated into a centrally controlled software ecosystem. This ecosystem includes subsystems for access control, traffic monitoring, user notifications, and web tool management. These subsystems are shared by all the web tools that have been integrated into the portal and are thereby part of this ecosystem. These web tools are not external, completely independent web applications, as is the case in most other portals. A quite obvious (to the user) indication of this is the Single-Sign-On (SSO) functionality for all tools and the common user interface wrapper that most of these tools use. Another example of a less obvious functionality is the common user profile that is shared and can be utilized by all tools (e.g., the user's timezone).

  17. Food and beverage brands that market to children and adolescents on the internet: a content analysis of branded web sites.

    PubMed

    Henry, Anna E; Story, Mary

    2009-01-01

    To identify food and beverage brand Web sites featuring designated children's areas, assess marketing techniques present on those industry Web sites, and determine nutritional quality of branded food items marketed to children. Systematic content analysis of food and beverage brand Web sites and nutrient analysis of food and beverages advertised on these Web sites. The World Wide Web. One-hundred thirty Internet Web sites of food and beverage brands with top media expenditures based on the America's Top 2000 Brands section of Brandweek magazine's annual "Superbrands" report. A standardized content analysis rating form to determine marketing techniques used on the food and beverage brand Web sites. Nutritional analysis of food brands was conducted. Of 130 Web sites analyzed, 48% featured designated children's areas. These Web sites featured a variety of Internet marketing techniques, including advergaming on 85% of the Web sites and interactive programs on 92% of the Web sites. Branded spokescharacters and tie-ins to other products were featured on the majority of the Web sites, as well. Few food brands (13%) with Web sites that market to children met the nutrition criteria set by the National Alliance for Nutrition and Activity. Nearly half of branded Web sites analyzed used designated children's areas to market food and beverages to children, 87% of which were of low nutritional quality. Nutrition professionals should advocate the use of advertising techniques to encourage healthful food choices for children.

  18. Data on Second Majors in Language and Literature, 2001-13

    ERIC Educational Resources Information Center

    Modern Language Association, 2015

    2015-01-01

    Data on second majors were added to the degree completions component of the United States Department of Education's Integrated Postsecondary Education Data System (IPEDS) in 2001 and were made available on the National Science Foundation's "WebCASPAR" Web site (https://webcaspar.nsf.gov/) in 2010. The Modern Language Association's first…

  19. Development of an Intelligent Monitoring System for Geological Carbon Sequestration (GCS) Systems

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.; Jeong, H.; Xu, W.; Hovorka, S. D.; Zhu, T.; Templeton, T.; Arctur, D. K.

    2016-12-01

    To provide stakeholders with timely evidence that GCS repositories are operating safely and efficiently requires integrated monitoring to assess the performance of the storage reservoir as the CO2 plume moves within it. GCS projects can therefore be data intensive, as a result of the proliferation of digital instrumentation and smart-sensing technologies. GCS projects are also resource intensive, often requiring multidisciplinary teams performing different monitoring, verification, and accounting (MVA) tasks throughout the lifecycle of a project to ensure secure containment of injected CO2. How can an anomaly detected by one sensor be correlated with events observed by other devices to verify a leakage incident? How should resources be optimally allocated for task-oriented monitoring if reservoir integrity is in question? These are issues that warrant further investigation before real integration can take place. In this work, we are building a web-based data integration, assimilation, and learning framework for geologic carbon sequestration projects (DIAL-GCS). DIAL-GCS will be an intelligent monitoring system (IMS) for automating GCS closed-loop management by leveraging recent developments in high-throughput database, complex event processing, data assimilation, and machine learning technologies. Results will be demonstrated using realistic data and a model derived from a GCS site.
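
    As a hypothetical illustration of the kind of complex-event-processing rule such a system might encode (not the DIAL-GCS implementation), the sketch below corroborates a sensor anomaly only when an independent sensor reports an anomaly within a short time window.

    ```python
    # Corroborate an anomaly only when at least two distinct sensors flag events
    # within the same time window; sensor names and thresholds are illustrative.
    from datetime import datetime, timedelta

    def corroborated_anomalies(anomalies, window=timedelta(hours=6), min_sensors=2):
        """anomalies: list of (timestamp, sensor_id); returns corroborated events."""
        events = sorted(anomalies)
        incidents = []
        for t, sensor in events:
            nearby = {s for (u, s) in events if abs(u - t) <= window}
            if len(nearby) >= min_sensors:
                incidents.append((t, sensor))
        return incidents

    readings = [
        (datetime(2016, 7, 1, 2, 0), "pressure_gauge_A"),
        (datetime(2016, 7, 1, 5, 0), "geochem_probe_B"),
        (datetime(2016, 7, 3, 9, 0), "pressure_gauge_A"),   # uncorroborated
    ]
    for t, sensor in corroborated_anomalies(readings):
        print(t.isoformat(), sensor)
    ```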

  20. Internet Presentation of Departments of Pediatric Surgery in Germany and Their Compliance with Recommended Criteria for Promoting Services and Offering Professional Information for Patients.

    PubMed

    Farhat, Naim; Zoeller, Christoph; Petersen, Claus; Ure, Benno

    2016-08-01

    Introduction The presentation of health institutions on the internet is highly variable with respect to marketing features and medical information. We aimed to investigate the structure and the kind of information provided on the Web sites of all departments of pediatric surgery in Germany. Furthermore, we aimed to identify the degree to which these Web sites comply with internet marketing recommendations for generating business. Method The Web sites of all pediatric surgery units referred to as departments on the official Web site of the German Society of Pediatric Surgery (GSPS) were assessed. The search engine Google was used by entering the terms "pediatric surgery" and the name of the city. Besides general data, eight content characteristics were evaluated according to published recommendations, focusing on ranking, accessibility, use of social media, multilingual sites, navigation options, selected images, contact details, and medical information. Results A total of 85 departments of pediatric surgery were included. In Google search results, 44 (52%) ranked number one, and 34 (40%) of the departments' homepages were accessible directly through the homepage link of the GSPS. A link to the department's own digital and/or social media was offered on 11 (13%) homepages. Nine sites were multilingual. The most common navigation bar item was clinical services, on 74 (87%) homepages. Overall, 76 (89%) departments presented their doctors and 17 (20%) presented other staff members; images of doctors appeared on 53 (62%) Web sites, and contact data were accessible from the homepage on 68 (80%). On 25 (29%) Web sites, information on the medical conditions treated was presented; 17 (20%) gave details of treatment concepts, and 4 (5%) gave the numbers of patients with specific conditions treated in the department per year. Conclusion We conclude that many of the investigated online presentations do not comply with recommended criteria for offering professional information to patients and for promoting services. Fewer than one-third of the departments of pediatric surgery in Germany offer information about the medical conditions they treat. Features that may influence the decisions of patients and parents, such as ranking, accessibility, use of social media, multilingual sites, navigation options, selected images, and contact information, were lacking to varying degrees on many Web sites. Georg Thieme Verlag KG Stuttgart · New York.

  1. Development of a laboratory niche Web site.

    PubMed

    Dimenstein, Izak B; Dimenstein, Simon I

    2013-10-01

    This technical note presents the development of a methodological laboratory niche Web site. The "Grossing Technology in Surgical Pathology" (www.grossing-technology.com) Web site is used as an example. Although common steps in creation of most Web sites are followed, there are particular requirements for structuring the template's menu on methodological laboratory Web sites. The "nested doll principle," in which one object is placed inside another, most adequately describes the methodological approach to laboratory Web site design. Fragmentation in presenting the Web site's material highlights the discrete parts of the laboratory procedure. An optimally minimal triad of components can be recommended for the creation of a laboratory niche Web site: a main set of media, a blog, and an ancillary component (host, contact, and links). The inclusion of a blog makes the Web site a dynamic forum for professional communication. By forming links and portals, cloud computing opens opportunities for connecting a niche Web site with other Web sites and professional organizations. As an additional source of information exchange, methodological laboratory niche Web sites are destined to parallel both traditional and new forms, such as books, journals, seminars, webinars, and internal educational materials. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Compilation of VS30 Data for the United States

    USGS Publications Warehouse

    Yong, Alan; Thompson, Eric M.; Wald, David J.; Knudsen, Keith L.; Odum, Jack K.; Stephenson, William J.; Haefner, Scott

    2016-01-01

    VS30, the time-averaged shear-wave velocity (VS) to a depth of 30 meters, is a key index adopted by the earthquake engineering community to account for seismic site conditions. VS30 is typically based on geophysical measurements of VS derived from invasive and noninvasive techniques at sites of interest. Owing to cost considerations, as well as logistical and environmental concerns, VS30 data are sparse or not readily available for most areas. Where data are available, VS30 values are often assembled in assorted formats that are accessible from disparate and (or) impermanent Web sites. To help remedy this situation, we compiled VS30 measurements obtained by studies funded by the U.S. Geological Survey (USGS) and other governmental agencies. Thus far, we have compiled VS30 values for 2,997 sites in the United States, along with metadata for each measurement from government-sponsored reports, Web sites, and scientific and engineering journals. Most of the data in our VS30 compilation originated from publications directly reporting the work of field investigators. A small subset (less than 20 percent) of VS30 values was previously compiled by the USGS and other research institutions. Whenever possible, VS30 values originating from these earlier compilations were crosschecked against published reports. Both downhole and surface-based VS30 estimates are represented in our VS30 compilation. Most of the VS30 data are for sites in the western contiguous United States (2,141 sites), whereas 786 VS30 values are for sites in the Central and Eastern United States; 70 values are for sites in other parts of the United States, including Alaska (15 sites), Hawaii (30 sites), and Puerto Rico (25 sites). An interactive map is hosted on the primary USGS Web site for accessing VS30 data (http://earthquake.usgs.gov/research/vs30/).
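
    Since the record defines VS30 as the time-averaged shear-wave velocity over the top 30 meters, a short worked sketch may help: VS30 is the total depth (30 m) divided by the total shear-wave travel time through that depth. The Python function below computes it from a layered velocity profile; the example profile is hypothetical and not drawn from the USGS compilation.

        def vs30(layer_thicknesses_m, layer_velocities_mps):
            """Time-averaged shear-wave velocity over the top 30 m of a layered profile.

            VS30 = 30 / sum(h_i / Vs_i), where h_i are layer thicknesses (truncated so
            that only the uppermost 30 m contributes) and Vs_i are the corresponding
            shear-wave velocities.
            """
            depth_budget = 30.0
            travel_time = 0.0
            for h, vs in zip(layer_thicknesses_m, layer_velocities_mps):
                if depth_budget <= 0:
                    break
                h_used = min(h, depth_budget)
                travel_time += h_used / vs
                depth_budget -= h_used
            if depth_budget > 0:
                raise ValueError("velocity profile shallower than 30 m")
            return 30.0 / travel_time

        # Example (hypothetical profile): 5 m at 180 m/s, 10 m at 300 m/s, 20 m at 600 m/s
        print(round(vs30([5, 10, 20], [180, 300, 600]), 1))  # ~348.4 m/s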

  3. The quality of online antidepressant drug information: an evaluation of English and Finnish language Web sites.

    PubMed

    Prusti, Marjo; Lehtineva, Susanna; Pohjanoksa-Mäntylä, Marika; Bell, J Simon

    2012-01-01

    The Internet is a frequently used source of drug information, including among people with mental disorders. Online drug information may be narrow in scope, incomplete, and contain errors of omission. To evaluate the quality of online antidepressant drug information in English and Finnish. Forty Web sites were identified using the search terms antidepressants and masennuslääkkeet in English and Finnish, respectively. Included Web sites (14 English, 8 Finnish) were evaluated for aesthetics, interactivity, content coverage, and content correctness using published criteria. All Web sites were assessed using the Date, Author, References, Type, Sponsor (DARTS) and DISCERN quality assessment tools. English and Finnish Web sites had similar aesthetics, content coverage, and content correctness scores. English Web sites were more interactive than Finnish Web sites (P<.05). Overall, adverse drug reactions were covered on 21 of 22 Web sites; however, drug-alcohol interactions were addressed on only 9 of 22 Web sites, and dose was addressed on only 6 of 22 Web sites. Few (2/22 Web sites) provided incorrect information. The DISCERN score was significantly correlated with content coverage (r=0.670, P<.01), content correctness (r=0.663, P<.01), and the DARTS score (r=0.459, P<.05). No Web site provided information about all aspects of antidepressant treatment. Nevertheless, few Web sites provided incorrect information. Both English and Finnish Web sites were similar in terms of aesthetics, content coverage, and content correctness. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. Meeting Reference Responsibilities through Library Web Sites.

    ERIC Educational Resources Information Center

    Adams, Michael

    2001-01-01

    Discusses library Web sites and explains some of the benefits when libraries make their sites into reference portals, linking them to other useful Web sites. Topics include print versus Web information sources; limitations of search engines; what Web sites to include, including criteria for inclusions; and organizing the sites. (LRW)

  5. Web vulnerability study of online pharmacy sites.

    PubMed

    Kuzma, Joanne

    2011-01-01

    Consumers are increasingly using online pharmacies, but these sites may not provide an adequate level of security for consumers' personal data. There is a gap in the research addressing security vulnerabilities in this industry. The objective is to identify the level of web application security vulnerabilities in online pharmacies and the common types of flaws, thus expanding on prior studies. Technical, managerial and legal recommendations on how to mitigate security issues are presented. The proposed four-step method first consists of choosing an online testing tool. The next steps involve choosing a list of 60 online pharmacy sites to test, and then running the software analysis to compile a list of flaws. Finally, an in-depth analysis is performed on the types of web application vulnerabilities. The majority of sites had serious vulnerabilities, the most common flaws being cross-site scripting and outdated software versions. A method is proposed for securing web pharmacy sites, using a multi-phased approach of technical and managerial techniques together with a thorough understanding of national legal requirements for securing systems.
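
    One of the flaw classes reported, outdated server software, can be superficially probed from HTTP response headers. The sketch below (Python with the requests library) reads the advertised Server banner for a list of hypothetical pharmacy URLs; it only illustrates the idea and does not replace the dedicated web-application scanner used in the study.

        import requests

        # Hypothetical storefront URLs; the study tested 60 real sites with a
        # dedicated web-application scanner, which this sketch does not replace.
        SITES = ["https://pharmacy-example-1.test", "https://pharmacy-example-2.test"]

        def server_banner(url: str, timeout: float = 10.0) -> str:
            """Return the advertised Server header, one superficial signal of the
            outdated-software flaw class reported in the study."""
            try:
                resp = requests.get(url, timeout=timeout)
                return resp.headers.get("Server", "not disclosed")
            except requests.RequestException as exc:
                return f"unreachable ({exc.__class__.__name__})"

        for site in SITES:
            print(site, "->", server_banner(site))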

  6. Business Systems Branch Abilities, Capabilities, and Services Web Page

    NASA Technical Reports Server (NTRS)

    Cortes-Pena, Aida Yoguely

    2009-01-01

    During the INSPIRE summer internship I acted as the Business Systems Branch Capability Owner for the Kennedy Web-based Initiative for Communicating Capabilities System (KWICC), with the responsibility of creating a portal that describes the services provided by this Branch. This project will help others achieve a clear view of the services that the Business Systems Branch provides to NASA and the Kennedy Space Center. After collecting the data through interviews with subject matter experts and the literature in Business World and other web sites, I identified discrepancies, made the necessary corrections to the sites, and placed the information from the report into the KWICC web page.

  7. 77 FR 32135 - Notice of Lodging of Consent Decree Under the Clean Air Act and the Emergency Planning and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-31

    ... flare gas recovery systems and improved flaring efficiency, and enhanced controls for leak detection and... the monitoring data to a publicly available Web site on a weekly basis. The Consent Decree also... Department of Justice Web site: http://www.usdoj.gov/enrd/Consent_Decrees.html . A copy of the Decree may...

  8. An Integrated Decision Model for Evaluating Educational Web Sites from the Fuzzy Subjective and Objective Perspectives

    ERIC Educational Resources Information Center

    Huang, Tony Cheng-Kui; Huang, Chih-Hong

    2010-01-01

    With advances in information and network technologies, large amounts of data have been digitized and made available to users through Web sites. Unfortunately, these sites overlap and overload users on the Internet, so that users cannot readily distinguish their quality. To address this issue in education, Hwang, Huang, and Tseng proposed a group…

  9. 78 FR 51781 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-21

    ... proposed rule change is available on the Exchange's Web site at www.nyse.com , at the principal office of... dark pools and electronic communication networks (``ECNs''). Competition among trading platforms can be... provide certain market data at no charge on their Web sites in order to attract more order flow, and use...

  10. 75 FR 11988 - Notice of Request for Approval To Collect New Information: Collection of Safety Culture Data

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ... comments using the Internet, you may use the Web site http://www.regulations.gov . Please follow the instructions for submitting an electronic comment. You can also review comments on-line at the same Web site... consequences (e.g., a train in dark territory proceeds beyond its authority); (3) events that are below the FRA...

  11. PDB-Metrics: a web tool for exploring the PDB contents.

    PubMed

    Fileto, Renato; Kuser, Paula R; Yamagishi, Michel E B; Ribeiro, André A; Quinalia, Thiago G; Franco, Eduardo H; Mancini, Adauto L; Higa, Roberto H; Oliveira, Stanley R M; Santos, Edgard H; Vieira, Fabio D; Mazoni, Ivan; Cruz, Sergio A B; Neshich, Goran

    2006-06-30

    PDB-Metrics (http://sms.cbi.cnptia.embrapa.br/SMS/pdb_metrics/index.html) is a component of the Diamond STING suite of programs for the analysis of protein sequence, structure and function. It summarizes the characteristics of the collection of protein structure descriptions deposited in the Protein Data Bank (PDB) and provides a Web interface to search and browse the PDB, using a variety of alternative criteria. PDB-Metrics is a powerful tool for bioinformaticians to examine the data span in the PDB from several perspectives. Although other Web sites offer some similar resources to explore the PDB contents, PDB-Metrics is among those with the most complete set of such facilities, integrated into a single Web site. This program has been developed using SQLite, a C library that provides all the query facilities of a database management system.
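
    Since the abstract notes that the tool is built on SQLite, a small sketch may help show the kind of summary query such a browser can answer with plain SQL. The table and column names below are illustrative only and are not the schema used by PDB-Metrics.

        import sqlite3

        # Hypothetical local catalogue of PDB entries; schema is illustrative only.
        con = sqlite3.connect("pdb_catalog.db")
        con.execute("""
            CREATE TABLE IF NOT EXISTS entries (
                pdb_id TEXT PRIMARY KEY,
                method TEXT,          -- e.g. X-RAY DIFFRACTION, NMR
                resolution REAL,      -- angstroms, NULL for NMR entries
                deposit_year INTEGER
            )
        """)

        # The kind of summary a PDB-browsing tool can answer with a single query:
        # how many structures per experimental method, and their mean resolution.
        for method, n, avg_res in con.execute(
                "SELECT method, COUNT(*), AVG(resolution) "
                "FROM entries GROUP BY method ORDER BY COUNT(*) DESC"):
            print(f"{method}: {n} entries, mean resolution {avg_res}")
        con.close()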

  12. Usability Evaluation of Public Web Mapping Sites

    NASA Astrophysics Data System (ADS)

    Wang, C.

    2014-04-01

    Web mapping sites are interactive maps that are accessed via Web pages. With the rapid development of the Internet and the Geographic Information System (GIS) field, public web mapping sites have become familiar to most people. Nowadays, people use these web mapping sites for many purposes, as a growing number of maps and related map services are freely available to end users. This growth in users has, in turn, led to more usability studies. Usability Engineering (UE), for instance, is an approach for analyzing and improving the usability of websites through examining and evaluating an interface. In this research, the UE method was employed to explore usability problems of four public web mapping sites, analyze the problems quantitatively, and provide guidelines for future design based on the test results. First, the development of usability studies is described, and several usability evaluation approaches, such as Usability Engineering (UE), User-Centered Design (UCD), and Human-Computer Interaction (HCI), are introduced. The method and procedure of the usability test experiments are then presented in detail. In this usability evaluation experiment, four public web mapping sites (Google Maps, Bing Maps, MapQuest, Yahoo Maps) were chosen as the test websites, and 42 people of differing GIS skill levels (test users or experts), gender, age, and nationality participated, completing several test tasks in different teams. The test comprised three parts: a pretest background information questionnaire, several test tasks for quantitative statistics and progress analysis, and a posttest questionnaire. The pretest and posttest questionnaires focused on qualitative, verbal explanations of participants' actions, whereas the test tasks were designed to gather quantitative data on the errors and problems of the websites. The results, mainly from the test tasks, were then analyzed: the success rates of the different public web mapping sites were calculated, compared, and displayed in diagrams, and the questionnaire answers were classified and organized. Based on this analysis, the paper discusses layout, map visualization, map tools, search logic, and related issues. Finally, the paper closes with guidelines and suggestions for the design of public web mapping sites and notes the limitations of this research.

  13. Designing a web site for high school geoscience teaching in Iceland

    NASA Astrophysics Data System (ADS)

    Douglas, George R.

    1998-08-01

    The need to construct an earth science teaching site on the web prompted a survey of existing sites which, in spite of containing much of value, revealed many weaknesses in basic design, particularly as regards the organisation of links to information resources. Few web sites take into consideration the particular pedagogic needs of the high school science student and there has, as yet, been little serious attempt to exploit and organise the more outstanding advantages offered by the internet to science teaching, such as accessing real-time data. A web site has been constructed which, through basic design, enables students to access relevant information resources over a wide range of subjects and topics easily and rapidly, while at the same time performing an instructional role in how to handle both on-line and off-line resources. Key elements in the design are selection and monitoring by the teacher, task oriented pages and the use of the Dewey decimal classification system. The intention is to increase gradually the extent to which most teaching tasks are carried out via the web pages, in the belief that they can become an efficient central point for all the earth science curriculum.

  14. Human exposure assessment resources on the World Wide Web.

    PubMed

    Schwela, Dieter; Hakkinen, Pertti J

    2004-05-20

    Human exposure assessment is frequently noted as a weak link and bottleneck in the risk assessment process. Fortunately, the World Wide Web and Internet are providing access to numerous valuable sources of human exposure assessment-related information, along with opportunities for information exchange. Internet mailing lists are available as potential online help for exposure assessment questions, e.g. RISKANAL has several hundred members from numerous countries. Various Web sites provide opportunities for training, e.g. Web sites offering general human exposure assessment training include two from the US Environmental Protection Agency (EPA) and four from the US National Library of Medicine. Numerous other Web sites offer access to a wide range of exposure assessment information. For example, the (US) Alliance for Chemical Awareness Web site addresses direct and indirect human exposures, occupational exposures and ecological exposure assessments. The US EPA's Exposure Factors Program Web site provides a focal point for current information and data on exposure factors relevant to the United States. In addition, the International Society of Exposure Analysis Web site provides information about how this society seeks to foster and advance the science of exposure analysis. A major opportunity exists for risk assessors and others to broaden the level of exposure assessment information available via Web sites. Broadening the Web's exposure information could include human exposure factors-related information about country- or region-specific ranges in body weights, drinking water consumption, etc., along with residential factors-related information on air changeovers per hour in various types of residences. Further, country- or region-specific ranges on how various tasks are performed by various types of consumers could be collected and provided. It is noteworthy that efforts are underway in Europe to develop a multi-country collection of exposure factors, and that the European Commission is in the early stages of planning and developing a Web-accessible information system (EIS-ChemRisks) to serve as a single gateway to all major European initiatives on human exposure to chemicals contained in and released from cleaning products, textiles, toys, etc.

  15. Setting Up the JBrowse Genome Browser

    PubMed Central

    Skinner, Mitchell E; Holmes, Ian H

    2010-01-01

    JBrowse is a web-based tool for visualizing genomic data. Unlike most other web-based genome browsers, JBrowse exploits the capabilities of the user's web browser to make scrolling and zooming fast and smooth. It supports the browsers used by almost all internet users, and is relatively simple to install. JBrowse can utilize multiple types of data in a variety of common genomic data formats, including genomic feature data in bioperl databases, GFF files, and BED files, and quantitative data in wiggle files. This unit describes how to obtain the JBrowse software, set it up on a Linux or Mac OS X computer running as a web server and incorporate genome annotation data from multiple sources into JBrowse. After completing the protocols described in this unit, the reader will have a web site that other users can visit to browse the genomic data. PMID:21154710

  16. A Global Overview of Male Escort Websites.

    PubMed

    Kumar, Navin; Minichiello, Victor; Scott, John; Harrington, Taylor

    2017-01-01

    This article details a preliminary dataset of global male escort sites to give insight into the scale of the online market. We conducted a content analysis of 499 Web sites and also measured traffic to these sites. Our analysis examined the structural characteristics of escort services, geographical and regulatory contexts, and resilience of such services. Results suggest that most sites are independent and not affiliated to escort agencies, and the majority cater to male escorts soliciting male clients, with a number of sites for female clientele and couples. These Web sites are dispersed globally, with Asian, European, and South American countries the major hubs in the market and a small number of large multinational sites based in the United States and Europe figuring as a major presence in markets. Although still subject to high levels of regulation in many parts of the world, the data suggest that male escorting is becoming more visible in diverse cultural contexts as measured by the number of Web sites appearing in public spaces.

  17. Web-based recruitment: effects of information, organizational brand, and attitudes toward a Web site on applicant attraction.

    PubMed

    Allen, David G; Mahto, Raj V; Otondo, Robert F

    2007-11-01

    Recruitment theory and research show that objective characteristics, subjective considerations, and critical contact send signals to prospective applicants about the organization and available opportunities. In the generating applicants phase of recruitment, critical contact may consist largely of interactions with recruitment sources (e.g., newspaper ads, job fairs, organization Web sites); however, research has yet to fully address how all 3 types of signaling mechanisms influence early job pursuit decisions in the context of organizational recruitment Web sites. Results based on data from 814 student participants searching actual organization Web sites support and extend signaling and brand equity theories by showing that job information (directly) and organization information (indirectly) are related to intentions to pursue employment when a priori perceptions of image are controlled. A priori organization image is related to pursuit intentions when subsequent information search is controlled, but organization familiarity is not, and attitudes about a recruitment source also influence attraction and partially mediate the effects of organization information. Theoretical and practical implications for recruitment are discussed. (c) 2007 APA

  18. Accessibility and content of individualized adult reconstructive hip and knee/musculoskeletal oncology fellowship web sites.

    PubMed

    Young, Bradley L; Cantrell, Colin K; Patt, Joshua C; Ponce, Brent A

    2018-06-01

    Accessible, adequate online information is important to fellowship applicants. Program web sites can affect which programs applicants apply to, subsequently altering interview costs incurred by both parties and ultimately impacting rank lists. Web site analyses have been performed for all orthopaedic subspecialties other than those involved in the combined adult reconstruction and musculoskeletal (MSK) oncology fellowship match. A complete list of active programs was obtained from the official adult reconstruction and MSK oncology society web sites. Web site accessibility was assessed using a structured Google search. Accessible web sites were evaluated based on 21 previously reported content criteria. Seventy-four adult reconstruction programs and 11 MSK oncology programs were listed on the official society web sites. Web sites were identified and accessible for 58 (78%) adult reconstruction and 9 (82%) MSK oncology fellowship programs. No web site contained all content criteria and more than half of both adult reconstruction and MSK oncology web sites failed to include 12 of the 21 criteria. Several programs participating in the combined Adult Reconstructive Hip and Knee/Musculoskeletal Oncology Fellowship Match did not have accessible web sites. Of the web sites that were accessible, none contained comprehensive information and the majority lacked information that has been previously identified as being important to prospective applicants.

  19. Web usage data mining agent

    NASA Astrophysics Data System (ADS)

    Madiraju, Praveen; Zhang, Yanqing

    2002-03-01

    When a user logs in to a website, behind the scenes the user leaves his/her impressions, usage patterns, and access patterns in the web server's log file. A web usage mining agent can analyze these web logs to help web developers improve the organization and presentation of their websites. It can also help system administrators improve system performance. Web logs provide invaluable help in creating adaptive web sites and in network traffic analysis. This paper presents the design and implementation of a Web usage mining agent for digging into web log files.
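
    As a small illustration of the kind of log analysis described, the Python sketch below parses web server log lines in the standard NCSA Common Log Format and counts successful requests per page, one ingredient of the usage patterns that guide site reorganization; the log format and file name are assumptions, not the agent described in the paper.

        import re
        from collections import Counter

        # Minimal Common Log Format parser; the field layout is the standard NCSA
        # format, which may differ from the log files the paper's agent consumed.
        LOG_LINE = re.compile(
            r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
            r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
        )

        def page_popularity(log_path: str) -> Counter:
            """Count successful requests per URL path, a first step toward the usage
            patterns (popular pages, entry points) that inform site reorganization."""
            counts: Counter = Counter()
            with open(log_path) as fh:
                for line in fh:
                    m = LOG_LINE.match(line)
                    if m and m.group("status").startswith("2"):
                        counts[m.group("path")] += 1
            return counts

        # for path, n in page_popularity("access.log").most_common(10):
        #     print(n, path)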

  20. Access to Space Interactive Design Web Site

    NASA Technical Reports Server (NTRS)

    Leon, John; Cutlip, William; Hametz, Mark

    2000-01-01

    The Access To Space (ATS) Group at NASA's Goddard Space Flight Center (GSFC) supports the science and technology community at GSFC by facilitating frequent and affordable opportunities for access to space. Through partnerships established with access mode suppliers, the ATS Group has developed an interactive Mission Design web site. The ATS web site provides both the information and the tools necessary to assist mission planners in selecting and planning their ride to space. This includes the evaluation of single payloads vs. ride-sharing opportunities to reduce the cost of access to space. Features of this site include the following: (1) Mission Database. Our mission database contains a listing of missions ranging from proposed to manifested. Missions can be entered by our user community through data input tools. Data is then accessed by users through various search engines: orbit parameters, ride-share opportunities, spacecraft parameters, other mission notes, launch vehicle, and contact information. (2) Launch Vehicle Toolboxes. The launch vehicle toolboxes provide the user a full range of information on vehicle classes and individual configurations. Topics include: general information, environments, performance, payload interface, available volume, and launch sites.

  1. A systematic review of patient inflammatory bowel disease information resources on the World Wide Web.

    PubMed

    Bernard, André; Langille, Morgan; Hughes, Stephanie; Rose, Caren; Leddin, Desmond; Veldhuyzen van Zanten, Sander

    2007-09-01

    The Internet is a widely used information resource for patients with inflammatory bowel disease, but there is variation in the quality of Web sites that have patient information regarding Crohn's disease and ulcerative colitis. The purpose of the current study is to systematically evaluate the quality of these Web sites. The top 50 Web sites appearing in Google using the terms "Crohn's disease" or "ulcerative colitis" were included in the study. Web sites were evaluated using (a) a Quality Evaluation Instrument (QEI) that awarded Web sites points (0-107) for specific information on various aspects of inflammatory bowel disease, (b) a five-point Global Quality Score (GQS), (c) two reading grade level scores, and (d) a six-point integrity score. Thirty-four Web sites met the inclusion criteria; 16 Web sites were excluded because they were portals or not IBD oriented. The median QEI score was 57, with five Web sites scoring higher than 75 points. The median Global Quality Score was 2.0, with five Web sites achieving scores of 4 or 5. The average reading grade level score was 11.2. The median integrity score was 3.0. There is marked variation in the quality of the Web sites containing information on Crohn's disease and ulcerative colitis. Many Web sites suffered from poor quality, but there were five high-scoring Web sites.

  2. w4CSeq: software and web application to analyze 4C-seq data.

    PubMed

    Cai, Mingyang; Gao, Fan; Lu, Wange; Wang, Kai

    2016-11-01

    Circularized Chromosome Conformation Capture followed by deep sequencing (4C-Seq) is a powerful technique to identify genome-wide partners interacting with a pre-specified genomic locus. Here, we present a computational and statistical approach to analyze 4C-Seq data generated from both enzyme digestion and sonication fragmentation-based methods. We implemented a command line software tool and a web interface called w4CSeq, which takes in the raw 4C sequencing data (FASTQ files) as input, performs automated statistical analysis and presents results in a user-friendly manner. Besides providing users with the list of candidate interacting sites/regions, w4CSeq generates figures showing the genome-wide distribution of interacting regions, and sketches the enrichment of key features such as TSSs, TTSs, CpG sites and DNA replication timing around 4C sites. Users can establish their own web server by downloading the source code at https://github.com/WGLab/w4CSeq. Additionally, a demo web server is available at http://w4cseq.wglab.org. Contact: kaiwang@usc.edu or wangelu@usc.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
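
    The enrichment of features such as TSSs around candidate 4C interaction sites, which the abstract says w4CSeq sketches, boils down to window-overlap counting. The minimal Python sketch below reports the fraction of candidate sites with at least one feature within a fixed window; the coordinates, window size, and single-chromosome simplification are assumptions for illustration, not w4CSeq's actual statistics.

        from bisect import bisect_left, bisect_right

        def feature_enrichment(site_positions, feature_positions, window=50_000):
            """Fraction of candidate 4C interaction sites with at least one feature
            (e.g., a TSS) within +/- `window` bp on the same chromosome.

            Positions are plain integer coordinates on one chromosome; a real analysis
            would work per chromosome and normalize against a background set.
            """
            feats = sorted(feature_positions)
            hits = 0
            for pos in site_positions:
                lo = bisect_left(feats, pos - window)
                hi = bisect_right(feats, pos + window)
                if hi > lo:
                    hits += 1
            return hits / len(site_positions) if site_positions else 0.0

        # Toy example with made-up coordinates:
        print(feature_enrichment([120_000, 480_000, 910_000], [100_000, 500_000]))  # 0.666...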

  3. Manning's roughness coefficient for Illinois streams

    USGS Publications Warehouse

    Soong, David T.; Prater, Crystal D.; Halfar, Teresa M.; Wobig, Loren A.

    2012-01-01

    Manning's roughness coefficients for 43 natural and constructed streams in Illinois are reported and displayed on a U.S. Geological Survey Web site. At a majority of the sites, discharge and stage were measured, and corresponding Manning's coefficients—the n-values—were determined at more than one river discharge. The n-values discussed in this report are computed from data representing the stream reach studied and, therefore, are reachwise values. Presentation of the resulting n-values takes a visual-comparison approach similar to the previously published Barnes report (1967), in which photographs of channel conditions, description of the site, and the resulting n-values are organized for each site. The Web site where the data can be accessed and are displayed is at URL http://il.water.usgs.gov/proj/nvalues/.
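
    As a rough illustration of how an n-value can be back-calculated from a discharge measurement, the sketch below applies the SI form of Manning's equation, Q = (1/n) A R^(2/3) S^(1/2). The numbers in the example are hypothetical, and the report's reachwise procedure (which uses measured stage and discharge over a full reach) is more involved than this single-section formula.

        def mannings_n(discharge_m3s, area_m2, hydraulic_radius_m, slope):
            """Back-calculate Manning's n from a measured discharge using the SI form
            of Manning's equation, Q = (1/n) * A * R^(2/3) * S^(1/2).

            The reachwise n-values in the report come from stage and discharge measured
            over a studied reach; this single-section formula is only the textbook core
            of that computation.
            """
            return (area_m2 * hydraulic_radius_m ** (2 / 3) * slope ** 0.5) / discharge_m3s

        # Hypothetical measurement: Q = 25 m^3/s, A = 18 m^2, R = 1.2 m, S = 0.001
        print(round(mannings_n(25.0, 18.0, 1.2, 0.001), 3))  # ~0.026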

  4. Critical loads and levels: Leveraging existing monitoring data

    Treesearch

    D. G. Fox; A. R. Riebau; R. Fisher

    2006-01-01

    A snapshot of current air quality in the National Parks and Wilderness areas of the US is presented based on data from the 165 site Interagency Monitoring of Protected Visual Environments, or IMPROVE program, and other relevant air quality monitoring programs. This snapshot is provided using the VIEWS web service, an on-line web-based data warehouse, analysis, and...

  5. Pediatric Price Transparency: Still Opaque With Opportunities for Improvement.

    PubMed

    Faherty, Laura J; Wong, Charlene A; Feingold, Jordyn; Li, Joan; Town, Robert; Fieldston, Evan; Werner, Rachel M

    2017-10-01

    Price transparency is gaining importance as families' portion of health care costs rises. We describe (1) online price transparency data for pediatric care on children's hospital Web sites and state-based price transparency Web sites, and (2) the consumer experience of obtaining an out-of-pocket estimate from children's hospitals for a common procedure. From 2015 to 2016, we audited 45 children's hospital Web sites and 38 state-based price transparency Web sites, describing availability and characteristics of health care prices and personalized cost estimate tools. Using secret shopper methodology, we called children's hospitals and submitted online estimate requests posing as a self-paying family requesting an out-of-pocket estimate for a tonsillectomy-adenoidectomy. Eight children's hospital Web sites (18%) listed prices. Twelve (27%) provided a personalized cost estimate tool (online form, n = 5, and/or phone number, n = 9). All 9 hospitals with a phone number for estimates provided the estimated patient liability for a tonsillectomy-adenoidectomy (mean $6,008, range $2,622-$9,840). Of the remaining 36 hospitals without a dedicated price estimate phone number, 21 (58%) provided estimates (mean $7,144, range $1,200-$15,360). Two of 4 hospitals with online forms provided estimates. Fifteen (39%) state-based Web sites distinguished between prices for pediatric and adult care. One had a personalized cost estimate tool. Meaningful prices for pediatric care were not widely available online through children's hospital or state-based price transparency Web sites. Phone lines and online forms for price estimates were effective strategies for hospitals to provide out-of-pocket price information. Opportunities exist to improve pediatric price transparency. Copyright © 2017 by the American Academy of Pediatrics.

  6. Organ procurement organizations Internet enrollment for organ donation: Abandoning informed consent

    PubMed Central

    Woien, Sandra; Rady, Mohamed Y; Verheijde, Joseph L; McGregor, Joan

    2006-01-01

    Background Requirements for organ donation after cardiac or imminent death have been introduced to address the transplantable organs shortage in the United States. Organ procurement organizations (OPOs) increasingly use the Internet for organ donation consent. Methods An analysis of OPO Web sites available to the public for enrollment and consent for organ donation. The Web sites and consent forms were examined for the minimal information recommended by the United States Department of Health and Human Services for informed consent. Content scores were calculated as percentages of data elements in four information categories: donor knowledge, donor consent reinforcement, donation promotion, and informed consent. Results There were 60 Web sites for organ donation enrollment serving the 52 states. The median (10th–90th percentile) content scores of the Web sites for donor knowledge, donor consent reinforcement, and donation promotion were 33% (20–47), 79% (57–86), and 75% (50–100), respectively. The informed consent score was 0% (0–33). The content scores for donor knowledge and informed consent were significantly lower than those for donor consent reinforcement and donation promotion for all Web sites (P < .05). The content scores for the four categories were similar among the 11 regions of the United Network for Organ Sharing. Conclusion The Web sites and consent forms for public enrollment in organ donation do not fulfill the necessary requirements for informed consent. The Web sites predominantly provide positive reinforcement and promotional information rather than transparent disclosure of the organ donation process. Independent regulatory oversight is essential to ensure that Internet enrollment for organ donation complies with legal and ethical standards for informed consent. PMID:17187671

  7. Reservoir High's TE Site Wins Web Site of the Month

    ERIC Educational Resources Information Center

    Tech Directions, 2008

    2008-01-01

    This article features "Mr. Rhine's Technology Education Web Site," a winner of the Web Site of the Month. This Web site was designed by Luke Rhine, a teacher at the Reservoir High School in Fulton, Maryland. Rhine's Web site offers course descriptions and syllabuses, class calendars, lectures and presentations, design briefs and other course…

  8. The Way of the Web: Answers to Your Questions about Web Site Marketing.

    ERIC Educational Resources Information Center

    Wassom, Julie

    2002-01-01

    Provides suggestions for effective web site marketing for child care and early education programs. Includes key considerations in designing a web site, specific elements that cause visitors to stay on and return to the site, use of interactive sites, web-site updating and revision, and use of traditional marketing activities to direct prospective…

  9. A Web-based Data Intensive Visualization of Real-time River Drainage Network Response to Rainfall

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.

    2012-04-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to and visualization of flood inundation maps, real-time flood conditions, flood forecasts both short-term and seasonal, and other flood-related data for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS streams rainfall data from NEXRAD radar and provides three interfaces, including animations of rainfall intensity, daily rainfall totals, and rainfall accumulations for the past 14 days for Iowa. A real-time interactive visualization interface was developed using past rainfall intensity data. The interface creates community-based rainfall products on demand using the watershed boundaries of each community as a mask. Each individual rainfall pixel is tracked along the drainage network, and pixels that drain to the same location are accumulated. The interface loads recent rainfall data in five-minute intervals, which are combined with current values. The latest web technologies, including HTML5 Canvas and JavaScript, are used to develop the interface. The performance of the interface is optimized to run smoothly on modern web browsers. The interface controls allow users to change internal parameters of the system and the operating conditions of the animation. The interface will help communities understand the effects of rainfall on water transport in stream and river networks and make better-informed decisions regarding the threat of floods. This presentation provides an overview of a unique visualization interface and discusses future plans for real-time dynamic presentations of streamflow forecasting.
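
    The pixel-tracking step described above, in which rainfall from each pixel is routed down the drainage network and summed where paths converge, is essentially flow accumulation. The Python sketch below shows the idea on a toy network; the data structures are assumptions for illustration, and the IFIS interface itself works on gridded NEXRAD rainfall and real watershed boundaries.

        from collections import defaultdict

        def accumulate_rainfall(rain, downstream):
            """Route rainfall along a drainage network: each pixel's value is added to
            every pixel downstream of it, so outlets accumulate their upstream total.

            `rain` maps pixel id -> rainfall depth; `downstream` maps pixel id -> the
            next pixel it drains to (None at an outlet). Both are toy structures.
            """
            total = defaultdict(float)
            for pixel, depth in rain.items():
                node = pixel
                while node is not None:
                    total[node] += depth
                    node = downstream.get(node)
            return dict(total)

        # Tiny three-pixel network: a -> b -> c (outlet)
        rain = {"a": 5.0, "b": 2.0, "c": 1.0}
        downstream = {"a": "b", "b": "c", "c": None}
        print(accumulate_rainfall(rain, downstream))  # {'a': 5.0, 'b': 7.0, 'c': 8.0}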

  10. Use of StreamStats in the Upper French Broad River Basin, North Carolina: A Pilot Water-Resources Web Application

    USGS Publications Warehouse

    Wagner, Chad R.; Tighe, Kirsten C.; Terziotti, Silvia

    2009-01-01

    StreamStats is a Web-based Geographic Information System (GIS) application that was developed by the U.S. Geological Survey (USGS) in cooperation with Environmental Systems Research Institute, Inc. (ESRI) to provide access to an assortment of analytical tools that are useful for water-resources planning and management. StreamStats allows users to easily obtain streamflow statistics, basin characteristics, and descriptive information for USGS data-collection sites and selected ungaged sites. StreamStats also allows users to identify stream reaches upstream and downstream from user-selected sites and obtain information for locations along streams where activities occur that can affect streamflow conditions. This functionality can be accessed through a map-based interface with the user's Web browser or through individual functions requested remotely through other Web applications.

  11. Web Site Credibility: Why Do People Believe What They Believe?

    ERIC Educational Resources Information Center

    Iding, Marie K.; Crosby, Martha E.; Auernheimer, Brent; Klemm, E. Barbara

    2009-01-01

    This research investigates university students' determinations of credibility of information on Web sites, confidence in their determinations, and perceptions of Web site authors' vested interests. In Study 1, university-level computer science and education students selected Web sites determined to be credible and Web sites that exemplified…

  12. Personality in cyberspace: personal Web sites as media for personality expressions and impressions.

    PubMed

    Marcus, Bernd; Machilek, Franz; Schütz, Astrid

    2006-06-01

    This research examined the personality of owners of personal Web sites based on self-reports, visitors' ratings, and the content of the Web sites. The authors compared a large sample of Web site owners with population-wide samples on the Big Five dimensions of personality. Controlling for demographic differences, the average Web site owner reported being slightly less extraverted and more open to experience. Compared with various other samples, Web site owners did not generally differ on narcissism, self-monitoring, or self-esteem, but gender differences on these traits were often smaller in Web site owners. Self-other agreement was highest with Openness to Experience, but valid judgments of all Big Five dimensions were derived from Web sites providing rich information. Visitors made use of quantifiable features of the Web site to infer personality, and the cues they utilized partly corresponded to self-reported traits. Copyright 2006 APA, all rights reserved.

  13. NDBC Tropical Atmosphere Ocean (TAO)

    Science.gov Websites

    NOAA National Data Buoy Center (NDBC) Web portal for the Tropical Atmosphere Ocean (TAO) project, providing recent data observations and search access to TAO and DART data.

  14. U.S. Geological Survey and Microsoft Cooperative Research and Development Agreement: Geospatial Data Browsing and Retrieval Site on the World Wide Web

    USGS Publications Warehouse

    ,

    1999-01-01

    In May 1997, the U.S. Geological Survey (USGS) and the Microsoft Corporation of Redmond, Wash., entered into a cooperative research and development agreement (CRADA) to make vast amounts of geospatial data available to the general public through the Internet. The CRADA is a 36-month joint effort to develop a general, public-oriented browsing and retrieval site for geospatial data on the Internet. Specifically, Microsoft plans to (1) modify a large volume of USGS geospatial data so the images can be displayed quickly and easily over the Internet, (2) implement an easy-to-use interface for low-speed connections, and (3) develop an Internet Web site capable of servicing millions of users per day.

  15. U.S. Geological Survey and Microsoft Cooperative Research and Development Agreement: Geospatial Data Browsing and Retrieval Site on the World Wide Web

    USGS Publications Warehouse

    ,

    1998-01-01

    In May 1997, the U.S. Geological Survey (USGS) and the Microsoft Corporation of Redmond, Wash., entered into a cooperative research and development agreement (CRADA) to make vast amounts of geospatial data available to the general public through the Internet. The CRADA is a 36-month joint effort to develop a general, public-oriented browsing and retrieval site for geospatial data on the Internet. Specifically, Microsoft plans to (1) modify a large volume of USGS geospatial data so the images can be displayed quickly and easily over the Internet, (2) implement an easy-to-use interface for low-speed connections, and (3) develop an Internet Web site capable of servicing millions of users per day.

  16. Unifying Access to National Hydrologic Data Repositories via Web Services

    NASA Astrophysics Data System (ADS)

    Valentine, D. W.; Jennings, B.; Zaslavsky, I.; Maidment, D. R.

    2006-12-01

    The CUAHSI hydrologic information system (HIS) is designed to be a live, multiscale web portal system for accessing, querying, visualizing, and publishing distributed hydrologic observation data and models for any location or region in the United States. The HIS design follows the principles of open service-oriented architecture, i.e., system components are represented as web services with well-defined standard service APIs. WaterOneFlow web services are the main component of the design. The currently available services have been completely re-written compared to the previous version, and provide programmatic access to USGS NWIS (streamflow, groundwater, and water-quality repositories), DAYMET daily observations, NASA MODIS, and Unidata NAM streams, with several additional web service wrappers being added (EPA STORET, NCDC, and others). Different repositories of hydrologic data use different vocabularies and support different types of query access. Resolving semantic and structural heterogeneities across different hydrologic observation archives and distilling a generic set of service signatures is one of the main scalability challenges in this project, and a requirement in our web service design. To accomplish the uniformity of the web services API, data repositories are modeled following the CUAHSI Observations Data Model. The web service responses are document-based and use an XML schema to express the semantics in a standard format. Access to station metadata is provided via the web service methods GetSites, GetSiteInfo, and GetVariableInfo. These methods form the foundation of the CUAHSI HIS discovery interface and may execute over locally stored metadata or request the information from remote repositories directly. Observation values are retrieved via a generic GetValues method, which is executed against national data repositories. The service is implemented in ASP.Net, and other providers are implementing WaterOneFlow services in Java. A reference implementation of WaterOneFlow web services is available. More information about the ongoing development of CUAHSI HIS is available from http://www.cuahsi.org/his/.
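
    The discovery-then-retrieval pattern implied by the GetSites/GetSiteInfo/GetVariableInfo/GetValues methods can be exercised with any generic SOAP client. The Python sketch below uses the zeep library against a hypothetical WaterOneFlow endpoint; the WSDL URL, site and variable codes, and parameter lists are assumptions for illustration, since the abstract names the methods but not their exact signatures.

        from zeep import Client  # generic SOAP client; pip install zeep

        # WSDL location and argument lists are assumptions; the abstract only states
        # that the service exposes GetSites, GetSiteInfo, GetVariableInfo, and
        # GetValues following the CUAHSI Observations Data Model.
        WSDL = "http://example.org/cuahsi_1_0.asmx?WSDL"  # hypothetical endpoint

        client = Client(WSDL)

        # Discovery: what sites and variables does this repository expose?
        sites_xml = client.service.GetSites([], "")                 # assumed signature
        site_info = client.service.GetSiteInfo("NWIS:02087500", "") # hypothetical site code

        # Retrieval: pull observation values for one site/variable/time range.
        values_xml = client.service.GetValues(
            "NWIS:02087500",            # site code (hypothetical)
            "NWIS:00060",               # variable code, e.g. discharge (hypothetical)
            "2006-01-01", "2006-01-31",
            "",                         # auth token, if required
        )
        print(values_xml[:200])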

  17. Traffic-based feedback on the web.

    PubMed

    Aizen, Jonathan; Huttenlocher, Daniel; Kleinberg, Jon; Novak, Antal

    2004-04-06

    Usage data at a high-traffic web site can expose information about external events and surges in popularity that may not be accessible solely from analyses of content and link structure. We consider sites that are organized around a set of items available for purchase or download (for example, an e-commerce site or a collection of online research papers), and we study a simple indicator of collective user interest in an item, the batting average, defined as the fraction of visits to an item's description that result in an acquisition of that item. We develop a stochastic model for identifying points in time at which an item's batting average experiences significant change. In experiments with usage data from the Internet Archive, we find that such changes often occur in an abrupt, discrete fashion, and that these changes can be closely aligned with events such as the highlighting of an item on the site or the appearance of a link from an active external referrer. In this way, analyzing the dynamics of item popularity at an active web site can help characterize the impact of a range of events taking place both on and off the site.
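
    As a toy illustration of the batting-average idea, the Python sketch below computes the statistic for two time windows and applies a crude two-proportion z-test to ask whether the change is larger than sampling noise would explain; the paper itself fits a stochastic model to locate change points, so the test here is a simplified stand-in, and the counts are made up.

        from math import sqrt

        def batting_average(acquisitions: int, visits: int) -> float:
            """Fraction of visits to an item's description that end in an acquisition."""
            return acquisitions / visits if visits else 0.0

        def significant_change(a1, n1, a2, n2, z_threshold=3.0) -> bool:
            """Crude two-proportion z-test between two time windows.

            The paper fits a stochastic model to locate change points; this sketch only
            illustrates the underlying question (did the acquisition rate really move,
            or is the difference explainable by sampling noise?).
            """
            p1, p2 = batting_average(a1, n1), batting_average(a2, n2)
            p = (a1 + a2) / (n1 + n2)
            se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
            return se > 0 and abs(p1 - p2) / se > z_threshold

        # Example: an item is linked from an active external referrer and its rate jumps.
        print(significant_change(a1=30, n1=1000, a2=90, n2=1000))  # True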

  18. Traffic-based feedback on the web

    PubMed Central

    Aizen, Jonathan; Huttenlocher, Daniel; Kleinberg, Jon; Novak, Antal

    2004-01-01

    Usage data at a high-traffic web site can expose information about external events and surges in popularity that may not be accessible solely from analyses of content and link structure. We consider sites that are organized around a set of items available for purchase or download (for example, an e-commerce site or a collection of online research papers), and we study a simple indicator of collective user interest in an item, the batting average, defined as the fraction of visits to an item's description that result in an acquisition of that item. We develop a stochastic model for identifying points in time at which an item's batting average experiences significant change. In experiments with usage data from the Internet Archive, we find that such changes often occur in an abrupt, discrete fashion, and that these changes can be closely aligned with events such as the highlighting of an item on the site or the appearance of a link from an active external referrer. In this way, analyzing the dynamics of item popularity at an active web site can help characterize the impact of a range of events taking place both on and off the site. PMID:14709676

  19. Technical Services on the Net: Where Are We Now? A Comparative Study of Sixty Web Sites of Academic Libraries

    ERIC Educational Resources Information Center

    Wang, Jianrong; Gao, Vera

    2004-01-01

    This study examines sixty academic libraries' Web sites and finds that 80 percent of them do not have a technical services homepage. Data reveal that an institution's status might be a factor in whether a library has such a page. Further content analysis suggests there is an appropriate and useful public service role that technical services…

  20. StreamStats in North Carolina: a water-resources Web application

    USGS Publications Warehouse

    Weaver, J. Curtis; Terziotti, Silvia; Kolb, Katharine R.; Wagner, Chad R.

    2012-01-01

    A statewide StreamStats application for North Carolina was developed in cooperation with the North Carolina Department of Transportation following completion of a pilot application for the upper French Broad River basin in western North Carolina (Wagner and others, 2009). StreamStats for North Carolina, available at http://water.usgs.gov/osw/streamstats/north_carolina.html, is a Web-based Geographic Information System (GIS) application developed by the U.S. Geological Survey (USGS) in consultation with Environmental Systems Research Institute, Inc. (Esri) to provide access to an assortment of analytical tools that are useful for water-resources planning and management (Ries and others, 2008). The StreamStats application provides an accurate and consistent process that allows users to easily obtain streamflow statistics, basin characteristics, and descriptive information for USGS data-collection sites and user-selected ungaged sites. In the North Carolina application, users can compute 47 basin characteristics and peak-flow frequency statistics (Weaver and others, 2009; Robbins and Pope, 1996) for a delineated drainage basin. Selected streamflow statistics and basin characteristics for data-collection sites have been compiled from published reports and also are immediately accessible by querying individual sites from the web interface. Examples of basin characteristics that can be computed in StreamStats include drainage area, stream slope, mean annual precipitation, and percentage of forested area (Ries and others, 2008). Examples of streamflow statistics that were previously available only through published documents include peak-flow frequency, flow-duration, and precipitation data. These data are valuable for making decisions related to bridge design, floodplain delineation, water-supply permitting, and sustainable stream quality and ecology. The StreamStats application also allows users to identify stream reaches upstream and downstream from user-selected sites and obtain information for locations along streams where activities occur that may affect streamflow conditions. This functionality can be accessed through a map-based interface with the user’s Web browser, or individual functions can be requested remotely through Web services (Ries and others, 2008).

  1. Farm Mapping to Assist, Protect, and Prepare Emergency Responders: Farm MAPPER.

    PubMed

    Reyes, Iris; Rollins, Tami; Mahnke, Andrea; Kadolph, Christopher; Minor, Gerald; Keifer, Matthew

    2014-01-01

    Responders such as firefighters and emergency medical technicians who respond to farm emergencies often face complex and unknown environments. They may encounter hazards such as fuels, solvents, pesticides, caustics, and exploding gas storage cylinders. Responders may be unaware of dirt roads within the farm that can expedite their arrival at critical sites or of snow-covered manure pits that act as hidden hazards. A response to a farm, unless guided by someone familiar with the operation, may present a risk to responders and pose a challenge in locating the victim. This project explored the use of a Web-based farm-mapping application, optimized for tablets and accessed via on-site matrix barcodes, or quick response (QR) codes, to provide emergency responders with hazard and resource information for agricultural operations. Secured portals were developed for both farmers and responders, allowing both parties to populate and customize farm maps with icons. Data were stored online and linked to QR codes attached to mailbox posts, where emergency responders may read them with a mobile device. Mock responses were conducted on dairy farms to test QR code linking efficacy, Web site security, and field usability. Findings from farmer usability tests showed willingness to enter data as well as ease of Web site navigation and data entry, even among farmers with limited computer knowledge. Usability tests with emergency responders showed ease of QR code connectivity to the farm maps and ease of Web site navigation. Further research is needed to improve data security, assess the program's applicability to nonfarm environments, and support integration with existing emergency response systems. The next phases of this project will expand the program for regional and national use, develop QR code-linked, Web-based extrication guidance for farm machinery for victim entrapment rescue, and create QR code-linked online training videos and materials for limited-English-proficient immigrant farm workers.
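
    The mailbox-post QR codes described here simply encode a URL that opens the farm's map in the responder portal. A minimal Python sketch of that step is shown below using the qrcode library; the portal URL is a hypothetical placeholder, and the project's actual portal addresses and access controls are not described in the abstract.

        import qrcode  # pip install qrcode[pil]

        # Hypothetical URL of one farm's map in the responder portal.
        farm_map_url = "https://farm-mapper.example.org/maps/farm-1234"

        # Encode the URL so a responder's tablet can open the map by scanning the
        # code posted at the farm entrance (e.g., on a mailbox post).
        img = qrcode.make(farm_map_url)
        img.save("farm-1234-qr.png")
        print("QR code written to farm-1234-qr.png")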

  2. Using Gender Schema Theory to Examine Gender Equity in Computing: a Preliminary Study

    NASA Astrophysics Data System (ADS)

    Agosto, Denise E.

    Women continue to constitute a minority of computer science majors in the United States and Canada. One possible contributing factor is that most Web sites, CD-ROMs, and other digital resources do not reflect girls' design and content preferences. This article describes a pilot study that considered whether gender schema theory can serve as a framework for investigating girls' Web site design and content preferences. Eleven 14- and 15-year-old girls participated in the study. The methodology included the administration of the Children's Sex-Role Inventory (CSRI), Web-surfing sessions, interviews, and data analysis using iterative pattern coding. On the basis of their CSRI scores, the participants were divided into feminine-high (FH) and masculine-high (MH) groups. Data analysis uncovered significant differences in the criteria the groups used to evaluate Web sites. The FH group favored evaluation criteria relating to graphic and multimedia design, whereas the MH group favored evaluation criteria relating to subject content. Models of the two groups' evaluation criteria are presented, and the implications of the findings are discussed.

  3. Web-services-based spatial decision support system to facilitate nuclear waste siting

    NASA Astrophysics Data System (ADS)

    Huang, L. Xinglai; Sheng, Grant

    2006-10-01

    The availability of spatial web services enables data sharing among managers, decision and policy makers, and other stakeholders in much simpler ways than before, and has consequently created completely new opportunities in the process of spatial decision making. Though generally designed for a certain problem domain, web-services-based spatial decision support systems (WSDSS) can provide a flexible problem-solving environment in which to explore the decision problem, understand and refine the problem definition, and generate and evaluate multiple decision alternatives. This paper presents a new framework for the development of a web-services-based spatial decision support system. The WSDSS is composed of distributed web services that either provide their own functions or serve different geospatial data, and that may reside on different computers and at different locations. A WSDSS includes six key components: a database management system, a catalog, analysis functions and models, GIS viewers and editors, report generators, and graphical user interfaces. In this study, the architecture of a web-services-based spatial decision support system to facilitate nuclear waste siting is described as an example. The theoretical, conceptual, and methodological challenges and issues associated with developing web-services-based spatial decision support systems are described.
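
    To make the six-component decomposition concrete, the Python sketch below writes out minimal interfaces for a few of the server-side components (catalog, database service, analysis service, report generator) and a small orchestration function. The method names and payload shapes are assumptions for illustration, since the paper describes the architecture conceptually rather than as a concrete API; the GIS viewers/editors and graphical user interfaces are client-side pieces omitted here.

        from typing import Protocol, Any, Dict, List

        class Catalog(Protocol):
            def find_services(self, keyword: str) -> List[str]: ...

        class DatabaseService(Protocol):
            def query(self, layer: str, region: Dict[str, float]) -> List[Dict[str, Any]]: ...

        class AnalysisService(Protocol):
            def evaluate_site(self, candidate: Dict[str, Any]) -> float: ...

        class ReportGenerator(Protocol):
            def render(self, results: List[Dict[str, Any]]) -> str: ...

        def rank_candidate_sites(catalog: Catalog,
                                 db: DatabaseService,
                                 analysis: AnalysisService,
                                 report: ReportGenerator,
                                 region: Dict[str, float]) -> str:
            """Orchestrate distributed services: fetch candidate site data for a region,
            score each candidate, and produce a report for decision makers."""
            candidates = db.query("candidate_sites", region)
            scored = [{"site": c, "score": analysis.evaluate_site(c)} for c in candidates]
            scored.sort(key=lambda s: s["score"], reverse=True)
            return report.render(scored)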

  4. The Ensembl Web Site: Mechanics of a Genome Browser

    PubMed Central

    Stalker, James; Gibbins, Brian; Meidl, Patrick; Smith, James; Spooner, William; Hotz, Hans-Rudolf; Cox, Antony V.

    2004-01-01

    The Ensembl Web site (http://www.ensembl.org/) is the principal user interface to the data of the Ensembl project, and currently serves >500,000 pages (∼2.5 million hits) per week, providing access to >80 GB (gigabyte) of data to users in more than 80 countries. Built atop an open-source platform comprising Apache/mod_perl and the MySQL relational database management system, it is modular, extensible, and freely available. It is being actively reused and extended in several different projects, and has been downloaded and installed in companies and academic institutions worldwide. Here, we describe some of the technical features of the site, with particular reference to its dynamic configuration that enables it to handle disparate data from multiple species. PMID:15123591

  5. The Ensembl Web site: mechanics of a genome browser.

    PubMed

    Stalker, James; Gibbins, Brian; Meidl, Patrick; Smith, James; Spooner, William; Hotz, Hans-Rudolf; Cox, Antony V

    2004-05-01

    The Ensembl Web site (http://www.ensembl.org/) is the principal user interface to the data of the Ensembl project, and currently serves >500,000 pages (approximately 2.5 million hits) per week, providing access to >80 GB (gigabyte) of data to users in more than 80 countries. Built atop an open-source platform comprising Apache/mod_perl and the MySQL relational database management system, it is modular, extensible, and freely available. It is being actively reused and extended in several different projects, and has been downloaded and installed in companies and academic institutions worldwide. Here, we describe some of the technical features of the site, with particular reference to its dynamic configuration that enables it to handle disparate data from multiple species.

  6. Perceptions of Business Students' Feature Requirements in Educational Web Sites

    ERIC Educational Resources Information Center

    Hazari, Sunil; Johnson, Barbara

    2007-01-01

    There is paucity of original research that explains phenomena related to content organization and site design of educational Web sites. Educational Web sites are often used to provide Web-based instruction, which itself is a relatively recent phenomenon for business schools, and additional research is needed in this area. Educational Web sites are…

  7. The establishment of a statewide surveillance program for hospital-acquired infections in large Victorian public hospitals: a report from the VICNISS Coordinating Centre.

    PubMed

    Russo, Philip L; Bull, Ann; Bennett, Noleen; Boardman, Claire; Burrell, Simon; Motley, Jane; Berry, Kylie; Friedman, N Deborah; Richards, Michael

    2006-09-01

    A 1998 survey of acute Victorian public hospitals (VPH) revealed that surveillance of hospital-acquired infections (HAI) was underdeveloped, definitions and methodology varied considerably, and results were disseminated inconsistently. The survey identified the need for an effective surveillance system for HAI. The aims were to develop and support a standardized surveillance program for HAIs in large acute VPH and to provide risk-adjusted, procedure-specific HAI rates. In 2002, the independent Victorian Nosocomial Infection Surveillance System (VICNISS) Coordinating Centre (VCC) was established to develop and support the standardized surveillance program. A multidisciplinary team was recruited. A communication strategy, surveillance manual, user groups, and Web site were developed. Formal education sessions were provided to participating infection control nurse consultants (ICCs). Surveillance activities were based on the US Centers for Disease Control and Prevention's National Nosocomial Infection Surveillance System (NNIS) surgical site infection and intensive care unit (ICU) components. NNIS methods were modified to suit local needs. Data collection was paper based or through existing hospital software. An advisory committee of key stakeholders met every second month. The surveillance program was rolled out over 12 months to all 28 large adult VPH. Data on over 20,000 surgical procedures performed at participating sites between November 11, 2002, and December 31, 2004, were submitted. Thirteen hospitals contributed to the ICU surveillance activities. Following aggregation and analysis by the VCC, hospital- and state-level results were posted on the Web page for hospitals to review. A standardized approach for surveillance of HAI was established in a short time frame in all 28 VPH. VICNISS is a tool that will continue to provide participating hospitals with a basis for continuous quality improvement.

  8. Can trainees design and deliver a national audit of epistaxis management? A pilot of a secure web-based audit tool and research trainee collaboratives.

    PubMed

    Mehta, N; Williams, R J; Smith, M E; Hall, A; Hardman, J C; Cheung, L; Ellis, M P; Fussey, J M; Lakhani, R; McLaren, O; Nankivell, P C; Sharma, N; Yeung, W; Carrie, S; Hopkins, C

    2017-06-01

    To investigate the feasibility of a national audit of epistaxis management led and delivered by a multi-region trainee collaborative using a web-based interface to capture patient data. Six trainee collaboratives across England nominated one site each and worked together to carry out this pilot. An encrypted data capture tool was adapted and installed within the infrastructure of a university secure server. Site-lead feedback was assessed through questionnaires. Sixty-three patients with epistaxis were admitted over a two-week period. Site leads reported an average of 5 minutes to complete questionnaires and described the tool as easy to use. Data quality was high, with little missing data. Site-lead feedback showed high satisfaction ratings for the project (mean, 4.83 out of 5). This pilot showed that trainee collaboratives can work together to deliver an audit using an encrypted data capture tool cost-effectively, whilst maintaining the highest levels of data quality.

  9. The RCSB Protein Data Bank: new resources for research and education

    PubMed Central

    Rose, Peter W.; Bi, Chunxiao; Bluhm, Wolfgang F.; Christie, Cole H.; Dimitropoulos, Dimitris; Dutta, Shuchismita; Green, Rachel K.; Goodsell, David S.; Prlić, Andreas; Quesada, Martha; Quinn, Gregory B.; Ramos, Alexander G.; Westbrook, John D.; Young, Jasmine; Zardecki, Christine; Berman, Helen M.; Bourne, Philip E.

    2013-01-01

    The Research Collaboratory for Structural Bioinformatics Protein Data Bank (RCSB PDB) develops tools and resources that provide a structural view of biology for research and education. The RCSB PDB web site (http://www.rcsb.org) uses the curated 3D macromolecular data contained in the PDB archive to offer unique methods to access, report and visualize data. Recent activities have focused on improving methods for simple and complex searches of PDB data, creating specialized access to chemical component data and providing domain-based structural alignments. New educational resources are offered at the PDB-101 educational view of the main web site such as Author Profiles that display a researcher’s PDB entries in a timeline. To promote different kinds of access to the RCSB PDB, Web Services have been expanded, and an RCSB PDB Mobile application for the iPhone/iPad has been released. These improvements enable new opportunities for analyzing and understanding structure data. PMID:23193259
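    As a hedged illustration of the expanded Web Services mentioned above, the snippet below fetches entry-level metadata through the present-day RCSB Data API. The endpoint path and response fields reflect current public documentation rather than the services as they existed in 2013, so treat them as assumptions to verify.

    ```python
    """Hedged sketch of programmatic access to RCSB PDB entry metadata via the
    present-day Data API (data.rcsb.org); the Web Services described in the
    2013 article may have differed."""
    import requests

    pdb_id = "4HHB"  # deoxyhemoglobin, a classic PDB entry
    url = f"https://data.rcsb.org/rest/v1/core/entry/{pdb_id}"

    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    entry = resp.json()

    # Pull a couple of fields defensively, since the schema may evolve.
    title = entry.get("struct", {}).get("title", "n/a")
    released = entry.get("rcsb_accession_info", {}).get("initial_release_date", "n/a")
    print(f"{pdb_id}: {title} (released {released})")
    ```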

  10. Web site development: applying aesthetics to promote breast health education and awareness.

    PubMed

    Thomas, Barbara; Goldsmith, Susan B; Forrest, Anne; Marshall, Renée

    2002-01-01

    This article describes the process of establishing a Web site as part of a collaborative project using visual art to promote breast health education. The need for a more "user-friendly" comprehensive breast health Web site that is aesthetically rewarding was identified after an analysis of current Web sites available through the World Wide Web. Two predetermined sets of criteria, accountability and aesthetics, were used to analyze these sites and to generate ideas for creating a breast health education Web site using visual art. Results of the analyses are included, as well as the factors to consider when developing such a Web site. The process specified is thorough and can be applied to establish a Web site that is aesthetically rewarding and informative for a variety of educational purposes.

  11. The Drupal Environmental Information Management System Provides Standardization, Flexibility and a Platform for Collaboration

    NASA Astrophysics Data System (ADS)

    Gries, C.; Vanderbilt, K.; Reid, D.; Melendez-Colom, E.; San Gil, I.

    2013-12-01

    Over the last five years several Long-Term Ecological Research (LTER) sites have collaboratively developed a standardized yet flexible approach to ecological information management based on the open source Drupal content management system. These LTER sites adopted a common data model for the basic metadata necessary to describe data sets, which is also used for site management and web presence. Drupal core functionality provides web forms for easy management of information stored in this data model. Custom Drupal extensions were developed to generate XML files conforming to the Ecological Metadata Language (EML) for contribution to the LTER Network Information System (NIS) and other data archives. Each LTER site then took advantage of the flexibility Drupal provides to develop its unique web presence, choosing different themes and adding additional content to the websites. By nature, the information presented is highly interlinked, which can easily be modeled with Drupal entities and is further supported by a sophisticated tagging system (Fig. 1). Therefore, it is possible to provide the visitor with many different entry points to the site-specific information presented. For example, publications and datasets may be grouped for each scientist, for each research project, or for each major research theme at the site, making the information presented more accessible to different visitors. Experience gained during the early years was recently used to launch a complete re-write for upgrading to Drupal 7. LTER sites from multiple academic institutions pooled resources in order to partner with professional Drupal developers. Highlights of the new developments are streamlined data entry, improved EML output and integrity, support of IM workflows, a faceted data set search, a highly configurable data exploration tool with intelligent filtering and data download, and, for the mobile age, a responsive web design theme. Seven custom modules and a specific installation profile were developed, drawing on many other community-contributed modules, all with an upgrade to Drupal 8 in mind. The collaborative development of the Drupal Ecological Information Management System (DEIMS) has resulted in a product that is standards-based but flexible enough to meet individual site needs. It is available at the Drupal.org website for other small research stations or labs to use, extend and improve according to the open source philosophy. Figure 1: Overview of DEIMS components and interactions
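    A minimal sketch of the kind of EML serialization the DEIMS Drupal modules perform is shown below, using only the Python standard library. The element names follow the public EML schema (eml:eml, dataset, title, creator, abstract), but the namespace version, package identifier, and metadata values are simplified, illustrative assumptions.

    ```python
    """Minimal sketch of serializing dataset metadata to an EML-style XML
    document, analogous to what the DEIMS Drupal modules generate. Namespace
    version, identifiers, and values are simplified assumptions."""
    import xml.etree.ElementTree as ET

    EML_NS = "eml://ecoinformatics.org/eml-2.1.1"  # assumed schema version
    ET.register_namespace("eml", EML_NS)

    record = {
        "packageId": "knb-lter-xyz.1.1",  # hypothetical package identifier
        "title": "Stream chemistry at an example LTER site, 2010-2012",
        "creator": "Example LTER Information Manager",
        "abstract": "Weekly grab samples of stream water chemistry.",
    }

    root = ET.Element(f"{{{EML_NS}}}eml",
                      {"packageId": record["packageId"], "system": "knb"})
    dataset = ET.SubElement(root, "dataset")
    ET.SubElement(dataset, "title").text = record["title"]
    creator = ET.SubElement(dataset, "creator")
    ET.SubElement(ET.SubElement(creator, "individualName"),
                  "surName").text = record["creator"]
    abstract = ET.SubElement(dataset, "abstract")
    ET.SubElement(abstract, "para").text = record["abstract"]

    print(ET.tostring(root, encoding="unicode"))
    ```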

  12. New Tools for Estimating and Managing Local/Regional Air Quality Impacts of Prescribed Burns

    DTIC Science & Technology

    2013-02-01

    …the drier fuels and horizontal burn configuration would result in a more intense fire and provide emission factors closer to those from the flaming…the EPA web site, the CMAQ Model is a powerful computational tool used by EPA and states for air quality management in that it can be used to design…

  13. Case study: development of and stakeholder responses to a nursing home consumer information system.

    PubMed

    O'Meara, Janis; Kitchener, Martin; Collier, Eric; Lyons, Margaret; de Billwiller-Kiss, Ana; Simon, Lisa Payne; Harrington, Charlene

    2005-01-01

    California Nursing Home Search (www.calnhs.org), launched in October 2002, provides information about nursing home quality to a broad range of stakeholders. This case study discusses the process of developing a consumer-oriented nursing home Web site and presents an analysis of postlaunch responses from a number of sources (i.e., media, outreach, Web site use, correspondence, meetings, interviews) to determine the impact of the site and how it can be improved and used as an example. Consumers found the Web site valuable, but some needed clarification on navigation. Providers had complaints about the use of quality ratings and concerns about public availability of the data. Most discharge planners and care managers do not use Internet resources to find facilities. Feedback, modifications, updates, and outreach are needed on a continuous basis to ensure the site is a helpful tool for all stakeholders.

  14. Are we there yet? An examination of online tailored health communication.

    PubMed

    Suggs, L Suzanne; McIntyre, Chris

    2009-04-01

    Increasingly, the Internet is playing an important role in consumer health and patient-provider communication. Seventy-three percent of American adults are now online, and 79% have searched for health information on the Internet. This study provides a baseline understanding of the extent to which health consumers are able to find tailored communication online. It describes the current behavioral focus, the channels being used to deliver the tailored content, and the level of tailoring in online-tailored communication. A content analysis of 497 health Web sites found few examples of personalized, targeted, or tailored health sites freely available online. Tailored content was provided in 13 Web sites, although 15 collected individual data. More health risk assessment (HRA) sites included tailored feedback than other topics. The patterns that emerged from the analysis demonstrate that online health users can access a number of Web sites with communication tailored to their needs.

  15. Photosynthesis and the web: 2001.

    PubMed

    Orr, L

    2001-01-01

    First, a brief history of the Internet and the World Wide Web is presented. This is followed by relevant information on photosynthesis-related web sites grouped into several categories: (1) large group sites, (2) comprehensive overview sites, (3) specific subject sites, (4) individual researcher sites, (5) kindergarten through high school (K-12) educational sites, (6) books and journals, and (7) other useful sites. A section on searching the Web is also included. Finally, we have included an appendix with all of the web sites discussed herein as well as other web sites that space did not allow. Readers are requested to send comments, corrections and additions to gov@uiuc.edu.

  16. Remote real-time monitoring of subsurface landfill gas migration.

    PubMed

    Fay, Cormac; Doherty, Aiden R; Beirne, Stephen; Collins, Fiachra; Foley, Colum; Healy, John; Kiernan, Breda M; Lee, Hyowon; Maher, Damien; Orpen, Dylan; Phelan, Thomas; Qiu, Zhengwei; Zhang, Kirk; Gurrin, Cathal; Corcoran, Brian; O'Connor, Noel E; Smeaton, Alan F; Diamond, Dermot

    2011-01-01

    The cost of monitoring greenhouse gas emissions from landfill sites is of major concern for regulatory authorities. The current monitoring procedure is recognised as labour intensive, requiring agency inspectors to physically travel to perimeter borehole wells in rough terrain and manually measure gas concentration levels with expensive hand-held instrumentation. In this article we present a cost-effective and efficient system for remotely monitoring landfill subsurface migration of methane and carbon dioxide concentration levels. Based purely on an autonomous sensing architecture, the proposed sensing platform was capable of performing complex analytical measurements in situ and successfully communicating the data remotely to a cloud database. A web tool was developed to present the sensed data to relevant stakeholders. We report our experiences in deploying such an approach in the field over a period of approximately 16 months.

  17. Remote Real-Time Monitoring of Subsurface Landfill Gas Migration

    PubMed Central

    Fay, Cormac; Doherty, Aiden R.; Beirne, Stephen; Collins, Fiachra; Foley, Colum; Healy, John; Kiernan, Breda M.; Lee, Hyowon; Maher, Damien; Orpen, Dylan; Phelan, Thomas; Qiu, Zhengwei; Zhang, Kirk; Gurrin, Cathal; Corcoran, Brian; O’Connor, Noel E.; Smeaton, Alan F.; Diamond, Dermot

    2011-01-01

    The cost of monitoring greenhouse gas emissions from landfill sites is of major concern for regulatory authorities. The current monitoring procedure is recognised as labour intensive, requiring agency inspectors to physically travel to perimeter borehole wells in rough terrain and manually measure gas concentration levels with expensive hand-held instrumentation. In this article we present a cost-effective and efficient system for remotely monitoring landfill subsurface migration of methane and carbon dioxide concentration levels. Based purely on an autonomous sensing architecture, the proposed sensing platform was capable of performing complex analytical measurements in situ and successfully communicating the data remotely to a cloud database. A web tool was developed to present the sensed data to relevant stakeholders. We report our experiences in deploying such an approach in the field over a period of approximately 16 months. PMID:22163975
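    The two records above outline the sensing architecture without publishing its interfaces, so the sketch below is only an illustration of the reporting loop of an autonomous gas-monitoring node: take a reading, flag threshold exceedances, and post the record to a cloud database over HTTP. The endpoint URL, field names, and alert thresholds are assumptions, not values from the article.

    ```python
    """Illustrative reporting loop for an autonomous landfill gas node.
    Endpoint URL, field names, and alert thresholds are assumptions."""
    import json
    import time
    import urllib.request

    CLOUD_ENDPOINT = "https://example.org/api/gas-readings"  # hypothetical
    CH4_ALERT_PPM = 10000   # illustrative alert level for methane
    CO2_ALERT_PPM = 15000   # illustrative alert level for carbon dioxide


    def read_sensors():
        """Stand-in for the analytical measurement step at a borehole well."""
        return {"well_id": "BH-07", "ch4_ppm": 8200.0, "co2_ppm": 11400.0}


    def post_reading(reading):
        """Attach a timestamp and alert flag, then POST the record as JSON."""
        reading["timestamp"] = time.time()
        reading["alert"] = (reading["ch4_ppm"] >= CH4_ALERT_PPM
                            or reading["co2_ppm"] >= CO2_ALERT_PPM)
        req = urllib.request.Request(
            CLOUD_ENDPOINT,
            data=json.dumps(reading).encode(),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            return resp.status


    if __name__ == "__main__":
        print(post_reading(read_sensors()))
    ```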

  18. How Japanese students characterize information from web-sites.

    PubMed

    Iwahara, A; Yamada, M; Hatta, T; Kawakami, A; Okamoto, M

    2000-12-01

    How 352 Japanese university students regard web-site information was investigated by two kinds of survey. Application of correspondence analysis and cluster analysis to the questionnaire responses to the web-site advertisement showed students regarded a web-site as a new alien medium which is different from current media. Students regarded web-sites as simply complicated, intellectual, and impermanent, or not memorable. Students got precise information from web-sites but they did not use it in making decisions to purchase goods.

  19. MMI: Increasing Community Collaboration

    NASA Astrophysics Data System (ADS)

    Galbraith, N. R.; Stocks, K.; Neiswender, C.; Maffei, A.; Bermudez, L.

    2007-12-01

    Building community requires a collaborative environment and guidance to help move members towards a common goal. An effective environment for community collaboration is a workspace that fosters participation and cooperation; effective guidance furthers common understanding and promotes best practices. The Marine Metadata Interoperability (MMI) project has developed a community web site to provide a collaborative environment for scientists, technologists, and data managers from around the world to learn about metadata and exchange ideas. Workshops, demonstration projects, and presentations also provide community-building opportunities for MMI. MMI has developed comprehensive online guides to help users understand and work with metadata standards, ontologies, and other controlled vocabularies. Documents such as "The Importance of Metadata Standards", "Usage vs. Discovery Vocabularies" and "Developing Controlled Vocabularies" guide scientists and data managers through a variety of metadata-related concepts. Members from eight organizations involved in marine science and informatics collaborated on this effort. The MMI web site has moved from Plone to Drupal, two content management systems which provide different opportunities for community-based work. Drupal's "organic groups" feature will be used to provide workspace for future teams tasked with content development, outreach, and other MMI mission-critical work. The new site is designed to enable members to easily create working areas, to build communities dedicated to developing consensus on metadata and other interoperability issues. Controlled-vocabulary-driven menus, integrated mailing-lists, member-based content creation and review tools are facets of the new web site architecture. This move provided the challenge of developing a hierarchical vocabulary to describe the resources presented on the site; consistent and logical tagging of web pages is the basis of Drupal site navigation. The new MMI web site presents enhanced opportunities for electronic discussions, focused collaborative work, and even greater community participation. The MMI project is beginning a new initiative to comprehensively catalog and document tools for marine metadata. The new MMI community-based web site will be used to support this work and to support the work of other ad-hoc teams in the future. We are seeking broad input from the community on this effort.

  20. Web sites for postpartum depression: convenient, frustrating, incomplete, and misleading.

    PubMed

    Summers, Audra L; Logsdon, M Cynthia

    2005-01-01

    To evaluate the content and the technology of Web sites providing information on postpartum depression. Eleven search engines were queried using the words "Postpartum Depression." The top 10 sites in each search engine were evaluated for correct content and technology using the Web Depression Tool, based on the Technology Assessment Model. Of the 36 unique Web sites located, 34 were available to review. Only five Web sites provided >75% correct responses to questions that summarized the current state of the science for postpartum depression. Eleven of the Web sites contained little or no useful information about postpartum depression, despite being among the first 10 Web sites listed by the search engine. Some Web sites contained possibly harmful suggestions for treatment of postpartum depression. In addition, there are many problems with the technology of Web sites providing information on postpartum depression. A better Web site for postpartum depression is necessary if we are to meet the needs of consumers for accurate and current information using technology that enhances learning. Since patient education is a core competency for nurses, it is essential that nurses understand how their patients are using the World Wide Web for learning and how we can assist our patients to find appropriate sites containing correct information.

  1. The Innate Immune Database (IIDB)

    PubMed Central

    Korb, Martin; Rust, Aistair G; Thorsson, Vesteinn; Battail, Christophe; Li, Bin; Hwang, Daehee; Kennedy, Kathleen A; Roach, Jared C; Rosenberger, Carrie M; Gilchrist, Mark; Zak, Daniel; Johnson, Carrie; Marzolf, Bruz; Aderem, Alan; Shmulevich, Ilya; Bolouri, Hamid

    2008-01-01

    Background As part of a National Institute of Allergy and Infectious Diseases funded collaborative project, we have performed over 150 microarray experiments measuring the response of C57/BL6 mouse bone marrow macrophages to toll-like receptor stimuli. These microarray expression profiles are available freely from our project web site. Here, we report the development of a database of computationally predicted transcription factor binding sites and related genomic features for a set of over 2000 murine immune genes of interest. Our database, which includes microarray co-expression clusters and a host of web-based query, analysis and visualization facilities, is available freely via the internet. It provides a broad resource to the research community, and a stepping stone towards the delineation of the network of transcriptional regulatory interactions underlying the integrated response of macrophages to pathogens. Description We constructed a database indexed on genes and annotations of the immediate surrounding genomic regions. To facilitate both gene-specific and systems biology oriented research, our database provides the means to analyze individual genes or an entire genomic locus. Although our focus to date has been on mammalian toll-like receptor signaling pathways, our database structure is not limited to this subject, and is intended to be broadly applicable to immunology. By focusing on selected immune-active genes, we were able to perform computationally intensive expression and sequence analyses that would currently be prohibitive if applied to the entire genome. Using six complementary computational algorithms and methodologies, we identified transcription factor binding sites based on the Position Weight Matrices available in TRANSFAC. For one example transcription factor (ATF3) for which experimental data is available, over 50% of our predicted binding sites coincide with genome-wide chromatin immunoprecipitation (ChIP-chip) results. Our database can be interrogated via a web interface. Genomic annotations and binding site predictions can be automatically viewed with a customized version of the Argo genome browser. Conclusion We present the Innate Immune Database (IIDB) as a community resource for immunologists interested in gene regulatory systems underlying innate responses to pathogens. The database website can be freely accessed. PMID:18321385
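    For readers unfamiliar with position-weight-matrix (PWM) scanning, the kind of binding-site prediction IIDB performs, the toy sketch below scores a short DNA sequence against an invented 4-position matrix using log-odds against a uniform background. Real matrices come from TRANSFAC and are longer; the matrix, threshold, and sequence here are purely illustrative.

    ```python
    """Toy sketch of position-weight-matrix scanning for binding-site
    prediction. The matrix, threshold, and sequence are invented."""
    import math

    BACKGROUND = 0.25  # uniform background base frequency
    # One dict per matrix position, giving per-base probabilities.
    PWM = [
        {"A": 0.70, "C": 0.10, "G": 0.10, "T": 0.10},
        {"A": 0.05, "C": 0.05, "G": 0.85, "T": 0.05},
        {"A": 0.05, "C": 0.80, "G": 0.05, "T": 0.10},
        {"A": 0.10, "C": 0.10, "G": 0.10, "T": 0.70},
    ]


    def score_window(window):
        """Sum of log-odds (base 2) of the window against the background."""
        return sum(math.log2(col[base] / BACKGROUND)
                   for base, col in zip(window, PWM))


    def scan(sequence, threshold=3.0):
        """Return (position, window, score) for windows above the threshold."""
        hits = []
        for i in range(len(sequence) - len(PWM) + 1):
            window = sequence[i:i + len(PWM)]
            s = score_window(window)
            if s >= threshold:
                hits.append((i, window, round(s, 2)))
        return hits


    print(scan("TTAGCTAGCTAACT"))
    ```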

  2. Web Camera Use of Mothers and Fathers When Viewing Their Hospitalized Neonate.

    PubMed

    Rhoads, Sarah J; Green, Angela; Gauss, C Heath; Mitchell, Anita; Pate, Barbara

    2015-12-01

    Mothers and fathers of neonates hospitalized in a neonatal intensive care unit (NICU) differ in their experiences related to NICU visitation. The aim was to describe the frequency and length of maternal and paternal viewing of their hospitalized neonates via a Web camera. A total of 219 mothers and 101 fathers, including 40 mother-father dyads, used the Web camera that allows 24/7 NICU viewing between September 1, 2010, and December 31, 2012. We conducted a review of the Web camera's Web site log-on records in this nonexperimental, descriptive study. Mothers and fathers differed significantly in the mean number of log-ons to the Web camera system (P = .0293). Fathers virtually visited the NICU less often than mothers, but there was not a statistical difference between mothers and fathers in terms of the mean total number of minutes viewing the neonate (P = .0834) or in the maximum number of minutes of viewing in 1 session (P = .6924). Patterns of visitation over time were not measured. Web camera technology could be a potential intervention to aid fathers in visiting their neonates. Both parents should be offered virtual visits using the Web camera and oriented regarding how to use it. These findings are important to consider when installing Web cameras in a NICU. Future research should continue to explore Web camera use in NICUs.

  3. The New USGS Volcano Hazards Program Web Site

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Graham, S. E.; Parker, T. J.; Snedigar, S. F.

    2008-12-01

    The U.S. Geological Survey's (USGS) Volcano Hazards Program (VHP) has launched a revised web site that uses a map-based interface to display hazards information for U.S. volcanoes. The web site is focused on better communication of hazards and background volcano information to our varied user groups by reorganizing content based on user needs and improving data display. The Home Page provides a synoptic view of the activity level of all volcanoes for which updates are written using a custom Google® Map. Updates are accessible by clicking on one of the map icons or clicking on the volcano of interest in the adjacent color-coded list of updates. The new navigation provides rapid access to volcanic activity information, background volcano information, images and publications, volcanic hazards, information about VHP, and the USGS volcano observatories. The Volcanic Activity section was tailored for emergency managers but provides information for all our user groups. It includes a Google® Map of the volcanoes we monitor, an Elevated Activity Page, a general status page, information about our Volcano Alert Levels and Aviation Color Codes, monitoring information, and links to monitoring data from VHP's volcano observatories: Alaska Volcano Observatory (AVO), Cascades Volcano Observatory (CVO), Long Valley Observatory (LVO), Hawaiian Volcano Observatory (HVO), and Yellowstone Volcano Observatory (YVO). The YVO web site was the first to move to the new navigation system, and we are working on integrating the Long Valley Observatory web site next. We are excited to continue to implement new geospatial technologies to better display our hazards and supporting volcano information.

  4. HCLS 2.0/3.0: health care and life sciences data mashup using Web 2.0/3.0.

    PubMed

    Cheung, Kei-Hoi; Yip, Kevin Y; Townsend, Jeffrey P; Scotch, Matthew

    2008-10-01

    We describe the potential of current Web 2.0 technologies to achieve data mashup in the health care and life sciences (HCLS) domains, and compare that potential to the nascent trend of performing semantic mashup. After providing an overview of Web 2.0, we demonstrate two scenarios of data mashup, facilitated by the following Web 2.0 tools and sites: Yahoo! Pipes, Dapper, Google Maps and GeoCommons. In the first scenario, we exploited Dapper and Yahoo! Pipes to implement a challenging data integration task in the context of DNA microarray research. In the second scenario, we exploited Yahoo! Pipes, Google Maps, and GeoCommons to create a geographic information system (GIS) interface that allows visualization and integration of diverse categories of public health data, including cancer incidence and pollution prevalence data. Based on these two scenarios, we discuss the strengths and weaknesses of these Web 2.0 mashup technologies. We then describe Semantic Web, the mainstream Web 3.0 technology that enables more powerful data integration over the Web. We discuss the areas of intersection of Web 2.0 and Semantic Web, and describe the potential benefits that can be brought to HCLS research by combining these two sets of technologies.

  5. HCLS 2.0/3.0: Health Care and Life Sciences Data Mashup Using Web 2.0/3.0

    PubMed Central

    Cheung, Kei-Hoi; Yip, Kevin Y.; Townsend, Jeffrey P.; Scotch, Matthew

    2010-01-01

    We describe the potential of current Web 2.0 technologies to achieve data mashup in the health care and life sciences (HCLS) domains, and compare that potential to the nascent trend of performing semantic mashup. After providing an overview of Web 2.0, we demonstrate two scenarios of data mashup, facilitated by the following Web 2.0 tools and sites: Yahoo! Pipes, Dapper, Google Maps and GeoCommons. In the first scenario, we exploited Dapper and Yahoo! Pipes to implement a challenging data integration task in the context of DNA microarray research. In the second scenario, we exploited Yahoo! Pipes, Google Maps, and GeoCommons to create a geographic information system (GIS) interface that allows visualization and integration of diverse categories of public health data, including cancer incidence and pollution prevalence data. Based on these two scenarios, we discuss the strengths and weaknesses of these Web 2.0 mashup technologies. We then describe Semantic Web, the mainstream Web 3.0 technology that enables more powerful data integration over the Web. We discuss the areas of intersection of Web 2.0 and Semantic Web, and describe the potential benefits that can be brought to HCLS research by combining these two sets of technologies. PMID:18487092
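    Yahoo! Pipes and Dapper, the mashup tools used in both records above, have since been retired, so the sketch below reproduces only the spirit of the second scenario: join cancer-incidence and pollution figures by county and emit GeoJSON points that a web map could display. All place names, values, and coordinates are invented.

    ```python
    """Illustrative public-health data mashup: join two small tables by county
    and emit GeoJSON points for display on a web map. All values are invented."""
    import json

    incidence = {"County A": 412.5, "County B": 389.1}   # cases per 100,000
    pollution = {"County A": 9.8, "County B": 14.2}      # PM2.5, ug/m3
    centroids = {"County A": (-72.68, 41.76), "County B": (-73.05, 41.55)}

    features = []
    for county in incidence.keys() & pollution.keys():
        lon, lat = centroids[county]
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {
                "county": county,
                "cancer_incidence_per_100k": incidence[county],
                "pm25_ug_m3": pollution[county],
            },
        })

    print(json.dumps({"type": "FeatureCollection", "features": features},
                     indent=2))
    ```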

  6. Applications of the U.S. Geological Survey's global land cover product

    USGS Publications Warehouse

    Reed, B.

    1997-01-01

    The U.S. Geological Survey (USGS), in partnership with several international agencies and universities, has produced a global land cover characteristics database. The land cover data were created using multitemporal analysis of advanced very high resolution radiometer satellite images in conjunction with other existing geographic data. A translation table permits the conversion of the land cover classes into several conventional land cover schemes that are used by ecosystem modelers, climate modelers, land management agencies, and other user groups. The alternative classification schemes include Global Ecosystems, the Biosphere Atmosphere Transfer Scheme, the Simple Biosphere, the USGS Anderson Level 2, and the International Geosphere Biosphere Programme. The distribution system for these data is through the World Wide Web (the web site address is: http://edcwww.cr.usgs.gov/landdaac/glcc/glcc.html) or by magnetic media upon special request. The availability of the data over the World Wide Web, in conjunction with the flexible database structure, allows easy data access to a wide range of users. The web site contains a user registration form that allows analysis of the diverse applications of large-area land cover data. Currently, applications are divided among mapping (20 percent), conservation (30 percent), and modeling (35 percent).
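    The translation table itself is distributed with the database; the snippet below is only a schematic of how such a lookup works, converting an internal land cover class into codes for two alternative schemes. The class names and code mappings shown are illustrative stand-ins, not the actual USGS table.

    ```python
    """Schematic of a land cover translation table: map internal class names
    to codes in alternative classification schemes. Mappings are illustrative
    stand-ins, not the actual USGS table."""

    # internal land cover class -> equivalents in two target schemes
    TRANSLATION = {
        "evergreen_needleleaf_forest": {"IGBP": 1, "Anderson_L2": 42},
        "cropland":                    {"IGBP": 12, "Anderson_L2": 21},
        "urban_built_up":              {"IGBP": 13, "Anderson_L2": 11},
    }


    def translate(class_name, scheme):
        """Return the class code in the requested scheme, or None if unmapped."""
        return TRANSLATION.get(class_name, {}).get(scheme)


    for name in TRANSLATION:
        print(name, "-> IGBP", translate(name, "IGBP"),
              "| Anderson L2", translate(name, "Anderson_L2"))
    ```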

  7. 16 CFR 1130.8 - Requirements for Web site registration or alternative e-mail registration.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Requirements for Web site registration or... PRODUCTS (Eff. June 28, 2010) § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the...

  8. 22 CFR 502.6 - Terms of use for accessing program materials available on agency Web sites.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... available on agency Web sites. 502.6 Section 502.6 Foreign Relations BROADCASTING BOARD OF GOVERNORS... program materials available on agency Web sites. (a) By accessing Agency Web sites, Requestors agree to all the Terms of Use available on those Web sites. (b) All Requestors are advised that Agency program...

  9. 16 CFR 1130.8 - Requirements for Web site registration or alternative e-mail registration.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Requirements for Web site registration or... PRODUCTS § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the purpose of registration...

  10. 16 CFR 1130.8 - Requirements for Web site registration or alternative e-mail registration.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Requirements for Web site registration or... PRODUCTS § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the purpose of registration...

  11. 16 CFR 1130.7 - Requirements for Web site registration or alternative e-mail registration.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Requirements for Web site registration or... PRODUCTS § 1130.7 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the purpose of registration...

  12. 16 CFR § 1130.8 - Requirements for Web site registration or alternative e-mail registration.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Requirements for Web site registration or... OR TODDLER PRODUCTS § 1130.8 Requirements for Web site registration or alternative e-mail registration. (a) Link to registration page. The manufacturer's Web site, or other Web site established for the...

  13. Automated Management of Exercise Intervention at the Point of Care: Application of a Web-Based Leg Training System

    PubMed Central

    2015-01-01

    Background Recent advances in information and communication technology have prompted development of Web-based health tools to promote physical activity, the key component of cardiac rehabilitation and chronic disease management. Mobile apps can facilitate behavioral changes and help in exercise monitoring, although actual training usually takes place away from the point of care in specialized gyms or outdoors. Daily participation in conventional physical activities is expensive, time consuming, and mostly relies on self-management abilities of patients who are typically aged, overweight, and unfit. Facilitation of sustained exercise training at the point of care might improve patient engagement in cardiac rehabilitation. Objective In this study we aimed to test the feasibility of execution and automatic monitoring of several exercise regimens on-site using a Web-enabled leg training system. Methods The MedExercise leg rehabilitation machine was equipped with wireless temperature sensors in order to monitor its usage by the rise of temperature in the resistance unit (Δt°). Personal electronic devices such as laptop computers were fitted with wireless gateways and relevant software was installed to monitor the usage of training machines. Cloud-based software allowed monitoring of participant training over the Internet. Seven healthy participants applied the system at various locations with training protocols typically used in cardiac rehabilitation. The heart rates were measured by fingertip pulse oximeters. Results Exercising in home chairs, in bed, and under an office desk was made feasible and resulted in an intensity-dependent increase of participants’ heart rates and Δt° in training machine temperatures. Participants self-controlled their activities on smart devices, while a supervisor monitored them over the Internet. Individual Δt° reached during 30 minutes of moderate-intensity continuous training averaged 7.8°C (SD 1.6). These Δt° were used as personalized daily doses of exercise with automatic email alerts sent upon achieving them. During 1-week training at home, automatic notifications were received on 4.4 days (SD 1.8). Although the high intensity interval training regimen was feasible on-site, it was difficult for self- and remote management. Opportunistic leg exercise under the desk, while working with a computer, and training in bed while viewing television were less intensive than dosed exercise bouts, but allowed prolonged leg mobilization of 73.7 minutes/day (SD 29.7). Conclusions This study demonstrated the feasibility of self-control exercise training on-site, which was accompanied by online monitoring, electronic recording, personalization of exercise doses, and automatic reporting of adherence. The results suggest that this technology and its applications are useful for the delivery of Web-based exercise rehabilitation and cardiac training programs at the point of care. PMID:28582243
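    A hedged sketch of the dose logic described above follows: accumulate the temperature rise of the resistance unit (Δt°) across a day's sessions and notify once the personalized dose is reached. The function and field names are stand-ins; only the 7.8°C figure comes from the record above.

    ```python
    """Hedged sketch of the exercise-dose logic for the Web-enabled leg
    trainer: track the temperature rise of the resistance unit (delta-t)
    and notify when the personalized daily dose is reached. The notify()
    function stands in for the platform's automatic email alerts."""

    PERSONAL_DOSE_C = 7.8   # example daily dose; mean delta-t from the study


    def delta_t(readings_c):
        """Temperature rise of the resistance unit during one session."""
        return max(readings_c) - readings_c[0]


    def notify(message):
        # Stand-in for the automatic email alert sent by the platform.
        print("ALERT:", message)


    def check_daily_dose(session_readings, dose_c=PERSONAL_DOSE_C):
        """Sum per-session rises and alert when the daily dose is achieved."""
        achieved = sum(delta_t(r) for r in session_readings)
        if achieved >= dose_c:
            notify(f"Daily exercise dose reached: {achieved:.1f} degC "
                   f"(target {dose_c:.1f} degC)")
        return achieved


    # Two short sessions recorded during one day (ambient start about 22 degC).
    sessions = [[22.0, 24.5, 26.3, 27.1], [22.4, 25.0, 26.9]]
    check_daily_dose(sessions)
    ```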

  14. Characteristics of food industry web sites and "advergames" targeting children.

    PubMed

    Culp, Jennifer; Bell, Robert A; Cassady, Diana

    2010-01-01

    To assess the content of food industry Web sites targeting children by describing strategies used to prolong their visits and foster brand loyalty; and to document health-promoting messages on these Web sites. A content analysis was conducted of Web sites advertised on 2 children's networks, Cartoon Network and Nickelodeon. A total of 290 Web pages and 247 unique games on 19 Internet sites were examined. Games, found on 81% of Web sites, were the most predominant promotion strategy used. All games had at least 1 brand identifier, with logos being most frequently used. On average Web sites contained 1 "healthful" message for every 45 exposures to brand identifiers. Food companies use Web sites to extend their television advertising to promote brand loyalty among children. These sites almost exclusively promoted food items high in sugar and fat. Health professionals need to monitor food industry marketing practices used in "new media." Published by Elsevier Inc.

  15. Oregon OCS seafloor mapping: Selected lease blocks relevant to renewable energy

    USGS Publications Warehouse

    Cochrane, Guy R.; Hemery, Lenaïg G.; Henkel, Sarah K.

    2017-05-23

    In 2014 the U.S. Geological Survey (USGS) and the Bureau of Ocean Energy Management (BOEM) entered into Intra-agency agreement M13PG00037 to map an area of the Oregon Outer Continental Shelf (OCS) off of Coos Bay, Oregon, under consideration for development of a floating wind energy farm. The BOEM requires seafloor mapping and site characterization studies in order to evaluate the impact of seafloor and sub-seafloor conditions on the installation, operation, and structural integrity of proposed renewable energy projects, as well as to assess the potential effects of construction and operations on archaeological resources. The mission of the USGS is to provide geologic, topographic, and hydrologic information that contributes to the wise management of the Nation's natural resources and that promotes the health, safety, and well being of the people. This information consists of maps, databases, and descriptions and analyses of the water, energy, and mineral resources, land surface, underlying geologic structure, and dynamic processes of the earth.For the Oregon OCS study, the USGS acquired multibeam echo sounder and seafloor video data surrounding the proposed development site, which is 95 km2 in area and 15 miles offshore from Coos Bay. The development site had been surveyed by Solmar Hydro Inc. in 2013 under a contract with WindFloat Pacific. The USGS subsequently produced a bathymetry digital elevation model and a backscatter intensity grid that were merged with existing data collected by the contractor. The merged grids were published along with visual observations of benthic geo-habitat from the video data in an associated USGS data release (Cochrane and others, 2015).This report includes the results of analysis of the video data conducted by Oregon State University and the geo-habitat interpretation of the multibeam echo sounder (MBES) data conducted by the USGS. MBES data was published in Cochrane and others (2015). Interpretive data associated with this publication is published in Cochrane (2017). All the data is provided as geographic information system (GIS) files that contain both Esri ArcGIS geotiffs or shapefiles. For those who do not own the full suite of Esri GIS and mapping software, the data can be read using Esri ArcReader, a free viewer that is available at http://www.esri.com/software/arcgis/arcreader/index.html (last accessed August 29, 2016). Web services, which consist of standard implementations of ArcGIS representational state transfer (REST) Service and Open Geospatial Consortium (OGC) GIS web map service (WMS), also are available for all published GIS data. Web services were created using an ArcGIS service definition file, resulting in data layers that are symbolized as shown on the associated report figures. Both the ArcGIS REST Service and OGC WMS Service include all the individual GIS layers. Data layers are bundled together in a map-area web service; however, each layer can be symbolized and accessed individually after the web service is ingested into a desktop application or web map. Web services enable users to download and view data, as well as to easily add data to their own workflows, using any browser-enabled, standalone or mobile device.Though the surficial substrate is dominated by combinations of mud and sand substrate, a diverse assortment of geomorphologic features are related to geologic processes—one anticlinal ridge where bedrock is exposed, a slump and associated scarps, and pockmarks. 
Pockmarks are seen in the form of fields of small pockmarks, a lineation of large pockmarks with methanogenic carbonates, and areas of large pockmarks that have merged into larger variously shaped depressions. The slump appears to have originated at the pockmark lineation. Video-supervised numerical analysis of the MBES backscatter intensity data and vector ruggedness derived from the MBES bathymetry data was used to produce a substrate model called a seafloor character raster for the study area. The seafloor character raster consists of three substrate classes: soft-flat areas, hard-flat areas, and hard-rugged areas. A Coastal and Marine Ecological Classification Standard (CMECS) geoform and substrate map was also produced using depth, slope, and benthic position index classes to delineate geoform boundaries. Seven geoforms were identified in this process, including ridges, slump scars, slump deposits, basins, and pockmarks. Statistical analysis of the video data for correlations between substrate, depth, and invertebrate assemblages resulted in the identification of seven biomes: three hard-bottom biomes and four soft-bottom biomes. A similar analysis of vertebrate observations produced a similar set of biomes. Between-group dissimilarity among the biomes was high or very high. Invertebrates alone account for most of the structuring of the benthic community into different assemblages. A biotope map was generated using the seafloor character raster and the substrate and depth values of the biomes. Hard substrate biotopes were small in size and were located primarily on the ridge and in pockmarks along the pockmark lineation. The soft-bottom biotopes consisted of large contiguous areas delimited by isobaths.
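    The report notes that the published layers are also exposed as OGC web map services; the sketch below shows a standard WMS 1.3.0 GetMap request for one such layer. The service URL and layer name are placeholders, so consult the USGS data release for the live endpoints.

    ```python
    """Sketch of retrieving one published layer through a standard OGC WMS
    GetMap request. The service URL and layer name are placeholders."""
    import requests

    WMS_URL = "https://example.usgs.gov/services/wms"   # placeholder endpoint

    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": "seafloor_character",        # placeholder layer name
        "crs": "EPSG:4326",
        "bbox": "43.2,-124.8,43.5,-124.5",     # lat/lon order for EPSG:4326
        "width": 800,
        "height": 800,
        "format": "image/png",
    }

    resp = requests.get(WMS_URL, params=params, timeout=60)
    resp.raise_for_status()
    with open("seafloor_character.png", "wb") as fh:
        fh.write(resp.content)
    print("wrote", len(resp.content), "bytes")
    ```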

  16. Data base for early postfire succession in Northern Rocky Mountain forests

    Treesearch

    Peter F. Stickney; Robert B. Campbell

    2000-01-01

    Web site and CD-ROM include 21 pages of text plus electronic data for 55 succession sites including color plates, tables, and figures. Provides data on quantitative postfire changes of plant species and forest vegetation components for up to the first 25 years of secondary plant succession for 55 forest sites in northern Idaho and northwestern Montana. Cover (aerial...

  17. Policy-Aware Content Reuse on the Web

    NASA Astrophysics Data System (ADS)

    Seneviratne, Oshani; Kagal, Lalana; Berners-Lee, Tim

    The Web allows users to share their work very effectively, leading to the rapid re-use and remixing of content on the Web, including text, images, and videos. Scientific research data, social networks, blogs, photo sharing sites and other such applications, known collectively as the Social Web, contain large amounts of increasingly complex information. Such information from several Web pages can be very easily aggregated, mashed up and presented in other Web pages. Content generation of this nature inevitably leads to many copyright and license violations, motivating research into effective methods to detect and prevent such violations.

  18. [Legal aspects of Web 2.0 in the health field].

    PubMed

    Beslay, Nathalie; Jeunehomme, Marie

    2009-10-01

    Web 2.0 sites are considered to be hosting providers and not publishers of user-generated content. The liability of hosting providers is defined by the law enacted on June 21, 2004, on confidence in the digital economy. Hosting providers must promptly remove the information they host or make access to it impossible once they are informed of its illegality. They are required to obtain and retain data to enable identification of any person who has contributed to content hosted by them. The liability of hosting providers has arisen in numerous disputes about user-produced content in various situations (discussion lists, blogs, etc.). The National Board of Physicians has developed specific ethical guidelines for web sites devoted to health issues and specifically for physician-authored content. The National Board of Physicians acknowledges that physicians can present themselves, their office, and their specific practice on their web site, notwithstanding any restrictions otherwise applicable to advertising.

  19. The effect of types of banner ad, Web localization, and customer involvement on Internet users' attitudes.

    PubMed

    Chen, Jengchung Victor; Ross, William H; Yen, David C; Akhapon, Lerdsuwankij

    2009-02-01

    In this study, three characteristics of Web sites were varied: types of banner ad, Web localization, and involvement in purchasing a product. The dependent variable was attitude toward the site. In laboratory experiments conducted in Thailand and Taiwan, participants browsed versions of a Web site containing different types of banner ads and products. As a within-participants factor, each participant browsed both a standardized English-language Web site and a localized Web site. Results showed that animated (rather than static) banner ads, localized versions (rather than a standardized version) of Web sites, and high (rather than low) product involvement led to favorable attitudes toward the site.

  20. Effects of the Web Behavior Change Program for Activity and Multimodal Pain Rehabilitation: Randomized Controlled Trial

    PubMed Central

    Michaelson, Peter; Gard, Gunvor; Eriksson, Margareta K

    2016-01-01

    Background Web-based interventions with a focus on behavior change have been used for pain management, but studies of Web-based interventions integrated in clinical practice are lacking. To emphasize the development of cognitive skills and behavior, and to increase activity and self-care in rehabilitation, the Web Behavior Change Program for Activity (Web-BCPA) was developed and added to multimodal pain rehabilitation (MMR). Objective The objective of our study was to evaluate the effects of MMR in combination with the Web-BCPA compared with MMR among persons with persistent musculoskeletal pain in primary health care on pain intensity, self-efficacy, and coping, as part of a larger collection of data. Web-BCPA adherence and feasibility, as well as treatment satisfaction, were also investigated. Methods A total of 109 participants, mean age 43 (SD 11) years, with persistent pain in the back, neck, shoulder, and/or generalized pain were recruited to a randomized controlled trial with two intervention arms: (1) MMR+WEB (n=60) and (2) MMR (n=49). Participants in the MMR+WEB group self-guided through the eight modules of the Web-BCPA: pain, activity, behavior, stress and thoughts, sleep and negative thoughts, communication and self-esteem, solutions, and maintenance and progress. Data were collected with a questionnaire at baseline and at 4 and 12 months. Outcome measures were pain intensity (Visual Analog Scale), self-efficacy to control pain and to control other symptoms (Arthritis Self-Efficacy Scale), general self-efficacy (General Self-Efficacy Scale), and coping (two-item Coping Strategies Questionnaire; CSQ). Web-BCPA adherence was measured as minutes spent in the program. Satisfaction and Web-BCPA feasibility were assessed by a set of items. Results Of 109 participants, 99 received the allocated intervention (MMR+WEB: n=55; MMR: n=44); 88 of 99 (82%) completed the baseline and follow-up questionnaires. Intention-to-treat analyses were performed with a sample size of 99. The MMR+WEB intervention was effective over time (time*group) compared to MMR for the two-item CSQ catastrophizing subscale (P=.003), with an effect size of 0.61 (Cohen d) at 12 months. There were no significant between-group differences over time (time*group) regarding pain intensity, self-efficacy (pain, other symptoms, and general), or regarding six subscales of the two-item CSQ. Improvements over time (time) for the whole study group were found regarding mean (P<.001) and maximum (P=.002) pain intensity. The mean time spent in the Web-based program was 304 minutes (range 0-1142). Participants rated the items of Web-BCPA feasibility between 68/100 and 90/100. Participants in the MMR+WEB group were more satisfied with their MMR at 4 months (P<.001) and at 12 months (P=.003). Conclusions Adding a self-guided Web-based intervention with a focus on behavioral change for activity to MMR can reduce catastrophizing and increase satisfaction with MMR. Patients in MMR may need more supportive coaching to increase adherence in the Web-BCPA to find it valuable. ClinicalTrial Clinicaltrials.gov NCT01475591; https://clinicaltrials.gov/ct2/show/NCT01475591 (Archived by WebCite at http://www.webcitation.org/6kUnt7VQh) PMID:27707686

  1. Effects of the Web Behavior Change Program for Activity and Multimodal Pain Rehabilitation: Randomized Controlled Trial.

    PubMed

    Nordin, Catharina A; Michaelson, Peter; Gard, Gunvor; Eriksson, Margareta K

    2016-10-05

    Web-based interventions with a focus on behavior change have been used for pain management, but studies of Web-based interventions integrated in clinical practice are lacking. To emphasize the development of cognitive skills and behavior, and to increase activity and self-care in rehabilitation, the Web Behavior Change Program for Activity (Web-BCPA) was developed and added to multimodal pain rehabilitation (MMR). The objective of our study was to evaluate the effects of MMR in combination with the Web-BCPA compared with MMR among persons with persistent musculoskeletal pain in primary health care on pain intensity, self-efficacy, and coping, as part of a larger collection of data. Web-BCPA adherence and feasibility, as well as treatment satisfaction, were also investigated. A total of 109 participants, mean age 43 (SD 11) years, with persistent pain in the back, neck, shoulder, and/or generalized pain were recruited to a randomized controlled trial with two intervention arms: (1) MMR+WEB (n=60) and (2) MMR (n=49). Participants in the MMR+WEB group self-guided through the eight modules of the Web-BCPA: pain, activity, behavior, stress and thoughts, sleep and negative thoughts, communication and self-esteem, solutions, and maintenance and progress. Data were collected with a questionnaire at baseline and at 4 and 12 months. Outcome measures were pain intensity (Visual Analog Scale), self-efficacy to control pain and to control other symptoms (Arthritis Self-Efficacy Scale), general self-efficacy (General Self-Efficacy Scale), and coping (two-item Coping Strategies Questionnaire; CSQ). Web-BCPA adherence was measured as minutes spent in the program. Satisfaction and Web-BCPA feasibility were assessed by a set of items. Of 109 participants, 99 received the allocated intervention (MMR+WEB: n=55; MMR: n=44); 88 of 99 (82%) completed the baseline and follow-up questionnaires. Intention-to-treat analyses were performed with a sample size of 99. The MMR+WEB intervention was effective over time (time*group) compared to MMR for the two-item CSQ catastrophizing subscale (P=.003), with an effect size of 0.61 (Cohen d) at 12 months. There were no significant between-group differences over time (time*group) regarding pain intensity, self-efficacy (pain, other symptoms, and general), or regarding six subscales of the two-item CSQ. Improvements over time (time) for the whole study group were found regarding mean (P<.001) and maximum (P=.002) pain intensity. The mean time spent in the Web-based program was 304 minutes (range 0-1142). Participants rated the items of Web-BCPA feasibility between 68/100 and 90/100. Participants in the MMR+WEB group were more satisfied with their MMR at 4 months (P<.001) and at 12 months (P=.003). Adding a self-guided Web-based intervention with a focus on behavioral change for activity to MMR can reduce catastrophizing and increase satisfaction with MMR. Patients in MMR may need more supportive coaching to increase adherence in the Web-BCPA to find it valuable. Clinicaltrials.gov NCT01475591; https://clinicaltrials.gov/ct2/show/NCT01475591 (Archived by WebCite at http://www.webcitation.org/6kUnt7VQh).
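    For readers unfamiliar with the effect size reported in both records above (Cohen d = 0.61 for the catastrophizing subscale), the general formula is the standardized difference between two group means; the expression below is the textbook definition, not a reconstruction from this trial's data.

    ```latex
    % Cohen's d: standardized difference between two group means, using the
    % pooled standard deviation (general formula, not derived from this trial).
    d = \frac{\bar{x}_{1} - \bar{x}_{2}}{s_{p}},
    \qquad
    s_{p} = \sqrt{\frac{(n_{1}-1)s_{1}^{2} + (n_{2}-1)s_{2}^{2}}{n_{1}+n_{2}-2}}
    ```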

  2. ACHP | Web Site Privacy Policy

    Science.gov Websites

    …be used after its purpose has been fulfilled. For questions on our Web site privacy policy, please contact the Web manager. Updated October 2, 2006.

  3. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    ERIC Educational Resources Information Center

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…

  4. Analysis of governmental Web sites on food safety issues: a global perspective.

    PubMed

    Namkung, Young; Almanza, Barbara A

    2006-10-01

    Despite a growing concern over food safety issues, as well as a growing dependence on the Internet as a source of information, little research has been done to examine the presence and relevance of food safety-related information on Web sites. The study reported here conducted Web site analysis in order to examine the current operational status of governmental Web sites on food safety issues. The study also evaluated Web site usability, especially information dimensionalities such as utility, currency, and relevance of content, from the perspective of the English-speaking consumer. Results showed that out of 192 World Health Organization members, 111 countries operated governmental Web sites that provide information about food safety issues. Among 171 searchable Web sites from the 111 countries, 123 Web sites (71.9 percent) were accessible, and 81 of those 123 (65.9 percent) were available in English. The majority of Web sites offered search engine tools and related links for more information, but their availability and utility was limited. In terms of content, 69.9 percent of Web sites offered information on foodborne-disease outbreaks, compared with 31.5 percent that had travel- and health-related information.

  5. Automatic and continuous landslide monitoring: the Rotolon Web-based platform

    NASA Astrophysics Data System (ADS)

    Frigerio, Simone; Schenato, Luca; Mantovani, Matteo; Bossi, Giulia; Marcato, Gianluca; Cavalli, Marco; Pasuto, Alessandro

    2013-04-01

    Mount Rotolon (Eastern Italian Alps) is affected by a complex landslide that has threatened the nearby village of Recoaro Terme since 1985. The first written proof of a landslide occurrence dates back to 1798. After the last reactivation in November 2010 (637 mm of intense rainfall recorded in the 12 days before the event), a mass of approximately 320,000 m3 detached from the south flank of Mount Rotolon and evolved into a fast debris flow that ran for about 3 km along the stream bed. A real-time monitoring system was required to detect early indications of rapid movements, potentially saving lives and property. A web-based platform for automatic and continuous monitoring was designed as a first step in the implementation of an early-warning system. Measurements collected by the automated geotechnical and topographic instrumentation, deployed over the landslide body, are gathered in a central box station. After the calibration process, they are transmitted by web services to a local server, where graphs, maps, reports and alert announcements are automatically generated and updated. All the processed information is available through a web browser with different access rights. The web environment provides the following advantages: (1) data are collected from different sources and matched in a single server-side frame; (2) a remote user interface allows regular technical maintenance and direct access to the instruments; (3) the data management system is synchronized and automatically tested; and (4) a browser-based graphical user interface provides a user-friendly tool for decision-makers to interact with a continuously updated system. At this site, two monitoring systems are currently in operation: (1) a GB-InSAR radar interferometer (University of Florence, Department of Earth Sciences) and (2) an Automated Total Station (ATS) combined with an extensometer network in a Web-based solution (CNR-IRPI Padova). This work deals with details on the methodology, services and techniques adopted for the second monitoring solution. The activity directly interfaces with the local Civil Protection agency, the Regional Geological Service and local authorities, with integrated roles and aims.
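    As a purely illustrative sketch of the server-side checks such a platform performs after ingesting instrument data, the snippet below converts recent extensometer readings into a displacement rate and raises an alert when a threshold is exceeded. The threshold, readings, and notification step are invented, not values from the Rotolon system.

    ```python
    """Illustrative server-side alert check: estimate displacement rate from
    recent extensometer readings and flag threshold exceedance. Threshold and
    readings are invented, not values from the Rotolon platform."""
    from datetime import datetime

    ALERT_MM_PER_DAY = 5.0   # illustrative threshold, not from the paper

    # (timestamp, cumulative displacement in mm) from one extensometer
    readings = [
        (datetime(2013, 1, 10, 0, 0), 12.0),
        (datetime(2013, 1, 11, 0, 0), 13.1),
        (datetime(2013, 1, 12, 0, 0), 24.0),
    ]


    def displacement_rate(series):
        """Mean rate (mm/day) between the first and last reading."""
        (t0, d0), (t1, d1) = series[0], series[-1]
        days = (t1 - t0).total_seconds() / 86400.0
        return (d1 - d0) / days


    rate = displacement_rate(readings)
    print(f"rate = {rate:.2f} mm/day")
    if rate >= ALERT_MM_PER_DAY:
        print("ALERT: displacement rate above threshold, notify stakeholders")
    ```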

  6. Oregon Magnetic and Gravity Maps and Data: A Web Site for Distribution of Data

    USGS Publications Warehouse

    Roberts, Carter W.; Kucks, Robert P.; Hill, Patricia L.

    2008-01-01

    This web site gives the results of a USGS project to acquire the best available, public-domain aeromagnetic and gravity data in the United States and merge these data into uniform, composite grids for each State. The results for the State of Oregon are presented here. Files of the aeromagnetic and gravity grids and images are available for downloading. In Oregon, 49 magnetic surveys have been knit together to form a single digital grid and map. Also, a complete Bouguer gravity anomaly grid and map was generated from 40,665 gravity station measurements in and adjacent to Oregon. In addition, a map shows the location of the aeromagnetic surveys, color-coded by survey flight-line spacing. This project was supported by the Mineral Resource Program of the USGS.

  7. ACHP | Other Historic Preservation Web Sites of Interest

    Science.gov Websites

    Other Historic Preservation Web Sites of Interest (National Transportation ...). An organization's link on the ACHP's Web site does not imply endorsement of the organization or its ...

  8. Evaluating Domestic and International Web-Site Strategies.

    ERIC Educational Resources Information Center

    Simeon, Roblyn

    1999-01-01

    Presents the AIPD (attracting, informing, positioning, and delivering) approach to the evaluation of commercial Web sites, which assesses the strategic potential of Web sites, provides a framework for the development of competitive sites, and compares Web site strategies within and across national boundaries. Compares Internet strategies of Japanese…

  9. 77 FR 74266 - Review of National Environmental Policy Act Categorical Exclusion Survey Posted on DOT/FHWA Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-13

    ... of National Environmental Policy Act Categorical Exclusion Survey Posted on DOT/FHWA Web Site AGENCY... review is now available on the FHWA Web site, http://www.fhwa.dot.gov/map21, and FTA Web site, http://www.fta.dot.gov/map21. DATES: These reports were posted on the Web site on December 7, 2012...

  10. Evaluation of the content and accessibility of web sites for accredited orthopaedic sports medicine fellowships.

    PubMed

    Mulcahey, Mary K; Gosselin, Michelle M; Fadale, Paul D

    2013-06-19

    The Internet is a common source of information for orthopaedic residents applying for sports medicine fellowships, with the web sites of the American Orthopaedic Society for Sports Medicine (AOSSM) and the San Francisco Match serving as central databases. We sought to evaluate the web sites for accredited orthopaedic sports medicine fellowships with regard to content and accessibility. We reviewed the existing web sites of the ninety-five accredited orthopaedic sports medicine fellowships included in the AOSSM and San Francisco Match databases from February to March 2012. A Google search was performed to determine the overall accessibility of program web sites and to supplement information obtained from the AOSSM and San Francisco Match web sites. The study sample consisted of the eighty-seven programs whose web sites connected to information about the fellowship. Each web site was evaluated for its informational value. Of the ninety-five programs, fifty-one (54%) had links listed in the AOSSM database. Three (3%) of all accredited programs had web sites that were linked directly to information about the fellowship. Eighty-eight (93%) had links listed in the San Francisco Match database; however, only five (5%) had links that connected directly to information about the fellowship. Of the eighty-seven programs analyzed in our study, all eighty-seven web sites (100%) provided a description of the program and seventy-six web sites (87%) included information about the application process. Twenty-one web sites (24%) included a list of current fellows. Fifty-six web sites (64%) described the didactic instruction, seventy (80%) described team coverage responsibilities, forty-seven (54%) included a description of cases routinely performed by fellows, forty-one (47%) described the role of the fellow in seeing patients in the office, eleven (13%) included call responsibilities, and seventeen (20%) described a rotation schedule. Two Google searches identified direct links for 67% to 71% of all accredited programs. Most accredited orthopaedic sports medicine fellowships lack easily accessible or complete web sites in the AOSSM or San Francisco Match databases. Improvement in the accessibility and quality of information on orthopaedic sports medicine fellowship web sites would facilitate the ability of applicants to obtain useful information.

  11. BioData: a national aquatic bioassessment database

    USGS Publications Warehouse

    MacCoy, Dorene

    2011-01-01

    BioData is a U.S. Geological Survey (USGS) web-enabled database that for the first time provides for the capture, curation, integration, and delivery of bioassessment data collected by local, regional, and national USGS projects. BioData offers field biologists advanced capabilities for entering, editing, and reviewing the macroinvertebrate, algae, fish, and supporting habitat data from rivers and streams. It offers data archival and curation capabilities that protect and maintain data for the long term. BioData provides the Federal, State, and local governments, as well as the scientific community, resource managers, the private sector, and the public with easy access to tens of thousands of samples collected nationwide from thousands of stream and river sites. BioData also provides the USGS with centralized data storage for delivering data to other systems and applications through automated web services. BioData allows users to combine data sets of known quality from different projects in various locations over time. It provides a nationally aggregated database for users to leverage data from many independent projects that, until now, was not feasible at this scale. For example, from 1991 to 2011, the USGS Idaho Water Science Center collected more than 816 bioassessment samples from 63 sites for the National Water Quality Assessment (NAWQA) Program and more than 477 samples from 39 sites for a cooperative USGS and State of Idaho Statewide Water Quality Network (fig. 1). Using BioData, 20 years of samples collected for both of these projects can be combined for analysis. BioData delivers all of the data using current taxonomic nomenclature, thus relieving users of the difficult and time-consuming task of harmonizing taxonomy among samples collected during different time periods. Fish data are reported using the Integrated Taxonomic Information Service (ITIS) Taxonomic Serial Numbers (TSN's). A simple web-data input interface and self-guided, public data-retrieval web site provides access to bioassessment data. BioData currently accepts data collected using two national protocols: (1) NAWQA and (2) U.S. Environmental Protection Agency (USEPA) National Rivers and Streams Assessment (NRSA). Additional collection protocols are planned for future versions.

  12. SSE Transition to POWER is Now Complete

    Atmospheric Science Data Center

    2018-06-21

    ... A new POWER home page with enhanced responsive GIS-enabled web data services and mapping capabilities replaced the SSE site on June 13, 2018. This current set of SSE web applications and website is no longer accessible. The new POWER includes ...

  13. StreamStats: a U.S. geological survey web site for stream information

    USGS Publications Warehouse

    Kernell, G. Ries; Gray, John R.; Renard, Kenneth G.; McElroy, Stephen A.; Gburek, William J.; Canfield, H. Evan; Scott, Russell L.

    2003-01-01

    The U.S. Geological Survey has developed a Web application, named StreamStats, for providing streamflow statistics, such as the 100-year flood and the 7-day, 10-year low flow, to the public. Statistics can be obtained for data-collection stations and for ungaged sites. Streamflow statistics are needed for water-resources planning and management; for design of bridges, culverts, and flood-control structures; and for many other purposes. StreamStats users can point and click on data-collection stations shown on a map in their Web browser window to obtain previously determined streamflow statistics and other information for the stations. Users also can point and click on any stream shown on the map to get estimates of streamflow statistics for ungaged sites. StreamStats determines the watershed boundaries and measures physical and climatic characteristics of the watersheds for the ungaged sites by use of a Geographic Information System (GIS), and then it inserts the characteristics into previously determined regression equations to estimate the streamflow statistics. Compared to manual methods, StreamStats reduces the average time needed to estimate streamflow statistics for ungaged sites from several hours to several minutes.
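
    The regression step described above can be illustrated with a short sketch. Regional regression equations for streamflow statistics commonly take a log-linear form such as Q_T = a * A^b * P^c, where A is drainage area and P is mean annual precipitation; the coefficients below are invented placeholders, not actual USGS regression values, and real StreamStats equations differ by region and statistic.

        # Illustrative only: apply a log-linear regional regression of the general
        # form Q_T = a * A^b * P^c to basin characteristics measured for an
        # ungaged site. Coefficients are placeholders, not USGS values.
        def peak_flow_estimate(drainage_area_mi2, precip_in, a=0.47, b=0.78, c=1.35):
            """Estimate a T-year peak flow (cubic feet per second)."""
            return a * (drainage_area_mi2 ** b) * (precip_in ** c)

        if __name__ == "__main__":
            q = peak_flow_estimate(drainage_area_mi2=52.3, precip_in=38.0)
            print(f"Estimated peak flow: {q:.0f} cfs")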

  14. Web-Resources for Astronomical Data in the Ultraviolet

    NASA Astrophysics Data System (ADS)

    Sachkov, M. E.; Malkov, O. Yu.

    2017-12-01

    In this paper we describe databases of space projects that are operating or have operated in the ultraviolet spectral region. We give brief descriptions and links to major sources for UV data on the web: archives, space mission sites, databases, catalogues. We pay special attention to the World Space Observatory—Ultraviolet mission that will be launched in 2021.

  15. 76 FR 20054 - Self-Regulatory Organizations; the NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-11

    ... over 50,000,000 investors on Web sites operated by Google, Interactive Data, and Dow Jones, among... systems (``ATSs''), including dark pools and electronic communication networks (``ECNs''). Each SRO market..., Attain, TracECN, BATS Trading and Direct Edge. Today, BATS publishes its data at no charge on its Web...

  16. The Effectiveness of Course Web Sites in Higher Education: An Exploratory Study.

    ERIC Educational Resources Information Center

    Comunale, Christie L.; Sexton, Thomas R.; Voss, Diana J. Pedagano

    2002-01-01

    Describes an exploratory study of the educational effectiveness of course Web sites among undergraduate accounting students and graduate students in business statistics. Measured Web site visit frequency, usefulness of each site feature, and the impacts of Web sites on perceived learning and course performance. (Author/LRW)

  17. BPELPower—A BPEL execution engine for geospatial web services

    NASA Astrophysics Data System (ADS)

    Yu, Genong (Eugene); Zhao, Peisheng; Di, Liping; Chen, Aijun; Deng, Meixia; Bai, Yuqi

    2012-10-01

    The Business Process Execution Language (BPEL) has become a popular choice for orchestrating and executing workflows in the Web environment. As one special kind of scientific workflow, geospatial Web processing workflows are data-intensive, deal with complex structures in data and geographic features, and execute automatically with limited human intervention. To enable the proper execution and coordination of geospatial workflows, a specially enhanced BPEL execution engine is required. BPELPower was designed, developed, and implemented as a generic BPEL execution engine with enhancements for executing geospatial workflows. The enhancements lie especially in its capabilities for handling Geography Markup Language (GML) and standard geospatial Web services, such as the Web Processing Service (WPS) and the Web Feature Service (WFS). BPELPower has been used in several demonstrations over the past decade. Two scenarios are discussed in detail to demonstrate the capabilities of BPELPower. The study showed a standards-compliant, Web-based approach for properly supporting geospatial processing, with the only enhancement at the implementation level. Pattern-based evaluation and performance improvement of the engine are discussed: BPELPower directly supports 22 workflow control patterns and 17 workflow data patterns. In the future, the engine will be enhanced with high-performance parallel processing and broader Web paradigms.
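
    The kind of step such a workflow orchestrates can be sketched in Python. The snippet below is not BPELPower itself: the endpoints, layer name, process identifier and the input encoding are placeholders, and real WPS servers differ in how complex inputs are passed; it only illustrates chaining an OGC Web Feature Service request into a Web Processing Service request over plain HTTP.

        # Minimal sketch (not BPELPower): fetch features from a WFS and hand them
        # to a WPS process. All URLs and identifiers below are hypothetical.
        import requests

        WFS_URL = "http://example.org/geoserver/wfs"   # hypothetical endpoint
        WPS_URL = "http://example.org/wps"             # hypothetical endpoint

        def fetch_features(type_name):
            """Retrieve a GML feature collection from the WFS."""
            params = {"service": "WFS", "version": "1.1.0",
                      "request": "GetFeature", "typeName": type_name}
            return requests.get(WFS_URL, params=params, timeout=60).text

        def execute_process(identifier, wfs_request_url):
            """Ask the WPS to run a process, passing the WFS request by reference."""
            params = {"service": "WPS", "version": "1.0.0", "request": "Execute",
                      "identifier": identifier,
                      # Exact KVP encoding of complex inputs varies by server.
                      "DataInputs": f"features=@xlink:href={wfs_request_url}"}
            return requests.get(WPS_URL, params=params, timeout=300).text

        if __name__ == "__main__":
            gml = fetch_features("topp:states")   # placeholder layer name
            print(len(gml), "bytes of GML retrieved from the WFS")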

  18. Adapting the Content of Cancer Web Sites to the Information Needs of Patients: Reliability and Readability

    PubMed Central

    Bermúdez-Tamayo, Clara; Pernett, Jaime Jiménez; Garcia-Gutierrez, Jose Francisco; Cózar-Olmo, José Manuel; Valero-Aguilera, Beatriz

    2013-01-01

    Background: People who use the Internet to research health topics do not usually find all the information they need and do not trust what they read. This study was designed to assess the reliability, accessibility, readability, and popularity of cancer Web sites in Spanish and to analyze the suitability of Web site content in accordance with the specific information needs of cancer patients. Materials and Methods: This was a two-phase, cross-sectional, descriptive study. The first phase involved data gathering through online searches and direct observation. The second phase involved individual structured interviews with 169 patients with breast, prostate, bladder, and kidney cancer. Spearman rank correlations were calculated between variables. Results: Most sites belonged to nonprofit organizations, followed by universities or medical centers (14%). Thirty-one percent of the Web sites had quality seals, 59% provided details of authorship, 62% provided references to bibliographic sources, 38% identified their funding sources, and 54% showed the date of their last update. Twenty-one percent of the Web sites did not meet the minimum accessibility criteria. With regard to readability, 24% of the texts were considered to be “quite difficult.” Patients' information needs vary depending on the type of cancer they have, although all patients want to know about the likelihood of a cure, survival rates, the side effects, and risks of treatment. Conclusions: The health information on cancer available on the Internet in Spanish is not very reliable, accessible, or readable and is not necessarily the information that breast, kidney, prostate, and bladder cancer patients require. The content of cancer Web sites needs to be assessed according to the information needs of patients. PMID:24073899

  19. Adapting the content of cancer web sites to the information needs of patients: reliability and readability.

    PubMed

    Alba-Ruiz, Ruben; Bermúdez-Tamayo, Clara; Pernett, Jaime Jiménez; Garcia-Gutierrez, Jose Francisco; Cózar-Olmo, José Manuel; Valero-Aguilera, Beatriz

    2013-12-01

    People who use the Internet to research health topics do not usually find all the information they need and do not trust what they read. This study was designed to assess the reliability, accessibility, readability, and popularity of cancer Web sites in Spanish and to analyze the suitability of Web site content in accordance with the specific information needs of cancer patients. This was a two-phase, cross-sectional, descriptive study. The first phase involved data gathering through online searches and direct observation. The second phase involved individual structured interviews with 169 patients with breast, prostate, bladder, and kidney cancer. Spearman rank correlations were calculated between variables. Most sites belonged to nonprofit organizations, followed by universities or medical centers (14%). Thirty-one percent of the Web sites had quality seals, 59% provided details of authorship, 62% provided references to bibliographic sources, 38% identified their funding sources, and 54% showed the date of their last update. Twenty-one percent of the Web sites did not meet the minimum accessibility criteria. With regard to readability, 24% of the texts were considered to be "quite difficult." Patients' information needs vary depending on the type of cancer they have, although all patients want to know about the likelihood of a cure, survival rates, the side effects, and risks of treatment. The health information on cancer available on the Internet in Spanish is not very reliable, accessible, or readable and is not necessarily the information that breast, kidney, prostate, and bladder cancer patients require. The content of cancer Web sites needs to be assessed according to the information needs of patients.

  20. Building a Library Web Site on the Pillars of Web 2.0

    ERIC Educational Resources Information Center

    Coombs, Karen A.

    2007-01-01

    In this article, the author discusses a project undertaken to reshape the University of Houston Libraries' Web site. The site had been in a state of flux and needed a new structure for both managing and organizing it. She realized the staff was looking for a Web site that was more "Web 2.0" in nature. Web 2.0 is often…

  1. The Scholarship of Teaching and Learning History Comes of Age: A New International Organization and Web Site/Newsletter

    ERIC Educational Resources Information Center

    Pace, David; Erekson, Keith A.

    2006-01-01

    In the four decades since the creation of this journal, historians in North America have seen a steady increase in the materials available to assist them in more effectively sharing the fruits of their discipline with their students. In the 1990s this effort was given a new intensity by the introduction into academia of the concept of a…

  2. A review of guidelines on home drug testing web sites for parents.

    PubMed

    Washio, Yukiko; Fairfax-Columbo, Jaymes; Ball, Emily; Cassey, Heather; Arria, Amelia M; Bresani, Elena; Curtis, Brenda L; Kirby, Kimberly C

    2014-01-01

    To update and extend prior work reviewing Web sites that discuss home drug testing for parents, and assess the quality of information that the Web sites provide, to assist them in deciding when and how to use home drug testing. We conducted a worldwide Web search that identified 8 Web sites providing information for parents on home drug testing. We assessed the information on the sites using a checklist developed with field experts in adolescent substance abuse and psychosocial interventions that focus on urine testing. None of the Web sites covered all the items on the 24-item checklist, and only 3 covered at least half of the items (12, 14, and 21 items, respectively). The remaining 5 Web sites covered less than half of the checklist items. The mean number of items covered by the Web sites was 11. Among the Web sites that we reviewed, few provided thorough information to parents regarding empirically supported strategies to effectively use drug testing to intervene on adolescent substance use. Furthermore, most Web sites did not provide thorough information regarding the risks and benefits to inform parents' decision to use home drug testing. Empirical evidence regarding efficacy, benefits, risks, and limitations of home drug testing is needed.

  3. Information about Sexual Health on Crisis Pregnancy Center Web Sites: Accurate for Adolescents?

    PubMed

    Bryant-Comstock, Katelyn; Bryant, Amy G; Narasimhan, Subasri; Levi, Erika E

    2016-02-01

    The objective of this study was to evaluate the quality and accuracy of sexual health information on crisis pregnancy center Web sites listed in state resource directories for pregnant women, and whether these Web sites specifically target adolescents. The study was a survey of the sexual health information presented on crisis pregnancy center Web sites, conducted over the Internet, with evaluation of that information as the main outcome measure. Themes included statements that condoms are not effective, promotion of abstinence-only education, availability of comprehensive sexual education, appeal to a young audience, provision of comprehensive sexual health information, and information about sexually transmitted infections (STIs). Crisis pregnancy center Web sites provide inaccurate and misleading information about condoms, STIs, and methods to prevent STI transmission. This information might be particularly harmful to adolescents, who might be unable to discern the quality of sexual health information on crisis pregnancy center Web sites. Listing crisis pregnancy centers in state resource directories might lend legitimacy to the information on these Web sites. States should be discouraged from listing these Web sites as an accurate source of information in their resource directories. Copyright © 2016 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.

  4. Marketing your medical practice with an effective web presence.

    PubMed

    Finch, Tammy

    2004-01-01

    The proliferation of the World Wide Web has provided an opportunity for medical practices to sell themselves through low-cost marketing on the Internet. A Web site is a quick and effective way to provide patients with up-to-date treatment and procedure information. This article provides suggestions on what to include on a medical practice's Web site, how the Web can assist office staff and physicians, and cost options for your Web site. The article also discusses design tips, such as Web-site optimization.

  5. Depth-of-processing effects as college students use academic advising Web sites.

    PubMed

    Boatright-Horowitz, Su L; Langley, Michelle; Gunnip, Matthew

    2009-06-01

    This research examined students' cognitive and affective responses to an academic advising Web site. Specifically, we investigated whether exposure to our Web site increased student reports that they would access university Web sites to obtain various types of advising information. A depth-of-processing (DOP) manipulation revealed this effect as students engaged in semantic processing of Web content but not when they engaged in superficial examination of the physical appearance of the same Web site. Students appeared to scan online academic advising materials for information of immediate importance without noticing other information or hyperlinks (e.g., regarding internships and careers). Suggestions are presented for increasing the effectiveness of academic advising Web sites.

  6. Presence of pro-tobacco messages on the Web.

    PubMed

    Hong, Traci; Cody, Michael J

    2002-01-01

    Ignored in the finalized Master Settlement Agreement (National Association of Attorneys General, 1998), the unmonitored, unregulated World Wide Web (Web) can operate as a major vehicle for delivering pro-tobacco messages, images, and products to millions of young consumers. A content analysis of 318 randomly sampled pro-tobacco Web sites revealed that tobacco has a pervasive presence on the Web, especially on e-commerce sites and sites featuring hobbies, recreation, and "fetishes." Products can be ordered online on nearly 50% of the sites, but only 23% of the sites included underage verification. Further, only 11% of these sites contain health warnings. Instead, pro-tobacco sites frequently associate smoking with "glamorous" and "alternative" lifestyles, and with images of young males and young (thin, attractive) females. Finally, many of the Web sites offered interactive site features that are potentially appealing to young Web users. Recommendations for future research and counterstrategies are discussed.

  7. jsNMR: an embedded platform-independent NMR spectrum viewer.

    PubMed

    Vosegaard, Thomas

    2015-04-01

    jsNMR is a lightweight NMR spectrum viewer written in JavaScript/HyperText Markup Language (HTML), which provides a cross-platform spectrum visualizer that runs on all computer architectures including mobile devices. Experimental (and simulated) datasets are easily opened in jsNMR by (i) drag and drop on a jsNMR browser window, (ii) by preparing a jsNMR file from the jsNMR web site, or (iii) by mailing the raw data to the jsNMR web portal. jsNMR embeds the original data in the HTML file, so a jsNMR file is a self-transforming dataset that may be exported to various formats, e.g. comma-separated values. The main applications of jsNMR are to provide easy access to NMR data without the need for dedicated software installed and to provide the possibility to visualize NMR spectra on web sites. Copyright © 2015 John Wiley & Sons, Ltd.

  8. Research on the optimization strategy of web search engine based on data mining

    NASA Astrophysics Data System (ADS)

    Chen, Ronghua

    2018-04-01

    With the wide application of search engines, Web sites have become an important way for people to obtain information. However, the amount of Web site information is growing explosively, making it very difficult for people to find the information they need, and existing search engines cannot fully meet this need; there is therefore an urgent need for personalized Web information services, and data mining technology offers a way to meet this new challenge. In order to improve the accuracy with which people find information on Web sites, a Web site search engine optimization strategy based on data mining is proposed and verified through a search engine optimization experiment. The results show that the proposed strategy improves the accuracy with which people find information and reduces the time needed to find it, and therefore has important practical value.

  9. Project Anqa: Digitizing and Documenting Cultural Heritage in the Middle East

    NASA Astrophysics Data System (ADS)

    Akhtar, S.; Akoglu, G.; Simon, S.; Rushmeier, H.

    2017-08-01

    The practice of digitizing cultural heritage sites is gaining ground among conservation scientists and scholars in architecture, art history, computer science, and related fields. Recently, the location of such sites in areas of intense conflict has highlighted the urgent need for documenting cultural heritage for the purposes of preservation and posterity. The complex histories of such sites requires more than just their digitization, and should also include the meaningful interpretation of buildings and their surroundings with respect to context and intangible values. Project Anqa is an interdisciplinary and multi-partner effort that goes beyond simple digitization to record at-risk heritage sites throughout the Middle East and Saharan Africa, most notably in Syria and Iraq, before they are altered or destroyed. Through a collaborative process, Anqa assembles documentation, historically contextualizes it, and makes data accessible and useful for scholars, peers, and the wider public through state-of-the-art tools. The aim of the project is to engage in capacity-building on the ground in Syria and Iraq, as well as to create an educational web platform that informs viewers about cultural heritage in the region through research, digital storytelling, and the experience of virtual environments.

  10. Generating Mosaics of Astronomical Images

    NASA Technical Reports Server (NTRS)

    Bergou, Attila; Berriman, Bruce; Good, John; Jacob, Joseph; Katz, Daniel; Laity, Anastasia; Prince, Thomas; Williams, Roy

    2005-01-01

    "Montage" is the name of a service of the National Virtual Observatory (NVO), and of software being developed to implement the service via the World Wide Web. Montage generates science-grade custom mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. "Science-grade" in this context signifies that terrestrial and instrumental features are removed from images in a way that can be described quantitatively. "Custom" refers to user-specified parameters of projection, coordinates, size, rotation, and spatial sampling. The greatest value of Montage is expected to lie in its ability to analyze images at multiple wavelengths, delivering them on a common projection, coordinate system, and spatial sampling, and thereby enabling further analysis as though they were part of a single, multi-wavelength image. Montage will be deployed as a computation-intensive service through existing astronomy portals and other Web sites. It will be integrated into the emerging NVO architecture and will be executed on the TeraGrid. The Montage software will also be portable and publicly available.

  11. Visualization and Phospholipid Identification (VaLID): online integrated search engine capable of identifying and visualizing glycerophospholipids with given mass

    PubMed Central

    Figeys, Daniel; Fai, Stephen; Bennett, Steffany A. L.

    2013-01-01

    Motivation: Establishing phospholipid identities in large lipidomic datasets is a labour-intensive process. Where genomics and proteomics capitalize on sequence-based signatures, glycerophospholipids lack easily definable molecular fingerprints. Carbon chain length, degree of unsaturation, linkage, and polar head group identity must be calculated from mass to charge (m/z) ratios under defined mass spectrometry (MS) conditions. Given increasing MS sensitivity, many m/z values are not represented in existing prediction engines. To address this need, Visualization and Phospholipid Identification is a web-based application that returns all theoretically possible phospholipids for any m/z value and MS condition. Visualization algorithms produce multiple chemical structure files for each species. Curated lipids detected by the Canadian Institutes of Health Research Training Program in Neurodegenerative Lipidomics are provided as high-resolution structures. Availability: VaLID is available through the Canadian Institutes of Health Research Training Program in Neurodegenerative Lipidomics resources web site at https://www.med.uottawa.ca/lipidomics/resources.html. Contacts: lipawrd@uottawa.ca Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:23162086

  12. Improving Web Searches: Case Study of Quit-Smoking Web Sites for Teenagers

    PubMed Central

    Skinner, Harvey

    2003-01-01

    Background The Web has become an important and influential source of health information. With the vast number of Web sites on the Internet, users often resort to popular search sites when searching for information. However, little is known about the characteristics of Web sites returned by simple Web searches for information about smoking cessation for teenagers. Objective To determine the characteristics of Web sites retrieved by search engines about smoking cessation for teenagers and how information quality correlates with the search ranking. Methods The top 30 sites returned by 4 popular search sites in response to the search terms "teen quit smoking" were examined. The information relevance and quality characteristics of these sites were evaluated by 2 raters. Objective site characteristics were obtained using a page-analysis Web site. Results Only 14 of the 30 Web sites are of direct relevance to smoking cessation for teenagers. The readability of about two-thirds of the 14 sites is below an eighth-grade school level and they ranked significantly higher (Kendall rank correlation, tau = -0.39, P= .05) in search-site results than sites with readability above or equal to that grade level. Sites that ranked higher were significantly associated with the presence of e-mail address for contact (tau = -0.46, P= .01), annotated hyperlinks to external sites (tau = -0.39, P= .04), and the presence of meta description tag (tau = -0.48, P= .002). The median link density (number of external sites that have a link to that site) of the Web pages was 6 and the maximum was 735. A higher link density was significantly associated with a higher rank (tau = -0.58, P= .02). Conclusions Using simple search terms on popular search sites to look for information on smoking cessation for teenagers resulted in less than half of the sites being of direct relevance. To improve search efficiency, users could supplement results obtained from simple Web searches with human-maintained Web directories and learn to refine their searches with more advanced search syntax. PMID:14713656
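
    The rank correlations reported above can be reproduced in principle with standard statistical libraries. The following sketch, using invented data and assuming scipy is installed, shows the kind of Kendall rank correlation between search ranking and a site characteristic (here, readability grade level) that the study describes.

        # Illustrative only: Kendall rank correlation between search rank and a
        # site characteristic. The numbers below are invented, not study data.
        from scipy.stats import kendalltau

        search_rank = [1, 2, 3, 4, 5, 6, 7, 8]                    # 1 = top result
        readability_grade = [6.2, 5.8, 7.1, 6.5, 8.3, 9.0, 8.7, 9.4]

        tau, p_value = kendalltau(search_rank, readability_grade)
        print(f"tau = {tau:.2f}, p = {p_value:.3f}")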

  13. Perthes Disease: The Quality and Reliability of Information on the Internet.

    PubMed

    Nassiri, Mujtaba; Bruce-Brand, Robert A; O'Neill, Francis; Chenouri, Shojaeddin; Curtin, Paul

    2015-01-01

    Research has shown that up to 89% of parents used the Internet to seek health information regarding their child's medical condition. Much of the information on the Internet is valuable; however, the quality of health information is variable and unregulated. The aim of this study was to evaluate the quality and content of information about Perthes disease on the Internet using recognized scoring systems and identification of quality markers, and to describe a novel disease-specific score. We searched the top 3 search engines (Google, Yahoo!, and Bing) for the keywords "Perthes disease." Forty-five unique Web sites were identified. The Web sites were then categorized by type and assessed using the DISCERN score, the Journal of the American Medical Association (JAMA) benchmark criteria, and a novel Perthes-specific Content score. The presence of the Health On the Net (HON) code, a reported quality assurance marker, was noted. Of the Web sites analyzed, the majority were governmental and nonprofit organization (NPO) sites (37.8%), followed by commercial Web sites (22.2%). Only 6 of the Web sites were HONcode certified. The mean DISCERN score was 53.1 (SD=9.0). The governmental and NPO Web sites had the highest overall DISCERN scores, followed closely by physician Web sites. The mean JAMA benchmark criteria score was 2.1 (SD=1.2). Nine Web sites had maximal scores, and the academic Web sites had the highest overall JAMA benchmark scores. DISCERN scores, JAMA benchmark scores, and Perthes-specific Content scores were all greater for Web sites that bore the HONcode seal. The information available online regarding Perthes disease is of variable quality. Governmental and NPO Web sites predominate and also provide higher quality content. The HONcode seal is a reliable indicator of Web site quality. Physicians should recommend the HONcode seal to their patients as a reliable indicator of Web site quality or, better yet, refer patients to sites they have personally reviewed. Supplying parents with a guide to health information on the Internet will help exclude Web sites as sources of misinformation.

  14. Web-based X-ray quality control documentation.

    PubMed

    David, George; Burnett, Lou Ann; Schenkel, Robert

    2003-01-01

    The department of radiology at the Medical College of Georgia Hospital and Clinics has developed an equipment quality control web site. Our goal is to provide immediate access to virtually all medical physics survey data. The web site is designed to assist equipment engineers, department management and technologists. By improving communications and access to equipment documentation, we believe productivity is enhanced. The creation of the quality control web site was accomplished in three distinct steps. First, survey data had to be placed in a computer format. The second step was to convert these various computer files to a format supported by commercial web browsers. Third, a comprehensive home page had to be designed to provide convenient access to the multitude of surveys done in the various x-ray rooms. Because we had spent years previously fine-tuning the computerization of the medical physics quality control program, most survey documentation was already in spreadsheet or database format. A major technical decision was the method of conversion of survey spreadsheet and database files into documentation appropriate for the web. After an unsatisfactory experience with a HyperText Markup Language (HTML) converter (packaged with spreadsheet and database software), we tried creating Portable Document Format (PDF) files using Adobe Acrobat software. This process preserves the original formatting of the document and takes no longer than conventional printing; therefore, it has been very successful. Although the PDF file generated by Adobe Acrobat is a proprietary format, it can be displayed through a conventional web browser using the freely distributed Adobe Acrobat Reader program that is available for virtually all platforms. Once a user installs the software, it is automatically invoked by the web browser whenever the user follows a link to a file with a PDF extension. Although no confidential patient information is available on the web site, our legal department recommended that we secure the site in order to keep out those wishing to make mischief. Our interim solution has not been to password protect the page, which we feared would hinder access for occasional legitimate users, but also not to provide links to it from other hospital and department pages. Utility and productivity were improved and time and money were saved by making radiological equipment quality control documentation instantly available on-line.

  15. Beyond Description: Converting Web Site Usage Statistics into Concrete Site Improvement Ideas

    ERIC Educational Resources Information Center

    Arendt, Julie; Wagner, Cassie

    2010-01-01

    Web site usage statistics are a widely used tool for Web site development, but libraries are still learning how to use them successfully. This case study summarizes how Morris Library at Southern Illinois University Carbondale implemented Google Analytics on its Web site and used the reports to inform a site redesign. As the main campus library at…

  16. Elusive or Illuminating: Using the Web To Explore the Salem Witchcraft Trials.

    ERIC Educational Resources Information Center

    Hurter, Stephanie R.

    2003-01-01

    Presents Web sites useful for teaching about the Salem (Massachusetts) witchcraft trials. Includes Web sites that offer primary source material, collections of Web sites, teaching material, and sites that are interactive, including features, such as QuickTime movies. (CMK)

  17. Web Site Design Benchmarking within Industry Groups.

    ERIC Educational Resources Information Center

    Kim, Sung-Eon; Shaw, Thomas; Schneider, Helmut

    2003-01-01

    Discussion of electronic commerce focuses on Web site evaluation criteria and applies them to different industry groups in Korea. Defines six categories of Web site evaluation criteria: business function, corporate credibility, contents reliability, Web site attractiveness, systematic structure, and navigation; and discusses differences between…

  18. Testosterone replacement therapy and the internet: an assessment of providers' health-related web site information content.

    PubMed

    Oberlin, Daniel T; Masson, Puneet; Brannigan, Robert E

    2015-04-01

    To compare how providers of testosterone replacement therapy (TRT) in large metropolitan cities promote androgen replacement on their patient-oriented Web sites. TRT provider Web sites were identified using Google search and the terms "Testosterone replacement" and the name of the 5 most populous US cities. These Web sites were assessed for (1) type or specialty of medical provider, (2) discussion of the benefits and risks of TRT, and (3) industry affiliations. In total, 75 Web sites were evaluated. Twenty-seven of the 75 clinics (36%) were directed by nonphysicians, 35 (47%) were overseen by nonurology or nonendocrine physicians, and only 13 (17%) were specialist managed. Fourteen of 75 (18.6%) Web sites disclosed industry relationships. Ninety-five percent of Web sites promoted the benefits of TRT including improved sex drive, cognitive improvement, increased muscle strength, and/or improved energy. Only 20 of 75 Web sites (26.6%) described any side effect of TRT. Web sites directed by specialists were twice as likely to discuss risks of TRT compared with nonspecialist providers (41% vs 20%; odds ratio = 2.77; P <.01). Nine of 75 (12%) of all Web sites actually refuted that TRT was associated with significant side effects. Urologists and endocrinologists are in the minority of providers promoting TRT on the Internet. Specialists are more likely to discuss risks associated with TRT although the majority of surveyed Web sites that promote TRT do not mention treatment risks. There is substantial variability in quality and quantity of information on provider Web sites, which may contribute to misinformation regarding this prevalent health issue. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Accessing the public MIMIC-II intensive care relational database for clinical research.

    PubMed

    Scott, Daniel J; Lee, Joon; Silva, Ikaro; Park, Shinhyuk; Moody, George B; Celi, Leo A; Mark, Roger G

    2013-01-10

    The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge "Predicting mortality of ICU Patients". QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database.
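
    For the locally installed VM route mentioned above, analyses are typically run as SQL against the relational database. The sketch below, in Python with psycopg2, is illustrative only: the connection details and the table and column names are placeholders, and the actual MIMIC-II schema should be taken from the project documentation.

        # Minimal sketch of querying a local MIMIC-II installation from Python.
        # Schema details (table and column names) are illustrative; consult the
        # MIMIC-II documentation for the real relational schema.
        import psycopg2

        conn = psycopg2.connect(dbname="mimic2", user="mimic", host="localhost")
        with conn, conn.cursor() as cur:
            cur.execute(
                """
                SELECT subject_id, COUNT(*) AS n_stays
                FROM icustay_detail            -- illustrative table name
                GROUP BY subject_id
                ORDER BY n_stays DESC
                LIMIT 10
                """
            )
            for subject_id, n_stays in cur.fetchall():
                print(subject_id, n_stays)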

  20. Continued benefits of a technical assistance web site to local tobacco control coalitions during a state budget shortfall.

    PubMed

    Buller, David B; Young, Walter F; Bettinghaus, Erwin P; Borland, Ron; Walther, Joseph B; Helme, Donald; Andersen, Peter A; Cutter, Gary R; Maloy, Julie A

    2011-01-01

    A state budget shortfall defunded 10 local tobacco coalitions during a randomized trial but defunded coalitions continued to have access to 2 technical assistance Web sites. To test the ability of Web-based technology to provide technical assistance to local tobacco control coalitions. Randomized 2-group trial with local tobacco control coalitions as the unit of randomization. Local communities (ie, counties) within the State of Colorado. Leaders and members in 34 local tobacco control coalitions funded by the state health department in Colorado. Two technical assistance Web sites: A Basic Web site with text-based information and a multimedia Enhanced Web site containing learning modules, resources, and communication features. Use of the Web sites in minutes, pages, and session and evaluations of coalition functioning on coalition development, conflict resolution, leadership satisfaction, decision-making satisfaction, shared mission, personal involvement, and organization involvement in survey of leaders and members. Coalitions that were defunded but had access to the multimedia Enhanced Web site during the Fully Funded period and after defunding continued to use it (treatment group × funding status × period, F(3,714) = 3.18, P = .0234). Coalitions with access to the Basic Web site had low Web site use throughout and use by defunded coalitions was nearly zero when funding ceased. Members in defunded Basic Web site coalitions reported that their coalitions functioned worse than defunded Enhanced Web site coalitions (coalition development: group × status, F(1,360) = 4.81, P = .029; conflict resolution: group × status, F(1,306) = 5.69, P = .018; leadership satisfaction: group × status, F(1,342) = 5.69, P = .023). The Enhanced Web site may have had a protective effect on defunded coalitions. Defunded coalitions may have increased their capacity by using the Enhanced Web site when fully funded or by continuing to use the available online resources after defunding. Web-based technical assistance with online training and resources may be a good investment when future funding is not ensured.

  1. 75 FR 6063 - Availability of NRC Open Government Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-05

    ... NUCLEAR REGULATORY COMMISSION [NRC-2010-0003] Availability of NRC Open Government Web Site AGENCY: Nuclear Regulatory Commission. ACTION: Notice of Availability of Open Government Web site for Online... Register notice, informs the public that the Nuclear Regulatory Commission's (NRC) Open Government Web site...

  2. Hot Spots on the Web for Teacher Librarians: A Selection of Recommended Web Sites for TLs To Visit.

    ERIC Educational Resources Information Center

    1996

    Six papers review and recommend sites on the Web as resources for teacher librarians include: "Just Do It: A Guide to Getting Out There and Doing It Yourself" (Catherine Ryan); "A Selection of Recommended Web Sites for TLs To Visit" (Karen Bonanno); "A Selection of Recommended Web Sites for TLs To Visit" (Sandra…

  3. The Atlas of Chinese World Wide Web Ecosystem Shaped by the Collective Attention Flows.

    PubMed

    Lou, Xiaodan; Li, Yong; Gu, Weiwei; Zhang, Jiang

    2016-01-01

    The web can be regarded as an ecosystem of digital resources connected and shaped by the collective successive behaviors of users. Knowing how people allocate limited attention across different resources is of great importance. To answer this, we embed the most popular Chinese web sites into a high-dimensional Euclidean space based on the open flow network model of a large number of Chinese users' collective attention flows, which considers both the connection topology of hyperlinks between the sites and the collective behaviors of the users. With these tools, we rank the web sites and compare their centralities based on flow distances with other metrics. We also study the patterns of attention flow allocation, and find that a large number of web sites concentrate in the central area of the embedding space, and only a small fraction of web sites disperse in the periphery. The entire embedding space can be separated into 3 regions (core, interim, and periphery). The sites in the core (1%) occupy a majority of the attention flows (40%), the sites in the interim (34%) attract 40%, whereas the other sites (65%) take only 20% of the flows. Furthermore, we clustered the web sites into 4 groups according to their positions in the space, and found that web sites with similar contents and topics are grouped together. In short, by incorporating the open flow network model, we can clearly see how collective attention is allocated and flows across different web sites, and how web sites connect to each other.
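
    As a simplified illustration of the kind of structure analyzed above (not the authors' open flow network model), the following sketch, assuming networkx is installed and using invented clickstream counts, builds a weighted directed graph of user transitions between sites and ranks sites by the attention flowing into them.

        # Simplified illustration: aggregate clickstream transitions into a
        # weighted directed graph and rank sites by incoming attention flow.
        # Site names and counts are invented.
        import networkx as nx

        # (from_site, to_site, number_of_users_who_made_the_jump)
        transitions = [
            ("portal.example", "news.example", 500),
            ("portal.example", "video.example", 300),
            ("news.example", "video.example", 120),
            ("video.example", "shop.example", 40),
        ]

        G = nx.DiGraph()
        for src, dst, users in transitions:
            G.add_edge(src, dst, weight=users)

        inflow = {node: sum(d["weight"] for _, _, d in G.in_edges(node, data=True))
                  for node in G.nodes}
        for site, flow in sorted(inflow.items(), key=lambda kv: -kv[1]):
            print(f"{site:15s} {flow}")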

  4. Characterizing Aerosols over Southeast Asia using the AERONET Data Synergy Tool

    NASA Technical Reports Server (NTRS)

    Giles, David M.; Holben, Brent N.; Eck, Thomas F.; Slutsker, Ilya; Welton, Ellsworth J.; Chin, Mian; Kucsera, Thomas; Schmaltz, Jeffery E.; Diehl, Thomas; et al.

    2007-01-01

    Biomass burning, urban pollution and dust aerosols have significant impacts on the radiative forcing of the atmosphere over Asia. In order to better quantify these aerosol characteristics, the Aerosol Robotic Network (AERONET) has established over 200 sites worldwide with an emphasis in recent years on the Asian continent - specifically Southeast Asia. A total of approximately 15 AERONET sun photometer instruments have been deployed to China, India, Pakistan, Thailand, and Vietnam. Sun photometer spectral aerosol optical depth measurements as well as microphysical and optical aerosol retrievals over Southeast Asia will be analyzed and discussed with supporting ground-based instrument, satellite, and model data sets, which are freely available via the AERONET Data Synergy tool at the AERONET web site (http://aeronet.gsfc.nasa.gov). This web-based data tool provides access to ground-based (AERONET and MPLNET), satellite (MODIS, SeaWiFS, TOMS, and OMI) and model (GOCART and back trajectory analyses) databases via one web portal. Future development of the AERONET Data Synergy Tool will include the expansion of current data sets as well as the implementation of other Earth Science data sets pertinent to advancing aerosol research.

  5. Information Security Controls against Cross-Site Request Forgery Attacks on Software Applications of Automated Systems

    NASA Astrophysics Data System (ADS)

    Barabanov, A. V.; Markov, A. S.; Tsirlov, V. L.

    2018-05-01

    This paper presents statistical results and their consolidation from a study of the security of various web applications against cross-site request forgery attacks. Some of the results were obtained in a study carried out within the framework of certification for compliance with information security requirements. The paper provides the results of consolidating information about the attack and the protection measures currently used by developers of web applications. It presents results of the study that demonstrate various distributions: the distribution of identified vulnerabilities by developer type (Russian and foreign), the distribution of the security measures used in web applications, the distribution of the identified vulnerabilities by programming language, and data on the number of security measures used in the studied web applications. The results of the study show that in most cases the developers of web applications do not pay due attention to protection against cross-site request forgery attacks. The authors give recommendations to developers that are planning to undergo a certification process for their software applications.
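
    One widely used protection measure against cross-site request forgery is the synchronizer token pattern. The Flask sketch below is a minimal illustration of that pattern, not code from the study; production applications would normally rely on a maintained extension such as Flask-WTF rather than hand-rolled checks.

        # Minimal synchronizer-token sketch: issue a per-session token with the
        # form and reject POSTs whose token does not match. Illustrative only.
        import secrets
        from flask import Flask, session, request, abort

        app = Flask(__name__)
        app.secret_key = "change-me"  # placeholder secret

        @app.route("/form")
        def form():
            token = secrets.token_urlsafe(32)
            session["csrf_token"] = token
            return (f'<form method="post" action="/submit">'
                    f'<input type="hidden" name="csrf_token" value="{token}">'
                    f'<button>Send</button></form>')

        @app.route("/submit", methods=["POST"])
        def submit():
            # A forged cross-site request cannot know the token stored in the session.
            if request.form.get("csrf_token") != session.get("csrf_token"):
                abort(403)
            return "ok"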

  6. Web-based Health Educational Program in Saudi Arabia.

    PubMed

    Bahkali, Salwa; Almaiman, Ahmad; Alsaleh, Mahassen; Elmetwally, Ashraf; Househ, Mowafa

    2014-01-01

    The purpose of this exploratory study is to provide an overview of a web-based health educational site created by the King Faisal Specialist Hospital and Research Center (KFSH&RC) in the Kingdom of Saudi Arabia (KSA). Sources of data included two interviews with Saudi IT personnel, three health educators, and two medical consultants working at KFSH&RC. The interviews ranged between 45 minutes and 120 minutes. The KFSH&RC website was also searched for the type of health information content posted. Results show that the KFSH&RC web-based health educational site provides health information through a medical encyclopedia, a social networking platform, health educational links, and targeted health information for children, which includes tools such as games and coloring books. Further research is needed on the effectiveness of the KFSH&RC web-based health education site in terms of improving knowledge and changing behavior of Saudi patients. The study recommends that targeted web-based health education strategies should be developed to reach large rural populations which have inadequate computer skills and limited access to the internet.

  7. Enhancing UCSF Chimera through web services

    PubMed Central

    Huang, Conrad C.; Meng, Elaine C.; Morris, John H.; Pettersen, Eric F.; Ferrin, Thomas E.

    2014-01-01

    Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. PMID:24861624
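
    The job submission, monitoring and retrieval cycle that Chimera streamlines can be sketched generically. The endpoints and JSON fields below are hypothetical and do not represent the Opal toolkit's actual interface; the sketch only illustrates the submit-poll-retrieve pattern.

        # Generic submit/poll/retrieve pattern for a remote job service.
        # The base URL, payload fields and status values are hypothetical.
        import time
        import requests

        BASE = "http://example.org/jobs"   # hypothetical job service

        def run_job(payload, poll_seconds=10):
            job = requests.post(BASE, json=payload, timeout=30).json()
            job_url = f"{BASE}/{job['id']}"
            while True:
                status = requests.get(job_url, timeout=30).json()["status"]
                if status in ("finished", "failed"):
                    break
                time.sleep(poll_seconds)
            return requests.get(f"{job_url}/output", timeout=30).content

        if __name__ == "__main__":
            result = run_job({"tool": "blast", "sequence": "MKTAYIAKQR"})
            print(len(result), "bytes of output retrieved")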

  8. Web-based home telemedicine system for orthopedics

    NASA Astrophysics Data System (ADS)

    Lau, Christopher; Churchill, Sean; Kim, Janice; Matsen, Frederick A., III; Kim, Yongmin

    2001-05-01

    Traditionally, telemedicine systems have been designed to improve access to care by allowing physicians to consult a specialist about a case without sending the patient to another location, which may be difficult or time-consuming to reach. The cost of the equipment and network bandwidth needed for this consultation has restricted telemedicine use to contact between physicians instead of between patients and physicians. Recently, however, the wide availability of Internet connectivity and client and server software for e- mail, world wide web, and conferencing has made low-cost telemedicine applications feasible. In this work, we present a web-based system for asynchronous multimedia messaging between shoulder replacement surgery patients at home and their surgeons. A web browser plug-in was developed to simplify the process of capturing video and transferring it to a web site. The video capture plug-in can be used as a template to construct a plug-in that captures and transfers any type of data to a web server. For example, readings from home biosensor instruments (e.g., blood glucose meters and spirometers) that can be connected to a computing platform can be transferred to a home telemedicine web site. Both patients and doctors can access this web site to monitor progress longitudinally. The system has been tested with 3 subjects for the past 7 weeks, and we plan to continue testing in the foreseeable future.

  9. Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs

    NASA Astrophysics Data System (ADS)

    O'Connor, Rory V.

    This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it looks at development processes used by SMEs in order to assess to what degree usability is integrated into the process. This study seeks to gain an understanding into the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications through a series of case studies. With the focus on SMEs that develop web applications as Management Information Systems and not E-Commerce sites, informational sites, online communities or web portals. This study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.

  10. Image

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marsh, Amber; Harsch, Tim; Pitt, Julie

    2007-08-31

    The computer side of the IMAGE project consists of a collection of Perl scripts that perform a variety of tasks; scripts are available to insert, update and delete data from the underlying Oracle database, download data from NCBI's Genbank and other sources, and generate data files for download by interested parties. Web scripts make up the tracking interface, and various tools available on the project web-site (image.llnl.gov) that provide a search interface to the database.

  11. Online Tools for Astronomy and Cosmochemistry

    NASA Technical Reports Server (NTRS)

    Meyer, B. S.

    2005-01-01

    Over the past year, the Webnucleo Group at Clemson University has been developing a web site with a number of interactive online tools for astronomy and cosmochemistry applications. The site uses SHP (Simplified Hypertext Preprocessor), which, because of its flexibility, allows us to embed almost any computer language into our web pages. For a description of SHP, please see http://www.joeldenny.com/. At our web site, an internet user may mine large and complex data sets, such as our stellar evolution models, and make graphs or tables of the results. The user may also run some of our detailed nuclear physics and astrophysics codes, such as our nuclear statistical equilibrium code, which is written in Fortran and C. Again, the user may make graphs and tables and download the results.

  12. A Web-based approach to blood donor preparation.

    PubMed

    France, Christopher R; France, Janis L; Kowalsky, Jennifer M; Copley, Diane M; Lewis, Kristin N; Ellis, Gary D; McGlone, Sarah T; Sinclair, Kadian S

    2013-02-01

    Written and video approaches to donor education have been shown to enhance donation attitudes and intentions to give blood, particularly when the information provides specific coping suggestions for donation-related concerns. This study extends this work by comparing Web-based approaches to donor preparation among donors and nondonors. Young adults (62% female; mean [±SD] age, 19.3 [±1.5] years; mean [range] number of prior blood donations, 1.1 [0-26]; 60% nondonors) were randomly assigned to view 1) a study Web site designed to address common blood donor concerns and suggest specific coping strategies (n = 238), 2) a standard blood center Web site (n = 233), or 3) a control Web site where participants viewed videos of their choice (n = 202). Measures of donation attitude, anxiety, confidence, intention, anticipated regret, and moral norm were completed before and after the intervention. Among nondonors, the study Web site produced greater changes in donation attitude, confidence, intention, and anticipated regret relative to both the standard and the control Web sites, but only differed significantly from the control Web site for moral norm and anxiety. Among donors, the study Web site produced greater changes in donation confidence and anticipated regret relative to both the standard and the control Web sites, but only differed significantly from the control Web site for donation attitude, anxiety, intention, and moral norm. Web-based donor preparation materials may provide a cost-effective way to enhance donation intentions and encourage donation behavior. © 2012 American Association of Blood Banks.

  13. Web Analytics: A Picture of the Academic Library Web Site User

    ERIC Educational Resources Information Center

    Black, Elizabeth L.

    2009-01-01

    This article describes the usefulness of Web analytics for understanding the users of an academic library Web site. Using a case study, the analysis describes how Web analytics can answer questions about Web site user behavior, including when visitors come, the duration of the visit, how they get there, the technology they use, and the most…
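
    As a generic, hypothetical illustration of the kinds of questions the article says web analytics can answer (when visitors come and what technology they use), the sketch below tallies visit hours and browsers from a web server access log; it is not the analytics tool used in the case study.

```python
# Toy illustration only: tally visit hours and browser families from access-log
# lines in Combined Log Format. Not the case study's web analytics package.
import re
from collections import Counter

LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<req>[^"]*)" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def summarize(log_lines):
    hours, browsers = Counter(), Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        hours[m.group("ts").split(":")[1]] += 1  # hour of day of the request
        browsers["Firefox" if "Firefox" in m.group("ua") else "other"] += 1
    return hours, browsers

if __name__ == "__main__":
    sample = ['127.0.0.1 - - [10/Oct/2009:13:55:36 -0700] "GET / HTTP/1.1" 200 2326 '
              '"-" "Mozilla/5.0 Firefox/3.5"']
    print(summarize(sample))
```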

  14. Using Empirical Data to Clarify the Meaning of Various Prescriptions for Designing a Web-Based Course

    ERIC Educational Resources Information Center

    Boulet, Marie-Michele

    2004-01-01

    Design prescriptions to create web-based courses and sites that are dynamic, easy-to-use, interactive and data-driven, emerge from a "how to do it" approach. Unfortunately, the theory behind these methods, prescriptions, procedures or tools, is rarely provided and the important terms, such as "easy-to-use", to which these…

  15. Integrating Radar Image Data with Google Maps

    NASA Technical Reports Server (NTRS)

    Chapman, Bruce D.; Gibas, Sarah

    2010-01-01

    A public Web site has been developed as a method for displaying the multitude of radar imagery collected by NASA's Airborne Synthetic Aperture Radar (AIRSAR) instrument during its 16-year mission. Utilizing NASA's internal AIRSAR site, the new Web site features more sophisticated visualization tools that enable the general public to have access to these images. The site was originally maintained at NASA on six computers: one that held the Oracle database, two that took care of the software for the interactive map, and three that were for the Web site itself. Several tasks were involved in moving this complicated setup to just one computer. First, the AIRSAR database was migrated from Oracle to MySQL. Then the back end of the AIRSAR Web site was updated in order to access the MySQL database. To do this, a few of the scripts needed to be modified; specifically, three Perl scripts that query the database. The database connections were then updated from Oracle to MySQL, numerous syntax errors were corrected, and a query was implemented that replaced one of the stored Oracle procedures. Lastly, the interactive map was designed, implemented, and tested so that users could easily browse and access the radar imagery through the Google Maps interface.
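
    As an illustrative sketch of serving radar scenes to a map interface (not the AIRSAR site's actual code), the snippet below writes a KML GroundOverlay for one image footprint, the kind of file a Google Maps/Earth-style viewer can load; the image name, URL, and bounding box are made up.

```python
# Illustrative only: emit a KML GroundOverlay for one radar image footprint.
# The scene name, image URL, and coordinates are invented, not AIRSAR data.
from xml.sax.saxutils import escape

def ground_overlay_kml(name: str, image_url: str, north: float, south: float,
                       east: float, west: float) -> str:
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>{escape(name)}</name>
    <Icon><href>{escape(image_url)}</href></Icon>
    <LatLonBox>
      <north>{north}</north><south>{south}</south>
      <east>{east}</east><west>{west}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>"""

if __name__ == "__main__":
    print(ground_overlay_kml("example_scene", "https://example.org/scene.png",
                             34.10, 33.95, -118.20, -118.45))
```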

  16. Quality and accuracy of sexual health information web sites visited by young people.

    PubMed

    Buhi, Eric R; Daley, Ellen M; Oberne, Alison; Smith, Sarah A; Schneider, Tali; Fuhrmann, Hollie J

    2010-08-01

    We assessed online sexual health information quality and accuracy and the utility of web site quality indicators. In reviewing 177 sexual health web sites, we found below average quality but few inaccuracies. Web sites with the most technically complex information and/or controversial topics contained the most inaccuracies. We found no association between inaccurate information and web site quality. (c) 2010 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  17. Quality of prostate cancer screening information on the websites of nationally recognized cancer centers and health organizations.

    PubMed

    Manole, Bogdan-Alexandru; Wakefield, Daniel V; Dove, Austin P; Dulaney, Caleb R; Marcrom, Samuel R; Schwartz, David L; Farmer, Michael R

    2017-12-24

    The purpose of this study was to survey the accessibility and quality of prostate-specific antigen (PSA) screening information from National Cancer Institute (NCI) cancer center and public health organization Web sites. We surveyed the December 1, 2016, version of all 63 NCI-designated cancer center public Web sites and 5 major online clearinghouses from allied public/private organizations (cancer.gov, cancer.org, PCF.org, USPSTF.org, and CDC.gov). Web sites were analyzed according to a 50-item list of validated health care information quality measures. Web sites were graded by 2 blinded reviewers. Interrater agreement was confirmed by Cohen kappa coefficient. Ninety percent of Web sites addressed PSA screening. Cancer center sites covered 45% of topics surveyed, whereas organization Web sites addressed 70%. All organizational Web pages addressed the possibility of false-positive screening results; 41% of cancer center Web pages did not. Forty percent of cancer center Web pages also did not discuss next steps if a PSA test was positive. Only 6% of cancer center Web pages were rated by our reviewers as "superior" (eg, addressing >75% of the surveyed topics) versus 20% of organizational Web pages. Interrater agreement between our reviewers was high (kappa coefficient = 0.602). NCI-designated cancer center Web sites publish lower quality public information about PSA screening than sites run by major allied organizations. Nonetheless, information and communication deficiencies were observed across all surveyed sites. In an age of increasing patient consumerism, prospective prostate cancer patients would benefit from improved online PSA screening information from provider and advocacy organizations. Validated cancer patient Web educational standards remain an important, understudied priority. Copyright © 2018. Published by Elsevier Inc.

  18. Service-oriented workflow to efficiently and automatically fulfill products in a highly individualized web and mobile environment

    NASA Astrophysics Data System (ADS)

    Qiao, Mu

    2015-03-01

    Service Oriented Architecture (SOA) is widely used in building flexible and scalable web sites and services. In most of the web and mobile photo book and gifting business space, the products ordered are highly variable, without a standard template into which one can substitute text or images, unlike commercial variable data printing. In this paper, the author describes an SOA workflow in a multi-site, multi-product-line fulfillment system where three major challenges are addressed: utilization of hardware and equipment, a high degree of automation with fault recovery, and scalability and flexibility under order volume fluctuation.

  19. Outcomes and Utilization of a Low Intensity Workplace Weight Loss Program

    PubMed Central

    Carpenter, Kelly M.; Lovejoy, Jennifer C.; Lange, Jane M.; Hapgood, Jenny E.; Zbikowski, Susan M.

    2014-01-01

    Obesity is related to high health care costs and lost productivity in the workplace. Employers are increasingly sponsoring weight loss and wellness programs to ameliorate these costs. We evaluated weight loss outcomes, treatment utilization, and health behavior change in a low intensity phone- and web-based, employer-sponsored weight loss program. The intervention included three proactive counseling phone calls with a registered dietician and a behavioral health coach as well as a comprehensive website. At six months, one third of those who responded to the follow-up survey had lost a clinically significant amount of weight (≥5% of body weight). Clinically significant weight loss was predicted by the use of both the counseling calls and the website. When examining specific features of the web site, the weight tracking tool was the most predictive of weight loss. Health behavior changes such as eating more fruits and vegetables, increasing physical activity, and reducing stress were all predictive of clinically significant weight loss. Although limited by the low follow-up rate, this evaluation suggests that even low intensity weight loss programs can lead to clinical weight loss for a significant number of participants. PMID:24688791

  20. Designing and Managing Your Digital Library.

    ERIC Educational Resources Information Center

    Guenther, Kim

    2000-01-01

    Discusses digital libraries and Web site design issues. Highlights include accessibility issues, including standards, markup languages like HTML and XML, and metadata; building virtual communities; the use of Web portals for customized delivery of information; quality assurance tools, including data mining; and determining user needs, including…

  1. The NeMO Explorer Web Site: Interactive Exploration of a Recent Submarine Eruption and Hydrothermal Vents, Axial Volcano, Juan de Fuca Ridge

    NASA Astrophysics Data System (ADS)

    Weiland, C.; Chadwick, W. W.; Embley, R. W.

    2001-12-01

    To help visualize the submarine volcanic landscape at NOAA's New Millennium Observatory (NeMO), we have created the NeMO Explorer web site: http://www.pmel.noaa.gov/vents/nemo/explorer.html. This web site takes visitors a mile down beneath the ocean surface to explore Axial Seamount, an active submarine volcano 300 miles off the Oregon coast. We use virtual reality to put visitors in a photorealistic 3-D model of the seafloor that lets them view hydrothermal vents and fresh lava flows as if they were really on the seafloor. At each of six virtual sites there is an animated tour and a 360° panorama in which users can view the volcanic landscape and see biological communities within a spatially accurate context. From the six sites there are hyperlinks to 50 video clips taken by a remotely operated vehicle. Each virtual site concentrates on a different topic, including the dynamics of the 1998 eruption at Axial volcano (Rumbleometer), high-temperature hydrothermal vents (CASM and ASHES), diffuse hydrothermal venting (Marker33), subsurface microbial blooms (The Pit), and the boundary between old and new lavas (Castle vent). In addition to exploring the region geographically, visitors can also explore the web site via geological concepts. The concepts gallery lets you quickly find information about mid-ocean ridges, hydrothermal vents, vent fauna, lava morphology, and more. Of particular interest is an animation of the January 1998 eruption, which shows the rapid inflation (by over 3 m) and draining of the sheet flow. For more info see Fox et al., Nature, v.412, p.727, 2001. This project was funded by NOAA's High Performance Computing and Communication (HPCC) and Vents Programs. Our goal is to present a representative portion of the vast collection of NOAA's multimedia imagery to the public in a way that is easy to use and understand. These data are particularly challenging to present because of their high data rates and low contextual information. The 3-D models create effective context and new video technology allows us to present good quality video at lower data rates. Related curriculum materials for middle- and high-school students are also available from the NeMO web site at http://www.pmel.noaa.gov/vents/nemo/education.html.

  2. MAGA, a new database of gas natural emissions: a collaborative web environment for collecting data.

    NASA Astrophysics Data System (ADS)

    Cardellini, Carlo; Chiodini, Giovanni; Frigeri, Alessandro; Bagnato, Emanuela; Frondini, Francesco; Aiuppa, Alessandro

    2014-05-01

    The data on volcanic and non-volcanic gas emissions available online are, as of today, incomplete and, most importantly, fragmentary. Hence, there is a need for common frameworks to aggregate available data in order to characterize and quantify the phenomena at various scales. A new and detailed web database (MAGA: MApping GAs emissions) has been developed, and recently improved, to collect data on carbon degassing from volcanic and non-volcanic environments. The MAGA database allows researchers to insert data interactively and dynamically into a spatially referenced relational database management system, as well as to extract data. MAGA kicked off with the database set-up and with the ingestion into the database of data from: i) a literature survey of publications on volcanic gas fluxes, including data on active crater degassing, diffuse soil degassing and fumaroles, both from dormant closed-conduit volcanoes (e.g., Vulcano, Phlegrean Fields, Santorini, Nisyros, Teide, etc.) and open-vent volcanoes (e.g., Etna, Stromboli, etc.) in the Mediterranean area and the Azores, and ii) the revision and update of the Googas database on non-volcanic emissions of the Italian territory (Chiodini et al., 2008), in the framework of the Deep Earth Carbon Degassing (DECADE) research initiative of the Deep Carbon Observatory (DCO). For each geo-located gas emission site, the database holds images and a description of the site and of the emission type (e.g., diffuse emission, plume, fumarole, etc.), gas chemical-isotopic composition (when available), gas temperature and gas flux magnitude. Gas sampling, analysis and flux measurement methods are also reported, together with references and contacts for researchers expert on each site. In this phase data can be accessed over the network from a web interface, and data-driven web services, through which software clients can request data directly from the database, are planned to be implemented shortly. This way Geographical Information Systems (GIS) and virtual globes (e.g., Google Earth) could easily access the database, and data could be exchanged with other databases. At the moment the database includes: i) more than 1000 flux data points on volcanic plume degassing from the Etna and Stromboli volcanoes, ii) data from ~30 sites of diffuse soil degassing from the Neapolitan volcanoes, Azores, Canary Islands, Etna, Stromboli, and Vulcano Island, plus several data on fumarolic emissions (~7 sites) with CO2 fluxes, and iii) data from ~270 non-volcanic gas emission sites in Italy. We believe the MAGA database is an important starting point for developing a large-scale, expandable database aimed to excite, inspire, and encourage participation among researchers. In addition, the possibility to archive location and qualitative information for gas emission sites not yet investigated could stimulate the scientific community toward future research and will provide an indication of the current uncertainty in global estimates of deep carbon fluxes.
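
    As a hedged sketch of the planned data-driven web service output (field names and the example record are hypothetical, not MAGA's actual schema), emission-site records could be serialized to GeoJSON for consumption by GIS clients or virtual globes:

```python
# Hedged sketch: serialize gas-emission site records to GeoJSON, the kind of
# output a GIS or virtual globe client could consume. Field names and the
# example record are hypothetical, not MAGA's actual schema.
import json

def sites_to_geojson(sites):
    features = [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [s["lon"], s["lat"]]},
            "properties": {
                "name": s["name"],
                "emission_type": s["emission_type"],
                "co2_flux_t_per_day": s.get("co2_flux_t_per_day"),
            },
        }
        for s in sites
    ]
    return json.dumps({"type": "FeatureCollection", "features": features}, indent=2)

if __name__ == "__main__":
    example = [{"name": "example fumarole field", "lat": 38.40, "lon": 14.96,
                "emission_type": "fumarole", "co2_flux_t_per_day": 12.5}]
    print(sites_to_geojson(example))
```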

  3. Tobacco-prevention messages online: social marketing via the Web.

    PubMed

    Lin, Carolyn A; Hullman, Gwen A

    2005-01-01

    Antitobacco groups have joined millions of other commercial or noncommercial entities in developing a presence on the Web. These groups primarily represent the following different sponsorship categories: grassroots, medical, government, and corporate. To obtain a better understanding of the strengths and weaknesses in the message design of antitobacco Web sites, this project analyzed 100 antitobacco Web sites ranging across these four sponsorship categories. The results show that the tobacco industry sites posted just enough antismoking information to appease the antismoking publics. Medical organizations designed their Web sites as specialty sites and offered mostly scientific information. While the government sites resembled a clearinghouse for antitobacco related information, the grassroots sites represented the true advocacy outlets. In general, the industry sites provided the weakest persuasive messages and medical sites fared only slightly better. Government and grassroots sites rated most highly in presenting their antitobacco campaign messages on the Web.

  4. 78 FR 54241 - Proposed Information Collection; Comment Request; BroadbandMatch Web Site Tool

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-03

    ... Information Collection; Comment Request; BroadbandMatch Web Site Tool AGENCY: National Telecommunications and... goal of increased broadband deployment and use in the United States. The BroadbandMatch Web site began... empowering technology effectively. II. Method of Collection BroadbandMatch users access the Web site through...

  5. 75 FR 75170 - APHIS User Fee Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-02

    ...] APHIS User Fee Web Site AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice... recover the costs of providing certain services. This notice announces the availability of a Web site that contains information about the Agency's user fees. ADDRESSES: The Agency's user fee Web site is located at...

  6. 78 FR 76187 - 30-Day Notice of Proposed Information Collection: Exchange Programs Alumni Web Site Registration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-16

    ...: Exchange Programs Alumni Web Site Registration ACTION: Notice of request for public comment and submission... Information Collection: Exchange Programs Alumni Web site Registration. OMB Control Number: 1405-0192. Type of... proposed collection: The International Exchange Alumni Web site requires information to process users...

  7. 12 CFR 555.310 - How do I notify OTS?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Describe the transactional web site. (2) Indicate the date the transactional web site will become operational. (3) List a contact familiar with the deployment, operation, and security of the transactional web site. (b) Transition provision. If you established a transactional web site after the date of your last...

  8. 7 CFR 2902.6 - Providing product information to Federal agencies.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Web site. An informational USDA Web site implementing section 9002 can be found at: http://www.biobased.oce.usda.gov. USDA will maintain a voluntary Web-based information site for manufacturers and... information. This Web site will provide information as to the availability, relative price, biobased content...

  9. 49 CFR 604.16 - Duties for recipients with respect to charter registration Web site.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... registration Web site. 604.16 Section 604.16 Transportation Other Regulations Relating to Transportation... Qualified Human Service Organizations and Duties for Recipients With Respect to Charter Registration Web site § 604.16 Duties for recipients with respect to charter registration Web site. Each recipient shall...

  10. 7 CFR 3201.6 - Providing product information to Federal agencies.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Informational Web site. An informational USDA Web site implementing section 9002 can be found at: http://www.biopreferred.gov. USDA will maintain a voluntary Web-based information site for manufacturers and vendors of... Web site will provide information as to the availability, relative price, biobased content...

  11. 7 CFR 3201.6 - Providing product information to Federal agencies.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Informational Web site. An informational USDA Web site implementing section 9002 can be found at: http://www.biopreferred.gov. USDA will maintain a voluntary Web-based information site for manufacturers and vendors of... Web site will provide information as to the availability, relative price, biobased content...

  12. A Design Analysis Model for Developing World Wide Web Sites.

    ERIC Educational Resources Information Center

    Ma, Yan

    2002-01-01

    Examines the relationship between and among designers, text, and users of the Galter Health Sciences Library Web site at Northwestern University by applying reader-response criticism. Highlights include Web site design; comparison of designers' intentions with the actual organization of knowledge on the Web site; and compares designer's intentions…

  13. 49 CFR 604.16 - Duties for recipients with respect to charter registration Web site.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... registration Web site. 604.16 Section 604.16 Transportation Other Regulations Relating to Transportation... Qualified Human Service Organizations and Duties for Recipients With Respect to Charter Registration Web site § 604.16 Duties for recipients with respect to charter registration Web site. Each recipient shall...

  14. 49 CFR 604.16 - Duties for recipients with respect to charter registration Web site.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... registration Web site. 604.16 Section 604.16 Transportation Other Regulations Relating to Transportation... Qualified Human Service Organizations and Duties for Recipients With Respect to Charter Registration Web site § 604.16 Duties for recipients with respect to charter registration Web site. Each recipient shall...

  15. 12 CFR 555.310 - How do I notify OTS?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Describe the transactional web site. (2) Indicate the date the transactional web site will become operational. (3) List a contact familiar with the deployment, operation, and security of the transactional web site. (b) Transition provision. If you established a transactional web site after the date of your last...

  16. 12 CFR 555.310 - How do I notify OTS?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Describe the transactional web site. (2) Indicate the date the transactional web site will become operational. (3) List a contact familiar with the deployment, operation, and security of the transactional web site. (b) Transition provision. If you established a transactional web site after the date of your last...

  17. 49 CFR 604.16 - Duties for recipients with respect to charter registration Web site.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... registration Web site. 604.16 Section 604.16 Transportation Other Regulations Relating to Transportation... Qualified Human Service Organizations and Duties for Recipients With Respect to Charter Registration Web site § 604.16 Duties for recipients with respect to charter registration Web site. Each recipient shall...

  18. 49 CFR 604.16 - Duties for recipients with respect to charter registration Web site.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... registration Web site. 604.16 Section 604.16 Transportation Other Regulations Relating to Transportation... Qualified Human Service Organizations and Duties for Recipients With Respect to Charter Registration Web site § 604.16 Duties for recipients with respect to charter registration Web site. Each recipient shall...

  19. 12 CFR 555.310 - How do I notify OTS?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Describe the transactional web site. (2) Indicate the date the transactional web site will become operational. (3) List a contact familiar with the deployment, operation, and security of the transactional web site. (b) Transition provision. If you established a transactional web site after the date of your last...

  20. Beta-test Results for an HPV Information Web site: GoHealthyGirls.org – Increasing HPV Vaccine Uptake in the United States

    PubMed Central

    Nodulman, Jessica A.; Kong, Alberta S.; Wheeler, Cosette M.; Buller, David B.; Woodall, W. Gill

    2014-01-01

    A web site, GoHealthyGirls, was developed to educate and inform parents and their adolescent daughters about human papillomavirus (HPV) and HPV vaccines. This article provides an overview of web site development and content, followed by the results of a beta-test of the web site. Sixty-three New Mexican parents of adolescent girls tested the site. Results indicated that GoHealthyGirls was a functioning and appealing web site. Findings from this brief educational intervention suggest that the web site has the potential to increase HPV vaccine uptake. This research supports the Internet as a valuable channel to disseminate health education and information to diverse populations. PMID:25221442

  1. Corporate Web Sites in Traditional Print Advertisements.

    ERIC Educational Resources Information Center

    Pardun, Carol J.; Lamb, Larry

    1999-01-01

    Describes the Web presence in print advertisements to determine how marketers are creating bridges between traditional advertising and the Internet. Content analysis showed Web addresses in print ads; categories of advertisers most likely to link print ads with Web sites; and whether the Web site attempts to develop a database of potential…

  2. Documenting pharmacist interventions on an intranet.

    PubMed

    Simonian, Armen I

    2003-01-15

    The process of developing and implementing an intranet Web site for clinical intervention documentation is described. An inpatient pharmacy department initiated an organizationwide effort to improve documentation of interventions by pharmacists at its seven hospitals to achieve real-time capture of meaningful benchmarking data. Standardization of intervention types would allow the health system to contrast and compare medication use, process improvement, and patient care initiatives among its hospitals. After completing a needs assessment and reviewing current methodologies, a computerized tracking tool was developed in-house and integrated with the organization's intranet. Representatives from all hospitals agreed on content and functionality requirements for the Web site. The site was completed and activated in February 2002. Before this Web site was established, the most documented intervention types were Renal Adjustment and Clarify Dose, with a daily average of four and three, respectively. After site activation, daily averages for Renal Adjustment remained unchanged, but Clarify Dose is now documented nine times per day. Drug Information and i.v.-to-p.o. intervention types, which previously averaged less than one intervention per day, are now documented an average of four times daily. Approximately 91% of staff pharmacists are using this site. Future plans for this site include enhanced accessibility to the site with wireless personal digital assistants. The design and implementation of an intranet Web site to document pharmacists' interventions doubled the rate of intervention documentation and standardized the intervention types among hospitals in the health system.

  3. GenProBiS: web server for mapping of sequence variants to protein binding sites.

    PubMed

    Konc, Janez; Skrlj, Blaz; Erzen, Nika; Kunej, Tanja; Janezic, Dusanka

    2017-07-03

    Discovery of potentially deleterious sequence variants is important and has wide implications for research and generation of new hypotheses in human and veterinary medicine, and drug discovery. The GenProBiS web server maps sequence variants to protein structures from the Protein Data Bank (PDB), and further to protein-protein, protein-nucleic acid, protein-compound, and protein-metal ion binding sites. The concept of a protein-compound binding site is understood in the broadest sense, which includes glycosylation and other post-translational modification sites. Binding sites were defined by local structural comparisons of whole protein structures using the Protein Binding Sites (ProBiS) algorithm and transposition of ligands from the similar binding sites found to the query protein using the ProBiS-ligands approach with new improvements introduced in GenProBiS. Binding site surfaces were generated as three-dimensional grids encompassing the space occupied by predicted ligands. The server allows intuitive visual exploration of comprehensively mapped variants, such as human somatic mis-sense mutations related to cancer and non-synonymous single nucleotide polymorphisms from 21 species, within the predicted binding sites regions for about 80 000 PDB protein structures using fast WebGL graphics. The GenProBiS web server is open and free to all users at http://genprobis.insilab.org. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. Web Proxy Auto Discovery for the WLCG

    NASA Astrophysics Data System (ADS)

    Dykstra, D.; Blomer, J.; Blumenfeld, B.; De Salvo, A.; Dewhurst, A.; Verguilov, V.

    2017-10-01

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home) which they direct to the nearest publicly accessible web proxy servers. The responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.
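
    As a hedged sketch of the client-side discovery step (a simplification: real clients evaluate the PAC file's FindProxyForURL JavaScript, and the URL below is a placeholder rather than the actual WLCG service), a job could fetch a PAC file from a WPAD-style URL and list the proxies it names:

```python
# Hedged sketch of WPAD-style discovery: fetch a PAC file from a placeholder
# URL and pull out the proxy endpoints it mentions. Real clients evaluate the
# PAC JavaScript (FindProxyForURL); here we only do a crude regex scan.
import re
import urllib.request

def discover_proxies(wpad_url: str = "http://wpad.example.org/wpad.dat"):
    with urllib.request.urlopen(wpad_url) as resp:
        pac_text = resp.read().decode("utf-8", errors="replace")
    # PAC files return strings such as "PROXY squid1.example.org:3128; DIRECT".
    return re.findall(r"PROXY\s+([\w.\-]+:\d+)", pac_text)

if __name__ == "__main__":
    try:
        print(discover_proxies())
    except OSError as err:
        print("WPAD lookup failed:", err)
```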

  5. Web Proxy Auto Discovery for the WLCG

    DOE PAGES

    Dykstra, D.; Blomer, J.; Blumenfeld, B.; ...

    2017-11-23

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home), which they direct to the nearest publicly accessible web proxy servers. Furthermore, the responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.

  6. Web Proxy Auto Discovery for the WLCG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dykstra, D.; Blomer, J.; Blumenfeld, B.

    All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home), which they direct to the nearest publicly accessible web proxy servers. Furthermore, the responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.

  7. Side by Side: What a Comparative Usability Study Told Us about a Web Site Redesign

    ERIC Educational Resources Information Center

    Dougan, Kirstin; Fulton, Camilla

    2009-01-01

    Library Web sites must compete against easy-to-use sites, such as Google Scholar, Google Books, and Wikipedia, for students' time and attention. Library Web sites must therefore be designed with aesthetics and user perceptions at the forefront. The Music and Performing Arts Library at Urbana-Champaign's Web site was overcrowded and in much need of…

  8. The use of the World Wide Web by medical journals in 2003 and 2005: an observational study.

    PubMed

    Schriger, David L; Ouk, Sripha; Altman, Douglas G

    2007-01-01

    The 2- to 6-page print journal article has been the standard for 200 years, yet this format severely limits the amount of detailed information that can be conveyed. The World Wide Web provides a low-cost option for posting extended text and supplementary information. It also can enhance the experience of journal editors, reviewers, readers, and authors through added functionality (eg, online submission and peer review, postpublication critique, and e-mail notification of tables of contents). Our aim was to characterize ways that journals were using the World Wide Web in 2005 and note changes since 2003. We analyzed the Web sites of 138 high-impact print journals in 3 ways. First, we compared the print and Web versions of March 2003 and 2005 issues of 28 journals (20 of which were randomly selected from the 138) to determine how often articles were published Web only and how often print articles were augmented by Web-only supplements. Second, we examined what functions were offered by each journal Web site. Third, for journals that offered Web pages for reader commentary about each article, we analyzed the number of comments and characterized these comments. Fifty-six articles (7%) in 5 journals were Web only. Thirteen of the 28 journals had no supplementary online content. By 2005, several journals were including Web-only supplements in >20% of their papers. Supplementary methods, tables, and figures predominated. The use of supplementary material increased from 2% to 7% (5 percentage points) in the 20-journal random sample between 2003 and 2005. Web sites had similar functionality with an emphasis on linking each article to related material and e-mailing readers about activity related to each article. There was little evidence of journals using the Web to provide readers an interactive experience with the data or with each other. Seventeen of the 138 journals offered rapid-response pages. Only 18% of eligible articles had any comments after 5 months. Journal Web sites offer similar functionality. The use of online-only articles and online-only supplements is increasing.

  9. Computer and internet use by ophthalmologists and trainees in an academic centre.

    PubMed

    Somal, Kirandeep; Lam, Wai-Ching; Tam, Eric

    2009-06-01

    The purpose of this study was to determine computer, internet, and department web site use by members of the Department of Ophthalmology and Vision Sciences at the University of Toronto in Toronto, Ont. Cross-sectional analysis. Eighty-eight members of the Department of Ophthalmology and Vision Sciences who responded to a survey. One hundred forty-eight department members (93 staff, 24 residents, and 31 fellows) were invited via e-mail to complete an online survey looking at computer and internet use. Participation was voluntary. Individuals who did not fill in an online response were sent a paper copy of the survey. No identifying fields were used in the data analysis. A response rate of 59% (88/148) was obtained. Fifty-nine percent of respondents described their computer skill as "good" or better; 86.4% utilized a computer in their clinical practice. Performance of computer-related tasks included accessing e-mail (98.9%), accessing medical literature (87.5%), conducting personal affairs (83%), and accessing conference/round schedules (65.9%). The survey indicated that 89.1% of respondents accessed peer-reviewed material online, including eMedicine (60.2%) and UpToDate articles (48.9%). Thirty-three percent of department members reported never having visited the department web site. Impediments to web site use included information not up to date (27.3%), information not of interest (22.1%), and difficulty locating the web site (20.8%). The majority of ophthalmologists and trainees in an academic centre utilize computer and internet resources for various tasks. A weak linear correlation was found between lower age of respondent and higher self-evaluated experience with computers (r = -0.43). Although use of the current department web site was low, respondents were interested in seeing improvements to the web site to increase its utility.

  10. Effect of transmission intensity on hotspots and micro-epidemiology of malaria in sub-Saharan Africa.

    PubMed

    Mogeni, Polycarp; Omedo, Irene; Nyundo, Christopher; Kamau, Alice; Noor, Abdisalan; Bejon, Philip

    2017-06-30

    Malaria transmission intensity is heterogeneous, complicating the implementation of malaria control interventions. We provide a description of the spatial micro-epidemiology of symptomatic malaria and asymptomatic parasitaemia in multiple sites. We assembled data from 19 studies conducted between 1996 and 2015 in seven countries of sub-Saharan Africa with homestead-level geospatial data. Data from each site were used to quantify spatial autocorrelation and examine the temporal stability of hotspots. Parameters from these analyses were examined to identify trends over varying transmission intensity. Significant hotspots of malaria transmission were observed in most years and sites. The risk ratios of malaria within hotspots were highest at low malaria positive fractions (MPFs) and decreased with increasing MPF (p < 0.001). However, statistical significance of hotspots was lowest at extremely low and extremely high MPFs, with a peak in statistical significance at an MPF of ~0.3. In four sites with longitudinal data we noted temporal instability and variable negative correlations between MPF and average age of symptomatic malaria across all sites, suggesting varying degrees of temporal stability. We observed geographical micro-variation in malaria transmission at sites with a variety of transmission intensities across sub-Saharan Africa. Hotspots are marked at lower transmission intensity, but it becomes difficult to show statistical significance when cases are sparse at very low transmission intensity. Given the predictability with which hotspots occur as transmission intensity falls, malaria control programmes should have a low threshold for responding to apparent clustering of cases.

  11. The Atlas of Chinese World Wide Web Ecosystem Shaped by the Collective Attention Flows

    PubMed Central

    Lou, Xiaodan; Li, Yong; Gu, Weiwei; Zhang, Jiang

    2016-01-01

    The web can be regarded as an ecosystem of digital resources connected and shaped by the collective successive behaviors of users. Knowing how people allocate limited attention to different resources is of great importance. To answer this, we embed the most popular Chinese web sites into a high-dimensional Euclidean space based on the open flow network model of a large number of Chinese users’ collective attention flows, which considers both the connection topology of hyperlinks between the sites and the collective behaviors of the users. With these tools, we rank the web sites and compare their centralities based on flow distances with other metrics. We also study the patterns of attention flow allocation, and find that a large number of web sites concentrate in the central area of the embedding space, and only a small fraction of web sites disperse in the periphery. The entire embedding space can be separated into 3 regions (core, interim, and periphery). The sites in the core (1%) occupy a majority of the attention flows (40%), the sites in the interim (34%) attract 40%, whereas the other sites (65%) take only 20% of the flows. What's more, we clustered the web sites into 4 groups according to their positions in the space, and found that web sites with similar content and topics are grouped together. In short, by incorporating the open flow network model, we can clearly see how collective attention is allocated and flows across different web sites, and how web sites connect to each other. PMID:27812133
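
    As a hedged illustration of the reported attention allocation (not the paper's flow-network embedding), the sketch below takes a list of per-site attention flows, synthetic here, and computes the share captured by a core (top 1% of sites), an interim group (next 34%), and the periphery (remaining 65%):

```python
# Hedged sketch: given per-site attention flows (synthetic, heavy-tailed toy
# data), compute the share of total flow captured by a core (top 1%), an
# interim group (next 34%), and the periphery (rest). Not the paper's method.
def flow_shares(flows, core_frac=0.01, interim_frac=0.34):
    ranked = sorted(flows, reverse=True)
    total = sum(ranked)
    n = len(ranked)
    n_core = max(1, int(n * core_frac))
    n_interim = int(n * interim_frac)
    core = sum(ranked[:n_core]) / total
    interim = sum(ranked[n_core:n_core + n_interim]) / total
    periphery = 1.0 - core - interim
    return core, interim, periphery

if __name__ == "__main__":
    import random
    random.seed(0)
    fake_flows = [random.paretovariate(1.2) for _ in range(1000)]  # toy data
    print([round(share, 2) for share in flow_shares(fake_flows)])
```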

  12. An Extraction Method of an Informative DOM Node from a Web Page by Using Layout Information

    NASA Astrophysics Data System (ADS)

    Tsuruta, Masanobu; Masuyama, Shigeru

    We propose a method for extracting the informative DOM node from a Web page as preprocessing for Web content mining. Our proposed method, LM, uses layout data of DOM nodes generated by a generic Web browser; the learning set consists of hundreds of Web pages and annotations of the informative DOM nodes of those Web pages. Our method does not require large-scale crawling of the whole Web site to which the target Web page belongs. We design LM so that it uses the information in the learning set more efficiently than the existing method that uses the same learning set. In experiments, we evaluate methods obtained by combining an informative-DOM-node extraction method (either the proposed method or an existing one) with the existing noise elimination methods: Heur, which removes advertisements and link lists by heuristics, and CE, which removes DOM nodes that also appear in other Web pages of the same Web site as the target Web page. Experimental results show that 1) LM outperforms the other methods for extracting the informative DOM node, and 2) the combination method (LM, {CE(10), Heur}) based on LM (precision: 0.755, recall: 0.826, F-measure: 0.746) outperforms the other combination methods.
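
    For illustration of the extraction task only, the sketch below scores candidate DOM nodes with a simple text-density heuristic (text length penalized by link text); this is not the proposed LM method, which uses browser layout data and a learning set, and it requires the third-party beautifulsoup4 package:

```python
# Hedged sketch of informative-node extraction via a text-density heuristic.
# NOT the paper's LM method (which uses browser layout data and learning);
# this only illustrates the task. Requires the beautifulsoup4 package.
from bs4 import BeautifulSoup

def informative_node(html: str):
    soup = BeautifulSoup(html, "html.parser")
    best, best_score = None, float("-inf")
    for node in soup.find_all(["div", "article", "section", "td"]):
        text_len = len(node.get_text(" ", strip=True))
        link_len = sum(len(a.get_text(strip=True)) for a in node.find_all("a"))
        score = text_len - 3 * link_len  # penalize link-heavy navigation blocks
        if score > best_score:
            best, best_score = node, score
    return best

if __name__ == "__main__":
    page = ("<html><body><div><a href='/'>Home</a><a href='/n'>News</a></div>"
            "<article>Main story text with enough words to win the score.</article>"
            "</body></html>")
    print(informative_node(page).name)  # -> 'article'
```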

  13. Framing medical tourism: an examination of appeal, risk, convalescence, accreditation, and interactivity in medical tourism web sites.

    PubMed

    Mason, Alicia; Wright, Kevin B

    2011-02-01

    This exploratory study analyzed the content of medical tourism Web sites in an attempt to examine how they convey information about benefits and risks of medical procedures, how they frame credibility, and the degree to which these Web sites include interactive features for consumers. Drawing upon framing theory, the researchers content analyzed a sample of 66 medical tourism Web sites throughout the world. The results indicated that medical tourism Web sites largely promote the benefits of medical procedures while downplaying the risks, and relatively little information regarding the credibility of these services appears. In addition, the presentation of benefits/risks, credibility, and Web site interactivity were found to differ by region and type of facility. The authors discuss the implications of these findings concerning the framing of medical tourism Web site content, future directions for research, and limitations.

  14. The INGV Real Time Strong Motion Database

    NASA Astrophysics Data System (ADS)

    Massa, Marco; D'Alema, Ezio; Mascandola, Claudia; Lovati, Sara; Scafidi, Davide; Gomez, Antonio; Carannante, Simona; Franceschina, Gianlorenzo; Mirenna, Santi; Augliera, Paolo

    2017-04-01

    INGV real-time strong-motion data sharing is assured by the INGV Strong Motion Database. ISMD (http://ismd.mi.ingv.it) was designed in the last months of 2011 in cooperation among different INGV departments, with the aim of organizing the distribution of INGV strong-motion data using standard procedures for data acquisition and processing. The first version of the web portal was published soon after the occurrence of the 2012 Emilia (Northern Italy), Mw 6.1, seismic sequence. At that time ISMD was the first European real-time web portal devoted to the engineering seismology community. After four years of successful operation, the thousands of accelerometric waveforms collected in the archive required a technological upgrade of the system in order to better organize the archiving of new data and to answer user requests more efficiently. ISMD 2.0 is based on PostgreSQL (www.postgresql.org), an open source object-relational database. The main purpose of the web portal is to distribute, a few minutes after the origin time, the accelerometric waveforms and related metadata of Italian earthquakes with ML≥3.0. Data are provided both in raw SAC (counts) and automatically corrected ASCII (gal) formats. For each event, the web portal also provides a detailed description of the ground motion parameters (i.e., Peak Ground Acceleration, Velocity and Displacement, Arias and Housner Intensities), data converted to velocity and displacement, response spectra up to 10.0 s, and general maps concerning the recent and historical seismicity of the area together with information about its seismic hazard. The focal parameters of the events are provided by the INGV National Earthquake Center (CNT, http://cnt.rm.ingv.it). Moreover, the database provides a detailed site characterization section for each strong-motion station, based on geological, geomorphological and geophysical information. At present (i.e., January 2017), ISMD includes 987 Italian earthquakes with ML≥3.0 (121,185 waveforms) recorded since 1 January 2012, as well as 204 accelerometric stations belonging to the INGV strong-motion network and regional partners.
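
    As a worked illustration of two of the ground-motion parameters listed above, the sketch below computes Peak Ground Acceleration and Arias intensity, I_a = (pi / (2 g)) * integral of a(t)^2 dt, from a synthetic acceleration record; real ISMD data arrive as SAC (counts) or corrected ASCII (gal) files.

```python
# Hedged illustration: compute PGA and Arias intensity from an acceleration
# time series. The record below is synthetic, not an ISMD waveform.
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def pga(acc_m_s2: np.ndarray) -> float:
    """Peak Ground Acceleration, in m/s^2."""
    return float(np.max(np.abs(acc_m_s2)))

def arias_intensity(acc_m_s2: np.ndarray, dt: float) -> float:
    """Arias intensity I_a = pi/(2g) * integral(a^2 dt), in m/s."""
    return float(np.pi / (2.0 * G) * np.sum(acc_m_s2 ** 2) * dt)

if __name__ == "__main__":
    dt = 0.01                                  # 100 samples per second
    t = np.arange(0.0, 20.0, dt)
    acc = 0.5 * np.exp(-0.2 * t) * np.sin(2 * np.pi * 2.0 * t)  # toy accelerogram, m/s^2
    print("PGA [m/s^2]:", round(pga(acc), 3))
    print("Arias intensity [m/s]:", round(arias_intensity(acc, dt), 4))
```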

  15. THE ECOTOX DATABASE AND ECOLOGICAL SOIL SCREENING LEVEL (ECO-SSL) WEB SITES

    EPA Science Inventory

    The EPA's ECOTOX database (http://www.epa.gov/ecotox/) provides a web browser search interface for locating aquatic and terrestrial toxic effects information. Data on more than 8100 chemicals and 5700 terrestrial and aquatic species are included in the database. Information is ...

  16. 5 CFR 2604.201 - Public reading room facility and Web site.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 3 2011-01-01 2011-01-01 false Public reading room facility and Web site... DISCLOSURE REPORTS FOIA Public Reading Room Facility and Web Site; Index Identifying Information for the Public § 2604.201 Public reading room facility and Web site. (a)(1) Location of public reading room...

  17. Analysis of Elementary School Web Sites

    ERIC Educational Resources Information Center

    Hartshorne, Richard; Friedman, Adam; Algozzine, Bob; Kaur, Daljit

    2008-01-01

    While researchers have studied the use and value of educational software for many years, study of school Web sites and/or their effectiveness is limited. In this investigation, we identified goals and functions of school Web sites and used the foundations of effective Web site design to develop an evaluation checklist. We then applied these…

  18. Academic Library Web Sites: Current Practice and Future Directions

    ERIC Educational Resources Information Center

    Detlor, Brian; Lewis, Vivian

    2006-01-01

    To address competitive threats, academic libraries are encouraged to build robust Web sites personalized to learning and research tasks. Through an evaluation of Association of Research Libraries (ARL)-member Web sites, we suggest how library Web sites should evolve and reflect upon the impacts such recommendations may have on academic libraries…

  19. 75 FR 22391 - Notice of Web Site Publication for the Climate Program Office

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-28

    ...-01] Notice of Web Site Publication for the Climate Program Office AGENCY: Climate Program Office (CPO... its Web site at http://www.climate.noaa.gov . FOR FURTHER INFORMATION CONTACT: Eric Locklear; Chief... information is available on the Climate Program Office Web site pertaining to the CPO's research strategies...

  20. 75 FR 66413 - 30-Day Notice of Proposed Information Collection: Exchange Programs Alumni Web Site Registration...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-28

    ...: Exchange Programs Alumni Web Site Registration, DS-7006 ACTION: Notice of request for public comment and... Collection The Exchange Programs Alumni Web site requires information to process users' voluntary requests for participation in the Web site. Other than contact information, which is required for website...

  1. 78 FR 66420 - Proposed Enhancements to the Motor Carrier Safety Measurement System (SMS) Public Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-05

    ...-0392] Proposed Enhancements to the Motor Carrier Safety Measurement System (SMS) Public Web Site AGENCY... on the Agency's Safety Measurement System (SMS) public Web site. FMCSA first announced the... public Web site that are the direct result of feedback from stakeholders regarding the information...

  2. 77 FR 38033 - Notice of Establishment of a Commodity Import Approval Process Web Site

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-26

    ... Process Web Site AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice. SUMMARY: We are announcing the creation of a new Plant Protection and Quarantine Web site that will provide stakeholders with... comment on draft risk assessments. This Web site will make the commodity import approval process more...

  3. 22 CFR 181.9 - Internet Web site publication.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 22 Foreign Relations 1 2012-04-01 2012-04-01 false Internet Web site publication. 181.9 Section... PUBLICATION OF INTERNATIONAL AGREEMENTS § 181.9 Internet Web site publication. The Office of the Assistant... responsible for making publicly available on the Internet Web site of the Department of State each treaty or...

  4. 22 CFR 181.9 - Internet Web site publication.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 22 Foreign Relations 1 2013-04-01 2013-04-01 false Internet Web site publication. 181.9 Section... PUBLICATION OF INTERNATIONAL AGREEMENTS § 181.9 Internet Web site publication. The Office of the Assistant... responsible for making publicly available on the Internet Web site of the Department of State each treaty or...

  5. 22 CFR 181.9 - Internet Web site publication.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 22 Foreign Relations 1 2014-04-01 2014-04-01 false Internet Web site publication. 181.9 Section... PUBLICATION OF INTERNATIONAL AGREEMENTS § 181.9 Internet Web site publication. The Office of the Assistant... responsible for making publicly available on the Internet Web site of the Department of State each treaty or...

  6. Characteristics of Food Industry Web Sites and "Advergames" Targeting Children

    ERIC Educational Resources Information Center

    Culp, Jennifer; Bell, Robert A.; Cassady, Diana

    2010-01-01

    Objective: To assess the content of food industry Web sites targeting children by describing strategies used to prolong their visits and foster brand loyalty; and to document health-promoting messages on these Web sites. Design: A content analysis was conducted of Web sites advertised on 2 children's networks, Cartoon Network and Nickelodeon. A…

  7. 5 CFR 2604.201 - Public reading room facility and Web site.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Public reading room facility and Web site... DISCLOSURE REPORTS FOIA Public Reading Room Facility and Web Site; Index Identifying Information for the Public § 2604.201 Public reading room facility and Web site. (a)(1) Location of public reading room...

  8. Library Web Sites in Pakistan: An Analysis of Content

    ERIC Educational Resources Information Center

    Qutab, Saima; Mahmood, Khalid

    2009-01-01

    Purpose: The purpose of this paper is to investigate library web sites in Pakistan, to analyse their content and navigational strengths and weaknesses and to give recommendations for developing better web sites and quality assessment studies. Design/methodology/approach: Survey of web sites of 52 academic, special, public and national libraries in…

  9. 22 CFR 181.9 - Internet Web site publication.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Internet Web site publication. 181.9 Section... PUBLICATION OF INTERNATIONAL AGREEMENTS § 181.9 Internet Web site publication. The Office of the Assistant... responsible for making publicly available on the Internet Web site of the Department of State each treaty or...

  10. 22 CFR 181.9 - Internet Web site publication.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Internet Web site publication. 181.9 Section... PUBLICATION OF INTERNATIONAL AGREEMENTS § 181.9 Internet Web site publication. The Office of the Assistant... responsible for making publicly available on the Internet Web site of the Department of State each treaty or...

  11. Formative Evaluation of a Family Life Education Web Site

    ERIC Educational Resources Information Center

    Steimle, Brynn M.; Duncan, Stephen F.

    2004-01-01

    Hundreds of family life education Web sites are available on the Internet, allowing individuals and families unprecedented access to family life education information. Evaluation is critical to ensuring the quality of and improving these Web sites; yet, few Web site evaluations have been conducted. We formatively evaluated a new family life…

  12. Ocean Drilling Program: Public Information: Promotional Materials

    Science.gov Websites

    Learning web site) "From Mountains to Monsoons" interactive CD-ROM and Teacher's Guide (August 1997; JOI Learning web site) "Blast from the Past" poster with classroom activities (August 1997; JOI Learning web site) Slides "The ODP in Film" DVD (JOI Learning web site) B-roll

  13. Library Web Site Administration: A Strategic Planning Model For the Smaller Academic Library

    ERIC Educational Resources Information Center

    Ryan, Susan M.

    2003-01-01

    Strategic planning provides a useful structure for creating and implementing library web sites. The planned integration of a library's web site into its mission and objectives ensures that the library's community of users will consider the web site one of the most important information tools the library offers.

  14. Information on infantile colic on the World Wide Web.

    PubMed

    Bailey, Shana D; D'Auria, Jennifer P; Haushalter, Jamie P

    2013-01-01

    The purpose of this study was to explore and describe the type and quality of information on infantile colic that a parent might access on the World Wide Web. Two checklists were used to evaluate the quality indicators of 24 Web sites and the colic-specific content. Fifteen health information Web sites met more of the quality parameters than the nine commercial sites. Eight Web sites included information about colic and infant abuse, with six being health information sites. The colic-specific content on 24 Web sites reflected current issues and controversies; however, the completeness of the information in light of current evidence varied among the Web sites. Strategies to avoid complications of parental stress or infant abuse were not commonly found on the Web sites. Pediatric professionals must guide parents to reliable colic resources that also include emotional support and understanding of infant crying. A best evidence guideline for the United States would eliminate confusion and uncertainty about which colic therapies are safe and effective for parents and professionals. Copyright © 2013 National Association of Pediatric Nurse Practitioners. Published by Mosby, Inc. All rights reserved.

  15. Systematic Review of Quality of Patient Information on Phalloplasty in the Internet.

    PubMed

    Karamitros, Georgios A; Kitsos, Nikolaos A; Sapountzis, Stamatis

    2017-12-01

    An increasing number of patients considering aesthetic surgery use Internet health information as their first source of information. However, the quality of information available on the Internet about phalloplasty is currently unknown. This study aimed to assess the quality of patient information on phalloplasty available on the Internet. The assessment of the Web sites was based on the modified Ensuring Quality Information for Patients (EQIP) instrument (36 items). Three hundred Web sites were identified by the most popular Web search engines. Ninety Web sites were assessed after duplicates, irrelevant sources, and Web sites in languages other than English were excluded. Only 16 (18%) Web sites addressed >21 items, and scores tended to be higher for Web sites developed by academic centers and the industry than for Web sites developed by private practicing surgeons. The EQIP score achieved by Web sites ranged between 4 and 29 of the total 36 points, with a median value of 17.5 points (interquartile range, 13-21). The top 5 Web sites with the highest scores were identified. The quality of patient information on phalloplasty on the Internet is substandard, and the existing Web sites present inadequate information. There is a dire need to improve the quality of Internet phalloplasty resources for potential patients who might consider this procedure. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  16. Gullible's Travels.

    ERIC Educational Resources Information Center

    Block, Marylaine

    2002-01-01

    Discusses how to teach students to evaluate information they find on the Internet. Highlights include motivation of Web site owners; link-checking; having student create Web pages to help with their evaluation skills of other Web sites; critical thinking skills; and helpful Web sites. (LRW)

  17. Prospective analysis of the quality of Spanish health information web sites after 3 years.

    PubMed

    Conesa-Fuentes, Maria C; Hernandez-Morante, Juan J

    2016-12-01

    Although the Internet has become an essential source of health information, our study conducted 3 years ago provided evidence of the low quality of Spanish health web sites. The objective of the present study was to evaluate the quality of Spanish health information web sites now, and to compare these results with those obtained 3 years ago. For the original study, the most visited health information web sites were selected through the PageRank® (Google®) system. The present study evaluated the quality of the same web sites from February to May 2013, using the method developed by Bermúdez-Tamayo et al. and HONCode® criteria. The mean quality of the selected web sites was low and has deteriorated since the previous evaluation, especially in regional health services and institutions' web sites. The quality of private web sites remained broadly similar. Compliance with privacy and update criteria also improved in the intervening period. The results indicate that, even in the case of health web sites, design or appearance is more relevant to developers than quality of information. It is recommended that responsible institutions should increase their efforts to eliminate low-quality health information that may further contribute to health problems.

  18. Solar Irradiance Data Products at the LASP Interactive Solar IRradiance Datacenter (LISIRD)

    NASA Astrophysics Data System (ADS)

    Lindholm, D. M.; Ware DeWolfe, A.; Wilson, A.; Pankratz, C. K.; Snow, M. A.; Woods, T. N.

    2011-12-01

    The Laboratory for Atmospheric and Space Physics (LASP) has developed the LASP Interactive Solar IRradiance Datacenter (LISIRD, http://lasp.colorado.edu/lisird/) web site to provide access to a comprehensive set of solar irradiance measurements and related datasets. Current data holdings include products from NASA missions SORCE, UARS, SME, and TIMED-SEE. The data provided covers a wavelength range from soft X-ray (XUV) at 0.1 nm up to the near infrared (NIR) at 2400 nm, as well as Total Solar Irradiance (TSI). Other datasets include solar indices, spectral and flare models, solar images, and more. The LISIRD web site features updated plotting, browsing, and download capabilities enabled by dygraphs, JavaScript, and Ajax calls to the LASP Time Series Server (LaTiS). In addition to the web browser interface, most of the LISIRD datasets can be accessed via the LaTiS web service interface that supports the OPeNDAP standard. OPeNDAP clients and other programming APIs are available for making requests that subset, aggregate, or filter data on the server before it is transported to the user. This poster provides an overview of the LISIRD system, summarizes the datasets currently available, and provides details on how to access solar irradiance data products through LISIRD's interfaces.
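
    A minimal sketch of the server-side subsetting idea described above: request a time-bounded slice of an irradiance series from a LaTiS-style web service so that only the needed rows are transferred. The endpoint layout, dataset id, and query parameter names below are illustrative assumptions, not the documented LISIRD/LaTiS API.

```python
# Minimal sketch of requesting a time-bounded subset of a solar irradiance
# series from a LaTiS-style web service. The base URL, dataset id, and query
# parameter names are illustrative assumptions, not the documented API.
import csv
import io
import urllib.request

BASE_URL = "https://lasp.colorado.edu/lisird/latis"   # assumed endpoint layout
DATASET = "sorce_tsi"                                  # hypothetical dataset id

# Ask the server to subset by time and return CSV, so only the needed rows travel.
query = f"{BASE_URL}/{DATASET}.csv?start=2010-01-01&end=2010-02-01"

with urllib.request.urlopen(query) as response:
    text = response.read().decode("utf-8")

rows = list(csv.reader(io.StringIO(text)))
print(f"received {len(rows) - 1} records; header: {rows[0]}")
```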

  19. Roadmap for a Departmental Web Site

    ERIC Educational Resources Information Center

    Zhang, Guo-Qiang; White, Lee; Hesse, Christopher; Buchner, Marc; Mehregany, Mehran

    2005-01-01

    Virtually every academic department in an institute of higher education requires Web presence as a critical component of its information technology strategy. The problem of how to leverage the World Wide Web and build effective and useful departmental Web sites seems to have long been solved. Yet browsing academic Web sites from around the world…

  20. The quality of mental health information commonly searched for on the Internet.

    PubMed

    Grohol, John M; Slimowicz, Joseph; Granda, Rebecca

    2014-04-01

    Previous research has reviewed the quality of online information related to specific mental disorders. Yet, no comprehensive study has been conducted on the overall quality of mental health information searched for online. This study examined the first 20 search results of two popular search engines, Google and Bing, for 11 common mental health terms. They were analyzed using the DISCERN instrument, an adaptation of the Depression Website Content Checklist (ADWCC), Flesch Reading Ease and Flesch-Kincaid Grade Level readability measures, HONCode badge display, and commercial status, resulting in an analysis of 440 web pages. Quality of Web site results varied based on type of disorder examined, with higher quality Web sites found for schizophrenia, bipolar disorder, and dysthymia, and lower quality ratings for phobia, anxiety, and panic disorder Web sites. Of the total Web sites analyzed, 67.5% had good or better quality content. Nearly one-third of the search results produced Web sites from three entities: WebMD, Wikipedia, and the Mayo Clinic. The mean Flesch Reading Ease score was 41.21, and the mean Flesch-Kincaid Grade Level score was 11.68. The presence of the HONCode badge and noncommercial status was found to have a small correlation with Web site quality, and Web sites displaying the HONCode badge and commercial sites had lower readability scores. Popular search engines appear to offer generally reliable results pointing to mostly good or better quality mental health Web sites. However, additional work is needed to make these sites more readable.
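
    The two readability measures used in this study follow fixed, standard formulas, so reported scores can be reproduced from word, sentence, and syllable counts. A minimal sketch (syllable counting itself is assumed to be handled elsewhere):

```python
# Standard Flesch readability formulas applied to pre-computed counts.
# Syllable counting (the hard part) is assumed to be done by the caller.

def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    """Higher scores indicate easier text (roughly 0-100)."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    """Approximate US school grade level needed to read the text."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Example: a page with 1200 words, 60 sentences, and 2100 syllables.
print(round(flesch_reading_ease(1200, 60, 2100), 2))   # ~38.49 (fairly difficult)
print(round(flesch_kincaid_grade(1200, 60, 2100), 2))  # ~12.86 (college level)
```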

  1. U.S. Naval Observatory Annual Report 2001-2002

    DTIC Science & Technology

    2002-06-01

    practical astronomical information and data via printed publications, software products, and the World Wide Web. The Department’s products are used by the...Astronomical Almanac. Each almanac edition contains data for 1 year. These publications are now on a well-established production schedule. The Astronomical...complementary Web site. In place of this list, the printed book will list the constants (and references) used in the computations. Data for the obsolete Besselian

  2. Public transparency Web sites for radiology practices: prevalence of price, clinical quality, and service quality information.

    PubMed

    Rosenkrantz, Andrew B; Doshi, Ankur M

    2016-01-01

    To assess information regarding radiology practices on public transparency Web sites. Eight Web sites comparing radiology centers' price and quality were identified. Web site content was assessed. Six of eight Web sites reported examination prices. Other reported information included hours of operation (4/8), patient satisfaction (2/8), American College of Radiology (ACR) accreditation (3/8), on-site radiologists (2/8), as well as parking, accessibility, waiting area amenities, same/next-day reports, mammography follow-up rates, examination appropriateness, radiation dose, fellowship-trained radiologists, and advanced technologies (1/8 each). Transparency Web sites had a preponderance of price (and to a lesser extent service quality) information, risking fostering price-based competition at the expense of clinical quality. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. VizieR Online Data Catalog: Granulation model for 508 KIC stars (Cranmer+, 2014)

    NASA Astrophysics Data System (ADS)

    Cranmer, S. R.; Bastien, F. A.; Stassun, K. G.; Saar, S. H.

    2016-01-01

    A goal of this work is to find self-consistent and accurate ways to predict the properties of stellar light-curve variability, and to use this variability to calibrate against other methods of determining their fundamental parameters. Thus, it may be possible to develop the analysis of granular flicker measurements in a way that augments the results of asteroseismology and improves the accuracy of, e.g., stellar mass and radius measurements. To assist in this process, we provide tabulated data for 508 stars with photometric light curves measured by the Kepler mission, which also includes their derived masses and predicted values of the turbulent Mach number (Ma), the root-mean-square (rms) granulation intensity amplitude σ, and the flicker amplitude F8. These data are also hosted, with updates as needed, on the first author's Web site (http://www.cfa.harvard.edu/~scranmer/). With the data is a short code written in the Interactive Data Language (IDL) that reads the data and reproduces two of the three panels of Figure 4 in the paper. (3 data files).
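
    As a hedged illustration of how the tabulated per-star values might be explored outside IDL, the sketch below loads the catalogue into pandas and plots flicker amplitude against stellar mass; the file name and column labels are placeholders and should be taken from the actual VizieR ReadMe.

```python
# Sketch of loading the tabulated per-star values and plotting flicker amplitude
# against stellar mass. The file name and column names below are placeholders;
# the actual catalogue column labels should be taken from the VizieR ReadMe.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("granulation_508.csv")          # hypothetical local export of the table

fig, ax = plt.subplots()
ax.scatter(df["mass"], df["F8"], s=8)            # "mass", "F8" are assumed column names
ax.set_xlabel("Stellar mass (solar masses)")
ax.set_ylabel("Flicker amplitude F8")
ax.set_title("Predicted granulation flicker for 508 KIC stars")
fig.savefig("flicker_vs_mass.png", dpi=150)
```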

  4. Analysis of Technique to Extract Data from the Web for Improved Performance

    NASA Astrophysics Data System (ADS)

    Gupta, Neena; Singh, Manish

    2010-11-01

    The World Wide Web rapidly guides the world into a newly amazing electronic world, where everyone can publish anything in electronic form and extract almost all the information. Extraction of information from semi structured or unstructured documents, such as web pages, is a useful yet complex task. Data extraction, which is important for many applications, extracts the records from the HTML files automatically. Ontologies can achieve a high degree of accuracy in data extraction. We analyze method for data extraction OBDE (Ontology-Based Data Extraction), which automatically extracts the query result records from the web with the help of agents. OBDE first constructs an ontology for a domain according to information matching between the query interfaces and query result pages from different web sites within the same domain. Then, the constructed domain ontology is used during data extraction to identify the query result section in a query result page and to align and label the data values in the extracted records. The ontology-assisted data extraction method is fully automatic and overcomes many of the deficiencies of current automatic data extraction methods.
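
    The labeling step described above can be illustrated with a toy sketch: raw field labels harvested from different result pages are aligned to a shared domain concept through a small synonym table. This is only a schematic of the idea, not the OBDE implementation.

```python
# Toy illustration of ontology-assisted labeling of extracted record fields:
# raw labels scraped from different result pages are mapped to a shared domain
# concept via a synonym table. A sketch of the idea only, not the OBDE system.

DOMAIN_ONTOLOGY = {
    "price":  {"price", "cost", "our price", "list price"},
    "title":  {"title", "book title", "name"},
    "author": {"author", "by", "written by"},
}

def label_field(raw_label: str) -> str | None:
    """Return the ontology concept a raw page label corresponds to, if any."""
    key = raw_label.strip().lower().rstrip(":")
    for concept, synonyms in DOMAIN_ONTOLOGY.items():
        if key in synonyms:
            return concept
    return None

# Records extracted from two different sites with differing label vocabularies.
site_a = {"Book Title": "Dune", "Our Price": "$9.99", "By": "F. Herbert"}
aligned = {label_field(k): v for k, v in site_a.items()}
print(aligned)   # {'title': 'Dune', 'price': '$9.99', 'author': 'F. Herbert'}
```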

  5. Scalability Issues for Remote Sensing Infrastructure: A Case Study.

    PubMed

    Liu, Yang; Picard, Sean; Williamson, Carey

    2017-04-29

    For the past decade, a team of University of Calgary researchers has operated a large "sensor Web" to collect, analyze, and share scientific data from remote measurement instruments across northern Canada. This sensor Web receives real-time data streams from over a thousand Internet-connected sensors, with a particular emphasis on environmental data (e.g., space weather, auroral phenomena, atmospheric imaging). Through research collaborations, we had the opportunity to evaluate the performance and scalability of their remote sensing infrastructure. This article reports the lessons learned from our study, which considered both data collection and data dissemination aspects of their system. On the data collection front, we used benchmarking techniques to identify and fix a performance bottleneck in the system's memory management for TCP data streams, while also improving system efficiency on multi-core architectures. On the data dissemination front, we used passive and active network traffic measurements to identify and reduce excessive network traffic from the Web robots and JavaScript techniques used for data sharing. While our results are from one specific sensor Web system, the lessons learned may apply to other scientific Web sites with remote sensing infrastructure.

  6. PhosphOrtholog: a web-based tool for cross-species mapping of orthologous protein post-translational modifications.

    PubMed

    Chaudhuri, Rima; Sadrieh, Arash; Hoffman, Nolan J; Parker, Benjamin L; Humphrey, Sean J; Stöckli, Jacqueline; Hill, Adam P; James, David E; Yang, Jean Yee Hwa

    2015-08-19

    Most biological processes are influenced by protein post-translational modifications (PTMs). Identifying novel PTM sites in different organisms, including humans and model organisms, has expedited our understanding of key signal transduction mechanisms. However, with increasing availability of deep, quantitative datasets in diverse species, there is a growing need for tools to facilitate cross-species comparison of PTM data. This is particularly important because functionally important modification sites are more likely to be evolutionarily conserved; yet cross-species comparison of PTMs is difficult since they often lie in structurally disordered protein domains. Current tools that address this can only map known PTMs between species based on known orthologous phosphosites, and do not enable the cross-species mapping of newly identified modification sites. Here, we addressed this by developing a web-based software tool, PhosphOrtholog ( www.phosphortholog.com ) that accurately maps protein modification sites between different species. This facilitates the comparison of datasets derived from multiple species, and should be a valuable tool for the proteomics community. Here we describe PhosphOrtholog, a web-based application for mapping known and novel orthologous PTM sites from experimental data obtained from different species. PhosphOrtholog is the only generic and automated tool that enables cross-species comparison of large-scale PTM datasets without relying on existing PTM databases. This is achieved through pairwise sequence alignment of orthologous protein residues. To demonstrate its utility we apply it to two sets of human and rat muscle phosphoproteomes generated following insulin and exercise stimulation, respectively, and one publicly available mouse phosphoproteome following cellular stress revealing high mapping and coverage efficiency. Although coverage statistics are dataset dependent, PhosphOrtholog increased the number of cross-species mapped sites in all our example data sets by more than double when compared to those recovered using existing resources such as PhosphoSitePlus. PhosphOrtholog is the first tool that enables mapping of thousands of novel and known protein phosphorylation sites across species, accessible through an easy-to-use web interface. Identification of conserved PTMs across species from large-scale experimental data increases our knowledgebase of functional PTM sites. Moreover, PhosphOrtholog is generic being applicable to other PTM datasets such as acetylation, ubiquitination and methylation.
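
    The core mapping operation, translating a modification site's residue index through a pairwise alignment of orthologous sequences, can be sketched without any external dependency. The aligned toy sequences below stand in for a real alignment and are not taken from the tool itself.

```python
# Sketch of the core cross-species mapping step: given a pairwise alignment of
# two orthologous protein sequences (gaps as '-'), translate a 1-based
# modification-site position in species A to the corresponding position in
# species B. The aligned toy sequences below stand in for a real alignment.

def map_site(aligned_a: str, aligned_b: str, site_a: int) -> int | None:
    """Map a 1-based residue position in sequence A onto sequence B."""
    pos_a = pos_b = 0
    for col_a, col_b in zip(aligned_a, aligned_b):
        if col_a != "-":
            pos_a += 1
        if col_b != "-":
            pos_b += 1
        if col_a != "-" and pos_a == site_a:
            return pos_b if col_b != "-" else None   # site falls in a gap
    return None

human_aln = "MKT-AYSPLR"
rat_aln   = "MKTQAYS-LR"
print(map_site(human_aln, rat_aln, 6))   # human residue 6 (S) -> rat residue 7
```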

  7. Automated Management of Exercise Intervention at the Point of Care: Application of a Web-Based Leg Training System.

    PubMed

    Dedov, Vadim N; Dedova, Irina V

    2015-11-23

    Recent advances in information and communication technology have prompted development of Web-based health tools to promote physical activity, the key component of cardiac rehabilitation and chronic disease management. Mobile apps can facilitate behavioral changes and help in exercise monitoring, although actual training usually takes place away from the point of care in specialized gyms or outdoors. Daily participation in conventional physical activities is expensive, time consuming, and mostly relies on self-management abilities of patients who are typically aged, overweight, and unfit. Facilitation of sustained exercise training at the point of care might improve patient engagement in cardiac rehabilitation. In this study we aimed to test the feasibility of execution and automatic monitoring of several exercise regimens on-site using a Web-enabled leg training system. The MedExercise leg rehabilitation machine was equipped with wireless temperature sensors in order to monitor its usage by the rise of temperature in the resistance unit (Δt°). Personal electronic devices such as laptop computers were fitted with wireless gateways and relevant software was installed to monitor the usage of training machines. Cloud-based software allowed monitoring of participant training over the Internet. Seven healthy participants applied the system at various locations with training protocols typically used in cardiac rehabilitation. The heart rates were measured by fingertip pulse oximeters. Exercising in home chairs, in bed, and under an office desk was made feasible and resulted in an intensity-dependent increase of participants' heart rates and Δt° in training machine temperatures. Participants self-controlled their activities on smart devices, while a supervisor monitored them over the Internet. Individual Δt° reached during 30 minutes of moderate-intensity continuous training averaged 7.8°C (SD 1.6). These Δt° were used as personalized daily doses of exercise with automatic email alerts sent upon achieving them. During 1-week training at home, automatic notifications were received on 4.4 days (SD 1.8). Although the high intensity interval training regimen was feasible on-site, it was difficult for self- and remote management. Opportunistic leg exercise under the desk, while working with a computer, and training in bed while viewing television were less intensive than dosed exercise bouts, but allowed prolonged leg mobilization of 73.7 minutes/day (SD 29.7). This study demonstrated the feasibility of self-control exercise training on-site, which was accompanied by online monitoring, electronic recording, personalization of exercise doses, and automatic reporting of adherence. The results suggest that this technology and its applications are useful for the delivery of Web-based exercise rehabilitation and cardiac training programs at the point of care. ©Vadim N Dedov, Irina V Dedova. Originally published in JMIR Rehabilitation and Assistive Technology (http://rehab.jmir.org), 23.11.2015.
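
    A hedged sketch of the dose-tracking logic described above: watch the temperature rise of the resistance unit during a session and raise a notification once the personalized daily Δt° target is reached. The sensor read and alert functions are stand-ins for the wireless gateway and e-mail notification used in the study.

```python
# Sketch of the dose-tracking idea: watch the resistance-unit temperature rise
# (delta-t) during a session and notify once the personalized daily target is
# reached. read_temperature() and notify() are stand-ins for the wireless
# sensor gateway and the e-mail alert described in the study.
import random
import time

DAILY_TARGET_DELTA_T = 7.8      # degrees C, example personalized dose

def read_temperature() -> float:
    """Placeholder for a wireless temperature-sensor reading."""
    return 22.0 + random.uniform(0.0, 12.0)

def notify(message: str) -> None:
    """Placeholder for sending an e-mail or push notification."""
    print("ALERT:", message)

baseline = 22.0                          # assume the unit starts at room temperature
notified = False
for _ in range(30):                      # poll for a short demo session
    delta_t = read_temperature() - baseline
    if delta_t >= DAILY_TARGET_DELTA_T and not notified:
        notify(f"Daily exercise dose reached (delta-t = {delta_t:.1f} C)")
        notified = True
    time.sleep(0.1)                      # would be minutes in a real deployment
```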

  8. A Web-based tool for UV irradiance data: predictions for European and Southeast Asian sites.

    PubMed

    Kift, Richard; Webb, Ann R; Page, John; Rimmer, John; Janjai, Serm

    2006-01-01

    There are a range of UV models available, but one needs significant pre-existing knowledge and experience in order to be able to use them. In this article a comparatively simple Web-based model developed for the SoDa (Integration and Exploitation of Networked Solar Radiation Databases for Environment Monitoring) project is presented. This is a clear-sky model with modifications for cloud effects. To determine if the model produces realistic UV data the output is compared with 1-year sets of hourly measurements at sites in the United Kingdom and Thailand. The accuracy of the output depends on the input, but reasonable results were obtained with the use of the default database inputs and improved when pyranometer instead of modeled data provided the global radiation input needed to estimate the UV. The average modeled values of UV for the UK site were found to be within 10% of measurements. For the tropical sites in Thailand the average modeled values were within 11-20% of measurements for the four sites with the use of the default SoDa database values. These results improved when pyranometer data and TOMS ozone data from 2002 replaced the standard SoDa database values, reducing the error range for all four sites to less than 15%.

  9. Clustergrammer, a web-based heatmap visualization and analysis tool for high-dimensional biological data

    PubMed Central

    Fernandez, Nicolas F.; Gundersen, Gregory W.; Rahman, Adeeb; Grimes, Mark L.; Rikova, Klarisa; Hornbeck, Peter; Ma’ayan, Avi

    2017-01-01

    Most tools developed to visualize hierarchically clustered heatmaps generate static images. Clustergrammer is a web-based visualization tool with interactive features such as: zooming, panning, filtering, reordering, sharing, performing enrichment analysis, and providing dynamic gene annotations. Clustergrammer can be used to generate shareable interactive visualizations by uploading a data table to a web-site, or by embedding Clustergrammer in Jupyter Notebooks. The Clustergrammer core libraries can also be used as a toolkit by developers to generate visualizations within their own applications. Clustergrammer is demonstrated using gene expression data from the cancer cell line encyclopedia (CCLE), original post-translational modification data collected from lung cancer cells lines by a mass spectrometry approach, and original cytometry by time of flight (CyTOF) single-cell proteomics data from blood. Clustergrammer enables producing interactive web based visualizations for the analysis of diverse biological data. PMID:28994825
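
    The preprocessing behind any clustered heatmap, reordering rows and columns by hierarchical clustering, can be sketched directly with SciPy; the block below only illustrates that step and is not the Clustergrammer API.

```python
# Sketch of the preprocessing behind a clustered heatmap: hierarchically cluster
# rows and columns of a data matrix and reorder it for display. This uses SciPy
# directly and is not the Clustergrammer API; it only illustrates the idea.
import numpy as np
from scipy.cluster.hierarchy import linkage, leaves_list

rng = np.random.default_rng(0)
matrix = rng.normal(size=(20, 8))                 # e.g. genes x samples

row_order = leaves_list(linkage(matrix, method="average"))
col_order = leaves_list(linkage(matrix.T, method="average"))
clustered = matrix[np.ix_(row_order, col_order)]  # matrix reordered for a heatmap

print("row order:", row_order)
print("col order:", col_order)
```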

  10. Initiatives to Develop Web Sites Including Information about Brownfields Properties

    EPA Pesticide Factsheets

    This web site was created to assist in planning, designing, and operating web sites that include information about individual brownfields properties. The report is of value to parties designing or managing such sites.

  11. Promoting Your Web Site.

    ERIC Educational Resources Information Center

    Raeder, Aggi

    1997-01-01

    Discussion of ways to promote sites on the World Wide Web focuses on how search engines work and how they retrieve and identify sites. Appropriate Web links for submitting new sites and for Internet marketing are included. (LRW)

  12. Feasibility of using a web-based nutrition intervention among residents of multiethnic working-class neighborhoods.

    PubMed

    McNeill, Lorna H; Viswanath, K; Bennett, Gary G; Puleo, Elaine; Emmons, Karen M

    2007-07-01

    Using the Internet to promote behavior change is becoming more desirable as Internet use continues to increase among diverse audiences. Yet we know very little about whether this medium is useful or about different strategies to encourage Internet use by various populations. This pilot study tested the usefulness of a Web-based intervention designed to deliver nutrition-related information to and increase fruit and vegetable consumption among adults from working-class neighborhoods. Participants (N = 52) had access to the Web site for 6 weeks and received three e-mail reminders encouraging them to eat fruits and vegetables. The Web site provided information about overcoming barriers to healthy eating, accessing social support for healthy eating, setting goals for healthy eating, and maintaining a healthy diet, including recipes. We collected data on participants' use of the Web site, their Internet access and use, and their fruit and vegetable consumption. The mean age of the participants was 46 years, 73% were white, 46% did not have a college degree, and 12% had household incomes at or below 185% of the federal poverty index. They reported consuming an average of 3.4 servings of fruits and vegetables per day. More than half of the participants owned a computer, 75% logged onto the Web site at least once, and those who visited the site averaged 3.8 visits and viewed an average of 24.5 pages. The number of log-ons per day declined over the study period; however, reminder e-mails appeared to motivate participants to return to the Web site. Roughly 74% of participants viewed information on goal setting, 72% viewed information on dietary tracking, and 56% searched for main course recipes. The results of this pilot study suggest that Internet-based health messages have the potential to reach a large percentage of adults from working-class neighborhoods who have access to the Internet.

  13. Seahawk: moving beyond HTML in Web-based bioinformatics analysis.

    PubMed

    Gordon, Paul M K; Sensen, Christoph W

    2007-06-18

    Traditional HTML interfaces for input to and output from Bioinformatics analysis on the Web are highly variable in style, content and data formats. Combining multiple analyses can therefore be an onerous task for biologists. Semantic Web Services allow automated discovery of conceptual links between remote data analysis servers. A shared data ontology and service discovery/execution framework is particularly attractive in Bioinformatics, where data and services are often both disparate and distributed. Instead of biologists copying, pasting and reformatting data between various Web sites, Semantic Web Service protocols such as MOBY-S hold out the promise of seamlessly integrating multi-step analysis. We have developed a program (Seahawk) that allows biologists to intuitively and seamlessly chain together Web Services using a data-centric, rather than the customary service-centric approach. The approach is illustrated with a ferredoxin mutation analysis. Seahawk concentrates on lowering entry barriers for biologists: no prior knowledge of the data ontology, or relevant services is required. In stark contrast to other MOBY-S clients, in Seahawk users simply load Web pages and text files they already work with. Underlying the familiar Web-browser interaction is an XML data engine based on extensible XSLT style sheets, regular expressions, and XPath statements which import existing user data into the MOBY-S format. As an easily accessible applet, Seahawk moves beyond standard Web browser interaction, providing mechanisms for the biologist to concentrate on the analytical task rather than on the technical details of data formats and Web forms. As the MOBY-S protocol nears a 1.0 specification, we expect more biologists to adopt these new semantic-oriented ways of doing Web-based analysis, which empower them to do more complicated, ad hoc analysis workflow creation without the assistance of a programmer.
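
    A toy illustration of the data-centric import idea: scan an ordinary HTML page a biologist already works with and pull out candidate accession-style identifiers that could then be handed to downstream services. The page content and regular expression below are invented for the example.

```python
# Toy illustration of the data-centric import step: scan an ordinary HTML page
# and pull out candidate GenBank-style accession numbers that could then be
# handed to downstream Web Services. Page content and pattern are illustrative.
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text of an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        self.chunks.append(data)

page = """
<html><body>
  <p>Ferredoxin sequences discussed: NC_000913 and NP_414542.</p>
</body></html>
"""

parser = TextExtractor()
parser.feed(page)
text = " ".join(parser.chunks)

accessions = re.findall(r"\b[A-Z]{2}_\d{6,9}\b", text)
print(accessions)   # ['NC_000913', 'NP_414542']
```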

  14. Seahawk: moving beyond HTML in Web-based bioinformatics analysis

    PubMed Central

    Gordon, Paul MK; Sensen, Christoph W

    2007-01-01

    Background Traditional HTML interfaces for input to and output from Bioinformatics analysis on the Web are highly variable in style, content and data formats. Combining multiple analyses can therefore be an onerous task for biologists. Semantic Web Services allow automated discovery of conceptual links between remote data analysis servers. A shared data ontology and service discovery/execution framework is particularly attractive in Bioinformatics, where data and services are often both disparate and distributed. Instead of biologists copying, pasting and reformatting data between various Web sites, Semantic Web Service protocols such as MOBY-S hold out the promise of seamlessly integrating multi-step analysis. Results We have developed a program (Seahawk) that allows biologists to intuitively and seamlessly chain together Web Services using a data-centric, rather than the customary service-centric approach. The approach is illustrated with a ferredoxin mutation analysis. Seahawk concentrates on lowering entry barriers for biologists: no prior knowledge of the data ontology, or relevant services is required. In stark contrast to other MOBY-S clients, in Seahawk users simply load Web pages and text files they already work with. Underlying the familiar Web-browser interaction is an XML data engine based on extensible XSLT style sheets, regular expressions, and XPath statements which import existing user data into the MOBY-S format. Conclusion As an easily accessible applet, Seahawk moves beyond standard Web browser interaction, providing mechanisms for the biologist to concentrate on the analytical task rather than on the technical details of data formats and Web forms. As the MOBY-S protocol nears a 1.0 specification, we expect more biologists to adopt these new semantic-oriented ways of doing Web-based analysis, which empower them to do more complicated, ad hoc analysis workflow creation without the assistance of a programmer. PMID:17577405

  15. VISA--Vector Integration Site Analysis server: a web-based server to rapidly identify retroviral integration sites from next-generation sequencing.

    PubMed

    Hocum, Jonah D; Battrell, Logan R; Maynard, Ryan; Adair, Jennifer E; Beard, Brian C; Rawlings, David J; Kiem, Hans-Peter; Miller, Daniel G; Trobridge, Grant D

    2015-07-07

    Analyzing the integration profile of retroviral vectors is a vital step in determining their potential genotoxic effects and developing safer vectors for therapeutic use. Identifying retroviral vector integration sites is also important for retroviral mutagenesis screens. We developed VISA, a vector integration site analysis server, to analyze next-generation sequencing data for retroviral vector integration sites. Sequence reads that contain a provirus are mapped to the human genome, sequence reads that cannot be localized to a unique location in the genome are filtered out, and then unique retroviral vector integration sites are determined based on the alignment scores of the remaining sequence reads. VISA offers a simple web interface to upload sequence files and results are returned in a concise tabular format to allow rapid analysis of retroviral vector integration sites.
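
    A hedged sketch of the filtering step described above, written with pysam: keep only confidently mapped reads and tally the genomic coordinates where they begin as candidate integration sites. The input file name and MAPQ threshold are assumptions, and this is not the VISA pipeline itself.

```python
# Sketch of the read-filtering step: keep only confidently mapped reads (here,
# MAPQ >= 30 as an assumed stand-in for "uniquely mapped") and record the
# genomic coordinate where each read begins as a candidate vector integration
# site. Illustrative only; not the VISA pipeline.
import collections
import pysam

MIN_MAPQ = 30   # assumed uniqueness threshold
sites = collections.Counter()

with pysam.AlignmentFile("ltr_junction_reads.bam", "rb") as bam:   # hypothetical input
    for read in bam.fetch():
        if read.is_unmapped or read.mapping_quality < MIN_MAPQ:
            continue                      # drop reads without a confident unique location
        strand = "-" if read.is_reverse else "+"
        # 5' end of the read on the reference marks the provirus-genome junction
        pos = read.reference_end - 1 if read.is_reverse else read.reference_start
        sites[(read.reference_name, pos, strand)] += 1

for (chrom, pos, strand), count in sites.most_common(10):
    print(f"{chrom}\t{pos}\t{strand}\t{count} supporting reads")
```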

  16. Nursing students' perception of a Web-based intervention to support learning.

    PubMed

    Koch, Jane; Andrew, Sharon; Salamonson, Yenna; Everett, Bronwyn; Davidson, Patricia M

    2010-08-01

    Tailoring information to the needs of the learner is an important strategy in contemporary education settings. Web-based learning support, informed by multimedia theory and comprising interactive quizzes, glossaries with audio, short narrated PowerPoint® presentations, animations and digitised video clips, was introduced in a first year Bachelor of Nursing biological sciences subject at a university in metropolitan Sydney. All students enrolled in this unit were invited to obtain access to the site and the number of hits to the site was recorded using the student tracking facility available on WebCT, an online course delivery tool adopted widely by many educational institutions and used in this study. Eighty-five percent of students enrolled in the subject accessed the learning support site. Students' perception of the value of a learning support site was assessed using a web-based survey. The survey was completed by 123 participants, representing a response rate of 22%. Three themes emerged from the qualitative data concerning nursing students' perception of the web-based activities: 'enhances my learning', 'study at my own pace', and 'about the activities: what I really liked/disliked'. Web-based interventions, supplementing a traditionally presented nursing science course, were perceived by students to be beneficial in both learning and language development. Although students value interactive, multimedia learning, they were not ready to completely abandon traditional modes of learning, including face-to-face lectures. The findings of this study contribute to an understanding of how web-based resources can be best used to support students' learning in bioscience. Copyright 2009 Elsevier Ltd. All rights reserved.

  17. The effect of tailored Web-based interventions on pain in adults: a systematic review protocol.

    PubMed

    Martorella, Géraldine; Gélinas, C; Bérubé, M; Boitor, M; Fredericks, S; LeMay, S

    2016-04-12

    Information technologies can facilitate the implementation of health interventions, especially in the case of widespread conditions such as pain. Tailored Web-based interventions have been recognized for health behavior change among diverse populations. However, none of the systematic reviews looking at Web-based interventions for pain management has specifically addressed the contribution of tailoring. The aims of this systematic review are to assess the effect of tailored Web-based pain management interventions on pain intensity and physical and psychological functions. Randomized controlled trials including adults suffering from any type of pain and involving Web-based interventions for pain management, using at least one of the three tailoring strategies (personalization, feedback, or adaptation), will be considered. The following types of comparisons will be carried out: tailored Web-based intervention with (1) usual care (passive control group), (2) face-to-face intervention, and (3) standardized Web-based intervention. The primary outcome will be pain intensity measured using a self-report measure such as the numeric rating scale (e.g., 0-10) or visual analog scale (e.g., 0-100). Secondary outcomes will include pain interference with activities and psychological well-being. A systematic review of English and French articles using MEDLINE, Embase, CINAHL, PsycINFO, Web of Science, and Cochrane Library will be conducted from January 2000 to December 2015. Eligibility assessment will be performed independently in an unblinded standardized manner by two reviewers. Extracted data will include the following: sample size, demographics, dropout rate, number and type of study groups, type of pain, inclusion and exclusion criteria, study setting, type of Web-based intervention, tailoring strategy, comparator, type of pain intensity measure, pain-related disability and psychological well-being outcomes, and times of measurement. Disagreements between reviewers at the full-text level will be resolved by consulting a third reviewer, a senior researcher. This systematic review is the first one looking at the specific ingredients and effects of tailored and Web-based interventions for pain management. Results of this systematic review could contribute to a better understanding of the mechanisms by which Web-based interventions could be helpful for people facing pain problems. PROSPERO CRD42015027669.

  18. Gender and online privacy among teens: risk perception, privacy concerns, and protection behaviors.

    PubMed

    Youn, Seounmi; Hall, Kimberly

    2008-12-01

    Survey data from 395 high school students revealed that girls perceive more privacy risks and have a higher level of privacy concerns than boys. Regarding privacy protection behaviors, boys tended to read unsolicited e-mail and register for Web sites while directly sending complaints in response to unsolicited e-mail. This study found girls to provide inaccurate information as their privacy concerns increased. Boys, however, refrained from registering to Web sites as their concerns increased.

  19. Does an Interactive WebCT Site Help Students Learn?

    ERIC Educational Resources Information Center

    Elicker, Joelle D.; O'Malley, Alison L.; Williams, Christine M.

    2008-01-01

    We examined whether students with access to a supplemental course Web site enhanced with e-mail, discussion boards, and chat room capability reacted to it more positively than students who used a Web site with the same content but no communication features. Students used the Web sites on a voluntary basis. At the end of the semester, students…

  20. Investigating Web Sites of Faculties of Education: The Case of Turkey

    ERIC Educational Resources Information Center

    Kutluca, Tamer; Aydin, Serhat; Baki, Adnan

    2009-01-01

    The purpose of this paper is to explore the current status of the web sites of the Faculties of Education (FOEs) in Turkey. Bearing this in mind, a "Web Site Assessment Form" comprising thirty-seven items was developed and the web sites of the FOEs were evaluated with respect to "Content," "Currency," "Structure…

  1. 75 FR 384 - Event Problem Codes Web Site; Center for Devices and Radiological Health; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-05

    ...] Event Problem Codes Web Site; Center for Devices and Radiological Health; Availability AGENCY: Food and... the availability of a Web site where the Center for Devices and Radiological Health (CDRH) is posting... to all reporters (Sec. 803.21(b)). FDA is announcing the availability of a Web site that will make...

  2. 75 FR 75962 - Proposed Information Collection; Comment Request; Commerce.Gov Web Site User Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ...; Commerce.Gov Web Site User Survey AGENCY: Office of the Secretary, Office of Public Affairs. ACTION: Notice... serve users of Commerce.gov and the Department of Commerce bureaus' Web sites, the Offices of Public Affairs will collect information from users about their experience on the Web sites. A random number of...

  3. 49 CFR 375.213 - What information must I provide to a prospective individual shipper?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... hyperlink on your Internet Web site to the FMCSA Web site containing the information in FMCSA's publication... Internet Web site to the FMCSA Web site containing the information in FMCSA's publication “Your Rights and... explanation that individual shippers may examine these tariff sections or have copies sent to them upon...

  4. 76 FR 22926 - Notice of Availability of Draft Environmental Assessment and Finding of No Significant Impact for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-25

    ... the following methods: Federal Rulemaking Web site: Go to http://www.regulations.gov and search for... writing or in electronic form will be posted on the NRC Web site and on the Federal rulemaking Web site... electronically under ADAMS Accession Number ML110870992. Federal Rulemaking Web site: Public comments and...

  5. How Accessible Are Public Libraries' Web Sites? A Study of Georgia Public Libraries

    ERIC Educational Resources Information Center

    Ingle, Emma; Green, Ravonne A.; Huprich, Julia

    2009-01-01

    One issue that public librarians must consider when planning Web site design is accessibility for patrons with disabilities. This article reports a study of Web site accessibility of public libraries in Georgia. The focus of the report is whether public libraries use accessible guidelines and standards in making their Web sites accessible. An…

  6. Assessing an Infant Feeding Web Site as a Nutrition Education Tool for Child Care Providers

    ERIC Educational Resources Information Center

    Clark, Alena; Anderson, Jennifer; Adams, Elizabeth; Baker, Susan; Barrett, Karen

    2009-01-01

    Objective: Determine child care providers' infant feeding knowledge, attitude and behavior changes after viewing the infant feeding Web site and determine the effectiveness of the Web site and bilingual educational materials. Design: Intervention and control groups completed an on-line pretest survey, viewed a Web site for 3 months, and completed…

  7. 12 CFR 4.4 - Washington office and web site.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 1 2013-01-01 2013-01-01 false Washington office and web site. 4.4 Section 4.4... EXAMINERS Organization and Functions § 4.4 Washington office and web site. The Washington office of the OCC...'s Web site is at http://www.occ.gov. [76 FR 43561, July 21, 2011] ...

  8. 12 CFR 4.4 - Washington office and web site.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 1 2014-01-01 2014-01-01 false Washington office and web site. 4.4 Section 4.4... EXAMINERS Organization and Functions § 4.4 Washington office and web site. The Washington office of the OCC...'s Web site is at http://www.occ.gov. [76 FR 43561, July 21, 2011] ...

  9. 12 CFR 4.4 - Washington office and web site.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 1 2012-01-01 2012-01-01 false Washington office and web site. 4.4 Section 4.4... EXAMINERS Organization and Functions § 4.4 Washington office and web site. The Washington office of the OCC...'s Web site is at http://www.occ.gov. [76 FR 43561, July 21, 2011] ...

  10. Food marketing on popular children's web sites: a content analysis.

    PubMed

    Alvy, Lisa M; Calvert, Sandra L

    2008-04-01

    In 2006 the Institute of Medicine (IOM) concluded that food marketing was a contributor to childhood obesity in the United States. One recommendation of the IOM committee was for research on newer marketing venues, such as Internet Web sites. The purpose of this cross-sectional study was to answer the IOM's call by examining food marketing on popular children's Web sites. Ten Web sites were selected based on market research conducted by KidSay, which identified favorite sites of children aged 8 to 11 years during February 2005. Using a standardized coding form, these sites were examined page by page for the existence, type, and features of food marketing. Web sites were compared using chi-square analyses. Although food marketing was not pervasive on the majority of the sites, seven of the 10 Web sites contained food marketing. The products marketed were primarily candy, cereal, quick serve restaurants, and snacks. Candystand.com, a food product site, contained a significantly greater amount of food marketing than the other popular children's Web sites. Because the foods marketed to children are not consistent with a healthful diet, nutrition professionals should consider joining advocacy groups to pressure industry to reduce online food marketing directed at youth.
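
    A worked example of the kind of chi-square comparison reported above, using made-up counts of pages with and without food marketing on two groups of sites (the numbers are illustrative only and are not the study's data):

```python
# Worked example of a chi-square comparison with made-up counts: pages with vs.
# without food marketing on two groups of sites. Illustrative numbers only.
from scipy.stats import chi2_contingency

#                 with marketing, without marketing
observed = [[120, 380],    # food-product site (e.g. a candy brand site)
            [ 45, 955]]    # general children's entertainment sites

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.2e}")
```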

  11. Next-Gen Search Engines

    ERIC Educational Resources Information Center

    Gupta, Amardeep

    2005-01-01

    Current search engines--even the constantly surprising Google--seem unable to leap the next big barrier in search: the trillions of bytes of dynamically generated data created by individual web sites around the world, or what some researchers call the "deep web." The challenge now is not information overload, but information overlook.…

  12. A Flexible Monitoring Infrastructure for the Simulation Requests

    NASA Astrophysics Data System (ADS)

    Spinoso, V.; Missiato, M.

    2014-06-01

    Running and monitoring simulations usually involves several different aspects of the entire workflow: the configuration of the job, site issues, software deployment at the site, the file catalogue, and the transfers of the simulated data. In addition, the final product of the simulation is often the result of several sequential steps. This project takes a different approach to monitoring simulation requests. All the necessary data are collected from the central services that handle request submission and data management, and stored by a backend into a NoSQL-based data cache; those data can be queried through a Web Service interface, which returns JSON responses, allowing users, sites, and physics groups to easily create their own web frontends, aggregating only the needed information. As an example, it will be shown how it is possible to monitor the CMS services (ReqMgr, DAS/DBS, PhEDEx) using a central backend and multiple customized cross-language frontends.
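
    A hedged sketch of the consumer side of such a system: query the JSON-returning web service and aggregate only the fields a given site or physics group cares about. The endpoint URL and field names are placeholders, not the project's actual interface.

```python
# Sketch of a small custom "frontend": query the monitoring Web Service for a
# JSON list of simulation requests and aggregate just the fields of interest.
# The endpoint URL and field names are placeholders.
import json
import urllib.request
from collections import Counter

ENDPOINT = "https://monitoring.example.org/requests?site=T2_IT_Bari"   # hypothetical

with urllib.request.urlopen(ENDPOINT) as resp:
    requests_data = json.load(resp)          # expected: a list of request records

status_counts = Counter(r.get("status", "unknown") for r in requests_data)
for status, count in status_counts.most_common():
    print(f"{status:>12}: {count}")
```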

  13. Disclaimer for external Web links | National Oceanic and Atmospheric

    Science.gov Websites

    Web links The appearance of external links on this Web site does not constitute endorsement by the Department of Commerce/National Oceanic and Atmospheric Administration of external Web sites or the . These links are provided consistent with the stated purpose of this Department of Commerce/NOAA Web site

  14. One Course, One Web Site--Of Course? Maybe Not!

    ERIC Educational Resources Information Center

    Cohn, Ellen R.

    2004-01-01

    Colleges and universities increasingly employ commercial Web-based course management systems (such as Blackboard and WebCT). How is it, then, that these institutions unquestioningly allocate a unique Web site to each class? Why establish one Web site for one course when other options provide so many benefits? Why isn't there a clamor for…

  15. Integrating NASA Dryden Research Endeavors into the Teaching-Learning of Mathematics in the K-12 Classroom via the WWW

    NASA Technical Reports Server (NTRS)

    Ward, Robin A.

    2002-01-01

    The primary goal of this project was to continue populating the currently existing web site developed in 1998 in conjunction with the NASA Dryden Flight Research Center and California Polytechnic State University, with more mathematics lesson plans and activities that K-12 teachers, students, home-schoolers, and parents could access. All of the activities, while demonstrating some mathematical topic, also showcase the research endeavors of the NASA Dryden Flight Research Center. The website is located at: http://daniel.calpoly.edu/dfrc/Robin. The secondary goal of this project was to share the web-based activities with educators at various conferences and workshops. To address the primary goal of this project, over the past year, several new activities were posted on the web site and some of the existing activities were enhanced to contain more video clips, photos, and materials for teachers. To address the project's secondary goal, the web-based activities were showcased at several conferences and workshops. Additionally, in order to measure and assess the outreach impact of the web site, a link to the web site hitbox.com was established in April 2001, which allowed for the collection of traffic statistics against the web site (such as the domains of visitors, the frequency of visitors to this web site, etc.) Provided is a description of some of the newly created activities posted on the web site during the project period of 2001-2002, followed by a description of the conferences and workshops at which some of the web-based activities were showcased. Next is a brief summary of the web site's traffic statistics demonstrating its worldwide educational impact, followed by a listing of some of the awards and accolades the web site has received.

  16. Use of the World Wide Web for multisite data collection.

    PubMed

    Subramanian, A K; McAfee, A T; Getzinger, J P

    1997-08-01

    As access to the Internet becomes increasingly available, research applications in medicine will increase. This paper describes the use of the Internet, and, more specifically, the World Wide Web (WWW), as a channel of communication between EDs throughout the world and investigators who are interested in facilitating the collection of data from multiple sites. Data entered into user-friendly electronic surveys can be transmitted over the Internet to a database located at the site of the study, rendering geographic separation less of a barrier to the conduction of multisite studies. The electronic format of the data can enable real-time statistical processing while data are stored using existing database technologies. In theory, automated processing of variables within such a database enables early identification of data trends. Methods of ensuring validity, security, and compliance are discussed.

  17. Terminology issues in user access to Web-based medical information.

    PubMed Central

    McCray, A. T.; Loane, R. F.; Browne, A. C.; Bangalore, A. K.

    1999-01-01

    We conducted a study of user queries to the National Library of Medicine Web site over a three month period. Our purpose was to study the nature and scope of these queries in order to understand how to improve users' access to the information they are seeking on our site. The results show that the queries are primarily medical in content (94%), with only a small percentage (5.5%) relating to library services, and with a very small percentage (.5%) not being medically relevant at all. We characterize the data set, and conclude with a discussion of our plans to develop a UMLS-based terminology server to assist NLM Web users. PMID:10566330

  18. Multigraph: Interactive Data Graphs on the Web

    NASA Astrophysics Data System (ADS)

    Phillips, M. B.

    2010-12-01

    Many aspects of geophysical science involve time dependent data that is often presented in the form of a graph. Considering that the web has become a primary means of communication, there are surprisingly few good tools and techniques available for presenting time-series data on the web. The most common solution is to use a desktop tool such as Excel or Matlab to create a graph which is saved as an image and then included in a web page like any other image. This technique is straightforward, but it limits the user to one particular view of the data, and disconnects the graph from the data in a way that makes updating a graph with new data an often cumbersome manual process. This situation is somewhat analogous to the state of mapping before the advent of GIS. Maps existed only in printed form, and creating a map was a laborious process. In the last several years, however, the world of mapping has experienced a revolution in the form of web-based and other interactive computer technologies, so that it is now commonplace for anyone to easily browse through gigabytes of geographic data. Multigraph seeks to bring a similar ease of access to time series data. Multigraph is a program for displaying interactive time-series data graphs in web pages that includes a simple way of configuring the appearance of the graph and the data to be included. It allows multiple data sources to be combined into a single graph, and allows the user to explore the data interactively. Multigraph lets users explore and visualize "data space" in the same way that interactive mapping applications such as Google Maps facilitate exploring and visualizing geography. Viewing a Multigraph graph is extremely simple and intuitive, and requires no instructions. Creating a new graph for inclusion in a web page involves writing a simple XML configuration file and requires no programming. Multigraph can read data in a variety of formats, and can display data from a web service, allowing users to "surf" through large data sets, downloading only those the parts of the data that are needed for display. Multigraph is currently in use on several web sites including the US Drought Portal (www.drought.gov), the NOAA Climate Services Portal (www.climate.gov), the Climate Reference Network (www.ncdc.noaa.gov/crn), NCDC's State of the Climate Report (www.ncdc.noaa.gov/sotc), and the US Forest Service's Forest Change Assessment Viewer (ews.forestthreats.org/NPDE/NPDE.html). More information about Multigraph is available from the web site www.multigraph.org. Interactive Graph of Global Temperature Anomalies from ClimateWatch Magazine (http://www.climatewatch.noaa.gov/2009/articles/climate-change-global-temperature)

  19. Use of the computer and Internet among Italian families: first national study.

    PubMed

    Bricolo, Francesco; Gentile, Douglas A; Smelser, Rachel L; Serpelloni, Giovanni

    2007-12-01

    Although home Internet access has continued to increase, little is known about actual usage patterns in homes. This nationally representative study of over 4,700 Italian households with children measured computer and Internet use of each family member across 3 months. Data on actual computer and Internet usage were collected by Nielsen//NetRatings service and provide national baseline information on several variables for several age groups separately, including children, adolescents, and adult men and women. National averages are shown for the average amount of time spent using computers and on the Web, the percentage of each age group online, and the types of Web sites viewed. Overall, about one-third of children ages 2 to 11, three-fourths of adolescents and adult women, and over four-fifths of adult men access the Internet each month. Children spend an average of 22 hours/month on the computer, with a jump to 87 hours/month for adolescents. Adult women spend less time (about 60 hours/month), and adult men spend more (over 100). The types of Web sites visited are reported, including the top five for each age group. In general, search engines and Web portals are the top sites visited, regardless of age group. These data provide a baseline for comparisons across time and cultures.

  20. Designing and developing portable large-scale JavaScript web applications within the Experiment Dashboard framework

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Dzhunov, I.; Karavakis, E.; Kokoszkiewicz, L.; Nowotka, M.; Saiz, P.; Tuckett, D.

    2012-12-01

    Improvements in web browser performance and web standards compliance, as well as the availability of comprehensive JavaScript libraries, provide an opportunity to develop functionally rich yet intuitive web applications that allow users to access, render and analyse data in novel ways. However, the development of such large-scale JavaScript web applications presents new challenges, in particular with regard to code sustainability and team-based work. We present an approach that meets the challenges of large-scale JavaScript web application design and development, including client-side model-view-controller architecture, design patterns, and JavaScript libraries. Furthermore, we show how the approach leads naturally to the encapsulation of the data source as a web API, allowing applications to be easily ported to new data sources. The Experiment Dashboard framework is used for the development of applications for monitoring the distributed computing activities of virtual organisations on the Worldwide LHC Computing Grid. We demonstrate the benefits of the approach for large-scale JavaScript web applications in this context by examining the design of several Experiment Dashboard applications for data processing, data transfer and site status monitoring, and by showing how they have been ported for different virtual organisations and technologies.
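
    The portability idea described here is that the client never talks to a specific backend, only to a web API with a fixed response shape. The sketch below restates that idea in Python (the framework itself is JavaScript); the class and field names are invented for illustration and are not part of the Experiment Dashboard code base.

        # Data-source encapsulation behind a stable API shape (hypothetical names).
        from abc import ABC, abstractmethod

        class JobSource(ABC):
            """Interface the web API exposes, independent of the backend."""

            @abstractmethod
            def job_summary(self, site: str) -> dict:
                ...

        class GridDatabaseSource(JobSource):
            def job_summary(self, site: str) -> dict:
                # A real deployment would query the experiment's job database here.
                return {"site": site, "running": 120, "failed": 3}

        class MockSource(JobSource):
            def job_summary(self, site: str) -> dict:
                return {"site": site, "running": 0, "failed": 0}

        def handle_request(source: JobSource, site: str) -> dict:
            # Clients consume only this JSON shape, so swapping the backend
            # (porting to a new data source) requires no client-side changes.
            return source.job_summary(site)

        if __name__ == "__main__":
            print(handle_request(GridDatabaseSource(), "CERN-PROD"))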

  1. Online Hydrologic Impact Assessment Decision Support System using Internet and Web-GIS Capability

    NASA Astrophysics Data System (ADS)

    Choi, J.; Engel, B. A.; Harbor, J.

    2002-05-01

    Urban sprawl and the corresponding land use change from lower intensity uses, such as agriculture and forests, to higher intensity uses, including high density residential and commercial, have various long- and short-term environmental impacts on ground water recharge, water pollution, and storm water drainage. A web-based Spatial Decision Support System (SDSS) for long-term hydrologic impact modeling and analysis was developed. The system combines a hydrologic model, databases, web-GIS capability and HTML user interfaces to create a comprehensive hydrologic analysis system. The hydrologic model estimates daily direct runoff using the NRCS Curve Number technique and annual nonpoint source pollution loading by an event mean concentration approach. This is supported by a rainfall database with over 30 years of daily rainfall for the continental US. A web-GIS interface and a robust Web-based watershed delineation capability were developed to simplify the spatial data preparation task that is often a barrier to hydrologic model operation. The web-GIS supports browsing of map layers including hydrologic soil groups, roads, counties, streams, lakes and railroads, as well as on-line watershed delineation for any geographic point the user selects with a simple mouse click. The watershed delineation results can also be used to generate data for the hydrologic and water quality models available in the DSS. This system is already being used by city and local government planners for hydrologic impact evaluation of land use change from urbanization, and can be found at http://pasture.ecn.purdue.edu/~watergen/hymaps. This system can assist local community, city and watershed planners, and even professionals when they are examining impacts of land use change on water resources. They can estimate the hydrologic impact of possible land use changes using this system with readily available data supported through the Internet. This system provides a cost-effective approach to serve potential users who require easy-to-use tools.
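
    The NRCS Curve Number method mentioned above has a standard textbook form: potential maximum retention S = 1000/CN - 10 (inches), initial abstraction Ia = 0.2S, and direct runoff Q = (P - Ia)^2 / (P - Ia + S) when rainfall P exceeds Ia, otherwise zero. The Python sketch below implements that textbook formula for illustration; it is not taken from the DSS's own code.

        def curve_number_runoff(rainfall_in: float, curve_number: float) -> float:
            """Daily direct runoff (inches) from the standard NRCS Curve Number method."""
            s = 1000.0 / curve_number - 10.0   # potential maximum retention (inches)
            ia = 0.2 * s                       # initial abstraction
            if rainfall_in <= ia:
                return 0.0
            return (rainfall_in - ia) ** 2 / (rainfall_in - ia + s)

        # Example: a 2-inch storm on a parcel with CN = 85 yields roughly 0.8 inches of runoff.
        print(round(curve_number_runoff(2.0, 85.0), 2))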

  2. Direct-to-consumer advertising via the Internet: the role of Web site design.

    PubMed

    Sewak, Saurabh S; Wilkin, Noel E; Bentley, John P; Smith, Mickey C

    2005-06-01

    Recent attempts to propose criteria for judging the quality of pharmaceutical and healthcare Web sites do not distinguish between attributes of Web site design related to content and other attributes not related to the content. The Elaboration Likelihood Model from persuasion literature is used as a framework for investigating the effects of Web site design on consequents like attitude and knowledge acquisition. A between-subjects, 2 (high or low involvement)x2 (Web site designed with high or low aspects of visual appeal) factorial design was used in this research. College students were randomly assigned to these treatment groups yielding a balanced design with 29 observations per treatment cell. Analysis of variance results for the effects of involvement and Web site design on attitude and knowledge indicated that the interaction between the independent variables was not significant in both analyses. Examination of main effects revealed that participants who viewed the Web site with higher visual appeal actually had slightly lower knowledge scores (6.32) than those who viewed the Web site with lower visual appeal (7.03, F(1,112)=3.827, P=.053). Results of this research seem to indicate that aspects of Web site design (namely aspects of visual appeal and quality) may not play a role in attaining desired promotional objectives, which can include development of favorable attitudes toward the product and facilitating knowledge acquisition.

  3. Evaluation of Quality, Content, and Use of the Web Site Prepared for Family Members Giving Care to Stroke Patients.

    PubMed

    Demir, Yasemin; Gozum, Sebahat

    2015-09-01

    This study was designed to evaluate the quality, content, usability, and efficacy of a Web site prepared for the purpose of improving the caregiving capability of family members who provide care for stroke survivors at home. The DISCERN score for the Web site was found to be 4.35 over 5. The mean score of the first section, which assesses the reliability of the Web site, was 4.38 over 5; the mean score of the second section, which measures the quality of the information provided on treatment/care options, was 4.30; and the mean score of the third section, which gives a general evaluation of the material, was 4.1. The Web site content achieved an average score of 3.47 over 4 after evaluation by experts. The Web site system usability score was found to be 79.4 over 100. The Web site was utilized mostly for exercises in bed (76.3%; n = 29), use of medications, and patient safety (68.4%; n = 26). Caregivers who were younger, were employed, and had no previous experience of nursing a patient made relatively greater use of the patient nutrition and oral care section, while married family caregivers made greater use of the body hygiene section. The Web site quality and content were judged to be good and reliable to use. The Web site was efficiently used by caregivers.

  4. A usability evaluation exploring the design of American Nurses Association state web sites.

    PubMed

    Alexander, Gregory L; Wakefield, Bonnie J; Anbari, Allison B; Lyons, Vanessa; Prentice, Donna; Shepherd, Marilyn; Strecker, E Bradley; Weston, Marla J

    2014-08-01

    National leaders are calling for opportunities to facilitate the Future of Nursing. Such opportunities can be encouraged through state nurses association Web sites, part of the American Nurses Association, that are well designed, contain appropriate content, and use language professional nurses understand. The American Nurses Association and constituent state nurses associations provide information about nursing practice, ethics, credentialing, and health on Web sites. We conducted usability evaluations to determine compliance with heuristic and ethical principles for Web site design. We purposefully sampled 27 nursing association Web sites and used 68 heuristic and ethical criteria to perform systematic usability assessments of nurse association Web sites. The Web site analysis involved seven double experts, all RNs trained in usability analysis. The extent to which heuristic and ethical criteria were met ranged widely, from one state that met 0% of the criteria for "help and documentation" to states that met greater than 92% of criteria for "visibility of system status" and "aesthetic and minimalist design." Suggested improvements are simple yet make an impact on a first-time visitor's impression of the Web site. For example, adding internal navigation and tracking features and providing more details about the application process through help and frequently asked question documentation would facilitate better use. Improved usability will improve effectiveness, efficiency, and consumer satisfaction with these Web sites.

  5. Promoting Teachers' Positive Attitude towards Web Use: A Study in Web Site Development

    ERIC Educational Resources Information Center

    Akpinar, Yavuz; Bayramoglu, Yusuf

    2008-01-01

    The purpose of the study was to examine effects of a compact training for developing web sites on teachers' web attitude, as composed of: web self efficacy, perceived web enjoyment, perceived web usefulness and behavioral intention to use the web. To measure the related constructs, the Web Attitude Scale was adapted into Turkish and tested with a…

  6. Assessing Perceived Credibility of Web Sites in a Terrorism Context: The PFLP, Tamil Tigers, Hamas, and Hezbollah

    ERIC Educational Resources Information Center

    Spinks, Brandon Todd

    2009-01-01

    The purpose of the study was to contribute to the overall understanding of terrorist organizations' use of the Internet and to increase researchers' knowledge of Web site effectiveness. The methodological approach was evaluation of the perceived credibility of Web sites based on existing criteria derived from information users. The Web sites of…

  7. Beyond Electronic Brochures: An Analysis of Singapore Primary School Web Sites

    ERIC Educational Resources Information Center

    Hu, Chun; Soong, Andrew Kheng Fah

    2007-01-01

    This study aims to investigate how Singapore primary schools use their web sites, what kind of information is contained in the web sites, and how the information is presented. Based on an analysis of 176 primary school web sites, which represent all but one of the country's primary schools, findings indicate that most of Singapore's primary school…

  8. 76 FR 10072 - Proposed Generic Communications; Draft NRC Regulatory Issue Summary 2011-XX, Adequacy of Station...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-23

    ... electronic form will be posted on the NRC Web site and on the Federal Rulemaking Web site Regulations.gov... that they do not want publicly disclosed. Federal rulemaking Web site: Go to http://www.regulations.gov... through this Web site. Address questions about NRC dockets to Carol Gallagher, telephone: 301-492-3668, e...

  9. Security & Privacy Policy - Naval Oceanography Portal

    Science.gov Websites

    Notice: This is a U.S. Government Web Site 1. This is a World Wide Web site for official information information on this Web site are strictly prohibited and may be punishable under the Computer Fraud and Abuse Information Act (FOIA) | External Link Disclaimer This is an official U.S. Navy web site. Security &

  10. 75 FR 58411 - Medicare Program; Town Hall Meeting on the Physician Compare Web Site, October 27, 2010

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-24

    ...] Medicare Program; Town Hall Meeting on the Physician Compare Web Site, October 27, 2010 AGENCY: Centers for... establish a Physician Compare Web site by January 1, 2011. This notice announces a Town Hall meeting to discuss the Physician Compare Web site. The purpose of this Town Hall meeting is to solicit input from...

  11. 75 FR 32005 - National Emission Standards for Hazardous Air Pollutants for Major Sources: Industrial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-04

    ... Web site. E-mail: Comments may be sent by electronic mail (e-mail) to a-and-r[email protected] otherwise protected through http://www.regulations.gov or e-mail. The http://www.regulations.gov Web site is... Web site: http://www.epa.gov/airquality/combustion . Please refer to this Web site to confirm the date...

  12. Food and Beverage Brands that Market to Children and Adolescents on the Internet: A Content Analysis of Branded Web Sites

    ERIC Educational Resources Information Center

    Henry, Anna E.; Story, Mary

    2009-01-01

    Objective: To identify food and beverage brand Web sites featuring designated children's areas, assess marketing techniques present on those industry Web sites, and determine nutritional quality of branded food items marketed to children. Design: Systematic content analysis of food and beverage brand Web sites and nutrient analysis of food and…

  13. 12 CFR 611.1216 - Public availability of documents related to the termination.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the termination. (a) We may post on our Web site, or require you to post on your Web site: (1) Results... related transactions. (b) We will not post confidential information on our Web site and will not require you to post it on your Web site. (c) You may request that we treat specific information as...

  14. 12 CFR 611.1216 - Public availability of documents related to the termination.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... the termination. (a) We may post on our Web site, or require you to post on your Web site: (1) Results... related transactions. (b) We will not post confidential information on our Web site and will not require you to post it on your Web site. (c) You may request that we treat specific information as...

  15. 12 CFR 611.1216 - Public availability of documents related to the termination.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... the termination. (a) We may post on our Web site, or require you to post on your Web site: (1) Results... related transactions. (b) We will not post confidential information on our Web site and will not require you to post it on your Web site. (c) You may request that we treat specific information as...

  16. 12 CFR 611.1216 - Public availability of documents related to the termination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... the termination. (a) We may post on our Web site, or require you to post on your Web site: (1) Results... related transactions. (b) We will not post confidential information on our Web site and will not require you to post it on your Web site. (c) You may request that we treat specific information as...

  17. 75 FR 11972 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Order Approving...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ... Business Member's Web site may not reflect 100 percent of that member's volume for that ATS dark pool...' Web sites to see the total volume for any given ATS dark pool, and the TRF Business Members will make... they receive for each ATS dark pool on their Web site and must prominently disclose that the Web site...

  18. caCORE: a common infrastructure for cancer informatics.

    PubMed

    Covitz, Peter A; Hartel, Frank; Schaefer, Carl; De Coronado, Sherri; Fragoso, Gilberto; Sahni, Himanso; Gustafson, Scott; Buetow, Kenneth H

    2003-12-12

    Sites with substantive bioinformatics operations are challenged to build data processing and delivery infrastructure that provides reliable access and enables data integration. Locally generated data must be processed and stored such that relationships to external data sources can be presented. Consistency and comparability across data sets requires annotation with controlled vocabularies and, further, metadata standards for data representation. Programmatic access to the processed data should be supported to ensure the maximum possible value is extracted. Confronted with these challenges at the National Cancer Institute Center for Bioinformatics, we decided to develop a robust infrastructure for data management and integration that supports advanced biomedical applications. We have developed an interconnected set of software and services called caCORE. Enterprise Vocabulary Services (EVS) provide controlled vocabulary, dictionary and thesaurus services. The Cancer Data Standards Repository (caDSR) provides a metadata registry for common data elements. Cancer Bioinformatics Infrastructure Objects (caBIO) implements an object-oriented model of the biomedical domain and provides Java, Simple Object Access Protocol and HTTP-XML application programming interfaces. caCORE has been used to develop scientific applications that bring together data from distinct genomic and clinical science sources. caCORE downloads and web interfaces can be accessed from links on the caCORE web site (http://ncicb.nci.nih.gov/core). caBIO software is distributed under an open source license that permits unrestricted academic and commercial use. Vocabulary and metadata content in the EVS and caDSR, respectively, is similarly unrestricted, and is available through web applications and FTP downloads. http://ncicb.nci.nih.gov/core/publications contains links to the caBIO 1.0 class diagram and the caCORE 1.0 Technical Guide, which provide detailed information on the present caCORE architecture, data sources and APIs. Updated information appears on a regular basis on the caCORE web site (http://ncicb.nci.nih.gov/core).
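
    As a rough illustration of the kind of programmatic HTTP-XML access the caBIO interfaces are described as providing, the sketch below fetches and parses an XML record over HTTP. The endpoint URL and query parameters are placeholders invented for this example; they are not the documented caCORE/caBIO API.

        # Hypothetical HTTP-XML query; endpoint and parameters are placeholders.
        import xml.etree.ElementTree as ET
        import requests

        BASE_URL = "http://example.org/cabio/GetXML"  # placeholder, not the real service

        def fetch_gene_record(symbol: str) -> ET.Element:
            """Request an XML record for a gene symbol and return the parsed root element."""
            response = requests.get(BASE_URL, params={"query": "Gene", "symbol": symbol}, timeout=30)
            response.raise_for_status()
            return ET.fromstring(response.content)

        # Usage (requires a live service):
        # root = fetch_gene_record("TP53")
        # print(root.tag)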

  19. Openwebglobe 2: Visualization of Complex 3D-GEODATA in the (mobile) Webbrowser

    NASA Astrophysics Data System (ADS)

    Christen, M.

    2016-06-01

    Providing worldwide high-resolution data for virtual globes involves compute- and storage-intensive data processing tasks. Furthermore, rendering complex 3D-Geodata, such as 3D city models with an extremely high polygon count and a vast number of textures, at interactive framerates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large-scale, out-of-core, highly scalable 3D scene rendering on a web-based virtual globe. Cloud computing is used for processing large amounts of geospatial data and also for providing 2D and 3D map data to a large number of (mobile) web clients. The paper shows the approach for processing, rendering and caching very large datasets in the currently developed virtual globe "OpenWebGlobe 2", which displays 3D-Geodata on nearly every device.
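
    Pipelines that pre-process and cache global datasets for streaming to a web client commonly rely on quadtree tile addressing, so that each zoom level and region maps to a small, independently cacheable unit. The Python sketch below shows the standard Web Mercator ("slippy map") tiling computation as a general illustration of that technique; it is not taken from the OpenWebGlobe 2 code base, whose own tiling scheme may differ.

        import math

        def lonlat_to_tile(lon: float, lat: float, zoom: int) -> tuple:
            """Return (x, y) indices of the Web Mercator tile containing a WGS84 point."""
            n = 2 ** zoom
            x = int((lon + 180.0) / 360.0 * n)
            lat_rad = math.radians(lat)
            y = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
            return x, y

        # Prints the indices of the zoom-12 tile containing a point near Basel.
        print(lonlat_to_tile(7.59, 47.56, 12))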

  20. Effectiveness of off-line and web-based promotion of health information web sites.

    PubMed

    Jones, Craig E; Pinnock, Carole B

    2002-01-01

    The relative effectiveness of off-line and web-based promotional activities in increasing the use of health information web sites by target audiences was compared. Visitor sessions were classified according to their method of arrival at the site (referral) as external web site, search engine, or "no referrer" (i.e., a visitor arriving at the site by typing the URL or using bookmarks). The number of Australian visitor sessions correlated with "no referrer" referrals but not with web site or search-engine referrals. Results showed that the targeted consumer group is more likely to access the web site as a result of off-line promotional activities. The properties of target audiences likely to influence the effectiveness of off-line versus on-line promotional strategies include the size of the Internet-using population of the target audience, their proficiency in the use of the Internet, and the increase in effectiveness of off-line promotional activities when applied to locally defined target audiences.
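
    The session classification used in the study (external web site, search engine, or "no referrer") can be reproduced from standard web-server logs by inspecting the HTTP referrer of each session's first request. The Python sketch below illustrates that rule; the list of search-engine hosts is a small, hypothetical sample rather than the study's actual list.

        from urllib.parse import urlparse

        # Illustrative, non-exhaustive set of search-engine hosts.
        SEARCH_ENGINE_HOSTS = {"www.google.com", "search.yahoo.com", "www.bing.com"}

        def classify_referral(referer, own_host):
            """Classify a session as 'no referrer', 'internal', 'search engine', or 'external web site'."""
            if not referer:
                return "no referrer"      # typed URL or bookmark
            host = urlparse(referer).netloc.lower()
            if host == own_host:
                return "internal"         # navigation within the same site
            if host in SEARCH_ENGINE_HOSTS:
                return "search engine"
            return "external web site"

        print(classify_referral(None, "www.example-health.org"))
        print(classify_referral("https://www.google.com/search?q=health", "www.example-health.org"))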

  1. A Framework for Web Usage Mining in Electronic Government

    NASA Astrophysics Data System (ADS)

    Zhou, Ping; Le, Zhongjian

    Web usage mining has been a major component of management strategy to enhance organizational analysis and decision making. The literature on Web usage mining that deals with strategies and technologies for effectively employing Web usage mining is quite vast. In recent years, e-government has received much attention from researchers and practitioners. Huge amounts of user access data are produced on e-government Web sites every day. The role of these data in the success of government management cannot be overstated because they affect government analysis, prediction, and strategic, tactical, and operational planning and control. Web usage mining in e-government has an important role to play in setting government objectives, discovering citizen behavior, and determining future courses of action. Web usage mining in e-government has not, however, received adequate attention from researchers or practitioners. We developed a framework to promote a better understanding of the importance of Web usage mining in e-government. Using the current literature, we developed the framework presented herein, in hopes that it would stimulate more interest in this important area.

  2. Trends in access of plant biodiversity data revealed by Google Analytics

    PubMed Central

    Baxter, David G.; Hagedorn, Gregor; Legler, Ben; Gilbert, Edward; Thiele, Kevin; Vargas-Rodriguez, Yalma; Urbatsch, Lowell E.

    2014-01-01

    The amount of plant biodiversity data available via the web has exploded in the last decade, but making these data available requires a considerable investment of time and work, both vital considerations for organizations and institutions looking to validate the impact factors of these online works. Here we used Google Analytics (GA) to measure the value of this digital presence. In this paper we examine usage trends using 15 different GA accounts, spread across 451 institutions or botanical projects that comprise over five percent of the world's herbaria. Usage was examined both over the most recent year and across all years of data. User data from the sample reveal: 1) over 17 million web sessions, 2) on five primary operating systems, 3) search and direct traffic dominate, with minimal impact from social media, 4) mobile and new device types have doubled each year for the past three years, 5) and web browsers, the tools we use to interact with the web, are changing. Server-side analytics differ from site to site, making the comparison of their data sets difficult. However, use of Google Analytics erases the reporting heterogeneity of unique server-side analytics, as they can now be examined with a standard that provides clarity for data-driven decisions. The knowledge gained here empowers any collection-based environment, regardless of size, with metrics about usability, design, and possible directions for future development. PMID:25425933

  3. Trends in access of plant biodiversity data revealed by Google Analytics.

    PubMed

    Jones, Timothy Mark; Baxter, David G; Hagedorn, Gregor; Legler, Ben; Gilbert, Edward; Thiele, Kevin; Vargas-Rodriguez, Yalma; Urbatsch, Lowell E

    2014-01-01

    The amount of plant biodiversity data available via the web has exploded in the last decade, but making these data available requires a considerable investment of time and work, both vital considerations for organizations and institutions looking to validate the impact factors of these online works. Here we used Google Analytics (GA) to measure the value of this digital presence. In this paper we examine usage trends using 15 different GA accounts, spread across 451 institutions or botanical projects that comprise over five percent of the world's herbaria. Usage was examined both over the most recent year and across all years of data. User data from the sample reveal: 1) over 17 million web sessions, 2) on five primary operating systems, 3) search and direct traffic dominate, with minimal impact from social media, 4) mobile and new device types have doubled each year for the past three years, 5) and web browsers, the tools we use to interact with the web, are changing. Server-side analytics differ from site to site, making the comparison of their data sets difficult. However, use of Google Analytics erases the reporting heterogeneity of unique server-side analytics, as they can now be examined with a standard that provides clarity for data-driven decisions. The knowledge gained here empowers any collection-based environment, regardless of size, with metrics about usability, design, and possible directions for future development.
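
    Combining exports from many separate GA accounts into the pooled figures reported above is essentially a concatenate-and-aggregate step. The pandas sketch below shows one way to do it under assumed file names and column labels (one exported CSV per account, with "sessions" and "deviceCategory" columns); the actual export layout used by the study may differ.

        import glob
        import pandas as pd

        frames = []
        for path in glob.glob("ga_exports/*.csv"):   # one CSV per GA account/site (assumed layout)
            df = pd.read_csv(path)                   # assumed columns: date, sessions, deviceCategory
            df["site"] = path
            frames.append(df)

        all_sessions = pd.concat(frames, ignore_index=True)
        total_sessions = all_sessions["sessions"].sum()
        by_device = all_sessions.groupby("deviceCategory")["sessions"].sum().sort_values(ascending=False)

        print(f"Total web sessions across all sites: {total_sessions}")
        print(by_device)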

  4. Accessing the public MIMIC-II intensive care relational database for clinical research

    PubMed Central

    2013-01-01

    Background The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. Results QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge “Predicting mortality of ICU Patients”. Conclusions QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database. PMID:23302652
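
    The "simple example SQL queries" the abstract refers to can be run from Python against a local copy of the database, for instance inside the MIMIC-II virtual machine. The sketch below shows the general pattern; the connection settings and the table and column names are placeholders for illustration and are not necessarily the actual MIMIC-II schema.

        import psycopg2

        conn = psycopg2.connect(dbname="mimic2", user="mimic", host="localhost")  # placeholder credentials
        query = """
            SELECT icustay_id, COUNT(*) AS n_measurements
            FROM chart_events              -- placeholder table name
            GROUP BY icustay_id
            ORDER BY n_measurements DESC
            LIMIT 10;
        """
        with conn, conn.cursor() as cur:
            cur.execute(query)
            for icustay_id, n_measurements in cur.fetchall():
                print(icustay_id, n_measurements)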

  5. Intelligent Learning Infrastructure for Knowledge Intensive Organizations: A Semantic Web Perspective

    ERIC Educational Resources Information Center

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2005-01-01

    In the context of Knowledge Society, the convergence of knowledge and learning management is a critical milestone. "Intelligent Learning Infrastructure for Knowledge Intensive Organizations: A Semantic Web Perspective" provides state-of-the art knowledge through a balanced theoretical and technological discussion. The semantic web perspective…

  6. PHL7/441: Fixing a Broken Line between the Perceived "Anarchy" of the Web and a Process-Comfortable Pharmaceutical Company

    PubMed Central

    Vercellesi, L

    1999-01-01

    Introduction In 1998 a pharmaceutical company published its Web site to provide: an institutional presence; multifunctional information to primary customers and the general public; a new way of access to the company; a link to existing company-sponsored sites; and a platform for future projects. Since the publication, some significant integrations have been added; in particular, one is a primary interactive service addressed to a selected audience. The need has been felt to foster new projects and establish the idea of routinely considering the site as a potential tool in the marketing mix, to provide advanced services to customers. Methods Re-assessment of the site against its objectives. Assessment of its perception among the company's potential suppliers. Results The issue of "web use" was discussed in various management meetings; the trend of Internet use among the primary customers was known; the major concerns expressed were about staffing and return on investment for activities run on the Web. These perceptions are being addressed by making the company more comfortable through: running the site through a detailed process and clear procedures, defining a new process for maintaining the site that involves representatives of all functions; procedures and guidelines; a master file of approved answers and company contacts; categories of activities (information, promotion, education, information to investors, general services, target-specific services); and measures for all the activities run on the Web site. Specifically for the Web site, a concise periodical report is being assessed, covering: statistics about hits and mails, compared to the corporate data; indication of new items published; description by the "supplier" of new or ongoing innovative projects, to transfer best practice; basic figures on the Italian trend in Internet use, specifically in the pharmaceutical and medical fields; comments on a few competitor sites; and examples of potential uses deriving from other Web sites. Discussion The comparatively low use of the Internet in Italy has affected the systematic professional exploitation of the company site. The definition of "anarchic" commonly linked to the Web by local media has led to the attempt to "master" and "normalize" the site with a stricter approach than usual: most procedures and guidelines have been designed from scratch, as none were available for similar activities traditionally run. A short set of information has been requested for inclusion in the report: its wide coverage will help to give a flavour of the global parallel new world developing in the net. Hopefully this approach will help to create a comfortable attitude towards the medium in the whole organisation and to acquire working experience with the net.

  7. Information about liver transplantation on the World Wide Web.

    PubMed

    Hanif, F; Sivaprakasam, R; Butler, A; Huguet, E; Pettigrew, G J; Michael, E D A; Praseedom, R K; Jamieson, N V; Bradley, J A; Gibbs, P

    2006-09-01

    Orthotopic liver transplant (OLTx) has evolved into a successful surgical management for end-stage liver diseases. Awareness and information about OLTx are important tools in assisting OLTx recipients and people supporting them, including non-transplant clinicians. The study aimed to investigate the nature and quality of liver transplant-related patient information on the World Wide Web. Four common search engines were used to explore the Internet by using the key words 'Liver transplant'. The URLs (uniform resource locators) of the top 50 returns were chosen, as it was judged unlikely that the average user would search beyond the first 50 sites returned by a given search. Each Web site was assessed on the following categories: origin, language, accessibility and extent of the information. A weighted Information Score (IS) was created to assess the quality of clinical and educational value of each Web site and was scored independently by three transplant clinicians. The Internet search performed with the aid of the four search engines yielded a total of 2,255,244 Web sites. Of the 200 possible sites, only 58 Web sites were assessed because of repetition of the same Web sites and non-accessible links. The overall median weighted IS was 22 (IQR 1 - 42). Of the 58 Web sites analysed, 45 (77%) belonged to the USA, six (10%) were European, and seven (12%) were from the rest of the world. The median weighted IS of publications originating from Europe and the USA was 40 (IQR = 22 - 60) and 23 (IQR = 6 - 38), respectively. Although European Web sites produced a higher weighted IS [40 (IQR = 22 - 60)] as compared with the USA publications [23 (IQR = 6 - 38)], this was not statistically significant (p = 0.07). Web sites belonging to academic institutions and professional organizations scored significantly higher, with a median weighted IS of 28 (IQR = 16 - 44) and 24 (IQR = 12 - 35), respectively, as compared with the commercial Web sites (median = 6 with IQR of 0 - 14, p = .001). There was an Intraclass Correlation Coefficient (ICC) of 0.89 and an associated 95% CI (0.83, 0.93) for the three observers on the 58 Web sites. The study highlights the need for a significant improvement in the information available on the World Wide Web about OLTx. It concludes that the educational material currently available on the World Wide Web about liver transplant is of poor quality and requires rigorous input from health care professionals. The authors suggest that clinicians should pay more attention and take the necessary steps to improve the standard of information available on their relevant Web sites, and must take an active role in helping their patients find Web sites that provide the best and most accurate information specifically applicable to the loco-regional circumstances.

  8. 76 FR 48919 - NRC Enforcement Policy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-09

    ... Web site and on the Federal rulemaking Web site, http://www.regulations.gov . Because your comments... publicly disclosed. You may submit comments by any one of the following methods: Federal Rulemaking Web... Web site: Public comments and supporting materials related to this notice can be found at http://www...

  9. Creating Patient and Family Education Web Sites

    PubMed Central

    YADRICH, DONNA MACAN; FITZGERALD, SHARON A.; WERKOWITCH, MARILYN; SMITH, CAROL E.

    2013-01-01

    This article gives details about the methods and processes used to ensure that usability and accessibility were achieved during development of the Home Parenteral Nutrition Family Caregivers Web site, an evidence-based health education Web site for the family members and caregivers of chronically ill patients. This article addresses comprehensive definitions of usability and accessibility and illustrates Web site development according to Section 508 standards and the national Health and Human Services’ Research-Based Web Design and Usability Guidelines requirements. PMID:22024970

  10. Enhancing UCSF Chimera through web services.

    PubMed

    Huang, Conrad C; Meng, Elaine C; Morris, John H; Pettersen, Eric F; Ferrin, Thomas E

    2014-07-01

    Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
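
    The abstract mentions an example Python program for calling Opal-based web services; without reproducing Opal's actual client interface, the sketch below shows the generic submit/poll/retrieve pattern such a program follows. The service URL, endpoint paths, payload fields, and status values are placeholders invented for illustration.

        import time
        import requests

        SERVICE = "http://example.org/opal2/services/some_tool"   # placeholder URL

        def run_job(input_text: str) -> str:
            """Submit a job, poll until it finishes, and return the raw result."""
            job = requests.post(f"{SERVICE}/submit", json={"input": input_text}, timeout=30).json()
            job_id = job["id"]
            while True:
                state = requests.get(f"{SERVICE}/status/{job_id}", timeout=30).json()["state"]
                if state in ("done", "failed"):
                    break
                time.sleep(5)
            return requests.get(f"{SERVICE}/result/{job_id}", timeout=30).text

        # result = run_job(">seq1\nMKTAYIAKQR")   # requires a live service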

  11. Analysis of Internet information on the controversial X-Stop device.

    PubMed

    Anderson, Joshua T; Sullivan, T Barrett; Ahn, Uri M; Ahn, Nicholas U

    2014-10-01

    The Internet is frequently used by patients to aid in medical decision making. Multiple studies display the Internet's ineffectiveness in presenting high-quality information regarding surgical procedures and devices. With recent reports of unacceptably high complication rates and poor outcomes with the X-Stop device, it is important that online information is comprehensive and accurate. This study is the first to examine Internet information on the controversial X-Stop. To determine how accurately public information over the Internet portrays the existing primary literature on the X-Stop, how extensively the X-Stop is characterized online, and how patient decision making could foreseeably be affected. This cross-sectional study analyzed publicly available Internet information, including videos on the web site YouTube regarding the X-Stop device. No patients were involved in this study. No specific outcome measures were used. Search engines Google, Yahoo, and Bing were used to identify 105 web sites providing information on the X-Stop. Videos on the web site YouTube were included. Web sites were categorized based on the authorship. Each site was analyzed for the provision of appropriate patient inclusion and exclusion criteria, surgical and nonsurgical treatment alternatives, purported benefits, common complications, peer-reviewed literature citations, and descriptions/diagrams of the procedure. Data were evaluated for each authorship subgroup and the entire group of sites. Forty-three percent of sites were authored by a private medical group, 4% by an academic medical group, 16% by an insurance company, 9% by a biomedical industry, 10% by news sources, and 19% by other. Thirty-one percent of web sites and 11% of sites authored by private medical groups contained references to peer-reviewed literature. Fifty-six percent of web sites reported patient inclusion criteria, whereas 33% reported exclusion criteria. Benefits and complications were reported within 91% and 23% of sites, respectively. Surgical and nonsurgical treatment options were mentioned within 59% and 61% of web sites, respectively. Our study demonstrates the Internet's ineffectiveness in reporting quality information on the X-Stop. Information was often incomplete and potentially misleading. Significant controversy exists within primary literature regarding the safety and efficacy of the X-Stop. Yet, publicly available Internet information largely provided misinformation and did not reflect any such controversy. This raises the concern that such information lends itself more toward patient recruitment than patient education. Medical professionals need to know how this may affect their patients' decision making. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Mars Data analysis and visualization with Marsoweb

    NASA Astrophysics Data System (ADS)

    Gulick, V. G.; Deardorff, D. G.

    2003-04-01

    Marsoweb is a collaborative web environment that has been developed for the Mars research community to better visualize and analyze Mars orbiter data. Its goal is to enable online data discovery by providing an intuitive, interactive interface to data from the Mars Global Surveyor and other orbiters. Recently, Marsoweb has played a prominent role as a resource center for the site selection process for the Mars Exploration Rover 2003 missions. In addition to hosting a repository of landing site memoranda and workshop talks, it includes a Java-based interface to a variety of data maps and images. This interface enables the display and numerical querying of data, and allows data profiles to be rendered from user-drawn cross-sections. High-resolution Mars Orbiter Camera (MOC) images (currently, over 100,000) can be graphically perused; browser-based image processing tools can be used on MOC images of potential landing sites. An automated VRML atlas allows users to construct "flyovers" of their own regions-of-interest in 3D. These capabilities enable Marsoweb to be used for general global data studies, in addition to those specific to landing site selection. As of December 2002, Marsoweb had been viewed by 88,000 distinct users from NASA, USGS, academia, and the general public, with a total of 3.3 million hits (801,000 page requests in all). The High Resolution Imaging Experiment team for the Mars 2005 Orbiter (HiRISE, PI Alfred McEwen) plans to cast a wide net to collect targeting suggestions. Members of the general public as well as the broad Mars science community will be able to submit suggestions of high-resolution imaging targets. The web-based interface for target suggestion input (HiWeb) will be based upon Marsoweb (http://marsoweb.nas.nasa.gov).

  13. Web-based volcano monitoring data from the Pu‘u ‘O‘o eruptive vent (Kilauea Volcano, Hawai‘i) as a tool for geoscience education

    NASA Astrophysics Data System (ADS)

    Poland, M. P.; Townson, R.; Loren, A.; Brooks, B. A.; Foster, J. H.

    2009-12-01

    A significant challenge in college and university geoscience courses is conveying the dynamic nature of the Earth to students. The Internet, however, offers an opportunity to engage classes by making accessible the best examples of current geologic activity, regardless of location. In volcanology, Kilauea, Hawai‘i, is well known as one of the most active volcanoes in the world, and the Web site for the U.S. Geological Survey’s Hawaiian Volcano Observatory offers a daily update of volcanic activity that is followed by people around the globe. The Pu‘u ‘O‘o eruptive vent, on Kilauea‘s east rift zone, has been the focus of near continuous eruption since 1983, experiencing cycles of growth and collapse, high lava fountains, lava lakes, and other phenomena over the course of its existence. To track volcanic activity, various types of monitoring instruments have been installed on and around Pu‘u ‘O‘o, including (as of August 2009) two webcams, one short-period seismometer, one broadband seismometer, seven continuous GPS stations, and two continuous borehole tiltmeters. Monitoring data from Pu‘u ‘O‘o will be made available via the Internet as part of a collaborative research and education project between the Hawaiian Volcano Observatory, National Aeronautics and Space Administration, and University of Hawai‘i at Mānoa. The educational Web site is intended for use in college and university courses, from introductory science classes to graduate-level seminars. Scheduled to come on line by fall 2009, the Web site will provide tools to explore current monitoring results from the eruptive vent. Geophysical data, such as GPS, seismic, and tilt measurements, will be accessible via a time-series query tool, and the complete archive of webcam imagery will be available for examination of visual changes in volcanic activity over time. The Web site will also include background information and references concerning the 1983-present eruption, descriptions of monitoring tools, and resources for instructors. The goal of this project is to demonstrate the dynamic nature of the Earth, promote excitement about the process of scientific discovery, and inspire the next generation of Earth scientists. To encourage use of the Web site, a workshop will be held in mid-2010 to develop curricula for various levels of college and university courses.

  14. EarthInquiry: Using On-Line Data to Help Students Explore Fundamental Concepts in Geoscience

    NASA Astrophysics Data System (ADS)

    Alfano, M.; Keane, C. M.; Ridky, R. W.

    2002-12-01

    Using local case studies to learn about earth processes increases the relevance of science instruction. Students are encouraged to think about how geological processes affect their lives and experiences. Today, with many global data sets available on-line, instructors have unprecedented opportunities to bring local data into the classroom. However, while the resources are available, using on-line data presents a particular set of challenges. Access and entry points to web sites frequently change, and data formats can be unpredictable. Often, instructors are faced with non-functional web sites on the day, or week, that they plan to assign a given activity. The American Geological Institute, with the participation of numerous geoscience professors, has developed EarthInquiry, a series of activities that utilize the abundant real-time and archived geoscience data available on-line. These modules are developed primarily for introductory college students. EarthInquiry modules follow a structured format, beginning with familiar examples at the global and national level to introduce students to the on-line data and the EarthInquiry web site. The web site offers detailed and up-to-date instructions on how to access the data, cached copies of sample data that can be used to complete each activity in the event of a network outage, and an assessment activity that helps students determine how well they have achieved an understanding of key concepts. The EarthInquiry booklet contains a series of engaging questions that allow students to solve problems in a scientific manner. As students gain content understanding and confidence in the requisite analysis, they examine the presented material at a more local level. In one activity, students explore the recurrence interval of a local stream. In other activities, they investigate the mineral resources and earthquake histories of their state. All modules are developed with the intent of building an appropriate cognitive foundation, while complementing the topics typically discussed in an introductory physical or environmental geology course. The project is a collaboration of the American Geological Institute and W.H. Freeman and Company Publishers.

  15. Surfing for scoliosis: the quality of information available on the Internet.

    PubMed

    Mathur, Sameer; Shanti, Nael; Brkaric, Mario; Sood, Vivek; Kubeck, Justin; Paulino, Carl; Merola, Andrew A

    2005-12-01

    A cross section of Web sites accessible to the general public was surveyed. To evaluate the quality and accuracy of information on scoliosis that a patient might access on the Internet. The Internet is a rapidly expanding communications network with an estimated 765 million users worldwide by the year 2005. Medical information is one of the most common subjects of inquiries on the Web. More than 100 million Americans accessed the Internet for medical information in the year 2000. Undoubtedly, the use of the Internet for patient information needs will continue to expand as Internet access becomes more readily available. This expansion, combined with the Internet's poorly regulated format, can lead to problems in the quality of information available. Since the Internet operates on a global scale, implementing and enforcing standards has been difficult. The largely uncontrolled information can potentially negatively influence consumer health outcomes. To identify potential sites, five search engines were selected and the word "scoliosis" was entered into each search engine. A total of 50 Web sites were chosen for review. Each Web site was evaluated according to the type of Web site, quality content, and informational accuracy by three board-certified academic orthopedic surgeons, fellowship trained in spinal surgery, each of whom has been in practice for a minimum of 8 years. Each Web site was categorized as academic, commercial, physician, nonphysician health professional, or unidentified. In addition, each Web site was evaluated according to scoliosis-specific content using a point value system of 32 disease-specific key words pertinent to the care of scoliosis on an ordinal scale. A list of these words is given. Point values were given for the use of key words related to disease summary, classifications, treatment options, and complications. The accuracy of the individual Web site was evaluated by each spine surgeon using a scale of 1 to 4. A score of 1 represents that the examiner agreed with less than 25% of the information, while a score of 4 represents greater than 75% agreement. Of the total 50 Web sites evaluated, 44% were academic, 18% were physician based, 16% were commercial, 12% were unidentified, and 10% were nonphysician health professionals. The quality content score (maximum, 32 points) for academic sites was 12.6 +/- 3.8, physician sites 11.3 +/- 4.0, commercial sites 11 +/- 4.2, unidentified sites 7.6 +/- 3.9, and nonphysician health professional sites 7.0 +/- 1.8. The accuracy score (maximum, 12 points) was 6.6 +/- 2.4 for academic sites, 6.3 +/- 3.0 for physician-professional sites, 6.0 +/- 2.7 for unidentified sites, 5.5 +/- 3.8 for nonphysician professional sites, and 5.0 +/- 1.5 for commercial Web sites. The academic Web sites had the highest mean scores in both quality and accuracy content scores. The information about scoliosis on the Internet is of limited quality and poor information value. Although the majority of the Web sites were academic, the content quality and accuracy scores were still poor. The lowest scoring Web sites were the nonphysician professionals and the unidentified sites, which were often message boards. Overall, the highest scoring Web site related to both quality and accuracy of information was www.srs.org. This Web site was designed by the Scoliosis Research Society. The public and the medical communities need to be aware of these existing limitations of the Internet. Based on our review, physicians must assume primary responsibility for educating and counseling their patients.

  16. ProBiS tools (algorithm, database, and web servers) for predicting and modeling of biologically interesting proteins.

    PubMed

    Konc, Janez; Janežič, Dušanka

    2017-09-01

    ProBiS (Protein Binding Sites) Tools consist of algorithm, database, and web servers for prediction of binding sites and protein ligands based on the detection of structurally similar binding sites in the Protein Data Bank. In this article, we review the operations that ProBiS Tools perform, provide comments on the evolution of the tools, and give some implementation details. We review some of its applications to biologically interesting proteins. ProBiS Tools are freely available at http://probis.cmm.ki.si and http://probis.nih.gov. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. e-Ana and e-Mia: A Content Analysis of Pro–Eating Disorder Web Sites

    PubMed Central

    Schenk, Summer; Wilson, Jenny L.; Peebles, Rebecka

    2010-01-01

    Objectives. The Internet offers Web sites that describe, endorse, and support eating disorders. We examined the features of pro–eating disorder Web sites and the messages to which users may be exposed. Methods. We conducted a systematic content analysis of 180 active Web sites, noting site logistics, site accessories, “thinspiration” material (images and prose intended to inspire weight loss), tips and tricks, recovery, themes, and perceived harm. Results. Practically all (91%) of the Web sites were open to the public, and most (79%) had interactive features. A large majority (84%) offered pro-anorexia content, and 64% provided pro-bulimia content. Few sites focused on eating disorders as a lifestyle choice. Thinspiration material appeared on 85% of the sites, and 83% provided overt suggestions on how to engage in eating-disordered behaviors. Thirty-eight percent of the sites included recovery-oriented information or links. Common themes were success, control, perfection, and solidarity. Conclusions. Pro–eating disorder Web sites present graphic material to encourage, support, and motivate site users to continue their efforts with anorexia and bulimia. Continued monitoring will offer a valuable foundation to build a better understanding of the effects of these sites on their users. PMID:20558807

  18. A GIS-Interface Web Site: Exploratory Learning for Geography Curriculum

    ERIC Educational Resources Information Center

    Huang, Kuo Hung

    2011-01-01

    Although Web-based instruction provides learners with sufficient resources for self-paced learning, previous studies have confirmed that browsing navigation-oriented Web sites possibly hampers users' comprehension of information. Web sites designed as "categories of materials" for navigation demand more cognitive effort from users to orient their…

  19. Digital Discernment: An E-Commerce Web Site Evaluation Tool

    ERIC Educational Resources Information Center

    Sigman, Betsy Page; Boston, Brian J.

    2013-01-01

    Students entering the business workforce today may well share some responsibility for developing, revising, or evaluating their company's Web site. They may lack the experience, however, to critique their employer's Web presence effectively. The purpose of developing Digital Discernment, an e-commerce Web site evaluation tool, was to prepare…

  20. Shakespeare Goes Online: Web Resources for Teaching Shakespeare.

    ERIC Educational Resources Information Center

    Schuetz, Carol L.

    This annotated bibliography contains five sections and 62 items. The first section lists general resources including six Web site addresses; the second section, on Shakespeare's works, contains five Web site addresses; the third section, on Shakespeare and the Globe Theatre, provides five Web site addresses; the fourth section presents classroom…
