12 CFR Appendix A to Part 40 - Model Privacy Form
Code of Federal Regulations, 2011 CFR
2011-01-01
... the term “Social Security number” in the first bullet. (2) Institutions must use five (5) of the...; a Web site; or use of a mail-in opt-out form. Institutions may include the words “toll-free” before... specific Web address that takes consumers directly to the opt-out page or a general Web address that...
12 CFR Appendix A to Part 716 - Model Privacy Form
Code of Federal Regulations, 2011 CFR
2011-01-01
... the term “Social Security number” in the first bullet. (2) Institutions must use five (5) of the...; a Web site; or use of a mail-in opt-out form. Institutions may include the words “toll-free” before... specific Web address that takes consumers directly to the opt-out page or a general Web address that...
12 CFR Appendix A to Part 573 - Model Privacy Form
Code of Federal Regulations, 2011 CFR
2011-01-01
... the term “Social Security number” in the first bullet. (2) Institutions must use five (5) of the...; a Web site; or use of a mail-in opt-out form. Institutions may include the words “toll-free” before... specific Web address that takes consumers directly to the opt-out page or a general Web address that...
12 CFR Appendix A to Part 216 - Model Privacy Form
Code of Federal Regulations, 2011 CFR
2011-01-01
... information that the institution collects and shares. All institutions must use the term “Social Security... appropriate. An institution that allows consumers to opt out online must provide either a specific Web address that takes consumers directly to the opt-out page or a general Web address that provides a clear and...
17 CFR Appendix A to Part 160 - Model Privacy Form
Code of Federal Regulations, 2011 CFR
2011-04-01
... institutions must use the term “Social Security number” in the first bullet. (2) Institutions must use five (5... consumers to opt out online must provide either a specific Web address that takes consumers directly to the opt-out page or a general Web address that provides a clear and conspicuous direct link to the opt-out...
Developing a Web-Based Intervention to Prevent Drug Use among Adolescent Girls
Schwinn, Traci Marie; Hopkins, Jessica Elizabeth; Schinke, Steven Paul
2014-01-01
Objectives Girls' rates of drug use have caught up with, and in some instances surpassed, boys' rates. Though girls and boys share risk and protective factors associated with drug use, girls also have gender-specific risks. Interventions to prevent girls' drug use must be tailored to address the dynamics of female adolescence. Methods One such intervention, called RealTeen, is a 9-session, web-based drug abuse prevention program designed to address gender-specific risk factors associated with young girls' drug use, such as depressed mood, low self-esteem, and high levels of perceived stress, as well as the general risk factors of peer and social influences. Web-based delivery enables girls to interact with the program at their own pace and in a location of their choosing. Implications This paper describes the processes and challenges associated with developing and programming a gender-specific, web-based intervention to prevent drug use among adolescent girls. PMID:26778909
Higher Secondary Learners' Effectiveness towards Web Based Instruction (WBI) on Chemistry
ERIC Educational Resources Information Center
Sudha, A.; Amutha, S.
2015-01-01
Web-based training is becoming a phenomenon in education today because of its flexibility and convenience; it is therefore vitally important to address those issues that adversely impact retention and success in this environment. To generate principles of effective asynchronous web-based materials specifically applicable for secondary level students based…
Overview of the World Wide Web Consortium (W3C) (SIGs IA, USE).
ERIC Educational Resources Information Center
Daly, Janet
2000-01-01
Provides an overview of a planned session to describe the work of the World Wide Web Consortium, including technical specifications for HTML (Hypertext Markup Language), XML (Extensible Markup Language), CSS (Cascading Style Sheets), and over 20 other Web standards that address graphics, multimedia, privacy, metadata, and other technologies. (LRW)
Mental Constructions and Constructions of Web Sites: Learner and Teacher Points of View
ERIC Educational Resources Information Center
Hazzan, Orit
2004-01-01
This research focuses on knowledge and ways in which knowledge may be constructed in the learner's mind. Specifically, it addresses the Web as a cognitive supporter for learning, organising and constructing a new domain of knowledge. In particular, the research analyses student reflection on constructing web sites. The analysis is based on an…
Manole, Bogdan-Alexandru; Wakefield, Daniel V; Dove, Austin P; Dulaney, Caleb R; Marcrom, Samuel R; Schwartz, David L; Farmer, Michael R
2017-12-24
The purpose of this study was to survey the accessibility and quality of prostate-specific antigen (PSA) screening information from National Cancer Institute (NCI) cancer center and public health organization Web sites. We surveyed the December 1, 2016, version of all 63 NCI-designated cancer center public Web sites and 5 major online clearinghouses from allied public/private organizations (cancer.gov, cancer.org, PCF.org, USPSTF.org, and CDC.gov). Web sites were analyzed according to a 50-item list of validated health care information quality measures. Web sites were graded by 2 blinded reviewers. Interrater agreement was confirmed by Cohen kappa coefficient. Ninety percent of Web sites addressed PSA screening. Cancer center sites covered 45% of topics surveyed, whereas organization Web sites addressed 70%. All organizational Web pages addressed the possibility of false-positive screening results; 41% of cancer center Web pages did not. Forty percent of cancer center Web pages also did not discuss next steps if a PSA test was positive. Only 6% of cancer center Web pages were rated by our reviewers as "superior" (eg, addressing >75% of the surveyed topics) versus 20% of organizational Web pages. Interrater agreement between our reviewers was high (kappa coefficient = 0.602). NCI-designated cancer center Web sites publish lower quality public information about PSA screening than sites run by major allied organizations. Nonetheless, information and communication deficiencies were observed across all surveyed sites. In an age of increasing patient consumerism, prospective prostate cancer patients would benefit from improved online PSA screening information from provider and advocacy organizations. Validated cancer patient Web educational standards remain an important, understudied priority. Copyright © 2018. Published by Elsevier Inc.
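The interrater agreement statistic reported above (Cohen kappa coefficient = 0.602) can be computed directly from the two reviewers' item-level grades. The following is a minimal sketch assuming binary addressed/not-addressed ratings and invented counts; it does not use the study's actual data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and
    p_e is the agreement expected by chance from each rater's marginal rates.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical item-level grades (1 = topic addressed, 0 = not addressed)
reviewer_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
reviewer_2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
print(round(cohens_kappa(reviewer_1, reviewer_2), 3))
```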
Enterprise Considerations for Ports and Protocols
2016-10-21
selected communications. These protocols are restricted to specific ports or addresses in the receiving web service. HTTPS is familiarly restricted...in use by the web services and applications that are connected to the network are required for interoperability and security. Policies specify the...network or reside at the end-points (i.e., web services or clients). ____________________________ Manuscript received June 1, 2016; revised July
16 CFR 0.2 - Official address.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Official address. 0.2 Section 0.2 Commercial Practices FEDERAL TRADE COMMISSION ORGANIZATION, PROCEDURES AND RULES OF PRACTICE ORGANIZATION § 0.2... 20580, unless otherwise specifically directed. The Commission's Web site address is www.ftc.gov. [63 FR...
16 CFR 0.2 - Official address.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Official address. 0.2 Section 0.2 Commercial Practices FEDERAL TRADE COMMISSION ORGANIZATION, PROCEDURES AND RULES OF PRACTICE ORGANIZATION § 0.2... 20580, unless otherwise specifically directed. The Commission's Web site address is www.ftc.gov. [63 FR...
ERIC Educational Resources Information Center
Ward, Stephen
2015-01-01
This study sought to understand the impact of self-efficacy and professional development on the implementation of specific Web 2.0 tools in the elementary classroom. There were three research questions addressed in this QUAN-Qual study. Quantitative data were collected through three surveys with 48 total participants: the Web 2.0 tools Utilization…
Oh! Web 2.0, Virtual Reference Service 2.0, Tools & Techniques (II)
ERIC Educational Resources Information Center
Arya, Harsh Bardhan; Mishra, J. K.
2012-01-01
The paper describes the theory and definition of the practice of librarianship, specifically addressing how Web 2.0 technologies (tools) such as synchronous messaging, collaborative reference service and streaming media, blogs, wikis, social networks, social bookmarking tools, tagging, RSS feeds, and mashups might intimate changes and how…
17 CFR Appendix A to Subpart A of... - Forms
Code of Federal Regulations, 2011 CFR
2011-04-01
... information that the institution collects and shares. All institutions must use the term “Social Security... the applicable opt-out methods described: telephone, such as by a toll-free number; a Web site; or use... appropriate. An institution that allows consumers to opt out online must provide either a specific Web address...
12 CFR Appendix A to Part 332 - Model Privacy Form
Code of Federal Regulations, 2011 CFR
2011-01-01
... information that the institution collects and shares. All institutions must use the term “Social Security... the applicable opt-out methods described: Telephone, such as by a toll-free number; a Web site; or use... appropriate. An institution that allows consumers to opt out online must provide either a specific Web address...
The Readability of Information Literacy Content on Academic Library Web Sites
ERIC Educational Resources Information Center
Lim, Adriene
2010-01-01
This article reports on a study addressing the readability of content on academic libraries' Web sites, specifically content intended to improve users' information literacy skills. Results call for recognition of readability as an evaluative component of text in order to better meet the needs of diverse user populations. (Contains 8 tables.)
2016-06-01
of technology and near-global Internet accessibility, a web-based program incorporating interactive maps to record personal combat experiences does...not exist. The Combat Stories Map addresses this deficiency. The Combat Stories Map is a web-based Geographic Information System specifically designed...
DOORS to the semantic web and grid with a PORTAL for biomedical computing.
Taswell, Carl
2008-03-01
The semantic web remains in the early stages of development. It has not yet achieved the goals envisioned by its founders as a pervasive web of distributed knowledge and intelligence. Success will be attained when a dynamic synergism can be created between people and a sufficient number of infrastructure systems and tools for the semantic web in analogy with those for the original web. The domain name system (DNS), web browsers, and the benefits of publishing web pages motivated many people to register domain names and publish web sites on the original web. An analogous resource label system, semantic search applications, and the benefits of collaborative semantic networks will motivate people to register resource labels and publish resource descriptions on the semantic web. The Domain Ontology Oriented Resource System (DOORS) and Problem Oriented Registry of Tags and Labels (PORTAL) are proposed as infrastructure systems for resource metadata within a paradigm that can serve as a bridge between the original web and the semantic web. The Internet Registry Information Service (IRIS) registers [corrected] domain names while DNS publishes domain addresses with mapping of names to addresses for the original web. Analogously, PORTAL registers resource labels and tags while DOORS publishes resource locations and descriptions with mapping of labels to locations for the semantic web. BioPORT is proposed as a prototype PORTAL registry specific for the problem domain of biomedical computing.
16 CFR Appendix A to Part 313 - Model Privacy Form
Code of Federal Regulations, 2011 CFR
2011-01-01
... “Social Security number” in the first bullet. (2) Institutions must use five (5) of the following terms to... the applicable opt-out methods described: telephone, such as by a toll-free number; a Web site; or use... appropriate. An institution that allows consumers to opt out online must provide either a specific Web address...
Moby and Moby 2: creatures of the deep (web).
Vandervalk, Ben P; McCarthy, E Luke; Wilkinson, Mark D
2009-03-01
Facile and meaningful integration of data from disparate resources is the 'holy grail' of bioinformatics. Some resources have begun to address this problem by providing their data using Semantic Web standards, specifically the Resource Description Framework (RDF) and the Web Ontology Language (OWL). Unfortunately, adoption of Semantic Web standards has been slow overall, and even in cases where the standards are being utilized, interconnectivity between resources is rare. In response, we have seen the emergence of centralized 'semantic warehouses' that collect public data from third parties, integrate it, translate it into OWL/RDF and provide it to the community as a unified and queryable resource. One limitation of the warehouse approach is that queries are confined to the resources that have been selected for inclusion. A related problem, perhaps of greater concern, is that the majority of bioinformatics data exists in the 'Deep Web'-that is, the data does not exist until an application or analytical tool is invoked, and therefore does not have a predictable Web address. The inability to utilize Uniform Resource Identifiers (URIs) to address this data is a barrier to its accessibility via URI-centric Semantic Web technologies. Here we examine 'The State of the Union' for the adoption of Semantic Web standards in the health care and life sciences domain by key bioinformatics resources, explore the nature and connectivity of several community-driven semantic warehousing projects, and report on our own progress with the CardioSHARE/Moby-2 project, which aims to make the resources of the Deep Web transparently accessible through SPARQL queries.
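To illustrate the Semantic Web standards discussed above, the sketch below builds a tiny RDF graph and runs a SPARQL query over it with the rdflib Python library. The namespace, predicates, and gene identifiers are invented for the example and are not part of Moby, SHARE, or any real bioinformatics resource.

```python
from rdflib import Graph, Literal, Namespace

# Hypothetical namespace and resources, purely for illustration
EX = Namespace("http://example.org/bio/")

g = Graph()
g.add((EX.BRCA2, EX.encodes, EX.BRCA2_protein))
g.add((EX.BRCA2_protein, EX.involvedIn, Literal("DNA repair")))
g.add((EX.TP53, EX.encodes, EX.TP53_protein))
g.add((EX.TP53_protein, EX.involvedIn, Literal("apoptosis")))

# SPARQL query: which genes encode a protein involved in DNA repair?
query = """
PREFIX ex: <http://example.org/bio/>
SELECT ?gene WHERE {
    ?gene ex:encodes ?protein .
    ?protein ex:involvedIn "DNA repair" .
}
"""
for row in g.query(query):
    print(row.gene)
```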
StreamStats in Georgia: a water-resources web application
Gotvald, Anthony J.; Musser, Jonathan W.
2015-07-31
StreamStats is being implemented on a State-by-State basis to allow for customization of the data development and underlying datasets to address each State's specific needs, issues, and objectives. The USGS, in cooperation with the Georgia Environmental Protection Division and Georgia Department of Transportation, has implemented StreamStats for Georgia. The Georgia StreamStats Web site is available through the national StreamStats Web-page portal at http://streamstats.usgs.gov. Links are provided on this Web page for individual State applications, instructions for using StreamStats, definitions of basin characteristics and streamflow statistics, and other supporting information.
The inverse niche model for food webs with parasites
Warren, Christopher P.; Pascual, Mercedes; Lafferty, Kevin D.; Kuris, Armand M.
2010-01-01
Although parasites represent an important component of ecosystems, few field and theoretical studies have addressed the structure of parasites in food webs. We evaluate the structure of parasitic links in an extensive salt marsh food web, with a new model distinguishing parasitic links from non-parasitic links among free-living species. The proposed model is an extension of the niche model for food web structure, motivated by the potential role of size (and related metabolic rates) in structuring food webs. The proposed extension captures several properties observed in the data, including patterns of clustering and nestedness, better than does a random model. By relaxing specific assumptions, we demonstrate that two essential elements of the proposed model are the similarity of a parasite's hosts and the increasing degree of parasite specialization, along a one-dimensional niche axis. Thus, inverting one of the basic rules of the original model, the one determining consumers' generality appears critical. Our results support the role of size as one of the organizing principles underlying niche space and food web topology. They also strengthen the evidence for the non-random structure of parasitic links in food webs and open the door to addressing questions concerning the consequences and origins of this structure.
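For readers unfamiliar with the niche model that the proposed extension modifies, the following is a minimal sketch of the original niche model of Williams and Martinez: species are ordered on one niche axis and each consumer eats everything within a feeding range whose expected size is set by connectance. Parameter values here are illustrative, and the inverse niche extension for parasites described in the abstract is not implemented.

```python
import numpy as np

def niche_model(n_species, connectance, rng=np.random.default_rng(0)):
    """Generate a food web adjacency matrix with the original niche model.

    Each species i gets a niche value n_i ~ U(0,1), a feeding range
    r_i = x * n_i with x ~ Beta(1, beta) chosen so E[x] = 2C, and a range
    centre c_i ~ U(r_i/2, n_i). Species i eats every j whose n_j falls
    inside [c_i - r_i/2, c_i + r_i/2].
    """
    beta = 1.0 / (2.0 * connectance) - 1.0
    n = rng.uniform(0, 1, n_species)
    r = n * rng.beta(1, beta, n_species)
    c = rng.uniform(r / 2, n)
    adjacency = np.zeros((n_species, n_species), dtype=int)
    for i in range(n_species):
        low, high = c[i] - r[i] / 2, c[i] + r[i] / 2
        adjacency[i] = (n >= low) & (n <= high)
    return adjacency

web = niche_model(n_species=30, connectance=0.12)
print("links:", web.sum(), "realized connectance:", web.sum() / 30**2)
```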
Secure password-based authenticated key exchange for web services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liang, Fang; Meder, Samuel; Chevassut, Olivier
This paper discusses an implementation of an authenticated key-exchange method rendered on message primitives defined in the WS-Trust and WS-SecureConversation specifications. This IEEE-specified cryptographic method (AuthA) is proven-secure for password-based authentication and key exchange, while WS-Trust and WS-SecureConversation are emerging Web Services Security specifications that extend the WS-Security specification. A prototype of the presented protocol is integrated in the WSRF-compliant Globus Toolkit V4. Further hardening of the implementation is expected to result in a version that will be shipped with future Globus Toolkit releases. This could help to address the current unavailability of decent shared-secret-based authentication options in the Web Services and Grid world. Future work will be to integrate One-Time-Password (OTP) features in the authentication protocol.
22 CFR 503.2 - Making a request.
Code of Federal Regulations, 2010 CFR
2010-04-01
... addressed to The Broadcasting Board of Governors (BBG), FOIA/Privacy Act Officer, Office of the General... the Internet on BBG's World Wide Web site (http://www.ibb.gov). The more specific the request for...
Harvest: a web-based biomedical data discovery and reporting application development platform.
Italia, Michael J; Pennington, Jeffrey W; Ruth, Byron; Wrazien, Stacey; Loutrel, Jennifer G; Crenshaw, E Bryan; Miller, Jeffrey; White, Peter S
2013-01-01
Biomedical researchers share a common challenge of making complex data understandable and accessible. This need is increasingly acute as investigators seek opportunities for discovery amidst an exponential growth in the volume and complexity of laboratory and clinical data. To address this need, we developed Harvest, an open source framework that provides a set of modular components to aid the rapid development and deployment of custom data discovery software applications. Harvest incorporates visual representations of multidimensional data types in an intuitive, web-based interface that promotes a real-time, iterative approach to exploring complex clinical and experimental data. The Harvest architecture capitalizes on standards-based, open source technologies to address multiple functional needs critical to a research and development environment, including domain-specific data modeling, abstraction of complex data models, and a customizable web client.
Matching Alternative Addresses: a Semantic Web Approach
NASA Astrophysics Data System (ADS)
Ariannamazi, S.; Karimipour, F.; Hakimpour, F.
2015-12-01
Rapid development of crowd-sourcing, or volunteered geographic information (VGI), provides opportunities for authorities that deal with geospatial information. Heterogeneity of multiple data sources and inconsistency of data types are key characteristics of VGI datasets. The expansion of cities has resulted in a growing number of POIs in OpenStreetMap, a well-known VGI source, which causes the datasets to become outdated within short periods of time. Changes made to the spatial and aspatial attributes of features, such as names and addresses, can cause confusion or ambiguity in processes that rely on a feature's literal information, such as addressing and geocoding. VGI sources will neither conform to specific vocabularies nor remain in a specific schema for long periods of time. As a result, the integration of VGI sources is crucial and inevitable in order to avoid duplication and the waste of resources. Information integration can be used to match features and qualify different annotation alternatives for disambiguation. This study enhances the search capabilities of geospatial tools with applications able to understand user terminology, in pursuit of an efficient way of finding desired results. The Semantic Web is a capable tool for developing technologies that deal with lexical and numerical calculations and estimations. There is a vast amount of literal-spatial data representing the capability of linguistic information in knowledge modeling, but these resources need to be harmonized based on Semantic Web standards. The process of making addresses homogeneous yields a helpful tool based on spatial data integration and on matching and disambiguating lexical annotations.
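One simple ingredient of the lexical matching described above is a string-similarity score between alternative spellings of the same address. The sketch below uses Python's standard difflib with invented address strings; it is only a baseline illustration and does not reproduce the Semantic Web machinery the paper proposes.

```python
from difflib import SequenceMatcher

def normalize(address):
    """Lowercase, strip punctuation, and collapse whitespace before comparison."""
    cleaned = "".join(ch for ch in address.lower() if ch.isalnum() or ch.isspace())
    return " ".join(cleaned.split())

def similarity(a, b):
    """Similarity ratio in [0, 1] between two normalized address strings."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

# Invented alternative spellings of the same place
osm_address = "12 Enghelab Ave., Tehran"
authoritative_address = "No. 12, Enghelab Avenue, Tehran"
print(round(similarity(osm_address, authoritative_address), 2))
```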
2015-04-29
in which we applied these adaptation patterns to an adaptive news web server intended to tolerate extremely heavy, unexpected loads. To address...collection of existing models used as benchmarks for OO-based refactoring and an existing web-based repository called REMODD to provide users with model...invariant properties. Specifically, we developed Avida-MDE (based on the Avida digital evolution platform) to support the automatic generation of software
Choi, H; Kim, S; Ko, H; Kim, Y; Park, C G
2016-10-01
WHAT IS KNOWN ON THE SUBJECT?: Problematic parent-child relationships have been identified as one of the main predictors of adolescents' mental health problems, but there are few existing interventions that address this issue. The format and delivery method of existing interventions for parents are relatively inaccessible for parents with full-time jobs and families living in rural areas. WHAT DOES THIS PAPER ADD TO EXISTING KNOWLEDGE?: The newly developed 'Stepping Stone' culturally specific web-based intervention, which is intended to help Korean parents of adolescents to acquire both knowledge and communication and conflict management skills, was found to be feasible and well-accepted by parents. This study enabled us to identify areas for improvement in the content and format of the intervention and strategies. This will potentially increase effect sizes for the outcome variables of parents' perception and behaviours. WHAT ARE THE IMPLICATIONS FOR PRACTICE?: This web-based intervention could be delivered across diverse settings, such as schools and community mental health centers, to increase parents' knowledge of adolescent's mental health and allow for early detection of mental health problems. Mental health nurses working in schools may spend a significant amount of time addressing students' mental health issues; thus, this web-based intervention could be a useful resource to share with parents and children. In this way, the mental health nurses could facilitate parental engagement in the intervention and then help them to continue to apply and practice the knowledge and skills obtained through the program. Introduction There is a need for accessible, culturally specific web-based interventions to address parent-child relationships and adolescents' mental health. Aims This study developed and conducted a preliminary evaluation of a 4-week web-based intervention for parents of adolescents aged 11 to 16 years in Korea. Methods We used a two-group, repeated measures, quasi-experimental study design to assess the feasibility of developing and implementing a web-based intervention for parents. Descriptive statistics, chi-square and t tests, and mixed effect modeling were used for data analysis. Results The intervention and 1-month follow-up survey were completed by 47 parents in the intervention group and 46 parents in the attention control (AC) group. The intervention was found to be feasible and well-accepted by parents. Discussion This culturally specific web-based intervention is a useful tool for knowledge dissemination among large numbers of parents. Areas for improvement in the content and format of the intervention and strategies to elicit significant parent-child interactions are provided. Implications for practice and conclusion The intervention could be disseminated in collaboration with mental health nurses working in schools to facilitate parents' participation. © 2016 John Wiley & Sons Ltd.
Using component technologies for web based wavelet enhanced mammographic image visualization.
Sakellaropoulos, P; Costaridou, L; Panayiotakis, G
2000-01-01
The poor contrast detectability of mammography can be dealt with by domain-specific software visualization tools. Remote desktop client access and time performance limitations of a previously reported visualization tool are addressed, aiming at more efficient visualization of mammographic image resources existing in web or PACS image servers. This effort is also motivated by the fact that at present, web browsers do not support domain-specific medical image visualization. To deal with desktop client access, the tool was redesigned by exploring component technologies, enabling the integration of stand-alone, domain-specific mammographic image functionality in a web browsing environment (web adaptation). The integration method is based on ActiveX Document Server technology. ActiveX Document is a part of Object Linking and Embedding (OLE) extensible systems object technology, offering new services in existing applications. The standard DICOM 3.0 part 10 compatible image-format specification Papyrus 3.0 is supported, in addition to standard digitization formats such as TIFF. The visualization functionality of the tool has been enhanced by including a fast wavelet transform implementation, which allows for real-time wavelet-based contrast enhancement and denoising operations. Initial use of the tool with mammograms of various breast structures demonstrated its potential in improving visualization of diagnostic mammographic features.
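The wavelet operations described above can be approximated with the PyWavelets library. The sketch below soft-thresholds the detail coefficients of a synthetic image; the paper does not specify its wavelet basis, decomposition depth, or threshold rule, so all of those choices here are assumptions.

```python
import numpy as np
import pywt

def wavelet_denoise(image, wavelet="db4", level=3, threshold=0.04):
    """Soft-threshold the detail coefficients of a 2-D wavelet decomposition."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    thresholded = [
        tuple(pywt.threshold(band, threshold * np.max(np.abs(band)), mode="soft")
              for band in level_bands)
        for level_bands in details
    ]
    return pywt.waverec2([approx] + thresholded, wavelet)

# Synthetic smooth image plus noise, standing in for a digitized mammogram
rng = np.random.default_rng(1)
image = np.outer(np.hanning(256), np.hanning(256)) + 0.1 * rng.standard_normal((256, 256))
denoised = wavelet_denoise(image)
print(image.std(), denoised.std())
```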
Shakespeare Goes Online: Web Resources for Teaching Shakespeare.
ERIC Educational Resources Information Center
Schuetz, Carol L.
This annotated bibliography contains five sections and 62 items. The first section lists general resources including six Web site addresses; the second section, on Shakespeare's works, contains five Web site addresses; the third section, on Shakespeare and the Globe Theatre, provides five Web site addresses; the fourth section presents classroom…
A Web-based approach to blood donor preparation.
France, Christopher R; France, Janis L; Kowalsky, Jennifer M; Copley, Diane M; Lewis, Kristin N; Ellis, Gary D; McGlone, Sarah T; Sinclair, Kadian S
2013-02-01
Written and video approaches to donor education have been shown to enhance donation attitudes and intentions to give blood, particularly when the information provides specific coping suggestions for donation-related concerns. This study extends this work by comparing Web-based approaches to donor preparation among donors and nondonors. Young adults (62% female; mean [±SD] age, 19.3 [±1.5] years; mean [range] number of prior blood donations, 1.1 [0-26]; 60% nondonors) were randomly assigned to view 1) a study Web site designed to address common blood donor concerns and suggest specific coping strategies (n = 238), 2) a standard blood center Web site (n = 233), or 3) a control Web site where participants viewed videos of their choice (n = 202). Measures of donation attitude, anxiety, confidence, intention, anticipated regret, and moral norm were completed before and after the intervention. Among nondonors, the study Web site produced greater changes in donation attitude, confidence, intention, and anticipated regret relative to both the standard and the control Web sites, but only differed significantly from the control Web site for moral norm and anxiety. Among donors, the study Web site produced greater changes in donation confidence and anticipated regret relative to both the standard and the control Web sites, but only differed significantly from the control Web site for donation attitude, anxiety, intention, and moral norm. Web-based donor preparation materials may provide a cost-effective way to enhance donation intentions and encourage donation behavior. © 2012 American Association of Blood Banks.
AMBIT RESTful web services: an implementation of the OpenTox application programming interface.
Jeliazkova, Nina; Jeliazkov, Vedrin
2011-05-16
The AMBIT web services package is one of several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation, or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions, without installing any software, as well as to share online datasets and models. The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing, or in a fully independent way, according to the specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application but also in a network of distributed services. Last, but not least, the services provide a basis for building web mashups and end user applications with friendly GUIs, as well as for embedding the functionalities in existing workflow systems.
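The "read data from a web address" paradigm described above can be exercised with any HTTP client. The sketch below requests the RDF representation of a compound resource with Python's requests library; the base URL is a placeholder, since no actual AMBIT instance address is given in the text, and must be replaced with a running OpenTox-compliant service.

```python
import requests

# Placeholder base URL; substitute the address of a running AMBIT/OpenTox instance
BASE = "https://ambit.example.org"

def fetch_compound_rdf(compound_id):
    """Retrieve the RDF (Resource Description Framework) view of a compound resource."""
    url = f"{BASE}/compound/{compound_id}"
    response = requests.get(url, headers={"Accept": "application/rdf+xml"}, timeout=30)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    print(fetch_compound_rdf(1)[:500])  # first 500 characters of the RDF document
```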
George, Nika; MacDougall, Megan
2016-01-01
Background Women are disproportionately likely to assist aging family members; approximately 53 million in the United States are involved with the health care of aging parents, in-laws, or other relatives. The busy schedules of “sandwich generation” women who care for older relatives require accessible and flexible health education, including Web-based approaches. Objective This paper describes the development and implementation of a Web-based health education intervention, The Sandwich Generation Diner, as a tool for intergenerational caregivers of older adults with physical and cognitive impairments. Methods We used Bartholomew’s Intervention Mapping (IM) process to develop our theory-based health education program. Bandura’s (1997) self-efficacy theory provided the overarching theoretical model. Results The Sandwich Generation Diner website features four modules that address specific health care concerns. Our research involves randomly assigning caregiver participants to one of two experimental conditions that are identical in the type of information provided, but vary significantly in the presentation. In addition to structured Web-based assessments, specific website usage data are recorded. Conclusions The Sandwich Generation Diner was developed to address some of the informational and self-efficacy needs of intergenerational female caregivers. The next step is to demonstrate that this intervention is: (1) attractive and effective with families assisting older adults, and (2) feasible to embed within routine home health services for older adults. PMID:27269632
Strategies to address participant misrepresentation for eligibility in Web-based research.
Kramer, Jessica; Rubin, Amy; Coster, Wendy; Helmuth, Eric; Hermos, John; Rosenbloom, David; Moed, Rich; Dooley, Meghan; Kao, Ying-Chia; Liljenquist, Kendra; Brief, Deborah; Enggasser, Justin; Keane, Terence; Roy, Monica; Lachowicz, Mark
2014-03-01
Emerging methodological research suggests that the World Wide Web ("Web") is an appropriate venue for survey data collection, and a promising area for delivering behavioral intervention. However, the use of the Web for research raises concerns regarding sample validity, particularly when the Web is used for recruitment and enrollment. The purpose of this paper is to describe the challenges experienced in two different Web-based studies in which participant misrepresentation threatened sample validity: a survey study and an online intervention study. The lessons learned from these experiences generated three types of strategies researchers can use to reduce the likelihood of participant misrepresentation for eligibility in Web-based research. Examples of procedural/design strategies, technical/software strategies and data analytic strategies are provided along with the methodological strengths and limitations of specific strategies. The discussion includes a series of considerations to guide researchers in the selection of strategies that may be most appropriate given the aims, resources and target population of their studies. Copyright © 2014 John Wiley & Sons, Ltd.
77 FR 70454 - Proposed Flood Hazard Determinations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-26
... which included a Web page address through which the Preliminary Flood Insurance Rate Map (FIRM), and... be accessed. The information available through the Web page address has subsequently been updated... through the web page address listed in the table has been updated to reflect the Revised Preliminary...
76 FR 23341 - Sunshine Federal Register Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-26
...-415-1711) This meeting will be webcast live at the Web address-- http://www.nrc.gov . Week of May 2... Meeting); (Contact: Robert Kahler, 301-415-7528) This meeting will be webcast live at the Web address... Meeting); (Contact: Nathan Sanfilippo, 301-415-3951) This meeting will be webcast live at the Web address...
Missing the Forest for the Trees
ERIC Educational Resources Information Center
Amaral, Olga Maia; Garrison, Leslie
2007-01-01
This case study examines the alignment between the Intended Curriculum, Implemented Curriculum and Achieved Curriculum of a fourth grade inquiry based unit, "Food Chains and Webs." Specifically addressed are how the curriculum was modified to meet state standards, how teachers were trained, and how assessment of curricular implementation was…
Search, Read and Write: An Inquiry into Web Accessibility for People with Dyslexia.
Berget, Gerd; Herstad, Jo; Sandnes, Frode Eika
2016-01-01
Universal design in the context of digitalisation has become an integrated part of international conventions and national legislation. A goal is to make the Web accessible for people of different genders, ages, backgrounds, cultures and physical, sensory and cognitive abilities. Political demands for universally designed solutions have raised questions about how this is achieved in practice. Developers, designers and legislators have looked towards the Web Content Accessibility Guidelines (WCAG) for answers. WCAG 2.0 has become the de facto standard for universal design on the Web. Some of the guidelines are directed at the general population, while others are targeted at more specific user groups, such as the visually impaired or hearing impaired. Issues related to cognitive impairments such as dyslexia receive less attention, although dyslexia is prevalent in at least 5-10% of the population. Navigation and search are two common ways of using the Web. However, while navigation has received a fair amount of attention, search systems are not explicitly included, although search has become an important part of people's daily routines. This paper discusses WCAG in the context of dyslexia for the Web in general and for search user interfaces specifically. Although certain guidelines address topics that affect dyslexia, WCAG does not seem to fully accommodate users with dyslexia.
76 FR 6181 - Information Collection Available for Public Comments and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-03
....regulations.gov/search/index.jsp . Specifically address whether this information collection is necessary for... version of this document is available on the World Wide Web at http://www.regulations.gov/search/index.jsp... visit http://www.regulations.gov/search/index.jsp . By Order of the Maritime Administrator. Dated...
Web-Based Academic Support Services: Guidelines for Extensibility
ERIC Educational Resources Information Center
McCracken, Holly
2005-01-01
Using the experience of the University of Illinois at Springfield's College of Liberal Arts and Sciences as a foundation for discussion, this paper addresses the provision of student support services to distant students within the context of development and expansion. Specific issues for consideration include: integrating student support…
ERIC Educational Resources Information Center
Hochstrasser, Jeffrey L.
2014-01-01
This three article dissertation is the culminating requirement for the Professional Practices Doctorate, resulting in a terminal Ed.D. degree at the University of Idaho. As such, it consists of three articles specifically relating to educational concerns at Brigham Young University-Idaho. The goal was to address specific situations or needs…
Reliable and Persistent Identification of Linked Data Elements
NASA Astrophysics Data System (ADS)
Wood, David
Linked Data techniques rely upon common terminology in a manner similar to a relational database's reliance on a schema. Linked Data terminology anchors metadata descriptions and facilitates navigation of information. Common vocabularies ease the human, social tasks of understanding datasets sufficiently to construct queries and help to relate otherwise disparate datasets. Vocabulary terms must, when using the Resource Description Framework, be grounded in URIs. A current best practice on the World Wide Web is to serve vocabulary terms as Uniform Resource Locators (URLs) and present both human-readable and machine-readable representations to the public. Linked Data terminology published to the World Wide Web may be used by others without reference or notification to the publishing party. That presents a problem: Vocabulary publishers take on an implicit responsibility to maintain and publish their terms via the URLs originally assigned, regardless of the inconvenience such a responsibility may cause. Over the course of years, people change jobs, publishing organizations change Internet domain names, computers change IP addresses, and systems administrators publish old material in new ways. Clearly, a mechanism is required to manage Web-based vocabularies over a long term. This chapter places Linked Data vocabularies in context with the wider concepts of metadata in general and specifically metadata on the Web. Persistent identifier mechanisms are reviewed, with a particular emphasis on Persistent URLs, or PURLs. PURLs and PURL services are discussed in the context of Linked Data. Finally, historic weaknesses of PURLs are resolved by the introduction of a federation of PURL services to address needs specific to Linked Data.
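A PURL is resolved by an ordinary HTTP redirect from the persistent address to the current location of the resource. The short sketch below follows that redirect chain with the requests library; the PURL shown is hypothetical, not a real registered identifier.

```python
import requests

# Hypothetical persistent URL; a real PURL would be registered with a PURL service
purl = "https://purl.example.org/vocab/term/creator"

response = requests.get(purl, allow_redirects=True, timeout=30)

# response.history holds the intermediate redirect responses (e.g., 302/303),
# and response.url is the final resolved location of the vocabulary term.
for hop in response.history:
    print(hop.status_code, hop.headers.get("Location"))
print("resolved to:", response.url)
```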
A health information technology glossary for novices.
Cravens, Gary D; Dixon, Brian E; Zafar, Atif; McGowan, Julie J
2008-11-06
To deliver information to providers across the U.S., the Agency for Healthcare Research and Quality's National Resource Center for Health IT (NRC) created a public domain Web site containing a number of tools and resources. Specifically lacking from this Web site is a glossary of health IT terminology. To address this omission and respond to requests from Web site users, the Regenstrief Institute created the Health IT Glossary. This glossary is designed to provide novices (providers and others new to health IT) with a single source to find basic definitions for a broad range of terms, consistent with the Office of the National Coordinator (ONC) effort. The glossary is a living document, and feedback is welcomed from the health informatics community.
Network-Based Professional Development: A Comparison of Statewide Initiatives.
ERIC Educational Resources Information Center
Shotsberger, Paul G.; Stammen, Ronald; Vetter, Ronald; Blue, Gloria; Greer, Edrie
This paper addresses opportunities and issues related to the use of the World Wide Web and high-speed networks as a delivery vehicle for training educators who are geographically dispersed. The benefits and potential pitfalls of using networks as educational platforms are explored from the perspective of various systems specifically being…
75 FR 78340 - Information Collection Available for Public Comments and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-15
.../search/index.jsp . Specifically address whether this information collection is necessary for proper... version of this document is available on the World Wide Web at http://www.regulations.gov/search/index.jsp... visit http://www.regulations.gov/search/index.jsp . (Authority: 49 CFR 1.66) By Order of the Maritime...
76 FR 28845 - Information Collection Available for Public Comments and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-18
... ./search/index.jsp. Specifically address whether this information collection is necessary for proper... version of this document is available on the World Wide Web at http://www.regulations.gov/search/index.jsp... visit http://www.regulations.gov/search/index.jsp . Authority: 49 CFR 1.66. Dated: May 9, 2011. By Order...
75 FR 35876 - Information Collection Available for Public Comments and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-23
... electronic means via the Internet at http://www.regulations.gov/search/index.jsp . Specifically address... the World Wide Web at http://www.regulations.gov/search/index.jsp . Privacy Act: Anyone is able to.../index.jsp . Authority: 49 CFR 1.66. By Order of the Maritime Administrator. Dated: June 17, 2010. Julie...
Rebooting the East: Automation in University Libraries of the Former German Democratic Republic.
ERIC Educational Resources Information Center
Seadle, Michael
1996-01-01
Provides a history of the automation efforts at former East German libraries that have made their resources available for the first time. Highlights include World Wide Web home page addresses; library problems, including censorship; automation guidelines, funding, and cooperation; online catalogs; and specific examples of university libraries'…
Development and methods for an open-sourced data visualization tool
USDA-ARS?s Scientific Manuscript database
This paper presents an open-source, on-demand web tool aimed specifically at scientists and researchers who are not experts, for converting time series data into a time surface visualization. Similar to a GIS environment, the time surface shows time on two axes: time of day vs. day of year...
Povey, Sue; Al Aqeel, Aida I; Cambon-Thomsen, Anne; Dalgleish, Raymond; den Dunnen, Johan T; Firth, Helen V; Greenblatt, Marc S; Barash, Carol Isaacson; Parker, Michael; Patrinos, George P; Savige, Judith; Sobrido, Maria-Jesus; Winship, Ingrid; Cotton, Richard GH
2010-01-01
More than 1,000 Web-based locus-specific variation databases (LSDBs) are listed on the Website of the Human Genome Variation Society (HGVS). These individual efforts, which often relate phenotype to genotype, are a valuable source of information for clinicians, patients, and their families, as well as for basic research. The initiators of the Human Variome Project recently recognized that having access to some of the immense resources of unpublished information already present in diagnostic laboratories would provide critical data to help manage genetic disorders. However, there are significant ethical issues involved in sharing these data worldwide. An international working group presents second-generation guidelines addressing ethical issues relating to the curation of human LSDBs that provide information via a Web-based interface. It is intended that these should help current and future curators and may also inform the future decisions of ethics committees and legislators. These guidelines have been reviewed by the Ethics Committee of the Human Genome Organization (HUGO). Hum Mutat 31:–6, 2010. © 2010 Wiley-Liss, Inc. PMID:20683926
Tools for Administration of a UNIX-Based Network
NASA Technical Reports Server (NTRS)
LeClaire, Stephen; Farrar, Edward
2004-01-01
Several computer programs have been developed to enable efficient administration of a large, heterogeneous, UNIX-based computing and communication network that includes a variety of computers connected to a variety of subnetworks. One program provides secure software tools for administrators to create, modify, lock, and delete accounts of specific users. This program also provides tools for users to change their UNIX passwords and log-in shells. These tools check for errors. Another program comprises a client and a server component that, together, provide a secure mechanism to create, modify, and query quota levels on a network file system (NFS) mounted by use of the VERITAS File System software. The client software resides on an internal secure computer with a secure Web interface; one can gain access to the client software from any authorized computer capable of running web-browser software. The server software resides on a UNIX computer configured with the VERITAS software system. Directories where VERITAS quotas are applied are NFS-mounted. Another program is a Web-based, client/server Internet Protocol (IP) address tool that facilitates maintenance and lookup of information about IP addresses for a network of computers.
Web malware spread modelling and optimal control strategies
NASA Astrophysics Data System (ADS)
Liu, Wanping; Zhong, Shouming
2017-02-01
The growing popularity of the Web has fueled the growth of web threats. Formulating mathematical models for accurate prediction of malicious propagation over networks is of great importance. The aim of this paper is to understand the propagation mechanisms of web malware and the impact of human intervention on the spread of malicious hyperlinks. Considering the characteristics of web malware, a new differential epidemic model which extends the traditional SIR model by adding another delitescent compartment is proposed to address the spreading behavior of malicious links over networks. The spreading threshold of the model system is calculated, and the dynamics of the model are theoretically analyzed. Moreover, optimal control theory is employed to study malware immunization strategies, aiming to keep the total economic loss of security investment and infection loss as low as possible. The existence and uniqueness of the results concerning the optimality system are confirmed. Finally, numerical simulations show that the spread of malware links can be controlled effectively with a proper control strategy and specific parameter choices.
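Since the abstract does not give the model equations, the following is a generic sketch of an SIR system extended with a latent ("delitescent") compartment, integrated with SciPy. The compartment structure, parameter values, and rates are assumptions for illustration only and are not the paper's actual model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def slir(t, y, beta, sigma, gamma):
    """SIR extended with a latent compartment L: S -> L -> I -> R."""
    s, l, i, r = y
    return [-beta * s * i,               # susceptible nodes exposed to malicious links
            beta * s * i - sigma * l,    # latent (delitescent) nodes, not yet spreading
            sigma * l - gamma * i,       # infectious nodes spreading malicious links
            gamma * i]                   # recovered/immunized nodes

beta, sigma, gamma = 0.5, 0.2, 0.1       # assumed contact, activation, recovery rates
y0 = [0.99, 0.0, 0.01, 0.0]              # initial fractions of the network
sol = solve_ivp(slir, (0, 200), y0, args=(beta, sigma, gamma), dense_output=True)

t = np.linspace(0, 200, 5)
print(np.round(sol.sol(t), 3))           # S, L, I, R trajectories at sample times
```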
More than a meal: integrating non-feeding interactions into food webs
Kéfi, Sonia; Berlow, Eric L.; Wieters, Evie A.; Navarrete, Sergio A.; Petchey, Owen L.; Wood, Spencer A.; Boit, Alice; Joppa, Lucas N.; Lafferty, Kevin D.; Williams, Richard J.; Martinez, Neo D.; Menge, Bruce A.; Blanchette, Carol A.; Iles, Alison C.; Brose, Ulrich
2012-01-01
Organisms eating one another is only one of many types of well-documented and important interactions among species. Other such types include habitat modification, predator interference and facilitation. However, ecological network research has typically been limited either to pure food webs or to networks of only a few (<3) interaction types. The great diversity of non-trophic interactions observed in nature has been poorly addressed by ecologists and largely excluded from network theory. Herein, we propose a conceptual framework that organises this diversity into three main functional classes defined by how they modify specific parameters in a dynamic food web model. This approach provides a path forward for incorporating non-trophic interactions in traditional food web models and offers a new perspective on tackling ecological complexity that should stimulate both theoretical and empirical approaches to understanding the patterns and dynamics of diverse species interactions in nature.
78 FR 33807 - Privacy Act New System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-05
.... For National Institute of Standards and Technology, Chief Information Officer, 100 Bureau Drive..., address, email, and telephone number; credit card information; Web site URL; organization category and...; title; address; email address; telephone number; Web site URL; organization category and description...
Barchitta, M; Fragapane, S; Consoli, M T; Pennisi, C; Agodi, A
2012-01-01
The growing needs of people with disabilities require that this issue be integrated into public health in order to improve political feasibility and to ensure that disability is not left off any strategic table. The main aim of the "Care for Work" project was to provide training content to help workers and unemployed people adapt their knowledge, skills and competencies to the care services sector, in order to facilitate their insertion into a new source of employment. The partners participating in the project are organizations from five European countries. The project has been divided into seven Work Packages (WPs): three transversal WPs and four specific WPs, each addressing specific activities necessary to achieve the final objectives of the project. The "Care for Work" learning environment contains specific information and training on techniques for caring for people with acquired physical disabilities, delivered as text documents and short training films. The project combines e-learning (Web 2.0) and mobile learning, providing a flexible training platform for workers in the care services sector. The "Care for Work" project offers specific training designed to meet the emerging needs of workers in the care services sector and/or unemployed people. All the information and results of the project are available on the project web page, www.careforwork.eu, and the present article is part of the WP "Valorization".
The Virtual Learning Commons (VLC): Enabling Co-Innovation Across Disciplines
NASA Astrophysics Data System (ADS)
Pennington, D. D.; Gandara, A.; Del Rio, N.
2014-12-01
A key challenge for scientists addressing grand-challenge problems is identifying, understanding, and integrating potentially relevant methods, models and tools that are rapidly evolving in the informatics community. Such tools are essential for effectively integrating data and models in complex research projects, yet it is often difficult to know what tools are available and it is not easy to understand or evaluate how they might be used in a given research context. The goal of the National Science Foundation-funded Virtual Learning Commons (VLC) is to improve awareness and understanding of emerging methodologies and technologies, facilitate individual and group evaluation of these, and trace the impact of innovations within and across teams, disciplines, and communities. The VLC is a Web-based social bookmarking site designed specifically to support knowledge exchange in research communities. It is founded on well-developed models of technology adoption, diffusion of innovation, and experiential learning. The VLC makes use of Web 2.0 (Social Web) and Web 3.0 (Semantic Web) approaches. Semantic Web approaches enable discovery of potentially relevant methods, models, and tools, while Social Web approaches enable collaborative learning about their function. The VLC is under development and the first release is expected Fall 2014.
Developing a Web-Based Intervention to Prevent Drug Use among Adolescent Girls
ERIC Educational Resources Information Center
Schwinn, Traci Marie; Hopkins, Jessica Elizabeth; Schinke, Steven Paul
2016-01-01
Objectives: Girls' rates of drug use have met up with and, in some instances, surpassed boys' rates. Although girls and boys share risk and protective factors associated with drug use, girls also have gender-specific risks. Interventions to prevent girls' drug use must be tailored to address the dynamics of female adolescence. Methods: One such…
Webquests: A Strategy to Address the "Content" Dilemma in Teacher Education Coursework
ERIC Educational Resources Information Center
Fero, Marie
2008-01-01
The purpose of this article is to propose the use of technology integration in elementary and middle level education courses, specifically, the use of WebQuests as vehicles for the infusion of content knowledge in preservice and in-service education courses. Observation in content area teaching methods courses found that teacher candidates often…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-08
... related to the 7th Session of the AFTF will be accessible via the World Wide Web at the following address... World Health Organization (WHO). Through adoption of food standards, codes of practice, and other... animals. The guidelines should include specific science-based risk assessment criteria to apply to feed...
75 FR 6413 - Sunshine Act Meeting Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-09
... Barkley, 610-337- 5065.) This meeting will be webcast live at the Web address-- http://www.nrc.gov . Week..., 301-251-7982.) This meeting will be webcast live at the Web address-- http://www.nrc.gov . Week of... (Public Meeting). (Contact: Thomas Fredrichs, 301-415-5971.) This meeting will be webcast live at the Web...
Tian, Chenlu; Champlin, Sara; Mackert, Michael; Lazard, Allison; Agrawal, Deepak
2014-08-01
Colorectal cancer (CRC) screening rates in the United States are still below target level. Web-based patient education materials are used by patients and providers to provide supplemental information on CRC screening. Low literacy levels and patient perceptions are significant barriers to screening. There are few data on the quality of these online materials from a health literacy standpoint or whether they address patients' perceptions. To evaluate the readability, suitability, and health content of web-based patient education materials on colon cancer screening. Descriptive study. Web-based patient materials. Twelve reputable and popular online patient education materials were evaluated. Readability was measured by using the Flesch-Kincaid Reading Grade Level, and suitability was determined by the Suitability Assessment of Materials, a scale that considers characteristics such as content, graphics, layout/typography, and learning stimulation. Health content was evaluated within the framework of the Health Belief Model, a behavioral model that relates patients' perceptions of susceptibility to disease, severity, and benefits and barriers to their medical decisions. Each material was scored independently by 3 reviewers. Flesch-Kincaid Reading Grade Level score, Suitability Assessment of Materials score, health content score. Readability for 10 of 12 materials surpassed the maximum recommended sixth-grade reading level. Five were 10th grade level and above. Only 1 of 12 materials received a superior suitability score; 3 materials received inadequate scores. Health content analysis revealed that only 50% of the resources discussed CRC risk in the general population and <25% specifically addressed patients at high risk, such as African Americans, smokers, patients with diabetes, and obese patients. For perceived barriers to screening, only 8.3% of resources discussed embarrassment, 25% discussed pain with colonoscopy, 25% addressed cost of colonoscopy, and none specifically mentioned the need to get colonoscopy when no symptoms are present. No material discussed the social benefits of screening. Descriptive design. Most online patient education materials for CRC screening are written beyond the recommended sixth-grade reading level, with suboptimal suitability. Health content is lacking in addressing key perceived risks, barriers, and benefits to CRC screening. Developing more appropriate and targeted patient education resources on CRC may improve patient understanding and promote screening. Copyright © 2014 American Society for Gastrointestinal Endoscopy. Published by Mosby, Inc. All rights reserved.
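For readers unfamiliar with the readability measure used above, the sketch below computes the Flesch-Kincaid Grade Level with its standard formula (0.39 × words per sentence + 11.8 × syllables per word − 15.59); the syllable counter is a rough vowel-group heuristic, so scores will only approximate those of published calculators or the authors' tooling.

```python
import re

def count_syllables(word: str) -> int:
    # Approximate syllables as runs of consecutive vowels (crude heuristic).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade_level(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

sample = ("Colorectal cancer screening saves lives. "
          "Ask your doctor which test is right for you.")
print(round(fk_grade_level(sample), 1))
```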
Web-based surveillance of public information needs for informing preconception interventions.
D'Ambrosio, Angelo; Agricola, Eleonora; Russo, Luisa; Gesualdo, Francesco; Pandolfi, Elisabetta; Bortolus, Renata; Castellani, Carlo; Lalatta, Faustina; Mastroiacovo, Pierpaolo; Tozzi, Alberto Eugenio
2015-01-01
The risk of adverse pregnancy outcomes can be minimized through the adoption of healthy lifestyles before pregnancy by women of childbearing age. Initiatives for promotion of preconception health may be difficult to implement. The Internet can be used to build tailored health interventions through identification of the public's information needs. To this aim, we developed a semi-automatic web-based system for monitoring Google searches, web pages and activity on social networks, regarding preconception health. Based on the American College of Obstetricians and Gynecologists guidelines and on the actual search behaviors of Italian Internet users, we defined a set of keywords targeting preconception care topics. Using these keywords, we analyzed the usage of the Google search engine and identified web pages containing preconception care recommendations. We also monitored how the selected web pages were shared on social networks. We analyzed discrepancies between searched and published information and the sharing pattern of the topics. We identified 1,807 Google search queries which generated a total of 1,995,030 searches during the study period. Less than 10% of the reviewed pages contained preconception care information and in 42.8% information was consistent with ACOG guidelines. Facebook was the most used social network for sharing. Nutrition, Chronic Diseases and Infectious Diseases were the most published and searched topics. Regarding Genetic Risk and Folic Acid, a high search volume was not associated with a high web page production, while Medication pages were more frequently published than searched. Vaccinations elicited high sharing although web page production was low; this effect was quite variable over time. Our study represents a resource to prioritize communication on specific topics on the web, to address misconceptions, and to tailor interventions to specific populations.
Web-Based Surveillance of Public Information Needs for Informing Preconception Interventions
D’Ambrosio, Angelo; Agricola, Eleonora; Russo, Luisa; Gesualdo, Francesco; Pandolfi, Elisabetta; Bortolus, Renata; Castellani, Carlo; Lalatta, Faustina; Mastroiacovo, Pierpaolo; Tozzi, Alberto Eugenio
2015-01-01
Background The risk of adverse pregnancy outcomes can be minimized through the adoption of healthy lifestyles before pregnancy by women of childbearing age. Initiatives for promotion of preconception health may be difficult to implement. The Internet can be used to build tailored health interventions through identification of the public's information needs. To this aim, we developed a semi-automatic web-based system for monitoring Google searches, web pages and activity on social networks, regarding preconception health. Methods Based on the American College of Obstetricians and Gynecologists guidelines and on the actual search behaviors of Italian Internet users, we defined a set of keywords targeting preconception care topics. Using these keywords, we analyzed the usage of the Google search engine and identified web pages containing preconception care recommendations. We also monitored how the selected web pages were shared on social networks. We analyzed discrepancies between searched and published information and the sharing pattern of the topics. Results We identified 1,807 Google search queries which generated a total of 1,995,030 searches during the study period. Less than 10% of the reviewed pages contained preconception care information and in 42.8% information was consistent with ACOG guidelines. Facebook was the most used social network for sharing. Nutrition, Chronic Diseases and Infectious Diseases were the most published and searched topics. Regarding Genetic Risk and Folic Acid, a high search volume was not associated with a high web page production, while Medication pages were more frequently published than searched. Vaccinations elicited high sharing although web page production was low; this effect was quite variable over time. Conclusion Our study represents a resource to prioritize communication on specific topics on the web, to address misconceptions, and to tailor interventions to specific populations. PMID:25879682
16 CFR § 1115.27 - Recall notice content requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... quality to a Web site or other appropriate medium. As needed for effective notification, multiple... information (such as name, address, telephone and facsimile numbers, e-mail address, and Web site address... must include the information set forth below: (a) Terms. A recall notice must include the word “recall...
Vercellesi, L
1999-01-01
Introduction In 1998 a pharmaceutical company published its Web site to provide: an institutional presence; multifunctional information to primary customers and the general public; a new way of accessing the company; a link to existing company-sponsored sites; and a platform for future projects. Since publication, some significant additions have been made; in particular, a primary interactive service addressed to a selected audience. The need has been felt to foster new projects and to establish the idea of routinely considering the site as a potential tool in the marketing mix, to provide advanced services to customers. Methods Re-assessment of the site against its objectives; assessment of its perception among the company's potential suppliers. Results The issue of web use was discussed in various management meetings; the trend in Internet use among primary customers was known; the major concerns expressed were about staffing and the return on investment for activities run on the Web. These concerns are being addressed by making the company more comfortable: the site is run through a detailed process and clear procedures that define a new process for maintaining the site, involving representatives of all functions; procedures and guidelines; a master file of approved answers and company contacts; categories of activities (information, promotion, education, information to investors, general services, target-specific services); and measures for all activities run on the Web site. Specifically for the Web site, a concise periodical report is being assessed, covering: (1) statistics about hits and mails, compared with corporate data; (2) an indication of new items published; (3) descriptions by the "supplier" of new or ongoing innovative projects, to transfer best practice; (4) basic figures on the Italian trend in Internet use, specifically in the pharmaceutical and medical fields; (5) comments on a few competitor sites; and (6) examples of potential uses derived from other Web sites. Discussion The comparatively low use of the Internet in Italy has affected the systematic professional exploitation of the company site. The label "anarchic" commonly attached to the Web by local media has led to an attempt to "master" and "normalize" the site with a stricter approach than usual: most procedures and guidelines had to be designed from scratch, as none were available for similar activities run through traditional channels. A short set of information has been requested for inclusion in the report: its wide coverage will help give a flavour of the global parallel new world developing on the net. Hopefully this approach will help to create a comfortable attitude towards the medium throughout the organisation and to acquire working experience with the net.
Choo, Esther K.; Zlotnick, Caron; Strong, David R.; Squires, Daniel D.; Tapé, Chantal; Mello, Michael J.
2016-01-01
Background Addressing violence along with drug use change goals is critical for women with coexisting intimate partner violence (IPV) and substance use disorders (SUD). Methods This was an acceptability and feasibility study of BSAFER, a brief Web-based program and booster phone call addressing violence and drug use. A screening survey identified women with recent drug use and IPV in the emergency department (ED). Participants were randomized to BSAFER or a Web-based control program and booster call providing education about home fire safety. Program completion, usability, satisfaction and MI adherence were primary outcomes. Drug use and IPV outcomes were measured at baseline, one and three months. Results Forty women were enrolled (21 BSAFER, 19 control); 50% were non-white and mean age was 30 years. Most commonly used drugs were marijuana (88%) and cocaine (30%); 45% reported physical abuse and 33% severe combined physical and sexual abuse. Thirty-nine (98%) completed the Web program, 30 (75%) completed the booster, and 29 (73%) completed 3-month follow up. Mean System Usability Scale (SUS) for the BSAFER Web program was 84 (95% CI 78–89) of 100; mean Client Satisfaction Questionnaire (CSQ-8) was 28 (95% CI 26–29) of 32. MI adherence scores were high and similar for both the Web program and the booster. Both intervention and control groups had small mean decreases in weekly drug use days (0.7 days vs. 1.5 days); participants using drugs other than marijuana demonstrated greater average reductions in drug use than those using marijuana only. Conclusions An ED Web-based intervention for SUD and IPV in women demonstrated feasibility and acceptability. Future studies will examine efficacy of the BSAFER program and investigate whether specific subgroups of drug using women may be most responsive to ED-based Web interventions. PMID:26714233
Atreja, Ashish; Mehta, Neil; Miller, Deborah; Moore, Shirley; Nichols, Karen; Miller, Holly; Harris, C Martin
2005-01-01
Disabled and elderly populations are the fastest growing segment of Internet users. However, these people face an "inverse information law": access to appropriate information is particularly difficult for those who need it the most. Our tertiary care Multiple Sclerosis (MS) center received funding to develop an MS-specific patient portal linked to a web messaging system so as to empower patients to become more active participants in their health care. In order to design an effective portal, we conducted a qualitative study using focus groups and direct observation techniques. The study explores the perceptions, expectations and interactions of MS patients with the portal and underscores the many challenges MS patients face in getting quality health information on the Internet. Many of the patient barriers were due to inappropriate font sizes, low contrast, cluttered web pages and the use of dynamic and flashing objects. Some of these issues are not addressed by Section 508 accessibility guidelines. We believe that any future patient portal or health information web site needs to address these issues and educate patients about accessibility options to enhance utilization and user satisfaction. PMID:16778993
Code of Federal Regulations, 2010 CFR
2010-01-01
... TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CAN-SPAM RULE § 316.5 Prohibition on... other than the recipient's electronic mail address and opt-out preferences, or take any other steps except sending a reply electronic mail message or visiting a single Internet Web page, in order to: (a...
Supporting New Special Education Teachers: How Online Resources and Web 2.0 Technologies Can Help
ERIC Educational Resources Information Center
Billingsley, Bonnie; Israel, Maya; Smith, Sean
2011-01-01
New special education teachers (SETs) face some typical challenges as well as ones that are specific to their particular work settings. Providing support that addresses teachers' unique needs is important for increasing their effectiveness, helping them make a smooth entry into teaching, and reducing their stress and turnover. Nearly 20 years ago,…
Addressing data access challenges in seismology
NASA Astrophysics Data System (ADS)
Trabant, C. M.; Ahern, T.; Weertman, B.; Benson, R. B.; Van Fossen, M.; Weekly, R. T.; Casey, R. E.; Suleiman, Y. Y.; Stults, M.
2016-12-01
The development of web services at the IRIS Data Management Center (DMC) over the last 6 years represents the most significant enhancement of data access ever introduced at the DMC. These web services have allowed us to focus our internal operations around a single, consistent data access layer while facilitating development of a new generation of tools and methods for researchers to conduct their work. This effort led the DMC to propose standardized web service interfaces within the International Federation of Digital Seismograph Networks (FDSN), enabling other seismological data centers to offer data using compatible interfaces. With this new foundation, we now turn our attention to more advanced data access challenges. In particular, we will present the status of two developments intended to address 1) access to data of consistent quality for science and 2) discovery and access of data from multiple data centers. To address the challenge of requesting high or consistent quality data, we will introduce our Research-Ready Data Sets (RRDS) initiative. The purpose of the RRDS project is to reduce the time a researcher spends culling and otherwise identifying data appropriate for a given study. RRDS will provide users with additional criteria related to data quality that can be specified when requesting data. Leveraging the data quality measurements provided by our MUSTANG system, these criteria will include ambient noise, completeness, dead channel identification and more. To address the challenge of seismological data discovery and access, we have built and continue to improve the IRIS Federator. The Federator takes advantage of the FDSN-standard web services at various data centers to help a user locate specific channels, wherever they may be offered globally. The search interface provides results that are pre-formatted requests, ready for submission to each data center that serves that data. These two developments are aimed squarely at reducing the time researchers spend searching for, collecting and preparing data for processing.
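As an illustration of the FDSN-standard interfaces mentioned above, the sketch below queries the fdsnws-station service for channel metadata; the endpoint and parameter names follow the published FDSN web service specification, but the current IRIS DMC documentation should be checked before relying on them.

```python
import requests

params = {
    "net": "IU", "sta": "ANMO", "cha": "BHZ",   # network, station, channel codes
    "level": "channel",
    "format": "text",
}
resp = requests.get("https://service.iris.edu/fdsnws/station/1/query",
                    params=params, timeout=30)
resp.raise_for_status()
print("\n".join(resp.text.splitlines()[:5]))   # header plus the first few channel rows
```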
SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications
Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.
2018-01-01
The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069
Failure Analysis for Composition of Web Services Represented as Labeled Transition Systems
NASA Astrophysics Data System (ADS)
Nadkarni, Dinanath; Basu, Samik; Honavar, Vasant; Lutz, Robyn
The Web service composition problem involves the creation of a choreographer that provides the interaction between a set of component services to realize a goal service. Several methods have been proposed and developed to address this problem. In this paper, we consider those scenarios where the composition process may fail due to incomplete specification of goal service requirements or due to the fact that the user is unaware of the functionality provided by the existing component services. In such cases, it is desirable to have a composition algorithm that can provide feedback to the user regarding the cause of failure in the composition process. Such feedback will help guide the user to re-formulate the goal service and iterate the composition process. We propose a failure analysis technique for composition algorithms that views Web service behavior as multiple sequences of input/output events. Our technique identifies the possible cause of composition failure and suggests possible recovery options to the user. We discuss our technique using a simple e-Library Web service in the context of the MoSCoE Web service composition framework.
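A toy sketch of the failure-analysis idea follows: a component service is viewed as a labeled transition system (a dict of state → {event: next state}), a goal is a sequence of events, and the analysis reports the first event the service cannot take. This only illustrates the concept, not the MoSCoE algorithm, and the e-Library transitions are hypothetical.

```python
def find_failure(service, goal_events, start="s0"):
    """Replay the goal event sequence; report the first unrealizable event."""
    state = start
    for i, event in enumerate(goal_events):
        nxt = service.get(state, {}).get(event)
        if nxt is None:
            return f"failure at step {i}: no '{event}' transition from state '{state}'"
        state = nxt
    return "goal sequence realizable"

# Hypothetical e-Library service: it can search and lend, but never notifies.
e_library = {"s0": {"search": "s1"}, "s1": {"lend": "s0"}}
print(find_failure(e_library, ["search", "lend", "notify"]))
```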
SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.
Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D
2017-04-01
The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.
Ballesteros, Michael F.; Webb, Kevin; McClure, Roderick J.
2017-01-01
Introduction The Centers for Disease Control and Prevention (CDC) developed the Web-based Injury Statistics Query and Reporting System (WISQARS™) to meet the data needs of injury practitioners. In 2015, CDC completed a Portfolio Review of this system to inform its future development. Methods Evaluation questions addressed utilization, technology and innovation, data sources, and tools and training. Data were collected through environmental scans, a review of peer-reviewed and grey literature, a web search, and stakeholder interviews. Results Review findings led to specific recommendations for each evaluation question. Response CDC reviewed each recommendation and initiated several enhancements that will improve the ability of injury prevention practitioners to leverage these data, better make sense of query results, and incorporate findings and key messages into prevention practices. PMID:28454867
A resource-oriented web service for environmental modeling
NASA Astrophysics Data System (ADS)
Ferencik, Ioan
2013-04-01
Environmental modeling is a widely adopted practice in the study of natural phenomena. Environmental models can be difficult to build and use, so sharing them within the community is an important aspect. The most common approach to sharing a model is to expose it as a web service. In practice, interaction with such a web service is cumbersome due to the lack of a standardized contract and the complexity of the model being exposed. In this work we investigate the use of a resource-oriented approach to exposing environmental models as web services. We view a model as a layered resource built atop the object concept from object-oriented programming, augmented with persistence capabilities provided by an embedded object database to keep track of its state, and implementing the four basic principles of resource-oriented architectures: addressability, statelessness, representation and uniform interface. For the implementation we use exclusively open source software: the Django framework, the dyBase object-oriented database and the Python programming language. We developed a generic framework of resources structured into a hierarchy of types and then extended this typology with resources specific to the domain of environmental modeling. To test our web service we used cURL, a robust command-line web client.
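The abstract names Django as the implementation framework; a minimal sketch of what an addressable, stateless model resource might look like in Django is given below. It assumes an already configured Django project, and the class, route and field names are illustrative rather than taken from the paper.

```python
# views.py -- expose an environmental model as a resource with a uniform interface.
from django.http import JsonResponse
from django.views import View

class ModelResource(View):
    def get(self, request, model_id):
        # Representation: return the resource's current state as JSON.
        return JsonResponse({"id": model_id, "status": "ready"})

    def post(self, request, model_id):
        # Uniform interface: POST triggers a model run with the request parameters.
        return JsonResponse({"id": model_id, "status": "running"})

# urls.py -- addressability: each model gets its own URL.
# urlpatterns = [path("models/<int:model_id>/", ModelResource.as_view())]
```

A client such as cURL can then exercise the interface with plain GET and POST requests, matching the testing approach described in the abstract.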
Recipes for Semantic Web Dog Food — The ESWC and ISWC Metadata Projects
NASA Astrophysics Data System (ADS)
Möller, Knud; Heath, Tom; Handschuh, Siegfried; Domingue, John
Semantic Web conferences such as ESWC and ISWC offer prime opportunities to test and showcase semantic technologies. Conference metadata about people, papers and talks is diverse in nature and neither so small as to be uninteresting nor so big as to be unmanageable. Many metadata-related challenges that may arise in the Semantic Web at large are also present here. Metadata must be generated from sources which are often unstructured and hard to process, and may originate from many different players, therefore suitable workflows must be established. Moreover, the generated metadata must use appropriate formats and vocabularies, and be served in a way that is consistent with the principles of linked data. This paper reports on the metadata efforts from ESWC and ISWC, identifies specific issues and barriers encountered during the projects, and discusses how these were approached. Recommendations are made as to how these may be addressed in the future, and we discuss how these solutions may generalize to metadata production for the Semantic Web at large.
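A small sketch of producing linked-data conference metadata in Python with rdflib is shown below; FOAF and Dublin Core are common vocabulary choices for people and papers, though the actual ESWC/ISWC metadata uses additional ontologies not shown here, and the base URI is hypothetical.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import FOAF, DCTERMS, RDF

EX = Namespace("http://data.example.org/conference/")   # hypothetical base URI
g = Graph()
g.bind("foaf", FOAF)
g.bind("dcterms", DCTERMS)

author = EX["person/jane-doe"]
paper = EX["paper/42"]
g.add((author, RDF.type, FOAF.Person))
g.add((author, FOAF.name, Literal("Jane Doe")))
g.add((paper, DCTERMS.title, Literal("An Example Semantic Web Paper")))
g.add((paper, DCTERMS.creator, author))

print(g.serialize(format="turtle"))
```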
NASA Astrophysics Data System (ADS)
Arias Muñoz, C.; Brovelli, M. A.; Kilsedar, C. E.; Moreno-Sanchez, R.; Oxoli, D.
2017-09-01
The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.
Approaches to communication in response to geo-hydrological risk: POLARIS an Italian web initiative.
NASA Astrophysics Data System (ADS)
Salvati, Paola; Pernice, Umberto; Bianchi, Cinzia; Fiorucci, Federica; Marchesini, Ivan; Guzzetti, Fausto
2015-04-01
In the contemporary information and knowledge-based society, communication can foster effective responses to geo-hydrological risks by increasing awareness of the causes and consequences of specific hazards, e.g., landslides, debris flows, and floods, and by fostering the capacity of individuals, groups, and organizations to prepare for, manage and recover from geo-hydrological events. In this context, communication plays a vital role in all phases of the disaster cycle. Although in the last few years the scientific community has begun to disseminate information on geo-hydrological hazards and the associated risks through thematic websites, these remain mainly addressed to experts for specific technical purposes, with contents and web interfaces hardly appreciated by a wider audience and rarely synchronised with social networks. To address the lack of communication on geo-hydrological hazards with potential human consequences in Italy, we designed the POLARIS Web site. The initiative has the main objective of contributing, in different ways and at different geographical scales, to raising awareness about landslides and floods and their impact on Italian society. The website is structured into six main sections (i.e. Reports, Are you ready, Events, Alert Zones, Focus and Blog) that provide different and complementary information including, respectively: periodical reports on landslide and flood risk to the population of Italy, suitable behaviors to adopt during damaging events, data and analyses on specific events, visual and detailed information on damaging events in the Italian Alert Zones defined by the Civil Protection Authority, and blog posts on landslide and flood events encouraging citizens' participation through crowd-sourced information. Consultants experienced in project management, web-communication strategies on natural hazards, info-graphics, and user experience design were involved in the initiative to arrange and publish the information, considering the usability and accessibility of the website and key graphic aspects of web 2.0 information, making the website's communication more effective for diversified audiences. Specific icons were designed to describe the geo-hydrological events, and maps to visualize their impact on the territory. The scientific and technical contents are edited using appropriate communication strategies that adopt a less technical and more widely comprehensible language, using intuitive and engaging web interfaces and linking messages to social media that encourage citizens' interaction. Monitoring user access to the website for more than a year after its publication, we noticed that most access corresponded to the occurrence of the worst geo-hydrological events and, in particular, to occasions when journalists or scientists promoted the website on television. This positive effect on the growth of user access suggested that we enhance our collaboration with science journalists by linking traditional (i.e. TV) and social media, to further raise awareness of the website and to better explain to users how to use the website's information to increase their resilience to geo-hydrological hazards.
Content and Design Features of Academic Health Sciences Libraries' Home Pages.
McConnaughy, Rozalynd P; Wilson, Steven P
2018-01-01
The goal of this content analysis was to identify commonly used content and design features of academic health sciences library home pages. After developing a checklist, data were collected from 135 academic health sciences library home pages. The core components of these library home pages included a contact phone number, a contact email address, an Ask-a-Librarian feature, the physical address listed, a feedback/suggestions link, subject guides, a discovery tool or database-specific search box, multimedia, social media, a site search option, a responsive web design, and a copyright year or update date.
NASA Astrophysics Data System (ADS)
McGibbney, L. J.; Armstrong, E. M.
2016-12-01
Figuratively speaking, Scientific Datasets (SD) are shared by data producers in a multitude of shapes, sizes and flavors. Primarily, however, they exist as machine-independent manifestations supporting the creation, access, and sharing of array-oriented SD that can on occasion be spread across multiple files. Within the Earth Sciences, the most notable general examples include the HDF family and NetCDF, with other formats such as GRIB used pervasively within specific domains such as the oceanographic, atmospheric and meteorological sciences. Such file formats contain coverage data, i.e. a digital representation of some spatio-temporal phenomenon. A challenge for large data producers such as NASA and NOAA, as well as for consumers of coverage datasets (particularly surrounding visualization and interactive use within web clients), is that working with these datasets is still not straightforward due to their size, serialization and inherent complexity. Additionally, existing data formats are either unsuitable for the Web (like netCDF files) or hard to interpret independently due to missing standard structures and metadata (e.g. the OPeNDAP protocol). Therefore alternative, Web-friendly manifestations of such datasets are required. CoverageJSON is an emerging data format for publishing coverage data to the web in a web-friendly way which fits in with the linked data publication paradigm, hence lowering the barrier to interpretation by consumers via mobile devices and client applications, as well as by data producers who can build next-generation Web-friendly Web services around datasets. This work will detail how CoverageJSON is being evaluated at NASA JPL's PO.DAAC as an enabling data representation format for publishing SD as Linked Open Data embedded within SD landing pages as well as via semantic data repositories. We are also evaluating how the use of CoverageJSON within SD landing pages addresses the long-standing acknowledgement that SD producers are not currently performing content-based optimization of their landing pages for better crawlability by commercial search engines.
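To give a feel for the format, here is a minimal CoverageJSON-style document built as a plain Python dict; the member names (Coverage, Domain, NdArray) follow the core of the CoverageJSON specification as best recalled here, with the coordinate referencing section omitted for brevity, so the published specification should be consulted for the authoritative structure.

```python
import json

coverage = {
    "type": "Coverage",
    "domain": {
        "type": "Domain",
        "domainType": "Grid",
        "axes": {"x": {"values": [-120.0, -119.5]},
                 "y": {"values": [34.0, 34.5]}},
    },
    "parameters": {
        "SST": {"type": "Parameter",
                "observedProperty": {"label": {"en": "Sea Surface Temperature"}},
                "unit": {"symbol": "K"}},
    },
    "ranges": {
        "SST": {"type": "NdArray", "dataType": "float",
                "axisNames": ["y", "x"], "shape": [2, 2],
                "values": [290.1, 290.4, 289.8, 290.0]},
    },
}
print(json.dumps(coverage, indent=2))
```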
NASA Astrophysics Data System (ADS)
Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael
2015-04-01
We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems do not respect administrative or political boundaries, and they must be addressed by integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources, and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations, each with competitive advantages for carrying out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems. Geospatial capabilities for mapping, visualization, and spatial analysis will be important components of this new generation of Web-based interoperable information systems in the water domain. The purpose of this presentation is to increase the awareness of scientists, IT personnel and agency managers about the advantages offered by the combined use of Open Data, Open Specifications for geospatial and water-related data collection, storage and sharing, as well as mature FOSS projects for the creation of interoperable Web-based information systems in the water domain. A case study is used to illustrate how these principles and technologies can be integrated to create a system with the previously mentioned characteristics for managing and responding to flood events.
NASA Astrophysics Data System (ADS)
Zhang, M.; Cooper, L. W.; Biasatti, D. M.; Kedra, M.; Grebmeier, J. M.
2016-02-01
Food web dynamics in the Chukchi Sea have previously been evaluated using bulk analysis of stable carbon and nitrogen isotopes of organisms. However, recent advances in compound-specific stable isotope analysis of amino acids indicate the potential to better identify the contributions of different dietary sources (e.g., pelagic vs. benthic, ice algae vs. phytoplankton) and to resolve complexities of food web structure that are difficult to address with bulk isotope analysis. Here we combine amino acid δ13C and δ15N data measured from primary producers and tissues of bivalves, polychaetes and other benthic invertebrates collected during two cruises in the summers of 2013 and 2015 in the Pacific Arctic. The results showed spatial variation of carbon isotope values in amino acids, with differences of up to 6 per mil for each individual species or taxon studied, indicating a geographic shift in the food-web baseline. Furthermore, the spatial variation in isotopic values was related to environmental factors, specifically sea ice extent, and to total organic carbon, total organic nitrogen and the carbon/nitrogen ratio of the organic fractions of surface sediments. Results also indicated that trophic levels, as estimated by differences in the nitrogen isotope composition of glutamic acid and phenylalanine [Δ15Nglu-phe (δ15Nglu - δ15Nphe)], varied spatially by 0.5 to 1.5 trophic levels for certain species or taxa such as Macoma calcarea, Maldanidae and Ampelisca, indicating trophic level shifts associated with the food quality of organic matter in the organic fraction of the sediments. These results can potentially be used to predict future food web change in this high latitude marine system that is known for its ecological importance and on-going environmental changes, including warming and sea ice decline.
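For context on the glutamic acid/phenylalanine measure used above (this note is background, not from the abstract): trophic position from these two amino acids is commonly estimated as TP = (δ15Nglu − δ15Nphe − β)/TDF + 1, where β is the offset between the two amino acids in primary producers (roughly 3.4 per mil in aquatic systems) and TDF is the trophic discrimination factor (roughly 7.6 per mil per level); these are typical literature values, and the study may use different constants.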
A Proposal for a Thesaurus for Web Services in Solar Radiation
NASA Technical Reports Server (NTRS)
Gschwind, Benoit; Menard, Lionel; Ranchin, Thierry; Wald, Lucien; Stackhouse, Paul W., Jr.
2007-01-01
Metadata are necessary to discover, describe and exchange any type of information, resource and service at a large scale. A significant amount of effort has been made in the field of geography and environment to establish standards. Effort is still needed to address more specific domains such as renewable energies. This communication focuses on solar energy and more specifically on aspects of solar radiation that relate to geography and meteorology. A thesaurus is proposed for the key elements in solar radiation, namely time, space and radiation type. The importance of time series in solar radiation is outlined and attributes of the key elements are discussed. An XML schema for encoding metadata is proposed. The exploitation of such a schema in web services is discussed. This proposal is a first attempt at establishing a thesaurus for describing data and applications in solar radiation.
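The proposed XML schema itself is not reproduced in the abstract; the sketch below merely illustrates, using Python's ElementTree, what a record covering the three key elements (time, space, radiation type) might look like, with all element and attribute names being hypothetical rather than those of the paper's schema.

```python
import xml.etree.ElementTree as ET

record = ET.Element("solarRadiationRecord")
ET.SubElement(record, "timeCoverage", start="2005-01-01", end="2005-12-31", step="PT1H")
extent = ET.SubElement(record, "spatialExtent")
ET.SubElement(extent, "point", lat="48.85", lon="2.35")
ET.SubElement(record, "radiationType").text = "global horizontal irradiance"

print(ET.tostring(record, encoding="unicode"))
```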
Hulse, Nathan C; Long, Jie; Tao, Cui
2013-01-01
Infobuttons have been established as an effective resource for addressing information needs at the point of care, as evidenced by recent research and their inclusion in government-based electronic health record incentive programs in the United States. Yet their wide success has been limited to a specific set of domains (lab data, medication orders, and problem lists) and to discrete, singular concepts that are already documented in the electronic medical record. In this manuscript, we present an effort to broaden their utility by connecting a semantic web-based phenotyping engine with an infobutton framework in order to identify and address broader issues in patient data, derived from multiple data sources. We have tested these patterns by defining and testing semantic definitions of pre-diabetes and metabolic syndrome. We intend to carry forward relevant information to the infobutton framework to present timely, relevant education resources to patients and providers.
Corporate Web Sites in Traditional Print Advertisements.
ERIC Educational Resources Information Center
Pardun, Carol J.; Lamb, Larry
1999-01-01
Describes the Web presence in print advertisements to determine how marketers are creating bridges between traditional advertising and the Internet. Content analysis showed Web addresses in print ads; categories of advertisers most likely to link print ads with Web sites; and whether the Web site attempts to develop a database of potential…
The effect of tailored Web-based interventions on pain in adults: a systematic review protocol.
Martorella, Géraldine; Gélinas, C; Bérubé, M; Boitor, M; Fredericks, S; LeMay, S
2016-04-12
Information technologies can facilitate the implementation of health interventions, especially in the case of widespread conditions such as pain. Tailored Web-based interventions have been recognized for health behavior change among diverse populations. However, none of the systematic reviews looking at Web-based interventions for pain management has specifically addressed the contribution of tailoring. The aims of this systematic review are to assess the effect of tailored Web-based pain management interventions on pain intensity and physical and psychological functions. Randomized controlled trials including adults suffering from any type of pain and involving Web-based interventions for pain management, using at least one of the three tailoring strategies (personalization, feedback, or adaptation), will be considered. The following types of comparisons will be carried out: tailored Web-based intervention with (1) usual care (passive control group), (2) face-to-face intervention, and (3) standardized Web-based intervention. The primary outcome will be pain intensity measured using a self-report measure such as the numeric rating scale (e.g., 0-10) or visual analog scale (e.g., 0-100). Secondary outcomes will include pain interference with activities and psychological well-being. A systematic review of English and French articles using MEDLINE, Embase, CINAHL, PsycINFO, Web of Science, and Cochrane Library will be conducted from January 2000 to December 2015. Eligibility assessment will be performed independently in an unblinded standardized manner by two reviewers. Extracted data will include the following: sample size, demographics, dropout rate, number and type of study groups, type of pain, inclusion and exclusion criteria, study setting, type of Web-based intervention, tailoring strategy, comparator, type of pain intensity measure, pain-related disability and psychological well-being outcomes, and times of measurement. Disagreements between reviewers at the full-text level will be resolved by consulting a third reviewer, a senior researcher. This systematic review is the first one looking at the specific ingredients and effects of tailored and Web-based interventions for pain management. Results of this systematic review could contribute to a better understanding of the mechanisms by which Web-based interventions could be helpful for people facing pain problems. PROSPERO CRD42015027669.
Human exposure assessment resources on the World Wide Web.
Schwela, Dieter; Hakkinen, Pertti J
2004-05-20
Human exposure assessment is frequently noted as a weak link and bottleneck in the risk assessment process. Fortunately, the World Wide Web and Internet are providing access to numerous valuable sources of human exposure assessment-related information, along with opportunities for information exchange. Internet mailing lists are available as potential online help for exposure assessment questions, e.g. RISKANAL has several hundred members from numerous countries. Various Web sites provide opportunities for training, e.g. Web sites offering general human exposure assessment training include two from the US Environmental Protection Agency (EPA) and four from the US National Library of Medicine. Numerous other Web sites offer access to a wide range of exposure assessment information. For example, the (US) Alliance for Chemical Awareness Web site addresses direct and indirect human exposures, occupational exposures and ecological exposure assessments. The US EPA's Exposure Factors Program Web site provides a focal point for current information and data on exposure factors relevant to the United States. In addition, the International Society of Exposure Analysis Web site provides information about how this society seeks to foster and advance the science of exposure analysis. A major opportunity exists for risk assessors and others to broaden the level of exposure assessment information available via Web sites. Broadening the Web's exposure information could include human exposure factors-related information about country- or region-specific ranges in body weights, drinking water consumption, etc., along with residential factors-related information on air changeovers per hour in various types of residences. Further, country- or region-specific information on how various tasks are performed by various types of consumers could be collected and provided. It is noteworthy that efforts are underway in Europe to develop a multi-country collection of exposure factors and that the European Commission is in the early stages of planning and developing a Web-accessible information system (EIS-ChemRisks) to serve as a single gateway to all major European initiatives on human exposure to chemicals contained in and released from cleaning products, textiles, toys, etc.
Realising the Uncertainty Enabled Model Web
NASA Astrophysics Data System (ADS)
Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.
2012-12-01
The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.
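To make the workflow idea tangible, the toy sketch below propagates an elicited Gaussian input uncertainty (the kind of distribution UncertML can encode) through two chained model steps by Monte Carlo sampling; both model components and all parameter values are invented for illustration and have no connection to the actual UncertWeb services.

```python
import numpy as np

rng = np.random.default_rng(42)
rainfall = rng.normal(loc=50.0, scale=8.0, size=10_000)   # elicited input: mean 50 mm, sd 8 mm

runoff = 0.6 * rainfall - 5.0                              # first model component (illustrative)
flood_depth = 0.02 * np.maximum(runoff, 0.0) ** 1.3        # second model component (illustrative)

print(f"flood depth: mean {flood_depth.mean():.2f} m, "
      f"95th percentile {np.percentile(flood_depth, 95):.2f} m")
```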
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Percivall, G.; Idol, T. A.
2015-12-01
Experts in climate modeling, remote sensing of the Earth, and cyber infrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to the predicted sea level rise. In a Policy Fact Sheet "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increase access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resultant High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included identification of standards and best practices that help researchers and cities deal with climate-related issues. Results of the testbed will now be deployed in pilot applications. The testbed also identified areas of additional development needed to help identify scientific investments and cyberinfrastructure approaches needed to improve the application of climate science research results to urban climate resilience.
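For readers unfamiliar with the OGC Web Coverage Service used in the testbed, the sketch below issues a WCS 2.0 GetCoverage request of the kind that pulls model output into such a workflow; the endpoint and coverage identifier are hypothetical, while the key-value parameter names follow the WCS 2.0 standard.

```python
import requests

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "GetCoverage",
    "coverageId": "sea_level_rise_2050",                   # hypothetical coverage
    "subset": ["Lat(37.4,38.0)", "Long(-122.6,-121.8)"],   # repeated subset parameters
    "format": "application/x-netcdf",
}
resp = requests.get("https://example.org/wcs", params=params, timeout=60)
resp.raise_for_status()
with open("coverage.nc", "wb") as f:
    f.write(resp.content)
```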
Studying Cannabis Use Behaviors With Facebook and Web Surveys: Methods and Insights
2018-01-01
The rapid and wide-reaching expansion of internet access and digital technologies offers epidemiologists numerous opportunities to study health behaviors. One particularly promising new data collection strategy is the use of Facebook’s advertising platform in conjunction with Web-based surveys. Our research team at the Center for Technology and Behavioral Health has used this quick and cost-efficient method to recruit large samples and address unique scientific questions related to cannabis use. In conducting this research, we have gleaned several insights for using this sampling method effectively and have begun to document the characteristics of the resulting data. We believe this information could be useful to other researchers attempting to study cannabis use or, potentially, other health behaviors. The first aim of this paper is to describe case examples of procedures for using Facebook as a survey sampling method for studying cannabis use. We then present several distinctive features of the data produced using this method. Finally, we discuss the utility of this sampling method for addressing specific types of epidemiological research questions. Overall, we believe that sampling with Facebook advertisements and Web surveys is best conceptualized as a targeted, nonprobability-based method for oversampling cannabis users across the United States. PMID:29720366
Group Treatment for Women Gamblers Using Web, Teleconference and Workbook: Effectiveness Pilot.
Boughton, Roberta R; Jindani, Farah; Turner, Nigel E
2016-01-01
While the past decades have seen a dramatic increase in the number of women who gamble and develop consequent problems, treatment services are being underutilized in Ontario. This pilot study explores the feasibility of using web- and phone-based group interventions to expand services available for women who might not otherwise seek or be able to access treatment. Distinct treatment considerations for working with women, such as the value of a women's group, advantages of phone counselling, and the implementation of modern web-based services, were reviewed. The study involved a clinician-facilitated group that used teleconferencing and webinar technology (Adobe Connect) for support and discussion, and a Tutorial Workbook (TW) developed specifically to address the issues and treatment needs of women who gamble at a problematic level. A mixed method analysis used to evaluate the results suggested that the group-based teleconference/webinar approach provided a much-needed means of treatment support for women. Participants reported that the program helped them to understand their gambling triggers, to improve their awareness, to feel better about themselves, to modify their mood and anxiety levels, to feel less isolated, to address their relationships, and to feel more hopeful for the future. The Tutorial Workbook, which was used to supplement the educational component of the group interaction, was highly rated.
Judging nursing information on the WWW: a theoretical understanding.
Cader, Raffik; Campbell, Steve; Watson, Don
2009-09-01
This paper is a report of a study of the judgement processes nurses use when evaluating World Wide Web information related to nursing practice. The World Wide Web has increased the global accessibility of online health information. However, the variable nature of the quality of World Wide Web information and its perceived level of reliability may lead to misinformation. This makes demands on healthcare professionals, and on nurses in particular, to ensure that health information of reliable quality is selected for use in practice. A grounded theory approach was adopted. Semi-structured interviews and focus groups were used to collect data, between 2004 and 2005, from 20 nurses undertaking a postqualification graduate course at a university and 13 nurses from a local hospital in the United Kingdom. A theoretical framework emerged that gave insight into the judgement process nurses use when evaluating World Wide Web information. Participants broke the judgement process down into specific tasks. In addition, they used tacit, process and propositional knowledge and intuition, quasi-rational cognition and analysis to undertake these tasks. World Wide Web information cues, time available and nurses' critical skills were influencing factors in their judgement process. Addressing the issue of quality and reliability associated with World Wide Web information is a global challenge. This theoretical framework could contribute towards meeting this challenge.
Eagleson, Roy; Altamirano-Diaz, Luis; McInnis, Alex; Welisch, Eva; De Jesus, Stefanie; Prapavessis, Harry; Rombeek, Meghan; Seabrook, Jamie A; Park, Teresa; Norozi, Kambiz
2017-03-17
With the increasing implementation of web-based, mobile health interventions in clinical trials, it is crucial for researchers to address the security and privacy concerns of patient information according to high ethical standards. The full process of meeting these standards is often made more complicated due to the use of internet-based technology and smartphones for treatment, telecommunication, and data collection; however, this process is not well-documented in the literature. The Smart Heart Trial is a single-arm feasibility study that is currently assessing the effects of a web-based, mobile lifestyle intervention for overweight and obese children and youth with congenital heart disease in Southwestern Ontario. Participants receive telephone counseling regarding nutrition and fitness; and complete goal-setting activities on a web-based application. This paper provides a detailed overview of the challenges the study faced in meeting the high standards of our Research Ethics Board, specifically regarding patient privacy. We outline our solutions, successes, limitations, and lessons learned to inform future similar studies; and model much needed transparency in ensuring high quality security and protection of patient privacy when using web-based and mobile devices for telecommunication and data collection in clinical research.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-02
..., Proposed Collection: IMLS Museum Web Database: MuseumsCount.gov AGENCY: Institute of Museum and Library... general public. Information such as name, address, phone, email, Web site, staff size, program details... Museum Web Database: MuseumsCount.gov collection. The 60-day notice for the IMLS Museum Web Database...
Electronic Ramp to Success: Designing Campus Web Pages for Users with Disabilities.
ERIC Educational Resources Information Center
Coombs, Norman
2002-01-01
Discusses key issues in addressing the challenge of Web accessibility for people with disabilities, including tools for Web authoring, repairing, and accessibility validation, and relevant legal issues. Presents standards for Web accessibility, including the Section 508 Standards from the Federal Access Board, and the World Wide Web Consortium's…
78 FR 74188 - Sunshine Act Meetings Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-10
... Casks (Public Meeting) (Contact: Kevin Witt, 301-415-2145) This meeting will be Web cast live at the Web... Weather Events (Public Meeting) (Contact: George Wilson, 301-415-1711) This meeting will be Web cast live... will be Web cast live at the Web address-- http://www.nrc.gov/ . Week of January 13, 2014--Tentative...
Surfing the World Wide Web to Education Hot-Spots.
ERIC Educational Resources Information Center
Dyrli, Odvard Egil
1995-01-01
Provides a brief explanation of Web browsers and their use, as well as technical information for those considering access to the WWW (World Wide Web). Curriculum resources and addresses to useful Web sites are included. Sidebars show sample searches using Yahoo and Lycos search engines, and a list of recommended Web resources. (JKP)
How To Build a Web Site in Six Easy Steps.
ERIC Educational Resources Information Center
Yaworski, JoAnn
2002-01-01
Gives instructions in nontechnical terms for building a simple web site using Netscape Navigator or Communicator's web editor. Presents six steps that include: organizing information, creating a page and a background, linking files, linking to Internet web pages, linking images, and linking an email address. Gives advice for sending the web page…
Whetzel, Patricia L.; Grethe, Jeffrey S.; Banks, Davis E.; Martone, Maryann E.
2015-01-01
The NIDDK Information Network (dkNET; http://dknet.org) was launched to serve the needs of basic and clinical investigators in metabolic, digestive and kidney disease by facilitating access to research resources that advance the mission of the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK). By research resources, we mean the multitude of data, software tools, materials, services, projects and organizations available to researchers in the public domain. Most of these are accessed via web-accessible databases or web portals, each developed, designed and maintained by numerous different projects, organizations and individuals. While many of the large government-funded databases, maintained by agencies such as the European Bioinformatics Institute and the National Center for Biotechnology Information, are well known to researchers, many more that have been developed by and for the biomedical research community are unknown or underutilized. At least part of the problem is the nature of dynamic databases, which are considered part of the “hidden” web, that is, content that is not easily accessed by search engines. dkNET was created specifically to address the challenge of connecting researchers to research resources via these types of community databases and web portals. dkNET functions as a “search engine for data”, searching across millions of database records contained in hundreds of biomedical databases developed and maintained by independent projects around the world. A primary focus of dkNET is the centers and projects specifically created to provide high-quality data and resources to NIDDK researchers. Through the novel data ingest process used in dkNET, additional data sources can easily be incorporated, allowing it to scale with the growth of digital data and the needs of the dkNET community. Here, we provide an overview of the dkNET portal and its functions. We show how dkNET can be used to address a variety of use cases that involve searching for research resources. PMID:26393351
Swallow, Veronica M; Hall, Andrew G; Carolan, Ian; Santacroce, Sheila; Webb, Nicholas J A; Smith, Trish; Hanif, Noreen
2014-02-18
There is a lack of online, evidence-based information and resources to support home-based care of childhood CKD stages 3-5. Qualitative interviews were undertaken with parents, patients and professionals to explore their views on the content of the proposed online parent information and support (OPIS) web-application. Data were analysed using Framework Analysis, guided by the concept of Self-efficacy. In total, 32 parents, 26 patients and 12 professionals were interviewed. All groups wanted an application that explains, demonstrates, and enables parental clinical care-giving, with condition-specific, continuously available, reliable, accessible material and a closed communication system to enable contact between families living with CKD. Professionals advocated a regularly updated application to empower parents to make informed health-care decisions. To address these requirements, key web-application components were defined as: (i) Clinical care-giving support (information on treatment regimens, video-learning tools, condition-specific cartoons/puzzles, and a question and answer area) and (ii) Psychosocial support for care-giving (social-networking, case studies, managing stress, and enhancing families' health-care experiences). Developing a web-application that meets parents' information and support needs will maximise its utility, thereby augmenting parents' self-efficacy for CKD caregiving, and optimising outcomes. Self-efficacy theory provides a schema for how parents' self-efficacy beliefs about management of their child's CKD could potentially be promoted by OPIS.
77 FR 28638 - Sunshine Act Meeting Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-15
... Meeting), (Contact: Rani Franovich, 301-415-1868). This meeting will be webcast live at the Web address...). This meeting will be webcast live at the Web address--www.nrc.gov. Week of June 11, 2012--Tentative... Nuclear Regulatory Commission (NRC) on Grid Reliability (Public Meeting) To be held at FERC Headquarters...
75 FR 9451 - Sunshine Act Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-02
..., 301-415-6749.) This meeting will be webcast live at the Web address-- http://www.nrc.gov , Week of... Commission and the Nuclear Regulatory Commission on Grid Reliability (Public Meeting). (Contact: Kenn Miller, 301-415-3152.) This meeting will be webcast live at the Web address-- http://www.nrc.gov . Week of...
75 FR 12583 - Sunshine Act; Meeting Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-16
... Commission and the Nuclear Regulatory Commission on Grid Reliability (Public Meeting). (Contact: Kenn Miller, 301-415-3152). This meeting will be webcast live at the Web address-- http://www.nrc.gov . Week of...) (Contact: Jose Ibarra, 301-415-2581). This meeting will be webcast live at the Web address-- http://www.nrc...
78 FR 42893 - Statement on Regulatory Burden
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-18
... efficiency reasons, commenters are encouraged to submit comments by email or through FCA's Web site. As... the following methods: Email: Send us an email at [email protected] . FCA Web site: http://www.fca.gov... and addresses, will be publicly available. However, we will attempt to remove email addresses to help...
CernVM WebAPI - Controlling Virtual Machines from the Web
NASA Astrophysics Data System (ADS)
Charalampidis, I.; Berzano, D.; Blomer, J.; Buncic, P.; Ganis, G.; Meusel, R.; Segal, B.
2015-12-01
Lately, there has been a trend in scientific projects to look for computing resources in the volunteering community. In addition, to reduce the development effort required to port the scientific software stack to all the known platforms, the use of Virtual Machines (VMs) is becoming increasingly popular. Unfortunately, their use further complicates the software installation and operation, restricting the volunteer audience to sufficiently expert people. CernVM WebAPI is a software solution addressing this specific case in a way that opens wide new application opportunities. It offers a very simple API for setting up, controlling and interfacing with a VM instance in the user's computer, while at the same time relieving the user of the burden of downloading, installing and configuring the hypervisor. WebAPI comes with a lightweight JavaScript library that guides the user through the application installation process. Malicious usage is prevented through a per-domain PKI validation mechanism. In this contribution we give an overview of this new technology, discuss its security features and examine some test cases where it is already in use.
Increasing efficiency of information dissemination and collection through the World Wide Web
Daniel P. Huebner; Malchus B. Baker; Peter F. Ffolliott
2000-01-01
Researchers, managers, and educators have access to revolutionary technology for information transfer through the World Wide Web (Web). Using the Web to effectively gather and distribute information is addressed in this paper. Tools, tips, and strategies are discussed. Companion Web sites are provided to guide users in selecting the most appropriate tool for searching...
ERIC Educational Resources Information Center
Bordeianu, Sever; Carter, Christina E.; Dennis, Nancy K.
2000-01-01
Describes Web-based online public access catalogs (Web OPACs) and other Web-based tools as gateway methods for providing access to library collections. Addresses solutions for overcoming barriers to information, such as through the implementation of proxy servers and other authentication tools for remote users. (Contains 18 references.)…
Multi-dimensional effects of color on the world wide web
NASA Astrophysics Data System (ADS)
Morton, Jill
2002-06-01
Color is the most powerful building material of visual imagery on the World Wide Web. It must function successfully as it has done historically in traditional two-dimensional media, as well as address new challenges presented by this electronic medium. The psychological, physiological, technical and aesthetic effects of color have been redefined by the unique requirements of the electronic transmission of text and images on the Web. Color simultaneously addresses each of these dimensions in this electronic medium.
Integrating Mathematics, Science, and Language Arts Instruction Using the World Wide Web.
ERIC Educational Resources Information Center
Clark, Kenneth; Hosticka, Alice; Kent, Judi; Browne, Ron
1998-01-01
Addresses issues of access to World Wide Web sites, mathematics and science content-resources available on the Web, and methods for integrating mathematics, science, and language arts instruction. (Author/ASK)
75 FR 75170 - APHIS User Fee Web Site
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-02
...] APHIS User Fee Web Site AGENCY: Animal and Plant Health Inspection Service, USDA. ACTION: Notice... recover the costs of providing certain services. This notice announces the availability of a Web site that contains information about the Agency's user fees. ADDRESSES: The Agency's user fee Web site is located at...
AcqWeb: Book-Buying in the Age of the Internet.
ERIC Educational Resources Information Center
Leiserson, Anna Belle; Cook, Eleanor; Brading, Peter; Marshall, David L.
1997-01-01
Describes AcqWeb, a Web site that has pertinent information for library acquisitions. Topics include the partnership with ACQNET, the electronic news group for acquisitions; AcqWeb's structure; editorial policy; the International Directory of E-mail Addresses of Publishers, Vendors and Related Professional Associations; and future possibilities.…
ERIC Educational Resources Information Center
de Freitas Guilhermino Trindade, Daniela; Guimaraes, Cayley; Antunes, Diego Roberto; Garcia, Laura Sanchez; Lopes da Silva, Rafaella Aline; Fernandes, Sueli
2012-01-01
This study analysed the role of knowledge management (KM) tools used to cultivate a community of practice (CP) in its knowledge creation (KC), transfer, learning processes. The goal of such observations was to determine requirements that KM tools should address for the specific CP formed by Deaf and non-Deaf members of the CP. The CP studied is a…
32 CFR 537.12 - Settlement authority.
Code of Federal Regulations, 2010 CFR
2010-07-01
... samples posted at the USARCS Web site (for the address see the Note to § 537.1). USARCS may waive the... Affairs, Social Security disability, and any other government benefits accruing to the injured party. (iv... to the sample posted at the USARCS Web site (for the address see the Note to § 537.1). However, the...
32 CFR 537.12 - Settlement authority.
Code of Federal Regulations, 2011 CFR
2011-07-01
... samples posted at the USARCS Web site (for the address see the Note to § 537.1). USARCS may waive the... Affairs, Social Security disability, and any other government benefits accruing to the injured party. (iv... to the sample posted at the USARCS Web site (for the address see the Note to § 537.1). However, the...
77 FR 27099 - Sunshine Federal Register Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-08
... Actions, (Public Meeting), (Contact: Jessie Quichocho, 301-415-0209). This meeting will be webcast live at... meeting will be webcast live at the Web address: www.nrc.gov . Week of June 4, 2012--Tentative Thursday...) (Contact: Tanny Santos, 301-415-7270). This meeting will be webcast live at the Web address: www.nrc.gov...
75 FR 8155 - Sunshine Act; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-23
... meeting will be webcast live at the Web address--http:// www.nrc.gov 9:30 a.m. Briefing on Decommissioning Funding (Public Meeting) (Contact: Thomas Fredrichs, 301-415-5971). This meeting will be webcast live at... meeting will be webcast live at the Web address-- http://www.nrc.gov . Week of March 8, 2010--Tentative...
77 FR 30331 - Sunshine Act Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-22
...) (Contact: Rani Franovich, 301-415-1868). This meeting will be webcast live at the Web address--www.nrc.gov... will be webcast live at the Web address--www.nrc.gov. Week of June 11, 2012--Tentative Friday, June 15... Regulatory Commission (NRC) on Grid Reliability (Public Meeting) To be held at FERC Headquarters, 888 First...
77 FR 35081 - Sunshine Act Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-12
... Nuclear Regulatory Commission (NRC) on Grid Reliability (Public Meeting). To be held at FERC Headquarters... meeting will be webcast live at the Web address--www.ferc.gov. Week of June 18, 2012--Tentative There are... Wertz, 301-415-1568.) This meeting will be webcast live at the Web address--www.nrc.gov. Week of July 16...
ERIC Educational Resources Information Center
Yudess, Jo
2003-01-01
This article lists the Web sites of 12 international not-for-profit creativity associations designed to trigger more creative thought and research possibilities. Along with Web addresses, the entries include telephone contact information and a brief description of the organization. (CR)
The BiSciCol Triplifier: bringing biodiversity data to the Semantic Web.
Stucky, Brian J; Deck, John; Conlin, Tom; Ziemba, Lukasz; Cellinese, Nico; Guralnick, Robert
2014-07-29
Recent years have brought great progress in efforts to digitize the world's biodiversity data, but integrating data from many different providers, and across research domains, remains challenging. Semantic Web technologies have been widely recognized by biodiversity scientists for their potential to help solve this problem, yet these technologies have so far seen little use for biodiversity data. Such slow uptake has been due, in part, to the relative complexity of Semantic Web technologies along with a lack of domain-specific software tools to help non-experts publish their data to the Semantic Web. The BiSciCol Triplifier is new software that greatly simplifies the process of converting biodiversity data in standard, tabular formats, such as Darwin Core-Archives, into Semantic Web-ready Resource Description Framework (RDF) representations. The Triplifier uses a vocabulary based on the popular Darwin Core standard, includes both Web-based and command-line interfaces, and is fully open-source software. Unlike most other RDF conversion tools, the Triplifier does not require detailed familiarity with core Semantic Web technologies, and it is tailored to a widely popular biodiversity data format and vocabulary standard. As a result, the Triplifier can often fully automate the conversion of biodiversity data to RDF, thereby making the Semantic Web much more accessible to biodiversity scientists who might otherwise have relatively little knowledge of Semantic Web technologies. Easy availability of biodiversity data as RDF will allow researchers to combine data from disparate sources and analyze them with powerful linked data querying tools. However, before software like the Triplifier, and Semantic Web technologies in general, can reach their full potential for biodiversity science, the biodiversity informatics community must address several critical challenges, such as the widespread failure to use robust, globally unique identifiers for biodiversity data.
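A minimal sketch of the general "triplifying" idea described above, written with the rdflib library: one tabular Darwin Core record becomes RDF triples under the Darwin Core term namespace. This is not the BiSciCol Triplifier's own code, and the subject IRI base is a made-up placeholder:

    # Illustrative only: turn one Darwin Core row (a dict) into RDF with rdflib.
    from rdflib import Graph, Literal, Namespace, RDF, URIRef

    DWC = Namespace("http://rs.tdwg.org/dwc/terms/")   # Darwin Core term namespace
    BASE = "http://example.org/occurrence/"             # hypothetical identifier base

    record = {  # one row from a Darwin Core archive
        "occurrenceID": "urn:catalog:MVZ:Mamm:165861",
        "scientificName": "Peromyscus maniculatus",
        "decimalLatitude": "37.87",
        "decimalLongitude": "-122.25",
    }

    g = Graph()
    subject = URIRef(BASE + "165861")
    g.add((subject, RDF.type, DWC.Occurrence))          # type the record as an Occurrence
    for term, value in record.items():
        g.add((subject, DWC[term], Literal(value)))     # one triple per Darwin Core term

    print(g.serialize(format="turtle"))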
Is Domain Highlighting Actually Helpful in Identifying Phishing Web Pages?
Xiong, Aiping; Proctor, Robert W; Yang, Weining; Li, Ninghui
2017-06-01
To evaluate the effectiveness of domain highlighting in helping users identify whether Web pages are legitimate or spurious. As a component of the URL, a domain name can be overlooked. Consequently, browsers highlight the domain name to help users identify which Web site they are visiting. Nevertheless, few studies have assessed the effectiveness of domain highlighting, and the only formal study confounded highlighting with instructions to look at the address bar. We conducted two phishing detection experiments. Experiment 1 was run online: Participants judged the legitimacy of Web pages in two phases. In Phase 1, participants were to judge the legitimacy based on any information on the Web page, whereas in Phase 2, they were to focus on the address bar. Whether the domain was highlighted was also varied. Experiment 2 was conducted similarly but with participants in a laboratory setting, which allowed tracking of fixations. Participants differentiated the legitimate and fraudulent Web pages better than chance. There was some benefit of attending to the address bar, but domain highlighting did not provide effective protection against phishing attacks. Analysis of eye-gaze fixation measures was in agreement with the task performance, but heat-map results revealed that participants' visual attention was attracted by the highlighted domains. Failure to detect many fraudulent Web pages even when the domain was highlighted implies that users lacked knowledge of Web page security cues or how to use those cues. Potential applications include development of phishing prevention training incorporating domain highlighting with other methods to help users identify phishing Web pages.
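For readers unfamiliar with what domain highlighting emphasizes, the toy snippet below pulls the host out of a URL and isolates its last two labels. Real browsers consult the Public Suffix List to find the registrable domain; the two-label heuristic here is only a rough, assumed stand-in:

    # Toy illustration of the URL component that domain highlighting emphasizes.
    from urllib.parse import urlparse

    def highlighted_part(url: str) -> str:
        host = urlparse(url).hostname or ""
        labels = host.split(".")
        # crude stand-in for registrable-domain extraction
        return ".".join(labels[-2:]) if len(labels) >= 2 else host

    # A deceptive URL: the hostname actually ends in "evil.example", not "paypal.com".
    print(highlighted_part("https://www.paypal.com.login.evil.example/secure"))
    # -> evil.example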
Academic Library Web Sites: Current Practice and Future Directions
ERIC Educational Resources Information Center
Detlor, Brian; Lewis, Vivian
2006-01-01
To address competitive threats, academic libraries are encouraged to build robust Web sites personalized to learning and research tasks. Through an evaluation of Association of Research Libraries (ARL)-member Web sites, we suggest how library Web sites should evolve and reflect upon the impacts such recommendations may have on academic libraries…
75 FR 39935 - Drinking Water Strategy Contaminants as Group(s)-Notice of Web Dialogue
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-13
... Group(s)--Notice of Web Dialogue AGENCY: Environmental Protection Agency (EPA). ACTION: Notice. SUMMARY... principles. The purpose of this notice is to announce that EPA will host a Web dialogue. The discussion topics for this Web dialogue are focused on the first of the four principles, addressing some...
About NOAA's National Weather Service
76 FR 54807 - Notice of Proposed Information Collection: IMLS Museum Web Database: MuseumsCount.gov
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-02
...: IMLS Museum Web Database: MuseumsCount.gov AGENCY: Institute of Museum and Library Services, National..., and the general public. Information such as name, address, phone, e-mail, Web site, congressional...: IMLS Museum Web Database, MuseumsCount.gov . OMB Number: To be determined. Agency Number: 3137...
NASA Astrophysics Data System (ADS)
Allebach, J. P.; Ortiz Segovia, Maria; Atkins, C. Brian; O'Brien-Strain, Eamonn; Damera-Venkata, Niranjan; Bhatti, Nina; Liu, Jerry; Lin, Qian
2010-02-01
Businesses have traditionally relied on different types of media to communicate with existing and potential customers. With the emergence of the Web, the relation between the use of print and electronic media has continually evolved. In this paper, we investigate one possible scenario that combines the use of the Web and print. Specifically, we consider the scenario where a small- or medium-sized business (SMB) has an existing web site from which they wish to pull content to create a print piece. Our assumption is that the web site was developed by a professional designer, working in conjunction with the business owner or marketing team, and that it contains a rich assembly of content that is presented in an aesthetically pleasing manner. Our goal is to understand the process that a designer would follow to create an effective and aesthetically pleasing print piece. We are particularly interested to understand the choices made by the designer with respect to placement and size of the text and graphic elements on the page. Toward this end, we conducted an experiment in which professional designers worked with SMBs to create print pieces from their respective web pages. In this paper, we report our findings from this experiment, and examine the underlying conclusions regarding the resulting document aesthetics in the context of the existing design, and engineering and computer science literatures that address this topic
Infant Gastroesophageal Reflux Information on the World Wide Web.
Balgowan, Regina; Greer, Leah C; D'Auria, Jennifer P
2016-01-01
The purpose of this study was to describe the type and quality of health information about infant gastroesophageal reflux (GER) that a parent may find on the World Wide Web. The data collection tool included evaluation of Web site quality and infant GER-specific content on the 30 sites that met the inclusion criteria. The most commonly found content categories in order of frequency were management strategies, when to call a primary care provider, definition, and clinical features. The most frequently mentioned strategies included feeding changes, infant positioning, and medications. Thirteen of the 30 Web sites included information on both GER and gastroesophageal reflux disease. Mention of the use of medication to lessen infant symptoms was found on 15 of the 30 sites. Only 10 of the 30 sites included information about parent support and coping strategies. Pediatric nurse practitioners (PNPs) should utilize well-child visits to address the normalcy of physiologic infant GER and clarify any misperceptions parents may have about diagnosis and the role of medication from information they may have found on the Internet. It is critical for PNPs to assist in the development of Web sites with accurate content, advise parents on how to identify safe and reliable information, and provide examples of high-quality Web sites about child health topics such as infant GER. Copyright © 2016 National Association of Pediatric Nurse Practitioners. Published by Elsevier Inc. All rights reserved.
Pruthi, Amanda; Nielsen, Matthew E; Raynor, Mathew C; Woods, Michael E; Wallen, Eric M; Smith, Angela B
2015-02-01
To determine the readability levels of reputable cancer and urologic Web sites addressing bladder, prostate, kidney, and testicular cancers. Online patient education materials (PEMs) for bladder, prostate, kidney, and testicular malignancies were evaluated from the American Cancer Society, American Society of Clinical Oncology, National Cancer Institute, Urology Care Foundation, Bladder Cancer Advocacy Network, Prostate Cancer Foundation, Kidney Cancer Association, and Testicular Cancer Resource Center. Grade level was determined using several readability indices, and analyses were performed on the basis of cancer type, Web site, and content area (general, causes, risk factors and prevention, diagnosis and staging, treatment, and post-treatment). Estimated grade level of online PEMs ranged from 9.2 to 14.2 with an overall mean of 11.7. Web sites for kidney cancer had the least difficult readability (11.3) and prostate cancer had the most difficult readability (12.1). Among specific Web sites, the most difficult readability levels were noted for the Urology Care Foundation Web site for bladder and prostate cancer and the Kidney Cancer Association and Testicular Cancer Resource Center for kidney and testes cancer. Readability levels within content areas varied on the basis of the disease and Web site. Online PEMs in urologic oncology are written at a level above the average American reader. Simplification of these resources is necessary to improve patient understanding of urologic malignancy. Copyright © 2015 Elsevier Inc. All rights reserved.
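One index commonly used for such readability grading is the Flesch-Kincaid Grade Level; the study above used several indices, so the sketch below is illustrative only, and its vowel-group syllable counter is a crude approximation:

    # Flesch-Kincaid Grade Level: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    import re

    def count_syllables(word: str) -> int:
        # rough heuristic: count vowel groups, minimum one per word
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text: str) -> float:
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z]+", text)
        syllables = sum(count_syllables(w) for w in words)
        n = max(1, len(words))
        return 0.39 * (n / sentences) + 11.8 * (syllables / n) - 15.59

    sample = ("Radical cystectomy removes the bladder and nearby lymph nodes. "
              "Your care team will explain the risks and recovery.")
    print(round(flesch_kincaid_grade(sample), 1))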
76 FR 9636 - Notice of Open Public Hearing and Roundtable Discussion
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-18
... China's domestic economic, social and security issues and how the Chinese government is addressing them... obtained from the USCC Web Site http://www.uscc.gov . Date and Time: Friday, February 25, 2011, 8:45 a.m... to the Commission's Web Site at http://www.uscc.gov as soon as available. ADDRESSES: The hearing will...
32 CFR 536.18 - Cross-servicing of claims.
Code of Federal Regulations, 2010 CFR
2010-07-01
... AGAINST THE UNITED STATES The Army Claims System § 536.18 Cross-servicing of claims. (a) Where claims..., E1.2 (posted on the USARCS Web site; for the address see § 536.2(a)). Tables listing claims offices worldwide are posted to the USARCS Web site at that address. U.S. Air Force claims offices may be identified...
Studying Cannabis Use Behaviors With Facebook and Web Surveys: Methods and Insights.
Borodovsky, Jacob T; Marsch, Lisa A; Budney, Alan J
2018-05-02
The rapid and wide-reaching expansion of internet access and digital technologies offers epidemiologists numerous opportunities to study health behaviors. One particularly promising new data collection strategy is the use of Facebook's advertising platform in conjunction with Web-based surveys. Our research team at the Center for Technology and Behavioral Health has used this quick and cost-efficient method to recruit large samples and address unique scientific questions related to cannabis use. In conducting this research, we have gleaned several insights for using this sampling method effectively and have begun to document the characteristics of the resulting data. We believe this information could be useful to other researchers attempting to study cannabis use or, potentially, other health behaviors. The first aim of this paper is to describe case examples of procedures for using Facebook as a survey sampling method for studying cannabis use. We then present several distinctive features of the data produced using this method. Finally, we discuss the utility of this sampling method for addressing specific types of epidemiological research questions. Overall, we believe that sampling with Facebook advertisements and Web surveys is best conceptualized as a targeted, nonprobability-based method for oversampling cannabis users across the United States. ©Jacob T Borodovsky, Lisa A Marsch, Alan J Budney. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 02.05.2018.
Cousineau, Tara; Houle, Brian; Bromberg, Jonas; Fernandez, Kathrine C; Kling, Whitney C
2008-01-01
Tailored nutrition Web programs constitute an emerging trend in obesity prevention. Initial investment in innovative technology necessitates that the target population be well understood. This pilot study's purpose was to determine the feasibility of a workplace nutrition Web program. Formative research was conducted with gaming industry employees and benefits managers to develop a consensus on workplace-specific nutrition needs. A demonstration Web program was piloted with stakeholders to determine feasibility. Indiana, Mississippi, Nevada, and New Jersey gaming establishments. 86 employees, 18 benefits managers. Prototype Web program. Concept mapping; 16-item nutrition knowledge test; satisfaction. Concept mapping was used to aggregate importance ratings on programmatic content, which informed Web program curriculum. Chi-square tests were performed postintervention to determine knowledge improvement. (1) Employees and benefits managers exhibited moderate agreement about content priorities for the program (r = 0.48). (2) There was a significant increase in employees' nutrition knowledge scores postintervention (t = 7.16, df = 36, P < .001); those with less knowledge exhibited the greatest gains in knowledge scores (r = -0.647, P < .001). Employees and benefit managers do not necessarily agree on the priority of nutrition-related content, suggesting a need for programs to appeal to various stakeholders. Computer-based approaches can address various stakeholder health concerns via tailored, customized programming.
Countering Botnets: Anomaly-Based Detection, Comprehensive Analysis, and Efficient Mitigation
2011-05-01
our network prophylactic for ISPs. Using DNSRBL lists to identify address to provide specific routes into a network device that does further deep ... Notos is resilient to changes in the zone classes we selected. Services like CDNs and major web sites can add new IPs or adjust domain formats, and ... less good domain names, such as file-sharing, porn-related websites, etc., most of which are not run in a professional way and have disputable
Ecosystem oceanography for global change in fisheries.
Cury, Philippe Maurice; Shin, Yunne-Jai; Planque, Benjamin; Durant, Joël Marcel; Fromentin, Jean-Marc; Kramer-Schadt, Stephanie; Stenseth, Nils Christian; Travers, Morgane; Grimm, Volker
2008-06-01
Overexploitation and climate change are increasingly causing unanticipated changes in marine ecosystems, such as higher variability in fish recruitment and shifts in species dominance. An ecosystem-based approach to fisheries attempts to address these effects by integrating populations, food webs and fish habitats at different scales. Ecosystem models represent indispensable tools to achieve this objective. However, a balanced research strategy is needed to avoid overly complex models. Ecosystem oceanography represents such a balanced strategy that relates ecosystem components and their interactions to climate change and exploitation. It aims at developing realistic and robust models at different levels of organisation and addressing specific questions in a global change context while systematically exploring the ever-increasing amount of biological and environmental data.
Prusti, Marjo; Lehtineva, Susanna; Pohjanoksa-Mäntylä, Marika; Bell, J Simon
2012-01-01
The Internet is a frequently used source of drug information, including among people with mental disorders. Online drug information may be narrow in scope, incomplete, and contain errors of omission. To evaluate the quality of online antidepressant drug information in English and Finnish. Forty Web sites were identified using the search terms antidepressants and masennuslääkkeet in English and Finnish, respectively. Included Web sites (14 English, 8 Finnish) were evaluated for aesthetics, interactivity, content coverage, and content correctness using published criteria. All Web sites were assessed using the Date, Author, References, Type, Sponsor (DARTS) and DISCERN quality assessment tools. English and Finnish Web sites had similar aesthetics, content coverage, and content correctness scores. English Web sites were more interactive than Finnish Web sites (P<.05). Overall, adverse drug reactions were covered on 21 of 22 Web sites; however, drug-alcohol interactions were addressed on only 9 of 22 Web sites, and dose was addressed on only 6 of 22 Web sites. Few (2/22 Web sites) provided incorrect information. The DISCERN score was significantly correlated with content coverage (r=0.670, P<.01), content correctness (r=0.663, P<.01), and the DARTS score (r=0.459, P<.05). No Web site provided information about all aspects of antidepressant treatment. Nevertheless, few Web sites provided incorrect information. Both English and Finnish Web sites were similar in terms of aesthetics, content coverage, and content correctness. Copyright © 2012 Elsevier Inc. All rights reserved.
78 FR 78401 - Advisory Committee for Education and Human Resources; Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-26
... 1235, 4201 Wilson Boulevard, Arlington, VA 22230. To attend virtually via WebEX video, the web address... you need assistance joining the meeting, contact WebEx Technical Support at 1-800-857-8777, and reference WebEx meeting number 749 890 295 at URL: https://nsf.webex.com . Operated assisted teleconference...
Are We Ready To Abandon the Classroom? The Dark Side of Web Instruction.
ERIC Educational Resources Information Center
Cohen, LeoNora M.
This paper discusses four assumptions and four concerns regarding instruction using the World Wide Web. The assumptions address: the novice status of the Web course developer; the developer's appreciation for various aspects of the Web; her high expectations for doing it right; and her commitment to not incurring more costs for distance learners.…
Library learning space--empirical research and perspective.
Littleton, Dawn; Rethlefsen, Melissa
2008-01-01
Navigate the Net columns offer navigation to Web sites of value to medical librarians. For this issue, the authors recognize that librarians are frequently challenged to justify the need for the physical space occupied by a library in the context of the wide availability of electronic resources, ubiquitous student laptops, and competition for space needed by other institutional priorities. While this trend started years ago, it continues to raise a number of important practical and philosophical questions for libraries and the institutions they serve. What is the library for? What is library space best used for? How does the concept of "Library as Place" support informed decisions for librarians and space planners? In this issue, Web-based resources are surveyed that address these questions for libraries generally and health sciences libraries more specifically.
A verification strategy for web services composition using enhanced stacked automata model.
Nagamouttou, Danapaquiame; Egambaram, Ilavarasan; Krishnan, Muthumanickam; Narasingam, Poonkuzhali
2015-01-01
Currently, Service-Oriented Architecture (SOA) is becoming the most popular software architecture for contemporary enterprise applications, and one crucial technique for implementing it is web services. An individual service offered by a service provider may represent only limited business functionality; however, by composing individual services from different service providers, a composite service describing the complete business process of an enterprise can be built. Several standards have been defined to address the web service composition problem, notably the Business Process Execution Language (BPEL). BPEL provides an Extensible Markup Language (XML) specification language for defining and implementing business-process workflows over web services. The main difficulty with most practical approaches to service composition is the verification of the composed web services, which must rely on formal verification methods to ensure their correctness. A few research works in the literature have addressed the verification of web services for deterministic systems, but the existing models do not address verification properties such as dead transitions, deadlock, reachability and safeness. In this paper, a new model to verify composed web services using an Enhanced Stacked Automata Model (ESAM) is proposed. The correctness properties of the non-deterministic system are evaluated with respect to dead transitions, deadlock, safeness, liveness and reachability. Web services are first composed using the Business Process Execution Language for Web Services (BPEL4WS), converted into an ESAM (a combination of Muller Automata (MA) and Push Down Automata (PDA)), and then transformed into Promela, the input language of the Simple ProMeLa Interpreter (SPIN) tool. The model is verified using the SPIN tool, and the results revealed better performance in finding dead transitions and deadlock than the existing models.
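The two properties most emphasized above, deadlock and reachability, can be illustrated with a toy breadth-first exploration of a small hand-coded transition system standing in for a composed service. This sketch is not ESAM, BPEL4WS, or SPIN; the states and labels are invented:

    # Toy check of deadlock and reachability over a hypothetical composed-service
    # state machine, by breadth-first exploration.
    from collections import deque

    transitions = {
        "start":          {"invoke": "awaiting_reply"},
        "awaiting_reply": {"reply": "done", "fault": "compensate"},
        "compensate":     {"retry": "start"},
        "done":           {},                 # terminal; accepting, so not a deadlock
    }
    accepting = {"done"}

    def explore(initial="start"):
        seen, queue, deadlocks = set(), deque([initial]), []
        while queue:
            state = queue.popleft()
            if state in seen:
                continue
            seen.add(state)
            successors = transitions.get(state, {})
            if not successors and state not in accepting:
                deadlocks.append(state)       # no outgoing move and not accepting
            queue.extend(successors.values())
        return seen, deadlocks

    reachable, deadlocks = explore()
    print("reachable states:", sorted(reachable))
    print("deadlocked states:", deadlocks)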
Lepoire, D; Richmond, P; Cheng, J-J; Kamboj, S; Arnish, J; Chen, S Y; Barr, C; McKenney, C
2008-08-01
As part of the requirement for terminating the licenses of nuclear power plants or other nuclear facilities, license termination plans or decommissioning plans are submitted by the licensee to the U.S. Nuclear Regulatory Commission (NRC) for review and approval. Decommissioning plans generally refer to the decommissioning of nonreactor facilities, while license termination plans specifically refer to the decommissioning of nuclear reactor facilities. To provide a uniform and consistent review of dose modeling aspects of these plans and to address NRC-wide knowledge management issues, the NRC, in 2006, commissioned Argonne National Laboratory to develop a Web-based training course on reviewing radiological dose assessments for license termination. The course, which had first been developed in 2005 to target specific aspects of the review processes for license termination plans and decommissioning plans, evolved from a live classroom course into a Web-based training course in 2006. The objective of the Web-based training course is to train NRC staff members (who have various relevant job functions and are located at headquarters, regional offices, and site locations) to conduct an effective review of dose modeling in accordance with the latest NRC guidance, including NUREG-1757, Volumes 1 and 2. The exact size of the staff population who will receive the training has not yet been accurately determined but will depend on various factors such as the decommissioning activities at the NRC. This Web-based training course is designed to give NRC staff members modern, flexible access to training. To this end, the course is divided into 16 modules: 9 core modules that deal with basic topics, and 7 advanced modules that deal with complex issues or job-specific topics. The core and advanced modules are tailored to various NRC staff members with different job functions. The Web-based system uses the commercially available software Articulate, which incorporates audio, video, and animation in slide presentations and has glossary, document search, and Internet connectivity features. The training course has been implemented on an NRC system that allows staff members to register, select courses, track records, and self-administer quizzes.
Persistence and Availability of Web Services in Computational Biology
Schultheiss, Sebastian J.; Münch, Marc-Christian; Andreeva, Gergana D.; Rätsch, Gunnar
2011-01-01
We have conducted a study on the long-term availability of bioinformatics Web services: an observation of 927 Web services published in the annual Nucleic Acids Research Web Server Issues between 2003 and 2009. We found that 72% of Web sites are still available at the published addresses, only 9% of services are completely unavailable. Older addresses often redirect to new pages. We checked the functionality of all available services: for 33%, we could not test functionality because there was no example data or a related problem; 13% were truly no longer working as expected; we could positively confirm functionality only for 45% of all services. Additionally, we conducted a survey among 872 Web Server Issue corresponding authors; 274 replied. 78% of all respondents indicate their services have been developed solely by students and researchers without a permanent position. Consequently, these services are in danger of falling into disrepair after the original developers move to another institution, and indeed, for 24% of services, there is no plan for maintenance, according to the respondents. We introduce a Web service quality scoring system that correlates with the number of citations: services with a high score are cited 1.8 times more often than low-scoring services. We have identified key characteristics that are predictive of a service's survival, providing reviewers, editors, and Web service developers with the means to assess or improve Web services. A Web service conforming to these criteria receives more citations and provides more reliable service for its users. The most effective way of ensuring continued access to a service is a persistent Web address, offered either by the publishing journal, or created on the authors' own initiative, for example at http://bioweb.me. The community would benefit the most from a policy requiring any source code needed to reproduce results to be deposited in a public repository. PMID:21966383
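The availability check the study describes can be approximated as follows: request each published service address, follow redirects, and classify the outcome. This is a hedged sketch, not the authors' survey code; only the persistent address mentioned in the abstract is used as an example:

    # Classify a published Web service address as available, redirected, or unavailable.
    import requests

    published_urls = ["http://bioweb.me"]    # example persistent address from the text

    def classify(url: str) -> str:
        try:
            r = requests.get(url, timeout=15, allow_redirects=True)
        except requests.RequestException:
            return "unavailable"
        if r.status_code >= 400:
            return "unavailable"
        # r.history is non-empty when the published address redirected elsewhere
        return "redirected" if r.history else "available at published address"

    for url in published_urls:
        print(url, "->", classify(url))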
Network and User-Perceived Performance of Web Page Retrievals
NASA Technical Reports Server (NTRS)
Kruse, Hans; Allman, Mark; Mallasch, Paul
1998-01-01
The development of the HTTP protocol has been driven by the need to improve the network performance of the protocol by allowing the efficient retrieval of multiple parts of a web page without the need for multiple simultaneous TCP connections between a client and a server. We suggest that the retrieval of multiple page elements sequentially over a single TCP connection may result in a degradation of the perceived performance experienced by the user. We attempt to quantify this perceived degradation through the use of a model which combines a web retrieval simulation and an analytical model of TCP operation. Starting with the current HTTP/1.1 specification, we first suggest a client-side heuristic to improve the perceived transfer performance. We show that the perceived speed of the page retrieval can be increased without sacrificing data transfer efficiency. We then propose a new client/server extension to the HTTP/1.1 protocol to allow for the interleaving of page element retrievals. We finally address the issue of the display of advertisements on web pages, and in particular suggest a number of mechanisms which can make efficient use of IP multicast to send advertisements to a number of clients within the same network.
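A toy back-of-the-envelope model (not the paper's simulation or its analytical TCP model) of why per-element completion times, and hence perceived performance, depend on the retrieval strategy even when total transfer time is fixed; the element sizes and bandwidth are arbitrary assumptions:

    # Compare when each page element finishes under sequential retrieval versus an
    # equal-share (processor-sharing) approximation of interleaved retrieval.
    def sequential_finish(sizes_kb, bandwidth_kbps):
        t, finish = 0.0, []
        for size in sizes_kb:           # elements fetched one after another
            t += size / bandwidth_kbps
            finish.append(round(t, 2))
        return finish

    def shared_finish(sizes_kb, bandwidth_kbps):
        # all unfinished elements split the bandwidth equally; smallest finishes first
        order = sorted(range(len(sizes_kb)), key=lambda i: sizes_kb[i])
        finish = [0.0] * len(sizes_kb)
        t, done, active = 0.0, 0.0, len(sizes_kb)
        for i in order:
            t += (sizes_kb[i] - done) * active / bandwidth_kbps
            finish[i] = round(t, 2)
            done = sizes_kb[i]
            active -= 1
        return finish

    sizes = [10, 30, 60]                     # element sizes in kilobits (assumed)
    print(sequential_finish(sizes, 30))      # [0.33, 1.33, 3.33]
    print(shared_finish(sizes, 30))          # [1.0, 2.33, 3.33] -- same total, different per-element times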
49 CFR 573.9 - Address for submitting required reports and other information.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Internet Web page http://www.safercar.gov/Vehicle+Manufacturers. A manufacturer must use the templates provided at this Web page for all submissions required under this section. Defect and noncompliance... at this Web page. [78 FR 51421, Aug. 20, 2013] ...
Library Services through the World Wide Web.
ERIC Educational Resources Information Center
Xiao, Daniel; Mosley, Pixey Anne; Cornish, Alan
1997-01-01
Provides an overview of the services offered by Texas A&M University's Sterling C. Evans Library via the World Wide Web. Included are public relations, instruction, searching capabilities, enhanced communications, and exhibit options. Future applications of the Web in academic libraries are also addressed. (AEF)
Web Proxy Auto Discovery for the WLCG
NASA Astrophysics Data System (ADS)
Dykstra, D.; Blomer, J.; Blumenfeld, B.; De Salvo, A.; Dewhurst, A.; Verguilov, V.
2017-10-01
All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home) which they direct to the nearest publicly accessible web proxy servers. The responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.
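The core lookup such a WPAD service performs can be sketched as follows: match the requesting node's IP address against registered site ranges and return that site's squids. The ranges and proxy hostnames are invented placeholders, and a production server would answer with a PAC file (JavaScript) rather than a Python list:

    # Sketch of proxy selection by client IP range, using the standard ipaddress module.
    import ipaddress

    SITE_PROXIES = {
        "198.51.100.0/24": ["http://squid1.site-a.example:3128",
                            "http://squid2.site-a.example:3128"],
        "203.0.113.0/24":  ["http://squid.site-b.example:3128"],
    }

    def proxies_for(client_ip: str) -> list:
        addr = ipaddress.ip_address(client_ip)
        for cidr, proxies in SITE_PROXIES.items():
            if addr in ipaddress.ip_network(cidr):
                return proxies
        return []   # the real system falls back to the nearest public proxies

    print(proxies_for("198.51.100.42"))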
Web Proxy Auto Discovery for the WLCG
Dykstra, D.; Blomer, J.; Blumenfeld, B.; ...
2017-11-23
All four of the LHC experiments depend on web proxies (that is, squids) at each grid site to support software distribution by the CernVM FileSystem (CVMFS). CMS and ATLAS also use web proxies for conditions data distributed through the Frontier Distributed Database caching system. ATLAS & CMS each have their own methods for their grid jobs to find out which web proxies to use for Frontier at each site, and CVMFS has a third method. Those diverse methods limit usability and flexibility, particularly for opportunistic use cases, where an experiment’s jobs are run at sites that do not primarily support that experiment. This paper describes a new Worldwide LHC Computing Grid (WLCG) system for discovering the addresses of web proxies. The system is based on an internet standard called Web Proxy Auto Discovery (WPAD). WPAD is in turn based on another standard called Proxy Auto Configuration (PAC). Both the Frontier and CVMFS clients support this standard. The input into the WLCG system comes from squids registered in the ATLAS Grid Information System (AGIS) and CMS SITECONF files, cross-checked with squids registered by sites in the Grid Configuration Database (GOCDB) and the OSG Information Management (OIM) system, and combined with some exceptions manually configured by people from ATLAS and CMS who operate WLCG Squid monitoring. WPAD servers at CERN respond to http requests from grid nodes all over the world with a PAC file that lists available web proxies, based on IP addresses matched from a database that contains the IP address ranges registered to organizations. Large grid sites are encouraged to supply their own WPAD web servers for more flexibility, to avoid being affected by short term long distance network outages, and to offload the WLCG WPAD servers at CERN. The CERN WPAD servers additionally support requests from jobs running at non-grid sites (particularly for LHC@Home) which they direct to the nearest publicly accessible web proxy servers. Furthermore, the responses to those requests are geographically ordered based on a separate database that maps IP addresses to longitude and latitude.
A Prototype Land Information Sensor Web: Design, Implementation and Implication for the SMAP Mission
NASA Astrophysics Data System (ADS)
Su, H.; Houser, P.; Tian, Y.; Geiger, J. K.; Kumar, S. V.; Gates, L.
2009-12-01
Land Surface Model (LSM) predictions are regular in time and space, but these predictions are influenced by errors in model structure, input variables, parameters and inadequate treatment of sub-grid scale spatial variability. Consequently, LSM predictions are significantly improved through observation constraints made in a data assimilation framework. Several multi-sensor satellites are currently operating which provide multiple global observations of the land surface and its related near-atmospheric properties. However, these observations are not optimal for addressing current and future land surface environmental problems. To meet future earth system science challenges, NASA will develop constellations of smart satellites in sensor web configurations which provide timely on-demand data and analysis to users, and which can be reconfigured based on the changing needs of science and available technology. A sensor web is more than a collection of satellite sensors: it is a system composed of multiple platforms interconnected by a communication network for the purpose of performing specific observations and processing the data required to support specific science goals. Sensor webs can eclipse the value of disparate sensor components by reducing response time and increasing scientific value, especially when two-way interaction between the model and the sensor web is enabled. The prototype Land Information Sensor Web (LISW) study, sponsored by NASA, integrates the Land Information System (LIS) in a sensor web framework that allows optimal two-way information flow: land surface modeling is enhanced using sensor web observations, and in turn the sensor web can be reconfigured to minimize overall system uncertainty. The prototype is based on a simulated interactive sensor web, which is then used to exercise and optimize the sensor web modeling interfaces. The Land Information Sensor Web Service-Oriented Architecture (LISW-SOA) has been developed and is the first sensor web framework developed specifically for land surface studies. Synthetic experiments based on the LISW-SOA and the virtual sensor web provide a controlled environment in which to examine the end-to-end performance of the prototype, the impact of various sensor web design trade-offs and the eventual value of sensor webs for a particular prediction or decision support application. In this paper, the design and implementation of the LISW-SOA and the implications for the Soil Moisture Active and Passive (SMAP) mission are presented. Particular attention is focused on examining the relationship between the economic investment in a sensor web (space- and air-borne, ground based) and the accuracy of the model-predicted soil moisture that can be achieved by using such sensor observations. The virtual LISW study is expected to provide necessary a priori knowledge for designing and deploying the next-generation Global Earth Observation System of Systems (GEOSS).
NASA Astrophysics Data System (ADS)
Khalilian, Madjid; Boroujeni, Farsad Zamani; Mustapha, Norwati
Nowadays the growth of the web causes some difficulties in searching and browsing for useful information, especially in specific domains. However, some portions of the web remain largely underdeveloped, as reflected in a lack of high-quality content. An example is the botany-specific web directory, where a lack of well-structured web directories has limited users' ability to browse required information. In this research we propose an improved framework for constructing a specific web directory. In this framework we use an anchor directory as the foundation for the primary web directory. This web directory is then completed with information that is gathered by an automatic component and filtered by experts. We conduct an experiment to evaluate effectiveness, efficiency and satisfaction.
Saito, L.; Johnson, B.M.; Bartholow, J.; Hanna, R.B.
2001-01-01
We investigated the effects on the reservoir food web of a new temperature control device (TCD) on the dam at Shasta Lake, California. We followed a linked modeling approach that used a specialized reservoir water quality model to forecast operation-induced changes in phytoplankton production. A food web–energy transfer model was also applied to propagate predicted changes in phytoplankton up through the food web to the predators and sport fishes of interest. The food web–energy transfer model employed a 10% trophic transfer efficiency through a food web that was mapped using carbon and nitrogen stable isotope analysis. Stable isotope analysis provided an efficient and comprehensive means of estimating the structure of the reservoir's food web with minimal sampling and background data. We used an optimization procedure to estimate the diet proportions of all food web components simultaneously from their isotopic signatures. Some consumers were estimated to be much more sensitive than others to perturbations to phytoplankton supply. The linked modeling approach demonstrated that interdisciplinary efforts enhance the value of information obtained from studies of managed ecosystems. The approach exploited the strengths of engineering and ecological modeling methods to address concerns that neither of the models could have addressed alone: (a) the water quality model could not have addressed quantitatively the possible impacts to fish, and (b) the food web model could not have examined how phytoplankton availability might change due to reservoir operations.
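To make the energy-transfer logic concrete, here is a minimal sketch (Python, with invented diet proportions rather than the Shasta Lake data) that propagates a change in basal phytoplankton production up a small food web: each consumer's production is the diet-weighted sum of its prey's production, scaled by a 10% trophic transfer efficiency.

```python
TRANSFER_EFFICIENCY = 0.10  # roughly 10% of prey production reaches the consumer

# Invented diet proportions (consumer -> {prey: fraction of diet}); prey are listed before predators.
DIETS = {
    "zooplankton":        {"phytoplankton": 1.0},
    "benthic_inverts":    {"detritus": 1.0},
    "planktivorous_fish": {"zooplankton": 0.7, "benthic_inverts": 0.3},
    "piscivorous_fish":   {"planktivorous_fish": 1.0},
}

def production(phyto: float, detritus: float) -> dict:
    """Propagate basal production up the food web via diet-weighted energy transfer."""
    prod = {"phytoplankton": phyto, "detritus": detritus}
    for consumer, diet in DIETS.items():
        prod[consumer] = TRANSFER_EFFICIENCY * sum(f * prod[prey] for prey, f in diet.items())
    return prod

baseline  = production(1000.0, 500.0)   # arbitrary production units
perturbed = production(800.0, 500.0)    # a 20% drop in phytoplankton production only

for sp in DIETS:
    pct = 100.0 * (perturbed[sp] - baseline[sp]) / baseline[sp]
    print(f"{sp:>18}: {pct:+.1f}% change in production")
```

In this toy version, consumers that draw more of their diet from the perturbed phytoplankton pathway respond more strongly than those relying on detritus, which mirrors the differing sensitivities the abstract reports.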
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-23
... electronic form will be posted on the NRC Web site and on the Federal Rulemaking Web site Regulations.gov... that they do not want publicly disclosed. Federal rulemaking Web site: Go to http://www.regulations.gov... through this Web site. Address questions about NRC dockets to Carol Gallagher, telephone: 301-492-3668, e...
Visual Communication in Web Design - Analyzing Visual Communication in Web Design
NASA Astrophysics Data System (ADS)
Thorlacius, Lisbeth
Web sites are rapidly becoming the preferred media choice for information search, company presentation, shopping, entertainment, education, and social contacts. Along with the various forms of communication that the Web offers, the aesthetic aspects have begun to play an increasingly important role. However, the design and the relevance of the aesthetic aspects in planning and using Web sites have only to a limited degree been the subject of theoretical reflection. For example, Miller (2000), Thorlacius (2001, 2002, 2005), Engholm (2002, 2003), and Beaird (2007) have contributed to setting an initial agenda that addresses the aesthetic aspects. In contrast, there is a considerable amount of literature on theoretical and methodological questions that focuses on the technical and functional aspects. In this context, the aim of this article is to introduce a model for the analysis of visual communication on websites.
A resource-oriented architecture for a Geospatial Web
NASA Astrophysics Data System (ADS)
Mazzetti, Paolo; Nativi, Stefano
2010-05-01
In this presentation we discuss some architectural issues in the design of an architecture for a Geospatial Web, that is, an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as geospatial information, that is, information characterized by a spatial/temporal reference. To this aim, an investigation of the nature of the Web and of the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web into a generic Distributed Computing Infrastructure. While these efforts were definitely successful, enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposed a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies as in the way they are used. Keeping the Web architecture conformant to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web. On the contrary, systems using the same Web technologies and specifications but according to a different architectural style, despite their usefulness, should not be considered part of the Web. If the REST style captures the significant Web characteristics, then, in order to build a Geospatial Web it is necessary that its architecture satisfies all the REST constraints. One of them is of particular importance: the adoption of a Uniform Interface. It prescribes that all the geospatial resources must be accessed through the same interface; moreover, according to the REST style, this interface must satisfy four further constraints: a) identification of resources; b) manipulation of resources through representations; c) self-descriptive messages; and d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations which are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete), which has proven to be flexible and powerful in several general-purpose contexts (e.g. filesystem management, SQL for database management systems, etc.). Restricting the scope to a subset of resources, it would be possible to identify other generic actions that are meaningful for all of them. For example, for geospatial resources, subsetting, resampling, interpolation and coordinate reference system transformation are candidate functionalities for a uniform interface.
However, an investigation is needed to clarify the semantics of those actions for different resources and, consequently, whether they can really assume the role of generic interface operations. Concerning point (a), identification of resources, it is required that every resource addressable in the Geospatial Web has its own identifier (e.g., a URI). This makes it possible to cite and re-use resources simply by providing the URI. OPeNDAP and KVP encodings of OGC data access services specifications might provide a basis for it. Concerning point (b), manipulation of resources through representations, the Geospatial Web poses several issues. In fact, while the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured with several possible data models (e.g. point series, gridded coverages, trajectories, etc.) and encodings. A possibility would be to simplify the interchange formats, choosing to support a subset of data models and format(s). This is in fact what the Web designers did in choosing to define a common format for hypermedia (HTML), even though the underlying protocol is generic. Concerning point (c), self-descriptive messages, the exchanged messages should describe themselves and their content. This would not actually be a major issue, considering the effort put in recent years into geospatial metadata models and specifications. Point (d), hypermedia as the engine of application state, is where the Geospatial Web would mainly differ from existing geospatial information sharing systems. In fact, the existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. In the Geospatial Web, on the other hand, applications should be built by following the path between interconnected resources. The links between resources should be made explicit as hyperlinks. The adoption of Semantic Web solutions would make it possible to define not only the existence of a link between two resources, but also its nature. The implementation of a Geospatial Web would make it possible to build an information system with the same characteristics as the Web, sharing its strengths and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm, which would lower the barrier to accessing geospatial applications for non-specialists (e.g., the success of Google Maps and other Web mapping applications); • Successful Web and Web 2.0 applications - search engines, feeds, social networks - could be integrated or replicated in the Geospatial Web. The main drawbacks would be the following: • The Uniform Interface simplifies the overall system architecture (e.g., no service registry or service descriptors are required) but moves the complexity to the data representation; moreover, since the interface must stay generic, it ends up very simple, so complex interactions may require several transfers. • In the geospatial domain, some of the most valuable resources are processes (e.g., environmental models); how they can be modeled as resources accessed through the common interface is an open issue. Taking into account the advantages and drawbacks, it seems that a Geospatial Web would be useful, but its use would be limited to specific use cases, not covering all possible applications.
The Geospatial Web architecture could be partly based on existing specifications, while other aspects need investigation. References [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future". IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Fielding 2000] Fielding, R. T. 2000. Architectural styles and the design of network-based software architectures. PhD Dissertation. Dept. of Information and Computer Science, University of California, Irvine
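A minimal sketch of the kind of uniform, resource-oriented access discussed above, in Python with a purely hypothetical coverage URI and query parameters: each geospatial resource is identified by a URI, and subsetting is expressed through generic parameters on a plain HTTP GET, so the client only ever manipulates representations of the resource.

```python
import urllib.parse
import urllib.request

# Hypothetical identifier for a gridded coverage; real Geospatial Web URIs would differ.
COVERAGE_URI = "https://geoweb.example.org/coverages/sst-daily"

def get_subset(uri: str, bbox, start: str, end: str, fmt: str = "application/netcdf") -> bytes:
    """Retrieve a spatio-temporal subset of a coverage through a generic HTTP interface."""
    query = urllib.parse.urlencode({
        "bbox": ",".join(str(v) for v in bbox),   # minLon,minLat,maxLon,maxLat
        "time": f"{start}/{end}",
    })
    req = urllib.request.Request(f"{uri}?{query}", headers={"Accept": fmt})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read()  # a representation of the resource, not the resource itself

# Example: fetch a Mediterranean subset for one week (assuming such a service exists).
data = get_subset(COVERAGE_URI, (-6.0, 30.0, 36.0, 46.0), "2009-07-01", "2009-07-07")
```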
Going, going, still there: using the WebCite service to permanently archive cited web pages.
Eysenbach, Gunther; Trudel, Mathieu
2005-12-30
Scholars are increasingly citing electronic "web references" which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To "webcite" a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its "instructions for authors" accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information including the WebCite link on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) "prospectively" before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. Finally, WebCite can process publisher submitted "citing articles" (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, caching retrospectively references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics.
Evaluation Criteria for the Educational Web-Information System
ERIC Educational Resources Information Center
Seok, Soonhwa; Meyen, Edward; Poggio, John C.; Semon, Sarah; Tillberg-Webb, Heather
2008-01-01
This article addresses how evaluation criteria improve educational Web-information system design, and the tangible and intangible benefits of using evaluation criteria, when implemented in an educational Web-information system design. The evaluation criteria were developed by the authors through a content validation study applicable to…
Harper, Simon; Yesilada, Yeliz
2012-01-01
This is a technological review paper focussed on identifying both the research challenges and opportunities for further investigation arising from emerging technologies, and it does not aim to propose any recommendation or standard. It is focussed on blind and partially sighted World Wide Web (Web) users along with others who use assistive technologies. The Web is a fast moving interdisciplinary domain in which new technologies, techniques and research is in perpetual development. It is often difficult to maintain a holistic view of new developments within the multiple domains which together make up the Web. This suggests that knowledge of the current developments and predictions of future developments are additionally important for the accessibility community. Web accessibility has previously been characterised by the correction of our past mistakes to make the current Web fulfil the original vision of access for all. New technologies were not designed with accessibility in mind and technologies that could be useful for addressing accessibility issues were not identified or adopted by the accessibility community. We wish to enable the research community to undertake preventative measures and proactively address challenges, while recognising opportunities, before they become unpreventable or require retrospective technological enhancement. This article then reviews emerging trends within the Web and Web Accessibility domains.
Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A
2011-11-29
Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios.
2011-01-01
Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392
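As an illustration of the core computation behind such a service, the following sketch (Python with NumPy, using invented coordinates rather than the Barcelona TB data) bins geocoded case locations into a simple 2-D density grid of the kind a geoprocessing web service could render and return as a density-map layer.

```python
import numpy as np

# Invented geocoded case locations (longitude, latitude); a real service would read
# them from the spatial backend database of geocoded residential addresses.
rng = np.random.default_rng(0)
lon = 2.15 + 0.05 * rng.standard_normal(200)
lat = 41.39 + 0.04 * rng.standard_normal(200)

# Bin cases into a regular grid; each cell count is a crude local incidence density.
density, lon_edges, lat_edges = np.histogram2d(lon, lat, bins=25)

# A geoprocessing service would style this grid as a map layer; here we just report
# the densest cell as a stand-in for "the most affected area".
i, j = np.unravel_index(np.argmax(density), density.shape)
print(f"densest cell: {density[i, j]:.0f} cases near "
      f"lon {lon_edges[i]:.3f}-{lon_edges[i+1]:.3f}, lat {lat_edges[j]:.3f}-{lat_edges[j+1]:.3f}")
```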
Hydrological models as web services: Experiences from the Environmental Virtual Observatory project
NASA Astrophysics Data System (ADS)
Buytaert, W.; Vitolo, C.; Reaney, S. M.; Beven, K.
2012-12-01
Data availability in environmental sciences is expanding at a rapid pace. From the constant stream of high-resolution satellite images to the local efforts of citizen scientists, there is an increasing need to process the growing stream of heterogeneous data and turn it into useful information for decision-making. Environmental models, ranging from simple rainfall-runoff relations to complex climate models, can be very useful tools to process data, identify patterns, and help predict the potential impact of management scenarios. Recent technological innovations in networking, computing and standardization may bring a new generation of interactive models plugged into virtual environments closer to the end-user. They are the drivers of major funding initiatives such as the UK's Virtual Observatory program, and the U.S. National Science Foundation's Earth Cube. In this study we explore how hydrological models, being an important subset of environmental models, have to be adapted in order to function within a broader environment of web-services and user interactions. Historically, hydrological models have been developed for very different purposes. Typically they have a rigid model structure, requiring a very specific set of input data and parameters. As such, the process of implementing a model for a specific catchment requires careful collection and preparation of the input data, extensive calibration and subsequent validation. This procedure seems incompatible with a web environment, where data availability is highly variable, heterogeneous and constantly changing in time, and where the requirements of end-users may not necessarily align with the original intention of the model developer. We present prototypes of models that are web-enabled using the web standards of the Open Geospatial Consortium, and implemented in online decision-support systems. We identify issues related to (1) optimal use of available data; (2) the need for flexible and adaptive structures; (3) quantification and communication of uncertainties. Lastly, we present some road maps to address these issues and discuss them in the broader context of web-based data processing and "big data" science.
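To ground the discussion, here is a minimal sketch (Python, with an invented parameterization) of the kind of simple rainfall-runoff relation mentioned above, written as a stateless function that could sit behind a standards-based web service; it is not one of the Environmental Virtual Observatory prototypes.

```python
from typing import List

def linear_reservoir_runoff(rainfall_mm: List[float], k: float = 0.2,
                            storage0_mm: float = 0.0) -> List[float]:
    """Very simple linear-reservoir rainfall-runoff model.

    At each time step, rainfall is added to storage and a fixed fraction k of the
    storage drains as runoff. Parameters are illustrative, not calibrated.
    """
    storage = storage0_mm
    runoff = []
    for p in rainfall_mm:
        storage += p
        q = k * storage
        storage -= q
        runoff.append(q)
    return runoff

# A web-service wrapper (for example an OGC Web Processing Service process) would accept
# the rainfall series and parameters in the request and return the runoff series.
print(linear_reservoir_runoff([0.0, 12.0, 5.0, 0.0, 0.0, 8.0]))
```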
Communication of Career Pathways Through Associate Degree Program Web Sites: A Baseline Assessment.
Becker, Ellen A; Vargas, Jenny
2018-05-08
The American Association for Respiratory Care sponsored a series of conferences that addressed the competency of the future workforce of respiratory therapists (RTs). Based upon the findings from those conferences, several initiatives emerged that support RTs earning a baccalaureate (or bachelor's) degree. The objective of this study was to identify the ways that associate degree programs communicate career pathways toward a baccalaureate degree through their Web sites. This cross-sectional observational study used a random sample of 100 of the 362 associate degree programs approved by the Commission on Accreditation for Respiratory Care. Data were collected from 3 specific categories: demographic data, baccalaureate completion information, and the Web page location for the program. The presence of statements related to any pathway toward a bachelor's degree, transfer credits, articulation agreements, and links for baccalaureate completion were recorded. The descriptive statistics in this study were reported as total numbers and percentages. Of the 100 programs in the random sample, only 89 were included in the study. Only 39 (44%) programs had links on their program Web site that had any content related to bachelor's degrees, 16 (18%) identified college transfer courses toward a bachelor's degree, and 26 (29%) programs included baccalaureate articulation agreements on their Web site. A minority of associate degree programs communicated career pathway information to their prospective and current students through program Web sites. An informative Web site would make the path more transparent for entry-level students to meet their future educational needs as their careers progress. Copyright © 2018 by Daedalus Enterprises.
Seeking Inclusivity in English Language Learning Web Sites
ERIC Educational Resources Information Center
McClure, Kristene K.
2010-01-01
This article contributes to research on critical perspectives in Teaching English to Speakers of Other Languages (TESOL) and on evaluative frameworks for English language learning (ELL) Web sites. The research addressed the following questions: (a) To what extent do ELL Web sites depict diverse representations of gender, race, socioeconomic…
78 FR 8108 - NextGen Solutions Vendors Guide
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-05
... Commerce is developing a web-based NextGen Solutions Vendors Guide intended to be used by foreign air... being listed on the Vendors Guide Web site should submit their company's name, Web site address, contact... to aviation system upgrades) Example: Engineering Services More information on the four ICAO ASBU...
Tune in the Net with RealAudio.
ERIC Educational Resources Information Center
Buchanan, Larry
1997-01-01
Describes how to connect to the RealAudio Web site to download a player that provides sound from Web pages to the computer through streaming technology. Explains hardware and software requirements and provides addresses for other RealAudio Web sites, including weather information and current news. (LRW)
Marketing Opportunities in the Digital World.
ERIC Educational Resources Information Center
Kiani, G. Reza
1998-01-01
Addresses the opportunities offered by the Web to marketers. Considers the Web as a two-way communication model in which four different communication states can take place. Suggests the necessity of new concepts and models for marketers to manage their Web sites, and presents opportunities supporting the marketers' objectives in the new…
Avoiding Pornography Landmines while Traveling the Information Superhighway.
ERIC Educational Resources Information Center
Lehmann, Kay
2002-01-01
Discusses how to avoid pornographic sites when using the Internet in classrooms. Highlights include re-setting the Internet home page; putting appropriate links in a Word document; creating a Web page with appropriate links; downloading the content of a Web site; educating the students; and re-checking all Web addresses. (LRW)
Rotondi, Armando J; Eack, Shaun M; Hanusa, Barbara H; Spring, Michael B; Haas, Gretchen L
2015-03-01
E-health applications are becoming integral components of general medical care delivery models and emerging for mental health care. Few exist for treatment of those with severe mental illness (SMI). In part, this is due to a lack of models to design such technologies for persons with cognitive impairments and lower technology experience. This study evaluated the effectiveness of an e-health design model for persons with SMI termed the Flat Explicit Design Model (FEDM). Persons with schizophrenia (n = 38) performed tasks to evaluate the effectiveness of 5 Web site designs: 4 were prominent public Web sites, and 1 was designed according to the FEDM. Linear mixed-effects regression models were used to examine differences in usability between the Web sites. Omnibus tests of between-site differences were conducted, followed by post hoc pairwise comparisons of means to examine specific Web site differences when omnibus tests reached statistical significance. The Web site designed using the FEDM required less time to find information, had a higher success rate, and was rated easier to use and less frustrating than the other Web sites. The home page design of one of the other Web sites provided the best indication to users about a Web site's contents. The results are consistent with and were used to expand the FEDM. The FEDM provides evidence-based guidelines to design e-health applications for persons with SMI, including: minimize an application's layers or hierarchy, use explicit text, employ navigational memory aids, group hyperlinks in 1 area, and minimize the number of disparate subjects an application addresses. © The Author 2013. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
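A hedged sketch of the kind of analysis named in the abstract, using statsmodels on invented data (neither the study data nor its exact model specification are reproduced here): task completion time is modelled with a fixed effect of Web site and a random intercept per participant.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented usability data: 10 participants, each tested on 5 sites, time in seconds.
rng = np.random.default_rng(1)
participants = np.repeat(np.arange(10), 5)
sites = np.tile(["A", "B", "C", "D", "FEDM"], 10)
base = {"A": 95, "B": 110, "C": 120, "D": 105, "FEDM": 70}  # FEDM-style site assumed faster
times = [base[s] + 10 * rng.standard_normal() + 5 * (p % 3) for p, s in zip(participants, sites)]
df = pd.DataFrame({"participant": participants, "site": sites, "time": times})

# Linear mixed-effects model: fixed effect of site, random intercept per participant.
model = smf.mixedlm("time ~ C(site)", df, groups=df["participant"])
result = model.fit()
print(result.summary())
```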
The Rise and Fall of Text on the Web: A Quantitative Study of Web Archives
ERIC Educational Resources Information Center
Cocciolo, Anthony
2015-01-01
Introduction: This study addresses the following research question: is the use of text on the World Wide Web declining? If so, when did it start declining, and by how much has it declined? Method: Web pages are downloaded from the Internet Archive for the years 1999, 2002, 2005, 2008, 2011 and 2014, producing 600 captures of 100 prominent and…
ERIC Educational Resources Information Center
Chang, Chia-Jung; Liu, Chen-Chung; Shen, Yan-Jhih
2012-01-01
Collaborative web exploration, in which learners work together to explore the World Wide Web, has become a key learning activity in education contexts. Learners can use a shared computer with a shared display to explore the web together. However, such a shared-computer approach may limit active participation among learners. To address this issue,…
Cousineau, Tara; Houle, Brian; Bromberg, Jonas; Fernandez, Kathrine C.; Kling, Whitney C.
2008-01-01
Objective Tailored nutrition Web programs constitute an emerging trend in obesity prevention. Initial investment in innovative technology necessitates that the target population be well understood. This pilot study’s purpose was to determine the feasibility of a workplace nutrition Web program. Design Formative research was conducted with gaming industry employees and benefits managers to develop a consensus on workplace-specific nutrition needs. A demonstration Web program was piloted with stakeholders to determine feasibility. Setting Indiana, Mississippi, Nevada, and New Jersey gaming establishments. Participants 86 employees, 18 benefits managers. Intervention Prototype Web program. Main Outcome Measures Concept mapping; 16-item nutrition knowledge test; satisfaction. Analysis Concept mapping was used to aggregate importance ratings on programmatic content, which informed Web program curriculum. Chi-square tests were performed postintervention to determine knowledge improvement. Results (1) Employees and benefits managers exhibited moderate agreement about content priorities for the program (r = 0.48). (2) There was a significant increase in employees’ nutrition knowledge scores postintervention (t = 7.16, df = 36, P < .001); those with less knowledge exhibited the greatest gains in knowledge scores (r = −0.647, P < .001). Conclusions and Implications Employees and benefit managers do not necessarily agree on the priority of nutrition-related content, suggesting a need for programs to appeal to various stakeholders. Computer-based approaches can address various stakeholder health concerns via tailored, customized programming. PMID:18457784
Mindful Mood Balance: A Case Report of Web-Based Treatment of Residual Depressive Symptoms
Felder, Jennifer; Dimidjian, Sona; Beck, Arne; Boggs, Jennifer M; Segal, Zindel
2014-01-01
Residual depressive symptoms are associated with increased risk for relapse and impaired functioning. Although there is no definitive treatment for residual depressive symptoms, Mindfulness-Based Cognitive Therapy has been shown to be effective, but access is limited. Mindful Mood Balance (MMB), a Web-based adaptation of Mindfulness-Based Cognitive Therapy, was designed to address this care gap. In this case study, we describe a composite case that is representative of the course of intervention with MMB and its implementation in a large integrated delivery system. Specifically, we describe the content of each of eight weekly sessions, and the self-management skills developed by participating in this program. MMB may be a cost-effective and scalable option in primary care for increasing access to treatments for patients with residual depressive symptoms. PMID:25141988
Cormode, Graham; Dasgupta, Anirban; Goyal, Amit; Lee, Chi Hoon
2018-01-01
Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a. LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
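For readers unfamiliar with the technique, here is a compact, self-contained sketch of one LSH variant (MinHash with banding) in plain Python; it is illustrative only and is not the Hadoop-based implementation evaluated in the paper.

```python
import random
from collections import defaultdict

random.seed(42)
NUM_HASHES, BANDS = 32, 8            # 8 bands of 4 minhash values each
ROWS = NUM_HASHES // BANDS
PRIME = (1 << 61) - 1                # large prime for the universal hash family
HASH_PARAMS = [(random.randrange(1, PRIME), random.randrange(PRIME)) for _ in range(NUM_HASHES)]

def shingles(text: str, k: int = 3) -> set:
    """Character k-grams used as the set representation of a string."""
    return {text[i:i + k] for i in range(len(text) - k + 1)}

def minhash(s: set) -> list:
    """MinHash signature: for each hash function, the minimum hash value over the set."""
    return [min((a * hash(x) + b) % PRIME for x in s) for a, b in HASH_PARAMS]

def lsh_buckets(docs: dict) -> dict:
    """Group items whose signatures collide in at least one band (candidate near-duplicates)."""
    buckets = defaultdict(set)
    for name, text in docs.items():
        sig = minhash(shingles(text))
        for band in range(BANDS):
            key = (band, tuple(sig[band * ROWS:(band + 1) * ROWS]))
            buckets[key].add(name)
    return {k: v for k, v in buckets.items() if len(v) > 1}

docs = {
    "q1": "cheap flights to rome",
    "q2": "cheap flights to roma",
    "q3": "weather in san francisco",
}
print(lsh_buckets(docs))  # q1 and q2 should share at least one bucket
```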
Guide to the Internet. The world wide web.
Pallen, M.
1995-01-01
The world wide web provides a uniform, user friendly interface to the Internet. Web pages can contain text and pictures and are interconnected by hypertext links. The addresses of web pages are recorded as uniform resource locators (URLs), transmitted by hypertext transfer protocol (HTTP), and written in hypertext markup language (HTML). Programs that allow you to use the web are available for most operating systems. Powerful on line search engines make it relatively easy to find information on the web. Browsing through the web--"net surfing"--is both easy and enjoyable. Contributing to the web is not difficult, and the web opens up new possibilities for electronic publishing and electronic journals. PMID:8520402
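A minimal illustration of the URL/HTTP/HTML triad the abstract describes, using only the Python standard library (the URL is the reserved example domain, chosen purely for illustration):

```python
import urllib.request

URL = "https://example.org/"  # a uniform resource locator (URL)

# The request and response are carried by hypertext transfer protocol (HTTP/HTTPS)...
with urllib.request.urlopen(URL, timeout=10) as resp:
    html = resp.read().decode(resp.headers.get_content_charset() or "utf-8")

# ...and the returned page is written in hypertext markup language (HTML), whose
# <a href="..."> elements are the hypertext links that interconnect web pages.
print(html[:200])
```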
Creating Patient and Family Education Web Sites
YADRICH, DONNA MACAN; FITZGERALD, SHARON A.; WERKOWITCH, MARILYN; SMITH, CAROL E.
2013-01-01
This article gives details about the methods and processes used to ensure that usability and accessibility were achieved during development of the Home Parenteral Nutrition Family Caregivers Web site, an evidence-based health education Web site for the family members and caregivers of chronically ill patients. This article addresses comprehensive definitions of usability and accessibility and illustrates Web site development according to Section 508 standards and the national Health and Human Services’ Research-Based Web Design and Usability Guidelines requirements. PMID:22024970
ERIC Educational Resources Information Center
Obilade, Titilola T.; Burton, John K.
2015-01-01
This textual content analysis set out to determine the extent to which the theories, principles, and guidelines in 4 standard books of instructional design and technology were also addressed in 4 popular books on web design. The standard books on instructional design and the popular books on web design were chosen by experts in the fields. The…
Publicizing Your Web Resources for Maximum Exposure.
ERIC Educational Resources Information Center
Smith, Kerry J.
2001-01-01
Offers advice to librarians for marketing their Web sites on Internet search engines. Advises against relying solely on spiders and recommends adding metadata to the source code and delivering that information directly to the search engines. Gives an overview of metadata and typical coding for meta tags. Includes Web addresses for a number of…
Social Responsibility and Corporate Web Pages: Self-Presentation or Agenda-Setting?
ERIC Educational Resources Information Center
Esrock, Stuart L.; Leichty, Greg B.
1998-01-01
Examines how corporate entities use the Web to present themselves as socially responsible citizens and to advance policy positions. Samples randomly "Fortune 500" companies, revealing that, although 90% had Web pages and 82% of the sites addressed a corporate social responsibility issue, few corporations used their pages to monitor…
10 CFR 1303.103 - Public reading area.
Code of Federal Regulations, 2010 CFR
2010-01-01
... the address in this paragraph (a): (2) Telephone: 703-235-4473; (3) A request to the Board's Web site... Board's Web site or by letter or fax. Please ensure that the records sought are clearly described... available electronically at the Board's Web site (http://www.nwtrb.gov). (d) Records of the Board available...
ERIC Educational Resources Information Center
Wankel, Charles, Ed.; DeFillippi, Robert, Ed.
This volume demonstrates how technology is impacting management education and learning in a variety of educational contexts. Some of the issues and trends in management education addressed include: technotrends; web-based management learning; the changing nature of the web as a context for learning; online simulations; web-format case studies;…
Breaking out of the Asynchronous Box: Using Web Conferencing in Distance Learning
ERIC Educational Resources Information Center
Lietzau, Julie Arnold; Mann, Barbara J.
2009-01-01
A discussion of a university library's use of Web conferencing (real-time synchronous instruction) which addresses the questions (1) Is Web conferencing a viable option for distance students in online only classrooms? (2) Do faculty and students benefit from this type of instruction? Four different scenarios are presented with assessment results…
The Internet as a Reflective Mirror for a Company's Image.
ERIC Educational Resources Information Center
Fahrmann, Jennifer; Hartz, Kim; Wendling, Marijo; Yoder, Kevin
The Internet is becoming the primary way that businesses communicate and receive information. Corporate Web addresses and home pages have become a valuable tool for leaving a solid mark on potential clients, consumers, and competition. To determine how differences in Web pages design reflect corporate image, a study examined Web pages from two…
The WWW and Our Digital Heritage--The New Preservation Tasks of the Library Community.
ERIC Educational Resources Information Center
Mannerheim, Johan
This paper discusses the role of libraries in the preservation of World Wide Web publications. Topics addressed include: (1) the scope of Web preservation, including examples of projects that illustrate comprehensive and selective approaches; (2) the responsibility of Web preservation, including placing the responsibility on publishers and other…
Rozbroj, Tomas; Lyons, Anthony; Pitts, Marian; Mitchell, Anne; Christensen, Helen
2014-07-03
Lesbians and gay men have disproportionately high rates of depression and anxiety, and report lower satisfaction with treatments. In part, this may be because many health care options marginalize them by assuming heterosexuality, or misunderstand and fail to respond to the challenges specifically faced by these groups. E-therapies have particular potential to respond to the mental health needs of lesbians and gay men, but there is little research to determine whether they do so, or how they might be improved. We sought to examine the applicability of existing mental health e-therapies for lesbians and gay men. We reviewed 24 Web- and mobile phone-based e-therapies and assessed their performance in eight key areas, including the use of inclusive language and content and whether they addressed mental health stressors for lesbians and gay men, such as experiences of stigma related to their sexual orientation, coming out, and relationship issues that are specific to lesbians and gay men. We found that e-therapies seldom addressed these stressors. Furthermore, 58% (14/24) of therapies contained instances that assumed or suggested the user was heterosexual, with instances especially prevalent among better-evidenced programs. Our findings, and a detailed review protocol presented in this article, may be used as guides for the future development of mental health e-therapies to better accommodate the needs of lesbians and gay men.
Vibration Propagation in Spider Webs
NASA Astrophysics Data System (ADS)
Hatton, Ross; Otto, Andrew; Elias, Damian
Due to their poor eyesight, spiders rely on web vibrations for situational awareness. Web-borne vibrations are used to determine the location of prey, predators, and potential mates. The influence of web geometry and composition on web vibrations is important for understanding spiders' behavior and ecology. Past studies on web vibrations have experimentally measured the frequency response of web geometries by removing threads from existing webs. The full influence of web structure and tension distribution on vibration transmission, however, has not been addressed in prior work. We have constructed physical artificial webs and computer models to better understand the effect of web structure on vibration transmission. These models provide insight into the propagation of vibrations through the webs, the frequency response of the bare web, and the influence of the spider's mass and stiffness on the vibration transmission patterns. Funded by NSF-1504428.
Class Projects on the Internet.
ERIC Educational Resources Information Center
Nicholson, Danny
1996-01-01
Discusses the use of the Internet in the classroom. Presents a project on renewable energy sources in which students produce web pages. Provides the web page address of the project completed by students. (ASK)
Learning from LANCE: Developing a Web Portal Infrastructure for NASA Earth Science Data (Invited)
NASA Astrophysics Data System (ADS)
Murphy, K. J.
2013-12-01
NASA developed the Land Atmosphere Near real-time Capability for EOS (LANCE) in response to a growing need for timely satellite observations by applications users, operational agencies and researchers. EOS capabilities originally intended for long-term Earth science research were modified to deliver satellite data products with sufficient latencies to meet the needs of the NRT user communities. LANCE products are primarily distributed as HDF data files for analysis, however novel capabilities for distribution of NRT imagery for visualization have been added which have expanded the user base. Additionally systems to convert data to information such as the MODIS hotspot/active fire data are also provided through the Fire Information for Resource Management System (FIRMS). LANCE services include: FTP/HTTP file distribution, Rapid Response (RR), Worldview, Global Imagery Browse Services (GIBS) and FIRMS. This paper discusses how NASA has developed services specifically for LANCE and is taking the lessons learned through these activities to develop an Earthdata Web Infrastructure. This infrastructure is being used as a platform to support development of data portals that address specific science issues for much of EOSDIS data.
SCEAPI: A unified Restful Web API for High-Performance Computing
NASA Astrophysics Data System (ADS)
Rongqiang, Cao; Haili, Xiao; Shasha, Lu; Yining, Zhao; Xiaoning, Wang; Xuebin, Chi
2017-10-01
The development of scientific computing is increasingly moving to collaborative web and mobile applications. All these applications need high-quality programming interfaces for accessing heterogeneous computing resources consisting of clusters, grid computing or cloud computing. In this paper, we introduce our high-performance computing environment that integrates computing resources from 16 HPC centers across China. Then we present a bundle of web services called SCEAPI and describe how it can be used to access HPC resources with the HTTP or HTTPS protocols. We discuss SCEAPI from several aspects including architecture, implementation and security, and address specific challenges in designing compatible interfaces and protecting sensitive data. We describe the functions of SCEAPI including authentication, file transfer and job management for creating, submitting and monitoring jobs, and how to use SCEAPI in an easy-to-use way. Finally, we discuss how to exploit more HPC resources quickly for the ATLAS experiment by implementing a custom ARC compute element based on SCEAPI, and our work shows that SCEAPI is an easy-to-use and effective solution to extend opportunistic HPC resources.
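The abstract does not give SCEAPI's exact endpoints or schema, so the sketch below (Python, with entirely hypothetical URLs, token header, and JSON fields) only illustrates the general shape of a RESTful HPC API of this kind: authenticate, submit a job description, then poll its status.

```python
import json
import time
import urllib.request

BASE = "https://hpc-api.example.org"  # hypothetical base URL; SCEAPI's real endpoints differ
TOKEN = "user-api-token"              # would be obtained from the API's authentication step

def call(method: str, path: str, payload=None):
    """Small helper for authenticated JSON requests over HTTPS."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(BASE + path, data=data, method=method,
                                 headers={"Authorization": f"Bearer {TOKEN}",
                                          "Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode())

# Submit a job description (field names are illustrative, not SCEAPI's actual schema).
job = call("POST", "/jobs", {"app": "atlas-sim", "cores": 64, "input": "run042.tar"})

# Poll until the job leaves the queue; a real client would add a timeout and error handling.
while True:
    status = call("GET", f"/jobs/{job['id']}")
    if status["state"] in ("FINISHED", "FAILED"):
        break
    time.sleep(30)
print(status["state"])
```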
The World Data Fabric: A New Concept for Geophysical Data Collection and Dissemination
NASA Astrophysics Data System (ADS)
Papitashvili, V.; Papitashvili, N.
2005-12-01
Nowadays, a multitude of digital geophysical data have become available via the World Wide Web from a variety of sources, including the World Data Centers (WDC), their suppliers (discipline-specific observatories, research institutions, government agencies), and short-lived, sporadic datasets produced by individual investigators from their research grants. As a result, worldwide geophysical databases have become diverse and distributed, calling for sophisticated search engines capable of identifying discipline-specific data on the Web and then retrieving requested intervals for scientific analyses or practical applications. Here we introduce the concept of the World Data Fabric (WDF), which emerged from the essence of the World Data Centers system that has successfully served geophysical communities since the International Geophysical Year (1957-58). We propose to unify both components of the WDC System - data centers and data providers - into a worldwide data network (data fabric), where the WDC role would become more proactive through their direct interaction with the data producers. It is suggested that the World Data Centers would become a backbone of the World Data Fabric, watching and copying newly "Webbed" geophysical data to the center archives - to preserve at least 2-3 copies (or as many as Centers exist) of the new datasets within the entire WDF. Thus, the WDF would become a self-organized system of data nodes (providers) and data portals (the WDCs as a "clearinghouse"). The WDF would then develop similarly to the Web, but its focus would be on geophysical data rather than on the content of a specific geophysical discipline. Introducing the WDF concept, we face a number of challenges: (a) data providers should make their datasets available via the Internet using open (but secure) access protocols; (b) multiple copies of every dataset would spread across the WDF; (c) every WDF dataset (original or copied) must be digitally signed by the data providers and then by the data copiers; and (d) the WDF datasets must be protected from deliberate corruption or hacking. As the WDF (for all or specific geophysical disciplines) is established and actively maintained by a series of policies and regulations (i.e., specific for a particular discipline) through the WDC activities, one can then write specific middleware to retrieve required data from the "data fabric", then building either a specific Virtual Observatory or a Distributed Data System. The presentation will address these challenges, suggesting some immediate and interim solutions.
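As a rough illustration of the signing requirement described above (not a WDF specification), the sketch below uses the Python cryptography package to sign a dataset's SHA-256 digest with Ed25519 keys, first as the provider and then as a copying data centre, so each archived copy carries a verifiable chain of signatures; the toy data and key handling are assumptions.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def digest(data: bytes) -> bytes:
    """SHA-256 digest of the dataset contents (passed as bytes here for brevity)."""
    return hashlib.sha256(data).digest()

dataset = b"2005-12-01T00:00 3.2 nT\n2005-12-01T00:01 3.4 nT\n"  # toy geomagnetic records

provider_key = Ed25519PrivateKey.generate()   # held by the data provider
copier_key = Ed25519PrivateKey.generate()     # held by the copying World Data Center

d = digest(dataset)
provider_sig = provider_key.sign(d)             # provider signs the original digest
copier_sig = copier_key.sign(d + provider_sig)  # copier countersigns digest plus provider signature

# verify() raises InvalidSignature if the dataset or either signature was tampered with.
provider_key.public_key().verify(provider_sig, d)
copier_key.public_key().verify(copier_sig, d + provider_sig)
print("signature chain verified")
```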
Going, Going, Still There: Using the WebCite Service to Permanently Archive Cited Web Pages
Trudel, Mathieu
2005-01-01
Scholars are increasingly citing electronic “web references” which are not preserved in libraries or full text archives. WebCite is a new standard for citing web references. To “webcite” a document involves archiving the cited Web page through www.webcitation.org and citing the WebCite permalink instead of (or in addition to) the unstable live Web page. This journal has amended its “instructions for authors” accordingly, asking authors to archive cited Web pages before submitting a manuscript. Almost 200 other journals are already using the system. We discuss the rationale for WebCite, its technology, and how scholars, editors, and publishers can benefit from the service. Citing scholars initiate an archiving process of all cited Web references, ideally before they submit a manuscript. Authors of online documents and websites which are expected to be cited by others can ensure that their work is permanently available by creating an archived copy using WebCite and providing the citation information including the WebCite link on their Web document(s). Editors should ask their authors to cache all cited Web addresses (Uniform Resource Locators, or URLs) “prospectively” before submitting their manuscripts to their journal. Editors and publishers should also instruct their copyeditors to cache cited Web material if the author has not done so already. Finally, WebCite can process publisher submitted “citing articles” (submitted for example as eXtensible Markup Language [XML] documents) to automatically archive all cited Web pages shortly before or on publication. Finally, WebCite can act as a focussed crawler, caching retrospectively references of already published articles. Copyright issues are addressed by honouring respective Internet standards (robot exclusion files, no-cache and no-archive tags). Long-term preservation is ensured by agreements with libraries and digital preservation organizations. The resulting WebCite Index may also have applications for research assessment exercises, being able to measure the impact of Web services and published Web documents through access and Web citation metrics. PMID:16403724
2011-04-01
limitations to construction were determined based on data available in the NRCS Web Soil Survey (NRCS 2010). Soil limitations were rated for... paper , glass, certain plastics, ferrous scrap, copper scrap, nonferrous segregated scrap metals, tires, spent Final EA Addressing the Privatization...www.nrcs.usda.gov/technical/NRI/maps/meta/m5116.html>. Accessed 15 March 2010. NRCS 2010 NRCS. 2010. Web Soil Survey. Available online: <http
Zebra: a web server for bioinformatic analysis of diverse protein families.
Suplatov, Dmitry; Kirilin, Evgeny; Takhaveev, Vakil; Svedas, Vytas
2014-01-01
During evolution of proteins from a common ancestor, one functional property can be preserved while others can vary, leading to functional diversity. A systematic study of the corresponding adaptive mutations provides a key to one of the most challenging problems of modern structural biology: understanding the impact of amino acid substitutions on protein function. The subfamily-specific positions (SSPs) are conserved within functional subfamilies but are different between them and, therefore, seem to be responsible for functional diversity in protein superfamilies. Consequently, a corresponding method to perform the bioinformatic analysis of sequence and structural data has to be implemented in common laboratory practice to study the structure-function relationship in proteins and develop novel protein engineering strategies. This paper describes the Zebra web server, a powerful remote platform that implements a novel bioinformatic analysis algorithm to study diverse protein families. It is the first application that provides specificity determinants at different levels of functional classification, thereby addressing the complex functional diversity of large superfamilies. Statistical analysis is implemented to automatically select a set of highly significant SSPs to be used as hotspots for directed evolution or rational design experiments and analyzed when studying the structure-function relationship. Zebra results are provided in two ways: (1) as a single all-in-one parsable text file and (2) as PyMOL sessions with structural representations of SSPs. The Zebra web server is available at http://biokinet.belozersky.msu.ru/zebra.
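A toy sketch of the subfamily-specific position idea in plain Python (an invented four-sequence alignment and a simple identity-based rule, not Zebra's statistical procedure): a column counts as an SSP if each subfamily is internally conserved at that column but the subfamilies disagree with each other.

```python
# Toy alignment: two subfamilies of two sequences each (invented, ungapped, equal length).
subfamilies = {
    "subfamily_A": ["MKTAYIAK", "MKTAYVAK"],
    "subfamily_B": ["MRTGYIAE", "MRTGYVAE"],
}

def subfamily_specific_positions(subfams: dict) -> list:
    """Columns conserved within every subfamily but differing between subfamilies."""
    length = len(next(iter(subfams.values()))[0])
    ssps = []
    for col in range(length):
        consensus = {}
        for name, seqs in subfams.items():
            residues = {s[col] for s in seqs}
            if len(residues) != 1:          # not conserved inside this subfamily
                break
            consensus[name] = residues.pop()
        else:
            if len(set(consensus.values())) > 1:  # subfamilies disagree at this column
                ssps.append((col, consensus))
    return ssps

for col, res in subfamily_specific_positions(subfamilies):
    print(f"column {col}: " + ", ".join(f"{k}={v}" for k, v in res.items()))
```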
Columbia River food webs: Developing a broader scientific foundation for river restoration
Alldredge, J. Richard; Beauchamp, David; Bisson, Peter A.; Congleton, James; Henny, Charles; Huntly, Nancy; Lamberson, Roland; Levings, Colin; Naiman, Robert J.; Pearcy, William; Rieman, Bruce; Ruggerone, Greg; Scarnecchia, Dennis; Smouse, Peter; Wood, Chris C.
2011-01-01
The objectives of this report are to provide a fundamental understanding of aquatic food webs in the Columbia River Basin and to illustrate and summarize their influences on native fish restoration efforts. The spatial scope addresses tributaries, impoundments, the free-flowing Columbia and Snake rivers, as well as the estuary and plume. Achieving the Council's vision for the Columbia River Fish and Wildlife Program (NPCC 2009-09) of sustaining a "productive and diverse community" that provides "abundant" harvest, is best accomplished through a time-prioritized action plan, one that complements other approaches while addressing important challenges and uncertainties related to the Basin's food webs. Note that the oceanic food webs, although of immense importance in sustaining fish populations, are not considered beyond the plume since they involve an additional set of complex and rapidly evolving issues. An analysis of oceanic food webs of relevance to the Columbia River requires a separately focused effort (e.g., Hoegh-Guldberg and Bruno 2010).
Web-based three-dimensional geo-referenced visualization
NASA Astrophysics Data System (ADS)
Lin, Hui; Gong, Jianhua; Wang, Freeman
1999-12-01
This paper addresses several approaches to implementing web-based, three-dimensional (3-D), geo-referenced visualization. The discussion focuses on the relationship between multi-dimensional data sets and applications, as well as the thick/thin client and heavy/light server structure. Two models of data sets are addressed in this paper. One is the use of traditional 3-D data formats such as 3-D Studio Max, Open Inventor 2.0, Vis5D and OBJ. The other is modelling with a web-based language such as VRML. Also, traditional languages such as C and C++, as well as web-based programming tools such as Java, Java3D and ActiveX, can be used for developing applications. The strengths and weaknesses of each approach are elaborated. Four practical solutions for developing applications, using VRML and Java, Java and Java3D, VRML and ActiveX, and Java wrapper classes (Java and C/C++), are presented for web-based, real-time interactive and explorative visualization.
Criteria for representing circular arc and sine wave spar webs by non-curved elements
NASA Technical Reports Server (NTRS)
Jenkins, J. M.
1979-01-01
The basic problem of how to simply represent a curved web of a spar in a finite element structural model was addressed. The ratio of flat web to curved web axial deformations and longitudinal rotations were calculated using NASTRAN models. Multiplying factors were developed from these calculations for various web thicknesses. These multiplying factors can be applied directly to the area and moment of inertia inputs of the finite element model. This allows the thermal stress relieving configurations of sine wave and circular arc webs to be simply accounted for in finite element structural models.
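The approach described, applying precomputed multiplying factors to the area and moment-of-inertia inputs of a flat-web element so that it mimics a curved web, can be sketched as below. The factor values, field names, and thickness keys are placeholders; the actual factors depend on web thickness and geometry as derived from the NASTRAN comparison runs in the report.

```python
from dataclasses import dataclass

@dataclass
class WebElement:
    area: float       # cross-sectional area input to the finite element model
    inertia: float    # moment of inertia input
    thickness: float  # web thickness, used to look up the factors

# Hypothetical factor table: thickness -> (area factor, inertia factor).
# Real values would come from the flat-web vs. curved-web NASTRAN comparisons.
FACTORS = {0.025: (0.92, 0.85), 0.050: (0.95, 0.90)}

def equivalent_flat_web(elem: WebElement) -> WebElement:
    """Scale a flat-web element so it approximates curved-web axial deformation and rotation."""
    f_area, f_inertia = FACTORS[elem.thickness]
    return WebElement(elem.area * f_area, elem.inertia * f_inertia, elem.thickness)

print(equivalent_flat_web(WebElement(area=1.2, inertia=0.4, thickness=0.050)))
```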
Dorel, Mathurin; Viara, Eric; Barillot, Emmanuel; Zinovyev, Andrei; Kuperstein, Inna
2017-01-01
Human diseases such as cancer are routinely characterized by high-throughput molecular technologies, and multi-level omics data are accumulated in public databases at an increasing rate. Retrieval and visualization of these data in the context of molecular network maps can provide insights into the pattern of regulation of molecular functions reflected by an omics profile. In order to make this task easy, we developed NaviCom, a Python package and web platform for visualization of multi-level omics data on top of biological network maps. NaviCom bridges the gap between cBioPortal, the most used resource of large-scale cancer omics data, and NaviCell, a data visualization web service that contains several molecular network map collections. NaviCom proposes several standardized modes of data display on top of molecular network maps, allowing specific biological questions to be addressed. We illustrate how users can easily create interactive network-based cancer molecular portraits via the NaviCom web interface using the maps of the Atlas of Cancer Signalling Network (ACSN) and other maps. Analysis of these molecular portraits can help in formulating a scientific hypothesis on the molecular mechanisms deregulated in the studied disease. NaviCom is available at https://navicom.curie.fr. © The Author(s) 2017. Published by Oxford University Press.
PepMapper: a collaborative web tool for mapping epitopes from affinity-selected peptides.
Chen, Wenhan; Guo, William W; Huang, Yanxin; Ma, Zhiqiang
2012-01-01
Epitope mapping from affinity-selected peptides has become popular in epitope prediction, and correspondingly many Web-based tools have been developed in recent years. However, the performance of these tools varies in different circumstances. To address this problem, we employed an ensemble approach that incorporates two popular Web tools, MimoPro and Pep-3D-Search, to take advantage of the strengths of both methods and give users more options for their specific purposes of epitope-peptide mapping. The combined operation of Union finds as many associated peptides as possible from both methods, which increases sensitivity in finding potential epitopic regions on a given antigen surface. The combined operation of Intersection achieves to some extent the mutual verification by the two methods and hence increases the likelihood of locating the genuine epitopic region on a given antigen in relation to the interacting peptides. The Consistency between Intersection and Union is an indirect sufficient condition to assess the likelihood of successful peptide-epitope mapping. On average over 27 tests, the combined operations of PepMapper outperformed either MimoPro or Pep-3D-Search alone. Therefore, PepMapper is another multipurpose mapping tool for epitope prediction from affinity-selected peptides. The Web server can be freely accessed at: http://informatics.nenu.edu.cn/PepMapper/
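The ensemble logic described above, combining two tools' predictions by Union to maximize sensitivity and by Intersection for mutual verification, reduces to simple set operations. The residue positions below are invented for illustration; PepMapper itself works on affinity-selected peptides mapped onto an antigen surface.

```python
# Residue positions on the antigen implicated by each tool's peptide mapping (illustrative).
mimopro_hits = {12, 13, 14, 45, 46, 47, 90}
pep3d_hits = {13, 14, 15, 46, 47, 48}

union = mimopro_hits | pep3d_hits          # maximize sensitivity: regions found by either tool
intersection = mimopro_hits & pep3d_hits   # mutual verification: regions both tools agree on

# One simple consistency measure: how much of the union the two tools agree on.
consistency = len(intersection) / len(union) if union else 0.0

print(sorted(union))
print(sorted(intersection))
print(f"consistency = {consistency:.2f}")
```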
ERIC Educational Resources Information Center
Dysart, Joe
2008-01-01
Given Google's growing market share--69% of all searches by the close of 2007--it's absolutely critical for any school on the Web to ensure its site is Google-friendly. A Google-optimized site ensures that students and parents can quickly find one's district on the Web even if they don't know the address. Plus, good search optimization simply…
Guiding Students in Using the World Wide Web for Research.
ERIC Educational Resources Information Center
Kubly, Kristin
This paper addresses the need for educators and librarians to guide students in using the World Wide Web appropriately by teaching them to evaluate Internet resources using criteria designed to identify the authoritative sources. The pros and cons of information commonly found on the Web are discussed, as well as academic Internet subject or…
Addressing the Needs of Students with Learning Disabilities during Their Interaction with the Web
ERIC Educational Resources Information Center
Curcic, Svjetlana
2011-01-01
Purpose: The purpose of this study is to examine the effectiveness of instruction in information problem solving within the world wide web (the web) environment. The participants were 20 seventh and eighth grade students with a learning disability (LD) in reading. An experimental pretest-posttest control group method was used to investigate the…
ERIC Educational Resources Information Center
Iding, Marie; Klemm, E. Barbara
2005-01-01
The present study addresses the need for teachers to critically evaluate the credibility, validity, and cognitive load associated with scientific information on Web sites, in order to effectively teach students to evaluate scientific information on the World Wide Web. A line of prior research investigating high school and university students'…
Integrating Web 2.0-Based Informal Learning with Workplace Training
ERIC Educational Resources Information Center
Zhao, Fang; Kemp, Linzi J.
2012-01-01
Informal learning takes place in the workplace through connection and collaboration mediated by Web 2.0 applications. However, little research has yet been published that explores informal learning and how to integrate it with workplace training. We aim to address this research gap by developing a conceptual Web 2.0-based workplace learning and…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-20
... address above, or from the Division of Migratory Bird Management's Web site at http://www.fws.gov... INFORMATION CONTACT or from our Web site at http://www.fws.gov/migratorybirds . Review of Public Comments and... http://www.regulations.gov , or from our Web site at http://www.fws.gov/migratorybirds/NewReports...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2011 CFR
2011-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2013 CFR
2013-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2014 CFR
2014-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
47 CFR 73.670 - Commercial limits in children's programs.
Code of Federal Regulations, 2012 CFR
2012-10-01
... for commercial purposes, including either e-commerce or advertising; (3) The Web site's home page and... (4) The page of the Web site to which viewers are directed by the Web site address is not used for e-commerce, advertising, or other commercial purposes (e.g., contains no links labeled “store” and no links...
Teaching with Web-Based Videos: Helping Students Grasp the Science in Popular Online Resources
ERIC Educational Resources Information Center
Pace, Barbara G.; Jones, Linda Cronin
2009-01-01
Today, the use of web-based videos in science classrooms is becoming more and more commonplace. However, these videos are often fast-paced and information rich--science concepts can be fragmented and embedded within larger cultural issues. This article addresses the cognitive difficulties posed by many web-based science videos. Drawing on concepts…
Prosthetic Frequently Asked Questions for the New Amputee
... with other amputees locally and across the country. WEB ADDRESSES FOR LINKED MATERIALS Amputee Coalition National Limb ... article in other publications, including other World Wide Web sites must contact the Amputee Coalition for permission ...
Specification Patent Management for Web Application Platform Ecosystem
NASA Astrophysics Data System (ADS)
Fukami, Yoshiaki; Isshiki, Masao; Takeda, Hideaki; Ohmukai, Ikki; Kokuryo, Jiro
Diversified usage of web applications has encouraged the disintegration of the web platform into the management of identification and of applications. Users make use of various kinds of data linked to their identity with multiple applications on certain social web platforms such as Facebook or MySpace. Competition has emerged among web application platforms. Platformers can design their relationships with developers by controlling the patents on their own specifications and by adopting open technologies developed by external organizations. Platformers choose how to open up according to the features of the specification and their position. Patent management of specifications has become a key success factor in building competitive web application platforms. The various ways of attracting external developers, such as standardization and open source, have not yet been discussed and analyzed together.
A Tailored Web-Based Psycho-Educational Intervention for Cancer Patients and Their Family Caregivers
Northouse, Laurel; Schafenacker, Ann; Barr, Kathryn L.C.; Katapodi, Maria; Yoon, Hyojin; Brittain, Kelly; Song, Lixin; Ronis, David L.; An, Larry
2014-01-01
Background Most programs addressing psychosocial concerns of cancer survivors are in-person programs that are expensive to deliver, have limited availability, and seldom deal with caregivers’ concerns. Objective This study examined the feasibility of translating an efficacious nurse-delivered program (FOCUS Program) for patients and their caregivers to a tailored, dyadic web-based format. Specific aims were to: (i) test the preliminary effects of the web-based intervention on patient and caregiver outcomes, (ii) examine participants’ program satisfaction, and (iii) determine the feasibility of using a web-based delivery format. Intervention/Methods A Phase II feasibility study was conducted with cancer patients (lung, breast, colorectal, prostate) and their family caregivers (N=38 dyads). The web-based intervention provided information and support tailored to the unique characteristics of each patient, caregiver, and their dyadic relationship. Primary outcomes were emotional distress and quality of life (QOL). Secondary outcomes were benefits of illness/caregiving, communication, support, and self-efficacy. Analyses included descriptive statistics and repeated measures ANOVA. Results Dyads had a significant decrease in emotional distress, increase in QOL, and perceived more benefits of illness/caregiving. Caregivers also had significant improvement in self-efficacy. There were no changes in communication. Participants were satisfied with program usability, but recommended additional content. Conclusions It was possible to translate a clinician-delivered program to a web-based format that was easy to use and had positive effects on dyadic outcomes. Implications for Practice The web-based program is a promising way to provide psychosocial care to more patients and caregivers using fewer personnel. It needs further testing in a larger RCT. PMID:24945270
Using the World Wide WEB to promote science education in nuclear energy and RWM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, M.
1996-12-31
A priority of government and business in the United States and other first-tier industrial countries continues to be the improvement of science, mathematics and technology (SMT) instruction in pre-university education. The U.S. federal government has made SMT instruction an educational priority and set goals for improving it in the belief that science, math and technology education are tied to our economic well-being and standard of living. The new national standards in mathematics education and science education and the proposed standards in technology education are all aimed at improving knowledge and skills in the essential areas that the federal government considers important for protecting our technological advantage in the world economy. This paper discusses a pilot project for establishing graphical Web capability in a limited number (six) of rural Nevada schools with support from the US Department of Energy (DOE) and the state of Nevada. The general goals of the pilot project are as follows: (1) to give rural teachers and students access to up-to-date science information on the Web; (2) to determine whether Web access can improve science teaching and student attitudes toward science in rural Nevada schools; and (3) to identify science content on the Web that supports the National Science Standards and Benchmarks. A specific objective that this paper addresses is stated as the following question: What potential do nuclear energy information office web sites offer for changing student attitudes about nuclear energy and creating greater nuclear literacy?
Caballero, Víctor; Vernet, David; Zaballos, Agustín; Corral, Guiomar
2018-01-30
Sensor networks and the Internet of Things have driven the evolution of traditional electric power distribution networks towards a new paradigm referred to as Smart Grid. However, the different elements that compose the Information and Communication Technologies (ICTs) layer of a Smart Grid are usually conceived as isolated systems that typically result in rigid hardware architectures which are hard to interoperate, manage, and to adapt to new situations. If the Smart Grid paradigm has to be presented as a solution to the demand for distributed and intelligent energy management system, it is necessary to deploy innovative IT infrastructures to support these smart functions. One of the main issues of Smart Grids is the heterogeneity of communication protocols used by the smart sensor devices that integrate them. The use of the concept of the Web of Things is proposed in this work to tackle this problem. More specifically, the implementation of a Smart Grid's Web of Things, coined as the Web of Energy is introduced. The purpose of this paper is to propose the usage of Web of Energy by means of the Actor Model paradigm to address the latent deployment and management limitations of Smart Grids. Smart Grid designers can use the Actor Model as a design model for an infrastructure that supports the intelligent functions demanded and is capable of grouping and converting the heterogeneity of traditional infrastructures into the homogeneity feature of the Web of Things. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction.
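A minimal sketch of the Actor Model idea applied here: each device or protocol adapter is an actor with a private mailbox, and heterogeneous messages are normalized into a common form as they pass between actors. The classes, message fields, and scaling factor are assumptions for illustration only; the paper's implementation targets a Web of Things/Web of Energy infrastructure, not this toy.

```python
import queue
import threading
import time

class Actor:
    """A tiny actor: a mailbox plus a thread that processes one message at a time."""
    def __init__(self, name):
        self.name = name
        self.mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, message):
        self.mailbox.put(message)

    def _run(self):
        while True:
            self.receive(self.mailbox.get())

    def receive(self, message):
        raise NotImplementedError

class ProtocolAdapter(Actor):
    """Wraps a heterogeneous device protocol and forwards normalized readings."""
    def __init__(self, name, downstream):
        self.downstream = downstream
        super().__init__(name)

    def receive(self, message):
        # Normalize whatever the device sent into a common (sensor, value) form.
        self.downstream.send({"sensor": self.name, "value": message["raw"] * 0.1})

class EnergyManager(Actor):
    def receive(self, message):
        print(f"{message['sensor']}: {message['value']:.1f} kW")

manager = EnergyManager("manager")
adapter = ProtocolAdapter("meter-42", manager)
adapter.send({"raw": 375})
time.sleep(0.2)  # give the daemon actor threads time to process before exit
```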
Jafarpour, Borna; Abidi, Samina Raza; Abidi, Syed Sibte Raza
2016-01-01
Computerizing paper-based clinical practice guidelines (CPGs) and then executing them can provide evidence-informed decision support to physicians at the point of care. Semantic web technologies, especially web ontology language (OWL) ontologies, have been profusely used to represent computerized CPGs. Using semantic web reasoning capabilities to execute OWL-based computerized CPGs unties them from a specific custom-built CPG execution engine and increases their shareability, as any OWL reasoner and triple store can be utilized for CPG execution. However, existing semantic web reasoning-based CPG execution engines suffer from an inability to execute CPGs with high levels of expressivity and from the high cognitive load of computerizing paper-based CPGs and updating their computerized versions. In order to address these limitations, we have developed three CPG execution engines based on OWL 1 DL, OWL 2 DL, and OWL 2 DL + semantic web rule language (SWRL). OWL 1 DL serves as the base execution engine capable of executing a wide range of CPG constructs; however, for executing highly complex CPGs, the OWL 2 DL and OWL 2 DL + SWRL engines offer additional execution capabilities. We evaluated the technical performance and medical correctness of our execution engines using a range of CPGs. Technical evaluations show the efficiency of our CPG execution engines in terms of CPU time and the validity of the generated recommendations in comparison to existing CPG execution engines. Medical evaluations by domain experts show the validity of the CPG-mediated therapy plans in terms of relevance, safety, and ordering for a wide range of patient scenarios.
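The engines described execute OWL-encoded guidelines with standard semantic web machinery rather than a custom engine. A hedged sketch of that general pattern, loading a guideline graph and asking which steps apply to a patient state with rdflib, is shown below; the namespace, class names, and triples are invented, and rdflib's plain SPARQL querying is not a substitute for the DL/SWRL reasoning the paper relies on.

```python
from rdflib import Graph

# A tiny, invented guideline fragment in Turtle standing in for an OWL-encoded CPG.
guideline = """
@prefix cpg: <http://example.org/cpg#> .
cpg:StepA a cpg:RecommendationStep ; cpg:appliesTo cpg:SeverePersistentAsthma .
cpg:StepB a cpg:RecommendationStep ; cpg:appliesTo cpg:MildIntermittentAsthma .
"""

g = Graph()
g.parse(data=guideline, format="turtle")

# Which recommendation steps apply to a given (hypothetical) patient state?
query = """
PREFIX cpg: <http://example.org/cpg#>
SELECT ?step WHERE {
    ?step a cpg:RecommendationStep ;
          cpg:appliesTo cpg:SeverePersistentAsthma .
}
"""
for row in g.query(query):
    print(row.step)
```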
Vernet, David; Corral, Guiomar
2018-01-01
Sensor networks and the Internet of Things have driven the evolution of traditional electric power distribution networks towards a new paradigm referred to as Smart Grid. However, the different elements that compose the Information and Communication Technologies (ICTs) layer of a Smart Grid are usually conceived as isolated systems that typically result in rigid hardware architectures which are hard to interoperate, manage, and to adapt to new situations. If the Smart Grid paradigm has to be presented as a solution to the demand for distributed and intelligent energy management system, it is necessary to deploy innovative IT infrastructures to support these smart functions. One of the main issues of Smart Grids is the heterogeneity of communication protocols used by the smart sensor devices that integrate them. The use of the concept of the Web of Things is proposed in this work to tackle this problem. More specifically, the implementation of a Smart Grid’s Web of Things, coined as the Web of Energy is introduced. The purpose of this paper is to propose the usage of Web of Energy by means of the Actor Model paradigm to address the latent deployment and management limitations of Smart Grids. Smart Grid designers can use the Actor Model as a design model for an infrastructure that supports the intelligent functions demanded and is capable of grouping and converting the heterogeneity of traditional infrastructures into the homogeneity feature of the Web of Things. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction. PMID:29385748
Web-enabling technologies for the factory floor: a web-enabling strategy for emanufacturing
NASA Astrophysics Data System (ADS)
Velez, Ricardo; Lastra, Jose L. M.; Tuokko, Reijo O.
2001-10-01
This paper is intended to address the different technologies available for Web-enabling the factory floor. It gives an overview of the importance of Web-enabling the factory floor for applying the concepts of flexible and intelligent manufacturing in conjunction with e-commerce. The last section defines a Web-enabling strategy for application in eManufacturing. The discussion is framed within the scope of the electronics manufacturing industry, so every application, technology, and related matter is presented from that perspective.
NASA Technical Reports Server (NTRS)
Ward, Robin A.
2002-01-01
The primary goal of this project was to continue populating the currently existing web site developed in 1998 in conjunction with the NASA Dryden Flight Research Center and California Polytechnic State University, with more mathematics lesson plans and activities that K-12 teachers, students, home-schoolers, and parents could access. All of the activities, while demonstrating some mathematical topic, also showcase the research endeavors of the NASA Dryden Flight Research Center. The website is located at: http://daniel.calpoly.edu/dfrc/Robin. The secondary goal of this project was to share the web-based activities with educators at various conferences and workshops. To address the primary goal of this project, over the past year, several new activities were posted on the web site and some of the existing activities were enhanced to contain more video clips, photos, and materials for teachers. To address the project's secondary goal, the web-based activities were showcased at several conferences and workshops. Additionally, in order to measure and assess the outreach impact of the web site, a link to the web site hitbox.com was established in April 2001, which allowed for the collection of traffic statistics against the web site (such as the domains of visitors, the frequency of visitors to this web site, etc.) Provided is a description of some of the newly created activities posted on the web site during the project period of 2001-2002, followed by a description of the conferences and workshops at which some of the web-based activities were showcased. Next is a brief summary of the web site's traffic statistics demonstrating its worldwide educational impact, followed by a listing of some of the awards and accolades the web site has received.
2018-01-01
Many modern applications of AI such as web search, mobile browsing, image processing, and natural language processing rely on finding similar items from a large database of complex objects. Due to the very large scale of data involved (e.g., users’ queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (a.k.a, LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance, suitable for deployment in very large scale settings. The experimental results demonstrate our variants of LSH achieve the robust performance with better recall compared with “vanilla” LSH, even when using the same amount of space. PMID:29346410
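The paper evaluates LSH variants on Hadoop; the core trick of one common LSH family, random-hyperplane hashing for cosine similarity, can be sketched compactly. The code below is a generic, single-machine illustration with made-up data, not the authors' variants or their distributed deployment.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_hasher(dim, n_bits):
    """Random hyperplanes: each bit records which side of a hyperplane a vector falls on."""
    planes = rng.normal(size=(n_bits, dim))
    def signature(v):
        return tuple((planes @ v) > 0)
    return signature

# Index items into buckets keyed by their bit signature.
dim, n_bits = 64, 16
sig = make_hasher(dim, n_bits)
items = rng.normal(size=(1000, dim))
buckets = {}
for i, v in enumerate(items):
    buckets.setdefault(sig(v), []).append(i)

# Query: only compare against items sharing the query's bucket (probable near neighbors).
query = items[42] + 0.01 * rng.normal(size=dim)
candidates = buckets.get(sig(query), [])
print(f"{len(candidates)} candidates instead of {len(items)} full comparisons")
```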
Expediting the transfer of evidence into practice: building clinical partnerships*
Rader, Tamara; Gagnon, Anita J.
2000-01-01
A librarian/clinician partnership was fostered in one hospital through the formation of the Evidence-based Practice Committee, with the ultimate goal of facilitating the transfer of evidence into practice. The paper describes barriers to evidence-based practice and outlines the committee's strategies for overcoming these barriers, including the development and promotion of a Web-based guide to evidence-based practice specifically designed for clinicians (health professionals). Educational strategies for use of the Web-based guide are also addressed. An advantage of this partnership is that the skills of librarians in meeting the needs of clinicians are maximized. The evidence-based practice skills of clinicians are honed, and librarians make a valuable contribution to the knowledge base of the clinical staff. The knowledge acquired through the partnership by both clinicians and librarians will increase the sophistication of the dialogue between the two groups and in turn will expedite the transfer of evidence into practice. PMID:10928710
The Challenge of Handling Big Data Sets in the Sensor Web
NASA Astrophysics Data System (ADS)
Autermann, Christian; Stasch, Christoph; Jirka, Simon
2016-04-01
More and more Sensor Web components are deployed in different domains such as hydrology, oceanography or air quality in order to make observation data accessible via the Web. However, besides variability of data formats and protocols in environmental applications, the fast growing volume of data with high temporal and spatial resolution is imposing new challenges for Sensor Web technologies when sharing observation data and metadata about sensors. Variability, volume and velocity are the core issues that are addressed by Big Data concepts and technologies. Most solutions in the geospatial sector focus on remote sensing and raster data, whereas big in-situ observation data sets relying on vector features require novel approaches. Hence, in order to deal with big data sets in infrastructures for observational data, the following questions need to be answered: 1. How can big heterogeneous spatio-temporal datasets be organized, managed, and provided to Sensor Web applications? 2. How can views on big data sets and derived information products be made accessible in the Sensor Web? 3. How can big observation data sets be processed efficiently? We illustrate these challenges with examples from the marine domain and outline how we address these challenges. We therefore show how big data approaches from mainstream IT can be re-used and applied to Sensor Web application scenarios.
17 CFR 230.498 - Summary Prospectuses for open-end management investment companies.
Code of Federal Regulations, 2014 CFR
2014-04-01
... [____]. (A) The legend must provide an Internet address, other than the address of the Commission's electronic filing system; toll free (or collect) telephone number; and e-mail address that investors can use to obtain the Statutory Prospectus and other information. The Internet Web site address must be...
17 CFR 230.498 - Summary Prospectuses for open-end management investment companies.
Code of Federal Regulations, 2011 CFR
2011-04-01
... [____]. (A) The legend must provide an Internet address, other than the address of the Commission's electronic filing system; toll free (or collect) telephone number; and e-mail address that investors can use to obtain the Statutory Prospectus and other information. The Internet Web site address must be...
17 CFR 230.498 - Summary Prospectuses for open-end management investment companies.
Code of Federal Regulations, 2013 CFR
2013-04-01
... [____]. (A) The legend must provide an Internet address, other than the address of the Commission's electronic filing system; toll free (or collect) telephone number; and e-mail address that investors can use to obtain the Statutory Prospectus and other information. The Internet Web site address must be...
Haq, Rashida; Heus, Lineke; Baker, Natalie A; Dastur, Daisy; Leung, Fok-Han; Leung, Eman; Li, Benjamin; Vu, Kathy; Parsons, Janet A
2013-07-25
Following the completion of treatment and as they enter the follow-up phase, breast cancer patients (BCPs) often recount feeling 'lost in transition', and are left with many questions concerning how their ongoing care and monitoring for recurrence will be managed. Family physicians (FPs) also frequently report feeling ill-equipped to provide follow-up care to BCPs. In this three-phase qualitative pilot study we designed, implemented and evaluated a multi-faceted survivorship care plan (SCP) to address the information needs of BCPs at our facility and of their FPs. In Phase 1 focus groups and individual interviews were conducted with 35 participants from three stakeholder groups (BCPs, FPs and oncology specialist health care providers (OHCPs)), to identify specific information needs. An SCP was then designed based on these findings, consisting of both web-based and paper-based tools (Phase 2). For Phase 3, both sets of tools were subsequently evaluated via focus groups and interviews with 26 participants. Interviews and focus groups were audio taped, transcribed and content analysed for emergent themes and patterns. In Phase 1 patients commented that web-based, paper-based and human resources components were desirable in any SCP. Patients did not focus exclusively on the post-treatment period, but instead spoke of evolving needs throughout their cancer journey. FPs indicated that any tools to support them must distill important information in a user-friendly format. In Phase 2, a pilot SCP was subsequently designed, consisting of both web-based and paper-based materials tailored specifically to the needs of BCPs as well as FPs. During Phase 3 (evaluation) BCPs indicated that the SCP was effective at addressing many of their needs, and offered suggestions for future improvements. Both patients and FPs found the pilot SCP to be an improvement from the previous standard of care. Patients perceived the quality of the BCP-FP relationship as integral to their comfort with FPs assuming follow-up responsibilities. This pilot multi-component SCP shows promise in addressing the information needs of BCPs and the FPs who care for them. Next steps include refinement of the different SCP components, further evaluation (including usability testing), and planning for more extensive implementation.
ERIC Educational Resources Information Center
Kimmons, Royce
2017-01-01
This study seeks to evaluate the basic Priority 1 web accessibility of all college and university websites in the US (n = 3141). Utilizing web scraping and automated content analysis, the study establishes that even in the case of high-priority, simple-to-address accessibility requirements, colleges and universities generally fail to make their…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... and recommendations could be found on the Web at http://www.hhs.gov/nvpo/nvac/subgroups/adultimmunization . The Web address where the draft report and recommendations can be found is http://www.hhs.gov...) The draft report and recommendations are available on the Web at http://www.hhs.gov/nvpo/nvac...
Federal Register 2010, 2011, 2012, 2013, 2014
2002-05-03
... (web-serving software), Linux, Perl, and those who are building a compatible & free version of MS`s..., Argument from Design Argument from Design-Web & Multimedia [email protected] http://www.ardes.com MTC-00003464... organization could be a good target for this effort. Their web address is http:// www.gnu.org/. This effort...
Cox, Martine Elizabeth; Small, Hannah Julie; Boyes, Allison W; O'Brien, Lorna; Rose, Shiho Karina; Baker, Amanda L; Henskens, Frans A; Kirkwood, Hannah Naomi; Roach, Della M
2017-01-01
Background Web-based typed exchanges are increasingly used by professionals to provide emotional support to patients. Although some empirical evidence exists to suggest that various strategies may be used to convey emotion during Web-based text communication, there has been no critical review of these data in patients with chronic conditions. Objectives The objective of this review was to identify the techniques used to convey emotion in written or typed Web-based communication and assess the empirical evidence regarding impact on communication and psychological outcomes. Methods An electronic search of databases, including MEDLINE, CINAHL, PsycINFO, EMBASE, and the Cochrane Library was conducted to identify literature published from 1990 to 2016. Searches were also conducted using Google Scholar, manual searching of reference lists of identified papers and manual searching of tables of contents for selected relevant journals. Data extraction and coding were completed by 2 reviewers (10.00% [573/5731] of screened papers, at abstract/title screening stage; 10.0% of screened [69/694] papers, at full-text screening stage). Publications were assessed against the eligibility criteria and excluded if they were duplicates, were not published in English, were published before 1990, referenced animal or nonhuman subjects, did not describe original research, were not journal papers, or did not empirically test the effect of one or more nonverbal communication techniques (for eg, smileys, emoticons, emotional bracketing, voice accentuation, trailers [ellipsis], and pseudowords) as part of Web-based or typed communication on communication-related variables, including message interpretation, social presence, the nature of the interaction (eg, therapeutic alliance), patient perceptions of the interaction (eg, participant satisfaction), or psychological outcomes, including depression, anxiety, and distress. Results A total of 6902 unique publications were identified. Of these, six publications met the eligibility criteria and were included in a narrative synthesis. All six studies addressed the effect of smileys or emoticons on participant responses, message interpretation, or social presence of the writer. None of these studies specifically targeted chronic conditions. It was found that emoticons were more effective in influencing the emotional impact of a message than no cue and that smileys and emoticons were able to convey a limited amount of emotion. No studies addressed other techniques for conveying emotion in written communication. No studies addressed the effects of any techniques on the nature of the interaction (eg, therapeutic alliance), patient perceptions of the interaction (eg, participant satisfaction), or psychological outcomes (depression, anxiety, or distress). Conclusions There is a need for greater empirical attention to the effects of the various proposed techniques for conveying emotion in Web-based typed communications to inform health service providers regarding best-practice communication skills in this setting. PMID:29066426
C3: A Collaborative Web Framework for NASA Earth Exchange
NASA Astrophysics Data System (ADS)
Foughty, E.; Fattarsi, C.; Hardoyo, C.; Kluck, D.; Wang, L.; Matthews, B.; Das, K.; Srivastava, A.; Votava, P.; Nemani, R. R.
2010-12-01
The NASA Earth Exchange (NEX) is a new collaboration platform for the Earth science community that provides a mechanism for scientific collaboration and knowledge sharing. NEX combines NASA advanced supercomputing resources, Earth system modeling, workflow management, NASA remote sensing data archives, and a collaborative communication platform to deliver a complete work environment in which users can explore and analyze large datasets, run modeling codes, collaborate on new or existing projects, and quickly share results among the Earth science communities. NEX is designed primarily for use by the NASA Earth science community to address scientific grand challenges. The NEX web portal component provides an on-line collaborative environment for sharing of Earth science models, data, analysis tools and scientific results by researchers. In addition, the NEX portal also serves as a knowledge network that allows researchers to connect and collaborate based on the research they are involved in, specific geographic area of interest, field of study, etc. Features of the NEX web portal include: member profiles, resource sharing (data sets, algorithms, models, publications), communication tools (commenting, messaging, social tagging), project tools (wikis, blogs) and more. The NEX web portal is built on the proven technologies and policies of DASHlink.arc.nasa.gov (one of NASA's first science social media websites). The core component of the web portal is the C3 framework, which was built using Django and which is being deployed as a common framework for a number of collaborative sites throughout NASA.
Meeting the challenge of finding resources for ophthalmic nurses on the World Wide Web.
Duffel, P G
1998-12-01
The World Wide Web ("the Web") is a macrocosm of resources that can be overwhelming. Often the sheer volume of material available causes one to give up in despair before finding information of any use. The Web is such a popular resource that it cannot be ignored. Two of the biggest challenges to finding good information on the Web are knowing where to start and judging whether the information gathered is pertinent and credible. This article addresses these two challenges and introduces the reader to a variety of ophthalmology and vision science resources on the World Wide Web.
NASA Astrophysics Data System (ADS)
Hammitzsch, M.; Spazier, J.; Reißland, S.
2014-12-01
Usually, tsunami early warning and mitigation systems (TWS or TEWS) are based on several software components deployed in a client-server based infrastructure. The vast majority of systems importantly include desktop-based clients with a graphical user interface (GUI) for the operators in early warning centers. However, in times of cloud computing and ubiquitous computing, the use of concepts and paradigms introduced by continuously evolving approaches in information and communications technology (ICT) has to be considered even for early warning systems (EWS). Based on the experiences and the knowledge gained in three research projects - 'German Indonesian Tsunami Early Warning System' (GITEWS), 'Distant Early Warning System' (DEWS), and 'Collaborative, Complex, and Critical Decision-Support in Evolving Crises' (TRIDEC) - new technologies are exploited to implement a cloud-based and web-based prototype to open up new prospects for EWS. This prototype, named 'TRIDEC Cloud', merges several complementary external and in-house cloud-based services into one platform for automated background computation with graphics processing units (GPU), for web-mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The prototype in its current version addresses tsunami early warning and mitigation. The integration of GPU-accelerated tsunami simulation computations has been an integral part of this prototype to foster early warning with on-demand tsunami predictions based on actual source parameters. However, the platform is meant for researchers around the world to make use of the cloud-based GPU computation to analyze other types of geohazards and natural hazards and react upon the computed situation picture with a web-based GUI in a web browser at remote sites. The current website is an early alpha version for demonstration purposes to give the concept a whirl and to shape science's future. Further functionality, improvements and possibly profound changes have to be implemented successively based on the users' evolving needs.
Nagel, Anna C; Spitzberg, Brian H; An, Li; Gawron, J Mark; Gupta, Dipak K; Yang, Jiue-An; Han, Su; Peddecord, K Michael; Lindsay, Suzanne; Sawyer, Mark H
2013-01-01
Background Surveillance plays a vital role in disease detection, but traditional methods of collecting patient data, reporting to health officials, and compiling reports are costly and time consuming. In recent years, syndromic surveillance tools have expanded and researchers are able to exploit the vast amount of data available in real time on the Internet at minimal cost. Many data sources for infoveillance exist, but this study focuses on status updates (tweets) from the Twitter microblogging website. Objective The aim of this study was to explore the interaction between cyberspace message activity, measured by keyword-specific tweets, and real world occurrences of influenza and pertussis. Tweets were aggregated by week and compared to weekly influenza-like illness (ILI) and weekly pertussis incidence. The potential effect of tweet type was analyzed by categorizing tweets into 4 categories: nonretweets, retweets, tweets with a URL Web address, and tweets without a URL Web address. Methods Tweets were collected within a 17-mile radius of 11 US cities chosen on the basis of population size and the availability of disease data. Influenza analysis involved all 11 cities. Pertussis analysis was based on the 2 cities nearest to the Washington State pertussis outbreak (Seattle, WA and Portland, OR). Tweet collection resulted in 161,821 flu, 6174 influenza, 160 pertussis, and 1167 whooping cough tweets. The correlation coefficients between tweets or subgroups of tweets and disease occurrence were calculated and trends were presented graphically. Results Correlations between weekly aggregated tweets and disease occurrence varied greatly, but were relatively strong in some areas. In general, correlation coefficients were stronger in the flu analysis compared to the pertussis analysis. Within each analysis, flu tweets were more strongly correlated with ILI rates than influenza tweets, and whooping cough tweets correlated more strongly with pertussis incidence than pertussis tweets. Nonretweets correlated more with disease occurrence than retweets, and tweets without a URL Web address correlated better with actual incidence than those with a URL Web address primarily for the flu tweets. Conclusions This study demonstrates that not only does keyword choice play an important role in how well tweets correlate with disease occurrence, but that the subgroup of tweets used for analysis is also important. This exploratory work shows potential in the use of tweets for infoveillance, but continued efforts are needed to further refine research methods in this field. PMID:24158773
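The analysis above correlates weekly keyword-tweet counts with weekly ILI rates. A minimal sketch of that comparison, with made-up weekly series for a single city, is shown below; the study's actual data, keyword subgroups (retweets, URL tweets, etc.), and per-city analysis are richer than this.

```python
import numpy as np

# Hypothetical weekly series for one city: tweet counts and ILI rate (% of visits).
flu_tweets = np.array([120, 150, 180, 260, 340, 310, 280, 200])
ili_rate   = np.array([1.1, 1.3, 1.6, 2.4, 3.1, 2.9, 2.5, 1.8])

# Pearson correlation between the two weekly series.
r = np.corrcoef(flu_tweets, ili_rate)[0, 1]
print(f"correlation between weekly tweets and ILI rate: r = {r:.2f}")
```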
Semantic Web-based Vocabulary Broker for Open Science
NASA Astrophysics Data System (ADS)
Ritschel, B.; Neher, G.; Iyemori, T.; Murayama, Y.; Kondo, Y.; Koyama, Y.; King, T. A.; Galkin, I. A.; Fung, S. F.; Wharton, S.; Cecconi, B.
2016-12-01
Keyword vocabularies are used to tag and to identify data in science data repositories. Such vocabularies consist of controlled terms and the appropriate concepts, such as GCMD [1] keywords or the ESPAS [2] keyword ontology. The Semantic Web-based mash-up of domain-specific, cross-domain or even trans-domain vocabularies provides unique capabilities in the network of appropriate data resources. Based on a collaboration between GFZ [3], the FHP [4], the WDC for Geomagnetism [5] and the NICT [6], we developed the concept of a vocabulary broker for inter- and trans-disciplinary data detection and integration. Our prototype of the Semantic Web-based vocabulary broker uses OSF [7] for the mash-up of geo and space research vocabularies, such as GCMD keywords, the ESPAS keyword ontology and the SPASE [8] keyword vocabulary. The vocabulary broker starts the search with "free" keywords or terms of a specific vocabulary scheme. The vocabulary broker almost automatically connects the different science data repositories which are tagged by terms of the aforementioned vocabularies. Therefore the mash-up of the SKOS [9] based vocabularies with appropriate metadata from different domains can be realized by addressing LOD [10] resources or virtual SPARQL [11] endpoints which map relational structures into the RDF [12] format. In order to demonstrate such a mash-up approach in real life, we installed and use a D2RQ [13] server for the integration of IUGONET [14] data which are managed by a relational database. The OSF-based vocabulary broker and the D2RQ platform are installed on virtual Linux machines at Kyoto University. The vocabulary broker meets the standard of a main component of the WDS [15] knowledge network. The Web address of the vocabulary broker is http://wdcosf.kugi.kyoto-u.ac.jp
[1] Global Change Master Directory
[2] Near earth space data infrastructure for e-science
[3] German Research Centre for Geosciences
[4] University of Applied Sciences Potsdam
[5] World Data Center for Geomagnetism, Kyoto
[6] National Institute of Information and Communications Technology, Tokyo
[7] Open Semantic Framework
[8] Space Physics Archive Search and Extract
[9] Simple Knowledge Organization System
[10] Linked Open Data
[11] SPARQL Protocol and RDF Query Language
[12] Resource Description Framework
[13] Database to RDF Query
[14] Inter-university Upper atmosphere Global Observation NETwork
[15] World Data System
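The mash-up described hinges on querying SKOS vocabularies and SPARQL endpoints. A hedged sketch of the basic building block, matching a free keyword against SKOS preferred labels with rdflib, is below; the concept URIs and labels are placeholders, and the real broker federates several vocabularies and remote endpoints rather than a local file.

```python
from rdflib import Graph

# A tiny, invented SKOS fragment standing in for a full domain vocabulary.
vocab = """
@prefix skos: <http://www.w3.org/2004/02/skos/core#> .
<http://example.org/voc/geomagnetism> skos:prefLabel "Geomagnetism"@en .
<http://example.org/voc/ionosphere>   skos:prefLabel "Ionosphere"@en .
"""

g = Graph()
g.parse(data=vocab, format="turtle")

# Match a free-text keyword against SKOS preferred labels (the broker's basic building block).
query = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?concept ?label WHERE {
    ?concept skos:prefLabel ?label .
    FILTER(CONTAINS(LCASE(STR(?label)), "geomagnet"))
}
"""
for row in g.query(query):
    print(row.concept, row.label)
```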
EZStream: Distributing Live ISS Experiment Telemetry via Internet
NASA Technical Reports Server (NTRS)
Myers, Gerry; Welch, Clara L. (Technical Monitor)
2002-01-01
This paper will present the high-level architecture and components of the current version of EZStream as well as the product direction and enhancements to be incorporated through a Phase II grant. Security features such as data encryption and user login will be addressed. Remote user devices will be discussed, including web browsers on PCs and displays on PDAs and smart cell phones. The interaction between EZStream and TReK will be covered, as well as the eventual ability of EZStream to receive and parse binary data streams directly. This makes EZStream beneficial to both the International Partners and non-NASA applications. The options for developing client-side display web pages will be addressed, as will the development of new tools to allow creation of display web pages by non-programmers.
Stellefson, Michael L; Shuster, Jonathan J; Chaney, Beth H; Paige, Samantha R; Alber, Julia M; Chaney, J Don; Sriram, P S
2017-09-05
Many people living with Chronic Obstructive Pulmonary Disease (COPD) have low general health literacy; however, there is little information available on these patients' eHealth literacy, or their ability to seek, find, understand, and appraise online health information and apply this knowledge to address or solve disease-related health concerns. A nationally representative sample of patients registered in the COPD Foundation's National Research Registry (N = 1,270) was invited to complete a web-based survey to assess socio-demographic (age, gender, marital status, education), health status (generic and lung-specific health-related quality of life), and socio-cognitive (social support, self-efficacy, COPD knowledge) predictors of eHealth literacy, measured using the 8-item eHealth literacy scale (eHEALS). Of the 176 respondents, more than 50% were female (n = 89), and the mean age was 66.19 years (SD = 9.47). Overall, participants reported moderate levels of eHealth literacy, with more than 70% feeling confident in their ability to find helpful health resources on the Internet. However, respondents were much less confident in their ability to distinguish between high- and low-quality sources of web-based health information. Very severe versus less severe COPD (β = 4.15), lower lung-specific health-related quality of life (β = -0.19), and greater COPD knowledge (β = 0.62) were significantly associated with higher eHealth literacy. Higher COPD knowledge was also significantly associated with greater knowledge (ρ = 0.24, p = .001) and use (ρ = 0.24, p = .001) of web-based health resources. Findings emphasize the importance of integrating skill-building activities into comprehensive patient education programs that enable patients with severe cases of COPD to identify high-quality sources of web-based health information. Additional research is needed to understand how new social technologies can be used to help medically underserved COPD patients benefit from web-based self-management support resources.
Evaluating Web-Based Learning Systems
ERIC Educational Resources Information Center
Pergola, Teresa M.; Walters, L. Melissa
2011-01-01
Accounting educators continuously seek ways to effectively integrate instructional technology into accounting coursework as a means to facilitate active learning environments and address the technology-driven learning preferences of the current generation of students. Most accounting textbook publishers now provide interactive, web-based learning…
Personalization of Rule-based Web Services.
Choi, Okkyung; Han, Sang Yong
2008-04-04
Nowadays, Web users have clearly expressed their wish to receive personalized services directly. Personalization is the way to tailor services directly to the immediate requirements of the user. However, the current Web Services System does not provide features that support this, such as personalization of services and intelligent matchmaking. This research proposes a flexible, personalized Rule-based Web Services System to address these problems and to enable efficient search, discovery, and construction across general Web documents and Semantic Web documents. The system performs matchmaking among service requesters', service providers', and users' preferences using a Rule-based Search Method, and subsequently ranks the search results. A prototype of efficient Web Services search and construction for the suggested system has been developed based on the current work.
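The matchmaking and ranking described can be sketched with a simple scoring rule: each satisfied preference rule adds to a service's score, and services are ranked by score. The preference fields, services, and weights below are invented for illustration; the paper's system applies rule-based matching over Web Services and Semantic Web documents.

```python
services = [
    {"name": "WeatherSvcA", "format": "json", "latency_ms": 120, "region": "EU"},
    {"name": "WeatherSvcB", "format": "xml",  "latency_ms": 60,  "region": "EU"},
    {"name": "WeatherSvcC", "format": "json", "latency_ms": 40,  "region": "US"},
]

user_prefs = {"format": "json", "region": "EU", "max_latency_ms": 100}

def score(service, prefs):
    """Simple rule-based match score: one point per satisfied preference rule."""
    rules = [
        service["format"] == prefs["format"],
        service["region"] == prefs["region"],
        service["latency_ms"] <= prefs["max_latency_ms"],
    ]
    return sum(rules)

# Rank services by how many of the user's preference rules they satisfy.
for svc in sorted(services, key=lambda s: score(s, user_prefs), reverse=True):
    print(svc["name"], score(svc, user_prefs))
```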
WebEAV: automatic metadata-driven generation of web interfaces to entity-attribute-value databases.
Nadkarni, P M; Brandt, C M; Marenco, L
2000-01-01
The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples.
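The EAV pattern behind such a framework stores each attribute as a row and pivots rows back into per-entity records when generating a form or page. A minimal sketch with sqlite3 follows; the table and attribute names are illustrative, not WebEAV's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE eav (entity_id INTEGER, attribute TEXT, value TEXT)")
conn.executemany(
    "INSERT INTO eav VALUES (?, ?, ?)",
    [(1, "diagnosis", "asthma"), (1, "age", "54"), (2, "diagnosis", "copd")],
)

def entity_record(entity_id):
    """Pivot one entity's attribute rows into a dict, as a web form generator would."""
    rows = conn.execute(
        "SELECT attribute, value FROM eav WHERE entity_id = ?", (entity_id,)
    )
    return dict(rows)

print(entity_record(1))  # {'diagnosis': 'asthma', 'age': '54'}
```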
ERIC Educational Resources Information Center
Lobodzinski, Suave, Ed.; Tomek, Ivan, Ed.
The 1997 WebNet conference addressed research, new developments, and experiences related to the Internet and intranets. The 257 contributions of WebNet 97 contained in this proceedings comprise the full and short papers accepted for presentation at the conference. Included are position papers by leading experts in the field; descriptions of ideas…
I Read it on the Computer, It Must Be True--Evaluating Information from the Web.
ERIC Educational Resources Information Center
Marcovitz, David M.
1997-01-01
Presents a lesson plan that exposes students to a variety of information sources on the World Wide Web to teach them to begin to be critical consumers of information. Students are divided into groups and given Web addresses for an organization with a strong view on Nazism. Discusses alternate and follow-up lessons and student reactions. Lists Web…
ERIC Educational Resources Information Center
Smith, Leigh K.; Draper, Roni Jo; Sabey, Brenda L.
2005-01-01
This qualitative study examined the use of WebQuests as a teaching tool in problem-based elementary methods courses. We explored the potential of WebQuests to address three dilemmas faced in teacher education: (a) modeling instruction that is based on current learning theory and research-based practices, (b) providing preservice teachers with…
A Quantitative Study of Factors Related to Adult E-Learner's Adoption of Web 2.0 Technology
ERIC Educational Resources Information Center
Bledsoe, Johnny Mark
2012-01-01
The content created by digital natives via collaborative Web 2.0 applications provides a rich source of unique knowledge and social capital for their virtual communities of interest. The problem addressed in this study was the limited understanding of older digital immigrants who use Web 2.0 applications to access, distribute, or enhance these…
Report of the Organic Contamination Science Steering Group
NASA Technical Reports Server (NTRS)
Mahaffy, P. R.; Beaty, D. W.; Anderson, M. S.; Aveni, G.; Bada, J. L.; Clemett, S. J.; DesMaris, D. J.; Douglas, S.; Dworkin, J. P.; Kern, R. G.
2004-01-01
The exploration of the possible emergence and duration of life on Mars from landed platforms requires attention to the quality of measurements that address these objectives. In particular, the potential impact of terrestrial contamination on the measurement of reduced carbon with sensitive in situ instruments must be addressed in order to reach definitive conclusions regarding the source of organic molecules. Following the recommendation of the Mars Exploration Program Analysis Group (MEPAG) at its September 2003 meeting [MEPAG, 2003], the Mars Program Office at NASA Headquarters chartered the Organic Contamination Science Steering Group (OCSSG) to address this issue. The full report of the six week study of the OCSSG can be found on the MEPAG web site [1]. The study was intended to define the contamination problem and to begin to suggest solutions that could provide direction to the engineering teams that design and produce the Mars landed systems. Requirements set by the Planetary Protection Policy in effect for any specific mission do not directly address this question of the potential interference from terrestrial contaminants during in situ measurements.
Nadkarni, Prakash M.; Brandt, Cynthia M.; Marenco, Luis
2000-01-01
The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples. PMID:10887163
UFO: a web server for ultra-fast functional profiling of whole genome protein sequences.
Meinicke, Peter
2009-09-02
Functional profiling is a key technique to characterize and compare the functional potential of entire genomes. The estimation of profiles according to an assignment of sequences to functional categories is a computationally expensive task because it requires the comparison of all protein sequences from a genome with a usually large database of annotated sequences or sequence families. Based on machine learning techniques for Pfam domain detection, the UFO web server for ultra-fast functional profiling allows researchers to process large protein sequence collections instantaneously. Besides the frequencies of Pfam and GO categories, the user also obtains the sequence specific assignments to Pfam domain families. In addition, a comparison with existing genomes provides dissimilarity scores with respect to 821 reference proteomes. Considering the underlying UFO domain detection, the results on 206 test genomes indicate a high sensitivity of the approach. In comparison with current state-of-the-art HMMs, the runtime measurements show a considerable speed up in the range of four orders of magnitude. For an average size prokaryotic genome, the computation of a functional profile together with its comparison typically requires about 10 seconds of processing time. For the first time the UFO web server makes it possible to get a quick overview on the functional inventory of newly sequenced organisms. The genome scale comparison with a large number of precomputed profiles allows a first guess about functionally related organisms. The service is freely available and does not require user registration or specification of a valid email address.
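The functional profile reported by the server is essentially a frequency vector of Pfam/GO category assignments per genome, compared against precomputed reference profiles by a dissimilarity score. The sketch below assumes category assignments are already available and uses a simple Euclidean distance; it is not the UFO domain-detection method itself.

```python
from collections import Counter
import math

def profile(assignments):
    """Turn per-protein category assignments into a normalized frequency profile."""
    counts = Counter(assignments)
    total = sum(counts.values())
    return {cat: n / total for cat, n in counts.items()}

def dissimilarity(p, q):
    """Euclidean distance between two profiles over the union of their categories."""
    cats = set(p) | set(q)
    return math.sqrt(sum((p.get(c, 0) - q.get(c, 0)) ** 2 for c in cats))

# Toy Pfam assignments for two genomes (invented).
genome_a = profile(["PF00001", "PF00005", "PF00005", "PF07690"])
genome_b = profile(["PF00005", "PF00005", "PF07690", "PF07690"])
print(f"dissimilarity = {dissimilarity(genome_a, genome_b):.3f}")
```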
Hearn,, Paul P.
2009-01-01
Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.
Food Web Topology in High Mountain Lakes
Sánchez-Hernández, Javier; Cobo, Fernando; Amundsen, Per-Arne
2015-01-01
Although diversity and limnology of alpine lake systems are well studied, their food web structure and properties have rarely been addressed. Here, the topological food webs of three high mountain lakes in Central Spain were examined. We first addressed the pelagic networks of the lakes, and then we explored how food web topology changed when benthic biota was included to establish complete trophic networks. We conducted a literature search to compare our alpine lacustrine food webs and their structural metrics with those of 18 published lentic webs using a meta-analytic approach. The comparison revealed that the food webs in alpine lakes are relatively simple, in terms of structural network properties (linkage density and connectance), in comparison with lowland lakes, but no great differences were found among pelagic networks. The studied high mountain food webs were dominated by a high proportion of omnivores and species at intermediate trophic levels. Omnivores can exploit resources at multiple trophic levels, and this characteristic might reduce competition among interacting species. Accordingly, the trophic overlap, measured as trophic similarity, was very low in all three systems. Thus, these alpine networks are characterized by many omnivorous consumers with numerous prey species and few consumers with a single or few prey and with low competitive interactions among species. The present study emphasizes the ecological significance of omnivores in high mountain lakes as promoters of network stability and as central players in energy flow pathways via food partitioning and enabling energy mobility among trophic levels. PMID:26571235
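The structural properties mentioned above have simple definitions; the following toy sketch computes linkage density (L/S) and connectance (L/S^2) for a small predator-prey link set, using common conventions that may differ in detail from those of the paper:

```python
# Structural network properties for a toy food web given as predator -> prey links.
links = {
    "trout": {"chironomid", "zooplankton"},
    "chironomid": {"detritus"},
    "zooplankton": {"phytoplankton"},
}
species = set(links) | {prey for preys in links.values() for prey in preys}
L = sum(len(preys) for preys in links.values())   # number of trophic links
S = len(species)                                  # number of species/nodes
print(f"S={S}, L={L}, linkage density={L / S:.2f}, connectance={L / S**2:.3f}")
```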
Discovery Mechanisms for the Sensor Web
Jirka, Simon; Bröring, Arne; Stasch, Christoph
2009-01-01
This paper addresses the discovery of sensors within the OGC Sensor Web Enablement framework. Whereas services like the OGC Web Map Service or Web Coverage Service are already well supported through catalogue services, the field of sensor networks and the corresponding discovery mechanisms are still a challenge. The focus within this article will be on the use of existing OGC Sensor Web components for realizing a discovery solution. After discussing the requirements for a Sensor Web discovery mechanism, an approach will be presented that was developed within the EU-funded project “OSIRIS”. This solution offers mechanisms to search for sensors, exploit basic semantic relationships, harvest sensor metadata, and integrate sensor discovery into already existing catalogues. PMID:22574038
Source Update Capture in Information Agents
NASA Technical Reports Server (NTRS)
Ashish, Naveen; Kulkarni, Deepak; Wang, Yao
2003-01-01
In this paper we present strategies for successfully capturing updates at Web sources. Web-based information agents provide integrated access to autonomous Web sources that may be updated over time. For many information agent applications we are interested in knowing when a Web source to which the application provides access has been updated. We may also be interested in capturing all the updates at a Web source over a period of time, i.e., detecting the updates and, for each update, retrieving and storing the new version of the data. Previous work on update and change detection by polling does not adequately address this problem. We present strategies for intelligently polling a Web source to efficiently capture changes at the source.
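A baseline version of change detection by polling can be sketched as follows; this adaptive-interval loop is a generic illustration, not the specific strategies proposed in the paper:

```python
# Generic sketch of polling-based change detection with an adaptive interval:
# poll more often when a source changes, back off while it stays unchanged.
import hashlib
import time
import urllib.request

def poll(url, interval=60, min_i=10, max_i=3600, cycles=5):
    last_hash = None
    for _ in range(cycles):
        body = urllib.request.urlopen(url, timeout=10).read()
        digest = hashlib.sha256(body).hexdigest()
        if last_hash is not None and digest != last_hash:
            print("change detected; store new version and poll sooner")
            interval = max(min_i, interval // 2)
        else:
            interval = min(max_i, interval * 2)   # back off while unchanged
        last_hash = digest
        time.sleep(interval)

# poll("https://example.org/data.html")   # placeholder URL
```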
ERIC Educational Resources Information Center
Levine, Elliott
2001-01-01
Sound technology policies can spell the difference between an effective website and an online nightmare. An effective web development policy addresses six key areas: roles and responsibilities, content/educational value, privacy and safety, adherence to copyright laws, technical standards, and use of commercial sites and services. (MLH)
76 FR 39842 - Draft Investigation Report-DuPont Belle; Public Comment Requested
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-07
... the CSB Web site ( http://www.csb.gov .). Comments may also be sent to CSB Headquarters (see address... public review either at CSB Headquarters or by following directions posted on the CSB Web site. By..., operating
Designing Web-based Telemedicine Training for Military Health Care Providers.
ERIC Educational Resources Information Center
Bangert, David; Doktor, Robert; Johnson, Erik
2001-01-01
Interviews with 48 military health care professionals identified 20 objectives and 4 learning clusters for a telemedicine training curriculum. From these clusters, web-based modules were developed addressing clinical learning, technology, organizational issues, and introduction to telemedicine. (Contains 19 references.) (SK)
Web app based patient education in psoriasis - a randomized controlled trial.
Hawkins, Spencer D; Barilla, Steven; Feldman, Steven R
2017-04-15
Patients report wanting more information about psoriasis and clear expectations from the onset of therapy. Dermatologists do not think patients receive or internalize adequate information. There is a need for further explanation of treatment regimens to increase knowledge, compliance, and patient satisfaction. Recent advancements in web technology have the potential to improve these psoriasis outcomes. A web-based application was created to educate psoriasis patients using video, graphics, and textual information. An investigator-blinded, randomized, controlled study evaluated the website's efficacy in 50 psoriasis patients at Wake Forest Baptist Health Dermatology. Patients were randomized into two groups: Group 1 received a link to the educational web app and a survey following their visit; Group 2 received a link to the survey with no educational web app. The survey assessed patient knowledge, self-reported adherence to medication, and adequacy of addressing concerns. Twenty-two patients completed the study. Patients in the web app group scored an average of 11/14 on the psoriasis knowledge quiz, whereas patients in the control group scored an average of 9/14, an improvement of roughly 18% (p=0.008, n=22). Web app-based education via DermPatientEd.Com is an efficient way to improve knowledge, but we did not demonstrate improvements in self-reported medication adherence or the ability to address concerns of psoriasis patients.
Technology use among adults who are deaf and hard of hearing: a national survey.
Maiorana-Basas, Michella; Pagliaro, Claudia M
2014-07-01
As society becomes increasingly more dependent on technology, information regarding the use, preference, and accessibility of commonly used devices and services among individuals who are deaf and hard of hearing (DHH) is crucial. Developing technologies that are functional and appropriately accessible allows persons who are DHH to fully participate in society, education, and business while also providing opportunities for personal and professional advancement. Although a few international studies have addressed the technology use of individuals who are DHH, none exist that focus on the needs, preferences, and accessibility of current Internet- and mobile-based technologies. Consequently, a national survey was conducted in the United States to determine the preference, frequency of use, and accessibility of various technologies (hardware, software, Web sites) by adults who are DHH and living in the United States. Findings indicate frequent use of smartphones and personal computers, specifically for text-based communication and web surfing, and little use of Teletypewriter/Telecommunications Device for the Deaf. Web site feature preferences include pictures and text, and captions over signed translations. Some results varied by demographics. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Web-based Factors Affecting Online Purchasing Behaviour
NASA Astrophysics Data System (ADS)
Ariff, Mohd Shoki Md; Sze Yan, Ng; Zakuan, Norhayati; Zaidi Bahari, Ahamad; Jusoh, Ahmad
2013-06-01
The growing use of the internet and online purchasing among young consumers in Malaysia provides a huge prospect for the e-commerce market, specifically the B2C segment. In this market, if E-marketers know the web-based factors affecting online buyers' behaviour, and the effect of these factors on the behaviour of online consumers, then they can develop marketing strategies to convert potential customers into active ones, while retaining existing online customers. A review of previous studies related to online purchasing behaviour in the B2C market has pointed out that the conceptualization and empirical validation of the online purchasing behaviour of Information and Communication Technology (ICT) literate users, or ICT professionals, in Malaysia has not been clearly addressed. This paper focuses on (i) the web-based factors which online buyers (ICT professionals) keep in mind while shopping online; and (ii) the effect of these web-based factors on online purchasing behaviour. Based on an extensive literature review, a conceptual framework of 24 items across five factors was constructed to determine the web-based factors affecting the online purchasing behaviour of ICT professionals. Analysis was performed on 310 questionnaires collected, using a stratified random sampling method, from ICT undergraduate students at a public university in Malaysia. Exploratory factor analysis showed that the five factors affecting online purchase behaviour are Information Quality, Fulfilment/Reliability/Customer Service, Website Design, Quick and Details, and Privacy/Security. Multiple regression analysis indicated that Information Quality, Quick and Details, and Privacy/Security positively affect online purchase behaviour. The results provide a usable model for measuring web-based factors affecting buyers' online purchase behaviour in the B2C market, as well as guidance for online shopping companies on the factors that will increase customers' online purchases.
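The regression step described above can be illustrated with ordinary least squares on synthetic factor scores; only the factor names come from the abstract, and all data values below are invented:

```python
# Toy multiple regression of an online-purchase-behaviour score on factor scores.
# Data are synthetic; only the factor names are taken from the abstract.
import numpy as np

factors = ["InfoQuality", "Fulfilment", "WebsiteDesign", "QuickDetails", "PrivacySecurity"]
rng = np.random.default_rng(0)
X = rng.normal(size=(310, len(factors)))          # standardized factor scores
beta_true = np.array([0.4, 0.1, 0.05, 0.3, 0.25]) # invented "true" effects
y = X @ beta_true + rng.normal(scale=0.5, size=310)

X1 = np.column_stack([np.ones(len(y)), X])        # add intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
for name, b in zip(["intercept"] + factors, coef):
    print(f"{name:16s} {b:+.3f}")
```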
MODEST: a web-based design tool for oligonucleotide-mediated genome engineering and recombineering
Bonde, Mads T.; Klausen, Michael S.; Anderson, Mads V.; Wallin, Annika I.N.; Wang, Harris H.; Sommer, Morten O.A.
2014-01-01
Recombineering and multiplex automated genome engineering (MAGE) offer the possibility to rapidly modify multiple genomic or plasmid sites at high efficiencies. This enables efficient creation of genetic variants including both single mutants with specifically targeted modifications as well as combinatorial cell libraries. Manual design of oligonucleotides for these approaches can be tedious, time-consuming, and may not be practical for larger projects targeting many genomic sites. At present, the change from a desired phenotype (e.g. altered expression of a specific protein) to a designed MAGE oligo, which confers the corresponding genetic change, is performed manually. To address these challenges, we have developed the MAGE Oligo Design Tool (MODEST). This web-based tool allows designing of MAGE oligos for (i) tuning translation rates by modifying the ribosomal binding site, (ii) generating translational gene knockouts and (iii) introducing other coding or non-coding mutations, including amino acid substitutions, insertions, deletions and point mutations. The tool automatically designs oligos based on desired genotypic or phenotypic changes defined by the user, which can be used for high efficiency recombineering and MAGE. MODEST is available for free and is open to all users at http://modest.biosustain.dtu.dk. PMID:24838561
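One step that MODEST automates can be sketched as follows: centering a desired point mutation within homologous flanking sequence to form a single-stranded oligo. Real MAGE design also accounts for lagging-strand targeting and secondary structure, which this toy example (with an invented genome string) ignores:

```python
# Toy oligo design: place a point mutation at the center of homology arms.
def design_oligo(genome, position, new_base, length=90):
    """Return an oligo of `length` nt with `new_base` substituted at `position`
    (0-based index into `genome`), flanked by homology arms."""
    half = length // 2
    left = genome[position - half:position]
    right = genome[position + 1:position + (length - half)]
    return left + new_base + right

genome = "ACGT" * 100                 # invented genome sequence
oligo = design_oligo(genome, 200, "T")
print(len(oligo), oligo[:20], "...")  # 90-mer carrying the substitution
```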
Applying Sensor Web Technology to Marine Sensor Data
NASA Astrophysics Data System (ADS)
Jirka, Simon; del Rio, Joaquin; Mihai Toma, Daniel; Nüst, Daniel; Stasch, Christoph; Delory, Eric
2015-04-01
In this contribution we present two activities illustrating how Sensor Web technology helps to enable a flexible and interoperable sharing of marine observation data based on standards. An important foundation is the Sensor Web Architecture developed by the European FP7 project NeXOS (Next generation Low-Cost Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management). This architecture relies on the Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) framework. It is an exemplary solution for facilitating the interoperable exchange of marine observation data within and between (research) organisations. The architecture addresses a series of functional and non-functional requirements which are fulfilled through different types of OGC SWE components. The diverse functionalities offered by the NeXOS Sensor Web architecture are shown in the following overview: - Pull-based observation data download: This is achieved through the OGC Sensor Observation Service (SOS) 2.0 interface standard. - Push-based delivery of observation data to allow users the subscription to new measurements that are relevant for them: For this purpose there are currently several specification activities under evaluation (e.g. OGC Sensor Event Service, OGC Publish/Subscribe Standards Working Group). - (Web-based) visualisation of marine observation data: Implemented through SOS client applications. - Configuration and controlling of sensor devices: This is ensured through the OGC Sensor Planning Service 2.0 interface. - Bridging between sensors/data loggers and Sensor Web components: For this purpose several components such as the "Smart Electronic Interface for Sensor Interoperability" (SEISI) concept are developed; this is complemented by a more lightweight SOS extension (e.g. based on the W3C Efficient XML Interchange (EXI) format). To further advance this architecture, there is on-going work to develop dedicated profiles of selected OGC SWE specifications that provide stricter guidance how these standards shall be applied to marine data (e.g. SensorML 2.0 profiles stating which metadata elements are mandatory building upon the ESONET Sensor Registry developments, etc.). Within the NeXOS project the presented architecture is implemented as a set of open source components. These implementations can be re-used by all interested scientists and data providers needing tools for publishing or consuming oceanographic sensor data. In further projects such as the European project FixO3 (Fixed-point Open Ocean Observatories), these software development activities are complemented with additional efforts to provide guidance how Sensor Web technology can be applied in an efficient manner. This way, not only software components are made available but also documentation and information resources that help to understand which types of Sensor Web deployments are best suited to fulfil different types of user requirements.
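The pull-based access pattern mentioned above corresponds to a standard SOS 2.0 GetObservation request; in the sketch below the endpoint URL and the offering/property identifiers are placeholders rather than actual NeXOS services:

```python
# Building a KVP GetObservation request against an OGC SOS 2.0 endpoint.
# Endpoint and identifiers are placeholders, not real NeXOS services.
import urllib.parse

endpoint = "https://example.org/sos/service"
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "ctd_offering",                     # placeholder identifiers
    "observedProperty": "sea_water_temperature",
    "temporalFilter": "om:phenomenonTime,2015-01-01T00:00:00Z/2015-01-02T00:00:00Z",
}
url = endpoint + "?" + urllib.parse.urlencode(params)
print(url)
# Fetching `url` (e.g. with urllib.request.urlopen) returns an O&M observation document.
```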
ERIC Educational Resources Information Center
Smith, Peter, Ed.
Topics addressed by the papers including in this proceedings include: multimedia in the classroom; World Wide Web site development; the evolution of academic library services; a Web-based literature course; development of a real-time intelligent network environment; serving grades over the Internet; e-mail over a Web browser; using technology to…
Information about epilepsy on the internet: An exploratory study of Arabic websites.
Alkhateeb, Jamal M; Alhadidi, Muna S
2018-01-01
The aim of this study was to explore information about epilepsy found on Arabic websites. The researchers collected information from the internet between November 2016 and January 2017. Information was obtained using Google and Yahoo search engines. Keywords used were the Arabic equivalent of the following two keywords: epilepsy (Al-saraa) and convulsion (Tashanoj). A total of 144 web pages addressing epilepsy in Arabic were reviewed. The majority of web pages were websites of medical institutions and general health websites, followed by informational and educational websites, others, blogs and websites of individuals, and news and media sites. Topics most commonly addressed were medical treatments for epilepsy (50% of all pages) followed by epilepsy definition (41%) and epilepsy etiology (34.7%). The results also revealed that the vast majority of web pages did not mention the source of information. Many web pages also did not provide author information. Only a small proportion of the web pages provided adequate information. Relatively few web pages provided inaccurate information or made sweeping generalizations. As a result, it is concluded that the findings of the present study suggest that development of more credible Arabic websites on epilepsy is needed. These websites need to go beyond basic information, offering more evidence-based and updated information about epilepsy. Copyright © 2017 Elsevier Inc. All rights reserved.
An Approach of Web-based Point Cloud Visualization without Plug-in
NASA Astrophysics Data System (ADS)
Ye, Mengxuan; Wei, Shuangfeng; Zhang, Dongmei
2016-11-01
With the advances in three-dimensional laser scanning technology, the demand for visualization of massive point clouds is increasingly urgent. Until the introduction of WebGL, however, point cloud visualization was limited to desktop-based solutions; several web renderers are now available. This paper addresses the current issues in web-based point cloud visualization and proposes a method that requires no plug-in. The method combines ASP.NET and WebGL technologies, using the spatial database PostgreSQL to store data and the open web technologies HTML5 and CSS3 to implement the user interface; an online visualization system for 3D point clouds is developed in JavaScript to handle the web interactions. Finally, the method is applied to a real case. The experiment shows that the new approach is of great practical value and avoids the shortcomings of existing WebGIS solutions.
GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.
Liang, Steve H L; Huang, Chih-Yuan
2013-10-02
The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with the demonstration of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.
Review of Extracting Information From the Social Web for Health Personalization
Karlsen, Randi; Bonander, Jason
2011-01-01
In recent years the Web has come into its own as a social platform where health consumers are actively creating and consuming Web content. Moreover, as the Web matures, consumers are gaining access to personalized applications adapted to their health needs and interests. The creation of personalized Web applications relies on extracted information about the users and the content to personalize. The Social Web itself provides many sources of information that can be used to extract information for personalization apart from traditional Web forms and questionnaires. This paper provides a review of different approaches for extracting information from the Social Web for health personalization. We reviewed research literature across different fields addressing the disclosure of health information in the Social Web, techniques to extract that information, and examples of personalized health applications. In addition, the paper includes a discussion of technical and socioethical challenges related to the extraction of information for health personalization. PMID:21278049
77 FR 65419 - Notice of Sunshine Act Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-26
... will be Web cast live at the Web address--www.nrc.gov. * * * * * *The schedule for Commission meetings....html . * * * * * The NRC provides reasonable accommodation to individuals with disabilities where appropriate. If you need a reasonable accommodation to participate in these public meetings, or need this...
Supplementing Introductory Biology with On-Line Curriculum
ERIC Educational Resources Information Center
McGroarty, Estelle; Parker, Joyce; Heidemann, Merle; Lim, Heejun; Olson, Mark; Long, Tammy; Merrill, John; Riffell, Samuel; Smith, James; Batzli, Janet; Kirschtel, David
2004-01-01
We developed web-based modules addressing fundamental concepts of introductory biology delivered through the LON-CAPA course management system. These modules were designed and used to supplement large, lecture-based introductory biology classes. Incorporating educational principles and the strength of web-based instructional technology, choices…
ERIC Educational Resources Information Center
Nesbeitt, Sarah
1997-01-01
Numerous Web-based phone and address directories provide advantages over the white and yellow pages. Although many share a common database, each has features that set it apart: maps, suggested driving directions, and phone dialing. This article examines eight (Bigfoot, BigBook, BigYellow, Switchboard, Infospace, Contractjobs, InterNIC)…
Web-Based Instruction and Learning: Responding to K-14 Customer Needs
NASA Technical Reports Server (NTRS)
McCarthy, Marianne; Grabowski, Barbara; Koszalka, Tiffany; Peck, Christa
2003-01-01
A follow-up working conference was held at Lewis Research Center (now Glenn Research Center) on September 23-25, 1997, to continue discussing issues related to the development of Web-based education materials for the K-14 community. The conference continued the collaboration among the NASA aerospace technology Centers (Ames, Dryden, Langley, and Lewis [now Glenn]), NASA Headquarters, the University of Idaho, and the Pennsylvania State University. The conference consisted of presentations by the Aeronautics Cooperative Agreement teams and working sessions that addressed issues related to the conference theme, responding to the K-14 customers' needs. The group identified the most significant issues by consensus. The issues addressed were: classroom access, World Wide Web resources, teacher training, different teaching and learning styles, interactivity, and education standards. The working sessions produced observations and recommendations in each of these areas in order to work toward the goal of making NASA-sponsored Web-based educational resources useful to teachers and students.
Finding Specification Pages from the Web
NASA Astrophysics Data System (ADS)
Yoshinaga, Naoki; Torisawa, Kentaro
This paper presents a method of finding a specification page on the Web for a given object (e.g., "Ch. d'Yquem") and its class label (e.g., "wine"). A specification page for an object is a Web page which gives concise attribute-value information about the object (e.g., "county"-"Sauternes") in well-formatted structures. A simple unsupervised method using layout and symbolic decoration cues was applied to a large number of Web pages to acquire candidate attributes for each class (e.g., "county" for the class "wine"). We then filter out irrelevant words from the putative attributes through an author-aware scoring function that we call site frequency. We used the acquired attributes to select a representative specification page for a given object from the Web pages retrieved by a normal search engine. Experimental results revealed that our system greatly outperformed the normal search engine in terms of this specification retrieval.
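A simplified version of the site-frequency idea can be sketched as scoring each candidate attribute by the number of distinct hosts whose pages mention it, so that terms promoted by a single author or site score low; the paper's actual scoring function is more elaborate than this:

```python
# Simplified "site frequency": count distinct hosts whose pages contain a
# candidate attribute, to down-weight terms pushed by a single author/site.
from collections import defaultdict
from urllib.parse import urlparse

def site_frequency(candidate_attrs, pages):
    """pages: iterable of (url, set_of_candidate_attributes_found_on_page)."""
    sites = defaultdict(set)
    for url, attrs in pages:
        host = urlparse(url).netloc
        for a in attrs & candidate_attrs:
            sites[a].add(host)
    return {a: len(hosts) for a, hosts in sites.items()}

pages = [
    ("http://a.example/wine1", {"county", "vintage", "banner_ad"}),
    ("http://b.example/wine2", {"county", "vintage"}),
    ("http://a.example/wine3", {"banner_ad"}),
]
print(site_frequency({"county", "vintage", "banner_ad"}, pages))
# {'county': 2, 'vintage': 2, 'banner_ad': 1}  -> 'banner_ad' scores low
```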
NASA Astrophysics Data System (ADS)
Showstack, Randy
With the growing interest in extreme climate and weather events, the National Oceanic and Atmospheric Administration (NOAA) has set up a one-stop Web site. It includes data on tornadoes, hurricanes, heavy rainfall, temperature extremes, global climate change, satellite images, and El Niño and La Niña. The Web address is http://www.ncdc.noaa.gov. Another good climate Web site is the La Niña Home Page. Set up by the Environmental and Societal Impacts Group of the National Center for Atmospheric Research, the site includes forecasts, data sources, impacts, and Internet links.
Rini, Christine; Vu, Maihan B; Lerner, Hannah; Bloom, Catherine; Carda-Auten, Jessica; Wood, William A; Basch, Ethan M; Voorhees, Peter M; Reeder-Hayes, Katherine E; Keefe, Francis J
2018-04-01
Persistent pain is common and inadequately treated in cancer patients. Behavioral pain interventions are a recommended part of multimodal pain treatments, but they are underused in clinical care due to barriers such as a lack of the resources needed to deliver them in person and difficulties coordinating their use with clinical care. Pain coping skills training (PCST) is an evidence-based behavioral pain intervention traditionally delivered in person. Delivering this training via the web would increase access to it by addressing barriers that currently limit its use. We conducted a patient pilot study of an 8-week web-based PCST program to determine the acceptability of this approach to patients and the program features needed to meet their needs. Focus groups with healthcare providers identified strategies for coordinating the use of web-based PCST in clinical care. Participants included 7 adults with bone pain due to multiple myeloma or metastasized breast or prostate cancer and 12 healthcare providers (4 physicians and 8 advanced practice providers) who treat cancer-related bone pain. Patients completed web-based PCST at home and then took part in an in-depth qualitative interview. Providers attended focus groups led by a trained moderator. Qualitative analyses identified themes in the patient and provider data. Patients reported strongly favorable responses to web-based PCST and described emotional and physical benefits. They offered suggestions for adapting the approach to better fit their needs and to overcome barriers to completion. Focus groups indicated a need to familiarize healthcare providers with PCST and to address concerns about overburdening patients. Providers would recommend the program to patients they felt could benefit. They suggested applying a broad definition of cancer pain and having various types of providers help coordinate the program's use with clinical care. Web-based PCST was acceptable to patients and providers. Our findings suggest that patients could benefit from this approach, especially if patient and provider barriers are addressed.
A Security Architecture for Grid-enabling OGC Web Services
NASA Astrophysics Data System (ADS)
Angelini, Valerio; Petronzio, Luca
2010-05-01
In the proposed presentation we describe an architectural solution for enabling secure access to Grids and possibly other large-scale on-demand processing infrastructures through OGC (Open Geospatial Consortium) Web Services (OWS). This work has been carried out in the context of the security thread of the G-OWS Working Group. G-OWS (gLite enablement of OGC Web Services) is an international open initiative started in 2008 by the European CYCLOPS, GENESI-DR, and DORII Project Consortia in order to collect and coordinate experiences in the enablement of OWSs on top of the gLite Grid middleware. G-OWS investigates the problem of the development of Spatial Data and Information Infrastructures (SDI and SII) based on the Grid/Cloud capacity in order to enable Earth Science applications and tools. Concerning security issues, the integration of OWS-compliant infrastructures and gLite Grids needs to address relevant challenges, due to their respective design principles. In fact, OWSs are part of a Web-based architecture that delegates security aspects to other specifications, whereas the gLite middleware implements the Grid paradigm with a strong security model (the gLite Grid Security Infrastructure: GSI). In our work we propose a Security Architectural Framework allowing the seamless use of Grid-enabled OGC Web Services through the federation of existing security systems (mostly web-based) with the gLite GSI. This is made possible by mediating between different security realms, whose mutual trust is established in advance during the deployment of the system itself. Our architecture is composed of three different security tiers: the user's security system, a specific G-OWS security system, and the gLite Grid Security Infrastructure. Applying the separation-of-concerns principle, each of these tiers is responsible for controlling access to a well-defined resource set, respectively: the user's organization resources, the geospatial resources and services, and the Grid resources. While the gLite middleware is tied to a consolidated security approach based on X.509 certificates, our system is able to support different kinds of user security infrastructures. Our central component, the G-OWS Security Framework, is based on the OASIS WS-Trust specifications and on the OGC GeoRM architectural framework. This allows the framework to satisfy advanced requirements such as the enforcement of specific geospatial policies and complex chained secure web service requests. The typical use case is represented by a scientist belonging to a given organization who issues a request to a G-OWS Grid-enabled Web Service. The system initially asks the user to authenticate to his/her organization's security system and, after verification of the user's security credentials, it translates the user's digital identity into a G-OWS identity. This identity is linked to a set of attributes describing the user's access rights to the G-OWS services and resources. Inside the G-OWS Security system, access restrictions are applied making use of the enhanced geospatial capabilities specified by OGC GeoXACML. If the required action needs to make use of the Grid environment, the system checks whether the user is entitled to access a Grid infrastructure. In that case his/her identity is translated to a temporary Grid security token using the Short Lived Credential Services (IGTF Standard).
In our case, for the specific gLite Grid infrastructure, some information (VOMS Attributes) is plugged into the Grid Security Token to grant access to the user's Virtual Organization Grid resources. The resulting token is used to submit the request to the Grid and also by the various gLite middleware elements to verify the user's grants. Based on the presented framework, the G-OWS Security Working Group developed a prototype, enabling the execution of OGC Web Services on the EGEE Production Grid through federation with a Shibboleth-based security infrastructure. Future plans aim to integrate other Web authentication services such as OpenID, Kerberos and WS-Federation.
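The token-translation flow described above can be summarized schematically; the sketch below uses plain Python stand-ins (no real security libraries) and invented class and attribute names purely to show the order of the three-tier mediation steps:

```python
# Schematic sketch of the three-tier mediation: home-organization login,
# mapping to a G-OWS identity, policy check, and short-lived Grid credential.
# All names and checks are illustrative assumptions, not a real implementation.
from dataclasses import dataclass

@dataclass
class GOWSIdentity:
    user: str
    attributes: dict          # e.g. allowed services, Grid entitlement

def authenticate_home_org(username, password):
    # Stand-in for the user's own security system (e.g. a Shibboleth IdP).
    return username if password == "secret" else None

def to_gows_identity(user):
    # Stand-in for the WS-Trust style translation into a G-OWS identity.
    return GOWSIdentity(user, {"services": {"WPS"}, "grid_allowed": True})

def authorize(identity, service):
    # Stand-in for a GeoXACML-style policy decision point.
    return service in identity.attributes["services"]

def short_lived_grid_token(identity):
    # Stand-in for a Short Lived Credential Service issuing a proxy credential.
    if not identity.attributes["grid_allowed"]:
        raise PermissionError("user not entitled to Grid access")
    return f"proxy-cert-for-{identity.user}"

user = authenticate_home_org("alice", "secret")
ident = to_gows_identity(user)
if authorize(ident, "WPS"):
    print(short_lived_grid_token(ident))
```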
NASA Astrophysics Data System (ADS)
Lindermann, Nadine; Valcárcel, Sylvia; Schaarschmidt, Mario; von Kortzfleisch, Harald
Small- and medium-sized enterprises (SMEs) are of high social and economic importance since they represent 99% of European enterprises. With regard to their restricted resources, SMEs face a limited capacity for innovation to compete with new challenges in a complex and dynamic competitive environment. Given this context, SMEs need to increasingly cooperate to generate innovations on an extended resource base. Our research project focuses on open innovation in SME-networks enabled by Web 2.0 applications, referring to innovative solutions to non-competitive, daily-life problems. Examples are industrial safety, work-life balance issues, or pollution control. The project raises the question whether the use of Web 2.0 applications can foster the exchange of creativity and innovative ideas within a network of SMEs and hence catalyze new forms of innovation processes among its participants. Using Web 2.0 applications within SMEs consequently implies breaking innovation processes down to the employee level and thus systematically opening up a heterogeneous and broader knowledge base to idea generation. This paper addresses first steps on a roadmap towards Web 2.0-based open innovation processes within SME-networks. It presents a general framework for interaction activities leading to open innovation and recommends a regional marketplace as a viable, trust-building driver for further collaborative activities. These findings are based on field research within a specific SME-network in Rhineland-Palatinate, Germany, the “WirtschaftsForum Neuwied e.V.”, which consists of roughly 100 heterogeneous SMEs employing about 8,000 workers.
32 CFR Appendix B to Part 310 - Sample Notification Letter
Code of Federal Regulations, 2010 CFR
2010-07-01
..., social security number, residential address, date of birth, office and home email address, office and... its Web site at http://www.consumer.gov/idtheft/con_steps.htm. The FTC urges that you immediately...
32 CFR Appendix B to Part 310 - Sample Notification Letter
Code of Federal Regulations, 2011 CFR
2011-07-01
..., social security number, residential address, date of birth, office and home email address, office and... its Web site at http://www.consumer.gov/idtheft/con_steps.htm. The FTC urges that you immediately...
Formative Evaluation of a Web-Based Course in Meteorology.
ERIC Educational Resources Information Center
Phelps, Julia; Reynolds, Ross
1999-01-01
Describes the formative-evaluation process for the EuroMET (European Meteorological Education and Training) project, Web-Based university courses in meteorology that were created to address the education and training needs of professional meteorologists and students throughout Europe. Usability and interactive and multimedia elements are…
Re-Examining Cognition during Student-Centered, Web-Based Learning
ERIC Educational Resources Information Center
Hannafin, Michael; Hannafin, Kathleen; Gabbitas, Bruce
2009-01-01
During student-centered learning, the individual assumes responsibility for determining learning goals, monitoring progress toward meeting goals, adjusting or adapting approaches as warranted, and determining when individual goals have been adequately addressed. This can be particularly challenging while learning from the World-Wide Web, where…
Intelligent Web-Based English Instruction in Middle Schools
ERIC Educational Resources Information Center
Jia, Jiyou
2015-01-01
The integration of technology into educational environments has become more prominent over the years. The combination of technology and face-to-face interaction with instructors allows for a thorough, more valuable educational experience. "Intelligent Web-Based English Instruction in Middle Schools" addresses the concerns associated with…
Brown, Jeffrey S; Holmes, John H; Shah, Kiran; Hall, Ken; Lazarus, Ross; Platt, Richard
2010-06-01
Comparative effectiveness research, medical product safety evaluation, and quality measurement will require the ability to use electronic health data held by multiple organizations. There is no consensus about whether to create regional or national combined (e.g., "all payer") databases for these purposes, or distributed data networks that leave most Protected Health Information and proprietary data in the possession of the original data holders. The objective was to demonstrate functions of a distributed research network that supports research needs and also addresses data holders' concerns about participation. Key design functions included strong local control of data uses and a centralized web-based querying interface. We implemented a pilot distributed research network and evaluated the design considerations, utility for research, and the acceptability to data holders of methods for menu-driven querying. We developed and tested a central, web-based interface with supporting network software. Specific functions assessed include query formation and distribution, query execution and review, and aggregation of results. This pilot successfully evaluated temporal trends in medication use and diagnoses at 5 separate sites, demonstrating some of the possibilities of using a distributed research network. The pilot demonstrated the potential utility of the design, which addressed the major concerns of both users and data holders. No serious obstacles were identified that would prevent development of a fully functional, scalable network. Distributed networks are capable of addressing nearly all anticipated uses of routinely collected electronic healthcare data. Distributed networks would obviate the need for centralized databases, thus avoiding numerous obstacles.
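The query distribution and aggregation functions assessed in the pilot follow a pattern that can be sketched in a few lines; site names and records below are invented, and only aggregate counts leave each simulated site:

```python
# Toy distributed-query pattern: the hub sends one query to every data holder,
# each site executes it locally, and only aggregate counts are returned.
site_data = {
    "site_A": [{"year": 2008, "rx": "statin"}, {"year": 2009, "rx": "statin"}],
    "site_B": [{"year": 2009, "rx": "statin"}, {"year": 2009, "rx": "ssri"}],
}

def run_locally(records, rx, year):
    """Executed under the data holder's control; only a count leaves the site."""
    return sum(1 for r in records if r["rx"] == rx and r["year"] == year)

def distributed_query(rx, year):
    per_site = {site: run_locally(recs, rx, year) for site, recs in site_data.items()}
    return per_site, sum(per_site.values())

print(distributed_query("statin", 2009))   # ({'site_A': 1, 'site_B': 1}, 2)
```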
Stinson, Jennifer N; Lalloo, Chitra; Harris, Lauren; Isaac, Lisa; Campbell, Fiona; Brown, Stephen; Ruskin, Danielle; Gordon, Allan; Galonski, Marilyn; Pink, Leah R; Buckley, Norman; Henry, James L; White, Meghan; Karim, Allia
2014-01-01
BACKGROUND: While there are emerging web-based self-management programs for children and adolescents with chronic pain, there is currently not an integrated web- and smartphone-based app that specifically addresses the needs of adolescents with chronic pain. OBJECTIVES: To conduct a needs assessment to inform the development of an online chronic pain self-management program for adolescents, called iCanCope with Pain™. METHODS: A purposive sample of adolescents (n=23; 14 to 18 years of age) was recruited from two pediatric chronic pain clinics in Ontario. Interdisciplinary health care providers were also recruited from these sites. Three focus groups were conducted with adolescents (n=16) and one with pediatric health care providers (n=7). Individual adolescent interviews were also conducted (n=7). RESULTS: Qualitative analysis uncovered four major themes: pain impact; barriers to care; pain management strategies; and transition to adult care. Pain impacted social, emotional, physical and role functioning, as well as future goals. Barriers to care were revealed at the health care system, patient and societal levels. Pain management strategies included support systems, and pharmacological, physical and psychological approaches. Transition subthemes were: disconnect between pediatric and adult systems; skills development; parental role; and fear/anxiety. Based on these identified needs, the iCanCope with Pain™ architecture will include the core theory-based functionalities of: symptom self-monitoring; personalized goal setting; pain coping skills training; peer-based social support; and chronic pain education. CONCLUSIONS: The proposed iCanCope with Pain™ program aims to address the self-management needs of adolescents with chronic pain by improving access to disease information, strategies to manage symptoms and social support. PMID:25000507
Patel, Ronak Y; Shah, Neethu; Jackson, Andrew R; Ghosh, Rajarshi; Pawliczek, Piotr; Paithankar, Sameer; Baker, Aaron; Riehle, Kevin; Chen, Hailin; Milosavljevic, Sofia; Bizon, Chris; Rynearson, Shawn; Nelson, Tristan; Jarvik, Gail P; Rehm, Heidi L; Harrison, Steven M; Azzariti, Danielle; Powell, Bradford; Babb, Larry; Plon, Sharon E; Milosavljevic, Aleksandar
2017-01-12
The success of the clinical use of sequencing based tests (from single gene to genomes) depends on the accuracy and consistency of variant interpretation. Aiming to improve the interpretation process through practice guidelines, the American College of Medical Genetics and Genomics (ACMG) and the Association for Molecular Pathology (AMP) have published standards and guidelines for the interpretation of sequence variants. However, manual application of the guidelines is tedious and prone to human error. Web-based tools and software systems may not only address this problem but also document reasoning and supporting evidence, thus enabling transparency of evidence-based reasoning and resolution of discordant interpretations. In this report, we describe the design, implementation, and initial testing of the Clinical Genome Resource (ClinGen) Pathogenicity Calculator, a configurable system and web service for the assessment of pathogenicity of Mendelian germline sequence variants. The system allows users to enter the applicable ACMG/AMP-style evidence tags for a specific allele with links to supporting data for each tag and generate guideline-based pathogenicity assessment for the allele. Through automation and comprehensive documentation of evidence codes, the system facilitates more accurate application of the ACMG/AMP guidelines, improves standardization in variant classification, and facilitates collaborative resolution of discordances. The rules of reasoning are configurable with gene-specific or disease-specific guideline variations (e.g. cardiomyopathy-specific frequency thresholds and functional assays). The software is modular, equipped with robust application program interfaces (APIs), and available under a free open source license and as a cloud-hosted web service, thus facilitating both stand-alone use and integration with existing variant curation and interpretation systems. The Pathogenicity Calculator is accessible at http://calculator.clinicalgenome.org . By enabling evidence-based reasoning about the pathogenicity of genetic variants and by documenting supporting evidence, the Calculator contributes toward the creation of a knowledge commons and more accurate interpretation of sequence variants in research and clinical care.
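To illustrate the kind of reasoning the Calculator automates, the sketch below encodes a deliberately small subset of the ACMG/AMP combining rules; it is not the Calculator's full, configurable rule set, and the example evidence codes are arbitrary:

```python
# Heavily simplified subset of ACMG/AMP-style evidence combination, for
# illustration only; the Pathogenicity Calculator implements the full rules.
from collections import Counter

def classify(tags):
    """tags: ACMG/AMP-style evidence codes, e.g. ['PVS1', 'PM2', 'PP3']."""
    n = Counter(tag[:2] for tag in tags)          # PV / PS / PM / PP buckets
    pvs, ps, pm, pp = n["PV"], n["PS"], n["PM"], n["PP"]
    if (pvs >= 1 and (ps >= 1 or pm >= 2)) or ps >= 2 or (ps >= 1 and pm >= 3):
        return "Pathogenic"
    if (pvs >= 1 and pm == 1) or (ps == 1 and (pm >= 1 or pp >= 2)) or pm >= 3:
        return "Likely pathogenic"
    return "Uncertain significance (under this simplified subset)"

print(classify(["PVS1", "PS3", "PM2"]))   # -> Pathogenic
print(classify(["PM1", "PM2", "PM5"]))    # -> Likely pathogenic
```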
Robotic Prostatectomy on the Web: A Cross-Sectional Qualitative Assessment.
Borgmann, Hendrik; Mager, René; Salem, Johannes; Bründl, Johannes; Kunath, Frank; Thomas, Christian; Haferkamp, Axel; Tsaur, Igor
2016-08-01
Many patients diagnosed with prostate cancer search for information on robotic prostatectomy (RobP) on the Web. We aimed to evaluate the qualitative characteristics of the most frequented Web sites on RobP with a particular emphasis on provider-dependent issues. Google was searched for the term "robotic prostatectomy" in Europe and North America. The most frequented Web sites were selected and classified as physician-provided and publicly provided. Quality was measured using Journal of the American Medical Association (JAMA) benchmark criteria, the DISCERN score, and addressing of Trifecta surgical outcomes. Popularity was analyzed using Google PageRank and the Alexa tool. Accessibility, usability, and reliability were investigated using the LIDA tool, and readability was assessed using readability indices. Twenty-eight Web sites were physician-provided and 15 publicly provided. For all Web sites, 88% of JAMA benchmark criteria were fulfilled, the DISCERN quality score was high, and 81% of Trifecta outcome measurements were addressed. Popularity was average according to Google PageRank (mean 2.9 ± 1.5) and Alexa Traffic Rank (median, 49,109; minimum, 7; maximum, 8,582,295). Accessibility (85 ± 7%), usability (92 ± 3%), and reliability scores (88 ± 8%) were moderate to high. The Automated Readability Index was 7.2 ± 2.1 and the Flesch-Kincaid Grade Level was 9 ± 2, rating the Web sites as difficult to read. Physician-provided Web sites had higher quality scores and lower readability compared with publicly provided Web sites. Web sites providing information on RobP obtained medium to high ratings in all domains of quality in the current assessment. In contrast, readability needs to be significantly improved so that this content can become accessible to the general population. Copyright © 2015 Elsevier Inc. All rights reserved.
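The readability indices reported above are computed from simple text statistics; the sketch below uses the standard Flesch-Kincaid Grade Level and Automated Readability Index formulas with a crude vowel-group syllable counter, so its outputs are only approximate:

```python
# Approximate Flesch-Kincaid Grade Level and Automated Readability Index.
# The syllable counter is a rough vowel-group heuristic.
import re

def count_syllables(word):
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    w = len(words)
    syllables = sum(count_syllables(x) for x in words)
    chars = sum(len(x) for x in words)
    fkgl = 0.39 * w / sentences + 11.8 * syllables / w - 15.59
    ari = 4.71 * chars / w + 0.5 * w / sentences - 21.43
    return round(fkgl, 1), round(ari, 1)

sample = ("Robotic prostatectomy is a minimally invasive operation. "
          "Your surgeon controls instruments through small incisions.")
print(readability(sample))   # (grade level, ARI) for the sample text
```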
Web-based health interventions for family caregivers of elderly individuals: A Scoping Review.
Wasilewski, Marina B; Stinson, Jennifer N; Cameron, Jill I
2017-07-01
For the growing proportion of elders globally, aging-related illnesses are primary causes of morbidity causing reliance on family members for support in the community. Family caregivers experience poorer physical and mental health than their non-caregiving counterparts. Web-based interventions can provide accessible support to family caregivers to offset declines in their health and well-being. Existing reviews focused on web-based interventions for caregivers have been limited to single illness populations and have mostly focused on the efficacy of the interventions. We therefore have limited insight into how web-based interventions for family caregiver have been developed, implemented and evaluated across aging-related illness. To describe: a) theoretical underpinnings of the literature; b) development, content and delivery of web-based interventions; c) caregiver usage of web-based interventions; d) caregiver experience with web-based interventions and e) impact of web-based interventions on caregivers' health outcomes. We followed Arksey and O'Malley's methodological framework for conducting scoping reviews which entails setting research questions, selecting relevant studies, charting the data and synthesizing the results in a report. Fifty-three publications representing 32 unique web-based interventions were included. Over half of the interventions were targeted at dementia caregivers, with the rest targeting caregivers to the stroke, cancer, diabetes and general frailty populations. Studies used theory across the intervention trajectory. Interventions aimed to improve a range of health outcomes for caregivers through static and interactive delivery methods Caregivers were satisfied with the usability and accessibility of the websites but usage was generally low and declined over time. Depression and caregiver burden were the most common outcomes evaluated. The interventions ranged in their impact on health and social outcomes but reductions in perception of caregiver burden were consistently observed. Caregivers value interactive interventions that are tailored to their unique needs and the illness context. However, usage of the interventions was sporadic and declined over time, indicating that future interventions should address stage-specific needs across the caregiving trajectory. A systematic review has the potential to be conducted given the consistency in caregiver burden and depression as outcomes. Copyright © 2017 Elsevier B.V. All rights reserved.
Quality, range, and legibility in web sites related to orofacial functions.
Corrêa, Camila de Castro; Ferrari, Deborah Viviane; Berretin-Felix, Giédre
2013-10-01
Introduction Plenty of information about health is available on the Internet; however, quality and legibility are not always evaluated. Knowledge regarding orofacial functions can be considered important for the population because it allows proper stimulus, early diagnosis, and prevention of oral myofunctional alterations during early infancy. Objective The aim was to evaluate the quality, legibility, and range of Web sites available in Brazilian Portuguese regarding orofacial functions. Methods Selected Web sites with information directed to parents/caregivers of babies regarding breast-feeding, feeding after 6 months, deleterious oral habits, and breathing and speech were studied. The Web sites were evaluated through the application of the Flesch Reading Ease Test and aspects of the Health on the Net (HON) modified code (HONCode); the range of the subjects addressed was compared with other aspects of infant development. Results Of the 350 Internet pages accessed, 35 Web sites were selected and 315 excluded because they did not meet the inclusion criteria. In relation to legibility, Web sites scored an average of 61.23% in the Flesch Test, and the application of the modified HONCode showed an average of 6.43 points; an average of 2.49 subjects were found per Web site evaluated, with information on breast-feeding being more frequent and subjects such as breathing and speech less frequent. Conclusions Web sites that deal with orofacial functions presented standard legibility classification. Only half of the ethical principles of the modified HONCode were considered by the majority of sites, and subjects other than "breast-feeding" were presented with restricted range.
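For reference, the Flesch Reading Ease index used in the study has the following structure; the constants shown are those of the original English-language formula, whereas studies of Brazilian Portuguese text normally use an adapted variant:

```python
# Flesch Reading Ease in its original English-language form
# (206.835 - 1.015 * words/sentence - 84.6 * syllables/word).
def flesch_reading_ease(words, sentences, syllables):
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

# Example with invented counts; higher scores mean easier reading.
print(round(flesch_reading_ease(words=120, sentences=8, syllables=180), 1))
```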
CliniWeb: managing clinical information on the World Wide Web.
Hersh, W R; Brown, K E; Donohoe, L C; Campbell, E M; Horacek, A E
1996-01-01
The World Wide Web is a powerful new way to deliver on-line clinical information, but several problems limit its value to health care professionals: content is highly distributed and difficult to find, clinical information is not separated from non-clinical information, and the current Web technology is unable to support some advanced retrieval capabilities. A system called CliniWeb has been developed to address these problems. CliniWeb is an index to clinical information on the World Wide Web, providing a browsing and searching interface to clinical content at the level of the health care student or provider. Its database contains a list of clinical information resources on the Web that are indexed by terms from the Medical Subject Headings disease tree and retrieved with the assistance of SAPHIRE. Limitations of the processes used to build the database are discussed, together with directions for future research.
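The MeSH disease-tree indexing described above can be illustrated with a toy lookup keyed by tree numbers, where browsing a term also retrieves resources indexed under its narrower descendants; the tree numbers and URLs below are placeholders:

```python
# Toy MeSH disease-tree index: Web resources keyed by tree numbers, retrieved
# by subtree (prefix) match. Tree numbers and URLs are illustrative only.
index = {
    "C14": ["http://example.org/cardiology-overview"],
    "C14.280": ["http://example.org/heart-failure-tutorial"],
    "C14.280.647": ["http://example.org/mi-case-study"],
}

def resources_under(tree_number):
    """Return resources indexed at the term or anywhere in its subtree."""
    hits = []
    for node, urls in sorted(index.items()):
        if node == tree_number or node.startswith(tree_number + "."):
            hits.extend(urls)
    return hits

print(resources_under("C14.280"))   # tutorial plus the narrower case study
```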
Putting post-registration nursing students on-line: important lessons learned.
Wharrad, Heather J; Cook, Elaine; Poussa, Cherry
2005-05-01
A web site and discussion forum to support a part-time degree course for nurses were introduced not only to support student learning but also to encourage students to use and develop their IT skills. Previous cohorts had identified that health informatics skills needed to be addressed more explicitly throughout the programme. The aims of the project were to: (i) evaluate the use of the web site and discussion forum; (ii) determine the barriers to using the web site and discussion forum; (iii) identify ways of overcoming any barriers. The first aim was addressed by analysing web page hits and contributions to the discussion forum. Students' experiences of using the web site and the discussion forum were collected using a questionnaire and followed up by a focus group made up of high and low users of the discussion forum. Students who had accessed the web site most often felt they had been able to communicate with their peers (Spearman's rho, p < 0.01) and had gained peer support by accessing the web site (Spearman's rho, p > 0.05). None of the participants in this study had used a discussion forum before and whilst some students had the skills and confidence to contribute to the on-line discussions, others 'lurked' and some did not access the discussion facility at all. Strategies for improving the engagement and quality of on-line learning are proposed from the lessons learned during this study.
Web 2.0 and Emerging Technologies in Online Learning
ERIC Educational Resources Information Center
Diaz, Veronica
2010-01-01
As online learning continues to grow, so do the free or nearly free Web 2.0 and emerging online learning technologies available to faculty and students. This chapter explores the implementation process and corresponding considerations of adapting such tools for teaching and learning. Issues addressed include copyright, intellectual property,…
Multilingualism and Web Advertising: Addressing French-Speaking Consumers
ERIC Educational Resources Information Center
Martin, Elizabeth
2011-01-01
Drawing inferences from both quantitative and qualitative data, this study examines the extent to which American companies tailor their Web advertising for global audiences with a particular focus on French-speaking consumers in North America, Europe, Africa, the Caribbean and French Polynesia. Explored from a sociolinguistic and social semiotic…
Streaming Media for Web Based Training.
ERIC Educational Resources Information Center
Childers, Chad; Rizzo, Frank; Bangert, Linda
This paper discusses streaming media for World Wide Web-based training (WBT). The first section addresses WBT in the 21st century, including the Synchronized Multimedia Integration Language (SMIL) standard that allows multimedia content such as text, pictures, sound, and video to be synchronized for a coherent learning experience. The second…
Semantic Search of Web Services
ERIC Educational Resources Information Center
Hao, Ke
2013-01-01
This dissertation addresses semantic search of Web services using natural language processing. We first survey various existing approaches, focusing on the fact that the expensive costs of current semantic annotation frameworks result in limited use of semantic search for large scale applications. We then propose a vector space model based service…
The Choice of Initial Web Search Strategies: A Comparison between Finnish and American Searchers.
ERIC Educational Resources Information Center
Iivonen, Mirja; White, Marilyn Domas
2001-01-01
Describes a study that used qualitative and quantitative methodologies to analyze differences between Finnish and American Web searchers in their choice of initial search strategies (direct address, subject directory, and search engines) and their reasoning underlying their choices. Considers implications for considering cultural differences in…
Website Accessibility for Users with Visual Impairment
ERIC Educational Resources Information Center
Smith, J. A.; Lind, M. R.
2010-01-01
In this web accessibility study of homepages of education departments in post-secondary educational institutions, the 1998 US Section 508 Law regarding webpage accessibility for people with disabilities was addressed. Along with the requirements of this legislation, there are growing demands for web accessibility resulting from age-related visual…
38 CFR 74.10 - Where must an application be filed?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Information Pages database located in the Center for Veterans Enterprise's Web portal, http://www.VetBiz.gov... information. Address information for the CVE is also contained on the Web portal. Correspondence may be dispatched to: Director, Center for Veterans Enterprise (00VE), U.S. Department of Veterans Affairs, 810...
38 CFR 74.10 - Where must an application be filed?
Code of Federal Regulations, 2014 CFR
2014-07-01
... Information Pages database located in the Center for Veterans Enterprise's Web portal, http://www.VetBiz.gov... information. Address information for the CVE is also contained on the Web portal. Correspondence may be dispatched to: Director, Center for Veterans Enterprise (00VE), U.S. Department of Veterans Affairs, 810...
38 CFR 74.10 - Where must an application be filed?
Code of Federal Regulations, 2013 CFR
2013-07-01
... Information Pages database located in the Center for Veterans Enterprise's Web portal, http://www.VetBiz.gov... information. Address information for the CVE is also contained on the Web portal. Correspondence may be dispatched to: Director, Center for Veterans Enterprise (00VE), U.S. Department of Veterans Affairs, 810...
38 CFR 74.10 - Where must an application be filed?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Information Pages database located in the Center for Veterans Enterprise's Web portal, http://www.VetBiz.gov... information. Address information for the CVE is also contained on the Web portal. Correspondence may be dispatched to: Director, Center for Veterans Enterprise (00VE), U.S. Department of Veterans Affairs, 810...
The World Wide Web Has Arrived--Science Educators Must All Get Aboard It.
ERIC Educational Resources Information Center
Didion, Catherine Jay
1997-01-01
Discusses the importance of science educators becoming familiar with electronic resources. Highlights the publication Science Teaching Reconsidered: A Handbook, which is designed to help undergraduate science educators. Addresses gender concerns regarding the use of educational resources. Lists science education and career resources on the web.…
76 FR 28733 - Gulf of Mexico Fishery Management Council; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-18
.... SUMMARY: The Gulf of Mexico Fishery Management Council will convene a Web based meeting of the... the Gulf of Mexico Fishery Management Council's Web site at http://www.gulfcouncil.org for instructions. Council address: Gulf of Mexico Fishery Management Council, 2203 N. Lois Avenue, Suite 1100...
22 CFR 42.33 - Diversity immigrants.
Code of Federal Regulations, 2011 CFR
2011-04-01
...-line and submit to the Department of State via a Web site established by the Department of State for the purpose of receiving such petitions. The Department will specify the address of the Web site prior... neutral, light-colored background. Photos taken with very dark or patterned, busy backgrounds will not be...
22 CFR 42.33 - Diversity immigrants.
Code of Federal Regulations, 2012 CFR
2012-04-01
...-line and submit to the Department of State via a Web site established by the Department of State for the purpose of receiving such petitions. The Department will specify the address of the Web site prior... neutral, light-colored background. Photos taken with very dark or patterned, busy backgrounds will not be...
22 CFR 42.33 - Diversity immigrants.
Code of Federal Regulations, 2013 CFR
2013-04-01
...-line and submit to the Department of State via a Web site established by the Department of State for the purpose of receiving such petitions. The Department will specify the address of the Web site prior... neutral, light-colored background. Photos taken with very dark or patterned, busy backgrounds will not be...
22 CFR 42.33 - Diversity immigrants.
Code of Federal Regulations, 2010 CFR
2010-04-01
...-line and submit to the Department of State via a Web site established by the Department of State for the purpose of receiving such petitions. The Department will specify the address of the Web site prior... neutral, light-colored background. Photos taken with very dark or patterned, busy backgrounds will not be...
22 CFR 42.33 - Diversity immigrants.
Code of Federal Regulations, 2014 CFR
2014-04-01
...-line and submit to the Department of State via a Web site established by the Department of State for the purpose of receiving such petitions. The Department will specify the address of the Web site prior... neutral, light-colored background. Photos taken with very dark or patterned, busy backgrounds will not be...
Teaching Positions: Difference, Pedagogy, and the Power of Address.
ERIC Educational Resources Information Center
Ellsworth, Elizabeth
This collection of essays takes the question of pedagogy into a variety of places, including film studies, psychoanalytic literature criticism, dialog, and readings of educational documentary films and web sites. Part 1, "Teaching as a Scene of Address," includes chapters 1-6. The chapters introduce the concept of mode of address and…
Panoramic-image-based rendering solutions for visualizing remote locations via the web
NASA Astrophysics Data System (ADS)
Obeysekare, Upul R.; Egts, David; Bethmann, John
2000-05-01
With advances in panoramic image-based rendering techniques and the rapid expansion of web advertising, new techniques are emerging for visualizing remote locations on the WWW. Success of these techniques depends on how easy and inexpensive it is to develop a new type of web content that provides pseudo 3D visualization at home, 24-hours a day. Furthermore, the acceptance of this new visualization medium depends on the effectiveness of the familiarization tools by a segment of the population that was never exposed to this type of visualization. This paper addresses various hardware and software solutions available to collect, produce, and view panoramic content. While cost and effectiveness of building the content is being addressed using a few commercial hardware solutions, effectiveness of familiarization tools is evaluated using a few sample data sets.
Evaluation of the Kloswall longwall mining system
NASA Astrophysics Data System (ADS)
Guay, P. J.
1982-04-01
A new longwall mining system specifically designed to extract a very deep web (48 inches or deeper) from a longwall panel was studied. A productivity and cost analysis comparing the new mining system with a conventional longwall operation taking a 30-inch-wide web is presented. It is shown that the new system will increase annual production and return on investment in most cases. Conceptual drawings and specifications for a high capacity three drum shearer and a unique shield type of roof support specifically designed for very wide web operation are reported. The advantages and problems associated with wide web mining in general, and as they relate specifically to the equipment selected for the new mining system, are discussed.
Minich, Nori; Taylor, H. Gerry; Kirkwood, Michael; Brown, Tanya Maines; Stancin, Terry; Wade, Shari L
2015-01-01
Objective Investigate effectiveness of an online Counselor-Assisted Problem-Solving (CAPS) intervention on family functioning after traumatic brain injury (TBI). Methods Participants were randomized to CAPS (n = 65) or internet resource comparison (IRC; n = 67). CAPS is a counselor-assisted web-based program. IRC was given access to online resources. Outcomes were examined 6 months, 12 months, and 18 months after baseline. Injury severity, age, and SES were examined as moderators. Results A main effect of time was noted for teen-reported conflict and parent-reported problem solving. CAPS was associated with decreased parent-reported conflict and a reduction in effective parental communication. Effects were specific to subsets of the sample. Conclusions CAPS, a family-based problem-solving intervention designed to address problem behaviors, had modest effects on some aspects of family functioning when compared to IRC. Effects were generally limited to subsets of the families and were not evident across all follow-up assessments. PMID:26461100
Quintiliani, Lisa M; De Jesus, Maria; Wallington, Sherrie Flynt
2011-01-01
To examine an organizational level perspective of the process of adopting Web-based tailored nutrition and physical activity programs for community college students. In this qualitative study, 21 individual key informant interviews of community college student services and health center administrators were used to examine organizational-level perceptions of interest in, design characteristics of, and ways to promote health programs. A cross-classification matrix of a priori and emergent themes related to student diversity was created to describe cross-cutting patterns. Findings revealed 5 emergent themes for consideration in program development related to student diversity: (1) multiple roles played by students, (2) limited access to financial resources, (3) varied student demographics, (4) different levels of understanding, and (5) commuting to campus. Nutrition and physical activity programs for community colleges need to specifically address the diverse nature of their students to increase the potential of adoption. Copyright © 2011 Society for Nutrition Education. Published by Elsevier Inc. All rights reserved.
Swanson, Jonathan O; Plotner, David; Franklin, Holly L; Swanson, David L; Lokomba Bolamba, Victor; Lokangaka, Adrien; Sayury Pineda, Irma; Figueroa, Lester; Garces, Ana; Muyodi, David; Esamai, Fabian; Kanaiza, Nancy; Mirza, Waseem; Naqvi, Farnaz; Saleem, Sarah; Mwenechanya, Musaku; Chiwila, Melody; Hamsumonde, Dorothy; McClure, Elizabeth M; Goldenberg, Robert L; Nathan, Robert O
2016-01-01
High quality is important in medical imaging, yet in many geographic areas, highly skilled sonographers are in short supply. Advances in Internet capacity along with the development of reliable portable ultrasounds have created an opportunity to provide centralized remote quality assurance (QA) for ultrasound exams performed at rural sites worldwide. We sought to harness these advances by developing a web-based tool to facilitate QA activities for newly trained sonographers who were taking part in a cluster randomized trial investigating the role of limited obstetric ultrasound to improve pregnancy outcomes in 5 low- and middle-income countries. We were challenged by connectivity issues, by country-specific needs for website usability, and by the overall need for a high-throughput system. After systematically addressing these needs, the resulting QA website helped drive ultrasound quality improvement across all 5 countries. It now offers the potential for adoption by future ultrasound- or imaging-based global health initiatives. PMID:28031304
ERIC Educational Resources Information Center
Ross, Irwin
1977-01-01
Specific characteristics of the different spiders' webs are discussed. Photographs illustrate the various web designs and web-making spiders. Included also are the numerous uses a spider makes from its own web. (MA)
Readability of menopause web sites: a cross-sectional study.
Charbonneau, Deborah H
2012-01-01
More women are frequently referring to the Internet for health information, yet the readability of information about menopause on the Internet has not been widely studied. To address this gap, this study examined the readability of information about menopause on 25 Internet Web sites. Findings included that information on the Web sites had a reading level higher than the recommended sixth-grade level, and culturally appropriate health information was lacking. Health educators and practitioners are in a pivotal role to help women understand information useful for healthcare decisions. Several criteria are discussed to help practitioners evaluate Web sites.
Web Application Design Using Server-Side JavaScript
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hampton, J.; Simons, R.
1999-02-01
This document describes the application design philosophy for the Comprehensive Nuclear Test Ban Treaty Research & Development Web Site. This design incorporates object-oriented techniques to produce a flexible and maintainable system of applications that support the web site. These techniques will be discussed at length along with the issues they address. The overall structure of the applications and their relationships with one another will also be described. The current problems and future design changes will be discussed as well.
Availability of the OGC geoprocessing standard: March 2011 reality check
NASA Astrophysics Data System (ADS)
Lopez-Pellicer, Francisco J.; Rentería-Agualimpia, Walter; Béjar, Rubén; Muro-Medrano, Pedro R.; Zarazaga-Soria, F. Javier
2012-10-01
This paper presents an investigation about the servers available in March 2011 conforming to the Web Processing Service interface specification published by the geospatial standards organization Open Geospatial Consortium (OGC) in 2007. This interface specification gives support to standard Web-based geoprocessing. The data used in this research were collected using a focused crawler configured for finding OGC Web services. The research goals are (i) to provide a reality check of the availability of Web Processing Service servers, (ii) to provide quantitative data about the use of different features defined in the standard that are relevant for a scalable Geoprocessing Web (e.g. long-running processes, Web-accessible data outputs), and (iii) to test if the advances in the use of search engines and focused crawlers for finding Web services can be applied for finding geoscience processing systems. Research results show the feasibility of the discovery approach and provide data about the implementation of the Web Processing Service specification. These results also show extensive use of features related to scalability, except for those related to technical and semantic interoperability.
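As an illustration of how such a crawler-driven reality check might probe a candidate endpoint, the sketch below issues a standard OGC GetCapabilities request (service=WPS) and checks for an XML Capabilities response; the endpoint URL, function name, and acceptance test are assumptions, not the survey's actual crawler:

```python
# Minimal availability probe for a candidate WPS endpoint discovered by a crawler.
import urllib.parse
import urllib.request

def probe_wps(endpoint: str, timeout: float = 10.0) -> bool:
    """Return True if the endpoint answers a WPS GetCapabilities request."""
    params = {"service": "WPS", "request": "GetCapabilities"}
    url = endpoint + ("&" if "?" in endpoint else "?") + urllib.parse.urlencode(params)
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read(65536).decode("utf-8", errors="replace")
    except (OSError, ValueError):
        return False  # unreachable, timed out, or malformed URL
    # A conforming server returns an XML Capabilities document.
    return "Capabilities" in body

print(probe_wps("http://example.org/wps"))  # placeholder endpoint
```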
31 CFR 363.5 - How do I contact Public Debt?
Code of Federal Regulations, 2010 CFR
2010-07-01
... TreasuryDirect ® to communicate information to us over a secure Internet connection. (b) Emails may be sent... should be addressed to the address provided on our web site at http://www.treasurydirect.gov/write.htm...
Web Prep: How to Prepare NAS Reports For Publication on the Web
NASA Technical Reports Server (NTRS)
Walatka, Pamela; Balakrishnan, Prithika; Clucas, Jean; McCabe, R. Kevin; Felchle, Gail; Brickell, Cristy
1996-01-01
This document contains specific advice and requirements for NASA Ames Code IN authors of NAS reports. Much of the information may be of interest to other authors writing for the Web. WebPrep has a graphic Table of Contents in the form of a WebToon, which simulates a discussion between a scientist and a Web publishing consultant. In the WebToon, Frequently Asked Questions about preparing reports for the Web are linked to relevant text in the body of this document. We also provide a text-only Table of Contents. The text for this document is divided into chapters: each chapter corresponds to one frame of the WebToons. The chapter topics are: converting text to HTML, converting 2D graphic images to gif, creating imagemaps and tables, converting movie and audio files to Web formats, supplying 3D interactive data, and (briefly) JAVA capabilities. The last chapter is specifically for NAS staff authors. The Glossary-Index lists web related words and links to topics covered in the main text.
Efficient Web Vulnerability Detection Tool for Sleeping Giant-Cross Site Request Forgery
NASA Astrophysics Data System (ADS)
Parimala, G.; Sangeetha, M.; AndalPriyadharsini, R.
2018-04-01
Web applications are now heavily used because of their user-friendly environments and easy access to information via the Internet, but they are exposed to many threats. The CSRF attack is one of the serious threats to web applications; it exploits vulnerabilities present in the normal HTTP request and response cycle. It is hard to detect and is therefore still present in most existing web applications. In a CSRF attack, unwanted actions on a trusted website are forced to happen without the user's knowledge, which is why it is placed on OWASP's list of the top 10 web application attacks. The proposed work performs a real-time scan for CSRF vulnerability on a given web application URL, as well as on an organization's local host addresses, using the Python language. Client-side detection of CSRF relies on the count of forms present in the given web site.
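A minimal sketch of the form-count heuristic described above, not the authors' scanner: it fetches a page and counts form elements, each of which is a candidate entry point to inspect for CSRF protection. The target address and function names are placeholders.

```python
# Illustrative form-count scan of a single page; not the tool described above.
import urllib.request
from html.parser import HTMLParser

class FormCounter(HTMLParser):
    """Counts <form> elements in an HTML document."""
    def __init__(self):
        super().__init__()
        self.form_count = 0

    def handle_starttag(self, tag, attrs):
        if tag.lower() == "form":
            self.form_count += 1

def count_forms(url: str) -> int:
    """Fetch a page and return the number of forms it contains."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    counter = FormCounter()
    counter.feed(html)
    return counter.form_count

# Each form found is a candidate entry point to check for CSRF protections.
print(count_forms("http://localhost:8000/"))  # placeholder local address
```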
Leaf LIMS: A Flexible Laboratory Information Management System with a Synthetic Biology Focus.
Craig, Thomas; Holland, Richard; D'Amore, Rosalinda; Johnson, James R; McCue, Hannah V; West, Anthony; Zulkower, Valentin; Tekotte, Hille; Cai, Yizhi; Swan, Daniel; Davey, Robert P; Hertz-Fowler, Christiane; Hall, Anthony; Caddick, Mark
2017-12-15
This paper presents Leaf LIMS, a flexible laboratory information management system (LIMS) designed to address the complexity of synthetic biology workflows. At the project's inception there was a lack of a LIMS designed specifically to address synthetic biology processes, with most systems focused on either next generation sequencing or biobanks and clinical sample handling. Leaf LIMS implements integrated project, item, and laboratory stock tracking, offering complete sample and construct genealogy, materials and lot tracking, and modular assay data capture. Hence, it enables highly configurable task-based workflows and supports data capture from project inception to completion. As such, in addition to it supporting synthetic biology it is ideal for many laboratory environments with multiple projects and users. The system is deployed as a web application through Docker and is provided under a permissive MIT license. It is freely available for download at https://leaflims.github.io .
Healthcare Needs of and Access Barriers for Brazilian Transgender and Gender Diverse People.
Costa, Angelo Brandelli; da Rosa Filho, Heitor Tome; Pase, Paola Fagundes; Fontanari, Anna Martha Vaitses; Catelan, Ramiro Figueiredo; Mueller, Andressa; Cardoso, Dhiordan; Soll, Bianca; Schwarz, Karine; Schneider, Maiko Abel; Gagliotti, Daniel Augusto Mori; Saadeh, Alexandre; Lobato, Maria Inês Rodrigues; Nardi, Henrique Caetano; Koller, Silvia Helena
2018-02-01
Transgender and gender diverse people (TGD) have specific healthcare needs and struggles with access barriers that should be addressed by public health systems. Our study aimed to address this topic in the Brazilian context. A hospital and web-based cross-sectional survey built with input from the medical and transgender communities was developed to assess TGD healthcare needs of and access barriers in two Brazilian states. Although services that assist this population have existed in Brazil since the 1990s, TGD have difficulty accessing these services due to discrimination, lack of information and a policy design that does not meet the needs of TGD. A history of discrimination was associated with a 6.72-fold increase in the frequency of health service avoidance [95% CI (4.5, 10.1)]. This article discusses the urgent necessity for adequate health policies and for the training of professionals regarding the needs of Brazilian TGD.
2008-03-01
OC4J applications support Java Servlets, Web services, and J2EE-specific standards such as Extensible Markup Language (XML), Lightweight Directory Access Protocol (LDAP), World Wide Web Distributed Authoring and Versioning (WebDAV), Java Specification Request 168 (JSR 168), and Web Services for Remote Portlets.
Communication strategies to address geohydrological risks: the POLARIS web initiative in Italy
NASA Astrophysics Data System (ADS)
Salvati, Paola; Pernice, Umberto; Bianchi, Cinzia; Marchesini, Ivan; Fiorucci, Federica; Guzzetti, Fausto
2016-06-01
Floods and landslides are common phenomena that cause serious damage and pose a severe threat to the population of Italy. The social and economic impact of floods and landslides in Italy is severe, and strategies to target the mitigation of the effects of these phenomena are needed. In the last few years, the scientific community has started to use web technology to communicate information on geohydrological hazards and the associated risks. However, the communication is often targeted at technical experts. In the attempt to communicate relevant information on geohydrological hazards with potential human consequences to a broader audience, we designed the POpoLazione A RISchio (POLARIS) website. POLARIS publishes accurate information on geohydrological risk to the population of Italy, including periodic reports on landslide and flood risk, analyses of specific damaging events and blog posts on landslide and flood events. By monitoring the access to POLARIS in the 21-month period between January 2014 and October 2015, we found that access increased during particularly damaging geohydrological events and immediately after the website was advertised by press releases. POLARIS demonstrates that the scientific community can implement suitable communication strategies that address different societal audiences, exploiting the role of mass media and social media. The strategies can help multiple audiences understand how risks can be reduced through appropriate measures and behaviours, contributing to increasing the resilience of the population to geohydrological risk.
Garrison, Nanibaa' A; Sathe, Nila A; Antommaria, Armand H Matheny; Holm, Ingrid A; Sanderson, Saskia C; Smith, Maureen E; McPheeters, Melissa L; Clayton, Ellen W
2016-07-01
In 2011, an Advanced Notice of Proposed Rulemaking proposed that de-identified human data and specimens be included in biobanks only if patients provide consent. The National Institutes of Health Genomic Data Sharing policy went into effect in 2015, requiring broad consent from almost all research participants. We conducted a systematic literature review of attitudes toward biobanking, broad consent, and data sharing. Bibliographic databases included MEDLINE, Web of Science, EthxWeb, and GenETHX. Study screening was conducted using DistillerSR. The final 48 studies included surveys (n = 23), focus groups (n = 8), mixed methods (n = 14), interviews (n = 1), and consent form analyses (n = 2). Study quality was characterized as good (n = 19), fair (n = 27), and poor (n = 2). Although many participants objected, broad consent was often preferred over tiered or study-specific consent, particularly when broad consent was the only option, samples were de-identified, logistics of biobanks were communicated, and privacy was addressed. Willingness for data to be shared was high, but it was lower among individuals from under-represented minorities, individuals with privacy and confidentiality concerns, and when pharmaceutical companies had access to data. Additional research is needed to understand factors affecting willingness to give broad consent for biobank research and data sharing in order to address concerns to enhance acceptability. Genet Med 18(7):663-671.
Gowda, Charitha; Schaffer, Sarah E.; Kopec, Kristin; Markel, Arielle; Dempsey, Amanda F.
2013-01-01
Healthcare providers need strategies to better address the concerns of vaccine-hesitant parents. We studied whether individually tailored education was more effective than untailored education at improving vaccination intention among MMR vaccine-hesitant parents. In an intervention pilot study of parents (n = 77) of children < 6 y who screened as hesitant to vaccinate against MMR (first or second dose), parents were randomly assigned to receive either (1) educational web pages that were individually tailored to address their specific vaccine concerns; or (2) web pages similar in appearance to the intervention but containing untailored information. The main outcome, change in vaccination intention before and after the intervention, was assessed using an 11-pt scale (higher values indicated greater intent). We found that a greater proportion of parents in the tailored than untailored arm had positive vaccination intentions after viewing educational information (58% vs. 46%). Furthermore, parents in the tailored group had a greater magnitude of change in vaccination intention (1.08 vs. 0.49 points) than participants in the untailored group. However, neither of these results was statistically significant. From this pilot study we conclude message tailoring may be an effective way to improve vaccine compliance among vaccine hesitant parents. However, larger studies are warranted to further investigate the efficacy of providing tailored education for increasing vaccine acceptance among parents with diverse beliefs. PMID:23291937
Timpka, Toomas; Eriksson, Henrik; Ludvigsson, Johnny; Ekberg, Joakim; Nordfeldt, Sam; Hanberger, Lena
2008-11-28
Chronic disease management is a global health concern. By the time they reach adolescence, 10-15% of all children live with a chronic disease. The role of educational interventions in facilitating adaptation to chronic disease is receiving growing recognition, and current care policies advocate greater involvement of patients in self-care. Web 2.0 is an umbrella term for new collaborative Internet services characterized by user participation in developing and managing content. Key elements include Really Simple Syndication (RSS) to rapidly disseminate awareness of new information; weblogs (blogs) to describe new trends, wikis to share knowledge, and podcasts to make information available on personal media players. This study addresses the potential to develop Web 2.0 services for young persons with a chronic disease. It is acknowledged that the management of childhood chronic disease is based on interplay between initiatives and resources on the part of patients, relatives, and health care professionals, and where the balance shifts over time to the patients and their families. Participatory action research was used to stepwise define a design specification in the form of a pattern language. Support for children diagnosed with diabetes Type 1 was used as the example area. Each individual design pattern was determined graphically using card sorting methods, and textually in the form Title, Context, Problem, Solution, Examples and References. Application references were included at the lowest level in the graphical overview in the pattern language but not specified in detail in the textual descriptions. The design patterns are divided into functional and non-functional design elements, and formulated at the levels of organizational, system, and application design. The design elements specify access to materials for development of the competences needed for chronic disease management in specific community settings, endorsement of self-learning through online peer-to-peer communication, and systematic accreditation and evaluation of materials and processes. The use of design patterns allows representing the core design elements of a Web 2.0 system upon which an 'ecological' development of content respecting these constraints can be built. Future research should include evaluations of Web 2.0 systems implemented according to the architecture in practice settings.
A Simulation Model that Decreases Faculty Concerns about Adopting Web-Based Instruction
ERIC Educational Resources Information Center
Song, Hae-Deok; Wang, Wei-Tsong; Liu, Chao-Yueh
2011-01-01
Faculty members have different concerns as they integrate new technology into their teaching practices. The integration of Web-Based Instruction in higher-education settings will not be successful if these faculty concerns are not addressed. Four main stages of faculty concern (information, personal, management, and impact) were identified based…
The Full Monty: Locating Resources, Creating, and Presenting a Web Enhanced History Course.
ERIC Educational Resources Information Center
Bazillion, Richard J.; Braun, Connie L.
2001-01-01
Discusses how to develop a history course using the World Wide Web; course development software; full text digitized articles, electronic books, primary documents, images, and audio files; and computer equipment such as LCD projectors and interactive whiteboards. Addresses the importance of support for faculty using technology in teaching. (PAL)
Meeting the Needs of Travel Clientele: Tried and True Strategies That Work.
ERIC Educational Resources Information Center
Blessing, Kathy; Whitney, Cherine
This paper describes sources for meeting the information needs of travel clientele. Topics addressed include: (1) U.S. government Web sites; (2) collection development tools, including review journals, online bookstores, travel Web sites, and sources of point-by-point comparisons of guide books; (3) prominent guidebook series and publisher Web…
46 CFR 520.9 - Access to tariffs.
Code of Federal Regulations, 2013 CFR
2013-10-01
... networks (“PSTN”); or (2) The Internet (Web) by: (i) Web browser; or (ii) Telnet session. (b) Dial-up...) Internet connection. (1) This connection option requires that systems provide: (i) A universal resource locator (“URL”) Internet address (e.g., http://www.tariffsrus.com or http://1.2.3.4); and/or (ii) A URL...
46 CFR 520.9 - Access to tariffs.
Code of Federal Regulations, 2012 CFR
2012-10-01
... networks (“PSTN”); or (2) The Internet (Web) by: (i) Web browser; or (ii) Telnet session. (b) Dial-up...) Internet connection. (1) This connection option requires that systems provide: (i) A universal resource locator (“URL”) Internet address (e.g., http://www.tariffsrus.com or http://1.2.3.4); and/or (ii) A URL...
46 CFR 520.9 - Access to tariffs.
Code of Federal Regulations, 2014 CFR
2014-10-01
... networks (“PSTN”); or (2) The Internet (Web) by: (i) Web browser; or (ii) Telnet session. (b) Dial-up...) Internet connection. (1) This connection option requires that systems provide: (i) A universal resource locator (“URL”) Internet address (e.g., http://www.tariffsrus.com or http://1.2.3.4); and/or (ii) A URL...
75 FR 20850 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-21
... in the efforts to make system changes. Grantees will complete systems web-based data entry on goals... grantee and key staff-partner interview 249 2 1.60 797 guide EBHV grantee systems web-based data entry... prevent child maltreatment. An evaluation study will address four domains: (1) Systems change to develop...
46 CFR 520.9 - Access to tariffs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... networks (“PSTN”); or (2) The Internet (Web) by: (i) Web browser; or (ii) Telnet session. (b) Dial-up...) Internet connection. (1) This connection option requires that systems provide: (i) A universal resource locator (“URL”) Internet address (e.g., http://www.tariffsrus.com or http://1.2.3.4); and/or (ii) A URL...
46 CFR 520.9 - Access to tariffs.
Code of Federal Regulations, 2011 CFR
2011-10-01
... networks (“PSTN”); or (2) The Internet (Web) by: (i) Web browser; or (ii) Telnet session. (b) Dial-up...) Internet connection. (1) This connection option requires that systems provide: (i) A universal resource locator (“URL”) Internet address (e.g., http://www.tariffsrus.com or http://1.2.3.4); and/or (ii) A URL...
Eliciting Web Site Preferences of People with Learning Disabilities
ERIC Educational Resources Information Center
Williams, Peter
2017-01-01
The Internet can be an excellent tool to help people with learning disabilities access relevant and appropriately written information. However, little work has been undertaken to ascertain web design or content preferences for this cohort. This paper examines methods to address this issue. Twenty five participants were presented with three web…
45 CFR 154.301 - CMS's determinations of Effective Rate Review Programs.
Code of Federal Regulations, 2013 CFR
2013-10-01
... provide, for the rate increases it reviews, access from its Web site to at least the information contained... provide CMS's Web address for such information) and have a mechanism for receiving public comments on... quality; (ix) The impact of changes in other administrative costs; (x) The impact of changes in applicable...
45 CFR 154.301 - CMS's determinations of Effective Rate Review Programs.
Code of Federal Regulations, 2014 CFR
2014-10-01
... provide, for the rate increases it reviews, access from its Web site to at least the information contained... provide CMS's Web address for such information) and have a mechanism for receiving public comments on... quality; (ix) The impact of changes in other administrative costs; (x) The impact of changes in applicable...
Policies and Procedures for Accessing Archived NASA Data via the Web
NASA Technical Reports Server (NTRS)
James, Nathan
2011-01-01
The National Space Science Data Center (NSSDC) was established by NASA to provide for the preservation and dissemination of scientific data from NASA missions. This white paper will address the NSSDC policies that govern data preservation and dissemination and the various methods of accessing NSSDC-archived data via the web.
World-Wide Web: Adding Multimedia to Cyberspace.
ERIC Educational Resources Information Center
Descy, Don E.
1994-01-01
Describes the World-Wide Web (WWW), a network information resource based on hypertext. How to access WWW browsers through remote login (telnet) or though free browser software, such as Mosaic, is provided. Eight information sources that can be accessed through the WWW are listed. The address of a listserv reporting on Internet developments is…
MetaSpider: Meta-Searching and Categorization on the Web.
ERIC Educational Resources Information Center
Chen, Hsinchun; Fan, Haiyan; Chau, Michael; Zeng, Daniel
2001-01-01
Discusses the difficulty of locating relevant information on the Web and studies two approaches to addressing the low precision and poor presentation of search results: meta-search and document categorization. Introduces MetaSpider, a meta-search engine, and presents results of a user evaluation study that compared three search engines.…
ERIC Educational Resources Information Center
Menard, Lauren A.
2011-01-01
Obstacles to the classroom implementation of the fourth grade Math component of Louisiana's web-based testing tutorial were addressed in this informal pilot. Technology integration improved standardized test preparation for students with special needs. Supplemental test preparation sessions give the benefits of (a) increased familiarity with…
Domain-specific Web Service Discovery with Service Class Descriptions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rocco, D; Caverlee, J; Liu, L
2005-02-14
This paper presents DynaBot, a domain-specific web service discovery system. The core idea of the DynaBot service discovery system is to use domain-specific service class descriptions powered by an intelligent Deep Web crawler. In contrast to current registry-based service discovery systems--like the several available UDDI registries--DynaBot promotes focused crawling of the Deep Web of services and discovers candidate services that are relevant to the domain of interest. It uses intelligent filtering algorithms to match services found by focused crawling with the domain-specific service class descriptions. We demonstrate the capability of DynaBot through the BLAST service discovery scenario and describe our initial experience with DynaBot.
Redesigning Instruction through Web-based Course Authoring Tools.
ERIC Educational Resources Information Center
Dabbagh, Nada H.; Schmitt, Jeff
1998-01-01
Examines the pedagogical implications of redesigning instruction for Web-based delivery through a case study of an undergraduate computer science course. Initially designed for a traditional learning environment, this course transformed to a Web-based course using WebCT, a Web-based course authoring tool. Discusses the specific features of WebCT.…
NeAT: a toolbox for the analysis of biological networks, clusters, classes and pathways.
Brohée, Sylvain; Faust, Karoline; Lima-Mendez, Gipsi; Sand, Olivier; Janky, Rekin's; Vanderstocken, Gilles; Deville, Yves; van Helden, Jacques
2008-07-01
The network analysis tools (NeAT) (http://rsat.ulb.ac.be/neat/) provide a user-friendly web access to a collection of modular tools for the analysis of networks (graphs) and clusters (e.g. microarray clusters, functional classes, etc.). A first set of tools supports basic operations on graphs (comparison between two graphs, neighborhood of a set of input nodes, path finding and graph randomization). Another set of programs makes the connection between networks and clusters (graph-based clustering, cliques discovery and mapping of clusters onto a network). The toolbox also includes programs for detecting significant intersections between clusters/classes (e.g. clusters of co-expression versus functional classes of genes). NeAT are designed to cope with large datasets and provide a flexible toolbox for analyzing biological networks stored in various databases (protein interactions, regulation and metabolism) or obtained from high-throughput experiments (two-hybrid, mass-spectrometry and microarrays). The web interface interconnects the programs in predefined analysis flows, enabling to address a series of questions about networks of interest. Each tool can also be used separately by entering custom data for a specific analysis. NeAT can also be used as web services (SOAP/WSDL interface), in order to design programmatic workflows and integrate them with other available resources.
PACCMIT/PACCMIT-CDS: identifying microRNA targets in 3′ UTRs and coding sequences
Šulc, Miroslav; Marín, Ray M.; Robins, Harlan S.; Vaníček, Jiří
2015-01-01
The purpose of the proposed web server, publicly available at http://paccmit.epfl.ch, is to provide a user-friendly interface to two algorithms for predicting messenger RNA (mRNA) molecules regulated by microRNAs: (i) PACCMIT (Prediction of ACcessible and/or Conserved MIcroRNA Targets), which identifies primarily mRNA transcripts targeted in their 3′ untranslated regions (3′ UTRs), and (ii) PACCMIT-CDS, designed to find mRNAs targeted within their coding sequences (CDSs). While PACCMIT belongs among the accurate algorithms for predicting conserved microRNA targets in the 3′ UTRs, the main contribution of the web server is 2-fold: PACCMIT provides an accurate tool for predicting targets also of weakly conserved or non-conserved microRNAs, whereas PACCMIT-CDS addresses the lack of similar portals adapted specifically for targets in CDS. The web server asks the user for microRNAs and mRNAs to be analyzed, accesses the precomputed P-values for all microRNA–mRNA pairs from a database for all mRNAs and microRNAs in a given species, ranks the predicted microRNA–mRNA pairs, evaluates their significance according to the false discovery rate and finally displays the predictions in a tabular form. The results are also available for download in several standard formats. PMID:25948580
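As a hedged illustration of the ranking and false-discovery-rate step described above, the sketch below applies the standard Benjamini-Hochberg procedure to a few invented microRNA-mRNA P-values; it is not the server's code, and the pairs and values are illustrative only:

```python
# Rank predicted microRNA-mRNA pairs by P-value and estimate the false
# discovery rate with the Benjamini-Hochberg procedure (illustrative data).
def benjamini_hochberg(pvalues):
    """Return BH q-values in the original order of the input P-values."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    qvalues = [0.0] * m
    running_min = 1.0
    for rank_from_end, idx in enumerate(reversed(order)):
        rank = m - rank_from_end           # 1-based rank of this P-value
        q = pvalues[idx] * m / rank
        running_min = min(running_min, q)  # enforce monotone q-values
        qvalues[idx] = running_min
    return qvalues

pairs = [("miR-1", "GeneA", 0.0004), ("miR-1", "GeneB", 0.02), ("miR-7", "GeneC", 0.3)]
qs = benjamini_hochberg([p for _, _, p in pairs])
for (mirna, mrna, p), q in sorted(zip(pairs, qs), key=lambda t: t[0][2]):
    print(f"{mirna} -> {mrna}: P={p:.4g}, q={q:.3g}")
```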
Can people find patient decision aids on the Internet?
Morris, Debra; Drake, Elizabeth; Saarimaki, Anton; Bennett, Carol; O'Connor, Annette
2008-12-01
To determine if people could find patient decision aids (PtDAs) on the Internet using the most popular general search engines. We chose five medical conditions for which English language PtDAs were available from at least three different developers. The search engines used were: Google (www.google.com), Yahoo! (www.yahoo.com), and MSN (www.msn.com). For each condition and search engine we ran six searches using a combination of search terms. We coded all non-sponsored Web pages that were linked from the first page of the search results. Most first page results linked to informational Web pages about the condition, only 16% linked to PtDAs. PtDAs were more readily found for the breast cancer surgery decision (our searches found seven of the nine developers). The searches using Yahoo and Google search engines were more likely to find PtDAs. The following combination of search terms: condition, treatment, decision (e.g. breast cancer surgery decision) was most successful across all search engines (29%). While some terms and search engines were more successful, few resulted in direct links to PtDAs. Finding PtDAs would be improved with use of standardized labelling, providing patients with specific Web site addresses or access to an independent PtDA clearinghouse.
Optimizing Citizen Engagement During Emergencies Through Use of Web 2.0 Technologies
2009-03-01
Van Leuven, Laurie J. (Security and Emergency Management Strategic Advisor, City of Seattle; B.A., University of Washington, 1998). Naval Postgraduate School, Monterey, CA 93943-5000. Submitted in partial fulfillment.
A Web-Based Platform for Educating Researchers About Bioethics and Biobanking.
Sehovic, Ivana; Gwede, Clement K; Meade, Cathy D; Sodeke, Stephen; Pentz, Rebecca; Quinn, Gwendolyn P
2016-06-01
Participation in biobanking among individuals with familial risk for hereditary cancer (IFRs) and underserved/minority populations is vital for biobanking research. To address gaps in researcher knowledge regarding ethical concerns of these populations, we developed a web-based curriculum. Based on formative research and expert panel assessments, a curriculum and website was developed in an integrative, systematic manner. Researchers were recruited to evaluate the curriculum. Public health graduate students were recruited to pilot test the curriculum. All 14 researchers agreed the curriculum was easy to understand, adequately addressed the domains, and contained appropriate post-test questions. The majority evaluated the dialogue animations as interesting and valuable. Twenty-two graduate students completed the curriculum, and 77% improved their overall test score. A web-based curriculum is an acceptable and effective way to provide information to researchers about vulnerable populations' biobanking concerns. Future goals are to incorporate the curriculum with larger organizations.
A Web-based Platform for Educating Researchers about Bioethics and Biobanking
Sehovic, Ivana; Gwede, Clement K.; Meade, Cathy D.; Sodeke, Stephen; Pentz, Rebecca; Quinn, Gwendolyn P.
2015-01-01
Background Participation in biobanking among individuals with familial risk for hereditary cancer (IFRs) and underserved/minority populations is vital for biobanking research. To address gaps in researcher knowledge regarding ethical concerns of these populations, we developed a web-based curriculum. Methods Based on formative research and expert panel assessments, a curriculum and website was developed in an integrative, systematic manner. Researchers were recruited to evaluate the curriculum. Public health graduate students were recruited to pilot test the curriculum. Results All 14 researchers agreed that the curriculum was easy to understand, adequately addressed the domains, and contained appropriate post-test questions. A majority felt the dialogue animations were interesting and valuable. 22 graduate students completed the curriculum and 77% improved their overall test score. Conclusions A web-based curriculum is an acceptable and effective way to provide information to researchers about vulnerable populations’ biobanking concerns. Future goals are to incorporate the curriculum with larger organizations. PMID:25773136
NASA Astrophysics Data System (ADS)
Ballora, Mark; Hall, David L.
2010-04-01
Detection of intrusions is a continuing problem in network security. Due to the large volumes of data recorded in Web server logs, analysis is typically forensic, taking place only after a problem has occurred. This paper describes a novel method of representing Web log information through multi-channel sound, while simultaneously visualizing network activity using a 3-D immersive environment. We are exploring the detection of intrusion signatures and patterns, utilizing human aural and visual pattern recognition ability to detect intrusions as they occur. IP addresses and return codes are mapped to an informative and unobtrusive listening environment to act as a situational sound track of Web traffic. Web log data is parsed and formatted using Python, then read as a data array by the synthesis language SuperCollider [1], which renders it as a sonification. This can be done either for the study of pre-existing data sets or in monitoring Web traffic in real time. Components rendered aurally include IP address, geographical information, and server Return Codes. Users can interact with the data, speeding or slowing the speed of representation (for pre-existing data sets) or "mixing" sound components to optimize intelligibility for tracking suspicious activity.
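A minimal sketch of the parsing and formatting step described above, assuming Common Log Format input; it extracts the client IP address and return code and maps them to illustrative sonification parameters. The actual SuperCollider rendering and the specific mappings used by the authors are not reproduced here.

```python
# Parse Common Log Format lines into (IP, status) and map them to illustrative
# sonification parameters; the sound rendering itself (SuperCollider) is not shown.
import re

LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3}) \S+')

def parse_line(line):
    """Return (ip, status) for a Common Log Format line, or None if unparsable."""
    match = LOG_PATTERN.match(line)
    if not match:
        return None
    return match.group(1), int(match.group(2))

def to_event(ip, status):
    """Map an entry to illustrative parameters (pitch, channel); mappings are arbitrary."""
    last_octet = int(ip.rsplit(".", 1)[-1]) if "." in ip else 0
    pitch = 200 + last_octet            # higher host octet -> higher pitch
    channel = 0 if status < 400 else 1  # error responses routed to a separate channel
    return {"ip": ip, "status": status, "pitch": pitch, "channel": channel}

sample = '192.0.2.7 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 404 2326'
parsed = parse_line(sample)
if parsed:
    print(to_event(*parsed))
```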
Graduating to Postdoc: Information-Sharing in Support of Organizational Structures and Needs
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Lucas, Paul J.; Compton, Michael M.; Stewart, Helen J.; Baya, Vinod; DelAlto, Martha
1999-01-01
The deployment of information-sharing systems in large organizations can significantly impact existing policies and procedures with regard to authority and control over information. Unless information-sharing systems explicitly support organizational structures and needs, these systems will be rejected summarily. The Postdoc system is a deployed Web-based information-sharing system created specifically to address organizational needs. Postdoc contains various organizational support features including a shared, globally navigable document space, as well as specialized access control, distributed administration, and mailing list features built around the key notion of hierarchical group structures. We review successes and difficulties in supporting organizational needs with Postdoc.
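A hypothetical sketch of access control organized around hierarchical groups, in the spirit of the features described; the group names, propagation rule, and API are invented for illustration and do not reflect Postdoc's implementation:

```python
# Toy model of group-based access where membership in a subgroup implies
# membership in its parent groups; names and rules are illustrative only.
class Group:
    def __init__(self, name, parent=None):
        self.name, self.parent, self.members = name, parent, set()

    def ancestors(self):
        """Return this group and every group above it in the hierarchy."""
        node, chain = self, []
        while node:
            chain.append(node)
            node = node.parent
        return chain

def can_read(user, document_group, user_groups):
    """A user may read a document shared with any group that is an ancestor
    of (or identical to) a group the user belongs to."""
    for g in user_groups.get(user, set()):
        if document_group in g.ancestors():
            return True
    return False

org = Group("Organization")
division = Group("Division")
division.parent = org
team = Group("Project Team", parent=division)
memberships = {"alice": {team}}  # placeholder user
print(can_read("alice", division, memberships))  # True: the team sits within the division
```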
The multifaceted influence of gender in career progress in nursing.
Tracey, Catherine; Nicholl, Honor
2007-10-01
The complex web of gender influence in the workplace results from a multifaceted interplay of factors [Walby et al. (1994) Medicine and Nursing. Sage Publications, London]. Literature reports that in nursing men's success compared with that of women is disproportionate and substantial evidence of gender-based disadvantage is found [Women in Management Review13 (1998) 184]. However, studies have not addressed the specific reasons for this and little is known of how or what influences nurses' career decisions and developments [Journal of Advanced Nursing25 (1997) 602]. Those studies which examine career developments and patterns are mainly found in the private business sector.
Applying Web Usage Mining for Personalizing Hyperlinks in Web-Based Adaptive Educational Systems
ERIC Educational Resources Information Center
Romero, Cristobal; Ventura, Sebastian; Zafra, Amelia; de Bra, Paul
2009-01-01
Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender engine is integrated into the AHA! system in…
Rudolph, Abby E; Bazzi, Angela Robertson; Fish, Sue
2016-10-01
Analyses with geographic data can be used to identify "hot spots" and "health service deserts", examine associations between proximity to services and their use, and link contextual factors with individual-level data to better understand how environmental factors influence behaviors. Technological advancements in methods for collecting this information can improve the accuracy of contextually-relevant information; however, they have outpaced the development of ethical standards and guidance, particularly for research involving populations engaging in illicit/stigmatized behaviors. Thematic analysis identified ethical considerations for collecting geographic data using different methods and the extent to which these concerns could influence study compliance and data validity. In-depth interviews with 15 Baltimore residents (6 recruited via flyers and 9 via peer-referral) reporting recent drug use explored comfort with and ethics of three methods for collecting geographic information: (1) surveys collecting self-reported addresses/cross-streets, (2) surveys using web-based maps to find/confirm locations, and (3) geographical momentary assessments (GMA), which collect spatiotemporally referenced behavioral data. Survey methods for collecting geographic data (i.e., addresses/cross-streets and web-based maps) were generally acceptable; however, participants raised confidentiality concerns regarding exact addresses for illicit/stigmatized behaviors. Concerns specific to GMA included burden of carrying/safeguarding phones and responding to survey prompts, confidentiality, discomfort with being tracked, and noncompliance with study procedures. Overall, many felt that confidentiality concerns could influence the accuracy of location information collected for sensitive behaviors and study compliance. Concerns raised by participants could result in differential study participation and/or study compliance and questionable accuracy/validity of location data for sensitive behaviors. Copyright © 2016 Elsevier Ltd. All rights reserved.
The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications
2011-01-01
Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework. PMID:21806842
The 2nd DBCLS BioHackathon: interoperable bioinformatics Web services for integrated applications.
Katayama, Toshiaki; Wilkinson, Mark D; Vos, Rutger; Kawashima, Takeshi; Kawashima, Shuichi; Nakao, Mitsuteru; Yamamoto, Yasunori; Chun, Hong-Woo; Yamaguchi, Atsuko; Kawano, Shin; Aerts, Jan; Aoki-Kinoshita, Kiyoko F; Arakawa, Kazuharu; Aranda, Bruno; Bonnal, Raoul Jp; Fernández, José M; Fujisawa, Takatomo; Gordon, Paul Mk; Goto, Naohisa; Haider, Syed; Harris, Todd; Hatakeyama, Takashi; Ho, Isaac; Itoh, Masumi; Kasprzyk, Arek; Kido, Nobuhiro; Kim, Young-Joo; Kinjo, Akira R; Konishi, Fumikazu; Kovarskaya, Yulia; von Kuster, Greg; Labarga, Alberto; Limviphuvadh, Vachiranee; McCarthy, Luke; Nakamura, Yasukazu; Nam, Yunsun; Nishida, Kozo; Nishimura, Kunihiro; Nishizawa, Tatsuya; Ogishima, Soichi; Oinn, Tom; Okamoto, Shinobu; Okuda, Shujiro; Ono, Keiichiro; Oshita, Kazuki; Park, Keun-Joon; Putnam, Nicholas; Senger, Martin; Severin, Jessica; Shigemoto, Yasumasa; Sugawara, Hideaki; Taylor, James; Trelles, Oswaldo; Yamasaki, Chisato; Yamashita, Riu; Satoh, Noriyuki; Takagi, Toshihisa
2011-08-02
The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP/WSDL specification among and between various programming-language libraries; and iv) incompatibility between various bioinformatics data formats. Although it was still difficult to solve real world problems posed to the developers by the biological researchers in attendance because of these problems, we note the promise of addressing these issues within a semantic framework.
ACHP | Web Site Privacy Policy
... be used after its purpose has been fulfilled. For questions on our Web site privacy policy, please contact the Web manager. Updated October 2, 2006
Spamology: A Study of Spam Origins
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shue, Craig A; Gupta, Prof. Minaxi; Kong, Chin Hua
2009-01-01
The rise of spam in the last decade has been staggering, with the rate of spam exceeding that of legitimate email. While conjectures exist on how spammers gain access to email addresses to spam, most work in the area of spam containment has either focused on better spam filtering methodologies or on understanding the botnets commonly used to send spam. In this paper, we aim to understand the origins of spam. We post dedicated email addresses to record how and where spammers go to obtain email addresses. We find that posting an email address on public Web pages yields immediate and high-volume spam. Surprisingly, even simple email obfuscation approaches are still sufficient today to prevent spammers from harvesting emails. We also find that attempts to find open relays continue to be popular among spammers. The insights we gain on the use of Web crawlers to harvest email addresses and the commonalities of techniques used by spammers open the door for radically different follow-up work on spam containment and even systematic enforcement of spam legislation at a large scale.
Developing and Evaluating the GriefLink Web Site: Processes, Protocols, Dilemmas and Lessons Learned
ERIC Educational Resources Information Center
Clark, Sheila; Burgess, Teresa; Laven, Gillian; Bull, Michael; Marker, Julie; Browne, Eric
2004-01-01
Despite a profusion of recommendations regarding the quality of web sites and guidelines related to ethical issues surrounding health-related sites, there is little guidance for the design and evaluation of sites relating to loss and grief. This article, which addresses these deficiencies, results from a community consultation process of designing…
Conservation of a molecular target across species can be used as a line-of-evidence to predict the likelihood of chemical susceptibility. The web-based Sequence Alignment to Predict Across Species Susceptibility (SeqAPASS) tool was developed to simplify, streamline, and quantitat...
Pilot Evaluation of a Web-Based Intervention Targeting Sexual Health Service Access
ERIC Educational Resources Information Center
Brown, K. E.; Newby, K.; Caley, M.; Danahay, A.; Kehal, I.
2016-01-01
Sexual health service access is fundamental to good sexual health, yet interventions designed to address this have rarely been implemented or evaluated. In this article, pilot evaluation findings for a targeted public health behavior change intervention, delivered via a website and web-app, aiming to increase uptake of sexual health services among…
Interactive Display of High-Resolution Images on the World Wide Web.
ERIC Educational Resources Information Center
Clyde, Stephen W.; Hirschi, Gregory W.
Viewing high-resolution images on the World Wide Web at a level of detail necessary for collaborative research is still a problem today, given the Internet's current bandwidth limitations and its ever increasing network traffic. ImageEyes is an interactive display tool being developed at Utah State University that addresses this problem by…
Ethics of Research into Learning and Teaching with Web 2.0: Reflections on Eight Case Studies
ERIC Educational Resources Information Center
Chang, Rosemary L.; Gray, Kathleen
2013-01-01
The unique features and educational affordances of Web 2.0 technologies pose new challenges for conducting learning and teaching research in ways that adequately address ethical issues of informed consent, beneficence, respect, justice, research merit and integrity. This paper reviews these conceptual bases of human research ethics and gives…
The Faculty's Perception of Web-Based Instruction Application in Iran's Higher Education
ERIC Educational Resources Information Center
Gholami, Khalil; Sayadi, Yaser
2012-01-01
This paper addresses the faculty perception on web-based instruction in order to explain the nature of learning and instruction in this setting. Using a mixed method approach, the research studied a sample of 132 University Faculty (lecturers and professors) in University of Kurdistan. The research tools were interview and questionnaire. The…
Re-Conceptualizing the ELP as a Web 2.0 Personal Language Learning Environment
ERIC Educational Resources Information Center
Haines, Kevin; van Engen, Jeroen
2013-01-01
This paper addresses the reconceptualization of the ELP as a Personal Language Learning Environment (PLLE), encouraging learners towards greater self-regulation. Such a development fits in with the pedagogical function of the ELP by scaffolding the plurilingual, lifelong learning of languages. Web 2.0 social media tools allow learners to work with…
Addressing Challenges in Web Accessibility for the Blind and Visually Impaired
ERIC Educational Resources Information Center
Guercio, Angela; Stirbens, Kathleen A.; Williams, Joseph; Haiber, Charles
2011-01-01
Searching for relevant information on the web is an important aspect of distance learning. This activity is a challenge for visually impaired distance learners. While sighted people have the ability to filter information in a fast and non sequential way, blind persons rely on tools that process the information in a sequential way. Learning is…
Inquiry of Pre-Service Teachers' Concern about Integrating Web 2.0 into Instruction
ERIC Educational Resources Information Center
Hao, Yungwei; Lee, Kathryn S.
2017-01-01
To promote technology integration, it is essential to address pre-service teacher (PST) concerns about facilitating technology-enhanced learning environments. This study adopted the Concerns-Based Adoption Model to investigate PST concern on Web 2.0 integration. Four hundred and eighty-nine PSTs in a teacher education university in north Taiwan…
ERIC Educational Resources Information Center
Gelbart, Hadas; Brill, Gilat; Yarden, Anat
2009-01-01
Providing learners with opportunities to engage in activities similar to those carried out by scientists was addressed in a web-based research simulation in genetics developed for high school biology students. The research simulation enables learners to apply their genetics knowledge while giving them an opportunity to participate in an authentic…
Technopotters and Webs of Clay: Digital Possibilities for Teaching Ceramics
ERIC Educational Resources Information Center
Weida, Courtney Lee
2007-01-01
In this article, the author examines ways in which the Internet is changing the way ceramicists teach, learn, and work. She addresses the curricular issue of how Web resources may supplement ceramic art history and extend student-centered learning. The author also explores the nature of the interplay between computer technology and clay. (Contains…
USDA-ARS?s Scientific Manuscript database
This work addresses a cross-cutting issue within the field of food-web ecology—the integration of the microbiome into trophic hierarchies. The nature and degree to which microbes may reconfigure the trophic identities of carnivore and omnivore groups have remained surprisingly unresolved. This means...
Web-Based Machine Translation as a Tool for Promoting Electronic Literacy and Language Awareness
ERIC Educational Resources Information Center
Williams, Lawrence
2006-01-01
This article addresses a pervasive problem of concern to teachers of many foreign languages: the use of Web-Based Machine Translation (WBMT) by students who do not understand the complexities of this relatively new tool. Although networked technologies have greatly increased access to many language and communication tools, WBMT is still…
Developing a Web 2.0-Based System with User-Authored Content for Community Use and Teacher Education
ERIC Educational Resources Information Center
Cifuentes, Lauren; Sharp, Amy; Bulu, Sanser; Benz, Mike; Stough, Laura M.
2010-01-01
We report on an investigation into the design, development, implementation, and evaluation of an informational and instructional Website in order to generate guidelines for instructional designers of read/write Web environments. We describe the process of design and development research, the problem addressed, the theory-based solution, and the…
ERIC Educational Resources Information Center
Harushimana, Immaculee
2008-01-01
This article, "The Web-Savvy Urban Teacher," addresses the question of what educational technology educators and scholars can do to close the pedagogical mismatch, which exists today between "digital native" secondary students and their predigital educators. The infrequent use of the Internet as a resource in urban schools is detrimental for…
XML: A Language To Manage the World Wide Web. ERIC Digest.
ERIC Educational Resources Information Center
Davis-Tanous, Jennifer R.
This digest provides an overview of XML (Extensible Markup Language), a markup language used to construct World Wide Web pages. Topics addressed include: (1) definition of a markup language, including comparison of XML with SGML (Standard Generalized Markup Language) and HTML (HyperText Markup Language); (2) how XML works, including sample tags,…
Report on the EMBER Project--A European Multimedia Bioinformatics Educational Resource
ERIC Educational Resources Information Center
Attwood, Terri K.; Selimas, Ioannis; Buis, Rob; Altenburg, Ruud; Herzog, Robert; Ledent, Valerie; Ghita, Viorica; Fernandes, Pedro; Marques, Isabel; Brugman, Marc
2005-01-01
EMBER was a European project aiming to develop bioinformatics teaching materials on the Web and CD-ROM to help address the recognised skills shortage in bioinformatics. The project grew out of pilot work on the development of an interactive web-based bioinformatics tutorial and the desire to repackage that resource with the help of a professional…
ERIC Educational Resources Information Center
Gardner, Joel; Belland, Brian R.
2017-01-01
To address the need for effective, efficient ways to apply active learning in undergraduate biology courses, in this paper, we propose a problem-centered approach that utilizes supplemental web-based instructional materials based on principles of active learning. We compared two supplementary web-based modules using active learning strategies: the…
Field Testing of Environmentally Friendly Drilling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Burnett
2009-05-31
The Environmentally Friendly Drilling (EFD) program addresses new low-impact technology that reduces the footprint of drilling activities, integrates lightweight drilling rigs with reduced emission engine packages, addresses on-site waste management, optimizes the systems to fit the needs of specific development sites and provides stewardship of the environment. In addition, the program includes industry, the public, environmental organizations, and elected officials in a collaboration that addresses concerns on development of unconventional natural gas resources in environmentally sensitive areas. The EFD program provides the fundamentals to result in greater access, reasonable regulatory controls, lower development cost and reduction of the environmental footprint associated with operations for unconventional natural gas. Industry Sponsors have supported the program with significant financial and technical support. This final report compendium is organized into segments corresponding directly with the DOE approved scope of work for the term 2005-2009 (10 Sections). Each specific project is defined by (a) its goals, (b) its deliverable, and (c) its future direction. A web site has been established that contains all of the detailed engineering reports produced by these efforts. The goals of the project are to (1) identify critical enabling technologies for a prototype low-impact drilling system, (2) test the prototype systems in field laboratories, and (3) demonstrate the advanced technology to show how these practices would benefit the environment.
SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services
Gessler, Damian DG; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T
2009-01-01
Background SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. Results There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remaining are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at , developer tools at , and a portal to third-party ontologies at (a "swap meet"). Conclusion SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation. SSWAP is novel by establishing the concept of a canonical yet mutable OWL DL graph that allows data and service providers to describe their resources, to allow discovery servers to offer semantically rich search engines, to allow clients to discover and invoke those resources, and to allow providers to respond with semantically tagged data. SSWAP allows for a mix-and-match of terms from both new and legacy third-party ontologies in these graphs. PMID:19775460
Hersch, Rebekah K.; Cook, Royer F.; Deitz, Diane K.; Kaplan, Seth; Hughes, Daniel; Friesen, Mary Ann; Vezina, Maria
2016-01-01
Background Nursing is a notoriously high-stress occupation, emotionally taxing and physically draining, with a high incidence of burnout. In addition to the damaging effects of stress on nurses’ health and well-being, stress is also a major contributor to attrition and widespread shortages in the nursing profession. Although there exist promising in-person interventions for addressing the problem of stress among nurses, the experience of our group across multiple projects in hospitals has indicated that the schedules and workloads of nurses can pose problems for implementing in-person interventions, and that web-based interventions might be ideally suited to addressing the high levels of stress among nurses. Purpose The purpose of this study was to evaluate the effectiveness of the web-based BREATHE: Stress Management for Nurses program. Methods The randomized controlled trial was conducted with 104 nurses in five hospitals in Virginia and one hospital in New York. The primary outcome measure was perceived nursing-related stress. Secondary measures included symptoms of distress, coping, work limitations, job satisfaction, use of substances to relieve stress, alcohol consumption, and understanding depression and anxiety. Results Program group participants experienced significantly greater reductions than the control group on the full Nursing Stress Scale and on six of the seven subscales. No other significant results were found. Moderator analysis found that nurses with greater experience benefitted more. Conclusion Using a web-based program holds tremendous promise for providing nurses with the tools they need to address nursing-related stress. PMID:27969025
Data mining for personal navigation
NASA Astrophysics Data System (ADS)
Hariharan, Gurushyam; Franti, Pasi; Mehta, Sandeep
2002-03-01
Relevance is the key in defining what data is to be extracted from the Internet. Traditionally, relevance has been defined mainly by keywords and user profiles. In this paper we discuss a fairly untouched dimension to relevance: location. Any navigational information sought by a user at large on earth is evidently governed by his location. We believe that task-oriented data mining of the web, amalgamated with location information, is the key to providing relevant information for personal navigation. We explore the existential hurdles and propose novel approaches to tackle them. We also present naive, task-oriented, data-mining-based approaches and their implementations in Java, to extract location-based information. Ad-hoc pairing of data with coordinates (x, y) is very rare on the web. But if the same coordinates are converted to a logical address (state/city/street), a wide spectrum of location-based information opens up. Hence, given the coordinates (x, y) on the earth, the scheme points to the logical address of the user. Location-based information could either be picked up from fixed and known service providers (e.g. Yellow Pages) or from any arbitrary website on the Web. Once the web servers providing information relevant to the logical address are located, task-oriented data mining is performed over these sites, keeping in mind what information is interesting to the contemporary user. After all this, a simple data stream is provided to the user with information scaled to his convenience. The scheme has been implemented for cities of Finland.
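The central step described above, turning raw coordinates into a "logical address" before mining, can be illustrated with a toy nearest-city lookup. This sketch is not the authors' Java implementation; the tiny gazetteer and the query string are invented for illustration.

```python
# Illustrative sketch only (not the authors' implementation): map raw
# coordinates (x, y) to a coarse "logical address" by nearest-city lookup,
# then form a location-qualified query for task-oriented mining.
from math import radians, sin, cos, asin, sqrt

CITIES = {  # toy gazetteer: (lat, lon) -> city; a real system would use a full one
    (60.17, 24.94): "Helsinki",
    (62.60, 29.76): "Joensuu",
    (61.50, 23.76): "Tampere",
}

def haversine_km(lat1, lon1, lat2, lon2):
    # great-circle distance between two points on Earth, in kilometres
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def logical_address(lat, lon):
    # nearest gazetteer entry stands in for the street/city/state resolution step
    return min(CITIES.items(), key=lambda kv: haversine_km(lat, lon, *kv[0]))[1]

city = logical_address(61.49, 23.78)
print(f"restaurants near {city}")   # location-qualified query handed to the miner
```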
web-based interactive data processing: application to stable isotope metrology.
Verkouteren, R M; Lee, J N
2001-08-01
To address a fundamental need in stable isotope metrology, the National Institute of Standards and Technology (NIST) has established a web-based interactive data-processing system accessible through a common gateway interface (CGI) program on the internet site http://www.nist.gov/widps-co2. This is the first application of a web-based tool that improves the measurement traceability afforded by a series of NIST standard materials. Specifically, this tool promotes the proper usage of isotope reference materials (RMs) and improves the quality of reported data from extensive measurement networks. Through the International Atomic Energy Agency (IAEA), we have defined standard procedures for stable isotope measurement and data-processing, and have determined and applied consistent reference values for selected NIST and IAEA isotope RMs. Measurement data of samples and RMs are entered into specified fields on the web-based form. These data are submitted through the CGI program on a NIST Web server, where appropriate calculations are performed and results returned to the client. Several international laboratories have independently verified the accuracy of the procedures and algorithm for measurements of naturally occurring carbon-13 and oxygen-18 abundances and slightly enriched compositions up to approximately 150% relative to natural abundances. To conserve the use of the NIST RMs, users may determine value assignments for a secondary standard to be used in routine analysis. Users may also wish to validate proprietary algorithms embedded in their laboratory instrumentation, or specify the values of fundamental variables that are usually fixed in reduction algorithms to see the effect on the calculations. The results returned from the web-based tool are limited in quality only by the measurements themselves, and further value may be realized through the normalization function. When combined with stringent measurement protocols, two- to threefold improvements have been realized in the reproducibility of carbon-13 and oxygen-18 determinations across laboratories.
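For readers unfamiliar with the quantities being processed: carbon-13 and oxygen-18 abundances are conventionally reported in delta notation relative to a reference material. The standard definition (shown here for carbon; this is textbook notation, not necessarily the exact reduction algorithm used by the NIST tool) is:

```latex
\delta^{13}\mathrm{C} \;=\; \left(\frac{R_{\text{sample}}}{R_{\text{standard}}} - 1\right)\times 10^{3}\ \text{(per mil)},
\qquad R \equiv \frac{{}^{13}\mathrm{C}}{{}^{12}\mathrm{C}}
```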
Automatic generation of Web mining environments
NASA Astrophysics Data System (ADS)
Cibelli, Maurizio; Costagliola, Gennaro
1999-02-01
The main problem related to the retrieval of information from the world wide web is the enormous number of unstructured documents and resources, i.e., the difficulty of locating and tracking appropriate sources. This paper presents a web mining environment (WME), which is capable of finding, extracting and structuring information related to a particular domain from web documents, using general purpose indices. The WME architecture includes a web engine filter (WEF), to sort and reduce the answer set returned by a web engine, a data source pre-processor (DSP), which processes html layout cues in order to collect and qualify page segments, and a heuristic-based information extraction system (HIES), to finally retrieve the required data. Furthermore, we present a web mining environment generator, WMEG, that allows naive users to generate a WME specific to a given domain by providing a set of specifications.
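The WEF, DSP and HIES stages described above can be pictured as three successive functions over an answer set. The sketch below is schematic only; the scoring rule, the paragraph-based segmentation, and the "price:" extraction pattern are invented stand-ins for the paper's domain-specific heuristics.

```python
# A schematic pipeline, not the authors' code: the three WME stages
# (WEF -> DSP -> HIES) expressed as plain functions over toy data.
import re

def wef(results, domain_terms):
    # Web Engine Filter: rank and prune the raw answer set by term overlap
    score = lambda r: sum(t in r["snippet"].lower() for t in domain_terms)
    return sorted((r for r in results if score(r) > 0), key=score, reverse=True)

def dsp(html):
    # Data Source Pre-processor: use crude layout cues (here, <p> blocks)
    # to split a page into candidate segments
    return re.findall(r"<p>(.*?)</p>", html, flags=re.S)

def hies(segments):
    # Heuristic Information Extraction: keep segments matching a simple pattern
    return [s.strip() for s in segments if re.search(r"\bprice:\s*\d+", s, re.I)]

results = [{"url": "http://example.org/shop", "snippet": "camera price list"}]
page = "<p>Camera X, price: 300</p><p>About us</p>"
for hit in wef(results, ["camera", "price"]):
    print(hit["url"], hies(dsp(page)))
```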
VarioML framework for comprehensive variation data representation and exchange.
Byrne, Myles; Fokkema, Ivo Fac; Lancaster, Owen; Adamusiak, Tomasz; Ahonen-Bishopp, Anni; Atlan, David; Béroud, Christophe; Cornell, Michael; Dalgleish, Raymond; Devereau, Andrew; Patrinos, George P; Swertz, Morris A; Taschner, Peter Em; Thorisson, Gudmundur A; Vihinen, Mauno; Brookes, Anthony J; Muilu, Juha
2012-10-03
Sharing of data about variation and the associated phenotypes is a critical need, yet variant information can be arbitrarily complex, making a single standard vocabulary elusive and re-formatting difficult. Complex standards have proven too time-consuming to implement. The GEN2PHEN project addressed these difficulties by developing a comprehensive data model for capturing biomedical observations, Observ-OM, and building the VarioML format around it. VarioML pairs a simplified open specification for describing variants, with a toolkit for adapting the specification into one's own research workflow. Straightforward variant data can be captured, federated, and exchanged with no overhead; more complex data can be described, without loss of compatibility. The open specification enables push-button submission to gene variant databases (LSDBs) e.g., the Leiden Open Variation Database, using the Cafe Variome data publishing service, while VarioML bidirectionally transforms data between XML and web-application code formats, opening up new possibilities for open source web applications building on shared data. A Java implementation toolkit makes VarioML easily integrated into biomedical applications. VarioML is designed primarily for LSDB data submission and transfer scenarios, but can also be used as a standard variation data format for JSON and XML document databases and user interface components. VarioML is a set of tools and practices improving the availability, quality, and comprehensibility of human variation information. It enables researchers, diagnostic laboratories, and clinics to share that information with ease, clarity, and without ambiguity.
VarioML framework for comprehensive variation data representation and exchange
2012-01-01
Background Sharing of data about variation and the associated phenotypes is a critical need, yet variant information can be arbitrarily complex, making a single standard vocabulary elusive and re-formatting difficult. Complex standards have proven too time-consuming to implement. Results The GEN2PHEN project addressed these difficulties by developing a comprehensive data model for capturing biomedical observations, Observ-OM, and building the VarioML format around it. VarioML pairs a simplified open specification for describing variants, with a toolkit for adapting the specification into one's own research workflow. Straightforward variant data can be captured, federated, and exchanged with no overhead; more complex data can be described, without loss of compatibility. The open specification enables push-button submission to gene variant databases (LSDBs) e.g., the Leiden Open Variation Database, using the Cafe Variome data publishing service, while VarioML bidirectionally transforms data between XML and web-application code formats, opening up new possibilities for open source web applications building on shared data. A Java implementation toolkit makes VarioML easily integrated into biomedical applications. VarioML is designed primarily for LSDB data submission and transfer scenarios, but can also be used as a standard variation data format for JSON and XML document databases and user interface components. Conclusions VarioML is a set of tools and practices improving the availability, quality, and comprehensibility of human variation information. It enables researchers, diagnostic laboratories, and clinics to share that information with ease, clarity, and without ambiguity. PMID:23031277
Sabariego, Carla; Cieza, Alarcos
2016-01-01
Background Mental disorders (MDs) affect almost 1 in 4 adults at some point during their lifetime, and coupled with substance use disorders are the fifth leading cause of disability adjusted life years worldwide. People with these disorders often use the Web as an informational resource, platform for convenient self-directed treatment, and a means for many other kinds of support. However, some features of the Web can potentially erect barriers for this group that limit their access to these benefits, and there is a lack of research looking into this eventuality. Therefore, it is important to identify gaps in knowledge about “what” barriers exist and “how” they could be addressed so that this knowledge can inform Web professionals who aim to ensure the Web is inclusive to this population. Objective The objective of this study was to provide an overview of existing evidence regarding the barriers people with mental disorders experience when using the Web and the facilitation measures used to address such barriers. Methods This study involved a systematic review of studies that have considered the difficulties people with mental disorders experience when using digital technologies. Digital technologies were included because knowledge about any barriers here would likely be also applicable to the Web. A synthesis was performed by categorizing data according to the 4 foundational principles of Web accessibility as proposed by the World Wide Web Consortium, which forms the necessary basis for anyone to gain adequate access to the Web. Facilitation measures recommended by studies were later summarized into a set of minimal recommendations. Results A total of 16 publications were included in this review, comprising 13 studies and 3 international guidelines. Findings suggest that people with mental disorders experience barriers that limit how they perceive, understand, and operate websites. Identified facilitation measures target these barriers in addition to ensuring that Web content can be reliably interpreted by a wide range of user applications. Conclusions People with mental disorders encounter barriers on the Web, and attempts have been made to remove or reduce these barriers. As forewarned by experts in the area, only a few studies investigating this issue were found. More rigorous research is needed to be exhaustive and to have a larger impact on improving the Web for people with mental disorders. PMID:27282115
Bernard, Renaldo; Sabariego, Carla; Cieza, Alarcos
2016-06-09
Mental disorders (MDs) affect almost 1 in 4 adults at some point during their lifetime, and coupled with substance use disorders are the fifth leading cause of disability adjusted life years worldwide. People with these disorders often use the Web as an informational resource, platform for convenient self-directed treatment, and a means for many other kinds of support. However, some features of the Web can potentially erect barriers for this group that limit their access to these benefits, and there is a lack of research looking into this eventuality. Therefore, it is important to identify gaps in knowledge about "what" barriers exist and "how" they could be addressed so that this knowledge can inform Web professionals who aim to ensure the Web is inclusive to this population. The objective of this study was to provide an overview of existing evidence regarding the barriers people with mental disorders experience when using the Web and the facilitation measures used to address such barriers. This study involved a systematic review of studies that have considered the difficulties people with mental disorders experience when using digital technologies. Digital technologies were included because knowledge about any barriers here would likely be also applicable to the Web. A synthesis was performed by categorizing data according to the 4 foundational principles of Web accessibility as proposed by the World Wide Web Consortium, which forms the necessary basis for anyone to gain adequate access to the Web. Facilitation measures recommended by studies were later summarized into a set of minimal recommendations. A total of 16 publications were included in this review, comprising 13 studies and 3 international guidelines. Findings suggest that people with mental disorders experience barriers that limit how they perceive, understand, and operate websites. Identified facilitation measures target these barriers in addition to ensuring that Web content can be reliably interpreted by a wide range of user applications. People with mental disorders encounter barriers on the Web, and attempts have been made to remove or reduce these barriers. As forewarned by experts in the area, only a few studies investigating this issue were found. More rigorous research is needed to be exhaustive and to have a larger impact on improving the Web for people with mental disorders.
An evidence-based clinical guideline for the use of antithrombotic therapies in spine surgery.
Bono, Christopher M; Watters, William C; Heggeness, Michael H; Resnick, Daniel K; Shaffer, William O; Baisden, Jamie; Ben-Galim, Peleg; Easa, John E; Fernand, Robert; Lamer, Tim; Matz, Paul G; Mendel, Richard C; Patel, Rajeev K; Reitman, Charles A; Toton, John F
2009-12-01
The objective of the North American Spine Society (NASS) Evidence-Based Clinical Guideline on antithrombotic therapies in spine surgery was to provide evidence-based recommendations to address key clinical questions surrounding the use of antithrombotic therapies in spine surgery. The guideline is intended to address these questions based on the highest quality clinical literature available on this subject as of February 2008. The goal of the guideline recommendations was to assist in delivering optimum, efficacious treatment with the goal of preventing thromboembolic events. To provide an evidence-based, educational tool to assist spine surgeons in minimizing the risk of deep venous thrombosis (DVT) and pulmonary embolism (PE). Systematic review and evidence-based clinical guideline. This report is from the Antithrombotic Therapies Work Group of the NASS Evidence-Based Guideline Development Committee. The work group was composed of multidisciplinary spine care specialists, all of whom were trained in the principles of evidence-based analysis. Each member of the group was involved in formatting a series of clinical questions to be addressed by the group. The final questions agreed on by the group are the subject of this report. A literature search addressing each question and using a specific search protocol was performed on English language references found in MEDLINE, EMBASE (Drugs and Pharmacology), and four additional, evidence-based databases. The relevant literature was then independently rated by at least three reviewers using the NASS-adopted standardized levels of evidence. An evidentiary table was created for each of the questions. Final grades of recommendation for the answers to each clinical question were arrived at via Web casts among members of the work group using standardized grades of recommendation. When Level I to IV evidence was insufficient to support a recommendation to answer a specific clinical question, expert consensus was arrived at by the work group through the modified nominal group technique and is clearly identified as such in the guideline. Fourteen clinical questions were formulated, addressing issues of incidence of DVT and PE in spine surgery and recommendations regarding utilization of mechanical prophylaxis and chemoprophylaxis in spine surgery. The answers to these 14 clinical questions are summarized in this article. The respective recommendations were graded by the strength of the supporting literature that was stratified by levels of evidence. A clinical guideline addressing the use of antithrombotic therapies in spine surgery has been created using the techniques of evidence-based medicine and using the best available evidence as a tool to assist spine surgeons in minimizing the risk of DVT and PE. The entire guideline document, including the evidentiary tables, suggestions for future research, and all references, is available electronically at the NASS Web site (www.spine.org) and will remain updated on a timely schedule.
Providing Multi-Page Data Extraction Services with XWRAPComposer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Ling; Zhang, Jianjun; Han, Wei
2008-04-30
Dynamic Web data sources – sometimes known collectively as the Deep Web – increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DYNABOT, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DYNABOT has three unique characteristics. First, DYNABOT utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DYNABOT employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DYNABOT incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.
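As a rough illustration of the probing idea (this is not the DYNABOT implementation and does not use its service class descriptions), a crawler might flag a page as a candidate Deep Web entry point when it exposes a free-text query form:

```python
# Rough sketch under stated assumptions, not the DYNABOT implementation:
# "probe" a candidate page and flag it as a likely Deep Web entry point if
# it exposes a query form with a free-text input.
import re
import urllib.request

def probe(url: str) -> bool:
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    has_form = re.search(r"<form\b", html, re.I) is not None
    has_text_input = re.search(r'<input[^>]+type=["\']?(text|search)', html, re.I) is not None
    return has_form and has_text_input

if __name__ == "__main__":
    print(probe("https://example.org"))  # placeholder URL; real crawling needs politeness rules
```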
CIMAC: A Coordinated Introduction to Calculus and Mechanics
NASA Astrophysics Data System (ADS)
Fathe, Laurie; Quinn, Jennifer; McDonald, Michael A.
1997-04-01
CIMAC, a new course incorporating Mechanics, Precalculus, and Calculus, targets the growing number of motivated but underprepared students who wish to pursue a major in science or mathematics. Team-taught by a Physicist and a Mathematician, the class contains specific content while exploiting the substantial commonality of these subjects. CIMAC also addresses a variety of non-content areas, including supplementing basic mathematics and communication skills, accommodating various learning styles, and building student confidence. Specific approaches include class formats; gateway exams; group assignments; emphasis on writing and reading; use of computers and graphing calculators for comprehension, data acquisition, analysis, and modeling; student-led help sessions; and use of the Web (http://www.oxy.edu/departments/math/cimac/). This talk highlights the development of the course and teaching insights and innovations which have arisen from it, and addresses benefits and difficulties of coordinating material and team teaching across disciplinary lines. Finally, it presents data on student success and retention.
Web-based technical assistance and training to promote community tobacco control policy change.
Young, Walter F; Montgomery, Debbie; Nycum, Colleen; Burns-Martin, Lavon; Buller, David B
2006-01-01
In 1998, the tobacco industry was released from legal claims under a settlement that provided monetary relief for states. A significant expansion of tobacco control activity in many states created a need to develop local capacity. Technical assistance and training for new and experienced staff became a significant challenge for tobacco control leadership. In Colorado, this challenge was addressed in part through the development of a technical assistance and training Web site designed for local tobacco control staff and coalition members. Researchers, technical Web site development specialists, state health agency, and state tobacco control coalition staff collaborated to develop, promote, and test the efficacy of this Web site. The work group embodied a range of skills including tobacco control, Web site technical development, marketing, training, and project management. Persistent marketing, updating of Web site content, and institutionalizing it as a principal source of information and training were key to use by community coalition members.
Web-based distance continuing education: a new way of thinking for students and instructors.
Garrison, J A; Schardt, C; Kochi, J K
2000-07-01
As people have more difficulty taking time away from work to attend conferences and workshops, the idea of offering courses via the Web has become more desirable. Addressing a need voiced by Medical Library Association membership, the authors developed a Web-based continuing-education course on the subject of the librarian's role in evidence-based medicine. The aim of the course was to provide medical librarians with a well-constructed, content-rich learning experience available to them at their convenience via the Web. This paper includes a discussion of the considerations that need to be taken into account when developing Web-based courses, the issues that arise when the information delivery changes from face-to-face to online, the changing role of the instructor, and the pros and cons of offering Web-based versus traditional courses. The results of the beta test and future plans for the course are also discussed.
Web-based distance continuing education: a new way of thinking for students and instructors
Garrison, Julie A.; Schardt, Connie; Kochi, Julia K.
2000-01-01
As people have more difficulty taking time away from work to attend conferences and workshops, the idea of offering courses via the Web has become more desirable. Addressing a need voiced by Medical Library Association membership, the authors developed a Web-based continuing-education course on the subject of the librarian's role in evidence-based medicine. The aim of the course was to provide medical librarians with a well-constructed, content-rich learning experience available to them at their convenience via the Web. This paper includes a discussion of the considerations that need to be taken into account when developing Web-based courses, the issues that arise when the information delivery changes from face-to-face to online, the changing role of the instructor, and the pros and cons of offering Web-based versus traditional courses. The results of the beta test and future plans for the course are also discussed. PMID:10928706
Web usage mining at an academic health sciences library: an exploratory study.
Bracke, Paul J
2004-10-01
This paper explores the potential of multinomial logistic regression analysis to perform Web usage mining for an academic health sciences library Website. Usage of database-driven resource gateway pages was logged for a six-month period, including information about users' network addresses, referring uniform resource locators (URLs), and types of resource accessed. It was found that referring URL did vary significantly by two factors: whether a user was on-campus and what type of resource was accessed. Although the data available for analysis are limited by the nature of the Web and concerns for privacy, this method demonstrates the potential for gaining insight into Web usage that supplements Web log analysis. It can be used to improve the design of static and dynamic Websites today and could be used in the design of more advanced Web systems in the future.
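To make the analytic setup concrete, the toy example below fits a multinomial logistic regression in which the referring-URL category is the outcome and an on-campus indicator plus resource type are the predictors. The data and coding scheme are fabricated for illustration and are not the study's log data.

```python
# Analogous toy example, not the study's data or code: a multinomial logistic
# regression relating referring-URL category to an on-campus indicator and the
# type of resource accessed (dummy-coded by hand to stay self-contained).
import numpy as np
from sklearn.linear_model import LogisticRegression

records = [(1, "database"), (0, "ejournal"), (1, "ejournal"),
           (0, "database"), (1, "catalog"),  (0, "catalog")]
referrer = ["library_home", "search_engine", "library_home",
            "search_engine", "library_home", "bookmark"]

dummy = {"database": [1, 0], "ejournal": [0, 1], "catalog": [0, 0]}  # reference level: catalog
X = np.array([[on] + dummy[rt] for on, rt in records], dtype=float)

# With three outcome classes, scikit-learn fits a multinomial model by default.
model = LogisticRegression(max_iter=1000).fit(X, referrer)
print(dict(zip(model.classes_, model.predict_proba(X[:1])[0].round(2))))
```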
Information integration from heterogeneous data sources: a Semantic Web approach.
Kunapareddy, Narendra; Mirhaji, Parsa; Richards, David; Casscells, S Ward
2006-01-01
Although the decentralized and autonomous implementation of health information systems has made it possible to extend the reach of surveillance systems to a variety of contextually disparate domains, public health use of data from these systems is not primarily anticipated. The Semantic Web has been proposed to address both representational and semantic heterogeneity in distributed and collaborative environments. We introduce a semantic approach for the integration of health data using the Resource Description Framework (RDF) and the Simple Knowledge Organization System (SKOS) developed by the Semantic Web community.
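A minimal sketch of the kind of mapping such an approach enables, assuming the rdflib Python library and an invented example.org namespace: two locally coded surveillance terms are attached to one shared SKOS concept so downstream queries can treat them alike. This is not the authors' system.

```python
# Illustrative sketch only, not the authors' system: use rdflib to state, in
# RDF/SKOS, that two locally coded terms from different surveillance systems
# denote the same shared concept.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

EX = Namespace("http://example.org/phsurv/")   # hypothetical namespace
g = Graph()
g.bind("skos", SKOS)
g.bind("ex", EX)

concept = EX["Influenza-like_illness"]
g.add((concept, RDF.type, SKOS.Concept))
g.add((concept, SKOS.prefLabel, Literal("Influenza-like illness", lang="en")))
# local codes from two source systems, recorded as alternate label / notation
g.add((concept, SKOS.altLabel, Literal("ILI")))
g.add((concept, SKOS.notation, Literal("ER-DX-487.1")))

print(g.serialize(format="turtle"))
```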
NASA Astrophysics Data System (ADS)
Hoebelheinrich, N. J.; Lynnes, C.; West, P.; Ferritto, M.
2014-12-01
Two problems common to many geoscience domains are the difficulties in finding tools to work with a given dataset collection, and conversely, the difficulties in finding data for a known tool. A collaborative team from the Earth Science Information Partnership (ESIP) has come together to design and create a web service, called ToolMatch, to address these problems. The team began their efforts by defining an initial, relatively simple conceptual model that addressed the two use cases briefly described above. The conceptual model is expressed as an ontology using OWL (Web Ontology Language) and DCterms (Dublin Core Terms), and utilizing standard ontologies such as DOAP (Description of a Project), FOAF (Friend of a Friend), SKOS (Simple Knowledge Organization System) and DCAT (Data Catalog Vocabulary). The ToolMatch service will be taking advantage of various Semantic Web and Web standards, such as OpenSearch, RESTful web services, SWRL (Semantic Web Rule Language) and SPARQL (Simple Protocol and RDF Query Language). The first version of the ToolMatch service was deployed in early fall 2014. While more complete testing is required, a number of communities besides ESIP member organizations have expressed interest in collaborating to create, test and use the service and incorporate it into their own web pages, tools, and/or services, including the USGS Data Catalog service, DataONE, the Deep Carbon Observatory, Virtual Solar Terrestrial Observatory (VSTO), and the U.S. Global Change Research Program. In this session, presenters will discuss the inception and development of the ToolMatch service, the collaborative process used to design, refine, and test the service, and future plans for the service.
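To show the flavor of the matching problem (this is a mock-up, not the ToolMatch ontology or API; all URIs and property names are invented), a tiny RDF graph linking tools to the formats they read can answer "which tools work with this dataset?" with a SPARQL query:

```python
# Toy mock-up of tool/dataset matching, not the ToolMatch implementation.
from rdflib import Graph

g = Graph()
g.parse(data="""
@prefix ex: <http://example.org/toolmatch/> .
ex:Panoply   ex:readsFormat ex:NetCDF .
ex:GDAL      ex:readsFormat ex:GeoTIFF .
ex:MODIS_LST ex:hasFormat   ex:NetCDF .
""", format="turtle")

q = """
PREFIX ex: <http://example.org/toolmatch/>
SELECT ?tool WHERE {
  ex:MODIS_LST ex:hasFormat ?fmt .
  ?tool ex:readsFormat ?fmt .
}"""
for row in g.query(q):
    print(row.tool)   # -> http://example.org/toolmatch/Panoply
```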
A Semantic Web Management Model for Integrative Biomedical Informatics
Deus, Helena F.; Stanislaus, Romesh; Veiga, Diogo F.; Behrens, Carmen; Wistuba, Ignacio I.; Minna, John D.; Garner, Harold R.; Swisher, Stephen G.; Roth, Jack A.; Correa, Arlene M.; Broom, Bradley; Coombes, Kevin; Chang, Allen; Vogel, Lynn H.; Almeida, Jonas S.
2008-01-01
Background Data, data everywhere. The diversity and magnitude of the data generated in the Life Sciences defies automated articulation among complementary efforts. The additional need in this field for managing property and access permissions compounds the difficulty very significantly. This is particularly the case when the integration involves multiple domains and disciplines, even more so when it includes clinical and high throughput molecular data. Methodology/Principal Findings The emergence of Semantic Web technologies brings the promise of meaningful interoperation between data and analysis resources. In this report we identify a core model for biomedical Knowledge Engineering applications and demonstrate how this new technology can be used to weave a management model where multiple intertwined data structures can be hosted and managed by multiple authorities in a distributed management infrastructure. Specifically, the demonstration is performed by linking data sources associated with the Lung Cancer SPORE awarded to The University of Texas MD Anderson Cancer Center at Houston and the Southwestern Medical Center at Dallas. A software prototype, available with open source at www.s3db.org, was developed and its proposed design has been made publicly available as an open source instrument for shared, distributed data management. Conclusions/Significance The Semantic Web technologies have the potential to address the need for distributed and evolvable representations that are critical for systems biology and translational biomedical research. As this technology is incorporated into application development we can expect that both general purpose productivity software and domain specific software installed on our personal computers will become increasingly integrated with the relevant remote resources. In this scenario, the acquisition of a new dataset should automatically trigger the delegation of its analysis. PMID:18698353
Monitoring of small laboratory animal experiments by a designated web-based database.
Frenzel, T; Grohmann, C; Schumacher, U; Krüll, A
2015-10-01
Multiple-parametric small animal experiments require, by their very nature, a sufficient number of animals which may need to be large to obtain statistically significant results. For this reason database-related systems are required to collect the experimental data as well as to support the later (re-) analysis of the information gained during the experiments. In particular, the monitoring of animal welfare is simplified by the inclusion of warning signals (for instance, loss in body weight >20%). Digital patient charts have been developed for human patients but are usually not able to fulfill the specific needs of animal experimentation. To address this problem a unique web-based monitoring system using standard MySQL, PHP, and nginx has been created. PHP was used to create the HTML-based user interface and outputs in a variety of proprietary file formats, namely portable document format (PDF) or spreadsheet files. This article demonstrates its fundamental features and the easy and secure access it offers to the data from any place using a web browser. This information will help other researchers create their own individual databases in a similar way. The use of QR-codes plays an important role for stress-free use of the database. We demonstrate a way to easily identify all animals and samples and data collected during the experiments. Specific ways to record animal irradiations and chemotherapy applications are shown. This new analysis tool allows the effective and detailed analysis of huge amounts of data collected through small animal experiments. It supports proper statistical evaluation of the data and provides excellent retrievable data storage.
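The welfare-warning idea above (flagging a >20% loss in body weight) is easy to prototype. The sketch below uses SQLite instead of the MySQL/PHP/nginx stack of the actual system, with a made-up table layout, purely to illustrate the query logic.

```python
# Simplified stand-in (SQLite instead of the MySQL/PHP stack described above)
# showing the core idea: store weighings per animal and raise a welfare flag
# when body weight drops more than 20% below the first recorded value.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE weighing (animal_id TEXT, measured_on TEXT, grams REAL);
INSERT INTO weighing VALUES
 ('M01','2015-01-01',25.0), ('M01','2015-01-15',19.0),
 ('M02','2015-01-01',24.0), ('M02','2015-01-15',23.5);
""")

query = """
SELECT DISTINCT animal_id,
  (SELECT grams FROM weighing w2 WHERE w2.animal_id = w.animal_id
     ORDER BY measured_on ASC  LIMIT 1) AS baseline,
  (SELECT grams FROM weighing w3 WHERE w3.animal_id = w.animal_id
     ORDER BY measured_on DESC LIMIT 1) AS latest
FROM weighing w
"""
for animal, baseline, latest in con.execute(query):
    if latest < 0.8 * baseline:
        print(f"WARNING: {animal} lost >20% body weight ({baseline} g -> {latest} g)")
```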
Distributed spatial information integration based on web service
NASA Astrophysics Data System (ADS)
Tong, Hengjian; Zhang, Yun; Shao, Zhenfeng
2008-10-01
Spatial information systems and spatial information in different geographic locations usually belong to different organizations. They are distributed and often heterogeneous and independent from each other. This leads to the fact that many isolated spatial information islands are formed, reducing the efficiency of information utilization. In order to address this issue, we present a method for effective spatial information integration based on web service. The method applies asynchronous invocation of web service and dynamic invocation of web service to implement distributed, parallel execution of web map services. All isolated information islands are connected by the dispatcher of web service and its registration database to form a uniform collaborative system. According to the web service registration database, the dispatcher of web services can dynamically invoke each web map service through an asynchronous delegating mechanism. All of the web map services can be executed at the same time. When each web map service is done, an image will be returned to the dispatcher. After all of the web services are done, all images are transparently overlaid together in the dispatcher. Thus, users can browse and analyze the integrated spatial information. Experiments demonstrate that the utilization rate of spatial information resources is significantly raised through the proposed method of distributed spatial information integration.
Distributed spatial information integration based on web service
NASA Astrophysics Data System (ADS)
Tong, Hengjian; Zhang, Yun; Shao, Zhenfeng
2009-10-01
Spatial information systems and spatial information in different geographic locations usually belong to different organizations. They are distributed and often heterogeneous and independent from each other. This leads to the fact that many isolated spatial information islands are formed, reducing the efficiency of information utilization. In order to address this issue, we present a method for effective spatial information integration based on web service. The method applies asynchronous invocation of web service and dynamic invocation of web service to implement distributed, parallel execution of web map services. All isolated information islands are connected by the dispatcher of web service and its registration database to form a uniform collaborative system. According to the web service registration database, the dispatcher of web services can dynamically invoke each web map service through an asynchronous delegating mechanism. All of the web map services can be executed at the same time. When each web map service is done, an image will be returned to the dispatcher. After all of the web services are done, all images are transparently overlaid together in the dispatcher. Thus, users can browse and analyze the integrated spatial information. Experiments demonstrate that the utilization rate of spatial information resources is significantly raised through the proposed method of distributed spatial information integration.
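The asynchronous, parallel invocation described in the abstract can be mimicked in a few lines. This is a schematic stand-in, not the authors' dispatcher: the wms_call function fakes a GetMap request with a short sleep, and the overlay step is reduced to string concatenation.

```python
# Schematic illustration (not the paper's code): dispatch several "web map
# service" calls in parallel and combine the results once all have returned.
import time
from concurrent.futures import ThreadPoolExecutor

def wms_call(layer_name: str) -> str:
    # stand-in for a real GetMap request; sleeps to mimic network latency
    time.sleep(0.5)
    return f"<image layer='{layer_name}'>"

layers = ["roads", "rivers", "land_use"]          # registered services in the dispatcher
with ThreadPoolExecutor(max_workers=len(layers)) as pool:
    images = list(pool.map(wms_call, layers))     # all services execute at the same time

overlay = " + ".join(images)                      # a real system would alpha-blend the images
print(overlay)
```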
Code of Federal Regulations, 2014 CFR
2014-04-01
... (§ 229.1105 of this chapter) may be provided under the following conditions on an Internet Web site for... Internet address where the information is posted. (2) Such information shall be provided through the Web site unrestricted as to access and free of charge. (3) Such information shall remain available on the...
Code of Federal Regulations, 2012 CFR
2012-04-01
... (§ 229.1105 of this chapter) may be provided under the following conditions on an Internet Web site for... Internet address where the information is posted. (2) Such information shall be provided through the Web site unrestricted as to access and free of charge. (3) Such information shall remain available on the...
Code of Federal Regulations, 2010 CFR
2010-04-01
... (§ 229.1105 of this chapter) may be provided under the following conditions on an Internet Web site for... Internet address where the information is posted. (2) Such information shall be provided through the Web site unrestricted as to access and free of charge. (3) Such information shall remain available on the...
Code of Federal Regulations, 2013 CFR
2013-04-01
... (§ 229.1105 of this chapter) may be provided under the following conditions on an Internet Web site for... Internet address where the information is posted. (2) Such information shall be provided through the Web site unrestricted as to access and free of charge. (3) Such information shall remain available on the...
Code of Federal Regulations, 2011 CFR
2011-04-01
... (§ 229.1105 of this chapter) may be provided under the following conditions on an Internet Web site for... Internet address where the information is posted. (2) Such information shall be provided through the Web site unrestricted as to access and free of charge. (3) Such information shall remain available on the...
ERIC Educational Resources Information Center
Krause, Jaclyn A.
2010-01-01
As Web 2.0 tools and technologies increase in popularity in consumer markets, enterprises are seeking ways to take advantage of the rich social knowledge exchanges that these tools offer. The problem this study addresses is that it remains unknown whether employees perceive that these tools offer value to the organization and therefore will be…
Code of Federal Regulations, 2011 CFR
2011-01-01
..., in conjunction with the member's social security number, driver's license number, account number... should provide the FTC's Web site address and toll-free telephone number that members may use to obtain... Web site for the ID Theft brochure and the FTC Hotline phone number are http://www.ftc.gov/idtheft and...
Self-Arrangement of Fleeting Student Pairs: A Web 2.0 Approach for Peer Tutoring
ERIC Educational Resources Information Center
Westera, Wim; de Bakker, Gijs; Wagemans, Leo
2009-01-01
This article presents a Web 2.0 approach for the arrangement of peer tutoring in online learning. In online learning environments, the learners' expectations of obtaining frequent, one-to-one support from their teachers tend to increase the teachers' workloads to unacceptably high levels. To address this problem of workload a self-organised peer…
ERIC Educational Resources Information Center
Klemm, E. Barbara; Iding, Marie K.; Crosby, Martha E.
This study addresses the need to develop research-based criteria for science teacher educators to use in preparing teachers to critically evaluate and select web-based resources for their students' use. The study focuses on the cognitive load imposed on the learner for tasks required in using text, illustrations, and other features of multi-…
ERIC Educational Resources Information Center
Rasmussen, Ann Marie
2011-01-01
This article describes an undergraduate, German-language course that aimed to improve students' language skills, critical thinking, and declarative knowledge of German history and culture by studying multiple manifestations of the legend of Siegfried the Dragonslayer. The course used web-based e-learning tools to address two major learning…
Code of Federal Regulations, 2010 CFR
2010-07-01
... filed through the Office's web site, at http://www.uspto.gov. Paper documents and cover sheets to be... trademark documents can be ordered through the Office's web site at www.uspto.gov. Paper requests for...: Madrid Processing Unit, 600 Dulany Street, MDE-7B87, Alexandria, VA 22314-5793. [68 FR 48289, Aug. 13...
ERIC Educational Resources Information Center
Huang, Tony Cheng-Kui; Huang, Chih-Hong
2010-01-01
With advances in information and network technologies, lots of data have been digitized to reveal information for users by the construction of Web sites. Unfortunately, they are both overloading and overlapping on the Internet, so that users cannot distinguish their quality. To address this issue in education, Hwang, Huang, and Tseng proposed a group…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-23
... the Division of Migratory Bird Management's Web site at http://www.fws.gov/migratorybirds/ , or at... under ADDRESSES or from our Web site at http://www.fws.gov/migratorybirds/NewsPublicationsReports.html... recommended raising the dark goose daily bag limit from 4 to 5 geese in the aggregate, with the exception of...
Technology: Cookies, Web Profilers, Social Network Cartography, and Proxy Servers
ERIC Educational Resources Information Center
Van Horn, Royal
2004-01-01
The Internet was designed as an open system that promoted the two-way flow of information. In other words, everything that is sent has a return address, called an IP (Internet Protocol) address, of the form 000.11.222.33. Whenever you connect to a website, the site learns your IP address. It also learns the type of computer you are using, the…
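To illustrate how much of this information is visible to any site a visitor connects to, the sketch below uses Python's standard-library HTTP server to log each client's IP address and User-Agent header; the port number and log wording are arbitrary choices for the example, not details from the article.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class LoggingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The connecting client's IP address and port are available to the server
        client_ip, client_port = self.client_address
        # The User-Agent header reveals browser and operating-system details
        user_agent = self.headers.get("User-Agent", "unknown")
        print(f"Visit from {client_ip}:{client_port} using {user_agent}")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Your request was logged.\n")

if __name__ == "__main__":
    # Port 8080 is an arbitrary choice for this demonstration
    HTTPServer(("", 8080), LoggingHandler).serve_forever()
```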
Systemic Vulnerabilities in Customer-Premises Equipment (CPE) Routers
2017-07-01
equipment (CPE), specifically small office/home office (SOHO) routers, has become ubiquitous. CPE routers are notorious for their web interface...and enabling remote management, although all settings controllable over the web-management interface can be manipulated. • 85% (11 of 13) of...CPE routers are notorious for their web interface vulnerabilities, old versions
Worldwide Research, Worldwide Participation: Web-Based Test Logger
NASA Technical Reports Server (NTRS)
Clark, David A.
1998-01-01
Thanks to the World Wide Web, a new paradigm has been born. ESCORT (steady state data system) facilities can now be configured to use a Web-based test logger, enabling worldwide participation in tests. NASA Lewis Research Center's new Web-based test logger for ESCORT automatically writes selected test and facility parameters to a browser and allows researchers to insert comments. All data can be viewed in real time via Internet connections, so anyone with a Web browser and the correct URL (universal resource locator, or Web address) can interactively participate. As the test proceeds and ESCORT data are taken, Web browsers connected to the logger are updated automatically. The use of this logger has demonstrated several benefits. First, researchers are free from manual data entry and are able to focus more on the tests. Second, research logs can be printed in report format immediately after (or during) a test. And finally, all test information is readily available to an international public.
DeBonis, Katrina; Blair, Thomas R; Payne, Samuel T; Wigan, Katherine; Kim, Sara
2015-12-01
Web-based instruction in post-graduate psychiatry training has shown comparable effectiveness to in-person instruction, but few topics have been addressed in this format. This study sought to evaluate the viability of a web-based curriculum in teaching electrocardiogram (EKG) reading skills to psychiatry residents. Interest in receiving educational materials in this format was also assessed. A web-based curriculum of 41 slides, including eight pre-test and eight post-test questions with emphasis on cardiac complications of psychotropic medications, was made available to all psychiatry residents via email. Out of 57 residents, 30 initiated and 22 completed the module. Mean improvement from pre-test to post-test was 25 %, and all 22 completing participants indicated interest in future web-based instruction. This pilot study suggests that web-based instruction is feasible and under-utilized as a means of teaching psychiatry residents. Potential uses of web-based instruction, such as tracking learning outcomes or patient care longitudinally, are also discussed.
Gruber, Andreas R; Bernhart, Stephan H; Lorenz, Ronny
2015-01-01
The ViennaRNA package is a widely used collection of programs for thermodynamic RNA secondary structure prediction. Over the years, many additional tools have been developed building on the core programs of the package to also address issues related to noncoding RNA detection, RNA folding kinetics, or efficient sequence design considering RNA-RNA hybridizations. The ViennaRNA web services provide easy and user-friendly web access to these tools. This chapter describes how to use this online platform to perform tasks such as prediction of minimum free energy structures, prediction of RNA-RNA hybrids, or noncoding RNA detection. The ViennaRNA web services can be used free of charge and can be accessed via http://rna.tbi.univie.ac.at.
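For readers who prefer scripted access over the web interface, a minimal minimum-free-energy prediction can be run locally; the sketch below assumes the ViennaRNA package and its Python bindings (the RNA module) are installed, and the example sequence is arbitrary.

```python
import RNA  # Python bindings shipped with the ViennaRNA package (assumed installed)

# Arbitrary example sequence; replace with a sequence of interest
sequence = "GGGAAAUCCCGAGGAAAUCCUCGGGUUUCCC"

# Predict the minimum free energy (MFE) secondary structure
structure, mfe = RNA.fold(sequence)
print(structure)                    # dot-bracket notation
print(f"MFE = {mfe:.2f} kcal/mol")
```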
Web Services Security - Implementation and Evaluation Issues
NASA Astrophysics Data System (ADS)
Pimenidis, Elias; Georgiadis, Christos K.; Bako, Peter; Zorkadis, Vassilis
Web services development is a key theme in the utilization and commercial exploitation of the semantic web. Paramount to the development and offering of such services is the issue of security features and the way these are applied in instituting trust amongst participants and recipients of the service. Implementing such security features is a major challenge to developers as they need to balance these with performance and interoperability requirements. Being able to evaluate the level of security offered is a desirable feature for any prospective participant. The authors attempt to address the issues of security requirements and evaluation criteria, while they discuss the challenges of security implementation through a simple web service application case.
GSP: A web-based platform for designing genome-specific primers in polyploids
USDA-ARS?s Scientific Manuscript database
The sequences among subgenomes in a polyploid species have high similarity. This makes it difficult to design genome-specific primers for sequence analysis. We present a web-based platform named GSP for designing genome-specific primers to distinguish subgenome sequences in the polyploid genome backgr...
The Way of the Web: Answers to Your Questions about Web Site Marketing.
ERIC Educational Resources Information Center
Wassom, Julie
2002-01-01
Provides suggestions for effective web site marketing for child care and early education programs. Includes key considerations in designing a web site, specific elements that cause visitors to stay on and return to the site, use of interactive sites, web-site updating and revision, and use of traditional marketing activities to direct prospective…
Opportunities for the Mashup of Heterogenous Data Server via Semantic Web Technology
NASA Astrophysics Data System (ADS)
Ritschel, Bernd; Seelus, Christoph; Neher, Günther; Iyemori, Toshihiko; Koyama, Yukinobu; Yatagai, Akiyo; Murayama, Yasuhiro; King, Todd; Hughes, John; Fung, Shing; Galkin, Ivan; Hapgood, Michael; Belehaki, Anna
2015-04-01
European Union ESPAS, Japanese IUGONET and GFZ ISDC data servers were developed for the ingestion, archiving and distribution of geo and space science domain data. Main parts of the data managed by these servers are related to near-Earth space and geomagnetic field data. A smart mashup of the data servers would allow seamless browsing and access to data and related context information. However, achieving a high level of interoperability is a challenge because the data servers are based on different data models and software frameworks. This paper is focused on the latest experiments and results for the mashup of the data servers using the semantic Web approach. Besides the mashup of domain and terminological ontologies, especially the options to connect data managed by relational databases using D2R server and SPARQL technology will be addressed. A successful realization of the data server mashup will not only have a positive impact on the data users of the specific scientific domain but also on related projects, such as the development of a new interoperable version of NASA's Planetary Data System (PDS) or ICSU's World Data System alliance. ESPAS data server: https://www.espas-fp7.eu/portal/ IUGONET data server: http://search.iugonet.org/iugonet/ GFZ ISDC data server (semantic Web based prototype): http://rz-vm30.gfz-potsdam.de/drupal-7.9/ NASA PDS: http://pds.nasa.gov ICSU-WDS: https://www.icsu-wds.org
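To make the D2R/SPARQL option concrete, the sketch below sends a generic SPARQL query to a SPARQL endpoint over HTTP; the endpoint URL and queried predicate are placeholders for illustration, not the actual ESPAS, IUGONET, or ISDC service details.

```python
import requests

# Hypothetical endpoint URL; a D2R server typically exposes a /sparql endpoint over HTTP
ENDPOINT = "http://example.org/d2r/sparql"

# Generic query: list a few resources and their titles (the predicate is illustrative)
query = """
PREFIX dcterms: <http://purl.org/dc/terms/>
SELECT ?dataset ?title WHERE {
  ?dataset dcterms:title ?title .
} LIMIT 10
"""

response = requests.get(
    ENDPOINT,
    params={"query": query},
    headers={"Accept": "application/sparql-results+json"},
)
response.raise_for_status()
for binding in response.json()["results"]["bindings"]:
    print(binding["dataset"]["value"], "-", binding["title"]["value"])
```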
Web mining for topics defined by complex and precise predicates
NASA Astrophysics Data System (ADS)
Lee, Ching-Cheng; Sampathkumar, Sushma
2004-04-01
The enormous growth of the World Wide Web has made it important to perform resource discovery efficiently for any given topic. Several new techniques have been proposed in recent years for this kind of topic-specific web mining; among them is a key technique called focused crawling, which is able to crawl topic-specific portions of the web without having to explore all pages. Most existing research on focused crawling considers a simple topic definition that typically consists of one or more keywords connected by an OR operator. However, this kind of simple topic definition may result in too many irrelevant pages in which the same keyword appears in the wrong context. In this research we explore new strategies for crawling topic-specific portions of the web using complex and precise predicates. A complex predicate allows the user to precisely specify a topic using Boolean operators such as "AND", "OR" and "NOT". Our work concentrates, first, on defining a format to specify this kind of complex topic definition and, second, on devising a crawl strategy to crawl the topic-specific portions of the web defined by the complex predicate, efficiently and with minimal overhead. Our new crawl strategy improves the performance of topic-specific web crawling by reducing the number of irrelevant pages crawled. In order to demonstrate the effectiveness of the above approach, we have built a complete focused crawler called "Eureka" with complex predicate support, and a search engine that indexes and supports end-user searches on the crawled pages.
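The abstract does not give the predicate format used by Eureka, but the idea of filtering pages with a complex Boolean predicate can be sketched as follows; the predicate representation (nested tuples of AND/OR/NOT over keywords) is an assumption made purely for illustration.

```python
# A predicate is either a keyword string or a tuple: ("AND", p1, p2, ...),
# ("OR", p1, p2, ...), or ("NOT", p). This representation is illustrative only.
def matches(predicate, text):
    text = text.lower()
    if isinstance(predicate, str):
        return predicate.lower() in text
    op, *args = predicate
    if op == "AND":
        return all(matches(p, text) for p in args)
    if op == "OR":
        return any(matches(p, text) for p in args)
    if op == "NOT":
        return not matches(args[0], text)
    raise ValueError(f"unknown operator: {op}")

# Example: pages about jaguars the animal, not the car
topic = ("AND", "jaguar", ("OR", "habitat", "predator"), ("NOT", "car"))
page = "The jaguar is an apex predator of the Amazon rainforest."
print(matches(topic, page))  # True
```

A crawler would apply such a check to fetched pages (and to anchor text) to decide which outgoing links are worth following, which is the essence of reducing irrelevant pages in focused crawling.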
Food-Web Complexity in Guaymas Basin Hydrothermal Vents and Cold Seeps.
Portail, Marie; Olu, Karine; Dubois, Stanislas F; Escobar-Briones, Elva; Gelinas, Yves; Menot, Lénaick; Sarrazin, Jozée
In the Guaymas Basin, the presence of cold seeps and hydrothermal vents in close proximity, similar sedimentary settings and comparable depths offers a unique opportunity to assess and compare the functioning of these deep-sea chemosynthetic ecosystems. The food webs of five seep and four vent assemblages were studied using stable carbon and nitrogen isotope analyses. Although the two ecosystems shared similar potential basal sources, their food webs differed: seeps relied predominantly on methanotrophy and thiotrophy via the Calvin-Benson-Bassham (CBB) cycle and vents on petroleum-derived organic matter and thiotrophy via the CBB and reductive tricarboxylic acid (rTCA) cycles. In contrast to symbiotic species, the heterotrophic fauna exhibited high trophic flexibility among assemblages, suggesting weak trophic links to the metabolic diversity of chemosynthetic primary producers. At both ecosystems, food webs did not appear to be organised through predator-prey links but rather through weak trophic relationships among co-occurring species. Examples of trophic or spatial niche differentiation highlighted the importance of species-sorting processes within chemosynthetic ecosystems. Variability in food web structure, addressed through Bayesian metrics, revealed consistent trends across ecosystems. Food-web complexity significantly decreased with increasing methane concentrations, a common proxy for the intensity of seep and vent fluid fluxes. Although high fluid-fluxes have the potential to enhance primary productivity, they generate environmental constraints that may limit microbial diversity, colonisation of consumers and the structuring role of competitive interactions, leading to an overall reduction of food-web complexity and an increase in trophic redundancy. Heterogeneity provided by foundation species was identified as an additional structuring factor. According to their biological activities, foundation species may have the potential to partly release the competitive pressure within communities of low fluid-flux habitats. Finally, ecosystem functioning in vents and seeps was highly similar despite environmental differences (e.g. physico-chemistry, dominant basal sources) suggesting that ecological niches are not specifically linked to the nature of fluids. This comparison of seep and vent functioning in the Guaymas basin thus provides further support for the hypothesis of continuity among deep-sea chemosynthetic ecosystems.
Food web heterogeneity and succession in created saltmarshes
Nordstrom, M C; Demopoulos, Amanda W.J.; Whitcraft, CR; Rismondo, A.; McMillan, P.; Gonzales, J P; Levin, L A
2015-01-01
1. Ecological restoration must achieve functional as well as structural recovery. Functional metrics for reestablishment of trophic interactions can be used to complement traditional monitoring of structural attributes. In addition, topographic effects on food web structure provide added information within a restoration context; often, created sites may require spatial heterogeneity to effectively match structure and function of natural habitats. 2. We addressed both of these issues in our study of successional development of benthic food web structure, with focus on bottom–up driven changes in macroinvertebrate consumer assemblages in the salt marshes of the Venice Lagoon, Italy. We combined quantified estimates of the changing community composition with stable isotope data (13C:12C and 15N:14N) to compare the general trophic structure between created (2–14 years) marshes and reference sites and along topographic elevation gradients within salt marshes. 3. Macrofaunal invertebrate consumers exhibited local, habitat-specific trophic patterns. Stable isotope-based trophic structure changed with increasing marsh age, in particular with regards to mid-elevation (Salicornia) habitats. In young marshes, the mid-elevation consumer signatures resembled those of unvegetated ponds. The mid elevation of older and natural marshes had a more distinct Salicornia-zone food web, occasionally resembling that of the highest (Sarcocornia-dominated) elevation. In summary, this indicates that primary producers and availability of vascular plant detritus structure consumer trophic interactions and the flow of carbon. 4. Functionally different consumers, subsurface-feeding detritivores (Oligochaeta) and surface grazers (Hydrobia sp.), showed distinct but converging trajectories of isotopic change over time, indicating that successional development may be asymmetric between ‘brown’ (detrital) guilds and ‘green’ (grazing) guilds in the food web. 5. Synthesis and applications. Created marsh food webs converged into a natural state over about a decade, with successional shifts seen in both consumer community composition and stable isotope space. Strong spatial effects were noted, highlighting the utility of stable isotopes to evaluate functional equivalence in spatially heterogeneous systems. Understanding the recovery of functional properties such as food web support, and their inherent spatial variability, is key to planning and managing successful habitat restoration.
Ghoncheh, Rezvan; Gould, Madelyn S; Twisk, Jos Wr; Kerkhof, Ad Jfm; Koot, Hans M
2016-01-29
Face-to-face gatekeeper training can be an effective strategy in the enhancement of gatekeepers' knowledge and self-efficacy in adolescent suicide prevention. However, barriers related to access (eg, time, resources) may hamper participation in face-to-face training sessions. The transition to a Web-based setting could address obstacles associated with face-to-face gatekeeper training. Although Web-based suicide prevention training targeting adolescents exists, so far no randomized controlled trials (RCTs) have been conducted to investigate their efficacy. This RCT study investigated the efficacy of a Web-based adolescent suicide prevention program entitled Mental Health Online, which aimed to improve the knowledge and self-confidence of gatekeepers working with adolescents (12-20 years old). The program consisted of 8 short e-learning modules each capturing an important aspect of the process of early recognition, guidance, and referral of suicidal adolescents, alongside additional information on the topic of (adolescent) suicide prevention. A total of 190 gatekeepers (ages 21 to 62 years) participated in this study and were randomized to either the experimental group or waitlist control group. The intervention was not masked. Participants from both groups completed 3 Web-based assessments (pretest, posttest, and 3-month follow-up). The outcome measures of this study were actual knowledge, and participants' ratings of perceived knowledge and perceived self-confidence using questionnaires developed specifically for this study. The actual knowledge, perceived knowledge, and perceived self-confidence of gatekeepers in the experimental group improved significantly compared to those in the waitlist control group at posttest, and the effects remained significant at 3-month follow-up. The overall effect sizes were 0.76, 1.20, and 1.02, respectively, across assessments. The findings of this study indicate that Web-based suicide prevention e-learning modules can be an effective educational method to enhance knowledge and self-confidence of gatekeepers with regard to adolescent suicide prevention. Gatekeepers with limited time and resources can benefit from the accessibility, simplicity, and flexibility of Web-based training. Netherlands Trial Register NTR3625; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=3625 (Archived by WebCite at http://www.webcitation.org/6eHvyRh6M).
The BCube Crawler: Web Scale Data and Service Discovery for EarthCube.
NASA Astrophysics Data System (ADS)
Lopez, L. A.; Khalsa, S. J. S.; Duerr, R.; Tayachow, A.; Mingo, E.
2014-12-01
The web-crawling effort, a core component of the NSF-funded BCube project, is researching and applying big data technologies to find and characterize different types of web services, catalog interfaces, and data feeds, such as ESIP OpenSearch, OGC W*S, THREDDS, and OAI-PMH, that describe or provide access to scientific datasets. Given the scale of the Internet, which challenges even large search providers such as Google, the BCube plan for discovering these web-accessible services is to subdivide the problem into three smaller, more tractable issues: first, to discover likely sites where relevant data and data services might be found; second, to deeply crawl the discovered sites to find any data and services that might be present; and last, to leverage semantic technologies to characterize the services and data found and to filter out everything but those relevant to the geosciences. To address the first two challenges, BCube uses an adapted version of Apache Nutch (which originated Hadoop), a web-scale crawler, and Amazon's ElasticMapReduce service for flexibility and cost effectiveness. For characterization of the services found, BCube is examining existing web service ontologies for their applicability to our needs and will re-use and/or extend these in order to query for services with specific, well-defined characteristics in scientific datasets, such as the use of geospatial namespaces. The original proposal for the crawler won a grant from Amazon's academic program, which allowed us to become operational; we successfully tested the BCube Crawler at web scale, obtaining a corpus sizeable enough to enable work on characterization of the services and data found. There is still plenty of work to be done: doing "smart crawls" by managing the frontier, developing and enhancing our scoring algorithms, and fully implementing the semantic characterization technologies. We describe the current status of the project, our successes and the issues encountered. The final goal of the BCube Crawler project is to provide relevant data services to other projects on the EarthCube stack and to third-party partners so they can be brokered and used by a wider scientific community.
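One lightweight way to characterize a candidate site, in the spirit of the deep-crawl step described above, is to probe a handful of conventional service endpoints; the probe paths and query strings below follow common OAI-PMH, OGC, and THREDDS conventions but are an illustrative assumption, not the BCube implementation.

```python
import requests

# Conventional probes for common scientific data services (illustrative list only)
PROBES = {
    "OAI-PMH":    "/oai?verb=Identify",
    "OGC WMS":    "/wms?service=WMS&request=GetCapabilities",
    "THREDDS":    "/thredds/catalog.xml",
    "OpenSearch": "/opensearch/description.xml",
}

def characterize(base_url):
    """Return the names of probes that respond successfully at base_url."""
    found = []
    for name, path in PROBES.items():
        try:
            r = requests.get(base_url.rstrip("/") + path, timeout=10)
            if r.ok and r.text.strip():
                found.append(name)
        except requests.RequestException:
            pass  # unreachable or non-responsive endpoint
    return found

print(characterize("http://data.example.org"))  # hypothetical host
```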
Food-Web Complexity in Guaymas Basin Hydrothermal Vents and Cold Seeps
Olu, Karine; Dubois, Stanislas F.; Escobar-Briones, Elva; Gelinas, Yves; Menot, Lénaick; Sarrazin, Jozée
2016-01-01
In the Guaymas Basin, the presence of cold seeps and hydrothermal vents in close proximity, similar sedimentary settings and comparable depths offers a unique opportunity to assess and compare the functioning of these deep-sea chemosynthetic ecosystems. The food webs of five seep and four vent assemblages were studied using stable carbon and nitrogen isotope analyses. Although the two ecosystems shared similar potential basal sources, their food webs differed: seeps relied predominantly on methanotrophy and thiotrophy via the Calvin-Benson-Bassham (CBB) cycle and vents on petroleum-derived organic matter and thiotrophy via the CBB and reductive tricarboxylic acid (rTCA) cycles. In contrast to symbiotic species, the heterotrophic fauna exhibited high trophic flexibility among assemblages, suggesting weak trophic links to the metabolic diversity of chemosynthetic primary producers. At both ecosystems, food webs did not appear to be organised through predator-prey links but rather through weak trophic relationships among co-occurring species. Examples of trophic or spatial niche differentiation highlighted the importance of species-sorting processes within chemosynthetic ecosystems. Variability in food web structure, addressed through Bayesian metrics, revealed consistent trends across ecosystems. Food-web complexity significantly decreased with increasing methane concentrations, a common proxy for the intensity of seep and vent fluid fluxes. Although high fluid-fluxes have the potential to enhance primary productivity, they generate environmental constraints that may limit microbial diversity, colonisation of consumers and the structuring role of competitive interactions, leading to an overall reduction of food-web complexity and an increase in trophic redundancy. Heterogeneity provided by foundation species was identified as an additional structuring factor. According to their biological activities, foundation species may have the potential to partly release the competitive pressure within communities of low fluid-flux habitats. Finally, ecosystem functioning in vents and seeps was highly similar despite environmental differences (e.g. physico-chemistry, dominant basal sources) suggesting that ecological niches are not specifically linked to the nature of fluids. This comparison of seep and vent functioning in the Guaymas basin thus provides further support for the hypothesis of continuity among deep-sea chemosynthetic ecosystems. PMID:27683216
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 6 2012-07-01 2012-07-01 false Referrals. 806.9 Section 806.9 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION... access, NTIS has paper copies for sale. Give requesters the web address or NTIS address when appropriate...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 6 2013-07-01 2013-07-01 false Referrals. 806.9 Section 806.9 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION... access, NTIS has paper copies for sale. Give requesters the web address or NTIS address when appropriate...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 6 2010-07-01 2010-07-01 false Referrals. 806.9 Section 806.9 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION... access, NTIS has paper copies for sale. Give requesters the web address or NTIS address when appropriate...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 6 2014-07-01 2014-07-01 false Referrals. 806.9 Section 806.9 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION... access, NTIS has paper copies for sale. Give requesters the web address or NTIS address when appropriate...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 6 2011-07-01 2011-07-01 false Referrals. 806.9 Section 806.9 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION... access, NTIS has paper copies for sale. Give requesters the web address or NTIS address when appropriate...
32 CFR Appendix A to Part 651 - References
Code of Federal Regulations, 2014 CFR
2014-07-01
... publications and forms are accessible from a variety of sources through the use of electronic media or paper products. In most cases, electronic publications and forms that are associated with military organizations can be accessed at various address or web sites on the Internet. Since electronic addresses can...
32 CFR Appendix A to Part 651 - References
Code of Federal Regulations, 2012 CFR
2012-07-01
... publications and forms are accessible from a variety of sources through the use of electronic media or paper products. In most cases, electronic publications and forms that are associated with military organizations can be accessed at various address or web sites on the Internet. Since electronic addresses can...
32 CFR Appendix A to Part 651 - References
Code of Federal Regulations, 2013 CFR
2013-07-01
... publications and forms are accessible from a variety of sources through the use of electronic media or paper products. In most cases, electronic publications and forms that are associated with military organizations can be accessed at various address or web sites on the Internet. Since electronic addresses can...
EduMOOs: Virtual Learning Centers.
ERIC Educational Resources Information Center
Woods, Judy C.
1998-01-01
Multi-user Object Oriented Internet activities (MOOs) permit real time interaction in a text-based virtual reality via the Internet. This article explains EduMOOs (educational MOOs) and provides brief descriptions, World Wide Web addresses, and telnet addresses for selected EduMOOs. Instructions for connecting to a MOO and a list of related Web…
Programs for Deaf-Blind Children and Adults.
ERIC Educational Resources Information Center
American Annals of the Deaf, 1999
1999-01-01
This directory of programs for deaf-blind children and adults lists program name, address, telephone numbers, e-mail address, Web site, and administrator name. The directory also lists, with similar information, Helen Keller Centers for Deaf-Blind Youth and Adults, and programs for training teachers of deaf-blind students. (DB)
12 CFR 614.4595 - Public disclosure about OFIs.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Public disclosure about OFIs. 614.4595 Section 614.4595 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM LOAN POLICIES AND OPERATIONS... the public the name, address, telephone number, and Internet Web site address of any affiliated OFI...
Focused Crawling of the Deep Web Using Service Class Descriptions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rocco, D; Liu, L; Critchlow, T
2004-06-21
Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.
Curating Virtual Data Collections
NASA Technical Reports Server (NTRS)
Lynnes, Chris; Leon, Amanda; Ramapriyan, Hampapuram; Tsontos, Vardis; Shie, Chung-Lin; Liu, Zhong
2015-01-01
NASA's Earth Observing System Data and Information System (EOSDIS) contains a rich set of datasets and related services throughout its many elements. As a result, locating all the EOSDIS data and related resources relevant to a particular science theme can be daunting. This is largely because EOSDIS data are organized more around the way they are produced than around the expected end use. Virtual collections oriented around science themes can overcome this by presenting collections of data and related resources that are organized around the user's interest, not around the way the data were produced. Virtual collections consist of annotated web addresses (URLs) that point to data and related resource addresses, thus avoiding the need to copy all of the relevant data to a single place. These URL addresses can be consumed by a variety of clients, ranging from basic URL downloaders (wget, curl) and web browsers to sophisticated data analysis programs such as the Integrated Data Viewer.
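A virtual collection of this kind can be consumed with nothing more than a loop over the annotated URL list; the sketch below assumes a plain-text file with one URL per line and comment lines starting with '#', which is an assumed layout for illustration rather than a documented EOSDIS format.

```python
import os
import requests

def fetch_collection(url_list_path, out_dir="collection"):
    """Download every URL listed in a virtual-collection file (assumed format)."""
    os.makedirs(out_dir, exist_ok=True)
    with open(url_list_path) as fh:
        for line in fh:
            url = line.strip()
            if not url or url.startswith("#"):   # skip blanks and annotation lines
                continue
            name = url.rstrip("/").split("/")[-1] or "index"
            r = requests.get(url, timeout=60)
            r.raise_for_status()
            with open(os.path.join(out_dir, name), "wb") as out:
                out.write(r.content)

# fetch_collection("sea_surface_temperature_collection.txt")  # hypothetical file name
```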
Muessig, Kathryn E; Nekkanti, Manali; Bauermeister, Jose; Bull, Sheana; Hightow-Weidman, Lisa B
2015-03-01
eHealth, mHealth and "Web 2.0" social media strategies can effectively reach and engage key populations in HIV prevention across the testing, treatment, and care continuum. To assess how these tools are currently being used within the field of HIV prevention and care, we systematically reviewed recent (2013-2014) published literature, conference abstracts, and funded research. Our searches identified 23 published intervention studies and 32 funded projects underway. In this synthesis we describe the technology modes applied and the stages of the HIV care cascade addressed, including both primary and secondary prevention activities. Overall trends include use of new tools including social networking sites, provision of real-time assessment and feedback, gamification and virtual reality. While there has been increasing attention to use of technology to address the care continuum, gaps remain around linkage to care, retention in care, and initiation of antiretroviral therapy.
Dasgupta, Dipanwita; Johnson, Reid A; Chaudhry, Beenish; Reeves, Kimberly G; Willaert, Patty; Chawla, Nitesh V
2016-01-01
Medication non-adherence is a pressing concern among seniors, leading to a lower quality of life and higher healthcare costs. While mobile applications provide a viable medium for medication management, their utility can be limited without tackling the specific needs of seniors and facilitating the active involvement of care providers. To address these limitations, we are developing a tablet-based application designed specifically for seniors to track their medications and a web portal for their care providers to track medication adherence. In collaboration with a local Aging in Place program, we conducted a three-month study with sixteen participants from an independent living facility. Our study found that the application helped participants to effectively track their medications and improved their sense of wellbeing. Our findings highlight the importance of catering to the needs of seniors and of involving care providers in this process, with specific recommendations for the development of future medication management applications.
Pennington, Jeffrey W; Ruth, Byron; Italia, Michael J; Miller, Jeffrey; Wrazien, Stacey; Loutrel, Jennifer G; Crenshaw, E Bryan; White, Peter S
2014-01-01
Biomedical researchers share a common challenge of making complex data understandable and accessible as they seek inherent relationships between attributes in disparate data types. Data discovery in this context is limited by a lack of query systems that efficiently show relationships between individual variables, but without the need to navigate underlying data models. We have addressed this need by developing Harvest, an open-source framework of modular components, and using it for the rapid development and deployment of custom data discovery software applications. Harvest incorporates visualizations of highly dimensional data in a web-based interface that promotes rapid exploration and export of any type of biomedical information, without exposing researchers to underlying data models. We evaluated Harvest with two cases: clinical data from pediatric cardiology and demonstration data from the OpenMRS project. Harvest's architecture and public open-source code offer a set of rapid application development tools to build data discovery applications for domain-specific biomedical data repositories. All resources, including the OpenMRS demonstration, can be found at http://harvest.research.chop.edu.
Pathak, Jyotishman; Murphy, Sean P; Willaert, Brian N; Kremers, Hilal M; Yawn, Barbara P; Rocca, Walter A; Chute, Christopher G
2011-01-01
RxNorm and NDF-RT published by the National Library of Medicine (NLM) and Veterans Affairs (VA), respectively, are two publicly available federal medication terminologies. In this study, we evaluate the applicability of RxNorm and National Drug File-Reference Terminology (NDF-RT) for extraction and classification of medication data retrieved using structured querying and natural language processing techniques from electronic health records at two different medical centers within the Rochester Epidemiology Project (REP). Specifically, we explore how mappings between RxNorm concept codes and NDF-RT drug classes can be leveraged for hierarchical organization and grouping of REP medication data, identify gaps and coverage issues, and analyze the recently released NLM's NDF-RT Web service API. Our study concludes that RxNorm and NDF-RT can be applied together for classification of medication extracted from multiple EHR systems, although several issues and challenges remain to be addressed. We further conclude that the Web service APIs developed by the NLM provide useful functionalities for such activities.
Pennington, Jeffrey W; Ruth, Byron; Italia, Michael J; Miller, Jeffrey; Wrazien, Stacey; Loutrel, Jennifer G; Crenshaw, E Bryan; White, Peter S
2014-01-01
Biomedical researchers share a common challenge of making complex data understandable and accessible as they seek inherent relationships between attributes in disparate data types. Data discovery in this context is limited by a lack of query systems that efficiently show relationships between individual variables, but without the need to navigate underlying data models. We have addressed this need by developing Harvest, an open-source framework of modular components, and using it for the rapid development and deployment of custom data discovery software applications. Harvest incorporates visualizations of highly dimensional data in a web-based interface that promotes rapid exploration and export of any type of biomedical information, without exposing researchers to underlying data models. We evaluated Harvest with two cases: clinical data from pediatric cardiology and demonstration data from the OpenMRS project. Harvest's architecture and public open-source code offer a set of rapid application development tools to build data discovery applications for domain-specific biomedical data repositories. All resources, including the OpenMRS demonstration, can be found at http://harvest.research.chop.edu PMID:24131510
Progress on ultrasonic flaw sizing in turbine-engine rotor components: bore and web geometries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, J.H.; Gray, T.A.; Thompson, R.B.
1983-01-01
The application of generic flaw-sizing techniques to specific components generally involves difficulties associated with geometrical complexity and simplifications arising from a knowledge of the expected flaw distribution. This paper is concerned with the case of ultrasonic flaw sizing in turbine-engine rotor components. The sizing of flat penny-shaped cracks in the web geometry is discussed, and new crack-sizing algorithms based on the Born and Kirchhoff approximations are introduced. Additionally, we propose a simple method for finding the size of a flat, penny-shaped crack given only the magnitude of the scattering amplitude. The bore geometry is discussed with primary emphasis on the cylindrical focusing of the incident beam. Important questions which are addressed include the effects of diffraction and the position of the flaw with respect to the focal line. The appropriate deconvolution procedures to account for these effects are introduced. Generic features of the theory are compared with experiment. Finally, the effects of focused transducers on the Born inversion algorithm are discussed.
Kawamoto, Kensaku; Lobach, David F
2005-01-01
Despite their demonstrated ability to improve care quality, clinical decision support systems are not widely used. In part, this limited use is due to the difficulty of sharing medical knowledge in a machine-executable format. To address this problem, we developed a decision support Web service known as SEBASTIAN. In SEBASTIAN, individual knowledge modules define the data requirements for assessing a patient, the conclusions that can be drawn using that data, and instructions on how to generate those conclusions. Using standards-based XML messages transmitted over HTTP, client decision support applications provide patient data to SEBASTIAN and receive patient-specific assessments and recommendations. SEBASTIAN has been used to implement four distinct decision support systems; an architectural overview is provided for one of these systems. Preliminary assessments indicate that SEBASTIAN fulfills all original design objectives, including the re-use of executable medical knowledge across diverse applications and care settings, the straightforward authoring of knowledge modules, and use of the framework to implement decision support applications with significant clinical utility.
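The abstract describes the interaction pattern (standards-based XML messages over HTTP) without giving the message schema, so the client sketch below uses an invented endpoint and invented element names purely to illustrate the request/response cycle of such a decision support web service.

```python
import requests
import xml.etree.ElementTree as ET

# Both the endpoint and the XML vocabulary below are hypothetical illustrations
ENDPOINT = "https://decision-support.example.org/evaluate"

patient_xml = """<?xml version="1.0"?>
<patient>
  <age>67</age>
  <diagnosis code="E11">Type 2 diabetes mellitus</diagnosis>
  <lab name="HbA1c" value="8.4" unit="%"/>
</patient>"""

response = requests.post(
    ENDPOINT,
    data=patient_xml,
    headers={"Content-Type": "application/xml"},
)
response.raise_for_status()

# Assume the service returns <recommendation> elements in its reply
for rec in ET.fromstring(response.text).iter("recommendation"):
    print("-", rec.text)
```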
NASA Astrophysics Data System (ADS)
Castellazzi, Bernhard; Biberacher, Markus
2016-04-01
Many European cities nowadays offer their citizens Web-GIS applications to access data about solar potentials for specific buildings. However, the actual benefit of such solar systems can only be investigated if their generation is not considered in isolation but in combination with information about the temporal profile of energy demand (heat, electricity), the type of primary heating system, hourly internal consumption of photovoltaic power, feed-in power, and other important financial and ecological aspects. Hence, the presented application addresses citizens who are interested in the integration of solar power in buildings and would like an extended view of the related impacts. Based on user inputs on building parameters and energy use, as well as solar data with high spatial and temporal resolution for individual roof areas, the financial and ecological effects of solar thermal installations and PV are estimated. Interactions between heat and power generation are also considered in the implemented approach. The tool was developed within the Central Europe project "Cities on Power" and is being realized for the cities of Torino, Warsaw, Dresden, Klagenfurt and Ravenna.
Individual-based models in ecology after four decades
Grimm, Volker
2014-01-01
Individual-based models simulate populations and communities by following individuals and their properties. They have been used in ecology for more than four decades, with their use and ubiquity in ecology growing rapidly in the last two decades. Individual-based models have been used for many applied or “pragmatic” issues, such as informing the protection and management of particular populations in specific locations, but their use in addressing theoretical questions has also grown rapidly, recently helping us to understand how the sets of traits of individual organisms influence the assembly of communities and food webs. Individual-based models will play an increasingly important role in questions posed by complex ecological systems. PMID:24991416
Molecular trophic markers in marine food webs and their potential use for coral ecology.
Leal, Miguel Costa; Ferrier-Pagès, Christine
2016-10-01
Notable advances in ecological genomics have been driven by high-throughput sequencing technology and taxonomically broad sequence repositories that allow us to accurately assess species interactions with great taxonomic resolution. The use of DNA as a marker for ingested food is particularly relevant to address predator-prey interactions and disentangle complex marine food webs. DNA-based methods benefit from reductionist molecular approaches to address ecosystem scale processes, such as community structure and energy flow across trophic levels, among others. Here we review how molecular trophic markers have been used to better understand trophic interactions in the marine environment and their advantages and limitations. We focus on animal groups where research has been focused, such as marine mammals, seabirds, fishes, pelagic invertebrates and benthic invertebrates, and use case studies to illustrate how DNA-based methods unraveled food-web interactions. The potential of molecular trophic markers for disentangling the complex trophic ecology of corals is also discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
Implementing WebQuest Based Instruction on Newton's Second Law
ERIC Educational Resources Information Center
Gokalp, Muhammed Sait; Sharma, Manjula; Johnston, Ian; Sharma, Mia
2013-01-01
The purpose of this study was to investigate how WebQuests can be used in physics classes for teaching specific concepts. The study had three stages. The first stage was to develop a WebQuest on Newton's second law. The second stage involved developing a lesson plan to implement the WebQuest in class. In the final stage, the WebQuest was…
A web-based approach to managing stress and mood disorders in the workforce.
Billings, Douglas W; Cook, Royer F; Hendrickson, April; Dove, David C
2008-08-01
To evaluate the effectiveness of a web-based multimedia health promotion program for the workplace, designed to help reduce stress and to prevent depression, anxiety, and substance abuse. Using a randomized controlled trial design, 309 working adults were randomly assigned to the web-based condition or to a wait-list control condition. All participants were assessed on multiple self-reported outcomes at pretest and posttest. Relative to controls, the web-based group reduced their stress, increased their knowledge of depression and anxiety, developed more positive attitudes toward treatment, and adopted a more healthy approach to alcohol consumption. We found that a brief and easily adaptable web-based stress management program can simultaneously reduce worker stress and address stigmatized behavioral health problems by embedding this prevention material into a more positive stress management framework.
Göritz, Anja S; Birnbaum, Michael H
2005-11-01
The customizable PHP script Generic HTML Form Processor is intended to assist researchers and students in quickly setting up surveys and experiments that can be administered via the Web. This script relieves researchers from the burdens of writing new CGI scripts and building databases for each Web study. Generic HTML Form Processor processes any syntactically correct HTML form input and saves it into a dynamically created open-source database. We describe five modes for usage of the script that allow increasing functionality but require increasing levels of knowledge of PHP and Web servers: The first two modes require no previous knowledge, and the fifth requires PHP programming expertise. Use of Generic HTML Form Processor is free for academic purposes, and its Web address is www.goeritz.net/brmic.
Load Index Metrics for an Optimized Management of Web Services: A Systematic Evaluation
Souza, Paulo S. L.; Santana, Regina H. C.; Santana, Marcos J.; Zaluska, Ed; Faical, Bruno S.; Estrella, Julio C.
2013-01-01
The lack of precision to predict service performance through load indices may lead to wrong decisions regarding the use of web services, compromising service performance and raising platform cost unnecessarily. This paper presents experimental studies to qualify the behaviour of load indices in the web service context. The experiments consider three services that generate controlled and significant server demands, four levels of workload for each service and six distinct execution scenarios. The evaluation considers three relevant perspectives: the capability for representing recent workloads, the capability for predicting near-future performance and finally stability. Eight different load indices were analysed, including the JMX Average Time index (proposed in this paper) specifically designed to address the limitations of the other indices. A systematic approach is applied to evaluate the different load indices, considering a multiple linear regression model based on the stepwise-AIC method. The results show that the load indices studied represent the workload to some extent; however, in contrast to expectations, most of them do not exhibit a coherent correlation with service performance and this can result in stability problems. The JMX Average Time index is an exception, showing a stable behaviour which is tightly-coupled to the service runtime for all executions. Load indices are used to predict the service runtime and therefore their inappropriate use can lead to decisions that will impact negatively on both service performance and execution cost. PMID:23874776
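The paper's evaluation relies on multiple linear regression with stepwise selection guided by AIC; a minimal forward-selection sketch of that idea is shown below, using statsmodels and synthetic data (the index names and data are fabricated for illustration only and are not the paper's measurements).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
# Synthetic stand-ins for candidate load indices (illustration only)
X = {name: rng.normal(size=n) for name in ["cpu_util", "mem_util", "queue_len", "net_io"]}
runtime = 2.0 * X["cpu_util"] + 0.5 * X["queue_len"] + rng.normal(scale=0.5, size=n)

def fit_aic(columns):
    """Fit an OLS model of runtime on the chosen indices and return its AIC."""
    if columns:
        design = sm.add_constant(np.column_stack([X[c] for c in columns]))
    else:
        design = np.ones((n, 1))          # intercept-only model
    return sm.OLS(runtime, design).fit().aic

# Forward stepwise selection: add the index that lowers AIC most; stop when none does
selected, remaining = [], set(X)
while remaining:
    best = min(remaining, key=lambda c: fit_aic(selected + [c]))
    if fit_aic(selected + [best]) >= fit_aic(selected):
        break
    selected.append(best)
    remaining.remove(best)

print("Selected indices:", selected)
```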
PACCMIT/PACCMIT-CDS: identifying microRNA targets in 3' UTRs and coding sequences.
Šulc, Miroslav; Marín, Ray M; Robins, Harlan S; Vaníček, Jiří
2015-07-01
The purpose of the proposed web server, publicly available at http://paccmit.epfl.ch, is to provide a user-friendly interface to two algorithms for predicting messenger RNA (mRNA) molecules regulated by microRNAs: (i) PACCMIT (Prediction of ACcessible and/or Conserved MIcroRNA Targets), which identifies primarily mRNA transcripts targeted in their 3' untranslated regions (3' UTRs), and (ii) PACCMIT-CDS, designed to find mRNAs targeted within their coding sequences (CDSs). While PACCMIT belongs among the accurate algorithms for predicting conserved microRNA targets in the 3' UTRs, the main contribution of the web server is 2-fold: PACCMIT provides an accurate tool for predicting targets also of weakly conserved or non-conserved microRNAs, whereas PACCMIT-CDS addresses the lack of similar portals adapted specifically for targets in CDS. The web server asks the user for microRNAs and mRNAs to be analyzed, accesses the precomputed P-values for all microRNA-mRNA pairs from a database for all mRNAs and microRNAs in a given species, ranks the predicted microRNA-mRNA pairs, evaluates their significance according to the false discovery rate and finally displays the predictions in a tabular form. The results are also available for download in several standard formats. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
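The server's final step, ranking predictions by P-value and assessing them by false discovery rate, corresponds to the standard Benjamini-Hochberg procedure; the generic sketch below illustrates that step and is not code from the PACCMIT server itself.

```python
import numpy as np

def benjamini_hochberg(p_values):
    """Return Benjamini-Hochberg adjusted q-values for an array of raw P-values."""
    p = np.asarray(p_values, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order] * m / np.arange(1, m + 1)        # p_(i) * m / i
    # Enforce monotonicity from the largest rank downward
    q = np.minimum.accumulate(ranked[::-1])[::-1]
    out = np.empty(m)
    out[order] = np.clip(q, 0, 1)
    return out

# Example: raw P-values for hypothetical microRNA-mRNA pairs
print(benjamini_hochberg([0.001, 0.009, 0.04, 0.20, 0.75]))
```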
Terminology for Neuroscience Data Discovery: Multi-tree Syntax and Investigator-Derived Semantics
Goldberg, David H.; Grafstein, Bernice; Robert, Adrian; Gardner, Esther P.
2009-01-01
The Neuroscience Information Framework (NIF), developed for the NIH Blueprint for Neuroscience Research and available at http://nif.nih.gov and http://neurogateway.org, is built upon a set of coordinated terminology components enabling data and web-resource description and selection. Core NIF terminologies use a straightforward syntax designed for ease of use and for navigation by familiar web interfaces, and readily exportable to aid development of relational-model databases for neuroscience data sharing. Datasets, data analysis tools, web resources, and other entities are characterized by multiple descriptors, each addressing core concepts, including data type, acquisition technique, neuroanatomy, and cell class. Terms for each concept are organized in a tree structure, providing is-a and has-a relations. Broad general terms near each root span the category or concept and spawn more detailed entries for specificity. Related but distinct concepts (e.g., brain area and depth) are specified by separate trees, for easier navigation than would be required by graph representation. Semantics enabling NIF data discovery were selected at one or more workshops by investigators expert in particular systems (vision, olfaction, behavioral neuroscience, neurodevelopment), brain areas (cerebellum, thalamus, hippocampus), preparations (molluscs, fly), diseases (neurodegenerative disease), or techniques (microscopy, computation and modeling, neurogenetics). Workshop-derived integrated term lists are available Open Source at http://brainml.org; a complete list of participants is at http://brainml.org/workshops. PMID:18958630
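The multi-tree organization described here, with broad terms near each root spawning more specific children, can be sketched with a simple node structure; the concept and term names in the example are illustrative and are not entries from the actual NIF vocabularies.

```python
from dataclasses import dataclass, field

@dataclass
class Term:
    """A node in one concept tree (e.g., neuroanatomy); children are is-a refinements."""
    name: str
    children: list = field(default_factory=list)

    def add(self, name):
        child = Term(name)
        self.children.append(child)
        return child

    def walk(self, depth=0):
        print("  " * depth + self.name)
        for child in self.children:
            child.walk(depth + 1)

# Illustrative fragment of a neuroanatomy tree; separate trees would cover other
# concepts such as data type or acquisition technique.
brain = Term("brain")
forebrain = brain.add("forebrain")
forebrain.add("hippocampus")
forebrain.add("thalamus")
brain.walk()
```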
Hofmeister, Erik H; Watson, Victoria; Snyder, Lindsey B C; Love, Emma J
2008-12-15
To determine the validity of the information on the World Wide Web concerning veterinary anesthesia in dogs and to determine the methods dog owners use to obtain that information. Web-based search and client survey. 73 Web sites and 92 clients. Web sites were scored on a 5-point scale for completeness and accuracy of information about veterinary anesthesia by 3 board-certified anesthesiologists. A search for anesthetic information regarding 49 specific breeds of dogs was also performed. A survey was distributed to the clients who visited the University of Georgia Veterinary Teaching Hospital during a 4-month period to solicit data about sources used by clients to obtain veterinary medical information and the manner in which information obtained from Web sites was used. The general search identified 73 Web sites that included information on veterinary anesthesia; these sites received a mean score of 3.4 for accuracy and 2.5 for completeness. Of 178 Web sites identified through the breed-specific search, 57 (32%) indicated that a particular breed was sensitive to anesthesia. Of 83 usable, completed surveys, 72 (87%) indicated the client used the Web for veterinary medical information. Fifteen clients (18%) indicated they believed their animal was sensitive to anesthesia because of its breed. Information available on the internet regarding anesthesia in dogs is generally not complete and may be misleading with respect to risks to specific breeds. Consequently, veterinarians should appropriately educate clients regarding anesthetic risk to their particular dog.
NASA Astrophysics Data System (ADS)
Santhana Vannan, S.; Cook, R. B.; Wilson, B. E.; Wei, Y.
2010-12-01
Terrestrial ecology data sets are produced from diverse data sources such as model output, field data collection, laboratory analysis and remote sensing observation. These data sets can be created, distributed, and consumed in diverse ways as well. However, this diversity can hinder the usability of the data, and limit data users' abilities to validate and reuse data for science and application purposes. Geospatial web services, such as those described in this paper, are an important means of reducing this burden. Terrestrial ecology researchers generally create the data sets in diverse file formats, with file and data structures tailored to the specific needs of their project, possibly as tabular data, geospatial images, or documentation in a report. Data centers may reformat the data to an archive-stable format and distribute the data sets through one or more protocols, such as FTP, email, and WWW. Because of the diverse data preparation, delivery, and usage patterns, users have to invest time and resources to bring the data into the format and structure most useful for their analysis. This time-consuming data preparation process shifts valuable resources from data analysis to data assembly. To address these issues, the ORNL DAAC, a NASA-sponsored terrestrial ecology data center, has utilized geospatial Web service technology, such as Open Geospatial Consortium (OGC) Web Map Service (WMS) and OGC Web Coverage Service (WCS) standards, to increase the usability and availability of terrestrial ecology data sets. Data sets are standardized into non-proprietary file formats and distributed through OGC Web Service standards. OGC Web services allow the ORNL DAAC to store data sets in a single format and distribute them in multiple ways and formats. Registering the OGC Web services through search catalogues and other spatial data tools allows for publicizing the data sets and makes them more available across the Internet. The ORNL DAAC has also created a Web-based graphical user interface called Spatial Data Access Tool (SDAT) that utilizes OGC Web services standards and allows data distribution and consumption for users not familiar with OGC standards. SDAT also allows users to visualize the data set prior to download. Google Earth visualizations of the data set are also provided through SDAT. The use of OGC Web service standards at the ORNL DAAC has enabled an increase in data consumption. In one case, a data set had a ~10-fold increase in downloads through OGC Web services in comparison to the conventional FTP and WWW methods of access. The increase in downloads suggests that users are not only finding the data sets they need but are also able to consume them readily in the format they need.
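As a concrete illustration of the OGC route, a WMS GetMap request can be issued with ordinary HTTP parameters; the server URL, layer name, and bounding box below are placeholders, while the parameter names follow the standard WMS 1.1.1 convention.

```python
import requests

# Placeholder endpoint and layer; real values come from the server's GetCapabilities response
WMS_URL = "https://webmap.example.org/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "net_primary_productivity",   # hypothetical layer name
    "SRS": "EPSG:4326",
    "BBOX": "-180,-90,180,90",
    "WIDTH": 1024,
    "HEIGHT": 512,
    "FORMAT": "image/png",
}

response = requests.get(WMS_URL, params=params, timeout=60)
response.raise_for_status()
with open("npp_map.png", "wb") as fh:
    fh.write(response.content)   # rendered map image returned by the server
```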
ERIC Educational Resources Information Center
Geçer, Aynur Kolburan
2014-01-01
This study addresses university students' information search and commitment strategies on web environment and internet usage self-efficacy beliefs in terms of such variables as gender, department, grade level and frequency of internet use; and whether there is a significant relation between these beliefs. Descriptive method was used in the study.…
ERIC Educational Resources Information Center
Scott, George A.
2010-01-01
In this report, the author and his colleagues respond to a mandate in the Higher Education Opportunity Act requiring GAO (Government Accountability Office) to study the feasibility of developing a national clearinghouse of federal and private student loans on the Department of Education's (Education) Web site. They addressed the following…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-11
... the RoC. The process is available on the NTP Web site ( http://ntp.niehs.nih.gov/go/rocprocess ) or by... Counselors public meeting (76 FR 68461) on December 15, 2011 ( http://ntp.niehs.nih.gov/go/9741 ). The NTP... Web site ( http://ntp.niehs.nih.gov/go/rocprocess ) or by contacting Dr. Lunn (see ADDRESSES...
10th Conference on Bayesian Nonparametrics
2016-05-08
North Carolina State University, 2701 Sullivan Drive, Admin Srvcs III, Box 7514, Raleigh, NC 27695-7514. ...the conference. The findings from the conference are widely disseminated. The conference web site displays slides of the talks presented in the...being published by the Electronic Journal of Statistics, consisting of about 20 papers read at the conference.
Vosbergen, Sandra; Mahieu, Guy R; Laan, Eva K; Kraaijenhagen, Roderik A; Jaspers, Monique WM
2014-01-01
Background Increasingly, Web-based health applications are developed for the prevention and management of chronic diseases. However, their reach and utilization is often disappointing. Qualitative evaluations post-implementation can be used to inform the optimization process and ultimately enhance their adoption. In current practice, such evaluations are mainly performed with end-user surveys. However, a review approach by experts in a focus group may be easier to administer and might provide similar results. Objective The aim of this study was to assess whether industrial design engineers in a focus group would address the same issues as end users in a Web-based survey when evaluating a commercial Web-based health risk assessment (HRA) with tailored feedback. Methods Seven Dutch companies used the HRA as part of their corporate health management strategy. Employees using the HRA (N=2289) and 10 independent industrial designers were invited to participate in the study. The HRA consisted of four components: (1) an electronic health questionnaire, (2) biometric measurements, (3) laboratory evaluation, and (4) individually tailored feedback generated by decision support software. After participating in the HRA as end users, both end users and designers evaluated the program. End users completed an evaluation questionnaire that included a free-text field. Designers participated in a focus group discussion. Constructs from user satisfaction and technology acceptance theories were used to categorize and compare the remarks from both evaluations. Results We assessed and qualitatively analyzed 294 remarks of 189 end users and 337 remarks of 6 industrial designers, pertaining to 295 issues in total. Of those, 137 issues were addressed in the end-user survey and 148 issues in the designer focus group. Only 7.3% (10/137) of the issues addressed in the survey were also addressed in the focus group. End users made more remarks about the usefulness of the HRA and prior expectations that were not met. Designers made more remarks about how the information was presented to end users, quality of the feedback provided by the HRA, recommendations on the marketing and on how to create more unity in the design of the HRA, and on how to improve the HRA based on these issues. Conclusions End-user surveys should not be substituted for expert focus groups. Issues identified by end users in the survey and designers in the focus group differed considerably, and the focus group produced a lot of new issues. The issues addressed in the focus group often focused on different aspects of user satisfaction and technology acceptance than those addressed by the survey participants; when they did focus on the same aspects, then the nature of issues differed considerably in content. PMID:24384408
Vosbergen, Sandra; Mahieu, Guy R; Laan, Eva K; Kraaijenhagen, Roderik A; Jaspers, Monique Wm; Peek, Niels
2014-01-02
Increasingly, Web-based health applications are developed for the prevention and management of chronic diseases. However, their reach and utilization are often disappointing. Qualitative evaluations post-implementation can be used to inform the optimization process and ultimately enhance their adoption. In current practice, such evaluations are mainly performed with end-user surveys. However, a review approach by experts in a focus group may be easier to administer and might provide similar results. The aim of this study was to assess whether industrial design engineers in a focus group would address the same issues as end users in a Web-based survey when evaluating a commercial Web-based health risk assessment (HRA) with tailored feedback. Seven Dutch companies used the HRA as part of their corporate health management strategy. Employees using the HRA (N=2289) and 10 independent industrial designers were invited to participate in the study. The HRA consisted of four components: (1) an electronic health questionnaire, (2) biometric measurements, (3) laboratory evaluation, and (4) individually tailored feedback generated by decision support software. After participating in the HRA as end users, both end users and designers evaluated the program. End users completed an evaluation questionnaire that included a free-text field. Designers participated in a focus group discussion. Constructs from user satisfaction and technology acceptance theories were used to categorize and compare the remarks from both evaluations. We assessed and qualitatively analyzed 294 remarks of 189 end users and 337 remarks of 6 industrial designers, pertaining to 295 issues in total. Of those, 137 issues were addressed in the end-user survey and 148 issues in the designer focus group. Only 7.3% (10/137) of the issues addressed in the survey were also addressed in the focus group. End users made more remarks about the usefulness of the HRA and prior expectations that were not met. Designers made more remarks about how the information was presented to end users, quality of the feedback provided by the HRA, recommendations on the marketing and on how to create more unity in the design of the HRA, and on how to improve the HRA based on these issues. End-user surveys should not be substituted for expert focus groups. Issues identified by end users in the survey and designers in the focus group differed considerably, and the focus group produced many new issues. The issues addressed in the focus group often focused on different aspects of user satisfaction and technology acceptance than those addressed by the survey participants; when they did focus on the same aspects, the nature of the issues differed considerably in content.
31 CFR 558.305 - Licenses; general and specific.
Code of Federal Regulations, 2014 CFR
2014-07-01
... made available on OFAC's Web site: www.treasury.gov/ofac. (c) The term specific license means any... available on OFAC's Web site: www.treasury.gov/ofac. Note to § 558.305: See § 501.801 of this chapter on...
31 CFR 537.310 - Licenses; general and specific.
Code of Federal Regulations, 2014 CFR
2014-07-01
... part or are made available on OFAC's Web site: www.treasury.gov/ofac. (c) The term specific license... part or made available on OFAC's Web site. Note to § 537.310: See § 501.801 of this chapter on...
A systematic review of studies of web portals for patients with diabetes mellitus.
Coughlin, Steven S; Williams, Lovoria B; Hatzigeorgiou, Christos
2017-01-01
Patient web portals are password-protected online websites that offer patients 24-hour access to personal health information from anywhere with an Internet connection. Due to advances in health information technologies, there has been increasing interest among providers and researchers in patient web portals for use by patients with diabetes and other chronic conditions. This article, which is based upon bibliographic searches in PubMed, reviews web portals for patients with diabetes mellitus including patient web portals tethered to electronic medical records and web portals developed specifically for patients with diabetes. Twelve studies of the impact of patient web portals on the management of diabetes patients were identified. Three had a cross-sectional design, one employed mixed methods, one had a matched-control design, three had a retrospective cohort design, and five were randomized controlled trials. Six (50%) of the studies examined web portals tethered to electronic medical records and the remainder were web portals developed specifically for diabetes patients. The results of this review suggest that secure messaging between adult diabetic patients and their clinician is associated with improved glycemic control. However, results from observational studies indicate that many diabetic patients do not take advantage of web portal features such as secure messaging, perhaps because of a lack of internet access or lack of experience in navigating web portal resources. Although results from randomized controlled trials provide stronger evidence of the efficacy of web portal use in improving glycemic control among diabetic patients, the number of trials is small and results from the trials have been mixed. Studies suggest that secure messaging between adult diabetic patients and their clinician is associated with improved glycemic control, but negative findings have also been reported. The number of randomized controlled trials that have examined the efficacy of web portal use in improving glycemic control among diabetic patients is still small. Additional research is needed to identify specific portal features that may impact quality of care or improve glycemic control.
A systematic review of studies of web portals for patients with diabetes mellitus
Williams, Lovoria B.; Hatzigeorgiou, Christos
2017-01-01
Patient web portals are password-protected online websites that offer patients 24-hour access to personal health information from anywhere with an Internet connection. Due to advances in health information technologies, there has been increasing interest among providers and researchers in patient web portals for use by patients with diabetes and other chronic conditions. This article, which is based upon bibliographic searches in PubMed, reviews web portals for patients with diabetes mellitus including patient web portals tethered to electronic medical records and web portals developed specifically for patients with diabetes. Twelve studies of the impact of patient web portals on the management of diabetes patients were identified. Three had a cross-sectional design, one employed mixed methods, one had a matched-control design, three had a retrospective cohort design, and five were randomized controlled trials. Six (50%) of the studies examined web portals tethered to electronic medical records and the remainder were web portals developed specifically for diabetes patients. The results of this review suggest that secure messaging between adult diabetic patients and their clinician is associated with improved glycemic control. However, results from observational studies indicate that many diabetic patients do not take advantage of web portal features such as secure messaging, perhaps because of a lack of internet access or lack of experience in navigating web portal resources. Although results from randomized controlled trials provide stronger evidence of the efficacy of web portal use in improving glycemic control among diabetic patients, the number of trials is small and results from the trials have been mixed. Studies suggest that secure messaging between adult diabetic patients and their clinician is associated with improved glycemic control, but negative findings have also been reported. The number of randomized controlled trials that have examined the efficacy of web portal use in improving glycemic control among diabetic patients is still small. Additional research is needed to identify specific portal features that may impact quality of care or improve glycemic control. PMID:28736732
Processing biological literature with customizable Web services supporting interoperable formats.
Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia
2014-01-01
Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. © The Author(s) 2014. Published by Oxford University Press.
Processing biological literature with customizable Web services supporting interoperable formats
Rak, Rafal; Batista-Navarro, Riza Theresa; Carter, Jacob; Rowley, Andrew; Ananiadou, Sophia
2014-01-01
Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk. PMID:25006225
Sojic, Aleksandra; Terkaj, Walter; Contini, Giorgia; Sacco, Marco
2016-05-04
The public health initiatives for obesity prevention are increasingly exploiting the advantages of smart technologies that can register various kinds of data related to physical, physiological, and behavioural conditions. Since individual features and habits vary among people, the design of appropriate intervention strategies for motivating changes in behavioural patterns towards a healthy lifestyle requires the interpretation and integration of collected information, while considering individual profiles in a personalised manner. The ontology-based modelling is recognised as a promising approach in facing the interoperability and integration of heterogeneous information related to characterisation of personal profiles. The presented ontology captures individual profiles across several obesity-related knowledge-domains structured into dedicated modules in order to support inference about health condition, physical features, behavioural habits associated with a person, and relevant changes over time. The modularisation strategy is designed to facilitate ontology development, maintenance, and reuse. The domain-specific modules formalised in the Web Ontology Language (OWL) integrate the domain-specific sets of rules formalised in the Semantic Web Rule Language (SWRL). The inference rules follow a modelling pattern designed to support personalised assessment of health condition as age- and gender-specific. The test cases exemplify a personalised assessment of the obesity-related health conditions for the population of teenagers. The paper addresses several issues concerning the modelling of normative concepts related to obesity and depicts how the public health concern impacts classification of teenagers according to their phenotypes. The modelling choices regarding the ontology-structure are explained in the context of the modelling goal to integrate multiple knowledge-domains and support reasoning about the individual changes over time. The presented modularisation pattern enhances reusability of the domain-specific modules across various health care domains.
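The abstract describes inference rules that assess health condition in an age- and gender-specific way. As a rough illustration of that rule pattern only (the ontology's actual OWL/SWRL modules are not reproduced here, and the cut-off values below are invented placeholders, not medical guidance), a minimal sketch might look like this:

```python
from dataclasses import dataclass

@dataclass
class Profile:
    age: int        # years
    gender: str     # "male" or "female"
    bmi: float      # body mass index, kg/m^2

# Hypothetical cut-offs for illustration only; the ontology's real SWRL rules
# and its age/gender-specific thresholds are not reproduced here.
OVERWEIGHT_CUTOFF = {
    ("female", "teen"): 24.0,
    ("male", "teen"): 25.0,
}

def age_band(age: int) -> str:
    # Map an age in years to the coarse band used by the toy rule table.
    return "teen" if 13 <= age <= 19 else "other"

def classify(profile: Profile) -> str:
    """Apply an age- and gender-specific rule, mirroring the rule pattern
    described in the abstract (personalised assessment by phenotype)."""
    cutoff = OVERWEIGHT_CUTOFF.get((profile.gender, age_band(profile.age)))
    if cutoff is None:
        return "outside the scope of this sketch"
    return "overweight" if profile.bmi >= cutoff else "within normal range"

print(classify(Profile(age=15, gender="female", bmi=25.1)))
```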
Gender in health technology assessment: pilot study on agency approaches.
Panteli, Dimitra; Zentner, Annette; Storz-Pfennig, Philipp; Busse, Reinhard
2011-07-01
Gender as a social construct is a recognized health determinant. Because best practice in reporting health technology assessment (HTA) clearly specifies the need to appraise a technology's social impact within the target population, the extent to which gender issues are taken into account in HTA production is of interest, not only in light of equitable practices but also for reasons of effectiveness. The aim of this study is to provide a first assessment of the degree of gender sensitivity shown by HTA agencies around the world today. The Web sites of sixty HTA agencies were analyzed. The consideration of gender aspects was specifically looked for in each agency's general mission statement, its priority setting process, and its methodological approach. Additionally, specific gender-oriented initiatives not belonging to any of the aforementioned categories were identified. Of the sixty agencies, less than half mention a commitment to addressing the social implication of health technologies. Only fifteen institutions make information on their priority setting principles available on their Web sites and gender was an issue in two of those cases. Data on methodology were obtainable online from 18 agencies, two of which mentioned gender issues explicitly. Finally, gender-oriented initiatives were identified by thirteen agencies. A gender-sensitive approach is apparently rarely adopted in current HTA production. Exceptional practices and relevant tools do exist and could serve as examples to be promoted by international collaborative networks.
BCube: Building a Geoscience Brokering Framework
NASA Astrophysics Data System (ADS)
Jodha Khalsa, Siri; Nativi, Stefano; Duerr, Ruth; Pearlman, Jay
2014-05-01
BCube is addressing the need for effective and efficient multi-disciplinary collaboration and interoperability through the advancement of brokering technologies. As a prototype "building block" for NSF's EarthCube cyberinfrastructure initiative, BCube is demonstrating how a broker can serve as an intermediary between information systems that implement well-defined interfaces, thereby providing a bridge between communities that employ different specifications. Building on the GEOSS Discover and Access Broker (DAB), BCube will develop new modules and services including:
• Expanded semantic brokering capabilities
• Business model support for workflows
• Automated metadata generation
• Automated linking to services discovered via web crawling
• Credential passing for seamless access to data
• Ranking of search results from brokered catalogs
Because facilitating cross-discipline research involves cultural as well as technical challenges, BCube is also addressing the sociological and educational components of infrastructure development. We are working, initially, with four geoscience disciplines: hydrology, oceans, polar, and weather, with an emphasis on connecting existing domain infrastructure elements to facilitate cross-domain communications.
Nagamani, S; Gaur, A S; Tanneeru, K; Muneeswaran, G; Madugula, S S; Consortium, Mpds; Druzhilovskiy, D; Poroikov, V V; Sastry, G N
2017-11-01
Molecular property diagnostic suite (MPDS) is a Galaxy-based open source drug discovery and development platform. MPDS web portals are designed for several diseases, such as tuberculosis, diabetes mellitus, and other metabolic disorders, specifically aimed to evaluate and estimate the drug-likeness of a given molecule. MPDS consists of three modules, namely data libraries, data processing, and data analysis tools which are configured and interconnected to assist drug discovery for specific diseases. The data library module encompasses vast information on chemical space, wherein the MPDS compound library comprises 110.31 million unique molecules generated from public domain databases. Every molecule is assigned with a unique ID and card, which provides complete information for the molecule. Some of the modules in the MPDS are specific to the diseases, while others are non-specific. Importantly, a suitably altered protocol can be effectively generated for another disease-specific MPDS web portal by modifying some of the modules. Thus, the MPDS suite of web portals shows great promise to emerge as disease-specific portals of great value, integrating chemoinformatics, bioinformatics, molecular modelling, and structure- and analogue-based drug discovery approaches.
SSWAP: A Simple Semantic Web Architecture and Protocol for semantic web services.
Gessler, Damian D G; Schiltz, Gary S; May, Greg D; Avraham, Shulamit; Town, Christopher D; Grant, David; Nelson, Rex T
2009-09-23
SSWAP (Simple Semantic Web Architecture and Protocol; pronounced "swap") is an architecture, protocol, and platform for using reasoning to semantically integrate heterogeneous disparate data and services on the web. SSWAP was developed as a hybrid semantic web services technology to overcome limitations found in both pure web service technologies and pure semantic web technologies. There are currently over 2400 resources published in SSWAP. Approximately two dozen are custom-written services for QTL (Quantitative Trait Loci) and mapping data for legumes and grasses (grains). The remaining are wrappers to Nucleic Acids Research Database and Web Server entries. As an architecture, SSWAP establishes how clients (users of data, services, and ontologies), providers (suppliers of data, services, and ontologies), and discovery servers (semantic search engines) interact to allow for the description, querying, discovery, invocation, and response of semantic web services. As a protocol, SSWAP provides the vocabulary and semantics to allow clients, providers, and discovery servers to engage in semantic web services. The protocol is based on the W3C-sanctioned first-order description logic language OWL DL. As an open source platform, a discovery server running at http://sswap.info (as in to "swap info") uses the description logic reasoner Pellet to integrate semantic resources. The platform hosts an interactive guide to the protocol at http://sswap.info/protocol.jsp, developer tools at http://sswap.info/developer.jsp, and a portal to third-party ontologies at http://sswapmeet.sswap.info (a "swap meet"). SSWAP addresses the three basic requirements of a semantic web services architecture (i.e., a common syntax, shared semantic, and semantic discovery) while addressing three technology limitations common in distributed service systems: i.e., i) the fatal mutability of traditional interfaces, ii) the rigidity and fragility of static subsumption hierarchies, and iii) the confounding of content, structure, and presentation. SSWAP is novel by establishing the concept of a canonical yet mutable OWL DL graph that allows data and service providers to describe their resources, to allow discovery servers to offer semantically rich search engines, to allow clients to discover and invoke those resources, and to allow providers to respond with semantically tagged data. SSWAP allows for a mix-and-match of terms from both new and legacy third-party ontologies in these graphs.
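SSWAP's central idea is that providers publish resource descriptions as OWL DL graphs that clients and discovery servers can reason over. The sketch below is not the SSWAP protocol itself; it only shows, under invented example terms and namespaces, how a small RDF/OWL description of a hypothetical provider resource could be assembled with the rdflib library:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import OWL, RDF, RDFS

# Invented demo namespace; the real SSWAP ontologies live under sswap.info,
# but none of the terms below are taken from them.
EX = Namespace("http://example.org/sswap-demo#")

g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

# Describe a hypothetical provider resource as an OWL class plus one instance.
service = URIRef("http://example.org/services/qtl-lookup")
g.add((EX.QTLLookup, RDF.type, OWL.Class))
g.add((EX.QTLLookup, RDFS.comment, Literal("Maps a genetic marker name to QTL records.")))
g.add((service, RDF.type, EX.QTLLookup))
g.add((service, RDFS.label, Literal("Example QTL lookup service")))

# A discovery server could load and reason over descriptions published like this.
print(g.serialize(format="turtle"))
```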
2016-04-30
software (OSS) and proprietary (CSS) software elements or remote services (Scacchi, 2002, 2010), eventually including recent efforts to support Web ...specific platforms, including those operating on secured Web/mobile devices. Common Development Technology provides AC development tools and common...transition to OA systems and OSS software elements, specifically for Web and Mobile devices within the realm of C3CB. OA, Open APIs, OSS, and CSS OA
DICOMweb™: Background and Application of the Web Standard for Medical Imaging.
Genereaux, Brad W; Dennison, Donald K; Ho, Kinson; Horn, Robert; Silver, Elliot Lewis; O'Donnell, Kevin; Kahn, Charles E
2018-05-10
This paper describes why and how DICOM, the standard that has been the basis for medical imaging interoperability around the world for several decades, has been extended into a full web technology-based standard, DICOMweb. At the turn of the century, healthcare embraced information technology, which created new problems and new opportunities for the medical imaging industry; at the same time, web technologies matured and began serving other domains well. This paper describes DICOMweb, how it extended the DICOM standard, and how DICOMweb can be applied to problems facing healthcare applications to address workflow and the changing healthcare climate.
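DICOMweb exposes DICOM operations over HTTP; for example, its QIDO-RS service lets a client search for studies and receive results as DICOM JSON. The sketch below assumes a hypothetical server URL and example patient ID, and is only an illustration of that query style, not an excerpt from the paper:

```python
import requests

# Hypothetical DICOMweb root; substitute a real PACS's QIDO-RS endpoint.
BASE = "https://pacs.example.org/dicom-web"

# QIDO-RS study search: query parameters select matching studies and the
# Accept header asks for the DICOM JSON representation.
resp = requests.get(
    f"{BASE}/studies",
    params={"PatientID": "12345", "limit": "10"},
    headers={"Accept": "application/dicom+json"},
    timeout=30,
)
resp.raise_for_status()

for study in resp.json():
    # DICOM JSON keys attributes by tag; 0020000D is Study Instance UID.
    uid = study.get("0020000D", {}).get("Value", ["?"])[0]
    print("Study Instance UID:", uid)
```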
Workspaces in the Semantic Web
NASA Technical Reports Server (NTRS)
Wolfe, Shawn R.; Keller, Richard M.
2005-01-01
Due to the recency and relatively limited adoption of Semantic Web technologies, practical issues related to technology scaling have received less attention than foundational issues. Nonetheless, these issues must be addressed if the Semantic Web is to realize its full potential. In particular, we concentrate on the lack of scoping methods that reduce the size of semantic information spaces so they are more efficient to work with and more relevant to an agent's needs. We provide some intuition to motivate the need for such reduced information spaces, called workspaces, give a formal definition, and suggest possible methods of deriving them.
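The paper gives a formal definition of workspaces; the toy sketch below merely illustrates one plausible scoping operation (carving a bounded neighborhood out of a larger triple store) and should not be read as the authors' derivation method:

```python
from collections import deque

# A tiny triple store standing in for a much larger semantic information space.
TRIPLES = {
    ("mars", "hasMoon", "phobos"),
    ("mars", "hasMoon", "deimos"),
    ("phobos", "type", "Moon"),
    ("earth", "hasMoon", "luna"),
}

def workspace(focus: str, radius: int) -> set:
    """Collect the triples reachable within `radius` hops of `focus`,
    i.e. carve a small, task-relevant workspace out of a large graph."""
    selected = set()
    seen = {focus}
    frontier = deque([(focus, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == radius:
            continue
        for s, p, o in TRIPLES:
            if s == node:
                selected.add((s, p, o))
                if o not in seen:
                    seen.add(o)
                    frontier.append((o, depth + 1))
    return selected

print(workspace("mars", radius=1))  # only the two mars triples are kept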
Timpka, Toomas; Eriksson, Henrik; Ludvigsson, Johnny; Ekberg, Joakim; Nordfeldt, Sam; Hanberger, Lena
2008-01-01
Background Chronic disease management is a global health concern. By the time they reach adolescence, 10–15% of all children live with a chronic disease. The role of educational interventions in facilitating adaptation to chronic disease is receiving growing recognition, and current care policies advocate greater involvement of patients in self-care. Web 2.0 is an umbrella term for new collaborative Internet services characterized by user participation in developing and managing content. Key elements include Really Simple Syndication (RSS) to rapidly disseminate awareness of new information; weblogs (blogs) to describe new trends, wikis to share knowledge, and podcasts to make information available on personal media players. This study addresses the potential to develop Web 2.0 services for young persons with a chronic disease. It is acknowledged that the management of childhood chronic disease is based on interplay between initiatives and resources on the part of patients, relatives, and health care professionals, and where the balance shifts over time to the patients and their families. Methods Participatory action research was used to stepwise define a design specification in the form of a pattern language. Support for children diagnosed with diabetes Type 1 was used as the example area. Each individual design pattern was determined graphically using card sorting methods, and textually in the form Title, Context, Problem, Solution, Examples and References. Application references were included at the lowest level in the graphical overview in the pattern language but not specified in detail in the textual descriptions. Results The design patterns are divided into functional and non-functional design elements, and formulated at the levels of organizational, system, and application design. The design elements specify access to materials for development of the competences needed for chronic disease management in specific community settings, endorsement of self-learning through online peer-to-peer communication, and systematic accreditation and evaluation of materials and processes. Conclusion The use of design patterns allows representing the core design elements of a Web 2.0 system upon which an 'ecological' development of content respecting these constraints can be built. Future research should include evaluations of Web 2.0 systems implemented according to the architecture in practice settings. PMID:19040738
76 FR 74776 - Forum-Trends in Extreme Winds, Waves, and Extratropical Storms Along the Coasts
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-01
... Winds, Waves, and Extratropical Storms Along the Coasts AGENCY: National Environmental Satellite, Data... information, please check the forum Web site at https://sites.google.com/a/noaa.gov/extreme-winds-waves.../noaa.gov/extreme-winds-waves-extratropical-storms/home . Topics To Be Addressed This forum will address...
Using QR Codes to Differentiate Learning for Gifted and Talented Students
ERIC Educational Resources Information Center
Siegle, Del
2015-01-01
QR codes are two-dimensional square patterns capable of encoding information that ranges from web addresses to links to YouTube videos. The codes save typing time and eliminate errors from incorrectly entered addresses. These codes make learning with technology easier for students and engage them motivationally in new ways.
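For readers who want to produce such codes themselves, a minimal sketch using the third-party Python qrcode package (an assumption; the article does not prescribe any particular tool) is:

```python
# Requires the third-party "qrcode" package (pip install qrcode[pil]).
import qrcode

# Encode a web address so students can scan it instead of typing it.
img = qrcode.make("https://example.org/enrichment-activity")
img.save("lesson-link.png")  # print or project the resulting image
```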
Schools and Programs in the United States.
ERIC Educational Resources Information Center
American Annals of the Deaf, 2000
2000-01-01
This annual directory lists U.S. schools and programs enrolling deaf and hard of hearing children and youth in two sections: (1) the directory listing including name, address, phone numbers, administrator or contact person, and e-mail/Web address and (2) program and services chart including whether day or residential, enrollment, services, and…
31 CFR 542.309 - Licenses; general and specific.
Code of Federal Regulations, 2014 CFR
2014-07-01
... part or made available on OFAC's Web site: www.treasury.gov/ofac. (c) The term specific license means... or made available on OFAC's Web site: www.treasury.gov/ofac. Note to § 542.309: See § 501.801 of this...
31 CFR 589.305 - Licenses; general and specific.
Code of Federal Regulations, 2014 CFR
2014-07-01
... part or made available on OFAC's Web site: www.treasury.gov/ofac. (c) The term specific license means... or made available on OFAC's Web site: www.treasury.gov/ofac. Note to § 589.305: See § 501.801 of this...
A novel architecture for information retrieval system based on semantic web
NASA Astrophysics Data System (ADS)
Zhang, Hui
2011-12-01
Nowadays, the web has enabled an explosive growth of information sharing (there are currently over 4 billion pages covering most areas of human endeavor), so the web now faces a new challenge of information overload. The challenge before us is not only to help people locate relevant information precisely but also to access and aggregate a variety of information from different resources automatically. Current web documents are in human-oriented formats suitable for presentation, but machines cannot understand the meaning of the documents. To address this issue, Berners-Lee proposed the concept of the semantic web. With semantic web technology, web information can be understood and processed by machines, opening new possibilities for automatic web information processing. A main problem of semantic web information retrieval is that when such a retrieval system lacks sufficient knowledge, it returns a large number of meaningless results to users because of the sheer volume of available information. In this paper, we present the architecture of an information retrieval system based on the semantic web. In addition, our system employs an inference engine to check whether a query should be posed to the keyword-based search engine or to the semantic search engine.
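The routing idea (pose a query to the semantic search engine only when the knowledge base covers it, otherwise fall back to keyword search) can be sketched as follows; the concept list and search functions are invented stand-ins, not the paper's implementation:

```python
# Toy list of concepts the semantic engine has knowledge about.
KNOWN_CONCEPTS = {"protein", "gene", "enzyme"}

def semantic_search(query: str) -> str:
    return f"[semantic engine] structured results for '{query}'"

def keyword_search(query: str) -> str:
    return f"[keyword engine] ranked pages for '{query}'"

def route(query: str) -> str:
    """Pose the query to the semantic engine only if the knowledge base
    covers at least one of its terms; otherwise fall back to keywords."""
    terms = set(query.lower().split())
    return semantic_search(query) if terms & KNOWN_CONCEPTS else keyword_search(query)

print(route("enzyme kinetics"))  # routed to the semantic engine
print(route("cheap flights"))    # falls back to keyword search
```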
Web Accessibility Policies at Land-Grant Universities
ERIC Educational Resources Information Center
Bradbard, David A.; Peters, Cara; Caneva, Yoana
2010-01-01
The Web has become an integral part of postsecondary education within the United States. There are specific laws that legally mandate postsecondary institutions to have Web sites that are accessible for students with disabilities (e.g., the Americans with Disabilities Act (ADA)). Web accessibility policies are a way for universities to provide a…
iRefWeb: interactive analysis of consolidated protein interaction data and their supporting evidence
Turner, Brian; Razick, Sabry; Turinsky, Andrei L.; Vlasblom, James; Crowdy, Edgard K.; Cho, Emerson; Morrison, Kyle; Wodak, Shoshana J.
2010-01-01
We present iRefWeb, a web interface to protein interaction data consolidated from 10 public databases: BIND, BioGRID, CORUM, DIP, IntAct, HPRD, MINT, MPact, MPPI and OPHID. iRefWeb enables users to examine aggregated interactions for a protein of interest, and presents various statistical summaries of the data across databases, such as the number of organism-specific interactions, proteins and cited publications. Through links to source databases and supporting evidence, researchers may gauge the reliability of an interaction using simple criteria, such as the detection methods, the scale of the study (high- or low-throughput) or the number of cited publications. Furthermore, iRefWeb compares the information extracted from the same publication by different databases, and offers means to follow-up possible inconsistencies. We provide an overview of the consolidated protein–protein interaction landscape and show how it can be automatically cropped to aid the generation of meaningful organism-specific interactomes. iRefWeb can be accessed at: http://wodaklab.org/iRefWeb. Database URL: http://wodaklab.org/iRefWeb/ PMID:20940177
Information on infantile colic on the World Wide Web.
Bailey, Shana D; D'Auria, Jennifer P; Haushalter, Jamie P
2013-01-01
The purpose of this study was to explore and describe the type and quality of information on infantile colic that a parent might access on the World Wide Web. Two checklists were used to evaluate the quality indicators of 24 Web sites and the colic-specific content. Fifteen health information Web sites met more of the quality parameters than the nine commercial sites. Eight Web sites included information about colic and infant abuse, with six being health information sites. The colic-specific content on 24 Web sites reflected current issues and controversies; however, the completeness of the information in light of current evidence varied among the Web sites. Strategies to avoid complications of parental stress or infant abuse were not commonly found on the Web sites. Pediatric professionals must guide parents to reliable colic resources that also include emotional support and understanding of infant crying. A best evidence guideline for the United States would eliminate confusion and uncertainty about which colic therapies are safe and effective for parents and professionals. Copyright © 2013 National Association of Pediatric Nurse Practitioners. Published by Mosby, Inc. All rights reserved.
Food-chain contamination evaluations in ecological risk assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linder, G.
Food-chain models have become increasingly important within the ecological risk assessment process. This is the case particularly when acute effects are not readily apparent, or the contaminants of concern are not readily detoxified, have a high likelihood for partitioning into lipids, or have specific target organs or tissues that may increase their significance in evaluating their potential adverse effects. An overview of food-chain models -- conceptual, theoretical, and empirical -- will be considered through a series of papers that will focus on their application within the ecological risk assessment process. Whether a food-chain evaluation is being developed to address relatively simple questions related to chronic effects of toxicants on target populations, or whether a more complex food-web model is being developed to address questions related to multiple-trophic level transfers of toxicants, the elements within the food chain contamination evaluation can be generalized to address the mechanisms of toxicant accumulation in individual organisms. This can then be incorporated into more elaborate models that consider these organismal-level processes within the context of a species life-history or community-level responses that may be associated with long-term exposures.
Zartarian, Valerie G; Schultz, Bradley D; Barzyk, Timothy M; Smuts, Marybeth; Hammond, Davyda M; Medina-Vera, Myriam; Geller, Andrew M
2011-12-01
Our primary objective was to provide higher quality, more accessible science to address challenges of characterizing local-scale exposures and risks for enhanced community-based assessments and environmental decision-making. After identifying community needs, priority environmental issues, and current tools, we designed and populated the Community-Focused Exposure and Risk Screening Tool (C-FERST) in collaboration with stakeholders, following a set of defined principles, and considered it in the context of environmental justice. C-FERST is a geographic information system and resource access Web tool under development for supporting multimedia community assessments. Community-level exposure and risk research is being conducted to address specific local issues through case studies. C-FERST can be applied to support environmental justice efforts. It incorporates research to develop community-level data and modeled estimates for priority environmental issues, and other relevant information identified by communities. Initial case studies are under way to refine and test the tool to expand its applicability and transferability. Opportunities exist for scientists to address the many research needs in characterizing local cumulative exposures and risks and for community partners to apply and refine C-FERST.
Stable-isotope analysis: a neglected tool for placing parasites in food webs.
Sabadel, A J M; Stumbo, A D; MacLeod, C D
2018-02-28
Parasites are often overlooked in the construction of food webs, despite their ubiquitous presence in almost every type of ecosystem. Researchers who do recognize their importance often struggle to include parasites using classical food-web theory, mainly due to the parasites' multiple hosts and life stages. A novel approach using compound-specific stable-isotope analysis promises to provide considerable insight into the energetic exchanges of parasite and host, which may solve some of the issues inherent in incorporating parasites using a classical approach. Understanding the role of parasites within food webs, and tracing the associated biomass transfers, are crucial to constructing new models that will expand our knowledge of food webs. This mini-review focuses on stable-isotope studies published in the past decade, and introduces compound-specific stable-isotope analysis as a powerful, but underutilized, newly developed tool that may answer many unresolved questions regarding the role of parasites in food webs.
Harker, Laura; Bamps, Yvan; Flemming, Shauna St. Clair; Perryman, Jennie P; Thompson, Nancy J; Patzer, Rachel E; Williams, Nancy S DeSousa; Arriola, Kimberly R Jacob
2017-01-01
Background The lack of available organs is often considered to be the single greatest problem in transplantation today. Internet use is at an all-time high, creating an opportunity to increase public commitment to organ donation through the broad reach of Web-based behavioral interventions. Implementing Internet interventions, however, presents challenges including preventing fraudulent respondents and ensuring intervention uptake. Although Web-based organ donation interventions have increased in recent years, process evaluation models appropriate for Web-based interventions are lacking. Objective The aim of this study was to describe a refined process evaluation model adapted for Web-based settings and used to assess the implementation of a Web-based intervention aimed to increase organ donation among African Americans. Methods We used a randomized pretest-posttest control design to assess the effectiveness of the intervention website that addressed barriers to organ donation through corresponding videos. Eligible participants were African American adult residents of Georgia who were not registered on the state donor registry. Drawing from previously developed process evaluation constructs, we adapted reach (the extent to which individuals were found eligible, and participated in the study), recruitment (online recruitment mechanism), dose received (intervention uptake), and context (how the Web-based setting influenced study implementation) for Internet settings and used the adapted model to assess the implementation of our Web-based intervention. Results With regard to reach, 1415 individuals completed the eligibility screener; 948 (67.00%) were determined eligible, of whom 918 (96.8%) completed the study. After eliminating duplicate entries (n=17), those who did not initiate the posttest (n=21) and those with an invalid ZIP code (n=108), 772 valid entries remained. Per the Internet protocol (IP) address analysis, only 23 of the 772 valid entries (3.0%) were within Georgia, and only 17 of those were considered unique entries and could be considered for analyses. With respect to recruitment, 517 of the 772 valid entries (67.0%) of participants were recruited from a Web recruiter. Regarding dose received, no videos from the intervention website were watched in their entirety, and the average viewing duration was 17 seconds over the minimum. With respect to context, context analysis provided us with valuable insights into factors in the Internet environment that may have affected study implementation. Although only active for a brief period of time, the Craigslist website advertisement may have contributed the largest volume of fraudulent responses. Conclusions We determined fraud and low uptake to be serious threats to this study and further confirmed the importance of conducting a process evaluation to identify such threats. We suggest checking participants’ IP addresses before study initiation, selecting software that allows for automatic duplicate protection, and tightening minimum requirements for intervention uptake. Further research is needed to understand how process evaluation models can be used to monitor implementation of Web-based studies. PMID:29191799
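The authors recommend checking participants' IP addresses before study initiation and enabling automatic duplicate protection. A minimal sketch of such screening, using an invented network block in place of a real geolocation service, might look like this:

```python
import ipaddress

# Invented network block standing in for the study area; a real screen would
# use a maintained IP geolocation database, not a hard-coded list.
ELIGIBLE_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]

seen_addresses = set()

def screen_entry(ip: str) -> str:
    """Reject duplicate submissions and out-of-area addresses before analysis."""
    if ip in seen_addresses:
        return "reject: duplicate IP"
    seen_addresses.add(ip)
    addr = ipaddress.ip_address(ip)
    if not any(addr in net for net in ELIGIBLE_NETWORKS):
        return "reject: outside study area"
    return "accept"

print(screen_entry("203.0.113.42"))  # accept
print(screen_entry("203.0.113.42"))  # reject: duplicate IP
print(screen_entry("198.51.100.7"))  # reject: outside study area
```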
Web services in the U.S. Geological Survey StreamStats web application
Guthrie, J.D.; Dartiguenave, C.; Ries, Kernell G.
2009-01-01
StreamStats is a U.S. Geological Survey Web-based GIS application developed as a tool for water-resources planning and management, engineering design, and other applications. StreamStats' primary functionality allows users to obtain drainage-basin boundaries, basin characteristics, and streamflow statistics for gaged and ungaged sites. Recently, Web services have been developed that provide remote users and applications with the capability to access comprehensive GIS tools that are available in StreamStats, including delineating drainage-basin boundaries, computing basin characteristics, estimating streamflow statistics for user-selected locations, and determining point features that coincide with a National Hydrography Dataset (NHD) reach address. For the state of Kentucky, a web service also has been developed that provides users the ability to estimate daily time series of drainage-basin average values of daily precipitation and temperature. The use of web services allows the user to take full advantage of the datasets and processes behind the StreamStats application without having to develop and maintain them. © 2009 IEEE.
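Consuming such a web service typically amounts to an HTTP request carrying location parameters and a JSON response describing the delineated basin. The endpoint and parameter names below are placeholders for illustration, not the documented StreamStats service interface:

```python
import requests

# Placeholder endpoint and parameter names, not the documented StreamStats API.
SERVICE = "https://example.usgs.gov/streamstats/delineate"

resp = requests.get(
    SERVICE,
    params={"state": "KY", "x": -84.5, "y": 38.0, "crs": 4326},
    timeout=60,
)
resp.raise_for_status()
basin = resp.json()
# A delineation service would typically return the basin outline plus
# characteristics such as drainage area.
print("Drainage area:", basin.get("area"))
```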
Fast access to the CMS detector condition data employing HTML5 technologies
NASA Astrophysics Data System (ADS)
Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo
2011-12-01
This paper focuses on using HTML version 5 (HTML5) for accessing condition data for the CMS experiment, evaluating the benefits and risks posed by the use of this technology. According to the authors of HTML5, this technology attempts to solve issues found in previous iterations of HTML and addresses the needs of web applications, an area previously not adequately covered by HTML. We demonstrate that employing HTML5 brings important benefits in terms of access performance to the CMS condition data. The combined use of web storage and web sockets increases performance and reduces costs in terms of computation power, memory usage, and network bandwidth for client and server. Above all, web workers allow scripts to be executed in separate threads, exploiting multi-core microprocessors. Web workers have been employed in order to substantially decrease the web page rendering time needed to display the condition data stored in the CMS condition database.
Metadata tables to enable dynamic data modeling and web interface design: the SEER example.
Weiner, Mark; Sherr, Micah; Cohen, Abigail
2002-04-01
A wealth of information addressing health status, outcomes and resource utilization is compiled and made available by various government agencies. While exploration of the data is possible using existing tools, in general, would-be users of the resources must acquire CD-ROMs or download data from the web, and upload the data into their own database. Where web interfaces exist, they are highly structured, limiting the kinds of queries that can be executed. This work develops a web-based database interface engine whose content and structure are generated through interaction with a metadata table. The result is a dynamically generated web interface that can easily accommodate changes in the underlying data model by altering the metadata table, rather than requiring changes to the interface code. This paper discusses the background and implementation of the metadata table and web-based front end and provides examples of its use with the NCI's Surveillance, Epidemiology and End-Results (SEER) database.
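The core mechanism (driving the interface from a metadata table so that schema changes require only table edits) can be sketched roughly as follows; the field names and table layout are invented for illustration and are not taken from the SEER work:

```python
# Invented metadata table describing queryable fields; in the paper, a table
# like this drives both the web form and the queries behind it.
METADATA = [
    {"column": "site", "label": "Cancer site", "widget": "select",
     "choices": ["breast", "lung"]},
    {"column": "diag_year", "label": "Year of diagnosis", "widget": "number"},
]

def render_form(metadata) -> str:
    """Build an HTML query form purely from the metadata table, so a change
    to the data model means editing the table, not the interface code."""
    parts = ["<form method='get'>"]
    for field in metadata:
        parts.append(f"<label>{field['label']}</label>")
        if field["widget"] == "select":
            options = "".join(f"<option>{c}</option>" for c in field["choices"])
            parts.append(f"<select name='{field['column']}'>{options}</select>")
        else:
            parts.append(f"<input type='number' name='{field['column']}'>")
    parts.append("<input type='submit'></form>")
    return "\n".join(parts)

print(render_form(METADATA))
```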