Note: This page contains sample records for the topic wfs web feature from Science.gov.
While these samples are representative of the content of Science.gov,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of Science.gov
to obtain the most current and comprehensive results. Last update: August 15, 2014.
Open Geospatial Consortium (OGC) Web Feature Services (WFSs) facilitate feature-level spatial data sharing over the web. However, OGC WFSs only emphasize technical data interoperability via standard interfaces and cannot resolve semantic heterogeneity problems in spatial data sharing. The lack of explicit semantics in the OGC WFS description proves to be a major limitation to automatic geospatial feature discovery and WFS
Chuanrong Zhang; Tian Zhao; Weidong Li; Jeffrey P. Osleeb
As a team partnership of the Center for Earth Observing and Space Research (CEOSR) at George Mason University (GMU), the Virginia Department of Transportation (VDOT), the Bureau of Transportation Statistics at the Department of Transportation (BTS/DOT), and Intergraph, we established Transportation Framework Data Services using the Open Geospatial Consortium (OGC) Web Feature Service (WFS) Specification to enable the sharing of transportation data at the federal level with data from BTS/DOT, at the state level through VDOT, and in industry through Intergraph. CEOSR develops WFS solutions using Intergraph software. Relevant technical documents are also developed and disseminated through the partners. The WFS is integrated with operational geospatial systems at CEOSR and VDOT. CEOSR works with Intergraph on developing WFS solutions and technical documents. The GeoMedia WebMap WFS toolkit is used with software and technical support from Intergraph. The ESRI ArcIMS WFS connector is used with GMU's campus license of ESRI products. Tested solutions are integrated with framework data service operational systems, including 1) CEOSR's interoperable geospatial information services, FGDC clearinghouse node, Geospatial One Stop (GOS) portal, and WMS services, 2) VDOT's state transportation data and GIS infrastructure, and 3) BTS/DOT's national transportation data.
The project: 1) develops and deploys an operational OGC WFS 1.1 interface at CEOSR for registering with the FGDC/GOS Portal and responding to Web "POST" requests for transportation framework data as listed in Table 1; 2) builds a WFS service that can return data conforming to the drafted ANSI/INCITS L1 Standard (when available) for each identified theme in the format given by OGC Geography Markup Language (GML) Version 3.0 or higher; 3) integrates the OGC WFS with CEOSR's clearinghouse nodes; 4) establishes a formal partnership to develop and share WFS-based geospatial interoperability technology among GMU, VDOT, BTS/DOT, and Intergraph; and 5) develops WFS-based solutions and technical documents using the GeoMedia WebMap WFS toolkit. The Web Feature Service is demonstrated to be more efficient in sharing vector data and supports direct Internet access to transportation data. The developed WFS solutions also enhance the interoperable services provided by CEOSR through the FGDC clearinghouse node and the GOS Portal.
Yang, C.; Wong, D. W.; Phillips, T.; Wright, R. A.; Lindsey, S.; Kafatos, M.
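The "POST" request workflow described in the record above can be illustrated with a short sketch that assembles a minimal WFS 1.1.0 GetFeature request body. This is an assumption-laden example, not the project's actual code: the type name `transport:Roads` is a hypothetical placeholder for one of the framework data themes.

```python
from xml.etree import ElementTree as ET

WFS_NS = "http://www.opengis.net/wfs"

def build_getfeature_request(type_name, max_features=100):
    """Assemble a minimal WFS 1.1.0 GetFeature request body (XML) for HTTP POST."""
    ET.register_namespace("wfs", WFS_NS)
    root = ET.Element(f"{{{WFS_NS}}}GetFeature", {
        "service": "WFS",
        "version": "1.1.0",
        "maxFeatures": str(max_features),
        "outputFormat": "text/xml; subtype=gml/3.1.1",  # GML 3.x, as in the record
    })
    ET.SubElement(root, f"{{{WFS_NS}}}Query", {"typeName": type_name})
    return ET.tostring(root, encoding="unicode")

# Hypothetical feature type name, not taken from the project.
body = build_getfeature_request("transport:Roads", max_features=50)
```

The resulting XML string would be sent as the body of an HTTP POST to the WFS endpoint.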
Not a virtual exhibition, the Mark Rothko WebFeature by the National Gallery of Art is really a reference work, providing context and background information on the artist. The WebFeature was produced in conjunction with the exhibition, Mark Rothko, at the National Gallery from May 3 through August 16, 1998, now travelling to the Whitney Museum of American Art, September 17-November 29, 1998. The resemblance to a reference book is enhanced by the design of the site, which encourages visitors to page through images of over 30 paintings in chronological order. The Gallery has divided Rothko's career into five periods, and a highlighted navigational bar shows visitors where they are in the chronology. Rothko's explanations of the philosophies behind his work and photographs of the artist help to place the work in context. The actual application of the paint on the canvas is important in Rothko's work, as in that of other abstract expressionists, and some of this nuance is not visible in the WebFeature. In fact, three paintings are reproduced as flat black squares, but it is doubtful that these pictures would reproduce any better in the type of art reference book the WebFeature emulates.
Scientists from different organizations and disciplines need to work together to find the solutions to complex problems. Multi-disciplinary science typically involves users with specialized tools and their own preferred view of the data including unique characteristics of the user's information model and symbology. Even though organizations use web services to expose data, there are still semantic inconsistencies that need to be solved. Recent activities within the OGC Interoperability Program (IP) have helped advance semantic mediation solutions when using OGC services to help solve complex problems. The OGC standards development process is influenced by the feedback of activities within the Interoperability Program, which conducts international interoperability initiatives such as Testbeds, Pilot Projects, Interoperability Experiments, and Interoperability Support Services. These activities are designed to encourage rapid development, testing, validation, demonstration and adoption of open, consensus based standards and best practices. Two recent Testbeds, the OGC Web Services Phase 8 and Phase 9, have advanced the use of semantic mediation approaches to increase semantic interoperability among geospatial communities. The Cross-Community Interoperability (CCI) thread within these two testbeds, advanced semantic mediation approaches for data discovery, access and use of heterogeneous data models and heterogeneous metadata models. This presentation will provide an overview of the interoperability program, the CCI Thread and will explain the methodology to mediate heterogeneous GML Application Profiles served via WFS, including discovery of services via a catalog standard interface and mediating symbology applicable to each application profile.
Hobona, G.; Bermudez, L. E.; Brackin, R.; Percivall, G. S.
The ASA24 Respondent Web site guides the participant through the completion of a 24-hour recall for the previous day, either from midnight to midnight or for the past 24-hours, using a dynamic user interface.
Geospatial interoperability technology makes better and easier use of the huge volume of distributed heterogeneous geospatial data and services in Earth science research and applications. The Open Geospatial Consortium (OGC) has been developing interoperable Web service specifications, such as the Web Coverage Service (WCS), Web Map Service (WMS), Web Feature Service (WFS) and Catalogue Service for the Web (CSW), for promoting
Presents a chart that explains the search syntax, features, and commands used by the 12 most widely used general Web search engines. Discusses Web standardization, expanded types of content searched, size of databases, and search engines that include both simple and advanced versions. (LRW)
Over the past 10 years, there have been great advances in interoperability technologies in geographic information science. More than 10,000 map layers are available online today through Open Geospatial Consortium (OGC) specified interfaces, such as the Web Map Service (WMS), Web Feature Service (WFS), and Web Coverage Service (WCS). These map layers are persistently serving the geospatial communities; however, our
Wolfram syndrome is an early onset genetic disease (1/180,000) featuring diabetes mellitus and optic neuropathy, associated with mutations in the WFS1 gene. The Wfs1-/- mouse model shows pancreatic beta cell atrophy, but its visual performance has not been investigated, prompting us to study its visual function and histopathology of the retina and optic nerve. Electroretinogram and visual evoked potentials (VEPs) were performed in Wfs1-/- and Wfs1+/+ mice at 3, 6, 9 and 12 months of age. Fundi were photographed with the Micron III apparatus. Retinal ganglion cell (RGC) abundance was determined from Brn3a immunolabeling of retinal sections. RGC axonal loss was quantified by electron microscopy in transversal optic nerve sections. Endoplasmic reticulum stress was assessed using immunoglobulin binding protein (BiP), protein disulfide isomerase (PDI) and inositol-requiring enzyme 1 alpha (Ire1α) markers. Electroretinogram amplitudes were slightly reduced and latencies increased with time in Wfs1-/- mice. Similarly, VEPs showed decreased N+P amplitudes and increased N-wave latency. Analysis of unfolded protein response signaling revealed an activation of endoplasmic reticulum stress in Wfs1-/- mutant mouse retinas. Altogether, progressive VEP alterations with minimal neuronal cell loss suggest functional alteration of the action potential in the Wfs1-/- optic pathways. PMID:24823368
Purpose – The main obstacle in realising semantic-based image retrieval from the web is that it is difficult to capture semantic description of an image in low-level features. Text-based keywords can be generated from web documents to capture semantic information for narrowing down the search space. The combination of keywords and various low-level features effectively increases the retrieval precision. The
The paper describes feature subset selection used in learning on text data (text learning) and gives a brief overview of feature subset selection commonly used in machine learning. Several known and some new feature scoring measures appropriate for feature subset selection on large text data are described and related to each other. Experimental comparison of the described measures is given
Geographic information standardization organizations have neglected the management of geographic information services. Thus, different GIS software vendors have implemented their own service management approaches and have not provided a unified interface, which brings many difficulties to users in use, integration, dynamic management, etc. In this context, a specification to manage WMS/WFS is proposed, which is called Management
In order to match the customary strengths of the still dominant face-to-face instructional mode, a high-performance online learning system must employ synchronous as well as asynchronous communications; buttress graphics, animation, and text with live audio and video; and provide many of the features and processes associated with course management…
Wolfram syndrome (WS) is a recessively inherited mendelian form of diabetes and neurodegeneration also known by the acronym DIDMOAD from the major clinical features, including diabetes insipidus, diabetes mellitus, optic atrophy, and deafness. Affected individuals may also show renal tract abnormalities as well as multiple neurological and psychiatric symptoms. The causative gene for WS (WFS1) encoding wolframin maps to chromosome 4p16.1 and consists of eight exons, spanning 33.44 Kb of genomic DNA. In this study we report on the mutational analysis of the WFS1 coding region in 19 Italian WS patients and 25 relatives, using a DHPLC-based protocol. A total of 19 different mutations in WFS1 were found in 18 of 19 patients (95%). All these mutations, except one, are novel, preferentially located in WFS1 exon 8, and include deletions, insertions, duplications, and nonsense and missense changes. In particular, a 16 base-pair deletion in WFS1 codon 454 was detected in five different unrelated nuclear families, being the most prevalent alteration in this Italian group. Nine neutral changes and polymorphisms were also identified. Overall, this study represents the molecular characterization of the largest cohort of Italian WS patients and carriers studied so far, and increases the number of identified WFS1 allelic variants worldwide. PMID:12754709
Colosimo, Alessia; Guida, Valentina; Rigoli, Luciana; Di Bella, Chiara; De Luca, Alessandro; Briuglia, Silvana; Stuppia, Liborio; Salpietro, Damiano Carmelo; Dallapiccola, Bruno
This plan outlines the process that will be used to collect samples from soil excavated during removal of underground storage tanks 200W-FS-34 and 200W-FS-35. The samples will be analyzed to determine if gasoline and diesel fuel are present in the soil at...
This paper proposes a web-enabled computational environment for the spatial modelling of habitat suitability of mosquito vectors. Under a component-based architecture and implemented using an object-oriented data model, we integrate database interfaces, Web Feature Services (WFS) based on Open Geospatial Consortium (OGC) protocols, and the data-mining tool WEKA, coupled through JavaServer Pages (JSP). The prototype, based exclusively on
P. Zeilhofer; P. S. Arraes Neto; W. Y. Maja; D. A. Vecchiato
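The WFS coupling described in the record above, in which features are pulled from a service into a modelling pipeline, can be sketched as a key-value-pair (GET) GetFeature URL builder. The endpoint and layer name below are hypothetical placeholders, not the prototype's actual service.

```python
from urllib.parse import urlencode

def getfeature_url(base_url, type_name, bbox=None, srs="EPSG:4326"):
    """Build a WFS GetFeature request URL using key-value-pair (GET) encoding."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": type_name,
        "srsName": srs,
    }
    if bbox:  # (minx, miny, maxx, maxy) in the order expected by WFS 1.1
        params["bbox"] = ",".join(str(c) for c in bbox)
    return base_url + "?" + urlencode(params)

# Endpoint and feature type are illustrative only.
url = getfeature_url("http://example.org/wfs", "env:MosquitoHabitat",
                     bbox=(-56.1, -15.6, -55.9, -15.4))
```

The returned GML could then be parsed and handed to a data-mining tool as attribute vectors.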
Wolfram syndrome (WFS) is a neurodegenerative genetic condition characterized by juvenile-onset of diabetes mellitus and optic atrophy. We studied clinical features and the molecular basis of severe WFS (neurodegenerative complications) in two consanguineous families from Iran. A clinical and molecular genetic investigation was performed in the affected and healthy members of two families. The clinical diagnosis of WFS was confirmed by the existence of diabetes mellitus and optic atrophy in the affected patients, who in addition had severe neurodegenerative complications. Sequencing of WFS1 was undertaken in one affected member from each family. Targeted mutations were tested in all members of relevant families. Patients had most of the reported features of WFS. Two affected males in the first family had fathered unaffected children. We identified two homozygous mutations previously reported with apparently milder phenotypes: family 1: c.631G>A (p.Asp211Asn) in exon 5, and family 2: c.1456C>T (p.Gln486*) in exon 8. Heterozygous carriers were unaffected. This is the first report of male Wolfram patients who have successfully fathered children. Surprisingly, they also had almost all the complications associated with WFS. Our report has implications for genetic counseling and family planning advice for other affected families.
Abstract Objective: Mutations in the WFS1 gene can cause Wolfram syndrome or nonsyndromic hearing impairment (HI). The objective of this study was to ascertain the presence of mutations in WFS1 among children with HI from unknown causes. Design: We screened 105 Finnish children with HI for mutations in exon 8 in WFS1. Study sample: Children were born in a defined area in Northern Finland and they had sensorineural, mild to profound, syndromic, or nonsyndromic HI. They were negative for GJB2 mutations and for the m.1555A>G and m.3243A>G mutations in mitochondrial DNA. Results: We found three rare variants and the novel p.Gly831Ser variant in WFS1. Segregation analysis suggested that the novel variant had arisen de novo. The p.Gly831Ser variant may be a new member of the group of heterozygous WFS1 mutations that lead to HI, while the pathogenicity of the rare variant p.Gly674Arg remained unclear. The other two rare variants, p.Glu385Lys and p.Glu776Val, did not segregate with HI in the families. Conclusions: WFS1 gene mutations are a rare cause of HI among Finnish children with HI. PMID:24909696
Häkli, Sanna; Kytövuori, Laura; Luotonen, Mirja; Sorri, Martti; Majamaa, Kari
The present work aims at designing and implementing a spatial data infrastructure for storing and sharing ecological data through geospatial web services. As a case study, we concentrated on limnological data from the drainage basin of Lake Maggiore in northern Italy. In order to establish the infrastructure, we started with two basic questions: (1) What type of data is the ecological dataset? (2) Which geospatial web service standards are most suitable to store and share ecological data? In this paper we describe the possibilities for sharing ecological data using geospatial web services and the difficulties that can be encountered in this task. In order to test actual technological solutions, we use real data from a published limnological study. We concluded that limnological data can be considered observational data, composed of biological (species) data and environmental data, and that it can be modeled using the Observations and Measurements (O&M) specification. With current web service implementations, the geospatial web services that could potentially be used to publish limnological data are Sensor Observation Services (SOS) and Web Feature Services (WFS). SOS holds the essential components to represent time-series observations, while WFS is a simple model that requires profiling. Neither SOS nor WFS is perfectly suited to publishing biological data, so other alternatives, such as linked data, must be considered.
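The contrast drawn above between SOS-style observations and WFS simple features can be sketched in a few lines: an O&M-style observation carries an explicit observed property and phenomenon time, while publishing it through WFS flattens each sample into its own attribute row. Field and station names here are illustrative assumptions, not taken from the study.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Observation:
    """Minimal O&M-style observation: what was measured, where, when, result."""
    feature_of_interest: str   # e.g. a sampling station (name is hypothetical)
    observed_property: str     # e.g. "water_temperature"
    phenomenon_time: date
    result: float

def to_simple_feature(obs):
    """Flatten one observation into a WFS-style simple-feature attribute row.
    The time-series grouping that SOS preserves is lost: each sample
    becomes an independent row, which is why WFS 'requires profiling'."""
    return {
        "station": obs.feature_of_interest,
        "property": obs.observed_property,
        "date": obs.phenomenon_time.isoformat(),
        "value": obs.result,
    }

obs = Observation("station_A", "water_temperature", date(2008, 7, 15), 21.3)
row = to_simple_feature(obs)
```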
Offers an abundance of World Wide Web resources for K-12 educators. Briefly describes interesting web sites, including the "K-12 History on the Internet Resource Guide" and sites by National Geographic, the Smithsonian Institute, and Ben and Jerry's. Discusses personal web pages including the author's own. (MJP)
OGC web services provide an important method for GIS data sharing and interoperability. But it is difficult to find suitable web services, and how to get an appropriate service has become a serious problem. We discuss methods of finding web services, filtering unsuitable web services, and invoking web services. To find WFS and WMS web services, several search engines' source code
A novel video retrieval method based on Web community extraction using audio and visual features and textual features of video materials is proposed in this paper. In this proposed method, canonical correlation analysis is applied to these three features calculated from video materials and their Web pages, and transformation of each feature into the same variate space is possible. The transformed variates are based on the relationships between visual, audio and textual features of video materials, and the similarity between video materials in the same feature space for each feature can be calculated. Next, the proposed method introduces the obtained similarities of video materials into the link relationship between their Web pages. Furthermore, by performing link analysis of the obtained weighted link relationship, this approach extracts Web communities including similar topics and provides the degree of attribution of video materials in each Web community for each feature. Therefore, by calculating similarities of the degrees of attribution between the Web communities extracted from the three kinds of features, the desired ones are automatically selected. Consequently, by monitoring the degrees of attribution of the obtained Web communities, the proposed method can perform effective video retrieval. Some experimental results obtained by applying the proposed method to video materials obtained from actual Web pages are shown to verify the effectiveness of the proposed method.
Examines perceptions of top administrators concerning courses with Web features at Association of Schools of Journalism and Mass Communication (ASJMC) programs. Studies the imperatives and pressures to implement courses with Web features as well as resistances to implementation. Suggests that administrators perceive an extensive set of needs and…
Most library Web sites offer lists of recommended Web sites for primary sources with only cursory summaries of the sites. While many of the resources listed are outstanding, too many are dubious in quality, often referring to dead URLs or sites containing no information on their sponsor, source of material, or other information needed to evaluate…
The migratory shorebirds of the East Atlantic flyway land in huge numbers during a migratory stopover or wintering on the French Atlantic coast. The Brouage bare mudflat (Marennes-Oléron Bay, NE Atlantic) is one of the major stopover sites in France. The particular structure and function of a food web affects the efficiency of carbon transfer. The structure and functioning of the Brouage food web is crucial for the conservation of species landing within this area because it provides sufficient food, which allows shorebirds to reach the north of Europe where they nest. The aim of this study was to describe and understand which food web characteristics support nutritional needs of birds. Two food-web models were constructed, based on in situ measurements that were made in February 2008 (the presence of birds) and July 2008 (absence of birds). To complete the models, allometric relationships and additional data from the literature were used. The missing flow values of the food web models were estimated by Monte Carlo Markov Chain – Linear Inverse Modelling. The flow solutions obtained were used to calculate the ecological network analysis indices, which estimate the emergent properties of the functioning of a food-web. The total activities of the Brouage ecosystem in February and July are significantly different. The specialisation of the trophic links within the ecosystem does not appear to differ between the two models. In spite of a large export of carbon from the primary producer and detritus in winter, the higher recycling leads to a similar retention of carbon for the two seasons. It can be concluded that in February, the higher activity of the ecosystem coupled with a higher cycling and a mean internal organization, ensure the sufficient feeding of the migratory shorebirds.
The NASA Operation IceBridge mission collects airborne remote sensing measurements to bridge the gap between NASA's Ice, Cloud and Land Elevation Satellite (ICESat) mission and the upcoming ICESat-2 mission. The IceBridge Data Portal from the National Snow and Ice Data Center provides an intuitive web interface for accessing IceBridge mission observations and measurements. Scientists and users usually do not have knowledge about the individual campaigns but are interested in data collected in a specific place. We have developed a high-performance map interface to allow users to quickly zoom to an area of interest and see any Operation IceBridge overflights. The map interface consists of two layers: a base map layer that the user can pan and zoom, and a flight line layer overlaying it that shows all the campaign missions intersecting the current map view. The user can click on the flight campaigns and download the data as needed. The OpenGIS® Web Map Service Interface Standard (WMS) provides a simple HTTP interface for requesting geo-registered map images from one or more distributed geospatial databases. The Web Feature Service (WFS) provides an interface for requesting geographical features across the web using platform-independent calls. OpenLayers provides vector support (points, polylines and polygons) to build a WMS/WFS client for displaying both layers on the screen. MapServer, an open source development environment for building spatially enabled internet applications, serves the WMS and WFS spatial data to OpenLayers. Early releases of the portal displayed unacceptably poor load-time performance for flight lines and base map tiles, caused by long response times from the map server in generating all map tiles and flight line vectors. We resolved the issue by implementing various caching strategies on top of the WMS and WFS services, including the use of Squid (www.squid-cache.org) to cache frequently used content.
Our presentation includes the architectural design of the application, and how we use OpenLayers, WMS and WFS with Squid to build a responsive web application capable of efficiently displaying geospatial data to allow the user to quickly interact with the displayed information. We describe the design, implementation and performance improvement of our caching strategies, and the tools and techniques developed to assist our data caching strategies.
Liu, M.; Brodzik, M.; Collins, J. A.; Lewis, S.; Oldenburg, J.
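The caching strategy described in the record above can be sketched as a simple in-memory cache with a time-to-live, standing in for the role Squid plays in front of the WMS/WFS services: identical requests within the TTL window are answered from the cache instead of hitting the map server. The fetch callable below is a stand-in, not the portal's real backend.

```python
import time

class TTLCache:
    """Tiny URL-keyed cache with expiry, mimicking a Squid-style HTTP cache
    placed in front of WMS/WFS services."""
    def __init__(self, fetch, ttl_seconds=300):
        self.fetch = fetch          # callable that performs the real request
        self.ttl = ttl_seconds
        self.store = {}             # url -> (expiry_time, response)

    def get(self, url):
        now = time.monotonic()
        hit = self.store.get(url)
        if hit and hit[0] > now:
            return hit[1]           # cache hit: backend not contacted
        response = self.fetch(url)  # cache miss: call the map server
        self.store[url] = (now + self.ttl, response)
        return response

calls = []
def slow_fetch(url):
    calls.append(url)               # stand-in for a real WMS/WFS request
    return f"tile-for-{url}"

cache = TTLCache(slow_fetch, ttl_seconds=60)
cache.get("/wms?bbox=a")
cache.get("/wms?bbox=a")            # second call served from cache
```

A production deployment would additionally need cache invalidation when the underlying flight-line data changes, which is one reason a dedicated proxy such as Squid is preferable to hand-rolled caching.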
Objective: The WFS1 gene encodes an endoplasmic reticulum (ER) membrane-embedded protein. Homozygous WFS1 gene mutations cause Wolfram syndrome, characterized by insulin-deficient diabetes mellitus and optic atrophy. Pancreatic beta-cells are selectively lost from the patient's islets. ER localization suggests that the WFS1 protein has physiological functions in membrane trafficking, secretion, processing and/or regulation of ER calcium homeostasis. Disturbances or overloading
Although effects of land use/cover on nutrient concentrations in aquatic systems are well known, half or more of the variation in nutrient concentration remains unexplained by land use/cover alone. Hydrogeomorphic (HGM) landscape features can explain much remaining variation and influence food web interactions. To explore complex linkages among land use/cover, HGM features, reservoir productivity, and food webs, we sampled 11 Ohio reservoirs, ranging broadly in agricultural catchment land use/cover, for 3 years. We hypothesized that HGM features mediate the bottom-up effects of land use/cover on reservoir productivity, chlorophyll a, zooplankton, and recruitment of gizzard shad, an omnivorous fish species common throughout southeastern U.S. reservoirs and capable of exerting strong effects on food web and nutrient dynamics. We tested specific hypotheses using a model selection approach. Percent variation explained was highest for total nitrogen (R2 = 0.92), moderately high for total phosphorus, chlorophyll a, and rotifer biomass (R2 = 0.57 to 0.67), relatively low for crustacean zooplankton biomass and larval gizzard shad hatch abundance (R2 = 0.43 and 0.42), and high for larval gizzard shad survivor abundance (R2 = 0.79). The trophic status models included agricultural land use/cover and an HGM predictor, whereas the zooplankton models had few HGM predictors. The larval gizzard shad models had the highest complexity, including more than one HGM feature and food web components. We demonstrate the importance of integrating land use/cover, HGM features, and food web interactions to investigate critical interactions and feedbacks among physical, chemical, and biological components of linked land-water ecosystems.
Bremigan, M. T.; Soranno, P. A.; Gonzalez, M. J.; Bunnell, D. B.; Arend, K. K.; Renwick, W. H.; Stein, R. A.; Vanni, M. J.
The Web matured from Web 1.0 to Web 2.0; content and services matured from static content to dynamic content: streaming, voice over IP, instant messaging, forums, blogs, commerce, payments, and lifestyle-based actionable information. In access methods, we saw maturity from the classical Web to the Wireless Web to the Mobile Web. The Web is used in the advanced economies today for
Background In the past decade, the use of technologies to persuade, motivate, and activate individuals’ health behavior change has been a quickly expanding field of research. The use of the Web for delivering interventions has been especially relevant. Current research tends to reveal little about the persuasive features and mechanisms embedded in Web-based interventions targeting health behavior change. Objectives The purpose of this systematic review was to extract and analyze persuasive system features in Web-based interventions for substance use by applying the persuasive systems design (PSD) model. In more detail, the main objective was to provide an overview of the persuasive features within current Web-based interventions for substance use. Methods We conducted electronic literature searches in various databases to identify randomized controlled trials of Web-based interventions for substance use published January 1, 2004, through December 31, 2009, in English. We extracted and analyzed persuasive system features of the included Web-based interventions using interpretive categorization. Results The primary task support components were utilized and reported relatively widely in the reviewed studies. Reduction, self-monitoring, simulation, and personalization seem to be the most used features to support accomplishing user’s primary task. This is an encouraging finding since reduction and self-monitoring can be considered key elements for supporting users to carry out their primary tasks. The utilization of tailoring was at a surprisingly low level. The lack of tailoring may imply that the interventions are targeted for too broad an audience. Leveraging reminders was the most common way to enhance the user-system dialogue. Credibility issues are crucial in website engagement as users will bind with sites they perceive credible and navigate away from those they do not find credible. 
Based on the textual descriptions of the interventions, we cautiously suggest that most of them were credible. The prevalence of social support in the reviewed interventions was encouraging. Conclusions Understanding the persuasive elements of systems supporting behavior change is important. This may help users to engage and keep motivated in their endeavors. Further research is needed to increase our understanding of how and under what conditions specific persuasive features (either in isolation or collectively) lead to positive health outcomes in Web-based health behavior change interventions across diverse health contexts and populations.
Although the fast development of OGC (Open Geospatial Consortium) WFS (Web Feature Service) technologies has undoubtedly improved the sharing and synchronization of feature-level geospatial information across diverse resources, literature shows that there are still apparent limitations in the current implementation of OGC WFSs. Currently, the implementation of OGC WFSs only emphasizes syntactic data interoperability via standard interfaces and cannot resolve semantic heterogeneity problems in geospatial data sharing. To help emergency responders and disaster managers find new ways of efficiently searching for needed geospatial information at the feature level, this paper aims to propose a framework for automatic search of geospatial features using Geospatial Semantic Web technologies and natural language interfaces. We focus on two major tasks: (1) intelligent geospatial feature retrieval using Geospatial Semantic Web technologies; (2) a natural language interface to a geospatial knowledge base and web feature services over the Semantic Web. Based on the proposed framework we implemented a prototype. Results show that it is practical to directly discover desirable geospatial features from multiple semantically heterogeneous sources using Geospatial Semantic Web technologies and natural language interfaces.
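The natural-language-to-feature-search idea in the record above can be sketched very roughly: known terms are picked out of a free-text question and mapped to feature types, each emitted as an OGC-filter-style fragment. This is a toy assumption-laden sketch; the paper's prototype uses Geospatial Semantic Web technologies (an ontology and knowledge base), not the hand-written dictionary and feature type names shown here.

```python
# Hypothetical term-to-feature-type mapping; a real system would
# resolve terms against an ontology rather than a dictionary.
TERM_TO_FEATURE = {
    "river": "hydro:Watercourse",
    "road": "transport:Road",
    "hospital": "poi:Hospital",
}

def query_to_filter(question):
    """Pick known terms out of a free-text question and emit an
    OGC-filter-style PropertyIsLike fragment for each match."""
    terms = [t.strip("?.,").lower() for t in question.split()]
    fragments = []
    for term in terms:
        feature_type = TERM_TO_FEATURE.get(term.rstrip("s"))  # crude plural handling
        if feature_type:
            fragments.append(
                "<ogc:PropertyIsLike><ogc:PropertyName>featureType"
                "</ogc:PropertyName><ogc:Literal>" + feature_type +
                "</ogc:Literal></ogc:PropertyIsLike>"
            )
    return fragments

frags = query_to_filter("Which hospitals are near the river?")
```

The fragments would then be embedded in a WFS GetFeature filter; disambiguation and spatial-relation handling ("near") are exactly the hard parts the semantic layer is meant to solve.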
Wolframin (Wfs1) is a membrane glycoprotein that resides in the endoplasmic reticulum (ER) and regulates cellular Ca(2+) homeostasis. In the pancreas, Wfs1 attenuates the unfolded protein response (UPR) and protects cells from apoptosis. Loss of Wfs1 function results in Wolfram syndrome (OMIM 222300), characterized by early-onset diabetes mellitus, progressive optic atrophy, diabetes insipidus, deafness, and psychiatric disorders. Similarly, Wfs1-/- mice exhibit diabetes and increased basal anxiety. In the adult central nervous system, Wfs1 is prominent in the central extended amygdala, striatum and hippocampus, brain structures largely involved in behavioral adaptation of the organism. Here, we describe the initiation pattern of Wfs1 expression in mouse forebrain using mRNA in situ hybridization and compare it with Synaptophysin (Syp1), a gene encoding a synaptic vesicle protein widely used as a neuronal differentiation marker. We show that the expression of Wfs1 starts during late embryonic development in the dorsal striatum and amygdala, then expands broadly at birth, possessing several transitory regions during maturation. Syp1 expression precedes Wfs1 and is remarkably upregulated during the period of Wfs1 expression initiation and maturation, suggesting a relationship between neural activation and Wfs1 expression. Using in situ hybridization and quantitative real-time PCR, we show that UPR-related genes (Grp78, Grp94, and Chop) display dynamic expression in the perinatal brain when Wfs1 is initiated, and that their expression pattern is not altered in the brain lacking functional Wfs1. PMID:24694561
We describe two unrelated patients aged 9 and 12 years. The first patient presented with multiple congenital contractures not associated with webbing (pterygia). Interestingly, his genetic testing showed the typical genotypic criteria of Escobar syndrome (CHRNG heterozygous mutation). The characteristics of the second child were compatible with the phenotypic and genotypic criteria for Escobar syndrome. Both patients manifested the typical facial features suggestive of Escobar syndrome. The aim of this paper is twofold: first, to illustrate that the absence of popliteal webbing is not a sufficient reason to exclude Escobar syndrome in patients with multiple contractures and second, dysmorphic facial features and the presence of certain radiological abnormalities might be considered baseline diagnostic tools in favor of this syndromic entity. PMID:24254455
Al Kaissi, Ali; Kenis, Vladimir; Laptiev, Sergey; Ghachem, Maher Ben; Klaushofer, Klaus; Ganger, Rudolf; Grill, Franz
There are no data about the energy metabolism of patients with Wolfram syndrome caused by mutations in the wolframin (Wfs1) gene. The aim of this study was to investigate the role of Wfs1 in energy metabolism and thyroid function in Wfs1 deficient mice (Wfs1KO). 16 male (8 Wfs1KO, 8 wild type (wt)) and 16 female (8 Wfs1KO, 8 wt) mice aged 11-13 weeks were studied alone in a specific metabolic cage for 48 h. Body weight, food, water and O2 consumption, motor activity, CO2 and heat production of mice were recorded. At the age of 14-20 weeks, plasma levels of thyroxine (T4), TSH and leptin were measured and histology of thyroid tissues examined. Mean CO2 and heat production was not different between the groups. Mean O2 consumption was higher in the Wfs1KO females compared to the Wfs1KO males (3410.0±127.0 vs. 2806.0±82.4 ml/kg/h; p<0.05), but not compared to the wt mice. The mean movement activity was not different between the groups, except that the Wfs1KO females reared up more often than the wt females (199.8±63.46 vs. 39.26±24.71 cnts/48 h; p<0.05). Both male and female Wfs1KO mice had significantly lower body mass and food intake than wt mice. Male Wfs1KO mice also lost more weight in the metabolic cage than wt males (20.43±0.41 vs. 16.07±0.86%; p<0.05), indicating a more pronounced response to isolation. Male Wfs1KO mice had significantly lower levels of plasma leptin than wt male mice (3.37±0.40 vs. 5.82±0.71 ng/ml; p<0.01). Thyroid function measured by serum TSH and T4 levels was not different between Wfs1KO and wt groups, but both Wfs1KO and wt male mice had significantly higher mean T4 levels than female mice. The histology of thyroid tissue of Wfs1KO males showed a trend to a smaller mean number of epithelial cells per follicle than the wt male mice. Although Wfs1KO mice were smaller and lost more weight during the experiment, their energy metabolism was not different from wt mice except that the female Wfs1KO mice consumed more O2.
As mice in this study were relatively young, longitudinal studies in older mice are necessary to clarify whether Wfs1 has a role in energy metabolism when the disease progresses further. PMID:24710642
Noormets, K; Kõks, S; Ivask, M; Aunapuu, M; Arend, A; Vasar, E; Tillmann, V
Different spaceborne sensors enable different, potentially novel analyses of hurricanes. Scatterometer data augment traditional satellite images of clouds by providing direct measurements of surface winds to compare with observed cloud patterns, better determining a hurricane's location, direction, structure, and strength. Sea surface temperature data illuminate both the preconditions and the effects of hurricanes along their tracks. To further multi-sensor studies of hurricanes, PO.DAAC offers the web page http://podaac.jpl.nasa.gov/hurricanes, which features easy access to hurricane-specific data for multiple sensors, near-real-time and historical hurricane data, and visualized storm tracks with data hits located by time. The primary sensor for this web page is QuikSCAT. In near-real-time, ultra-high resolution wind images (2.5 km/pixel speeds overlaid with 12.5 km directions) visualize the backscatter and merged geophysical data records, which are also available. The web site also offers historical, more scientifically accurate images, backscatter, and wind vectors dating back to the beginning of the mission in 1999. Furthermore, the web page provides sea surface temperature data at 50 km resolution from the AMSR-E instrument. Future plans include adding other radiometer data and sea surface height data from the Jason altimeter.
Chen, R.; Rodriguez, E.; Vazquez, J.; Rigor, E.; Poulsen, L.; Dunbar, S.; Long, D.; Kessling, M.; Callahan, P.; Liggett, P.
Extracting informative content from Web article pages has many applications, such as printing and content reuse. The title is a very significant and unique component of an article. However, identifying the true title is not an easy problem, even for human readers. In this paper, we present a title identification method that takes into account several features, including the title field of the HTML page and the HTML tag of a DOM node, as well as font size and horizontal alignment. We tested our method on a ground truth data set consisting of 1993 pages from 98 web sites and achieved 97.5% accuracy, about 20% above a baseline method based only on font size.
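A minimal sketch of this kind of feature-based scoring follows; the weights are invented for illustration (the abstract does not publish the model's parameters). Each candidate DOM node is scored on its overlap with the page's title field, its tag, its font size, and its alignment.

```python
# Hedged sketch of a feature-based title scorer; the weights below are
# assumptions, not the paper's actual model.

def score_candidate(text, page_title, tag, font_px, centered):
    score = 0.0
    if text and text in page_title:   # overlaps the HTML <title> field
        score += 2.0
    if tag in ("h1", "h2"):           # heading tags are favoured
        score += 1.5
    score += font_px / 24.0           # larger fonts score higher
    if centered:                      # horizontal alignment feature
        score += 0.5
    return score

candidates = [
    ("Breaking: WFS portal launched",
     "Breaking: WFS portal launched - News", "h1", 28, True),
    ("Related stories",
     "Breaking: WFS portal launched - News", "div", 14, False),
]
best = max(candidates, key=lambda c: score_candidate(*c))
print(best[0])
```

In practice the weights would be learned from labeled pages rather than hand-set, which is presumably how the reported 97.5% accuracy was reached.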
Background Wolfram Syndrome (WS) is an autosomal recessive disorder characterised by non-autoimmune diabetes mellitus, optic atrophy, cranial diabetes insipidus and sensorineural deafness. Some reports have described hypogonadism in male WS patients. The aim of our study was to find out whether Wfs1 deficient (Wfs1KO) male mice have reduced fertility and, if so, to examine possible causes. Methods Wfs1KO mice were generated by homologous recombination. Both Wfs1KO and wild type (wt) male mice were mated with wt female mice. The number of litters and the number of pups were counted and pregnancy rates calculated. The motility and morphology of the sperm and the histology of testes were analysed. Serum testosterone and FSH concentrations were also measured. Results The pregnancy rate in wt females mated with Wfs1KO males was significantly lower than in the control group (15% vs. 32%; p < 0.05), but there was no significant difference in litter size. Analysis of male fertility showed that, in the Wfs1KO group, eight males out of 13 had pups, whereas in the control group all 13 males had at least one litter. Sperm motility was not affected in Wfs1KO mice, but Wfs1KO males had fewer proximally bent tails (14.4 +/- 1.2% vs. 21.5 +/- 1.3%; p < 0.05) and fewer abnormal sperm heads (22.8 +/- 1.8 vs. 31.5 +/- 3.5, p < 0.05) than wt males. Testes histology revealed a significantly reduced number of spermatogonia (23.9 +/- 4.9 vs. 38.1 +/- 2.8; p < 0.05) and Sertoli cells (6.4 +/- 0.5 vs. 9.2 +/- 1.0; p < 0.05) in Wfs1KO mice. Serum testosterone and FSH concentrations did not differ between the two groups. Conclusion The impaired fertility of Wfs1KO male mice is most likely due to changes in sperm morphology and a reduced number of spermatogenic cells. The exact mechanism through which the Wfs1 gene influences sperm morphology needs to be clarified in further studies.
NOAA's National Climatic Data Center (NCDC) currently archives over 1.5 petabytes of climatological data from various networks and sources including in-situ, numerical models, radar and satellite. Access to these datasets is evolving from interactive web interfaces utilizing database technology to standardized web services in a Service Oriented Architecture (SOA). NCDC is currently offering several web services using Simple Object Access Protocol (SOAP), XML over Representational State Transfer (REST/XML), Open Geospatial Consortium (OGC) Web Map Service (WMS) / Web Feature Service (WFS) / Web Coverage Service (WCS), and OPeNDAP web service protocols. These services offer users a direct connection between their client applications and NCDC data servers. In addition, users may embed access to the services in custom applications to efficiently navigate and subset data in an automated fashion. NCDC currently provides gridded numerical model data through a THREDDS Data Server and GrADS Data Server, which offer OPeNDAP and WCS access. In-situ network metadata are available through WMS and WFS, while the corresponding time-series data are accessible through SOAP and REST web services. These in-situ services are a part of the Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI) WaterOneFlow services, a consolidated access system for hydrologic data, and comply with the WaterOneFlow specifications. NCDC's Severe Weather Data Inventory (SWDI), which provides user access to archives of several datasets critical to the detection and evaluation of severe weather, is also accessible through REST/XML services. Providing cataloging, access and search capabilities for many of NCDC's datasets using community-driven standards is a top priority for the ever-increasing data volumes being archived at NCDC. Providing interoperable access is critical to supporting data stewardship across multiple scientific disciplines and user types.
This demonstration will showcase NCDC's latest work towards standardized web services with both server and client examples.
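For illustration, a client request to a REST/XML service of this kind might be composed as below. The host, path, and parameter names are assumptions made for the sketch, not NCDC's documented interface.

```python
from urllib.parse import urlencode

# Illustrative only: composing a REST/XML request to a severe-weather
# inventory service. Endpoint and parameter names are hypothetical.

def swdi_request(dataset, start, end, bbox):
    base = "https://example.ncdc.noaa.gov/swdiws/xml"  # hypothetical host
    query = urlencode({"startDate": start, "endDate": end,
                       "bbox": ",".join(map(str, bbox))})
    return f"{base}/{dataset}?{query}"

url = swdi_request("nx3tvs", "20110427", "20110428",
                   (-88.0, 32.0, -86.0, 34.0))
print(url)
```

The point of such services is exactly this: a client builds a plain URL, issues an HTTP GET, and parses the XML response, with no vendor-specific client library required.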
Ansari, S.; Baldwin, R.; Del Greco, S.; Lott, N.; Rutledge, G.
Valproic acid (VPA) is a widely used anticonvulsant and mood-stabilizing drug whose use is often associated with drug-induced weight gain. Treatment with VPA has been shown to upregulate Wfs1 expression in vitro. The aim of the present study was to compare the effect of chronic VPA treatment in wild type (WT) and Wfs1 knockout (KO) mice on the hepatic gene expression profile. Wild type, Wfs1 heterozygous, and homozygous mice were treated with VPA for three months (300 mg/kg i.p. daily), and gene expression profiles in liver were evaluated using the Affymetrix Mouse GeneChip 1.0 ST array. We identified 42 genes affected by Wfs1 genotype, 10 genes regulated by VPA treatment, and 9 genes whose regulation by VPA was dependent on genotype. Among the genes that were regulated differentially by VPA depending on genotype was peroxisome proliferator-activated receptor delta (Ppard), whose expression was upregulated in response to VPA treatment in WT, but not in Wfs1 KO, mice. Thus, regulation of Ppard by VPA is dependent on Wfs1 genotype.
Sutt, Silva; Koks, Sulev; Schalkwyk, Leonard C.; Fernandes, Catherine; Vasar, Eero
The incredible increase in the amount of information on the World Wide Web has given rise to topic-specific crawling of the Web. During a focused crawling process, an automatic Web page classification mechanism is needed to determine whether the page being considered is on topic or not. In this study, a genetic algorithm (GA) based automatic Web
Massively parallel sequencing technologies have made the generation of genomic data sets a routine component of many biological investigations. For example, Chromatin immunoprecipitation followed by sequence assays detect genomic regions bound (directly or indirectly) by specific factors, and DNase-seq identifies regions of open chromatin. A major bottleneck in the interpretation of these data is the identification of the underlying DNA sequence code that defines, and ultimately facilitates prediction of, these transcription factor (TF) bound or open chromatin regions. We have recently developed a novel computational methodology, which uses a support vector machine (SVM) with kmer sequence features (kmer-SVM) to identify predictive combinations of short transcription factor-binding sites, which determine the tissue specificity of these genomic assays (Lee, Karchin and Beer, Discriminative prediction of mammalian enhancers from DNA sequence. Genome Res. 2011; 21:2167–80). This regulatory information can (i) give confidence in genomic experiments by recovering previously known binding sites, and (ii) reveal novel sequence features for subsequent experimental testing of cooperative mechanisms. Here, we describe the development and implementation of a web server to allow the broader research community to independently apply our kmer-SVM to analyze and interpret their genomic datasets. We analyze five recently published data sets and demonstrate how this tool identifies accessory factors and repressive sequence elements. kmer-SVM is available at http://kmersvm.beerlab.org.
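The k-mer feature extraction underlying kmer-SVM-style models can be sketched as follows; the SVM training step itself (e.g. with an off-the-shelf learner) is omitted, and the sequence and k value are chosen only for demonstration.

```python
from itertools import product

# Minimal sketch of k-mer feature extraction: each sequence becomes a
# fixed-length count vector over all 4**k possible DNA k-mers, which a
# linear SVM can then use as features.

def kmer_counts(seq, k=3):
    alphabet = "ACGT"
    counts = {"".join(p): 0 for p in product(alphabet, repeat=k)}
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer in counts:   # skip positions with ambiguous bases
            counts[kmer] += 1
    return counts

vec = kmer_counts("ACGTACGT", k=3)
print(vec["ACG"], vec["CGT"], vec["GTA"])
```

The appeal of the representation is that the learned SVM weights over k-mers can be inspected directly, which is how the method recovers known binding sites and suggests novel sequence elements.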
Fletez-Brant, Christopher; Lee, Dongwon; McCallion, Andrew S.; Beer, Michael A.
In coastal observing systems, different observing methods such as remote sensing, in-situ measurements, and models need to be integrated into a synoptic view of the state of the observed region. This integration can be based solely on web services combining data and metadata. Such an approach is pursued for COSYNA (Coastal Observing System for Northern and Arctic Seas). Data from satellite and radar remote sensing and measurements from buoys, stations and FerryBoxes are the observation part of COSYNA. These data are assimilated into models to create pre-operational forecasts. For discovering data, an OGC Web Feature Service (WFS) is used by the COSYNA data portal. This Web Feature Service holds not only the metadata necessary for finding data, but in addition the URLs of web services to view and download the data. To make the data from different resources comparable, a common vocabulary is needed. For COSYNA, the standard names from the CF conventions are stored within the metadata whenever possible. For the metadata, an INSPIRE- and ISO 19115-compatible data format is used. The WFS is fed from the metadata system using database views. Actual data are stored in two different formats: in NetCDF files for gridded data and in an RDBMS for time-series-like data. The web service URLs are mostly standards-based; the standards are mainly OGC standards. Maps are created from NetCDF files with the help of the ncWMS tool, whereas a self-developed Java servlet is used for maps of moving measurement platforms. In this case, download of data is offered via OGC SOS. For NetCDF files, OPeNDAP is used for data download. The OGC CSW is used for accessing extended metadata. The concept of data management in COSYNA, which is independent of the particular services used in COSYNA, will be presented. This concept is parameter- and data-centric and might be useful for other observing systems.
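A portal that discovers data through a WFS typically issues GetFeature requests of the following shape. The endpoint URL and type name below are hypothetical, not COSYNA's actual service addresses; only the WFS parameter names (`service`, `version`, `request`, `typeName`, `bbox`) follow the OGC standard.

```python
from urllib.parse import urlencode

# Sketch of a WFS 1.1.0 GetFeature request URL, as a data portal might
# issue it to retrieve feature (meta)data for a region of interest.

def wfs_getfeature(endpoint, typename, bbox=None):
    params = {"service": "WFS", "version": "1.1.0",
              "request": "GetFeature", "typeName": typename}
    if bbox:
        # WFS bbox order for 1.1.0 with lat/lon CRS: minx,miny,maxx,maxy
        params["bbox"] = ",".join(map(str, bbox))
    return endpoint + "?" + urlencode(params)

url = wfs_getfeature("https://example.org/cosyna/wfs",   # hypothetical
                     "cosyna:observations",              # hypothetical
                     bbox=(6.0, 53.0, 9.0, 55.0))
print(url)
```

The response is a GML document; because the COSYNA metadata records carry the viewing and download URLs, a client can chain directly from discovery to data access.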
The Giant Magellan Telescope presents a unique astronomical facility with seven 8.4 m diameter primary mirrors matched by seven adaptive secondary mirrors (ASM). The ASMs will be controlled by several adaptive optics systems; one of them is the Laser Tomography Adaptive Optics (LTAO) system. A key component in any design of an LTAO system is the Laser Tomographic Wavefront Sensor (LTAO WFS). The LTAO WFS Assembly consists of six optically equal 60x60 Shack-Hartmann WFSs aligned to the six Laser Guide Stars (LGS). Changing telescope elevation and changes in the mean altitude of the sodium layer result in a varying back focal distance and F-number. Therefore, very accurate focus compensation and pupil size adjustment, combined with very high requirements for pupil stability and optical performance, are the main challenges for the opto-mechanical design of the LTAO WFS Assembly. We present a compact solution developed during the LTAO preliminary design phase. In our design, the six LGS wavefront sensors use the same focus and zoom stage. Besides the presentation of the optical performance, we show the results of the tolerancing, the alignment concept and the mechanical realisation.
Most cases of juvenile-onset diabetes (JOD) are diagnosed as type 1 diabetes (T1D), for which genetic studies conducted in outbred Caucasian populations support the concept of multifactorial inheritance. However, this view may be partly challenged in particular population settings. In view of the suggestive evidence for a high prevalence of Wolfram syndrome (WFS) in Lebanon, the phenotypic variability associated with
Pierre A. Zalloua; Sami T. Azar; Marc Delepine; Nadine J. Makhoul; H. Blanc; M. Sanyoura; A. Lavergne; K. Stankov; A. Lemainque; P. Baz; C. Julier
We present a working prototype of YouASTRO (www.youastro.org), a web-based BibTeX-compliant reference management software (RMS) for astrophysical papers in the SAO/NASA ADS database. It also includes as a main feature the concept of distributed paper comments and ratings. In this paper we introduce the main characteristics of the web application, and we will briefly discuss what could be the advantages and drawbacks of such a system being widely adopted by the astrophysical community for its scientific literature.
Bocchino, F.; Lopez-Santiago, J.; Albacete-Colombo, F.; Bucciantini, N.
A study of the "Ask an Expert" feature of StratSoy, a Web-based information system, surveyed 50 users, 48 of whom were using it for the first time. Topic areas of interest and web site features desired by respondents were identified. (JOW)
Wool, D. L.; Kanfer, A. G.; Michaels, J.; Thompson, S.; Morris, S. A.; Hasler, C. M.
Most library Web sites offer lists of recommended Web sites for primary sources with only cursory summaries of the sites. While many of the resources listed are outstanding, too many are dubious in quality, often referring to dead URLs or sites containing no information on their sponsor, source of material, or other information needed to evaluate the accuracy of the
Reviews and analyzes selected Web sites that use standard library classification schemes or controlled vocabularies to enhance access to Web information sources. Profiles common elements of many sites and the structural and navigational approaches incorporated with select sites. Includes an appendix of sites that use standard classification…
The global coverage of OneGeology web services (www.onegeology.org and portal.onegeology.org) achieved since 2007 from the 120 participating geological surveys will be reviewed and issues arising discussed. Recent enhancements to the OneGeology web service capabilities will be covered, including a new up-to-5-star service accreditation scheme utilising the ISO/OGC Web Map Service standard version 1.3, core ISO 19115 metadata additions, and version 2.0 Web Feature Services (WFS) serving the new IUGS-CGI GeoSciML V3.2 geological web data exchange language standard (http://www.geosciml.org/) with its associated 30+ IUGS-CGI vocabularies (http://resource.geosciml.org/ and http://srvgeosciml.brgm.fr/eXist2010/brgm/client.html). Use of the CGI SimpleLithology and timescale dictionaries now allows those who wish to do so to offer data harmonisation to query their GeoSciML 3.2-based Web Feature Services and their GeoSciML_Portrayal V2.0.1 (http://www.geosciml.org/) Web Map Services in the OneGeology portal (http://portal.onegeology.org). Contributing to OneGeology involves offering to serve, ideally, 1:1,000,000 scale geological data (in practice, any scale is now warmly welcomed) as an OGC (Open Geospatial Consortium) standards-based WMS (Web Map Service) from an available WWW server. This may be hosted either within the geological survey or at a neighbouring, regional or other institution that offers to serve the data for them, i.e. offers to help technically by providing the web-serving IT infrastructure as a 'buddy'. OneGeology is a standards-focussed Spatial Data Infrastructure (SDI) and works to ensure that these standards work together, and it is now possible for European geological surveys to register their INSPIRE web services within the OneGeology SDI (e.g. see http://www.geosciml.org/geosciml/3.2/documentation/cookbook/INSPIRE_GeoSciML_Cookbook%20_1.0.pdf).
The OneGeology portal (http://portal.onegeology.org) is the first port of call for anyone wishing to discover the availability of global geological web services, and has new functionality to view and use such services, including multiple-projection support. KEYWORDS : OneGeology; GeoSciML V 3.2; Data exchange; Portal; INSPIRE; Standards; OGC; Interoperability; GeoScience information; WMS; WFS; Cookbook.
BioBayesNet is a new web application that allows the easy modeling and classification of biological data using Bayesian networks. To learn Bayesian networks, the user can either upload a set of annotated FASTA sequences or a set of pre-computed feature vectors. In the case of FASTA sequences, the server is able to generate a wide range of sequence and structural features from the sequences. These features are used to learn Bayesian networks. An automatic feature selection procedure assists in selecting discriminative features, providing a (locally) optimal set of features. The output includes several quality measures of the overall network and individual features, as well as a graphical representation of the network structure, which allows the user to explore dependencies between features. Finally, the learned Bayesian network, or another uploaded network, can be used to classify new data. BioBayesNet facilitates the use of Bayesian networks in biological sequence analysis and is flexible enough to support modeling and classification applications in various scientific fields. The BioBayesNet server is available at http://biwww3.informatik.uni-freiburg.de:8080/BioBayesNet/.
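As a toy stand-in for the classification step, the sketch below uses a naive Bayes model, which is the simplest special case of a Bayesian network (all features conditionally independent given the class). The feature names, conditional probabilities, and priors are invented for illustration and have nothing to do with BioBayesNet's learned networks.

```python
import math

# Toy naive-Bayes classifier over binary sequence features; a learned
# Bayesian network generalizes this by modeling feature dependencies.

# P(feature present | class), invented values:
MODEL = {
    "binding":    {"has_TATA": 0.8, "gc_rich": 0.6},
    "background": {"has_TATA": 0.2, "gc_rich": 0.5},
}
PRIOR = {"binding": 0.5, "background": 0.5}

def classify(features):
    """Return the class with the highest log-posterior score."""
    scores = {}
    for cls, probs in MODEL.items():
        logp = math.log(PRIOR[cls])
        for name, present in features.items():
            p = probs[name]
            logp += math.log(p if present else 1.0 - p)
        scores[cls] = logp
    return max(scores, key=scores.get)

print(classify({"has_TATA": True, "gc_rich": True}))
```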
There is a lack of published research on designing Web-based instruction for the adult U.S. Latino population. Instructional designers need guidance on how to design culturally relevant learning environments for this audience, particularly for Latino people from Mexican heritage. The authors used content analysis to investigate the extent to which…
To interpret whole exome/genome sequence data for clinical and research purposes, comprehensive phenotypic information, knowledge of pedigree structure, and results of previous clinical testing are essential. With these requirements in mind and to meet the needs of the Centers for Mendelian Genomics project, we have developed PhenoDB (http://phenodb.net), a secure, Web-based portal for entry, storage, and analysis of phenotypic and other clinical information. The phenotypic features are organized hierarchically according to the major headings and subheadings of the Online Mendelian Inheritance in Man (OMIM®) clinical synopses, with further subdivisions according to structure and function. Every string allows for a free-text entry. All of the approximately 2,900 features use the preferred term from Elements of Morphology and are fully searchable and mapped to the Human Phenotype Ontology and Elements of Morphology. The PhenoDB allows for ascertainment of relevant information from a case in a family or cohort, which is then searchable by family, OMIM number, phenotypic feature, mode of inheritance, genes screened, and so on. The database can also be used to format phenotypic data for submission to dbGaP for appropriately consented individuals. PhenoDB was built using Django, an open source Web development tool, and is freely available through the Johns Hopkins McKusick-Nathans Institute of Genetic Medicine (http://phenodb.net). PMID:23378291
Hamosh, Ada; Sobreira, Nara; Hoover-Fong, Julie; Sutton, V Reid; Boehm, Corinne; Schiettecatte, François; Valle, David
Although it is still often referred to as an "innovative concept", the pyramid wavefront sensor was technologically demonstrated at the TNG in the Canary Islands years ago. It was tested again in the laboratory and on the sky in the framework of the development of MAD, led by ESO, and recently achieved outstanding performance at the LBT telescope. At the same time, several theoretical developments revealed novel features of this device, and measurements in the framework of Pyramir experimentally confirmed the better behaviour of this sensor with respect to the Shack-Hartmann in terms of noise propagation in closed loop, as previously analytically predicted. After a brief review of previous works, which revealed or demonstrated some peculiarities of this type of wavefront sensor with respect to other systems, we present a generalization of the photon-efficiency and non-linearity estimations of such a sensor. The aim of this study is to devise, through analytical computations and Fourier wave-optics propagation simulations, the behaviour of the pyramid wavefront sensor when non-ideal illumination conditions, such as faint-end sources and partial wavefront correction, are purposely applied. In the same framework, the effects of introducing a pyramid modulation are discussed too.
Spatial relations among simple features can be used to characterize complex geospatial features. These spatial relations are often represented using linguistic terms such as near, which have inherent vagueness and imprecision. Fuzzy logic can be used to model the fuzziness of these terms. Once simple features are extracted from remote sensing imagery, the degree of satisfaction of spatial relations among these simple features can be derived to detect complex features. The derivation process can be performed in a distributed service environment, which has benefited the Earth science community over the last decade. A workflow-based service can provide on-demand, uncertainty-aware discovery of complex features in a distributed environment. A use case on complex facility detection illustrates the applicability of the fuzzy-logic-supported service-oriented approach.
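A fuzzy membership function for "near", and its combination with fuzzy AND to score a complex feature, can be sketched as follows. The 500 m and 2000 m break points are invented for the example, not taken from the paper.

```python
# Trapezoidal fuzzy membership for the spatial relation "near":
# fully near below `full` metres, not near at all beyond `zero` metres,
# linear in between. Break points are illustrative assumptions.

def near(distance_m, full=500.0, zero=2000.0):
    """Degree (0..1) to which two features are 'near' each other."""
    if distance_m <= full:
        return 1.0
    if distance_m >= zero:
        return 0.0
    return (zero - distance_m) / (zero - full)

# Fuzzy AND (minimum t-norm): a complex feature is only as well
# supported as its weakest constituent relation.
def complex_feature_score(*degrees):
    return min(degrees)

print(near(300), near(1250), complex_feature_score(near(300), near(1250)))
```

In a workflow-based service, each such degree of satisfaction would be computed by a distributed node and the minimum (or another t-norm) aggregated at the detection step.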
Background Obesity remains a serious issue in many countries. Web-based programs offer good potential for delivery of weight loss programs. Yet, many Internet-delivered weight loss studies include support from medical or nutritional experts, and relatively little is known about purely web-based weight loss programs. Objective To determine whether supportive features and personalization in a 12-week web-based lifestyle intervention with no in-person professional contact affect retention and weight loss. Methods We assessed the effect of different features of a web-based weight loss intervention using a 12-week repeated-measures randomized parallel design. We developed 7 sites representing 3 functional groups. A national mass media promotion was used to attract overweight/obese Australian adults (based on body mass index [BMI] calculated from self-reported heights and weights). Eligible respondents (n = 8112) were randomly allocated to one of 3 functional groups: information-based (n = 183), supportive (n = 3994), or personalized-supportive (n = 3935). Both supportive sites included tools, such as a weight tracker, meal planner, and social networking platform. The personalized-supportive site included a meal planner that offered recommendations that were personalized using an algorithm based on a user’s preferences for certain foods. Dietary and activity information were constant across sites, based on an existing and tested 12-week weight loss program (the Total Wellbeing Diet). Before and/or after the intervention, participants completed demographic (including self-reported weight), behavioral, and evaluation questionnaires online. Usage of the website and features was objectively recorded. All screening and data collection procedures were performed online with no face-to-face contact. Results Across all 3 groups, attrition was high at around 40% in the first week and 20% of the remaining participants each week. 
Retention was higher for the supportive sites compared to the information-based site only at week 12 (P = .01). The average number of days that each site was used varied significantly (P = .02) and was higher for the supportive site at 5.96 (SD 11.36) and personalized-supportive site at 5.50 (SD 10.35), relative to the information-based site at 3.43 (SD 4.28). In total, 435 participants provided a valid final weight at the 12-week follow-up. Intention-to-treat analyses (using multiple imputations) revealed that there were no statistically significant differences in weight loss between sites (P = .42). On average, participants lost 2.76% (SE 0.32%) of their initial body weight, with 23.7% (SE 3.7%) losing 5% or more of their initial weight. Within supportive conditions, the level of use of the online weight tracker was predictive of weight loss (model estimate = 0.34, P < .001). Age (model estimate = 0.04, P < .001) and initial BMI (model estimate = -0.03, P < .002) were associated with frequency of use of the weight tracker. Conclusions Relative to a static control, inclusion of social networking features and personalized meal planning recommendations in a web-based weight loss program did not demonstrate additive effects for user weight loss or retention. These features did, however, increase the average number of days that a user engaged with the system. For users of the supportive websites, greater use of the weight tracker tool was associated with greater weight loss.
Spatial analysis packages and thematic mapping are available in a number of traditional desktop GIS packages. However, visualizing thematic maps over the Internet is still limited to fixed content, with restricted changes to the input data. Users with limited GIS knowledge, or those who do not own digital map data, typically have difficulty creating thematic maps from generic data. In this study, we developed thematic mapping services that can be applied to non-spatial data formats served through powerful map service solutions. Novice users who have no GIS software experience or no digital base map can simply input a plain text file with a location identifier field, such as a place name or gazetteer entry, to generate thematic maps online. We implemented a prototype using web service standards recommended by the Open Geospatial Consortium (OGC), such as Web Map Service (WMS), Web Feature Service (WFS), and Styled Layer Descriptor (SLD), to provide a common basis for communication and allow users to visualize spatial information as thematic maps. The system devotes a great deal of effort to the initial study of geospatial analysis and visualization for novice users, including those with no prior experience using Geographic Information Systems.
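The OGC services named above are typically invoked as plain HTTP requests with key-value parameters. As a minimal sketch (the endpoint URL and layer name here are invented for illustration), a WFS GetFeature request URL can be assembled like this:

```python
from urllib.parse import urlencode

def wfs_getfeature_url(base_url, type_name, bbox=None, max_features=None,
                       version="1.1.0", out_format=None):
    """Assemble a WFS GetFeature request URL (KVP encoding)."""
    params = {
        "service": "WFS",
        "version": version,
        "request": "GetFeature",
        "typename": type_name,
    }
    if bbox is not None:
        # bbox = (min_lon, min_lat, max_lon, max_lat)
        params["bbox"] = ",".join(str(c) for c in bbox)
    if max_features is not None:
        params["maxFeatures"] = str(max_features)
    if out_format is not None:
        params["outputFormat"] = out_format
    return base_url + "?" + urlencode(params)

url = wfs_getfeature_url("http://example.org/wfs", "topp:counties",
                         bbox=(-80.0, 35.0, -75.0, 40.0), max_features=10)
print(url)
```

A client then issues an HTTP GET to this URL and receives a GML feature collection in response.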
As part of a large European coastal operational oceanography project (ECOOP), we have developed a web portal for the display and comparison of model and in situ marine data. The distributed model and in situ datasets are accessed via an Open Geospatial Consortium Web Map Service (WMS) and Web Feature Service (WFS) respectively. These services were developed independently and readily integrated for the purposes of the ECOOP project, illustrating the ease of interoperability resulting from adherence to international standards. The key feature of the portal is the ability to display co-plotted timeseries of the in situ and model data and the quantification of misfits between the two. By using standards-based web technology we allow the user to quickly and easily explore over twenty model data feeds and compare these with dozens of in situ data feeds without being concerned with the low level details of differing file formats or the physical location of the data. Scientific and operational benefits to this work include model validation, quality control of observations, data assimilation and decision support in near real time. In these areas it is essential to be able to bring different data streams together from often disparate locations.
Gemmell, A. L.; Barciela, R. M.; Blower, J. D.; Haines, K.; Harpham, Q.; Millard, K.; Price, M. R.; Saulter, A.
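The misfit quantification between co-plotted model and in-situ series described in this abstract amounts to computing bias and RMSE over co-located points. A minimal sketch with fabricated temperature values:

```python
import math

def misfit_stats(model, obs):
    """Bias and RMSE between co-located model and observation series.
    Both arguments map timestamp -> value; only shared timestamps
    (the co-plotted points) contribute to the misfit."""
    common = sorted(set(model) & set(obs))
    diffs = [model[t] - obs[t] for t in common]
    n = len(diffs)
    bias = sum(diffs) / n
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    return bias, rmse

# Illustrative hourly values (not real ECOOP data)
model = {0: 14.5, 6: 15.0, 12: 15.25}
obs   = {0: 14.0, 6: 15.5, 12: 15.25, 18: 14.75}
bias, rmse = misfit_stats(model, obs)
print(round(bias, 3), round(rmse, 3))
```

In practice the model series would come from a WMS/WCS-backed feed and the observations from the WFS, but the misfit arithmetic is the same.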
The Severe Weather Data Inventory (SWDI) at NOAA's National Climatic Data Center (NCDC) provides user access to archives of several datasets critical to the detection and evaluation of severe weather. These datasets include archives of:
- NEXRAD Level-III point features describing general storm structure, hail, mesocyclone, and tornado signatures
- the National Weather Service Storm Events Database
- National Weather Service Local Storm Reports collected from storm spotters
- National Weather Service Warnings
- lightning strikes from Vaisala's National Lightning Detection Network (NLDN)
SWDI archives all of these datasets in a spatial database that allows for convenient searching and subsetting. These data are accessible via the NCDC web site, Web Feature Services (WFS), or automated web services. The results of interactive web page queries may be saved in a variety of formats, including plain text, XML, Google Earth's KMZ, standards-based NetCDF, and Shapefile. NCDC's Storm Risk Assessment Project (SRAP) uses data from the SWDI database to derive gridded climatology products that show the spatial distribution of the frequency of various events. SRAP can also relate SWDI events to other spatial data such as roads, population, watersheds, and other geographic, sociological, or economic data to derive products that are useful in municipal planning, emergency management, the insurance industry, and other areas where there is a need to quantify and qualify how severe weather patterns affect people and property.
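The searching and subsetting SWDI offers boils down to spatial (bounding box) and temporal filtering of point events. A toy stand-in with fabricated records (real SWDI queries go through the NCDC web services):

```python
from datetime import date

# Fabricated stand-ins for SWDI-style point events (lon, lat, day, type).
events = [
    {"lon": -97.5, "lat": 35.4, "day": date(2011, 5, 24), "type": "tornado"},
    {"lon": -86.8, "lat": 33.5, "day": date(2011, 4, 27), "type": "tornado"},
    {"lon": -97.3, "lat": 32.9, "day": date(2011, 5, 24), "type": "hail"},
]

def subset(events, bbox, start, end):
    """Spatial (bbox = lon_min, lat_min, lon_max, lat_max) and
    temporal (start..end inclusive) subset of point events."""
    lon_min, lat_min, lon_max, lat_max = bbox
    return [e for e in events
            if lon_min <= e["lon"] <= lon_max
            and lat_min <= e["lat"] <= lat_max
            and start <= e["day"] <= end]

hits = subset(events, (-98.0, 32.0, -96.0, 36.0),
              date(2011, 5, 1), date(2011, 5, 31))
print([e["type"] for e in hits])
```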
Roughly 5,000 new crystal structures are added to the (approximately 104,000 entry) Inorganic Crystal Structure Database each year (see http://icsdweb.fiz-karlsruhe.de/index.php for an approximately 4,000 entry demonstration version). Other commercial crystallographic databases specialize in organics, metals and alloys, and "non-organics" including minerals. This presentation gives an overview of these databases and evaluates the potential of open-access databases such as the (approximately 68,000 entry) Crystallography Open Database (http://crystallography.net/) and Portland State University's (PSU's) Wiki Crystallography Database, Crystal Morphology Database, and Nano-Crystallography Database (http://nanocrystallography.research.pdx.edu/CIF-searchable). Key features of open-access crystallographic databases are: a universal data exchange format, unrestricted internet access to the actual data (including downloads), search capabilities, and crystal structure identification functionalities. Interactive three-dimensional structure or morphology visualizations are also available at PSU's site. Most recently, we implemented at PSU community-based, Wikipedia-inspired data upload and database content management provisions. A selection of all of these features will be demonstrated (online) during the presentation.
Wolfram (DIDMOAD) syndrome is an autosomal recessive neurodegenerative disorder accompanied by insulin-dependent diabetes mellitus and progressive optic atrophy. Recent positional cloning led to identification of the WFS1 (Wolfram syndrome 1) gene, a member of a novel gene family of unknown function. In this study, we generated a specific antibody against the C-terminus of the WFS1 protein and investigated its
In the framework of the European ELT design, partially open-loop MCAO systems coupled with virtual DMs have been proposed to achieve AO correction using NGSs alone, selected in a FoV as wide as the telescope optical design allows. The conceptual design of a very compact wavefront sensor exploiting this concept, characterized by a dynamic range limited by the stroke of the deformable mirror and by a limiting-magnitude performance typical of a closed-loop coherent wavefront sensor, has been presented in the past. This concept was based on the use of a very linear wavefront sensor, such as a YAW sensor, but a DM whose actual shape is known a priori could greatly simplify the design of such a compact WFS. We investigate here possible opto-mechanical realizations of a probe capable of co-existing with the currently foreseen E-ELT LGS probes and of performing open-loop wavefront sensing, with the aim of reaching a preliminary design of such a system. Furthermore, we devise a conceptual opto-mechanical design of a precursor of such a system, which could provide Global MCAO correction of the lower part of the atmosphere at the VLT.
Farinato, J.; Ragazzoni, R.; Magrin, D.; Viotto, V.; Bergomi, M.; Brunelli, A.; Dima, M.; Marafatto, L.
We designed, developed, and tested a Variable Curvature Mirror (VCM) as an active refocusing system for the Laser Guide Star (LGS) Wave Front Sensor (WFS) of the E-ELT EAGLE instrument. This paper is the second of two from our team on this R&D activity: Hugot et al. (this conference) presented the mirror design and performance simulations. Here, we report on the fabrication, integration, testing, and performance of the VCM system. During this activity, we developed all necessary parts for the VCM system: a metallic mirror, its housing and mounts, a computer-controlled pressure system, an internal metrology system, and a test bench. The functional testing of the VCM system is successful: we can control the internal pressure to less than 1 mbar and measure the mirror displacement with 100 nm accuracy. The mirror displacement is a near-linear and well-simulated function of internal pressure over the desired range of focus. The intrinsic optical quality of the mirror meniscus is well within the specifications. Once the mirror is mounted in its housing, we observe additional mechanical constraints in the current design that generate optical aberrations. We measured the amplitudes of the Zernike modes and showed that the axisymmetric terms display a variation trend very similar to the simulations, with comparable amplitudes. All these results are very promising for a design of focus compensation without any moving part.
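The near-linear pressure-to-displacement relation reported above is the kind of calibration one would characterize with a least-squares line fit. A minimal sketch; the calibration numbers below are made up for illustration, not measured VCM data:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical calibration points: internal pressure vs mirror displacement
pressure_mbar = [0, 50, 100, 150, 200]
displacement_um = [0.0, 24.8, 50.1, 75.2, 99.9]
slope, intercept = linear_fit(pressure_mbar, displacement_um)
print(round(slope, 4), round(intercept, 4))
```

The residuals of such a fit quantify how far the real response departs from linearity over the focus range.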
Creation of Webpages, with their long and complex pages of HTML code for even the simplest of formats, can be greatly facilitated by the use of Web editors. These powerful tools provide shortcuts to change the appearance or arrangement of Webpages, eliminating many keystrokes. Recent Web editors have added many new and sophisticated features to this basic function. This preconference
Geographic information technologies and related geo-information systems currently play an important role in the management of public administration in Poland. One of these tasks is to maintain and update the Geodetic Evidence of Public Utilities (GESUT), part of the National Geodetic and Cartographic Resource, which contains information on technical infrastructure that is important to many institutions. This requires an active exchange of data between the Geodesy and Cartography Documentation Centers and the institutions that administer transmission lines. The administrator of public utilities is legally obliged to provide information about utilities to GESUT. The aim of the research work was to develop a universal data exchange methodology that can be implemented on a variety of hardware and software platforms. This methodology uses the Unified Modeling Language (UML), eXtensible Markup Language (XML), and Geography Markup Language (GML). The proposed methodology is based on two different strategies: Model Driven Architecture (MDA) and Service Oriented Architecture (SOA). The solutions used are consistent with the INSPIRE Directive and the ISO 19100 series of standards for geographic information. On the basis of an analysis of the input data structures, conceptual models were built for both databases. The models were written in the universal modeling language, UML. A combined model defining a common data structure was also built. This model was transformed into GML, the standard developed for the exchange of geographic information. The structure of the document describing the data that may be exchanged is defined in an .xsd file. Network services were selected and implemented in a system designed for data exchange based on open source tools. The methodology was implemented and tested. Data in the agreed data structure, together with metadata, were set up on the server.
Data access was provided by geospatial network services: data searching through the Catalog Service for the Web (CSW) and data collection through the Web Feature Service (WFS). WFS also provides operations for modifying data, for example so that a utility administrator can update them. The proposed solution significantly increases the efficiency of data exchange and facilitates maintenance of the National Geodetic and Cartographic Resource.
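The update operation mentioned here is a WFS Transaction (WFS-T) request. A minimal sketch of such a request body, built with the standard library: the wfs/ogc namespaces are the real OGC ones, but the type, feature id, and property names are invented for illustration:

```python
import xml.etree.ElementTree as ET

WFS = "http://www.opengis.net/wfs"
OGC = "http://www.opengis.net/ogc"
ET.register_namespace("wfs", WFS)
ET.register_namespace("ogc", OGC)

def wfs_update(type_name, feature_id, prop, value):
    """Minimal WFS-T Update request: set one property on one feature."""
    tx = ET.Element(f"{{{WFS}}}Transaction",
                    {"service": "WFS", "version": "1.1.0"})
    upd = ET.SubElement(tx, f"{{{WFS}}}Update", {"typeName": type_name})
    p = ET.SubElement(upd, f"{{{WFS}}}Property")
    ET.SubElement(p, f"{{{WFS}}}Name").text = prop
    ET.SubElement(p, f"{{{WFS}}}Value").text = value
    # Restrict the update to a single feature by its id
    flt = ET.SubElement(upd, f"{{{OGC}}}Filter")
    ET.SubElement(flt, f"{{{OGC}}}FeatureId", {"fid": feature_id})
    return ET.tostring(tx, encoding="unicode")

xml = wfs_update("gesut:utility_line", "utility_line.42",
                 "status", "verified")
print(xml)
```

The resulting document is POSTed to the WFS endpoint, which applies the change to the underlying store.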
This paper presents the results of an online survey and a usability test performed on three foreign language learning websites that use Web 2.0 technology. The online survey was conducted to gain an understanding of how current users of language learning websites use them for learning and social purposes. The usability test was conducted to gain…
The dramatic speed of new developments in Web technology often makes designing large, corporate Web pages a difficult task. The life cycle of standards used in Web design is very short, with features introduced in one version sometimes dropped in the next. The evolution of client software for the Web also proceeds quickly, but the distribution of this software doesn't
Geospatial data are important for understanding the Earth: ecosystem dynamics, land cover changes, resource management, and human interactions with the Earth, to name a few. One of the biggest difficulties users face is to discover, access, and assemble distributed, large-volume, heterogeneous geospatial data to conduct geo-analysis. Traditional methods of geospatial data discovery, visualization, and delivery lack the capabilities of resource sharing and automation across systems or organizational boundaries. They require users to download the data "as-is" in their original file format, projection, and extent. Also, discovering data served by traditional methods requires prior knowledge of data location, and processing requires specialized expertise. These drawbacks of traditional methods create an additional burden for users, introduce too much overhead to research, and also reduce the potential usage of the data. At the Oak Ridge National Laboratory (ORNL), researchers working on the NASA-sponsored Distributed Active Archive Center (DAAC) and Modeling and Synthesis Thematic Data Center (MAST-DC) projects have tapped into the benefits of Open Geospatial Consortium (OGC) standards to overcome the drawbacks of traditional methods of geospatial data discovery, visualization, and delivery. The OGC standards-based approach facilitates data sharing and interoperability across network, organizational, and geopolitical boundaries. Tools and services based on OGC standards deliver the data in many user-defined formats and allow users to visualize the data prior to download. This paper introduces an approach taken to visualize and deliver ORNL DAAC, MAST-DC, and other relevant geospatial data through OGC standards-based Web Services, including Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). It also introduces a WebGIS system built on top of OGC services that helps users discover, visualize, and access geospatial data.
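Discovery against such services starts from a GetCapabilities response listing the available layers. A sketch of parsing one with the standard library; the fragment below is trimmed down and the layer names are invented, but the element structure follows the WMS 1.1.1 style (which carries no XML namespace):

```python
import xml.etree.ElementTree as ET

# Trimmed-down, illustrative WMS 1.1.1-style capabilities fragment.
caps = """
<WMT_MS_Capabilities version="1.1.1">
  <Capability>
    <Layer>
      <Title>Example data center layers</Title>
      <Layer><Name>ndvi</Name><Title>NDVI</Title></Layer>
      <Layer><Name>land_cover</Name><Title>Land Cover</Title></Layer>
    </Layer>
  </Capability>
</WMT_MS_Capabilities>
"""

root = ET.fromstring(caps)
# Named (requestable) layers have a <Name>; container layers may not.
layers = [el.findtext("Name") for el in root.iter("Layer")
          if el.findtext("Name")]
print(layers)
```

A client would fetch the same document over HTTP from the service's GetCapabilities URL before issuing GetMap, GetCoverage, or GetFeature requests.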
Wei, Yaxing [ORNL; SanthanaVannan, Suresh K [ORNL; Cook, Robert B [ORNL
Autosomal dominant optic atrophy (ADOA) is genetically heterogeneous, with OPA1 on 3q28 being the most prevalently mutated gene. Additional loci are OPA3, OPA4, and OPA5, located at 19q13.2, 18q12.2, and 22q12.1-q13.1, respectively. Mutations in the WFS1 gene, at 4p16.3, are associated with either optic atrophy (OA) as part of the autosomal recessive Wolfram syndrome or with autosomal dominant progressive low frequency sensorineural hearing loss (LFSNHL) without any ophthalmological abnormalities. Linkage and sequence mutation analyses of the ADOA candidate genes OPA1, OPA3, OPA4, and OPA5, including the genes WFS1, GJB2, and GJB6 associated with recessive inherited OA or dominant LFSNHL, were performed. We identified one novel WFS1 missense mutation E864K, c.2590G-->A in exon 8 that co-segregates with ADOA combined with hearing impairment and impaired glucose regulation. This is the first example of autosomal dominant optic atrophy and hearing loss associated with a WFS1 mutation, supporting the notion that mutations in WFS1 as well as in OPA1 may lead to ADOA combined with impaired hearing. PMID:16648378
Eiberg, H; Hansen, L; Kjer, B; Hansen, T; Pedersen, O; Bille, M; Rosenberg, T; Tranebjaerg, L
Optic atrophy (OA) and sensorineural hearing loss (SNHL) are key abnormalities in several syndromes, including the recessively inherited Wolfram syndrome, caused by mutations in WFS1. In contrast, the association of autosomal dominant OA and SNHL without other phenotypic abnormalities is rare, and almost exclusively attributed to mutations in the Optic Atrophy-1 gene (OPA1), most commonly the p.R445H mutation. We present eight probands and their families from the US, Sweden, and UK with OA and SNHL, whom we analyzed for mutations in OPA1 and WFS1. Among these families, we found three heterozygous missense mutations in WFS1 segregating with OA and SNHL: p.A684V (six families), and two novel mutations, p.G780S and p.D797Y, all involving evolutionarily conserved amino acids and absent from 298 control chromosomes. Importantly, none of these families harbored the OPA1 p.R445H mutation. No mitochondrial DNA deletions were detected in muscle from one p.A684V patient analyzed. Finally, wolframin p.A684V mutant ectopically expressed in HEK cells showed reduced protein levels compared to wild-type wolframin, strongly indicating that the mutation is disease-causing. Our data support OA and SNHL as a phenotype caused by dominant mutations in WFS1 in these additional eight families. Importantly, our data provide the first evidence that a single, recurrent mutation in WFS1, p.A684V, may be a common cause of ADOA and SNHL, similar to the role played by the p.R445H mutation in OPA1. Our findings suggest that patients who are heterozygous for WFS1 missense mutations should be carefully clinically examined for OA and other manifestations of Wolfram syndrome. PMID:21538838
Rendtorff, Nanna D; Lodahl, Marianne; Boulahbel, Houda; Johansen, Ida R; Pandya, Arti; Welch, Katherine O; Norris, Virginia W; Arnos, Kathleen S; Bitner-Glindzicz, Maria; Emery, Sarah B; Mets, Marilyn B; Fagerheim, Toril; Eriksson, Kristina; Hansen, Lars; Bruhn, Helene; Möller, Claes; Lindholm, Sture; Ensgaard, Stefan; Lesperance, Marci M; Tranebjaerg, Lisbeth
WebMO is a World Wide Web-based interface to computational chemistry packages. WebMO is available for free and can be installed on nearly any Unix or Linux system. WebMO Pro is a commercial add-on to the freeware WebMO computational chemistry package. It features a variety of powerful enhancements that are suitable for serious education, commercial, or research-level users.
Assay data is a key information type used in the solid earth sciences, particularly for mineral exploration. As well as its conceptual importance it comprises a significant proportion of data transferred between organizations. In collaboration with both the commercial and regulatory sector, we have developed an XML encoding for assay data. The Assay Data Exchange (ADX) encoding is a GML Application Schema, based on the Observations and Measurements recommendation from the Open Geospatial Consortium (OGC), extended with assay-specific attributes. The model is normalized, distinguishing the observation event and result from the observation target (specimen) and procedure (instrument). ADX has been used in the deployment of standard web-service interfaces to the archives of agencies who are custodians of such data (geological surveys). The web-service interface is a profile of the standard http-based OGC Web Feature Service. Map-visualization and tabular-report web-clients have been deployed that access these services. In addition, a number of standard data processing and visualization software packages have been enhanced to act as clients to these sources. The immediate value of the standard format and interface is that it enables lossless transfer of assay data from laboratory to explorer, from explorer to regulator, and then from statutory custodian back to speculative explorer, using a common format. This represents the complete data transfer cycle relevant to the mineral exploration industry. However, the more general value of the project is that it demonstrates the efficiencies that can be gained by standardizing data access protocols, and the use of a consensus process within the appropriate community for design of technical standards constructed as profiles of more generic standards.
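An Observations and Measurements-style record separates procedure, observed property, specimen, and result, and a client consumes it by namespace-aware XML parsing. The fragment below is loosely modeled on that pattern; the element names and namespace are invented and the real ADX schema differs:

```python
import xml.etree.ElementTree as ET

# Illustrative observation fragment, NOT the actual ADX schema.
doc = """
<Observation xmlns="http://example.org/adx">
  <procedure>ICP-MS</procedure>
  <observedProperty>Au</observedProperty>
  <featureOfInterest>specimen-0172</featureOfInterest>
  <result uom="ppm">1.35</result>
</Observation>
"""

ns = {"adx": "http://example.org/adx"}
obs = ET.fromstring(doc)
analyte = obs.findtext("adx:observedProperty", namespaces=ns)
value = float(obs.findtext("adx:result", namespaces=ns))
uom = obs.find("adx:result", ns).get("uom")
print(analyte, value, uom)
```

Keeping the unit of measure as an attribute of the result, rather than folding it into the value, is what makes lossless laboratory-to-regulator transfer possible.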
Cox, S. J.; Woodcock, R.; Dent, A.; Girvan, S.; Atkinson, R.
Flood protection is one of several disciplines where geospatial data is a crucial component. Its management, processing, and sharing form the foundation for its efficient use; therefore, special attention is required in the development of effective, precise, standardized, and interoperable models for the discovery and publishing of data on the Web. This paper describes the design of a methodology to discover Open Geospatial Consortium (OGC) services on the Web and collect descriptive information, i.e., metadata, in a geocatalogue. A pilot implementation of the proposed methodology - a geocatalogue of geospatial information provided by OGC services discovered on Google (hereinafter "Geocatalogue") - was used to search for available resources relevant to the area of flood protection. The result is an analysis of the availability of resources discovered through their metadata collected from the OGC services (WMS, WFS, etc.) and the resources they provide (WMS layers, WFS objects, etc.) within the domain of flood protection.
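Once a candidate endpoint is found, a crawler-based geocatalogue must decide which OGC service type it is. One pragmatic way is to fetch GetCapabilities and inspect the root element; a sketch (the mapping covers the common capability document root names):

```python
import xml.etree.ElementTree as ET

def service_type(capabilities_xml):
    """Guess the OGC service type from the root element of a
    GetCapabilities response (sufficient for cataloguing purposes)."""
    root = ET.fromstring(capabilities_xml)
    tag = root.tag.split("}")[-1]          # strip any XML namespace
    mapping = {
        "WMT_MS_Capabilities": "WMS",      # WMS 1.1.1
        "WMS_Capabilities": "WMS",         # WMS 1.3.0
        "WFS_Capabilities": "WFS",
        "Capabilities": "generic OWS (e.g. WCS/CSW)",
    }
    return mapping.get(tag, "unknown")

print(service_type("<WFS_Capabilities version='1.1.0'/>"))
```

In a live crawler the argument would be the body of an HTTP GetCapabilities response rather than an inline string, with metadata (title, abstract, layers) harvested in a second pass.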
Background The development and use of Web-based programs for weight loss is increasing rapidly, yet they have rarely been evaluated using randomized controlled trials (RCTs). Interestingly, most people who attempt weight loss use commercially available programs, yet it is very uncommon for commercial programs to be evaluated independently or rigorously. Objective To compare the efficacy of a standard commercial Web-based weight-loss program (basic) versus an enhanced version of this Web program that provided additional personalized e-feedback and contact from the provider (enhanced) versus a wait-list control group (control) on weight outcomes in overweight and obese adults. Methods This purely Web-based trial using a closed online user group was an assessor-blinded RCT with participants randomly allocated to the basic or enhanced 12-week Web-based program, based on social cognitive theory, or the control, with body mass index (BMI) as the primary outcome. Results We enrolled 309 adults (129/309, 41.8% male, BMI mean 32.3, SD 4 kg/m2) with 84.1% (260/309) retention at 12 weeks. Intention-to-treat analysis showed that both intervention groups reduced their BMI compared with the controls (basic: –0.72, SD 1.1 kg/m2, enhanced: –1.0, SD 1.4, control: 0.15, SD 0.82; P < .001) and lost significant weight (basic: –2.1, SD 3.3 kg, enhanced: –3.0, SD 4.1, control: 0.4, SD 2.3; P < .001) with changes in waist circumference (basic: –2.0, SD 3.5 cm, enhanced: –3.2, SD 4.7, control: 0.5, SD 3.0; P < .001) and waist-to-height ratio (basic: –0.01, SD 0.02, enhanced: –0.02, SD 0.03, control: 0.0, SD 0.02; P < .001), but no differences were observed between the basic and enhanced groups. The addition of personalized e-feedback and contact provided limited additional benefits compared with the basic program. 
Conclusions A commercial Web-based weight-loss program can be efficacious across a range of weight-related outcomes and lifestyle behaviors and achieve clinically important weight loss. Although the provision of additional personalized feedback did not facilitate greater weight loss after 12 weeks, the impact of superior participant retention on longer-term outcomes requires further study. Further research is required to determine the optimal mix of program features that lead to the biggest treatment impact over time. Trial Registration Australian New Zealand Clinical Trials Registry (ANZCTR): 12610000197033; http://www.anzctr.org.au/trial_view.aspx?id=335159 (Archived by WebCite at http://www.webcitation.org/66Wq0Yb7U)
Morgan, Philip J; Jones, Penelope; Fletcher, Kate; Martin, Julia; Aguiar, Elroy J; Lucas, Ashlee; Neve, Melinda J; Callister, Robin
This article reports on the development of a personalized, Web-based asthma-education program for parents whose 4- to 12-year-old children have moderate to severe asthma. Personalization includes computer-based tailored messages and a human coach to build asthma self-management skills. Computerized features include the Asthma Manager, My Calendar/Reminder, My Goals, and a tailored home page. These are integrated with monthly asthma-education phone calls from an asthma nurse case manager. The authors discuss the development process and issues and describe the current randomized evaluation study to test whether the yearlong integrated intervention can improve adherence to a daily asthma controller medication, asthma control, and parent quality of life to reduce asthma-related healthcare utilization. Implications for health education for chronic disease management are raised.
Wise, Meg; Gustafson, David H.; Sorkness, Christine A.; Molfenter, Todd; Staresinic, Anthony; Meis, Tracy; Hawkins, Robert P.; Shanovich, Kathleen Kelly; Walker, Nola P.
Landslides exhibit themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the earth surface. Making landslide databases available online via the WWW (World Wide Web) helps disseminate landslide information to all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios with the help of available historic records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free & Open Source Software (FOSS). FOSS drastically reduces the cost of the project, as proprietary licenses are costly. Landslide data generated for the period 1982 to 2009 were compiled along the national highway road corridor in the Indian Himalayas. All the geo-environmental datasets, along with the landslide susceptibility map, were served through a WebGIS client interface. The open source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front end, and PostgreSQL with the PostGIS extension serves as the backend for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings an insight into the understanding of landslides and the resulting damage closer to the affected people and user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open source or proprietary GIS software.
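An OGC-compliant client consuming such a WFS receives a GML FeatureCollection, which it can summarize by attribute. A sketch with the standard library; the wfs/gml namespaces are the real OGC ones, but the landslide namespace, layer, and attribute names are invented:

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Illustrative WFS GetFeature response (GML 2 style); the real layer
# and attribute names depend on the published landslide database.
gml = """
<wfs:FeatureCollection xmlns:wfs="http://www.opengis.net/wfs"
                       xmlns:gml="http://www.opengis.net/gml"
                       xmlns:ls="http://example.org/landslides">
  <gml:featureMember><ls:slide><ls:susceptibility>high</ls:susceptibility></ls:slide></gml:featureMember>
  <gml:featureMember><ls:slide><ls:susceptibility>high</ls:susceptibility></ls:slide></gml:featureMember>
  <gml:featureMember><ls:slide><ls:susceptibility>low</ls:susceptibility></ls:slide></gml:featureMember>
</wfs:FeatureCollection>
"""

ns = {"gml": "http://www.opengis.net/gml",
      "ls": "http://example.org/landslides"}
root = ET.fromstring(gml)
# Count features per susceptibility class
classes = Counter(m.findtext(".//ls:susceptibility", namespaces=ns)
                  for m in root.findall("gml:featureMember", ns))
print(dict(classes))
```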
Aberrant DNA hypermethylation of gene promoter regions has been increasingly recognized as a common molecular alteration in carcinogenesis. We evaluated the association between major clinicopathological features and hypermethylation of genes in tumors among 803 incident breast cancer cases from a large population-based case-control study conducted in Western New York State. DNA samples were isolated from archived paraffin-embedded tumor tissue
Meng Hua Tao; Peter G. Shields; Jing Nie; Amy Millen; Christine B. Ambrosone; Stephen B. Edge; Shiva S. Krishnan; Catalin Marian; Bin Xie; Janet Winston; Maurizio Trevisan; Jo L. Freudenheim
Manually indexing the World Wide Web is obviously an impossible task, and it is even a daunting challenge for automated techniques. Web content mining is a general term used to describe these techniques, which are intended for information categorization and filtering. Web robots serve a variety of purposes, including indexing, and they can be useful or, in some cases, harmful. Web usage mining, on the other hand, is used to determine how a Web site's structure and organization affect the way users navigate the site. The Web Robots Pages (1) is an excellent starting place to learn about these automated programs. Several hundred robots are documented in a database, and a selection of papers considers proper ethics and guidelines for using robots, among other things. An article on Web mining and its subclasses is given on DM Review (2). It describes the basics of Web analysis and outlines many benefits Web mining can offer. A course homepage on Web data mining from DePaul University (3) offers a broad selection of reading material on the subject. Mostly consisting of research papers and journal articles, the documents range from general applications to specific theories and case studies. Two computer scientists from Polytechnic University propose a robust, distributed Web crawler (another term for Web robot), intended for large-scale network interaction (4). The twelve-page paper begins with the motivation for the project, and continues with a full description of the system architecture and implementation. The November 2002 issue of Computer magazine featured an article on Data Mining for Web Intelligence (5). It points out that today's Internet is lacking in many key aspects, and that Web mining will play an important role in the development of improved search engines and automatic document classification. A short poster presentation from the 2002 International World Wide Web Conference (6) introduces GeniMiner, a Web search strategy based on a genetic algorithm.
GeniMiner operates on the premise of finding a nearly optimal solution in order to minimize manual analysis of the search results. KDnuggets (7) is a free, biweekly newsletter on data and Web mining. In recent issues, special attention has been given to the Total Information Awareness project, which is investigating ways of mining the Web and email for possible information about terrorist activity. Web robots are occasionally used for malicious purposes, namely to automatically register for free email or participate in online polls. A technology that was developed to counter these robots involved using a blurred or distorted word to gain access, which could easily be read by a human but would be impossible for a robot to read. In a press release from the University of California at Berkeley (8), researchers have discovered a way to allow Web robots to crack this security system. The article describes how it was accomplished and provides motivation for more advanced security measures.
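The ethics and guidelines for Web robots mentioned above are expressed in practice through a site's robots.txt file, which well-behaved crawlers consult before fetching pages. Python's standard library can parse one directly; the file below is a made-up example:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt such as a site might publish to direct
# well-behaved Web robots away from private areas.
robots_txt = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10

User-agent: BadBot
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.can_fetch("*", "http://example.org/public/page.html"))
print(rp.can_fetch("*", "http://example.org/private/page.html"))
print(rp.can_fetch("BadBot", "http://example.org/public/page.html"))
```

In a real crawler one would call `rp.set_url(...)` and `rp.read()` to fetch the live file, and honor the crawl delay between requests.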
The celebrated PageRank algorithm has proved to be a very effective paradigm for ranking results of web search algorithms. In this paper we refine this basic paradigm to take into account several evolving prominent features of the web, and propose several algorithmic innovations. First, we analyze features of the rapidly growing
Protein turnover plays important roles in cell cycle progression, signal transduction, and differentiation. Proteins with short half-lives are involved in various regulatory processes. To better understand the regulation of cellular processes, it is important to study the key sequence-derived factors affecting short-lived protein degradation. Until now, most protein half-lives have remained unknown because of the difficulties of traditional experimental methods in measuring protein half-lives in human cells. To investigate the molecular determinants that affect short-lived proteins, a computational method is proposed in this work to recognize short-lived proteins based on sequence-derived features in human cells. In this study, we have systematically analyzed many features that may correlate with short-lived protein degradation. We found that a large fraction of proteins with signal peptides and transmembrane regions in human cells have short half-lives. We constructed an SVM-based classifier to recognize short-lived proteins, motivated by the fact that short-lived proteins play pivotal roles in the control of various cellular processes. By employing the SVM model on a human dataset, we achieved 80.8% average sensitivity and 79.8% average specificity on ten testing datasets (TE1-TE10). We also obtained 89.9%, 99% and 83.9% average accuracy on the independent validation datasets iTE1, iTE2 and iTE3, respectively. The approach proposed in this paper provides a valuable alternative for recognizing short-lived proteins in human cells, and is more accurate than the traditional N-end rule. Furthermore, the web server SProtP (http://reprod.njmu.edu.cn/sprotp) has been developed and is freely available to users. PMID:22114707
Wolfram syndrome, also named "DIDMOAD" (diabetes insipidus, diabetes mellitus, optic atrophy, and deafness), is an inherited association of juvenile-onset diabetes mellitus and optic atrophy as key diagnostic criteria. Renal tract abnormalities and neurodegenerative disorder may occur in the third and fourth decade. The wolframin gene, WFS1, associated with this syndrome, is located on chromosome 4p16.1. Many mutations have been described since the identification of WFS1 as the cause of Wolfram syndrome. We identified a new homozygous WFS1 mutation (c.1532T>C; p.Leu511Pro) causing Wolfram syndrome in a large inbred Turkish family. The patients showed early onset of IDDM, diabetes insipidus, optic atrophy, sensorineural hearing impairment and very rapid progression to renal failure before age 12 in three females. Ectopic expression of the wolframin mutant in HEK cells results in greatly reduced levels of protein expression compared to wild-type wolframin, strongly supporting that this mutation is disease-causing. The mutation showed perfect segregation with disease in the family, characterized by early and severe clinical manifestations. PMID:21968327
Web pages may be organized, indexed, searched, and navigated along several different feature dimensions. We investigate different approaches to discovering geographic context for web pages, and describe a navigational tool for browsing web resources by geographic proximity.
Educational researchers and policy makers have come to rely on data from sample surveys. However, survey research on educational issues poses some special challenges. In many respects, the survey methodology issues in educational research are the same as those throughout the social and behavioral sciences. These issues concern obtaining the best…
Recent progress in hardware and software technology opens up vistas where flexible services on large, multi-dimensional coverage data become a commodity. Interactive data browsing as with Virtual Globes, selective download, and ad-hoc analysis services are about to become routinely available, as several sites already demonstrate. However, for easy access and true machine-machine communication, Semantic Web concepts, as being investigated for vector and meta data, need to be extended to raster data and other coverage types. It will then be even more important to rely on open standards for data and service interoperability. The Open Geospatial Consortium (OGC), following a modular approach to specifying geo service interfaces, has issued the Web Coverage Service (WCS) Implementation Standard for accessing coverages or parts thereof. In contrast to the Web Map Service (WMS), which delivers imagery, WCS preserves data semantics and thus allows further processing. Together with the Web Catalog Service (CS-W) and the Web Feature Service (WFS), WCS completes the classical triad of meta, vector, and raster data. As such, they represent the core data services on which other services build. The current version of WCS is 1.1 with Corrigendum 2, also referred to as WCS 1.1.2. The WCS Standards Working Group (WCS.SWG) is continuing development of WCS in various directions. One work item is to extend WCS, which is currently confined to regularly gridded data, with support for further coverage types, such as those specified in ISO 19123. Two recently released extensions to WCS are WCS-T ("T" standing for "transactional"), which adds upload capabilities to coverage servers, and WCPS (Web Coverage Processing Service), which offers a coverage processing language, thereby bridging the gap to the generic WPS (Web Processing Service).
All this is embedded into OGC's current initiative to achieve modular topical specification suites through so-called "extensions" which add focused capabilities to some minimal "core" specification. In this talk the current status of WCS, ongoing work, and directions under consideration are outlined. Further, embedding of WCS in the larger context of OGC's modular specification framework and into SOA concepts is discussed. The author, who is co-chair of OGC's WCS Working Group (WG) and Coverages WG, presents facts and personal views on the future of large-scale coverage services.
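As a concrete illustration of the abstract above, the following minimal Python sketch assembles a KVP-encoded GetCoverage request for a WCS 1.1 endpoint. The endpoint URL, coverage identifier, and bounding box are hypothetical; the parameter names follow the WCS 1.1 KVP convention, but a real server's GetCapabilities response should be consulted for the coverages it actually offers.

```python
from urllib.parse import urlencode

def build_getcoverage_url(endpoint, identifier, bbox, crs, fmt="image/tiff"):
    """Assemble a KVP GetCoverage request for a WCS 1.1 endpoint.

    `endpoint` and `identifier` are hypothetical; real servers advertise
    their coverages via a GetCapabilities response.
    """
    params = {
        "service": "WCS",
        "version": "1.1.2",
        "request": "GetCoverage",
        "identifier": identifier,
        # Bounding box values followed by the CRS URN, per the KVP convention.
        "boundingbox": ",".join(str(v) for v in bbox) + "," + crs,
        "format": fmt,
    }
    return endpoint + "?" + urlencode(params)

url = build_getcoverage_url(
    "http://example.org/wcs",            # hypothetical server
    "elevation",                         # hypothetical coverage identifier
    (-120.0, 35.0, -119.0, 36.0),
    "urn:ogc:def:crs:EPSG::4326",
)
print(url)
```

Because WCS preserves data semantics, the response to such a request is a coverage suitable for further processing, not merely a rendered picture as with WMS.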
The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples.
Nadkarni, Prakash M.; Brandt, Cynthia M.; Marenco, Luis
The web site is a library's most important feature. Patrons use the web site for numerous functions, such as renewing materials, placing holds, requesting information, and accessing databases. The homepage is the place they turn to look up the hours, branch locations, policies, and events. Whether users are at work, at home, in a building, or on…
Sand covers only about 20 percent of the Earth's deserts. Nearly 50 percent of desert surfaces are gravel plains where removal of fine-grained material by the wind has exposed loose gravel and occasional cobbles. This web page, produced by the U.S. Geological Survey, features text and photographs that describe desert landforms, soils, plants, and the role of water in the formation of desert landscapes.
Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim
Aim: The Korean Multiple Myeloma Working Party performed a nationwide registration of multiple myeloma patients via a web-based data bank system. Methods: We retrospectively analyzed registered data from 3,209 patients since 1999. Results: The median overall survival (OS) was 50.13 months (95% confidence interval: 46.20–54.06 months). Patients ≤40 years demonstrated a longer OS than patients >65 years of age (median
Seok Jin Kim; Kihyun Kim; Byung Soo Kim; Deog-Yeon Jo; Hye Jin Kang; Jin Seok Kim; Yeung-Chul Mun; Chul Soo Kim; Sang Kyun Sohn; Hyeon-Seok Eom; Jae-Yong Kwak; Hyeok Shim; Hwi-Joong Yoon; Jong-Youl Jin; Chang-Ki Min; Hyunchoon Shin; Jong-Ho Won; Je-Jung Lee; Jung Hye Kwon; Young-Don Joo; Young Rok Do; Sung-Hyun Kim; Sukjoong Oh; Cheolwon Suh; Junglim Lee; Sung-Soo Yoon; Min Kyoung Kim; Soo-Mee Bang; Hun-Mo Ryoo; Bong-Seog Kim; Hawk Kim; Hyo Jung Kim; Yang Soo Kim; Chong Won Park; Gyeong-Won Lee; Ho Jin Shin; Seong Kyu Park; Joon Seong Park; Ho Young Kim; Dong Soon Lee; Jae Hoon Lee
WebGL leverages the power of OpenGL to present accelerated 3D graphics on a webpage. The ability to put hardware-accelerated 3D content in the browser will provide a means for the creation of new web-based applications that were previously the exclusive domain of the desktop environment. It will also allow the inclusion of features that standalone 3D applications do
Given that pairwise similarity computations are essential in ontology learning and data mining, we propose WebSim (Web-based term Similarity metric), whose feature extraction and similarity model are based on a conventional Web search engine. There are two main ways in which we can benefit from utilizing a Web search engine. First, we can obtain the freshest content for each term that
This article describes the design of a Web software mining system, discusses the techniques used in the system, and proposes solutions to issues in the system. Web crawler software was designed and implemented according to the features of the World Wide Web. Based on the information presented by Web pages, the article improves the feature selection method and keyword weighting
This study examines the dialogic features of corporate Web sites in order to determine the Web site practices of the corporations for building relationships with their publics. Content analysis of 100 Fortune 500 companies’ Web sites revealed that the corporations designed their Web sites to serve important publics and foster dialogic communication. The corporate Web sites appear to promote control
Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research...
Y. Deshpande S. Murugeesan A. Ginige S. Hansen D. Schwabe M. M. Gaedke B. White
WebOS (Web-based operating system) is a new form of operating system. You can use your desktop as a virtual desktop on the web, accessible via a browser, with multiple integrated built-in applications that allow the user to easily manage and organize her data from any location. A desktop on the web can be called a WEBtop. This paper starts with an introduction to WebOS and its benefits. For this paper, we have reviewed some of the most interesting WebOS offerings available today and tried to provide a detailed description of their features. We have identified some parameters as comparison criteria among them. A technical review, a research design, and future goals for designing better web-based operating systems are also part of this study. Findings of the study conclude the paper.
The representation depicts 4 different food webs: Antarctica, the African Grasslands, the Australia Grasslands and a Marine environment. A separate food web for scavengers and decomposers is present in the African Grasslands section. Viewers must first build the web by moving boxes with the organism's picture and name to the appropriate spot on a grid. Clues describing food requirements are given as the boxes are moved. When the boxes are correctly placed a complete food web (with arrows) is displayed.
Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research in Web application development. This paper gives an overview of Web Engineering by addressing the questions: a) why is it needed? b) what is its domain
Yogesh Deshpande; San Murugesan; Athula Ginige; Steve Hansen; Daniel Schwabe; Martin Gaedke; Bebo White
The infrastructure to gather, store and access information about our environment is improving and growing rapidly. The increasing amount of information allows us to better understand the current state of our environment and historical processes, and to simulate and predict the future state of the environment. Finer-grained spatial and temporal data and more reliable communications make it easier to model dynamic states and ephemeral features. The exchange of information within and across geospatial domains is facilitated through the use of harmonized information models. The Observations & Measurements (O&M) model, developed through OGC and standardised by ISO, is an example of such a cross-domain information model. It is used in many domains, including meteorology and hydrology as well as emergency management. O&M enables harmonized representation of common metadata belonging to the act of determining the state of a feature property, whether by sensors, simulations or humans. In addition to the resulting feature property value, information such as the result quality, and especially the time that the result applies to the feature property, can be represented. Temporal metadata is critical to modelling past and future states of a feature. The features, and the semantics of each property, are defined in domain-specific Application Schemas using the General Feature Model (GFM) from ISO 19109 and are usually encoded following ISO 19136. However, at the moment these standards provide only limited support for the representation and handling of time-varying feature data. Features like rivers, wildfires or gas plumes have a defined state - for example, geographic extent - at any given point in time. To keep track of changes, a more complex model, for example using time-series coverages, is required. Furthermore, the representation and management of feature property value changes via the service interfaces defined by OGC and ISO - namely WFS and WCS - would be rather complex.
Keeping track of feature property value corrections, or even feature (state change) cancellations, for auditing purposes is also not easy to achieve. The aviation domain has strong requirements to represent and manage the state of aeronautical features through time. Being able to efficiently encode and manage feature state changes, keep track of all changes for auditing purposes, and determine the future state of an aeronautical feature as currently known to the system is vital for aeronautical applications. In order to support these requirements, the Aeronautical Information Exchange Model (AIXM), which has been developed by the aviation domain, is based on the so-called AIXM Temporality Model (AIXM-TM). The AIXM-TM defines various rules for modeling, representing and handling the state of aeronautical features through time. This is a promising approach that can be incorporated into the GFM so that ultimately the modeling and management of time-varying feature data is supported in an interoperable and harmonized way in all geospatial domains. This presentation gives an introduction to the main concepts of the AIXM-TM. It also shows how the GFM can be extended to support time-varying feature data. Finally, the relationship of O&M and time-varying features is discussed.
In this short paper, we examine current Semantic Web application and we highlight what we see as a shift away from first generation Semantic Web applications, towards a new generation of applications, designed to exploit the large amounts of heterogeneous semantic markup, which are increasingly becoming available. Our analysis aims both to highlight the main features that can be used
In this article, the author presents several Web sites supporting electronic presentation skills. The sites featured here will help fine-tune one's skills in modeling effective presentations and provide suggestions for managing student presentations meeting National Educational Technology Standards (NETS). Most use PowerPoint, the current industry…
A multimedia public education project designed to raise awareness of the world ocean and the life within it. Find articles on the latest ocean issues, links to resources and audio clips of the radio show Ocean Report. Also features information on SeaWeb programs, such as aquaculture initiatives for both fish and their eggs (caviar), and publications.
The helpful Stat My Web site gives visitors the ability to learn about the statistics and metrics associated with any specific site. Visitors can learn when a site was created, where it is hosted, and how much it is worth. The site has two dozen features, including IP Location, Server Status, and Reciprocal Link Checker. This particular version is compatible with all operating systems.
Presents some of the perceived pedagogical challenges posed by use of the World Wide Web. Proposes that the debate surrounding use of the Web in university teaching should center on learning and not technical issues. Learning issues and challenges discussed in this article include learning approaches, using the technical features of the Web to…
Web Apollo is the first instantaneous, collaborative genomic annotation editor available on the web. One of the natural consequences following from current advances in sequencing technology is that there are more and more researchers sequencing new genomes. These researchers require tools to describe the functional features of their newly sequenced genomes. With Web Apollo researchers can use any of the common browsers (for example, Chrome or Firefox) to jointly analyze and precisely describe the features of a genome in real time, whether they are in the same room or working from opposite sides of the world.
Web-based logs contain potentially useful data with which designers can assess the usability and effectiveness of their choices. Most guides to World Wide Web (Web) design derived from artistic or usability principles feature no empirical validation, while empirical studies of Web use typically rely on observer ratings. Several sources of unobtrusive usage data are available to Web designers, including Web
The present research examined how narcissism is manifested on a social networking Web site (i.e., Facebook.com). Narcissistic personality self-reports were collected from social networking Web page owners. Then their Web pages were coded for both objective and subjective content features. Finally, strangers viewed the Web pages and rated their impression of the owner on agentic traits, communal traits, and narcissism. Narcissism predicted (a) higher levels of social activity in the online community and (b) more self-promoting content in several aspects of the social networking Web pages. Strangers who viewed the Web pages judged more narcissistic Web page owners to be more narcissistic. Finally, mediational analyses revealed several Web page content features that were influential in raters' narcissistic impressions of the owners, including quantity of social interaction, main photo self-promotion, and main photo attractiveness. Implications of the expression of narcissism in social networking communities are discussed. PMID:18599659
What is a web browser? Is it a computer program used to access sites on the World Wide Web? Is it a program that interprets HTML coding so as to display formatted data? Well, it’s all of the above and more; a web browser (or simply browser) is an application/software program that retrieves, interprets, and renders HTML or another language,
The Factsheets web application was conceived out of the requirement to create, update, publish, and maintain a web site with dynamic research and development (R and D) content. Before creating the site, a requirements discovery process was done in order to accurately capture the purpose and functionality of the site. One of the high priority requirements for the site would be that no specialized training in web page authoring would be necessary. All functions of uploading, creation, and editing of factsheets needed to be accomplished by entering data directly into web form screens generated by the application. Another important requirement of the site was to allow for access to the factsheet web pages and data via the internal Sandia Restricted Network and Sandia Open Network based on the status of the input data. Important to the owners of the web site would be to allow the published factsheets to be accessible to all personnel within the department whether or not the sheets had completed the formal Review and Approval (R and A) process. Once the factsheets had gone through the formal review and approval process, they could then be published both internally and externally based on their individual publication status. An extended requirement and feature of the site would be to provide a keyword search capability to search through the factsheets. Also, since the site currently resides on both the internal and external networks, it would need to be registered with the Sandia search engines in order to allow access to the content of the site by the search engines. To date, all of the above requirements and features have been created and implemented in the Factsheet web application. These have been accomplished by the use of flat text databases, which are discussed in greater detail later in this paper.
Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research in Web application development. This paper gives an overview of Web Engineering by addressing the questions: (a) why is it needed? (b) what is its domain of operation? (c) how does it help and what should it do to improve Web application development? and (d) how should it be incorporated in education and training? The paper discusses the significant differences that exist between Web applications and conventional software, the taxonomy of Web applications, the progress made so far and the research issues and experience of creating a specialization at the master's level. The paper reaches a conclusion that Web Engineering at this stage is a moving target since Web technologies are constantly evolving, making new types of applications possible, which in turn may require innovations in how they are built, deployed and maintained.
In this paper, we describe Swoop, a hypermedia inspired Ontology Browser and Editor based on OWL, the recently standardized Web-oriented ontology language. After discussing the design rationale and architecture of Swoop, we focus mainly on its features, using illustrative examples to highlight its use. We demonstrate that with its web-metaphor, adherence to OWL recommendations and key unique features such as
Aditya Kalyanpur; Bijan Parsia; Evren Sirin; Bernardo Cuenca Grau; James A. Hendler
Because of the complex Web structure, most approaches to focused crawling employ a local search algorithm, which searches only pages in a sub-graph of the Web. And the multi-topic nature of Web pages makes it difficult to determine the relevance of a Web page to a given topic. To address these two issues, in this paper we present a new
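The local-search behavior described in this abstract can be illustrated with a toy sketch (all pages, links, and keywords below are invented): the crawler expands only pages it judges relevant, so relevant pages reachable solely through irrelevant ones are missed, which is exactly the difficulty the abstract highlights.

```python
from collections import deque

# Toy link graph and page "content" standing in for the Web (all hypothetical).
LINKS = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["E"],
    "D": [],
    "E": ["F"],
    "F": [],
}
WORDS = {
    "A": {"sports", "news"},
    "B": {"soccer", "sports"},
    "C": {"cooking"},
    "D": {"soccer", "league"},
    "E": {"recipes"},
    "F": {"soccer"},
}

def focused_crawl(seed, topic, threshold=1):
    """Breadth-first local search: only pages whose keyword overlap with
    the topic meets the threshold are kept and have their links expanded."""
    visited, relevant = set(), []
    queue = deque([seed])
    while queue:
        page = queue.popleft()
        if page in visited:
            continue
        visited.add(page)
        if len(WORDS[page] & topic) >= threshold:
            relevant.append(page)
            queue.extend(LINKS[page])  # expand only relevant pages
    return relevant

print(focused_crawl("A", {"sports", "soccer"}))
```

Note that page F, although relevant, is reachable only through the irrelevant page C and so is never found; this "tunneling" effect is one reason focused crawlers need relevance models that go beyond purely local decisions.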
This article features "Mr. Rhine's Technology Education Web Site," a winner of the Web Site of the Month. This Web site was designed by Luke Rhine, a teacher at the Reservoir High School in Fulton, Maryland. Rhine's Web site offers course descriptions and syllabuses, class calendars, lectures and presentations, design briefs and other course…
This paper presents a web-based tool to supplement defense against security misconfiguration vulnerabilities in web applications. The tool automatically audits security configuration settings of server environments in web application development and deployment. It also offers features to automatically adjust security configuration settings and quantitatively rates the level of safety for server environments before deploying web applications. Using the
An eye tracking study was conducted to evaluate specific design features for a prototype web portal application. This software serves independent web content through separate, rectangular, user-modifiable portlets on a web page. Each of seven participants navigated across multiple web pages while conducting six specific tasks, such as removing a link from a portlet. Specific experimental questions included (1) whether
Joseph H. Goldberg; Mark J. Stimson; Marion Lewenstein; Neil Scott; Anna M. Wichansky
A Sensor Web formed of a number of different sensor pods. Each of the sensor pods includes a clock which is synchronized with a master clock so that all of the sensor pods in the Web have a synchronized clock. The synchronization is carried out by first using a coarse synchronization, which takes less power, and subsequently carrying out a fine synchronization to finely sync all the pods on the Web. After the synchronization, the pods ping their neighbors to determine which pods are listening and responding, and then listen only during time slots corresponding to those pods that respond.
Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)
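The two-phase scheme in the patent abstract can be sketched as a simple simulation; the step sizes, clock values, and pod names below are invented for illustration, and the actual synchronization messaging and power budget are not modeled.

```python
def sync_pods(master_time, pod_clocks, coarse_step=1.0, fine_step=0.001):
    """Two-phase synchronization sketch: a cheap coarse pass brings each
    pod clock within one coarse step of the master, then a fine pass
    removes the remaining offset to within one fine step."""
    synced = {}
    for pod, t in pod_clocks.items():
        # Coarse phase: jump in whole coarse steps (low power, low precision).
        offset = master_time - t
        t += round(offset / coarse_step) * coarse_step
        # Fine phase: correct the residual in fine steps.
        residual = master_time - t
        t += round(residual / fine_step) * fine_step
        synced[pod] = t
    return synced

clocks = {"pod1": 97.3, "pod2": 104.8, "pod3": 99.95}
print(sync_pods(100.0, clocks))
```

After both passes every pod clock lies within half a fine step of the master, which is what makes the subsequent slotted listening schedule possible.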
The structure of ecological communities is usually represented by food webs. In these webs, we describe species by means of vertices connected by links representing the predations. We can therefore study different webs by considering the shape (topology) of these networks. Comparing food webs by searching for regularities is of fundamental importance, because universal patterns would reveal common principles underlying the organization of different ecosystems. However, features observed in small food webs are different from those found in large ones. Furthermore, food webs (except in isolated cases) do not share general features with other types of network (including the Internet, the World Wide Web and biological webs). These features are a small-world character and a scale-free (power-law) distribution of the degree (the number of links per vertex). Here we propose to describe food webs as transportation networks by extending to them the concept of allometric scaling (how branching properties change with network size). We then decompose food webs in spanning trees and loop-forming links. We show that, whereas the number of loops varies significantly across real webs, spanning trees are characterized by universal scaling relations. PMID:12736684
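The spanning-tree analysis in this abstract can be sketched in a few lines: for each vertex i of the tree one computes A_i, the number of vertices in the subtree rooted at i, and C_i, the sum of A_j over that subtree; the universal scaling the authors report relates C to A across webs. The tiny food-web tree below is invented for illustration.

```python
# Toy spanning tree of a food web, rooted at the environment (hypothetical data).
CHILDREN = {
    "env": ["plankton", "algae"],
    "plankton": ["krill"],
    "algae": [],
    "krill": ["whale", "seal"],
    "whale": [],
    "seal": [],
}

def allometric(node):
    """Return (A, C) for the subtree rooted at `node`:
    A = number of vertices in the subtree,
    C = sum of A over all vertices of the subtree."""
    a, c = 1, 0
    for child in CHILDREN[node]:
        ca, cc = allometric(child)
        a += ca
        c += cc
    c += a  # add this subtree's own A value
    return a, c

print(allometric("env"))  # (A, C) for the whole toy web
```

Plotting C against A for every vertex (and across many webs) and fitting C ~ A^eta is how the branching exponent in such allometric analyses is estimated.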
Summary: Ondex Web is a new web-based implementation of the network visualization and exploration tools from the Ondex data integration platform. New features such as context-sensitive menus and annotation tools provide users with intuitive ways to explore and manipulate the appearance of heterogeneous biological networks. Ondex Web is open source, written in Java and can be easily embedded into Web sites as an applet. Ondex Web supports loading data from a variety of network formats, such as XGMML, NWB, Pajek and OXL. Availability and implementation: http://ondex.rothamsted.ac.uk/OndexWeb. Contact: email@example.com
Taubert, Jan; Hassani-Pak, Keywan; Castells-Brooke, Nathalie; Rawlings, Christopher J.
Presents Web sites useful for teaching about the Salem (Massachusetts) witchcraft trials. Includes Web sites that offer primary source material, collections of Web sites, teaching material, and sites that are interactive, including features, such as QuickTime movies. (CMK)
We present WebSim (Web-based Similarity metric), whose feature extraction and similarity model are based on a conventional Web search engine. By utilizing the search engine, we can obtain the freshest content for each term, representing up-to-date knowledge of the term. In comparison with previous text mining approaches that use a fixed corpus of crawled Web documents,
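WebSim's own similarity model is not spelled out in this abstract, but the general idea of deriving term similarity from search-engine hit counts can be sketched with the well-known Normalized Google Distance, using stubbed counts in place of live queries; all numbers below are hypothetical.

```python
import math

# Stubbed hit counts standing in for search-engine results (hypothetical).
HITS = {
    ("horse",): 5_000_000,
    ("rider",): 1_000_000,
    ("horse", "rider"): 400_000,
}
TOTAL = 1_000_000_000  # assumed size of the search index

def ngd(x, y):
    """Normalized Google Distance over the stubbed counts;
    smaller values mean the terms co-occur more than chance predicts."""
    fx = math.log(HITS[(x,)])
    fy = math.log(HITS[(y,)])
    fxy = math.log(HITS[tuple(sorted((x, y)))])
    return (max(fx, fy) - fxy) / (math.log(TOTAL) - min(fx, fy))

print(ngd("horse", "rider"))
```

A live system would replace the `HITS` table with actual queries, which is precisely what gives search-based metrics their freshness advantage over a fixed crawled corpus.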
This much-better-than-average specialized Web directory on Herodotus is maintained by Tim Spalding, an amateur Greek historian. The Herodotus site, launched late last year, features "over 200 links to resources about the seminal historian and his age." These include texts and translations, books about Herodotus, essays and articles, and a new links section. Spalding also provides a free email news service for his site.
Background Personally controlled health management systems (PCHMS), which include a personal health record (PHR), health management tools, and consumer resources, represent the next stage in consumer eHealth systems. It is still unclear, however, what features contribute to an engaging and efficacious PCHMS. Objective To identify features in a Web-based PCHMS that are associated with consumer utilization of primary care and counselling services, and help-seeking rates for physical and emotional well-being concerns. Methods A one-group pre/posttest online prospective study was conducted on a university campus to measure use of a PCHMS for physical and emotional well-being needs during a university academic semester (July to November 2011). The PCHMS integrated an untethered personal health record (PHR) with well-being journeys, social forums, polls, diaries, and online messaging links with a health service provider, where journeys provide information for consumer participants to engage with clinicians and health services in an actionable way. 1985 students and staff aged 18 and above with access to the Internet were recruited online. Logistic regression, the Pearson product-moment correlation coefficient, and chi-square analyses were used to associate participants’ help-seeking behaviors and health service utilization with PCHMS usage among the 709 participants eligible for analysis. Results A dose-response association was detected between the number of times a user logged into the PCHMS and the number of visits to a health care professional (P=.01), to the university counselling service (P=.03), and help-seeking rates (formal or informal) for emotional well-being matters (P=.03). No significant association was detected between participant pre-study characteristics or well-being ratings at different PCHMS login frequencies. 
Health service utilization was strongly correlated with use of a bundle of features including: online appointment booking (primary care: OR 1.74, 95% CI 1.01-3.00; counselling: OR 6.04, 95% CI 2.30-15.85), personal health record (health care professional: OR 2.82, 95% CI 1.63-4.89), the poll (health care professional: OR 1.47, 95% CI 1.02-2.12), and diary (counselling: OR 4.92, 95% CI 1.40-17.35). Help-seeking for physical well-being matters was only correlated with use of the personal health record (OR 1.73, 95% CI 1.18-2.53). Help-seeking for emotional well-being concerns (including visits to the university counselling service) was correlated with a bundle comprising the poll (formal or informal help-seeking: OR 1.03, 95% CI 1.00-1.05), diary (counselling: OR 4.92, 95% CI 1.40-17.35), and online appointment booking (counselling: OR 6.04, 95% CI 2.30-15.85). Conclusions Frequent usage of a PCHMS was significantly associated with increased consumer health service utilization and help-seeking rates for emotional health matters in a university sample. Different bundles of PCHMS features were associated with physical and emotional well-being matters. PCHMS appears to be a promising mechanism to engage consumers in help-seeking or health service utilization for physical and emotional well-being matters.
This paper discusses the application of web wrapping technology in extracting metadata from web sources. This capability has been incorporated into a software tool known as Dynamic Dublin Core/Resource Description Framework Metadata Editor (DDC/RDF-Editor) which supports metadata development and management for resources in the World Wide Web. One key feature of the editor is the ability to automatically extract relevant
Congenital tracheal web is a rare entity often misdiagnosed as refractory asthma. Clinical suspicion based on patient history, examination, and pulmonary function tests should lead to its consideration. Bronchoscopy combined with CT imaging and multiplanar reconstruction is an accepted, highly sensitive means of diagnosis. PMID:14586524
Web services are a new breed of Web applications. These independent application components are published on to the Web in such a way that other Web applications can find and use them. They take the Web to its next stage of evolution, in which software components can discover other software components and conduct business transactions. Examples of Web services include
Web de Anza: an Interactive Study Environment on Spanish Exploration and Colonization of "Alta California" 1774-1776 offers researchers, educators, and students information resources related to the study of two eighteenth-century Spanish overland expeditions from Sonora (Arizona) in New Spain to northern California. This site features primary source materials, such as the original diaries and letters of the expeditionary leader, Juan Bautista de Anza, commandant of the Presidio of Tubac in Sonora. The site also provides a wealth of additional information including bibliographies, biographies, commentaries, maps, timelines, pictures, and sounds and video clips associated with the expeditions. Although its developers claim that the site is still under construction, it contains enough information to keep visitors occupied for hours, if not days. Web de Anza is hosted by the Center for Advanced Technology in Education at the University of Oregon.
Explains extensible markup language (XML) and how it differs from hypertext markup language (HTML) and standard generalized markup language (SGML). Highlights include features of XML, including better formatting of documents, better searching capabilities, multiple uses for hyperlinking, and an increase in Web applications; Web browsers; and what…
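The searching advantage described above comes from XML's self-describing tags; a minimal sketch using an invented sample document and Python's standard library:

```python
import xml.etree.ElementTree as ET

# Unlike HTML's presentational tags, XML tags can carry domain meaning,
# so targeted searches are straightforward.  Invented sample document:
doc = """<library>
  <book year="1998"><title>XML Basics</title></book>
  <book year="1999"><title>Web Markup</title></book>
</library>"""

root = ET.fromstring(doc)
titles = [b.findtext("title") for b in root.iter("book")]
print(titles)  # ['XML Basics', 'Web Markup']
```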
Timeout mechanisms are a useful feature for web applications. However, these mechanisms need to be used with care because, if used as-is, they are vulnerable to timing attacks. This paper focuses on internal timing attacks, a particularly dangerous class of timing attacks, where the attacker needs no access to a clock. In the context of client-side web application security,
Initially available from universities and individual enthusiasts, software tools to author World Wide Web pages are maturing into very feature-rich applications and are now offered by large corporations. These applications are enabling more companies to create and maintain pages themselves on the Web or on corporate Intranets. The market continues…
Healthcare IT systems must manipulate semantically rich and highly structured clinical data in a distributed environment. To address this, the healthcare sector has developed standards for medical vocabulary (SNOMED-CT) and message information models (HL7 Version 3) that carry many of the features present in Semantic Web standards such as the Web Ontology Language (OWL). In this paper we examine this
Describes a Web-based reading course for college developmental reading classes. Discusses course features, the first week of class, the benefits of Web-based instruction, old media versus new media, designing a successful site, and the benefits of patience. (SR)
We describe the software architecture of a system for doing multi-physics simulation of a coupled fluid, thermal, and mechanical fracture problem. The system is organized as a collection of geographically-distributed software components in which each component provides a web service, and uses standard web-service protocols to interact with other components. The resulting system incorporates many features such as componentization
L. Paul Chew; Nikos Chrisochoides; S. Gopalsamy; Gerd Heber; Anthony R. Ingraffea; Edward Luke; Joaquim B. Cavalcante Neto; Keshav Pingali; Alan M. Shih; Bharat K. Soni; Paul Stodghill; David Thompson; Stephen A. Vavasis; Paul A. Wawrzynek
This paper describes the Lucent Personalized Web Assistant (LPWA), a novel software system designed to address these user concerns. Users may browse the web in a personalized, simple, private, and secure fashion using LPWA-generated aliases and other LPWA features. LPWA gen-
Eran Gabber; Phillip B. Gibbons; David M. Kristol; Yossi Matias; Alain J. Mayer
This article features the WebQuest, an inquiry-oriented activity in which some or all of the information that learners interact with comes from resources on the Internet. WebQuests, when properly constructed, are activities, usually authentic in nature, that require the student to use Internet-based resources to deepen their understanding and…
Discusses trends in Web-based learning environments. Highlights include common features of existing Web-based learning environments; challenges; the large amount of available online educational material; adaptation and dynamic interaction; real-time interaction; and monitoring online testing. (LRW)
Hong, Hong; Kinshuk; He, Xiaoqin; Patel, Ashok; Jesshope, Chris
A WWW proxy server, proxy for short, provides access to the Web for people on closed subnets who can only access the Internet through a firewall machine. The hypertext server developed at CERN, cern_httpd, is capable of running as a proxy, providing seamless external access to HTTP, Gopher, WAIS and FTP. cern_httpd has had gateway features for a long time,
We examined whether students with access to a supplemental course Web site enhanced with e-mail, discussion boards, and chat room capability reacted to it more positively than students who used a Web site with the same content but no communication features. Students used the Web sites on a voluntary basis. At the end of the semester, students…
Elicker, Joelle D.; O'Malley, Alison L.; Williams, Christine M.
Web pages are not purely text, nor are they solely HTML. This paper surveys HTML web pages, not only for their textual content but with an emphasis on higher order visual features and supplementary technology. Using a crawler with an in-house developed rendering engine, data on a pseudo-random sample of web pages is collected. First, several basic attributes are collected to
Brought to the Internet by the National Biological Information Infrastructure (NBII) (reviewed in the November 12, 1997 Scout Report for Science & Engineering), FrogWeb offers summaries of up-to-date research news related to Amphibians. Currently featured is the recent finding that parasites, not pollution, may be implicated in some amphibian declines. The Research section offers hyperlinked summaries of research on amphibian ecology, amphibian declines, and deformities. The News Releases section links users to current media coverage of frog research, and Education provides learning materials for K-12 students. For anyone interested in learning more about the decline of amphibians, this site offers a useful place from which to start.
Mono Lake Web Site is the homepage of the Mono Lake Committee and offers helpful information regarding the unique hypersaline and alkaline environment. Visitors will find information about the Mono Lake Committee, natural and political histories of the area, related water policies, a photo gallery with image descriptions, and links to related sites- including a clearinghouse. Those interested in Mono Basin birds will find sightings, counts, bird walks, and other related information. An additional feature, Mono Lake Live, offers up-to-the-minute data on road conditions, satellite images, weather, lake level, bird sightings, snow pack, and earthquakes.
TechWeb Network is an online resource for IT professionals providing "contextual access to the resources of CMP's network of industry-leading technology publications." The site offers this handy encyclopedia of technology terms where visitors can search a database of over 20,000 IT terms. Results are given as short definitions with links to related terms, along with links to definitions of other terms that are similar to the original term requested. For fun, visitors will also find a featured "random definition" and can browse the top 10 requested definitions.
Web browsers are the application software used to access information from the World Wide Web. With their increasing popularity, modern web browsers are designed to offer more features than their predecessors. To transfer information, various protocols have been implemented in these browsers; protocols at different layers serve different functions, and improving the efficiency of these protocols makes the browsers themselves work more efficiently.
To make the web work better for science, OSTI has developed state-of-the-art technologies and services including a deep web search capability. The deep web includes content in searchable databases available to web users but not accessible by popular search engines, such as Google. This video provides an introduction to the deep web search engine.
The web crawler space is often delimited into two general areas: full-web crawling and focused crawling. We present netSifter, a crawler system which integrates features from these two areas to provide an effective mechanism for web-scale crawling. netSifter utilizes a combination of page-level analytics and heuristics which are applied to a sample of web pages from a given website.
Ivan Gonzalez; Adam Marcus; Daniel N. Meredith; Linda A. Nguyen
This "web museum" devoted to vintage calculators shows "the evolution from mechanical calculator to hand held electronic calculator." Some items featured include: Mechanical and early electronic desk calculators, "strange hand-held calculators," and articles, photographs, and databases from the archives of the International Association of Calculator Collectors. A history of the technology and information on British and sterling currency calculators are also posted here. The website also offers a Calculator time-line (chronicling calculator developments), background on the technology used by mechanical and early electronic calculators, and information on The Calculator Business. An index allows visitors to search the calculators featured on this site. The Puzzle Corner section asks visitors to contact them with any information that may answer unresolved questions regarding vintage calculators.
Web Assistant Private 2004 gives users the ability to archive all websites of note offline, something that will come in handy for those looking to peruse any number of websites when they are unavailable to connect to the Internet. Some of the features of the application include a hierarchical archive structure that represents a mirror of every website, topic-specific archiving of webpages, and the filtration of unwanted material from each site. This version of Web Assistant Private 2004 is compatible with all systems running Windows 95 and higher.
The web is a potentially useful corpus for language study because it provides examples of language that are contextualized and authentic, and is large and easily searchable. However, web contents are heterogeneous in the extreme, uncontrolled and hence "dirty," and exhibit features different from the written and spoken texts in other linguistic…
Lists equipment necessary to get connected to the Web at home or at school and additional equipment needed to maintain a Web site and publish on the Internet. Provides short cuts and tips for maintaining a Web site. (PA)
To discriminate spam Web hosts/pages from normal ones, text-based and link-based data are provided for Web Spam Challenge Track II. Given a small part of labeled nodes (about 10%) in a Web linkage graph, the challenge is to predict other nodes' class to be spam or normal. We extract features from link-based data, and then combine them with text-based features.
Yuchun Tang; Yuanchen He; Sven Krasser; Paul Judge
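Combining link-based and text-based features, as described above, amounts to concatenating per-host feature vectors before classification; a minimal sketch with invented feature values:

```python
def combined_features(link_feats, text_feats):
    """Concatenate link-based and text-based feature vectors
    for every host present in both feature sets."""
    shared = link_feats.keys() & text_feats.keys()
    return {h: link_feats[h] + text_feats[h] for h in shared}

# Invented toy features, e.g. [in/out link ratio, out-degree] and [spam-word rate].
link_feats = {"hostA": [0.2, 5.0], "hostB": [0.9, 1.0]}
text_feats = {"hostA": [0.1], "hostB": [0.8]}
combined = combined_features(link_feats, text_feats)
print(combined["hostA"])  # [0.2, 5.0, 0.1]
```

The combined vectors could then feed any standard classifier.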
Presents a prototype system for image retrieval from the Internet using Web mining. Discusses the architecture of the Web image retrieval prototype; document space modeling; user log mining; and image retrieval experiments to evaluate the proposed system. (AEF)
... Web links to relevant CERES information: CERES references, ... Instrument Working Group Home Page, Aerosol Retrieval Web Page (Center for Satellite Applications and Research) ...
Web-based portals have been increasingly used to integrate information from a variety of back-end Web services. In these systems, security is a critical feature. This paper presents a SAML/XACML-based access control between portal and Web services to extend the authentication and authorization mechanism within a portal to external Web services.
Web technologies have achieved significant improvements in recent years, but many application areas are not yet Web-impacted. Upcoming software products enhance the feature sets of Web browsers and make it possible to use systems based on new Web technologies as adv...
Background The World Wide Web (WWW) has become an increasingly essential resource for health information consumers. The ability to obtain accurate medical information online quickly, conveniently and privately provides health consumers with the opportunity to make informed decisions and participate actively in their personal care. Little is known, however, about whether the content of this online health information is equally accessible to people with disabilities who must rely on special devices or technologies to process online information due to their visual, hearing, mobility, or cognitive limitations. Objective To construct a framework for an automated Web accessibility evaluation; to evaluate the state of accessibility of consumer health information Web sites; and to investigate the possible relationships between accessibility and other features of the Web sites, including function, popularity and importance. Methods We carried out a cross-sectional study of the state of accessibility of health information Web sites to people with disabilities. We selected 108 consumer health information Web sites from the directory service of a Web search engine. A measurement framework was constructed to automatically measure the level of Web Accessibility Barriers (WAB) of Web sites following Web accessibility specifications. We investigated whether there was a difference between WAB scores across various functional categories of the Web sites, and also evaluated the correlation between the WAB and Alexa traffic rank and Google Page Rank of the Web sites. Results We found that none of the Web sites we looked at are completely accessible to people with disabilities, i.e., there were no sites that had no violation of Web accessibility rules. However, governmental and educational health information Web sites do exhibit better Web accessibility than the other categories of Web sites (P < 0.001). 
We also found that the correlation between the WAB score and the popularity of a Web site is statistically significant (r = 0.28, P < 0.05), although there is no correlation between the WAB score and the importance of the Web sites (r = 0.15, P = 0.111). Conclusions Evaluation of health information Web sites shows that no Web site scrupulously abides by Web accessibility specifications, even for entities mandated under relevant laws and regulations. Government and education Web sites show better performance than Web sites among other categories. Accessibility of a Web site may have a positive impact on its popularity in general. However, the Web accessibility of a Web site may not have a significant relationship with its importance on the Web.
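The Pearson correlations reported above (e.g. r = 0.28 between WAB score and popularity) can be computed as follows; the WAB scores and traffic ranks below are invented toy values, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented WAB scores and Alexa-style traffic ranks for five sites.
wab = [2.1, 3.4, 1.8, 4.0, 2.9]
rank = [120, 340, 90, 500, 260]
print(round(pearson_r(wab, rank), 3))
```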
Service composition is gaining momentum as the potential silver bullet for the envisioned Semantic Web. It purports to take the Web to unexplored efficiencies and provide a flexible approach for promoting all types of activities in tomorrow's Web. Applications expected to heavily take advantage of Web service composition include B2B E-commerce and E-government. To date, enabling composite services has largely
Brahim Medjahed; Athman Bouguettaya; Ahmed K. Elmagarmid
Planners Web is a great way to get acquainted with the field of urban planning as it provides commentary on the rise of the "creative class" and conversations about designing walkable cities. First-time visitors can check out the What's New area to read through compelling articles that include "Ten Things You Should Know About Project Opponents" and "Transportation That Works for All Walks of Life." Moving on, visitors can look over great sections that cover American planning history ("A is for Automobile to Z is for Zoning") and The Secrets of Successful Communities. On the top of the homepage, visitors will find sections such as Basic Tools, Planning Topics, and Today's Planning News. Also, visitors may wish to sign up for free email updates.
Using case reports and a review of the literature, the clinical features of envenomation by the genus of Australian funnel web spiders known as Hadronyche are characterised. Five cases are reported here, including the first life-threatening envenomation by Hadronyche species 14 (the Port Macquarie funnel web). Two severe envenomations by Hadronyche cerberea (the Southern Tree funnel web) and one each
Mark K. Miller; Ian M. Whyte; Julian White; Philippa M. Keir
This paper addresses the problems related to multipoint distribution of Web documents over Internet. We present a multicast Web application which allows the sharing of Web resources among a group of people by using the MBONE technology. We describe a general-purposed light-weight reliable multicast transport protocol (LRMP) which is an important building block of the application. We will also discuss
Food webs, the networks of feeding links between species, are central to our understanding of ecosystem structure, stability, and function. One of the key aspects of food web structure is complexity, or connectance, the number of links expressed as a proportion of the total possible number of links. Connectance (complexity) is linked to the stability of webs and is a key parameter in recent models of other aspects of web structure. However, there is still no fundamental biological explanation for connectance in food webs. Here, we propose that constraints on diet breadth, driven by optimal foraging, provide such an explanation. We show that a simple diet breadth model predicts highly constrained values of connectance as an emergent consequence of individual foraging behavior. When combined with features of real food web data, such as taxonomic and trophic aggregation and cumulative sampling of diets, the model predicts well the levels of connectance and scaling of connectance with species richness, seen in real food webs. This result is a previously undescribed synthesis of foraging theory and food web theory, in which network properties emerge from the behavior of individuals and, as such, provides a mechanistic explanation of connectance currently lacking in food web models.
Beckerman, Andrew P.; Petchey, Owen L.; Warren, Philip H.
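Connectance as defined above (realized links as a proportion of the S^2 possible links) is simple to compute; the four-species web below is invented for illustration, not data from the paper:

```python
def connectance(links, species):
    """Directed connectance C = L / S^2: realized feeding links
    over all possible links among S species."""
    s = len(species)
    return len(links) / s ** 2

# Hypothetical web of (consumer, resource) pairs.
species = ["alga", "snail", "crab", "gull"]
links = [("snail", "alga"), ("crab", "snail"),
         ("gull", "crab"), ("gull", "snail")]
print(connectance(links, species))  # 4 / 16 = 0.25
```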
We propose a number of features for Web spam filtering based on the occurrence of keywords that are either of high advertisement value or highly spammed. Our features include popular words from search engine query logs as well as high cost or volume words according to Google AdWords. We also demonstrate the spam filtering power of the Online Commercial Intention
András A. Benczúr; István Bíró; Károly Csalogány; Tamás Sarlós
A framework to support the creation of location based services (LBS) applications using map services is designed and implemented. It provides a means to develop interoperable LBS applications that are independent of service provider, device, or data provider. Application developers can use the framework to utilize multiple map servers following the Web Map Service (WMS) and Web Feature Service (WFS) standards in
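A framework wrapping WMS servers would, at minimum, build KVP GetMap requests like the sketch below; the endpoint URL, layer name, and bounding box are invented placeholders:

```python
from urllib.parse import urlencode

WMS_URL = "http://example.org/wms"  # hypothetical endpoint

def getmap_url(layers, bbox, size=(256, 256), srs="EPSG:4326"):
    """Build a WMS 1.1.1 GetMap request URL (KVP encoding)."""
    params = {
        "service": "WMS",
        "version": "1.1.1",
        "request": "GetMap",
        "layers": ",".join(layers),
        "bbox": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "width": size[0],
        "height": size[1],
        "srs": srs,
        "format": "image/png",
    }
    return WMS_URL + "?" + urlencode(params)

print(getmap_url(["roads"], (-77.5, 38.5, -76.5, 39.5)))
```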
The OWL Web Ontology Language is a new formal language for representing ontologies in the Semantic Web. OWL has features from several families of representation languages, including primarily Description Logics and frames. OWL also shares many characteristics with RDF, the W3C base of the Semantic Web. In this paper we discuss how the philosophy and features of OWL can
Ian Horrocks; Peter F. Patel-schneider; Frank Van Harmelen
WebCIS is a Web-based clinical information system. It sits atop the existing Columbia University clinical information system architecture, which includes a clinical repository, the Medical Entities Dictionary, an HL7 interface engine, and an Arden Syntax based clinical event monitor. WebCIS security features include authentication with secure tokens, authorization maintained in an LDAP server, SSL encryption, permanent audit logs, and application time outs. WebCIS is currently used by 810 physicians at the Columbia-Presbyterian center of New York Presbyterian Healthcare to review and enter data into the electronic medical record. Current deployment challenges include maintaining adequate database performance despite complex queries, replacing large numbers of computers that cannot run modern Web browsers, and training users that have never logged onto the Web. Although the raised expectations and higher goals have increased deployment costs, the end result is a far more functional, far more available system. PMID:10566471
Our current understanding of Web structure is based on large graphs created by centralized crawlers and indexers. They obtain data almost exclusively from the so-called surface Web, which consists, loosely speaking, of interlinked HTML pages. The deep Web, by contrast, is information that is reachable over the Web, but that resides in databases; it is dynamically available in response to
Tools for the assessment of the quality and reliability of Web applications are based on the possibility of downloading the target of the analysis. This is achieved through Web crawlers, which can automatically navigate within a Web site and perform proper actions (such as download) during the visit. The most important performance indicators for a Web crawler are its completeness
Data stored outside Web pages and accessible from the Web, typically through HTML forms, constitute the so-called Deep Web. Such data are of great value, but difficult to query and search. We survey techniques to optimize query processing on the Deep Web, in a setting where data are represented in the relational model. We illustrate optimizations both at query plan
The NCI website provides links to other websites for informational purposes and for the convenience of the public. If a user selects an external website, they will leave the NCI website and be subject to the privacy and security policies of that site.
Discusses problems that users with disabilities, particularly visual impairments, have with Web-based classes. Discusses efforts by the Texas Commission for the Blind to offer Web-based training to visually impaired staff members; and explains how to test a Web site for accessibility and how to make a Web site more accessible and effective. (LRW)
The Villanova Center for Information Law and Policy is pleased to announce the Federal Web Locator service. This is a World Wide Web page for accessing over 210 different World Wide Web servers with federal government information. The Federal Web Locator is intended to be a one-stop kiosk for jumping off to federal sites.
Web services make information and software available programmatically via the Internet and may be used as building blocks for applications. A composite web service is one that is built using multiple component web services. Composite web services are emerging as a programming model of distributed computation applicable to business processes. Once its specification has been developed, the composite service
We present a new approach in web search engines. The web creates new challenges for information retrieval. The vast improvement in information access is not the only advantage resulting from the keyword search. Additionally, much potential exists for analyzing interests and relationships within the structure of the web. The creation of a hyperlink by the author of a web
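Link-structure analysis of the kind described is exemplified by PageRank-style power iteration; the following is an illustrative sketch over an invented three-page graph, not the authors' algorithm:

```python
def pagerank(graph, damping=0.85, iters=50):
    """Power-iteration PageRank over {page: [outlinks]}."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in graph.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:                       # dangling page: spread rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Invented toy graph: page "c" is linked from both "a" and "b".
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
print(max(ranks, key=ranks.get))  # c
```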
Defines Web literacy, a subset of information literacy, as the ability to access, search, utilize, communicate, and create information on the World Wide Web. Offers 10 stages toward Web literacy, including using hyperlinks and bookmarks, an information resource for research, creating classroom lessons, guiding student use, and creating Web pages.…
The Web Style Guide, 2nd Edition, which is the online version of a book with the same name, demonstrates the step-by-step process involved in designing a Web site. Visitors are assumed to be familiar with whatever Web publishing tool they are using. The guide gives few technical details but instead focuses on the usability, layout, and attractiveness of a Web site, with the goal being to make it as popular with the intended audience as possible. Considerations such as graphics, typography, and multimedia enhancements are discussed. Web site structure, fine-tuned features on individual pages, and almost everything in between is addressed by the guide, making it a handy resource for people who place great importance on the effectiveness of their online creations.
... 5: ``Adequacy of Design Features and Functional...'' ... licenses, design approvals, design certifications, manufacturing ... Federal rulemaking Web site: Go to http ...
In this tutorial we provide an insight into Web Mining, i.e., discovering knowledge from the World Wide Web, especially with reference to the latest developments in Web technology. The topics covered are: the Deep Web, also known as the Hidden Web or Invisible Web; the Semantic Web including standards such as RDFS and OWL; the eXtensible Markup Language
Aparna S. Varde; Fabian M. Suchanek; Richi Nayak; Pierre Senellart
BioWeb has two different parts: the client side and the server side. The client part consists of DHTML pages and the browser, a common navigator (Netscape). The system simulates a website with a UserId/Password to control access to it; indeed, BioWeb has several features, such as the capability to register new users, simulate a login, and the
We present DeepPeep (http://www.deeppeep.org), a new system for discovering, organizing and analyzing Web forms. DeepPeep allows users to explore the entry points to hidden-Web sites whose contents are out of reach for traditional search engines. Besides demonstrating important features of DeepPeep and describing the infrastructure we used to build the system, we will show how this infrastructure can be used
Luciano Barbosa; Hoa Nguyen; Thanh Nguyen; Ramesh Pinnamaneni; Juliana Freire
We present here the design and features of the Web Enabled Source Identifier with X-Matching (WESIX). With the proliferation of large imaging surveys, it has become increasingly apparent that tasks performed frequently by astronomers need to be made available in a web-aware manner. The reasons for this are twofold: First, it is no longer feasible to work with the complete data sets. Calculations are much more efficient if they can be carried out at the data center where large files can be transferred quickly. Second, exploratory science can be greatly facilitated by combining common tasks into integrated web services. WESIX addresses both of these issues. It is deployable to large data centers where source identification can be carried out at the data source. In addition, WESIX can transparently leverage the capabilities of Open SkyQuery to crossmatch with large catalogs. The result is a web-based service that integrates object detection with the ability to crossmatch against published catalog data. In this chapter we will discuss how WESIX is constructed, its functionality and some example usage. Section 1 will give a brief overview of the architecture of the service. Section 2 will introduce the features of the service through both the web browser and SOAP web service interfaces. Section 3 gives a detailed overview of the web service methods. Section 4 walks through the example client distributed with the software package.
The Marine Biology Web, created by veteran marine biologist Dr. Jeff Levinton of the State University of New York at Stony Brook, is a great educational resource for both curious students and prospective marine biologists. The Becoming a Marine Biologist page gives students frank advice, and a realistic sense of what marine biology is and what marine biologists do. This website contains a sizeable list of hyperlinked marine labs, institutes, graduate programs, and undergraduate programs. A nice list of marine biology-related internships and courses is included as well. The website also features the useful MBREF - A Reference Source for Marine Biology Student Research. The site even links to a system that allows visitors "to obtain tidal predictions computed by CO-OPS for more than 3000 water level stations."
Presents a theory of feature representation that accounts for feature indeterminacy and feature resolution within the lexical functional grammar (LFG) framework. The representations discussed, together with minimal extensions of LFG's description language, enable a simple and intuitive characterization of both these phenomena. (Author/VWL)
The overloaded term Web 2.0 web site usually connotes an interactive web application that offers features normally associated with free-standing applications running directly under the control of an operating system. Such an interactive web application, also known as a rich internet application (RIA), runs within web browsers and must download XHTML and client-side scripts to control user interactivity. Via a
The WebQuest website offers various resources for teachers looking to use the WebQuest model to teach with the Web. A WebQuest "is an inquiry-oriented activity in which most or all of the information used by learners is drawn from the Web." The model, developed in 1995 at San Diego State University by Bernie Dodge with Tom March, has received significant attention in recent years. The bulk of the website is located in the Readings and Trainings Materials section, where teachers can find the paper that started the WebQuest project, as well as various articles providing different perspectives on what the WebQuest is all about and how to proceed to create your own lesson using the Internet. Examples of WebQuests created by teachers and a template with sections such as Introduction, Task, Process, Evaluation, Resources, and Conclusion help guide you through the process. The Portal provides updates on WebQuest news (mostly workshops and conferences) and a link to the Top rated WebQuests, as well as some "Middling" WebQuests, and new ones that have not yet been rated. The WebQuests in their database as well as various articles can also be searched from the Portal. The Forum section is a place for "conversations about using and extending the WebQuest model."
Ignored in the finalized Master Settlement Agreement (National Association of Attorneys General, 1998), the unmonitored, unregulated World Wide Web (Web) can operate as a major vehicle for delivering pro-tobacco messages, images, and products to millions of young consumers. A content analysis of 318 randomly sampled pro-tobacco Web sites revealed that tobacco has a pervasive presence on the Web, especially on e-commerce sites and sites featuring hobbies, recreation, and "fetishes." Products can be ordered online on nearly 50% of the sites, but only 23% of the sites included underage verification. Further, only 11% of these sites contain health warnings. Instead, pro-tobacco sites frequently associate smoking with "glamorous" and "alternative" lifestyles, and with images of young males and young (thin, attractive) females. Finally, many of the Web sites offered interactive site features that are potentially appealing to young Web users. Recommendations for future research and counterstrategies are discussed. PMID:12356288
Geospatial data sharing is an increasingly important subject as large amounts of data are produced by a variety of sources, stored in incompatible formats, and accessible through different GIS applications. Past efforts to enable sharing have produced standardized data formats such as GML and data access protocols such as Web Feature Service (WFS). While these standards help client applications gain access to heterogeneous data stored in different formats from diverse sources, the usability of that access is limited by the lack of data semantics encoded in the WFS feature types. Past research has used ontology languages to describe the semantics of geospatial data, but ontology-based queries cannot be applied directly to legacy data stored in databases or shapefiles, or to feature data in WFS services. This paper presents a method to enable ontology queries on spatial data available from WFS services and on data stored in databases. We do not create ontology instances explicitly and thus avoid the problems of data replication. Instead, user queries are rewritten into WFS GetFeature requests and SQL queries to databases. The method also has the benefit of utilizing existing tools for databases, WFS, and GML while enabling queries based on ontology semantics. © 2008 Springer-Verlag Berlin Heidelberg.
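The rewriting step described in this record (ontology query to WFS GetFeature request) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the ontology-class-to-feature-type mapping and the endpoint URL are invented, while the key-value-pair parameters (`service`, `version`, `request`, `typename`, `bbox`) follow the standard WFS 1.1.0 convention.

```python
from urllib.parse import urlencode

# Hypothetical mapping from ontology classes to WFS feature types;
# the names below are illustrative, not from the paper.
ONTOLOGY_TO_TYPENAME = {
    "transport:Road": "topp:roads",
    "hydro:River": "topp:rivers",
}

def rewrite_to_getfeature(base_url, ontology_class, bbox=None, max_features=None):
    """Rewrite a simple ontology-level query into a WFS 1.1.0 GetFeature
    key-value-pair request URL (one possible rewriting of this kind)."""
    typename = ONTOLOGY_TO_TYPENAME[ontology_class]
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typename": typename,
    }
    if bbox is not None:
        params["bbox"] = ",".join(str(c) for c in bbox)
    if max_features is not None:
        params["maxFeatures"] = str(max_features)
    return base_url + "?" + urlencode(params)

url = rewrite_to_getfeature("http://example.com/wfs", "transport:Road",
                            bbox=(-74.1, 40.6, -73.7, 40.9), max_features=50)
print(url)
```

A full implementation would also translate ontology property filters into OGC Filter expressions in the request body; the sketch covers only the type and bounding-box part.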
The Jet Propulsion Laboratory (JPL) Web Team's activities were directed toward identifying and attacking problem areas in the growth of dendritic web ribbon, complementing the program at Westinghouse Electric Corp.
This worksheet outlines a method to evaluate the quality and reliability of a web site. It could be given to students to help them develop the ability to make critical judgements of web-based materials.
This set of online activities consists of matching games about meadow, arctic and pond food webs. Intended for younger students, each game involves placing images of various plants and animals into their proper places in the food webs.
Django is a modern Python web framework that redefined web development in the Python world. A full-stack approach, pragmatic design, and superb documentation are some of the reasons for its success.
Josh Juneau; Jim Baker; Victor Ng; Leo Soto; Frank Wierzbicki
The vast and increasing amount of Web resources makes evident that lexical-statistical aids, by themselves, cannot resolve the information retrieval problems. The Semantic Web project tries to diminish these problems (and much more) by the
Web services are a key enabling technology for information interoperability and serve as the foundation for spatial Web portals. However, the quality of web services varies and becomes one of the challenges for deploying a spatial web portal. We conducted initial research and developed a spatial web service evaluator to assess the qualities of web services, taking Open Geospatial Consortium
As the Web is expanding, the Web traffic is changing, and so are the users. To get an adequate understanding of the Web as a medium, it is of paramount importance to know the users and the dominant usage patterns. The purpose of this paper is to map important aspects of Web usage patterns, Web traffic and Web search behavior. Issues dealt with…
In this chapter, we look for a relationship between the intent of Web pages, their architecture, and the communities who take part in their usage and creation. For us, the Web page is an entity carrying information around these communities. Our chapter describes techniques which can be used to extract this information, as well as tools usable in its analysis. Thanks to our approach, information about communities could be used in several ways. Finally, we present experiments which prove the feasibility of our approach. These experiments also show a possible way to measure the similarity of Web pages and Web sites using microgenres. We define the microgenre as a building block of Web pages which is based on social interaction.
The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples. PMID:10887163
Created and maintained by Georgia Southern professor of psychology Russell Dewey, this Website offers a wealth of materials for students and researchers in the general field of psychology. Perhaps the site's most impressive feature is a searchable journals database offering a directory of annotated links to hundreds of online journals in psychology and related fields -- some of which offer free, full-text access. But there's a great deal more on the site as well, including annotated links to metasites in the field and APA style guidelines and tutorials; links to departments of psychology around the world; and a directory of annotated resources on various subfields and related topics such as statistics, social psychology, abnormal psychology, language and speech, memory, testing and assessment, behavioral psychology, career issues, cognitive science, and hundreds of others. There are also instructional materials on specific topics posted here such as hypnosis and lucid dreaming, sports psychology, psychology of religion, and cognitive therapy. Definitely a site for any undergraduate or graduate student of psychology to visit and bookmark.
In studying actual Web searching by the public at large, we analyzed over one million Web queries by users of the Excite search engine. We found that most people use few search terms, few modified queries, view few Web pages, and rarely use advanced search features. A small number of search terms are used with high frequency, and a great many terms are
Amanda Spink; Dietmar Wolfram; Major B. J. Jansen; Tefko Saracevic
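Descriptive statistics of the kind reported in the Excite study (terms per query, skewed term frequencies) take only a few lines to compute. A toy sketch over invented queries, not the Excite data:

```python
from collections import Counter

# Toy sample of search-engine queries (illustrative stand-ins, not Excite logs).
queries = [
    "weather", "cheap flights", "python", "weather",
    "java tutorial", "weather forecast", "python",
]

# Terms per query: the study's finding was that most queries are short.
terms_per_query = [len(q.split()) for q in queries]
avg_terms = sum(terms_per_query) / len(terms_per_query)

# A small number of terms accounts for a large share of all occurrences.
term_freq = Counter(t for q in queries for t in q.split())
print(avg_terms, term_freq.most_common(2))
```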
In this chapter, we'll explore what you can do with the most powerful communication feature in the HTML5 specification: HTML5 WebSockets, which defines a full-duplex communication channel that operates through a single socket over the web. WebSocket is not just another incremental enhancement to conventional HTTP communications; it represents a large advance, especially for real-time, event-driven web applications.
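The single socket mentioned above is negotiated by an HTTP upgrade handshake. The sketch below computes the server's `Sec-WebSocket-Accept` value exactly as RFC 6455 specifies, checked against the worked example given in the RFC itself:

```python
import base64
import hashlib

# RFC 6455: the server proves it speaks WebSocket by hashing the client's
# Sec-WebSocket-Key together with this fixed GUID.
WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"

def accept_key(sec_websocket_key: str) -> str:
    """Compute the Sec-WebSocket-Accept header for the opening handshake."""
    digest = hashlib.sha1((sec_websocket_key + WS_GUID).encode("ascii")).digest()
    return base64.b64encode(digest).decode("ascii")

# Sample key/accept pair taken from RFC 6455.
print(accept_key("dGhlIHNhbXBsZSBub25jZQ=="))  # s3pPLMBiTxaQ9kYGzzhZRbK+xOo=
```

After this handshake succeeds, both sides exchange framed messages over the same TCP connection, which is what makes the channel full-duplex.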
The Deep Web, i.e., content hidden behind HTML forms, has long been acknowledged as a significant gap in search engine coverage. Since it represents a large portion of the structured data on the Web, accessing Deep-Web content has been a long-standing challenge for the database community. This paper describes a system for surfacing Deep-Web content, i.e., pre-computing submissions for
Jayant Madhavan; David Ko; Lucja Kot; Vignesh Ganapathy; Alex Rasmussen; Alon Y. Halevy
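"Surfacing" can be illustrated by pre-computing one GET URL per combination of form-input values, so that the resulting pages become crawlable like ordinary links. The form action and inputs below are hypothetical, not from the paper's corpus:

```python
from itertools import product
from urllib.parse import urlencode

# Hypothetical search form with two select inputs.
form_action = "http://example.com/search"
form_inputs = {
    "make": ["honda", "toyota"],
    "year": ["2013", "2014"],
}

def surface(action, inputs):
    """Pre-compute one GET URL per combination of form-input values."""
    names = sorted(inputs)
    urls = []
    for combo in product(*(inputs[n] for n in names)):
        urls.append(action + "?" + urlencode(dict(zip(names, combo))))
    return urls

urls = surface(form_action, form_inputs)
print(len(urls))  # 4
```

The real system additionally has to guess good values for free-text inputs and prune combinations that return empty or duplicate result pages; the sketch shows only the enumeration step.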
The potential to achieve dynamic, scalable and cost-effective marketplaces and eCommerce solutions has driven recent research efforts towards so-called Semantic Web Services, that are enriching Web services with machine-processable semantics. To this end, the Web Service Modeling Ontology (WSMO) provides the conceptual underpinning and a formal language for semantically describing all relevant aspects of Web services in order
Dumitru Roman; Uwe Keller; Holger Lausen; Jos De Bruijn; Rubén Lara; Michael Stollberg; Axel Polleres; Cristina Feier; Christoph Bussler; Dieter Fensel
The World Wide Web is an interlinked collection of billions of documents formatted using HTML. Ironically, the very size of this collection has become an obstacle for information retrieval. The user has to sift through scores of pages to come upon the information he/she desires. Web crawlers are the heart of search engines. Web crawlers continuously keep on crawling the
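The crawling loop at the heart of a search engine is essentially a breadth-first traversal of the link graph. A minimal sketch over an in-memory graph; a real crawler would fetch each URL over HTTP and parse out its links instead of reading this dictionary:

```python
from collections import deque

# Toy link graph standing in for fetched pages.
LINKS = {
    "a.html": ["b.html", "c.html"],
    "b.html": ["c.html", "d.html"],
    "c.html": [],
    "d.html": ["a.html"],
}

def crawl(seed):
    """Breadth-first crawl: visit each reachable page exactly once."""
    seen, order, frontier = {seed}, [], deque([seed])
    while frontier:
        page = frontier.popleft()
        order.append(page)
        for link in LINKS.get(page, []):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return order

print(crawl("a.html"))  # ['a.html', 'b.html', 'c.html', 'd.html']
```

Production crawlers add politeness delays, robots.txt handling, and revisit scheduling on top of this loop.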
Web usability focuses on design elements and processes that make web pages easy to use. A website for college students was evaluated for underutilization. One-on-one testing, focus groups, web analytics, peer university review and marketing focus group and demographic data were utilized to conduct usability evaluation. The results indicated that…
The rapid diffusion of Internet and open standard technologies is producing significant growth in the demand for Web sites and Web applications with more and more strict requirements of usability, reliability, interoperability and security. While several methodological and technological proposals for developing Web applications are coming both from industry and academia, there is a general lack of methods and
Giuseppe Antonio Di Lucca; Anna Rita Fasolino; Francesco Faralli; Ugo De Carlini
Instructors should be concerned with how to incorporate the World Wide Web into an information systems (IS) curriculum organized across three areas of knowledge: information technology, organizational and management concepts, and theory and development of systems. The Web fits broadly into the information technology component. For the Web to be…
The human brain contains an estimated 100 billion neurons, and browsing the Web, one might be led to believe that there's a Web site for every one of those cells. It's no surprise that there are lots of Web sites concerning the nervous system. After all, the human brain is toward the top of nearly everyone's list of favorite organs and of…
lead to lost productivity and revenue. The question of how to improve the design of informational Web sites is thus of critical importance. Although most prominent Web sites are created by professional design firms, many smaller sites are built by people with little design experience or training. As a consequence, Web sites with local reach, such as those belonging to
The searchable Solar Feature Catalogues (SFCs) are developed from digitized solar images using automated pattern recognition techniques. The techniques were applied for the detection of sunspots, active regions, filaments and line-of-sight magnetic neutral lines in automatically standardized full disk solar images in Ca II K1, Ca II K3 and Hα lines taken at the Paris-Meudon Observatory and white light images and magnetograms from SOHO/MDI. The results of the automated recognition were verified with manual synoptic maps and available statistical data that revealed good detection accuracy. Based on the recognized parameters, a structured database of Solar Feature Catalogues was built on a MySQL server for every feature and published with various pre-designed search pages on the Bradford University web site http://www.cyber.brad.ac.uk/egso/SFC/. The SFCs with nine-year coverage (1996-2004) are to be used for deeper investigation of feature classification and solar activity forecasting.
Zharkova, V. V.; Aboudarham, J.; Zharkov, S.; Ipson, S. S.; Benkhalil, A. K.; Fuller, N.
In recent years, Web applications have become prevalent around the world. Many companies have developed or integrated their mission-critical applications using Web technologies. As Web applications become more complex, testing Web applications becomes crucial. We extend data flow testing techniques to Web applications. Several data flow issues for analyzing HTML and eXtensible Markup Language (XML) documents in Web applications are
Chien-hung Liu; David Chenho Kung; Pei Hsia; Chih-tung Hsu
Oceans for Schools (http://www.soc.soton.ac.uk/JRD/SCHOOL/) is an interactive web magazine about oceanography in all its forms, aimed at the 12-16 age group. It combines a magazine format with a web site, covering topics such as: expeditions and research cruises, descriptions of the oceans and major seas, the science needed for understanding the oceans, technology (underwater vehicles, satellites, instruments) and competitions and quizzes. Three issues per year are planned, with frequent updates and monthly competitions within each issue to encourage return visits. An 'Email an oceanographer' feature will provide interaction between the audience and practising scientists; on occasion these scientists will be on board a research cruise or scientific expedition.
Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. PMID:24861624
Huang, Conrad C; Meng, Elaine C; Morris, John H; Pettersen, Eric F; Ferrin, Thomas E
For those individuals who would like to design a Web site, but still might not have the technical acumen required, this site is a free resource that features numerous templates, Web graphics, fonts, and a number of other useful items that can be used to create a complete Web site. The template section alone contains 129 free full-page templates, along with 21 horizontal menus, 26 vertical menus, and 23 table templates. The Web graphics and fonts section contains 144 fonts, 120 icons (with such popular images as tables, graphs, and charts), 67 types of arrows, and 128 buttons. For users with queries, an online forum is also available where users can submit and read questions. Lastly, there is an area where advanced users can submit their own contributions for inclusion on the site.
Web-based presentation tools are sometimes referred to as "next generation presentation tools" (EDUCAUSE, 2010). At the most basic level, these tools are simply online versions of traditional presentation software, such as Microsoft's PowerPoint or Apple's Keynote, but some services offer features like web-based collaboration, online presentation…
Despite the commercial success of search engines and large-scale web page crawlers, the problems of page refresh, new URL discovery, large file downloading, and distributed multimedia content feature extraction and indexing are still open. The independent working behavior of each crawler makes it very hard to seek solutions for all these problems under the classical web crawler architecture. To
One of the most important features of Web services is ease of access over the Internet. But the downside is that security is compromised. This paper presents an access control model for Web services. It uses XACML (eXtensible Access Control Markup Language) for the description of access control criteria and combines the power of XACML and SAML (Security Assertion Markup
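XACML policies are far richer than this, but the core pattern of evaluating rules against request attributes and falling back to a default decision can be sketched with plain attribute matching. The roles, resources and actions below are invented for illustration:

```python
# Much-simplified, illustrative stand-in for XACML policy evaluation:
# each rule matches subject/resource/action attributes and yields an effect.
POLICY = [
    {"role": "admin", "resource": "orders", "action": "write", "effect": "Permit"},
    {"role": "clerk", "resource": "orders", "action": "read",  "effect": "Permit"},
]

def decide(request):
    """Return Permit if a rule matches the request, else Deny (deny-by-default)."""
    for rule in POLICY:
        if all(request.get(k) == v for k, v in rule.items() if k != "effect"):
            return rule["effect"]
    return "Deny"

print(decide({"role": "clerk", "resource": "orders", "action": "read"}))   # Permit
print(decide({"role": "clerk", "resource": "orders", "action": "write"}))  # Deny
```

In a real deployment the request attributes would arrive in a SAML assertion and the policy would be an XACML document evaluated by a policy decision point; the sketch keeps only the matching logic.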
The paper aims to suggest models for predicting computer and Web attitudes using artificial neural networks, with additional support from standard statistics in data preparation and feature selection. Based on the previously confirmed statistical instruments CAS and WAS, three models were designed. The first two models observed computer and Web attitudes separately, while both attitudes were observed in the third
Workflow templates are necessary for various different Web Service related tasks such as encoding business rules in a B2B application, specifying domain knowledge in a scientific Grid application, and defining preferences for users that interact with Web Services. Abstract activities in templates can be used to specify the features of a required service, and concrete services can be discovered
The evaluation reports in this series usually feature several products at once. The current review, however, comes at a time when one of the most widely used (and expensive) online learning management systems is undergoing a major change in its marketing strategy and corporate focus. "WebCT" is currently evolving to a new version ("WebCT Vista"),…
Created by the World-Wide Web Virtual Library, this page links users to the web sites of the major nuclear physics research facilities in the world. It also features selected journals, radioactive beam facilities, a miscellaneous section and software. While basic, this is a helpful resource for those interested in nuclear physics research worldwide.
Discusses how an experienced educator's Web site recommendations can help learners use the World Wide Web in ways bookmarks alone cannot. Highlights include browser-based and page-based annotations; examples; useful information to include (title or topic, operating instructions, description, special site features, cautions, and helpful…
Highlights features of Copernic 2000, a desktop application that helps manage information from the Web more efficiently. Explains search capabilities that execute simultaneous queries in multiple search engines and ranks them by relevance; refining search options on retrieved Web sites; browsing capabilities; managing results with bookmarks;…
We introduce and simulate a growth model of the world-wide Web based on the dynamics of outgoing links that is motivated by the conduct of the agents in the real Web to update outgoing links (re)directing them towards constantly changing selected nodes. Emergent statistical correlation between the distributions of outgoing and incoming links is a key feature of the dynamics
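A toy version of such a growth-with-rewiring dynamic can be simulated directly. The parameters and the rewiring rule below are illustrative, not the paper's exact model:

```python
import random

random.seed(42)

def grow(n_nodes, m_out, beta):
    """Toy growth model: each new node emits m_out outgoing links to random
    existing nodes; with probability beta an existing node redirects one of
    its outgoing links toward a newly chosen target."""
    out_links = {0: [], 1: [0]}          # seed graph
    for new in range(2, n_nodes):
        targets = list(out_links)        # nodes that exist so far
        out_links[new] = [random.choice(targets) for _ in range(m_out)]
        if random.random() < beta:       # link-update (rewiring) step
            node = random.choice(targets)
            if out_links[node]:
                i = random.randrange(len(out_links[node]))
                out_links[node][i] = random.choice(targets)
    return out_links

g = grow(200, 2, 0.3)
in_deg = {}
for outs in g.values():
    for dst in outs:
        in_deg[dst] = in_deg.get(dst, 0) + 1
# Rewiring moves links but never changes their number:
print(len(g), sum(in_deg.values()))  # 200 397
```

With preferential rather than uniform target choice, the same loop produces the correlated, heavy-tailed in- and out-degree distributions the record describes.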
This article examined the search functions for all individual EAD Web sites listed on the Library of Congress Web site in 2003. In particular, the type of search engine, search modes, options for searching, search results display, search feedback, and other features of the search systems were studied. The data analysis suggests that there have…
A process and an apparatus for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquefied eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface, without dilution by ambient air; and it is differentiated from the evaporative drying techniques of the prior industrial art, which depend on the uses of steam-heated cylinders to supply heat to the paper web surface, and ambient air to carry away moisture, which is evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submersing the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air to allow heat recovery at significantly higher temperature levels than attainable in evaporative dryers.
Warren, David W. (9253 Glenoaks Blvd., Sun Valley, CA 91352)
A process and an apparatus are disclosed for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquefied eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface, without dilution by ambient air; and it is differentiated from the evaporative drying techniques of the prior industrial art, which depend on the uses of steam-heated cylinders to supply heat to the paper web surface, and ambient air to carry away moisture, which is evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submersing the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air to allow heat recovery at significantly higher temperature levels than attainable in evaporative dryers. 6 figs.
This paper highlights features, uses, and training issues related to AltaVista Forums, a World Wide Web-based conferencing tool. An overview of the features of AltaVista Forums is provided; highlights include asynchronous online discussions, chats, document attachments, URL posting, a calendar feature, and mail-to listings. Pricing and system…
We propose a novel probabilistic image selection method for the Web image gathering system we proposed before. It employed two-step processing: (1) Gather HTML files of Web pages related to given keywords, analyze them and fetch only Web images expected to be highly related to the keywords. (2) Select only relevant images from the gathered images based on the image-feature-based clustering. In this paper, we propose building a generative model based on the Gaussian mixture model to represent the distribution of image features of images related to the given keywords, and applying it to select images instead of the processing (2). We call the new system ``Probabilistic Image Collector''. We show the effectiveness of our proposed system by the experimental results.
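A much-simplified stand-in for the record's generative model, replacing the Gaussian mixture with a single diagonal Gaussian fitted to a handful of invented feature vectors; images whose features score higher under the model would be kept as relevant:

```python
import math

def fit_gaussian(vectors):
    """Fit a diagonal Gaussian (mean and per-dimension variance)."""
    d, n = len(vectors[0]), len(vectors)
    mean = [sum(v[i] for v in vectors) / n for i in range(d)]
    var = [max(sum((v[i] - mean[i]) ** 2 for v in vectors) / n, 1e-6)
           for i in range(d)]
    return mean, var

def log_likelihood(x, mean, var):
    """Log density of x under the diagonal Gaussian."""
    return sum(-0.5 * (math.log(2 * math.pi * var[i])
                       + (x[i] - mean[i]) ** 2 / var[i])
               for i in range(len(x)))

# Invented 2-D feature vectors for images assumed relevant to the keyword.
relevant = [[0.9, 0.1], [1.0, 0.2], [0.8, 0.15]]
mean, var = fit_gaussian(relevant)

near = log_likelihood([0.92, 0.14], mean, var)
far = log_likelihood([0.10, 0.90], mean, var)
print(near > far)  # True: features close to the model score higher
```

The actual system fits a mixture of several such Gaussians (via EM) over real image features, which lets it capture multi-modal appearance of a keyword.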
This paper presents a categorisation of Web hosted assessment systems and then uses this to produce an outline specification for a fully featured system, against which other systems can be evaluated. It concludes by considering the logistics of using such systems for summative assessment.
This article features "Tech Directions" School Web Site of the Month. The website (http://satellite.stcharles.k12.la.us) was produced by technology education students at the Satellite Center of St. Charles Parish Public Schools in Luling, Louisiana. The Satellite Center focuses on the career paths projected to expand the most over the next decade.…
This article discusses a project to design and implement a small French vocabulary tutor for the World Wide Web. The tutor includes words, pictures, and sounds to help students learn new words and their pronunciation. The article highlights salient features and design of the tutor and then focuses on two variants of a module on technology-related vocabulary
This paper describes a World Wide Web-based introductory course titled "Hypermedia Structures and Systems," offered as an optional part of the curriculum in computing science at the Eindhoven University of Technology (Netherlands). The technical environment for the current (1996) edition of the course is presented, which features automatic…
If you build it, will they come? This is one of the fundamental questions anybody creating a Web site has to confront, whether you're a business person, a Web professional or a home user. One of the fundamental ways to ensure people do come, and return, is to make the content of your site as appealing and as accessible as possible. A new study by…
This paper explores the use of blogs, a simple application of Web 2.0 technologies, in middle school geometry instruction. Specifically, it provides an overview of the interactive features of Web 2.0 technologies and the feasibility of using Web 2.0 technologies in geometry teaching and learning, as well as a proposed model for creating a…
Web-based testing has become a ubiquitous self-assessment method for online learning. One useful feature that is missing from today's web-based testing systems is the reliable capability to fulfill different assessment requirements of students based on a large-scale question data set. A promising approach for supporting large-scale web-based…
The popularity of ubiquitous Web access requires run-time adaptations of the Web contents. A significant trend in these content adaptation services is the growing amount of personalization required by users. Personalized services are and will be a key feature for the success of the ubiquitous Web, but they open two critical issues: performance and profile management. Issues
The World Wide Web has become a huge repository of data of interest for a variety of application domains. However, the same features that have made the Web so useful and popular also impose important restrictions on the way the data it contains can be manipulated. Particularly, in the traditional Web scenario, there is an inherent difficulty in gaining access
As part of research into Web-based document analysis including Web page downloading and classification, an algorithm has been developed to automatically identify article links in Web-based online journals. This algorithm is based on feature vectors calculated from attributes and contents of links extracted from HTML files, and an instance-based learning algorithm using a nearest neighbor methodology to identify article
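Instance-based (nearest-neighbour) classification of links can be sketched with toy feature vectors. The features and training labels below are invented for illustration, not those of the cited algorithm:

```python
import math

# Toy feature vectors for links: (anchor-text length, digits in URL,
# depth of URL path), with illustrative training labels.
TRAIN = [
    ((42, 0, 3), "article"),
    ((55, 4, 4), "article"),
    ((8, 0, 1), "navigation"),
    ((5, 0, 1), "navigation"),
]

def classify(features):
    """1-nearest-neighbour over Euclidean distance (instance-based learning)."""
    best = min(TRAIN, key=lambda t: math.dist(features, t[0]))
    return best[1]

print(classify((48, 2, 3)))  # article
print(classify((6, 0, 1)))   # navigation
```

The cited work derives its feature vectors from HTML link attributes and content; any such vector representation plugs into the same nearest-neighbour rule.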
This paper presents a fully automated object extraction system -- Omini. A distinct feature of Omini is the suite of algorithms and the automatically learned information extraction rules for discovering and extracting objects from dynamic Web pages or static Web pages that contain multiple object instances. We evaluated the system using more than 2,000 Web pages over 40 sites. It
Examines situations/purposes users have for searching the Web and perceptions of the textual features that assist users in characterizing documents/text retrieval in actual Web searches. The study employs a bottom-up approach to derive a set of genres built around actual use of the Web. After employing content-analytic techniques to derive a…
Nilan, Michael Sanford; Pomerantz, Jeffrey; Paling, Stephen
Summary 1. Body size may be an important feature of the structure of food webs. Detailed food web data are, however, scarce, particularly data including ontogenetic dietary shifts within species. We examined the predator guild in a well-characterized food web, that of Broadstone Stream (UK), to assess the importance of body size within and among species in relation to intraguild predation
The rapid growth of internet applications and users has driven the improvement of mobile web browser technology and standards such as HTML5. Recently, HTML5 has been turning into a de facto standard after some of its features have been implemented in major mobile web browsers. Moodle, a web-based Learning Management System (LMS), has been popular in academic environments for
Royyana M. Ijtihadie; Yoshifumi Chisaki; Tsuyoshi Usagawa; H. B. Cahyo; Achmad Affandi
The American Mathematical Society's MathSciNet now presents Featured Reviews from Mathematical Reviews online. "Since its founding in 1940, Mathematical Reviews (MR) has aimed to serve researchers and scholars in the mathematical sciences by providing timely information on articles and books that contain new contributions to mathematical research," state the editors. The purpose of the Featured Reviews page is to assist researchers in accessing the most outstanding reviews without having to wade through the thousands of reviews that are posted to MR online each month. The editors state that the Featured Reviews "...will cover some of the very best papers published in mathematics, identified by the MR editors with the advice of distinguished outside mathematicians as being especially important in one or more of the areas covered by MR. The reviewers for these papers are asked to set the paper in context, perhaps with some historical background, state the main results of the paper, outline (in not too technical a fashion) the main new ideas in the paper and include their evaluation of the paper." Each four- to six-paragraph-long review, available in HTML, .dvi, .ps, or .pdf format, gives the reviewer's name and the full article citation, hyperlinked when possible. This should prove to be a valuable Web resource for academic mathematicians.
The Solar Feature Catalogues (SFCs) are created from digitized solar images using automated pattern recognition techniques developed in the European Grid of Solar Observation (EGSO) project. The techniques were applied for detection of sunspots, active regions and filaments in the automatically standardized full-disk solar images in Ca II K1, Ca II K3 and Hα taken at the Meudon Observatory and white-light images and magnetograms from SOHO/MDI. The results of automated recognition are verified with the manual synoptic maps and available statistical data from other observatories that revealed high detection accuracy. A structured database of the Solar Feature Catalogues is built on the MySQL server for every feature from their recognized parameters and cross-referenced to the original observations. The SFCs are published on the Bradford University web site http://www.cyber.brad.ac.uk/egso/SFC/ with pre-designed web pages for search by time, size and location. The SFCs with 9-year coverage (1996-2004) provide any possible information that can be extracted from full disk digital solar images. This information can be used for deeper investigation of feature origin and association with other features, for their automated classification and solar activity forecast.
Zharkova, V. V.; Aboudarham, J.; Zharkov, S.; Ipson, S. S.; Benkhalil, A. K.; Fuller, N.
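The catalogue's search-by-time/size/location pages map naturally onto parameterized SQL queries. The sketch below illustrates that kind of query against an in-memory SQLite stand-in for the SFC's MySQL server; the table and column names are illustrative assumptions, not the EGSO schema.

```python
import sqlite3

# Hypothetical schema standing in for the SFC feature tables.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE features (
        id INTEGER PRIMARY KEY,
        kind TEXT,          -- 'sunspot', 'active_region', 'filament'
        obs_time TEXT,      -- ISO-8601 observation time
        area REAL,          -- feature size (e.g. millionths of solar disk)
        lat REAL, lon REAL  -- heliographic location
    )""")
conn.executemany(
    "INSERT INTO features (kind, obs_time, area, lat, lon) VALUES (?,?,?,?,?)",
    [("sunspot", "2003-10-28T11:00:00", 2100.0, -16.0, 8.0),
     ("filament", "2003-10-28T11:00:00", 850.0, 22.0, -40.0),
     ("sunspot", "1997-05-12T09:30:00", 120.0, 5.0, 30.0)])

# A search by time window, minimum size, and latitude band,
# mirroring the catalogue's time/size/location search pages.
rows = conn.execute(
    """SELECT kind, obs_time, area FROM features
       WHERE obs_time BETWEEN ? AND ? AND area >= ? AND lat BETWEEN ? AND ?
       ORDER BY area DESC""",
    ("2003-01-01", "2004-01-01", 500.0, -30.0, 30.0)).fetchall()
```

Cross-referencing to the original observations would add a foreign key into an observations table, but the query pattern stays the same.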
Web server logs contain a great deal of information about who uses a Web site and how they use it. This article discusses the analysis of Web logs for instructional Web sites; reviews the data stored in most Web server logs; demonstrates what further information can be gleaned from the logs; and discusses analyzing that information for the…
With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…
Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju
A WebQuest is a model or framework for designing effective Web-based instructional strategies featuring inquiry-oriented activities. It is an innovative approach to learning that is enhanced by the use of evolving instructional technology. WebQuests have invigorated the primary school (grades K through 12) educational sector around the globe, yet there is sparse evidence in the literature of WebQuests at the college and university levels. WebQuests are congruent with pedagogical approaches and cognitive activities commonly used in nursing education. They are simple to construct using a step-by-step approach, and nurse educators will find many related resources on the Internet to help them get started. Included in this article are a discussion of the critical attributes and main features of WebQuests, construction tips, recommended Web sites featuring essential resources, a discussion of WebQuest-related issues identified in the literature, and some suggestions for further research. PMID:17496479
iWeb Objective The objective of this lesson is to be able to produce a simple website on a Mac using the software program iWeb. Introduction This lesson is an informal guide for friends and family - it provides the things I found most useful when first beginning to use iWeb. The Apple website has the best information but I found some other information to be supplemental and useful. When you have navigated ...
The Semantic Web is the new-generation Web that tries to represent information so that it can be used by machines not just for display purposes, but for automation, integration, and reuse across applications (Boley et al. 2001). It has been one of the hottest R&D topics in recent years in the AI community, as well as in the Internet community—the Semantic Web is an important W3C activity (SW Activity 2008).
Abstract— Many different Web browsers are available on the Internet, free of charge. A browser performs several tasks, such as rendering Web pages on the screen and executing client-side code often embedded in Web pages. Users typically choose a browser that gives them a satisfying browsing experience, which is partly determined by the speed of the browser. This paper presents benchmark performance
The problems associated with the management of Internet sites that require dynamic content or constant changes have led to the production of this article. http://www.smart-city.com.au is a local government Internet site which has made use of Smart-Web to alleviate the bottleneck that is encountered at the Web master when highly dynamic Web content must be altered. It was found that
The cover story of Byte's March 1998 issue contains several articles on the future of web publishing techniques. The articles begin with the proposition that Hypertext Markup Language (HTML) is no longer able to keep up with the development of dynamic web content. They then discuss the future of web markup, with particular emphasis on the advantages of the newly emerging Extensible Markup Language (XML). Examples are provided.
Web spam is a widely-recognized threat to the quality and security of the Web. Web spam pages pollute search engine indexes, burden Web crawlers and Web mining services, and expose users to dangerous Web-borne malware. To defend against Web spam, most previous research analyzes the contents of Web pages and the link structure of the Web graph.
Most people will never see the eruption of an active volcano. Even so, evidence of these dramatic displays can be found all over the world. In fact, more can be learned about some aspects of volcanic activity by exploring evidence left by past eruptions than by watching an eruption in progress. This interactive resource adapted from the National Park Service explores a variety of volcanic landforms and features, and describes how they form.
As Digital Libraries (DL) become more aligned with the web architecture, their functional components need to be fundamentally rethought in terms of URIs and HTTP. Annotation, a core scholarly activity enabled by many DL solutions, exhibits a clearly unacceptable characteristic when existing models are applied to the web: due to the representations of web resources changing over time, an annotation made about a web resource today may no longer be relevant to the representation that is served from that same resource tomorrow. We assume the existence of archived versions of resources, and combine the temporal features of the emerging Open Annotation data model with the capability offered by the Memento framework that allows seamless navigation from the URI of a resource to archived versions of that resource, and arrive at a solution that provides guarantees regarding the persistence of web annotations over time. More specifically, we provide theoretical solutions and proof-of-concept experimental evaluations for two problems: reconstructing an existing annotation so that the correct archived version is displayed for all resources involved in the annotation, and retrieving all annotations that involve a given archived version of a web resource.
Sanderson, Robert [Los Alamos National Laboratory; Van De Sompel, Herbert [Los Alamos National Laboratory
Progress in the development of techniques to grow silicon web at 25 sq cm/min output rate is reported. Feasibility of web growth with simultaneous melt replenishment is discussed. Other factors covered include: (1) tests of aftertrimmers to improve web width; (2) evaluation of growth lid designs to raise speed and output rate; (3) tests of melt replenishment hardware; and (4) investigation of directed gas flow systems to control unwanted oxide deposition in the system and to improve convective cooling of the web. Compatibility with sufficient solar cell performance is emphasized.
Duncan, C. S.; Seidensticker, R. G.; Hopkins, R. H.; Mchugh, J. P.; Hill, F. E.; Heimlich, M. E.; Driggers, J. M.
Led by the World Wide Web Consortium, the Web Standards Project is an effort to make "technologies for creating and interpreting web-based content." These standards allow many types of languages and object models to be compatible on different browsers and platforms. The project's home page has many resources for users to learn about standards and guidelines. Some of the major topics include HTML, XML, Cascading Style Sheets, and accessibility. An informative section on Web browsers compares the standards compliance of eight popular browsers. Special email addresses of some browser manufacturers are also given, so users can report bugs to help improve the quality of the software.
The Web is growing and changing from a paradigm of static publishing to one of participation and interaction. This change has implications for people with disabilities who rely on access to the Web for employment, information, entertainment, and increased independence. The interactive and collaborative nature of Web 2.0 can present access problems for some users. There are some best practices which can be put in place today to improve access. New specifications such as Accessible Rich Internet Applications (ARIA) and IAccessible2 are opening the doors to increasing the accessibility of Web 2.0 and beyond.
Security testing a Web application or Web site requires careful thought and planning due to both tool and industry immaturity. Finding the right tools involves several steps, including analyzing the development environment and process, business needs, and the Web application's complexity. Here, we describe the different technology types for analyzing Web applications and Web services for security vulnerabilities, along with
This study compared the designs of a traditional style WebQuest and a Web 2.0 style WebQuest in terms of their effectiveness as a teaching tool. The sample included 104 university sophomore students. Students were randomly assigned to two groups, with one group using the traditional style WebQuest and the other used the Web 2.0 style WebQuest. Data were collected (a)
The journal Nature presents this online special feature on the recently sequenced Y chromosome. The Web site offers a number of free informative resources, including an account of the sequencing project as well as related scientific papers and letters published in the journal. An archive of Y chromosome-related articles are also available for registered users (no cost for registration). In all, this Web special offers an excellent resource for exploring the Y chromosome, formerly regarded as a genetic wasteland before sequencing research revealed that we may have underestimated its powers.
Chicago's Field Museum has dubbed the 2003-2004 school year as "The Year of Biodiversity and Conservation," and invites everyone to join in "exploring, celebrating, and protecting our planet's amazing web of life." Visitors can start the exploring, celebrating, and protecting right here in the well-designed and informative Web site created for the program. Biodiversity basics, conservation issues and efforts, free downloadable lesson plans, and other resources are all on hand. The site also contains an interactive map featuring Field Museum researchers studying biodiversity around the world. This site is also reviewed in the October 17, 2003 NSDL Life Sciences Report.
The San Andreas fault system, a complex of faults that display predominantly large-scale strike slip, is part of an even more complex system of faults, isolated segments of the East Pacific Rise, and scraps of plates lying east of the East Pacific Rise that collectively separate the North American plate from the Pacific plate. This chapter briefly describes the San Andreas fault system, its setting along the Pacific Ocean margin of North America, its extent, and the patterns of faulting. Only selected characteristics are described, and many features are left for depictions on maps and figures.
Media Advisory 05-001: New NSF Web Site Released. NSF's new Web site has been redesigned from the ... The National Science Foundation introduces a new Web site, entirely redesigned to better serve both ...
The Web is currently the pre-eminent medium for electronic service delivery to remote users. As a consequence, authentication of servers is more important than ever. Even sophisticated users base their decision whether or not to trust a site on browser cues—such as location bar information, SSL icons, SSL warnings, certificate information, response time, etc. In their seminal work on web
Recent technology trends in the Web Services (WS) domain indicate that a solution eliminating the presumed complexity of the WS-* standards may be in sight: advocates of REpresentational State Transfer (REST) have come to believe that their ideas explaining why the World Wide Web works are just as applicable to solve enterprise application integration problems and to simplify
Argues that the preservation of areas like the Shoreline Park (California) wetlands depends on educating students about the value of natural resources. Describes the creation of a Web page on the wetlands for third-grade students by seventh-grade art and ecology students. Outlines the technical process of developing a Web page. (DSK)
A teacher in the English education program at Buffalo State College describes her development of Web-based literature guides for preservice teachers to use in preparation and student teaching and for secondary-level English/language arts teachers to use in their classrooms. Discusses assembling materials for the web guide; an overview of site…
... determine if you have a ring or a web, your doctor may order one of these tests: Barium swallow test. This allows the radiologist to ... contribute to the development of esophageal rings and webs, your doctor probably will order a blood test for iron levels and, if you are deficient, ...
The use of Semantic Web Services (SWS) for increasing agility and adaptability in process execution is currently investigated in many settings. The common underlying idea is the dynamic selection, composition and mediation - on the basis of available SWS descriptions - of the most adequate Web resource (services and data) to accomplish a specific process activity. In this paper we
Stefania Galizia; Alessio Gugliotta; Carlos Pedrinaci; John Domingue
The Factsheets web application was conceived out of the requirement to create, update, publish, and maintain a web site with dynamic research and development (R and D) content. Before creating the site, a requirements discovery process was done in order t...
This thesis explores the concept of Web Database Development using Active Server Pages (ASP) and Java Server Pages (JSP). These are among the leading technologies in the web database development. The focus of this thesis was to analyze and compare the ASP...
Abstract We consider the problem of compressing graphs of the link structure of the World Wide Web. We provide efficient algorithms for such compression that are motivated by recently proposed random graph models for describing the Web. The algorithms are based on reducing the compression problem to the problem of finding a minimum spanning tree in a directed graph related to
Integrating security throughout the life cycle can improve overall Web application security. With a detailed review of the steps involved in applying security-specific activities throughout the software development life cycle, the author walks practitioners through effective, efficient application design, development, and testing. With this article, the author shares a way to improve Web application security by integrating security throughout the
Designing a web home page involves many decisions that affect how the page will look, the kind of technology required to use the page, the links the page will provide, and kinds of patrons who can use the page. The theme of information literacy needs to be built into every web page; users need to be taught the skills of sorting and applying…
We offer an overview of current Web search engine design. After introducing a generic search engine architecture, we examine each engine component in turn. We cover crawling, local Web page storage, indexing, and the use of link analysis for boosting search performance. The most common design and implementation techniques for each of these components are presented. We draw for this
Arvind Arasu; Junghoo Cho; Hector Garcia-molina; Andreas Paepcke; Sriram Raghavan
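Of the components listed above, link analysis is the most self-contained to illustrate. The sketch below is a plain power-iteration PageRank over a toy graph, a standard formulation rather than anything specific to this survey; the graph itself is made up.

```python
def pagerank(links, damping=0.85, iters=50):
    """Power-iteration PageRank over a dict {page: [outlinks]}.
    Each page's rank is split among its outlinks every round;
    dangling pages spread their rank uniformly."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: no outlinks
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Tiny example: two pages both link to a hub, which links back.
ranks = pagerank({"a": ["hub"], "b": ["hub"], "hub": ["a", "b"]})
```

The hub accumulates the most rank, which is the intuition behind using link analysis to boost search performance: pages pointed to by many others score higher.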
A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the Web. Earth science applications are often both computing- and data-intensive that involve diverse sources of data and complex processing functions. It is often not only time-consuming but also difficult to find, obtain, and process heterogeneous geospatial information.
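Discovery and access of geospatial features of this kind is what the OGC WFS interface (discussed in the header of this collection) standardizes. A minimal sketch of composing a WFS GetFeature request in KVP encoding is shown below; the endpoint and feature type name are invented, and a real client would first read them from the service's GetCapabilities response.

```python
from urllib.parse import urlencode

def wfs_getfeature_url(base, type_name, bbox=None, max_features=None,
                       version="1.1.0"):
    """Compose an OGC WFS GetFeature request URL (KVP encoding).
    `base` and `type_name` come from the service's GetCapabilities
    document; the values used below are illustrative only."""
    params = {"service": "WFS", "version": version,
              "request": "GetFeature", "typeName": type_name}
    if bbox:  # minx, miny, maxx, maxy spatial filter
        params["bbox"] = ",".join(str(c) for c in bbox)
    if max_features is not None:
        params["maxFeatures"] = str(max_features)
    return base + "?" + urlencode(params)

url = wfs_getfeature_url("http://example.org/wfs", "topp:roads",
                         bbox=(-77.5, 38.5, -76.5, 39.5), max_features=10)
```

The response would be a GML feature collection, which is where the data-volume and processing burden described above comes in.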
Web spamming refers to actions intended to mislead search engines and give some pages higher ranking than they deserve. Recently, the amount of web spam has increased dramatically, leading to a degradation of search results. This paper presents a comprehensive taxonomy of current spamming techniques, which we believe can help in developing appropriate countermeasures.
Web 2.0 seems to be all the rage these days. One cannot go to a library conference and attend presentations or stroll down the halls without hearing some mention of it in magical tones reserved for some great discovery. The excitement surrounding Web 2.0 reminds the author of the frenzy that gripped the country between 1848 and 1855, when…
This article takes a look at tech guru Will Richardson's new book, "Blogs, Wikis, Podcasts, and Other Powerful Web Tools for Classrooms." Whether it's blogs or wikis or RSS, all roads now point to a Web where little is done in isolation. The biggest, most sweeping change in the people's relationship with the Internet may not be as much the ability…
The availability of web search has revolutionised the way people discover information, yet as search services maintain larger and larger indexes they are in danger of becoming a victim of their own success. Many common searches can return a vast number of web pages, many of which will be irrelevant to the searcher, and of which only about ten
The social impact from the World Wide Web cannot be underestimated, but technologies used to build the Web are also revolutionizing the sharing of business and government information within intranets. In many ways the lessons learned from the Internet carry over directly to intranets, but others do not apply. In particular, the social forces that guide the development of intranets
Ronald Fagin; Ravi Kumar; Kevin S. McCurley; Jasmine Novak; D. Sivakumar; John A. Tomlin; David P. Williamson
ChemWeb.com is now the largest online chemical community in the world. ChemWeb.com is a unique resource which combines a huge range of information for those in research chemistry, the chemicals industries and related disciplines. Membership is completely free. ChemWeb.com members can access over 350 journals and 15 databases from a variety of publishers. A number of databases offer structure-based searching and manipulation of molecular structures. ChemWeb.com services include the Careers Centre in association with sciencejobs.com, Conference Centre, Bookstore and the online magazine, the alchemist, as well as several specialist Subject Areas based around specific chemical fields. More information on ChemWeb can be found in the User Guide.
The Web is a globally distributed, yet highly personalized, medium for cost-effective delivery of multimedia information and services. The Web is expected to have a strong impact on almost every aspect of how we learn. "Total Quality" is the totality of features, as perceived by the customers of the product or service. Totality of features includes stated…
Collaborative Drug Discovery (CDD) has created a scalable platform that combines traditional drug discovery informatics with Web2.0 features. Traditional drug discovery capabilities include substructure, similarity searching and export to excel or sdf formats. Web2.0 features inc...
Purpose: The purpose of the paper is to examine the various features and components of web-based online public access catalogues (OPACs) of IIT libraries in India with the help of a specially designed evaluation checklist. Design/methodology/approach: The various features of the web-based OPACs in six IIT libraries (IIT Delhi, IIT Bombay, IIT…
THE STATE STANDARDS for this project are as follows: STANDARD 1 Making: Students will assemble and create works of art by experiencing a variety of art media and by learning the art elements and principles. STANDARD 2 Perceiving: Students will find meaning by analyzing, criticizing, and evaluating works of art. STANDARD 3 Expressing: Students will create meaning in art. STANDARD 4 Contextualizing: Students will find meaning in works of art through settings and other modes of learning. Below is a list of useful sites to help in drawing facial features, along with useful tutorials and resources. QUICK TEST (test your ability and knowledge): * Draw a circle. * Draw a light vertical line at the center of the circle. * Make light horizontal dashes a little above the center of the circle. ...
This nice site from NOAA starts with a bold statement: "Big fish eat little fish; that's how the food cycle works." It's a fitting introduction to this exploration of aquatic food webs. Offered as part of NOAA's main Education Resources site, this site offers a dozen well-produced videos, lesson plans, and data sets divided into areas that include Background Information and Multimedia. These items include "Tagging of Pacific Pelagics," "Census of Marine Life Biodiversity," and "Components of a Food Web." Visitors can also look over the Features area near the bottom of the site, whose offerings range from a profile of the Stellwagen Bank National Marine Sanctuary to a longitudinal study of sea trout in the food web. Finally, visitors can use the social media tabs to share resources from the site with colleagues and others.
Current Web applications embed sophisticated user interfaces and business logic. The original interaction paradigm of the Web based on static content pages that are browsed by hyperlinks is, therefore, not valid anymore. In this paper, we advocate a paradigm shift for browsers and Web applications, that improves the management of user interaction and browsing history. Pages are replaced by States as basic navigation nodes, and Back/Forward navigation along the browsing history is replaced by a full-fledged interactive application paradigm, supporting transactions at the interface level and featuring Undo/Redo capabilities. This new paradigm offers a safer and more precise interaction model, protecting the user from unexpected behaviours of the applications and the browser.
Brambilla, Marco; Cabot, Jordi; Grossniklaus, Michael
It is commonplace to say that the Web has changed everything. Machine learning researchers often say that their projects and results respond to that change with better methods for finding and organizing Web information. However, not much of the theory, or even the current practice, of machine learning takes the Web seriously. We continue to devote much effort to refining supervised learning, but the Web reality is that labeled data is hard to obtain, while unlabeled data is inexhaustible. We cling to the iid assumption, while all the Web data generation processes drift rapidly and involve many hidden correlations. Much of our theory and many of our algorithms assume data representations of fixed dimension, while in fact the dimensionality of data, for example the number of distinct words in text, grows with data size. While there has been much work recently on learning with sparse representations, the actual patterns of sparsity on the Web are not paid much attention. Those patterns might be very relevant to the communication costs of distributed learning algorithms, which are necessary at Web scale, but little work has been done on this.
With more and more services relying on the Web to communicate with their users, the amount of information exchanged daily by an individual through various Web channels has become difficult to control. Not only have Web 2.0 applications contributed to this increase from within the browser, but many other tools now rely on the Web as their
Designing and developing an effective web crawler is a challenging role in a large search engine. This paper proposes a component-based web crawler along with the indexer. The web crawler consists of crawler services and indexer services, realized as web services. The communication between the services is sent and received using XML, SOAP and WSDL. In the crawler service, the
A Vadivel; S. G Shaila; R. Devi Mahalakshmi; J. Karthika
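The crawler/indexer split described above can be sketched without the SOAP/WSDL transport layer: a breadth-first crawler fetches pages and hands each one to an inverted indexer. The "web" below is an in-memory dict (page to text and outlinks) so the sketch stays self-contained; a real crawler service would fetch over HTTP and exchange results as XML messages.

```python
import re
from collections import deque

def crawl_and_index(pages, seed):
    """Breadth-first crawl over an in-memory 'web'
    (a dict of page -> (text, outlinks)), feeding each fetched
    page to a simple inverted indexer. A stand-in for the
    crawler and indexer services; no SOAP/WSDL transport here."""
    seen, queue = {seed}, deque([seed])
    index = {}  # word -> set of pages containing it
    while queue:
        url = queue.popleft()
        text, links = pages[url]
        for word in re.findall(r"[a-z]+", text.lower()):
            index.setdefault(word, set()).add(url)
        for link in links:
            if link in pages and link not in seen:
                seen.add(link)
                queue.append(link)
    return index

web = {"/": ("Web crawler home", ["/a", "/b"]),
       "/a": ("crawler services", ["/b"]),
       "/b": ("indexer services", [])}
index = crawl_and_index(web, "/")
```

Splitting crawl and index into separate services, as the paper does, lets each scale independently; the interface between them is just the per-page (url, text) stream.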
Since the wide adoption of database and Web service technologies in cross-enterprise manufacturing collaboration, leveraging deep Web contents through service discovery has become one of the most advanced trends in the manufacturing field. However, the effective service discovery and knowledge retrieval from manufacturing deep Web services is still a critical issue because of the underlying intricate structures of deep Web
Zhang Wenyu; Yin Jianwei; Cai Ming; Wu Jian; Lin Lanfen
The growth of inexpensive bandwidth and the maturation of Web development technology have enabled a significant adoption of Web-based applications for interactions between customers and business, between businesses, and between citizens and institutions. However, those same improvements in bandwidth and the corresponding rise in Web system complexity have also been of use to those with malicious intent. Thus Web security (the
This paper presents a World Wide Web-based infrastructure for cooperation between many different parties. The infrastructure is designed for Web-based competitions involving an editorial board, designers of assignments or events, evaluators, different organizational layers, and contestants. Web-CS is entirely Web-based: all the communication…
Aerts, A. T. M.; Bierhoff, P. F. M.; De Bra, P. M. E.
Compares the use of enhanced television features and television commerce features on the Web sites of cable and broadcast television networks. Shows differences in strategies and site usability; proposes three enhanced television strategy models; and discusses implications on television revenue and viewership. (Author/LRW)
The paper presents the working architecture of an existing module for web search, classification and presentation of documents with different lexical features. Unlike the existing search engines, the module discriminates aspects of the style of the document - readability, explanation, illustrations and summarization. The visualization component enhances perceptual support of these features when retrieval of searched documents takes place. We
Maya Dimitrov; Ivan Terziev; Plamena Andreev; Petia Radev; Joan Jose Villanueva
The silicon web process takes advantage of natural crystallographic stabilizing forces to grow long, thin single crystal ribbons directly from liquid silicon. The ribbon, or web, is formed by the solidification of a liquid film supported by surface tension between two silicon filaments, called dendrites, which border the edges of the growing strip. The ribbon can be propagated indefinitely by replenishing the liquid silicon as it is transformed to crystal. The dendritic web process has several advantages for achieving low cost, high efficiency solar cells. These advantages are discussed.
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.
SRD 69 NIST Chemistry WebBook (Web, free access) The NIST Chemistry WebBook contains: Thermochemical data for over 7000 organic and small inorganic compounds; thermochemistry data for over 8000 reactions; IR spectra for over 16,000 compounds; mass spectra for over 33,000 compounds; UV/Vis spectra for over 1600 compounds; electronic and vibrational spectra for over 5000 compounds; constants of diatomic molecules(spectroscopic data) for over 600 compounds; ion energetics data for over 16,000 compounds; thermophysical property data for 74 fluids.
A fibrous web having a first surface and a second surface. The fibrous web has a first region and at least one discrete second region, the second region being a discontinuity on the second surface and being a tuft comprising a plurality of tufted fibers extending from the first surface. The tufted fibers define a distal portion, the distal portion comprising portions of the tufted fibers being bonded together. Bonding can be thermal melt-bonding. In another embodiment the second surface of the web can have non-intersecting or substantially continuous bonded regions, which also can be thermal melt-bonding.
Access to Deep Web sources is concerned with querying data that is hidden behind Web forms and primarily not accessible by common query languages. Web forms do not contain any type information, and it thus follows that Deep Web sources only work on string data in its rudimentary form. In this paper, we demonstrate how Semantic Web technologies can be
We describe WebMon, a tool for correlated, transaction- oriented performance monitoring of web services. Data collected with WebMon can be analyzed from a variety of perspectives: business, client, transaction, or systems. Maintainers of web services can use such analysis to better understand and manage the performance of their services. Moreover, WebMon's data will enable the construction of more accurate performance
Thomas Gschwind; Kave Eshghi; Pankaj K. Garg; Klaus Wurster
Better understanding the attitude and behaviors of students using the Internet for school work can provide valuable insight for today's school librarian. The Pew Internet & American Life Project conducted a qualitative study of Internet-using public middle and high school students drawn from across the country ranging from 12 to 17 years of age.…
Web content replication began as explicit manual mirroring of Web sites. It has since evolved into user-transparent request distribution among a static set of mirrors. The next step (especially important for large-scale Web hosting service providers) is dynamic content replication, where object replicas are created, deleted, or migrated among Web hosts in response to changing
In this paper, we present a system called DEQUE (Deep WEb QUery SystEm) for modeling and querying the deep Web. We propose a data model for representing and storing HTML forms, and a web form query language called DEQUEL for retrieving data from the deep Web and storing them in the format convenient for additional processing. Our system is able
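Modeling a stored form and binding query values to it, as DEQUE's form model and DEQUEL language do, can be sketched with a couple of dataclasses. The class and field names below are illustrative assumptions, not the DEQUE data model; the flight-search form is made up.

```python
from dataclasses import dataclass, field

@dataclass
class FormField:
    name: str
    kind: str = "text"  # e.g. text, select, hidden
    options: list = field(default_factory=list)

@dataclass
class WebForm:
    """Minimal stand-in for a stored deep-Web form description."""
    action: str
    method: str = "GET"
    fields: list = field(default_factory=list)

    def bind(self, **values):
        """Validate assignments against the form's fields and return
        the key/value pairs a query engine would submit."""
        known = {f.name: f for f in self.fields}
        bound = {}
        for name, value in values.items():
            f = known[name]  # KeyError -> no such field on this form
            if f.kind == "select" and value not in f.options:
                raise ValueError(f"{value!r} is not an option of {name}")
            bound[name] = value
        return bound

flights = WebForm("http://example.org/search", fields=[
    FormField("origin"), FormField("dest"),
    FormField("cabin", kind="select", options=["economy", "business"])])
query = flights.bind(origin="JFK", dest="SFO", cabin="economy")
```

Since, as noted in the record above, forms carry no type information, everything is bound as strings; the validation here is limited to field names and enumerated options.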
We discuss the problems associated with managing ontologies in distributed environments such as the Web. The Web poses unique problems for the use of ontologies because of the rapid evolution and autonomy of web sites. We present SHOE, a web-based knowledge representation language that supports multiple versions of ontologies. We describe SHOE in the terms of a logic that separates
To organize similar Web services for more precise service discovery and easy service composition, it is important to predict the difference between two Web services. Furthermore, providing a quantified measure of the differences between Web services turns out to be a key enabler for understanding Web services. In this paper, we present a model designed to measure the difference
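One deliberately simple way to quantify such a difference, sketched below, is a Jaccard distance over the two services' operation names; this is a generic set-similarity measure used for illustration, not the model the paper itself proposes, and the weather-service operation names are invented.

```python
def service_distance(ops_a, ops_b):
    """Jaccard distance between two services' operation sets:
    0.0 for identical interfaces, 1.0 for disjoint ones."""
    a, b = set(ops_a), set(ops_b)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

# Two hypothetical versions of a weather service interface.
weather_v1 = {"getForecast", "getCurrent"}
weather_v2 = {"getForecast", "getCurrent", "getAlerts"}
d = service_distance(weather_v1, weather_v2)
```

A realistic measure would also weigh parameter types and message structures, but even this coarse distance is enough to cluster near-duplicate services for discovery.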
In recent times we have witnessed an expansion of web applications, which are offering a wide range of public and business services. Web applications are efficient and convenient; however, an increasing number of new security threats poses a risk both for users of web applications and for the companies which are offering their services through web applications. In order to
In this paper we present Web design frameworks as a conceptual approach to maximize reuse in Web applications. We first analyze the current state of the art of Web applications design, stating the need for an approach that clearly separates concerns (conceptual, navigational, interface). We briefly introduce the OOHDM approach for Web applications design. We next focus on the problem
Daniel Schwabe; Luiselena Esmeraldo; Gustavo Rossi; Fernando Lyardet
Focuses on the pre-design questions that a Web site designer and client need to address. Discusses the why, who, what, where, and when phases of the Web site design process. Includes a Web Site Development Form to guide site designs through the first two stages of Web site development: specification and design. (AEF)
With the growth of the World Wide Web has come the insight that currently available methods for finding and using information on the web are often insufficient. In order to move the Web from a data repository to an information resource, a totally new way of organizing information is needed. The advent of the Semantic Web promises better retrieval methods
Thirty-five (35) furnace runs were carried out during this quarter, of which 25 produced a total of 120 web crystals. The two main thermal models for the dendritic growth process were completed and are being used to assist the design of the thermal geometry of the web growth apparatus. The first model, a finite element representation of the susceptor and crucible, was refined to give greater precision and resolution in the critical central region of the melt. The second thermal model, which describes the dissipation of the latent heat to generate thickness-velocity data, was completed. Dendritic web samples were fabricated into solar cells using a standard configuration and a standard process for a N(+)-P-P(+) configuration. The detailed engineering design was completed for a new dendritic web growth facility of greater width capability than previous facilities.
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Blais, P. D.; Davis, J. R., Jr.
A study reveals that spider orb webs fail in a nonlinear fashion, owing to the hierarchical organization of the silk proteins. The discovery may serve as inspiration for engineers for the design of aerial, light-weight, robust architectures.
Welcome to the Web is a great place to learn the basics of using the Internet and the World Wide Web. It is tailored to children, but if inexperienced adults can get past the cartoon drawings, anyone can benefit from the site. The first section provides an overview of the Internet and some terminology. Next is a section on guestbooks, followed by an overview of Web browsers. "Searching the Net" and research techniques comprise the last two sections. Each category consists of several interactive Web pages that lead the user through each step. Once the user has completed the sections, he/she can try the challenge that tests all of the previous materials.
_Internet Web Text_ links users to information about Internet orientation, guides, reference materials, browsing and exploring tools, subject- and word-oriented searching tools, and information about connecting with people.
Discusses the basic technical concepts of using graphics in World Wide Web pages, including: color depth and dithering, dots-per-inch, image size, file types, Graphics Interchange Formats (GIFs), Joint Photographic Experts Group (JPEG), format, and software recommendations. (AEF)
This article examines the WEBS (Women Evolving the Biological Sciences) symposium which addresses the conditions that have led to inadequate numbers of women in postdoctoral, tenure-track, and tenured faculty positions.
In this activity, learners investigate feeding relationships. Learners complete a food web and then make a mobile to represent a food chain. Use this activity to talk about predator/prey relationships and ecosystems.
Thermal models were developed that accurately predict the thermally generated stresses in the web crystal which, if too high, cause the crystal to degenerate. The application of the modeling results to the design of low-stress experimental growth configurations will allow the growth of wider web crystals at higher growth velocities. A new experimental web growth machine was constructed. This facility includes all the features necessary for carrying out growth experiments under steady thermal conditions. Programmed growth initiation was developed to give reproducible crystal starts. Width control permits the growth of long ribbons at constant width. Melt level is controlled to 0.1 mm or better. Thus, the capability exists to grow long web crystals of constant width and thickness with little operator intervention, and web growth experiments can now be performed with growth variables controlled to a degree not previously possible.
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D. L.; Schruben, J.
Use the two chemical element web sites below to evaluate which site is more reliable. Which is more authoritative? Which is more accurate? Which is more current? How does their coverage or detail compare? Is there advertising on the page? Is there bias or objectivity? ChemicalElements.com Web Elements Evaluation Criteria by Susan E. Beck - links to a page that will help you evaluate each website. ...
From the Missouri Botanical Gardens, this Web site provides a classification of angiosperms that can be navigated either by phylogenic tree, Order, or Family. A detailed description of the classification system is provided by the author, along with literature references and links to other related Web sites. Descriptions of each Family and Order include synonyms and the geographic range in which each is found. Users with a background in botany will find this a useful resource.
Online access to the Internet and the World Wide Web has become important for public awareness and for educating the world's population, including its political leaders, students, researchers, teachers, and ordinary citizens seeking information. After a brief Introduction, relevant information found on photosynthesis-related Web sites and other online locations is presented under five categories: (a) group sites, (b) sites by subject, (c) individual researcher's sites, (d) sites for educators and students, and (e) other useful sites. PMID:23708976
The non-Abelian exponentiation theorem has recently been generalised to correlators of multiple Wilson line operators. The perturbative expansions of these correlators exponentiate in terms of sets of diagrams called webs, which together give rise to colour factors corresponding to connected graphs. The colour and kinematic degrees of freedom of individual diagrams in a web are entangled by mixing matrices of purely combinatorial origin. In this paper we relate the combinatorial study of these matrices to properties of partially ordered sets (posets), and hence obtain explicit solutions for certain families of web-mixing matrix, at arbitrary order in perturbation theory. We also provide a general expression for the rank of a general class of mixing matrices, which governs the number of independent colour factors arising from such webs. Finally, we use the poset language to examine a previously conjectured sum rule for the columns of web-mixing matrices which governs the cancellation of the leading subdivergences between diagrams in the web. Our results, when combined with parallel developments in the evaluation of kinematic integrals, offer new insights into the all-order structure of infrared singularities in non-Abelian gauge theories.
Dukes, M.; Gardi, E.; McAslan, H.; Scott, D. J.; White, C. D.
FLOW is a comprehensive curriculum about the Great Lakes ecosystem. Lessons are geared toward educators who teach upper elementary and middle school students. Each lesson is aligned with national and state curriculum standards for science and social studies and features a hands-on classroom activity. Lessons cover topics such as: food webs, nonindigenous species, watersheds, water quality, fish identification, fish habitat, preserving biodiversity, and water-related careers.
Basic Local Alignment Search Tool (BLAST) is a sequence similarity search program. The public interface of BLAST, http://www.ncbi.nlm.nih.gov/blast, at the NCBI website has recently been reengineered to improve usability and performance. Key new features include simplified search forms, improved navigation, a list of recent BLAST results, saved search strategies and a documentation directory. Here, we describe the BLAST web
Mark Johnson; Irena Zaretskaya; Yan Raytselis; Yuri Merezhuk; Scott Mcginnis; Thomas L. Madden
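Searches against the public BLAST interface described above can also be submitted programmatically through NCBI's URL API. A minimal sketch of building such a submission request follows; the endpoint and parameter names (`CMD`, `PROGRAM`, `DATABASE`, `QUERY`) follow NCBI's URL API convention, and the example sequence is illustrative only:

```python
from urllib.parse import urlencode

BLAST_URL = "https://blast.ncbi.nlm.nih.gov/Blast.cgi"

def build_blast_submit_url(sequence, program="blastn", database="nt"):
    """Build a submission URL for the NCBI BLAST URL API (CMD=Put)."""
    params = {
        "CMD": "Put",          # submit a new search
        "PROGRAM": program,    # e.g. blastn, blastp
        "DATABASE": database,  # e.g. nt, nr
        "QUERY": sequence,     # raw sequence or accession
    }
    return BLAST_URL + "?" + urlencode(params)

url = build_blast_submit_url("ACGTACGTACGT")
print(url)
```

A real client would POST this request, read back the request ID (RID), and poll with `CMD=Get` until results are ready.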
From science writer David Bradley and Advanced Chemistry Development, this newly released Web-based chemistry magazine "will provide the chemistry community with cutting edge reports of exciting developments in the world of the chemical sciences and related fields." The magazine crosses a research orientation with a popular look and feel. Features examine current chemistry developments in areas such as chromatography and nanotechnology, as well as news pertaining to work being done by researchers at Advanced Chemistry Development.
Web services play an active role in the business integration and other fields such as bioinformatics. Current Web services technologies such as WSDL, UDDI, BPEL4WS and BSML are not semantic-oriented. Several proposals have been proposed to develop Semantic Web services to facilitate the discovery of relevant Web services . In our vision, with the mature of Semantic Web services technologies,
Objective: To identify food and beverage brand Web sites featuring designated children's areas, assess marketing techniques present on those industry Web sites, and determine nutritional quality of branded food items marketed to children. Design: Systematic content analysis of food and beverage brand Web sites and nutrient analysis of food and…
This exploratory study analyzed the content of medical tourism Web sites in an attempt to examine how they convey information about benefits and risks of medical procedures, how they frame credibility, and the degree to which these Web sites include interactive features for consumers. Drawing upon framing theory, the researchers content analyzed a sample of 66 medical tourism Web sites
We propose BeamAuth, a two-factor web authentication technique where the second factor is a specially crafted bookmark. BeamAuth presents two interesting features: (1) only server-side deployment is required alongside any modern, out-of-the-box web browser on the client side, and (2) credentials remain safe against many types of phishing attacks, even if the user fails to check proper
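The "specially crafted bookmark" idea can be sketched as follows: the server issues each user a bookmark whose URL carries a per-user token in the fragment (after `#`), which browsers never transmit over the network. The token derivation and names below are illustrative assumptions, not BeamAuth's exact construction:

```python
import hmac
import hashlib
from urllib.parse import urlparse

def make_bookmark(login_url, username, server_secret):
    """Issue a bookmark URL with a per-user token in the fragment.
    The fragment is never sent in HTTP requests, so a phishing page
    at another origin cannot harvest it from server logs."""
    token = hmac.new(server_secret, username.encode(), hashlib.sha256).hexdigest()
    return f"{login_url}#{username}:{token}"

def verify_fragment(fragment, server_secret):
    """Check a 'username:token' fragment against the server secret."""
    username, _, token = fragment.partition(":")
    expected = hmac.new(server_secret, username.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(token, expected)

secret = b"demo-server-secret"  # assumption: server-side key, never sent to clients
bm = make_bookmark("https://example.com/login", "alice", secret)
frag = urlparse(bm).fragment
print(verify_fragment(frag, secret))
```

In the real scheme the verification happens in client-side script on the legitimate login page, which is what keeps the second factor out of attackers' reach.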
This paper introduces a new automatic network discovery/map system via Web architecture, the so-called Web-based Automatic Network discovery/Map Systems (WANMS). The system functions as a plug-in for a well-known network management system, Cacti. Enriched features, especially the automatic network discovery and map module, have been added in order to enhance the efficiency of Cacti embedded with a weather map plug-in.
Understanding and using the data and knowledge encoded in semantic web documents requires an inference engine. F-OWL is an inference engine for the semantic web language OWL, based on F-logic, an approach to defining frame-based systems in logic. F-OWL is implemented using XSB and Flora-2 and takes full advantage of their features. We describe how F-OWL computes ontology entailment and compare it with other description logic based approaches. We also describe TAGA, a trading agent environment that we have used as a test bed for F-OWL and to explore how multiagent systems can use semantic web concepts and technology.
Here, we present the new UCL Bioinformatics Group’s PSIPRED Protein Analysis Workbench. The Workbench unites all of our previously available analysis methods into a single web-based framework. The new web portal provides a greatly streamlined user interface with a number of new features to allow users to better explore their results. We offer a number of additional services to enable computationally scalable execution of our prediction methods; these include SOAP and XML-RPC web server access and new HADOOP packages. All software and services are available via the UCL Bioinformatics Group website at http://bioinf.cs.ucl.ac.uk/.
Buchan, Daniel W. A.; Minneci, Federico; Nugent, Tim C. O.; Bryson, Kevin; Jones, David T.
As the Internet continues to grow, users are increasingly faced with the problem of evaluating the trustworthiness of Web sites. This paper presents the results of two studies conducted to determine how people approach this situation. The researchers "gathered the comments people wrote…about each site's credibility and analyzed these comments to track which features of a Web site were noticed (or went unnoticed)" when considering credibility. Eighteen aspects of Web site design are ranked according to their importance as reported by the study's subjects. The results are quite surprising, because many of the critical items were rated lower than the more aesthetic ones.
Danielsen, David; Fogg, B. J.; Marable, Leslie; Soohoo, Cathy; Stanford, Julianne; Tauber, Ellen R.
With the increasing popularization of the Internet, the use of Internet services for illegal purposes has become a serious problem. How to prevent these phenomena from happening has become a major concern for society. In this paper, a cybercrime forensic method for Chinese illegal web information authorship analysis was described. Various writing-style features, including linguistic features and structural features, were extracted. To
Jianbin Ma; Guifa Teng; Yuxin Zhang; Yueli Li; Ying Li
OBJECTIVE To address the need for women's health education by designing, implementing, and evaluating a self-study, web-based women's health curriculum. DESIGN Cohort of students enrolled in the ambulatory portion of the medicine clerkship with comparison group of students who had not yet completed this rotation. PARTICIPANTS/SETTING Third- and fourth-year medical students on the required medicine clerkship (115 students completed the curriculum; 158 completed patient-related logs). INTERVENTION Following an extensive needs assessment and formulation of competencies and objectives, we developed a web-based women's health curriculum completed during the ambulatory portion of the medicine clerkship. The modules were case based and included web links, references, and immediate feedback on posttesting. We discuss technical issues with implementation and maintenance. MEASUREMENTS AND MAIN RESULTS We evaluated this curriculum using anonymous questionnaires, open-ended narrative comments, online multiple-choice tests, and personal digital assistant (PDA) logs of patient-related discussions of women's health. Students completing the curriculum valued learning women's health, preferred this self-directed learning over lecture, scored highly on knowledge tests, and were involved in more and higher-level discussions of women's health with faculty (P <.001). CONCLUSIONS We present a model for the systematic design of a web-based women's health curriculum as part of a medicine clerkship. The web-based instruction resolved barriers associated with limited curriculum time and faculty availability, provided an accessible and standard curriculum, and met the needs of adult learners (with their motivation to learn topics they value and apply this knowledge in their daily work). We hypothesize that our web-based curriculum spurred students to later discuss these topics with faculty. 
Web-based learning may be particularly suited for women's health because of its multidisciplinary nature and need for vertical integration throughout medical school curricula.
Zebrack, Jennifer R; Mitchell, Julie L; Davids, Susan L; Simpson, Deborah E
Researchers in the geosciences are being faced with a deluge of large-scale datasets that need to be analyzed in a fast and efficient manner. We have developed an interactive web-based scheme for data-mining large datasets using three distinct techniques. Collectively WEB-IS (Web-Based Integrated System) is a web service based on a client-server paradigm which employs the power of a server to provide visualization and data analysis capabilities to multiple clients remotely. WEB-IS1 (http://boy.msi.umn.edu/web-is) extracts multi-resolutional structures in seismic catalogs through cluster analysis. Off-screen rendering is then applied on the server which allows the client to view the results in 3-D. WEB-IS2 (http://boy.msi.umn.edu/amira) exploits the ease of use in the powerful visualization package Amira (www.amiravis.com) through a web-based module which allows for manipulating and analyzing 3-D datasets. WEB-IS3 (http://tomo.msi.umn.edu/~max) is an imaging service which displays selected features from a low resolution environment to one with increased resolution by zooming into the data. Our software uses a combination of programming languages to seamlessly integrate server-side processing with client-side interaction utilities. Fast two-way communication between client and server will be considered for GRID purposes using SOAP (Livingston, D., Advanced SOAP for Web development, Prentice Hall, 2002). Future use of WEB-IS in the GRID computing environment is also being explored using the Narada-Brokering (www.naradabrokering.org) method. Our aim with this software tool is to overcome the hardware limitations of a thin client by harnessing the power of a large visualization server using a simple web interface.
Kadlec, B. J.; Yang, X.; Wang, Y.; Bollig, E. F.; Garbow, Z. A.; Yuen, D. A.; Erlebacher, G.
Once the domain of engineering techies and university computer gurus, the Internet has taken a popular route into business and homes. With the breakthrough of the World Wide Web (the Web), use of the Internet has grown astonishingly, doubling each year for the last couple of years. Many think that Internet usage is shallow and short term, but other companies are becoming less skeptical. Marketing and communications are two reasons why utilities, businesses, and organizations of all sorts are setting up shop on the Web. Corporations and associations have established home pages where a company can disseminate information about itself. The number of Web sites doubles every 53 days. Electric utilities have taken note of the number of users, and about 40 had developed home pages as of Labor Day. This article examines some of these utilities and how they are using their Web sites to provide not only standard corporate information, press releases, and financial information, but also to personalize their contact with their customer base.
Summary TMpro is a transmembrane (TM) helix prediction algorithm that uses language processing methodology for TM segment identification. It is primarily based on the analysis of statistical distributions of properties of amino acids in transmembrane segments. This article describes the availability of TMpro on the internet via a web interface. The key features of the interface are: (i) output is generated in multiple formats including a user-interactive graphical chart which allows comparison of TMpro predicted segment locations with other labeled segments input by the user, such as predictions from other methods. (ii) Up to 5000 sequences can be submitted at a time for prediction. (iii) TMpro is available as a web server and is published as a web service so that the method can be accessed by users as well as other services depending on the need for data integration.
Ganapathiraju, Madhavi; Jursa, Christopher Jon; Karimi, Hassan A.
WebQuests are activities in which students use Web resources to learn about school topics. WebQuests are advocated as constructivist activities and ones generally well regarded by students. Two experiments were conducted in school settings to compare learning using WebQuests versus conventional instruction. Students and teachers both enjoyed WebQuest instruction and spoke highly of it. In one experiment, however, conventional instruction led to significantly greater student learning. In the other, there were no significant differences in the learning outcomes between conventional versus WebQuest-based instruction.
Gaskill, Martonia; McNulty, Anastasia; Brooks, David W.
SeaWeb is a nonprofit organization aimed at raising awareness of the ocean and marine life that play "a critical role in our everyday life and in the future of our planet." SeaWeb employs a team of professionals from biology, exploration, and various communication disciplines. The current campaigns include an effort to protect the declining Caspian Sea Sturgeon ("the source of most of the world's caviar"), an attempt to reduce overfishing of swordfish, and a report about the changes occurring in the world's oceans. This Web site is a robust source of information about many threats that are facing marine ecosystems, and an attempt to reduce the dangers by educating the public about the impacts of their behavior.
Here is a new, rich resource for K-12 teachers and students from the US Geological Survey (USGS). The Learning Web provides online lesson plans, activities, tutorials (some downloadable and printable in .pdf), and links to references dealing with interdisciplinary studies of natural science. For example, the Exploring Caves section (1-3 level) covers the basic geology of caves, life habits of cave-dwelling organisms, and cave safety and conservation. Other topics explored on the Learning Web include maps, climate, wildlife, earth processes, and more. Learning Web culls pages appropriate for K-12 instruction from the USGS's vast online collection of factsheets, data, and program sites, allowing teachers and students to spend time learning rather than searching. However, because this site is so full of information, it can be tricky to navigate and important sections can be missed, so try using their search engine to find specific topics. Note also that elementary content is much more abundant here than secondary.
Neuroscience Web Search, created by Fred K. Lenherr of the Applied Computing Systems Institute, attempts to provide interested users with a searchable index of over 112,000 web pages related to neuroscience. It "includes the home pages of neuroscientists, neuro-medical links, and computational neuroscience." Users can search on the text and/or titles of pages. This search tool is similar to Argos (discussed in the November 1, 1996 Scout Report). The difference is that whereas Argos is both quite clear about the sources of its database and is peer reviewed, Neuroscience Web Search is neither. It is a much larger local subject search engine, but it is more difficult to assess the quality of the materials retrieved.
Museums have been gravitating to the web for over a decade in order to educate the public about their collections, and in recent years, they have also used the web to form online communities for themselves. Museums and the Web is one of the ways that museum specialists can gather together to learn more about each others work, and also to engage in online discussion and research activities. First-time visitors to the site should take a look through some of these ongoing discussions, which include discussions about educational practices and in-house evaluation practices. The site includes an impressive search engine which allows users to look for previously submitted papers and comments. Visitors can access most of the material here without officially registering, but should they wish to do so, they will also be able to use the site to network with colleagues at different institutions.
In an effort to substantially improve the usefulness and accessibility of federal government health information on the World Wide Web, the National Cancer Institute has launched a new Web site called Usability.gov (http://usability.gov).
So far, Rachel Soltanoff's instincts had been right. As CEO in this fictional case study, she had successfully navigated TradeRite Software's transition from a news service for stockbrokers to a $70 million provider of shrink-wrapped software geared toward both brokers and the growing day-trader market. Now a well-financed start-up, Stocknet.com, was testing a very competitive product that traders could download directly over the Web. And TradeRite's Web site was nothing more than a collection of elaborate marketing brochures. Rachel knew she needed to start selling over the Web. But the e-commerce consultants she had hired to set up her Web store were behind schedule, and their 21-year-old CEO had just resigned. Her product manager, Lisa Bandini, was working overtime to transform TradeRite's entire product line into Web-aware applications to match Stocknet's, and Rachel had $2.5 million to launch them. But the consultants said it would take $5 million just to rent e-commerce capabilities. Ace sales VP Brian Rockart thought the company had already wasted too much time and money--money from his budget--on its Web site. Marketing VP Rob Collins thought TradeRite should focus on its core stockbroker customers. Chief Technical Officer Joe Martinez doesn't want to go ahead without a pilot project. Should Rachel try to convince Brian, Rob, and the rest of the senior management team that e-commerce is the way to go? Four commentators offer advice. PMID:10387770
... Level 2 Vertical Feature Mask feature classification flag value. It is written in Interactive Data Language (IDL) as a callable ... receives as an argument a 16-bit feature classification flag value and prints the feature type information extracted from the bits in the ...
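The IDL routine described above simply decodes bit fields from a 16-bit flag. A language-agnostic sketch of the same decoding is shown below; the bit layout and type names are assumptions (modeled on the CALIPSO vertical feature mask convention, where the feature type occupies the low-order bits), not the product's authoritative definition:

```python
# Hypothetical bit layout (assumption): feature type in the three
# least significant bits of the 16-bit classification flag.
FEATURE_TYPES = {
    0: "invalid",
    1: "clear air",
    2: "cloud",
    3: "aerosol",
    4: "stratospheric feature",
    5: "surface",
    6: "subsurface",
    7: "no signal",
}

def feature_type(flag16):
    """Extract the feature-type field from a 16-bit flag value."""
    return FEATURE_TYPES[flag16 & 0b111]

print(feature_type(0b0000000000000010))
```

Remaining bits of the flag would carry further fields (quality assessment, sub-type, and so on), each extracted with its own mask and shift.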
Developed by Philip E. Molebash of San Diego State University, Web Inquiry Projects (WIP) offers ideas for inquiry-based learning that reach "higher levels of inquiry" than typically achieved by the more familiar WebQuest model. You'll find a good selection of examples in history and the humanities (as well as math and science). Each example includes a detailed teacher version and a student version with just the project prompt. The goal is to provide teachers with everything they need to sharpen their inquiry-teaching skills.
Arratia, [Arratia, R. (1979) Ph.D. thesis (University of Wisconsin, Madison) and unpublished work] and later Toth and Werner [Toth, B. & Werner, W. (1998) Probab. Theory Relat. Fields 111, 375-452] constructed random processes that formally correspond to coalescing one-dimensional Brownian motions starting from every space-time point. We extend their work by constructing and characterizing what we call the Brownian Web as a random variable taking values in an appropriate (metric) space whose points are (compact) sets of paths. This leads to general convergence criteria and, in particular, to convergence in distribution of coalescing random walks in the scaling limit to the Brownian Web. PMID:12451173
Fontes, L. R. G.; Isopi, M.; Newman, C. M.; Ravishankar, K.
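The convergence result above concerns coalescing random walks, whose qualitative behavior is easy to observe in simulation. The sketch below is an illustrative discrete analogue, not the paper's construction: walkers start from every site, and any walkers occupying the same site move together ever after.

```python
import random

def coalescing_walks(n_walkers, steps, seed=0):
    """Coalescing simple random walks on Z: one walker per starting
    site; walkers that meet merge and share all future increments.
    Returns the number of distinct positions after each step."""
    rng = random.Random(seed)
    positions = list(range(n_walkers))
    history = [len(set(positions))]
    for _ in range(steps):
        # walkers at the same site must receive the same increment
        increments = {p: rng.choice((-1, 1)) for p in set(positions)}
        positions = [p + increments[p] for p in positions]
        history.append(len(set(positions)))
    return history

counts = coalescing_walks(20, 200)
print(counts[0], counts[-1])
```

The count of distinct walkers is nonincreasing; under diffusive rescaling, systems of such coalescing paths are exactly what converges to the Brownian web.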
Released earlier this year, Web White & Blue 2000 is intended to "help voters, journalists, and others use the Internet to learn more about the presidential candidates, their campaigns, their scheduled debates this fall as well as the way the online resources are impacting politics in this presidential election year." The Best of the Best section provides links to election coverage and campaign material from a wide range of sources on the Internet. Beginning on October 1, the Rolling Cyber Debate is intended to provide a forum for candidates and their campaigns to continue debates online between the televised ones. Web White and Blue 2000 is supported by the Markle Foundation.
Anchor texts complement Web page content and have been used extensively in commercial Web search engines. Existing methods for anchor text weighting rely on the hyperlink information which is created by page content editors. Since anchor texts are created to help users browse the Web, browsing behavior of Web users may also provide useful or complementary information for anchor text
Bo Zhou; Yiqun Liu; Min Zhang; Yijiang Jin; Shaoping Ma
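One way to read the idea above: combine the editor-created hyperlink evidence with user browsing evidence when weighting anchor texts. The mixing rule and parameter names below are illustrative assumptions, not the paper's model:

```python
def weight_anchor_texts(anchors, clicks, alpha=0.5):
    """Weight anchor texts for one target page by combining link
    counts (how many pages use the text) with browsing counts
    (how often users followed it); alpha balances the two sources."""
    link_total = sum(anchors.values()) or 1
    click_total = sum(clicks.values()) or 1
    weights = {}
    for text in set(anchors) | set(clicks):
        link_score = anchors.get(text, 0) / link_total
        click_score = clicks.get(text, 0) / click_total
        weights[text] = alpha * link_score + (1 - alpha) * click_score
    return weights

w = weight_anchor_texts({"home page": 8, "official site": 2},
                        {"official site": 9, "home page": 1})
print(max(w, key=w.get))
```

Here the rarely linked but frequently clicked anchor text ends up with the higher weight, which is exactly the kind of complementary signal browsing logs can contribute.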
Personalization of content returned from a Web site is an important problem in general and affects e-commerce and e-services in particular. Targeting appropriate information or products to the end user can significantly change (for the better) the user experience on a Web site. One possible approach to Web personalization is to mine typical user profiles from the vast amount of
Websites, notable by their URLs, are large collections of Web pages. They make up a huge database of heterogeneous information gathered and collected distributively. The accumulated information is differentiated on the basis of certain templates, their URLs, and the information contained in these pages. In this research, we mainly concentrate on Web forums. In the current circumstances, a Web crawler crawls all the Page
Namita Mittal; M. C. Govil; Richi Nayak; Neeraj Jain
This research aims to provide insight for web form design for older users. The effects of task complexity and information structure of web forms on older users' performance were examined. Forty-eight older participants with abundant computer and web experience were recruited. The results showed significant differences in task time and error rate…
Previous Web access authentication systems have used either the Web or the mobile channel individually to confirm the claimed identity of the remote user. Both approaches proved to be insecure when used in isolation. An investigation is presented into the enhanced security of a new combined Web/mobile authentication system. The hybrid system enables a strong authentication by augmenting the traditional
The success of the Web services technology has brought topics such as software reuse and discovery once again onto the agenda of software engineers. While there are several efforts towards automating Web service discovery and composition, many developers still search for services via online Web service repositories and then combine them manually. However, from our analysis of these repositories,
This article discusses using WebQEM, a quantitative evaluation strategy to assess Web site and application quality. Defining and measuring quality indicators can help stakeholders understand and improve Web products. An e-commerce case study illustrates the methodology's utility in systematically assessing attributes that influence product quality
This paper presents a knowledge discovery framework for the construction of Community Web Directories, a concept that we introduced in our recent work, applying personalization to Web directories. In this context, the Web directory is viewed as a thematic hierarchy and personalization is realized by constructing user community models on the basis of usage data. In contrast to most of
Web applications are the Achilles heel of our current ICT infrastructure. NIST's national vulnerability database clearly shows that the percentage of vulnerabilities located in the application layer increases steadily. Web Application Firewalls (WAFs) play an important role in preventing exploitation of vulnerabilities in web applications. However, WAFs are very pragmatic and ad hoc, and it is very
Lieven Desmet; Frank Piessens; Wouter Joosen; Pierre Verbaeten
In feature selection, a part of the features is chosen as a new feature subset, while the rest of the features is ignored. The neglected features may, however, still contain useful information for discriminating the data classes. To make use of this information, the combined classifier approach can be used. In our paper we study the efficiency of combining applied
The document consists of a publicly available web site (george.arc.nasa.gov) for Joseph A. Garcia's personal web pages in the AI division. Only general information will be posted and no technical material. All the information is unclassified.
Garcia, Joseph A.; Smith, Charles A. (Technical Monitor)
We analyze the effectiveness of different Web browser update mechanisms on various operating systems, from Google Chrome's silent update mechanism to Opera's update requiring a full re-installation. We use anonymized logs from Google's worldwide distributed Web servers. An analysis of the logged HTTP user-agent strings that Web browsers report when requesting any Web page is used to measure the daily browser version shares in active use. To the best of our knowledge, this is the first global-scale measurement of Web browser update effectiveness comparing four different Web browser update strategies including Google Chrome. Our measurements show that silent updates and little dependency on the underlying operating system are most effective at getting users of Web browsers to surf the Web with the latest browser version.
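The measurement approach described above, deriving version shares from logged user-agent strings, can be sketched as follows. The pattern list and function names are illustrative assumptions for a minimal sketch, not the paper's actual pipeline.

```python
import re
from collections import Counter

# Illustrative patterns: extract browser name and major version from a
# User-Agent string. Order matters (Chrome UAs also mention Safari).
UA_PATTERNS = [
    ("Chrome", re.compile(r"Chrome/(\d+)")),
    ("Firefox", re.compile(r"Firefox/(\d+)")),
    ("Opera", re.compile(r"Opera/(\d+)")),
]

def version_shares(user_agents):
    """Return the fraction of requests per (browser, major version)."""
    counts = Counter()
    for ua in user_agents:
        for name, pat in UA_PATTERNS:
            m = pat.search(ua)
            if m:
                counts[(name, m.group(1))] += 1
                break
    total = sum(counts.values())
    return {k: n / total for k, n in counts.items()} if total else {}
```

Aggregating such shares per day over a long log window is what makes the update speed of each browser's release visible as a curve.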
The World Wide Web is a large, heterogeneous, distributed collection of documents connected by hypertext links. The most common technology currently used for searching the Web depends on sending information retrieval requests to
Alberto O. Mendelzon; George A. Mihaila; Tova Milo
This paper presents the QA-Pagelet as a fundamental data preparation technique for large-scale data analysis of the Deep Web. To support QA-Pagelet extraction, we present the Thor framework for sampling, locating, and partitioning the QA-Pagelets from the Deep Web. Two unique features of the Thor framework are 1) the novel page clustering for grouping pages from a Deep Web source
Introduction: As the Web increasingly becomes a core element of business strategy, the task of hosting web content has become mission critical. Few companies, however, have the resources, money and expertise to build their web site entirely in-house. For this reason, many businesses choose to outsource their Web hosting to Internet Service Providers and some equipment vendors which, according to Forrester Research Inc., can slash
Web services are becoming a standard method of sharing data and functionality among loosely-coupled systems. We propose a general-purpose Web Service Management System (WSMS) that enables querying multiple web services in a transparent and integrated fashion. This paper tackles a first basic WSMS problem: query optimization for Select-Project-Join queries spanning multiple web services. Our main result
Utkarsh Srivastava; Kamesh Munagala; Jennifer Widom; Rajeev Motwani
One of the fundamental goals of Web-based support systems is to promote and support human activities on the Web. The focus of this chapter is on the specific activities associated with Web search, with special emphasis given to the use of visualization to enhance the cognitive abilities of Web searchers. An overview of information retrieval basics, along with a focus
BWAIN is a creative web browser which uniquely combines graphics and sound using the HTML structure of any selected web page. The structure behind the web page, which normally remains unseen, is revealed in a process of visualization. The HTML code acts as a genome for building an image which looks and sounds quite distinct. Several web pages may be
Linda Huber; Michael Bißmann; Stefan Gottschalk; Alexander Keck; Andreas Schönefeldt; Marko Seidenglanz; Norbert Sroke; Alexander Radke
The World Wide Web is a distributed global information resource. It contains a large amount of information that has been placed on the web independently by different organizations; thus, related information may appear across different web sites. To manage and access heterogeneous information on the WWW, we have started a project of building a web warehouse, called Whoweda (Warehouse of
Computer specifications for low stress dendritic crystal web growth configurations having thermal elements in fixed positions were developed, as well as computer specifications for web growth incorporating dynamic positioning of thermal elements. Low buckling stress with increased width of unbuckled growth was sought, as well as increased growth velocity while maintaining low residual stress. Model-defined advanced concepts for web growth configurations were defined and verified in experimental web growth. Major increases were achieved in width and velocity.
Seagrass beds provide important habitat for a wide range of marine species but are threatened by multiple human impacts in coastal waters. Although seagrass communities have been well-studied in the field, a quantification of their food-web structure and functioning, and how these change across space and human impacts, has been lacking. Motivated by extensive field surveys and literature information, we analyzed the structural features of food webs associated with Zostera marina across 16 study sites in 3 provinces in Atlantic Canada. Our goals were to (i) quantify differences in food-web structure across local and regional scales and human impacts, (ii) assess the robustness of seagrass webs to simulated species loss, and (iii) compare food-web structure in temperate Atlantic seagrass beds with those of other aquatic ecosystems. We constructed individual food webs for each study site and cumulative webs for each province and the entire region based on presence/absence of species, and calculated 16 structural properties for each web. Our results indicate that food-web structure was similar among low impact sites across regions. With increasing human impacts associated with eutrophication, however, food-web structure shows evidence of degradation as indicated by fewer trophic groups, lower maximum trophic level of the highest top predator, fewer trophic links connecting top to basal species, higher fractions of herbivores and intermediate consumers, and higher number of prey per species. These structural changes translate into functional changes, with impacted sites being less robust to simulated species loss. Temperate Atlantic seagrass webs are similar to a tropical seagrass web, yet differ from other aquatic webs, suggesting consistent food-web characteristics across seagrass ecosystems in different regions.
Our study illustrates that food-web structure and functioning of seagrass habitats change with human impacts and that the spatial scale of food-web analysis is critical for determining results.
Coll, Marta; Schmidt, Allison; Romanuk, Tamara; Lotze, Heike K.
Replication is a well-known technique to improve the accessibility of Web sites. It generally offers reduced client latencies and increases a site's availability. However, applying replication techniques is not trivial, and various Content Delivery Networks (CDNs) have been created to facilitate replication for digital content providers. The success of these CDNs has triggered further research efforts into developing advanced
Swaminathan Sivasubramanian; M. Szymaniak; G. Pierre
Web search engines use indexes to efficiently retrieve pages containing specified query terms, as well as pages linking to specified pages. The problem of compressed indexes that permit such fast retrieval has a long history. We consider the problem: assuming that the terms in (or links to) a page are generated from a probability distribution, how compactly can
Flavio Chierichetti; Ravi Kumar; Prabhakar Raghavan
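A classic baseline for the index compression the abstract above refers to is gap encoding of sorted posting lists combined with variable-byte codes. The sketch below shows that standard technique under those assumptions; it is not necessarily the scheme analyzed in the paper.

```python
def vb_encode(n):
    # Variable-byte encode one non-negative integer:
    # 7 payload bits per byte, high bit set on the final byte.
    out = []
    while True:
        out.insert(0, n % 128)
        if n < 128:
            break
        n //= 128
    out[-1] += 128
    return bytes(out)

def encode_postings(doc_ids):
    # Store sorted document ids as successive gaps, each gap VB-encoded.
    prev, buf = 0, b""
    for d in sorted(doc_ids):
        buf += vb_encode(d - prev)
        prev = d
    return buf

def decode_postings(buf):
    ids, n, prev = [], 0, 0
    for b in buf:
        n = n * 128 + (b % 128)
        if b >= 128:        # high bit set: this gap is complete
            prev += n
            ids.append(prev)
            n = 0
    return ids
```

Because gaps between consecutive document ids are typically small, most gaps fit in one or two bytes, which is the source of the compression.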
Teachers search for ways to enhance oceanography units in the classroom. There are many online resources available to help one explore the mysteries of the deep. This article describes a collection of Web sites on this topic appropriate for middle level classrooms.
Wighting, Mervyn J.; Lucking, Robert A.; Christmann, Edwin P.
The purpose of this study was to understand how elementary classroom Web sites support children's literacy. From a sociocultural perspective of literacy and a transformative stance toward the integration of literacy and technology, and building on explorations of new literacies, I discuss opportunities provided by the Internet that can support…
The lowest level of a food web includes producers, which are plants that make their own energy from the sun. Animals that eat these producers are called primary consumers, and consumers that eat other consumers are called secondary consumers. Decomposers break down dead plants and animals to release nutrients into the soil.
As the number of Internet users and the number of accessible Web pages grows, it is becoming increasingly difficult for users to find documents that are relevant to their particular needs. Users must either browse through a large hierarchy of concepts to find the information for which they are looking or submit a query to a publicly available search engine
A delicate pattern, like that of a spider web, appears on top of the Mars residual polar cap, after the seasonal carbon-dioxide ice slab has disappeared. Next spring, these will likely mark the sites of vents when the carbon-dioxide ice cap returns. This Mars Global Surveyor, Mars Orbiter Camera image is about 3-kilometers wide (2-miles).
A barrier crucible design which consistently maintains melt stability over long periods of time was successfully tested and used in long growth runs. The pellet feeder for melt replenishment was operated continuously for growth runs of up to 17 hours. The liquid level sensor comprising a laser/sensor system was operated, performed well, and meets the requirements for maintaining liquid level height during growth and melt replenishment. An automated feedback loop connecting the feed mechanism and the liquid level sensing system was designed, constructed, and operated successfully for 3.5 hours, demonstrating the feasibility of semi-automated dendritic web growth. The sensitivity of the cost of sheet to variations in capital equipment cost and recycling dendrites was calculated, and it was shown that these factors have relatively little impact on sheet cost. Dendrites from web which had gone all the way through the solar cell fabrication process, when melted and grown into web, produce crystals which show no degradation in cell efficiency. Material quality remains high, and cells made from web grown at the start, during, and the end of a run from a replenished melt show comparable efficiencies.
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hill, F. E.; Skutch, M. E.; Driggers, J. M.; Hopkins, R. H.
Correct spelling is increasingly important in our technological world. We examined children's and adults' Web search behavior for easy and more difficult to spell target keywords. Grade 4 children and university students searched for the life cycle of the lemming (easy to spell target keyword) or the ptarmigan (difficult to spell target keyword).…
Recognized as one of our oldest yet still vital forms of communication, storytelling offers new opportunity when it takes place on the web. Even our every day activities of writing email, creating presentations, or participating in social media can become more dynamic when considered stories. A digital storyteller from outside the museum field…
In this article, the author presents a listing of Web resources which showcases sites created by school districts to support staff use of available technologies. These include Online Technology Tutorials from the Kent School District, Tips and Tutorials from the Kenton County (Kentucky) Schools Office of Instructional Technology, and Teacher…
The goal of this paper is to argue the need to approach personalization issues in Web applications from the very beginning of the application's development cycle. Since personalization is a critical aspect in many popular domains such as e-commerce, it is important enough that it should be dealt with through a design view, rather than only an implementation view (which
This paper is about the measurement of the friction of plastic and similar flexible webs in contact with cylindrical surfaces. There are numerous ways for measuring friction coefficients of materials in relative contact, but to be meaningful, the testing technique should simulate the tribosystem of interest. The subject of this paper is the capstan friction test, which simulates the tribosystem
Current practice of Web site development does not address explicitly the problems related to multilingual sites. The same information, as well as the same navigation paths, page formatting and organization, are expected to be provided by the site independently from the chosen language. This is typically ensured by adopting personal conventions on the way pages are named and on their
Paolo Tonella; Filippo Ricca; Emanuele Pianta; Christian Girardi
In a startling revelation, a team of university scientists has reported that a network of computers has become conscious and sentient, and is beginning to assume control of online information systems. In spite of the ominous tone typically chosen for dramatic effect, a sentient Web would be more helpful and much easier for people to use. An agent is an
A patient with a 'congenital' oesophageal web was suffering from dysphagia and was treated surgically. A review is made of a few similar, well-documented cases. An effort is made to compare the clinical and pathological characteristics of the lesion and to discuss its pathogenesis.
Web accessibility evaluation is a broad field that combines different disciplines and skills. It encompasses technical aspects such as the assessment of conformance to standards and guidelines, as well as non-technical aspects such as the involvement of end-users during the evaluation process. Since Web accessibility is a qualitative and experiential measure rather than a quantitative and concrete property, the evaluation approaches need to include different techniques and maintain flexibility and adaptability toward different situations. At the same time, evaluation approaches need to be robust and reliable so that they can be effective. This chapter explores some of the techniques and strategies to evaluate the accessibility of Web content for people with disabilities. It highlights some of the common approaches to carry out and manage evaluation processes rather than list out individual steps for evaluating Web content. This chapter also provides an outlook to some of the future directions in which the field seems to be heading, and outlines some opportunities for research and development.
Modularity in ontologies is key both for large-scale ontology development and for distributed ontology reuse on the Web. However, the problems of formally characterizing a modular representation, on the one hand, and of automatically identifying modules within an OWL ontology, on the other, have not been satisfactorily addressed, although their relevance has been widely accepted by the Ontology
In this article, the author presents Web sites about teen use of online social networks and age-appropriate resources. These resources can be used for teaching students ways in which to use these networks safely and ethically. Among other things, "Social Network Service" entry in Wikipedia, offers a description and a "List of Social Networking…
The work funded by DARPA and done by MIT and W3C under DAML Agent Markup Language (DAML) project between 2002 and 2005 provided key steps in the research in the Semantic Web technology, and also played an essential role in delivering the technology to ind...
The author has discussed the Multimedia Educational Resource for Teaching and Online Learning site, MERLOT, in a recent Electronic Roundup column. In this article, he discusses an entirely new Web page development tool that MERLOT has added for its members. The new tool is called the MERLOT Content Builder and is directly integrated into the…
A particular challenge which is critically important to the development and reusability of Web Service (WS) systems is to have a precise understanding of the functionality of the service under consideration. Currently, this information is not captured by the associated WS technologies. For instance, the WSDL description at best captures type information associated with each operation provided by the WS
The Web Search page is another collection of all the major search engines and directories, but with graphic representations which may make it more fun to have in your bookmarks. A text version is also provided which includes even more search engines, resulting in a new feeling of being overwhelmed.
Describes the use of Web technology to create a repository for American bird songs at Northeastern Sate University (Oklahoma). Explains the use of software to translated bird songs into a sonogram, or picture, of what the sounds look like for ornithology students to better learn bird vocalizations. (LRW)
Reviews ten Web sites that are designed to answer basic legal questions for the layperson. They provide resources on a broad range of legal topics, such as divorce, real estate, and criminal justice, as well as legal forms and information on finding a lawyer. (LRW)
This lesson plan is designed to help students understand the interrelatedness of food webs and to see how populations of organisms affect each other. Students assume the roles of the various organisms in the ecosystem; the ones that are dependent upon each other are symbolically connected by lengths of yarn. A materials list, instructions, assessment ideas, and educational standards are provided.
In this article, the history of the development of Australia's Web archive, PANDORA, is presented. Criteria for selection, harvesting techniques, the static or dynamic nature of the material, and the technical aspects of archiving are discussed. Policy matters include copyright, permission to archive, legal deposit arrangements, requests to remove harvested material. Practices for administrative, preservation, descriptive, and rights metadata are
This site provides a set of web-based tools for precise analysis of audio, acoustical and electrical signals. The modules include a sound level meter, a digital oscilloscope, a digital spectrum analyzer, and a versatile set of digital function generators.
Compares the attributes of the silk from spiders with those of the commercially harvested silk from silkworms. Discusses the evolution, design, and effectiveness of spider webs; the functional mechanics of the varieties of silk that can be produced by the same spider; and the composite, as well as molecular, structure of spider silk thread. (JJK)
This resource is a simulation game where students represent plants and animals living in a forest habitat. Sitting in a circle, they connect themselves using string to represent the ways they depend on each other. As they make connections, the string forms a web of life. They will also learn what occurs when an invasive species enters their environment.
Genetic toxicology is the scientific discipline dealing with the effects of chemical, physical and biological agents on the heredity of living organisms. The Internet offers a wide range of online digital resources for the field of Genetic Toxicology. The history of genetic toxicology and electronic data collections are reviewed. Web-based resources at US National Library of Medicine (NLM), including MEDLINE®,
In this paper, we will describe work in progress at the MITRE Corporation on embedding speech-enabled interfaces in Web browsers. This research is part of our work to establish the infrastructure to create Web-hosted versions of prototype multimodal interfaces, both intelligent and otherwise. Like many others, we believe that the Web is the best potential delivery and distribution vehicle
The new concept proposed in this paper is a query-free web search that automatically retrieves a web page including information related to the daily activity that we are currently engaged in, for automatically displaying the page on Internet-connected domestic appliances around us such as televisions. When we are washing a coffee maker, for example, a web
A WebQuest is an inquiry-based lesson plan that uses the Internet. This article explains what a WebQuest is, shows how to create one, and provides an example. When engaged in a WebQuest, students use technology to experience cooperative learning and discovery learning while honing their research, writing, and presentation skills. It has been found…
The semantic web or Web 3.0 makes information more meaningful to people by making it more understandable to machines. In this article, the author examines the implications of Web 3.0 for education. The author considers three areas of impact: knowledge construction, personal learning network maintenance, and personal educational administration.…
A system for electrically measuring variations over a flexible web has a capacitive sensor including spaced electrically conductive, transmit and receive electrodes mounted on a flexible substrate. The sensor is held against a flexible web with sufficient force to deflect the path of the web, which moves relative to the sensor.
Sleefe, Gerard E. (1 Snowcap Ct., Cedar Crest, NM 87008); Rudnick, Thomas J. (626 E. Jackson Rd., St. Louis, MO 63119); Novak, James L. (11048 Malaguena La. NE., Albuquerque, NM 87111)
Web software applications are increasingly being deployed in sensitive situations. Web applications are used to transmit, accept and store data that is personal, company confidential and sensitive. Input validation testing (IVT) checks user inputs to ensure that they conform to the program's requirements, which is particularly important for software that relies on user inputs, including Web
Explores the interrelation between Web publishing and information retrieval technologies and lists new approaches to Web indexing and searching. Highlights include Web directories; search engines; portalisation; Internet service providers; browser providers; meta search engines; popularity based analysis; natural language searching; links-based…
The World Wide Web, initially intended as a way to publish static hypertexts on the Internet, is moving toward complex applications. Static Web sites are being gradually replaced by dynamic sites, where information is stored in databases and non-trivial computation is performed. In such a scenario, ensuring the quality of a Web application from the user's perspective is crucial. Techniques are
Security, a quality attribute in web applications, improves the level of quality in the processes needed to manage information, and therefore helps achieve business objectives. Web Engineering must address new challenges facing web application development in order to offer new techniques that guarantee high-quality applications. This work is part of an overall project that focuses on Risk Assessment in
Using Semantic Web ontologies to describe Web Services has proven to be useful for various different tasks including service discovery and composition. AI planning techniques have been employed to automate the composition of Web Services described this way. Planners use the description of the preconditions and effects of a service to do various sorts of reasoning about
The essential components to building a successful Web site are many times overlooked. There is a misconception that if an individual knows HTML or is a Web developer, an effective Web site can easily be created. In reality, a variety of other factors are needed before technical skills ever come into play. When instructing students in the art of…
The '95 World Wide Web conference, in Chicago, was a sold out event. This Web page was established in anticipation of a need to provide articles and updates across the Web for people who were interested - or who couldn't get in.
The study of the web as a graph is not only fascinating in its own right, but also yields valuable insight into web algorithms for crawling, searching and community discovery, and the sociological phenomena which characterize its evolution. We report on experiments on local and global properties of the web graph using two Altavista crawls each with over 200 million
Andrei Z. Broder; Ravi Kumar; Farzin Maghoul; Prabhakar Raghavan; Sridhar Rajagopalan; Raymie Stata; Andrew Tomkins; Janet L. Wiener
The bandwidth demands of the World Wide Web continue to grow at a hyper-exponential rate. Given this rocketing growth, caching of web objects as a means to reduce network bandwidth consumption is likely to be a necessity in the very near future. Unfortunately, many Web caches do not satisfactorily maintain cache consistency. This paper presents a survey of contemporary cache
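The cache-consistency problem surveyed above is commonly handled with expiry-based (TTL) freshness checks, the weakest and most widespread mechanism. The sketch below illustrates that general technique under assumed names; it is not any particular cache's implementation.

```python
import time

class TTLCache:
    """Minimal sketch of weak web-cache consistency: each cached object
    carries an expiry time; stale entries are re-fetched from origin."""

    def __init__(self, fetch, ttl=60.0, clock=time.monotonic):
        self.fetch = fetch      # callable: url -> object (origin fetch)
        self.ttl = ttl          # freshness lifetime in seconds
        self.clock = clock      # injectable clock for testing
        self.store = {}         # url -> (object, expires_at)

    def get(self, url):
        entry = self.store.get(url)
        now = self.clock()
        if entry and entry[1] > now:
            return entry[0]             # fresh hit: serve from cache
        obj = self.fetch(url)           # miss or stale: contact origin
        self.store[url] = (obj, now + self.ttl)
        return obj
```

The trade-off the survey discusses follows directly: a larger `ttl` saves bandwidth but lengthens the window in which clients may see stale content.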
The process-driven composition of Web services is emerging as a promising approach to integrate business applications within and across organizational boundaries. In this approach, individual Web services are federated into composite Web services whose business logic is expressed as a process model. The tasks of this process model are essentially invocations to functionalities offered by the underlying component services. Usually,
Liangzhao Zeng; Boualem Benatallah; Marlon Dumas; Jayant Kalagnanam; Quan Z. Sheng
Hyperlink analysis algorithms significantly improve the relevance of the search results on the Web, so much so that all major Web search engines claim to use some type of hyperlink analysis. However, the search engines do not disclose details about the type of hyperlink analysis they perform, mostly to avoid manipulation of search results by Web-positioning companies. The article discusses
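As the article notes, engines do not disclose their hyperlink analysis; the best-known published example is PageRank. The following is a minimal power-iteration sketch of that published algorithm, not any engine's actual ranking function.

```python
def pagerank(links, damping=0.85, iters=50):
    """Power-iteration PageRank over a link graph given as
    {page: [pages it links to]}."""
    pages = set(links) | {q for outs in links.values() for q in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Every page gets the teleportation share, then link shares.
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # Dangling page: spread its rank uniformly.
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank
```

Pages pointed to by many (or highly ranked) pages accumulate rank, which is the intuition behind using link structure as an endorsement signal.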
With HTML5 and ever-improving browser performance, the web has emerged as an ideal platform for showcasing graphics applications. Several graphics applications that were once too slow to be written in anything but native code may now be fast enough to run as web apps. This course for developers who want to develop graphics applications for the web introduces the concepts
Pushkar Joshi; Mikaël Bourges-Sévenier; Kenneth Russell; Zhenyao Mo
The goal of the TREC Web track is to explore and evaluate retrieval approaches over large-scale subsets of the Web -- currently on the order of one billion pages. For TREC 2013, the fifth year of the Web track, we implemented the following significant upd...
C. L. Clarke; E. M. Voorhees; F. Diaz; K. Collins-Thompson; P. Bennett
Lopes, Christian T.; Franz, Max; Kazi, Farzana; Donaldson, Sylva L.; Morris, Quaid; Bader, Gary D.
This case study reports the investigations into the feasibility and reliability of calculating impact factors for web sites, called Web Impact Factors (Web-IF). The study analyses a selection of seven small and medium scale national and four large web domains as well as six institutional web sites over a series of snapshots taken of the web during a month. The
Discusses Web sales and explores the differences between heavy, medium, and light Web users in terms of their beliefs about Web advertising, attitudes toward Web advertising, purchasing patterns, and demographics. Suggests marketers need to target Web advertising to particular Web users. (Author/LRW)
We propose a formal model of web security based on an abstraction of the web platform and use this model to analyze the security of several sample web mechanisms and applications. We identify three distinct threat models that can be used to analyze web applications, ranging from a web attacker who controls malicious web sites and clients, to stronger attackers
Devdatta Akhawe; Adam Barth; Peifung E. Lam; John C. Mitchell; Dawn Song
Web services have become a popular means of interconnecting solutions for processing a body of scientific literature. This has fuelled research on high-level data exchange formats suitable for a given domain and ensuring the interoperability of Web services. In this article, we focus on the biological domain and consider four interoperability formats, BioC, BioNLP, XMI and RDF, that represent domain-specific and generic representations and include well-established as well as emerging specifications. We use the formats in the context of customizable Web services created in our Web-based, text-mining workbench Argo that features an ever-growing library of elementary analytics and capabilities to build and deploy Web services straight from a convenient graphical user interface. We demonstrate a 2-fold customization of Web services: by building task-specific processing pipelines from a repository of available analytics, and by configuring services to accept and produce a combination of input and output data interchange formats. We provide qualitative evaluation of the formats as well as quantitative evaluation of automatic analytics. The latter was carried out as part of our participation in the fourth edition of the BioCreative challenge. Our analytics built into Web services for recognizing biochemical concepts in BioC collections achieved the highest combined scores out of 10 participating teams. Database URL: http://argo.nactem.ac.uk.
Background: As the "omics" revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter's complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. Results: COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a "semantic web in a box" approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. Conclusions: The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/.
Motivating students to learn is a constant challenge for faculty. Technology can play a significant role. One such solution is WebAssign — a web-based homework system that offers new teaching and learning opportunities for educators and their students. WebAssign delivers, collects, grades, and records customized homework assignments over the Internet. Students get immediate feedback with credit and instructors can implement "Just-in-Time" teaching. In this talk, I will describe how assignments can be generated with different numerical values for each question, giving each student a unique problem to solve. This feature encourages independent thinking with the benefit of collaborative learning. Example assignments taken from textbook questions and intellectually engaging Java applet simulations will be shown. Studies and first-hand experience on the educational impact of using WebAssign will also be discussed.
Website accessibility evaluation is a complex task requiring a combination of human expertise and software support. There are several online and offline tools to support the manual web accessibility evaluation process. However, they all have some weaknesses because none of them includes all the desired features. In this paper we present Hera-FFX, an add-on for the Firefox web browser that
José L. Fuertes; Ricardo González; Emmanuelle Gutiérrez Y Restrepo; Loïc Martínez
Describes the results of a transaction log analysis of a Web search engine, WebCrawler, to analyze user's queries for information retrieval. Results suggest most users do not employ advanced search features, and the linguistic structure often resembles a human-human communication model that is not always successful in human-computer communication.…
We introduce the OWL Plugin, a Semantic Web extension of the Protégé ontology development platform. The OWL Plugin can be used to edit ontologies in the Web Ontology Language (OWL), to access description logic reasoners, and to acquire instances for semantic markup. In many of these features, the OWL Plugin has created and facilitated new practices for building Semantic
Holger Knublauch; Ray W. Fergerson; Natalya Fridman Noy; Mark A. Musen
Tables are an important feature of presenting information and are widely used on the web. They show relational data in a simple and precise manner. A typical web page consists of many blocks or areas, e.g. main content areas, advertisements, and images. Tables contain meaningful information, and a great deal of web data is arranged in tabular format. Tables describe relational information in a
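The relational reading of a web table can be illustrated with a minimal standard-library extractor; the markup below is invented:

```python
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Collect rows of cell text from <table> markup."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], None, False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())

html = "<table><tr><th>City</th><th>Pop.</th></tr><tr><td>Oslo</td><td>700000</td></tr></table>"
parser = TableExtractor()
parser.feed(html)
print(parser.rows)  # [['City', 'Pop.'], ['Oslo', '700000']]
```

Real pages mix layout tables with data tables, which is why distinguishing the two is a research problem in its own right.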
Web-based application development is a difficult ta sk, since these applications include various features, like graphical interfaces, navigational s tructures, business models, and wireless commu- nications, as well as other issues, such as serving a multitude of users, and the need for shorter development time. To overcome these complexities, it is indispensable to use web-based applica- tion designs and software
The problem of identity and reference is receiving increasing attention in the (semantic) web community and is emerging as one of the key features which distinguish traditional knowledge representation from knowledge representation on the web with respect to data interlinking and knowledge integration on a large scale. As part of this debate, the OKKAM project proposed the creation of an
Paolo Bouquet; Themis Palpanas; Heiko Stoermer; Massimiliano Vignolo
This study content analyzed a randomly selected stratified national sample of 203 four-year United States colleges' counseling center Web sites to assess the degree to which such sites feature information and reference services for lesbian, gay, bisexual, and transgender (LGBT) collegians. Results revealed that LGBT-targeted communications were infrequent. For instance, fewer than one third of counseling center Web sites described
Tan Kee Leong; Borhanuddin Mohd Ali; Veeraraghavan Prakash; Nor Kamariah Nordin
Describes how a group of middle-school students in Walled Lake, Michigan, collaborated with a Web development firm and the county information technology department to build a district court Web site (www.52-1districtcourt.com) to provide community access to legal information. Includes such features as a virtual tour of the court, "Ask the Judge,"…
Despite the fact that average screen size and resolution have dramatically increased, many of today's web sites still do not scale well in larger viewing contexts. The upcoming HTML5 and CSS3 standards propose features that can be used to build more flexible web page layouts, but their potential to accommodate a wider range of display environments is currently relatively unexplored.
Michael Nebeling; Fabrice Matulic; Lucas Streit; Moira C. Norrie
Web video databases tend to contain tremendous near-duplicates with the explosive growth of online videos. How to detect and eliminate these near-duplicates has become an essential problem for Web video storage and indexing. In this paper, we propose a hierarchical approach to solve this problem efficiently and effectively. For an incoming video, firstly we compare global features to exclude videos
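The coarse-to-fine idea can be sketched as follows; the features, distance and thresholds here are illustrative stand-ins, not the paper's:

```python
def l1(a, b):
    """L1 distance between two feature vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

def is_near_duplicate(video_a, video_b, global_tol=0.1, frame_tol=0.2):
    """Two-stage check. Each video is (global_signature, [frame_signatures]).

    Stage 1 compares the cheap global signature; only candidates that
    pass go on to the costlier frame-by-frame comparison.
    """
    ga, frames_a = video_a
    gb, frames_b = video_b
    if l1(ga, gb) > global_tol:          # cheap filter excludes most videos
        return False
    if len(frames_a) != len(frames_b):
        return False
    avg = sum(l1(fa, fb) for fa, fb in zip(frames_a, frames_b)) / len(frames_a)
    return avg <= frame_tol

v1 = ([0.2, 0.5, 0.3], [[0.1, 0.9], [0.4, 0.6]])
v2 = ([0.22, 0.48, 0.3], [[0.12, 0.88], [0.41, 0.59]])
v3 = ([0.9, 0.05, 0.05], [[0.5, 0.5], [0.5, 0.5]])
print(is_near_duplicate(v1, v2))  # True
print(is_near_duplicate(v1, v3))  # False
```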
This work describes a new Java web application that enables users to explore and annotate VRML worlds collaboratively. The web application consists of a Java servlet, Java Server Pages (JSP) and supporting classes. There are two significant features of this work. First, users can explore existing standard VRML worlds collaboratively with automatic multiuser augmentation. Second, the user is free from
We present an end-to-end feature description pipeline which uses a novel interest point detector and Rotation-Invariant Fast Feature (RIFF) descriptors. The proposed RIFF algorithm is 15× faster than SURF while producing large-scale retrieval results that are comparable to SIFT. Such high-speed features benefit a range of applications from Mobile Augmented Reality (MAR) to web-scale image retrieval and analysis.
Web Page Design for Designers is an excellent resource written and designed by Joe Gillespie and sponsored by Pixel Productions. It offers web site design information and advice to those who are already familiar with HTML. This site's main focus is on how to obtain an optimal design that is sensitive to both multiple platforms and screen sizes. Issues discussed include web page size, fonts, graphics and color palettes, and navigation. Users can download tools such as a web page ruler, color palettes, and more. Web Page Design for Designers also contains a well-rounded listing of other resources.
The ProBiS web server is a web server for detection of structurally similar binding sites in the PDB and for local pairwise alignment of protein structures. In this article, we present a new version of the ProBiS web server that is 10 times faster than earlier versions, due to the efficient parallelization of the ProBiS algorithm, which now allows significantly faster comparison of a protein query against the PDB and reduces the calculation time for scanning the entire PDB from hours to minutes. It also features new web services and an improved user interface. In addition, the new web server is united with the ProBiS-Database and thus provides instant access to pre-calculated protein similarity profiles for over 29,000 non-redundant protein structures. The ProBiS web server is particularly adept at detection of secondary binding sites in proteins. It is freely available at http://probis.cmm.ki.si/old-version, and the new ProBiS web server is at http://probis.cmm.ki.si.
The purpose of the study was to examine effects of a compact training for developing web sites on teachers' web attitude, as composed of: web self efficacy, perceived web enjoyment, perceived web usefulness and behavioral intention to use the web. To measure the related constructs, the Web Attitude Scale was adapted into Turkish and tested with a…
Creating an adaptive web site, which is a web site that automatically changes its contents and organization according to usage, is a challenge for web site designers. We introduce a novel algorithm for creating an adaptive web site based on combining web usage mining and adaptive web site generation. We outline the design of a system called WebAdaptor that
This paper describes several known and some new methods for feature subset selection on large text data. Experimental comparison given on real-world data collected from Web users shows that characteristics of the problem domain and the machine learning algorithm should be considered when the feature scoring measure is selected. Our problem domain consists of hyperlinks given in the form of small documents represented
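A deliberately simple feature scoring measure, standing in for measures such as information gain or chi-square, might look like this (data invented):

```python
def feature_scores(docs, labels):
    """Score each token by how unevenly it occurs across two classes.

    docs: list of token lists; labels: parallel list of 0/1 class labels.
    The score |P(term|class1) - P(term|class0)| is a crude stand-in for
    real scoring measures such as information gain or chi-square.
    """
    n = [labels.count(0), labels.count(1)]
    df = {}
    for tokens, y in zip(docs, labels):
        for t in set(tokens):                 # document frequency per class
            d = df.setdefault(t, [0, 0])
            d[y] += 1
    return {t: abs(d[1] / n[1] - d[0] / n[0]) for t, d in df.items()}

docs = [["cheap", "pills"], ["cheap", "meeting"],
        ["project", "meeting"], ["project", "plan"]]
labels = [1, 1, 0, 0]
scores = feature_scores(docs, labels)
print(max(scores, key=scores.get))  # cheap
```

Ranking tokens by such a score and keeping the top-k is the basic shape of feature subset selection for text.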
The development of a low cost and reliable contact system for solar cells and the fabrication of several solar cell modules using ultrasonic bonding for the interconnection of cells and ethylene vinyl acetate as the potting material for module encapsulation are examined. The cells in the modules were made from dendritic web silicon. To reduce cost, the electroplated layer of silver was replaced with an electroplated layer of copper. The modules that were fabricated used the evaporated Ti, Pd, Ag and electroplated Cu (TiPdAg/Cu) system. Adherence of Ni to Si is improved if a nickel silicide can be formed by heat treatment. The effectiveness of Ni as a diffusion barrier to Cu and the ease with which nickel silicide is formed is discussed. The fabrication of three modules using dendritic web silicon and employing ultrasonic bonding for interconnecting cells and ethylene vinyl acetate as the potting material is examined.
Meier, D. L.; Campbell, R. B.; Sienkiewicz, L. J.; Rai-Choudhury, P.
Sponsored by the British Broadcasting Corporation (BBC), Becoming WebWise is an online course for the novice Internet user who wants to learn at his/her own pace. The course consists of eight sections that take users through the Internet basics in a simple and easy-to-follow format. Becoming WebWise covers topics such as getting connected, emailing, searching, bookmarking, creating address books, and the basic fundamentals of building a Web page. Users will also learn about technological developments like Digital TV, WAP phones, legal online rights, the history of the Net, as well as other ways of accessing the Internet. The course is estimated to take up to ten hours to complete, and users are able to return to any of the sections as often as they choose.
This lesson plan is part of the DiscoverySchool.com lesson plan library for grades 6-8. It focuses on the seasonal changes that affect life in a temperate forest ecosystem and how organisms in a temperate forest are dependent on one another for proper nutrition. Students describe the three major types of organisms that live in an ecosystem, three types of consumers, food webs, and food chains. They then create a food web diagram for display in their classrooms. Included are objectives, materials, procedures, discussion questions, evaluation ideas, suggested readings, and vocabulary. There are videos available to order which complement this lesson, and links to teaching tools for making custom quizzes, worksheets, puzzles and lesson plans.
We discuss the notion of quantum computational webs: These are quantum states universal for measurement-based computation, which can be built up from a collection of simple primitives. The primitive elements--reminiscent of building blocks in a construction kit--are (i) one-dimensional states (computational quantum wires) with the power to process one logical qubit and (ii) suitable couplings, which connect the wires to a computationally universal web. All elements are preparable by nearest-neighbor interactions in a single pass, of the kind accessible in a number of physical architectures. We provide a complete classification of qubit wires, a physically well-motivated class of universal resources that can be fully understood. Finally, we sketch possible realizations in superlattices and explore the power of coupling mechanisms based on Ising or exchange interactions.
Gross, D. [Institute for Theoretical Physics, Leibniz University Hannover, D-30167 Hannover (Germany); Eisert, J. [Institute for Physics and Astronomy, University of Potsdam, D-14476 Potsdam (Germany); Institute for Advanced Study Berlin, D-14193 Berlin (Germany)
Data from the National Health and Nutrition Examination Survey (NHANES) have provided unique opportunities to study major nutrition, infection, environmental, and chronic health conditions in the US. The National Center for Health Statistics (NCHS) makes NHANES datasets publicly available on its Web site. However, all NHANES users face similar challenges because of the complexity of NHANES' survey design and vast amount of information available in NHANES data.
This is a web quest for students to research weather forecasting using the Internet. Students work in groups to study how accurate weather forecasts are by tracking the weather for 3 days in several locations. Using graphs students then compare how each location scored in accuracy and present their findings to the class. This site contains links for students to use for more background information, a process for the students to follow, and evaluation rubrics for the student-produced graphs and presentation.
The expressivity of RDF and RDF Schema that was described in  is deliberately very limited: RDF is (roughly) limited to binary ground predicates, and RDF Schema is (again roughly) limited to a subclass hierarchy and a property hierarchy, with domain and range definitions of these properties. However, the Web Ontology Working Group of W3C  identified a number of
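The limited expressivity described here (a subclass hierarchy plus domain/range declarations, and the type inferences they license) can be mimicked with plain data structures; the vocabulary below is invented:

```python
# Minimal RDFS-style machinery: a subclass hierarchy plus domain/range
# declarations for properties. Classes and properties are invented.
subclass_of = {"Dog": "Mammal", "Mammal": "Animal"}
domain_of = {"hasOwner": "Animal"}   # the subject of hasOwner is an Animal
range_of = {"hasOwner": "Person"}    # the object of hasOwner is a Person

def superclasses(cls):
    """All classes a class is (transitively) a subclass of, incl. itself."""
    out = [cls]
    while cls in subclass_of:
        cls = subclass_of[cls]
        out.append(cls)
    return out

def infer_types(triple):
    """RDFS-style type entailments from one (subject, predicate, object)."""
    s, p, o = triple
    inferred = set()
    if p in domain_of:
        for c in superclasses(domain_of[p]):
            inferred.add((s, "rdf:type", c))
    if p in range_of:
        for c in superclasses(range_of[p]):
            inferred.add((o, "rdf:type", c))
    return inferred

print(sorted(infer_types(("fido", "hasOwner", "alice"))))
```

Everything beyond this shape, such as cardinality restrictions or class disjointness, is exactly what OWL was designed to add.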
The journal Nature offers this free Web focus on Severe Acute Respiratory Syndrome (SARS), in which Nature's reporters pose key questions about the outbreak, and assess our preparedness to deal with future viral threats. Readers will find dozens of articles, including editorials, Science Updates, and Brief Communications from the journal. The articles trace the chronology of the SARS epidemic, and the section titled What Have We Learned? offers an excellent overview of what we know and what remains to be seen.
By Peter Guest, Research Associate Professor at the Naval Postgraduate School in Monterey, this wonderfully organized set of web resources in polar meteorology can be helpful to educators looking to supplement their lessons. The links are divided into sections, including Current Weather, Current Sea Ice, Climate, Climate Change, Education, Maps, Research Projects, Institutions, Organizations, and People. Visitors will also find a list of general polar links at the end.
Parents and those concerned about young people surfing the Internet may want to take a close look at this particular application. With this application, users have the ability to block adult sites and other potentially offensive content from the eyes of children and other impressionable persons. K9 Web Protection can also be configured to stop spyware or gambling programs. This particular version is compatible with all computers running Windows 2000 and XP.
This paper shifts the focus of Web search towards finding and exploiting small text nuggets, rather than full-length documents, assuming that the type of targeted information (e.g., date) is specified in the queries. Each nugget is a document sentence fragment that encodes open-domain factual information associated with some entity. The entities are dates (e.g., 1971) when the events captured by
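A crude illustration of nugget extraction for date-typed queries can be given with a regular expression; the pattern and text below are invented, not the paper's method:

```python
import re

# A "nugget" here is a short fragment tying a capitalized entity to a
# 4-digit year. The pattern is a deliberately simple illustration.
NUGGET = re.compile(
    r"([A-Z][A-Za-z]+(?: [A-Z][A-Za-z]+)*)"   # capitalized entity
    r".{0,40}?"                               # short lazy gap
    r"\b(19\d{2}|20\d{2})\b"                  # a year in 1900-2099
)

text = ("Intel was founded in 1968. The first microprocessor, "
        "the Intel 4004, shipped in 1971.")

nuggets = NUGGET.findall(text)
print(nuggets)  # [('Intel', '1968'), ('Intel', '1971')]
```

Each (entity, date) pair is the kind of open-domain factual fragment the abstract describes; real systems would of course use far richer linguistic analysis.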
One of the major benefits of using Jython is the ability to make use of Java platform capabilities programming in the Python programming language instead of Java. In the Java world today, the most widely used web development technique is the Java servlet. Now in JavaEE, there are techniques and frameworks used so that we can essentially code HTML or
Josh Juneau; Jim Baker; Victor Ng; Leo Soto; Frank Wierzbicki
There is a great deal of research about improving Web server performance and building better, faster servers, but little research in characterizing servers and the load imposed upon them. While some tremendously popular and busy sites, such as netscape.com, playboy.com, and altavista.com, receive several million hits per day, most servers are never subjected to loads of this magnitude. This paper
We describe a framework for human action retrieval in still web images by verb queries, for instance “phoning”. Firstly, we build a group of visual discriminative instances for each action class, called “Exemplarlets”. Thereafter we employ Multiple Kernel Learning (MKL) to learn an optimal combination of histogram intersection kernels, each of which captures a state-of-the-art feature channel. Our features include
We compute a class of diagrams contributing to the multi-leg soft anomalous dimension through three loops, by renormalizing a product of semi-infinite non-lightlike Wilson lines in dimensional regularization. Using non-Abelian exponentiation we directly compute contributions to the exponent in terms of webs. We develop a general strategy to compute webs with multiple gluon exchanges between Wilson lines in configuration space, and explore their analytic structure in terms of α_ij, the exponential of the Minkowski cusp angle formed between the lines i and j. We show that beyond the obvious inversion symmetry α_ij → 1/α_ij, at the level of the symbol the result also admits a crossing symmetry α_ij → −α_ij, relating spacelike and timelike kinematics, and hence argue that in this class of webs the symbol alphabet is restricted to α_ij and 1 − α_ij². We carry out the calculation up to three gluons connecting four Wilson lines, finding that the contributions to the soft anomalous dimension are remarkably simple: they involve pure functions of uniform weight, which are written as a sum of products of polylogarithms, each depending on a single cusp angle. We conjecture that this type of factorization extends to all multiple-gluon-exchange contributions to the anomalous dimension.
The paper discusses a web based application for distributed automation. The realization is made over a three-layer distributed model. An XML table driven communication model is used for heterogeneous connection of different parts of the system. Functionality of the model is delegated and distributed among servers and embedded systems. Major features of the realization concern scalability, flexibility, distribution, collecting and delegating of functionality, reliability and
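An XML table driven dispatch of the kind described can be sketched with the standard library; the table schema below is invented:

```python
import xml.etree.ElementTree as ET

# A toy "table driven" communication model: the XML table maps message
# types to handlers, so heterogeneous parts of the system only need to
# agree on the table's schema. The schema itself is invented.
TABLE_XML = """
<dispatch>
  <route message="read_sensor" handler="sensor_server"/>
  <route message="set_relay"   handler="embedded_node"/>
</dispatch>
"""

def load_routes(xml_text):
    """Parse the routing table into a message -> handler mapping."""
    root = ET.fromstring(xml_text)
    return {r.get("message"): r.get("handler") for r in root.findall("route")}

routes = load_routes(TABLE_XML)
print(routes["set_relay"])  # embedded_node
```

Because the table is data rather than code, servers and embedded systems can be reconfigured by editing the XML alone, which is the main appeal of this style.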
This guide presents a selection of the most authoritative, informative and useful Web sites for today's college and university American literature students and scholars. Nearly 300 American authors whose works are featured in the most recent editions of widely used literature anthologies are included in this volume. A chronological arrangement…
Compared with traditional business operations, WWW-based commerce has many advantages, such as timeliness, worldwide communication, hyperlinks, and multimedia. However, there are also several browsing problems, such as getting lost, consuming a great amount of time browsing, and lack of customized interactive features. To acquire a competitive advantage over the countless number of Web sites, it is critical to solve these
Modeling in the development of low stress configurations for wide web growth is presented. Parametric sensitivity to identify design features which can be used for dynamic trimming of the furnace element was studied. Temperature measurements of experimental growth behavior led to modification in the growth system to improve lateral temperature distributions.
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.
In this paper the problems in automated testing of GUI-based Web applications are discussed. A new automated testing framework based on the concept of an object feature set and a dynamic searching policy is proposed, and its design and implementation are both given. Results from using the framework show that it makes testing more convenient and efficient with
Amenable to extensive parallelization, Google's web search application lets different queries run on different processors and, by partitioning the overall index, also lets a single query use multiple processors. To handle this workload, Google's architecture features clusters of more than 15,000 commodity-class PCs with fault-tolerant software. This architecture achieves superior performance at a fraction of the cost of a
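The index-partitioning scheme can be sketched as a fan-out over shards; this is a toy model with invented data, not Google's implementation:

```python
# Each "shard" indexes a disjoint slice of the documents; a query is
# sent to every shard and the per-shard hits are merged.
class Shard:
    def __init__(self, docs):
        self.index = {}
        for doc_id, text in docs.items():
            for word in text.split():
                self.index.setdefault(word, set()).add(doc_id)

    def search(self, word):
        return self.index.get(word, set())

def fan_out(shards, word):
    """Union the hits from every shard (run in parallel in a real system)."""
    hits = set()
    for shard in shards:
        hits |= shard.search(word)
    return hits

shards = [
    Shard({1: "web search engine", 2: "mail service"}),
    Shard({3: "web crawler", 4: "news feed"}),
]
print(sorted(fan_out(shards, "web")))  # [1, 3]
```

Because each shard holds only part of the index, shards fit on cheap machines, and fault tolerance reduces to replicating shards, which is the economic point the abstract makes.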
The searchable Solar Feature Catalogues (SFC) developed using automated pattern recognition techniques from digitized solar images are presented. The techniques were applied for detection of sunspots, active regions, filaments and line-of-sight magnetic neutral lines in the automatically standardized full-disk solar images in Ca II K1, Ca II K3 and Hα taken at the Meudon Observatory, and white-light images and magnetograms from SOHO/MDI. The results of automated recognition were verified with the manual synoptic maps and available statistical data, which revealed good detection accuracy. Based on the recognized parameters, a structured database of the Solar Feature Catalogues was built on a MySQL server for every feature and published with various pre-designed search pages on the Bradford University web site. The SFCs, with 10-year coverage (1996-2005), are to be used for deeper investigation of solar activity, activity feature classification and forecasting.
Zharkova, V. V.; Aboudarham, J.; Zharkov, S. I.; Ipson, S. S.; Benkhalil, A. K.; Fuller, N.
The most commonly deployed web service applications employ client-server communication patterns, with clients running remotely and services hosted in data centers. In this paper, we make the case for Service-Oriented Collaboration (SOC) applications that combine service-hosted data with collaboration features implemented using peer-to-peer protocols. Collaboration features are awkward to support solely based on the existing web services technologies.
Ken Birman; Jared Cantwell; Daniel Freedman; Qi Huang; Petko Nikolov; Krzysztof Ostrowski
Large-scale Web security systems usually involve cooperation between domains with non-identical policies. The network management and Web communication software used by the different organizations presents a stumbling block. Many of the tools used by the various divisions do not have the ability to communicate network management data with each other. At best, this means that manual human intervention into the communication protocols used at various network routers and endpoints is required. Developing practical, sound, and automated ways to compose policies to bridge these differences is a long-standing problem. One of the key subtleties is the need to deal with inconsistencies and defaults where one organization proposes a rule on a particular feature, and another has a different rule or expresses no rule. A general approach is to assign priorities to rules and observe the rules with the highest priorities when there are conflicts. The present methods have inherent inefficiency, which heavily restrict their practical applications. A new, efficient algorithm combines policies utilized for Web services. The method is based on an algorithm that allows an automatic and scalable composition of security policies between multiple organizations. It is based on defeasible policy composition, a promising approach for finding conflicts and resolving priorities between rules. In the general case, policy negotiation is an intractable problem. A promising method, suggested in the literature, is when policies are represented in defeasible logic, and composition is based on rules for non-monotonic inference. In this system, policy writers construct metapolicies describing both the policy that they wish to enforce and annotations describing their composition preferences. 
These annotations can indicate whether certain policy assertions are required by the policy writer or, if not, under what circumstances the policy writer is willing to compromise and allow other assertions to take precedence. Meta-policies are specified in defeasible logic, a computationally efficient non-monotonic logic developed to model human reasoning. One drawback of this method is that at one point the algorithm starts an exhaustive search of all subsets of the set of conclusions of a defeasible theory. Although the propositional defeasible logic has linear complexity, the set of conclusions here may be large, especially in real-life practical cases. This phenomenon leads to an inefficient exponential explosion of complexity. The current process of getting a Web security policy from a combination of two meta-policies consists of two steps. The first is generating a new meta-policy that is a composition of the input meta-policies, and the second is mapping the meta-policy onto a security policy. The new algorithm avoids the exhaustive search in the current algorithm, and provides a security policy that matches all requirements of the involved meta-policies.
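A toy version of priority-based composition, far simpler than the defeasible-logic machinery described, might look like this; the features, decisions and priorities are invented:

```python
# Toy priority-based composition of two organizations' policy rules.
# Each policy maps a feature to (decision, priority); when both
# organizations rule on a feature, the higher priority wins, and a
# feature only one of them mentions keeps that organization's rule.
def compose(policy_a, policy_b):
    composed = {}
    for feature in set(policy_a) | set(policy_b):
        rules = [p[feature] for p in (policy_a, policy_b) if feature in p]
        composed[feature] = max(rules, key=lambda r: r[1])[0]
    return composed

org_a = {"port_8080": ("allow", 2), "telnet": ("deny", 5)}
org_b = {"port_8080": ("deny", 4), "ftp": ("allow", 1)}

print(sorted(compose(org_a, org_b).items()))
# [('ftp', 'allow'), ('port_8080', 'deny'), ('telnet', 'deny')]
```

Defeasible logic generalizes this: instead of a single numeric priority, rules can defeat one another conditionally, which is what makes composition expressive but potentially expensive.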
An Earth-observing sensor web is an organization of space, airborne, or in situ sensing devices for collecting measurements of the Earth's processes. Sensor web coordination involves formulating Earth science goals and transforming them into sensor web workflows, i.e., sequences of data acquisition and processing tasks that satisfy the specified goals. Automating parts of this process using recent advances in intelligent control software technology will offer improved sensor web effectiveness. Our approach to the coordination problem applies architectural concepts of workflow management systems by identifying two phases in workflow generation. In the first phase, users formulate high-level campaign goals that are automatically transformed into abstract workflow plans. An abstract workflow plan represents the organization of data acquisition and processing actions that fulfills the goals specified by the user, but leaves out details such as how requests for access to a data resource are formatted. Abstracting away these details improves the usability of sensor web resources by scientists. To implement the first phase, we utilize the Labeled Transition System Analyzer (LTSA), a model-checking software tool. LTSA contains a concise process-based language, FSP (Finite State Processes) for designing and modeling software programs. We will use LTSA and FSP to automate the process of building executable plans for accessing resources on a sensor web. FSP has the constructs for representing conditional dependencies, iterations, and parallel actions, all of which are common features in Earth science campaigns. The second phase of the process consists of the automatic transformation of an abstract plan into a concrete plan, i.e., a sequence of actions that can be autonomously executed on a sensor web. The transformation in phase two might require further decomposition of actions in the abstract plan into a sequence of lower-level data acquisition requests. 
It may also involve the selection of resources to accomplish a given action and the representation of data acquisition tasks in a format that is recognized by the targeted resource (e.g. a sensor control command or a data archive query). The second phase relies on a service-layer information infrastructure for accessing sensor web resources. Standardizing requirements for such a service layer through the Open Geospatial Consortium Sensor Web Enablement (OGC/SWE) effort should allow access to numerous and diverse sensor web resources. For the purpose of demonstrating a prototype of our workflow management concepts, our system currently utilizes a simpler information infrastructure layer for servicing requests. This layer controls access to TOPS (Terrestrial Observation and Prediction System), a modeling software system that brings together technologies in information technology, weather/climate forecasting, ecosystem modeling, and satellite remote sensing to enhance management decisions related to floods, droughts, forest fires, human health, and crop, range, and forest production. We provide examples of concrete plans for accessing TOPS data and modeling resources and how they are generated from abstract plans.
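The two-phase transformation from abstract to concrete plans can be sketched with an invented action-to-request mapping; the resource names and request formats are hypothetical, loosely styled after the TOPS example:

```python
# Phase 1 yields an abstract plan: an ordered list of high-level actions.
# Phase 2 expands each action into concrete, resource-specific requests.
# Actions, resources and request strings here are invented stand-ins.
ABSTRACT_PLAN = ["acquire_soil_moisture", "run_drought_model"]

EXPANSIONS = {
    "acquire_soil_moisture": [
        {"resource": "TOPS", "request": "GET /data?var=soil_moisture"},
    ],
    "run_drought_model": [
        {"resource": "TOPS", "request": "POST /model/drought/run"},
        {"resource": "TOPS", "request": "GET /model/drought/result"},
    ],
}

def to_concrete(abstract_plan, expansions):
    """Expand abstract actions in order into executable requests."""
    concrete = []
    for action in abstract_plan:
        concrete.extend(expansions[action])   # keep the plan's ordering
    return concrete

plan = to_concrete(ABSTRACT_PLAN, EXPANSIONS)
print(len(plan))  # 3
```

In the described system the expansion table would not be hand-written; it would be derived from the service-layer descriptions of available sensor web resources.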
WebCrawler, the first comprehensive full-text search engine for the World-Wide Web, has played a fundamental role in making the Web easier to use for millions of people. Its invention and subsequent evolution, spanning a three-year period, helped fuel the Web's growth by creating a new way of navigating hypertext. Before search engines like WebCrawler, users found Web documents by following
HTML forms are the predominant interface between users and web applications. Many of these applications display a sequence of multiple forms on separate pages, for instance to book a flight or order a DVD. We introduce a method to wrap these multi-stepped forms and offer their individual functionality as a single consolidated Web Service. This Web Service in turn maps
Web content plays an increasingly important role in the knowledge-based society, and the preservation and long-term accessibility of Web history has high value (e.g., for scholarly studies, market analyses, intellectual property disputes, etc.). There is strongly growing interest in its preservation by libraries and archival organizations as well as emerging industrial services. Web content characteristics (high dynamics, volatility, contributor and
Thomas Risse; Julien Masanès; András A. Benczúr; Marc Spaniol
With the enormous amount of information presented on the web, the retrieval of relevant information has become a serious problem and has been a topic of research for the last few years. The most common tools to retrieve information from the web are search engines like Google. Search engines are usually based on keyword searching and indexing of web pages. This approach is not very efficient, as the result set of web pages obtained includes many irrelevant pages; sometimes even the entire result set may be irrelevant to the user. The next generation of search engines must address this problem. Recently, many semantic web search engines have been developed, like Ontolook and Swoogle, which help in searching meaningful documents presented on the semantic web. In this process the ranking of the retrieved web pages is crucial. Some attempts have been made at ranking semantic web pages, but the ranking of these semantic web documents is neither satisfactory nor up to users' expectations. In this paper we have proposed a semantic web based document ranking scheme that relies not only on the keywords b