Development of a Web-based financial application System
NASA Astrophysics Data System (ADS)
Hasan, M. R.; Ibrahimy, M. I.; Motakabber, S. M. A.; Ferdaus, M. M.; Khan, M. N. H.; Mostafa, M. G.
2013-12-01
The paper describes a technique for developing a web-based financial system that follows the latest technology and business needs. In developing a web-based application, both user friendliness and technology are important. The ASP.NET MVC 4 platform and SQL Server 2008 were used to develop the web-based financial system. The entry and report-monitoring parts of the application are shown to be user friendly. The paper also highlights critical situations in development, which will help in producing a quality product.
A Structural and Content-Based Analysis for Web Filtering.
ERIC Educational Resources Information Center
Lee, P. Y.; Hui, S. C.; Fong, A. C. M.
2003-01-01
Presents an analysis of the distinguishing features of pornographic Web pages so that effective filtering techniques can be developed. Surveys the existing techniques for Web content filtering and describes the implementation of a Web content filtering system that uses an artificial neural network. (Author/LRW)
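The abstract above describes filtering pornographic pages with an artificial neural network. As a minimal, hedged sketch of that idea (not the authors' actual system), the example below trains a single-layer perceptron on binary keyword features; the vocabulary and training samples are illustrative assumptions.

```python
# Minimal sketch of neural-network-based web content filtering.
# FEATURE_WORDS and the training samples are hypothetical, not from the paper.

FEATURE_WORDS = ["explicit", "adult", "news", "science"]  # assumed vocabulary

def features(text):
    """Binary bag-of-words feature vector over the fixed vocabulary."""
    words = text.lower().split()
    return [1.0 if w in words else 0.0 for w in FEATURE_WORDS]

def train_perceptron(samples, epochs=20, lr=0.5):
    """samples: list of (text, label) pairs, label 1 = block, 0 = allow."""
    w = [0.0] * len(FEATURE_WORDS)
    b = 0.0
    for _ in range(epochs):
        for text, label in samples:
            x = features(text)
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = label - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, text):
    x = features(text)
    return "block" if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else "allow"

training = [("explicit adult content", 1), ("science news daily", 0)]
w, b = train_perceptron(training)
```

A production filter would of course use a much larger feature set (the paper also analyses page structure), but the training loop has the same shape.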
Techniques for Enhancing Web-Based Education.
ERIC Educational Resources Information Center
Barbieri, Kathy; Mehringer, Susan
The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…
Development of high efficiency solar cells on silicon web
NASA Technical Reports Server (NTRS)
Rohatgi, A.; Meier, D. L.; Campbell, R. B.; Schmidt, D. N.; Rai-Choudhury, P.
1984-01-01
Web base material is being improved with the goal of obtaining solar cell efficiencies in excess of 18% (AM1). Carrier loss mechanisms in web silicon were investigated, techniques were developed to reduce carrier recombination in the web, and web cells were fabricated using effective surface passivation. The effect of stress on web cell performance was also investigated.
ERIC Educational Resources Information Center
Medina-Dominguez, Fuensanta; Sanchez-Segura, Maria-Isabel; Mora-Soto, Arturo; Amescua, Antonio
2010-01-01
The development of collaborative Web applications does not follow a software engineering methodology. This is because when university students study Web applications in general, and collaborative Web portals in particular, they are not being trained in the use of software engineering techniques to develop collaborative Web portals. This paper…
NASA Astrophysics Data System (ADS)
Sinha, Mukesh Kumar; Das, B. R.; Kumar, Kamal; Kishore, Brij; Prasad, N. Eswara
2017-06-01
The article reports a novel technique for functionalization of a nanoweb to develop ultraviolet (UV) radiation-protective fabric. The UV protection effect is produced by a combination of electrospinning and electrospraying techniques. A nanofibrous web of polyvinylidene difluoride (PVDF) coated on polypropylene nonwoven fabric is produced by the latest Nanospider technology. Subsequently, the web is functionalized with titanium dioxide (TiO2). The developed web is characterized for surface morphology and other functional properties: mechanical, chemical, crystalline and thermal. An optimal (judicious) nanofibre spinning condition is achieved and established. The produced web is uniformly coated with defect-free functional nanofibres in a continuous form of usable textile structural membrane for UV-protective clothing. This research initiative succeeds in the preparation and optimization of various nanowebs for UV protection. Field Emission Scanning Electron Microscope (FESEM) results reveal that the PVDF web's photo-degradative behavior is non-accelerated compared with normal polymeric-grade fibres. Functionalization with TiO2 has enhanced the photo-stability of the webs. The ultraviolet protection factors of the functionalized and non-functionalized nanowebs were empirically evaluated to be 65 and 24, respectively. The developed coated layer could be exploited for developing various defence, paramilitary and civilian UV-protective lightweight clothing (tents, covers and shelter segments, combat suits, snow-bound camouflaging nets). This research was therefore conducted in an attempt to develop a scientific understanding of PVDF fibre-coated webs for photo-degradation and applications in defence protective textiles. This laboratory-scale technological research could be translated into bulk production.
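The UPF values quoted above (65 and 24) are measured quantities. As a hedged illustration of how such a factor is conventionally computed (following the AS/NZS 4399 approach of weighting fabric transmittance by solar irradiance and the erythemal action spectrum), the sketch below uses made-up spectral samples; the irradiance, erythemal weights, and transmittance values are all illustrative assumptions.

```python
# UPF = sum(E*S*dlam) / sum(E*S*T*dlam), summed over UV wavelengths, where
# E is solar spectral irradiance, S the erythemal action spectrum, and
# T the fabric's spectral transmittance. All sample values are made up.

def upf(irradiance, erythemal, transmittance, dlam=5.0):
    num = sum(e * s * dlam for e, s in zip(irradiance, erythemal))
    den = sum(e * s * t * dlam
              for e, s, t in zip(irradiance, erythemal, transmittance))
    return num / den

E = [0.5, 1.0, 1.5]            # hypothetical irradiance samples (W m^-2 nm^-1)
S = [1.0, 0.5, 0.1]            # hypothetical erythemal weights
T_coated = [0.01, 0.02, 0.05]  # hypothetical 1-5% transmittance of a coated web
rating = upf(E, S, T_coated)
```

Lower transmittance in the erythemally weighted band drives the UPF up, which is why the TiO2-functionalized web rates far higher than the plain one.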
Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs
NASA Astrophysics Data System (ADS)
O'Connor, Rory V.
This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it examines the development processes used by SMEs in order to assess the degree to which usability is integrated into the process. This study seeks an understanding of the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications, through a series of case studies. The focus is on SMEs that develop web applications as Management Information Systems, not e-commerce sites, informational sites, online communities or web portals. This study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.
Silicon Web Process Development. [for solar cell fabrication
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Hopkins, R. H.; Mchugh, J. P.; Hill, F. E.; Heimlich, M. E.; Driggers, J. M.
1979-01-01
Silicon dendritic web, a ribbon form of silicon capable of fabrication into solar cells with greater than 15% AM1 conversion efficiency, was produced from the melt without die shaping. Improvements were made both in the width of the web ribbons grown and in the techniques to replenish the liquid silicon as it is transformed to web. By means of improved thermal shielding, stress was reduced sufficiently that web crystals nearly 4.5 cm wide were grown. The development of two subsystems, a silicon feeder and a melt level sensor, necessary to achieve an operational melt replenishment system, is described. A gas flow management technique is discussed, and a laser reflection method to sense and control the melt level as silicon is replenished is examined.
Applying Web Usage Mining for Personalizing Hyperlinks in Web-Based Adaptive Educational Systems
ERIC Educational Resources Information Center
Romero, Cristobal; Ventura, Sebastian; Zafra, Amelia; de Bra, Paul
2009-01-01
Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender engine is integrated into the AHA! system in…
WebSat--a web software for microsatellite marker development.
Martins, Wellington Santos; Lucas, Divino César Soares; Neves, Kelligton Fabricio de Souza; Bertioli, David John
2009-01-01
Simple sequence repeats (SSR), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed a simple-to-use web application, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet, requiring no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, visualization of microsatellites and the design of primers suitable for their amplification. The program allows full control of parameters and easy export of the resulting data, thus facilitating the development of microsatellite markers. The web tool may be accessed at http://purl.oclc.org/NET/websat/ PMID:19255650
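The core task WebSat automates — locating tandem repeats in a DNA sequence — can be sketched with a backreferencing regular expression. The minimum repeat counts below are illustrative assumptions; WebSat itself exposes such thresholds as user-controlled parameters.

```python
import re

def find_ssrs(seq, min_unit=1, max_unit=6, min_repeats=4):
    """Return (motif, repeat_count, start) for each perfect tandem repeat."""
    seq = seq.upper()
    hits = []
    for unit in range(min_unit, max_unit + 1):
        # (.{unit}) captures a candidate motif; \1{n,} requires it to
        # repeat in tandem at least (min_repeats - 1) more times.
        pattern = re.compile(r"(.{%d})\1{%d,}" % (unit, min_repeats - 1))
        for m in pattern.finditer(seq):
            motif = m.group(1)
            if len(set(motif)) == 1 and unit > 1:
                continue  # skip mononucleotide runs matched with a longer unit
            count = len(m.group(0)) // unit
            hits.append((motif, count, m.start()))
    return hits

# Two dinucleotide microsatellites: (AT)5 and (CA)5 in a short toy sequence.
hits = find_ssrs("GGATATATATATGGCACACACACATT", min_unit=2, min_repeats=5)
```

A full tool would additionally handle imperfect repeats and hand the flanking sequence to a primer-design step, as WebSat does.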
Testing Web Applications with Mutation Analysis
ERIC Educational Resources Information Center
Praphamontripong, Upsorn
2017-01-01
Web application software uses new technologies that have novel methods for integration and state maintenance that amount to new control flow mechanisms and new variables scoping. While modern web development technologies enhance the capabilities of web applications, they introduce challenges that current testing techniques do not adequately test…
Designing a Pedagogical Model for Web Engineering Education: An Evolutionary Perspective
ERIC Educational Resources Information Center
Hadjerrouit, Said
2005-01-01
In contrast to software engineering, which relies on relatively well established development approaches, there is a lack of a proven methodology that guides Web engineers in building reliable and effective Web-based systems. Currently, Web engineering lacks process models, architectures, suitable techniques and methods, quality assurance, and a…
Silicon Web Process Development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Hopkins, R. H.; Mchugh, J. P.; Hill, F. E.; Heimlich, M. E.; Driggers, J. M.
1978-01-01
Progress in the development of techniques to grow silicon web at a 25 sq cm/min output rate is reported. The feasibility of web growth with simultaneous melt replenishment is discussed. Other factors covered include: (1) tests of aftertrimmers to improve web width; (2) evaluation of growth lid designs to raise speed and output rate; (3) tests of melt replenishment hardware; and (4) investigation of directed gas flow systems to control unwanted oxide deposition in the system and to improve convective cooling of the web. Compatibility with sufficient solar cell performance is emphasized.
QoS measurement of workflow-based web service compositions using Colored Petri net.
Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra
2014-01-01
Workflow-based web service composition (WB-WSC) is one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the WB-WSC category. Due to the maturity of web services, measuring the quality of composite web services developed by different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide good quality, with respect to customers' requirements, in a composed web service. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure so that the quality level of a given web service composition can be determined. This paper presents a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPN) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.
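A hedged sketch of the QoS aggregation underlying such measurements: for a sequential workflow, response times add and reliabilities multiply; for a parallel split, the slowest branch dominates response time. These aggregation rules are standard in the QoS-composition literature; the paper's actual CPN-based model is more detailed.

```python
from math import prod

def qos_sequence(services):
    """Services executed one after another."""
    return {"response_time": sum(s["response_time"] for s in services),
            "reliability": prod(s["reliability"] for s in services)}

def qos_parallel(services):
    """All branches run concurrently; the slowest dominates response time."""
    return {"response_time": max(s["response_time"] for s in services),
            "reliability": prod(s["reliability"] for s in services)}

# Two hypothetical component services (times in ms).
a = {"response_time": 120, "reliability": 0.99}
b = {"response_time": 200, "reliability": 0.95}

seq = qos_sequence([a, b])  # both must succeed, one after the other
par = qos_parallel([a, b])  # both must succeed, concurrently
```

Nesting these operators over a workflow tree yields the composite QoS of the whole WB-WSC.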
An Expertise Recommender using Web Mining
NASA Technical Reports Server (NTRS)
Joshi, Anupam; Chandrasekaran, Purnima; ShuYang, Michelle; Ramakrishnan, Ramya
2001-01-01
This report explored techniques to mine the web pages of scientists to extract information regarding their expertise, build expertise chains and referral webs, and semi-automatically combine this information with directory information services to create a recommender system that permits query by expertise. The approach included experimenting with existing techniques reported in the research literature in the recent past, adapting them as needed. In addition, software tools were developed to capture and use this information.
Web Application Design Using Server-Side JavaScript
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hampton, J.; Simons, R.
1999-02-01
This document describes the application design philosophy for the Comprehensive Nuclear Test Ban Treaty Research & Development Web Site. This design incorporates object-oriented techniques to produce a flexible and maintainable system of applications that support the web site. These techniques will be discussed at length along with the issues they address. The overall structure of the applications and their relationships with one another will also be described. The current problems and future design changes will be discussed as well.
An efficient scheme for automatic web pages categorization using the support vector machine
NASA Astrophysics Data System (ADS)
Bhalla, Vinod Kumar; Kumar, Neeraj
2016-07-01
In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages from the Internet within a fraction of a second. To achieve this goal, an efficient categorization of web page contents is required. Manual categorization of these billions of web pages with high accuracy is a challenging task. Most of the existing techniques reported in the literature are semi-automatic, and higher levels of accuracy cannot be achieved with them. To achieve these goals, this paper proposes automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific and relevant features of the web pages. In the proposed scheme, extraction and evaluation of features are performed first, followed by filtering of the feature set for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page is developed in the proposed scheme. Feature extraction and weight assignment are based on a collection of domain-specific keywords developed by considering various domain pages. Moreover, the keyword list is reduced on the basis of the ids of keywords in the keyword list. Also, stemming of keywords and tag text is performed to achieve higher accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using a machine learning method, combining feature extraction and statistical analysis with a support vector machine kernel as the classification tool. The results obtained confirm the effectiveness of the proposed scheme in terms of its accuracy on different categories of web pages.
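The feature-extraction step — weighting domain keywords by the HTML context they appear in — can be sketched with the standard library alone. The tag weights and keyword list below are illustrative assumptions, not the paper's tuned values, and the SVM classification stage is omitted.

```python
from html.parser import HTMLParser

TAG_WEIGHTS = {"title": 3.0, "h1": 2.0}  # hypothetical weights; body text = 1.0
KEYWORDS = {"loan", "bank", "credit"}     # hypothetical finance-domain keyword list

class KeywordScorer(HTMLParser):
    """Accumulate a domain score: keyword hits weighted by enclosing tag."""
    def __init__(self):
        super().__init__()
        self.stack = []   # open-tag stack, so text knows its context
        self.score = 0.0

    def handle_starttag(self, tag, attrs):
        self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        weight = TAG_WEIGHTS.get(self.stack[-1], 1.0) if self.stack else 1.0
        for word in data.lower().split():
            if word.strip(".,!?") in KEYWORDS:
                self.score += weight

def domain_score(html):
    scorer = KeywordScorer()
    scorer.feed(html)
    return scorer.score

page = ("<html><title>Bank loan offers</title>"
        "<body>Apply for credit today.</body></html>")
```

In the full scheme, such scores over many keywords form the feature vector handed to the SVM.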
Users' Interaction with World Wide Web Resources: An Exploratory Study Using a Holistic Approach.
ERIC Educational Resources Information Center
Wang, Peiling; Hawk, William B.; Tenopir, Carol
2000-01-01
Presents results of a study that explores factors of user-Web interaction in finding factual information, develops a conceptual framework for studying user-Web interaction, and applies a process-tracing method for conducting holistic user-Web studies. Describes measurement techniques and proposes a model consisting of the user, interface, and the…
60. The World-Wide Inaccessible Web, Part 1: Browsing
ERIC Educational Resources Information Center
Baggaley, Jon; Batpurev, Batchuluun
2007-01-01
Two studies are reported, comparing the browser loading times of webpages created using common Web development techniques. The loading speeds were estimated in 12 Asian countries by members of the "PANdora" network, funded by the International Development Research Centre (IDRC) to conduct collaborative research in the development of…
Analysis of Web Spam for Non-English Content: Toward More Effective Language-Based Classifiers
Alsaleh, Mansour; Alarifi, Abdulrahman
2016-01-01
Web spammers aim to obtain higher ranks for their web pages by including spam contents that deceive search engines in order to include their pages in search results even when they are not related to the search terms. Search engines continue to develop new web spam detection mechanisms, but spammers also aim to improve their tools to evade detection. In this study, we first explore the effect of the page language on spam detection features and we demonstrate how the best set of detection features varies according to the page language. We also study the performance of Google Penguin, a newly developed anti-web spamming technique for their search engine. Using spam pages in Arabic as a case study, we show that unlike similar English pages, Google anti-spamming techniques are ineffective against a high proportion of Arabic spam pages. We then explore multiple detection features for spam pages to identify an appropriate set of features that yields a high detection accuracy compared with the integrated Google Penguin technique. In order to build and evaluate our classifier, as well as to help researchers to conduct consistent measurement studies, we collected and manually labeled a corpus of Arabic web pages, including both benign and spam pages. Furthermore, we developed a browser plug-in that utilizes our classifier to warn users about spam pages after clicking on a URL and by filtering out search engine results. Using Google Penguin as a benchmark, we provide an illustrative example to show that language-based web spam classifiers are more effective for capturing spam contents. PMID:27855179
Clinical software development for the Web: lessons learned from the BOADICEA project
Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F
2012-01-01
Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. 
BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389
Seeking health information on the web: positive hypothesis testing.
Kayhan, Varol Onur
2013-04-01
The goal of this study is to investigate positive hypothesis testing among consumers of health information when they search the Web. After demonstrating the extent of positive hypothesis testing in Experiment 1, we conducted Experiment 2 to test the effectiveness of two debiasing techniques. A total of 60 undergraduate students searched a tightly controlled online database, developed by the authors, to test the validity of a hypothesis. The database had four abstracts that confirmed the hypothesis and three abstracts that disconfirmed it. The findings of Experiment 1 showed that the majority of participants (85%) exhibited positive hypothesis testing. In Experiment 2, we found that the recommendation technique was not effective in reducing positive hypothesis testing, since none of the participants assigned to this server could retrieve disconfirming evidence. Experiment 2 also showed that the incorporation technique successfully reduced positive hypothesis testing, since 75% of the participants could retrieve disconfirming evidence. Positive hypothesis testing on the Web is an understudied topic. More studies are needed to validate the effectiveness of the debiasing techniques discussed in this study and to develop new techniques. Search engine developers should consider developing new options so that both confirming and disconfirming evidence can be presented in search results as users test hypotheses using search engines.
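The incorporation idea — making sure disconfirming evidence cannot be buried below confirming results — can be sketched as result interleaving. This is an illustrative reading of the technique, not the authors' implementation; the data model is an assumption.

```python
def interleave_results(confirming, disconfirming):
    """Alternate confirming and disconfirming abstracts; append leftovers.

    With pure relevance ranking, abstracts that confirm the queried
    hypothesis tend to dominate the top of the list; interleaving
    guarantees disconfirming evidence appears early.
    """
    merged = []
    for pair in zip(confirming, disconfirming):
        merged.extend(pair)
    longer = confirming if len(confirming) > len(disconfirming) else disconfirming
    merged.extend(longer[min(len(confirming), len(disconfirming)):])
    return merged

# Mirrors the study's database: four confirming, three disconfirming abstracts.
confirming = ["C1", "C2", "C3", "C4"]
disconfirming = ["D1", "D2", "D3"]
results = interleave_results(confirming, disconfirming)
```

Every user who reads even the first two results now sees one abstract from each side of the hypothesis.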
Developing the E-Delphi System: A Web-Based Forecasting Tool for Educational Research.
ERIC Educational Resources Information Center
Chou, Chien
2002-01-01
Discusses use of the Delphi technique and describes the development of an electronic version, called e-Delphi, in which questionnaire construction and communication with panel members was accomplished using the Web. Explains system function and interface and discusses evaluation of the e-Delphi system. (Author/LRW)
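Between rounds, a Delphi system typically feeds the panel's aggregate statistics back to each member so opinions can converge. The e-Delphi paper's exact feedback format is not specified here; the median-and-interquartile-range summary below is a conventional choice and an illustrative assumption.

```python
import statistics

def round_feedback(ratings):
    """Summarize one questionnaire item's panel ratings for the next round.

    ratings: list of numeric responses (e.g. 1-7 agreement scale).
    Returns the group median and interquartile range (IQR).
    """
    q = statistics.quantiles(ratings, n=4)  # quartiles (exclusive method)
    return {"median": statistics.median(ratings), "iqr": q[2] - q[0]}

# Hypothetical second-round ratings from an eight-member panel.
panel = [3, 4, 4, 5, 5, 5, 6, 7]
fb = round_feedback(panel)
```

A small IQR signals that the panel is converging on that item, which is the stopping criterion many Delphi studies use.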
NASA Astrophysics Data System (ADS)
Demir, I.
2013-12-01
Recent developments in web technologies make it easy to manage large data sets and to visualize them for the general public. Novel visualization techniques and dynamic user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The floodplain simulation system is a web-based 3D interactive flood simulation environment for creating real-world flooding scenarios. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create and modify predefined scenarios, control environmental parameters, and evaluate flood mitigation techniques. The web-based simulation system provides an environment in which children and adults can learn about flooding, flood damage, and the effects of development and human activity in the floodplain. The system provides various scenarios customized to fit the age and education level of the users. This presentation provides an overview of the web-based flood simulation system and demonstrates the capabilities of the system for various flooding and land use scenarios.
Web-based services for drug design and discovery.
Frey, Jeremy G; Bird, Colin L
2011-09-01
Reviews of the development of drug discovery through the 20th century recognised the importance of chemistry and, increasingly, bioinformatics, but had relatively little to say about the importance of computing, and networked computing in particular. However, the design and discovery of new drugs is arguably the most significant single application of bioinformatics and cheminformatics to have benefitted from the increases in the range and power of computational techniques since the emergence of the World Wide Web, now commonly referred to as simply 'the Web'. Web services have enabled researchers to access shared resources and to deploy standardized calculations in their search for new drugs. This article first considers the fundamental principles of Web services and workflows, and then explores the facilities and resources that have evolved to meet the specific needs of chem- and bio-informatics. This strategy leads to a more detailed examination of the basic components that characterise molecules and the essential predictive techniques, followed by a discussion of the emerging networked services that transcend the basic provisions, and the growing trend towards embracing modern techniques, in particular the Semantic Web. In the opinion of the authors, the issues that require community action are: increasing the amount of chemical data available for open access; validating the data as provided; and developing more efficient links between the worlds of cheminformatics and bioinformatics. The goal is to create ever better drug design services.
[Radiology information system using HTML, JavaScript, and Web server].
Sone, M; Sasaki, M; Oikawa, H; Yoshioka, K; Ehara, S; Tamakawa, Y
1997-12-01
We have developed a radiology information system using intranet techniques, including hypertext markup language, JavaScript, and Web server. JavaScript made it possible to develop an easy-to-use application, as well as to reduce network traffic and load on the server. The system we have developed is inexpensive and flexible, and its development and maintenance are much easier than with the previous system.
Developing a Multigenerational Creativity Website for Gifted and Talented Learners.
ERIC Educational Resources Information Center
Montgomery, Diane; Overton, Robert; Bull, Kay S.; Kimball, Sarah; Griffin, John
This paper discusses techniques and resources to use to stimulate creativity through a web site for several "generations" of gifted and talented learners. To organize a web site to stimulate creativity, two categories of development issues must be considered: intrinsic person variables, and process variables such as thinking skills,…
Sensor Web Dynamic Measurement Techniques and Adaptive Observing Strategies
NASA Technical Reports Server (NTRS)
Talabac, Stephen J.
2004-01-01
Sensor Web observing systems may have the potential to significantly improve our ability to monitor, understand, and predict the evolution of rapidly evolving, transient, or variable environmental features and events. This improvement will come about by integrating novel data collection techniques, new or improved instruments, emerging communications technologies and protocols, sensor mark-up languages, and interoperable planning and scheduling systems. In contrast to today's observing systems, "event-driven" sensor webs will synthesize real- or near-real time measurements and information from other platforms and then react by reconfiguring the platforms and instruments to invoke new measurement modes and adaptive observation strategies. Similarly, "model-driven" sensor webs will utilize environmental prediction models to initiate targeted sensor measurements or to use a new observing strategy. The sensor web concept contrasts with today's data collection techniques and observing system operations concepts where independent measurements are made by remote sensing and in situ platforms that do not share, and therefore cannot act upon, potentially useful complementary sensor measurement data and platform state information. This presentation describes NASA's view of event-driven and model-driven Sensor Webs and highlights several research and development activities at the Goddard Space Flight Center.
Large area sheet task: Advanced Dendritic Web Growth Development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Schruben, J.
1981-01-01
A melt level control system was implemented to provide stepless silicon feed rates from zero to rates exactly matching the silicon consumed during web growth. Bench tests of the unit were successfully completed and the system mounted in a web furnace for operational verification. Tests of long term temperature drift correction techniques were made; web width monitoring seems most appropriate for feedback purposes. A system to program the initiation of the web growth cycle was successfully tested. A low cost temperature controller was tested which functions as well as units four times as expensive.
Failure Analysis for Composition of Web Services Represented as Labeled Transition Systems
NASA Astrophysics Data System (ADS)
Nadkarni, Dinanath; Basu, Samik; Honavar, Vasant; Lutz, Robyn
The Web service composition problem involves the creation of a choreographer that provides the interaction between a set of component services to realize a goal service. Several methods have been proposed and developed to address this problem. In this paper, we consider those scenarios where the composition process may fail due to incomplete specification of goal service requirements or due to the fact that the user is unaware of the functionality provided by the existing component services. In such cases, it is desirable to have a composition algorithm that can provide feedback to the user regarding the cause of failure in the composition process. Such feedback will help guide the user to re-formulate the goal service and iterate the composition process. We propose a failure analysis technique for composition algorithms that views Web service behavior as multiple sequences of input/output events. Our technique identifies the possible cause of composition failure and suggests possible recovery options to the user. We discuss our technique using a simple e-Library Web service in the context of the MoSCoE Web service composition framework.
Design and Development of Basic Physical Layer WiMAX Network Simulation Models
2009-01-01
Wide Web. The third software version was developed during the period of 22 August to 4 November, 2008. The software version developed during the... researched on the Web. The mathematics of some fundamental concepts such as Fourier transforms and convolutional coding techniques were also reviewed... Mathworks Matlab users' website. A simulation model was found, entitled Estudio y Simulación de la capa física de la norma 802.16 (Sistema WiMAX), developed
Tethys: A Platform for Water Resources Modeling and Decision Support Apps
NASA Astrophysics Data System (ADS)
Nelson, J.; Swain, N. R.
2015-12-01
The interactive nature of web applications or "web apps" makes them an excellent medium for conveying complex scientific concepts to lay audiences and creating decision support tools that harness cutting-edge modeling techniques. However, the technical expertise required to develop web apps represents a barrier for would-be developers. This barrier can be characterized by the following hurdles that developers must overcome: (1) identify, select, and install software that meets the spatial and computational capabilities commonly required for water resources modeling; (2) orchestrate the use of multiple free and open source software (FOSS) projects and navigate their differing application programming interfaces; (3) learn the multi-language programming skills required for modern web development; and (4) develop a web-secure and fully featured web portal to host the app. Tethys Platform has been developed to lower the technical barrier and minimize the initial development investment that prohibits many scientists and engineers from making use of the web app medium. It includes (1) a suite of FOSS that addresses the unique data and computational needs common to water resources web app development, (2) a Python software development kit that streamlines development, and (3) a customizable web portal that is used to deploy the completed web apps. Tethys synthesizes several software projects including PostGIS, 52°North WPS, GeoServer, Google Maps™, OpenLayers, and Highcharts. It has been used to develop a broad array of web apps for water resources modeling and decision support for several projects including CI-WATER, HydroShare, and the National Flood Interoperability Experiment. The presentation will include live demos of some of the apps that have been developed using Tethys to demonstrate its capabilities.
Development of grid-like applications for public health using Web 2.0 mashup techniques.
Scotch, Matthew; Yip, Kevin Y; Cheung, Kei-Hoi
2008-01-01
Development of public health informatics applications often requires the integration of multiple data sources. This process can be challenging due to issues such as different file formats, schemas, naming systems, and having to scrape the content of web pages. A potential solution to these system development challenges is the use of Web 2.0 technologies. In general, Web 2.0 technologies are new internet services that encourage and value information sharing and collaboration among individuals. In this case report, we describe the development and use of Web 2.0 technologies including Yahoo! Pipes within a public health application that integrates animal, human, and temperature data to assess the risk of West Nile Virus (WNV) outbreaks. The results of development and testing suggest that while Web 2.0 applications are reasonable environments for rapid prototyping, they are not mature enough for large-scale public health data applications. The application, in fact a "system of systems," often failed due to varied timeouts for application response across web sites and services, internal caching errors, and software added to web sites by administrators to manage the load on their servers. In spite of these concerns, the results of this study demonstrate the potential value of grid computing and Web 2.0 approaches in public health informatics.
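The integration the abstract describes can be pictured as a small pipe-style merge. The sketch below is illustrative only (county names, thresholds, and feed shapes are hypothetical, not from the paper): two feeds keyed by county are joined, and counties exceeding both a dead-bird count and a temperature threshold are flagged as elevated WNV risk, in the spirit of a Yahoo! Pipes mashup.

```python
# Illustrative sketch of a pipe-style feed merge (all data hypothetical):
# join an animal-surveillance feed with a temperature feed by county and
# flag counties that exceed both thresholds.

def merge_feeds(bird_deaths, temperatures, death_threshold=5, temp_threshold=30.0):
    """Join two feeds keyed by county and return flagged counties, sorted."""
    risk = []
    for county, deaths in bird_deaths.items():
        temp = temperatures.get(county)  # tolerate counties missing from one feed
        if temp is not None and deaths >= death_threshold and temp >= temp_threshold:
            risk.append(county)
    return sorted(risk)

bird_deaths = {"Fairfield": 7, "Hartford": 2, "New Haven": 9}
temperatures = {"Fairfield": 31.5, "Hartford": 33.0, "New Haven": 24.0}
print(merge_feeds(bird_deaths, temperatures))  # ['Fairfield']
```

A real mashup would pull these feeds over HTTP in differing formats; the paper's point is that the join logic is simple, while the surrounding feed plumbing (timeouts, caching, rate limiting) is where such applications failed.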
NASA Astrophysics Data System (ADS)
Valentine, Andrew; Belski, Iouri; Hamilton, Margaret
2017-11-01
Problem-solving is a key engineering skill, yet is an area in which engineering graduates underperform. This paper investigates the potential of using web-based tools to teach students problem-solving techniques without the need to make use of class time. An idea generation experiment involving 90 students was designed. Students were surveyed about their study habits and reported they use electronic-based materials more than paper-based materials while studying, suggesting students may engage with web-based tools. Students then generated solutions to a problem task using either a paper-based template or an equivalent web interface. Students who used the web-based approach performed as well as students who used the paper-based approach, suggesting the technique can be successfully adopted and taught online. Web-based tools may therefore be adopted as supplementary material in a range of engineering courses as a way to increase students' options for enhancing problem-solving skills.
Creating a web-enhanced interactive preclinic technique manual: case report and student response.
Boberick, Kenneth G
2004-12-01
This article describes the development, use, and student response to an online manual developed with off-the-shelf software and made available using a web-based course management system (Blackboard) that was used to transform a freshman restorative preclinical technique course from a lecture-only course into an interactive web-enhanced course. The goals of the project were to develop and implement a web-enhanced interactive learning experience in a preclinical restorative technique course and shift preclinical education from a teacher-centered experience to a student-driven experience. The project was evaluated using an anonymous post-course survey (95 percent response rate) of 123 freshman students that assessed enabling (technical support and access to the technology), process (the actual experience and usability), and outcome criteria (acquisition and successful use of the knowledge gained and skills learned) of the online manual. Students responded favorably to sections called "slide galleries" where ideal and non-ideal examples of projects could be viewed. Causes, solutions, and preventive measures were provided for the errors shown. Sections called "slide series" provided cookbook directions allowing for self-paced and student-directed learning. Virtually all of the students, 99 percent, found the quality of the streaming videos adequate to excellent. Regarding Internet connections and video viewing, 65 percent of students successfully viewed the videos from a remote site; cable connections were the most reliable, dial-up connections were inadequate, and DSL connections were variable. Seventy-three percent of the students felt the videos were an effective substitute for in-class demonstrations. Students preferred video with sound over video with subtitles and preferred short video clips embedded in the text over compilation videos. 
The results showed it is possible to develop and implement web-enhanced and interactive dental education in a preclinical restorative technique course that successfully delivered information beyond the textual format.
NASA Astrophysics Data System (ADS)
Juanle, Wang; Shuang, Li; Yunqiang, Zhu
2005-10-01
In response to the requirements of the China National Scientific Data Sharing Program (NSDSP), a web-oriented RS Image Publication System (RSIPS) was researched and developed based on the Java Servlet technique. The RSIPS framework is designed in three tiers: a Presentation Tier, an Application Service Tier, and a Data Resource Tier. The Presentation Tier provides the user interface for data query, review, and download; for the convenience of users, a visual spatial query interface is included. Serving as the middle tier, the Application Service Tier controls all actions between users and databases. The Data Resource Tier stores RS images in file and relational databases. RSIPS is developed with cross-platform programming based on Java Servlet tools, one of the advanced techniques in the J2EE architecture. An RSIPS prototype has been developed and applied in the geosciences clearinghouse practice, which is among the experiment units of NSDSP in China.
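The three-tier separation the abstract describes can be sketched in a few classes. This is not the RSIPS code (which is Java Servlet based); it is an illustrative Python sketch with hypothetical names, showing the key design point: the middle tier mediates every interaction, so neither the user interface nor the data store talks to the other directly.

```python
# Illustrative three-tier sketch (all names hypothetical): the middle
# tier controls all actions between the presentation tier and the data tier.

class DataTier:
    """Data Resource Tier: holds image metadata records."""
    def __init__(self):
        self._images = {"img001": {"sensor": "Landsat", "year": 2004}}

    def find(self, image_id):
        return self._images.get(image_id)

class ApplicationServiceTier:
    """Middle tier: validates and routes every request to the data tier."""
    def __init__(self, data):
        self._data = data

    def query(self, image_id):
        record = self._data.find(image_id)
        if record is None:
            return {"status": "not found"}
        return {"status": "ok", "metadata": record}

class PresentationTier:
    """Presentation tier: renders query results for the user interface."""
    def __init__(self, service):
        self._service = service

    def show(self, image_id):
        result = self._service.query(image_id)
        return f"{image_id}: {result['status']}"

ui = PresentationTier(ApplicationServiceTier(DataTier()))
print(ui.show("img001"))  # img001: ok
```

Because each tier depends only on the one below it, the data store or the interface can be swapped out without touching the service logic, which is the portability argument the abstract makes for the J2EE layering.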
Development of high-efficiency solar cells on silicon web
NASA Technical Reports Server (NTRS)
Meier, D. L.
1986-01-01
Efforts to achieve higher efficiency cells by identifying carrier loss mechanisms, designing cell structures, and developing processing techniques are described. Techniques such as deep-level transient spectroscopy (DLTS), laser-beam-induced current (LBIC), and transmission electron microscopy (TEM) indicated that dislocations in web material, rather than twin planes, were primarily responsible for limiting diffusion lengths in the web. Diffusion lengths and cell efficiencies were improved from 19 to 120 micrometers and from 8 to 10.3% (no AR), respectively, by implanting hydrogen at 1500 eV and a beam current density of 2.0 mA/sq cm. Processing improvements included the use of a double-layer AR coating (ZnS and MgF2) and the addition of an aluminum back surface reflector. Cells of more than 16% efficiency were achieved.
Kain, Zeev N.; Fortier, Michelle A.; Chorney, Jill MacLaren; Mayes, Linda
2014-01-01
Background: Due to cost-containment efforts, preparation programs for outpatient surgery are currently not available to the majority of children and parents. The recent dramatic growth in the Internet presents a unique opportunity to transform how children and their parents are prepared for surgery. In this article we describe the development of a Web-based tailored preparation program for children and parents undergoing surgery (WebTIPS). Development of Program: A multidisciplinary taskforce agreed that a Web-based tailored intervention comprised of intake, matrix and output modules was the preferred approach. Next, the content of the various intake variables, the matrix logic and the output content was developed. The output product has a parent component and a child component and is described in http://surgerywebtips.com/about.php. The child component makes use of preparation strategies such as information provision, modeling, play and coping skills training. The parent component of WebTIPS includes strategies such as information provision, coping skills training, relaxation and distraction techniques. A reputable animation and Web-design company developed a secured Web-based product based on the above description. Conclusions: In this article we describe the development of a Web-based tailored preoperative preparation program that can be accessed by children and parents multiple times before and after surgery. A follow-up article in this issue of Anesthesia & Analgesia describes formative evaluation and preliminary efficacy testing of this Web-based tailored preoperative preparation program. PMID:25790212
Harper, Simon; Yesilada, Yeliz
2012-01-01
This is a technological review paper focussed on identifying both the research challenges and opportunities for further investigation arising from emerging technologies, and it does not aim to propose any recommendation or standard. It is focussed on blind and partially sighted World Wide Web (Web) users along with others who use assistive technologies. The Web is a fast moving interdisciplinary domain in which new technologies, techniques and research are in perpetual development. It is often difficult to maintain a holistic view of new developments within the multiple domains which together make up the Web. This suggests that knowledge of current developments and predictions of future developments are additionally important for the accessibility community. Web accessibility has previously been characterised by the correction of our past mistakes to make the current Web fulfil the original vision of access for all. New technologies were not designed with accessibility in mind, and technologies that could be useful for addressing accessibility issues were not identified or adopted by the accessibility community. We wish to enable the research community to undertake preventative measures and proactively address challenges, while recognising opportunities, before they become unpreventable or require retrospective technological enhancement. This article then reviews emerging trends within the Web and Web Accessibility domains.
Mining Longitudinal Web Queries: Trends and Patterns.
ERIC Educational Resources Information Center
Wang, Peiling; Berry, Michael W.; Yang, Yiheng
2003-01-01
Analyzed user queries submitted to an academic Web site during a four-year period, using a relational database, to examine users' query behavior, to identify problems they encounter, and to develop techniques for optimizing query analysis and mining. Linguistic analyses focus on query structures, lexicon, and word associations using statistical…
Adaptive Social Learning Based on Crowdsourcing
ERIC Educational Resources Information Center
Karataev, Evgeny; Zadorozhny, Vladimir
2017-01-01
Many techniques have been developed to enhance learning experience with computer technology. A particularly great influence of technology on learning came with the emergence of the web and adaptive educational hypermedia systems. While the web enables users to interact and collaborate with each other to create, organize, and share knowledge via…
An Intelligent Semantic E-Learning Framework Using Context-Aware Semantic Web Technologies
ERIC Educational Resources Information Center
Huang, Weihong; Webster, David; Wood, Dawn; Ishaya, Tanko
2006-01-01
Recent developments of e-learning specifications such as Learning Object Metadata (LOM), Sharable Content Object Reference Model (SCORM), Learning Design and other pedagogy research in semantic e-learning have shown a trend of applying innovative computational techniques, especially Semantic Web technologies, to promote existing content-focused…
NASA Astrophysics Data System (ADS)
Bártek, Luděk; Ošlejšek, Radek; Pitner, Tomáš
Recent development in the Web shows a significant trend towards more user participation, massive use of new devices including portables, and high interactivity. The user participation goes hand in hand with inclusion of all potential user groups, including those with special needs. However, we claim that despite all the effort towards accessibility, it has not yet found an appropriate reflection among stakeholders of the "Top Web Applications" nor their users. This leads to undesired consequences: the business-driven Web without full user participation is not a really democratic medium and, actually, does not comply with the original characteristics of Web 2.0. The paper tries to identify perspectives of further development, including standardization processes and the technical obstacles behind them. It also shows ways and techniques to cope with the challenge based on our own research and development in accessible graphics and dialog-based systems.
Using Web-Based Knowledge Extraction Techniques to Support Cultural Modeling
NASA Astrophysics Data System (ADS)
Smart, Paul R.; Sieck, Winston R.; Shadbolt, Nigel R.
The World Wide Web is a potentially valuable source of information about the cognitive characteristics of cultural groups. However, attempts to use the Web in the context of cultural modeling activities are hampered by the large-scale nature of the Web and the current dominance of natural language formats. In this paper, we outline an approach to support the exploitation of the Web for cultural modeling activities. The approach begins with the development of qualitative cultural models (which describe the beliefs, concepts and values of cultural groups), and these models are subsequently used to develop an ontology-based information extraction capability. Our approach represents an attempt to combine conventional approaches to information extraction with epidemiological perspectives of culture and network-based approaches to cultural analysis. The approach can be used, we suggest, to support the development of models providing a better understanding of the cognitive characteristics of particular cultural groups.
Panoramic-image-based rendering solutions for visualizing remote locations via the web
NASA Astrophysics Data System (ADS)
Obeysekare, Upul R.; Egts, David; Bethmann, John
2000-05-01
With advances in panoramic image-based rendering techniques and the rapid expansion of web advertising, new techniques are emerging for visualizing remote locations on the WWW. Success of these techniques depends on how easy and inexpensive it is to develop a new type of web content that provides pseudo 3D visualization at home, 24-hours a day. Furthermore, the acceptance of this new visualization medium depends on the effectiveness of the familiarization tools by a segment of the population that was never exposed to this type of visualization. This paper addresses various hardware and software solutions available to collect, produce, and view panoramic content. While cost and effectiveness of building the content is being addressed using a few commercial hardware solutions, effectiveness of familiarization tools is evaluated using a few sample data sets.
The GenTechnique Project: Developing an Open Environment for Learning Molecular Genetics.
ERIC Educational Resources Information Center
Calza, R. E.; Meade, J. T.
1998-01-01
The GenTechnique project at Washington State University uses a networked learning environment for molecular genetics learning. The project is developing courseware featuring animation, hyper-link controls, and interactive self-assessment exercises focusing on fundamental concepts. The first pilot course featured a Web-based module on DNA…
Web-Based Computational Chemistry Education with CHARMMing I: Lessons and Tutorial
Miller, Benjamin T.; Singh, Rishi P.; Schalk, Vinushka; Pevzner, Yuri; Sun, Jingjun; Miller, Carrie S.; Boresch, Stefan; Ichiye, Toshiko; Brooks, Bernard R.; Woodcock, H. Lee
2014-01-01
This article describes the development, implementation, and use of web-based “lessons” to introduce students and other newcomers to computer simulations of biological macromolecules. These lessons, i.e., interactive step-by-step instructions for performing common molecular simulation tasks, are integrated into the collaboratively developed CHARMM INterface and Graphics (CHARMMing) web user interface (http://www.charmming.org). Several lessons have already been developed with new ones easily added via a provided Python script. In addition to CHARMMing's new lessons functionality, web-based graphical capabilities have been overhauled and are fully compatible with modern mobile web browsers (e.g., phones and tablets), allowing easy integration of these advanced simulation techniques into coursework. Finally, one of the primary objections to web-based systems like CHARMMing has been that “point and click” simulation set-up does little to teach the user about the underlying physics, biology, and computational methods being applied. In response to this criticism, we have developed a freely available tutorial to bridge the gap between graphical simulation setup and the technical knowledge necessary to perform simulations without user interface assistance. PMID:25057988
Link Correlated Military Data for Better Decision Support
2011-06-01
automatically translated into URI-based links, thus greatly reducing the manpower cost of software development. 3 Linked Data Technique: Tim Berners-Lee... Linked Data - while Linked Data is usually considered as part of the Semantic Web, or "the Semantic Web done right" as Berners-Lee himself described it - has been... Required data of automatic link construction mechanism on more kinds of correlations. References [1] T. Berners-Lee, "The next Web of open, linked data"
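The core idea of translating correlated records into URI-based links can be sketched briefly. This is an illustrative example only (the base URI, entity kinds, and relation names are hypothetical, not from the report): each record key is minted into a URI, and a correlation between two records becomes an RDF-style subject-predicate-object triple.

```python
# Illustrative sketch of URI-based link construction (all URIs hypothetical):
# a correlation between two records is emitted as one RDF-style triple.

BASE = "http://example.mil/id/"

def to_uri(kind, key):
    """Mint a dereferenceable URI for an entity of a given kind."""
    return f"{BASE}{kind}/{key}"

def link(subject, predicate, obj):
    """Emit one triple of URIs for a correlation between two entities."""
    return (to_uri(*subject), f"{BASE}rel/{predicate}", to_uri(*obj))

triple = link(("unit", "alpha-6"), "locatedAt", ("base", "norfolk"))
print(triple)
# ('http://example.mil/id/unit/alpha-6',
#  'http://example.mil/id/rel/locatedAt',
#  'http://example.mil/id/base/norfolk')
```

Because every identifier follows one URI scheme, links between datasets can be generated mechanically from detected correlations, which is the manpower saving the report claims.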
Kain, Zeev N; Fortier, Michelle A; Chorney, Jill MacLaren; Mayes, Linda
2015-04-01
As a result of cost-containment efforts, preparation programs for outpatient surgery are currently not available to the majority of children and parents. The recent dramatic growth in the Internet presents a unique opportunity to transform how children and their parents are prepared for surgery. In this article, we describe the development of a Web-based Tailored Intervention for Preparation of parents and children undergoing Surgery (WebTIPS). A multidisciplinary taskforce agreed that a Web-based tailored intervention consisting of intake, matrix, and output modules was the preferred approach. Next, the content of the various intake variables, the matrix logic, and the output content was developed. The output product has a parent component and a child component and is described in http://surgerywebtips.com/about.php. The child component makes use of preparation strategies such as information provision, modeling, play, and coping skills training. The parent component of WebTIPS includes strategies such as information provision, coping skills training, and relaxation and distraction techniques. A reputable animation and Web design company developed a secured Web-based product based on the above description. In this article, we describe the development of a Web-based tailored preoperative preparation program that can be accessed by children and parents multiple times before and after surgery. A follow-up article in this issue of Anesthesia & Analgesia describes formative evaluation and preliminary efficacy testing of this Web-based tailored preoperative preparation program.
Ajax Architecture Implementation Techniques
NASA Astrophysics Data System (ADS)
Hussaini, Syed Asadullah; Tabassum, S. Nasira; Baig, Tabassum, M. Khader
2012-03-01
Today's rich Web applications use a mix of JavaScript and asynchronous communication with the application server. This mechanism is known as Ajax: Asynchronous JavaScript and XML. The intent of Ajax is to exchange small pieces of data between the browser and the application server and, in doing so, to use partial page refresh instead of reloading the entire Web page. Ajax is a powerful Web development model for browser-based Web applications. The technologies that form the Ajax model, such as XML, JavaScript, HTTP, and XHTML, are individually widely used and well known; Ajax combines them to let Web pages retrieve small amounts of data from the server without having to reload the entire page. This capability makes Web pages more interactive and lets them behave like local applications. Web 2.0, enabled by the Ajax architecture, has given rise to a new level of user interactivity through web browsers, and many new and extremely popular Web applications have been introduced, such as Google Maps, Google Docs, Flickr, and so on. Ajax toolkits such as Dojo allow web developers to build Web 2.0 applications quickly and with little effort.
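The essence of the pattern, exchanging a small delta and patching only the affected part of the page, can be simulated outside the browser. In this illustrative sketch (names and page shape hypothetical), the "page" is a dict and the "server response" a small JSON fragment; only the keys the server sends back change, which is the partial-refresh behavior Ajax provides.

```python
# Illustrative simulation of the Ajax partial-update pattern: a small
# JSON fragment from the server patches one region of the page model
# instead of rebuilding the whole page (all names hypothetical).

import json

page = {"header": "My App", "inbox_count": 3, "footer": "v1.0"}

def apply_partial_update(page, response_text):
    """Patch only the keys the server sent back; the rest stays untouched."""
    page.update(json.loads(response_text))
    return page

# A tiny server response updates a single region of the page.
server_response = json.dumps({"inbox_count": 4})
apply_partial_update(page, server_response)
print(page["inbox_count"])  # 4
print(page["header"])       # My App (unchanged)
```

In a browser the fragment would arrive via `XMLHttpRequest` (or today, `fetch`) and the patch would rewrite a DOM node; the economy is the same — a few bytes over the wire instead of a full page reload.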
Raising Awareness of Individual Creative Potential in Bioscientists Using a Web-Site Based Approach
ERIC Educational Resources Information Center
Adams, David J.; Hugh-Jones, Siobhan; Sutherland, Ed
2010-01-01
We report the preliminary results of work with a unique, web-site-based approach designed to help individual bioscientists identify and develop their individual creative capacity. The site includes a number of features that encourage individuals to interact with creativity techniques, communicate with colleagues remotely using an electronic notice…
Using Heuristic Task Analysis to Create Web-Based Instructional Design Theory
ERIC Educational Resources Information Center
Fiester, Herbert R.
2010-01-01
The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…
2014-05-01
developed techniques for building better IP geolocation systems. Geolocation has many applications, from presenting advertisements for local business establishments on web pages, to debugging network performance issues, to attributing attack traffic to... "Pennsylvania."
Adopting and adapting a commercial view of web services for the Navy
NASA Astrophysics Data System (ADS)
Warner, Elizabeth; Ladner, Roy; Katikaneni, Uday; Petry, Fred
2005-05-01
Web Services are being adopted as the enabling technology to provide net-centric capabilities for many Department of Defense operations. The Navy Enterprise Portal, for example, is Web Services-based, and the Department of the Navy is promulgating guidance for developing Web Services. Web Services, however, only constitute a baseline specification that provides the foundation on which users, under current approaches, write specialized applications in order to retrieve data over the Internet. Application development may increase dramatically as the number of different available Web Services increases. Reasons for specialized application development include XML schema versioning differences, adoption/use of diverse business rules, security access issues, and time/parameter naming constraints, among others. We are currently developing for the US Navy a system which will improve delivery of timely and relevant meteorological and oceanographic (MetOc) data to the warfighter. Our objective is to develop an Advanced MetOc Broker (AMB) that leverages Web Services technology to identify, retrieve and integrate relevant MetOc data in an automated manner. The AMB will utilize a Mediator, which will be developed by applying ontological research and schema matching techniques to MetOc forms of data. The AMB, using the Mediator, will support a new, advanced approach to the use of Web Services; namely, the automated identification, retrieval and integration of MetOc data. Systems based on this approach will then not require extensive end-user application development for each Web Service from which data can be retrieved. Users anywhere on the globe will be able to receive timely environmental data that fits their particular needs.
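The schema-matching role of the Mediator described above can be sketched compactly. This is an illustrative example, not the AMB implementation: the service names, field names, and mapping table are hypothetical. Two services that report the same environmental quantities under different parameter names are renamed into one canonical schema, so client code is written once rather than per service.

```python
# Illustrative schema-matching mediator sketch (all names hypothetical):
# map service-specific parameter names onto one canonical MetOc schema.

CANONICAL_MAP = {
    "service_a": {"sst_c": "sea_surface_temp", "wnd_ms": "wind_speed"},
    "service_b": {"seaSurfaceTemperature": "sea_surface_temp",
                  "windSpeed": "wind_speed"},
}

def mediate(service, record):
    """Rename a service-specific record into the canonical schema,
    dropping any fields the mapping does not know about."""
    mapping = CANONICAL_MAP[service]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

a = mediate("service_a", {"sst_c": 18.2, "wnd_ms": 7.5})
b = mediate("service_b", {"seaSurfaceTemperature": 18.4, "windSpeed": 7.1})
assert set(a) == set(b)  # both services now yield the same canonical keys
print(a)
```

A production mediator would also reconcile XML schema versions, units, and naming ontologies, which is where the abstract's ontological research and schema-matching techniques come in; the table-driven rename above is only the simplest case.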
Issues to Consider in Designing WebQuests: A Literature Review
ERIC Educational Resources Information Center
Kurt, Serhat
2012-01-01
A WebQuest is an inquiry-based online learning technique. This technique has been widely adopted in K-16 education. Therefore, it is important that conditions of effective WebQuest design are defined. Through this article the author presents techniques for improving WebQuest design based on current research. More specifically, the author analyzes…
Large-area sheet task advanced dendritic web growth development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.
1984-01-01
The thermal models used for analyzing dendritic web growth and calculating the thermal stress were reexamined to establish the validity limits imposed by the assumptions of the models. Also, the effects of thermal conduction through the gas phase were evaluated and found to be small. New growth designs, both static and dynamic, were generated using the modeling results. Residual stress effects in dendritic web were examined. In the laboratory, new techniques for the control of temperature distributions in three dimensions were developed. A new maximum undeformed web width of 5.8 cm was achieved. A 58% increase in growth velocity at 150 micrometer thickness was achieved with dynamic hardware. The area throughput goals for transient growth of 30 and 35 sq cm/min were exceeded.
Mallet, Cindy; Ilharreborde, Brice; Jehanno, Pascal; Litzelmann, Estelle; Valenti, Philippe; Mazda, Keyvan; Penneçot, Georges-François; Fitoussi, Franck
2013-03-01
Many commissural reconstruction techniques have been described for the treatment of syndactyly. This study is the first to compare long-term results of 2 commissural dorsal flap procedures (T-flap and omega-flap). Fifty-nine web-spaces in 39 patients, operated on between 1991 and 2008, were retrospectively analyzed. Thirty-six T-flap and 23 omega-flap procedures were performed using full-thickness skin graft in every case for digital resurfacing. Factors that could affect the long-term outcome were collected, including development of web-creep, clinodactyly, and flexion contracture. Patients were reviewed with a mean follow-up of 5 years and 8 months. Preoperative complexity of syndactyly influenced the development of clinodactyly and flexion contracture. Among the patients who developed clinodactyly, 96% had surgery for complex syndactyly. No difference was found between the 2 flap methods concerning digital deformation and mobility. However, web-creep occurred more frequently after T-flap than after omega-flap procedures (17% vs. 5%). The combination of either dorsal commissural T-flaps or omega-flaps with full-thickness graft to resurface digits is a reliable technique for the treatment of syndactyly with satisfactory functional and cosmetic results. Long-term results are not influenced by the type of flap. Nevertheless, the omega-flap technique, using 2 triangular lateral-palmar flaps, avoids use of skin graft to cover lateral-palmar aspects of the new commissure, consequently reducing the incidence of web-creep. In cases of syndactyly, the primary prognostic factor is whether the patient has simple or complex syndactyly. In complex syndactyly, the risk of long-term unfavorable results is higher. When complex complicated syndactyly is involved, postoperative complication rates increase. Level III.
Schrader, Ulrich; Tackenberg, Peter; Widmer, Rudolf; Portenier, Lucien; König, Peter
2007-01-01
To ease and speed up the translation of ICNP version 1 into German, a web service was developed to support the collaborative work of all Austrian, Swiss, and German translators and, subsequently, of the evaluators of the resulting translation. The web service supports a modified Delphi technique. Since the web service is multilingual by design, it can facilitate the translation of the ICNP into other languages as well. The process chosen can be adopted by other projects involved in translating terminologies.
Non-visual Web Browsing: Beyond Web Accessibility
Ramakrishnan, I.V.; Ashok, Vikas
2017-01-01
People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant content. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability. PMID:29202137
Non-visual Web Browsing: Beyond Web Accessibility.
Ramakrishnan, I V; Ashok, Vikas; Billah, Syed Masum
2017-07-01
People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant content. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability.
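One family of techniques such reviews cover is using the semantics of page content to skip boilerplate. As a minimal illustration (not taken from the paper), the HTML5 `<main>` landmark can be used to surface only the primary content a screen reader should start with; the sample page is invented:

```python
from html.parser import HTMLParser

class LandmarkTextExtractor(HTMLParser):
    """Collect text only from inside the <main> landmark, skipping the
    navigation and footer a screen-reader user would otherwise hear first."""
    def __init__(self):
        super().__init__()
        self.depth = 0      # nesting depth inside <main>
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag == "main":
            self.depth += 1
    def handle_endtag(self, tag):
        if tag == "main" and self.depth:
            self.depth -= 1
    def handle_data(self, data):
        if self.depth and data.strip():
            self.chunks.append(data.strip())

page = """<html><body>
<nav><a href='/'>Home</a> <a href='/news'>News</a></nav>
<main><h1>Storm warning</h1><p>High winds expected tonight.</p></main>
<footer>Copyright 2017</footer>
</body></html>"""

parser = LandmarkTextExtractor()
parser.feed(page)
main_text = " ".join(parser.chunks)
```

Real systems infer such regions even when landmarks are absent, which is where the semantic-analysis techniques surveyed in the paper come in.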
ERIC Educational Resources Information Center
Lee, Cynthia; Wong, Kelvin C. K.; Cheung, William K.; Lee, Fion S. L.
2009-01-01
The paper first describes a web-based essay critiquing system developed by the authors using latent semantic analysis (LSA), an automatic text analysis technique, to provide students with immediate feedback on content and organisation for revision whenever there is an internet connection. It reports on its effectiveness in enhancing adult EFL…
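Latent semantic analysis itself is not detailed in the excerpt, so the following is a generic sketch of the underlying technique: a term-document matrix reduced by truncated SVD, with cosine similarity in the latent space as the feedback signal. The toy corpus is invented:

```python
import numpy as np

def term_doc_matrix(docs):
    """Raw term-frequency matrix: one row per vocabulary word, one column per doc."""
    vocab = sorted({w for d in docs for w in d.split()})
    A = np.zeros((len(vocab), len(docs)))
    for j, d in enumerate(docs):
        for w in d.split():
            A[vocab.index(w), j] += 1
    return A

def lsa_doc_vectors(A, k=2):
    """Project documents into a k-dimensional latent semantic space via SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (np.diag(s[:k]) @ Vt[:k, :]).T   # one row per document

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Toy corpus: a reference text, an on-topic student sentence, and an
# off-topic sentence.
docs = ["clear structure improves the essay",
        "the essay needs clear structure",
        "bananas are yellow"]
vecs = lsa_doc_vectors(term_doc_matrix(docs))
```

A critiquing system can then score a student essay by its latent-space similarity to reference content: the on-topic pair scores higher than the off-topic pair.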
Instruction on the Web: The Online Student's Perspective.
ERIC Educational Resources Information Center
Mory, Edna Holland; Gambill, Lewis E.; Browning, J. Burton
The purpose of this study was to examine the experiences of two university graduate students while taking an online course over the World Wide Web, in order to identify issues of design, implementation, and motivation from a user's perspective. The online course was a graduate class on the methods and techniques of training and development. Data…
ERIC Educational Resources Information Center
Abdelaziz, Hamdy A.
2012-01-01
Web-based education is facing a paradigm shift under the rapid development of information and communication technology. The new paradigm of learning requires special techniques of course design, special instructional models, and special methods of evaluation. This paper investigates the effectiveness of an adaptive instructional strategy for…
Weber, Kristi; Story, Mary; Harnack, Lisa
2006-09-01
Americans are spending an increasing amount of time using "new media" like the Internet. There has been little research examining food and beverage Web sites' content and marketing practices, especially those that attract children and adolescents. The purpose of this study was to conduct a content analysis of food- and beverage-brand Web sites and the marketing techniques and advertising strategies present on these sites. The top five brands in eight food and beverage categories, 40 brands in total, were selected based on annual sales data from Brandweek magazine's annual "Superbrands" report. Data were collected using a standardized coding form. The results show a wide variety of Internet marketing techniques and advertising strategies targeting children and adolescents. "Advergaming" (games in which the advertised product is part of the game) was present on 63% of the Web sites. Half or more of the Web sites used cartoon characters (50%) or spokescharacters (55%), or had a specially designated children's area (58%) with a direct link from the homepage. With interactive media still in its developmental stage, there is a need to develop safeguards for children. Food and nutrition professionals need to advocate for responsible marketing techniques that will support the health of children.
Sylvester, B.D.; Zammit, K.; Fong, A.J.; Sabiston, C.M.
2017-01-01
Background Cancer centre Web sites can be a useful tool for distributing information about the benefits of physical activity for breast cancer (bca) survivors, and they hold potential for supporting health behaviour change. However, the extent to which cancer centre Web sites use evidence-based behaviour change techniques to foster physical activity behaviour among bca survivors is currently unknown. The aim of our study was to evaluate the presentation of behaviour-change techniques on Canadian cancer centre Web sites to promote physical activity behaviour for bca survivors. Methods All Canadian cancer centre Web sites (n = 39) were evaluated by two raters using the Coventry, Aberdeen, and London–Refined (calo-re) taxonomy of behaviour change techniques and the eEurope 2002 Quality Criteria for Health Related Websites. Descriptive statistics were calculated. Results The most common behaviour change techniques used on Web sites were providing information about consequences in general (80%), suggesting goal-setting behaviour (56%), and planning social support or social change (46%). Overall, Canadian cancer centre Web sites presented an average of M = 6.31 behaviour change techniques (of 40 that were coded) to help bca survivors increase their physical activity behaviour. Evidence of quality factors ranged from 90% (sites that provided evidence of readability) to 0% (sites that provided an editorial policy). Conclusions Our results provide preliminary evidence that, of 40 behaviour-change techniques that were coded, fewer than 20% were used to promote physical activity behaviour to bca survivors on cancer centre Web sites, and that the most effective techniques were inconsistently used. On cancer centre Web sites, health promotion specialists could focus on emphasizing knowledge mobilization efforts using available research into behaviour-change techniques to help bca survivors increase their physical activity. PMID:29270056
Sylvester, B D; Zammit, K; Fong, A J; Sabiston, C M
2017-12-01
Cancer centre Web sites can be a useful tool for distributing information about the benefits of physical activity for breast cancer (bca) survivors, and they hold potential for supporting health behaviour change. However, the extent to which cancer centre Web sites use evidence-based behaviour change techniques to foster physical activity behaviour among bca survivors is currently unknown. The aim of our study was to evaluate the presentation of behaviour-change techniques on Canadian cancer centre Web sites to promote physical activity behaviour for bca survivors. All Canadian cancer centre Web sites (n = 39) were evaluated by two raters using the Coventry, Aberdeen, and London-Refined (calo-re) taxonomy of behaviour change techniques and the eEurope 2002 Quality Criteria for Health Related Websites. Descriptive statistics were calculated. The most common behaviour change techniques used on Web sites were providing information about consequences in general (80%), suggesting goal-setting behaviour (56%), and planning social support or social change (46%). Overall, Canadian cancer centre Web sites presented an average of M = 6.31 behaviour change techniques (of 40 that were coded) to help bca survivors increase their physical activity behaviour. Evidence of quality factors ranged from 90% (sites that provided evidence of readability) to 0% (sites that provided an editorial policy). Our results provide preliminary evidence that, of 40 behaviour-change techniques that were coded, fewer than 20% were used to promote physical activity behaviour to bca survivors on cancer centre Web sites, and that the most effective techniques were inconsistently used. On cancer centre Web sites, health promotion specialists could focus on emphasizing knowledge mobilization efforts using available research into behaviour-change techniques to help bca survivors increase their physical activity.
Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout RH; Stewart-Knox, Barbara J; Mathers, John C
2018-01-01
Background To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. Objective The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. Methods The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype–based, and intake+phenotype+gene–based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Results Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of communication. 
Smoking cessation texts were adapted for dietary use where necessary. A posteriori, a further 9 techniques were included. Examination of excluded items highlighted the distinction between techniques considered appropriate for face-to-face versus internet-based delivery. Conclusions The use of existing taxonomies facilitated the description and standardization of techniques used in Food4Me. We recommend that for complex studies of this nature, technique analysis should be conducted a priori to develop standardized procedures and training and reviewed a posteriori to audit the techniques actually adopted. The present framework description makes a valuable contribution to future systematic reviews and meta-analyses that explore technique efficacy and underlying psychological constructs. This was a novel application of the behavior change taxonomies and was the first internet-based personalized nutrition intervention to use such a framework remotely. Trial Registration ClinicalTrials.gov NCT01530139; https://clinicaltrials.gov/ct2/show/NCT01530139 (Archived by WebCite at http://www.webcitation.org/6y8XYUft1) PMID:29631993
Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.
2009-01-01
Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select the appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool is being developed by USGS. The NYM tool is a web application that uses watershed yield estimates from the USGS SPAtially Referenced Regressions On Watershed attributes (SPARROW) model (Schwarz et al., 2006) [6] to allow water quality managers to identify important sources of nitrogen and phosphorus within the Chesapeake Bay watershed. The NYM tool utilizes new open source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high and low nutrient yield areas; web map engines; and data visualization and data management techniques.
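The "identifying high and low nutrient yield areas" analysis described above reduces to filtering and ranking per-catchment yields. A minimal sketch with invented records and field names (not actual SPARROW output):

```python
# Hypothetical catchment records; ids, sources, and values are illustrative.
catchments = [
    {"id": "C01", "nutrient": "nitrogen", "source": "agriculture", "yield_kg_per_km2": 1250.0},
    {"id": "C02", "nutrient": "nitrogen", "source": "urban", "yield_kg_per_km2": 430.0},
    {"id": "C03", "nutrient": "phosphorus", "source": "agriculture", "yield_kg_per_km2": 95.0},
    {"id": "C04", "nutrient": "nitrogen", "source": "agriculture", "yield_kg_per_km2": 2100.0},
]

def high_yield_areas(records, nutrient, threshold):
    """Catchment ids whose yield for the given nutrient exceeds the
    threshold, highest first: the kind of query a manager would run
    to target restoration effort."""
    hits = [r for r in records
            if r["nutrient"] == nutrient and r["yield_kg_per_km2"] > threshold]
    hits.sort(key=lambda r: r["yield_kg_per_km2"], reverse=True)
    return [r["id"] for r in hits]
```

In the actual NYM tool the result set would be rendered as a map layer (e.g., via GeoServer and OpenLayers) rather than returned as a list.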
New Interfaces to Web Documents and Services
NASA Technical Reports Server (NTRS)
Carlisle, W. H.
1996-01-01
This paper reports on investigations into how to extend the capabilities of the Virtual Research Center (VRC) for NASA's Advanced Concepts Office. The work was performed as part of NASA's 1996 Summer Faculty Fellowship program and involved research into, and prototype development of, software components that provide documents and services for the World Wide Web (WWW). The WWW has become a de facto standard for sharing resources over the internet, primarily because web browsers are freely available for the most common hardware platforms and their operating systems. As a consequence of the popularity of the internet, tools and techniques associated with web browsers are changing rapidly. New capabilities are offered by companies that support web browsers in order to achieve or remain a dominant participant in internet services. Because a goal of the VRC is to build an environment for NASA centers, universities, and industrial partners to share information associated with Advanced Concepts Office activities, the VRC tracks new techniques and services associated with the web in order to determine their usefulness for distributed and collaborative engineering research activities. Most recently, Java has emerged as a new tool for providing internet services. Because the major web browser providers have decided to include Java in their software, investigations into Java were conducted this summer.
The Secret to Patron-Centered Web Design: Cheap, Easy, and Powerful Usability Techniques
ERIC Educational Resources Information Center
Reynolds, Erica
2008-01-01
When the Johnson County Library (JoCo Library), a midsize suburban public library in Kansas City, Kansas, completely rebuilt its 2,000-plus-page website--one that had remained relatively stable for almost 5 years--the reaction from patrons and staff members was overwhelmingly positive. The web development team at the JoCo Library realized that the…
ERIC Educational Resources Information Center
O'Connell, Timothy S.; Dyment, Janet E.
2016-01-01
Encouraging reflective practice and developing reflective practitioners is a goal of many disciplines in higher education. A variety of pedagogical techniques have been used to promote critical reflection including portfolios, narratives and reflective journals. Over the past decade, the use of Web 2.0 technologies with students has been…
The Formative Evaluation of a Web-based Course-Management System within a University Setting.
ERIC Educational Resources Information Center
Maslowski, Ralf; Visscher, Adrie J.; Collis, Betty; Bloemen, Paul P. M.
2000-01-01
Discussion of Web-based course management systems (W-CMSs) in higher education focuses on formative evaluation and its contribution in the design and development of high-quality W-CMSs. Reviews methods and techniques that can be applied in formative evaluation and examines TeLeTOP, a W-CMS produced at the University of Twente (Netherlands). (LRW)
Henry, Anna E; Story, Mary
2009-01-01
To identify food and beverage brand Web sites featuring designated children's areas, assess marketing techniques present on those industry Web sites, and determine nutritional quality of branded food items marketed to children. Systematic content analysis of food and beverage brand Web sites and nutrient analysis of food and beverages advertised on these Web sites. The World Wide Web. One-hundred thirty Internet Web sites of food and beverage brands with top media expenditures based on the America's Top 2000 Brands section of Brandweek magazine's annual "Superbrands" report. A standardized content analysis rating form to determine marketing techniques used on the food and beverage brand Web sites. Nutritional analysis of food brands was conducted. Of 130 Web sites analyzed, 48% featured designated children's areas. These Web sites featured a variety of Internet marketing techniques, including advergaming on 85% of the Web sites and interactive programs on 92% of the Web sites. Branded spokescharacters and tie-ins to other products were featured on the majority of the Web sites, as well. Few food brands (13%) with Web sites that market to children met the nutrition criteria set by the National Alliance for Nutrition and Activity. Nearly half of branded Web sites analyzed used designated children's areas to market food and beverages to children, 87% of which were of low nutritional quality. Nutrition professionals should advocate the use of advertising techniques to encourage healthful food choices for children.
NASA Astrophysics Data System (ADS)
Utanto, Yuli; Widhanarto, Ghanis Putra; Maretta, Yoris Adi
2017-03-01
This study aims to develop a web-based portfolio model and to assess the effectiveness of the new model in experiments conducted with respondents in the Department of Curriculum and Educational Technology, FIP Unnes. In particular, the research objectives to be achieved through this development research were: (1) describing the process of implementing a portfolio in a web-based model; and (2) assessing the effectiveness of the web-based portfolio model for the final task, especially in Web-Based Learning courses. This research follows the development-research approach of Borg and Gall (2008: 589), who state that "educational research and development (R & D) is a process used to develop and validate educational products". The series of research and development activities began with exploration and conceptual studies, followed by testing and evaluation, and then implementation. For the data analysis, simple descriptive analysis and an analysis of learning completeness were used, followed by prerequisite tests for normality and homogeneity before performing a t-test. Based on the data analysis, it was concluded that: (1) the web-based portfolio model can be applied to the learning process in higher education; and (2) the model is effective: in the large-group (field) trial, the number of respondents who reached learning mastery (a score of 60 and above) was 24 (92.3%). The implication of this development research is that subsequent researchers can use the resulting model-development guidelines to develop the model for other subjects.
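The "analysis of learning completeness" reported above reduces to the proportion of respondents at or above a mastery threshold. A minimal sketch with invented scores, sized to reproduce the reported aggregate (24 of 26 respondents at or above 60, i.e., 92.3%):

```python
# Hypothetical final-task scores for 26 respondents; only the aggregate
# (24 at or above the threshold) comes from the paper.
scores = [61] * 10 + [75] * 14 + [48, 55]

def mastery_rate(scores, threshold=60):
    """Percentage of respondents reaching mastery, rounded to one decimal."""
    passed = sum(1 for s in scores if s >= threshold)
    return round(100 * passed / len(scores), 1)
```

The paper's effectiveness criterion is then simply whether this percentage clears a preset cutoff.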
Search Techniques for the Web of Things: A Taxonomy and Survey.
Zhou, Yuchao; De, Suparna; Wang, Wei; Moessner, Klaus
2016-04-27
The Web of Things aims to make physical world objects and their data accessible through standard Web technologies to enable intelligent applications and sophisticated data analytics. Due to the amount and heterogeneity of the data, it is challenging to perform data analysis directly, especially when the data is captured from a large number of distributed sources. However, the size and scope of the data can be reduced and narrowed down with search techniques, so that only the most relevant and useful data items are selected according to the application requirements. Search is fundamental to the Web of Things, yet inherently challenging in this context, e.g., mobility of the objects, opportunistic presence and sensing, continuous data streams with changing spatial and temporal properties, and efficient indexing for historical and real-time data. The research community has developed numerous techniques and methods to tackle these problems, as reported by a large body of literature in the last few years. A comprehensive investigation of the current and past studies is necessary to gain a clear view of the research landscape and to identify promising future directions. This survey reviews the state-of-the-art search methods for the Web of Things, which are classified according to three different viewpoints: basic principles, data/knowledge representation, and contents being searched. Experiences and lessons learned from the existing work and some EU research projects related to the Web of Things are discussed, and an outlook to future research is presented.
Search Techniques for the Web of Things: A Taxonomy and Survey
Zhou, Yuchao; De, Suparna; Wang, Wei; Moessner, Klaus
2016-01-01
The Web of Things aims to make physical world objects and their data accessible through standard Web technologies to enable intelligent applications and sophisticated data analytics. Due to the amount and heterogeneity of the data, it is challenging to perform data analysis directly, especially when the data is captured from a large number of distributed sources. However, the size and scope of the data can be reduced and narrowed down with search techniques, so that only the most relevant and useful data items are selected according to the application requirements. Search is fundamental to the Web of Things, yet inherently challenging in this context, e.g., mobility of the objects, opportunistic presence and sensing, continuous data streams with changing spatial and temporal properties, and efficient indexing for historical and real-time data. The research community has developed numerous techniques and methods to tackle these problems, as reported by a large body of literature in the last few years. A comprehensive investigation of the current and past studies is necessary to gain a clear view of the research landscape and to identify promising future directions. This survey reviews the state-of-the-art search methods for the Web of Things, which are classified according to three different viewpoints: basic principles, data/knowledge representation, and contents being searched. Experiences and lessons learned from the existing work and some EU research projects related to the Web of Things are discussed, and an outlook to future research is presented. PMID:27128918
Digital Mapping Techniques '11–12 workshop proceedings
Soller, David R.
2014-01-01
At these meetings, oral and poster presentations and special discussion sessions emphasized: (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase formats; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
ERIC Educational Resources Information Center
Chen, Hsinchun
2003-01-01
Discusses information retrieval techniques used on the World Wide Web. Topics include machine learning in information extraction; relevance feedback; information filtering and recommendation; text classification and text clustering; Web mining, based on data mining techniques; hyperlink structure; and Web size. (LRW)
A case study of data integration for aquatic resources using semantic web technologies
Gordon, Janice M.; Chkhenkeli, Nina; Govoni, David L.; Lightsom, Frances L.; Ostroff, Andrea C.; Schweitzer, Peter N.; Thongsavanh, Phethala; Varanka, Dalia E.; Zednik, Stephan
2015-01-01
Use cases, information modeling, and linked data techniques are Semantic Web technologies used to develop a prototype system that integrates scientific observations from four independent USGS and cooperator data systems. The techniques were tested with a use case goal of creating a data set for use in exploring potential relationships among freshwater fish populations and environmental factors. The resulting prototype extracts data from the BioData Retrieval System, the Multistate Aquatic Resource Information System, the National Geochemical Survey, and the National Hydrography Dataset. A prototype user interface allows a scientist to select observations from these data systems and combine them into a single data set in RDF format that includes explicitly defined relationships and data definitions. The project was funded by the USGS Community for Data Integration and undertaken by the Community for Data Integration Semantic Web Working Group in order to demonstrate use of Semantic Web technologies by scientists. This allows scientists to simultaneously explore data that are available in multiple, disparate systems beyond those they traditionally have used.
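The integration pattern in the abstract, pulling records from independent systems and recasting them as RDF-style triples that share subject identifiers, can be sketched without any Semantic Web library. The system names, fields, values, and namespace below are invented stand-ins for the BioData and geochemistry sources:

```python
# Invented mini-records standing in for two independent source systems.
biodata = [{"site": "S1", "species": "brook_trout", "count": 14}]
geochem = [{"site": "S1", "element": "Zn", "ppm": 88.0},
           {"site": "S2", "element": "Zn", "ppm": 12.0}]

EX = "http://example.org/"   # hypothetical namespace for URIs

def records_to_triples(records, subject_key, ns):
    """Recast flat records as (subject, predicate, object) triples whose
    subject URI is shared across systems, enabling a join by identity."""
    triples = set()
    for rec in records:
        subj = ns + "site/" + rec[subject_key]
        for key, value in rec.items():
            if key != subject_key:
                triples.add((subj, ns + key, value))
    return triples

# Merging the two graphs is just a set union; everything known about a
# site from either system is now reachable from one subject URI.
graph = records_to_triples(biodata, "site", EX) | records_to_triples(geochem, "site", EX)
about_s1 = {t for t in graph if t[0] == EX + "site/S1"}
```

In the actual prototype this role is played by RDF serializations with formally defined vocabularies, but the join-by-shared-URI mechanism is the same.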
Nondestructive web thickness measurement of micro-drills with an integrated laser inspection system
NASA Astrophysics Data System (ADS)
Chuang, Shui-Fa; Chen, Yen-Chung; Chang, Wen-Tung; Lin, Ching-Chih; Tarng, Yeong-Shin
2010-09-01
Nowadays, the electronics and semiconductor industries use numerous micro-drills to machine micro-holes in printed circuit boards. The measurement of the web thickness of micro-drills, a key parameter of micro-drill geometry influencing drill rigidity and chip-removal ability, is quite important for quality control. Traditionally, an inefficient, destructive measuring method is adopted by inspectors. To improve the quality and efficiency of web thickness measuring tasks, a nondestructive measuring method is required. In this paper, based on the laser micro-gauge (LMG) and laser confocal displacement meter (LCDM) techniques, a nondestructive measuring principle for the web thickness of micro-drills is introduced. An integrated laser inspection system, mainly consisting of an LMG, an LCDM and a two-axis-driven micro-drill fixture device, was developed. Experiments meant to inspect the web thickness of micro-drill samples with a nominal diameter of 0.25 mm were conducted to test the feasibility of the developed laser inspection system. The experimental results showed that the web thickness measurement could achieve an estimated repeatability of ±1.6 μm and a worst repeatability of ±7.5 μm. The developed laser inspection system, combined with the nondestructive measuring principle, was able to undertake the web thickness measuring tasks for certain micro-drills.
ERIC Educational Resources Information Center
Valentine, Andrew; Belski, Iouri; Hamilton, Margaret
2017-01-01
Problem-solving is a key engineering skill, yet is an area in which engineering graduates underperform. This paper investigates the potential of using web-based tools to teach students problem-solving techniques without the need to make use of class time. An idea generation experiment involving 90 students was designed. Students were surveyed…
The design and implementation of web mining in web sites security
NASA Astrophysics Data System (ADS)
Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li
2003-06-01
Backdoors or information leaks in Web servers can be detected by applying Web mining techniques to abnormal Web log and Web application log data, enhancing the security of Web servers and avoiding the damage of illegal access. Firstly, a system for discovering patterns of information leakage in CGI scripts from Web log data is proposed. Secondly, these patterns are provided to system administrators so that they can modify their code and enhance their Web site security. The following aspects are described: one is combining Web application logs with Web logs to extract more information, so that Web data mining can discover information in the logs that a firewall or an intrusion detection system cannot find; another is an operation module for the Web site that enhances its security. In cluster server sessions, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.
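The density-based clustering step can be sketched with a minimal DBSCAN over per-session log features. The features chosen here (request rate, error ratio) and the eps/min_pts thresholds are illustrative assumptions, not the paper's actual parameters:

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one cluster label per point; -1 marks noise."""
    n = len(points)
    labels = [None] * n
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    neighbours = lambda i: [j for j in range(n) if dist(points[i], points[j]) <= eps]
    cluster = 0
    for i in range(n):
        if labels[i] is not None:
            continue
        seeds = neighbours(i)
        if len(seeds) < min_pts:
            labels[i] = -1            # not dense enough: noise (for now)
            continue
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:                  # expand the cluster outward
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster   # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbours(j)
            if len(nb) >= min_pts:    # core point: keep expanding
                queue.extend(nb)
        cluster += 1
    return labels

# Illustrative session features: (requests per minute, error ratio)
sessions = [(1.0, 0.0), (1.1, 0.05), (0.9, 0.02),   # normal browsing
            (40.0, 0.9), (42.0, 0.85),              # dense burst of failing requests
            (90.0, 0.1)]                            # isolated outlier
labels = dbscan(sessions, eps=5.0, min_pts=2)
```

Sessions falling in a dense cluster of abnormal feature values (the burst) stand out from both normal traffic and isolated noise.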
Density-based parallel skin lesion border detection with webCL
Lemon, James; Kockara, Sinan; Halic, Tansel; Mete, Mutlu
2015-01-01
Background Dermoscopy is a highly effective and noninvasive imaging technique used in the diagnosis of melanoma and other pigmented skin lesions. Many aspects of the lesion under consideration are defined in relation to the lesion border. This makes border detection one of the most important steps in dermoscopic image analysis. In current practice, dermatologists often delineate borders through a hand-drawn representation based upon visual inspection. Due to the subjective nature of this technique, intra- and inter-observer variations are common. Because of this, the automated assessment of lesion borders in dermoscopic images has become an important area of study. Methods A fast density-based skin lesion border detection method has been implemented in parallel with a new technology called WebCL. WebCL utilizes client-side computing capabilities to use available hardware resources such as multi-core CPUs and GPUs, so the developed WebCL-parallel density-based border detection method runs efficiently from Web browsers. Results Previous research indicates that one of the highest accuracy rates can be achieved using density-based clustering techniques for skin lesion border detection. While these algorithms have unfavorable time complexities, this effect can be mitigated by a parallel implementation. In this study, a density-based clustering technique for skin lesion border detection is parallelized and redesigned to run very efficiently on heterogeneous platforms (e.g. tablets, smartphones, multi-core CPUs, GPUs, and fully integrated Accelerated Processing Units) by transforming the technique into a series of independent concurrent operations. Heterogeneous computing is adopted to support accessibility, portability, and multi-device use in clinical settings. For this, we used WebCL, an emerging technology that enables an HTML5 Web browser to execute code in parallel on heterogeneous platforms. We describe WebCL and our parallel algorithm design.
In addition, we tested the parallel code on 100 dermoscopy images and measured the execution speedups with respect to the serial version. Results indicate that the parallel (WebCL) and serial versions of the density-based lesion border detection method generate the same accuracy rates for the 100 dermoscopy images: the mean border error is 6.94%, the mean recall is 76.66%, and the mean precision is 99.29%. Moreover, the WebCL version's speedup factor for lesion border detection on the 100 dermoscopy images averages around ~491.2. Conclusions When the large number of high-resolution dermoscopy images handled in a typical clinical setting is considered, along with the critical importance of detecting and diagnosing melanoma before metastasis, the importance of processing dermoscopy images quickly becomes obvious. In this paper, we introduce WebCL and its use for biomedical image processing applications. WebCL is a JavaScript binding of OpenCL that takes advantage of GPU computing from a Web browser. The WebCL-parallel version of density-based skin lesion border detection introduced in this study can therefore supplement expert dermatologists and aid them in the early diagnosis of skin lesions. While WebCL is currently an emerging technology, full adoption of WebCL into the HTML5 standard would allow this implementation to run on a very large set of hardware and software systems. WebCL takes full advantage of parallel computational resources, including multi-core CPUs and GPUs on a local machine, and allows compiled code to run directly from the Web browser. PMID:26423836
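Accuracy figures like those above (border error, recall, precision) can be computed by comparing an automatic lesion mask against a manual one. The sketch below uses the common XOR-based definition of border error, which may differ in detail from the paper's:

```python
def border_metrics(auto, manual):
    """Compare an automatic lesion mask against a manual (ground-truth) one.
    Masks are sets of pixel coordinates. Border error is the XOR area over
    the manual area, a common definition in the lesion segmentation
    literature; precision and recall come from the usual overlap counts."""
    tp = len(auto & manual)          # pixels both masks agree are lesion
    xor = len(auto ^ manual)         # pixels where the masks disagree
    return {
        "border_error": 100.0 * xor / len(manual),
        "precision": 100.0 * tp / len(auto),
        "recall": 100.0 * tp / len(manual),
    }

manual = {(x, y) for x in range(10) for y in range(10)}   # 100-pixel square
auto = {(x, y) for x in range(10) for y in range(9)}      # misses one row
m = border_metrics(auto, manual)
```

An automatic mask that undersegments (as here) shows high precision but reduced recall, matching the pattern reported in the abstract.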
Stockburger, D W
1999-05-01
Active Server Pages permit a software developer to customize the Web experience for users by inserting server-side script and database access into Web pages. This paper describes applications of these techniques and provides a primer on their use. Applications include a system that generates and grades individualized homework assignments and tests for statistics students. The student accesses the system as a Web page, prints out the assignment, does the assignment, and enters the answers on the Web page. The server, running on NT Server 4.0, grades the assignment, updates the grade book (in a database), and returns the answer key to the student.
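The generate-and-grade idea can be sketched in a few lines: seed a random number generator with the student ID so the server can regenerate the exact same individualized data set when grading. The data format and tolerance here are hypothetical, not the system's actual design:

```python
import random
import statistics

def make_assignment(student_id, n=8):
    """Generate an individualized data set from the student ID, so each
    student gets a different but reproducible problem."""
    rng = random.Random(student_id)
    return [rng.randint(10, 99) for _ in range(n)]

def grade(student_id, answered_mean, tol=0.01):
    """Regenerate the same data server-side from the same seed, recompute
    the answer key, and compare the submitted answer against it."""
    data = make_assignment(student_id)
    correct = statistics.mean(data)
    return abs(answered_mean - correct) <= tol, correct

data = make_assignment("s1234")          # what the student's page shows
ok, key = grade("s1234", statistics.mean(data))
```

Because the assignment is a pure function of the student ID, nothing but the grade needs to be stored between the "print assignment" and "submit answers" steps.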
Development of high-efficiency solar cells on silicon web
NASA Technical Reports Server (NTRS)
Meier, D. L.; Greggi, J.; Okeeffe, T. W.; Rai-Choudhury, P.
1986-01-01
Work was performed to improve web base material with a goal of obtaining solar cell efficiencies in excess of 18% (AM1). Efforts in this program are directed toward identifying carrier loss mechanisms in web silicon, eliminating or reducing these mechanisms, designing a high efficiency cell structure with the aid of numerical models, and fabricating high efficiency web solar cells. Fabrication techniques must preserve or enhance carrier lifetime in the bulk of the cell and minimize recombination of carriers at the external surfaces. Three completed cells were viewed by cross-sectional transmission electron microscopy (TEM) in order to investigate further the relation between structural defects and electrical performance of web cells. Consistent with past TEM examinations, the cell with the highest efficiency (15.0%) had no dislocations but did have 11 twin planes.
Tiburcio, Marcela; Lara, Ma Asunción; Aguilar Abrego, Araceli; Fernández, Morise; Martínez Vélez, Nora; Sánchez, Alejandro
2016-09-29
The development of Web-based interventions for substance abuse in Latin America is a new field of interest with great potential for expansion to other Spanish-speaking countries. This paper describes a project aimed to develop and evaluate the usability of the Web-based Help Program for Drug Abuse and Depression (Programa de Ayuda para Abuso de Drogas y Depresión, PAADD, in Spanish) and also to construct a systematic frame of reference for the development of future Web-based programs. The PAADD aims to reduce substance use and depressive symptoms with cognitive behavioral techniques translated into Web applications, aided by the participation of a counselor to provide support and guidance. This Web-based intervention includes 4 steps: (1) My Starting Point, (2) Where Do I Want to Be? (3) Strategies for Change, and (4) Maintaining Change. The development of the program was an interactive multistage process. The first stage defined the core structure and contents, which were validated in stage 2 by a group of 8 experts in addiction treatment. Programming of the applications took place in stage 3, taking into account 3 types of end users: administrators, counselors, and substance users. Stage 4 consisted of functionality testing. In stage 5, a total of 9 health professionals and 20 drug users currently in treatment voluntarily interacted with the program in a usability test, providing feedback about adjustments needed to improve users' experience. The main finding of stage 2 was the consensus of the health professionals about the cognitive behavioral strategies and techniques included in PAADD being appropriate for changing substance use behaviors. In stage 5, the health professionals found the functionalities easy to learn; their suggestions were related to the page layout, inclusion of confirmation messages at the end of activities, avoiding "read more" links, and providing feedback about every activity. 
On the other hand, the users said the information presented within the modules was easy to follow and suggested more dynamic features with concrete instructions and feedback. The resulting Web-based program may have advantages over traditional face-to-face therapies owing to its low cost, wide accessibility, anonymity, and independence of time and distance factors. The detailed description of the process of designing a Web-based program is an important contribution to others interested in this field. The potential benefits must be verified in specific studies. International Standard Randomized Controlled Trial Number (ISRCTN): 25429892; http://www.controlled-trials.com/ISRCTN25429892 (Archived by WebCite at http://www.webcitation.org/6ko1Fsvym).
Database of extended radiation maps and its access system
NASA Astrophysics Data System (ADS)
Verkhodanov, O. V.; Naiden, Ya. V.; Chernenkov, V. N.; Verkhodanova, N. V.
2014-01-01
We describe the architecture of the computing web server http://cmb.sao.ru, which allows users to synthesize maps of extended radiation on the full sphere from spherical harmonics in the GLESP pixelization grid, smooth them with a beam power pattern of various angular resolutions in multipole space, and identify regions of the sky with given coordinates. We describe the server access and administration systems as well as the technique for constructing the sky region maps, implemented in Python with the Django web-application development framework.
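Smoothing in multipole space amounts to multiplying each harmonic coefficient a_lm by a beam window b_l. A minimal sketch with a Gaussian beam follows; the Gaussian form and the toy a_lm values are assumptions for illustration (the server supports various beam patterns):

```python
import numpy as np

def gaussian_beam(lmax, fwhm_rad):
    """Gaussian beam window b_l = exp(-l(l+1) sigma^2 / 2) for smoothing
    in multipole space; sigma is derived from the beam FWHM."""
    sigma = fwhm_rad / np.sqrt(8.0 * np.log(2.0))
    ell = np.arange(lmax + 1)
    return np.exp(-0.5 * ell * (ell + 1) * sigma**2)

def smooth_alm(alm, fwhm_rad):
    """Multiply every a_lm by b_l; alm is a ragged list indexed [l][m]."""
    lmax = len(alm) - 1
    b = gaussian_beam(lmax, fwhm_rad)
    return [[b[ell] * a for a in row] for ell, row in enumerate(alm)]

alm = [[1.0], [0.5, 0.5], [0.25, 0.25, 0.25]]   # toy coefficients up to l=2
sm = smooth_alm(alm, fwhm_rad=np.radians(10.0))
```

Higher multipoles (finer angular structure) are suppressed more strongly, which is exactly what smoothing with a finite-resolution beam does.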
Methodology of decreasing software complexity using ontology
NASA Astrophysics Data System (ADS)
Dąbrowska-Kubik, Katarzyna
2015-09-01
In this paper a model of a web application's source code, based on the OSD ontology (Ontology for Software Development), is proposed. This model is applied to the implementation and maintenance phases of the software development process through the DevOntoCreator tool [5]. The aim of this solution is to decrease the complexity of the source code by applying various maintenance techniques, such as creation of documentation and elimination of dead code, cloned code, or previously known bugs [1][2]. This approach should make savings on the software maintenance costs of web applications possible.
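One of the maintenance techniques mentioned, elimination of dead code, can be sketched without any ontology machinery as a first approximation: parse the source and report module-level functions that are never referenced. This is an illustrative stand-in, not the DevOntoCreator approach:

```python
import ast

def unused_functions(source):
    """Report functions defined in the source that are never referenced
    elsewhere in it: a first approximation of dead code. (Dynamic calls,
    exports, and cross-module uses would need extra analysis.)"""
    tree = ast.parse(source)
    defined = {n.name for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)}
    used = {n.id for n in ast.walk(tree)
            if isinstance(n, ast.Name) and isinstance(n.ctx, ast.Load)}
    used |= {n.attr for n in ast.walk(tree) if isinstance(n, ast.Attribute)}
    return sorted(defined - used)

code = """
def used():
    return 1

def dead():
    return 2

print(used())
"""
report = unused_functions(code)
```

A real tool would combine such static signals with the knowledge captured in the ontology before deleting anything.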
Multi-Element Integrated Project Planning at Kennedy Space Center
NASA Technical Reports Server (NTRS)
Mullon, Robert
2008-01-01
This presentation demonstrates how the ASRC Scheduling team developed working practices to support multiple NASA and ASRC Project Managers using the enterprise capabilities of Primavera P6 and P6 Web Access. This work has proceeded as part of Kennedy Ground Systems' preparation for its transition from the Shuttle Program to the Constellation Program. The presenters will cover Primavera's enterprise-class capabilities for schedule development, integrated critical path analysis, and reporting, as well as advanced Primavera P6 Web Access tools and techniques for communicating project status.
Protecting Database Centric Web Services against SQL/XPath Injection Attacks
NASA Astrophysics Data System (ADS)
Laranjeiro, Nuno; Vieira, Marco; Madeira, Henrique
Web services represent a powerful interface for back-end database systems and are increasingly being used in business critical applications. However, field studies show that a large number of web services are deployed with security flaws (e.g., having SQL Injection vulnerabilities). Although several techniques for the identification of security vulnerabilities have been proposed, developing non-vulnerable web services is still a difficult task. In fact, security-related concerns are hard to apply as they involve adding complexity to already complex code. This paper proposes an approach to secure web services against SQL and XPath Injection attacks, by transparently detecting and aborting service invocations that try to take advantage of potential vulnerabilities. Our mechanism was applied to secure several web services specified by the TPC-App benchmark, showing to be 100% effective in stopping attacks, non-intrusive and very easy to use.
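A simplified version of the detect-and-abort idea can be sketched as a query-structure whitelist: learn the skeletons of legitimate SQL commands in a training phase, then reject any runtime query whose structure was never seen. The normalization rules below are illustrative assumptions, not the paper's exact mechanism:

```python
import re

def skeleton(query):
    """Strip string and numeric literals so only the query structure remains."""
    q = re.sub(r"'[^']*'", "?", query)      # string literals -> ?
    q = re.sub(r"\b\d+\b", "?", q)          # numeric literals -> ?
    return re.sub(r"\s+", " ", q).strip().lower()

class QueryGuard:
    """Learn legitimate query skeletons during a training phase, then
    abort any runtime query whose structure was never seen."""
    def __init__(self):
        self.allowed = set()
    def learn(self, query):
        self.allowed.add(skeleton(query))
    def check(self, query):
        return skeleton(query) in self.allowed

guard = QueryGuard()
guard.learn("SELECT * FROM users WHERE id = 42")
benign = guard.check("SELECT * FROM users WHERE id = 7")
attack = guard.check("SELECT * FROM users WHERE id = 7 OR '1'='1'")
```

An injected `OR '1'='1'` changes the query skeleton, so it no longer matches any learned structure and the invocation can be aborted transparently.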
D'Arcangelo, M; Gilbert, A; Pirrello, R
1996-06-01
The long-term results of a technique for correction of syndactyly are reported. The technique consists of a dorsal omega flap and a palmar anchor forming two palmar and lateral flaps. A long-term review was made of 50 patients with a minimum of 8 years follow-up operated over a period of 10 years. A total of 122 web spaces in simple, complex and syndromic syndactyly were operated on. Most patients achieved satisfactory reconstruction of the web spaces, resulting in a web of good shape. At long-term review, web creep was recorded in eight webs, and skin contractures in three fingers. This study shows the technique to be effective in reconstructing web spaces and in minimizing the prevalence of complications.
Supporting Multi-view User Ontology to Understand Company Value Chains
NASA Astrophysics Data System (ADS)
Zuo, Landong; Salvadores, Manuel; Imtiaz, Sm Hazzaz; Darlington, John; Gibbins, Nicholas; Shadbolt, Nigel R.; Dobree, James
The objective of the Market Blended Insight (MBI) project is to develop web based techniques to improve the performance of UK Business to Business (B2B) marketing activities. The analysis of company value chains is a fundamental task within MBI because it is an important model for understanding the market place and the company interactions within it. The project has aggregated rich data profiles of 3.7 million companies that form the active UK business community. The profiles are augmented by Web extractions from heterogeneous sources to provide unparalleled business insight. Advances by the Semantic Web in knowledge representation and logic reasoning allow flexible integration of data from heterogeneous sources, transformation between different representations and reasoning about their meaning. The MBI project has identified that the market insight and analysis interests of different types of users are difficult to maintain using a single domain ontology. Therefore, the project has developed a technique to undertake a plurality of analyses of value chains by deploying a distributed multi-view ontology to capture different user views over the classification of companies and their various relationships.
Zooniverse - A Platform for Data-Driven Citizen Science
NASA Astrophysics Data System (ADS)
Smith, A.; Lintott, C.; Bamford, S.; Fortson, L.
2011-12-01
In July 2007 a team of astrophysicists created a web-based astronomy project called Galaxy Zoo in which members of the public were asked to classify galaxies from the Sloan Digital Sky Survey by their shape. Over the following year a community of more than 150,000 people classified each of the 1 million galaxies more than 50 times. Four years later this community of 'citizen scientists' is more than 450,000 strong and is contributing its time and effort to more than 10 Zooniverse projects, each with its own science team and research case. With projects ranging from transcribing ancient Greek texts (ancientlives.org) to lunar science (moonzoo.org), the challenges to the Zooniverse community have gone well beyond the relatively simple original Galaxy Zoo interface. Delivering a range of citizen science projects to a large web-based audience presents challenges on a number of fronts, including interface design, data architecture/modelling and reduction techniques, web infrastructure, and software design. In this paper we describe how the Zooniverse team (a collaboration of scientists, software developers and educators) has developed tools and techniques to solve some of these issues.
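A core data-reduction step in such projects is aggregating many volunteers' answers per subject. A minimal majority-vote sketch follows; Zooniverse's actual aggregation pipelines are more sophisticated (weighting volunteers, modelling difficulty), so this is only the basic idea:

```python
from collections import Counter

def aggregate(classifications):
    """Combine volunteers' answers per subject by majority vote, also
    reporting the winning vote fraction as a rough confidence."""
    result = {}
    for subject, votes in classifications.items():
        counts = Counter(votes)
        label, n = counts.most_common(1)[0]
        result[subject] = (label, n / len(votes))
    return result

votes = {
    "galaxy_1": ["spiral", "spiral", "elliptical", "spiral"],
    "galaxy_2": ["elliptical", "elliptical"],
}
consensus = aggregate(votes)
```

Subjects with low winning fractions can be flagged for more classifications or expert review.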
Ultrabroadband photonic internet: safety aspects
NASA Astrophysics Data System (ADS)
Kalicki, Arkadiusz; Romaniuk, Ryszard
2008-11-01
Web applications have become the most popular medium on the Internet. Their popularity and the easiness of web application frameworks, together with careless development, result in a high number of vulnerabilities and attacks. Several types of attack are possible because of improper input validation. SQL injection is the ability to execute arbitrary SQL queries in a database through an existing application. Cross-site scripting is a vulnerability that allows malicious web users to inject code into the web pages viewed by other users. Cross-site request forgery (CSRF) is an attack that tricks the victim into loading a page that contains a malicious request. Web spam in blogs is a further problem. There are several techniques to mitigate these attacks. The most important are strong web application design, correct input validation, defined data types for each field, and parameterized statements in SQL queries. Server hardening with a firewall, modern security policy systems, and a safe web framework interpreter configuration is essential. It is advisable to keep a proper security level on the client side, keep software updated, and install personal web firewalls or IDS/IPS systems. Good habits are logging out from services just after finishing work and using a separate web browser for the most important sites, such as e-banking.
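A standard CSRF mitigation alongside those listed is a per-session token that the server embeds in its forms and verifies on submission. A minimal HMAC-based sketch follows; the in-memory key and session naming are illustrative assumptions, not a specific framework's API:

```python
import hashlib
import hmac
import secrets

SECRET = secrets.token_bytes(32)   # server-side key (illustrative handling)

def csrf_token(session_id):
    """Derive a per-session anti-CSRF token; the server embeds it in each
    form and recomputes it when the form comes back."""
    return hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()

def check_csrf(session_id, token):
    """Constant-time comparison so the check itself leaks no timing info."""
    return hmac.compare_digest(csrf_token(session_id), token)

tok = csrf_token("session-abc")
good = check_csrf("session-abc", tok)
bad = check_csrf("session-abc", "forged")
```

A forged cross-site request cannot know the token, so the malicious submission is rejected even though the victim's session cookie is sent automatically.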
Visualizing multiattribute Web transactions using a freeze technique
NASA Astrophysics Data System (ADS)
Hao, Ming C.; Cotting, Daniel; Dayal, Umeshwar; Machiraju, Vijay; Garg, Pankaj
2003-05-01
Web transactions are multidimensional and have a number of attributes: client, URL, response times, and numbers of messages. One of the key questions is how to simultaneously lay out in a graph the multiple relationships, such as the relationships between the web client response times and URLs in a web access application. In this paper, we describe a freeze technique to enhance a physics-based visualization system for web transactions. The idea is to freeze one set of objects before laying out the next set of objects during the construction of the graph. As a result, we substantially reduce the force computation time. This technique consists of three steps: automated classification, a freeze operation, and a graph layout. These three steps are iterated until the final graph is generated. This iterated-freeze technique has been prototyped in several e-service applications at Hewlett Packard Laboratories. It has been used to visually analyze large volumes of service and sales transactions at online web sites.
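The freeze idea (previously laid-out objects still exert forces but no longer move, so each pass only updates the newly added set) can be sketched with a toy Fruchterman-Reingold-style layout. The constants, step size, and grouping below are illustrative, not the prototype's:

```python
import math
import random

def layout_incremental(groups, edges, iters=200, k=1.0):
    """Force-directed layout built one group at a time. Nodes placed in
    earlier groups are frozen: they still repel and attract new nodes,
    but they no longer move, so each pass updates only the latest group
    and the force computation per pass stays small."""
    rng = random.Random(0)
    pos = {}
    for group in groups:
        for v in group:                        # seed new nodes randomly
            pos[v] = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
        for _ in range(iters):
            for v in group:                    # only unfrozen nodes move
                fx = fy = 0.0
                for u, (ux, uy) in pos.items():
                    if u == v:
                        continue
                    dx, dy = pos[v][0] - ux, pos[v][1] - uy
                    d = math.hypot(dx, dy) or 1e-9
                    f = k * k / d ** 2         # pairwise repulsion
                    if (u, v) in edges or (v, u) in edges:
                        f -= d / k             # spring attraction on edges
                    fx += f * dx
                    fy += f * dy
                pos[v][0] += 0.01 * fx
                pos[v][1] += 0.01 * fy
    return pos

# Two classified sets laid out in sequence: {"a","b"} first, then "c" joins.
groups = [["a", "b"], ["c"]]
edges = {("a", "b"), ("b", "c")}
pos = layout_incremental(groups, edges)
```

Because frozen nodes never move, the inner loop scales with the size of the current group rather than the whole graph, which is the source of the reported reduction in force computation time.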
Use of Standardized, Quantitative Digital Photography in a Multicenter Web-based Study
Molnar, Joseph A.; Lew, Wesley K.; Rapp, Derek A.; Gordon, E. Stanley; Voignier, Denise; Rushing, Scott; Willner, William
2009-01-01
Objective: We developed a Web-based, blinded, prospective, randomized, multicenter trial, using standardized digital photography to clinically evaluate hand burn depth and accurately determine wound area with digital planimetry. Methods: Photos in each center were taken with identical digital cameras with standardized settings on a custom backdrop developed at Wake Forest University containing a gray, white, black, and centimeter scale. The images were downloaded, transferred via the Web, and stored on servers at the principal investigator's home institution. Color adjustments to each photo were made using Adobe Photoshop 6.0 (Adobe, San Jose, Calif). In an initial pilot study, model hands marked with circles of known areas were used to determine the accuracy of the planimetry technique. Two-dimensional digital planimetry using SigmaScan Pro 5.0 (SPSS Science, Chicago, Ill) was used to calculate wound area from the digital images. Results: Digital photography is a simple and cost-effective method for quantifying wound size when used in conjunction with digital planimetry (SigmaScan) and photo enhancement (Adobe Photoshop) programs. The accuracy of the SigmaScan program in calculating predetermined areas was within 4.7% (95% CI, 3.4%–5.9%). Dorsal hand burns of the initial 20 patients in a national study involving several centers were evaluated with this technique. Images obtained by individuals denying experience in photography proved reliable and useful for clinical evaluation and quantification of wound area. Conclusion: Standardized digital photography may be used quantitatively in a Web-based, multicenter trial of burn care. This technique could be modified for other medical studies with visual endpoints. PMID:19212431
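The planimetry step reduces to converting traced wound pixels to physical area via the photographed centimetre scale. A minimal sketch follows (SigmaScan performs the actual tracing, which is omitted here, and the pixel counts are illustrative):

```python
def wound_area_cm2(mask, ruler_px, ruler_cm=1.0):
    """Convert a traced wound region (boolean pixel mask) to cm^2 using
    the photographed scale: ruler_px pixels span ruler_cm centimetres."""
    px_per_cm = ruler_px / ruler_cm
    wound_px = sum(row.count(True) for row in mask)
    return wound_px / px_per_cm**2

# A 50 x 50 px traced region, with 100 px spanning the 1 cm scale bar
mask = [[True] * 50 for _ in range(50)]
area = wound_area_cm2(mask, ruler_px=100)
```

Including the scale bar in every standardized photo is what makes areas comparable across cameras and centers.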
The Fabric of the Universe: Exploring the Cosmic Web in 3D Prints and Woven Textiles
NASA Astrophysics Data System (ADS)
Diemer, Benedikt; Facio, Isaac
2017-05-01
We introduce The Fabric of the Universe, an art and science collaboration focused on exploring the cosmic web of dark matter with unconventional techniques and materials. We discuss two of our projects in detail. First, we describe a pipeline for translating three-dimensional (3D) density structures from N-body simulations into solid surfaces suitable for 3D printing, and present prints of a cosmological volume and of the infall region around a massive cluster halo. In these models, we discover wall-like features that are invisible in two-dimensional projections. Going beyond the sheer visualization of simulation data, we undertake an exploration of the cosmic web as a three-dimensional woven textile. To this end, we develop experimental 3D weaving techniques to create sphere-like and filamentary shapes and radically simplify a region of the cosmic web into a set of filaments and halos. We translate the resulting tree structure into a series of commands that can be executed by a digital weaving machine, and present a large-scale textile installation.
Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout Rh; Stewart-Knox, Barbara J; Mathers, John C; Lovegrove, Julie A
2018-04-09
To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype-based, and intake+phenotype+gene-based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of communication. 
Smoking cessation texts were adapted for dietary use where necessary. A posteriori, a further 9 techniques were included. Examination of excluded items highlighted the distinction between techniques considered appropriate for face-to-face versus internet-based delivery. The use of existing taxonomies facilitated the description and standardization of techniques used in Food4Me. We recommend that for complex studies of this nature, technique analysis should be conducted a priori to develop standardized procedures and training and reviewed a posteriori to audit the techniques actually adopted. The present framework description makes a valuable contribution to future systematic reviews and meta-analyses that explore technique efficacy and underlying psychological constructs. This was a novel application of the behavior change taxonomies and was the first internet-based personalized nutrition intervention to use such a framework remotely. ClinicalTrials.gov NCT01530139; https://clinicaltrials.gov/ct2/show/NCT01530139 (Archived by WebCite at http://www.webcitation.org/6y8XYUft1). ©Anna L Macready, Rosalind Fallaize, Laurie T Butler, Judi A Ellis, Sharron Kuznesof, Lynn J Frewer, Carlos Celis-Morales, Katherine M Livingstone, Vera Araújo-Soares, Arnout RH Fischer, Barbara J Stewart-Knox, John C Mathers, Julie A Lovegrove. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 09.04.2018.
Research on ecological function zoning information system based on WebGIS
NASA Astrophysics Data System (ADS)
Zhang, Jianxiong; Zhang, Gang
2007-06-01
With the development of information technology, the application of WebGIS makes it possible to digitize and add intelligence to the publication and management of ecological function zoning information. Firstly, this paper introduces the fundamental principles, basic methods, and current state of development of WebGIS, along with its various supporting techniques. Secondly, the paper compares and analyzes these methods and discusses their prospects and feasibility for Web management. Finally, taking Jiaozuo City as an example, the paper puts forward a design concept and an implementation plan for the information system. In this research, the digital map and the map database were produced with MapInfo. Combined with technical data on the ecological environment of Jiaozuo City, information on ecological environment resources is collected, stored, analyzed, calculated, and displayed as pictures and graphs on the WebGIS platform, which makes use of the secondary development platform MapXtreme for Java and tools such as Java, JSP, and JavaScript. The system adopts a server mode and supports operating and querying the base map and producing thematic maps. The finished system provides a useful reference for similar applications.
NASA Astrophysics Data System (ADS)
Satheendran, S.; John, C. M.; Fasalul, F. K.; Aanisa, K. M.
2014-11-01
Web geoservices are the natural evolution of Geographic Information Systems into a distributed environment accessed through a simple browser. They enable organizations to share rich, dynamic, domain-specific spatial information over the web. The present study attempted to design and develop a web-enabled GIS application for the School of Environmental Sciences, Mahatma Gandhi University, Kottayam, Kerala, India to publish various geographical databases to the public through its website. The project is built on open source tools and techniques, and the resulting portal site is platform independent. The WebGIS framework GeoMoose is utilized, with Apache as the web server and the UMN MapServer as the map server. The portal provides various customized tools to query the geographical database in different ways and to search for facilities in the area such as banks, tourist attractions, hospitals, and hotels. The portal site was tested with the geographical databases of two projects of the School: (1) the Tourism Information System for the Malabar region of Kerala State, comprising its five northern districts, and (2) the geoenvironmental appraisal of the Athirappilly Hydroelectric Project, covering the entire Chalakkudy river basin.
ERIC Educational Resources Information Center
Chowdhury, Gobinda G.
2003-01-01
Discusses issues related to natural language processing, including theoretical developments; natural language understanding; tools and techniques; natural language text processing systems; abstracting; information extraction; information retrieval; interfaces; software; Internet, Web, and digital library applications; machine translation for…
Interactive 3d Landscapes on Line
NASA Astrophysics Data System (ADS)
Fanini, B.; Calori, L.; Ferdani, D.; Pescarin, S.
2011-09-01
The paper describes challenges identified while developing browser-embedded 3D landscape rendering applications, our current approach and work-flow, and how recent developments in browser technologies could affect them. All the data, even after processing by optimization and decimation tools, result in very large databases that require paging, streaming, and level-of-detail techniques to permit real-time remote use over the web. Our approach has been to select an open source scene-graph-based visual simulation library with sufficient performance and flexibility and to adapt it to the web by providing a browser plug-in. Within the current Montegrotto VR Project, content produced with new pipelines has been integrated. The whole Montegrotto Town has been generated procedurally by CityEngine. We used this procedural approach, based on algorithms and procedures, because it is particularly well suited to creating extensive and credible urban reconstructions. To create the archaeological sites we used optimized meshes acquired with laser scanning and photogrammetry techniques, whereas for the 3D reconstructions of the main historical buildings we adopted computer-graphics software such as Blender and 3ds Max. At the final stage, semi-automatic tools were developed and used to prepare and cluster 3D models and scene-graph routes for web publishing. Vegetation generators have also been used to populate the virtual scene and enhance the user's perceived realism during the navigation experience. After describing the 3D modelling and optimization techniques, the paper discusses results and expectations.
2012-12-03
This paper provides an introduction to Lagrangian techniques for locating flow boundaries that encompass regions of recirculation in time-dependent flows, in the context of tropical cyclogenesis, where such boundaries shelter the low- to mid-level embryonic vortex from adverse conditions.
On transform coding tools under development for VP10
NASA Astrophysics Data System (ADS)
Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao
2016-09-01
Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools have already been added to VP10 to achieve decent coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.
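The flexible-transform idea can be illustrated with a toy sketch. These are not VP10's actual basis definitions, significance threshold, or rate-distortion search; the sketch simply chooses per block between a DCT-II and a sine-based (ADST-like) transform, preferring whichever compacts the residual's energy into fewer significant coefficients.

```python
import math

# Toy per-block transform selection; illustrative only, not VP10's code.

def dct2_basis(n):
    # DCT-II basis vectors (unnormalized): good for smooth residues.
    return [[math.cos(math.pi * (2 * j + 1) * i / (2 * n)) for j in range(n)]
            for i in range(n)]

def adst_basis(n):
    # Sine-based (ADST-like) basis: suited to residues that are small
    # near the predicted edge of the block and grow away from it.
    return [[math.sin(math.pi * (2 * j + 1) * (i + 1) / (2 * n + 1)) for j in range(n)]
            for i in range(n)]

def transform(basis, x):
    # Apply the transform by projecting x onto each basis vector.
    return [sum(b * v for b, v in zip(row, x)) for row in basis]

def compaction_cost(coeffs, thresh=0.5):
    # Crude coding-cost proxy: number of "significant" coefficients.
    return sum(1 for c in coeffs if abs(c) > thresh)

def best_transform(residual):
    n = len(residual)
    candidates = {"DCT": dct2_basis(n), "ADST": adst_basis(n)}
    costs = {name: compaction_cost(transform(b, residual))
             for name, b in candidates.items()}
    return min(costs, key=costs.get)
```

A flat residual compacts into a single DC coefficient, so `best_transform([1.0] * 8)` selects the DCT; a real codec would make this choice via rate-distortion optimization rather than a fixed threshold.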
NASA Astrophysics Data System (ADS)
Demir, I.
2015-12-01
Recent developments in internet technologies make it possible to manage and visualize large data sets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and to interact with data to gain insight from simulations and environmental observations. This presentation showcases information communication interfaces, games, and virtual and immersive reality applications for supporting teaching and learning of concepts in atmospheric and hydrological sciences. The information communication platforms utilize the latest web technologies and allow accessing and visualizing large-scale data on the web. The simulation system is a web-based 3D interactive learning environment for teaching hydrological and atmospheric processes and concepts. It provides a visually striking platform with realistic terrain, weather information, and water simulation, and it gives students an environment in which to learn about earth science processes and the effects of development and human activity on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of its users.
Automatic home medical product recommendation.
Luo, Gang; Thomas, Selena B; Tang, Chunqiang
2012-04-01
Web-based personal health records (PHRs) are being widely deployed. To improve PHR's capability and usability, we proposed the concept of intelligent PHR (iPHR). In this paper, we use automatic home medical product recommendation as a concrete application to demonstrate the benefits of introducing intelligence into PHRs. In this new application domain, we develop several techniques to address the emerging challenges. Our approach uses treatment knowledge and nursing knowledge, and extends the language modeling method to (1) construct a topic-selection input interface for recommending home medical products, (2) produce a global ranking of Web pages retrieved by multiple queries, and (3) provide diverse search results. We demonstrate the effectiveness of our techniques using USMLE medical exam cases.
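The paper's extension of the language modeling method is not specified in this abstract. As an illustration of the underlying ranking idea only, a query-likelihood ranker with Jelinek-Mercer smoothing (all product names and data below are hypothetical) could order candidate home medical products like this:

```python
import math
from collections import Counter

def lm_score(query_terms, doc_terms, collection_counts, collection_len, lam=0.5):
    # Query-likelihood score with Jelinek-Mercer smoothing:
    # P(q|doc) mixed with the collection model to avoid zero probabilities.
    doc_counts = Counter(doc_terms)
    doc_len = len(doc_terms)
    score = 0.0
    for q in query_terms:
        p_doc = doc_counts[q] / doc_len if doc_len else 0.0
        p_col = collection_counts[q] / collection_len
        score += math.log((1 - lam) * p_doc + lam * p_col + 1e-12)
    return score

def rank_products(query, products):
    # products: dict mapping product name -> tokenized description.
    all_terms = [t for toks in products.values() for t in toks]
    collection_counts = Counter(all_terms)
    n = len(all_terms)
    return sorted(products,
                  key=lambda p: lm_score(query, products[p], collection_counts, n),
                  reverse=True)
```

For a query derived from treatment or nursing knowledge such as ["walking", "support"], a product whose description contains those terms ranks above one that does not.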
WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides
NASA Astrophysics Data System (ADS)
Ma, Xiuzeng; Li, Yingkui; Bourgeois, Mike; Caffee, Marc; Elmore, David; Granger, Darryl; Muzikar, Paul; Smith, Preston
2007-06-01
Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this, it is critical to establish an effective, easily accessible, and well-defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on nuclide concentrations measured by accelerator mass spectrometry. WebCN for 10Be and 26Al has been finished and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for 36Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open source PostgreSQL for database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system; Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
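A minimal sketch of the simplest computation such a tool performs: a zero-erosion surface exposure age from a measured nuclide concentration and a local production rate. The constants and interface below are textbook assumptions, not WebCN's actual code, which additionally handles erosion and the spatial/temporal variation of production rates.

```python
import math

# 10Be decay constant (1/yr), from a half-life of ~1.387 Myr (assumed value).
BE10_DECAY = math.log(2) / 1.387e6

def exposure_age(concentration, production_rate, decay=BE10_DECAY):
    """Zero-erosion surface exposure age in years.

    From N = (P / decay) * (1 - exp(-decay * t)), solve for t:
        t = -ln(1 - N * decay / P) / decay
    concentration: measured nuclide concentration N (atoms/g)
    production_rate: local surface production rate P (atoms/g/yr)
    """
    ratio = concentration * decay / production_rate
    if ratio >= 1.0:
        raise ValueError("concentration at or above saturation")
    return -math.log(1.0 - ratio) / decay
```

For young surfaces decay is negligible and the age approaches N/P; e.g. N = 50,000 atoms/g at P = 5 atoms/g/yr gives an age slightly over 10,000 years.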
Web-video-mining-supported workflow modeling for laparoscopic surgeries.
Liu, Rui; Zhang, Xiaoli; Zhang, Hao
2016-11-01
As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge such as the knowledge of the surgical workflow model (SWM) to support their intuitive cooperation with surgeons. For generating a robust and reliable SWM, a large amount of training data is required. However, training data collected by physically recording surgery operations is often limited and data collection is time-consuming and labor-intensive, severely influencing knowledge scalability of the surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost and labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A novel video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for the robotic cholecystectomy surgery. The generated workflow was evaluated against 4 web-retrieved videos and 4 operating-room-recorded videos. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. Satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge for intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.
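The paper's topic and sentiment analysis step is not detailed in this abstract. As a much-simplified stand-in (the word lists and threshold below are hypothetical), one could score candidate videos by the sentiment of their viewer comments and keep only those above a quality threshold:

```python
# Hypothetical sentiment lexicons; a real system would use trained
# topic and sentiment models, not fixed word lists.
POSITIVE = {"great", "clear", "excellent", "helpful", "precise"}
NEGATIVE = {"blurry", "wrong", "bad", "shaky", "incomplete"}

def sentiment_score(comments):
    # Fraction of sentiment-bearing words that are positive (0.5 if none).
    pos = neg = 0
    for comment in comments:
        for word in comment.lower().split():
            if word in POSITIVE:
                pos += 1
            elif word in NEGATIVE:
                neg += 1
    total = pos + neg
    return pos / total if total else 0.5

def select_videos(videos, threshold=0.6):
    # videos: dict mapping video id -> list of comment strings.
    return [vid for vid, comments in videos.items()
            if sentiment_score(comments) >= threshold]
```

The selected subset would then feed the statistical workflow-learning stage.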
Study on online community user motif using web usage mining
NASA Astrophysics Data System (ADS)
Alphy, Meera; Sharma, Ajay
2016-04-01
Web usage mining is the application of data mining used to extract useful information from the online community. The World Wide Web contained at least 4.73 billion pages according to the Indexed Web and at least 228.52 million pages according to the Dutch Indexed Web on Thursday, 6 August 2015. It is difficult to find the needed data among these billions of web pages, and herein lies the importance of web usage mining. Personalizing the search engine helps the web user identify the most used data in an easy way: it reduces time consumption and supports automatic site search and automatic restoration of useful sites. This study surveys the techniques used in pattern discovery and analysis in web usage mining, from the earliest to the latest, covering 1996 to 2015. Analyzing user motif helps in the improvement of business, e-commerce, personalisation and websites.
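As a toy example of the pattern-discovery step (illustrative only, not any specific algorithm from the surveyed literature), frequent page-to-page transitions can be mined from session logs:

```python
from collections import Counter

def frequent_transitions(sessions, min_support=2):
    # sessions: list of page-visit sequences, one per user session.
    # Returns page pairs (a, b) where b directly followed a in at least
    # min_support sessions -- a minimal form of navigation-pattern mining.
    counts = Counter()
    for session in sessions:
        for a, b in zip(session, session[1:]):
            counts[(a, b)] += 1
    return {pair: c for pair, c in counts.items() if c >= min_support}
```

Patterns like ("home", "forum") appearing in many sessions could then drive personalization, such as surfacing the forum link prominently for returning users.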
Kopp, Sandra L; Smith, Hugh M
2011-01-01
Little is known about the use of Web-based education in regional anesthesia training. Benefits of Web-based education include the ability to standardize learning material quality and content, build appropriate learning progressions, use interactive multimedia technologies, and individualize delivery of course materials. The goals of this investigation were (1) to determine whether module design influences regional anesthesia knowledge acquisition, (2) to characterize learner preference patterns among anesthesia residents, and (3) to determine whether learner preferences play a role in knowledge acquisition. Direct comparison of knowledge assessments, learning styles, and learner preferences was made between an interactive case-based and a traditional textbook-style module design. Forty-three Mayo Clinic anesthesiology residents completed 2 online modules, a knowledge pretest, a posttest, an Index of Learning Styles assessment, and a participant satisfaction survey. Interscalene and lumbar plexus regional techniques were selected as the learning content for 4 Web modules constructed using the Blackboard Vista coursework application. One traditional textbook-style module and 1 interactive case-based module were designed for each of the interscalene and lumbar plexus techniques. Participants scored higher on the postmodule knowledge assessment for both the interscalene and lumbar plexus modules. Postmodule knowledge performance scores were independent of both module design (interactive case-based versus traditional textbook style) and learning style preferences. However, nearly all participants reported a preference for Web-based learning and believe that it should be used in anesthesia resident education. Participants did not feel that Web-based learning should replace the current lecture-based curriculum. All residents scored higher on the postmodule knowledge assessment, but this improvement was independent of the module design and individual learning styles.
Although residents believe that online learning should be used in anesthesia training, the results of this study do not demonstrate improved learning or justify the time and expense of developing complex case-based training modules. While there may be practical benefits of Web-based education, educators in regional anesthesia should be cautious about developing curricula based on learner preference data.
Web2Quests: Updating a Popular Web-Based Inquiry-Oriented Activity
ERIC Educational Resources Information Center
Kurt, Serhat
2009-01-01
WebQuest is a popular inquiry-oriented activity in which learners use Web resources. Since the creation of the innovation, almost 15 years ago, the Web has changed significantly, while the WebQuest technique has changed little. This article examines possible applications of new Web trends on WebQuest instructional strategy. Some possible…
2016-01-01
Background The development of Web-based interventions for substance abuse in Latin America is a new field of interest with great potential for expansion to other Spanish-speaking countries. Objective This paper describes a project aimed to develop and evaluate the usability of the Web-based Help Program for Drug Abuse and Depression (Programa de Ayuda para Abuso de Drogas y Depresión, PAADD, in Spanish) and also to construct a systematic frame of reference for the development of future Web-based programs. Methods The PAADD aims to reduce substance use and depressive symptoms with cognitive behavioral techniques translated into Web applications, aided by the participation of a counselor to provide support and guidance. This Web-based intervention includes 4 steps: (1) My Starting Point, (2) Where Do I Want to Be? (3) Strategies for Change, and (4) Maintaining Change. The development of the program was an iterative multistage process. The first stage defined the core structure and contents, which were validated in stage 2 by a group of 8 experts in addiction treatment. Programming of the applications took place in stage 3, taking into account 3 types of end users: administrators, counselors, and substance users. Stage 4 consisted of functionality testing. In stage 5, a total of 9 health professionals and 20 drug users currently in treatment voluntarily interacted with the program in a usability test, providing feedback about adjustments needed to improve users’ experience. Results The main finding of stage 2 was the consensus of the health professionals about the cognitive behavioral strategies and techniques included in PAADD being appropriate for changing substance use behaviors. In stage 5, the health professionals found the functionalities easy to learn; their suggestions were related to the page layout, inclusion of confirmation messages at the end of activities, avoiding “read more” links, and providing feedback about every activity.
On the other hand, the users said the information presented within the modules was easy to follow and suggested more dynamic features with concrete instructions and feedback. Conclusions The resulting Web-based program may have advantages over traditional face-to-face therapies owing to its low cost, wide accessibility, anonymity, and independence of time and distance factors. The detailed description of the process of designing a Web-based program is an important contribution to others interested in this field. The potential benefits must be verified in specific studies. Trial Registration International Standard Randomized Controlled Trial Number (ISRCTN): 25429892; http://www.controlled-trials.com/ISRCTN25429892 (Archived by WebCite at http://www.webcitation.org/6ko1Fsvym) PMID:27687965
Digital Mapping Techniques '09-Workshop Proceedings, Morgantown, West Virginia, May 10-13, 2009
Soller, David R.
2011-01-01
As in the previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
How to Boost Engineering Support Via Web 2.0 - Seeds for the Ares Project...and/or Yours?
NASA Technical Reports Server (NTRS)
Scott, David W.
2010-01-01
The Mission Operations Laboratory (MOL) at Marshall Space Flight Center (MSFC) is responsible for Engineering Support capability for NASA's Ares launch system development. In pursuit of this, MOL is building the Ares Engineering and Operations Network (AEON), a web-based portal intended to provide a seamless interface to support and simplify two critical activities: a) Access and analyze Ares manufacturing, test, and flight performance data, with access to Shuttle data for comparison. b) Provide archive storage for engineering instrumentation data to support engineering design, development, and test. A mix of NASA-written and COTS software provides engineering analysis tools. A by-product of using a data portal to access and display data is access to collaborative tools inherent in a Web 2.0 environment. This paper discusses how Web 2.0 techniques, particularly social media, might be applied to the traditionally conservative and formal engineering support arena. A related paper by the author [1] considers use
NASA Technical Reports Server (NTRS)
Scott, David W.
2010-01-01
The Mission Operations Laboratory (MOL) at Marshall Space Flight Center (MSFC) is responsible for Engineering Support capability for NASA's Ares rocket development and operations. In pursuit of this, MOL is building the Ares Engineering and Operations Network (AEON), a web-based portal to support and simplify two critical activities: accessing and analyzing Ares manufacturing, test, and flight performance data, with access to Shuttle data for comparison; and establishing and maintaining collaborative communities within the Ares teams/subteams and with other projects, e.g., Space Shuttle, International Space Station (ISS). AEON seeks to provide a seamless interface to a) locally developed engineering applications and b) a Commercial-Off-The-Shelf (COTS) collaborative environment that includes Web 2.0 capabilities, e.g., blogging, wikis, and social networking. This paper discusses how Web 2.0 might be applied to the typically conservative engineering support arena, based on feedback from Integration, Verification, and Validation (IV&V) testing and on searching for their use in similar environments.
We'll Make You a Better Teacher: Learning from Guitar Techniques
NASA Astrophysics Data System (ADS)
Greenbowe, Thomas J.
2008-02-01
It is worth noting that there are more resources and more uses of technology available world-wide to help individuals become better guitar players than there are resources available to help individuals become better science teachers. Providing resources and services to help individuals become effective chemistry teachers and improve their chemistry teaching and expand their range of techniques is a worthwhile endeavor. This commentary proposes that a new magazine should be developed and designed to complement and augment the Journal of Chemical Education, the Examinations Institute, the BCCEs, and programming at regional, national, and international meetings. We need to be making use of the expertise of chemical educators from around the world to convey the best practices of teaching chemistry. This magazine would feature topics directly relating to teaching chemistry in the classroom and it would include master teachers explaining and discussing chemistry education techniques. A Web site and perhaps a DVD would have digital movies of master chemistry teachers illustrating how they implement a specific technique with students. The Web site would serve as a repository for resources. It would serve as an alternative site for professional development.
Teaching Web Search Skills: Techniques and Strategies of Top Trainers
ERIC Educational Resources Information Center
Notess, Greg R.
2006-01-01
Here is a unique and practical reference for anyone who teaches Web searching. Greg Notess shares his own techniques and strategies along with expert tips and advice from a virtual "who's who" of Web search training: Joe Barker, Paul Barron, Phil Bradley, John Ferguson, Alice Fulbright, Ran Hock, Jeff Humphrey, Diane Kovacs, Gary Price, Danny…
NASA Astrophysics Data System (ADS)
Demir, I.; Krajewski, W. F.
2013-12-01
As geoscientists are confronted with increasingly massive datasets from environmental observations to simulations, one of the biggest challenges is having the right tools to gain scientific insight from the data and communicate the understanding to stakeholders. Recent developments in web technologies make it easy to manage, visualize and share large data sets with general public. Novel visualization techniques and dynamic user interfaces allow users to interact with data, and modify the parameters to create custom views of the data to gain insight from simulations and environmental observations. This requires developing new data models and intelligent knowledge discovery techniques to explore and extract information from complex computational simulations or large data repositories. Scientific visualization will be an increasingly important component to build comprehensive environmental information platforms. This presentation provides an overview of the trends and challenges in the field of scientific visualization, and demonstrates information visualization and communication tools developed within the light of these challenges.
An industrial information integration approach to in-orbit spacecraft
NASA Astrophysics Data System (ADS)
Du, Xiaoning; Wang, Hong; Du, Yuhao; Xu, Li Da; Chaudhry, Sohail; Bi, Zhuming; Guo, Rong; Huang, Yongxuan; Li, Jisheng
2017-01-01
To operate an in-orbit spacecraft, the spacecraft status has to be monitored autonomously by collecting and analysing real-time data, and then detecting abnormalities and malfunctions of system components. To develop an information system for spacecraft state detection, we investigate the feasibility of using ontology-based artificial intelligence in the system development. We propose a new modelling technique based on semantic web, agent, scenario, and ontology models. In modelling, the subjects of astronautics fields are classified, corresponding agents and scenarios are defined, and they are connected by the semantic web to analyse data and detect failures. We introduce the modelling methodologies and the resulting framework of the status detection information system in this paper. We discuss system components as well as their interactions in detail. The system has been prototyped and tested to illustrate its feasibility and effectiveness. The proposed modelling technique is generic and can be extended and applied to the development of other large-scale and complex information systems.
ERIC Educational Resources Information Center
Emiroglu, Bulent Gursel
2007-01-01
Education is one of the fields in which the impact of developments in information and communication technologies will keep increasing. As a result, methods and techniques that have taken shape over many years may change considerably. In the past years, the field of higher education has been greatly impacted by these developments and…
Using Open Web APIs in Teaching Web Mining
ERIC Educational Resources Information Center
Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju
2009-01-01
With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…
Evaluating Web accessibility at different processing phases
NASA Astrophysics Data System (ADS)
Fernandes, N.; Lopes, R.; Carriço, L.
2012-09-01
Modern Web sites use several techniques (e.g. DOM manipulation) that allow for the injection of new content into their Web pages (e.g. AJAX), as well as manipulation of the HTML DOM tree. This has the consequence that the Web pages presented to users (i.e. after browser processing) are different from the original structure and content transmitted through HTTP communication (i.e. before browser processing). This poses a series of challenges for Web accessibility evaluation, especially for automated evaluation software. This article details an experimental study designed to understand the differences posed by accessibility evaluation after Web browser processing. We implemented a JavaScript-based evaluator, QualWeb, that can perform WCAG 2.0 based accessibility evaluations in the two phases of browser processing. Our study shows that, in fact, there are considerable differences between the HTML DOM trees in both phases, which have the consequence of producing distinct evaluation results. We discuss the impact of these results in the light of the potential problems that these differences can pose to designers and developers who use accessibility evaluators that function before browser processing.
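QualWeb itself runs as JavaScript inside the browser. As a simplified illustration of a pre-processing-phase check only (operating on the raw HTML as transmitted, before any DOM manipulation), here is a WCAG 1.1.1-style scan for images lacking text alternatives; scripts may later inject images that a check at this phase cannot see, which is exactly the gap the study describes.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    # Flags <img> elements lacking a non-empty alt attribute, evaluated
    # on raw HTML *before* browser processing.
    def __init__(self):
        super().__init__()
        self.violations = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if not alt:  # missing or empty alt text
                self.violations += 1

def count_missing_alt(html):
    checker = AltTextChecker()
    checker.feed(html)
    return checker.violations
```

Running the same rule against the post-processing DOM (as QualWeb does) can yield a different count whenever scripts add or modify images after load.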
NASA Astrophysics Data System (ADS)
Deng, Chao; Liu, Wanjun; Zhang, Yinjiang; Huang, Chen; Zhao, Yi; Jin, Xiangyu
2018-04-01
Developing wet-laid papers with a good wet strength remains a longstanding challenge in the papermaking industry. In this study, hydroentanglement, a mechanical bonding technique is developed to consolidate the wet-laid fibre web. The results indicate that wet tensile strength, ductile stretching property, softness, air permeability and water absorbency of the wet-laid fibre web are significantly improved by hydroentanglement. In addition, the abrasion test shows that the dusting off rate of wet-laid fibre web can be effectively reduced through hydroentanglement. Moreover, the disintegration experiment proves that wet-laid hydroentangled nonwovens could be easily dispersed when compared with conventional carded hydroentangled nonwovens. Therefore, the new wet-laid hydroentangled nonwovens can maintain excellent performance in a wet state, showing a great potential for personal hygiene applications.
Web image retrieval using an effective topic and content-based technique
NASA Astrophysics Data System (ADS)
Lee, Ching-Cheng; Prabhakara, Rashmi
2005-03-01
There has been exponential growth in the amount of image data available on the World Wide Web since the early development of the Internet. With such a large amount of image data available, an effective image retrieval system is greatly needed. In this paper, we present an effective approach with both image matching and indexing techniques that improves on existing integrated image retrieval methods. The technique follows a two-phase approach, integrating query-by-topic and query-by-example specification methods. In the first phase, topic-based image retrieval is performed using an improved text information retrieval (IR) technique that makes use of the structured format of HTML documents. This phase employs a focused crawler that lets the user enter not only the keyword for the topic-based search but also the scope in which to find the images. In the second phase, we use query-by-example specification to perform a low-level content-based image match in order to retrieve a smaller set of results closer to the example image; information about image features is automatically extracted from the query image. The main objective of our approach is to develop a functional image search and indexing technique and to demonstrate that better retrieval results can be achieved.
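The structure-aware text scoring in the first phase can be sketched as follows. The field weights here are hypothetical, not the authors' values; the idea is simply that occurrences of a topic keyword carry more signal in structured HTML fields such as the page title and image alt text than in body text.

```python
import re

# Hypothetical field weights: structured HTML fields carry more
# topical signal than body text.
FIELD_WEIGHTS = {"title": 3.0, "alt": 2.0, "body": 1.0}

def score_page(keyword, fields):
    # fields: dict mapping field name -> text extracted from the HTML.
    # Score = sum over fields of (field weight * keyword occurrences).
    kw = keyword.lower()
    score = 0.0
    for field, text in fields.items():
        hits = len(re.findall(re.escape(kw), text.lower()))
        score += FIELD_WEIGHTS.get(field, 1.0) * hits
    return score
```

Pages ranked highly by this topic score would then pass to the second, content-based matching phase.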
A WebGIS-based command control system for forest fire fighting
NASA Astrophysics Data System (ADS)
Yang, Jianyu; Ming, Dongping; Zhang, Xiaodong; Huang, Haitao
2006-10-01
Forest is a finite resource and fire prevention is crucial work. However, once a forest fire or accident occurs, timely and effective fire fighting is the only viable measure. The aim of this research is to build a computerized command control system based on WebGIS to direct fire fighting. Firstly, this paper introduces the overall technical flow and functional modules of the system. Secondly, it analyses the key techniques for building the system: data acquisition, data organization and management, WebGIS architecture, and sharing and interoperation techniques. Finally, it demonstrates the online military-symbol editing function to show the system in operation. Practical application showed that the system played a very important role in forest fire fighting work. In addition, this paper proposes some strategic recommendations for the further development of the system.
A step-by-step solution for embedding user-controlled cines into educational Web pages.
Cornfeld, Daniel
2008-03-01
The objective of this article is to introduce a simple method for embedding user-controlled cines into a Web page using a short JavaScript routine. Step-by-step instructions are included and the source code is made available. This technique allows the creation of portable Web pages on which the user can scroll through cases as if seated at a PACS workstation. With this technique, entire stacks of CT or MR images can be quickly and easily incorporated into online teaching files. It has potential for use in case presentations, online didactics, teaching archives, and resident testing.
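The mechanism can be illustrated by generating such a page programmatically: the page carries a list of frame URLs, and a small inline script swaps the displayed frame on mouse-wheel events. This is a hedged sketch of the general idea, not the author's published code; the `cine_page` helper and the CT file names are invented.

```python
# Build a self-contained HTML fragment with a wheel-scrollable image stack.
def cine_page(image_urls):
    js_list = ", ".join('"%s"' % u for u in image_urls)
    return """<img id="cine" src="%s">
<script>
var frames = [%s], i = 0;
document.getElementById("cine").onwheel = function (e) {
  e.preventDefault();  // keep the page itself from scrolling
  i = Math.min(frames.length - 1, Math.max(0, i + (e.deltaY > 0 ? 1 : -1)));
  this.src = frames[i];
};
</script>""" % (image_urls[0], js_list)

html = cine_page(["ct001.jpg", "ct002.jpg", "ct003.jpg"])
print("ct002.jpg" in html)  # True
```

Because the output is plain HTML plus inline script, the resulting page is portable: it can be copied to any web server, or opened locally, without server-side support.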
Convergent Technologies in Distance Learning Delivery.
ERIC Educational Resources Information Center
Wheeler, Steve
1999-01-01
Describes developments in British education in distance learning technologies. Highlights include networking the rural areas; communication, community, and paradigm shifts; digital compression techniques and telematics; Web-based material delivered over the Internet; system flexibility; social support; learning support; videoconferencing; and…
Riffle, Michael; Merrihew, Gennifer E; Jaschob, Daniel; Sharma, Vagisha; Davis, Trisha N; Noble, William S; MacCoss, Michael J
2015-11-01
Regulation of protein abundance is a critical aspect of cellular function, organism development, and aging. Alternative splicing may give rise to multiple possible proteoforms of gene products where the abundance of each proteoform is independently regulated. Understanding how the abundances of these distinct gene products change is essential to understanding the underlying mechanisms of many biological processes. Bottom-up proteomics mass spectrometry techniques may be used to estimate protein abundance indirectly by sequencing and quantifying peptides that are later mapped to proteins based on sequence. However, quantifying the abundance of distinct gene products is routinely confounded by peptides that map to multiple possible proteoforms. In this work, we describe a technique that may be used to help mitigate the effects of confounding ambiguous peptides and multiple proteoforms when quantifying proteins. We have applied this technique to visualize the distribution of distinct gene products for the whole proteome across 11 developmental stages of the model organism Caenorhabditis elegans. The result is a large multidimensional dataset for which web-based tools were developed for visualizing how translated gene products change during development and identifying possible proteoforms. The underlying instrument raw files and tandem mass spectra may also be downloaded. The data resource is freely available on the web at http://www.yeastrc.org/wormpes/ .
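One common way to mitigate shared peptides, sketched below purely for illustration, is to quantify each proteoform from its uniquely mapping peptides and then apportion shared-peptide signal in proportion to that unique evidence. The abstract does not specify the authors' exact procedure, and the isoform names and intensities here are made up.

```python
# Apportion ambiguous peptide intensity among candidate proteoforms.
def proteoform_abundance(unique, shared):
    # unique: {proteoform: summed intensity of uniquely mapping peptides}
    # shared: list of (intensity, [proteoforms the peptide maps to])
    total = dict(unique)
    for intensity, targets in shared:
        evidence = sum(unique[t] for t in targets)
        for t in targets:
            # Split proportionally to unique evidence; fall back to an
            # even split when no unique peptides were observed.
            share = unique[t] / evidence if evidence else 1.0 / len(targets)
            total[t] += intensity * share
    return total

unique = {"isoA": 30.0, "isoB": 10.0}
shared = [(8.0, ["isoA", "isoB"])]
print(proteoform_abundance(unique, shared))  # isoA receives 6.0 of the 8.0
```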
NASA Astrophysics Data System (ADS)
Demir, I.
2014-12-01
Recent developments in internet technologies make it possible to manage and visualize large data sets on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g. flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities for various visualization and interaction modes.
ERIC Educational Resources Information Center
Ercan, Orhan; Bilen, Kadir
2014-01-01
Advances in computer technologies and adoption of related methods and techniques in education have developed parallel to each other. This study focuses on the need to utilize more than one teaching method and technique in education rather than focusing on a single teaching method. By using the pre-test post-test and control group semi-experimental…
KernPaeP - a web-based pediatric palliative documentation system for home care.
Hartz, Tobias; Verst, Hendrik; Ueckert, Frank
2009-01-01
KernPaeP is a new web-based online and offline documentation system developed for pediatric palliative care teams, supporting patient documentation and communication among health care professionals. It provides a reliable system that makes fast and secure home care documentation possible. KernPaeP is accessible online by registered users from any web browser. Home care teams use an offline version of KernPaeP running on a netbook for patient documentation on site. Identifying and medical patient data are strictly separated and stored on two database servers. The system offers a stable, enhanced two-way algorithm for synchronization between the offline component and the central database servers. KernPaeP is implemented to meet the highest security standards while still maintaining high usability. The web-based documentation system allows ubiquitous and immediate access to patient data. Cumbersome paperwork is replaced by secure and comprehensive electronic documentation. KernPaeP helps save time and improve the quality of documentation. Because it was developed in close cooperation with pediatric palliative professionals, KernPaeP fulfils the broad needs of home-care documentation. The technique of web-based online and offline documentation is generally applicable to arbitrary home care scenarios.
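The abstract names a two-way synchronization algorithm without detailing it; as a generic illustration of the problem such an algorithm solves, the sketch below merges an offline store into a server store with a timestamp-based "newer record wins" rule. Record IDs, timestamps, and payloads are invented, and real medical-record sync would also need conflict auditing.

```python
# Merge two keyed stores; each maps record-id -> (timestamp, payload).
def two_way_sync(server, offline):
    merged = dict(server)
    for rid, (ts, payload) in offline.items():
        # Take the offline copy only when it is new or strictly newer.
        if rid not in merged or ts > merged[rid][0]:
            merged[rid] = (ts, payload)
    return merged

server = {"p1": (100, "visit note v1"), "p2": (200, "medication list")}
offline = {"p1": (150, "visit note v2"), "p3": (120, "new home visit")}
merged = two_way_sync(server, offline)
print(merged["p1"])  # (150, 'visit note v2')
```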
NGL Viewer: Web-based molecular graphics for large complexes.
Rose, Alexander S; Bradley, Anthony R; Valasatava, Yana; Duarte, Jose M; Prlic, Andreas; Rose, Peter W
2018-05-29
The interactive visualization of very large macromolecular complexes on the web is becoming a challenging problem as experimental techniques advance at an unprecedented rate and deliver structures of increasing size. We have tackled this problem by developing highly memory-efficient and scalable extensions for the NGL WebGL-based molecular viewer and by using MMTF, a binary and compressed Macromolecular Transmission Format. These enable NGL to download and render molecular complexes with millions of atoms interactively on desktop computers and smartphones alike, making it a tool of choice for web-based molecular visualization in research and education. The source code is freely available under the MIT license at github.com/arose/ngl and distributed on NPM (npmjs.com/package/ngl). MMTF-JavaScript encoders and decoders are available at github.com/rcsb/mmtf-javascript. asr.moin@gmail.com.
Enhancing Icing Training for Pilots Through Web-Based Multimedia
NASA Technical Reports Server (NTRS)
Fletcher, William; Nolan, Gary; Adanich, Emery; Bond, Thomas H.
2006-01-01
The Aircraft Icing Project of the NASA Aviation Safety Program has developed a number of in-flight icing education and training aids designed to increase pilot awareness about the hazards associated with various icing conditions. The challenges and advantages of transitioning these icing training materials to a Web-based delivery are discussed. Innovative Web-based delivery devices increased course availability to pilots and dispatchers while increasing course flexibility and utility. These courses are customizable for both self-directed and instructor-led learning. Part of our goal was to create training materials with enough flexibility to enable Web-based delivery and downloadable portability while maintaining a rich visual multimedia-based learning experience. Studies suggest that using visually based multimedia techniques increases the effectiveness of icing training materials. This paper describes these concepts, gives examples, and discusses the transitional challenges.
Benchmark of Client and Server-Side Catchment Delineation Approaches on Web-Based Systems
NASA Astrophysics Data System (ADS)
Demir, I.; Sermet, M. Y.; Sit, M. A.
2016-12-01
Recent advances in internet and cyberinfrastructure technologies have provided the capability to acquire large-scale spatial data from various gauges and sensor networks. The collection of environmental data has increased demand for applications capable of managing and processing large-scale, high-resolution data sets. Given the amount and resolution of the data sets provided, one of the challenging tasks in organizing and customizing hydrological data sets is delineation of watersheds on demand. Watershed delineation is a process for creating a boundary that represents the contributing area for a specific control point or water outlet, with the intent of characterizing and analyzing portions of a study area. Although many GIS tools and software packages for watershed analysis are available on desktop systems, there is a need for web-based and client-side techniques that create a dynamic and interactive environment for exploring hydrological data. In this project, we demonstrated several watershed delineation techniques on the web, implemented on the client side using JavaScript and WebGL and on the server side using Python and C++. We also developed a client-side GPGPU (General Purpose Graphical Processing Unit) algorithm to analyze high-resolution terrain data for watershed delineation, which allows parallelization on the GPU. Web-based real-time analysis of watershed segmentation can be helpful for decision-makers and interested stakeholders while eliminating the need to install complex software packages and handle large-scale data sets. Utilizing client-side hardware resources also reduces the need for servers by distributing computation to users' machines. Our goal for future work is to improve other hydrologic analysis methods, such as rain flow tracking, by adapting the presented approaches.
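The core of grid-based watershed delineation can be shown in a few lines: compute a D8 flow direction (steepest descent among the eight neighbors) for each cell of a digital elevation model, then collect every cell that drains to a chosen outlet. This is a didactic sketch with a tiny invented DEM; production implementations must also handle pits, flats, and very large rasters.

```python
# D8 flow directions: each cell drains to its steepest-descent neighbor.
def d8_downstream(dem):
    rows, cols = len(dem), len(dem[0])
    down = {}
    for r in range(rows):
        for c in range(cols):
            best = None
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    nr, nc = r + dr, c + dc
                    if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                        drop = dem[r][c] - dem[nr][nc]
                        if drop > 0 and (best is None or drop > best[0]):
                            best = (drop, (nr, nc))
            down[(r, c)] = best[1] if best else None  # None = outlet or pit
    return down

# Walk the drainage graph upstream from the outlet to get its basin.
def watershed(dem, outlet):
    down = d8_downstream(dem)
    upstream = {}
    for cell, to in down.items():
        upstream.setdefault(to, []).append(cell)
    basin, stack = {outlet}, [outlet]
    while stack:
        for src in upstream.get(stack.pop(), []):
            if src not in basin:
                basin.add(src)
                stack.append(src)
    return basin

dem = [[9, 8, 7],
       [8, 6, 4],
       [7, 5, 2]]
print(len(watershed(dem, (2, 2))))  # 9: every cell drains to the corner
```

The same inner loop parallelizes naturally, since each cell's flow direction depends only on its local neighborhood, which is what makes the GPGPU formulation mentioned above attractive.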
Discovering Authorities and Hubs in Different Topological Web Graph Structures.
ERIC Educational Resources Information Center
Meghabghab, George
2002-01-01
Discussion of citation analysis on the Web considers Web hyperlinks as a source to analyze citations. Topics include basic graph theory applied to Web pages, including matrices, linear algebra, and Web topology; and hubs and authorities, including a search technique called HITS (Hyperlink Induced Topic Search). (Author/LRW)
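The HITS technique named above has a well-known basic form: each page's authority score is repeatedly recomputed from the hub scores of pages linking to it, and each hub score from the authority scores it links to, with normalization every round. The sketch below implements that iteration on a tiny invented link graph.

```python
# Basic HITS iteration over a directed link list of (source, target) pairs.
def hits(links, iterations=50):
    pages = sorted({p for edge in links for p in edge})
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # Authority: sum of hub scores of in-links; hub: sum of authority
        # scores of out-links.
        auth = {p: sum(hub[q] for q, r in links if r == p) for p in pages}
        hub = {p: sum(auth[r] for q, r in links if q == p) for p in pages}
        for d in (auth, hub):
            norm = sum(v * v for v in d.values()) ** 0.5 or 1.0
            for p in d:
                d[p] /= norm
    return hub, auth

# Three hub pages all cite "a"; "a" in turn cites "b".
links = [("h1", "a"), ("h2", "a"), ("h3", "a"), ("a", "b")]
hub, auth = hits(links)
print(max(auth, key=auth.get))  # 'a' emerges as the strongest authority
```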
Hera: Using NASA Astronomy Data in the Classroom
NASA Astrophysics Data System (ADS)
Lochner, James C.; Mitchell, S.; Pence, W. D.
2006-12-01
Hera is a free internet-based tool that provides students access to both analysis software and data for studying astronomical objects such as black holes, binary star systems, supernovae, and galaxies. Students use a subset of the same software, and experience the same analysis process, that an astronomer follows in analyzing data obtained from an orbiting satellite observatory. Hera is accompanied by a web-based tutorial which steps students through the science background, procedures for accessing the data, and use of the Hera software. The web pages include a lesson plan in which students explore data from a binary star system containing a normal star and a black hole. The objective of the lesson is for students to use plotting, estimation, and statistical techniques to determine the orbital period. Students may then apply these techniques to a number of data sets and draw conclusions on the natures of the systems (for example, students discover that one system is an eclipsing binary). The web page tutorial is self-guided and contains a number of exercises; students can work independently or in groups. Hera has been used with high school students and in introductory astronomy classes at community colleges. This poster describes Hera and its web-based tutorial. We outline the underlying software architecture, the development process, and its testing and classroom applications. We also describe the benefits to students in developing skills which extend basic science and math concepts into real applications.
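The period-determination exercise can be modeled with a classic statistical technique: fold the light curve at trial periods and pick the one that minimizes scatter within phase bins (epoch folding). This is a generic illustration of that technique, not Hera's software; the light curve below is synthetic.

```python
import math

# Fold the time series at a trial period and sum within-bin scatter.
def fold_dispersion(times, fluxes, period, bins=4):
    binned = [[] for _ in range(bins)]
    for t, f in zip(times, fluxes):
        binned[int((t % period) / period * bins)].append(f)
    cost = 0.0
    for b in binned:
        if b:
            mean = sum(b) / len(b)
            cost += sum((f - mean) ** 2 for f in b)
    return cost

def best_period(times, fluxes, trial_periods):
    return min(trial_periods, key=lambda p: fold_dispersion(times, fluxes, p))

# Synthetic binary light curve with a true period of 5.0 time units.
true_period = 5.0
times = [0.37 * k for k in range(200)]
fluxes = [math.sin(2 * math.pi * t / true_period) for t in times]
print(best_period(times, fluxes, [3.0, 4.0, 5.0, 6.0]))  # 5.0
```

Folding at the correct period makes each phase bin nearly single-valued, so its scatter collapses; wrong trial periods scramble the cycle across bins and keep the cost high.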
A Web Server and Mobile App for Computing Hemolytic Potency of Peptides.
Chaudhary, Kumardeep; Kumar, Ritesh; Singh, Sandeep; Tuknait, Abhishek; Gautam, Ankur; Mathur, Deepika; Anand, Priya; Varshney, Grish C; Raghava, Gajendra P S
2016-03-08
Numerous therapeutic peptides never enter clinical trials solely because of their high hemolytic activity. Recently, we developed a database, Hemolytik, for maintaining experimentally validated hemolytic and non-hemolytic peptides. The present study describes a web server and mobile app developed for predicting and screening peptides for hemolytic potency. First, we generated a dataset, HemoPI-1, that contains 552 hemolytic peptides extracted from the Hemolytik database and 552 random non-hemolytic peptides (from Swiss-Prot). Sequence analysis of these peptides revealed that certain residues (e.g., L, K, F, W) and motifs (e.g., "FKK", "LKL", "KKLL", "KWK", "VLK", "CYCR", "CRR", "RFC", "RRR", "LKKL") are more abundant in hemolytic peptides. Therefore, we developed models for discriminating hemolytic and non-hemolytic peptides using various machine learning techniques and achieved more than 95% accuracy. We also developed models for discriminating peptides with high and low hemolytic potential on different datasets, called HemoPI-2 and HemoPI-3. To serve the scientific community, we developed a web server, mobile app and JAVA-based standalone software (http://crdd.osdd.net/raghava/hemopi/).
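The modeling idea can be reduced to a toy form: represent each peptide by its amino-acid composition and classify with a trivial nearest-centroid rule. This is a hedged illustration only; the published server uses richer features and stronger machine-learning models, and the training peptides below are fabricated (though biased toward the L/K/F/W enrichment the abstract reports).

```python
# Amino-acid composition features + nearest-centroid classification.
AMINO = "ACDEFGHIKLMNPQRSTVWY"

def composition(peptide):
    return [peptide.count(a) / len(peptide) for a in AMINO]

def centroid(vectors):
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def train(pos, neg):
    return (centroid([composition(p) for p in pos]),
            centroid([composition(p) for p in neg]))

def predict(model, peptide):
    pos_c, neg_c = model
    v = composition(peptide)
    dist = lambda c: sum((x - y) ** 2 for x, y in zip(v, c))
    return "hemolytic" if dist(pos_c) < dist(neg_c) else "non-hemolytic"

# Invented toy training data: K/L/F/W-rich positives, polar negatives.
model = train(pos=["FKKLKKL", "KWKLFKK"], neg=["DEGSTND", "GGSDEST"])
print(predict(model, "KKLLFKW"))  # hemolytic
```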
Moran, Jean M; Feng, Mary; Benedetti, Lisa A; Marsh, Robin; Griffith, Kent A; Matuszak, Martha M; Hess, Michael; McMullen, Matthew; Fisher, Jennifer H; Nurushev, Teamour; Grubb, Margaret; Gardner, Stephen; Nielsen, Daniel; Jagsi, Reshma; Hayman, James A; Pierce, Lori J
A database in which patient data are compiled allows analytic opportunities for continuous improvements in treatment quality and comparative effectiveness research. We describe the development of a novel, web-based system that supports the collection of complex radiation treatment planning information from centers that use diverse techniques, software, and hardware for radiation oncology care in a statewide quality collaborative, the Michigan Radiation Oncology Quality Consortium (MROQC). The MROQC database seeks to enable assessment of physician- and patient-reported outcomes and quality improvement as a function of treatment planning and delivery techniques for breast and lung cancer patients. We created tools to collect anonymized data based on all plans. The MROQC system representing 24 institutions has been successfully deployed in the state of Michigan. Since 2012, dose-volume histogram and Digital Imaging and Communications in Medicine-radiation therapy plan data and information on simulation, planning, and delivery techniques have been collected. Audits indicated >90% accurate data submission and spurred refinements to data collection methodology. This model web-based system captures detailed, high-quality radiation therapy dosimetry data along with patient- and physician-reported outcomes and clinical data for a radiation therapy collaborative quality initiative. The collaborative nature of the project has been integral to its success. Our methodology can be applied to setting up analogous consortiums and databases. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
Improving Geoscience Outreach Through Multimedia Enhanced Web Sites - An Example From Connecticut
NASA Astrophysics Data System (ADS)
Hyatt, J. A.; Coron, C. R.; Schroeder, T. J.; Fleming, T.; Drzewiecki, P. A.
2005-12-01
Although large governmental web sites (e.g. USGS, NASA etc.) are important resources, particularly in relation to phenomena with global to regional significance (e.g. recent Tsunami and Hurricane disasters), smaller academic web portals continue to make substantive contributions to web-based learning in the geosciences. The strength of "home-grown" web sites is that they easily can be tailored to specific classes, they often focus on local geologic content, and they potentially integrate classroom, laboratory, and field-based learning in ways that improve introductory classes. Furthermore, innovative multimedia techniques including virtual reality, image manipulations, and interactive streaming video can improve visualization and be particularly helpful for first-time geology students. This poster reports on one such web site, Learning Tools in Earth Science (LTES, http://www.easternct.edu/personal/faculty/hyattj/LTES-v2/), a site developed by geoscience faculty at two state institutions. In contrast to some large web sites with media development teams, LTES geoscientists, with strong support from media and IT service departments, are responsible for geologic content and verification, media development and editing, and web development and authoring. As such, we have considerable control over both content and design of this site. At present the main content modules for LTES include "mineral" and "virtual field trip" links. The mineral module includes an interactive mineral gallery, and a virtual mineral box of 24 unidentified samples that are identical to those used in some of our classes. Students navigate an intuitive web portal to manipulate images and view streaming video segments that explain and demonstrate standard mineral identification tests. New elements highlighted in our poster include links to a virtual petrographic microscope, in which users can manipulate images to simulate stage rotation in both plane- and cross-polarized light.
Virtual field trips include video-based excursions to sites in Georgia, Connecticut, and Greenland. New to these VFTs is the integration of "virtual walks," in which users can navigate through some field sites virtually. Development of this resource is ongoing, but responses from students, faculty outside of Earth Science, and K-12 instructors indicate that this small web site can provide useful resources for educators utilizing web-based learning in their courses.
NASA Astrophysics Data System (ADS)
Sugumaran, Ramanathan; Meyer, James C.; Davis, Jim
2004-10-01
Local governments often struggle to balance competing demands for residential, commercial, and industrial development with imperatives to minimize environmental degradation. To manage this development process effectively and sustainably, local planners and government agencies are increasingly seeking better tools and techniques. In this paper, we describe the development of a Web-Based Environmental Decision Support System (WEDSS), which helps prioritize local watersheds in terms of environmental sensitivity using multiple criteria identified by planners and local government staff in the city of Columbia and Boone County, Missouri. Development of the system involved three steps: first, establishing the relevant environmental criteria and developing a data layer for each criterion; second, building a spatial model for analysis; and third, developing a Web-based interface with analysis tools using client-server technology. The WEDSS is an example of a way to run spatial models over the Web and represents a significant increase in capability over other WWW-based GIS applications that focus on database querying and map display. The WEDSS seeks to aid in building agreement regarding specific local areas deserving increased protection and the public policies to be pursued in minimizing the environmental impact of future development. The tool is also intended to assist ongoing public information and education efforts concerning watershed management and water quality issues for the City of Columbia, Missouri, and adjacent developing areas within Boone County.
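Multi-criteria prioritization of this kind typically reduces to a weighted overlay: each watershed receives a sensitivity score as the weighted sum of its normalized criterion values. The sketch below shows only that scoring step; the criteria names, weights, watershed names, and values are invented for illustration and are not WEDSS's actual model.

```python
# Rank watersheds by a weighted sum of normalized (0-1) criterion values.
def prioritize(watersheds, weights):
    def score(w):
        return sum(weights[k] * w[k] for k in weights)
    return sorted(watersheds, key=score, reverse=True)

weights = {"slope": 0.3, "wetland_cover": 0.4, "stream_density": 0.3}
watersheds = [
    {"name": "Hinkson", "slope": 0.9, "wetland_cover": 0.8,
     "stream_density": 0.7},
    {"name": "Perche", "slope": 0.4, "wetland_cover": 0.3,
     "stream_density": 0.5},
]
ranked = prioritize(watersheds, weights)
print(ranked[0]["name"])  # Hinkson scores highest (0.80 vs 0.39)
```

In a client-server deployment, the scoring would run server-side against GIS data layers, with the web interface supplying the weights chosen by planners.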
Exposing the structure of an Arctic food web.
Wirta, Helena K; Vesterinen, Eero J; Hambäck, Peter A; Weingartner, Elisabeth; Rasmussen, Claus; Reneerkens, Jeroen; Schmidt, Niels M; Gilg, Olivier; Roslin, Tomas
2015-09-01
How food webs are structured has major implications for their stability and dynamics. While poorly studied to date, arctic food webs are commonly assumed to be simple in structure, with few links per species. If this is the case, then different parts of the web may be weakly connected to each other, with populations and species united by only a low number of links. We provide the first highly resolved description of trophic link structure for a large part of a high-arctic food web. For this purpose, we apply a combination of recent techniques to describing the links between three predator guilds (insectivorous birds, spiders, and lepidopteran parasitoids) and their two dominant prey orders (Diptera and Lepidoptera). The resultant web shows a dense link structure and no compartmentalization or modularity across the three predator guilds. Thus, both individual predators and predator guilds tap heavily into the prey community of each other, offering versatile scope for indirect interactions across different parts of the web. The current description of a first but single arctic web may serve as a benchmark toward which to gauge future webs resolved by similar techniques. Targeting an unusual breadth of predator guilds, and relying on techniques with a high resolution, it suggests that species in this web are closely connected. Thus, our findings call for similar explorations of link structure across multiple guilds in both arctic and other webs. From an applied perspective, our description of an arctic web suggests new avenues for understanding how arctic food webs are built and function and of how they respond to current climate change. It suggests that to comprehend the community-level consequences of rapid arctic warming, we should turn from analyses of populations, population pairs, and isolated predator-prey interactions to considering the full set of interacting species.
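Two of the structural notions discussed above, link density and the sharing of prey across predator guilds, can be made concrete with simple set arithmetic on a link list. The tiny web below is invented for illustration and is far coarser than the molecularly resolved web of the study.

```python
# Connectance: realized predator-prey links over all possible links.
def connectance(links, predators, prey):
    return len(links) / (len(predators) * len(prey))

# Prey taxa exploited by both guilds; a large overlap means the guilds
# tap heavily into each other's prey community.
def shared_prey(links, guild_a, guild_b):
    eats = lambda guild: {p for pred, p in links if pred in guild}
    return eats(guild_a) & eats(guild_b)

links = [("bird1", "fly"), ("bird1", "moth"), ("spider1", "fly"),
         ("spider1", "moth"), ("wasp1", "moth")]
predators, prey = {"bird1", "spider1", "wasp1"}, {"fly", "moth"}
print(connectance(links, predators, prey))         # 5/6: a dense web
print(shared_prey(links, {"bird1"}, {"spider1"}))  # both prey orders shared
```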
Infrastructure for the Geospatial Web
NASA Astrophysics Data System (ADS)
Lake, Ron; Farley, Jim
Geospatial data and geoprocessing techniques are now directly linked to business processes in many areas. Commerce, transportation and logistics, planning, defense, emergency response, health care, asset management and many other domains leverage geospatial information and the ability to model these data to achieve increased efficiencies and to develop better, more comprehensive decisions. However, the ability to deliver geospatial data and the capacity to process geospatial information effectively in these domains are dependent on infrastructure technology that facilitates basic operations such as locating data, publishing data, keeping data current and notifying subscribers and others whose applications and decisions are dependent on this information when changes are made. This chapter introduces the notion of infrastructure technology for the Geospatial Web. Specifically, the Geography Markup Language (GML) and registry technology developed using the ebRIM specification delivered from the OASIS consortium are presented as atomic infrastructure components in a working Geospatial Web.
World Wide Web Pages--Tools for Teaching and Learning.
ERIC Educational Resources Information Center
Beasley, Sarah; Kent, Jean
Created to help educators incorporate World Wide Web pages into teaching and learning, this collection of Web pages presents resources, materials, and techniques for using the Web. The first page focuses on tools for teaching and learning via the Web, providing pointers to sites containing the following: (1) course materials for both distance and…
Kushniruk, A W; Patel, C; Patel, V L; Cimino, J J
2001-04-01
The World Wide Web provides an unprecedented opportunity for widespread access to health-care applications by both patients and providers. The development of new methods for assessing the effectiveness and usability of these systems is becoming a critical issue. This paper describes the distance evaluation (i.e. 'televaluation') of emerging Web-based information technologies. In health informatics evaluation, there is a need for application of new ideas and methods from the fields of cognitive science and usability engineering. A framework is presented for conducting evaluations of health-care information technologies that integrates a number of methods, ranging from deployment of on-line questionnaires (and Web-based forms) to remote video-based usability testing of user interactions with clinical information systems. Examples illustrating application of these techniques are presented for the assessment of a patient clinical information system (PatCIS), as well as an evaluation of use of Web-based clinical guidelines. Issues in designing, prototyping and iteratively refining evaluation components are discussed, along with description of a 'virtual' usability laboratory.
Construction of Multimedia Courseware and Web-based E-Learning Courses of "Biomedical Materials".
Xiaoying, Lu; Jian, He; Tian, Qin; Dongxu, Jiang; Wei, Chen
2005-01-01
In order to reform traditional teaching methodology and to improve teaching effectiveness, we developed a new teaching system for the course "Biomedical Materials" at our university, supported by computer techniques and the Internet. The new teaching system comprises multimedia courseware and web-based e-learning courses. More than 2000 PowerPoint slides have been designed and optimized, and Flash movies are included for several chapters. On the basis of this multimedia courseware, a web-based educational environment has been established that includes course contents, an introduction of the teacher, courseware downloads, a study forum, a sitemap of the web site, and related links. The multimedia courseware has been used in class teaching of "Biomedical Materials" for 6 years with good results. The web-based e-learning courses have been running for two years and have proved helpful to students preparing and reviewing the teaching contents before and after class.
Homesteading on the Web: The Queensland Department of Education Virtual Library.
ERIC Educational Resources Information Center
Cram, Jennifer; Allison, Myrl
1996-01-01
The Queensland Department of Education (Australia) developed a homesteading model as an alternative to the urban-built environment model of large multi-purpose networks. This resulted in the in-house development of a low-cost, stand-alone server and homepage. The charette technique was used to plan and design the Queensland Department of Education…
Library OPACs on the Web: Finding and Describing Directories.
ERIC Educational Resources Information Center
Henry, Marcia
1997-01-01
Provides current descriptions of some of the major directories that link to library catalogs on the World Wide Web. Highlights include LibWeb; Hytelnet; WebCats; WWW Library Directory; and techniques for finding new library OPAC (online public access catalog) directories. (LRW)
iview: an interactive WebGL visualizer for protein-ligand complex.
Li, Hongjian; Leung, Kwong-Sak; Nakane, Takanori; Wong, Man-Hon
2014-02-25
Visualization of protein-ligand complexes plays an important role in elucidating protein-ligand interactions and aiding novel drug design. Most existing web visualizers either rely on slow software rendering or lack virtual reality support. The vital feature of macromolecular surface construction is also unavailable. We have developed iview, an easy-to-use interactive WebGL visualizer of protein-ligand complexes. It exploits hardware acceleration rather than software rendering. It features three special effects in virtual reality settings, namely anaglyph, parallax barrier, and Oculus Rift, resulting in visually appealing identification of intermolecular interactions. It supports four surface representations: Van der Waals surface, solvent-excluded surface, solvent-accessible surface, and molecular surface. Moreover, based on the feature-rich version of iview, we have also developed a neat, tailor-made version specifically for protein-ligand docking on our istar web platform. This demonstrates the excellent portability of iview. Using innovative 3D techniques, we provide a user-friendly visualizer that is not intended to compete with professional visualizers but to enable easy accessibility and platform independence.
Secure Web-Site Access with Tickets and Message-Dependent Digests
Donato, David I.
2008-01-01
Although there are various methods for restricting access to documents stored on a World Wide Web (WWW) site (a Web site), none of the widely used methods is completely suitable for restricting access to Web applications hosted on an otherwise publicly accessible Web site. A new technique, however, provides a mix of features well suited for restricting Web-site or Web-application access to authorized users, including the following: secure user authentication, tamper-resistant sessions, simple access to user state variables by server-side applications, and clean session terminations. This technique, called message-dependent digests with tickets, or MDDT, maintains secure user sessions by passing single-use nonces (tickets) and message-dependent digests of user credentials back and forth between client and server. Appendix 2 provides a working implementation of MDDT with PHP server-side code and JavaScript client-side code.
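The MDDT idea can be sketched in a few lines: the server issues a single-use ticket (nonce), and the client proves possession of a shared secret by returning a digest that depends on both the ticket and the message, so a captured digest cannot be replayed. This toy uses HMAC-SHA256 for the keyed digest; the report's exact hash construction, ticket format, and state-variable handling may differ.

```python
import hashlib
import hmac
import secrets

issued = set()  # server-side record of outstanding single-use tickets

def issue_ticket():
    t = secrets.token_hex(16)
    issued.add(t)
    return t

def client_digest(secret, ticket, message):
    # Digest depends on both the single-use ticket and the message.
    return hmac.new(secret, (ticket + message).encode(),
                    hashlib.sha256).hexdigest()

def server_verify(secret, ticket, message, digest):
    if ticket not in issued:
        return False          # unknown or already-used ticket
    issued.discard(ticket)    # single use: any replay will fail below
    return hmac.compare_digest(digest, client_digest(secret, ticket, message))

secret = b"shared-secret"
t = issue_ticket()
d = client_digest(secret, t, "GET /report")
print(server_verify(secret, t, "GET /report", d))  # True
print(server_verify(secret, t, "GET /report", d))  # False (replay rejected)
```

Binding the digest to the message also makes sessions tamper-resistant: altering the request invalidates the digest even if the ticket is fresh.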
Dynamic "inline" images: context-sensitive retrieval and integration of images into Web documents.
Kahn, Charles E
2008-09-01
Integrating relevant images into web-based information resources adds value for research and education. This work sought to evaluate the feasibility of using "Web 2.0" technologies to dynamically retrieve and integrate pertinent images into a radiology web site. An online radiology reference of 1,178 textual web documents was selected as the set of target documents. The ARRS GoldMiner image search engine, which incorporated 176,386 images from 228 peer-reviewed journals, retrieved images on demand and integrated them into the documents. At least one image was retrieved in real time for display as an "inline" image gallery for 87% of the web documents. Each thumbnail image was linked to the full-size image at its original web site. Review of 20 randomly selected Collaborative Hypertext of Radiology documents found that 69 of 72 displayed images (96%) were relevant to the target document. Users could click on the "More" link to search the image collection more comprehensively and, from there, link to the full text of the article. A gallery of relevant radiology images can be inserted easily into web pages on any web server. Indexing by concepts and keywords allows context-aware image retrieval, and searching by document title and subject metadata yields excellent results. These techniques allow web developers to easily incorporate a context-sensitive image gallery into their documents.
FRASS: the web-server for RNA structural comparison
2010-01-01
Background The impressive increase in novel RNA structures during the past few years demands automated methods for structure comparison. While many algorithms handle only small motifs, a few techniques developed in recent years (ARTS, DIAL, SARA, SARSA, and LaJolla) are available for the structural comparison of large and intact RNA molecules. Results The FRASS web-server represents an RNA chain with its Gauss integrals and allows one to compare structures of RNA chains and to find similar entries in a database derived from the Protein Data Bank. We observed that FRASS scores correlate well with the ARTS and LaJolla similarity scores. Moreover, the web-server can also reproduce satisfactorily the DARTS classification of RNA 3D structures and the classification of SCOR functions that was obtained by the SARA method. Conclusions The FRASS web-server can be easily used to detect relationships among RNA molecules and to scan efficiently the rapidly enlarging structural databases. PMID:20553602
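The Gauss-integral representation can be illustrated with the lowest-order such measure, the writhe of a discretized chain. The Python sketch below is a simplification (midpoint quadrature over backbone segments), not FRASS's actual descriptor set:

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def writhe(points):
    """Discretized Gauss double integral (writhe) of a polygonal chain.

    points: list of (x, y, z) backbone coordinates (e.g. RNA P atoms).
    Each segment contributes its midpoint m_i and tangent t_i; pairs of
    non-adjacent segments are summed, mirroring the continuous integral
    (1/4pi) * integral of (r1 - r2) . (dr1 x dr2) / |r1 - r2|^3.
    """
    n = len(points) - 1
    mids = [tuple((points[i][k] + points[i+1][k]) / 2 for k in range(3)) for i in range(n)]
    tans = [sub(points[i+1], points[i]) for i in range(n)]
    total = 0.0
    for i in range(n):
        for j in range(i + 2, n):  # skip adjacent segments
            r = sub(mids[i], mids[j])
            d = math.sqrt(dot(r, r))
            total += dot(r, cross(tans[i], tans[j])) / d**3
    return total / (2 * math.pi)  # x2 for both pair orderings, then /4pi
```

A planar chain has writhe zero, and mirroring a chain flips the sign, which is the rotation-invariant shape information such descriptors capture.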
Technical progress in silicon sheet growth under DOE/JPL FSA program, 1975-1986
NASA Technical Reports Server (NTRS)
Kalejs, J. P.
1986-01-01
The technical progress made in the Silicon Sheet Growth Program during its 11 years was reviewed. At present, in 1986, only two of the original 9 techniques have survived to the start-up, pilot-plant stage in industry. These two techniques are the edge-defined, film-fed growth (EFG) technique that produces closed shape polygons, and the WEB dendritic technique that produces single ribbons. Both the status and future concerns of the EFG and WEB techniques were discussed.
Use of a secure Internet Web site for collaborative medical research.
Marshall, W W; Haley, R W
2000-10-11
Researchers who collaborate on clinical research studies from diffuse locations need a convenient, inexpensive, secure way to record and manage data. The Internet, with its World Wide Web, provides a vast network that enables researchers with diverse types of computers and operating systems anywhere in the world to log data through a common interface. Development of a Web site for scientific data collection can be organized into 10 steps, including planning the scientific database, choosing a database management software system, setting up database tables for each collaborator's variables, developing the Web site's screen layout, choosing a middleware software system to tie the database software to the Web site interface, embedding data editing and calculation routines, setting up the database on the central server computer, obtaining a unique Internet address and name for the Web site, applying security measures to the site, and training staff who enter data. Ensuring the security of an Internet database requires limiting the number of people who have access to the server, setting up the server on a stand-alone computer, requiring user-name and password authentication for server and Web site access, installing a firewall computer to prevent break-ins and block bogus information from reaching the server, verifying the identity of the server and client computers with certification from a certificate authority, encrypting information sent between server and client computers to avoid eavesdropping, establishing audit trails to record all accesses into the Web site, and educating Web site users about security techniques. When these measures are carefully undertaken, in our experience, information for scientific studies can be collected and maintained on Internet databases more efficiently and securely than through conventional systems of paper records protected by filing cabinets and locked doors. JAMA. 2000;284:1843-1849.
Exploring Remote Sensing Through The Use Of Readily-Available Classroom Technologies
NASA Astrophysics Data System (ADS)
Rogers, M. A.
2013-12-01
Frontier geoscience research using remotely sensed satellite observations routinely requires sophisticated and novel remote sensing techniques to succeed. Describing these techniques in an educational format presents significant challenges to the science educator, especially in the professional development setting, where a small but competent audience has limited instructor contact time to develop the necessary understanding. In this presentation, we describe the use of simple and inexpensive technologies, including ultrasonic transducers, FLIR detectors, and even simple web cameras, to provide a tangible analogue to sophisticated remote sensing platforms. We also describe methods of curriculum development that leverage these simple devices to teach the fundamentals of remote sensing, resulting in a deeper and more intuitive understanding of the techniques used in modern remote sensing research. Sample workshop itineraries using these techniques are provided as well.
NASA Astrophysics Data System (ADS)
Saito, Daisuke; Saito, Keiichi; Notomi, Kazuhiro; Saito, Masao
This paper presents the visibility ordering of several web-safe colors. Research on web page visibility is important because of the rapid dissemination of the World Wide Web. The combination of a foreground color and a background color is an important factor in providing sufficient visibility. Therefore, rating the visibility of color combinations is necessary when developing accessible web sites. In this study, the visibility of several web-safe color combinations was examined using a psychological methodology, i.e., paired comparison. Eighteen chromatic and 3 achromatic web-safe colors were employed as visual stimuli. Twenty-eight subjects ranging in age from 21 to 75 were recruited, all with normal color vision. They looked at two differently colored characters simultaneously on a white background and were instructed to identify which one they could see more clearly. In examining the relationship between the psychological rankings of the color combinations and the visual sensations, each color combination was first scored for visibility by Thurstone's paired comparison technique. Secondly, the visual sensation was deduced by applying the Weber-Fechner law to the luminance of the foreground colors. As a result, the luminance of a foreground color influenced visibility; however, visibility is difficult to rate from the luminance of web-safe colors alone. This indicates that chromaticity and chroma (saturation) must also be considered when rating the visibility of chromatic web-safe colors.
A Study on Visibility Rating of Several Representative Web-Safe Colors
NASA Astrophysics Data System (ADS)
Saito, Daisuke; Saito, Keiichi; Notomi, Kazuhiro; Saito, Masao
This paper presents the visibility ordering of several web-safe colors. Research on web site visibility is important because of the rapid dissemination of the World Wide Web. The combination of a foreground color and a background color is an important factor in providing sufficient visibility. Therefore, rating the visibility of color combinations is necessary when developing accessible web sites. In this study, the visibility of several web-safe color combinations was examined using a psychological methodology, i.e., paired comparison. Eighteen chromatic web-safe colors were employed as visual stimuli. Nine students ranging in age from 21 to 29 (average 23.7) were recruited, all with normal color vision. These nine subjects looked at two differently colored characters simultaneously on a white background and were instructed to identify which one they could see more clearly. In examining the relationship between the psychological rankings of the color combinations and the visual sensations, each color combination was first scored for visibility by Thurstone's paired comparison technique. Secondly, the visual sensation was deduced by applying the Weber-Fechner law to the luminance of the foreground colors. As a result, the luminance of a foreground color influenced visibility; however, visibility is difficult to rate from the luminance of web-safe colors alone. This indicates that chromaticity and chroma (saturation) must also be considered when rating the visibility of chromatic web-safe colors.
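Thurstone's paired comparison scaling, as used in both studies above, can be sketched as follows. This is a simplified Case V treatment in Python; the clipping of unanimous win proportions is an illustrative choice, not the authors' procedure.

```python
from statistics import NormalDist

def thurstone_scale(wins):
    """Thurstone Case V scaling from a paired-comparison win-count matrix.

    wins[i][j] = number of judges who saw stimulus i more clearly than j.
    Returns one visibility scale value per stimulus: the mean of the
    inverse-normal-transformed win proportions against all other stimuli.
    """
    n = len(wins)
    inv = NormalDist().inv_cdf
    scores = []
    for i in range(n):
        zs = []
        for j in range(n):
            if i == j:
                continue
            total = wins[i][j] + wins[j][i]
            p = wins[i][j] / total
            p = min(max(p, 0.01), 0.99)  # clip to avoid infinite z for unanimous picks
            zs.append(inv(p))
        scores.append(sum(zs) / len(zs))
    return scores

# Three hypothetical color combinations judged by nine observers:
# combination 0 is preferred most often, so it gets the highest scale value.
scores = thurstone_scale([[0, 8, 9], [1, 0, 7], [0, 2, 0]])
```

The resulting scale values form the interval-level visibility ranking that the abstracts compare against the Weber-Fechner luminance prediction.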
Flush-mounting technique for composite beams
NASA Technical Reports Server (NTRS)
Harman, T. C.; Kay, B. F.
1980-01-01
Procedure permits mounting of heavy parts to surface of composite beams without appreciably weakening beam web. Web is split and held apart in region where attachment is to be made by lightweight precast foam filler. Bolt hole penetrates foam rather than web, and is secured by barrelnut in transverse bushing through web.
Visual Based Retrieval Systems and Web Mining--Introduction.
ERIC Educational Resources Information Center
Iyengar, S. S.
2001-01-01
Briefly discusses Web mining and image retrieval techniques, and then presents a summary of articles in this special issue. Articles focus on Web content mining, artificial neural networks as tools for image retrieval, content-based image retrieval systems, and personalizing the Web browsing experience using media agents. (AEF)
A Testbed for Data Fusion for Helicopter Diagnostics and Prognostics
2003-03-01
...and algorithm design and tuning in order to develop advanced diagnostic and prognostic techniques for aircraft health monitoring. Here a... and development of models for diagnostics, prognostics, and anomaly detection. [Figure 5: VMEP Server Browser Interface] ...detections, and prognostic prediction time horizons. The VMEP system, and in particular the web component, are ideal for performing data collection
Android Based Mobile Environment for Moodle Users
ERIC Educational Resources Information Center
de Clunie, Gisela T.; Clunie, Clifton; Castillo, Aris; Rangel, Norman
2013-01-01
This paper is about the development of a platform that eases, through Android-based mobile devices, the mobility of users of virtual courses at the Technological University of Panama. The platform deploys computational techniques such as "web services," design patterns, ontologies, and mobile technologies to allow mobile devices to communicate…
Optimizing real-time Web-based user interfaces for observatories
NASA Astrophysics Data System (ADS)
Gibson, J. Duane; Pickering, Timothy E.; Porter, Dallan; Schaller, Skip
2008-08-01
In using common HTML/Ajax approaches for web-based data presentation and telescope control user interfaces at the MMT Observatory (MMTO), we were quickly confronted with web browser performance issues. Much of the operational data at the MMTO is highly dynamic and is constantly changing during normal operations. Status of telescope subsystems must be displayed with minimal latency to telescope operators and other users. A major motivation for migrating toward web-based applications at the MMTO is to provide easy access to current and past observatory subsystem data for a wide variety of users on their favorite operating system through a familiar interface, their web browser. Performance issues, especially for user interfaces that control telescope subsystems, led to investigations of more efficient use of HTML/Ajax and web server technologies, as well as other web-based technologies such as Java and Flash/Flex. The results presented here focus on techniques for optimizing HTML/Ajax web applications with near real-time data display. This study indicates that direct modification of the contents or "nodeValue" attribute of text nodes is the most efficient method of updating data values displayed on a web page. Other optimization techniques are discussed for web-based applications that display highly dynamic data.
Rozen, Warren Matthew; Spychal, Robert T.; Hunter-Smith, David J.
2016-01-01
Background Accurate volumetric analysis is an essential component of preoperative planning in both reconstructive and aesthetic breast procedures, towards achieving symmetrization and a patient-satisfactory outcome. Numerous comparative studies and reviews of individual techniques have been reported. However, a unifying review of all techniques comparing their accuracy, reliability, and practicality has been lacking. Methods A review of the published English literature dating from 1950 to 2015 using databases such as PubMed, Medline, Web of Science, and EMBASE was undertaken. Results Since Bouman’s first description of the water displacement method, a range of volumetric assessment techniques have been described: thermoplastic casting, direct anthropomorphic measurement, two-dimensional (2D) imaging, and computed tomography (CT)/magnetic resonance imaging (MRI) scans. However, most have proven unreliable, difficult to execute, and of limited practicality. The introduction of 3D surface imaging has revolutionized the field due to its ease of use, speed, accuracy, and reliability. However, its widespread use has been limited by its high cost and a lack of high-level evidence. Recent developments have unveiled the first web-based 3D surface imaging program, 4D imaging, and 3D printing. Conclusions Despite its importance, an accurate, reliable, and simple breast volumetric analysis tool had been elusive until the introduction of 3D surface imaging technology. However, its high cost has limited its wide usage. Novel adjunct technologies, such as a web-based 3D surface imaging program, 4D imaging, and 3D printing, appear promising. PMID:27047788
The Electron Microscopy Outreach Program: A Web-based resource for research and education.
Sosinsky, G E; Baker, T S; Hand, G; Ellisman, M H
1999-01-01
We have developed a centralized World Wide Web (WWW)-based environment that serves as a resource of software tools and expertise for biological electron microscopy. A major focus is molecular electron microscopy, but the site also includes information and links on structural biology at all levels of resolution. This site serves to help integrate or link structural biology techniques in accordance with user needs. The WWW site, called the Electron Microscopy (EM) Outreach Program (URL: http://emoutreach.sdsc.edu), provides scientists with computational and educational tools for their research and edification. In particular, we have set up a centralized resource containing course notes, references, and links to image analysis and three-dimensional reconstruction software for investigators wanting to learn about EM techniques either within or outside of their fields of expertise. Copyright 1999 Academic Press.
A Web Server and Mobile App for Computing Hemolytic Potency of Peptides
NASA Astrophysics Data System (ADS)
Chaudhary, Kumardeep; Kumar, Ritesh; Singh, Sandeep; Tuknait, Abhishek; Gautam, Ankur; Mathur, Deepika; Anand, Priya; Varshney, Grish C.; Raghava, Gajendra P. S.
2016-03-01
Numerous therapeutic peptides fail to enter clinical trials simply because of their high hemolytic activity. Recently, we developed a database, Hemolytik, for maintaining experimentally validated hemolytic and non-hemolytic peptides. The present study describes a web server and mobile app developed for predicting and screening peptides for hemolytic potency. First, we generated a dataset, HemoPI-1, that contains 552 hemolytic peptides extracted from the Hemolytik database and 552 random non-hemolytic peptides (from Swiss-Prot). Sequence analysis of these peptides revealed that certain residues (e.g., L, K, F, W) and motifs (e.g., “FKK”, “LKL”, “KKLL”, “KWK”, “VLK”, “CYCR”, “CRR”, “RFC”, “RRR”, “LKKL”) are more abundant in hemolytic peptides. Therefore, we developed models for discriminating hemolytic and non-hemolytic peptides using various machine learning techniques and achieved more than 95% accuracy. We also developed models for discriminating peptides with high and low hemolytic potential on different datasets, called HemoPI-2 and HemoPI-3. In order to serve the scientific community, we developed a web server, mobile app, and JAVA-based standalone software (http://crdd.osdd.net/raghava/hemopi/).
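The residue and motif statistics described above translate naturally into features for such models. The Python sketch below is a toy illustration, not the HemoPI server's actual model: the motif list comes from the abstract, while the scoring rule is a hypothetical stand-in for the trained machine learning classifiers.

```python
# Motifs reported as enriched in hemolytic peptides (from the abstract).
HEMO_MOTIFS = ["FKK", "LKL", "KKLL", "KWK", "VLK", "CYCR", "CRR", "RFC", "RRR", "LKKL"]
# Residues reported as more abundant in hemolytic peptides.
HEMO_RESIDUES = "LKFW"

def features(peptide):
    """Amino acid composition plus a count of enriched motifs."""
    comp = {aa: peptide.count(aa) / len(peptide) for aa in "ACDEFGHIKLMNPQRSTVWY"}
    motifs = sum(peptide.count(m) for m in HEMO_MOTIFS)
    return comp, motifs

def toy_hemo_score(peptide):
    """Hypothetical scoring rule: enriched-residue fraction plus motif bonus.
    A stand-in for the trained models, not their actual decision function."""
    comp, motifs = features(peptide)
    return sum(comp[aa] for aa in HEMO_RESIDUES) + 0.5 * motifs

# A lysine/leucine-rich peptide scores higher than a glycine/serine one.
hemo = toy_hemo_score("FKKLKKLL")
benign = toy_hemo_score("GGSGGS")
```

In the actual server, feature vectors like these feed machine-learning models rather than a hand-weighted sum.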
WMT: The CSDMS Web Modeling Tool
NASA Astrophysics Data System (ADS)
Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.
2015-12-01
The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can design a model from a set of components, edit component parameters, save models to a web-accessible server, share saved models with the community, submit runs to an HPC system, and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse, and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, the database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation.
Once a simulation completes, its output, in NetCDF, is packaged and uploaded to a data server where it is stored and from which a user can download it as a single compressed archive file.
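The JSON-encoded component metadata and the uses/provides port matching can be sketched as follows. The field names and parameters here are illustrative assumptions, not the actual wmt-db schema.

```python
import json

# Hypothetical component metadata in the JSON style the database server
# returns; the field names are illustrative, not the real wmt-db schema.
hydrotrend = json.loads("""{
  "name": "HydroTrend",
  "provides": ["water_discharge", "sediment_load"],
  "uses": [],
  "parameters": {"run_duration": 100}
}""")

cem = json.loads("""{
  "name": "CEM",
  "provides": ["shoreline_position"],
  "uses": ["water_discharge", "sediment_load"],
  "parameters": {"grid_spacing": 200.0}
}""")

def can_couple(upstream, downstream):
    """Two components can be coupled when every exchange item the
    downstream component uses is provided by the upstream component."""
    return set(downstream["uses"]) <= set(upstream["provides"])

# The upstream component's provides ports satisfy the downstream uses ports.
assert can_couple(hydrotrend, cem)
```

A client can apply a check like this before letting the user draw a connection between two components in the model-builder interface.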
Web Navigation Sequences Automation in Modern Websites
NASA Astrophysics Data System (ADS)
Montoto, Paula; Pan, Alberto; Raposo, Juan; Bellas, Fernando; López, Javier
Most of today’s web sources are designed to be used by humans, but they do not provide suitable interfaces for software programs. That is why a growing interest has arisen in so-called web automation applications, which are widely used for different purposes such as B2B integration, automated testing of web applications, or technology and business watch. Previous proposals assume models for generating and reproducing navigation sequences that are unable to deal correctly with new websites using technologies such as AJAX: on one hand, existing systems only allow recording simple navigation actions; on the other hand, they are unable to detect the end of the effects caused by a user action. In this paper, we propose a set of new techniques to record and execute web navigation sequences able to deal with all the complexity of AJAX-based web sites. We also present an exhaustive evaluation of the proposed techniques that shows very promising results.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-06
...: Exchange Programs Alumni Web Site Registration, DS-7006 ACTION: Notice of request for public comments... the Paperwork Reduction Act of 1995. Title of Information Collection: Exchange Programs Alumni Web... techniques or other forms of technology. Abstract of proposed collection: The State Alumni Web site requires...
A Good Teaching Technique: WebQuests
ERIC Educational Resources Information Center
Halat, Erdogan
2008-01-01
In this article, the author first introduces and describes a new teaching tool called WebQuests to practicing teachers. He then provides detailed information about the structure of a good WebQuest. Third, the author shows the strengths and weaknesses of using WebQuests in teaching and learning. Last, he points out the challenges for practicing…
ERIC Educational Resources Information Center
Macmillan, Roderick H.
1996-01-01
Describes a management system developed by BT Laboratories (United Kingdom) that is based on ISO 9001 using the World Wide Web, a hypermedia system, and part of the Internet. Subject matter is presented as an alphabetical list of linked entries, numerous navigational techniques are available, and searching options function within an index file.…
Gamification strategy on prevention of STDs for youth.
Gabarron, Elia; Schopf, Thomas; Serrano, J Artur; Fernandez-Luque, Luis; Dorronzoro, Enrique
2013-01-01
Sexually transmitted diseases (STDs), and especially chlamydia, are a worrying problem among North-Norwegian youngsters. Gamified web applications are valuable for sexual health education, and thus STD prevention, for their potential to get users engaged and involved with their healthcare. To make youngsters more aware of STDs, we have developed "sjekkdeg.no", a gamified web application focused on sexual health and targeting North-Norwegian youngsters. Gamification techniques such as avatars, achievement-based gifts, and social network sharing buttons have been implemented in the site, which includes educational content on sexual health and an STD symptom checker. Preliminary results show that the game-style web app could be useful in encouraging users to learn more about sexual health and STDs, thus changing their risky behaviors and preventing sexually transmitted diseases.
Aufderheide, Helge; Rudolf, Lars; Gross, Thilo; Lafferty, Kevin D.
2013-01-01
Recent attempts to predict the response of large food webs to perturbations have revealed that in larger systems increasingly precise information on the elements of the system is required. Thus, the effort needed for good predictions grows quickly with the system's complexity. Here, we show that not all elements need to be measured equally well, suggesting that a more efficient allocation of effort is possible. We develop an iterative technique for determining an efficient measurement strategy. In model food webs, we find that it is most important to precisely measure the mortality and predation rates of long-lived, generalist, top predators. Prioritizing the study of such species will make it easier to understand the response of complex food webs to perturbations.
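The iterative allocation idea can be sketched generically: approximate the prediction variance as a sensitivity-weighted sum of parameter uncertainties and, at each round, measure the parameter currently contributing most. The numbers below are illustrative, not taken from the paper's food web models.

```python
def allocate_effort(sensitivity, sigma, rounds, gain=0.5):
    """Greedy iterative measurement strategy.

    sensitivity[i]: how strongly the prediction depends on parameter i.
    sigma[i]: current uncertainty (std. dev.) of parameter i.
    Prediction variance is approximated as sum_i (s_i * sigma_i)^2; each
    round measures the largest contributor, shrinking its uncertainty.
    Returns the measurement order and the final uncertainties.
    """
    sigma = list(sigma)
    order = []
    for _ in range(rounds):
        contrib = [(s * sg) ** 2 for s, sg in zip(sensitivity, sigma)]
        i = max(range(len(contrib)), key=contrib.__getitem__)
        order.append(i)
        sigma[i] *= gain  # measuring parameter i shrinks its uncertainty
    return order, sigma

# Hypothetical example: parameter 0 (say, a top predator's mortality rate)
# is most influential, so it is measured first; once refined, effort shifts
# to parameter 1, then returns to 0.
order, _ = allocate_effort(sensitivity=[2.0, 1.5, 0.5], sigma=[0.3, 0.3, 0.3], rounds=3)
```

This mirrors the paper's qualitative finding: effort concentrates on the few high-sensitivity elements rather than being spread uniformly.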
Overcoming Terminology Barrier Using Web Resources for Cross-Language Medical Information Retrieval
Lu, Wen-Hsiang; Lin, Ray Shih-Jui; Chan, Yi-Che; Chen, Kuan-Hsi
2006-01-01
A number of authoritative medical websites, such as PubMed and MedlinePlus, provide consumers with the most up-to-date health information. However, non-English speakers often encounter not only language barriers (from other languages to English) but also terminology barriers (from laypersons’ terms to professional medical terms) when retrieving information from these websites. Our previous work addresses language barriers by developing a multilingual medical thesaurus, Chinese-English MeSH, while this study presents an approach to overcome terminology barriers based on Web resources. Two techniques were utilized in our approach: monolingual concept mapping using approximate string matching and crosslingual concept mapping using Web resources. The evaluation shows that our approach can significantly improve the performance on MeSH concept mapping and cross-language medical information retrieval. PMID:17238395
Web-based interventions for the management of stress in the workplace: Focus, form, and efficacy
Ryan, Cathal; Bergin, Michael; Chalder, Trudie; Wells, John SG
2017-01-01
Objectives: This review sought to determine what is currently known about the focus, form, and efficacy of web-based interventions that aim to support the well-being of workers and enable them to manage their work-related stress. Method: A scoping review of the literature as this relates to web-based interventions for the management of work-related stress and supporting the psychological well-being of workers was conducted. Results: Forty-eight web-based interventions were identified and reviewed, the majority of which (n = 37) were "individual" -focused and utilized cognitive-behavioral techniques, relaxation exercises, mindfulness, or cognitive behavior therapy. Most interventions identified were provided via a website (n = 34) and were atheoretical in nature. Conclusions: There is some low-to-moderate quality evidence that "individual" -focused interventions are effective for supporting employee well-being and managing their work-related stress. There are few web-based interventions that target "organizational" or "individual/organization" interface factors, and there is limited support for their efficacy. A clear gap appears to exist between work-stress theory and its application in the design and development of web-based interventions for the management of work-related stress. PMID:28320977
Share2Quit: Web-Based Peer-Driven Referrals for Smoking Cessation
2013-01-01
Background Smoking is the number one preventable cause of death in the United States. Effective Web-assisted tobacco interventions are often underutilized and require new and innovative engagement approaches. Web-based peer-driven chain referrals successfully used outside health care have the potential for increasing the reach of Internet interventions. Objective The objective of our study was to describe the protocol for the development and testing of proactive Web-based chain-referral tools for increasing the access to Decide2Quit.org, a Web-assisted tobacco intervention system. Methods We will build and refine proactive chain-referral tools, including email and Facebook referrals. In addition, we will implement respondent-driven sampling (RDS), a controlled chain-referral sampling technique designed to remove inherent biases in chain referrals and obtain a representative sample. We will begin our chain referrals with an initial recruitment of former and current smokers as seeds (initial participants) who will be trained to refer current smokers from their social network using the developed tools. In turn, these newly referred smokers will also be provided the tools to refer other smokers from their social networks. We will model predictors of referral success using sample weights from the RDS to estimate the success of the system in the targeted population. Results This protocol describes the evaluation of proactive Web-based chain-referral tools, which can be used in tobacco interventions to increase the access to hard-to-reach populations, for promoting smoking cessation. Conclusions Share2Quit represents an innovative advancement by capitalizing on naturally occurring technology trends to recruit smokers to Web-assisted tobacco interventions. PMID:24067329
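The bias-correcting weights at the heart of RDS can be sketched with a Volz-Heckathorn-style inverse-degree estimator. This is a textbook simplification with made-up data, not the study's actual model.

```python
def rds_estimate(degrees, outcomes):
    """Inverse-degree weighted prevalence estimate for an RDS sample.

    degrees: each respondent's reported social-network size.
    outcomes: 0/1 trait flags (e.g. quit smoking or not).
    Chain referrals over-sample well-connected people, so each respondent
    is weighted by the reciprocal of their network size.
    """
    weights = [1.0 / d for d in degrees]
    return sum(w * y for w, y in zip(weights, outcomes)) / sum(weights)

# Illustrative data: unweighted, 3 of 4 respondents have the trait (0.75);
# downweighting the two high-degree respondents shifts the estimate down.
est = rds_estimate(degrees=[10, 10, 2, 2], outcomes=[1, 1, 1, 0])
```

Weights like these are what the protocol refers to when modeling predictors of referral success "using sample weights from the RDS."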
A genetic algorithm for replica server placement
NASA Astrophysics Data System (ADS)
Eslami, Ghazaleh; Toroghi Haghighat, Abolfazl
2012-01-01
Modern distribution systems use replication to improve the communication delay experienced by their clients. Several techniques have been developed for web server replica placement. One previous approach is the Greedy algorithm proposed by Qiu et al., which requires knowledge of the network topology. In this paper, we first introduce a genetic algorithm for web server replica placement. Second, we compare our algorithm with the Greedy algorithm proposed by Qiu et al. and an Optimum algorithm. We found that our approach achieves better results than the Greedy algorithm of Qiu et al., but its computational time is greater.
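The genetic-algorithm formulation can be sketched as follows; the chromosome encoding, operators, and delay matrix are illustrative assumptions rather than the authors' exact design. A chromosome is a set of k sites hosting replicas, and fitness is the total client-to-nearest-replica delay.

```python
import random

def total_delay(placement, delay):
    # Each client (row of the delay matrix) reaches its nearest replica.
    return sum(min(row[s] for s in placement) for row in delay)

def evolve(delay, n_sites, k, pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    # Initial population: random k-subsets of candidate sites.
    pop = [rng.sample(range(n_sites), k) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: total_delay(p, delay))
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            genes = list(set(a) | set(b))          # crossover: merge parents
            child = rng.sample(genes, k) if len(genes) >= k else a[:]
            if rng.random() < 0.2:                 # mutation: swap one site
                child[rng.randrange(k)] = rng.randrange(n_sites)
            # Keep the child only if it still names k distinct sites.
            children.append(child if len(set(child)) == k else a[:])
        pop = survivors + children
    return min(pop, key=lambda p: total_delay(p, delay))
```

On a small instance with two client clusters, the algorithm recovers the obvious optimum of placing one replica near each cluster; unlike the Greedy algorithm of Qiu et al., nothing here needs an explicit topology model, only the measured delays.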
Robotics and Virtual Reality for Cultural Heritage Digitization and Fruition
NASA Astrophysics Data System (ADS)
Calisi, D.; Cottefoglie, F.; D'Agostini, L.; Giannone, F.; Nenci, F.; Salonia, P.; Zaratti, M.; Ziparo, V. A.
2017-05-01
In this paper we present our novel approach for acquiring and managing digital models of archaeological sites, and the visualization techniques used to showcase them. In particular, we demonstrate two technologies: our robotic system for the digitization of archaeological sites (DigiRo), the result of over three years of effort by a group of cultural heritage experts, computer scientists, and roboticists, and our cloud-based archaeological information system (ARIS). Finally, we describe the viewers we developed to inspect and navigate the 3D models: a viewer for the web (ROVINA Web Viewer) and an immersive viewer for virtual reality (ROVINA VR Viewer).
Lescher, Stephanie; du Mesnil de Rochemont, Richard; Berkefeld, Joachim
2016-04-01
The introduction of the Woven EndoBridge (WEB) device increases the feasibility of endovascular treatment of wide-neck bifurcation aneurysms, with limitations given by the currently available sizes and shapes of the device. In parallel with other studies, we used the new device for selected patients who were not optimal candidates for established techniques such as neurosurgical clipping or endovascular coiling. We aimed to report the angiographic and clinical results of WEB implantations, alone or in combination with coiling or intracranial stents. We reviewed the records of n = 23 interventions in 22 patients with unruptured wide-neck intracranial aneurysms (UIA) who were assigned for aneurysm treatment with the WEB or adjunctive techniques. Interventional procedures and clinical and angiographic outcomes are reported for the periprocedural phase and at mid-term follow-up. Of the 22 included patients, six needed additional coiling, intracranial stenting, or implantation of a flow diverter. WEB implantation was technically feasible in 22 of the 23 interventions. Follow-up angiographic imaging showed total or subtotal occlusion of the aneurysm in 19 of 22 cases. Two minor recurrences remained stable over a period of 15 months. One patient with a partially thrombosed giant MCA aneurysm had a major recurrence and was retreated with a second WEB in combination with coiling. Despite unfavorable anatomic conditions, endovascular treatment of broad-based and large UIAs with the WEB and adjunctive techniques was feasible, with a low risk of complications and promising occlusion rates at mid-term follow-up.
Information system building of the urban electromagnetic environment
NASA Astrophysics Data System (ADS)
Wang, Jiechen; Rui, Yikang; Shen, Dingtao; Yu, Qing
2007-06-01
Urban electromagnetic radiation pollution has become increasingly serious; however, there is still no comprehensive, interactive system to manage, analyze, and publish the relevant information. In this study, taking the electromagnetic environment of Nanjing as an example, an information system based on WebGIS was developed using ArcIMS and JSP techniques, in order to provide services and technical support for public information queries and for the decision making of relevant departments.
bioWeb3D: an online webGL 3D data visualisation tool.
Pettit, Jean-Baptiste; Marioni, John C
2013-06-07
Data visualization is critical for interpreting biological data. However, in practice it can prove to be a bottleneck for untrained researchers; this is especially true for three-dimensional (3D) data representation. Whilst existing software can provide all the functionality necessary to represent and manipulate biological 3D datasets, very few packages are easily accessible (browser based), cross-platform, and usable by non-expert users. An online HTML5/WebGL-based 3D visualisation tool has been developed to allow biologists to quickly and easily view interactive and customizable three-dimensional representations of their data along with multiple layers of information. Using the WebGL library Three.js, written in JavaScript, bioWeb3D allows the simultaneous visualisation of multiple large datasets input via simple JSON, XML or CSV files, which can be read and analysed locally thanks to HTML5 capabilities. Using basic 3D representation techniques in a technologically innovative context, we provide a program that is not intended to compete with professional 3D representation software, but that instead enables quick and intuitive representation of reasonably large 3D datasets.
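The core rendering step in any such WebGL viewer reduces to rotating 3D points and applying a perspective divide to obtain 2D screen coordinates. A minimal sketch of that standard technique follows (in Python, with made-up viewer-distance and scale parameters; this is not bioWeb3D's actual code):

```python
import math

def project(points, yaw, pitch, viewer_dist=4.0, scale=200.0):
    """Rotate 3D points and apply a simple perspective divide to get 2D coords."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    out = []
    for x, y, z in points:
        x, z = x * cy + z * sy, -x * sy + z * cy   # rotate about the y axis
        y, z = y * cp - z * sp, y * sp + z * cp    # rotate about the x axis
        f = scale / (viewer_dist + z)              # perspective divide
        out.append((x * f, y * f))
    return out

# With no rotation, a unit step in x maps to scale/viewer_dist = 50 screen units.
print(project([(0, 0, 0), (1, 0, 0)], yaw=0.0, pitch=0.0))
```

In a real viewer this transform runs on the GPU as a matrix multiply, but the arithmetic is the same.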
Service-based analysis of biological pathways
Zheng, George; Bouguettaya, Athman
2009-01-01
Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled by leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and on service-based simulation that verifies the user's hypotheses. Conclusion Web service modeling of biological processes allows easy access to and invocation of these processes on the Web. The Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. The algorithms presented for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using the simulation algorithm that is also described in this paper. PMID:19796403
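The pathway-identification step described here is, at its core, a graph search linking process models. A generic sketch is a bounded breadth-first enumeration of simple paths; the node names and edges below are hypothetical illustrations, not PathExplorer's actual service models or algorithm:

```python
from collections import deque

def find_pathways(graph, source, target, max_len=6):
    """Enumerate simple paths linking two process nodes (BFS, bounded depth)."""
    paths, queue = [], deque([[source]])
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            paths.append(path)
            continue
        if len(path) >= max_len:
            continue
        for nxt in graph.get(path[-1], []):
            if nxt not in path:  # keep paths simple (no revisited nodes)
                queue.append(path + [nxt])
    return paths

# Hypothetical process graph: an edge means one process's output feeds the next.
graph = {"ligand": ["receptor"], "receptor": ["kinase"], "kinase": ["tf"], "tf": []}
print(find_pathways(graph, "ligand", "tf"))  # [['ligand', 'receptor', 'kinase', 'tf']]
```

Each discovered path is a candidate pathway that could then be checked by simulation, as the paper describes.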
Flipping the Physical Examination: Web-Based Instruction and Live Assessment of Bedside Technique.
Williams, Dustyn E; Thornton, John W
2016-01-01
Physicians' skill in teaching the physical examination has decreased, with newer faculty underperforming compared with their seniors. Improved methods of instruction with an emphasis on physical examinations are necessary both to improve the quality of medical education and to alleviate the teaching burden on faculty physicians. We developed a curriculum that combines web-based instruction with real-life practice and features individualized feedback. This innovative medical education model should allow the physical examination to be taught and assessed in an effective manner. The model is under study at Baton Rouge General Medical Center. Our goals are to limit faculty burden, maximize student involvement as learners and evaluators, and effectively develop students' critical skills in performing bedside assessments.
NASA Astrophysics Data System (ADS)
Deshpande, Ruchi R.; Requejo, Philip; Sutisna, Erry; Wang, Ximing; Liu, Margaret; McNitt-Gray, Sarah; Ruparel, Puja; Liu, Brent J.
2012-02-01
Patients confined to manual wheelchairs are at added risk of shoulder injury, so there is a need to develop optimal biomechanical techniques for wheelchair propulsion through movement analysis. The data collected are diverse and in need of normalization and integration. Current databases are ad hoc and do not provide flexibility, extensibility, or ease of access. The need for an efficient means to retrieve specific trial data, display it, and compare data from multiple trials is unmet because of a lack of data association and synchronicity. We propose the development of a robust web-based ePR system that will enhance workflow and facilitate efficient data management.
Improving Conceptual Design for Launch Vehicles
NASA Technical Reports Server (NTRS)
Olds, John R.
1998-01-01
This report summarizes activities performed during the second year of a three-year cooperative agreement between NASA - Langley Research Center and Georgia Tech. Year 1 of the project resulted in the creation of a new Cost and Business Assessment Model (CABAM) for estimating the economic performance of advanced reusable launch vehicles, including non-recurring costs, recurring costs, and revenue. The current year (second year) activities were focused on the evaluation of automated, collaborative design frameworks (computational architectures or computational frameworks) for automating the design process in advanced space vehicle design. Consistent with NASA's new thrust area in developing and understanding Intelligent Synthesis Environments (ISE), the goals of this year's research efforts were to develop and apply computer integration techniques and near-term computational frameworks for conducting advanced space vehicle design. NASA - Langley (VAB) has taken a lead role in developing a web-based computing architecture within which the designer can interact with disciplinary analysis tools through a flexible web interface. The advantages of this approach are: 1) flexible access to the designer interface through a simple web browser (e.g. Netscape Navigator), 2) the ability to include existing 'legacy' codes, and 3) the ability to include distributed analysis tools running on remote computers. To date, VAB's internal emphasis has been on developing this test system for the planetary entry mission under the joint Integrated Design System (IDS) program with NASA - Ames and JPL. Georgia Tech's complementary goals this year were to: 1) examine an alternate 'custom' computational architecture for the three-discipline IDS planetary entry problem to assess its advantages and disadvantages relative to the web-based approach, and 2) develop and examine a web-based interface and framework for a typical launch vehicle design problem.
Planning and Management of Real-Time Geospatial UAS Missions Within a Virtual Globe Environment
NASA Astrophysics Data System (ADS)
Nebiker, S.; Eugster, H.; Flückiger, K.; Christen, M.
2011-09-01
This paper presents the design and development of a hardware and software framework supporting all phases of typical monitoring and mapping missions with mini and micro UAVs (unmanned aerial vehicles). The developed solution combines state-of-the-art collaborative virtual globe technologies with advanced geospatial imaging techniques and wireless data link technologies supporting the combined and highly reliable transmission of digital video, high-resolution still imagery, and mission control data over extended operational ranges. The framework enables the planning, simulation, control and real-time monitoring of UAS missions in application areas such as monitoring of forest fires, agronomical research, border patrol or pipeline inspection. The geospatial components of the project are based on the virtual globe technology i3D OpenWebGlobe of the Institute of Geomatics Engineering at the University of Applied Sciences Northwestern Switzerland (FHNW). i3D OpenWebGlobe is a high-performance 3D geovisualisation engine supporting the web-based streaming of very large amounts of terrain and POI data.
NASA Astrophysics Data System (ADS)
Qin, Rufu; Lin, Liangzhao
2017-06-01
Coastal seiches have become an increasingly important issue in coastal science and present many challenges, particularly when attempting to provide warning services. This paper presents the methodologies, techniques, and integrated services adopted for the design and implementation of a Seiches Monitoring and Forecasting Integration Framework (SMAF-IF). The SMAF-IF is an integrated system combining different types of sensors and numerical models, incorporating Geographic Information System (GIS) and web techniques, and focusing on coastal seiche event detection and early warning in the North Jiangsu shoal, China. The in situ sensors perform automatic and continuous monitoring of the marine environment status, and the numerical models provide the meteorological and physical oceanographic parameter estimates. A model output processing application was developed in C# using ArcGIS Engine functions, which provides the capability to automatically generate visualization maps and warning information. Leveraging the ArcGIS Flex API and ASP.NET web services, a web-based GIS framework was designed to facilitate quasi-real-time data access, interactive visualization and analysis, and the provision of early warning services for end users. The integrated framework proposed in this study enables decision-makers and the public to respond quickly to emergency coastal seiche events and allows easy adaptation to other regional and scientific domains related to real-time monitoring and forecasting.
Terminology for Neuroscience Data Discovery: Multi-tree Syntax and Investigator-Derived Semantics
Goldberg, David H.; Grafstein, Bernice; Robert, Adrian; Gardner, Esther P.
2009-01-01
The Neuroscience Information Framework (NIF), developed for the NIH Blueprint for Neuroscience Research and available at http://nif.nih.gov and http://neurogateway.org, is built upon a set of coordinated terminology components enabling data and web-resource description and selection. Core NIF terminologies use a straightforward syntax designed for ease of use and for navigation by familiar web interfaces, and readily exportable to aid development of relational-model databases for neuroscience data sharing. Datasets, data analysis tools, web resources, and other entities are characterized by multiple descriptors, each addressing core concepts, including data type, acquisition technique, neuroanatomy, and cell class. Terms for each concept are organized in a tree structure, providing is-a and has-a relations. Broad general terms near each root span the category or concept and spawn more detailed entries for specificity. Related but distinct concepts (e.g., brain area and depth) are specified by separate trees, for easier navigation than would be required by graph representation. Semantics enabling NIF data discovery were selected at one or more workshops by investigators expert in particular systems (vision, olfaction, behavioral neuroscience, neurodevelopment), brain areas (cerebellum, thalamus, hippocampus), preparations (molluscs, fly), diseases (neurodegenerative disease), or techniques (microscopy, computation and modeling, neurogenetics). Workshop-derived integrated term lists are available Open Source at http://brainml.org; a complete list of participants is at http://brainml.org/workshops. PMID:18958630
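The parent-linked term trees described above support simple is-a navigation from a specific term up to its concept root. A sketch of that traversal is below; the terms are illustrative stand-ins, not the actual NIF vocabulary:

```python
# Hypothetical mini-tree for one NIF-style concept (acquisition technique);
# each term points at its parent, giving is-a relations from leaf to root.
parent = {
    "microscopy": "acquisition technique",
    "electron microscopy": "microscopy",
    "confocal microscopy": "microscopy",
    "patch clamp": "acquisition technique",
}

def ancestors(term):
    """Walk is-a links from a specific term up to the concept root."""
    chain = []
    while term in parent:
        term = parent[term]
        chain.append(term)
    return chain

print(ancestors("electron microscopy"))  # ['microscopy', 'acquisition technique']
```

Because each concept (e.g. brain area vs. depth) lives in its own tree, a query term expands only within its own concept, which is what makes the tree form easier to navigate than a general graph.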
Allen, Edwin B; Walls, Richard T; Reilly, Frank D
2008-02-01
This study investigated the effects of interactive instructional techniques in a web-based peripheral nervous system (PNS) component of a first year medical school human anatomy course. Existing data from 9 years of instruction involving 856 students were used to determine (1) the effect of web-based interactive instructional techniques on written exam item performance and (2) differences between student opinions of the benefit level of five different types of interactive learning objects used. The interactive learning objects included Patient Case studies, review Games, Simulated Interactive Patients (SIP), Flashcards, and unit Quizzes. Exam item analysis scores were found to be significantly higher (p < 0.05) for students receiving the instructional treatment incorporating the web-based interactive learning objects than for students not receiving this treatment. Questionnaires using a five-point Likert scale were analysed to determine student opinion ratings of the interactive learning objects. Students reported favorably on the benefit level of all learning objects. Students rated the benefit level of the Simulated Interactive Patients (SIP) highest, and this rating was significantly higher (p < 0.05) than all other learning objects. This study suggests that web-based interactive instructional techniques improve student exam performance. Students indicated a strong acceptance of Simulated Interactive Patient learning objects.
Process Development for the Design and Manufacturing of Personalizable Mouth Sticks.
Berger, Veronika M; Pölzer, Stephan; Nussbaum, Gerhard; Ernst, Waltraud; Major, Zoltan
2017-01-01
To increase the independence of people with reduced hand/arm functionality, a process to generate personalizable mouth sticks was developed based on the participatory design principle. In a web tool, anybody can choose the geometry and the materials of their mouth piece, stick and tip. Manufacturing techniques (e.g. 3D printing) and materials used in the process are discussed and evaluated.
2012-01-01
This paper presents the rationale and methods for a randomized controlled evaluation of web-based training in motivational interviewing, goal setting, and behavioral task assignment. Web-based training may be a practical and cost-effective way to address the need for large-scale mental health training in evidence-based practice; however, there is a dearth of well-controlled outcome studies of these approaches. For the current trial, 168 mental health providers treating post-traumatic stress disorder (PTSD) were assigned to web-based training plus supervision, web-based training, or training-as-usual (control). A novel standardized patient (SP) assessment was developed and implemented for objective measurement of changes in clinical skills, while on-line self-report measures were used for assessing changes in knowledge, perceived self-efficacy, and practice related to cognitive behavioral therapy (CBT) techniques. Eligible participants were all actively involved in mental health treatment of veterans with PTSD. Study methodology illustrates ways of developing training content, recruiting participants, and assessing knowledge, perceived self-efficacy, and competency-based outcomes, and demonstrates the feasibility of conducting prospective studies of training efficacy or effectiveness in large healthcare systems. PMID:22583520
A new plastic surgical technique for adult congenital webbed penis
Chen, Yue-bing; Ding, Xian-fan; Luo, Chong; Yu, Shi-cheng; Yu, Yan-lan; Chen, Bi-de; Zhang, Zhi-gen; Li, Gong-hui
2012-01-01
Objective: To introduce a novel surgical technique for correction of adult congenital webbed penis. Methods: From March 2010 to December 2011, 12 patients (age range: 14–23 years old) were diagnosed as having a webbed penis and underwent a new surgical procedure designed by us. Results: All cases were treated successfully without severe complication. The operation time ranged from 20 min to 1 h. The average bleeding volume was less than 50 ml. All patients achieved satisfactory cosmetic results after surgery. The penile curvature disappeared in all cases and all patients remained well after 1 to 3 months of follow-up. Conclusions: Adult webbed penis with complaints of discomfort or psychological pressure due to a poor profile should be indicators for surgery. Good corrective surgery should expose the glans and coronal sulcus, match the penile skin length to the penile shaft length dorsally and ventrally, and provide a normal penoscrotal junction. Our new technique is a safe and effective method for the correction of adult webbed penis, which produces satisfactory results. PMID:22949367
An Exploratory Study of User Searching of the World Wide Web: A Holistic Approach.
ERIC Educational Resources Information Center
Wang, Peiling; Tenopir, Carol; Laymman, Elizabeth; Penniman, David; Collins, Shawn
1998-01-01
Examines Web users' behaviors and needs and tests a methodology for studying users' interaction with the Web. A process-tracing technique, together with tests of cognitive style, anxiety levels, and self-report computer experience, provided data on how users interact with the Web in the process of finding factual information. (Author/AEF)
ERIC Educational Resources Information Center
Gilstrap, Donald L.
1998-01-01
Explains how to build World Wide Web home pages using frames-based HTML so that librarians can manage Web-based information and improve their home pages. Provides descriptions and 15 examples for writing frames-HTML code, including advanced concepts and additional techniques for home-page design. (Author/LRW)
An Empirical Comparison of Visualization Tools To Assist Information Retrieval on the Web.
ERIC Educational Resources Information Center
Heo, Misook; Hirtle, Stephen C.
2001-01-01
Discusses problems with navigation in hypertext systems, including cognitive overload, and describes a study that tested information visualization techniques to see which best represented the underlying structure of Web space. Considers the effects of visualization techniques on user performance on information searching tasks and the effects of…
ERIC Educational Resources Information Center
Chantoem, Rewadee; Rattanavich, Saowalak
2016-01-01
This research compares the English language achievements of vocational students, their reading and writing abilities, and their attitudes towards learning English taught with just-in-time teaching techniques through web technologies and conventional methods. The experimental and control groups were formed, a randomized true control group…
Pastorello, Gilberto Z.; Sanchez-Azofeifa, G. Arturo; Nascimento, Mario A.
2011-01-01
Ecosystem monitoring is essential to properly understand ecosystem development and the effects of events, both climatological and anthropogenic in nature. The amount of data used in these assessments is increasing at very high rates, owing to the increasing availability of sensing systems and the development of new techniques to analyze sensor data. The Enviro-Net Project encompasses several such sensor system deployments across five countries in the Americas. These deployments use a few different ground-based sensor systems, installed at different heights, monitoring the conditions in tropical dry forests over long periods of time. This paper presents our experience in deploying and maintaining these systems and retrieving and pre-processing the data, and describes the Web portal developed to help with data management, visualization, and analysis. PMID:22163965
A Dynamic Recommender System for Improved Web Usage Mining and CRM Using Swarm Intelligence.
Alphy, Anna; Prabakaran, S
2015-01-01
To enrich e-business, modern websites are personalized for each user by understanding their interests and behavior. The main challenges with online usage data are information overload and its dynamic nature. To address these issues, this paper proposes WebBluegillRecom-annealing, a dynamic recommender system that uses web usage mining techniques in tandem with software agents to provide users with dynamic recommendations that can be used for customizing a website. The proposed recommender uses swarm intelligence derived from the foraging behavior of the bluegill fish, and it overcomes information overload by handling the dynamic behavior of users. Our dynamic recommender system was compared against traditional collaborative filtering systems. The results show that the proposed system has higher precision, coverage, F1 measure, and scalability than the traditional collaborative filtering systems. Moreover, the recommendations given by our system overcome the overspecialization problem by including variety in the recommendations.
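The evaluation metrics reported here (precision, coverage, F1) are standard top-N recommendation measures. A minimal sketch of how they are computed against a set of truly relevant items follows (generic textbook formulas, with recall standing in for the paper's coverage measure, whose exact definition the abstract does not give):

```python
def precision_recall_f1(recommended, relevant):
    """Standard top-N quality metrics against the set of truly relevant items."""
    hits = len(set(recommended) & set(relevant))
    precision = hits / len(recommended) if recommended else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    f1 = 2 * precision * recall / (precision + recall) if hits else 0.0
    return precision, recall, f1

# 2 of 4 recommendations hit the 3 relevant items:
# precision 0.5, recall ~0.667, F1 ~0.571
p, r, f1 = precision_recall_f1(["a", "b", "c", "d"], {"a", "c", "e"})
print(p, r, f1)
```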
A web-based instruction module for interpretation of craniofacial cone beam CT anatomy.
Hassan, B A; Jacobs, R; Scarfe, W C; Al-Rawi, W T
2007-09-01
To develop a web-based module for learner instruction in the interpretation and recognition of osseous anatomy on craniofacial cone-beam CT (CBCT) images. Volumetric datasets from three CBCT systems were acquired (i-CAT, NewTom 3G and AccuiTomo FPD) for various subjects using equipment-specific scanning protocols. The datasets were processed using multiple software packages to provide two-dimensional (2D) multiplanar reformatted (MPR) images (e.g. sagittal, coronal and axial) and three-dimensional (3D) visual representations (e.g. maximum intensity projection, minimum intensity projection, ray sum, surface and volume rendering). Distinct didactic modules, which illustrate the principles of CBCT systems, guided navigation of the volumetric dataset, and anatomic correlation of 3D models and 2D MPR graphics, were developed using a hybrid combination of web authoring and image analysis techniques. Interactive web multimedia instruction was facilitated by the use of dynamic highlighting and labelling and rendered video illustrations, supplemented with didactic textual material. HTML coding and Java scripting were heavily used to blend the educational modules. An interactive, multimedia educational tool for visualizing the morphology and interrelationships of osseous craniofacial anatomy, as depicted on CBCT MPR and 3D images, was designed and implemented. The present design of a web-based instruction module may assist radiologists and clinicians in learning how to recognize and interpret the craniofacial anatomy of CBCT-based images more efficiently.
NASA Astrophysics Data System (ADS)
Sukariasih, Luh
2017-05-01
This study aims to produce integrated natural science (IPA) teaching materials of the webbed type, in handout form, that are suitable for use in integrated science teaching. This is a research and development (R&D) study following the 4D development model (define, design, develop, and disseminate). The data analysis technique used was quantitative descriptive analysis on a predetermined five-category grading scale, applied to assessments by expert validators and to assessments by teachers and learners during a limited trial (12 students of class VIII at SMPN 10 Kendari). The material feasibility test received validator ratings in the "very good" and "good" categories; the presentation feasibility test received "good" and "excellent" ratings; the graphic feasibility test received "very good" and "good" ratings; and the word-use and language feasibility test received "very good" and "good" ratings. With these qualifications, the integrated webbed-type IPA teaching materials, which apply discourse analysis to the theme of energy and food for Junior High School (SMP) grade VIII, are suitable as teaching materials. In the limited trial, the science teacher at SMPN 10 Kendari rated the product "excellent", and most of the 12 class VIII students at SMPN 10 Kendari gave scores in the "very good" category, so the integrated webbed-type natural science (IPA) teaching materials, applying discourse analysis to the theme of energy and food for SMP class VIII, are fit for use as teaching material.
Cooperative Learning Environment with the Web 2.0 Tool E-Portfolios
ERIC Educational Resources Information Center
Or Kan, Soh
2011-01-01
In recent years, the development of information and communication technology (ICT) in the world and Malaysia namely has created a significant impact on the methods of communicating information and knowledge to the learners and consequently, innovative teaching techniques have evolved to change the ways teachers teach and the ways students learn.…
ERIC Educational Resources Information Center
Chang, Ray I.; Hung, Yu Hsin; Lin, Chun Fu
2015-01-01
With the rapid development of web techniques, information and communication technology is being increasingly used in curricula, and learning portfolios can be automatically retrieved and maintained as learners interact through e-learning platforms. Further, massive open online courses (MOOCs), which apply such technology to provide open access to…
Improving Marine Ecosystem Models with Biochemical Tracers
NASA Astrophysics Data System (ADS)
Pethybridge, Heidi R.; Choy, C. Anela; Polovina, Jeffrey J.; Fulton, Elizabeth A.
2018-01-01
Empirical data on food web dynamics and predator-prey interactions underpin ecosystem models, which are increasingly used to support strategic management of marine resources. These data have traditionally derived from stomach content analysis, but new and complementary forms of ecological data are increasingly available from biochemical tracer techniques. Extensive opportunities exist to improve the empirical robustness of ecosystem models through the incorporation of biochemical tracer data and derived indices, an area that is rapidly expanding because of advances in analytical developments and sophisticated statistical techniques. Here, we explore the trophic information required by ecosystem model frameworks (species, individual, and size based) and match them to the most commonly used biochemical tracers (bulk tissue and compound-specific stable isotopes, fatty acids, and trace elements). Key quantitative parameters derived from biochemical tracers include estimates of diet composition, niche width, and trophic position. Biochemical tracers also provide powerful insight into the spatial and temporal variability of food web structure and the characterization of dominant basal and microbial food web groups. A major challenge in incorporating biochemical tracer data into ecosystem models is scale and data type mismatches, which can be overcome with greater knowledge exchange and numerical approaches that transform, integrate, and visualize data.
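One of the key tracer-derived parameters named above, trophic position from bulk nitrogen stable isotopes, is conventionally estimated as TP = TP_base + (δ15N_consumer − δ15N_base) / TEF, with a trophic enrichment factor (TEF) near 3.4‰ per step. A sketch with illustrative values (not data from the paper):

```python
def trophic_position(d15n_consumer, d15n_base, tp_base=2.0, tef=3.4):
    """Trophic position from bulk nitrogen stable isotopes.

    TP = TP_base + (d15N_consumer - d15N_base) / TEF, where TEF is the
    per-step trophic enrichment factor (commonly taken as ~3.4 permil)
    and TP_base is the trophic level of the isotopic baseline organism.
    """
    return tp_base + (d15n_consumer - d15n_base) / tef

# A consumer enriched 6.8 permil over a primary-consumer baseline sits
# two trophic steps higher, i.e. near trophic position 4.
print(trophic_position(14.2, 7.4))
```

Estimates like this are what ecosystem models can ingest directly as diet and trophic-level constraints.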
The Laugh Model: Reframing and Rebranding Public Health Through Social Media.
Lister, Cameron; Royne, Marla; Payne, Hannah E; Cannon, Ben; Hanson, Carl; Barnes, Michael
2015-11-01
We examined the use of low-cost social media platforms in communicating public health messages and outline the laugh model, a framework through which public health organizations can reach and engage communities. In August 2014, we developed an online campaign (Web site and social media) to help promote healthy family meals in Utah in conjunction with the state and local health departments. By the end of September 2014, a total of 3641 individuals had visited the Utahfamilymeals.org Web site. Facebook ads reached a total of 29 078 people, and 56 900 people were reached through Twitter ads. The per-person price of the campaign was 0.2 cents, and the total estimated target population reach was between 10% and 12%. There are 3 key takeaways from our campaign: use of empowering and engaging techniques may be more effective than use of educational techniques; use of social media Web sites and online marketing tactics can enhance collaboration, interdisciplinary strategies, and campaign effectiveness; and use of social media as a communication platform is often preferable to use of mass media in terms of cost-effectiveness, more precise evaluations of campaign success, and increased sustainability.
The Use of NASA near Real-time and Archived Satellite Data to Support Disaster Assessment
NASA Technical Reports Server (NTRS)
McGrath, Kevin M.; Molthan, Andrew; Burks, Jason
2014-01-01
With support from NASA's Applied Sciences Program, the Short-term Prediction Research and Transition (SPoRT) Center has explored a variety of techniques for utilizing archived and near real-time NASA satellite data to support disaster assessment activities. MODIS data from the NASA Land Atmosphere Near Real-time Capability for EOS currently provide true color and other imagery for assessment and potential applications including, but not limited to, flooding, fires, and tornadoes. In May 2013, the SPoRT Center developed unique power outage composites using the VIIRS Day/Night Band to provide the first clear-sky view of the damage inflicted upon Moore and Oklahoma City, Oklahoma following the devastating EF-5 tornado that occurred on May 20. Pre-event imagery provided by the NASA-funded Web-Enabled Landsat Data project offers a basis of comparison for monitoring post-disaster recovery efforts. Techniques have also been developed to generate products from higher resolution imagery from the recently available International Space Station SERVIR Environmental Research and Visualization System instrument. Of paramount importance is delivering these products to end users expeditiously and in formats compatible with Decision Support Systems (DSS). Delivery techniques include a Tile Map Service (TMS) and a Web Mapping Service (WMS). These mechanisms allow easy integration of satellite products into DSSs, including the National Weather Service's Damage Assessment Toolkit for use by personnel conducting damage surveys. This poster will present an overview of the developed techniques and products and compare the strengths and weaknesses of the TMS and WMS.
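Tile map services of the kind described above rely on standard Web Mercator ("slippy map") tile indexing. A sketch of the common XYZ variant is below; the TMS scheme proper differs only in flipping the row index (y_tms = 2^zoom − 1 − y):

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """XYZ (slippy map) tile indices in Web Mercator for a lon/lat point."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

print(lonlat_to_tile(0.0, 0.0, 1))  # -> (1, 1)
```

A client requests the tile image at a URL built from these indices (typically .../zoom/x/y.png), which is what makes integration into a decision support system's map display straightforward.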
ERIC Educational Resources Information Center
Henry, Anna E.; Story, Mary
2009-01-01
Objective: To identify food and beverage brand Web sites featuring designated children's areas, assess marketing techniques present on those industry Web sites, and determine nutritional quality of branded food items marketed to children. Design: Systematic content analysis of food and beverage brand Web sites and nutrient analysis of food and…
CH5M3D: an HTML5 program for creating 3D molecular structures.
Earley, Clarke W
2013-11-18
While a number of programs and web-based applications are available for the interactive display of 3-dimensional molecular structures, few of these provide the ability to edit these structures. For this reason, we have developed a library written in JavaScript to allow for the simple creation of web-based applications that should run on any browser capable of rendering HTML5 web pages. While our primary interest in developing this application was for educational use, it may also prove useful to researchers who want a light-weight application for viewing and editing small molecular structures. Molecular compounds are drawn on the HTML5 Canvas element, with the JavaScript code making use of standard techniques to allow display of three-dimensional structures on a two-dimensional canvas. Information about the structure (bond lengths, bond angles, and dihedral angles) can be obtained using a mouse or other pointing device. Both atoms and bonds can be added or deleted, and rotation about bonds is allowed. Routines are provided to read structures either from the web server or from the user's computer, and creation of galleries of structures can be accomplished with only a few lines of code. Documentation and examples are provided to demonstrate how users can access all of the molecular information for creation of web pages with more advanced features. A light-weight (≈ 75 kb) JavaScript library has been made available that allows for the simple creation of web pages containing interactive 3-dimensional molecular structures. Although this library is designed to create web pages, a web server is not required. Installation on a web server is straightforward and does not require any server-side modules or special permissions. The ch5m3d.js library has been released under the GNU GPL version 3 open-source license and is available from http://sourceforge.net/projects/ch5m3d/.
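The "standard techniques" for displaying a 3D structure on a 2D canvas and reporting bond lengths and angles can be sketched as follows. This is a generic illustration in JavaScript (the library's own language), not the actual ch5m3d.js API:

```javascript
// Rotate a 3D point about the y-axis (the usual response to mouse drag).
function rotateY(p, angle) {
  const c = Math.cos(angle), s = Math.sin(angle);
  return { x: c * p.x + s * p.z, y: p.y, z: -s * p.x + c * p.z };
}

// Orthographic projection: drop depth, map to canvas pixel coordinates.
function project(p, scale, cx, cy) {
  return { sx: cx + scale * p.x, sy: cy - scale * p.y };
}

// Distance between two atoms.
function bondLength(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Angle at atom b (degrees), from the dot product of vectors b->a and b->c.
function bondAngle(a, b, c) {
  const u = { x: a.x - b.x, y: a.y - b.y, z: a.z - b.z };
  const v = { x: c.x - b.x, y: c.y - b.y, z: c.z - b.z };
  const dot = u.x * v.x + u.y * v.y + u.z * v.z;
  const cos = dot / (Math.hypot(u.x, u.y, u.z) * Math.hypot(v.x, v.y, v.z));
  return (Math.acos(cos) * 180) / Math.PI;
}
```

In a browser, the projected `sx`/`sy` values would feed `CanvasRenderingContext2D` drawing calls, with atoms sorted by depth so nearer atoms are painted last.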
Userscripts for the life sciences.
Willighagen, Egon L; O'Boyle, Noel M; Gopalakrishnan, Harini; Jiao, Dazhi; Guha, Rajarshi; Steinbeck, Christoph; Wild, David J
2007-12-21
The web has seen an explosion of chemistry- and biology-related resources in the last 15 years: thousands of scientific journals, databases, wikis, blogs and resources are available with a wide variety of types of information. There is a huge need to aggregate and organise this information. However, the sheer number of resources makes it unrealistic to link them all in a centralised manner. Instead, search engines to find information in those resources flourish, and formal languages like Resource Description Framework and Web Ontology Language are increasingly used to allow linking of resources. A recent development is the use of userscripts to change the appearance of web pages, by on-the-fly modification of the web content. This opens possibilities to aggregate information and computational results from different web resources into the web page of one of those resources. Several userscripts are presented that enrich biology- and chemistry-related web resources by incorporating or linking to other computational or data sources on the web. The scripts make use of Greasemonkey-like plugins for web browsers and are written in JavaScript. Information from third-party resources is extracted using open Application Programming Interfaces, while common Universal Resource Locator schemes are used to make deep links to related information in that external resource. The userscripts presented here use a variety of techniques and resources, and show the potential of such scripts. This paper discusses a number of userscripts that aggregate information from two or more web resources. Examples are shown that enrich web pages with information from other resources, and show how information from web pages can be used to link to, search, and process information in other resources. Due to the nature of userscripts, scientists are able to select those scripts they find useful on a daily basis, as the scripts run directly in their own web browser rather than on the web server.
This flexibility allows the scientists to tune the features of web resources to optimise their productivity.
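The identifier-extraction and deep-linking pattern the abstract describes can be sketched outside the browser as plain functions. The CID regex and the exact scan logic below are illustrative assumptions, not code from the published scripts:

```javascript
// Scan page text for PubChem compound identifiers written as "CID 1234"
// or "CID:1234". The pattern is an illustrative assumption.
function findPubChemCids(text) {
  const matches = text.match(/\bCID[ :]?(\d+)\b/g) || [];
  return matches.map((m) => m.replace(/\D/g, ''));
}

// Build a deep link using PubChem's compound URL scheme.
function pubChemCompoundUrl(cid) {
  return `https://pubchem.ncbi.nlm.nih.gov/compound/${cid}`;
}
```

In an actual userscript, these functions would sit under a Greasemonkey `// ==UserScript==` metadata block, and the resulting URLs would be injected into the page as anchor elements via the DOM.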
Usage and applications of Semantic Web techniques and technologies to support chemistry research.
Borkum, Mark I; Frey, Jeremy G
2014-01-01
The drug discovery process is now highly dependent on the management, curation and integration of large amounts of potentially useful data. Semantics are necessary in order to interpret the information and derive knowledge. Advances in recent years have mitigated concerns that the lack of robust, usable tools has inhibited the adoption of methodologies based on semantics. This paper presents three examples of how Semantic Web techniques and technologies can be used in order to support chemistry research: a controlled vocabulary for quantities, units and symbols in physical chemistry; a controlled vocabulary for the classification and labelling of chemical substances and mixtures; and a database of chemical identifiers. This paper also presents a Web-based service that uses the datasets in order to assist with the completion of risk assessment forms, along with a discussion of the legal implications and value-proposition for the use of such a service. We have introduced the Semantic Web concepts, technologies, and methodologies that can be used to support chemistry research, and have demonstrated the application of those techniques in three areas very relevant to modern chemistry research, generating three new datasets that we offer as exemplars of an extensible portfolio of advanced data integration facilities. We have thereby established the importance of Semantic Web techniques and technologies for meeting Wild's fourth "grand challenge".
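The triple-based data model underlying such Semantic Web techniques can be made concrete with a minimal in-memory store. Everything below — the class, the `ex:` subjects and predicates — is a hypothetical sketch of the RDF idea, not any tool from the paper:

```javascript
// Minimal in-memory RDF-style triple store with wildcard pattern matching.
// null acts as a variable, like in a SPARQL basic graph pattern.
class TripleStore {
  constructor() { this.triples = []; }
  add(s, p, o) { this.triples.push({ s, p, o }); }
  query(s, p, o) {
    return this.triples.filter(
      (t) => (s === null || t.s === s) &&
             (p === null || t.p === p) &&
             (o === null || t.o === o)
    );
  }
}

// Hypothetical chemistry facts expressed as subject-predicate-object triples.
const store = new TripleStore();
store.add('ex:benzene', 'ex:hasFormula', 'C6H6');
store.add('ex:benzene', 'ex:hazardClass', 'ex:Flammable');
store.add('ex:ethanol', 'ex:hasFormula', 'C2H6O');
```

Integration falls out of the model: a risk-assessment form filler, for instance, only needs to query `(substance, ex:hazardClass, null)` regardless of which dataset contributed the triple.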
Pitfalls in Persuasion: How Do Users Experience Persuasive Techniques in a Web Service?
NASA Astrophysics Data System (ADS)
Segerståhl, Katarina; Kotro, Tanja; Väänänen-Vainio-Mattila, Kaisa
Persuasive technologies are designed by utilizing a variety of interactive techniques that are believed to promote target behaviors. This paper describes a field study in which the aim was to discover possible pitfalls of persuasion, i.e., situations in which persuasive techniques do not function as expected. The study investigated persuasive functionality of a web service targeting weight loss. A qualitative online questionnaire was distributed through the web service and a total of 291 responses were extracted for interpretative analysis. The Persuasive Systems Design model (PSD) was used for supporting systematic analysis of persuasive functionality. Pitfalls were identified through situations that evoked negative user experiences. The primary pitfalls discovered were associated with manual logging of eating and exercise behaviors, appropriateness of suggestions and source credibility issues related to social facilitation. These pitfalls, when recognized, can be addressed in design by applying functional and facilitative persuasive techniques in meaningful combinations.
Design, Development and Testing of Web Services for Multi-Sensor Snow Cover Mapping
NASA Astrophysics Data System (ADS)
Kadlec, Jiri
This dissertation presents the design, development and validation of new data integration methods for mapping the extent of snow cover based on open-access ground station measurements, remote sensing images, volunteer observer snow reports, and cross-country ski track recordings from location-enabled mobile devices. The first step of the data integration procedure includes data discovery, data retrieval, and data quality control of snow observations at ground stations. The WaterML R package developed in this work enables hydrologists to retrieve and analyze data from multiple organizations that are listed in the Consortium of Universities for the Advancement of Hydrologic Sciences Inc (CUAHSI) Water Data Center catalog directly within the R statistical software environment. Use of the WaterML R package is demonstrated by running an energy-balance snowpack model in R with data inputs from CUAHSI, and by automating uploads of real-time sensor observations to CUAHSI HydroServer. The second step of the procedure requires efficient access to multi-temporal remote sensing snow images. The Snow Inspector web application developed in this research enables users to retrieve a time series of fractional snow cover from the Moderate Resolution Imaging Spectroradiometer (MODIS) for any point on Earth. The time series retrieval method is based on automated data extraction from tile images provided by a Web Map Tile Service (WMTS). The average required time for retrieving 100 days of data using this technique is 5.4 seconds, which is significantly faster than other methods that require the download of large satellite image files. The presented data extraction technique and space-time visualization user interface can be used as a model for working with other multi-temporal hydrologic or climate data WMTS services. The third and final step of the data integration procedure is generating continuous daily snow cover maps.
A custom inverse distance weighting method has been developed to combine volunteer snow reports, cross-country ski track reports and station measurements to fill cloud gaps in the MODIS snow cover product. The method is demonstrated by producing a continuous daily time step snow presence probability map dataset for the Czech Republic region. The ability of the presented methodology to reconstruct MODIS snow cover under cloud is validated by simulating cloud cover datasets and comparing estimated snow cover to actual MODIS snow cover. The percent correctly classified indicator showed accuracy between 80 and 90% using this method. Using crowdsourcing data (volunteer snow reports and ski tracks) improves the map accuracy by 0.7–1.2%. The output snow probability map data sets are published online using web applications and web services. Keywords: crowdsourcing, image analysis, interpolation, MODIS, R statistical software, snow cover, snowpack probability, Tethys platform, time series, WaterML, web services, winter sports.
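The gap-filling step rests on inverse distance weighting. A textbook IDW sketch — the standard formulation, not the custom variant developed in the dissertation — looks like:

```javascript
// Inverse distance weighting: estimate a value at (x, y) from scattered
// observations, weighting each by 1 / distance^power.
function idw(points, x, y, power = 2) {
  let num = 0, den = 0;
  for (const p of points) {
    const d = Math.hypot(p.x - x, p.y - y);
    if (d === 0) return p.value; // exact hit on an observation point
    const w = 1 / Math.pow(d, power);
    num += w * p.value;
    den += w;
  }
  return num / den;
}
```

Applied to the snow problem, `points` would be snow presence probabilities from stations, volunteer reports and ski tracks near a cloud-obscured MODIS pixel, and the interpolated value would fill that pixel.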
Treatment of Wide-Neck Bifurcation Aneurysm Using "WEB Device Waffle Cone Technique".
Mihalea, Cristian; Caroff, Jildaz; Rouchaud, Aymeric; Pescariu, Sorin; Moret, Jacques; Spelle, Laurent
2018-05-01
The endovascular treatment of wide-neck bifurcation aneurysms can be challenging and often requires the use of adjunctive techniques and devices. We report our first experience of using a waffle-cone technique adapted to the Woven EndoBridge (WEB) device in a large-neck basilar tip aneurysm, suitable in cases where the use of Y stenting or other techniques is limited due to anatomic restrictions. The procedure was completed, and angiographic occlusion of the aneurysm was confirmed 24 hours post-treatment by digital subtraction angiography. No complications occurred. The case reported here was not suitable for Y stenting or deployment of the WEB device alone, due to the small caliber of both posterior cerebral arteries and their origin at the neck level. The main advantage of this technique is that both devices have a controlled detachment system and are fully independent. To our knowledge, this technique has not been reported previously and this modality of treatment has never been described in the literature. Copyright © 2018 Elsevier Inc. All rights reserved.
Little Boy Blue Goes High-Tech: Providing Customers with Topic-Driven Content
ERIC Educational Resources Information Center
King, David
2005-01-01
In an attempt to make the Kansas City Public Library's Web site more user friendly, the Web team took an opportunity to completely redesign the site. This article describes the techniques that the team used to organize and design the new Web site. By adopting a guided approach to the Internet, they were able to streamline their Web links and…
O'Brien, Nicola; Heaven, Ben; Teal, Gemma; Evans, Elizabeth H; Cleland, Claire; Moffatt, Suzanne; Sniehotta, Falko F; White, Martin; Mathers, John C
2016-01-01
Background Integrating stakeholder involvement in complex health intervention design maximizes acceptability and potential effectiveness. However, there is little methodological guidance about how to integrate evidence systematically from various sources in this process. Scientific evidence derived from different approaches can be difficult to integrate and the problem is compounded when attempting to include diverse, subjective input from stakeholders. Objective The intent of the study was to describe and appraise a systematic, sequential approach to integrate scientific evidence, expert knowledge and experience, and stakeholder involvement in the co-design and development of a complex health intervention. The development of a Web-based lifestyle intervention for people in retirement is used as an example. Methods Evidence from three systematic reviews, qualitative research findings, and expert knowledge was compiled to produce evidence statements (stage 1). Face validity of these statements was assessed by key stakeholders in a co-design workshop resulting in a set of intervention principles (stage 2). These principles were assessed for face validity in a second workshop, resulting in core intervention concepts and hand-drawn prototypes (stage 3). The outputs from stages 1-3 were translated into a design brief and specification (stage 4), which guided the building of a functioning prototype, Web-based intervention (stage 5). This prototype was de-risked resulting in an optimized functioning prototype (stage 6), which was subject to iterative testing and optimization (stage 7), prior to formal pilot evaluation. 
Results The evidence statements (stage 1) highlighted the effectiveness of physical activity, dietary and social role interventions in retirement; the idiosyncratic nature of retirement and well-being; the value of using specific behavior change techniques including those derived from the Health Action Process Approach; and the need for signposting to local resources. The intervention principles (stage 2) included the need to facilitate self-reflection on available resources, personalization, and promotion of links between key lifestyle behaviors. The core concepts and hand-drawn prototypes (stage 3) had embedded in them the importance of time use and work exit planning, personalized goal setting, and acceptance of a Web-based intervention. The design brief detailed the features and modules required (stage 4), guiding the development of wireframes, module content and functionality, virtual mentors, and intervention branding (stage 5). Following an iterative process of intervention testing and optimization (stage 6), the final Web-based intervention prototype of LEAP (Living, Eating, Activity, and Planning in retirement) was produced (stage 7). The approach was resource intensive and required a multidisciplinary team. The design expert made an invaluable contribution throughout the process. Conclusions Our sequential approach fills an important methodological gap in the literature, describing the stages and techniques useful in developing an evidence-based complex health intervention. The systematic and rigorous integration of scientific evidence, expert knowledge and experience, and stakeholder input has resulted in an intervention likely to be acceptable and feasible. PMID:27489143
Effectiveness of Web-Based Psychological Interventions for Depression: A Meta-Analysis
ERIC Educational Resources Information Center
Cowpertwait, Louise; Clarke, Dave
2013-01-01
Web-based psychological interventions aim to make psychological treatments more accessible and minimize clinician input, but their effectiveness requires further examination. The purposes of the present study are to evaluate the outcomes of web-based interventions for treating depressed adults using meta-analytic techniques, and to examine…
Automatic Semantic Generation and Arabic Translation of Mathematical Expressions on the Web
ERIC Educational Resources Information Center
Doush, Iyad Abu; Al-Bdarneh, Sondos
2013-01-01
Automatic processing of mathematical information on the web imposes some difficulties. This paper presents a novel technique for automatic generation of mathematical equations semantic and Arabic translation on the web. The proposed system facilitates unambiguous representation of mathematical equations by correlating equations to their known…
Academic Departments and Student Attitudes toward Different Dimensions of Web-based Education.
ERIC Educational Resources Information Center
Federico, Pat-Anthony
2001-01-01
Describes research at the Naval Postgraduate School that investigated student attitudes toward various aspects of Web-based instruction. Results of a survey, which were analyzed using a variety of multivariate and univariate statistical techniques, showed significantly different attitudes toward different dimensions of Web-based education…
Geomorphology and the World Wide Web
NASA Astrophysics Data System (ADS)
Shroder, John F.; Bishop, Michael P.; Olsenholler, Jeffrey; Craiger, J. Philip
2002-10-01
The Internet and the World Wide Web have brought many dimensions of new technology to education and research in geomorphology. As with other disciplines on the Web, Web-based geomorphology has become an eclectic mix of whatever material an individual deems worthy of presentation, and in many cases is without quality control. Nevertheless, new electronic media can facilitate education and research in geomorphology. For example, virtual field trips can be developed and accessed to reinforce concepts in class. Techniques for evaluating Internet references help students to write traditional term papers, but professional presentations can also involve student papers that are published on the Web. Faculty can also address plagiarism issues by using search engines. Because of the lack of peer review of much of the content on the Web, care must be exercised in using it for reference searches. Today, however, refereed journals are going online and can be accessed through subscription or payment per article viewed. Library reference desks regularly use the Web for searches of refereed articles. Research on the Web ranges from communication between investigators, data acquisition, scientific visualization, and comprehensive searches of refereed sources to interactive analyses of remote data sets. The Nanga Parbat and the Global Land Ice Measurements from Space (GLIMS) Projects are two examples of geomorphologic research that are achieving full potential through use of the Web. Teaching and research in geomorphology are undergoing a beneficial, but sometimes problematic, transition with the new technology. The learning curve is steep for some users but the view from the top is bright. Geomorphology can only prosper from the benefits offered by computer technologies.
NASA Astrophysics Data System (ADS)
Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.
2007-12-01
NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable both to traditional research scientists and to the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, and reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES & LP DAAC data through OGC protocols. This functionality encourages interdisciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS) at GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and in the development of data analysis web applications.
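A WMS request is ultimately just a key-value URL. As a sketch — the endpoint and layer name below are hypothetical, while the query parameter names follow the OGC WMS 1.1.1 specification:

```javascript
// Build an OGC WMS 1.1.1 GetMap request URL.
function wmsGetMapUrl(endpoint, { layers, bbox, width, height, format, srs }) {
  const params = new URLSearchParams({
    SERVICE: 'WMS',
    VERSION: '1.1.1',
    REQUEST: 'GetMap',
    LAYERS: layers,
    SRS: srs,
    BBOX: bbox.join(','),   // minx,miny,maxx,maxy
    WIDTH: String(width),
    HEIGHT: String(height),
    FORMAT: format,
  });
  return `${endpoint}?${params.toString()}`;
}

// Hypothetical aerosol layer over the continental United States.
const url = wmsGetMapUrl('https://example.gov/wms', {
  layers: 'MODIS_AOD',
  bbox: [-125, 24, -66, 50],
  width: 1024,
  height: 512,
  format: 'image/png',
  srs: 'EPSG:4326',
});
```

Because the interface is standardized, the same URL pattern works against any conforming server, which is what lets clients such as DataFed ingest DAAC layers without custom adapters.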
bioWeb3D: an online webGL 3D data visualisation tool
2013-01-01
Background Data visualization is critical for interpreting biological data. However, in practice it can prove to be a bottleneck for non-trained researchers; this is especially true for three-dimensional (3D) data representation. Whilst existing software can provide all necessary functionalities to represent and manipulate biological 3D datasets, very few are easily accessible (browser-based), cross-platform and accessible to non-expert users. Results An online HTML5/WebGL-based 3D visualisation tool has been developed to allow biologists to quickly and easily view interactive and customizable three-dimensional representations of their data along with multiple layers of information. Using the WebGL library Three.js written in JavaScript, bioWeb3D allows the simultaneous visualisation of multiple large datasets inputted via a simple JSON, XML or CSV file, which can be read and analysed locally thanks to HTML5 capabilities. Conclusions Using basic 3D representation techniques in a technologically innovative context, we provide a program that is not intended to compete with professional 3D representation software, but that instead enables a quick and intuitive representation of reasonably large 3D datasets. PMID:23758781
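Turning a locally read CSV into the flat coordinate layout that WebGL vertex buffers expect can be sketched as below. The three-column `x,y,z` layout is an assumed example, not bioWeb3D's actual input schema:

```javascript
// Parse a simple CSV of x,y,z point coordinates into a flat Float32Array
// (the [x0,y0,z0, x1,y1,z1, ...] layout WebGL position buffers use).
function csvToPositions(csv) {
  const rows = csv.trim().split('\n').slice(1); // skip the header row
  const out = new Float32Array(rows.length * 3);
  rows.forEach((row, i) => {
    const [x, y, z] = row.split(',').map(Number);
    out[i * 3] = x;
    out[i * 3 + 1] = y;
    out[i * 3 + 2] = z;
  });
  return out;
}

const positions = csvToPositions('x,y,z\n0,0,0\n1,2,3\n-1,0.5,2');
```

In the browser, such an array would be handed to a Three.js `BufferGeometry` position attribute; here it simply demonstrates the client-side parsing step the abstract attributes to HTML5 capabilities.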
ERIC Educational Resources Information Center
Nagasinghe, Iranga
2010-01-01
This thesis investigates and develops a few acceleration techniques for the search engine algorithms used in PageRank and HITS computations. PageRank and HITS are two highly successful applications of modern linear algebra in computer science and engineering. They constitute essential technologies that account for the immense growth and…
Using a Video Split-Screen Technique To Evaluate Streaming Instructional Videos.
ERIC Educational Resources Information Center
Gibbs, William J.; Bernas, Ronan S.; McCann, Steven A.
The Media Center at Eastern Illinois University developed and streamed on the Internet 26 short (one to five minutes) instructional videos about WebCT that illustrated specific functions, including logging-in, changing a password, and using chat. This study observed trainees using and reacting to selections of these videos. It set out to assess…
Critical Thinking of Young Citizens towards News Headlines in Chile
ERIC Educational Resources Information Center
Vernier, Matthieu; Cárcamo, Luis; Scheihing, Eliana
2018-01-01
Strengthening the critical thinking abilities of citizens confronted with news published on the web represents a key challenge for education. Young citizens appear to be vulnerable to poor-quality news and to news containing non-explicit ideologies. In the field of data science, computational and statistical techniques have been developed to…
Digital Pedagogies for Teachers' CPD
ERIC Educational Resources Information Center
Montebello, Matthew
2017-01-01
The continuous professional development of educators is essential not only to maintain their expertise and ensure that their knowledge is up to date, but also to catch up with and adopt new pedagogical tools, skills and techniques. The advent of Web 2.0 brought about a plethora of digital tools that teachers have not only struggled…
ERIC Educational Resources Information Center
Davis, Gary Alan; Kovacs, Paul J.; Scarpino, John; Turchek, John C.
2010-01-01
The emergence of increasingly sophisticated communication technologies and the media-rich extensions of the World Wide Web have prompted universities to use alternatives to the traditional classroom teaching and learning methods. This demand for alternative delivery methods has led to the development of a wide range of eLearning techniques.…
Metadata-driven Delphi rating on the Internet.
Deshpande, Aniruddha M; Shiffman, Richard N; Nadkarni, Prakash M
2005-01-01
Paper-based data collection and analysis for consensus development is inefficient and error-prone. Computerized techniques that could improve efficiency, however, have been criticized as costly, inconvenient and difficult to use. We designed and implemented a metadata-driven Web-based Delphi rating and analysis tool, employing the flexible entity-attribute-value schema to create generic, reusable software. The software can be applied to various domains by altering the metadata; the programming code remains intact. This approach greatly reduces the marginal cost of re-using the software. We implemented our software to prepare for the Conference on Guidelines Standardization. Twenty-three invited experts completed the first round of the Delphi rating on the Web. For each participant, the software generated individualized reports that described the median rating and the disagreement index (calculated from the Interpercentile Range Adjusted for Symmetry) as defined by the RAND/UCLA Appropriateness Method. We evaluated the software with a satisfaction survey using a five-level Likert scale. The panelists felt that Web data entry was convenient (median 4, interquartile range [IQR] 4.0-5.0), acceptable (median 4.5, IQR 4.0-5.0) and easily accessible (median 5, IQR 4.0-5.0). We conclude that Web-based Delphi rating for consensus development is a convenient and acceptable alternative to the traditional paper-based method.
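The disagreement index mentioned above can be sketched in a few lines. This follows the commonly cited RAND/UCLA formulation for a 1-9 rating scale (the 30th-70th interpercentile range divided by that range adjusted for symmetry, with an index above 1 read as disagreement); treat the constants and the percentile method as assumptions rather than the paper's exact implementation.

```python
from statistics import quantiles

def disagreement_index(ratings):
    """RAND/UCLA-style disagreement index for a 1-9 rating scale.

    IPR   = 30th-70th interpercentile range
    IPRCP = central point of the IPR
    IPRAS = 2.35 + 1.5 * |5 - IPRCP|  (IPR adjusted for symmetry)
    An index greater than 1 is read as panel disagreement.
    """
    deciles = quantiles(ratings, n=10)  # nine cut points: 10%..90%
    p30, p70 = deciles[2], deciles[6]
    ipr = p70 - p30
    ipras = 2.35 + 1.5 * abs(5 - (p30 + p70) / 2)
    return ipr / ipras
```

A unanimous panel yields an index of 0, while a panel split between the extremes of the scale yields an index well above 1.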
Bozzetto Ambrosi, Patricia; Sivan-Hoffmann, Rotem; Riva, Roberto; Signorelli, Francesco; Labeyrie, Paul-Emile; Eldesouky, Islam; Sadeh-Gonike, Udi; Armoiry, Xavier; Turjman, Francis
2015-01-01
Background The WEB device is a recent intrasaccular flow disruption technique developed for the treatment of wide-necked intracranial aneurysms. To date, a single report on WEB Single-Layer (SL) treatment of intracranial aneurysms has been published, with 1-month safety results. The aim of this study is to report our experience and the 6-month clinical and angiographic follow-up of endovascular treatment of wide-necked aneurysms with the WEB SL. Methods Ten patients with 10 unruptured wide-necked aneurysms were prospectively enrolled in this study. Feasibility, intraoperative and postoperative complications, and outcomes were recorded. Immediate and 6-month clinical and angiographic results were evaluated. Results Failure of WEB SL placement occurred in two cases. Eight aneurysms were successfully treated using one WEB SL without additional treatment. Three middle cerebral artery, four anterior communicating artery, and one basilar artery aneurysms were treated. Average dome width was 7.5 mm (range 5.4–10.7 mm), and average neck size was 4.9 mm (range 2.6–6.5 mm). No periprocedural complication was observed, and morbidity and mortality at discharge and at 6 months were 0.0%. Angiographic follow-up at 6 months demonstrated complete aneurysm occlusion in 2/8 aneurysms, neck remnant in 5/8, and aneurysm remnant in 1/8. Conclusions From this preliminary study, treatment of bifurcation intracranial aneurysms using the WEB SL is feasible. WEB SL treatment seems safe at 6 months; however, the rate of neck remnants is not negligible due to compression of the WEB SL. Further technical improvements may be needed to improve occlusion in WEB SL treatment. PMID:26111987
Research Techniques Made Simple: Web-Based Survey Research in Dermatology: Conduct and Applications.
Maymone, Mayra B C; Venkatesh, Samantha; Secemsky, Eric; Reddy, Kavitha; Vashi, Neelam A
2018-07-01
Web-based surveys, or e-surveys, are surveys designed and delivered using the internet. The use of these survey tools is becoming increasingly common in medical research. Their advantages are appealing to surveyors because they allow for rapid development and administration of surveys, fast data collection and analysis, low cost, and fewer errors due to manual data entry than telephone or mailed questionnaires. Internet surveys may be used in clinical and academic research settings with improved speed and efficacy of data collection compared with paper or verbal survey modalities. However, limitations such as potentially low response rates, demographic biases, and variations in computer literacy and internet access remain areas of concern. We aim to briefly describe some of the currently available Web-based survey tools, focusing on advantages and limitations to help guide their use and application in dermatologic research. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
SSBRP User Operations Facility (UOF) Overview and Development Strategy
NASA Technical Reports Server (NTRS)
Picinich, Lou; Stone, Thom; Sun, Charles; Windrem, May; Givens, John J. (Technical Monitor)
1995-01-01
This paper will present the Space Station Biological Research Project (SSBRP) User Operations Facility (UOF) architecture and development strategy. A major element of the UOF at NASA Ames Research Center, the Communication and Data System (CDS), will be the primary focus of the discussion. CDS operational, telescience, security, and development objectives will be discussed along with the CDS implementation strategy. The implementation strategy discussions will include: Object Oriented Analysis & Design, System & Software Prototyping, and Technology Utilization. A CDS design overview that includes the CDS Context Diagram, CDS Architecture, Object Models, Use Cases, and User Interfaces will also be presented. CDS development brings together "cutting edge" technologies and techniques such as object-oriented development, network security, multimedia networking, web-based data distribution, Java, and graphical user interfaces. Use of these "cutting edge" technologies and techniques translates directly to lower development and operations costs.
2013-01-01
Background Subunit vaccines based on recombinant proteins have been effective in preventing infectious diseases and are expected to meet the demands of future vaccine development. Computational approaches, especially the reverse vaccinology (RV) method, have enormous potential for identification of protein vaccine candidates (PVCs) from a proteome. Existing protective antigen prediction software and web servers have low prediction accuracy, leading to limited applications for vaccine development. Besides machine learning techniques, those software and web servers have considered only a protein's adhesin-likeness as the criterion for identification of PVCs. Several non-adhesin functional classes of proteins involved in host-pathogen interactions and pathogenesis are known to provide protection against bacterial infections. Therefore, knowledge of bacterial pathogenesis has the potential to identify PVCs. Results A web server, Jenner-Predict, has been developed for prediction of PVCs from proteomes of bacterial pathogens. The web server targets host-pathogen interactions and pathogenesis by considering known functional domains from protein classes such as adhesin, virulence, invasin, porin, flagellin, colonization, toxin, choline-binding, penicillin-binding, transferrin-binding, fibronectin-binding and solute-binding. It predicts non-cytosolic proteins containing the above domains as PVCs. It also provides the vaccine potential of PVCs in terms of their possible immunogenicity by comparison with experimentally known IEDB epitopes, absence of autoimmunity, and conservation in different strains. Predicted PVCs are prioritized so that only a few prospective PVCs need be validated experimentally. The performance of the web server was evaluated against known protective antigens from diverse classes of bacteria reported in the Protegen database and the datasets used for VaxiJen server development.
The web server efficiently predicted known vaccine candidates reported from Streptococcus pneumoniae and Escherichia coli proteomes. The Jenner-Predict server outperformed the NERVE, Vaxign and VaxiJen methods, with sensitivities of 0.774 and 0.711 for the Protegen and VaxiJen datasets, respectively, and a specificity of 0.940 for the latter dataset. Conclusions The better prediction accuracy of the Jenner-Predict web server signifies that domains involved in host-pathogen interactions and pathogenesis are better criteria for prediction of PVCs. The web server successfully predicted most known PVCs belonging to different functional classes. Jenner-Predict is freely accessible at http://117.211.115.67/vaccine/home.html PMID:23815072
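Sensitivity and specificity figures like those reported above reduce to simple ratios over a labeled benchmark. A generic sketch (not Jenner-Predict's code) is:

```python
def sensitivity_specificity(predicted, actual):
    """Compute (sensitivity, specificity) from parallel boolean lists,
    where True means 'is a protective vaccine candidate'."""
    pairs = list(zip(predicted, actual))
    tp = sum(1 for p, a in pairs if p and a)          # true positives
    tn = sum(1 for p, a in pairs if not p and not a)  # true negatives
    fp = sum(1 for p, a in pairs if p and not a)      # false positives
    fn = sum(1 for p, a in pairs if not p and a)      # false negatives
    return tp / (tp + fn), tn / (tn + fp)
```

Sensitivity is the fraction of known protective antigens recovered; specificity is the fraction of non-antigens correctly rejected.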
Scrutinizing the Cybersell: Teen-Targeted Web Sites as Texts
ERIC Educational Resources Information Center
Crovitz, Darren
2007-01-01
Darren Crovitz explains that the explosive growth of Web-based content and communication in recent years compels us to teach students how to examine the "rhetorical nature and ethical dimensions of the online world." He demonstrates successful approaches to accomplish this goal through his analysis of the selling techniques of two Web sites…
Not Your Father's Web Site: Corporate Sites Emerge as New Content Innovators.
ERIC Educational Resources Information Center
O'Leary, Mick
2002-01-01
New-economy corporate Web sites have pioneered exciting techniques: rich media, interactivity, personalization, community, and integration of much third-party content. Discusses business-to-business (B2B) Web commerce, with examples of several B2B corporate sites; portal and content elements of these sites; and corporate content outlooks. (AEF)
Card-Sorting Usability Tests of the WMU Libraries' Web Site
ERIC Educational Resources Information Center
Whang, Michael
2008-01-01
This article describes the card-sorting techniques used by several academic libraries, reports and discusses the results of card-sorting usability tests of the Western Michigan University Libraries' Web site, and reveals how the WMU libraries incorporated the findings into a new Web site redesign, setting the design direction early on. The article…
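Card-sort results are commonly summarized as co-occurrence counts of how often participants grouped two items together, which can then feed clustering or site-structure decisions. A generic sketch follows (the card labels are illustrative, not the WMU study's data):

```python
from collections import Counter
from itertools import combinations

def cooccurrence(sorts):
    """Count how often each pair of cards lands in the same group.
    'sorts' is a list of sessions; each session is a list of groups (sets)."""
    counts = Counter()
    for session in sorts:
        for group in session:
            # canonical (sorted) pair keys so (a, b) and (b, a) merge
            for a, b in combinations(sorted(group), 2):
                counts[(a, b)] += 1
    return counts

sessions = [  # two participants, hypothetical library-site card labels
    [{"Hours", "Directions"}, {"Databases", "E-journals"}],
    [{"Hours"}, {"Databases", "E-journals", "Directions"}],
]
```

Pairs with high counts relative to the number of participants are candidates for placement under the same navigation heading.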
Comprehensive Analysis of Semantic Web Reasoners and Tools: A Survey
ERIC Educational Resources Information Center
Khamparia, Aditya; Pandey, Babita
2017-01-01
Ontologies are emerging as the best representation techniques for knowledge-based context domains. The continuing need for interoperation, collaboration and effective information retrieval has led to the creation of the semantic web, with the help of tools and reasoners which manage personalized information. The future of the semantic web lies in an ontology…
Busch, D Shallin; Greene, Correigh M; Good, Thomas P
2013-12-01
Marine hydrokinetic power projects will operate as marine environments change in response to increased atmospheric carbon dioxide concentrations. We considered how tidal power development and stressors resulting from climate change may affect Puget Sound species listed under the U.S. Endangered Species Act (ESA) and their food web. We used risk tables to assess the singular and combined effects of tidal power development and climate change. Tidal power development and climate change posed risks to ESA-listed species, and risk increased with incorporation of the effects of these stressors on predators and prey of ESA-listed species. In contrast, results of a model of strikes on ESA-listed species from turbine blades suggested that few ESA-listed species are likely to be killed by a commercial-scale tidal turbine array. We applied scenarios to a food web model of Puget Sound to explore the effects of tidal power and climate change on ESA-listed species using more quantitative analytical techniques. To simulate development of tidal power, we applied results of the blade strike model. To simulate environmental changes over the next 50 years, we applied scenarios of change in primary production, plankton community structure, dissolved oxygen, ocean acidification, and freshwater flooding events. No effects of tidal power development on ESA-listed species were detected from the food web model output, but the effects of climate change on them and other members of the food web were large. Our analyses exemplify how natural resource managers might assess environmental effects of marine technologies in ways that explicitly incorporate climate change and consider multiple ESA-listed species in the context of their ecological community. Estimating the Effects of Tidal Power Projects and Climate Change on Threatened and Endangered Marine Species and Their Food Web. © 2013 Society for Conservation Biology. No claim to original US government works.
Web Transfer Over Satellites Being Improved
NASA Technical Reports Server (NTRS)
Allman, Mark
1999-01-01
Extensive research conducted by NASA Lewis Research Center's Satellite Networks and Architectures Branch and Ohio University has demonstrated performance improvements in World Wide Web transfers over satellite-based networks. The use of a new version of the Hypertext Transfer Protocol (HTTP) reduced the time required to load web pages over a single Transmission Control Protocol (TCP) connection traversing a satellite channel. However, an older technique of simultaneously making multiple requests of a given server has been shown to provide even faster transfer times. Unfortunately, the use of multiple simultaneous requests has been shown to be harmful to the network in general. Therefore, we are developing new mechanisms for the HTTP protocol which may allow a single request at any given time to perform as well as, or better than, multiple simultaneous requests. In the course of this study, we also demonstrated that the time for web pages to load is at least as short via a satellite link as it is via a standard 28.8-kbps dialup modem channel. This demonstrates that satellites are a viable means of accessing the Internet.
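The trade-off between one request at a time and multiple simultaneous requests can be seen with a back-of-envelope latency model. The sketch below assumes a geostationary round-trip time of roughly 550 ms and deliberately ignores TCP handshakes, slow start, and pipelining details, so it only illustrates why request concurrency matters on high-latency links:

```python
import math

def page_load_time(n_objects, rtt, transfer_per_object, concurrency=1):
    """Rough model: each object costs one request round trip plus its
    transfer time; up to 'concurrency' requests proceed in parallel.
    Ignores TCP handshakes, slow start, and pipelining details."""
    rounds = math.ceil(n_objects / concurrency)
    return rounds * (rtt + transfer_per_object)

GEO_RTT = 0.55  # assumed geostationary round-trip time, seconds

serial = page_load_time(10, GEO_RTT, 0.1, concurrency=1)    # one at a time
parallel = page_load_time(10, GEO_RTT, 0.1, concurrency=4)  # 4 simultaneous
```

Under these assumptions a 10-object page loads several times faster with four concurrent requests, which is exactly the gap the improved single-connection HTTP mechanisms aim to close without flooding the network.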
Pevzner, Yuri; Frugier, Emilie; Schalk, Vinushka; Caflisch, Amedeo; Woodcock, H Lee
2014-09-22
Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing's capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of "re-dockings" with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing's docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening.
Task Oriented Evaluation of Module Extraction Techniques
NASA Astrophysics Data System (ADS)
Palmisano, Ignazio; Tamma, Valentina; Payne, Terry; Doran, Paul
Ontology Modularization techniques identify coherent and often reusable regions within an ontology. The ability to identify such modules, thus potentially reducing the size or complexity of an ontology for a given task or set of concepts is increasingly important in the Semantic Web as domain ontologies increase in terms of size, complexity and expressivity. To date, many techniques have been developed, but evaluation of the results of these techniques is sketchy and somewhat ad hoc. Theoretical properties of modularization algorithms have only been studied in a small number of cases. This paper presents an empirical analysis of a number of modularization techniques, and the modules they identify over a number of diverse ontologies, by utilizing objective, task-oriented measures to evaluate the fitness of the modules for a number of statistical classification problems.
NASA Astrophysics Data System (ADS)
Walker, J. I.; Blodgett, D. L.; Suftin, I.; Kunicki, T.
2013-12-01
High-resolution data for use in environmental modeling is increasingly becoming available at broad spatial and temporal scales. Downscaled climate projections, remotely sensed landscape parameters, and land-use/land-cover projections are examples of datasets that may exceed an individual investigation's data management and analysis capacity. To allow projects on limited budgets to work with many of these datasets, the burden of working with them must be reduced. The approach being pursued at the U.S. Geological Survey Center for Integrated Data Analytics uses standard self-describing web services that allow machine-to-machine data access and manipulation. These techniques have been implemented and deployed in production-level server-based Web Processing Services that can be accessed from a web application or scripted workflow. Data publication techniques that allow machine interpretation of large collections of data have also been implemented for numerous datasets at U.S. Geological Survey data centers as well as partner agencies and academic institutions. Discovery of data services is accomplished using a method in which a machine-generated metadata record holds content, derived from the data's source web service, that is intended for human interpretation as well as machine interpretation. A distributed search application has been developed that demonstrates the utility of a decentralized search of data-owner metadata catalogs from multiple agencies. The integrated but decentralized system of metadata, data, and server-based processing capabilities will be presented. The design, utility, and value of these solutions will be illustrated with applied science examples and success stories. Datasets such as the EPA's Integrated Climate and Land Use Scenarios, USGS/NASA MODIS derived land cover attributes, and downscaled climate projections from several sources are examples of data this system includes.
These and other datasets have been published as standard, self-describing web services that provide the ability to inspect and subset the data. This presentation will demonstrate this file-to-web-service concept and how it can be used from script-based workflows or web applications.
NASA Astrophysics Data System (ADS)
Kearney, K.; Aydin, K.
2016-02-01
Oceanic food webs are often depicted as network graphs, with the major organisms or functional groups displayed as nodes and the fluxes between them as the edges. However, the large number of nodes and edges and the high connectance of many management-oriented food webs, coupled with graph layout algorithms poorly suited to certain desired characteristics of food web visualizations, often lead to hopelessly tangled diagrams that convey little information other than, "It's complex." Here, I combine several new graph visualization techniques, including a new node layout algorithm based on trophic similarity (a quantification of shared predators and prey) and trophic level, divided edge bundling for edge routing, and intelligent automated placement of labels, to create a much clearer visualization of the important fluxes through a food web. The technique will be used to highlight the differences in energy flow within three Alaskan Large Marine Ecosystems (the Bering Sea, Gulf of Alaska, and Aleutian Islands) that include very similar functional groups but unique energy pathways.
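The trophic similarity used for node layout quantifies shared predators and prey. One plausible formulation (an assumption for illustration, not necessarily the authors' exact metric) is the Jaccard overlap of each species' combined predator and prey sets:

```python
def trophic_similarity(web, a, b):
    """Jaccard overlap of the combined predator+prey sets of species a and b.
    'web' maps each species to {"prey": set, "predators": set}."""
    sa = web[a]["prey"] | web[a]["predators"]
    sb = web[b]["prey"] | web[b]["predators"]
    union = sa | sb
    return len(sa & sb) / len(union) if union else 0.0

web = {  # toy food web, illustrative species only
    "herring": {"prey": {"copepod"}, "predators": {"cod", "seabird"}},
    "pollock": {"prey": {"copepod"}, "predators": {"cod"}},
    "copepod": {"prey": {"phyto"}, "predators": {"herring", "pollock"}},
}
```

Species with high trophic similarity can then be placed near one another, so edges between ecologically similar groups stay short and the diagram untangles.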
Development of web tools to disseminate space geodesy data-related products
NASA Astrophysics Data System (ADS)
Soudarin, Laurent; Ferrage, Pascale; Mezerette, Adrien
2015-04-01
In order to promote the products of the DORIS system, the French Space Agency CNES has developed and implemented, on the web site of the International DORIS Service (IDS), a set of plot tools to interactively build and display time series of site positions, orbit residuals and terrestrial parameters (scale, geocenter). An interactive global map is also available to select sites and to access their information. Besides the products provided by the CNES Orbitography Team and the IDS components, these tools allow comparing the time evolution of coordinates for collocated DORIS and GNSS stations, thanks to the collaboration with the Terrestrial Frame Combination Center of the International GNSS Service (IGS). A database was created to improve the robustness and efficiency of the tools, with the objective of offering a complete web service to foster data exchange with the other geodetic services of the International Association of Geodesy (IAG). The possibility of visualizing and comparing position time series from the four main space geodetic techniques (DORIS, GNSS, SLR and VLBI) is already being pursued at the French level. A dedicated version of these web tools has been developed for the French Space Geodesy Research Group (GRGS). It will give access to position time series provided by the GRGS Analysis Centers involved in DORIS, GNSS, SLR and VLBI data processing for the realization of the International Terrestrial Reference Frame. In this presentation, we will describe the functionalities of these tools, and we will address some aspects of the time series (content, format).
Pierot, L; Biondi, A; Narata, A-P; Mihalea, C; Januel, A-C; Metaxas, G; Bibi, R; Caroff, J; Soize, S; Cognard, C; Spelle, L; Herbreteau, D
2017-06-01
Flow disruption with the WEB device is an innovative technique for the endovascular treatment of wide-necked bifurcation aneurysms. Good clinical practice trials have shown high safety of this treatment with good efficacy. Technical developments (single-layer devices and smaller microcatheters) facilitate the treatment, potentially leading to enlargement of indications. This series collects aneurysms in "atypical" locations for WEB treatment and analyzes the safety and efficacy of this treatment. In each participating center, patients with aneurysms treated with the WEB were prospectively included in a local database. Patients treated for aneurysms in "atypical" locations were extracted. Patient and aneurysm characteristics, intraoperative complications, and anatomical results at the end of the procedure and at last follow-up were collected and analyzed. Five French neurointerventional centers included 20 patients with 20 aneurysms in "atypical" locations treated with the WEB. Aneurysm locations were ICA carotid-ophthalmic in 9 aneurysms (45.0%), ICA posterior communicating in 4 (20.0%), pericallosal artery in 5 (25.0%), and basilar artery between P1 and the superior cerebellar artery in 2 (10.0%). There were no complications (thromboembolic or intraoperative rupture) in this series. At follow-up (mean: 7.4 months), adequate occlusion was obtained in 100.0% of aneurysms. This series confirms that it is possible to enlarge the indications of WEB treatment to "atypical" locations with good safety and efficacy. These data have to be confirmed in large prospective series. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Intelligent web agents for a 3D virtual community
NASA Astrophysics Data System (ADS)
Dave, T. M.; Zhang, Yanqing; Owen, G. S. S.; Sunderraman, Rajshekhar
2003-08-01
In this paper, we propose an Avatar-based intelligent agent technique for 3D Web-based virtual communities, based on distributed artificial intelligence, intelligent agent techniques, and databases and knowledge bases in a digital library. One of the goals of this joint NSF (IIS-9980130) and ACM SIGGRAPH Education Committee (ASEC) project is to create a virtual community of educators and students who have a common interest in computer graphics, visualization, and interactive techniques. In this virtual community (ASEC World), Avatars will represent the educators, students, and other visitors to the world. Intelligent agents, represented as specially dressed Avatars, will be available to assist the visitors to ASEC World. The basic Web client-server architecture of the intelligent knowledge-based avatars is given. Importantly, the intelligent Web agent software system for the 3D virtual community has been implemented successfully.
Integrated Knowledge Based Expert System for Disease Diagnosis System
NASA Astrophysics Data System (ADS)
Arbaiy, Nureize; Sulaiman, Shafiza Eliza; Hassan, Norlida; Afizah Afip, Zehan
2017-08-01
The role and importance of healthcare systems in improving quality of life and social welfare in a society have been well recognized. Attention should be given to raising awareness and implementing appropriate measures to improve health care. Therefore, a computer-based system has been developed to serve as an alternative for people to self-diagnose their health status based on given symptoms. This strategy should be emphasized so that people can use the information correctly as a reference for a healthier life. Hence, a Web-based Community Center for Healthcare Diagnosis system was developed based on expert system techniques. Expert system reasoning is employed to provide information about treatment and prevention of the diseases based on given symptoms. At present, three diseases are included: arthritis, thalassemia and pneumococcal disease. Sets of rules and facts are managed in the knowledge-based system. Web-based technology is used as a platform to disseminate the information to users so that they can apply it appropriately. This system will benefit people who wish to increase health awareness and seek expert knowledge on the diseases by performing self-diagnosis for early disease detection.
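A symptom-matching rule base of the kind described can be sketched in a few lines. The rules below are toy placeholders for illustration only, not the deployed system's actual medical knowledge, and the matching threshold is an assumption:

```python
# Toy rule base: disease -> characteristic symptoms (illustrative only,
# not medical advice; the real system's rules are not public).
RULES = {
    "arthritis": {"joint pain", "stiffness", "swelling"},
    "thalassemia": {"fatigue", "pale skin", "slow growth"},
    "pneumococcal": {"fever", "cough", "chest pain"},
}

def diagnose(symptoms, threshold=0.5):
    """Return diseases whose rule sets match the given symptoms above
    the threshold, best match first."""
    symptoms = set(symptoms)
    scores = {}
    for disease, required in RULES.items():
        score = len(required & symptoms) / len(required)
        if score >= threshold:
            scores[disease] = score
    return sorted(scores, key=scores.get, reverse=True)
```

A production expert system would layer explanations, certainty factors, and treatment/prevention lookups on top of this matching step.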
ERIC Educational Resources Information Center
Arnone, Marilyn P.; Small, Ruth V.
Designed for elementary or middle school teachers and library media specialists, this book provides educators with practical, easy-to-use ways of applying motivation assessment techniques when selecting World Wide Web sites for inclusion in their lessons and offers concrete examples of how to use Web evaluation with young learners. WebMAC…
A Comparison of Techniques To Find Mirrored Hosts on the WWW.
ERIC Educational Resources Information Center
Bharat, Krishna; Broder, Andrei; Dean, Jefferey; Henzinger, Monika R.
2000-01-01
Compares several "top-down" algorithms for identifying mirrored hosts on the Web. The algorithms operate on the basis of URL strings and linkage data: the type of information about Web pages easily available from Web proxies and crawlers. Results reveal that the best approach is a combination of five algorithms: on test data this…
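One URL-string signal such algorithms exploit is the overlap between the sets of paths served by two hosts: near-identical path sets suggest mirrors. A simplified sketch follows (not one of the paper's five algorithms verbatim; host names are hypothetical):

```python
from urllib.parse import urlparse

def path_overlap(urls_a, urls_b):
    """Jaccard overlap of the URL path sets of two hosts (0..1);
    a value near 1 suggests the hosts may be mirrors."""
    paths_a = {urlparse(u).path for u in urls_a}
    paths_b = {urlparse(u).path for u in urls_b}
    return len(paths_a & paths_b) / len(paths_a | paths_b)

score = path_overlap(
    ["http://a.example/pub/x.html", "http://a.example/pub/y.html"],
    ["http://b.example/pub/x.html", "http://b.example/pub/y.html"],
)
```

In practice such string-based scores are combined with linkage evidence (shared in-links, identical out-link structure) before declaring two hosts mirrors, which is why the paper finds a combination of algorithms works best.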
The Effectiveness of Web-Based Learning Environment: A Case Study of Public Universities in Kenya
ERIC Educational Resources Information Center
Kirui, Paul A.; Mutai, Sheila J.
2010-01-01
Web mining is emerging in many aspects of e-learning, aiming at improving online learning and teaching processes and making them more transparent and effective. Researchers using Web mining tools and techniques are challenged to learn more about online students, reshape online courses and educational websites, and create tools for…
PLAN2L: a web tool for integrated text mining and literature-based bioentity relation extraction.
Krallinger, Martin; Rodriguez-Penagos, Carlos; Tendulkar, Ashish; Valencia, Alfonso
2009-07-01
There is an increasing interest in using literature mining techniques to complement information extracted from annotation databases or generated by bioinformatics applications. Here we present PLAN2L, a web-based online search system that integrates text mining and information extraction techniques to systematically access information useful for analyzing genetic, cellular and molecular aspects of the plant model organism Arabidopsis thaliana. Our system facilitates a more efficient retrieval of information relevant to heterogeneous biological topics, from implications in biological relationships at the level of protein interactions and gene regulation, to sub-cellular locations of gene products and associations to cellular and developmental processes, e.g. cell cycle, flowering, root, leaf and seed development. Beyond single entities, predefined pairs of entities can also be provided as queries, for which literature-derived relations together with textual evidence are returned. PLAN2L does not require registration and is freely accessible at http://zope.bioinfo.cnio.es/plan2l.
ERIC Educational Resources Information Center
Metz, Ray E.; Junion-Metz, Gail
This book provides basic information about the World Wide Web and serves as a guide to the tools and techniques needed to browse the Web, integrate it into library services, or build an attractive, user-friendly home page for the library. Chapter 1 provides an overview of Web basics and chapter 2 discusses some of the big issues related to…
Designing Websites for Displaying Large Data Sets and Images on Multiple Platforms
NASA Astrophysics Data System (ADS)
Anderson, A.; Wolf, V. G.; Garron, J.; Kirschner, M.
2012-12-01
The desire to build websites that analyze and display ever increasing amounts of scientific data and images pushes for designs that utilize large displays and use the display area as efficiently as possible. Yet scientists and users of their data increasingly wish to access these websites in the field and on mobile devices. This creates the need to develop websites that support a wide range of devices and screen sizes and optimally use whatever display area is available. Historically, designers have addressed this issue by building two websites: one for mobile devices and one for desktop environments, resulting in increased cost, duplication of work, and longer development times. Recent advances in web design technology and techniques allow for the development of a single website that dynamically adjusts to the type of device being used to browse it (smartphone, tablet, desktop). In addition, they provide the opportunity to truly optimize whatever display area is available. HTML5 and CSS3 give web designers media query statements that allow style sheets to be aware of the size of the display being used and to format web content differently based upon the queried response. Web elements can be rendered in a different size or position, or even removed from the display entirely, based upon the size of the display area. Using HTML5/CSS3 media queries in this manner is referred to as "Responsive Web Design" (RWD). RWD in combination with technologies such as LESS and Twitter Bootstrap allows the web designer to build websites that not only dynamically respond to the browser display size being used, but do so in very controlled and intelligent ways, ensuring that good layout and graphic design principles are followed.
At the University of Alaska Fairbanks, the Alaska Satellite Facility SAR Data Center (ASF) recently redesigned their popular Vertex application and converted it from a traditional, fixed-layout website into a RWD site built on HTML5, LESS and Twitter Bootstrap. Vertex is a data portal for remotely sensed imagery of the earth, offering Synthetic Aperture Radar (SAR) data products from the global ASF archive. By using Responsive Web Design, ASF is able to provide access to a massive collection of SAR imagery and allow the user to use mobile devices and desktops to maximum advantage. ASF's Vertex web site demonstrates that with increased interface flexibility, scientists, managers and users can increase their personal effectiveness by accessing data portals from their preferred device as their science dictates.
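The breakpoint logic that CSS media queries express declaratively in the stylesheet can be sketched imperatively as a width-to-layout mapping. The pixel thresholds below are illustrative (Bootstrap-like), not the values Vertex actually uses:

```python
# Hypothetical breakpoints: (upper pixel bound, layout name).
BREAKPOINTS = [(768, "phone"), (992, "tablet")]

def layout_for(width_px):
    """Pick a layout class by viewport width, mimicking what a CSS media
    query such as `@media (max-width: 768px)` decides in the stylesheet."""
    for limit, name in BREAKPOINTS:
        if width_px < limit:
            return name
    return "desktop"
```

In real RWD this decision lives entirely in CSS, so the same HTML is reflowed without any script running; the sketch only shows the mapping being made.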
Kling-Petersen, T; Pascher, R; Rydmark, M
1999-01-01
Academic and medical imaging increasingly use computer-based 3D reconstruction and/or visualization. Three-dimensional interactive models play a major role in areas such as preclinical medical education, clinical visualization and medical research. While 3D is comparatively easy to do on high-end workstations, distribution and use of interactive 3D graphics necessitate the use of personal computers and the web. Several new techniques have been demonstrated that provide interactive 3D via a web browser, thereby allowing a limited version of VR to be experienced by a larger majority of students, medical practitioners and researchers. These techniques include QuickTimeVR2 (QTVR), VRML2, QuickDraw3D, OpenGL and Java3D. In order to test the usability of the different techniques, Mednet has initiated a number of projects designed to evaluate the potential of 3D techniques for scientific reporting, clinical visualization and medical education. These include datasets created by manual tracing followed by triangulation, smoothing and 3D visualization, by MRI, or by high-resolution laser scanning. Preliminary results indicate that both VRML and QTVR fulfill most of the requirements of web-based, interactive 3D visualization, whereas QuickDraw3D is too limited. Presently, Java3D has not yet reached a level where in-depth testing is possible. The use of high-resolution laser scanning is an important addition to 3D digitization.
Cant, Robyn; Young, Susan; Cooper, Simon J; Porter, Joanne
2015-03-01
This study explores preregistration nursing students' views of a Web-based simulation program: FIRST ACTWeb (Feedback Incorporating Review and Simulation Techniques to Act on Clinical Trends-Web). The multimedia program incorporating three videoed scenarios portrayed by a standardized patient (human actor) aims to improve students' recognition and management of hospital patient deterioration. Participants were 367 final-year nursing students from three universities who completed an online evaluation survey and 19 students from two universities who attended one of five focus groups. Two researchers conducted a thematic analysis of the transcribed narratives. Three core themes identified were as follows: "ease of program use," "experience of e-Simulation," and "satisfaction with the learning experience." The Web-based clinical learning environment was endorsed as functional, feasible, and easy to use and was reported to have high fidelity and realism. Feedback in both focus groups and surveys showed high satisfaction with the learning experience. Overall, evaluation suggested that the Web-based simulation program successfully integrated elements essential for blended learning. Although Web-based educational applications are resource intensive to develop, positive appraisal of program quality, plus program accessibility and repeatability, appears to provide important educational benefits. Further research is needed to determine the transferability of these learning experiences into real-world practice.
Designed a web crawler which oriented network public opinion data acquisition
NASA Astrophysics Data System (ADS)
Lu, Shan; Ma, Hui; Gao, Ying
2015-12-01
The paper describes the meaning of network public opinion and data acquisition techniques for network public opinion research. A web crawler oriented to network public opinion data acquisition was designed and implemented. Based on an analysis of the shortcomings of generic web crawlers, asynchronous sockets, a DNS cache, and download queues were used to improve the underlying framework and increase collection speed.
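A minimal sketch of two of the optimizations named above, a download queue and a DNS cache, might look as follows (class and method names are hypothetical, not from the paper):

```python
import asyncio
import socket
from collections import deque

class MiniCrawler:
    """Toy crawler sketch: a FIFO download queue plus a DNS cache."""

    def __init__(self, fetch):
        self.fetch = fetch      # coroutine: url -> (body, outgoing links)
        self.dns_cache = {}     # hostname -> resolved IP address
        self.queue = deque()    # FIFO download queue
        self.seen = set()

    def resolve(self, host):
        # Cache DNS answers so repeated requests to one host skip resolution;
        # a real fetcher would call this before opening its (async) socket.
        if host not in self.dns_cache:
            self.dns_cache[host] = socket.gethostbyname(host)
        return self.dns_cache[host]

    async def crawl(self, seeds, limit=100):
        self.queue.extend(seeds)
        pages = {}
        while self.queue and len(pages) < limit:
            url = self.queue.popleft()
            if url in self.seen:
                continue
            self.seen.add(url)
            body, links = await self.fetch(url)
            pages[url] = body
            self.queue.extend(l for l in links if l not in self.seen)
        return pages
```

A production crawler would additionally issue many fetches concurrently (e.g. with asyncio.gather) over asynchronous sockets, which is where the collection speedup the paper reports comes from.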
The new ALICE DQM client: a web access to ROOT-based objects
NASA Astrophysics Data System (ADS)
von Haller, B.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Delort, C.; Dénes, E.; Diviá, R.; Fuchs, U.; Niedziela, J.; Simonetti, G.; Soós, C.; Telesca, A.; Vande Vyvre, P.; Wegrzynek, A.
2015-12-01
A Large Ion Collider Experiment (ALICE) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The online Data Quality Monitoring (DQM) plays an essential role in the experiment operation by providing shifters with immediate feedback on the data being recorded in order to quickly identify and overcome problems. An immediate access to the DQM results is needed not only by shifters in the control room but also by detector experts worldwide. As a consequence, a new web application has been developed to dynamically display and manipulate the ROOT-based objects produced by the DQM system in a flexible and user friendly interface. The architecture and design of the tool, its main features and the technologies that were used, both on the server and the client side, are described. In particular, we detail how we took advantage of the most recent ROOT JavaScript I/O and web server library to give interactive access to ROOT objects stored in a database. We describe as well the use of modern web techniques and packages such as AJAX, DHTMLX and jQuery, which has been instrumental in the successful implementation of a reactive and efficient application. We finally present the resulting application and how code quality was ensured. We conclude with a roadmap for future technical and functional developments.
Benthic food webs support the production of sympatric flatfish ...
Identifying nursery habitats is of paramount importance to define proper management and conservation strategies for flatfish species. Flatfish nursery studies usually report upon habitat occupation, but few have attempted to quantify the importance of those habitats to larval development. The reliance of the larvae of two sympatric flatfish species, the European flounder Platichthys flesus and the common sole Solea solea, on the estuarine food web (benthic vs. pelagic) was determined through carbon and nitrogen stable isotope analysis. The organic matter sources supporting the production of P. flesus and S. solea larval biomass originate chiefly in the benthic food web. However, these species have significantly different δ13C and δ15N values, which suggests that they prey on organisms that use a different mixture of sources or assimilate different components from similar OM pools (or both). Fisheries production is a highly-valued ecosystem service of coastal habitats globally. Developing ecological production functions to relate nursery habitat to this ecosystem service, however, has proved challenging owing to a lack of techniques to effectively relate habitat use with the early life stages of fish. In this study, we demonstrate how stable isotopes can be used to evaluate nursery habitat-fish production linkages using stable isotope analysis of an estuarine food web. In particular, we show how to quantify the reliance of two commercially-important fish species to various h
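Reliance estimates of this kind typically come from a linear stable-isotope mixing model. A two-source version can be sketched as below; the δ13C values in the test and the trophic enrichment parameter `tef` are illustrative assumptions, not values from the study:

```python
def benthic_reliance(delta_consumer, delta_benthic, delta_pelagic, tef=0.0):
    """Two-source linear mixing model: fraction of a consumer's carbon
    derived from the benthic food web, given end-member delta-13C values.
    `tef` is an optional trophic enrichment factor subtracted from the
    consumer's value before mixing."""
    corrected = delta_consumer - tef
    return (corrected - delta_pelagic) / (delta_benthic - delta_pelagic)
```

For example, a larva at -18 per mil between a benthic source at -17 and a pelagic source at -21 derives three quarters of its carbon from the benthic web.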
Online Hydrologic Impact Assessment Decision Support System using Internet and Web-GIS Capability
NASA Astrophysics Data System (ADS)
Choi, J.; Engel, B. A.; Harbor, J.
2002-05-01
Urban sprawl and the corresponding land use change from lower intensity uses, such as agriculture and forests, to higher intensity uses, including high density residential and commercial, has various long- and short-term environmental impacts on ground water recharge, water pollution, and storm water drainage. A web-based Spatial Decision Support System (SDSS) for long-term hydrologic impact modeling and analysis was developed. The system combines a hydrologic model, databases, web-GIS capability and HTML user interfaces to create a comprehensive hydrologic analysis system. The hydrologic model estimates daily direct runoff using the NRCS Curve Number technique and annual nonpoint source pollution loading by an event mean concentration approach. This is supported by a rainfall database with over 30 years of daily rainfall for the continental US. A web-GIS interface and a robust Web-based watershed delineation capability were developed to simplify the spatial data preparation task that is often a barrier to hydrologic model operation. The web-GIS supports browsing of map layers including hydrologic soil groups, roads, counties, streams, lakes and railroads, as well as on-line watershed delineation for any geographic point the user selects with a simple mouse click. The watershed delineation results can also be used to generate data for the hydrologic and water quality models available in the DSS. This system is already being used by city and local government planners for hydrologic impact evaluation of land use change from urbanization, and can be found at http://pasture.ecn.purdue.edu/~watergen/hymaps. This system can assist local community, city and watershed planners, and even professionals when they are examining impacts of land use change on water resources. They can estimate the hydrologic impact of possible land use changes using this system with readily available data supported through the Internet.
This system provides a cost-effective approach to serving potential users who require easy-to-use tools.
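The NRCS Curve Number technique used by the model computes direct runoff Q from precipitation P and a retention parameter S derived from the curve number, with Q = (P - 0.2S)^2 / (P + 0.8S) once P exceeds the initial abstraction 0.2S. A minimal metric-unit sketch (the function name is ours):

```python
def scs_runoff(precip_mm, curve_number):
    """NRCS (SCS) Curve Number direct runoff, in mm.
    S = 25400/CN - 254 (mm); Q = (P - Ia)^2 / (P - Ia + S), Ia = 0.2 S."""
    s = 25400.0 / curve_number - 254.0  # potential maximum retention
    ia = 0.2 * s                        # initial abstraction
    if precip_mm <= ia:
        return 0.0                      # all rainfall abstracted, no runoff
    return (precip_mm - ia) ** 2 / (precip_mm - ia + s)
```

At CN = 100 (fully impervious) S is zero and all precipitation runs off; lower curve numbers retain progressively more of the event.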
Experimental evaluation of two 36 inch by 47 inch graphite/epoxy sandwich shear webs
NASA Technical Reports Server (NTRS)
Bush, H. G.
1975-01-01
The design and testing of two large (36 in. x 47 in.) graphite/epoxy sandwich shear webs are described. One sandwich web was designed to exhibit strength failure of the facings at a shear load of 7638 lbs/in., which is a characteristic loading for the space shuttle orbiter main engine thrust beam structure. The second sandwich web was designed to exhibit general instability failure at a shear load of 5000 lbs/in., in order to identify problem areas of stability-critical sandwich webs and to assess the adequacy of contemporary analysis techniques.
Web-Based Tools for Data Visualization and Decision Support for South Asia
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Pulla, S. T.; Ames, D. P.; Souffront, M.; David, C. H.; Zaitchik, B. F.; Gatlin, P. N.; Matin, M. A.
2017-12-01
The objective of the NASA SERVIR project is to assist developing countries in using information provided by Earth observing satellites to assess and manage climate risks, land use, and water resources. We present a collection of web apps that integrate earth observations and in situ data to facilitate deployment of data and water resources models as decision-making tools in support of this effort. The interactive nature of web apps makes this an excellent medium for creating decision support tools that harness cutting-edge modeling techniques. Thin-client apps hosted in a cloud portal eliminate the need for decision makers to procure and maintain the high performance hardware required by the models, deal with issues related to software installation and platform incompatibilities, or monitor and install software updates, a problem that is exacerbated for many of the regional SERVIR hubs where both financial and technical capacity may be limited. All that is needed to use the system is an Internet connection and a web browser. We take advantage of these technologies to develop tools which can be centrally maintained but openly accessible. Advanced mapping and visualization make results intuitive and the information derived actionable. We also take advantage of the emerging standards for sharing water information across the web using the OGC and WMO approved WaterML standards. This makes our tools interoperable and extensible via application programming interfaces (APIs) so that tools and data from other projects can both consume and share the tools developed in our project. Our approach enables the integration of multiple types of data and models, thus facilitating collaboration between science teams in SERVIR. The apps developed thus far by our team process time-varying netCDF files from Earth observations and large-scale computer simulations and allow visualization and exploration via raster animation and extraction of time series at selected points and/or regions.
User Interface Design in Medical Distributed Web Applications.
Serban, Alexandru; Crisan-Vida, Mihaela; Mada, Leonard; Stoicu-Tivadar, Lacramioara
2016-01-01
User interfaces are important to facilitate easy learning and operation of an IT application, especially in the medical world. An easy-to-use interface has to be simple and to accommodate the user's needs and mode of operation. The technology in the background is an important tool to accomplish this. The present work aims to create a web interface using specific technology (HTML table design combined with CSS3) to provide an optimized responsive interface for a complex web application. In the first phase, the current icMED web medical application layout is analyzed, and its structure is designed using specific tools on source files. In the second phase, a new graphic interface adaptable to different mobile terminals is proposed (using HTML table design (TD) and the CSS3 method) that uses no source files, just lines of code for layout design, improving the interaction in terms of speed and simplicity. For a complex medical software application, a new prototype layout was designed and developed using HTML tables. The method uses CSS code with only CSS classes applied to one or multiple HTML table elements, instead of CSS styles that can be applied to just one DIV tag at once. The technique has the advantage of simplified CSS code and better adaptability to different media resolutions compared to the DIV-CSS style method. The presented work is proof that adaptive web interfaces can be developed just by using and combining different types of design methods and technologies, using HTML table design, resulting in an interface that is simpler to learn and use, suitable for healthcare services.
Tesfaye, Brook; Atique, Suleman; Elias, Noah; Dibaba, Legesse; Shabbir, Syed-Abdul; Kebede, Mihiretu
2017-03-01
Improving child health and reducing the child mortality rate are key health priorities in developing countries. This study aimed to identify determinants and develop a web-based child mortality prediction model in an Ethiopian local language using classification data mining algorithms. Decision tree (using the J48 algorithm) and rule induction (using the PART algorithm) techniques were applied to 11,654 records of Ethiopian demographic and health survey data. Waikato Environment for Knowledge Analysis (WEKA) for Windows version 3.6.8 was used to develop optimal models. 8157 (70%) records were randomly allocated to the training group for model building, while the remaining 3496 (30%) records were allocated to the test group for model validation. The validation of the model was assessed using accuracy, sensitivity, specificity and area under the Receiver Operating Characteristic (ROC) curve. Using Statistical Package for Social Sciences (SPSS) version 20.0, logistic regression and Odds Ratios (OR) with 95% Confidence Intervals (CI) were used to identify determinants of child mortality. The child mortality rate was 72 deaths per 1000 live births. Breast-feeding (AOR = 1.46, 95% CI [1.22, 1.75]), maternal education (AOR = 1.40, 95% CI [1.11, 1.81]), family planning (AOR = 1.21, 95% CI [1.08, 1.43]), preceding birth interval (AOR = 4.90, 95% CI [2.94, 8.15]), presence of diarrhea (AOR = 1.54, 95% CI [1.32, 1.66]), father's education (AOR = 1.4, 95% CI [1.04, 1.78]), low birth weight (AOR = 1.2, 95% CI [0.98, 1.51]) and age of the mother at first birth (AOR = 1.42, 95% CI [1.01, 1.89]) were found to be determinants of child mortality. The J48 model had better performance: accuracy (94.3%), sensitivity (93.8%), specificity (94.3%), Positive Predictive Value (PPV) (92.2%), Negative Predictive Value (NPV) (94.5%) and area under the ROC curve (94.8%). Subsequent to developing an optimal prediction model, we relied on this model to develop a web-based application system for child mortality prediction.
In this study, accurate results were obtained by employing decision tree and rule induction techniques. Determinants were identified and a web-based child mortality prediction model in an Ethiopian local language was developed. Thus, the results obtained could support child health intervention programs in Ethiopia, where trained human resources for health are limited. Advanced classification algorithms need to be tested to come up with optimal models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
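J48 is WEKA's implementation of the C4.5 decision tree. Its core step, choosing the attribute whose split yields the highest information gain, can be sketched in toy form (categorical attributes only; full C4.5 adds gain-ratio normalization, numeric splits and pruning, none of which are shown here):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(rows, labels):
    """Return (attribute index, information gain) for the best split:
    the gain is the parent entropy minus the size-weighted entropy of the
    child partitions induced by each attribute value."""
    base = entropy(labels)
    best, best_gain = None, -1.0
    for a in range(len(rows[0])):
        groups = {}
        for row, y in zip(rows, labels):
            groups.setdefault(row[a], []).append(y)
        remainder = sum(len(g) / len(labels) * entropy(g)
                        for g in groups.values())
        gain = base - remainder
        if gain > best_gain:
            best, best_gain = a, gain
    return best, best_gain
```

An attribute that separates the classes perfectly (here, attribute 0) earns the full parent entropy as gain and would become the root test of the tree.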
Forecasting seasonal outbreaks of influenza.
Shaman, Jeffrey; Karspeck, Alicia
2012-12-11
Influenza recurs seasonally in temperate regions of the world; however, our ability to predict the timing, duration, and magnitude of local seasonal outbreaks of influenza remains limited. Here we develop a framework for initializing real-time forecasts of seasonal influenza outbreaks, using a data assimilation technique commonly applied in numerical weather prediction. The availability of real-time, web-based estimates of local influenza infection rates makes this type of quantitative forecasting possible. Retrospective ensemble forecasts are generated on a weekly basis following assimilation of these web-based estimates for the 2003-2008 influenza seasons in New York City. The findings indicate that real-time skillful predictions of peak timing can be made more than 7 wk in advance of the actual peak. In addition, confidence in those predictions can be inferred from the spread of the forecast ensemble. This work represents an initial step in the development of a statistically rigorous system for real-time forecast of seasonal influenza.
Oceans 2.0: a Data Management Infrastructure as a Platform
NASA Astrophysics Data System (ADS)
Pirenne, B.; Guillemot, E.
2012-04-01
The Data Management and Archiving System (DMAS) serving the needs of a number of undersea observing networks, such as VENUS and NEPTUNE Canada, was conceived from the beginning as a Service-Oriented Infrastructure. Its core functional elements (data acquisition, transport, archiving, retrieval and processing) can interact with the outside world using Web Services. Those Web Services can be exploited by a variety of higher-level applications. Over the years, DMAS has developed Oceans 2.0, an environment where these techniques are implemented. The environment thereby becomes a platform in that it allows for easy addition of new and advanced features that build upon the tools at the core of the system. The applications that have been developed include: data search and retrieval, including options such as data product generation, data decimation or averaging; dynamic infrastructure description (search of all observatory metadata) and visualization; and data visualization, including dynamic scalar data plots and integrated fast video segment search and viewing. Building upon these basic applications are new concepts, coming from the Web 2.0 world, that DMAS has added: they allow people equipped only with a web browser to collaborate and contribute their findings or work results to the wider community.
Examples include: addition of metadata tags to any part of the infrastructure or to any data item (annotations); the ability to edit, execute, share and distribute Matlab code online, from a simple web browser, with specific calls within the code to access data; the ability to interactively and graphically build pipeline processing jobs that can be executed on the cloud; web-based, interactive instrument control tools that allow users to truly share the use of the instruments and communicate with each other; and, last but not least, a public tool in the form of a game that crowd-sources the inventory of the underwater video archive content, thereby adding tremendous amounts of metadata. Beyond those tools, which represent the functionality presently available to users, a number of the Web Services dedicated to data access are being exposed for anyone to use. This allows not only for ad hoc data access by individuals who need non-interactive access, but will also foster the development of new applications in a variety of areas.
An automatic method for retrieving and indexing catalogues of biomedical courses.
Maojo, Victor; de la Calle, Guillermo; García-Remesal, Miguel; Bankauskaite, Vaida; Crespo, Jose
2008-11-06
Although there is wide information about Biomedical Informatics education and courses in different Websites, information is usually not exhaustive and difficult to update. We propose a new methodology based on information retrieval techniques for extracting, indexing and retrieving automatically information about educational offers. A web application has been developed to make available such information in an inventory of courses and educational offers.
GeoCENS: a geospatial cyberinfrastructure for the world-wide sensor web.
Liang, Steve H L; Huang, Chih-Yuan
2013-10-02
The world-wide sensor web has become a very useful technique for monitoring the physical world at spatial and temporal scales that were previously impossible. Yet we believe that the full potential of the sensor web has thus far not been revealed. In order to harvest the world-wide sensor web's full potential, a geospatial cyberinfrastructure is needed to store, process, and deliver large amounts of sensor data collected worldwide. In this paper, we first define the issue of the sensor web long tail, followed by our view of the world-wide sensor web architecture. Then, we introduce the Geospatial Cyberinfrastructure for Environmental Sensing (GeoCENS) architecture and explain each of its components. Finally, with a demonstration of three real-world powered-by-GeoCENS sensor web applications, we believe that the GeoCENS architecture can successfully address the sensor web long tail issue and consequently realize the world-wide sensor web vision.
Review of Extracting Information From the Social Web for Health Personalization
Karlsen, Randi; Bonander, Jason
2011-01-01
In recent years the Web has come into its own as a social platform where health consumers are actively creating and consuming Web content. Moreover, as the Web matures, consumers are gaining access to personalized applications adapted to their health needs and interests. The creation of personalized Web applications relies on extracted information about the users and the content to personalize. The Social Web itself provides many sources of information that can be used to extract information for personalization apart from traditional Web forms and questionnaires. This paper provides a review of different approaches for extracting information from the Social Web for health personalization. We reviewed research literature across different fields addressing the disclosure of health information in the Social Web, techniques to extract that information, and examples of personalized health applications. In addition, the paper includes a discussion of technical and socioethical challenges related to the extraction of information for health personalization. PMID:21278049
Development of a web-based CT dose calculator: WAZA-ARI.
Ban, N; Takahashi, F; Sato, K; Endo, A; Ono, K; Hasegawa, T; Yoshitake, T; Katsunuma, Y; Kai, M
2011-09-01
A web-based computed tomography (CT) dose calculation system (WAZA-ARI) is being developed based on the modern techniques for the radiation transport simulation and for software implementation. Dose coefficients were calculated in a voxel-type Japanese adult male phantom (JM phantom), using the Particle and Heavy Ion Transport code System. In the Monte Carlo simulation, the phantom was irradiated with a 5-mm-thick, fan-shaped photon beam rotating in a plane normal to the body axis. The dose coefficients were integrated into the system, which runs as Java servlets within Apache Tomcat. Output of WAZA-ARI for GE LightSpeed 16 was compared with the dose values calculated similarly using MIRD and ICRP Adult Male phantoms. There are some differences due to the phantom configuration, demonstrating the significance of the dose calculation with appropriate phantoms. While the dose coefficients are currently available only for limited CT scanner models and scanning options, WAZA-ARI will be a useful tool in clinical practice when development is finalised.
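Downstream of Monte Carlo organ-dose coefficients like those WAZA-ARI tabulates, the quantity reported to users is typically the ICRP effective dose, the tissue-weighted sum E = Σ_T w_T H_T over organ equivalent doses. A sketch with made-up organ doses (the lung and liver weighting factors shown are the ICRP Publication 103 values):

```python
def effective_dose(organ_doses_msv, tissue_weights):
    """ICRP-style effective dose: E = sum over tissues T of w_T * H_T,
    where H_T is the organ equivalent dose (mSv) and w_T the tissue
    weighting factor."""
    return sum(tissue_weights[t] * h for t, h in organ_doses_msv.items())
```

For example, 10 mSv to lung (w = 0.12) and 2 mSv to liver (w = 0.04) contribute 1.28 mSv of effective dose; a real calculation sums over the full ICRP tissue list.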
Internet (WWW) based system of ultrasonic image processing tools for remote image analysis.
Zeng, Hong; Fei, Ding-Yu; Fu, Cai-Ting; Kraft, Kenneth A
2003-07-01
Ultrasonic Doppler color imaging can provide anatomic information and simultaneously render flow information within blood vessels for diagnostic purpose. Many researchers are currently developing ultrasound image processing algorithms in order to provide physicians with accurate clinical parameters from the images. Because researchers use a variety of computer languages and work on different computer platforms to implement their algorithms, it is difficult for other researchers and physicians to access those programs. A system has been developed using World Wide Web (WWW) technologies and HTTP communication protocols to publish our ultrasonic Angle Independent Doppler Color Image (AIDCI) processing algorithm and several general measurement tools on the Internet, where authorized researchers and physicians can easily access the program using web browsers to carry out remote analysis of their local ultrasonic images or images provided from the database. In order to overcome potential incompatibility between programs and users' computer platforms, ActiveX technology was used in this project. The technique developed may also be used for other research fields.
Best Practices for Building Web Data Portals
NASA Astrophysics Data System (ADS)
Anderson, R. A.; Drew, L.
2013-12-01
With a data archive of more than 1.5 petabytes and a key role as the NASA Distributed Active Archive Center (DAAC) for synthetic aperture radar (SAR) data, the Alaska Satellite Facility (ASF) has an imperative to develop effective Web data portals. As part of continuous enhancement and expansion of its website, ASF recently created two data portals for distribution of SAR data: one for the archiving and distribution of NASA's MEaSUREs Wetlands project and one for newly digitally processed data from NASA's 1978 Seasat satellite. These case studies informed ASF's development of the following set of best practices for developing Web data portals. 1) Maintain well-organized, quality data. This is fundamental. If data are poorly organized or contain errors, credibility is lost and the data will not be used. 2) Match data to likely data uses. 3) Identify audiences in as much detail as possible. ASF DAAC's Seasat and Wetlands portals target three groups of users: a) scientists already familiar with ASF DAAC's SAR archive and our data download tool, Vertex; b) scientists not familiar with SAR or ASF, but who can use the data for their research of oceans, sea ice, volcanoes, land deformation and other Earth sciences; c) audiences wishing to learn more about SAR and its use in Earth sciences. 4) Identify the heaviest data uses and the terms scientists search for online when trying to find data for those uses. 5) Create search engine optimized (SEO) Web content that corresponds to those searches. Search engines do not yet search raw data, so Web data portals must include content that ties the data to its likely uses. 6) Create Web designs that best serve data users (user-centered design), not designs reflecting how the organization views itself or its data. Usability testing was conducted for the ASF DAAC Wetlands portal to improve the user experience. 7) Use SEO tips and techniques.
The ASF DAAC Seasat portal used numerous SEO techniques, including social media, blogging technology, SEO-rich content and more. As a result, it was on the first page of numerous related Google search results within 24 hours of the portal launch. 8) Build in-browser data analysis tools showing scientists how the data can be used in their research. The ASF DAAC Wetlands portal demonstrates that allowing users to examine the data quickly and graphically online helps them perceive the value of the data and how to use it. 9) Use responsive Web design (RWD) so content and tools can be accessed from a wide range of devices. Wetlands and Seasat can be accessed from smartphones, tablets and desktops. 10) Use Web frameworks to enable rapid building of new portals using consistent design patterns. Seasat and Wetlands both use Django and Twitter Bootstrap. 11) Use load-balanced servers if high demand for the data is anticipated. Using load-balanced servers for the Seasat and Wetlands portals allows ASF to simply add hardware as needed to support increased capacity. 12) Use open-source software when possible. Seasat and Wetlands portal development costs were reduced, and functionality was increased, with the use of open-source software. 13) Use third-party virtual servers (e.g. Amazon EC2 and S3 Services) where applicable. 14) Track visitors using analytic tools. 15) Continually improve design.
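The scaling property behind best practice 11 (load-balanced servers let ASF "simply add hardware as needed") can be sketched with a toy round-robin dispatcher; the server names are hypothetical, and a production balancer would of course sit in front of real web servers:

```python
from itertools import cycle

class RoundRobinBalancer:
    """Toy round-robin dispatcher: adding a server immediately spreads
    subsequent requests across the larger pool, which is the property
    that lets capacity grow by simply adding hardware."""

    def __init__(self, servers):
        self.servers = list(servers)
        self._cycle = cycle(self.servers)

    def add_server(self, server):
        self.servers.append(server)
        self._cycle = cycle(self.servers)  # rebuild the rotation with the new pool

    def route(self, request):
        return (next(self._cycle), request)

lb = RoundRobinBalancer(["web-1", "web-2"])
assignments = [lb.route(f"req-{i}")[0] for i in range(4)]
# requests alternate between web-1 and web-2
```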
SPEER-SERVER: a web server for prediction of protein specificity determining sites
Chakraborty, Abhijit; Mandloi, Sapan; Lanczycki, Christopher J.; Panchenko, Anna R.; Chakrabarti, Saikat
2012-01-01
Sites that show specific conservation patterns within subsets of proteins in a protein family are likely to be involved in the development of functional specificity. These sites, generally termed specificity determining sites (SDS), might play a crucial role in binding to a specific substrate or proteins. Identification of SDS through experimental techniques is a slow, difficult and tedious job. Hence, it is very important to develop efficient computational methods that can more expediently identify SDS. Herein, we present Specificity prediction using amino acids’ Properties, Entropy and Evolution Rate (SPEER)-SERVER, a web server that predicts SDS by analyzing quantitative measures of the conservation patterns of protein sites based on their physico-chemical properties and the heterogeneity of evolutionary changes between and within the protein subfamilies. This web server provides an improved representation of results, adds useful input and output options and integrates a wide range of analysis and data visualization tools when compared with the original standalone version of the SPEER algorithm. Extensive benchmarking finds that SPEER-SERVER exhibits sensitivity and precision performance that, on average, meets or exceeds that of other currently available methods. SPEER-SERVER is available at http://www.hpppi.iicb.res.in/ss/. PMID:22689646
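SPEER itself combines physico-chemical properties, entropy and evolutionary rates; as a much-reduced sketch of the underlying entropy idea only (not the SPEER algorithm), the snippet below flags alignment columns that are conserved within each subfamily but differ between subfamilies:

```python
import math
from collections import Counter

def column_entropy(column):
    """Shannon entropy (bits) of residue frequencies in one alignment column."""
    counts = Counter(column)
    total = len(column)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def sds_scores(subfam_a, subfam_b):
    """Score each column of two subfamily alignments: low entropy within each
    subfamily combined with high pooled entropy suggests a different conserved
    residue per subfamily, i.e. a candidate specificity determining site."""
    ncols = len(subfam_a[0])
    scores = []
    for i in range(ncols):
        col_a = [seq[i] for seq in subfam_a]
        col_b = [seq[i] for seq in subfam_b]
        within = column_entropy(col_a) + column_entropy(col_b)
        pooled = column_entropy(col_a + col_b)
        scores.append(pooled - within / 2)
    return scores
```

On a toy alignment where only one column is differently conserved between the two subfamilies, that column receives the highest score.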
Atreja, Ashish; Mehta, Neil; Miller, Deborah; Moore, Shirley; Nichols, Karen; Miller, Holly; Harris, C Martin
2005-01-01
Disabled and elderly populations are the fastest growing segment of Internet usage. However, these people face an “Inverse Information Law”: access to appropriate information is most difficult for those who need it the most. Our tertiary care Multiple Sclerosis (MS) center received funding to develop an MS-specific patient portal linked to a web messaging system, so as to empower patients to become more active participants in their health care. In order to design an effective portal, we conducted a qualitative study using focus groups and direct observation techniques. The study explores the perceptions, expectations and interactions of MS patients with the portal and underscores the many challenges MS patients face in getting quality health information on the Internet. Many of the patient barriers were due to inappropriate font sizes, low contrast, cluttered web pages and the use of dynamic and flashing objects. Some of these issues are not addressed by Section 508 accessibility guidelines. We believe that any future patient portal or health information web site needs to address these issues and educate patients about accessibility options to enhance utilization and user satisfaction. PMID:16778993
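The low-contrast barrier the study describes can be checked programmatically. As an illustrative sketch (using the WCAG 2.x relative-luminance and contrast-ratio formulas, which are not the Section 508 guidelines the paper discusses):

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB colour given as 0-255 channels."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colours; WCAG AA asks for >= 4.5:1 for body text."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)),
                             reverse=True)
    return (lighter + 0.05) / (darker + 0.05)
```

Black on white yields the maximum ratio of 21:1, while light grey on white falls well below the 4.5:1 body-text threshold.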
Innovative Visualization Techniques applied to a Flood Scenario
NASA Astrophysics Data System (ADS)
Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael
2013-04-01
The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time-sampled data sets, providing an integrated mechanism that aids users in collaboratively exploring, presenting and communicating visually complex and dynamic data. Here we present these concepts in the context of a 4-hour flood scenario from Lisbon in 2010, with data that include measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain in more detail two of these that are not currently in common use for data visualization: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for the colour legend, etc., and provide a snapshot of the current application state, which can then be shared as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it.
The second technique, web-based linked views, includes multiple windows which interactively respond to user selections, so that when an object is selected or changed in one window, it automatically updates in all the other windows. These concepts can be part of a collaborative platform, where multiple people share and work together on the data via online access, which also allows remote use from mobile platforms. Storytelling augments analysis and decision-making capabilities, allowing users to assimilate complex situations and reach informed decisions, in addition to helping the public visualize information. In our visualization scenario, developed in the context of the VA-4D project for the European Space Agency (see http://www.ca3-uninova.org/project_va4d), we make use of the GAV (GeoAnalytics Visualization) framework, a web-oriented visual analytics application based on multiple interactive views. The final visualization that we produce includes multiple interactive views, including a dynamic multi-layer map surrounded by other visualizations such as bar charts, time graphs and scatter plots. The map provides flood and building information on top of a base city map (street maps and/or satellite imagery provided by online map services such as Google Maps, Bing Maps, etc.). Damage over time for selected buildings, damage for all buildings at a chosen time period, and the correlation between damage and water depth can be analysed in the other views. This interactive web-based visualization, which incorporates the ideas of storytelling, web-based linked views and other visualization techniques for a 4-hour flood event in Lisbon in 2010, can be found online at http://www.ncomva.se/flash/projects/esa/flooding/.
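The snapshot mechanism described above (application state serialized into a shareable hyperlink) can be sketched as follows; the state fields are invented for illustration, and GAV's actual snapshot format is not specified in the abstract:

```python
import base64
import json
from urllib.parse import urlencode, parse_qs, urlparse

def snapshot_url(base_url, state):
    """Encode the current application state (selected region, time step,
    colour-legend class, ...) into a shareable hyperlink."""
    blob = base64.urlsafe_b64encode(json.dumps(state, sort_keys=True).encode()).decode()
    return f"{base_url}?{urlencode({'snapshot': blob})}"

def restore_state(url):
    """Recreate the application state from a snapshot hyperlink."""
    blob = parse_qs(urlparse(url).query)["snapshot"][0]
    return json.loads(base64.urlsafe_b64decode(blob))

# hypothetical snapshot of a flood view: time step, map region, active legend
state = {"time_step": 12, "region": [38.70, -9.18, 38.74, -9.12], "legend": "damage"}
link = snapshot_url("https://example.org/flood", state)
```

Anyone following the link can then recreate the exact view the author captured.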
Scalability Issues for Remote Sensing Infrastructure: A Case Study.
Liu, Yang; Picard, Sean; Williamson, Carey
2017-04-29
For the past decade, a team of University of Calgary researchers has operated a large "sensor Web" to collect, analyze, and share scientific data from remote measurement instruments across northern Canada. This sensor Web receives real-time data streams from over a thousand Internet-connected sensors, with a particular emphasis on environmental data (e.g., space weather, auroral phenomena, atmospheric imaging). Through research collaborations, we had the opportunity to evaluate the performance and scalability of this remote sensing infrastructure. This article reports the lessons learned from our study, which considered both the data collection and the data dissemination aspects of the system. On the data collection front, we used benchmarking techniques to identify and fix a performance bottleneck in the system's memory management for TCP data streams, while also improving system efficiency on multi-core architectures. On the data dissemination front, we used passive and active network traffic measurements to identify and reduce excessive network traffic from Web robots and from the JavaScript techniques used for data sharing. While our results are from one specific sensor Web system, the lessons learned may apply to other scientific Web sites with remote sensing infrastructure.
Promoting Teachers' Positive Attitude towards Web Use: A Study in Web Site Development
ERIC Educational Resources Information Center
Akpinar, Yavuz; Bayramoglu, Yusuf
2008-01-01
The purpose of the study was to examine effects of a compact training for developing web sites on teachers' web attitude, as composed of: web self efficacy, perceived web enjoyment, perceived web usefulness and behavioral intention to use the web. To measure the related constructs, the Web Attitude Scale was adapted into Turkish and tested with a…
Oh! Web 2.0, Virtual Reference Service 2.0, Tools and Techniques (I): A Basic Approach
ERIC Educational Resources Information Center
Arya, Harsh Bardhan; Mishra, J. K.
2011-01-01
This study targets librarians and information professionals who use Web 2.0 tools and applications with a view to providing snapshots on how Web 2.0 technologies are used. It also aims to identify values and impact that such tools have exerted on libraries and their services, as well as to detect various issues associated with the implementation…
Semantic Similarity between Web Documents Using Ontology
NASA Astrophysics Data System (ADS)
Chahal, Poonam; Singh Tomer, Manjeet; Kumar, Suresh
2018-06-01
The World Wide Web makes information available as a structure of interlinked web pages. However, extracting significant information with the assistance of a search engine remains difficult, because web information is written mainly in natural language and is intended for human readers. Several efforts have been made in semantic similarity computation between documents using words, concepts and concept relationships, but the results still fall short of user requirements. This paper proposes a novel technique for computing semantic similarity between documents that takes into account not only the concepts present in the documents but also the relationships between those concepts. In our approach, documents are processed by building an ontology for each document using a base ontology and a dictionary of concept records, each listing the probable words that represent a given concept. Finally, the document ontologies are compared to find their semantic similarity, taking the relationships among concepts into account. Relevant concepts and relations between concepts are identified by capturing author and user intention. The proposed semantic analysis technique provides improved results as compared to existing techniques.
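As a minimal sketch of the core idea, documents can be scored by overlap of both their concepts and their (concept, relation, concept) triples; this toy set-based measure stands in for the full ontology comparison, which the abstract does not specify in detail:

```python
def jaccard(a, b):
    """Jaccard overlap of two sets (0.0 when both are empty)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def document_similarity(doc_a, doc_b, concept_weight=0.5):
    """Combine concept overlap with overlap of (concept, relation, concept)
    triples, so documents whose shared concepts are also related the same way
    score higher than documents sharing concepts in unrelated contexts."""
    concept_sim = jaccard(doc_a["concepts"], doc_b["concepts"])
    relation_sim = jaccard(doc_a["relations"], doc_b["relations"])
    return concept_weight * concept_sim + (1 - concept_weight) * relation_sim

# hypothetical mini-ontologies extracted from three documents
doc1 = {"concepts": {"bank", "loan"}, "relations": {("bank", "issues", "loan")}}
doc2 = {"concepts": {"bank", "loan"}, "relations": {("bank", "issues", "loan")}}
doc3 = {"concepts": {"bank", "river"}, "relations": {("river", "has", "bank")}}
```

Here doc1 and doc3 share the concept "bank" but no relation involving it, so they score far below the identical pair doc1/doc2.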
A novel methodology for querying web images
NASA Astrophysics Data System (ADS)
Prabhakara, Rashmi; Lee, Ching Cheng
2005-01-01
Ever since the advent of the Internet, there has been an immense growth in the amount of image data available on the World Wide Web. With such a magnitude of image availability, an efficient and effective image retrieval system is required to make use of this information. This research presents an effective image matching and indexing technique that improves on existing integrated image retrieval methods. The proposed technique follows a two-phase approach, integrating query-by-topic and query-by-example specification methods. The first phase consists of topic-based image retrieval using an improved text information retrieval (IR) technique that makes use of the structured format of HTML documents. It employs a focused crawler that lets the user enter not only the keyword for the topic-based search but also the scope in which to find the images. The second phase uses the query-by-example specification to perform a low-level content-based image match, retrieving a smaller set of results closer to the example image. Information related to image features is automatically extracted from the query image by the image processing system. A computationally inexpensive technique based on color features is used to perform content-based matching of images. The main goal is to develop a functional image search and indexing system and to demonstrate that better retrieval results can be achieved with this proposed hybrid search technique.
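The abstract does not name the exact color feature used; a standard computationally inexpensive choice of the same flavor is a coarse color histogram compared by histogram intersection, sketched here on plain pixel tuples:

```python
def color_histogram(pixels, bins_per_channel=4):
    """Coarse RGB histogram: each 0-255 channel quantized into a few bins,
    normalized so the histogram sums to 1."""
    step = 256 // bins_per_channel
    hist = [0] * bins_per_channel ** 3
    for r, g, b in pixels:
        idx = ((r // step) * bins_per_channel ** 2
               + (g // step) * bins_per_channel
               + (b // step))
        hist[idx] += 1
    total = len(pixels)
    return [h / total for h in hist]

def histogram_intersection(h1, h2):
    """Swain-Ballard histogram intersection: 1.0 for identical distributions,
    0.0 for completely disjoint colour content."""
    return sum(min(a, b) for a, b in zip(h1, h2))
```

A mostly red query image then scores 1.0 against itself and 0.0 against a mostly blue candidate, giving a cheap ranking signal for the second retrieval phase.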
The SAMCO Web-platform for resilience assessment in mountainous valleys impacted by landslide risks.
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Thomas, Loic; Bernardie, Severine
2016-04-01
The ANR-SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project elaborates methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several points: (1) the definition of the potential impacts of global environmental changes (climate system, ecosystem e.g. land use, socio-economic system) on landslide hazards; (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation); and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, and the development of a GIS-based demonstration platform available on the web. The strength and originality of the SAMCO project lie in the combination of different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) that are implemented in a user-oriented web platform, currently in development. We present the first results of this development task, the architecture and functions of the web tools, and the case studies database showing the multi-hazard maps and the stakes at risk. Risk assessments over several areas of interest in Alpine and Pyrenean valleys are still in progress, but the first analyses are presented for current and future periods, for which climate change and land-use (economic, geographical and social) scenarios are taken into account. This tool, dedicated to stakeholders, should ultimately be used to evaluate the resilience of mountainous regions, since multiple scenarios can be tested and compared.
A combined approach of AHP and TOPSIS methods applied in the field of integrated software systems
NASA Astrophysics Data System (ADS)
Berdie, A. D.; Osaci, M.; Muscalagiu, I.; Barz, C.
2017-05-01
Adopting the most appropriate technology for developing applications on an integrated software system for enterprises may result in great savings in both cost and hours of work. This paper proposes a research study to determine a hierarchy among three SAP (System Applications and Products in Data Processing) technologies. The technologies Web Dynpro (WD), Floorplan Manager (FPM) and CRM WebClient UI (CRM WCUI) are evaluated against multiple criteria, in terms of the performance obtained by implementing the same web business application with each. To establish the hierarchy, a multi-criteria analysis model that combines the AHP (Analytic Hierarchy Process) and TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) methods was proposed. This model was built with the help of the SuperDecision software, which is based on the AHP method and determines the weights for the selected sets of criteria. The TOPSIS method was then used to obtain the final ranking and the technology hierarchy.
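The TOPSIS ranking step can be sketched directly; in the paper's workflow the weights would come from the AHP step (via SuperDecision), and the matrix values below are invented for illustration:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    matrix:  rows = alternatives, columns = criteria
    weights: criterion weights (e.g., from AHP), summing to 1
    benefit: True where higher is better for that criterion, False where lower is"""
    ncols = len(weights)
    # vector-normalize each criterion column, then apply the weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]
    # ideal-best and ideal-worst points across all alternatives
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((x - i) ** 2 for x, i in zip(row, ideal)))
        d_neg = math.sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient in [0, 1]
    return scores

# toy example: criterion 0 is a benefit, criterion 1 is a cost
scores = topsis([[9.0, 1.0], [1.0, 9.0]],
                weights=[0.5, 0.5],
                benefit=[True, False])
# alternative 0 is best on both criteria, so its closeness score is 1.0
```

Alternatives are then ranked by descending closeness coefficient to form the hierarchy.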
Web-based system for surgical planning and simulation
NASA Astrophysics Data System (ADS)
Eldeib, Ayman M.; Ahmed, Mohamed N.; Farag, Aly A.; Sites, C. B.
1998-10-01
The growing scientific knowledge and rapid progress in medical imaging techniques has led to an increasing demand for better and more efficient methods of remote access to high-performance computer facilities. This paper introduces a web-based telemedicine project that provides interactive tools for surgical simulation and planning. The presented approach makes use of client-server architecture based on new internet technology where clients use an ordinary web browser to view, send, receive and manipulate patients' medical records while the server uses the supercomputer facility to generate online semi-automatic segmentation, 3D visualization, surgical simulation/planning and neuroendoscopic procedures navigation. The supercomputer (SGI ONYX 1000) is located at the Computer Vision and Image Processing Lab, University of Louisville, Kentucky. This system is under development in cooperation with the Department of Neurological Surgery, Alliant Health Systems, Louisville, Kentucky. The server is connected via a network to the Picture Archiving and Communication System at Alliant Health Systems through a DICOM standard interface that enables authorized clients to access patients' images from different medical modalities.
Fabrication and Test of Large Area Spider-Web Bolometers for CMB Measurements
NASA Astrophysics Data System (ADS)
Biasotti, M.; Ceriale, V.; Corsini, D.; De Gerone, M.; Gatti, F.; Orlando, A.; Pizzigoni, G.
2016-08-01
Detecting the primordial 'B-mode' polarization of the cosmic microwave background is one of the major challenges of modern observational cosmology. Microwave telescopes need sensitive cryogenic bolometers with an overall equivalent noise temperature in the nK range. In this paper, we present the development status of large-area (about 1 cm²) spider-web bolometers, which imply additional fabrication challenges. The spider-web is a suspended Si3N4 membrane, 1 μm thick and 8 mm in diameter, with a mesh size of 250 μm. The thermally sensitive element is a superconducting transition edge sensor (TES) at the center of the bolometer. The first prototype is a Ti-Au TES with a transition temperature tuned around 350 mK; new devices will use a Mo-Au bilayer tuned to a transition temperature of 500 mK. We present the fabrication process, based on micro-machining techniques starting from a silicon wafer covered with SiO2 and Si3N4 CVD films (0.3 and 1 μm thick, respectively), and preliminary tests.
A Module Experimental Process System Development Unit (MEPSDU)
NASA Technical Reports Server (NTRS)
1981-01-01
Design work for a photovoltaic module, fabricated using single crystal silicon dendritic web sheet material, resulted in the identification of a surface treatment for the module glass superstrate which improved module efficiencies. A final solar module environmental test, a simulated hailstone impact test, was conducted on full-size module superstrates to verify that the module's tempered glass superstrate can withstand specified hailstone impacts near the corners and edges of the module. Process sequence design work on the metallization process, selective liquid dopant investigation, dry processing, and antireflective/photoresist application technique tasks, as well as the optimum thickness for Ti/Pd, is discussed. A noncontact cleaning method for raw web cleaning was identified, and antireflective and photoresist coatings for the dendritic webs were selected. The design of a cell string conveyor, an interconnect feed system, and a rolling ultrasonic spot bonding head, and the identification of the optimal commercially available programmable control system, are also discussed. An economic analysis to assess cost goals of the process sequence is also given.
Exploration of SWRL Rule Bases through Visualization, Paraphrasing, and Categorization of Rules
NASA Astrophysics Data System (ADS)
Hassanpour, Saeed; O'Connor, Martin J.; Das, Amar K.
Rule bases are increasingly being used as repositories of knowledge content on the Semantic Web. As the size and complexity of these rule bases increases, developers and end users need methods of rule abstraction to facilitate rule management. In this paper, we describe a rule abstraction method for Semantic Web Rule Language (SWRL) rules that is based on lexical analysis and a set of heuristics. Our method results in a tree data structure that we exploit in creating techniques to visualize, paraphrase, and categorize SWRL rules. We evaluate our approach by applying it to several biomedical ontologies that contain SWRL rules, and show how the results reveal rule patterns within the rule base. We have implemented our method as a plug-in tool for Protégé-OWL, the most widely used ontology modeling software for the Semantic Web. Our tool can allow users to rapidly explore content and patterns in SWRL rule bases, enabling their acquisition and management.
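As a greatly simplified stand-in for the lexical analysis described above (the authors' actual heuristics and tree structure are richer), a SWRL-style rule string can be split into predicate atoms and abstracted to a signature for grouping similar rules:

```python
import re

ATOM = re.compile(r"(\w+)\(([^)]*)\)")

def parse_rule(rule):
    """Split a SWRL-style rule into antecedent and consequent atom lists.
    Each atom becomes (predicate, arguments) - a flat stand-in for the tree
    structure used for visualization and paraphrasing."""
    body, head = rule.split("->")
    def parse_side(side):
        return [(pred, tuple(a.strip() for a in args.split(",")))
                for pred, args in ATOM.findall(side)]
    return {"antecedent": parse_side(body), "consequent": parse_side(head)}

def signature(parsed):
    """Abstract a rule to its predicate pattern, for categorizing rules
    with the same shape regardless of variable names."""
    return (tuple(p for p, _ in parsed["antecedent"]),
            tuple(p for p, _ in parsed["consequent"]))

r = parse_rule("Person(?x) ^ hasParent(?x,?y) -> childOf(?y,?x)")
```

Grouping a rule base by such signatures reveals the repeated rule patterns the paper's evaluation surfaces.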
Zhou, Zhiwei; Xiong, Xin; Zhu, Zheng-Jiang
2017-07-15
In metabolomics, rigorous structural identification of metabolites presents a challenge for bioinformatics. The use of collision cross-section (CCS) values of metabolites derived from ion mobility-mass spectrometry effectively increases the confidence of metabolite identification, but this technique suffers from the limited number of available CCS values. Currently, there is no software available for rapidly generating metabolites' CCS values. Here, we developed the first web server, namely MetCCS Predictor, for predicting CCS values. It can predict the CCS values of metabolites from molecular descriptors within a few seconds. Common users with a limited background in bioinformatics can benefit from this software and effectively improve metabolite identification in metabolomics. The web server is freely available at: http://www.metabolomics-shanghai.org/MetCCS/ . jiangzhu@sioc.ac.cn. Supplementary data are available at Bioinformatics online.
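MetCCS predicts CCS from many molecular descriptors with a trained model; purely as a toy illustration of the descriptor-to-CCS regression idea, a one-descriptor least-squares fit on hypothetical (mass, CCS) pairs looks like this:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b with a single descriptor."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    return a, mean_y - a * mean_x

# hypothetical training pairs: molecular mass (Da) -> measured CCS (angstrom^2);
# real predictors use many descriptors, not mass alone
masses = [180.1, 342.3, 146.1, 90.1]
ccs = [139.4, 176.3, 128.6, 119.9]
a, b = fit_linear(masses, ccs)
predict = lambda m: a * m + b
```

The fitted slope is positive, reflecting the broad tendency of CCS to grow with molecular size; a real model captures far more structure than this.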
Development of web tools to disseminate space geodesy data-related products
NASA Astrophysics Data System (ADS)
Soudarin, L.; Ferrage, P.; Mezerette, A.
2014-12-01
In order to promote the products of the DORIS system, the French Space Agency CNES has developed and implemented, on the web site of the International DORIS Service (IDS), a set of plot tools to interactively build and display time series of site positions, orbit residuals and terrestrial parameters (scale, geocenter). An interactive global map is also available to select sites and to access their information. Besides the products provided by the CNES Orbitography Team and the IDS components, these tools allow comparing time evolutions of coordinates for collocated DORIS and GNSS stations, thanks to collaboration with the Terrestrial Frame Combination Center of the International GNSS Service (IGS). The next step, currently in progress, is the creation of a database to improve the robustness and efficiency of the tools, with the objective of offering a complete web service to foster data exchange with the other geodetic services of the International Association of Geodesy (IAG). Work to visualize and compare position time series from the four main space geodetic techniques (DORIS, GNSS, SLR and VLBI) is already under way at the French level. A dedicated version of these web tools has been developed for the French Space Geodesy Research Group (GRGS). It will give access to position time series provided by the GRGS Analysis Centers involved in DORIS, GNSS, SLR and VLBI data processing for the realization of the International Terrestrial Reference Frame. In this presentation, we describe the functionalities of these tools and address some aspects of the time series (content, format).
Use of NASA Near Real-Time and Archived Satellite Data to Support Disaster Assessment
NASA Technical Reports Server (NTRS)
McGrath, Kevin M.; Molthan, Andrew L.; Burks, Jason E.
2014-01-01
NASA's Short-term Prediction Research and Transition (SPoRT) Center partners with the NWS to provide near real-time data in support of a variety of weather applications, including disasters. SPoRT supports NASA's Applied Sciences Program: Disasters focus area by developing techniques that aid the disaster monitoring, response, and assessment communities. SPoRT has explored a variety of techniques for utilizing archived and near real-time NASA satellite data. An increasing number of end-user applications, such as the NWS Damage Assessment Toolkit (DAT), access geospatial data via a Web Mapping Service (WMS). SPoRT has begun developing open-standard Geographic Information Systems (GIS) data sets served via WMS to respond to end-user needs.
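A WMS GetMap request of the kind an end-user application like the DAT would issue can be sketched with the standard WMS 1.3.0 parameters; the endpoint and layer name below are made up for illustration:

```python
from urllib.parse import urlencode

def getmap_url(endpoint, layer, bbox, size=(1024, 768), crs="EPSG:4326",
               fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request URL, the kind of call a GIS client
    issues against a Web Mapping Service to fetch a rendered map image."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # min/max coords in CRS order
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": fmt,
    }
    return f"{endpoint}?{urlencode(params)}"

url = getmap_url("https://example.gov/wms", "tornado_damage",
                 (34.0, -87.5, 35.0, -86.5))
```

Any WMS-capable client (desktop GIS, web map, or a mobile assessment tool) can consume such a URL without knowing how the underlying satellite data are stored.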
Mericli, Alexander F; Black, Jonathan S; Morgan, Raymond F
2015-09-01
To describe the technique and results of the tapered M-to-V flap for syndactyly web space construction. We reviewed a single-surgeon, single-institution experience of all syndactyly reconstructions performed between 1982 and 2013. Demographic data and patient characteristics were recorded. Complications assessed included flap loss, graft loss, web creep, infection, restricted range of motion, and digit deviation. A total of 138 web spaces were reconstructed in 93 patients. There were 89 primary congenital hand and 32 foot syndactylies. Four patients had an acquired simple incomplete syndactyly, and 13 patients had secondary reconstructions. The complication rate was 14%. The most common complication was web creep resulting from partial skin graft loss (12 web spaces; 9%). There were no total flap losses. Univariate analysis revealed no factor to be predictive of an elevated complication rate. Average follow-up was 2.6 years (range, 6 mo to 26 y). The tapered M-to-V flap proved to be a reliable and versatile technique for web space reconstruction, offering several advantages over the standard rectangular flap method of repair, such as ease of intraoperative adjustment, a z-plasty at the palmodigital crease to minimize scar contracture, and better color match. Level of evidence: Therapeutic IV.
Fan Database and Web-tool for Choosing Quieter Spaceflight Fans
NASA Technical Reports Server (NTRS)
Allen, Christopher S.; Burnside, Nathan J.
2007-01-01
One critical aspect of designing spaceflight hardware is the selection of fans to provide the necessary cooling. With efforts to minimize cost and a tendency to be conservative about the amount of cooling provided, it is easy to choose an overpowered fan. One impact is that the fan uses more energy than necessary; the more significant impact is that the hardware produces much more acoustic noise than if an optimal fan had been chosen. Choosing the right fan for a specific hardware application is no simple task. It requires knowledge of cooling requirements and various fan performance characteristics, as well as knowledge of the aerodynamic losses of the hardware in which the fan is to be installed. Knowledge of the acoustic emissions of each fan as a function of operating condition is also required in order to choose a quieter fan for a given design point. The purpose of this paper is to describe a database and design-tool that have been developed to aid spaceflight hardware developers in choosing a fan for their application based on aerodynamic performance and reduced acoustic emissions. This web-based tool provides a limited amount of fan data, provides a method for selecting a fan based on its projected operating point, and provides a method for comparing and contrasting aerodynamic performance and acoustic data from different fans. Drill-down techniques are used to display details of the spectral noise characteristics of a fan at specific operating conditions. The fan aerodynamic and acoustic data were acquired at Ames Research Center in the Experimental Aero-Physics Branch's anechoic chamber. Acoustic data were acquired according to ANSI Standard S12.11-1987, "Method for the Measurement of Noise Emitted by Small Air-Moving Devices." One significant improvement made to this technique was automation that allows for a significant increase in flow-rate resolution.
The web-tool was developed at Johnson Space Center and is based on the web-development application SEQUEL, which includes graphics and drill-down capabilities. This paper describes the type and amount of data taken for the fans and gives examples of this data. It also describes the tool and gives examples of how it can be used to choose quieter fans for use in spaceflight hardware.
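The selection logic the tool supports can be illustrated with a small sketch: a fan's projected operating point is where its pressure-flow curve intersects the system's aerodynamic loss curve. The linear fan curve and loss coefficient below are invented for illustration, not taken from the paper's database:

```python
def operating_point(fan_curve, k, q_max, tol=1e-6):
    """Find the flow rate where fan static pressure equals system loss.

    fan_curve(q) -> static pressure [Pa] delivered at flow q (decreasing).
    System loss is modeled as k * q**2 (increasing), so the difference
    fan_curve(q) - k*q**2 crosses zero exactly once on (0, q_max);
    bisection finds that crossing.
    """
    lo, hi = 0.0, q_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if fan_curve(mid) - k * mid * mid > 0:
            lo = mid          # fan still delivers surplus pressure
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative linear fan curve: 50 Pa at shutoff, zero flow at 0.1 m^3/s.
fan = lambda q: 50.0 * (1 - q / 0.1)
q_op = operating_point(fan, k=5000.0, q_max=0.1)   # flow at operating point
dp_op = 5000.0 * q_op ** 2                         # pressure at that point
```

Given such an operating point per candidate fan, the quieter choice is the one with lower measured acoustic emissions at that condition.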
Quality in Web-Supported Learning.
ERIC Educational Resources Information Center
Fresen, Jill
2002-01-01
Discusses quality assurance for Web-based courses, based on experiences at the University of Pretoria. Topics include evaluation of courseware; the concept of quality, including quality control, quality assurance, and total quality management; implementing a quality management system; measurement techniques; and partnerships. (LRW)
Defect characterization of silicon dendritic web ribbons
NASA Technical Reports Server (NTRS)
Cheng, L. J.
1985-01-01
Progress made in the study of defect characterization of silicon dendritic web ribbon is presented. Chemical etching is used in combination with optical microscopy and the electron-beam-induced current (EBIC) technique. The effect of thermal annealing on carrier lifetime is examined.
Miller, Leslie; Schweingruber, Heidi; Oliver, Robert; Mayes, Janice; Smith, Donna
2002-02-01
New technological and cultural developments surrounding adolescents' use of the World Wide Web offer an opportunity to turn aspects of the Internet gaming phenomenon to the advantage of neuroscience education. Specifically, an experimental project to transmit aspects of problem-based learning and the National Science Standards through an interactive Web adventure is reported here. The Reconstructors is an episodic Web-based adventure series entitled Medicinal Mysteries from History. It is funded by the National Institute on Drug Abuse, and the first series focuses on opioids. It was created with the input of middle school students and teachers. Through the use of multimedia technologies, middle school students enter a futuristic world in which they become "reconstructors," members of an elite scientific unit charged with recovering lost medical knowledge about analgesic drugs. Two of the four episodes have been evaluated through a comprehensive review process involving middle school students, teachers, neuroscience researchers, and clinicians. Analysis of the pretest and posttest scores demonstrated significant knowledge gain that can validly be attributed to use of the game. These data provide evidence that science content can be transmitted through innovative online techniques without sacrificing compelling content or effective pedagogical strategies.
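A pretest/posttest knowledge gain of the kind reported is commonly assessed with a paired t statistic. The sketch below uses invented scores, not the study's data:

```python
from math import sqrt

def paired_t(pre, post):
    """Mean gain and paired t statistic for post - pre score differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean, mean / sqrt(var / n)

# Illustrative scores (out of 20) for eight students, invented for this sketch.
pre  = [8, 10, 7, 12, 9, 11, 6, 10]
post = [13, 14, 11, 15, 12, 16, 10, 13]
gain, t = paired_t(pre, post)
```

The t statistic would then be compared against the t distribution with n-1 degrees of freedom to judge significance.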
Zhang, Melvyn Wb; Tsang, Tammy; Cheow, Enquan; Ho, Cyrus Sh; Yeong, Ng Beng; Ho, Roger Cm
2014-11-11
The use of mobile phones, and specifically smartphones, in the last decade has become more and more prevalent. The latest mobile phones are equipped with comprehensive features that can be used in health care, such as providing rapid access to up-to-date evidence-based information, provision of instant communications, and improvements in organization. The estimated number of health care apps for mobile phones is increasing tremendously, but previous research has highlighted the lack of critical appraisal of new apps. This lack of appraisal of apps has largely been due to the lack of clinicians with technical knowledge of how to create an evidence-based app. We discuss two freely available methodologies for developing Web-based mobile phone apps: a website builder and an app builder. With these, users can program not just a Web-based app, but also integrate multimedia features within their app, without needing to know any programming language. We present techniques for creating a mobile Web-based app using two well-established online mobile app websites. We illustrate how to integrate text-based content within the app, as well as integration of interactive videos and rich site summary (RSS) feed information. We will also briefly discuss how to integrate a simple questionnaire survey into the mobile-based app. A questionnaire survey was administered to students to collate their perceptions towards the app. These two methodologies for developing apps have been used to convert an online electronic psychiatry textbook into two Web-based mobile phone apps for medical students rotating through psychiatry in Singapore. Since the inception of our mobile Web-based app, a total of 21,991 unique users have used the mobile app and online portal provided by WordPress, and another 717 users have accessed the app via a Web-based link. 
The user perspective survey results (n=185) showed that a high proportion of students valued the textbook and objective structured clinical examination videos featured in the app. A high proportion of students concurred that a self-designed mobile phone app would be helpful for psychiatry education. These methodologies can enable busy clinicians to develop simple mobile Web-based apps for academic, educational, and research purposes, without any prior knowledge of programming. This will be beneficial for both clinicians and users at large, as there will then be more evidence-based mobile phone apps, or at least apps that have been appraised by a clinician.
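The RSS integration described amounts to fetching and parsing a small XML document. A minimal sketch using Python's standard library, with invented feed content:

```python
import xml.etree.ElementTree as ET

# Invented RSS 2.0 feed, standing in for the app's content feed.
RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Psychiatry Updates</title>
  <item><title>OSCE video: mental state exam</title>
        <link>https://example.org/osce-mse</link></item>
  <item><title>New chapter: mood disorders</title>
        <link>https://example.org/mood</link></item>
</channel></rss>"""

def rss_items(xml_text):
    """Return (title, link) pairs for every <item> in an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

items = rss_items(RSS)
```

A mobile Web app would render these pairs as a tappable list of links, refreshed whenever the feed changes.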
Web-GIS platform for monitoring and forecasting of regional climate and ecological changes
NASA Astrophysics Data System (ADS)
Gordov, E. P.; Krupchatnikov, V. N.; Lykosov, V. N.; Okladnikov, I.; Titov, A. G.; Shulgina, T. M.
2012-12-01
The growing volume of environmental data from sensors and model outputs makes the development of a software infrastructure, based on modern information and telecommunication technologies, for the support of integrated research in the Earth sciences an urgent and important task (Gordov et al., 2012; van der Wel, 2005). The heterogeneity of datasets obtained from different sources and institutions not only hampers the interchange of data and analysis results but also complicates their intercomparison, decreasing the reliability of analysis results. Modern geophysical data processing techniques, however, allow different technological solutions to be combined when organizing such information resources. It is now generally accepted that an information-computational infrastructure should rely on the combined use of web and GIS technologies for creating applied information-computational web systems (Titov et al., 2009; Gordov et al., 2010; Gordov, Okladnikov and Titov, 2011). Using these approaches to develop internet-accessible thematic information-computational systems, and arranging data and knowledge interchange between them, is a promising way to create a distributed information-computational environment supporting multidisciplinary regional and global research in the Earth sciences, including analysis of climate changes and their impact on the spatial-temporal distribution and state of vegetation.
An experimental software and hardware platform is presented that supports a web-oriented production and research center for regional climate change investigations, combining the modern Web 2.0 approach, GIS functionality, and capabilities for running climate and meteorological models, processing large geophysical datasets, visualization, joint software development by distributed research groups, scientific analysis, and the education of undergraduate and postgraduate students. The platform software (Shulgina et al., 2012; Okladnikov et al., 2012) includes dedicated modules for the numerical processing of regional and global modeling results for subsequent analysis and visualization. Data preprocessing, runs, and visualization of results for the WRF and Planet Simulator models integrated into the platform are also provided. All functions of the center are accessible to users through a web portal from a common graphical web browser, via an interactive graphical user interface that provides, in particular, visualization of processing results, selection of a geographical region of interest (pan and zoom), and data-layer manipulation (ordering, enabling/disabling, feature extraction). The platform provides users with capabilities for analyzing heterogeneous geophysical data, including high-resolution data, and for discovering trends in climatic and ecosystem changes within different multidisciplinary studies (Shulgina et al., 2011). Even an unskilled user without specific knowledge can use it to perform computational processing and visualization of large meteorological, climatological, and satellite monitoring datasets through a unified graphical web interface.
Compression-based aggregation model for medical web services.
Al-Shammary, Dhiah; Khalil, Ibrahim
2010-01-01
Many organizations, such as hospitals, have adopted cloud Web services for their network services to avoid investing heavily in computing infrastructure. SOAP (Simple Object Access Protocol), the basic communication protocol of cloud Web services, is an XML-based protocol. Web services often suffer congestion and bottlenecks as a result of the high network traffic caused by the large XML overhead. At the same time, the massive load on cloud Web services, in terms of the large volume of client requests, has resulted in the same problem. In this paper, two XML-aware aggregation techniques based on compression concepts are proposed to aggregate medical Web messages and achieve greater message-size reduction.
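The underlying idea, that near-identical XML envelopes compress far better when aggregated into one payload, can be sketched with a generic compressor. This uses plain zlib for illustration and is not the authors' aggregation algorithm:

```python
import zlib

# Invented SOAP request, standing in for a typical medical Web message.
soap_msg = (
    '<?xml version="1.0"?>'
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    '<soap:Body><getPatientRecord><id>12345</id></getPatientRecord>'
    '</soap:Body></soap:Envelope>'
).encode()

# Aggregating similar messages before compression lets the compressor
# exploit the redundancy of the shared XML envelope across messages.
batch = soap_msg * 20                      # 20 near-identical requests
compressed = zlib.compress(batch, level=9)
ratio = len(compressed) / len(batch)       # fraction of original size
```

Compressing each message individually pays the envelope cost twenty times; compressing the aggregate pays it roughly once, which is why the batch ratio is so small.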
New Trends in Computer Assisted Language Learning and Teaching.
ERIC Educational Resources Information Center
Perez-Paredes, Pascual, Ed.; Cantos-Gomez, Pascual, Ed.
2002-01-01
Articles in this special issue include the following: "ICT and Modern Foreign Languages: Learning Opportunities and Training Needs" (Graham Davies); "Authoring, Pedagogy and the Web: Expectations Versus Reality" (Paul Bangs); "Web-based Instructional Environments: Tools and Techniques for Effective Second Language…
A High Performance SOAP Engine for Grid Computing
NASA Astrophysics Data System (ADS)
Wang, Ning; Welzl, Michael; Zhang, Liang
Web Service technology still has many defects that make its usage for Grid computing problematic, most notably the low performance of the SOAP engine. In this paper, we develop a novel SOAP engine called SOAPExpress, which adopts two key techniques for improving processing performance: SCTP data transport and dynamic early binding based data mapping. Experimental results show a significant and consistent performance improvement of SOAPExpress over Apache Axis.
Morris, Chris; Pajon, Anne; Griffiths, Susanne L.; Daniel, Ed; Savitsky, Marc; Lin, Bill; Diprose, Jonathan M.; Wilter da Silva, Alan; Pilicheva, Katya; Troshin, Peter; van Niekerk, Johannes; Isaacs, Neil; Naismith, James; Nave, Colin; Blake, Richard; Wilson, Keith S.; Stuart, David I.; Henrick, Kim; Esnouf, Robert M.
2011-01-01
The techniques used in protein production and structural biology have been developing rapidly, but techniques for recording the laboratory information produced have not kept pace. One approach is the development of laboratory information-management systems (LIMS), which typically use a relational database schema to model and store results from a laboratory workflow. The underlying philosophy and implementation of the Protein Information Management System (PiMS), a LIMS development specifically targeted at the flexible and unpredictable workflows of protein-production research laboratories of all scales, is described. PiMS is a web-based Java application that uses either Postgres or Oracle as the underlying relational database-management system. PiMS is available under a free licence to all academic laboratories either for local installation or for use as a managed service. PMID:21460443
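The relational-schema approach mentioned can be sketched in miniature. The tables below are a hypothetical, drastically simplified illustration of modeling a protein-production workflow; PiMS's actual schema (on Postgres or Oracle) is far richer:

```python
import sqlite3

# In-memory toy LIMS: targets, samples derived from them, and step results.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE target (id INTEGER PRIMARY KEY, gene TEXT NOT NULL);
CREATE TABLE sample (id INTEGER PRIMARY KEY,
                     target_id INTEGER REFERENCES target(id),
                     construct TEXT);
CREATE TABLE result (id INTEGER PRIMARY KEY,
                     sample_id INTEGER REFERENCES sample(id),
                     step TEXT, outcome TEXT);
""")
con.execute("INSERT INTO target(gene) VALUES ('lysC')")
con.execute("INSERT INTO sample(target_id, construct) VALUES (1, 'pET28a-lysC')")
con.execute("INSERT INTO result(sample_id, step, outcome) "
            "VALUES (1, 'expression', 'soluble')")

# Trace a recorded outcome back to its target, as a LIMS query would.
row = con.execute("""
    SELECT t.gene, r.step, r.outcome
    FROM result r JOIN sample s ON r.sample_id = s.id
                  JOIN target t ON s.target_id = t.id
""").fetchone()
```

The point of the schema is exactly this traceability: every result row links back through the workflow to the target it concerns.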
Digital Mapping Techniques '07 - Workshop Proceedings
Soller, David R.
2008-01-01
The Digital Mapping Techniques '07 (DMT'07) workshop was attended by 85 technical experts from 49 agencies, universities, and private companies, including representatives from 27 state geological surveys. This year's meeting, the tenth in the annual series, was hosted by the South Carolina Geological Survey, from May 20-23, 2007, on the University of South Carolina campus in Columbia, South Carolina. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.
Oluwasola, Abideen O; Malaka, David; Khramtsov, Andrey Ilyich; Ikpatt, Offiong Francis; Odetunde, Abayomi; Adeyanju, Oyinlolu Olorunsogo; Sveen, Walmy Elisabeth; Falusi, Adeyinka Gloria; Huo, Dezheng; Olopade, Olufunmilayo Ibironke
2013-12-01
The importance of hormone receptor status in assigning treatment and the potential use of human epidermal growth factor receptor 2 (HER2)-targeted therapy have made it beneficial for laboratories to improve detection techniques. Because interlaboratory variability in immunohistochemistry (IHC) tests may also affect studies of breast cancer subtypes in different countries, we undertook a Web-based quality improvement training and a comparative study of accuracy of immunohistochemical tests of breast cancer biomarkers between a well-established laboratory in the United States (University of Chicago) and a field laboratory in Ibadan, Nigeria. Two hundred and thirty-two breast tumor blocks were evaluated for estrogen receptors (ERs), progesterone receptors (PRs), and HER2 status at both laboratories using tissue microarray technique. Initially, concordance analysis revealed κ scores of 0.42 (moderate agreement) for ER, 0.41 (moderate agreement) for PR, and 0.39 (fair agreement) for HER2 between the 2 laboratories. Antigen retrieval techniques and scoring methods were identified as important reasons for discrepancy. Web-based conferences using Web conferencing tools such as Skype and WebEx were then held periodically to discuss IHC staining protocols and standard scoring systems and to resolve discrepant cases. After quality assurance and training, the agreement improved to 0.64 (substantial agreement) for ER, 0.60 (moderate agreement) for PR, and 0.75 (substantial agreement) for HER2. We found Web-based conferences and digital microscopy useful and cost-effective tools for quality assurance of IHC, consultation, and collaboration between distant laboratories. Quality improvement exercises in testing of tumor biomarkers will reduce misclassification in epidemiologic studies of breast cancer subtypes and provide much needed capacity building in resource-poor countries. © 2013.
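The κ scores reported are Cohen's kappa, which corrects observed agreement between two raters for the agreement expected by chance. A minimal sketch with invented HER2 calls, not the study data:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters scoring the same set of cases."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    # Chance agreement: product of each rater's marginal frequencies.
    expected = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1 - expected)

# Illustrative positive/negative calls from two labs, invented for this sketch.
lab1 = ["pos", "pos", "neg", "neg", "neg", "pos", "neg", "neg"]
lab2 = ["pos", "neg", "neg", "neg", "neg", "pos", "pos", "neg"]
kappa = cohens_kappa(lab1, lab2)
```

On the usual interpretation scale, values near 0.4 indicate moderate agreement and values above 0.6 substantial agreement, matching the bands quoted in the abstract.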
Nicephor[e]: a web-based solution for teaching forensic and scientific photography.
Voisard, R; Champod, C; Furrer, J; Curchod, J; Vautier, A; Massonnet, G; Buzzini, P
2007-04-11
Nicephor[e] is a project funded by "Swiss Virtual Campus" and aims at creating a distant or mixed web-based learning system in forensic and scientific photography and microscopy. The practical goal is to organize series of on-line modular courses corresponding to the educational requirements of undergraduate academic programs. Additionally, this program could be used in the context of continuing educational programs. The architecture of the project is designed to guarantee a high level of knowledge in forensic and scientific photographic techniques, and to have an easy content production and the ability to create a number of different courses sharing the same content. The e-learning system Nicephor[e] consists of three different parts. The first one is a repository of learning objects that gathers all theoretical subject matter of the project such as texts, animations, images, and films. This repository is a web content management system (Typo3) that permits creating, publishing, and administrating dynamic content via a web browser as well as storing it into a database. The flexibility of the system's architecture allows for an easy updating of the content to follow the development of photographic technology. The instructor of a course can decide which modular contents need to be included in the course, and in which order they will be accessed by students. All the modular courses are developed in a learning management system (WebCT or Moodle) that can deal with complex learning scenarios, content distribution, students, tests, and interaction with instructor. Each course has its own learning scenario based on the goals of the course and the student's profile. The content of each course is taken from the content management system. It is then structured in the learning management system according to the pedagogical goals defined by the instructor. The modular courses are created in a highly interactive setting and offer autoevaluating tests to the students. 
The last part of the system is a digital assets management system (Extensis Portfolio). The practical portion of each course is to produce images of different marks or objects. The collection of all this material, produced and indexed by the students and corrected by the instructor, is essential to the development of a knowledge base of photographic techniques applied to a specific forensic subject. It also represents an extensible collection of different marks from known sources obtained under various conditions, and allows these images to be reused to create image-based case files.
Fels, Deborah I; Richards, Jan; Hardman, Jim; Lee, Daniel G
2006-01-01
The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, accessing the Web can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The present article describes a system that allows sign-language-only Web pages to be created and linked through a video-based technique called sign-linking. In two studies, 14 Deaf participants examined two iterations of sign-linked Web pages to gauge the usability and learnability of a signing Web page interface. The first study indicated that signing Web pages were usable by sign language users but that some interface features required improvement. The second study showed increased usability for those features; users could consequently navigate sign language information with ease and pleasure.
Improving entrepreneurial opportunity recognition through web content analytics
NASA Astrophysics Data System (ADS)
Bakar, Muhamad Shahbani Abu; Azmi, Azwiyati
2017-10-01
The ability to recognize and develop an opportunity into a venture defines an entrepreneur. Research in opportunity recognition has been robust and focuses largely on explaining the processes involved in opportunity recognition. Factors such as prior knowledge and cognitive and creative capabilities are shown to affect opportunity recognition in entrepreneurs. Prior knowledge in areas such as customer problems, ways to serve the market, and technology has been shown in various studies to be a factor that helps entrepreneurs identify and recognize opportunities. Findings from research also show that experienced entrepreneurs search and scan for information to discover opportunities. Searching and scanning for information has likewise been shown to help novice entrepreneurs who lack prior knowledge to narrow this gap and better identify and recognize opportunities. There is less focus in research on finding empirically proven techniques and methods to develop and enhance opportunity recognition in student entrepreneurs. This is important as the country pushes for more graduate entrepreneurs who can drive the economy. This paper discusses the Opportunity Recognition Support System (ORSS), an information support system to help student entrepreneurs in particular to identify and recognize business opportunities. The ORSS aims to provide student entrepreneurs with the knowledge needed to better identify and recognize opportunities. Applying design research, theories of opportunity recognition are used to identify the requirements for the support system, and the requirements in turn dictate its design. The paper proposes the use of web content mining and analytics as the two core components and techniques of the support system.
Web content mining can mine the vast knowledge repositories available on the internet and analytics can provide entrepreneurs with further insights into the information needed to recognize opportunities in a given market or industry.
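A bare-bones illustration of web content mining of this kind: strip the markup from a page and surface its most frequent content terms as candidate market signals. The page text and stopword list below are invented for illustration:

```python
from html.parser import HTMLParser
from collections import Counter

class TextExtractor(HTMLParser):
    """Strip tags and collect visible words, skipping script/style content."""
    def __init__(self):
        super().__init__()
        self.words = []
        self._skip = False
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False
    def handle_data(self, data):
        if not self._skip:
            self.words += [w.lower() for w in data.split() if w.isalpha()]

STOPWORDS = {"the", "a", "an", "for", "and", "of", "in", "to", "is"}

def top_terms(html, n=3):
    """Return the n most frequent non-stopword terms on a page."""
    parser = TextExtractor()
    parser.feed(html)
    counts = Counter(w for w in parser.words if w not in STOPWORDS)
    return [term for term, _ in counts.most_common(n)]

page = ("<html><body><h1>Drone delivery services</h1>"
        "<p>Demand for drone delivery is growing; delivery startups "
        "and drone makers are hiring.</p></body></html>")
terms = top_terms(page)
```

An analytics layer would aggregate such term frequencies across many pages and over time to point a student entrepreneur toward emerging demand in a market.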
A reference web architecture and patterns for real-time visual analytics on large streaming data
NASA Astrophysics Data System (ADS)
Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer
2013-12-01
Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
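Of the analytic scopes listed, rolling-window computations illustrate the streaming constraint well: they must be updated incrementally rather than recomputed over the window for every arriving item. A minimal sketch for a rolling mean:

```python
from collections import deque

class RollingWindow:
    """Rolling mean over the last `size` items of a stream.

    A running total is maintained alongside the buffer, so each new
    item costs O(1) instead of re-summing the whole window.
    """
    def __init__(self, size):
        self.size = size
        self.buf = deque()
        self.total = 0.0
    def push(self, x):
        self.buf.append(x)
        self.total += x
        if len(self.buf) > self.size:
            self.total -= self.buf.popleft()  # evict the oldest item
        return self.total / len(self.buf)

w = RollingWindow(3)
means = [w.push(x) for x in [4, 8, 6, 2, 10]]
```

In a web architecture of the kind described, such incremental state would live server-side per analytic, with only the updated summaries pushed to the browser visualizations.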
DelGiudice, Nancy J; Street, Nancy; Torchia, Ronald J; Sawyer, Susan S; Bernard, Sylvia Allison; Holick, Michael F
2018-05-24
Vitamin D deficiency and insufficiency are a pandemic problem among children and adolescents in the United States. The problem may be aggravated by the inconsistent implementation of current clinical practice guidelines for vitamin D management by pediatric primary care providers. This study examines the relationship between primary care providers' prescribing of vitamin D to children ages 1 through 18 years and their practice actions and knowledge. A descriptive correlational design was used. Participants were recruited from a purposive sample of pediatricians and pediatric nurse practitioners through an online invitation to participate in a survey. Reliability and validity were established for the survey, which was developed by the principal investigator using a web-based Delphi technique. Results from this study indicate that although most providers are aware that vitamin D insufficiency and deficiency are problems, fewer than half currently recommend 600- to 1,000-IU supplementation to their patients ages 1 through 18 years. Copyright © 2018 National Association of Pediatric Nurse Practitioners. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kitaura, Francisco-Shu
2016-10-01
One of the main goals in cosmology is to understand how the Universe evolves, how it forms structures, why it expands, and what the nature of dark matter and dark energy is. In the next decade, large and expensive observational projects will bring information on the structure and distribution of many millions of galaxies at different redshifts, enabling us to make great progress in answering these questions. However, these data require a very special and complex set of analysis tools to extract the maximum valuable information. Statistical inference techniques are being developed that bridge the gaps between theory, simulations, and observations. In particular, we discuss efforts to address the question: what is the underlying nonlinear matter distribution and dynamics, at any cosmic time, corresponding to a set of observed galaxies in redshift space? An accurate reconstruction of the initial conditions encodes the full phase-space information at any later cosmic time (given a particular structure formation model and a set of cosmological parameters). We present advances in solving this problem in a self-consistent way with Big Data techniques of the Cosmic Web.
Allocation of DSST in the New implementation of Tastrodyweb Tools Web-site
NASA Astrophysics Data System (ADS)
San Juan, J. F.; Lara, M.; López, R.; López, L. M.; Weeden, B.; Cefola, P. J.
2012-09-01
The Draper Semianalytic Satellite Theory (DSST) is a semianalytic orbit propagator, which was carried out on Fortran to run from a command line interface. The construction of DSST began at the Computer Sciences Corporation and continued at the Draper Laboratory in the late 1970's and early 1980's. There are two versions of this application. One of them can be found as an option within the Goddard Trajectory Determination System (GTDS), whereas the other is available as a Standalone Orbit Propagator Package. Both versions are constantly evolving and updating. This constant evolution and updating allows DSST to take into account a wide variety of perturbation forces, which can be selected by means of a non-trivial options system at run time, and makes DSST a useful tool for performing short-term high accuracy orbit determination as well as long-term evolution. DSST has been included as part of an open source project for Space Situational Awareness and space object catalog work. On the last IAC 2011 a first step was taken in this sense and DSST was included on the tastrody Web-Site prototype [3, 4], which provided DSST with a friendly web interface, thus simplifying its use for both expert and non-expert users. However, this prototype has evolved into a stable platform based on the Drupal open source content management system (http://drupal.org Drupal), which simplifies the integration of our own application server. Drupal is supported by a large group of developers and users. Furthermore, a significant number of web-sites have been created using Drupal. In this work we present the integration of DSST in the new web-site, the new facilities provide by this platform to create the research community based on DSST and the comparison tests between the GTDS DSST, DSST Standalone and DSST Web version. These tests will be available in order to facilitate the user with better understanding of DSST. REFERENCES [1] J. G. Neelon, P. J. Cefola, and R. J. 
Proulx, "Current Development of the Draper Semianalytical Satellite Theory Standalone Orbit Propagator Package", AAS Preprint 97-731, presented at the AAS/AIAA Astrodynamics Conference, Sun Valley, ID, August 1997. [2] P. J. Cefola, D. Phillion, and K. S. Kim, "Improving Access to the Semi-Analytical Satellite Theory", AAS 09-341, presented at the AAS/AIAA Astrodynamics Specialist Conference, Pittsburgh, PA, August 2009. [3] P. J. Cefola, B. Weeden and C. Levit, "Open Source Software Suite for Space Situational Awareness and Space Object Catalog Work", 4th International Conference on Astrodynamics Tools and Techniques, Madrid, Spain, 3-6 May 2010. [4] J. F. San Juan, R. López and I. Pérez, "Nonlinear Dynamics Web Tools", 4th International Conference on Astrodynamics Tools and Techniques, Madrid, Spain, 3-6 May 2010. [5] J. F. San Juan, M. Lara, R. López, L. M. López, B. Weeden and P. J. Cefola, "Using the DSST Semi-Analytical Orbit Propagator Package via the NondyWebTools/AstrodyWebTools", Proceedings of the 62nd International Astronautical Congress, Cape Town, SA, 2011.
Ruch, P
2011-01-01
To summarize current advances of the so-called Web 3.0 and emerging trends of the semantic web. We provide a synopsis of the articles selected for the IMIA Yearbook 2011, from which we attempt to derive a synthetic overview of today's and future activities in the field. While the state of research in the field is illustrated by a set of fairly heterogeneous studies, it is possible to identify significant clusters. While the most salient challenge and obsessional target of the semantic web remains its ambition to simply interconnect all available information, it is interesting to observe the developments of complementary research fields such as information sciences and text analytics. The combined expressive power and virtually unlimited data aggregation capabilities of Web 3.0 technologies make it a disruptive instrument for discovering new biomedical knowledge. In parallel, this unprecedented situation creates new threats for patients participating in large-scale genetic studies, as Wjst demonstrates how various data sets can be coupled to re-identify anonymous genetic information. The best paper selection of articles on decision support shows examples of excellent research on methods concerning original development of core semantic web techniques, as well as transdisciplinary achievements exemplified by literature-based analytics. This selected set of scientific investigations also demonstrates the need for computerized applications to transform the biomedical data overflow into more operational clinical knowledge, with the potential threats to confidentiality directly associated with such advances. Altogether these papers support the idea that more elaborate computer tools, likely combining heterogeneous text and data contents, should soon emerge for the benefit of experimentalists and, hopefully, clinicians.
A cloud-based multimodality case file for mobile devices.
Balkman, Jason D; Loehfelm, Thomas W
2014-01-01
Recent improvements in Web and mobile technology, along with the widespread use of handheld devices in radiology education, provide unique opportunities for creating scalable, universally accessible, portable image-rich radiology case files. A cloud database and a Web-based application for radiologic images were developed to create a mobile case file with reasonable usability, download performance, and image quality for teaching purposes. A total of 75 radiology cases related to breast, thoracic, gastrointestinal, musculoskeletal, and neuroimaging subspecialties were included in the database. Breast imaging cases are the focus of this article, as they best demonstrate handheld display capabilities across a wide variety of modalities. This case subset also illustrates methods for adapting radiologic content to cloud platforms and mobile devices. Readers will gain practical knowledge about storage and retrieval of cloud-based imaging data, an awareness of techniques used to adapt scrollable and high-resolution imaging content for the Web, and an appreciation for optimizing images for handheld devices. The evaluation of this software demonstrates the feasibility of adapting images from most imaging modalities to mobile devices, even in cases of full-field digital mammograms, where high resolution is required to represent subtle pathologic features. The cloud platform allows cases to be added and modified in real time by using only a standard Web browser with no application-specific software. Challenges remain in developing efficient ways to generate, modify, and upload radiologic and supplementary teaching content to this cloud-based platform. Online supplemental material is available for this article. ©RSNA, 2014.
NASA Astrophysics Data System (ADS)
Babbar-Sebens, M.
2016-12-01
Social computing technologies are transforming the way our society interacts and generates content on the Web via collective intelligence. Previously unimagined possibilities have arisen for using these technologies to engage stakeholders and involve them in policy making and planning efforts. While the Internet has been used in the past to support education and communication endeavors, we have developed a novel, web-based, interactive planning tool that engages the community in using science-based methods for the design of potential conservation practices on their landscape, thereby reducing undesirable impacts of extreme hydroclimatic events. The tool, Watershed REstoration using Spatio-Temporal Optimization of Resources (WRESTORE), uses a democratic voting process coupled with visualization interfaces, computational simulation and optimization models, and user modeling techniques to support a human-centered design approach. This human-centered design approach, which is reinforced by use of Web 2.0 technologies, has the potential to enable policy makers to connect to a larger community of stakeholders and directly engage them in environmental stewardship efforts. Additionally, the design framework can be used by watershed groups to plug in their own hydrologic models, climate observations and forecasts, and various other simulation models unique to their watersheds. In this presentation, we will demonstrate the effectiveness of WRESTORE for designing alternatives of conservation practices in a HUC-11 Midwestern watershed, present results of various experiments with a diverse set of test users and stakeholders, and discuss the potential for future developments.
NASA Astrophysics Data System (ADS)
Forero-Romero, J. E.
2017-07-01
This talk summarizes different algorithms that can be used to trace the cosmic web, both in simulations and in observations. We present different applications in galaxy formation and cosmology. Finally, we show how the Dark Energy Spectroscopic Instrument (DESI) could be a good setting in which to apply these techniques.
Capturing Trust in Social Web Applications
NASA Astrophysics Data System (ADS)
O'Donovan, John
The Social Web constitutes a shift in information flow from the traditional Web. Previously, content was provided by the owners of a website, for consumption by the end-user. Nowadays, these websites are being replaced by Social Web applications, which are frameworks for the publication of user-provided content. Traditionally, Web content could be `trusted' to some extent based on the site it originated from. Algorithms such as Google's PageRank were (and still are) used to compute the importance of a website, based on analysis of underlying link topology. In the Social Web, analysis of link topology merely tells us about the importance of the information framework which hosts the content. Consumers of information still need to know about the importance and reliability of the content they are reading, and therefore about the reliability of the producers of that content. Research into trust and reputation of the producers of information in the Social Web is still very much in its infancy. Every day, people are forced to make trusting decisions about strangers on the Web based on a very limited amount of information: for example, purchasing a product from an eBay seller with a `reputation' of 99%, downloading a file from a peer-to-peer application such as BitTorrent, or allowing Amazon.com to tell you what products you will like. Even something as simple as reading comments on a Web-blog requires the consumer to make a trusting decision about the quality of that information. In all of these example cases, and indeed throughout the Social Web, there is a pressing demand for increased information upon which we can make trusting decisions. This chapter examines the diversity of sources from which trust information can be harnessed within Social Web applications and discusses a high level classification of those sources. Three different techniques for harnessing and using trust from a range of sources are presented.
These techniques are deployed in two sample Social Web applications—a recommender system and an online auction. In all cases, it is shown that harnessing an increased amount of information upon which to make trust decisions greatly enhances the user experience with the Social Web application.
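The chapter does not specify a particular aggregation formula, but the idea of combining trust evidence from a range of sources can be sketched as a weighted mean. The source names, weights, and scores below are illustrative assumptions, not taken from the text:

```python
# Sketch: aggregate trust evidence about a content producer from
# several Social Web sources into a single score in [0, 1].
# Source names and weights are hypothetical, for illustration only.

def aggregate_trust(evidence, weights):
    """evidence: {source: score in [0, 1]}; weights: {source: weight}."""
    total = sum(weights[s] for s in evidence)
    if total == 0:
        return 0.0
    return sum(evidence[s] * weights[s] for s in evidence) / total

weights = {"ratings": 0.5, "content_quality": 0.3, "social_links": 0.2}

# A seller with a 99% rating, middling content feedback, few endorsements:
evidence = {"ratings": 0.99, "content_quality": 0.6, "social_links": 0.4}
score = aggregate_trust(evidence, weights)
print(round(score, 3))  # weighted mean of the three signals
```

A real system would also need to handle missing sources and adversarial manipulation of individual signals, which is where much of the chapter's discussion lies.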
Visualization of usability and functionality of a professional website through web-mining.
Jones, Josette F; Mahoui, Malika; Gopa, Venkata Devi Pragna
2007-10-11
Functional interface design requires understanding of the information system structure and the user. Web logs record user interactions with the interface, and thus provide some insight into user search behavior and the efficiency of the search process. The present study uses a data-mining approach with techniques such as association rules, clustering and classification to visualize the usability and functionality of a digital library through in-depth analyses of web logs.
A web-based solution for 3D medical image visualization
NASA Astrophysics Data System (ADS)
Hou, Xiaoshuai; Sun, Jianyong; Zhang, Jianguo
2015-03-01
In this presentation, we describe a web-based 3D medical image visualization solution that enables interactive processing and visualization of large medical image data over the web. To improve the efficiency of our solution, we adopt GPU-accelerated techniques to process images on the server side while rapidly transferring images to HTML5-capable web browsers on the client side. Compared to traditional local visualization solutions, our solution does not require users to install extra software or download the whole volume dataset from the PACS server. With this web-based design, users can access the 3D medical image visualization service wherever the Internet is available.
Living in the electronic age: musings on nearly two decades in Cyberspace.
Gavrin, Jonathan R
2009-01-01
My journey through Cyberspace began about 20 years ago with an introduction to e-mail. A few years later, I had the good fortune of working with artificial intelligence engineers who were developing information retrieval techniques and expert systems. By serendipity this led to an early introduction to the World Wide Web (www) and the use of Web browsers as tools for gathering information, long before the Internet became commercialized. Internet content and form are now so omnipresent that they have affected our language in both amusing and utilitarian ways. We are entering an era where social networking, from personal to professional lives, is potentially a vibrant new direction for the Internet. Future articles for this feature will explore how to maximize the utility of the Internet.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.
2009-04-14
This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.
The Role of the Web Server in a Capstone Web Application Course
ERIC Educational Resources Information Center
Umapathy, Karthikeyan; Wallace, F. Layne
2010-01-01
Web applications have become commonplace in the Information Systems curriculum. Much of the discussion about Web development for capstone courses has centered on the scripting tools. Very little has been discussed about different ways to incorporate the Web server into Web application development courses. In this paper, three different ways of…
XML Based Markup Languages for Specific Domains
NASA Astrophysics Data System (ADS)
Varde, Aparna; Rundensteiner, Elke; Fahrenholz, Sally
A challenging area in web based support systems is the study of human activities in connection with the web, especially with reference to certain domains. This includes capturing human reasoning in information retrieval, facilitating the exchange of domain-specific knowledge through a common platform and developing tools for the analysis of data on the web from a domain expert's angle. Among the techniques and standards related to such work, we have XML, the eXtensible Markup Language. This serves as a medium of communication for storing and publishing textual, numeric and other forms of data seamlessly. XML tag sets are such that they preserve semantics and simplify the understanding of stored information by users. Often domain-specific markup languages are designed using XML, with a user-centric perspective. Standardization bodies and research communities may extend these to include additional semantics of areas within and related to the domain. This chapter outlines the issues to be considered in developing domain-specific markup languages: the motivation for development, the semantic considerations, the syntactic constraints and other relevant aspects, especially taking into account human factors. Illustrating examples are provided from domains such as Medicine, Finance and Materials Science. Particular emphasis in these examples is on the Materials Markup Language MatML and the semantics of one of its areas, namely, the Heat Treating of Materials. The focus of this chapter, however, is not the design of one particular language but rather the generic issues concerning the development of domain-specific markup languages.
77 FR 47867 - Agency Information Collection Activities: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-10
... phenology information to Nature's Notebook through a browser-based web application or via mobile applications for iPhone and Android operating systems, meeting GPEA requirements. The web application interface... techniques or other forms of information technology. Please note that the comments submitted in response to...
NATURAL AND ANTHROPOGENIC FACTORS INFLUENCING FOOD WEB STRUCTURE IN GREAT LAKES COASTAL WETLANDS
We are investigating factors governing the biological organization of Great Lakes coastal wetlands. Food web analyses using stable isotope techniques verify the role of algae as an energetic foundation, and also suggest that fundamental changes occur as a result of anthropogenic ...
Using sentiment analysis to review patient satisfaction data located on the internet.
Hopper, Anthony M; Uriyo, Maria
2015-01-01
The purpose of this paper is to test the usefulness of sentiment analysis and time-to-next-complaint methods in quantifying text-based information located on the internet. As important, the authors demonstrate how managers can use time-to-next-complaint techniques to organize sentiment analysis derived data into useful information, which can be shared with doctors and other staff. The authors used sentiment analysis to review patient feedback for a select group of gynecologists in Virginia. The authors utilized time-to-next-complaint methods along with other techniques to organize this data into meaningful information. The authors demonstrated that sentiment analysis and time-to-next-complaint techniques might be useful tools for healthcare managers who are interested in transforming web-based text into meaningful, quantifiable information. This study has several limitations. For one thing, neither the data set nor the techniques the authors used to analyze it will account for biases that resulted from selection issues related to gender, income, and culture, as well as from other socio-demographic concerns. Additionally, the authors lacked key data concerning patient volumes for the targeted physicians. Finally, it may be difficult to convince doctors to consider web-based comments as truthful, thereby preventing healthcare managers from using data located on the internet. The report illustrates some of the ways in which healthcare administrators can utilize sentiment analysis, along with time-to-next-complaint techniques, to mine web-based, patient comments for meaningful information. The paper is one of the first to illustrate ways in which administrators at clinics and physicians' offices can utilize sentiment analysis and time-to-next-complaint methods to analyze web-based patient comments.
Performance Analysis of Web-Based PPP Services with Different Visibility Conditions
NASA Astrophysics Data System (ADS)
Albayrak, M.; Erkaya, H.; Ozludemir, M. T.; Ocalan, T.
2016-12-01
GNSS is used effectively for precise positioning in many surveying and geodetic applications today. These systems, including their post-processing tools, are growing in number, quality and features, and many different positioning techniques have been developed. Deriving precise positions traditionally requires user experience and scientific or commercial software with costly license fees. In recent years, however, user-friendly, free web-based online precise point positioning (PPP) services have become important alternatives to such software and are widely used in geodetic applications. The aim of this study is to test the performance of PPP techniques on ground control points with different visibility conditions. Within this framework, static observations were carried out for three hours a day over six days at three different ground control points on the YTU Davutpasa Campus. The station locations were selected by taking into account the impact of natural (trees, etc.) and artificial (buildings, etc.) obstacles. To provide a benchmark for the PPP results, the accurate coordinates of the control points were first computed with the relative positioning technique in connection with IGS stations using the Bernese v5.0 software. Afterwards, three different web-based positioning services (CSRS-PPP, magicGNSS, GAPS) were used to analyze the GPS observations via the PPP technique. For the comparisons, ITRF2008 measurement-epoch coordinates were used, taking the services' result conventions into consideration. For the first station, located near a building and possibly subject to multipath effects, horizontal discrepancies vary between 2-14.5 cm and vertical differences between 3.5-16 cm.
For the second point, located partly in a forested area, the discrepancies were 1.5-8 cm horizontal and 2-10 cm vertical. For the third point, located in an area with no obstacles, differences of 1.5-7 cm horizontal and 1-7 cm vertical were obtained. The results show that the PPP technique can be used effectively in several positioning applications.
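The horizontal and vertical discrepancies reported above are straightforward to compute once both solutions are expressed in a common local frame. A minimal sketch, assuming coordinates already converted to local east/north/up components in metres (the values are invented, not the study's data):

```python
import math

# Sketch: horizontal and vertical discrepancy between a reference
# solution and a web-PPP solution, in local east/north/up components.
# The coordinates are hypothetical examples, not the study's results.

def discrepancies(reference, solution):
    de = solution[0] - reference[0]   # east difference (m)
    dn = solution[1] - reference[1]   # north difference (m)
    du = solution[2] - reference[2]   # up difference (m)
    horizontal = math.hypot(de, dn)   # 2D horizontal discrepancy
    vertical = abs(du)
    return horizontal, vertical

ref = (1000.000, 2000.000, 50.000)   # Bernese relative solution
ppp = (1000.030, 2000.040, 50.080)   # a hypothetical web-PPP result
h, v = discrepancies(ref, ppp)
print(round(h, 3), round(v, 3))
```

With these example numbers the horizontal discrepancy is 5 cm and the vertical 8 cm, i.e. within the ranges the study reports for partially obstructed sites.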
ERIC Educational Resources Information Center
Chien, Hui-Min; Kao, Chia-Pin; Yeh, I-Jan; Lin, Kuen-Yi
2012-01-01
This study was conducted to investigate elementary school teachers' attitudes and motivation toward web-based professional development. The relationship between teachers' attitudes and motivation was explored using the AWPD (Attitudes toward Web-based Professional Development) and MWPD (Motivation toward Web-based Professional Development)…
PREDOSE: A Semantic Web Platform for Drug Abuse Epidemiology using Social Media
Cameron, Delroy; Smith, Gary A.; Daniulaityte, Raminta; Sheth, Amit P.; Dave, Drashti; Chen, Lu; Anand, Gaurish; Carlson, Robert; Watkins, Kera Z.; Falck, Russel
2013-01-01
Objectives The role of social media in biomedical knowledge mining, including clinical, medical and healthcare informatics, prescription drug abuse epidemiology and drug pharmacology, has become increasingly significant in recent years. Social media offers opportunities for people to share opinions and experiences freely in online communities, which may contribute information beyond the knowledge of domain professionals. This paper describes the development of a novel Semantic Web platform called PREDOSE (PREscription Drug abuse Online Surveillance and Epidemiology), which is designed to facilitate the epidemiologic study of prescription (and related) drug abuse practices using social media. PREDOSE uses web forum posts and domain knowledge, modeled in a manually created Drug Abuse Ontology (DAO) (pronounced dow), to facilitate the extraction of semantic information from User Generated Content (UGC). A combination of lexical, pattern-based and semantics-based techniques is used together with the domain knowledge to extract fine-grained semantic information from UGC. In a previous study, PREDOSE was used to obtain the datasets from which new knowledge in drug abuse research was derived. Here, we report on various platform enhancements, including an updated DAO, new components for relationship and triple extraction, and tools for content analysis, trend detection and emerging patterns exploration, which enhance the capabilities of the PREDOSE platform. Given these enhancements, PREDOSE is now more equipped to impact drug abuse research by alleviating traditional labor-intensive content analysis tasks. Methods Using custom web crawlers that scrape UGC from publicly available web forums, PREDOSE first automates the collection of web-based social media content for subsequent semantic annotation. 
The annotation scheme is modeled in the DAO, and includes domain specific knowledge such as prescription (and related) drugs, methods of preparation, side effects, routes of administration, etc. The DAO is also used to help recognize three types of data, namely: 1) entities, 2) relationships and 3) triples. PREDOSE then uses a combination of lexical and semantic-based techniques to extract entities and relationships from the scraped content, and a top-down approach for triple extraction that uses patterns expressed in the DAO. In addition, PREDOSE uses publicly available lexicons to identify initial sentiment expressions in text, and then a probabilistic optimization algorithm (from related research) to extract the final sentiment expressions. Together, these techniques enable the capture of fine-grained semantic information from UGC, and querying, search, trend analysis and overall content analysis of social media related to prescription drug abuse. Moreover, extracted data are also made available to domain experts for the creation of training and test sets for use in evaluation and refinements in information extraction techniques. Results A recent evaluation of the information extraction techniques applied in the PREDOSE platform indicates 85% precision and 72% recall in entity identification, on a manually created gold standard dataset. In another study, PREDOSE achieved 36% precision in relationship identification and 33% precision in triple extraction, through manual evaluation by domain experts. Given the complexity of the relationship and triple extraction tasks and the abstruse nature of social media texts, we interpret these as favorable initial results. Extracted semantic information is currently in use in an online discovery support system, by prescription drug abuse researchers at the Center for Interventions, Treatment and Addictions Research (CITAR) at Wright State University. 
Conclusion A comprehensive platform for entity, relationship, triple and sentiment extraction from such abstruse texts has never been developed for drug abuse research. PREDOSE has already demonstrated the importance of mining social media by providing data from which new findings in drug abuse research were uncovered. Given the recent platform enhancements, including the refined DAO, components for relationship and triple extraction, and tools for content, trend and emerging pattern analysis, it is expected that PREDOSE will play a significant role in advancing drug abuse epidemiology in future. PMID:23892295
PREDOSE: a semantic web platform for drug abuse epidemiology using social media.
Cameron, Delroy; Smith, Gary A; Daniulaityte, Raminta; Sheth, Amit P; Dave, Drashti; Chen, Lu; Anand, Gaurish; Carlson, Robert; Watkins, Kera Z; Falck, Russel
2013-12-01
The role of social media in biomedical knowledge mining, including clinical, medical and healthcare informatics, prescription drug abuse epidemiology and drug pharmacology, has become increasingly significant in recent years. Social media offers opportunities for people to share opinions and experiences freely in online communities, which may contribute information beyond the knowledge of domain professionals. This paper describes the development of a novel semantic web platform called PREDOSE (PREscription Drug abuse Online Surveillance and Epidemiology), which is designed to facilitate the epidemiologic study of prescription (and related) drug abuse practices using social media. PREDOSE uses web forum posts and domain knowledge, modeled in a manually created Drug Abuse Ontology (DAO--pronounced dow), to facilitate the extraction of semantic information from User Generated Content (UGC), through combination of lexical, pattern-based and semantics-based techniques. In a previous study, PREDOSE was used to obtain the datasets from which new knowledge in drug abuse research was derived. Here, we report on various platform enhancements, including an updated DAO, new components for relationship and triple extraction, and tools for content analysis, trend detection and emerging patterns exploration, which enhance the capabilities of the PREDOSE platform. Given these enhancements, PREDOSE is now more equipped to impact drug abuse research by alleviating traditional labor-intensive content analysis tasks. Using custom web crawlers that scrape UGC from publicly available web forums, PREDOSE first automates the collection of web-based social media content for subsequent semantic annotation. The annotation scheme is modeled in the DAO, and includes domain specific knowledge such as prescription (and related) drugs, methods of preparation, side effects, and routes of administration. 
The DAO is also used to help recognize three types of data, namely: (1) entities, (2) relationships and (3) triples. PREDOSE then uses a combination of lexical and semantic-based techniques to extract entities and relationships from the scraped content, and a top-down approach for triple extraction that uses patterns expressed in the DAO. In addition, PREDOSE uses publicly available lexicons to identify initial sentiment expressions in text, and then a probabilistic optimization algorithm (from related research) to extract the final sentiment expressions. Together, these techniques enable the capture of fine-grained semantic information, which facilitate search, trend analysis and overall content analysis using social media on prescription drug abuse. Moreover, extracted data are also made available to domain experts for the creation of training and test sets for use in evaluation and refinements in information extraction techniques. A recent evaluation of the information extraction techniques applied in the PREDOSE platform indicates 85% precision and 72% recall in entity identification, on a manually created gold standard dataset. In another study, PREDOSE achieved 36% precision in relationship identification and 33% precision in triple extraction, through manual evaluation by domain experts. Given the complexity of the relationship and triple extraction tasks and the abstruse nature of social media texts, we interpret these as favorable initial results. Extracted semantic information is currently in use in an online discovery support system, by prescription drug abuse researchers at the Center for Interventions, Treatment and Addictions Research (CITAR) at Wright State University. A comprehensive platform for entity, relationship, triple and sentiment extraction from such abstruse texts has never been developed for drug abuse research. 
PREDOSE has already demonstrated the importance of mining social media by providing data from which new findings in drug abuse research were uncovered. Given the recent platform enhancements, including the refined DAO, components for relationship and triple extraction, and tools for content, trend and emerging pattern analysis, it is expected that PREDOSE will play a significant role in advancing drug abuse epidemiology in future. Copyright © 2013 Elsevier Inc. All rights reserved.
WebEAV: automatic metadata-driven generation of web interfaces to entity-attribute-value databases.
Nadkarni, P M; Brandt, C M; Marenco, L
2000-01-01
The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples.
Ribbon Growth of Single Crystal GaAs for Solar Cell Application.
1981-11-01
Growth techniques, dendrite seeds, and melt chemistry were optimized during the course of the program. [Remainder of the abstract is garbled in this record; recoverable figure captions include: Faceted Web; Crystal Grown From a Melt Doped With 1.0 Atomic% Ge; The Ge-Doped Crystals Grew at Low Undercooling and Contained Flatter Textured-Web Sections With Ge Melt Doping; The Textured-Web Sections Were the Widest Achieved at Small Undercooling, <1.0°C; Radiation Exchange Between the Melt Surface.]
Practical guidelines for development of web-based interventions.
Chee, Wonshik; Lee, Yaelim; Chee, Eunice; Im, Eun-Ok
2014-10-01
Despite a recent high funding priority on technological aspects of research and a high potential impact of Web-based interventions on health, few guidelines for the development of Web-based interventions are currently available. In this article, we propose practical guidelines for development of Web-based interventions based on an empirical study and an integrative literature review. The empirical study aimed at development of a Web-based physical activity promotion program that was specifically tailored to Korean American midlife women. The literature review included a total of 202 articles that were retrieved through multiple databases. On the basis of the findings of the study and the literature review, we propose directions for development of Web-based interventions in the following steps: (1) meaningfulness and effectiveness, (2) target population, (3) theoretical basis/program theory, (4) focus and objectives, (5) components, (6) technological aspects, and (7) logistics for users. The guidelines could help promote further development of Web-based interventions at this early stage of Web-based interventions in nursing.
Oh! Web 2.0, Virtual Reference Service 2.0, Tools & Techniques (II)
ERIC Educational Resources Information Center
Arya, Harsh Bardhan; Mishra, J. K.
2012-01-01
The paper describes the theory and definition of the practice of librarianship, specifically addressing how Web 2.0 technologies (tools) such as synchronous messaging, collaborative reference service and streaming media, blogs, wikis, social networks, social bookmarking tools, tagging, RSS feeds, and mashups might intimate changes and how…
Interactive Information Organization: Techniques and Evaluation
2001-05-01
information search and access. Locating interesting information on the World Wide Web is the main task of on-line search engines. Such engines accept a… likelihood of being relevant to the user's request. The majority of today's Web search engines follow this scenario. The ordering of documents in the…
ERIC Educational Resources Information Center
Okazaki, Shintaro; Alonso Rivas, Javier
2002-01-01
Discussion of research methodology for evaluating the degree of standardization in multinational corporations' online communication strategies across differing cultures focuses on a research framework for cross-cultural comparison of corporate Web pages, applying traditional advertising content study techniques. Describes pre-tests that examined…
NASA Technical Reports Server (NTRS)
Duncan, S.
1984-01-01
Technological goals for a silicon dendritic web growth program effort are presented. Principal objectives for this program include: (1) grow long web crystals from a continuously replenished melt; (2) develop temperature distribution in web and melt; (3) improve reproducibility of growth; (4) develop configurations for increased growth rates (width and speed); (5) develop new growth system components as required for improved growth; and (6) evaluate quality of web growth.
Development of a functional, internet-accessible department of surgery outcomes database.
Newcomb, William L; Lincourt, Amy E; Gersin, Keith; Kercher, Kent; Iannitti, David; Kuwada, Tim; Lyons, Cynthia; Sing, Ronald F; Hadzikadic, Mirsad; Heniford, B Todd; Rucho, Susan
2008-06-01
The need for surgical outcomes data is increasing due to pressure from insurance companies, patients, and the need for surgeons to keep their own "report card". Current data management systems are limited by an inability to stratify outcomes based on patients, surgeons, and differences in surgical technique. Surgeons, along with research and informatics personnel from an academic, hospital-based Department of Surgery and a state university's Department of Information Technology, formed a partnership to develop a dynamic, internet-based clinical data warehouse. A five-component model was used: data dictionary development, web application creation, participating center education and management, statistics applications, and data interpretation. A data dictionary was developed from a list of data elements to address the needs of research, quality assurance, industry, and centers of excellence. A user-friendly web interface was developed with menu-driven check boxes, multiple electronic data entry points, direct downloads from hospital billing information, and web-based patient portals. Data were collected on a Health Insurance Portability and Accountability Act-compliant server with a secure firewall. Protected health information was de-identified. Data management strategies included automated auditing, on-site training, a trouble-shooting hotline, and Institutional Review Board oversight. Real-time, daily, monthly, and quarterly data reports were generated. Fifty-eight publications and 109 abstracts have been generated from the database during its development and implementation. Seven national academic departments now use the database to track patient outcomes. The development of a robust surgical outcomes database requires a combination of clinical, informatics, and research expertise.
Benefits of surgeon involvement in outcomes research include: tracking individual performance, patient safety, surgical research, legal defense, and the ability to provide accurate information to patient and payers.
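A data dictionary of the kind described above lends itself to machine validation of submitted records before they enter the warehouse. A minimal Python sketch; the element names, types, and allowed values below are invented for illustration, since the actual dictionary is not public:

```python
# Hypothetical data dictionary: field -> expected type and permitted values.
# None for "allowed" means any value of the right type is accepted.
DATA_DICTIONARY = {
    "procedure":   {"type": str,  "allowed": {"hernia repair", "cholecystectomy"}},
    "asa_class":   {"type": int,  "allowed": {1, 2, 3, 4, 5}},
    "readmission": {"type": bool, "allowed": None},
}

def validate_record(record):
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    for field, rule in DATA_DICTIONARY.items():
        if field not in record:
            errors.append(f"missing field: {field}")
            continue
        value = record[field]
        if not isinstance(value, rule["type"]):
            errors.append(f"{field}: expected {rule['type'].__name__}")
        elif rule["allowed"] is not None and value not in rule["allowed"]:
            errors.append(f"{field}: value {value!r} not permitted")
    return errors
```

Centralizing the rules this way lets the same dictionary drive menu-driven check boxes in the web interface and automated auditing on the server.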
Use of WebQuest Design for Inservice Teacher Professional Development
ERIC Educational Resources Information Center
Iskeceli-Tunc, Sinem; Oner, Diler
2016-01-01
This study investigated whether a teacher professional development module built around designing WebQuests could improve teachers' technological and pedagogical skills. The technological skills examined included Web searching and Web evaluating skills. The pedagogical skills targeted were developing a working definition for higher-order thinking…
Teaching Web Security Using Portable Virtual Labs
ERIC Educational Resources Information Center
Chen, Li-Chiou; Tao, Lixin
2012-01-01
We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…
Diwakar, Shyam; Parasuram, Harilal; Medini, Chaitanya; Raman, Raghu; Nedungadi, Prema; Wiertelak, Eric; Srivastava, Sanjeeva; Achuthan, Krishnashree; Nair, Bipin
2014-01-01
Classroom-level neuroscience experiments vary from detailed protocols involving chemical, physiological and imaging techniques to computer-based modeling. The application of Information and Communication Technology (ICT) is revolutionizing the current laboratory scenario in terms of active learning especially for distance education cases. Virtual web-based labs are an asset to educational institutions confronting economic issues in maintaining equipment, facilities and other conditions needed for good laboratory practice. To enhance education, we developed virtual laboratories in neuroscience and explored their first-level use in (Indian) University education in the context of developing countries. Besides using interactive animations and remotely-triggered experimental devices, a detailed mathematical simulator was implemented on a web-based software platform. In this study, we focused on the perceptions of technology adoption for a virtual neurophysiology laboratory as a new pedagogy tool for complementing college laboratory experience. The study analyses the effect of virtual labs on users assessing the relationship between cognitive, social and teaching presence. Combining feedback from learners and teachers, the study suggests enhanced motivation for students and improved teaching experience for instructors. PMID:24693260
Pham, D; Hardcastle, N; Foroudi, F; Kron, T; Bressel, M; Hilder, B; Chesson, B; Oates, R; Montgomery, R; Ball, D; Siva, S
2016-09-01
In technically advanced multicentre clinical trials, participating centres can benefit from a credentialing programme before participating in the trial. Education of staff in participating centres is an important aspect of a successful clinical trial. In the multicentre study of fractionated versus single fraction stereotactic ablative body radiotherapy in lung oligometastases (TROG 13.01), knowledge transfer of stereotactic ablative body radiotherapy techniques to the local multidisciplinary team is intended as part of the credentialing process. In this study, a web-based learning platform was developed to provide education and training for the multidisciplinary trial teams at geographically distinct sites. A web-based platform using eLearning software consisting of seven training modules was developed. These modules were based on extracranial stereotactic theory covering the following discrete modules: Clinical background; Planning technique and evaluation; Planning optimisation; Four-dimensional computed tomography simulation; Patient-specific quality assurance; Cone beam computed tomography and image guidance; Contouring organs at risk. Radiation oncologists, medical physicists and radiation therapists from hospitals in Australia and New Zealand were invited to participate in this study. Each discipline was enrolled into a subset of modules (core modules) and was evaluated before and after completing each module. The effectiveness of the eLearning training will be evaluated based on (i) knowledge retention after participation in the web-based training and (ii) confidence evaluation after participation in the training. Evaluation consisted of a knowledge test and confidence evaluation using a Likert scale. In total, 130 participants were enrolled into the eLearning programme: 81 radiation therapists (62.3%), 27 medical physicists (20.8%) and 22 radiation oncologists (16.9%). 
There was an average absolute improvement of 14% in test scores (P < 0.001) after learning. This improvement over initial testing was also observed in long-term testing (>4 weeks after completing the modules) (P < 0.001). For most participants there was a significant increase in confidence (P < 0.001) after completing all the modules. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
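A reported average score improvement with a significance level implies a paired pre/post comparison for each participant. A minimal sketch of such a comparison in Python, using invented scores rather than the trial's data:

```python
import math

def paired_t(pre, post):
    """Mean gain and paired t-test statistic for post-vs-pre score differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean, mean / math.sqrt(var / n)              # t = mean / standard error

pre  = [60, 55, 70, 65, 58, 62]   # invented pre-module test scores (%)
post = [75, 70, 80, 78, 72, 77]   # invented post-module test scores (%)
mean_gain, t_stat = paired_t(pre, post)
```

The t statistic would then be compared against the t distribution with n − 1 degrees of freedom to obtain the P value.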
ARDesigner: a web-based system for allosteric RNA design.
Shu, Wenjie; Liu, Ming; Chen, Hebing; Bo, Xiaochen; Wang, Shengqi
2010-12-01
RNA molecules play vital informational, structural, and functional roles in molecular biology, making them ideal targets for synthetic biology. However, several challenges remain for engineering novel allosteric RNA molecules, and the development of efficient computational design techniques is vitally needed. Here we describe the development of Allosteric RNA Designer (ARDesigner), a user-friendly and freely available web-based system for allosteric RNA design that incorporates mutational robustness in the design process. The system output includes detailed design information in a graphical HTML format. We used ARDesigner to engineer a temperature-sensitive allosteric RNA and found that the resulting design satisfied the prescribed properties. ARDesigner provides a simple means for researchers to design allosteric RNAs with specific properties. With its versatile framework and possibilities for further enhancement, ARDesigner may serve as a useful tool for synthetic biology and therapeutic design. ARDesigner and its executable version are freely available at http://biotech.bmi.ac.cn/ARDesigner. Crown Copyright © 2010. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Barkanov, E.; Eglītis, E.; Almeida, F.; Bowering, M. C.; Watson, G.
2013-07-01
The present investigation is devoted to the development of new optimal design concepts that exploit the full potential of advanced composite materials in the upper covers of aircraft lateral wings. A finite-element simulation of three-rib-bay laminated composite panels with T-stiffeners and a stiffener pitch of 200 mm is carried out using ANSYS to investigate the effect of rib attachment to stiffener webs on the performance of stiffened panels in terms of their buckling behavior and in relation to skin and stiffener lay-ups, stiffener height, and root width. Due to the large dimensions of the numerical problems to be solved, an optimization methodology is developed employing the method of experimental design and the response surface technique. Minimal-weight optimization problems were solved for four load levels, taking manufacturing, repairability, and damage-tolerance requirements into account. The optimal results were verified successfully using the ANSYS and ABAQUS shared-node models.
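The response surface technique replaces expensive finite-element runs with a cheap surrogate model fitted to a few design points. A toy one-variable Python sketch with invented sample values (the actual study optimizes several variables simultaneously):

```python
def fit_quadratic(p0, p1, p2):
    """Fit y = a + b*x + c*x**2 exactly through three design points,
    a toy stand-in for a response-surface surrogate built from a
    design of experiments."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    # Newton divided differences, then expanded to the coefficients a, b, c
    d1 = (y1 - y0) / (x1 - x0)
    d2 = ((y2 - y1) / (x2 - x1) - d1) / (x2 - x0)
    c = d2
    b = d1 - d2 * (x0 + x1)
    a = y0 - b * x0 - c * x0 * x0
    return a, b, c

def surrogate_minimum(a, b, c):
    """Stationary point of the fitted surrogate (a minimum when c > 0)."""
    return -b / (2 * c)

# Invented example: panel mass (kg) sampled at three stiffener heights (mm)
a, b, c = fit_quadratic((30, 12.5), (40, 11.2), (50, 11.9))
```

The optimizer then searches the cheap surrogate instead of re-running the finite-element model at every candidate design.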
Development and Evaluation of an Interactive WebQuest Environment: "Web Macerasi"
ERIC Educational Resources Information Center
Gulbahar, Yasemin; Madran, R. Orcun; Kalelioglu, Filiz
2010-01-01
This study was conducted to develop a web-based interactive system, Web Macerasi, for teaching-learning and evaluation purposes, and to find out the possible effects of this system. The study has two stages. In the first stage, a WebQuest site was designed as an interactive system in which various Internet and web technologies were used for…
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-08-21
NREL's Developer Network, developer.nrel.gov, provides data that users can access for their own analyses and mobile and web applications. Developers retrieve the data through a Web services API (application programming interface). The Developer Network handles the overhead of serving web services, such as key management, authentication, analytics, reporting, documentation standards, and throttling, in a common architecture, while allowing web services and APIs to be maintained and managed independently.
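A client of such a keyed web-services API typically appends its key to every request and backs off when throttled. A minimal Python sketch; the endpoint and parameter names below are illustrative placeholders, not actual developer.nrel.gov routes:

```python
from urllib.parse import urlencode

def build_request_url(base, endpoint, api_key, **params):
    """Compose a keyed request URL in the style of a web-services API.
    The endpoint and parameters are invented for illustration."""
    query = urlencode({"api_key": api_key, **params})
    return f"{base}/{endpoint}?{query}"

def backoff_schedule(retries, base_delay=1.0, cap=30.0):
    """Exponential backoff delays for handling throttled (rate-limited) replies."""
    return [min(cap, base_delay * 2 ** i) for i in range(retries)]

url = build_request_url("https://developer.nrel.gov", "api/example/v1/data.json",
                        "DEMO_KEY", lat=39.7, lon=-105.2)
```

Key management and throttling live on the server side in a common architecture; the client only needs to present its key and respect the backoff.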
Brasil, Lourdes M; Gomes, Marília M F; Miosso, Cristiano J; da Silva, Marlete M; Amvame-Nze, Georges D
2015-07-16
Dengue fever is endemic in Asia, the Americas, the East of the Mediterranean and the Western Pacific. According to the World Health Organization, it is one of the diseases of greatest impact on health, affecting millions of people each year worldwide. Fast detection of increases in populations of the transmitting vector, the Aedes aegypti mosquito, is essential to avoid dengue outbreaks. Unfortunately, in several countries, such as Brazil, the current methods for detecting population changes and disseminating this information are too slow to allow efficient allocation of resources to fight outbreaks. To reduce the delay in providing information on A. aegypti population changes, we propose, develop, and evaluate a system for counting the eggs found in special traps and for providing the collected data through a web structure with geographical location resources. One of the most useful tools for the detection and surveillance of arthropods is the ovitrap, a special trap built to collect mosquito eggs. This allows for an egg counting process, which is still usually performed manually in countries such as Brazil. We implement and evaluate a novel system for automatically counting the eggs found in the ovitraps' cardboards. The system we propose is based on digital image processing (DIP) techniques, as well as a Web-based Semi-Automatic Counting System (SCSA-WEB). All data collected are geographically referenced in a geographic information system (GIS) and made available on a Web platform. The work was developed in Gama's administrative region, in Brasília/Brazil, with the aid of the Environmental Surveillance Directory (DIVAL-Gama) and Brasília's Board of Health (SSDF), in partnership with the University of Brasília (UnB). The system was built based on a field survey carried out over three months by health professionals, who provided 84 cardboards from 84 ovitraps, each sized 15 × 5 cm.
In developing the system, we conducted the following steps: i. Obtain images of the eggs on an ovitrap's cardboards with a microscope. ii. Apply the proposed image-processing-based semi-automatic counting system. The system we developed uses the Java programming language and the Java Server Faces technology, a framework suite for web application development. This approach allows a simple migration to any operating system platform and future applications on mobile devices. iii. Collect and store all data in a database (DB) and then georeference them in a GIS. The database management system used to develop the DB is based on PostgreSQL. The GIS assists in the visualization and spatial analysis of digital maps, allowing the location of dengue outbreaks in the region of study. This also facilitates the planning, analysis, and evaluation of temporal and spatial epidemiology, as required by the Brazilian Health Care Control Center. iv. Deploy the SCSA-WEB, DB and GIS on a single Web platform. The statistical results obtained by DIP were satisfactory when compared with SCSA-WEB's semi-automated egg counts. The results also indicate that the time spent in manual counting is considerably reduced when using our fully automated DIP algorithm and semi-automated SCSA-WEB. The georeferencing Web platform developed proves to be of great support for future visualization with statistical and trace analysis of the disease. The analyses suggest the efficiency of our algorithm for automatic egg counting in terms of expediting the work of the laboratory technician, considerably reducing counting time and error rates. We believe that this kind of integrated platform and tools can simplify the decision-making process of the Brazilian Health Care Control Center.
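The core DIP step of such a counter can be reduced to thresholding followed by connected-component labelling. A toy Python sketch on a tiny grayscale grid (the actual system is implemented in Java and works on microscope images of cardboards, so this is only an illustration of the principle):

```python
def count_eggs(image, threshold=128):
    """Count dark blobs (candidate eggs) in a grayscale image given as a
    list of rows, using thresholding plus 4-connected component labelling."""
    rows, cols = len(image), len(image[0])
    mask = [[pixel < threshold for pixel in row] for row in image]  # eggs are dark
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]            # flood-fill one component
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return count

# Tiny synthetic image: two dark blobs on a bright background
sample = [
    [255, 255, 255, 255, 255],
    [255,  10,  10, 255, 255],
    [255, 255, 255, 255,  20],
    [255, 255, 255, 255,  20],
]
```

A production pipeline would add noise filtering and size gating so that dust and scratches are not counted as eggs.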
Improving life sciences information retrieval using semantic web technology.
Quan, Dennis
2007-05-01
The ability to retrieve relevant information is at the heart of every aspect of research and development in the life sciences industry. Information is often distributed across multiple systems and recorded in a way that makes it difficult to piece together the complete picture. Differences in data formats, naming schemes and network protocols amongst information sources, both public and private, must be overcome, and user interfaces not only need to be able to tap into these diverse information sources but must also assist users in filtering out extraneous information and highlighting the key relationships hidden within an aggregated set of information. The Semantic Web community has made great strides in proposing solutions to these problems, and many efforts are underway to apply Semantic Web techniques to the problem of information retrieval in the life sciences space. This article gives an overview of the principles underlying a Semantic Web-enabled information retrieval system: creating a unified abstraction for knowledge using the RDF semantic network model; designing semantic lenses that extract contextually relevant subsets of information; and assembling semantic lenses into powerful information displays. Furthermore, concrete examples of how these principles can be applied to life science problems including a scenario involving a drug discovery dashboard prototype called BioDash are provided.
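The "semantic lens" idea can be illustrated over a toy RDF-style triple set: a lens keeps only the triples about a focus resource whose predicates are relevant to the current context. A minimal Python sketch; the resources and predicates below are invented, not taken from BioDash:

```python
# Toy semantic network as a set of (subject, predicate, object) triples.
TRIPLES = {
    ("aspirin",   "inhibits",   "COX-1"),
    ("aspirin",   "treats",     "inflammation"),
    ("COX-1",     "located_in", "platelet"),
    ("ibuprofen", "inhibits",   "COX-2"),
}

def lens(triples, focus, predicates):
    """Extract the contextually relevant subset of a semantic network:
    triples about `focus` whose predicate is in the lens's predicate set."""
    return {(s, p, o) for (s, p, o) in triples
            if s == focus and p in predicates}

# A "drug target" lens that hides everything except inhibition relationships
drug_target_lens = lens(TRIPLES, "aspirin", {"inhibits"})
```

Assembling several such lenses side by side is what turns a unified RDF abstraction into a focused information display.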
Providing Multi-Page Data Extraction Services with XWRAPComposer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Ling; Zhang, Jianjun; Han, Wei
2008-04-30
Dynamic Web data sources – sometimes known collectively as the Deep Web – increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed that of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DYNABOT, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DYNABOT has three unique characteristics. First, DYNABOT utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DYNABOT employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DYNABOT incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.
Nadkarni, Prakash M.; Brandt, Cynthia M.; Marenco, Luis
2000-01-01
The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples. PMID:10887163
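The EAV pattern stores one row per entity-attribute-value triple, so a front end like WebEAV must pivot rows back into per-entity records for display. A minimal Python sketch of that reshaping; the attribute names are invented for illustration:

```python
# Hypothetical EAV rows: (entity id, attribute, value). In a real EAV
# schema these would come from a single long, narrow database table.
EAV_ROWS = [
    (1, "diagnosis", "asthma"),
    (1, "age", 34),
    (2, "diagnosis", "migraine"),
    (2, "age", 41),
]

def pivot_eav(rows):
    """Group attribute-value pairs by entity id, yielding one record
    (a dict of attributes) per entity for form or report display."""
    records = {}
    for entity, attribute, value in rows:
        records.setdefault(entity, {})[attribute] = value
    return records
```

The advantage of EAV storage is that new attributes need no schema change; the cost is exactly this pivoting work, which a generic framework can centralize.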
Enhanced interfaces for web-based enterprise-wide image distribution.
Jost, R Gilbert; Blaine, G James; Fritz, Kevin; Blume, Hartwig; Sadhra, Sarbjit
2002-01-01
Modern Web browsers support image distribution with two shortcomings: (1) image grayscale presentation at client workstations is often sub-optimal and generally inconsistent with the presentation state on diagnostic workstations and (2) an Electronic Patient Record (EPR) application usually cannot directly access images with an integrated viewer. We have modified our EPR and our Web-based image-distribution system to allow access to images from within the EPR. In addition, at the client workstation, a grayscale transformation is performed that consists of two components: a client-display-specific component based on the characteristic display function of the class of display system, and a modality-specific transformation that is downloaded with every image. The described techniques have been implemented in our institution and currently support enterprise-wide clinical image distribution. The effectiveness of the techniques is reviewed.
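The two-component grayscale transformation described above can be sketched as a modality-specific lookup followed by compensation for the client display's characteristic curve. A simplified Python illustration: a plain gamma curve stands in for the display's characteristic function (real systems typically use the DICOM Grayscale Standard Display Function), and the identity LUT is a placeholder for the transformation downloaded with each image:

```python
def display_value(pixel, modality_lut, display_gamma=2.2, max_in=255):
    """Two-stage grayscale mapping: a modality-specific transform shipped
    with the image, then a client-display-specific characteristic curve.
    Both stages here are simplified placeholders."""
    # Stage 1: modality-specific lookup (e.g., window/level baked into a LUT)
    presented = modality_lut[pixel]
    # Stage 2: compensate for the display's characteristic (gamma) function
    normalized = presented / max_in
    return round((normalized ** (1 / display_gamma)) * max_in)

identity_lut = list(range(256))          # placeholder modality LUT
mid_gray = display_value(128, identity_lut)
```

Applying stage 2 per display class is what makes the presentation on a commodity client consistent with the presentation state on a calibrated diagnostic workstation.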
Kouloulias, V E; Ntasis, E; Poortmans, Ph; Maniatis, T A; Nikita, K S
2003-01-01
Developing web-based platforms for remote collaboration among physicians and technologists has become a great challenge. In this paper we describe a web-based radiotherapy treatment planning (WBRTP) system that facilitates decentralized radiotherapy services by allowing remote treatment planning and quality assurance (QA) of treatment delivery. Significant prerequisites are digital storage of relevant data as well as an efficient and reliable telecommunication system between collaborating units. The WBRTP system includes video conferencing, display of medical images (CT scans, dose distributions, etc.), replication of selected data from a common database, remote treatment planning, evaluation of treatment technique, and follow-up of the treated patients. Moreover, the system features real-time remote operations in terms of tele-consulting, such as target volume delineation performed by a team of experts at different and distant units. An appraisal of its possibilities for quality assurance in radiotherapy is also discussed. In conclusion, a WBRTP system would be not only a medium for communication between experts in oncology but also a tool for improving QA in radiotherapy.
Using Standardized Lexicons for Report Template Validation with LexMap, a Web-based Application.
Hostetter, Jason; Wang, Kenneth; Siegel, Eliot; Durack, Jeremy; Morrison, James J
2015-06-01
An enormous amount of data exists in unstructured diagnostic and interventional radiology reports. Free text or non-standardized terminologies limit the ability to parse, extract, and analyze these report data elements. Medical lexicons and ontologies contain standardized terms for relevant concepts including disease entities, radiographic technique, and findings. The use of standardized terms offers the potential to improve reporting consistency and facilitate computer analysis. The purpose of this project was to implement an interface to aid in the creation of standards-compliant reporting templates for use in interventional radiology. Non-standardized procedure report text was analyzed and referenced to RadLex, SNOMED-CT, and LOINC. Using JavaScript, a web application was developed which determined whether exact terms or synonyms in reports existed within these three reference resources. The NCBO BioPortal Annotator web service was used to map terms, and output from this application was used to create an interactive annotated version of the original report. The application was successfully used to analyze and modify five distinct reports for the Society of Interventional Radiology's standardized reporting project.
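Exact-and-synonym lookup against standardized lexicons, the core of the mapping step described above, can be sketched as follows. The entries and synonyms below are invented placeholders; the real mappings come from RadLex, SNOMED-CT, and LOINC via the NCBO BioPortal Annotator:

```python
# Hypothetical miniature lexicon: canonical term -> source vocabulary and synonyms.
LEXICON = {
    "hepatic":      {"source": "RadLex", "synonyms": {"liver"}},
    "embolization": {"source": "RadLex", "synonyms": {"embolisation"}},
}

def map_term(term):
    """Return (canonical term, source) for an exact or synonym match, else None."""
    term = term.lower()
    for canonical, entry in LEXICON.items():
        if term == canonical or term in entry["synonyms"]:
            return canonical, entry["source"]
    return None
```

Terms that map cleanly can be marked as standards-compliant in the annotated report; unmapped terms are candidates for revision in the template.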
Wu, Zhen-Yu; Tseng, Yi-Ju; Chung, Yufang; Chen, Yee-Chun; Lai, Feipei
2012-08-01
With the rapid development of the Internet, both digitization and electronic orientation are required in various applications of daily life. For hospital-acquired infection control, a Web-based Hospital-acquired Infection Surveillance System was implemented. Clinical data from different hospitals and systems were collected and analyzed. The hospital-acquired infection screening rules in this system utilize this information to detect different patterns of defined hospital-acquired infection. Moreover, these data are integrated into a user interface with a single entry point to assist physicians and healthcare providers in making decisions. Based on Service-Oriented Architecture, web-service techniques, which are suitable for integrating heterogeneous platforms, protocols, and applications, were used. In summary, this system simplifies the workflow of hospital infection control and improves healthcare quality. However, it is possible for attackers to intercept the data transmission process or gain access to the user interface. To tackle illegal access and to prevent information from being stolen during transmission over the insecure Internet, a password-based user authentication scheme is proposed to preserve information integrity.
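A standard realization of password-based authentication stores a salted, iterated hash rather than the password itself; the abstract's specific scheme is not reproduced here, so the sketch below shows only the common PBKDF2 pattern:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=100_000):
    """Salted PBKDF2-SHA256 digest for password storage. Only the salt and
    digest are stored server-side, never the password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, expected, iterations=100_000):
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, expected)

salt, stored = hash_password("correct horse battery staple")
```

The constant-time comparison guards against timing attacks; the iteration count slows brute-force attempts against a stolen credential table.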
CalFitter: a web server for analysis of protein thermal denaturation data.
Mazurenko, Stanislav; Stourac, Jan; Kunka, Antonin; Nedeljkovic, Sava; Bednar, David; Prokop, Zbynek; Damborsky, Jiri
2018-05-14
Despite significant advances in the understanding of protein structure-function relationships, revealing protein folding pathways still poses a challenge due to a limited number of relevant experimental tools. Widely-used experimental techniques, such as calorimetry or spectroscopy, critically depend on a proper data analysis. Currently, there are only separate data analysis tools available for each type of experiment with a limited model selection. To address this problem, we have developed the CalFitter web server to be a unified platform for comprehensive data fitting and analysis of protein thermal denaturation data. The server allows simultaneous global data fitting using any combination of input data types and offers 12 protein unfolding pathway models for selection, including irreversible transitions often missing from other tools. The data fitting produces optimal parameter values, their confidence intervals, and statistical information to define unfolding pathways. The server provides an interactive and easy-to-use interface that allows users to directly analyse input datasets and simulate modelled output based on the model parameters. CalFitter web server is available free at https://loschmidt.chemi.muni.cz/calfitter/.
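The simplest unfolding pathway such a tool can fit is the reversible two-state model, where the van't Hoff relation gives the fraction of unfolded protein at each temperature. A minimal Python sketch with illustrative parameter values (this is not CalFitter's actual code, which fits 12 pathway models globally):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def fraction_unfolded(T, Tm, dH):
    """Two-state van't Hoff model: fraction unfolded at temperature T (K),
    given melting temperature Tm (K) and unfolding enthalpy dH (J/mol)."""
    # Equilibrium constant for folded <-> unfolded from the van't Hoff relation
    K = math.exp(-(dH / R) * (1.0 / T - 1.0 / Tm))
    return K / (1.0 + K)

# At the melting temperature, half the molecules are unfolded by definition
f_mid = fraction_unfolded(328.15, Tm=328.15, dH=400_000)
```

Fitting then amounts to adjusting Tm and dH so the model curve matches the measured calorimetric or spectroscopic signal; irreversible transitions require the richer kinetic models the server also offers.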
Frank, M S; Dreyer, K
2001-06-01
We describe a working software technology that enables educators to incorporate their expertise and teaching style into highly interactive and Socratic educational material for distribution on the world wide web. A graphically oriented interactive authoring system was developed to enable the computer novice to create and store within a database his or her domain expertise in the form of electronic knowledge. The authoring system supports and facilitates the input and integration of several types of content, including free-form, stylized text, miniature and full-sized images, audio, and interactive questions with immediate feedback. The system enables the choreography and sequencing of these entities for display within a web page as well as the sequencing of entire web pages within a case-based or thematic presentation. Images or segments of text can be hyperlinked with point-and-click to other entities such as adjunctive web pages, audio, or other images, cases, or electronic chapters. Miniature (thumbnail) images are automatically linked to their full-sized counterparts. The authoring system contains a graphically oriented word processor, an image editor, and capabilities to automatically invoke and use external image-editing software such as Photoshop. The system works in both local area network (LAN) and internet-centric environments. An internal metalanguage (invisible to the author but stored with the content) was invented to represent the choreographic directives that specify the interactive delivery of the content on the world wide web. A database schema was developed to objectify and store both this electronic knowledge and its associated choreographic metalanguage. 
A database engine was combined with page-rendering algorithms in order to retrieve content from the database and deliver it on the web in a Socratic style, assess the recipient's current fund of knowledge, and provide immediate feedback, thus simulating in-person interaction with a human expert. This technology enables the educator to choreograph a stylized, interactive delivery of his or her message using multimedia components assembled in virtually any order, spanning any number of web pages for a given case or theme. An educator can thus exercise precise influence on specific learning objectives, embody his or her personal teaching style within the content, and ultimately enhance its educational impact. The described technology amplifies the efforts of the educator and provides a more dynamic and enriching learning environment for web-based education.
Flat-plate solar array project process development area: Process research of non-CZ silicon material
NASA Technical Reports Server (NTRS)
Campbell, R. B.
1986-01-01
Several different techniques to simultaneously diffuse the front and back junctions in dendritic web silicon were investigated. A successful simultaneous diffusion reduces the cost of the solar cell by reducing the number of processing steps, the amount of capital equipment, and the labor cost. The three techniques studied were: (1) simultaneous diffusion at standard temperatures and times using a tube-type diffusion furnace or a belt furnace; (2) diffusion using excimer laser drive-in; and (3) simultaneous diffusion at high temperature and short times using a pulse of high-intensity light as the heat source. The excimer laser and the high-temperature, short-time diffusion techniques were both more successful than diffusion at standard temperatures and times. The three techniques are described in detail and a cost analysis of the more successful techniques is provided.
Web mining for topics defined by complex and precise predicates
NASA Astrophysics Data System (ADS)
Lee, Ching-Cheng; Sampathkumar, Sushma
2004-04-01
The enormous growth of the World Wide Web has made it important to perform resource discovery efficiently for any given topic. Several new techniques have been proposed in recent years for this kind of topic-specific web mining; among them is a key technique called focused crawling, which is able to crawl topic-specific portions of the web without having to explore all pages. Most existing research on focused crawling considers a simple topic definition that typically consists of one or more keywords connected by an OR operator. However, this kind of simple topic definition may result in too many irrelevant pages in which the same keyword appears in the wrong context. In this research we explore new strategies for crawling topic-specific portions of the web using complex and precise predicates. A complex predicate allows the user to precisely specify a topic using Boolean operators such as "AND", "OR" and "NOT". Our work concentrates first on defining a format for specifying this kind of complex topic definition, and second on devising a crawl strategy that covers the topic-specific portions of the web defined by the complex predicate, efficiently and with minimal overhead. This new crawl strategy improves the performance of topic-specific web crawling by reducing the number of irrelevant pages crawled. To demonstrate the effectiveness of this approach, we have built a complete focused crawler called "Eureka" with complex predicate support, and a search engine that indexes and supports end-user searches on the crawled pages.
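The complex-predicate idea above can be sketched in a few lines. This is a minimal illustration, not the authors' "Eureka" crawler: the nested-tuple predicate format is invented here, and a real focused crawler would apply such a check to fetched pages before deciding whether to follow their links.

```python
# Evaluate a Boolean topic predicate (AND / OR / NOT over keywords)
# against page text. Predicates are nested tuples with keyword leaves;
# this format is hypothetical, for illustration only.

def matches(predicate, text):
    """Return True if the page text satisfies the topic predicate."""
    if isinstance(predicate, str):                  # leaf: a keyword
        return predicate.lower() in text.lower()
    op, *args = predicate
    if op == "AND":
        return all(matches(a, text) for a in args)
    if op == "OR":
        return any(matches(a, text) for a in args)
    if op == "NOT":
        return not matches(args[0], text)
    raise ValueError(f"unknown operator: {op}")

# Example: pages about jaguars the animal, not the car.
topic = ("AND", "jaguar", ("NOT", ("OR", "car", "vehicle")))
print(matches(topic, "The jaguar is a big cat of the Americas"))  # True
print(matches(topic, "Jaguar unveiled a new luxury car model"))   # False
```

A crawler using such a predicate can prune whole branches of the link graph: pages failing the keyword-in-context test contribute no outgoing links to the frontier.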
Dosimeter for monitoring vapors and aerosols of organic compounds
Vo-Dinh, Tuan
1987-01-01
A dosimeter is provided for collecting and detecting vapors and aerosols of organic compounds. The dosimeter comprises a lightweight, passive device that can be conveniently worn by a person as a badge or placed at a stationary location. The dosimeter includes a sample collector comprising a porous web treated with a chemical for inducing molecular displacement and enhancing phosphorescence. Compounds are collected onto the web by molecular diffusion. The web also serves as the sample medium for detecting the compounds by a room temperature phosphorescence technique.
IMAGESEER - IMAGEs for Education and Research
NASA Technical Reports Server (NTRS)
Le Moigne, Jacqueline; Grubb, Thomas; Milner, Barbara
2012-01-01
IMAGESEER is a new Web portal that brings easy access to NASA image data for non-NASA researchers, educators, and students. The IMAGESEER Web site and database are specifically designed to be utilized by the university community, to enable teaching image processing (IP) techniques on NASA data, as well as to provide reference benchmark data to validate new IP algorithms. Along with the data and a Web user interface front-end, basic knowledge of the application domains, benchmark information, and specific NASA IP challenges (or case studies) are provided.
Advancements in silicon web technology
NASA Technical Reports Server (NTRS)
Hopkins, R. H.; Easoz, J.; Mchugh, J. P.; Piotrowski, P.; Hundal, R.
1987-01-01
Low defect density silicon web crystals up to 7 cm wide are produced from systems whose thermal environments are designed for low stress conditions using computer techniques. During growth, the average silicon melt temperature, the lateral melt temperature distribution, and the melt level are each controlled by digital closed loop systems to maintain thermal steady state and to minimize the labor content of the process. Web solar cell efficiencies of 17.2 pct AM1 have been obtained in the laboratory while 15 pct efficiencies are common in pilot production.
Two-step web-mining approach to study geology/geophysics-related open-source software projects
NASA Astrophysics Data System (ADS)
Behrends, Knut; Conze, Ronald
2013-04-01
Geology/geophysics is a highly interdisciplinary science, overlapping with, for instance, physics, biology and chemistry. In today's software-intensive work environments, geoscientists often encounter new open-source software from scientific fields that are only remotely related to their own field of expertise. We show how web-mining techniques can help to carry out systematic discovery and evaluation of such software. In a first step, we downloaded ~500 abstracts (each consisting of ~1 kb UTF-8 text) from agu-fm12.abstractcentral.com. This web site hosts the abstracts of all publications presented at AGU Fall Meeting 2012, the world's largest annual geology/geophysics conference. All abstracts belonged to the category "Earth and Space Science Informatics", an interdisciplinary label cross-cutting many disciplines such as "deep biosphere", "atmospheric research", and "mineral physics". Each publication was represented by a highly structured record with ~20 short data attributes, the largest being the unstructured "abstract" field. We processed the texts of the abstracts with the statistics software "R" to build a corpus and a term-document matrix. Using the R package "tm", we applied text-mining techniques to filter the data and develop hypotheses about software-development activities happening in various geology/geophysics fields. By analyzing the term-document matrix with basic techniques (e.g., word frequencies, co-occurrences, weighting) as well as more complex methods (clustering, classification), we extracted several key pieces of information. For example, text-mining can be used to identify scientists who are also developers of open-source scientific software, and the names of their programming projects and codes can also be identified.
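The first, abstract-processing step can be sketched as follows. The authors used R's "tm" package; this stdlib-only Python version is purely illustrative, with three toy abstracts standing in for the ~500 AGU records.

```python
# Build a tiny term-document matrix and word-frequency table from a
# corpus of abstracts -- the simplest of the analyses mentioned above.
from collections import Counter

abstracts = [
    "open source software for seismic data processing",
    "web mining of open data from github repositories",
    "software tools for atmospheric research data",
]

STOPWORDS = {"for", "of", "from", "the", "a"}

def terms(doc):
    """Tokenize a document, dropping stopwords."""
    return [w for w in doc.lower().split() if w not in STOPWORDS]

# term-document matrix as {term: [count_in_doc0, count_in_doc1, ...]}
vocab = sorted({t for doc in abstracts for t in terms(doc)})
tdm = {t: [Counter(terms(d))[t] for d in abstracts] for t in vocab}

# overall term frequencies across the corpus
freq = Counter(t for doc in abstracts for t in terms(doc))
print(freq.most_common(3))
```

Clustering, classification, and tf-idf weighting all start from this same matrix; R's `tm` builds it with `TermDocumentMatrix(corpus)`.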
In a second step, based on the intermediate results found by processing the conference abstracts, any new hypotheses can be tested in another web-mining subproject: by merging the dataset with open data from github.com and stackoverflow.com. These popular, developer-centric websites have powerful application programming interfaces and follow an open-data policy. In this regard, these sites offer a web-accessible reservoir of information that can be tapped to study questions such as: which open source software projects are eminent in the various geoscience fields? What are the most popular programming languages? How are they trending? Are there any interesting temporal patterns in committer activities? How large are programming teams and how do they change over time? What free software packages exist in the vast realms of related fields? Does the software from these fields have capabilities that might still be useful to me as a researcher, or can help me perform my work better? Are there any open-source projects that might be commercially interesting? This evaluation strategy reveals programming projects that tend to be new. As many important legacy codes are not hosted on open-source code repositories, the presented search method might overlook some older projects.
Tenório, Josceli Maria; Hummel, Anderson Diniz; Cohrs, Frederico Molina; Sdepanian, Vera Lucia; Pisa, Ivan Torres; de Fátima Marin, Heimar
2013-01-01
Background Celiac disease (CD) is a difficult-to-diagnose condition because of its multiple clinical presentations and symptoms shared with other diseases. Gold-standard diagnostic confirmation of suspected CD is achieved by biopsying the small intestine. Objective To develop a clinical decision-support system (CDSS) integrated with an automated classifier to recognize CD cases, by selecting from experimental models developed using artificial intelligence techniques. Methods A web-based system was designed for constructing a retrospective database that included 178 clinical cases for training. Tests were run on 270 automated classifiers available in Weka 3.6.1 using five artificial intelligence techniques, namely decision trees, Bayesian inference, the k-nearest neighbor algorithm, support vector machines and artificial neural networks. The parameters evaluated were accuracy, sensitivity, specificity and area under the ROC curve (AUC). AUC was used as the criterion for selecting the CDSS algorithm. A testing database was constructed including 38 clinical CD cases for CDSS evaluation. The diagnoses suggested by the CDSS were compared with those made by physicians during patient consultations. Results The most accurate method during the training phase was the averaged one-dependence estimator (AODE) algorithm (a Bayesian classifier), which showed accuracy 80.0%, sensitivity 0.78, specificity 0.80 and AUC 0.84. This classifier was integrated into the web-based decision-support system. The gold-standard validation of the CDSS achieved accuracy of 84.2% and k = 0.68 (p < 0.0001) with good agreement. The same accuracy was achieved in the comparison between the physician's diagnostic impression and the gold standard, k = 0.64 (p < 0.0001). There was moderate agreement between the physician's diagnostic impression and the CDSS, k = 0.46 (p = 0.0008).
Conclusions The study results suggest that the CDSS could be used to help in diagnosing CD, since the algorithm tested achieved excellent accuracy in differentiating possible positive from negative CD diagnoses. This study may contribute towards the development of a computer-assisted environment to support CD diagnosis. PMID:21917512
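The figures reported above (accuracy, sensitivity, specificity, Cohen's kappa) all follow from a 2x2 confusion matrix. A minimal sketch of that arithmetic, using hypothetical counts rather than the study's actual 38-case test data:

```python
# Compute accuracy, sensitivity, specificity, and Cohen's kappa from a
# 2x2 confusion matrix (tp/fn/fp/tn). The counts below are illustrative
# only; they are not the study's real validation data.

def metrics(tp, fn, fp, tn):
    n = tp + fn + fp + tn
    accuracy = (tp + tn) / n
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    # Cohen's kappa: observed agreement corrected for chance agreement
    p_o = accuracy
    p_yes = ((tp + fn) / n) * ((tp + fp) / n)   # chance agreement on "CD"
    p_no = ((fp + tn) / n) * ((fn + tn) / n)    # chance agreement on "no CD"
    p_e = p_yes + p_no
    kappa = (p_o - p_e) / (1 - p_e)
    return accuracy, sensitivity, specificity, kappa

acc, sens, spec, k = metrics(tp=14, fn=4, fp=2, tn=18)
print(f"accuracy={acc:.3f} sensitivity={sens:.2f} "
      f"specificity={spec:.2f} kappa={k:.2f}")
```

With these hypothetical counts the sketch reproduces the same style of result (accuracy 84.2%, kappa 0.68), illustrating how "good agreement" is quantified.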
Tenório, Josceli Maria; Hummel, Anderson Diniz; Cohrs, Frederico Molina; Sdepanian, Vera Lucia; Pisa, Ivan Torres; de Fátima Marin, Heimar
2011-11-01
Celiac disease (CD) is a difficult-to-diagnose condition because of its multiple clinical presentations and symptoms shared with other diseases. Gold-standard diagnostic confirmation of suspected CD is achieved by biopsying the small intestine. To develop a clinical decision-support system (CDSS) integrated with an automated classifier to recognize CD cases, by selecting from experimental models developed using artificial intelligence techniques. A web-based system was designed for constructing a retrospective database that included 178 clinical cases for training. Tests were run on 270 automated classifiers available in Weka 3.6.1 using five artificial intelligence techniques, namely decision trees, Bayesian inference, the k-nearest neighbor algorithm, support vector machines and artificial neural networks. The parameters evaluated were accuracy, sensitivity, specificity and area under the ROC curve (AUC). AUC was used as the criterion for selecting the CDSS algorithm. A testing database was constructed including 38 clinical CD cases for CDSS evaluation. The diagnoses suggested by the CDSS were compared with those made by physicians during patient consultations. The most accurate method during the training phase was the averaged one-dependence estimator (AODE) algorithm (a Bayesian classifier), which showed accuracy 80.0%, sensitivity 0.78, specificity 0.80 and AUC 0.84. This classifier was integrated into the web-based decision-support system. The gold-standard validation of the CDSS achieved accuracy of 84.2% and k=0.68 (p<0.0001) with good agreement. The same accuracy was achieved in the comparison between the physician's diagnostic impression and the gold standard, k=0.64 (p<0.0001). There was moderate agreement between the physician's diagnostic impression and the CDSS, k=0.46 (p=0.0008).
The study results suggest that the CDSS could be used to help in diagnosing CD, since the algorithm tested achieved excellent accuracy in differentiating possible positive from negative CD diagnoses. This study may contribute towards the development of a computer-assisted environment to support CD diagnosis. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Embedded Web Technology: Applying World Wide Web Standards to Embedded Systems
NASA Technical Reports Server (NTRS)
Ponyik, Joseph G.; York, David W.
2002-01-01
Embedded Systems have traditionally been developed in a highly customized manner. The user interface hardware and software, along with the interface to the embedded system, are typically unique to the system for which they are built, resulting in extra cost to the system in terms of development time and maintenance effort. World Wide Web standards have been developed in the past ten years with the goal of allowing servers and clients to interoperate seamlessly. The client and server systems can consist of differing hardware and software platforms, but the World Wide Web standards allow them to interface without knowing the details of the system at the other end of the interface. Embedded Web Technology is the merging of Embedded Systems with the World Wide Web. Embedded Web Technology decreases the cost of developing and maintaining the user interface by allowing the user to interface to the embedded system through a web browser running on a standard personal computer. Embedded Web Technology can also be used to simplify an Embedded System's internal network.
JavaScript Access to DICOM Network and Objects in Web Browser.
Drnasin, Ivan; Grgić, Mislav; Gogić, Goran
2017-10-01
The digital imaging and communications in medicine (DICOM) 3.0 standard provides the baseline for picture archiving and communication systems (PACS). The development of the Internet and various communication media initiated demand for non-DICOM access to PACS systems. The ever-increasing use of web browsers, laptops and handheld devices, as opposed to desktop applications and static organizational computers, led to the development of different web technologies, which the DICOM standard subsequently accepted as tools of alternative access. This paper provides an overview of the current state of development of web access technology to DICOM repositories. It presents a different approach that uses HTML5 features of web browsers, through the JavaScript language and the WebSocket protocol, to enable real-time communication with DICOM repositories. A JavaScript DICOM network library, a DICOM-to-WebSocket proxy and a proof-of-concept web application that qualifies as a DICOM 3.0 device were developed.
An Implementation of Interactive Objects on the Web.
ERIC Educational Resources Information Center
Fritze, Paul
With the release of ShockWave, MacroMedia Director animations can now be incorporated directly into Web pages to provide high quality animation and interactivity, to support, for example, tutorial style questions and instantaneous feedback. This paper looks at the application of this technique in the translation of a traditional computer-based…
Web-Based Trainer for Electrical Circuit Analysis
ERIC Educational Resources Information Center
Weyten, L.; Rombouts, P.; De Maeyer, J.
2009-01-01
A Web-based system for training electric circuit analysis is presented in this paper. It is centered on symbolic analysis techniques and it not only verifies the student's final answer, but it also tracks and coaches him/her through all steps of his/her reasoning path. The system mimics homework assignments, enhanced by immediate personalized…
A Smart Itsy Bitsy Spider for the Web.
ERIC Educational Resources Information Center
Chen, Hsinchun; Chung, Yi-Ming; Ramsey, Marshall; Yang, Christopher C.
1998-01-01
This study tested two Web personal spiders (i.e., agents that take users' requests and perform real-time customized searches) based on best first-search and genetic-algorithm techniques. Both results were comparable and complementary, although the genetic algorithm obtained higher recall value. The Java-based interface was found to be necessary…
Considering the Efficacy of Web-Based Worked Examples in Introductory Chemistry
ERIC Educational Resources Information Center
Crippen, Kent J.; Earl, Boyd L.
2004-01-01
Theory suggests that studying worked examples and engaging in self-explanation will improve learning and problem solving. A growing body of evidence supports the use of web-based assessments for improving undergraduate performance in traditional large enrollment courses. This article describes a study designed to investigate these techniques in a…
ERIC Educational Resources Information Center
Mackenzie, Christine
2008-01-01
The paper describes the introduction of Web 2.0 techniques and tools at the Yarra Plenty public library in Victoria, beginning with library staff. The author argues that as libraries move from Web 1.0 type delivery systems to the social networking world of Library 2.0 librarians need to deploy and make accessible these radically different systems,…
Appropriating Invention through Concept Maps in Writing for Multimedia and the Web
ERIC Educational Resources Information Center
Bacabac, Florence Elizabeth
2015-01-01
As an alternative approach to web preproduction, I propose the use of concept maps for invention of website projects in business and professional writing courses. This mapping device approximates our students' initial site plans since rough ideas are formed based on a substantial exploratory technique. Incorporated in various disciplines, the…
ERIC Educational Resources Information Center
Kim, Deok-Hwan; Chung, Chin-Wan
2003-01-01
Discusses the collection fusion problem of image databases, concerned with retrieving relevant images by content based retrieval from image databases distributed on the Web. Focuses on a metaserver which selects image databases supporting similarity measures and proposes a new algorithm which exploits a probabilistic technique using Bayesian…
Creative Commons: A New Tool for Schools
ERIC Educational Resources Information Center
Pitler, Howard
2006-01-01
Technology-savvy instructors often require students to create Web pages or videos, tasks that require finding materials such as images, music, or text on the Web, reusing them, and then republishing them in a technique that author Howard Pitler calls "remixing." However, this requires both the student and the instructor to deal with often thorny…
Digital Ethnography: Library Web Page Redesign among Digital Natives
ERIC Educational Resources Information Center
Klare, Diane; Hobbs, Kendall
2011-01-01
Presented with an opportunity to improve Wesleyan University's dated library home page, a team of librarians employed ethnographic techniques to explore how its users interacted with Wesleyan's current library home page and web pages in general. Based on the data that emerged, a group of library staff and members of the campus' information…
Online Persistence in Higher Education Web-Supported Courses
ERIC Educational Resources Information Center
Hershkovitz, Arnon; Nachmias, Rafi
2011-01-01
This research consists of an empirical study of online persistence in Web-supported courses in higher education, using Data Mining techniques. Log files of 58 Moodle websites accompanying Tel Aviv University courses were drawn, recording the activity of 1189 students in 1897 course enrollments during the academic year 2008/9, and were analyzed…
Binary Coded Web Access Pattern Tree in Education Domain
ERIC Educational Resources Information Center
Gomathi, C.; Moorthi, M.; Duraiswamy, K.
2008-01-01
Web Access Pattern (WAP), which is the sequence of accesses pursued by users frequently, is a kind of interesting and useful knowledge in practice. Sequential Pattern mining is the process of applying data mining techniques to a sequential database for the purposes of discovering the correlation relationships that exist among an ordered list of…
Games and Web 2.0: A Winning Combination for Millennials
ERIC Educational Resources Information Center
Spiegelman, Marsha; Glass, Richard
2009-01-01
Gaming and social networking define the millennial student. This research focuses on an evolving collaboration between 2 faculty members of different disciplines who merged Web 2.0 and game scenarios to infuse research techniques as integral components of math/computer science courses. Blogs and wikis facilitated student-faculty interaction beyond…
World-Wide Web: The Information Universe.
ERIC Educational Resources Information Center
Berners-Lee, Tim; And Others
1992-01-01
Describes the World-Wide Web (W3) project, which is designed to create a global information universe using techniques of hypertext, information retrieval, and wide area networking. Discussion covers the W3 data model, W3 architecture, the document naming scheme, protocols, document formats, comparison with other systems, experience with the W3…
Adding the ocean to the study of seabirds: A brief history of at-sea seabird research
Ainley, David G.; Ribic, Christine A.; Woehler, Eric J.
2012-01-01
We review the history of how research directed towards marine ornithology has led to an appreciation of seabirds as highly specialized marine organisms. Beginning with R. C. Murphy (Pacific), V. C. Wynne-Edwards (Atlantic), and associates in the early 1900s, the research approach grew from an emphasis on seabird single-species ecology to an appreciation of interacting species assemblages and finally to seabirds being considered as important components of marine food webs. After a slow, drawn-out beginning, the initial main impetus for developing the field was a need to map seabird abundance and distribution tied to understanding impacts of continental shelf resource exploitation. Coalescing during the 1970s to 1980s to facilitate this line of research were 6 factors: (1) ability to identify birds at sea; (2) standardization of techniques to quantify abundance; (3) resources and techniques for mapping; (4) appreciation of how scale affects seabird relationships to hydrographic features and patchy prey; (5) development of computing power and appropriate statistics; and (6) seabird biologists becoming embedded in, as well as organizing, multidisciplinary marine research projects. Future advances in understanding the role of seabirds in marine food webs will be made by seabird biologists participating in multidisciplinary projects using grid-like surveys relative to oceanographic features in combination with instrumentation that reveals the finer details of seabird foraging behaviors.
Awan, Omer Abdulrehman; van Wagenberg, Frans; Daly, Mark; Safdar, Nabile; Nagy, Paul
2011-04-01
Many radiology information systems (RIS) cannot accept a final report from a dictation reporting system before the exam has been completed in the RIS by a technologist. A radiologist can still render a report in a reporting system once images are available, but the RIS and ancillary systems may not get the results because of the study's uncompleted status. This delay in completing the study caused an alarming number of delayed reports and was undetected by conventional RIS reporting techniques. We developed a Web-based reporting tool to monitor uncompleted exams and automatically page section supervisors when a report was being delayed by its incomplete status in the RIS. Institutional Review Board exemption was obtained. At four imaging centers, a Python script was developed to poll the dictation system every 10 min for exams in five different modalities that were signed by the radiologist but could not be sent to the RIS. This script logged the exams into an existing Web-based tracking tool using PHP and a MySQL database. The script also text-paged the modality supervisor. The script logged the time at which the report was finally sent, and statistics were aggregated onto a separate Web-based reporting tool. Over a 1-year period, the average number of uncompleted exams per month and time to problem resolution decreased at every imaging center and in almost every imaging modality. Automated feedback provides a vital link in improving technologist performance and patient care without assigning a human resource to manage report queues.
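The polling pattern described above can be sketched as follows. The dictation-system query and paging gateway below are hypothetical stand-ins for the site-specific systems; the real script also logged exams into a PHP/MySQL web-based tracking tool.

```python
# Sketch of the 10-minute polling loop: find reports signed by a
# radiologist that cannot reach the RIS because the exam is incomplete,
# and text-page the section supervisor once per stuck exam.

def fetch_stuck_exams():
    """Stand-in for querying the dictation system (hypothetical data)."""
    return [{"accession": "A123", "modality": "CT", "signed_min_ago": 25}]

def page_supervisor(modality, exam):
    """Stand-in for the text-paging gateway."""
    print(f"PAGE {modality} supervisor: report held for {exam['accession']}")

def poll_once(already_paged):
    """One polling pass; pages each newly detected exam exactly once."""
    for exam in fetch_stuck_exams():
        if exam["accession"] not in already_paged:
            page_supervisor(exam["modality"], exam)
            already_paged.add(exam["accession"])
    return already_paged

paged = set()
poll_once(paged)   # pages the CT supervisor
poll_once(paged)   # same exam still stuck: no duplicate page
# a real deployment repeats this indefinitely:
#   while True: poll_once(paged); time.sleep(600)
```

Tracking which exams have already triggered a page is what lets the script run every 10 minutes without flooding supervisors with repeat alerts.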
Focused Crawling of the Deep Web Using Service Class Descriptions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rocco, D; Liu, L; Critchlow, T
2004-06-21
Dynamic Web data sources--sometimes known collectively as the Deep Web--increase the utility of the Web by providing intuitive access to data repositories anywhere that Web access is available. Deep Web services provide access to real-time information, like entertainment event listings, or present a Web interface to large databases or other data repositories. Recent studies suggest that the size and growth rate of the dynamic Web greatly exceed those of the static Web, yet dynamic content is often ignored by existing search engine indexers owing to the technical challenges that arise when attempting to search the Deep Web. To address these challenges, we present DynaBot, a service-centric crawler for discovering and clustering Deep Web sources offering dynamic content. DynaBot has three unique characteristics. First, DynaBot utilizes a service class model of the Web implemented through the construction of service class descriptions (SCDs). Second, DynaBot employs a modular, self-tuning system architecture for focused crawling of the Deep Web using service class descriptions. Third, DynaBot incorporates methods and algorithms for efficient probing of the Deep Web and for discovering and clustering Deep Web sources and services through SCD-based service matching analysis. Our experimental results demonstrate the effectiveness of the service class discovery, probing, and matching algorithms and suggest techniques for efficiently managing service discovery in the face of the immense scale of the Deep Web.
NASA Astrophysics Data System (ADS)
Iwatsuki, Masami; Kato, Yoriyuki; Yonekawa, Akira
State-of-the-art Internet technologies allow us to provide advanced and interactive distance education services. In engineering education, however, students have still had to be gathered in one place for experiments and exercises, because large-scale equipment and expensive software are required. On the other hand, teleoperation systems that control a robot manipulator or vehicle via the Internet have been developed in the field of robotics. By fusing these two techniques, we can realize remote experiment and exercise systems for engineering education based on the World Wide Web. This paper presents how to construct a remote environment that allows students to take courses on experiments and exercises independently of their locations. Using the proposed system, users can remotely practice controlling a manipulator and a robot vehicle and programming image processing.
Digital Mapping Techniques '08—Workshop Proceedings, Moscow, Idaho, May 18–21, 2008
Soller, David R.
2009-01-01
The Digital Mapping Techniques '08 (DMT'08) workshop was attended by more than 100 technical experts from 40 agencies, universities, and private companies, including representatives from 24 State geological surveys. This year's meeting, the twelfth in the annual series, was hosted by the Idaho Geological Survey, from May 18-21, 2008, on the University of Idaho campus in Moscow, Idaho. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
iGPCR-Drug: A Web Server for Predicting Interaction between GPCRs and Drugs in Cellular Networking
Xiao, Xuan; Min, Jian-Liang; Wang, Pu; Chou, Kuo-Chen
2013-01-01
Involved in many diseases such as cancer, diabetes, neurodegenerative, inflammatory and respiratory disorders, G-protein-coupled receptors (GPCRs) are among the most frequent targets of therapeutic drugs. It is time-consuming and expensive to determine whether a drug and a GPCR interact with each other in a cellular network purely by means of experimental techniques. Although some computational methods were developed in this regard based on knowledge of the 3D (dimensional) structure of the protein, unfortunately their usage is quite limited because the 3D structures for most GPCRs are still unknown. To overcome the situation, a sequence-based classifier, called “iGPCR-drug”, was developed to predict the interactions between GPCRs and drugs in cellular networking. In the predictor, the drug compound is formulated by a 2D (dimensional) fingerprint via a 256D vector, the GPCR by the PseAAC (pseudo amino acid composition) generated with the grey model theory, and the prediction engine is operated by the fuzzy K-nearest neighbour algorithm. Moreover, a user-friendly web-server for iGPCR-drug was established at http://www.jci-bioinfo.cn/iGPCR-Drug/. For the convenience of most experimental scientists, a step-by-step guide is provided on how to use the web-server to get the desired results without the need to follow the complicated math equations presented in this paper just for its integrity. The overall success rate achieved by iGPCR-drug via the jackknife test was 85.5%, which is remarkably higher than the rate by the existing peer method developed in 2010, although no web server was ever established for it. It is anticipated that iGPCR-Drug may become a useful high-throughput tool for both basic research and drug development, and that the approach presented here can also be extended to study other drug-target interaction networks. PMID:24015221
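The fuzzy K-nearest-neighbour rule named above as the prediction engine can be sketched in a few lines. The 2-D toy feature vectors are hypothetical stand-ins for the paper's 256-D fingerprints and PseAAC features; this is not the iGPCR-drug code itself.

```python
# Fuzzy K-nearest-neighbour classification sketch: class memberships are
# inverse-distance-weighted votes of the k nearest training samples.
import math

def fuzzy_knn(query, samples, k=3, m=2.0):
    """Return {label: membership degree} for `query`.
    samples: list of (feature_vector, label) pairs; m is the fuzzifier."""
    nearest = sorted(
        (math.dist(query, x), label) for x, label in samples
    )[:k]
    weights = {}
    for d, label in nearest:
        w = 1.0 / (d ** (2.0 / (m - 1.0)) + 1e-12)  # guard exact matches
        weights[label] = weights.get(label, 0.0) + w
    total = sum(weights.values())
    return {label: w / total for label, w in weights.items()}

# Toy 2-D features standing in for drug-GPCR pair descriptors.
samples = [((0.0, 0.0), "interact"), ((0.1, 0.2), "interact"),
           ((1.0, 1.0), "no-interact"), ((1.2, 0.9), "no-interact")]
memberships = fuzzy_knn((0.2, 0.1), samples, k=3)
print(max(memberships, key=memberships.get))  # prints "interact"
```

Unlike crisp k-NN, the rule returns graded memberships, so borderline drug-receptor pairs can be flagged rather than forced into a hard class.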
Biotool2Web: creating simple Web interfaces for bioinformatics applications.
Shahid, Mohammad; Alam, Intikhab; Fuellen, Georg
2006-01-01
Currently there are many bioinformatics applications being developed, but there is no easy way to publish them on the World Wide Web. We have developed a Perl script, called Biotool2Web, which makes the task of creating web interfaces for simple ('home-made') bioinformatics applications quick and easy. Biotool2Web uses an XML document containing the parameters to run the tool on the Web, and generates the corresponding HTML and common gateway interface (CGI) files ready to be published on a web server. The tool is available for download at http://www.uni-muenster.de/Bioinformatics/services/biotool2web/. Contact: Georg Fuellen (fuellen@alum.mit.edu).
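The core idea, generating a web front-end from an XML parameter description, can be sketched as follows. The real Biotool2Web is a Perl script and also emits the CGI glue; the XML schema here is invented for illustration.

```python
# Turn a tool's XML parameter description into an HTML form.
# The <tool>/<param> schema is hypothetical, for illustration only.
import xml.etree.ElementTree as ET

XML = """<tool name="revcomp">
  <param name="sequence" type="text" label="DNA sequence"/>
  <param name="reverse" type="checkbox" label="Reverse only"/>
</tool>"""

def xml_to_form(xml_text):
    """Render one HTML form per tool description."""
    tool = ET.fromstring(xml_text)
    rows = [f'<form action="/cgi-bin/{tool.get("name")}.cgi" method="post">']
    for p in tool.iter("param"):
        rows.append(f'  <label>{p.get("label")}: '
                    f'<input type="{p.get("type")}" name="{p.get("name")}"/></label>')
    rows.append('  <input type="submit" value="Run"/>')
    rows.append('</form>')
    return "\n".join(rows)

print(xml_to_form(XML))
```

Keeping the tool description declarative means adding a parameter is a one-line XML edit; the form and (in the real tool) the CGI handler are regenerated rather than hand-edited.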
The read-write Linked Data Web.
Berners-Lee, Tim; O'Hara, Kieron
2013-03-28
This paper discusses issues that will affect the future development of the Web, either increasing its power and utility, or alternatively suppressing its development. It argues for the importance of the continued development of the Linked Data Web, and describes the use of linked open data as an important component of that. Second, the paper defends the Web as a read-write medium, and goes on to consider how the read-write Linked Data Web could be achieved.
Cox, Martine Elizabeth; Small, Hannah Julie; Boyes, Allison W; O'Brien, Lorna; Rose, Shiho Karina; Baker, Amanda L; Henskens, Frans A; Kirkwood, Hannah Naomi; Roach, Della M
2017-01-01
Background Web-based typed exchanges are increasingly used by professionals to provide emotional support to patients. Although some empirical evidence exists to suggest that various strategies may be used to convey emotion during Web-based text communication, there has been no critical review of these data in patients with chronic conditions. Objectives The objective of this review was to identify the techniques used to convey emotion in written or typed Web-based communication and assess the empirical evidence regarding impact on communication and psychological outcomes. Methods An electronic search of databases, including MEDLINE, CINAHL, PsycINFO, EMBASE, and the Cochrane Library was conducted to identify literature published from 1990 to 2016. Searches were also conducted using Google Scholar, manual searching of reference lists of identified papers and manual searching of tables of contents for selected relevant journals. Data extraction and coding were completed by 2 reviewers (10.00% [573/5731] of screened papers at the abstract/title screening stage; 10.0% [69/694] at the full-text screening stage). Publications were assessed against the eligibility criteria and excluded if they were duplicates, were not published in English, were published before 1990, referenced animal or nonhuman subjects, did not describe original research, were not journal papers, or did not empirically test the effect of one or more nonverbal communication techniques (e.g., smileys, emoticons, emotional bracketing, voice accentuation, trailers [ellipsis], and pseudowords) as part of Web-based or typed communication on communication-related variables, including message interpretation, social presence, the nature of the interaction (eg, therapeutic alliance), patient perceptions of the interaction (eg, participant satisfaction), or psychological outcomes, including depression, anxiety, and distress. Results A total of 6902 unique publications were identified.
Of these, six publications met the eligibility criteria and were included in a narrative synthesis. All six studies addressed the effect of smileys or emoticons on participant responses, message interpretation, or social presence of the writer. None of these studies specifically targeted chronic conditions. It was found that emoticons were more effective in influencing the emotional impact of a message than no cue and that smileys and emoticons were able to convey a limited amount of emotion. No studies addressed other techniques for conveying emotion in written communication. No studies addressed the effects of any techniques on the nature of the interaction (eg, therapeutic alliance), patient perceptions of the interaction (eg, participant satisfaction), or psychological outcomes (depression, anxiety, or distress). Conclusions There is a need for greater empirical attention to the effects of the various proposed techniques for conveying emotion in Web-based typed communications to inform health service providers regarding best-practice communication skills in this setting. PMID:29066426
[A solution for display and processing of DICOM images in web PACS].
Xue, Wei-jing; Lu, Wen; Wang, Hai-yang; Meng, Jian
2009-03-01
Java Applet techniques are used to add DICOM image support to an ordinary Web browser, thereby extending its medical image processing functions. The DICOM file format is first analyzed and a class is designed to acquire the pixel data; two Applet classes are then designed, one to process the DICOM image and the other to display the processed image. Both are embedded in the view page and communicate through the AppletContext object. The method described in this paper lets users display and process DICOM images directly in an ordinary Web browser, giving a Web PACS the advantages of both the B/S model and the C/S model. Java Applets are the key to extending the Web browser's function in a Web PACS, and this work provides a guideline for the sharing of medical images.
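The element-by-element file structure that the pixel-acquisition class must parse can be sketched in a few lines. This is a minimal sketch, assuming explicit-VR little-endian encoding only; the function name is illustrative, and a real reader must also handle the 128-byte preamble, the "DICM" marker, implicit VR, and nested sequences.

```python
import struct

# VRs whose length field is 4 bytes, preceded by 2 reserved bytes
# (a subset of the DICOM explicit-VR rules, for illustration only).
LONG_VRS = {b"OB", b"OW", b"OF", b"SQ", b"UT", b"UN"}

def parse_elements(data):
    """Yield (group, element, VR, value) tuples from an explicit-VR
    little-endian byte string of DICOM data elements."""
    pos = 0
    while pos < len(data):
        group, elem = struct.unpack_from("<HH", data, pos)
        vr = data[pos + 4:pos + 6]
        if vr in LONG_VRS:
            (length,) = struct.unpack_from("<I", data, pos + 8)
            pos += 12                      # tag + VR + reserved + 4-byte length
        else:
            (length,) = struct.unpack_from("<H", data, pos + 6)
            pos += 8                       # tag + VR + 2-byte length
        yield group, elem, vr.decode(), data[pos:pos + length]
        pos += length
```

In practice the pixel data live in element (7FE0,0010), which uses the long-length form, while header elements such as Patient Name (0010,0010) use the short form.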
Development of a Web-Based Periscope Simulator for Submarine Officer Training
2014-09-01
[Only a fragmentary excerpt of this report is indexed.] It reviews the evolution of Web-based technology and the possibility of delivering 3D simulations using web browsers and web technology; the objective is to create an effective and efficient Web-based learning environment (WBLE).
Semantic Advertising for Web 3.0
NASA Astrophysics Data System (ADS)
Thomas, Edward; Pan, Jeff Z.; Taylor, Stuart; Ren, Yuan; Jekjantuk, Nophadol; Zhao, Yuting
Advertising on the World Wide Web is based around automatically matching web pages with appropriate advertisements, in the form of banner ads, interactive adverts, or text links. Traditionally this has been done by manual classification of pages, or more recently using information retrieval techniques to find the most important keywords from the page, and match these to keywords being used by adverts. In this paper, we propose a new model for online advertising, based around lightweight embedded semantics. This will improve the relevancy of adverts on the World Wide Web and help to kick-start the use of RDFa as a mechanism for adding lightweight semantic attributes to the Web. Furthermore, we propose a system architecture for the proposed new model, based on our scalable ontology reasoning infrastructure TrOWL.
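The "lightweight embedded semantics" the authors build on can be illustrated with RDFa attributes. In this hypothetical page fragment (the vocabulary and values are illustrative, not taken from the paper), `vocab`, `typeof`, and `property` embed machine-readable statements that an ad-matching system could consume instead of guessing from keywords:

```html
<!-- Hypothetical product page fragment: RDFa attributes turn visible
     text into machine-readable statements about the page's subject. -->
<div vocab="https://schema.org/" typeof="Product">
  <span property="name">Trail Running Shoe</span>
  <span property="category">Sports footwear</span>
  <span property="offers" typeof="Offer">
    <span property="price" content="89.00">£89</span>
  </span>
</div>
```

An RDFa-aware matcher can then reason over the extracted triples (for instance, that the page describes a sports product) rather than over raw term frequencies.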
Web based visualization of large climate data sets
Alder, Jay R.; Hostetler, Steven W.
2015-01-01
We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
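The abstract does not specify the layout of the NCCV's compact binary summary files, but the distillation idea can be sketched as follows. The magic tag, field layout, and function names here are entirely hypothetical; the point is only that large gridded data reduce to small fixed-format records the web application can read quickly.

```python
import struct

# Hypothetical compact summary format: a 4-byte magic tag, a count,
# then one 32-bit float per summarized value (e.g. monthly means for
# one variable at one location).
MAGIC = b"CSUM"

def write_summary(path, values):
    with open(path, "wb") as f:
        f.write(MAGIC)
        f.write(struct.pack("<I", len(values)))
        f.write(struct.pack("<%df" % len(values), *values))

def read_summary(path):
    with open(path, "rb") as f:
        if f.read(4) != MAGIC:
            raise ValueError("not a summary file")
        (n,) = struct.unpack("<I", f.read(4))
        return list(struct.unpack("<%df" % n, f.read(4 * n)))
```

A fixed binary layout like this lets the server answer a request with a single small read instead of scanning the original multi-terabyte archive.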
Nakamura, R; Sasaki, M; Oikawa, H; Harada, S; Tamakawa, Y
2000-03-01
To use an intranet technique to develop an information system that simultaneously supports both diagnostic reports and radiotherapy planning images. Using a file server as the gateway, a radiation oncology LAN was connected to an already operative RIS LAN. Dose-distribution images were saved in tagged-image-file format by way of a screen dump to the file server. X-ray simulator images and portal images were saved in encapsulated PostScript format on the file server and automatically converted to portable document format. The files on the file server were automatically registered to the Web server by the search engine and were available for searching and browsing using the Web browser. It took less than a minute to register planning images. For clients, searching and browsing a file took less than 3 seconds. Over 150,000 reports and 4,000 images from a six-month period were accessible. Because the intranet technique was used, construction and maintenance were completed without specialized expertise. Prompt access to essential information about radiotherapy has been made possible by this system. It promotes shared access to radiotherapy planning information, which may improve the quality of treatment.
Classification of HTTP Attacks: A Study on the ECML/PKDD 2007 Discovery Challenge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallagher, Brian; Eliassi-Rad, Tina
2009-07-08
As the world becomes more reliant on Web applications for commercial, financial, and medical transactions, cyber attacks on the World Wide Web are increasing in frequency and severity. Web applications provide an attractive alternative to traditional desktop applications due to their accessibility and ease of deployment. However, the accessibility of Web applications also makes them extremely vulnerable to attack. This inherent vulnerability is intensified by the distributed nature of Web applications and the complexity of configuring application servers. These factors have led to a proliferation of Web-based attacks, in which attackers surreptitiously inject code into HTTP requests, allowing them to execute arbitrary commands on remote systems and perform malicious activities such as reading, altering, or destroying sensitive data. One approach for dealing with HTTP-based attacks is to identify malicious code in incoming HTTP requests and eliminate bad requests before they are processed. Using machine learning techniques, we can build a classifier to automatically label requests as “Valid” or “Attack.” For this study, we develop a simple, but effective HTTP attack classifier, based on the vector space model used commonly for Information Retrieval. Our classifier not only separates attacks from valid requests, but can also identify specific attack types (e.g., “SQL Injection” or “Path Traversal”). We demonstrate the effectiveness of our approach through experiments on the ECML/PKDD 2007 Discovery Challenge data set. Specifically, we show that our approach achieves higher precision and recall than previous methods. In addition, our approach has a number of desirable characteristics, including robustness to missing contextual information, interpretability of models, and scalability.
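A minimal sketch of the vector-space idea, assuming a nearest-centroid classifier over token-frequency vectors with cosine similarity. The tokenizer and the toy training requests below are illustrative assumptions, not the feature set or data used in the study.

```python
import math
import re
from collections import Counter

def vectorize(request):
    """Toy tokenizer: words plus a few punctuation symbols common in attacks."""
    return Counter(re.findall(r"[A-Za-z0-9']+|[=<>;&%/]", request.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def train(labelled_requests):
    """Build one summed term-frequency centroid per class label."""
    centroids = {}
    for label, req in labelled_requests:
        centroids.setdefault(label, Counter()).update(vectorize(req))
    return centroids

def classify(centroids, request):
    """Assign the label whose centroid is most similar to the request."""
    v = vectorize(request)
    return max(centroids, key=lambda label: cosine(v, centroids[label]))
```

Because each attack type gets its own centroid, the same mechanism that separates “Valid” from “Attack” also distinguishes among attack classes.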
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
Creating and sharing clinical decision support content with Web 2.0: Issues and examples.
Wright, Adam; Bates, David W; Middleton, Blackford; Hongsermeier, Tonya; Kashyap, Vipul; Thomas, Sean M; Sittig, Dean F
2009-04-01
Clinical decision support is a powerful tool for improving healthcare quality and patient safety. However, developing a comprehensive package of decision support interventions is costly and difficult. If used well, Web 2.0 methods may make it easier and less costly to develop decision support. Web 2.0 is characterized by online communities, open sharing, interactivity and collaboration. Although most previous attempts at sharing clinical decision support content have worked outside of the Web 2.0 framework, several initiatives are beginning to use Web 2.0 to share and collaborate on decision support content. We present case studies of three efforts: the Clinfowiki, a world-accessible wiki for developing decision support content; Partners Healthcare eRooms, web-based tools for developing decision support within a single organization; and Epic Systems Corporation's Community Library, a repository for sharing decision support content for customers of a single clinical system vendor. We evaluate the potential of Web 2.0 technologies to enable collaborative development and sharing of clinical decision support systems through the lens of three case studies; analyzing technical, legal and organizational issues for developers, consumers and organizers of clinical decision support content in Web 2.0. We believe the case for Web 2.0 as a tool for collaborating on clinical decision support content appears strong, particularly for collaborative content development within an organization.
A photogrammetric technique for generation of an accurate multispectral optical flow dataset
NASA Astrophysics Data System (ADS)
Kniaz, V. V.
2017-06-01
The presence of an accurate dataset is a key requirement for the successful development of an optical flow estimation algorithm. A large number of freely available optical flow datasets were developed in recent years and gave rise to many powerful algorithms. However, most of the datasets include only images captured in the visible spectrum. This paper is focused on the creation of a multispectral optical flow dataset with an accurate ground truth. The generation of an accurate ground truth optical flow is a rather complex problem, as no device for error-free optical flow measurement has been developed to date. Existing methods for ground truth optical flow estimation are based on hidden textures, 3D modelling or laser scanning. Such techniques either work only with synthetic optical flow or provide only a sparse ground truth. In this paper a new photogrammetric method for the generation of an accurate ground truth optical flow is proposed. The method combines the accuracy and density of synthetic optical flow datasets with the flexibility of laser scanning based techniques. A multispectral dataset including various image sequences was generated using the developed method. The dataset is freely available on the accompanying web site.
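The core of the photogrammetric idea, that known scene geometry and camera poses determine the ground-truth flow exactly, can be sketched with a pinhole camera translating along one axis. All parameters below (focal length, principal point, the pure-translation motion) are simplifying assumptions for illustration, not the paper's calibration.

```python
# Ground-truth flow from geometry: project each 3D point into two
# camera positions; the flow vector is the difference in pixel
# coordinates. A pinhole camera translating along x is assumed.
def project(point, camera_x, focal=500.0, cx=320.0, cy=240.0):
    X, Y, Z = point
    return (focal * (X - camera_x) / Z + cx, focal * Y / Z + cy)

def flow_vectors(points, camera_x0, camera_x1):
    """Exact flow vectors for known 3D points between two camera poses."""
    vectors = []
    for p in points:
        u0, v0 = project(p, camera_x0)
        u1, v1 = project(p, camera_x1)
        vectors.append((u1 - u0, v1 - v0))
    return vectors
```

Because the displacement is computed analytically from the geometry rather than estimated from image content, it is error-free and can be made as dense as the reconstructed point set.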
Implementing WebQuest Based Instruction on Newton's Second Law
ERIC Educational Resources Information Center
Gokalp, Muhammed Sait; Sharma, Manjula; Johnston, Ian; Sharma, Mia
2013-01-01
The purpose of this study was to investigate how WebQuests can be used in physics classes for teaching specific concepts. The study had three stages. The first stage was to develop a WebQuest on Newton's second law. The second stage involved developing a lesson plan to implement the WebQuest in class. In the final stage, the WebQuest was…
NASA Astrophysics Data System (ADS)
Veenendaal, B.; Brovelli, M. A.; Li, S.; Ivánová, I.
2017-09-01
Although maps have been around for a very long time, web maps are still very young. Despite their relatively short history, web maps have developed very rapidly over the past few decades. The use, users and usability of web maps have expanded rapidly along with developments in web technologies and new ways of mapping. In the process of these developments, the terms and terminology surrounding web mapping have also changed and evolved, often in relation to the new technologies or new uses. Examples include web mapping, web GIS, cloud mapping, internet mapping, internet GIS, geoweb, map mashup, online mapping etc., not to mention those with prefixes such as "web-based" and "internet-based". So, how do we keep track of these terms, relate them to each other and arrive at common understandings of their meanings so that references to them are not ambiguous, misunderstood or even divergent? This paper explores the terms surrounding web mapping and web GIS, and the development of their meaning over time. The paper then describes the current context in which these terms are used and provides meanings that may assist in better understanding and communicating with these terms in the future.
ERIC Educational Resources Information Center
Lakonpol, Thongmee; Ruangsuwan, Chaiyot; Terdtoon, Pradit
2015-01-01
This research aimed to develop a web-based learning environment model for enhancing cognitive skills of undergraduate students in the field of electrical engineering. The research is divided into 4 phases: 1) investigating the current status and requirements of web-based learning environment models. 2) developing a web-based learning environment…
ERIC Educational Resources Information Center
Pumipuntu, Natawut; Kidrakarn, Pachoen; Chetakarn, Somchock
2015-01-01
This research aimed to develop the model of Web-based Collaborative (WBC) Training model for enhancing human performances on ICT for students in Banditpattanasilpa Institute. The research is divided into three phases: 1) investigating students and teachers' training needs on ICT web-based contents and performance, 2) developing a web-based…
ERIC Educational Resources Information Center
Dehinbo, Johnson
2011-01-01
The widespread use of the Internet and the World Wide Web led to the availability of many platforms for developing dynamic Web application and the problem of choosing the most appropriate platform that will be easy to use for undergraduate students of web applications development in tertiary institutions. Students beginning to learn web…
ERIC Educational Resources Information Center
Miller, Leslie; Chang, Ching-I; Hoyt, Daniel
2010-01-01
CSI: The Experience, a traveling museum exhibit and a companion web adventure, was created through a grant from the National Science Foundation as a potential model for informal learning. The website was designed to enrich and complement the exhibit by modeling the forensic process. Substantive science, real-world lab techniques, and higher-level…
ERIC Educational Resources Information Center
Theresa, Ofoegbu; Ugwu, Agboeze Matthias; Ihebuzoaju, Anyanwu Joy; Uche, Asogwa
2013-01-01
The study investigated the Web-browsing competencies of pre-service adult facilitators in the southeast geopolitical zone of Nigeria. Survey design was adopted for the study. The population consists of all pre-service adult facilitators in all the federal universities in the southeast geopolitical zone of Nigeria. Accidental sampling technique was…
A Clustering Methodology of Web Log Data for Learning Management Systems
ERIC Educational Resources Information Center
Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros
2012-01-01
Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…
Creating and Maintaining Data-Driven Course Web Sites.
ERIC Educational Resources Information Center
Heines, Jesse M.
This paper deals with techniques for reducing the amount of work that needs to be redone each semester when one prepares an existing course Web site for a new class. The key concept is algorithmic generation of common page elements while still allowing full control over page content via WYSIWYG tools like Microsoft FrontPage and Macromedia…
Student-Led Engagement of Journal Article Authors in the Classroom Using Web-Based Videoconferencing
ERIC Educational Resources Information Center
Stockman, Brian J.
2015-01-01
The learning environment described here uses Web-based videoconferencing technology to merge the traditional classroom journal article discussion with student-led interviews of journal article authors. Papers that describe recent applications of a given technique are selected, with the author engagement occurring at the end of a three or four week…
Practical Tips and Strategies for Finding Information on the Internet.
ERIC Educational Resources Information Center
Armstrong, Rhonda; Flanagan, Lynn
This paper presents the most important concepts and techniques to use in successfully searching the major World Wide Web search engines and directories, explains the basics of how search engines work, and describes what is included in their indexes. Following an introduction that gives an overview of Web directories and search engines, the first…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-14
...'s Web site at http://www.batstrading.com , at the principal office of the Exchange, and at the...-brand one of its routing strategies, currently referred to as ``DART,'' as the ``Dark Routing Technique... only one method. The Commission will post all comments on the Commission's Internet Web site ( http...
A Content Analysis of Themes That Emerge from School Principals' Web2.0 Conversations
ERIC Educational Resources Information Center
Manning, Rory
2011-01-01
The purpose of this qualitative study was to analyze the self-initiated conversations held by school principals on Web 2.0 platforms, such as blogs, through the lens of current leadership standards. The online writings of thirteen school principals were analyzed using grounded theory techniques (Strauss and Corbin, 1998) to elucidate emerging…
Caching strategies for improving performance of web-based Geographic applications
NASA Astrophysics Data System (ADS)
Liu, M.; Brodzik, M.; Collins, J. A.; Lewis, S.; Oldenburg, J.
2012-12-01
The NASA Operation IceBridge mission collects airborne remote sensing measurements to bridge the gap between NASA's Ice, Cloud and Land Elevation Satellite (ICESat) mission and the upcoming ICESat-2 mission. The IceBridge Data Portal from the National Snow and Ice Data Center provides an intuitive web interface for accessing IceBridge mission observations and measurements. Scientists and users usually do not have knowledge about the individual campaigns but are interested in data collected in a specific place. We have developed a high-performance map interface to allow users to quickly zoom to an area of interest and see any Operation IceBridge overflights. The map interface consists of two layers: the user can pan and zoom on the base map layer; the flight line layer that overlays the base layer provides all the campaign missions that intersect with the current map view. The user can click on the flight campaigns and download the data as needed. The OpenGIS® Web Map Service Interface Standard (WMS) provides a simple HTTP interface for requesting geo-registered map images from one or more distributed geospatial databases. Web Feature Service (WFS) provides an interface allowing requests for geographical features across the web using platform-independent calls. OpenLayers provides vector support (points, polylines and polygons) to build a WMS/WFS client for displaying both layers on the screen. Map Server, an open source development environment for building spatially enabled internet applications, is serving the WMS and WFS spatial data to OpenLayers. Early releases of the portal displayed unacceptably poor load time performance for flight lines and the base map tiles. This issue was caused by long response times from the map server in generating all map tiles and flight line vectors. We resolved the issue by implementing various caching strategies on top of the WMS and WFS services, including the use of Squid (www.squid-cache.org) to cache frequently-used content. 
Our presentation includes the architectural design of the application, and how we use OpenLayers, WMS and WFS with Squid to build a responsive web application capable of efficiently displaying geospatial data to allow the user to quickly interact with the displayed information. We describe the design, implementation and performance improvement of our caching strategies, and the tools and techniques developed to assist our data caching strategies.
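An in-process analogue of this caching strategy can be sketched as follows. The function and key names are hypothetical, and the portal itself caches at the HTTP layer via Squid rather than inside the application; the sketch only shows why repeated tile requests become cheap.

```python
import functools

def tile_cache(render):
    """Memoize a tile renderer by (layer, zoom, x, y) key."""
    store = {}
    @functools.wraps(render)
    def cached(layer, zoom, x, y):
        key = (layer, zoom, x, y)
        if key not in store:
            store[key] = render(layer, zoom, x, y)  # expensive path, taken once
        return store[key]
    return cached

calls = {"n": 0}

@tile_cache
def render_tile(layer, zoom, x, y):
    calls["n"] += 1  # stand-in for an expensive map-server render
    return "tile(%s,%d,%d,%d)" % (layer, zoom, x, y)
```

Python's `functools.lru_cache` provides the same behaviour with bounded eviction built in; the explicit dictionary above just makes the mechanism visible.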
Utilization of two web-based continuing education courses evaluated by Markov chain model.
Tian, Hao; Lin, Jin-Mann S; Reeves, William C
2012-01-01
To evaluate the web structure of two web-based continuing education courses, identify problems and assess the effects of web site modifications. Markov chain models were built from 2008 web usage data to evaluate the courses' web structure and navigation patterns. The web site was then modified to resolve identified design issues and the improvement in user activity over the subsequent 12 months was quantitatively evaluated. Web navigation paths were collected between 2008 and 2010. The probability of navigating from one web page to another was analyzed. The continuing education courses' sequential structure design was clearly reflected in the resulting actual web usage models, and none of the skip transitions provided was heavily used. The web navigation patterns of the two different continuing education courses were similar. Two possible design flaws were identified and fixed in only one of the two courses. Over the following 12 months, the drop-out rate in the modified course significantly decreased from 41% to 35%, but remained unchanged in the unmodified course. The web improvement effects were further verified via a second-order Markov chain model. The results imply that differences in web content have less impact than web structure design on how learners navigate through continuing education courses. Evaluation of user navigation can help identify web design flaws and guide modifications. This study showed that Markov chain models provide a valuable tool to evaluate web-based education courses. Both the results and techniques in this study would be very useful for public health education and research specialists.
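The first step of such an evaluation, estimating first-order transition probabilities from recorded navigation paths, can be sketched as follows. The page names are hypothetical; the estimator is the standard maximum-likelihood one, P(next | current) = count(current → next) / count(current).

```python
from collections import Counter, defaultdict

def transition_probabilities(paths):
    """Estimate a first-order Markov chain from lists of visited pages."""
    counts = defaultdict(Counter)
    for path in paths:
        for current, nxt in zip(path, path[1:]):
            counts[current][nxt] += 1
    return {
        page: {nxt: c / sum(nexts.values()) for nxt, c in nexts.items()}
        for page, nexts in counts.items()
    }
```

Rarely used skip transitions and heavily used drop-out transitions then show up directly as small or large probabilities in the estimated chain.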
Utilization of two web-based continuing education courses evaluated by Markov chain model
Lin, Jin-Mann S; Reeves, William C
2011-01-01
Objectives To evaluate the web structure of two web-based continuing education courses, identify problems and assess the effects of web site modifications. Design Markov chain models were built from 2008 web usage data to evaluate the courses' web structure and navigation patterns. The web site was then modified to resolve identified design issues and the improvement in user activity over the subsequent 12 months was quantitatively evaluated. Measurements Web navigation paths were collected between 2008 and 2010. The probability of navigating from one web page to another was analyzed. Results The continuing education courses' sequential structure design was clearly reflected in the resulting actual web usage models, and none of the skip transitions provided was heavily used. The web navigation patterns of the two different continuing education courses were similar. Two possible design flaws were identified and fixed in only one of the two courses. Over the following 12 months, the drop-out rate in the modified course significantly decreased from 41% to 35%, but remained unchanged in the unmodified course. The web improvement effects were further verified via a second-order Markov chain model. Conclusions The results imply that differences in web content have less impact than web structure design on how learners navigate through continuing education courses. Evaluation of user navigation can help identify web design flaws and guide modifications. This study showed that Markov chain models provide a valuable tool to evaluate web-based education courses. Both the results and techniques in this study would be very useful for public health education and research specialists. PMID:21976027
Pratte, Gabrielle; Hurtubise, Karen; Rivard, Lisa; Berbari, Jade; Camden, Chantal
2018-01-01
Web platforms are increasingly used to support virtual interactions between members of communities of practice (CoP). However, little is known about how to develop these platforms to support the implementation of best practices by health care professionals. The aim of this article is to explore pediatric physiotherapists' (PTs) perspectives regarding the utility and usability of the characteristics of a web platform developed to support virtual communities of practice (vCoP). This study adopted an explanatory sequential mixed methods design. A web platform supporting the interactions of vCoP members was developed for PTs working with children with developmental coordination disorder. Specific strategies and features were created to support the effectiveness of the platform across three domains: social, information-quality, and system-quality factors. Quantitative data were collected from a cross-sectional survey (n = 41) after 5 months of access to the web platform. Descriptive statistics were calculated. Qualitative data were also collected from semistructured interviews (n = 9), which were coded, interpreted, and analyzed using Boucher's Web Ergonomics Conceptual Framework. The utility of the web platform characteristics targeting the three key domain factors was generally perceived positively by PTs. However, web platform usability issues were noted by PTs, including problems with navigation and information retrieval. A web platform aiming to support a vCoP should be carefully developed to target potential users' needs. Whenever possible, users should co-construct the web platform with vCoP developers. Moreover, each of the developed characteristics (eg, newsletter, search function) should be evaluated in terms of its utility and usability for the users.
Warren, David W.
1997-01-01
A process and an apparatus for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquefied eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface, without dilution by ambient air; and it is differentiated from the evaporative drying techniques of the prior industrial art, which depend on the use of steam-heated cylinders to supply heat to the paper web surface, and ambient air to carry away moisture evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submerging the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air, allowing heat recovery at significantly higher temperature levels than attainable in evaporative dryers.
Guéguen, Nicolas
2003-04-01
In an attempt to test the door-in-the-face (DITF) technique in a computer-mediated context, 1,607 men and women selected at random from various e-mail lists were solicited to visit a web site for the benefit of a humanitarian organization. In the DITF condition, subjects were first presented with an exaggerated request and, after refusing, were solicited for a small donation. In the control condition the donation solicitation was made directly. In all cases, the request was manipulated through the order of the successive HTML pages of the site. Results show that the DITF procedure increases compliance with the final request. The theoretical implications of the effect of this technique in a computer-mediated communication context are discussed.
Web 1.0 to Web 3.0 Evolution: Reviewing the Impacts on Tourism Development and Opportunities
NASA Astrophysics Data System (ADS)
Eftekhari, M. Hossein; Barzegar, Zeynab; Isaai, M. T.
The most important event following the establishment of the Internet was the Web, introduced by Tim Berners-Lee. Websites give their owners the means to publish and share content with users and visitors. In the last 5 years, we have seen changes in the way the web is used: users want to participate in content sharing and to interact with each other. This is known as Web 2.0. In the last year, Web 2.0 has reached maturity, and we now need a smart web, which will accordingly be called Web 3.0. Web 3.0 is based on the semantic web. Changing the way the web is used has had a clear impact on E-Tourism and its development, and also on business models. In this paper, we review the definitions and describe the impacts of web evolution on E-Tourism.
NASA Astrophysics Data System (ADS)
Zhai, Dongsheng; Liu, Chen
Since 2005, the term Web 2.0 has gradually become a hot topic on the Internet. Web 2.0 lets ordinary users, as distinct from webmasters or web coders, create web content. Web 2.0 has entered our work and our lives and has even become an indispensable part of our web life. Its applications are already widespread in many fields on the Internet. So far, China has about 137 million netizens [1]; its Web 2.0 market is therefore so attractive that much venture capital flows into it, and many new Web 2.0 companies have appeared in China. However, the development of Web 2.0 in China is accompanied by some problems and obstacles. In this paper, we will mainly discuss Web 2.0 applications in China, with their current problems and future development trends.
Dosimeter for monitoring vapors and aerosols of organic compounds
Vo-Dinh, T.
1987-07-14
A dosimeter is provided for collecting and detecting vapors and aerosols of organic compounds. The dosimeter comprises a lightweight, passive device that can be conveniently worn by a person as a badge or placed at a stationary location. The dosimeter includes a sample collector comprising a porous web treated with a chemical for inducing molecular displacement and enhancing phosphorescence. Compounds are collected onto the web by molecular diffusion. The web also serves as the sample medium for detecting the compounds by a room temperature phosphorescence technique. 7 figs.
The Web Resource Collaboration Center
ERIC Educational Resources Information Center
Dunlap, Joanna C.
2004-01-01
The Web Resource Collaboration Center (WRCC) is a web-based tool developed to help software engineers build their own web-based learning and performance support systems. Designed using various online communication and collaboration technologies, the WRCC enables people to: (1) build a learning and professional development resource that provides…
The Namibia Early Flood Warning System, A CEOS Pilot Project
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Frye, Stuart; Cappelaere, Pat; Sohlberg, Robert; Handy, Matthew; Grossman, Robert
2012-01-01
Over the past few years, an international collaboration has developed a pilot project under the auspices of the Committee on Earth Observation Satellites (CEOS) Disasters team. The overall team consists of civilian satellite agencies. For this pilot effort, the development team consists of NASA, the Canadian Space Agency, the University of Maryland, the University of Colorado, the University of Oklahoma, the Ukraine Space Research Institute, and the Joint Research Centre (JRC) of the European Commission. This development team collaborates with regional, national, and international agencies to deliver end-to-end disaster coverage. In particular, the team is collaborating on this effort with the Namibia Department of Hydrology, beginning in Namibia; the ultimate goal, however, is to expand the functionality to provide early warning over the Southern Africa region. The collaboration was initiated by the United Nations Office for Outer Space Affairs and the CEOS Working Group on Information Systems and Services (WGISS). The initial driver was to demonstrate international interoperability using various space agency sensors and models along with regional in-situ ground sensors. In 2010, the team created a preliminary semi-manual system to demonstrate moving and combining key data streams and delivering the data to the Namibia Department of Hydrology during the flood season, which typically runs January through April. In this pilot, a variety of moderate- and high-resolution satellite flood imagery was rapidly delivered and used in conjunction with flood predictive models in Namibia. This imagery was collected in conjunction with ground measurements and used to examine how to create a customized flood early warning system. During the first year, the team used SensorWeb technology to gather sensor data for monitoring flood waves traveling down basins that originate in Angola but eventually flood villages in Namibia.
The team made use of standardized interfaces such as those articulated under the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) set of web services [1][2]. However, it was discovered that making a system like this functional raises many performance issues. Data sets were large, located in a variety of locations behind firewalls, and had to be accessed across open networks, so security was an issue. Furthermore, network access acted as a bottleneck when transferring map products to where they were needed. Finally, during disasters many users and computer processes act in parallel, so it was very easy to overload the single string of computers stitched together in the virtual system that was initially developed. To address some of these performance issues, the team partnered with the Open Cloud Consortium (OCC), which supplied a computation Cloud located at the University of Illinois at Chicago and some manpower to administer it. The Flood SensorWeb [3] system was interfaced to the Cloud to provide a high-performance user interface and product development engine. Figure 1 shows the functional diagram of the Flood SensorWeb. Figure 2 shows some of the functionality of the integrated computation Cloud. A significant portion of the original system was ported to the Cloud, and during the past year technical issues were resolved, including web access to the Cloud, security over the open Internet, initial experiments on handling surge capacity by using the Cloud's virtual machines in parallel, tiling techniques to render large data sets as map layers, interfaces allowing users to customize the data processing/product chain, and other performance-enhancing techniques. The conclusion of this effort, and of this presentation, is that defining the interoperability standards is a small fraction of the work.
For example, once open web service standards were defined, many users could not make use of them due to security restrictions. Furthermore, once an interoperable system is functional, a surge of users can render it unusable, especially in the disaster domain.
Marx, Sabrina; Phalkey, Revati; Aranda-Jan, Clara B; Profe, Jörn; Sauerborn, Rainer; Höfle, Bernhard
2014-11-20
Childhood malnutrition is a serious challenge in Sub-Saharan Africa (SSA) and a major underlying cause of death. It results from a dynamic and complex interaction between political, social, economic, environmental, and other factors. As spatially oriented research has become established in the health sciences in recent years, developments in Geographic Information Science (GIScience) provide beneficial tools for an improved understanding of malnutrition. To assess the current state of knowledge regarding the use of geoinformation analyses for exploring malnutrition in SSA, a systematic review of the peer-reviewed literature was conducted using Scopus, ISI Web of Science, and PubMed. As a supplement to the review, we also investigate the establishment of web-based geoportals providing freely accessible malnutrition geodata to a broad community. Based on these findings, we identify current limitations and discuss how new developments in GIScience might help to overcome them. Of the 563 articles identified by the searches, a total of nine articles and eight geoportals met the inclusion criteria. The review suggests that the spatial dimension of malnutrition is analyzed most often at the regional and national levels using geostatistical methods. In these studies, heterogeneous geographic information at different spatial scales and from multiple sources is combined by applying geoinformation analysis methods such as spatial interpolation, aggregation, and downscaling. Geocoded malnutrition data from the Demographic and Health Surveys Program are the most common source for quantifying the prevalence of malnutrition at a local scale and are frequently combined with regional data on climate, population, agriculture, and/or infrastructure. Only aggregated geoinformation about malnutrition prevalence is freely accessible, mostly displayed via web map visualizations or downloadable map images.
The lack of detailed geographic data at the household and local levels is a major limitation for an in-depth assessment of malnutrition and its links to potential impact factors. We propose that combining malnutrition-related studies with recent GIScience developments, such as crowd-sourced geodata collection, (web-based) interoperable spatial health data infrastructures, and (dynamic) information fusion approaches, would deepen the understanding of this complex phenomenon.
Genes2Networks: connecting lists of gene symbols using mammalian protein interactions databases.
Berger, Seth I; Posner, Jeremy M; Ma'ayan, Avi
2007-10-04
In recent years, mammalian protein-protein interaction network databases have been developed. The interactions in these databases are either extracted manually from low-throughput experimental biomedical research literature, extracted automatically from the literature using techniques such as natural language processing (NLP), generated experimentally using high-throughput methods such as yeast two-hybrid screens, or predicted using an assortment of computational approaches. Genes or proteins identified as significantly changing in proteomic experiments, or as susceptibility disease genes in genomic studies, can be placed in the context of protein interaction networks in order to assign them to pathways and protein complexes. Genes2Networks is a software system that integrates the content of ten mammalian interaction network datasets. Filtering techniques to prune low-confidence interactions were implemented. Genes2Networks is delivered as a web-based service using AJAX. The system can be used to extract relevant subnetworks created from "seed" lists of human Entrez gene symbols. The output includes a dynamic, linkable, three-color web-based network map with a statistical analysis report that identifies significant intermediate nodes used to connect the seed list. Genes2Networks is a powerful web-based tool that can help experimental biologists interpret lists of genes and proteins, such as those commonly produced through genomic and proteomic experiments, as well as lists of genes and proteins associated with disease processes. The system can be used to find relationships between genes and proteins from seed lists and to predict additional genes or proteins that may play key roles in common pathways or protein complexes.
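The core operation described above, connecting a seed list through intermediate nodes in an interaction network, can be sketched with a minimal breadth-first search in Python. This is an illustration only, not Genes2Networks' actual algorithm or its statistical ranking; the interaction pairs and path-length cutoff are assumptions for the example.

```python
from collections import deque

def connect_seeds(interactions, seeds, max_path_len=2):
    """Return the subnetwork edges linking seed genes through shared
    intermediate nodes, keeping paths of at most max_path_len edges."""
    # Build an adjacency map from undirected interaction pairs.
    adj = {}
    for a, b in interactions:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)

    edges, intermediates = set(), set()
    for seed in seeds:
        # Breadth-first search outward from each seed up to max_path_len.
        dist, parent = {seed: 0}, {seed: None}
        queue = deque([seed])
        while queue:
            node = queue.popleft()
            if dist[node] == max_path_len:
                continue
            for nb in adj.get(node, ()):
                if nb not in dist:
                    dist[nb] = dist[node] + 1
                    parent[nb] = node
                    queue.append(nb)
        # Keep the path back to this seed for every other seed reached.
        for other in seeds:
            if other != seed and other in dist:
                node = other
                while parent[node] is not None:
                    edges.add(frozenset((node, parent[node])))
                    if node not in seeds:
                        intermediates.add(node)
                    node = parent[node]
    return edges, intermediates
```

For example, with seeds A and B and an interaction path A-X-B, the sketch returns the two connecting edges and reports X as the intermediate node.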
Using a web-based survey tool to undertake a Delphi study: application for nurse education research.
Gill, Fenella J; Leslie, Gavin D; Grech, Carol; Latour, Jos M
2013-11-01
The Internet is increasingly being used as a data collection medium to access research participants. This paper reports on the experience and value of using web-survey software to conduct an eDelphi study to develop Australian critical care course graduate practice standards. The eDelphi technique used involved the iterative process of administering three rounds of surveys to a national expert panel. The survey was developed online using SurveyMonkey. Panel members responded to statements using one rating scale for round one and two scales for rounds two and three. Text boxes for panel comments were provided. For each round, the SurveyMonkey's email tool was used to distribute an individualized email invitation containing the survey web link. The distribution of panel responses, individual responses and a summary of comments were emailed to panel members. Stacked bar charts representing the distribution of responses were generated using the SurveyMonkey software. Panel response rates remained greater than 85% over all rounds. An online survey provided numerous advantages over traditional survey approaches including high quality data collection, ease and speed of survey administration, direct communication with the panel and rapid collation of feedback allowing data collection to be undertaken in 12 weeks. Only minor challenges were experienced using the technology. Ethical issues, specific to using the Internet to conduct research and external hosting of web-based software, lacked formal guidance. High response rates and an increased level of data quality were achieved in this study using web-survey software and the process was efficient and user-friendly. However, when considering online survey software, it is important to match the research design with the computer capabilities of participants and recognize that ethical review guidelines and processes have not yet kept pace with online research practices. Copyright © 2013 Elsevier Ltd. All rights reserved.
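The round-by-round feedback such an eDelphi process relies on can be summarized programmatically. The sketch below is a hedged illustration: the rating scale, the agreement cut-off of 6, and the 75% consensus threshold are assumptions for the example, not the study's actual criteria.

```python
def round_summary(ratings, agree_min=6, consensus=0.75):
    """Summarize one Delphi round: for each statement, the share of the
    panel rating it at or above agree_min, and whether that share meets
    the consensus threshold (both cut-offs are illustrative)."""
    summary = {}
    for statement, scores in ratings.items():
        share = sum(1 for s in scores if s >= agree_min) / len(scores)
        summary[statement] = {"agreement": round(share, 2),
                              "retain": share >= consensus}
    return summary
```

A summary like this, distributed back to the panel between rounds, is what lets members reconsider their ratings against the group's distribution.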
NASA Astrophysics Data System (ADS)
Colli, A.; Spadaro, G.
2012-04-01
C.R.E.A. (Reference Centre for Environmental Education), sponsored by the Lombardia Region (Italy), is a reference point for environmental education. Every year, different activities and laboratories are offered to Pavia's schools in collaboration with science teachers, in particular those of ANISN. The wide range of material and techniques in the geo and environmental sciences, the speed with which the discipline is developing, and the diversity of students require a wide range of teaching approaches, including inquiry-, technology-, data-, field-, and game-based activities. The purpose of teaching is not only to provide students with detailed skills and knowledge, but also to develop their capacity for critical thinking and for dealing with controversial issues in a balanced and sensitive manner. An "active", research-based teaching and learning style that brings young people to reflect and act on issues of vital importance for their future is the WebQuest, an inquiry-oriented lesson format in which most or all of the information that learners work with comes from the web. In the project Mothership Earth, students critically evaluate information they find on the web about geo and environmental issues and design actions using the information gathered. In November 2011, the teachers and students of a middle school in Bereguardo, a small town near Pavia, decided to create a WebQuest about the ecological footprint. They calculated the footprint of the food they eat every day, along with the water used (the water footprint). The WebQuest influenced students' learning performance positively. They discovered that animal foods require a lot of water and have a very large footprint, so they decided to change their "diet" to make their footprint smaller. In the future, we will extend the project by proposing outdoor WebQuests: in real situations, students can acquire much more knowledge and experience, with the aim of improving their lifestyles.
Every year, laboratories on "Water health" are run with pupils in Pavia's schools (ages 6-18). "Where is our water?" is a book written by CREA collaborators, a useful tool for teachers to engage students in constructing knowledge, skills, and values from direct experience, with laboratories and practical activities based on experiential learning (learning by doing).
NASA Astrophysics Data System (ADS)
McAuliffe, C.; Ledley, T.; Dahlman, L.; Haddad, N.
2007-12-01
One of the challenges faced by Earth science teachers, particularly in K-12 settings, is that of connecting scientific research to classroom experiences. Helping teachers and students analyze Web-based scientific data is one way to bring scientific research to the classroom. The Earth Exploration Toolbook (EET) was developed as an online resource to accomplish precisely that. The EET consists of chapters containing step-by-step instructions for accessing Web-based scientific data and for using a software analysis tool to explore issues or concepts in science, technology, and mathematics. For example, in one EET chapter, users download Earthquake data from the USGS and bring it into a geographic information system (GIS), analyzing factors affecting the distribution of earthquakes. The goal of the EET Workshops project is to provide professional development that enables teachers to incorporate Web-based scientific data and analysis tools in ways that meet their curricular needs. In the EET Workshops project, Earth science teachers participate in a pair of workshops that are conducted in a combined teleconference and Web-conference format. In the first workshop, the EET Data Analysis Workshop, participants are introduced to the National Science Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). They also walk through an Earth Exploration Toolbook (EET) chapter and discuss ways to use Earth science datasets and tools with their students. In a follow-up second workshop, the EET Implementation Workshop, teachers share how they used these materials in the classroom by describing the projects and activities that they carried out with students. The EET Workshops project offers unique and effective professional development. Participants work at their own Internet-connected computers, and dial into a toll-free group teleconference for step-by-step facilitation and interaction. 
They also receive support via Elluminate, a Web-conferencing software program. The software allows participants to see the facilitator's computer as the analysis techniques of an EET chapter are demonstrated. If needed, the facilitator can also view individual participants' computers to assist with technical difficulties. In addition, it enables a large number of often widely distributed end users to engage in interactive, real-time instruction. In this presentation, we will describe the elements of an EET Workshop pair, highlighting the capabilities and use of Elluminate. We will share lessons learned through several years of conducting this type of professional development, as well as findings from survey data gathered from teachers who have participated in our workshops.
Work of the Web Weavers: Web Development in Academic Libraries
ERIC Educational Resources Information Center
Bundza, Maira; Vander Meer, Patricia Fravel; Perez-Stable, Maria A.
2009-01-01
Although the library's Web site has become a standard tool for seeking information and conducting research in academic institutions, there are a variety of ways libraries approach the often challenging--and sometimes daunting--process of Web site development and maintenance. Three librarians at Western Michigan University explored issues related…
Katherine Fleming, Database and Web Applications Engineer, works on database and web application development in the Commercial Buildings Research group. Previously, Katherine was pursuing a Ph.D. with a focus on robotics and working as a Web developer with a focus on Web accessibility.
Digital Discernment: An E-Commerce Web Site Evaluation Tool
ERIC Educational Resources Information Center
Sigman, Betsy Page; Boston, Brian J.
2013-01-01
Students entering the business workforce today may well share some responsibility for developing, revising, or evaluating their company's Web site. They may lack the experience, however, to critique their employer's Web presence effectively. The purpose of developing Digital Discernment, an e-commerce Web site evaluation tool, was to prepare…
NASA Astrophysics Data System (ADS)
Chidburee, P.; Mills, J. P.; Miller, P. E.; Fieber, K. D.
2016-06-01
Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). 
Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local-scale.
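The accuracy comparison between a reconstructed point cloud and a reference cloud can be illustrated with a simple nearest-neighbour RMSE check. This is a brute-force sketch for small clouds; a real pipeline would first register the clouds and use spatial indexing, both of which are omitted here.

```python
import math

def cloud_rmse(test_cloud, reference_cloud):
    """Root-mean-square of nearest-neighbour distances from each test
    point to the reference cloud (brute force; fine for small clouds)."""
    def nearest_dist(p, cloud):
        return min(math.dist(p, q) for q in cloud)

    errs = [nearest_dist(p, reference_cloud) for p in test_cloud]
    return math.sqrt(sum(e * e for e in errs) / len(errs))
```

With per-point errors on the order of a few centimetres, the RMSE comes out at the centimetre level, which is the kind of figure the assessment above reports.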
[A Quality Assurance (QA) System with a Web Camera for High-dose-rate Brachytherapy].
Hirose, Asako; Ueda, Yoshihiro; Oohira, Shingo; Isono, Masaru; Tsujii, Katsutomo; Inui, Shouki; Masaoka, Akira; Taniguchi, Makoto; Miyazaki, Masayoshi; Teshima, Teruki
2016-03-01
A quality assurance (QA) system that simultaneously quantifies the position and duration of a ¹⁹²Ir source (dwell position and time) was developed, and its performance was evaluated in high-dose-rate brachytherapy. The QA system uses a web camera to verify and quantify dwell position and time. The web camera records 30 images per second over a range from 1,425 mm to 1,505 mm. A user verifies the source position from the web camera in real time. The source position and duration were quantified from the video using in-house software that applies a template-matching technique. This QA system allows verification of the absolute position in real time and simultaneous quantification of dwell position and time. Verification of the system showed that the mean step-size error was 0.31±0.1 mm and the mean dwell-time error 0.1±0.0 s. Absolute position errors can be determined with an accuracy of 1.0 mm at all dwell points for three step sizes, and dwell-time errors with an accuracy of 0.1% for planned times longer than 10.0 s. The system provides quick verification and quantification of dwell position and time with high accuracy at various dwell positions, independent of step size.
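The template-matching step used to localize the source can be illustrated in one dimension with normalized cross-correlation over an intensity profile. The abstract does not describe the in-house software's actual image pipeline, so the data and the 1-D simplification below are assumptions for illustration only.

```python
def match_position(profile, template):
    """Return the index where the template best matches a 1-D
    intensity profile, using normalized cross-correlation (NCC)."""
    def ncc(window, tmpl):
        mw = sum(window) / len(window)
        mt = sum(tmpl) / len(tmpl)
        num = sum((w - mw) * (t - mt) for w, t in zip(window, tmpl))
        den = (sum((w - mw) ** 2 for w in window)
               * sum((t - mt) ** 2 for t in tmpl)) ** 0.5
        return num / den if den else 0.0  # flat window: no correlation

    n = len(template)
    scores = [ncc(profile[i:i + n], template)
              for i in range(len(profile) - n + 1)]
    return max(range(len(scores)), key=scores.__getitem__)
```

In a camera-based system, the best-match index in each frame maps (via calibration) to a physical dwell position, and the number of consecutive frames at one position gives the dwell time.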
Web-based Food Behaviour Questionnaire: validation with grades six to eight students.
Hanning, Rhona M; Royall, Dawna; Toews, Jenn E; Blashill, Lindsay; Wegener, Jessica; Driezen, Pete
2009-01-01
The web-based Food Behaviour Questionnaire (FBQ) includes a 24-hour diet recall, a food frequency questionnaire, and questions addressing knowledge, attitudes, intentions, and food-related behaviours. The survey has been revised since it was developed and initially validated. The current study was designed to obtain qualitative feedback and to validate the FBQ diet recall. "Think aloud" techniques were used in cognitive interviews with dietitian experts (n=11) and grade six students (n=21). Multi-ethnic students (n=201) in grades six to eight at urban southern Ontario schools completed the FBQ and, subsequently, one-on-one diet recall interviews with trained dietitians. Food group and nutrient intakes were compared. Users provided positive feedback on the FBQ. Suggestions included adding more foods, more photos for portion estimation, and online student feedback. Energy and nutrient intakes were positively correlated between the FBQ and dietitian interviews, overall and by gender and grade (all p<0.001). Intraclass correlation coefficients were ≥0.5 for energy and macronutrients, although the web-based survey underestimated energy (10.5%) and carbohydrate (-15.6%) intakes (p<0.05). Underestimation of rice and pasta portions on the web accounted for 50% of this discrepancy. The FBQ is valid, relative to 24-hour recall interviews, for dietary assessment in diverse populations of Ontario children in grades six to eight.
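The basic validation statistic used here, the correlation between paired intake estimates from the web survey and the dietitian interviews, can be computed as a plain Pearson correlation. This is a sketch of that one statistic only; the study also reports intraclass correlation coefficients, which are not shown.

```python
import math

def pearson_r(x, y):
    """Pearson correlation between paired measurements, e.g. nutrient
    intakes from a web survey versus dietitian interviews."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A value near +1 indicates the two instruments rank participants' intakes consistently, even if one systematically underestimates absolute amounts, which is why bias (percent difference) is reported separately from correlation.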
[A web-based integrated clinical database for laryngeal cancer].
E, Qimin; Liu, Jialin; Li, Yong; Liang, Chuanyu
2014-08-01
To establish an integrated database for laryngeal cancer and to provide an information platform for clinical and fundamental research on laryngeal cancer, meeting the needs of both clinical and scientific use. Under the guidance of clinical experts, we constructed a web-based integrated clinical database for laryngeal carcinoma on the basis of clinical data standards and Apache+PHP+MySQL technology, incorporating laryngeal cancer specialist characteristics and tumor genetic information. A web-based integrated clinical database for laryngeal carcinoma was developed. The database has a user-friendly interface, and data can be entered and queried conveniently. In addition, the system follows clinical data standards and exchanges information with the existing electronic medical records system to avoid information silos. Furthermore, the database forms integrate laryngeal cancer specialist characteristics and tumor genetic information. The web-based integrated clinical database for laryngeal carcinoma offers comprehensive specialist information, strong expandability, and high technical feasibility, and conforms to the clinical characteristics of the laryngeal cancer specialty. Using clinical data standards and structured handling of clinical data, the database can better meet the needs of scientific research and facilitate information exchange, and the information collected about tumor patients is highly informative. In addition, users can access and manipulate the database conveniently and rapidly over the Internet.
Ontology Alignment Architecture for Semantic Sensor Web Integration
Fernandez, Susel; Marsa-Maestre, Ivan; Velasco, Juan R.; Alarcos, Bernardo
2013-01-01
Sensor networks are a concept that has become very popular in data acquisition and processing for multiple applications in different fields such as industrial, medicine, home automation, environmental detection, etc. Today, with the proliferation of small communication devices with sensors that collect environmental data, semantic Web technologies are becoming closely related with sensor networks. The linking of elements from Semantic Web technologies with sensor networks has been called Semantic Sensor Web and has among its main features the use of ontologies. One of the key challenges of using ontologies in sensor networks is to provide mechanisms to integrate and exchange knowledge from heterogeneous sources (that is, dealing with semantic heterogeneity). Ontology alignment is the process of bringing ontologies into mutual agreement by the automatic discovery of mappings between related concepts. This paper presents a system for ontology alignment in the Semantic Sensor Web which uses fuzzy logic techniques to combine similarity measures between entities of different ontologies. The proposed approach focuses on two key elements: the terminological similarity, which takes into account the linguistic and semantic information of the context of the entity's names, and the structural similarity, based on both the internal and relational structure of the concepts. This work has been validated using sensor network ontologies and the Ontology Alignment Evaluation Initiative (OAEI) tests. The results show that the proposed techniques outperform previous approaches in terms of precision and recall. PMID:24051523
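The combination of terminological and structural similarity described above can be sketched with a toy fuzzy-style aggregation. This is an illustration only: the token-overlap measures, the 0.7 threshold, and the three rules below are assumptions for the example, not the authors' actual fuzzy logic system.

```python
def jaccard(a, b):
    """Overlap between two collections, as a score in [0, 1]."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def combined_similarity(name1, name2, neighbors1, neighbors2, high=0.7):
    """Combine a terminological score (token overlap of entity names)
    and a structural score (overlap of related concepts) with simple
    fuzzy-style rules: reinforce when both agree, average otherwise."""
    term = jaccard(name1.lower().split("_"), name2.lower().split("_"))
    struct = jaccard(neighbors1, neighbors2)
    if term >= high and struct >= high:        # both strong: optimistic
        return max(term, struct)
    if term < 1 - high and struct < 1 - high:  # both weak: pessimistic
        return min(term, struct)
    return (term + struct) / 2                 # mixed evidence: average
```

The design point the paper's approach shares with this sketch is that neither name similarity nor structural similarity alone is trusted; the aggregation rule decides how much each piece of evidence counts.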
Worobey, Lynn A; Rigot, Stephanie K; Hogaboom, Nathan S; Venus, Chris; Boninger, Michael L
2018-01-01
To determine the efficacy of a web-based transfer training module at improving transfer technique across 3 groups: web-based training, in-person training (the current standard of practice), and a waitlist control group (WLCG); and secondarily, to determine subject factors that can be used to predict improvements in transfer ability after training. Randomized controlled trial. Summer and winter sporting events for disabled veterans. A convenience sample (N=71) of manual and power wheelchair users who could transfer independently. An individualized, in-person transfer training session or a web-based transfer training module; the WLCG received the web training at their follow-up visit. Transfer Assessment Instrument (TAI) part 1 score was used to assess transfers at baseline, at skill acquisition immediately posttraining, and at skill retention after a 1- to 2-day follow-up period. The in-person and web-based training groups improved their median (interquartile range) TAI scores from 7.98 (7.18-8.46) to 9.13 (8.57-9.58; P<.01) and from 7.14 (6.15-7.86) to 9.23 (8.46-9.82; P<.01), respectively, compared with the WLCG, which had a median score of 7.69 at both assessments (baseline, 6.15-8.46; follow-up control, 5.83-8.46). Participants retained improvements at follow-up (P>.05). A lower initial TAI score was the only significant predictor of a larger percent change in TAI score after training. Transfer training can improve technique, with changes retained within a short follow-up window, even among experienced wheelchair users. Web-based transfer training demonstrated improvements comparable to in-person training. With almost half of the United States population consulting online resources before a health care professional, web-based training may be an effective method to increase knowledge translation. Copyright © 2017 American Congress of Rehabilitation Medicine. All rights reserved.
Wang, Weiqi; Wang, Yanbo Justin; Bañares-Alcántara, René; Coenen, Frans; Cui, Zhanfeng
2009-12-01
In this paper, data mining is used to analyze data on the differentiation of mammalian mesenchymal stem cells (MSCs), aiming to discover known and hidden rules governing MSC differentiation, following the establishment of a web-based public database containing experimental data on MSC proliferation and differentiation. To this end, a web-based public interactive database comprising the key parameters that influence the fate and destiny of mammalian MSCs has been constructed and analyzed using Classification Association Rule Mining (CARM) as the data-mining technique. The results show that the proposed approach is technically feasible and performs well with respect to the accuracy of (classification) prediction. Key rules mined from the constructed MSC database are consistent with experimental observations, indicating the validity of the method developed and a first step in the application of data mining to the study of MSCs.
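The CARM step can be illustrated with a toy miner that extracts "itemset implies class" rules meeting support and confidence thresholds. This is a sketch only: the attribute names, the thresholds, and the restriction to one- and two-item antecedents are assumptions for the example, not the paper's actual parameters.

```python
from itertools import combinations

def mine_class_rules(records, min_support=0.4, min_confidence=0.7):
    """Toy Classification Association Rule Mining: each record is
    (set_of_attributes, class_label); return (itemset, label, support,
    confidence) rules meeting both thresholds."""
    n = len(records)
    rules = []
    # Enumerate candidate itemsets of size 1 and 2 (toy-scale only;
    # real CARM uses frequent-itemset pruning to scale further).
    items = sorted({i for attrs, _ in records for i in attrs})
    candidates = [frozenset(c) for k in (1, 2)
                  for c in combinations(items, k)]
    labels = {label for _, label in records}
    for itemset in candidates:
        covered = [lab for attrs, lab in records if itemset <= attrs]
        if not covered or len(covered) / n < min_support:
            continue
        for label in labels:
            conf = covered.count(label) / len(covered)
            if conf >= min_confidence:
                rules.append((itemset, label, len(covered) / n, conf))
    return rules
```

Applied to records pairing hypothetical culture-condition attributes with a differentiation outcome, the miner keeps only rules whose antecedent appears often enough (support) and predicts the outcome reliably enough (confidence), which is the property that makes mined rules checkable against experimental observations.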
Revenäs, Åsa; Opava, Christina H; Martin, Cathrin; Demmelmaier, Ingrid; Keller, Christina; Åsenlöf, Pernilla
2015-02-09
Long-term adherence to physical activity recommendations remains challenging for most individuals with rheumatoid arthritis (RA) despite evidence for its health benefits. The aim of this study was to provide basic data on system requirement specifications for a Web-based and mobile app to self-manage physical activity. More specifically, we explored the target user group, features of the future app, and correlations between the system requirements and the established behavior change techniques (BCTs). We used a participatory action research design. Qualitative data were collected using multiple methods in four workshops. Participants were 5 individuals with RA, a clinical physiotherapist, an officer from the Swedish Rheumatism Association, a Web designer, and 2 physiotherapy researchers. A taxonomy was used to determine the degree of correlation between the system requirements and established BCTs. Participants agreed that the future Web-based and mobile app should be based on two major components important for maintaining physical activity: (1) a calendar feature for goal setting, planning, and recording of physical activity performance and progress, and (2) a small community feature for positive feedback and support from peers. All system requirements correlated with established BCTs, which were coded as 24 different BCTs. To our knowledge, this study is the first to involve individuals with RA as co-designers, in collaboration with clinicians, researchers, and Web designers, to produce basic data to generate system requirement specifications for an eHealth service. The system requirements correlated to the BCTs, making specifications of content and future evaluation of effectiveness possible.
Rule-based statistical data mining agents for an e-commerce application
NASA Astrophysics Data System (ADS)
Qin, Yi; Zhang, Yan-Qing; King, K. N.; Sunderraman, Rajshekhar
2003-03-01
Intelligent data mining techniques have useful e-Business applications. Because an e-Commerce application is related to multiple domains such as statistical analysis, market competition, price comparison, profit improvement and personal preferences, this paper presents a hybrid knowledge-based e-Commerce system fusing intelligent techniques, statistical data mining, and personal information to enhance QoS (Quality of Service) of e-Commerce. A Web-based e-Commerce application software system, eDVD Web Shopping Center, is successfully implemented using Java servlets and an Oracle8i database server. Simulation results have shown that the hybrid intelligent e-Commerce system is able to make smart decisions for different customers.
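The fusion of price statistics with personal preferences could be sketched, in outline only, as a weighted scoring rule over offers. The weights, field names, and offers below are invented for illustration and do not reflect the paper's servlet-based design.

```python
# Illustrative ranking rule for an eDVD-style shop: combine price
# competitiveness with a customer's genre preferences. All names and
# weights here are assumptions for the sketch.
def score_offer(offer, preferences, market_low, weight_price=0.6):
    # Price score: 1.0 at the market-low price, decreasing as price rises.
    price_score = market_low / offer["price"]
    pref_score = preferences.get(offer["genre"], 0.0)
    return weight_price * price_score + (1 - weight_price) * pref_score

offers = [
    {"title": "DVD A", "genre": "sci-fi", "price": 12.0},
    {"title": "DVD B", "genre": "drama",  "price": 10.0},
]
preferences = {"sci-fi": 1.0, "drama": 0.2}
market_low = min(o["price"] for o in offers)

ranked = sorted(offers, key=lambda o: score_offer(o, preferences, market_low),
                reverse=True)
print([o["title"] for o in ranked])  # a strong genre preference outweighs a small price gap
```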
Web-based technical assistance and training to promote community tobacco control policy change.
Young, Walter F; Montgomery, Debbie; Nycum, Colleen; Burns-Martin, Lavon; Buller, David B
2006-01-01
In 1998 a settlement released the tobacco industry from claims and provided monetary relief for states. A significant expansion of tobacco control activity in many states created a need to develop local capacity. Technical assistance and training for new and experienced staff became a significant challenge for tobacco control leadership. In Colorado, this challenge was addressed in part through the development of a technical assistance and training Web site designed for local tobacco control staff and coalition members. Researchers, technical Web site development specialists, state health agency staff, and state tobacco control coalition staff collaborated to develop, promote, and test the efficacy of this Web site. The work group embodied a range of skills including tobacco control, Web site technical development, marketing, training, and project management. Persistent marketing, updating of Web site content, and institutionalizing the site as a principal source of information and training were key to its use by community coalition members.
Web-based learning resources - new opportunities for competency development.
Moen, Anne; Nygård, Kathrine A; Gauperaa, Torunn
2009-01-01
Creating web-based learning environments holds great promise for on-the-job training and competence development in nursing. The web-based learning environment was designed and customized by four professional development nurses. We interviewed five RNs who pilot-tested the web-based resource. Our findings give some insight into how the web-based design tool is perceived and utilized, and how content is represented in the learning environment. From a competency development perspective, practicing authentic tasks in a web-based learning environment can be useful to train skills and keep up important routines. The approach found in this study also needs careful consideration. Emphasizing routines and skills can be important to reduce variation and ensure more streamlined practice as part of institution-wide quality improvement efforts. How the emphasis on routines and skills plays out for the individual's overall professional development needs further careful study.
AMP: a science-driven web-based application for the TeraGrid
NASA Astrophysics Data System (ADS)
Woitaszek, M.; Metcalfe, T.; Shorrock, I.
The Asteroseismic Modeling Portal (AMP) provides a web-based interface for astronomers to run and view simulations that derive the properties of Sun-like stars from observations of their pulsation frequencies. In this paper, we describe the architecture and implementation of AMP, highlighting the lightweight design principles and tools used to produce a functional fully-custom web-based science application in less than a year. Targeted as a TeraGrid science gateway, AMP's architecture and implementation are intended to simplify its orchestration of TeraGrid computational resources. AMP's web-based interface was developed as a traditional standalone database-backed web application using the Python-based Django web development framework, allowing us to leverage the Django framework's capabilities while cleanly separating the user interface development from the grid interface development. We have found this combination of tools flexible and effective for rapid gateway development and deployment.
The Development of Interactive World Wide Web Based Teaching Material in Forensic Science.
ERIC Educational Resources Information Center
Daeid, Niamh Nic
2001-01-01
Describes the development of a Web-based tutorial in the forensic science teaching program at the University of Strathclyde (Scotland). Highlights include the theoretical basis for course development; objectives; Web site design; student feedback; and staff feedback. (LRW)
Social Dimension of WEB 2.0 in Teacher Education: Focus on Peer-Learning
ERIC Educational Resources Information Center
Zascerinska, Jelena; Ahrens, Andreas
2010-01-01
The research deals with the analysis of efficiency of teaching techniques with the use of the social dimension of Web 2.0 within the English for Specific Purposes course in pre-school and primary teacher education that would help students to become more cognizant and more responsive to the emerging needs of the market for educational services and…
Cold Calling and Web Postings: Do They Improve Students' Preparation and Learning in Statistics?
ERIC Educational Resources Information Center
Levy, Dan
2014-01-01
Getting students to prepare well for class is a common challenge faced by instructors all over the world. This study investigates the effects that two frequently used techniques to increase student preparation--web postings and cold calling--have on student outcomes. The study is based on two experiments and a qualitative study conducted in a…
ERIC Educational Resources Information Center
Kaufman, Madeline
In response to low reading scores among first grade students of English as a Second Language (ESL) in one inner-city school, the teaching techniques of semantic webbing and brainstorming were used to improve student reading skills. Subjects were eight first grade ESL students. Pretests were administered to assess student levels of reading…
Using the Web To Deliver and Enhance Classes: Two Case Studies.
ERIC Educational Resources Information Center
Helford, Paul Q.; Lei, Richard M.
This paper discusses two case studies conducted at Northern Arizona University. The studies are from classes that are using the World Wide Web to enhance teaching and learning. One class is the Art of Cinema, a film studies class that has been taught via Instructional Television (ITV) for five years. Various techniques have been used over the…
Mining Student Data Captured from a Web-Based Tutoring Tool: Initial Exploration and Results
ERIC Educational Resources Information Center
Merceron, Agathe; Yacef, Kalina
2004-01-01
In this article we describe the initial investigations that we have conducted on student data collected from a web-based tutoring tool. We have used some data mining techniques such as association rule and symbolic data analysis, as well as traditional SQL queries to gain further insight on the students' learning and deduce information to improve…
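The "traditional SQL queries" mentioned above can be illustrated with a toy attempt log in SQLite. The schema and data are invented for the sketch, since the article does not publish its database layout.

```python
import sqlite3

# Toy log of student attempts on a web-based tutoring tool (invented data).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE attempts (student TEXT, exercise TEXT, correct INT)")
con.executemany("INSERT INTO attempts VALUES (?, ?, ?)", [
    ("s1", "ex1", 0), ("s1", "ex2", 1), ("s2", "ex1", 0),
    ("s2", "ex2", 0), ("s3", "ex1", 1), ("s3", "ex2", 1),
])

# A plain SQL query of the kind the abstract mentions:
# which exercises have the highest failure rate?
rows = con.execute("""
    SELECT exercise, AVG(1 - correct) AS failure_rate
    FROM attempts
    GROUP BY exercise
    ORDER BY failure_rate DESC
""").fetchall()
print(rows)
```

Association-rule mining would go a step further, e.g. flagging pairs of exercises that students tend to fail together, which a GROUP BY alone does not capture.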
NASA Technical Reports Server (NTRS)
Fieldler, F. S.; Ast, D.
1982-01-01
Experimental techniques for the preparation of electron beam induced current samples of web-dendritic silicon are described. Both as-grown and processed material were investigated. High density dislocation networks were found close to twin planes in the bulk of the material. The electrical activity of these networks is reduced in processed material.
Scalability Issues for Remote Sensing Infrastructure: A Case Study
Liu, Yang; Picard, Sean; Williamson, Carey
2017-01-01
For the past decade, a team of University of Calgary researchers has operated a large “sensor Web” to collect, analyze, and share scientific data from remote measurement instruments across northern Canada. This sensor Web receives real-time data streams from over a thousand Internet-connected sensors, with a particular emphasis on environmental data (e.g., space weather, auroral phenomena, atmospheric imaging). Through research collaborations, we had the opportunity to evaluate the performance and scalability of their remote sensing infrastructure. This article reports the lessons learned from our study, which considered both data collection and data dissemination aspects of their system. On the data collection front, we used benchmarking techniques to identify and fix a performance bottleneck in the system’s memory management for TCP data streams, while also improving system efficiency on multi-core architectures. On the data dissemination front, we used passive and active network traffic measurements to identify and reduce excessive network traffic from the Web robots and JavaScript techniques used for data sharing. While our results are from one specific sensor Web system, the lessons learned may apply to other scientific Web sites with remote sensing infrastructure. PMID:28468262
Welcome to the techno highway: development of a health assessment CD-ROM and website.
Bosco, Anna Maria; Ward, Catherine
2005-09-01
Traditionally, teaching nursing students psychomotor skills took place in a laboratory setting; however, recent developments in computer technology have revolutionised how educators can transfer knowledge. To meet the need for an efficient and interactive learning experience, a software product was required to educate nursing students about health assessment techniques. This paper presents how the existing 'old technology' of a video was given new life by embracing new technology, resulting in the development of an interactive CD-ROM with supporting WebCT. This innovation reflects a more flexible approach to learning as it is dynamic, portable, self-paced and more convenient for adult learners, especially those in remote areas.
NASA Astrophysics Data System (ADS)
Cautun, Marius; van de Weygaert, Rien; Jones, Bernard J. T.; Frenk, Carlos S.
2014-07-01
The cosmic web is the largest scale manifestation of the anisotropic gravitational collapse of matter. It represents the transitional stage between linear and non-linear structures and contains easily accessible information about the early phases of structure formation. Here we investigate the characteristics and the time evolution of its morphological components. Our analysis involves the application of the NEXUS Multiscale Morphology Filter technique, predominantly its NEXUS+ version, to high resolution and large volume cosmological simulations. We quantify the cosmic web components in terms of their mass and volume content, their density distribution and halo populations. We employ new analysis techniques to determine the spatial extent of filaments and sheets, such as their total length and local width. This analysis identifies clusters and filaments as the most prominent components of the web. In contrast, while voids and sheets take up most of the volume, they correspond to underdense environments and are devoid of group-sized and more massive haloes. At early times the cosmos is dominated by tenuous filaments and sheets, which, during subsequent evolution, merge together, such that the present-day web is dominated by fewer, but much more massive, structures. The analysis of the mass transport between environments clearly shows how matter flows from voids into walls, and then via filaments into cluster regions, which form the nodes of the cosmic web. We also study the properties of individual filamentary branches and find long, almost straight filaments extending to distances larger than 100 h-1 Mpc. These constitute the bridges between massive clusters, which seem to form along approximately straight lines.
High throughput web inspection system using time-stretch real-time imaging
NASA Astrophysics Data System (ADS)
Kim, Chanju
Photonic time-stretch is a novel technology that enables capturing of fast, rare, and non-repetitive events. It operates in real time, with the ability to record over long periods while retaining fine temporal resolution. This powerful property of photonic time-stretch has already been employed in various fields of application such as analog-to-digital conversion, spectroscopy, laser scanning, and microscopy. Further expanding the scope, we fully exploit the time-stretch technology to demonstrate a high throughput web inspection system. Web inspection, namely surface inspection, is a nondestructive evaluation method which is crucial for semiconductor wafer and thin film production. We report a dark-field web inspection system with a line scan rate of 90.9 MHz, which is up to 1000 times faster than conventional inspection instruments. The manufacturing of high quality semiconductor wafers and thin films may directly benefit from this technology, as it can easily locate defects with an area of less than 10 µm × 10 µm while allowing a maximum web flow speed of 1.8 km/s. The thesis provides an overview of our web inspection technique, followed by a description of the photonic time-stretch technique, which is the keystone of our system. A detailed explanation of each component is covered to provide a quantitative understanding of the system. Finally, imaging results from a hard-disk sample and flexible films are presented along with a performance analysis of the system. This project was the first application of time-stretch to industrial inspection, and was conducted under financial support and with close involvement by Hitachi, Ltd.
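As a rough consistency check on the figures quoted above (not a calculation from the thesis), one can relate the line-scan rate and web speed under the assumption that the spacing between successive scan lines equals web speed divided by line-scan rate:

```python
line_rate_hz = 90.9e6   # reported line-scan rate
web_speed_mps = 1.8e3   # reported maximum web flow speed, 1.8 km/s

# Distance the web travels between successive scan lines.
line_spacing_m = web_speed_mps / line_rate_hz
print(f"{line_spacing_m * 1e6:.1f} um between scan lines")
```

This comes out near 20 µm, the same order of magnitude as the quoted 10 µm defect size, which is why the system can sustain such a high web speed only at its maximum line rate.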
Students' Reaction to WebCT: Implications for Designing On-Line Learning Environments
ERIC Educational Resources Information Center
Osman, Mohamed Eltahir
2005-01-01
There is a growing number of web-based and web-assisted course development tools and products that can be used to create on-line learning environment. The utility of these products, however, varies greatly depending on their feasibility, prerequisite infrastructure, technical features, interface, and course development and management tools. WebCT…
The Adoption and Diffusion of Web Technologies into Mainstream Teaching.
ERIC Educational Resources Information Center
Hansen, Steve; Salter, Graeme
2001-01-01
Discusses various adoption and diffusion frameworks and methodologies to enhance the use of Web technologies by teaching staff. Explains the use of adopter-based models for product development; discusses the innovation-decision process; and describes PlatformWeb, a Web information system that was developed to help integrate a universities'…
Exploring the Influence of Web-Based Portfolio Development on Learning To Teach Elementary Science.
ERIC Educational Resources Information Center
Avraamidou, Lucy; Zembal-Saul, Carla
This study examined how Web-based portfolio development supported reflective thinking and learning within a Professional Development School (PDS). It investigated the evidence-based philosophies developed by prospective teachers as a central part of the Web-based portfolio task, noting how technology contributed to the portfolio task. Participants…
Expanding the use of Scientific Data through Maps and Apps
NASA Astrophysics Data System (ADS)
Shrestha, S. R.; Zimble, D. A.; Herring, D.; Halpert, M.
2014-12-01
The importance of making scientific data more available can't be overstated. There is a wealth of useful scientific data available and demand for this data is only increasing; however, applying scientific data towards practical uses poses several technical challenges. These challenges can arise from difficulty in handling the data due largely to 1) the complexity, variety and volume of scientific data and 2) applying and operating the techniques and tools needed to visualize and analyze the data. As a result, the combined knowledge required to take advantage of these data requires highly specialized skill sets that in total, limit the ability of scientific data from being used in more practical day-to-day decision making activities. While these challenges are daunting, information technologies do exist that can help mitigate some of these issues. Many organizations for years have already been enjoying the benefits of modern service oriented architectures (SOAs) for everyday enterprise tasks. We can use this approach to modernize how we share and access our scientific data where much of the specialized tools and techniques needed to handle and present scientific data can be automated and executed by servers and done so in an appropriate way. We will discuss and show an approach for preparing file based scientific data (e.g. GRIB, netCDF) for use in standard based scientific web services. These scientific web services are able to encapsulate the logic needed to handle and describe scientific data through a variety of service types including, image, map, feature, geoprocessing, and their respective service methods. By combining these types of services and leveraging well-documented and modern web development APIs, we can afford to focus our attention on the design and development of user-friendly maps and apps. Our scenario will include developing online maps through these services by integrating various forecast data from the Climate Forecast System (CFSv2). 
This presentation showcases a collaboration between the National Oceanic and Atmospheric Administration's (NOAA) Climate.gov portal, Climate Prediction Center and Esri, Inc. on the implementation of the ArcGIS platform, which is aimed at helping modernize scientific data access through a service oriented architecture.
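The kind of translation such a feature service performs (turning a gridded field into web-friendly features so clients never touch GRIB or netCDF directly) might be sketched as follows; the helper name, grid values, and coordinate layout are assumptions for the example, not the NOAA/Esri implementation.

```python
# Hypothetical helper: flatten a small gridded forecast field into
# GeoJSON-style point features. Values, origin, and cell size are
# invented for the sketch.
def grid_to_features(values, lat0, lon0, cell_deg):
    features = []
    for i, row in enumerate(values):
        for j, val in enumerate(row):
            features.append({
                "type": "Feature",
                "geometry": {"type": "Point",
                             "coordinates": [lon0 + j * cell_deg,
                                             lat0 - i * cell_deg]},
                "properties": {"temperature_k": val},
            })
    return {"type": "FeatureCollection", "features": features}

fc = grid_to_features([[290.1, 291.4], [289.7, 290.8]],
                      lat0=40.0, lon0=-105.0, cell_deg=0.5)
print(len(fc["features"]))  # 4 point features
```

In a service-oriented setup, this logic lives on the server behind a feature-service endpoint, so a web map client only ever consumes the JSON.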
NASA Astrophysics Data System (ADS)
Kadow, C.; Illing, S.; Kunst, O.; Cubasch, U.
2014-12-01
The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid system features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard, using the meta information of the self-describing model, reanalysis, and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-use search tool, supports users, developers, and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and helps identify discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via shell or web system. Therefore, plugged-in tools automatically gain transparency and reproducibility.
Furthermore, when configurations match at the start of an evaluation tool run, the system suggests reusing results already produced by other users, saving CPU time, I/O, and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. Website: www-miklip.dkrz.de (visitor login: guest, password: miklip)
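The configuration-matching reuse described above can be sketched as a content-addressed result store: hash the tool configuration and look it up before recomputing. The tool name, configuration fields, and in-memory dict below are illustrative stand-ins (the MiKlip system records analyses in a MySQL database).

```python
import hashlib
import json

results_store = {}  # stand-in for the history/configuration database

def config_key(tool, config):
    # Canonical JSON so logically equal configurations hash identically.
    payload = json.dumps({"tool": tool, "config": config}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def run_tool(tool, config, compute):
    key = config_key(tool, config)
    if key in results_store:
        return results_store[key], True   # reused: no CPU time or I/O spent
    result = compute()
    results_store[key] = result
    return result, False

calls = []
cfg = {"model": "decadal-hindcast-member-1", "metric": "rmse"}  # hypothetical config
r1, reused1 = run_tool("skill_tool", cfg, lambda: calls.append(1) or "score=0.42")
r2, reused2 = run_tool("skill_tool", cfg, lambda: calls.append(1) or "score=0.42")
print(reused1, reused2, len(calls))  # False True 1
```

The second call with an identical configuration never invokes the compute function, which is the CPU/I/O saving the abstract describes.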
NASA Astrophysics Data System (ADS)
Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich
2015-04-01
The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid system features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard, using the meta information of the self-describing model, reanalysis, and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-use search tool, supports users, developers, and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and helps identify discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via shell or web system. Therefore, plugged-in tools automatically gain transparency and reproducibility.
Furthermore, when configurations match at the start of an evaluation tool run, the system suggests reusing results already produced by other users, saving CPU time, I/O, and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. Website: www-miklip.dkrz.de (visitor login: click on "Guest")
Bujar, Magdalena; McAuslane, Neil; Walker, Stuart R; Salek, Sam
2017-01-01
Introduction: Although pharmaceutical companies, regulatory authorities, and health technology assessment (HTA) agencies have increasingly been using decision-making frameworks, it is not certain whether these enable better quality decision making. This could be addressed by formally evaluating the quality of the decision-making process within those organizations. The aim of this literature review was to identify current techniques (tools, questionnaires, surveys, and studies) for measuring the quality of the decision-making process across the three stakeholders. Methods: Using MEDLINE, Web of Knowledge, and other Internet-based search engines, a literature review was performed to systematically identify techniques for assessing the quality of decision making in medicines development, regulatory review, and HTA. A structured search was applied using key words, and a secondary review was carried out. In addition, the measurement properties of each technique were assessed and compared. Ten Quality Decision-Making Practices (QDMPs) developed previously were then used as a framework for the evaluation of the techniques identified in the review. Due to the variation in the studies identified, meta-analysis was inappropriate. Results: This review identified 13 techniques, of which 7 were developed specifically to assess decision making in medicines development, regulatory review, or HTA; 2 examined corporate decision making, and 4 general decision making. Regarding how closely each technique conformed to the 10 QDMPs, the 13 techniques assessed a median of 6 QDMPs, with a mode of 3 QDMPs. Only 2 techniques evaluated all 10 QDMPs, namely the Organizational IQ and the Quality of Decision Making Orientation Scheme (QoDoS). Of these, only the QoDoS could be applied to assess the decision making of both individuals and organizations, and it possessed the generalizability to capture issues relevant to companies as well as regulatory authorities.
Conclusion: This review confirmed a general paucity of research in this area, particularly regarding the development and systematic application of techniques for evaluating quality decision making, with no consensus around a gold standard. This review has identified QoDoS as the most promising available technique for assessing decision making in the lifecycle of medicines and the next steps would be to further test its validity, sensitivity, and reliability.
Developing Collections of Web-Published Materials
ERIC Educational Resources Information Center
Hsieh, Inga K.; Murray, Kathleen R.; Hartman, Cathy Nelson
2007-01-01
Librarians and archivists face challenges when adapting traditional collection development practices to meet the unique characteristics of Web-published materials. Likewise, preservation activities for Web-published materials must be undertaken at the outset of collection development lest they be lost forever. Standards and best practices for…
Dynamic selection mechanism for quality of service aware web services
NASA Astrophysics Data System (ADS)
D'Mello, Demian Antony; Ananthanarayana, V. S.
2010-02-01
A web service is an interface to a software component that can be accessed via standard Internet protocols. Web service technology enables application-to-application communication and interoperability. The increasing number of web service providers throughout the globe has produced numerous web services providing the same or similar functionality. This necessitates tools and techniques to search for suitable services available over the Web. UDDI (universal description, discovery and integration) was the first initiative for finding suitable web services based on the requester's functional demands. However, the requester's requirements may also include non-functional aspects such as quality of service (QoS). In this paper, the authors define a QoS model for QoS-aware and business-driven web service publishing and selection. The authors propose a QoS requirement format for requesters to specify their complex demands on QoS for web service selection. The authors define a tree structure called the quality constraint tree (QCT) to represent the requester's variety of requirements on QoS properties with varied preferences. The paper proposes a QoS-broker-based architecture for web service selection, which facilitates requesters in specifying their QoS requirements to select a qualitatively optimal web service. A web service selection algorithm is presented, which ranks functionally similar web services based on the degree of satisfaction of the requester's QoS requirements and preferences. The paper defines web service provider qualities to distinguish qualitatively competitive web services. The paper also presents the modelling and selection mechanism for the requester's alternative constraints defined on the QoS. The authors implement the QoS-broker-based system to prove the correctness of the proposed web service selection mechanism.
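A minimal sketch of QoS-based ranking over functionally similar services follows, with property names, weights, and candidate values invented for illustration (the paper's quality constraint tree supports richer, nested preferences than this flat form):

```python
# Each candidate gets a weighted degree-of-satisfaction score against the
# requester's QoS constraints; all names and numbers below are assumptions.
def satisfaction(candidate, constraints):
    score = 0.0
    for prop, (required, weight, higher_is_better) in constraints.items():
        value = candidate[prop]
        ok = value >= required if higher_is_better else value <= required
        if ok:
            score += weight
    return score

constraints = {
    # property: (required level, preference weight, higher_is_better)
    "availability": (0.99, 0.5, True),
    "response_ms":  (200,  0.3, False),
    "cost":         (0.05, 0.2, False),
}
candidates = [
    {"name": "svcA", "availability": 0.995, "response_ms": 150, "cost": 0.10},
    {"name": "svcB", "availability": 0.990, "response_ms": 250, "cost": 0.04},
    {"name": "svcC", "availability": 0.999, "response_ms": 180, "cost": 0.03},
]
ranked = sorted(candidates,
                key=lambda c: satisfaction(c, constraints), reverse=True)
print([c["name"] for c in ranked])  # svcC ranks first: it meets every constraint
```

A broker would run this ranking only over services already matched on functional criteria, exactly the division of labour the architecture above describes.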
XMM-Newton Mobile Web Application
NASA Astrophysics Data System (ADS)
Ibarra, A.; Kennedy, M.; Rodríguez, P.; Hernández, C.; Saxton, R.; Gabriel, C.
2013-10-01
We present the first XMM-Newton mobile web application, coded using new web technologies such as HTML5, the jQuery Mobile framework, and the D3 JavaScript data-driven library. This new mobile web application focuses on re-formatted content extracted directly from the XMM-Newton web pages, optimizing the content for mobile devices. The main goals of this development were to reach all kinds of handheld devices and operating systems while minimizing software maintenance. The application has therefore been developed as a mobile web implementation rather than a more costly native application. New functionality will be added regularly.
Finding Web-Based Anxiety Interventions on the World Wide Web: A Scoping Review
Olander, Ellinor K; Ayers, Susan
2016-01-01
Background One relatively new and increasingly popular approach to increasing access to treatment is Web-based intervention programs. The advantage of Web-based approaches is the accessibility, affordability, and anonymity of potentially evidence-based treatment. Despite much research evidence on the effectiveness of Web-based interventions for anxiety found in the literature, little is known about what is publicly available for potential consumers on the Web. Objective Our aim was to explore what a consumer searching the Web for Web-based intervention options for anxiety-related issues might find. The objectives were to identify currently publicly available Web-based intervention programs for anxiety and to synthesize and review these in terms of (1) website characteristics such as credibility and accessibility; (2) intervention program characteristics such as intervention focus, design, and presentation modes; (3) therapeutic elements employed; and (4) published evidence of efficacy. Methods Web keyword searches were carried out on three major search engines (Google, Bing, and Yahoo, UK platforms). For each search, the first 25 hyperlinks were screened for eligible programs. Included were programs that were designed for anxiety symptoms, currently publicly accessible on the Web, had an online component, a structured treatment plan, and were available in English. Data were extracted for website characteristics, program characteristics, therapeutic characteristics, as well as empirical evidence. Programs were also evaluated using a 16-point rating tool. Results The search resulted in 34 programs that were eligible for review. A wide variety of programs for anxiety, including specific anxiety disorders, and anxiety in combination with stress, depression, or anger were identified and based predominantly on cognitive behavioral therapy techniques. The majority of websites were rated as credible, secure, and free of advertisement. 
The majority required users to register and/or to pay a program access fee. Half of the programs offered some form of paid therapist or professional support. Programs varied in treatment length and number of modules and employed a variety of presentation modes. Relatively few programs had published research evidence of the intervention’s efficacy. Conclusions This review represents a snapshot of available Web-based intervention programs for anxiety that could be found by consumers in March 2015. The consumer is confronted with a diversity of programs, which makes it difficult to identify an appropriate program. Limited reports and existence of empirical evidence for efficacy make it even more challenging to identify credible and reliable programs. This highlights the need for consistent guidelines and standards on developing, providing, and evaluating Web-based interventions and platforms with reliable up-to-date information for professionals and consumers about the characteristics, quality, and accessibility of Web-based interventions. PMID:27251763
Finding Web-Based Anxiety Interventions on the World Wide Web: A Scoping Review.
Ashford, Miriam Thiel; Olander, Ellinor K; Ayers, Susan
2016-06-01
One relatively new and increasingly popular approach of increasing access to treatment is Web-based intervention programs. The advantage of Web-based approaches is the accessibility, affordability, and anonymity of potentially evidence-based treatment. Despite much research evidence on the effectiveness of Web-based interventions for anxiety found in the literature, little is known about what is publically available for potential consumers on the Web. Our aim was to explore what a consumer searching the Web for Web-based intervention options for anxiety-related issues might find. The objectives were to identify currently publically available Web-based intervention programs for anxiety and to synthesize and review these in terms of (1) website characteristics such as credibility and accessibility; (2) intervention program characteristics such as intervention focus, design, and presentation modes; (3) therapeutic elements employed; and (4) published evidence of efficacy. Web keyword searches were carried out on three major search engines (Google, Bing, and Yahoo-UK platforms). For each search, the first 25 hyperlinks were screened for eligible programs. Included were programs that were designed for anxiety symptoms, currently publically accessible on the Web, had an online component, a structured treatment plan, and were available in English. Data were extracted for website characteristics, program characteristics, therapeutic characteristics, as well as empirical evidence. Programs were also evaluated using a 16-point rating tool. The search resulted in 34 programs that were eligible for review. A wide variety of programs for anxiety, including specific anxiety disorders, and anxiety in combination with stress, depression, or anger were identified and based predominantly on cognitive behavioral therapy techniques. The majority of websites were rated as credible, secure, and free of advertisement. 
The majority required users to register and/or to pay a program access fee. Half of the programs offered some form of paid therapist or professional support. Programs varied in treatment length and number of modules and employed a variety of presentation modes. Relatively few programs had published research evidence of the intervention's efficacy. This review represents a snapshot of available Web-based intervention programs for anxiety that could be found by consumers in March 2015. The consumer is confronted with a diversity of programs, which makes it difficult to identify an appropriate program. Limited reports and existence of empirical evidence for efficacy make it even more challenging to identify credible and reliable programs. This highlights the need for consistent guidelines and standards on developing, providing, and evaluating Web-based interventions and platforms with reliable up-to-date information for professionals and consumers about the characteristics, quality, and accessibility of Web-based interventions.
Continuous Ultrasonic Inspection of Extruded Wood-Plastic Composites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, Brian J.; Bender, Donald A.
Nondestructive evaluation (NDE) techniques are needed for in-line monitoring of wood-plastic composite (WPC) quality during manufacturing for process control. Through-transmission ultrasonic inspection is useful in characterizing stiffness and detecting cracks and voids in a range of materials; however, little is documented about ultrasound propagation in WPC materials. The objectives of this research were to determine applicable ultrasonic transducer frequencies, coupling methods, configurations and placements for wave speed monitoring and web defect detection within an extrusion process; to quantify the effects of temperature on ultrasonic parameters; and to develop a prototype ultrasonic inspection system for a full-size extrusion line. An angled-beam, water-coupled ultrasonic inspection system using a pair of 50-kHz narrowband transducers was adequate for monitoring wave speed parallel to the extrusion direction. For locating internal web defects, water-coupled, 500-kHz broadband ultrasonic transducers were used in a through-thickness transmission setup. Temperature compensation factors were developed to adjust ultrasonic wave speed measurements. The prototype inspection system was demonstrated in a 55 mm conical twin-screw extrusion line.
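As an illustration of the last two points, a through-transmission wave-speed measurement with a linear temperature compensation factor might look like the sketch below. The geometry, transit time, and compensation coefficient are placeholders for illustration, not values from the study:

```python
# Illustrative sketch: ultrasonic wave speed from time of flight, with a
# linear temperature compensation back to a reference temperature.
# The coefficient below is a hypothetical placeholder, not a published value.

def wave_speed(path_length_m: float, time_of_flight_s: float) -> float:
    """Wave speed from the known path length and the measured transit time."""
    return path_length_m / time_of_flight_s

def compensate_to_reference(speed: float, temp_c: float,
                            ref_temp_c: float = 20.0,
                            coeff_per_c: float = 0.002) -> float:
    """Adjust a speed measured at temp_c to the reference temperature,
    assuming speed falls linearly with temperature by coeff_per_c
    (fraction per deg C) -- an assumed, not measured, coefficient."""
    return speed * (1.0 + coeff_per_c * (temp_c - ref_temp_c))

v = wave_speed(0.055, 2.5e-5)        # 55 mm web, 25 us transit time
v_ref = compensate_to_reference(v, 80.0)   # melt near the die is hot
```

In a real extrusion line the compensation coefficient would be fitted from calibration runs at known temperatures, which is what the study's "temperature compensation factors" provide.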
Web Content Management and One EPA Web Factsheet
One EPA Web is a multi-year project to improve EPA’s website to better meet the needs of our Web visitors. Content is developed and managed in the WebCMS which supports One EPA Web goals by standardizing how we create and publish content.
Web information retrieval for health professionals.
Ting, S L; See-To, Eric W K; Tse, Y K
2013-06-01
This paper presents a Web Information Retrieval System (WebIRS), which is designed to assist healthcare professionals in obtaining up-to-date medical knowledge and information via the World Wide Web (WWW). The system leverages document classification and text summarization techniques to deliver highly correlated medical information to physicians. The system architecture of the proposed WebIRS is first discussed, and then a case study on an application of the proposed system in a Hong Kong medical organization is presented to illustrate the adoption process; a questionnaire was administered to collect feedback on the operation and performance of WebIRS in comparison with conventional information retrieval on the WWW. A prototype system has been constructed and implemented on a trial basis in a medical organization. It has proven to be of benefit to healthcare professionals through its automatic classification and summarization of the medical information that physicians need and are interested in. The results of the case study show that with the proposed WebIRS, significant reductions in searching time and effort, together with retrieval of highly relevant materials, can be attained.
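The abstract does not specify WebIRS's classification pipeline; as a rough illustration of the retrieval side, a term-frequency/inverse-document-frequency (TF-IDF) cosine ranking over a toy corpus can be sketched in a few lines. The example documents and query are invented:

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase word tokens; a real system would also stem and drop stopwords."""
    return re.findall(r"[a-z']+", text.lower())

def tfidf_vectors(docs):
    """TF-IDF vectors (smoothed IDF) for a small document collection."""
    tokened = [Counter(tokenize(d)) for d in docs]
    df = Counter(t for doc in tokened for t in doc)   # document frequency
    n = len(docs)
    return [{t: tf * math.log((1 + n) / (1 + df[t]))
             for t, tf in doc.items()} for doc in tokened]

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank(query, docs):
    """Return document indices ordered by TF-IDF cosine relevance to the query."""
    vecs = tfidf_vectors(docs + [query])
    qv = vecs[-1]
    return sorted(range(len(docs)),
                  key=lambda i: cosine(vecs[i], qv), reverse=True)
```

A query such as `rank("insulin diabetes", corpus)` then surfaces the diabetes-related documents first, which is the "highly correlated" delivery the abstract describes; summarization would be a separate extractive step over the top-ranked documents.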
Experimental evaluation of the impact of packet capturing tools for web services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choe, Yung Ryn; Mohapatra, Prasant; Chuah, Chen-Nee
Network measurement is a discipline that provides the techniques to collect data that are fundamental to many branches of computer science. While many capturing tools and comparisons have been made available in the literature and elsewhere, the impact of these packet capturing tools on existing processes has not been thoroughly studied. While not a concern for collection methods in which dedicated servers are used, many usage scenarios of packet capturing now require the capturing tool to run concurrently with operational processes. In this work we perform experimental evaluations of the performance impact that packet capturing processes have on web-based services; in particular, we observe the impact on web servers. We find that packet capturing processes indeed impact the performance of web servers, but on a multi-core system the impact varies depending on whether the packet capturing and web hosting processes are co-located or not. In addition, the architecture and behavior of the web server and process scheduling are coupled with the behavior of the packet capturing process, which in turn also affects the web server's performance.
A User-centered Model for Web Site Design
Kinzie, Mable B.; Cohn, Wendy F.; Julian, Marti F.; Knaus, William A.
2002-01-01
As the Internet continues to grow as a delivery medium for health information, the design of effective Web sites becomes increasingly important. In this paper, the authors provide an overview of one effective model for Web site design, a user-centered process that includes techniques for needs assessment, goal/task analysis, user interface design, and rapid prototyping. They detail how this approach was employed to design a family health history Web site, Health Heritage.
Radiology teaching file cases on the World Wide Web.
Scalzetti, E M
1997-08-01
The presentation of a radiographic teaching file on the World Wide Web can be enhanced by attending to principles of web design. Chief among these are appropriate control of page layout, minimization of the time required to download a page from the remote server, and provision for navigation within and among the web pages that constitute the site. Page layout is easily accomplished by the use of tables; column widths can be fixed to maintain an acceptable line length for text. Downloading time is minimized by rigorous editing and by optimal compression of image files; beyond this, techniques like preloading of images and specification of image width and height are also helpful. Navigation controls should be clear, consistent, and readily available.
ERIC Educational Resources Information Center
Kao, Chia-Pin; Tsai, Chin-Chung; Shih, Meilun
2014-01-01
The major purpose of this study was to develop a survey to measure elementary school teachers' self-efficacy for web-based professional development. Based on interviews with eight elementary school teachers, three scales of web-based professional development self-efficacy (WPDSE) were formed, namely, general self-efficacy (measuring teachers'…
Web-based education in anesthesiology: a critical overview.
Doyle, D John
2008-12-01
The purpose of this review is to discuss the rise of web-based educational resources available to the anesthesiology community. Recent developments of particular importance include the growth of 'Web 2.0' resources, the development of the concepts of 'open access' and 'information philanthropy', and the expansion of web-based medical simulation software products. In addition, peer review of online educational resources has now come of age. The World Wide Web has made available a large variety of valuable medical information and education resources only dreamed of two decades ago. To a large extent, these developments represent a shift in the focus of medical education resources to emphasize free access to materials and to encourage collaborative development efforts.
Warren, D.W.
1997-04-15
A process and an apparatus are disclosed for high-intensity drying of fiber webs or sheets, such as newsprint, printing and writing papers, packaging paper, and paperboard or linerboard, as they are formed on a paper machine. The invention uses direct contact between the wet fiber web or sheet and various molten heat transfer fluids, such as liquefied eutectic metal alloys, to impart heat at high rates over prolonged durations, in order to achieve ambient boiling of moisture contained within the web. The molten fluid contact process causes steam vapor to emanate from the web surface without dilution by ambient air, and it is differentiated from the evaporative drying techniques of the prior industrial art, which depend on steam-heated cylinders to supply heat to the paper web surface and on ambient air to carry away the moisture evaporated from the web surface. Contact between the wet fiber web and the molten fluid can be accomplished either by submersing the web within a molten bath or by coating the surface of the web with the molten media. Because of the high interfacial surface tension between the molten media and the cellulose fiber comprising the paper web, the molten media does not appreciably stick to the paper after it is dried. Steam generated from the paper web is collected and condensed without dilution by ambient air, allowing heat recovery at significantly higher temperature levels than attainable in evaporative dryers. 6 figs.
MetaGenyo: a web tool for meta-analysis of genetic association studies.
Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro
2017-12-16
Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years the number of such studies has increased exponentially, but the results are not always reproducible due to experimental design, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools for combining results across studies to increase statistical power and to resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool for conducting comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
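The core pooling step such a workflow automates can be illustrated with a minimal fixed-effect (inverse-variance) meta-analysis of allele-count 2x2 tables. The study counts below are invented for illustration, and this is not MetaGenyo's implementation:

```python
import math

def log_odds_ratio(a, b, c, d):
    """Log odds ratio and its (Woolf) variance from a 2x2 allele table:
    a, b = case allele counts; c, d = control allele counts."""
    lor = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d
    return lor, var

def fixed_effect_pool(tables):
    """Inverse-variance (fixed-effect) pooled odds ratio across studies."""
    stats = [log_odds_ratio(*t) for t in tables]
    weights = [1.0 / v for _, v in stats]
    pooled_lor = (sum(w * lor for (lor, _), w in zip(stats, weights))
                  / sum(weights))
    se = math.sqrt(1.0 / sum(weights))           # SE of the pooled log OR
    return math.exp(pooled_lor), se

# Two hypothetical studies, each (cases with allele, cases without,
# controls with allele, controls without):
studies = [(30, 70, 20, 80), (45, 55, 30, 70)]
or_pooled, se = fixed_effect_pool(studies)
```

A full GAS meta-analysis, as the abstract notes, would add Hardy-Weinberg checks, random-effects pooling when heterogeneity (e.g., Cochran's Q) is high, and publication-bias tests on top of this pooling step.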
Parikh, Priti P; Minning, Todd A; Nguyen, Vinh; Lalithsena, Sarasi; Asiaee, Amir H; Sahoo, Satya S; Doshi, Prashant; Tarleton, Rick; Sheth, Amit P
2012-01-01
Research on the biology of parasites requires a sophisticated and integrated computational platform to query and analyze large volumes of data, representing both unpublished (internal) and public (external) data sources. Effective analysis of an integrated data resource using knowledge discovery tools would significantly aid biologists in conducting their research, for example, through identifying various intervention targets in parasites and in deciding the future direction of ongoing as well as planned projects. A key challenge in achieving this objective is the heterogeneity between the internal lab data, usually stored as flat files, Excel spreadsheets or custom-built databases, and the external databases. Reconciling the different forms of heterogeneity and effectively integrating data from disparate sources is a nontrivial task for biologists and requires a dedicated informatics infrastructure. Thus, we developed an integrated environment using Semantic Web technologies that may provide biologists the tools for managing and analyzing their data, without the need for acquiring in-depth computer science knowledge. We developed a semantic problem-solving environment (SPSE) that uses ontologies to integrate internal lab data with external resources in a Parasite Knowledge Base (PKB), which has the ability to query across these resources in a unified manner. The SPSE includes Web Ontology Language (OWL)-based ontologies, experimental data with its provenance information represented using the Resource Description Format (RDF), and a visual querying tool, Cuebee, that features integrated use of Web services. We demonstrate the use and benefit of SPSE using example queries for identifying gene knockout targets of Trypanosoma cruzi for vaccine development. Answers to these queries involve looking up multiple sources of data, linking them together and presenting the results. 
The SPSE helps parasitologists leverage the growing, but disparate, parasite data resources by offering an integrative platform built on Semantic Web techniques, while keeping the added workload minimal.
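The kind of unified query across merged internal and external triples that SPSE provides (via OWL, RDF, and the Cuebee tool) can be caricatured with plain triple-pattern matching. The identifiers below are invented for illustration, not actual Parasite Knowledge Base vocabulary:

```python
# Toy triple store: internal lab data and an external annotation source are
# merged into one list of (subject, predicate, object) triples and queried
# uniformly, which is the integration idea behind SPSE (all names invented).

def match(triples, pattern):
    """Return variable bindings for a (s, p, o) pattern; variables start with '?'."""
    results = []
    for triple in triples:
        binding = {}
        ok = True
        for pat, val in zip(pattern, triple):
            if pat.startswith("?"):
                binding[pat] = val
            elif pat != val:
                ok = False
                break
        if ok:
            results.append(binding)
    return results

internal = [("gene:Tc1", "hasPhase", "amastigote")]          # lab spreadsheet data
external = [("gene:Tc1", "annotatedWith", "GO:0016301")]     # public annotation
kb = internal + external    # the integrated store is queried as one resource

hits = match(kb, ("?g", "annotatedWith", "GO:0016301"))
```

Real SPSE queries are expressed in SPARQL over RDF graphs with ontology-driven inference; the point of the sketch is only that, once sources share a triple representation, one query spans them all.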
Karlsen, Bjørg; Oftedal, Bjørg; Stangeland Lie, Silje; Rokne, Berit; Peyrot, Mark; Zoffmann, Vibeke; Graue, Marit
2016-01-01
Introduction Self-management is deemed the cornerstone of overall diabetes management. Web-based self-management interventions have the potential to support adults with type 2 diabetes (T2DM) in managing their disease. Owing to the somewhat ambiguous results of such interventions, they should be theory-based and incorporate well-defined counselling methods and techniques for behavioural change. This study is designed to assess the effectiveness of a theory-driven, web-based Guided Self-Determination (GSD) intervention among adults with T2DM in general practice in improving diabetes self-management behaviours and glycosylated haemoglobin (HbA1c). Methods and analysis A complex intervention design based on the framework of the UK Medical Research Council is employed as a guide for developing the intervention, assessing its feasibility and evaluating its effectiveness. The study consists of three phases: (1) the modelling phase, adapting the original GSD programme for adults with T2DM using a qualitative design; (2) feasibility assessment of the adapted intervention on the web, employing qualitative and quantitative methods; and (3) evaluation of the effectiveness of the intervention on diabetes self-management behaviours and HbA1c, using a quasi-experimental design. The first phase, which is completed, and the second phase, which is underway, will provide important information about the development of the intervention and its acceptability, whereas the third phase will assess the effectiveness of this systematically developed intervention. Ethics and dissemination The Norwegian Regional Committee for Medical and Health Research Ethics (REK west number 2015/60) has approved the study design. Patients recruited in the different phases will fill out an informed consent form prior to inclusion and will be guaranteed anonymity and the right to withdraw from the study at any time.
The results of the study will be published in peer-reviewed journals, electronically and in print, and presented at research conferences. Trial registration number: NCT02575599. PMID:27965253
Evaluating HDR photos using Web 2.0 technology
NASA Astrophysics Data System (ADS)
Qiu, Guoping; Mei, Yujie; Duan, Jiang
2011-01-01
High dynamic range (HDR) photography is an emerging technology that has the potential to dramatically enhance the visual quality and realism of digital photos. One of the key technical challenges of HDR photography is displaying HDR photos on conventional devices through tone mapping or dynamic range compression. Although many different tone mapping techniques have been developed in recent years, evaluating tone mapping operators proves to be extremely difficult. Web 2.0, social media and crowdsourcing are emerging Internet technologies which can be harnessed to harvest the brain power of the masses to solve difficult problems in science, engineering and business. Paired comparison is used in the scientific study of preferences and attitudes and has been shown to be capable of obtaining an interval-scale ordering of items along a psychometric dimension such as preference or importance. In this paper, we exploit these technologies for evaluating HDR tone mapping algorithms. We have developed a Web 2.0 style system that enables Internet users anywhere to evaluate tone mapped HDR photos at any time. We adopt a simple paired comparison protocol: Internet users are presented with a pair of tone mapped images and are simply asked to select the one that they think is better, or to click a "no difference" button. These user inputs are collected on the web server and analyzed by a rank aggregation algorithm which ranks the tone mapped photos according to the votes they received. We present experimental results which demonstrate that the emerging Internet technologies can be exploited as a new paradigm for evaluating HDR tone mapping algorithms. The advantages of this approach include the potential of collecting large numbers of user inputs under a variety of viewing environments, rather than limited user participation under controlled laboratory environments, thus enabling more robust and reliable quality assessment.
We also present data analysis that correlates user-generated qualitative indices with quantitative image statistics, which may provide useful guidance for developing better tone mapping operators.
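The vote-aggregation step described above can be approximated in a few lines. The paper's exact rank aggregation algorithm is not given here, so a simple win-fraction score (with "no difference" clicks omitted) stands in for it, over invented votes:

```python
from collections import defaultdict

def aggregate(votes, items):
    """Rank items by their fraction of pairwise wins.

    `votes` is a list of (winner, loser) pairs collected from paired
    comparisons; ties ("no difference" clicks) are simply not recorded.
    Win fraction is an illustrative stand-in for the paper's aggregator;
    a Bradley-Terry fit would be a more principled choice.
    """
    wins = defaultdict(int)
    appearances = defaultdict(int)
    for winner, loser in votes:
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1
    score = {i: (wins[i] / appearances[i]) if appearances[i] else 0.0
             for i in items}
    return sorted(items, key=lambda i: score[i], reverse=True)

# Hypothetical votes over three tone mapped versions A, B, C of one photo:
votes = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B")]
ranking = aggregate(votes, ["A", "B", "C"])
```

With enough crowd-sourced pairs per image, such scores approach the interval-scale ordering that paired-comparison methods are known to recover.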
Golfing with protons: using research grade simulation algorithms for online games
NASA Astrophysics Data System (ADS)
Harold, J.
2004-12-01
Scientists have long known the power of simulations. By modeling a system in a computer, researchers can experiment at will, developing an intuitive sense of how a system behaves. The rapid increase in the power of personal computers, combined with technologies such as Flash, Shockwave and Java, allows us to bring research simulations into the education world by creating exploratory environments for the public. This approach is illustrated by a project funded by a small grant from NSF's Informal Science Education program, through an opportunity that provides education supplements to existing research awards. Using techniques adapted from a magnetospheric research program, several Flash-based interactives have been developed that allow web site visitors to explore the motion of particles in the Earth's magnetosphere. These pieces were folded into a larger Space Weather Center web project at the Space Science Institute (www.spaceweathercenter.org). Rather than presenting these interactives as plasma simulations per se, the research algorithms were used to create games such as "Magneto Mini Golf", where the balls are protons moving in combined electric and magnetic fields. The "holes" increase in complexity, beginning with no fields and progressing towards a simple model of Earth's magnetosphere. The emphasis of the activity is gameplay, but because it is at its core a plasma simulation, users develop an intuitive sense of charged particle motion as they progress. Meanwhile, the pieces contain embedded assessments that are measurable through a database-driven tracking system. Mining that database not only provides helpful usability information, but also allows us to examine whether users are meeting the learning goals of the activities. We will discuss the development and evaluation results of the project, as well as the potential for these types of activities to shift expectations of what a web site can and should provide educationally.
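A minimal sketch of the sort of particle stepping such games rely on is the standard (non-relativistic) Boris pusher for a charge in static electric and magnetic fields. The field values and time step below are arbitrary illustrations, not the project's actual magnetosphere model:

```python
# Boris push: half electric kick, magnetic rotation, half electric kick.
# With E = 0 the rotation preserves the particle's speed, which is what
# makes the scheme well suited to long gyration runs in games like these.

def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def boris_step(v, E, B, q_over_m, dt):
    """Advance a velocity (m/s) one time step under fields E (V/m), B (T)."""
    h = 0.5 * q_over_m * dt
    v_minus = tuple(vi + h * Ei for vi, Ei in zip(v, E))
    t = tuple(h * Bi for Bi in B)
    t2 = sum(ti * ti for ti in t)
    s = tuple(2.0 * ti / (1.0 + t2) for ti in t)
    v_prime = tuple(vm + c for vm, c in zip(v_minus, cross(v_minus, t)))
    v_plus = tuple(vm + c for vm, c in zip(v_minus, cross(v_prime, s)))
    return tuple(vp + h * Ei for vp, Ei in zip(v_plus, E))

Q_OVER_M_PROTON = 9.58e7        # C/kg
v = (1.0e5, 0.0, 0.0)           # initial proton velocity, m/s
for _ in range(1000):           # gyration in a uniform 10 nT field, E = 0
    v = boris_step(v, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0e-8),
                   Q_OVER_M_PROTON, 0.1)
```

In a magnetosphere-shaped "hole", `E` and `B` would be evaluated from a dipole-like field model at the particle's current position each step, but the update itself stays this simple.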
Development of Plant Control Diagnosis Technology and Increasing Its Applications
NASA Astrophysics Data System (ADS)
Kugemoto, Hidekazu; Yoshimura, Satoshi; Hashizume, Satoru; Kageyama, Takashi; Yamamoto, Toru
A plant control diagnosis technology was developed to improve the performance of plant-wide control and maintain high plant productivity. The control performance diagnosis system built on this technology picks out poorly performing loops, analyzes the causes, and outputs the results to a Web page. Meanwhile, the PID tuning tool is used to tune the loops extracted by the control performance diagnosis system; it has the advantage of tuning safely, without process changes. These systems are powerful tools for doing Kaizen (continuous improvement) step by step, in coordination with the operator. This paper describes a practical technique regarding the diagnosis system and its industrial applications.
Van Kouwenberg, Emily; Chattha, Anmol S; Adetayo, Oluwaseun A
2017-06-01
Webbed neck deformity (WND) can have significant functional and psychosocial impact on the developing child. Surgical correction can be challenging depending on the extent of the deformity, and patients often also have low posterior hairlines requiring simultaneous correction. Current surgical techniques include various methods of single-stage radical excision that often result in visible scar burden and residual deformity. There is currently no general consensus on which technique provides the best outcomes. A modified approach to WND was designed by the senior author with the aim of decreasing scar burden. Endoscopic-assisted fasciectomy was performed with simultaneous posterior hairline reconstruction using local tissue rearrangement camouflaged within the hair-bearing scalp. Staged surgical correction was planned rather than correction in a single operation. A retrospective review was performed to evaluate all patients who underwent this approach over a 2-year period. Two patients underwent the modified approach: a 17-year-old female with Noonan syndrome and a 2-year-old female with Turner syndrome. Both patients showed postoperative improvement in range of motion, contour of the jaw and neckline, and posterior hairline definition. Patients were found to have decreased scar burden compared with traditional techniques. A staged, combination approach of endoscopic-assisted fasciectomy and strategic local tissue reconstruction of the posterior hairline to correct WND achieves good functional and aesthetic results and good patient satisfaction. This modification should be considered when managing WND.
Jamoulle, Marc; Resnick, Melissa; Grosjean, Julien; Ittoo, Ashwin; Cardillo, Elena; Vander Stichele, Robert; Darmoni, Stefan; Vanmeerbeek, Marc
2018-12-01
While documentation of the clinical aspects of General Practice/Family Medicine (GP/FM) is assured by the International Classification of Primary Care (ICPC), there is no taxonomy for the professional aspects (context and management) of GP/FM. To present the development, dissemination, applications, and resulting face validity of the Q-Codes taxonomy specifically designed to describe contextual features of GP/FM, proposed as an extension to the ICPC. The Q-Codes taxonomy was developed from Lamberts' seminal idea for indexing contextual content (1987) by a multi-disciplinary team of knowledge engineers, linguists and general practitioners, through a qualitative and iterative analysis of 1702 abstracts from six GP/FM conferences using Atlas.ti software. A total of 182 concepts, called Q-Codes, representing professional aspects of GP/FM were identified and organized in a taxonomy. Dissemination: The taxonomy is published as an online terminological resource, using Semantic Web techniques and the Web Ontology Language (OWL) (http://www.hetop.eu/Q). Each Q-Code is identified with a unique resource identifier (URI) and provided with preferred terms and scope notes in ten languages (Portuguese, Spanish, English, French, Dutch, Korean, Vietnamese, Turkish, Georgian, German), along with search filters for MEDLINE and web searches. The taxonomy has already been used to support queries in bibliographic databases (e.g., MEDLINE), to facilitate indexing of grey literature in GP/FM such as congress abstracts, master's theses and websites, and as an educational tool in vocational teaching. Conclusions: The rapidly growing list of practical applications provides face validity for the usefulness of this freely available new terminological resource.
Are We Ready To Abandon the Classroom? The Dark Side of Web Instruction.
ERIC Educational Resources Information Center
Cohen, LeoNora M.
This paper discusses four assumptions and four concerns regarding instruction using the World Wide Web. The assumptions address: the novice status of the Web course developer; the developer's appreciation for various aspects of the Web; her high expectations for doing it right; and her commitment to not incurring more costs for distance learners.…
Specification Patent Management for Web Application Platform Ecosystem
NASA Astrophysics Data System (ADS)
Fukami, Yoshiaki; Isshiki, Masao; Takeda, Hideaki; Ohmukai, Ikki; Kokuryo, Jiro
Diversified usage of web applications has encouraged the disintegration of the web platform into separate management of identification and of applications. Users make use of various kinds of data linked to their identity with multiple applications on social web platforms such as Facebook or MySpace. Competition has emerged among web application platforms. Platformers can design their relationships with developers by controlling patents on their own specifications and by adopting open technologies developed by external organizations. Platformers choose how far to open according to the features of the specification and their own position. Patent management of specifications has thus become a key success factor in building competitive web application platforms. Until now, the ways of attracting external developers, such as standardization and open source, have not been discussed and analyzed together.
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, and computing and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should shift their view of the software design process from solving a problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software are discussed.