Alternative browsers are gaining significant market share, and both Apple and Microsoft are releasing OS upgrades which portend some interesting changes in Web development. Of particular interest for language learning professionals may be new developments in the area of Web browser based applications, particularly using an approach dubbed "Ajax."…
Clark, Jason A.
Loman, Nicholas J; Pallen, Mark J
Hussaini, Syed Asadullah; Tabassum, S. Nasira; Baig, Tabassum, M. Khader
Zhao, Li; Wang, Wei; Wu, Yanbin; Huang, Ming
To support client-side interactive operation in the WebGIS publishing component of an agricultural macroscopic decision-making system, Ajax asynchronous communication and GWT-Ext were integrated into the WebGIS. The Ajax technique lets the browser retrieve part of a page's information from the server without reloading the whole page. GWT-Ext is a Web interface component library built on GWT (the Google Web Toolkit) and Ext JS. Because GWT-Ext uses the object-oriented Java language together with Ext components to develop Ajax applications, it is more efficient and shortens the development cycle. The method presented in this paper improves both server response speed and interactivity.
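The core Ajax idea here, that the browser fetches only part of the page from the server, can be sketched as follows (a minimal Python sketch with hypothetical names; not the paper's actual GWT-Ext code):

```python
# Sketch of an Ajax-style partial-content endpoint (hypothetical names).
# An XMLHttpRequest typically carries the "X-Requested-With" header, which
# lets the server return just a page fragment instead of the full document.

def render_fragment(layer_name):
    # Stand-in for real WebGIS rendering of a single map layer.
    return f"<div id='layer'>{layer_name}</div>"

def render_full_page(layer_name):
    return f"<html><body>{render_fragment(layer_name)}</body></html>"

def handle_request(headers, layer_name):
    if headers.get("X-Requested-With") == "XMLHttpRequest":
        return render_fragment(layer_name)   # partial update for Ajax clients
    return render_full_page(layer_name)      # full page for normal navigation
```

Frameworks like GWT generate this client/server plumbing automatically; the sketch only shows the request-discrimination idea.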
Yanagihara, Shintaro; Ishihara, Akira; Ishii, Toshinao; Kitsuki, Junichi; Seo, Kazuo
In recent years, with the spread of Web applications and the improving performance of Web browsers, demand for web-based supervisory control and data acquisition (WSCADA) systems based on RIAs (Rich Internet Applications) has increased. Various frameworks and libraries support the development of the CRUD operations (Create, Read, Update, Delete, corresponding to the basic database operations) of RIA-based web applications. To develop behavior operations, however, a great deal of code must still be written by hand. Since the typical operations of WSCADA are behavior operations, using RIA frameworks and libraries does not improve the productivity of WSCADA development. Conceptual models and development environments have been proposed for typical web applications consisting mostly of CRUD operations, but their counterparts for WSCADA remain an unsolved problem. This paper proposes a user interface model and a development environment for the monitoring user interface program of WSCADA. Focusing on the productivity of WSCADA development, we propose the Monitoring User Interface Model (MUM), an extension of the Model-View-Controller (MVC) model, and design an Ajax framework and development environment based on it. We define the DisplayItem as an advanced View and the MonitoringItem as an advanced Model, and divide the Controller into Interaction and Behavior. Our Ajax framework, based on standard web browser technologies, provides the mapping between conceptual model elements, and we define a domain-specific language for writing that mapping. The development environment auto-generates the Behavior program from the mapping. In this paper, we evaluate our model and development environment through the experimental development of a typical WSCADA. As a result, the development cost of the WSCADA based on our framework is only one fifth of that based on a typical Ajax library.
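The MonitoringItem-to-DisplayItem mapping idea can be illustrated with a minimal sketch (Python, with hypothetical tag and widget names; the paper's actual framework is Ajax/browser-based and generates this code from a DSL):

```python
# Hypothetical sketch: a declarative mapping table replaces hand-written
# Behavior code. Each monitored tag maps to a display widget plus a rule
# that turns the raw value into a presentation value.

MAPPING = {
    "pump1.status": ("lamp_pump1", lambda v: "green" if v == "RUN" else "red"),
    "tank1.level":  ("gauge_tank1", lambda v: min(max(v, 0), 100)),
}

def update_display(tag, value):
    """Generated-style Behavior: route a monitored value to its widget."""
    widget, rule = MAPPING[tag]
    return widget, rule(value)
```

The productivity claim in the abstract rests on generating functions like `update_display` from the mapping instead of writing them by hand.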
Mohammed, S; Orabi, A; Fiaidhi, J; Orabi, M
McLane, J. C.; Czech, W.; Yuen, D.; Greensky, J.; Knox, M. R.
We have designed a new software system for real-time interactive visualization of results taken directly from large-scale simulations of 3-D mantle convection and other large-scale simulations. This approach allows for intense visualization sessions of a couple of hours, as opposed to storing massive amounts of data in a storage system. Our data sets consist of 3-D data for volume rendering with over 10 million unknowns at each timestep. Large-scale visualization on a display wall holding around 13 million pixels has already been accomplished, with extension to hand-held devices such as the OQO, the Nokia N800, and recently the iPhone. We are developing web-based software in Java to extend the use of this system across long distances. The software is aimed at creating an interactive and functional application capable of running on multiple browsers by taking advantage of two AJAX-enabled web frameworks: Echo2 and Google Web Toolkit. The software runs in two modes, allowing a user either to control an interactive session or to observe a session controlled by another user. The modular build of the system allows components to be swapped out for new ones, so that other forms of visualization can be accommodated, such as molecular dynamics in mineral physics or 2-D data sets from lithospheric regional models.
statements by adding the instrumentation to the GWT UI classes, leaving the user code untouched. Some content management frameworks such as Drupal [12...Google web toolkit.” http://code.google.com/webtoolkit/.  “Form generation – drupal api.” http://api.drupal.org/api/group/form_api/6. 9
Home . Google. 26 ZK Project Home . ZK Ajax Framework. 27 Echo2 Project Home . Echo2 Ajax Framework. 28 ICEfaces Project Home . IceSoft...Technologies. 29 Dojo Project Home . Dojo Ajax Framework. 30 Apache XAP Project Home . Apache Software Foundation. 39 ZK Pros: • Lots of widgets...Postgraduate School Research Professor Arijit Das, and his Mobile Device Checkout requirement. After successfully getting the prototype working with the ZK
Geng, Xingyun; Jiang, Kui
In the application of a community residents' electronic health record system, the question arises of how to draw the child growth monitoring chart quickly and efficiently; this is the focus of this research, aimed at improving residents' experience with the system. The system combines the emerging Ajax and GDI+ technologies. The client uses a pre-designed Ajax Manager to handle residents' requests and sends XMLHTTP requests to the server. The server responds to each request, renders the graphics using GDI+ programming, and returns the result. The system ultimately publishes the child growth monitoring chart on the Web.
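The server-side charting step can be sketched as the mapping from (age, weight) samples to chart pixel coordinates before drawing (a hypothetical Python sketch with assumed axis ranges; the paper's actual rendering uses GDI+):

```python
# Hypothetical sketch: map (age, weight) samples to pixel coordinates
# on the growth chart canvas. The GDI+ drawing itself is omitted.

CHART_W, CHART_H = 400, 300    # assumed canvas size in pixels
AGE_MAX, WEIGHT_MAX = 60, 30   # assumed axis ranges: months, kg

def to_pixel(age_months, weight_kg):
    x = round(age_months / AGE_MAX * CHART_W)
    # Screen y axis points down, so invert the weight axis.
    y = round(CHART_H - weight_kg / WEIGHT_MAX * CHART_H)
    return x, y
```

The server would plot each returned point (and the reference percentile curves) into an image that the XMLHTTP response carries back to the browser.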
Newman, R. L.; Lindquist, K. G.; Vernon, F. L.; Davis, G. A.; Eakins, J.; Astiz, L.
Writing these sorts of science archive web applications is now possible because of significant breakthroughs in web technology over the last four years. The Web browser is no longer a glorified batch-processing terminal but an interactive environment that gives the user an experience similar to an installed desktop application. Taking advantage of this technology requires a significant amount of UI design and advanced interaction with the web server; a new level of sophistication is required to develop this sort of web application effectively. The IRSA group (NASA/IPAC Infrared Science Archive) is developing web-based software that takes full advantage of modern technology and is designed to be reused easily. This way we can add new missions and data sets without a large programming effort while keeping the advanced interface. We can now provide true web-based FITS viewing, data overlays, and interaction without any plugins. Our tabular display allows us to filter, sort, and interact with large amounts of data in ways that take advantage of the browser's power. This talk will show how we use AJAX technology, the Google Web Toolkit (GWT), and Java to develop a data archive that is both well designed and truly interactive.
Ruegg, C. S.
VIGIL,FRANK; REEDER,ROXANA G.
The Factsheets web application was conceived out of the requirement to create, update, publish, and maintain a web site with dynamic research and development (R and D) content. Before creating the site, a requirements discovery process was done in order to accurately capture the purpose and functionality of the site. One of the high priority requirements for the site would be that no specialized training in web page authoring would be necessary. All functions of uploading, creation, and editing of factsheets needed to be accomplished by entering data directly into web form screens generated by the application. Another important requirement of the site was to allow for access to the factsheet web pages and data via the internal Sandia Restricted Network and Sandia Open Network based on the status of the input data. Important to the owners of the web site would be to allow the published factsheets to be accessible to all personnel within the department whether or not the sheets had completed the formal Review and Approval (R and A) process. Once the factsheets had gone through the formal review and approval process, they could then be published both internally and externally based on their individual publication status. An extended requirement and feature of the site would be to provide a keyword search capability to search through the factsheets. Also, since the site currently resides on both the internal and external networks, it would need to be registered with the Sandia search engines in order to allow access to the content of the site by the search engines. To date, all of the above requirements and features have been created and implemented in the Factsheet web application. These have been accomplished by the use of flat text databases, which are discussed in greater detail later in this paper.
Avila, Edwin M. Martinez; Muniz, Ricardo; Szafran, Jamie; Dalton, Adam
Lines of code (LOC) analysis is one method used to measure programmer productivity and estimate schedules for programming projects. The Launch Control System (LCS) had previously used this method to estimate the amount of work and plan development efforts. The disadvantage of using LOC as a measure of effort is that coding accounts for only 30% to 35% of the total effort of a software project. Because of this disadvantage, the application uses function points instead of LOC for a better estimate of the hours needed to develop each piece of software. Jamie Szafran of the System Software Branch of Control And Data Systems (NE-C3) at Kennedy Space Center developed a web application called the Function Point Analysis (FPA) Depot. The objective of this web application is to let the LCS software architecture team use the data to more accurately estimate the effort required to implement customer requirements. This paper describes the evolution of the domain model used for function point analysis as project managers continually strive to generate more accurate estimates.
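The calculation that function point analysis is built around can be sketched with the standard IFPUG average complexity weights (a textbook illustration; the FPA Depot's actual domain model is not given in the abstract):

```python
# Unadjusted function point count using the standard IFPUG *average*
# complexity weights for the five function types.

AVG_WEIGHTS = {
    "EI": 4,    # external inputs
    "EO": 5,    # external outputs
    "EQ": 4,    # external inquiries
    "ILF": 10,  # internal logical files
    "EIF": 7,   # external interface files
}

def unadjusted_fp(counts):
    """counts: mapping of function type -> number of instances."""
    return sum(AVG_WEIGHTS[k] * n for k, n in counts.items())
```

For example, a requirement with 3 inputs, 2 outputs, and 1 internal file counts as 3*4 + 2*5 + 1*10 = 32 unadjusted function points, which can then be converted to hours with a historical productivity rate.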
Paulsworth, Ashley; Kurtz, Jim; Brun de Pontet, Stephanie
Sunvestment Energy Group (previously called Sunvestment Group) was established to create a web application that brings together site hosts, those who will obtain the energy from the solar array, with project developers and funders, including affinity investors. Sunvestment Energy Group (SEG) uses a community-based model that engages with investors who have some affinity with the site host organization. In addition to a financial return, these investors receive non-financial value from their investments and are therefore willing to offer lower cost capital. This enables the site host to enjoy more savings from solar through these less expensive Community Power Purchase Agreements (CPPAs). The purpose of this award was to develop an online platform to bring site hosts and investors together virtually.
Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.
their understanding of VoI attributes (source reliable , information content, and latency). The VoI web application emulates many features of a...built to allow the tool to be accessed from the Internet via a web browser. The following sections describe the web application’s user interface...based upon 3 attributes: source reliable , information content, and latency. The cards are divided into 4 decks: training, tactical, strategic, and
Chen, Hsinchun; Chau, Michael
Presents an overview of machine learning research and reviews methods used for evaluating machine learning systems. Ways that machine-learning algorithms were used in traditional information retrieval systems in the "pre-Web" era are described, and the field of Web mining and how machine learning has been used in different Web mining…
Goff, Samuel J.
Wood, Dylan; King, Margaret; Landis, Drew; Courtney, William; Wang, Runtang; Kelly, Ross; Turner, Jessica A.; Calhoun, Vince D.
Wood, Dylan; King, Margaret; Landis, Drew; Courtney, William; Wang, Runtang; Kelly, Ross; Turner, Jessica A; Calhoun, Vince D
Ibarra, A.; Kennedy, M.; Rodríguez, P.; Hernández, C.; Saxton, R.; Gabriel, C.
Lehmann Miotto, Giovanna; Magnoni, Luca; Sloper, John Erik
The ATLAS Trigger and Data Acquisition (TDAQ) infrastructure is responsible for filtering and transferring ATLAS experimental data from detectors to mass storage systems. It relies on a large, distributed computing system composed of thousands of software applications running concurrently. In such a complex environment, information sharing is fundamental for controlling application behavior, error reporting, and operational monitoring. During data taking, the streams of messages sent by applications and the data published via information services are constantly monitored by experts to verify the correctness of running operations and to understand problematic situations. To simplify and improve system analysis and error detection, we developed the TDAQ Analytics Dashboard, a web application that collects, correlates, and effectively visualizes this real-time flow of information. The TDAQ Analytics Dashboard is composed of two main entities that reflect the twofold scope of the application. The first is the engine, a Java service that performs aggregation, processing, and filtering of the real-time data stream and computes statistical correlations on sliding windows of time. The results are made available to clients via a simple web interface supporting an SQL-like query syntax. The second is the visualization, provided by an Ajax-based web application that runs in the client's browser. The dashboard approach presents information in a clear and customizable structure. Several types of interactive graphs are provided as widgets that can be dynamically added to and removed from visualization panels. Each widget acts as a client for the engine, querying the web interface to retrieve data matching the desired criteria. In this paper we present the design, development, and evolution of the TDAQ Analytics Dashboard. We also present the statistical analysis computed by the application in this first period of high-energy data-taking operations for the ATLAS experiment.
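The engine's statistical correlation on sliding windows of time can be illustrated with a minimal sketch (Python, assuming Pearson correlation of two metric streams; not the actual TDAQ Java code):

```python
# Sketch of sliding-window correlation between two metric streams,
# assuming Pearson's coefficient (the abstract does not name the statistic).
from math import sqrt

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / sqrt(sum((x - ma) ** 2 for x in a) *
                      sum((y - mb) ** 2 for y in b))

def sliding_correlation(xs, ys, window):
    """Correlation of each length-`window` slice as the window slides."""
    return [pearson(xs[i:i + window], ys[i:i + window])
            for i in range(len(xs) - window + 1)]
```

A dashboard widget would query the engine for exactly this kind of series, e.g. the correlation between an application's error-message rate and its published load metric.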
Murchie, S. L.; Adams, E. Y.; Mustard, J. F.; Rivkin, A.; Peplowski, P. N.
The Advanced Jovian Asteroid eXplorer (AJAX) is the first mission to characterize the geology, morphology, geophysical properties, and chemistry of a Trojan asteroid. The Decadal Survey outlined a notional New Frontiers class Trojan asteroid rendezvous mission to conduct geological, elemental composition, mineralogical, and geophysical investigations. AJAX, our Discovery mission proposal, addresses the Decadal Survey science goals with a focused payload and an innovative mission design. By responding to the most important questions about the Trojan asteroids, AJAX advances our understanding of the entire Solar System. Are these objects a remnant population of the local primordial material from which the outer planets and their satellites formed, or did they originate in the Kuiper Belt? Landed measurements of major and minor elements test hypotheses for the Trojan asteroids' origin, revealing the outer Solar System's dynamical history. How and when were prebiotic materials delivered to the terrestrial planets? AJAX's landed measurements include C and H concentrations, necessary to determine the inventories of volatiles and organic compounds, material delivered to the inner Solar System during the Late Heavy Bombardment. What chemical and geological processes shaped the small bodies that merged to form the planets in our Solar System? AJAX investigates the asteroid's internal structure, geology, and regolith by using global high-resolution stereo and multispectral imaging, determining density and estimating interior porosity by measuring gravity, and measuring regolith mechanical properties by landing. AJAX's science phase starts with a search for natural satellites and for dust lifted by possible cometary activity, together with determination of the asteroid's shape and pole position. AJAX then descends to lower altitudes for global mapping and conducts a low flyover for high-resolution surface characterization and measurement of hydrogen abundance. Finally, it deploys a small landed package, which
Lansky, M R
This paper explores the vicissitudes of shame and its relation to narcissistic rage and the escalation of conflict in Sophocles' Ajax. The plot is set in motion by Ajax's shame over losing the competition with Odysseus for Achilles' armor. His shame leads to narcissistic rage and propels him to vengeance against the social order. Misidentification, an aspect of narcissistic rage, compounds his disgrace by escalating his shame to suicidal proportions when his madness leaves him. His defenses all fail, and his suicide becomes inevitable. The forces that bind him to the social order lose out to those that make of him a humiliated outcast and drive him to kill himself.
Odier, J.; Albrand, S.; Fulachier, J.; Lambert, F.
... relevant issues. Features Include Select data based on mission, date and/or scientific parameter Output original data ... Details: Toolsets for Airborne Data (TAD) Web Application Category: Instrument Specific Search, ...
Garnier, Laurent; Geant4 Collaboration
Geant4 is a toolkit for the simulation of the passage of particles through matter. The Geant4 visualization system supports many drivers, including OpenGL, OpenInventor, HepRep, DAWN, VRML, RayTracer, gMocren, and ASCIITree, with diverse and complementary functionalities. Web applications have an increasing role in our work, and thanks to emerging frameworks such as Wt, a web application can be built on top of a C++ application without rewriting all of the code. Because the Geant4 toolkit's visualization and user interface modules are well decoupled from the rest of Geant4, it is straightforward to adapt them to render in a web application instead of the computer's native window manager. Since the API of the Wt framework closely matches that of Qt, our experience building the Qt driver carries over to the Wt driver. Porting a Geant4 application to the web is easy, and with minimal effort Geant4 users can replicate this process to share their own Geant4 applications in a web browser.
The Social Web constitutes a shift in information flow from the traditional Web. Previously, content was provided by the owners of a website for consumption by the end-user. Nowadays, these websites are being replaced by Social Web applications, which are frameworks for the publication of user-provided content. Traditionally, Web content could be `trusted' to some extent based on the site it originated from. Algorithms such as Google's PageRank were (and still are) used to compute the importance of a website based on analysis of the underlying link topology. In the Social Web, analysis of link topology merely tells us about the importance of the information framework which hosts the content. Consumers of information still need to know about the importance and reliability of the content they are reading, and therefore about the reliability of the producers of that content. Research into trust and reputation of the producers of information in the Social Web is still very much in its infancy. Every day, people are forced to make trusting decisions about strangers on the Web based on a very limited amount of information: for example, purchasing a product from an eBay seller with a `reputation' of 99%, downloading a file from a peer-to-peer application such as BitTorrent, or allowing Amazon.com to tell you what products you will like. Even something as simple as reading comments on a Web-blog requires the consumer to make a trusting decision about the quality of that information. In all of these cases, and indeed throughout the Social Web, there is a pressing demand for more information upon which we can base trusting decisions. This chapter examines the diversity of sources from which trust information can be harnessed within Social Web applications and discusses a high-level classification of those sources. Three different techniques for harnessing and using trust from a range of sources are presented. These techniques are deployed in two sample Social Web
We provide a description of work at the National Aeronautics and Space Administration (NASA) on building systems based on semantic-web concepts and technologies. NASA has been an early adopter of semantic-web technologies for practical applications. Indeed, there are several ongoing endeavors to build semantics-based systems for use in diverse NASA domains, ranging from collaborative scientific activity, accident and mishap investigation, and enterprise search to scientific information gathering and integration and aviation safety decision support. We provide a brief overview of many of these applications and ongoing work, with the goal of informing the external community of these NASA endeavors.
Shams, Khawaja; Norris, Jeff
This slide presentation accompanies a tutorial on the ReSTful (Representational State Transfer) web application. Using Open Services Gateway Initiative (OSGi), ReST uses HTTP protocol to enable developers to offer services to a diverse variety of clients: from shell scripts to sophisticated Java application suites. It also uses Eclipse for the rapid development, the Eclipse debugger, the test application, and the ease of export to production servers.
Choi, Yongsoon; Kim, Seonghoon; Park, Hong-Seong
ZigBee, a low-power wireless network technology based on the recent WPAN IEEE 802.15.4 standard, is regarded as one of the best technologies for building sensor networks and digital home networks. These fields require applications for monitoring, managing, controlling, and organizing the network nodes, but such programs have a weakness: it is hard to build a flexible system over the Internet. This paper uses a web server linked to a gateway to manage a sensor network composed of ZigBee nodes, and designs a system that manages ZigBee devices with relatively fast responsiveness using the AJAX web technique.
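The gateway-side bookkeeping the paper describes, collecting node status and serving it to an AJAX front end, can be sketched as follows (hypothetical Python sketch, not the paper's implementation):

```python
# Hypothetical sketch of the gateway-side piece: keep the latest status
# reported by each ZigBee node and serve it as JSON, which the browser's
# AJAX code would poll and render without reloading the page.
import json

node_status = {}  # node id -> last reported reading

def report(node_id, reading):
    """Called when the gateway receives a reading from a ZigBee node."""
    node_status[node_id] = reading

def status_json():
    """Response body for the AJAX polling endpoint."""
    return json.dumps(node_status, sort_keys=True)
```

The "relatively fast responsiveness" in the abstract comes from the browser polling (or being pushed) this small JSON payload instead of fetching a full page per update.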
This tool is a guideline for planning and checking for 508 compliance on web sites and web based applications. Determine which EIT components are covered or excepted, which 508 standards and requirements apply, and how to implement them.
Liu, Leping; Johnson, D. Lamont
This paper evaluates the quality of two major types of Web resources for K-12 education --information for research, and interactive applications for teaching and learning. It discusses an evaluation on the quality of 1,025 pieces of Web information (articles, research reports, news, and statistics) and 900 Web applications (tutorials, drills,…
... If you haven’t already noticed the link to the new SSE-GIS web application on the SSE homepage entitled “GIS Web Mapping Applications and Services”, we invite you to visit the site. The Surface meteorology and Solar Energy (SSE) v1.0.3 Web Mapping ...
Carrazza, Stefano; Ferrara, Alfio; Palazzo, Daniele; Rojo, Juan
We present APFEL Web, a Web-based application designed to provide a flexible user-friendly tool for the graphical visualization of parton distribution functions. In this note we describe the technical design of the APFEL Web application, motivating the choices and the framework used for the development of this project. We document the basic usage of APFEL Web and show how it can be used to provide useful input for a variety of collider phenomenological studies. Finally we provide some examples showing the output generated by the application.
Lu, Ming-te; Yeung, Wing-lok
Proposes a framework for commercial Web application development based on prior research in hypermedia and human-computer interfaces. First, its social acceptability is investigated. Next, economic, technical, operational, and organizational viability are examined. For Web-page design, the functionality and usability of Web pages are considered.…
Guerquin, Michal; McDermott, Jason E.; Frazier, Zach; Samudrala, Ram
The Bioverse is a framework for creating, warehousing, and presenting biological information based on hierarchical levels of organisation. The framework is guided by a deeper philosophy of representing all relationships between all components of biological systems, toward the goal of a holistic picture of organismal biology. Data from various sources are combined into a single repository, and a uniform interface is exposed to access it. The power of the Bioverse's approach is that, due to its inclusive nature, patterns emerge from the acquired data and new predictions are made. The implementation of this repository (beginning with acquisition of source data, processing in a pipeline, and concluding with storage in a relational database) and the interfaces to the data contained in it, from a programmatic application interface to a user-friendly web application, are discussed.
de Knikker, Remko; Guo, Youjun; Li, Jin-long; Kwan, Albert KH; Yip, Kevin Y; Cheung, David W; Cheung, Kei-Hoi
Background Very often genome-wide data analysis requires the interoperation of multiple databases and analytic tools. A large number of genome databases and bioinformatics applications are available through the web, but it is difficult to automate interoperation because: 1) the platforms on which the applications run are heterogeneous, 2) their web interface is not machine-friendly, 3) they use a non-standard format for data input and output, 4) they do not exploit standards to define application interface and message exchange, and 5) existing protocols for remote messaging are often not firewall-friendly. To overcome these issues, web services have emerged as a standard XML-based model for message exchange between heterogeneous applications. Web services engines have been developed to manage the configuration and execution of a web services workflow. Results To demonstrate the benefit of using web services over traditional web interfaces, we compare the two implementations of HAPI, a gene expression analysis utility developed by the University of California San Diego (UCSD) that allows visual characterization of groups or clusters of genes based on the biomedical literature. This utility takes a set of microarray spot IDs as input and outputs a hierarchy of MeSH Keywords that correlates to the input and is grouped by Medical Subject Heading (MeSH) category. While the HTML output is easy for humans to visualize, it is difficult for computer applications to interpret semantically. To facilitate the capability of machine processing, we have created a workflow of three web services that replicates the HAPI functionality. These web services use document-style messages, which means that messages are encoded in an XML-based format. We compared three approaches to the implementation of an XML-based workflow: a hard coded Java application, Collaxa BPEL Server and Taverna Workbench. The Java program functions as a web services engine and interoperates with these web
"Just-in-time" database-driven Web applications are inexpensive, quickly-developed software that can be put to many uses within a health care organization. Database-driven Web applications garnered 73873 hits on our system-wide intranet in 2002. They enabled collaboration and communication via user-friendly Web browser-based interfaces for both mission-critical and patient-care-critical functions. Nineteen database-driven Web applications were developed. The application categories that comprised 80% of the hits were results reporting (27%), graduate medical education (26%), research (20%), and bed availability (8%). The mean number of hits per application was 3888 (SD = 5598; range, 14-19879). A model is described for just-in-time database-driven Web application development and an example given with a popular HTML editor and database program. PMID:14517109
Lamongie, Julien R.
6. Photocopy of photograph showing an Ajax and Hercules Missile from ARADCOM Argus pg. 3, from Institute for Military History, Carlisle Barracks, Carlisle, PA, October 1, 1958 - NIKE Missile Battery PR-79, East Windsor Road south of State Route 101, Foster, Providence County, RI
7. Photocopy of photograph showing four Ajax missiles in launch position from ARADCOM Argus pg. 14, from Institute for Military History, Carlisle Barracks, Carlisle, PA, October 1, 1963 - NIKE Missile Battery PR-79, East Windsor Road south of State Route 101, Foster, Providence County, RI
Hsieh, Yichuan; Brennan, Patricia Flatley
Traditional consumer health informatics (CHI) applications developed for the lay public on the Web were commonly written in Hypertext Markup Language (HTML). As genetics knowledge rapidly advances and requires information to be updated in a timely fashion, a different content structure is needed to facilitate information delivery. This poster will present the process of developing a dynamic database-driven Web CHI application.
The Web-based Interspecies Correlation Estimation (Web-ICE) modeling application is available to the risk assessment community through a user-friendly internet platform (http://epa.gov/ceampubl/fchain/webice/). ICE models are log-linear least square regressions that predict acute...
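The ICE models are described as log-linear least-squares regressions; assuming the usual log10-log10 form, log10(y) = a + b*log10(x), a minimal fit-and-predict sketch looks like this (illustrative data, not actual Web-ICE coefficients):

```python
# Log-linear least-squares regression: fit log10(y) = a + b*log10(x)
# and predict back on the original scale.
from math import log10

def fit_loglinear(xs, ys):
    lx = [log10(x) for x in xs]
    ly = [log10(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(lx, ly)) /
         sum((x - mx) ** 2 for x in lx))
    a = my - b * mx
    return a, b

def predict(a, b, x):
    return 10 ** (a + b * log10(x))
```

In Web-ICE, x would be the acute toxicity value for a surrogate species and y the predicted value for the species of interest.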
Fukami, Yoshiaki; Isshiki, Masao; Takeda, Hideaki; Ohmukai, Ikki; Kokuryo, Jiro
Diversified usage of web applications has encouraged the disintegration of the web platform into identity management and applications. Users make use of various kinds of data linked to their identity across multiple applications on social web platforms such as Facebook or MySpace, and competition has emerged among web application platforms. Platform operators can shape their relationships with developers by controlling the patents on their own specifications and by adopting open technologies developed by external organizations; they choose how open to be according to the features of the specification and their market position. Patent management of specifications has thus become a key success factor in building competitive web application platforms. Yet the various ways of attracting external developers, such as standardization and open source, have not previously been discussed and analyzed together.
Yeung, Daniel; Boes, Peter; Ho, Meng Wei; Li, Zuofeng
Image-guided radiotherapy (IGRT), based on radiopaque markers placed in the prostate gland, was used for proton therapy of prostate patients. Orthogonal X-rays and the IBA Digital Image Positioning System (DIPS) were used for setup correction prior to treatment and were repeated after treatment delivery. Following a rationale for margin estimates similar to that of van Herk,(1) the daily post-treatment DIPS data were analyzed to determine whether an adaptive radiotherapy plan was necessary. A Web application using ASP.NET MVC5, Entity Framework, and an SQL database was designed to automate this process. The design featured state-of-the-art Web technologies, a domain model closely matching the workflow, a database supporting concurrency and data mining, access to the DIPS database, secure user access and role management, and graphing and analysis tools. The Model-View-Controller (MVC) paradigm allowed clean domain logic, unit testing, and extensibility. Client-side technologies such as jQuery, jQuery plugins, and Ajax were adopted to achieve a rich user environment and fast response. Data models included patients, staff, treatment fields and records, correction vectors, DIPS images, and association logic. Data entry, analysis, workflow logic, and notifications were implemented. The system effectively modeled the clinical workflow and IGRT process.
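The van Herk-style margin recipe referenced above is commonly written as margin = 2.5 Σ + 0.7 σ, where Σ is the systematic and σ the random setup error; a one-line sketch (assuming the paper used this standard form):

```python
# Standard van Herk PTV margin recipe: 2.5 * systematic + 0.7 * random,
# with both error components expressed in the same unit (e.g. mm).
def van_herk_margin(systematic_mm, random_mm):
    return 2.5 * systematic_mm + 0.7 * random_mm
```

The web application's analysis step would compare margins computed this way from the daily post-treatment DIPS data against the planned margins to flag cases needing an adaptive plan.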
Palmer, Grant; Arnold, James O. (Technical Monitor)
There are many advantages to developing web-based scientific applications. Any number of people can access the application concurrently. The application can be accessed from a remote location. The application becomes essentially platform-independent because it can be run from any machine that has internet access and can run a web browser. Maintenance and upgrades to the application are simplified since only one copy of the application exists in a centralized location. This paper details the creation of web-based applications using Java servlets. Java is a powerful, versatile programming language that is well suited to developing web-based programs. A Java servlet provides the interface between the central server and the remote client machines. The servlet accepts input data from the client, runs the application on the server, and sends the output back to the client machine. The type of servlet that supports the HTTP protocol will be discussed in depth. Among the topics the paper will discuss are how to write an http servlet, how the servlet can run applications written in Java and other languages, and how to set up a Java web server. The entire process will be demonstrated by building a web-based application to compute stagnation point heat transfer.
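The request/response cycle described above (the servlet accepts input from the client, runs the application on the server, and returns the output) can be sketched with Python's standard-library `http.server` as a language-neutral analogy to an HTTP servlet. This is an illustrative sketch, not the paper's Java code; the handler name and the trivial `run_application` stand-in for the scientific code are assumptions:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_application(params):
    """Stand-in for the server-side scientific code (here: squaring a number)."""
    x = params["x"]
    return {"result": x * x}

class ApplicationServlet(BaseHTTPRequestHandler):
    """Accept input from the client, run the application, send the output back."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        params = json.loads(self.rfile.read(length))
        body = json.dumps(run_application(params)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def serve(port=8000):
    """Run the 'servlet' on the central server."""
    HTTPServer(("", port), ApplicationServlet).serve_forever()
```

Because all state lives on the server, any client with a browser (or any HTTP client) can POST inputs and receive results, which is the platform-independence argument made above.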
Hampton, J.; Simons, R.
This document describes the application design philosophy for the Comprehensive Nuclear Test Ban Treaty Research & Development Web Site. This design incorporates object-oriented techniques to produce a flexible and maintainable system of applications that support the web site. These techniques will be discussed at length along with the issues they address. The overall structure of the applications and their relationships with one another will also be described. The current problems and future design changes will be discussed as well.
Yildiz, Sevda Goktepe; Korpeoglu, Seda Goktepe
In recent years, WebQuests have received a great deal of attention and have been used effectively in teaching-learning process in various courses. In this study, a WebQuest that can be applicable in teaching topological concepts for undergraduate level students was prepared. A number of topological concepts, such as countability, infinity, and…
Rose, Alexander S.; Hildebrand, Peter W.
The NGL Viewer (http://proteinformatics.charite.de/ngl) is a web application for the visualization of macromolecular structures. By fully adopting capabilities of modern web browsers, such as WebGL, for molecular graphics, the viewer can interactively display large molecular complexes and is also unaffected by the retirement of third-party plug-ins like Flash and Java Applets. Generally, the web application offers comprehensive molecular visualization through a graphical user interface so that life scientists can easily access and profit from available structural data. It supports common structural file-formats (e.g. PDB, mmCIF) and a variety of molecular representations (e.g. ‘cartoon, spacefill, licorice’). Moreover, the viewer can be embedded in other web sites to provide specialized visualizations of entries in structural databases or results of structure-related calculations. PMID:25925569
Seidensticker, R. G.
The dendritic web process for growing long thin ribbon crystals of silicon and other semiconductors is described. Growth is initiated from a thin wirelike dendrite seed which is brought into contact with the melt surface. Initially, the seed grows laterally to form a button at the melt surface; when the seed is withdrawn, needlelike dendrites propagate from each end of the button into the melt, and the web portion of the crystal is formed by the solidification of the liquid film supported by the button and the bounding dendrites. Apparatus used for dendritic web growth, material characteristics, and the two distinctly different mechanisms involved in the growth of a single crystal are examined. The performance of solar cells fabricated from dendritic web material is indistinguishable from the performance of cells fabricated from Czochralski grown material.
... AFFAIRS Proposed Information Collection (Internet Student CPR Web Registration Application); Comment... solicits comments on information needed to establish an online web registration application. DATES: Written... use of other forms of information technology. Title: Internet Student CPR Web Registration...
Hasan, M. R.; Ibrahimy, M. I.; Motakabber, S. M. A.; Ferdaus, M. M.; Khan, M. N. H.; Mostafa, M. G.
This paper describes a technique for developing a web-based financial system that follows current technology and business needs. In web application development, both user-friendliness and technology are important. The system was developed on the ASP.NET MVC 4 platform with SQL Server 2008, and the paper shows that its data-entry and report-monitoring features are user friendly. It also highlights critical situations encountered during development, which should help others develop a quality product.
The U.S. Department of Energy (DOE) requires all employees who hold a security clearance and have access to classified information and/or special nuclear material to be trained in Safeguards and Security. Since the advent of the World Wide Web, training personnel have capitalized on this communication medium to develop and deliver Web-based training. Unlike traditional computer-based training, where the student had to find a workstation on which the training program resided, one of Web-based training's strongest advantages is that the training can be delivered right to the worker's desktop computer. This paper addresses the driving forces behind the adoption of Web-based training at the Laboratory, with a brief explanation of the different types of training conducted, and briefly discusses the forms of distance learning used in conjunction with Web-based training. It describes the implementation strategy and how the Laboratory used a Web-Based Standards Committee to develop standards for Web-based training applications. Problems that resulted from little or no communication between training personnel across the Laboratory are touched on, along with how they were solved. Also discussed is the development of a "Virtual Training Center" where personnel can shop online for their training needs. Web-based training programs within the Safeguards and Security arena are briefly covered, with particular attention to Materials Control and Accountability, and an example illustrates what a student would experience during a training session. A short closing statement considers what the future holds for Web-based training.
Mahler, Simon A; Wagner, Mary-Jo; Church, Amy; Sokolosky, Mitchell; Cline, David M
Emergency Medicine (EM) residency program web sites are an important tool that programs use to attract applicants. However, there are only a few studies examining the aspects of a program's web site that are most important to EM applicants. We conducted a cross-sectional study of 142 prospective residency applicants interviewing for an EM position at one of three EM residency programs for the 2003 match. The survey demonstrated that almost all applicants researched EM programs online. The majority (71%) identified geographic location as the most important factor in applying to a specific program. Approximately 40% considered an easily navigated web site as very/moderately important to their application decision-making process. Rotation schedule was also important in applicant decision-making. The Internet is a significant source of information to the majority of applicants in EM. Online information from programs' web sites, although not as significant as geography, influences an applicant's choice of where to apply for a residency position. An easily navigated, complete web site may improve the recruitment of candidates to EM residency programs.
Ueno; Asai; Arita
We have constructed a general framework for integrating application programs with control through a local Web browser. The method is based on a simple inter-process message function from an external process to application programs. Commands to a target program are prepared in a script file, which is parsed by a message-dispatcher program. When the dispatcher is used as a helper application to a Web browser, these messages can be sent from the browser by clicking a hyperlink in a Web document. Our framework also supports pluggable extension modules for application programs by means of dynamic linking. A prototype system was implemented on our molecular structure viewer program, MOSBY. It successfully provided a function to load, from a Web page, an extension module required for docking studies of molecular fragments. Our simple framework facilitates the concise configuration of Web software without requiring detailed knowledge of network computation and security issues, and it is applicable to a wide range of network computations that process private data through a Web browser.
Web applications have become the most popular medium on the Internet. This popularity and the ease of use of web scripting languages and frameworks, combined with careless development, result in a large number of web application vulnerabilities and a large number of attacks. Several types of attacks are possible because of improper input validation: SQL injection, cross-site scripting, cross-site request forgery (CSRF), web spam in blogs, and others. To secure web applications, intrusion detection systems (IDS) and intrusion prevention systems (IPS) are used. Intrusion detection systems fall into two groups: misuse detection (traditional IDS) and anomaly detection. This paper presents a data-mining-based algorithm for anomaly detection. The principle of the method is to compare incoming HTTP traffic with a previously built profile that contains a representation of the "normal" or expected web application usage sequence patterns; the frequent sequence patterns are found with the GSP algorithm. A previously presented detection method was rewritten and improved. Tests show that the software catches malicious requests, especially long attack sequences, performs quite well on medium-length sequences, and must be complemented with other methods for short sequences.
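The profile-comparison principle described in this abstract can be sketched minimally. Note the hedge: GSP proper mines frequent subsequences that may contain gaps, whereas this illustrative sketch uses contiguous request n-grams; all names and thresholds are assumptions, not the paper's implementation:

```python
from collections import Counter

def frequent_patterns(training_sessions, n=2, min_support=2):
    """Build the 'normal' profile: request n-grams that occur at
    least `min_support` times across the training sessions."""
    counts = Counter()
    for session in training_sessions:
        for i in range(len(session) - n + 1):
            counts[tuple(session[i:i + n])] += 1
    return {pattern for pattern, c in counts.items() if c >= min_support}

def anomaly_score(session, profile, n=2):
    """Fraction of the session's n-grams absent from the profile
    (0.0 = fully normal, 1.0 = fully anomalous)."""
    grams = [tuple(session[i:i + n]) for i in range(len(session) - n + 1)]
    if not grams:
        return 0.0
    unknown = sum(1 for g in grams if g not in profile)
    return unknown / len(grams)
```

A session that follows the learned navigation patterns scores near 0, while a long attack sequence of never-seen request pairs scores near 1, matching the abstract's observation that long attack sequences are the easiest to catch.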
Luaces, Miguel R.; Pedreira, Oscar; Places, Ángeles S.; Seco, Diego
Tracy, Fran; Jordan, Katy
This paper draws upon the experience of an interdisciplinary research group in engaging undergraduate university students in the design and development of semantic web technologies. A flexible approach to participatory design challenged conventional distinctions between "designer" and "user" and allowed students to play a role…
Wang, Xin; Ye, Yan; Qi, Jiahui; Wu, Min
Because of the request-response restrictions of the HTTP protocol, traditional web applications cannot push real-time information from the server to the browser; workarounds exist, but they have obvious shortcomings. This paper introduces the design and server-side development of an English testing system based on the WebSocket protocol. Through WebSocket, a bidirectional communication channel is established between the browser and the server, realizing real-time communication in the English test system.
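The bidirectional channel is established by upgrading an ordinary HTTP connection: the server proves it understands WebSocket by hashing the client's `Sec-WebSocket-Key` with a fixed GUID, as defined in RFC 6455. A minimal sketch of that handshake computation (the function name is ours):

```python
import base64
import hashlib

# Fixed GUID defined by RFC 6455, section 1.3.
WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"

def websocket_accept(sec_websocket_key: str) -> str:
    """Compute the Sec-WebSocket-Accept header value the server must
    return to complete the upgrade from HTTP to WebSocket."""
    digest = hashlib.sha1((sec_websocket_key + WS_GUID).encode("ascii")).digest()
    return base64.b64encode(digest).decode("ascii")
```

After this handshake succeeds, either side can send frames at any time, which is what removes the "server cannot push" limitation of plain HTTP.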
...overthrew Iran's elected prime minister, Dr. Mohammed Mossadeq. The coup, titled Operation Ajax, coincided with the early Cold War years and the development of the nascent... 1 The Early Cold War Years (1947-1953)
...such as information retrieval from the World Wide Web is an example of client-server computing. However, the term is generally applied to systems in... developed and built a prototype capable of retrieving information considered pertinent for reported terrorist incidents worldwide in support of this... and Sons Ltd. Clouse, S. (n.d.). Web-based applications vs. client-server software. Fcny.org. Retrieved August 29, 2011, from http://metrix.fcny.org
Di Benedetto, M.; Corsini, M.; Scopigno, R.
Hariadi, Bambang; Dewiyani Sunarto, M. J.; Sudarmaningtyas, Pantjawati
This study aimed to develop a web-based learning application as a form of learning revolution. The form of learning revolution includes the provision of unlimited teaching materials, real time class organization, and is not limited by time or place. The implementation of this application is in the form of hybrid learning by using Google Apps for…
Guthrie, J.D.; Dartiguenave, C.; Ries, Kernell G.
StreamStats is a U.S. Geological Survey Web-based GIS application developed as a tool for water-resources planning and management, engineering design, and other applications. StreamStats' primary functionality allows users to obtain drainage-basin boundaries, basin characteristics, and streamflow statistics for gaged and ungaged sites. Recently, Web services have been developed that give remote users and applications access to the comprehensive GIS tools available in StreamStats, including delineating drainage-basin boundaries, computing basin characteristics, estimating streamflow statistics for user-selected locations, and determining point features that coincide with a National Hydrography Dataset (NHD) reach address. For the state of Kentucky, a Web service also has been developed that lets users estimate daily time series of drainage-basin average values of daily precipitation and temperature. The use of Web services allows users to take full advantage of the datasets and processes behind the StreamStats application without having to develop and maintain them. © 2009 IEEE.
Mao, Jian; Chen, Yue; Shi, Futian; Jia, Yaoqi; Liang, Zhenkai
Web applications have become the foundation of many types of systems, ranging from cloud services to Internet of Things (IoT) systems. Due to the large amount of sensitive data processed by web applications, user privacy emerges as a major concern in web security. Existing protection mechanisms in modern browsers, e.g., the same origin policy, prevent the users' browsing information on one website from being directly accessed by another website. However, web applications executed in the same browser share the same runtime environment. Such shared states provide side channels for malicious websites to indirectly figure out the information of other origins. Timing is a classic side channel and the root cause of many recent attacks, which rely on the variations in the time taken by the systems to process different inputs. In this paper, we propose an approach to expose the timing-based probing attacks in web applications. It monitors the browser behaviors and identifies anomalous timing behaviors to detect browser probing attacks. We have prototyped our system in the Google Chrome browser and evaluated the effectiveness of our approach by using known probing techniques. We have applied our approach on a large number of top Alexa sites and reported the suspicious behavior patterns with corresponding analysis results. Our theoretical analysis illustrates that the effectiveness of the timing-based probing attacks is dramatically limited by our approach.
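The detection idea above, monitoring browser behavior and flagging anomalous timing activity, can be sketched as a rate check: a probing script typically queries a high-resolution timer far more often than benign pages. This is an illustrative sketch under our own assumptions (window size, threshold, and function name are not from the paper):

```python
def flags_probing(timer_calls, window=1.0, threshold=1000):
    """Flag a page that queries the high-resolution timer at an abnormal
    rate: more than `threshold` calls inside any sliding window of
    `window` seconds. `timer_calls` is a sorted list of call timestamps
    in seconds."""
    lo = 0
    for hi, t in enumerate(timer_calls):
        # Shrink the window from the left until it spans <= `window` seconds.
        while t - timer_calls[lo] > window:
            lo += 1
        if hi - lo + 1 > threshold:
            return True
    return False
```

A real monitor would of course track many more behaviors than timer-call rates, but the sliding-window pattern is the common core of such anomaly checks.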
Pritychenko,B.; Sonzogni, A.A.
We present Sigma Web interface which provides user-friendly access for online analysis and plotting of the evaluated and experimental nuclear reaction data stored in the ENDF-6 and EXFOR formats. The interface includes advanced browsing and search capabilities, interactive plots of cross sections, angular distributions and spectra, nubars, comparisons between evaluated and experimental data, computations for cross section data sets, pre-calculated integral quantities, neutron cross section uncertainties plots and visualization of covariance matrices. Sigma is publicly available at the National Nuclear Data Center website at http://www.nndc.bnl.gov/sigma.
Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.
Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the
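The "wrap with as little modification as possible" approach described above can be sketched minimally: the legacy C/FORTRAN code remains an opaque executable, and the service layer only marshals its command-line arguments and stdin, then captures stdout as the service result. The helper name here is illustrative, not SCEC's actual framework:

```python
import subprocess

def run_legacy_code(executable, args, stdin_text=""):
    """Wrap an existing command-line scientific code without modifying it:
    pass inputs through argv and stdin, capture stdout as the result,
    and raise if the code exits with a nonzero status."""
    proc = subprocess.run(
        [executable, *args],
        input=stdin_text,
        capture_output=True,
        text=True,
        check=True,
    )
    return proc.stdout
```

Any of the three access methods listed above (web GUI, command-line initiator, or direct Java call) would ultimately invoke a wrapper like this on the remote machine, so the original code's build and behavior stay untouched.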
Fee, J.; Martinez, E.
USGS Earthquake web applications provide access to earthquake information from USGS and other Advanced National Seismic System (ANSS) contributors. One of the primary goals of these applications is to provide a consistent experience for accessing both near-real time information as soon as it is available and historic information after it is thoroughly reviewed. Millions of people use these applications every month including people who feel an earthquake, emergency responders looking for the latest information about a recent event, and scientists researching historic earthquakes and their effects. Information from multiple catalogs and contributors is combined by the ANSS Comprehensive Catalog into one composite catalog, identifying the most preferred information from any source for each event. A web service and near-real time feeds provide access to all contributed data, and are used by a number of users and software packages. The Latest Earthquakes application displays summaries of many events, either near-real time feeds or custom searches, and the Event Page application shows detailed information for each event. Because all data is accessed through the web service, it can also be downloaded by users. The applications are maintained as open source projects on github, and use mobile-first and responsive-web-design approaches to work well on both mobile devices and desktop computers. http://earthquake.usgs.gov/earthquakes/map/
Brigham, Tara J
Library instruction in academic, health sciences, and hospital libraries is an evolving concept. Content, the intended learner, and various teaching models influence the creation of library instructional tutorials and presentations. Standard programs such as PowerPoint and Keynote are often used to generate these instructional materials. However, new and dynamic web-based presentation applications have the potential to improve the learning experience for patients and health care professionals. This column will briefly touch on library instruction and standard presentation creators, but will mainly concentrate on six web-based applications that can be used for the creation of library instructional presentations.
Burger, Dan; Stassun, Keivan G.; Pepper, Joshua; Siverd, Robert J.; Paegert, Martin; De Lee, Nathan M.; Robinson, William H.
Filtergraph is a web application being developed and maintained by the Vanderbilt Initiative in Data-intensive Astrophysics (VIDA) to flexibly and rapidly visualize a large variety of astronomy datasets of various formats and sizes. The user loads a flat-file dataset into Filtergraph which automatically generates an interactive data portal that can be easily shared with others. From this portal, the user can immediately generate scatter plots of up to five dimensions as well as histograms and tables based on the dataset. Key features of the portal include intuitive controls with auto-completed variable names, the ability to filter the data in real time through user-specified criteria, the ability to select data by dragging on the screen, and the ability to perform arithmetic operations on the data in real time. To enable seamless data visualization and exploration, changes are quickly rendered on screen and visualizations can be exported as high quality graphics files. The application is optimized for speed in the context of large datasets: for instance, a plot generated from a stellar database of 3.1 million entries renders in less than 2 s on a standard web server platform. This web application has been created using the Web2py web framework based on the Python programming language. Filtergraph is free to use at http://filtergraph.vanderbilt.edu/.
Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey
A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
Monnerville, M.; Sémah, G.
The article proposes an intelligent framework for supporting Web-based applications. The framework focuses on innovative use of existing resources and technologies in the form of services, and leverages the theoretical foundations of services science and research from services computing. Its main focus is to deliver benefits to users in various roles, such as service requesters, service providers, and business owners, maximizing their productivity when engaging with each other via the Web. The article opens with research motivations and questions, analyses the existing state of research in the field, and describes the approach taken in implementing the proposed framework. Finally, an e-health application is discussed to evaluate the effectiveness of the framework, in which participants such as general practitioners (GPs), patients, and health-care workers collaborate via the Web.
Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.
This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP scripting language, the MySQL DBMS, and the Apache web server on a GNU/Linux platform. SWA was built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology, such as ontology-based homology and multiple whole-genome comparisons, which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel-processing job on a dedicated cluster.
Casson, William H. Jr.
The Adversarial Route Analysis Tool is a sort of Google Maps for adversaries: a web-based geospatial application that helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.
Ardizzone, Valeria; Bruno, Riccardo; Calanducci, Antonio; Carrubba, Carla; Fargetta, Marco; Ingrà, Elisa; Inserra, Giuseppina; La Rocca, Giuseppe; Monforte, Salvatore; Pistagna, Fabrizio; Ricceri, Rita; Rotondo, Riccardo; Scardaci, Diego; Barbera, Roberto
In this paper we present the architecture of a framework for building Science Gateways supporting official standards both for user authentication and authorization and for middleware-independent job and data management. Two use cases of the customization of the Science Gateway framework for Semantic-Web-based life science applications are also described.
Jerzy Nogiec; Kelley Trombly-Freytag; Dana Walbridge
Although many general-purpose frameworks have been developed to aid in web application development, they typically tend to be both comprehensive and complex. To address this problem, a specialized server-side Java framework designed specifically for data retrieval and visualization has been developed. The framework's focus is on maintainability and data security. The functionality is rich with features necessary for simplifying data display design, deployment, user management and application debugging, yet the scope is deliberately kept limited to allow for easy comprehension and rapid application development. The system clearly decouples the application processing and visualization, which in turn allows for clean separation of layout and processing development. Duplication of standard web page features such as toolbars and navigational aids is therefore eliminated. The framework employs the popular Model-View-Controller (MVC) architecture, but it also uses the filter mechanism for several of its base functionalities, which permits easy extension of the provided core functionality of the system.
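The filter mechanism described above, where core functionality is supplied by small composable filters wrapped around the MVC processing, can be sketched as a chain in which each filter receives the request plus a callable for the next step. This is a generic sketch of the pattern, not the framework's Java API; all names are illustrative:

```python
def make_pipeline(filters, controller):
    """Compose filters around a controller, servlet-filter style: each
    filter gets the request and a `next_step` callable, and may short-
    circuit, modify the request, or pass it along unchanged."""
    def invoke(request):
        def step(i, req):
            if i == len(filters):
                return controller(req)   # end of chain: run the controller
            return filters[i](req, lambda r: step(i + 1, r))
        return step(0, request)
    return invoke

def auth_filter(request, next_step):
    """Example base functionality as a filter: reject anonymous requests."""
    if not request.get("user"):
        return {"status": 401}
    return next_step(request)
```

Because features like user management or debugging hooks are just filters, the controller and the visualization layer stay decoupled, which is the maintainability argument the abstract makes.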
Qiu, Fang; Ni, Feng; Chastain, Bryan; Huang, Haiting; Zhao, Peisheng; Han, Weiguo; Di, Liping
GRASS is a well-known geographic information system developed more than 30 years ago. As one of the earliest GIS systems, GRASS survives today mainly as free, open-source desktop GIS software, with users primarily limited to the research community and to programmers who use it to create customized functions. To allow average GIS end users to continue taking advantage of this widely used software, we developed the GRASS Web Application Software System (GWASS), a distributed, web-based, multi-tiered Geospatial Information System (GIS) built on top of the GeoBrain web service, a project sponsored by NASA using the latest service-oriented architecture (SOA). This SOA-enabled system offers an effective and practical alternative to current commercial desktop GIS solutions. With GWASS, all geospatial processing and analyses are conducted by the server, so users are not required to install any software on the client side, which reduces the cost of access. The only resource needed to use GWASS is access to the Internet, and anyone who knows how to use a web browser can operate the system. The SOA framework revitalizes GRASS as a new means of bringing powerful geospatial analysis and resources to more users with concurrent access.
A new and improved version of the ASDC MOPITT Search and Subset Web Application was released on Friday, June 24, 2016. New features include: Versions 5 and 6 ...
Gray, Alasdair J G; Sadler, Jason; Kit, Oles; Kyzirakos, Kostis; Karpathiotakis, Manos; Calbimonte, Jean-Paul; Page, Kevin; García-Castro, Raúl; Frazer, Alex; Galpin, Ixent; Fernandes, Alvaro A A; Paton, Norman W; Corcho, Oscar; Koubarakis, Manolis; De Roure, David; Martinez, Kirk; Gómez-Pérez, Asunción
Sensing devices are increasingly being deployed to monitor the physical world around us. One class of application for which sensor data is pertinent is environmental decision support systems, e.g., flood emergency response. For these applications, the sensor readings need to be put in context by integrating them with other sources of data about the surrounding environment. Traditional systems for predicting and detecting floods rely on methods that need significant human resources. In this paper we describe a semantic sensor web architecture for integrating multiple heterogeneous datasets, including live and historic sensor data, databases, and map layers. The architecture provides mechanisms for discovering datasets, defining integrated views over them, continuously receiving data in real-time, and visualising on screen and interacting with the data. Our approach makes extensive use of web service standards for querying and accessing data, and semantic technologies to discover and integrate datasets. We demonstrate the use of our semantic sensor web architecture in the context of a flood response planning web application that uses data from sensor networks monitoring the sea-state around the coast of England.
Li, Zhao; Liu, Nan; Liu, Renyi; Bao, Weizheng
This paper analyzes the need for applying WebGIS to CATV basic network management. At the time of writing there had been almost no research in China on using WebGIS to manage CATV basic network data. The paper briefly describes the construction of an integrated transaction management system for the CATV basic network based on WebGIS, built around a database of key points, lines, and buildings. A WebGIS-based system has a strong advantage over traditional desktop GIS: anyone can access it from anywhere over a wired or wireless connection, so managers can query and act on information about the network, equipment, and customers, making routine network and equipment maintenance faster and easier. WebGIS also has bottlenecks, however, notably vector graphic editing. Editing of vector graphics and attributes plays an important role in GIS, and especially in CATV basic network management, because equipment, cables, and fiber optics change frequently. The paper develops the Zhejiang Jiangshan broadcasting and TV station geographic information system on the ArcIMS platform, using ArcSDE's ability to read and write spatial data to solve the vector graphic editing problem.
SSE-GIS v1.03 Web Mapping Application Now Available Wednesday, July 6, 2016 ... you haven’t already noticed the link to the new SSE-GIS web application on the SSE homepage entitled “GIS Web Mapping Applications and Services”, we invite you to visit the site. ...
Keng, Tan Chin; Ching, Yeoh Kah
The use of web applications has become a trend in many disciplines including education. In view of the influence of web application in education, this study examines web application technologies that could enhance undergraduates' learning experiences, with focus on Quantity Surveying (QS) and Information Technology (IT) undergraduates. The…
Zavala Romero, O.; Ahmed, A.; Chassignet, E.; Zavala-Hidalgo, J.
Kreutel, Jörn; Gerlach, Andrea; Klekamp, Stefanie; Schulz, Kristin
We describe the ideas and results of an applied research project that aims at leveraging the expressive power of semantic web technologies as a server-side backend for mobile applications that provide access to location and multimedia data and allow for a rich user experience in mobile scenarios, ranging from city and museum guides to multimedia enhancements of any kind of narrative content, including e-book applications. In particular, we will outline a reusable software architecture for both server-side functionality and native mobile platforms that is aimed at significantly decreasing the effort required for developing particular applications of that kind.
Huang, Yu S; Horton, Matthew; Vilhjálmsson, Bjarni J; Seren, Umit; Meng, Dazhe; Meyer, Christopher; Ali Amer, Muhammad; Borevitz, Justin O; Bergelson, Joy; Nordborg, Magnus
Tso, Kam S.; Pajevski, Michael J.
Cybersecurity has become a great concern as threats of service interruption, unauthorized access, stealing and altering of information, and spreading of viruses have become more prevalent and serious. Application-layer access control is a critical component in an overall security solution that also includes encryption, firewalls, virtual private networks, antivirus, and intrusion detection. An access control solution, based on an open-source access manager augmented with custom software components, was developed to provide protection to both Web-based and Java-based client and server applications. The DISA Security Service (DISA-SS) provides common access control capabilities for AMMOS software applications through a set of application programming interfaces (APIs) and network-accessible security services for authentication, single sign-on, authorization checking, and authorization policy management. The OpenAM access management technology designed for Web applications can be extended to meet the needs of Java thick clients and standalone servers that are commonly used in the JPL AMMOS environment. The DISA-SS reusable components have greatly reduced the effort for each AMMOS subsystem to develop its own access control strategy. The novelty of this work is that it leverages an open-source access management product designed for Web-based applications to provide access control for Java thick clients and Java standalone servers. Thick clients and standalone servers are still commonly used in business and government, especially for applications that require rich graphical user interfaces and high-performance visualization that cannot be met by thin clients running in Web browsers.
Serban, Alexandru; Crisan-Vida, Mihaela; Mada, Leonard; Stoicu-Tivadar, Lacramioara
User interfaces are important for easy learning and operation of an IT application, especially in the medical world. An easy-to-use interface has to be simple and to accommodate the user's needs and mode of operation, and the underlying technology is an important tool to accomplish this. The present work aims at creating a web interface using a specific technique (HTML table design combined with CSS3) to provide an optimized responsive interface for a complex web application. In the first phase, the layout of the current icMED web medical application is analyzed and its structure is modeled using specific tools on source files. In the second phase, a new graphical interface adaptable to different mobile terminals is proposed, using HTML table design (TD) and CSS3, that requires no source files, just lines of layout code, improving the interaction in terms of speed and simplicity. For a complex medical software application, a new prototype layout was designed and developed using HTML tables. The method uses CSS code with only CSS classes, which can be applied to one or multiple HTML table elements, instead of CSS styles that can be applied to just one DIV tag at a time. The technique has the advantage of simplified CSS code and better adaptability to different media resolutions compared to the DIV-CSS style method. The presented work demonstrates that adaptive web interfaces can be developed by combining different design methods and technologies, such as HTML table design, resulting in an interface that is simpler to learn and use, suitable for healthcare services.
Sernadela, Pedro; González-Castro, Lorena; Oliveira, José Luís
In recent years, we have witnessed an explosion of biological data resulting largely from the demands of life science research. The vast majority of these data are freely available via diverse bioinformatics platforms, including relational databases and conventional keyword search applications. This type of approach has achieved great results in the last few years, but proved infeasible when information needs to be combined or shared among different and scattered sources. In recent years, many of these data distribution challenges have been solved with the adoption of semantic web technologies. Despite the evident benefits of this technology, its adoption introduces new challenges related to the migration process from existing systems to the semantic level. To facilitate this transition, we have developed Scaleus, a semantic web migration tool that can be deployed on top of traditional systems in order to bring knowledge, inference rules, and query federation to the existing data. Targeted at the biomedical domain, this web-based platform offers, in a single package, straightforward data integration and semantic web services that help developers and researchers in the creation of new semantically enhanced information systems. Scaleus is available as open source at http://bioinformatics-ua.github.io/scaleus/ .
Martinez, E.; Fee, J.
The application computes earthquake ground motion design parameters compatible with the International Building Code and other seismic design provisions. It is the primary method by which design engineers across the country obtain ground motion parameters for multiple building codes when designing new buildings and other structures. Users specify the design code of interest, location, and other parameters to obtain the necessary ground motion information, consisting of a high-level executive summary as well as detailed information including maps, data, and graphs. Results are formatted so that they can be included directly in a final engineering report. In addition to single-site analysis, the application supports a batch mode for simultaneous consideration of multiple locations. Finally, an application programming interface (API) allows other developers to integrate this application's results into larger applications for additional processing. Development has proceeded in an iterative manner, working with engineers through email, meetings, and workshops. Each iteration provided new features, improved performance, and usability enhancements. This development approach positioned the application to be integral to the structural design process; it is now used to produce over 1,800 reports daily. Recent efforts have made the application a data-driven, mobile-first, responsive web application. Development is ongoing, and the source code has recently been published to the open-source community on GitHub. Open-sourcing the code facilitates incorporation of user feedback to add new features, ensuring the application's continued success.
Brigham, Tara J
Google is a company that is constantly expanding and growing its services and products. While most librarians possess a "love/hate" relationship with Google, there are a number of reasons you should consider exploring some of the tools Google has created and made freely available. Applications and services such as Google Docs, Slides, and Google+ are functional and dynamic without the cost of comparable products. This column will address some of the issues users should be aware of before signing up to use Google's tools, and a description of some of Google's Web applications and services, plus how they can be useful to librarians in health care.
Romaniuk, Ryszard S.
The Wilga Summer 2016 Symposium on Photonics Applications and Web Engineering was held from 29 May to 6 June. The Symposium gathered over 350 participants, mainly young researchers active in optics, optoelectronics, photonics, and electronics technologies and applications. Around 300 presentations were given in a few main topical tracks, including bio-photonics, optical sensory networks, photonics-electronics-mechatronics co-design and integration, large functional system design and maintenance, the Internet of Things, and others. The paper is an introduction to the 2016 WILGA Summer Symposium Proceedings and digests some of the Symposium's chosen key presentations.
Romaniuk, Ryszard S.
The Wilga Summer 2015 Symposium on Photonics Applications and Web Engineering was held from 23 to 31 May. The Symposium gathered over 350 participants, mainly young researchers active in optics, optoelectronics, photonics, and electronics technologies and applications. Around 300 presentations were given in a few main topical tracks, including bio-photonics, optical sensory networks, photonics-electronics-mechatronics co-design and integration, large functional system design and maintenance, the Internet of Things, and others. The paper is an introduction to the 2015 WILGA Summer Symposium Proceedings and digests some of the Symposium's chosen key presentations.
Veksler-Lublinksy, Isana; Barash, Danny; Avisar, Chai; Troim, Einav; Chew, Paul; Kedem, Klara
FASH (Fourier Alignment Sequence Heuristics) is a web application, based on the Fast Fourier Transform, for finding remote homologs within a long nucleic acid sequence. Given a query sequence and a long text-sequence (e.g., the human genome), FASH detects subsequences within the text that are remotely similar to the query. FASH offers an alternative approach to Blast/Fasta for querying long RNA/DNA sequences. FASH differs from these other approaches in that it does not depend on the existence of contiguous seed-sequences in its initial detection phase. The FASH web server is user friendly and very easy to operate. FASH can be accessed at (secured website) PMID:18505581
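The core idea behind frequency-domain homology search, locating where a query best correlates with a long text without relying on contiguous seed matches, can be sketched in stdlib Python. The complex base encoding, padding scheme, and scoring below are illustrative assumptions for the sketch, not FASH's actual method:

```python
import cmath
import math

# Illustrative encoding: each base is a unit complex number, so a matched
# position contributes exactly +1 to the real part of the correlation.
ENC = {"A": 1, "C": 1j, "G": -1, "T": -1j}

def fft(a):
    """Recursive radix-2 Cooley-Tukey FFT (len(a) must be a power of two)."""
    n = len(a)
    if n == 1:
        return list(a)
    even, odd = fft(a[0::2]), fft(a[1::2])
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(-2j * math.pi * k / n) * odd[k]
        out[k], out[k + n // 2] = even[k] + w, even[k] - w
    return out

def ifft(a):
    """Inverse FFT via the conjugation trick."""
    conj = fft([x.conjugate() for x in a])
    return [x.conjugate() / len(a) for x in conj]

def best_offset(text, query):
    """Return the offset in text where query correlates most strongly."""
    n = 1
    while n < len(text) + len(query):
        n *= 2
    t = [ENC[c] for c in text] + [0j] * (n - len(text))
    q = [ENC[c] for c in query] + [0j] * (n - len(query))
    # Circular cross-correlation: IFFT(FFT(t) * conj(FFT(q)))[s]
    #   = sum_k t[s + k] * conj(q[k]); a perfect match scores len(query).
    spec = [a * b.conjugate() for a, b in zip(fft(t), fft(q))]
    corr = ifft(spec)
    valid = range(len(text) - len(query) + 1)
    return max(valid, key=lambda s: corr[s].real)
```

A perfect match at offset s contributes len(query) to the real part of corr[s], so the planted position dominates; real tools would add windowing and score thresholds on top of this skeleton.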
Fox, Geoffrey; Pallickara, Shrideep; Pierce, Marlon; Gadgil, Harshawardhan
Grid application frameworks have increasingly aligned themselves with the developments in Web services. Web services are currently the most popular infrastructure based on service-oriented architecture (SOA) paradigm. There are three core areas within the SOA framework: (i) a set of capabilities that are remotely accessible, (ii) communications using messages and (iii) metadata pertaining to the aforementioned capabilities. In this paper, we focus on issues related to the messaging substrate hosting these services; we base these discussions on the NaradaBrokering system. We outline strategies to leverage capabilities available within the substrate without the need to make any changes to the service implementations themselves. We also identify the set of services needed to build Grids of Grids. Finally, we discuss another technology, HPSearch, which facilitates the administration of the substrate and the deployment of applications via a scripting interface. These issues have direct relevance to scientific Grid applications, which need to go beyond remote procedure calls in client-server interactions to support integrated distributed applications that couple databases, high performance computing codes and visualization codes.
Kobara, S.; Howard, M. K.; Simoniello, C.; Jochens, A. E.; Gulf of Mexico Coastal Ocean Observing System Regional Association (GCOOS-RA)
Spatial and temporal information on the ecology of marine species and encompassing oceanographic environment is vital to the development of effective strategies for marine resource management and biodiversity conservation. Assembling data and generating products is a time-consuming and often laborious part of the workflow required of fisheries specialists, resource managers, marine scientists and other stakeholder groups for effective fishery management and marine spatial planning. Workflow costs for all groups can be significantly reduced through the use of interoperable networked data systems. The Gulf of Mexico Coastal Ocean Observing System Regional Association (GCOOS-RA) is one of 11 RAs comprising the non-Federal part of the U.S. Integrated Ocean Observing System (IOOS). The RAs serve the region’s needs for data and information: by working with data providers to offer their data in standardized ways following IOOS guidance, by gathering stakeholders’ needs and requirements, and by producing basic products or facilitating product-generation by others to meet those needs. The GCOOS Data Portal aggregates regional near real-time data and serves these data through standardized service interfaces suitable for automated machine access or in formats suitable for human consumption. The related Products Portal generates products in graphical displays for humans and in standard formats for importing into common software packages. Web map applications are created using ArcGIS server RESTful service, publicly available Open Geospatial Consortium (OGC) Web Map Service (WMS) layers, and Web Coverage Service (WCS). Use of standardize interfaces allows us to construct seamless workflows that carry data from sensors through to products in an automated fashion. As a demonstration of the power of interoperable standards-based systems we have developed tailored product web pages for recreational boaters and fishermen. This is a part of an ongoing project to provide an
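The OGC WMS layers mentioned above are reached through a plain HTTP interface: a GetMap call is just a URL carrying standardized query parameters. A minimal sketch (the endpoint URL and layer name are hypothetical placeholders; the parameter names follow the WMS 1.3.0 convention):

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=512, height=512):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    bbox is (min_x, min_y, max_x, max_y) in the requested CRS.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical Gulf-of-Mexico-style layer request (endpoint is a placeholder).
url = wms_getmap_url("https://example.org/wms", "sea_surface_temp",
                     (-98.0, 18.0, -80.0, 31.0))
```

Because every WMS server accepts the same parameter vocabulary, a product portal can swap endpoints and layers without changing the request-building code.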
Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Bogomolov, Vasily; Martynova, Yuliya; Shulgina, Tamara
Tso, Kam S.; Pajevski, Michael J.; Johnson, Bryan
Cyber security has gained national and international attention as a result of near continuous headlines from financial institutions, retail stores, government offices and universities reporting compromised systems and stolen data. Concerns continue to rise as threats of service interruption, and spreading of viruses become ever more prevalent and serious. Controlling access to application layer resources is a critical component in a layered security solution that includes encryption, firewalls, virtual private networks, antivirus, and intrusion detection. In this paper we discuss the development of an application-level access control solution, based on an open-source access manager augmented with custom software components, to provide protection to both Web-based and Java-based client and server applications.
Deploying business applications on the internal Web is a priority at Oak Ridge National Laboratory (Lockheed Martin Energy Research) and Lockheed Martin Energy Systems, Inc., as at most corporations. Three separate applications chose the Oracle Application Server (OAS), using the PL/SQL cartridge as a Web deployment method. This method was chosen primarily because the data were already stored in Oracle tables and the developers knew PL/SQL, or at least SQL. The Database Support group had the responsibility of installing, testing, and determining standard methods for interfacing with the PL/SQL cartridge of the OAS. Note that the term Web Application Server was used for version 3; in this discussion, OAS is used for both version 3 and version 4.
Weiss, R. M.; McLane, J. C.; Yuen, D. A.; Wang, S.
36 CFR Parks, Forests, and Public Property, Technical Standards § 1194.22 (2014-07-01): Web-based intranet and internet information and applications. Among its provisions: a text-only page, with equivalent information or functionality, shall be provided to make a web site comply when compliance cannot be accomplished in any other way.
DuPlain, Ron; Balser, Dana S.; Radziwill, Nicole M.
The NRAO faced performance and usability issues after releasing a single-search-box ("Google-like") web application to query data across all NRAO telescope archives. Running queries with several relations across multiple databases proved to be very expensive in compute resources. An investigation for a better platform led to Solr and Blacklight, a solution stack which allows in-house development to focus on in-house problems. Solr is an Apache project built on Lucene to provide a modern search server with a rich set of features and impressive performance. Blacklight is a web user interface (UI) for Solr primarily developed by libraries at the University of Virginia and Stanford University. Though Blacklight targets libraries, it is highly adaptable for many types of search applications which benefit from the faceted searching and browsing, minimal configuration, and flexible query parsing of Solr and Lucene. The result: one highly reused codebase provides for millisecond response times and a flexible UI. Not just for observational data, NRAO is rolling out Solr and Blacklight across domains of library databases, telescope proposals, and more -- in addition to telescope data products, where integration with the Virtual Observatory is on-going.
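Blacklight's faceted browsing ultimately rests on Solr's select handler, where faceting is switched on per request through the standard facet parameters. A sketch of assembling such a request (the core name, field names, and query are hypothetical, not NRAO's actual schema):

```python
from urllib.parse import urlencode

def solr_facet_query(base, core, q, facet_fields, rows=10):
    """Build a Solr /select URL with faceting enabled on the given fields."""
    params = [
        ("q", q),
        ("rows", rows),
        ("wt", "json"),
        ("facet", "true"),
    ]
    # Solr accepts facet.field repeated once per field to facet on.
    params += [("facet.field", f) for f in facet_fields]
    return "%s/%s/select?%s" % (base, core, urlencode(params))

# Hypothetical archive search faceted by telescope and observing band.
url = solr_facet_query("http://localhost:8983/solr", "archive",
                       "source_name:M31", ["telescope", "band"])
```

The response then carries facet counts alongside the hits, which is what lets a UI like Blacklight render drill-down filters with no extra queries.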
Ries, Kernell G.; Guthrie, John G.; Rea, Alan H.; Steeves, Peter A.; Stewart, David W.
Streamflow measurements are collected systematically over a period of years at partial-record stations to estimate peak-flow or low-flow statistics. Streamflow measurements usually are collected at miscellaneous-measurement stations for specific hydrologic studies with various objectives. StreamStats is a Web-based Geographic Information System (GIS) application (fig. 1) that was created by the USGS, in cooperation with Environmental Systems Research Institute, Inc. (ESRI), to provide users with access to an assortment of analytical tools that are useful for water-resources planning and management. StreamStats functionality is based on ESRI's ArcHydro Data Model and Tools, described on the Web at http://support.esri.com/index.cfm?fa=downloads.dataModels.filteredGateway&dmid=15. StreamStats allows users to easily obtain streamflow statistics, basin characteristics, and descriptive information for USGS data-collection stations and user-selected ungaged sites. It also allows users to identify stream reaches that are upstream and downstream from user-selected sites, and to identify and obtain information for locations along the streams where activities that may affect streamflow conditions are occurring. This functionality can be accessed through a map-based user interface that appears in the user's Web browser (fig. 1), or individual functions can be requested remotely as Web services by other Web or desktop computer applications. StreamStats can perform these analyses much faster than historically used manual techniques. StreamStats was designed so that each state would be implemented as a separate application, with a reliance on local partnerships to fund the individual applications, and a goal of eventual full national implementation. Idaho became the first state to implement StreamStats in 2003. By mid-2008, 14 states had applications available to the public, and 18 other states were in various stages of implementation.
Morgil, Inci; Gungor Seyhan, Hatice; Ural Alsan, Evrim; Temel, Senar
Students perform intensive web-based applications during their education. One of these is project-based application. In this study, the effect of web-based project applications on students' attitudes towards chemistry has been investigated. 42 students attending Hacettepe University, Faculty of Education, and Department of Chemistry Education have…
Today's Web applications are already "aware" of the network of computers and data on the Internet, in the sense that they perceive, remember, and represent knowledge external to themselves. However, Web applications are generally not able to respond to the meaning and context of the information in their memories. As a result, most applications are…
Koltun, G.F.; Kula, Stephanie P.; Puskas, Barry M.
A StreamStats Web application was developed for Ohio that implements equations for estimating a variety of streamflow statistics including the 2-, 5-, 10-, 25-, 50-, 100-, and 500-year peak streamflows, mean annual streamflow, mean monthly streamflows, harmonic mean streamflow, and 25th-, 50th-, and 75th-percentile streamflows. StreamStats is a Web-based geographic information system application designed to facilitate the estimation of streamflow statistics at ungaged locations on streams. StreamStats can also serve precomputed streamflow statistics determined from streamflow-gaging station data. The basic structure, use, and limitations of StreamStats are described in this report. To facilitate the level of automation required for Ohio's StreamStats application, the technique used by Koltun (2003) for computing main-channel slope was replaced with a new computationally robust technique. The new channel-slope characteristic, referred to as SL10-85, differed from the National Hydrography Data-based channel slope values (SL) reported by Koltun (2003) by an average of -28.3 percent, with the median change being -13.2 percent. In spite of the differences, the two slope measures are strongly correlated. The change in channel slope values resulting from the change in computational method necessitated revision of the full-model equations for flood-peak discharges originally presented by Koltun (2003). Average standard errors of prediction for the revised full-model equations presented in this report increased by a small amount over those reported by Koltun (2003), with increases ranging from 0.7 to 0.9 percent. Mean percentage changes in the revised regression and weighted flood-frequency estimates relative to regression and weighted estimates reported by Koltun (2003) were small, ranging from -0.72 to -0.25 percent and -0.22 to 0.07 percent, respectively.
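The SL10-85 measure follows a long-standing USGS convention: the slope of the line joining the points at 10 and 85 percent of the main-channel length, which discounts the atypically flat and steep ends of the profile. A sketch under that assumption (the profile format and interpolation are illustrative, not the report's actual implementation):

```python
from bisect import bisect_left

def interp_elevation(profile, d):
    """Linearly interpolate elevation at distance d along the channel.

    profile: list of (distance, elevation) pairs sorted by distance.
    """
    dists = [p[0] for p in profile]
    i = bisect_left(dists, d)
    if i == 0:
        return profile[0][1]
    if i == len(profile):
        return profile[-1][1]
    (d0, e0), (d1, e1) = profile[i - 1], profile[i]
    return e0 + (e1 - e0) * (d - d0) / (d1 - d0)

def sl10_85(profile):
    """Slope between the points at 10% and 85% of main-channel length."""
    length = profile[-1][0]
    d10, d85 = 0.10 * length, 0.85 * length
    rise = interp_elevation(profile, d85) - interp_elevation(profile, d10)
    return rise / (d85 - d10)
```

For a uniform channel the measure reduces to the ordinary gradient; its value for automation is that it only needs an elevation profile, not manually selected endpoints.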
A working prototype of the system is provided based on the Oracle 8i DBMS, the Apache Tomcat Web server, Java Servlets, and Java Server Pages. In the three-tier EAMS system architecture (Figure 26), Apache/Tomcat 4.1 was configured as the Web server, with Java used for the middle-tier application.
Sponsored Report Series: Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment, 30 September 2015, Dr. Ying Zhao. Among the tasks completed this year: Task 1, working with OSD OUSD ATL (US) to install the LLA/SSA/CLA system as a web service in the Defense...
ARL-TN-0688, August 2015, US Army Research Laboratory: Adapting My Weather Impacts Decision Aid (MyWIDA) to Additional Web Application Server Technologies, by Jacob C Randall and Jeffrey O... Dates covered (from-to): May-Aug 2015.
Boustani, M.; Mattmann, C. A.; Ramirez, P.
There are many Earth science projects and data systems being developed at the Jet Propulsion Laboratory, California Institute of Technology (JPL) that require the use of Geographic Information Systems (GIS). Three in particular are: (1) the JPL Airborne Snow Observatory (ASO) that measures the amount of water being generated from snow melt in mountains; (2) the Regional Climate Model Evaluation System (RCMES) that compares climate model outputs with remote sensing datasets in the context of model evaluation and the Intergovernmental Panel on Climate Change and for the U.S. National Climate Assessment and; (3) the JPL Snow Server that produces a snow and ice climatology for the Western US and Alaska, for the U.S. National Climate Assessment. Each of these three examples, and all other Earth science projects, strongly need GIS and geoprocessing capabilities to process, visualize, manage, and store geospatial data. Besides some open-source GIS libraries and software such as ArcGIS, there are comparatively few open-source, web-based, and easy-to-use applications capable of GIS processing and visualization. To address this, we present GISCube, an open-source web-based GIS application that can store, visualize, and process GIS and geospatial data. GISCube is powered by Geothon, an open-source Python GIS cookbook. Geothon has a variety of geoprocessing tools, such as data conversion, processing, spatial analysis, and data management tools. GISCube supports a variety of well-known GIS data formats in both vector and raster form, and the system is being expanded to support NASA and scientific data formats such as netCDF and HDF files. In this talk, we demonstrate how Earth science and other projects can benefit from GISCube and Geothon, and describe their current goals and our future work in the area.
36 CFR Parks, Forests, and Public Property, Technical Standards § 1194.22 (2012-07-01): Web-based intranet and internet information and applications.
36 CFR Parks, Forests, and Public Property, Technical Standards § 1194.22 (2013-07-01): Web-based intranet and internet information and applications.
Collecting and presenting the latest research and development results from the leading researchers in the field of e-learning systems, Web-Based Intelligent E-Learning Systems: Technologies and Applications provides a single record of current research and practical applications in Web-based intelligent e-learning systems. This book includes major…
Marganian, P.; Clark, M.; Shelton, A.; McCarty, M.; Sessoms, E.
Recent web technologies focusing on languages, frameworks, and tools are discussed, using the Robert C. Byrd Green Bank Telescope's (GBT) new Dynamic Scheduling System as the primary example. Within that example, we use a popular Python web framework, Django, to build the extensive web services for our users. We also use a second, complementary server, written in Haskell, to incorporate the core scheduling algorithms. We provide a desktop-quality experience across all the popular browsers for our users with the Google Web Toolkit and judicious use of jQuery in Django templates. Single sign-on and authentication throughout all NRAO web services is accomplished via the Central Authentication Service (CAS) protocol.
Scotch, Matthew; Yip, Kevin Y; Cheung, Kei-Hoi
Development of public health informatics applications often requires the integration of multiple data sources. This process can be challenging due to issues such as different file formats, schemas, naming systems, and having to scrape the content of web pages. A potential solution to these system development challenges is the use of Web 2.0 technologies. In general, Web 2.0 technologies are newer internet services that encourage and value information sharing and collaboration among individuals. In this case report, we describe the development and use of Web 2.0 technologies, including Yahoo! Pipes, within a public health application that integrates animal, human, and temperature data to assess the risk of West Nile Virus (WNV) outbreaks. The results of development and testing suggest that while Web 2.0 applications are reasonable environments for rapid prototyping, they are not mature enough for large-scale public health data applications. The application, in fact a "system of systems," often failed due to varied timeouts for application response across web sites and services, internal caching errors, and software added to web sites by administrators to manage the load on their servers. In spite of these concerns, the results of this study demonstrate the potential value of grid computing and Web 2.0 approaches in public health informatics.
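The failure modes described, mismatched timeouts, caching errors, and overloaded servers, are characteristic of any "system of systems"; the usual defensive pattern is to query each source independently with its own timeout and degrade gracefully when one fails. A minimal stdlib sketch (the feed names and payloads are stand-ins, not the WNV application's actual sources):

```python
from concurrent.futures import ThreadPoolExecutor

def gather_sources(sources, timeout=5.0):
    """Query independent data sources, tolerating per-source failures.

    sources: dict mapping a source name to a zero-argument callable.
    Returns (results, failures): data keyed by name, and names that failed.
    """
    results, failures = {}, []
    with ThreadPoolExecutor(max_workers=max(len(sources), 1)) as pool:
        futures = {name: pool.submit(fn) for name, fn in sources.items()}
        for name, fut in futures.items():
            try:
                # Each source gets its own deadline instead of one shared one.
                results[name] = fut.result(timeout=timeout)
            except Exception:
                failures.append(name)
    return results, failures

def broken_feed():
    # Stand-in for a remote service that is down or timing out.
    raise IOError("service down")

results, failures = gather_sources({
    "mosquito_pools": lambda: [{"site": 1, "positive": True}],
    "temperature": broken_feed,
})
```

Recording which feeds failed, rather than aborting the whole run, lets a risk assessment report partial results with an explicit caveat.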
Keshavan, Anisha; Datta, Esha; McDonough, Ian; Madan, Christopher R; Jordan, Kesshi; Henry, Roland G
Tissue classification plays a crucial role in the investigation of normal neural development, brain-behavior relationships, and the disease mechanisms of many psychiatric and neurological illnesses. Ensuring the accuracy of tissue classification is important for quality research and, in particular, the translation of imaging biomarkers to clinical practice. Assessment with the human eye is vital to correct various errors inherent to all currently available segmentation algorithms. Manual quality assurance becomes methodologically difficult at a large scale - a problem of increasing importance as the number of data sets is on the rise. To make this process more efficient, we have developed Mindcontrol, an open-source web application for the collaborative quality control of neuroimaging processing outputs. The Mindcontrol platform consists of a dashboard to organize data, descriptive visualizations to explore the data, an imaging viewer, and an in-browser annotation and editing toolbox for data curation and quality control. Mindcontrol is flexible and can be configured for the outputs of any software package in any data organization structure. Example configurations for three large, open-source datasets are presented: the 1000 Functional Connectomes Project (FCP), the Consortium for Reliability and Reproducibility (CoRR), and the Autism Brain Imaging Data Exchange (ABIDE) Collection. These demo applications link descriptive quality control metrics, regional brain volumes, and thickness scalars to a 3D imaging viewer and editing module, resulting in an easy-to-implement quality control protocol that can be scaled for any size and complexity of study.
Fayz, B; Moldenhauer, J S; Wang, D; Zhao, C; Yao, B; Liu, D; Weinsheimer, S; Gardner, L; Johnson, A; Womble, D D; Krawetz, S A
Genomic and expression data have increased dramatically over the last several years. This is primarily due to the completion of the human genome project as well as an upsurge in the use of various high-throughput technologies. Recent attempts to correlate genomic and expression data have stimulated the scientific community to determine how this data can be used within a clinical setting (P Khatri et al., Genomics 2002: 79: 266; LJ van't Veer et al., Nature 2002: 415: 530). LARALink (Loci Analysis for Rearrangements Link) is a database-driven web application that utilizes several public datasets to analyze clinical cytogenetic data to identify candidate genes. LARALink allows UniGene clusters or single-nucleotide polymorphisms (SNPs) to be queried for multiple patients by cytoband, chromosome marker, or base pair. The results can be further refined with the use of an anatomical site, developmental stage, pathology, or cell-type expression filter. Once a set of UniGene clusters (expressed genes) has been identified either for a single patient or for a shared region among multiple patients, the expression-distribution profile, expressed sequence tags (ESTs), or Online Mendelian Inheritance in Man (OMIM) entries are displayed. The utility of this tool is shown by its application to both research and clinical medicine. LARALink is a public resource available at: http://www.laralink.bioinformatics.wayne.edu:8080/unigene.
Gogoi, Ankur; Rajkhowa, Pritom; P. Saikia, Gunjan; Ahmed, Gazi A.; Choudhury, Amarjyoti
Development of an online web application to simulate and display plane wave scattering from small particles is presented. In particular, the computation of angular variation of the scattering properties (scattering matrix elements, scattering coefficients, single scattering albedo etc.) of particulate matter by using the Mie theory and the T-matrix method was incorporated in the application. Comparison of the results generated by using the web application with other reported benchmark results has shown that the web application is accurate and reliable for electromagnetic scattering computations.
Laakso, J. H.; Straayer, J. W.
A final program summary is reported for test and evaluation activities that were conducted for space shuttle web selection. Large-scale advanced composite shear web components were tested and analyzed to evaluate application of advanced composite shear web construction to a space shuttle orbiter thrust structure. The shear web design concept consisted of a titanium-clad ±45 deg boron/epoxy web laminate stiffened with vertical boron/epoxy-reinforced aluminum stiffeners and longitudinal aluminum stiffening. The design concept was evaluated to be efficient and practical for the application that was studied. Because of the effects of buckling deflections, a requirement is identified for shear-buckling-resistant design to maximize the efficiency of highly loaded advanced composite shear webs.
Moore, D R; Feurer, I D; Zavala, E Y; Shaffer, D; Karp, S; Hoy, H; Moore, D E
Most centers utilize phone or written surveys to screen candidates who self-refer to be living kidney donors. To increase efficiency and reduce resource utilization, we developed a web-based application to screen kidney donor candidates. The aim of this study was to evaluate the use of this web-based application. Method and time of referral were tabulated and descriptive statistics summarized demographic characteristics. Time series analyses evaluated use over time. Between January 1, 2011 and March 31, 2012, 1200 candidates self-referred to be living kidney donors at our center. Eight hundred one candidates (67%) completed the web-based survey and 399 (33%) completed a phone survey. Thirty-nine percent of donors accessed the application on nights and weekends. Post-implementation of the web-based application, there was a statistically significant increase (p < 0.001) in the number of self-referrals via the web-based application as opposed to telephone contact. Also, there was a significant increase (p = 0.025) in the total number of self-referrals post-implementation, from 61 to 116 per month. An interactive web-based application is an effective strategy for the initial screening of donor candidates. The web-based application increased our ability to interface with donors, process them efficiently, and ultimately increased donor self-referral at our center.
This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…
Manguy, Jean; Jehl, Peter; Dillon, Eugène T; Davey, Norman E; Shields, Denis C; Holton, Thérèse A
Tandem mass spectrometry (MS/MS) techniques, developed for protein identification, are increasingly being applied in the field of peptidomics. Using this approach, the set of protein fragments observed in a sample of interest can be determined to gain insights into important biological processes such as signaling and other bioactivities. As the peptidomics era progresses, there is a need for robust and convenient methods to inspect and analyze MS/MS derived data. Here, we present Peptigram, a novel tool dedicated to the visualization and comparison of peptides detected by MS/MS. The principal advantage of Peptigram is that it provides visualizations at both the protein and peptide level, allowing users to simultaneously visualize the peptide distributions of one or more samples of interest, mapped to their parent proteins. In this way rapid comparisons between samples can be made in terms of their peptide coverage and abundance. Moreover, Peptigram integrates and displays key sequence features from external databases and links with peptide analysis tools to offer the user a comprehensive peptide discovery resource. Here, we illustrate the use of Peptigram on a data set of milk hydrolysates. For convenience, Peptigram is implemented as a web application, and is freely available for academic use at http://bioware.ucd.ie/peptigram .
Huang, Wen-Hao David; Hood, Denice Ward; Yoo, Sun Joo
Situated in the gender digital divide framework, this survey study investigated the role of computer anxiety in influencing female college students' perceptions toward Web 2.0 applications for learning. Based on 432 college students' "Web 2.0 for learning" perception ratings collected by relevant categories of "Unified Theory of Acceptance and Use…
Maamar, Zakaria; Wives, Leandro Krug; Boukadi, Khouloud
This chapter discusses the use of social networks in Web services, with a focus on the discovery stage that characterizes the life cycle of these Web services. Other stages in this life cycle include description, publication, invocation, and composition. Web services are software applications that end users or other peers can invoke and compose to satisfy different needs such as hotel booking and car rental. Discovering the relevant Web services is, and continues to be, a major challenge due to the dynamic nature of these Web services. Indeed, Web services appear/disappear or suspend/resume operations without prior notice. Traditional discovery techniques are based on registries such as Universal Description, Discovery and Integration (UDDI) and Electronic Business using eXtensible Markup Language (ebXML). Unfortunately, despite the different improvements that these techniques have undergone, they still suffer from various limitations that could slow down the acceptance of Web services by the IT community. Social networks seem to offer solutions to some of these limitations but raise, at the same time, some issues that are discussed in this chapter. The contributions of this chapter are threefold: a definition of social networks in the particular context of Web services; mechanisms that help Web services build, use, and maintain their respective social networks; and the adoption of social networks to discover Web services.
case, the ordering process is, of course, not fully automated. Standardized products, on the other hand, are easily identified, and the cost charged to the print buyer can be retrieved from predefined price lists. Typically, higher volumes will result in more attractive prices. An additional advantage of this type of product is that it is often defined such that it can be produced in bulk using conventional printing techniques. If one wants to automate the ganging, a connection must be established between the on-line ordering and the production planning system. (For digital printing, there is typically no need to gang products, since they can be produced more effectively separately.) Many of the on-line print solutions support additional features also available in general-purpose e-commerce sites. Here we think of the availability of virtual shopping baskets, connectivity with payment gateways, and support of special facilities for interfacing with courier services (bar codes, connectivity to courier web sites for tracking shipments, etc.). Supporting these features also assumes an intimate link with the print production system. Another development that goes beyond the on-line ordering of printed material and the submission of full pages and/or documents is the interactive, on-line definition of the content itself. Typical applications in this respect are, e.g., the creation of business cards, leaflets, letterheads, etc. On a more professional level, we also see that more and more publishing organizations are starting to use on-line publishing platforms to organize their work. These professional platforms can also be connected directly to printing portals and thus enable extra automation. In this paper, we will discuss for each of the different applications presented above (traditional Print Portals, Web2Print applications and professional, on-line publishing platforms) how they interact with prepress and print production systems and how they contribute to the
Knipp, D.; Kilcommons, L. M.; Damas, M. C.
We have created a simple and user-friendly web application to visualize output from empirical atmospheric models that describe the lower atmosphere and the Space-Atmosphere Interface Region (SAIR). The Atmospheric Model Web Explorer (AtModWeb) is a lightweight, multi-user, Python-driven application which uses standard web technology (jQuery, HTML5, CSS3) to give an in-browser interface that can produce plots of modeled quantities such as temperature and the individual-species and total densities of the neutral and ionized upper atmosphere. Output may be displayed as: 1) a contour plot over a map projection, 2) a pseudo-color plot (heatmap) which allows visualization of a variable as a function of two spatial coordinates, or 3) a simple line plot of one spatial coordinate versus any number of desired model output variables. The application is designed around an abstraction of an empirical atmospheric model, essentially treating the model code as a black box, which makes it simple to add additional models without modifying the main body of the application. Currently implemented are the Naval Research Laboratory NRLMSISE-00 model for the neutral atmosphere and the International Reference Ionosphere (IRI). These models are relevant to the Low Earth Orbit environment and the SAIR. The interface is simple and usable, allowing users (students and experts) to specify time and location, and choose between historical (i.e. the values for the given date) or manual specification of whichever solar or geomagnetic activity drivers are required by the model. We present a number of use-case examples from research and education: 1) How does atmospheric density between the surface and 1000 km vary with time of day, season, and solar cycle? 2) How do ionospheric layers change with the solar cycle? 3) How does the composition of the SAIR vary between day and night at a fixed altitude?
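The "model as a black box" abstraction described above can be sketched in a few lines of Python. This is an illustrative registry pattern only, not AtModWeb's actual code; `toy_msis` is a hypothetical stand-in for a wrapped NRLMSISE-00 call, and its physics is deliberately fake.

```python
import math

# Sketch of a model registry: each model hides behind one call signature,
# so the web layer can list and run models without knowing their internals.
# Adding a model is then just one more decorated function.
MODEL_REGISTRY = {}

def register_model(name):
    def wrap(fn):
        MODEL_REGISTRY[name] = fn
        return fn
    return wrap

@register_model("toy_msis")  # hypothetical stand-in for NRLMSISE-00
def toy_msis(lat, lon, alt_km, f107=150.0):
    # Toy exponential density profile, not real model physics.
    scale_height_km = 50.0 + 0.05 * f107
    return {"density": 1.2 * math.exp(-alt_km / scale_height_km)}

def run_model(name, **params):
    """Generic entry point the in-browser interface would call."""
    return MODEL_REGISTRY[name](**params)

print(run_model("toy_msis", lat=40.0, lon=-105.0, alt_km=400.0))
```

Because the dispatcher only touches the registry, swapping in a real wrapped model (or adding IRI alongside it) would not change `run_model` at all, which is the point of the abstraction the abstract describes.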
Long, J. W.
Lorenz is a product of the ASC Scientific Data Management effort. Lorenz is a web-based application designed to help computer centers make information and resources more easily available to their users.
Voumard, Jérémie; Aye, Zar Chi; Derron, Marc-Henri; Jaboyedoff, Michel
Roads and railways are threatened throughout the year by several natural hazards around the world, leading to the closure of transportation corridors, loss of access, travel deviations, and potentially infrastructure damage, loss of human lives, and financial, social, and economic consequences. Protection measures used to reduce exposure to natural hazards are usually expensive and cannot be deployed on an entire transportation network. It is thus necessary to choose priority areas where protection measures need to be built. The aim of this study is to propose a user-friendly tool to evaluate and understand the issues and consequences of section closures and affected parts of a transportation network at small regional scale. The proposed tool, currently in its design and building phase, will provide ways to simulate different closure scenarios and to analyze their consequences on the transportation network, such as deviating traffic onto other road and railway sections, additional travel time and distance, or accessibility for emergency services like police, firefighters, and ambulances. The tool is based on the OpenGeo architecture, which is composed of open-source components. It integrates PostGIS for the database, GeoServer and GeoWebCache for the application servers, and GeoExt and OpenLayers for the user interface. Users will be able to attribute quantitative (like road and railway type and closure consequences) and qualitative (like section unavailability duration, season, etc.) data to the different road and railway sections based on their user rights. They will also be able to evaluate the consequences of different track closures under different scenarios. Once finalized, the goal of this project, covering natural hazards, traffic, and geomatics, is to propose a decision support tool for public authorities first and for specialists second, so that they can evaluate, as easily and accurately as possible, the weak points of the transportation
Background: As the "omics" revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter's complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. Results: COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a "semantic web in a box" approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. Conclusions: The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/. PMID:23244467
Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.
Allen, David G; Mahto, Raj V; Otondo, Robert F
Recruitment theory and research show that objective characteristics, subjective considerations, and critical contact send signals to prospective applicants about the organization and available opportunities. In the generating applicants phase of recruitment, critical contact may consist largely of interactions with recruitment sources (e.g., newspaper ads, job fairs, organization Web sites); however, research has yet to fully address how all 3 types of signaling mechanisms influence early job pursuit decisions in the context of organizational recruitment Web sites. Results based on data from 814 student participants searching actual organization Web sites support and extend signaling and brand equity theories by showing that job information (directly) and organization information (indirectly) are related to intentions to pursue employment when a priori perceptions of image are controlled. A priori organization image is related to pursuit intentions when subsequent information search is controlled, but organization familiarity is not, and attitudes about a recruitment source also influence attraction and partially mediate the effects of organization information. Theoretical and practical implications for recruitment are discussed.
Cao, Xinhua; Hoo, Kent S., Jr.; Zhang, Hong; Ching, Wan; Zhang, Ming; Wong, Stephen T. C.
We describe a web-based data warehousing method for retrieving and analyzing neurological multimedia information. The web-based method supports convenient access, effective search and retrieval of clinical textual and image data, and on-line analysis. To improve the flexibility and efficiency of multimedia information query and analysis, a three-tier multimedia data warehouse for epilepsy research has been built. The data warehouse integrates clinical multimedia data related to epilepsy from disparate sources and archives them into a well-defined data model.
Georgiev, Martin; Jana, Suman; Shmatikov, Vitaly
Georgiev, Martin; Jana, Suman; Shmatikov, Vitaly
Kopf, Stephan; Guthier, Benjamin; Lemelson, Hendrik; Effelsberg, Wolfgang
In this paper, we introduce our new visualization service which presents web pages and images on arbitrary devices with differing display resolutions. We analyze the layout of a web page and simplify its structure and formatting rules. The small screen of a mobile device is used much better this way. Our new image adaptation service combines several techniques. In a first step, border regions which do not contain relevant semantic content are identified. Cropping is used to remove these regions. Attention objects are identified in a second step. We use face detection, text detection and contrast-based saliency maps to identify these objects and combine them into a region of interest. Optionally, the seam carving technique can be used to remove inner parts of an image. Additionally, we have developed a software tool to validate, add, delete, or modify all automatically extracted data. This tool also simulates different mobile devices, so that the user gets a feeling of how an adapted web page will look. We have performed user studies to evaluate our web and image adaptation approach. Questions regarding software ergonomics, quality of the adapted content, and perceived benefit of the adaptation were asked.
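The first adaptation step, removing border regions without relevant content, can be illustrated with a minimal sketch. This is not the authors' pipeline; it treats an image as a nested list of pixel values and assumes a uniform background value marks "no content", which is far cruder than the semantic analysis the paper describes.

```python
# Illustrative border-cropping step: drop any leading/trailing rows and
# columns that contain only background pixels, keeping the bounding box
# of the actual content.
def crop_borders(img, background=0):
    rows = [r for r, row in enumerate(img) if any(p != background for p in row)]
    cols = [c for c in range(len(img[0]))
            if any(row[c] != background for row in img)]
    if not rows:          # image is entirely background
        return []
    return [row[cols[0]:cols[-1] + 1] for row in img[rows[0]:rows[-1] + 1]]

img = [[0, 0, 0, 0],
       [0, 7, 8, 0],
       [0, 9, 5, 0],
       [0, 0, 0, 0]]
print(crop_borders(img))  # → [[7, 8], [9, 5]]
```

A real implementation would decide "no relevant content" per region using the face, text, and saliency detectors the abstract lists, rather than a single background value.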
In this article, the author presents the history of human-to-computer interaction based upon the design of sophisticated computerized speech recognition algorithms. Advancements such as the arrival of cloud-based computing and software like Google's Web Speech API allows anyone with an Internet connection and Chrome browser to take advantage of…
Bajt, Susanne K.
The current generation of new students, referred to as the Millennial Generation, brings a new set of challenges to the community college. The influx of these technologically sophisticated students, who interact through the social phenomenon of Web 2.0 technology, brings expectations that may reshape institutions of higher learning. This chapter…
von Haller, B.; Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Delort, C.; Dénes, E.; Diviá, R.; Fuchs, U.; Niedziela, J.; Simonetti, G.; Soós, C.; Telesca, A.; Vande Vyvre, P.; Wegrzynek, A.
Gotvald, Anthony J.; Musser, Jonathan W.
StreamStats is being implemented on a State-by-State basis to allow for customization of the data development and underlying datasets to address their specific needs, issues, and objectives. The USGS, in cooperation with the Georgia Environmental Protection Division and Georgia Department of Transportation, has implemented StreamStats for Georgia. The Georgia StreamStats Web site is available through the national StreamStats Web-page portal at http://streamstats.usgs.gov. Links are provided on this Web page for individual State applications, instructions for using StreamStats, definitions of basin characteristics and streamflow statistics, and other supporting information.
Mahmoudi, Seyyed Ehsan; Akhondi-Asl, Alireza; Rahmani, Roohollah; Faghih-Roohi, Shahrooz; Taimouri, Vahid; Sabouri, Ahmad; Soltanian-Zadeh, Hamid
There are many medical image processing software tools available for research and diagnosis purposes. However, most of these tools are available only as local applications. This limits the accessibility of the software to a specific machine, and thus the data and processing power of that application are not available to other workstations. Further, there are operating system and processing power limitations which prevent such applications from running on every type of workstation. By developing web-based tools, it is possible for users to access the medical image processing functionalities wherever the internet is available. In this paper, we introduce a purely web-based, interactive, extendable, 2D and 3D medical image processing and visualization application that requires no client installation. Our software uses a four-layered design consisting of an algorithm layer, web-user-interface layer, server communication layer, and wrapper layer. To match the extendibility of current local medical image processing software, each layer is highly independent of the other layers. A wide range of medical image preprocessing, registration, and segmentation methods are implemented using open source libraries. Desktop-like user interaction is provided by using AJAX technology in the web user interface. For the visualization functionality of the software, the VRML standard is used to provide 3D features over the web. Integration of these technologies has allowed implementation of our purely web-based software with high functionality without requiring powerful computational resources on the client side. The user interface is designed such that users can select appropriate parameters for practical research and clinical studies.
Lexical Link Analysis (LLA) Application: Improving Web Service to Defense Acquisition Visibility Environment (DAVE). May 13-14, 2015. Dr. Ying… Topics: lexical link analysis as data-driven pattern recognition that scales up to Big Data; System Self-Awareness (SSA); Big Data and Deep Learning (BDDL) / Big Data Architecture and…
Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.
Gomez, Fabinton Sotelo; Ordóñez, Armando
Previously a framework for integrating web resources providing educational services in dotLRN was presented. The present paper describes the application of this framework in a rural school in Cauca--Colombia. The case study includes two web resources about the topic of waves (physics) which is oriented in secondary education. Web classes and…
Cimino, J J; Socratous, S A; Clayton, P D
Clinical computing application development at Columbia-Presbyterian Medical Center has been limited by the lack of a flexible programming environment that supports multiple client user platforms. The World Wide Web offers a potential solution, with its multifunction servers, multiplatform clients, and use of standard protocols for displaying information. The authors are now using the Web, coupled with their own local clinical data server and vocabulary server, to carry out rapid prototype development of clinical information systems. They have developed one such prototype system that can be run on most popular computing platforms from anywhere on the Internet. The Web paradigm allows easy integration of clinical information with other local and Internet-based information sources. The Web also simplifies many aspects of application design; for example, it includes facilities for the use of encryption to meet the authors' security and confidentiality requirements. The prototype currently runs on only the Web server in the Department of Medical Informatics at Columbia University, but it could be run on other Web servers that access the authors' clinical data and vocabulary servers. It could also be adapted to access clinical information from other systems with similar server capabilities. This approach may be adaptable for use in developing institution-independent standards for data and application sharing. PMID:7496876
Gurupur, Varadraj P; Tanik, Murat M
In this paper we present a system using the Semantic Web by which applications can be effectively constructed for clinical research purposes. We are aware of the immense difficulties and variations involved in clinical research applications. With the purpose of mitigating some of these difficulties in the process of developing clinical research applications, we present an approach for building information systems based on the Semantic Web. We have developed a working prototype using C-Map tools, leveraging the underlying principles of the Abstract Software Design Framework to convert domain knowledge into machine-actionable information.
Vermylen, J. P.
I will demonstrate a series of web-based visualizations of domestic state-level and international country-level energy statistics. The time-series energy consumption and production data sets are from the International Energy Agency (IEA) and the United States Department of Energy's Energy Information Administration (EIA). I will demonstrate the capabilities of existing web-based community data analysis sites, such as Swivel.com and IBM's Many-Eyes.com, as well as the capabilities of embeddable visualization gadgets, such as the Gapminder-inspired Motion Charts created by Google. These tools will allow students and the public to interactively explore relationships and trends of energy consumption and production. These visualizations will be particularly useful in exploring energy statistics that are traditionally presented in a multitude of competing units. The tools will also be useful for students in inquiry-based learning programs.
pictures of the gun in semantic memory, which are sampled from ImageNet [Deng et al., 2009]. Second, the episodic memory is an autobiographical record of… memory frozen in time. The starting point of this thesis is to view fast-growing Web image collections as the socially aggregated pictorial records of… accepted nomenclature of human and social memory studies [Halbwachs, 1992; Tulving, 1972]. We illustrate three important types of memories in the image
Zhang, Yong; Cui, Bin-Ge
With the development of Internet technology, a legion of scientific computing legacy programs with rich domain knowledge and expertise has been scattered across various disciplines. Because of their implementations, interfaces, and similar constraints, these scientific computing legacy programs cannot be shared over the Internet. This paper proposes a method of packaging scientific computing legacy programs into DLLs (Dynamic Link Libraries) and then into Web services through C# reflection, making it possible for scientific computing legacy programs to be shared on the Internet.
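The reflection-based wrapping idea can be sketched language-agnostically. The paper's implementation uses C# reflection over DLLs; the sketch below uses Python's `inspect` module instead, with the standard `math` module standing in for a hypothetical legacy scientific computing library, to show the same pattern: discover entry points by reflection, then route all service requests through one generic dispatcher.

```python
import inspect
import math  # stands in for a "legacy" scientific computing module

def discover_entry_points(module):
    """Reflect over a module, returning its public callables by name."""
    return {name: obj for name, obj in inspect.getmembers(module)
            if callable(obj) and not name.startswith("_")}

def dispatch(module, func_name, *args):
    """Generic invoker: a web-service layer would map a request URL
    and parameters onto this single call."""
    entry_points = discover_entry_points(module)
    if func_name not in entry_points:
        raise KeyError("no such legacy routine: %s" % func_name)
    return entry_points[func_name](*args)

print(dispatch(math, "sqrt", 16.0))  # → 4.0
```

The benefit mirrored here is the one the paper claims for reflection: the service layer never has to be rewritten when a new legacy routine is added, because the available operations are discovered at runtime.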
iLM is a Web based application for representation, management and sharing of IMS LIP conformant user profiles. The tool is developed using a service oriented architecture with emphasis on the easy data sharing. Data elicitation from user profiles is based on the utilization of XQuery scripts and sharing with other applications is achieved through…
Goodall, J. L.; Castronova, A. M.; Huynh, N.; Caicedo, J. M.
Management of water systems often requires the integration of data and models across a range of sources and disciplinary expertise. Service-Oriented Architectures (SOA) have emerged as a powerful paradigm for providing this integration. Including models within a SOA presents challenges because services are not well suited for applications that require state management and large data transfers. Despite these challenges, thoughtful inclusion of models as resources within a SOA could have distinct advantages that center on the idea of abstracting complex computer hardware and software from service consumers while, at the same time, providing powerful resources to client applications. With these advantages and challenges of using models within SOA in mind, this work explores the potential of a modeling service standard as a means for integrating models as resources within SOA. Specifically, we investigate the use of the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard for exposing models as web services. Through extension of a Python-based implementation of WPS (called pyWPS), we present a demonstration of the methodology through a case study involving a storm event that floods roads and disrupts travel in Columbia, SC. The case study highlights the benefit of an urban infrastructure system with its various subsystems (stormwater, transportation, and structures) interacting and exchanging data seamlessly.
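The WPS pattern the study builds on can be shown in miniature. The sketch below is not the authors' pyWPS extension and does not follow the OGC XML schema; it is a stdlib-only illustration of the two operations at the heart of WPS: a process advertises its interface (DescribeProcess) and runs on request (Execute). The `runoff` process and its coefficient are invented for the example.

```python
# Minimal WPS-style process: describe() plays the role of DescribeProcess,
# execute() the role of Execute. A real service would serialize both as XML.
class Process:
    identifier = "runoff"  # hypothetical flood-model process
    inputs = {"rainfall_mm": float, "area_km2": float}

    def describe(self):
        # Advertise the process identifier and its expected inputs.
        return {"identifier": self.identifier, "inputs": sorted(self.inputs)}

    def execute(self, **kwargs):
        # Toy rational-method runoff: volume [m^3] = C * rain [m] * area [m^2]
        runoff_coeff = 0.6
        rain_m = kwargs["rainfall_mm"] * 1e-3
        area_m2 = kwargs["area_km2"] * 1e6
        return {"runoff_m3": runoff_coeff * rain_m * area_m2}

proc = Process()
print(proc.describe())
print(proc.execute(rainfall_mm=50.0, area_km2=2.0))
```

Wrapping a stateful model behind this interface is exactly where the challenges the abstract mentions appear: each Execute call must either rebuild model state or reference state held server-side between calls.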
Martins, Wellington Santos; Soares Lucas, Divino César; de Souza Neves, Kelligton Fabricio; Bertioli, David John
Simple sequence repeats (SSRs), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed a simple-to-use web application, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet, requiring no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, visualization of microsatellites, and the design of primers suitable for their amplification. The program allows full control of parameters and easy export of the resulting data, thus facilitating the development of microsatellite markers. Availability: The web tool may be accessed at http://purl.oclc.org/NET/websat/ PMID:19255650
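The abstract does not spell out WebSat's detection algorithm, but the core task, finding tandem repeats of short motifs, can be sketched with a backreference regex. The function and parameter names below are illustrative, not WebSat's API.

```python
import re

def find_ssrs(seq, min_unit=1, max_unit=6, min_repeats=3):
    """Scan a DNA sequence for simple sequence repeats (microsatellites):
    tandem repeats of short motifs occurring at least min_repeats times.
    Returns (start_position, motif, copy_count) tuples."""
    ssrs = []
    for unit in range(min_unit, max_unit + 1):
        # (.{unit}) captures a candidate motif; \1{n,} requires n further
        # tandem copies of that same motif immediately after it.
        pattern = re.compile(r"(.{%d})\1{%d,}" % (unit, min_repeats - 1))
        for m in pattern.finditer(seq):
            ssrs.append((m.start(), m.group(1), len(m.group(0)) // unit))
    return ssrs

# Example: a dinucleotide repeat embedded in flanking sequence.
print(find_ssrs("TTTACGAGAGAGAGAGCCA", min_unit=2, max_unit=2))
# → [(5, 'GA', 5)]
```

Note that a left-to-right regex scan reports the repeat in its first reading frame (here `GA` rather than `AG`); a production tool would normalize motifs and filter overlapping calls across unit lengths before designing primers around the flanking sequence.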
Li, Yunjia; Wald, Mike; Wills, Gary; Khoja, Shakeel; Millard, David; Kajaba, Jiri; Singh, Priyanka; Gilbert, Lester
This paper discusses the development of a Web-based media annotation application named Synote, which addresses the important issue that, while a whole multimedia resource on the Web can be easily bookmarked, searched, linked to and tagged, it is still difficult to search or associate notes or other resources with a certain part of a resource. Synote supports the creation of synchronized notes, bookmarks, tags, links, images and text captions. It is a freely available application that enables any user to make annotations in, and search annotations of, any fragment of a continuous multimedia resource in the most used browsers and operating systems. In the implementation, Synote categorizes different media resources and synchronizes them via a timeline. The presentation of synchronized resources makes full use of Web 2.0 AJAX technology to enrich interactivity and the user experience. Positive evaluation results regarding the performance, efficiency and effectiveness of Synote were returned when using it with students and teachers for a number of undergraduate courses.
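The timeline synchronization idea, anchoring each annotation to a time fragment so search returns the fragment rather than the whole recording, can be sketched with a small data structure. This is illustrative only and not Synote's schema or API.

```python
from bisect import insort

# Sketch: annotations keyed to (start, end) offsets on a media timeline,
# kept sorted by start time so playback-order traversal is free.
class Timeline:
    def __init__(self):
        self._notes = []  # sorted list of (start_s, end_s, text)

    def annotate(self, start_s, end_s, text):
        insort(self._notes, (start_s, end_s, text))

    def search(self, term):
        """Return (start, end) fragments whose note text contains term,
        so a player can seek straight to the matching fragment."""
        return [(s, e) for s, e, t in self._notes if term in t]

tl = Timeline()
tl.annotate(120.0, 135.0, "definition of recursion")
tl.annotate(300.0, 320.0, "recursion example on trees")
print(tl.search("recursion"))  # → [(120.0, 135.0), (300.0, 320.0)]
```

The same anchoring generalizes to the other annotation types the paper lists (bookmarks, tags, captions); each is just a differently typed payload attached to a fragment.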
Mao, Jian; Chen, Yue; Shi, Futian; Jia, Yaoqi; Liang, Zhenkai
Web applications have become the foundation of many types of systems, ranging from cloud services to Internet of Things (IoT) systems. Due to the large amount of sensitive data processed by web applications, user privacy emerges as a major concern in web security. Existing protection mechanisms in modern browsers, e.g., the same origin policy, prevent the users’ browsing information on one website from being directly accessed by another website. However, web applications executed in the same browser share the same runtime environment. Such shared states provide side channels for malicious websites to indirectly figure out the information of other origins. Timing is a classic side channel and the root cause of many recent attacks, which rely on the variations in the time taken by the systems to process different inputs. In this paper, we propose an approach to expose the timing-based probing attacks in web applications. It monitors the browser behaviors and identifies anomalous timing behaviors to detect browser probing attacks. We have prototyped our system in the Google Chrome browser and evaluated the effectiveness of our approach by using known probing techniques. We have applied our approach on a large number of top Alexa sites and reported the suspicious behavior patterns with corresponding analysis results. Our theoretical analysis illustrates that the effectiveness of the timing-based probing attacks is dramatically limited by our approach. PMID:28245610
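As a rough illustration of the detection idea, flagging anomalous timing behavior rather than modeling any specific browser internals, consider this heuristic sketch. The sliding window, threshold, and event format are assumptions of the sketch, not the paper's Chrome prototype:

```python
from collections import defaultdict

def flag_timing_probes(events, window=1.0, threshold=50):
    """Flag scripts issuing an anomalous number of high-resolution timing
    measurements per sliding window -- a crude stand-in for monitoring
    browser behaviors to expose timing-based probing.

    events: list of (timestamp_sec, script_id) pairs for timer API calls.
    """
    suspicious = set()
    by_script = defaultdict(list)
    for ts, script in sorted(events):
        calls = by_script[script]
        calls.append(ts)
        # drop calls that fell out of the sliding window
        while calls and calls[0] < ts - window:
            calls.pop(0)
        if len(calls) > threshold:
            suspicious.add(script)
    return suspicious

# 60 timer calls within 60 ms trips the detector; 5 slow calls do not
print(flag_timing_probes([(i * 0.001, "probe.js") for i in range(60)]))
```

A production monitor would of course track many more behavioral signals, but the anomaly-threshold structure is the same.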
Lexical Link Analysis Application: Improving Web Service to Acquisition… Acquisition Research Symposium, 2015 (www.acquisitionresearch.net).
Chakaveh, Sepideh; Geuer, Olaf; Werning, Stefan; Borggrefe, Sorina; Haeger, Ralf
In the near future interactive television will provide many entertaining and innovative broadcasting formats for TV viewers. Moreover, recent advancements in web-based visualisation techniques already make complex application scenarios realisable. With the aid of a demonstrator called "deinewahl02" we proved a challenging concept for importing web-based applications onto more complex platforms such as MHP set-top boxes. "deinewahl02", which simulates a political TV debate for the German general elections of 2002, playfully and entertainingly guides the viewer into the programme. Functionalities such as "Voting" and "Hotspots" demonstrate the possibilities of interaction within the programme, providing a two-way communication channel that can be established instantly between the viewer and the broadcaster. "deinewahl02" has successfully demonstrated ways in which web-based applications may quickly and cheaply be implemented on much more complex platforms.
Poller, Andreas; Steinebach, Martin; Liu, Huajian
We present two approaches to robust image obfuscation based on permutation of image regions and channel intensity modulation. The proposed concept of robust image obfuscation is a step towards end-to-end security in Web 2.0 applications. It helps to protect the privacy of the users against threats caused by internet bots and web applications that extract biometric and other features from images for data-linkage purposes. The approaches described in this paper consider that images uploaded to Web 2.0 applications pass several transformations, such as scaling and JPEG compression, until the receiver downloads them. In contrast to existing approaches, our focus is on usability, therefore the primary goal is not a maximum of security but an acceptable trade-off between security and resulting image quality.
Friberg, P. A.; Luis, R. S.; Quintiliani, M.; Lisowski, S.; Hunter, S.
Recently, a novel set of modules has been included in the Open Source Earthworm seismic data processing system, supporting the use of web applications. These include the Mole sub-system, for storing relevant event data in a MySQL database (see M. Quintiliani and S. Pintore, SRL, 2013), and an embedded web server, Moleserv, for serving such data to web clients in QuakeML format. These modules have enabled, for the first time using Earthworm, the use of web applications for seismic data processing. These can greatly simplify the operation and maintenance of seismic data processing centers by having one or more servers provide the relevant data, as well as the data processing applications themselves, to client machines running arbitrary operating systems. Web applications with secure online web access allow operators to work anywhere, without the often cumbersome and bandwidth-hungry use of secure shell or virtual private networks. Furthermore, web applications can seamlessly access third-party data repositories to acquire additional information, such as maps. Finally, the use of HTML email brought the possibility of specialized web applications to be used in email clients. This is the case of EWHTMLEmail, which produces event notification emails that are in fact simple web applications for plotting relevant seismic data. Providing web services as part of Earthworm has enabled a number of other tools as well. One is ISTI's EZ Earthworm, a web-based command and control system for an otherwise command-line-driven system; another is a waveform web service. The waveform web service serves Earthworm data to additional web clients for plotting, picking, and other web-based processing tools. The current Earthworm waveform web service hosts an advanced plotting capability for providing views of event-based waveforms from a Mole database served by Moleserv. The current trend towards the usage of cloud services supported by web applications is driving improvements in Java
Graham, Matthew; Williams, R. D.; Djorgovski, S. D.; Drake, A. J.; Mahabal, A.
Skyalert.org is a web-based management system for collecting and disseminating observations about time-critical astronomical transients, and for adding annotations and intelligent machine learning to those observations. The information is "pushed" to subscribers, who may be either humans (email, text message, etc.) or machines that control telescopes. Subscribers can prepare precise "trigger rules" to decide which events should reach them and their robots; rules may be based on sky position, or on the specific vocabulary of parameters that defines a particular type of event. Each event has its own web page, updated immediately when new information arrives, with long-lived URLs and wiki capability. The subscriber has an account on the web, and builds the trigger rules and watch-lists there, defining decision criteria about future events. As soon as a transient event is seen and causes a trigger, a message can be pushed to the subscriber via email, IM, text message, etc. Annotations can be fetched automatically and immediately from archives such as SDSS, DSS, NED, Simbad, or other Virtual Observatory resources. Other actions upon event arrival include the immediate running of data mining or classification modules, based on the event and past data. Skyalert can also drive robotic telescopes through the HTN and dc3.org schedulers; it can evaluate joint trigger rules such as "magnitude difference from SDSS". Skyalert is a component system allowing pluggable custom data mining modules, distributed intelligence, and a central point of information for each transient. Our twin thrusts are automation of process and discrimination of interesting events.
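A subscriber trigger rule of the kind described, a sky-position cone plus per-parameter cuts, can be sketched as follows. The rule structure, field names, and flat-sky separation formula are illustrative assumptions, not Skyalert's actual rule syntax:

```python
import math

def matches_trigger(event, rule):
    """Evaluate a hypothetical trigger rule against an event's parameters.

    rule may contain a sky-position cone (ra, dec, radius in degrees)
    and numeric cuts on named event parameters.
    """
    if "cone" in rule:
        ra0, dec0, radius = rule["cone"]
        # small-angle flat-sky separation, adequate for a sketch
        d_ra = (event["ra"] - ra0) * math.cos(math.radians(dec0))
        sep = math.hypot(d_ra, event["dec"] - dec0)
        if sep > radius:
            return False
    for param, (lo, hi) in rule.get("cuts", {}).items():
        # a missing parameter fails the cut (NaN comparisons are False)
        if not (lo <= event.get(param, float("nan")) <= hi):
            return False
    return True

event = {"ra": 150.1, "dec": 2.2, "magnitude": 17.5}
rule = {"cone": (150.0, 2.0, 1.0), "cuts": {"magnitude": (15.0, 19.0)}}
print(matches_trigger(event, rule))  # → True
```

In the real system such a rule would run server-side on each arriving event, and a match would push a message to the subscriber or a robotic telescope.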
Optimizing the Replication of Multi-Quality Web Applications Using ACO and WoLF. M.S. thesis, Judson C. Dressler, Second Lieutenant, USAF (AFIT/GCS/ENG/06-05). A similar approach was used to merge WoLF with ACS-TSP (an ACO algorithm) to solve a Traveling Salesman Problem.
Patel, Shyamal; Chen, Bor-Rong; Buckley, Thomas; Rednic, Ramona; McClure, Doug; Tarsy, Daniel; Shih, Ludy; Dy, Jennifer; Welsh, Matt; Bonato, Paolo
Objective long-term health monitoring can improve the clinical management of several medical conditions ranging from cardiopulmonary diseases to motor disorders. In this paper, we present our work toward the development of a home-monitoring system. The system is currently used to monitor patients with Parkinson's disease who experience severe motor fluctuations. Monitoring is achieved using wireless wearable sensors whose data are relayed to a remote clinical site via a web-based application. The work herein presented shows that wearable sensors combined with a web-based application provide reliable quantitative information that can be used for clinical decision making.
Moniz, L.J.; Cooch, E.G.; Ellner, S.P.; Nichols, J.D.; Nichols, J.M.
In this paper we use information theory techniques on time series of abundances to determine the topology of a food web. At the outset, the food web participants (two consumers, two resources) are known; in addition we know that each consumer prefers one of the resources over the other. However, we do not know which consumer prefers which resource, or whether this preference is absolute (i.e., whether or not the consumer will consume the non-preferred resource). Although the consumers and resources are identified at the beginning of the experiment, we also provide evidence that the consumers are not resources for each other, and the resources do not consume each other. We do show that there is significant mutual information between resources; the model is seasonally forced and some shared information between resources is expected. Similarly, because the model is seasonally forced, we expect shared information between consumers as they respond to the forcing of the resources. The model that we consider does include noise, and in an effort to demonstrate that these methods may be useful beyond model data, we show the efficacy of our methods with decreasing time series size; in this particular case we obtain reasonably clear results with a time series length of 400 points, which approaches ecological time series lengths from real systems.
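The key statistic, mutual information between two abundance series, can be estimated from a joint histogram. This is a generic textbook estimator, not the authors' exact procedure; the bin count and series are illustrative:

```python
import math
from collections import Counter

def mutual_information(x, y, bins=4):
    """Histogram estimate of mutual information I(X;Y) in bits."""
    def discretize(series):
        lo, hi = min(series), max(series)
        width = (hi - lo) / bins or 1.0
        return [min(int((v - lo) / width), bins - 1) for v in series]

    xd, yd = discretize(x), discretize(y)
    n = len(xd)
    px, py, pxy = Counter(xd), Counter(yd), Counter(zip(xd, yd))
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        # p_ab * log2( p_ab / (p_a * p_b) )
        mi += p_ab * math.log2(p_ab * n * n / (px[a] * py[b]))
    return mi

# A series shares maximal information with itself (I(X;X) = H(X));
# 400 points matches the series length the abstract reports as sufficient.
x = [math.sin(0.1 * t) for t in range(400)]
print(round(mutual_information(x, x, bins=4), 2))
```

Comparing such pairwise estimates across all consumer/resource pairs is what lets the topology (who eats whom) be inferred from abundances alone.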
MVC: Model View Controller; NDFD: National Digital Forecast Database; NOAA: National Oceanic and Atmospheric Administration. GWT and Google App Engine for Java (GAE-J) are not required to implement the COLD-T application; any web framework (PHP, Struts, JSF, Spring MVC, etc.) could be used. Cited: Hyun Jung La et al., "Balanced MVC Architecture for Developing Service-based Mobile Applications," IEEE International Conference on E…, 2010.
Burger, D.; Stassun, K. G.; Pepper, J. A.; Siverd, R. J.; Paegert, M. A.; De Lee, N. M.
Filtergraph is a web application being developed by the Vanderbilt Initiative in Data-intensive Astrophysics (VIDA) to flexibly handle a large variety of astronomy datasets. While current datasets at Vanderbilt are being used to search for eclipsing binaries and extrasolar planets, the system can easily be reconfigured for a wide variety of data sources. The user loads a flat-file dataset into Filtergraph, which instantly generates an interactive data portal that can be easily shared with others. From this portal, the user can immediately generate scatter plots, histograms, and tables based on the dataset. Key features of the portal include the ability to filter the data in real time through user-specified criteria, the ability to select data by dragging on the screen, and the ability to perform arithmetic operations on the data in real time. The application is being optimized for speed in the context of very large datasets: for instance, plots generated from a stellar database of 3.1 million entries render in less than 2 seconds on a standard web server platform. This web application has been created using the Web2py web framework based on the Python programming language.
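The two portal operations highlighted, real-time filtering on user criteria and arithmetic on columns, reduce to simple row transforms. The column names and data below are illustrative, not from an actual VIDA dataset:

```python
# A flat-file dataset as rows of column/value pairs (hypothetical stars)
rows = [
    {"star_id": 1, "mag_b": 13.2, "mag_v": 12.4},
    {"star_id": 2, "mag_b": 15.1, "mag_v": 14.9},
    {"star_id": 3, "mag_b": 11.0, "mag_v": 10.1},
]

def apply_filter(data, predicate):
    """Keep only rows matching a user-specified criterion."""
    return [r for r in data if predicate(r)]

def derive(data, name, expr):
    """Add a column computed arithmetically from existing columns."""
    return [{**r, name: expr(r)} for r in data]

# criterion: bright stars; derived column: B-V color index
bright = apply_filter(rows, lambda r: r["mag_v"] < 13.0)
colored = derive(bright, "b_v", lambda r: round(r["mag_b"] - r["mag_v"], 2))
print([(r["star_id"], r["b_v"]) for r in colored])  # → [(1, 0.8), (3, 0.9)]
```

The engineering challenge Filtergraph addresses is making these operations interactive at millions of rows, which this sketch ignores entirely.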
Jeliazkova, Nina; Jeliazkov, Vedrin
The AMBIT web services package is one of several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by: i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; iii) data and algorithms, available through a standardized REST web services interface, where every compound, dataset or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows one to easily run predictions, without installing any software, as well as to share online datasets and models. The downloadable web application
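The paradigm "read data from a web address, perform processing, write to a web address" can be sketched with an in-memory store standing in for real HTTP. The URLs and the threshold "model" are illustrative assumptions, not AMBIT's actual resources or API:

```python
# Every resource lives at its own address; processing reads one address
# and writes its result to another.
store = {}

def put(url, resource):
    store[url] = resource

def get(url):
    return store[url]

def process(input_url, output_url, fn):
    """Read from input_url, apply fn, write the result to output_url."""
    put(output_url, fn(get(input_url)))
    return output_url

put("/dataset/1", [0.2, 0.9, 0.4])
# a stand-in "prediction model": classify each value against a threshold
process("/dataset/1", "/dataset/1/prediction",
        lambda xs: ["toxic" if x > 0.5 else "non-toxic" for x in xs])
print(get("/dataset/1/prediction"))  # → ['non-toxic', 'toxic', 'non-toxic']
```

Because every input and output is itself addressable, results can be chained into further processing or linked from other resources, which is the Linked Data benefit the abstract emphasizes.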
Laakso, J. H.; Straayer, J. W.
Three large scale advanced composite shear web components were tested and analyzed to evaluate application of the design concept to a space shuttle orbiter thrust structure. The shear web design concept consisted of a titanium-clad ±45° boron/epoxy web laminate stiffened with vertical boron/epoxy reinforced aluminum stiffeners. The design concept was evaluated to be efficient and practical for the application that was studied. Because of the effects of buckling deflections, a requirement is identified for shear buckling resistant design to maximize the efficiency of highly loaded advanced composite shear webs. An approximate analysis of prebuckling deflections is presented, and computer-aided design results, which consider prebuckling deformations, indicate that the design concept offers a theoretical weight saving of 31 percent relative to all-metal construction. Recommendations are made for design concept options and analytical methods that are appropriate for production hardware.
Gupta, V.; Gupta, N.; Gupta, S.; Field, E.; Maechling, P.
Modern laptop and personal computers can provide capabilities that are, in many ways, comparable to workstations or departmental servers. However, this does not mean we should run all computations on our local computers. We have identified several situations in which it is preferable to implement our seismological application programs in a distributed, server-based computing model. In this model, application programs on the user's laptop, or local computer, invoke programs that run on an organizational server, and the results are returned to the invoking system. Situations in which a server-based architecture may be preferred include: (a) a program is written in a language, or written for an operating environment, that is unsupported on the local computer; (b) software libraries or utilities required to execute a program are not available on the user's computer; (c) a computational program is physically too large, or computationally too expensive, to run on a user's computer; (d) a user community wants to enforce a consistent method of performing a computation by standardizing on a single implementation of a program; and (e) the computational program may require current information that is not available to all client computers. Until recently, distributed, server-based computational capabilities were implemented using client/server architectures. In these architectures, client programs were often written in the same language, and executed in the same computing environment, as the servers. Recently, a new distributed computational model, called Web Services, has been developed. Web Services are based on Internet standards such as XML, SOAP, WSDL, and UDDI. Web Services offer the promise of platform- and language-independent distributed computing. To investigate this new computational model, and to provide useful services to the SCEC Community, we have implemented several computational and utility programs using a Web Service architecture. We have
Gandhi, Nilima; Bhavsar, Satyendra P; Gewurtz, Sarah B; Diamond, Miriam L; Evenset, Anita; Christensen, Guttorm N; Gregor, Dennis
A multichemical food web model has been developed to estimate the biomagnification of interconverting chemicals in aquatic food webs. We extended a fugacity-based food web model for single chemicals to account for reversible and irreversible biotransformation among a parent chemical and transformation products, by simultaneously solving the mass balance equations of the chemicals using a matrix solution. The model can be applied to any number of chemicals and organisms or taxonomic groups in a food web. The model was illustratively applied to four PBDE congeners, BDE-47, -99, -100, and -153, in the food web of Lake Ellasjøen, Bear Island, Norway. In Ellasjøen arctic char (Salvelinus alpinus), the multichemical model estimated PBDE biotransformation from higher to lower brominated congeners and improved the correspondence between estimated and measured concentrations in comparison to estimates from the single-chemical food web model. The underestimation of BDE-47, even after considering bioformation due to biotransformation of the other three congeners, suggests its formation from additional biotransformation pathways not considered in this application. The model estimates approximate values for congener-specific biotransformation half-lives of 5.7, 0.8, 1.14, and 0.45 years for BDE-47, -99, -100, and -153, respectively, in large arctic char (S. alpinus) of Lake Ellasjøen.
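The "matrix solution" idea, solving coupled mass balances for a parent and its transformation product simultaneously, can be shown with a toy two-chemical, one-organism steady state. All rate constants and uptake fluxes below are hypothetical, chosen only to make the coupling visible:

```python
def solve2(a, b):
    """Solve a 2x2 linear system a @ x = b by Cramer's rule."""
    (a11, a12), (a21, a22) = a
    det = a11 * a22 - a12 * a21
    return [(b[0] * a22 - a12 * b[1]) / det,
            (a11 * b[1] - b[0] * a21) / det]

# Steady-state mass balance: parent P is eliminated and biotransformed
# into product Q; Q gains the biotransformed mass ("bioformation"):
#   uptake_P = (k_elim_P + k_trans) * C_P
#   uptake_Q + k_trans * C_P = k_elim_Q * C_Q
k_elim_P, k_elim_Q, k_trans = 0.3, 0.5, 0.2
uptake_P, uptake_Q = 10.0, 2.0
A = [[k_elim_P + k_trans, 0.0],
     [-k_trans,           k_elim_Q]]
b = [uptake_P, uptake_Q]
C_P, C_Q = solve2(A, b)
print(C_P, C_Q)  # → 20.0 12.0 ; Q is enriched by bioformation from P
```

The published model does the same with larger matrices spanning all congeners and all organisms in the food web, which is why interconversion can be handled in one solve.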
Pispidikis, I.; Dimopoulou, E.
Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.
The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for the necessary infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
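The "standardized access" an OGC-compliant WMS server offers is just a well-known URL contract. The sketch below builds a WMS 1.1.1 GetMap request; the base URL and layer name are placeholders, not the actual TOPS endpoint:

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(512, 512), time=None):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (minx, miny, maxx, maxy) in EPSG:4326; the optional TIME
    parameter addresses the temporal dimension of gridded products.
    """
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0], "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    if time:
        params["TIME"] = time
    return base + "?" + urlencode(params)

print(wms_getmap_url("http://example.org/wms", "tops:gpp",
                     (-125, 32, -114, 42), time="2008-07-01"))
```

Because any WMS client (virtual globes, desktop GIS, web maps) can issue this same request, the server's data become usable without custom integration work.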
Della Mea, Vincenzo; De Momi, Ivan; Aprile, Giuseppe; Puglisi, Fabio; Menis, Jessica; Casetta, Anica; Bolzonello, Silvia; Fasola, Gianpiero
Collection of collateral effects related to toxicities suffered by patients being exposed to anticancer treatments is of crucial importance in clinical practice but also in oncological research. The present paper describes a web application called PaTOS for self-report of anticancer therapy toxicities, and its evaluation in a preliminary interface analysis and then in a feasibility study.
Pritchett, Christopher G.; Pritchett, Christal C.; Wohleb, Elisha C.
This research study was designed to determine the degree of use of Web 2.0 technology applications by certified education professionals and examine differences among various groups as well as reasons for these differences. A quantitative survey instrument was developed to gather demographic information and data. Participants reported they would be…
von Franqué, Alexander; Tellioglu, Hilda
Many educational institutions use Learning Management Systems to provide e-learning content to their students. This often includes quizzes that can help students to prepare for exams. However, the content is usually web-optimized and not very usable on mobile devices. In this work a native mobile application ("UML Quiz") that imports…
The study aims to investigate whether using Web 2.0 applications promotes reflective thinking skills among higher education students in a faculty of education. Although the literature reveals that technology integration is a trend in higher education, and researchers and educators have increasingly shared their ideas and examples of implementations of Web…
Pelet, Jean-Eric, Ed.
Once considered the traditional approach to education, brick and mortar institutions are no longer the norm due to e-learning technologies. Populations are turning into ubiquitous human beings, and educational practices are reflecting this change. "E-Learning 2.0 Technologies and Web Applications in Higher Education" compiles the latest…
Trinidad, Sue; Broadley, Tania
The research reported in this paper documents the use of Web2.0 applications with six Western Australian schools that are considered to be regional and/or remote. With a population of two million people within an area of 2,525,500 square kilometres Western Australia has a number of towns that are classified as regional and remote. Each of the…
Cakiroglu, Unal; Akkan, Yasar; Guven, Bulent
Determining the reflections of technology integration applications that are to be performed in our schools is important for lighting the way for the first steps of integration. In this research, the effect on school culture of a web-based instruction environment used by 31 different teachers in a high school is set forth. The school culture is analyzed…
Pritchett, Christal C.; Wohleb, Elisha C.; Pritchett, Christopher G.
This research study was designed to examine the degree of perceived importance of interactive technology applications among various groups of certified educators; the degree to which education professionals utilized interactive online technology applications and to determine if there was a significant difference between the different groups based…
Reigle, Rosemary R.
The changing online learning environment requires that instructors depend less on the standard tools built into most educational learning platforms and turn their focus to use of Open Educational Resources (OERs) and free or low-cost commercial applications. These applications permit new and more efficient ways to build online learning communities…
Web Engineering is the application of systematic, disciplined and quantifiable approaches to development, operation, and maintenance of Web-based applications. It is both a pro-active approach and a growing collection of theoretical and empirical research in Web application development. This paper gives an overview of Web Engineering by addressing the questions: (a) why is it needed? (b) what is its domain of operation? (c) how does it help and what should it do to improve Web application development? and (d) how should it be incorporated in education and training? The paper discusses the significant differences that exist between Web applications and conventional software, the taxonomy of Web applications, the progress made so far and the research issues and experience of creating a specialization at the master's level. The paper reaches a conclusion that Web Engineering at this stage is a moving target since Web technologies are constantly evolving, making new types of applications possible, which in turn may require innovations in how they are built, deployed and maintained.
Purpose: The purpose of this paper is to describe the use of AJAX for searching the Biblioteche Oggi database of bibliographic records. Design/methodology/approach: The paper is a demonstration of how bibliographic database single page interfaces allow the implementation of more user-friendly features for social and collaborative tasks. Findings:…
Siewert, René; Specovius, Svenja; Wu, Jie; Krefting, Dagmar
Interactive visualization and correction of intermediate results are required in many medical image analysis pipelines. To allow such interaction during the remote execution of compute- and data-intensive applications, new features of HTML5 are used. They allow for transparent integration of user interaction into Grid- or Cloud-enabled scientific workflows. Both 2D and 3D visualization and data manipulation can be performed through a scientific gateway without the need to install specific software or web browser plugins. The possibilities of web-based visualization are presented along the FreeSurfer pipeline, a popular compute- and data-intensive software tool for quantitative neuroimaging.
Szostek, K.; Piórkowski, A.
In this article the construction and potential of OpenGL multi-user web-based applications are presented. The most common technologies, .NET ASP, Java, and Mono, were used together with specific OpenGL libraries to visualize three-dimensional medical data. The most important conclusion of this work is that server-side applications can easily take advantage of a fast GPU and deliver efficient results for advanced computation as well as visualization.
Sun, Charles; Windrem, May; Picinich, Lou
World Wide Web (W3) technologies are considered in relation to their application to space missions. It is considered that such technologies, including the hypertext transfer protocol and the Java object-oriented language, offer a powerful and relatively inexpensive framework for distributed application software development. The suitability of these technologies for payload monitoring systems development is discussed, and the experience gained from the development of an insect habitat monitoring system based on W3 technologies is reported.
A web-based application to help Southern High Plains cotton producers estimate profitability under center pivot irrigated production is described. The application’s crop modeling and general profit calculation approach are outlined in a preceding companion paper, while additional details of the prof...
A web-based application intended to help Southern High Plains cotton producers estimate profitability under center pivot irrigated production is described. The application’s crop modeling and general profit calculation approach are outlined in a preceding companion paper, while additional details of...
Pierlet, Noëlla; Aerts, Werner; Vanautgaerden, Mark; Van den Bosch, Bart; De Deurwaerder, André; Schils, Erik; Noppe, Thomas
The LISA application, developed by the University Hospitals Leuven, permits referring physicians to consult the electronic medical records of their patients over the internet in a highly secure way. We decided to completely change the way we secured the application, discard the existing web application and build a completely new application, based on the in-house developed hospital information system, used in the University Hospitals Leuven. The result is a fat Java client, running on a Windows Terminal Server, secured by a commercial SSL-VPN solution.
Zaldivar, Vicente Arturo Romero; Arandia, Jon Ander Elorriaga; Brito, Mateo Lezcano
In this article, the main characteristics of the educational browser YADBrowser are described. One of the main objectives of this project is to define new languages and object models which facilitate the creation of educational applications for the Internet. The fundamental characteristics of the object model of the browser are also described.…
Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran
Various disciplines are trying to address one of the most noteworthy questions and broadly used concepts in biology: essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides numerical results in comma-separated values (CSV) format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/ PMID:26571275
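Two of the most common centrality indices such a service computes can be sketched directly, degree and closeness centrality on a small unweighted graph (this is the standard textbook definition, not CentiServer's implementation):

```python
from collections import deque

def degree_centrality(adj):
    """Fraction of other nodes each node is directly connected to."""
    n = len(adj) - 1
    return {v: len(nbrs) / n for v, nbrs in adj.items()}

def closeness_centrality(adj):
    """(n-1) / sum of BFS shortest-path distances (connected graph)."""
    out = {}
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        out[src] = (len(adj) - 1) / sum(dist.values())
    return out

# A 4-node path graph a-b-c-d: the interior nodes b and c score highest,
# which is the intuition behind using centrality to rank essential nodes.
g = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(degree_centrality(g))
print(closeness_centrality(g))
```

The 55 indices CentiServer offers generalize this pattern with progressively more global notions of "importance" (betweenness, eigenvector, and so on).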
Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.; Batteh, John J; Tiller, Michael M.
Previous reports focused on the development of component and system models as well as end-to-end system models using Modelica and Dymola for two advanced reactor architectures: (1) Advanced Liquid Metal Reactor and (2) fluoride high-temperature reactor (FHR). The focus of this report is the release of the first beta version of the web-based application for model use and collaboration, as well as an update on the FHR model. The web-based application allows novice users to configure end-to-end system models from preconfigured choices to investigate the instrumentation and controls implications of these designs and allows for the collaborative development of individual component models that can be benchmarked against test systems for potential inclusion in the model library. A description of this application is provided along with examples of its use and a listing and discussion of all the models that currently exist in the library.
Romaniuk, Ryszard S.
The XXXVth periodic Symposium WILGA (winter edition) on Design, Construction and Application of Advanced Electronic and Photonic Systems was held at the end of January 2015. It is an established, periodic meeting of young researchers, M.Sc. and Ph.D. students and their supervisors. The meeting has been organized by the PERG/ELHEP Laboratories of the Institute of Electronic Systems, Warsaw University of Technology, for two decades. The sessions of the January 2015 meeting were: development of the architecture of digital electronics, embedded systems, design of system functionality, analog electronics and photonics, hardware-software integration, reliability and dependability of complex systems working in harsh environments, and applications of electronic and photonic systems in space and satellite engineering and large research experiments. The summer edition of the WILGA Symposium is organized on 25-31 May 2015 [wilga.ise.pw.edu.pl].
Andreeva, J.; Dzhunov, I.; Karavakis, E.; Kokoszkiewicz, L.; Nowotka, M.; Saiz, P.; Tuckett, D.
Fast, Karl V.; Campbell, D. Grant
Compares the implied ontological frameworks of the Open Archives Initiative Protocol for Metadata Harvesting and the World Wide Web Consortium's Semantic Web. Discusses current search engine technology, semantic markup, indexing principles of special libraries and online databases, and componentization and the distinction between data and…
Dineen, Brian R; Noe, Raymond A
The authors examined 2 forms of customization in a Web-based recruitment context. Hypotheses were tested in a controlled study in which participants viewed multiple Web-based job postings that each included information about multiple fit categories. Results indicated that customization of information regarding person-organization (PO), needs-supplies, and demands-abilities (DA) fit (fit information customization) and customization of the order in which these fit categories were presented (configural customization) had differential effects on outcomes. Specifically, (a) applicant pool PO and DA fit were greater when fit information customization was provided, (b) applicant pool fit in high- versus low-relevance fit categories was better differentiated when configural customization was provided, and (c) overall application rates were lower when either or both forms of customization were provided.
Duarte, L; Teodoro, A C; Gonçalves, J A; Soares, D; Cunha, M
Soil erosion is a serious environmental problem. An estimation of the expected soil loss by water-caused erosion can be calculated with the Revised Universal Soil Loss Equation (RUSLE). Geographical Information Systems (GIS) provide different tools to create categorical maps of soil erosion risk which help to study the risk assessment of soil loss. The objective of this study was to develop a GIS open source application (in QGIS), using the RUSLE methodology, for estimating the erosion rate at the watershed scale (desktop application) and to provide the same application via web access (web application). The applications developed allow one to generate all the maps necessary to evaluate the soil erosion risk. Several libraries and algorithms from SEXTANTE were used to develop these applications. The applications were tested in the Montalegre municipality (Portugal). The maps involved in the RUSLE method (soil erosivity factor, soil erodibility factor, topographic factor, cover management factor, and support practices factor) were created. The estimated mean value of the soil loss obtained was 220 ton km⁻² year⁻¹, ranging from 0.27 to 1283 ton km⁻² year⁻¹. The results indicated that most of the study area (80 %) is characterized by a very low soil erosion level (<321 ton km⁻² year⁻¹) and that in 4 % of the studied area the soil erosion was higher than 962 ton km⁻² year⁻¹. It was also concluded that areas with high slope values and bare soil are associated with high levels of erosion, and that the higher the P and C values, the higher the soil erosion percentage. The RUSLE web and desktop applications are freely available.
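The RUSLE estimate itself is a simple product of five factors, which can be sketched in a few lines of Python. The factor values below are illustrative only, not taken from the Montalegre study.

```python
def rusle_soil_loss(r, k, ls, c, p):
    """Annual soil loss A as the product of the five RUSLE factors:
        A = R * K * LS * C * P
    r  - rainfall erosivity, k - soil erodibility, ls - topographic factor,
    c  - cover management,   p - support practices."""
    return r * k * ls * c * p

# Hypothetical factor values for a single grid cell.
a = rusle_soil_loss(r=800.0, k=0.30, ls=1.2, c=0.25, p=1.0)
```

With these inputs the estimated loss is 72.0 in the units implied by the factors; in a GIS implementation the same product is evaluated cell by cell over the factor rasters.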
Romaniuk, Ryszard S.
For twenty years, young researchers from the Institute of Electronic Systems, Warsaw University of Technology, have organized twice a year, under only marginal supervision by senior faculty members and under the patronage of WEiTI PW, KEiT PAN, SPIE, IEEE, PKOpto SEP and PSF, the WILGA Symposium on advanced, integrated functional electronic, photonic and mechatronic systems [1-5]. All aspects are considered: research and development, theory and design, technology - material and construction, software and hardware, commissioning and tests, as well as pilot and practical applications. The applications concern mostly what has, after several years, become a proud specialization of the WILGA Symposium: Internet engineering, high energy physics experiments, new power industry including fusion, the nuclear industry, space and satellite technologies, telecommunications, smart municipal environments, as well as biology and medicine [6-8]. The XXXVIIth WILGA Symposium was held on 29-31 January 2016 and gathered a few tens of young researchers active in the mentioned research areas. A few tens of technical papers were presented and will be published in Proc. SPIE together with the accepted articles from the summer edition of the WILGA Symposium scheduled for 29.05-06.06.2016. This article is a digest of chosen presentations from the WILGA Symposium 2016 winter edition. The survey is narrowed to a few chosen main topical tracks, such as electronics and photonics design using industrial standards like ATCA/MTCA, as well as particular designs of functional systems using this series of industrial standards. The paper, which traditionally summarizes each accomplished WILGA Symposium organized by young researchers from the Warsaw University of Technology, is also the next part of a cycle of papers concerning their participation in the design of new generations of electronic systems used in discovery experiments in Poland and in leading research laboratories of the world.
Sreenivasaiah, Pradeep Kumar; Kim, Do Han
The dynamic and rapidly evolving nature of systems-driven research imposes special requirements on the technology, approach, design and architecture of computational infrastructure, including databases and Web applications. Several solutions have been proposed to meet the expectations, and novel methods have been developed to address the persisting problems of data integration. It is important for researchers to understand different technologies and approaches. Having familiarized themselves with the pros and cons of the existing technologies, researchers can exploit their capabilities to the maximum potential for integrating data. In this review we discuss the architecture, design and key technologies underlying some of the prominent databases and Web applications. We mention their roles in the integration of biological data and investigate some of the emerging design concepts and computational technologies that are likely to have a key role in the future of systems-driven biomedical research. PMID:21423387
Pierce, Marlon E.; Fox, Geoffrey C.; Aktas, Mehmet S.; Aydin, Galip; Gadgil, Harshawardhan; Qi, Zhigang; Sayar, Ahmet
We describe our distributed systems research efforts to build the “cyberinfrastructure” components that constitute a geophysical Grid, or more accurately, a Grid of Grids. Service-oriented computing principles are used to build a distributed infrastructure of Web accessible components for accessing data and scientific applications. Our data services fall into two major categories: Archival, database-backed services based around Geographical Information System (GIS) standards from the Open Geospatial Consortium, and streaming services that can be used to filter and route real-time data sources such as Global Positioning System data streams. Execution support services include application execution management services and services for transferring remote files. These data and execution service families are bound together through metadata information and workflow services for service orchestration. Users may access the system through the QuakeSim scientific Web portal, which is built using a portlet component approach.
Maxbauer, Daniel P.; Feinberg, Joshua M.; Fox, David L.
It is common in the fields of rock and environmental magnetism to unmix magnetic mineral components using statistical methods that decompose various types of magnetization curves (e.g., acquisition, demagnetization, or backfield). A number of programs that are frequently used by the rock magnetic community have been developed over the past decade; however, many of these programs are either outdated or have obstacles inhibiting their usability. MAX UnMix is a web application (available online at http://www.irm.umn.edu/maxunmix), built using the shiny package for R, that can be used for unmixing coercivity distributions derived from magnetization curves. Here, we describe in detail the statistical model underpinning the MAX UnMix web application and discuss the program's functionality. MAX UnMix is an improvement over previous unmixing programs in that it is designed to be user friendly, runs as an independent website, and is platform independent.
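The kind of decomposition MAX UnMix performs can be illustrated with a minimal stdlib-only sketch: a coercivity distribution modelled as a weighted sum of Gaussian components on a log10(B) axis. The component values are hypothetical, and this is not the program's actual R/shiny model-fitting code.

```python
import math

def gaussian(x, mu, sigma):
    """Normal probability density evaluated at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def mixture(x, components):
    """Coercivity distribution modelled as a weighted sum of Gaussian
    components, each given as (weight, mean, std) in log10(B) units."""
    return sum(w * gaussian(x, mu, s) for w, mu, s in components)

# Two hypothetical components, e.g. a soft and a hard magnetic phase;
# weights sum to 1 so each weight is that phase's fractional contribution.
comps = [(0.7, 1.5, 0.3), (0.3, 2.4, 0.2)]
peak_soft = mixture(1.5, comps)
```

A fitting routine (as in MAX UnMix) would adjust the weights, means, and widths to minimize the misfit against the measured derivative curve; here the components are simply evaluated.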
Tsai, F.; Cho, K.
The Asian Association on Remote Sensing (AARS) organizes a web contest (WEBCON) on photogrammetry, remote sensing and spatial information sciences at the annual meeting of the Asian Conference on Remote Sensing (ACRS) every year. The purpose of WEBCON is to promote the development of web and other forms of internet services related to geo-information sciences and to attract more students and young scientists to the related fields of study and applications. Since 2011, WEBCON has become one of the major events at ACRS and has successfully increased interest in the research, development and applications of photogrammetry, remote sensing and spatial information sciences among students and young scientists. The success of WEBCON is an excellent example of promoting the profession of spatial information to young people.
Burger, Albert; Paschke, Adrian; Romano, Paolo; Marshall, M Scott; Splendiani, Andrea
As Semantic Web technologies mature and new releases of key elements, such as SPARQL 1.1 and OWL 2.0, become available, the Life Sciences continue to push the boundaries of these technologies with ever more sophisticated tools and applications. Unsurprisingly, therefore, interest in the SWAT4LS (Semantic Web Applications and Tools for the Life Sciences) activities has remained high, as was evident during the third international SWAT4LS workshop held in Berlin in December 2010. Contributors to this workshop were invited to submit extended versions of their papers, the best of which are now made available in this special supplement of BMC Bioinformatics. The papers reflect the wide range of work in this area, covering the storage and querying of Life Sciences data in RDF triple stores, tools for the development of biomedical ontologies, and the semantics-based integration of Life Sciences as well as clinical data.
Breunig, M.; Kuper, P. V.; Dittrich, A.; Wild, P.; Butwilowski, E.; Al-Doori, M.
The object-oriented database architecture DB4GeO was originally designed to support sub-surface applications in the geo-sciences. This is reflected in DB4GeO's geometric data model as well as in its import and export functions. Initially, these functions were designed for communication with 3D geological modeling and visualization tools such as GOCAD or MeshLab. However, it soon became clear that DB4GeO was suitable for a much wider range of applications. Therefore it is natural to move away from a standalone solution and to open access to DB4GeO data via standardized OGC web services. Though REST and OGC services seem incompatible at first sight, the implementation in DB4GeO shows that an OGC-based implementation of web services may use parts of the DB4GeO-REST implementation. Starting with initial solutions in the history of DB4GeO, this paper introduces the design, adaptation (i.e. model transformation), and first steps in the implementation of OGC Web Feature Services (WFS) and Web Processing Services (WPS) as new interfaces to DB4GeO data and operations. Among its capabilities, DB4GeO can provide data in different formats such as GML, GOCAD, or DB3D XML through a WFS, and can run operations such as a 3D-to-2D service or mesh simplification (Progressive Meshes) through a WPS. We then demonstrate an Android-based mobile 3D augmented reality viewer for DB4GeO that uses the Web Feature Service to visualize 3D geo-database query results. Finally, we explore future research work considering DB4GeO in the framework of the research group "Computer-Aided Collaborative Subway Track Planning in Multi-Scale 3D City and Building Models".
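A WFS GetFeature request of the kind such a service answers is just a key-value-pair URL. A minimal Python sketch follows; the endpoint and feature type name are hypothetical, not DB4GeO's actual ones.

```python
from urllib.parse import urlencode

def wfs_getfeature_url(base_url, type_name, version="2.0.0", output_format="GML3"):
    """Build an OGC WFS GetFeature request URL using KVP encoding."""
    params = {
        "service": "WFS",
        "version": version,
        "request": "GetFeature",
        "typeNames": type_name,       # WFS 2.0 parameter name
        "outputFormat": output_format,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and feature type for illustration.
url = wfs_getfeature_url("http://example.org/db4geo/wfs", "geo:BoreholeSurface")
```

The same pattern extends to WPS Execute requests, with the process identifier and inputs carried in the query string or a POST body.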
Stevens, C; Blackett, S; Legrice, I J; Hunter, P J
The Internet is becoming increasingly accessible and new technologies are enabling the delivery of more features to end users. It is therefore increasingly compelling to develop technology to facilitate the delivery of educational content and computational tools via the Internet. Here we report on the Internet enabling of the CMISS package as a Web browser extension, and its use in a custom online teaching application for medical students.
Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill
Many security incidents are caused by software developers' failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis, to integrate static analysis into Integrated Development Environment (IDE) and provide in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required nor are there any assumptions on ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.
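The flavour of such an in-IDE analysis rule can be sketched with Python's ast module: flag execute() calls whose query argument is built by string concatenation or %-formatting, a classic SQL-injection pattern. This is an illustrative toy in Python, not the authors' actual Eclipse/Java analysis.

```python
import ast

class SqlConcatRule(ast.NodeVisitor):
    """Toy rule in the spirit of interactive static analysis: warn on
    execute() calls whose first argument is built with `+` or `%`."""

    def __init__(self):
        self.warnings = []  # line numbers of suspicious calls

    def visit_Call(self, node):
        if (isinstance(node.func, ast.Attribute)
                and node.func.attr == "execute"
                and node.args
                and isinstance(node.args[0], ast.BinOp)
                and isinstance(node.args[0].op, (ast.Add, ast.Mod))):
            self.warnings.append(node.lineno)
        self.generic_visit(node)

def scan(source):
    """Return the line numbers of flagged execute() calls in source text."""
    rule = SqlConcatRule()
    rule.visit(ast.parse(source))
    return rule.warnings

unsafe = "cur.execute('SELECT * FROM users WHERE name = ' + name)"
safe = "cur.execute('SELECT * FROM users WHERE name = %s', (name,))"
```

An IDE plug-in would run such checks on each keystroke or save and surface the warnings in situ, rather than as a separate batch report.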
Living, working, and going to school near roadways has been associated with a number of adverse health effects, including asthma exacerbation, cardiovascular impairment, and respiratory symptoms. In the United States, 30%-45% of urban populations live or work in the near-road environment, with a greater percentage of minority and low-income residents living in areas with highly-trafficked roadways. Near-road studies typically use surrogates of exposure to evaluate potential causality of health effects, including proximity, traffic counts, or total length of roads within a given radius. In contrast, simplified models provide an opportunity to examine how changes in input parameters, such as vehicle counts or speeds, can affect air quality. Simplified or reduced-form models typically retain the same or similar algorithms most responsible for characterizing uncertainty in more sophisticated models. The Community Line Source modeling system (C-LINE) allows users to explore what-if scenarios such as increases in diesel trucks or total traffic; examine hot spot conditions and areas for further study; determine ideal monitor placement locations; or evaluate air quality changes due to traffic re-routing. This presentation describes the input parameters, analytical procedures, visualization routines, and software considerations for C-LINE, and an example application for Newport News, Virginia. Results include scenarios related to port development and resulting traffic
Dee, Fred R; Haugen, Thomas H; Kreiter, Clarence D
The goal of mechanistic case diagraming (MCD) is to provide students with a more in-depth understanding of cause-and-effect relationships and basic mechanistic pathways in medicine. This will enable them to better explain how observed clinical findings develop from preceding pathogenic and pathophysiological events. The pedagogic function of MCD is in relating risk factors, disease entities and morphology, signs and symptoms, and test and procedure findings in a specific case scenario with etiologic pathogenic and pathophysiological sequences within a flow diagram. In this paper, we describe the addition of automation and predetermined lists to further develop the original concept of MCD as described by Engelberg in 1992 and Guerrero in 2001. We demonstrate that with these modifications, MCD is effective and efficient in small group case-based teaching for second-year medical students (ratings of ~3.4 on a 4.0 scale). There was also a significant correlation with other measures of competency, with a 'true' score correlation of 0.54. A traditional calculation of reliability showed promising results (α = 0.47) within a low stakes, ungraded environment. Further, we have demonstrated MCD's potential for use in independent learning and team-based learning (TBL). Future studies are needed to evaluate MCD's potential for use in medium stakes assessment or self-paced independent learning and assessment. MCD may be especially relevant in returning students to the application of basic medical science mechanisms in the clinical years.
Han, Weiguo; Di, Liping; Yu, Genong; Shao, Yuanzheng; Kang, Lingjun
Geospatial Web Services (GWS) make geospatial information and computing resources discoverable and accessible over the Web. Among them, Open Geospatial Consortium (OGC) standards-compliant data, catalog and processing services are the most popular, and have been widely adopted and leveraged in geospatial research and applications. GWS metrics, such as visit count, average processing time, and user distribution, are important for evaluating their overall performance and impacts. However, these metrics, especially those of federated catalog services, have not been systematically evaluated and reported to relevant stakeholders from the point of view of service providers. Taking an integrated catalog service for earth observation data as an example, this paper describes metrics information retrieval, organization, and representation for a catalog service federation. An extensible and efficient log file analyzer is implemented to retrieve a variety of service metrics from the log file and store analysis results in an easily programmable format. An Ajax-powered Web portal is built to provide stakeholders, sponsors, developers, partners, and other types of users with specific and relevant insights into metrics information in an interactive and informative form. The deployed system has provided useful information for periodical reports, service delivery, and decision support. The proposed measurement strategy and analytics framework can serve as guidance to help GWS providers evaluate their services.
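A log-file analyzer of this kind reduces access-log lines to a few aggregate metrics. A minimal sketch follows, assuming a simplified "client request time_ms" line format; the real analyzer's log format and field set are not specified in the abstract.

```python
from collections import Counter

def service_metrics(log_lines):
    """Aggregate visit count, average processing time (ms) and per-client
    visit distribution from lines of the form 'client_ip request time_ms'."""
    times = []
    clients = Counter()
    for line in log_lines:
        ip, _request, ms = line.split()
        clients[ip] += 1
        times.append(float(ms))
    return {
        "visits": len(times),
        "avg_time_ms": sum(times) / len(times) if times else 0.0,
        "clients": clients,
    }

# Hypothetical catalog-service access log.
log = [
    "10.0.0.1 GetRecords 120",
    "10.0.0.2 GetRecords 80",
    "10.0.0.1 GetRecordById 40",
]
m = service_metrics(log)
```

Storing the aggregates in an "easily programmable format" such as JSON then lets an Ajax portal fetch and chart them on demand.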
Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A
Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation make it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among the multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.
Dagher, A P; Fitzpatrick, M; Flanders, A E; Eng, J
Java is a relatively new programming language that has been used to develop a World Wide Web-based tool for estimating magnetic resonance (MR) imaging relaxation times, thereby demonstrating how Java may be used for Web-based radiology applications beyond improving the user interface of teaching files. A standard processing algorithm coded with Java is downloaded along with the hypertext markup language (HTML) document. The user (client) selects the desired pulse sequence and inputs data obtained from a region of interest on the MR images. The algorithm is used to modify selected MR imaging parameters in an equation that models the phenomenon being evaluated. MR imaging relaxation times are estimated, and confidence intervals and a P value expressing the accuracy of the final results are calculated. Design features such as simplicity, object-oriented programming, and security restrictions allow Java to expand the capabilities of HTML by offering a more versatile user interface that includes dynamic annotations and graphics. Java also allows the client to perform more sophisticated information processing and computation than is usually associated with Web applications. Java is likely to become a standard programming option, and the development of stand-alone Java applications may become more common as Java is integrated into future versions of computer operating systems.
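The relaxation-time estimation such a tool performs can be illustrated with the mono-exponential spin-echo model. The sketch below (in Python rather than the article's Java) assumes just two echo measurements; the applet's actual fitting and confidence-interval code is not reproduced.

```python
import math

def estimate_t2(te1, s1, te2, s2):
    """Estimate T2 from two spin-echo measurements, assuming the
    mono-exponential decay S = S0 * exp(-TE / T2), which gives
        T2 = (TE2 - TE1) / ln(S1 / S2).
    TE in ms, signal in arbitrary units."""
    return (te2 - te1) / math.log(s1 / s2)

# Synthetic check: a tissue with T2 = 100 ms and S0 = 1000 (hypothetical).
s0, t2_true = 1000.0, 100.0
s_a = s0 * math.exp(-20.0 / t2_true)   # signal at TE = 20 ms
s_b = s0 * math.exp(-80.0 / t2_true)   # signal at TE = 80 ms
t2 = estimate_t2(20.0, s_a, 80.0, s_b)
```

With noisy region-of-interest data and more than two echoes, the same model would be fit by regression, which is where confidence intervals and a P value, as in the article, come from.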
Paquette, Suzanne M.; Leinonen, Kalle; Longabaugh, William J.R.
Background: The drug discovery process is now highly dependent on the management, curation and integration of large amounts of potentially useful data. Semantics are necessary in order to interpret the information and derive knowledge. Advances in recent years have mitigated concerns that the lack of robust, usable tools has inhibited the adoption of methodologies based on semantics. Results: This paper presents three examples of how Semantic Web techniques and technologies can be used in order to support chemistry research: a controlled vocabulary for quantities, units and symbols in physical chemistry; a controlled vocabulary for the classification and labelling of chemical substances and mixtures; and a database of chemical identifiers. This paper also presents a Web-based service that uses the datasets in order to assist with the completion of risk assessment forms, along with a discussion of the legal implications and value-proposition for the use of such a service. Conclusions: We have introduced the Semantic Web concepts, technologies, and methodologies that can be used to support chemistry research, and have demonstrated the application of those techniques in three areas very relevant to modern chemistry research, generating three new datasets that we offer as exemplars of an extensible portfolio of advanced data integration facilities. We have thereby established the importance of Semantic Web techniques and technologies for meeting Wild’s fourth “grand challenge”. PMID:24855494
Suryanto, Wiwit; Irnaka, Theodosius Marwan
One-dimensional modeling of magnetotelluric (MT) data has been performed using an online application on a web-based virtual private server. The application was developed in the Python language using the Django framework with HTML and CSS components. The input data, including the apparent resistivity and phase as a function of period or frequency with standard deviation, can be entered through an interactive web page that can be freely accessed at https://komputasi.geofisika.ugm.ac.id. The subsurface models, represented by resistivity as a function of depth, are iteratively improved by changing the model parameters, such as the resistivity and the layer depth, based on the observed apparent resistivity and phase data. The output of the application displayed on the screen presents resistivity as a function of depth and includes the RMS error for each iteration. Synthetic and real data were used in comparative tests of the application's performance, and it is shown that the application produces accurate subsurface resistivity models. Hence, this application can be used for practical one-dimensional modeling of MT data.
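The 1-D MT forward problem underlying such modeling is classically solved with the layered-earth impedance recursion. A compact sketch follows (not the application's actual Django code); for a homogeneous half-space it must return the layer resistivity as apparent resistivity and a 45° phase, which makes a convenient self-check.

```python
import cmath
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def mt1d_forward(resistivities, thicknesses, period):
    """1-D magnetotelluric forward response via the impedance recursion.
    resistivities: ohm-m per layer, top to bottom (last layer = half-space);
    thicknesses: m, one fewer entry than resistivities; period in s.
    Returns (apparent resistivity in ohm-m, impedance phase in degrees)."""
    omega = 2 * math.pi / period
    # Intrinsic impedance of the basal half-space.
    z = cmath.sqrt(1j * omega * MU0 * resistivities[-1])
    # Recurse upward through the finite layers.
    for rho, h in zip(reversed(resistivities[:-1]), reversed(thicknesses)):
        k = cmath.sqrt(1j * omega * MU0 / rho)   # complex wavenumber
        z0 = 1j * omega * MU0 / k                # layer intrinsic impedance
        t = cmath.tanh(k * h)
        z = z0 * (z + z0 * t) / (z0 + z * t)
    rho_app = abs(z) ** 2 / (omega * MU0)
    phase_deg = math.degrees(cmath.phase(z))
    return rho_app, phase_deg

# Homogeneous 100 ohm-m half-space at a 10 s period.
rho_a, phase = mt1d_forward([100.0], [], 10.0)
# A hypothetical two-layer model: 100 ohm-m over 10 ohm-m at 2 km depth.
rho_a2, phase2 = mt1d_forward([100.0, 10.0], [2000.0], 100.0)
```

An inversion, as in the web application, repeatedly calls such a forward routine while adjusting layer resistivities and depths to reduce the RMS misfit to the observed curves.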
Background: Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods: The data used in this study are based on tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results: The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions: In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This
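A spatial density surface of the sort such geoprocessing services produce can be approximated, in its crudest form, by binning geocoded cases into square grid cells. A stdlib sketch with made-up coordinates (a real service would use kernel density estimation on projected coordinates):

```python
from collections import Counter

def density_grid(points, cell_size):
    """Bin geocoded case locations (x, y) into square cells of side
    `cell_size` to obtain a crude spatial density surface.
    Returns {(col, row): case count}."""
    counts = Counter()
    for x, y in points:
        counts[(int(x // cell_size), int(y // cell_size))] += 1
    return counts

# Hypothetical projected coordinates of three cases.
cases = [(12.0, 7.5), (13.9, 7.1), (55.0, 60.2)]
grid = density_grid(cases, cell_size=10.0)
```

Exposed behind a REST endpoint, the counts (or a smoothed version of them) can be rendered as a choropleth or heat-map layer in the map viewer.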
Mantas, V. M.; Liu, Z.; Pereira, A. J. S. C.
The full potential of Satellite Rainfall Estimates (SRE) can only be realized if timely access to the datasets is possible. Existing data distribution web portals are often focused on global products and offer limited customization options, especially for the purpose of routine regional monitoring. Furthermore, most online systems are designed to meet the needs of desktop users, limiting compatibility with mobile devices. In response to the growing demand for SRE, and to address the current limitations of available web portals, a project was devised to create a set of freely available applications and services, available at a common portal, that can: (1) simplify cross-platform access to Tropical Rainfall Measuring Mission Online Visualization and Analysis System (TOVAS) data (including from Android mobile devices), (2) provide customized and continuous monitoring of SRE in response to user demands and (3) combine data from different online data distribution services, including rainfall estimates, river gauge measurements or imagery from Earth Observation missions, at a single portal, known as the Tropical Rainfall Measuring Mission (TRMM) Explorer. The TRMM Explorer project suite includes a Python-based web service and Android applications capable of providing SRE and ancillary data in different intuitive formats, with a focus on regional and continuous analysis. The outputs include dynamic plots, tables and data files that can also be used to feed downstream applications and services. A case study in Southern Angola is used to describe the potential of the TRMM Explorer for SRE distribution and analysis in the context of ungauged watersheds. The development of a collection of data distribution instances helped to validate the concept and identify the limitations of the program, in a real context and based on user feedback. The TRMM Explorer can successfully supplement existing web portals distributing SRE and provide a cost-efficient resource to small and medium
Wibonele, Kasanda J.; Zhang, Yanqing
A web data mining system using granular computing and ASP programming is proposed. This is a web-based application which allows web users to submit survey data for many different companies. The survey is a collection of questions that will help these companies develop and improve their business and customer service with their clients by analyzing survey data. This web application allows users to submit data from anywhere. All the survey data is collected into a database for further analysis. An administrator of this web application can log in to the system and view all the data submitted. The web application resides on a web server, and the database resides on the MS SQL server.
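The administrator's summary view described in this abstract reduces to aggregating submitted responses per question. A minimal sketch in Python (the original system uses ASP and MS SQL Server; the response layout here is an assumption for illustration):

```python
from collections import Counter, defaultdict

def aggregate(responses):
    """Tally answers per question from a list of submitted surveys.

    responses: list of dicts mapping question id -> chosen answer
    (a hypothetical layout; the paper stores rows in MS SQL Server).
    """
    summary = defaultdict(Counter)
    for response in responses:
        for question, answer in response.items():
            summary[question][answer] += 1
    return summary
```

An administrator page would then render `summary` as per-question answer counts.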
Cody, R. P.; Manley, W. F.; Gaylord, A. G.; Kassin, A.; Villarreal, S.; Barba, M.; Dover, M.; Escarzaga, S. M.; Habermann, T.; Kozimor, J.; Score, R.; Tweedie, C. E.
Although a great deal of progress has been made with various arctic observing efforts, it can be difficult to assess such progress when so many agencies, organizations, research groups and others are making such rapid progress over such a large expanse of the Arctic. To help meet the strategic needs of the U.S. SEARCH-AON program and facilitate the development of SAON and other related initiatives, the Arctic Observing Viewer (AOV; http://ArcticObservingViewer.org) has been developed. This web mapping application compiles detailed information pertaining to U.S. Arctic Observing efforts. Contributing partners include the U.S. NSF, USGS, ACADIS, ADIwg, AOOS, a2dc, AON, ARMAP, BAID, IASOA, INTERACT, and others. Over 7700 observation sites are currently in the AOV database, and the application allows users to visualize, navigate, select, perform advanced searches, draw, print, and more. During 2015, the web mapping application was enhanced by the addition of a query builder that allows users to create rich and complex queries. AOV is founded on principles of software and data interoperability and includes an emerging "Project" metadata standard, which uses ISO 19115-1 and compatible web services. Substantial efforts have focused on maintaining and centralizing all database information. In order to keep up with emerging technologies, the AOV data set has been structured and centralized within a relational database and the application front-end has been ported to HTML5 to enable mobile access. Other application enhancements include an embedded Apache Solr search platform, which provides users with the capability to perform advanced searches, and a web-based administrative data management system that allows administrators to add, update, and delete information in real time. We encourage all collaborators to use AOV tools and services for their own purposes and to help us extend the impact of our efforts and ensure AOV complements other cyber-resources. Reinforcing dispersed but
McCann, M. P.
Using the STOQS Web Application for Access to in situ Oceanographic Data (Mike McCann, 7 August 2012). With increasing measurement and sampling capabilities of autonomous oceanographic platforms (e.g. Gliders, Autonomous Underwater Vehicles, Wavegliders), the need to efficiently access and visualize the data they collect is growing. The Monterey Bay Aquarium Research Institute has designed and built the Spatial Temporal Oceanographic Query System (STOQS) specifically to address this issue. The need for STOQS arises from inefficiencies discovered from using CF-NetCDF point observation conventions for these data. The problem is that access efficiency decreases with decreasing dimension of CF-NetCDF data. For example, the Trajectory Common Data Model feature type has only one coordinate dimension, usually Time; positions of the trajectory (Depth, Latitude, Longitude) are stored as non-indexed record variables within the NetCDF file. If client software needs to access data between two depth values or from a bounded geographic area, then the whole data set must be read and the selection made within the client software. This is very inefficient. What is needed is a way to easily select data of interest from an archive given any number of spatial, temporal, or other constraints. Geospatial relational database technology provides this capability. The full STOQS application consists of a Postgres/PostGIS database, Mapserver, and Python-Django running on a server and Web 2.0 technology (jQuery, OpenLayers, Twitter Bootstrap) running in a modern web browser. The web application provides faceted search capabilities allowing a user to quickly drill into the data of interest. Data selection can be constrained by spatial, temporal, and depth selections as well as by parameter value and platform name. The web application layer also provides a REST (Representational State Transfer) Application Programming Interface allowing tools such as the Matlab stoqstoolbox to retrieve data
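The REST interface described above amounts to encoding spatial, temporal, and depth constraints as query-string parameters. A sketch of composing such a request URL in Python; the endpoint path and field names are illustrative assumptions, not the actual STOQS API:

```python
from urllib.parse import urlencode

def build_query(base_url, **constraints):
    """Compose a REST query URL, dropping any unset constraints."""
    params = {k: v for k, v in constraints.items() if v is not None}
    return f"{base_url}?{urlencode(params)}"

# Hypothetical endpoint and field names, for illustration only.
url = build_query(
    "https://example.org/stoqs/api/measuredparameter.json",
    platform="dorado", parameter="temperature",
    time_gte="2012-08-01T00:00", time_lte="2012-08-07T00:00",
    depth_gte=0, depth_lte=100,
)
```

A client such as the Matlab stoqstoolbox would issue requests of this shape and parse the returned JSON.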
Mykkänen, Juha; Riekkinen, Annamari; Sormunen, Marko; Karhunen, Harri; Laitinen, Pertti
Service-oriented architectures (SOAs) and web service technologies have been proposed to respond to some central interoperability challenges of heterogeneous health information systems (HIS). We propose a model which we are using to define services and solutions for healthcare applications from the requirements in the healthcare processes. Focusing on the transition from the process level of the model to the application level, we also present some central design considerations, which can be used to guide the design of service-based interoperability. We illustrate these aspects with examples from our current work from the service-enabled HIS.
Mykkänen, Juha; Riekkinen, Annamari; Laitinen, Pertti; Karhunen, Harri; Sormunen, Marko
Service-oriented architectures (SOA) and web service technologies have been proposed to respond to some central interoperability challenges of heterogeneous health information systems (HIS). We propose a model, which we are using to define services and solutions for healthcare applications from the requirements in the healthcare processes. Focusing on the transition from the process level of the model to the application level, we also present some central design considerations, which can be used to guide the design of service-based interoperability and illustrate these aspects with examples from our current work in service-enabled HIS.
The Alpha Jet Atmospheric eXperiment (AJAX) is a research project based at Moffett Field, CA, which collects airborne measurements of ozone, carbon dioxide, methane, water vapor, and formaldehyde, as well as 3-D winds, temperature, pressure, and location. Since its first science flight in 2011, AJAX has developed a wide variety of mission types, combining vertical profiles (from approx. 8 km to near surface), boundary layer legs, and plume sampling as needed. With an ongoing five-year data set, the team has sampled over 160 vertical profiles, a dozen wildfires, and numerous stratospheric ozone intrusions. This talk will present an overview of our flights flown to date, with particular focus on methane observations in the San Francisco Bay Area, Sacramento, and the delta region.
Skoog, R. A.
Nauman, Mohammad; Ali, Tamleek
Smartphones are increasingly being used to store personal information as well as to access sensitive data from the Internet and the cloud. Establishment of the identity of a user requesting information from smartphones is a prerequisite for secure systems in such scenarios. In the past, keystroke-based user identification has been successfully deployed on production-level mobile devices to mitigate the risks associated with naïve username/password based authentication. However, these approaches have two major limitations: they are not applicable to services where authentication occurs outside the domain of the mobile device - such as web-based services; and they often overly tax the limited computational capabilities of mobile devices. In this paper, we propose a protocol for keystroke dynamics analysis which allows web-based applications to make use of remote attestation and delegated keystroke analysis. The end result is an efficient keystroke-based user identification mechanism that strengthens traditional password protected services while mitigating the risks of user profiling by collaborating malicious web services.
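Keystroke dynamics of the kind analyzed in this protocol are typically built from dwell times (how long each key is held) and flight times (the gap between releasing one key and pressing the next). A generic illustration in Python, not the authors' protocol; timestamps are in milliseconds and the event layout is an assumption:

```python
def keystroke_features(events):
    """events: list of (key, press_time_ms, release_time_ms), in typing order.

    Returns dwell times (hold duration per key) and flight times
    (gap between one key's release and the next key's press).
    """
    dwells = [release - press for _, press, release in events]
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwells, flights

def profile_distance(sample, profile):
    """Mean absolute deviation between a sample's features and a stored profile."""
    return sum(abs(a - b) for a, b in zip(sample, profile)) / len(profile)
```

A verifier would compare `profile_distance` against a threshold; in the paper's setting the analysis is delegated to a remote, attested service rather than run on the device.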
Mair, M; Mikovits, C; Sengthaler, M; Schöpf, M; Kinzel, H; Urich, C; Kleidorfer, M; Sitzenfrei, R; Rauch, W
Research in urban water management has experienced a transition from traditional model applications to modelling water cycles as an integrated part of urban areas. This includes the interlinking of models from many research areas (e.g. urban development, socio-economy, urban water management). The integration and simulation are realized in newly developed frameworks (e.g. DynaMind and OpenMI) and often require advanced programming skills. This work presents a Web-based urban water management modelling platform which simplifies the setup and usage of complex integrated models. The platform is demonstrated with a small application example on a case study within the Alpine region. The model used is a DynaMind model benchmarking the impact of newly connected catchments on the flooding behaviour of an existing combined sewer system. As a result, the workflow of the user within a Web browser is demonstrated and benchmark results are shown. The presented platform hides implementation-specific aspects behind Web-service-based technologies so that users can focus on their main aim: urban water management modelling and benchmarking. Moreover, this platform offers centralized data management, automatic software updates and access to high-performance computers from desktop computers and mobile devices.
Samwald, Matthias; Lim, Ernest; Masiar, Peter; Marenco, Luis; Chen, Huajun; Morse, Thomas; Mutalik, Pradeep; Shepherd, Gordon; Miller, Perry; Cheung, Kei-Hoi
The amount of biomedical data available in Semantic Web formats has been rapidly growing in recent years. While these formats are machine-friendly, user-friendly web interfaces allowing easy querying of these data are typically lacking. We present "Entrez Neuron", a pilot neuron-centric interface that allows for keyword-based queries against a coherent repository of OWL ontologies. These ontologies describe neuronal structures, physiology, mathematical models and microscopy images. The returned query results are organized hierarchically according to brain architecture. Where possible, the application makes use of entities from the Open Biomedical Ontologies (OBO) and the 'HCLS knowledgebase' developed by the W3C Interest Group for Health Care and Life Science. It makes use of the emerging RDFa standard to embed ontology fragments and semantic annotations within its HTML-based user interface. The application and underlying ontologies demonstrate how Semantic Web technologies can be used for information integration within a curated information repository and between curated information repositories. It also demonstrates how information integration can be accomplished on the client side, through simple copying and pasting of portions of documents that contain RDFa markup.
Awad, M A; Khalil, I
Web prediction is a classification problem in which we attempt to predict the next set of Web pages that a user may visit based on knowledge of the previously visited pages. Predicting a user's behavior while surfing the Internet can be applied effectively in various critical applications. Such applications face a traditional tradeoff between modeling complexity and prediction accuracy. In this paper, we analyze and study the Markov model and the all-Kth Markov model in Web prediction. We propose a new modified Markov model to alleviate the issue of scalability in the number of paths. In addition, we present a new two-tier prediction framework that creates an example classifier EC, based on the training examples and the generated classifiers. We show that such a framework can improve the prediction time without compromising prediction accuracy. We have used standard benchmark data sets to analyze, compare, and demonstrate the effectiveness of our techniques using variations of Markov models and association rule mining. Our experiments show the effectiveness of our modified Markov model in reducing the number of paths without compromising accuracy. Additionally, the results support our analysis conclusions that accuracy improves with higher orders of the all-Kth model.
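For reference, a first-order Markov model for next-page prediction can be trained by counting page-to-page transitions in past sessions; the paper's modified and all-Kth models extend this basic scheme. A minimal sketch:

```python
from collections import Counter, defaultdict

def train_markov(sessions):
    """First-order Markov model: counts of next page given current page."""
    model = defaultdict(Counter)
    for session in sessions:
        for cur, nxt in zip(session, session[1:]):
            model[cur][nxt] += 1
    return model

def predict(model, page):
    """Most likely next page, or None if the page was never seen."""
    if page not in model:
        return None
    return model[page].most_common(1)[0][0]
```

An all-Kth variant would keep one such model per history length K and fall back from longer to shorter histories at prediction time.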
Chan, Shermann S.; Wu, Yi; Li, Qing; Zhuang, Yueting
How to facilitate efficient video manipulation and access in a web-based environment is becoming a popular trend for video applications. In this paper, we present a web-oriented video management and application processing system, based on our previous work on multimedia database and content-based retrieval. In particular, we extend the VideoMAP architecture with specific web-oriented mechanisms, which include: (1) Concurrency control facilities for the editing of video data among different types of users, such as Video Administrator, Video Producer, Video Editor, and Video Query Client; different users are assigned various priority levels for different operations on the database. (2) Versatile video retrieval mechanism which employs a hybrid approach by integrating a query-based (database) mechanism with content- based retrieval (CBR) functions; its specific language (CAROL/ST with CBR) supports spatio-temporal semantics of video objects, and also offers an improved mechanism to describe visual content of videos by content-based analysis method. (3) Query profiling database which records the `histories' of various clients' query activities; such profiles can be used to provide the default query template when a similar query is encountered by the same kind of users. An experimental prototype system is being developed based on the existing VideoMAP prototype system, using Java and VC++ on the PC platform.
Italia, Michael J; Pennington, Jeffrey W; Ruth, Byron; Wrazien, Stacey; Loutrel, Jennifer G; Crenshaw, E Bryan; Miller, Jeffrey; White, Peter S
Biomedical researchers share a common challenge of making complex data understandable and accessible. This need is increasingly acute as investigators seek opportunities for discovery amidst an exponential growth in the volume and complexity of laboratory and clinical data. To address this need, we developed Harvest, an open source framework that provides a set of modular components to aid the rapid development and deployment of custom data discovery software applications. Harvest incorporates visual representations of multidimensional data types in an intuitive, web-based interface that promotes a real-time, iterative approach to exploring complex clinical and experimental data. The Harvest architecture capitalizes on standards-based, open source technologies to address multiple functional needs critical to a research and development environment, including domain-specific data modeling, abstraction of complex data models, and a customizable web client.
Welton, B.; Chouinard, K.; Sultan, M.; Becker, D.; Milewski, A.; Becker, R.
Rising populations in the arid and semi-arid parts of the world are increasing the demand for fresh water supplies worldwide. Many data sets needed for assessment of hydrologic applications across vast regions of the world are expensive, unpublished, difficult to obtain, or at varying scales, which complicates their use. Fortunately, this situation is changing with the development of global remote sensing datasets and web-based platforms such as GIS Server. GIS provides a cost-effective vehicle for comparing, analyzing, and querying a variety of spatial datasets as geographically referenced layers. We have recently constructed a web-based GIS that incorporates all relevant geological, geochemical, geophysical, and remote sensing data sets that were readily used to identify reservoir types and potential well locations on local and regional scales in various tectonic settings including: (1) extensional environments (Red Sea rift), (2) transcurrent fault systems (Najd Fault in the Arabian-Nubian Shield), and (3) compressional environments (Himalayas). The web-based GIS can also be used to detect spatial and temporal trends in precipitation, recharge, and runoff in large watersheds on local, regional, and continental scales. These applications were enabled through the construction of a web-based ArcGIS Server with a Google Maps interface and the development of customized geoprocessing tools. ArcGIS Server provides out-of-the-box setups that are generic in nature. This platform includes all of the standard web-based GIS tools (e.g. pan, zoom, identify, search, data querying, and measurement). In addition to the standard suite of tools provided by ArcGIS Server, an additional set of advanced data manipulation and display tools was developed to allow a more complete and customizable view of the area of interest. The most notable addition to the standard GIS Server tools is the custom on-demand geoprocessing tools (e.g., graph, statistical functions, custom raster
Chen, Chuanbin; Wu, Qunyong; Chen, Chongcheng; Chen, Han
Lang, Jeremy S.; Irving, James R.
Kassin, A.; Gaylord, A. G.; Manley, W. F.; Villarreal, S.; Tweedie, C. E.; Cody, R. P.; Copenhaver, W.; Dover, M.; Score, R.; Habermann, T.
Although a great deal of progress has been made with various arctic observing efforts, it can be difficult to assess such progress when so many agencies, organizations, research groups and others are making such rapid progress. To help meet the strategic needs of the U.S. SEARCH-AON program and facilitate the development of SAON and related initiatives, the Arctic Observing Viewer (AOV; http://ArcticObservingViewer.org) has been developed. This web mapping application compiles detailed information pertaining to U.S. Arctic Observing efforts. Contributing partners include the U.S. NSF, USGS, ACADIS, ADIwg, AOOS, a2dc, AON, ARMAP, BAID, IASOA, INTERACT, and others. Over 6100 sites are currently in the AOV database, and the application allows users to visualize, navigate, select, perform advanced searches, draw, print, and more. AOV is founded on principles of software and data interoperability and includes an emerging "Project" metadata standard, which uses ISO 19115-1 and compatible web services. In the last year, substantial efforts have focused on maintaining and centralizing all database information. In order to keep up with emerging technologies and demand for the application, the AOV data set has been structured and centralized within a relational database; furthermore, the application front-end has been ported to HTML5. Porting the application to HTML5 provides access to mobile users on tablets and cell phones. Other application enhancements include an embedded Apache Solr search platform, which provides users with the capability to perform advanced searches throughout the AOV dataset, and a web-based administrative data management system which allows administrators to add, update, and delete data in real time. We encourage all collaborators to use AOV tools and services for their own purposes and to help us extend the impact of our efforts and ensure AOV complements other cyber-resources. Reinforcing dispersed but interoperable resources in this
Weaver, J. Curtis; Terziotti, Silvia; Kolb, Katharine R.; Wagner, Chad R.
A statewide StreamStats application for North Carolina was developed in cooperation with the North Carolina Department of Transportation following completion of a pilot application for the upper French Broad River basin in western North Carolina (Wagner and others, 2009). StreamStats for North Carolina, available at http://water.usgs.gov/osw/streamstats/north_carolina.html, is a Web-based Geographic Information System (GIS) application developed by the U.S. Geological Survey (USGS) in consultation with Environmental Systems Research Institute, Inc. (Esri) to provide access to an assortment of analytical tools that are useful for water-resources planning and management (Ries and others, 2008). The StreamStats application provides an accurate and consistent process that allows users to easily obtain streamflow statistics, basin characteristics, and descriptive information for USGS data-collection sites and user-selected ungaged sites. In the North Carolina application, users can compute 47 basin characteristics and peak-flow frequency statistics (Weaver and others, 2009; Robbins and Pope, 1996) for a delineated drainage basin. Selected streamflow statistics and basin characteristics for data-collection sites have been compiled from published reports and also are immediately accessible by querying individual sites from the web interface. Examples of basin characteristics that can be computed in StreamStats include drainage area, stream slope, mean annual precipitation, and percentage of forested area (Ries and others, 2008). Examples of streamflow statistics that were previously available only through published documents include peak-flow frequency, flow-duration, and precipitation data. These data are valuable for making decisions related to bridge design, floodplain delineation, water-supply permitting, and sustainable stream quality and ecology. The StreamStats application also allows users to identify stream reaches upstream and downstream from user-selected sites
Okladnikov, I.; Gordov, E. P.; Titov, A. G.
Alder, J. R.; Hostetler, S.
The USGS National Climate Change Viewer (NCCV) is an interactive web application for visualizing potential future changes in climate at the state, county and watershed levels for the continental US on a monthly time scale. The aim of the application is to make climate model results more accessible and understandable to a wide range of users. The application is based on high-resolution (~800 m) statistically downscaled climate projections (NEX-DCP30, RCP4.5 and 8.5, produced by NASA) from 30 of the CMIP5 GCMs. We used the climate data to drive a simple water balance model to provide additional insights into the potential for climate-driven change in surface hydrology. The application contains a robust set of tools for characterizing future climate change, including interactive maps, climographs, time series, data tables and downloadable summary reports for each region. The NCCV generally finds that future temperature and evaporative demand increase, winter snowpack decreases, and the timing of peak runoff shifts earlier in the season, particularly in the Northeast and mountainous West. The web site has been of interest to the public; local, state and federal government agencies; and universities as a teaching tool.
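A "simple water balance model" of the kind mentioned can be sketched as a monthly bucket: precipitation fills soil storage, evapotranspiration drains it up to the PET demand, and storage above capacity spills as runoff. This is a generic illustration (values in mm, with an assumed capacity parameter), not the NCCV's actual model:

```python
def water_balance(precip, pet, capacity=100.0):
    """Monthly bucket water balance. precip and pet are per-month lists (mm).

    Storage fills with precipitation, drains by actual evapotranspiration
    (limited by both PET and available water), and any excess over the
    soil storage capacity becomes runoff for that month.
    """
    storage, runoff = 0.0, []
    for p, e in zip(precip, pet):
        storage += p
        et = min(e, storage)              # actual ET capped by available water
        storage -= et
        excess = max(0.0, storage - capacity)
        storage -= excess                 # spill anything above capacity
        runoff.append(excess)
    return runoff
```

Driving such a bucket with downscaled monthly precipitation and PET series yields the runoff-timing signals the viewer summarizes.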
Web Design for Space Operations: An Overview of the Challenges and New Technologies Used in Developing and Operating Web-Based Applications in Real-Time Operational Support Onboard the International Space Station, in Astronaut Mission Planning and Mission Control Operations
The International Space Station (ISS) Operations Planning Team, Mission Control Centre and Mission Automation Support Network (MAS) have all evolved over the years to use commercial web-based technologies to create a configurable electronic infrastructure to manage the complex network of real-time planning, crew scheduling, resource and activity management as well as onboard document and procedure management required to co-ordinate ISS assembly, daily operations and mission support. While these Web technologies are classified as non-critical in nature, their use is part of an essential backbone of daily operations on the ISS and allows the crew to operate the ISS as a functioning science laboratory. The rapid evolution of the internet from 1998 (when ISS assembly began) to today, along with the nature of continuous manned operations in space, has presented a unique challenge in terms of software engineering and system development. In addition, the use of a wide array of competing internet technologies (including commercial technologies such as .NET and Java) and the special requirements of having to support this network, both nationally among various control centres for International Partners (IPs), as well as onboard the station itself, have created special challenges for the MCC Web Tools Development Team, software engineers and flight controllers who implement and maintain this system. This paper presents an overview of some of these operational challenges, and the evolving nature of the solutions and the future use of COTS-based rich internet technologies in manned space flight operations. In particular, this paper will focus on the use of Microsoft's .NET API to develop Web-based operational tools, the use of XML-based service-oriented architectures (SOA) that needed to be customized to support mission operations, the maintenance of a Microsoft IIS web server onboard the ISS, the OpsLAN, functional-oriented Web design with AJAX
Langella, Giuliano; Basile, Angelo; Coppola, Antonio; Manna, Piero; Orefice, Nadia; Terribile, Fabio
For a long time now, scientific research on environmental and agricultural issues has spent considerable effort on the development and application of models for prediction and simulation in spatial and temporal domains. This is fulfilled by studying and observing natural processes (e.g. rainfall, water and chemical transport in soils, crop growth) whose spatiotemporal behavior can be reproduced, for instance, to predict irrigation and fertilizer requirements and yield quantities/qualities. In this work, a mechanistic model to simulate water flow and solute transport in the soil-plant-atmosphere continuum is presented. This desktop computer program was written according to the specific requirements of developing web applications. The model is capable of addressing the following issues together: (a) water balance and (b) solute transport; (c) crop modelling; (d) GIS interoperability; (e) embeddability in web-based geospatial Decision Support Systems (DSS); (f) adaptability at different scales of application; and (g) ease of code modification. We maintained the desktop characteristics in order to further develop (e.g. integrate novel features) and run the key program modules for testing and validation purposes, but we also developed a middleware component to allow the model to run simulations directly over the web, without software to be installed. The GIS capabilities allow the web application to make simulations in a user-defined region of interest (delimited over a geographical map) without the need to specify the proper combination of model parameters. This is possible because the geospatial database collects information on pedology, climate, crop parameters and soil hydraulic characteristics. Pedological attributes include the spatial distribution of key soil data such as soil profile horizons and texture. Further, hydrological parameters are selected according to the knowledge about the spatial distribution of soils. The availability and definition in the geospatial domain
Huang, Tingting; Liu, Jialin; Li, Yong; Zhang, Rui
In order to support the theory and practice of web-based cancer database development in China, we applied a systematic evaluation to assess the state of development of web-based cancer databases at home and abroad. We performed computer-based retrieval of the Ovid-MEDLINE, Springerlink, EBSCOhost, Wiley Online Library and CNKI databases for papers published between Jan. 1995 and Dec. 2011, and retrieved the references of these papers by hand. We selected qualified papers according to pre-established inclusion and exclusion criteria, and carried out information extraction and analysis of the papers. Searching the online databases, we obtained 1244 papers, and checking the reference lists, we found another 19 articles. Thirty-one articles met the inclusion and exclusion criteria; we extracted the evidence from them and assessed it. Analysis of this evidence showed that the U.S.A. ranked first, accounting for 26%. Thirty-nine percent of these web-based cancer databases are comprehensive cancer databases. As for single-cancer databases, breast cancer and prostate cancer are at the top, each accounting for 10%. Thirty-two percent of the cancer databases are associated with cancer gene information. Among the technical applications, MySQL and PHP were the most widely applied, at nearly 23% each.
Lundin, M; Lundin, J; Helin, H; Isola, J
Aims: To develop an educationally useful atlas of breast histopathology, using advanced web based virtual microscopy technology. Methods: By using a robotic microscope and software adopted and modified from the aerial and satellite imaging industry, a virtual microscopy system was developed that allows fully automated slide scanning and image distribution via the internet. More than 150 slides were scanned at high resolution with an oil immersion ×40 objective (numerical aperture, 1.3) and archived on an image server residing in a high speed university network. Results: A publicly available website was constructed, http://www.webmicroscope.net/breastatlas, which features a comprehensive virtual slide atlas of breast histopathology according to the World Health Organisation 2003 classification. Users can view any part of an entire specimen at any magnification within a standard web browser. The virtual slides are supplemented with concise textual descriptions, but can also be viewed without diagnostic information for self assessment of histopathology skills. Conclusions: Using the technology described here, it is feasible to develop clinically and educationally useful virtual microscopy applications. Web based virtual microscopy will probably become widely used at all levels in pathology teaching. PMID:15563669
Eguchi, S.; Kawasaki, W.; Shirasaki, Y.; Komiya, Y.; Kosugi, G.; Ohishi, M.; Mizumoto, Y.
van Nas, Atila; Pan, Calvin; Ingram-Drake, Leslie A; Ghazalpour, Anatole; Drake, Thomas A; Sobel, Eric M; Papp, Jeanette C; Lusis, Aldons J
The Systems Genetics Resource (SGR) (http://systems.genetics.ucla.edu) is a new open-access web application and database that contains genotypes and clinical and intermediate phenotypes from both human and mouse studies. The mouse data include studies using crosses between specific inbred strains and studies using the Hybrid Mouse Diversity Panel. SGR is designed to assist researchers studying genes and pathways contributing to complex disease traits, including obesity, diabetes, atherosclerosis, heart failure, osteoporosis, and lipoprotein metabolism. Over the next few years, we hope to add data relevant to deafness, addiction, hepatic steatosis, toxin responses, and vascular injury. The intermediate phenotypes include expression array data for a variety of tissues and cultured cells, metabolite levels, and protein levels. Pre-computed tables of genetic loci controlling intermediate and clinical phenotypes, as well as phenotype correlations, are accessed via a user-friendly web interface. The web site includes detailed protocols for all of the studies. Data from published studies are freely available; unpublished studies have restricted access during their embargo period.
Ryoo, Ju-Mee; Johnson, Matthew S.; Iraci, Laura T.; Yates, Emma L.; Gore, Warren
High ozone (O3) concentrations at low altitudes (1.5-4 km) were detected from airborne Alpha Jet Atmospheric eXperiment (AJAX) measurements on 30 May 2012 off the coast of California (CA). We investigate the causes of those elevated O3 concentrations using airborne measurements and various models. A GEOS-Chem simulation shows that the contribution from local sources is likely small. A back-trajectory model was used to determine the air mass origins and how much they contributed to the O3 over CA. Low-level potential vorticity (PV) from Modern Era Retrospective analysis for Research and Applications 2 (MERRA-2) reanalysis data appears to be a result of diabatic heating and mixing of air at lower altitudes, rather than of direct transport from a stratospheric intrusion. The Q diagnostic, which is a measure of the mixing of air masses, indicates that there is sufficient mixing along the trajectory for O3 from the different origins to be mixed and transported to the western U.S. The back-trajectory model simulation demonstrates that the air masses of interest came mostly from the mid troposphere (MT, 76%), but the contribution of the lower troposphere (LT, 19%) is also significant compared to that from the upper troposphere/lower stratosphere (UT/LS, 5%). Air coming from the LT appears to originate mostly over Asia. The possible impact of the high O3 transported aloft on surface O3 concentrations through vertical and horizontal transport within a few days is substantiated by the influence maps determined from the Weather Research and Forecasting-Stochastic Time Inverted Lagrangian Transport (WRF-STILT) model and the observed increases in surface ozone mixing ratios. Contrasting this complex case with a stratospheric-dominant event emphasizes the contribution of each source to the high O3 concentration at lower altitudes over CA. Integrated analysis using models, reanalysis, and diagnostic tools allows high ozone values
Leader, David P; Milner-White, E James
Background Small loop-shaped motifs are common constituents of the three-dimensional structure of proteins. Typically they comprise between three and seven amino acid residues, and are defined by a combination of dihedral angles and hydrogen bonding partners. The most abundant of these are αβ-motifs, asx-motifs, asx-turns, β-bulges, β-bulge loops, β-turns, nests, niches, Schellmann loops, ST-motifs, ST-staples and ST-turns. We have constructed a database of such motifs from a range of high-quality protein structures and built a web application as a visual interface to this. Description The web application, Motivated Proteins, provides access to these 12 motifs (with 48 sub-categories) in a database of over 400 representative proteins. Queries can be made for specific categories or sub-categories of motif, motifs in the vicinity of ligands, motifs which include part of an enzyme active site, overlapping motifs, or motifs which include a particular amino acid sequence. Individual proteins can be specified, or, where appropriate, motifs for all proteins listed. The results of queries are presented in textual form as an (X)HTML table, and may be saved as parsable plain text or XML. Motifs can be viewed and manipulated either individually or in the context of the protein in the Jmol applet structural viewer. Cartoons of the motifs imposed on a linear representation of protein secondary structure are also provided. Summary information for the motifs is available, as are histograms of amino acid distribution, and graphs of dihedral angles at individual positions in the motifs. Conclusion Motivated Proteins is a publicly and freely accessible web application that enables protein scientists to study small three-dimensional motifs without requiring knowledge of either Structured Query Language or the underlying database schema. PMID:19210785
Rossi, Lorenzo; Margola, Lorenzo; Manzelli, Vacia; Bandera, Alessandra
wHospital is the result of an information technology research project based on a web-based application for managing hospital drug dispensing. Part of the wHospital backbone, and its key distinguishing characteristic, is the adoption of the digital signature system initially deployed by the Government of Lombardia, a Northern Italy region, through the distribution of smart cards to all healthcare and hospital staff. The developed system is a web-based application with a proposed Health Records Digital Signature (HReDS) handshake to comply with national law and with the Joint Commission International standards. The prototype application, for a single hospital Operative Unit (OU), has focused on data and process management related to drug therapy. Following a multi-faceted selection process, the Infective Disease OU of the Hospital in Busto Arsizio, Lombardia, was chosen for the development and prototype implementation. The project lead time, from user requirement analysis to training and deployment, was approximately 8 months. This paper highlights the applied project methodology, the system architecture, and the preliminary results achieved.
Wolf, N.; Fuchsgruber, V.; Riembauer, G.; Siegmund, A.
Satellite images have great educational potential for teaching on environmental issues and can promote the motivation of young people to enter careers in natural science and technology. Due to the importance and ubiquity of remote sensing in science, industry and the public sphere, the use of satellite imagery has been included in many school curricula in Germany. However, its implementation in school practice is still hesitant, mainly due to a lack of teacher know-how and of educational materials that align with the curricula. In the project "Space4Geography", a web-based learning platform is being developed with the aim of facilitating the application of satellite imagery in secondary school teaching and of fostering effective student learning experiences in geography and other related subjects in an interdisciplinary way. The platform features ten learning modules demonstrating the exemplary application of original high-spatial-resolution remote sensing data (RapidEye and TerraSAR-X) to examine current environmental issues such as droughts, deforestation and urban sprawl. In this way, students are introduced to the versatile applications of spaceborne earth observation and geospatial technologies. The integrated web-based remote sensing software "BLIF" equips students with a toolset to explore, process and analyze the satellite images, thereby fostering the competence of students to work on geographical and environmental questions without requiring prior knowledge of remote sensing. This contribution presents the educational concept of the learning environment and its realization by the example of the learning module "Deforestation of the rainforest in Brazil".
Newton, Richard; Hinds, Jason; Wernisch, Lorenz
Whole genome DNA microarray genomotyping experiments compare the gene content of different species or strains of bacteria. A statistical approach to analysing the results of these experiments was developed, based on a Hidden Markov model (HMM), which takes adjacency of genes along the genome into account when calling genes present or absent. The model was implemented in the statistical language R and applied to three datasets. The method is numerically stable with good convergence properties. Error rates are reduced compared with approaches that ignore spatial information. Moreover, the HMM circumvents a problem encountered in a conventional analysis: determining the cut-off value to use to classify a gene as absent. An Apache Struts web interface for the R script was created for the benefit of users unfamiliar with R. The application may be found at http://hmmgd.cryst.bbk.ac.uk/hmmgd. The source code illustrating how to run R scripts from an Apache Struts-based web application is available from the corresponding author on request. The application is also available for local installation if required.
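The core idea of the approach above, calling genes present or absent while taking genome adjacency into account, can be illustrated with a minimal two-state Viterbi decoder. This is a generic sketch, not the authors' R implementation: the Gaussian emission parameters and transition probabilities below are illustrative assumptions.

```python
import math

# Two hidden states per gene along the genome: "present" and "absent".
# Emissions are array log2-ratios; we assume Gaussian emission densities
# (means/sds are illustrative, not taken from the paper).
STATES = ("present", "absent")
MEANS = {"present": 0.0, "absent": -2.0}
SDS = {"present": 0.5, "absent": 0.7}
TRANS = {  # adjacency along the genome: staying in a state is more likely
    ("present", "present"): 0.95, ("present", "absent"): 0.05,
    ("absent", "absent"): 0.90, ("absent", "present"): 0.10,
}
START = {"present": 0.5, "absent": 0.5}

def log_emit(state, x):
    m, s = MEANS[state], SDS[state]
    return -0.5 * math.log(2 * math.pi * s * s) - (x - m) ** 2 / (2 * s * s)

def viterbi(ratios):
    """Most likely present/absent path for a sequence of log-ratios."""
    v = [{s: math.log(START[s]) + log_emit(s, ratios[0]) for s in STATES}]
    back = []
    for x in ratios[1:]:
        row, ptr = {}, {}
        for s in STATES:
            prev = max(STATES, key=lambda p: v[-1][p] + math.log(TRANS[(p, s)]))
            row[s] = v[-1][prev] + math.log(TRANS[(prev, s)]) + log_emit(s, x)
            ptr[s] = prev
        v.append(row)
        back.append(ptr)
    # Backtrace from the best final state
    path = [max(STATES, key=lambda s: v[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

calls = viterbi([0.1, -0.2, -2.1, -1.8, -2.3, 0.0])
```

Because the transition matrix penalizes state switches, an isolated borderline log-ratio tends to be smoothed toward its neighbors' call, which is how spatial information reduces error rates relative to a fixed cut-off.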
Gopinath, Girish; Ambili, G K; Gregory, Shery Joseph; Anusha, C K
Geospatial technology shows great promise in drought monitoring. Drought severity assessment for crops in six northern districts of Kerala has been carried out using geospatial techniques. The Normalized Difference Vegetation Index (NDVI) is the major parameter used to measure vegetation health, obtained from the MODIS Terra satellite products MOD13Q1 and MOD02QKM. The mean NDVI of Kerala state over 13 years was calculated, and daily NDVI anomalies relative to this long-term mean were determined, on which the drought risk classification was based. Areas with highly negative NDVI anomalies are susceptible to drought, and the severity of the drought risk for each crop can be identified using land use/land cover data. Overlaying the daily NDVI-anomaly-based drought risk map on the land use/land cover map gives the drought risk for different crops. Based on this, a web application has been developed for the northern districts of Kerala state in India. This web application can be used to plan drought management measures and can also serve as a database for drought analysis.
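The anomaly-based classification described above can be sketched per pixel as follows. The relative-anomaly formula and the class thresholds here are illustrative assumptions, not the authors' published breakpoints.

```python
# Hypothetical sketch of NDVI-anomaly drought classification:
# anomaly = (daily NDVI - long-term mean NDVI) / long-term mean NDVI,
# with illustrative thresholds separating risk classes.
def ndvi_anomaly(ndvi, long_term_mean):
    return (ndvi - long_term_mean) / long_term_mean

def drought_class(anomaly):
    if anomaly >= -0.1:
        return "normal"
    if anomaly >= -0.25:
        return "moderate drought risk"
    return "severe drought risk"

# One pixel's NDVI for a given day versus the 13-year mean for that day
a = ndvi_anomaly(0.42, 0.60)
cls = drought_class(a)
```

In the full workflow each classified pixel would then be intersected with the land use/land cover layer to attribute the risk to a specific crop.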
Kogawa, Noriko; Ito, Reiko; Gon, Yasuhiro; Maruoka, Shuichiro; Hashimoto, Shu
Instruction in inhalation techniques for chronic obstructive pulmonary disease (COPD) and asthma patients treated with inhalants is important for achieving sufficient therapeutic effect and maintaining adherence. However, problems persist, including the time constraints on medical staff who see large numbers of patients and a lack of knowledge of inhalation instruction methods. A web application, "Inhalation Lessons," has been developed for the iPad. It explains inhalation methods and consists of videos and review tests. Instruction in inhalation techniques was performed using this application for patients using the Diskus inhaler, and the effects were examined. As a result, significant improvements were observed in patients' inhalation techniques after they viewed the "Inhalation Lessons" application. Uniform instruction in inhalation techniques can thus be provided even in the field of homecare.
Health assessment is a clinical nursing course that places emphasis on clinical skills. The application of computer-assisted instruction in nursing teaching addresses problems of the traditional lecture class. This article describes teaching experience with web-based computer-assisted instruction, based on a two-year study of computer-assisted instruction courseware used within the health assessment course. The courseware could develop the teaching structure, simulate clinical situations, create teaching situations and facilitate student study.
Pence, Charles H.
While textual analysis of the journal literature is a burgeoning field, there is still a profound lack of user-friendly software for accomplishing this task. RLetters is a free, open-source web application which provides researchers with an environment in which they can select sets of journal articles and analyze them with cutting-edge textual analysis tools. RLetters allows users without prior expertise in textual analysis to analyze word frequency, collocations, cooccurrences, term networks, and more. It is implemented in Ruby and scripts are provided to automate deployment. PMID:26731738
Attali, Dean; Bidshahri, Roza; Haynes, Charles; Bryan, Jennifer
Droplet digital polymerase chain reaction (ddPCR) is a novel platform for exact quantification of DNA which holds great promise in clinical diagnostics. It is increasingly popular due to its digital nature, which provides more accurate quantification and higher sensitivity than traditional real-time PCR. However, clinical adoption has been slowed in part by the lack of software tools available for analyzing ddPCR data. Here, we present ddpcr – a new R package for ddPCR visualization and analysis. In addition, ddpcr includes a web application (powered by the Shiny R package) that allows users to analyze ddPCR data using an interactive graphical interface. PMID:27703666
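The quantification step at the heart of ddPCR analysis rests on a Poisson correction of the positive/negative droplet counts. The sketch below is a generic illustration of that correction, not code from the ddpcr package; the droplet volume is a stated assumption.

```python
import math

def ddpcr_concentration(n_positive, n_total, droplet_volume_ul=0.00085):
    """Poisson-corrected target concentration (copies/uL) from droplet counts.

    droplet_volume_ul is the nominal droplet volume (~0.85 nL on common
    instruments; treat it as an assumption here).
    """
    n_negative = n_total - n_positive
    if n_negative == 0:
        raise ValueError("all droplets positive: concentration is saturated")
    # lambda = mean target copies per droplet, inferred from the fraction
    # of empty droplets under a Poisson loading model
    lam = -math.log(n_negative / n_total)
    return lam / droplet_volume_ul

conc = ddpcr_concentration(n_positive=3000, n_total=15000)
```

The digital nature of the assay comes from this formula: accuracy depends on counting droplets, not on calibrating fluorescence intensity as in real-time PCR.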
McGowan, Sean M.; Luco, Nicolas
The U.S. Geological Survey (USGS) Engineering Risk Assessment Project has developed the Seismic Risk Web Application to combine earthquake hazard and structural fragility information in order to calculate the risk of earthquake damage to structures. Enabling users to incorporate their own hazard and fragility information into the calculations will make it possible to quantify (in near real-time) the risk of additional damage to structures caused by aftershocks following significant earthquakes. Results can quickly be shared with stakeholders to illustrate the impact of elevated ground motion hazard and earthquake-compromised structural integrity on the risk of damage during a short-term, post-earthquake time horizon.
Stuckey, Marla H.; Hoffman, Scott A.
StreamStats is a national web-based Geographic Information System (GIS) application, developed by the U.S. Geological Survey (USGS), in cooperation with Environmental Systems Research Institute, Inc., to provide a variety of water-resource-related information. Users can easily obtain descriptive information, basin characteristics, and streamflow statistics for USGS streamgages and ungaged stream locations throughout Pennsylvania. StreamStats also allows users to search upstream and (or) downstream from user-selected points to identify locations of and obtain information for water-resource-related activities, such as dams and streamgages.
Hsieh, Sheau-Ling; Chang, Wen-Yung; Chen, Chi-Huang; Weng, Yung-Ching
A variety of research on web-related semantic similarity measures has been carried out. However, measuring semantic similarity between two terms remains a challenging task. Traditional ontology-based methodologies have the limitation that both concepts must reside in the same ontology tree(s). Unfortunately, in practice, this assumption is not always applicable. On the other hand, if the corpus is sufficiently adequate, corpus-based methodologies can overcome the limitation, and the web is an enormous, continuously growing corpus. Therefore, a method of estimating semantic similarity is proposed that exploits the page counts of two biomedical concepts returned by the Google AJAX web search engine. The features are extracted as the co-occurrence patterns of two given terms P and Q, by querying P, Q, and P AND Q, and as the web search hit counts of the defined lexico-syntactic patterns. The similarity scores of the different patterns are evaluated, by adapting support vector machines for classification, to leverage the robustness of the semantic similarity measures. Experimental results validating against two datasets (dataset 1 provided by A. Hliaoutakis; dataset 2 provided by T. Pedersen) are presented and discussed. On dataset 1, the proposed approach achieves the best correlation coefficient (0.802) under SNOMED-CT. On dataset 2, the proposed method obtains the best correlation coefficients (SNOMED-CT: 0.705; MeSH: 0.723) with physician scores compared with other methods. However, the correlation coefficients with coder scores (SNOMED-CT: 0.496; MeSH: 0.539) showed the opposite outcome. In conclusion, the semantic similarity findings of the proposed method are close to physicians' ratings. Furthermore, the study provides a cornerstone investigation for extracting fully relevant information from digitized, free-text medical records in the National Taiwan University Hospital database.
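Two standard page-count similarity features of the kind used as inputs in such work can be sketched as follows. This is a generic illustration (the paper's actual feature set and SVM combination are not reproduced); the counts and the index-size parameter n are assumptions.

```python
import math

def jaccard_from_counts(h_p, h_q, h_pq):
    """Web Jaccard coefficient from hit counts for P, Q, and 'P AND Q'."""
    denom = h_p + h_q - h_pq
    return h_pq / denom if denom else 0.0

def pmi_from_counts(h_p, h_q, h_pq, n):
    """Pointwise mutual information; n is the (assumed) size of the index."""
    if h_pq == 0:
        return 0.0
    return math.log2((h_pq / n) / ((h_p / n) * (h_q / n)))

# Illustrative counts (not real search-engine results):
j = jaccard_from_counts(120_000, 90_000, 30_000)
```

Scores such as these, computed for several query patterns, would then be fed to a classifier rather than used directly, which is what lends robustness to any single noisy count.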
Aye, Zar Chi; Nicolet, Pierrick; Jaboyedoff, Michel; Derron, Marc-Henri; Gerber, Christian; Lévy, Sebastien
Following changes in the Swiss subsidy system in January 2008, the Swiss cantons and the Federal Office for the Environment (FOEN) were forced to prioritize different natural hazard protection projects based on their cost-effectiveness, as a response to limited financial resources (Bründl et al., 2009). For this purpose, applications such as EconoMe (OFEV, 2016) and Valdorisk (DGE, 2016) were developed for risk evaluation and prioritization of mitigation projects. These tools serve as a useful decision-making instrument for the community of practitioners and the authorities responsible for natural hazard risk management in Switzerland. However, several aspects could be improved, in particular the interactive integration and visualization of spatial information through a web-GIS interface for better risk planning and evaluation. Therefore, in this study, we aim to develop an interactive web-GIS application based on the risk concepts applied in Switzerland. The purpose of this tool is to provide a rapid evaluation of risk before and after protection measures, and to test the efficiency of measures by using a simplified cost-benefit analysis within the context of different protection projects. The application allows users to integrate the different layers necessary to calculate risk, in particular hazard intensity (vector) maps for different scenarios (such as the 30-, 100- and 300-year return periods used in Swiss guidelines), exposed objects (such as buildings) and vulnerability information for these objects. Based on the provided information and additional parameters, risk is calculated automatically and the results are visualized within the web-GIS interface of the application. Users can modify the input information and parameters to create different risk scenarios. Based on the resultant risk scenarios, users can propose and visualize (preliminary) risk reduction measures before realizing the actual design and dimensions of such protective
Granitz, Neil; Koernig, Stephen K.
Although both experiential learning and Web 2.0 tools focus on creativity, sharing, and collaboration, sparse research has been published integrating a Web 2.0 paradigm with experiential learning in marketing. In this article, Web 2.0 concepts are explained. Web 2.0 is then positioned as a philosophy that can advance experiential learning through…
Alberti, Koko; Hiemstra, Paul; de Jong, Kor; Karssenberg, Derek
Numerical ensemble models are used in the analysis and forecasting of a wide range of environmental processes. Common use cases include assessing the consequences of nuclear accidents, pollution releases into the ocean or atmosphere, forest fires, or volcanic eruptions, and identifying areas at risk from such hazards. In addition to the increased use of scenario analyses and model forecasts, supplementary data describing errors and model uncertainties are increasingly commonplace. Unfortunately, most current visualization routines are not capable of properly representing uncertain information. As a result, uncertainty information is not provided at all, is not readily accessible, or is not communicated effectively to model users such as domain experts, decision makers, policy makers, or novice users. In an attempt to address these issues, a lightweight and interactive web application has been developed. It makes clear and concise uncertainty visualizations available in a web-based mapping and visualization environment, incorporating aggregation (upscaling) techniques to adjust uncertainty information to the zoom level. The application has been built on a web mapping stack of open-source software, and can quantify and visualize uncertainties in numerical ensemble models in such a way that both expert and novice users can investigate the uncertainties present in a simple ensemble dataset. As a test case, a dataset was used which forecasts the spread of an airborne tracer across Western Europe. Extrinsic uncertainty representations are used in which dynamic circular glyphs are overlaid on model attribute maps to convey various uncertainty concepts. The application supports both basic uncertainty metrics, such as standard deviation, standard error, width of the 95% confidence interval and interquartile range, and more experimental ones aimed at novice users. Ranges of attribute values can be specified, and the circular glyphs dynamically change size to
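A glyph-based encoding of this sort ultimately reduces to mapping an uncertainty metric onto a glyph size. The following is a hypothetical linear mapping, clipped to a sensible pixel range; the application's actual scaling and aggregation rules may well differ.

```python
def glyph_radius(metric, metric_max, r_min=2.0, r_max=20.0):
    """Map an uncertainty metric (e.g. standard deviation) to a
    circular-glyph radius in pixels, clipped to [r_min, r_max].

    Illustrative linear mapping; metric_max is the largest value
    expected at the current zoom/aggregation level.
    """
    frac = min(max(metric / metric_max, 0.0), 1.0)
    return r_min + frac * (r_max - r_min)

r = glyph_radius(metric=0.5, metric_max=1.0)
```

Recomputing metric_max per zoom level is one simple way to realize the upscaling behavior mentioned above, so glyphs stay readable as cells are aggregated.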
Gross, M. B.; Mayernik, M. S.; Rowan, L. R.; Khan, H.; Boler, F. M.; Maull, K. E.; Stott, D.; Williams, S.; Corson-Rikert, J.; Johns, E. M.; Daniels, M. D.; Krafft, D. B.
UNAVCO, UCAR, and Cornell University are working together to leverage semantic web technologies to enable discovery of people, datasets, publications and other research products, as well as the connections between them. The EarthCollab project, an EarthCube Building Block, is enhancing an existing open-source semantic web application, VIVO, to address connectivity gaps across distributed networks of researchers and resources related to the following two geoscience-based communities: (1) the Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory (EOL), and (2) UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy. People, publications, datasets and grant information have been mapped to an extended version of the VIVO-ISF ontology and ingested into VIVO's database. Data is ingested using a custom set of scripts that include the ability to perform basic automated and curated disambiguation. VIVO can display a page for every object ingested, including connections to other objects in the VIVO database. A dataset page, for example, includes the dataset type, time interval, DOI, related publications, and authors. The dataset type field provides a connection to all other datasets of the same type. The author's page will show, among other information, related datasets and co-authors. Information previously spread across several unconnected databases is now stored in a single location. In addition to VIVO's default display, the new database can also be queried using SPARQL, a query language for semantic data. EarthCollab will also extend the VIVO web application. One such extension is the ability to cross-link separate VIVO instances across institutions, allowing local display of externally curated information. For example, Cornell's VIVO faculty pages will display UNAVCO's dataset information and UNAVCO's VIVO will display Cornell faculty member contact and
Ludovici, Alessandro; Calveras, Anna
In this paper, we present the design of a Constrained Application Protocol (CoAP) proxy able to interconnect Web applications based on Hypertext Transfer Protocol (HTTP) and WebSocket with CoAP based Wireless Sensor Networks. Sensor networks are commonly used to monitor and control physical objects or environments. Smart Cities represent applications of such a nature. Wireless Sensor Networks gather data from their surroundings and send them to a remote application. This data flow may be short or long lived. The traditional HTTP long-polling used by Web applications may not be adequate in long-term communications. To overcome this problem, we include the WebSocket protocol in the design of the CoAP proxy. We evaluate the performance of the CoAP proxy in terms of latency and memory consumption. The tests consider long and short-lived communications. In both cases, we evaluate the performance obtained by the CoAP proxy according to the use of WebSocket and HTTP long-polling. PMID:25585107
Bastos, Hugo P; Sousa, Lisete; Clarke, Luka A; Couto, Francisco M
Functional context for a biological sequence is provided in the form of annotations. However, within a group of similar sequences there can be annotation heterogeneity in terms of coverage and specificity. This in turn can introduce issues regarding the interpretation of actual functional similarity and the overall functional coherence of such a group. One way to mitigate such issues is through the use of visualization and statistical techniques. Therefore, to help interpret this annotation heterogeneity, we created a web application that generates Gene Ontology annotation graphs for protein sets and their associated statistics, from simple frequencies to enrichment values and Information Content based metrics. The publicly accessible website http://xldb.di.fc.ul.pt/gryfun/ currently accepts lists of UniProt accession numbers in order to create user-defined protein sets for subsequent annotation visualization and statistical assessment. GRYFUN is a freely available web application that allows GO annotation visualization of protein sets and can be used for annotation coherence and cohesiveness analysis and for assessing annotation extension within under-annotated protein sets.
Maddox, Marlo; Zheng, Yihua; Rastaetter, Lutz; Taktakishvili, A.; Mays, M. L.; Kuznetsova, M.; Lee, Hyesook; Chulaki, Anna; Hesse, Michael; Mullinix, Richard; Berrios, David
The NASA GSFC Space Weather Center (http://swc.gsfc.nasa.gov) is committed to providing forecasts, alerts, research, and educational support to address NASA's space weather needs - in addition to the needs of the general space weather community. We provide a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, custom space weather alerts and products, weekly summaries and reports, and most recently - video casts. There are many challenges in providing accurate descriptions of past, present, and expected space weather events - and the Space Weather Center at NASA GSFC employs several innovative solutions to provide access to a comprehensive collection of both observational data, as well as space weather model/simulation data. We'll describe the challenges we've faced with managing hundreds of data streams, running models in real-time, data storage, and data dissemination. We'll also highlight several systems and tools that are utilized by the Space Weather Center in our daily operations, all of which are available to the general community as well. These systems and services include a web-based application called the Integrated Space Weather Analysis System (iSWA, http://iswa.gsfc.nasa.gov), two mobile space weather applications for iOS and Android devices, an external API for web-service-style access to data, Google Earth compatible data products, and a downloadable client-based visualization tool.
Elmeligy Abdelhamid, Sherif H.; Kuhlman, Chris J.; Marathe, Madhav V.; Mortveit, Henning S.; Ravi, S. S.
Discrete dynamical systems are used to model various realistic systems in network science, from social unrest in human populations to regulation in biological networks. A common approach is to model the agents of a system as vertices of a graph, and the pairwise interactions between agents as edges. Agents are in one of a finite set of states at each discrete time step and are assigned functions that describe how their states change based on neighborhood relations. Full characterization of state transitions of one system can give insights into fundamental behaviors of other dynamical systems. In this paper, we describe a discrete graph dynamical systems (GDSs) application called GDSCalc for computing and characterizing system dynamics. It is an open access system that is used through a web interface. We provide an overview of GDS theory. This theory is the basis of the web application; i.e., an understanding of GDS provides an understanding of the software features, while abstracting away implementation details. We present a set of illustrative examples to demonstrate its use in education and research. Finally, we compare GDSCalc with other discrete dynamical system software tools. Our perspective is that no single software tool will perform all computations that may be required by all users; tools typically have particular features that are more suitable for some tasks. We situate GDSCalc within this space of software tools. PMID:26263006
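The modeling setup described above, vertices in finite states updating from neighborhood relations, can be made concrete with a tiny synchronous GDS. This is a generic threshold-rule sketch, not GDSCalc's implementation; the graph and rule are illustrative.

```python
# Minimal discrete graph dynamical system: vertices hold 0/1 states and
# update synchronously with a threshold rule (one of many update schemes
# such a system can use).
def gds_step(states, neighbors, threshold=2):
    """One synchronous update: a vertex becomes 1 iff at least
    `threshold` of its neighbors are currently 1."""
    return {
        v: 1 if sum(states[u] for u in neighbors[v]) >= threshold else 0
        for v in states
    }

# 4-cycle graph: 0-1-2-3-0
nbrs = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
s0 = {0: 1, 1: 1, 2: 1, 3: 0}
s1 = gds_step(s0, nbrs)
```

Iterating gds_step from s1 reveals a period-2 limit cycle on this graph, the kind of full state-transition behavior a tool like GDSCalc characterizes exhaustively.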
Beijer, Lilian J; Rietveld, Toni C M; van Beers, Marijn M A; Slangen, Robert M L; van den Heuvel, Henk; de Swart, Bert J M; Geurts, Alexander C H
In The Netherlands, a web application for speech training, E-learning-based speech therapy (EST), has been developed for patients with dysarthria, a speech disorder resulting from acquired neurological impairments such as stroke or Parkinson's disease. In this report, the EST infrastructure and its potentials for both therapists and patients are elucidated. EST provides patients with dysarthria the opportunity to engage in intensive speech training in their own environment, in addition to undergoing the traditional face-to-face therapy. Moreover, patients with chronic dysarthria can use EST to independently maintain the quality of their speech once the face-to-face sessions with their speech therapist have been completed. This telerehabilitation application allows therapists to remotely compose speech training programs tailored to suit each individual patient. Moreover, therapists can remotely monitor and evaluate changes in the patient's speech. In addition to its value as a device for composing, monitoring, and carrying out web-based speech training, the EST system compiles a database of dysarthric speech. This database is vital for further scientific research in this area.
O'Halloran, Damien M
Overlapping PCR is routinely used in a wide range of molecular applications. These include stitching PCR fragments together, generating fluorescent transcriptional and translational fusions, inserting mutations, making deletions, and PCR cloning. Overlapping PCR is also used for genotyping by traditional PCR techniques and in detection experiments using techniques such as loop-mediated isothermal amplification (LAMP). STITCHER is a web tool that provides a central resource for researchers conducting all types of overlapping PCR experiments, with an intuitive interface for automated primer design that is fast, easy to use, and freely available online (http://ohalloranlab.net/STITCHER.html). STITCHER can handle both single-sequence and multi-sequence input, and specific features facilitate numerous other PCR applications, including assembly PCR, adapter PCR, and primer walking. Field PCR, and in particular LAMP, offers promise as an on-site tool for pathogen detection in underdeveloped areas, and STITCHER includes off-target detection features for pathogens commonly targeted using LAMP technology.
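The "stitching" idea behind such primer design can be illustrated with a toy overlap-primer constructor. This is a hypothetical sketch of the general technique, not STITCHER's algorithm: the helper names, sequences, and the fixed annealing length are assumptions, and real designs would also balance melting temperature and check off-targets.

```python
def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq.upper()))

def overlap_primers(frag_a_end, frag_b_start, anneal_len=10):
    """Toy overlap-extension primer pair joining fragment A to fragment B:
    the forward primer spans the junction (tail from A's 3' end plus the
    start of B), and the reverse primer is its reverse complement."""
    fwd = frag_a_end[-anneal_len:] + frag_b_start[:anneal_len]
    rev = revcomp(fwd)
    return fwd, rev

# Illustrative fragment ends (not real genes):
fwd, rev = overlap_primers("GATTACAGATTACAGG", "CCTAGGTTAAGGCCTA")
```

In a real overlap PCR, each fragment is first amplified with one chimeric primer like `fwd`, and the shared junction sequence then lets the two products prime each other in the stitching reaction.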
There is a great deal of excitement about using the internet and the World Wide Web in education. There are exciting possibilities, and there is a wealth and variety of material on the web. There are, however, many problems: problems of access and resources, problems of quality -- for every excellent resource there are many poor ones -- and insufficiently explored problems of teacher training and motivation. For example, Wiesenmayer and Meadows report on a study of 347 West Virginia science teachers. These teachers were enrolled in a week-long summer workshop to introduce them to the internet and its educational potential. The teachers were asked to review science sites as to overall quality and then as to their usefulness in their own classrooms. The teachers were enthusiastic about the web, and gave two-thirds of the sites high ratings and essentially all the rest average ratings. But alarmingly, over 80% of these sites were viewed as having no direct applicability in the teachers' own classrooms. This summer I was assigned to work on the Amphion project in the Automated Software Engineering Group under the leadership of Michael Lowry. I wished to find educational applications of the Amphion system, which in its current implementation can be used to create Fortran programs and animations using the SPICE libraries created by the NAIF group at JPL. I wished to find an application that provided real added educational value, that was in line with educational curriculum standards, and that would serve a documented need of the educational community. The application selected was teaching about the causes of the seasons, at approximately the fourth- to sixth-grade level. This topic was chosen because it is in line with national curriculum standards. The fourth- to sixth-grade level was selected to coincide with the grade level served by the Ames Aerospace Encounter, which serves 10,000 children a year on field trips. The hope is that
Maso, Joan; Díaz, Paula; Riverola, Anna; Pons, Xavier
Currently, the discovery and sharing of geospatial information over the web still presents difficulties. News distribution through website content was simplified by the use of the Really Simple Syndication (RSS) and Atom syndication formats. This communication presents an extension of Atom for redistributing references to geospatial information in a distributed Spatial Data Infrastructure environment. A geospatial client can save the status of an application that involves several OGC services of different kinds, as well as direct data, and share this status with other users who need the same information and use different client vendor products in an interoperable way. The extensibility of the Atom format was essential to define a format that could be used in RSS-enabled web browsers, mass-market map viewers and emerging geospatially enabled integrated clients that support Open Geospatial Consortium (OGC) services. Since OWS Context has been designed as an Atom extension, it is possible to view the document in the common places where Atom documents are valid. Internet web browsers are able to present the document as a list of items with title, abstract, time, description and downloading features. OWS Context uses GeoRSS, so the document can be interpreted by both Google Maps and Bing Maps as items whose extent is represented on a dynamic map. Another way to exploit an OWS Context document is to develop an XSLT to transform the Atom feed into an HTML5 document that shows the exact status of the client view window that saved the context document. To accomplish this, we use the width and height of the client window, and the extent of the view in world (geographic) coordinates, in order to calculate the scale of the map. Then, we can mix elements in world coordinates (such as CF-NetCDF files or GML) with elements in pixel coordinates (such as WMS maps, WMTS tiles and direct SVG content). A smarter map browser application called MiraMon Map Browser is able to write a context document and read
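The scale computation mentioned above (window size plus world-coordinate extent) can be sketched using the OGC convention of a 0.28 mm standardized rendering pixel. This is an assumption for illustration; the exact rule the OWS Context XSLT applies may differ.

```python
# Minimal sketch: derive a map scale denominator from a saved client window
# and its geographic extent. Assumes the OGC standardized rendering pixel of
# 0.28 mm (0.00028 m); the actual OWS Context computation may differ.

OGC_PIXEL_METERS = 0.00028  # standardized rendering pixel size

def scale_denominator(window_width_px: int, extent_width_m: float) -> float:
    """Scale denominator = ground width covered / physical screen width."""
    screen_width_m = window_width_px * OGC_PIXEL_METERS
    return extent_width_m / screen_width_m

# A 1000-px-wide window showing 28 km of ground is roughly 1:100,000.
scale = round(scale_denominator(1000, 28_000))
```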
Lee, Jae Eun; Sung, Jung Hye; Malouhi, Mohamad
Purpose: There is abundant evidence that neighborhood characteristics are significantly linked to the health of the inhabitants of a given space within a given time frame. This study statistically validates a web-based GIS application designed to support cardiovascular-related research, developed by the NIH-funded Research Centers in Minority Institutions (RCMI) Translational Research Network (RTRN) Data Coordinating Center (DCC), and discusses its applicability to cardiovascular studies. Methods: Geo-referencing, geocoding and geospatial analyses were conducted for 500 randomly selected home addresses in a U.S. southeastern Metropolitan area. The correlation coefficient, factor analysis and Cronbach’s alpha (α) were estimated to quantify measures of the internal consistency, reliability and construct/criterion/discriminant validity of the cardiovascular-related geospatial variables (walk score, number of hospitals, fast food restaurants, parks and sidewalks). Results: Cronbach’s α for the CVD geospatial variables was 95.5%, implying successful internal consistency. Walk scores were significantly correlated with the number of hospitals (r = 0.715; p < 0.0001), fast food restaurants (r = 0.729; p < 0.0001), parks (r = 0.773; p < 0.0001) and sidewalks (r = 0.648; p < 0.0001) within a mile of homes. Walk score was also significantly associated with diversity index (r = 0.138, p = 0.0023), median household income (r = −0.181; p < 0.0001), and owner-occupied rates (r = −0.440; p < 0.0001). However, no significant correlation was found with median age, vulnerability, unemployment rate, labor force, or population growth rate. Conclusion: Our data demonstrate that geospatial data generated by the web-based application were internally consistent and demonstrated satisfactory validity. Therefore, the GIS application may be useful in cardiovascular-related studies aimed at investigating the potential impact of geospatial factors on diseases and/or the long-term effect
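The internal-consistency measure used in the study, Cronbach's alpha, is straightforward to compute; the sketch below uses toy scores, not the study's geospatial measurements.

```python
# Cronbach's alpha for internal consistency (toy data, not the study's).
# alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)

def cronbach_alpha(items):
    """items: list of per-item score lists, one score per observation,
    all the same length. Uses sample (n-1) variance throughout."""
    k = len(items)
    n = len(items[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Perfectly parallel items give alpha close to 1.0
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
```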
King, Zachary A.; Dräger, Andreas; Ebrahim, Ali; Sonnenschein, Nikolaus; Lewis, Nathan E.; Palsson, Bernhard O.
Escher is a web application for visualizing data on biological pathways. Three key features make Escher a uniquely effective tool for pathway visualization. First, users can rapidly design new pathway maps. Escher provides pathway suggestions based on user data and genome-scale models, so users can draw pathways in a semi-automated way. Second, users can visualize data related to genes or proteins on the associated reactions and pathways, using rules that define which enzymes catalyze each reaction. Thus, users can identify trends in common genomic data types (e.g. RNA-Seq, proteomics, ChIP)—in conjunction with metabolite- and reaction-oriented data types (e.g. metabolomics, fluxomics). Third, Escher harnesses the strengths of web technologies (SVG, D3, developer tools) so that visualizations can be rapidly adapted, extended, shared, and embedded. This paper provides examples of each of these features and explains how the development approach used for Escher can be used to guide the development of future visualization tools. PMID:26313928
Lee, Jongin; Hong, Woon-young; Cho, Minah; Sim, Mikang; Lee, Daehwan; Ko, Younhee; Kim, Jaebum
Recent advances in next-generation sequencing technologies and genome assembly algorithms have enabled the accumulation of a huge volume of genome sequences from various species. This has provided new opportunities for large-scale comparative genomics studies. Identifying and utilizing synteny blocks, which are genomic regions conserved among multiple species, is key to understanding genomic architecture and the evolutionary history of genomes. However, the construction and visualization of such synteny blocks from multiple species are very challenging, especially for biologists who lack computational skills. Here, we present Synteny Portal, a versatile web-based application portal for constructing, visualizing and browsing synteny blocks. With Synteny Portal, users can easily (i) construct synteny blocks among multiple species by using prebuilt alignments in the UCSC genome browser database, (ii) visualize and download syntenic relationships as high-quality images, (iii) browse synteny blocks with genetic information and (iv) download the details of synteny blocks to be used as input for downstream synteny-based analyses, all in an intuitive and easy-to-use web-based interface. We believe that Synteny Portal will serve as a highly valuable tool that will enable biologists to easily perform comparative genomics studies by compensating for the limitations of existing tools. Synteny Portal is freely available at http://bioinfo.konkuk.ac.kr/synteny_portal. PMID:27154270
Molinaro, Marco; Knapic, Cristina; Smareglia, Riccardo
Italian center for Astronomical Archives (IA2, http://ia2.oats.inaf.it) is a national infrastructure project of the Italian National Institute for Astrophysics (Istituto Nazionale di AstroFisica, INAF) that provides services for the astronomical community. Besides data hosting for the Large Binocular Telescope (LBT) Corporation, the Galileo National Telescope (Telescopio Nazionale Galileo, TNG) Consortium and other telescopes and instruments, IA2 offers proprietary and public data access through user portals (both developed and mirrored) and deploys resources complying with the Virtual Observatory (VO) standards. Archiving systems and web interfaces are developed to be extremely flexible about adding new instruments from other telescopes. VO resource publishing, along with data access portals, implements the International Virtual Observatory Alliance (IVOA) protocols, providing astronomers with new ways of analyzing data. Given the large variety of data flavours and IVOA standards, the need arises for tools that easily accomplish data ingestion and data publishing. This paper describes the VO-Dance tool, which IA2 started developing to address VO resource publishing in a dynamic way from existing database tables or views. The tool consists of a Java web application, potentially DBMS- and platform-independent, that internally stores the services' metadata and information, exposes RESTful endpoints to accept VO queries for these services, and dynamically translates calls to these endpoints into SQL queries coherent with the published table or view. In response to a call, VO-Dance translates the database answer back into a VO-compliant response.
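The endpoint-to-SQL translation described above can be sketched for the simplest VO request, a cone search. The table and column names below are hypothetical, the box filter is a naive stand-in for proper spherical geometry, and a production service would use parameterized queries rather than string formatting.

```python
# Hedged sketch of the kind of request-to-SQL translation VO-Dance performs:
# a cone-search-style call (RA, DEC, SR) mapped onto a published table/view.
# Hypothetical names; a real service needs spherical math and SQL parameters.

def cone_search_sql(table, ra, dec, sr, ra_col="ra", dec_col="dec"):
    """Translate a simple cone-search request into a (naive) box-filter
    SQL query against the published table or view."""
    return (
        f"SELECT * FROM {table} "
        f"WHERE {ra_col} BETWEEN {ra - sr} AND {ra + sr} "
        f"AND {dec_col} BETWEEN {dec - sr} AND {dec + sr}"
    )

sql = cone_search_sql("lbt_images", ra=150.0, dec=2.0, sr=0.5)
```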
The mission of the United States Nuclear Data Program (USNDP) is to provide current, accurate, and authoritative data for use in pure and applied areas of nuclear science and engineering. This is accomplished by compiling, evaluating, and disseminating extensive datasets. Our main products include the Evaluated Nuclear Structure File (ENSDF) containing information on nuclear structure and decay properties and the Evaluated Nuclear Data File (ENDF) containing information on neutron-induced reactions. The National Nuclear Data Center (NNDC), through the website www.nndc.bnl.gov, provides web-based retrieval systems for these and many other databases. In addition, the NNDC hosts several on-line physics tools, useful for calculating various quantities relating to basic nuclear physics. In this talk, I will first introduce the quantities which are evaluated and recommended in our databases. I will then outline the searching capabilities which allow one to quickly and efficiently retrieve data. Finally, I will demonstrate how the database searches and web applications can provide effective teaching tools concerning the structure of nuclei and how they interact. Work supported by the Office of Nuclear Physics, Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-98CH10886.
Gurjar, Anoop Kishor Singh; Panwar, Abhijeet Singh; Gupta, Rajinder; Mantri, Shrikant S.
High-throughput small RNA (sRNA) sequencing technology enables an entirely new perspective for plant microRNA (miRNA) research and has immense potential to unravel regulatory networks. Novel insights gained through data mining in the publicly available rich resource of sRNA data will help in designing biotechnology-based approaches for crop improvement to enhance plant yield and nutritional value. Bioinformatics resources enabling meta-analysis of miRNA expression across multiple plant species are still evolving. Here, we report PmiRExAt, a new online database resource that serves as a plant miRNA expression atlas. The web-based repository comprises miRNA expression profiles and a query tool for 1859 wheat, 2330 rice and 283 maize miRNAs. The database interface offers open and easy access to miRNA expression profiles and helps in identifying tissue-preferential, differential and constitutively expressing miRNAs. A feature enabling expression study of conserved miRNAs across multiple species is also implemented. A custom expression analysis feature enables expression analysis of novel miRNAs in a total of 117 datasets. New sRNA datasets can also be uploaded for analysing miRNA expression profiles for 73 plant species. The PmiRExAt application program interface, a Simple Object Access Protocol (SOAP) web service, allows other programmers to remotely invoke the methods written for programmatic search operations on the PmiRExAt database. Database URL: http://pmirexat.nabi.res.in. PMID:27081157
Jiang, Wenping; Zou, Ziming
system into independent modules according to different business needs is applied to solve the problem of the independence of the physical space between multiple models. The classic MVC (Model-View-Controller) software design pattern is used to build the architecture of the space physics multi-model application integrated system. JSP+servlet+javabean technology is used to integrate the web application programs of the space physics multi-model system. It solves the problem of multiple users requesting the same model-computing job and effectively balances the computing tasks of each server. In addition, we also completed the following tasks: establishing a standard graphical user interface based on a Java Applet application program; designing the interface between model computing and the visualization of model computing results; realizing three-dimensional network visualization without plug-ins; using Java3D technology to achieve three-dimensional network scene interaction; and improving the ability to interact with web pages and dynamic execution capabilities, including rendering three-dimensional graphics and controlling fonts and color. Through the design and implementation of the web-based SPMAIS, we provide an online computing and application runtime environment for the space physics multi-model system. Practical application shows that researchers can benefit from our system in space physics research and engineering applications.
Fernandes, N.; Lopes, R.; Carriço, L.
Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju
With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…
Tagg, N.; Brangham, J.; Chvojka, J.; Clairemont, M.; Day, M.; Eberly, B.; Felix, J.; Fields, L.; Gago, A.M.; Gran, R.; Harris, D.A.; /Fermilab /William-Mary Coll.
Stracquadanio, Giovanni; Yang, Kun; Boeke, Jef D.; Bader, Joel S.
Summary: Synthetic biology has become a widely used technology, and expanding applications in research, education and industry require progress tracking for team-based DNA synthesis projects. Although some vendors are beginning to supply multi-kilobase sequence-verified constructs, synthesis workflows starting with short oligos remain important for cost savings and pedagogical benefit. We developed BioPartsDB as an open source, extendable workflow management system for synthetic biology projects with entry points for oligos and larger DNA constructs and ending with sequence-verified clones. Availability and Implementation: BioPartsDB is released under the MIT license and available for download at https://github.com/baderzone/biopartsdb. Additional documentation and video tutorials are available at https://github.com/baderzone/biopartsdb/wiki. An Amazon Web Services image is available from the AWS Market Place (ami-a01d07c8). Contact: firstname.lastname@example.org PMID:27412090
Rudik, Anastasia V; Bezhentsev, Vladislav M; Dmitriev, Alexander V; Druzhilovskiy, Dmitry S; Lagunin, Alexey A; Filimonov, Dmitry A; Poroikov, Vladimir V
A new freely available web application, MetaTox ( http://www.way2drug.com/mg ), for prediction of xenobiotic metabolism and calculation of the toxicity of metabolites based on the structural formula of chemicals has been developed. MetaTox predicts metabolites formed by nine classes of reactions (aliphatic and aromatic hydroxylation, N- and O-glucuronidation, N-, S- and C-oxidation, and N- and O-dealkylation). The calculation of probability for generated metabolites is based on analyses of "structure-biotransformation reactions" and "structure-modified atoms" relationships using a Bayesian approach. Prediction of LD50 values is performed by the GUSAR software for the parent compound and each of the generated metabolites, using quantitative structure-activity relationship (QSAR) models created for acute rat toxicity with the intravenous route of administration.
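The Bayesian scoring of candidate biotransformations can be illustrated with a smoothed conditional probability over co-occurrence counts. The feature names, reaction classes and counts below are toy values for illustration only, not MetaTox's actual training statistics or exact formula.

```python
# Hedged sketch of Bayesian-style scoring for candidate biotransformations:
# P(reaction | structural feature) estimated from co-occurrence counts with
# add-one (Laplace) smoothing. All names and counts here are hypothetical.

def reaction_probability(feature, reaction, counts, reactions):
    """P(reaction | feature) from co-occurrence counts, Laplace-smoothed
    over the list of known reaction classes."""
    total = sum(counts.get((feature, r), 0) for r in reactions)
    return (counts.get((feature, reaction), 0) + 1) / (total + len(reactions))

reactions = ["aromatic hydroxylation", "N-oxidation"]
counts = {
    ("aromatic_CH", "aromatic hydroxylation"): 40,  # feature/reaction co-counts
    ("aromatic_CH", "N-oxidation"): 2,
}
p = reaction_probability("aromatic_CH", "aromatic hydroxylation", counts, reactions)
```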
Currenti, Gilda; Napoli, Rosalba; Sicali, Antonino; Greco, Filippo; Negro, Ciro Del
We present GEOFIM (GEOphysical Forward/Inverse Modeling), a WebGIS application for integrated interpretation of multiparametric geophysical observations. It has been developed to jointly interpret scalar and vector magnetic data, gravity data, as well as geodetic data from GPS, tiltmeter, strainmeter and InSAR observations, recorded in active volcanic areas. GEOFIM gathers a library of analytical solutions, which provides an estimate of the geophysical signals due to perturbations in the thermal and stress state of the volcano. The integrated geophysical modeling can be performed either by simple trial-and-error forward modeling or by an inversion procedure based on the NSGA-II algorithm. The software's capability was tested on the multiparametric data set recorded during the 2008-2009 Etna flank eruption onset. The results encourage exploiting this approach to develop a near-real-time warning system for a quantitative, model-based assessment of geophysical observations in areas where different parameters are routinely monitored.
Aanensen, David M.; Huntley, Derek M.; Menegazzo, Mirko; Powell, Chris I.; Spratt, Brian G.
Previously, we have described the development of the generic mobile phone data gathering tool, EpiCollect, and an associated web application, providing two-way communication between multiple data gatherers and a project database. This software only allows data collection on the phone using a single questionnaire form that is tailored to the needs of the user (including a single GPS point and photo per entry), whereas many applications require a more complex structure, allowing users to link a series of forms in a linear or branching hierarchy, along with the addition of any number of media types accessible from smartphones and/or tablet devices (e.g., GPS, photos, videos, sound clips and barcode scanning). A much enhanced version of EpiCollect has been developed (EpiCollect+). The individual data collection forms in EpiCollect+ provide more design complexity than the single form used in EpiCollect, and the software allows the generation of complex data collection projects through the ability to link many forms together in a linear (or branching) hierarchy. Furthermore, EpiCollect+ allows the collection of multiple media types as well as standard text fields, increased data validation and form logic. The entire process of setting up a complex mobile phone data collection project to the specification of a user (project and form definitions) can be undertaken at the EpiCollect+ website using a simple ‘drag and drop’ procedure, with visualisation of the data gathered using Google Maps and charts at the project website. EpiCollect+ is suitable for situations where multiple users transmit complex data by mobile phone (or other Android devices) to a single project web database and is already being used for a range of field projects, particularly public health projects in sub-Saharan Africa. However, many uses can be envisaged from education, ecology and epidemiology to citizen science. PMID:25485096
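The linked-form hierarchy described above (forms chained linearly or branching, each with its own media types) can be sketched as a plain data structure. The field names are hypothetical, not the actual EpiCollect+ project-definition schema.

```python
# Sketch of a linked-form hierarchy like the one EpiCollect+ supports:
# each form may declare a parent, so forms chain linearly or branch.
# Field names ("name", "parent", "media") are illustrative assumptions.

def child_forms(forms, parent_name):
    """Return the names of forms whose parent is the given form."""
    return [f["name"] for f in forms if f.get("parent") == parent_name]

project = [
    {"name": "household", "parent": None,        "media": ["gps", "photo"]},
    {"name": "resident",  "parent": "household", "media": ["photo"]},
    {"name": "sample",    "parent": "resident",  "media": ["barcode", "gps"]},
    {"name": "visit",     "parent": "household", "media": []},  # branching sibling
]

branches = child_forms(project, "household")
```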
Ries, Kernell G.; Horn, Marilee A.; Nardi, Mark R.; Tessler, Steven
Approximately 25,000 new households and thousands of new jobs will be established in an area that extends from southwest to northeast of Baltimore, Maryland, as a result of the Federal Base Realignment and Closure (BRAC) process, with consequent new demands on the water resources of the area. The U.S. Geological Survey, in cooperation with the Maryland Department of the Environment, has extended the area of implementation and added functionality to an existing map-based Web application named StreamStats to provide an improved tool for planning and managing the water resources in the BRAC-affected areas. StreamStats previously was implemented for only a small area surrounding Baltimore, Maryland, and it was extended to cover all BRAC-affected areas. StreamStats could provide previously published streamflow statistics, such as the 1-percent probability flood and the 7-day, 10-year low flow, for U.S. Geological Survey data-collection stations and estimates of streamflow statistics for any user-selected point on a stream within the implemented area. The application was modified for this study to also provide summaries of water withdrawals and discharges upstream from any user-selected point on a stream. This new functionality was made possible by creating a Web service that accepts a drainage-basin delineation from StreamStats, overlays it on a spatial layer of water withdrawal and discharge points, extracts the water-use data for the identified points, and sends it back to StreamStats, where it is summarized for the user. The underlying water-use data were extracted from the U.S. Geological Survey's Site-Specific Water-Use Database System (SWUDS) and placed into a Microsoft Access database that was created for this study for easy linkage to the Web service and StreamStats. This linkage of StreamStats with water-use information from SWUDS should enable Maryland regulators and planners to make more informed decisions on the use of water resources in the BRAC area, and
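The overlay step behind the new water-use functionality (a delineated basin comes in, a withdrawal summary goes out) can be sketched with a point-in-polygon filter and a sum. The ray-casting test and field names below are illustrative, not the USGS StreamStats/SWUDS implementation.

```python
# Sketch of the overlay step: keep withdrawal/discharge sites that fall
# inside the delineated basin polygon and total their reported water use.
# Even-odd ray casting; field names ("x", "y", "mgd") are hypothetical.

def point_in_polygon(x, y, polygon):
    """Even-odd ray-casting test; polygon is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge crosses the ray's y
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def summarize_withdrawals(basin, sites):
    """Total withdrawals (million gallons/day) for sites inside the basin."""
    return sum(s["mgd"] for s in sites if point_in_polygon(s["x"], s["y"], basin))

basin = [(0, 0), (4, 0), (4, 4), (0, 4)]
sites = [{"x": 1, "y": 1, "mgd": 2.5}, {"x": 5, "y": 5, "mgd": 9.0}]
total = summarize_withdrawals(basin, sites)
```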
Since the inception of the South African Heritage Resources Information System (SAHRIS) in 2012, creating heritage cases and permit applications has been streamlined, and interaction with South African Heritage Authorities has been simplified. SAHRIS facilitates applications for development cases and mining applications that trigger the South African National Heritage Resources Act (Act 25 of 1999) and is able to differentiate between cases that require comment only, where the heritage process is subsidiary to environmental or mining law (Section 38(8)), and those where the heritage authority is the deciding authority (Section 38(1)). The system further facilitates cases related to site and object management, as well as permit applications for excavation, invasive research techniques and export of materials for research abroad in the case of archaeological or palaeontological specimens, or for sale or exhibition in the case of heritage objects. The integrated, easy to use, online system has removed the need for applicants to print out forms, take documents from one government department to the next for approval and other time-consuming processes that accompany paper-based systems. SAHRIS is a user friendly application that makes it easy for applicants to make their submissions, but also allows applicants to track the progress of their cases with the relevant heritage authority, which allows for better response rates and turnaround times from the authorities, while also ensuring transparency and good governance practice.
Kim, Minsung; Kim, Kamyoung; Lee, Sang-Il
This article examines the pedagogical potential of a Web-based GIS application, Population Migration Web Service (PMWS), in which students can examine population geography in an interactive and exploratory manner. This article introduces PMWS, a tailored, unique Internet GIS application that provides functions for visualizing spatial interaction…
Cao, Xinhua; Wong, Stephen T C; Hoo, Kent Soo; Tjandra, Donny; Fu, J C; Lowenstein, Daniel H
There is an increasing need to efficiently share diverse clinical and image data among different clinics, labs, and departments of a medical center enterprise to facilitate better quality care and more effective clinical research. In this paper, we describe a web-based, federated information model as a viable technical solution with applications in medical refractory epilepsy and other neurological disorders. We describe four such online applications developed in a federated system prototype: surgical planning, image analysis, statistical data analysis, and dynamic extraction, transforming, and loading (ETL) of data from a heterogeneous collection of data sources into an epilepsy multimedia data warehouse (EMDW). The federated information system adopts a three-tiered architecture, consisting of a user-interface layer, an application logic layer, and a data service layer. We implemented two complementary federated information technologies, i.e., XML (eXtensible Markup Language) and CORBA (Common Object Request Broker Architecture), in the prototype to enable multimedia data exchange and brain images transmission. The preliminary results show that the federated prototype system provides a uniform interface, heterogeneous information integration and efficient data sharing for users in our institution who are concerned with the care of patients with epilepsy and who pursue research in this area.
Pennington, Jeffrey W; Ruth, Byron; Italia, Michael J; Miller, Jeffrey; Wrazien, Stacey; Loutrel, Jennifer G; Crenshaw, E Bryan; White, Peter S
Biomedical researchers share a common challenge of making complex data understandable and accessible as they seek inherent relationships between attributes in disparate data types. Data discovery in this context is limited by a lack of query systems that efficiently show relationships between individual variables, but without the need to navigate underlying data models. We have addressed this need by developing Harvest, an open-source framework of modular components, and using it for the rapid development and deployment of custom data discovery software applications. Harvest incorporates visualizations of highly dimensional data in a web-based interface that promotes rapid exploration and export of any type of biomedical information, without exposing researchers to underlying data models. We evaluated Harvest with two cases: clinical data from pediatric cardiology and demonstration data from the OpenMRS project. Harvest's architecture and public open-source code offer a set of rapid application development tools to build data discovery applications for domain-specific biomedical data repositories. All resources, including the OpenMRS demonstration, can be found at http://harvest.research.chop.edu.
Aljraiwi, Seham Salman
The current study proposes a web-applications-based learning environment to promote teaching and learning activities in the classroom. It also helps teachers facilitate learners' contributions to the process of learning and improve their motivation and performance. The case study illustrated that female students were more interested in learning…
Bintas, Jale; Barut, Asim
The aim of the research is to compare tenth-grade students and determine their levels of success in classic versus web-based educational applications of a Turbo Pascal lesson. This research was applied to 10 A and 10 TLB students of the Izmir Karsikaya Anatolian Technical and Industrial High School computer department in the second term of…
Wang, Ying; Le, Linh H; Wang, Xiaohang; Tao, Zhen; Druschel, Charlotte D; Cross, Philip K; Hwang, Syni-An
Geographic information systems (GIS) have been widely used in mapping health data and analyzing the geographic distribution of disease. Mapping and spatially analyzing data normally begins with geocoding, a process of assigning geographic coordinates to an address so that it can be displayed and analyzed on a map. The objectives of this project were to develop Web-based geocoding applications for the New York State birth defects surveillance system to geocode, both automatically and interactively, the birth defect cases of the Congenital Malformations Registry (CMR) and evaluate the geocoding results. Geocoding software, in conjunction with a Java-based development tool (J Server), was used to develop the Web-based applications on the New York State Department of Health's Health Commerce System. The Web-based geocoding applications have been developed and implemented for the New York State birth defects surveillance system. These menu-driven applications empower users to conduct geocoding activities using only a PC and a Web browser without the installation of any GIS software. These powerful tools provide automatic, real-time, street-level geocoding of the routinely collected birth defects records in the CMR. Up to 92% of the CMR records have been geocoded with addresses exactly matched to the reference addresses on house number, street name, and city or zip code.
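The exact-match criterion reported above (house number, street name, and city or ZIP code agreeing with a reference address) can be sketched as a simple lookup. The reference table and field names are hypothetical, not the actual geocoding software's matching logic.

```python
# Sketch of street-level exact matching as described: a record geocodes when
# house number and street name match a reference address and either the city
# or the ZIP code agrees. Reference data and field names are hypothetical.

def normalize(s):
    """Case-fold and collapse whitespace for comparison."""
    return " ".join(s.upper().split())

def exact_match(record, reference):
    """Return the first matching reference address, or None if the record
    cannot be geocoded at street level."""
    for ref in reference:
        if (record["number"] == ref["number"]
                and normalize(record["street"]) == normalize(ref["street"])
                and (normalize(record["city"]) == normalize(ref["city"])
                     or record["zip"] == ref["zip"])):
            return ref
    return None

reference = [{"number": "120", "street": "Main St", "city": "Albany",
              "zip": "12201", "lat": 42.6526, "lon": -73.7562}]
hit = exact_match({"number": "120", "street": "main st", "city": "ALBANY",
                   "zip": "00000"}, reference)
```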
Aye, Zar Chi; Jaboyedoff, Michel; Derron, Marc-Henri
Probabilistic Risk Assessment (PRA) model and the knowledge collected from experts. The visualization of the risk reduction scenarios can also be shared among the users on the web to support the on-line participatory process. In addition, cost-benefit ratios of the different risk reduction scenarios can be prepared in order to serve as inputs for high-level decision makers. The most appropriate risk reduction scenarios will be chosen using the Multi-Criteria Evaluation (MCE) method by weighting different parameters according to the preferences and criteria defined by the users. The role of public participation has been changing from one-way communication between authorities, experts, stakeholders and citizens towards more intensive two-way interaction. Involving the affected public and interest groups can enhance the level of legitimacy, transparency, and confidence in the decision-making process. Because of its important part in decision making, an online participatory tool is included in the DSS to allow the involved stakeholders to interact in risk reduction and to be aware of the existing vulnerability conditions of the community. Moreover, it aims to achieve a more transparent and better informed decision-making process. The system is currently under development, and the first tools implemented will be presented, showing the wide possibilities of new web technologies, which can have a great impact on the decision-making process. It will be applied in four pilot areas in Europe: the French Alps, North Eastern Italy, Romania and Poland. Nevertheless, the framework will be designed and implemented in a way that makes it applicable in any other region.
Langella, Giuliano; Basile, Angelo; Giannecchini, Simone; Iamarino, Michela; Munafò, Michele; Terribile, Fabio
Soil sealing is one of the most important causes of land degradation and desertification. In Europe, soil covered by impermeable materials has increased by about 80% from the Second World War to the present day, while population has only grown by one third. There is increasing concern at high political levels about the need to attenuate imperviousness itself and its effects on soil functions. The European Commission promulgated a roadmap (COM(2011) 571) by which net land take would be zero by 2050. Furthermore, the European Commission also published a report in 2011 providing best practices and guidelines for limiting soil sealing and imperviousness. In this scenario, we developed an open-source-based Soil Sealing Geospatial Cyber Infrastructure (SS-GCI) named "Soil Monitor". This tool merges a webGIS with parallel geospatial computation in a fast and dynamic fashion in order to provide real-time assessments of soil sealing at high spatial resolution (20 meters and below) over the whole of Italy. Common open-source webGIS packages, such as GeoServer and MapStore, are used to implement both the data management and visualization infrastructures. High-speed geospatial computation is ensured by GPU parallelism using the CUDA (Compute Unified Device Architecture) framework by NVIDIA®. This kind of parallelism required writing, from scratch, all the code needed to perform the geospatial computation behind the soil sealing toolbox. The combination of GPU computing with webGIS infrastructures is relatively novel and required particular attention to the Java-CUDA programming interface. As a result, Soil Monitor is fast: it can perform highly time-consuming calculations (for instance, querying an entire Italian administrative region as the area of interest) in less than one minute. The web application runs in a web browser and nothing must be installed before using it. Potentially everybody can use it, but the main targets are the
Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas
Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the
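One of the inter-distribution relations a resource like Distributome catalogues, the normal approximation to the binomial, can be illustrated with standard-library Python. This is a generic textbook example, not Distributome code:

```python
import math

def binom_pmf(k, n, p):
    """Exact binomial probability mass at k."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def normal_pdf(x, mu, sigma):
    """Normal density, the large-n approximation to the binomial."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# Binomial(n=400, p=0.5) is approximated by Normal(mu=np, sigma^2=np(1-p))
n, p = 400, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
for k in (190, 200, 210):
    print(k, round(binom_pmf(k, n, p), 5), round(normal_pdf(k, mu, sigma), 5))
```

For n this large the two columns agree to about four decimal places, which is exactly the kind of relation an interactive distribution explorer makes visible.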
Minhas, Riaz Aziz; Ali, Usman; Awan, Muhammad Siddique; Ahmed, Khawaja Basharat; Khan, Muhammad Nasim; Dar, Naeem Iftikhar; Qamar, Qamar Zaman; Ali, Hassan; Grueter, Cyril C; Tsuji, Yamato
Grey langurs (Semnopithecus spp.) occupy a variety of habitats, ranging from lowland forests and semi-desert to alpine forests. Little is known about their foraging and ranging in alpine forests, which appear to contain less food than lowland forests. We conducted a 1-year study of Himalayan grey langurs (Semnopithecus ajax) in Machiara National Park, Pakistan, where they occur at relatively high altitudes (range 2000-4733 m). We followed three groups of different sizes and compositions and examined the effects of ecological and social factors on ranging and feeding. The home-range sizes of a small bisexual group (SBG), a large bisexual group (LBG), and an all-male group (AMG) were 2.35 ± 0.92 (mean ± SD; average of four seasons), 3.28 ± 0.55, and 3.52 ± 1.00 km(2), respectively, and were largest in winter for all groups. The daily path lengths of the SBG, LBG, and AMG were 1.23 ± 0.28 (mean ± SD; average of four seasons), 1.75 ± 0.34, and 1.84 ± 0.70 km, respectively; that of the LBG was longer in winter, while that of the AMG was shorter in summer. Both the home-range size and daily path length of the AMG were larger than those of the other groups, even after partialling out the effect of group size differences. The mean altitude used by the langurs and the proportion of animals seen feeding did not differ among seasons or group types. As the mean temperature increased, the altitude used by langurs significantly increased for the SBG and LBG, but not for the AMG. On the other hand, as the temperature increased, the home-range sizes significantly decreased for the SBG and AMG, but not for the LBG. Rainfall did not show any correlation with ranging or feeding in any of the groups. Our results suggested that grey langurs in Machiara National Park employ a high-cost, high-return foraging strategy in winter, and that the ranging of the AMG also reflects its reproductive strategy.
Background The interaction between biological researchers and the bioinformatics tools they use is still hampered by incomplete interoperability between such tools. To ensure interoperability initiatives are effectively deployed, end-user applications need to be aware of, and support, best practices and standards. Here, we report on an initiative in which software developers and genome biologists came together to explore and raise awareness of these issues: BioHackathon 2009. Results Developers in attendance came from diverse backgrounds, with experts in Web services, workflow tools, text mining and visualization. Genome biologists provided expertise and exemplar data from the domains of sequence and pathway analysis and glyco-informatics. One goal of the meeting was to evaluate the ability to address real world use cases in these domains using the tools that the developers represented. This resulted in i) a workflow to annotate 100,000 sequences from an invertebrate species; ii) an integrated system for analysis of the transcription factor binding sites (TFBSs) enriched based on differential gene expression data obtained from a microarray experiment; iii) a workflow to enumerate putative physical protein interactions among enzymes in a metabolic pathway using protein structure data; iv) a workflow to analyze glyco-gene-related diseases by searching for human homologs of glyco-genes in other species, such as fruit flies, and retrieving their phenotype-annotated SNPs. Conclusions Beyond deriving prototype solutions for each use-case, a second major purpose of the BioHackathon was to highlight areas of insufficiency. We discuss the issues raised by our exploration of the problem/solution space, concluding that there are still problems with the way Web services are modeled and annotated, including: i) the absence of several useful data or analysis functions in the Web service "space"; ii) the lack of documentation of methods; iii) lack of compliance with the SOAP
Shen, Xin; Chu, Ka Hou; Chan, Benny Kwok Kan; Tsang, Ling Ming
The complete mitochondrial genome of Megabalanus ajax Darwin, 1854 (Sessilia: Balanidae) is reported. Compared to typical gene content of metazoan mitochondrial genomes, duplication of one tRNA gene (trnL2) and absence of another tRNA gene (trnS1) are identified in M. ajax mitochondrial genome. There is a replacement of one tRNA (trnS1) by another tRNA (trnL2) in M. ajax mitochondrial genome compared to Megabalanus volcano mitochondrial genome. Inversion of a six-gene block (trnP-nd4L-nd4-trnH-nd5-trnF) is found between M. ajax/M. volcano and Tetraclita japonica mitochondrial genomes. With reference to the pancrustacean mitochondrial ground pattern, there is an inversion of a large gene block from the light strand to heavy strand in the two Megabalanus mitochondrial genomes, including three PCGs and two tRNAs (nd4L-nd4-trnH-nd5-trnF). Furthermore, four tRNAs (trnA, trnE, trnQ and trnC) exhibit translocation, while translocation and inversion occur in three tRNAs (trnP, trnY and trnK).
Holubec, V.; Valášková, T.; Halounová, L.
The project describes a process of conversion of printed books into a web map and mobile application. The goal of the project is to make the spatial data in the books accessible to the wide public using GIS, especially on the web, in order to spread information about this topic. Moreover, as a result of the analysis and of the new perspectives gained from the data context, historians will be able to find new connections. The books that serve as sources for the project (two books with a scope of about 1400 pages featuring hundreds of locations, where each location is associated with several events of different types) refer to places with many addresses in Prague and some villages in the Czech Republic which are related to events that took place during World War II. The paper describes the steps of the conversion, the design of the data model in an Esri geodatabase, and examples of outputs. The historical data are connected to current addresses, and thanks to this combination of historical and current locations, the project will help to discover a part of the history of the Czech Republic and will show new context in the data via GIS capabilities. This project is a continuation of a project which recorded a death march on a map. It is a unique project created in cooperation with Academia Publishing. The outputs of the project will serve as a core resource for a multimedia history portal. The author of the books is currently writing sequels covering the post-war period, and at least two other books are envisioned, so the future of the project is ensured.
Habib, Shahid; Talabac, Stephen J.
There is significant interest in the Earth Science research and remote sensing user community in substantially increasing the number of useful observations relative to the current frequency of collection. The obvious reason for such a push is to improve the temporal, spectral, and spatial coverage of the area(s) under investigation. However, there is little analysis available in terms of the benefits, costs and the optimal set of sensors needed to make the necessary observations. Classic observing system solutions may no longer be applicable because of their point-design philosophy. Instead, a new intelligent data collection system paradigm should be developed, employing both reactive and proactive measurement strategies with adaptability to the dynamics of the phenomena. This is a complex problem that should be carefully studied and balanced across various boundaries, including science, modeling, applications, and technology. Modeling plays a crucial role in making useful predictions about naturally occurring or human-induced phenomena. In particular, modeling can serve to mitigate the potentially deleterious impacts a phenomenon may have on human life, property, and the economy. This is especially significant when one is interested in learning about the dynamics of, for example, the spread of forest fires, regional to large-scale air quality issues, the spread of harmful invasive species, or the atmospheric transport of volcanic plumes and ash. This paper identifies and examines these challenging issues and presents architectural alternatives for an integrated sensor web providing observing scenarios with the requisite dynamic spatial, spectral, and temporal characteristics to address these key application areas. A special emphasis is placed on the observing system and its operational aspects in serving multiple users and stakeholders and providing societal benefits. We also address how such systems will take advantage of technological advancement in
The Web 2.0 and the Semantic Web represent different forms of evolution of the first-generation Web, and both of them enrich Web resources with semantic annotations. Recommendation and personalization of Web resources is another trend that becomes more and more important with the growth of information, and both the Web 2.0 and the Semantic Web are deeply connected to it. The objective of this paper is to analyze the contribution of recommendation and adaptation techniques to these paradigms and to investigate if these techniques can be used as a bridge for their integration. More specifically, the paper will focus on the contribution of adaptation and recommendation techniques to improve the quality of annotations in the Web 2.0, Semantic Web, and mixed approaches and the relevance of annotated resources that are retrieved or filtered to users.
Malkin, Mathew R.; Lenart, John; Stier, Gary R.; Gatling, Jason W.; Applegate II, Richard L.
Objectives This study compared admission rates to a United States anesthesiology residency program for applicants completing face-to-face versus web-based interviews during the admissions process. We also explored factors driving applicants to select each interview type. Methods The 211 applicants invited to interview for admission to our anesthesiology residency program during the 2014-2015 application cycle were participants in this pilot observational study. Of these, 141 applicants selected face-to-face interviews, 53 applicants selected web-based interviews, and 17 applicants declined to interview. Data regarding applicants' reasons for selecting a particular interview type were gathered using an anonymous online survey after interview completion. Residency program admission rates and survey answers were compared between applicants completing face-to-face versus web-based interviews. Results One hundred twenty-seven (75.1%) applicants completed face-to-face and 42 (24.9%) completed web-based interviews. The admission rate to our residency program was not significantly different between applicants completing face-to-face versus web-based interviews. One hundred eleven applicants completed post-interview surveys. The most common reasons for selecting web-based interviews were conflict of interview dates between programs, travel concerns, or financial limitations. Applicants selected face-to-face interviews due to a desire to interact with current residents, or geographic proximity to the residency program. Conclusions These results suggest that completion of web-based interviews is a viable alternative to completion of face-to-face interviews, and that choice of interview type does not affect the rate of applicant admission to the residency program. Web-based interviews may be of particular interest to applicants applying to a large number of programs, or with financial limitations. PMID:27039029
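The comparison of admission rates between the two interview groups is a standard two-proportion test; a minimal chi-square version (1 degree of freedom, no continuity correction) is sketched below. The counts are illustrative only, not the study's actual data:

```python
import math

def two_proportion_chi2(a_admit, a_total, b_admit, b_total):
    """Chi-square test on a 2x2 table (1 df, no continuity correction)."""
    table = [[a_admit, a_total - a_admit], [b_admit, b_total - b_admit]]
    n = a_total + b_total
    rows = [a_total, b_total]
    cols = [a_admit + b_admit, n - a_admit - b_admit]
    chi2 = sum((table[i][j] - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
               for i in range(2) for j in range(2))
    p = math.erfc(math.sqrt(chi2 / 2))  # chi-square survival function for 1 df
    return chi2, p

# Illustrative counts only (group sizes borrowed from the abstract)
chi2, p = two_proportion_chi2(30, 127, 10, 42)
print(round(chi2, 4), round(p, 3))
```

With nearly equal proportions in the two groups, the statistic is close to zero and the p-value close to one, matching the abstract's "not significantly different" conclusion in spirit.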
Pietrobon, Ricardo; Nielsen, Karen C; Steele, Susan M; Menezes, Andreia P; Martins, Henrique; Jacobs, Danny O
Background Although scientific writing plays a central role in the communication of clinical research findings and consumes a significant amount of time from clinical researchers, few Web applications have been designed to systematically improve the writing process. This application had as its main objective the separation of the multiple tasks associated with scientific writing into smaller components. It also aimed at providing a mechanism whereby sections of the manuscript (text blocks) could be assigned to different specialists. Manuscript Architect was built using the Java language in conjunction with the classic lifecycle development method. The interface was designed for simplicity and economy of movement. Manuscripts are divided into multiple text blocks that can be assigned to different co-authors by the first author. Each text block contains notes to guide co-authors regarding its central focus, previous examples, and an additional field for translation when the initial text is written in a language different from the one used by the target journal. Usability was evaluated using formal usability tests and field observations. Results The application presented excellent usability and integration with the regular writing habits of experienced researchers. Workshops were developed to train novice researchers, producing an accelerated learning curve. The application has been used in over 20 different scientific articles and grant proposals. Conclusion The current version of Manuscript Architect has proven to be very useful in the writing of multiple scientific texts, suggesting that virtual writing by interdisciplinary groups is an effective approach when interdisciplinary work is required. PMID:15960855
Wayland, Matthew T; Chubb, James C
When parasites invade paired structures of their host non-randomly, the resulting asymmetry may have both pathological and ecological significance. To facilitate the detection and visualisation of asymmetric infections we have developed a free software tool, Analysis of Symmetry of Parasitic Infections (ASPI). This tool has been implemented as an R package (https://cran.r-project.org/package=aspi) and a web application (https://wayland.shinyapps.io/aspi). ASPI can detect both consistent bias towards one side, and inconsistent bias in which the left side is favoured in some hosts and the right in others. Application of ASPI is demonstrated using previously unpublished data on the distribution of metacercariae of species of Diplostomum von Nordmann, 1832 in the eyes of ruffe Gymnocephalus cernua (Linnaeus). Invasion of the lenses appeared to be random, with the proportion of metacercariae in the left and right lenses showing the pattern expected by chance. However, analysis of counts of metacercariae from the humors, choroid and retina revealed asymmetry between eyes in 38% of host fish.
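The core test behind detecting side bias, an exact two-sided binomial test of left versus right counts within a host, can be sketched in a few lines. This is a generic implementation of the standard test, not ASPI's R code:

```python
import math

def binom_test_two_sided(k, n, p=0.5):
    """Exact two-sided binomial test: sum the probabilities of all
    outcomes no more likely than the observed count k."""
    pmf = lambda i: math.comb(n, i) * p**i * (1 - p)**(n - i)
    observed = pmf(k)
    return min(1.0, sum(pmf(i) for i in range(n + 1) if pmf(i) <= observed + 1e-12))

# Metacercariae counts in one host's left and right eye (illustrative values)
left, right = 12, 3
print(round(binom_test_two_sided(left, left + right), 4))  # 0.0352
```

Under the null of random invasion each parasite is equally likely to land on either side, so a small p-value for a host flags asymmetric infection; running the test per host is what distinguishes consistent bias (the same side favoured everywhere) from inconsistent bias.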
Romaniuk, Ryszard S.
This paper is the first part (out of five) of the research survey of WILGA Symposium work, May 2012 Edition, concerned with photonics and electronics applications in astronomy and space technologies. It presents a digest of chosen technical work results shown by young researchers from different technical universities from this country during the Jubilee XXXth SPIE-IEEE Wilga 2012, May Edition, symposium on Photonics and Web Engineering. Topical tracks of the symposium embraced, among others, nanomaterials and nanotechnologies for photonics, sensory and nonlinear optical fibers, object oriented design of hardware, photonic metrology, optoelectronics and photonics applications, photonics-electronics co-design, optoelectronic and electronic systems for astronomy and high energy physics experiments, and the development of the JET tokamak and pi-of-the-sky experiments. The symposium is an annual summary of the development of numerous Ph.D. theses carried out in this country in the area of advanced electronic and photonic systems. It is also a great occasion for SPIE, IEEE, OSA and PSP students to meet together in a large group spanning the whole country with guests from this part of Europe. A digest of Wilga references is presented [1-275].
Scarselli, Franco; Tsoi, Ah Chung; Hagenbuchner, Markus; Noi, Lucia Di
This paper proposes the combination of two state-of-the-art algorithms for processing graph input data, viz., the probabilistic mapping graph self organizing map, an unsupervised learning approach, and the graph neural network, a supervised learning approach. We organize these two algorithms in a cascade architecture containing a probabilistic mapping graph self organizing map, and a graph neural network. We show that this combined approach helps us to limit the long-term dependency problem that exists when training the graph neural network resulting in an overall improvement in performance. This is demonstrated in an application to a benchmark problem requiring the detection of spam in a relatively large set of web sites. It is found that the proposed method produces results which reach the state of the art when compared with some of the best results obtained by others using quite different approaches. A particular strength of our method is its applicability towards any input domain which can be represented as a graph.
Steeman, Gerald; Connell, Christopher
Many librarians may feel that dynamic Web pages are out of their reach, financially and technically. Yet we are reminded in library and Web design literature that static home pages are a thing of the past. This paper describes how librarians at the Institute for Defense Analyses (IDA) library developed a database-driven, dynamic intranet site using commercial off-the-shelf applications. Administrative issues include surveying a library users group for interest and needs evaluation; outlining metadata elements; and committing resources, from managing time to populate the database to training in Microsoft FrontPage and Web-to-database design. Technical issues covered include Microsoft Access database fundamentals and lessons learned in the Web-to-database process (including setting up Data Source Names (DSNs), redesigning queries to accommodate the Web interface, and understanding Access 97 query language vs. Structured Query Language (SQL)). This paper also offers tips on editing Active Server Pages (ASP) scripting to create desired results. A how-to annotated resource list closes out the paper.
The Automated Geospatial Watershed Assessment (AGWA) tool is a desktop application that uses widely available standardized spatial datasets to derive inputs for multi-scale hydrologic models (Miller et al., 2007). The required data sets include topography (DEM data), soils, clima...
Background The study of RNA has been dramatically improved by the introduction of Next Generation Sequencing platforms allowing massive and cheap sequencing of selected RNA fractions, also providing information on strand orientation (RNA-Seq). The complexity of transcriptomes and of their regulative pathways make RNA-Seq one of most complex field of NGS applications, addressing several aspects of the expression process (e.g. identification and quantification of expressed genes and transcripts, alternative splicing and polyadenylation, fusion genes and trans-splicing, post-transcriptional events, etc.). Moreover, the huge volume of data generated by NGS platforms introduces unprecedented computational and technological challenges to efficiently analyze and store sequence data and results. Methods In order to provide researchers with an effective and friendly resource for analyzing RNA-Seq data, we present here RAP (RNA-Seq Analysis Pipeline), a cloud computing web application implementing a complete but modular analysis workflow. This pipeline integrates both state-of-the-art bioinformatics tools for RNA-Seq analysis and in-house developed scripts to offer to the user a comprehensive strategy for data analysis. RAP is able to perform quality checks (adopting FastQC and NGS QC Toolkit), identify and quantify expressed genes and transcripts (with Tophat, Cufflinks and HTSeq), detect alternative splicing events (using SpliceTrap) and chimeric transcripts (with ChimeraScan). This pipeline is also able to identify splicing junctions and constitutive or alternative polyadenylation sites (implementing custom analysis modules) and call for statistically significant differences in genes and transcripts expression, splicing pattern and polyadenylation site usage (using Cuffdiff2 and DESeq). Results Through a user friendly web interface, the RAP workflow can be suitably customized by the user and it is automatically executed on our cloud computing environment. This strategy
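RAP's modular workflow, quality control followed by quantification and optional downstream modules, can be caricatured as a chain of pluggable steps. The step names and behavior below are purely illustrative stand-ins for the real tools (FastQC, Tophat, Cufflinks, etc.), not RAP's implementation:

```python
def run_pipeline(reads, steps):
    """Chain analysis steps: each step takes and returns a data dict,
    mirroring a modular RNA-Seq workflow."""
    data = {"reads": reads, "log": []}
    for name, step in steps:
        data = step(data)
        data["log"].append(name)
    return data

def quality_check(data):
    # Stand-in for FastQC/NGS QC Toolkit: drop reads shorter than 30 bp.
    data["reads"] = [r for r in data["reads"] if len(r) >= 30]
    return data

def quantify(data):
    # Stand-in for Tophat/Cufflinks/HTSeq: count surviving reads per gene.
    data["counts"] = {"geneA": len(data["reads"])}
    return data

result = run_pipeline(["A" * 50, "C" * 10],
                      [("qc", quality_check), ("quant", quantify)])
print(result["log"], result["counts"])  # ['qc', 'quant'] {'geneA': 1}
```

The design choice this sketch highlights is that each module only agrees on the shape of the shared data, so modules can be customized, reordered, or swapped, which is what lets a web front end assemble a per-user workflow before dispatching it to the cloud back end.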
Gonsenhauser, Blair; Hallarn, Rose; Carpenter, Daniel; Para, Michael F; Reider, Carson R
Participant accrual into research studies is critical to advancing clinical and translational research to clinical care. Without sufficient recruitment, the purpose of any research study cannot be realized; yet, low recruitment and enrollment of participants persist. StudySearch is a web-based application designed to provide an easily readable, publicly accessible, and searchable listing of IRB-approved protocols that are accruing study participants. The Regulatory, Recruitment and Biomedical Informatics Cores of the Center for Clinical and Translational Science (CCTS) at The Ohio State University developed this research study posting platform. Postings include basic descriptive information: study title, purpose of the study, eligibility criteria and study personnel contact information. Language concerning benefits and/or inducements is not included; therefore, while IRB approval for a study to be listed on StudySearch is required, IRB approval of the posted language is not. Studies are listed by one of two methods; one automated and one manual: (1). Studies registered on ClinicalTrials.gov are automatically downloaded once a month; or (2). Studies are submitted directly by researchers to the CCTS Regulatory Core staff. In either case, final language is a result of an iterative process between researchers and CCTS staff. Deployed in January 2011 at OSU, this application has grown to approximately 200 studies currently posted and 1500 unique visitors per month. Locally, StudySearch is part of the CCTS recruitment toolkit. Features continue to be modified to better accommodate user behaviors. Nationally, this open source application is available for use. PMID:26912012
Kerschke, Dorit; Schilling, Maik; Simon, Andreas; Wächter, Joachim
Miller, Christopher T.
This paper provides a review of literature that relates research on Carl Rogers' person-centered learning theory to Web-based learning. Based on the review of the literature, a set of criteria is described that can be used to determine how closely a Web-based course matches the different components of Rogers' person-centered learning theory. Using…
Wang, Shu-Ling; Lin, Sunny S. J.
Although the Web allows for flexible learning, research has found that online students tend to lack focus, willingness to participate, confidence, and discipline. This study thus attempts to promote Web-based self-regulated learning from the social cognitive perspective, which emphasizes the interactions among personal, behavioral, and…
Henke, Karen Greenwood
Web-based productivity tools help people collaborate, communicate, and save time--and the good news is most are free. This article presents five good strategies for using web-based productivity tools: (1) Clean out your inbox and focus your professional development with RSS; (2) Share your documents online with wikis or online productivity…
Dagiene, Valentina; Kurilovas, Eugenijus
The paper aims to analyse the external expert evaluation results of the eContentplus programme's iCOPER (Interoperable Content for Performance in a Competency-driven Society) project's deliverables, especially the quality control and Web 2.0 technologies report. It is a suitability report for better practice concerning the use of Web 2.0…
Roby, W.; Wu, X.; Ly, L.; Goldina, T.
Hester, Reid K; Delaney, Harold D; Campbell, William; Handmaker, Nancy
Eighty-four heavy drinkers who responded to a newspaper recruitment advertisement were randomly assigned to receive either (a) training in a Moderate Drinking protocol via an Internet-based program (www.moderatedrinking.com) and use of the online resources of Moderation Management (MM; www.moderation.org) or (b) use of the online resources of MM alone. Follow-ups are being conducted at 3, 6, and 12 months. Results of the recently completed 3-month follow-up (86% follow-up) indicated both groups significantly reduced their drinking based on these variables: standard drinks per week, percent days abstinent, and mean estimated blood alcohol concentration (BAC) per drinking day. Both groups also significantly reduced their alcohol-related problems. Relative to the control group, the experimental group had better outcomes on percent days abstinent and log drinks per drinking day. These short-term outcome data provide evidence for the effectiveness of both the Moderate Drinking Web application and of the resources available online at MM in helping heavy drinkers reduce their drinking and alcohol-related problems.
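The "mean estimated blood alcohol concentration per drinking day" outcome is typically computed with a Widmark-style formula. The sketch below uses common textbook approximation constants, which may differ from the study's exact calculation:

```python
def estimated_bac(std_drinks, weight_kg, hours, female=False):
    """Widmark-style blood alcohol estimate in g/dL. The constants are
    common textbook approximations, not the study's exact formula."""
    grams = std_drinks * 14.0        # one US standard drink ~ 14 g ethanol
    r = 0.55 if female else 0.68     # Widmark body-water distribution ratio
    beta = 0.015                     # elimination rate, g/dL per hour
    bac = grams / (weight_kg * 1000 * r) * 100 - beta * hours
    return max(bac, 0.0)

# Four standard drinks, 80 kg male, over two hours
print(round(estimated_bac(4, 80, 2), 3))  # 0.073
```

An outcome like "mean estimated BAC per drinking day" applies a formula of this shape to each self-reported drinking day and averages the results.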
Seren, Ümit; Vilhjálmsson, Bjarni J; Horton, Matthew W; Meng, Dazhe; Forai, Petar; Huang, Yu S; Long, Quan; Segura, Vincent; Nordborg, Magnus
Arabidopsis thaliana is an important model organism for understanding the genetics and molecular biology of plants. Its highly selfing nature, small size, short generation time, small genome size, and wide geographic distribution make it an ideal model organism for understanding natural variation. Genome-wide association studies (GWAS) have proven a useful technique for identifying genetic loci responsible for natural variation in A. thaliana. Previously genotyped accessions (natural inbred lines) can be grown in replicate under different conditions and phenotyped for different traits. These important features greatly simplify association mapping of traits and allow for systematic dissection of the genetics of natural variation by the entire A. thaliana community. To facilitate this, we present GWAPP, an interactive Web-based application for conducting GWAS in A. thaliana. Using an efficient implementation of a linear mixed model, traits measured for a subset of 1386 publicly available ecotypes can be uploaded and mapped with a mixed model and other methods in just a couple of minutes. GWAPP features an extensive, interactive, and user-friendly interface that includes interactive Manhattan plots and linkage disequilibrium plots. It also facilitates exploratory data analysis by implementing features such as the inclusion of candidate polymorphisms in the model as cofactors.
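At its simplest, the association scan such a tool performs regresses a phenotype on the minor-allele count at each SNP. The ordinary-least-squares sketch below omits the kinship correction that GWAPP's linear mixed model adds, and the data are toy values:

```python
def regress(x, y):
    """Ordinary least squares of y on x: returns (slope, r-squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy / sxx, sxy * sxy / (sxx * syy)

# Genotypes coded as minor-allele counts (0/1/2); toy phenotype values
snps = {"snp1": [0, 1, 2, 2, 1, 0], "snp2": [1, 1, 0, 1, 0, 1]}
trait = [1.0, 1.9, 3.1, 2.9, 2.1, 1.1]
for name, g in snps.items():
    beta, r2 = regress(g, trait)
    print(name, round(beta, 3), round(r2, 3))
```

Scanning every SNP this way produces the per-marker scores behind a Manhattan plot; the mixed model replaces plain least squares with a fit that accounts for relatedness among accessions, which is why it runs in "a couple of minutes" rather than instantly.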
Romaniuk, Ryszard S.
This paper is the fourth part (out of five) of the research survey of WILGA Symposium work, May 2012 Edition, concerned with Optoelectronic Devices, Sensors, Communication and Multimedia (Video and Audio) technologies. It presents a digest of chosen technical work results shown by young researchers from different technical universities from this country during the Jubilee XXXth SPIE-IEEE Wilga 2012, May Edition, symposium on Photonics and Web Engineering. Topical tracks of the symposium embraced, among others, nanomaterials and nanotechnologies for photonics, sensory and nonlinear optical fibers, object oriented design of hardware, photonic metrology, optoelectronics and photonics applications, photonics-electronics co-design, optoelectronic and electronic systems for astronomy and high energy physics experiments, and development of the JET tokamak and Pi of the Sky experiments. The symposium is an annual summary of the development of numerous Ph.D. theses carried out in this country in the area of advanced electronic and photonic systems. It is also a great occasion for SPIE, IEEE, OSA and PSP students to meet together in a large group spanning the whole country with guests from this part of Europe. A digest of Wilga references is presented [1-270].
Romaniuk, Ryszard S.
This paper is the third part (out of five) of the research survey of WILGA Symposium work, May 2012 Edition, concerned with Photon Physics and Plasma Research. It presents a digest of chosen technical work results shown by young researchers from different technical universities from this country during the Jubilee XXXth SPIE-IEEE Wilga 2012, May Edition, symposium on Photonics and Web Engineering. Topical tracks of the symposium embraced, among others, nanomaterials and nanotechnologies for photonics, sensory and nonlinear optical fibers, object oriented design of hardware, photonic metrology, optoelectronics and photonics applications, photonics-electronics co-design, optoelectronic and electronic systems for astronomy and high energy physics experiments, and development of the JET tokamak and Pi of the Sky experiments. The symposium is an annual summary of the development of numerous Ph.D. theses carried out in this country in the area of advanced electronic and photonic systems. It is also a great occasion for SPIE, IEEE, OSA and PSP students to meet together in a large group spanning the whole country with guests from this part of Europe. A digest of Wilga references is presented [1-270].
Romaniuk, Ryszard S.
The paper is the second part (out of five) of the research survey of WILGA Symposium work, May 2012 Edition, concerned with accelerator technology and high energy physics experiments. It presents a digest of chosen technical work results shown by young researchers from different technical universities from this country during the XXXth Jubilee SPIE-IEEE Wilga 2012, May Edition, symposium on Photonics and Web Engineering. Topical tracks of the symposium embraced, among others, nanomaterials and nanotechnologies for photonics, sensory and nonlinear optical fibers, object oriented design of hardware, photonic metrology, optoelectronics and photonics applications, photonics-electronics co-design, optoelectronic and electronic systems for astronomy and high energy physics experiments, and development of the JET and Pi of the Sky experiments. The symposium is an annual summary of the development of numerous Ph.D. theses carried out in this country in the area of advanced electronic and photonic systems. It is also a great occasion for SPIE, IEEE, OSA and PSP students to meet together in a large group spanning the whole country with guests from this part of Europe. A digest of Wilga references is presented [1-275].
Tsai, Yuan-Fan; Chan, Chun-Hsiang; Wang, Han; Pan, Yun-Xing; Lin, Gine-Jie
In recent years, global warming and climate anomalies have brought more frequent disasters such as floods and debris flows, which regularly cause loss of life and property. Meanwhile, the smartphone, an epoch-making invention, makes life more convenient by delivering large amounts of information instantly. This study uses Eclipse as the development platform to design an urban disaster information mobile application (APP) for debris flow and flood in the Taipei city area. The APP, Taipei Let You Know, was successfully developed under the Android development environment, combining disaster indicators with disaster warning values. To ameliorate the delay of official information, the APP not only shows official information but also offers a Web 2.0 platform through which public users can upload disaster information instantly. As a result, losses of life and property can decrease, and disaster information delivery can become faster and more accurate through use of this APP in the future.
Phan, Dung; Spangenberg, Douglas A.; Palikonda, Rabindra; Khaiyer, Mandana M.; Nordeen, Michele L.; Nguyen, Louis; Minnis, Patrick
The need for ready access to satellite data and associated physical parameters such as cloud properties has been steadily growing. Air traffic management, weather forecasters, energy producers, and weather and climate researchers among others can utilize more satellite information than in the past. Thus, it is essential that such data are made available in near real-time and as archival products in an easy-access and user-friendly environment. A host of Internet web sites currently provide a variety of satellite products for various applications. Each site has a unique contribution with appeal to a particular segment of the public and scientific community. This is no less true for the NASA Langley Clouds and Radiation (NLCR) website (http://www-pm.larc.nasa.gov) that has been evolving over the past 10 years to support a variety of research projects. This website was originally developed to display cloud products derived from the Geostationary Operational Environmental Satellite (GOES) over the Southern Great Plains for the Atmospheric Radiation Measurement (ARM) Program. It has evolved into a site providing a comprehensive database of near real-time and historical satellite products used for meteorological, aviation, and climate studies. To encourage the user community to take advantage of the site, this paper summarizes the various products and projects supported by the website and discusses future options for new datasets.
Yang, In Seok; Lee, Hwan Young; Yang, Woo Ick; Shin, Kyoung-Jin
Mitochondrial DNA (mtDNA) is a valuable tool in the fields of forensic, population, and medical genetics. However, recording and comparing mtDNA control region or entire genome sequences would be difficult if researchers are not familiar with mtDNA nomenclature conventions. Therefore, mtDNAprofiler, a Web application, was designed for the analysis and comparison of mtDNA sequences in a string format or as a list of mtDNA single-nucleotide polymorphisms (mtSNPs). mtDNAprofiler which comprises four mtDNA sequence-analysis tools (mtDNA nomenclature, mtDNA assembly, mtSNP conversion, and mtSNP concordance-check) supports not only the accurate analysis of mtDNA sequences via an automated nomenclature function, but also consistent management of mtSNP data via direct comparison and validity-check functions. Since mtDNAprofiler consists of four tools that are associated with key steps of mtDNA sequence analysis, mtDNAprofiler will be helpful for researchers working with mtDNA. mtDNAprofiler is freely available at http://mtprofiler.yonsei.ac.kr.
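The mtSNP concordance-check step described above can be sketched as a set comparison of variant profiles. The 'position + observed base' string format below is a simplification of real mtDNA nomenclature, and the function is an illustration, not mtDNAprofiler's implementation.

```python
def concordance_check(profile_a, profile_b):
    """Compare two mtDNA variant profiles (sets of strings such as
    '263G' or '16189C', i.e. position plus observed base relative to
    the rCRS reference) and report shared and discordant variants.
    A simplified sketch of the kind of check mtDNAprofiler performs;
    real nomenclature also handles insertions, deletions and
    heteroplasmy."""
    a, b = set(profile_a), set(profile_b)
    return {
        "shared": sorted(a & b),
        "only_in_a": sorted(a - b),
        "only_in_b": sorted(b - a),
    }

result = concordance_check(["73G", "263G", "16189C"],
                           ["73G", "263G", "16311C"])
print(result["shared"])       # ['263G', '73G']
```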
Panizzoni, Giulio; Debiasi, Alberto; Eccher, Matteo; De Amicis, Raffaele
Global warming and rapid climatic changes are producing dramatic effects on the coastal areas of Mediterranean countries. Italian coastal areas are among the most urbanized zones of south-western Europe, and the extensive use of soil is having a substantial impact on the hydrogeological context. Moreover, soil consumption, combined with extreme meteorological events, facilitates the occurrence of hazardous landslide events. Environmental policy makers and data managers in territorial planning need appropriate tools to face such emergencies. We present an application service whose aim is to advise users through environmental analysis of landslide and soil-consumption impact. The service also aims to improve the sharing of harmonized environmental datasets/metadata across different organizations and to create a collaborative environment where stakeholders and environmental experts can share their data and work cooperatively. We developed a set of processing services providing functionalities to assess the impact of landslides on the territory and the impact of land take and soil sealing. Among others, the service is able to evaluate the environmental impacts of landslide events on Cultural Heritage sites. We have also designed a 3D WebGL client customized to execute the processing services and visualize their outputs. It provides high usability in terms of navigation and data visualization. In this way the service provides not only a Spatial Data Infrastructure to access and visualize data but a complete Decision Support System for more effective environmental planning of coastal areas.
Romaniuk, Ryszard S.
This paper is the fifth part (out of five) of the research survey of WILGA Symposium work, May 2012 Edition, concerned with Biomedical, Artificial Intelligence and DNA Computing technologies. It presents a digest of chosen technical work results shown by young researchers from different technical universities from this country during the Jubilee XXXth SPIE-IEEE Wilga 2012, May Edition, symposium on Photonics and Web Engineering. Topical tracks of the symposium embraced, among others, nanomaterials and nanotechnologies for photonics, sensory and nonlinear optical fibers, object oriented design of hardware, photonic metrology, optoelectronics and photonics applications, photonics-electronics co-design, optoelectronic and electronic systems for astronomy and high energy physics experiments, and development of the JET tokamak and Pi of the Sky experiments. The symposium is an annual summary of the development of numerous Ph.D. theses carried out in this country in the area of advanced electronic and photonic systems. It is also a great occasion for SPIE, IEEE, OSA and PSP students to meet together in a large group spanning the whole country with guests from this part of Europe. A digest of Wilga references is presented [1-270].
Ahearn, Elizabeth A.; Ries, Kernell G.; Steeves, Peter A.
Introduction
An important mission of the U.S. Geological Survey (USGS) is to provide information on streamflow in the Nation's rivers. Streamflow statistics are used by water managers, engineers, scientists, and others to protect people and property during floods and droughts, and to manage land, water, and biological resources. Common uses for streamflow statistics include dam, bridge, and culvert design; water-supply planning and management; water-use appropriations and permitting; wastewater and industrial discharge permitting; hydropower-facility design and regulation; and flood-plain mapping for establishing flood-insurance rates and land-use zones. In an effort to improve access to published streamflow statistics, and to make the process of computing streamflow statistics for ungaged stream sites easier, more accurate, and more consistent, the USGS and the Environmental Systems Research Institute, Inc. (ESRI) developed StreamStats (Ries and others, 2004). StreamStats is a Geographic Information System (GIS)-based Web application for serving previously published streamflow statistics and basin characteristics for USGS data-collection stations, and computing streamflow statistics and basin characteristics for ungaged stream sites. The USGS, in cooperation with the Connecticut Department of Environmental Protection and the Connecticut Department of Transportation, has implemented StreamStats for Connecticut.
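StreamStats-style estimates for ungaged sites typically apply published log-linear regional regression equations to basin characteristics. The sketch below uses made-up coefficients for illustration — they are not Connecticut's actual regression equations.

```python
import math

def peak_flow_estimate(drainage_area_mi2, mean_basin_slope_pct, coeffs):
    """Apply a log-linear regional regression equation of the general
    form log10(Q) = b0 + b1*log10(A) + b2*log10(S), the shape commonly
    used by USGS regional studies for ungaged sites. The coefficients
    passed in below are made up for illustration; real values come
    from published USGS reports."""
    b0, b1, b2 = coeffs
    log_q = (b0
             + b1 * math.log10(drainage_area_mi2)
             + b2 * math.log10(mean_basin_slope_pct))
    return 10 ** log_q          # discharge in the equation's units

# Hypothetical coefficients for a peak-flow statistic
q = peak_flow_estimate(50.0, 12.0, coeffs=(1.5, 0.9, 0.3))
print(round(q))
```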
Stieger, Stefan; Burger, Christoph
Student ratings have been a controversial but important method for the improvement of teaching quality during the past several decades. Most universities rely on summative evaluations conducted at the end of a term or course. A formative approach in which each course unit is evaluated may be beneficial for students and teachers but has rarely been applied. This is most probably due to the time constraints associated with various procedures inherent in formative evaluation (numerous evaluations, high amounts of aggregated data, high administrative investment). In order to circumvent these disadvantages, we chose the Web 2.0 Internet application Twitter as evaluation tool and tested whether it is useful for the implementation of a formative evaluation. After a first pilot and subsequent experimental study, the following conclusions were drawn: First, the formative evaluation did not come to the same results as the summative evaluation at the end of term, suggesting that formative evaluations tap into different aspects of course evaluation than summative evaluations do. Second, the results from an offline (i.e., paper-pencil) summative evaluation were identical with those from an online summative evaluation of the same course conducted a week later. Third, the formative evaluation did not influence the ratings of the summative evaluation at the end of the term. All in all, we can conclude that Twitter is a useful tool for evaluating a course formatively (i.e., on a weekly basis). Because of Twitter's simple use and the electronic handling of data, the administrative effort remains small.
Chou, Pao-Nan; Chang, Chi-Cheng
This study examines the effects of reflection category and reflection quality on learning outcomes during a Web-based portfolio assessment process. Experimental subjects consist of forty-five eighth-grade students in a "Computer Application" course. Through the Web-based portfolio assessment system, these students write reflections and join…
Olyazadeh, Roya; Jaboyedoff, Michel; Sudmeier-Rieux, Karen; Derron, Marc-Henri; Devkota, Sanjaya
Nowadays, Free and Open Source Software (FOSS) plays an important role in better understanding and managing disaster risk reduction around the world. National and local governments, NGOs and other stakeholders are increasingly seeking and producing data on hazards. Most hazard event inventories and land use mapping are based on remote sensing data, with little ground truthing, creating difficulties depending on the terrain and accessibility. Open Source WebGIS tools offer an opportunity for quicker and easier ground truthing of critical areas in order to analyse hazard patterns and triggering factors. This study presents a secure mobile-map application for hazard event mapping using Open Source WebGIS technologies such as the Postgres database, PostGIS, Leaflet, Cordova and PhoneGap. The objectives of this prototype are: 1. an offline-online Android mobile application with advanced geospatial visualisation; 2. easy collection and storage of event information; 3. centralized data storage accessible by all services (smartphone, standard web browser); 4. improved data management through active participation in hazard event mapping and storage. This application has been implemented as a low-cost, rapid and participatory method for recording impacts from hazard events and includes geolocation (GPS data and Internet), visualizing maps with overlay of satellite images, viewing uploaded images and events as cluster points, drawing and adding event information. The data can be recorded offline (Android device) or online (all browsers) and consequently uploaded through the server whenever internet is available. All the events and records can be visualized by an administrator and made public after approval. Different user levels can be defined to access the data for communicating the information. This application was tested for landslides in post-earthquake Nepal but can be used for any other type of hazards such as flood, avalanche
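The offline-online recording pattern described above — store events locally, then flush them to the server when connectivity returns — can be sketched as follows. The upload callback and field names are assumptions for illustration, not the application's actual API; the real app persists to a device database and posts over HTTP.

```python
import json

class EventQueue:
    """Minimal sketch of offline-online hazard event recording: events
    are stored locally (a list standing in for the device database)
    and flushed to the server whenever connectivity returns."""

    def __init__(self, upload):
        self.pending = []
        self.upload = upload          # callable(event_json) -> bool

    def record(self, lat, lon, hazard_type, note=""):
        self.pending.append({"lat": lat, "lon": lon,
                             "type": hazard_type, "note": note})

    def sync(self):
        """Try to upload pending events; keep any that fail."""
        still_pending = [e for e in self.pending
                         if not self.upload(json.dumps(e))]
        self.pending = still_pending
        return len(self.pending)

sent = []
queue = EventQueue(upload=lambda payload: (sent.append(payload), True)[1])
queue.record(27.7, 85.3, "landslide", "road blocked")
queue.record(27.8, 85.4, "flood")
print(queue.sync(), len(sent))    # 0 2
```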
Ramousse, R; Davis, F
Data collection on the timing and duration of web building in orb-weaving spiders has been hampered by the absence of light during construction and by inconvenient hours. A simple apparatus is described which permits recording of the spiders' movements as they disturb an ultrasonic field. By varying the onset and length of dark periods for two animals at constant temperature, and by registering the building periods for 127 webs, a definite influence of the light-dark cycle can be identified: there is a strong preference for building webs in the dark; this is superimposed on the circadian rhythm of orb-web construction. One of the spiders always built earlier than the other.
Qiao, Liang; Li, Ying; Chen, Xin; Yang, Sheng; Gao, Peng; Liu, Hongjun; Feng, Zhengquan; Nian, Yongjian; Qiu, Mingguo
Garbow, Z. A.; Olson, N. R.; Yuen, D. A.; Boggs, J. M.
Current advances in computer hardware, information technology and data collection techniques have produced very large data sets, sometimes more than terabytes, in a wide variety of scientific and engineering disciplines. We must harness this opportunity to visualize and extract useful information from geophysical and geological data. We have taken on the task of data mining by using a map-like approach over the web for interrogating these humongous data sets, using a client-server paradigm. The spatial data is mapped onto a two-dimensional grid from which the user (client) can query the data with the map interface as a user extension. The data is stored on a high-end compute server. The computational gateway separating the client and the server can be the front end of an electronic publication, electronic classroom, a Grid system device or e-business. We have used a combination of Java, Java 3D and Perl for processing the data and communicating it between the client and the server. The user can interrogate the geospatial data over any particular region with arbitrary length scales and pose relevant statistical questions, such as histogram plots and local statistics. We have applied this method to the following data sets: (1) distribution of prime numbers, (2) two-dimensional mantle convection, (3) three-dimensional mantle convection, (4) high-resolution satellite reflectance data over the Upper Midwest for multiple wavelengths, and (5) molecular dynamics describing the flow of blood in narrow vessels. Using this map-interface concept, the user can interrogate these data over the web. This strategy for dissecting large data sets can be easily applied to other areas, such as satellite geodesy and earthquake data. This mode of data query may function in an adequately covered wireless web environment with a transfer rate of around 10 Mbit/s.
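The map-interface queries described above — local statistics and histograms over a user-selected region of gridded data — can be sketched in a few lines. The grid here is a toy stand-in for the geophysical data sets listed.

```python
import numpy as np

def region_stats(grid, row0, row1, col0, col1, bins=5):
    """Interrogate a rectangular window of a 2-D gridded data set, as
    the map interface lets a client do: return local statistics and a
    histogram for the selected region."""
    window = grid[row0:row1, col0:col1]
    hist, _edges = np.histogram(window, bins=bins)
    return {"mean": float(window.mean()),
            "std": float(window.std()),
            "histogram": hist.tolist()}

grid = np.arange(100.0).reshape(10, 10)   # toy stand-in for geospatial data
stats = region_stats(grid, 2, 5, 2, 5)
print(stats["mean"])    # 33.0
```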
Abad-Mota, S.; Ruckhaus, E.; Garboza, A.; Tepedino, G.
aggregation, hourly, daily, monthly, so that they can be provided to the user at the desired level. This means that additional caution has to be exercised in query answering, in order to distinguish between primary and derived data. On the other hand, a Web 2.0 application is being designed to provide a front-end to the repository. This design focuses on two important aspects: the use of metadata structures, and the definition of collaborative Web 2.0 features that can be integrated into a project of this nature. Metadata descriptors include, for a set of measurements, its quality, granularity and other dimension information. With these descriptors it is possible to establish relationships between different sets of measurements and provide scientists with efficient search mechanisms that determine the related sets of measurements that contribute to a query answer. Unlike traditional applications for climatic data, our approach not only satisfies the requirements of researchers specialized in this domain, but also those of anyone interested in this area; one of the objectives is to build an informal knowledge base that can be improved and consolidated with the usage of the system.
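Rolling primary measurements up to hourly, daily or monthly granularity while flagging the result as derived — so query answering can tell the two apart — can be sketched as follows. The record layout is illustrative, not the repository's actual schema, and the 720-hour "month" is a simplifying assumption.

```python
from statistics import mean

def aggregate(measurements, level):
    """Roll primary hourly measurements up to a coarser granularity.
    Each measurement is (timestamp_hour, value); every output record
    is marked 'derived' so that query answering can distinguish it
    from primary data. A toy sketch: a 'month' is approximated as
    720 hours (30 days)."""
    hours_per_bucket = {"hourly": 1, "daily": 24, "monthly": 720}[level]
    buckets = {}
    for hour, value in measurements:
        buckets.setdefault(hour // hours_per_bucket, []).append(value)
    return [{"bucket": b, "value": mean(vs), "kind": "derived"}
            for b, vs in sorted(buckets.items())]

hourly = [(h, 20.0 + (h % 24)) for h in range(48)]   # two days of readings
daily = aggregate(hourly, "daily")
print([d["value"] for d in daily])   # [31.5, 31.5]
```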
Zhang, De-gan; Zhang, Xiao-dan
With the growth of the amount of information manipulated by embedded application systems, which are embedded into devices and offer access to those devices over the internet, it becomes necessary to save information in a systematic way so as to serve client access and local processing more efficiently. To support mobile applications, a design and implementation solution for an embedded un-interruptible power supply (UPS) system (in brief, EUPSS) is put forward for long-distance, Web-based monitoring and control of the UPS. The implementation of the system is based on ATmega161, RTL8019AS and ARM chips with the TCP/IP protocol suite for communication. In the embedded UPS system, an embedded file system is designed and implemented which saves data and index information on a serial EEPROM chip in a structured way and communicates with a microcontroller unit through the I2C bus. By embedding the file system into the UPS system or other information appliances, users can access and manipulate local data on the web client side. Embedded file systems on chips will play a major role in the growth of IP networking. Based on our experimental tests, mobile users can easily monitor and control the UPS from distant locations. The performance of EUPSS satisfies the requirements of all kinds of Web-based mobile applications.
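The structured storage of data plus index information on a serial EEPROM can be illustrated with a fixed-size index layout: one entry per record, each naming the record and giving its offset and length. The field widths below are assumptions for illustration, since the abstract does not specify the actual EUPSS format.

```python
import struct

# Sketch of a structured EEPROM layout: a fixed-size index entry per
# record (name, offset, length) followed elsewhere by the raw data.
# Field sizes are assumptions; the real EUPSS layout is not given.
INDEX_ENTRY = struct.Struct("<8sHH")   # name (8 bytes), offset, length

def pack_index(entries):
    """Serialize (name, offset, length) tuples as they would be
    written byte-for-byte to the serial EEPROM over I2C."""
    return b"".join(INDEX_ENTRY.pack(name.ljust(8, b"\0"), off, ln)
                    for name, off, ln in entries)

def unpack_index(blob):
    """Read the index back from a raw EEPROM dump."""
    out = []
    for i in range(0, len(blob), INDEX_ENTRY.size):
        name, off, ln = INDEX_ENTRY.unpack_from(blob, i)
        out.append((name.rstrip(b"\0"), off, ln))
    return out

blob = pack_index([(b"status", 64, 32), (b"log", 96, 128)])
print(unpack_index(blob))   # [(b'status', 64, 32), (b'log', 96, 128)]
```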
The ECOTOX (ECOTOXicology Database) system developed by the USEPA, National Health and Environmental Effects Research Laboratory (NHEERL), Mid-Continent Ecology Division in Duluth, MN (MED-Duluth), provides a web browser search interface for locating aquatic and terrestrial toxic...
Okladnikov, I.; Gordov, E. P.; Titov, A. G.; Bogomolov, V. Y.; Genina, E.; Martynova, Y.; Shulgina, T. M.
Teixeira, Leonor; Saavedra, Vasco; Ferreira, Carlos; Sousa Santos, Beatriz
Modern methods of information and communication that use web technologies provide an opportunity to facilitate closer communication between patients and healthcare providers, allowing a joint management of chronic diseases. This paper describes a web-based technological solution to support the management of inherited bleeding disorders integrating, diffusing and archiving large sets of data relating to the clinical practice of hemophilia care, more specifically the clinical practice at the Hematology Service of Coimbra Hospital Center (a Hemophilia Treatment Center located in Portugal).
Machado, Catia M; Rebholz-Schuhmann, Dietrich; Freitas, Ana T; Couto, Francisco M
Semantic web technologies offer an approach to data integration and sharing, even for resources developed independently or broadly distributed across the web. This approach is particularly suitable for scientific domains that profit from large amounts of data that reside in the public domain and that have to be exploited in combination. Translational medicine is such a domain, which in addition has to integrate private data from the clinical domain with proprietary data from the pharmaceutical domain. In this survey, we present the results of our analysis of translational medicine solutions that follow a semantic web approach. We assessed these solutions in terms of their target medical use case; the resources covered to achieve their objectives; and their use of existing semantic web resources for the purposes of data sharing, data interoperability and knowledge discovery. The semantic web technologies seem to fulfill their role in facilitating the integration and exploration of data from disparate sources, but it is also clear that simply using them is not enough. It is fundamental to reuse resources, to define mappings between resources, to share data and knowledge. All these aspects allow the instantiation of translational medicine at the semantic web-scale, thus resulting in a network of solutions that can share resources for a faster transfer of new scientific results into the clinical practice. The envisioned network of translational medicine solutions is on its way, but it still requires resolving the challenges of sharing protected data and of integrating semantic-driven technologies into the clinical practice.
Bernier, Eveline; Gosselin, Pierre; Badard, Thierry; Bédard, Yvan
Background
Climate change has a significant impact on population health. Population vulnerabilities depend on several determinants of different types, including biological, psychological, environmental, social and economic ones. Surveillance of climate-related health vulnerabilities must take into account these different factors, their interdependence, as well as their inherent spatial and temporal aspects on several scales, for informed analyses. Currently used technology includes commercial off-the-shelf Geographic Information Systems (GIS) and Database Management Systems with spatial extensions. It has been widely recognized that such OLTP (On-Line Transaction Processing) systems were not designed to support complex, multi-temporal and multi-scale analysis as required above. On-Line Analytical Processing (OLAP) is central to the field known as BI (Business Intelligence), a key field for such decision-support systems. In the last few years, we have seen a few projects that combine OLAP and GIS to improve spatio-temporal analysis and geographic knowledge discovery. This has given rise to SOLAP (Spatial OLAP) and a new research area. This paper presents how SOLAP and climate-related health vulnerability data were investigated and combined to facilitate surveillance.
Results
Based on recent spatial decision-support technologies, this paper presents a spatio-temporal web-based application that goes beyond GIS applications with regard to speed, ease of use, and interactive analysis capabilities. It supports the multi-scale exploration and analysis of integrated socio-economic, health and environmental geospatial data over several periods. This project was meant to validate the potential of recent technologies to contribute to a better understanding of the interactions between public health and climate change, and to facilitate future decision-making by public health agencies and municipalities in Canada and elsewhere. The project also aimed at integrating an initial
Helm, C. W.; Sparks, W.; Levene, J.; Hostetler, M.
The National Renewable Energy Laboratory in Golden, CO has developed a software platform that provides for the development of fully customized and unique web mapping applications that reuse a common base of software code. The application capabilities that have been developed within this platform include spatial data visualization, large-scale data retrieval and the analysis of various renewable energy resource data-sets. The platform consists of three primary components of reusable code: the back-end data storage and retrieval engine, a user-customizable Data Styling Engine, and front-end user interface code. Each component of the platform represents a reusable code base from which new applications can be generated with a minimal amount of new code. This reusable code base can be thought of in the same vein as object-oriented development: the reusable code is analogous to a base class that specific applications inherit from and extend. The architecture was motivated by a requirement to rapidly develop and deploy multiple web-based mapping applications for varying renewable energy and alternative fuel technologies, and for different customers. It was observed that these applications share a significant set of core features and functionality, with varying degrees of customization required for each application. A series of needs instigated the development of the architecture:
* New applications should not require re-implementation of existing functionality (either through re-coding or "copy and paste" reuse)
* Enhancements to the base functionality could automatically propagate through all derived applications
* All applications should be able to utilize a common, internal (to NREL) Web Mapping Service (WMS), or any external WMS
* The framework must support user authentication, role-based access control to specific data layers, and user customization of layer styling. This requirement led to the development of the Data Styling Engine.
* A developer should be able to
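The base-class analogy above maps directly onto ordinary inheritance: a derived application overrides only what differs, and enhancements to the base propagate automatically. The class names and WMS URL below are hypothetical illustrations, not NREL's actual code.

```python
class MappingApplication:
    """Base class capturing the shared platform functionality the
    abstract describes; concrete applications inherit from it and
    override only what differs."""
    wms_url = "https://example.org/wms"     # hypothetical default WMS

    def layers(self):
        return ["basemap"]

    def render(self):
        # Any enhancement made here is inherited by all derived apps.
        return f"{self.wms_url} -> {', '.join(self.layers())}"

class SolarResourceApp(MappingApplication):
    """A derived application adds its own data layers while reusing
    the base engine unchanged."""
    def layers(self):
        return super().layers() + ["solar_irradiance"]

print(SolarResourceApp().render())
```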
Li, Xiaojuan; Xing, Yu; Zheng, Hengchao
This paper presents a method of setting up a Web system for the Integrated Supervision Control System to meet the requirements of city rail transit. First, the basic platform and the software/hardware architecture of the Web system are discussed. Then the function modules, data flow and communication mechanisms are described; a design based on SVG and Ajax technologies is proposed; and the Web video release function and system security are described. This design makes it possible for important information from the Integrated Supervision Control System to be browsed and queried in external Web pages. Watching real-time images from all cameras in the internal network of the rail transit system is possible, providing remote viewing and management functions for metro managers.
Panozzo, Silvia; Colauzzi, Michele; Scarabel, Laura; Collavo, Alberto; Rosan, Valentina; Sattin, Maurizio
Herbicides are the major weed control tool in most cropping systems worldwide. However, the high reliance on herbicides has led to environmental issues as well as to the evolution of herbicide-resistant biotypes. Resistance is a major concern in modern agriculture and early detection of resistant biotypes is therefore crucial for its management and prevention. In this context, a timely update of resistant biotype distribution is fundamental to devise and implement efficient resistance management strategies. Here we present an innovative web-based application called iMAR (interactive MApping of Resistance) for the mapping of herbicide-resistant biotypes. It is based on open source software tools and translates into maps the data reported in the GIRE (Italian herbicide resistance working group) database of herbicide resistance at national level. iMAR allows an automatic, easy and cost-effective updating of the maps and provides two different systems, "static" and "dynamic". In the first one, the user choices are guided by a hierarchical tree menu, whereas the latter is more flexible and includes multiple choice criteria (type of resistance, weed species, region, cropping systems) that permit customized maps to be created. The generated information can be useful to various stakeholders who are involved in weed resistance management: farmers, advisors, national and local decision makers as well as the agrochemical industry. iMAR is freely available, and the system has the potential to handle large datasets and to be used for other purposes with geographical implications, such as the mapping of invasive plants or pests.
Deodhar, Suruchi; Bisset, Keith; Chen, Jiangzhuo; Barrett, Chris; Wilson, Mandy; Marathe, Madhav
Public health decision makers need access to high resolution situation assessment tools for understanding the extent of various epidemics in different regions of the world. In addition, they need insights into the future course of epidemics by way of forecasts. Such forecasts are essential for planning the allocation of limited resources and for implementing several policy-level and behavioral intervention strategies. The need for such forecasting systems became evident in the wake of the recent Ebola outbreak in West Africa. We have developed EpiCaster, an integrated Web application for situation assessment and forecasting of various epidemics, such as Flu and Ebola, that are prevalent in different regions of the world. Using EpiCaster, users can assess the magnitude and severity of different epidemics at highly resolved spatio-temporal levels. EpiCaster provides time-varying heat maps and graphical plots to view trends in the disease dynamics. EpiCaster also allows users to visualize data gathered through surveillance mechanisms, such as Google Flu Trends (GFT) and the World Health Organization (WHO). The forecasts provided by EpiCaster are generated using different epidemiological models, and the users can select the models through the interface to filter the corresponding forecasts. EpiCaster also allows the users to study epidemic propagation in the presence of a number of intervention strategies specific to certain diseases. Here we describe the modeling techniques, methodologies and computational infrastructure that EpiCaster relies on to support large-scale predictive analytics for situation assessment and forecasting of global epidemics. PMID:27796009
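EpiCaster's epidemiological models are not specified in the abstract; a minimal discrete-time SIR (susceptible-infectious-recovered) model illustrates the kind of compartmental forecast such systems generate. Parameter values below are arbitrary examples.

```python
def sir_forecast(s, i, r, beta, gamma, days):
    """Discrete-time SIR epidemic model: a minimal stand-in for the
    (unspecified) models EpiCaster uses for forecasting. State values
    are population fractions; beta is the transmission rate and gamma
    the recovery rate per day."""
    trajectory = [(s, i, r)]
    for _ in range(days):
        new_infections = beta * s * i
        recoveries = gamma * i
        s = s - new_infections
        i = i + new_infections - recoveries
        r = r + recoveries
        trajectory.append((s, i, r))
    return trajectory

# Example run: 1% initially infected, basic reproduction number ~3
traj = sir_forecast(s=0.99, i=0.01, r=0.0, beta=0.3, gamma=0.1, days=100)
peak_day = max(range(len(traj)), key=lambda d: traj[d][1])
print(peak_day)
```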
Panozzo, Silvia; Colauzzi, Michele; Scarabel, Laura; Collavo, Alberto; Rosan, Valentina; Sattin, Maurizio
Herbicides are the major weed control tool in most cropping systems worldwide. However, the high reliance on herbicides has led to environmental issues as well as to the evolution of herbicide-resistant biotypes. Resistance is a major concern in modern agriculture and early detection of resistant biotypes is therefore crucial for its management and prevention. In this context, a timely update of resistance biotypes distribution is fundamental to devise and implement efficient resistance management strategies. Here we present an innovative web-based application called iMAR (interactive MApping of Resistance) for the mapping of herbicide resistant biotypes. It is based on open source software tools and translates into maps the data reported in the GIRE (Italian herbicide resistance working group) database of herbicide resistance at national level. iMAR allows an automatic, easy and cost-effective updating of the maps a nd provides two different systems, “static” and “dynamic”. In the first one, the user choices are guided by a hierarchical tree menu, whereas the latter is more flexible and includes a multiple choice criteria (type of resistance, weed species, region, cropping systems) that permits customized maps to be created. The generated information can be useful to various stakeholders who are involved in weed resistance management: farmers, advisors, national and local decision makers as well as the agrochemical industry. iMAR is freely available, and the system has the potential to handle large datasets and to be used for other purposes with geographical implications, such as the mapping of invasive plants or pests. PMID:26266545
Bruno, Andrew E.; Soares, Alexei S.; Owen, Robin L.; Snell, Edward H.
Haptic interfaces have become common in consumer electronics. They enable easy interaction and information entry without the use of a mouse or keyboard. Our work illustrates the application of a haptic interface to crystallization screening in order to provide a natural means for visualizing and selecting results. By linking this to a cloud-based database and web-based application program interface, the same application shifts the approach from 'point and click' to 'touch and share', where results can be selected, annotated and discussed collaboratively. Furthermore, in the crystallographic application, given a suitable crystallization plate, beamline and robotic end effector, the resulting information can be used to close the loop between screening and X-ray analysis, allowing a direct and efficient 'screen to beam' approach. The application is not limited to the area of crystallization screening; 'touch and share' can be used by any information-rich scientific analysis and geographically distributed collaboration.
Kukafka, R; Lussier, Y A; Patel, V L; Cimino, J J
This paper describes how theory facilitated the development of educational content for the MI-HEART project, a tailored Web-based intervention designed to favorably influence the appropriateness and rapidity of decision-making in patients suffering from symptoms of acute myocardial infarction. There were five steps involved: 1) formulating the behavioral goal, 2) defining intervention objectives based on an analysis of the determinants of behavior, 3) developing an assessment tool to measure a person's status on these determinants, 4) creating tailored content that addresses individual variation in the determinants of the health behavior, and 5) developing algorithms and a computer program that link responses from the assessment to specific tailored communication. The approach we describe largely distinguishes Web-based applications that are designed to change health behavior from those that simply impart information. Developers of Web-based applications that propose to improve health status by modifying health-related behaviors need to understand that, although we are said to live in an "information age", simply increasing knowledge has not been effective in changing behaviors in most instances. Furthermore, a one-size-fits-all approach to developing educational content cannot address the needs, concerns and interests of different individuals. With informatics technology, our ability to collect information from individuals and provide educational content tailored to the specific information collected is not only possible, but practical.
The Alpha Jet Atmospheric eXperiment (AJAX) has been measuring atmospheric ozone, carbon dioxide, methane and meteorological parameters from near the surface to 8000 m since January 2011. The main goals are to study photochemical ozone production and the impacts of extreme events on western US air quality, provide data to support satellite observations, and aid in the quantification of emission sources, e.g. wildfires, urban outflow, dairies, and oil and gas operations. The aircraft is based at Moffett Field and flies multiple times a month to sample vertical profiles at selected sites in California and Nevada, providing long-term data records at these sites. AJAX is also uniquely positioned to launch short-notice sampling flights in rapid response to extreme events, e.g. the 2013 Yosemite Rim fire. This talk will focus on the impacts of vertical transport on surface air quality, and on the investigation of emissions from dairies and wildfires.
Gkatzoflias, Dimitrios; Mellios, Giorgos; Samaras, Zissis
Newman, R. L.; Clark, A.; Trabant, C. M.; Karstens, R.; Hutko, A. R.; Casey, R. E.; Ahern, T. K.
Since 2001, the IRIS Data Management Center (DMC) WILBER II system has provided a convenient web-based interface for locating seismic data related to a particular event and requesting a subset of that data for download. Since its launch, both the scale of available data and the technology of web-based applications have developed significantly. Wilber 3 is a ground-up redesign that leverages a number of public and open-source projects to provide an event-oriented data request interface with a high level of interactivity and scalability for multiple data types. Wilber 3 uses the IRIS/Federation of Digital Seismograph Networks (FDSN) web services for event data, metadata, and time-series data. Combining a carefully optimized Google Map with the highly scalable SlickGrid data API, the Wilber 3 client-side interface can load tens of thousands of events or networks/stations in a single request, and provide instantly responsive browsing, sorting, and filtering of event data and metadata in the web browser, without further reliance on the data service. The server side of Wilber 3 is a Python/Django application, one of over a dozen developed in the last year at IRIS, whose common framework, components, and administrative overhead represent a massive savings in developer resources. Requests for assembled datasets, which may include thousands of data channels and gigabytes of data, are queued and executed using the Celery distributed Python task scheduler, giving Wilber 3 the ability to operate in parallel across a large number of nodes.
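The FDSN event web service that systems like Wilber 3 query accepts its catalog filters as URL parameters. A minimal sketch of assembling such a request URL, offline (the helper name is ours; the endpoint is the public IRIS fdsnws-event service and the parameter names follow the FDSN web-service specification):

```python
from urllib.parse import urlencode

FDSN_EVENT = "http://service.iris.edu/fdsnws/event/1/query"

def event_query_url(start, end, min_mag, fmt="text"):
    """Build an FDSN event-service request URL for an event catalog.

    No network access is performed here; the caller would fetch the URL.
    """
    params = {"starttime": start, "endtime": end,
              "minmagnitude": min_mag, "format": fmt}
    return FDSN_EVENT + "?" + urlencode(params)
```

For example, `event_query_url("2020-01-01", "2020-02-01", 6.0)` yields a query for all magnitude ≥ 6.0 events in January 2020.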
Tsarouchas, C.; Schlenker, S.; Dimitrov, G.; Jahn, G.
WebQuest is a popular inquiry-oriented activity in which learners use Web resources. Since the creation of the innovation, almost 15 years ago, the Web has changed significantly, while the WebQuest technique has changed little. This article examines possible applications of new Web trends on WebQuest instructional strategy. Some possible…
Atakan, K.; Sebrier, M.; Camelbeeck, T.; Siame, L.; Valensise, G.; Winter, T.
Recognition of active faults, particularly in low-seismicity regions such as Western Europe, has been a subject puzzling seismologists for many years. These regions are generally characterized by low hazard but high risk, due to the concentration of human and material assets with high vulnerability. Detecting tectonic deformations that may lead to destructive earthquakes in such areas requires innovative research strategies suited to the climate, the slowly deforming faults, and the heavily human-modified landscapes. The variety and amount of information involved in the characterization of slowly deforming faults are in general disseminated across several institutions with no easy access. This information should be gathered, parameterized and stored in a way that makes it usable in seismic hazard studies. To this end, within the framework of the European project SAFE (Slow Active Faults in Europe; EVG1-2000-22005), a Web-based application (SAFE-Tools) for diagnosing slow active faults has been developed. The basic design of SAFE-Tools (SAFE-T) is based on a server-client architecture, with data communication and visualization occurring through the Internet. The system is developed using the Java programming language and operates through an Internet browser. SAFE-T handles both parametric and graphical (image) data, with display and manipulation of pre-prepared data sets from a relational database and interactive processing, all conducted through applets. A distributed database structure is developed, opening the possibility of a network of interconnected servers. Layers of graphical data (e.g. geological maps, DEM images, etc.) and sets of parametric data (e.g. historical or instrumental earthquake catalogues) are entered into the system either through an interactive process using HTML forms, or as a bulk entry. Data are stored as geographical co-ordinate points with different attributes in the relational database. For identification of active faults
Masiello, I.; Ramberg, R.; Lonka, K.
Computer-based systems have great potential for delivering learning material. Here, a Web-based learning management system is employed by a medical university to support undergraduate courses. The objective was to help the university's staff to understand the readiness and attitudes of students to the use of information technology, their…
Chen, Jun; Wang, Zu-Yuan; Wu, Yuren
Purpose: The purpose of this paper is to introduce some new functions achieved in a web-based multimedia courseware, which is developed by Flash software and used by part-time graduate students. Design/methodology/approach: The courseware uses Adobe Flash CS3 as its development software, which supports Actionscript language, FMS and FLV technology…
Fernandez, Jose Maria Perez
Describes how the Internet was used in an English class for architecture and construction students at the University of Granada (Spain). Discusses course organization; links to construction company Web sites; active learning; group work; student presentations; student autonomy and student motivation; and problems with plagiarism. (LRW)
Ozmutlu, H. Cenk; Spink, Amanda; Ozmutlu, Seda
Discusses the need for tools that allow effective analysis of search engine queries to provide a greater understanding of Web users' information seeking behavior and describes a study that developed an effective strategy for selecting samples from large-scale data sets. Reports on Poisson sampling with data logs from the Excite search engine.…
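Poisson sampling of a query log can be sketched in a few lines: each record is included independently with a fixed probability, so a representative sample can be drawn in a single sequential pass over a data set too large to hold in memory. The function below is an illustrative sketch, not the study's actual code.

```python
import random

def poisson_sample(records, rate, seed=0):
    """Poisson sampling of a (possibly huge) sequential log.

    Each record is included independently with probability `rate`,
    so the expected sample size is rate * len(records) and the pass
    never needs the whole data set in memory at once.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [rec for rec in records if rng.random() < rate]
```

Because inclusions are independent, the sample size is binomially distributed around the expected value rather than exact — the trade-off that makes the method cheap on large-scale logs.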
Gholami, Khalil; Sayadi, Yaser
This paper addresses the faculty perception on web-based instruction in order to explain the nature of learning and instruction in this setting. Using a mixed method approach, the research studied a sample of 132 University Faculty (lecturers and professors) in University of Kurdistan. The research tools were interview and questionnaire. The…
A new web primer design program, BatchPrimer3, is developed based on Primer3. BatchPrimer3 adopted the Primer3 core program as a major primer design engine to choose the best primer pairs. A new score-based primer picking module is incorporated into BatchPrimer3 and used to pick position-restricte...
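BatchPrimer3's internal scoring module is not detailed here; as a hedged illustration of score-based primer picking, the sketch below rates candidate primers by deviation from a target melting temperature (Wallace rule, valid only for short oligos) and from a target GC fraction. Function names and penalty weightings are illustrative, not Primer3's.

```python
def wallace_tm(primer):
    """Wallace-rule melting temperature: 2*(A+T) + 4*(G+C) degrees C."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

def score_primer(primer, target_tm=60.0, target_gc=0.5):
    """Lower score = better candidate.

    Penalizes deviation from the target Tm (degrees) and from the
    target GC fraction (scaled so both terms are comparable).
    """
    p = primer.upper()
    gc_frac = (p.count("G") + p.count("C")) / len(p)
    return abs(wallace_tm(primer) - target_tm) + 100 * abs(gc_frac - target_gc)
```

A picker would score every candidate pair and keep the lowest-scoring pairs, optionally after position-restricted filtering as the abstract describes.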
Papageorgiou, Elpiniki I; Huszka, Csaba; De Roo, Jos; Douali, Nassim; Jaulent, Marie-Christine; Colaert, Dirk
This study aimed to focus on medical knowledge representation and reasoning using probabilistic and fuzzy influence processes, implemented in the semantic web, for decision support tasks. Bayesian belief networks (BBNs) and fuzzy cognitive maps (FCMs), as dynamic influence graphs, were applied to handle the task of medical knowledge formalization for decision support. In order to perform reasoning on these knowledge models, a general-purpose reasoning engine, EYE, with the necessary plug-ins was developed in the semantic web. The two formal approaches constitute the proposed decision support system (DSS), which aims to recognize the appropriate guidelines for a medical problem and to propose easily understandable courses of action to guide practitioners. The urinary tract infection (UTI) problem was selected as the proof-of-concept example to examine the proposed formalization techniques implemented in the semantic web. The medical guidelines for UTI treatment were formalized into BBN and FCM knowledge models. To assess the formal models' performance, 55 patient cases were extracted from a database and analyzed. The results showed that the suggested approaches formalized medical knowledge efficiently in the semantic web, and gave a front-end decision on antibiotic suggestions for UTI.
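The FCM half of such a system can be sketched as an iterative activation update: each concept's next value is a sigmoid of its current value plus the weighted causal influence of the other concepts. The code below is a generic FCM inference sketch, not the paper's UTI model.

```python
import math

def fcm_step(state, weights):
    """One fuzzy-cognitive-map inference step.

    state      -- list of concept activations in (0, 1)
    weights[i][j] -- causal influence of concept i on concept j
    Each new activation is a sigmoid of the concept's own value plus
    the weighted sum of incoming influences.
    """
    n = len(state)
    new = []
    for j in range(n):
        total = state[j] + sum(state[i] * weights[i][j] for i in range(n))
        new.append(1.0 / (1.0 + math.exp(-total)))  # sigmoid squashing
    return new

def fcm_run(state, weights, iters=20):
    """Iterate until (in practice) the map settles to a fixed point."""
    for _ in range(iters):
        state = fcm_step(state, weights)
    return state
```

In a guideline model, the converged activations of "decision" concepts (e.g. a particular antibiotic) would be read off as the system's recommendation strength.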
Del Fabro, Marcos Didonet; de Alimeda, Eduardo Cunha; Sluzarski, Fabiano
Teaching web development in Computer Science undergraduate courses is a difficult task. Often, there is a gap between the students' experiences and the reality in the industry. As a consequence, the students are not always well-prepared once they get the degree. This gap is due to several reasons, such as the complexity of the assignments, the…
Otamendi, Francisco Javier; Doncel, Luis Miguel
Experimental teaching in general, and simulation in particular, have primarily been used in lecture rooms but in the future must also be adapted to e-learning. The integration of web simulators into virtual learning environments, coupled with specific supporting video documentation and the use of videoconference tools, results in robust…
Liou, C.; Hulbert, S.
We present the architecture, design, and implementation details of the ADASS XII web site. The web site was implemented in Zope, a high-performance application server, web server, and content management system rolled into one. Zope includes a robust, scalable object database, web services architecture, and powerful programming capabilities. The web site was built to conform to HTML, CSS, and accessibility standards as adopted by the W3C. This dynamic web site also taps into a back-end Sybase database while requiring a minimal amount of coding. We offer this site as a prototype web site suitable for reuse in supporting future ADASS meetings.
Albeke, S. E.; Perkins, D. G.; Ewers, S. L.; Ewers, B. E.; Holbrook, W. S.; Miller, S. N.
Mutiara Yoga Asmarani Suci, Agisha; Sukaesih Sitanggang, Imas
Outlier analysis of hotspot data, as an indicator of fire occurrences in Riau Province between 2001 and 2012, has been done previously, but it was of limited help in fire prevention efforts because the results could only be used by certain people and could not be easily and quickly accessed by users. The purpose of this research is to create a web-based application to detect outliers in hotspot data and to visualize the outliers by time and location. Outlier detection was done in previous research using the k-means clustering method with global and collective outlier approaches on Riau Province hotspot data between 2001 and 2012. This work develops a web-based application using the Shiny framework with the R programming language. The application provides several functions, including summary and visualization of the selected data, clustering of hotspot data using the k-means algorithm, visualization of the clustering results and sum of squared errors (SSE), and display of global and collective outliers and visualization of the outlier spread on a map of Riau Province.
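The clustering and outlier steps the application exposes can be sketched as follows. The abstract's implementation is in R/Shiny; this is an illustrative re-sketch of k-means, SSE, and distance-based global outlier flagging, with hypothetical names.

```python
def dist2(a, b):
    """Squared Euclidean distance between two coordinate tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(cluster):
    """Componentwise mean of a non-empty list of points."""
    return tuple(sum(coords) / len(cluster) for coords in zip(*cluster))

def kmeans(points, k, iters=20):
    """Plain Lloyd's k-means; deterministic init from the first k points
    keeps this sketch reproducible."""
    centers = list(points[:k])
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda c: dist2(p, centers[c]))
            clusters[idx].append(p)
        centers = [mean(cl) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

def sse(centers, clusters):
    """Sum of squared errors of a clustering (the SSE the app plots)."""
    return sum(dist2(p, centers[i])
               for i, cl in enumerate(clusters) for p in cl)

def global_outliers(centers, clusters, radius):
    """Flag points farther than `radius` from their cluster center."""
    return [p for i, cl in enumerate(clusters) for p in cl
            if dist2(p, centers[i]) > radius ** 2]
```

A hotspot far from every cluster center — e.g. an isolated fire detection outside the usual burn regions — would be reported as a global outlier and drawn separately on the map.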
Wheeler, Graham M.; Sweeting, Michael J.; Mander, Adrian P.
In phase I cancer clinical trials, the maximum tolerated dose of a new drug is often found by a dose-escalation method known as the A + B design. We have developed an interactive web application, AplusB, which computes and returns exact operating characteristics of A + B trial designs. The application has a graphical user interface (GUI), requires no programming knowledge and is free to access and use on any device that can open an internet browser. A customised report is available for download for each design that contains tabulated operating characteristics and informative plots, which can then be compared with other dose-escalation methods. We present a step-by-step guide on how to use this application and provide several illustrative examples of its capabilities. PMID:27403961
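AplusB returns full exact operating characteristics; as a small worked example of the kind of quantity involved, the sketch below computes the exact probability of escalating past a dose under the classic 3+3 special case (A = B = 3), assuming the standard rule: escalate on 0/3 toxicities, or on 1/3 followed by 0/3 in the expansion cohort (1/6 total). This is our own simplification, not the AplusB code.

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability of exactly k events in n trials."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def p_escalate_3plus3(p):
    """Exact P(escalate past a dose) with true DLT rate p under 3+3.

    Escalate if 0/3 DLTs, or if 1/3 DLTs and then 0/3 in the extra cohort.
    """
    return binom_pmf(0, 3, p) + binom_pmf(1, 3, p) * binom_pmf(0, 3, p)
```

Chaining such per-dose probabilities across the dose ladder yields the design's operating characteristics, e.g. the distribution of the selected maximum tolerated dose.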
Mohd Ali, Noorlin; Tsuboi, Ryo; Matsumoto, Yuta; Koishi, Daisuke; Inoue, Kentaro; Maeda, Kazuhiro; Kurata, Hiroyuki
Computational analysis of metabolic fluxes is essential in understanding the structure and function of a metabolic network and in rationally designing genetically modified mutants for an engineering purpose. We previously presented the genetic modification flux (GMF) method, which predicts the flux distribution of a broad range of genetically modified mutants. To enhance the feasibility and usability of GMF, we have developed a web application with a metabolic network database to predict the flux distribution of genetically modified mutants. One hundred and twelve data sets of Escherichia coli, Corynebacterium glutamicum, Saccharomyces cerevisiae, and Chinese hamster ovary cells were registered as standard models.
Laakso, J. H.; Smith, D. D.; Zimmerman, D. K.
The fabrication of two shear web test elements and three large-scale shear web test components is reported. In addition, the fabrication of test fixtures for the elements and components is described. The center-loaded beam test fixtures were configured to have a test side and a dummy, or permanent, side. The test fixtures were fabricated from standard extruded aluminum sections and plates and were designed to be reusable.
Ohmukai, Ikki; Matsuo, Yutaka; Matsumura, Naohiro; Takeda, Hideaki
In this paper we propose a Web-based communication environment called the "Community Web Platform". Our platform provides an easy way to exchange personal knowledge among people with lightweight metadata such as RSS and FOAF. We investigate the nature of "personal trustness" in this environment, since it is the one and only measure for evaluating subjective information and knowledge. We also discuss how to develop and maintain Community Web applications based on our experience.
Laakso, J. H.; Zimmerman, D. K.
An advanced composite shear web design concept was developed for the Space Shuttle orbiter main engine thrust beam structure. Various web concepts were synthesized by a computer-aided adaptive random search procedure. A practical concept is identified having a titanium-clad ±45° boron/epoxy web plate with vertical boron/epoxy-reinforced aluminum stiffeners. The boron/epoxy laminate contributes to the strength and stiffness efficiency of the basic web section. The titanium cladding protects the polymeric laminate parts from damaging environments and is chem-milled to provide reinforcement in selected areas. Detailed design drawings are presented for both boron/epoxy-reinforced and all-metal shear webs. The weight saving offered is 24% relative to all-metal construction, at an attractive cost per pound of weight saved, based on the detailed designs. Small-scale element tests substantiate the boron/epoxy-reinforced design details in critical areas. The results show that the titanium cladding reliably reinforces the web laminate in critical edge load transfer and stiffener fastener hole areas.
Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao
This article describes a new Web-based failure database application for orthopaedic implants. The software follows the browser/server (B/S) model: ASP dynamic web technology is used as the main development language to achieve data interactivity, and Microsoft Access is used for the database; these mature technologies make the software easy to extend or upgrade. The design and development ideas behind the software, its working process and functions, as well as its main technical features, are presented. With this software, many different types of failure events of orthopaedic implants can be stored and statistically analyzed; at the macroscopic level, the software can be used to evaluate the reliability of orthopaedic implants and operations, and it can ultimately guide doctors in improving the level of clinical treatment.
Gopu, A.; Hayashi, S.; Young, M. D.
Paulau, Pavel V.; Feenders, Christoph; Blasius, Bernd
The analysis of small recurrent substructures, so called network motifs, has become a standard tool of complex network science to unveil the design principles underlying the structure of empirical networks. In many natural systems network nodes are associated with an intrinsic property according to which they can be ordered and compared against each other. Here, we expand standard motif analysis to be able to capture the hierarchical structure in such ordered networks. Our new approach is based on the identification of all ordered 3-node substructures and the visualization of their significance profile. We present a technique to calculate the fine grained motif spectrum by resolving the individual members of isomorphism classes (sets of substructures formed by permuting node-order). We apply this technique to computer generated ensembles of ordered networks and to empirical food web data, demonstrating the importance of considering node order for food-web analysis. Our approach may not only be helpful to identify hierarchical patterns in empirical food webs and other natural networks, it may also provide the base for extending motif analysis to other types of multi-layered networks. PMID:26144248
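Resolving the individual members of an isomorphism class requires keeping track of node order when enumerating 3-node substructures. A minimal sketch of the idea (our own simplification: each rank-ordered triple is keyed by its directed-edge signature, and only edgeless triples are skipped; a fuller version would also require connectedness):

```python
from itertools import combinations

def ordered_triad_spectrum(nodes, edges, rank):
    """Count ordered 3-node substructures in a directed graph.

    nodes -- iterable of node ids
    edges -- set of (u, v) directed edges
    rank  -- dict node -> orderable intrinsic property (e.g. body mass)

    Each triple, taken in increasing rank order (a, b, c), is keyed by
    the presence/absence of its six possible directed edges, so node
    permutations that standard motif analysis lumps into one isomorphism
    class are resolved into distinct ordered substructures.
    """
    counts = {}
    for a, b, c in combinations(sorted(nodes, key=lambda n: rank[n]), 3):
        sig = tuple(int(e in edges) for e in
                    [(a, b), (b, a), (a, c), (c, a), (b, c), (c, b)])
        if sum(sig) == 0:
            continue  # skip triples with no edges at all
        counts[sig] = counts.get(sig, 0) + 1
    return counts
```

Comparing these counts against a randomized ensemble would then yield the fine-grained significance profile the abstract describes.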
Dietze, Heiko; Berardini, Tanya Z.; Foulger, Rebecca E.; Hill, David P.; Lomax, Jane; Osumi-Sutherland, David; Roncaglia, Paola; Mungall, Christopher J.
Biological ontologies are continually growing and improving through requests for new classes (terms) by biocurators. These ontology requests can frequently create bottlenecks in the biocuration process, as ontology developers struggle to keep up while manually processing the requests and creating classes. TermGenie allows biocurators to generate new classes based on formally specified design patterns or templates. The system is web-based and can be accessed by any authorized curator through a web browser. Automated rules and reasoning engines are used to ensure validity, uniqueness and relationship to pre-existing classes. In the last 4 years the Gene Ontology TermGenie generated 4715 new classes, about 51.4% of all new classes created. The immediate generation of permanent identifiers proved not to be an issue, with only 70 (1.4%) obsoleted classes. TermGenie is a web-based class-generation system that complements traditional ontology development tools. All classes added through pre-defined templates are guaranteed to have OWL equivalence axioms that are used for automatic classification and, in some cases, inter-ontology linkage. At the same time, the system is simple and intuitive and can be used by most biocurators without extensive training.
Kupersmidt, Janis B.; Stelter, Rebecca; Dodge, Kenneth A.
The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and…
Thongjuea, Supat; Ruanjaichon, Vinitchan; Bruskiewich, Richard; Vanavichit, Apichart
RiceGeneThresher is a public online resource for mining genes underlying genome regions of interest or quantitative trait loci (QTL) in the rice genome. It is a compendium of rice genomic resources consisting of genetic markers, genome annotation, expressed sequence tags (ESTs), protein domains, gene ontology, plant stress-responsive genes, metabolic pathways and predicted protein-protein interactions. RiceGeneThresher integrates these diverse data sources and provides powerful web-based applications and flexible tools for delivering customized sets of biological data on rice. The system supports whole-genome gene mining for QTL by querying with DNA marker intervals or genomic loci. RiceGeneThresher provides biologically supported evidence that is essential for targeting groups or networks of genes involved in controlling traits underlying QTL. Users can use it to discover and assign the most promising candidate genes in preparation for further gene-function validation analysis. The web-based application is freely available at http://rice.kps.ku.ac.th.
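At its core, mining genes under a QTL is an interval-overlap query against the genome annotation. A minimal sketch (the data layout and function name are hypothetical, not RiceGeneThresher's actual API):

```python
def genes_in_interval(genes, chrom, start, end):
    """Return annotated genes overlapping a QTL marker interval.

    genes -- list of dicts with 'id', 'chrom', 'start', 'end'
    Uses the standard interval-intersection test: two intervals overlap
    when each starts before the other ends.
    """
    return [g for g in genes
            if g["chrom"] == chrom and g["start"] < end and g["end"] > start]
```

The candidates returned by such a query would then be ranked using the linked evidence layers (stress-responsive gene sets, protein domains, pathways) the abstract lists.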
Estrada, Jorge; Bernadó, Pau; Blackledge, Martin; Sancho, Javier
Background The stability of proteins is governed by the heat capacity, enthalpy and entropy changes of folding, which are strongly correlated to the change in solvent accessible surface area experienced by the polypeptide. While the surface exposed in the folded state can be easily determined, accessibilities for the unfolded state at the atomic level cannot be obtained experimentally and are typically estimated using simplistic models of the unfolded ensemble. A web application providing realistic accessibilities of the unfolded ensemble of a given protein at the atomic level will prove useful. Results ProtSA, a web application that calculates sequence-specific solvent accessibilities of the unfolded state ensembles of proteins has been developed and made freely available to the scientific community. The input is the amino acid sequence of the protein of interest. ProtSA follows a previously published calculation protocol which uses the Flexible-Meccano algorithm to generate unfolded conformations representative of the unfolded ensemble of the protein, and uses the exact analytical software ALPHASURF to calculate atom solvent accessibilities, which are averaged on the ensemble. Conclusion ProtSA is a novel tool for the researcher investigating protein folding energetics. The sequence specific atom accessibilities provided by ProtSA will allow obtaining better estimates of the contribution of the hydrophobic effect to the free energy of folding, will help to refine existing parameterizations of protein folding energetics, and will be useful to understand the influence of point mutations on protein stability. PMID:19356231
Mehta, V. K.; Kemp-Benedict, E.; Wang, G.; Malghan, D.
Pérez Gutiérrez, B. R.; Vera-Rivera, F. H.; Niño, E. D. V.
Estimating the ionic charge generated in electrical discharges allows us to know more accurately the concentration of ions implanted on the surfaces of nonmetallic solids. For this reason, in this research a web application was developed to calculate the ionic charge generated in an electrical discharge from the experimental parameters established in an ion implantation process performed in the JUPITER (Joint Universal Plasma and Ion Technologies Experimental Reactor) reactor. The estimated value of the ionic charge is determined from data acquired on an oscilloscope during startup and shutdown of the electrical discharge, which are then analyzed and processed. The study will support further developments in the application of ion implantation in various industrial sectors.
Pérez Gutiérrez, B. R.; Vera-Rivera, F. H.; Niño, E. D. V.
In the present research a web application is designed and implemented that allows us to calculate the dose of implanted ions on solid substrates from experimental parameters (repetition rate and pulse duration of the current, potential difference, working pressure and treatment time) and from current pulses acquired during start-up and shutdown of high-voltage electrical discharges at low pressures. By physical and mathematical processing, the value of the ionic charge is first estimated, and then the areal density. This study will support more accurate and suitable developments in the application of ion implantation as an alternative technique to increase the service life of the surfaces of materials used in various industrial sectors.
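The two quantities described — total charge from a sampled current pulse, and the resulting areal ion density — follow from Q = ∫ I dt and dose = Q / (Z · e · A). A hedged numerical sketch (function names and the sampled-waveform format are illustrative, not the application's code):

```python
E_CHARGE = 1.602176634e-19  # elementary charge, coulombs (exact SI value)

def pulse_charge(times, currents):
    """Trapezoidal integration of a sampled current pulse: Q = ∫ I dt.

    times    -- sample times in seconds (increasing)
    currents -- corresponding currents in amperes
    Returns the collected charge in coulombs.
    """
    q = 0.0
    for k in range(1, len(times)):
        q += 0.5 * (currents[k] + currents[k - 1]) * (times[k] - times[k - 1])
    return q

def areal_dose(charge, area_cm2, charge_state=1):
    """Implanted ions per cm^2: dose = Q / (Z * e * A)."""
    return charge / (charge_state * E_CHARGE * area_cm2)
```

For a rectangular 1 A, 1 µs pulse over 1 cm² of singly charged ions, this gives 1 µC of charge and roughly 6.24e12 ions/cm²; summing over all pulses in a treatment yields the total dose.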
Pietrobon, Ricardo; Shah, Anand; Kuo, Paul; Harker, Matthew; McCready, Mariana; Butler, Christeen; Martins, Henrique; Moorman, CT; Jacobs, Danny O
Background Although regulatory compliance in academic research is enforced by law to ensure high quality and safety for participants, its implementation is frequently hindered by cost and logistical barriers. To decrease these barriers, we have developed a Web-based application, Duke Surgery Research Central (DSRC), to monitor and streamline the regulatory research process. Results The main objective of DSRC is to streamline regulatory research processes. The application was built using paper prototyping for system requirements and Java as the primary language, in conjunction with the Model-View-Controller design pattern. The researcher interface was designed for simplicity so that it could be used by individuals with different levels of computer literacy. Analogously, the administrator interface was designed with functionality as its primary goal. DSRC facilitates the exchange of regulatory documents between researchers and research administrators, allowing tasks to be tracked and documents to be stored in a Web environment accessible from an intranet. Usability was evaluated using formal usability tests and field observations. Formal usability results demonstrated that DSRC presented good speed, was easy to learn and use, offered functionality that was easy to understand, and had intuitive navigation. Additional features implemented upon request by initial users included: extensive variable categorization (in contrast with data capture using free text), searching capabilities to help research administrators search an extensive number of researcher names, warning messages before critical tasks were performed (such as deleting a task), and confirmatory e-mails for critical tasks (such as completing a regulatory task). Conclusion The current version of DSRC was shown to have excellent overall usability properties in handling research regulatory issues. It is hoped that its release as an open
The Web is growing and changing from a paradigm of static publishing to one of participation and interaction. This change has implications for people with disabilities who rely on access to the Web for employment, information, entertainment, and increased independence. The interactive and collaborative nature of Web 2.0 can present access problems for some users. There are some best practices which can be put in place today to improve access. New specifications such as Accessible Rich Internet Applications (ARIA) and IAccessible2 are opening the doors to increasing the accessibility of Web 2.0 and beyond.
White, C. D.
Webs are sets of Feynman diagrams that contribute to the exponents of scattering amplitudes, in the kinematic limit in which emitted radiation is soft. As such, they have a number of phenomenological and formal applications, and offer tantalizing glimpses into the all-order structure of perturbative quantum field theory. This article is based on a series of lectures given to graduate students, and aims to provide a pedagogical introduction to webs. Topics covered include exponentiation in (non-)abelian gauge theories, the web mixing matrix formalism for non-abelian gauge theories, and recent progress on the calculation of web diagrams. Problems are included throughout the text, to aid understanding.
Wiklund Axelsson, S; Nyberg, L; Näslund, A; Melander Wikman, A
This study investigates the anticipated psychosocial impact of present web-based e-health services and future mobile health applications among older Swedes. Random samples of Swedish citizens aged 55 years and older were given a survey containing two different e-health scenarios, which respondents rated according to their anticipated psychosocial impact by means of the PIADS instrument. Results consistently demonstrated positive anticipated psychosocial impact for both scenarios. The future mobile health applications scored more positively than the present web-based e-health services. Increasing age was associated with lower impact scores. These findings indicate that, from a psychosocial perspective, web-based e-health services and mobile health applications are likely to have a positive impact on quality of life. This knowledge can be helpful when tailoring and implementing e-health services directed to older people.
Bubakir, Mahmoud M.; Li, Haoyi; Wu, Weifeng; Li, Xiaohu; Ma, Shuai; Yang, Weimin
Melt electrospinning has gained increasing attention because it can easily generate continuous ultrafine fibers directly from polymer melts without the use of any solvent; it is therefore considered a safe, cost-effective, and environmentally friendly technique. Despite these great advantages, the technique still suffers from drawbacks such as large fiber diameter and low throughput. Hot-air-assisted melt differential electrospinning (MDES) is a new technique invented by our research team that can eliminate these drawbacks. The most important features of the apparatus are: a needleless nozzle that generates multiple Taylor cones around its bottom edge, resulting in high throughput; a stretching force acting on the jets that can be further strengthened by an air current provided by an air pressure gun; and grounding of the nozzle, which prevents interference between the high-voltage supply and the temperature sensors. The ultrafine PP webs produced with this apparatus were in the micro/nano scale, with diameters of 600 nm-6 um and smooth surfaces. Porosity of the webs ranged from 86.5% to 99.4% when different collecting devices were used. The resultant ultrafine webs were applied in three areas: oil sorption, water treatment, and hydrophilic PP membranes. The results were very promising: the oil sorption capacity was 129.0 g/g; for water treatment, the rejection rate for 3 um particles was 95%; and the hydrophilic PP membrane had a water sorption capacity of 12.3 g/g.
Dafli, Eleni; Antoniou, Panagiotis; Ioannidis, Lazaros; Dombros, Nicholas; Topps, David
Background Virtual patients are interactive computer simulations that are increasingly used as learning activities in modern health care education, especially in teaching clinical decision making. A key challenge is how to retrieve and repurpose virtual patients as unique types of educational resources between different platforms because of the lack of standardized content-retrieving and repurposing mechanisms. Semantic Web technologies provide the capability, through structured information, for easy retrieval, reuse, repurposing, and exchange of virtual patients between different systems. Objective An attempt to address this challenge has been made through the mEducator Best Practice Network, which provisioned frameworks for the discovery, retrieval, sharing, and reuse of medical educational resources. We have extended the OpenLabyrinth virtual patient authoring and deployment platform to facilitate the repurposing and retrieval of existing virtual patient material. Methods A standalone Web distribution and Web interface, which contains an extension for the OpenLabyrinth virtual patient authoring system, was implemented. This extension was designed to semantically annotate virtual patients to facilitate intelligent searches, complex queries, and easy exchange between institutions. The OpenLabyrinth extension enables OpenLabyrinth authors to integrate and share virtual patient case metadata within the mEducator3.0 network. Evaluation included 3 successive steps: (1) expert reviews; (2) evaluation of the ability of health care professionals and medical students to create, share, and exchange virtual patients through specific scenarios in extended OpenLabyrinth (OLabX); and (3) evaluation of the repurposed learning objects that emerged from the procedure. Results We evaluated 30 repurposed virtual patient cases. The evaluation, with a total of 98 participants, demonstrated the system’s main strength: the core repurposing capacity. The extensive metadata schema
Nadkarni, Prakash M.; Brandt, Cynthia M.; Marenco, Luis
The task of creating and maintaining a front end to a large institutional entity-attribute-value (EAV) database can be cumbersome when using traditional client-server technology. Switching to Web technology as a delivery vehicle solves some of these problems but introduces others. In particular, Web development environments tend to be primitive, and many features that client-server developers take for granted are missing. WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component. It also addresses some challenging user interface issues that arise when any complex system is created. The authors describe the architecture of WebEAV and provide an overview of its features with suitable examples. PMID:10887163
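The core access pattern any EAV front end such as WebEAV must support can be sketched generically: sparse (entity, attribute, value) rows are pivoted into one record per entity for display. The rows and attribute names below are illustrative, not WebEAV's actual schema.

```python
# Generic sketch of the entity-attribute-value (EAV) storage pattern:
# each fact is one row, and a browsing interface pivots the rows into
# conventional per-entity records.
rows = [
    (1, "name", "aspirin"),
    (1, "dose_mg", "81"),
    (2, "name", "ibuprofen"),
    (2, "route", "oral"),
]

def pivot_eav(rows):
    """Pivot (entity, attribute, value) triples into {entity: {attr: value}}."""
    records = {}
    for entity, attribute, value in rows:
        records.setdefault(entity, {})[attribute] = value
    return records

records = pivot_eav(rows)
print(records[1])  # {'name': 'aspirin', 'dose_mg': '81'}
```

Note how entity 2 simply lacks a `dose_mg` key: sparseness is free in EAV, which is why the model suits rapidly evolving scientific schemas, at the cost of pivoting work in the front end.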
Jones, B.K.; Risty, R.R.; Buswell, M.
The U.S. Geological Survey Earth Resources Observation Systems Data Center responds to emergencies in support of various government agencies for human-induced and natural disasters. This response consists of satellite tasking and acquisitions, satellite image registrations, disaster-extent maps analysis and creation, base image provision and support, Web-based mapping services for product delivery, and predisaster and postdisaster data archiving. The emergency response staff are on call 24 hours a day, 7 days a week, and have access to many commercial and government satellite and aerial photography tasking authorities. They have access to value-added data processing and photographic laboratory services for off-hour emergency requests. They work with various Federal agencies for preparedness planning, which includes providing base imagery. These data may include digital elevation models, hydrographic models, base satellite images, vector data layers such as roads, aerial photographs, and other predisaster data. These layers are incorporated into a Web-based browser and data delivery service that is accessible either to the general public or to select customers. As usage declines, the data are moved to a postdisaster nearline archive that is still accessible, but not in real time.
This paper describes the development of Kevlar webbings for parachute applications. Evaluation of existing webbings and a study of the effects of filling yarn denier and pick count on tensile and joint strength provided data for fabric design. Measurements of warp crimp as a function of filling denier and pick count demonstrated the relationship between warp crimp and strength. One newly developed webbing had higher strength efficiency and another had higher joint efficiency than comparable existing webbings. Both new webbings had overall efficiencies over 5% higher than values for existing webbings. 10 refs., 4 figs., 2 tabs.
The World-Wide Web provides every internet citizen with access to an abundance of information, but it becomes increasingly difficult to identify the relevant pieces of information. Research in web mining tries to address this problem by applying techniques from data mining and machine learning to Web data and documents. This chapter provides a brief overview of web mining techniques and research areas, most notably hypertext classification, wrapper induction, recommender systems and web usage mining.
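As a concrete taste of one of these areas, web usage mining, a minimal item-to-item recommender can be built from co-occurrence counts over user sessions. The sessions and scoring scheme below are illustrative toys, not any system described in the chapter.

```python
# Minimal item-to-item recommender of the kind used in web usage mining:
# count how often two items appear in the same session, then recommend the
# items that co-occur most often with a given item.
from collections import Counter
from itertools import permutations

sessions = [["a", "b", "c"], ["a", "b"], ["b", "c"], ["a", "d"]]

cooc = Counter()
for s in sessions:
    for x, y in permutations(set(s), 2):   # ordered pairs within one session
        cooc[(x, y)] += 1

def recommend(item, k=2):
    """Top-k items by co-occurrence with `item` (ties broken alphabetically)."""
    scored = [(y, n) for (x, y), n in cooc.items() if x == item]
    return [y for y, _ in sorted(scored, key=lambda t: (-t[1], t[0]))][:k]

print(recommend("a"))  # ['b', 'c']
```

Real recommender systems refine this with weighting (e.g., TF-IDF or cosine similarity) and matrix factorization, but the session-co-occurrence core is the same.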
Ryoo, Ju-Mee; Johnson, Matthew S.; Iraci, Laura T.; Yates, Emma L.; Pierce, R. Bradley; Tanaka, Tomoaki; Gore, Warren
High ozone concentrations at low altitudes near the surface were detected from airborne Alpha Jet Atmospheric eXperiment (AJAX) measurements on May 30, 2012. We investigate the causes of the elevated ozone concentrations using the airborne measurements and various models. GEOS-Chem and WRF-STILT model simulations show that the contribution from local sources is small. From MERRA reanalysis, it is found that high potential vorticity (PV) is observed at low altitudes. This high PV appears to come only partially through stratospheric intrusions, because the air inside the high-PV region is moist, which indicates that mixing is enhanced at low altitudes. Considering that diabatic heating can also produce high PV in the lower troposphere, high ozone is partially coming through stratospheric intrusion, but this cannot explain the whole ozone concentration in the target areas of the western U.S. A back-trajectory model is used to determine where the air masses originated. The air masses of the target areas came from the lower stratosphere (LS), upper troposphere (UT), mid-troposphere (MT), and lower troposphere (LT). The relative number of trajectories coming from the LS and UT is low (7.7% and 7.6%, respectively) compared to that from the LT (64.1%), but the relative ozone concentration coming from the LS and UT is high (38.4% and 20.95%, respectively) compared to that from the LT (17.7%). The air mass coming from the LT appears to come mostly from Asia. Q diagnostics show that there is sufficient mixing along the trajectory to indicate that ozone from the different origins is mixed and transported to the western U.S. This study shows that high ozone concentrations can be detected by airborne measurements, which can be analyzed with integrated platforms such as models, reanalysis, and satellite data.
Manea, M.; Norini, G.; Capra, L.; Manea, V. C.
The Colima Volcano is currently the most active Mexican volcano. After the 1913 plinian activity the volcano presented several eruptive phases that lasted a few years, but since 1991 its activity has become more persistent, with vulcanian eruptions, lava flows, and dome extrusions. During the last 15 years the volcano underwent several eruptive episodes (1991, 1994, 1998-1999, 2001-2003, 2004, and 2005) with the emplacement of pyroclastic flows. During rainy seasons lahars are frequent, affecting infrastructure such as bridges and electric towers. Researchers from different institutions (Mexico, USA, Germany, Italy, and Spain) are currently working on several aspects of the volcano, from remote sensing, field data on old and recent deposits, structural framework, and monitoring (rain, seismicity, deformation, and visual observations) to laboratory experiments (analogue models and numerical simulations). Each investigation is focused on explaining a single process, but it is fundamental to visualize the global status of the volcano in order to understand its behavior and to mitigate future hazards. The Colima Volcano WebGIS is an initiative aimed at collecting and storing on a systematic basis all the data obtained so far for the volcano and at continuously updating the database with new information. The Colima Volcano WebGIS is hosted on the Computational Geodynamics Laboratory web server and is based entirely on Open Source software. The web pages, written in PHP/HTML, extract information from a MySQL relational database, which hosts the information needed for the MapBender application. There will be two types of intended users: 1) researchers working on the Colima Volcano, interested in this project and collaborating in common projects, who will be provided with open access to the database and will be able to introduce their own data, results, interpretations, or recommendations; and 2) general users interested in accessing information on Colima Volcano, who will be provided
With the growth of Web 2.0 library intranets in recent years, many libraries are leaving behind legacy, first-generation intranets. As Web 2.0 intranets multiply and mature, how will traditional intranet best practices--especially in the areas of planning, implementation, and evaluation--translate into an existing Web 2.0 intranet infrastructure?…
Marenco, Luis; Tosches, Nicholas; Crasto, Chiquito; Shepherd, Gordon; Miller, Perry L.; Nadkarni, Prakash M.
The EAV/CR framework, designed for database support of rapidly evolving scientific domains, utilizes metadata to facilitate schema maintenance and automatic generation of Web-enabled browsing interfaces to the data. EAV/CR is used in SenseLab, a neuroscience database that is part of the national Human Brain Project. This report describes various enhancements to the framework. These include (1) the ability to create “portals” that present different subsets of the schema to users with a particular research focus, (2) a generic XML-based protocol to assist data extraction and population of the database by external agents, (3) a limited form of ad hoc data query, and (4) semantic descriptors for interclass relationships and links to controlled vocabularies such as the UMLS. PMID:12807806
Cai, Mingyang; Gao, Fan; Lu, Wange; Wang, Kai
Circularized Chromosome Conformation Capture followed by deep sequencing (4C-Seq) is a powerful technique to identify genome-wide partners interacting with a pre-specified genomic locus. Here, we present a computational and statistical approach to analyze 4C-Seq data generated from both enzyme digestion and sonication fragmentation-based methods. We implemented a command line software tool and a web interface called w4CSeq, which takes in the raw 4C sequencing data (FASTQ files) as input, performs automated statistical analysis and presents results in a user-friendly manner. Besides providing users with the list of candidate interacting sites/regions, w4CSeq generates figures showing genome-wide distribution of interacting regions, and sketches the enrichment of key features such as TSSs, TTSs, CpG sites and DNA replication timing around 4C sites.
Luff, R; Zähringer, M; Harms, W; Bleher, M; Prommer, B; Stöhlker, U
The German Federal Office for Radiation Protection operates a network of about 1800 gamma dose rate stations as part of the national emergency preparedness plan. Each of the six network centres is capable of operating the network alone. Most of the hardware and software used have been developed in-house under an open-source license. Short development cycles and close cooperation between developers and users ensure robustness, transparency, and fast maintenance procedures, thus avoiding unnecessarily complex solutions. This also reduces the overall cost of network operation. An easy-to-expand web interface has been developed to make the complete system available to other interested network operators, in order to increase cooperation between different countries. The interface is also regularly used for education during traineeships supported, e.g., by the International Atomic Energy Agency, to operate a local area dose rate monitoring test network.
González-Medina, Mariana; Méndez-Lucio, Oscar; Medina-Franco, José L
Activity landscape modeling is a powerful method for the quantitative analysis of structure-activity relationships. This cheminformatics area is in continuous growth, and several quantitative and visual approaches are constantly being developed. However, these approaches often fall into disuse because of their limited accessibility. Herein, we present Activity Landscape Plotter as the first freely available web-based tool to automatically analyze structure-activity relationships of compound data sets. Based on the concept of activity landscape modeling, the online service computes pairwise structure and activity relationships from an input data set supplied by the user. For visual analysis, Activity Landscape Plotter generates Structure-Activity Similarity and Dual-Activity Difference maps. The user can interactively navigate through the maps and export all the pairwise structure-activity information as comma-delimited files. Activity Landscape Plotter is freely accessible at https://unam-shiny-difacquim.shinyapps.io/ActLSmaps/.
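The pairwise computation behind a Structure-Activity Similarity (SAS) map can be sketched as follows. Structural similarities are assumed to be precomputed (e.g., fingerprint Tanimoto values); the compounds, activity values, and quadrant thresholds below are illustrative, not the tool's actual defaults.

```python
# Minimal sketch of SAS-map quadrant assignment: each compound pair is
# placed by (structural similarity, absolute activity difference).
from itertools import combinations

activity = {"c1": 7.2, "c2": 7.0, "c3": 4.9}           # e.g., pIC50 values
similarity = {frozenset(p): s for p, s in
              [(("c1", "c2"), 0.9), (("c1", "c3"), 0.8), (("c2", "c3"), 0.3)]}

def sas_zones(activity, similarity, sim_cut=0.7, act_cut=1.0):
    zones = {}
    for a, b in combinations(sorted(activity), 2):
        sim = similarity[frozenset((a, b))]
        diff = abs(activity[a] - activity[b])
        if sim >= sim_cut and diff >= act_cut:
            zones[(a, b)] = "activity cliff"   # similar structure, large activity gap
        elif sim >= sim_cut:
            zones[(a, b)] = "smooth SAR"       # similar structure, similar activity
        elif diff < act_cut:
            zones[(a, b)] = "scaffold hop"     # different structure, similar activity
        else:
            zones[(a, b)] = "nondescript"      # different structure and activity
    return zones

zones = sas_zones(activity, similarity)
print(zones)
```

Activity cliffs (top-right quadrant of the map) are the pairs of greatest interest, since a small structural change produces a large activity change.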
Wagner, Chad R.; Tighe, Kirsten C.; Terziotti, Silvia
StreamStats is a Web-based Geographic Information System (GIS) application that was developed by the U.S. Geological Survey (USGS) in cooperation with Environmental Systems Research Institute, Inc. (ESRI) to provide access to an assortment of analytical tools that are useful for water-resources planning and management. StreamStats allows users to easily obtain streamflow statistics, basin characteristics, and descriptive information for USGS data-collection sites and selected ungaged sites. StreamStats also allows users to identify stream reaches upstream and downstream from user-selected sites and obtain information for locations along streams where activities occur that can affect streamflow conditions. This functionality can be accessed through a map-based interface with the user's Web browser or through individual functions requested remotely through other Web applications.
Zubair, Muhammad; Ponniah, Ganeshthangaraj; Yang, Young Jin; Choi, Kyung Hyun
The mass production of printed electronics can be achieved with roll-to-roll (R2R) printing systems, which require highly accurate web tension control to minimize register error and keep the thickness and roughness of the printed devices within limits. The web tension of an R2R system is regulated using integrated load cells and an active dancer system for printed electronics applications, using decentralized multi-input single-output (MISO) regularized variable-learning-rate backpropagation artificial neural networks. The active dancer system is placed before the printing system to reduce disturbances in the web tension of the process span. Classical PID control results in tension spikes as the roll diameters of the winder and unwinder change. With the dancer present in the R2R system, web tension control in the printing span is improved, and the web tension can be raised from 3.75 N to 4.75 N. The overshoot of the system is less than ±2.5 N and the steady-state error is within ±1 N, where the load cells have a signal noise of ±0.7 N. The integration of load cells and an active dancer with self-adapting neural network control provides a solution to the web tension control of multispan roll-to-roll systems.
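The classical PID baseline the paper compares against can be illustrated with a toy discrete control loop. The gains, first-order plant model, noise-free signals, and step counts below are all invented for the example; the paper's actual controller is a backpropagation neural network identified against the real R2R dynamics.

```python
# Toy discrete PID loop driving web tension from 3.75 N toward a 4.75 N
# setpoint against a simple first-order plant. All numbers are illustrative.
def pid_step(setpoint, measured, state, kp=2.0, ki=0.5, kd=0.1, dt=0.01):
    err = setpoint - measured
    state["integral"] += err * dt
    deriv = (err - state["prev_err"]) / dt
    state["prev_err"] = err
    return kp * err + ki * state["integral"] + kd * deriv

state = {"integral": 0.0, "prev_err": 0.0}
tension = 3.75                         # initial web tension in newtons
for _ in range(3000):                  # 30 s of simulated time at dt = 0.01 s
    u = pid_step(4.75, tension, state)
    tension += 0.01 * (u - tension)    # first-order plant: tension follows command
print(round(tension, 2))               # settles near the 4.75 N setpoint
```

The integral term is what removes the steady-state offset; it is also what produces the tension spikes under changing roll diameter that motivate the paper's dancer-plus-neural-network scheme.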
Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li
Backdoors or information leaks in Web servers can be detected by applying Web mining techniques to abnormal Web log and Web application log data, enhancing the security of Web servers and avoiding the damage of illegal access. First, a system for discovering patterns of information leakage in CGI scripts from Web log data is proposed. Second, these patterns are provided to system administrators so that they can modify their code and enhance their Web site security. Two aspects are described: one is to combine the Web application log with the Web log to extract more information, so that Web data mining can discover information in the Web log that firewalls and intrusion detection systems cannot find; the other is to propose an operation module for the Web site to enhance its security. In cluster server sessions, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.
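The density-based clustering step can be sketched with a stdlib-only DBSCAN-style routine over per-session feature vectors derived from the logs. The feature choice (request rate, scaled error ratio), thresholds, and data below are illustrative, not the paper's actual parameters.

```python
# Toy density-based clustering (DBSCAN-style) of web-log sessions:
# dense groups form clusters of normal behavior; isolated points (label -1)
# are flagged as anomalous sessions worth inspecting.
def dbscan(points, eps=1.5, min_pts=2):
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    n = len(points)
    labels = [None] * n                  # None = unvisited, -1 = noise
    cluster = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        neighbors = [j for j in range(n) if dist(points[i], points[j]) <= eps]
        if len(neighbors) < min_pts:
            labels[i] = -1               # not dense enough: noise
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(neighbors)
        while seeds:                     # expand the cluster from core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster      # noise reachable from a core point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs = [k for k in range(n) if dist(points[j], points[k]) <= eps]
            if len(nbrs) >= min_pts:
                seeds.extend(nbrs)
    return labels

# sessions as (requests per minute, error ratio x 10); the last is anomalous
sessions = [(1.0, 0.0), (1.2, 0.1), (0.9, 0.2), (50.0, 9.0)]
labels = dbscan(sessions)
print(labels)  # [0, 0, 0, -1]
```

Density-based methods suit this task because the number of behavior clusters is unknown in advance and anomalies fall out naturally as noise points.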
Reder, Nicholas P.; Glasser, Daniel; Dintzis, Suzanne M.; Rendi, Mara H.; Garcia, Rochelle L.; Henriksen, Jonathan C.; Kilgore, Mark R.
Context: Whole-slide images (WSIs) present a rich source of information for education, training, and quality assurance. However, they are often used in a fashion similar to glass slides rather than in novel ways that leverage the advantages of WSI. We have created a pipeline to transform annotated WSIs into a pattern recognition training and quality assurance web application called the Novel Diagnostic Electronic Resource (NDER). Aims: Create an efficient workflow for extracting annotated WSIs for use by NDER, an attractive web application that provides high-throughput training. Materials and Methods: WSIs were annotated by a resident and classified into five categories. Two methods of extracting images and creating image databases were compared. Extraction Method 1: manual extraction of still images and validation of each image by four breast pathologists. Extraction Method 2: validation of annotated regions on the WSI by a single experienced breast pathologist and automated extraction of still images tagged by diagnosis. The extracted still images were used by NDER. NDER briefly displays an image, requires users to classify the image after time has expired, then gives users immediate feedback. Results: The NDER workflow is efficient: annotation of a WSI requires 5 minutes, and validation by an expert pathologist requires an additional 1 to 2 minutes. The pipeline is highly automated, with only annotation and validation requiring human input. NDER effectively displays hundreds of high-quality, high-resolution images and provides immediate feedback to users during a 30-minute session. Conclusions: NDER efficiently uses annotated WSIs to rapidly increase pattern recognition and evaluate diagnostic proficiency. PMID:27563490
Wirkus, Lars; Strunck, Alexander
War and violent conflict are omnipresent, be it war in the Middle East, violent conflicts in failed states, or increasing military expenditures and exports/imports of military goods. To understand particular conflicts or peace processes and their possible interrelations, to conduct a well-founded political discussion, and to support or influence decision-making, one thing is of special importance: easily accessible and, in particular, reliable data and information. Against this background, the Bonn International Center for Conversion (BICC), in close cooperation with the German Federal Agency for Civic Education (bpb), has been developing a map-based information portal on war and peace with various thematic modules for the latter's online service (http://sicherheitspolitik.bpb.de). The portal will eventually offer nine such modules, intended to give various target groups, such as interested members of the public, teachers and learners, policymakers, and representatives of the media, access to the required information in the form of an interactive, country-based global overview or a comparison of different issues. Five thematic modules have been completed so far: war and conflict; peace and demobilization; military capacities; resources and conflict; and conventional weapons. The portal offers a broad spectrum of data processing and visualization tools. Its central feature is an interactive mapping component based on WebGIS and a relational database. Content and data provided through thematic maps in the form of WebGIS layers are generally supplemented by infographics, data tables, and short articles providing deeper knowledge on the respective issue. All modules and their sub-chapters are introduced by background texts, which put all interactive maps of a module into an appropriate context and help users understand the interrelations between the various layers. If a layer is selected, all corresponding texts and graphics are shown automatically below
Singh, Kulwinder; Park, Dong-Won
with an existing web infrastructure, thereby making the wealth of Web information easily available to the user by phone. This kind of system can be deployed as an extension to 911 and 411 services to share the workload with human operators. This paper presents the underlying principles, architecture, and features of our proposed system, along with an example of a real-world deployment. The source code and documentation are available for commercial production use.
Frederick W. Ahrens; C. Habeger; J. Loughran; T. Patterson
The project summarized in this report dealt with an evaluation of new microwave applicator ideas for paper preheating and drying. The technical basis for success in this project is the fact that Industrial Microwave Systems has recently identified certain previously unrecognized waveguide "design variables" and hardware implementation concepts that can be employed to greatly improve the uniformity of microwave energy distribution for continuous-flow processes. Two applicator concepts were ultimately evaluated: a cross-machine direction (CD) oriented applicator and a machine direction (MD) oriented applicator. The economic basis for success is the result of several factors. Since 1985, the capital expenditure required for an industrial microwave applicator system has decreased by a factor of four. Maintenance costs have decreased by a factor of 10, and the life expectancy of the magnetron has increased by more than a factor of four, to in excess of 8,000 hours (nearly one year at 24 hours/day operation).
Burks, Jason E.; Molthan, Andrew L.; McGrath, Kevin M.
During the last year several significant disasters have occurred, such as Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines, along with several others. In support of these disasters, NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery of the affected areas. To better support the decision makers responding to the disasters, SPoRT quickly developed several solutions to provide the data using open Geographic Information System (GIS) formats. Providing the data in open GIS standard formats allowed the end user to easily integrate the data into existing Decision Support Systems (DSS). Both Tile Mapping Service (TMS) and Web Mapping Service (WMS) were leveraged to quickly provide the data to the end user. Development of the delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.
Mendoza, Patricia; Gonzalez, Perla; Villanueva, Brenda; Haltiwanger, Emily; Nazeran, Homer
We describe a vital sign telemonitor (VST) that acquires, records, displays, and provides readings such as electrocardiograms (ECGs), temperature (T), and oxygen saturation (SaO2) over the Internet to any site. The design of this system consisted of three parts: sensors, analog signal processing circuits, and a user-friendly graphical user interface (GUI). The first part involved selection of appropriate sensors: for ECG, disposable Ag/AgCl electrodes; for temperature, the LM35 precision temperature sensor; and for SaO2, the Nonin Oximetry Development Kit equipped with a finger clip. The second part consisted of processing the analog signals obtained from these sensors, achieved by implementing suitable amplifiers and filters for the vital signs. The final part focused on development of a GUI to display the vital signs in the LabVIEW environment. From these measurements, important values such as heart rate (HR), beat-to-beat (RR) intervals, SaO2 percentages, and T in both degrees Celsius and Fahrenheit were calculated. The GUI could be accessed through the Internet via a Web page, facilitating real-time patient telemonitoring. The final system was completed and tested on volunteers with satisfactory results.
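Two of the derived quantities mentioned in the abstract are simple arithmetic; a minimal sketch (function names are ours, not taken from the VST software):

```python
def heart_rate_bpm(rr_intervals_ms):
    """Mean heart rate (beats per minute) from beat-to-beat (RR)
    intervals measured in milliseconds: HR = 60000 / mean(RR)."""
    mean_rr = sum(rr_intervals_ms) / len(rr_intervals_ms)
    return 60000.0 / mean_rr

def celsius_to_fahrenheit(celsius):
    """Convert a temperature reading for dual-unit display."""
    return celsius * 9.0 / 5.0 + 32.0

print(heart_rate_bpm([800, 800, 800]))  # 75.0
print(celsius_to_fahrenheit(37.0))      # ~98.6
```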
Tomazic, Igor; Alvera-Azcarate, Aida; Barth, Alexander; Beckers, Jean-Marie
DINEOF (Data INterpolating Empirical Orthogonal Functions) is a powerful tool based on EOF decomposition, developed at the University of Liege/GHER for the reconstruction of missing data in satellite datasets, as well as for the reduction of noise and detection of outliers. DINEOF is openly available as a series of Fortran routines to be compiled by the user, and as binaries (that can be run directly without any compilation) for both Windows and Linux platforms. In order to facilitate the use of DINEOF and increase the number of interested users, we developed a web-based interface for DINEOF with the necessary parameters available to run a high-quality DINEOF analysis. This includes choosing a variable within a selected dataset; defining a domain and time range; setting filtering criteria based on available variables in the dataset (e.g. quality flag, satellite zenith angle …); and defining the necessary DINEOF parameters. Results, including reconstructed data and calculated EOF modes, will be disseminated in NetCDF format using OpenDAP and a WMS server, allowing easy visualisation and analysis. First, we will include several satellite datasets of sea surface temperature and chlorophyll concentration obtained from the MyOcean data centre and already remapped to a regular grid (L3C). Later, based on users' requests, we plan to extend the number of datasets available for reconstruction.
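The EOF-based gap filling that DINEOF performs can be illustrated with a toy iterative truncated-SVD scheme: initialize the gaps, reconstruct the field from the leading modes, and re-impute the gaps until convergence. This is only a sketch of the idea; the real DINEOF also cross-validates the optimal number of modes and handles noise and outliers.

```python
import numpy as np

def dineof_fill(data, n_modes=1, n_iter=100):
    """Toy DINEOF-style gap filling: set missing values (NaN) to zero,
    then repeatedly replace them with a truncated-SVD (EOF)
    reconstruction of the whole field."""
    filled = np.array(data, dtype=float)
    gaps = np.isnan(filled)
    filled[gaps] = 0.0
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(filled, full_matrices=False)
        recon = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes]
        filled[gaps] = recon[gaps]   # only the gaps are updated
    return filled

# A rank-1 "field" with one missing pixel is recovered almost exactly.
field = np.outer([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
field[2, 2] = np.nan
print(round(dineof_fill(field)[2, 2], 2))  # ~9.0
```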
Ruffner, John W.; Woodward, Kim G.
Night vision goggles (NVGs) can enhance military and civilian operations at night. With this increased capability comes the requirement to provide suitable training. Results from field experience and accident analyses suggest that problems experienced by NVG users can be attributed to a limited understanding of NVG limitations and to perceptual problems. In addition, there is evidence that NVG skills are perishable and require frequent practice. Formal training is available to help users obtain the required knowledge and skills. However, there often is insufficient opportunity to obtain and practice perceptual skills prior to using NVGs in the operational environment. NVG users need early and continued exposure to the night environment across a broad range of visual and operational conditions to develop and maintain the necessary knowledge and perceptual skills. NVG training has consisted of classroom instruction, hands-on training, and simulator training. Advances in computer-based training (CBT) and web-based training (WBT) have made these technologies very appealing as additions to the NVG training mix. This paper discusses our efforts to develop multimedia, interactive CBT and WBT for NVG training. We discuss how NVG CBT and WBT can be extended to military and civilian ground, maritime, and aviation NVG training.
Integrated decision support systems for regulatory applications benefit from standard industry practices such as code reuse, test-driven development, and modularization. These approaches make meeting the federal government's goals of transparency, efficiency, and quality assurance ...
Kadoya, Taku; Osada, Yutaka; Takimoto, Gaku
Quantitative description of food webs provides fundamental information for the understanding of population, community, and ecosystem dynamics. Recently, stable isotope mixing models have been widely used to quantify dietary proportions of different food resources to a focal consumer. Here we propose a novel mixing model (IsoWeb) that estimates diet proportions of all consumers in a food web based on stable isotope information. IsoWeb requires a topological description of a food web, and stable isotope signatures of all consumers and resources in the web. A merit of IsoWeb is that it takes into account variation in trophic enrichment factors among different consumer-resource links. Sensitivity analysis using realistic hypothetical food webs suggests that IsoWeb is applicable to a wide variety of food webs differing in the number of species, connectance, sample size, and data variability. Sensitivity analysis based on real topological webs showed that IsoWeb can allow for a certain level of topological uncertainty in target food webs, including erroneously assuming false links, omission of existent links and species, and trophic aggregation into trophospecies. Moreover, using an illustrative application to a real food web, we demonstrated that IsoWeb can compare the plausibility of different candidate topologies for a focal web. These results suggest that IsoWeb provides a powerful tool to analyze food-web structure from stable isotope data. We provide R and BUGS codes to aid efficient applications of IsoWeb.
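The linear mixing idea underlying such models can be illustrated with the simplest case, a single isotope and two food sources (IsoWeb itself estimates diet proportions for all consumers in a web jointly, with link-specific trophic enrichment factors, via Bayesian inference); a sketch, with names of our own choosing:

```python
def two_source_mixing(d_consumer, d_source1, d_source2, tef=0.0):
    """Diet proportion of source 1 for a consumer, from one isotope
    signature, using a standard two-source linear mixing model."""
    # Correct sources for the trophic enrichment factor (TEF): the
    # consumer is enriched by `tef` per mille relative to its diet.
    s1 = d_source1 + tef
    s2 = d_source2 + tef
    if s1 == s2:
        raise ValueError("sources are isotopically indistinguishable")
    # d_consumer = f1*s1 + (1 - f1)*s2  =>  solve for f1
    return (d_consumer - s2) / (s1 - s2)

# Consumer midway between sources eats them in equal proportion.
print(two_source_mixing(-22.0, -26.0, -18.0))  # 0.5
```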
Cook, David A
Cognitive and learning styles (CLS) have long been investigated as a basis to adapt instruction and enhance learning. Web-based learning (WBL) can reach large, heterogeneous audiences, and adaptation to CLS may increase its effectiveness. Adaptation is only useful if some learners (with a defined trait) do better with one method and other learners (with a complementary trait) do better with another method (aptitude-treatment interaction). A comprehensive search of health professions education literature found 12 articles on CLS in computer-assisted learning and WBL. Because so few reports were found, research from non-medical education was also included. Among all the reports, four CLS predominated. Each CLS construct was used to predict relationships between CLS and WBL. Evidence was then reviewed to support or refute these predictions. The wholist-analytic construct shows consistent aptitude-treatment interactions consonant with predictions (wholists need structure, a broad-before-deep approach, and social interaction, while analytics need less structure and a deep-before-broad approach). Limited evidence for the active-reflective construct suggests aptitude-treatment interaction, with active learners doing better with interactive learning and reflective learners doing better with methods to promote reflection. As predicted, no consistent interaction between the concrete-abstract construct and computer format was found, but one study suggests that there is interaction with instructional method. Contrary to predictions, no interaction was found for the verbal-imager construct. Teachers developing WBL activities should consider assessing and adapting to accommodate learners defined by the wholist-analytic and active-reflective constructs. Other adaptations should be considered experimental. Further WBL research could clarify the feasibility and effectiveness of assessing and adapting to CLS.
Veredas, Francisco J; Ruiz-Bandera, Esperanza; Villa-Estrada, Francisca; Rufino-González, Juan F; Morente, Laura
Pressure ulcers (PrU) are considered one of the most challenging problems that Nursing professionals have to deal with in their daily practice. Nowadays, education on PrUs is mainly based on traditional lecturing, seminars and face-to-face instruction, sometimes with the support of photographs of wounds used as teaching material. This traditional educational methodology suffers from some important limitations, which could affect the efficacy of the learning process. The current study was designed to introduce information and communication technologies (ICT) into education on PrU for undergraduate students, with the main objective of evaluating the advantages and disadvantages of using ICT by comparing the learning results obtained from an e-learning tool with those from a traditional teaching methodology. In order to meet this major objective, a web-based learning system named ePULab has been designed and developed as an adaptive e-learning tool for the autonomous acquisition of knowledge on PrU evaluation. This innovative system has been validated by means of a randomized controlled trial that compares its learning efficacy with that of a control group receiving traditional face-to-face instruction. Students using ePULab achieved significantly better (p<0.01) learning acquisition scores (from pre-test mean 8.27 (SD 1.39) to post-test mean 15.83 (SD 2.52)) than those following traditional lecture-style classes (from pre-test mean 8.23 (SD 1.23) to post-test mean 11.6 (SD 2.52)). In this article, the ePULab software is described in detail and the results from that experimental educational validation study are also presented and analyzed.
Dereeper, Alexis; Homa, Felix; Andres, Gwendoline; Sempere, Guilhem; Sarah, Gautier; Hueber, Yann; Dufayard, Jean-François; Ruiz, Manuel
SNiPlay is a web-based tool for detection, management and analysis of genetic variants including both single nucleotide polymorphisms (SNPs) and InDels. Version 3 now extends functionalities in order to easily manage and exploit SNPs derived from next generation sequencing technologies, such as GBS (genotyping by sequencing), WGRS (whole-genome resequencing) and RNA-Seq technologies. Based on the standard VCF (variant call format) format, the application offers an intuitive interface for filtering and comparing polymorphisms using user-defined sets of individuals and then establishing a reliable genotyping data matrix for further analyses. In addition to the various scaled-up analyses allowed by the application (genomic annotation of SNPs, diversity analysis, haplotype reconstruction and network, linkage disequilibrium), SNiPlay3 proposes new modules for GWAS (genome-wide association studies), population stratification, distance tree analysis and visualization of SNP density. Additionally, we developed a suite of Galaxy wrappers for each step of the SNiPlay3 process, so that the complete pipeline can also be deployed on a Galaxy instance using the Galaxy ToolShed procedure and then be computed as a Galaxy workflow. SNiPlay is accessible at http://sniplay.southgreen.fr. PMID:26040700
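The record-level filtering described can be sketched on the standard VCF fixed-field layout (illustrative only: the function name and threshold are ours, and SNiPlay's filters — individual subsets, missing data, allele frequencies — are far richer):

```python
def filter_vcf_lines(lines, min_qual=30.0):
    """Keep VCF variant records whose QUAL field meets a threshold;
    header lines (starting with '#') pass through unchanged."""
    kept = []
    for line in lines:
        if line.startswith("#"):
            kept.append(line)
            continue
        fields = line.split("\t")
        qual = fields[5]          # CHROM POS ID REF ALT QUAL ...
        if qual != "." and float(qual) >= min_qual:
            kept.append(line)
    return kept

records = [
    "##fileformat=VCFv4.2",
    "#CHROM\tPOS\tID\tREF\tALT\tQUAL\tFILTER\tINFO",
    "chr1\t101\t.\tA\tG\t55.0\tPASS\t.",
    "chr1\t202\t.\tC\tT\t12.0\tPASS\t.",
]
print(len(filter_vcf_lines(records)))  # 3 (two headers + one passing variant)
```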
Yuen, Erica K; Gros, Kirstin; Welsh, Kyleen E; McCauley, Jenna; Resnick, Heidi S; Danielson, Carla K; Price, Matthew; Ruggiero, Kenneth J
Technology-based self-help interventions have the potential to increase access to evidence-based mental healthcare, especially for families affected by natural disasters. However, development of these interventions is a complex process and poses unique challenges. Usability testing, which assesses the ability of individuals to use an application successfully, can have a significant impact on the quality of a self-help intervention. This article describes (a) the development of a novel web-based multi-module self-help intervention for disaster-affected adolescents and their parents and (b) a mixed-methods formal usability study to evaluate user response. A total of 24 adolescents were observed, videotaped, and interviewed as they used the depressed mood component of the self-help intervention. Quantitative results indicated an above-average user experience, and qualitative analysis identified 120 unique usability issues. We discuss the challenges of developing self-help applications, including design considerations and the value of usability testing in technology-based interventions, as well as our plan for widespread dissemination. PMID:25933798
Semple, Hugh; Qin, Han; Sasson, Comilla
Improving survival rates at the neighborhood level is increasingly seen as a priority for reducing overall rates of out-of-hospital cardiac arrest (OHCA) in the United States. Since wide disparities exist in OHCA rates at the neighborhood level, it is important for public health officials and residents to be able to quickly locate neighborhoods where people are at elevated risk for cardiac arrest and to target these areas for educational outreach and other mitigation strategies. This paper describes an OHCA web mapping application that was developed to provide users with interactive maps and data for them to quickly visualize and analyze the geographic pattern of cardiac arrest rates, bystander CPR rates, and survival rates at the neighborhood level in different U.S. cities. The data comes from the CARES Registry and is provided over a period spanning several years so users can visualize trends in neighborhood out-of-hospital cardiac arrest patterns. Users can also visualize areas that are statistical hot and cold spots for cardiac arrest and compare OHCA and bystander CPR rates in the hot and cold spots. Although not designed as a public participation GIS (PPGIS), this application seeks to provide a forum around which data and maps about local patterns of OHCA can be shared, analyzed and discussed, with a view to empowering local communities to take action to address the high rates of OHCA in their vicinity.
Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.
The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application
Garzotto, Franca; Salmon, Tullio; Pigozzi, Massimiliano
A framework for the design of multi-channel (MC) applications in the cultural tourism domain is presented. Several heterogeneous interface devices are supported including location-sensitive mobile units, on-site stationary devices, and personalized CDs that extend the on-site experience beyond the visit time thanks to personal memories gathered…
The purpose of my qualitative, action study was to gain a better understanding of the effects of an experimental college course in computer applications. This inquiry was made concerning both the teacher's and learners' points of view. A holistic, arts-based approach was used by the researcher/teacher in order to design, develop and facilitate a…
Reppert, James E.
Interactive courseware applications are becoming more prevalent as instructional tools in the communication classroom. Prometheus, developed by George Washington University, allows instructors to post syllabi, course outlines, lecture notes, and tests online, in addition to giving students access to discussions and chat sessions. Other popular…
Johnson, G. W.; Gonzalez, J.; Brady, J. J.; Gaylord, A.; Manley, W. F.; Cody, R.; Dover, M.; Score, R.; Garcia-Lavigne, D.; Tweedie, C. E.
ARMAP 3D allows users to dynamically interact with information about U.S. federally funded research projects in the Arctic. This virtual globe allows users to explore data maintained in the Arctic Research & Logistics Support System (ARLSS) database, providing a very valuable visual tool for science management and logistical planning, ascertaining who is doing what type of research and where. Users can “fly to” study sites, view receding glaciers in 3D and access linked reports about specific projects. Custom “Search” tasks have been developed to query by researcher name, discipline, funding program, place names and year, and to display results on the globe with links to detailed reports. ARMAP 3D was created with ESRI’s free ArcGIS Explorer (AGX) build 900, updating the application from build 500. AGX applications give users the ability to integrate their own spatial data with various data layers provided by ArcOnline (http://resources.esri.com/arcgisonlineservices). Users can add many types of data, including OGC web services, without any special data translators or costly software. ARMAP 3D is part of the ARMAP suite (http://armap.org), a collection of applications that support Arctic science tools for users of various levels of technical ability to explore information about field-based research in the Arctic. ARMAP is funded by the National Science Foundation Office of Polar Programs Arctic Sciences Division and is a collaborative development effort between the Systems Ecology Lab at the University of Texas at El Paso, Nuna Technologies, the INSTAAR QGIS Laboratory, and CH2M HILL Polar Services.
Maryana, S.; Kurnia, E.; Ruyani, A.
Employee performance assessment, also called a performance review or performance evaluation, is an effort to assess staff performance with the aim of increasing the productivity of employees and companies. This application supports the assessment of employee performance using five criteria: Presence, Quality of Work, Quantity of Work, Discipline, and Teamwork. The system uses the Exponential Comparative Method and Eckenrode weighting. Calculation results are presented as graphs so that the assessment of each employee can be seen. The system was developed in Notepad++ with a MySQL database. Testing showed that the application corresponds to its design and runs properly. The tests conducted were structural, functional, and validation tests, together with sensitivity analysis and SUMI testing.
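The two scoring techniques named above can be sketched briefly. This is one common formulation of each (the weights and employee data below are hypothetical, not from the paper):

```python
def eckenrode_weights(ranks):
    """Eckenrode weighting: convert criterion ranks (1 = most
    important) into normalized weights proportional to n - rank + 1.
    One common formulation; variants exist."""
    n = len(ranks)
    scores = [n - r + 1 for r in ranks]
    total = sum(scores)
    return [s / total for s in scores]

def mpe_total(ratings, weights):
    """Exponential Comparative Method (MPE): an alternative's total
    score is the sum over criteria of rating ** weight, where weights
    are usually small positive integers."""
    return sum(r ** w for r, w in zip(ratings, weights))

# Hypothetical integer weights for the five criteria: Presence,
# Quality of Work, Quantity of Work, Discipline, Teamwork.
weights = [3, 3, 2, 2, 1]
scores = {
    "Employee A": [8, 7, 6, 8, 9],
    "Employee B": [6, 8, 7, 7, 8],
}
totals = {name: mpe_total(r, weights) for name, r in scores.items()}
print(max(totals, key=totals.get))  # Employee A
```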
Ries, Kernell G.; Steeves, Peter A.; Guthrie, John D.; Rea, Alan H.; Stewart, David W.
StreamStats is a U.S. Geological Survey Web-based geographic information systems application developed as a tool for water-resources planning and management, engineering design, and other applications. The primary functionality of StreamStats allows users to obtain drainage-basin boundaries, basin characteristics, and streamflow statistics for gaged and ungaged sites. Recently, tools that allow stream-network navigation were added to StreamStats. These tools allow users to select any point along a stream and locate activities upstream and downstream from the selected point, such as streamgaging stations, dams, and point-source discharges, and obtain information about such activities. Users also can obtain stream-reach addresses and estimates of streamflow statistics for the selected points.
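The upstream/downstream navigation described amounts to traversing a stream-network tree that drains to an outlet; a toy sketch (the mapping and names are ours, not StreamStats internals, which operate on the national hydrography network):

```python
from collections import defaultdict

def upstream_of(reaches, start):
    """Find all reaches upstream of `start`, given a mapping
    reach -> its downstream reach (None marks the outlet)."""
    # Invert the downstream mapping to get each reach's tributaries.
    upstream = defaultdict(list)
    for reach, down in reaches.items():
        if down is not None:
            upstream[down].append(reach)
    # Depth-first traversal against the flow direction.
    found, stack = set(), [start]
    while stack:
        node = stack.pop()
        for up in upstream[node]:
            if up not in found:
                found.add(up)
                stack.append(up)
    return found

# Two headwaters A and B join at C, which drains to outlet D.
net = {"A": "C", "B": "C", "C": "D", "D": None}
print(sorted(upstream_of(net, "C")))  # ['A', 'B']
```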
Zai, Adrian; Chung, Jeanhee; Chueh, Henry
Scutwork.com is an online, peer-based residency review system. We report preliminary results of an online survey designed to investigate the impact of Scutwork on the residency application process. Overall, 68% of respondents believe that the reviews influenced their decision-making and 91% would use Scutwork again. These results and others reported below suggest that Scutwork may play a significant role in the residency selection process.
Conde, Miguel Ángel; Aguilar, Diego Alonso Gómez; Del Pozo de Dios, Alberto; Peñalvo, Francisco José García
Owing to the intrinsic relation between modern education and new technologies, it is essential to find new ways to satisfy both sides of modern eLearning platforms: the needs of students and tutors, and the technologies required to support them. Consequently, the possibility of interconnecting the LMS with external applications to enrich and strengthen comprehension of the learning process is one of the principal paths to follow.
Folkerts, M; Graves, Y; Tian, Z; Gu, X; Jia, X; Jiang, S
Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application that is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command-line-based GPU applications to perform MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run a MC dose calculation. The resultant web app is powerful, easy to use, and able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a 'delivered dose' calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.
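The gamma test mentioned combines a dose-difference criterion with a distance-to-agreement (DTA) criterion; a point passes if the minimum combined discrepancy against any reference point is at most 1. A minimal 1D version (criterion values and names are ours; clinical tools evaluate in 3D with interpolation):

```python
import math

def gamma_index_1d(ref, evald, spacing, dta=3.0, dd=0.03, norm=None):
    """Global 1D gamma index between a reference dose profile and an
    evaluated profile on the same grid. `dta` is the
    distance-to-agreement criterion (same units as `spacing`); `dd` is
    the dose-difference criterion as a fraction of the normalization
    dose (default: max of the reference profile)."""
    if norm is None:
        norm = max(ref)
    dose_tol = dd * norm
    gammas = []
    for i, d_ref in enumerate(ref):
        best = math.inf
        for j, d_ev in enumerate(evald):
            dist = (i - j) * spacing
            ddiff = d_ev - d_ref
            best = min(best, (dist / dta) ** 2 + (ddiff / dose_tol) ** 2)
        gammas.append(math.sqrt(best))
    return gammas

# Identical profiles give gamma = 0 everywhere; a pass is gamma <= 1.
profile = [0.0, 50.0, 100.0, 50.0, 0.0]
print(max(gamma_index_1d(profile, profile, spacing=1.0)))  # 0.0
```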
Fung, Alex C. W.; Fu, Frank H. K.; Cheung, W. S.
The experience of a two-year project to promote the use of Web-based teaching and learning at XXX University is presented. A total of 68 courses over two semesters were included in the study. Surveys of students and teachers suggested that the quality of learning and teaching was improved. Over 80% of the students had used Web-based teaching and…
Tsukamoto, Takafumi; Yasunaga, Takuo
Stracquadanio, Giovanni; Yang, Kun; Boeke, Jef D; Bader, Joel S
Synthetic biology has become a widely used technology, and expanding applications in research, education and industry require progress tracking for team-based DNA synthesis projects. Although some vendors are beginning to supply multi-kilobase sequence-verified constructs, synthesis workflows starting with short oligos remain important for cost savings and pedagogical benefit. We developed BioPartsDB as an open source, extendable workflow management system for synthetic biology projects with entry points for oligos and larger DNA constructs and ending with sequence-verified clones.
Alexander, Susan; Hoy, Haley; Maskey, Manil; Conover, Helen; Gamble, John; Fraley, Anne
The knowledge base for healthcare providers working in the field of organ transplantation has grown exponentially. However, the field has no centralized 'space' dedicated to efficient access and sharing of information. The ease of use and portability of mobile applications (apps) make them ideal for subspecialists working in complex healthcare environments. In this article, the authors review the literature related to healthcare technology; describe the development of health-related technology; present their mobile app pilot project assessing the effects of a collaborative mobile app based on a freely available content management framework; and report their findings. They conclude by sharing both lessons learned while completing this project and future directions.
French, Leon; Liu, Po; Marais, Olivia; Koreman, Tianna; Tseng, Lucia; Lai, Artemis; Pavlidis, Paul
We describe the WhiteText project, and its progress towards automatically extracting statements of neuroanatomical connectivity from text. We review progress to date on the three main steps of the project: recognition of brain region mentions, standardization of brain region mentions to neuroanatomical nomenclature, and connectivity statement extraction. We further describe a new version of our manually curated corpus that adds 2,111 connectivity statements from 1,828 additional abstracts. Cross-validation classification within the new corpus replicates results on our original corpus, recalling 67% of connectivity statements at 51% precision. The resulting merged corpus provides 5,208 connectivity statements that can be used to seed species-specific connectivity matrices and to better train automated techniques. Finally, we present a new web application that allows fast interactive browsing of the over 70,000 sentences indexed by the system, as a tool for accessing the data and assisting in further curation. Software and data are freely available at http://www.chibi.ubc.ca/WhiteText/. PMID:26052282
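The recall and precision figures quoted above follow the standard definitions; a minimal sketch, with hypothetical count values chosen only to roughly reproduce the reported 67% recall and 51% precision:

```python
def precision_recall(tp, fp, fn):
    """Precision = TP / (TP + FP); recall = TP / (TP + FN)."""
    return tp / (tp + fp), tp / (tp + fn)

# Hypothetical counts chosen to roughly match the reported 51%/67% figures.
p, r = precision_recall(tp=670, fp=644, fn=330)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.51 recall=0.67
```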
Surfing for Data: A Gathering Trend in Data Storage Is the Use of Web-Based Applications that Make It Easy for Authorized Users to Access Hosted Server Content with Just a Computing Device and Browser
Technology & Learning, 2005
In recent years, the widespread availability of networks and the flexibility of Web browsers have shifted the industry from a client-server model to a Web-based one. In the client-server model of computing, clients run applications locally, with the servers managing storage, printing functions, and network traffic. Because every client is…
WebRTC is enabling a new generation of Telehealth applications and will be an important part of the future of Telehealth. WebRTC allows web applications to control a user's microphone and video camera from the browser. In this viewpoint, the author presents the pros and cons of WebRTC for Telehealth applications. PMID:28293589
Bíl, Michal; Kubeček, Jan; Andrášik, Richard; Bílová, Martina; Sedoník, Jiří
We present a web-map application (www.rupok.cz) designed for visualization of losses caused by natural hazards to the transportation infrastructure. This application is an output of a project in which we analyzed the direct, indirect and network-wide impacts of major natural disasters which have hit the Czech Republic since 1997. When natural disasters hit a road network, the result is often a number of closed road sections. Some roads may be destroyed, but the majority are only closed and can be reopened after a short period of time. While the computation of direct losses (the cost of remedial works) is fairly simple, the evaluation of indirect and network-wide costs is much more difficult. We created a database of road and highway sections interrupted by natural processes, which includes data since 1997 and is automatically updated. 6,828 records concerning interrupted communications located on 2,879 road sections are included in the database for the 1997-2014 time period. Flooding caused 37% of the traffic interruptions, followed by fallen trees (22%), landsliding (5%) and rockfalls (2%). The RUPOK webpage contains information on the probabilities of transportation section interruptions due to natural processes as well as the impacts of possible interruptions. The direct losses are depicted as monetary values per road section unit. The values are calculated on the basis of official tables including the prices for construction works. The indirect losses were calculated on the basis of the best alternative route expenses and the traffic intensities affected by a road section interruption.
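The "best alternative route" costing described above can be sketched as a shortest-path detour computation: the extra distance is the shortest path with the section closed minus the shortest path without it. The toy network and distances below are hypothetical, not RUPOK's data.

```python
import heapq

def shortest(graph, src, dst, blocked=frozenset()):
    """Dijkstra shortest-path length; 'blocked' edges model closed road sections."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            if (u, v) in blocked or (v, u) in blocked:
                continue  # section closed by the disaster
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# Toy network (km): closing section A-B forces traffic onto the longer A-C-B route.
roads = {"A": [("B", 10), ("C", 8)], "B": [("A", 10), ("C", 7)], "C": [("A", 8), ("B", 7)]}
detour = shortest(roads, "A", "B", blocked={("A", "B")}) - shortest(roads, "A", "B")
print(detour)  # 5.0 extra km; multiplied by traffic intensity to estimate indirect loss
```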
Rea, Alan; Skinner, Kenneth D.
The U.S. Geological Survey Hawaii StreamStats application uses an integrated suite of raster and vector geospatial datasets to delineate and characterize watersheds. The geospatial datasets used to delineate and characterize watersheds on the StreamStats website, and the methods used to develop the datasets are described in this report. The datasets for Hawaii were derived primarily from 10 meter resolution National Elevation Dataset (NED) elevation models, and the National Hydrography Dataset (NHD), using a set of procedures designed to enforce the drainage pattern from the NHD into the NED, resulting in an integrated suite of elevation-derived datasets. Additional sources of data used for computing basin characteristics include precipitation, land cover, soil permeability, and elevation-derivative datasets. The report also includes links for metadata and downloads of the geospatial datasets.
Mudunuri, Uma; Stephens, Robert; Bruining, David; Liu, David; Lebeda, Frank J.
This paper outlines botXminer, a publicly available application to search XML-formatted MEDLINE® data in a complete, object-relational schema implemented in Oracle® XML DB. An advantage offered by botXminer is that it can generate quantitative results with certain queries that are not feasible through the Entrez-PubMed® interface. After retrieving citations associated with user-supplied search terms, MEDLINE fields (title, abstract, journal, MeSH® and chemical) and terms (MeSH qualifiers and descriptors, keywords, author, gene symbol and chemical), these citations are grouped and displayed as tabulated or graphic results. This work represents an extension of previous research for integrating these citations with relational systems. botXminer has a user-friendly, intuitive interface that can be freely accessed at . PMID:16845112
Sharma, M. K.; Kumar, Rajeev
WebOS (Web-based operating system) is a new form of operating system. You can use your desktop as a virtual desktop on the web, accessible via a browser, with multiple integrated built-in applications that allow the user to easily manage and organize their data from any location. A desktop on the web can be called a WEBtop. This paper starts with an introduction to WebOS and its benefits. We have reviewed some of the most interesting WebOS platforms available today and provide a detailed description of their features. We have identified some parameters as comparison criteria among them. A technical review is given, along with a research design and future goals for designing better web-based operating systems. The findings of the study conclude this paper.
Chen, Li-Chiou; Tao, Lixin
We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…
Romero, Cristobal; Ventura, Sebastian; Zafra, Amelia; de Bra, Paul
Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender engine is integrated into the AHA! system in…
Habib, Sami; Safar, Maytham
In many web applications, such as the distance learning, the frequency of refreshing multimedia web documents places a heavy burden on the WWW resources. Moreover, the updated web documents may encounter inordinate delays, which make it difficult to retrieve web documents in time. Here, we present an Internet tool called WEBCAP that can schedule…
Fonda, Stephanie J; Kedziora, Richard J; Vigersky, Robert A; Bursell, Sven-Erik
Behaviors carried out by the person with diabetes (e.g., healthy eating, physical activity, judicious use of medication, glucose monitoring, coping and problem-solving, regular clinic visits, etc.) are of central importance in diabetes management. To assist with these behaviors, we developed a prototype PHA for diabetes self-management that was based on User-Centered Design principles and congruent with the anticipatory vision of Project Health Design (PHD). This article presents aspects of the prototype PHA's functionality as conceived under PHD and describes modifications to the PHA now being undertaken under new sponsorship, in response to user feedback and timing tests we have performed. In brief, the prototype Personal Health Application (PHA) receives data on the major diabetes management domains from a Personal Health Record (PHR) and analyzes and provides feedback based on clinically vetted educational content. The information is presented within "gadgets" within a portal-based website. The PHR used for the first implementation was the Common Platform developed by PHD. Key changes include a re-conceptualization of the gadgets by topic areas originally defined by the American Association of Diabetes Educators, a refocusing on low-cost approaches to diabetes monitoring and data entry, and synchronization with a new PHR, Microsoft® HealthVault™.
Itri, Jason N; Kim, Woojin; Scanlon, Mary H
Radiology residency and fellowship training provides a unique opportunity to evaluate trainee performance and determine the impact of various educational interventions. We have developed a simple software application (Orion) using open-source tools to facilitate the identification and monitoring of resident and fellow discrepancies in on-call preliminary reports. Over a 6-month period, 19,200 on-call studies were interpreted by 20 radiology residents, and 13,953 on-call studies were interpreted by 25 board-certified radiology fellows representing eight subspecialties. Using standard review macros during faculty interpretation, each of these reports was classified as "agreement", "minor discrepancy", and "major discrepancy" based on the potential to impact patient management or outcome. Major discrepancy rates were used to establish benchmarks for resident and fellow performance by year of training, modality, and subspecialty, and to identify residents and fellows demonstrating a significantly higher major discrepancy rate compared with their classmates. Trends in discrepancies were used to identify subspecialty-specific areas of increased major discrepancy rates in an effort to tailor the didactic and case-based curriculum. A series of missed-case conferences were developed based on trends in discrepancies, and the impact of these conferences is currently being evaluated. Orion is a powerful information technology tool that can be used by residency program directors, fellowship programs directors, residents, and fellows to improve radiology education and training.
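A simplified screen for readers with a significantly elevated major-discrepancy rate might look like the following. The one-sample binomial z-score test and the counts below are illustrative assumptions, not Orion's actual statistics.

```python
import math

def flag_outliers(counts, z_cut=2.0):
    """counts: {name: (major_discrepancies, total_reports)}.

    Flags readers whose major-discrepancy rate is significantly above the
    pooled rate, using a one-sample binomial z-score (a simplified screen).
    """
    pooled = sum(m for m, n in counts.values()) / sum(n for m, n in counts.values())
    flagged = []
    for name, (m, n) in counts.items():
        se = math.sqrt(pooled * (1 - pooled) / n)  # std. error under the pooled rate
        if se > 0 and (m / n - pooled) / se > z_cut:
            flagged.append(name)
    return flagged

# Hypothetical residents: res3's 6% major-discrepancy rate stands out.
readers = {"res1": (20, 1000), "res2": (25, 1000), "res3": (60, 1000)}
print(flag_outliers(readers))  # ['res3']
```

A production tool would likely also stratify by modality and year of training, as the abstract describes for the benchmarks.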
Mannaert, H.; De Gruyter, B.; Adriaenssens, P.
Presents a Web portal for multicast communication management, which provides fully automatic service management with integrated provisioning of hardware equipment. Describes the software architecture, the implementation, and the application usage of the Web portal for multicast delivery. (Author/AEF)
The Web-based Electronic Data Review (WebEDR) application performs automated data evaluation on ERLN electronic data deliverables (EDDs). It uses tests derived from the National Functional Guidelines, combined with method-defined limits, to evaluate the data.
Adams, Samantha A
Blogs, short for "web logs," together with podcasts and wikis are currently important foci of general internet research. These three applications are part of the larger body of next-generation communication applications that comprises "Web 2.0." Within the specific area of health care, however, little attention has been devoted to understanding these technologies and how they are being used by lay health publics. In this article, I will discuss the emergent findings from a new project that looks at blogging interfaces as potential tools for disease prevention and health promotion. I use a literature review combined with "front stage" web analyses of two cases and interviews with the supporting institutions for these sites to discuss the relevant informatics questions that arise with respect to these applications. I further introduce the idea of "goal-oriented" blogging that is found in the first case study. Because this research project is still in preliminary phases, this should be viewed as an exploration into the topic and work in progress. In addition to raising questions, I will outline the important subsequent research steps.
Newman, R. L.; Lindquist, K. G.; Clemesha, A.; Vernon, F. L.
Since April 2004 the EarthScope USArray seismic network has grown to over 400 broadband stations that stream multi-channel data in near real-time to the Array Network Facility in San Diego. Providing secure, yet open, access to real-time and archived data for a broad range of audiences is best served by a series of platform-agnostic, low-latency web-based applications. We present a framework of tools that interface between the world wide web and Boulder Real Time Technologies Antelope Environmental Monitoring System data acquisition and archival software. These tools provide audiences ranging from network operators and geoscience researchers, to funding agencies and the general public, with comprehensive information about the experiment. This ranges from network-wide to station-specific metadata, state-of-health metrics, event detection rates, archival data and dynamic report generation over a station's two-year life span. Leveraging open-source web-site development frameworks for both the server side (Perl, Python and PHP) and client side (Flickr, Google Maps/Earth and jQuery) facilitates the development of a robust, extensible architecture that can be tailored on a per-user basis, with rapid prototyping and development that adheres to web standards.
Suftin, I.; Read, J. S.; Walker, J.
Scientists prefer not having to be tied down to a specific machine or operating system in order to analyze local and remote data sets or publish work. Increasingly, analysis has been migrating to decentralized web services and data sets, using web clients to provide the analysis interface. While simplifying workflow access, analysis, and publishing of data, the move does bring with it its own unique set of issues. Web clients used for analysis typically offer workflows geared towards a single user, with steps and results that are often difficult to recreate and share with others. Furthermore, workflow results often may not be easily used as input for further analysis. Older browsers further complicate things by having no way to maintain larger chunks of information, often offloading the job of storage to the back-end server or trying to squeeze it into a cookie. It has been difficult to provide a concept of "session storage" or "workflow sharing" without a complex orchestration of the back-end for storage depending on either a centralized file system or database. With the advent of HTML5, browsers gained the ability to store more information through the use of the Web Storage API (a browser-cookie holds a maximum of 4 kilobytes). Web Storage gives us the ability to store megabytes of arbitrary data in-browser either with an expiration date or just for a session. This allows scientists to create, update, persist and share their workflow without depending on the backend to store session information, providing the flexibility for new web-based workflows to emerge. In the DSASWeb portal ( http://cida.usgs.gov/DSASweb/ ), using these techniques, the representation of every step in the analyst's workflow is stored as plain-text serialized JSON, which we can generate as a text file and provide to the analyst as an upload. This file may then be shared with others and loaded back into the application, restoring the application to the state it was in when the session file
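The plain-text JSON session format described for DSASweb can be sketched as a simple round trip: serialize the workflow steps, hand the text to the analyst, and restore state on re-upload. The step names and fields below are hypothetical, not DSASweb's actual schema.

```python
import json

# Illustrative round-trip of a workflow session: each step is a plain dict,
# and the whole list serializes to the text file the analyst can share.
workflow = [
    {"step": "upload", "file": "shoreline.shp"},
    {"step": "cast_transects", "spacing_m": 50},
    {"step": "calculate_rates", "stat": "LRR"},
]
session_text = json.dumps(workflow)   # what goes into Web Storage / the download
restored = json.loads(session_text)   # re-loading the file restores the same state
print(restored == workflow)  # True
```

Because the serialized form is plain text, it fits naturally into the HTML5 Web Storage API on the client, with no server-side session store required.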
Millard, W. David; Stoops, LaMar R.; Dorow, Kevin E.
Web Operational Status Boards (WebOSB) is a web-based application designed to acquire, display, and update highly dynamic status information between multiple users and jurisdictions. WebOSB disseminates real-time status information and supports the timely sharing of information, with constant, dynamic updates via personal computers and the Internet, between emergency operations centers (EOCs), incident command centers, and users outside the EOC who need the information (hospitals, shelters, schools). The WebOSB application far exceeds outdated information-sharing methods used by emergency workers: whiteboards, Word and Excel documents, or even locality-specific Web sites. WebOSB's capabilities include the following elements: - Secure access. Multiple users can access information on WebOSB from any personal computer with Internet access and a secure ID. Privileges are used to control access and distribution of status information and to identify users who are authorized to add or edit information. - Simultaneous update. WebOSB provides options for users to add, display, and update dynamic information simultaneously at all locations involved in the emergency management effort. A single status board can be updated from multiple locations, enabling shelters and hospitals to post bed availability or list decontamination capability. - On-the-fly modification. Allowing the definition of an existing status board to be modified on the fly can be an asset during an emergency, where information requirements can change quickly. The status board designer feature allows an administrator to quickly define, modify, add to, and implement new status boards in minutes without needing the help of Web designers and computer programmers. - Publisher/subscriber notification. As a subscriber, each user automatically receives notification of any new information relating to specific status boards.
Development of a Web-Based Clinical Decision Support System for Drug Prescription: Non-Interventional Naturalistic Description of the Antipsychotic Prescription Patterns in 4345 Outpatients and Future Applications
Berrouiguet, Sofian; Barrigón, Maria Luisa; Brandt, Sara A.; Ovejero-García, Santiago; Álvarez-García, Raquel; Carballo, Juan Jose; Lenca, Philippe; Courtet, Philippe; Baca-García, Enrique
Purpose The emergence of electronic prescribing devices with clinical decision support systems (CDSS) can significantly improve the management of pharmacological treatments. We developed a web application available on smartphones in order to help clinicians monitor prescriptions and, in the future, to propose CDSS. Method A web application (www.MEmind.net) was developed to assess patients and collect data regarding gender, age, diagnosis and treatment. We analyzed antipsychotic prescriptions in 4345 patients attended at five Psychiatric Community Mental Health Centers from June 2014 to October 2014. The web application reported the average daily dose prescribed for antipsychotics, the prescribed daily dose (PDD), and the PDD to defined daily dose (DDD) ratio. Results The MEmind web application reported that antipsychotics were used in 1116 patients out of the total sample, mostly in 486 (44%) patients with schizophrenia-related disorders but also in other diagnoses. Second-generation antipsychotics (quetiapine, aripiprazole and long-acting paliperidone) were preferably employed. Low doses were used more frequently than high doses. Long-acting paliperidone and ziprasidone, however, were the only two antipsychotics used at excessive dosing. Antipsychotic polypharmacy was used in 287 (26%) patients, with classic depot drugs, clotiapine, amisulpride and clozapine. Conclusions In this study we describe the first step of the development of a web application that makes polypharmacy, high-dose usage and off-label usage of antipsychotics visible to clinicians. Ongoing development of the MEmind web application may help to improve prescription safety via immediate feedback on prescriptions and a clinical decision support system. PMID:27764107
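The PDD-to-DDD ratio used above is a standard drug-utilization measure: the average dose actually prescribed, divided by the WHO-defined daily dose for the drug. A minimal sketch; the dispensing figures are hypothetical, though 400 mg is the WHO DDD for oral quetiapine.

```python
def pdd_to_ddd_ratio(total_mg_dispensed, n_patients, ddd_mg):
    """PDD = average prescribed daily dose across patients; the PDD/DDD
    ratio compares real-world dosing against the WHO defined daily dose."""
    pdd = total_mg_dispensed / n_patients
    return pdd, pdd / ddd_mg

# Hypothetical example: quetiapine, WHO DDD = 400 mg/day.
pdd, ratio = pdd_to_ddd_ratio(total_mg_dispensed=30000, n_patients=100, ddd_mg=400)
print(pdd, round(ratio, 2))  # 300.0 0.75 -> ratio < 1 indicates low-dose use
```

A ratio well above 1, as reported for long-acting paliperidone and ziprasidone, flags dosing in excess of the defined daily dose.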
Zhang, Hui; Teng, Yun; Doan, Tra Thi Thanh; Yat, Yun Wei; Chan, Sheot Harn; Kelly, Barry C
Studies of trophodynamics and contaminant bioaccumulation in tropical marine ecosystems are limited. This study employed stable isotope and trace contaminant analysis to assess sources of primary productivity, trophic interactions and chemical bioaccumulation behavior in two mangrove food webs and one offshore coastal marine food web in Singapore. Samples of sediment, phytoplankton, mangrove leaves, clams, snails, crabs, worms, prawns, and fishes were analyzed for stable carbon and nitrogen isotope values, as well as concentrations of persistent organic pollutants. In the mangrove food webs, consumers exhibited similar δ(13) C values, likely due to the well-mixed nature of these systems. However, the two primary consumers (common nerite and rodong snail) exhibited distinct δ(13) C values (-21.6 ‰ vs -17.7 ‰), indicating different carbon sources. Fish from Singapore Strait exhibited similar δ(13) C values, indicating common carbon sources in this offshore marine food web. The highest trophic level (TL) was determined as glass perchlet (TL = 3.3) and tilapia (TL = 3.4) in the two mangrove food webs and grunter (TL = 3.7) in the Singapore Strait food web. PCB 153 and p, p'-DDE concentrations ranged from 0.9 to 84.6 ng/g lipid wt and from < 0.2 to 267.4 ng/g lipid wt, respectively. The trophic magnification factors (TMFs) of PCB 153 and p, p'-DDE ranged were between 1.63 and 4.62, indicating biomagnification in these tropical marine food webs. The findings provide important information that will aid future chemical bioaccumulation assessment initiatives. This article is protected by copyright. All rights reserved.
The Automated Geospatial Watershed Assessment (AGWA) tool is a desktop application that uses widely available standardized spatial datasets to derive inputs for multi-scale hydrologic models (Miller et al., 2007). The required data sets include topography (DEM data), soils, climate, and land-cover ...
Okamoto, Shusuke; Kamada, Masaru; Yonekura, Tatsuhiro
Hopkins, R. H.
A program to develop the technology of the silicon dendritic web ribbon growth process is examined. The effort is being concentrated on the area rate and quality requirements necessary to meet the JPL/DOE goals for terrestrial PV applications. Closed loop web growth system development and stress reduction for high area rate growth is considered.
Li, Lixin; Zhou, Xiaolu; Kalo, Marc; Piltner, Reinhard
Appropriate spatiotemporal interpolation is critical to the assessment of relationships between environmental exposures and health outcomes. A powerful assessment of human exposure to environmental agents would incorporate spatial and temporal dimensions simultaneously. This paper compares shape function (SF)-based and inverse distance weighting (IDW)-based spatiotemporal interpolation methods on a data set of PM2.5 data in the contiguous U.S. Particle pollution, also known as particulate matter (PM), is composed of microscopic solids or liquid droplets that are so small that they can get deep into the lungs and cause serious health problems. PM2.5 refers to particles with a mean aerodynamic diameter less than or equal to 2.5 micrometers. Based on the error statistics results of k-fold cross validation, the SF-based method performed better overall than the IDW-based method. The interpolation results generated by the SF-based method are combined with population data to estimate the population exposure to PM2.5 in the contiguous U.S. We investigated the seasonal variations, identified areas where annual and daily PM2.5 were above the standards, and calculated the population size in these areas. Finally, a web application is developed to interpolate and visualize in real time the spatiotemporal variation of ambient air pollution across the contiguous U.S. using air pollution data from the U.S. Environmental Protection Agency (EPA)’s AirNow program. PMID:27463722
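The IDW baseline that the shape-function method is compared against can be sketched in its plain spatial form; a spatiotemporal version simply adds time as an extra coordinate. The station coordinates and PM2.5 values below are toy data, not EPA measurements.

```python
def idw(sample_points, query, power=2.0):
    """Inverse-distance-weighted estimate at `query` from (x, y, value) samples."""
    num = den = 0.0
    for x, y, v in sample_points:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0:
            return v  # query coincides with a monitoring station
        w = 1.0 / d2 ** (power / 2)  # weight falls off as distance^-power
        num += w * v
        den += w
    return num / den

# Toy PM2.5 monitors (ug/m3); estimate at the midpoint of two equal-distance stations.
stations = [(0.0, 0.0, 10.0), (2.0, 0.0, 20.0)]
print(idw(stations, (1.0, 0.0)))  # 15.0 (equidistant -> simple average)
```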
Rushley, Stephanie; Carter, Matthew; Chiou, Charles; Farmer, Richard; Haywood, Kevin; Pototzky, Anthony, Jr.; White, Adam; Winker, Daniel
Colombia is a country with highly variable terrain, from the Andes Mountains to plains and coastal areas, many of these areas are prone to flooding disasters. To identify these risk areas NASA's Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) was used to construct a digital elevation model (DEM) for the study region. The preliminary risk assessment was applied to a pilot study area, the La Mosca River basin. Precipitation data from the National Aeronautics and Space Administration (NASA) Tropical Rainfall Measuring Mission (TRMM)'s near-real-time rainfall products as well as precipitation data from the Instituto de Hidrologia, Meteorologia y Estudios Ambientales (the Institute of Hydrology, Meteorology and Environmental Studies, IDEAM) and stations in the La Mosca River Basin were used to create rainfall distribution maps for the region. Using the precipitation data and the ASTER DEM, the web application, Mi Pronóstico, run by IDEAM, was updated to include an interactive map which currently allows users to search for a location and view the vulnerability and current weather and flooding conditions. The geospatial information was linked to an early warning system in Mi Pronóstico that can alert the public of flood warnings and identify locations of nearby shelters.
Roman, J. H.
As programmers, we have worked with many application programming interface (API) development kits. They are well suited for interaction with a particular system. A vast source of information can be made accessible by using the HTTP protocol through the web as an API. This setup has many advantages, including the vast knowledge available on setting up web servers and services. Also, these tools are available on most hardware and operating system combinations. In this paper I will cover the various types of systems that can be developed this way, their advantages, and some drawbacks of this approach. Index Terms--Application Programmer Interface, Distributed applications, Hyper Text Transfer Protocol, Web.
Delin, Kevin A. (Inventor); Jackson, Shannon P. (Inventor)
A Sensor Web formed of a number of different sensor pods. Each of the sensor pods includes a clock which is synchronized with a master clock so that all of the sensor pods in the Web have a synchronized clock. The synchronization is carried out by first using a coarse synchronization, which takes less power, and subsequently carrying out a fine synchronization to make a fine sync of all the pods on the Web. After the synchronization, the pods ping their neighbors to determine which pods are listening and responding, and then listen only during time slots corresponding to those pods which respond.
Gao, Jerry Z.; Zhu, Eugene; Shim, Simon
With the increasing applications of the Web in e-commerce, advertising, and publication, new technologies are needed to improve Web graphics due to the limitations of current technology. The SVG (Scalable Vector Graphics) technology is a new, revolutionary solution to overcome the existing problems in current web technology. It provides precise and high-resolution web graphics using plain-text-format commands. It sets a new standard for web graphic formats, allowing us to present complicated graphics with rich text fonts and colors, high printing quality, and dynamic layout capabilities. This paper provides a tutorial overview of SVG technology and its essential features, capabilities, and advantages. The paper also reports a comparison study between SVG and other web graphics technologies.
As more and more data and information becomes available on the Web, new technologies that use explicit semantics for information organization are becoming desirable. New terms such as Linked Data, Semantic Web and Web 3.0 are used more and more, although there is increasing confusion as to what each means. In this talk, I will describe how different sorts of models can be used to link data in different ways. I will particularly explore different kinds of Web applications, from Enterprise Data Integration to Web 3.0 startups, government data release, the different needs of Web 2.0 and 3.0, the growing interest in “semantic search”, and the underlying technologies that power these new approaches.
Highfield, Linda; Ottenweller, Cecelia; Pfanz, Andre; Hanks, Jeanne
Nunes, David; Tran, Thanh-Dien; Raposo, Duarte; Pinto, André; Gomes, André; Silva, Jorge Sá
As the Internet has evolved, social networks (such as Facebook) have bloomed and brought together an astonishing number of users. Mashing up mobile phones and sensors with these social environments enables the creation of people-centric sensing systems which have great potential for expanding our current social networking usage. However, such systems also have many associated technical challenges, such as privacy concerns, activity detection mechanisms or intermittent connectivity, as well as limitations due to the heterogeneity of sensor nodes and networks. Considering the openness of the Web 2.0, good technical solutions for these cases consist of frameworks that expose sensing data and functionalities as common Web services. This paper presents our RESTful Web Service-based model for people-centric sensing frameworks, which uses sensors and mobile phones to detect users' activities and locations, sharing this information amongst the user's friends within a social networking site. We also present some screenshot results of our experimental prototype.
Van Neste, Christophe; Gansemans, Yannick; De Coninck, Dieter; Van Hoofstat, David; Van Criekinge, Wim; Deforce, Dieter; Van Nieuwerburgh, Filip
Routine use of massively parallel sequencing (MPS) for forensic genomics is on the horizon. Over the last few years, several algorithms and workflows have been developed to analyze forensic MPS data. However, none have yet been tailored to the needs of the forensic analyst who does not possess an extensive bioinformatics background. We developed our previously published forensic MPS data analysis framework MyFLq (My-Forensic-Loci-queries) into an open-source, user-friendly, web-based application. It can be installed as a standalone web application, or run directly from the Illumina BaseSpace environment. In the former, laboratories can keep their data on-site, while in the latter, data from forensic samples that are sequenced on an Illumina sequencer can be uploaded to BaseSpace during acquisition and subsequently analyzed using the published MyFLq BaseSpace application. Additional features were implemented, such as an interactive graphical report of the results, an interactive threshold selection bar, and an allele length-based analysis in addition to the sequence-based analysis. Practical use of the application is demonstrated through the analysis of four 16-plex short tandem repeat (STR) samples, showing the complementarity between the sequence- and length-based analyses of the same MPS data.
Fenstermacher, Kurt D.; Ginsburg, Mark
Discusses mining Web data to draw conclusions about Web users and proposes a client-side monitoring system that supports flexible data collection and encompasses client-side applications beyond the Web browser to incorporate standard office productivity tools. Highlights include goals for client-side monitoring; framework for user monitoring,…
Washington Univ., Seattle.
This brief paper considers the application of "universal design" principles to Web page design in order to increase accessibility for people with disabilities. Suggestions are based on the World Wide Web Consortium's accessibility initiative, which has proposed guidelines for all Web authors and federal government standards. Seven guidelines for…
Web 2.0 applications are changing how educators interact both with each other and with their students. Educators can use these new Web tools daily to create, share, socialize, and collaborate with students, colleagues, and newly developed network contacts. School librarians are finding that Web 2.0 tools are bringing them more ways to embrace and…
Saloranta, Tuomo M; Andersen, Tom; Naes, Kristoffer
Rate-constant bioaccumulation models are applied to simulate the flow of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) in the coastal marine food web of Frierfjorden, a contaminated fjord in southern Norway. We apply two different approaches to parameterizing the rate constants in the model, global sensitivity analysis of the models using the Extended Fourier Amplitude Sensitivity Test (Extended FAST) method, as well as results from general linear system theory, in order to obtain more thorough insight into the system's behavior and the flow pathways of the PCDD/Fs. We calibrate our models against observed body concentrations of PCDD/Fs in the food web of Frierfjorden. Differences between the predictions from the two models (using the same forcing and parameter values) are of the same magnitude as their individual deviations from observations, and the models can be said to perform about equally well in our case. Sensitivity analysis indicates that the success or failure of the models in predicting the PCDD/F concentrations in the food web organisms depends strongly on adequate estimation of the truly dissolved concentrations in water and sediment pore water. We discuss the pros and cons of such models in understanding and estimating the present and future concentrations and bioaccumulation of persistent organic pollutants in aquatic food webs.
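The general form of a rate-constant bioaccumulation model of the kind used above can be sketched for a single compartment: uptake from water minus first-order elimination, integrated over time. The rate constants and concentrations here are illustrative, not the paper's calibrated values.

```python
# Hedged sketch of a one-compartment rate-constant model:
#   dC/dt = k_uptake * C_water - k_elim * C,  C(0) = 0
# integrated with a simple forward-Euler scheme.
def simulate(c_water, k_uptake, k_elim, days, dt=0.1):
    """Euler-integrate body concentration C over `days` (per-day rates)."""
    c = 0.0
    steps = int(days / dt)
    for _ in range(steps):
        c += dt * (k_uptake * c_water - k_elim * c)
    return c

# The solution approaches the steady state k_uptake * C_water / k_elim.
c_final = simulate(c_water=1.0, k_uptake=0.05, k_elim=0.1, days=365)
print(round(c_final, 3))
```

With k_uptake = 0.05 and k_elim = 0.1 per day, the steady-state concentration is 0.5 times the water concentration, which the simulation approaches after about a year; food-web models chain many such compartments together with dietary-uptake terms.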
Li, Boren; Wu, Jianping; Pan, Mao; Huang, Jing
In hazard management, earthquake researchers have utilized GIS to ease the process of managing disasters, and WebGIS is used to assess hazards and seismic risk. Although such systems can provide a visual analysis platform based on GIS technology, they lack a general approach to extending WebGIS for processing dynamic data, especially real-time data. In this paper, we propose a novel real-time 3D visual earthquake information publishing model based on WebGIS and a digital globe to improve the ability of WebGIS-based systems to process real-time data. On the basis of the model, we implement a real-time 3D earthquake information publishing system, EqMap3D. The system can not only publish real-time earthquake information but also display these data and their background geoscience information in a 3D scene. It provides a powerful tool for display, analysis, and decision-making for researchers and administrators, and facilitates better communication between researchers engaged in the geosciences and the interested public.
Montrieux, Hannelore; Vangestel, Sandra; Raes, Annelies; Matthys, Paul; Schellens, Tammy
Blended learning as an instructional approach is getting more attention in the educational landscape and has been researched thoroughly. Yet, this study reports the results of an innovation project aiming to gain insight into three different scenarios of applying web-based lectures: as preparation for face-to-face practical exercises, as a…
With the increasing wealth of information on the Web, information integration is ubiquitous as the same real-world entity may appear in a variety of forms extracted from different sources. This dissertation proposes supervised and unsupervised algorithms that are naturally integrated in a scalable framework to solve the entity resolution problem,…
Aydın, Eda Akman; Bay, Ömer Faruk; Güler, İnan
Brain Computer Interface (BCI) based environment control systems could ease the lives of people with neuromuscular diseases, reduce dependence on their caregivers, and improve their quality of life. In addition to ease of use, low cost, and robust system performance, mobility is an important functionality expected from a practical BCI system in real life. In this study, in order to enhance users' mobility, we propose internet-based wireless communication between the BCI system and the home environment. We designed and implemented a prototype of an embedded low-cost, low-power, easy-to-use web server, which is employed in internet-based wireless control of a BCI-based home environment. The embedded web server provides remote access to the environmental control module through BCI and web interfaces. While the proposed system offers BCI users enhanced mobility, it also provides remote control of the home environment by caregivers, as well as by individuals in the initial stages of neuromuscular disease. The input of the BCI system is P300 potentials. We used the Region Based Paradigm (RBP) as the stimulus interface. Performance of the BCI system was evaluated on data recorded from 8 non-disabled subjects. The experimental results indicate that the proposed web server successfully enables internet-based wireless control of electrical home appliances through BCIs.
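The web-based appliance control described above can be sketched as a tiny request router that maps incoming web requests to appliance state changes. The URL layout, appliance names, and handler are hypothetical, not the paper's actual interface.

```python
# Hedged sketch: the embedded web server's control endpoint, reduced to
# a pure routing function mapping '/control/<appliance>/<on|off>' paths
# to state updates. A BCI front end or a caregiver's browser would issue
# these requests over HTTP.
APPLIANCES = {"lamp": "off", "tv": "off", "fan": "off"}

def handle_request(path):
    """Interpret a control path and update appliance state."""
    parts = path.strip("/").split("/")
    if len(parts) != 3 or parts[0] != "control":
        return 400, "bad request"
    name, state = parts[1], parts[2]
    if name not in APPLIANCES or state not in ("on", "off"):
        return 404, "unknown appliance or state"
    APPLIANCES[name] = state
    return 200, f"{name} is now {state}"

status, msg = handle_request("/control/lamp/on")
print(status, msg)
```

Keeping the control logic behind a single uniform endpoint is what lets the same home environment be driven both by the P300-based BCI and by an ordinary web interface.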
This February 2003 document contains a diagram of dates and events for compliance with the NESHAP for Paper and Other Web Coating. Also on this page is an April 2004 flow chart to determine if the NESHAP applies to your facility.
Barsoum, Emad; Kuester, Falko
The pervasive nature of web-based content has led to the development of applications and user interfaces that port across a broad range of operating systems and databases, while providing intuitive access to static and time-varying information. However, the integration of this vast resource into virtual environments has remained elusive. In this paper we present an implementation of a 3D Web Browser (WebVR) that enables the user to search the internet for arbitrary information and to seamlessly augment virtual environments with this information. WebVR provides access to the standard data input and query mechanisms offered by conventional web browsers, with the difference that it generates active texture-skins of the web content that can be mapped onto arbitrary surfaces within the environment. Once mapped, the corresponding texture functions as a fully integrated web browser that responds to traditional events such as the selection of links or text input. As a result, any surface within the environment can be turned into a web-enabled resource that provides access to user-definable data. In order to leverage the continuous advancement of browser technology and to support both static and streamed content, WebVR uses ActiveX controls to extract the desired texture skin from industry-strength browsers, providing a unique mechanism for data fusion and extensibility.
Ross, A.; Stackhouse, P. W.; Tisdale, B.; Tisdale, M.; Chandler, W.; Hoell, J. M., Jr.; Kusterer, J.
The NASA Langley Research Center Science Directorate and Atmospheric Science Data Center have initiated a pilot program to utilize Geographic Information System (GIS) tools that enable, generate and store climatological averages using spatial queries and calculations in a spatial database, resulting in greater accessibility of data for government agencies, industry, and private-sector individuals. The major objectives of this effort include 1) processing and reformulating current data to be consistent with ESRI and OpenGIS tools; 2) developing functions that improve capability and analysis and produce "on-the-fly" data products, extending these past the single location to regional and global scales; 3) updating the current web sites to enable both web-based and mobile application displays, optimized for mobile platforms; 4) interacting with user communities in government and industry to test formats and usage; and 5) developing a series of metrics that allow for monitoring of progressive performance. Significant project results will include the development of Open Geospatial Consortium (OGC) compliant web services (WMS, WCS, WFS, WPS) that serve renewable energy and agricultural application products to users using GIS software and tools. Each data product and OGC service will be registered within ECHO, the Common Metadata Repository, the Geospatial Platform, and Data.gov to ensure the data are easily discoverable and to provide data users with enhanced access to SSE data, parameters, services, and applications. This effort supports cross-agency, cross-organization interoperability of SSE data products and services by collaborating with DOI, NRCan, NREL, NCAR, and HOMER for requirements vetting and test-bed users before making the services available to the wider public.
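A client would reach OGC-compliant services like those above through standardized requests. The sketch below builds a WMS GetMap URL; the parameter names follow the OGC WMS 1.3.0 specification, while the endpoint and layer name are hypothetical.

```python
from urllib.parse import urlencode

# Hedged sketch: assembling a WMS 1.3.0 GetMap request, the kind of
# standardized query any GIS client could send to an OGC web map service.
def getmap_url(endpoint, layer, bbox, width=512, height=512):
    """Build a GetMap URL for one layer over a (min,min,max,max) bbox."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return f"{endpoint}?{urlencode(params)}"

url = getmap_url("https://example.gov/sse/wms", "insolation_monthly",
                 (-90, -180, 90, 180))
print(url)
```

Because the request grammar is standardized, the same client code works against any compliant server, which is the interoperability the registration in ECHO, the Geospatial Platform, and Data.gov is meant to exploit.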
Bocci, M.; Brigolin, D.; Pranovi, F.; Najih, M.; Nachite, D.; Pastres, R.
This work applies food web models to the Lagoon of Nador (Morocco) and subsequently estimates ecosystem indices. This effort supports the evaluation of ecosystem status and the implementation of the Ecosystem Approach (EcAp), endorsed by the contracting parties of the Barcelona Convention for the Mediterranean Sea. The Lagoon of Nador, on the Mediterranean coast of Morocco, suffered from eutrophication during recent decades. We used indices derived from Ecological Network Analysis to investigate the most relevant features of ecosystem functioning in the decade 2000-2010 (present scenario) and to compare them with those of the 1980s (past scenario). As the lagoon includes different habitats, the methodology was applied to each of them, in order to assess their contribution to the functioning of the whole ecosystem. Results highlighted an increase in Total System Throughput (TST) in the present scenario compared with the past one, associated with an increase in Total Respiration (TR) and in the ratio of Total Primary Production to Total Respiration (TPP/TR). Under the present scenario the Nador lagoon shows decreased cycling efficiency. The sensitivity analysis highlighted the capability of TST and the Comprehensive Cycling Index (CCI) to detect changes, in agreement with other recent studies on responses of food web functioning to eutrophication. The results are discussed with respect to three specific aspects related to the application of food web models and Ecological Network Analysis in the EcAp context: i) data availability; ii) spatialization of indicators; iii) the selected set of indicators. The results also highlight the important role of sensitivity/uncertainty analysis when implementing food web models in data-scarce systems.
Derriere, S.; Boch, T.
Bröring, Arne; Echterhoff, Johannes; Jirka, Simon; Simonis, Ingo; Everding, Thomas; Stasch, Christoph; Liang, Steve; Lemmens, Rob
Many sensor networks have been deployed to monitor Earth's environment, and more will follow in the future. Environmental sensors have improved continuously, becoming smaller, cheaper, and more intelligent. Due to the large number of sensor manufacturers and their differing protocols, integrating diverse sensors into observation systems is not straightforward. A coherent infrastructure is needed to treat sensors in an interoperable, platform-independent, and uniform way. The concept of the Sensor Web provides such an infrastructure for sharing, finding, and accessing sensors and their data across different applications. It hides the heterogeneous sensor hardware and communication protocols from the applications built on top of it. The Sensor Web Enablement initiative of the Open Geospatial Consortium standardizes web service interfaces and data encodings that can be used as building blocks for a Sensor Web. This article illustrates and analyzes the recent developments of the new generation of the Sensor Web Enablement specification framework. Further, we relate the Sensor Web to other emerging concepts such as the Web of Things and point out challenges and resulting future work topics for research on Sensor Web Enablement. PMID:22163760
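The core Sensor Web idea above, hiding heterogeneous hardware behind one uniform observation interface, can be sketched with a simple adapter pattern. The adapter classes, vendor protocols, and unit code are illustrative, not part of the actual SWE specifications.

```python
# Hedged sketch: two vendors with incompatible native protocols are
# wrapped so that applications see one uniform observation interface,
# analogous to how the Sensor Web hides sensor heterogeneity.
class SensorAdapter:
    """Uniform interface: every sensor yields (property, value, unit)."""
    def observe(self):
        raise NotImplementedError

class VendorAThermometer(SensorAdapter):
    def _raw_deci_celsius(self):       # vendor A reports tenths of a degree
        return 215
    def observe(self):
        return ("temperature", self._raw_deci_celsius() / 10.0, "Cel")

class VendorBThermometer(SensorAdapter):
    def _raw_fahrenheit(self):         # vendor B reports Fahrenheit
        return 70.7
    def observe(self):
        f = self._raw_fahrenheit()
        return ("temperature", round((f - 32) * 5 / 9, 1), "Cel")

# Applications iterate over sensors without knowing their make.
observations = [s.observe() for s in (VendorAThermometer(), VendorBThermometer())]
print(observations)
```

In the actual SWE framework this uniformity comes from standardized service interfaces and encodings (e.g. observations served through a common web service) rather than in-process adapters, but the separation of concerns is the same.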
Archuleta, Christy-Ann M.; Eames, Deanna R.
The Rio Grande Civil Works and Restoration Projects Web Application, developed by the U.S. Geological Survey in cooperation with the U.S. Army Corps of Engineers (USACE) Albuquerque District, is designed to provide publicly available information through the Internet about civil works and restoration projects in the Rio Grande Basin. Since 1942, USACE Albuquerque District responsibilities have included building facilities for the U.S. Army and U.S. Air Force, providing flood protection, supplying water for power and public recreation, participating in fire remediation, protecting and restoring wetlands and other natural resources, and supporting other government agencies with engineering, contracting, and project management services. In the process of conducting this vast array of engineering work, the need arose for easily tracking the locations of and providing information about projects to stakeholders and the public. This fact sheet introduces a Web application developed to enable users to visualize locations and search for information about USACE (and some other Federal, State, and local) projects in the Rio Grande Basin in southern Colorado, New Mexico, and Texas.