Sample records for web visualisation tools

  1. Uncertainty visualisation in the Model Web

    NASA Astrophysics Data System (ADS)

    Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.

    2012-04-01

    Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only a few tools exist for visualising such data in a standardised way. Furthermore, they are usually realised as thick clients and lack the functionality to handle data coming from web services, as envisaged in the Model Web. We present an interactive web tool for the visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, such as pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations & Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. The uncertainty information considered for the tool comprises probabilistic, quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions, or statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool: (i) adjacent maps showing data and uncertainty separately, and (ii) multidimensional mapping providing different visualisation methods in combination to explore the spatial, temporal and uncertainty distribution of the data. Adjacent maps provide a simpler visualisation for non-experts and a first overview by separating value and uncertainty maps. The multidimensional approach allows a more complex exploration of the data for experts by browsing through the different dimensions. It offers the visualisation of maps, statistics plots and time series in different windows, and sliders to interactively move through time, space and uncertainty (thresholds).
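    A minimal sketch of the "adjacent maps" mode described above, written against the current OpenLayers API rather than the UncertWeb client itself; the element IDs, WMS endpoint and layer names are invented for illustration. Two map instances that share a single View object stay synchronised when the user pans or zooms either one.

    ```javascript
    // Sketch of the "adjacent maps" mode: two OpenLayers maps (values and
    // uncertainty) share one View, so pan/zoom stays synchronised.
    // Element IDs, WMS URL and layer names are hypothetical.
    import Map from 'ol/Map';
    import View from 'ol/View';
    import TileLayer from 'ol/layer/Tile';
    import OSM from 'ol/source/OSM';
    import TileWMS from 'ol/source/TileWMS';

    const sharedView = new View({ center: [0, 0], zoom: 4 });

    function makeMap(targetId, wmsLayerName) {
      return new Map({
        target: targetId,
        view: sharedView, // same View object => synchronised navigation
        layers: [
          new TileLayer({ source: new OSM() }),
          new TileLayer({
            source: new TileWMS({
              url: 'https://example.org/wms', // hypothetical service
              params: { LAYERS: wmsLayerName, TIME: '2012-04-01' },
            }),
          }),
        ],
      });
    }

    makeMap('value-map', 'pm10_mean');         // data values
    makeMap('uncertainty-map', 'pm10_stddev'); // associated uncertainty
    ```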

  2. Visualizing astronomy data using VRML

    NASA Astrophysics Data System (ADS)

    Beeson, Brett; Lancaster, Michael; Barnes, David G.; Bourke, Paul D.; Rixon, Guy T.

    2004-09-01

    Visualisation is a powerful tool for understanding the large data sets typical of astronomical surveys and can reveal unsuspected relationships and anomalous regions of parameter space which may be difficult to find programmatically. Visualisation is a classic information technology for optimising scientific return. We are developing a number of generic on-line visualisation tools as a component of the Australian Virtual Observatory project. The tools will be deployed within the framework of the International Virtual Observatory Alliance (IVOA), and follow agreed-upon standards to make them accessible to other programs and people. We and our IVOA partners plan to utilise new information technologies (such as grid computing and web services) to advance the scientific return of existing and future instrumentation. Here we present a new tool - VOlume - which visualises point data. Visualisation of astronomical data normally requires the local installation of complex software, the downloading of potentially large datasets, and very often time-consuming and tedious data format conversions. VOlume enables the astronomer to visualise data using just a web browser and plug-in. This is achieved using IVOA standards which allow us to pass data between Web Services, Java Servlet Technology and Common Gateway Interface programs. Data from a catalogue server can be streamed in eXtensible Mark-up Language format to a servlet which produces Virtual Reality Modeling Language output. The user selects elements of the catalogue to map to geometry and then visualises the result in a browser plug-in such as Cortona or FreeWRL. Other than requiring an input VOTable format file, VOlume is very general. While its major use will likely be to display and explore astronomical source catalogues, it can easily render other important parameter fields such as the sky and redshift coverage of proposed surveys or the sampling of the visibility plane by a rotation-synthesis interferometer.
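    To make the catalogue-to-geometry step concrete, the following sketch turns parsed catalogue rows (for example, rows extracted from a VOTable) into a VRML97 PointSet of the kind a browser plug-in such as Cortona or FreeWRL can display. It is not VOlume's own servlet code, and the column names and coordinate mapping are assumptions.

    ```javascript
    // Illustrative conversion of catalogue rows into a VRML97 PointSet string.
    // `rows` is assumed to be [{x, y, z}, ...] in some chosen 3D mapping.
    function catalogueToVrml(rows) {
      const points = rows
        .map((r) => `${r.x.toFixed(3)} ${r.y.toFixed(3)} ${r.z.toFixed(3)}`)
        .join(', ');
      return `#VRML V2.0 utf8
    Shape {
      geometry PointSet {
        coord Coordinate { point [ ${points} ] }
      }
    }`;
    }

    // Example: three sources mapped to arbitrary 3D coordinates.
    console.log(catalogueToVrml([
      { x: 1.2, y: 0.4, z: 0.03 },
      { x: 1.5, y: -0.2, z: 0.07 },
      { x: 0.9, y: 0.1, z: 0.05 },
    ]));
    ```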

  3. Web-based visualisation and analysis of 3D electron-microscopy data from EMDB and PDB.

    PubMed

    Lagerstedt, Ingvar; Moore, William J; Patwardhan, Ardan; Sanz-García, Eduardo; Best, Christoph; Swedlow, Jason R; Kleywegt, Gerard J

    2013-11-01

    The Protein Data Bank in Europe (PDBe) has developed web-based tools for the visualisation and analysis of 3D electron microscopy (3DEM) structures in the Electron Microscopy Data Bank (EMDB) and Protein Data Bank (PDB). The tools include: (1) a volume viewer for 3D visualisation of maps, tomograms and models, (2) a slice viewer for inspecting 2D slices of tomographic reconstructions, and (3) visual analysis pages to facilitate analysis and validation of maps, tomograms and models. These tools were designed to help non-experts and experts alike to get some insight into the content and assess the quality of 3DEM structures in EMDB and PDB without the need to install specialised software or to download large amounts of data from these archives. The technical challenges encountered in developing these tools, as well as the more general considerations when making archived data available to the user community through a web interface, are discussed.

  4. EarthServer: Use of Rasdaman as a data store for use in visualisation of complex EO data

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter; Grant, Mike

    2013-04-01

    The European Commission FP7 project EarthServer is establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending cutting-edge Array Database technology. EarthServer is built around the Rasdaman Raster Data Manager, which extends standard relational database systems with the ability to store and retrieve multi-dimensional raster data of unlimited size through an SQL-style query language. Rasdaman facilitates visualisation of data by providing several Open Geospatial Consortium (OGC) standard interfaces through its web services wrapper, Petascope. These include the well-established standards Web Coverage Service (WCS) and Web Map Service (WMS) as well as the emerging standard Web Coverage Processing Service (WCPS). The WCPS standard allows the running of ad-hoc queries on the data stored within Rasdaman, creating an infrastructure where users are not restricted by bandwidth when manipulating or querying huge datasets. Here we will show that the use of EarthServer technologies and infrastructure allows access and visualisation of massive-scale data through a web client with only marginal bandwidth use, as opposed to the current mechanism of copying huge amounts of data to create visualisations locally. For example, if a user wanted to generate a plot of global average chlorophyll for a complete decadal time series, they would only have to download the result instead of terabytes of data. Firstly, we will present a brief overview of the capabilities of Rasdaman and the WCPS query language to introduce the ways in which it is used in a visualisation tool chain. We will show that there are several ways in which WCPS can be utilised to create both standard and novel web-based visualisations. An example of a standard visualisation is the production of traditional 2D plots, allowing users to plot data products easily. However, the query language allows the creation of novel/custom products, which can then immediately be plotted with the same system. For more complex multi-spectral data, WCPS allows the user to explore novel combinations of bands in standard band-ratio algorithms through a web browser with dynamic updating of the resultant image. To visualise very large datasets, Rasdaman has the capability to dynamically scale a dataset or query result so that it can be appraised quickly for use in later unscaled queries. All of these techniques are accessible through a web-based GIS interface, increasing the number of potential users of the system. Lastly, we will show the advances in dynamic web-based 3D visualisations being explored within the EarthServer project. By utilising the emerging declarative 3D web standard X3DOM as a tool to visualise the results of WCPS queries, we introduce several possible benefits, including quick appraisal of data for outliers or anomalous data points and visualisation of the uncertainty of data alongside the actual data values.
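    As an illustration of the "download the result, not the data" point, a WCPS request can ask the server to compute a reduction and return only the encoded answer. The sketch below sends a hypothetical query from the browser; the coverage name (CHL_MONTHLY), the time axis label (ansi) and the Petascope endpoint are assumptions, and the parameter encoding follows the WCS Processing Extension as rasdaman deployments typically expose it.

    ```javascript
    // Illustrative WCPS call: the server averages a decade of chlorophyll data
    // and returns only the result, instead of the client downloading the rasters.
    const endpoint = 'https://example.org/rasdaman/ows'; // hypothetical Petascope instance
    const wcpsQuery =
      'for c in (CHL_MONTHLY) ' +
      'return encode(avg(c[ansi("2003-01-01":"2012-12-31")]), "csv")';

    const params = new URLSearchParams({
      service: 'WCS',
      version: '2.0.1',
      request: 'ProcessCoverages', // WCS Processing Extension request carrying WCPS
      query: wcpsQuery,
    });

    fetch(`${endpoint}?${params}`)
      .then((r) => r.text())
      .then((result) => console.log('decadal mean chlorophyll:', result));
    ```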

  5. KAGLVis - On-line 3D Visualisation of Earth-observing-satellite Data

    NASA Astrophysics Data System (ADS)

    Szuba, Marek; Ameri, Parinaz; Grabowski, Udo; Maatouki, Ahmad; Meyer, Jörg

    2015-04-01

    One of the goals of the Large-Scale Data Management and Analysis project is to provide a high-performance framework facilitating management of data acquired by Earth-observing satellites such as Envisat. On the client-facing side of this framework, we strive to provide a visualisation and basic analysis tool which can be used by scientists with minimal to no knowledge of the underlying infrastructure. Our tool, KAGLVis, is a JavaScript client-server Web application which leverages modern Web technologies to provide three-dimensional visualisation of satellite observables on a wide range of client systems. It takes advantage of the WebGL API to employ locally available GPU power for 3D rendering; this approach has been demonstrated to perform well even on relatively weak hardware, such as the integrated graphics chipsets found in modern laptop computers, and with some user-interface tuning could even be usable on embedded devices such as smartphones or tablets. Data is fetched from the database back-end using a REST API and cached locally, both in memory and using HTML5 Web Storage, to minimise network use. Computations, for instance the calculation of cloud altitude from cloud-index measurements, can be performed on either the client or the server side, depending on the configuration. Keywords: satellite data, Envisat, visualisation, 3D graphics, Web application, WebGL, MEAN stack.
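    The fetch-then-cache pattern described above can be sketched in a few lines of browser JavaScript; the REST endpoint and cache-key scheme below are invented for illustration and are not KAGLVis's actual API.

    ```javascript
    // Try HTML5 Web Storage first; otherwise fetch the granule from a
    // (hypothetical) REST endpoint and cache it for later requests.
    async function getGranule(granuleId) {
      const cacheKey = `granule:${granuleId}`;
      const cached = window.localStorage.getItem(cacheKey);
      if (cached !== null) {
        return JSON.parse(cached); // served from Web Storage, no network use
      }
      const response = await fetch(`/api/v1/granules/${granuleId}`); // hypothetical endpoint
      if (!response.ok) throw new Error(`HTTP ${response.status}`);
      const data = await response.json();
      try {
        window.localStorage.setItem(cacheKey, JSON.stringify(data));
      } catch (e) {
        // Web Storage quota exceeded: the data is still usable, just not cached.
      }
      return data;
    }
    ```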

  6. bioWeb3D: an online webGL 3D data visualisation tool.

    PubMed

    Pettit, Jean-Baptiste; Marioni, John C

    2013-06-07

    Data visualization is critical for interpreting biological data. However, in practice it can prove to be a bottleneck for non-trained researchers; this is especially true for three-dimensional (3D) data representation. Whilst existing software can provide all the necessary functionalities to represent and manipulate biological 3D datasets, very few are easily accessible (browser based), cross-platform and accessible to non-expert users. An online HTML5/WebGL based 3D visualisation tool has been developed to allow biologists to quickly and easily view interactive and customizable three-dimensional representations of their data along with multiple layers of information. Using the WebGL library Three.js, written in JavaScript, bioWeb3D allows the simultaneous visualisation of multiple large datasets input via a simple JSON, XML or CSV file, which can be read and analysed locally thanks to HTML5 capabilities. Using basic 3D representation techniques in a technologically innovative context, we provide a program that is not intended to compete with professional 3D representation software, but that instead enables a quick and intuitive representation of reasonably large 3D datasets.
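    As a rough illustration of the rendering approach (not bioWeb3D's own code), Three.js can turn a parsed JSON dataset of 3D points into a GPU-rendered point cloud in a few lines. The {"points": [[x, y, z], ...]} layout is an assumption, not the tool's actual input format.

    ```javascript
    // Illustrative Three.js point cloud from a parsed dataset {points: [[x,y,z], ...]}.
    import * as THREE from 'three';

    function renderPointCloud(dataset, container) {
      const scene = new THREE.Scene();
      const camera = new THREE.PerspectiveCamera(
        60, container.clientWidth / container.clientHeight, 0.1, 1000);
      camera.position.z = 50;

      // Flatten [[x,y,z], ...] into a typed array the GPU can consume directly.
      const positions = new Float32Array(dataset.points.flat());
      const geometry = new THREE.BufferGeometry();
      geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
      scene.add(new THREE.Points(geometry, new THREE.PointsMaterial({ size: 0.5, color: 0x3388ff })));

      const renderer = new THREE.WebGLRenderer({ antialias: true });
      renderer.setSize(container.clientWidth, container.clientHeight);
      container.appendChild(renderer.domElement);
      renderer.setAnimationLoop(() => renderer.render(scene, camera));
    }
    ```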

  7. A web-based 3D visualisation and assessment system for urban precinct scenario modelling

    NASA Astrophysics Data System (ADS)

    Trubka, Roman; Glackin, Stephen; Lade, Oliver; Pettit, Chris

    2016-07-01

    Recent years have seen an increasing number of spatial tools and technologies for enabling better decision-making in the urban environment. They have largely arisen because of the need for cities to be more efficiently planned to accommodate growing populations while mitigating urban sprawl, and also because of innovations in rendering data in 3D being well suited for visualising the urban built environment. In this paper we review a number of systems that are better known and more commonly used in the field of urban planning. We then introduce Envision Scenario Planner (ESP), a web-based 3D precinct geodesign, visualisation and assessment tool, developed using Agile and Co-design methods. We provide a comprehensive account of the tool, beginning with a discussion of its design and development process and concluding with an example use case and a discussion of the lessons learned in its development.

  8. Using Cesium for 3D Thematic Visualisations on the Web

    NASA Astrophysics Data System (ADS)

    Gede, Mátyás

    2018-05-01

    Cesium (http://cesiumjs.org) is an open source, WebGL-based JavaScript library for virtual globes and 3D maps. It is an excellent tool for 3D thematic visualisations, but to use its full functionality it has to be fed with its own file format, CZML. Unfortunately, this format is not yet supported by any major GIS software. This paper introduces a plugin for QGIS, developed by the author, which facilitates the creation of CZML files for various types of visualisations. The usability of Cesium is also examined in various hardware/software environments.
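    For readers unfamiliar with CZML, a minimal document is simply an array of JSON packets; the sketch below (entity values invented) loads one into a Cesium viewer in the same way a file exported from GIS software would be consumed.

    ```javascript
    // Minimal CZML example: a document packet plus one thematically styled point.
    // All values are invented for illustration.
    const czml = [
      { id: 'document', name: 'thematic-demo', version: '1.0' },
      {
        id: 'budapest',
        name: 'Budapest (sample value: 42)',
        position: { cartographicDegrees: [19.04, 47.5, 0.0] }, // lon, lat, height
        point: { pixelSize: 12, color: { rgba: [200, 40, 40, 255] } },
      },
    ];

    const viewer = new Cesium.Viewer('cesiumContainer');
    Cesium.CzmlDataSource.load(czml).then((dataSource) => {
      viewer.dataSources.add(dataSource);
      viewer.zoomTo(dataSource);
    });
    ```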

  9. bioWeb3D: an online webGL 3D data visualisation tool

    PubMed Central

    2013-01-01

    Background Data visualization is critical for interpreting biological data. However, in practice it can prove to be a bottleneck for non-trained researchers; this is especially true for three-dimensional (3D) data representation. Whilst existing software can provide all the necessary functionalities to represent and manipulate biological 3D datasets, very few are easily accessible (browser based), cross-platform and accessible to non-expert users. Results An online HTML5/WebGL based 3D visualisation tool has been developed to allow biologists to quickly and easily view interactive and customizable three-dimensional representations of their data along with multiple layers of information. Using the WebGL library Three.js, written in JavaScript, bioWeb3D allows the simultaneous visualisation of multiple large datasets input via a simple JSON, XML or CSV file, which can be read and analysed locally thanks to HTML5 capabilities. Conclusions Using basic 3D representation techniques in a technologically innovative context, we provide a program that is not intended to compete with professional 3D representation software, but that instead enables a quick and intuitive representation of reasonably large 3D datasets. PMID:23758781

  10. A Tropical Marine Microbial Natural Products Geobibliography as an Example of Desktop Exploration of Current Research Using Web Visualisation Tools

    PubMed Central

    Mukherjee, Joydeep; Llewellyn, Lyndon E; Evans-Illidge, Elizabeth A

    2008-01-01

    Microbial marine biodiscovery is a recent scientific endeavour developing at a time when information and other technologies are also undergoing great technical strides. Global visualisation of datasets is now becoming available to the world through powerful and readily available software such as Worldwind™, ArcGIS Explorer™ and Google Earth™. Overlaying custom information upon these tools is within the hands of every scientist and more and more scientific organisations are making data available that can also be integrated into these global visualisation tools. The integrated global view that these tools enable provides a powerful desktop exploration tool. Here we demonstrate the value of this approach to marine microbial biodiscovery by developing a geobibliography that incorporates citations on tropical and near-tropical marine microbial natural products research with Google Earth™ and additional ancillary global data sets. The tools and software used are all readily available and the reader is able to use and install the material described in this article. PMID:19172194

  11. brain-coX: investigating and visualising gene co-expression in seven human brain transcriptomic datasets.

    PubMed

    Freytag, Saskia; Burgess, Rosemary; Oliver, Karen L; Bahlo, Melanie

    2017-06-08

    The pathogenesis of neurological and mental health disorders often involves multiple genes, complex interactions, as well as brain- and development-specific biological mechanisms. These characteristics make the identification of disease genes for such disorders challenging, as conventional prioritisation tools are not specifically tailored to deal with the complexity of the human brain. Thus, we developed a novel web application, brain-coX, that offers gene prioritisation with accompanying visualisations based on seven gene expression datasets in the post-mortem human brain, the largest such resource ever assembled. We tested whether our tool can correctly prioritise known genes from 37 brain-specific KEGG pathways and 17 psychiatric conditions. We achieved an average sensitivity of nearly 50%, at the same time reaching a specificity of approximately 75%. We also compared brain-coX's performance to that of its main competitors, Endeavour and ToppGene, focusing on the ability to discover novel associations. Using a subset of the curated SFARI autism gene collection, we show that brain-coX's prioritisations are most similar to SFARI's own curated gene classifications. brain-coX is the first prioritisation and visualisation web tool targeted at the human brain and can be freely accessed via http://shiny.bioinf.wehi.edu.au/freytag.s/ .

  12. Digital Investigations of AN Archaeological Smart Point Cloud: a Real Time Web-Based Platform to Manage the Visualisation of Semantical Queries

    NASA Astrophysics Data System (ADS)

    Poux, F.; Neuville, R.; Hallot, P.; Van Wersch, L.; Luczfalvy Jancsó, A.; Billen, R.

    2017-05-01

    While virtual copies of the real world tend to be created faster than ever through point clouds and derivatives, making them usable by all professionals demands adapted tools that facilitate knowledge dissemination. Digital investigations are changing the way cultural heritage researchers, archaeologists and curators work and collaborate, progressively aggregating expertise through one common platform. In this paper, we present a web application in a WebGL framework accessible on any HTML5-compatible browser. It allows real-time point cloud exploration of the mosaics in the Oratory of Germigny-des-Prés, and emphasises ease of use as well as performance. Our reasoning engine is constructed over a semantically rich point cloud data structure, where metadata has been injected a priori. We developed a tool that directly allows semantic extraction and visualisation of pertinent information for the end users. It leads to efficient communication between actors by proposing optimal 3D viewpoints as a basis on which interactions can grow.

  13. Interactive Tools to Access the HELCATS Catalogues

    NASA Astrophysics Data System (ADS)

    Rouillard, Alexis; Plotnikov, Illya; Pinto, Rui; Génot, Vincent; Bouchemit, Myriam; Davies, Jackie

    2017-04-01

    The propagation tool is a web-based interface, written in Java, that allows users to propagate Coronal Mass Ejections (CMEs), Corotating Interaction Regions (CIRs) and Solar Energetic Particles (SEPs) in the inner heliosphere. The tool displays unique datasets and catalogues through a 2-D visualisation of the trajectories of these heliospheric structures in relation to the orbital position of probes/planets and the pointing direction and extent of different imaging instruments. Summary plots of in-situ data or images of the solar corona and planetary aurorae stored at the CDPP, MEDOC and APIS databases, respectively, can be used to verify the presence of heliospheric structures at the estimated launch or impact times. A great novelty of the tool is the immediate visualisation of J-maps and the possibility to superpose the HELCATS CME and CIR catalogues on these maps.

  14. Interactive Tools to Access the HELCATS Catalogues

    NASA Astrophysics Data System (ADS)

    Rouillard, A.; Génot, V.; Bouchemit, M.; Pinto, R.

    2017-09-01

    The propagation tool is a web-based interface, written in Java, that allows users to propagate Coronal Mass Ejections (CMEs), Corotating Interaction Regions (CIRs) and Solar Energetic Particles (SEPs) in the inner heliosphere. The tool displays unique datasets and catalogues through a 2-D visualisation of the trajectories of these heliospheric structures in relation to the orbital position of probes/planets and the pointing direction and extent of different imaging instruments. Summary plots of in-situ data or images of the solar corona and planetary aurorae stored at the CDPP, MEDOC and APIS databases, respectively, can be used to verify the presence of heliospheric structures at the estimated launch or impact times. A great novelty of the tool is the immediate visualisation of J-maps and the possibility to superpose the HELCATS CME and CIR catalogues on these maps.

  15. The Environmental Virtual Observatory (EVO) local exemplar: A cloud based local landscape learning visualisation tool for communicating flood risk to catchment stakeholders

    NASA Astrophysics Data System (ADS)

    Wilkinson, Mark; Beven, Keith; Brewer, Paul; El-khatib, Yehia; Gemmell, Alastair; Haygarth, Phil; Mackay, Ellie; Macklin, Mark; Marshall, Keith; Quinn, Paul; Stutter, Marc; Thomas, Nicola; Vitolo, Claudia

    2013-04-01

    Today's world is dominated by informatics tools that are readily available to a wide range of stakeholders. There is growing recognition that the appropriate involvement of local communities in land and water management decisions can result in multiple environmental, economic and social benefits. Therefore, local stakeholder groups are increasingly being asked to participate in decision making alongside policy makers, government agencies and scientists. As such, addressing flooding issues requires new ways of engaging with the catchment and its inhabitants at a local level. To support this, new tools and approaches are required. The growth of cloud-based technologies offers novel ways to facilitate this process of exchange of information in the earth sciences. The Environmental Virtual Observatory Pilot project (EVOp) is a new initiative from the UK Natural Environment Research Council (NERC) designed to deliver proof of concept for new tools and approaches to support the challenges outlined above (http://www.evo-uk.org/). The long-term vision of the Environmental Virtual Observatory is to: • make environmental data more visible and accessible to a wide range of potential users, including public good applications; • provide tools to facilitate the integrated analysis of data, greater access to added knowledge and expert analysis, and visualisation of the results; • develop new, added-value knowledge from public and private sector data assets to help tackle environmental challenges. As part of the EVO pilot, an interactive cloud-based tool has been developed with local stakeholders. The Local Landscape Visualisation Tool attempts to communicate flood risk in impacted local communities. The tool has been developed iteratively to reflect the needs, interests and capabilities of a wide range of stakeholders. This tool (accessible via a web portal) combines numerous cloud-based tools and services, local catchment datasets, hydrological models and novel visualisation techniques. The pilot tool has been developed by engaging with different stakeholder groups in three catchments in the UK: the Afon Dyfi (Wales), the River Tarland (Scotland) and the River Eden (England). Stakeholders were interested in accessing live data in their catchments and in exploring the effect of different land use change scenarios on flood peaks. Visualisation tools have been created which offer access to real-time data (such as river level, rainfall and webcam images). Other tools allow land owners to use cloud-based models (the example presented here uses Topmodel, a rainfall-runoff model, on a custom virtual machine image on Amazon Web Services) and local datasets to explore future land use scenarios, allowing them to understand the associated flood risk. Different ways to communicate model uncertainty are currently being investigated and discussed with stakeholders. In summary, the pilot project has had positive feedback and has evolved into two distinct parts: a web-based map tool and a model interface tool. Users can view live data from different sources, combine different data types together (data mash-up), develop local scenarios for land use and flood risk, and exploit the dynamic, elastic cloud modelling capability. This local toolkit will reside within a wider EVO platform that will include national and global datasets, models and state-of-the-art cloud computer systems.

  16. User Interface Requirements for Web-Based Integrated Care Pathways: Evidence from the Evaluation of an Online Care Pathway Investigation Tool.

    PubMed

    Balatsoukas, Panos; Williams, Richard; Davies, Colin; Ainsworth, John; Buchan, Iain

    2015-11-01

    Integrated care pathways (ICPs) define a chronological sequence of steps, most commonly diagnostic or treatment steps, to be followed in providing care for patients. Care pathways help to ensure quality standards are met and to reduce variation in practice. Although research on the computerisation of ICPs progresses, there is still little knowledge of what the requirements are for designing user-friendly and usable electronic care pathways, or of how users (normally health care professionals) interact with interfaces that support the design, analysis and visualisation of ICPs. The purpose of the study reported in this paper was to address this gap by evaluating the usability of a novel web-based tool called COCPIT (Collaborative Online Care Pathway Investigation Tool). COCPIT supports the design, analysis and visualisation of ICPs at the population level. In order to address the aim of this study, an evaluation methodology was designed based on heuristic evaluations and a mixed-method usability test. The results showed that modular visualisation and direct manipulation of information related to the design and analysis of ICPs is useful for engaging and stimulating users. However, designers should pay attention to issues related to the visibility of the system status and the match between the system and the real world, especially in relation to the display of statistical information about care pathways and the editing of clinical information within a care pathway. The paper concludes with recommendations for interface design.

  17. OLSVis: an animated, interactive visual browser for bio-ontologies

    PubMed Central

    2012-01-01

    Background More than one million terms from biomedical ontologies and controlled vocabularies are available through the Ontology Lookup Service (OLS). Although OLS provides ample possibility for querying and browsing terms, the visualization of parts of the ontology graphs is rather limited and inflexible. Results We created the OLSVis web application, a visualiser for browsing all ontologies available in the OLS database. OLSVis shows customisable subgraphs of the OLS ontologies. Subgraphs are animated via a real-time force-based layout algorithm which is fully interactive: each time the user makes a change, e.g. browsing to a new term, hiding, adding, or dragging terms, the algorithm performs smooth and only essential reorganisations of the graph. This assures an optimal viewing experience, because subsequent screen layouts are not grossly altered, and users can easily navigate through the graph. URL: http://ols.wordvis.com Conclusions The OLSVis web application provides a user-friendly tool to visualise ontologies from the OLS repository. It broadens the possibilities to investigate and select ontology subgraphs through a smooth visualisation method. PMID:22646023
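    The "real-time force-based layout" mentioned above can be illustrated with a single iteration of a generic force-directed step: nodes repel each other, edges act as springs, and positions are nudged by the net force. This is a textbook sketch, not OLSVis's actual algorithm, and the constants are arbitrary.

    ```javascript
    // One iteration of a basic force-directed layout.
    // nodes: [{x, y}, ...]; edges: [[i, j], ...] indexing into nodes.
    function layoutStep(nodes, edges) {
      const REPULSION = 2000, SPRING = 0.05, REST = 80, DAMPING = 0.85;
      nodes.forEach((n) => { n.fx = 0; n.fy = 0; });

      // Pairwise repulsion (inverse-square style).
      for (let i = 0; i < nodes.length; i++) {
        for (let j = i + 1; j < nodes.length; j++) {
          const a = nodes[i], b = nodes[j];
          const dx = a.x - b.x, dy = a.y - b.y;
          const d = Math.hypot(dx, dy) || 0.1;
          const f = REPULSION / (d * d);
          a.fx += (f * dx) / d; a.fy += (f * dy) / d;
          b.fx -= (f * dx) / d; b.fy -= (f * dy) / d;
        }
      }

      // Spring attraction along edges towards the rest length.
      edges.forEach(([i, j]) => {
        const a = nodes[i], b = nodes[j];
        const dx = b.x - a.x, dy = b.y - a.y;
        const d = Math.hypot(dx, dy) || 0.1;
        const f = SPRING * (d - REST);
        a.fx += (f * dx) / d; a.fy += (f * dy) / d;
        b.fx -= (f * dx) / d; b.fy -= (f * dy) / d;
      });

      // Nudge positions; calling this repeatedly animates the graph smoothly.
      nodes.forEach((n) => { n.x += DAMPING * n.fx; n.y += DAMPING * n.fy; });
    }
    ```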

  18. Web-based volume slicer for 3D electron-microscopy data from EMDB

    PubMed Central

    Salavert-Torres, José; Iudin, Andrii; Lagerstedt, Ingvar; Sanz-García, Eduardo; Kleywegt, Gerard J.; Patwardhan, Ardan

    2016-01-01

    We describe the functionality and design of the Volume slicer – a web-based slice viewer for EMDB entries. This tool uniquely provides the facility to view slices from 3D EM reconstructions along the three orthogonal axes and to rapidly switch between them and navigate through the volume. We have employed multiple rounds of user-experience testing with members of the EM community to ensure that the interface is easy and intuitive to use and the information provided is relevant. The impetus to develop the Volume slicer has been calls from the EM community to provide web-based interactive visualisation of 2D slice data. This would be useful for quick initial checks of the quality of a reconstruction. Again in response to calls from the community, we plan to further develop the Volume slicer into a fully-fledged Volume browser that provides integrated visualisation of EMDB and PDB entries from the molecular to the cellular scale. PMID:26876163
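    At its core, any slice viewer is an index calculation into a 3D array followed by a 2D paint; the sketch below assumes a dense uint8 volume already present in memory (which is not necessarily how the EMDB service streams its data) and draws one XY slice to a canvas.

    ```javascript
    // Extract one XY slice (constant z) from a dense uint8 volume stored in
    // x-fastest order and paint it as greyscale. The volume layout is an assumption.
    // The canvas is expected to be at least nx by ny pixels.
    function drawSlice(volume, nx, ny, z, canvas) {
      const ctx = canvas.getContext('2d');
      const image = ctx.createImageData(nx, ny);
      const sliceOffset = z * nx * ny;
      for (let i = 0; i < nx * ny; i++) {
        const v = volume[sliceOffset + i]; // voxel intensity, 0..255
        image.data[4 * i + 0] = v;   // R
        image.data[4 * i + 1] = v;   // G
        image.data[4 * i + 2] = v;   // B
        image.data[4 * i + 3] = 255; // opaque
      }
      ctx.putImageData(image, 0, 0);
    }
    // Slices along the other two orthogonal axes only change the index arithmetic.
    ```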

  19. Event visualisation in ALICE - current status and strategy for Run 3

    NASA Astrophysics Data System (ADS)

    Niedziela, Jeremi; von Haller, Barthélémy

    2017-10-01

    A Large Ion Collider Experiment (ALICE) is one of the four big experiments running at the Large Hadron Collider (LHC), which focuses on the study of the Quark-Gluon Plasma (QGP) being produced in heavy-ion collisions. The ALICE Event Visualisation Environment (AliEve) is a tool providing an interactive 3D model of the detector’s geometry and a graphical representation of the data. Together with the online reconstruction module, it provides important quality monitoring of the recorded data. As a consequence it has been used in the ALICE Run Control Centre during all stages of Run 2. Static screenshots from the online visualisation are published on the public website - ALICE LIVE. Dedicated converters have been developed to provide geometry and data for external projects. An example of such project is the Total Event Display (TEV) - a visualisation tool recently developed by the CERN Media Lab based on the Unity game engine. It can be easily deployed on any platform, including web and mobile platforms. Another external project is More Than ALICE - an augmented reality application for visitors, overlaying detector descriptions and event visualisations on the camera’s picture. For the future Run 3 both AliEve and TEV will be adapted to fit the ALICE O2 project. Several changes are required due to the new data formats, especially so-called Compressed Time Frames.

  20. WebGL for Rosetta Science Planning

    NASA Astrophysics Data System (ADS)

    Schmidt, Albrecht; Völk, Stefan; Grieger, Björn

    2013-04-01

    Rosetta is a mission of the European Space Agency (ESA) to rendezvous with comet Churyumov-Gerasimenko in 2014. The trajectory and operations of the mission are particularly complex, have many free parameters and are novel to the community. To support science planning, communicate operational ideas and disseminate operational scenarios to the scientific community, the science ground segment makes use of Web-based visualisation technologies. Using the recent standard WebGL, static pages of time-dependent three-dimensional views of the spacecraft and the fields of view of the instruments are generated directly from the operational files. These can then be viewed in modern Web browsers for understanding or verification, and be analysed and correlated with other studies. Variable timesteps make it possible to provide both overviews and detailed animated scenes. The technical challenges that are particular to Web-based environments include: (1) In traditional OpenGL, it is much easier to compute needed data on demand, since the visualisation runs natively on a usually quite powerful computer. In a WebGL application, requests for additional data have to be passed through a Web server, so they are more complex and also require a more complex infrastructure. (2) The volume of data that can be kept in a browser environment is limited and has to be transferred over often slow network links. Thus, careful design and reduction of data are required. (3) Although browser support for WebGL has improved since the authors started using it, it is often not well supported on mobile and small devices. (4) Web browsers often only support limited end-user interactions with a mouse or keyboard. While some of these challenges can be expected to become less important as technological progress continues, others seem to be more inherent to the approach. On the positive side, the authors' experiences include: (1) a low threshold in the community to using the visualisations, (2), thus, cooperative use of the products, and (3) good and still improving tool and library support.

  1. EHDViz: clinical dashboard development using open-source technologies

    PubMed Central

    Badgeley, Marcus A; Shameer, Khader; Glicksberg, Benjamin S; Tomlinson, Max S; Levin, Matthew A; McCormick, Patrick J; Kasarskis, Andrew; Reich, David L; Dudley, Joel T

    2016-01-01

    Objective To design, develop and prototype clinical dashboards to integrate high-frequency health and wellness data streams using interactive and real-time data visualisation and analytics modalities. Materials and methods We developed a clinical dashboard development framework called electronic healthcare data visualization (EHDViz) toolkit for generating web-based, real-time clinical dashboards for visualising heterogeneous biomedical, healthcare and wellness data. The EHDViz is an extensible toolkit that uses R packages for data management, normalisation and producing high-quality visualisations over the web using the R/Shiny web server architecture. We have developed use cases to illustrate the utility of EHDViz in different clinical and wellness settings as a visualisation aid for improving healthcare delivery. Results Using EHDViz, we prototyped clinical dashboards to demonstrate the contextual versatility of the EHDViz toolkit. An outpatient cohort was used to visualise population health management tasks (n=14 221), an inpatient cohort was used to visualise real-time acuity risk in a clinical unit (n=445), and a quantified-self example using wellness data from a fitness activity monitor worn by a single individual was also discussed (n-of-1). The back-end system retrieves relevant data from data sources, populates the main panel of the application, integrates user-defined data features in real time and renders output using modern web browsers. The visualisation elements can be customised using health features, disease names, procedure names or medical codes to populate the visualisations. The source code of EHDViz and various prototypes developed using EHDViz are available in the public domain at http://ehdviz.dudleylab.org. Conclusions Collaborative data visualisations, wellness trend predictions, risk estimation, proactive acuity status monitoring and knowledge of complex disease indicators are essential components of implementing data-driven precision medicine. As an open-source visualisation framework capable of integrating health assessment, EHDViz aims to be a valuable toolkit for rapid design, development and implementation of scalable clinical data visualisation dashboards. PMID:27013597

  2. A web based tool for storing and visualising data generated within a smart home.

    PubMed

    McDonald, H A; Nugent, C D; Moore, G; Finlay, D D; Hallberg, J

    2011-01-01

    There is a growing need to re-assess the current approaches available to researchers for storing and managing heterogeneous data generated within a smart home environment. In our current work we have developed the homeML Application; a web based tool to support researchers engaged in the area of smart home research as they perform experiments. Within this paper the homeML Application is presented which includes the fundamental components of the homeML Repository and the homeML Toolkit. Results from a usability study conducted by 10 computer science researchers are presented; the initial results of which have been positive.

  3. WEBnm@ v2.0: Web server and services for comparing protein flexibility.

    PubMed

    Tiwari, Sandhya P; Fuglebakk, Edvin; Hollup, Siv M; Skjærven, Lars; Cragnolini, Tristan; Grindhaug, Svenn H; Tekle, Kidane M; Reuter, Nathalie

    2014-12-30

    Normal mode analysis (NMA) using elastic network models is a reliable and cost-effective computational method to characterise protein flexibility and, by extension, protein dynamics. Further insight into the dynamics-function relationship can be gained by comparing protein motions between protein homologs and functional classifications. This can be achieved by comparing normal modes obtained from sets of evolutionarily related proteins. We have developed an automated tool for comparative NMA of a set of pre-aligned protein structures. The user can submit a sequence alignment in the FASTA format and the corresponding coordinate files in the Protein Data Bank (PDB) format. The computed normalised squared atomic fluctuations and atomic deformation energies of the submitted structures can be easily compared on graphs provided by the web user interface. The web server provides pairwise comparison of the dynamics of all proteins included in the submitted set using two measures: the Root Mean Squared Inner Product and the Bhattacharyya Coefficient. The Comparative Analysis has been implemented on our web server for NMA, WEBnm@, which also provides recently upgraded functionality for NMA of single protein structures. This includes new visualisations of protein motion, visualisation of inter-residue correlations and the analysis of conformational change using the overlap analysis. In addition, programmatic access to WEBnm@ is now available through a SOAP-based web service. WEBnm@ is available at http://apps.cbu.uib.no/webnma . WEBnm@ v2.0 is an online tool offering unique capability for comparative NMA on multiple protein structures. Along with a convenient web interface, powerful computing resources, and several methods for mode analyses, WEBnm@ facilitates the assessment of protein flexibility within protein families and superfamilies. These analyses can give a good view of how the structures move and how the flexibility is conserved over the different structures.

  4. EHDViz: clinical dashboard development using open-source technologies.

    PubMed

    Badgeley, Marcus A; Shameer, Khader; Glicksberg, Benjamin S; Tomlinson, Max S; Levin, Matthew A; McCormick, Patrick J; Kasarskis, Andrew; Reich, David L; Dudley, Joel T

    2016-03-24

    To design, develop and prototype clinical dashboards to integrate high-frequency health and wellness data streams using interactive and real-time data visualisation and analytics modalities. We developed a clinical dashboard development framework called electronic healthcare data visualization (EHDViz) toolkit for generating web-based, real-time clinical dashboards for visualising heterogeneous biomedical, healthcare and wellness data. The EHDViz is an extensible toolkit that uses R packages for data management, normalisation and producing high-quality visualisations over the web using the R/Shiny web server architecture. We have developed use cases to illustrate the utility of EHDViz in different clinical and wellness settings as a visualisation aid for improving healthcare delivery. Using EHDViz, we prototyped clinical dashboards to demonstrate the contextual versatility of the EHDViz toolkit. An outpatient cohort was used to visualise population health management tasks (n=14,221), an inpatient cohort was used to visualise real-time acuity risk in a clinical unit (n=445), and a quantified-self example using wellness data from a fitness activity monitor worn by a single individual was also discussed (n-of-1). The back-end system retrieves relevant data from data sources, populates the main panel of the application, integrates user-defined data features in real time and renders output using modern web browsers. The visualisation elements can be customised using health features, disease names, procedure names or medical codes to populate the visualisations. The source code of EHDViz and various prototypes developed using EHDViz are available in the public domain at http://ehdviz.dudleylab.org. Collaborative data visualisations, wellness trend predictions, risk estimation, proactive acuity status monitoring and knowledge of complex disease indicators are essential components of implementing data-driven precision medicine. As an open-source visualisation framework capable of integrating health assessment, EHDViz aims to be a valuable toolkit for rapid design, development and implementation of scalable clinical data visualisation dashboards.

  5. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud

    PubMed Central

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Background Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. Results We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. Conclusions This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation. PMID:26501966

  6. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud.

    PubMed

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation.

  7. Lossy compression for Animated Web Visualisation

    NASA Astrophysics Data System (ADS)

    Prudden, R.; Tomlinson, J.; Robinson, N.; Arribas, A.

    2017-12-01

    This talk will discuss a technique for lossy data compression specialised for web animation. We set ourselves the challenge of visualising a full forecast weather field as an animated 3D web page visualisation. This data is richly spatiotemporal; however, it is routinely communicated to the public as a 2D map, and scientists are largely limited to visualising data via static 2D maps or 1D scatter plots. We wanted to present Met Office weather forecasts in a way that represents all the generated data. Our approach was to repurpose the technology used to stream high-definition videos. This enabled us to achieve high rates of compression while being compatible with both web browsers and GPU processing. Since lossy compression necessarily involves discarding information, evaluating the results is an important and difficult problem. This is essentially a problem of forecast verification. The difficulty lies in deciding what it means for two weather fields to be "similar", as simple definitions such as mean squared error often lead to undesirable results. In the second part of the talk, I will briefly discuss some ideas for alternative measures of similarity.
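    To make the verification point concrete, the "simple definition" the abstract cautions against is a pixel-wise mean squared error between the original field and its decoded counterpart; a field displaced by a single grid cell can score poorly even though a forecaster would judge it nearly identical. A minimal sketch:

    ```javascript
    // Naive similarity measure: mean squared error between two equally sized
    // fields given as flat numeric arrays.
    function meanSquaredError(original, decoded) {
      if (original.length !== decoded.length) {
        throw new Error('fields must have the same shape');
      }
      let sum = 0;
      for (let i = 0; i < original.length; i++) {
        const d = original[i] - decoded[i];
        sum += d * d;
      }
      return sum / original.length;
    }
    ```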

  8. A Powerful, Cost Effective, Web Based Engineering Solution Supporting Conjunction Detection and Visual Analysis

    NASA Astrophysics Data System (ADS)

    Novak, Daniel M.; Biamonti, Davide; Gross, Jeremy; Milnes, Martin

    2013-08-01

    An innovative and visually appealing tool is presented for efficient all-vs-all conjunction analysis on a large catalogue of objects. The conjunction detection uses a nearest-neighbour search algorithm based on spatial binning and the identification of pairs of objects in adjacent bins. This results in the fastest all-vs-all filtering the authors are aware of. The tool is constructed on a server-client architecture, where the server broadcasts the conjunction data and ephemerides to the client, while the client supports the user interface through a modern browser, without plug-ins. In order to make the tool flexible and maintainable, Java software technologies were used on the server side, including Spring, Camel, ActiveMQ and CometD. The user interface and visualisation are based on the latest web technologies: HTML5, WebGL and THREE.js. Importance has been given to the ergonomics and visual appeal of the software. In fact, certain design concepts have been borrowed from the gaming industry.
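    A sketch of the binning idea (not the authors' implementation): hash every object's position into a coarse 3D grid, then test only pairs that fall in the same or adjacent cells, avoiding the quadratic all-vs-all distance test. The field names and units are assumptions.

    ```javascript
    // Conjunction screening by spatial binning. `objects` is [{id, x, y, z}, ...]
    // with numeric ids and positions in km; cellSize should be >= threshold.
    function findCandidatePairs(objects, cellSize, threshold) {
      const bins = new Map();
      const keyOf = (i, j, k) => `${i},${j},${k}`;
      const cell = (v) => Math.floor(v / cellSize);

      // Pass 1: drop every object into its grid cell.
      objects.forEach((o) => {
        const key = keyOf(cell(o.x), cell(o.y), cell(o.z));
        if (!bins.has(key)) bins.set(key, []);
        bins.get(key).push(o);
      });

      // Pass 2: test an object only against objects in the 27 surrounding cells.
      const pairs = [];
      objects.forEach((o) => {
        for (let di = -1; di <= 1; di++)
          for (let dj = -1; dj <= 1; dj++)
            for (let dk = -1; dk <= 1; dk++) {
              const neighbours =
                bins.get(keyOf(cell(o.x) + di, cell(o.y) + dj, cell(o.z) + dk)) || [];
              neighbours.forEach((other) => {
                if (other.id <= o.id) return; // count each unordered pair once
                const d = Math.hypot(o.x - other.x, o.y - other.y, o.z - other.z);
                if (d < threshold) pairs.push([o.id, other.id]);
              });
            }
      });
      return pairs;
    }
    ```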

  9. CNV-WebStore: online CNV analysis, storage and interpretation.

    PubMed

    Vandeweyer, Geert; Reyniers, Edwin; Wuyts, Wim; Rooms, Liesbeth; Kooy, R Frank

    2011-01-05

    Microarray technology allows the analysis of genomic aberrations at an ever increasing resolution, making functional interpretation of these vast amounts of data the main bottleneck in routine implementation of high resolution array platforms, and emphasising the need for a centralised and easy to use CNV data management and interpretation system. We present CNV-WebStore, an online platform to streamline the processing and downstream interpretation of microarray data in a clinical context, tailored towards but not limited to the Illumina BeadArray platform. Provided analysis tools include CNV analysis, parent-of-origin and uniparental disomy detection. Interpretation tools include data visualisation, gene prioritisation, automated PubMed searching, linking data to several genome browsers and annotation of CNVs based on several public databases. Finally, a module is provided for uniform reporting of results. CNV-WebStore is able to present copy number data in an intuitive way to both lab technicians and clinicians, making it a useful tool in daily clinical practice.

  10. Spatial Data Uncertainty in a Webgis Tool Supporting Sediments Management in Wallonia

    NASA Astrophysics Data System (ADS)

    Stéphenne, N. R.; Beaumont, B.; Veschkens, M.; Palm, S.; Charlemagne, C.

    2015-08-01

    This paper describes a WebGIS prototype developed for the Walloon administration to improve the communication and the management of sediment dredging actions carried out in rivers and lakes. In Wallonia, levelling dredged sediments on banks requires an official authorisation from the administration. This request refers to geospatial datasets such as the official land use map, the cadastral map or the distance to potential pollution sources. Centralising geodatabases within a web interface facilitates the management of these authorisations for the managers and the central administration. The proposed system integrates various data from disparate sources. Some issues in map scale, spatial search quality and cartographic visualisation are discussed in this paper together with the solutions provided. The prototype web application is currently being discussed with some potential users in order to understand in which way this tool facilitates the communication, the management and the quality of the authorisation process. The structure of the paper states the why, what, who and how of this communication tool, with a special focus on errors and uncertainties.

  11. The Zeldovich & Adhesion approximations and applications to the local universe

    NASA Astrophysics Data System (ADS)

    Hidding, Johan; van de Weygaert, Rien; Shandarin, Sergei

    2016-10-01

    The Zeldovich approximation (ZA) predicts the formation of a web of singularities. While these singularities may only exist in the most formal interpretation of the ZA, they provide a powerful tool for the analysis of initial conditions. We present a novel method to find the skeleton of the resulting cosmic web based on singularities in the primordial deformation tensor and its higher-order derivatives. We show that the A_3 lines predict the formation of filaments in a two-dimensional model. We continue with applications of the adhesion model to visualise structures in the local (z < 0.03) universe.
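    For reference, the mapping behind this abstract is the Zeldovich ballistic displacement together with its caustic condition; a standard statement (the notation may differ from the paper's) is sketched below, where D_+ is the linear growth factor, Psi the displacement potential and lambda_i the eigenvalues of the deformation tensor.

    ```latex
    % Zeldovich approximation: Eulerian position x of a mass element with
    % Lagrangian coordinate q, and the resulting density.
    \mathbf{x}(\mathbf{q},t) = \mathbf{q} - D_+(t)\,\nabla_{\mathbf{q}}\Psi(\mathbf{q}),
    \qquad
    \rho(\mathbf{x},t) = \frac{\bar{\rho}}
      {\bigl|\prod_{i=1}^{3}\bigl[1 - D_+(t)\,\lambda_i(\mathbf{q})\bigr]\bigr|} .
    % Singularities (caustics, e.g. the A_3 cusp lines) form wherever
    % 1 - D_+(t)\,\lambda_1(\mathbf{q}) = 0.
    ```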

  12. Understanding WCAG2.0 Colour Contrast Requirements Through 3D Colour Space Visualisation.

    PubMed

    Sandnes, Frode Eika

    2016-01-01

    Sufficient contrast between text and background is needed to achieve adequate readability. WCAG2.0 provides a specific definition of sufficient contrast on the web. However, the definition is hard to understand, and most designers therefore use contrast calculators to validate their colour choices. Often, such checks are performed after design, and this may be too late. This paper proposes a colour selection approach based on three-dimensional visualisation of the colour space. The complex non-linear relationships between the colour components become comprehensible when viewed in 3D. The method visualises the available colours in an intuitive manner and allows designers to check a colour against the set of other valid colours. Unlike the contrast calculators, the proposed method is proactive and fun to use. A colour space builder was developed and the resulting models were viewed with a point cloud viewer. The technique can be used as both a design tool and a pedagogical aid to teach colour theory and design.
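    The "specific definition" referred to above is WCAG 2.0's relative-luminance contrast ratio; the calculation that contrast calculators perform is compact enough to show (this is the published formula, not the paper's 3D visualisation tool).

    ```javascript
    // WCAG 2.0 contrast ratio between two sRGB colours given as [r, g, b] in 0..255.
    // Level AA requires >= 4.5:1 for normal text (3:1 for large text); AAA requires >= 7:1.
    function relativeLuminance([r, g, b]) {
      const lin = (c8) => {
        const c = c8 / 255;
        return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
      };
      return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
    }

    function contrastRatio(fg, bg) {
      const l1 = relativeLuminance(fg);
      const l2 = relativeLuminance(bg);
      const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
      return (lighter + 0.05) / (darker + 0.05);
    }

    console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // "21.0", the maximum
    ```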

  13. Using Jupyter Notebooks for Interactive Space Science Simulations

    NASA Astrophysics Data System (ADS)

    Schmidt, Albrecht

    2016-04-01

    Jupyter Notebooks can be used as an effective means to communicate scientific ideas through Web-based visualisations and, at the same time, give a user more than a pre-defined set of options to manipulate the visualisations. To some degree, even computations can be done without much knowledge of the underlying data structures and infrastructure, to discover novel aspects of the data or tailor views to users' needs. Here, we show how to combine Jupyter Notebooks with other open-source tools to provide rich and interactive views on space data, especially the visualisation of spacecraft operations. Topics covered are orbit visualisation, spacecraft orientation, instrument timelines, as well as performance analysis of mission segments. Technically, the re-use and integration of existing components will also be shown, both on the code level and on the visualisation level, so that the effort put into the development of new components could be reduced. Another important aspect is bridging the gap between operational data and the scientific exploitation of the payload data, for which a way forward will also be shown. A lesson learned from the implementation and use of a prototype is the synergy between the team who provisions the notebooks and the consumers, who both share access to the same code base, if not resources; this often simplifies communication and deployment.

  14. Gaia Data Release 1. The archive visualisation service

    NASA Astrophysics Data System (ADS)

    Moitinho, A.; Krone-Martins, A.; Savietto, H.; Barros, M.; Barata, C.; Falcão, A. J.; Fernandes, T.; Alves, J.; Silva, A. F.; Gomes, M.; Bakker, J.; Brown, A. G. A.; González-Núñez, J.; Gracia-Abril, G.; Gutiérrez-Sánchez, R.; Hernández, J.; Jordan, S.; Luri, X.; Merin, B.; Mignard, F.; Mora, A.; Navarro, V.; O'Mullane, W.; Sagristà Sellés, T.; Salgado, J.; Segovia, J. C.; Utrilla, E.; Arenou, F.; de Bruijne, J. H. J.; Jansen, F.; McCaughrean, M.; O'Flaherty, K. S.; Taylor, M. B.; Vallenari, A.

    2017-09-01

    Context. The first Gaia data release (DR1) delivered a catalogue of astrometry and photometry for over a billion astronomical sources. Within the panoply of methods used for data exploration, visualisation is often the starting point and even the guiding reference for scientific thought. However, this is a volume of data that cannot be efficiently explored using traditional tools, techniques, and habits. Aims: We aim to provide a global visual exploration service for the Gaia archive, something that is not possible out of the box for most people. The service has two main goals. The first is to provide a software platform for interactive visual exploration of the archive contents, using common personal computers and mobile devices available to most users. The second aim is to produce intelligible and appealing visual representations of the enormous information content of the archive. Methods: The interactive exploration service follows a client-server design. The server runs close to the data, at the archive, and is responsible for hiding as far as possible the complexity and volume of the Gaia data from the client. This is achieved by serving visual detail on demand. Levels of detail are pre-computed using data aggregation and subsampling techniques. For DR1, the client is a web application that provides an interactive multi-panel visualisation workspace as well as a graphical user interface. Results: The Gaia archive Visualisation Service offers a web-based multi-panel interactive visualisation desktop in a browser tab. It currently provides highly configurable 1D histograms and 2D scatter plots of Gaia DR1 and the Tycho-Gaia Astrometric Solution (TGAS) with linked views. An innovative feature is the creation of ADQL queries from visually defined regions in plots. These visual queries are ready for use in the Gaia Archive Search/data retrieval service. In addition, regions around user-selected objects can be further examined with automatically generated SIMBAD searches. Integration of the Aladin Lite and JS9 applications adds support for the visualisation of HiPS and FITS maps. The production of the all-sky source density map that became the iconic image of Gaia DR1 is described in detail. Conclusions: On the day of DR1, over seven thousand users accessed the Gaia Archive visualisation portal. The system, running on a single machine, proved robust and did not fail while enabling thousands of users to visualise and explore the over one billion sources in DR1. There are still several limitations, most notably that users may only choose from a list of pre-computed visualisations. Thus, other visualisation applications that can complement the archive service are examined. Finally, development plans for Data Release 2 are presented.
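    The visually defined regions mentioned above ultimately become ordinary ADQL; a query of the kind the service generates can also be sent directly to the archive's TAP interface. The sketch below is illustrative only: the endpoint, table and column choices are assumptions based on the public Gaia archive, not part of the visualisation service itself.

    ```javascript
    // Illustrative ADQL rectangle selection against the Gaia DR1 source table,
    // posted to a synchronous TAP endpoint using standard IVOA TAP parameters.
    const adql =
      'SELECT TOP 1000 source_id, ra, dec, phot_g_mean_mag ' +
      'FROM gaiadr1.gaia_source ' +
      'WHERE ra BETWEEN 56.0 AND 58.0 AND dec BETWEEN 23.0 AND 25.0';

    const body = new URLSearchParams({
      REQUEST: 'doQuery',
      LANG: 'ADQL',
      FORMAT: 'csv',
      QUERY: adql,
    });

    fetch('https://gea.esac.esa.int/tap-server/tap/sync', { method: 'POST', body })
      .then((r) => r.text())
      .then((csv) => console.log(csv.split('\n').length - 1, 'rows returned'));
    ```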

  15. The EMBL-EBI bioinformatics web and programmatic tools framework.

    PubMed

    Li, Weizhong; Cowley, Andrew; Uludag, Mahmut; Gur, Tamer; McWilliam, Hamish; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Lopez, Rodrigo

    2015-07-01

    Since 2009 the EMBL-EBI Job Dispatcher framework has provided free access to a range of mainstream sequence analysis applications. These include sequence similarity search services (https://www.ebi.ac.uk/Tools/sss/) such as BLAST, FASTA and PSI-Search, multiple sequence alignment tools (https://www.ebi.ac.uk/Tools/msa/) such as Clustal Omega, MAFFT and T-Coffee, and other sequence analysis tools (https://www.ebi.ac.uk/Tools/pfa/) such as InterProScan. Through these services users can search mainstream sequence databases such as ENA, UniProt and Ensembl Genomes, utilising a uniform web interface or systematically through Web Services interfaces (https://www.ebi.ac.uk/Tools/webservices/) using common programming languages, and obtain enriched results with novel visualisations. Integration with EBI Search (https://www.ebi.ac.uk/ebisearch/) and the dbfetch retrieval service (https://www.ebi.ac.uk/Tools/dbfetch/) further expands the usefulness of the framework. New tools and updates such as NCBI BLAST+, InterProScan 5 and PfamScan, new categories such as RNA analysis tools (https://www.ebi.ac.uk/Tools/rna/), new databases such as ENA non-coding, WormBase ParaSite, Pfam and Rfam, and new workflow methods, together with the retirement of deprecated services, ensure that the framework remains relevant to today's biological community. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
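    As a hedged illustration of programmatic access through the Web Services interfaces mentioned above, the snippet below submits a toy protein sequence to the NCBI BLAST+ REST service of the Job Dispatcher; the endpoint paths, parameter names and result type follow the commonly documented REST pattern but should be checked against the current service documentation before use.

      # Illustrative sketch of the Job Dispatcher REST pattern (run -> poll status -> fetch result);
      # endpoint paths and parameters are indicative and may differ from the current service.
      import time
      import requests

      BASE = "https://www.ebi.ac.uk/Tools/services/rest/ncbiblast"
      params = {
          "email": "user@example.org",          # hypothetical address
          "program": "blastp",
          "database": "uniprotkb_swissprot",
          "stype": "protein",
          "sequence": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",  # toy sequence
      }

      job_id = requests.post(f"{BASE}/run", data=params).text
      while requests.get(f"{BASE}/status/{job_id}").text == "RUNNING":
          time.sleep(5)
      result = requests.get(f"{BASE}/result/{job_id}/out").text
      print(result[:500])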

  16. Web based tools for data manipulation, visualisation and validation with interactive georeferenced graphs

    NASA Astrophysics Data System (ADS)

    Ivankovic, D.; Dadic, V.

    2009-04-01

    Some oceanographic parameters have to be inserted into the database manually; others (for example, data from a CTD probe) are imported from various files. All these parameters require visualisation, validation and manipulation from the research vessel or the scientific institution, as well as public presentation. For these purposes a web-based system has been developed, containing dynamic SQL procedures and Java applets. The technology background is an Oracle 10g relational database and Oracle Application Server. Web interfaces are developed using PL/SQL stored database procedures (mod PL/SQL). Additional parts for data visualisation include Java applets and JavaScript. The mapping tool is the Google Maps API (JavaScript), with a Java applet as an alternative. A graph is realised as a dynamically generated web page containing a Java applet. Both the mapping tool and the graph are georeferenced: clicking on some part of a graph automatically zooms to, or places a marker on, the location where the parameter was measured. This feature is very useful for data validation. The code for data manipulation and visualisation is partially realised with dynamic SQL, which allows us to separate data definitions from the code for data manipulation. Adding a new parameter to the system requires only the data definition and description, without programming an interface for that kind of data.

  17. SSRPrimer and SSR Taxonomy Tree: Biome SSR discovery

    PubMed Central

    Jewell, Erica; Robinson, Andrew; Savage, David; Erwin, Tim; Love, Christopher G.; Lim, Geraldine A. C.; Li, Xi; Batley, Jacqueline; Spangenberg, German C.; Edwards, David

    2006-01-01

    Simple sequence repeat (SSR) molecular genetic markers have become important tools for a broad range of applications such as genome mapping and genetic diversity studies. SSRs are readily identified within DNA sequence data and PCR primers can be designed for their amplification. These PCR primers frequently cross amplify within related species. We report a web-based tool, SSR Primer, that integrates SPUTNIK, an SSR repeat finder, with Primer3, a primer design program, within one pipeline. On submission of multiple FASTA formatted sequences, the script screens each sequence for SSRs using SPUTNIK. Results are then parsed to Primer3 for locus specific primer design. We have applied this tool for the discovery of SSRs within the complete GenBank database, and have designed PCR amplification primers for over 13 million SSRs. The SSR Taxonomy Tree server provides web-based searching and browsing of species and taxa for the visualisation and download of these SSR amplification primers. These tools are available at http://bioinformatics.pbcbasc.latrobe.edu.au/ssrdiscovery.html. PMID:16845092

  18. SSRPrimer and SSR Taxonomy Tree: Biome SSR discovery.

    PubMed

    Jewell, Erica; Robinson, Andrew; Savage, David; Erwin, Tim; Love, Christopher G; Lim, Geraldine A C; Li, Xi; Batley, Jacqueline; Spangenberg, German C; Edwards, David

    2006-07-01

    Simple sequence repeat (SSR) molecular genetic markers have become important tools for a broad range of applications such as genome mapping and genetic diversity studies. SSRs are readily identified within DNA sequence data and PCR primers can be designed for their amplification. These PCR primers frequently cross amplify within related species. We report a web-based tool, SSR Primer, that integrates SPUTNIK, an SSR repeat finder, with Primer3, a primer design program, within one pipeline. On submission of multiple FASTA formatted sequences, the script screens each sequence for SSRs using SPUTNIK. Results are then parsed to Primer3 for locus specific primer design. We have applied this tool for the discovery of SSRs within the complete GenBank database, and have designed PCR amplification primers for over 13 million SSRs. The SSR Taxonomy Tree server provides web-based searching and browsing of species and taxa for the visualisation and download of these SSR amplification primers. These tools are available at http://bioinformatics.pbcbasc.latrobe.edu.au/ssrdiscovery.html.

  19. A proposed-standard format to represent and distribute tomographic models and other earth spatial data

    NASA Astrophysics Data System (ADS)

    Postpischl, L.; Morelli, A.; Danecek, P.

    2009-04-01

    Formats used to represent (and distribute) tomographic earth models differ considerably and are rarely self-consistent. In fact, each earth scientist, or research group, uses specific conventions to encode the various parameterizations used to describe, e.g., seismic wave speed or density in three dimensions, and complete information is often found only in related documents or publications (if available at all). As a consequence, use of various tomographic models from different authors requires considerable effort, is more cumbersome than it should be and prevents widespread exchange and circulation within the community. We propose a format, based on modern web standards, able to represent different (grid-based) model parameterizations within the same simple text-based environment, easy to write, to parse, and to visualise. The aim is the creation of self-describing data structures, both human and machine readable, that are automatically recognised by general-purpose software agents and easily imported into the scientific programming environment. We think that the adoption of such a representation as a standard for the exchange and distribution of earth models can greatly ease their usage and enhance their circulation, both among fellow seismologists and among a broader non-specialist community. The proposed solution uses semantic web technologies, fully fitting the current trends in data accessibility. It is based on JSON (JavaScript Object Notation), a plain-text, human-readable, lightweight computer data interchange format, which adopts a hierarchical name-value model for representing simple data structures and associative arrays (called objects). Our implementation allows integration of large datasets with metadata (authors, affiliations, bibliographic references, units of measure, etc.) into a single resource. It is equally suited to represent other geo-referenced volumetric quantities, beyond tomographic models, as well as (structured and unstructured) computational meshes. This approach can exploit the capabilities of the web browser as a computing platform: a series of in-page quick tools for comparative analysis between models will be presented, as well as visualisation techniques for tomographic layers in Google Maps and Google Earth. We are working on tools for conversion into common scientific formats such as netCDF, to allow easy visualisation in GEON-IDV or GMT.
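    To make the idea of a self-describing, JSON-based model representation concrete, the following sketch builds a small grid-based model with embedded metadata and serialises it; all field names are invented for illustration and do not reproduce the proposed standard itself.

      # Hypothetical structure of a JSON-encoded, grid-based tomographic model
      # (field names are illustrative, not the proposed standard)
      import json

      model = {
          "metadata": {
              "authors": ["A. Example"],
              "affiliation": "Example Institute",
              "reference": "doi:10.0000/example",
              "units": {"dvs": "percent"},
          },
          "parameterisation": {
              "type": "regular_grid",
              "longitude": {"start": 0.0, "step": 2.0, "n": 180},
              "latitude": {"start": -88.0, "step": 2.0, "n": 89},
              "depth_km": [100, 200, 300],
          },
          # values stored depth-major, then latitude, then longitude (flattened)
          "values": {"dvs": [0.0] * (180 * 89 * 3)},
      }

      with open("model.json", "w") as fh:
          json.dump(model, fh, indent=2)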

  20. SeeSway - A free web-based system for analysing and exploring standing balance data.

    PubMed

    Clark, Ross A; Pua, Yong-Hao

    2018-06-01

    Computerised posturography can be used to assess standing balance, and can predict poor functional outcomes in many clinical populations. A key limitation is the disparate signal filtering and analysis techniques, with many methods requiring custom computer programs. This paper discusses the creation of a freely available web-based software program, SeeSway (www.rehabtools.org/seesway), which was designed to provide powerful tools for pre-processing, analysing and visualising standing balance data in an easy-to-use and platform-independent website. SeeSway links an interactive web platform with file upload capability to software systems including LabVIEW, Matlab, Python and R to perform the data filtering, analysis and visualisation of standing balance data. Input data can consist of any signal that comprises an anterior-posterior and medial-lateral coordinate trace such as center of pressure or mass displacement. This allows it to be used with systems including criterion-reference commercial force platforms and three-dimensional motion analysis, smartphones, accelerometers and low-cost technology such as the Nintendo Wii Balance Board and Microsoft Kinect. Filtering options include Butterworth, weighted and unweighted moving average, and discrete wavelet transforms. Analysis methods include standard techniques such as path length, amplitude, and root mean square in addition to less common but potentially promising methods such as sample entropy, detrended fluctuation analysis and multiresolution wavelet analysis. These data are visualised using scalograms, which chart the change in frequency content over time, scatterplots and standard line charts. This provides the user with a detailed understanding of their results, and how their different pre-processing and analysis method selections affect their findings. An example of the data analysis techniques is provided in the paper, with graphical representation of how advanced analysis methods can better discriminate between someone with neurological impairment and a healthy control. The goal of SeeSway is to provide a simple yet powerful educational and research tool to explore how standing balance is affected in aging and clinical populations. Copyright © 2018 Elsevier B.V. All rights reserved.
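    The kind of pre-processing and summary metrics listed above can be reproduced in a few lines; the sketch below applies a Butterworth low-pass filter and computes path length and root mean square distance from an anterior-posterior/medial-lateral coordinate trace. The cut-off frequency, sampling rate and synthetic trace are illustrative assumptions, and this is not the SeeSway code itself.

      # Sketch: low-pass filter a centre-of-pressure trace and compute two standard sway metrics
      import numpy as np
      from scipy.signal import butter, filtfilt

      fs, cutoff = 100.0, 10.0                       # sampling rate and cut-off [Hz], illustrative
      t = np.arange(0, 30, 1 / fs)
      ap = 0.5 * np.sin(0.4 * np.pi * t) + 0.05 * np.random.randn(t.size)   # toy AP trace [cm]
      ml = 0.3 * np.sin(0.6 * np.pi * t) + 0.05 * np.random.randn(t.size)   # toy ML trace [cm]

      b, a = butter(4, cutoff / (fs / 2), btype="low")
      ap_f, ml_f = filtfilt(b, a, ap), filtfilt(b, a, ml)

      path_length = np.sum(np.hypot(np.diff(ap_f), np.diff(ml_f)))
      rms = np.sqrt(np.mean((ap_f - ap_f.mean()) ** 2 + (ml_f - ml_f.mean()) ** 2))
      print(f"path length: {path_length:.2f} cm, RMS distance: {rms:.2f} cm")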

  1. Collecting, Visualising, Communicating and Modelling Geographic Data for the Sciences

    NASA Astrophysics Data System (ADS)

    Crooks, A.; Hudson-Smith, A.; Milton, R.; Smith, D.; Batty, M.; Neuhaus, F.

    2009-12-01

    New web technologies and task specific software packages and services are fundamentally changing the way we share, collect, visualise, communicate and distribute geographic information. Coupled with these new technologies is the emergence of rich fine scale and extensive geographical datasets of the built environment. Such technologies and data are providing opportunities for both the social and physical sciences that were unimaginable ten years ago. Within this paper we discuss such changes from our own experiences at the Centre for Advanced Spatial Analysis. Specifically, how it is now possible to harness the crowd to collect people's opinions about topical events such as the current financial crisis, in real time and map the results, through the use of our GMapCreator software and the MapTube website. Furthermore, such tools allow for widespread dissemination and visualisation of geographic data to whoever has an internet connection. We will explore how one can use new datasets to visualise the city using our Virtual London model as an example. Within the model individual buildings are tagged with multiple attributes, providing a lens to explore the urban structure and offering a plethora of research applications. We then turn to how one can visualise and communicate such data through low cost software and virtual worlds such as Crysis and Second Life, with a look into their potential for modelling, and finally how we disseminated much of this information through weblogs (blogs) such as Digital Urban, GIS and Agent-based modelling and Urban Tick.

  2. Landscape Visualisation on the Internet

    NASA Astrophysics Data System (ADS)

    Imhof, M. P.; Cox, M. T.; Harvey, D. W.; Heemskerk, G. E.; Pettit, C. J.

    2012-07-01

    The Victorian Resources Online (VRO) website (http://www.dpi.vic.gov.au/vro) is the principal means for accessing landscape-based information in Victoria. In this paper we introduce a range of online landscape visualisations that have been developed to enhance existing static web content around the nature and distribution of Victoria's landforms and soils as well as associated processes. Flash is used to develop online visualisations that include interactive landscape panoramas, animations of soil and landscape processes and videos of experts explaining features in the field as well as landscape "flyovers". The use of interactive visualisations adds rich multimedia content to otherwise static pages and offers the potential to improve users' appreciation and understanding of soils and landscapes. Visualisation is becoming a key component of knowledge management activities associated with VRO - proving useful for both "knowledge capture" (from subject matter specialists) and "knowledge transfer" to a diverse user base. A range of useful visualisation products have been made available online, with varying degrees of interactivity and suited to a variety of users. The use of video files, animation and interactive visualisations is adding rich information content to otherwise static web pages. These information products offer new possibilities to enhance learning about landscapes, and their effectiveness will be tested in the next phase of development.

  3. Services for Emodnet-Chemistry Data Products

    NASA Astrophysics Data System (ADS)

    Santinelli, Giorgio; Hendriksen, Gerrit; Barth, Alexander

    2016-04-01

    In the framework of the Emodnet Chemistry lot, data products from regional leaders were made available in order to transform the information into a database. This has been done by using functions and scripts that read so-called enriched ODV files and insert the data directly into a cloud relational geodatabase. The main table is the one of observations, which contains the main data and metadata associated with the enriched ODV files. A particular implementation in data loading is used in order to improve on-the-fly computational speed. Data from the Baltic Sea, North Sea, Mediterranean, Black Sea and part of the Atlantic region have been entered into the geodatabase and are consequently instantly available from the OceanBrowser Emodnet portal. Furthermore, Deltares has developed an application that provides additional visualisation services for the aggregated and validated data collections. The visualisations are produced by making use of part of the OpenEarthTool stack (http://www.openearth.eu), by the integration of Web Feature Services and by the implementation of Web Processing Services. The goal is the generation of server-side plots of timeseries, profiles, timeprofiles and maps of selected parameters from data sets of selected stations. Regional data collections are retrieved using the Emodnet Chemistry cloud relational geodatabase. The spatial resolution in time and the intensity of data availability for selected parameters are shown using Web Service requests via the OceanBrowser Emodnet Web portal. OceanBrowser also shows station reference codes, which are used to establish a link for additional metadata, further data shopping and download.
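    As a rough sketch of how Web Feature Services of the kind mentioned above can be queried from a script, the request below retrieves features for a station layer as GeoJSON; the endpoint URL, layer name and output format are hypothetical placeholders rather than the actual Emodnet Chemistry configuration.

      # Hypothetical WFS 2.0 GetFeature request (endpoint and typeNames are placeholders)
      import requests

      endpoint = "https://example.org/geoserver/wfs"     # placeholder, not the real service URL
      params = {
          "service": "WFS",
          "version": "2.0.0",
          "request": "GetFeature",
          "typeNames": "chemistry:stations",             # hypothetical layer name
          "outputFormat": "application/json",
          "count": 100,
      }
      response = requests.get(endpoint, params=params, timeout=30)
      stations = response.json()["features"]
      print(len(stations), "stations returned")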

  4. First Prototype of a Web Map Interface for ESA's Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Manaud, N.; Gonzalez, J.

    2014-04-01

    We present a first prototype of a Web Map Interface that will serve as a proof of concept and design for ESA's future fully web-based Planetary Science Archive (PSA) User Interface. The PSA is ESA's planetary science archiving authority and central repository for all scientific and engineering data returned by ESA's Solar System missions [1]. All data are compliant with NASA's Planetary Data System (PDS) Standards and are accessible through several interfaces [2]: in addition to serving all public data via FTP and the Planetary Data Access Protocol (PDAP), a Java-based User Interface provides advanced search, preview, download, notification and delivery-basket functionality. It allows the user to query and visualise instrument observation footprints using a map-based interface (currently only available for Mars Express HRSC and OMEGA instruments). During the last decade, the planetary mapping science community has increasingly been adopting Geographic Information System (GIS) tools and standards, originally developed for and used in Earth science. There is an ongoing effort to produce and share cartographic products through Open Geospatial Consortium (OGC) Web Services, or as standalone data sets, so that they can be readily used in existing GIS applications [3,4,5]. Previous studies conducted at ESAC [6,7] have helped identify the needs of Planetary GIS users, and define key areas of improvement for the future Web PSA User Interface. Its web map interface will provide access to the full geospatial content of the PSA, including (1) observation geometry footprints of all remote sensing instruments, and (2) all georeferenced cartographic products, such as HRSC map-projected data or OMEGA global maps from Mars Express. It shall aim to provide a rich user experience for search and visualisation of this content using modern and interactive web mapping technology. A comprehensive set of built-in context maps from external sources, such as MOLA topography, TES infrared maps or planetary surface nomenclature, provided in both simple cylindrical and polar stereographic projections, shall enhance this user experience. In addition, users should be able to import and export data in commonly used open GIS formats. It is also intended to serve all PSA geospatial data through OGC-compliant Web Services so that they can be captured, visualised and analysed directly from GIS software, along with data from other sources. The following figure illustrates how the PSA web map interface and services shall fit in a typical Planetary GIS user working environment.
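    To illustrate how OGC-compliant services of the kind planned for the PSA can be consumed from a GIS scripting environment, the sketch below requests a map image through the OWSLib package; the service URL and layer name are invented for the example and do not correspond to a deployed PSA endpoint.

      # Sketch: fetching a map image from a hypothetical planetary WMS with OWSLib
      from owslib.wms import WebMapService

      wms = WebMapService("https://example.org/psa/wms", version="1.1.1")  # placeholder URL
      img = wms.getmap(
          layers=["mex_hrsc_footprints"],      # hypothetical layer name
          styles=[""],
          srs="EPSG:4326",
          bbox=(-180, -90, 180, 90),
          size=(1024, 512),
          format="image/png",
          transparent=True,
      )
      with open("footprints.png", "wb") as fh:
          fh.write(img.read())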

  5. Visualising associations between paired ‘omics’ data sets

    PubMed Central

    2012-01-01

    Background Each omics platform is now able to generate a large amount of data. Genomics, proteomics, metabolomics, interactomics are compiled at an ever-increasing pace and now form a core part of the fundamental systems biology framework. Recently, several integrative approaches have been proposed to extract meaningful information. However, these approaches lack visualisation outputs to fully unravel the complex associations between different biological entities. Results The multivariate statistical approaches ‘regularized Canonical Correlation Analysis’ and ‘sparse Partial Least Squares regression’ were recently developed to integrate two types of high-dimensional ‘omics’ data and to select relevant information. Using the results of these methods, we propose to revisit a few graphical outputs to better understand the relationships between two ‘omics’ data sets and to better visualise the correlation structure between the different biological entities. These graphical outputs include Correlation Circle plots, Relevance Networks and Clustered Image Maps. We demonstrate the usefulness of such graphical outputs on several biological data sets and further assess their biological relevance using gene ontology analysis. Conclusions Such graphical outputs are undoubtedly useful to aid the interpretation of these promising integrative analysis tools and will certainly help in addressing fundamental biological questions and understanding systems as a whole. Availability The graphical tools described in this paper are implemented in the freely available R package mixOmics and in its associated web application. PMID:23148523

  6. eWaterCycle visualisation. combining the strength of NetCDF and Web Map Service: ncWMS

    NASA Astrophysics Data System (ADS)

    Hut, R.; van Meersbergen, M.; Drost, N.; Van De Giesen, N.

    2016-12-01

    As a result of the eWaterCycle global hydrological forecast we have created Cesium-ncWMS, a web application based on ncWMS and Cesium. ncWMS is a server-side application capable of reading any NetCDF file written using the Climate and Forecasting (CF) conventions and making the data available as a Web Map Service (WMS). ncWMS automatically determines the available variables in a file and creates maps colored according to the map data and a user-selected color scale. Cesium is a JavaScript 3D virtual globe library. It uses WebGL for rendering, which makes it very fast, and it is capable of displaying a wide variety of data types such as vectors, 3D models, and 2D maps. The forecast results are automatically uploaded to our web server running ncWMS. In turn, the web application can be used to change the settings for color maps and displayed data. The server uses the settings provided by the web application, together with the data in NetCDF, to provide WMS image tiles, time series data and legend graphics to the Cesium-ncWMS web application. The user can simultaneously zoom in to the very high resolution forecast results anywhere in the world, and get time series data for any point on the globe. The Cesium-ncWMS visualisation combines a global overview with locally relevant information in any browser. See the visualisation live at forecast.ewatercycle.org

  7. CyanoEXpress: A web database for exploration and visualisation of the integrated transcriptome of cyanobacterium Synechocystis sp. PCC6803.

    PubMed

    Hernandez-Prieto, Miguel A; Futschik, Matthias E

    2012-01-01

    Synechocystis sp. PCC6803 is one of the best studied cyanobacteria and an important model organism for our understanding of photosynthesis. The early availability of its complete genome sequence initiated numerous transcriptome studies, which have generated a wealth of expression data. Analysis of the accumulated data can be a powerful tool to study transcription in a comprehensive manner and to reveal underlying regulatory mechanisms, as well as to annotate genes whose functions are as yet unknown. However, the use of divergent microarray platforms, as well as distributed data storage, makes meta-analyses of Synechocystis expression data highly challenging, especially for researchers with limited bioinformatic expertise and resources. To facilitate utilisation of the accumulated expression data by a wider research community, we have developed CyanoEXpress, a web database for interactive exploration and visualisation of transcriptional response patterns in Synechocystis. CyanoEXpress currently comprises expression data for 3073 genes and 178 environmental and genetic perturbations obtained in 31 independent studies. At present, CyanoEXpress constitutes the most comprehensive collection of expression data available for Synechocystis. The database can be freely accessed at http://cyanoexpress.sysbiolab.eu.

  8. Managing Geological Profiles in Databases for 3D Visualisation

    NASA Astrophysics Data System (ADS)

    Jarna, A.; Grøtan, B. O.; Henderson, I. H. C.; Iversen, S.; Khloussy, E.; Nordahl, B.; Rindstad, B. I.

    2016-10-01

    Geology and all geological structures are three-dimensional in space. GIS and databases are common tools used by geologists to interpret and communicate geological data. The NGU (Geological Survey of Norway) is the national institution for the study of bedrock, mineral resources, surficial deposits, groundwater and marine geology. 3D geology is usually described by geological profiles, or vertical sections through a map, which show the rock structure below the surface. The goal is to gradually expand the usability of existing and new geological profiles, to make them more readily available in retail applications and to make the entry and registration of profiles easier. The project target is to develop a methodology for the acquisition, modification and use of data, and for its further presentation on the web, by creating a user interface directly linked to NGU's webpage. This will allow users to visualise profiles in a 3D model.

  9. COMAN: a web server for comprehensive metatranscriptomics analysis.

    PubMed

    Ni, Yueqiong; Li, Jun; Panagiotou, Gianni

    2016-08-11

    Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads, removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/. COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to help researchers with less expertise in bioinformatics answer microbiota-related biological questions and to increase the accessibility and interpretation of microbiota RNA-Seq data.

  10. Visualising Learning Design in LAMS: A Historical View

    ERIC Educational Resources Information Center

    Dalziel, James

    2011-01-01

    The Learning Activity Management System (LAMS) provides a web-based environment for the creation, sharing, running and monitoring of Learning Designs. A central feature of LAMS is the visual authoring environment, where educators use a drag-and-drop environment to create sequences of learning activities. The visualisation is based on boxes…

  11. Interactive Visualisations and Statistical Literacy

    ERIC Educational Resources Information Center

    Sutherland, Sinclair; Ridgway, Jim

    2017-01-01

    Statistical literacy involves engagement with the data one encounters. New forms of data and new ways to engage with data--notably via interactive data visualisations--are emerging. Some of the skills required to work effectively with these new visualisation tools are described. We argue that interactive data visualisations will have as profound…

  12. Evaluating virtual hosted desktops for graphics-intensive astronomy

    NASA Astrophysics Data System (ADS)

    Meade, B. F.; Fluke, C. J.

    2018-04-01

    Visualisation of data is critical to understanding astronomical phenomena. Today, many instruments produce datasets that are too big to be downloaded to a local computer, yet many of the visualisation tools used by astronomers are deployed only on desktop computers. Cloud computing is increasingly used to provide a computation and simulation platform in astronomy, but it also offers great potential as a visualisation platform. Virtual hosted desktops, with graphics processing unit (GPU) acceleration, allow interactive, graphics-intensive desktop applications to operate co-located with astronomy datasets stored in remote data centres. By combining benchmarking and user experience testing, with a cohort of 20 astronomers, we investigate the viability of replacing physical desktop computers with virtual hosted desktops. In our work, we compare two Apple MacBook computers (one old and one new, representing hardware at opposite ends of the useful lifetime) with two virtual hosted desktops: one commercial (Amazon Web Services) and one in a private research cloud (the Australian NeCTAR Research Cloud). For two-dimensional image-based tasks and graphics-intensive three-dimensional operations - typical of astronomy visualisation workflows - we found that benchmarks do not necessarily provide the best indication of performance. When compared to typical laptop computers, virtual hosted desktops can provide a better user experience, even with lower performing graphics cards. We also found that virtual hosted desktops are equally simple to use, provide greater flexibility in choice of configuration, and may actually be a more cost-effective option for typical usage profiles.

  13. Combining Open-Source Packages for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    Schmidt, Albrecht; Grieger, Björn; Völk, Stefan

    2015-04-01

    The science planning of the ESA Rosetta mission has presented challenges which were addressed by combining various open-source software packages, such as the SPICE toolkit, the Python language and the Web graphics library three.js. The challenge was to compute certain parameters from a pool of trajectories and (possible) attitudes to describe the behaviour of the spacecraft. To be able to do this declaratively and efficiently, a C library was implemented that allows the SPICE toolkit to be interfaced from the Python language for geometrical computations and as much data as possible to be processed during one subroutine call. To minimise the lines of code one has to write, special care was taken to ensure that the bindings were idiomatic and thus integrate well into the Python language and ecosystem. When done well, this greatly simplifies the structure of the code and facilitates testing for correctness by automatic test suites and visual inspections. For rapid visualisation and confirmation of correctness of results, the geometries were visualised with the three.js library, a popular JavaScript library for displaying three-dimensional graphics in a Web browser. Programmatically, this was achieved by generating data files from SPICE sources that were included into templated HTML and displayed by a browser, and thus made easily accessible to interested parties at large. As feedback came in and new ideas were to be explored, the authors benefited greatly from the design of the Python-to-SPICE library, which allowed the expression of algorithms to be concise and easier to communicate. In summary, by combining several well-established open-source tools, we were able to put together a flexible computation and visualisation environment that helped communicate and build confidence in planning ideas.
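    The Python-to-SPICE layer described above is the authors' own C binding; a comparable geometry computation can be sketched with the community spiceypy package instead, shown here with hypothetical kernel and body names simply to illustrate the kind of quantity being derived.

      # Sketch using spiceypy (not the authors' custom binding) to compute a spacecraft position
      # relative to a target body; the kernel file and body names are placeholders.
      import spiceypy as spice

      spice.furnsh("metakernel.tm")                      # hypothetical meta-kernel listing SPICE kernels
      et = spice.str2et("2015-04-01T12:00:00")           # UTC string -> ephemeris time

      # Position of the spacecraft as seen from the comet centre, in the J2000 frame,
      # corrected for light time ("LT"); the body names are illustrative.
      pos, light_time = spice.spkpos("ROSETTA", et, "J2000", "LT", "CHURYUMOV-GERASIMENKO")
      print("distance [km]:", spice.vnorm(pos))

      spice.kclear()                                     # unload kernels when done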

  14. Situation Awareness. Report of Break-Out Group 4.

    DTIC Science & Technology

    2006-10-01

    abstraction of the objects and contents, which are displayed and analysed with a visualisation tool. To get a clear picture of the situation around you do...cities, villages etc.). It is possible for other types of information to be visualised /handled similarly for network and data analysis. A good example...interesting feature of a visualisation tool concerning reliability/uncertainty might be the possibility to show/hide uncertain information. This means that

  15. Extracting scientific articles from a large digital archive: BioStor and the Biodiversity Heritage Library.

    PubMed

    Page, Roderic D M

    2011-05-23

    The Biodiversity Heritage Library (BHL) is a large digital archive of legacy biological literature, comprising over 31 million pages scanned from books, monographs, and journals. During the digitisation process basic metadata about the scanned items is recorded, but not article-level metadata. Given that the article is the standard unit of citation, this makes it difficult to locate cited literature in BHL. Adding the ability to easily find articles in BHL would greatly enhance the value of the archive. A service was developed to locate articles in BHL based on matching article metadata to BHL metadata using approximate string matching, regular expressions, and string alignment. This article locating service is exposed as a standard OpenURL resolver on the BioStor web site http://biostor.org/openurl/. This resolver can be used on the web, or called by bibliographic tools that support OpenURL. BioStor provides tools for extracting, annotating, and visualising articles from the Biodiversity Heritage Library. BioStor is available from http://biostor.org/.

  16. Trends in gel dosimetry: Preliminary bibliometric overview of active growth areas, research trends and hot topics from Gore’s 1984 paper onwards

    NASA Astrophysics Data System (ADS)

    Baldock, C.

    2017-05-01

    John Gore’s seminal 1984 paper on gel dosimetry spawned a vibrant research field ranging from fundamental science through to clinical applications. A preliminary bibliometric study was undertaken of the gel dosimetry family of publications inspired by, and resulting from, Gore’s original 1984 paper to determine active growth areas, research trends and hot topics from Gore’s paper up to and including 2016. Themes and trends of the gel dosimetry research field were bibliometrically explored by way of co-occurrence term maps using the titles and abstracts text corpora from the Web of Science database for all relevant papers from 1984 to 2016. Visualisation of similarities was used by way of the VOSviewer visualisation tool to generate cluster maps of gel dosimetry knowledge domains and the associated citation impact of topics within the domains. Heat maps were then generated to assist in the understanding of active growth areas, research trends, and emerging and hot topics in gel dosimetry.

  17. Using ESO Reflex with Web Services

    NASA Astrophysics Data System (ADS)

    Järveläinen, P.; Savolainen, V.; Oittinen, T.; Maisala, S.; Ullgrén, M.; Hook, R.

    2008-08-01

    ESO Reflex is a prototype graphical workflow system, based on Taverna, and primarily intended to be a flexible way of running ESO data reduction recipes along with other legacy applications and user-written tools. ESO Reflex can also readily use the Taverna Web Services features that are based on the Apache Axis SOAP implementation. Taverna is a general purpose Web Service client, and requires no programming to use such services. However, Taverna also has some restrictions: for example, no numerical types such as integers. In addition the preferred binding style is document/literal wrapped, but most astronomical services publish the Axis default WSDL using RPC/encoded style. Despite these minor limitations we have created a simple but very promising test VO workflow using the Sesame name resolver service at CDS Strasbourg, the Hubble SIAP server at the Multi-Mission Archive at Space Telescope (MAST) and the WESIX image cataloging and catalogue cross-referencing service at the University of Pittsburgh. ESO Reflex can also pass files and URIs via the PLASTIC protocol to visualisation tools and has its own viewer for VOTables. We picked these three Web Services to try to set up a realistic and useful ESO Reflex workflow. They also demonstrate ESO Reflex's ability to use many kinds of Web Services, because each of them requires a different interface. We describe each of these services in turn and comment on how each was used.

  18. Information visualisation based on graph models

    NASA Astrophysics Data System (ADS)

    Kasyanov, V. N.; Kasyanova, E. V.

    2013-05-01

    Information visualisation is a key component of support tools for many applications in science and engineering. A graph is an abstract structure that is widely used to model information for its visualisation. In this paper, we consider a practical and general graph formalism called hierarchical graphs and present the Higres and Visual Graph systems aimed at supporting information visualisation on the basis of hierarchical graph models.

  19. The MetabolomeExpress Project: enabling web-based processing, analysis and transparent dissemination of GC/MS metabolomics datasets.

    PubMed

    Carroll, Adam J; Badger, Murray R; Harvey Millar, A

    2010-07-14

    Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas-chromatography/mass-spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g. metabolite, species, organ/biofluid, etc.). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress https://www.metabolome-express.org provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
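    The statistical re-analysis tools listed above (t-test, principal components analysis, clustering) correspond to a handful of lines in a scripting environment; the sketch below runs a principal components analysis on a random matrix standing in for a metabolite response table, purely as an offline illustration of the kind of analysis the web tool offers.

      # Sketch of a PCA re-analysis on an illustrative (random) samples x metabolites matrix
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      samples = rng.normal(size=(20, 50))         # 20 samples x 50 metabolite response values (toy data)
      scores = PCA(n_components=2).fit_transform(samples)
      print(scores[:3])                           # first three samples in PC1/PC2 space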

  20. An open source, web based, simple solution for seismic data dissemination and collaborative research

    NASA Astrophysics Data System (ADS)

    Diviacco, Paolo

    2005-06-01

    Collaborative research and data dissemination in the field of geophysical exploration need network tools that can access large amounts of data from anywhere using any PC or workstation. Simple solutions based on a combination of Open Source software can be developed to address such requests, exploiting the possibilities offered by web technologies and at the same time avoiding the costs and inflexibility of commercial systems. A viable solution consists of MySQL for data storage and retrieval, CWP/SU and GMT for data visualisation, and a scripting layer driven by PHP that allows users to access the system via an Apache web server. In the light of the experience of building the on-line archive of seismic data of the Istituto Nazionale di Oceanografia e di Geofisica Sperimentale (OGS), we describe the solutions and the methods adopted, with a view to stimulating both networked collaborative research at institutions similar to ours and the development of different applications.

  1. ACORNS: A Tool for the Visualisation and Modelling of Atypical Development

    ERIC Educational Resources Information Center

    Moore, D. G.; George, R.

    2011-01-01

    Across many academic disciplines visualisation and notation systems are used for modelling data and developing theory, but in child development visual models are not widely used; yet researchers and students of developmental difficulties may benefit from a visualisation and notation system which can clearly map developmental outcomes and…

  2. Accessing eSDO Solar Image Processing and Visualization through AstroGrid

    NASA Astrophysics Data System (ADS)

    Auden, E.; Dalla, S.

    2008-08-01

    The eSDO project is funded by the UK's Science and Technology Facilities Council (STFC) to integrate Solar Dynamics Observatory (SDO) data, algorithms, and visualization tools with the UK's Virtual Observatory project, AstroGrid. In preparation for the SDO launch in January 2009, the eSDO team has developed nine algorithms covering coronal behaviour, feature recognition, and global / local helioseismology. Each of these algorithms has been deployed as an AstroGrid Common Execution Architecture (CEA) application so that they can be included in complex VO workflows. In addition, the PLASTIC-enabled eSDO "Streaming Tool" online movie application allows users to search multi-instrument solar archives through AstroGrid web services and visualise the image data through galleries, an interactive movie viewing applet, and QuickTime movies generated on-the-fly.

  3. Integrating geo web services for a user driven exploratory analysis

    NASA Astrophysics Data System (ADS)

    Moncrieff, Simon; Turdukulov, Ulanbek; Gulland, Elizabeth-Kate

    2016-04-01

    In data exploration, several online data sources may need to be dynamically aggregated or summarised over a spatial region, time interval, or set of attributes. With respect to thematic data, web services are mainly used to present results, leading to a supplier-driven service model that limits exploration of the data. In this paper we propose a user-need-driven service model based on geo web processing services. The aim of the framework is to provide a method for scalable and interactive access to various geographic data sources on the web. The architecture combines a data query, a processing technique and a visualisation methodology to rapidly integrate and visually summarise properties of a dataset. We illustrate the environment on a health-related use case that derives the Age Standardised Rate - a dynamic index that needs integration of existing interoperable web services for demographic data in conjunction with standalone, non-spatial, secure database servers used in health research. Although the example is specific to the health field, the architecture and the proposed approach are relevant and applicable to other fields that require integration and visualisation of geo datasets from various web services, and thus, we believe, the approach is generic.
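    Since the use case above centres on deriving an Age Standardised Rate from aggregated demographic and health data, a minimal sketch of that calculation (direct standardisation against a reference population, with made-up numbers) is given below; it is an illustration of the index, not the paper's processing service.

      # Direct age standardisation: weight age-specific rates by a standard population
      # (all figures are invented for illustration)
      age_groups = ["0-14", "15-44", "45-64", "65+"]
      cases      = [12, 85, 140, 230]             # observed cases per age group
      population = [50000, 120000, 80000, 40000]  # local population at risk
      std_pop    = [0.26, 0.41, 0.22, 0.11]       # standard population weights (sum to 1)

      age_specific_rates = [c / p for c, p in zip(cases, population)]
      asr = sum(r * w for r, w in zip(age_specific_rates, std_pop)) * 100000
      print(f"Age Standardised Rate: {asr:.1f} per 100,000")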

  4. Sharing knowledge of Planetary Datasets through the Web-Based PRoGIS

    NASA Astrophysics Data System (ADS)

    Giordano, M. G.; Morley, J. M.; Muller, J. P. M.; Barnes, R. B.; Tao, Y. T.

    2015-10-01

    The large amount of raw and derived data available from various planetary surface missions (e.g. Mars and Moon in our case) has been integrated with co-registered and geocoded orbital image data to provide rover traverses and camera site locations in universal global co-ordinates [1]. This then allows an integrated GIS to use these geocoded products for scientific applications: we aim to create a web interface, PRoGIS, with minimal controls focusing on the usability and visualisation of the data, to allow planetary geologists to share annotated surface observations. These observations in a common context are shared between different tools and software (PRoGIS, Pro3D, 3D point cloud viewer). Our aim is to use only Open Source components that integrate Open Web Services for planetary data to make available a universal platform with a WebGIS interface, as well as a 3D point cloud and a panorama viewer to explore derived data. On top of these tools we are building capabilities to make and share annotations amongst users. We use Python and Django for the server-side framework and OpenLayers 3 for the WebGIS client. For good performance when previewing 3D data (point clouds, pictures on the surface and panoramas) we employ ThreeJS, a WebGL JavaScript library. Additionally, user and group controls allow scientists to store and share their observations. PRoGIS not only displays data but also launches sophisticated 3D vision reprocessing (PRoVIP) and an immersive 3D analysis environment (PRo3D).

  5. Extracting scientific articles from a large digital archive: BioStor and the Biodiversity Heritage Library

    PubMed Central

    2011-01-01

    Background The Biodiversity Heritage Library (BHL) is a large digital archive of legacy biological literature, comprising over 31 million pages scanned from books, monographs, and journals. During the digitisation process basic metadata about the scanned items is recorded, but not article-level metadata. Given that the article is the standard unit of citation, this makes it difficult to locate cited literature in BHL. Adding the ability to easily find articles in BHL would greatly enhance the value of the archive. Description A service was developed to locate articles in BHL based on matching article metadata to BHL metadata using approximate string matching, regular expressions, and string alignment. This article locating service is exposed as a standard OpenURL resolver on the BioStor web site http://biostor.org/openurl/. This resolver can be used on the web, or called by bibliographic tools that support OpenURL. Conclusions BioStor provides tools for extracting, annotating, and visualising articles from the Biodiversity Heritage Library. BioStor is available from http://biostor.org/. PMID:21605356

  6. Use of Data Visualisation in the Teaching of Statistics: A New Zealand Perspective

    ERIC Educational Resources Information Center

    Forbes, Sharleen; Chapman, Jeanette; Harraway, John; Stirling, Doug; Wild, Chris

    2014-01-01

    For many years, students have been taught to visualise data by drawing graphs. Recently, there has been a growing trend to teach statistics, particularly statistical concepts, using interactive and dynamic visualisation tools. Free down-loadable teaching and simulation software designed specifically for schools, and more general data visualisation…

  7. ATLAS Live: Collaborative Information Streams

    NASA Astrophysics Data System (ADS)

    Goldfarb, Steven; ATLAS Collaboration

    2011-12-01

    I report on a pilot project launched in 2010 focusing on facilitating communication and information exchange within the ATLAS Collaboration, through the combination of digital signage software and webcasting. The project, called ATLAS Live, implements video streams of information, ranging from detailed detector and data status to educational and outreach material. The content, including text, images, video and audio, is collected, visualised and scheduled using digital signage software. The system is robust and flexible, utilizing scripts to input data from remote sources, such as the CERN Document Server, Indico, or any available URL, and to integrate these sources into professional-quality streams, including text scrolling, transition effects, inter and intra-screen divisibility. Information is published via the encoding and webcasting of standard video streams, viewable on all common platforms, using a web browser or other common video tool. Authorisation is enforced at the level of the streaming and at the web portals, using the CERN SSO system.

  8. Open-Source web-based geographical information system for health exposure assessment

    PubMed Central

    2012-01-01

    This paper presents the design and development of an open source web-based Geographical Information System allowing users to visualise, customise and interact with spatial data within their web browser. The developed application shows that by using solely Open Source software it was possible to develop a customisable web based GIS application that provides functions necessary to convey health and environmental data to experts and non-experts alike without the requirement of proprietary software. PMID:22233606

  9. Multinational Experiment 7 Cyber Domain Outcome 3. Cyber Situational Awareness. Limited Objective Experiment Report

    DTIC Science & Technology

    2013-02-28

    situational awareness (some surprise at the value of information received from other sectors). • The visualisation technology provided, significantly...digital text and video communications; streaming information feeds; and infrastructure visualisation - all cell teams were provided identical tools...better understanding cyber of situational awareness. The visualisation technology and experiment environment were a first attempt at trying to represent

  10. SLIVISU, an Interactive Visualisation Framework for Analysis of Geological Sea-Level Indicators

    NASA Astrophysics Data System (ADS)

    Klemann, V.; Schulte, S.; Unger, A.; Dransch, D.

    2011-12-01

    Flanking data analysis in earth system sciences with advanced visualisation tools is increasingly important owing to the rising complexity, amount and variety of available data. With respect to sea-level indicators (SLIs), their analysis in earth-system applications, such as modelling and simulation on regional or global scales, demands the consideration of large amounts of data - we talk about thousands of SLIs - and therefore has to go beyond the analysis of single sea-level curves. On the other hand, a gross analysis by means of statistical methods is hindered by the often heterogeneous and individual character of the single SLIs, i.e., the spatio-temporal context and often heterogeneous information are difficult to handle or to represent in an objective way. Therefore a concept integrating automated analysis and visualisation is mandatory. This is provided by visual analytics. As an implementation of this concept, we present the visualisation framework SLIVISU, developed at GFZ, which is based on multiple linked views and provides a synoptic analysis of observational data, model configurations, model outputs and results of automated analysis in glacial isostatic adjustment. Starting as a visualisation tool for an existing database of SLIs, it now serves as an analysis tool for the evaluation of model simulations in studies of glacial-isostatic adjustment.

  11. Towards a Decision Support Tool for 3d Visualisation: Application to Selectivity Purpose of Single Object in a 3d City Scene

    NASA Astrophysics Data System (ADS)

    Neuville, R.; Pouliot, J.; Poux, F.; Hallot, P.; De Rudder, L.; Billen, R.

    2017-10-01

    This paper deals with the establishment of a comprehensive methodological framework that defines 3D visualisation rules, and its application in a decision support tool. Whilst the use of 3D models grows in many application fields, their visualisation remains challenging from the point of view of the mapping and rendering aspects to be applied to suitably support the decision-making process. Indeed, there exists a great number of 3D visualisation techniques but, as far as we know, a decision support tool that facilitates the production of an efficient 3D visualisation is still missing. This is why a comprehensive methodological framework is proposed in order to build decision tables for specific data, tasks and contexts. Based on the second-order logic formalism, we define a set of functions and propositions among and between two collections of entities: on one hand static retinal variables (hue, size, shape…) and 3D environment parameters (directional lighting, shadow, haze…) and on the other hand their effect(s) regarding specific visual tasks. This enables the definition of 3D visualisation rules according to four categories: consequence, compatibility, potential incompatibility and incompatibility. In this paper, the application of the methodological framework is demonstrated for an urban visualisation at high density, considering a specific set of entities. On the basis of our analysis and the results of many studies conducted in 3D semiotics, which refers to the study of symbols and how they relay information, the truth values of the propositions are determined. 3D visualisation rules are then extracted for the considered context and set of entities and are presented in a decision table with a colour coding. Finally, the decision table is implemented in a plugin developed with three.js, a cross-browser JavaScript library. The plugin consists of a sidebar and warning windows that help the designer in the use of a set of static retinal variables and 3D environment parameters.
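    A compact way to see how such a decision table might be consulted by a tool is sketched below: a dictionary maps pairs of (visual variable or 3D environment parameter, visual task) to one of the four rule categories, and a helper flags incompatibilities. The specific entries and names are invented for illustration and are not the evaluated truth values of the paper.

      # Hypothetical decision table: (visual variable, visual task) -> rule category
      RULES = {
          ("hue", "identify_object_class"): "consequence",
          ("size", "compare_heights"): "compatibility",
          ("shadow", "read_exact_value"): "potential_incompatibility",
          ("haze", "locate_distant_object"): "incompatibility",
      }

      def check_design(choices, task):
          """Warn about variables that conflict with the intended visual task."""
          warnings = []
          for variable in choices:
              category = RULES.get((variable, task), "unknown")
              if category in ("potential_incompatibility", "incompatibility"):
                  warnings.append(f"{variable}: {category} with task '{task}'")
          return warnings

      print(check_design(["hue", "haze"], "locate_distant_object"))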

  12. The Cadmio XML healthcare record.

    PubMed

    Barbera, Francesco; Ferri, Fernando; Ricci, Fabrizio L; Sottile, Pier Angelo

    2002-01-01

    The management of clinical data is a complex task. Patient-related information reported in patient folders is a set of heterogeneous and structured data accessed by different users having different goals (in local or geographical networks). The XML language provides a mechanism for describing, manipulating and visualising structured data in web-based applications. XML ensures that the structured data are managed in a uniform and transparent manner, independently of the applications and their providers, guaranteeing some interoperability. Extracting data from the healthcare record and structuring them according to XML makes the data available through browsers. The MIC/MIE model (Medical Information Category/Medical Information Elements), which allows the definition and management of healthcare records and is used in CADMIO, a HISA-based project, is described in this paper; it uses XML to allow the data to be visualised through web browsers.

  13. SpineCreator: a Graphical User Interface for the Creation of Layered Neural Models.

    PubMed

    Cope, A J; Richmond, P; James, S S; Gurney, K; Allerton, D J

    2017-01-01

    There is a growing requirement in computational neuroscience for tools that permit collaborative model building, model sharing, combining existing models into a larger system (multi-scale model integration), and are able to simulate models using a variety of simulation engines and hardware platforms. Layered XML model specification formats solve many of these problems; however, they are difficult to write and visualise without tools. Here we describe a new graphical software tool, SpineCreator, which facilitates the creation and visualisation of layered models of point spiking neurons or rate coded neurons without the need for programming. We demonstrate the tool through the reproduction and visualisation of published models and show simulation results using code generation interfaced directly into SpineCreator. As a unique application for the graphical creation of neural networks, SpineCreator represents an important step forward for neuronal modelling.

  14. visPIG--a web tool for producing multi-region, multi-track, multi-scale plots of genetic data.

    PubMed

    Scales, Matthew; Jäger, Roland; Migliorini, Gabriele; Houlston, Richard S; Henrion, Marc Y R

    2014-01-01

    We present VISual Plotting Interface for Genetics (visPIG; http://vispig.icr.ac.uk), a web application to produce multi-track, multi-scale, multi-region plots of genetic data. visPIG has been designed to allow users not well versed in mathematical software packages and/or programming languages such as R, Matlab®, Python, etc., to integrate data from multiple sources for interpretation and to easily create publication-ready figures. While web tools such as the UCSC Genome Browser or the WashU Epigenome Browser allow custom data uploads, such tools are primarily designed for data exploration. This is also true for the desktop-run Integrative Genomics Viewer (IGV). Other locally run data visualisation software such as Circos requires significant computer skills of the user. The visPIG web application is a menu-based interface that allows users to upload custom data tracks and set track-specific parameters. Figures can be downloaded as PDF or PNG files. For sensitive data, the underlying R code can also be downloaded and run locally. visPIG is multi-track: it can display many different data types (e.g. association, functional annotation, intensity, interaction, heat map data, …). It also allows annotation of genes and other custom features in the plotted region(s). Data tracks can be plotted individually or on a single figure. visPIG is multi-region: it supports plotting multiple regions, be they kilo- or megabases apart or even on different chromosomes. Finally, visPIG is multi-scale: a sub-region of particular interest can be 'zoomed' in on. We describe the various features of visPIG and illustrate its utility with examples. visPIG is freely available through http://vispig.icr.ac.uk under a GNU General Public License (GPLv3).

  15. EarthServer: Visualisation and use of uncertainty as a data exploration tool

    NASA Astrophysics Data System (ADS)

    Walker, Peter; Clements, Oliver; Grant, Mike

    2013-04-01

    The Ocean Science/Earth Observation community generates huge datasets from satellite observation. Until recently it has been difficult to obtain matching uncertainty information for these datasets and to apply this to their processing. In order to make use of uncertainty information when analysing "Big Data" we need both the uncertainty itself (attached to the underlying data) and a means of working with the combined product without requiring the entire dataset to be downloaded. The European Commission FP7 project EarthServer (http://earthserver.eu) is addressing the problem of accessing and ad-hoc analysis of extreme-size Earth Science data using cutting-edge Array Database technology. The core software (Rasdaman) and web services wrapper (Petascope) allow huge datasets to be accessed using Open Geospatial Consortium (OGC) standard interfaces, including the well-established Web Coverage Service (WCS) and Web Map Service (WMS) standards as well as the emerging Web Coverage Processing Service (WCPS) standard. The WCPS standard allows the running of ad-hoc queries on any of the data stored within Rasdaman, creating an infrastructure where users are not restricted by bandwidth when manipulating or querying huge datasets. The ESA Ocean Colour - Climate Change Initiative (OC-CCI) project (http://www.esa-oceancolour-cci.org/) is producing high-resolution, global ocean colour datasets over the full time period (1998-2012) where high-quality observations were available. This climate data record includes per-pixel uncertainty data for each variable, based on an analytic method that classifies how much and which types of water are present in a pixel, and assigns uncertainty based on robust comparisons to global in-situ validation datasets. These uncertainty values take two forms, Root Mean Square (RMS) and bias uncertainty, respectively representing the expected variability and expected offset error. By combining the data produced through the OC-CCI project with the software from the EarthServer project we can produce a novel data offering that allows the use of traditional exploration and access mechanisms such as WMS and WCS. However, the real benefits can be seen when utilising WCPS to explore the data. We will show two major benefits of this infrastructure. Firstly, we will show that the visualisation of the combined chlorophyll and uncertainty datasets through a web-based GIS portal gives users the ability to instantaneously assess the quality of the data they are exploring, using traditional web-based plotting techniques as well as novel web-based 3-dimensional visualisation. Secondly, we will showcase the benefits available when combining these data with the WCPS standard. The uncertainty data can be utilised in queries using the standard WCPS query language. This allows selection of data, either for download or for use within the query, based on the respective uncertainty values, as well as the possibility of incorporating both the chlorophyll data and uncertainty data into complex queries to produce additional novel data products. By filtering with uncertainty at the data source rather than the client we can minimise traffic over the network, allowing huge datasets to be worked on with a minimal time penalty.
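
    As an illustration of the kind of server-side processing WCPS enables, a subsetting query could be sent to a rasdaman/Petascope style endpoint roughly as follows; the endpoint URL and coverage name are placeholders, and the exact key-value parameters may differ between deployments:

        # Hedged sketch of issuing a WCPS query over HTTP. The endpoint URL and the
        # coverage name "CHL_OCCCI" are placeholders; the parameter names follow the
        # OGC WCS 2.0 processing extension as commonly exposed by rasdaman/Petascope,
        # but individual deployments may differ.
        import requests

        endpoint = "https://example.org/rasdaman/ows"   # placeholder endpoint
        query = (
            'for c in (CHL_OCCCI) '
            'return encode(c[Lat(40:50), Long(-10:0), ansi("2010-06-01")], "netcdf")'
        )

        response = requests.get(
            endpoint,
            params={
                "service": "WCS",
                "version": "2.0.1",
                "request": "ProcessCoverages",
                "query": query,
            },
            timeout=60,
        )
        response.raise_for_status()
        with open("chl_subset.nc", "wb") as f:
            f.write(response.content)
        # A similar query could combine the chlorophyll coverage with its RMS/bias
        # uncertainty coverages, so that uncertainty-based filtering happens on the
        # server rather than on the client.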

  16. The use of interactive graphical maps for browsing medical/health Internet information resources

    PubMed Central

    Boulos, Maged N Kamel

    2003-01-01

    As online information portals accumulate metadata descriptions of Web resources, it becomes necessary to develop effective ways for visualising and navigating the resultant huge metadata repositories as well as the different semantic relationships and attributes of described Web resources. Graphical maps provide a good method to visualise, understand and navigate a world that is too large and complex to be seen directly, like the Web. Several examples of maps designed as a navigational aid for Web resources are presented in this review, with an emphasis on maps of medical and health-related resources. The latter include HealthCyberMap maps, which can be classified as conceptual information space maps, and the very abstract and geometric Visual Net maps of PubMed. Information resources can also be organised and navigated based on their geographic attributes. Some of the maps presented in this review use a Kohonen Self-Organising Map algorithm, and only HealthCyberMap uses a Geographic Information System to classify Web resource data and render the maps. Maps based on familiar metaphors taken from users' everyday life are much easier to understand. Associative and pictorial map icons that enable instant recognition and comprehension are preferred to geometric ones and are key to successful maps for browsing medical/health Internet information resources. PMID:12556244

  17. The VERCE Science Gateway: Enabling User Friendly HPC Seismic Wave Simulations.

    NASA Astrophysics Data System (ADS)

    Casarotti, E.; Spinuso, A.; Matser, J.; Leong, S. H.; Magnoni, F.; Krause, A.; Garcia, C. R.; Muraleedharan, V.; Krischer, L.; Anthes, C.

    2014-12-01

    The EU-funded project VERCE (Virtual Earthquake and seismology Research Community in Europe) aims to deploy technologies which satisfy the HPC and data-intensive requirements of modern seismology. As a result of VERCE's official collaboration with the EU project SCI-BUS, access to computational resources, like local clusters and international infrastructures (EGI and PRACE), is made homogeneous and integrated within a dedicated science gateway based on the gUSE framework. In this presentation we give a detailed overview of the progress achieved with the development of the VERCE Science Gateway, according to a use-case driven implementation strategy. More specifically, we show how the computational technologies and data services have been integrated within a tool for Seismic Forward Modelling, whose objective is to offer the possibility to perform simulations of seismic waves as a service to the seismological community. We will introduce the interactive components of the OGC map-based web interface and how it supports the user with setting up the simulation. We will go through the selection of input data, which are either fetched from federated seismological web services, adopting community standards, or provided by the users themselves by accessing their own document data store. The HPC scientific codes can be selected from a number of waveform simulators, currently available to the seismological community as batch tools or with limited configuration capabilities in their interactive online versions. The results will be staged out via a secure GridFTP transfer to a VERCE data layer managed by iRODS. The provenance information of the simulation will be automatically catalogued by the data layer via NoSQL technologies. Finally, we will show an example of how the visualisation output of the gateway could be enhanced by connection with immersive projection technology at the Virtual Reality and Visualisation Centre of the Leibniz Supercomputing Centre (LRZ).

  18. Web tools for large-scale 3D biological images and atlases

    PubMed Central

    2012-01-01

    Background Large-scale volumetric biomedical image data of three or more dimensions are a significant challenge for distributed browsing and visualisation. Many images now exceed 10GB, which for most users is too large to handle in terms of computer RAM and network bandwidth. This is aggravated when users need to access tens or hundreds of such images from an archive. Here we solve the problem for 2D section views through archive data, delivering compressed tiled images that enable users to browse through very large volume data in the context of a standard web browser. The system provides interactive visualisation for grey-level and colour 3D images, including multiple image layers and spatial-data overlay. Results The standard Internet Imaging Protocol (IIP) has been extended to enable arbitrary 2D sectioning of 3D data as well as multi-layered images and indexed overlays. The extended protocol is termed IIP3D and we have implemented a matching server to deliver the protocol and a series of Ajax/JavaScript client codes that will run in an Internet browser. We have tested the server software on a low-cost Linux-based server for image volumes up to 135GB and 64 simultaneous users. The section views are delivered with response times independent of scale and orientation. The exemplar client provided multi-layer image views with user-controlled colour-filtering and overlays. Conclusions Interactive browsing of arbitrary sections through large biomedical-image volumes is made possible by use of an extended internet protocol and efficient server-based image tiling. The tools open the possibility of enabling fast access to large image archives without the requirement of whole-image download or client computers with very large memory configurations. The system was demonstrated using a range of medical and biomedical image data extending up to 135GB for a single image volume. PMID:22676296
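
    A hedged sketch of how a client fetches such tiles over HTTP: the server URL is a placeholder, FIF and JTL are keywords of the base IIP protocol, and the SECTION parameter shown is purely hypothetical, standing in for whatever IIP3D keywords actually select the section plane:

        # Sketch of tiled access to a large image over an IIP-style protocol.
        # FIF (image path) and JTL (resolution level, tile index) are base-IIP
        # keywords; "SECTION" is a hypothetical stand-in for the IIP3D parameters
        # that select an arbitrary 2D plane through the 3D volume.
        import requests

        server = "https://example.org/fcgi-bin/iipsrv.fcgi"   # placeholder server URL

        def fetch_tile(volume, level, tile_index, plane="z=512"):
            # Ask the server for one compressed tile of one 2D section; the client
            # stitches many such tiles into the view the user is browsing.
            url = f"{server}?FIF={volume}&SECTION={plane}&JTL={level},{tile_index}"
            r = requests.get(url, timeout=30)
            r.raise_for_status()
            return r.content      # compressed tile bytes, ready for the browser viewer

        tile = fetch_tile("embryo_volume.tif", level=3, tile_index=0)
        print(len(tile), "bytes received")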

  19. CNVinspector: a web-based tool for the interactive evaluation of copy number variations in single patients and in cohorts.

    PubMed

    Knierim, Ellen; Schwarz, Jana Marie; Schuelke, Markus; Seelow, Dominik

    2013-08-01

    Many genetic disorders are caused by copy number variations (CNVs) in the human genome. However, the large number of benign CNV polymorphisms makes it difficult to delineate causative variants for a certain disease phenotype. Hence, we set out to create software that accumulates and visualises locus-specific knowledge and enables clinicians to study their own CNVs in the context of known polymorphisms and disease variants. CNV data from healthy cohorts (Database of Genomic Variants) and from disease-related databases (DECIPHER) were integrated into a joint resource. Data are presented in an interactive web-based application that allows inspection, evaluation and filtering of CNVs in single individuals or in entire cohorts. CNVinspector provides simple interfaces to upload CNV data, compare them with users' own or published control data and visualise the results in graphical interfaces. Beyond choosing control data from different public studies, platforms and methods, dedicated filter options allow the detection of CNVs that are either enriched in patients or depleted in controls. Alternatively, a search can be restricted to those CNVs that appear in individuals of similar clinical phenotype. For each gene of interest within a CNV, we provide a link to NCBI, ENSEMBL and the GeneDistiller search engine to browse for potential disease-associated genes. With its user-friendly handling, the integration of control data and the filtering options, CNVinspector will facilitate the daily work of clinical geneticists and accelerate the delineation of new syndromes and gene functions. CNVinspector is freely accessible at http://www.cnvinspector.org.
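
    The core filtering idea (flagging patient CNVs that are rarely seen in controls) can be sketched in a few lines of Python; the overlap rule and thresholds below are illustrative only and are not CNVinspector's actual algorithm:

        # Illustrative sketch of one CNV filtering idea: keep patient CNVs that are
        # rarely observed in control cohorts. The overlap rule and thresholds are
        # invented, not CNVinspector's implementation.
        def reciprocal_overlap(a, b, min_frac=0.5):
            """True if intervals a=(chrom, start, end) and b overlap by >= min_frac of both."""
            if a[0] != b[0]:
                return False
            inter = min(a[2], b[2]) - max(a[1], b[1])
            return inter > 0 and inter >= min_frac * (a[2] - a[1]) and inter >= min_frac * (b[2] - b[1])

        def rare_in_controls(patient_cnvs, control_cnvs, max_hits=0):
            rare = []
            for cnv in patient_cnvs:
                hits = sum(reciprocal_overlap(cnv, ctrl) for ctrl in control_cnvs)
                if hits <= max_hits:
                    rare.append(cnv)
            return rare

        patients = [("chr7", 1_200_000, 1_900_000), ("chr15", 22_000_000, 23_500_000)]
        controls = [("chr15", 21_900_000, 23_600_000)]
        print(rare_in_controls(patients, controls))   # -> only the chr7 CNV remains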

  20. Online 3D terrain visualisation using Unity 3D game engine: A comparison of different contour intervals terrain data draped with UAV images

    NASA Astrophysics Data System (ADS)

    Hafiz Mahayudin, Mohd; Che Mat, Ruzinoor

    2016-06-01

    The main objective of this paper is to discuss the effectiveness of visualising terrain draped with Unmanned Aerial Vehicle (UAV) images generated from different contour intervals using the Unity 3D game engine in an online environment. The study area tested in this project was an oil palm plantation at Sintok, Kedah. The contour data used for this study are divided into three different intervals: 1 m, 3 m and 5 m. ArcGIS software was used to clip the contour data and the UAV image data to the same extent for the overlaying process. The Unity 3D game engine was used as the main platform for developing the system because applications built with it can be launched on different platforms. The clipped contour data and UAV image data were processed and exported into web format using Unity 3D, and then published to a web server in order to compare the effectiveness of the different 3D terrain data (contour data) draped with UAV images. The effectiveness is compared based on data size, loading time (office and out-of-office hours), response time, visualisation quality and frames per second (fps). The results suggest which contour interval is better for developing an effective online 3D terrain visualisation draped with UAV images using the Unity 3D game engine, and therefore help decision makers and planners in this field decide which contour interval is applicable to their task.

  1. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
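
    As a minimal illustration of how such a geo-referenced climate network is typically constructed before any visualisation (thresholding the correlation matrix of grid-point time series), consider the following Python sketch; the data are synthetic and this is not the CGV or GTX code described in the paper:

        # Minimal sketch of constructing a climate network: nodes are grid points,
        # edges link pairs of points whose time series correlate above a threshold.
        # Synthetic data; not the CGV/GTX tools described in the paper.
        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(0)
        n_nodes, n_steps = 50, 500
        series = rng.standard_normal((n_nodes, n_steps))   # one time series per grid point
        corr = np.corrcoef(series)                          # statistical interrelationship structure

        threshold = 0.1
        G = nx.Graph()
        G.add_nodes_from(range(n_nodes))
        for i in range(n_nodes):
            for j in range(i + 1, n_nodes):
                if abs(corr[i, j]) >= threshold:
                    G.add_edge(i, j, weight=corr[i, j])

        # Standard network measures that a static workflow would then map geographically:
        print("mean degree:", sum(dict(G.degree()).values()) / n_nodes)
        print("average clustering:", nx.average_clustering(G))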

  2. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  3. Using Web Crawler Technology for Text Analysis of Geo-Events: A Case Study of the Huangyan Island Incident

    NASA Astrophysics Data System (ADS)

    Hu, H.; Ge, Y. J.

    2013-11-01

    As social networking and the socialisation of networks have brought more text information and social relationships into our daily lives, the question of whether big data can be fully used to study phenomena and disciplines of the natural sciences has prompted many specialists and scholars to innovate in their research. Although politics has been integrally involved in hyperlinked web issues since the 1990s, and the automatic assembly of different geospatial web services and distributed geospatial information systems utilizing service chaining has recently been explored and built, the information collection and data visualisation of geo-events have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness and unexpected character of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology and text analysis methods. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios and dissemination flow graph based on the collected and processed data not only highlight the subject and theme vocabularies of related topics but also reveal certain issues and problems behind them. Being able to express the time-space relationships of text information and to trace the dissemination of information regarding geo-events, the text analysis of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinion on political events.
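
    The word-frequency step of such a pipeline (the input that feeds a tag cloud or frequency map) reduces to a few lines of Python; the crawling itself would be done by Heritrix or a similar crawler and is not reproduced here, and the toy strings below stand in for segmented page text:

        # Sketch of the word-frequency step behind a tag cloud / frequency map.
        # Crawling (e.g. with Heritrix) and word segmentation are assumed to have
        # happened already; the "pages" below are toy strings.
        import re
        from collections import Counter

        pages = [
            "Huangyan Island incident statement sovereignty fishing vessels",
            "fishing vessels standoff Huangyan Island diplomatic statement",
        ]

        stopwords = {"the", "a", "of", "and"}
        tokens = []
        for text in pages:
            tokens += [w for w in re.findall(r"[a-z]+", text.lower()) if w not in stopwords]

        freq = Counter(tokens)
        print(freq.most_common(5))   # top terms -> sizes of the tag-cloud entries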

  4. SEPHYDRO: An Integrated Multi-Filter Web-Based Tool for Baseflow Separation

    NASA Astrophysics Data System (ADS)

    Serban, D.; MacQuarrie, K. T. B.; Popa, A.

    2017-12-01

    Knowledge of baseflow contributions to streamflow is important for understanding watershed scale hydrology, including groundwater-surface water interactions, impact of geology and landforms on baseflow, estimation of groundwater recharge rates, etc. Baseflow (or hydrograph) separation methods can be used as supporting tools in many areas of environmental research, such as the assessment of the impact of agricultural practices, urbanization and climate change on surface water and groundwater. Over the past few decades various digital filtering and graphically-based methods have been developed in an attempt to improve the assessment of the dynamics of the various sources of streamflow (e.g. groundwater, surface runoff, subsurface flow); however, these methods are not available under an integrated platform and, individually, often require significant effort for implementation. Here we introduce SEPHYDRO, an open access, customizable web-based tool, which integrates 11 algorithms allowing for separation of streamflow hydrographs. The streamlined interface incorporates a reference guide as well as additional information that allows users to import their own data, customize the algorithms, and compare, visualise and export results. The tool includes one-, two- and three-parameter digital filters as well as graphical separation methods and has been successfully applied in Atlantic Canada, in studies dealing with nutrient loading to fresh water and coastal water ecosystems. Future developments include integration of additional separation algorithms as well as incorporation of geochemical separation methods. SEPHYDRO has been developed through a collaborative research effort between the Canadian Rivers Institute, University of New Brunswick (Fredericton, New Brunswick, Canada), Agriculture and Agri-Food Canada and Environment and Climate Change Canada and is currently available at http://canadianriversinstitute.com/tool/
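
    As an illustration of what a one-parameter recursive digital filter of this kind does, here is the classic Lyne-Hollick form as a generic Python sketch; SEPHYDRO integrates algorithms of this family, but its exact implementations and parameter choices may differ:

        # Classic one-parameter recursive digital filter (Lyne & Hollick type) for
        # baseflow separation, shown as a generic sketch rather than SEPHYDRO's code.
        def baseflow_separation(q, alpha=0.925):
            """q: list of streamflow values; returns (baseflow, quickflow) lists."""
            quick = [max(0.5 * q[0], 0.0)]                 # simple initialisation assumption
            for t in range(1, len(q)):
                f = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
                quick.append(max(f, 0.0))                  # quickflow cannot be negative
            # baseflow is the remainder, constrained to 0 <= baseflow <= streamflow
            base = [min(max(qt - ft, 0.0), qt) for qt, ft in zip(q, quick)]
            return base, quick

        streamflow = [5.0, 5.2, 9.8, 14.0, 10.5, 7.3, 6.0, 5.5]
        base, quick = baseflow_separation(streamflow)
        print([round(b, 2) for b in base])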

  5. Visualising large hierarchies with Flextree

    NASA Astrophysics Data System (ADS)

    Song, Hongzhi; Curran, Edwin P.; Sterritt, Roy

    2003-05-01

    One of the main tasks in Information Visualisation research is creating visual tools to facilitate human understanding of large and complex information spaces. Hierarchies, being a good mechanism for organising such information, are ubiquitous. Although much research effort has been spent on finding useful representations for hierarchies, visualising large hierarchies is still a difficult topic. One of the difficulties is how to show both structure and node content information in one view. Another is how to achieve multiple foci in a focus+context visualisation. This paper describes a novel hierarchy visualisation technique called FlexTree to address these problems. It contains some important features that have not been exploited so far. In this visualisation, a profile or contour unique to the hierarchy being visualised can be obtained in a histogram-like layout. A normalised view of a common attribute of all nodes can be acquired, and the selection of this attribute is controllable by the user. Multiple foci are consistently accessible within a global context through interaction. Furthermore, it can handle a large hierarchy that contains several thousand nodes in a PC environment. In addition, results from an informal evaluation are also presented.

  6. Science Plan Visualisation for Rosetta

    NASA Astrophysics Data System (ADS)

    Schmidt, A.; Grieger, B.; Völk, S.

    2013-12-01

    Rosetta is a mission of the European Space Agency (ESA) to rendezvous with comet Churyumov-Gerasimenko in mid-2014. The trajectories and their corresponding operations are flexible and particularly complex. To make informed decisions among the many free parameters, novel ways to communicate operations to the community have been explored. To support science planning by communicating operational ideas and disseminating operational scenarios, the science ground segment makes use of Web-based visualisation technologies. To keep the threshold to analysing operations proposals as low as possible, various implementation techniques have been investigated. An important goal was to use the Web to make the content as accessible as possible. By adopting the recent standard WebGL and generating static pages of time-dependent three-dimensional views of the spacecraft, as well as the corresponding fields of view of instruments, directly from the operational and for-study files, users are given the opportunity to explore interactively in their Web browsers what is being proposed, in addition to using the traditional file products and analysing them in detail. The scenes and animations can be viewed in any modern Web browser and be combined with other analyses. This is to facilitate verification and cross-validation of complex products, often done by comparing different independent analyses and studies. By providing different timesteps in animations, it is possible to focus on long-term or short-term planning without distracting the user from the essentials. This is particularly important since the information that can be displayed in a Web browser is closely related to the data volume that can be transferred across the wire. In Web browsers, it is more challenging to do numerical calculations on demand. Since requests for additional data have to be passed through a Web server, they are more complex and also require a more complex infrastructure. The volume of data that can be kept in a browser environment is limited and might have to be transferred over often slow network links. Thus, careful design and reduction of data are required. Regarding user interaction, Web browsers are often limited to a mouse and keyboard. In terms of benefits, the threshold and turn-around times for discussing operational ideas using the visualisation techniques described here are lowered. An additional benefit of the approach was the cooperative use of products by distributed users, which resulted in higher-quality software and data by incorporating more feedback than would usually have been available.

  7. The centrality of meta-programming in the ES-DOC eco-system

    NASA Astrophysics Data System (ADS)

    Greenslade, Mark

    2017-04-01

    The Earth System Documentation (ES-DOC) project is an international effort aiming to deliver a robust earth system model inter-comparison project documentation infrastructure. Such infrastructure both simplifies & standardizes the process of documenting (in detail) projects, experiments, models, forcings & simulations. In support of CMIP6, ES-DOC has upgraded its eco-system of tools, web-services & web-sites. The upgrade consolidates the existing infrastructure (built for CMIP5) and extends it with the introduction of new capabilities. The strategic focus of the upgrade is to improve the documentation experience and to broaden the range of scientific use-cases that the archived documentation may help deliver. Whether it is highlighting dataset errors, exploring experimental protocols, comparing forcings across ensemble runs, understanding MIP objectives, reviewing citations, exploring component properties of configured models, or visualising inter-model relationships, scientists involved in CMIP6 will find the ES-DOC infrastructure helpful. This presentation underlines the centrality of meta-programming within the ES-DOC eco-system. We will demonstrate how agility is greatly enhanced by taking a meta-programming approach to representing data models and controlled vocabularies. Such an approach nicely decouples representations from encodings. Meta-models will be presented along with the associated tooling chain that forward-engineers artefacts as diverse as class hierarchies, IPython notebooks, mindmaps, configuration files, OWL & SKOS documents, spreadsheets, etc.
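
    A minimal sketch of the meta-programming idea (generating a class hierarchy from a declarative description, so that the same description could also drive other encodings); the names are invented and this is not ES-DOC's actual meta-model or tooling:

        # Illustrative sketch of meta-programming: a small declarative "meta-model"
        # from which Python classes are forward-engineered at runtime. The same
        # declaration could equally drive other encodings (JSON Schema, OWL, ...).
        # Names are invented; this is not ES-DOC's actual meta-model.
        META_MODEL = {
            "Experiment": {"name": str, "description": str},
            "Simulation": {"experiment": str, "model": str, "ensemble_member": int},
        }

        def build_classes(meta_model):
            classes = {}
            for class_name, fields in meta_model.items():
                def __init__(self, _fields=fields, **kwargs):
                    # coerce each declared field to its declared type, defaulting sensibly
                    for field, ftype in _fields.items():
                        setattr(self, field, ftype(kwargs.get(field, ftype())))
                classes[class_name] = type(class_name, (object,), {"__init__": __init__})
            return classes

        classes = build_classes(META_MODEL)
        sim = classes["Simulation"](experiment="historical", model="toy-model", ensemble_member=1)
        print(sim.model, sim.ensemble_member)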

  8. Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies

    PubMed Central

    Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F

    2016-01-01

    Background The complexity of current injury-related health issues demands the usage of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity for shaping the next generation of injury surveillance. Objective To introduce data visualisation conceptual bases, and propose a visual analytic and visualisation platform in public health surveillance for injury prevention and control. Methods The paper introduces data visualisation conceptual bases, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. Results The application of the visual analytic and visualisation platform is presented as a solution to improve access to heterogeneous data sources, enhance data exploration and analysis, communicate data effectively, and support decision-making. Conclusions Applications of data visualisation concepts and visual analytic platforms could play a key role in shaping the next generation of injury surveillance. A visual analytic and visualisation platform could improve data use, analytic capacity, and the ability to communicate findings and key messages effectively. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. PMID:26728006

  9. Making large amounts of meteorological plots easily accessible to users

    NASA Astrophysics Data System (ADS)

    Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin

    2015-04-01

    The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member organisations with forecasts in the medium time range of 3 to 15 days, and some longer-range forecasts for up to a year ahead, with varying degrees of detail. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise its products. This allows users to make use of and explore forecasts without having to transfer large amounts of raw data. This is especially true for products based on ECMWF's 50-member ensemble forecast, where some specific processing and visualisation are applied to extract information. Every day, thousands of raw data fields are pushed to ECMWF's interactive web charts application, ecCharts, and thousands of products are processed and pushed to ECMWF's institutional web site. ecCharts provides a highly interactive application to display and manipulate recent numerical forecasts for forecasters in national weather services and ECMWF's commercial customers. With ecCharts, forecasters are able to explore ECMWF's medium-range forecasts in far greater detail than has previously been possible on the web, and this as soon as the forecast becomes available. All ecCharts products are also available through a machine-to-machine web map service based on the OGC Web Map Service (WMS) standard. The ECMWF institutional web site provides access to a large number of graphical products. It was entirely redesigned last year; it now shares the same infrastructure as ecCharts and can benefit from some ecCharts functionalities, for example the dashboard. The dashboard, initially developed for ecCharts, allows users to organise their own collection of products depending on their workflow, and is being further developed. In its first implementation, it presents the user's products in a single interface with fast access to the original product, and possibilities of synchronous animations between them. Its functionalities are being extended to give users the freedom to collect not only ecCharts 2D maps and graphs, but also other ECMWF web products such as monthly and seasonal products, scores, and observation monitoring. The dashboard will play a key role in helping the user to interpret the large amount of information that ECMWF is providing. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs and show the new possibilities users have gained by using the web as a medium.
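
    Machine-to-machine access through a WMS follows the standard OGC GetMap request; a hedged Python sketch is given below, where the endpoint URL and layer name are placeholders rather than the actual ecCharts service configuration:

        # Sketch of a standard OGC WMS 1.3.0 GetMap request, of the kind a
        # machine-to-machine client would issue. The endpoint URL and layer name are
        # placeholders, not the actual ecCharts WMS configuration.
        import requests

        wms_endpoint = "https://example.int/wms"      # placeholder WMS endpoint
        params = {
            "SERVICE": "WMS",
            "VERSION": "1.3.0",
            "REQUEST": "GetMap",
            "LAYERS": "2m_temperature",               # hypothetical layer name
            "STYLES": "",
            "CRS": "EPSG:4326",
            "BBOX": "30,-30,75,50",                   # minLat,minLon,maxLat,maxLon in WMS 1.3.0
            "WIDTH": "800",
            "HEIGHT": "450",
            "FORMAT": "image/png",
            "TIME": "2015-04-01T12:00:00Z",           # forecast valid time (WMS TIME dimension)
        }
        r = requests.get(wms_endpoint, params=params, timeout=60)
        r.raise_for_status()
        with open("forecast_map.png", "wb") as f:
            f.write(r.content)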

  10. FoodMicrobionet: A database for the visualisation and exploration of food bacterial communities based on network analysis.

    PubMed

    Parente, Eugenio; Cocolin, Luca; De Filippis, Francesca; Zotta, Teresa; Ferrocino, Ilario; O'Sullivan, Orla; Neviani, Erasmo; De Angelis, Maria; Cotter, Paul D; Ercolini, Danilo

    2016-02-16

    Amplicon targeted high-throughput sequencing has become a popular tool for the culture-independent analysis of microbial communities. Although the data obtained with this approach are portable and the number of sequences available in public databases is increasing, no tool has been developed yet for the analysis and presentation of data obtained in different studies. This work describes an approach for the development of a database for the rapid exploration and analysis of data on food microbial communities. Data from seventeen studies investigating the structure of bacterial communities in dairy, meat, sourdough and fermented vegetable products, obtained by 16S rRNA gene targeted high-throughput sequencing, were collated and analysed using Gephi, a network analysis software. The resulting database, which we named FoodMicrobionet, was used to analyse nodes and network properties and to build an interactive web-based visualisation. The latter allows the visual exploration of the relationships between Operational Taxonomic Units (OTUs) and samples and the identification of core- and sample-specific bacterial communities. It also provides additional search tools and hyperlinks for the rapid selection of food groups and OTUs and for rapid access to external resources (NCBI taxonomy, digital versions of the original articles). Microbial interaction network analysis was carried out using CoNet on datasets extracted from FoodMicrobionet: the complexity of interaction networks was much lower than that found for other bacterial communities (human microbiome, soil and other environments). This may reflect both a bias in the dataset (which was dominated by fermented foods and starter cultures) and the lower complexity of food bacterial communities. Although some technical challenges exist, and are discussed here, the net result is a valuable tool for the exploration of food bacterial communities by the scientific community and food industry. Copyright © 2015. Published by Elsevier B.V.

  11. a Web-Based Interactive Tool for Multi-Resolution 3d Models of a Maya Archaeological Site

    NASA Astrophysics Data System (ADS)

    Agugiaro, G.; Remondino, F.; Girardi, G.; von Schwerin, J.; Richards-Rissetto, H.; De Amicis, R.

    2011-09-01

    Continuous technological advances in surveying, computing and digital-content delivery are strongly contributing to a change in the way Cultural Heritage is "perceived": new tools and methodologies for documentation, reconstruction and research are being created to assist not only scholars, but also to reach more potential users (e.g. students and tourists) willing to access more detailed information about art history and archaeology. 3D computer-simulated models, sometimes set in virtual landscapes, offer for example the chance to explore possible hypothetical reconstructions, while on-line GIS resources can help interactive analyses of relationships and change over space and time. While for some research purposes a traditional 2D approach may suffice, this is not the case for more complex analyses concerning spatial and temporal features of architecture, like for example the relationship of architecture and landscape, visibility studies etc. The project therefore aims at creating a tool, called "QueryArch3D", which enables the web-based visualisation and querying of an interactive, multi-resolution 3D model in the framework of Cultural Heritage. More specifically, a complete Maya archaeological site, located in Copan (Honduras), has been chosen as a case study to test and demonstrate the platform's capabilities. Much of the site has been surveyed and modelled at different levels of detail (LoD) and the geometric model has been semantically segmented and integrated with attribute data gathered from several external data sources. The paper describes the characteristics of the research work, along with its implementation issues and the initial results of the developed prototype.

  12. Hyper-Fit: Fitting Linear Models to Multidimensional Data with Multivariate Gaussian Uncertainties

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Obreschkow, D.

    2015-09-01

    Astronomical data is often uncertain with errors that are heteroscedastic (different for each data point) and covariant between different dimensions. Assuming that a set of D-dimensional data points can be described by a (D - 1)-dimensional plane with intrinsic scatter, we derive the general likelihood function to be maximised to recover the best fitting model. Alongside the mathematical description, we also release the hyper-fit package for the R statistical language (http://github.com/asgr/hyper.fit) and a user-friendly web interface for online fitting (http://hyperfit.icrar.org). The hyper-fit package offers access to a large number of fitting routines, includes visualisation tools, and is fully documented in an extensive user manual. Most of the hyper-fit functionality is accessible via the web interface. In this paper, we include applications to toy examples and to real astronomical data from the literature: the mass-size, Tully-Fisher, Fundamental Plane, and mass-spin-morphology relations. In most cases, the hyper-fit solutions are in good agreement with published values, but uncover more information regarding the fitted model.
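
    For orientation, a likelihood of this general form (written here in one common notation; the paper's own derivation and conventions may differ) is maximised when fitting a (D-1)-dimensional plane with unit normal n, orthogonal offset b and intrinsic scatter sigma to data points x_i with covariance matrices C_i:

        % Hedged sketch of a generic hyperplane likelihood with heteroscedastic,
        % covariant errors and intrinsic scatter (common notation; the paper's own
        % conventions may differ). x_i: data points; C_i: their covariance matrices;
        % \hat{n}: unit normal of the (D-1)-dimensional plane; b_\perp: its orthogonal
        % offset; \sigma_\perp: intrinsic scatter normal to the plane.
        \ln \mathcal{L} = -\frac{1}{2} \sum_{i=1}^{N} \left[
          \ln\!\left( 2\pi \left( \hat{n}^{\mathsf T} C_i \hat{n} + \sigma_\perp^2 \right) \right)
          + \frac{\left( \hat{n}^{\mathsf T} x_i - b_\perp \right)^2}
                 {\hat{n}^{\mathsf T} C_i \hat{n} + \sigma_\perp^2}
        \right]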

  13. Visualisation Ability of Senior High School Students with Using GeoGebra and Transparent Mica

    NASA Astrophysics Data System (ADS)

    Thohirudin, M.; Maryati, TK; Dwirahayu, G.

    2017-04-01

    Visualisation ability is the ability to process, inform and transform objects, which is well suited to geometry topics in mathematics. This research aims to describe the influence of using the software GeoGebra and transparent mica on students' visualisation ability. The name GeoGebra is a contraction of geometry and algebra. GeoGebra is an open source program created for mathematics. Transparent mica is a tool created by the authors to transform a geometric object. This research follows a quantitative experimental model. The subjects of this research were students in grade XII of the science programme at Annajah Senior High School, Rumpin, in two classes: one experimental class (science one) and one control class (science two). The experimental class used GeoGebra and transparent mica in the study, and the control class used PowerPoint. Data on students' visualisation ability were collected from a post-test with visual questions on the topic of transformation geometry, given to both classes at the end of the research. The research found that studying with GeoGebra and transparent mica had a better influence on students' visualisation ability than studying with PowerPoint. The time spent studying in class and the students' familiarity with the software and tool affected the results of the research. Nevertheless, GeoGebra and transparent mica can help students with the topic of transformation geometry.

  14. Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies.

    PubMed

    Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F

    2016-04-01

    The complexity of current injury-related health issues demands the usage of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity for shaping the next generation of injury surveillance. To introduce data visualisation conceptual bases, and propose a visual analytic and visualisation platform in public health surveillance for injury prevention and control. The paper introduces data visualisation conceptual bases, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. The application of the visual analytic and visualisation platform is presented as a solution to improve access to heterogeneous data sources, enhance data exploration and analysis, communicate data effectively, and support decision-making. Applications of data visualisation concepts and visual analytic platforms could play a key role in shaping the next generation of injury surveillance. A visual analytic and visualisation platform could improve data use, analytic capacity, and the ability to communicate findings and key messages effectively. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  15. A Web-based geographic information system for monitoring animal welfare during long journeys.

    PubMed

    Ippoliti, Carla; Di Pasquale, Adriano; Fiore, Gianluca; Savini, Lara; Conte, Annamaria; Di Gianvito, Federica; Di Francesco, Cesare

    2007-01-01

    Animal welfare protection during long journeys is mandatory according to European Union regulations designed to ensure that animals are transported in accordance with animal welfare requirements and to provide control bodies with a regulatory tool to react promptly in cases of non-compliance and to ensure a safe network between products, animals and farms. Regulation 1/2005/EC foresees recourse to a system of traceability within European Union member states. The Joint Research Centre of the European Commission (JRC) has developed a prototype system fulfilling the requirements of the Regulation which is able to monitor compliance with animal welfare requirements during transportation, register electronic identification of transported animals and store data in a central database shared with the other member states through a Web-based application. Test equipment has recently been installed on a vehicle that records data on vehicle position (geographic coordinates, date/time) and animal welfare conditions (measurements of internal temperature of the vehicle, etc.). The information is recorded at fixed intervals and transmitted to the central database. The authors describe the Web-based geographic information system, through which authorised users can visualise instantly the real-time position of the vehicle, monitor the sensor-recorded data and follow the time-space path of the truck during journeys.

  16. Designing Spatial Visualisation Tasks for Middle School Students with a 3D Modelling Software: An Instrumental Approach

    ERIC Educational Resources Information Center

    Turgut, Melih; Uygan, Candas

    2015-01-01

    In this work, certain task designs to enhance middle school students' spatial visualisation ability, in the context of an instrumental approach, have been developed. 3D modelling software, SketchUp®, was used. In the design process, software tools were focused on and, thereafter, the aim was to interpret the instrumental genesis and spatial…

  17. The iMars web-GIS - spatio-temporal data queries and single image web map services

    NASA Astrophysics Data System (ADS)

    Walter, S. H. G.; Steikert, R.; Schreiner, B.; Sidiropoulos, P.; Tao, Y.; Muller, J.-P.; Putry, A. R. D.; van Gasselt, S.

    2017-09-01

    We introduce a new approach for a system dedicated to planetary surface change detection by simultaneous visualisation of single-image time series in a multi-temporal context. In the context of the EU FP-7 iMars project we process and ingest vast amounts of automatically co-registered (ACRO) images. The basis of the co-registration is the high-precision HRSC multi-orbit quadrangle image mosaics, which are based on bundle-block-adjusted multi-orbit HRSC DTMs.

  18. Oceanographic data at your fingertips: the SOCIB App for smartphones

    NASA Astrophysics Data System (ADS)

    Lora, Sebastian; Sebastian, Kristian; Troupin, Charles; Pau Beltran, Joan; Frontera, Biel; Gómara, Sonia; Tintoré, Joaquín

    2015-04-01

    The Balearic Islands Coastal Ocean Observing and Forecasting System (SOCIB, http://www.socib.es) is a multi-platform Marine Research Infrastructure that generates data from nearshore to the open sea in the Western Mediterranean Sea. In line with SOCIB principles of discoverable, freely available and standardized data, an application (App) for smartphones has been designed, with the objective of providing easy access to all the data managed by SOCIB in real time: underwater gliders, drifters, profiling buoys, research vessel, HF radar and numerical model outputs (hydrodynamics and waves). The Data Centre, responsible for the acquisition, processing and visualisation of all SOCIB data, developed a REpresentational State Transfer (REST) application programming interface (API) called "DataDiscovery" (http://apps.socib.es/DataDiscovery/). This API is made up of RESTful web services that provide information on platforms, instruments and deployments of instruments, as well as the data themselves. In this way, it is possible to integrate SOCIB data into third-party applications, developed either by the Data Centre or externally. The existence of a single point for data distribution not only allows for efficient management but also makes the concepts and data access easier for external developers, who are not necessarily familiar with the concepts and tools related to oceanographic or atmospheric data. The SOCIB App for Android (https://play.google.com/store/apps/details?id=com.socib) uses that API as a "data backend", in such a way that it is straightforward to manage which information is shown by the application, without having to modify and re-upload it. The only pieces of information that do not depend on the services are the App "Sections" and "Screens"; the content displayed in each of them is obtained through requests to the web services. The API is not used only for the smartphone App: presently, most SOCIB applications for data visualisation and access rely on the API, for instance the corporate web site, the deployment application (Dapp, http://apps.socib.es/dapp/) and Sea Boards (http://seaboard.socib.es/).
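
    A hedged sketch of consuming such a REST "data backend" from Python: the base URL is taken from the abstract, but the resource path and the response fields used below are hypothetical placeholders, since the actual routes are defined by the DataDiscovery service itself:

        # Sketch of consuming a REST data backend from a third-party client. The base
        # URL comes from the abstract; the "platforms" route and the response fields
        # used below are hypothetical placeholders, not documented DataDiscovery routes.
        import requests

        BASE_URL = "http://apps.socib.es/DataDiscovery/"

        def list_platforms():
            r = requests.get(BASE_URL + "platforms", timeout=30)   # hypothetical route
            r.raise_for_status()
            return r.json()

        # A smartphone "Screen" or web page would render this JSON directly; adding a
        # new view then only requires pointing it at another web-service request.
        for platform in list_platforms():
            print(platform.get("name"), platform.get("type"))       # hypothetical fields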

  19. Custom-made raster method for fistula and graft.

    PubMed

    Blokker, C

    2005-01-01

    Unfamiliarity with fistula and graft characteristics can lead to failed punctures, haematomas and sometimes access occlusion. The Custom-made Raster Method provides detailed shunt visualisation and angiographic images by using photo-editing software. The access veins of an individual shunt and an adapted raster are projected on a digital picture of the arm. During angiography the shunt arm is fixated and a digital picture is taken from a fixed vertical angle and distance. Reference points are marked on the shunt arm and serve as a fixed reference for drawing a raster with coordinate points. In this way a picture is created similar to a roadmap with veins. There is complete integration of digital and radiological images by using the software programmes Adobe Photoshop + Illustrator or Agfa Web 1000 under Windows XP. All illustrations fit 1:1 by scaling up or down without distortion. Editing with Photoshop gives a precise projection of shunt veins on the real coloured background of the digital photograph. In this projection the grey angiography background is made completely transparent. The system can contain more detailed information in combination with echo (duplex) images of depth and diameter. This visualisation method is a useful tool for multidisciplinary access meetings with intervention radiologists, access surgeons and nephrologists. Access malfunction, aneurysms and stenosis can be projected at the exact location. The system leads to clear and concrete puncture advice. Transfer of access information and communication to other dialysis centres is facilitated.

  20. Volcanic ash dosage calculator: A proof-of-concept tool to support aviation stakeholders during ash events

    NASA Astrophysics Data System (ADS)

    Dacre, H.; Prata, A.; Shine, K. P.; Irvine, E.

    2017-12-01

    The volcanic ash clouds produced by the Icelandic volcano Eyjafjallajökull in April/May 2010 resulted in 'no fly zones' which paralysed European aircraft activity and cost the airline industry an estimated £1.1 billion. In response to the crisis, the Civil Aviation Authority (CAA), in collaboration with Rolls Royce, produced the 'safe-to-fly' chart. As ash concentrations are the primary output of dispersion model forecasts, the chart was designed to illustrate how engine damage progresses as a function of ash concentration. Concentration thresholds were subsequently derived based on previous ash encounters. Research scientists and aircraft manufacturers have since recognised the importance of volcanic ash dosages: the concentration accumulated over time. Dosages are an improvement over concentrations as they can be used to identify pernicious situations where ash concentrations are acceptably low but the exposure time is long enough to cause damage to aircraft engines. Here we present a proof-of-concept volcanic ash dosage calculator: an innovative, web-based research tool, developed in close collaboration with operators and regulators, which utilises interactive data visualisation to communicate the uncertainty inherent in dispersion model simulations and subsequent dosage calculations. To calculate dosages, we use NAME (Numerical Atmospheric-dispersion Modelling Environment) to simulate several Icelandic eruption scenarios, which result in tephra dispersal across the North Atlantic, UK and Europe. Ash encounters are simulated based on flight-optimal routes derived from aircraft routing software. Key outputs of the calculator include the along-flight dosage, exposure time and peak concentration. The design of the tool allows users to explore the key areas of uncertainty in the dosage calculation and to visualise how this changes as the planned flight path is varied. We expect that this research will result in better informed decisions from key stakeholders during volcanic ash events through a deeper understanding of the associated uncertainties in dosage calculations.
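
    The dosage itself is simply the time integral of the ash concentration encountered along the route; a minimal sketch of that calculation with synthetic numbers (not NAME output) is:

        # Minimal sketch of an along-flight dosage calculation: the time integral of
        # the ash concentration encountered along the planned route. Numbers are
        # synthetic, not NAME dispersion-model output.
        import numpy as np

        time_s = np.array([0, 600, 1200, 1800, 2400, 3000], dtype=float)       # seconds
        concentration = np.array([0.0, 0.5, 2.0, 3.5, 1.0, 0.2]) * 1e-3        # g m^-3

        dosage = np.trapz(concentration, time_s)          # g s m^-3, accumulated exposure
        peak = concentration.max()
        exposure_time = np.sum(np.diff(time_s)[concentration[1:] > 0.2e-3])    # s above 0.2 mg m^-3

        print(f"dosage = {dosage:.2f} g s m^-3, peak = {peak * 1e3:.1f} mg m^-3, "
              f"time above threshold = {exposure_time:.0f} s")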

  1. A database of charged cosmic rays

    NASA Astrophysics Data System (ADS)

    Maurin, D.; Melot, F.; Taillet, R.

    2014-09-01

    Aims: This paper gives a description of a new online database and associated online tools (data selection, data export, plots, etc.) for charged cosmic-ray measurements. The experimental setups (type, flight dates, techniques) from which the data originate are included in the database, along with the references to all relevant publications. Methods: The database relies on the MySQL5 engine. The web pages and queries are based on PHP, AJAX and the jquery, jquery.cluetip, jquery-ui, and table-sorter third-party libraries. Results: In this first release, we restrict ourselves to Galactic cosmic rays with Z ≤ 30 and a kinetic energy per nucleon up to a few tens of TeV/n. This corresponds to more than 200 different sub-experiments (i.e., different experiments, or data from the same experiment flying at different times) in as many publications. Conclusions: We set up a cosmic-ray database (CRDB) and provide tools to sort and visualise the data. New data can be submitted, providing the community with a collaborative tool to archive past and future cosmic-ray measurements. http://lpsc.in2p3.fr/crdb; Contact: crdatabase@lpsc.in2p3.fr

  2. TIM, a ray-tracing program for METATOY research and its dissemination

    NASA Astrophysics Data System (ADS)

    Lambert, Dean; Hamilton, Alasdair C.; Constable, George; Snehanshu, Harsh; Talati, Sharvil; Courtial, Johannes

    2012-03-01

    TIM (The Interactive METATOY) is a ray-tracing program specifically tailored towards our research in METATOYs, which are optical components that appear to be able to create wave-optically forbidden light-ray fields. For this reason, TIM possesses features not found in other ray-tracing programs. TIM can either be used interactively or by modifying the openly available source code; in both cases, it can easily be run as an applet embedded in a web page. Here we describe the basic structure of TIM's source code and how to extend it, and we give examples of how we have used TIM in our own research. Program summary: Program title: TIM; Catalogue identifier: AEKY_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKY_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: GNU General Public License; No. of lines in distributed program, including test data, etc.: 124 478; No. of bytes in distributed program, including test data, etc.: 4 120 052; Distribution format: tar.gz; Programming language: Java; Computer: Any computer capable of running the Java Virtual Machine (JVM) 1.6; Operating system: Any; developed under Mac OS X Version 10.6; RAM: Typically 145 MB (interactive version running under Mac OS X Version 10.6); Classification: 14, 18; External routines: JAMA [1] (source code included); Nature of problem: Visualisation of scenes that include scene objects that create wave-optically forbidden light-ray fields. Solution method: Ray tracing. Unusual features: Specifically designed to visualise wave-optically forbidden light-ray fields; can visualise ray trajectories; can visualise geometric optic transformations; can create anaglyphs (for viewing with coloured "3D glasses") and random-dot autostereograms of the scene; integrable into web pages. Running time: Problem-dependent; typically seconds for a simple scene.

  3. Using geovisual analytics in Google Earth to understand disease distribution: a case study of campylobacteriosis in the Czech Republic (2008-2012).

    PubMed

    Marek, Lukáš; Tuček, Pavel; Pászto, Vít

    2015-01-28

    Visual analytics aims to connect the processing power of information technologies and the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most of the data contain a spatial component. Therefore, the need for geovisual tools and methods arises. One can either develop one's own system, although the dissemination of findings and its usability might then be problematic, or a widespread and well-known platform can be utilized. The aim of this paper is to prove the applicability of Google Earth™ software as a tool for geovisual analytics that helps to understand the spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation. We analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, which were visualised in the form of a bubble chart; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics, which was employed in order to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or using numerical statistics, we created a set of interactive visualisations in order to explore and communicate the results of the analyses to a wider audience. The results of the geovisual analytics identified periodical patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk. We prove that Google Earth™ software is a usable tool for the geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread and freely available, has an intuitive interface, offers space-time visualisation capabilities and animations, and supports communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data in a form suitable for the geovisual analytics itself.
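
    The KML side of such a workflow is straightforward; a minimal sketch of writing a time-stamped placemark that Google Earth's time slider can animate (coordinates and values are invented):

        # Minimal sketch of writing a time-stamped KML placemark of the kind that
        # Google Earth's time slider can animate. Coordinates and values are invented.
        import xml.etree.ElementTree as ET

        KML_NS = "http://www.opengis.net/kml/2.2"
        ET.register_namespace("", KML_NS)

        kml = ET.Element(f"{{{KML_NS}}}kml")
        doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
        pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
        ET.SubElement(pm, f"{{{KML_NS}}}name").text = "weekly incidence: 12.3 per 100,000"
        ts = ET.SubElement(pm, f"{{{KML_NS}}}TimeStamp")
        ET.SubElement(ts, f"{{{KML_NS}}}when").text = "2010-03-01"
        point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
        ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = "17.25,49.59,0"  # lon,lat,alt

        ET.ElementTree(kml).write("incidence.kml", xml_declaration=True, encoding="UTF-8")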

  4. WebGL Visualisation of 3D Environmental Models Based on Finnish Open Geospatial Data Sets

    NASA Astrophysics Data System (ADS)

    Krooks, A.; Kahkonen, J.; Lehto, L.; Latvala, P.; Karjalainen, M.; Honkavaara, E.

    2014-08-01

    Recent developments in spatial data infrastructures have enabled real-time GIS analysis and visualization using open input data sources and service interfaces. In this study we present a new concept where metric point clouds derived from national open airborne laser scanning (ALS) and photogrammetric image data are processed, analyzed and finally visualised through open service interfaces to produce user-driven analysis products for targeted areas. The concept is demonstrated in three environmental applications: assessment of forest storm damage, assessment of volumetric changes in an open pit mine, and 3D city model visualization. One of the main objectives was to study the usability and requirements of national-level photogrammetric imagery in these applications. The results demonstrated that user-driven 3D geospatial analyses were possible with the proposed approach and current technology; for instance, a landowner could easily assess the number of fallen trees within their property borders after a storm using any web browser. On the other hand, our study indicated that there are still many uncertainties, especially due to the insufficient standardization of photogrammetric products and processes and their quality indicators.

  5. A novel multimedia tool to improve bedside teaching of cardiac auscultation

    PubMed Central

    Woywodt, A; Herrmann, A; Kielstein, J; Haller, H; Haubitz, M; Purnhagen, H

    2004-01-01

    Training in cardiac auscultation is a core element of undergraduate teaching, but recent studies have documented a remarkable decline in auscultatory skills. There is therefore interest in new ways to teach cardiac auscultation. In analogy to phonocardiography, an electronic system for simultaneous auscultation and visualisation of murmurs was sought. For this purpose, an electronic stethoscope was linked to a laptop computer and software was created to visualise auscultatory findings. In a preliminary trial with undergraduate students, this approach greatly facilitated teaching. Amalgamating traditional phonocardiography with a multimedia approach, this system represents a novel tool for bedside teaching of cardiac auscultation. PMID:15192171
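
    The article does not describe its software internals; as a generic Python illustration of how auscultatory findings can be visualised in the spirit of phonocardiography, the sketch below plots a synthetic "lub-dub" signal as a waveform and spectrogram.

        # Illustrative only: a synthetic heart-sound signal stands in for a recording.
        import numpy as np
        import matplotlib.pyplot as plt

        fs = 4000                                    # sampling rate (Hz)
        t = np.arange(0, 3.0, 1 / fs)                # three seconds of signal
        signal = np.zeros_like(t)
        for beat_start in np.arange(0.2, 3.0, 0.9):  # roughly 67 beats per minute
            for onset, freq in ((0.0, 45.0), (0.3, 60.0)):           # S1 ("lub") and S2 ("dub")
                burst = np.exp(-60 * (t - beat_start - onset) ** 2)  # short Gaussian envelope
                signal += burst * np.sin(2 * np.pi * freq * t)

        fig, (ax_wave, ax_spec) = plt.subplots(2, 1, sharex=True, figsize=(8, 5))
        ax_wave.plot(t, signal, linewidth=0.5)
        ax_wave.set_ylabel("Amplitude")
        ax_spec.specgram(signal, Fs=fs, NFFT=512, noverlap=256)
        ax_spec.set_ylabel("Frequency (Hz)")
        ax_spec.set_xlabel("Time (s)")
        ax_spec.set_ylim(0, 400)                     # heart sounds sit in the low-frequency range
        plt.tight_layout()
        plt.show()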

  6. Marketing health education: advertising margarine and visualising health in Britain from 1964-c.2000.

    PubMed

    Hand, Jane

    2017-01-01

    During the post-war period, margarine was re-conceptualised as a value-added product with distinct health benefits. This article contextualises the advertising of margarine as a healthy food, focusing on Unilever's Flora brand as an important case study in legitimising the emergent role of disease prevention as a marketing tool. It uses the methodology of visual culture to examine how advertising employed chronic disease prevention as a selling tool. This article assesses how the post-war environment gave rise to new ways of visually advertising food, and how these promoted innovative visualisations of food, the body and their interactions with health.

  7. Sequence Bundles: a novel method for visualising, discovering and exploring sequence motifs

    PubMed Central

    2014-01-01

    Background We introduce Sequence Bundles, a novel data visualisation method for representing multiple sequence alignments (MSAs). We identify and address key limitations of existing bioinformatics data visualisation methods (i.e. the Sequence Logo) by enabling Sequence Bundles to give salient visual expression to sequence motifs and other data features which would otherwise remain hidden. Methods For the development of Sequence Bundles we employed research-led information design methodologies. Sequences are encoded as uninterrupted, semi-opaque lines plotted on a 2-dimensional reconfigurable grid. Each line represents a single sequence. The thickness and opacity of the stack at each residue position indicates the level of conservation, and the lines' curved paths expose patterns in correlation and functionality. Several MSAs can be visualised in a composite image. The Sequence Bundles method is designed to favour a tangible, continuous and intuitive display of information. Results We have developed a software demonstration application for generating a Sequence Bundles visualisation of MSAs provided for the BioVis 2013 redesign contest. A subsequent exploration of the visualised line patterns allowed for the discovery of a number of interesting features in the dataset. Reported features include the extreme conservation of sequences displaying a specific residue and bifurcations of the consensus sequence. Conclusions Sequence Bundles is a novel method for the visualisation of MSAs and the discovery of sequence motifs. It can aid in generating new insights and forming hypotheses. Sequence Bundles is well suited for future implementation as interactive visual analytics software, which can complement existing visualisation tools. PMID:25237395
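
    A rough Python sketch of the Sequence Bundles idea (not the authors' software): each aligned sequence becomes one semi-opaque line through a residue grid, so conserved columns appear as thick, dark bundles. The toy alignment is invented.

        # Toy Sequence Bundles-style plot; a real input would be an MSA file.
        import matplotlib.pyplot as plt

        msa = [
            "MKTAYIAKQR",
            "MKTAHIAKQR",
            "MKSAYLAKQR",
            "MRTAYIAKQR",
        ]
        residues = "ACDEFGHIKLMNPQRSTVWY-"
        y_of = {aa: i for i, aa in enumerate(residues)}  # vertical position per residue

        fig, ax = plt.subplots(figsize=(8, 4))
        for seq in msa:
            ys = [y_of[aa] for aa in seq]
            ax.plot(range(len(seq)), ys, color="steelblue", alpha=0.3, linewidth=2)

        ax.set_xticks(range(len(msa[0])))
        ax.set_yticks(range(len(residues)))
        ax.set_yticklabels(residues)
        ax.set_xlabel("Alignment position")
        ax.set_ylabel("Residue")
        plt.tight_layout()
        plt.show()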

  8. How can scientists bring research to use: the HENVINET experience.

    PubMed

    Bartonova, Alena

    2012-06-28

    Health concerns have driven the European environmental policies of the last 25 years, with issues becoming more complex. Addressing these concerns requires an approach that is both interdisciplinary and engages scientists with society. In response to this requirement, the FP6 coordination action "Health and Environment Network" HENVINET was set up to create a permanent inter-disciplinary network of professionals in the field of health and environment tasked to bridge the communication gap between science and society. In this paper we describe how HENVINET delivered on this task. The HENVINET project approached the issue of inter-disciplinary collaboration in four ways. (1) The Drivers-Pressures-State-Exposure-Effect-Action framework was used to structure information gathering, collaboration and communication between scientists in the field of health and the environment. (2) Interactive web-based tools were developed to enhance methods for knowledge evaluation, and use these methods to formulate policy advice. (3) Quantification methods were adapted to measure scientific agreement. And (4) Open architecture web technology was used to develop an information repository and a web portal to facilitate collaboration and communication among scientists. Twenty-five organizations from Europe and five from outside Europe participated in the Health and Environment Network HENVINET, which lasted for 3.5 years. The consortium included partners in environmental research, public health and veterinary medicine; included medical practitioners and representatives of local administrations; and had access to national policy making and EEA and WHO expertise. Dedicated web-based tools for visualisation of environmental health issues and knowledge evaluation allowed remote expert elicitation, and were used as a basis for developing policy advice in five health areas (asthma and allergies; cancer; neurodevelopmental disorders; endocrine disruption; and engineered nanoparticles in the environment). An open searchable database of decision support tools was established and populated. A web based social networking tool was developed to enhance collaboration and communication between scientists and society. HENVINET addressed key issues that arise in inter-disciplinary research on health and environment and in communicating research results to policy makers and society. HENVINET went beyond traditional scientific tools and methods to bridge the communication gap between science and policy makers. The project identified the need for a common framework and delivered it. It developed and implemented a variety of novel methods and tools and, using several representative examples, demonstrated the process of producing politically relevant scientific advice based on an open participation of experts. It highlighted the need for, and benefits of, a liaison between health and environment professionals and professionals in the social sciences and liberal arts. By adopting critical complexity thinking, HENVINET extended the traditional approach to environment and health research, and set the standard for current approaches to bridge the gap between science and society.

  9. 3DNOW: Image-Based 3D Reconstruction and Modeling via Web

    NASA Astrophysics Data System (ADS)

    Tefera, Y.; Poiesi, F.; Morabito, D.; Remondino, F.; Nocerino, E.; Chippendale, P.

    2018-05-01

    This paper presents a web-based 3D imaging pipeline, named 3Dnow, that can be used by anyone without the need to install any software other than a browser. By uploading a set of images through the web interface, 3Dnow can generate sparse and dense point clouds as well as mesh models. Reconstructed 3D models can be downloaded in standard formats or previewed directly in the web browser through an embedded visualisation interface. In addition to reconstructing objects, 3Dnow offers the possibility to evaluate and georeference point clouds. Reconstruction statistics, such as minimum, maximum and average intersection angles, point redundancy and density, can also be accessed. The paper describes all features available in the web service and provides an analysis of the computational performance using servers with different GPU configurations.
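
    As a hedged Python illustration of one of the reconstruction statistics listed above: the intersection angle at a 3D point is the angle between the viewing rays from two camera centres to that point. The coordinates below are made up.

        # Intersection angle between two viewing rays to the same 3D point.
        import numpy as np

        def intersection_angle(cam_a, cam_b, point):
            """Angle (degrees) between the rays cam_a->point and cam_b->point."""
            ray_a = point - cam_a
            ray_b = point - cam_b
            cos_angle = np.dot(ray_a, ray_b) / (np.linalg.norm(ray_a) * np.linalg.norm(ray_b))
            return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

        cam_a = np.array([0.0, 0.0, 0.0])
        cam_b = np.array([2.0, 0.0, 0.0])
        point = np.array([1.0, 0.0, 10.0])
        print(f"intersection angle: {intersection_angle(cam_a, cam_b, point):.1f} deg")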

  10. Visualising Astronomy: Visualising Exoplanets

    NASA Astrophysics Data System (ADS)

    Wyatt, R.

    2012-05-01

    In my previous column, I described some of the varied means of diagramming the data about exoplanets and exoplanetary systems. Frankly, however, those methods don't do justice to the bigger picture: we need a wider range of tools to help people grok (to understand intuitively) what astronomical observations have revealed. (Normally, I use the term "visualisation" to refer to the visual representation of data, but I'm going to relax that a little in this context; instead, I'll interpret the word in its more commonplace usage of creating a mental image.) How can we help people comprehend the scope, the breadth, and the impact of the spectacular observations of planets around other stars?

  11. How to Pinpoint Energy-Inefficient Buildings? An Approach Based on the 3D City Model of Vienna

    NASA Astrophysics Data System (ADS)

    Skarbal, B.; Peters-Anders, J.; Faizan Malik, A.; Agugiaro, G.

    2017-09-01

    This paper describes a methodology to assess the energy performance of residential buildings starting from the semantic 3D city model of Vienna. Space heating, domestic hot water and electricity demand are taken into account. The paper deals with aspects related to urban data modelling, with particular attention to energy-related topics, and with issues related to interactive data exploration/visualisation and management from a plugin-free web browser, e.g. based on Cesium, a WebGL virtual globe and map engine. While providing references to previous work, only general and introductory information is given about the data collection, harmonisation and integration process necessary to create the CityGML-based 3D city model, which serves as the central information hub for the different applications developed and described in more detail in this paper. The work aims, among other things, at developing urban decision-making and operational optimisation software tools to minimise non-renewable energy use in cities. The results obtained so far, as well as some comments about their quality and limitations, are presented, together with a discussion of the next steps and some planned improvements.

  12. Communicating spatial uncertainty to non-experts using R

    NASA Astrophysics Data System (ADS)

    Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze

    2016-04-01

    Effective visualisation methods are important for the efficient use of uncertainty information by various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty in the resulting information. A challenge arises when trying to communicate this uncertainty information effectively to non-experts (non-statisticians) in a wide range of cases. Given the growing popularity and applicability of the open-source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package implements Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information statically, dynamically and interactively. To provide the most universal visualisation tools for non-experts, we conducted a survey among a group of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as DEMs and land cover. The static methods included adjacent maps and glyphs for continuous variables; both allow for displaying maps with information about the ensemble mean, variance/standard deviation and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class as well as its associated probability. The interactive methods included a graphical user interface which, in addition to displaying the previously mentioned variables, also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth exploration. Subsequently, the R package included a collection of the plotting functions that were evaluated in the survey. The static visualisations were implemented via calls to the 'ggplot2' package, which gives the user control over the content, legend, colours, axes and titles. The interactive methods were implemented using the 'shiny' package, allowing users to activate the visualisation of statistical descriptions of uncertainty through interaction with a plotted map of means. This research brings uncertainty visualisation to a broader audience through the development of tools for visualising uncertainty using open-source software.
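
    The package described above is written in R; the following Python sketch only illustrates the same workflow under stated assumptions: draw Monte Carlo realisations of an uncertain input raster, push them through a toy model, and show the ensemble mean and standard deviation as adjacent maps.

        # Monte Carlo propagation through a toy spatial model, with adjacent maps.
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(42)
        n_sim, ny, nx = 200, 50, 50

        dem_mean = np.linspace(100, 300, nx) * np.ones((ny, 1))  # assumed mean elevation
        dem_sd = 5.0                                             # assumed input uncertainty (m)

        def slope_proxy(dem):
            """Toy model: east-west gradient as a stand-in for a real spatial model."""
            return np.abs(np.gradient(dem, axis=1))

        outputs = np.stack([slope_proxy(dem_mean + rng.normal(0, dem_sd, (ny, nx)))
                            for _ in range(n_sim)])

        fig, (ax_mean, ax_sd) = plt.subplots(1, 2, figsize=(9, 4))
        for ax, field, title in [(ax_mean, outputs.mean(axis=0), "Ensemble mean"),
                                 (ax_sd, outputs.std(axis=0), "Ensemble std. dev.")]:
            im = ax.imshow(field)
            ax.set_title(title)
            fig.colorbar(im, ax=ax, shrink=0.8)
        plt.tight_layout()
        plt.show()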

  13. LSIViewer 2.0 - A Client-Oriented Online Visualization Tool for Geospatial Vector Data

    NASA Astrophysics Data System (ADS)

    Manikanta, K.; Rajan, K. S.

    2017-09-01

    Geospatial data visualisation has predominantly been carried out through applications installed and run in a desktop environment. Over the last decade, with the advent of web technologies and their adoption by the geospatial community, the server-client model for data handling, rendering and visualisation has become the most prevalent approach in Web-GIS. While client devices have become functionally more powerful over recent years, this model has largely ignored them and remains a server-dominant computing paradigm. In this paper, an attempt has been made to develop and demonstrate LSIViewer, a simple, easy-to-use and robust online geospatial data visualisation system for the user's own data that harnesses the client's capabilities for data rendering and user-interactive styling, with a reduced load on the server. The developed system can support multiple geospatial vector formats and can be integrated with other web-based systems like WMS, WFS, etc. The technology stack used to build this system is Node.js on the server side and HTML5 Canvas and JavaScript on the client side. Various tests run on a range of vector datasets, up to 35 MB, showed that the time taken to render the vector data using LSIViewer is comparable to a desktop GIS application, QGIS, on an identical system.

  14. Performance characteristics of visualising the cervix in symptomatic young females: a review of primary care records in females with and without cervical cancer.

    PubMed

    Lim, Anita Wey Wey; Hamilton, Willie; Hollingworth, Antony; Stapley, Sally; Sasieni, Peter

    2016-03-01

    The current strategy for timely detection of cervical cancer in young females centres on visualising the cervix when females present with gynaecological symptoms, but it is based on expert opinion without an evidence base. To assess visualising the cervix in primary care in young females with gynaecological symptoms. A review of primary care records for females in England aged 20-29 years with cervical cancer (nationwide interview-based study) and in the general population (Clinical Practice Research Datalink database). From primary care records, the proportion of females with gynaecological symptoms who had a documented cervical examination in the year before diagnosis (cancers) and in 1-year age bands (general population) was identified. Of these, the proportion then referred for suspected malignancy was identified. Only 39% of young females with cervical cancer had a documented examination at symptomatic presentation. Visualisation resulted in referral for suspected malignancy for 18% of those examined (95% confidence interval = 5% to 40%). Very few (<1.7%) symptomatic females in the general population had a documented cervical examination; none were referred for suspected malignancy at the time. The sensitivity of cervical examination to detect cancer is very low, highlighting the need for better triage tools for primary care. Until such tools are identified, GPs should continue to consider cervical cancer when symptoms persist and the cervix is not obviously abnormal on clinical examination. Further research on additional triage tools, such as cervical cytology used as a diagnostic aid, is needed urgently. © British Journal of General Practice 2016.

  15. "Where On Mars?": An Open Planetary Mapping Platform for Researchers, Educators, and the General Public

    NASA Astrophysics Data System (ADS)

    Manaud, Nicolas; Carter, John; Boix, Oriol

    2016-10-01

    The "Where On Mars?" project is essentially the evolution of an existing outreach product developed in collaboration between ESA and CartoDB; an interactive map visualisation of the ESA's ExoMars Rover candidate landing sites (whereonmars.co). Planetary imagery data and maps are increasingly produced by the scientific community, and shared typically as images, in scientific publications, presentations or public outreach websites. However, this media lacks of interactivity and contextual information available for further exploration, making it difficult for any audience to relate one location-based information to another. We believe that interactive web maps are a powerful way of telling stories, engaging with and educating people who, over the last decade, have become familiar with tools such as Google Maps. A few planetary web maps exist but they are either too complex for non-experts, or are closed-systems that do not allows anyone to publish and share content. The long-term vision for the project is to provide researchers, communicators, educators and a worldwide public with an open planetary mapping and social platform enabling them to create, share, communicate and consume research-based content. We aim for this platform to become the reference website everyone will go to learn about Mars and other planets in our Solar System; just like people head to Google Maps to find their bearings or any location-based information. The driver is clearly to create for people an emotional connection with Mars. The short-term objectives for the project are (1) to produce and curate an open repository of basemaps, geospatial data sets, map visualisations, and story maps; (2) to develop a beautifully crafted and engaging interactive map of Mars. Based on user-generated content, the underlying framework should (3) make it easy to create and share additional interactive maps telling specific stories.

  16. Marketing health education: advertising margarine and visualising health in Britain from 1964–c.2000

    PubMed Central

    Hand, Jane

    2017-01-01

    During the post-war period, margarine was re-conceptualised as a value-added product with distinct health benefits. This article contextualises the advertising of margarine as a healthy food, focusing on Unilever’s Flora brand as an important case study in legitimising the emergent role of disease prevention as a marketing tool. It uses the methodology of visual culture to examine how advertising employed chronic disease prevention as a selling tool. This article assesses how the post-war environment gave rise to new ways of visually advertising food, and how these promoted innovative visualisations of food, the body and their interactions with health. PMID:29348778

  17. linkedISA: semantic representation of ISA-Tab experimental metadata.

    PubMed

    González-Beltrán, Alejandra; Maguire, Eamonn; Sansone, Susanna-Assunta; Rocca-Serra, Philippe

    2014-01-01

    Reporting and sharing experimental metadata, such as the experimental design, characteristics of the samples and procedures applied, along with the analysis results, in a standardised manner ensures that datasets are comprehensible and, in principle, reproducible, comparable and reusable. Furthermore, sharing datasets in formats designed for consumption by both humans and machines maximises their use. The Investigation/Study/Assay (ISA) open-source metadata tracking framework facilitates standards-compliant collection, curation, visualisation, storage and sharing of datasets, leveraging other platforms to enable analysis and publication. The ISA software suite includes several components used in an increasingly diverse set of life science and biomedical domains; it is underpinned by a general-purpose format, ISA-Tab, and conversions exist into formats required by public repositories. While ISA-Tab works well mainly as a human-readable format, we have also implemented a linked data approach to semantically define the ISA-Tab syntax. We present a semantic web representation of the ISA-Tab syntax that complements ISA-Tab's syntactic interoperability with semantic interoperability. We introduce the linkedISA conversion tool from ISA-Tab to the Resource Description Framework (RDF), supporting mappings from the ISA syntax to multiple community-defined, open ontologies and capitalising on user-provided ontology annotations in the experimental metadata. We describe insights from the implementation and how annotations can be expanded driven by the metadata. We applied the conversion tool as part of Bio-GraphIIn, a web-based application supporting integration of the semantically rich experimental descriptions. Designed in a user-friendly manner, the Bio-GraphIIn interface hides most of the complexity from the users, exposing a familiar tabular view of the experimental description to allow seamless interaction with the RDF representation, and visualising descriptors to drive queries over the semantic representation of the experimental design. In addition, we defined queries over the linkedISA RDF representation and demonstrated its use on the linkedISA conversion of datasets from Nature's Scientific Data online publication. Our linked data approach has allowed us to: 1) make the ISA-Tab semantics explicit and machine-processable; 2) exploit the existing ontology-based annotations in the ISA-Tab experimental descriptions; 3) augment the ISA-Tab syntax with new descriptive elements; and 4) visualise and query elements related to the experimental design. Reasoning over ISA-Tab metadata and associated data will facilitate data integration and knowledge discovery.
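
    Not linkedISA itself: a minimal Python/rdflib sketch of the general idea of turning tabular experimental metadata into RDF and querying it with SPARQL. The ex: vocabulary is a placeholder rather than a community ontology mapping.

        # Toy tabular metadata -> RDF -> SPARQL query with rdflib.
        from rdflib import Graph, Literal, Namespace, RDF

        EX = Namespace("http://example.org/isa/")
        g = Graph()
        g.bind("ex", EX)

        rows = [  # toy ISA-Tab-like study table: sample name, organism, assay type
            ("sample1", "Homo sapiens", "transcription profiling"),
            ("sample2", "Mus musculus", "transcription profiling"),
        ]
        for name, organism, assay in rows:
            sample = EX[name]
            g.add((sample, RDF.type, EX.Sample))
            g.add((sample, EX.organism, Literal(organism)))
            g.add((sample, EX.assayType, Literal(assay)))

        query = """
            PREFIX ex: <http://example.org/isa/>
            SELECT ?sample ?organism
            WHERE { ?sample a ex:Sample ; ex:organism ?organism . }
        """
        for sample, organism in g.query(query):
            print(sample, organism)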

  18. Visualising nursing data using correspondence analysis.

    PubMed

    Kokol, Peter; Blažun Vošner, Helena; Železnik, Danica

    2016-09-01

    Digitally stored, large healthcare datasets enable nurses to use 'big data' techniques and tools in nursing research. Big data is complex and multi-dimensional, so visualisation may be a preferable approach to analysing and understanding it. To demonstrate the use of visualisation of big data using a technique called correspondence analysis. In the authors' study, relations among data in a nursing dataset were shown visually in graphs using correspondence analysis. The case presented demonstrates that correspondence analysis is easy to use, shows relations between data visually in a form that is simple to interpret, and can reveal hidden associations between data. Correspondence analysis supports the discovery of new knowledge. Implications for practice: knowledge obtained using correspondence analysis can be transferred immediately into practice or used to foster further research.
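
    A compact Python sketch of the correspondence analysis computation itself (not the authors' software): the SVD of the standardised residuals of a contingency table yields row and column coordinates that can be plotted together. The counts below are invented.

        # Correspondence analysis via SVD of standardised residuals.
        import numpy as np

        # Rows: nursing units; columns: documented care categories (made-up counts).
        N = np.array([[40, 12,  8],
                      [10, 30, 15],
                      [ 5, 20, 35]], dtype=float)

        P = N / N.sum()                                      # correspondence matrix
        r = P.sum(axis=1)                                    # row masses
        c = P.sum(axis=0)                                    # column masses
        S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardised residuals

        U, sval, Vt = np.linalg.svd(S, full_matrices=False)
        row_coords = (U * sval) / np.sqrt(r)[:, None]        # principal row coordinates
        col_coords = (Vt.T * sval) / np.sqrt(c)[:, None]     # principal column coordinates

        print("inertia explained by first 2 axes:",
              round((sval[:2] ** 2).sum() / (sval ** 2).sum(), 3))
        print("row coordinates (first 2 axes):\n", row_coords[:, :2].round(3))
        print("column coordinates (first 2 axes):\n", col_coords[:, :2].round(3))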

  19. Integrated data visualisation: an approach to capture older adults’ wellness

    PubMed Central

    Wilamowska, Katarzyna; Demiris, George; Thompson, Hilaire

    2013-01-01

    Informatics tools can help support the health and independence of older adults. In this paper, we present an approach towards integrating health-monitoring data and describe several techniques for the assessment and visualisation of integrated health and well-being of older adults. We present three different visualisation techniques to provide distinct alternatives towards display of the same information, focusing on reducing the cognitive load of data interpretation. We demonstrate the feasibility of integrating health-monitoring information into a comprehensive measure of wellness, while also highlighting the challenges of designing visual displays targeted at multiple user groups. These visual displays of wellness can be incorporated into personal health records and can be an effective support for informed decision-making. PMID:23079025

  20. Realising the Uncertainty Enabled Model Web

    NASA Astrophysics Data System (ADS)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.
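
    As a hedged Python illustration of the idea behind one of the utility services described above (conversion between uncertainty types), the sketch below turns an ensemble of realisations into summary statistics and, under a normality assumption, a fitted Gaussian description; the values and field names are illustrative and do not reproduce the UncertML encoding.

        # Converting realisations into statistics and a fitted distribution (illustrative).
        import numpy as np

        rng = np.random.default_rng(1)
        realisations = rng.normal(loc=18.2, scale=1.5, size=1000)   # e.g. temperature samples

        summary = {
            "mean": float(realisations.mean()),
            "variance": float(realisations.var(ddof=1)),
            "quantiles": {str(q): float(np.quantile(realisations, q)) for q in (0.05, 0.5, 0.95)},
        }
        gaussian_fit = {"distribution": "Gaussian",       # schematic, not an UncertML document
                        "mean": summary["mean"],
                        "variance": summary["variance"]}

        print(summary)
        print(gaussian_fit)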

  1. Analysis of emergency physicians' Twitter accounts.

    PubMed

    Lulic, Ileana; Kovic, Ivor

    2013-05-01

    Twitter is one of the fastest growing social media networks for communication between users via short messages. Technology-proficient physicians have demonstrated enthusiasm in adopting social media for their work. To identify and create the largest directory of emergency physicians on Twitter, analyse their user accounts and reveal details behind their connections. Several web search tools were used to identify emergency physicians on Twitter with biographies completely or partially written in English. NodeXL software was used to calculate emergency physicians' Twitter network metrics and create visualisation graphs. The authors found 672 Twitter accounts of self-identified emergency physicians. Protected accounts were excluded from the study, leaving 632 for further analysis. Most emergency physicians were located in the USA (55.4%), had created their accounts in 2009 (43.4%), used their full personal name (77.5%) and provided a custom profile picture (92.2%). Based on at least one published tweet in the previous 15 days, there were 345 (54.6%) active users on 31 December 2011. Active users mostly used mobile devices based on the Apple operating system to publish tweets (69.2%). Visualisation of the emergency physicians' Twitter network revealed many users with no connections to their colleagues, and a small group of highly interconnected, most influential users. Only a small proportion of registered emergency physicians use Twitter. Among them exists a smaller inner network of emergency physicians with strong social bonds that is using Twitter's full potential for professional development.
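
    NodeXL is a spreadsheet add-in; as a stand-in, the same kinds of network metrics can be computed in Python with networkx, as sketched below on a fictitious set of follower relationships.

        # Basic network metrics on a toy follower graph (networkx).
        import networkx as nx

        edges = [  # (follower, followed) pairs between anonymised accounts
            ("em_doc_a", "em_doc_b"), ("em_doc_a", "em_doc_c"),
            ("em_doc_b", "em_doc_c"), ("em_doc_d", "em_doc_a"),
        ]
        g = nx.DiGraph(edges)

        in_degree = dict(g.in_degree())              # how many colleagues follow each account
        betweenness = nx.betweenness_centrality(g)   # brokers between otherwise separate users

        print("in-degree:", in_degree)
        print("betweenness:", {n: round(v, 3) for n, v in betweenness.items()})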

  2. Open source libraries and frameworks for biological data visualisation: a guide for developers.

    PubMed

    Wang, Rui; Perez-Riverol, Yasset; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2015-04-01

    Recent advances in high-throughput experimental techniques have led to an exponential increase in both the size and the complexity of the data sets commonly studied in biology. Data visualisation is increasingly used as the key to unlock this data, going from hypothesis generation to model evaluation and tool implementation. It is becoming more and more the heart of bioinformatics workflows, enabling scientists to reason and communicate more effectively. In parallel, there has been a corresponding trend towards the development of related software, which has triggered the maturation of different visualisation libraries and frameworks. For bioinformaticians, scientific programmers and software developers, the main challenge is to pick out the most fitting one(s) to create clear, meaningful and integrated data visualisation for their particular use cases. In this review, we introduce a collection of open source or free to use libraries and frameworks for creating data visualisation, covering the generation of a wide variety of charts and graphs. We will focus on software written in Java, JavaScript or Python. We truly believe this software offers the potential to turn tedious data into exciting visual stories. © 2014 The Authors. PROTEOMICS published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Open source libraries and frameworks for biological data visualisation: A guide for developers

    PubMed Central

    Wang, Rui; Perez-Riverol, Yasset; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2015-01-01

    Recent advances in high-throughput experimental techniques have led to an exponential increase in both the size and the complexity of the data sets commonly studied in biology. Data visualisation is increasingly used as the key to unlock this data, going from hypothesis generation to model evaluation and tool implementation. It is becoming more and more the heart of bioinformatics workflows, enabling scientists to reason and communicate more effectively. In parallel, there has been a corresponding trend towards the development of related software, which has triggered the maturation of different visualisation libraries and frameworks. For bioinformaticians, scientific programmers and software developers, the main challenge is to pick out the most fitting one(s) to create clear, meaningful and integrated data visualisation for their particular use cases. In this review, we introduce a collection of open source or free to use libraries and frameworks for creating data visualisation, covering the generation of a wide variety of charts and graphs. We will focus on software written in Java, JavaScript or Python. We truly believe this software offers the potential to turn tedious data into exciting visual stories. PMID:25475079

  4. EarthServer2: The Marine Data Service - Web-based and Programmatic Access to Ocean Colour Open Data

    NASA Astrophysics Data System (ADS)

    Clements, Oliver; Walker, Peter

    2017-04-01

    The ESA Ocean Colour Climate Change Initiative (ESA OC-CCI) has produced a long-term, high-quality global dataset with associated per-pixel uncertainty data. This dataset has now grown to several hundred terabytes (uncompressed) and is freely available to download. However, the sheer size of the dataset can act as a barrier to many users; large network bandwidth, local storage and processing requirements can prevent researchers without the backing of a large organisation from taking advantage of this raw data. The EC H2020 project EarthServer2 aims to create a federated data service providing access to more than 1 petabyte of earth science data. Within this federation the Marine Data Service already provides an innovative on-line toolkit for filtering, analysing and visualising OC-CCI data. Data are made available, filtered and processed at source through standards-based interfaces, the Open Geospatial Consortium Web Coverage Service and Web Coverage Processing Service. This work was initiated in the EC FP7 EarthServer project, where it was found that the unfamiliarity and complexity of these interfaces themselves created a barrier to wider uptake. The continuation project, EarthServer2, addresses these issues by providing higher-level tools for working with these data. We will present some examples of these tools. Many researchers wish to extract time series data from discrete points of interest. We will present a web-based interface, based on NASA/ESA WebWorldWind, for selecting points of interest and plotting time series from a chosen dataset. In addition, a CSV file of locations and times, such as a ship's track, can be uploaded; these points are extracted and returned in a CSV file, allowing researchers to work with the extract locally, for example in a spreadsheet. We will also present a set of Python and JavaScript APIs that have been created to complement and extend the web-based GUI. These APIs allow the selection of single points and areas for extraction. The extracted data are returned as structured data (for instance a Python array) which can then be passed directly to local processing code. We will highlight how the libraries can be used by the community and integrated into existing systems, for instance through Jupyter notebooks that share Python code examples which other researchers can use as a basis for their own work.
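
    A hedged Python sketch of the programmatic access described above: a WCS 2.0.1 GetCoverage KVP request subsetting a coverage at one location over a time range. The endpoint URL, coverage identifier, axis names and output format are placeholders, not the actual service configuration.

        # Point time-series extraction via a WCS 2.0.1 KVP request (placeholder service).
        import requests

        endpoint = "https://example.org/rasdaman/ows"          # placeholder WCS endpoint
        params = {
            "service": "WCS",
            "version": "2.0.1",
            "request": "GetCoverage",
            "coverageId": "CCI_V2_monthly_chlor_a",            # hypothetical coverage name
            "subset": [
                "Lat(50.25)",
                "Long(-4.13)",
                'ansi("2010-01-01","2012-12-31")',             # assumed name of the time axis
            ],
            "format": "application/json",
        }
        response = requests.get(endpoint, params=params, timeout=60)
        response.raise_for_status()
        print(response.text[:500])   # time series values for the chosen pixel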

  5. The ethics of Google Earth: crossing thresholds from spatial data to landscape visualisation.

    PubMed

    Sheppard, Stephen R J; Cizek, Petr

    2009-05-01

    'Virtual globe' software systems such as Google Earth are growing rapidly in popularity as a way to visualise and share 3D environmental data. Scientists and environmental professionals, many of whom are new to 3D modeling and visual communications, are beginning routinely to use such techniques in their work. While the appeal of these techniques is evident, with unprecedented opportunities for public access to data and collaborative engagement over the web, are there nonetheless risks in their widespread usage when applied in areas of the public interest such as planning and policy-making? This paper argues that the Google Earth phenomenon, which features realistic imagery of places, cannot be dealt with only as a question of spatial data and geographic information science. The virtual globe type of visualisation crosses several key thresholds in communicating scientific and environmental information, taking it well beyond the realm of conventional spatial data and geographic information science, and engaging more complex dimensions of human perception and aesthetic preference. The realism, perspective views, and social meanings of the landscape visualisations embedded in virtual globes invoke not only cognition but also emotional and intuitive responses, with associated issues of uncertainty, credibility, and bias in interpreting the imagery. This paper considers the types of risks as well as benefits that may exist with participatory uses of virtual globes by experts and lay-people. It is illustrated with early examples from practice and relevant themes from the literature in landscape visualisation and related disciplines such as environmental psychology and landscape planning. Existing frameworks and principles for the appropriate use of environmental visualisation methods are applied to the special case of widely accessible, realistic 3D and 4D visualisation systems such as Google Earth, in the context of public awareness-building and agency decision-making on environmental issues. Relevant principles are suggested which lend themselves to much-needed evaluation of risks and benefits of virtual globe systems. Possible approaches for balancing these benefits and risks include codes of ethics, software design, and metadata templates.

  6. Using geographical information systems and cartograms as a health service quality improvement tool.

    PubMed

    Lovett, Derryn A; Poots, Alan J; Clements, Jake T C; Green, Stuart A; Samarasundera, Edgar; Bell, Derek

    2014-07-01

    Disease prevalence can be spatially analysed to provide support for service implementation and health care planning, and these analyses often display geographic variation. A key challenge is to communicate these results to decision makers, with variable levels of Geographic Information Systems (GIS) knowledge, in a way that represents the data and allows for comprehension. The present research describes the combination of established GIS methods and software tools to produce a novel technique for visualising disease admissions, helping to prevent misinterpretation of data and suboptimal decision making. The aim of this paper is to provide a tool that supports the ability of decision makers and service teams within health care settings to develop services more efficiently and better cater to the population; this tool has the advantage of conveying information on the position of populations, the size of populations and the severity of disease. A standard choropleth of the study region, London, is used to visualise total emergency admission values for Chronic Obstructive Pulmonary Disease and bronchiectasis using ESRI's ArcGIS software. Population estimates of the Lower Super Output Areas (LSOAs) are then used with the ScapeToad cartogram software tool, with the aim of visualising geography at uniform population density. An interpolation surface, in this case ArcGIS' spline tool, allows the creation of a smooth surface over the LSOA centroids for admission values on both standard and cartogram geographies. The final product of this research is the novel Cartogram Interpolation Surface (CartIS). The method provides a series of outputs culminating in the CartIS, applying an interpolation surface to a uniform population density. The cartogram effectively equalises the population density to remove visual bias from areas with a smaller population, while maintaining contiguous borders. CartIS decreases the number of extreme positive values not present in the underlying data, as can be found in interpolation surfaces. This methodology provides a technique for combining simple GIS tools to create a novel output, CartIS, in a health service context, with the key aim of improving visualisation communication techniques which highlight variation in small-scale geographies across large regions. CartIS represents the data more faithfully than interpolation, and visually highlights areas of extreme value more than cartograms, when either is used in isolation. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
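
    A simplified Python stand-in for the interpolation step (using scipy rather than the ArcGIS spline tool): admission values at area centroids are interpolated to a continuous surface, which is what CartIS then applies on both standard and cartogram geometries. The centroids and counts are synthetic.

        # Interpolating admissions at centroids to a continuous surface.
        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(3)
        centroids = rng.uniform(0, 100, size=(150, 2))         # toy LSOA centroid coordinates
        admissions = rng.poisson(lam=20, size=150).astype(float)

        grid_x, grid_y = np.mgrid[0:100:200j, 0:100:200j]      # regular evaluation grid
        surface = griddata(centroids, admissions, (grid_x, grid_y), method="cubic")

        print("surface shape:", surface.shape)
        print("min/max of interpolated admissions:",
              np.nanmin(surface).round(1), np.nanmax(surface).round(1))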

  7. Innovation through Wearable Sensors to Collect Real-Life Data among Pediatric Patients with Cardiometabolic Risk Factors

    PubMed Central

    Yan, Kestens; Tracie, Barnett; Marie-Ève, Mathieu; Mélanie, Henderson; Jean-Luc, Bigras; Benoit, Thierry; St-Onge, Maxime; Marie, Lambert

    2014-01-01

    Background. While increasing evidence links environments to health behaviour, clinicians lack information about patients' physical activity levels and lifestyle environments. We present mobile health tools to collect and use spatio-behavioural lifestyle data for personalised physical activity plans in clinical settings. Methods. The Dyn@mo lifestyle intervention was developed at the Sainte-Justine University Hospital Center to promote physical activity and reduce sedentary time among children with cardiometabolic risk factors. Mobility, physical activity and heart rate were measured in free-living environments over seven days. Algorithms processed the data to generate spatio-behavioural indicators that fed a web-based interactive mapping application for personalised counselling. The proof of concept and tools are presented using data collected from the first 37 participants recruited in 2011. Results. Valid accelerometer data were available for 5.6 (SD = 1.62) days on average, heart rate data for 6.5 days, and GPS data for 6.1 (2.1) days. Spatio-behavioural indicators were shared between patients, parents and practitioners to support counselling. Conclusion. The use of wearable sensors along with data treatment algorithms and visualisation tools allows real-life environments, mobility, physical activity and physiological responses to be measured and described more accurately. Increased specificity in lifestyle interventions opens new avenues for remote patient monitoring and intervention. PMID:24678323

  8. Visualising the Health of Communities: Using Photovoice as a Pedagogical Tool in the College Classroom

    ERIC Educational Resources Information Center

    Cooper, Cheryl; Sorensen, William; Yarbrough, Susan

    2017-01-01

    Objective: To describe the use of Photovoice as a pedagogical tool to promote experiential learning and critical dialogue among participants on an undergraduate community health course. Design: A descriptive study of the use of the pedagogical tool Photovoice, based on three foundational education theories. Results: Based on teachers' reflective…

  9. Approaches to integrating indicators into 3D landscape visualisations and their benefits for participative planning situations.

    PubMed

    Wissen, Ulrike; Schroth, Olaf; Lange, Eckart; Schmid, Willy A

    2008-11-01

    In discussing issues of landscape change, the complex relationships in the landscape have to be assessed. In participative planning processes, 3D visualisations have a high potential as an aid to understanding and communicating characteristics of landscape conditions by integrating visual and non-visual landscape information. It is unclear, however, which design and how much interactivity an indicator visualisation requires to suit stakeholders best in workshop situations. This paper describes the preparation and application of three different types of integrated 3D visualisations in workshops conducted in the Entlebuch UNESCO Biosphere Reserve (CH). The results reveal that simple representations of a complex issue created by draping thematic maps on the 3D model can make problematic developments visible at a glance; that diagrams linked to the spatial context can help draw attention to problematic relationships not considered beforehand; and that the size of species as indicators of conditions of the landscape's production and biotope function seems to provide a common language for stakeholders with different perspectives. Overall, the presentation of the indicators determines the functions required to assist in information processing. Further research should focus on testing the effectiveness of the integrated visualisation tools in participative processes for the general public.

  10. BioSAVE: display of scored annotation within a sequence context.

    PubMed

    Pollock, Richard F; Adryan, Boris

    2008-03-20

    Visualization of sequence annotation is a common feature in many bioinformatics tools. For many applications it is desirable to restrict the display of such annotation according to a score cutoff, as biological interpretation can be difficult in the presence of the entire data. Unfortunately, many visualisation solutions are somewhat static in the way they handle such score cutoffs. We present BioSAVE, a sequence annotation viewer with on-the-fly selection of visualisation thresholds for each feature. BioSAVE is a versatile OS X program for visual display of scored features (annotation) within a sequence context. The program reads sequence and additional supplementary annotation data (e.g., position weight matrix matches, conservation scores, structural domains) from a variety of commonly used file formats and displays them graphically. Onscreen controls then allow for live customisation of these graphics, including on-the-fly selection of visualisation thresholds for each feature. Possible applications of the program include display of transcription factor binding sites in a genomic context or the visualisation of structural domain assignments in protein sequences and many more. The dynamic visualisation of these annotations is useful, e.g., for the determination of cutoff values of predicted features to match experimental data. Program, source code and exemplary files are freely available at the BioSAVE homepage.
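
    Not BioSAVE code: a minimal Python illustration of the underlying operation, i.e. re-filtering scored sequence features whenever the display threshold changes.

        # Re-filtering scored annotation against a live cutoff (toy data).
        features = [  # (feature name, start, end, score), made-up annotation
            ("TFBS_A", 120, 131, 0.92),
            ("TFBS_B", 340, 351, 0.41),
            ("domain_X", 10, 220, 0.77),
        ]

        def visible_features(features, cutoff):
            """Return only the features whose score passes the current cutoff."""
            return [f for f in features if f[3] >= cutoff]

        for cutoff in (0.3, 0.6, 0.9):          # e.g. values chosen live with a slider
            shown = [name for name, *_ in visible_features(features, cutoff)]
            print(f"cutoff {cutoff}: {shown}")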

  11. BioSAVE: Display of scored annotation within a sequence context

    PubMed Central

    Pollock, Richard F; Adryan, Boris

    2008-01-01

    Background Visualization of sequence annotation is a common feature in many bioinformatics tools. For many applications it is desirable to restrict the display of such annotation according to a score cutoff, as biological interpretation can be difficult in the presence of the entire data. Unfortunately, many visualisation solutions are somewhat static in the way they handle such score cutoffs. Results We present BioSAVE, a sequence annotation viewer with on-the-fly selection of visualisation thresholds for each feature. BioSAVE is a versatile OS X program for visual display of scored features (annotation) within a sequence context. The program reads sequence and additional supplementary annotation data (e.g., position weight matrix matches, conservation scores, structural domains) from a variety of commonly used file formats and displays them graphically. Onscreen controls then allow for live customisation of these graphics, including on-the-fly selection of visualisation thresholds for each feature. Conclusion Possible applications of the program include display of transcription factor binding sites in a genomic context or the visualisation of structural domain assignments in protein sequences and many more. The dynamic visualisation of these annotations is useful, e.g., for the determination of cutoff values of predicted features to match experimental data. Program, source code and exemplary files are freely available at the BioSAVE homepage. PMID:18366701

  12. Framing ICT-Enabled Innovation for Learning: The Case of One-to-One Learning Initiatives in Europe

    ERIC Educational Resources Information Center

    Bocconi, Stefania; Kampylis, Panagiotis; Punie, Yves

    2013-01-01

    This article discusses 1:1 learning initiatives in Europe in the context of a mapping framework of ICT-enabled innovation for learning. The aim of the framework, visualised as a spider's web, is two-fold: (i) to provide a further understanding of the nature of ICT-enabled innovation for learning; and (ii) to depict the impact of existing and…

  13. Web access and dissemination of Andalusian coastal erosion rates: viewers and standard/filtered map services.

    NASA Astrophysics Data System (ADS)

    Álvarez Francoso, Jose; Prieto Campos, Antonio; Ojeda Zujar, Jose; Guisado-Pintado, Emilia; Pérez Alcántara, Juan Pedro

    2017-04-01

    The accessibility of environmental information via web viewers using map services (OGC or proprietary) has become more common, since new information sources (orthophotos, LiDAR, GPS) are highly detailed and thus generate large volumes of data that can barely be disseminated in analogue (paper maps) or digital (PDF) formats. Moreover, governments and public institutions are concerned about the need to facilitate access to research results and to improve communication about natural hazards to citizens and stakeholders. If adequately disseminated, this information is ultimately crucial in decision-making processes and risk management approaches, and could help to increase social awareness of environmental issues (particularly climate change impacts). To address this issue, two strategies for the wide dissemination and communication of the results achieved in the calculation of beach erosion for the 640 km of the Andalusian coast (southern Spain) using web viewer technology are presented. Each is oriented to different end users and thus based on different methodologies. Erosion rates have been calculated at 50 m intervals for different periods (1956-1977-2001-2011) as part of a national research project on the spatialisation and web access of coastal vulnerability indicators for the Andalusian region. The first proposal generates WMS services (following OGC standards) that are made available by GeoServer, using a geoviewer client developed with Leaflet. This viewer is designed to be used by the general public (citizens, policymakers, etc.) and combines a set of tools that give access to related documents (PDFs) and visualisation tools (Panoramio pictures, geo-localisation with GPS), all displayed within a user-friendly interface. Furthermore, the use of WMS services (implemented on GeoServer) provides detailed semiology (arrows and proportional symbols, using alongshore coastline buffers to represent data), which not only enhances access to erosion rates but also enables multi-scale data representation. The second proposal, intended for technicians and specialists in the field, includes a geoviewer with an innovative profile (including visualisation of time ranges, application of different uncertainty levels to the data, etc.) to fulfil the needs of these users. For its development, a set of JavaScript libraries combined with OpenLayers (or Leaflet) is implemented to guarantee all the functionalities of the basic geoviewer. In addition, the viewer has been improved by (i) the generation of services on request through the application of a filter in ECQL (Extended Common Query Language), using the vendor parameter CQL_FILTER from GeoServer; these dynamic filters allow the final user to predefine the visualised variable, its spatial and temporal domain, a range of specific values and other attributes, thus multiplying the possibilities for real-time cartography; and (ii) the use of the layer's WFS service, through which the JavaScript application exploits the alphanumeric data to generate related statistics in real time (e.g. mean rates, length of eroded coast) and interactive graphs (via the HighCharts.js library) that help in the interpretation of beach erosion rates (representing trends and bar diagrams, among others). As a result, two web-based approaches for communicating scientific results to different audiences, with a complete dataset of geo-information, services and functionalities, are implemented. The combination of standardised environmental data with tailor-made exploitation techniques (interactive maps and real-time statistics) ensures correct access to and interpretation of the information.
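
    A hedged Python sketch of the filtered-service idea described above: a standard WMS GetMap request to a GeoServer instance with the vendor parameter CQL_FILTER restricting the rendered erosion-rate features. The endpoint URL, layer name and attribute names are assumed, not those of the actual service.

        # WMS GetMap with a CQL_FILTER vendor parameter (placeholder endpoint/layer).
        import requests

        params = {
            "service": "WMS",
            "version": "1.3.0",
            "request": "GetMap",
            "layers": "erosion:rates_1956_2011",              # hypothetical layer name
            "crs": "EPSG:4326",
            "bbox": "36.0,-7.6,37.6,-1.6",                    # lat/lon axis order for WMS 1.3.0
            "width": 1024,
            "height": 512,
            "format": "image/png",
            "transparent": "true",
            "CQL_FILTER": "period = '2001-2011' AND rate < -0.5",   # only strongly eroding segments
        }
        resp = requests.get("https://example.org/geoserver/wms", params=params, timeout=60)
        resp.raise_for_status()
        with open("erosion_filtered.png", "wb") as fh:
            fh.write(resp.content)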

  14. GeoMapApp Learning Activities: A Virtual Lab Environment for Student-Centred Engagement with Geoscience Data

    NASA Astrophysics Data System (ADS)

    Kluge, S.; Goodwillie, A. M.

    2012-12-01

    As STEM learning requirements enter the mainstream, there is benefit in providing the tools necessary for students to engage with research-quality geoscience data in a cutting-edge, easy-to-use map-based interface. Funded by an NSF GeoEd award, GeoMapApp Learning Activities ( http://serc.carleton.edu/geomapapp/collection.html ) are being created to help in that endeavour. GeoMapApp Learning Activities offer step-by-step instructions within a guided inquiry approach that enables students to dictate the pace of learning. Based upon GeoMapApp (http://www.geomapapp.org), a free, easy-to-use map-based data exploration and visualisation tool, each activity furnishes the educator with an efficient package of downloadable documents. This includes step-by-step student instructions and an answer sheet; an educator's annotated worksheet containing teaching tips, additional content and suggestions for further work; and quizzes for use before and after the activity to assess learning. Examples of activities created so far involve calculation and analysis of the rate of seafloor spreading; compilation of present-day evidence for huge ancient landslides on the seafloor around the Hawaiian islands; a study of radiometrically dated volcanic rocks to help understand the concept of hotspots; and the optimisation of contours as a means to aid visualisation of 3-D data sets on a computer screen. The activities are designed for students at the introductory undergraduate, community college and high school levels, and present a virtual lab-like environment that exposes students to content and concepts typically found in those educational settings. The activities can be used in or out of class, and their guided nature reduces the requirement for teacher intervention, allowing students to spend more time analysing and understanding geoscience data, content and concepts. Each activity is freely available through the SERC-Carleton web site.
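
    A worked Python version of the seafloor-spreading exercise mentioned above: the half-spreading rate is the distance of a dated magnetic anomaly from the ridge axis divided by its age. The numbers are illustrative, not taken from the activity.

        # Spreading-rate arithmetic with made-up measurements.
        distance_km = 75.0        # measured from ridge axis to a dated anomaly
        age_myr = 2.5             # age of the oceanic crust at that anomaly (million years)

        half_rate_km_per_myr = distance_km / age_myr
        half_rate_mm_per_yr = half_rate_km_per_myr            # 1 km/Myr equals 1 mm/yr
        full_rate_mm_per_yr = 2 * half_rate_mm_per_yr         # both plates moving apart

        print(f"half-spreading rate: {half_rate_mm_per_yr:.1f} mm/yr")
        print(f"full spreading rate: {full_rate_mm_per_yr:.1f} mm/yr")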

  15. Web-GIS visualisation of permafrost-related Remote Sensing products for ESA GlobPermafrost

    NASA Astrophysics Data System (ADS)

    Haas, A.; Heim, B.; Schaefer-Neth, C.; Laboor, S.; Nitze, I.; Grosse, G.; Bartsch, A.; Kaab, A.; Strozzi, T.; Wiesmann, A.; Seifert, F. M.

    2016-12-01

    The ESA GlobPermafrost project (www.globpermafrost.info) provides a remote sensing service for permafrost research and applications. The service comprises data product generation for various sites and regions as well as specific infrastructure allowing overview of and access to the datasets. An online user survey conducted within the project showed that the user community extensively applies GIS software to handle remote sensing-derived datasets and requires preview functionalities before accessing them. In response, we are developing the Permafrost Information System PerSys, which is conceptualised as an open-access geospatial data dissemination and visualisation portal. PerSys will allow visualisation of GlobPermafrost raster and vector products such as land cover classifications, Landsat multispectral index trend datasets, lake and wetland extents, InSAR-based land surface deformation maps, rock glacier velocity fields, spatially distributed permafrost model outputs and land surface temperature datasets. The datasets will be published as WebGIS services relying on OGC-standardised Web Map Service (WMS) and Web Feature Service (WFS) technologies for data display and visualisation. The WebGIS environment will be hosted at the AWI computing centre, where a geodata infrastructure has been implemented comprising ArcGIS for Server 10.4, PostgreSQL 9.2 and a browser-driven data viewer based on Leaflet (http://leafletjs.com). Independently, we will provide an 'Access-Restricted Data Dissemination Service', available to registered users for testing frequently updated versions of project datasets. PerSys will become a core project of the Arctic Permafrost Geospatial Centre (APGC) within the ERC-funded PETA-CARB project (www.awi.de/petacarb). The APGC Data Catalogue will contain all final products of GlobPermafrost, allow in-depth dataset search via keywords, spatial and temporal coverage, data type, etc., and will provide DOI-based links to the datasets archived in the long-term, open-access PANGAEA data repository.
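
    A hedged Python sketch of how a published WFS layer of the kind described above could be previewed programmatically; the endpoint and layer name are placeholders, and the GeoJSON output format is an assumption about the server configuration.

        # Small WFS GetFeature preview (placeholder endpoint and layer).
        import requests

        params = {
            "service": "WFS",
            "version": "2.0.0",
            "request": "GetFeature",
            "typeNames": "globpermafrost:rock_glacier_velocities",   # hypothetical layer
            "outputFormat": "application/json",                      # assumed GeoJSON support
            "count": 10,                                             # small preview only
        }
        resp = requests.get("https://example.org/webgis/wfs", params=params, timeout=60)
        resp.raise_for_status()

        for feat in resp.json().get("features", []):
            props = feat.get("properties", {})
            print(feat.get("id"), {k: props[k] for k in list(props)[:3]})   # first few attributes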

  16. Urbanisation and its effect on risk factors associated with childhood diarrhoea in Mbour, Senegal: A visualisation.

    PubMed

    Thiam, Sokhna; Fuhrimann, Samuel; Niang-Diène, Aminata; Sy, Ibrahima; Faye, Ousmane; Utzinger, Jürg; Cissé, Guéladio

    2017-11-27

    Rapid urbanisation, particularly in secondary cities in Africa, brings along specific challenges for global health, including the prevention and control of infectious diseases such as diarrhoea. Our purpose was to visualise urbanisation trends and their effect on risk factors associated with childhood diarrhoea, e.g. water supply, sanitation, wastewater and solid waste management, in Mbour, a secondary city in south-western Senegal. Our visualisation draws on epidemiological and geographical surveys carried out in 2016. A deeper spatial and visual understanding of urbanisation trends and of the disparities in diarrhoea-associated risk factors might lead to the implementation of suitable health interventions and preventive measures. Our visualisation is intended to serve as a basis for discussion and as a decision-support tool for policymakers, municipal officials and local communities to prioritise interventions related to water, sanitation and waste management, with a view to reducing the environmental and health risks in the rapidly growing city of Mbour, which can serve as an example for other similar secondary cities across low- and middle-income countries in Africa.

  17. iReport: a generalised Galaxy solution for integrated experimental reporting.

    PubMed

    Hiltemann, Saskia; Hoogstrate, Youri; der Spek, Peter van; Jenster, Guido; Stubbs, Andrew

    2014-01-01

    Galaxy offers a number of visualisation options through components such as Trackster, Circster and Galaxy Charts, but currently lacks the ability to easily combine outputs from different tools into a single view or report. A number of tools produce HTML reports as output in order to combine the various output files from a single tool; however, this requires programming and knowledge of HTML, and the reports must be custom-made for each new tool. We have developed a generic and flexible reporting tool for Galaxy, iReport, that allows users to create interactive HTML reports directly from the Galaxy UI, with the ability to combine an arbitrary number of outputs from any number of different tools. Content can be organised into different tabs, and interactivity can be added to components. To demonstrate the capability of iReport we provide two publicly available examples. The first is an iReport about iReports, created for, and using content from, the recent Galaxy Community Conference 2014. The second is a genetic report based on a trio analysis to determine candidate pathogenic variants, which uses our previously developed Galaxy toolset for whole-genome NGS analysis, CGtag. These reports may be adapted for outputs from any sequencing platform and any results, such as omics data, non-high-throughput results and clinical variables. iReport provides a secure, collaborative, and flexible web-based reporting system that is compatible with Galaxy (and non-Galaxy) generated content. We demonstrate its value with a real-life example of reporting a genetic trio analysis.
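
    iReport's own implementation is not reproduced here, but the general idea of combining several tool outputs into one tabbed HTML report can be sketched in a few lines of Python; all content below is placeholder data.

```python
# A generic sketch (not iReport itself): combine outputs from several tools
# into a single self-contained HTML report with collapsible sections.
import html
import pathlib

def tab(title, body_html):
    # One collapsible section per tool output; plain HTML, no JavaScript needed.
    return f"<details open><summary>{html.escape(title)}</summary>{body_html}</details>"

def text_section(text):
    return f"<pre>{html.escape(text)}</pre>"

# Placeholder content standing in for the outputs of two different tools.
sections = [
    tab("Variant calls", text_section("chr1\t12345\tA>G\nchr2\t67890\tC>T")),
    tab("QC summary", text_section("reads: 1,000,000\nmapped: 98.2%")),
]
report = "<html><body><h1>Combined analysis report</h1>" + "".join(sections) + "</body></html>"
pathlib.Path("report.html").write_text(report, encoding="utf-8")
```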

  18. airGRteaching: an R-package designed for teaching hydrology with lumped hydrological models

    NASA Astrophysics Data System (ADS)

    Thirel, Guillaume; Delaigue, Olivier; Coron, Laurent; Andréassian, Vazken; Brigode, Pierre

    2017-04-01

    Lumped hydrological models are useful and convenient tools for research, engineering and educational purposes. They propose catchment-scale representations of the precipitation-discharge relationship. Thanks to their limited data requirements, they can be easily implemented and run. With such models, it is possible to simulate a number of key hydrological processes over the catchment with limited structural and parametric complexity, typically evapotranspiration, runoff, underground losses, etc. The Hydrology Group at Irstea (Antony) has been developing a suite of rainfall-runoff models over the past 30 years. This resulted in a suite of models running at different time steps (from hourly to annual) applicable to various issues including water balance estimation, forecasting, simulation of impacts and scenario testing. Recently, Irstea has developed an easy-to-use R-package (R Core Team, 2016), called airGR (Coron et al., 2016, 2017), to make these models widely available. Although its initial target audience was hydrological modellers, the package is already used for educational purposes. Indeed, simple models allow for rapidly visualising the effects of parameterisations and model components on flow hydrographs. In order to avoid the difficulties that students may have when manipulating R and datasets, we developed (Delaigue and Coron, 2016): - three simplified functions to prepare data, calibrate a model and run a simulation; - simplified and dynamic plot functions; - a shiny (Chang et al., 2016) interface that connects this R-package to a browser-based visualisation tool. On this interface, students can use different hydrological models (including the possibility to use a snow-accounting model), manually modify their parameters and automatically calibrate their parameters with diverse objective functions. One of the visualisation tabs of the interface includes observed precipitation and temperature, simulated snowpack (if any), and observed and simulated discharges, which are updated immediately (a calibration needs only a couple of seconds or less; a simulation is almost immediate). In addition, time series of internal variables, live visualisation of the evolution of internal variables and performance statistics are provided. This interface allows for hands-on exercises in which students can analyse, for instance: - the effects of each parameter and of model components on simulated discharge; - the effects of objective functions based on high-flow- or low-flow-focused criteria on simulated discharge; - the seasonality of the model components. References: Winston Chang, Joe Cheng, JJ Allaire, Yihui Xie and Jonathan McPherson (2016). shiny: Web Application Framework for R. R package version 0.13.2. https://CRAN.R-project.org/package=shiny Coron L., Thirel G., Perrin C., Delaigue O., Andréassian V., airGR: a suite of lumped hydrological models in an R-package, Environmental Modelling and Software, 2017, submitted. Coron, L., Perrin, C. and Michel, C. (2016). airGR: Suite of GR hydrological models for precipitation-runoff modelling. R package version 1.0.3. https://webgr.irstea.fr/airGR/?lang=en. Olivier Delaigue and Laurent Coron (2016). airGRteaching: Tools to simplify the use of the airGR hydrological package by students. R package version 0.0.1. https://webgr.irstea.fr/airGR/?lang=en R Core Team (2016). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL https://www.R-project.org/.
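
    For readers unfamiliar with lumped models, the toy Python sketch below shows the general idea of a catchment-scale precipitation-discharge model with a single storage and two parameters; it is purely illustrative and is not one of the GR models implemented in airGR.

```python
# A toy illustration only (not airGR's GR models): a single linear-reservoir
# lumped model turning catchment precipitation and potential evapotranspiration
# into simulated discharge, with two parameters.
import numpy as np

def toy_lumped_model(precip, pet, k=0.3, c=0.8, s0=10.0):
    """precip, pet: daily series (mm). k: storage outflow coefficient (1/day),
    c: fraction of PET actually evaporated. Returns simulated discharge (mm/day)."""
    store = s0
    q = np.zeros(len(precip))
    for t in range(len(precip)):
        store += precip[t]                  # rainfall fills the store
        store -= min(store, c * pet[t])     # evapotranspiration empties it
        q[t] = k * store                    # outflow proportional to storage
        store -= q[t]
    return q

rng = np.random.default_rng(0)
p = rng.gamma(0.5, 8.0, size=365)           # synthetic daily rainfall (mm)
e = np.full(365, 2.0)                       # constant PET (mm/day)
print(toy_lumped_model(p, e)[:5])
```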

  19. Instructor-Provided Summary Infographics to Support Online Learning

    ERIC Educational Resources Information Center

    Elena Gallagher, Silvia; O'Dulain, Mairtin; O'Mahony, Niamh; Kehoe, Claire; McCarthy, Fintan; Morgan, Gerard

    2017-01-01

    Infographics are a visualisation tool that can be used to improve retention, comprehension and appeal of complex concepts. The rise of infographic use in education has facilitated new forms of application and design of these tools. Instructor-provided summary infographics are a new form of infographic, whereby key learning objectives and content…

  20. The Effect of a Computer-Based Cartooning Tool on Children's Cartoons and Written Stories

    ERIC Educational Resources Information Center

    Madden, M.; Chung, P. W. H.; Dawson, C. W.

    2008-01-01

    This paper reports a study assessing a new computer tool for cartoon storytelling, created by the authors for a target audience in the upper half of the English and Welsh Key Stage 2 (years 5 and 6, covering ages 9-11 years). The tool attempts to provide users with more opportunities for expressive visualisation than previous educational software;…

  1. Visualisation of abscisic acid and 12-oxo-phytodienoic acid in immature Phaseolus vulgaris L. seeds using desorption electrospray ionisation-imaging mass spectrometry

    NASA Astrophysics Data System (ADS)

    Enomoto, Hirofumi; Sensu, Takuya; Sato, Kei; Sato, Futoshi; Paxton, Thanai; Yumoto, Emi; Miyamoto, Koji; Asahina, Masashi; Yokota, Takao; Yamane, Hisakazu

    2017-02-01

    The plant hormone abscisic acid (ABA) and the jasmonic acid-related compound 12-oxo-phytodienoic acid (OPDA) play crucial roles in seed development, dormancy, and germination. However, a lack of suitable techniques for visualising plant hormones has restricted the investigation of their biological mechanisms. In the present study, desorption electrospray ionisation-imaging mass spectrometry (DESI-IMS), a powerful tool for visualising metabolites in biological tissues, was used to visualise ABA and OPDA in immature Phaseolus vulgaris L. seed sections. The mass spectra, peak values and chemical formulae obtained from the analysis of seed sections were consistent with those determined for ABA and OPDA standards, as were the precursor and major fragment ions observed in tandem mass spectrometry (MS/MS) imaging. Furthermore, the precursor and fragment ion images showed similar distribution patterns. In addition, the localisation of ABA and OPDA using DESI-IMS was confirmed using liquid chromatography-MS/MS (LC-MS/MS). The results indicated that ABA was mainly distributed in the radicle and cotyledon of the embryo, whereas OPDA was distributed exclusively in external structures, such as the hilum and seed coat. The present study is the first to report the visualisation of plant hormones using IMS, and demonstrates that DESI-IMS is a promising technique for future plant hormone research.

  2. Streaming visualisation of quantitative mass spectrometry data based on a novel raw signal decomposition method

    PubMed Central

    Zhang, Yan; Bhamber, Ranjeet; Riba-Garcia, Isabel; Liao, Hanqing; Unwin, Richard D; Dowsey, Andrew W

    2015-01-01

    As data rates rise, there is a danger that informatics for high-throughput LC-MS becomes more opaque and inaccessible to practitioners. It is therefore critical that efficient visualisation tools are available to facilitate quality control, verification, validation, interpretation, and sharing of raw MS data and the results of MS analyses. Currently, MS data is stored as contiguous spectra. Recall of individual spectra is quick, but panoramas, zooming and panning across whole datasets necessitate processing/memory overheads impractical for interactive use. Moreover, visualisation is challenging if significant quantification data is missing due to data-dependent acquisition of MS/MS spectra. In order to tackle these issues, we leverage our seaMass technique for novel signal decomposition. LC-MS data is modelled as a 2D surface through selection of a sparse set of weighted B-spline basis functions from an over-complete dictionary. By ordering and spatially partitioning the weights with an R-tree data model, efficient streaming visualisations are achieved. In this paper, we describe the core MS1 visualisation engine and overlay of MS/MS annotations. This enables the mass spectrometrist to quickly inspect whole runs for ionisation/chromatographic issues, MS/MS precursors for coverage problems, or putative biomarkers for interferences, for example. The open-source software is available from http://seamass.net/viz/. PMID:25663356
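
    The sketch below illustrates, in Python, the general idea of spatially indexing sparse, weighted coefficients with an R-tree so that only those overlapping the current viewport are fetched; it is a toy example with synthetic data, not the seaMass implementation.

```python
# Illustrative only (not the seaMass code): spatially index sparse, weighted
# basis-function coefficients in (m/z, retention time) with an R-tree so that
# only the coefficients overlapping the current viewport need to be streamed.
import numpy as np
from rtree import index  # requires the 'Rtree' package (libspatialindex)

rng = np.random.default_rng(1)
mz = rng.uniform(300, 1500, 10000)       # synthetic coefficient positions (m/z)
rt = rng.uniform(0, 90, 10000)           # retention time (minutes)
weight = rng.exponential(1.0, 10000)     # synthetic B-spline weights

idx = index.Index()
for i, (x, y) in enumerate(zip(mz, rt)):
    idx.insert(i, (x, y, x, y))          # point entries: (minx, miny, maxx, maxy)

# A zoomed-in viewport: fetch only the coefficients needed to render it.
viewport = (500.0, 20.0, 520.0, 25.0)
hits = list(idx.intersection(viewport))
print(len(hits), "coefficients in view, total weight", weight[hits].sum())
```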

  3. Visualisation of abscisic acid and 12-oxo-phytodienoic acid in immature Phaseolus vulgaris L. seeds using desorption electrospray ionisation-imaging mass spectrometry

    PubMed Central

    Enomoto, Hirofumi; Sensu, Takuya; Sato, Kei; Sato, Futoshi; Paxton, Thanai; Yumoto, Emi; Miyamoto, Koji; Asahina, Masashi; Yokota, Takao; Yamane, Hisakazu

    2017-01-01

    The plant hormone abscisic acid (ABA) and the jasmonic acid-related compound 12-oxo-phytodienoic acid (OPDA) play crucial roles in seed development, dormancy, and germination. However, a lack of suitable techniques for visualising plant hormones has restricted the investigation of their biological mechanisms. In the present study, desorption electrospray ionisation-imaging mass spectrometry (DESI-IMS), a powerful tool for visualising metabolites in biological tissues, was used to visualise ABA and OPDA in immature Phaseolus vulgaris L. seed sections. The mass spectra, peak values and chemical formulae obtained from the analysis of seed sections were consistent with those determined for ABA and OPDA standards, as were the precursor and major fragment ions observed in tandem mass spectrometry (MS/MS) imaging. Furthermore, the precursor and fragment ion images showed similar distribution patterns. In addition, the localisation of ABA and OPDA using DESI-IMS was confirmed using liquid chromatography-MS/MS (LC-MS/MS). The results indicated that ABA was mainly distributed in the radicle and cotyledon of the embryo, whereas OPDA was distributed exclusively in external structures, such as the hilum and seed coat. The present study is the first to report the visualisation of plant hormones using IMS, and demonstrates that DESI-IMS is a promising technique for future plant hormone research. PMID:28211480

  4. Ensembl 2002: accommodating comparative genomics.

    PubMed

    Clamp, M; Andrews, D; Barker, D; Bevan, P; Cameron, G; Chen, Y; Clark, L; Cox, T; Cuff, J; Curwen, V; Down, T; Durbin, R; Eyras, E; Gilbert, J; Hammond, M; Hubbard, T; Kasprzyk, A; Keefe, D; Lehvaslaiho, H; Iyer, V; Melsopp, C; Mongin, E; Pettett, R; Potter, S; Rust, A; Schmidt, E; Searle, S; Slater, G; Smith, J; Spooner, W; Stabenau, A; Stalker, J; Stupka, E; Ureta-Vidal, A; Vastrik, I; Birney, E

    2003-01-01

    The Ensembl (http://www.ensembl.org/) database project provides a bioinformatics framework to organise biology around the sequences of large genomes. It is a comprehensive source of stable automatic annotation of human, mouse and other genome sequences, available as either an interactive web site or as flat files. Ensembl also integrates manually annotated gene structures from external sources where available. As well as being one of the leading sources of genome annotation, Ensembl is an open source software engineering project to develop a portable system able to handle very large genomes and associated requirements. These range from sequence analysis to data storage and visualisation, and installations exist around the world at both companies and academic sites. With both human and mouse genome sequences available and more vertebrate sequences to follow, many of the recent developments in Ensembl have focused on developing automatic comparative genome analysis and visualisation.

  5. Automatically visualise and analyse data on pathways using PathVisioRPC from any programming environment.

    PubMed

    Bohler, Anwesha; Eijssen, Lars M T; van Iersel, Martijn P; Leemans, Christ; Willighagen, Egon L; Kutmon, Martina; Jaillard, Magali; Evelo, Chris T

    2015-08-23

    Biological pathways are descriptive diagrams of biological processes widely used for functional analysis of differentially expressed genes or proteins. Primary data analysis, such as quality control, normalisation, and statistical analysis, is often performed in scripting languages like R, Perl, and Python. Subsequent pathway analysis is usually performed using dedicated external applications. Workflows involving manual use of multiple environments are time consuming and error prone. Therefore, tools are needed that enable pathway analysis directly within the same scripting languages used for primary data analyses. Existing tools have limited capability in terms of available pathway content, pathway editing and visualisation options, and export file formats. Consequently, making the full-fledged pathway analysis tool PathVisio available from various scripting languages will benefit researchers. We developed PathVisioRPC, an XMLRPC interface for the pathway analysis software PathVisio. PathVisioRPC enables creating and editing biological pathways, visualising data on pathways, performing pathway statistics, and exporting results in several image formats in multiple programming environments. We demonstrate PathVisioRPC functionalities using examples in Python. Subsequently, we analyse a publicly available NCBI GEO gene expression dataset studying tumour-bearing mice treated with cyclophosphamide in R. The R scripts demonstrate how calls to existing R packages for data processing and calls to PathVisioRPC can directly work together. To further support R users, we have created RPathVisio simplifying the use of PathVisioRPC in this environment. We have also created a pathway module for the microarray data analysis portal ArrayAnalysis.org that calls the PathVisioRPC interface to perform pathway analysis. This module allows users to use PathVisio functionality online without having to download and install the software and exemplifies how the PathVisioRPC interface can be used by data analysis pipelines for functional analysis of processed genomics data. PathVisioRPC enables data visualisation and pathway analysis directly from within various analytical environments used for preliminary analyses. It supports the use of existing pathways from WikiPathways or pathways created using the RPC itself. It also enables automation of tasks performed using PathVisio, making it useful to PathVisio users performing repeated visualisation and analysis tasks. PathVisioRPC is freely available for academic and commercial use at http://projects.bigcat.unimaas.nl/pathvisiorpc.
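
    Because PathVisioRPC is exposed over XML-RPC, it can in principle be reached from Python's standard library alone. The sketch below shows the calling pattern only; the port and method names are placeholders, and the actual function names should be taken from the PathVisioRPC documentation.

```python
# Calling an XML-RPC service such as PathVisioRPC from the Python standard
# library. The URL/port and method names below are placeholders; the real
# names are defined by the PathVisioRPC documentation.
import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://localhost:7777")  # hypothetical address

# Many XML-RPC servers expose introspection; if supported, this lists the
# methods the server actually provides.
try:
    print(server.system.listMethods())
except Exception as exc:
    print("introspection not available:", exc)

# Hypothetical calls (names for illustration only): visualise a data file on a
# pathway and export the result as an image.
# server.createVisualization("data.txt", "WP254.gpml", "output_dir")
# server.exportPathway("WP254.gpml", "pathway.png")
```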

  6. Assessment of a Bayesian Belief Network-GIS framework as a practical tool to support marine planning.

    PubMed

    Stelzenmüller, V; Lee, J; Garnacho, E; Rogers, S I

    2010-10-01

    For the UK continental shelf we developed a Bayesian Belief Network-GIS framework to visualise relationships between cumulative human pressures, sensitive marine landscapes and landscape vulnerability, to assess the consequences of potential marine planning objectives, and to map uncertainty-related changes in management measures. Results revealed that the spatial assessment of footprints and intensities of human activities had more influence on landscape vulnerabilities than the type of landscape sensitivity measure used. We addressed questions regarding the consequences of potential planning targets and the necessary management measures with a spatially-explicit assessment of their consequences. We conclude that the BN-GIS framework is a practical tool that allows for the visualisation of relationships, the spatial assessment of uncertainty related to spatial management scenarios and the engagement of different stakeholder views, and enables a quick update of new spatial data and relationships. Ultimately, such BN-GIS based tools can support the decision-making process used in adaptive marine management. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. The role of 3D visualisation as an analytical tool preparatory to numerical modelling [rapid communication]

    NASA Astrophysics Data System (ADS)

    Robins, N. S.; Rutter, H. K.; Dumpleton, S.; Peach, D. W.

    2005-01-01

    Groundwater investigation has long depended on the process of developing a conceptual flow model as a precursor to developing a mathematical model, which in turn may lead in complex aquifers to the development of a numerical approximation model. The assumptions made in the development of the conceptual model depend heavily on the geological framework defining the aquifer, and if the conceptual model is inappropriate then subsequent modelling will also be incorrect. Paradoxically, the development of a robust conceptual model remains difficult, not least because this 3D paradigm is usually reduced to 2D plans and sections. 3D visualisation software is now available to facilitate the development of the conceptual model, to make the model more robust and defensible and to assist in demonstrating the hydraulics of the aquifer system. Case studies are presented to demonstrate the role and cost-effectiveness of the visualisation process.

  8. RSAT matrix-clustering: dynamic exploration and redundancy reduction of transcription factor binding motif collections

    PubMed Central

    Jaeger, Sébastien; Thieffry, Denis

    2017-01-01

    Transcription factor (TF) databases contain multitudes of binding motifs (TFBMs) from various sources, from which non-redundant collections are derived by manual curation. The advent of high-throughput methods stimulated the production of novel collections with increasing numbers of motifs. Meta-databases, built by merging these collections, contain redundant versions, because available tools are not suited to automatically identify and explore biologically relevant clusters among thousands of motifs. Motif discovery from genome-scale data sets (e.g. ChIP-seq) also produces redundant motifs, hampering the interpretation of results. We present matrix-clustering, a versatile tool that clusters similar TFBMs into multiple trees, and automatically creates non-redundant TFBM collections. A feature unique to matrix-clustering is its dynamic visualisation of aligned TFBMs, and its capability to simultaneously treat multiple collections from various sources. We demonstrate that matrix-clustering considerably simplifies the interpretation of combined results from multiple motif discovery tools, and highlights biologically relevant variations of similar motifs. We also ran a large-scale application to cluster ∼11 000 motifs from 24 entire databases, showing that matrix-clustering correctly groups motifs belonging to the same TF families and drastically reduces motif redundancy. matrix-clustering is integrated within the RSAT suite (http://rsat.eu/), accessible through a user-friendly web interface or command line for its integration in pipelines. PMID:28591841
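
    The clustering step itself can be illustrated with a generic sketch: given a precomputed motif-to-motif similarity matrix, hierarchical clustering groups near-identical motifs. This toy Python example uses random data and is not the RSAT matrix-clustering algorithm.

```python
# Illustrative only (not RSAT matrix-clustering): hierarchical clustering of
# motifs from a precomputed motif-to-motif similarity matrix, to group
# near-identical TFBMs before manual inspection.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(5)
n = 30
sim = rng.uniform(0, 1, (n, n))          # synthetic pairwise motif similarities
sim = (sim + sim.T) / 2
np.fill_diagonal(sim, 1.0)

dist = 1.0 - sim                         # turn similarity into a distance
Z = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(Z, t=0.4, criterion="distance")
print("number of motif clusters:", len(set(labels)))
```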

  9. iPadPix—A novel educational tool to visualise radioactivity measured by a hybrid pixel detector

    NASA Astrophysics Data System (ADS)

    Keller, O.; Schmeling, S.; Müller, A.; Benoit, M.

    2016-11-01

    With the ability to attribute signatures of ionising radiation to certain particle types, pixel detectors offer a unique advantage over the traditional use of Geiger-Müller tubes, including in educational settings. We demonstrate in this work how a Timepix readout chip combined with a standard 300 μm pixelated silicon sensor can be used to visualise radioactivity in real time and by means of augmented reality. The chip family is the result of technology transfer from High Energy Physics at CERN, facilitated by the Medipix Collaboration. This article summarises the development of a prototype based on an iPad mini and open source software detailed in ref. [1]. Appropriate experimental activities that explore natural radioactivity and everyday objects are given to demonstrate the use of this new tool in educational settings.

  10. The Footprint Database and Web Services of the Herschel Space Observatory

    NASA Astrophysics Data System (ADS)

    Dobos, László; Varga-Verebélyi, Erika; Verdugo, Eva; Teyssier, David; Exter, Katrina; Valtchanov, Ivan; Budavári, Tamás; Kiss, Csaba

    2016-10-01

    Data from the Herschel Space Observatory is freely available to the public but no uniformly processed catalogue of the observations has been published so far. To date, the Herschel Science Archive does not contain the exact sky coverage (footprint) of individual observations and supports search for measurements based on bounding circles only. Drawing on previous experience in implementing footprint databases, we built the Herschel Footprint Database and Web Services for the Herschel Space Observatory to provide efficient search capabilities for typical astronomical queries. The database was designed with the following main goals in mind: (a) provide a unified data model for meta-data of all instruments and observational modes, (b) quickly find observations covering a selected object and its neighbourhood, (c) quickly find every observation in a larger area of the sky, (d) allow for finding solar system objects crossing observation fields. As a first step, we developed a unified data model of observations of all three Herschel instruments for all pointing and instrument modes. Then, using telescope pointing information and observational meta-data, we compiled a database of footprints. As opposed to methods using pixellation of the sphere, we represent sky coverage in an exact geometric form allowing for precise area calculations. For easier handling of Herschel observation footprints with rather complex shapes, two algorithms were implemented to reduce the outline. Furthermore, a new visualisation tool to plot footprints with various spherical projections was developed. Indexing of the footprints using Hierarchical Triangular Mesh makes it possible to quickly find observations based on sky coverage, time and meta-data. The database is accessible via a web site http://herschel.vo.elte.hu and also as a set of REST web service functions, which makes it readily usable from programming environments such as Python or IDL. The web service allows downloading footprint data in various formats including Virtual Observatory standards.
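
    A REST service of this kind can be queried from scripting environments with a plain HTTP client. The Python sketch below shows the general pattern; the endpoint path and parameter names are illustrative placeholders rather than the service's documented interface.

```python
# A sketch of querying a REST web service like the Herschel Footprint Database
# from Python. The endpoint path and parameter names are placeholders; the
# actual service at http://herschel.vo.elte.hu documents its own functions.
import requests

BASE = "http://herschel.vo.elte.hu"  # service base URL from the abstract

params = {
    "ra": 83.822,      # hypothetical parameter names: target position (deg)
    "dec": -5.391,
    "radius": 0.5,     # search radius (deg)
    "format": "votable",
}
resp = requests.get(BASE + "/ws/footprint/search", params=params, timeout=30)  # placeholder path
resp.raise_for_status()
with open("footprints.vot", "wb") as fh:
    fh.write(resp.content)
```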

  11. An Offline-Online Android Application for Hazard Event Mapping Using WebGIS Open Source Technologies

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; Jaboyedoff, Michel; Sudmeier-Rieux, Karen; Derron, Marc-Henri; Devkota, Sanjaya

    2016-04-01

    Nowadays, Free and Open Source Software (FOSS) plays an important role in better understanding and managing disaster risk reduction around the world. National and local governments, NGOs and other stakeholders are increasingly seeking and producing data on hazards. Most hazard event inventories and land use mapping are based on remote sensing data, with little ground truthing, creating difficulties depending on the terrain and accessibility. Open Source WebGIS tools offer an opportunity for quicker and easier ground truthing of critical areas in order to analyse hazard patterns and triggering factors. This study presents a secure mobile-map application for hazard event mapping using Open Source WebGIS technologies such as the PostgreSQL database, PostGIS, Leaflet, Cordova and PhoneGap. The objectives of this prototype are: 1. an offline-online Android mobile application with advanced geospatial visualisation; 2. easy collection and storage of event information; 3. centralised data storage accessible from all services (smartphone, standard web browser); 4. improved data management through active participation in hazard event mapping and storage. This application has been implemented as a low-cost, rapid and participatory method for recording impacts from hazard events and includes geolocation (GPS data and Internet), visualising maps with overlays of satellite images, viewing uploaded images and events as cluster points, and drawing and adding event information. The data can be recorded offline (Android device) or online (all browsers) and consequently uploaded to the server whenever an internet connection is available. All events and records can be visualised by an administrator and made public after approval. Different user levels can be defined to access the data for communicating the information. This application was tested for landslides in post-earthquake Nepal but can be used for any other type of hazard, such as floods, avalanches, etc. Keywords: Offline, Online, WebGIS Open source, Android, Hazard Event Mapping
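
    The server-side storage described above can be sketched generically: the example below stores a hazard event as a PostGIS point and retrieves nearby events for the map client. The schema, credentials and coordinates are illustrative only, not the application's actual design.

```python
# A minimal sketch (not the application's actual schema) of storing a hazard
# event as a PostGIS point and querying nearby events for a web map client.
import psycopg2

conn = psycopg2.connect("dbname=hazards user=mapper password=secret host=localhost")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE IF NOT EXISTS hazard_events (
        id serial PRIMARY KEY,
        event_type text,
        description text,
        recorded_at timestamptz DEFAULT now(),
        geom geometry(Point, 4326)
    )
""")

# Insert one event captured offline and synced later (lon/lat from the GPS fix).
cur.execute(
    "INSERT INTO hazard_events (event_type, description, geom) "
    "VALUES (%s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))",
    ("landslide", "road blocked", 85.3240, 27.7172),
)
conn.commit()

# Return all events within roughly 10 km of a point as GeoJSON for the client.
cur.execute(
    "SELECT event_type, ST_AsGeoJSON(geom) FROM hazard_events "
    "WHERE ST_DWithin(geom::geography, ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography, 10000)",
    (85.3240, 27.7172),
)
print(cur.fetchall())
```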

  12. Eodataservice.org: Big Data Platform to Enable Multi-disciplinary Information Extraction from Geospatial Data

    NASA Astrophysics Data System (ADS)

    Natali, S.; Mantovani, S.; Barboni, D.; Hogan, P.

    2017-12-01

    In 1999, US Vice-President Al Gore outlined the concept of a `Digital Earth' as a multi-resolution, three-dimensional representation of the planet to find, visualise and make sense of vast amounts of geo-referenced information on physical and social environments, allowing users to navigate through space and time and to access historical and forecast data to support scientists, policy-makers and any other user. The eodataservice platform (http://eodataservice.org/) implements the Digital Earth concept: eodataservice is a cross-domain platform that makes available a large set of multi-year global environmental collections allowing data discovery, visualisation, combination, processing and download. It implements a "virtual datacube" approach where data stored in distributed data centres are made available via standardised OGC-compliant interfaces. Dedicated web-based graphical user interfaces (based on the ESA-NASA WebWorldWind technology) as well as web-based notebooks (e.g. Jupyter notebooks), desktop GIS tools and command-line interfaces can be used to access and manipulate the data. The platform can be fully customised to users' needs. So far eodataservice has been used for the following thematic applications: high-resolution satellite data distribution; land surface monitoring using SAR surface deformation data; atmosphere, ocean and climate applications; climate-health applications; urban environment monitoring; safeguarding of cultural heritage sites; and support to farmers and (re)insurers in the agricultural field. In the current work, the EO Data Service concept is presented as a key enabling technology; furthermore, various examples are provided to demonstrate the high level of interdisciplinarity of the platform.

  13. Communication of Science Plans in the Rosetta Mission

    NASA Astrophysics Data System (ADS)

    Schmidt, Albrecht; Grieger, Björn; Völk, Stefan

    2014-05-01

    Rosetta is a mission of the European Space Agency (ESA) to rendezvous with comet Churyumov-Gerasimenko in mid-2014. The trajectories and their corresponding operations are both flexible and particularly complex. To make informed decisions among the many free parameters, novel ways to communicate operations to the community have been explored. To support science planning by communicating operational ideas and disseminating operational scenarios, the science ground segment makes use of Web-based visualisation technologies. To keep the threshold for analysing operations proposals as low as possible, various implementation techniques have been investigated. An important goal was to use the Web to make the content as accessible as possible. By adopting the recent WebGL standard and generating static pages of time-dependent three-dimensional views of the spacecraft, as well as the corresponding fields of view of instruments, directly from the operational and for-study files, users are given the opportunity to explore interactively in their Web browsers what is being proposed, in addition to using the traditional file products and analysing them in detail. The scenes and animations can be viewed in any modern Web browser and be combined with other analyses. This facilitates verification and cross-validation of complex products, often done by comparing different independent analyses and studies. By providing different timesteps in animations, it is possible to focus on long-term or short-term planning without distracting the user from the essentials. This is particularly important since the information that can be displayed in a Web browser is limited by the data volume that can be transferred across the wire. In Web browsers, it is more challenging to do numerical calculations on demand. Since requests for additional data have to be passed through a Web server, they are more complex and also require a more complex infrastructure. The volume of data that can be kept in a browser environment is limited and might have to be transferred over often slow network links. Thus, careful design and reduction of data is required. Regarding user interaction, Web browsers are often limited to a mouse and keyboard. In terms of benefits, the visualisation techniques described here lower the threshold and turn-around times for discussing operational ideas. An additional benefit of the approach was the cooperative use of products by distributed users, which resulted in higher-quality software and data by incorporating more feedback than would usually have been available.

  14. Applications of Network Visualisation in Infectious Disease Management

    DTIC Science & Technology

    2006-12-01

    White, D.R. P-Systems: a structural model for kinship studies. Connections. 24, 22–33. 2001. [13] White, D.R., Batagelj, V., and Mrvar, A. Analyzing... [14] De Nooy, W., Mrvar, A., and Batagelj, V. Exploratory social network analysis with Pajek... Workspace for the World-Wide Web, Proceedings of the ACM Human Factors in Computing Systems, pp. 111. 1996. [7] Plaisant, C. Facilitating Data

  15. A Visualisation Tool to Aid Exploration of Students' Interactions in Asynchronous Online Communication

    ERIC Educational Resources Information Center

    Jyothi, Sujana; McAvinia, Claire; Keating, John

    2012-01-01

    Much research in recent years has focused on the introduction of virtual learning environments (VLEs) to universities, documenting practice, and sharing experience ([2], [9], [45] and [58]). Attention has been directed towards the importance of online dialogue for learning as a defining feature of the VLE. Communicative tools are an important…

  16. Project Scheduling Tool for Maintaining Capability Interdependencies and Defence Program Investment: A User’s Guide

    DTIC Science & Technology

    2014-09-01

    The free, open-source Integrated Development Environment (IDE) NetBeans [11] was used in the creation of the Graphical User Interface (GUI) for the tool...Oracle Corporation (2013) NetBeans IDE 7.4, http://www.netbeans.org. 12. O’Shea, K., Pong, P. & Bulluss, G. (2012) Fit-for-Purpose Visualisation of

  17. A Semantic Sensor Web for Environmental Decision Support Applications

    PubMed Central

    Gray, Alasdair J. G.; Sadler, Jason; Kit, Oles; Kyzirakos, Kostis; Karpathiotakis, Manos; Calbimonte, Jean-Paul; Page, Kevin; García-Castro, Raúl; Frazer, Alex; Galpin, Ixent; Fernandes, Alvaro A. A.; Paton, Norman W.; Corcho, Oscar; Koubarakis, Manolis; De Roure, David; Martinez, Kirk; Gómez-Pérez, Asunción

    2011-01-01

    Sensing devices are increasingly being deployed to monitor the physical world around us. One class of application for which sensor data is pertinent is environmental decision support systems, e.g., flood emergency response. For these applications, the sensor readings need to be put in context by integrating them with other sources of data about the surrounding environment. Traditional systems for predicting and detecting floods rely on methods that need significant human resources. In this paper we describe a semantic sensor web architecture for integrating multiple heterogeneous datasets, including live and historic sensor data, databases, and map layers. The architecture provides mechanisms for discovering datasets, defining integrated views over them, continuously receiving data in real-time, and visualising on screen and interacting with the data. Our approach makes extensive use of web service standards for querying and accessing data, and semantic technologies to discover and integrate datasets. We demonstrate the use of our semantic sensor web architecture in the context of a flood response planning web application that uses data from sensor networks monitoring the sea-state around the coast of England. PMID:22164110
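
    As a rough illustration of semantic dataset discovery, the Python sketch below queries a SPARQL endpoint for recent sensor observations. The endpoint URL is a placeholder and the query assumes SOSA/SSN-style vocabulary, which may differ from the ontologies used in the paper.

```python
# A sketch of discovering sensor observations over a semantic (SPARQL)
# endpoint from Python. The endpoint URL is hypothetical, and the vocabulary
# assumes the W3C SOSA/SSN ontology rather than the paper's own ontologies.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://example.org/sparql")  # placeholder endpoint
sparql.setQuery("""
    PREFIX sosa: <http://www.w3.org/ns/sosa/>
    SELECT ?sensor ?property ?result ?time WHERE {
        ?obs a sosa:Observation ;
             sosa:madeBySensor ?sensor ;
             sosa:observedProperty ?property ;
             sosa:hasSimpleResult ?result ;
             sosa:resultTime ?time .
    } LIMIT 10
""")
sparql.setReturnFormat(JSON)

# Print one line per observation binding returned by the endpoint.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["sensor"]["value"], row["result"]["value"], row["time"]["value"])
```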

  18. A case-association cluster detection and visualisation tool with an application to Legionnaires’ disease

    PubMed Central

    Sansom, P; Copley, V R; Naik, F C; Leach, S; Hall, I M

    2013-01-01

    Statistical methods used in spatio-temporal surveillance of disease are able to identify abnormal clusters of cases but typically do not provide a measure of the degree of association between one case and another. Such a measure would facilitate the assignment of cases to common groups and be useful in outbreak investigations of diseases that potentially share the same source. This paper presents a model-based approach which, on the basis of available location data, provides a measure of the strength of association between cases in space and time, and which is used to designate and visualise the most likely groupings of cases. The method was developed as a prospective surveillance tool to signal potential outbreaks, but it may also be used to explore groupings of cases in outbreak investigations. We demonstrate the method using a historical case series of Legionnaires’ disease amongst residents of England and Wales. PMID:23483594
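
    A toy Python sketch of the underlying idea (not the paper's model) is given below: pairwise case associations are scored with separable space and time kernels, and cases whose association exceeds a threshold are grouped.

```python
# A toy illustration (not the paper's model): score pairwise space-time
# closeness of cases with separable exponential kernels, then group cases
# whose association exceeds a threshold into putative clusters.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(2)
xy = rng.uniform(0, 50, size=(40, 2))          # case locations (km)
t = rng.uniform(0, 90, size=40)                # onset times (days)

d_space = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
d_time = np.abs(t[:, None] - t[None, :])

# Association decays with distance in space (5 km scale) and time (14 days).
assoc = np.exp(-d_space / 5.0) * np.exp(-d_time / 14.0)
np.fill_diagonal(assoc, 0.0)

adj = csr_matrix(assoc > 0.2)                  # threshold chosen for illustration
n_groups, labels = connected_components(adj, directed=False)
print(n_groups, "candidate groups; sizes:", np.bincount(labels))
```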

  19. [Shared decision-making and individualized goal setting - a pilot trial using PRISM (Pictorial Representation of Illness and Self Measure) in psychiatric inpatients].

    PubMed

    Büchi, S; Straub, S; Schwager, U

    2010-12-01

    Although there is much talk about shared decision-making and individualized goal setting, there is a lack of knowledge and know-how regarding their realization in daily clinical practice. In particular, there is a lack of easily applicable tools to support person-centred, individualized goal-setting processes. We present and discuss the semistructured, theory-driven use of PRISM (Pictorial Representation of Illness and Self Measure) in three selected psychiatric inpatients with complex psychiatric problems. PRISM supports a person-centred, individualized process of goal setting and treatment and reinforces the active participation of patients. The process of visualisation and synchronous documentation was rated positively by patients and clinicians. The visual goal setting requires 30 to 45 minutes. In patients with complex psychiatric illness, PRISM was used successfully to improve individual goal setting. Specific effects of the PRISM visualisation are currently being evaluated in a randomized controlled trial.

  20. GeoMapApp: Using Authentic Geoscience Data to Promote Student Engagement and Understanding

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.

    2016-12-01

    We increasingly expect geoscience data to be readily and freely accessible via the web in formats that are easy to handle. Yet, we are often required to compile data sets with different formats from multiple sources and, sometimes, we give up in frustration. Fortunately, recent advances in web-enabled technologies are helping to lower barriers by bridging the gap of data accessibility and integration. GeoMapApp (http://www.geomapapp.org), a free data discovery and visualisation tool developed with NSF funding at Lamont-Doherty Earth Observatory provides users with an intuitive map-based interface. GeoMapApp offers free access to hundreds of integrated research-grade geoscience data sets. Examples include earthquake and volcano data, geological maps, lithospheric plate boundary information, geochemical, oceanographic, and environmental data. Users can also import their own data files. The GeoMapApp interface presents data in its proper geographical context that enhances geospatial awareness and helps students more easily gain insight and understanding from the data. Simple tools for data manipulation help students analyse the data in different ways. An improved Save Session function allows users to store a pre-loaded state of GeoMapApp. When shared with a class, the saved file frees up valuable classroom time for students to explore and interrogate the data by allowing every student to open GeoMapApp at exactly the same starting point. GeoMapApp is adaptable to a range of learning environments from lab sessions, group projects, and homework assignments to in-class pop-ups. A wide range of undergraduate enquiry-driven education modules for GeoMapApp is already available at SERC. In this presentation, we will show GeoMapApp-based activities that promote student engagement with authentic geoscience data and that provide a better sense of data "ownership" and of academic equality - GeoMapApp presents the same data in the same tool used by researchers. Topics covered will include plate tectonics and climatology.

  1. Working Together, Before We're all at Sea

    NASA Astrophysics Data System (ADS)

    Bricher, P.; Newman, L.; Diggs, S. C.

    2016-02-01

    Wouldn't it be nice to know who is going to be sailing where next summer? There have been many attempts to build a portal to share information about future field plans in the Southern Ocean, and these have thus far met with limited success in terms of capacity and take-up. There is, however, considerable optimism about the future potential for such a tool, with at least a dozen multi-nation research and field initiatives planning to develop such tools for their own research communities. There is a clear appetite among researchers for a tool to spark conversations well in advance of field seasons that lead to better use of scarce field resources. Ironically, though, one of the biggest challenges to the successful development of such a tool is a lack of communication among the groups that are trying to develop such portals. A second major challenge is a lack of resources to properly develop, test, and maintain such portals. The Southern Ocean Observing System (SOOS) is holding conversations among data managers and multi-nation research initiatives to develop a tool of maximum utility for all. We propose a common backbone, in the form of a single integrated database based on open-source code, that meets the needs of oceanographers, biologists, other researchers, and program managers. Customised data entry forms and web visualisations can then be built on top of this to better target the needs of individual groups, without sacrificing interoperability. A further advantage of this approach is that we can marshal the resources of all groups to produce the best field planning tool possible. In this presentation, we will share the lessons learned so far, and invite further collaboration.

  2. Development of a browser application to foster research on linking climate and health datasets: Challenges and opportunities.

    PubMed

    Hajat, Shakoor; Whitmore, Ceri; Sarran, Christophe; Haines, Andy; Golding, Brian; Gordon-Brown, Harriet; Kessel, Anthony; Fleming, Lora E

    2017-01-01

    Improved data linkages between diverse environment and health datasets have the potential to provide new insights into the health impacts of environmental exposures, including complex climate change processes. Initiatives that link and explore big data in the environment and health arenas are now being established. To encourage advances in this nascent field, this article documents the development of a web browser application to facilitate such future research, the challenges encountered to date, and how they were addressed. A 'storyboard approach' was used to aid the initial design and development of the application. The application followed a 3-tier architecture: a spatial database server for storing and querying data, server-side code for processing and running models, and client-side browser code for user interaction and for displaying data and results. The browser was validated by reproducing previously published results from a regression analysis of time-series datasets of daily mortality, air pollution and temperature in London. Data visualisation and analysis options of the application are presented. The main factors that shaped the development of the browser were: accessibility, open-source software, flexibility, efficiency, user-friendliness, licensing restrictions and data confidentiality, visualisation limitations, cost-effectiveness, and sustainability. Creating dedicated data and analysis resources, such as the one described here, will become an increasingly vital step in improving understanding of the complex interconnections between the environment and human health and wellbeing, whilst still ensuring appropriate confidentiality safeguards. The issues raised in this paper can inform the future development of similar tools by other researchers working in this field. Copyright © 2016 Elsevier B.V. All rights reserved.
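
    The kind of analysis the browser reproduces can be sketched as a Poisson time-series regression; the example below uses synthetic data and an illustrative model specification, not the authors' published one.

```python
# An illustrative sketch (not the authors' exact specification) of the kind of
# time-series regression the browser reproduces: daily death counts modelled
# with Poisson regression on same-day temperature and air pollution.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 730
df = pd.DataFrame({
    "temperature": 12 + 8 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n),
    "pm10": rng.gamma(4, 6, n),
})
# Synthetic outcome with a small built-in pollution effect, for demonstration only.
mu = np.exp(3.0 + 0.002 * df["pm10"] - 0.01 * df["temperature"])
df["deaths"] = rng.poisson(mu)

X = sm.add_constant(df[["temperature", "pm10"]])
model = sm.GLM(df["deaths"], X, family=sm.families.Poisson()).fit()
print(model.summary())

# Relative risk per 10 ug/m3 increase in PM10 under this toy model:
print(np.exp(10 * model.params["pm10"]))
```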

  3. RootJS: Node.js Bindings for ROOT 6

    NASA Astrophysics Data System (ADS)

    Beffart, Theo; Früh, Maximilian; Haas, Christoph; Rajgopal, Sachin; Schwabe, Jonas; Wolff, Christoph; Szuba, Marek

    2017-10-01

    We present rootJS, an interface making it possible to seamlessly integrate ROOT 6 into applications written for Node.js, the JavaScript runtime platform increasingly used to create high-performance Web applications. ROOT features can be called both directly from Node.js code and by JIT-compiling C++ macros. All rootJS methods are invoked asynchronously and support callback functions, allowing non-blocking operation of Node.js applications using them. Last but not least, our bindings have been designed to be platform-independent and should therefore work on all systems supporting both ROOT 6 and Node.js. Thanks to rootJS it is now possible to create ROOT-aware Web applications taking full advantage of the high performance and extensive capabilities of Node.js. Examples include platforms for the quality assurance of acquired, reconstructed or simulated data, book-keeping and e-log systems, and even Web browser-based data visualisation and analysis.

  4. ANTP Protocol Suite Software Implementation Architecture in Python

    DTIC Science & Technology

    2011-06-03

    a popular platform for network programming, an area in which C has traditionally dominated. ... visualisation of the running system. For example, using the Google Maps API, the main logging web page can show all the running nodes in the system. By... communication between AeroNP and AeroRP and runs on the operating system as a daemon. Furthermore, it creates an API interface to manage the communication between

  5. MyChemise: A 2D drawing program that uses morphing for visualisation purposes

    PubMed Central

    2011-01-01

    MyChemise (My Chemical Structure Editor) is a new 2D structure editor. It is designed as a Java applet that enables the direct creation of structures on the Internet using a web browser. MyChemise saves files in a digital format (.cse); the import and export of .mol files using the appropriate connection tables is also possible. MyChemise is available as a free online version in English and German. The MyChemise GUI is designed to be user friendly and can be used intuitively. An English and German program description is also available as a PDF file. In addition to the usual ways of drawing chemical structure formulas, the program also implements components that allow the creation of different types of presentation. The morphing module uses this technology as a component for dynamic visualisation. For example, it enables a clear and simple illustration of molecular vibrations and reaction sequences. PMID:22152022

  6. RSAT matrix-clustering: dynamic exploration and redundancy reduction of transcription factor binding motif collections.

    PubMed

    Castro-Mondragon, Jaime Abraham; Jaeger, Sébastien; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2017-07-27

    Transcription factor (TF) databases contain multitudes of binding motifs (TFBMs) from various sources, from which non-redundant collections are derived by manual curation. The advent of high-throughput methods stimulated the production of novel collections with increasing numbers of motifs. Meta-databases, built by merging these collections, contain redundant versions, because available tools are not suited to automatically identify and explore biologically relevant clusters among thousands of motifs. Motif discovery from genome-scale data sets (e.g. ChIP-seq) also produces redundant motifs, hampering the interpretation of results. We present matrix-clustering, a versatile tool that clusters similar TFBMs into multiple trees, and automatically creates non-redundant TFBM collections. A feature unique to matrix-clustering is its dynamic visualisation of aligned TFBMs, and its capability to simultaneously treat multiple collections from various sources. We demonstrate that matrix-clustering considerably simplifies the interpretation of combined results from multiple motif discovery tools, and highlights biologically relevant variations of similar motifs. We also ran a large-scale application to cluster ∼11 000 motifs from 24 entire databases, showing that matrix-clustering correctly groups motifs belonging to the same TF families and drastically reduces motif redundancy. matrix-clustering is integrated within the RSAT suite (http://rsat.eu/), accessible through a user-friendly web interface or command line for its integration in pipelines. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  7. GIS Technologies For The New Planetary Science Archive (PSA)

    NASA Astrophysics Data System (ADS)

    Docasal, R.; Barbarisi, I.; Rios, C.; Macfarlane, A. J.; Gonzalez, J.; Arviset, C.; De Marchi, G.; Martinez, S.; Grotheer, E.; Lim, T.; Besse, S.; Heather, D.; Fraga, D.; Barthelemy, M.

    2015-12-01

    Geographical information systems (GIS) are becoming increasingly used in planetary science. GIS are computerised systems for the storage, retrieval, manipulation, analysis, and display of geographically referenced data. Some data stored in the Planetary Science Archive (PSA), for instance a set of Mars Express/Venus Express data, have spatial metadata associated with them. To facilitate users in handling and visualising spatial data in GIS applications, the new PSA should support interoperability with interfaces implementing the standards approved by the Open Geospatial Consortium (OGC). These standards are followed in order to develop open interfaces and encodings that allow data to be exchanged with GIS client applications, well-known examples of which are Google Earth and NASA World Wind, as well as open-source tools such as OpenLayers. The technology already exists within PostgreSQL databases to store searchable geometrical data in the form of the PostGIS extension. GeoServer is an existing open-source map server; an instance of it has been deployed for the new PSA and uses the OGC standards to allow, among other things, the sharing, processing and editing of spatial data through the Web Feature Service (WFS) standard, as well as the serving of georeferenced map images through the Web Map Service (WMS). The final goal of the new PSA, being developed by the European Space Astronomy Centre (ESAC) Science Data Centre (ESDC), is to create an archive which enables science exploitation of ESA's planetary mission datasets. This can be facilitated through the GIS framework, offering interfaces (both a web GUI and scriptable APIs) that can be used more easily and scientifically by the community, and that will also enable the community to build added-value services on top of the PSA.
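
    Such WFS endpoints can be consumed from scripts as well as GIS clients. The Python sketch below uses OWSLib to list feature types and download one layer as GeoJSON; the URL and layer name are placeholders, not the PSA's actual service.

```python
# A minimal sketch of retrieving vector features from a GeoServer WFS endpoint
# with OWSLib. The URL and type name are hypothetical placeholders.
from owslib.wfs import WebFeatureService

wfs = WebFeatureService("https://example.org/geoserver/psa/wfs", version="2.0.0")

# Feature types advertised by the server.
print(list(wfs.contents))

resp = wfs.getfeature(
    typename=["psa:mex_observation_footprints"],   # placeholder layer name
    bbox=(-30.0, 120.0, 10.0, 180.0),              # area of interest
    outputFormat="application/json",
)
with open("footprints.geojson", "wb") as fh:
    fh.write(resp.read())
```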

  8. Airborne electromagnetics data interactive visualisation and exploratory data analysis using Cloud technologies

    NASA Astrophysics Data System (ADS)

    Golodoniuc, P.; Davis, A. C.; Klump, J. F.

    2017-12-01

    Electromagnetic exploration techniques are extensively used for remote detection and measurement of subsurface electrical conductivity structures in a variety of geophysical applications such as mineral exploration and groundwater detection. The Electromagnetic Applications group in the Mineral Resources business unit of CSIRO relies heavily on airborne electromagnetic (AEM) data for the development of new exploration methods. AEM data, which are often originally acquired for green- or brown-fields mineral exploration, can be re-used for groundwater resource detection in the near-surface. This makes AEM data potentially useful beyond their initial purpose for decades into the future. Increasingly, AEM data are also used as a primary mapping tool for groundwater resources. With surveys ranging from under 1000 km to tens of thousands of km in total length, AEM data are spatially and temporally dense. Sounding stations are often sampled every 0.2 seconds, with about 30-50 measurements taken at each site, resulting in a spacing of measurements along the flight lines of approximately 20-50 metres. This means that typical AEM surveys can easily have on the order of millions of individual stations, with tens of millions of measurements. AEM data need to be examined for data quality before they can be inverted into conductivity-depth information. Data, which are gathered in survey transects or lines, are examined along the line, in plan view and for the transient decay of the electromagnetic signal of individual stations before noise artefacts can be removed. The complexity of the data, their size and dimensionality require efficient tools that support interactive visual data analysis and allow easy navigation through the dataset. A suite of numerical algorithms for data quality assurance facilitates this process through efficient visualisations and data quality metrics. The extensible architecture of the toolkit allows application of custom algorithms on demand through a web-based user interface and seamlessly connects the data processing workflow to geophysical inversion codes. The toolkit architecture has a small client-side footprint and runs on a standard workstation, delegating all computationally intensive tasks to the accompanying Cloud-based processing unit.
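
    One simple quality metric of the kind mentioned above can be sketched with standard Python tools: the toy example below flags soundings whose late-time gates are dominated by noise, using synthetic data and an arbitrary threshold (it is not the CSIRO toolkit).

```python
# A toy QA metric for AEM soundings (not the CSIRO toolkit): flag stations
# whose late-time decay gates are noise-dominated, estimated as the scatter of
# the last few time gates relative to their mean.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n_stations, n_gates = 1000, 40
decay = np.exp(-np.linspace(0, 6, n_gates))[None, :] * rng.uniform(5, 50, (n_stations, 1))
decay += rng.normal(0, 0.05, (n_stations, n_gates))        # synthetic noise floor

late = decay[:, -8:]                                        # last 8 time gates
snr = np.abs(late.mean(axis=1)) / late.std(axis=1)

qa = pd.DataFrame({"station": np.arange(n_stations), "late_time_snr": snr})
flagged = qa[qa["late_time_snr"] < 2.0]                     # illustrative threshold
print(f"{len(flagged)} of {n_stations} stations flagged for review")
```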

  9. MiMiR – an integrated platform for microarray data sharing, mining and analysis

    PubMed Central

    Tomlinson, Chris; Thimma, Manjula; Alexandrakis, Stelios; Castillo, Tito; Dennis, Jayne L; Brooks, Anthony; Bradley, Thomas; Turnbull, Carly; Blaveri, Ekaterini; Barton, Geraint; Chiba, Norie; Maratou, Klio; Soutter, Pat; Aitman, Tim; Game, Laurence

    2008-01-01

    Background: Despite considerable efforts within the microarray community for standardising data format, content and description, microarray technologies present major challenges in managing, sharing, analysing and re-using the large amount of data generated locally or internationally. Additionally, it is recognised that inconsistent and low quality experimental annotation in public data repositories significantly compromises the re-use of microarray data for meta-analysis. MiMiR, the Microarray data Mining Resource, was designed to tackle some of these limitations and challenges. Here we present new software components and enhancements to the original infrastructure that increase accessibility, utility and opportunities for large scale mining of experimental and clinical data. Results: A user friendly Online Annotation Tool allows researchers to submit detailed experimental information via the web at the time of data generation rather than at the time of publication. This ensures the easy access and high accuracy of the meta-data collected. Experiments are programmatically built in the MiMiR database from the submitted information and details are systematically curated and further annotated by a team of trained annotators using a new Curation and Annotation Tool. Clinical information can be annotated and coded with a clinical Data Mapping Tool within an appropriate ethical framework. Users can visualise experimental annotation, assess data quality, download and share data via a web-based experiment browser called MiMiR Online. All requests to access data in MiMiR are routed through a sophisticated middleware security layer thereby allowing secure data access and sharing amongst MiMiR registered users prior to publication. Data in MiMiR can be mined and analysed using the integrated EMAAS open source analysis web portal or via export of data and meta-data into the Rosetta Resolver data analysis package. Conclusion: The new MiMiR suite of software enables systematic and effective capture of extensive experimental and clinical information with the highest MIAME score, and secure data sharing prior to publication. MiMiR currently contains more than 150 experiments corresponding to over 3000 hybridisations and supports the Microarray Centre's large microarray user community and two international consortia. The MiMiR flexible and scalable hardware and software architecture enables secure warehousing of thousands of datasets, including clinical studies, from microarray and potentially other -omics technologies. PMID:18801157

  10. MiMiR--an integrated platform for microarray data sharing, mining and analysis.

    PubMed

    Tomlinson, Chris; Thimma, Manjula; Alexandrakis, Stelios; Castillo, Tito; Dennis, Jayne L; Brooks, Anthony; Bradley, Thomas; Turnbull, Carly; Blaveri, Ekaterini; Barton, Geraint; Chiba, Norie; Maratou, Klio; Soutter, Pat; Aitman, Tim; Game, Laurence

    2008-09-18

    Despite considerable efforts within the microarray community for standardising data format, content and description, microarray technologies present major challenges in managing, sharing, analysing and re-using the large amount of data generated locally or internationally. Additionally, it is recognised that inconsistent and low quality experimental annotation in public data repositories significantly compromises the re-use of microarray data for meta-analysis. MiMiR, the Microarray data Mining Resource, was designed to tackle some of these limitations and challenges. Here we present new software components and enhancements to the original infrastructure that increase accessibility, utility and opportunities for large-scale mining of experimental and clinical data. A user-friendly Online Annotation Tool allows researchers to submit detailed experimental information via the web at the time of data generation rather than at the time of publication. This ensures easy access to, and high accuracy of, the collected meta-data. Experiments are programmatically built in the MiMiR database from the submitted information and details are systematically curated and further annotated by a team of trained annotators using a new Curation and Annotation Tool. Clinical information can be annotated and coded with a clinical Data Mapping Tool within an appropriate ethical framework. Users can visualise experimental annotation, assess data quality, download and share data via a web-based experiment browser called MiMiR Online. All requests to access data in MiMiR are routed through a sophisticated middleware security layer, thereby allowing secure data access and sharing amongst MiMiR registered users prior to publication. Data in MiMiR can be mined and analysed using the integrated EMAAS open-source analysis web portal or via export of data and meta-data into the Rosetta Resolver data analysis package. The new MiMiR suite of software enables systematic and effective capture of extensive experimental and clinical information with the highest MIAME score, and secure data sharing prior to publication. MiMiR currently contains more than 150 experiments corresponding to over 3000 hybridisations and supports the Microarray Centre's large microarray user community and two international consortia. The MiMiR flexible and scalable hardware and software architecture enables secure warehousing of thousands of datasets, including clinical studies, from microarray and potentially other -omics technologies.

  11. Exercise Black Skies 2008: Enhancing Live Training Through Virtual Preparation -- Part Two: An Evaluation of Tools and Techniques

    DTIC Science & Technology

    2009-06-01

    visualisation tool. These tools are currently in use at the Surveillance and Control Training Unit (SACTU) in Williamtown, New South Wales, and the School... itself by facilitating the brevity and sharpness of learning points. The playback of video and audio was considered an extremely useful method of... The task assessor’s comments were supported by wall projections and audio replays of relevant mission segments that were controlled by an AAR

  12. A simple method for serving Web hypermaps with dynamic database drill-down

    PubMed Central

    Boulos, Maged N Kamel; Roudsari, Abdul V; Carson, Ewart R

    2002-01-01

    Background HealthCyberMap aims at mapping parts of health information cyberspace in novel ways to deliver a semantically superior user experience. This is achieved through "intelligent" categorisation and interactive hypermedia visualisation of health resources using metadata, clinical codes and GIS. HealthCyberMap is an ArcView 3.1 project. WebView, the Internet extension to ArcView, publishes HealthCyberMap ArcView Views as Web client-side imagemaps. The basic WebView set-up does not support any GIS database connection, and published Web maps become disconnected from the original project. A dedicated Internet map server would be the best way to serve HealthCyberMap database-driven interactive Web maps, but is an expensive and complex solution to acquire, run and maintain. This paper describes HealthCyberMap's simple, low-cost method for "patching" WebView to serve hypermaps with dynamic database drill-down functionality on the Web. Results The proposed solution is currently used for publishing HealthCyberMap GIS-generated navigational information maps on the Web while maintaining their links with the underlying resource metadata base. Conclusion The authors believe their map-serving approach as adopted in HealthCyberMap has been very successful, especially in cases when only map attribute data change without a corresponding effect on map appearance. It should also be possible to use the same solution to publish other interactive GIS-driven maps on the Web, e.g., maps of real world health problems. PMID:12437788

  13. Communication of uncertainty in hydrological predictions: a user-driven example web service for Europe

    NASA Astrophysics Data System (ADS)

    Fry, Matt; Smith, Katie; Sheffield, Justin; Watts, Glenn; Wood, Eric; Cooper, Jon; Prudhomme, Christel; Rees, Gwyn

    2017-04-01

    Water is fundamental to society as it impacts on all facets of life, the economy and the environment. But whilst it creates opportunities for growth and life, it can also cause serious damage to society and infrastructure through extreme hydro-meteorological events such as floods or droughts. Anticipation of future water availability and extreme event risks would help both to optimise growth and to limit damage through better preparedness and planning, hence providing huge societal benefits. Recent scientific research advances now make it possible to provide hydrological outlooks at monthly to seasonal lead times, and future projections up to the end of the century accounting for climatic changes. However, high uncertainty remains in the predictions, which varies depending on location, time of the year, prediction range and hydrological variable. It is essential that this uncertainty is fully understood by decision makers so they can account for it in their planning. Hence, the challenge is to find ways to communicate such uncertainty to a range of stakeholders with different technical backgrounds and environmental science knowledge. The project EDgE (End-to-end Demonstrator for improved decision making in the water sector for Europe), funded by the Copernicus programme (C3S), is a proof-of-concept project that develops a unique service to support decision making for the water sector at monthly to seasonal and multi-decadal lead times. It is a mutual effort of co-production between hydrologists and environmental modellers, computer scientists and stakeholders representative of key decision makers in Europe for the water sector. This talk will present the iterative co-production process of a web service that serves the needs of the user community. Through a series of Focus Group meetings in Spain, Norway and the UK, options for visualising the hydrological predictions and associated uncertainties are presented and discussed, first as mock-up dashboards, then as off-line tools and pre-operational services. Feedback received from the users is listed and prioritised for the next generation of development. In addition, sprint-review webinars are organised to ensure the developed services address the users' demands correctly. The tools are formally tested through a set of case studies representative of decision making in contrasting water sectors, including hydro-power in snow-dominated regions, public water supply in heavily regulated countries, and river basin management in an arid environment with multiple users. In addition to the visualisation, a key component of the project is the provision of user guidance. This helps the user understand the challenges of dealing with uncertainty and the interpretation of the results, provides contextual background information, describes the service's functionality, and showcases examples of good practice.

  14. The SOOS Data Portal, providing access to Southern Oceans data

    NASA Astrophysics Data System (ADS)

    Proctor, Roger; Finney, Kim; Blain, Peter; Taylor, Fiona; Newman, Louise; Meredith, Mike; Schofield, Oscar

    2013-04-01

    The Southern Ocean Observing System (SOOS) is an international initiative to enhance, coordinate and expand the strategic observations of the Southern Ocean that are required to address key scientific and societal challenges. A key component of SOOS will be the creation and maintenance of a Southern Ocean Data Portal to provide improved access to historical and ongoing data (Schofield et al., 2012, Eos, Vol. 93, No. 26, pp 241-243). The scale of this effort will require strong leveraging of existing data centres, new cyberinfrastructure development efforts, and defined data collection, quality control, and archiving procedures across the international community. The task of assembling the SOOS data portal is assigned to the SOOS Data Management Sub-Committee. The information infrastructure chosen for the SOOS data portal is based on the Australian Ocean Data Network (AODN, http://portal.aodn.org.au). The AODN infrastructure is built on open-source tools, and the use of international standards ensures efficiency of data exchange and interoperability between contributing systems. OGC standard web service protocols are used for serving data via the internet, including Web Map Service (WMS) for visualisation, Web Feature Service (WFS) for data download, and Catalogue Service for the Web (CSW) for catalogue exchange. The portal offers a number of tools to access and visualize data: a Search link to the metadata catalogue enables search and discovery by simple text search, geographic area, temporal extent, keyword, parameter, organisation, or any combination of these, allowing users to gain access to further information and/or the data for download (searches can also be restricted to items which have data to download, attached map layers, or both); a Map interface supports discovery and display of data, with the ability to change the style and opacity of layers, add additional data layers via OGC Web Map Services, and view animated time-series data streams; and data can be easily accessed and downloaded, including directly from OPeNDAP/THREDDS servers. The SOOS data portal (http://soos.aodn.org.au/soos) aims to make access to Southern Ocean data a simple process, and the initial layout classifies data into six themes - Heat and Freshwater; Circulation; Ice-sheets and Sea level; Carbon; Sea-ice; and Ecosystems - with the ability to integrate layers between themes. The portal is in its infancy (pilot launched January 2013) with a limited number of datasets available; however, the number of datasets is expected to grow rapidly as the international community becomes fully engaged.
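
    As a rough illustration of how such OGC services can be consumed programmatically, the sketch below uses the OWSLib Python library to list WMS layers and search a CSW catalogue; the endpoint URLs and the search term are hypothetical placeholders rather than the actual SOOS/AODN service addresses.

        # Illustrative sketch of querying an AODN-style portal backend with OWSLib.
        # The endpoint URLs and the search term are assumptions for demonstration only;
        # consult the portal documentation for the current service addresses.
        from owslib.wms import WebMapService
        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        WMS_URL = "https://example.org/geoserver/wms"     # hypothetical WMS endpoint
        CSW_URL = "https://example.org/geonetwork/csw"    # hypothetical catalogue endpoint

        # List the map layers advertised by the Web Map Service (visualisation).
        wms = WebMapService(WMS_URL, version="1.1.1")
        for name, layer in list(wms.contents.items())[:5]:
            print(name, "-", layer.title)

        # Search the metadata catalogue (CSW) for sea-ice related records.
        csw = CatalogueServiceWeb(CSW_URL)
        csw.getrecords2(constraints=[PropertyIsLike("csw:AnyText", "%sea ice%")], maxrecords=5)
        for rec in csw.records.values():
            print(rec.title)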

  15. Exchanging environmental information and decision making: developing the local Pilot Environmental Virtual Observatory with stakeholder communities

    NASA Astrophysics Data System (ADS)

    Mackay, E.; Beven, K.; Brewer, P.; M, Haygarth, P.; Macklin, M.; Marshall, K.; Quinn, P.; Stutter, M.; Thomas, N.; Wilkinson, M.

    2012-04-01

    Public participation in the development of flood risk management and river basin management plans is an explicit component of both the Water Framework and Floods Directives. At the local level, involving communities in land and water management has been found to (i) aid better environmental decision making, (ii) enhance social, economic and environmental benefits, and (iii) increase a sense of ownership. Facilitating the access and exchange of information on the local environment is an important part of this new approach to the land and water management process, which also includes local community stakeholders in decisions about the design and content of the information provided. As part of the Natural Environment Research Council's pilot Environment Virtual Observatory (EVO), the Local Level group are engaging with local community stakeholders in three different catchments in the UK (the rivers Eden, Tarland and Dyfi) to start the process of developing prototype visualisation tools to address the specific land and water management issues identified in each area. Through this local collaboration, we will provide novel visualisation tools through which to communicate complex catchment science outcomes and bring together different sources of environmental data in ways that better meet end-user needs as well as facilitate a far broader participatory approach to environmental decision making. The Local Landscape Visualisation Tools are being developed iteratively during the project to reflect the needs, interests and capabilities of a wide range of stakeholders. The tools will use the latest concepts and technologies to communicate with, and provide opportunities for the provision and exchange of information between, the public, government agencies and scientists. This local toolkit will reside within a wider EVO platform that will include national datasets, models and state-of-the-art cloud computing systems. As such, local stakeholder groups are assisting the EVO's development and participating in local decision making alongside policy makers, government agencies and scientists.

  16. Ms2lda.org: web-based topic modelling for substructure discovery in mass spectrometry.

    PubMed

    Wandy, Joe; Zhu, Yunfeng; van der Hooft, Justin J J; Daly, Rónán; Barrett, Michael P; Rogers, Simon

    2017-09-14

    We recently published MS2LDA, a method for the decomposition of sets of molecular fragment data derived from large metabolomics experiments. To make the method more widely available to the community, here we present ms2lda.org, a web application that allows users to upload their data, run MS2LDA analyses and explore the results through interactive visualisations. Ms2lda.org takes tandem mass spectrometry data in many standard formats and allows the user to infer the sets of fragment and neutral loss features that co-occur (Mass2Motifs). As an alternative workflow, the user can also decompose a dataset onto predefined Mass2Motifs. This is accomplished through the web interface or programmatically from our web service. The website can be found at http://ms2lda.org, while the source code is available at https://github.com/sdrogers/ms2ldaviz under the MIT license. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  17. Microscopic transport model animation visualisation on KML base

    NASA Astrophysics Data System (ADS)

    Yatskiv, I.; Savrasovs, M.

    2012-10-01

    Classical literature on simulation theory notes that one of the great strengths of simulation is the ability to present the processes inside a system through animation. This gives a simulation model additional value when presenting simulation results to the public and to authorities who are not sufficiently familiar with simulation. That is why most universal and specialised simulation tools are able to construct 2D and 3D representations of a model. Usually the development of such a representation takes considerable time, and much effort must be put into creating an adequate 3D representation of the model. For many years, well-known microscopic traffic flow simulation software tools such as VISSIM, AIMSUN and PARAMICS have been able to produce 2D and 3D animation. However, creating a realistic 3D model of the place where traffic flows are simulated is a hard and time-consuming task even in these professional software tools. The goal of this paper is to describe the concept of using existing online geographical information systems for visualisation of the animation produced by simulation software. For demonstration purposes the following technologies and tools have been used: PTV VISION VISSIM, KML and Google Earth.
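
    As a hedged illustration of the KML output step, the sketch below writes an invented vehicle trajectory to a KML file that Google Earth can display; it is not the converter described in the paper, and the coordinates are placeholder values.

        # Minimal sketch of exporting a simulated vehicle trajectory to KML for display
        # in Google Earth. The coordinates are invented sample data; a real workflow
        # would read them from the simulation's trajectory/animation output.
        KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2">
          <Document>
            <Placemark>
              <name>Vehicle 1</name>
              <LineString>
                <coordinates>
        {coords}
                </coordinates>
              </LineString>
            </Placemark>
          </Document>
        </kml>"""

        def trajectory_to_kml(points, path):
            """points: iterable of (longitude, latitude) pairs in WGS84."""
            coords = "\n".join(f"          {lon:.6f},{lat:.6f},0" for lon, lat in points)
            with open(path, "w", encoding="utf-8") as f:
                f.write(KML_TEMPLATE.format(coords=coords))

        if __name__ == "__main__":
            sample = [(24.1052, 56.9496), (24.1060, 56.9501), (24.1071, 56.9507)]  # placeholder track
            trajectory_to_kml(sample, "vehicle_track.kml")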

  18. DataRocket: Interactive Visualisation of Data Structures

    NASA Astrophysics Data System (ADS)

    Parkes, Steve; Ramsay, Craig

    2010-08-01

    CodeRocket is a software engineering tool that provides cognitive support to the software engineer for reasoning about a method or procedure and for documenting the resulting code [1]. DataRocket is a software engineering tool designed to support visualisation and reasoning about program data structures. DataRocket is part of the CodeRocket family of software tools developed by Rapid Quality Systems [2], a spin-out company from the Space Technology Centre at the University of Dundee. CodeRocket and DataRocket integrate seamlessly with existing architectural design and coding tools and provide extensive documentation with little or no effort on the part of the software engineer. Comprehensive, abstract, detailed design documentation is available early on in a project so that it can be used for design reviews with project managers and non-expert stakeholders. Code and documentation remain fully synchronised even when changes are implemented in the code without reference to the existing documentation. At the end of a project, the press of a button suffices to produce the detailed design document. Existing legacy code can be easily imported into CodeRocket and DataRocket to reverse-engineer detailed design documentation, making legacy code more manageable and adding substantially to its value. This paper introduces CodeRocket. It then explains the rationale for DataRocket and describes the key features of this new tool. Finally, the major benefits of DataRocket for different stakeholders are considered.

  19. Crowdsourcing, citizen sensing and Sensor Web technologies for public and environmental health surveillance and crisis management: trends, OGC standards and application examples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamel Boulos, Maged; Resch, Bernd; Crowley, David N.

    The PIE Activity Awareness Environment is designed to be an adaptive data triage and decision support tool that allows role and activity based situation awareness through a dynamic, trainable filtering system. This paper discusses the process and methodology involved in the application as well as some of its capabilities. 'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, 'noise', misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world.

  20. The Selimiye Mosque of Edirne, Turkey - AN Immersive and Interactive Virtual Reality Experience Using Htc Vive

    NASA Astrophysics Data System (ADS)

    Kersten, T. P.; Büyüksalih, G.; Tschirschwitz, F.; Kan, T.; Deggim, S.; Kaya, Y.; Baskaraca, A. P.

    2017-05-01

    Recent advances in contemporary Virtual Reality (VR) technologies are going to have a significant impact on everyday life. Through VR it is possible to virtually explore a computer-generated environment as a different reality, and to immerse oneself in the past or in a virtual museum without leaving the current real-life situation. For the ultimate VR experience, the user should see only the virtual world. Currently, the user must wear a VR headset which fits around the head and over the eyes to visually separate themselves from the physical world. Via the headset, images are fed to the eyes through two small lenses. Cultural heritage monuments are ideally suited both for thorough multi-dimensional geometric documentation and for realistic interactive visualisation in immersive VR applications. Additionally, the game industry offers tools for interactive visualisation of objects to motivate users to virtually visit objects and places. In this paper the generation of a virtual 3D model of the Selimiye mosque in the city of Edirne, Turkey, and its processing for data integration into the game engine Unity are presented. The project has been carried out as a co-operation between BİMTAŞ, a company of the Greater Municipality of Istanbul, Turkey, and the Photogrammetry & Laser Scanning Lab of the HafenCity University Hamburg, Germany, to demonstrate an immersive and interactive visualisation using the new VR system HTC Vive. The workflow from data acquisition to VR visualisation, including the necessary programming for navigation, is described. Furthermore, the possible use of such a VR visualisation for a CH monument, including simultaneous multi-user environments, is discussed in this contribution.

  1. CRISPRDetect: A flexible algorithm to define CRISPR arrays.

    PubMed

    Biswas, Ambarish; Staals, Raymond H J; Morales, Sergio E; Fineran, Peter C; Brown, Chris M

    2016-05-17

    CRISPR (clustered regularly interspaced short palindromic repeats) RNAs provide the specificity for noncoding RNA-guided adaptive immune defence systems in prokaryotes. CRISPR arrays consist of repeat sequences separated by specific spacer sequences. CRISPR arrays have previously been identified in a large proportion of prokaryotic genomes. However, currently available detection algorithms do not utilise recently discovered features regarding CRISPR loci. We have developed a new approach to automatically detect, predict and interactively refine CRISPR arrays. It is available as a web program and a command-line tool from bioanalysis.otago.ac.nz/CRISPRDetect. CRISPRDetect discovers putative arrays, extends the array by detecting additional variant repeats, corrects the direction of arrays, refines the repeat/spacer boundaries, and annotates different types of sequence variations (e.g. insertion/deletion) in near-identical repeats. Due to these features, CRISPRDetect has significant advantages when compared to existing identification tools. As well as further support for small, medium and large repeats, CRISPRDetect identified a class of arrays with 'extra-large' repeats in bacteria (repeats of 44-50 nt). The CRISPRDetect output is integrated with other analysis tools. Notably, the predicted spacers can be directly utilised by CRISPRTarget to predict targets. CRISPRDetect enables more accurate detection of arrays and spacers, and its GFF output is suitable for inclusion in genome annotation pipelines and visualisation. It has been used to analyse all complete bacterial and archaeal reference genomes.

  2. PolyTB: A genomic variation map for Mycobacterium tuberculosis

    PubMed Central

    Coll, Francesc; Preston, Mark; Guerra-Assunção, José Afonso; Hill-Cawthorn, Grant; Harris, David; Perdigão, João; Viveiros, Miguel; Portugal, Isabel; Drobniewski, Francis; Gagneux, Sebastien; Glynn, Judith R.; Pain, Arnab; Parkhill, Julian; McNerney, Ruth; Martin, Nigel; Clark, Taane G.

    2014-01-01

    Summary Tuberculosis (TB) caused by Mycobacterium tuberculosis (Mtb) is the second major cause of death from an infectious disease worldwide. Recent advances in DNA sequencing are leading to the ability to generate whole genome information in clinical isolates of M. tuberculosis complex (MTBC). The identification of informative genetic variants such as phylogenetic markers and those associated with drug resistance or virulence will help barcode Mtb in the context of epidemiological, diagnostic and clinical studies. Mtb genomic datasets are increasingly available as raw sequences, which are potentially difficult and computationally intensive to process and compare across studies. Here we have processed the raw sequence data (>1500 isolates, eight studies) to compile a catalogue of SNPs (n = 74,039, 63% non-synonymous, 51.1% in more than one isolate, i.e. non-private), small indels (n = 4810) and larger structural variants (n = 800). We have developed the PolyTB web-based tool (http://pathogenseq.lshtm.ac.uk/polytb) to visualise the resulting variation and important meta-data (e.g. in silico inferred strain-types, location) within geographical map and phylogenetic views. This resource will allow researchers to identify polymorphisms within candidate genes of interest, as well as examine the genomic diversity and distribution of strains. PolyTB source code is freely available to researchers wishing to develop similar tools for their pathogen of interest. PMID:24637013

  3. Use of chemoinformatics tools- nuts and bolts; Challenges in their regulatory application

    EPA Science Inventory

    Cheminformatics spans a continuum of components from data storage to uncovering new insights that are useful for different decision making contexts. It covers the input, retrieval of data, the manipulation and integration of data through to the visualisation and analysis to trans...

  4. Spatiotemporal data visualisation for homecare monitoring of elderly people.

    PubMed

    Juarez, Jose M; Ochotorena, Jose M; Campos, Manuel; Combi, Carlo

    2015-10-01

    Elderly people who live alone can be assisted by home monitoring systems that identify risk scenarios such as falls, fatigue symptoms or burglary. Given that these systems have to manage spatiotemporal data, human intervention is required to validate automatic alarms due to the high number of false positives and the need for context interpretation. The goal of this work was to provide tools to support human action, to identify such potential risk scenarios based on spatiotemporal data visualisation. We propose the MTA (multiple temporal axes) model, a visual representation of temporal information of the activity of a single person at different locations. The main goal of this model is to visualise the behaviour of a person in their home, facilitating the identification of health-risk scenarios and repetitive patterns. We evaluate the model's insight capacity compared with other models using a standard evaluation protocol. We also test the practical suitability of the MTA graphical model in a commercial home monitoring system. In particular, we implemented 8VISU, a visualisation tool based on MTA. MTA proved to be more than 90% accurate in identifying non-risk scenarios, independently of the length of the record visualised. When the spatial complexity was increased (e.g. the number of rooms), the model provided good accuracy for up to 5 rooms. Therefore, user preferences and user performance seem to be balanced. Moreover, it also gave high sensitivity levels (over 90%) for 5-8 rooms. Falls are the most recurrent incident for elderly people. The MTA model outperformed the other models considered in identifying fall scenarios (66% correct) and was the second best for burglary and fatigue scenarios (36% correct). Our experiments also confirm the hypothesis that cyclic models are the most suitable for fatigue scenarios, with the Spiral and MTA models obtaining the most positive identifications. In home monitoring systems, spatiotemporal visualisation is a useful tool for identifying risk and preventing home accidents among elderly people living alone. The MTA model helps visualisation at different stages of the temporal data analysis process. In particular, its explicit representation of space and movement is useful for identifying potential risk scenarios, while the spiral structure can be used for the identification of recurrent patterns. The results of the experiments and the experience of using the visualisation tool 8VISU prove the potential of the MTA graphical model for mining temporal data and supporting caregivers using home monitoring infrastructures. Copyright © 2015 Elsevier B.V. All rights reserved.
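
    The sketch below is not the MTA model itself, only a minimal matplotlib timeline of room-occupancy intervals, to illustrate the kind of spatiotemporal record such a home-monitoring visualisation works from; the intervals are invented sample data.

        # Simple room-occupancy timeline with matplotlib (illustrative only, not MTA).
        # The occupancy intervals below are invented sample data.
        import matplotlib.pyplot as plt

        # (room, [(start_hour, duration_hours), ...])
        occupancy = [
            ("Bedroom",  [(0, 7), (22, 2)]),
            ("Kitchen",  [(7, 1), (12, 1), (19, 1)]),
            ("Lounge",   [(8, 4), (13, 6), (20, 2)]),
            ("Bathroom", [(6.5, 0.5), (21.5, 0.5)]),
        ]

        fig, ax = plt.subplots(figsize=(8, 3))
        for i, (room, intervals) in enumerate(occupancy):
            ax.broken_barh(intervals, (i - 0.4, 0.8))   # one horizontal track per room
        ax.set_yticks(range(len(occupancy)))
        ax.set_yticklabels([room for room, _ in occupancy])
        ax.set_xlabel("Hour of day")
        ax.set_xlim(0, 24)
        plt.tight_layout()
        plt.savefig("occupancy_timeline.png")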

  5. Conservation and restoration of natural building stones monitored through non-destructive X-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Jacobs, P. Js; Cnudde, V.

    2003-04-01

    X-ray computed micro-tomography (μCT) is a promising non-destructive imaging technique to study building materials. μCT analysis provides information on the internal structure and petrophysical properties of small samples (size up to 2 cm diameter and 6 cm height), with to date a maximum resolution of 10 μm for commercial systems (Skyscan 1072). μCT allows visualising and measuring complete three-dimensional object structures without sample preparation. Possible applications of the μCT technique for the monitoring of natural building stones are multiple: (i) to determine porosity non-destructively based on 3D images, (ii) to visualise weathering phenomena at the μ-scale, (iii) to understand the rationale of weathering processes, (iv) to visualise the presence of water repellents and consolidation products, (v) to monitor the protective effects of these products during weathering in order to understand the underlying weathering mechanisms and (vi) to provide advice on the suitability of products for the treatment of a particular rock type. The μCT technique in combination with micro-Raman spectroscopy could prove to be a powerful tool for the future, as the combination of 3D visualisation and 2D chemical determination of inorganic as well as organic components could provide new insights to optimise conservation and restoration techniques of building materials. Determining the penetration depth of restoration products, used to consolidate or to protect natural building stones from weathering, is crucial if the application of conservation products is planned. Every type of natural building stone has its own petrophysical characteristics and each rock type reacts differently to the various restoration products available on the market. To assess the penetration depth and the effectiveness of a certain restoration product, μCT technology in combination with micro-Raman spectroscopy could be applied. Due to its non-destructive character and its resolution down to porosity scale, the technology of μCT offers a large potential for application. These principles will be demonstrated for Maastricht limestone and Bray sandstone, which have been selected for this study because of their high porosity and their very pure composition.

  6. Interactive Notebooks for Operational Meteorology

    NASA Astrophysics Data System (ADS)

    Prudden, R.; Robinson, N.; Arribas, A.

    2016-12-01

    Operational meteorologists are under pressure to analyse weather forecast data quickly and accurately. However, the volume and complexity of the data make this a challenge. In particular, the number of relevant dimensions (spatial, time, altitude, and ensemble member) makes visualising the data difficult. It is also desirable to combine other data sources, such as climatology, which complicates things further. Interactive notebooks such as Jupyter can provide a highly flexible tool for this kind of data exploration. The power of this approach is that it gives the user full control over their analysis, providing a simple way to combine different maps, plots and graphs using existing libraries. This talk will demonstrate how notebooks can be useful in operational meteorology, and show some examples of interactive data visualisation.
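
    A minimal sketch of this style of notebook interactivity, assuming a Jupyter environment with ipywidgets and matplotlib installed, is given below; the ensemble array is random placeholder data standing in for real forecast fields.

        # Intended to run in a Jupyter notebook cell: sliders pick the ensemble member
        # and lead time, and the map redraws on each change. The data are placeholders.
        import numpy as np
        import matplotlib.pyplot as plt
        from ipywidgets import interact, IntSlider

        rng = np.random.default_rng(42)
        # placeholder ensemble with dimensions (member, lead time, y, x)
        ensemble = rng.normal(size=(10, 24, 50, 50)).cumsum(axis=1)

        @interact(member=IntSlider(0, 0, 9), lead_time=IntSlider(0, 0, 23))
        def show_field(member, lead_time):
            plt.imshow(ensemble[member, lead_time], origin="lower")
            plt.title(f"Member {member}, T+{lead_time} h")
            plt.colorbar()
            plt.show()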

  7. EpiContactTrace: an R-package for contact tracing during livestock disease outbreaks and for risk-based surveillance.

    PubMed

    Nöremark, Maria; Widgren, Stefan

    2014-03-17

    During outbreaks of livestock diseases, contact tracing can be an important part of disease control. Animal movements can also be of relevance for risk-based surveillance and sampling, i.e. when assessing either the consequences or the likelihood of introduction. In many countries, animal movement data are collected with one of the major objectives being to enable contact tracing. However, an analytical step is often needed to retrieve appropriate information for contact tracing or surveillance. In this study, an open-source tool was developed to structure livestock movement data to facilitate contact tracing in real time during disease outbreaks and for input into risk-based surveillance and sampling. The tool, EpiContactTrace, was written in the R language and uses the network parameters in-degree, out-degree, ingoing contact chain and outgoing contact chain (also called infection chain), which are relevant for forward and backward tracing respectively. The time frames for backward and forward tracing can be specified independently, and the search can be done for one farm at a time or for all farms within the dataset. Different outputs are available: datasets with network measures, contacts visualised on a map, and automatically generated reports for each farm in either HTML or PDF format, intended for the end-users, i.e. the veterinary authorities, regional disease control officers and field veterinarians. EpiContactTrace is available as an R package at the R-project website (http://cran.r-project.org/web/packages/EpiContactTrace/). We believe this tool can help in disease control since it can rapidly structure essential contact information from large datasets. The reproducible reports make this tool robust and independent of manual compilation of data. Being open source makes it accessible and easily adaptable for different needs.
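
    The sketch below is not the EpiContactTrace R package, but a small Python/networkx illustration of the network measures it reports (in-degree, out-degree and contact chains) on an invented farm-to-farm movement network.

        # Illustration of in-degree, out-degree and contact chains on an invented
        # animal-movement network, with movements as directed farm-to-farm edges.
        import networkx as nx

        movements = [("A", "B"), ("B", "C"), ("C", "D"), ("E", "B"), ("D", "F")]
        G = nx.DiGraph(movements)

        farm = "B"
        print("in-degree:", G.in_degree(farm))                     # direct sources of animals
        print("out-degree:", G.out_degree(farm))                   # direct destinations
        print("ingoing contact chain:", nx.ancestors(G, farm))     # all farms that can reach B
        print("outgoing contact chain:", nx.descendants(G, farm))  # all farms reachable from B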

  8. Cytoscape.js: a graph theory library for visualisation and analysis.

    PubMed

    Franz, Max; Lopes, Christian T; Huck, Gerardo; Dong, Yue; Sumer, Onur; Bader, Gary D

    2016-01-15

    Cytoscape.js is an open-source JavaScript-based graph library. Its most common use case is as a visualization software component, so it can be used to render interactive graphs in a web browser. It also can be used in a headless manner, useful for graph operations on a server, such as Node.js. Cytoscape.js is implemented in JavaScript. Documentation, downloads and source code are available at http://js.cytoscape.org. gary.bader@utoronto.ca. © The Author 2015. Published by Oxford University Press.

  9. Visualising Disability in the Past

    ERIC Educational Resources Information Center

    Devlieger, Patrick; Grosvenor, Ian; Simon, Frank; Van Hove, Geert; Vanobbergen, Bruno

    2008-01-01

    In recent years there has been a growth in interdisciplinary work which has argued that disability is not an isolated, individual medical pathology but instead a key defining social category like "race", class and gender. Seen in this way disability provides researchers with another analytic tool for exploring the nature of power. Running almost…

  10. User-driven Cloud Implementation of environmental models and data for all

    NASA Astrophysics Data System (ADS)

    Gurney, R. J.; Percy, B. J.; Elkhatib, Y.; Blair, G. S.

    2014-12-01

    Environmental data and models come from disparate sources over a variety of geographical and temporal scales with different resolutions and data standards, often including terabytes of data and model simulations. Unfortunately, these data and models tend to remain solely within the custody of the private and public organisations which create them, and of the scientists who build the models and generate results. Although many models and datasets are theoretically available to others, the lack of ease of access tends to keep them out of reach of many. We have developed an intuitive web-based tool that utilises environmental models and datasets located in a cloud to produce results that are appropriate to the user. Storyboards showing the interfaces and visualisations have been created for each of several exemplars. A library of virtual machine images has been prepared to serve these exemplars. Each virtual machine image has been tailored to run computer models appropriate to the end user. Two approaches have been used: first, RESTful web services conforming to the Open Geospatial Consortium (OGC) Web Processing Service (WPS) interface standard, implemented using the Python-based PyWPS; second, a MySQL database interrogated using PHP code. In all cases, the web client sends the server an HTTP GET request to execute the process with a number of parameter values and, once execution terminates, an XML or JSON response is sent back and parsed at the client side to extract the results. All web services are stateless, i.e. application state is not maintained by the server, reducing its operational overheads and simplifying infrastructure management tasks such as load balancing and failure recovery. A hybrid cloud solution has been used, with models and data sited on both private and public clouds. The storyboards have been transformed into intuitive web interfaces at the client side using HTML, CSS and JavaScript, utilising plug-ins such as jQuery and Flot (for graphics), and Google Maps APIs. We have demonstrated that a cloud infrastructure can be used to assemble a virtual research environment that, coupled with a user-driven development approach, is able to cater to the needs of a wide range of user groups, from domain experts to concerned members of the general public.
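
    A hedged sketch of the RESTful pattern described above follows: an HTTP GET against a WPS 1.0 endpoint with key-value parameters, followed by parsing of the XML response. The endpoint, process identifier and inputs are hypothetical placeholders, not the project's actual services.

        # GET request to a hypothetical WPS 1.0 endpoint using key-value encoding,
        # then parsing the XML response on the client side.
        import requests
        import xml.etree.ElementTree as ET

        WPS_URL = "https://example.org/wps"  # hypothetical PyWPS endpoint

        params = {
            "service": "WPS",
            "version": "1.0.0",
            "request": "Execute",
            "identifier": "catchment_runoff",          # hypothetical process name
            "datainputs": "catchment_id=42;start=2014-01-01;end=2014-12-31",
        }

        response = requests.get(WPS_URL, params=params, timeout=60)
        response.raise_for_status()

        root = ET.fromstring(response.content)
        ns = {"wps": "http://www.opengis.net/wps/1.0.0"}
        for output in root.findall(".//wps:Output", ns):
            print(ET.tostring(output, encoding="unicode")[:200])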

  11. Visualizing and communicating uncertainty in the earth and environmental sciences: a review

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer

    2014-05-01

    I will review past attempts at visualising uncertainty in spatial or spatio-temporal predictions of groundwater quality, quality predictions, sea bed sediment, bird densities, air quality measurements, and exposure to air quality of individuals and populations. The attempts involved software development (aguila [1], greenland [2]), the development of standards for communicating uncertain spatial and spatio-temporal information (UncertML, [3]), and have been illustrated by applications in a number of EU projects (Apmosphere [4], INTAMAP [5], UncertWeb [6] and GeoViQua [7]). I will also report on usability studies that were carried out (e.g. [8]). [1] http://pcraster.geo.uu.nl/projects/developments/aguila/ [2] https://wiki.52north.org/bin/view/Geostatistics/Greenland [3] http://www.uncertml.org/ [4] http://www.apmosphere.org/ [5] http://www.intamap.org/ [6] http://www.uncertweb.org/ [7] http://www.geoviqua.org/ [8] Senaratne, H., L. Gerharz, E. Pebesma, A. Schwering, 2012. Usability of Spatio-Temporal Uncertainty Visualisation Methods. In: Bridging the Geographic Information Sciences, Lecture Notes in Geoinformation and Cartography, J. Gensel, D. Josselin and D. Vandenbroucke. Springer Berlin Heidelberg.

  12. Live Social Semantics

    NASA Astrophysics Data System (ADS)

    Alani, Harith; Szomszor, Martin; Cattuto, Ciro; van den Broeck, Wouter; Correndo, Gianluca; Barrat, Alain

    Social interactions are one of the key factors in the success of conferences and similar community gatherings. This paper describes a novel application that integrates data from the semantic web, online social networks, and a real-world contact sensing platform. This application was successfully deployed at ESWC09, and actively used by 139 people. Personal profiles of the participants were automatically generated using several Web 2.0 systems and semantic academic data sources, and integrated in real-time with face-to-face contact networks derived from wearable sensors. Integration of all these heterogeneous data layers made it possible to offer conference attendees various services to enhance their social experience, such as visualisation of contact data and a site to explore and connect with other participants. This paper describes the architecture of the application, the services we provided, and the results we achieved in this deployment.

  13. Interactive Parallel Data Analysis within Data-Centric Cluster Facilities using the IPython Notebook

    NASA Astrophysics Data System (ADS)

    Pascoe, S.; Lansdowne, J.; Iwi, A.; Stephens, A.; Kershaw, P.

    2012-12-01

    The data deluge is making traditional analysis workflows for many researchers obsolete. Support for parallelism within popular tools such as MATLAB, IDL and NCO is not well developed and is rarely used. However, parallelism is necessary for processing modern data volumes on a timescale conducive to curiosity-driven analysis. Furthermore, for peta-scale datasets such as the CMIP5 archive, it is no longer practical to bring an entire dataset to a researcher's workstation for analysis, or even to their institutional cluster. Therefore, there is an increasing need to develop new analysis platforms which both enable processing at the point of data storage and provide parallelism. Such an environment should, where possible, maintain the convenience and familiarity of our current analysis environments to encourage curiosity-driven research. We describe how we are combining the interactive Python shell (IPython) with our JASMIN data-cluster infrastructure. IPython has been specifically designed to bridge the gap between HPC-style parallel workflows and the opportunistic curiosity-driven analysis usually carried out using domain-specific languages and scriptable tools. IPython offers a web-based interactive environment, the IPython notebook, and a cluster engine for parallelism, all underpinned by the well-respected Python/Scipy scientific programming stack. JASMIN is designed to support the data analysis requirements of the UK and European climate and earth system modelling community. JASMIN, with its sister facility CEMS focusing on the earth observation community, has 4.5 PB of fast parallel disk storage alongside over 370 computing cores providing local computation. Through the IPython interface to JASMIN, users can make efficient use of JASMIN's multi-core virtual machines to perform interactive analysis on all cores simultaneously, or can configure IPython clusters across multiple VMs. Larger-scale clusters can be provisioned through JASMIN's batch scheduling system. Outputs can be summarised and visualised using the full power of Python's many scientific tools, including Scipy, Matplotlib, Pandas and CDAT. This rich user experience is delivered through the user's web browser, maintaining the interactive feel of a workstation-based environment with the parallel power of a remote data-centric processing facility.
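
    A minimal sketch of the cluster-engine workflow described above is shown below, using the ipyparallel package (the successor to IPython.parallel); it assumes a set of engines has already been started (e.g. with 'ipcluster start -n 8') and uses random placeholder data in place of archive files.

        # Map a per-chunk summary function across IPython engines and gather the results.
        # Assumes engines are already running; the data are placeholders, not CMIP5 files.
        import ipyparallel as ipp
        import numpy as np

        rc = ipp.Client()                    # connect to the running engines
        view = rc.load_balanced_view()

        def summarise_chunk(seed):
            """Stand-in for per-file analysis of a large archive split into chunks."""
            import numpy as np
            data = np.random.default_rng(seed).normal(size=1_000_000)
            return data.mean(), data.std()

        # Distribute the work across all engines and gather the results interactively.
        results = view.map_sync(summarise_chunk, range(32))
        means = np.array([m for m, _ in results])
        print("grand mean:", means.mean())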

  14. Facilitating participatory multilevel decision-making by using interactive mental maps.

    PubMed

    Pfeiffer, Constanze; Glaser, Stephanie; Vencatesan, Jayshree; Schliermann-Kraus, Elke; Drescher, Axel; Glaser, Rüdiger

    2008-11-01

    Participation of citizens in political, economic or social decisions is increasingly recognized as a precondition for fostering sustainable development processes. Since spatial information is often important during planning and decision making, participatory mapping is gaining in popularity. However, little attention has been paid to the fact that information must be presented in a useful way if it is to reach city planners and policy makers. Above all, the importance of visualisation tools for supporting collaboration, analytical reasoning, problem solving and decision-making in analysis and planning processes has been underestimated. In this paper, we describe how an interactive mental-map tool was developed in a highly interdisciplinary disaster management project in Chennai, India. We moved from a hand-drawn mental-maps approach to an interactive mental-map tool. This was achieved by merging socio-economic and geospatial data on infrastructure, local perceptions, and coping and adaptation strategies with remote sensing data and modern map-making technology. The newly developed interactive mapping tool allowed insights into different locally constructed realities and facilitated the communication of results to the wider public and the respective policy makers. It proved to be useful in visualising information and promoting participatory decision-making processes. We argue that the tool also bears potential for health research projects. The interactive mental map can be used to spatially and temporally assess key health themes such as availability of, and accessibility to, existing health care services, breeding sites of disease vectors, collection and storage of water, waste disposal, and the location of public toilets or defecation sites.

  15. Applying Sensor Web Technology to Marine Sensor Data

    NASA Astrophysics Data System (ADS)

    Jirka, Simon; del Rio, Joaquin; Mihai Toma, Daniel; Nüst, Daniel; Stasch, Christoph; Delory, Eric

    2015-04-01

    In this contribution we present two activities illustrating how Sensor Web technology helps to enable flexible and interoperable sharing of marine observation data based on standards. An important foundation is the Sensor Web architecture developed by the European FP7 project NeXOS (Next generation Low-Cost Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management). This architecture relies on the Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) framework. It is an exemplary solution for facilitating the interoperable exchange of marine observation data within and between (research) organisations. The architecture addresses a series of functional and non-functional requirements which are fulfilled through different types of OGC SWE components. The diverse functionalities offered by the NeXOS Sensor Web architecture are shown in the following overview:
    - Pull-based observation data download: achieved through the OGC Sensor Observation Service (SOS) 2.0 interface standard.
    - Push-based delivery of observation data, allowing users to subscribe to new measurements that are relevant for them: for this purpose several specification activities are currently under evaluation (e.g. OGC Sensor Event Service, OGC Publish/Subscribe Standards Working Group).
    - (Web-based) visualisation of marine observation data: implemented through SOS client applications.
    - Configuration and control of sensor devices: ensured through the OGC Sensor Planning Service 2.0 interface.
    - Bridging between sensors/data loggers and Sensor Web components: for this purpose several components such as the "Smart Electronic Interface for Sensor Interoperability" (SEISI) concept are being developed; this is complemented by a more lightweight SOS extension (e.g. based on the W3C Efficient XML Interchange (EXI) format).
    To further advance this architecture, there is ongoing work to develop dedicated profiles of selected OGC SWE specifications that provide stricter guidance on how these standards shall be applied to marine data (e.g. SensorML 2.0 profiles stating which metadata elements are mandatory, building upon the ESONET Sensor Registry developments). Within the NeXOS project the presented architecture is implemented as a set of open-source components. These implementations can be re-used by all interested scientists and data providers needing tools for publishing or consuming oceanographic sensor data. In further projects such as the European project FixO3 (Fixed-point Open Ocean Observatories), these software development activities are complemented with additional efforts to provide guidance on how Sensor Web technology can be applied in an efficient manner. In this way, not only software components are made available, but also documentation and information resources that help users understand which types of Sensor Web deployments are best suited to fulfilling different types of user requirements.
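
    As a hedged illustration of the pull-based access pattern, the sketch below queries an OGC Sensor Observation Service with the OWSLib Python library; the endpoint URL is a hypothetical placeholder, and attribute availability can vary between SOS versions and server implementations.

        # Query an SOS endpoint and list its offerings (one per sensor/platform stream).
        # The URL is a hypothetical placeholder, not a NeXOS service address.
        from owslib.sos import SensorObservationService

        SOS_URL = "https://example.org/52n-sos/service"   # hypothetical marine SOS endpoint

        sos = SensorObservationService(SOS_URL, version="2.0.0")
        print("Service:", sos.identification.title)

        for offering_id, offering in sos.contents.items():
            # observed_properties may not be populated on every server/version
            print(offering_id, "->", getattr(offering, "observed_properties", []))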

  16. Rapid Offline-Online Post-Disaster Landslide Mapping Tool: A case study from Nepal

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; Jaboyedoff, Michel; Sudmeier-Rieux, Karen; Derron, Marc-Henri; Devkota, Sanjaya

    2016-04-01

    One of the crucial components of post-disaster management is the efficient mapping of impacted areas. Here we present a tool designed to map landslides and affected objects after the 2015 earthquakes in Nepal, as well as the impacts of intense rainfall. Because internet access is not available in many rural areas of Nepal, we developed an offline-online prototype based on open-source WebGIS technologies to make data on hazard impacts, including damaged infrastructure, landslides or flooding events, available to authorities and the general public. This mobile application was designed as a low-cost, rapid and participatory method for recording impacts from hazard events. It is possible to record such events offline and upload them to a server when an internet connection is available. The application allows user authentication, image capturing, and information collation such as geolocation, event description and interactive mapping, and finally stores all the data on the server for further analysis and visualisation. The application can be accessed from a mobile phone (Android) or a tablet as a hybrid application with both offline and online versions. The offline version has an interactive offline map function which allows users to upload satellite images in order to improve ground-truthing interpretation. After geolocation, the user can start mapping and then save the recorded data into Geojson-TXT files that can be easily uploaded to the server whenever internet access is available. The prototype was tested specifically for a rapid assessment of landslides and relevant land use characteristics such as roads, forest areas and rivers in the Phewa Lake watershed near Pokhara, Nepal, where a large number of landslides were activated or reactivated after the 2015 monsoon season. More than 60 landslides were recorded during a two-day field trip. The application can also be used for other kinds of hazard events such as floods or avalanches. Keywords: Offline, Online, Open source, WebGIS, Android, Post-Disaster, Landslide mapping
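
    A minimal sketch of the kind of GeoJSON record described above is given below: a single mapped landslide stored as a GeoJSON Feature and written to a text file for later upload; the field names and coordinates are illustrative assumptions, not the application's actual schema.

        # Write one mapped hazard event as a GeoJSON FeatureCollection to a text file.
        # Field names and coordinates are illustrative assumptions only.
        import json
        from datetime import datetime, timezone

        feature = {
            "type": "Feature",
            "geometry": {
                "type": "Point",
                "coordinates": [83.9590, 28.2298],            # lon, lat (placeholder)
            },
            "properties": {
                "hazard": "landslide",
                "description": "Reactivated slide cutting the access road",
                "recorded_at": datetime.now(timezone.utc).isoformat(),
                "photo": "IMG_0042.jpg",
            },
        }

        with open("landslide_0042.geojson.txt", "w", encoding="utf-8") as f:
            json.dump({"type": "FeatureCollection", "features": [feature]}, f, indent=2)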

  17. Scan for Motifs: a webserver for the analysis of post-transcriptional regulatory elements in the 3' untranslated regions (3' UTRs) of mRNAs.

    PubMed

    Biswas, Ambarish; Brown, Chris M

    2014-06-08

    Gene expression in vertebrate cells may be controlled post-transcriptionally through regulatory elements in mRNAs. These are usually located in the untranslated regions (UTRs) of mRNA sequences, particularly the 3'UTRs. Scan for Motifs (SFM) simplifies the process of identifying a wide range of regulatory elements on alignments of vertebrate 3'UTRs. SFM includes identification of both RNA Binding Protein (RBP) sites and targets of miRNAs. In addition to searching pre-computed alignments, the tool provides users the flexibility to search their own sequences or alignments. The regulatory elements may be filtered by expected value cutoffs and are cross-referenced back to their respective sources and literature. The output is an interactive graphical representation, highlighting potential regulatory elements and overlaps between them. The output also provides simple statistics and links to related resources for complementary analyses. The overall process is intuitive and fast. As SFM is a free web-application, the user does not need to install any software or databases. Visualisation of the binding sites of different classes of effectors that bind to 3'UTRs will facilitate the study of regulatory elements in 3' UTRs.

  18. Investigation of Interactive Online Visual Tools for the Learning of Mathematics

    ERIC Educational Resources Information Center

    Jacobs, K. L.

    2005-01-01

    For many years, educators have been discussing benefits of educational practices such as the use of real-world examples, visualisation, interactivity, constructivism, self-paced learning and self-paced testing. Macromedia Flash MX has been used to develop online modules for the course Differential Equations offered at the University of South…

  19. Using Visualisation Software to Improve Student Approaches to HE Online Assessment

    ERIC Educational Resources Information Center

    Smith, David; Qayyum, M. Aslm; Hard, Natascha

    2017-01-01

    Studying via the Internet using information tools is a common activity for students in higher education. With students accessing their subject material via the Internet, studies have shown that students have difficulty understanding the complete purpose of an assessment which leads to poor information search practices. The selection of relevant…

  20. Colour Size Illusion on Liquid Crystal Displays and Design Guidelines for Bioinformatics Tools

    ERIC Educational Resources Information Center

    Yoo, Hyun Seung; Smith-Jackson, Tonya L.

    2011-01-01

    Although the influence of colour on size perception has been known for a century, there is only limited research on interventions that can reduce this effect. This study was therefore undertaken in order to identify appropriate interventions and propose design guidelines for information visualisation, especially in applications where size…

  1. A Paradigm for Promoting Visual Synthesis through Freehand Sketching

    ERIC Educational Resources Information Center

    Lane, Diarmaid; Seery, Niall; Gordon, Seamus

    2010-01-01

    Research (Fish, 2004) suggests that everybody should be taught how to freehand sketch and utilise it as a tool for supporting the visualising instinct. A fundamental shift in philosophy of the technology education system in Ireland towards design driven subjects brought with it a need to develop practising teacher's technological capabilities.…

  2. Novel application of three-dimensional technologies in a case of dismemberment.

    PubMed

    Baier, Waltraud; Norman, Danielle G; Warnett, Jason M; Payne, Mark; Harrison, Nigel P; Hunt, Nicholas C A; Burnett, Brian A; Williams, Mark A

    2017-01-01

    This case study reports the novel application of three-dimensional technologies such as micro-CT and 3D printing to the forensic investigation of a complex case of dismemberment. Micro-CT was successfully employed to virtually align severed skeletal elements found in different locations, analyse tool marks created during the dismemberment process, and virtually dissect a charred piece of evidence. High-resolution 3D prints of the burnt human bone contained within were created for physical visualisation to assist the investigation team. Micro-CT as a forensic radiological method provided vital information and the basis for visualisation both during the investigation and in the subsequent trial, making it one of the first examples of the use of such technology in a UK court. Copyright © 2016. Published by Elsevier B.V.

  3. A novel alternative method for 3D visualisation in Parasitology: the construction of a 3D model of a parasite from 2D illustrations.

    PubMed

    Teo, B G; Sarinder, K K S; Lim, L H S

    2010-08-01

    Three-dimensional (3D) models of the marginal hooks, dorsal and ventral anchors, bars and haptoral reservoirs of a parasite, Sundatrema langkawiense Lim & Gibson, 2009 (Monogenea) were developed using the polygonal modelling method in Autodesk 3ds Max (Version 9) based on two-dimensional (2D) illustrations. Maxscripts were written to rotate the modelled 3D structures. Appropriately orientated 3D haptoral hard-parts were then selected and positioned within the transparent 3D outline of the haptor and grouped together to form a complete 3D haptoral entity. This technique is an inexpensive tool for constructing 3D models from 2D illustrations for 3D visualisation of the spatial relationships between the different structural parts within organisms.

  4. Making better scientific figures

    NASA Astrophysics Data System (ADS)

    Hawkins, Ed; McNeall, Doug

    2016-04-01

    In the words of the UK government chief scientific adviser "Science is not finished until it's communicated" (Walport 2013). The tools to produce good visual communication have never been more easily accessible to scientists than they are at present. Correspondingly, it has never been easier to produce and disseminate poor graphics. In this presentation, we highlight some good practice and offer some practical advice in preparing scientific figures for presentation to peers or to the public. We identify common mistakes in visualisation, including some made by the authors, and offer some good reasons not to trust defaults in graphics software. In particular, we discuss the use of colour scales and share our experiences in running a social media campaign (http://tiny.cc/endrainbow) to replace the "rainbow" (also "jet", or "spectral") colour scale as the default in (climate) scientific visualisation.
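
    As a minimal, hedged illustration of the advice above (not code from the presentation itself), the following Python/matplotlib sketch renders the same synthetic field with the "jet" rainbow colormap criticised here and with the perceptually uniform "viridis" colormap; the data array and output file name are invented for the example.

        # Minimal sketch (assumes numpy and matplotlib are installed): compare the
        # "jet" rainbow colormap with the perceptually uniform "viridis" colormap.
        import numpy as np
        import matplotlib.pyplot as plt

        x, y = np.meshgrid(np.linspace(-3, 3, 200), np.linspace(-3, 3, 200))
        field = np.exp(-(x**2 + y**2) / 2) * np.cos(3 * x)   # synthetic example field

        fig, axes = plt.subplots(1, 2, figsize=(9, 4))
        for ax, cmap in zip(axes, ["jet", "viridis"]):
            im = ax.pcolormesh(x, y, field, cmap=cmap, shading="auto")
            ax.set_title(f"cmap = {cmap}")
            fig.colorbar(im, ax=ax)
        fig.tight_layout()
        plt.savefig("colormap_comparison.png", dpi=150)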

  5. Groundwater Visualisation System (GVS): A software framework for integrated display and interrogation of conceptual hydrogeological models, data and time-series animation

    NASA Astrophysics Data System (ADS)

    Cox, Malcolm E.; James, Allan; Hawke, Amy; Raiber, Matthias

    2013-05-01

    Management of groundwater systems requires realistic conceptual hydrogeological models as a framework for numerical simulation modelling, but also for system understanding and communicating this to stakeholders and the broader community. To help overcome these challenges we developed GVS (Groundwater Visualisation System), a stand-alone desktop software package that uses interactive 3D visualisation and animation techniques. The goal was a user-friendly groundwater management tool that could support a range of existing real-world and pre-processed data, both surface and subsurface, including geology and various types of temporal hydrological information. GVS allows these data to be integrated into a single conceptual hydrogeological model. In addition, 3D geological models produced externally using other software packages can readily be imported into GVS models, as can outputs of simulations (e.g. piezometric surfaces) produced by software such as MODFLOW or FEFLOW. Boreholes can be integrated, showing any down-hole data and properties, including screen information, intersected geology, water level data and water chemistry. Animation is used to display spatial and temporal changes, with time-series data such as rainfall, standing water levels and electrical conductivity, displaying dynamic processes. Time and space variations can be presented using a range of contouring and colour mapping techniques, in addition to interactive plots of time-series parameters. Other types of data, for example, demographics and cultural information, can also be readily incorporated. The GVS software can execute on a standard Windows or Linux-based PC with a minimum of 2 GB RAM, and the model output is easy and inexpensive to distribute, by download or via USB/DVD/CD. Example models are described here for three groundwater systems in Queensland, northeastern Australia: two unconfined alluvial groundwater systems with intensive irrigation, the Lockyer Valley and the upper Condamine Valley, and the Surat Basin, a large sedimentary basin of confined artesian aquifers. This latter example required more detail in the hydrostratigraphy, correlation of formations with drillholes and visualisation of simulation piezometric surfaces. Both alluvial system GVS models were developed during drought conditions to support government strategies to implement groundwater management. The Surat Basin model was industry-sponsored research, for coal seam gas groundwater management and community information and consultation. The "virtual" groundwater systems in these 3D GVS models can be interactively interrogated by standard functions, plus production of 2D cross-sections, data selection from the 3D scene, back-end database and plot displays. A unique feature is that GVS allows investigation of time-series data across different display modes, both 2D and 3D. GVS has been used successfully as a tool to enhance community/stakeholder understanding and knowledge of groundwater systems and is of value for training and educational purposes. Completed projects confirm that GVS provides powerful support for management and decision making, and serves as a tool for interpretation of groundwater system hydrological processes. A highly effective visualisation output is the production of short videos (e.g. 2-5 min) based on sequences of camera 'fly-throughs' and screen images. Further work involves developing support for multi-screen displays and touch-screen technologies, distributed rendering, and gestural interaction systems.
To highlight the visualisation and animation capability of the GVS software, links to related multimedia hosted online sites are included in the references.

  6. European seismological data exchange, access and processing: current status of the Research Infrastructure project NERIES

    NASA Astrophysics Data System (ADS)

    Giardini, D.; van Eck, T.; Bossu, R.; Wiemer, S.

    2009-04-01

    The EC Research infrastructure project NERIES, an Integrated Infrastructure Initiative in seismology for 2006-2010, has passed its mid-term point. We will present a short, concise overview of the current state of the project, established cooperation with other European and global projects and the planning for the last year of the project. Earthquake data archiving and access within Europe has dramatically improved during the last two years. This concerns earthquake parameters, digital broadband and acceleration waveforms and historical data. The Virtual European Broadband Seismic Network (VEBSN) consists currently of more than 300 stations. A new distributed data archive concept, the European Integrated Waveform Data Archive (EIDA), has been implemented in Europe connecting the larger European seismological waveform data archives. Global standards for earthquake parameter data (QuakeML) and tomography models have been developed and are being established. Web application technology has been and is being developed to jump-start the next generation of data services. A NERIES data portal provides a number of services testing the potential capacities of new open-source web technologies. Data application tools like shakemaps, lossmaps, site response estimation and tools for data processing and visualisation are currently available, although some of these tools are still in an alpha version. A European tomography reference model will be discussed at a special workshop in June 2009. Shakemaps, consistent with the NEIC application, have been implemented in several countries, including Turkey, Italy, Romania and Switzerland. The comprehensive site response software is being distributed and used both inside and outside the project. NERIES organises several workshops inviting both consortium and non-consortium participants and covering a wide range of subjects: 'Seismological observatory operation tools', 'Tomography', 'Ocean bottom observatories', 'Site response software training', 'Historical earthquake catalogues', 'Distribution of acceleration data', etc. Some of these workshops are coordinated with other organisations/projects, like ORFEUS, ESONET, IRIS, etc. NERIES still offers grants to individual researchers or groups to work at facilities such as the Swiss national seismological network (SED/ETHZ, Switzerland), the CEA/DASE facilities in France, the data scanning facilities at INGV (SISMOS), the array facilities of NORSAR (Norway) and the new Conrad Facility in Austria.

  7. High-speed flow visualization in hypersonic, transonic, and shock tube flows

    NASA Astrophysics Data System (ADS)

    Kleine, H.; Olivier, H.

    2017-02-01

    High-speed flow visualisation has played an important role in the investigations conducted at the Stoßwellenlabor of the RWTH Aachen University for many decades. In addition to applying the techniques of high-speed imaging, this laboratory has been actively developing new or enhanced visualisation techniques and approaches such as various schlieren methods or time-resolved Mach-Zehnder interferometry. The investigated high-speed flows are inherently highly transient, with flow Mach numbers ranging from about M = 0.7 to M = 8. The availability of modern high-speed cameras has allowed us to expand the investigations into problems where reduced reproducibility had so far limited the amount of information that could be extracted from a limited number of flow visualisation records. Following a brief historical overview, some examples of recent studies are given, which represent the breadth of applications in which high-speed imaging has been an essential diagnostic tool to uncover the physics of high-speed flows. Applications include the stability of hypersonic corner flows, the establishment of shock wave systems in transonic airfoil flow, and the complexities of the interactions of shock waves with obstacles of various shapes.

  8. Visualising linked health data to explore health events around preventable hospitalisations in NSW Australia

    PubMed Central

    Jorm, Louisa R; Leyland, Alastair H

    2016-01-01

    Objective: To explore patterns of health service use in the lead-up to, and following, admission for a 'preventable' hospitalisation. Setting: 266 950 participants in the 45 and Up Study, New South Wales (NSW), Australia. Methods: Linked data on hospital admissions, general practitioner (GP) visits and other health events were used to create visual representations of health service use. For each participant, health events were plotted against time, with different events juxtaposed using different markers and panels of data. Various visualisations were explored by patient characteristics, and compared with a cohort of non-admitted participants matched on sociodemographic and health characteristics. Health events were displayed over calendar year and in the 90 days surrounding first preventable hospitalisation. Results: The visualisations revealed patterns of clustering of GP consultations in the lead-up to, and following, preventable hospitalisation, with 14% of patients having a consultation on the day of admission and 27% in the prior week. There was a clustering of deaths and other hospitalisations following discharge, particularly for patients with a long length of stay, suggesting patients may have been in a state of health deterioration. Specialist consultations were primarily clustered during the period of hospitalisation. Rates of all health events were higher in patients admitted for a preventable hospitalisation than in the matched non-admitted cohort. Conclusions: We did not find evidence of limited use of primary care services in the lead-up to a preventable hospitalisation; rather, people with preventable hospitalisations tended to have high levels of engagement with multiple elements of the healthcare system. As such, preventable hospitalisations might be better used as a tool for identifying sicker patients for managed care programmes. Visualising longitudinal health data was found to be a powerful strategy for uncovering patterns of health service use, and such visualisations have potential to be more widely adopted in health services research. PMID:27604087
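
    The event-timeline visualisation described above can be sketched generically as follows; this is a hedged Python/matplotlib illustration with invented example events (not records from the 45 and Up Study), plotting each event type on its own row against time with a distinct marker.

        # Sketch of a per-patient event timeline: events plotted against date,
        # one row and one marker style per event type (dates below are invented).
        import datetime as dt
        import matplotlib.pyplot as plt

        events = {
            "GP visit":        [dt.date(2015, 3, 2), dt.date(2015, 3, 20), dt.date(2015, 4, 1)],
            "Specialist":      [dt.date(2015, 4, 3)],
            "Hospitalisation": [dt.date(2015, 4, 2)],
        }
        markers = {"GP visit": "o", "Specialist": "s", "Hospitalisation": "x"}

        fig, ax = plt.subplots(figsize=(8, 2.5))
        for row, (event_type, dates) in enumerate(events.items()):
            ax.plot(dates, [row] * len(dates), markers[event_type], label=event_type)
        ax.set_yticks(range(len(events)))
        ax.set_yticklabels(list(events.keys()))
        ax.set_xlabel("Date")
        ax.legend(loc="upper right", fontsize="small")
        fig.tight_layout()
        plt.savefig("patient_timeline.png", dpi=150)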

  9. Visualisation of upper limb activity using spirals: A new approach to the assessment of daily prosthesis usage.

    PubMed

    Chadwell, Alix; Kenney, Laurence; Granat, Malcolm; Thies, Sibylle; Head, John S; Galpin, Adam

    2018-02-01

    Current outcome measures used in upper limb myoelectric prosthesis studies include clinical tests of function and self-report questionnaires on real-world prosthesis use. Research in other cohorts has questioned both the validity of self-report as an activity assessment tool and the relationship between clinical functionality and real-world upper limb activity. Previously, we reported the first results of monitoring upper limb prosthesis use. However, the data visualisation technique used was limited in scope. This methodology-development study introduces two new methods for the analysis and display of upper limb activity monitoring data and demonstrates the potential value of the approach with example real-world data. Upper limb activity monitors, worn on each wrist, recorded data on two anatomically intact participants and two prosthesis users over 1 week. Participants also filled in a diary to record upper limb activity. Data visualisation was carried out using histograms and Archimedean spirals to illustrate temporal patterns of upper limb activity. Anatomically intact participants' activity was largely bilateral in nature, interspersed with frequent bursts of unilateral activity of each arm. At times when the prosthesis was worn, prosthesis users showed very little unilateral use of the prosthesis (≈20-40 min/week compared to ≈350 min/week unilateral activity on each arm for anatomically intact participants), with consistent bias towards the intact arm throughout. The Archimedean spiral plots illustrated participant-specific patterns of non-use in prosthesis users. The data visualisation techniques allow detailed and objective assessment of temporal patterns in the upper limb activity of prosthesis users. Clinical relevance: Activity monitoring offers an objective method for the assessment of upper limb prosthesis users' (PUs) activity outside of the clinic. By plotting data using Archimedean spirals, it is possible to visualise, in detail, the temporal patterns of upper limb activity. Further work is needed to explore the relationship between traditional functional outcome measures and real-world prosthesis activity.
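
    As a hedged illustration of the Archimedean spiral idea described above (r = a + b·θ, one turn per day), the following Python sketch wraps a week of hourly activity counts onto a spiral; the counts are randomly generated stand-ins, not the study's monitor data.

        # Sketch: wrap a week of hourly activity counts onto an Archimedean spiral
        # (r = a + b*theta), one full turn per day; counts are random stand-ins.
        import numpy as np
        import matplotlib.pyplot as plt

        hours = np.arange(7 * 24)                                     # one week of hourly bins
        activity = np.random.default_rng(0).poisson(20, hours.size)   # fake activity counts

        theta = 2 * np.pi * hours / 24        # one revolution = one day
        r = 1.0 + 0.15 * theta                # Archimedean spiral radius

        ax = plt.subplot(projection="polar")
        points = ax.scatter(theta, r, c=activity, s=12, cmap="viridis")
        plt.colorbar(points, label="activity counts per hour")
        ax.set_yticklabels([])                # radial tick labels add little here
        plt.savefig("activity_spiral.png", dpi=150)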

  10. Addition of simethicone improves small bowel capsule endoscopy visualisation quality.

    PubMed

    Krijbolder, M S; Grooteman, K V; Bogers, S K; de Jong, D J

    2018-01-01

    Small bowel capsule endoscopy (SBCE) is an important diagnostic tool for small-bowel diseases but its quality may be hampered by intraluminal gas. This study evaluated the added value of the anti-foaming agent, simethicone, to a bowel preparation with polyethylene glycol (PEG) on the quality of small bowel visualisation and its use in the Netherlands. This was a retrospective, single-blind, cohort study. Patients in the PEG group only received PEG prior to SBCE. Patients in the PEG-S group ingested additional simethicone. Two investigators assessed the quality of small-bowel visualisation using a four-point scale for 'intraluminal gas' and 'faecal contamination'. By means of a survey, the use of anti-foaming agents was assessed in a random sample of 16 Dutch hospitals performing SBCE. The quality of small bowel visualisation in the PEG group (n = 33) was significantly more limited by intraluminal gas when compared with the PEG-S group (n = 31): proximal segment 83.3% in PEG group vs. 18.5% in PEG-S group (p < 0.01), distal segment 66.7% vs. 18.5% respectively (p < 0.01). No difference was observed in the amount of faecal contamination (proximal segment 80.0% PEG vs. 59.3% PEG-S, p = 0.2; distal segment 90.0% PEG vs. 85.2% PEG-S, p = 0.7), mean small bowel transit times (4.0 PEG vs. 3.9 hours PEG-S, p = 0.7) and diagnostic yield (43.3% PEG vs. 22.2% PEG-S, p = 0.16). Frequency of anti-foaming agent use in the Netherlands was low (3/16, 18.8%). Simethicone is of added value to a PEG bowel preparation in improving the quality of visualisation of the small bowel by reducing intraluminal gas. At present, the use of anti-foaming agents in SBCE preparation is not standard practice in the Netherlands.

  11. SigmoID: a user-friendly tool for improving bacterial genome annotation through analysis of transcription control signals

    PubMed Central

    Damienikan, Aliaksandr U.

    2016-01-01

    The majority of bacterial genome annotations are currently automated and based on a ‘gene by gene’ approach. Regulatory signals and operon structures are rarely taken into account which often results in incomplete and even incorrect gene function assignments. Here we present SigmoID, a cross-platform (OS X, Linux and Windows) open-source application aiming at simplifying the identification of transcription regulatory sites (promoters, transcription factor binding sites and terminators) in bacterial genomes and providing assistance in correcting annotations in accordance with regulatory information. SigmoID combines a user-friendly graphical interface to well known command line tools with a genome browser for visualising regulatory elements in genomic context. Integrated access to online databases with regulatory information (RegPrecise and RegulonDB) and web-based search engines speeds up genome analysis and simplifies correction of genome annotation. We demonstrate some features of SigmoID by constructing a series of regulatory protein binding site profiles for two groups of bacteria: Soft Rot Enterobacteriaceae (Pectobacterium and Dickeya spp.) and Pseudomonas spp. Furthermore, we inferred over 900 transcription factor binding sites and alternative sigma factor promoters in the annotated genome of Pectobacterium atrosepticum. These regulatory signals control putative transcription units covering about 40% of the P. atrosepticum chromosome. Reviewing the annotation in cases where it didn’t fit with regulatory information allowed us to correct product and gene names for over 300 loci. PMID:27257541

  12. A mobile app for delivering in-field soil data for precision agriculture

    NASA Astrophysics Data System (ADS)

    Isaacs, John P.; Stojanovic, Vladeta; Falconer, Ruth E.

    2015-04-01

    In the last decade precision agriculture has grown from a concept to an emerging technology, largely due to the maturing of GPS and mobile mapping. We investigated methods for reliable delivery and display of appropriate and context-aware in-field farm data on mobile devices by developing a prototype Android mobile app. The 3D app was developed using OpenGL ES 2.0 and written in Java, using the Android Development Tools (ADT) SDK. The app is able to obtain GPS coordinates and automatically synchronise the view and load relevant data based on the user's location. The intended audience of the mobile app is farmers and agronomists. Apps are becoming an essential tool in an agricultural professional's arsenal; however, most existing apps are limited to 2D display of data, even though modern mobile chipsets can render 3D graphics at interactive rates using technologies such as WebGL. This project investigated the use of games techniques in the delivery and 3D display of field data, recognising that this may be a departure from the way the field data is currently delivered and displayed to farmers and agronomists. Different interactive 3D visualisation methods presenting spatial and temporal variation in yield values were developed and tested. It is expected that this app can be used by farmers and agronomists to support decision making in precision agriculture, a growing market in the UK and Europe.
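
    The app itself is written in Java with OpenGL ES; purely as a hedged, language-neutral sketch of the location-aware data selection it describes, the following Python snippet picks the in-field record closest to the device's GPS fix using the haversine distance. The sample records, coordinates and field names are invented for illustration.

        # Sketch of location-aware data selection: choose the in-field record
        # nearest to the device's GPS fix (haversine distance); records are invented.
        from math import asin, cos, radians, sin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance between two WGS84 points, in kilometres."""
            dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
            a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
            return 2 * 6371.0 * asin(sqrt(a))

        field_samples = [                       # hypothetical soil/yield observations
            {"id": "A1", "lat": 56.4620, "lon": -3.0357, "yield_t_ha": 8.2},
            {"id": "A2", "lat": 56.4633, "lon": -3.0311, "yield_t_ha": 7.6},
            {"id": "B1", "lat": 56.4651, "lon": -3.0289, "yield_t_ha": 9.0},
        ]

        device_lat, device_lon = 56.4630, -3.0320       # e.g. from the GPS receiver
        nearest = min(field_samples,
                      key=lambda s: haversine_km(device_lat, device_lon, s["lat"], s["lon"]))
        print("Nearest sample:", nearest["id"], nearest["yield_t_ha"], "t/ha")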

  13. Synthetic Biology: Mapping the Scientific Landscape

    PubMed Central

    Oldham, Paul; Hall, Stephen; Burton, Geoff

    2012-01-01

    This article uses data from Thomson Reuters Web of Science to map and analyse the scientific landscape for synthetic biology. The article draws on recent advances in data visualisation and analytics with the aim of informing upcoming international policy debates on the governance of synthetic biology by the Subsidiary Body on Scientific, Technical and Technological Advice (SBSTTA) of the United Nations Convention on Biological Diversity. We use mapping techniques to identify how synthetic biology can best be understood and the range of institutions, researchers and funding agencies involved. Debates under the Convention are likely to focus on a possible moratorium on the field release of synthetic organisms, cells or genomes. Based on the empirical evidence we propose that guidance could be provided to funding agencies to respect the letter and spirit of the Convention on Biological Diversity in making research investments. Building on the recommendations of the United States Presidential Commission for the Study of Bioethical Issues we demonstrate that it is possible to promote independent and transparent monitoring of developments in synthetic biology using modern information tools. In particular, public and policy understanding and engagement with synthetic biology can be enhanced through the use of online interactive tools. As a step forward in this process we make existing data on the scientific literature on synthetic biology available in an online interactive workbook so that researchers, policy makers and civil society can explore the data and draw conclusions for themselves. PMID:22539946

  14. PolyTB: a genomic variation map for Mycobacterium tuberculosis.

    PubMed

    Coll, Francesc; Preston, Mark; Guerra-Assunção, José Afonso; Hill-Cawthorn, Grant; Harris, David; Perdigão, João; Viveiros, Miguel; Portugal, Isabel; Drobniewski, Francis; Gagneux, Sebastien; Glynn, Judith R; Pain, Arnab; Parkhill, Julian; McNerney, Ruth; Martin, Nigel; Clark, Taane G

    2014-05-01

    Tuberculosis (TB) caused by Mycobacterium tuberculosis (Mtb) is the second major cause of death from an infectious disease worldwide. Recent advances in DNA sequencing are leading to the ability to generate whole genome information in clinical isolates of M. tuberculosis complex (MTBC). The identification of informative genetic variants such as phylogenetic markers and those associated with drug resistance or virulence will help barcode Mtb in the context of epidemiological, diagnostic and clinical studies. Mtb genomic datasets are increasingly available as raw sequences, which are potentially difficult and computationally intensive to process and compare across studies. Here we have processed the raw sequence data (>1500 isolates, eight studies) to compile a catalogue of SNPs (n = 74,039, 63% non-synonymous, 51.1% in more than one isolate, i.e. non-private), small indels (n = 4810) and larger structural variants (n = 800). We have developed the PolyTB web-based tool (http://pathogenseq.lshtm.ac.uk/polytb) to visualise the resulting variation and important meta-data (e.g. in silico inferred strain-types, location) within geographical map and phylogenetic views. This resource will allow researchers to identify polymorphisms within candidate genes of interest, as well as examine the genomic diversity and distribution of strains. PolyTB source code is freely available to researchers wishing to develop similar tools for their pathogen of interest. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Semantic Body Browser: graphical exploration of an organism and spatially resolved expression data visualization.

    PubMed

    Lekschas, Fritz; Stachelscheid, Harald; Seltmann, Stefanie; Kurtz, Andreas

    2015-03-01

    Advancing technologies generate large amounts of molecular and phenotypic data on cells, tissues and organisms, leading to an ever-growing detail and complexity while information retrieval and analysis becomes increasingly time-consuming. The Semantic Body Browser is a web application for intuitively exploring the body of an organism from the organ to the subcellular level and visualising expression profiles by means of semantically annotated anatomical illustrations. It is used to comprehend biological and medical data related to the different body structures while relying on the strong pattern recognition capabilities of human users. The Semantic Body Browser is a JavaScript web application that is freely available at http://sbb.cellfinder.org. The source code is provided on https://github.com/flekschas/sbb. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. phiGENOME: an integrative navigation throughout bacteriophage genomes.

    PubMed

    Stano, Matej; Klucar, Lubos

    2011-11-01

    phiGENOME is a web-based genome browser generating dynamic and interactive graphical representations of phage genomes stored in phiSITE, the database of gene regulation in bacteriophages. phiGENOME is an integral part of the phiSITE web portal (http://www.phisite.org/phigenome) and it was optimised for visualisation of phage genomes with the emphasis on the gene regulatory elements. phiGENOME consists of three components: (i) genome map viewer built using Adobe Flash technology, providing dynamic and interactive graphical display of phage genomes; (ii) sequence browser based on precisely formatted HTML tags, providing detailed exploration of genome features on the sequence level; and (iii) regulation illustrator, based on Scalable Vector Graphics (SVG) and designed for graphical representation of gene regulation. Bringing together 542 complete genome sequences, accompanied by rich annotations and references, makes phiGENOME a unique information resource in the field of phage genomics. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Tool independence for the Web Accessibility Quantitative Metric.

    PubMed

    Vigo, Markel; Brajnik, Giorgio; Arrue, Myriam; Abascal, Julio

    2009-07-01

    The Web Accessibility Quantitative Metric (WAQM) aims at accurately measuring the accessibility of web pages. One of the main features of WAQM is that it is evaluation-tool independent in ranking and accessibility-monitoring scenarios. This article proposes a method to attain evaluation-tool independence for all foreseeable scenarios. After demonstrating that homepages have a more similar error profile than any other web page in a given web site, 15 homepages were measured with 10,000 different values of WAQM parameters using EvalAccess and LIFT, two automatic evaluation tools for accessibility. A similar procedure was followed with random pages and with several test files, obtaining several tuples that minimise the difference between both tools. One thousand four hundred forty-nine web pages from 15 web sites were measured with these tuples, and those values that minimised the difference between the tools were selected. Once the WAQM was tuned, the accessibility of 15 web sites was measured with two metrics for web sites, concluding that even if similar values can be produced, obtaining the same scores is undesirable since evaluation tools behave in a different way.
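
    The abstract does not give the WAQM formula, so the following Python sketch only illustrates the general tuning procedure it describes: searching for parameter tuples that minimise the score difference between two evaluation tools over a set of pages. The scoring functions, parameter grids and page data are placeholders, not the real WAQM, EvalAccess or LIFT.

        # Generic sketch of the tuning idea: search parameter tuples that minimise the
        # difference between two tools' page scores. score_with_tool_* are placeholders,
        # not the real WAQM formula or the EvalAccess/LIFT evaluators.
        import itertools

        def score_with_tool_a(page, a, b):      # placeholder metric for tool A's report
            return a * page["errors_a"] / (b + page["checks"])

        def score_with_tool_b(page, a, b):      # placeholder metric for tool B's report
            return a * page["errors_b"] / (b + page["checks"])

        pages = [{"errors_a": 12, "errors_b": 9, "checks": 40},
                 {"errors_a": 3,  "errors_b": 5, "checks": 25}]   # invented test pages

        best = None
        for a, b in itertools.product([0.5, 1.0, 2.0, 5.0], [1, 10, 50, 100]):
            diff = sum(abs(score_with_tool_a(p, a, b) - score_with_tool_b(p, a, b))
                       for p in pages)
            if best is None or diff < best[0]:
                best = (diff, (a, b))

        print("Parameter tuple minimising the inter-tool difference:", best[1])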

  18. Towards the Development of a Taxonomy for Visualisation of Streamed Geospatial Data

    NASA Astrophysics Data System (ADS)

    Sibolla, B. H.; Van Zyl, T.; Coetzee, S.

    2016-06-01

    Geospatial data has very specific characteristics that need to be carefully captured in its visualisation, in order for the user and the viewer to gain knowledge from it. The science of visualisation has gained much traction over the last decade as a response to various visualisation challenges. During the development of an open-source, dynamic two-dimensional visualisation library that caters for geospatial streaming data, it was found necessary to conduct a review of existing geospatial visualisation taxonomies. The review was done in order to inform the design phase of the library development, such that an existing taxonomy could either be adopted or extended to fit the needs at hand. The major challenge in this case is to develop dynamic two-dimensional visualisations that enable human interaction in order to assist the user in understanding data streams that are continuously being updated. This paper reviews the existing geospatial data visualisation taxonomies that have been developed over the years. Based on the review, an adopted taxonomy for visualisation of geospatial streaming data is presented. Example applications of this taxonomy are also provided. The adopted taxonomy will then be used to develop the information model for the visualisation library in a further study.

  19. EpiCollect+: linking smartphones to web applications for complex data collection projects

    PubMed Central

    Aanensen, David M.; Huntley, Derek M.; Menegazzo, Mirko; Powell, Chris I.; Spratt, Brian G.

    2014-01-01

    Previously, we have described the development of the generic mobile phone data gathering tool, EpiCollect, and an associated web application, providing two-way communication between multiple data gatherers and a project database. This software only allows data collection on the phone using a single questionnaire form that is tailored to the needs of the user (including a single GPS point and photo per entry), whereas many applications require a more complex structure, allowing users to link a series of forms in a linear or branching hierarchy, along with the addition of any number of media types accessible from smartphones and/or tablet devices (e.g., GPS, photos, videos, sound clips and barcode scanning). A much enhanced version of EpiCollect has been developed (EpiCollect+). The individual data collection forms in EpiCollect+ provide more design complexity than the single form used in EpiCollect, and the software allows the generation of complex data collection projects through the ability to link many forms together in a linear (or branching) hierarchy. Furthermore, EpiCollect+ allows the collection of multiple media types as well as standard text fields, increased data validation and form logic. The entire process of setting up a complex mobile phone data collection project to the specification of a user (project and form definitions) can be undertaken at the EpiCollect+ website using a simple ‘drag and drop’ procedure, with visualisation of the data gathered using Google Maps and charts at the project website. EpiCollect+ is suitable for situations where multiple users transmit complex data by mobile phone (or other Android devices) to a single project web database and is already being used for a range of field projects, particularly public health projects in sub-Saharan Africa. However, many uses can be envisaged from education, ecology and epidemiology to citizen science. PMID:25485096

  20. ESO Reflex: a graphical workflow engine for data reduction

    NASA Astrophysics Data System (ADS)

    Hook, Richard; Ullgrén, Marko; Romaniello, Martino; Maisala, Sami; Oittinen, Tero; Solin, Otto; Savolainen, Ville; Järveläinen, Pekka; Tyynelä, Jani; Péron, Michèle; Ballester, Pascal; Gabasch, Armin; Izzo, Carlo

    ESO Reflex is a prototype software tool that provides a novel approach to astronomical data reduction by integrating a modern graphical workflow system (Taverna) with existing legacy data reduction algorithms. Most of the raw data produced by instruments at the ESO Very Large Telescope (VLT) in Chile are reduced using recipes. These are compiled C applications following an ESO standard and utilising routines provided by the Common Pipeline Library (CPL). Currently these are run in batch mode as part of the data flow system to generate the input to the ESO/VLT quality control process and are also exported for use offline. ESO Reflex can invoke CPL-based recipes in a flexible way through a general purpose graphical interface. ESO Reflex is based on the Taverna system that was originally developed within the UK life-sciences community. Workflows have been created so far for three VLT/VLTI instruments, and the GUI allows the user to make changes to these or create workflows of their own. Python scripts or IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available. Taverna is intended for use with web services and experiments using ESO Reflex to access Virtual Observatory web services have been successfully performed. ESO Reflex is the main product developed by Sampo, a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal was to look into the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Sampo concluded early in 2008. This contribution will describe ESO Reflex and show several examples of its use both locally and using Virtual Observatory remote web services. ESO Reflex is expected to be released to the community in early 2009.

  1. EpiCollect+: linking smartphones to web applications for complex data collection projects.

    PubMed

    Aanensen, David M; Huntley, Derek M; Menegazzo, Mirko; Powell, Chris I; Spratt, Brian G

    2014-01-01

    Previously, we have described the development of the generic mobile phone data gathering tool, EpiCollect, and an associated web application, providing two-way communication between multiple data gatherers and a project database. This software only allows data collection on the phone using a single questionnaire form that is tailored to the needs of the user (including a single GPS point and photo per entry), whereas many applications require a more complex structure, allowing users to link a series of forms in a linear or branching hierarchy, along with the addition of any number of media types accessible from smartphones and/or tablet devices (e.g., GPS, photos, videos, sound clips and barcode scanning). A much enhanced version of EpiCollect has been developed (EpiCollect+). The individual data collection forms in EpiCollect+ provide more design complexity than the single form used in EpiCollect, and the software allows the generation of complex data collection projects through the ability to link many forms together in a linear (or branching) hierarchy. Furthermore, EpiCollect+ allows the collection of multiple media types as well as standard text fields, increased data validation and form logic. The entire process of setting up a complex mobile phone data collection project to the specification of a user (project and form definitions) can be undertaken at the EpiCollect+ website using a simple 'drag and drop' procedure, with visualisation of the data gathered using Google Maps and charts at the project website. EpiCollect+ is suitable for situations where multiple users transmit complex data by mobile phone (or other Android devices) to a single project web database and is already being used for a range of field projects, particularly public health projects in sub-Saharan Africa. However, many uses can be envisaged from education, ecology and epidemiology to citizen science.

  2. AstrodyToolsWeb an e-Science project in Astrodynamics and Celestial Mechanics fields

    NASA Astrophysics Data System (ADS)

    López, R.; San-Juan, J. F.

    2013-05-01

    Astrodynamics Web Tools, AstrodyToolsWeb (http://tastrody.unirioja.es), is an ongoing collaborative web-based computing infrastructure project that has been specially designed to support scientific computation. AstrodyToolsWeb provides project collaborators with all the technical and human facilities needed to wrap, manage, and use specialized non-commercial software tools in the Astrodynamics and Celestial Mechanics fields, with the aim of optimizing the use of resources, both human and material. However, this project is open to collaboration from the whole scientific community in order to create a library of useful tools and their corresponding theoretical backgrounds. AstrodyToolsWeb offers a user-friendly web interface for choosing applications, entering data, and selecting appropriate constraints in an intuitive and easy way. After that, the application is executed in real time whenever possible; the critical information about program behaviour (errors and logs) and the output, including post-processing and interpretation of the results (graphical representation of data, statistical analysis or other manipulations), are then shown via the same web interface or can be downloaded to the user's computer.

  3. Organization and Visualization for Initial Analysis of Forced-Choice Ipsative Data

    ERIC Educational Resources Information Center

    Cochran, Jill A.

    2015-01-01

    Forced-choice ipsative data are common in personality, philosophy and other preference-based studies. However, this type of data inherently contains dependencies that are challenging for usual statistical analysis. In order to utilize the structure of the data as a guide for analysis rather than as a challenge to manage, a visualisation tool was…

  4. Using Machine-Learning and Visualisation to Facilitate Learner Interpretation of Source Material

    ERIC Educational Resources Information Center

    Wolff, Annika; Mulholland, Paul; Zdrahal, Zdenek

    2014-01-01

    This paper describes an approach for supporting inquiry learning from source materials, realised and tested through a tool-kit. The approach is optimised for tasks that require a student to make interpretations across sets of resources, where opinions and justifications may be hard to articulate. We adopt a dialogue-based approach to learning…

  5. Evaluation of Teaching the IS-LM Model through a Simulation Program

    ERIC Educational Resources Information Center

    Pablo-Romero, Maria del Populo; Pozo-Barajas, Rafael; Gomez-Calero, Maria de la Palma

    2012-01-01

    The IS-LM model is a basic tool used in the teaching of short-term macroeconomics. Teaching is essentially done through the use of graphs. However, the way these graphs are traditionally taught does not allow the learner to easily visualise changes in the curves. The IS-LM simulation program overcomes difficulties encountered in understanding the…

  6. PyGPlates - a GPlates Python library for data analysis through space and deep geological time

    NASA Astrophysics Data System (ADS)

    Williams, Simon; Cannon, John; Qin, Xiaodong; Müller, Dietmar

    2017-04-01

    A fundamental consideration for studying the Earth through deep time is that the configurations of the continents, tectonic plates, and plate boundaries are continuously changing. Within a diverse range of fields including geodynamics, paleoclimate, and paleobiology, the importance of considering geodata in their reconstructed context across previous cycles of supercontinent aggregation, dispersal and ocean basin evolution is widely recognised. Open-source software tools such as GPlates provide paleo-geographic information systems for geoscientists to combine a wide variety of geodata and examine them within tectonic reconstructions through time. The availability of such powerful tools also brings new challenges - we want to learn something about the key associations between reconstructed plate motions and the geological record, but the high-dimensional parameter space makes it difficult for a human being to visually comprehend and quantify these associations. To achieve true spatio-temporal data-mining, new tools are needed. Here, we present a further development of the GPlates ecosystem - a Python-based tool for geotectonic analysis. In contrast to existing GPlates tools that are built around a graphical user interface (GUI) and interactive visualisation, pyGPlates offers a programming interface for the automation of quantitative plate tectonic analysis of arbitrary complexity. The vast array of open-source Python-based tools for data-mining, statistics and machine learning can now be linked to pyGPlates, allowing spatial data to be seamlessly analysed in space and geological "deep time", and with the ability to spread large computations across multiple processors. The presentation will illustrate a range of example applications, both simple and advanced. Basic examples include data querying, filtering, and reconstruction, and file-format conversions. For the innovative study of plate kinematics, pyGPlates has been used to explore the relationships between absolute plate motions, subduction zone kinematics, and mid-ocean ridge migration and orientation through deep time; to investigate the systematics of continental rift velocity evolution during Pangea breakup; and to make connections between kinematics of the Andean subduction zone and ore deposit formation. To support the numerical modelling community, pyGPlates facilitates the connection between tectonic surface boundary conditions contained within plate tectonic reconstructions (plate boundary configurations and plate velocities) and simulations such as thermo-mechanical models of lithospheric deformation and mantle convection. To support the development of web-based applications that can serve the wider geoscience community, we will demonstrate how pyGPlates can be combined with other open-source tools to serve alternative reconstructions together with a diverse array of reconstructed data sets in a self-consistent framework over the internet. PyGPlates is available to the public via the GPlates web site and contains comprehensive documentation covering installation on Windows/Mac/Linux platforms, sample code, tutorials and a detailed reference of pyGPlates functions and classes.
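
    A minimal usage sketch of the programming interface described above is given below; it assumes pyGPlates is installed, and the rotation-model and feature-collection file names are placeholders that would need to point at a real GPlates-compatible plate model.

        # Minimal pyGPlates sketch (file names are placeholders for a real rotation
        # model and feature collection): reconstruct features to 100 Ma.
        import pygplates

        rotation_model = pygplates.RotationModel("rotations.rot")       # placeholder path
        features = pygplates.FeatureCollection("coastlines.gpmlz")      # placeholder path

        reconstruction_time = 100.0   # Ma
        reconstructed = []
        pygplates.reconstruct(features, rotation_model, reconstructed, reconstruction_time)

        for rfg in reconstructed:
            feature = rfg.get_feature()
            print(feature.get_name(), feature.get_reconstruction_plate_id())

        # Alternatively, write the reconstructed geometries straight to a file:
        pygplates.reconstruct(features, rotation_model,
                              "reconstructed_100Ma.shp", reconstruction_time)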

  7. TOPCAT: Tool for OPerations on Catalogues And Tables

    NASA Astrophysics Data System (ADS)

    Taylor, Mark

    2011-01-01

    TOPCAT is an interactive graphical viewer and editor for tabular data. Its aim is to provide most of the facilities that astronomers need for analysis and manipulation of source catalogues and other tables, though it can be used for non-astronomical data as well. It understands a number of different astronomically important formats (including FITS and VOTable) and more formats can be added. It offers a variety of ways to view and analyse tables, including a browser for the cell data themselves, viewers for information about table and column metadata, and facilities for 1-, 2-, 3- and higher-dimensional visualisation, calculating statistics and joining tables using flexible matching algorithms. Using a powerful and extensible Java-based expression language new columns can be defined and row subsets selected for separate analysis. Table data and metadata can be edited and the resulting modified table can be written out in a wide range of output formats. It is a stand-alone application which works quite happily with no network connection. However, because it uses Virtual Observatory (VO) standards, it can cooperate smoothly with other tools in the VO world and beyond, such as VODesktop, Aladin and ds9. Between 2006 and 2009 TOPCAT was developed within the AstroGrid project, and is offered as part of a standard suite of applications on the AstroGrid web site, where you can find information on several other VO tools. The program is written in pure Java and available under the GNU General Public Licence. It has been developed in the UK within the Starlink and AstroGrid projects, and under PPARC and STFC grants. Its underlying table processing facilities are provided by STIL.

  8. A visualization framework for design and evaluation

    NASA Astrophysics Data System (ADS)

    Blundell, Benjamin J.; Ng, Gary; Pettifer, Steve

    2006-01-01

    The creation of compelling visualisation paradigms is a craft often dominated by intuition and issues of aesthetics, with relatively few models to support good design. The majority of problem cases are approached by simply applying a previously evaluated visualisation technique. A large body of work exists covering individual aspects of visualisation design, such as human cognition, visualisation methods for specific problem areas, psychology studies and so forth, yet most frameworks regarding visualisation are applied after-the-fact as an evaluation measure. We present an extensible framework for visualisation aimed at structuring the design process, increasing decision traceability and delineating the notions of function, aesthetics and usability. The framework can be used to derive a set of requirements for good visualisation design and to evaluate existing visualisations, presenting possible improvements. Our framework achieves this by being both broad and general, built on top of existing works, with hooks for extensions and customizations. This paper shows how existing theories of information visualisation fit into the scheme, presents our experience in the application of this framework on several designs, and offers our evaluation of the framework and the designs studied.

  9. Digital test assembly of truck parts with the IMMA-tool--an illustrative case.

    PubMed

    Hanson, L; Högberg, D; Söderholm, M

    2012-01-01

    Several digital human modelling (DHM) tools have been developed for simulation and visualisation of human postures and motions. In 2010 the DHM tool IMMA (Intelligently Moving Manikins) was introduced; it uses advanced path planning techniques to generate collision-free and biomechanically acceptable motions for digital human models (as well as parts) in complex assembly situations. The aim of the paper is to illustrate how the IPS/IMMA tool is used at Scania CV AB in a digital test assembly process, and to compare the tool with other DHM tools on the market. The illustrated case of using the IMMA tool, here combined with the path planner tool IPS, indicates that the tool is promising. The major strengths of the tool are its user-friendly interface, the motion generation algorithms, the batch simulation of manikins and the ergonomics assessment methods that consider time.

  10. Delivering Electronic Resources with Web OPACs and Other Web-based Tools: Needs of Reference Librarians.

    ERIC Educational Resources Information Center

    Bordeianu, Sever; Carter, Christina E.; Dennis, Nancy K.

    2000-01-01

    Describes Web-based online public access catalogs (Web OPACs) and other Web-based tools as gateway methods for providing access to library collections. Addresses solutions for overcoming barriers to information, such as through the implementation of proxy servers and other authentication tools for remote users. (Contains 18 references.)…

  11. Follow-up imaging of the urinary tract in spinal injury patients: is a KUB necessary with every ultrasound?

    PubMed

    Tins, B; Teo, H-G; Popuri, R; Cassar-Pullicino, V; Tyrrell, P

    2005-04-01

    Prospective study of 100 consecutive patients. To evaluate the diagnostic usefulness of the urinary tract (KUB) radiograph routinely performed as part of spinal injury patient urinary tract screening with ultrasound (US) and the KUB radiograph. Orthopaedic and District General Hospital with spinal injuries unit, UK. Prospective study of the urinary tract of 100 consecutive routine follow-up spinal injury patients with KUB (kidneys, ureters, bladder) radiograph and US of the urinary tract. The percentage of the visualised area of kidneys and urinary bladder and relevant abnormal findings were recorded. Relevant patient history was recorded. In all, 80 men and 20 women were examined (average age 46 years, average time since injury 11 years). A total of 199 kidneys and 99 urinary bladders were examined. On average, less than 50% of the renal area and about 70-75% of the urinary bladder area were visualised. Five patients had renal stones identified on the KUB radiograph, and of these, two were seen on US. There were no stones seen on US only. The patient history was not helpful in identifying patients with renal stones. Significant further renal abnormalities were identified with US in 14 patients, and with the KUB radiograph in 0 patients. Significant urinary bladder abnormalities were identified with US in 20 patients, and with the KUB radiograph in 0 patients. On average, less than 50% of the kidney area is visualised on the KUB due to overlying bowel markings, making the KUB radiograph a poor tool for assessing the kidneys. The KUB radiograph and US are poor tools to assess urinary tract stones. In the absence of a therapeutic consequence, the KUB radiograph does not seem justified in the routine follow-up of the urinary tract in spinal injury patients.

  12. Designing the RiverCare knowledge base and web-collaborative platform to exchange knowledge in river management

    NASA Astrophysics Data System (ADS)

    Cortes Arevalo, Juliette; den Haan, Robert-Jan; van der Voort, Mascha; Hulscher, Suzanne

    2016-04-01

    Effective communication strategies are necessary between different scientific disciplines, practitioners and non-experts for a shared understanding and better implementation of river management measures. In that context, the RiverCare program aims to get a better understanding of riverine measures that are being implemented towards self-sustaining multifunctional rivers in the Netherlands. During the RiverCare program, user committees are organized between the researchers and practitioners to discuss the aim and value of RiverCare outputs, related assumptions and uncertainties behind scientific results. Beyond the end of the RiverCare program, knowledge about river interventions, integrated effects, management and self-sustaining applications will be available to experts and non-experts by means of RiverCare communication tools: a web-collaborative platform and a serious gaming environment. As part of the communication project of RiverCare, we are designing the RiverCare web-collaborative platform and the knowledge base behind that platform. We aim at promoting collaborative efforts and knowledge exchange in river management. However, knowledge exchange does not magically happen. Consultation and discussion of RiverCare outputs, as well as elicitation of perspectives and preferences from different actors about the effects of riverine measures, have to be facilitated. During the RiverCare research activities, the platform will support the user committees or collaborative sessions that are regularly held with the organizations directly benefiting from our research, at project level or in study areas. The design process of the collaborative platform follows a user-centred approach to identify user requirements, co-create a conceptual design, and iteratively develop and evaluate prototypes of the platform. The envisioned web-collaborative platform opens with an explanation and visualisation of the RiverCare outputs that are available in the knowledge base. Collaborative sessions are initiated by a facilitator who invites other users to contribute by agreeing on an objective for the session and on the ways and period of collaboration. Upon login, users can join the sessions to which they have been invited or in which they are willing to participate. Within these sessions, users collaboratively engage on the topic at hand, acquiring knowledge about the ongoing results of RiverCare, sharing knowledge between actors and co-constructing new knowledge in the process as input for RiverCare research activities. An overview of each session will be presented to registered and non-registered users to document collaboration efforts and promote interaction with actors outside RiverCare. At the user requirements analysis stage of the collaborative platform, a questionnaire and a workshop session were launched to uncover end users' preferences and expectations about the tool to be designed. Results comprised insights into design criteria for the collaborative platform. The user requirements will be followed by interview sessions with RiverCare researchers and user committee members to identify considerations for data management, objectives of collaboration, expected outputs and indicators to evaluate the collaborative platform. On the one hand, the considerations of intended users are important for co-designing tools that effectively communicate and promote a shared understanding of scientific outputs. On the other hand, active involvement of end users is important for establishing measurable indicators to evaluate the tool and the collaborative process.

  13. Gebiss: an ImageJ plugin for the specification of ground truth and the performance evaluation of 3D segmentation algorithms

    PubMed Central

    2011-01-01

    Background: Image segmentation is a crucial step in quantitative microscopy that helps to define regions of tissues, cells or subcellular compartments. Depending on the degree of user interaction, segmentation methods can be divided into manual, automated or semi-automated approaches. 3D image stacks usually require automated methods due to their large number of optical sections. However, certain applications benefit from manual or semi-automated approaches. Scenarios include the quantification of 3D images with poor signal-to-noise ratios or the generation of so-called ground truth segmentations that are used to evaluate the accuracy of automated segmentation methods. Results: We have developed Gebiss, an ImageJ plugin for the interactive segmentation, visualisation and quantification of 3D microscopic image stacks. We integrated a variety of existing plugins for threshold-based segmentation and volume visualisation. Conclusions: We demonstrate the application of Gebiss to the segmentation of nuclei in live Drosophila embryos and the quantification of neurodegeneration in Drosophila larval brains. Gebiss was developed as a cross-platform ImageJ plugin and is freely available on the web at http://imaging.bii.a-star.edu.sg/projects/gebiss/. PMID:21668958
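
    Gebiss itself is an ImageJ (Java) plugin; as a hedged, language-neutral sketch of the threshold-based 3D segmentation idea it integrates, the following Python example thresholds a synthetic image stack with Otsu's method and labels the connected 3D components. The use of scipy and scikit-image is an assumption of this sketch, not a dependency of Gebiss.

        # Sketch of threshold-based 3D segmentation (not the Gebiss plugin itself):
        # threshold a synthetic image stack and label the connected 3D components.
        import numpy as np
        from scipy import ndimage
        from skimage.filters import threshold_otsu

        rng = np.random.default_rng(1)
        stack = rng.normal(100, 10, size=(30, 128, 128))       # synthetic 3D stack
        stack[10:20, 40:70, 40:70] += 80                        # one bright "nucleus"
        stack[5:12, 90:110, 20:45] += 80                        # another one

        threshold = threshold_otsu(stack)                       # global Otsu threshold
        binary = stack > threshold

        labels, n_objects = ndimage.label(binary)               # 3D connected components
        sizes = ndimage.sum(binary, labels, range(1, n_objects + 1))
        print(f"{n_objects} objects; voxel counts: {sizes.astype(int)}")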

  14. PHENOPSIS DB: an Information System for Arabidopsis thaliana phenotypic data in an environmental context

    PubMed Central

    2011-01-01

    Background: Renewed interest in plant × environment interactions has risen in the post-genomic era. In this context, high-throughput phenotyping platforms have been developed to create reproducible environmental scenarios in which the phenotypic responses of multiple genotypes can be analysed in a reproducible way. These platforms benefit hugely from the development of suitable databases for storage, sharing and analysis of the large amount of data collected. In the model plant Arabidopsis thaliana, most databases available to the scientific community contain data related to genetic and molecular biology and are characterised by an inadequacy in the description of plant developmental stages and experimental metadata such as environmental conditions. Our goal was to develop a comprehensive information system for sharing of the data collected in PHENOPSIS, an automated platform for Arabidopsis thaliana phenotyping, with the scientific community. Description: PHENOPSIS DB is a publicly available (URL: http://bioweb.supagro.inra.fr/phenopsis/) information system developed for storage, browsing and sharing of online data generated by the PHENOPSIS platform and offline data collected by experimenters and experimental metadata. It provides modules coupled to a Web interface for (i) the visualisation of environmental data of an experiment, (ii) the visualisation and statistical analysis of phenotypic data, and (iii) the analysis of Arabidopsis thaliana plant images. Conclusions: Firstly, data stored in the PHENOPSIS DB are of interest to the Arabidopsis thaliana community, particularly in allowing phenotypic meta-analyses directly linked to environmental conditions on which publications are still scarce. Secondly, data or image analysis modules can be downloaded from the Web interface for direct usage or as the basis for modifications according to new requirements. Finally, the structure of PHENOPSIS DB provides a useful template for the development of other similar databases related to genotype × environment interactions. PMID:21554668

  15. SWE-based Observation Data Delivery from the Instrument to the User - Sensor Web Technology in the NeXOS Project

    NASA Astrophysics Data System (ADS)

    Jirka, Simon; del Rio, Joaquin; Toma, Daniel; Martinez, Enoc; Delory, Eric; Pearlman, Jay; Rieke, Matthes; Stasch, Christoph

    2017-04-01

    With the rapidly evolving technology for building Web-based (spatial) information infrastructures and Sensor Webs, there are new opportunities to improve how ocean data is collected and managed. A central element in this development is the suite of Sensor Web Enablement (SWE) standards specified by the Open Geospatial Consortium (OGC). This framework of standards comprises, on the one hand, data models as well as formats for measurement data (ISO/OGC Observations and Measurements, O&M) and metadata describing measurement processes and sensors (OGC Sensor Model Language, SensorML). On the other hand, the SWE standards comprise (Web service) interface specifications for pull-based access to observation data (OGC Sensor Observation Service, SOS) and for controlling or configuring sensors (OGC Sensor Planning Service, SPS). Also within the European INSPIRE framework the SWE standards play an important role, as the SOS is the recommended download service interface for O&M-encoded observation data sets. In the context of the EU-funded Oceans of Tomorrow initiative the NeXOS (Next generation, Cost-effective, Compact, Multifunctional Web Enabled Ocean Sensor Systems Empowering Marine, Maritime and Fisheries Management) project is developing a new generation of in-situ sensors that make use of the SWE standards to facilitate the data publication process and the integration into Web-based information infrastructures. This includes the development of a dedicated firmware for instruments and sensor platforms (SEISI, Smart Electronic Interface for Sensors and Instruments) maintained by the Universitat Politècnica de Catalunya (UPC). Among other features, SEISI makes use of OGC SWE standards such as OGC-PUCK to enable a plug-and-play mechanism for sensors based on SensorML-encoded metadata. Thus, if a new instrument is attached to a SEISI-based platform, it automatically configures the connection to the instrument, automatically generates data files compliant with the ISO/OGC Observations and Measurements standard and initiates the data transmission into the NeXOS Sensor Web infrastructure. Besides these platform-related developments, NeXOS has realised the full path of data transmission from the sensor to the end user application. The conceptual architecture design is implemented by a series of open source SWE software packages provided by 52°North. This comprises, in particular, different SWE server components (i.e. OGC Sensor Observation Service), tools for data visualisation (e.g. the 52°North Helgoland SOS viewer), and an editor for providing SensorML-based metadata (52°North smle). As a result, NeXOS has demonstrated how the SWE standards help to improve marine observation data collection. Within this presentation, we will present the experiences and findings of the NeXOS project and will provide recommendations for future work directions.
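
    As a hedged sketch of the pull-based access pattern described above, the following Python snippet issues an OGC SOS 2.0 GetObservation request via HTTP KVP; the service endpoint, offering and observed-property identifiers are placeholders, not a real NeXOS or 52°North deployment.

        # Sketch of pull-based SWE data access: an SOS 2.0 GetObservation request via
        # HTTP KVP. Endpoint and identifiers are placeholders for a real SOS instance.
        import requests

        SOS_ENDPOINT = "https://example.org/52n-sos/service"   # placeholder URL

        params = {
            "service": "SOS",
            "version": "2.0.0",
            "request": "GetObservation",
            "offering": "ctd_platform_1",                      # placeholder offering id
            "observedProperty": "sea_water_temperature",       # placeholder property
            "responseFormat": "http://www.opengis.net/om/2.0", # O&M 2.0 observations
        }

        response = requests.get(SOS_ENDPOINT, params=params, timeout=30)
        response.raise_for_status()
        print(response.headers.get("Content-Type"))
        print(response.text[:500])     # first part of the O&M (XML) response document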

  16. Teaching Web Security Using Portable Virtual Labs

    ERIC Educational Resources Information Center

    Chen, Li-Chiou; Tao, Lixin

    2012-01-01

    We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…

  17. Smart "geomorphological" map browsing - a tale about geomorphological maps and the internet

    NASA Astrophysics Data System (ADS)

    Geilhausen, M.; Otto, J.-C.

    2012-04-01

    With the digital production of geomorphological maps, the dissemination of research outputs now extends beyond simple paper products. Internet technologies can contribute both to the dissemination of geomorphological maps and to access to geomorphological data, and help to make geomorphological knowledge available to a wider public. Indeed, many national geological surveys employ end-to-end digital workflows from data capture in the field to final map production and dissemination. This paper deals with the potential of web mapping applications and interactive, portable georeferenced PDF maps for the distribution of geomorphological information. Web mapping applications such as Google Maps have become very popular and widespread and have increased interest in, and access to, mapping. They link the Internet with GIS technology and are a common way of presenting dynamic maps online. The GIS processing is performed online and maps are visualised in interactive web viewers characterised by different capabilities such as zooming, panning or adding further thematic layers, with the map refreshed after each task. Depending on the system architecture and the components used, advanced symbology, map overlays from different applications and sources and their integration into a desktop GIS are possible. This interoperability is achieved through the use of international open standards that include mechanisms for the integration and visualisation of information from multiple sources. The portable document format (PDF) is commonly used for printing and is a standard format that can be processed by many graphics programs and printers without loss of information. A GeoPDF enables the sharing of geospatial maps and data in PDF documents. Multiple, independent map frames with individual spatial reference systems are possible within a GeoPDF, for example, for map overlays or insets. Geospatial functionality of a GeoPDF includes scalable map display, layer visibility control, access to attribute data, coordinate queries and spatial measurements. The full functionality of GeoPDFs requires free and user-friendly plug-ins for PDF readers and GIS software. A GeoPDF thus provides fundamental GIS functionality, turning the formerly static PDF map into an interactive, portable georeferenced PDF map. GeoPDFs are easy to create and provide an interesting and valuable way to disseminate geomorphological maps. Our motivation to engage with the online distribution of geomorphological maps originates in the increasing number of web mapping applications available today, indicating that the Internet has become a medium for displaying geographical information in rich forms and through user-friendly interfaces. So, why not use the Internet to distribute geomorphological maps and enhance their practical application? Web mapping and dynamic PDF maps can play a key role in the movement towards global dissemination of geomorphological information. This will be exemplified by live demonstrations of (i) existing geomorphological WebGIS applications, (ii) data merging from various sources using web map services, and (iii) free-to-download GeoPDF maps during the presentation.
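
    A minimal sketch of how a thematic layer would be fetched from a web map service of the kind mentioned above, assuming a hypothetical WMS endpoint and layer name:

```python
import requests

# Hypothetical WMS endpoint serving a geomorphological map layer.
WMS_URL = "https://example.org/geoserver/wms"

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "geomorphology:landforms",  # placeholder layer name
    "styles": "",
    "crs": "EPSG:4326",                   # WMS 1.3.0 uses 'crs' (1.1.1 uses 'srs')
    "bbox": "47.0,12.0,47.5,13.0",        # lat/lon axis order for EPSG:4326 in 1.3.0
    "width": "800",
    "height": "600",
    "format": "image/png",
}

r = requests.get(WMS_URL, params=params, timeout=30)
r.raise_for_status()
with open("landforms.png", "wb") as f:
    f.write(r.content)  # map image ready to overlay in a web viewer or desktop GIS
```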

  18. Bibliometric Science Mapping as a Popular Trend: Chosen Examples of Visualisation of International Research Network Results

    ERIC Educational Resources Information Center

    Smyrnova-Trybulska, Eugenia; Morze, Nataliia; Kuzminska, Olena; Kommers, Piet

    2017-01-01

    The authors of the article describe the popular trends and methods as well as ICT tools used for the mapping and visualization of scientific domains as a research methodology which is attracting more and more interest from scientific information and science studies professionals. The researchers analysed Pajek, one of the programs used for the…

  19. Bringing Data to Life into an Introductory Statistics Course with Gapminder

    ERIC Educational Resources Information Center

    Le, Dai-Trang

    2013-01-01

    "Gapminder" is a free and easy to use software for visualising real-world data in multiple dimensions. The simple format of the Cartesian coordinate system is used in a dynamic and interactive way to convey a great deal of information. This tool can be readily used to arouse students' natural curiosity regarding world events and to…

  20. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server

    DTIC Science & Technology

    2016-09-01

    ARL-TR-7798, September 2016. US Army Research Laboratory. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server, by Christian D Schlesiger, Computational and Information Sciences Directorate, ARL.

  1. Web Tools: The Second Generation

    ERIC Educational Resources Information Center

    Pascopella, Angela

    2008-01-01

    Web 2.0 tools and technologies, or second generation tools, help districts to save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, which falls under the 21st-century skills. The second-generation tools are growing in popularity…

  2. Using game engine for 3D terrain visualisation of GIS data: A review

    NASA Astrophysics Data System (ADS)

    Che Mat, Ruzinoor; Shariff, Abdul Rashid Mohammed; Nasir Zulkifli, Abdul; Shafry Mohd Rahim, Mohd; Hafiz Mahayudin, Mohd

    2014-06-01

    This paper reviews the 3D terrain visualisation of GIS data using game engines that are available commercially as well as open source. 3D terrain visualisation is a technique used to visualise terrain information from GIS data such as digital elevation models (DEM), triangular irregular networks (TIN) and contours. Much research has been conducted on transforming 2D map views into 3D. Several terrain visualisation software packages are available for free, including Cesium, Hftool and Landserf. This review paper will help interested users to better understand the current state of the art in 3D terrain visualisation of GIS data using game engines.

  3. 3D Modelling and Interactive Web-Based Visualization of Cultural Heritage Objects

    NASA Astrophysics Data System (ADS)

    Koeva, M. N.

    2016-06-01

    Nowadays, there are rapid developments in the fields of photogrammetry, laser scanning, computer vision and robotics, together aiming to provide highly accurate 3D data that is useful for various applications. In recent years, various LiDAR and image-based techniques have been investigated for 3D modelling because of their opportunities for fast and accurate model generation. For cultural heritage preservation and the representation of objects that are important for tourism and their interactive visualization, 3D models are highly effective and intuitive for present-day users who have stringent requirements and high expectations. Depending on the complexity of the objects for the specific case, various technological methods can be applied. The selected objects in this particular research are located in Bulgaria - a country with thousands of years of history and cultural heritage dating back to ancient civilizations. This motivates the preservation, visualisation and recreation of undoubtedly valuable historical and architectural objects and places, which has always been a serious challenge for specialists in the field of cultural heritage. In the present research, comparative analyses regarding the principles and technological processes needed for 3D modelling and visualization are presented. The recent problems, efforts and developments in the interactive representation of precious objects and places in Bulgaria are also outlined. Three technologies based on real projects are described: (1) image-based modelling using a non-metric hand-held camera; (2) 3D visualization based on spherical panoramic images; and (3) 3D geometric and photorealistic modelling based on architectural CAD drawings. Their suitability for web-based visualization is demonstrated and compared. Moreover, the possibilities for integration with additional information such as interactive maps, satellite imagery, sound, video and specific information for the objects are described. This comparative study discusses the advantages and disadvantages of these three approaches and their integration in multiple domains, such as web-based 3D city modelling, tourism and architectural 3D visualization. It was concluded that image-based modelling and panoramic visualisation are simple, fast and effective techniques suitable for simultaneous virtual representation of many objects. However, additional measurements or CAD information will be beneficial for obtaining higher accuracy.

  4. Visualisation of the mechanosensitive channel of large conductance in bacteria using confocal microscopy.

    PubMed

    Norman, Christel; Liu, Zhen-Wei; Rigby, Paul; Raso, Albert; Petrov, Yevgeniy; Martinac, Boris

    2005-07-01

    The mechanosensitive channel of large conductance (MscL) plays an important role in the survival of bacterial cells to hypo-osmotic shock. This channel has been extensively studied and its sequence, structure and electrophysiological characteristics are well known. Here we present a method to visualise MscL in living bacteria using confocal microscopy. By creating a gene fusion between mscl and the gene encoding the green fluorescent protein (GFP) we were able to express the fusion protein MscL-GFP in bacteria. We show that MscL-GFP is present in the cytoplasmic membrane and forms functional channels. These channels have the same characteristics as wild-type MscL, except that they require more pressure to open. This method could prove an interesting, non-invasive, tool to study the localisation and the regulation of expression of MscL in bacteria.

  5. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools for obtaining stresses and strains (shear and peel) in adhesively bonded joints. For a given finite element model of an adhesively bonded joint, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, the trade-off studies needed for the design of adhesively bonded joints can be performed very quickly.
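
    As a rough, generic illustration of the post-processing step described above (not the article's tools, which read the finite element input and output files directly), adhesive stresses can be recovered by dividing each spring force by the adhesive area tributary to that spring; all numbers and field names below are hypothetical:

```python
# Illustrative only: convert spring forces from an FE run into adhesive stresses.
# Force components in N, tributary area in mm^2 -> stresses in MPa.
springs = [
    {"id": 101, "shear_force": 850.0, "peel_force": 120.0, "area": 25.0},
    {"id": 102, "shear_force": 430.0, "peel_force": 310.0, "area": 25.0},
]

for s in springs:
    s["shear_stress"] = s["shear_force"] / s["area"]  # tau = F_shear / A
    s["peel_stress"] = s["peel_force"] / s["area"]    # sigma = F_peel / A

# Sort in descending order of shear stress before reporting/plotting.
for s in sorted(springs, key=lambda s: s["shear_stress"], reverse=True):
    print(f"spring {s['id']}: tau = {s['shear_stress']:.1f} MPa, "
          f"sigma = {s['peel_stress']:.1f} MPa")
```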

  6. Analysis tools for the interplay between genome layout and regulation.

    PubMed

    Bouyioukos, Costas; Elati, Mohamed; Képès, François

    2016-06-06

    Genome layout and gene regulation appear to be interdependent. Understanding this interdependence is key to exploring the dynamic nature of chromosome conformation and to engineering functional genomes. Evidence for non-random genome layout, defined as the relative positioning of either co-functional or co-regulated genes, stems from two main approaches. Firstly, the analysis of contiguous genome segments across species has highlighted the conservation of gene arrangement (synteny) along chromosomal regions. Secondly, the study of long-range interactions along a chromosome has emphasised regularities in the positioning of microbial genes that are co-regulated, co-expressed or evolutionarily correlated. While one-dimensional pattern analysis is a mature field, it is often powerless on biological datasets, which tend to be incomplete and partly incorrect. Moreover, there is a lack of comprehensive, user-friendly tools to systematically analyse, visualise, integrate and exploit regularities along genomes. Here we present the Genome REgulatory and Architecture Tools SCAN (GREAT:SCAN) software for the systematic study of the interplay between genome layout and gene expression regulation. SCAN is a collection of related and interconnected applications currently able to perform systematic analyses of genome regularities as well as to improve transcription factor binding site (TFBS) and gene regulatory network predictions based on gene positional information. We demonstrate the capabilities of these tools by studying, on the one hand, the regular patterns of genome layout in the major regulons of the bacterium Escherichia coli and, on the other hand, the improvement of TFBS prediction in microbes. Finally, we highlight, through visualisation of multivariate techniques, the interplay between position and sequence information for effective transcription regulation.
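
    One generic way to score the positional regularities mentioned above (not necessarily the method GREAT:SCAN implements) is to map gene positions to phases for a candidate period and measure how concentrated those phases are; the positions below are toy values:

```python
import cmath
import math

# Toy gene start positions (bp); spaced roughly 100 kb apart for illustration.
positions = [10_000, 110_500, 209_800, 310_200, 411_000, 508_700]

def period_score(positions, period_bp):
    """Mean resultant length of positions mapped to phases for the given period.
    Close to 1.0 = strongly periodic spacing, close to 0 = no positional regularity."""
    phases = [cmath.exp(2j * math.pi * (p % period_bp) / period_bp) for p in positions]
    return abs(sum(phases)) / len(phases)

for period in (50_000, 100_000, 200_000):
    print(f"period {period:>7} bp: score {period_score(positions, period):.2f}")
```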

  7. Redesigning Instruction through Web-based Course Authoring Tools.

    ERIC Educational Resources Information Center

    Dabbagh, Nada H.; Schmitt, Jeff

    1998-01-01

    Examines the pedagogical implications of redesigning instruction for Web-based delivery through a case study of an undergraduate computer science course. Initially designed for a traditional learning environment, this course transformed to a Web-based course using WebCT, a Web-based course authoring tool. Discusses the specific features of WebCT.…

  8. Towards Making Data Bases Practical for use in the Field

    NASA Astrophysics Data System (ADS)

    Fischer, T. P.; Lehnert, K. A.; Chiodini, G.; McCormick, B.; Cardellini, C.; Clor, L. E.; Cottrell, E.

    2014-12-01

    Geological, geochemical, and geophysical research is often field based with travel to remote areas and collection of samples and data under challenging environmental conditions. Cross-disciplinary investigations would greatly benefit from near real-time data access and visualisation within the existing framework of databases and GIS tools. An example of complex, interdisciplinary field-based and data intensive investigations is that of volcanologists and gas geochemists, who sample gases from fumaroles, hot springs, dry gas vents, hydrothermal vents and wells. Compositions of volcanic gas plumes are measured directly or by remote sensing. Soil gas fluxes from volcanic areas are measured by accumulation chamber and involve hundreds of measurements to calculate the total emission of a region. Many investigators also collect rock samples from recent or ancient volcanic eruptions. Structural, geochronological, and geophysical data collected during the same or related field campaigns complement these emissions data. All samples and data collected in the field require a set of metadata including date, time, location, sample or measurement id, and descriptive comments. Currently, most of these metadata are written in field notebooks and later transferred into a digital format. Final results such as laboratory analyses of samples and calculated flux data are tabulated for plotting, correlation with other types of data, modeling and finally publication and presentation. Data handling, organization and interpretation could be greatly streamlined by using digital tools available in the field to record metadata, assign an International Geo Sample Number (IGSN), upload measurements directly from field instruments, and arrange sample curation. Available data display tools such as GeoMapApp and existing data sets (PetDB, IRIS, UNAVCO) could be integrated to direct locations for additional measurements during a field campaign. Nearly live display of sampling locations, pictures, and comments could be used as an educational and outreach tool during sampling expeditions. Achieving these goals requires the integration of existing online data resources, with common access through a dedicated web portal.

  9. EpiContactTrace: an R-package for contact tracing during livestock disease outbreaks and for risk-based surveillance

    PubMed Central

    2014-01-01

    Background During outbreak of livestock diseases, contact tracing can be an important part of disease control. Animal movements can also be of relevance for risk-based surveillance and sampling, i.e. both when assessing consequences of introduction or likelihood of introduction. In many countries, animal movement data are collected with one of the major objectives to enable contact tracing. However, often an analytical step is needed to retrieve appropriate information for contact tracing or surveillance. Results In this study, an open source tool was developed to structure livestock movement data to facilitate contact-tracing in real time during disease outbreaks and for input in risk-based surveillance and sampling. The tool, EpiContactTrace, was written in the R-language and uses the network parameters in-degree, out-degree, ingoing contact chain and outgoing contact chain (also called infection chain), which are relevant for forward and backward tracing respectively. The time-frames for backward and forward tracing can be specified independently and search can be done on one farm at a time or for all farms within the dataset. Different outputs are available; datasets with network measures, contacts visualised in a map and automatically generated reports for each farm either in HTML or PDF-format intended for the end-users, i.e. the veterinary authorities, regional disease control officers and field-veterinarians. EpiContactTrace is available as an R-package at the R-project website (http://cran.r-project.org/web/packages/EpiContactTrace/). Conclusions We believe this tool can help in disease control since it rapidly can structure essential contact information from large datasets. The reproducible reports make this tool robust and independent of manual compilation of data. The open source makes it accessible and easily adaptable for different needs. PMID:24636731
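
    The network measures named above can be illustrated on a toy movement list; this Python sketch is not the R package, and it approximates the contact chains simply as the sets of farms reachable through directed movements, ignoring the time windows that EpiContactTrace uses:

```python
import networkx as nx

# Toy livestock movements: (source farm, destination farm)
movements = [("A", "B"), ("B", "C"), ("C", "D"), ("E", "B")]

g = nx.DiGraph()
g.add_edges_from(movements)

farm = "B"
print("in-degree:", g.in_degree(farm))    # farms shipping directly to B
print("out-degree:", g.out_degree(farm))  # farms receiving directly from B

# Simplified contact chains (no time-window filtering):
outgoing_chain = nx.descendants(g, farm)  # farms reachable from B (forward tracing)
ingoing_chain = nx.ancestors(g, farm)     # farms that can reach B (backward tracing)
print("outgoing contact chain:", sorted(outgoing_chain))
print("ingoing contact chain:", sorted(ingoing_chain))
```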

  10. A survey of motif finding Web tools for detecting binding site motifs in ChIP-Seq data

    PubMed Central

    2014-01-01

    Abstract ChIP-Seq (chromatin immunoprecipitation sequencing) has provided an advantage for motif finding, as ChIP-Seq experiments narrow the search down to binding site locations. Recent motif finding tools facilitate motif detection by providing user-friendly Web interfaces. In this work, we reviewed nine motif finding Web tools that are capable of detecting binding site motifs in ChIP-Seq data. We showed that each motif finding Web tool has its own advantages for detecting motifs that other tools may not discover. We recommended that users apply multiple motif finding Web tools that implement different algorithms for obtaining significant motifs, overlapping resembling motifs, and non-overlapping motifs. Finally, we provided our suggestions for future development of motif finding Web tools to better assist researchers in finding motifs in ChIP-Seq data. Reviewers This article was reviewed by Prof. Sandor Pongor, Dr. Yuriy Gusev, and Dr. Shyam Prabhakar (nominated by Prof. Limsoon Wong). PMID:24555784

  11. Interventional MR: vascular applications.

    PubMed

    Smits, H F; Bos, C; van der Weide, R; Bakker, C J

    1999-01-01

    Three strategies for visualisation of MR-dedicated guidewires and catheters have been proposed, namely active tracking, the technique of locally induced field inhomogeneity and passive susceptibility-based tracking. In this article the pros and cons of these techniques are discussed, including the development of MR-dedicated guidewires and catheters, scan techniques, post-processing tools, and display facilities for MR tracking. Finally, some of the results obtained with MR tracking are discussed.

  12. Spatial Thinking and Visualisation of Real-World Concepts using GeoMapApp

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.

    2015-12-01

    Commonly, geoscience data is presented to students in the lab and classroom in the form of data tables, maps and graphs. Successful data interpretation requires learners to become proficient with spatial thinking skills, allowing them to gain insight and understanding of the underlying real-world 3-D processes and concepts. Yet, educators at both the school and university level often witness students having difficulty in performing that translation. As a result, tools and resources that help to bridge that spatial capability gap can have useful application in the educational realm. A free, map-based data discovery and visualisation tool developed with NSF funding at Lamont-Doherty Earth Observatory caters to students and teachers alike by providing a variety of data display and manipulation techniques that enhance geospatial awareness. Called GeoMapApp (http://www.geomapapp.org), the tool provides access to hundreds of built-in authentic geoscience data sets. Examples include earthquake and volcano data, geological maps, lithospheric plate boundary information, geochemical, oceanographic, and environmental data. Barriers to entry are lowered through easy installation, seamless integration of research-grade data sets, intuitive menus, and project-saving continuity. The default base map is a cutting-edge elevation model covering the oceans and land. Dynamic contouring, artificial illumination, 3-D visualisations, data point manipulations, cross-sectional profiles, and other display techniques help students grasp the content and geospatial context of data. Data sets can also be layered for easier comparison. Students may import their own data sets in Excel, ASCII, shapefile, and gridded format, and they can gain a sense of ownership by being able to tailor their data explorations and save their own projects. GeoMapApp is adaptable to a range of learning environments from lab sessions, group projects, and homework assignments to in-class pop-ups. A new Save Session function allows educators to preserve a pre-loaded state of GeoMapApp. When shared with a class, the saved file allows every student to open GeoMapApp at exactly the same starting point from which to begin their data explorations. A wide range of enquiry-driven education modules for GeoMapApp is already available at SERC.

  13. 'Tagger' - a Mac OS X Interactive Graphical Application for Data Inference and Analysis of N-Dimensional Datasets in the Natural Physical Sciences.

    NASA Astrophysics Data System (ADS)

    Morse, P. E.; Reading, A. M.; Lueg, C.

    2014-12-01

    Pattern-recognition in scientific data is not only a computational problem but a human-observer problem as well. Human observation of - and interaction with - data visualization software can augment, select, interrupt and modify computational routines and facilitate processes of pattern and significant feature recognition for subsequent human analysis, machine learning, expert and artificial intelligence systems. 'Tagger' is a Mac OS X interactive data visualisation tool that facilitates Human-Computer interaction for the recognition of patterns and significant structures. It is a graphical application developed using the Quartz Composer framework. 'Tagger' follows a Model-View-Controller (MVC) software architecture: the application problem domain (the model) is to facilitate novel ways of abstractly representing data to a human interlocutor, presenting these via different viewer modalities (e.g. chart representations, particle systems, parametric geometry) to the user (View) and enabling interaction with the data (Controller) via a variety of Human Interface Devices (HID). The software enables the user to create an arbitrary array of tags that may be appended to the visualised data, which are then saved into output files as forms of semantic metadata. Three fundamental problems that are not strongly supported by conventional scientific visualisation software are addressed: 1] How to visually animate data over time, 2] How to rapidly deploy unconventional parametrically driven data visualisations, 3] How to construct and explore novel interaction models that capture the activity of the end-user as semantic metadata that can be used to computationally enhance subsequent interrogation. Saved tagged data files may be loaded into Tagger, so that tags may be tagged, if desired. Recursion opens up the possibility of refining or overlapping different types of tags, tagging a variety of different POIs or types of events, and of capturing different types of specialist observations of important or noticeable events. Other visualisations and modes of interaction will also be demonstrated, with the aim of discovering knowledge in large datasets in the natural, physical sciences. [Fig. 1: Wave height data from an oceanographic Wave Rider Buoy; colors/radii are driven by wave height data.]

  14. Analysis Tool Web Services from the EMBL-EBI.

    PubMed

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
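
    The run/status/result pattern of these REST services can be sketched with a generic HTTP client; the endpoint paths, parameter names and database identifier below are assumptions following that general pattern, and the sample clients linked from the services page remain the authoritative reference:

```python
import time
import requests

# Assumed endpoint layout of the EBI job dispatcher REST interface;
# consult the documented sample clients before relying on these paths.
BASE = "https://www.ebi.ac.uk/Tools/services/rest/ncbiblast"

job_id = requests.post(f"{BASE}/run", data={
    "email": "user@example.org",            # contact address required by the service
    "program": "blastp",
    "database": "uniprotkb_swissprot",      # assumed database identifier
    "stype": "protein",
    "sequence": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
}).text.strip()

while requests.get(f"{BASE}/status/{job_id}").text.strip() == "RUNNING":
    time.sleep(5)                            # poll until the job leaves the RUNNING state

result = requests.get(f"{BASE}/result/{job_id}/out").text
print(result[:500])
```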

  15. Analysis Tool Web Services from the EMBL-EBI

    PubMed Central

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-01-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods. PMID:23671338

  16. Teaching Web 2.0 technologies using Web 2.0 technologies.

    PubMed

    Rethlefsen, Melissa L; Piorun, Mary; Prince, J Dale

    2009-10-01

    The research evaluated participant satisfaction with the content and format of the "Web 2.0 101: Introduction to Second Generation Web Tools" course and measured the impact of the course on participants' self-evaluated knowledge of Web 2.0 tools. The "Web 2.0 101" online course was based loosely on the Learning 2.0 model. Content was provided through a course blog and covered a wide range of Web 2.0 tools. All Medical Library Association members were invited to participate. Participants were asked to complete a post-course survey. Respondents who completed the entire course or who completed part of the course self-evaluated their knowledge of nine social software tools and concepts prior to and after the course using a Likert scale. Additional qualitative information about course strengths and weaknesses was also gathered. Respondents' self-ratings showed a significant change in perceived knowledge for each tool, using a matched pair Wilcoxon signed rank analysis (P<0.0001 for each tool/concept). Overall satisfaction with the course appeared high. Hands-on exercises were the most frequently identified strength of the course; the length and time-consuming nature of the course were considered weaknesses by some. Learning 2.0-style courses, though demanding time and self-motivation from participants, can increase knowledge of Web 2.0 tools.

  17. Why should we publish Linked Data?

    NASA Astrophysics Data System (ADS)

    Blower, Jon; Riechert, Maik; Koubarakis, Manolis; Pace, Nino

    2016-04-01

    We use the Web every day to access information from all kinds of different sources. But the complexity and diversity of scientific data mean that discovering, accessing and interpreting data remains a large challenge for researchers, decision-makers and other users. Different sources of useful information on data, algorithms, instruments and publications are scattered around the Web. How can we link all these things together to help users better understand and exploit earth science data? How can we combine scientific data with other relevant data sources, when standards for describing and sharing data vary so widely between communities? "Linked Data" is a term that describes a set of standards and "best practices" for sharing data on the Web (http://www.w3.org/standards/semanticweb/data). These principles can be summarised as follows: 1. Create unique and persistent identifiers for the important "things" in a community (e.g. datasets, publications, algorithms, instruments). 2. Allow users to "look up" these identifiers on the web to find out more information about them. 3. Make this information machine-readable in a community-neutral format (such as RDF, the Resource Description Framework). 4. Within this information, embed links to other things and concepts and say how these are related. 5. Optionally, provide web service interfaces to allow the user to perform sophisticated queries over this information (using a language such as SPARQL). The promise of Linked Data is that, through these techniques, data will be more discoverable, more comprehensible and more usable by different communities, not just the community that produced the data. As a result, many data providers (particularly public-sector institutions) are now publishing data in this way. However, this area is still in its infancy in terms of real-world applications. Data users need guidance and tools to help them use Linked Data. Data providers need reassurance that the investments they are making in publishing Linked Data will result in tangible user benefits. This presentation will address a number of these issues, using real-world experience gathered from four recent European projects: MELODIES (http://melodiesproject.eu), LEO (http://linkedeodata.eu), CHARMe (http://linkedeodata.eu) and TELEIOS (http://www.earthobservatory.eu). These projects have all applied Linked Data techniques in practical, real-world situations involving the use of diverse data (including earth science data) by both industrial and academic users. Specifically, we will: • Identify a set of practical and valuable uses for Linked Data, focusing on areas where Linked Data fills gaps left by other technologies. These uses include: enabling the discovery of earth science data using mass-market search engines, helping users to understand data and its uses, combining data from multiple sources and enabling the annotation of data by users. • Enumerate some common challenges faced by developers of data-driven services who wish to use Linked Data in their applications. • Describe a new suite of tools for managing, processing and visualising Linked Data in earth science applications (including geospatial Linked Data).
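
    A minimal sketch of principles 1 and 3-5, using rdflib with made-up identifiers for a dataset and an instrument (the in-memory query stands in for a SPARQL endpoint; the vocabulary choices are illustrative only):

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCAT, DCTERMS, RDF

# Hypothetical identifiers for the "things" in a community (principle 1).
EX = Namespace("http://example.org/id/")

g = Graph()
dataset = URIRef(EX["dataset/sst-2015"])
sensor = URIRef(EX["instrument/radiometer-42"])

# Machine-readable, community-neutral description in RDF (principle 3),
# with typed links between things (principle 4).
g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal("Sea surface temperature, 2015")))
g.add((dataset, DCTERMS.source, sensor))

# A simple query of the kind a SPARQL endpoint would answer (principle 5).
q = """
SELECT ?title WHERE {
  ?d <http://purl.org/dc/terms/source> ?s ;
     <http://purl.org/dc/terms/title> ?title .
}
"""
for row in g.query(q):
    print(row.title)
```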

  18. Easy Web Interfaces to IDL Code for NSTX Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    W.M. Davis

    Reusing code is a well-known Software Engineering practice to substantially increase the efficiency of code production, as well as to reduce errors and debugging time. A variety of "Web Tools" for the analysis and display of raw and analyzed physics data are in use on NSTX [1], and new ones can be produced quickly from existing IDL [2] code. A Web Tool with only a few inputs, and which calls an IDL routine written in the proper style, can be created in less than an hour; more typical Web Tools with dozens of inputs, and the need for some adaptation of existing IDL code, can be working in a day or so. Efficiency is also increased for users of Web Tools because of the familiar interface of the web browser, and not needing X-windows, accounts, passwords, etc. Web Tools were adapted for use by PPPL physicists accessing EAST data stored in MDSplus with only a few man-weeks of effort; adapting to additional sites should now be even easier. An overview of Web Tools in use on NSTX, and a list of the most useful features, is also presented.

  19. Diagnosis demystified: CT as diagnostic tool in endodontics

    PubMed Central

    Shruthi, Nagaraja; Sreenivasa Murthy, B V; Sundaresh, K J; Mallikarjuna, Rachappa

    2013-01-01

    Diagnosis in endodontics is usually based on clinical and radiographic presentations, which are only empirical methods. The role of the healing profession is to apply knowledge and skills towards maintaining and restoring the patient's health. Recent advances in imaging technologies have contributed to correct interpretation and diagnosis. CT is proving to be an effective tool in solving endodontic mysteries through its three-dimensional visualisation. CT imaging offers many diagnostic advantages, producing reconstructed images in selected projections with low-contrast resolution far superior to that of all other X-ray imaging modalities. This case report is an endeavour towards effective treatment planning of cases with root fracture and root resorption, using spiral CT as an adjuvant diagnostic tool. PMID:23814212

  20. Web-based routing assistance tool to reduce pavement damage by overweight and oversize vehicles.

    DOT National Transportation Integrated Search

    2016-10-30

    This report documents the results of a completed project titled Web-Based Routing Assistance Tool to Reduce Pavement Damage by Overweight and Oversize Vehicles. The tasks involved developing a Web-based GIS routing assistance tool and evaluating ...

  1. Information visualisation for science and policy: engaging users and avoiding bias.

    PubMed

    McInerny, Greg J; Chen, Min; Freeman, Robin; Gavaghan, David; Meyer, Miriah; Rowland, Francis; Spiegelhalter, David J; Stefaner, Moritz; Tessarolo, Geizi; Hortal, Joaquin

    2014-03-01

    Visualisations and graphics are fundamental to studying complex subject matter. However, beyond acknowledging this value, scientists and science-policy programmes rarely consider how visualisations can enable discovery, create engaging and robust reporting, or support online resources. Producing accessible and unbiased visualisations from complicated, uncertain data requires expertise and knowledge from science, policy, computing, and design. However, visualisation is rarely found in our scientific training, organisations, or collaborations. As new policy programmes develop [e.g., the Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES)], we need information visualisation to permeate increasingly both the work of scientists and science policy. The alternative is increased potential for missed discoveries, miscommunications, and, at worst, creating a bias towards the research that is easiest to display. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Designing and Implementing Web-Based Scaffolding Tools for Technology-Enhanced Socioscientific Inquiry

    ERIC Educational Resources Information Center

    Shin, Suhkyung; Brush, Thomas A.; Glazewski, Krista D.

    2017-01-01

    This study explores how web-based scaffolding tools provide instructional support while implementing a socio-scientific inquiry (SSI) unit in a science classroom. This case study focused on how students used web-based scaffolding tools during SSI activities, and how students perceived the SSI unit and the scaffolding tools embedded in the SSI…

  3. Big Data, Small Data: Accessing and Manipulating Geoscience Data Ranging From Repositories to Student-Collected Data Sets Using GeoMapApp

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.

    2015-12-01

    We often demand information and data to be accessible over the web at no cost, and no longer do we expect to spend time laboriously compiling data from myriad sources with frustratingly different formats. Instead, we increasingly expect convenience and consolidation. Recent advances in web-enabled technologies and cyberinfrastructure are answering those calls by providing data tools and resources that can transform undergraduate education. By freeing up valuable classroom time, students can focus upon gaining deeper insights and understanding from real-world data. GeoMapApp (http://www.geomapapp.org) is a map-based data discovery and visualisation tool developed at Lamont-Doherty Earth Observatory. GeoMapApp promotes U-Learning by working across all major computer platforms and functioning anywhere with internet connectivity, by lowering socio-economic barriers (it is free), by seamlessly integrating thousands of built-in research-grade data sets under intuitive menus, and by being adaptable to a range of learning environments - from lab sessions, group projects, and homework assignments to in-class pop-ups. GeoMapApp caters to casual and specialist users alike. Contours, artificial illumination, 3-D displays, data point manipulations, cross-sectional profiles, and other display techniques help students better grasp the content and geospatial context of data. Layering capabilities allow easy data set comparisons. The core functionality also applies to imported data sets: student-collected data can thus be imported and analysed using the same techniques. A new Save Session function allows educators to preserve a pre-loaded state of GeoMapApp. When shared with a class, the saved file allows every student to open GeoMapApp at exactly the same starting point from which to begin their data explorations. Examples of built-in data sets include seafloor crustal age, earthquake locations and focal mechanisms, analytical geochemistry, ocean water physical properties, US and international geological maps, and satellite imagery. Student-generated data sets can be imported in Excel, ASCII, shapefile, and gridded format. Base maps can be saved for posters and publications. A wide range of undergraduate enquiry-driven education modules for GeoMapApp is already available at SERC.

  4. KnowledgePuzzle: A Browsing Tool to Adapt the Web Navigation Process to the Learner's Mental Model

    ERIC Educational Resources Information Center

    AlAgha, Iyad

    2012-01-01

    This article presents KnowledgePuzzle, a browsing tool for knowledge construction from the web. It aims to adapt the structure of web content to the learner's information needs regardless of how the web content is originally delivered. Learners are provided with a meta-cognitive space (e.g., a concept mapping tool) that enables them to plan…

  5. Older Cancer Patients’ User Experiences With Web-Based Health Information Tools: A Think-Aloud Study

    PubMed Central

    Romijn, Geke; Smets, Ellen M A; Loos, Eugene F; Kunneman, Marleen; van Weert, Julia C M

    2016-01-01

    Background Health information is increasingly presented on the Internet. Several Web design guidelines for older Web users have been proposed; however, these guidelines are often not applied in website development. Furthermore, although we know that older individuals use the Internet to search for health information, we lack knowledge on how they use and evaluate Web-based health information. Objective This study evaluates user experiences with existing Web-based health information tools among older (≥ 65 years) cancer patients and survivors and their partners. The aim was to gain insight into usability issues and the perceived usefulness of cancer-related Web-based health information tools. Methods We conducted video-recorded think-aloud observations for 7 Web-based health information tools, specifically 3 websites providing cancer-related information, 3 Web-based question prompt lists (QPLs), and 1 values clarification tool, with colorectal cancer patients or survivors (n=15) and their partners (n=8) (median age: 73; interquartile range 70-79). Participants were asked to think aloud while performing search, evaluation, and application tasks using the Web-based health information tools. Results Overall, participants perceived Web-based health information tools as highly useful and indicated a willingness to use such tools. However, they experienced problems in terms of usability and perceived usefulness due to difficulties in using navigational elements, shortcomings in the layout, a lack of instructions on how to use the tools, difficulties with comprehensibility, and a large amount of variety in terms of the preferred amount of information. Although participants frequently commented that it was easy for them to find requested information, we observed that the large majority of the participants were not able to find it. Conclusions Overall, older cancer patients appreciate and are able to use cancer information websites. However, this study shows the importance of maintaining awareness of age-related problems such as cognitive and functional decline and navigation difficulties with this target group in mind. The results of this study can be used to design usable and useful Web-based health information tools for older (cancer) patients. PMID:27457709

  6. Planetary plasma data analysis and 3D visualisation tools of the CDPP in the IMPEx infrastructure

    NASA Astrophysics Data System (ADS)

    Gangloff, Michel; Génot, Vincent; Khodachenko, Maxim; Modolo, Ronan; Kallio, Esa; Alexeev, Igor; Al-Ubaidi, Tarek; Scherf, Manuel; André, Nicolas; Bourrel, Nataliya; Budnik, Elena; Bouchemit, Myriam; Dufourg, Nicolas; Beigbeder, Laurent

    2015-04-01

    The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of plasma data products from space missions and ground observatories. Besides these activities, the CDPP has developed services like AMDA (http://amda.cdpp.eu/), which enables in-depth analysis of a large amount of data through dedicated functionalities such as visualisation, conditional search and cataloguing, and 3DView (http://3dview.cdpp.eu/), which provides immersive visualisations in planetary environments and is being further developed to include simulation and observational data. Both tools provide an interface to the IMPEx infrastructure (http://impexfp7.oeaw.ac.at), which facilitates joint access to outputs of simulations (MHD or hybrid models) in planetary sciences from providers like LATMOS and FMI, as well as to planetary plasma observational data provided by the CDPP. Several magnetospheric models are implemented in 3DView (e.g. Tsyganenko for the Earth, and Cain for Mars). Magnetospheric models provided by SINP for the Earth, Jupiter, Saturn and Mercury, as well as Hess models for Jupiter, can also be used in 3DView through the IMPEx infrastructure. A use case demonstrating the new capabilities offered by these tools and their interaction, including magnetospheric models, will be presented, together with the IMPEx simulation metadata model used for the interface to simulation databases and model providers.

  7. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui

    PubMed Central

    2012-01-01

    Background The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. Methods This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Results Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. Conclusions This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications. PMID:22998945

  8. Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui.

    PubMed

    Newton, Richard; Deonarine, Andrew; Wernisch, Lorenz

    2012-09-24

    The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications.

  9. World Wide Web Pages--Tools for Teaching and Learning.

    ERIC Educational Resources Information Center

    Beasley, Sarah; Kent, Jean

    Created to help educators incorporate World Wide Web pages into teaching and learning, this collection of Web pages presents resources, materials, and techniques for using the Web. The first page focuses on tools for teaching and learning via the Web, providing pointers to sites containing the following: (1) course materials for both distance and…

  10. Web-based geo-visualisation of spatial information to support evidence-based health policy: a case study of the development process of HealthTracks.

    PubMed

    Jardine, Andrew; Mullan, Narelle; Gudes, Ori; Cosford, James; Moncrieff, Simon; West, Geoff; Xiao, Jianguo; Yun, Grace; Someford, Peter

    Place is of critical importance to health, as it can reveal patterns of disease spread and clustering, associations with risk factors, and areas with the greatest need for, or least access to, healthcare services and promotion activities. Furthermore, in order to get a good understanding of the health status and needs of a particular area, a broad range of data is required, which can often be difficult and time-consuming to obtain and collate. This process has been expedited by bringing together multiple data sources and making them available in an online geo-visualisation, HealthTracks, which consists of a mapping and a reporting component. The overall aim of the HealthTracks project is to make spatial health information more accessible to policymakers, analysts, planners and program managers to inform decision-making across the Department of Health Western Australia. Preliminary mapping and reporting applications were developed and have been utilised to inform service planning, increase awareness of the utility of spatial information and improve efficiency in data access. The future for HealthTracks involves expanding the range of data available and developing new analytical capabilities, working towards providing external agencies, researchers and eventually the general public with access to rich local-area spatial data.

  11. Mobile Learning Approaches for U.S. Army Training

    DTIC Science & Technology

    2010-08-01

    2.0 tools on smartphones may promote student-centered learning pedagogies (e.g., Cochrane & Bateman, 2010) and provide learners with more fruitful...and effective relationships with their instructors and peers.1 That is, Web 2.0 tools facilitate learners' creative practices, participation...1 Web 1.0 tools focused on presenting information to users whereas Web 2.0 tools focused on providing social networking

  12. Teaching Plate Tectonic Concepts using GeoMapApp Learning Activities

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.; Kluge, S.

    2012-12-01

    GeoMapApp Learning Activities ( http://serc.carleton.edu/geomapapp/collection.html ) can help educators to expose undergraduate students to a range of earth science concepts using high-quality data sets in an easy-to-use map-based interface called GeoMapApp. GeoMapApp Learning Activities require students to interact with and analyse research-quality geoscience data as a means to explore and enhance their understanding of underlying content and concepts. Each activity is freely available through the SERC-Carleton web site and offers step-by-step student instructions and answer sheets. Also provided are annotated educator versions of the worksheets that include teaching tips, additional content and suggestions for further work. The activities can be used "off-the-shelf". Or, since the educator may require flexibility to tailor the activities, the documents are provided in Word format for easy modification. Examples of activities include one on the concept of seafloor spreading that requires students to analyse global seafloor crustal age data to calculate spreading rates in different ocean basins. Another activity has students explore hot spots using radiometric age dating of rocks along the Hawaiian-Emperor seamount chain. A third focusses upon the interactive use of contours and profiles to help students visualise 3-D topography on 2-D computer screens. A fourth activity provides a study of mass wasting as revealed through geomorphological evidence. The step-by-step instructions and guided inquiry approach reduce the need for teacher intervention whilst boosting the time that students can spend on productive exploration and learning. The activities can be used, for example, in a classroom lab with the educator present and as self-paced assignments in an out-of-class setting. GeoMapApp Learning Activities are funded through the NSF GeoEd program and are aimed at students in the introductory undergraduate, community college and high school levels. The activities are based upon GeoMapApp (http://www.geomapapp.org), a free map-based data exploration and visualisation tool that allows students to access a wide range of geoscience data in a virtual lab-like environment.
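
    The spreading-rate calculation behind the first activity described above is simple enough to show directly; the numbers below are illustrative and not taken from the activity:

```python
# Illustrative numbers: seafloor 90 km from the ridge axis with a crustal age of 3 Myr.
distance_km = 90.0
age_myr = 3.0

half_rate = (distance_km * 1e6) / (age_myr * 1e6)  # mm per year (km -> mm, Myr -> yr)
full_rate = 2 * half_rate                          # symmetric spreading on both flanks
print(f"half rate: {half_rate:.0f} mm/yr, full rate: {full_rate:.0f} mm/yr")
# -> half rate: 30 mm/yr, full rate: 60 mm/yr
```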

  13. Map as a Service: A Framework for Visualising and Maximising Information Return from Multi-Modal Wireless Sensor Networks

    PubMed Central

    Hammoudeh, Mohammad; Newman, Robert; Dennett, Christopher; Mount, Sarah; Aldabbas, Omar

    2015-01-01

    This paper presents a distributed information extraction and visualisation service, called the mapping service, for maximising information return from large-scale wireless sensor networks. Such a service would greatly simplify the production of higher-level, information-rich, representations suitable for informing other network services and the delivery of field information visualisations. The mapping service utilises a blend of inductive and deductive models to map sense data accurately using externally available knowledge. It utilises the special characteristics of the application domain to render visualisations in a map format that are a precise reflection of the concrete reality. This service is suitable for visualising an arbitrary number of sense modalities. It is capable of visualising from multiple independent types of the sense data to overcome the limitations of generating visualisations from a single type of sense modality. Furthermore, the mapping service responds dynamically to changes in the environmental conditions, which may affect the visualisation performance by continuously updating the application domain model in a distributed manner. Finally, a distributed self-adaptation function is proposed with the goal of saving more power and generating more accurate data visualisation. We conduct comprehensive experimentation to evaluate the performance of our mapping service and show that it achieves low communication overhead, produces maps of high fidelity, and further minimises the mapping predictive error dynamically through integrating the application domain model in the mapping service. PMID:26378539
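
    As a much simpler stand-in for the inductive part of the mapping service described above, scattered sense readings can be interpolated onto a grid, for example by inverse distance weighting; this is a generic sketch, not the service's own model:

```python
# Generic inverse-distance-weighting sketch for turning scattered sensor
# readings into a gridded "map"; not the paper's mapping service.
readings = [((1.0, 1.0), 21.5), ((4.0, 2.0), 23.0), ((2.0, 5.0), 19.8)]  # ((x, y), value)

def idw(x, y, power=2.0):
    num = den = 0.0
    for (sx, sy), value in readings:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return value                  # exactly on a sensor: use its reading
        w = 1.0 / d2 ** (power / 2.0)     # weight ~ 1 / distance^power
        num += w * value
        den += w
    return num / den

grid = [[idw(x, y) for x in range(6)] for y in range(6)]
for row in grid:
    print(" ".join(f"{v:5.1f}" for v in row))
```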

  14. The Availability of Web 2.0 Tools from Community College Libraries' Websites Serving Large Student Bodies

    ERIC Educational Resources Information Center

    Blummer, Barbara; Kenton, Jeffrey M.

    2014-01-01

    Web 2.0 tools offer academic libraries new avenues for delivering services and resources to students. In this research we report on a content analysis of 100 US community college libraries' Websites for the availability of Web 2.0 applications. We found Web 2.0 tools utilized by 97% of our sample population and many of these sites contained more…

  15. An Assessment of ELINT Exploitation for Situational Awareness Visualisations on Operator Situational Awareness

    DTIC Science & Technology

    2006-10-01

    Report front matter only; no abstract text was recovered. Recoverable figure titles: Figure 1, ELEXSA's Process of Sequential Enrichment; Figure 2, Sample ELEXSA Visualisation; Figure 3, Example Llama/Cheetah Network Layout.

  16. Increasing efficiency of information dissemination and collection through the World Wide Web

    Treesearch

    Daniel P. Huebner; Malchus B. Baker; Peter F. Ffolliott

    2000-01-01

    Researchers, managers, and educators have access to revolutionary technology for information transfer through the World Wide Web (Web). Using the Web to effectively gather and distribute information is addressed in this paper. Tools, tips, and strategies are discussed. Companion Web sites are provided to guide users in selecting the most appropriate tool for searching...

  17. WebMeV | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Web MeV (Multiple-experiment Viewer) is a web/cloud-based tool for genomic data analysis. Web MeV is being built to meet the challenge of exploring large public genomic data sets through an intuitive graphical interface that provides access to state-of-the-art analytical tools.

  18. A web tool for STORET/WQX water quality data retrieval and Best Management Practice scenario suggestion.

    PubMed

    Park, Youn Shik; Engel, Bernie A; Kim, Jonggun; Theller, Larry; Chaubey, Indrajeet; Merwade, Venkatesh; Lim, Kyoung Jae

    2015-03-01

    A Total Maximum Daily Load (TMDL) is a water quality standard used to regulate the water quality of streams, rivers and lakes. A wide range of approaches is currently used to develop TMDLs for impaired streams and rivers. Flow and load duration curves (FDC and LDC) have been used in many states, along with other models and approaches, to evaluate the relationship between flow and pollutant loading. A web-based LDC Tool was developed to facilitate development of FDC and LDC as well as to support other hydrologic analyses. In this study, the FDC and LDC tool was enhanced to allow collection of water quality data via the web and to assist in establishing cost-effective Best Management Practice (BMP) implementations. The enhanced web-based tool can use water quality data not only from the US Geological Survey but also from the US Water Quality Portal, accessed via the web. Moreover, the web-based tool identifies the pollutant reductions required to meet standard loads and suggests a BMP scenario based on the ability of BMPs to reduce pollutant loads and on BMP establishment and maintenance costs. In the study, flow and water quality data were collected via web access to develop the LDC and to identify the required reduction. The suggested BMP scenario from the web-based tool was evaluated using the EPA Spreadsheet Tool for the Estimation of Pollutant Load model to attain the required pollutant reduction at least cost. Copyright © 2014 Elsevier Ltd. All rights reserved.
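
    A minimal sketch of the two curves the tool is built around: a flow duration curve (FDC) from daily discharge, and a load duration curve (LDC) obtained by multiplying each flow by a water quality standard concentration. The synthetic flows, the standard value and the unit conversion are illustrative assumptions, not values from the paper.

```python
# Flow duration curve (FDC) and load duration curve (LDC) from daily discharge.
import numpy as np

def flow_duration_curve(flows_cfs):
    """Return (exceedance probability in %, sorted flows) for daily flows."""
    q = np.sort(np.asarray(flows_cfs, dtype=float))[::-1]          # high to low
    exceedance = 100.0 * np.arange(1, q.size + 1) / (q.size + 1)   # Weibull plotting position
    return exceedance, q

def load_duration_curve(flows_cfs, standard_mg_per_l):
    """Allowable daily load (kg/day) at the standard concentration for each flow."""
    exceedance, q = flow_duration_curve(flows_cfs)
    # cfs -> L/day (1 cfs = 28.3168 L/s, 86400 s/day), then mg -> kg
    load_kg_day = q * 28.3168 * 86400 * standard_mg_per_l * 1e-6
    return exceedance, load_kg_day

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    daily_flow = rng.lognormal(mean=3.0, sigma=0.8, size=365)      # synthetic discharge, cfs
    p, allowable = load_duration_curve(daily_flow, standard_mg_per_l=0.3)  # e.g. a TP standard
    print(f"allowable load near median flow: {allowable[p.searchsorted(50.0)]:.1f} kg/day")
```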

  19. Implementing a social network intervention designed to enhance and diversify support for people with long-term conditions. A qualitative study.

    PubMed

    Kennedy, Anne; Vassilev, Ivaylo; James, Elizabeth; Rogers, Anne

    2016-02-29

    For people with long-term conditions, social networks provide a potentially central means of mobilising, mediating and accessing support for health and well-being. Few interventions address the implementation of improving engagement with and through social networks. This paper describes the development and implementation of a web-based tool which comprises: network mapping, user-centred preference elicitation and needs assessment, and facilitated engagement with resources. The study aimed to determine whether the intervention was acceptable, implementable and acted to enhance support, and to add to theory concerning social networks and engagement with resources and activities. A longitudinal design with 15 case studies used ethnographic methods comprising video, non-participant observation of intervention delivery and qualitative interviews (baseline, 6 and 12 months). Participants were people with type 2 diabetes living in a marginalised island community. Facilitators were local health trainers and care navigators. Analysis applied concepts concerning the implementation of technology for self-management support to explain how new practices of work were operationalised and how the technology impacted on relationships, fitted with everyday life and allowed for visual feedback. Most participants reported identifying and taking up new activities as a result of using the tool. Thematic analysis suggested that the workability of the tool was predicated on disruption and reconstruction of networks, challenging/supportive facilitation, and change and reflection over time concerning network support. Visualisation of the network enabled people to mobilise support and engage in new activities. The tool aligned synergistically with the facilitators' role of linking people to local resources. The social network tool works through a process of initiating positive disruption of established self-management practice through mapping of, and reflection on, personal network membership and support. This opens up possibilities for reconstructing self-management differently from current practice. Key facets of successful implementation were: the visual maps of networks and support options; facilitation characterised by a perceived lack of status difference, which assisted engagement and constructive discussion of support and preferences for activities; and background work (a reliable database, tailored preferences, option reduction) for facilitator and user ease of use.

  20. Using Web-Based Technologies and Tools in Future Choreographers' Training: British Experience

    ERIC Educational Resources Information Center

    Bidyuk, Dmytro

    2016-01-01

    In the paper the problem of using effective web-based technologies and tools in teaching choreography in British higher education institutions has been discussed. Researches on the usage of web-based technologies and tools for practical dance courses in choreographers' professional training at British higher education institutions by such British…

  1. A WebGL Tool for Visualizing the Topology of the Sun's Coronal Magnetic Field

    NASA Astrophysics Data System (ADS)

    Duffy, A.; Cheung, C.; DeRosa, M. L.

    2012-12-01

    We present a web-based, topology-viewing tool that allows users to visualize the geometry and topology of the Sun's 3D coronal magnetic field in an interactive manner. The tool is implemented using open-source, mature, modern web technologies including WebGL, jQuery, HTML 5, and CSS 3, which are compatible with nearly all modern web browsers. As opposed to the traditional method of visualization, which involves the downloading and setup of various software packages (proprietary and otherwise), the tool presents a clean interface that allows the user to easily load and manipulate the model, while also offering great power to choose which topological features are displayed. The tool accepts data encoded in the JSON open format, which has libraries available for nearly every major programming language, making it simple to generate the data.
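
    The abstract notes that the viewer ingests JSON but does not give the schema, so the field names below ("fieldlines", "polarity", "points") are hypothetical; the sketch only illustrates serialising a set of 3-D field-line polylines to JSON so that a WebGL client could fetch them.

```python
# Serialise toy 3-D field lines into a JSON file a web client could load.
import json
import math

def make_fieldline(n_points=50, radius=1.0, height=0.5):
    """A toy helical field line sampled as a list of [x, y, z] vertices."""
    return [[radius * math.cos(t), radius * math.sin(t), height * t / (2 * math.pi)]
            for t in (2 * math.pi * i / (n_points - 1) for i in range(n_points))]

model = {
    "model": "PFSS demo",                      # hypothetical metadata field
    "fieldlines": [
        {"polarity": "closed", "points": make_fieldline(radius=1.1)},
        {"polarity": "open", "points": make_fieldline(radius=1.5, height=2.0)},
    ],
}

with open("fieldlines.json", "w") as fh:
    json.dump(model, fh)                       # ready to be fetched by the browser client
```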

  2. Web3DMol: interactive protein structure visualization based on WebGL.

    PubMed

    Shi, Maoxiang; Gao, Juntao; Zhang, Michael Q

    2017-07-03

    A growing number of web-based databases and tools for protein research are being developed. There is now a widespread need for visualization tools to present the three-dimensional (3D) structure of proteins in web browsers. Here, we introduce our 3D modeling program, Web3DMol, a web application focusing on protein structure visualization in modern web browsers. Users submit a PDB identification code or select a PDB archive from their local disk, and Web3DMol will display and allow interactive manipulation of the 3D structure. Featured functions, such as sequence plot, fragment segmentation, measure tool and meta-information display, are offered for users to gain a better understanding of protein structure. Easy-to-use APIs are available for developers to reuse and extend Web3DMol. Web3DMol can be freely accessed at http://web3dmol.duapp.com/, and the source code is distributed under the MIT license. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
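
    Web3DMol takes a PDB identification code; below is a minimal sketch of fetching the corresponding structure file by code so that it can also be opened from local disk. The RCSB download URL pattern used here is an assumption and is not part of the paper.

```python
# Download a PDB entry by its four-character identification code.
import urllib.request

def fetch_pdb(pdb_id, dest=None):
    """Save the PDB file for the given entry code and return the local path."""
    url = f"https://files.rcsb.org/download/{pdb_id.upper()}.pdb"   # assumed URL pattern
    dest = dest or f"{pdb_id.upper()}.pdb"
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        out.write(resp.read())
    return dest

path = fetch_pdb("1crn")     # crambin, a small test structure
print("saved", path)         # the file can now be loaded into the viewer from local disk
```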

  3. 78 FR 54241 - Proposed Information Collection; Comment Request; BroadbandMatch Web Site Tool

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-03

    ... Information Collection; Comment Request; BroadbandMatch Web Site Tool AGENCY: National Telecommunications and... goal of increased broadband deployment and use in the United States. The BroadbandMatch Web site began... empowering technology effectively. II. Method of Collection BroadbandMatch users access the Web site through...

  4. Digital Discernment: An E-Commerce Web Site Evaluation Tool

    ERIC Educational Resources Information Center

    Sigman, Betsy Page; Boston, Brian J.

    2013-01-01

    Students entering the business workforce today may well share some responsibility for developing, revising, or evaluating their company's Web site. They may lack the experience, however, to critique their employer's Web presence effectively. The purpose of developing Digital Discernment, an e-commerce Web site evaluation tool, was to prepare…

  5. Ondex Web: web-based visualization and exploration of heterogeneous biological networks.

    PubMed

    Taubert, Jan; Hassani-Pak, Keywan; Castells-Brooke, Nathalie; Rawlings, Christopher J

    2014-04-01

    Ondex Web is a new web-based implementation of the network visualization and exploration tools from the Ondex data integration platform. New features such as context-sensitive menus and annotation tools provide users with intuitive ways to explore and manipulate the appearance of heterogeneous biological networks. Ondex Web is open source, written in Java and can be easily embedded into Web sites as an applet. Ondex Web supports loading data from a variety of network formats, such as XGMML, NWB, Pajek and OXL. http://ondex.rothamsted.ac.uk/OndexWeb.

  6. Risk Management Collaboration through Sharing Interactive Graphics

    NASA Astrophysics Data System (ADS)

    Slingsby, Aidan; Dykes, Jason; Wood, Jo; Foote, Matthew

    2010-05-01

    Risk management involves the cooperation of scientists, underwriters and actuaries, all of whom analyse data to support decision-making. Results are often disseminated through static documents with graphics that convey the message the analyst wishes to communicate. Interactive graphics are an increasingly popular means of communicating the results of data analyses because they enable other parties to explore and visually analyse some of the data themselves prior to and during discussion. Discussion around interactive graphics can occur synchronously in face-to-face meetings or with video-conferencing and screen sharing, or it can occur asynchronously through web sites such as ManyEyes, web-based fora, blogs, wikis and email. A limitation of approaches that do not involve screen sharing is the difficulty of sharing the results of insights from interacting with the graphic. Static images can be shared, but these cannot themselves be interacted with, producing a discussion bottleneck (Baker, 2008). We address this limitation by allowing the state and configuration of graphics to be shared (rather than static images) so that a user can reproduce someone else's graphic, interact with it and then share the results, accompanied by some commentary. HiVE (Slingsby et al, 2009) is a compact and intuitive text-based language that has been designed for this purpose. We will describe the vizTweets project (a 9-month project funded by JISC) in which we are applying these principles to insurance risk management in the context of the Willis Research Network, the world's largest collaboration between the insurance industry and academia. The project aims to extend HiVE to meet the needs of the sector, to design and implement freely available web services and tools, and to provide case studies. We will present a case study that demonstrates the potential of this approach for collaboration within the Willis Research Network. Baker, D. Towards Transparency in Visualisation Based Research. AHRC ICT Methods Network Expert Workshop. Available at http://www.viznet.ac.uk/documents Slingsby, A., Dykes, J. and Wood, J. 2009. Configuring Hierarchical Layouts to Address Research Questions. IEEE Transactions on Visualization and Computer Graphics 15 (6), Nov-Dec 2009, pp977-984.

  7. SOCIB applications for oceanographic data management

    NASA Astrophysics Data System (ADS)

    Troupin, Charles; Pau Beltran, Joan; Frontera, Biel; Gómara, Sonia; Lora, Sebastian; March, David; Sebastian, Kristian; Tintoré, Joaquin

    2015-04-01

    The Balearic Islands Coastal Ocean Observing and Forecasting System (SOCIB, http://www.socib.es) is a multi-platform Marine Research Infrastructure that provides free, open and quality-controlled data from near-shore to the open sea. To collect the necessary data, the SOCIB system is made up of: a research vessel, a high-frequency (HF) radar system, weather stations, tide gauges, moorings, drifting buoys, ARGO profilers, and gliders (autonomous underwater vehicles). In addition, the system has recently begun incorporating oceanographic sensors attached to sea turtles. High-resolution numerical models provide forecasts for hydrodynamics (ROMS) and waves (SAPO). According to SOCIB principles, data have to be: discoverable and accessible; freely available; interoperable; and quality-controlled and standardized. The Data Centre (DC) manages the different steps of data processing, including acquisition from SOCIB platforms (gliders, drifters, HF radar, ...), numerical models (hydrodynamics, waves, ...) or other data sources; distribution through dedicated web and mobile applications; and dynamic visualisation. The SOCIB DC constitutes an example of marine information systems within the framework of new coastal ocean observatories. In this work we present some of the applications developed for specific types of users, as well as the technologies used for their implementation: DAPP (Deployments application, http://apps.socib.es/dapp/), a web application to display information related to mobile platform trajectories. LW4NC2 (http://thredds.socib.es/lw4nc2), a web application for multidimensional (grid) data from NetCDF files (numerical models, HF radar). SACOSTA (http://gis.socib.es/sacosta), a viewer for cartographic data such as environmental sensitivity of the coastline. SEABOARD (http://seaboard.socib.es), a tool to disseminate SOCIB real-time data to different types of users. Smart-phone apps to access data, platform trajectories and forecasts in real time. In keeping with the objective of bringing relevant data to all kinds of users in a free and easy way, our future plans include the redesign of the applications to improve the user experience, along with the creation of applications specific to different groups of users, including tourists, sailors, surfers, and others.
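
    A minimal sketch of the kind of access a tool like LW4NC2 wraps: reading a gridded NetCDF product (a local file or an OPeNDAP/THREDDS URL) and extracting a time series at one grid point. The file path and the variable names ("temp", "time") are placeholders, not actual SOCIB catalogue entries.

```python
# Extract a single-point time series from a gridded NetCDF product.
from netCDF4 import Dataset, num2date

def point_series(path, var="temp", iy=0, ix=0):
    """Return (dates, values) for one grid cell, assuming a (time, y, x) layout."""
    with Dataset(path) as ds:                   # works for local files and OPeNDAP URLs
        t = ds.variables["time"]
        dates = num2date(t[:], t.units)         # convert numeric time to datetimes
        values = ds.variables[var][:, iy, ix]
        return dates, values

# Placeholder file name; substitute a real dataset path or OPeNDAP URL.
dates, temp = point_series("model_output.nc", var="temp", iy=10, ix=20)
print(dates[0], float(temp[0]))
```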

  8. An open-source software platform for data management, visualisation, model building and model sharing in water, energy and other resource modelling domains.

    NASA Astrophysics Data System (ADS)

    Knox, S.; Meier, P.; Mohammed, K.; Korteling, B.; Matrosov, E. S.; Hurford, A.; Huskova, I.; Harou, J. J.; Rosenberg, D. E.; Thilmant, A.; Medellin-Azuara, J.; Wicks, J.

    2015-12-01

    Capacity expansion of resource networks is essential to adapting to economic and population growth and to pressures such as climate change. Engineered infrastructure systems such as water, energy, or transport networks require sophisticated and bespoke models to refine management and investment strategies. Successful modeling of such complex systems relies on good data management and advanced methods to visualize and share data. Engineered infrastructure systems are often represented as networks of nodes and links with operating rules describing their interactions. Infrastructure system management and planning can be abstracted to simulating or optimizing new operations and extensions of the network. By separating the data storage of abstract networks from manipulation and modeling, we have created a system in which infrastructure modeling across various domains is facilitated. We introduce Hydra Platform, a Free Open Source Software designed for analysts and modelers to store, manage and share network topology and data. Hydra Platform is a Python library with a web service layer for remote applications, called Apps, to connect to. Apps serve various functions including network or results visualization, data export (e.g. into a proprietary format) or model execution. This client-server architecture allows users to manipulate and share centrally stored data. XML templates allow a standardised description of the data structure required for storing network data such that it is compatible with specific models. Hydra Platform represents networks in an abstract way and is therefore not bound to a single modeling domain. It is the Apps that create domain-specific functionality. Using Apps, researchers from different domains can incorporate different models within the same network, enabling cross-disciplinary modeling while minimizing errors and streamlining data sharing. Separating the Python library from the web layer allows developers to natively expand the software or build web-based apps in other languages for remote functionality. Partner CH2M is developing a commercial user interface for Hydra Platform; however, custom interfaces and visualization tools can also be built. Hydra Platform is available on GitHub, while Apps will be shared on a central repository.
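
    The platform's core idea is storing an abstract node-link network, with attribute data, separately from any domain model. Below is a minimal sketch of such a representation as plain dictionaries serialised to JSON; the field names are illustrative and do not reproduce Hydra Platform's actual schema or API.

```python
# An abstract node-link network with attributes, serialised to JSON so that any
# domain "App" (simulator, optimiser, visualiser) could consume the same data.
import json

network = {
    "name": "demo water system",
    "nodes": [
        {"id": 1, "name": "reservoir", "x": 0.0, "y": 0.0,
         "attributes": {"storage_capacity_Mm3": 120.0}},
        {"id": 2, "name": "city demand", "x": 10.0, "y": 2.0,
         "attributes": {"demand_Mm3_per_yr": 45.0}},
    ],
    "links": [
        {"id": 10, "from": 1, "to": 2, "name": "main canal",
         "attributes": {"max_flow_m3s": 30.0}},
    ],
}

print(json.dumps(network, indent=2)[:200])   # ready to be stored or sent to a service
```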

  9. CellTree: an R/bioconductor package to infer the hierarchical structure of cell populations from single-cell RNA-seq data.

    PubMed

    duVerle, David A; Yotsukura, Sohiya; Nomura, Seitaro; Aburatani, Hiroyuki; Tsuda, Koji

    2016-09-13

    Single-cell RNA sequencing is fast becoming one of the standard methods for gene expression measurement, providing unique insights into cellular processes. A number of methods, based on general dimensionality reduction techniques, have been suggested to help infer and visualise the underlying structure of cell populations from single-cell expression levels, yet their models generally lack proper biological grounding and struggle to identify complex differentiation paths. Here we introduce cellTree: an R/Bioconductor package that uses a novel statistical approach, based on document analysis techniques, to produce tree structures outlining the hierarchical relationship between single-cell samples, while identifying latent groups of genes that can provide biological insights. With cellTree, we provide experimentalists with an easy-to-use tool, based on statistically and biologically sound algorithms, to efficiently explore and visualise single-cell RNA data. The cellTree package is publicly available in the online Bioconductor repository at: http://bioconductor.org/packages/cellTree/ .

  10. Qualitative analysis of precipitation distribution in Poland with the use of different data sources

    NASA Astrophysics Data System (ADS)

    Walawender, J.; Dyras, I.; Łapeta, B.; Serafin-Rek, D.; Twardowski, A.

    2008-04-01

    Geographical Information Systems (GIS) can be used to integrate data from different sources and in different formats to perform innovative spatial and temporal analysis. GIS can also be applied in climatic research to manage, investigate and display all kinds of weather data. The main objective of this study is to demonstrate that GIS is a useful tool for examining and visualising precipitation distributions obtained from different data sources: ground measurements, satellite and radar data. Three selected days (30 cases) with convective rainfall situations were analysed. Firstly, a scalable GRID-based approach was applied to store data from the three different sources in a comparable layout. Then, a geoprocessing algorithm was created within the ArcGIS 9.2 environment. The algorithm included: GRID definition, reclassification and raster algebra. All of the calculations and procedures were performed automatically. Finally, contingency tables and pie charts were created to show the relationship between ground measurements and both satellite- and radar-derived data. The results were visualised on maps.
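
    A minimal sketch of the two raster steps mentioned (reclassification and raster algebra) followed by a contingency table comparing a gauge-based grid with a radar-derived grid, using synthetic numpy arrays in place of the ArcGIS workflow; the thresholds and values are illustrative only.

```python
# Reclassify two precipitation rasters into classes and cross-tabulate them.
import numpy as np

def reclassify(grid, thresholds=(0.1, 1.0, 5.0)):
    """Turn precipitation values into classes: 0 = dry, 1 = light, 2 = moderate, 3 = heavy."""
    return np.digitize(grid, thresholds)

rng = np.random.default_rng(2)
gauge = rng.gamma(shape=0.6, scale=2.0, size=(100, 100))    # interpolated ground measurements
radar = np.clip(gauge * rng.normal(1.0, 0.3, size=gauge.shape), 0, None)  # noisy radar estimate

g_cls, r_cls = reclassify(gauge), reclassify(radar)

# Contingency table: rows = gauge class, columns = radar class.
table = np.zeros((4, 4), dtype=int)
np.add.at(table, (g_cls.ravel(), r_cls.ravel()), 1)
print(table)
print("agreement:", np.trace(table) / table.sum())
```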

  11. A Document Visualization Tool Customized to Explore DRDC Reports (Un outil de visualisation de document concu precisement pour explorer les rapports de RDDC)

    DTIC Science & Technology

    2011-08-01

    Fragment recovered from the report (on document exploration in the context of flight simulators): Suppose a commander at CFB Shearwater wanted to find out more about how best to deal with issues of pilots' motion sickness in the flight simulator on base. As a first step, one would enter "motion sickness" as a query in HanDles, and get the relevant documents returned.

  12. The effectiveness of cartographic visualisations in landscape archaeology

    NASA Astrophysics Data System (ADS)

    Fairbairn, David

    2018-05-01

    The use of maps and other geovisualisation methods is longstanding in archaeology. Archaeologists employ advanced contemporary tools in their data collection, analysis and presentation. Maps can be used to render the `big data' commonly collected by archaeological prospection techniques, but are also fundamental output instruments for the dissemination of archaeological interpretation and modelling. This paper addresses, through case studies, alternative methods of geovisualisation in archaeology and identifies the efficiencies of each.

  13. System Testing of Desktop and Web Applications

    ERIC Educational Resources Information Center

    Slack, James M.

    2011-01-01

    We want our students to experience system testing of both desktop and web applications, but the cost of professional system-testing tools is far too high. We evaluate several free tools and find that AutoIt makes an ideal educational system-testing tool. We show several examples of desktop and web testing with AutoIt, starting with simple…

  14. Providing Knowledge Recommendations: An Approach for Informal Electronic Mentoring

    ERIC Educational Resources Information Center

    Colomo-Palacios, Ricardo; Casado-Lumbreras, Cristina; Soto-Acosta, Pedro; Misra, Sanjay

    2014-01-01

    The use of Web 2.0 technologies for knowledge management is invading the corporate sphere. The Web 2.0 is the most adopted knowledge transfer tool within knowledge intensive firms and is starting to be used for mentoring. This paper presents IM-TAG, a Web 2.0 tool, based on semantic technologies, for informal mentoring. The tool offers…

  15. On Recommending Web 2.0 Tools to Personalise Learning

    ERIC Educational Resources Information Center

    Juškeviciene, Anita; Kurilovas, Eugenijus

    2014-01-01

    The paper aims to present research results on using Web 2.0 tools for learning personalisation. In the work, personalised Web 2.0 tools selection method is presented. This method takes into account student's learning preferences for content and communication modes tailored to the learning activities with a view to help the learner to quickly and…

  16. Web-based remote sensing of building energy performance

    NASA Astrophysics Data System (ADS)

    Martin, William; Nassiopoulos, Alexandre; Le Cam, Vincent; Kuate, Raphaël; Bourquin, Frédéric

    2013-04-01

    The present paper describes the design and the deployment of an instrumentation system enabling the energy monitoring of a building in a smart-grid context. The system is based on a network of wireless low-power IPv6 sensors. Ambient temperature and electrical power for heating are measured. The management, storage, visualisation and treatment of the data are done through a web-based application that can be deployed as an online web service. The same web-based framework enables the acquisition of distant measured data such as those coming from a nearby weather station. On-site sensor and weather station data are then treated using inverse identification methods. The algorithms aim at determining the parameters of a numerical model suitable for short-time-horizon prediction of the indoor climate. The model is based on standard multi-zone modelling assumptions and takes into account solar, airflow and conductive transfers. It was specially designed to accurately render the inertia effects that are used in a demand-response strategy. All the hardware and software technologies used in the system are open and low cost, so that they comply with the constraints of on-site deployment in buildings. The measured data as well as the model predictions can be accessed ubiquitously through the web. This feature makes it possible to consider a wide range of energy management applications at the district, city or national level. The entire system has been deployed and tested in an experimental office building in Angers, France. It demonstrates the potential of ICT technologies to enable remotely controlled monitoring and surveillance in real time.
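
    A minimal sketch of the inverse-identification step: fitting a first-order (single resistance/capacitance) indoor-temperature model to measured data with ordinary least squares. This is a generic illustration under simplified assumptions, not the multi-zone model of the paper; the synthetic series stand in for the sensor and weather-station feeds.

```python
# Identify the parameters of a simple indoor-temperature model by least squares.
import numpy as np

rng = np.random.default_rng(3)
n = 24 * 7                                       # one week of hourly samples
t_out = 5 + 5 * np.sin(2 * np.pi * np.arange(n) / 24)   # outdoor temperature, degC
heating = rng.uniform(0, 2000, n)                        # heating power, W

# "True" parameters used to simulate the indoor temperature we pretend was measured.
a_true, b_true = 0.08, 4e-5                      # a ~ dt/(R*C), b ~ dt/C
t_in = np.empty(n)
t_in[0] = 19.0
for k in range(n - 1):
    t_in[k + 1] = t_in[k] + a_true * (t_out[k] - t_in[k]) + b_true * heating[k]
t_meas = t_in + rng.normal(0, 0.05, n)           # add sensor noise

# Inverse identification: regress the temperature increment on its two drivers.
y = t_meas[1:] - t_meas[:-1]
X = np.column_stack([t_out[:-1] - t_meas[:-1], heating[:-1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a_hat, b_hat = coef
print(f"a={a_hat:.3f} (true {a_true}), b={b_hat:.2e} (true {b_true:.0e})")
```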

  17. Collaboratively Conceived, Designed and Implemented: Matching Visualization Tools with Geoscience Data Collections and Geoscience Data Collections with Visualization Tools via the ToolMatch Service.

    NASA Astrophysics Data System (ADS)

    Hoebelheinrich, N. J.; Lynnes, C.; West, P.; Ferritto, M.

    2014-12-01

    Two problems common to many geoscience domains are the difficulty of finding tools to work with a given dataset collection and, conversely, the difficulty of finding data for a known tool. A collaborative team from the Earth Science Information Partnership (ESIP) has come together to design and create a web service, called ToolMatch, to address these problems. The team began their efforts by defining an initial, relatively simple conceptual model that addresses the two use cases briefly described above. The conceptual model is expressed as an ontology using OWL (Web Ontology Language) and DCterms (Dublin Core Terms), and utilizing standard ontologies such as DOAP (Description of a Project), FOAF (Friend of a Friend), SKOS (Simple Knowledge Organization System) and DCAT (Data Catalog Vocabulary). The ToolMatch service will take advantage of various Semantic Web and Web standards, such as OpenSearch, RESTful web services, SWRL (Semantic Web Rule Language) and SPARQL (Simple Protocol and RDF Query Language). The first version of the ToolMatch service was deployed in early fall 2014. While more complete testing is required, a number of communities besides ESIP member organizations have expressed interest in collaborating to create, test and use the service and incorporate it into their own web pages, tools and/or services, including the USGS Data Catalog service, DataONE, the Deep Carbon Observatory, the Virtual Solar Terrestrial Observatory (VSTO), and the U.S. Global Change Research Program. In this session, presenters will discuss the inception and development of the ToolMatch service, the collaborative process used to design, refine, and test the service, and future plans for the service.
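
    Since the service is described as built on RDF vocabularies and SPARQL, the sketch below shows the kind of query a client could issue ("which tools can read a given data format?"). The endpoint URL and the toolmatch: vocabulary terms are hypothetical stand-ins, not the deployed service's actual interface.

```python
# Issue a SPARQL SELECT query over HTTP and read the JSON results.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://example.org/toolmatch/sparql"      # placeholder endpoint
QUERY = """
PREFIX toolmatch: <https://example.org/toolmatch#>
SELECT ?tool WHERE {
  ?tool a toolmatch:Tool ;
        toolmatch:readsFormat "NetCDF" .
}
"""

def run_query(endpoint, query):
    url = endpoint + "?" + urllib.parse.urlencode({"query": query})
    req = urllib.request.Request(url, headers={"Accept": "application/sparql-results+json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# results = run_query(ENDPOINT, QUERY)   # uncomment against a real endpoint
# print([b["tool"]["value"] for b in results["results"]["bindings"]])
```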

  18. Selecting a Free Web-Hosted Survey Tool for Student Use

    ERIC Educational Resources Information Center

    Elbeck, Matt

    2014-01-01

    This study provides marketing educators a review of free web-based survey services and guidance for student use. A mixed methods approach started with online searches and metrics identifying 13 free web-hosted survey services, described as demonstration or project tools, and ranked using popularity and importance web-based metrics. For each…

  19. Student Inquiry and Web 2.0

    ERIC Educational Resources Information Center

    Berger, Pam

    2010-01-01

    Web 2.0 applications are changing how educators interact both with each other and with their students. Educators can use these new Web tools daily to create, share, socialize, and collaborate with students, colleagues, and newly developed network contacts. School librarians are finding that Web 2.0 tools are bringing them more ways to embrace and…

  20. Students' Reaction to WebCT: Implications for Designing On-Line Learning Environments

    ERIC Educational Resources Information Center

    Osman, Mohamed Eltahir

    2005-01-01

    There is a growing number of web-based and web-assisted course development tools and products that can be used to create on-line learning environment. The utility of these products, however, varies greatly depending on their feasibility, prerequisite infrastructure, technical features, interface, and course development and management tools. WebCT…

  1. 3-D interactive visualisation tools for HI spectral line imaging

    NASA Astrophysics Data System (ADS)

    van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.

    2017-06-01

    Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.
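
    A minimal sketch of the kind of 3-D operations mentioned (filtering and selection) applied to an HI-like spectral cube, using a synthetic numpy array in place of real survey data; SlicerAstro itself performs these steps interactively inside 3DSlicer.

```python
# 3-D filter, threshold and label a synthetic spectral cube.
import numpy as np
from scipy.ndimage import gaussian_filter, label

rng = np.random.default_rng(4)
cube = rng.normal(0, 1.0, size=(64, 128, 128))          # (velocity, y, x) noise cube
cube[28:36, 60:70, 60:70] += 4.0                         # bury a fake source in it

smoothed = gaussian_filter(cube, sigma=(2, 3, 3))        # 3-D filtering step
mask = smoothed > 5 * smoothed.std()                     # 3-D selection by thresholding
labels, n_sources = label(mask)                          # connected 3-D regions

print(f"{n_sources} candidate source(s), {mask.sum()} selected voxels")
```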

  2. Standards opportunities around data-bearing Web pages.

    PubMed

    Karger, David

    2013-03-28

    The evolving Web has seen ever-growing use of structured data, thanks to the way it enhances information authoring, querying, visualization and sharing. To date, however, most structured data authoring and management tools have been oriented towards programmers and Web developers. End users have been left behind, unable to leverage structured data for information management and communication as well as professionals. In this paper, I will argue that many of the benefits of structured data management can be provided to end users as well. I will describe an approach and tools that allow end users to define their own schemas (without knowing what a schema is), manage data and author (not program) interactive Web visualizations of that data using the Web tools with which they are already familiar, such as plain Web pages, blogs, wikis and WYSIWYG document editors. I will describe our experience deploying these tools and some lessons relevant to their future evolution.

  3. An exploration of counterfeit medicine surveillance strategies guided by geospatial analysis: lessons learned from counterfeit Avastin detection in the US drug supply chain.

    PubMed

    Cuomo, Raphael E; Mackey, Tim K

    2014-12-02

    To explore healthcare policy and system improvements that would more proactively respond to future penetration of counterfeit cancer medications into the US drug supply chain, using geospatial analysis. A statistical and geospatial analysis of areas that received notices from the Food and Drug Administration (FDA) about the possibility of counterfeit Avastin penetrating the US drug supply chain. Data from FDA warning notices were compared to data for 44 demographic variables available from the US Census Bureau via correlation, means testing and geospatial visualisation. Results were interpreted in light of the existing literature in order to recommend improvements to the surveillance of counterfeit medicines. This study analysed 791 distinct healthcare provider addresses that received FDA warning notices across 30,431 zip codes in the USA. Statistical outputs were Pearson's correlation coefficients and t values. Geospatial outputs were cartographic visualisations. These data were used to generate the overarching study outcome, which was a recommendation for a drug safety surveillance strategy congruent with the existing literature on counterfeit medication. Zip codes with greater numbers of individuals aged 65+ and greater numbers of ethnic white individuals were most correlated with receipt of a counterfeit Avastin notice. Geospatial visualisations designed in conjunction with statistical analysis of demographic variables appeared more capable of suggesting areas and populations that may be at risk for undetected counterfeit Avastin penetration. This study suggests that the dual incorporation of statistical and geospatial analysis in the surveillance of counterfeit medicine may be helpful in guiding efforts to prevent, detect and visualise counterfeit medicine penetration of the US drug supply chain and other settings. Importantly, the information generated by these analyses could be utilised to identify at-risk populations associated with demographic characteristics. Stakeholders should explore these results as another tool to improve counterfeit medicine surveillance. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
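
    A minimal sketch of the statistical half of the analysis: correlating, across zip codes, the count of notice recipients with a demographic variable. The arrays below are synthetic stand-ins for the FDA notice addresses and US Census variables used in the study.

```python
# Pearson correlation between a zip-code-level demographic variable and notice counts.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(5)
n_zips = 1000
pct_age_65_plus = rng.uniform(5, 35, n_zips)             # demographic variable per zip code
# Synthetic notice counts that loosely increase with the demographic variable.
notice_count = rng.poisson(0.02 * pct_age_65_plus)

r, p_value = pearsonr(pct_age_65_plus, notice_count)
print(f"Pearson r = {r:.2f}, p = {p_value:.3g}")
```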

  4. DataView: a computational visualisation system for multidisciplinary design and analysis

    NASA Astrophysics Data System (ADS)

    Wang, Chengen

    2016-01-01

    Rapidly processing raw data and effectively extracting the underlying information from huge volumes of multivariate data have become essential to all decision-making processes in sectors such as finance, government, medical care, climate analysis, industry and science. Remarkably, visualisation is recognised as a fundamental technology that supports human comprehension, cognition and utilisation of burgeoning amounts of heterogeneous data. This paper presents a computational visualisation system, named DataView, which has been developed for graphically displaying and capturing the outcomes of the multiphysics problem-solvers widely used in engineering fields. DataView is functionally composed of techniques for table/diagram representation and for the graphical illustration of scalar, vector and tensor fields. The field visualisation techniques are implemented on the basis of a range of linear and non-linear meshes, which flexibly adapt to the disparate data representation schemas adopted by a variety of disciplinary problem-solvers. The visualisation system has been successfully applied to a number of engineering problems, some illustrations of which are presented to demonstrate the effectiveness of the visualisation techniques.

  5. Immersive 3D geovisualisation in higher education

    NASA Astrophysics Data System (ADS)

    Philips, Andrea; Walz, Ariane; Bergner, Andreas; Graeff, Thomas; Heistermann, Maik; Kienzler, Sarah; Korup, Oliver; Lipp, Torsten; Schwanghart, Wolfgang; Zeilinger, Gerold

    2014-05-01

    Through geovisualisation we explore spatial data, we analyse it towards specific questions, we synthesise results, and we present and communicate them to a specific audience (MacEachren & Kraak 1997). After centuries of paper maps, the means to represent and visualise our physical environment and its abstract qualities have changed dramatically since the 1990s, and so have the methods for using geovisualisation in teaching. Whereas some people might still consider the traditional classroom the ideal setting for teaching and learning geographic relationships and their mapping, we used a 3D CAVE (computer-animated virtual environment) as the environment for a problem-oriented learning project called "GEOSimulator". Focussing on this project, we empirically investigated whether such a technological advance as the CAVE makes 3D visualisation, including 3D geovisualisation, an important tool not only for businesses (Abulrub et al. 2012) and for the public (Wissen et al. 2008), but also for educational purposes, for which it had hardly been used yet. The 3D CAVE is a three-sided visualisation platform that allows for immersive and stereoscopic visualisation of observed and simulated spatial data. We examined the benefits of immersive 3D visualisation for geographic research and education and synthesised three fundamental technology-based visual aspects: First, the conception and comprehension of space and location do not need to be generated, but are instantaneously and intuitively present through stereoscopy. Second, optical immersion into virtual reality strengthens this spatial perception, which is particularly important for complex 3D geometries. And third, a significant benefit is interactivity, which is enhanced through immersion and allows for multi-discursive and dynamic data exploration and knowledge transfer. Based on our problem-oriented learning project, which concentrates on a case study of flood risk management at the Wilde Weisseritz in Germany, a river that contributed significantly to the hundred-year flooding in Dresden in 2002, we empirically evaluated the usefulness of this immersive 3D technology for learning success. Results show that immersive 3D geovisualisations have educational and content-related advantages over 2D geovisualisations through the benefits mentioned. This innovative way of geovisualisation is thus not only entertaining and motivating for students, but can also be constructive for research studies by, for instance, facilitating the study of complex environments or decision-making processes.

  6. Biotool2Web: creating simple Web interfaces for bioinformatics applications.

    PubMed

    Shahid, Mohammad; Alam, Intikhab; Fuellen, Georg

    2006-01-01

    Currently there are many bioinformatics applications being developed, but there is no easy way to publish them on the World Wide Web. We have developed a Perl script, called Biotool2Web, which makes the task of creating web interfaces for simple ('home-made') bioinformatics applications quick and easy. Biotool2Web uses an XML document containing the parameters to run the tool on the Web, and generates the corresponding HTML and common gateway interface (CGI) files ready to be published on a web server. This tool is available for download at URL http://www.uni-muenster.de/Bioinformatics/services/biotool2web/ Georg Fuellen (fuellen@alum.mit.edu).

  7. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    PubMed

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties in getting locally installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools, using an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS mediates access to registered tools by providing front-end and back-end web services. Programmers can install applications on HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on simple computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that lists the registered applications and clients. BOWS-registered applications can be accessed from virtually any programming language through web services, or by using the standard Java clients. The back end can run on HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
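
    A minimal sketch of the client side of such a front-end service: submit a job over HTTP and poll for its result. BOWS itself exposes web services and generated Java clients; the JSON/REST paths used here are hypothetical illustrations of the submit-and-poll pattern, not the project's actual interface.

```python
# Submit a job to a (hypothetical) front-end web service and poll until it finishes.
import json
import time
import urllib.request

BASE = "https://example.org/bows"          # placeholder front-end address

def post_json(url, payload):
    req = urllib.request.Request(url, data=json.dumps(payload).encode(),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def get_json(url):
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def run_job(tool, params, poll_every=10):
    job = post_json(f"{BASE}/jobs", {"tool": tool, "params": params})
    while True:
        status = get_json(f"{BASE}/jobs/{job['id']}")
        if status["state"] in ("finished", "failed"):
            return status
        time.sleep(poll_every)             # the HPC back end picks the job up meanwhile

# result = run_job("blast", {"query": "seq.fasta", "db": "nr"})   # against a real deployment
```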

  8. Using Airborne Remote Sensing to Increase Situational Awareness in Civil Protection and Humanitarian Relief - the Importance of User Involvement

    NASA Astrophysics Data System (ADS)

    Römer, H.; Kiefl, R.; Henkel, F.; Wenxi, C.; Nippold, R.; Kurz, F.; Kippnich, U.

    2016-06-01

    Enhancing situational awareness in real-time (RT) civil protection and emergency response scenarios requires the development of comprehensive monitoring concepts combining classical remote sensing disciplines with geospatial information science. In the VABENE++ project of the German Aerospace Center (DLR), monitoring tools are being developed in which innovative data acquisition approaches are combined with information extraction as well as the generation and dissemination of information products to a specific user. DLR's 3K and 4k camera systems, which allow for RT acquisition and pre-processing of high-resolution aerial imagery, are applied in two application examples conducted with end users: a civil protection exercise with humanitarian relief organisations, and a large open-air music festival in cooperation with a festival organising company. This study discusses how airborne remote sensing can significantly contribute to both situational assessment and awareness, focussing on the downstream processes required for extracting information from imagery and for visualising and disseminating imagery in combination with other geospatial information. Valuable user feedback and impetus for further developments have been obtained from both applications, referring to innovations in thematic image analysis (supporting festival site management) and product dissemination (editable web services). Thus, this study emphasises the important role of user involvement in application-related research, i.e. aligning it more closely with users' requirements.

  9. Hydrological analysis in R: Topmodel and beyond

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Reusser, D.

    2011-12-01

    R is quickly gaining popularity in the hydrological sciences community. Its wide range of statistical and mathematical functionality makes it an excellent tool for data analysis, modelling and uncertainty analysis. Topmodel was one of the first hydrological models to be implemented as an R package and distributed through R's own distribution network, CRAN. This facilitated pre- and post-processing of data, such as parameter sampling, calculation of prediction bounds, and advanced visualisation. However, apart from these basic functionalities, the package did not use many of the more advanced features of the R environment, especially R's object-oriented functionality. With R's increasing expansion into arenas such as high-performance computing, big data analysis, and cloud services, we revisit the topmodel package and use it as an example of how to build and deploy the next generation of hydrological models. R provides a convenient environment and attractive features to build and couple hydrological (and, by extension, other environmental) models, to develop flexible and effective data assimilation strategies, and to take the model beyond the individual computer by linking into cloud services for both data provision and computing. However, in order to maximise the benefit of these approaches, it will be necessary to adopt standards and ontologies for model interaction and information exchange. Some of these are currently being developed, such as the OGC web processing standards, while others will need to be developed.

  10. Integration of Web 2.0 Tools in Learning a Programming Course

    ERIC Educational Resources Information Center

    Majid, Nazatul Aini Abd

    2014-01-01

    Web 2.0 tools are expected to assist students to acquire knowledge effectively in their university environment. However, the lack of effort from lecturers in planning the learning process can make it difficult for the students to optimize their learning experiences. The aim of this paper is to integrate Web 2.0 tools with learning strategy in…

  11. Motivating Pre-Service Teachers in Technology Integration of Web 2.0 for Teaching Internships

    ERIC Educational Resources Information Center

    Kim, Hye Jeong; Jang, Hwan Young

    2015-01-01

    The aim of this study was to examine the predictors of pre-service teachers' use of Web 2.0 tools during a teaching internship, after a course that emphasized the use of the tools for instructional activities. Results revealed that integrating Web 2.0 tools during their teaching internship was strongly predicted by participants' perceived…

  12. K-12 Student Use of Web 2.0 Tools: A Global Study

    ERIC Educational Resources Information Center

    Toledo, Cheri; Shepard, MaryFriend

    2011-01-01

    Over the past decade, Internet use has increased 445% worldwide. This boom has enabled widespread access to online tools and digital spaces for educational practices. The results of this study of Web 2.0 tool use in kindergarten through high school (K-12) classrooms around the world will be presented. A web-based survey was sent out through online…

  13. Prototyping Tool for Web-Based Multiuser Online Role-Playing Game

    NASA Astrophysics Data System (ADS)

    Okamoto, Shusuke; Kamada, Masaru; Yonekura, Tatsuhiro

    This letter proposes a prototyping tool for Web-based Multiuser Online Role-Playing Games (MORPG). The design goal is to make this tool simple and powerful. The tool is comprised of a GUI editor, a translator and a runtime environment. The GUI editor is used to edit state-transition diagrams, each of which defines the behavior of the fictional characters. The state-transition diagrams are translated into C program code, which plays the role of a game engine in the RPG system. The runtime environment includes PHP, JavaScript with Ajax and HTML, so the prototype system can be played in a usual Web browser, such as Firefox, Safari or IE. On a click or key press by a player, the Web browser sends the event to the Web server so that its consequences are reflected on the screens that other players are looking at. Prospective users of this tool include programming novices and schoolchildren. Knowledge or skill in any specific programming language is not required to create state-transition diagrams. Their structure is not only suitable for defining character behavior but also intuitive enough to help novices understand. Therefore, users can easily create a Web-based MORPG system with the tool.
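
    The tool turns state-transition diagrams into character-behaviour code. Below is a minimal sketch of the same idea in plain Python: a character whose behaviour is a table mapping (state, event) pairs to next states, which is essentially what a generated game engine encodes. The states and events are invented for illustration.

```python
# A tiny state machine for a fictional character's behaviour.
TRANSITIONS = {
    ("idle", "player_nearby"): "greet",
    ("greet", "player_clicks"): "talk",
    ("talk", "dialogue_done"): "idle",
    ("idle", "attacked"): "flee",
    ("flee", "safe"): "idle",
}

class Character:
    def __init__(self, state="idle"):
        self.state = state

    def handle(self, event):
        """Advance the state machine; unknown events leave the state unchanged."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

npc = Character()
for event in ["player_nearby", "player_clicks", "dialogue_done", "attacked", "safe"]:
    print(event, "->", npc.handle(event))
```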

  14. Bringing ocean observations to the classroom - integrating research infrastructure into education

    NASA Astrophysics Data System (ADS)

    Proctor, R.; Hoenner, X.; Mancini, S.; Tattersall, K.; Everett, J. D.; Suthers, I. M.; Steinberg, P.; Doblin, M.; Moltmann, T.

    2016-02-01

    For the past 4 years the Sydney Institute of Marine Science, a partnership of four Australian Universities (Macquarie University, the University of NSW, the University of Sydney and the University of Technology Sydney) has been running a Master's degree course called Topics in Australian Marine Science (TAMS). This course is unique in that the core of the course is built around research infrastructure - the Integrated Marine Observing System (IMOS). IMOS, established in 2007, is collecting unprecedented volumes of multi-disciplinary oceanographic data in the ocean and on the continental shelf which is made freely available across the web; IMOS frequently runs `data user workshops' throughout Australia to introduce scientists and managers to the wealth of observations available at their fingertips. The Masters course gives students an understanding of how different measurement platforms work and they explore the data that these platforms collect. Students combine attending seminars and lectures with hands on practicals and personal assignments, all built around access to IMOS data and the many tools available to visualise and analyse. The course attracts a diverse class with many mature students (i.e. > 25 years old) from a range of backgrounds who find that the ease of discovering and accessing data, coupled with the available tools, enables them to easily study the marine environment without the need for high level computational skills. Since its inception the popularity of the course has increased with 38 students undertaking the subject in 2014. The consensus from students and lecturers is that integrating `real' observations into the classroom is beneficial to all, and IMOS is seeking to extend this approach to other university campuses. The talk will describe the experiences from the TAMS course and highlight the IMOS approach to data discovery, availability and access through course examples.

  15. Visualising biological data: a semantic approach to tool and database integration

    PubMed Central

    Pettifer, Steve; Thorne, David; McDermott, Philip; Marsh, James; Villéger, Alice; Kell, Douglas B; Attwood, Teresa K

    2009-01-01

    Motivation In the biological sciences, the need to analyse vast amounts of information has become commonplace. Such large-scale analyses often involve drawing together data from a variety of different databases, held remotely on the internet or locally on in-house servers. Supporting these tasks are ad hoc collections of data-manipulation tools, scripting languages and visualisation software, which are often combined in arcane ways to create cumbersome systems that have been customised for a particular purpose, and are consequently not readily adaptable to other uses. For many day-to-day bioinformatics tasks, the sizes of current databases, and the scale of the analyses necessary, now demand increasing levels of automation; nevertheless, the unique experience and intuition of human researchers is still required to interpret the end results in any meaningful biological way. Putting humans in the loop requires tools to support real-time interaction with these vast and complex data-sets. Numerous tools do exist for this purpose, but many do not have optimal interfaces, most are effectively isolated from other tools and databases owing to incompatible data formats, and many have limited real-time performance when applied to realistically large data-sets: much of the user's cognitive capacity is therefore focused on controlling the software and manipulating esoteric file formats rather than on performing the research. Methods To confront these issues, harnessing expertise in human-computer interaction (HCI), high-performance rendering and distributed systems, and guided by bioinformaticians and end-user biologists, we are building reusable software components that, together, create a toolkit that is both architecturally sound from a computing point of view, and addresses both user and developer requirements. Key to the system's usability is its direct exploitation of semantics, which, crucially, gives individual components knowledge of their own functionality and allows them to interoperate seamlessly, removing many of the existing barriers and bottlenecks from standard bioinformatics tasks. Results The toolkit, named Utopia, is freely available from . PMID:19534744

  16. Visualising biological data: a semantic approach to tool and database integration.

    PubMed

    Pettifer, Steve; Thorne, David; McDermott, Philip; Marsh, James; Villéger, Alice; Kell, Douglas B; Attwood, Teresa K

    2009-06-16

    In the biological sciences, the need to analyse vast amounts of information has become commonplace. Such large-scale analyses often involve drawing together data from a variety of different databases, held remotely on the internet or locally on in-house servers. Supporting these tasks are ad hoc collections of data-manipulation tools, scripting languages and visualisation software, which are often combined in arcane ways to create cumbersome systems that have been customized for a particular purpose, and are consequently not readily adaptable to other uses. For many day-to-day bioinformatics tasks, the sizes of current databases, and the scale of the analyses necessary, now demand increasing levels of automation; nevertheless, the unique experience and intuition of human researchers is still required to interpret the end results in any meaningful biological way. Putting humans in the loop requires tools to support real-time interaction with these vast and complex data-sets. Numerous tools do exist for this purpose, but many do not have optimal interfaces, most are effectively isolated from other tools and databases owing to incompatible data formats, and many have limited real-time performance when applied to realistically large data-sets: much of the user's cognitive capacity is therefore focused on controlling the software and manipulating esoteric file formats rather than on performing the research. To confront these issues, harnessing expertise in human-computer interaction (HCI), high-performance rendering and distributed systems, and guided by bioinformaticians and end-user biologists, we are building reusable software components that, together, create a toolkit that is both architecturally sound from a computing point of view, and addresses both user and developer requirements. Key to the system's usability is its direct exploitation of semantics, which, crucially, gives individual components knowledge of their own functionality and allows them to interoperate seamlessly, removing many of the existing barriers and bottlenecks from standard bioinformatics tasks. The toolkit, named Utopia, is freely available from http://utopia.cs.man.ac.uk/.

  17. The ChIP-Seq tools and web server: a resource for analyzing ChIP-seq and other types of genomic data.

    PubMed

    Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp

    2016-11-18

    ChIP-seq and related high-throughput chromatin profiling assays generate ever-increasing volumes of highly valuable biological data. To make sense of them, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis of such data. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting with a read alignment file. The tools are optimized for memory efficiency and speed, thus allowing the processing of large data volumes on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design in that the output from one application can serve as input to another. Complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/ .

  18. A dataset describing brooding in three species of South African brittle stars, comprising seven high-resolution, micro X-ray computed tomography scans.

    PubMed

    Landschoff, Jannes; Du Plessis, Anton; Griffiths, Charles L

    2015-01-01

    Brooding brittle stars have a special mode of reproduction whereby they retain their eggs and juveniles inside respiratory body sacs called bursae. In the past, studying this phenomenon required disturbance of the sample by dissecting the adult. This caused irreversible damage and made the sample unsuitable for future studies. Micro X-ray computed tomography (μCT) is a promising technique, not only to visualise juveniles inside the bursae, but also to keep the sample intact and make the dataset of the scan available for future reference. Seven μCT scans of five freshly fixed (70 % ethanol) individuals, representing three differently sized brittle star species, provided adequate image quality to determine the numbers, sizes and postures of internally brooded young, as well as anatomy and morphology of adults. No staining agents were necessary to achieve high-resolution, high-contrast images, which permitted visualisations of both calcified and soft tissue. The raw data (projection and reconstruction images) are publicly available for download from GigaDB. Brittle stars of all sizes are suitable candidates for μCT imaging. This explicitly adds a new technique to the suite of tools available for studying the development of internally brooded young. The purpose of applying the technique was to visualise juveniles inside the adult, but because of the universally good quality of the dataset, the images can also be used for anatomical or comparative morphology-related studies of adult structures.

  19. Application of growing hierarchical SOM for visualisation of network forensics traffic data.

    PubMed

    Palomo, E J; North, J; Elizondo, D; Luque, R M; Watson, T

    2012-08-01

    Digital investigation methods are becoming more and more important due to the proliferation of digital crimes and crimes involving digital evidence. Network forensics is a research area that gathers evidence by collecting and analysing network traffic data logs. This analysis can be a difficult process, especially because of the high variability of these attacks and the large amount of data involved. Therefore, software tools that can help with these digital investigations are in great demand. In this paper, a novel approach to analysing and visualising network traffic data based on growing hierarchical self-organising maps (GHSOM) is presented. The self-organising map (SOM) has been shown to be successful for the analysis of high-dimensional input data in data mining applications as well as for data visualisation in a more intuitive and understandable manner. However, the SOM has some problems related to its static topology and its inability to represent hierarchical relationships in the input data. The GHSOM tries to overcome these limitations by generating a hierarchical architecture that is automatically determined according to the input data and reflects the inherent hierarchical relationships among them. Moreover, the proposed GHSOM has been modified to correctly treat the qualitative features that are present in the traffic data in addition to the quantitative features. Experimental results show that this approach can be very useful for gaining a better understanding of network traffic data, making it easier to search for evidence of attacks or anomalous behaviour in a network environment.
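    To make the underlying technique concrete, the sketch below trains a small, flat SOM on toy traffic records with mixed feature types. It is not the growing hierarchical variant (GHSOM) used in the paper, and one-hot encoding of the qualitative protocol field is just one simple way to handle categorical features, not necessarily the authors' modification.

    ```python
    # Minimal flat SOM on mixed (quantitative + qualitative) traffic features.
    # The GHSOM of the paper additionally grows and nests maps; this sketch
    # only shows the basic competitive-learning step.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy records: (bytes transferred, duration in s, protocol); protocol is qualitative.
    records = [(1200, 0.5, "tcp"), (80, 0.1, "udp"), (560000, 12.0, "tcp"), (90, 0.2, "icmp")]
    protocols = sorted({r[2] for r in records})

    def encode(rec):
        """Concatenate scaled quantitative features with a one-hot protocol vector."""
        quant = np.array([np.log1p(rec[0]), rec[1]])
        onehot = np.array([1.0 if rec[2] == p else 0.0 for p in protocols])
        return np.concatenate([quant, onehot])

    X = np.array([encode(r) for r in records])

    rows, cols = 4, 4                          # fixed map size (a GHSOM would grow this)
    W = rng.normal(size=(rows, cols, X.shape[1]))
    grid = np.array([[(i, j) for j in range(cols)] for i in range(rows)], dtype=float)

    for epoch in range(50):
        lr = 0.5 * (1 - epoch / 50)            # decaying learning rate
        sigma = 2.0 * (1 - epoch / 50) + 0.5   # decaying neighbourhood radius
        for x in X:
            dists = np.linalg.norm(W - x, axis=2)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)   # best-matching unit
            # Gaussian neighbourhood pull of nearby units towards the input vector.
            g = np.exp(-np.sum((grid - np.array(bmu)) ** 2, axis=2) / (2 * sigma ** 2))
            W += lr * g[..., None] * (x - W)

    print("unit hit by each record:",
          [np.unravel_index(np.argmin(np.linalg.norm(W - x, axis=2)), (rows, cols)) for x in X])
    ```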

  20. WOD - Weather On Demand forecasting system

    NASA Astrophysics Data System (ADS)

    Rognvaldsson, Olafur; Ragnarsson, Logi; Stanislawska, Karolina

    2017-04-01

    The backbone of the Belgingur forecasting system (called WOD - Weather On Demand) is the WRF-Chem atmospheric model, with a number of in-house customisations. Initial and boundary data are taken from the Global Forecasting System, operated by the National Oceanic and Atmospheric Administration (NOAA). Operational forecasts use cycling of a number of parameters, mainly deep soil and surface fields. This is done to minimise spin-up effects and to ensure proper book-keeping of hydrological fields such as snow accumulation and runoff, as well as the constituents of various chemical parameters. The WOD system can be used to create conventional short- to medium-range weather forecasts for any location on the globe. The WOD system can also be used for air quality purposes (e.g. dispersion forecasts from volcanic eruptions) and as a tool to provide input to other modelling systems, such as hydrological models. A wide variety of post-processing options are also available, making WOD an ideal tool for creating highly customised output that can be tailored to the specific needs of individual end-users. The most recent addition to the WOD system is an integrated verification system where forecasts can be compared to surface observations from chosen locations. Forecast visualisation, such as weather charts, meteograms, weather icons and tables, is done via a number of web components that can be configured to serve the varying needs of different end-users. The WOD system itself can be installed in an automatic way on hardware running a range of Linux-based operating systems. System upgrades can also be done in a semi-automatic fashion, i.e. upgrades and/or bug-fixes can be pushed to the end-user hardware without system downtime. Importantly, the WOD system requires only rudimentary knowledge of WRF modelling and the Linux operating system on the part of the end-user, making it an ideal NWP tool in locations with limited IT infrastructure.
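    The abstract does not detail how the integrated verification component works, so the following is only a generic sketch of the kind of comparison such a system performs: point forecasts at a station are scored against the station's surface observations with bias and RMSE. The numbers are invented.

    ```python
    # Generic point-verification sketch: score forecast values at a station
    # against observations. Illustrative only; not the WOD implementation.
    import numpy as np

    # Hypothetical hourly 2 m temperature values (degrees C) at one station.
    forecast    = np.array([1.2, 0.8, 0.5, -0.3, -1.0, -1.4])
    observation = np.array([1.5, 1.1, 0.2, -0.6, -0.8, -1.9])

    error = forecast - observation
    bias  = error.mean()                   # systematic over/under-forecasting
    rmse  = np.sqrt((error ** 2).mean())   # typical magnitude of the error
    mae   = np.abs(error).mean()

    print(f"bias={bias:+.2f} C  rmse={rmse:.2f} C  mae={mae:.2f} C")
    ```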

  1. Enhancing Thematic Units Using the World Wide Web: Tools and Strategies for Students with Mild Disabilities.

    ERIC Educational Resources Information Center

    Gardner, J. Emmett; Wissick, Cheryl A.

    2002-01-01

    This article presents principles for using Web-based activities to support curriculum accommodations for students with mild disabilities. Tools, resources, and strategies are identified to help teachers construct meaningful and Web-enhanced thematic units. Web sites are listed in the areas of math, science, language arts, and social studies;…

  2. Gobe: an interactive, web-based tool for comparative genomic visualization.

    PubMed

    Pedersen, Brent S; Tang, Haibao; Freeling, Michael

    2011-04-01

    Gobe is a web-based tool for viewing comparative genomic data. It supports viewing multiple genomic regions simultaneously. Its simple text format and flash-based rendering make it an interactive, exploratory research tool. Gobe can be used without installation through our web service, or downloaded and customized with stylesheets and javascript callback functions. Gobe is a flash application that runs in all modern web-browsers. The full source-code, including that for the online web application is available under the MIT license at: http://github.com/brentp/gobe. Sample applications are hosted at http://try-gobe.appspot.com/ and http://synteny.cnr.berkeley.edu/gobe-app/.

  3. A novel visualisation tool for climate services: a case study of temperature extremes and human mortality in Europe

    NASA Astrophysics Data System (ADS)

    Lowe, R.; Ballester, J.; Robine, J.; Herrmann, F. R.; Jupp, T. E.; Stephenson, D.; Rodó, X.

    2013-12-01

    Users of climate information often require probabilistic information on which to base their decisions. However, communicating information contained within a probabilistic forecast presents a challenge. In this paper we demonstrate a novel visualisation technique to display ternary probabilistic forecasts on a map in order to inform decision making. In this method, ternary probabilistic forecasts, which assign probabilities to a set of three outcomes (e.g. low, medium, and high risk), are considered as a point in a triangle of barycentric coordinates. This allows a unique colour to be assigned to each forecast from a continuum of colours defined on the triangle. Colour saturation increases with information gain relative to the reference forecast (i.e. the long-term average). This provides additional information to decision makers compared with conventional methods used in seasonal climate forecasting, where one colour is used to represent one forecast category on a forecast map (e.g. red = 'dry'). We use the tool to present climate-related mortality projections across Europe. Temperature and humidity are related to human mortality via location-specific transfer functions, calculated using historical data. Daily mortality data at the NUTS2 level for 16 countries in Europe were obtained for 1998-2005. Transfer functions were calculated for 54 aggregations in Europe, defined using criteria related to population and climatological similarities. Aggregations are restricted to fall within political boundaries to avoid problems related to varying adaptation policies between countries. A statistical model is fitted to the cold and warm tails to estimate future mortality using forecast temperatures, in a Bayesian probabilistic framework. Using predefined categories of temperature-related mortality risk, we present maps of probabilistic projections for human mortality at seasonal to decadal time scales. We demonstrate the information gained from using this technique compared to more traditional methods to display ternary probabilistic forecasts. This technique allows decision makers to identify areas where the model predicts area-specific heat waves or cold snaps with high certainty, in order to effectively target resources to those areas most at risk, for a given season or year. It is hoped that this visualisation tool will facilitate the interpretation of the probabilistic forecasts not only for public health decision makers but also within a multi-sectoral climate service framework.
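    A minimal sketch of the barycentric colour mapping follows: a ternary forecast picks its hue by blending one colour per triangle vertex, and its saturation scales with the information gain (relative entropy) of the forecast with respect to a uniform climatological reference of (1/3, 1/3, 1/3). The vertex colours and the normalisation by log(3) are illustrative choices, not the paper's exact palette or scaling.

    ```python
    # Barycentric colour mapping for a ternary forecast (p_low, p_med, p_high).
    # Hue: blend of one colour per vertex; saturation: normalised information
    # gain relative to the uniform reference. Colours are illustrative.
    import math

    VERTEX_RGB = {            # one colour per forecast category
        "low":    (0.2, 0.4, 0.9),
        "medium": (0.9, 0.8, 0.2),
        "high":   (0.9, 0.2, 0.2),
    }
    REFERENCE = 1.0 / 3.0

    def forecast_colour(p_low, p_med, p_high):
        probs = (p_low, p_med, p_high)
        assert abs(sum(probs) - 1.0) < 1e-6
        # Barycentric blend of the three vertex colours.
        blended = [sum(p * c[i] for p, c in zip(probs, VERTEX_RGB.values()))
                   for i in range(3)]
        # Information gain relative to the uniform reference, normalised to [0, 1]
        # by its maximum value log(3) (a certain forecast on a single category).
        gain = sum(p * math.log(p / REFERENCE) for p in probs if p > 0)
        saturation = min(1.0, gain / math.log(3))
        # Desaturate towards white when the forecast adds little information.
        return tuple(saturation * c + (1 - saturation) * 1.0 for c in blended)

    print(forecast_colour(1/3, 1/3, 1/3))   # ~white: no information gain
    print(forecast_colour(0.1, 0.2, 0.7))   # reddish tint, modest saturation
    ```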

  4. WebProtégé: A Collaborative Ontology Editor and Knowledge Acquisition Tool for the Web

    PubMed Central

    Tudorache, Tania; Nyulas, Csongor; Noy, Natalya F.; Musen, Mark A.

    2012-01-01

    In this paper, we present WebProtégé—a lightweight ontology editor and knowledge acquisition tool for the Web. With the wide adoption of Web 2.0 platforms and the gradual adoption of ontologies and Semantic Web technologies in the real world, we need ontology-development tools that are better suited for the novel ways of interacting, constructing and consuming knowledge. Users today take Web-based content creation and online collaboration for granted. WebProtégé integrates these features as part of the ontology development process itself. We tried to lower the entry barrier to ontology development by providing a tool that is accessible from any Web browser, has extensive support for collaboration, and a highly customizable and pluggable user interface that can be adapted to any level of user expertise. The declarative user interface enabled us to create custom knowledge-acquisition forms tailored for domain experts. We built WebProtégé using the existing Protégé infrastructure, which supports collaboration on the back end side, and the Google Web Toolkit for the front end. The generic and extensible infrastructure allowed us to easily deploy WebProtégé in production settings for several projects. We present the main features of WebProtégé and its architecture and describe briefly some of its uses for real-world projects. WebProtégé is free and open source. An online demo is available at http://webprotege.stanford.edu. PMID:23807872

  5. WebGeocalc and Cosmographia: Modern Tools to Access SPICE Archives

    NASA Astrophysics Data System (ADS)

    Semenov, B. V.; Acton, C. H.; Bachman, N. J.; Ferguson, E. W.; Rose, M. E.; Wright, E. D.

    2017-06-01

    The WebGeocalc (WGC) web client-server tool and the SPICE-enhanced Cosmographia visualization program are two new ways for accessing space mission geometry data provided in the PDS SPICE kernel archives and by mission operational SPICE kernel sets.

  6. Spatial Visualisation and Cognitive Style: How Do Gender Differences Play Out?

    ERIC Educational Resources Information Center

    Ramful, Ajay; Lowrie, Tom

    2015-01-01

    This study investigated potential gender differences in a sample of 807 Year 6 Singaporean students in relation to two variables: spatial visualisation ability and cognitive style. In contrast to the general trend, overall there were no significant gender differences on spatial visualisation ability. However, gender differences were prevalent…

  7. Envisioning Possibilities: Visualising as Enquiry in Literacy Studies

    ERIC Educational Resources Information Center

    Smith, Anna; Hall, Matthew; Sousanis, Nick

    2015-01-01

    Drawing from the research methods of three distinct literacy studies, in this piece, we highlight the visualisation approaches integral to our enquiry processes as researchers working to make sense of literacy and learning. We aim to encourage, provoke even, a conversation about visualisation processes in literacy research by sharing the…

  8. Visuals and Visualisation of Human Body Systems

    ERIC Educational Resources Information Center

    Mathai, Sindhu; Ramadas, Jayashree

    2009-01-01

    This paper explores the role of diagrams and text in middle school students' understanding and visualisation of human body systems. We develop a common framework based on structure and function to assess students' responses across diagram and verbal modes. Visualisation is defined in terms of understanding transformations on structure and relating…

  9. Using Visualisation to Enhance Learning

    ERIC Educational Resources Information Center

    Statham, Mick

    2014-01-01

    Learning to use visualisation techniques in the classroom enables pupils and teachers to gain new insights into how concepts are formed and how to strengthen them, but visualisation is sometimes not what it seems. Learning by constructing meaning is a long-standing principle underpinning much of today's school science. In "Primary Science…

  10. Visualising DNA in Classrooms Using Nile Blue

    ERIC Educational Resources Information Center

    Milne, Christine; Roche, Scott; McKay, David

    2008-01-01

    Giving students the opportunity to extract, manipulate and visualise DNA molecules enhances a constructivist approach to learning about modern techniques in biology and biotechnology. Visualisation usually requires agarose gel electrophoresis and staining. In this article, we report on an alternative DNA stain, Nile Blue A, that may be used in the…

  11. Evaluating the Effectiveness of Web-based Climate Resilience Decision Support Tools: Insights from Coastal New Jersey

    NASA Astrophysics Data System (ADS)

    Brady, M.; Lathrop, R.; Auermuller, L. M.; Leichenko, R.

    2016-12-01

    Despite the recent surge of Web-based decision support tools designed to promote resiliency in U.S. coastal communities, to date there has been no systematic study of their effectiveness. This study demonstrates a method to evaluate important aspects of the effectiveness of four Web map tools used in coastal New Jersey that were designed to promote the consideration of climate risk information in local decision-making and planning. In summer 2015, the research team conducted in-depth phone interviews with users of one regulatory and three non-regulatory Web map tools using a semi-structured questionnaire. The interview and analysis design drew from a combination of effectiveness evaluation approaches developed in software and information usability, program evaluation, and management information system (MIS) research. Effectiveness assessment results were further analyzed and discussed in terms of a conceptual hierarchy of system objectives defined by the respective tool developer and user organizations represented in the study. Insights from the interviews suggest that users rely on Web tools as a supplement to desktop and analog map sources because they provide relevant and up-to-date information in a highly accessible and mobile format. The users also reported relying on multiple information sources and comparison between digital and analog sources for decision support. However, with respect to this decision support benefit, users were constrained by accessibility factors such as lack of awareness and training with some tools, lack of salient information such as planning time horizons associated with future flood scenarios, and environmental factors such as mandates restricting some users to regulatory tools. Perceptions of Web tool credibility seem favorable overall, but factors including system design imperfections and inconsistencies in data and information across platforms limited trust, highlighting a need for better coordination between tools. Contributions of the study include user feedback on web-tool system designs consistent with collaborative methods for enhancing usability and a systematic look at effectiveness that includes both user perspectives and consideration of developer and organizational objectives.

  12. The Impact of Self-Efficacy and Professional Development on Implementation of Web 2.0 Tools in Elementary Classrooms

    ERIC Educational Resources Information Center

    Ward, Stephen

    2015-01-01

    This study sought to understand the impact of self-efficacy and professional development on the implementation of specific Web 2.0 tools in the elementary classroom. There were three research questions addressed in this QUAN-Qual study. Quantitative data were collected through three surveys with 48 total participants: the Web 2.0 tools Utilization…

  13. Oh! Web 2.0, Virtual Reference Service 2.0, Tools and Techniques (I): A Basic Approach

    ERIC Educational Resources Information Center

    Arya, Harsh Bardhan; Mishra, J. K.

    2011-01-01

    This study targets librarians and information professionals who use Web 2.0 tools and applications with a view to providing snapshots on how Web 2.0 technologies are used. It also aims to identify values and impact that such tools have exerted on libraries and their services, as well as to detect various issues associated with the implementation…

  14. Education and Technology in the 21st Century Experiences of Adult Online Learners Using Web 2.0

    ERIC Educational Resources Information Center

    Bryant, Wanda L.

    2014-01-01

    The emergence of a knowledge-based and technology-driven economy has prompted adults to seek additional knowledge and skills that will enable them to participate effectively in society. The rapid growth and popularity of the internet tools such as Web 2.0 tools have revolutionized adult learning. Through the rich support of Web 2.0 tools, adult…

  15. Interpreting User's Choice of Technologies: A Quantitative Research on Choosing the Best Web-Based Communication Tools

    ERIC Educational Resources Information Center

    Adebiaye, Richmond

    2010-01-01

    The proliferation of web-based communication tools like email clients vis-a-vis Yahoo mail, Gmail, and Hotmail have led to new innovations in web-based communication. Email users benefit greatly from this technology, but lack of security of these tools can put users at risk of loss of privacy, including identity theft, corporate espionage, and…

  16. Web tools for predictive toxicology model building.

    PubMed

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history already. Powered by the advances in the Internet technologies, the current generation of web systems are starting to expand into areas, traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web is compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms, offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  17. An online planning tool for designing terrace layouts

    USDA-ARS?s Scientific Manuscript database

    A web-based conservation planning tool, WebTERLOC (web-based Terrace Location Program), was developed to provide multiple terrace layout options using digital elevation model (DEM) and geographic information systems (GIS). Development of a terrace system is complicated by the time-intensive manual ...

  18. Teaching with technology: automatically receiving information from the internet and web.

    PubMed

    Wink, Diane M

    2010-01-01

    In this bimonthly series, the author examines how nurse educators can use the Internet and Web-based computer technologies such as search, communication, and collaborative writing tools, social networking and social bookmarking sites, virtual worlds, and Web-based teaching and learning programs. This article presents information and tools related to automatically receiving information from the Internet and Web.

  19. Experimental evaluation of the impact of packet capturing tools for web services.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choe, Yung Ryn; Mohapatra, Prasant; Chuah, Chen-Nee

    Network measurement is a discipline that provides the techniques to collect data that are fundamental to many branches of computer science. While many capturing tools and comparisons have been made available in the literature and elsewhere, the impact of these packet capturing tools on existing processes has not been thoroughly studied. While not a concern for collection methods in which dedicated servers are used, many usage scenarios of packet capturing now require the packet capturing tool to run concurrently with operational processes. In this work we perform experimental evaluations of the performance impact that packet capturing processes have on web-based services; in particular, we observe the impact on web servers. We find that packet capturing processes indeed impact the performance of web servers, but on a multi-core system the impact varies depending on whether the packet capturing and web hosting processes are co-located or not. In addition, the architecture and behavior of the web server and process scheduling are coupled with the behavior of the packet capturing process, which in turn also affects the web server's performance.
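    A toy micro-benchmark in the same spirit (not the paper's experimental setup) is sketched below: it times repeated HTTP requests against a local test server, first without and then with a tcpdump capture process running on the same host. The URL and request count are assumptions, and running tcpdump requires suitable privileges.

    ```python
    # Toy micro-benchmark: compare mean request latency without and with a
    # co-located packet capture (tcpdump). Assumes a local test server.
    import subprocess
    import time
    import urllib.request

    URL = "http://localhost:8080/"   # assumed local test server
    N = 200

    def mean_latency_ms():
        t0 = time.perf_counter()
        for _ in range(N):
            with urllib.request.urlopen(URL) as response:
                response.read()
        return (time.perf_counter() - t0) / N * 1000

    baseline = mean_latency_ms()

    # Start a capture writing to disk, measure again, then stop the capture.
    capture = subprocess.Popen(["tcpdump", "-i", "any", "-w", "capture.pcap"])
    try:
        time.sleep(1)                # let tcpdump attach to the interface
        with_capture = mean_latency_ms()
    finally:
        capture.terminate()
        capture.wait()

    print(f"mean latency: {baseline:.2f} ms without capture, "
          f"{with_capture:.2f} ms with capture")
    ```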

  20. Evaluation of bowel distension and mural visualisation using neutral oral contrast agents for multidetector-row computed tomography.

    PubMed

    Lim, Bee Kuan; Bux, Shaik Ismail; Rahmat, Kartini; Lam, Sze Yin; Liew, Yew Wai

    2012-11-01

    We compared the effectiveness of different types of non-commercial neutral oral contrast agents for bowel distension and mural visualisation in computed tomographic (CT) enterography. 90 consecutive patients from a group of 108 were randomly assigned to receive water (n = 30), 3.8% milk (n = 30) or 0.1% gastrografin (n = 30) as oral contrast agent. The results were independently reviewed by two radiologists who were blinded to the contrast agents used. The degree of bowel distension was qualitatively scored on a four-point scale. The discrimination of bowel loops, mural visualisation and visualisation of mucosal folds were evaluated on a 'yes' or 'no' basis. Side effects of the various agents were also recorded. 3.8% milk was significantly superior to water for bowel distension (jejunum, ileum and terminal ileum), discrimination of bowel loops (jejunum and ileum), mural visualisation and visualisation of mucosal folds (ileum and terminal ileum). It was also significantly superior to 0.1% gastrografin for bowel distension, discrimination of bowel loops, mural visualisation and visualisation of mucosal folds (jejunum, ileum and terminal ileum). However, 10% of patients who received 3.8% milk reported immediate post-test diarrhoea. No side effects were documented for patients who received water and 0.1% gastrografin. 3.8% milk is an effective and superior neutral oral contrast agent for the assessment of the jejunum, ileum and terminal ileum in CT enterography. However, further studies are needed to explore other suitable oral contrast agents for CT enterography in lactose- or cow's milk-intolerant patients.

  1. A Performance-Based Web Budget Tool

    ERIC Educational Resources Information Center

    Abou-Sayf, Frank K.; Lau, Wilson

    2007-01-01

    A web-based formula-driven tool has been developed for the purpose of performing two distinct academic department budgeting functions: allocation of funding to the department, and budget management by the department. The tool's major features are discussed and its uses demonstrated. The tool's advantages are presented. (Contains 10 figures.)

  2. Googling DNA sequences on the World Wide Web.

    PubMed

    Hajibabaei, Mehrdad; Singer, Gregory A C

    2009-11-10

    New web-based technologies provide an excellent opportunity for sharing and accessing information and using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm and implemented it for searching species-specific genomic sequences, DNA barcodes, by using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm that divides a sequence library (DNA barcodes) and the query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages. We developed pre- and post-processing software to provide customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data. It provides a robust and efficient solution for sequence search on the web. The integration of our search method for large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
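    The core idea of the word-based decomposition can be sketched as follows: split each barcode in the library and the query into overlapping fixed-length words, then rank library entries by how many words they share with the query. The word length, toy sequences and voting score are illustrative; the published method delegates the actual lookup to a conventional engine such as Google Desktop Search.

    ```python
    # Sketch of alignment-free, word-based barcode matching: an inverted index
    # of fixed-length words, queried by counting shared words per entry.
    from collections import defaultdict

    WORD = 10  # word (k-mer) length

    def words(seq):
        seq = seq.upper()
        return {seq[i:i + WORD] for i in range(len(seq) - WORD + 1)}

    def build_index(library):
        """Inverted index: word -> set of barcode identifiers containing it."""
        index = defaultdict(set)
        for name, seq in library.items():
            for w in words(seq):
                index[w].add(name)
        return index

    def search(query, index):
        hits = defaultdict(int)
        for w in words(query):
            for name in index.get(w, ()):
                hits[name] += 1          # one shared word = one vote
        return sorted(hits.items(), key=lambda kv: kv[1], reverse=True)

    library = {                          # toy barcode library
        "species_A": "ACGTTAGGCTAGGCTTACGGATCGTACGGT",
        "species_B": "TTGGCAATCGGATCCGTTAGGCTAGGCTTA",
    }
    index = build_index(library)
    print(search("GGCTAGGCTTACGGATC", index))   # species_A ranks first
    ```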

  3. Defence Reporter. Spring 2012

    DTIC Science & Technology

    2012-01-01

    procedures and techniques for dealing with disruptive events. R0002869D Visualisation Techniques: Communicating Results to Senior Decision-Makers Dstl...understanding and accurate recall of key information. Open-source literature provides evidence that good visualisations aid effective communication...of abstract or complex information. General principles regarding the design of visualisations for use in presentations or reports are provided. Best

  4. Evidence for Effective Uses of Dynamic Visualisations in Science Curriculum Materials

    ERIC Educational Resources Information Center

    McElhaney, Kevin W.; Chang, Hsin-Yi; Chiu, Jennifer L.; Linn, Marcia C.

    2015-01-01

    Dynamic visualisations capture aspects of scientific phenomena that are difficult to communicate in static materials and benefit from well-designed scaffolds to succeed in classrooms. We review research to clarify the impacts of dynamic visualisations and to identify instructional scaffolds that mediate their success. We use meta-analysis to…

  5. Teaching Tectonics to Undergraduates with Web GIS

    NASA Astrophysics Data System (ADS)

    Anastasio, D. J.; Bodzin, A.; Sahagian, D. L.; Rutzmoser, S.

    2013-12-01

    Geospatial reasoning skills provide a means for manipulating, interpreting, and explaining structured information and are involved in higher-order cognitive processes that include problem solving and decision-making. Appropriately designed tools, technologies, and curriculum can support spatial learning. We present Web-based visualization and analysis tools developed with Javascript APIs to enhance tectonic curricula while promoting geospatial thinking and scientific inquiry. The Web GIS interface integrates graphics, multimedia, and animations that allow users to explore and discover geospatial patterns that are not easily recognized. Features include a swipe tool that enables users to see underneath layers, query tools useful in exploration of earthquake and volcano data sets, a subduction and elevation profile tool which facilitates visualization between map and cross-sectional views, drafting tools, a location function, and interactive image dragging functionality on the Web GIS. The Web GIS platform is independent and can be implemented on tablets or computers. The GIS tool set enables learners to view, manipulate, and analyze rich data sets from local to global scales, including such data as geology, population, heat flow, land cover, seismic hazards, fault zones, continental boundaries, and elevation using two- and three- dimensional visualization and analytical software. Coverages which allow users to explore plate boundaries and global heat flow processes aided learning in a Lehigh University Earth and environmental science Structural Geology and Tectonics class and are freely available on the Web.

  6. WebPresent: a World Wide Web-based telepresentation tool for physicians

    NASA Astrophysics Data System (ADS)

    Sampath-Kumar, Srihari; Banerjea, Anindo; Moshfeghi, Mehran

    1997-05-01

    In this paper, we present the design architecture and the implementation status of WebPresent - a World Wide Web-based tele-presentation tool. This tool allows a physician to use a conference server workstation and make a presentation of patient cases to a geographically distributed audience. The audience consists of other physicians collaborating on patients' health care management and physicians participating in continuing medical education. These physicians are at several locations with networks of different bandwidth and capabilities connecting them. Audiences also receive the patient case information on different computers ranging from high-end display workstations to laptops with low-resolution displays. WebPresent is a scalable networked multimedia tool which supports the presentation of hypertext, images, audio, video, and a white-board to remote physicians with hospital Intranet access. WebPresent allows the audience to receive customized information. The data received can differ in resolution and bandwidth, depending on the availability of resources such as display resolution and network bandwidth.

  7. An Examination of Teachers' Integration of Web 2.0 Technologies in Secondary Classrooms: A Phenomenological Study

    ERIC Educational Resources Information Center

    Wang, Ling

    2013-01-01

    Web 2.0 tools may be able to close the digital gap between teachers and students if teachers can integrate the tools and change their pedagogy. The TPACK framework has outlined the elements needed to effect change, and research on Web 2.0 tools shows its potential as a change agent, but little research has looked at how the two interrelate. Using…

  8. Concept relationship editor: a visual interface to support the assertion of synonymy relationships between taxonomic classifications

    NASA Astrophysics Data System (ADS)

    Craig, Paul; Kennedy, Jessie

    2008-01-01

    An increasingly common approach being taken by taxonomists to define the relationships between taxa in alternative hierarchical classifications is to use a set-based notation which states the relationship between two taxa from alternative classifications. Textual recording of these relationships is cumbersome and difficult for taxonomists to manage. While text-based GUI tools that ease the process are beginning to appear, these have several limitations. Interactive visual tools offer greater potential to allow taxonomists to explore the taxa in these hierarchies and specify such relationships. This paper describes the Concept Relationship Editor, an interactive visualisation tool designed to support the assertion of relationships between taxonomic classifications. The tool operates using an interactive space-filling adjacency layout which allows users to expand multiple lists of taxa with common parents so they can explore and assert relationships between two classifications.

  9. Using a Learning Styles Inventory to Examine Student Satisfaction with Web-Based Instruction: A 15-Year Study of One Professor's Web-Based Course Instruction

    ERIC Educational Resources Information Center

    Olliges, Ralph

    2017-01-01

    This article examines Active Engagement, Active Communication, and Peer Engagement learning practices among various student groups. It examines which tools are most important for increasing student satisfaction with web-based and web-enhanced instruction. Second, it looks at how different tools lead to greater satisfaction among different types of…

  10. TrawlerWeb: an online de novo motif discovery tool for next-generation sequencing datasets.

    PubMed

    Dang, Louis T; Tondl, Markus; Chiu, Man Ho H; Revote, Jerico; Paten, Benedict; Tano, Vincent; Tokolyi, Alex; Besse, Florence; Quaife-Ryan, Greg; Cumming, Helen; Drvodelic, Mark J; Eichenlaub, Michael P; Hallab, Jeannette C; Stolper, Julian S; Rossello, Fernando J; Bogoyevitch, Marie A; Jans, David A; Nim, Hieu T; Porrello, Enzo R; Hudson, James E; Ramialison, Mirana

    2018-04-05

    A strong focus of the post-genomic era is mining of the non-coding regulatory genome in order to unravel the function of regulatory elements that coordinate gene expression (Nat 489:57-74, 2012; Nat 507:462-70, 2014; Nat 507:455-61, 2014; Nat 518:317-30, 2015). Whole-genome approaches based on next-generation sequencing (NGS) have provided insight into the genomic location of regulatory elements throughout different cell types, organs and organisms. These technologies are now widespread and commonly used in laboratories from various fields of research. This highlights the need for fast and user-friendly software tools dedicated to extracting cis-regulatory information contained in these regulatory regions; for instance transcription factor binding site (TFBS) composition. Ideally, such tools should not require prior programming knowledge to ensure they are accessible for all users. We present TrawlerWeb, a web-based version of the Trawler_standalone tool (Nat Methods 4:563-5, 2007; Nat Protoc 5:323-34, 2010), to allow for the identification of enriched motifs in DNA sequences obtained from next-generation sequencing experiments in order to predict their TFBS composition. TrawlerWeb is designed for online queries with standard options common to web-based motif discovery tools. In addition, TrawlerWeb provides three unique new features: 1) TrawlerWeb allows the input of BED files directly generated from NGS experiments, 2) it automatically generates an input-matched biologically relevant background, and 3) it displays resulting conservation scores for each instance of the motif found in the input sequences, which assists the researcher in prioritising the motifs to validate experimentally. Finally, to date, this web-based version of Trawler_standalone remains the fastest online de novo motif discovery tool compared to other popular web-based software, while generating predictions with high accuracy. TrawlerWeb provides users with a fast, simple and easy-to-use web interface for de novo motif discovery. This will assist in rapidly analysing NGS datasets that are now being routinely generated. TrawlerWeb is freely available and accessible at: http://trawler.erc.monash.edu.au .
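    As a much-simplified illustration of the over-representation idea behind de novo motif discovery, the sketch below counts each k-mer in a set of input (e.g. peak) sequences and in a background set, then ranks candidates by their enrichment ratio. Trawler's actual algorithm (motif extension, clustering, conservation scoring) is far richer than this, and the toy sequences and pseudocount are assumptions.

    ```python
    # Toy k-mer over-representation scoring: rank 6-mers by their frequency in
    # foreground sequences relative to a background set (with pseudocounts).
    from collections import Counter

    K = 6

    def kmer_counts(seqs):
        counts = Counter()
        for s in seqs:
            s = s.upper()
            counts.update(s[i:i + K] for i in range(len(s) - K + 1))
        return counts

    def enrichment(foreground, background, pseudocount=1.0):
        fg, bg = kmer_counts(foreground), kmer_counts(background)
        fg_total, bg_total = sum(fg.values()), sum(bg.values())
        scores = {}
        for kmer, n in fg.items():
            fg_freq = (n + pseudocount) / (fg_total + pseudocount)
            bg_freq = (bg.get(kmer, 0) + pseudocount) / (bg_total + pseudocount)
            scores[kmer] = fg_freq / bg_freq
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    peaks      = ["ACGTGACGTGTTTT", "CCACGTGACCACGTGA", "TTACGTGAGG"]
    background = ["ACGTACGTACGTACGT", "TTTTGGGGCCCCAAAA", "GACGATATATGCGC"]
    print(enrichment(peaks, background)[:5])   # top enriched 6-mers (ACGTGA expected)
    ```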

  11. Assessing the Effect of Web-Based Learning Tools on Student Understanding of Stoichiometry Using Knowledge Space Theory

    ERIC Educational Resources Information Center

    Arasasingham, Ramesh D.; Taagepera, Mare; Potter, Frank; Martorell, Ingrid; Lonjers, Stacy

    2005-01-01

    Student achievement in web-based learning tools is assessed by using in-class examination, pretests, and posttests. The study reveals that using mastering chemistry web software in large-scale instruction provides an overall benefit to introductory chemistry students.

  12. Teaching and Learning with a Visualiser in the Primary Classroom: Modelling Graph-Making

    ERIC Educational Resources Information Center

    Mavers, Diane

    2009-01-01

    This paper examines the technological affordances of the visualiser, and what teachers actually do with it in the primary (elementary) classroom, followed by an investigation into one example of teaching and learning with this whole-class technology. A visualiser is a digital display device. Connected to a data projector, whatever is in view of…

  13. The Role and Potential Dangers of Visualisation When Learning about Sub-Microscopic Explanations in Chemistry Education

    ERIC Educational Resources Information Center

    Eilks, Ingo; Witteck, Torsten; Pietzner, Verena

    2012-01-01

    The core of theory-driven chemistry education consists of the constant shift between the different representational domains of chemical thinking: the macroscopic, the sub-microscopic, and the symbolic domains. Because the sub-microscopic domain can neither be seen nor directly visualised, it requires specific forms of visualisation, i.e. pictures…

  14. Drawing, Visualisation and Young Children's Exploration of "Big Ideas"

    ERIC Educational Resources Information Center

    Brooks, Margaret

    2009-01-01

    It is in the visualisation of ideas, and the expression or representation of our ideas, that we can bring something more clearly into consciousness. A drawing might be seen as an externalisation of a concept or idea. Drawing has the potential to play a mediating role in the visualisation of ideas and concepts in relation to young children…

  15. Data-intensive science gateway for rock physicists and volcanologists.

    NASA Astrophysics Data System (ADS)

    Filgueira, Rosa; Atkinson, Malcom; Bell, Andrew; Main, Ian; Boon, Steve; Meredith, Philp; Kilburn, Christopher

    2014-05-01

    Scientists have always shared data and mathematical models of the phenomena they study. Rock physics and Volcanology, as well as other solid-Earth sciences, have increasingly used Internet communications and computational renditions of their models for this purpose over the last two decades. Here we consider how to organise rock physics and volcanology data to open up opportunities for sharing and comparing both data from experiments, observations and model runs, and analytic interpretations of these data. Our hypothesis is that if we facilitate productive information sharing across those communities by using a new science gateway, it will benefit the science. The proposed science gateway should take the first steps towards making existing research practices easier and facilitate new research. It will achieve this by supporting three major functions: 1) sharing data from laboratories and observatories, experimental facilities and models; 2) sharing models of rock fracture and methods for analysing experimental data; and 3) supporting recurrent operational tasks, such as data collection and model application in real time. We report initial work in two projects (NERC EFFORT and NERC CREEP-2) and experience with an early web-accessible prototype called the EFFORT gateway, where we are implementing such information sharing services for those projects. 1. Sharing data: In the EFFORT gateway, we are working on several facilities for sharing data: *Upload data: We have designed and developed a new adaptive data-transfer Java tool called FAST (Flexible Automated Streaming Transfer) to upload experimental data and metadata periodically from laboratories to our repository. *Visualisation: As data are deposited in the repository, a visualisation of the accumulated data is made available for display in the Web portal. *Metadata and catalogues: The gateway uses a repository to hold all the data and a catalogue to hold all the corresponding metadata. 2. Sharing models and methods: The EFFORT gateway uses a repository to hold all of the models and a catalogue to hold the corresponding metadata. It provides several Web facilities for uploading, accessing and testing models. *Upload and store models: Through the gateway, researchers can upload as many models to the repository as they want. *Description of models: The gateway solicits and creates metadata for every model uploaded to store in the catalogue. *Search for models: Researchers can search the catalogue for models by using prepackaged SQL queries. *Access to models: Once a researcher has selected the model(s) that is going to be used for analysing an experiment, it will be obtained from the gateway. *Services to test and run models: Once a researcher selects a model and the experimental data to which it should be applied, the gateway submits the corresponding computational job to a high-performance computing (HPC) resource, hiding technical details. Once a job is submitted to the HPC cluster, the results are displayed in the gateway in real time, catalogued and stored in the data repository, allowing further researcher-instigated operations to retrieve, inspect and aggregate results. *Services to write models: We have designed the VarPy library, an open-source toolbox which provides a Python framework for analysing volcanology and rock physics data. It provides several functions, which allow users to define their own workflows to develop models, analyses and visualizations. 3. Recurrent Operations: We have started to introduce some recurrent operations: *Automated data upload: FAST provides a mechanism to automate the data upload. *Periodic activation of models: The EFFORT gateway allows researchers to run different models periodically against the experimental data that are being or have been uploaded.
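    For illustration only, the following Python sketch shows the kind of recurrent upload task that a tool like FAST automates: poll a laboratory output directory and push any new files to a gateway endpoint. The endpoint URL, polling interval, header convention and plain HTTP POST are all hypothetical; they do not describe FAST's actual (Java) implementation or protocol.

    ```python
    # Generic periodic-upload loop: watch a directory and POST new data files
    # to a gateway endpoint. All names and the protocol are hypothetical.
    import pathlib
    import time
    import urllib.request

    WATCH_DIR = pathlib.Path("lab_output")
    GATEWAY_URL = "https://gateway.example.org/upload"   # hypothetical endpoint
    POLL_SECONDS = 60

    uploaded = set()

    def push(path):
        request = urllib.request.Request(GATEWAY_URL, data=path.read_bytes(), method="POST")
        request.add_header("X-Filename", path.name)       # hypothetical convention
        with urllib.request.urlopen(request) as response:
            return response.status

    while True:
        for path in sorted(WATCH_DIR.glob("*.dat")):
            if path not in uploaded:
                print("uploading", path.name, "->", push(path))
                uploaded.add(path)
        time.sleep(POLL_SECONDS)
    ```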

  16. Web Audio/Video Streaming Tool

    NASA Technical Reports Server (NTRS)

    Guruvadoo, Eranna K.

    2003-01-01

    In order to promote a NASA-wide educational outreach program to educate and inform the public about space exploration, NASA, at Kennedy Space Center, is seeking efficient ways to add more content to the web by streaming audio/video files. This project proposes a high-level overview of a framework for the creation, management, and scheduling of audio/video assets over the web. To support short-term goals, the prototype of a web-based tool is designed and demonstrated to automate the process of streaming audio/video files. The tool provides web-enabled user interfaces to manage video assets, create publishable schedules of video assets for streaming, and schedule the streaming events. These operations are performed on user-defined and system-derived metadata of audio/video assets stored in a relational database, while the assets reside in a separate repository. The prototype tool is designed using ColdFusion 5.0.

  17. A decade of Web Server updates at the Bioinformatics Links Directory: 2003-2012.

    PubMed

    Brazas, Michelle D; Yim, David; Yeung, Winston; Ouellette, B F Francis

    2012-07-01

    The 2012 Bioinformatics Links Directory update marks the 10th special Web Server issue from Nucleic Acids Research. Beginning with content from their 2003 publication, the Bioinformatics Links Directory in collaboration with Nucleic Acids Research has compiled and published a comprehensive list of freely accessible, online tools, databases and resource materials for the bioinformatics and life science research communities. The past decade has exhibited significant growth and change in the types of tools, databases and resources being put forth, reflecting both technology changes and the nature of research over that time. With the addition of 90 web server tools and 12 updates from the July 2012 Web Server issue of Nucleic Acids Research, the Bioinformatics Links Directory at http://bioinformatics.ca/links_directory/ now contains an impressive 134 resources, 455 databases and 1205 web server tools, mirroring the continued activity and efforts of our field.

  18. Testing simple deceptive honeypot tools

    NASA Astrophysics Data System (ADS)

    Yahyaoui, Aymen; Rowe, Neil C.

    2015-05-01

    Deception can be a useful defensive technique against cyber-attacks; it has the advantage of unexpectedness to attackers and offers a variety of tactics. Honeypots are a good tool for deception. They act as decoy computers to confuse attackers and exhaust their time and resources. This work tested the effectiveness of two free honeypot tools in real networks by varying their location and virtualization, and the effects of adding more deception to them. We tested a Web honeypot tool, Glastopf, and an SSH honeypot tool, Kippo. We deployed the Web honeypot in both a residential network and our organization's network and as both real and virtual machines; the organization honeypot attracted more attackers starting in the third week. Results also showed that the virtual honeypots received attacks from more unique IP addresses. They also showed that adding deception to the Web honeypot, in the form of additional linked Web pages and interactive features, generated more interest by attackers. For the purpose of comparison, we examined log files of a legitimate Web site, www.cmand.org. The traffic distributions for the Web honeypot and the legitimate Web site showed similarities (with much malicious traffic from Brazil), but the SSH honeypot was different (with much malicious traffic from China). Contrary to previous experiments where traffic to static honeypots decreased quickly, our honeypots received increasing traffic over a period of three months. It appears that both honeypot tools are useful for providing intelligence about cyber-attack methods, and that additional deception is helpful.

  19. Computer assisted surgery with 3D robot models and visualisation of the telesurgical action.

    PubMed

    Rovetta, A

    2000-01-01

    This paper deals with the use of virtual reality and computer support in surgical robotics procedures. Computer support gives a direct representation of the surgical theatre, and modelling the procedure as it is planned and carried out provides psychological reassurance about safety and reliability. Robots similar to the ones used in the manufacturing industry can be used, with little modification, as very effective surgical tools. They have high precision and repeatability, and are versatile in integrating with medical instrumentation. Integrated surgical rooms with computer- and robot-assisted intervention are now in operation. The computer serves as a decision-making aid, and the robot works as a very effective tool.

  20. The structure of gallery networks in the nests of termite Cubitermes spp. revealed by X-ray tomography

    NASA Astrophysics Data System (ADS)

    Perna, Andrea; Jost, Christian; Couturier, Etienne; Valverde, Sergi; Douady, Stéphane; Theraulaz, Guy

    2008-09-01

    Recent studies have introduced computer tomography (CT) as a tool for the visualisation and characterisation of insect architectures. Here, we use CT to map the three-dimensional networks of galleries inside Cubitermes nests in order to analyse them with tools from graph theory. The structure of these networks indicates that connections inside the nest are rearranged during the whole nest life. The functional analysis reveals that the final network topology represents an excellent compromise between efficient connectivity inside the nest and defence against attacking predators. We further discuss and illustrate the usefulness of CT to disentangle environmental and specific influences on nest architecture.
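    The kind of graph measure discussed above can be illustrated with a short networkx sketch: compute the global efficiency of a toy gallery network, then recompute it after removing an entrance node as a crude proxy for a predator blocking that gallery. The edge list is invented, not a real Cubitermes nest, and the specific measures are only examples of what such an analysis might use.

    ```python
    # Toy gallery-network analysis: global efficiency of the intact network
    # versus the efficiency remaining after an entrance gallery is blocked.
    import networkx as nx

    galleries = nx.Graph()
    galleries.add_edges_from([
        ("entrance", "c1"), ("c1", "c2"), ("c1", "c3"),
        ("c2", "c4"), ("c3", "c4"), ("c4", "c5"),
        ("c2", "c5"), ("c3", "c5"),          # redundant loops inside the nest
    ])

    intact = nx.global_efficiency(galleries)

    blocked = galleries.copy()
    blocked.remove_node("entrance")          # predator blocks the entrance gallery
    after_attack = nx.global_efficiency(blocked)

    print(f"global efficiency: intact={intact:.2f}, entrance blocked={after_attack:.2f}")
    ```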

  1. Emerging Technology Update Intravascular Photoacoustic Imaging of Vulnerable Atherosclerotic Plaque.

    PubMed

    Wu, Min; Fw van der Steen, Antonius; Regar, Evelyn; van Soest, Gijs

    2016-10-01

    The identification of vulnerable atherosclerotic plaques in the coronary arteries is emerging as an important tool for guiding atherosclerosis diagnosis and interventions. Assessment of plaque vulnerability requires knowledge of both the structure and composition of the plaque. Intravascular photoacoustic (IVPA) imaging is able to show the morphology and composition of atherosclerotic plaque. With imminent improvements in IVPA imaging, it is becoming possible to assess human coronary artery disease in vivo . Although some challenges remain, IVPA imaging is on its way to being a powerful tool for visualising coronary atherosclerotic features that have been specifically associated with plaque vulnerability and clinical syndromes, and thus such imaging might become valuable for clinical risk assessment in the catheterisation laboratory.

  2. Computational and mathematical methods in brain atlasing.

    PubMed

    Nowinski, Wieslaw L

    2017-12-01

    Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.

  3. Crowdsourcing, citizen sensing and sensor web technologies for public and environmental health surveillance and crisis management: trends, OGC standards and application examples

    PubMed Central

    2011-01-01

    'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, "noise", misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world. PMID:22188675

  4. Crowdsourcing, citizen sensing and sensor web technologies for public and environmental health surveillance and crisis management: trends, OGC standards and application examples.

    PubMed

    Kamel Boulos, Maged N; Resch, Bernd; Crowley, David N; Breslin, John G; Sohn, Gunho; Burtner, Russ; Pike, William A; Jezierski, Eduardo; Chuang, Kuo-Yu Slayer

    2011-12-21

    'Wikification of GIS by the masses' is a phrase-term first coined by Kamel Boulos in 2005, two years earlier than Goodchild's term 'Volunteered Geographic Information'. Six years later (2005-2011), OpenStreetMap and Google Earth (GE) are now full-fledged, crowdsourced 'Wikipedias of the Earth' par excellence, with millions of users contributing their own layers to GE, attaching photos, videos, notes and even 3-D (three dimensional) models to locations in GE. From using Twitter in participatory sensing and bicycle-mounted sensors in pervasive environmental sensing, to creating a 100,000-sensor geo-mashup using Semantic Web technology, to the 3-D visualisation of indoor and outdoor surveillance data in real-time and the development of next-generation, collaborative natural user interfaces that will power the spatially-enabled public health and emergency situation rooms of the future, where sensor data and citizen reports can be triaged and acted upon in real-time by distributed teams of professionals, this paper offers a comprehensive state-of-the-art review of the overlapping domains of the Sensor Web, citizen sensing and 'human-in-the-loop sensing' in the era of the Mobile and Social Web, and the roles these domains can play in environmental and public health surveillance and crisis/disaster informatics. We provide an in-depth review of the key issues and trends in these areas, the challenges faced when reasoning and making decisions with real-time crowdsourced data (such as issues of information overload, "noise", misinformation, bias and trust), the core technologies and Open Geospatial Consortium (OGC) standards involved (Sensor Web Enablement and Open GeoSMS), as well as a few outstanding project implementation examples from around the world.

  5. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications

    PubMed Central

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-01-01

    Background Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. Results The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. Conclusion The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools. PMID:18021453

  6. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications.

    PubMed

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-11-19

    Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools.
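    The microformat idea mentioned above can be illustrated with a small screen-scraping sketch that collects the text of HTML elements carrying a marker class. The class name "gaggle-data" and the gene labels are hypothetical placeholders; the actual Gaggle microformat specification is not reproduced here.

    ```python
    # Sketch of extracting microformat-style annotations from an HTML page:
    # collect the text of elements tagged with a (hypothetical) marker class.
    from html.parser import HTMLParser

    class MicroformatScraper(HTMLParser):
        def __init__(self):
            super().__init__()
            self.inside = False
            self.items = []

        def handle_starttag(self, tag, attrs):
            if ("class", "gaggle-data") in attrs:   # hypothetical marker class
                self.inside = True

        def handle_endtag(self, tag):
            self.inside = False

        def handle_data(self, data):
            if self.inside and data.strip():
                self.items.append(data.strip())

    page = """
    <html><body>
      <p>Differentially expressed genes:</p>
      <span class="gaggle-data">gene_0001</span>
      <span class="gaggle-data">gene_0002</span>
    </body></html>
    """

    scraper = MicroformatScraper()
    scraper.feed(page)
    print(scraper.items)   # ['gene_0001', 'gene_0002']
    ```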

  7. WebQuests as Language-Learning Tools

    ERIC Educational Resources Information Center

    Aydin, Selami

    2016-01-01

    This study presents a review of the literature that examines WebQuests as tools for second-language acquisition and foreign language-learning processes to guide teachers in their teaching activities and researchers in further research on the issue. The study first introduces the theoretical background behind WebQuest use in the mentioned…

  8. Free and Easy to Use Web Based Presentation and Classroom Tools

    ERIC Educational Resources Information Center

    Jensen, Jennifer; Tunon, Johanna

    2012-01-01

    A number of free Web-based tools are available for distance librarians to create presentations and online assignments. The relative merits of presentation tools like Dabbleboard, Jing, Prezi, Tildee, 280 Slides, and Glogster, and classroom tools like Make Beliefs Comix, Picviewer, Photopeach, and Wordle are assessed for ease of use by distance…

  9. GeoMapApp Learning Activities: Grab-and-go inquiry-based geoscience activities that bring cutting-edge technology to the classroom

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.; Kluge, S.

    2011-12-01

    NSF-funded GeoMapApp Learning Activities (http://serc.carleton.edu/geomapapp) provide self-contained learning opportunities that are centred around the principles of guided inquiry. The activities allow students to interact with and analyse research-quality geoscience data to explore and enhance student understanding of underlying geoscience content and concepts. Each activity offers ready-to-use step-by-step student instructions and answer sheets that can be downloaded from the web page. Also provided are annotated teacher versions of the worksheets that include teaching tips, additional content and suggestions for further work. Downloadable pre- and post- quizzes tied to each activity help educators gauge the learning progression of their students. Short multimedia tutorials and details on content alignment with state and national teaching standards round out the package of material that comprises each "grab-and-go" activity. GeoMapApp Learning Activities expose students to content and concepts typically found at the community college, high school and introductory undergraduate levels. The activities are based upon GeoMapApp (http://www.geomapapp.org), a free, easy-to-use map-based data exploration and visualisation tool that allows students to access a wide range of geoscience data sets in a virtual lab-like environment. Activities that have so far been created under this project include student exploration of seafloor spreading rates, a study of mass wasting as revealed through geomorphological evidence, and an analysis of plate motion and hotspot traces. The step-by-step instructions and guided inquiry approach lead students through each activity, thus reducing the need for teacher intervention whilst also boosting the time that students can spend on productive exploration and learning. The activities can be used, for example, in a classroom lab with the educator present and as self-paced assignments in an out-of-class setting. GeoMapApp Learning Activities are hosted on the SERC-Carleton web site.

  10. The invention of gonioscopy by Alexios Trantas and his contribution to ophthalmology.

    PubMed

    Kalantzis, G; Georgalas, I; Tsiamis, C; El-Hindy, N; Poulakou-Rebelakou, E

    2015-01-01

    Gonioscopy is a technique used to examine structures in the anterior chamber angle (the fluid filled space inside the eye between the iris and the innermost layer of the cornea, the endothelium). It is an essential tool in ophthalmic practice, particularly in the diagnosis of glaucoma. In 1899, the Greek ophthalmologist Alexios Trantas was the first to visualise the angle in vivo and coined the term 'gonioscopy'. He made a number of other important contributions to ophthalmology.

  11. 3D reconstruction techniques made easy: know-how and pictures.

    PubMed

    Luccichenti, Giacomo; Cademartiri, Filippo; Pezzella, Francesca Romana; Runza, Giuseppe; Belgrano, Manuel; Midiri, Massimo; Sabatini, Umberto; Bastianello, Stefano; Krestin, Gabriel P

    2005-10-01

    Three-dimensional reconstructions represent a visual-based tool for illustrating the basis of three-dimensional post-processing such as interpolation, ray-casting, segmentation, percentage classification, gradient calculation, shading and illumination. The knowledge of the optimal scanning and reconstruction parameters facilitates the use of three-dimensional reconstruction techniques in clinical practice. The aim of this article is to explain the principles of multidimensional image processing in a pictorial way and to outline the advantages and limitations of the different possibilities of 3D visualisation.

  12. Self-Organizing Maps for In Silico Screening and Data Visualization.

    PubMed

    Digles, Daniela; Ecker, Gerhard F

    2011-10-01

    Self-organizing maps, which are unsupervised artificial neural networks, have become a very useful tool in a wide area of disciplines, including medicinal chemistry. Here, we will focus on two applications of self-organizing maps: the use of self-organizing maps for in silico screening and for clustering and visualisation of large datasets. Additionally, the importance of parameter selection is discussed and some modifications to the original algorithm are summarised. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Application of Synchrophasor Measurements for Improving Situational Awareness of the Power System

    NASA Astrophysics Data System (ADS)

    Obushevs, A.; Mutule, A.

    2018-04-01

    The paper focuses on the application of synchrophasor measurements, which offer unprecedented benefits compared to SCADA systems, in order to facilitate the successful transformation of the Nordic-Baltic and European electric power system to operate with large amounts of renewable energy sources and to improve situational awareness of the power system. The article describes new functionalities of visualisation tools to estimate the grid inertia level in real time, with monitoring results between the Nordic and Baltic power systems.

  14. Web 2.0 in Computer-Assisted Language Learning: A Research Synthesis and Implications for Instructional Design and Educational Practice

    ERIC Educational Resources Information Center

    Parmaxi, Antigoni; Zaphiris, Panayiotis

    2017-01-01

    This study explores the research development pertaining to the use of Web 2.0 technologies in the field of Computer-Assisted Language Learning (CALL). Published research manuscripts related to the use of Web 2.0 tools in CALL have been explored, and the following research foci have been determined: (1) Web 2.0 tools that dominate second/foreign…

  15. LifeWatchGreece Portal development: architecture, implementation and challenges for a biodiversity research e-infrastructure.

    PubMed

    Gougousis, Alexandros; Bailly, Nicolas

    2016-01-01

    Biodiversity data is characterized by its cross-disciplinary character, the extremely broad range of data types and structures, and the plethora of different data sources providing resources for the same piece of information in a heterogeneous way. Since the web's inception two decades ago, there have been multiple initiatives to connect, aggregate, share, and publish biodiversity data, and to establish data and work flows in order to analyze them. The European program LifeWatch aims at establishing a distributed network of nodes implementing virtual research environments in Europe to facilitate the work of biodiversity researchers and managers. LifeWatchGreece is one of these nodes, where a portal was developed offering access to a suite of virtual laboratories and e-services. Despite its strict definition in information technology, in practice "portal" is a fairly broad term that embraces many web architectures. In the biodiversity domain, the term "portal" is usually used to indicate either a web site that provides access to a single or an aggregation of data repositories (like http://indiabiodiversity.org/, http://www.mountainbiodiversity.org/, http://data.freshwaterbiodiversity.eu), a web site that gathers information about various online biodiversity tools (like http://test-eubon.ebd.csic.es/, http://marine.lifewatch.eu/) or a web site that just gathers information and news about the biodiversity domain (like http://chm.moew.government.bg). LifeWatchGreece's portal takes the concept of a portal a step further. In strict IT terms, LifeWatchGreece's portal is partly a portal, partly a platform and partly an aggregator. It includes a number of biodiversity-related web tools integrated into a centrally controlled software ecosystem. This ecosystem includes subsystems for access control, traffic monitoring, user notifications and web tool management. These subsystems are shared by all the web tools that have been integrated into the portal and are thereby part of this ecosystem. Unlike in most other portals, these web tools are not external, completely independent web applications. A fairly obvious (to the user) indication of this is the Single Sign-On (SSO) functionality for all tools and the common user interface wrapper that most of these tools use. Another example of a less obvious functionality is the common user profile that is shared and can be utilized by all tools (e.g., the user's timezone).

  16. The Virtual Learning Commons (VLC): Enabling Co-Innovation Across Disciplines

    NASA Astrophysics Data System (ADS)

    Pennington, D. D.; Gandara, A.; Del Rio, N.

    2014-12-01

    A key challenge for scientists addressing grand-challenge problems is identifying, understanding, and integrating potentially relevant methods, models and tools that are rapidly evolving in the informatics community. Such tools are essential for effectively integrating data and models in complex research projects, yet it is often difficult to know what tools are available and it is not easy to understand or evaluate how they might be used in a given research context. The goal of the National Science Foundation-funded Virtual Learning Commons (VLC) is to improve awareness and understanding of emerging methodologies and technologies, facilitate individual and group evaluation of these, and trace the impact of innovations within and across teams, disciplines, and communities. The VLC is a Web-based social bookmarking site designed specifically to support knowledge exchange in research communities. It is founded on well-developed models of technology adoption, diffusion of innovation, and experiential learning. The VLC makes use of Web 2.0 (Social Web) and Web 3.0 (Semantic Web) approaches. Semantic Web approaches enable discovery of potentially relevant methods, models, and tools, while Social Web approaches enable collaborative learning about their function. The VLC is under development and the first release is expected in Fall 2014.

  17. The Impact of Active Visualisation of High School Students on the Ability to Memorise Verbal Definitions

    ERIC Educational Resources Information Center

    Šmajdek, Anamarija; Selan, Jurij

    2016-01-01

    The era of visual communication influences the cognitive strategies of the individual. Education, too, must adjust to these changes, which raises questions regarding the use of visualisation in teaching. In the present study, we examine the impact of visualisation on the ability of high school students to memorise text. In the theoretical part of…

  18. Developing visualisation software for rehabilitation: investigating the requirements of patients, therapists and the rehabilitation process

    PubMed Central

    Loudon, David; Macdonald, Alastair S.; Carse, Bruce; Thikey, Heather; Jones, Lucy; Rowe, Philip J.; Uzor, Stephen; Ayoade, Mobolaji; Baillie, Lynne

    2012-01-01

    This paper describes the ongoing process of the development and evaluation of prototype visualisation software, designed to assist in the understanding and the improvement of appropriate movements during rehabilitation. The process of engaging users throughout the research project is detailed in the paper, including how the design of the visualisation software is being adapted to meet the emerging understanding of the needs of patients and professionals, and of the rehabilitation process. The value of the process for the design of the visualisation software is illustrated with a discussion of the findings of pre-pilot focus groups with stroke survivors and therapists. PMID:23011812

  19. Dr TIM: Ray-tracer TIM, with additional specialist scientific capabilities

    NASA Astrophysics Data System (ADS)

    Oxburgh, Stephen; Tyc, Tomáš; Courtial, Johannes

    2014-03-01

    We describe several extensions to TIM, a raytracing program for ray-optics research. These include relativistic raytracing; simulation of the external appearance of Eaton lenses, Luneburg lenses and generalised focusing gradient-index lens (GGRIN) lenses, which are types of perfect imaging devices; raytracing through interfaces between spaces with different optical metrics; and refraction with generalised confocal lenslet arrays, which are particularly versatile METATOYs. Catalogue identifier: AEKY_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKY_v2_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licencing provisions: GNU General Public License No. of lines in distributed program, including test data, etc.: 106905 No. of bytes in distributed program, including test data, etc.: 6327715 Distribution format: tar.gz Programming language: Java. Computer: Any computer capable of running the Java Virtual Machine (JVM) 1.6. Operating system: Any, developed under Mac OS X Version 10.6 and 10.8.3. RAM: Typically 130 MB (interactive version running under Mac OS X Version 10.8.3) Classification: 14, 18. Catalogue identifier of previous version: AEKY_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183(2012)711 External routines: JAMA [1] (source code included) Does the new version supersede the previous version?: Yes Nature of problem: Visualisation of scenes that include scene objects that create wave-optically forbidden light-ray fields. Solution method: Ray tracing. Reasons for new version: Significant extension of the capabilities (see Summary of revisions), as demanded by our research. Summary of revisions: Added capabilities include the simulation of different types of camera moving at relativistic speeds relative to the scene; visualisation of the external appearance of generalised focusing gradient-index (GGRIN) lenses, including Maxwell fisheye, Eaton and Luneburg lenses; calculation of refraction at the interface between spaces with different optical metrics; and handling of generalised confocal lenslet arrays (gCLAs), a new type of METATOY. Unusual features: Specifically designed to visualise wave-optically forbidden light-ray fields; can visualise ray trajectories and geometric optic transformations; can simulate photos taken with different types of camera moving at relativistic speeds, interfaces between spaces with different optical metrics, the view through METATOYs and generalised focusing gradient-index lenses; can create anaglyphs (for viewing with coloured “3D glasses”), HDMI-1.4a standard 3D images, and random-dot autostereograms of the scene; integrable into web pages. Running time: Problem-dependent; typically seconds for a simple scene. References: [1] JAMA: A Java Matrix Package, http://math.nist.gov/javanumerics/jama/

  20. VISIONET: intuitive visualisation of overlapping transcription factor networks, with applications in cardiogenic gene discovery.

    PubMed

    Nim, Hieu T; Furtado, Milena B; Costa, Mauro W; Rosenthal, Nadia A; Kitano, Hiroaki; Boyd, Sarah E

    2015-05-01

    Existing de novo software platforms have largely overlooked a valuable resource, the expertise of the intended biologist users. Typical data representations such as long gene lists, or highly dense and overlapping transcription factor networks, often hinder biologists from relating these results to their expertise. VISIONET, a streamlined visualisation tool built from experimental needs, enables biologists to transform large and dense overlapping transcription factor networks into sparse human-readable graphs via numerical filtering. The VISIONET interface allows users without a computing background to interactively explore and filter their data, and empowers them to apply their specialist knowledge to far more complex and substantial data sets than is currently possible. Applying VISIONET to the Tbx20-Gata4 transcription factor network led to the discovery and validation of Aldh1a2, an essential developmental gene associated with various important cardiac disorders, as a healthy adult cardiac fibroblast gene co-regulated by cardiogenic transcription factors Gata4 and Tbx20. We demonstrate, with experimental validations, the utility of VISIONET for expertise-driven gene discovery that opens new experimental directions that would not otherwise have been identified.

  1. The VERCE Science Gateway: enabling user friendly seismic waves simulations across European HPC infrastructures

    NASA Astrophysics Data System (ADS)

    Spinuso, Alessandro; Krause, Amy; Ramos Garcia, Clàudia; Casarotti, Emanuele; Magnoni, Federica; Klampanos, Iraklis A.; Frobert, Laurent; Krischer, Lion; Trani, Luca; David, Mario; Leong, Siew Hoon; Muraleedharan, Visakh

    2014-05-01

    The EU-funded project VERCE (Virtual Earthquake and seismology Research Community in Europe) aims to deploy technologies which satisfy the HPC and data-intensive requirements of modern seismology. As a result of VERCE's official collaboration with the EU project SCI-BUS, access to computational resources, like local clusters and international infrastructures (EGI and PRACE), is made homogeneous and integrated within a dedicated science gateway based on the gUSE framework. In this presentation we give a detailed overview of the progress achieved with the development of the VERCE Science Gateway, according to a use-case driven implementation strategy. More specifically, we show how the computational technologies and data services have been integrated within a tool for Seismic Forward Modelling, whose objective is to offer the possibility to perform simulations of seismic waves as a service to the seismological community. We will introduce the interactive components of the OGC map-based web interface and how it supports the user with setting up the simulation. We will go through the selection of input data, which are either fetched from federated seismological web services, adopting community standards, or provided by the users themselves by accessing their own document data store. The HPC scientific codes can be selected from a number of waveform simulators, currently available to the seismological community as batch tools or with limited configuration capabilities in their interactive online versions. The results will be staged out from the HPC via a secure GridFTP transfer to a VERCE data layer managed by iRODS. The provenance information of the simulation will be automatically cataloged by the data layer via NoSQL technologies. We will try to demonstrate how data access, validation and visualisation can be supported by a general-purpose provenance framework which, besides common provenance concepts imported from the OPM and the W3C-PROV initiatives, also offers an extensible metadata archive including community- and user-defined metadata and annotations. Finally, we will show how the VERCE Gateway platform will allow the customisation of the pre- and post-processing phases of the simulation workflows, thanks to the availability of a registry of processing elements (PEs), which are easily developed and maintained by the seismologists.

  2. Using component technologies for web based wavelet enhanced mammographic image visualization.

    PubMed

    Sakellaropoulos, P; Costaridou, L; Panayiotakis, G

    2000-01-01

    The poor contrast detectability of mammography can be dealt with by domain-specific software visualization tools. Remote desktop client access and time performance limitations of a previously reported visualization tool are addressed, aiming at more efficient visualization of mammographic image resources existing in web or PACS image servers. This effort is also motivated by the fact that at present, web browsers do not support domain-specific medical image visualization. To deal with desktop client access, the tool was redesigned by exploring component technologies, enabling the integration of stand-alone domain-specific mammographic image functionality in a web browsing environment (web adaptation). The integration method is based on ActiveX Document Server technology. ActiveX Document is a part of Object Linking and Embedding (OLE) extensible systems object technology, offering new services in existing applications. The standard DICOM 3.0 part 10 compatible image-format specification Papyrus 3.0 is supported, in addition to standard digitization formats such as TIFF. The visualization functionality of the tool has been enhanced by including a fast wavelet transform implementation, which allows for real-time wavelet-based contrast enhancement and denoising operations. Initial use of the tool with mammograms of various breast structures demonstrated its potential in improving visualization of diagnostic mammographic features. Web adaptation and real-time wavelet processing enhance the potential of the previously reported tool in remote diagnosis and education in mammography.
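
    The authors' wavelet implementation is not given in the abstract, so the following is only a minimal sketch, assuming PyWavelets and NumPy, of the general idea behind wavelet-based denoising with detail amplification for contrast enhancement; the wavelet name, decomposition level, threshold, and gain are illustrative placeholders.

```python
# Minimal sketch (not the authors' code): denoise by soft-thresholding small
# wavelet detail coefficients, then amplify the remaining details to enhance
# contrast. Parameter values are placeholders.
import numpy as np
import pywt

def wavelet_enhance(image, wavelet="db2", level=2, threshold=10.0, gain=1.5):
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    new_details = []
    for (cH, cV, cD) in details:
        new_details.append(tuple(
            gain * pywt.threshold(c, threshold, mode="soft") for c in (cH, cV, cD)
        ))
    return pywt.waverec2([approx] + new_details, wavelet)

# Synthetic image standing in for a digitised mammogram region.
img = np.random.rand(256, 256) * 255
enhanced = wavelet_enhance(img)
```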

  3. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry.

    PubMed

    Schutyser, M A I; Straatsma, J; Keijzer, P M; Verschueren, M; De Jong, P

    2008-11-30

    In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs. It can be applied to existing products and processes but also to reduce time to market for new products. Important aspects of the tool are its user-friendliness and its specifications customised to the needs of small dairy companies. To challenge the web-based tool, it was applied for optimisation of thermal treatments in 16 dairy companies producing yoghurt, fresh cream, chocolate milk and cheese. Optimisation with WebSim-MILQ resulted in concrete improvements with respect to risk of microbial contamination, cheese yield, fouling and production costs. In this paper we illustrate the use of WebSim-MILQ for optimisation of a cheese milk pasteurisation process, where we could increase the cheese yield (1 extra cheese for each 100 cheeses produced from the same amount of milk) and reduce the risk of contamination of pasteurised cheese milk with thermoresistant streptococci from critical to negligible. In another case we demonstrate the advantage of changing from an indirect to a direct heating method for a UHT process, resulting in 80% less fouling while improving product quality and maintaining product safety.
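
    WebSim-MILQ's process models are not described in the abstract. Purely as a hedged illustration of the kind of safety calculation such a tool optimises, the sketch below implements the classical D-value/z-value log-reduction relation; all parameter values are placeholders rather than calibrated data for thermoresistant streptococci.

```python
# Classical first-order thermal inactivation: D(T) = D_ref * 10**((T_ref - T) / z);
# the number of decimal (log10) reductions achieved is hold_time / D(T).
def log_reductions(hold_time_s, temp_c, d_ref_s=30.0, t_ref_c=72.0, z_c=7.0):
    d_t = d_ref_s * 10 ** ((t_ref_c - temp_c) / z_c)  # D-value at the process temperature
    return hold_time_s / d_t

# Compare two holding sections (placeholder values, not literature data).
for temp_c in (72.0, 75.0):
    print(f"{temp_c} degC / 15 s -> {log_reductions(15.0, temp_c):.2f} log reductions")
```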

  4. Vocabulary Learning on Learner-Created Content by Using Web 2.0 Tools

    ERIC Educational Resources Information Center

    Eren, Omer

    2015-01-01

    The present research examined the use of Web 2.0 tools to improve students' vocabulary knowledge at the School of Foreign Languages, Gaziantep University. Current studies in literature mostly deal with descriptions of students' attitudes towards the reasons for the use of web-based platforms. However, integrating usual classroom environment with…

  5. Evaluating the Effectiveness of College Web Sites for Prospective Students

    ERIC Educational Resources Information Center

    Ford, Wendy G.

    2011-01-01

    College Web sites are often the first structured encounter a student has with a prospective college or university. Outside of serving as a marketing tool (Williams 2000), very little literature exists on the functional purpose of a college's Web site. Almost all college sites show an informational and transactional tool for currently enrolled…

  6. Web 2.0 in the Mathematics Classroom

    ERIC Educational Resources Information Center

    McCoy, Leah P.

    2014-01-01

    A key characteristic of successful mathematics teachers is that they are able to provide varied activities that promote student learning and assessment. Web 2.0 applications can provide an assortment of tools to help produce creative activities. A Web 2.0 tool enables the student to enter data and create multimedia products using text, graphics,…

  7. Mining Hidden Gems Beneath the Surface: A Look At the Invisible Web.

    ERIC Educational Resources Information Center

    Carlson, Randal D.; Repman, Judi

    2002-01-01

    Describes resources for researchers called the Invisible Web that are hidden from the usual search engines and other tools and contrasts them with those resources available on the surface Web. Identifies specialized search tools, databases, and strategies that can be used to locate credible in-depth information. (Author/LRW)

  8. Change Management Meets Web 2.0

    ERIC Educational Resources Information Center

    Gale, Doug

    2008-01-01

    Web 2.0 is the term used to describe a group of web-based creativity, information-sharing, and collaboration tools including wikis, blogs, social networks, and folksonomies. The common thread in all of these tools is twofold: They enable collaboration and information sharing, and their impact on higher education has been dramatic. A recent study…

  9. Web Database Development: Implications for Academic Publishing.

    ERIC Educational Resources Information Center

    Fernekes, Bob

    This paper discusses the preliminary planning, design, and development of a pilot project to create an Internet accessible database and search tool for locating and distributing company data and scholarly work. Team members established four project objectives: (1) to develop a Web accessible database and decision tool that creates Web pages on the…

  10. Comprehensive Analysis of Semantic Web Reasoners and Tools: A Survey

    ERIC Educational Resources Information Center

    Khamparia, Aditya; Pandey, Babita

    2017-01-01

    Ontologies are emerging as the best representation techniques for knowledge-based context domains. The continuing need for interoperation, collaboration and effective information retrieval has led to the creation of the semantic web with the help of tools and reasoners which manage personalized information. The future of the semantic web lies in an ontology…

  11. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation are performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners, etc.) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and institutional logic from higher-level processes (engines) suits JWP's requirements. The use of Hydra Platform and Pynsim helps make complex customised models such as the JWP model easier to run and manage with international groups of researchers.
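
    Pynsim's actual API is documented with the project itself; the sketch below is only a generic illustration, with invented class and method names, of the node/link/institution pattern described above, in which autonomous components are stepped over a network at each time step and an institution applies a shared rule to its members.

```python
# Generic sketch of a node/institution network simulation. This is NOT Pynsim's
# API; names and the toy rules are invented for illustration.
class Node:
    def __init__(self, name, storage=0.0):
        self.name, self.storage = name, storage
    def step(self, t):
        pass  # node-specific logic runs independently at each time step

class Reservoir(Node):
    def __init__(self, name, storage, inflow):
        super().__init__(name, storage)
        self.inflow = inflow
    def step(self, t):
        self.storage += self.inflow  # toy mass balance

class Institution:
    """Groups nodes and applies a shared decision rule over its members."""
    def __init__(self, members):
        self.members = members
    def step(self, t):
        for m in self.members:
            m.storage *= 0.95  # e.g. an allocation rule applied to all members

def run(nodes, institutions, timesteps):
    for t in range(timesteps):
        for node in nodes:
            node.step(t)
        for inst in institutions:  # engine-like logic acting on subsets of the network
            inst.step(t)

nodes = [Reservoir("res_a", storage=100.0, inflow=5.0), Node("demand_a")]
run(nodes, [Institution(nodes)], timesteps=12)
print({n.name: round(n.storage, 1) for n in nodes})
```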

  12. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of calibration observations directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated into the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools and calibration capacity.
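
    As a hedged illustration (not the OAT code itself) of the kind of time-series pre-processing described above, the sketch below reads sensor observations from CSV text with pandas and aggregates them to daily means for use as calibration targets; the column names and values are hypothetical.

```python
import io
import pandas as pd

# Inline stand-in for a local CSV of groundwater-head observations (hypothetical).
csv_text = io.StringIO(
    "timestamp,head_m\n"
    "2016-01-01 00:00,12.1\n"
    "2016-01-01 12:00,12.3\n"
    "2016-01-02 06:00,12.0\n"
    "2016-01-04 06:00,11.8\n"
)
obs = pd.read_csv(csv_text, parse_dates=["timestamp"]).set_index("timestamp").sort_index()

daily = obs["head_m"].resample("D").mean()  # aggregate to a daily model time step
daily = daily.interpolate(limit=2)          # fill short gaps only
print(daily)
```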

  13. AAVSO Target Tool: A Web-Based Service for Tracking Variable Star Observations (Abstract)

    NASA Astrophysics Data System (ADS)

    Burger, D.; Stassun, K. G.; Barnes, C.; Kafka, S.; Beck, S.; Li, K.

    2018-06-01

    (Abstract only) The AAVSO Target Tool is a web-based interface for bringing stars in need of observation to the attention of AAVSO's network of amateur and professional astronomers. The site currently tracks over 700 targets of interest, regularly collecting data on them from AAVSO's servers and sorting them by priority. While the target tool does not require a login, users can obtain visibility times for each target by signing up and entering a telescope location. Other key features of the site include filtering by AAVSO observing section, sorting by different variable types, formatting the data for printing, and exporting the data to a CSV file. The AAVSO Target Tool builds upon seven years of experience developing web applications for astronomical data analysis, most notably on Filtergraph (Burger, D., et al. 2013, Astronomical Data Analysis Software and Systems XXII, Astronomical Society of the Pacific, San Francisco, 399), and is built using the web2py web framework based on the Python programming language. The target tool is available at http://filtergraph.com/aavso.

  14. Examining Web 2.0 Tools Usage of Science Teacher Candidates

    ERIC Educational Resources Information Center

    Balkan Kiyici, Fatime

    2012-01-01

    Using technology in science teaching is important, but only teachers who can use these tools at an expert level can incorporate them into their teaching activities. This research aims, first, to identify science teacher candidates' level of experience with Web 2.0 tools and the factors affecting that experience level. In this research the survey method…

  15. Sharik 1.0: User Needs and System Requirements for a Web-Based Tool to Support Collaborative Sensemaking

    DTIC Science & Technology

    2016-05-01

    Sharik 1.0: User Needs and System Requirements for a Web-Based Tool to Support Collaborative Sensemaking. Shadi Ghajar-Khosravi et al. …share the new intelligence items with their peers. In this report, the authors describe Sharik (SHAring Resources, Information, and Knowledge, i.e., the sharing of resources, information, and knowledge), a web tool that facilitates the…

  16. Utilizing Web 2.0 Technologies for Library Web Tutorials: An Examination of Instruction on Community College Libraries' Websites Serving Large Student Bodies

    ERIC Educational Resources Information Center

    Blummer, Barbara; Kenton, Jeffrey M.

    2015-01-01

    This is the second part of a series on Web 2.0 tools available from community college libraries' Websites. The first article appeared in an earlier volume of this journal and it illustrated the wide variety of Web 2.0 tools on community college libraries' Websites serving large student bodies (Blummer and Kenton 2014). The research found many of…

  17. Emerging Instructional Technologies: Exploring the Extent of Faculty Use of Web 2.0 Tools at a Midwestern Community College

    ERIC Educational Resources Information Center

    Daher, Tareq; Lazarevic, Bojan

    2014-01-01

    The purpose of this research is to provide insight into the several aspects of instructional use of emerging web-based technologies. The study first explores the extent of Web 2.0 technology integration into face-to-face classroom activities. In this phase, the main focus of research interests was on the types and dynamics of Web 2.0 tools used by…

  18. Contributions of Traditional Web 1.0 Tools e.g. Email and Web 2.0 Tools e.g. Weblog towards Knowledge Management

    ERIC Educational Resources Information Center

    Dehinbo, Johnson

    2010-01-01

    Email utilizes the power of Web 1.0 to enable users to access their messages from any computer or mobile device connected to the Internet, making email valuable in acquiring and transferring knowledge. But the advent of Web 2.0 and social networking seems to indicate certain limitations of email. The use of social networking seems…

  19. How To Succeed in Promoting Your Web Site: The Impact of Search Engine Registration on Retrieval of a World Wide Web Site.

    ERIC Educational Resources Information Center

    Tunender, Heather; Ervin, Jane

    1998-01-01

    Character strings were planted in a World Wide Web site (Project Whistlestop) to test indexing and retrieval rates of five Web search tools (Lycos, infoseek, AltaVista, Yahoo, Excite). It was found that search tools indexed few of the planted character strings, none indexed the META descriptor tag, and only Excite indexed into the 3rd-4th site…

  20. SOAP based web services and their future role in VO projects

    NASA Astrophysics Data System (ADS)

    Topf, F.; Jacquey, C.; Génot, V.; Cecconi, B.; André, N.; Zhang, T. L.; Kallio, E.; Lammer, H.; Facsko, G.; Stöckler, R.; Khodachenko, M.

    2011-10-01

    Modern state-of-the-art web services are of crucial importance for the interoperability of different VO tools existing in the planetary community. SOAP-based web services ensure the interconnection between different data sources and tools by providing a common protocol for communication. This paper will point out a best-practice approach with the Automated Multi-Dataset Analysis Tool (AMDA) developed by CDPP, Toulouse, and the provision of VEX/MAG data from a remote database located at IWF, Graz. Furthermore, a new FP7 project, IMPEx, will be introduced with a potential usage example of AMDA web services in conjunction with simulation models.
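
    As a hedged sketch of how a SOAP service is consumed from a script, the example below uses the Python zeep library; the WSDL URL, operation name, and parameters are hypothetical and do not correspond to the actual AMDA web-service interface.

```python
# Hypothetical SOAP client sketch using zeep; the endpoint and operation are invented.
from zeep import Client

client = Client("https://example.org/amda-like-service?wsdl")  # hypothetical WSDL
# Operations declared in the WSDL become methods on client.service:
result = client.service.getDataset(datasetId="vex-mag",
                                   start="2008-01-01", stop="2008-01-02")
print(result)
```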

  1. NOAA Miami Regional Library > Home

    Science.gov Websites

    [Navigation listing from the NOAA Miami Regional Library web site (AOML): Library Catalog; Services & Education; Social Networking & Other Web Tools for Earth Science; 4301 Rickenbacker Causeway, Miami, FL.]

  2. Online 4d Reconstruction Using Multi-Images Available Under Open Access

    NASA Astrophysics Data System (ADS)

    Ioannides, M.; Hadjiprocopi, A.; Doulamis, N.; Doulamis, A.; Protopapadakis, E.; Makantasis, K.; Santos, P.; Fellner, D.; Stork, A.; Balet, O.; Julien, M.; Weinlinger, G.; Johnson, P. S.; Klein, M.; Fritsch, D.

    2013-07-01

    The advent of technology in digital cameras and their incorporation into virtually any smart mobile device has led to an explosion in the number of photographs taken every day. Today, the number of images stored online and available freely has reached unprecedented levels. It is estimated that in 2011, there were over 100 billion photographs stored in just one of the major social media sites. This number is growing exponentially. Moreover, advances in the fields of Photogrammetry and Computer Vision have led to significant breakthroughs such as the Structure from Motion algorithm, which creates 3D models of objects using their two-dimensional photographs. The existence of powerful and affordable computational machinery enables the reconstruction not only of complex structures but also of entire cities. This paper presents an overview of our methodology for producing 3D models of Cultural Heritage structures such as monuments and artefacts from 2D data (pictures, video) available on Internet repositories, social media, Google Maps, Bing, etc. We also present new approaches to semantic enrichment of the end results and their subsequent export to Europeana, the European digital library, for integrated, interactive 3D visualisation within regular web browsers using WebGL and X3D. Our main goal is to enable historians, architects, archaeologists, urban planners and affiliated professionals to reconstruct views of historical structures from millions of images floating around the web and interact with them.

  3. WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides

    NASA Astrophysics Data System (ADS)

    Ma, Xiuzeng; Li, Yingkui; Bourgeois, Mike; Caffee, Marc; Elmore, David; Granger, Darryl; Muzikar, Paul; Smith, Preston

    2007-06-01

    Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this, it is critical to establish an effective, easily accessible and well-defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on the nuclide concentrations measured by accelerator mass spectrometry. WebCN for 10Be and 26Al has been finished and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for 36Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open source PostgreSQL for database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
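
    WebCN accounts for spatial and temporal variations in production rate, which the abstract notes but does not detail. As a minimal sketch of only the simplest underlying relation (constant production, no erosion, no inheritance), the example below inverts N(t) = (P/λ)(1 − e^(−λt)) for the exposure age t; the numerical values are placeholders, not calibrated production rates.

```python
import math

# Simplest exposure-age relation (constant production P, decay constant lam,
# no erosion, no inheritance):  N = (P/lam) * (1 - exp(-lam*t))
#   =>  t = -ln(1 - N*lam/P) / lam
def exposure_age(n_atoms_per_g, prod_rate_atoms_per_g_yr, half_life_yr):
    lam = math.log(2) / half_life_yr
    return -math.log(1.0 - n_atoms_per_g * lam / prod_rate_atoms_per_g_yr) / lam

# Placeholder 10Be-like example: 1e5 atoms/g, 4 atoms/g/yr, half-life ~1.39 Myr.
print(f"{exposure_age(1e5, 4.0, 1.39e6):,.0f} yr")
```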

  4. Using a Visualiser in Primary Science

    ERIC Educational Resources Information Center

    Nicholson, Danny

    2011-01-01

    Picture the scene--a child in a class has brought in a fabulous example of a snake skin, a snail, a seed, a fossil or rock and the whole class wants to see it. How does a teacher allow them all to observe it without destroying it or jostling each other? One way to get around this issue is to use a visualiser. A visualiser is essentially a small…

  5. 2B-Alert Web: An Open-Access Tool for Predicting the Effects of Sleep/Wake Schedules and Caffeine Consumption on Neurobehavioral Performance.

    PubMed

    Reifman, Jaques; Kumar, Kamal; Wesensten, Nancy J; Tountas, Nikolaos A; Balkin, Thomas J; Ramakrishnan, Sridhar

    2016-12-01

    Computational tools that predict the effects of daily sleep/wake amounts on neurobehavioral performance are critical components of fatigue management systems, allowing for the identification of periods during which individuals are at increased risk for performance errors. However, none of the existing computational tools is publicly available, and the commercially available tools do not account for the beneficial effects of caffeine on performance, limiting their practical utility. Here, we introduce 2B-Alert Web, an open-access tool for predicting neurobehavioral performance, which accounts for the effects of sleep/wake schedules, time of day, and caffeine consumption, while incorporating the latest scientific findings in sleep restriction, sleep extension, and recovery sleep. We combined our validated Unified Model of Performance and our validated caffeine model to form a single, integrated modeling framework instantiated as a Web-enabled tool. 2B-Alert Web allows users to input daily sleep/wake schedules and caffeine consumption (dosage and time) to obtain group-average predictions of neurobehavioral performance based on psychomotor vigilance tasks. 2B-Alert Web is accessible at: https://2b-alert-web.bhsai.org. The 2B-Alert Web tool allows users to obtain predictions for mean response time, mean reciprocal response time, and number of lapses. The graphing tool allows for simultaneous display of up to seven different sleep/wake and caffeine schedules. The schedules and corresponding predicted outputs can be saved as a Microsoft Excel file; the corresponding plots can be saved as an image file. The schedules and predictions are erased when the user logs off, thereby maintaining privacy and confidentiality. The publicly accessible 2B-Alert Web tool is available for operators, schedulers, and neurobehavioral scientists as well as the general public to determine the impact of any given sleep/wake schedule, caffeine consumption, and time of day on performance of a group of individuals. This evidence-based tool can be used as a decision aid to design effective work schedules, guide the design of future sleep restriction and caffeine studies, and increase public awareness of the effects of sleep amounts, time of day, and caffeine on alertness. © 2016 Associated Professional Sleep Societies, LLC.

  6. Hydrogen Financial Analysis Scenario Tool (H2FAST) Documentation

    Science.gov Websites

    Documentation is provided for the web and spreadsheet versions of H2FAST: the H2FAST Web Tool User's Manual and the H2FAST Spreadsheet Tool User's Manual (draft). Questions or feedback about H2FAST can be sent to H2FAST@nrel.gov.

  7. Patient-oriented interactive E-health tools on U.S. hospital Web sites.

    PubMed

    Huang, Edgar; Chang, Chiu-Chi Angela

    2012-01-01

    The purpose of this study is to provide evidence for strategic planning regarding e-health development in U.S. hospitals. A content analysis of a representative sample of U.S. hospital Web sites has revealed how U.S. hospitals have taken advantage of the 21 patient-oriented interactive tools identified in this study. Significant gaps between various types of hospitals have also been found. It is concluded that although the majority of U.S. hospitals have adopted traditional functional tools, they need to make significant inroads in implementing the core e-business tools to serve their patients/users, making their Web sites more efficient marketing tools.

  8. WIRM: An Open Source Toolkit for Building Biomedical Web Applications

    PubMed Central

    Jakobovits, Rex M.; Rosse, Cornelius; Brinkley, James F.

    2002-01-01

    This article describes an innovative software toolkit that allows the creation of web applications that facilitate the acquisition, integration, and dissemination of multimedia biomedical data over the web, thereby reducing the cost of knowledge sharing. There is a lack of high-level web application development tools suitable for use by researchers, clinicians, and educators who are not skilled programmers. Our Web Interfacing Repository Manager (WIRM) is a software toolkit that reduces the complexity of building custom biomedical web applications. WIRM’s visual modeling tools enable domain experts to describe the structure of their knowledge, from which WIRM automatically generates full-featured, customizable content management systems. PMID:12386108

  9. Using Firefly Tools to Enhance Archive Web Pages

    NASA Astrophysics Data System (ADS)

    Roby, W.; Wu, X.; Ly, L.; Goldina, T.

    2013-10-01

    Astronomy web developers are looking for fast and powerful HTML 5/AJAX tools to enhance their web archives. We are exploring ways to make this easier for the developer. How could you have a full FITS visualizer or a Web 2.0 table that supports paging, sorting, and filtering in your web page in 10 minutes? Can it be done without even installing any software or maintaining a server? Firefly is a powerful, configurable system for building web-based user interfaces to access astronomy science archives. It has been in production for the past three years. Recently, we have made some of the advanced components available through very simple JavaScript calls. This allows a web developer, without any significant knowledge of Firefly, to have FITS visualizers, advanced table display, and spectrum plots on their web pages with minimal learning curve. Because we use cross-site JSONP, installing a server is not necessary. Web sites that use these tools can be created in minutes. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). We are using Firefly to serve many projects including Spitzer, Planck, WISE, PTF, LSST and others.

  10. Innovative Ways of Visualising Meta Data in 4D Using Open Source Libraries

    NASA Astrophysics Data System (ADS)

    Balhar, Jakub; Valach, Pavel; Veselka, Jonas; Voumard, Yann

    2016-08-01

    More and more data are being measured by different Earth Observation satellites around the world. The ever-increasing amount of these data presents new challenges and opportunities for their visualization. In this paper we propose how to visualize the amount, distribution and structure of the data in a transparent way that also takes the time dimension into account. Our approach allows us to get a global overview as well as detailed regional information about the distribution of products from EO missions. We focus on introducing our mobile-friendly and easy-to-use web mapping application for 4D visualization of the data. Apart from that, we also present a Java application which can read and process the data from various data sources.

  11. Global Connections: Web Conferencing Tools Help Educators Collaborate Anytime, Anywhere

    ERIC Educational Resources Information Center

    Forrester, Dave

    2009-01-01

    Web conferencing tools help educators from around the world collaborate in real time. Teachers, school counselors, and administrators need only to put on their headsets, check the time zone, and log on to meet and learn from educators across the globe. In this article, the author discusses how educators can use Web conferencing at their schools.…

  12. Faculty Recommendations for Web Tools: Implications for Course Management Systems

    ERIC Educational Resources Information Center

    Oliver, Kevin; Moore, John

    2008-01-01

    A gap analysis of web tools in Engineering was undertaken as one part of the Digital Library Network for Engineering and Technology (DLNET) grant funded by NSF (DUE-0085849). DLNET represents a Web portal and an online review process to archive quality knowledge objects in Engineering and Technology disciplines. The gap analysis coincided with the…

  13. Teachers' Perceptions and Attitudes toward the Implementation of Web 2.0 Tools in Secondary Education

    ERIC Educational Resources Information Center

    Quadri, Lekan Kamil

    2014-01-01

    Researchers have concluded that Web 2.0 technologies offered many educational benefits. However, many secondary teachers in a large northwestern school district were not using Web 2.0 tools in spite of its possibilities for teaching and learning. The purpose of this quantitative correlational research was to explore the relationships between the…

  14. Web-Based Course Delivery and Administration Using Scheme.

    ERIC Educational Resources Information Center

    Salustri, Filippo A.

    This paper discusses the use at the University of Windsor (Ontario) of a small World Wide Web-based tool for course delivery and administration called HAL (HTML-based Administrative Lackey), written in the Scheme programming language. This tool was developed by the author to provide Web-based services for a large first-year undergraduate course in…

  15. The Effectiveness of Web-Based Learning Environment: A Case Study of Public Universities in Kenya

    ERIC Educational Resources Information Center

    Kirui, Paul A.; Mutai, Sheila J.

    2010-01-01

    Web mining is emerging in many aspects of e-learning, aiming at improving online learning and teaching processes and making them more transparent and effective. Researchers using Web mining tools and techniques are challenged to learn more about online students, to reshape online courses and educational websites, and to create tools for…

  16. Attitudes, Perceptions, and Behavioral Intentions of Engineering Workers toward Web 2.0 Tools in the Workplace

    ERIC Educational Resources Information Center

    Krause, Jaclyn A.

    2010-01-01

    As Web 2.0 tools and technologies increase in popularity in consumer markets, enterprises are seeking ways to take advantage of the rich social knowledge exchanges that these tools offer. The problem this study addresses is that it remains unknown whether employees perceive that these tools offer value to the organization and therefore will be…

  17. Oh! Web 2.0, Virtual Reference Service 2.0, Tools & Techniques (II)

    ERIC Educational Resources Information Center

    Arya, Harsh Bardhan; Mishra, J. K.

    2012-01-01

    The paper describes the theory and definition of the practice of librarianship, specifically addressing how Web 2.0 technologies (tools) such as synchronous messaging, collaborative reference service and streaming media, blogs, wikis, social networks, social bookmarking tools, tagging, RSS feeds, and mashups might intimate changes and how…

  18. Enhancing e-Learning Content by Using Semantic Web Technologies

    ERIC Educational Resources Information Center

    García-González, Herminio; Gayo, José Emilio Labra; del Puerto Paule-Ruiz, María

    2017-01-01

    We describe a new educational tool that relies on Semantic Web technologies to enhance lessons content. We conducted an experiment with 32 students whose results demonstrate better performance when exposed to our tool in comparison with a plain native tool. Consequently, this prototype opens new possibilities in lessons content enhancement.

  19. Web Surveys to Digital Movies: Technological Tools of the Trade.

    ERIC Educational Resources Information Center

    Fetterman, David M.

    2002-01-01

    Highlights some of the technological tools used by educational researchers today, focusing on data collection related tools such as Web surveys, digital photography, voice recognition and transcription, file sharing and virtual office, videoconferencing on the Internet, instantaneous chat and chat rooms, reporting and dissemination, and digital…

  20. Categorisation of visualisation methods to support the design of Human-Computer Interaction Systems.

    PubMed

    Li, Katie; Tiwari, Ashutosh; Alcock, Jeffrey; Bermell-Garcia, Pablo

    2016-07-01

    During the design of Human-Computer Interaction (HCI) systems, the creation of visual artefacts forms an important part of design. On the one hand, producing a visual artefact has a number of advantages: it helps designers to externalise their thoughts and acts as a common language between different stakeholders. On the other hand, if an inappropriate visualisation method is employed, it could hinder the design process. To support the design of HCI systems, this paper reviews the categorisation of visualisation methods used in HCI. A keyword search is conducted to identify (a) current HCI design methods and (b) approaches for selecting these methods. The resulting design methods are filtered to create a list of just visualisation methods. These are then categorised using the approaches identified in (b). As a result, 23 HCI visualisation methods are identified and categorised under 5 selection approaches (The Recipient, Primary Purpose, Visual Archetype, Interaction Type, and The Design Process). Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. A Framework for Network Visualisation: Progress Report

    DTIC Science & Technology

    2006-12-01

    …time; secondly, a simple oscillation, in which traffic changes but those changes repeat periodically; or thirdly, a “strange attractor”, a pattern of changes that never repeats exactly, though it may appear to repeat approximately. The strange attractor is the signature of a chaotic system. [Citation: Taylor, M.M. (2006). A Framework for Network Visualisation: Progress Report. In Visualising Network Information (pp. 3-1 – 3-22). IST-063.]

  2. A Comparative Study of the Effects of Using Dynamic Geometry Software and Physical Manipulatives on the Spatial Visualisation Skills of Pre-Service Mathematics Teachers

    ERIC Educational Resources Information Center

    Baki, Adnan; Kosa, Temel; Guven, Bulent

    2011-01-01

    The study compared the effects of dynamic geometry software and physical manipulatives on the spatial visualisation skills of first-year pre-service mathematics teachers. A pre- and post-test quasi-experimental design was used. The Purdue Spatial Visualisation Test (PSVT) was used for the pre- and post-test. There were three treatment groups. The…

  3. Prototype of Partial Cutting Tool of Geological Map Images Distributed by Geological Web Map Service

    NASA Astrophysics Data System (ADS)

    Nonogaki, S.; Nemoto, T.

    2014-12-01

    Geological maps and topographical maps play an important role in disaster assessment, resource management, and environmental preservation. Recently, this map information has been distributed in accordance with Web service standards such as Web Map Service (WMS) and Web Map Tile Service (WMTS). In this study, a partial cutting tool for geological map images distributed by geological WMTS was implemented with Free and Open Source Software. The tool mainly consists of two functions: a display function and a cutting function. The former function was implemented using OpenLayers. The latter function was implemented using the Geospatial Data Abstraction Library (GDAL). All other small functions were implemented in PHP and Python. As a result, this tool allows not only displaying WMTS layers in a web browser but also generating a geological map image of the intended area and zoom level. At this moment, the available WMTS layers are limited to the ones distributed by the WMTS for the Seamless Digital Geological Map of Japan. The geological map image can be saved in GeoTIFF format and WebGL format. GeoTIFF is one of the georeferenced raster formats that is available in many kinds of Geographical Information Systems. WebGL is useful for confirming the relationship between geology and geography in 3D. In conclusion, the partial cutting tool developed in this study would contribute to creating better conditions for promoting the utilization of geological information. Future work is to increase the number of available WMTS layers and the types of output file formats.
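
    As a hedged sketch of the kind of partial cutting the tool performs, the example below uses GDAL's Python bindings to clip a bounding box out of a georeferenced raster; the synthetic file, coordinates, and parameters are hypothetical, and the actual tool assembles WMTS tiles before clipping.

```python
import numpy as np
from osgeo import gdal, osr

gdal.UseExceptions()

# Create a small synthetic georeferenced raster standing in for an assembled map image.
drv = gdal.GetDriverByName("GTiff")
src = drv.Create("synthetic_map.tif", 200, 200, 1, gdal.GDT_Byte)
src.SetGeoTransform([139.0, 0.01, 0, 37.0, 0, -0.01])  # ulx, xres, 0, uly, 0, -yres
srs = osr.SpatialReference()
srs.ImportFromEPSG(4326)
src.SetProjection(srs.ExportToWkt())
src.GetRasterBand(1).WriteArray(np.random.randint(0, 255, (200, 200), dtype=np.uint8))
src = None  # flush to disk

# Clip a bounding box (upper-left x, upper-left y, lower-right x, lower-right y).
gdal.Translate("map_clip.tif", "synthetic_map.tif",
               projWin=[139.5, 36.8, 139.9, 36.4], format="GTiff")
```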

  4. Web Search Studies: Multidisciplinary Perspectives on Web Search Engines

    NASA Astrophysics Data System (ADS)

    Zimmer, Michael

    Perhaps the most significant tool of our internet age is the web search engine, providing a powerful interface for accessing the vast amount of information available on the world wide web and beyond. While still in its infancy compared to the knowledge tools that precede it - such as the dictionary or encyclopedia - the impact of web search engines on society and culture has already received considerable attention from a variety of academic disciplines and perspectives. This article aims to organize a meta-discipline of “web search studies,” centered around a nucleus of major research on web search engines from five key perspectives: technical foundations and evaluations; transaction log analyses; user studies; political, ethical, and cultural critiques; and legal and policy analyses.

  5. Comparing apples and oranges: the Community Intercomparison Suite

    NASA Astrophysics Data System (ADS)

    Schutgens, Nick; Stier, Philip; Pascoe, Stephen

    2014-05-01

    Visual representation and comparison of geoscientific datasets present a huge challenge due to the large variety of file formats and the differing spatio-temporal sampling of data (be they observations or simulations). The Community Intercomparison Suite (CIS) attempts to greatly simplify these tasks for users by offering an intelligent but simple command line tool for visualisation and colocation of diverse datasets. In addition, CIS can subset and aggregate large datasets into smaller, more manageable datasets. Our philosophy is to remove, as far as possible, the need for the user to have specialist knowledge of the structure of a dataset. The colocation of observations with model data requires a single "cis col" command, which will resample the simulation data to the spatio-temporal sampling of the observations, contingent on a few user-defined options that specify a resampling kernel. CIS can deal with both gridded and ungridded datasets of 2, 3 or 4 spatio-temporal dimensions. It can handle different spatial coordinates (e.g. longitude or distance, altitude or pressure level). CIS supports HDF, netCDF and ASCII file formats. The suite is written in Python with entirely publicly available open source dependencies. Plug-ins allow a high degree of user customisation. A web-based developer hub includes a manual and simple examples. CIS is developed as open source code by a specialist IT company under the supervision of scientists from the University of Oxford, as part of investment in the JASMIN super-data-cluster facility at the Centre for Environmental Data Archival.
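
    As an illustration of what a nearest-neighbour colocation kernel does (not of CIS's internal code), the following Python sketch samples hypothetical gridded model values at ungridded observation locations.

    ```python
    # Illustrative nearest-neighbour colocation; the data and grid are made up.
    import numpy as np
    from scipy.spatial import cKDTree

    obs_points = np.array([[10.1, 50.2], [10.4, 50.9], [11.0, 51.3]])  # lon, lat
    lon, lat = np.meshgrid(np.arange(8.0, 14.0, 0.5), np.arange(48.0, 54.0, 0.5))
    model_points = np.column_stack([lon.ravel(), lat.ravel()])
    model_values = np.random.rand(model_points.shape[0])

    # Sample the model at each observation location (nearest grid point)
    _, idx = cKDTree(model_points).query(obs_points)
    model_at_obs = model_values[idx]
    print(model_at_obs)
    ```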

  6. Web-based decision support and visualization tools for water quality management in the Chesapeake Bay watershed

    USGS Publications Warehouse

    Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.

    2009-01-01

    Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select the appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool is being developed by USGS. The NYM tool is a web application that uses watershed yield estimates from the USGS SPAtially Referenced Regressions On Watershed attributes (SPARROW) model (Schwarz et al., 2006) [6] to allow water quality managers to identify important sources of nitrogen and phosphorus within the Chesapeake Bay watershed. The NYM tool utilizes new open source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high and low nutrient yield areas; web map engines; and data visualization and data management techniques.
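
    The kind of map request such an OpenLayers/GeoServer stack serves can be sketched from the client side in Python with OWSLib; the endpoint and layer name below are hypothetical placeholders, not the actual NYM services.

    ```python
    # Hypothetical WMS GetMap request against a GeoServer endpoint (placeholder URL/layer).
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
    img = wms.getmap(
        layers=["nym:nutrient_yields"],            # hypothetical layer name
        srs="EPSG:4326",
        bbox=(-79.5, 36.5, -74.5, 43.5),           # rough Chesapeake Bay watershed extent
        size=(600, 800),
        format="image/png",
        transparent=True,
    )
    with open("nutrient_yields.png", "wb") as f:
        f.write(img.read())
    ```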

  7. DEVELOPMENTS IN GRworkbench

    NASA Astrophysics Data System (ADS)

    Moylan, Andrew; Scott, Susan M.; Searle, Anthony C.

    2006-02-01

    The software tool GRworkbench is an ongoing project in visual, numerical General Relativity at The Australian National University. Recently, GRworkbench has been significantly extended to facilitate numerical experimentation in analytically-defined space-times. The numerical differential geometric engine has been rewritten using functional programming techniques, enabling objects which are normally defined as functions in the formalism of differential geometry and General Relativity to be directly represented as function variables in the C++ code of GRworkbench. The new functional differential geometric engine allows for more accurate and efficient visualisation of objects in space-times and makes new, efficient computational techniques available. Motivated by the desire to investigate a recent scientific claim using GRworkbench, new tools for numerical experimentation have been implemented, allowing for the simulation of complex physical situations.

  8. A Web Based Collaborative Design Environment for Spacecraft

    NASA Technical Reports Server (NTRS)

    Dunphy, Julia

    1998-01-01

    In this era of shrinking federal budgets in the USA, we need to dramatically improve the efficiency of the spacecraft engineering design process. We have come up with a method which captures much of the experts' expertise in a dataflow design graph: a seamlessly connectable set of local and remote design tools; seamlessly connectable web-based design tools; and a web browser interface to the developing spacecraft design. We have recently completed our first web browser interface and demonstrated its utility in the design of an aeroshell using design tools located at web sites at three NASA facilities. Multiple design engineers and managers are now able to interrogate the design engine simultaneously and find out what the design looks like at any point in the design cycle, what its parameters are, and how it reacts to adverse space environments.

  9. Social Web mining and exploitation for serious applications: Technosocial Predictive Analytics and related technologies for public health, environmental and national security surveillance.

    PubMed

    Kamel Boulos, Maged N; Sanfilippo, Antonio P; Corley, Courtney D; Wheeler, Steve

    2010-10-01

    This paper explores Technosocial Predictive Analytics (TPA) and related methods for Web "data mining" where users' posts and queries are garnered from Social Web ("Web 2.0") tools such as blogs, micro-blogging and social networking sites to form coherent representations of real-time health events. The paper includes a brief introduction to commonly used Social Web tools such as mashups and aggregators, and maps their exponential growth as an open architecture of participation for the masses and an emerging way to gain insight into the collective health status of whole populations. Several health-related tool examples are described and demonstrated as practical means through which health professionals might create clear, location-specific pictures of epidemiological data such as flu outbreaks. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  10. Enhanced STEM Learning with the GeoMapApp Data Exploration Tool

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.

    2014-12-01

    GeoMapApp (http://www.geomapapp.org) is a free, map-based data discovery and visualisation tool developed with NSF funding at Lamont-Doherty Earth Observatory. GeoMapApp provides casual and specialist users alike with access to hundreds of built-in geoscience data sets covering geology, geophysics, geochemistry, oceanography, climatology, cryospherics, and the environment. Users can also import their own data tables, spreadsheets, shapefiles, grids and images. Simple manipulation and analysis tools combined with layering capabilities and engaging visualisations provide a powerful platform with which to explore and interrogate geoscience data in its proper geospatial context, thus helping users to more easily gain insight into the meaning of the data. A global elevation base map covering the oceans as well as continents forms the backbone of GeoMapApp. The multi-resolution base map is updated regularly and includes data sources ranging from Space Shuttle elevation data for land areas to ultra-high-resolution surveys of coral reefs and seafloor hydrothermal vent fields. Examples of built-in data sets that can be layered over the elevation model include interactive earthquake and volcano data, plate tectonic velocities, hurricane tracks, land and ocean temperature, water column properties, age of the ocean floor, and deep submersible bottom photos. A versatile profiling tool provides instant access to data cross-sections. Contouring and 3-D views are also offered - the attached image shows a 3-D view of East Africa's Ngorongoro Crater as an example. Tabular data - both imported and built-in - can be displayed in a variety of ways and a lasso tool enables users to quickly select data points directly from the map. A range of STEM-based education material based upon GeoMapApp is already available, including a number of self-contained modules for school- and college-level students (http://www.geomapapp.org/education/contributed_material.html). More learning modules are planned, such as one on the effects of sea-level rise. GeoMapApp users include students, teachers, researchers, curriculum developers and outreach specialists.

  11. New approaches in assessing food intake in epidemiology.

    PubMed

    Conrad, Johanna; Koch, Stefanie A J; Nöthlings, Ute

    2018-06-22

    A promising direction for improving dietary intake measurement in epidemiologic studies is the combination of short-term and long-term dietary assessment methods using statistical methods. In this context, web-based instruments are particularly interesting as their application offers several potential advantages such as self-administration and a shorter completion time. The objective of this review is to provide an overview of new web-based short-term instruments and to describe their features. A number of web-based short-term dietary assessment tools for application in different countries and age-groups have been developed so far. Particular attention should be paid to the underlying database and the search function of the tool. Moreover, web-based instruments can improve the estimation of portion sizes by offering several options to the user. Web-based dietary assessment methods are associated with lower costs and reduced burden for participants and researchers, and show comparable validity to traditional instruments. When there is a need for a web-based tool, researchers should consider the adaptation of existing tools rather than developing new instruments. The combination of short-term and long-term instruments seems more feasible with the use of new technology.

  12. Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology.

    PubMed

    Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan

    2017-10-01

    Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 (www.comparativego.com). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.
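
    The core of a GO analysis is a term-by-term over-representation test. The sketch below shows a generic Fisher's exact test on made-up gene counts; it illustrates the statistical idea only and is not the Comparative GO implementation.

    ```python
    # Generic GO term over-representation test on hypothetical gene counts.
    from scipy.stats import fisher_exact

    study_size, pop_size = 200, 20000        # genes in the study list / background
    study_with_term = 12                     # study genes annotated with the GO term
    pop_with_term = 300                      # background genes annotated with the term

    # 2x2 table: study vs non-study genes, with vs without the term
    table = [
        [study_with_term, study_size - study_with_term],
        [pop_with_term - study_with_term,
         (pop_size - pop_with_term) - (study_size - study_with_term)],
    ]
    odds_ratio, p_value = fisher_exact(table, alternative="greater")
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")
    ```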

  13. Technology Integration in Science Classrooms: Framework, Principles, and Examples

    ERIC Educational Resources Information Center

    Kim, Minchi C.; Freemyer, Sarah

    2011-01-01

    A great number of technologies and tools have been developed to support science learning and teaching. However, science teachers and researchers point out numerous challenges to implementing such tools in science classrooms. For instance, guidelines, lesson plans, Web links, and tools teachers can easily find through Web-based search engines often…

  14. SETAC Short Course: Introduction to interspecies toxicity extrapolation using EPA’s Web-ICE tool

    EPA Science Inventory

    The Web-ICE tool is a user friendly interface that contains modules to predict acute toxicity to over 500 species of aquatic (algae, invertebrates, fish) and terrestrial (birds and mammals) taxa. The tool contains a suite of over 3000 ICE models developed from a database of over ...

  15. The Web-Database Connection Tools for Sharing Information on the Campus Intranet.

    ERIC Educational Resources Information Center

    Thibeault, Nancy E.

    This paper evaluates four tools for creating World Wide Web pages that interface with Microsoft Access databases: DB Gateway, Internet Database Assistant (IDBA), Microsoft Internet Database Connector (IDC), and Cold Fusion. The system requirements and features of each tool are discussed. A sample application, "The Virtual Help Desk"…

  16. Aligning Web-Based Tools to the Research Process Cycle: A Resource for Collaborative Research Projects

    ERIC Educational Resources Information Center

    Price, Geoffrey P.; Wright, Vivian H.

    2012-01-01

    Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…

  17. Tailored and Integrated Web-Based Tools for Improving Psychosocial Outcomes of Cancer Patients: The DoTTI Development Framework

    PubMed Central

    Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William

    2014-01-01

    Background Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. Objective The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. Methods The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. Results The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. Conclusions This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases. PMID:24641991

  18. Tailored and integrated Web-based tools for improving psychosocial outcomes of cancer patients: the DoTTI development framework.

    PubMed

    Smits, Rochelle; Bryant, Jamie; Sanson-Fisher, Rob; Tzelepis, Flora; Henskens, Frans; Paul, Christine; Stevenson, William

    2014-03-14

    Effective communication with cancer patients and their families about their disease, treatment options, and possible outcomes may improve psychosocial outcomes. However, traditional approaches to providing information to patients, including verbal information and written booklets, have a number of shortcomings centered on their limited ability to meet patient preferences and literacy levels. New-generation Web-based technologies offer an innovative and pragmatic solution for overcoming these limitations by providing a platform for interactive information seeking, information sharing, and user-centered tailoring. The primary goal of this paper is to discuss the advantages of comprehensive and iterative Web-based technologies for health information provision and propose a four-phase framework for the development of Web-based information tools. The proposed framework draws on our experience of constructing a Web-based information tool for hematological cancer patients and their families. The framework is based on principles for the development and evaluation of complex interventions and draws on the Agile methodology of software programming that emphasizes collaboration and iteration throughout the development process. The DoTTI framework provides a model for a comprehensive and iterative approach to the development of Web-based informational tools for patients. The process involves 4 phases of development: (1) Design and development, (2) Testing early iterations, (3) Testing for effectiveness, and (4) Integration and implementation. At each step, stakeholders (including researchers, clinicians, consumers, and programmers) are engaged in consultations to review progress, provide feedback on versions of the Web-based tool, and based on feedback, determine the appropriate next steps in development. This 4-phase framework is evidence-informed and consumer-centered and could be applied widely to develop Web-based programs for a diverse range of diseases.

  19. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  20. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  1. Teaching a Foreign Language to Deaf People via Vodcasting & Web 2.0 Tools

    NASA Astrophysics Data System (ADS)

    Drigas, Athanasios; Vrettaros, John; Tagoulis, Alexandors; Kouremenos, Dimitris

    This paper presents the design and development of an e-learning course for teaching a foreign language to deaf people whose first language is sign language. The course is based on e-material, vodcasting and web 2.0 tools such as social networking and blogs. The course has been designed especially for deaf people and explores the possibilities that e-learning material, vodcasting and web 2.0 tools can offer to enhance the learning process and achieve more effective learning results.

  2. Siberian Earth System Science Cluster - A web-based Geoportal to provide user-friendly Earth Observation Products for supporting NEESPI scientists

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Gerlach, R.; Hese, S.; Schmullius, C.

    2012-04-01

    To provide earth observation products for the area of Siberia, the Siberian Earth System Science Cluster (SIB-ESS-C) was established as a spatial data infrastructure at the University of Jena (Germany), Department for Earth Observation. This spatial data infrastructure implements standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO) for data discovery, data access, data processing and data analysis. The objective of SIB-ESS-C is to facilitate environmental research and Earth system science in Siberia. The project region covers the entire Asian part of the Russian Federation, approximately between 58°E - 170°W and 48°N - 80°N. To provide discovery, access and analysis services, a web portal was published for searching and visualising the available data. This web portal is based on current web technologies such as AJAX, the Drupal Content Management System as backend software, and a user-friendly interface with drag-and-drop and further mouse events. To offer a wide range of regularly updated earth observation products, several products from the MODIS sensor aboard the Aqua and Terra satellites were processed. A direct connection to NASA archive servers makes it possible to download MODIS Level 3 and 4 products and integrate them into the SIB-ESS-C infrastructure. These data can be downloaded in the Hierarchical Data Format (HDF). For visualisation and further analysis, the data are reprojected, converted to GeoTIFF, and global products are clipped to the project area. All these steps are implemented as an automatic process chain, which is executed whenever new MODIS data become available within the infrastructure; with the link to a MODIS catalogue system, the system receives new data daily. With the implemented analysis processes, time-series data can be analysed, for example to plot a trend or to compare different time series against one another. Scientists working in this area with MODIS data can make use of this service through the web portal: they can either search the NASA archive for MODIS data manually, have the data processed automatically and download it for further processing, or simply use the regularly updated products.
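
    The reproject-and-clip step of such a process chain can be sketched with the GDAL Python bindings; the file names and bounding box below are hypothetical, and a real chain would first extract the relevant subdataset from the MODIS HDF file.

    ```python
    # Hypothetical reproject-and-clip step of a MODIS processing chain (GDAL).
    from osgeo import gdal

    src = "mod13a2_subdataset.tif"       # assumed to be extracted from the HDF product
    dst = "mod13a2_sibessc.tif"

    gdal.Warp(
        dst, src,
        dstSRS="EPSG:4326",                       # reproject to geographic coordinates
        outputBounds=(58.0, 48.0, 179.9, 80.0),   # illustrative lon/lat bounds (eastern part only)
        format="GTiff",
    )
    ```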

  3. Developing creativity and problem-solving skills of engineering students: a comparison of web- and pen-and-paper-based approaches

    NASA Astrophysics Data System (ADS)

    Valentine, Andrew; Belski, Iouri; Hamilton, Margaret

    2017-11-01

    Problem-solving is a key engineering skill, yet is an area in which engineering graduates underperform. This paper investigates the potential of using web-based tools to teach students problem-solving techniques without the need to make use of class time. An idea generation experiment involving 90 students was designed. Students were surveyed about their study habits and reported they use electronic-based materials more than paper-based materials while studying, suggesting students may engage with web-based tools. Students then generated solutions to a problem task using either a paper-based template or an equivalent web interface. Students who used the web-based approach performed as well as students who used the paper-based approach, suggesting the technique can be successfully adopted and taught online. Web-based tools may therefore be adopted as supplementary material in a range of engineering courses as a way to increase students' options for enhancing problem-solving skills.

  4. Executing SADI services in Galaxy.

    PubMed

    Aranguren, Mikel Egaña; González, Alejandro Rodríguez; Wilkinson, Mark D

    2014-01-01

    In recent years, Galaxy has become a popular workflow management system in bioinformatics, due to its ease of installation, use and extension. The availability of Semantic Web-oriented tools in Galaxy, however, is limited. This is also the case for Semantic Web Services such as those provided by the SADI project, i.e. services that consume and produce RDF. Here we present SADI-Galaxy, a tool generator that deploys selected SADI Services as typical Galaxy tools. SADI-Galaxy is a Galaxy tool generator: through SADI-Galaxy, any SADI-compliant service becomes a Galaxy tool that can take advantage of other outstanding features of Galaxy such as data storage, history, workflow creation, and publication. Galaxy can also be used to execute and combine SADI services as it does with other Galaxy tools. Finally, we have semi-automated the packing and unpacking of data into RDF such that other Galaxy tools can easily be combined with SADI services, plugging the rich SADI Semantic Web Service environment into the popular Galaxy ecosystem. SADI-Galaxy bridges the gap between Galaxy, an easy-to-use but "static" workflow system with a wide user-base, and SADI, a sophisticated, semantic, discovery-based framework for Web Services, thus benefiting both user communities.
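
    Since SADI-style services consume and produce RDF, the data passed between such Galaxy tools looks roughly like the following rdflib sketch; the namespace and properties are hypothetical and only show the shape of the data, not SADI-Galaxy's own packing code.

    ```python
    # Tiny illustration of the kind of RDF input/output SADI-style services exchange.
    from rdflib import Graph, Namespace, Literal, URIRef
    from rdflib.namespace import RDF

    EX = Namespace("http://example.org/")          # hypothetical namespace
    g = Graph()
    gene = URIRef(EX["gene/BRCA2"])
    g.add((gene, RDF.type, EX.Gene))               # type the resource
    g.add((gene, EX.hasSymbol, Literal("BRCA2")))  # attach a literal property
    print(g.serialize(format="turtle"))
    ```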

  5. ddPCRclust - An R package and Shiny app for automated analysis of multiplexed ddPCR data.

    PubMed

    Brink, Benedikt G; Meskas, Justin; Brinkman, Ryan R

    2018-03-09

    Droplet digital PCR (ddPCR) is an emerging technology for quantifying DNA. By partitioning the target DNA into ∼20000 droplets, each serving as its own PCR reaction compartment, a very high sensitivity of DNA quantification can be achieved. However, manual analysis of the data is time-consuming and algorithms for automated analysis of non-orthogonal, multiplexed ddPCR data are unavailable, presenting a major bottleneck for the advancement of ddPCR transitioning from low-throughput to high-throughput. ddPCRclust is an R package for automated analysis of data from Bio-Rad's droplet digital PCR systems (QX100 and QX200). It can automatically analyse and visualise multiplexed ddPCR experiments with up to four targets per reaction. Results are on par with manual analysis, but only take minutes to compute instead of hours. The accompanying Shiny app ddPCRvis provides easy access to the functionalities of ddPCRclust through a web-browser based GUI. R package: https://github.com/bgbrink/ddPCRclust; Interface: https://github.com/bgbrink/ddPCRvis/; Web: https://bibiserv.cebitec.uni-bielefeld.de/ddPCRvis/. bbrink@cebitec.uni-bielefeld.de.

  6. Hospital-based nurses' perceptions of the adoption of Web 2.0 tools for knowledge sharing, learning, social interaction and the production of collective intelligence.

    PubMed

    Lau, Adela S M

    2011-11-11

    Web 2.0 provides a platform or a set of tools such as blogs, wikis, really simple syndication (RSS), podcasts, tags, social bookmarks, and social networking software for knowledge sharing, learning, social interaction, and the production of collective intelligence in a virtual environment. Web 2.0 is also becoming increasingly popular in e-learning and e-social communities. The objectives were to investigate how Web 2.0 tools can be applied for knowledge sharing, learning, social interaction, and the production of collective intelligence in the nursing domain and to investigate what behavioral perceptions are involved in the adoption of Web 2.0 tools by nurses. The decomposed technology acceptance model was applied to construct the research model on which the hypotheses were based. A questionnaire was developed based on the model and data from nurses (n = 388) were collected from late January 2009 until April 30, 2009. Pearson's correlation analysis and t tests were used for data analysis. Intention toward using Web 2.0 tools was positively correlated with usage behavior (r = .60, P < .05). Behavioral intention was positively correlated with attitude (r = .72, P < .05), perceived behavioral control (r = .58, P < .05), and subjective norm (r = .45, P < .05). In their decomposed constructs, perceived usefulness (r = .7, P < .05), relative advantage (r = .64, P < .05), and compatibility (r = .60, P < .05) were positively correlated with attitude, but perceived ease of use was not significantly correlated (r = .004, P < .05) with it. Peer (r = .47, P < .05), senior management (r = .24, P < .05), and hospital (r = .45, P < .05) influences had positive correlations with subjective norm. Resource (r = .41, P < .05) and technological (r = .69, P < .05) conditions were positively correlated with perceived behavioral control. The identified behavioral perceptions may further health policy makers' understanding of nurses' concerns regarding and barriers to the adoption of Web 2.0 tools and enable them to better plan the strategy of implementation of Web 2.0 tools for knowledge sharing, learning, social interaction, and the production of collective intelligence.
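
    For readers unfamiliar with the statistics reported above, the sketch below shows, on synthetic data, how a Pearson correlation of this kind is computed; it is a generic illustration, not the study's analysis code.

    ```python
    # Generic Pearson correlation on synthetic survey-like scores.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    intention = rng.integers(1, 6, size=388).astype(float)   # hypothetical Likert scores
    usage = intention + rng.normal(0.0, 1.0, size=388)       # correlated by construction

    r, p = pearsonr(intention, usage)
    print(f"r = {r:.2f}, P = {p:.3g}")
    ```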

  7. Hospital-Based Nurses’ Perceptions of the Adoption of Web 2.0 Tools for Knowledge Sharing, Learning, Social Interaction and the Production of Collective Intelligence

    PubMed Central

    2011-01-01

    Background Web 2.0 provides a platform or a set of tools such as blogs, wikis, really simple syndication (RSS), podcasts, tags, social bookmarks, and social networking software for knowledge sharing, learning, social interaction, and the production of collective intelligence in a virtual environment. Web 2.0 is also becoming increasingly popular in e-learning and e-social communities. Objectives The objectives were to investigate how Web 2.0 tools can be applied for knowledge sharing, learning, social interaction, and the production of collective intelligence in the nursing domain and to investigate what behavioral perceptions are involved in the adoption of Web 2.0 tools by nurses. Methods The decomposed technology acceptance model was applied to construct the research model on which the hypotheses were based. A questionnaire was developed based on the model and data from nurses (n = 388) were collected from late January 2009 until April 30, 2009. Pearson’s correlation analysis and t tests were used for data analysis. Results Intention toward using Web 2.0 tools was positively correlated with usage behavior (r = .60, P < .05). Behavioral intention was positively correlated with attitude (r = .72, P < .05), perceived behavioral control (r = .58, P < .05), and subjective norm (r = .45, P < .05). In their decomposed constructs, perceived usefulness (r = .7, P < .05), relative advantage (r = .64, P < .05), and compatibility (r = .60, P < .05) were positively correlated with attitude, but perceived ease of use was not significantly correlated (r = .004, P < .05) with it. Peer (r = .47, P < .05), senior management (r = .24, P < .05), and hospital (r = .45, P < .05) influences had positive correlations with subjective norm. Resource (r = .41, P < .05) and technological (r = .69, P < .05) conditions were positively correlated with perceived behavioral control. Conclusions The identified behavioral perceptions may further health policy makers’ understanding of nurses’ concerns regarding and barriers to the adoption of Web 2.0 tools and enable them to better plan the strategy of implementation of Web 2.0 tools for knowledge sharing, learning, social interaction, and the production of collective intelligence. PMID:22079851

  8. Navigating complex patients using an innovative tool: the MTM Spider Web.

    PubMed

    Morello, Candis M; Hirsch, Jan D; Lee, Kelly C

    2013-01-01

    To introduce a teaching tool that can be used to assess the complexity of medication therapy management (MTM) patients, prioritize appropriate interventions, and design patient-centered care plans for each encounter. MTM patients are complex as a result of multiple comorbidities, medications, and socioeconomic and behavioral issues. Pharmacists who provide MTM services are required to synthesize a plethora of information (medical and nonmedical), evaluate and prioritize the clinical problems, and design a comprehensive patient-centered care plan. The MTM Spider Web is a visual tool to facilitate this process. A description is provided regarding how to build the MTM Spider Web using case-based scenarios. This model can be used to teach pharmacists, health professional students, and patients. The MTM Spider Web is an innovative teaching tool that can be used to teach pharmacists and students how to assess complex patients and design a patient-centered care plan to deliver the most appropriate medication therapy.

  9. Mapping patient safety: a large-scale literature review using bibliometric visualisation techniques.

    PubMed

    Rodrigues, S P; van Eck, N J; Waltman, L; Jansen, F W

    2014-03-13

    The amount of scientific literature available is often overwhelming, making it difficult for researchers to have a good overview of the literature and to see relations between different developments. Visualisation techniques based on bibliometric data are helpful in obtaining an overview of the literature on complex research topics, and have been applied here to the topic of patient safety (PS). On the basis of title words and citation relations, publications in the period 2000-2010 related to PS were identified in the Scopus bibliographic database. A visualisation of the most frequently cited PS publications was produced based on direct and indirect citation relations between publications. Terms were extracted from titles and abstracts of the publications, and a visualisation of the most important terms was created. The main PS-related topics studied in the literature were identified using a technique for clustering publications and terms. A total of 8480 publications were identified, of which the 1462 most frequently cited ones were included in the visualisation. The publications were clustered into 19 clusters, which were grouped into three categories: (1) magnitude of PS problems (42% of all included publications); (2) PS risk factors (31%) and (3) implementation of solutions (19%). In the visualisation of PS-related terms, five clusters were identified: (1) medication; (2) measuring harm; (3) PS culture; (4) physician; (5) training, education and communication. Both analysis at publication and term level indicate an increasing focus on risk factors. A bibliometric visualisation approach makes it possible to analyse large amounts of literature. This approach is very useful for improving one's understanding of a complex research topic such as PS and for suggesting new research directions or alternative research priorities. For PS research, the approach suggests that more research on implementing PS improvement initiatives might be needed.
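
    The clustering of publications by citation relations can be illustrated with a small graph example; the publications and links below are invented, and the modularity-based method shown is a generic stand-in for the clustering technique used in the study.

    ```python
    # Toy citation graph clustered with a generic modularity-based method (networkx).
    import networkx as nx
    from networkx.algorithms import community

    G = nx.Graph()
    G.add_edges_from([
        ("pub1", "pub2"), ("pub1", "pub3"), ("pub2", "pub3"),   # densely linked group
        ("pub4", "pub5"), ("pub5", "pub6"), ("pub4", "pub6"),   # second group
        ("pub3", "pub4"),                                        # weak link between groups
    ])
    clusters = community.greedy_modularity_communities(G)
    for i, c in enumerate(clusters, start=1):
        print(f"cluster {i}: {sorted(c)}")
    ```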

  10. War Gamers Handbook: A Guide for Professional War Gamers

    DTIC Science & Technology

    2015-11-01

    more complex games led us to integrate knowledge management, web tools, and multitouch, multiuser technologies in order to more efficiently and... Multitouch multiuser (MTMU) and communications operating picture (COP) interfaces; Web development—Web tools and player interfaces... Now that the game...hurricane or flood scenario to provide a plausible backdrop to facilitate player interaction toward game objectives. Scenarios should include only the

  11. Using WebCT as a Supplemental Tool to Enhance Critical Thinking and Engagement among Developmental Reading Students

    ERIC Educational Resources Information Center

    Burgess, Melissa L.

    2009-01-01

    The purpose of this research was to examine possible outcomes of developmental students' critical thinking and motivation to read when the online learning community, WebCT, was implemented. My role, in addition to instructor, was that of participant-observer. I implemented WebCT tools, such as discussion board and chat, over a four-month period…

  12. Computed 3D visualisation of an extinct cephalopod using computer tomographs.

    PubMed

    Lukeneder, Alexander

    2012-08-01

    The first 3D visualisation of a heteromorph cephalopod species from the Southern Alps (Dolomites, northern Italy) is presented. Computed tomography, palaeontological data and 3D reconstructions were included in the production of a movie, which shows a life reconstruction of the extinct organism. This detailed reconstruction accords with current knowledge of the animal's shape, mode of life and habitat. The results are based on the most complete shell known thus far of the genus Dissimilites. Object-based combined analyses from computed tomography and various computed 3D facility programmes help to understand morphological details as well as their ontogenetic changes in fossil material. In this study, an additional goal was to show changes in locomotion during different ontogenetic phases of such fossil, marine shell-bearing animals (ammonoids). Hence, the presented models and tools can serve as starting points for discussions on morphology and locomotion of extinct cephalopods in general, and of the genus Dissimilites in particular. The heteromorph ammonoid genus Dissimilites is interpreted here as an active swimmer of the Tethyan Ocean. This study portrays non-destructive methods of 3D visualisation applied to palaeontological material, starting with computed tomography and resulting in animated, high-quality video clips. The 3D geometrical models and animation presented here, which are based on palaeontological material, demonstrate the wide range of applications and analytical techniques, and also outline possible limitations of 3D models in earth sciences and palaeontology. The realistic 3D models and motion pictures can easily be shared amongst palaeontologists. Data, images and short clips can be discussed online and, if necessary, adapted in morphological details and motion-style to better represent the cephalopod animal.

  13. a Geo-Visual Analytics Approach to Biological Shepherding: Modelling Animal Movements and Impacts

    NASA Astrophysics Data System (ADS)

    Benke, K. K.; Sheth, F.; Betteridge, K.; Pettit, C. J.; Aurambout, J.-P.

    2012-07-01

    The lamb industry in Victoria is a significant component of the state economy with annual exports in the vicinity of 1 billion. GPS and visualisation tools can be used to monitor grazing animal movements at the farm scale and observe interactions with the environment. Modelling the spatial-temporal movements of grazing animals in response to environmental conditions provides input for the design of paddocks with the aim of improving management procedures, animal performance and animal welfare. The term "biological shepherding" is associated with the re-design of environmental conditions and the analysis of responses from grazing animals. The combination of biological shepherding with geo-visual analytics (geo-spatial data analysis with visualisation) provides a framework for improving landscape design and supports research in grazing behaviour in variable landscapes, heat stress avoidance behaviour during summer months, and modelling excreta distributions (with respect to nitrogen emissions and nitrogen return for fertilising the paddock). Nitrogen losses due to excreta are mainly in the form of gaseous emissions to the atmosphere and leaching into the groundwater. In this study, background and context are provided in the case of biological shepherding and tracking animal movements. Examples are provided of recent applications in regional Australia and New Zealand. Based on experimental data and computer simulation, and using data visualisation and feature extraction, it was demonstrated that livestock excreta are not always randomly located, but concentrated around localised gathering points, sometimes separated by the nature of the excretion. Farmers require information on the nitrogen losses in order to reduce emissions to meet local and international nitrogen leaching and greenhouse gas targets and to improve the efficiency of nutrient management.

  14. Computed 3D visualisation of an extinct cephalopod using computer tomographs

    NASA Astrophysics Data System (ADS)

    Lukeneder, Alexander

    2012-08-01

    The first 3D visualisation of a heteromorph cephalopod species from the Southern Alps (Dolomites, northern Italy) is presented. Computed tomography, palaeontological data and 3D reconstructions were included in the production of a movie, which shows a life reconstruction of the extinct organism. This detailed reconstruction accords with current knowledge of the animal's shape, mode of life and habitat. The results are based on the most complete shell known thus far of the genus Dissimilites. Object-based combined analyses from computed tomography and various computed 3D facility programmes help to understand morphological details as well as their ontogenetic changes in fossil material. In this study, an additional goal was to show changes in locomotion during different ontogenetic phases of such fossil, marine shell-bearing animals (ammonoids). Hence, the presented models and tools can serve as starting points for discussions on morphology and locomotion of extinct cephalopods in general, and of the genus Dissimilites in particular. The heteromorph ammonoid genus Dissimilites is interpreted here as an active swimmer of the Tethyan Ocean. This study portrays non-destructive methods of 3D visualisation applied to palaeontological material, starting with computed tomography and resulting in animated, high-quality video clips. The 3D geometrical models and animation presented here, which are based on palaeontological material, demonstrate the wide range of applications and analytical techniques, and also outline possible limitations of 3D models in earth sciences and palaeontology. The realistic 3D models and motion pictures can easily be shared amongst palaeontologists. Data, images and short clips can be discussed online and, if necessary, adapted in morphological details and motion-style to better represent the cephalopod animal.

  15. Computed 3D visualisation of an extinct cephalopod using computer tomographs

    PubMed Central

    Lukeneder, Alexander

    2012-01-01

    The first 3D visualisation of a heteromorph cephalopod species from the Southern Alps (Dolomites, northern Italy) is presented. Computed tomography, palaeontological data and 3D reconstructions were included in the production of a movie, which shows a life reconstruction of the extinct organism. This detailed reconstruction accords with current knowledge of the animal's shape, mode of life and habitat. The results are based on the most complete shell known thus far of the genus Dissimilites. Object-based combined analyses from computed tomography and various computed 3D facility programmes help to understand morphological details as well as their ontogenetic changes in fossil material. In this study, an additional goal was to show changes in locomotion during different ontogenetic phases of such fossil, marine shell-bearing animals (ammonoids). Hence, the presented models and tools can serve as starting points for discussions on morphology and locomotion of extinct cephalopods in general, and of the genus Dissimilites in particular. The heteromorph ammonoid genus Dissimilites is interpreted here as an active swimmer of the Tethyan Ocean. This study portrays non-destructive methods of 3D visualisation applied to palaeontological material, starting with computed tomography and resulting in animated, high-quality video clips. The 3D geometrical models and animation presented here, which are based on palaeontological material, demonstrate the wide range of applications and analytical techniques, and also outline possible limitations of 3D models in earth sciences and palaeontology. The realistic 3D models and motion pictures can easily be shared amongst palaeontologists. Data, images and short clips can be discussed online and, if necessary, adapted in morphological details and motion-style to better represent the cephalopod animal. PMID:24850976

  16. Exploring the Relationship between Web 2.0 Tools Self-Efficacy and Teachers' Use of These Tools in Their Teaching

    ERIC Educational Resources Information Center

    Alhassan, Riyadh

    2017-01-01

    The purpose of this study was to examine the relationship between teachers' self-efficacy in using of Web 2.0 tools and some demographic variables, and their use of those tools in their teaching. The study data was collected from a random sample of public school teachers in Riyadh, Saudi Arabia. The results showed a strong positive relationship…

  17. Analysis of Utility and Use of a Web-Based Tool for Digital Signal Processing Teaching by Means of a Technological Acceptance Model

    ERIC Educational Resources Information Center

    Toral, S. L.; Barrero, F.; Martinez-Torres, M. R.

    2007-01-01

    This paper presents an exploratory study about the development of a structural and measurement model for the technological acceptance (TAM) of a web-based educational tool. The aim consists of measuring not only the use of this tool, but also the external variables with a significant influence in its use for planning future improvements. The tool,…

  18. A cloud based tool for knowledge exchange on local scale flood risk.

    PubMed

    Wilkinson, M E; Mackay, E; Quinn, P F; Stutter, M; Beven, K J; MacLeod, C J A; Macklin, M G; Elkhatib, Y; Percy, B; Vitolo, C; Haygarth, P M

    2015-09-15

    There is an emerging and urgent need for new approaches for the management of environmental challenges such as flood hazard in the broad context of sustainability. This requires a new way of working which bridges disciplines and organisations, and that breaks down science-culture boundaries. With this, there is growing recognition that the appropriate involvement of local communities in catchment management decisions can result in multiple benefits. However, new tools are required to connect organisations and communities. The growth of cloud based technologies offers a novel way to facilitate this process of exchange of information in environmental science and management; however, stakeholders need to be engaged with as part of the development process from the beginning rather than being presented with a final product at the end. Here we present the development of a pilot Local Environmental Virtual Observatory Flooding Tool. The aim was to develop a cloud based learning platform for stakeholders, bringing together fragmented data, models and visualisation tools that will enable these stakeholders to make scientifically informed environmental management decisions at the local scale. It has been developed by engaging with different stakeholder groups in three catchment case studies in the UK and a panel of national experts in relevant topic areas. However, these case study catchments are typical of many northern latitude catchments. The tool was designed to communicate flood risk in locally impacted communities whilst engaging with landowners/farmers about the risk of runoff from the farmed landscape. It has been developed iteratively to reflect the needs, interests and capabilities of a wide range of stakeholders. The pilot tool combines cloud based services, local catchment datasets, a hydrological model and bespoke visualisation tools to explore real time hydrometric data and the impact of flood risk caused by future land use changes. The novel aspects of the pilot tool are: the co-evolution of tools on a cloud based platform with stakeholders, policy makers and scientists; the encouragement of different science disciplines to work together; a wealth of information that is accessible and understandable to a range of stakeholders; and a framework for how to approach the development of such a cloud based tool in the future. Above all, stakeholders saw the tool and the potential of cloud technologies as an effective means to taking a whole systems approach to solving environmental issues. This sense of community ownership is essential in order to facilitate future appropriate and acceptable land use management decisions to be co-developed by local catchment communities. The development processes and the resulting pilot tool could be applied to local catchments globally to facilitate bottom up catchment management approaches. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Share2Quit: Web-Based Peer-Driven Referrals for Smoking Cessation

    PubMed Central

    2013-01-01

    Background Smoking is the number one preventable cause of death in the United States. Effective Web-assisted tobacco interventions are often underutilized and require new and innovative engagement approaches. Web-based peer-driven chain referrals successfully used outside health care have the potential for increasing the reach of Internet interventions. Objective The objective of our study was to describe the protocol for the development and testing of proactive Web-based chain-referral tools for increasing the access to Decide2Quit.org, a Web-assisted tobacco intervention system. Methods We will build and refine proactive chain-referral tools, including email and Facebook referrals. In addition, we will implement respondent-driven sampling (RDS), a controlled chain-referral sampling technique designed to remove inherent biases in chain referrals and obtain a representative sample. We will begin our chain referrals with an initial recruitment of former and current smokers as seeds (initial participants) who will be trained to refer current smokers from their social network using the developed tools. In turn, these newly referred smokers will also be provided the tools to refer other smokers from their social networks. We will model predictors of referral success using sample weights from the RDS to estimate the success of the system in the targeted population. Results This protocol describes the evaluation of proactive Web-based chain-referral tools, which can be used in tobacco interventions to increase the access to hard-to-reach populations, for promoting smoking cessation. Conclusions Share2Quit represents an innovative advancement by capitalizing on naturally occurring technology trends to recruit smokers to Web-assisted tobacco interventions. PMID:24067329

  20. CDPP Tools in the IMPEx infrastructure

    NASA Astrophysics Data System (ADS)

    Gangloff, Michel; Génot, Vincent; Bourrel, Nataliya; Hess, Sébastien; Khodachenko, Maxim; Modolo, Ronan; Kallio, Esa; Alexeev, Igor; Al-Ubaidi, Tarek; Cecconi, Baptiste; André, Nicolas; Budnik, Elena; Bouchemit, Myriam; Dufourg, Nicolas; Beigbeder, Laurent

    2014-05-01

    The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of plasma data products from space missions and ground observatories. Besides these activities, the CDPP has developed services like AMDA (http://amda.cdpp.eu/), which enables in-depth analysis of large amounts of data through dedicated functionalities such as visualization, conditional search and cataloguing, and 3DView (http://3dview.cdpp.eu/), which provides immersive visualisations of planetary environments and is being further developed to include simulation and observational data. Both tools implement the IMPEx protocol (http://impexfp7.oeaw.ac.at/) to give access to outputs of simulation runs and models in planetary sciences from several providers such as LATMOS, FMI and SINP; prototypes have also been built to access some UCLA and CCMC simulations. These tools and their interaction will be presented together with the IMPEx simulation data model (http://impex.latmos.ipsl.fr/tools/DataModel.htm) used for the interface to model databases.

  1. The use of GIS tools for road infrastructure safety management

    NASA Astrophysics Data System (ADS)

    Budzyński, Marcin; Kustra, Wojciech; Okraszewska, Romanika; Jamroz, Kazimierz; Pyrchla, Jerzy

    2018-01-01

    There are many factors that influence accidents and their severity. They can be grouped within the system of man, vehicle and environment. The article focuses on how GIS tools can be used to manage road infrastructure safety. To ensure a better understanding and identification of road factors, GIS tools help with the acquisition of road parameter data. Their other role is helping with a clear and effective presentation of risk ranking. GIS is key to identifying high-risk sections and supports the effective communication of safety levels. This makes it a vital element of safety management. The article describes the use of GIS for the collection and visualisation of road parameter data which are not available in any of the existing databases, i.e. horizontal curve parameters. As we know from research and statistics, they are important factors that determine the safety of road infrastructure. Finally, new research is proposed as well as the possibilities for applying GIS tools for the purposes of road safety inspection.

  2. Social Web mining and exploitation for serious applications: Technosocial Predictive Analytics and related technologies for public health, environmental and national security surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamel Boulos, Maged; Sanfilippo, Antonio P.; Corley, Courtney D.

    2010-03-17

    This paper explores techno-social predictive analytics (TPA) and related methods for Web “data mining” where users’ posts and queries are garnered from Social Web (“Web 2.0”) tools such as blogs, microblogging and social networking sites to form coherent representations of real-time health events. The paper includes a brief introduction to commonly used Social Web tools such as mashups and aggregators, and maps their exponential growth as an open architecture of participation for the masses and an emerging way to gain insight into the collective health status of whole populations. Several health-related tool examples are described and demonstrated as practical means through which health professionals might create clear, location-specific pictures of epidemiological data such as flu outbreaks.

  3. Application of ESE Data and Tools to Air Quality Management: Services for Helping the Air Quality Community use ESE Data (SHAirED)

    NASA Technical Reports Server (NTRS)

    Falke, Stefan; Husar, Rudolf

    2011-01-01

    The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US States and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and a service-oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions in the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications and established community-oriented infrastructure were: developing access to distributed data (surface and satellite); building Web infrastructure to support data access, processing and analysis; creating tools for data processing and analysis; and fostering air quality community collaboration and interoperability.
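
    As a rough illustration of the service-chaining idea (not the actual DataFed.net API), the sketch below passes the output of one hypothetical observation service to a second hypothetical rendering service.

    ```python
    # Conceptual sketch of chaining web services: one service returns
    # observations, a second renders them. Endpoint URLs and parameter
    # names are hypothetical, not the real DataFed interfaces.
    import requests

    OBS_SERVICE = "https://example.org/aq/observations"   # hypothetical data service
    MAP_SERVICE = "https://example.org/aq/render"         # hypothetical rendering service

    def chained_pm25_map(bbox, date):
        """Fetch PM2.5 observations for a bounding box, then request a rendered map."""
        obs = requests.get(OBS_SERVICE,
                           params={"param": "PM25", "bbox": bbox, "date": date},
                           timeout=60)
        obs.raise_for_status()
        # Pass the first service's output on to the second (service chaining).
        rendered = requests.post(MAP_SERVICE, json={"observations": obs.json()},
                                 timeout=60)
        rendered.raise_for_status()
        return rendered.content  # e.g. PNG bytes of the rendered air-quality map

    # Usage (requires the hypothetical services to exist):
    # png = chained_pm25_map("-125,24,-66,50", "2010-07-04")
    ```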

  4. Implementing Web 2.0 Tools in the Classroom: Four Teachers' Accounts

    ERIC Educational Resources Information Center

    Kovalik, Cindy; Kuo, Chia-Ling; Cummins, Megan; Dipzinski, Erin; Joseph, Paula; Laskey, Stephanie

    2014-01-01

    In this paper, four teachers shared their experiences using the following free Web 2.0 tools with their students: Jing, Wix, Google Sites, and Blogger. The teachers found that students reacted positively to lessons in which these tools were used, and also noted improvements they could make when using them in the future.

  5. eCDRweb User Guide–Primary Support

    EPA Pesticide Factsheets

    This document presents the user guide for the Office of Pollution Prevention and Toxics’ (OPPT) e-CDR web tool. E-CDRweb is the electronic, web-based tool provided by the Environmental Protection Agency (EPA) for the submission of Chemical Data Reporting (CDR) information. This document is the user guide for the Primary Support user of the e-CDRweb tool.

  6. Webquest 2.0: An Instructional Model for Digital Learners

    ERIC Educational Resources Information Center

    Dell, Diana F. Abernathy

    2012-01-01

    Teaching and learning tools such as Moodle and Web 2.0 tools are appearing in K-12 classrooms; however, there is a lack of scholarly research to guide the implementation of these tools. The WebQuest model, a widely adopted inquiry-based model for online instruction, has instructional inadequacies and does not make the most of emerging…

  7. Web-Based Machine Translation as a Tool for Promoting Electronic Literacy and Language Awareness

    ERIC Educational Resources Information Center

    Williams, Lawrence

    2006-01-01

    This article addresses a pervasive problem of concern to teachers of many foreign languages: the use of Web-Based Machine Translation (WBMT) by students who do not understand the complexities of this relatively new tool. Although networked technologies have greatly increased access to many language and communication tools, WBMT is still…

  8. eCDRweb User Guide–Secondary Support

    EPA Pesticide Factsheets

    This document presents the user guide for the Office of Pollution Prevention and Toxics’ (OPPT) e-CDR web tool. E-CDRweb is the electronic, web-based tool provided by the Environmental Protection Agency (EPA) for the submission of Chemical Data Reporting (CDR) information. This document is the user guide for the Secondary Support user of the e-CDRweb tool.

  9. Web Based Personal Nutrition Management Tool

    NASA Astrophysics Data System (ADS)

    Bozkurt, Selen; Zayim, Neşe; Gülkesen, Kemal Hakan; Samur, Mehmet Kemal

    The Internet is being used increasingly as a resource for accessing health-related information because of its several advantages, and tailored Internet interventions have therefore become popular in health education and personal health management. Today, there are many web-based health programs designed for individuals. Among these, nutrition and weight management programs are popular because obesity has become a heavy burden for populations worldwide. In this study, we designed a web-based personal nutrition education and management tool, The Nutrition Web Portal, in order to enhance patients' nutrition knowledge and promote behavioral change against obesity. The present paper reports the analysis, design and development processes of The Nutrition Web Portal.

  10. A Software Engineering Approach based on WebML and BPMN to the Mediation Scenario of the SWS Challenge

    NASA Astrophysics Data System (ADS)

    Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina

    Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).

  11. Charting Our Path with a Web Literacy Map

    ERIC Educational Resources Information Center

    Dalton, Bridget

    2015-01-01

    Being a literacy teacher today means being a teacher of Web literacies. This article features the "Web Literacy Map", an open source tool from Mozilla's Webmaker project. The map focuses on Exploring (Navigating the Web); Building (creating for the Web), and Connecting (Participating on the Web). Readers are invited to use resources,…

  12. The Role of the Web Server in a Capstone Web Application Course

    ERIC Educational Resources Information Center

    Umapathy, Karthikeyan; Wallace, F. Layne

    2010-01-01

    Web applications have become commonplace in the Information Systems curriculum. Much of the discussion about Web development for capstone courses has centered on the scripting tools. Very little has been discussed about different ways to incorporate the Web server into Web application development courses. In this paper, three different ways of…

  13. Genomic Enzymology: Web Tools for Leveraging Protein Family Sequence-Function Space and Genome Context to Discover Novel Functions.

    PubMed

    Gerlt, John A

    2017-08-22

    The exponentially increasing number of protein and nucleic acid sequences provides opportunities to discover novel enzymes, metabolic pathways, and metabolites/natural products, thereby adding to our knowledge of biochemistry and biology. The challenge has evolved from generating sequence information to mining the databases to integrating and leveraging the available information, i.e., the availability of "genomic enzymology" web tools. Web tools that allow identification of biosynthetic gene clusters are widely used by the natural products/synthetic biology community, thereby facilitating the discovery of novel natural products and the enzymes responsible for their biosynthesis. However, many novel enzymes with interesting mechanisms participate in uncharacterized small-molecule metabolic pathways; their discovery and functional characterization also can be accomplished by leveraging information in protein and nucleic acid databases. This Perspective focuses on two genomic enzymology web tools that assist the discovery of novel metabolic pathways: (1) Enzyme Function Initiative-Enzyme Similarity Tool (EFI-EST) for generating sequence similarity networks to visualize and analyze sequence-function space in protein families and (2) Enzyme Function Initiative-Genome Neighborhood Tool (EFI-GNT) for generating genome neighborhood networks to visualize and analyze the genome context in microbial and fungal genomes. Both tools have been adapted to other applications to facilitate target selection for enzyme discovery and functional characterization. As the natural products community has demonstrated, the enzymology community needs to embrace the essential role of web tools that allow the protein and genome sequence databases to be leveraged for novel insights into enzymological problems.
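
    To illustrate the sequence similarity network concept that EFI-EST implements at scale (this is not the EFI-EST code or its data), here is a toy sketch using networkx with made-up pairwise scores.

    ```python
    # Toy illustration of a sequence similarity network (SSN): proteins are
    # nodes, and an edge is drawn when a pairwise similarity score exceeds a
    # chosen threshold. Scores are made-up placeholders, not real alignments.
    import networkx as nx

    pairwise_scores = {               # (protein_a, protein_b): alignment score
        ("P1", "P2"): 180.0,
        ("P1", "P3"): 35.0,
        ("P2", "P3"): 40.0,
        ("P3", "P4"): 210.0,
    }

    def build_ssn(scores, threshold=100.0):
        g = nx.Graph()
        g.add_nodes_from({p for pair in scores for p in pair})
        for (a, b), s in scores.items():
            if s >= threshold:
                g.add_edge(a, b, score=s)
        return g

    ssn = build_ssn(pairwise_scores)
    # Connected components approximate isofunctional clusters in a real SSN.
    print([sorted(c) for c in nx.connected_components(ssn)])
    ```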

  14. Genomic Enzymology: Web Tools for Leveraging Protein Family Sequence–Function Space and Genome Context to Discover Novel Functions

    PubMed Central

    2017-01-01

    The exponentially increasing number of protein and nucleic acid sequences provides opportunities to discover novel enzymes, metabolic pathways, and metabolites/natural products, thereby adding to our knowledge of biochemistry and biology. The challenge has evolved from generating sequence information to mining the databases to integrating and leveraging the available information, i.e., the availability of “genomic enzymology” web tools. Web tools that allow identification of biosynthetic gene clusters are widely used by the natural products/synthetic biology community, thereby facilitating the discovery of novel natural products and the enzymes responsible for their biosynthesis. However, many novel enzymes with interesting mechanisms participate in uncharacterized small-molecule metabolic pathways; their discovery and functional characterization also can be accomplished by leveraging information in protein and nucleic acid databases. This Perspective focuses on two genomic enzymology web tools that assist the discovery of novel metabolic pathways: (1) Enzyme Function Initiative-Enzyme Similarity Tool (EFI-EST) for generating sequence similarity networks to visualize and analyze sequence–function space in protein families and (2) Enzyme Function Initiative-Genome Neighborhood Tool (EFI-GNT) for generating genome neighborhood networks to visualize and analyze the genome context in microbial and fungal genomes. Both tools have been adapted to other applications to facilitate target selection for enzyme discovery and functional characterization. As the natural products community has demonstrated, the enzymology community needs to embrace the essential role of web tools that allow the protein and genome sequence databases to be leveraged for novel insights into enzymological problems. PMID:28826221

  15. WebViz: A web browser based application for collaborative analysis of 3D data

    NASA Astrophysics Data System (ADS)

    Ruegg, C. S.

    2011-12-01

    In the age of high-speed Internet, where people can interact instantly, scientific tools have lacked technology that incorporates this kind of communication through the web. To address this, a web application for geological studies has been created, tentatively titled WebViz. It uses the Google Web Toolkit to build an AJAX web application with features usually found only in non-web-based software, so that it can act as a piece of software usable from anywhere in the world over a reasonably fast Internet connection. An application of this technology can be seen with data regarding the tsunami from the recent major Japan earthquake. After the data are prepared for the rendering software HVR, WebViz can request images of the tsunami data and display them to anyone who has access to the application. This convenience alone makes WebViz a viable solution, and the option to explore the data interactively with others around the world makes it a serious computational tool. WebViz can also be used in any JavaScript-enabled browser, such as those found on modern tablets and smartphones, over a fast wireless connection. Because WebViz is built with Google Web Toolkit, the application is highly portable. Many developers have been involved with the project, and each has contributed to the usability and speed of the application. The most recent version brings a dramatic speed increase and a more efficient user interface; the speed increase has been informally observed in recent uses of the application in China and Australia, with the hosting server located at the University of Minnesota. The user interface has been improved both in appearance and in functionality: the buttons used to rotate the 3D object have been replaced with a new layout whose function is easier to understand and which is easier to use on mobile devices. With these changes, WebViz is easier to control and use.

  16. Criteria for Comparing Children's Web Search Tools.

    ERIC Educational Resources Information Center

    Kuntz, Jerry

    1999-01-01

    Presents criteria for evaluating and comparing Web search tools designed for children. Highlights include database size; accountability; categorization; search access methods; help files; spell check; URL searching; links to alternative search services; advertising; privacy policy; and layout and design. (LRW)

  17. Network Computing Infrastructure to Share Tools and Data in Global Nuclear Energy Partnership

    NASA Astrophysics Data System (ADS)

    Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya

    CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype system of a network computing infrastructure for sharing tools and data to support the U.S.-Japan collaboration in GNEP (Global Nuclear Energy Partnership). In applying our information-processing infrastructure, we focused on three technical issues: accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For accessibility, we adopted SSL-VPN (Secure Sockets Layer Virtual Private Network) technology for access across firewalls. For security, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism to strengthen security. We also set a fine-grained access control policy for shared tools and data and used a shared-key encryption method to protect them against leakage to third parties. For usability, we chose Web browsers as the user interface and developed a Web application that provides functions for sharing tools and data. Using WebDAV (Web-based Distributed Authoring and Versioning), users can manipulate shared tools and data through a Windows-like folder environment. We implemented the prototype system in the Grid infrastructure for atomic energy research, AEGIS (Atomic Energy Grid Infrastructure), developed by CCSE/JAEA. The prototype system was applied for trial use in the first period of GNEP.
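
    As a small illustration of the WebDAV access described above, a client might list a shared collection as sketched below. The server URL and credentials are placeholders, and the SSL-VPN and PKI gateway layers of the prototype are not reproduced.

    ```python
    # Minimal sketch of listing a shared folder over WebDAV with a
    # PROPFIND request. URL and credentials are placeholders only.
    import xml.etree.ElementTree as ET
    import requests

    DAV_URL = "https://example-gateway.org/dav/shared/"  # hypothetical share

    def list_collection(url, user, password):
        """Issue a WebDAV PROPFIND (Depth: 1) and return the member hrefs."""
        resp = requests.request("PROPFIND", url,
                                headers={"Depth": "1"},
                                auth=(user, password),
                                timeout=30)
        resp.raise_for_status()
        ns = {"d": "DAV:"}
        root = ET.fromstring(resp.content)
        return [href.text for href in root.findall(".//d:href", ns)]

    # for entry in list_collection(DAV_URL, "user", "secret"):
    #     print(entry)
    ```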

  18. MyFreePACS: a free web-based radiology image storage and viewing tool.

    PubMed

    de Regt, David; Weinberger, Ed

    2004-08-01

    We developed an easy-to-use method for central storage and subsequent viewing of radiology images for use on any PC equipped with Internet Explorer. We developed MyFreePACS, a program that uses a DICOM server to receive and store images and transmit them over the Web to the MyFreePACS Web client. The MyFreePACS Web client is a Web page that uses an ActiveX control for viewing and manipulating images. The client contains many of the tools found in modern image viewing stations including 3D localization and multiplanar reformation. The system is built entirely with free components and is freely available for download and installation from the Web at www.myfreepacs.com.
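
    The paper does not publish its server code; purely as an illustration of the receive-and-store step, and assuming the pynetdicom library, a minimal DICOM storage SCP could look like the sketch below. MyFreePACS itself used its own DICOM server and an ActiveX web viewer.

    ```python
    # Sketch of a minimal DICOM storage SCP: receive images over C-STORE
    # and write them to disk, named by SOP Instance UID.
    from pathlib import Path
    from pynetdicom import AE, evt, StoragePresentationContexts

    ARCHIVE = Path("archive")
    ARCHIVE.mkdir(exist_ok=True)

    def handle_store(event):
        """Write each received dataset to the archive."""
        ds = event.dataset
        ds.file_meta = event.file_meta
        ds.save_as(ARCHIVE / f"{ds.SOPInstanceUID}.dcm")
        return 0x0000  # Success status

    ae = AE(ae_title="MINIPACS")
    ae.supported_contexts = StoragePresentationContexts
    # Blocks and serves C-STORE requests on port 11112.
    ae.start_server(("0.0.0.0", 11112), evt_handlers=[(evt.EVT_C_STORE, handle_store)])
    ```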

  19. Visualising ‘work done’ with a simple flying wheel toy

    NASA Astrophysics Data System (ADS)

    Amir, Nazir

    2018-07-01

    A way to scaffold students’ understanding of abstract physics concepts is through fun activities that help in visualisation. This article highlights how a simple flying wheel toy can be used as a demonstration kit to present the equation ‘Work Done = Force × Distance’. The kit has helped the author’s students visualise the equation and appreciate how its components relate to one another, which might otherwise have remained abstract for them.

  20. Perceived Benefits and Attitudes of Student Teachers to Web-Quest as a Motivating, Creative and Inquiry-Based Learning Tool in Education

    ERIC Educational Resources Information Center

    Aina, Samuel Ayobami; Sofowora, Alaba Olaniyi

    2013-01-01

    This study discussed how the Department of Teacher Education, University of Ibadan utilized Web-Quest as a motivating and creative tool to teach a compulsory and large pre-service teachers' Course (TEE 304) The study also investigated the attitude and perception of pre-service teachers to the use of Web-Quest. The results showed that the sample…

  1. WebScope: A New Tool for Fusion Data Analysis and Visualization

    NASA Astrophysics Data System (ADS)

    Yang, Fei; Dang, Ningning; Xiao, Bingjia

    2010-04-01

    A visualization tool was developed as a web browser application based on Java applets embedded in HTML pages, in order to provide worldwide access to the EAST experimental data. It can display data from various trees on different servers in a single panel. With WebScope, it is easier to compare different data sources and to perform simple calculations across them.
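
    As a generic illustration of the comparison idea (synthetic placeholder signals, not EAST data and not the WebScope implementation), two sources can be overlaid in one panel and a simple calculation performed across them.

    ```python
    # Overlay two signals from different sources and plot their difference.
    # The arrays are synthetic placeholders.
    import numpy as np
    import matplotlib.pyplot as plt

    t = np.linspace(0, 5, 500)                               # time axis (s)
    sig_a = 0.40 * np.sin(2 * np.pi * 0.5 * t)               # signal from source A
    sig_b = 0.38 * np.sin(2 * np.pi * 0.5 * t + 0.1)         # signal from source B

    fig, ax = plt.subplots()
    ax.plot(t, sig_a, label="source A")
    ax.plot(t, sig_b, label="source B")
    ax.plot(t, sig_a - sig_b, label="A - B (simple calculation)")
    ax.set_xlabel("time (s)")
    ax.legend()
    plt.show()
    ```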

  2. iAnn: an event sharing platform for the life sciences.

    PubMed

    Jimenez, Rafael C; Albar, Juan P; Bhak, Jong; Blatter, Marie-Claude; Blicher, Thomas; Brazas, Michelle D; Brooksbank, Cath; Budd, Aidan; De Las Rivas, Javier; Dreyer, Jacqueline; van Driel, Marc A; Dunn, Michael J; Fernandes, Pedro L; van Gelder, Celia W G; Hermjakob, Henning; Ioannidis, Vassilios; Judge, David P; Kahlem, Pascal; Korpelainen, Eija; Kraus, Hans-Joachim; Loveland, Jane; Mayer, Christine; McDowall, Jennifer; Moran, Federico; Mulder, Nicola; Nyronen, Tommi; Rother, Kristian; Salazar, Gustavo A; Schneider, Reinhard; Via, Allegra; Villaveces, Jose M; Yu, Ping; Schneider, Maria V; Attwood, Teresa K; Corpas, Manuel

    2013-08-01

    We present iAnn, an open source community-driven platform for dissemination of life science events, such as courses, conferences and workshops. iAnn allows automatic visualisation and integration of customised event reports. A central repository lies at the core of the platform: curators add submitted events, and these are subsequently accessed via web services. Thus, once an iAnn widget is incorporated into a website, it permanently shows timely relevant information as if it were native to the remote site. At the same time, announcements submitted to the repository are automatically disseminated to all portals that query the system. To facilitate the visualization of announcements, iAnn provides powerful filtering options and views, integrated in Google Maps and Google Calendar. All iAnn widgets are freely available. Availability: http://iann.pro/iannviewer. Contact: manuel.corpas@tgac.ac.uk

  3. The Ensembl genome database project.

    PubMed

    Hubbard, T; Barker, D; Birney, E; Cameron, G; Chen, Y; Clark, L; Cox, T; Cuff, J; Curwen, V; Down, T; Durbin, R; Eyras, E; Gilbert, J; Hammond, M; Huminiecki, L; Kasprzyk, A; Lehvaslaiho, H; Lijnzaad, P; Melsopp, C; Mongin, E; Pettett, R; Pocock, M; Potter, S; Rust, A; Schmidt, E; Searle, S; Slater, G; Smith, J; Spooner, W; Stabenau, A; Stalker, J; Stupka, E; Ureta-Vidal, A; Vastrik, I; Clamp, M

    2002-01-01

    The Ensembl (http://www.ensembl.org/) database project provides a bioinformatics framework to organise biology around the sequences of large genomes. It is a comprehensive source of stable automatic annotation of the human genome sequence, with confirmed gene predictions that have been integrated with external data sources, and is available as either an interactive web site or as flat files. It is also an open source software engineering project to develop a portable system able to handle very large genomes and associated requirements from sequence analysis to data storage and visualisation. The Ensembl site is one of the leading sources of human genome sequence annotation and provided much of the analysis for publication by the international human genome project of the draft genome. The Ensembl system is being installed around the world in both companies and academic sites on machines ranging from supercomputers to laptops.

  4. 75 FR 35765 - Proposed Information Collection; Comment Request; BroadbandMatch Web Site Tool

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-23

    ... DEPARTMENT OF COMMERCE National Telecommunications and Information Administration Proposed Information Collection; Comment Request; BroadbandMatch Web Site Tool AGENCY: National Telecommunications and Information Administration, Commerce. ACTION: Notice. SUMMARY: The Department of Commerce, as part of its...

  5. A Java tool for dynamic web-based 3D visualization of anatomy and overlapping gene or protein expression patterns.

    PubMed

    Gerth, Victor E; Vize, Peter D

    2005-04-01

    The Gene Expression Viewer is a web-launched three-dimensional visualization tool, tailored to compare surface reconstructions of multi-channel image volumes generated by confocal microscopy or micro-CT.

  6. Mass Spectrometry Imaging of low Molecular Weight Compounds in Garlic (Allium sativum L.) with Gold Nanoparticle Enhanced Target.

    PubMed

    Misiorek, Maria; Sekuła, Justyna; Ruman, Tomasz

    2017-11-01

    Garlic (Allium sativum) is the subject of many studies due to its numerous beneficial properties. Although the compounds of garlic have been studied by various analytical methods, their tissue distributions are still unclear. Mass spectrometry imaging (MSI) appears to be a very powerful tool for identifying the localisation of compounds within a garlic clove. The objective was the visualisation of the spatial distribution of low molecular weight garlic compounds with nanoparticle-based MSI. Compounds occurring on the cross-section of sprouted garlic were transferred to a gold nanoparticle-enhanced target (AuNPET) by imprinting, and the imprint was then subjected to MSI analysis. The results suggest that low molecular weight compounds, such as amino acids, dipeptides, fatty acids, organosulphur and organoselenium compounds, are distributed within the garlic clove in a characteristic manner, which can be connected with their biological functions and metabolic properties in the plant. This new methodology for the visualisation of low molecular weight compounds allowed a correlation to be made between their spatial distribution within a sprouted garlic clove and their biological function. Copyright © 2017 John Wiley & Sons, Ltd.

  7. On the retrieval of crystallographic information from atom probe microscopy data via signal mapping from the detector coordinate space.

    PubMed

    Wallace, Nathan D; Ceguerra, Anna V; Breen, Andrew J; Ringer, Simon P

    2018-06-01

    Atom probe tomography is a powerful microscopy technique capable of reconstructing the 3D position and chemical identity of millions of atoms within engineering materials, at the atomic level. Crystallographic information contained within the data is particularly valuable for the purposes of reconstruction calibration and grain boundary analysis. Typically, analysing this data is a manual, time-consuming and error prone process. In many cases, the crystallographic signal is so weak that it is difficult to detect at all. In this study, a new automated signal processing methodology is demonstrated. We use the affine properties of the detector coordinate space, or the 'detector stack', as the basis for our calculations. The methodological framework and the visualisation tools are shown to be superior to the standard method of crystallographic pole visualisation directly from field evaporation images and there is no requirement for iterations between a full real-space initial tomographic reconstruction and the detector stack. The mapping approaches are demonstrated for aluminium, tungsten, magnesium and molybdenum. Implications for reconstruction calibration, accuracy of crystallographic measurements, reliability and repeatability are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Optical visualisation of thermogenesis in stimulated single-cell brown adipocytes.

    PubMed

    Kriszt, Rókus; Arai, Satoshi; Itoh, Hideki; Lee, Michelle H; Goralczyk, Anna G; Ang, Xiu Min; Cypess, Aaron M; White, Andrew P; Shamsi, Farnaz; Xue, Ruidan; Lee, Jung Yeol; Lee, Sung-Chan; Hou, Yanyan; Kitaguchi, Tetsuya; Sudhaharan, Thankiah; Ishiwata, Shin'ichi; Lane, E Birgitte; Chang, Young-Tae; Tseng, Yu-Hua; Suzuki, Madoka; Raghunath, Michael

    2017-05-03

    The identification of brown adipose deposits in adults has led to significant interest in targeting this metabolically active tissue for treatment of obesity and diabetes. Improved methods for the direct measurement of heat production as the signature function of brown adipocytes (BAs), particularly at the single cell level, would be of substantial benefit to these ongoing efforts. Here, we report the first application of a small molecule-type thermosensitive fluorescent dye, ERthermAC, to monitor thermogenesis in BAs derived from murine brown fat precursors and in human brown fat cells differentiated from human neck brown preadipocytes. ERthermAC accumulated in the endoplasmic reticulum of BAs and displayed a marked change in fluorescence intensity in response to adrenergic stimulation of cells, which corresponded to temperature change. ERthermAC fluorescence intensity profiles were congruent with mitochondrial depolarisation events visualised by the JC-1 probe. Moreover, the averaged fluorescence intensity changes across a population of cells correlated well with dynamic changes such as thermal power, oxygen consumption, and extracellular acidification rates. These findings suggest ERthermAC as a promising new tool for studying thermogenic function in brown adipocytes of both murine and human origins.

  9. The ESA FELYX High Resolution Diagnostic Data Set System Design and Implementation

    NASA Astrophysics Data System (ADS)

    Taberner, M.; Shutler, J.; Walker, P.; Poulter, D.; Piolle, J.-F.; Donlon, C.; Guidetti, V.

    2013-10-01

    Felyx is currently under development and is the latest evolution of a generalised High Resolution Diagnostic Data Set system funded by ESA. It draws on previous prototype developments and experience in the GHRSST, Medspiration, GlobColour and GlobWave projects. In this paper, we outline the design and implementation of the system and illustrate it using the Ocean Colour demonstration activities. Felyx is fundamentally a tool to facilitate the analysis of EO data; it is being developed by IFREMER, PML and Pelamis and will be free software written in Python and JavaScript. The aim is to provide Earth Observation data producers and users with an open-source, flexible and reusable tool that allows the quality and performance of data streams from satellite, in situ and model sources to be easily monitored and studied. New to this project is the ability to establish and incorporate multi-sensor match-up database capabilities. The system will be deployable anywhere and even includes interaction mechanisms between deployed instances. The primary concept of felyx is to work as an extraction tool: it extracts subsets of source data over predefined target areas (which can be static or moving). These data subsets, and associated metrics, can then be accessed by users or client applications either as raw files or through automatic alerts, and used to generate periodic reports or for statistical analysis and visualisation through a flexible web interface. Felyx can thus be used for subsetting, the generation of statistics, the generation of reports or warnings/alerts, and in-depth analyses, among other things. There are many potential applications, but important foreseen uses are: monitoring and assessing the quality of Earth observations (e.g. satellite products and time series) through statistical analysis and/or comparison with other data sources; assessing and inter-comparing geophysical inversion algorithms; observing a given phenomenon by collecting and accumulating various parameters over a defined area; and crossing different sources of data for synergy applications. The services provided by felyx will be generic, deployable at users' own premises, and flexible, allowing the integration of any kind of parameter. Users will be able to operate their own felyx instance at any location, on datasets and parameters of their own interest, and the various instances will be able to interact with each other, creating a web of felyx systems enabling aggregation and cross-comparison of miniProds and metrics from multiple sources. Initially, two instances will be operated simultaneously during a six-month demonstration phase: one at IFREMER, on sea surface temperature and ocean wave datasets, and one at PML, on ocean colour.
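
    As a conceptual sketch of the subset-and-metrics idea (the file name, variable name and box coordinates are placeholders, not felyx's actual interfaces), an extraction over a target area and its summary statistics might be computed as follows.

    ```python
    # Extract a subset of a satellite field over a predefined target area
    # and compute summary metrics. All names and coordinates are placeholders.
    import xarray as xr

    ds = xr.open_dataset("sst_granule.nc")          # hypothetical input granule
    # Assumes ascending lat/lon coordinates; adjust the slices otherwise.
    box = ds["sea_surface_temperature"].sel(lat=slice(40.0, 45.0),
                                            lon=slice(-12.0, -5.0))

    metrics = {
        "mean": float(box.mean()),
        "std": float(box.std()),
        "count": int(box.count()),
    }
    print(metrics)   # per-site metrics that a felyx-like report could aggregate
    ```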

  10. Creating a Classroom Kaleidoscope with the World Wide Web.

    ERIC Educational Resources Information Center

    Quinlan, Laurie A.

    1997-01-01

    Discusses the elements of classroom Web presentations: planning; construction, including design tips; classroom use; and assessment. Lists 14 World Wide Web resources for K-12 teachers; Internet search tools (directories, search engines and meta-search engines); a Web glossary; and an example of HTML for a simple Web page. (PEN)

  11. WEBCAP: Web Scheduler for Distance Learning Multimedia Documents with Web Workload Considerations

    ERIC Educational Resources Information Center

    Habib, Sami; Safar, Maytham

    2008-01-01

    In many web applications, such as the distance learning, the frequency of refreshing multimedia web documents places a heavy burden on the WWW resources. Moreover, the updated web documents may encounter inordinate delays, which make it difficult to retrieve web documents in time. Here, we present an Internet tool called WEBCAP that can schedule…

  12. Proposition and Organization of an Adaptive Learning Domain Based on Fusion from the Web

    ERIC Educational Resources Information Center

    Chaoui, Mohammed; Laskri, Mohamed Tayeb

    2013-01-01

    The Web allows self-navigated education through interaction with large amounts of Web resources. While enjoying the flexibility of Web tools, authors may suffer from research and filtering Web resources, when they face various resources formats and complex structures. An adaptation of extracted Web resources must be assured by authors, to give…

  13. Applying Web Usage Mining for Personalizing Hyperlinks in Web-Based Adaptive Educational Systems

    ERIC Educational Resources Information Center

    Romero, Cristobal; Ventura, Sebastian; Zafra, Amelia; de Bra, Paul

    2009-01-01

    Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender engine is integrated into the AHA! system in…

  14. Reading the Writing on the Graffiti Wall: The World Wide Web and Training.

    ERIC Educational Resources Information Center

    Jones, Charles M.

    This paper examines the benefits to be derived from networked computer-based instruction (CBI) and discusses the potential of the World Wide Web (WWW) as an effective tool in employee training. Methods of utilizing the WWW as a training tool and communication tool are explored. The discussion is divided into the following sections: (1) "WWW and…

  15. Web-Site as an Educational Tool in Biology Education: A Case of Nutrition Issue

    ERIC Educational Resources Information Center

    Fancovicova, Jana; Prokop, Pavol; Usak, Muhammet

    2010-01-01

    The purpose of the study was to evaluate the efficacy and feasibility of using website in biology education. We have explored the World Wide Web as a possible tool for education about health and nutrition. The websites were teaching tools for primary school students. Control groups used the traditional educational materials as books or worksheets,…

  16. CoCoFolio: A Web-Based Electronic Portfolio for Enriching Students' Learning by Collaboration.

    ERIC Educational Resources Information Center

    Sugiyama, Takeshi; Kakehi, Naoyuki; Kura, Tsuneko; Takahashi, Tokiichiro

    A Web-based electronic portfolio, CoCoFolio, was developed for enriching students' learning by collaboration. CoCoFolio consists of two collaboration tools: a multi-layer drawing tool, CoCoBoard, and a small bulletin board, Discussion Board, for each student's submission. These tools support a series of expression activities: expression, sharing,…

  17. Using Web Database Tools To Facilitate the Construction of Knowledge in Online Courses.

    ERIC Educational Resources Information Center

    McNeil, Sara G.; Robin, Bernard R.

    This paper presents an overview of database tools that dynamically generate World Wide Web materials and focuses on the use of these tools to support research activities, as well as teaching and learning. Database applications have been used in classrooms to support learning activities for over a decade, but, although business and e-commerce have…

  18. Choosing Web 2.0 Tools for Instruction: An Extension of Task-Technology Fit

    ERIC Educational Resources Information Center

    Gupta, Saurabh

    2014-01-01

    The growth of technology and the inclusion of "digital natives" as students in the education world have created a demand pull for the use of Web 2.0 technologies in education. Dominant among these tools have been wikis, blogs and discussion boards. Distance education experts view the use of these tools as differentiators when compared to…

  19. e-CDRweb User Guide – Secondary Authorized Official

    EPA Pesticide Factsheets

    This document presents the user guide for the Office of Pollution Prevention and Toxics’ (OPPT) e-CDRweb tool. E-CDRweb is the electronic, web-based tool provided by the Environmental Protection Agency (EPA) for the submission of Chemical Data Reporting (CDR) information. This document is the user guide for the Secondary Authorized Official (AO) user of the e-CDR web tool.

  20. CCDST: A free Canadian climate data scraping tool

    NASA Astrophysics Data System (ADS)

    Bonifacio, Charmaine; Barchyn, Thomas E.; Hugenholtz, Chris H.; Kienzle, Stefan W.

    2015-02-01

    In this paper we present a new software tool that automatically fetches, downloads and consolidates climate data from a Web database where the data are contained on multiple Web pages. The tool is called the Canadian Climate Data Scraping Tool (CCDST) and was developed to enhance access and simplify analysis of climate data from Canada's National Climate Data and Information Archive (NCDIA). The CCDST deconstructs a URL for a particular climate station in the NCDIA and then iteratively modifies the date parameters to download large volumes of data, remove individual file headers, and merge data files into one output file. This automated sequence enhances access to climate data by substantially reducing the time needed to manually download data from multiple Web pages. To illustrate, we present a case study of the temporal dynamics of blowing snow events in which the tool saved roughly 3.1 weeks of manual effort. Without the CCDST, the time involved in manually downloading climate data limits access and restrains researchers and students from exploring climate trends. The tool is coded as a Microsoft Excel macro and is available to researchers and students for free. The main concept and structure of the tool can be modified for other Web databases hosting geophysical data.
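
    The CCDST itself is an Excel macro; purely as an illustration of the same fetch-iterate-merge pattern (the URL, parameter names and header size below are hypothetical), a Python sketch might look like this.

    ```python
    # Sketch of the scraping pattern: step the date parameters month by month,
    # download each CSV, strip repeated headers and append rows to one file.
    import csv
    import io
    import requests

    BASE_URL = "https://example.gc.ca/climate_data/bulk_data_e.html"  # hypothetical

    def fetch_month(station_id, year, month):
        """Download one month of hourly data as CSV text."""
        params = {"stationID": station_id, "Year": year, "Month": month,
                  "timeframe": 1, "format": "csv"}
        resp = requests.get(BASE_URL, params=params, timeout=60)
        resp.raise_for_status()
        return resp.text

    def consolidate(station_id, year, out_path, header_rows=16):
        """Merge 12 monthly files into one, keeping a single header block."""
        with open(out_path, "w", newline="") as out:
            writer = csv.writer(out)
            for month in range(1, 13):
                rows = list(csv.reader(io.StringIO(fetch_month(station_id, year, month))))
                start = header_rows if month > 1 else 0   # strip repeated headers
                writer.writerows(rows[start:])

    # consolidate(station_id=2205, year=2012, out_path="station_2205_2012.csv")
    ```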
