Sample records for "user easy access"

  1. Access Control for Home Data Sharing: Attitudes, Needs and Practices

    DTIC Science & Technology

    2009-10-01

    cameras, mobile phones and portable music players make creating and interacting with this content easy. Home users are increasingly interested in...messages, photos, home videos, journal files and home musical recordings. Many participants considered unauthorized access by strangers, acquaintances...configuration does not allow users to share different subsets of music with different people. Facebook supplies rich, customizable access controls for

  2. Tools for discovering and accessing Great Lakes scientific data

    USGS Publications Warehouse

    Lucido, Jessica M.; Bruce, Jennifer L.

    2015-01-01

    The USGS strives to develop data products that are easy to find, easy to understand, and easy to use through Web-accessible tools that allow users to learn about the breadth and scope of GLRI activities being undertaken by the USGS and its partners. By creating tools that enable data to be shared and reused more easily, the USGS can encourage collaboration and assist the GL community in finding, interpreting, and understanding the information created during GLRI science activities.

  3. A tool for improving the Web accessibility of visually handicapped persons.

    PubMed

    Fujiki, Tadayoshi; Hanada, Eisuke; Yamada, Tomomi; Noda, Yoshihiro; Antoku, Yasuaki; Nakashima, Naoki; Nose, Yoshiaki

    2006-04-01

    Much has been written concerning the difficulties faced by visually handicapped persons when they access the internet. To solve some of these problems and to make web pages more accessible, we developed a tool we call the "Easy Bar," which works as a toolbar on the web browser. The functions of the Easy Bar are to change the size of web texts and images, to adjust the color, and to clear cached data that is automatically saved by the web browser. These functions are executed with ease by clicking buttons and operating a pull-down list. Since the icons built into Easy Bar are quite large, the user does not need to perform delicate operations. The functions of Easy Bar run on any web page without increasing processing time. For the visually handicapped, Easy Bar would contribute greatly to improved web accessibility to medical information.

  4. A software for managing after-hours activities in research user facilities

    DOE PAGES

    Camino, F. E.

    2017-05-01

    Here, we present an after-hours activity management program for shared facilities, which handles the processes required for after-hours access (request, approval, extension, etc.). It implements the concept of permitted after-hours activities, which consists of a list of well-defined activities that each user can perform after hours. The program provides an easy and unambiguous way for users to know which activities they are allowed to perform after hours. In addition, the program can enhance its safety efficacy by interacting with the lab and instrument access control systems commonly present in user facilities.
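
    The record above describes the "permitted after-hours activities" concept only at a high level. As a minimal illustrative sketch (not the paper's implementation), the idea can be captured as a mapping from each user to an approved set of activities, consulted whenever a request falls in the after-hours window; all names and the time window below are hypothetical.

      # Minimal sketch of the "permitted after-hours activities" concept described
      # above. All names (users, activities, time window) are hypothetical; the
      # paper's actual implementation is not shown in the record.
      from datetime import datetime, time

      # Each user is mapped to the set of well-defined activities approved for
      # after-hours work.
      PERMITTED_AFTER_HOURS = {
          "alice": {"SEM imaging", "data analysis"},
          "bob": {"data analysis"},
      }

      AFTER_HOURS_START = time(18, 0)   # assumed facility policy
      AFTER_HOURS_END = time(7, 0)

      def is_after_hours(now: datetime) -> bool:
          """Return True if the timestamp falls in the after-hours window."""
          t = now.time()
          return t >= AFTER_HOURS_START or t <= AFTER_HOURS_END

      def may_perform(user: str, activity: str, now: datetime) -> bool:
          """Allow any activity during normal hours; after hours, allow it
          only if it appears on the user's approved list."""
          if not is_after_hours(now):
              return True
          return activity in PERMITTED_AFTER_HOURS.get(user, set())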

  5. A software for managing after-hours activities in research user facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camino, F. E.

    Here, we present an after-hours activity management program for shared facilities, which handles the processes required for after-hours access (request, approval, extension, etc.). It implements the concept of permitted after-hours activities, which consists of a list of well-defined activities that each user can perform after hours. The program provides an easy and unambiguous way for users to know which activities they are allowed to perform after hours. In addition, the program can enhance its safety efficacy by interacting with the lab and instrument access control systems commonly present in user facilities.

  6. Remote Access Multi-Mission Processing and Analysis Ground Environment (RAMPAGE)

    NASA Technical Reports Server (NTRS)

    Lee, Y.; Specht, T.

    2000-01-01

    At the Jet Propulsion Laboratory (JPL), the RAMPAGE development now makes it possible to provide a wide variety of users with easy, simple access to mission engineering data using web-based standards.

  7. A validation study regarding a generative approach in choosing appropriate colors for impaired users.

    PubMed

    Troiano, Luigi; Birtolo, Cosimo; Armenise, Roberto

    2016-01-01

    In many circumstances, concepts, ideas and emotions are mainly conveyed by colors. Color vision disorders can heavily limit the user experience in accessing the Information Society. Therefore, color vision impairments should be taken into account in order to make information and services accessible to a broader audience. The task is not easy for designers, who generally are not affected by any color vision disorder. In any case, the design of accessible user interfaces should not lead to boring color schemes; the selection of appealing and harmonic color combinations should be preserved. In past research we investigated a generative approach, driven by evolutionary computing, for supporting interface designers in making colors accessible to impaired users. This approach has also been followed by other authors. The contribution of this paper is to provide an experimental validation of the claim that this approach is actually beneficial to designers and users.

  8. Design Optimization Toolkit: Users' Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilo Valentin, Miguel Alejandro

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk barely intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  9. Easy-to-use interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blattner, M M; Blattner, D O; Tong, Y

    1999-04-01

    Easy-to-use interfaces are a class of interfaces that fall between public access interfaces and graphical user interfaces in usability and cognitive difficulty. We describe the characteristics of easy-to-use interfaces by the properties of four dimensions: selection, navigation, direct manipulation, and contextual metaphors. Another constraint we introduced was to include as little text as possible, and what text we do have will be in at least four languages. Formative evaluations were conducted to identify and isolate these characteristics. Our application is a visual interface for a home automation system intended for a diverse set of users. The design will be expanded to accommodate the visually disabled in the near future.

  10. In-house access to PACS images and related data through World Wide Web

    NASA Astrophysics Data System (ADS)

    Mascarini, Christian; Ratib, Osman M.; Trayser, Gerhard; Ligier, Yves; Appel, R. D.

    1996-05-01

    The development of a hospital-wide PACS is in progress at the University Hospital of Geneva, and several archive modules have been operational since 1992. This PACS is intended for wide distribution of images to clinical wards. As the PACS project and the number of archived images grew rapidly in the hospital, it became necessary to provide easier, more widely accessible and more convenient access to the PACS database for the clinicians in the different wards and clinical units of the hospital. An innovative solution has been developed using tools such as the Netscape Navigator browser and the NCSA World Wide Web server as an alternative to conventional database query and retrieval software. These tools present the advantages of providing a user interface that is the same regardless of the platform being used (Mac, Windows, UNIX, ...) and easy integration of different types of documents (text, images, ...). A strict access control has been added to this interface. It performs user identification and access rights checking, as defined by the in-house hospital information system, before allowing navigation through patient data records.
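
    The record describes the access-control architecture only in outline. The sketch below illustrates the general pattern of checking a user's rights, as defined by a hospital information system (HIS), before returning patient data; the function names, user identifiers, and rights model are hypothetical and not taken from the Geneva system.

      # Illustrative sketch only: check HIS-defined access rights before
      # returning PACS data. The lookup, IDs, and rights model are hypothetical.
      def fetch_access_rights(user_id: str) -> set:
          """Placeholder for a query to the in-house hospital information system."""
          rights_by_user = {"dr_smith": {"cardiology", "radiology"}}
          return rights_by_user.get(user_id, set())

      def serve_patient_record(user_id: str, patient_ward: str, record: bytes) -> bytes:
          """Return the record only if the user's HIS-defined rights cover the ward."""
          if patient_ward not in fetch_access_rights(user_id):
              raise PermissionError("access denied by HIS-defined rights")
          return record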

  11. Marine Web Portal as an Interface between Users and Marine Data and Information Sources

    NASA Astrophysics Data System (ADS)

    Palazov, A.; Stefanov, A.; Marinova, V.; Slabakova, V.

    2012-04-01

    Fundamental to the success of a marine data and information management system, and to effective support of marine and maritime economic activities, are the speed and ease with which users can identify, locate, access, exchange, and use oceanographic and marine data and information. Many activities and bodies have been identified as marine data and information users, such as: science, government and local authorities, port authorities, shipping, marine industry, fishery and aquaculture, the tourist industry, environmental protection, coast protection, oil spill response, Search and Rescue, national security, civil protection, and the general public. On the other hand, diverse sources of real-time and historical marine data and information exist, and they are generally fragmented, distributed across different places, and sometimes unknown to users. The marine web portal concept is to build a common web-based interface that provides users with fast and easy access to all available marine data and information sources, both historical and real-time, such as marine databases, observing systems, forecasting systems, atlases, etc. The service is regionally oriented to meet user needs. The main advantage of the portal is that it provides a general at-a-glance view of all available marine data and information and directs users to easily discover the data and information of interest. A personalization capability is planned, which will give users the means to tailor visualizations to their personal needs.

  12. Development of a paperless, Y2K compliant exposure tracking database at Los Alamos National Laboratory.

    PubMed

    Conwell, J L; Creek, K L; Pozzi, A R; Whyte, H M

    2001-02-01

    The Industrial Hygiene and Safety Group at Los Alamos National Laboratory (LANL) developed a database application known as IH DataView, which manages industrial hygiene monitoring data. IH DataView replaces a LANL legacy system, IHSD, which restricted user access to a single point of data entry, needed enhancements to support new operational requirements, and was not Year 2000 (Y2K) compliant. IH DataView features a comprehensive suite of data collection and tracking capabilities. Through the use of Oracle database management and application development tools, the system is Y2K compliant and Web enabled for easy deployment and user access via the Internet. System accessibility is particularly important because LANL operations are spread over 43 square miles, and industrial hygienists (IHs) located across the laboratory will use the system. IH DataView shows promise of being useful in the future because it eliminates these problems. It has a flexible architecture and a sophisticated capability to collect, track, and analyze data in easy-to-use form.

  13. Leveraging Technology and Social Media for Information Sharing

    DTIC Science & Technology

    2009-04-01

    praised as a "gift to humanity" the benefits of social networking sites such as Facebook and MySpace in forging friendships and understanding. The...is relatively easy to sign up for and access. It should be noted that many DoD installations and agencies restrict access to social networking sites for...to sign up for and access. As with Facebook, many DoD installations and agencies restrict access to social networking sites. Users of Twitter are

  14. The Mac Internet Tour Guide: Cruising the Internet the Easy Way. [First Edition.

    ERIC Educational Resources Information Center

    Fraase, Michael

    Published exclusively for Macintosh computer users, this guide provides an overview of Internet resources for new and experienced users. E-mail, file transfer, and decompression software used to access the resources are included on an 800K, 3.5-inch disk. The following chapters are included: (1) "What Is the Internet" covers finding your…

  15. Non-visual Web Browsing: Beyond Web Accessibility

    PubMed Central

    Ramakrishnan, I.V.; Ashok, Vikas

    2017-01-01

    People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant content. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability. PMID:29202137

  16. Non-visual Web Browsing: Beyond Web Accessibility.

    PubMed

    Ramakrishnan, I V; Ashok, Vikas; Billah, Syed Masum

    2017-07-01

    People with vision impairments typically use screen readers to browse the Web. To facilitate non-visual browsing, web sites must be made accessible to screen readers, i.e., all the visible elements in the web site must be readable by the screen reader. But even if web sites are accessible, screen-reader users may not find them easy to use and/or easy to navigate. For example, they may not be able to locate the desired information without having to listen to a lot of irrelevant content. These issues go beyond web accessibility and directly impact web usability. Several techniques have been reported in the accessibility literature for making the Web usable for screen reading. This paper is a review of these techniques. Interestingly, the review reveals that understanding the semantics of the web content is the overarching theme that drives these techniques for improving web usability.

  17. A Distributed Multi-Agent System for Collaborative Information Management and Learning

    NASA Technical Reports Server (NTRS)

    Chen, James R.; Wolfe, Shawn R.; Wragg, Stephen D.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this paper, we present DIAMS, a system of distributed, collaborative agents to help users access, manage, share and exchange information. A DIAMS personal agent helps its owner find information most relevant to current needs. It provides tools and utilities for users to manage their information repositories with dynamic organization and virtual views. Flexible hierarchical display is integrated with indexed query search to support effective information access. Automatic indexing methods are employed to support user queries and communication between agents. Contents of a repository are kept in object-oriented storage to facilitate information sharing. Collaboration between users is aided by easy sharing utilities as well as automated information exchange. Matchmaker agents are designed to establish connections between users with similar interests and expertise. DIAMS agents provide needed services for users to share and learn information from one another on the World Wide Web.

  18. A new forecast presentation tool for offshore contractors

    NASA Astrophysics Data System (ADS)

    Jørgensen, M.

    2009-09-01

    Contractors working offshore are often very sensitive to both sea and weather conditions, and it is essential that they have easy access to reliable information on coming conditions to enable planning of when to start or shut down offshore operations to avoid loss of life and materials. The Danish Meteorological Institute (DMI) recently, in cooperation with business partners in the field, developed a new application to accommodate that need. The "Marine Forecast Service" is a browser-based forecast presentation tool. It provides an interface that enables the user easy and quick access to all relevant meteorological and oceanographic forecasts and observations for a given area of interest. Each customer gains access to the application via a standard login/password procedure. Once logged in, the user can inspect animated forecast maps of parameters such as wind, gust, wave height, swell and current, among others. Supplementing the general maps, the user can choose to look at forecast graphs for each of the locations where the user is running operations. These forecast graphs can also be overlaid with the user's own in situ observations, if such exist. Furthermore, the data from the graphs can be exported as data files that the customer can use in his own applications as desired. As part of the application, a forecaster's view of the current and near-future weather situation is presented to the user as well, adding further value to the information presented through maps and graphs. Among other features of the product, animated radar and satellite images could be mentioned. Finally, the application provides the possibility of a "second opinion" through traditional weather charts from another recognized provider of weather forecasts. The presentation will provide more detailed insights into the contents of the application as well as some of the experiences with the product.

  19. Data warehousing as a healthcare business solution.

    PubMed

    Scheese, R

    1998-02-01

    Because of the trend toward consolidation in the healthcare field, many organizations have massive amounts of data stored in various information systems organizationwide, but access to the data by end users may be difficult. Healthcare organizations are being pressured to provide managers easy access to the data needed for critical decision making. One solution many organizations are turning to is implementing decision-support data warehouses. A data warehouse instantly delivers information directly to end users, freeing healthcare information systems staff for strategic operations. If designed appropriately, data warehouses can be a cost-effective tool for business analysis and decision support.

  20. OMARC: An online multimedia application for training health care providers in the assessment of respiratory conditions.

    PubMed

    Meruvia-Pastor, Oscar; Patra, Pranjal; Andres, Karen; Twomey, Creina; Peña-Castillo, Lourdes

    2016-05-01

    OMARC, a multimedia application designed to support the training of health care providers for the identification of common lung sounds heard in a patient's thorax as part of a health assessment, is described and its positive contribution to user learning is assessed. The main goal of OMARC is to effectively help health-care students become familiar with lung sounds as part of the assessment of respiratory conditions. In addition, the application must be easy to use and accessible to students and practitioners over the internet. OMARC was developed using an online platform to facilitate access to users in remote locations. OMARC's unique contribution as an educational software tool is that it presents a narrative about normal and abnormal lung sounds using interactive multimedia and sample case studies designed by professional health-care providers and educators. Its interface consists of two distinct components: a sounds glossary and a rich multimedia interface which presents clinical case studies and provides access to lung sounds placed on a model of a human torso. OMARC's contents can be extended through the addition of sounds and case studies designed by health-care educators and professionals. To validate OMARC and determine its efficacy in improving learning and to capture user perceptions about it, we performed a pilot study with ten nursing students. Participants' performance was measured through an evaluation of their ability to identify several normal and adventitious/abnormal sounds prior to and after exposure to OMARC. Results indicate that participants are able to better identify different lung sounds, going from an average of 63% (S.D. 18.3%) in the pre-test evaluation to an average of 90% (S.D. 11.5%) after practising with OMARC. Furthermore, participants indicated in a user satisfaction questionnaire that they found the application helpful, easy to use and that they would recommend it to other persons in their field. OMARC is an online multimedia application for training health care students in the assessment of respiratory conditions. The software integrates multimedia technology and health-care education concepts to facilitate learning, while being useful and easy to use. Results from a pilot study indicate that OMARC significantly helps to improve the capacity of the users to correctly identify lung sounds for different respiratory conditions. In addition, participants' opinions about OMARC were quite positive: users were likely to recommend the application to other persons in their field and found the application easy to use and helpful to better identify lung sounds. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  1. Radiology Teacher: a free, Internet-based radiology teaching file server.

    PubMed

    Talanow, Roland

    2009-12-01

    Teaching files are an essential ingredient in residency education. The online program Radiology Teacher was developed to allow the creation of interactive and customized teaching files in real time. Online access makes it available anytime and anywhere, and it is free of charge, user tailored, and easy to use. No programming skills, additional plug-ins, or installations are needed, allowing its use even on protected intranets. Special effects for enhancing the learning experience as well as the linking and the source code are created automatically by the program. It may be used in different modes by individuals and institutions to share cases from multiple authors in a single database. Radiology Teacher is an easy-to-use automatic teaching file program that may enhance users' learning experiences by offering different modes of user-defined presentations.

  2. A compendium of forest growth and yield simulators for the Pacific coast states

    Treesearch

    Martin W. Ritchie

    1999-01-01

    This manuscript provides information needed for the user to access current information about forest growth and yield simulators. Ultimately, the best source of information for any simulator is the user’s guide and the sage advice of those who built the simulator. In some instances, these people are easy to find and are willing to provide all the support for the program...

  3. ARCAS (ACACIA Regional Climate-data Access System) -- a Web Access System for Climate Model Data Access, Visualization and Comparison

    NASA Astrophysics Data System (ADS)

    Hakkarinen, C.; Brown, D.; Callahan, J.; hankin, S.; de Koningh, M.; Middleton-Link, D.; Wigley, T.

    2001-05-01

    A Web-based access system to climate model output data sets for intercomparison and analysis has been produced, using the NOAA-PMEL-developed Live Access Server software as host server and Ferret as the data serving and visualization engine. Called ARCAS ("ACACIA Regional Climate-data Access System"), and publicly accessible at http://dataserver.ucar.edu/arcas, the site currently serves climate model outputs from runs of the NCAR Climate System Model for the 21st century, for Business as Usual and Stabilization of Greenhouse Gas Emission scenarios. Users can select, download, and graphically display single variables or comparisons of two variables from either or both of the CSM model runs, averaged for monthly, seasonal, or annual time resolutions. The time length of the averaging period, and the geographical domain for download and display, are fully selectable by the user. A variety of arithmetic operations on the data variables can be computed "on-the-fly", as defined by the user. Expansions of the user-selectable options for defining analysis options, and for accessing other DODS-compatible (Distributed Oceanographic Data System-compatible) data sets, residing at locations other than the NCAR hardware server on which ARCAS operates, are planned for this year. These expansions are designed to allow users quick and easy-to-operate web-based access to the largest possible selection of climate model output data sets available throughout the world.

  4. Air Markets Program Data (AMPD)

    EPA Pesticide Factsheets

    The Air Markets Program Data tool allows users to search EPA data to answer scientific, general, policy, and regulatory questions about industry emissions. Air Markets Program Data (AMPD) is a web-based application that allows users easy access to both current and historical data collected as part of EPA's emissions trading programs. This site allows you to create and view reports and to download emissions data for further analysis. AMPD provides a query tool so users can create custom queries of industry source emissions data, allowance data, compliance data, and facility attributes. In addition, AMPD provides interactive maps, charts, reports, and pre-packaged datasets. AMPD does not require any additional software, plug-ins, or security controls and can be accessed using a standard web browser.

  5. Empowering Middle School Teachers with Portable Computers.

    ERIC Educational Resources Information Center

    Weast, Jerry D.; And Others

    1993-01-01

    A Sioux Falls (South Dakota) project that supplied middle school teachers with Macintosh computers and training to use them showed gratifying results. Easy access to portable notebook computers made teachers more active computer users, increased teacher interaction and collaboration, enhanced teacher productivity regarding management tasks and…

  6. The EBI Search engine: providing search and retrieval functionality for biological data from EMBL-EBI.

    PubMed

    Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Gur, Tamer; Cowley, Andrew; Li, Weizhong; Uludag, Mahmut; Pundir, Sangya; Cham, Jennifer A; McWilliam, Hamish; Lopez, Rodrigo

    2015-07-01

    The European Bioinformatics Institute (EMBL-EBI, https://www.ebi.ac.uk) provides free and unrestricted access to data across all major areas of biology and biomedicine. Searching and extracting knowledge across these domains requires a fast and scalable solution that addresses the requirements of domain experts as well as casual users. We present the EBI Search engine, referred to here as 'EBI Search', an easy-to-use fast text search and indexing system with powerful data navigation and retrieval capabilities. API integration provides access to analytical tools, allowing users to further investigate the results of their search. The interconnectivity that exists between data resources at EMBL-EBI provides easy, quick and precise navigation and a better understanding of the relationship between different data types including sequences, genes, gene products, proteins, protein domains, protein families, enzymes and macromolecular structures, together with relevant life science literature. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
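
    As a hedged illustration of the API access mentioned above, the snippet below queries the EBI Search REST interface with the Python requests library. The endpoint path, the choice of the uniprot domain, and the shape of the JSON response are assumptions based on the public documentation, not details given in this record.

      # Hedged example of querying EBI Search over HTTP. Endpoint path, domain
      # name, and JSON field names are assumptions, not taken from this record.
      import requests

      resp = requests.get(
          "https://www.ebi.ac.uk/ebisearch/ws/rest/uniprot",  # 'uniprot' domain assumed
          params={"query": "BRCA1", "format": "json", "size": 5},
          timeout=30,
      )
      resp.raise_for_status()
      # Print the identifier of each returned entry.
      for entry in resp.json().get("entries", []):
          print(entry.get("id"))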

  7. Interoperable and accessible census and survey data from IPUMS.

    PubMed

    Kugler, Tracy A; Fitch, Catherine A

    2018-02-27

    The first version of the Integrated Public Use Microdata Series (IPUMS) was released to users in 1993, and since that time IPUMS has come to stand for interoperable and accessible census and survey data. Initially created to harmonize U.S. census microdata over time, IPUMS now includes microdata from the U.S. and international censuses and from surveys on health, employment, and other topics. IPUMS also provides geo-spatial data, aggregate population data, and environmental data. IPUMS supports ten data products, each disseminating an integrated data collection with a set of tools that make complex data easy to find, access, and use. Key features are record-level integration to create interoperable datasets, user-friendly interfaces, and comprehensive metadata and documentation. The IPUMS philosophy aligns closely with the FAIR principles of findability, accessibility, interoperability, and re-usability. IPUMS data have catalyzed knowledge generation across a wide range of social science and other disciplines, as evidenced by the large volume of publications and other products created by the vast IPUMS user community.

  8. Math Description Engine Software Development Kit

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Smith, Stephanie L.; Dexter, Dan E.; Hodgson, Terry R.

    2010-01-01

    The Math Description Engine Software Development Kit (MDE SDK) can be used by software developers to make computer-rendered graphs more accessible to blind and visually-impaired users. The MDE SDK generates alternative graph descriptions in two forms: textual descriptions and non-verbal sound renderings, or sonification. It also enables display of an animated trace of a graph sonification on a visual graph component, with color and line-thickness options for users having low vision or color-related impairments. A set of accessible graphical user interface widgets is provided for operation by end users and for control of accessible graph displays. Version 1.0 of the MDE SDK generates text descriptions for 2D graphs commonly seen in math and science curriculum (and practice). The mathematically rich text descriptions can also serve as a virtual math and science assistant for blind and sighted users, making graphs more accessible for everyone. The MDE SDK has a simple application programming interface (API) that makes it easy for programmers and Web-site developers to make graphs accessible with just a few lines of code. The source code is written in Java for cross-platform compatibility and to take advantage of Java's built-in support for building accessible software application interfaces. Compiled-library and NASA Open Source versions are available with API documentation and Programmer's Guide at http://prime.jsc.nasa.gov.

  9. User's Guide for MetView: A Meteorological Display and Assessment Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glantz, Clifford S.; Pelton, Mitchell A.; Allwine, K Jerry

    2000-09-27

    MetView Version 2.0 is an easy-to-use model for accessing, viewing, and analyzing meteorological data. MetView provides both graphical and numerical displays of data. It can accommodate data from an extensive meteorological monitoring network that includes near-surface monitoring locations, instrumented towers, sodars, and meteorologist observations. MetView is used operationally for routine, emergency response, and research applications at the U.S. Department of Energy's Hanford Site. At the Site's Emergency Operations Center, MetView aids in the access, visualization, and interpretation of real-time meteorological data. Historical data can also be accessed and displayed. Emergency response personnel at the Emergency Operations Center use MetView products in the formulation of protective action recommendations and other decisions. In the initial stage of an emergency, MetView can be operated using a very simple, five-step procedure. This first-responder procedure allows non-technical staff to rapidly generate meteorological products and disseminate key information. After first-responder information products are produced, the Emergency Operations Center's technical staff can conduct more sophisticated analyses using the model. This may include examining the vertical variation in winds, assessing recent changes in atmospheric conditions, evaluating atmospheric mixing rates, and forecasting changes in meteorological conditions. This user's guide provides easy-to-follow instructions for both first-responder and routine operation of the model. Examples, with explanations, are provided for each type of MetView output display. Information is provided on the naming convention, format, and contents of each type of meteorological data file used by the model. This user's guide serves as a ready reference for experienced MetView users and a training manual for new users.

  10. Coal Data Browser

    EIA Publications

    The Coal Data Browser gives users easy access to coal information from EIA's electricity and coal surveys as well as data from the Mine Safety and Health Administration and trade information from the U.S. Census Bureau. Users can also see the shipment data from individual mines that deliver coal to the U.S. electric power fleet, track supplies delivered to a given power plant, and see which mines serve each particular plant.

  11. Motofit - integrating neutron reflectometry acquisition, reduction and analysis into one, easy to use, package

    NASA Astrophysics Data System (ADS)

    Nelson, Andrew

    2010-11-01

    The efficient use of complex neutron scattering instruments is often hindered by the complex nature of their operating software. This complexity exists at each experimental step: data acquisition, reduction and analysis, with each step being as important as the previous. For example, whilst command line interfaces are powerful for automated acquisition, they often reduce accessibility for novice users and sometimes reduce the efficiency for advanced users. One solution to this is the development of a graphical user interface which allows the user to operate the instrument by a simple and intuitive "push button" approach. This approach was taken by the Motofit software package for analysis of multiple contrast reflectometry data. Here we describe the extension of this package to cover the data acquisition and reduction steps for the Platypus time-of-flight neutron reflectometer. Consequently, the complete operation of an instrument is integrated into a single, easy-to-use program, leading to efficient instrument usage.

  12. Goddard Atmospheric Composition Data Center: Aura Data and Services in One Place

    NASA Technical Reports Server (NTRS)

    Leptoukh, G.; Kempler, S.; Gerasimov, I.; Ahmad, S.; Johnson, J.

    2005-01-01

    The Goddard Atmospheric Composition Data and Information Services Center (AC-DISC) is a portal to an Atmospheric Composition-specific, user-driven, multi-sensor, online, easy-access archive and distribution system employing data analysis and visualization, data mining, and other user-requested techniques for better science data usage. It provides convenient access to Atmospheric Composition data and information from various remote-sensing missions, from TOMS, UARS, MODIS, and AIRS to the most recent data from Aura OMI, MLS, and HIRDLS (once these datasets are released to the public), as well as Atmospheric Composition datasets residing at other remote archive sites.

  13. Web Based Data Access to the World Data Center for Climate

    NASA Astrophysics Data System (ADS)

    Toussaint, F.; Lautenschlager, M.

    2006-12-01

    The World Data Center for Climate (WDC-Climate, www.wdc-climate.de) is hosted by the Model & Data Group (M&D) of the Max Planck Institute for Meteorology. The M&D department is financed by the German government and uses the computers and mass storage facilities of the German Climate Computing Centre (Deutsches Klimarechenzentrum, DKRZ). The WDC-Climate provides web access to 200 Terabytes of climate data; the total mass storage archive contains nearly 4 Petabytes. Although the majority of the datasets concern model output data, some satellite and observational data are accessible as well. The underlying relational database is distributed on five servers. The CERA relational data model is used to integrate catalogue data and mass data. The flexibility of the model allows storing and accessing very different types of data and metadata. The CERA metadata catalogue provides easy access to the content of the CERA database as well as to other data on the web. Visit ceramodel.wdc-climate.de for additional information on the CERA data model. The majority of the users access data via the CERA metadata catalogue, which is open without registration. However, prior to retrieving data, users are required to check in and apply for a userid and password. The CERA metadata catalogue is servlet-based, so it is accessible worldwide through any web browser at cera.wdc-climate.de. In addition to data and metadata access by the web catalogue, WDC-Climate offers a number of other forms of web-based data access. All metadata are available via http request as xml files in various metadata formats (ISO, DC, etc., see wini.wdc-climate.de), which allows for easy data interchange with other catalogues. Model data can be retrieved in GRIB, ASCII, NetCDF, and binary (IEEE) format. WDC-Climate serves as a data centre for various projects. Since xml files are accessible by http, the integration of data into applications of different projects is very easy. Projects supported by WDC-Climate are e.g. CEOP, IPCC, and CARIBIC. A script tool for data download (jblob) is offered on the web page to make retrieval of huge data quantities more convenient.
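
    As a small sketch of the HTTP/XML metadata access described above, the code below fetches one metadata record as XML and walks its elements. The URL is a placeholder, not a real WDC-Climate address; only the general mechanism (an HTTP GET returning an ISO- or DC-style XML record) is taken from the abstract.

      # Minimal sketch of retrieving a metadata record as XML over HTTP, as the
      # abstract describes for the CERA catalogue. The URL is a placeholder.
      import requests
      import xml.etree.ElementTree as ET

      METADATA_URL = "https://example.org/cera/metadata/ENTRY_ID.xml"  # placeholder

      xml_text = requests.get(METADATA_URL, timeout=30).text
      root = ET.fromstring(xml_text)
      # Print element tags and text so the harvested record can be inspected or
      # mapped into another catalogue.
      for elem in root.iter():
          print(elem.tag, (elem.text or "").strip())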

  14. Interactive Voice/Web Response System in clinical research

    PubMed Central

    Ruikar, Vrishabhsagar

    2016-01-01

    Emerging technologies in the computer and telecommunication industries have eased access to computers through the telephone. An Interactive Voice/Web Response System (IxRS) is one of the user-friendly systems for end users, with complex and tailored programs at its backend. The backend programs are specially tailored for easy understanding by users. The clinical research industry has experienced a revolution in data capture methodologies over time. Over the past couple of decades, different systems have evolved alongside emerging modern technologies and tools, for example, Electronic Data Capture, IxRS, electronic patient-reported outcomes, etc. PMID:26952178

  15. Interactive Voice/Web Response System in clinical research.

    PubMed

    Ruikar, Vrishabhsagar

    2016-01-01

    Emerging technologies in the computer and telecommunication industries have eased access to computers through the telephone. An Interactive Voice/Web Response System (IxRS) is one of the user-friendly systems for end users, with complex and tailored programs at its backend. The backend programs are specially tailored for easy understanding by users. The clinical research industry has experienced a revolution in data capture methodologies over time. Over the past couple of decades, different systems have evolved alongside emerging modern technologies and tools, for example, Electronic Data Capture, IxRS, electronic patient-reported outcomes, etc.

  16. AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments

    NASA Astrophysics Data System (ADS)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.
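
    For readers who script in Python rather than LABVIEW, the sketch below shows the same kind of camera access through the Micro-Manager core, here assuming the pymmcore binding; the configuration file name and adapter search path are placeholders, and this is not the toolkit the paper presents.

      # Hedged sketch of scripted camera access through the Micro-Manager core
      # via the pymmcore binding (an assumption; not the paper's LABVIEW toolkit).
      import pymmcore

      mmc = pymmcore.CMMCore()
      mmc.setDeviceAdapterSearchPaths(["/usr/local/lib/micro-manager"])  # assumed path
      mmc.loadSystemConfiguration("MMConfig_demo.cfg")  # demo configuration assumed

      mmc.setExposure(50)      # exposure in milliseconds
      mmc.snapImage()          # acquire one frame from the active camera
      image = mmc.getImage()   # pixel array for the snapped frame
      print(image.shape)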

  17. AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.

    PubMed

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  18. Interactive SIGHT: textual access to simple bar charts

    NASA Astrophysics Data System (ADS)

    Demir, Seniz; Oliver, David; Schwartz, Edward; Elzer, Stephanie; Carberry, Sandra; Mccoy, Kathleen F.; Chester, Daniel

    2010-12-01

    Information graphics, such as bar charts and line graphs, are an important component of many articles from popular media. The majority of such graphics have an intention (a high-level message) to communicate to the graph viewer. Since the intended message of a graphic is often not repeated in the accompanying text, graphics together with the textual segments contribute to the overall purpose of an article and cannot be ignored. Unfortunately, these visual displays are provided in a format which is not readily accessible to everyone. For example, individuals with sight impairments who use screen readers to listen to documents have limited access to the graphics. This article presents a new accessibility tool, the Interactive SIGHT (Summarizing Information GrapHics Textually) system, that is intended to enable visually impaired users to access the knowledge that one would gain from viewing information graphics found on the web. The current system, which is implemented as a browser extension that works on simple bar charts, can be invoked by a user via a keystroke combination while navigating the web. Once launched, Interactive SIGHT first provides a brief summary that conveys the underlying intention of a bar chart along with the chart's most significant and salient features, and then produces history-aware follow-up responses to provide further information about the chart upon request from the user. We present two user studies that were conducted with sighted and visually impaired users to determine how effective the initial summary and follow-up responses are in conveying the informational content of bar charts, and to evaluate how easy it is to use the system interface. The evaluation results are promising and indicate that the system responses are well-structured and enable visually impaired users to answer key questions about bar charts in an easy-to-use manner. Post-experimental interviews revealed that visually impaired participants were very satisfied with the system offering different options to access the content of a chart to meet their specific needs and that they would use Interactive SIGHT if it was publicly available so as not to have to ignore graphics on the web. Being a language based assistive technology designed to compensate for the lack of sight, our work paves the road for a stronger acceptance of natural language interfaces to graph interpretation that we believe will be of great benefit to the visually impaired community.

  19. Philosophers and Technologists: Vicarious and Virtual Knowledge Constructs

    ERIC Educational Resources Information Center

    McNeese, Beverly D.

    2007-01-01

    In an age of continual technological advancement, user-friendly software, and consumer demand for the latest upgraded gadget, the ethical and moral discoveries derived from a careful reading of any fictional literature by college students are struggling in the American college classroom. Easy-access information systems, coinciding with the…

  20. GenePattern | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    GenePattern is a genomic analysis platform that provides access to hundreds of tools for the analysis and visualization of multiple data types. A web-based interface provides easy access to these tools and allows the creation of multi-step analysis pipelines that enable reproducible in silico research. A new GenePattern Notebook environment allows users to combine GenePattern analyses with text, graphics, and code to create complete reproducible research narratives.

  1. Easy Handling of Sensors and Actuators over TCP/IP Networks by Open Source Hardware/Software.

    PubMed

    Mejías, Andrés; Herrera, Reyes S; Márquez, Marco A; Calderón, Antonio José; González, Isaías; Andújar, José Manuel

    2017-01-05

    There are several specific solutions for accessing the sensors and actuators present in any process or system through a TCP/IP network, either a local network or a wide area network like the Internet. The use of sensors and actuators of different natures and diverse interfaces (SPI, I2C, analogue, etc.) makes accessing them from a network in a homogeneous and secure way more complex. A framework, including both software and hardware resources, is necessary to simplify and unify networked access to these devices. In this paper, a set of open-source software tools, specifically designed to cover the different issues concerning access to sensors and actuators, and two proposed low-cost hardware architectures to operate with the abovementioned software tools are presented. They allow integrated and easy access to local or remote sensors and actuators. The software tools, integrated in the free authoring tool Easy Java and Javascript Simulations (EJS), solve the interaction issues between the subsystem that integrates sensors and actuators into the network (called the convergence subsystem in this paper) and the Human Machine Interface (HMI), the latter designed using the intuitive graphical system of EJS and located on the user's computer. The proposed hardware architectures and software tools are described, and experimental implementations with the proposed tools are presented.
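
    The paper's own EJS-based tools are not reproduced here; as a generic illustration of exposing a sensor reading over a TCP/IP network, the sketch below runs a small line-oriented server in which the sensor read, port number, and message format are all hypothetical stand-ins for a real convergence subsystem.

      # Generic illustration of serving a sensor reading over TCP/IP, not the
      # authors' EJS-based tools. Sensor read, port, and protocol are hypothetical.
      import json
      import random
      import socketserver

      def read_sensor() -> float:
          """Placeholder for an SPI/I2C/analogue read on the device."""
          return 20.0 + random.random()

      class SensorHandler(socketserver.StreamRequestHandler):
          def handle(self):
              # Reply to each "READ" request line with the current reading as JSON.
              for line in self.rfile:
                  if line.strip() == b"READ":
                      reply = json.dumps({"temperature_c": read_sensor()})
                      self.wfile.write(reply.encode() + b"\n")

      if __name__ == "__main__":
          with socketserver.TCPServer(("0.0.0.0", 5020), SensorHandler) as server:
              server.serve_forever()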

  2. A DICOM based radiotherapy plan database for research collaboration and reporting

    NASA Astrophysics Data System (ADS)

    Westberg, J.; Krogh, S.; Brink, C.; Vogelius, I. R.

    2014-03-01

    Purpose: To create a central radiotherapy (RT) plan database for dose analysis and reporting, capable of calculating and presenting statistics on user defined patient groups. The goal is to facilitate multi-center research studies with easy and secure access to RT plans and statistics on protocol compliance. Methods: RT institutions are able to send data to the central database using DICOM communications on a secure computer network. The central system is composed of a number of DICOM servers, an SQL database and in-house developed software services to process the incoming data. A web site within the secure network allows the user to manage their submitted data. Results: The RT plan database has been developed in Microsoft .NET and users are able to send DICOM data between RT centers in Denmark. Dose-volume histogram (DVH) calculations performed by the system are comparable to those of conventional RT software. A permission system was implemented to ensure access control and easy, yet secure, data sharing across centers. The reports contain DVH statistics for structures in user defined patient groups. The system currently contains over 2200 patients in 14 collaborations. Conclusions: A central RT plan repository for use in multi-center trials and quality assurance was created. The system provides an attractive alternative to dummy runs by enabling continuous monitoring of protocol conformity and plan metrics in a trial.
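
    The record states that the central system computes dose-volume histograms (DVHs) comparable to conventional RT software but gives no implementation details. The sketch below shows one standard way to compute a cumulative DVH from a dose grid and a structure mask using NumPy; the synthetic data and function name are illustrative only, not the paper's code.

      # Hedged sketch of a cumulative dose-volume histogram (DVH) calculation of
      # the kind the abstract mentions. Synthetic data; not the paper's own code.
      import numpy as np

      def cumulative_dvh(dose: np.ndarray, mask: np.ndarray, bin_width: float = 0.1):
          """Return (dose_bins, volume_percent) for voxels inside the structure mask.

          volume_percent[i] is the percentage of the structure receiving at least
          dose_bins[i] Gy.
          """
          structure_dose = dose[mask]
          bins = np.arange(0.0, structure_dose.max() + bin_width, bin_width)
          volume_percent = np.array([
              100.0 * np.count_nonzero(structure_dose >= b) / structure_dose.size
              for b in bins
          ])
          return bins, volume_percent

      # Synthetic example: a 20x20x20 dose grid and a spherical structure mask.
      dose = np.random.gamma(shape=2.0, scale=1.0, size=(20, 20, 20))
      zz, yy, xx = np.indices(dose.shape)
      mask = (xx - 10) ** 2 + (yy - 10) ** 2 + (zz - 10) ** 2 < 36
      bins, vol = cumulative_dvh(dose, mask)
      print(f"D50 (dose to 50% of volume) ~ {bins[np.argmax(vol <= 50.0)]:.2f} Gy")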

  3. A uniqueness-and-anonymity-preserving remote user authentication scheme for connected health care.

    PubMed

    Chang, Ya-Fen; Yu, Shih-Hui; Shiao, Ding-Rui

    2013-04-01

    Connected health care provides new opportunities for improving financial and clinical performance. Many connected health care applications, such as telecare medicine information systems, personally controlled health record systems, and patient monitoring, have been proposed. Correct and quality care is the goal of connected health care, and user authentication can ensure the legality of patients. After reviewing authentication schemes for connected health care applications, we find that many of them cannot protect patient privacy, such that others can trace users/patients from the transmitted data. Moreover, the verification tokens used by these authentication schemes to authenticate users or servers are only passwords, smart cards and RFID tags. Actually, these verification tokens are not unique and are easy to copy. On the other hand, biometric characteristics, such as the iris, face, voiceprint, fingerprint and so on, are unique, easy to verify, and hard to copy. In this paper, a biometrics-based user authentication scheme is proposed to ensure uniqueness and anonymity at the same time. With the proposed scheme, only the legal user/patient himself/herself can access the remote server, and no one can trace him/her from the transmitted data.

  4. Data access for scientific problem solving

    NASA Technical Reports Server (NTRS)

    Brown, James W.

    1987-01-01

    An essential ingredient in scientific work is data. In disciplines such as Oceanography, data sources are many and volumes are formidable. The full value of large stores of data cannot be realized unless careful thought is given to data access. JPL has developed the Pilot Ocean Data System to investigate techniques for archiving and accessing ocean data obtained from space. These include efficient storage and rapid retrieval of satellite data, an easy-to-use user interface, and a variety of output products which, taken together, permit researchers to extract and use data rapidly and conveniently.

  5. Heterogeneous distributed query processing: The DAVID system

    NASA Technical Reports Server (NTRS)

    Jacobs, Barry E.

    1985-01-01

    The objective of the Distributed Access View Integrated Database (DAVID) project is the development of an easy to use computer system with which NASA scientists, engineers and administrators can uniformly access distributed heterogeneous databases. Basically, DAVID will be a database management system that sits alongside already existing database and file management systems. Its function is to enable users to access the data in other languages and file systems without having to learn the data manipulation languages. Given here is an outline of a talk on the DAVID project and several charts.

  6. It cannot be all about safety: the benefits of prolonged mobility.

    PubMed

    Oxley, Jennifer; Whelan, Michelle

    2008-08-01

    While there is much emphasis on managing the safety of older road users, there is limited understanding and recognition of the significance of mobility and transportation needs, mobility changes in later life, and the impact of reduced mobility on quality of life. Moreover, there is little information about the measures that can be taken to increase or at least maintain mobility in older age. A systematic literature review was undertaken to address the issues associated with the transportation and mobility needs of older road users. Articles and publications were selected for relevance and research strength and strategies and measures aimed to manage the safe mobility of older road users were reviewed. The review provides clear evidence that, for older adults who cease driving, quality of life is reduced and that there are a number of adverse consequences of poor mobility. The misconceptions regarding the risks that older drivers pose on the road and how their safe mobility should be managed are discussed, particularly the implications of current licensing procedures on mobility. Evidence is also presented showing there are subgroups of older adults who are more likely to suffer more pronounced mobility consequences including women and financially disadvantaged groups. Moreover, "best-practice" strategies for maintaining at least some level of mobility for older adults are highlighted in four broad categories: safer road users, safer vehicles, safer roads and infrastructure, and provision of new and innovative alternative transport options that are specifically tailored to older adults. Provision of safe travel options that allow easy access to services and amenities is a vital factor in maintaining mobility amongst older road users. An understanding that continued mobility means access to a private vehicle, either as a driver (for as long as possible as it is safe to drive) or as a passenger, and easy and practical access to other forms of transport is essential in the management of health, well-being, and the safe mobility of older road users.

  7. An end-to-end secure patient information access card system.

    PubMed

    Alkhateeb, A; Singer, H; Yakami, M; Takahashi, T

    2000-03-01

    The rapid development of the Internet and the increasing interest in Internet-based solutions have promoted the idea of creating Internet-based health information applications. This will force a change in the role of IC cards in healthcare card systems from a data carrier to an access key medium. At the Medical Informatics Department of Kyoto University Hospital we are developing a smart card patient information project where patient databases are accessed via the Internet. Strong end-to-end data encryption is performed via the Secure Socket Layer (SSL) to transmit patient information transparently. The smart card plays the crucial role of access key to the database: user authentication is performed internally without ever revealing the actual key. For easy acceptance by healthcare professionals, the user interface is integrated as a plug-in for two familiar Web browsers, Netscape Navigator and MS Internet Explorer.

  8. Leveraging Globus to Support Access and Delivery of Scientific Data

    NASA Astrophysics Data System (ADS)

    Cram, T.; Schuster, D.; Ji, Z.; Worley, S. J.

    2015-12-01

    The NCAR Research Data Archive (RDA; http://rda.ucar.edu) contains a large and diverse collection of meteorological and oceanographic observations, operational and reanalysis outputs, and remote sensing datasets to support atmospheric and geoscience research. The RDA contains more than 600 dataset collections, which support the varying needs of a diverse user community. The number of RDA users is increasing annually, and the most popular method used to access the RDA data holdings is through web-based protocols, such as wget- and cURL-based scripts. In the year 2014, 11,000 unique users downloaded more than 1.1 petabytes of data from the RDA, and customized data products were prepared for more than 45,000 user-driven requests. In order to further support this increase in web download usage, the RDA has implemented the Globus data transfer service (www.globus.org) to provide a GridFTP data transfer option for the user community. The Globus service is broadly scalable, has an easy-to-install client, is sustainably supported, and provides a robust, efficient, and reliable data transfer option for the research community. This presentation will highlight the technical functionality, challenges, and usefulness of the Globus data transfer service for accessing the RDA data holdings.
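
    As a hedged sketch of what a scripted Globus transfer can look like with the Globus Python SDK (globus_sdk), the snippet below submits one GridFTP transfer task. The endpoint UUIDs, file paths, and access token are placeholders; the RDA's actual endpoint identifiers and the OAuth2 flow that yields the token are not shown in this record.

      # Hedged sketch of submitting a transfer with the Globus Python SDK.
      # Endpoint UUIDs, paths, and the token are placeholders; obtaining the
      # access token (e.g. via the Native App OAuth2 flow) is omitted here.
      import globus_sdk

      ACCESS_TOKEN = "..."      # from a Globus OAuth2 flow (placeholder)
      SOURCE_ENDPOINT = "..."   # e.g. an RDA endpoint UUID (placeholder)
      DEST_ENDPOINT = "..."     # the user's own endpoint UUID (placeholder)

      tc = globus_sdk.TransferClient(
          authorizer=globus_sdk.AccessTokenAuthorizer(ACCESS_TOKEN)
      )
      tdata = globus_sdk.TransferData(
          tc, SOURCE_ENDPOINT, DEST_ENDPOINT, label="RDA dataset download"
      )
      tdata.add_item("/path/on/source/file.nc", "/home/user/file.nc")  # placeholder paths
      task = tc.submit_transfer(tdata)
      print("submitted transfer task:", task["task_id"])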

  9. Making Big Data, Safe Data: A Test Optimization Approach

    DTIC Science & Technology

    2016-06-15

    catalyzed by the need to put a value on testing. Included with this project report is a proof of concept created in MS Excel utilizing its VBA ...Language To make the proof of concept more user-friendly, MS Excel was chosen for its convenient user interface and its developer tool, VBA. Another...reason it was selected is that everyone has easy access to MS Excel, so the file accompanying this project paper can be easily viewed, used, and modified by

  10. Migrating Educational Data and Services to Cloud Computing: Exploring Benefits and Challenges

    ERIC Educational Resources Information Center

    Lahiri, Minakshi; Moseley, James L.

    2013-01-01

    "Cloud computing" is currently the "buzzword" in the Information Technology field. Cloud computing facilitates convenient access to information and software resources as well as easy storage and sharing of files and data, without the end users being aware of the details of the computing technology behind the process. This…

  11. Active Ingredient - AZ

    EPA Pesticide Factsheets

    EPA Pesticide Chemical Search allows a user to easily find the pesticide chemical or active ingredient that they are interested in by using an array of simple to advanced search options. Chemical Search provides a single point of reference for easy access to information previously published in a variety of locations, including various EPA web pages and Regulations.gov.

  12. Portable Technology Comes of Age

    ERIC Educational Resources Information Center

    Wangemann, Paul; Lewis, Nina; Squires, David A.

    2003-01-01

    The PDA was originally conceived of as a portable handheld electronic device that provided a user with a tool to organize his or her life through easy access to a personal calendar, daily planner, and address book. Over the years, these devices have expanded to include many new functions, which have helped more applications in diverse fields. This…

  13. EVITHERM: The Virtual Institute of Thermal Metrology

    NASA Astrophysics Data System (ADS)

    Redgrove, J.; Filtz, J.-R.; Fischer, J.; Le Parlouër, P.; Mathot, V.; Nesvadba, P.; Pavese, F.

    2007-12-01

    Evitherm is a web-based thermal resource centre, resulting from a 3-year project partly funded by the EU's GROWTH programme (2002–2005). Evitherm links together the widely distributed centres of excellence (NMIs, research and teaching institutes, consultants, etc.) and others concerned with thermal measurements and technology to provide a focal point for information exchange and knowledge transfer between all these organizations and industry. To facilitate the quick and easy flow of thermal knowledge to users of thermal technologies, evitherm has a website (www.evitherm.org) through which it disseminates information and by which it also provides access to resources such as training, property data, measurements and experts. Among the resources available from the website are (1) thermal property data, offering access to some of the world's leading databases; (2) expertise: evitherm has a database of consultants, an Advice line, a public Forum and a unique Consultancy Brokering Service whereby users are linked to the expert they need to solve their thermal industrial problems; (3) industry resources: thermal information for particular industry sectors; (4) services: information directories on thermal property measurement, training, equipment supply, reference materials, etc.; (5) literature: links to books, papers, standards, etc.; (6) events: conferences, meetings, seminars, organizations and networks, what's happening. A user only has to register (for free) to gain access to all the information on the evitherm website. Much of the thermal property data can be accessed for free and in a few cases we have negotiated affordable rates for access to some leading databases, such as CINDAS, THERSYST and NELFOOD. This article illustrates the aims and structure of the evitherm Society, how it is directed, and how it serves the thermal community worldwide in its need for quick and easy access to the resources needed to help ensure a well-resourced industrial workforce and clean and efficient thermal processes.

  14. BIOME: A browser-aware search and order system

    NASA Technical Reports Server (NTRS)

    Grubb, Jon W.; Jennings, Sarah V.; Yow, Teresa G.; Daughterty, Patricia F.

    1996-01-01

    The Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC), which is associated with NASA's Earth Observing System Data and Information System (EOSDIS), provides access to a large number of tabular and imagery datasets used in ecological and environmental research. Because of its large and diverse data holdings, the challenge for the ORNL DAAC is to help users find data of interest from the hundreds of thousands of files available at the DAAC without overwhelming them. Therefore, the ORNL DAAC developed the Biogeochemical Information Ordering Management Environment (BIOME), a search and order system for the World Wide Web (WWW). The WWW provides a new vehicle that allows a wide range of users access to the data. This paper describes the specialized attributes incorporated into BIOME that allow researchers easy access to an otherwise bewildering array of data products.

  15. User experiences of evidence-based online resources for health professionals: User testing of The Cochrane Library

    PubMed Central

    Rosenbaum, Sarah E; Glenton, Claire; Cracknell, Jane

    2008-01-01

    Background: Evidence-based decision making relies on easy access to trustworthy research results. The Cochrane Library is a key source of evidence about the effect of interventions and aims to "promote the accessibility of systematic reviews to anyone wanting to make a decision about health care". We explored how health professionals found, used and experienced The Library, looking at facets of user experience including findability, usability, usefulness, credibility, desirability and value. Methods: We carried out 32 one-hour usability tests on participants from Norway and the UK. Participants both browsed freely and attempted to perform individually tailored tasks while "thinking aloud". Sessions were recorded and viewed in real time by researchers. Transcriptions and videos were reviewed by one researcher and one designer. Findings reported here reflect issues that received a high degree of saturation and that we judge to be critical to the user experience of evidence-based web sites, based on principles for usability heuristics, web guidelines and evidence-based practice. Results: Participants had much difficulty locating both the site and its contents. Non-native English speakers were at an extra disadvantage when retrieving relevant documents despite high levels of English-language skills. Many participants displayed feelings of ineptitude, alienation and frustration. Some made serious mistakes in correctly distinguishing between different information types, for instance reviews, review protocols, and individual studies. Although most expressed a high regard for the site's credibility, some later displayed a mistrust of the independence of the information. Others were overconfident, thinking everything on The Cochrane Library site shared the same level of quality approval. Conclusion: Paradoxically, The Cochrane Library, established to support easy access to research evidence, has its own problems of accessibility. Health professionals' experiences of this and other evidence-based online resources can be improved by applying existing principles for web usability, prioritizing the development of simple search functionality, omitting "researcher" jargon, consistently marking site ownership, and clearly signposting different document types and different content quality. PMID:18662382

  16. Worldwide Ocean Optics Database (WOOD)

    DTIC Science & Technology

    2001-09-30

    … user can obtain values computed from empirical algorithms (e.g., beam attenuation estimated from diffuse attenuation and backscatter data). Error estimates will also be provided for … properties, including diffuse attenuation, beam attenuation, and scattering. The database shall be easy to use, Internet accessible, and frequently updated …

  17. The North Carolina State University Libraries Search Experience: Usability Testing Tabbed Search Interfaces for Academic Libraries

    ERIC Educational Resources Information Center

    Teague-Rector, Susan; Ballard, Angela; Pauley, Susan K.

    2011-01-01

    Creating a learnable, effective, and user-friendly library Web site hinges on providing easy access to search. Designing a search interface for academic libraries can be particularly challenging given the complexity and range of searchable library collections, such as bibliographic databases, electronic journals, and article search silos. Library…

  18. The HydroShare Collaborative Repository for the Hydrology Community

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Couch, A.; Hooper, R. P.; Dash, P. K.; Stealey, M.; Yi, H.; Bandaragoda, C.; Castronova, A. M.

    2017-12-01

    HydroShare is an online, collaboration system for sharing of hydrologic data, analytical tools, and models. It supports the sharing of, and collaboration around, "resources" which are defined by standardized content types for data formats and models commonly used in hydrology. With HydroShare you can: Share your data and models with colleagues; Manage who has access to the content that you share; Share, access, visualize and manipulate a broad set of hydrologic data types and models; Use the web services application programming interface (API) to program automated and client access; Publish data and models and obtain a citable digital object identifier (DOI); Aggregate your resources into collections; Discover and access data and models published by others; Use web apps to visualize, analyze and run models on data in HydroShare. This presentation will describe the functionality and architecture of HydroShare highlighting our approach to making this system easy to use and serving the needs of the hydrology community represented by the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc. (CUAHSI). Metadata for uploaded files is harvested automatically or captured using easy to use web user interfaces. Users are encouraged to add or create resources in HydroShare early in the data life cycle. To encourage this we allow users to share and collaborate on HydroShare resources privately among individual users or groups, entering metadata while doing the work. HydroShare also provides enhanced functionality for users through web apps that provide tools and computational capability for actions on resources. HydroShare's architecture broadly is comprised of: (1) resource storage, (2) resource exploration website, and (3) web apps for actions on resources. System components are loosely coupled and interact through APIs, which enhances robustness, as components can be upgraded and advanced relatively independently. The full power of this paradigm is the extensibility it supports. Web apps are hosted on separate servers, which may be 3rd party servers. They are registered in HydroShare using a web app resource that configures the connectivity for them to be discovered and launched directly from resource types they are associated with.
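    As a rough illustration of the web services API mentioned above, the sketch below lists publicly discoverable resources through HydroShare's REST interface. The /hsapi/resource/ path and the JSON field names are assumptions based on the public API documentation and may differ from the current service.

      # Hedged sketch: list public HydroShare resources via the assumed /hsapi/ REST endpoint.
      import requests

      HSAPI = "https://www.hydroshare.org/hsapi"

      resp = requests.get(f"{HSAPI}/resource/", params={"page": 1})
      resp.raise_for_status()
      for res in resp.json().get("results", []):
          # Field names assumed; adjust to whatever the live API actually returns.
          print(res.get("resource_id"), "-", res.get("resource_title"))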

  19. StarTrax --- The Next Generation User Interface

    NASA Astrophysics Data System (ADS)

    Richmond, Alan; White, Nick

    StarTrax is a software package to be distributed to end users for installation on their local computing infrastructure. It will provide access to many services of the HEASARC, i.e. bulletins, catalogs, proposal and analysis tools, initially for the ROSAT MIPS (Mission Information and Planning System) and later for the Next Generation Browse. A user activating the GUI will reach all HEASARC capabilities through a uniform view of the system, independent of the local computing environment and of the networking method used to access StarTrax. It is intended for those who prefer the point-and-click metaphor of modern GUI technology to classical command-line interfaces (CLI). Notable strengths include: ease of use; excellent portability; very robust server support; a feedback button on every dialog; a painstakingly crafted User Guide. It is designed to support a large number of input devices, including terminals, workstations and personal computers. XVT's Portability Toolkit is used to build the GUI in C/C++ to run on OSF/Motif (UNIX or VMS), OPEN LOOK (UNIX), Macintosh, MS-Windows (DOS), or character-based systems.

  20. PROVIDING PLANT DATA ANALYTICS THROUGH A SEAMLESS DIGITAL ENVIRONMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bly, Aaron; Oxstrand, Johanna

    As technology continues to evolve and become more integrated into a worker's daily routine in the nuclear power industry, the need for easy access to data becomes a priority. Not only does the need for data increase, but the amount of data collected increases. In most cases the data is collected and stored in various software applications, many of which are legacy systems, which do not offer any other option to access the data except through the application's user interface. Furthermore, the data gets grouped in "silos" according to work function and not necessarily by subject. Hence, in order to access all the information needed for a particular task or analysis, one may have to access multiple applications to gather all the data needed. The industry and the research community have identified the need for a digital architecture and, more importantly, the need for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the analysis requested, and present the result to the user with minimal or no effort by the user. In addition, the nuclear utilities have identified the need for research focused on data analytics. The effort should develop and evaluate use cases for data mining and analytics for employing information from plant sensors and databases for use in developing improved business analytics. Idaho National Laboratory is leading such an effort, which is conducted in close collaboration with vendors, nuclear utilities, the Institute of Nuclear Power Operations, and the Electric Power Research Institute. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability, on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that will display the output of the analysis in a straightforward, easy-to-consume manner. This paper will describe the study and the initial results.
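    The three layers described above can be illustrated with a toy example. The sketch below uses entirely synthetic data and generic open-source libraries (pandas, scikit-learn); it is not the INL solution, only a minimal picture of joining siloed data, fitting a predictive model, and presenting the output.

      # Toy sketch of a three-layer analytics pipeline on synthetic equipment data.
      import pandas as pd
      from sklearn.linear_model import LinearRegression

      # Data integration layer: join sensor readings with work-order history.
      sensors = pd.DataFrame({"equipment_id": [1, 1, 2, 2],
                              "vibration": [0.2, 0.9, 0.3, 1.1],
                              "temperature": [60, 78, 62, 81]})
      history = pd.DataFrame({"equipment_id": [1, 2],
                              "hours_to_failure": [1200, 300]})
      data = sensors.groupby("equipment_id").mean().join(history.set_index("equipment_id"))

      # Predictive layer: relate condition indicators to remaining life.
      model = LinearRegression().fit(data[["vibration", "temperature"]], data["hours_to_failure"])

      # Presentation layer: a plain-language summary for the end user.
      predicted = model.predict(data[["vibration", "temperature"]])
      for eq, hours in zip(data.index, predicted):
          print(f"Equipment {eq}: estimated {hours:.0f} hours to failure")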

  1. NASA Taxonomy 2.0 Project Overview

    NASA Technical Reports Server (NTRS)

    Dutra, Jayne; Busch, Joseph

    2004-01-01

    This viewgraph presentation reviews the project to develop a taxonomy for NASA. The benefits of this project are to make it easy for various audiences to find relevant information from NASA programs quickly, specifically to (1) provide easy access to NASA Web resources; (2) support information integration for unified queries, management reporting, search results targeted to user interests, and the ability to move content through the enterprise to where it is needed most; and (3) facilitate records management and retention requirements. In addition, the project will assist NASA in complying with the E-Government Act of 2002 and prepare NASA to participate in federal projects.

  2. The NOAO NVO Portal

    NASA Astrophysics Data System (ADS)

    Miller, C. J.; Gasson, D.; Fuentes, E.

    2007-10-01

    The NOAO NVO Portal is a web application for one-stop discovery, analysis, and access to VO-compliant imaging data and services. The current release allows for GUI-based discovery of nearly a half million images from archives such as the NOAO Science Archive, the Hubble Space Telescope WFPC2 and ACS instruments, XMM-Newton, Chandra, and ESO's INT Wide-Field Survey, among others. The NOAO Portal allows users to view image metadata, footprint wire-frames, FITS image previews, and provides one-click access to science quality imaging data throughout the entire sky via the Firefox web browser (i.e., no applet or code to download). Users can stage images from multiple archives at the NOAO NVO Portal for quick and easy bulk downloads. The NOAO NVO Portal also provides simplified and direct access to VO analysis services, such as the WESIX catalog generation service. We highlight the features of the NOAO NVO Portal (http://nvo.noao.edu).

  3. BIOME: A browser-aware search and order system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grubb, J.W.; Jennings, S.V.; Yow, T.G.

    1996-05-01

    The Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC), which is associated with NASA's Earth Observing System Data and Information System (EOSDIS), provides access to a large number of tabular and imagery datasets used in ecological and environmental research. Because of its large and diverse data holdings, the challenge for the ORNL DAAC is to help users find data of interest from the hundreds of thousands of files available at the DAAC without overwhelming them. Therefore, the ORNL DAAC developed the Biogeochemical Information Ordering Management Environment (BIOME), a search and order system for the World Wide Web (WWW). The WWW provides a new vehicle that allows a wide range of users access to the data. This paper describes the specialized attributes incorporated into BIOME that allow researchers easy access to an otherwise bewildering array of data products.

  4. Examining Researcher Needs and Barriers for using Electronic Health Data for Translational Research

    PubMed Central

    Stephens, Kari A.; Lee, E. Sally; Estiri, Hossein; Jung, Hyunggu

    2015-01-01

    To achieve the Learning Health Care System, we must harness electronic health data (EHD) by providing effective tools for researchers to access data efficiently. EHD is proliferating and researchers are relying on these data to pioneer discovery. Tools must be user-centric to ensure their utility. To this end, we conducted a qualitative study to assess researcher needs and barriers to using EHD. Researchers expressed the need to be confident about the data and have easy access, a clear process for exploration and access, and adequate resources, while barriers included difficulties in finding datasets, usability of the data, cumbersome processes, and lack of resources. These needs and barriers can inform the design process for innovating tools to increase utility of EHD. Understanding researcher needs is key to building effective user-centered EHD tools to support translational research. PMID:26306262

  5. P43-S Computational Biology Applications Suite for High-Performance Computing (BioHPC.net)

    PubMed Central

    Pillardy, J.

    2007-01-01

    One of the challenges of high-performance computing (HPC) is user accessibility. At the Cornell University Computational Biology Service Unit, which is also a Microsoft HPC institute, we have developed a computational biology application suite that allows researchers from biological laboratories to submit their jobs to the parallel cluster through an easy-to-use Web interface. Through this system, we are providing users with popular bioinformatics tools including BLAST, HMMER, InterproScan, and MrBayes. The system is flexible and can be easily customized to include other software. It is also scalable; the installation on our servers currently processes approximately 8500 job submissions per year, many of them requiring massively parallel computations. It also has a built-in user management system, which can limit software and/or database access to specified users. TAIR, the major database of the plant model organism Arabidopsis, and SGN, the international tomato genome database, are both using our system for storage and data analysis. The system consists of a Web server running the interface (ASP.NET C#), Microsoft SQL server (ADO.NET), compute cluster running Microsoft Windows, ftp server, and file server. Users can interact with their jobs and data via a Web browser, ftp, or e-mail. The interface is accessible at http://cbsuapps.tc.cornell.edu/.

  6. A study on user authentication methodology using numeric password and fingerprint biometric information.

    PubMed

    Ju, Seung-hwan; Seo, Hee-suk; Han, Sung-hyu; Ryou, Jae-cheol; Kwak, Jin

    2013-01-01

    The prevalence of computers and the development of the Internet have made it easy to access information. As people become more concerned about information security, interest in user authentication methods is growing. The most common computer authentication method is the use of alphanumeric usernames and passwords. Password authentication is easy to use, but it is vulnerable: anyone who knows the password is authenticated as the user. Fingerprint-based authentication is strong, because only the user holds the information specific to the authentication, but it has the disadvantage that the user cannot change the authentication key. In this study, we propose an authentication methodology that combines a numeric password with fingerprint biometric information. The user's fingerprint information is used to obtain secure authentication keys, while the numeric password allows the key to be changed easily, so the authentication keys are designed to provide flexibility.
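    The combination of a changeable numeric password with fixed fingerprint data can be sketched, for illustration only, as a key-derivation step; the construction below is generic and is not the authors' exact methodology.

      # Illustrative sketch: derive an authentication key from a numeric password plus
      # fingerprint template data, so the key stays changeable via the password while
      # remaining bound to the biometric.
      import hashlib
      import hmac

      def derive_auth_key(numeric_password: str, fingerprint_template: bytes) -> bytes:
          # Use the (stable) fingerprint template digest as the salt and the
          # changeable numeric password as the secret input to PBKDF2.
          salt = hashlib.sha256(fingerprint_template).digest()
          return hashlib.pbkdf2_hmac("sha256", numeric_password.encode(), salt, 200_000)

      def verify(candidate_key: bytes, stored_key: bytes) -> bool:
          # Constant-time comparison to avoid timing side channels.
          return hmac.compare_digest(candidate_key, stored_key)

      # Example: enrollment followed by a login attempt.
      template = b"example-fingerprint-template-bytes"   # placeholder for minutiae/template data
      stored = derive_auth_key("482913", template)
      print(verify(derive_auth_key("482913", template), stored))  # True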

  7. A Study on User Authentication Methodology Using Numeric Password and Fingerprint Biometric Information

    PubMed Central

    Ju, Seung-hwan; Seo, Hee-suk; Han, Sung-hyu; Ryou, Jae-cheol

    2013-01-01

    The prevalence of computers and the development of the Internet have made it easy to access information. As people become more concerned about information security, interest in user authentication methods is growing. The most common computer authentication method is the use of alphanumeric usernames and passwords. Password authentication is easy to use, but it is vulnerable: anyone who knows the password is authenticated as the user. Fingerprint-based authentication is strong, because only the user holds the information specific to the authentication, but it has the disadvantage that the user cannot change the authentication key. In this study, we propose an authentication methodology that combines a numeric password with fingerprint biometric information. The user's fingerprint information is used to obtain secure authentication keys, while the numeric password allows the key to be changed easily, so the authentication keys are designed to provide flexibility. PMID:24151601

  8. dCache, Sync-and-Share for Big Data

    NASA Astrophysics Data System (ADS)

    Millar, AP; Fuhrmann, P.; Mkrtchyan, T.; Behrmann, G.; Bernardt, C.; Buchholz, Q.; Guelzow, V.; Litvintsev, D.; Schwank, K.; Rossi, A.; van der Reest, P.

    2015-12-01

    The availability of cheap, easy-to-use sync-and-share cloud services has split the scientific storage world into the traditional big data management systems and the very attractive sync-and-share services. With the former, the location of data is well understood while the latter is mostly operated in the Cloud, resulting in a rather complex legal situation. Beside legal issues, those two worlds have little overlap in user authentication and access protocols. While traditional storage technologies, popular in HEP, are based on X.509, cloud services and sync-and-share software technologies are generally based on username/password authentication or mechanisms like SAML or Open ID Connect. Similarly, data access models offered by both are somewhat different, with sync-and-share services often using proprietary protocols. As both approaches are very attractive, dCache.org developed a hybrid system, providing the best of both worlds. To avoid reinventing the wheel, dCache.org decided to embed another Open Source project: OwnCloud. This offers the required modern access capabilities but does not support the managed data functionality needed for large capacity data storage. With this hybrid system, scientists can share files and synchronize their data with laptops or mobile devices as easy as with any other cloud storage service. On top of this, the same data can be accessed via established mechanisms, like GridFTP to serve the Globus Transfer Service or the WLCG FTS3 tool, or the data can be made available to worker nodes or HPC applications via a mounted filesystem. As dCache provides a flexible authentication module, the same user can access its storage via different authentication mechanisms; e.g., X.509 and SAML. Additionally, users can specify the desired quality of service or trigger media transitions as necessary, thus tuning data access latency to the planned access profile. Such features are a natural consequence of using dCache. We will describe the design of the hybrid dCache/OwnCloud system, report on several months of operations experience running it at DESY, and elucidate the future road-map.

  9. [Statistical analysis using freely-available "EZR (Easy R)" software].

    PubMed

    Kanda, Yoshinobu

    2015-10-01

    Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named it "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows the application of statistical functions that are frequently used in clinical studies, such as survival analyses (including competing risk analyses and the use of time-dependent covariates), by point-and-click access. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and ensure that the statistical process is overseen by a supervisor.

  10. Building the vision : a series of AZTech ITS model deployment success stories for the Phoenix metropolitan area : number six : user friendliness touches Arizona : kiosks offer fast, efficient resource for traveler and community information

    DOT National Transportation Integrated Search

    1998-01-01

    To the traveling public, the most readily apparent benefit of AZTech is easy access to traveler information. Providing travelers with real value requires that information to be factual, comprehensive and timely. Through AZTech, numerous services are ...

  11. An Easy-to-Build Remote Laboratory with Data Transfer Using the Internet School Experimental System

    ERIC Educational Resources Information Center

    Schauer, Frantisek; Lustig, Frantisek; Dvorak, Jiri; Ozvoldova, Miroslava

    2008-01-01

    The present state of information communication technology makes it possible to devise and run computer-based e-laboratories accessible to any user with a connection to the Internet, equipped with very simple technical means and making full use of web services. Thus, the way is open for a new strategy of physics education with strongly global…

  12. We Never Have to Say Goodbye: Finding a Place for OPACS in Discovery Environments

    ERIC Educational Resources Information Center

    Matthews, J. Greg

    2009-01-01

    It is easy to lament the shortcomings of traditional online public access catalogs (OPACs) in the Google Age. Users cannot, for example, usually input a snippet of a long-forgotten pop song's chorus into an online library catalog and almost instantly retrieve a relevant result along with hundreds of other options. On the other hand, should OPACs…

  13. NREL Releases Major Update to Wind Energy Dataset | News | NREL

    Science.gov Websites

    … the Toolkit made 2 terabytes (TB) of information available, covering about 120,000 locations identified using … using the AWS cloud to provide users with easy access to the data, which is stored as a series of HDF5 files. The information can be narrowed to a specific site or time and analyzed using either a custom …
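    Since the dataset is distributed as HDF5 files, a subset can be read with the h5py library. The file name, dataset name and site index below are placeholders for illustration; the actual WIND Toolkit layout may differ.

      # Minimal sketch: narrow an HDF5-backed wind dataset to one site and time window.
      import h5py

      with h5py.File("wtk_subset.h5", "r") as f:        # hypothetical local extract
          wind_speed = f["windspeed_100m"]              # hypothetical dataset name, shape (time, site)
          site_index = 12345                            # hypothetical site column
          first_week = wind_speed[0:168, site_index]    # hourly values for 7 days
          print("mean wind speed:", first_week.mean())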

  14. From Zero to Wireless in 4 Essential Steps

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2011-01-01

    Setting up a wireless network is easy enough. Place a few access points in strategic areas, then watch the network traffic fly! That approach may hold up reasonably well for a few dozen users, but there's no way it would support a schoolwide 1-to-1 program. Once students take to it with their wireless devices, "they're going to kill it,"…

  15. OSTA data systems planning workshop recommendations

    NASA Technical Reports Server (NTRS)

    Desjardins, R.

    1981-01-01

    The Integrated Discipline Requirements are presented, including the following needs: (1) quality data sets, (2) a systematic treatment of problems with present data, (3) a single integrated catalog or master directory, (4) continuity of data formats, (5) a standard geographic and time basis, (6) data delivery in terms of easy rather than immediate accessibility, (7) data archives, and (8) cooperation with user agencies.

  16. GENESI-DR - A single access point to Earth Science data

    NASA Astrophysics Data System (ADS)

    Cossu, R.; Goncalves, P.; Pacini, F.

    2009-04-01

    The amount of information being generated about our planet is increasing at an exponential rate, but it must be easily accessible in order to apply it to the global needs relating to the state of the Earth. Currently, information about the state of the Earth, relevant services, analysis results, applications and tools are accessible in a very scattered and uncoordinated way, often through individual initiatives from Earth Observation mission operators, scientific institutes dealing with ground measurements, service companies, data catalogues, etc. A dedicated infrastructure providing transparent access to all this will support Earth Science communities by allowing them to easily and quickly derive objective information and share knowledge based on all environmentally sensitive domains. The use of high-speed networks (GÉANT) and the experimentation with new technologies, like BitTorrent, will also contribute to better services for the Earth Science communities. GENESI-DR (Ground European Network for Earth Science Interoperations - Digital Repositories), an ESA-led, European Commission (EC)-funded two-year project, is taking the lead in providing reliable, easy, long-term access to Earth Science data via the Internet. This project will allow scientists from different Earth Science disciplines located across Europe to locate, access, combine and integrate historical and fresh Earth-related data from space, airborne and in-situ sensors archived in large distributed repositories. GENESI-DR builds a federated collection of heterogeneous digital Earth Science repositories to establish a dedicated infrastructure providing transparent access to all this and allowing Earth Science communities to easily and quickly derive objective information and share knowledge based on all environmentally sensitive domains. The federated digital repositories, seen as services and data providers, will share access to their resources (catalogue functions, data access, processing services etc.) and will adhere to a common set of standards / policies / interfaces. The end-users will be provided with a virtual collection of digital Earth Science data, irrespective of their location in the various single federated repositories. GENESI-DR objectives have led to the identification of the basic GENESI-DR infrastructure requirements: • Capability, for Earth Science users, to discover data from different European Earth Science Digital Repositories through the same interface in a transparent and homogeneous way; • Ease and speed of access to large volumes of coherently maintained distributed data in an effective and timely way; • Capability, for DR owners, to easily make their data available to a significantly increased audience with no need to duplicate them in a different storage system. Data discovery is based on a Central Discovery Service, which allows users and applications to easily query information about data collections and products existing in heterogeneous catalogues at federated DR sites. This service can be accessed by users via a web interface, the GENESI-DR Web Portal, or by external applications via open standardized interfaces exposed by the system. The Central Discovery Service identifies the DRs providing products complying with the user search criteria and returns the corresponding access points to the requester. By taking into consideration different and efficient data transfer technologies such as HTTPS, GridFTP and BitTorrent, the infrastructure provides ease and speed of access.
Conversely, for data publishing GENESI-DR provides several mechanisms to assist DR owners in producing metadata catalogues. In order to reach its objectives, the GENESI-DR e-Infrastructure will be validated against user needs for accessing and sharing Earth Science data. Initially, four specific applications in the land, atmosphere and marine domains have been selected, including: • Near real time orthorectification for agricultural crops monitoring • Urban area mapping in support of emergency response • Data assimilation in GlobModel, addressing major environmental and health issues in Europe, with a particular focus on air quality • SeaDataNet to aid environmental assessments and to forecast the physical state of the oceans in near real time. Other applications will complement these during the second half of the project. GENESI-DR also aims to develop common approaches to preserve the historical archives and the ability to access the derived user information as both software and hardware transformations occur. Ensuring access to Earth Science data for future generations is of utmost importance because it allows for the continuity of knowledge generation improvement. For instance, scientists accessing today's climate change data in 50 years will be able to better understand and detect trends in global warming and apply this knowledge to ongoing natural phenomena. GENESI-DR will work towards harmonising operations and applying approved standards, policies and interfaces at key Earth Science data repositories. To help with this undertaking, GENESI-DR will establish links with the relevant organisations and programmes such as space agencies, institutional environmental programmes, international Earth Science programmes and standardisation bodies.

  17. A review of manual wheelchairs.

    PubMed

    Flemmer, Claire L; Flemmer, Rory C

    2016-01-01

    To review the scientific literature published in the last 14 years on the different types of manual wheelchairs, a systematic review of the literature was conducted to find the recent research on manual wheelchairs. The findings of 77 references on pushrim-propelled wheelchairs, crank-propelled wheelchairs, lever-propelled wheelchairs, geared manual wheelchairs and pushrim-activated power-assist wheelchairs are reported. The pushrim-propelled wheelchair is light, easy to steer and has good indoor manoeuvrability but is very inefficient and causes serious upper body overloading, so that long-term use leads to steadily deteriorating capability for the user and ultimately a transition to a powered chair. Whilst the latter is less physically demanding, the sedentary lifestyle and decreasing muscle use lead to several secondary health problems. Crank- and lever-propelled wheelchairs and geared pushrim wheelchairs are more efficient and less demanding and may improve the quality of life of the user by expanding the range of accessible environments, reducing upper body pain, increasing independence and avoiding or delaying the 'debilitating cycle'. However, wheelchairs with these alternative modes of propulsion are often heavier, wider and/or longer and are less easy to steer, brake and fold than the pushrim wheelchair. Implications for rehabilitation: Pushrim-propelled wheelchairs are difficult to drive on outdoor paths (grass and gravel/sand surfaces) and ramps, so that users are confined to restricted environments and have limited participation in everyday activities. The repetitive strain imposed on the upper body by pushrim propulsion leads to a very high prevalence of shoulder and wrist pain in manual wheelchair users. Crank-propelled and lever-propelled wheelchairs are more efficient and less straining than pushrim-propelled wheelchairs, allowing users to access more challenging environments, prolong independence and improve the quality of life.

  18. Vibrotactile Feedbacks System for Assisting the Physically Impaired Persons for Easy Navigation

    NASA Astrophysics Data System (ADS)

    Safa, M.; Geetha, G.; Elakkiya, U.; Saranya, D.

    2018-04-01

    The NAYAN architecture is designed to help visually impaired persons navigate. As is well known, visually impaired people require special support even to access services such as public transportation. This prototype system is a portable device that is easy to carry in any condition when travelling through familiar and unfamiliar environments. The system consists of a GPS receiver that obtains NMEA data from satellites and provides it to the user's smartphone through an Arduino board. The application uses two vibrotactile feedback units, placed on the left and right shoulders, whose vibrations convey information about the current location. An ultrasonic sensor is used to detect obstacles in front of the visually impaired person. A Bluetooth module connected to the Arduino board sends the information it receives from the GPS to the user's mobile phone.
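    The NMEA sentences mentioned above are plain comma-separated text, so a position fix can be extracted with a few lines of parsing. The sketch below handles a single $GPGGA sentence; the sample string is illustrative, and in the prototype this work happens on the Arduino/smartphone side.

      # Minimal sketch: extract latitude/longitude from a raw $GPGGA NMEA sentence.
      def parse_gpgga(sentence: str):
          """Return (latitude, longitude) in decimal degrees, or None if there is no fix."""
          fields = sentence.split(",")
          if not fields[0].endswith("GGA") or not fields[2]:
              return None

          def to_decimal(value: str, hemisphere: str) -> float:
              head, minutes = divmod(float(value), 100)   # ddmm.mmmm -> dd, mm.mmmm
              degrees = head + minutes / 60.0
              return -degrees if hemisphere in ("S", "W") else degrees

          lat = to_decimal(fields[2], fields[3])
          lon = to_decimal(fields[4], fields[5])
          return lat, lon

      sample = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
      print(parse_gpgga(sample))   # roughly (48.1173, 11.5167)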

  19. Finding and Exploring Health Information with a Slider-Based User Interface.

    PubMed

    Pang, Patrick Cheong-Iao; Verspoor, Karin; Pearce, Jon; Chang, Shanton

    2016-01-01

    Despite the fact that search engines are the primary channel to access online health information, there are better ways to find and explore health information on the web. Search engines are prone to problems when they are used to find health information. For instance, users have difficulties in expressing health scenarios with appropriate search keywords, search results are not optimised for medical queries, and the search process does not account for users' literacy levels and reading preferences. In this paper, we describe our approach to addressing these problems by introducing a novel design using a slider-based user interface for discovering health information without the need for precise search keywords. The user evaluation suggests that the interface is easy to use and able to assist users in the process of discovering new information. This study demonstrates the potential value of adopting slider controls in the user interface of health websites for navigation and information discovery.

  20. Development of a mental health smartphone app: perspectives of mental health service users.

    PubMed

    Goodwin, John; Cummins, John; Behan, Laura; O'Brien, Sinead M

    2016-10-01

    Current mental health policy emphasises the importance of service user involvement in the delivery of care. Information Technology can have an effect on quality and efficiency of care. The aim of this study is to gain the viewpoint of service users from a local mental health service in developing a mental health app. A qualitative descriptive approach was used. Eight volunteers aged 18-49 years were interviewed with the aid of a semi-structured questionnaire. Interviewees defined a good app by its ease of use. Common themes included availability of contact information, identifying triggers, the ability to rate mood/anxiety levels on a scale, guided relaxation techniques, and the option to personalise the app. The researchers will aim to produce an app that is easily accessible, highly personalisable and will include functions highlighted as important (i.e. contact information, etc.). This research will assist in the development of an easy-to-use app that could increase access to services, and allow service users to take an active role in their care. In previous studies, apps were developed without the involvement of service users. This study recognises the important role of service users in this area.

  1. Myokit: A simple interface to cardiac cellular electrophysiology.

    PubMed

    Clerx, Michael; Collins, Pieter; de Lange, Enno; Volders, Paul G A

    2016-01-01

    Myokit is a new powerful and versatile software tool for modeling and simulation of cardiac cellular electrophysiology. Myokit consists of an easy-to-read modeling language, a graphical user interface, single and multi-cell simulation engines and a library of advanced analysis tools accessible through a Python interface. Models can be loaded from Myokit's native file format or imported from CellML. Model export is provided to C, MATLAB, CellML, CUDA and OpenCL. Patch-clamp data can be imported and used to estimate model parameters. In this paper, we review existing tools to simulate the cardiac cellular action potential to find that current tools do not cater specifically to model development and that there is a gap between easy-to-use but limited software and powerful tools that require strong programming skills from their users. We then describe Myokit's capabilities, focusing on its model description language, simulation engines and import/export facilities in detail. Using three examples, we show how Myokit can be used for clinically relevant investigations, multi-model testing and parameter estimation in Markov models, all with minimal programming effort from the user. This way, Myokit bridges a gap between performance, versatility and user-friendliness. Copyright © 2015 Elsevier Ltd. All rights reserved.
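    A minimal sketch of the workflow described above is shown below, following Myokit's documented load/Simulation pattern; the model file name is a placeholder, and the logged variable labels ('engine.time', 'membrane.V') are assumptions that depend on the model actually loaded.

      # Hedged sketch of a basic Myokit run: load a model, simulate, inspect the log.
      import myokit

      model, protocol, _script = myokit.load("example.mmt")   # placeholder .mmt file
      sim = myokit.Simulation(model, protocol)
      log = sim.run(1000)                                      # 1000 ms of simulated time
      # Variable labels depend on the model; these are assumed for illustration.
      print(log["engine.time"][:5], log["membrane.V"][:5])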

  2. JHelioviewer: Open-Source Software for Discovery and Image Access in the Petabyte Age

    NASA Astrophysics Data System (ADS)

    Mueller, D.; Dimitoglou, G.; Garcia Ortiz, J.; Langenberg, M.; Nuhn, M.; Dau, A.; Pagel, S.; Schmidt, L.; Hughitt, V. K.; Ireland, J.; Fleck, B.

    2011-12-01

    The unprecedented torrent of data returned by the Solar Dynamics Observatory is both a blessing and a barrier: a blessing for making available data with significantly higher spatial and temporal resolution, but a barrier for scientists to access, browse and analyze them. With such staggering data volume, the data is accessible only from a few repositories and users have to deal with data sets effectively immobile and practically difficult to download. From a scientist's perspective this poses three challenges: accessing, browsing and finding interesting data while avoiding the proverbial search for a needle in a haystack. To address these challenges, we have developed JHelioviewer, an open-source visualization software that lets users browse large data volumes both as still images and movies. We did so by deploying an efficient image encoding, storage, and dissemination solution using the JPEG 2000 standard. This solution enables users to access remote images at different resolution levels as a single data stream. Users can view, manipulate, pan, zoom, and overlay JPEG 2000 compressed data quickly, without severe network bandwidth penalties. Besides viewing data, the browser provides third-party metadata and event catalog integration to quickly locate data of interest, as well as an interface to the Virtual Solar Observatory to download science-quality data. As part of the ESA/NASA Helioviewer Project, JHelioviewer offers intuitive ways to browse large amounts of heterogeneous data remotely and provides an extensible and customizable open-source platform for the scientific community. In addition, the easy-to-use graphical user interface enables the general public and educators to access, enjoy and reuse data from space missions without barriers.

  3. webPOISONCONTROL: can poison control be automated?

    PubMed

    Litovitz, Toby; Benson, Blaine E; Smolinske, Susan

    2016-08-01

    A free webPOISONCONTROL app allows the public to determine the appropriate triage of poison ingestions without calling poison control. If accepted and safe, this alternative expands access to reliable poison control services to those who prefer the Internet over the telephone. This study assesses feasibility, safety, and user-acceptance of automated online triage of asymptomatic, nonsuicidal poison ingestion cases. The user provides substance name, amount, age, and weight in an automated online tool or downloadable app, and is given a specific triage recommendation to stay home, go to the emergency department, or call poison control for further guidance. Safety was determined by assessing outcomes of consecutive home-triaged cases with follow-up and by confirming the correct application of algorithms. Case completion times and user perceptions of speed and ease of use were measures of user-acceptance. Of 9256 cases, 73.3% were triaged to home, 2.1% to an emergency department, and 24.5% directed to call poison control. Children younger than 6 years were involved in 75.2% of cases. Automated follow-up was done in 31.2% of home-triaged cases; 82.3% of these had no effect. No major or fatal outcomes were reported. More than 91% of survey respondents found the tool quick and easy to use. Median case completion time was 4.1 minutes. webPOISONCONTROL augments traditional poison control services by providing automated, accurate online access to case-specific triage and first aid guidance for poison ingestions. It is safe, quick, and easy to use. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
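    The automated triage idea can be illustrated with a toy decision function. The substances, thresholds and rules below are invented for the sketch and are not webPOISONCONTROL's actual algorithms; they only show how a reported substance, amount and weight could map onto the three recommendations described.

      # Purely illustrative triage sketch with invented substances and thresholds.
      HOME = "home"
      CALL_POISON_CONTROL = "call poison control"
      EMERGENCY_DEPARTMENT = "emergency department"

      # Hypothetical mg-per-kg limits below which observation at home is offered.
      SAFE_MG_PER_KG = {"ibuprofen": 100, "acetaminophen": 75}

      def triage(substance: str, amount_mg: float, weight_kg: float) -> str:
          limit = SAFE_MG_PER_KG.get(substance.lower())
          if limit is None:
              return CALL_POISON_CONTROL        # unknown substance: escalate to a specialist
          dose = amount_mg / weight_kg
          if dose <= limit:
              return HOME
          if dose <= 2 * limit:
              return CALL_POISON_CONTROL
          return EMERGENCY_DEPARTMENT

      print(triage("ibuprofen", 800, 20))       # 40 mg/kg -> "home" under these invented limits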

  4. Recent developments in the CCP-EM software suite.

    PubMed

    Burnley, Tom; Palmer, Colin M; Winn, Martyn

    2017-06-01

    As part of its remit to provide computational support to the cryo-EM community, the Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has produced a software framework which enables easy access to a range of programs and utilities. The resulting software suite incorporates contributions from different collaborators by encapsulating them in Python task wrappers, which are then made accessible via a user-friendly graphical user interface as well as a command-line interface suitable for scripting. The framework includes tools for project and data management. An overview of the design of the framework is given, together with a survey of the functionality at different levels. The current CCP-EM suite has particular strength in the building and refinement of atomic models into cryo-EM reconstructions, which is described in detail.
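    The "task wrapper" idea described above can be sketched generically: an external program is hidden behind a small Python class with a uniform run() method that a GUI or script can call. The class below is a hypothetical illustration of the pattern, not an actual CCP-EM wrapper; 'echo' stands in for a real command-line program.

      # Generic sketch of wrapping a command-line program behind a uniform Python interface.
      import subprocess

      class TaskWrapper:
          """Encapsulate one external program so callers only deal with run()."""
          program = "echo"                      # placeholder external program

          def __init__(self, **params):
              self.params = params

          def command(self):
              args = [self.program]
              for key, value in self.params.items():
                  args += [f"--{key}", str(value)]
              return args

          def run(self):
              result = subprocess.run(self.command(), capture_output=True, text=True, check=True)
              return result.stdout

      print(TaskWrapper(resolution=3.2, symmetry="C1").run())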

  5. Thomson Scientific's expanding Web of Knowledge: beyond citation databases and current awareness services.

    PubMed

    London, Sue; Brahmi, Frances A

    2005-01-01

    As end-user demand for easy access to electronic full text continues to climb, an increasing number of information providers are combining that access with their other products and services, making navigating their Web sites by librarians seeking information on a given product or service more daunting than ever. One such provider of a complex array of products and services is Thomson Scientific. This paper looks at some of the many products and tools available from two of Thomson Scientific's businesses, Thomson ISI and Thomson ResearchSoft. Among the items of most interest to health sciences and veterinary librarians and their users are the variety of databases available via the ISI Web of Knowledge platform and the information management products available from ResearchSoft.

  6. Recent developments in the CCP-EM software suite

    PubMed Central

    Burnley, Tom

    2017-01-01

    As part of its remit to provide computational support to the cryo-EM community, the Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has produced a software framework which enables easy access to a range of programs and utilities. The resulting software suite incorporates contributions from different collaborators by encapsulating them in Python task wrappers, which are then made accessible via a user-friendly graphical user interface as well as a command-line interface suitable for scripting. The framework includes tools for project and data management. An overview of the design of the framework is given, together with a survey of the functionality at different levels. The current CCP-EM suite has particular strength in the building and refinement of atomic models into cryo-EM reconstructions, which is described in detail. PMID:28580908

  7. Trends in Perceived Access to Marijuana Among Adolescents in the United States: 2002-2015.

    PubMed

    Salas-Wright, Christopher P; Oh, Sehun; Goings, Trenette Clark; Vaughn, Michael G

    2017-09-01

    There is concern that changes in marijuana-related policy and public opinion may lead to increased access to marijuana among young people in the United States. However, little research has been conducted on changes in youth's perceptions of marijuana access, and studies have yet to systematically examine trends in perceived access across key sociodemographic and externalizing behavioral subgroups. Using population-based data collected between 2002 and 2015 as part of the National Survey on Drug Use and Health, we examined trends in perceived marijuana access among non-Hispanic White, African American, and Hispanic adolescents (ages 12-17, n = 221,412). Following the trend analysis method outlined by the Centers for Disease Control and Prevention, we conducted logistic regression analyses to test for secular trends. Between 2002 and 2015, we observed a 27% overall reduction in the relative proportion of adolescents ages 12-17-and a 42% reduction among those ages 12-14-reporting that it would be "very easy" to obtain marijuana. This pattern was uniformly observed among youth in all sociodemographic subgroups (i.e., across age, gender, race/ethnicity, household income) and among youth reporting involvement/no involvement in most measures of substance use (alcohol, marijuana) and delinquency (handgun carrying, attacks). However, perceived very easy access remained stable among youth reporting tobacco use and criminal justice system involvement. Despite the legalization of recreational and medical marijuana in some states, our findings suggest that, with the notable exception of adolescent tobacco users and juvenile offenders, perceptions that marijuana would be very easy to obtain are on the decline among American youth.
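    The secular-trend test described above amounts to a logistic regression of a binary response on survey year. The sketch below reproduces that idea on synthetic data with statsmodels; the numbers are invented, and only the shape of the analysis matches the study.

      # Sketch of a secular-trend test on synthetic "very easy access" responses.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      years = rng.integers(2002, 2016, size=5000)           # synthetic survey years
      prob = 0.55 - 0.02 * (years - 2002)                   # invented declining trend
      very_easy = rng.binomial(1, prob)

      X = sm.add_constant(years - 2002)                     # center year at 2002
      fit = sm.Logit(very_easy, X).fit(disp=False)
      print(fit.params, fit.pvalues)                        # negative year coefficient -> decline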

  8. Design of a Template for Handwriting Based Hindi Text Entry in Handheld Devices

    NASA Astrophysics Data System (ADS)

    Gangopadhyay, Diya; Vasal, Ityam; Yammiyavar, Pradeep

    Mobile phones, in the recent times, have become affordable and accessible to a wider range of users including the hitherto technologically and economically under-represented segments. Indian users are a gigantic consumer base for mobile phones. With Hindi being one of the most widely spoken languages in the country and the primary tool of communication for about a third of its population, an effective solution for Hindi text entry in mobile devices is expected to be immensely useful to the non English speaking users. This paper proposes a mobile phone handwriting based text entry solution for Hindi language, which allows for an easy text entry method, while facilitating better recognition accuracy.

  9. FreeSASA: An open source C library for solvent accessible surface area calculations.

    PubMed

    Mitternacht, Simon

    2016-01-01

    Calculating solvent accessible surface areas (SASA) is a run-of-the-mill calculation in structural biology. Although there are many programs available for this calculation, there are no free-standing, open-source tools designed for easy tool-chain integration. FreeSASA is an open source C library for SASA calculations that provides both command-line and Python interfaces in addition to its C API. The library implements both Lee and Richards' and Shrake and Rupley's approximations, and is highly configurable to allow the user to control molecular parameters, accuracy and output granularity. It only depends on standard C libraries and should therefore be easy to compile and install on any platform. The library is well-documented, stable and efficient. The command-line interface can easily replace closed source legacy programs, with comparable or better accuracy and speed, and with some added functionality.
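    A minimal use of the Python interface described above might look like the sketch below; the PDB file name is a placeholder, and the calls follow the library's documented basic usage.

      # Minimal FreeSASA sketch: compute the total solvent accessible surface area of a structure.
      import freesasa

      structure = freesasa.Structure("protein.pdb")   # placeholder input structure
      result = freesasa.calc(structure)               # default parameters
      print("Total SASA: %.1f A^2" % result.totalArea())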

  10. ZEUS hardware control system

    NASA Astrophysics Data System (ADS)

    Loveless, R.; Erhard, P.; Ficenec, J.; Gather, K.; Heath, G.; Iacovacci, M.; Kehres, J.; Mobayyen, M.; Notz, D.; Orr, R.; Orr, R.; Sephton, A.; Stroili, R.; Tokushuku, K.; Vogel, W.; Whitmore, J.; Wiggers, L.

    1989-12-01

    The ZEUS collaboration is building a system to monitor, control and document the hardware of the ZEUS detector. This system is based on a network of VAX computers and microprocessors connected via ethernet. The database for the hardware values will be ADAMO tables; the ethernet connection will be DECNET, TCP/IP, or RPC. Most of the documentation will also be kept in ADAMO tables for easy access by users.

  11. Chips: A Tool for Developing Software Interfaces Interactively.

    DTIC Science & Technology

    1987-10-01

    … of the application through the objects on the screen. Chips makes this easy by supplying simple and direct access to the source code and data … object-oriented programming, user interface management systems, programming environments …

  12. Single-centre experience with Renal PatientView, a web-based system that provides patients with access to their laboratory results.

    PubMed

    Woywodt, Alexander; Vythelingum, Kervina; Rayner, Scott; Anderton, John; Ahmed, Aimun

    2014-10-01

    Renal PatientView (RPV) is a novel, web-based system in the UK that provides patients with access to their laboratory results, in conjunction with patient information. To study how renal patients within our centre access and use RPV. We sent out questionnaires in December 2011 to all 651 RPV users under our care. We collected information on aspects such as the frequency and timing of RPV usage, the parameters viewed by users, and the impact of RPV on their care. A total of 295 (45 %) questionnaires were returned. The predominant users of RPV were transplant patients (42 %) followed by pre-dialysis chronic kidney disease patients (37 %). Forty-two percent of RPV users accessed their results after their clinic appointments, 38 % prior to visiting the clinic. The majority of patients (76 %) had used the system to discuss treatment with their renal physician, while 20 % of patients gave permission to other members of their family to use RPV to monitor results on their behalf. Most users (78 %) reported accessing RPV on average 1-5 times/month. Most patients used RPV to monitor their kidney function, 81 % to check creatinine levels, 57 % to check potassium results. Ninety-two percent of patients found RPV easy to use and 93 % felt that overall the system helps them in taking care of their condition; 53 % of patients reported high satisfaction with RPV. Our results provide interesting insight into use of a system that gives patients web-based access to laboratory results. The fact that 20 % of patients delegate access to relatives also warrants further study. We propose that online access to laboratory results should be offered to all renal patients, although clinicians need to be mindful of the 'digital divide', i.e. part of the population that is not amenable to IT-based strategies for patient empowerment.

  13. pyGeno: A Python package for precision medicine and proteogenomics.

    PubMed

    Daouda, Tariq; Perreault, Claude; Lemieux, Sébastien

    2016-01-01

    pyGeno is a Python package mainly intended for precision medicine applications that revolve around genomics and proteomics. It integrates reference sequences and annotations from Ensembl, genomic polymorphisms from the dbSNP database and data from next-gen sequencing into an easy to use, memory-efficient and fast framework, therefore allowing the user to easily explore subject-specific genomes and proteomes. Compared to a standalone program, pyGeno gives the user access to the complete expressivity of Python, a general programming language. Its range of application therefore encompasses both short scripts and large scale genome-wide studies.

  14. pyGeno: A Python package for precision medicine and proteogenomics

    PubMed Central

    Daouda, Tariq; Perreault, Claude; Lemieux, Sébastien

    2016-01-01

    pyGeno is a Python package mainly intended for precision medicine applications that revolve around genomics and proteomics. It integrates reference sequences and annotations from Ensembl, genomic polymorphisms from the dbSNP database and data from next-gen sequencing into an easy to use, memory-efficient and fast framework, therefore allowing the user to easily explore subject-specific genomes and proteomes. Compared to a standalone program, pyGeno gives the user access to the complete expressivity of Python, a general programming language. Its range of application therefore encompasses both short scripts and large scale genome-wide studies. PMID:27785359

  15. A cloud-based semantic wiki for user training in healthcare process management.

    PubMed

    Papakonstantinou, D; Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G

    2011-01-01

    Successful healthcare process design requires active participation of users who are familiar with the cooperative and collaborative nature of healthcare delivery, expressed in terms of healthcare processes. Hence, reusable, flexible, agile and adaptable training material is needed, with the objective of enabling users to instill their knowledge and expertise in healthcare process management and (re)configuration activities. To this end, social software, such as a wiki, could be used, as it supports cooperation and collaboration anytime, anywhere, and can be combined with semantic web technology that enables structuring pieces of information for easy retrieval, reuse and exchange between different systems and tools. In this paper a semantic wiki is presented as a means of developing training material for healthcare providers regarding healthcare process management. The semantic wiki should act as a collective online memory containing training material that is accessible to authorized users, thus enhancing the training process with collaboration and cooperation capabilities. It is proposed that the wiki be stored in a secure virtual private cloud that is accessible from anywhere, albeit an excessively open environment, while meeting the requirements of redundancy, high performance and autoscaling.

  16. VIEWCACHE: An incremental pointer-base access method for distributed databases. Part 1: The universal index system design document. Part 2: The universal index system low-level design document. Part 3: User's guide. Part 4: Reference manual. Part 5: UIMS test suite

    NASA Technical Reports Server (NTRS)

    Kelley, Steve; Roussopoulos, Nick; Sellis, Timos

    1992-01-01

    The goal of the Universal Index System (UIS) is to provide an easy-to-use and reliable interface to many different kinds of database systems. The impetus for this system was to simplify database index management for users, thus encouraging the use of indexes. As the idea grew into an actual system design, the concept of increasing database performance by facilitating the use of time-saving techniques at the user level became a theme for the project. This final report describes the design and implementation of UIS and its language interfaces. It also includes the User's Guide and the Reference Manual.

  17. Sustains--direct access for the patient to the medical record over the Internet.

    PubMed

    Eklund, Benny; Joustra-Enquist, Ingrid

    2004-01-01

    The basic idea of Sustains III is to emulate Internet banking for health care: instead of an "Internet bank account", the user has a "health care account". The user logs in using a one-time password, which is sent to the user's mobile phone as an SMS three seconds after the PIN code is entered. Thus personal information can be transferred both ways in a secure way, with acceptable privacy. The user can then explore the medical record in detail and obtain a full and complete list of prescriptions, lab results, etc. It is also an easy way of exchanging written information between the doctor and the patient. So far Sustains has shown that patients are very satisfied and that the system is also beneficial for the physicians.
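
    The login scheme described above (a PIN followed by a one-time password delivered by SMS) is easy to sketch in a generic way. The snippet below is not the Sustains implementation; it only illustrates issuing and verifying a short-lived OTP, with the SMS gateway left as a stub and the validity window chosen arbitrarily.

        # Generic sketch of a short-lived SMS one-time password flow (not Sustains code).
        import hashlib, hmac, secrets, time

        OTP_TTL_SECONDS = 300          # assumed validity window
        _pending = {}                  # user_id -> (otp_hash, issued_at)

        def send_sms(phone: str, text: str) -> None:
            # Stub: a real deployment would call an SMS gateway here.
            print(f"SMS to {phone}: {text}")

        def issue_otp(user_id: str, phone: str) -> None:
            otp = f"{secrets.randbelow(10**6):06d}"   # 6-digit code
            _pending[user_id] = (hashlib.sha256(otp.encode()).hexdigest(), time.time())
            send_sms(phone, f"Your login code is {otp}")

        def verify_otp(user_id: str, otp: str) -> bool:
            record = _pending.pop(user_id, None)      # each code is single-use
            if record is None:
                return False
            otp_hash, issued_at = record
            fresh = (time.time() - issued_at) <= OTP_TTL_SECONDS
            match = hmac.compare_digest(otp_hash, hashlib.sha256(otp.encode()).hexdigest())
            return fresh and match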

  18. File Management In Space

    NASA Technical Reports Server (NTRS)

    Critchfield, Anna R.; Zepp, Robert H.

    2000-01-01

    We propose that the user interact with the spacecraft as if the spacecraft were a file server, so that the user can select and receive data as files in standard formats (e.g., tables or images, such as jpeg) via the Internet. Internet technology will be used end-to-end from the spacecraft to authorized users, such as the flight operation team, and project scientists. The proposed solution includes a ground system and spacecraft architecture, mission operations scenarios, and an implementation roadmap showing migration from current practice to the future, where distributed users request and receive files of spacecraft data from archives or spacecraft with equal ease. This solution will provide ground support personnel and scientists easy, direct, secure access to their authorized data without cumbersome processing, and can be extended to support autonomous communications with the spacecraft.

  19. Increasing access to emergency contraception through online prescription requests.

    PubMed

    Averbach, Sarah; Wendt, Jacqueline Moro; Levine, Deborah K; Philip, Susan S; Klausner, Jeffrey D

    2010-01-01

    Our objective is to describe a pilot program, Plan B Online Prescription Access, that provides easy access to prescriptions for emergency contraception via the Internet. We measured electronic prescriptions for Plan B (Duramed Pharmaceuticals, Cincinnati, Ohio) by month over time. Pharmacists faxed patient-generated prescriptions back to the Department of Public Health for confirmation. Despite no marketing, within the first 18 months of the program, 152 electronic prescriptions for Plan B were requested by 128 female San Francisco residents. Seventy-eight prescriptions (51%) were filled by pharmacists. If correctly marketed, online prescriptions for Plan B have the potential to be an effective means of increasing emergency contraception access in both urban and rural settings across the United States. Further user-acceptability studies are warranted.

  20. CIP Training Manual: Collaborative Information Portal Advance Training Information for Field Test Participants

    NASA Technical Reports Server (NTRS)

    Schreiner, John; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The Collaborative Information Portal (CIP) is a web-based information management and retrieval system. Its purpose is to provide users at MER (Mars Exploration Rover) mission operations with easy access to a broad range of mission data and products, as well as contextual information such as the current operations schedule. The CIP web server provides this content in a user-customizable web-portal environment. Since CIP is still under development, only a subset of the full feature set will be available for the EDO field test. The CIP web-portal will be accessed through a standard web browser. CIP is intended to be intuitive and simple to use; however, at the training session users will receive a one- to two-page reference guide, which should aid them in using CIP. Users must provide their own computers for accessing CIP during the field test. These computers should be configured with Java 1.3 and a Java 2 enabled browser. Macintosh computers should be running OS 10.1.3 or later. Classic Mac OS (OS 9) is not supported. For more information, please read section 7.3 in the FIASCO Rover Science Operations Test Mission Plan. Several screen shots of the Beta Release of CIP are shown on the following pages.

  1. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

    Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows it to be easily extended over the Internet. PMID:16539707
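
    The central idea, packing user-defined code together with its run-time variables and shipping both to a remote worker, can be illustrated independently of Matlab. The Python/HTTP sketch below is only a conceptual analogue of that idea; mGrid itself uses Matlab toolboxes, light-weight PHP scripts and the Apache web server, and the worker URL here is hypothetical.

        # Conceptual analogue of "ship user code plus run-time variables to a remote
        # worker"; this is not mGrid's Matlab/PHP implementation, and the URL is hypothetical.
        import base64, json, pickle
        import urllib.request

        WORKER_URL = "http://worker.example.org/run"   # hypothetical worker endpoint

        def run_remotely(source_code: str, variables: dict):
            """Pack user code and its inputs, post them to a worker, return the result."""
            payload = json.dumps({
                "code": source_code,
                "vars": base64.b64encode(pickle.dumps(variables)).decode(),
            }).encode()
            req = urllib.request.Request(WORKER_URL, data=payload,
                                         headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req) as resp:
                return json.loads(resp.read())

        # The caller never needs to know where the code actually executes, e.g.:
        # result = run_remotely("y = x ** 2", {"x": 7})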

  2. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows it to be easily extended over the Internet.

  3. Easy access to geophysical data sets at the IRIS Data Management Center

    NASA Astrophysics Data System (ADS)

    Trabant, C.; Ahern, T.; Suleiman, Y.; Karstens, R.; Weertman, B.

    2012-04-01

    At the IRIS Data Management Center (DMC) we primarily manage seismological data but also have other geophysical data sets for related fields including atmospheric pressure and gravity measurements and higher level data products derived from raw data. With a few exceptions all data managed by the IRIS DMC are openly available and we serve an international research audience. These data are available via a number of different mechanisms from batch requests submitted through email, web interfaces, near real time streams and more recently web services. Our initial suite of web services offer access to almost all of the raw data and associated metadata managed at the DMC. In addition, we offer services that apply processing to the data before it is sent to the user. Web service technologies are ubiquitous with support available in nearly every programming language and operating system. By their nature web services are programmatic interfaces, but by choosing a simple subset of web service methods we make our data available to a very broad user base. These interfaces will be usable by professional developers as well as non-programmers. Whenever possible we chose open and recognized standards. The data returned to the user is in a variety of formats depending on type, including FDSN SEED, QuakeML, StationXML, ASCII, PNG images and in some cases where no appropriate standard could be found a customized XML format. To promote easy access to seismological data for all researchers we are coordinating with international partners to define web service interfaces standards. Additionally we are working with key partners in Europe to complete the initial implementation of these services. Once a standard has been adopted and implemented at multiple data centers researchers will be able to use the same request tools to access data across multiple data centers. The web services that apply on-demand processing to requested data include the capability to apply instrument corrections and format translations which ultimately allows more researchers to use the data without knowledge of specific data and metadata formats. In addition to serving as a new platform on top of which research scientists will build advanced processing tools we anticipate that they will result in more data being accessible by more users.
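
    As a concrete illustration of the programmatic access described above, the request below queries the IRIS DMC station service. It uses the present-day FDSN-style endpoint and plain-text output; the exact service names and parameters of the 2012-era suite described in the abstract may have differed.

        # Sketch of a programmatic request to an IRIS DMC web service (FDSN station service).
        import urllib.parse
        import urllib.request

        params = urllib.parse.urlencode({
            "net": "IU", "sta": "ANMO",   # network and station codes
            "level": "station",
            "format": "text",             # plain-text output, one row per epoch
        })
        url = "https://service.iris.edu/fdsnws/station/1/query?" + params

        with urllib.request.urlopen(url) as resp:
            for line in resp.read().decode().splitlines():
                print(line)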

  4. Accessing suicide-related information on the internet: a retrospective observational study of search behavior.

    PubMed

    Wong, Paul Wai-Ching; Fu, King-Wa; Yau, Rickey Sai-Pong; Ma, Helen Hei-Man; Law, Yik-Wa; Chang, Shu-Sen; Yip, Paul Siu-Fai

    2013-01-11

    The Internet's potential impact on suicide is of major public health interest as easy online access to pro-suicide information or specific suicide methods may increase suicide risk among vulnerable Internet users. Little is known, however, about users' actual searching and browsing behaviors of online suicide-related information. To investigate what webpages people actually clicked on after searching with suicide-related queries on a search engine and to examine what queries people used to get access to pro-suicide websites. A retrospective observational study was done. We used a web search dataset released by America Online (AOL). The dataset was randomly sampled from all AOL subscribers' web queries between March and May 2006 and generated by 657,000 service subscribers. We found 5526 search queries (0.026%, 5526/21,000,000) that included the keyword "suicide". The 5526 search queries included 1586 different search terms and were generated by 1625 unique subscribers (0.25%, 1625/657,000). Of these queries, 61.38% (3392/5526) were followed by users clicking on a search result. Of these 3392 queries, 1344 (39.62%) webpages were clicked on by 930 unique users but only 1314 of those webpages were accessible during the study period. Each clicked-through webpage was classified into 11 categories. The categories of the most visited webpages were: entertainment (30.13%; 396/1314), scientific information (18.31%; 240/1314), and community resources (14.53%; 191/1314). Among the 1314 accessed webpages, we could identify only two pro-suicide websites. We found that the search terms used to access these sites included "commiting suicide with a gas oven", "hairless goat", "pictures of murder by strangulation", and "photo of a severe burn". A limitation of our study is that the database may be dated and confined to mainly English webpages. Searching or browsing suicide-related or pro-suicide webpages was uncommon, although a small group of users did access websites that contain detailed suicide method information.

  5. Accessing northern California earthquake data via Internet

    NASA Astrophysics Data System (ADS)

    Romanowicz, Barbara; Neuhauser, Douglas; Bogaert, Barbara; Oppenheimer, David

    The Northern California Earthquake Data Center (NCEDC) provides easy access to central and northern California digital earthquake data. It is located at the University of California, Berkeley, and is operated jointly with the U.S. Geological Survey (USGS) in Menlo Park, Calif., and funded by the University of California and the National Earthquake Hazard Reduction Program. It has been accessible to users in the scientific community through the Internet since mid-1992. The data center provides an on-line archive for parametric and waveform data from two regional networks: the Northern California Seismic Network (NCSN) operated by the USGS and the Berkeley Digital Seismic Network (BDSN) operated by the Seismographic Station at the University of California, Berkeley.

  6. Programmatic access to logical models in the Cell Collective modeling environment via a REST API.

    PubMed

    Kowal, Bryan M; Schreier, Travis R; Dauer, Joseph T; Helikar, Tomáš

    2016-01-01

    Cell Collective (www.cellcollective.org) is a web-based interactive environment for constructing, simulating and analyzing logical models of biological systems. Herein, we present a Web service to access models, annotations, and simulation data in the Cell Collective platform through the Representational State Transfer (REST) Application Programming Interface (API). The REST API provides a convenient method for obtaining Cell Collective data through almost any programming language. To ensure easy processing of the retrieved data, the request output from the API is available in a standard JSON format. The Cell Collective REST API is freely available at http://thecellcollective.org/tccapi. All public models in Cell Collective are available through the REST API. Users interested in creating and accessing their own models through the REST API first need to create an account in Cell Collective (http://thecellcollective.org). thelikar2@unl.edu. Technical user documentation: https://goo.gl/U52GWo. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
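
    Because the API returns standard JSON, retrieving a model reduces to an HTTP request and a JSON parse. In the sketch below the API root is taken from the abstract, but the "/model/<id>" resource path and the model identifier are hypothetical placeholders rather than documented endpoints.

        # Sketch of fetching JSON from the Cell Collective REST API; the resource path
        # and model id are hypothetical placeholders.
        import json
        import urllib.request

        API_ROOT = "http://thecellcollective.org/tccapi"
        MODEL_ID = "1557"                               # hypothetical model identifier

        with urllib.request.urlopen(f"{API_ROOT}/model/{MODEL_ID}") as resp:
            model = json.loads(resp.read())             # standard JSON output, per the abstract

        print(sorted(model.keys()))                     # inspect the returned fields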

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veseli, S.

    As the number of sites deploying and adopting EPICS Version 4 grows, so does the need to support PV Access from multiple languages. Especially important are the widely used scripting languages that tend to reduce both software development time and the learning curve for new users. In this paper we describe PvaPy, a Python API for the EPICS PV Access protocol and its accompanying structured data API. Rather than implementing the protocol itself in Python, PvaPy wraps the existing EPICS Version 4 C++ libraries using the Boost.Python framework. This approach allows us to benefit from the existing code base and functionality, and to significantly reduce the Python API development effort. PvaPy objects are based on Python dictionaries and provide users with the ability to access even the most complex of PV Data structures in a relatively straightforward way. Its interfaces are easy to use, and include support for advanced EPICS Version 4 features such as implementation of client and server Remote Procedure Calls (RPC).
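
    A minimal client-side sketch of the API described above is shown below. The pvaccess module name and the Channel get/put calls follow PvaPy's published documentation, but the channel name is a hypothetical example and the exact accessor methods may vary by release.

        # Minimal PvaPy client sketch; the channel name is hypothetical.
        from pvaccess import Channel

        ch = Channel("examplePv")   # connect to a PV served over PV Access
        pv = ch.get()               # returned object is dictionary-based, per the abstract
        print(pv)                   # show the structure delivered by the server
        ch.put(3.14)                # write a new value back to the channel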

  8. Designing and Implementing a Clinician Workstation

    PubMed Central

    Tape, Thomas G.; Campbell, James R.

    1993-01-01

    We describe a simple approach to designing and quickly implementing a clinician workstation that helps practitioners access a variety of information resources. Using easy-to-use graphical tools, we installed a pilot workstation in our clinics. We were able to accommodate most of our users' needs and have a functional workstation installed in two months. This is the first step of an evolutionary process moving from separate tasks to full application integration.

  9. Reprint Filing: A Profile-Based Solution

    PubMed Central

    Gass, David A.; Putnam, R. Wayne

    1983-01-01

    A reprint filing system based on practice profiles can give family physicians easy access to relevant medical information. The use of the ICHPPC classification and some supplemental categories provides a more practical coding mechanism than organ systems, textbook chapter titles or even Index Medicus subject headings. The system can be simply maintained, updated and improved, but users must regularly weed out unused information, and read widely to keep the reprints current. PMID:21283301

  10. Monitoring and controlling ATLAS data management: The Rucio web user interface

    NASA Astrophysics Data System (ADS)

    Lassnig, M.; Beermann, T.; Vigne, R.; Barisits, M.; Garonne, V.; Serfon, C.

    2015-12-01

    The monitoring and controlling interfaces of the previous data management system DQ2 followed the evolutionary requirements and needs of the ATLAS collaboration. The new data management system, Rucio, has put in place a redesigned web-based interface based upon the lessons learnt from DQ2, and the increased volume of managed information. This interface encompasses both a monitoring and controlling component, and allows easy integration of user-generated views. The interface follows three design principles. First, the collection and storage of data from internal and external systems is asynchronous to reduce latency. This includes the use of technologies like ActiveMQ or Nagios. Second, analysis of the data into information is done in a massively parallel fashion due to its volume, using a combined approach with an Oracle database and Hadoop MapReduce. Third, sharing of the information does not distinguish between human or programmatic access, making it easy to access selective parts of the information both in constrained frontends like web browsers and in remote services. This contribution will detail the reasons for these principles and the design choices taken. Additionally, the implementation, the interactions with external systems, and an evaluation of the system in production, both from a technological and user perspective, conclude this contribution.

  11. A Compositional Relevance Model for Adaptive Information Retrieval

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Chen, James; Lu, Henry, Jr. (Technical Monitor)

    1994-01-01

    There is a growing need for rapid and effective access to information in large electronic documentation systems. Access can be facilitated if information relevant in the current problem solving context can be automatically supplied to the user. This includes information relevant to particular user profiles, tasks being performed, and problems being solved. However most of this knowledge on contextual relevance is not found within the contents of documents, and current hypermedia tools do not provide any easy mechanism to let users add this knowledge to their documents. We propose a compositional relevance network to automatically acquire the context in which previous information was found relevant. The model records information on the relevance of references based on user feedback for specific queries and contexts. It also generalizes such information to derive relevant references for similar queries and contexts. This model lets users filter information by context of relevance, build personalized views of documents over time, and share their views with other users. It also applies to any type of multimedia information. Compared to other approaches, it is less costly and doesn't require any a priori statistical computation, nor an extended training period. It is currently being implemented into the Computer Integrated Documentation system which enables integration of various technical documents in a hypertext framework.
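
    The mechanism sketched in the abstract, recording user feedback about which references were relevant for a given query and context and then generalizing to similar situations, can be illustrated with a toy score table. The snippet below is a deliberate simplification, not the Computer Integrated Documentation model itself; the half-weight generalization factor is an arbitrary assumption.

        # Toy illustration of context-dependent relevance learning (not the CID model).
        from collections import defaultdict

        scores = defaultdict(float)            # (query, context, reference) -> relevance score

        def record_feedback(query, context, reference, useful):
            scores[(query, context, reference)] += 1.0 if useful else -1.0

        def rank(query, context, candidates):
            def score(ref):
                exact = scores[(query, context, ref)]
                # Generalization step: borrow, at half weight, evidence gathered for the
                # same query in other contexts.
                similar = sum(v for (q, c, r), v in scores.items()
                              if q == query and r == ref and c != context)
                return exact + 0.5 * similar
            return sorted(candidates, key=score, reverse=True)

        record_feedback("hydraulic pump", "maintenance", "doc-12", useful=True)
        record_feedback("hydraulic pump", "design", "doc-07", useful=True)
        print(rank("hydraulic pump", "maintenance", ["doc-07", "doc-12"]))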

  12. Cry-Bt identifier: a biological database for PCR detection of Cry genes present in transgenic plants.

    PubMed

    Singh, Vinay Kumar; Ambwani, Sonu; Marla, Soma; Kumar, Anil

    2009-10-23

    We describe the development of a user-friendly tool that would assist in the retrieval of information relating to Cry genes in transgenic crops. The tool also helps in detection of transformed Cry genes from Bacillus thuringiensis present in transgenic plants by providing suitably designed primers for PCR identification of these genes. The tool, designed on a relational database model, enables easy retrieval of information from the database with simple user queries. The tool also enables users to access related information about Cry genes present in various databases by interacting with different sources (nucleotide sequences, protein sequences, sequence comparison tools, published literature, conserved domains, evolutionary and structural data). http://insilicogenomics.in/Cry-btIdentifier/welcome.html.

  13. Was access to health care easy for immigrants in Spain? The perspectives of health personnel in Catalonia and Andalusia.

    PubMed

    Vázquez, María-Luisa; Vargas, Ingrid; Jaramillo, Daniel López; Porthé, Victoria; López-Fernández, Luis Andrés; Vargas, Hernán; Bosch, Lola; Hernández, Silvia S; Azarola, Ainhoa Ruiz

    2016-04-01

    Until April 2012, all Spanish citizens were entitled to health care, and policies had been developed at national and regional level to remove potential barriers of access; however, evidence suggested problems of access for immigrants. In order to identify factors affecting immigrants' access to health care, we conducted a qualitative study based on individual interviews with healthcare managers (n=27) and professionals (n=65) in Catalonia and Andalusia, before the policy change that restricted access for some groups. A thematic analysis was carried out. Health professionals considered access to health care "easy" for immigrants and similar to access for autochthons in both regions. Clear barriers were identified in entering the health system (obtaining the health card) and in using services, indicating a mismatch between the characteristics of services and those of immigrants. Results did not differ among regions, except in Catalonia, where access to care was considered harder for users without a health card, due to the fees charged, and in general, because of the distance to primary health care in rural areas. In conclusion, despite the universal coverage granted by the Spanish healthcare system and developed health policies, a number of barriers to access emerged that would require implementing the existing policies. However, the measures taken in the context of the economic crisis are pointing in the opposite direction, towards maintaining or increasing barriers. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. ERPLAB: an open-source toolbox for the analysis of event-related potentials.

    PubMed

    Lopez-Calderon, Javier; Luck, Steven J

    2014-01-01

    ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB's EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB's tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user's guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations.

  15. Development, Technical, and User Evaluation of a Web Mobile Application for Self-Control of Diabetes.

    PubMed

    Garcia-Zapirain, Begoña; de la Torre Díez, Isabel; Sainz de Abajo, Beatriz; López-Coronado, Miguel

    2016-09-01

    The main objective of this research was to develop and evaluate a Web-based mobile application (app) known as "Diario Diabetes" on both a technical and user level, by means of which individuals with diabetes may monitor their illness easily at any time and in any place using any device that has Internet access. The technologies used to develop the app were HTML, CSS, JavaScript, PHP, and MySQL, all of which are open source. Once the app was developed, it was evaluated on a technical level (by measuring loading times) and on a user level, through a survey. Different loading times for the application were measured, and in no case did loading exceed 2 s. Usability was evaluated by 150 users who initially used the application. A majority (71%) of users used a PC to access the app, 83% considered the app's design to be attractive, 67% considered the tasks to be very useful, and 67% found it very easy to use. Although applications for managing diabetes already exist, both in mobile app stores and at the research level, our app may help to improve the management of these patients, and it is the patients themselves who will ultimately opt for one app or another. According to the results obtained, we can state that all users would recommend the app's use to other users.

  16. Data security issues arising from integration of wireless access into healthcare networks.

    PubMed

    Frenzel, John C

    2003-04-01

    The versatility of having Ethernet speed connectivity without wires is rapidly driving adoption of wireless data networking by end users across all types of industry. Designed to be easy to configure and work among diverse platforms, wireless brings online data to mobile users. This functionality is particularly useful in modern clinical medicine. Wireless presents operators of networks containing or transmitting sensitive and confidential data with several new types of security vulnerabilities, and potentially opens previously protected core network resources to outside attack. Herein, we review the types of vulnerabilities, the tools necessary to exploit them, and strategies to thwart a successful attack.

  17. UkrVO astronomical WEB services

    NASA Astrophysics Data System (ADS)

    Mazhaev, A.

    2017-02-01

    The Ukraine Virtual Observatory (UkrVO) has been a member of the International Virtual Observatory Alliance (IVOA) since 2011. The virtual observatory (VO) is not a magic solution to all problems of data storage and processing, but it provides certain standards for building the infrastructure of an astronomical data center. The astronomical databases support data mining and offer users easy access to observation metadata, images within the celestial sphere and results of image processing. The astronomical web services (AWS) of UkrVO give users handy tools for selecting data from large astronomical catalogues for a relatively small region of interest in the sky. Examples of AWS usage are shown.
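
    Selecting data for a small region of sky, as the abstract describes, is typically done through an IVOA-style cone search, whose convention uses RA, DEC and SR parameters. The sketch below shows that request pattern only; the service URL is hypothetical, since the abstract does not list UkrVO endpoint addresses.

        # Sketch of an IVOA-style cone search request; the service URL is hypothetical.
        import urllib.parse
        import urllib.request

        SERVICE_URL = "http://ukrvo.example.org/scs"    # hypothetical endpoint

        params = urllib.parse.urlencode({
            "RA": 83.633,    # right ascension of the region of interest, degrees
            "DEC": 22.014,   # declination, degrees
            "SR": 0.1,       # search radius, degrees
        })

        with urllib.request.urlopen(f"{SERVICE_URL}?{params}") as resp:
            votable_xml = resp.read()                   # cone search services return a VOTable
        print(len(votable_xml), "bytes received")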

  18. Integrated web-based viewing and secure remote access to a clinical data repository and diverse clinical systems.

    PubMed

    Duncan, R G; Saperia, D; Dulbandzhyan, R; Shabot, M M; Polaschek, J X; Jones, D T

    2001-01-01

    The advent of the World-Wide-Web protocols and client-server technology has made it easy to build low-cost, user-friendly, platform-independent graphical user interfaces to health information systems and to integrate the presentation of data from multiple systems. The authors describe a Web interface for a clinical data repository (CDR) that was moved from concept to production status in less than six months using a rapid prototyping approach, multi-disciplinary development team, and off-the-shelf hardware and software. The system has since been expanded to provide an integrated display of clinical data from nearly 20 disparate information systems.

  19. LinkWinds: An Approach to Visual Data Analysis

    NASA Technical Reports Server (NTRS)

    Jacobson, Allan S.

    1992-01-01

    The Linked Windows Interactive Data System (LinkWinds) is a prototype visual data exploration and analysis system resulting from a NASA/JPL program of research into graphical methods for rapidly accessing, displaying and analyzing large multivariate multidisciplinary datasets. It is an integrated multi-application execution environment allowing the dynamic interconnection of multiple windows containing visual displays and/or controls through a data-linking paradigm. This paradigm, which results in a system much like a graphical spreadsheet, is not only a powerful method for organizing large amounts of data for analysis, but provides a highly intuitive, easy to learn user interface on top of the traditional graphical user interface.

  20. The Philosophy of User Interfaces in HELIO and the Importance of CASSIS

    NASA Astrophysics Data System (ADS)

    Bonnin, X.; Aboudarham, J.; Renié, C.; Csillaghy, A.; Messerotti, M.; Bentley, R. D.

    2012-09-01

    HELIO is a European project funded under FP7 (Project No. 238969). One of its goals as a Heliospheric Virtual Observatory is to provide easy access to many datasets scattered all over the world, in the fields of solar physics, heliophysics, and planetary magnetospheres. The efficiency of such a tool is very much related to the quality of the user interface. The HELIO infrastructure is based on a Service Oriented Architecture (SOA), grouping a network of standalone components, which allows four main types of interfaces: - The HELIO Front End (HFE) is a browser-based user interface, which offers centralized access to HELIO's main functionalities. In particular, it allows data to be reached directly, or the selection to be refined by determining observing characteristics, such as which instrument was observing at a given time or which instrument was at a given location. - Many services/components provide their own standalone graphical user interface. While each of these interfaces can be accessed individually, they can also be connected together. - Most services also provide direct access for any tool through a public interface. A small Java library, called the Java API, simplifies this access by providing client stubs for services and shields the user from security, discovery and failover issues. - Workflow capabilities are available in HELIO, allowing complex combinations of queries over several services. We want the user to be able to navigate easily through the various interfaces according to his or her needs, and possibly use a specific one to make more dedicated queries. We will also emphasize the importance of the CASSIS project (Coordination Action for the integration of Solar System Infrastructure and Science) in encouraging the interoperability necessary to undertake scientific studies that span disciplinary boundaries. If related projects follow the guidelines being developed by CASSIS, then using external resources with HELIO will be greatly simplified.

  1. Train users’ perceptions of walking distance to train station and attributes of paratransit service: understanding their association with decision using paratransit or not towards the train station

    NASA Astrophysics Data System (ADS)

    Syafriharti, R.; Kombaitan, B.; Kusumantoro, I. P.; Syabri, I.

    2018-05-01

    Access mode is an important factor in public transport systems. Most of the train users from Cicalengka to Padalarang via Bandung use paratransit as the access mode. The access modes considered in this study are only paratransit and walking. This study aims to explore the relationship between access mode choice to the station and perceptions of walking distance to the station and of the attributes of paratransit service quality, which consist of accessibility, cheapness, comfort, swiftness, safety, security and easiness. Of all the variables tested, walking distance to the station is the only variable related to access mode choice: a person will tend to use paratransit when his or her perceived walking distance to the station is relatively far, while perceptions about the quality of paratransit service do not determine whether a person will choose paratransit or not.

  2. expVIP: a Customizable RNA-seq Data Analysis and Visualization Platform

    PubMed Central

    2016-01-01

    The majority of transcriptome sequencing (RNA-seq) expression studies in plants remain underutilized and inaccessible due to the use of disparate transcriptome references and the lack of skills and resources to analyze and visualize these data. We have developed expVIP, an expression visualization and integration platform, which allows easy analysis of RNA-seq data combined with an intuitive and interactive interface. Users can analyze public and user-specified data sets with minimal bioinformatics knowledge using the expVIP virtual machine. This generates a custom Web browser to visualize, sort, and filter the RNA-seq data and provides outputs for differential gene expression analysis. We demonstrate expVIP’s suitability for polyploid crops and evaluate its performance across a range of biologically relevant scenarios. To exemplify its use in crop research, we developed a flexible wheat (Triticum aestivum) expression browser (www.wheat-expression.com) that can be expanded with user-generated data in a local virtual machine environment. The open-access expVIP platform will facilitate the analysis of gene expression data from a wide variety of species by enabling the easy integration, visualization, and comparison of RNA-seq data across experiments. PMID:26869702

  3. [Design and application of portable rescue vehicle].

    PubMed

    Guo, Ying; Qi, Huaying; Wang, Shen

    2017-12-01

    The condition of critically ill patients changes rapidly, and such patients face the risk of an emergency at any time. The rescue vehicles currently in common use are large and bulky, which is not conducive to operation; therefore, the design of a portable rescue vehicle was needed. The new rescue vehicle has a multi-layer folding structure with a small footprint and large storage space, so a variety of first-aid items can be classified and stored, and it is easy to clean and disinfect. During a rescue, the portable rescue vehicle can be placed in the required position; the various emergency items in its compartments can be found at a glance with easy access; the height of the infusion stand can be adjusted freely according to the user's height; and the handle of the rescue vehicle is easy to pull and adjust in accordance with ergonomic principles. The portable rescue vehicle facilitates the work of medical staff and is worthy of clinical application.

  4. Advanced Query and Data Mining Capabilities for MaROS

    NASA Technical Reports Server (NTRS)

    Wang, Paul; Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Hy, Franklin H.

    2013-01-01

    The Mars Relay Operational Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay network. These components include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. As part of MaROS, the innovators have developed and implemented a feature set that operates on several levels of the software architecture. This new feature is an advanced querying capability through either the Web-based user interface, or through a back-end REST interface to access all of the data gathered from the network. This software is not meant to replace the REST interface, but to augment and expand the range of available data. The current REST interface provides specific data that is used by the MaROS Web application to display and visualize the information; however, the returned information from the REST interface has typically been pre-processed to return only a subset of the entire information within the repository, particularly only the information that is of interest to the GUI (graphical user interface). The new, advanced query and data mining capabilities allow users to retrieve the raw data and/or to perform their own data processing. The query language used to access the repository is a restricted subset of the structured query language (SQL) that can be built safely from the Web user interface, or entered as freeform SQL by a user. The results are returned in a CSV (Comma Separated Values) format for easy exporting to third-party tools and applications that can be used for data mining or user-defined visualization and interpretation. This is the first time that a service is capable of providing access to all cross-project relay data from a single Web resource. Because MaROS contains the data for a variety of missions from the Mars network, which span both NASA and ESA, the software also establishes an access control list (ACL) on each data record in the database repository to enforce user access permissions through a multilayered approach.
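
    The "restricted SQL in, CSV out" pattern described above is straightforward on the client side: submit the query string, then parse the comma-separated response. In the sketch below the endpoint path, parameter name, table and column names are hypothetical placeholders; only the general request/parse flow is illustrated.

        # Client-side sketch of submitting a restricted SQL query and parsing the CSV reply;
        # the endpoint, parameter name and schema are hypothetical.
        import csv
        import io
        import urllib.parse
        import urllib.request

        ENDPOINT = "https://maros.example.nasa.gov/api/query"   # hypothetical URL
        sql = "SELECT pass_id, lander, orbiter FROM relay_passes WHERE sol = 1000"

        data = urllib.parse.urlencode({"q": sql}).encode()
        with urllib.request.urlopen(urllib.request.Request(ENDPOINT, data=data)) as resp:
            rows = list(csv.DictReader(io.StringIO(resp.read().decode())))

        for row in rows:
            print(row["pass_id"], row["lander"], "->", row["orbiter"])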

  5. An Approach to Dynamic Service Management in Pervasive Computing Systems

    DTIC Science & Technology

    2005-01-01

    standard interface to them that is easily accessible by any user. This paper outlines the design of Centaurus , an infrastructure for presenting...based on Extensi- ble Markup Language (XML) for communication, giving the system a uniform and easily adaptable interface. Centaurus defines a...easy and automatic usage. This is the vision that guides our re- search on the Centaurus system. We define a SmartSpace as a dynamic environment that

  6. The Social Sciences: A Cross-Disciplinary Guide to Selected Sources. Third Edition. Library and Information Science Text Series.

    ERIC Educational Resources Information Center

    Herron, Nancy L., Ed.

    Like the two editions that precede it, this volume is designed as a working text to serve two groups of users. It is both an up-to-date guide to the literature prepared by practicing librarians for fast and easy access to some of the best resources in the social science literature, and it is a teaching textbook for students wanting a clear,…

  7. A Visual Programming Methodology for Tactical Aircrew Scheduling and Other Applications

    DTIC Science & Technology

    1991-12-01

    programming methodology and environment of a user-specific application remains with and is delivered as part of the application, then there is another factor...animation is useful, not only for scheduling applications, but as a general programming methodology. Of course, there are a number of improvements...possible using Excel because there is nothing to prevent access to cells. However, it is easy to imagine a spreadsheet which can support the

  8. Enhanced Information Retrieval Using AJAX

    NASA Astrophysics Data System (ADS)

    Kachhwaha, Rajendra; Rajvanshi, Nitin

    2010-11-01

    Information retrieval deals with the representation, storage, organization of, and access to information items. The representation and organization of information items should provide the user with easy access to the information. With the rapid development of the Internet, large amounts of digitally stored information are readily available on the World Wide Web. This information is so vast that it becomes increasingly difficult and time consuming for users to find the information relevant to their needs. The explosive growth of information on the Internet has greatly increased the need for information retrieval systems. However, most search engines still use conventional information retrieval systems. An information system needs to implement sophisticated pattern matching tools to determine contents at a faster rate. AJAX has recently emerged as a new tool with which the information retrieval process can become fast, so that information reaches the user at a faster pace compared to conventional retrieval systems.

  9. Demagnetization Analysis in Excel (DAIE) - An open source workbook in Excel for viewing and analyzing demagnetization data from paleomagnetic discrete samples and u-channels

    NASA Astrophysics Data System (ADS)

    Sagnotti, Leonardo

    2013-04-01

    Modern rock magnetometers and stepwise demagnetization procedures result in the production of large datasets, which need versatile and fast software for their display and analysis. Various software packages for paleomagnetic analyses have been recently developed to overcome the problems linked to the limited capability and the loss of operability of early codes written in obsolete computer languages and/or platforms, not compatible with modern 64 bit processors. The Demagnetization Analysis in Excel (DAIE) workbook is new software designed to make the analysis of demagnetization data easy and accessible on an application (Microsoft Excel) widely diffused and available on both the Microsoft Windows and Mac OS X operating systems. The widespread diffusion of Excel should guarantee a long term working life, since compatibility and functionality of current Excel files should most likely be maintained during the development of new processors and operating systems. DAIE is designed for viewing and analyzing stepwise demagnetization data of both discrete and u-channel samples. DAIE consists of a single file and has an open modular structure organized in 10 distinct worksheets. The standard demagnetization diagrams and various parameters of common use are shown on the same worksheet, including selectable parameters and user choices. The characteristic remanence components may be computed by principal component analysis (PCA) on a selected interval of demagnetization steps. The PCA data can be saved either sample by sample or automatically, by applying the selected choices to all the samples included in the file. The DAIE open structure allows easy personalization, development and improvement. The workbook has the following features which may be valuable for various users: - Operability on nearly all computers and platforms; - Easy input of demagnetization data by "copy and paste" from ASCII files; - Easy export of computed parameters and demagnetization plots; - Complete control of the whole workflow and possibility of implementation of the workbook by any user; - Modular structure in distinct worksheets for each type of analysis and plot, in order to make implementation and personalization easier; - Opportunity to use the workbook for educational purposes, since all the computations and analyses are easily traceable and accessible; - Automatic and fast analysis of a large batch of demagnetization data, such as those measured on u-channel samples. The DAIE workbook and the "User manual" are available for download on a dedicated web site (http://roma2.rm.ingv.it/en/facilities/software/49/daie).
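
    The characteristic-component calculation mentioned above, principal component analysis over a selected interval of demagnetization steps, can be reproduced outside Excel in a few lines. The snippet below is a generic free-fit illustration (no anchoring to the origin) with made-up example vectors; it is not code taken from the DAIE workbook.

        # Generic PCA of a selected demagnetization interval (free fit), in the spirit of
        # what DAIE does inside Excel; the vectors below are made-up example data.
        import numpy as np

        # Cartesian remanence vectors (x, y, z) for the chosen demagnetization steps.
        steps = np.array([
            [12.1, 4.0, 9.5],
            [10.2, 3.3, 8.1],
            [ 8.0, 2.6, 6.3],
            [ 5.9, 1.9, 4.7],
            [ 3.8, 1.2, 3.0],
        ])

        centred = steps - steps.mean(axis=0)
        _, singular_values, vt = np.linalg.svd(centred, full_matrices=False)
        direction = vt[0]                  # best-fit direction (sign/polarity is arbitrary)

        declination = np.degrees(np.arctan2(direction[1], direction[0])) % 360.0
        inclination = np.degrees(np.arcsin(direction[2]))
        # Maximum angular deviation (Kirschvink-style) from the residual variance.
        mad = np.degrees(np.arctan2(np.hypot(singular_values[1], singular_values[2]),
                                    singular_values[0]))
        print(f"D = {declination:.1f} deg, I = {inclination:.1f} deg, MAD = {mad:.1f} deg")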

  10. Accessibility of Home Blood Pressure Monitors for Blind and Visually Impaired People

    PubMed Central

    Uslan, Mark M.; Burton, Darren M.; Wilson, Thomas E.; Taylor, Steven; Chertow, Bruce S.; Terry, Jack E.

    2007-01-01

    Background The prevalence of hypertension comorbid with diabetes is a significant health care issue. Use of the home blood pressure monitor (HBPM) for aiding in the control of hypertension is noteworthy because of benefits that accrue from following a home measurement regimen. To be usable by blind and visually impaired patients, HBPMs must have speech output to convey all screen information, an easily readable visual display, identifiable controls that are easy to use, and an accessible user manual. Methods Data on the physical aspects and the features and functions of nine Food and Drug Administration-approved HBPMs (eight of which were recommended by the British Hypertension Society) were tabulated and analyzed for usability by blind and visually impaired individuals. Video Electronics Standards Association standards were used to measure contrast modulation in the displays of the HBPMs. Ten persons who are blind or visually impaired and who have diabetes were surveyed to determine how they monitor their blood pressure and to learn their ideas for improvements in usability. Results Physical controls were found to be easy to identify, and operating procedures were found to be relatively simple on all of the HBPMs, but user manuals were either inaccessible or minimally accessible to blind persons. The two HBPMs that have speech output do not voice all of the information that is displayed on the screen. Some functions that are standard in the HBPMs without speech output, such as the feature for automatically setting cuff inflation volume and memory, were lacking in the HBPMs with speech output. These features were mentioned as desirable in interviews with legally blind persons who are diabetic and who monitor their blood pressure at home. Visual display output was large and adequate in all of the HBPMs. Michelson contrast for numeric digits in the HBPM displays was also measured, ranging from 55 to 75% for characters with dominant spatial frequency components lying in the range of 0.5–1.0 cycles/degree. Conclusions Home blood pressure monitors are easy-to-use devices that do not present accessibility barriers that are difficult to surmount, either technically or operationally. Two HBPMs with voice output were found to have a significant degree of accessibility, but they were not found to offer as many features as those HBPMs that were less accessible. Recommendations were made to improve accessibility, including the development of visual display standards that specify a minimally acceptable level of Michelson contrast. PMID:19888410

  11. A survey of Lab Tests Online-UK users: a key resource for patients to empower and help them understand their laboratory test results.

    PubMed

    Leyland, Rebecca; Freedman, Danielle B

    2016-11-01

    Background Lab Tests Online-UK celebrated its 10th anniversary in 2014 and to mark the occasion the first comprehensive survey of website users was undertaken. Methods A pop-up box with a link to Survey Monkey was used to offer website users the chance to participate in the survey, which was live from 4 March 2014 to 11 April 2014. Results Six hundred and sixty-one participants started the questionnaire and 338 completed all of the demographic questions. Although the website is designed and aimed at patients and the public, a significant number of respondents were health-care professionals (47%). The majority of survey participants found the Lab Tests Online-UK website via a search engine and were visiting the site for themselves. The majority of participants found what they were looking for on the website and found the information very easy or fairly easy to understand. The patient respondents were keen to see their laboratory test results (87%), but the majority did not have access (60%) at the time of the survey. Conclusions This survey provides good evidence that the Lab Tests Online-UK website is a useful resource for patients and health-care professionals alike. It comes at a poignant time as the release of results direct to patients starts with access to their medical records. The Lab Tests Online-UK website has a key role in enabling patients to understand their lab test results, and therefore empowering them to take an interest and engage in their own healthcare.

  12. Usability of a mobile electronic medical record prototype: a verbal protocol analysis.

    PubMed

    Wu, Robert C; Orr, M Scott; Chignell, Mark; Straus, Sharon E

    2008-06-01

    Point of care access to electronic medical records may provide clinicians with the information they want when they need it and may in turn improve patient safety. Yet providing an electronic medical record on handheld devices presents many usability challenges, and it is unclear whether clinicians will use them. An iterative design process for the development and evaluation of a prototype of a mobile electronic medical record was performed. Usability sessions were conducted in which physicians were asked to 'think aloud' while working through clinical scenarios using the prototype. Verbal protocol analysis, which consists of coding utterances, was conducted on the transcripts from the sessions and common themes were extracted. Usability sessions were held with five family physicians and four internists with varying levels of computer expertise. Physicians were able to use the device to complete 52 of 54 required tasks. Users commented that it was intuitive (9/9), would increase accessibility (5/9) but for them to use it, it would need the system to be fast and time-saving (5/9). Users had difficulty entering information (5/9) and reading the screen (4/9). In terms of functionality, users had concerns about completeness of information (6/9), details of ordering (5/9) and desired billing functionality (5/9) and integration with other systems (4/9). While physicians can use mobile electronic medical records in realistic scenarios, certain requirements likely need to be met including a fast system with easy data selection, easy data entry and improved display before widespread adoption occurs.

  13. cisPath: an R/Bioconductor package for cloud users for visualization and management of functional protein interaction networks.

    PubMed

    Wang, Likun; Yang, Luhe; Peng, Zuohan; Lu, Dan; Jin, Yan; McNutt, Michael; Yin, Yuxin

    2015-01-01

    With the burgeoning development of cloud technology and services, there are an increasing number of users who prefer the cloud to run their applications. All software and associated data are hosted on the cloud, allowing users to access them via a web browser from any computer, anywhere. This paper presents cisPath, an R/Bioconductor package deployed on cloud servers for client users to visualize, manage, and share functional protein interaction networks. With this R package, users can easily integrate downloaded protein-protein interaction information from different online databases with private data to construct new and personalized interaction networks. Additional functions allow users to generate specific networks based on private databases. Since the results produced with the use of this package are in the form of web pages, cloud users can easily view and edit the network graphs via the browser, using a mouse or touch screen, without the need to download them to a local computer. This package can also be installed and run on a local desktop computer. Depending on user preference, results can be publicized or shared by uploading to a web server or cloud drive, allowing other users to directly access results via a web browser. This package can be installed and run on a variety of platforms. Since all network views are shown in web pages, this package is particularly useful for cloud users. The easy installation and operation is an attractive quality for R beginners and users with no previous experience with cloud services.

  14. cisPath: an R/Bioconductor package for cloud users for visualization and management of functional protein interaction networks

    PubMed Central

    2015-01-01

    Background With the burgeoning development of cloud technology and services, an increasing number of users prefer to run their applications in the cloud. All software and associated data are hosted on the cloud, allowing users to access them via a web browser from any computer, anywhere. This paper presents cisPath, an R/Bioconductor package deployed on cloud servers for client users to visualize, manage, and share functional protein interaction networks. Results With this R package, users can easily integrate downloaded protein-protein interaction information from different online databases with private data to construct new and personalized interaction networks. Additional functions allow users to generate specific networks based on private databases. Since the results produced with this package are in the form of web pages, cloud users can easily view and edit the network graphs via the browser, using a mouse or touch screen, without the need to download them to a local computer. The package can also be installed and run on a local desktop computer. Depending on user preference, results can be publicized or shared by uploading them to a web server or cloud drive, allowing other users to access them directly via a web browser. Conclusions The package can be installed and run on a variety of platforms. Since all network views are shown in web pages, the package is particularly useful for cloud users. Its easy installation and operation make it attractive for R beginners and for users with no previous experience with cloud services. PMID:25708840

  15. Psychological risk factors of addiction to social networking sites among Chinese smartphone users.

    PubMed

    Wu, Anise M S; Cheung, Vivi I; Ku, Lisbeth; Hung, Eva P W

    2013-09-01

    Smartphones allow users to access social networking sites (SNSs) whenever and wherever they want. Such easy availability and accessibility may increase their vulnerability to addiction. Based on social cognitive theory (SCT), we examined the impacts of outcome expectancies, self-efficacy, and impulsivity on young Chinese smartphone users' addictive tendencies toward SNSs. Two hundred seventy-seven young smartphone users in Macau (116 males and 161 females; mean age = 26.62) filled out an online Chinese questionnaire concerning their usage of social networking sites via smartphones, addictive tendencies toward SNSs, trait impulsivity, outcome expectancies toward SNS use, and Internet self-efficacy. The findings revealed that those who spent more time on SNSs also reported higher addictive tendencies. Addictive tendencies were positively correlated with both outcome expectancies and impulsivity, but negatively associated with Internet self-efficacy. These three psychological variables explained 23% of the variance in addictive tendencies. The findings of this study suggest that, compared to demographics, psychological factors provide a better account of addictive tendencies toward SNSs among Chinese smartphone users in Macau. The three psychological risk factors were low Internet self-efficacy, favorable outcome expectancies, and high trait impulsivity. Educational campaigns with screening procedures for high-risk groups are recommended for effective prevention and treatment.

  16. Psychological risk factors of addiction to social networking sites among Chinese smartphone users

    PubMed Central

    Wu, Anise M. S.; Cheung, Vivi I.; Ku, Lisbeth; Hung, Eva P. W.

    2013-01-01

    Background and aims: Smartphones allow users to access social networking sites (SNSs) whenever and wherever they want. Such easy availability and accessibility may increase their vulnerability to addiction. Based on social cognitive theory (SCT), we examined the impacts of outcome expectancies, self-efficacy, and impulsivity on young Chinese smartphone users' addictive tendencies toward SNSs. Methods: Two hundred seventy-seven young smartphone users in Macau (116 males and 161 females; mean age = 26.62) filled out an online Chinese questionnaire concerning their usage of social networking sites via smartphones, addictive tendencies toward SNSs, trait impulsivity, outcome expectancies toward SNS use, and Internet self-efficacy. Results: The findings revealed that those who spent more time on SNSs also reported higher addictive tendencies. Addictive tendencies were positively correlated with both outcome expectancies and impulsivity, but negatively associated with Internet self-efficacy. These three psychological variables explained 23% of the variance in addictive tendencies. Conclusions: The findings of this study suggest that, compared to demographics, psychological factors provide a better account of addictive tendencies toward SNSs among Chinese smartphone users in Macau. The three psychological risk factors were low Internet self-efficacy, favorable outcome expectancies, and high trait impulsivity. Educational campaigns with screening procedures for high-risk groups are recommended for effective prevention and treatment. PMID:25215198

  17. User’s guide for MapMark4GUI—A graphical user interface for the MapMark4 R package

    USGS Publications Warehouse

    Shapiro, Jason

    2018-05-29

    MapMark4GUI is an R graphical user interface (GUI) developed by the U.S. Geological Survey to support user implementation of the MapMark4 R statistical software package. MapMark4 was developed by the U.S. Geological Survey to implement probability calculations for simulating undiscovered mineral resources in quantitative mineral resource assessments. The GUI provides an easy-to-use tool to input data, run simulations, and format output results for the MapMark4 package. The GUI is written and accessed in the R statistical programming language. This user’s guide includes instructions on installing and running MapMark4GUI and descriptions of the statistical output processes, output files, and test data files.

  18. ZeBase: an open-source relational database for zebrafish laboratories.

    PubMed

    Hensley, Monica R; Hassenplug, Eric; McPhail, Rodney; Leung, Yuk Fai

    2012-03-01

    ZeBase is an open-source relational database for zebrafish inventory. It is designed for the recording of genetic, breeding, and survival information of fish lines maintained in a single- or multi-laboratory environment. Users can easily access ZeBase through standard web browsers anywhere on a network. Convenient search and reporting functions are available to facilitate routine inventory work; such functions can also be automated by simple scripting. Optional barcode generation and scanning are also built in for easy access to the information related to any fish. Further information on the database and an example implementation can be found at http://zebase.bio.purdue.edu.

  19. Chemozart: a web-based 3D molecular structure editor and visualizer platform.

    PubMed

    Mohebifar, Mohamad; Sajadi, Fatemehsadat

    2015-01-01

    Chemozart is a 3D molecule editor and visualizer built on top of native web components. It offers an easy-to-access service, a user-friendly graphical interface and a modular design. It is a client-centric web application which communicates with the server via a representational state transfer (REST) style web service. Both the client-side and server-side applications are written in JavaScript. A combination of JavaScript and HTML is used to draw three-dimensional structures of molecules. With the help of WebGL, a three-dimensional visualization tool is provided, and a user-friendly interface is composed using CSS3 and HTML5. More than 30 packages are used to compose this application, which gives it enough flexibility to be extended. Molecular structures can be drawn on all types of platforms, and the application is compatible with mobile devices. No installation is required in order to use the application, and it can be accessed through the internet. The application can be extended on both the server side and the client side by implementing modules in JavaScript. Molecular compounds are drawn on the HTML5 Canvas element using a WebGL context. Chemozart is a chemical platform which is powerful, flexible, and easy to access. It provides an online web-based tool for chemical visualization along with result-oriented optimization for a cloud-based API (application programming interface). JavaScript libraries which allow the creation of web pages containing interactive three-dimensional molecular structures have also been made available. The application has been released under the Apache 2 License and is available from the project website https://chemozart.com.

  20. Building a Smart Portal for Astronomy

    NASA Astrophysics Data System (ADS)

    Derriere, S.; Boch, T.

    2011-07-01

    The development of a portal for accessing astronomical resources is not an easy task. The ever-increasing complexity of the data products can result in very complex user interfaces, requiring a lot of effort and learning from the user in order to perform searches. This is often a design choice, where the user must explicitly set many constraints, while the portal search logic remains simple. We investigated a different approach, where the query interface is kept as simple as possible (ideally, a simple text field, like for Google search), and the search logic is made much more complex to interpret the query in a relevant manner. We will present the implications of this approach in terms of interpretation and categorization of the query parameters (related to astronomical vocabularies), translation (mapping) of these concepts into the portal components metadata, identification of query schemes and use cases matching the input parameters, and delivery of query results to the user.

  1. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Heather M; Graham, Paul S; Morgan, Keith S

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-tolerant memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper describes a three-tiered methodology for testing FPGA user designs for space-readiness. We describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.

  2. A user-friendly mathematical modelling web interface to assist local decision making in the fight against drug-resistant tuberculosis.

    PubMed

    Ragonnet, Romain; Trauer, James M; Denholm, Justin T; Marais, Ben J; McBryde, Emma S

    2017-05-30

    Multidrug-resistant and rifampicin-resistant tuberculosis (MDR/RR-TB) represent an important challenge for global tuberculosis (TB) control. The high rates of MDR/RR-TB observed among re-treatment cases can arise from diverse pathways: de novo amplification during initial treatment, inappropriate treatment of undiagnosed MDR/RR-TB, relapse despite appropriate treatment, or reinfection with MDR/RR-TB. Mathematical modelling allows quantification of the contribution made by these pathways in different settings. This information provides valuable insights for TB policy-makers, allowing better contextualised solutions. However, mathematical modelling outputs need to consider local data and be easily accessible to decision makers in order to improve their usefulness. We present a user-friendly web-based modelling interface, which can be used by people without technical knowledge. Users can input their own parameter values and produce estimates for their specific setting. This innovative tool provides easy access to mathematical modelling outputs that are highly relevant to national TB control programs. In future, the same approach could be applied to a variety of modelling applications, enhancing local decision making.

  3. Boosting a Low-Cost Smart Home Environment with Usage and Access Control Rules.

    PubMed

    Barsocchi, Paolo; Calabrò, Antonello; Ferro, Erina; Gennaro, Claudio; Marchetti, Eda; Vairo, Claudio

    2018-06-08

    Smart Home has gained widespread attention due to its flexible integration into everyday life. Pervasive sensing technologies are used to recognize and track the activities that people perform during the day, and to allow communication and cooperation of physical objects. Usually, the available infrastructures and applications leveraging these smart environments have a critical impact on the overall cost of Smart Home construction, preferably need to be installed while the home is being built, and are still not user-centric. In this paper, we propose a low-cost, easy-to-install, user-friendly, dynamic and flexible infrastructure able to perform runtime resource management by decoupling the different levels of control rules. The basic idea relies on the usage of off-the-shelf sensors and technologies to guarantee the regular exchange of critical information, without requiring the user to develop accurate models for managing resources or regulating their access and usage. This simplifies continuous updating and improvement, reduces the maintenance effort, and improves residents' daily living and security. A first validation of the proposed infrastructure on a case study is also presented.
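
    The decoupling of access and usage rules described above can be pictured as a small rule engine that evaluates each level independently of the sensor layer. The sketch below is illustrative only: the rule contents, user names, and device identifiers are hypothetical and not taken from the paper.

    ```python
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Request:
        user: str        # who is asking
        device: str      # e.g. "heater", "front_door"
        action: str      # e.g. "turn_on", "unlock"
        hour: int        # hour of day, 0-23

    # A rule is a predicate over a request; access rules and usage rules live in
    # separate lists so each level can be updated independently, mirroring the
    # decoupling described in the abstract.
    Rule = Callable[[Request], bool]

    access_rules: List[Rule] = [
        lambda r: not (r.device == "front_door" and r.user == "guest"),  # guests cannot unlock the door
    ]
    usage_rules: List[Rule] = [
        lambda r: not (r.device == "heater" and 1 <= r.hour <= 5),       # no heating in the small hours
    ]

    def allowed(request: Request) -> bool:
        """A request is granted only if every access rule and every usage rule passes."""
        return all(rule(request) for rule in access_rules + usage_rules)

    if __name__ == "__main__":
        print(allowed(Request("guest", "front_door", "unlock", hour=14)))  # False: blocked by access rule
        print(allowed(Request("alice", "heater", "turn_on", hour=3)))      # False: blocked by usage rule
        print(allowed(Request("alice", "heater", "turn_on", hour=19)))     # True
    ```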

  4. Human interface to large multimedia databases

    NASA Astrophysics Data System (ADS)

    Davis, Ben; Marks, Linn; Collins, Dave; Mack, Robert; Malkin, Peter; Nguyen, Tam

    1994-04-01

    The emergence of high-speed networking for multimedia will have the effect of turning the computer screen into a window on a very large information space. As this information space increases in size and complexity, providing users with easy and intuitive means of accessing information will become increasingly important. Providing access to large amounts of text has been the focus of work for hundreds of years and has resulted in the evolution of a set of standards, from the Dewey Decimal System for libraries to the recently proposed ANSI standards for representing information on-line: KIF, the Knowledge Interchange Format, and CGs, Conceptual Graphs. Certain problems remain unsolved by these efforts, though: how to let users know the contents of the information space, so that they know whether or not they want to search it in the first place, how to facilitate browsing, and, more specifically, how to facilitate visual browsing. These issues are particularly important for users in educational contexts and have been the focus of much of our recent work. In this paper we discuss some of the solutions we have prototyped: specifically, visual means, visual browsers, and visual definitional sequences.

  5. ChRIS--A web-based neuroimaging and informatics system for collecting, organizing, processing, visualizing and sharing of medical data.

    PubMed

    Pienaar, Rudolph; Rannou, Nicolas; Bernal, Jorge; Hahn, Daniel; Grant, P Ellen

    2015-01-01

    The utility of web browsers for general purpose computing, long anticipated, is only now coming into fruition. In this paper we present a web-based medical image data and information management software platform called ChRIS ([Boston] Children's Research Integration System). ChRIS' deep functionality allows for easy retrieval of medical image data from resources typically found in hospitals, organizes and presents information in a modern feed-like interface, provides access to a growing library of plugins that process these data (typically on a connected High Performance Compute Cluster), allows for easy data sharing between users and instances of ChRIS, and provides powerful 3D visualization and real-time collaboration.

  6. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    NASA Astrophysics Data System (ADS)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web-application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back- end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine grained" data access is performed by back-end services that may utilize JDBC for data base access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google Earth layers using KML; generation of maps via WMS or ArcIMS protocols; and data manipulation with Unix utilities.
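
    The abstract describes a user interface that talks to the Product Server through an XML request embedded in an HTTP GET URL. The following client-side sketch illustrates that pattern only; the element names and server address are placeholders, not the actual LAS protocol, so treat it as an assumption-laden illustration rather than a working LAS client.

    ```python
    import urllib.parse
    import xml.etree.ElementTree as ET

    # Build a small XML request document (element and attribute names are hypothetical).
    req = ET.Element("lasRequest")
    ET.SubElement(req, "dataset").text = "sst_monthly"
    ET.SubElement(req, "operation").text = "plot"
    region = ET.SubElement(req, "region")
    region.set("lat", "-10:10")
    region.set("lon", "120:160")
    xml_payload = ET.tostring(req, encoding="unicode")

    # Embed the XML in a GET URL, as the UI-to-Product-Server protocol does.
    base_url = "https://example.org/las/ProductServer.do"   # placeholder host
    url = base_url + "?xml=" + urllib.parse.quote(xml_payload)
    print(url)
    # urllib.request.urlopen(url) would fetch the generated product;
    # omitted here because the host above is fictional.
    ```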

  7. Health 2.0 and Implications for Nursing Education

    PubMed Central

    Nelson, Ramona

    2012-01-01

    Over the last 20 years the evolution of web browsers providing easy access to the Internet has initiated a revolution in access to healthcare-related information for both healthcare providers and patients. This access has changed both the process used to deliver education and the content of the nursing education curriculum worldwide. Our ability to access information around the world is referred to as Web 1.0. Web 2.0 moves beyond access to a world where users are interactively creating information. With the advent of Health 2.0 we are confronting a second revolution that is challenging all aspects of healthcare, including all aspects of nursing. This paper explores the concept of Health 2.0, discusses a conceptual framework approach for integrating Health 2.0 content into the nursing curriculum, outlines examples of key concepts required in today's nursing curriculum and identifies selected issues arising from the impact of Health 2.0. PMID:24199108

  8. CRESST Human Performance Knowledge Mapping System

    DTIC Science & Technology

    2002-12-01

    [Fragments of a tool-comparison table: Semantica (evaluation copy unavailable), Visual Mind (cannot add relation labels), Smart Ideas (easy to use), LifeMap (PC/Mac, on the Web, http://www2.ucsc.edu/-mlrg/mlrgtools.html).] Users can access all top-level functions from the main screen shown in Figure 4. The design of the Web favored breadth over depth, which allows...based on whether their propositions match propositions in the expert map.

  9. Exploring a social network for sharing information about pain.

    PubMed

    Alvarez, Ana Graziela; Dal Sasso, Grace T Marcon

    2012-01-01

    The purpose of this study was to evaluate the opinion of users about the experience of sharing information about pain in a social network. An electronic survey study was conducted from September to November 2009. Nine participants assessed the social network through an electronic questionnaire. Participants reported positive aspects (easy access, organized information, interactivity, encouragement of information sharing, learning opportunity). The sharing of information contributes to the development of a collective intelligence based on the exchange of experiences and knowledge sharing.

  10. The Protein Disease Database of human body fluids: II. Computer methods and data issues.

    PubMed

    Lemkin, P F; Orr, G A; Goldstein, M P; Creed, G J; Myrick, J E; Merril, C R

    1995-01-01

    The Protein Disease Database (PDD) is a relational database of proteins and diseases. With this database it is possible to screen for quantitative protein abnormalities associated with disease states. These quantitative relationships use data drawn from the peer-reviewed biomedical literature. Assays may also include those observed in high-resolution electrophoretic gels that offer the potential to quantitate many proteins in a single test as well as data gathered by enzymatic or immunologic assays. We are using the Internet World Wide Web (WWW) and the Web browser paradigm as an access method for wide distribution and querying of the Protein Disease Database. The WWW hypertext transfer protocol and its Common Gateway Interface make it possible to build powerful graphical user interfaces that can support easy-to-use data retrieval using query specification forms or images. The details of these interactions are totally transparent to the users of these forms. Using a client-server SQL relational database, user query access, initial data entry and database maintenance are all performed over the Internet with a Web browser. We discuss the underlying design issues, mapping mechanisms and assumptions that we used in constructing the system, data entry, access to the database server, security, and synthesis of derived two-dimensional gel image maps and hypertext documents resulting from SQL database searches.
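
    As a rough illustration of the kind of relational query that such Web forms translate into, the sketch below builds a toy protein/disease schema in SQLite and screens for proteins whose measured abundance deviates from a reference value. The table layout, column names, and numbers are invented for the example and are not the PDD's actual schema or data.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    cur = con.cursor()

    # Toy schema: reference protein levels plus disease-associated assay results.
    cur.executescript("""
    CREATE TABLE protein (id INTEGER PRIMARY KEY, name TEXT, normal_level REAL);
    CREATE TABLE assay   (protein_id INTEGER, disease TEXT, observed_level REAL,
                          FOREIGN KEY (protein_id) REFERENCES protein(id));
    INSERT INTO protein VALUES (1, 'albumin', 45.0), (2, 'transferrin', 2.5);
    INSERT INTO assay   VALUES (1, 'nephrotic syndrome', 20.0), (2, 'anemia', 3.4);
    """)

    # Screen for proteins whose observed level differs from normal by more than 30%.
    cur.execute("""
    SELECT p.name, a.disease, a.observed_level, p.normal_level
    FROM assay a JOIN protein p ON p.id = a.protein_id
    WHERE ABS(a.observed_level - p.normal_level) / p.normal_level > 0.30
    """)
    for row in cur.fetchall():
        print(row)
    ```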

  11. TopoCad - A unified system for geospatial data and services

    NASA Astrophysics Data System (ADS)

    Felus, Y. A.; Sagi, Y.; Regev, R.; Keinan, E.

    2013-10-01

    "E-government" is a leading trend in public sector activities in recent years. The Survey of Israel set as a vision to provide all of its services and datasets online. The TopoCad system is the latest software tool developed in order to unify a number of services and databases into one on-line and user friendly system. The TopoCad system is based on Web 1.0 technology; hence the customer is only a consumer of data. All data and services are accessible for the surveyors and geo-information professional in an easy and comfortable way. The future lies in Web 2.0 and Web 3.0 technologies through which professionals can upload their own data for quality control and future assimilation with the national database. A key issue in the development of this complex system was to implement a simple and easy (comfortable) user experience (UX). The user interface employs natural language dialog box in order to understand the user requirements. The system then links spatial data with alpha-numeric data in a flawless manner. The operation of the TopoCad requires no user guide or training. It is intuitive and self-taught. The system utilizes semantic engines and machine understanding technologies to link records from diverse databases in a meaningful way. Thus, the next generation of TopoCad will include five main modules: users and projects information, coordinates transformations and calculations services, geospatial data quality control, linking governmental systems and databases, smart forms and applications. The article describes the first stage of the TopoCad system and gives an overview of its future development.

  12. Doing Your Science While You're in Orbit

    NASA Astrophysics Data System (ADS)

    Green, Mark L.; Miller, Stephen D.; Vazhkudai, Sudharshan S.; Trater, James R.

    2010-11-01

    Large-scale neutron facilities such as the Spallation Neutron Source (SNS) located at Oak Ridge National Laboratory need easy-to-use access to Department of Energy Leadership Computing Facilities and experiment repository data. The Orbiter thick- and thin-clients and their supporting Service Oriented Architecture (SOA) based services (available at https://orbiter.sns.gov) consist of standards-based components that are reusable and extensible for accessing high performance computing, data and computational grid infrastructure, and cluster-based resources easily from a user-configurable interface. The primary Orbiter system goals consist of (1) developing infrastructure for the creation and automation of virtual instrumentation experiment optimization, (2) developing user interfaces for thin- and thick-client access, (3) providing a prototype incorporating major instrument simulation packages, and (4) facilitating neutron science community access and collaboration. Secure Orbiter SOA authentication and authorization is achieved through the developed Virtual File System (VFS) services, which use Role-Based Access Control (RBAC) for data repository file access, thin- and thick-client functionality and application access, and computational job workflow management. The VFS relational database management system (RDBMS) consists of approximately 45 database tables describing 498 user accounts and 495 groups over 432,000 directories with 904,077 repository files. Over 59 million NeXus file metadata records are associated with the 12,800 unique NeXus file field/class names generated from the 52,824 repository NeXus files. Services are currently available that enable (a) summary dashboards of data repository status with Quality of Service (QoS) metrics, (b) full-text search of data repository NeXus file field/class names within a Google-like interface, (c) a fully functional RBAC browser for the read-only data repository and shared areas, (d) user/group-defined and shared metadata for data repository files, and (e) user, group, repository, and web 2.0 based global positioning with additional service capabilities. The SNS-based Orbiter SOA integration progress with the Distributed Data Analysis for Neutron Scattering Experiments (DANSE) software development project is summarized, with an emphasis on DANSE Central Services and the Virtual Neutron Facility (VNF). Additionally, the DANSE utilization of the Orbiter SOA authentication, authorization, and data transfer services best-practice implementations is presented.
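
    Role-based access control of repository files, as used by the VFS services described above, reduces to checking a (path, action) permission against the roles assigned to a user. The users, roles, and paths below are made up for illustration; this is not the Orbiter implementation.

    ```python
    # Minimal RBAC sketch: users hold roles, roles carry permissions on repository paths.
    ROLE_PERMISSIONS = {
        "instrument_scientist": {("/repository/SNS/ARCS", "read"), ("/repository/SNS/ARCS", "write")},
        "external_user":        {("/repository/SNS/ARCS", "read")},
    }
    USER_ROLES = {
        "alice": {"instrument_scientist"},
        "bob":   {"external_user"},
    }

    def has_permission(user: str, path: str, action: str) -> bool:
        """Grant access if any of the user's roles carries the (path, action) permission."""
        return any((path, action) in ROLE_PERMISSIONS.get(role, set())
                   for role in USER_ROLES.get(user, set()))

    print(has_permission("alice", "/repository/SNS/ARCS", "write"))  # True
    print(has_permission("bob", "/repository/SNS/ARCS", "write"))    # False
    ```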

  13. Proactive Support of Internet Browsing when Searching for Relevant Health Information.

    PubMed

    Rurik, Clas; Zowalla, Richard; Wiesner, Martin; Pfeifer, Daniel

    2015-01-01

    Many people use the Internet as one of the primary sources of health information. This is due to the high volume and easy access of freely available information regarding diseases, diagnoses and treatments. However, users may find it difficult to retrieve information which is easily understandable and does not require a deep medical background. In this paper, we present a new kind of Web browser add-on, in order to proactively support users when searching for relevant health information. Our add-on not only visualizes the understandability of displayed medical text but also provides further recommendations of Web pages which hold similar content but are potentially easier to comprehend.
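
    Understandability scoring of medical text, which the add-on visualizes, is often approximated with a readability formula. The sketch below computes a crude Flesch Reading Ease score using a naive syllable heuristic; it is a generic stand-in for such scoring, not the authors' actual method.

    ```python
    import re

    def count_syllables(word: str) -> int:
        # Very rough heuristic: count groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text: str) -> float:
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z]+", text)
        n_words = max(1, len(words))
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

    sample = "Hypertension is high blood pressure. It can damage your heart over time."
    print(round(flesch_reading_ease(sample), 1))  # higher scores indicate easier text
    ```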

  14. Small UGV platforms for unattended sensors

    NASA Astrophysics Data System (ADS)

    Smuda, Bill; Gerhart, Grant

    2005-10-01

    The wars in Iraq and Afghanistan have shown the importance of sensor and robotic technology as a force multiplier and a tool for moving soldiers out of harm's way. Situations on the ground make soldiers easy targets for snipers and suicide bombers. Sensor and robotics technology reduces risk to soldiers and other personnel at checkpoints, in access areas and on convoy routes. Early user involvement in innovative and aggressive acquisition and development strategies is the key to moving sensor, robotic and associated technology into the hands of the user, the soldier on the ground. This paper discusses activities associated with the rapid development of robotics and sensors, and our field experience with robotics in Iraq and Afghanistan.

  15. Enabling Discoveries in Earth Sciences Through the Geosciences Network (GEON)

    NASA Astrophysics Data System (ADS)

    Seber, D.; Baru, C.; Memon, A.; Lin, K.; Youn, C.

    2005-12-01

    Taking advantage of state-of-the-art information technology resources, GEON researchers are building a cyberinfrastructure designed to enable data sharing, semantic data integration, high-end computations and 4D visualization in easy-to-use web-based environments. The GEON Network currently allows users to search and register Earth science resources such as data sets (GIS layers, GMT files, geoTIFF images, ASCII files, relational databases, etc.), software applications or ontologies. Portal-based access mechanisms enable developers to build dynamic user interfaces to conduct advanced processing and modeling efforts across distributed computers and supercomputers. Researchers and educators can access the networked resources through the GEON portal and its portlets, which were developed to support better and more comprehensive science and educational studies. For example, the SYNSEIS portlet in GEON enables users to access, in near-real time, seismic waveforms from the IRIS Data Management Center, easily build a 3D geologic model within the area of the seismic station(s) and the epicenter, and perform a 3D synthetic seismogram analysis to understand the lithospheric structure and earthquake source parameters for any given earthquake in the US. Similarly, GEON's workbench area enables users to create their own work environment and copy, visualize and analyze any data sets within the network, and create subsets of the data sets for their own purposes. Since all these resources are built as part of a Service-Oriented Architecture (SOA), they can also be used in other development platforms. One such platform is the Kepler workflow system, which can access web-service-based resources and provides users with graphical programming interfaces to build a model to conduct computations and/or visualization efforts using the networked resources. Developments in the area of semantic integration of the networked datasets continue to advance, and prototype studies can be accessed via the GEON portal at www.geongrid.org.

  16. A graphical user interface for RAId, a knowledge integrated proteomics analysis suite with accurate statistics.

    PubMed

    Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo

    2018-03-15

    RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible for the proteomics community by developing a graphical user interface (GUI) is our main goal here. We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes executions of RAId easy but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing the retrieval versus the proportion of false discoveries. The results viewer displays the analysis results and allows users to download them. Both the knowledge-integrated organismal databases and the code package (containing source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html.

  17. EasyFRAP-web: a web-based tool for the analysis of fluorescence recovery after photobleaching data.

    PubMed

    Koulouras, Grigorios; Panagopoulos, Andreas; Rapsomaniki, Maria A; Giakoumakis, Nickolaos N; Taraviras, Stavros; Lygerou, Zoi

    2018-06-13

    Understanding protein dynamics is crucial in order to elucidate protein function and interactions. Advances in modern microscopy facilitate the exploration of the mobility of fluorescently tagged proteins within living cells. Fluorescence recovery after photobleaching (FRAP) is an increasingly popular functional live-cell imaging technique which enables the study of the dynamic properties of proteins at a single-cell level. As an increasing number of labs generate FRAP datasets, there is a need for fast, interactive and user-friendly applications that analyze the resulting data. Here we present easyFRAP-web, a web application that simplifies the qualitative and quantitative analysis of FRAP datasets. EasyFRAP-web permits quick analysis of FRAP datasets through an intuitive web interface with interconnected analysis steps (experimental data assessment, different types of normalization and estimation of curve-derived quantitative parameters). In addition, easyFRAP-web provides dynamic and interactive data visualization and data and figure export for further analysis after every step. We test easyFRAP-web by analyzing FRAP datasets capturing the mobility of the cell cycle regulator Cdt2 in the presence and absence of DNA damage in cultured cells. We show that easyFRAP-web yields results consistent with previous studies and highlights cell-to-cell heterogeneity in the estimated kinetic parameters. EasyFRAP-web is platform-independent and is freely accessible at: https://easyfrap.vmnet.upatras.gr/.
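
    One normalization step such tools commonly offer is double normalization, which corrects the bleached-region curve for acquisition bleaching using a reference region and scales it to the pre-bleach ratio. The numpy sketch below is a generic illustration of that idea under simple assumptions (one bleached ROI, one whole-cell reference region); it is not easyFRAP-web's exact implementation.

    ```python
    import numpy as np

    def double_normalize(roi, ref, n_pre):
        """Double-normalize a FRAP recovery curve.

        roi   : intensities of the bleached region over time
        ref   : intensities of a reference (whole-cell) region, used to correct
                for acquisition bleaching
        n_pre : number of pre-bleach frames
        """
        roi, ref = np.asarray(roi, float), np.asarray(ref, float)
        pre_roi = roi[:n_pre].mean()
        pre_ref = ref[:n_pre].mean()
        # Divide out acquisition bleaching, then scale so the pre-bleach level is ~1.
        return (roi / ref) * (pre_ref / pre_roi)

    # Tiny synthetic example: 3 pre-bleach frames, then bleach and recovery.
    roi = [100, 101, 99, 30, 45, 60, 70, 78]
    ref = [500, 498, 501, 490, 486, 482, 479, 475]
    print(np.round(double_normalize(roi, ref, n_pre=3), 3))
    ```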

  18. An integrated knowledge system for wind tunnel testing - Project Engineers' Intelligent Assistant

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.; Shi, George Z.; Hoyt, W. A.; Steinle, Frank W., Jr.

    1993-01-01

    The Project Engineers' Intelligent Assistant (PEIA) is an integrated knowledge system developed using artificial intelligence technology, including hypertext, expert systems, and dynamic user interfaces. The system integrates documents, engineering codes, databases, and knowledge from domain experts into an enriched hypermedia environment and was designed to assist project engineers in planning and conducting wind tunnel tests. PEIA is a modular system which consists of an intelligent user interface, seven modules and an integrated tool facility. Hypermedia technology is discussed and the seven PEIA modules are described. System maintenance and updating are straightforward due to the modular structure, and the integrated tool facility provides user access to commercial software shells for documentation, reporting, or database updating. PEIA is expected to provide project engineers with technical information, increase efficiency and productivity, and provide a realistic tool for personnel training.

  19. Calypso: a user-friendly web-server for mining and visualizing microbiome-environment interactions.

    PubMed

    Zakrzewski, Martha; Proietti, Carla; Ellis, Jonathan J; Hasan, Shihab; Brion, Marie-Jo; Berger, Bernard; Krause, Lutz

    2017-03-01

    Calypso is an easy-to-use online software suite that allows non-expert users to mine, interpret and compare taxonomic information from metagenomic or 16S rDNA datasets. Calypso has a focus on multivariate statistical approaches that can identify complex environment-microbiome associations. The software enables quantitative visualizations, statistical testing, multivariate analysis, supervised learning, factor analysis, multivariable regression, network analysis and diversity estimates. Comprehensive help pages, tutorials and videos are provided via a wiki page. The web-interface is accessible via http://cgenome.net/calypso/ . The software is programmed in Java, PERL and R and the source code is available from Zenodo ( https://zenodo.org/record/50931 ). The software is freely available for non-commercial users. l.krause@uq.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  20. CRAX. Cassandra Exoskeleton

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, D.G.; Eubanks, L.

    1998-03-01

    This software assists the engineering designer in characterizing the statistical uncertainty in the performance of complex systems resulting from variations in manufacturing processes, material properties, system geometry or operating environment. The software is composed of a graphical user interface that provides the user with easy access to Cassandra uncertainty analysis routines. Together, this interface and the Cassandra routines are referred to as CRAX (CassandRA eXoskeleton). The software is flexible enough that, with minor modification, it can interface with large modeling and analysis codes such as heat transfer or finite element analysis software. The current version permits the user to manually input a performance function, the number of random variables and their associated statistical characteristics: density function, mean, and coefficients of variation. Additional uncertainty analysis modules are continuously being added to the Cassandra core.

  1. Cassandra Exoskeleton

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, David G.

    1999-02-20

    This software assists the engineering designer in characterizing the statistical uncertainty in the performance of complex systems resulting from variations in manufacturing processes, material properties, system geometry or operating environment. The software is composed of a graphical user interface that provides the user with easy access to Cassandra uncertainty analysis routines. Together, this interface and the Cassandra routines are referred to as CRAX (CassandRA eXoskeleton). The software is flexible enough that, with minor modification, it can interface with large modeling and analysis codes such as heat transfer or finite element analysis software. The current version permits the user to manually input a performance function, the number of random variables and their associated statistical characteristics: density function, mean, and coefficients of variation. Additional uncertainty analysis modules are continuously being added to the Cassandra core.

  2. Machine learning for micro-tomography

    NASA Astrophysics Data System (ADS)

    Parkinson, Dilworth Y.; Pelt, Daniël. M.; Perciano, Talita; Ushizima, Daniela; Krishnan, Harinarayan; Barnard, Harold S.; MacDowell, Alastair A.; Sethian, James

    2017-09-01

    Machine learning has revolutionized a number of fields, but many micro-tomography users have never used it for their work. The micro-tomography beamline at the Advanced Light Source (ALS), in collaboration with the Center for Applied Mathematics for Energy Research Applications (CAMERA) at Lawrence Berkeley National Laboratory, has now deployed a series of tools to automate data processing for ALS users using machine learning. This includes new reconstruction algorithms, feature extraction tools, and image classification and recommendation systems for scientific images. Some of these tools run either in automated pipelines that operate on data as it is collected or as stand-alone software. Others are deployed on computing resources at Berkeley Lab, from workstations to supercomputers, and are made accessible to users through either scripting or easy-to-use graphical interfaces. This paper presents a progress report on this work.
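
    As a generic illustration of the kind of image-classification step such automated pipelines perform (not the ALS/CAMERA code itself), a scikit-learn classifier can be trained on simple per-image feature vectors; the data below are random stand-ins.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Stand-in data: 200 "images" reduced to 64 summary features each
    # (e.g. histogram bins or texture statistics), with two sample classes.
    X = rng.normal(size=(200, 64))
    y = rng.integers(0, 2, size=200)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
    print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")  # near chance on random data
    ```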

  3. GES DAAC HDF Data Processing and Visualization Tools

    NASA Astrophysics Data System (ADS)

    Ouzounov, D.; Cho, S.; Johnson, J.; Li, J.; Liu, Z.; Lu, L.; Pollack, N.; Qin, J.; Savtchenko, A.; Teng, B.

    2002-12-01

    The Goddard Earth Sciences (GES) Distributed Active Archive Center (DAAC) plays a major role in enabling basic scientific research and providing access to scientific data for the general user community. Several GES DAAC data support teams provide expert assistance to users in accessing data, including information on visualization tools and documentation for data products. To provide easy access to the science data, the data support teams have additionally developed many online and desktop tools for data processing and visualization. This presentation is an overview of major HDF tools implemented at the GES DAAC and aimed at optimizing access to EOS data for the Earth sciences community. GES DAAC ONLINE TOOLS: The MODIS and AIRS on-demand Channel/Variable Subsetters are web-based, on-the-fly/on-demand subsetters that perform channel/variable subsetting and restructuring for Level 1B and Level 2 data products. Users can specify criteria to subset data files with desired channels and variables and then download the subsetted file. AIRS QuickLook is a CGI/IDL package that allows users to view AIRS/HSB/AMSU Level-1B data online by specifying a channel prior to obtaining data. A global map is also provided along with the image to show the geographic coverage of the granule and the flight direction of the spacecraft. OASIS (Online data AnalySIS) is an IDL-based HTML/CGI interface for search, selection, and simple analysis of Earth science data. It supports binary and GRIB formatted data, such as TOVS, Data Assimilation products, and some NCEP operational products. The TRMM Online Analysis System is designed for quick exploration, analysis, and visualization of TRMM Level-3 and other precipitation products. The products consist of the daily (3B42), monthly (3B43), near-real-time (3B42RT), and Willmott's climate data. The system is also designed to be simple and easy to use - users can plot the average or accumulated rainfall over their region of interest for a given time period, or plot the time series of regional rainfall averages. WebGIS is an online web application that implements the Open GIS Consortium (OGC) standards for mapping requests and rendering. It allows users access to TRMM, MODIS, SeaWiFS, and AVHRR data from several DAAC map servers, as well as externally served data such as political boundaries, population centers, lakes, rivers, and elevation. GES DAAC DESKTOP TOOLS: HDFLook-MODIS is a new, multifunctional data processing and visualization tool for Radiometric and Geolocation, Atmosphere, Ocean, and Land MODIS HDF-EOS data. Features include (1) access to and visualization of all swath (Level 1 and 2) MODIS and AIRS products, and gridded (Level 3 and 4) MODIS products; (2) re-mapping of swath data to a world map; (3) geo-projection conversion; (4) interactive and batch-mode capabilities; (5) subsetting and multi-granule processing; and (6) data conversion. SIMAP is an IDL-based script that is designed to read and map MODIS Level 1B (L1B) and Level 2 (L2) Ocean and Atmosphere products. It is a non-interactive, command-line tool. The resulting maps are scaled to physical units (e.g., radiances, concentrations, brightness temperatures) and saved in binary files. TRMM HDF (in C and Fortran) reads in TRMM HDF data files and writes out user-selected SDS arrays and Vdata tables as separate flat binary files.

  4. [User's perceived quality in an internal medicine service after a five-year period application of a user's satisfaction survey].

    PubMed

    García-Aparicio, J; Herrero-Herrero, J; Corral-Gudino, L; Jorge-Sánchez, R

    2010-01-01

    To evaluate the quality perceived by users of the 'Los Montalvos' Internal Medicine Service (Salamanca, Spain) over its first five years of operation. A cross-sectional study was carried out from February 2004 to January 2009. All in-patients (6,997) were given a SERVQHOS-model survey at the time of discharge, which was anonymous and voluntary. We collected 2,435 surveys, a participation rate of 34.8%. Except for the item regarding accessibility, the other questions of the survey were rated "as expected" or above expectations by over 85% of the users. A total of 90.6% of patients who completed the survey were satisfied with the care received, and 83.9% would recommend the hospital to others. The variables with the highest predictive capability in relation to overall satisfaction were 'personalised care' and the interest of staff in solving problems. 'Easy access to the hospital' was rated by 33.6% as below expectations. After introducing several improvement measures, the percentage of dissatisfaction regarding accessibility was 24.8% (p=0.02). Nine out of ten patients surveyed were satisfied or very satisfied with the care received and would recommend the hospital to others. The variables most strongly associated with overall satisfaction were those related to service personnel. After identifying deficiencies and implementing measures to improve, the survey detected an increase in the level of satisfaction. Copyright 2009 SECA. Published by Elsevier Espana. All rights reserved.

  5. ESASky: All the sky you need

    NASA Astrophysics Data System (ADS)

    De Marchi, Guido; ESASky Team

    2018-06-01

    ESASky is a discovery portal giving to all astronomers, professional and amateur alike, an easy way to access high-quality scientific data from their computer, tablet, or mobile device. It includes over half a million images, 300,000 spectra, and more than a billion catalogue sources. From gamma rays to radio wavelengths, it allows users to explore the cosmos with data from a dozen space missions from the astronomical archives of ESA, NASA, and JAXA and does not require prior knowledge of any particular mission. ESASky features an all-sky exploration interface, letting users easily zoom in for stars as single targets or as part of a whole galaxy, visualise them and retrieve the relevant data taken in an area of the sky with just a few clicks. Users can easily compare observations of the same source obtained by different space missions at different times and wavelengths. They can also use ESASky to plan future observations with the James Webb Space Telescope, comparing the relevant portion of the sky as observed by Hubble and other missions. We will illustrate the many options to visualise and access astronomical data: interactive footprints for each instrument, tree-maps, filters, and solar-system object trajectories can all be combined and displayed. The most recent version of ESASky, released in February, also includes access to scientific publications, allowing users to visualise on the sky all astronomical objects with associated scientific publications and to link directly back to the papers in the NASA Astrophysics Data System.
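
    Programmatic access to similar holdings is also possible from Python. The sketch below assumes the astroquery package's ESASky module and its query methods; the module name and method names are an assumption here, so check the astroquery documentation before relying on them.

    ```python
    # Assumes astroquery is installed (pip install astroquery) and that its
    # ESASky module exposes list_maps() and query_object_maps().
    from astroquery.esasky import ESASky

    # List the missions whose imaging maps are exposed through ESASky.
    print(ESASky.list_maps())

    # Query observation metadata around a named target; the result is a set of
    # tables, one per mission, that can be inspected or used to download data.
    maps = ESASky.query_object_maps("M51")
    print(maps.keys())   # missions with observations of the target
    print(maps[0])       # the first mission's table of observations
    ```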

  6. Mobile Visualization and Analysis Tools for Spatial Time-Series Data

    NASA Astrophysics Data System (ADS)

    Eberle, J.; Hüttich, C.; Schmullius, C.

    2013-12-01

    The Siberian Earth System Science Cluster (SIB-ESS-C) provides access and analysis services for spatial time-series data built on products from the Moderate Resolution Imaging Spectroradiometer (MODIS) and climate data from meteorological stations. A web portal for data access, visualization and analysis with standard-compliant web services has already been developed for SIB-ESS-C. As a further enhancement, a mobile app was developed to provide easy access to these time-series data during field campaigns. The app sends the current position from the GPS receiver and a dataset selected by the user (such as land surface temperature or a vegetation index) to the SIB-ESS-C web service and receives the requested time-series data for the identified pixel in real time. The data are then plotted directly in the app. Furthermore, the user can analyze the time-series data for breakpoints and other phenological values. These analyses are executed on demand on the SIB-ESS-C web server and the results are transferred to the app. Any processing can also be done at the SIB-ESS-C web portal. The aim of this work is to make spatial time-series data and analysis functions available to end users without the need for data processing. In this presentation the author gives an overview of this new mobile app, its functionality, the technical infrastructure, and technological issues (how the app was developed and the experience gained).
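
    The request/response pattern described above (send a GPS position and a dataset name, receive the time series for that pixel) can be sketched as a plain HTTP call. The endpoint URL, parameter names, and response layout below are placeholders, not the actual SIB-ESS-C interface.

    ```python
    import requests  # third-party HTTP client (pip install requests)

    # Hypothetical endpoint and parameters, for illustration only.
    SERVICE_URL = "https://example.org/sibessc/timeseries"
    params = {
        "lat": 62.255,            # position from the device's GPS receiver
        "lon": 129.150,
        "dataset": "MOD11A2_LST", # e.g. MODIS land surface temperature
        "format": "json",
    }

    response = requests.get(SERVICE_URL, params=params, timeout=30)
    response.raise_for_status()
    series = response.json()      # assumed: a list of (date, value) pairs for the pixel
    for date, value in series[:5]:
        print(date, value)
    ```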

  7. ERPLAB: an open-source toolbox for the analysis of event-related potentials

    PubMed Central

    Lopez-Calderon, Javier; Luck, Steven J.

    2014-01-01

    ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations. PMID:24782741

  8. SPANG: a SPARQL client supporting generation and reuse of queries for distributed RDF databases.

    PubMed

    Chiba, Hirokazu; Uchiyama, Ikuo

    2017-02-08

    Toward improved interoperability of distributed biological databases, an increasing number of datasets have been published in the standardized Resource Description Framework (RDF). Although the powerful SPARQL Protocol and RDF Query Language (SPARQL) provides a basis for exploiting RDF databases, writing SPARQL code is burdensome for users including bioinformaticians. Thus, an easy-to-use interface is necessary. We developed SPANG, a SPARQL client that has unique features for querying RDF datasets. SPANG dynamically generates typical SPARQL queries according to specified arguments. It can also call SPARQL template libraries constructed in a local system or published on the Web. Further, it enables combinatorial execution of multiple queries, each with a distinct target database. These features facilitate easy and effective access to RDF datasets and integrative analysis of distributed data. SPANG helps users to exploit RDF datasets by generation and reuse of SPARQL queries through a simple interface. This client will enhance integrative exploitation of biological RDF datasets distributed across the Web. This software package is freely available at http://purl.org/net/spang .
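
    The kind of SPARQL boilerplate that such a client spares the user from writing looks like the following. The sketch uses the SPARQLWrapper Python library against the public UniProt endpoint as a generic example; it illustrates the query pattern only and is not SPANG itself.

    ```python
    from SPARQLWrapper import SPARQLWrapper, JSON  # pip install sparqlwrapper

    # Any public SPARQL endpoint can stand in here; UniProt's is used as an example.
    endpoint = SPARQLWrapper("https://sparql.uniprot.org/sparql")
    endpoint.setQuery("""
    PREFIX up: <http://purl.uniprot.org/core/>
    SELECT ?protein WHERE {
      ?protein a up:Protein .
    } LIMIT 5
    """)
    endpoint.setReturnFormat(JSON)

    results = endpoint.query().convert()
    for binding in results["results"]["bindings"]:
        print(binding["protein"]["value"])
    ```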

  9. Reviews on Security Issues and Challenges in Cloud Computing

    NASA Astrophysics Data System (ADS)

    An, Y. Z.; Zaaba, Z. F.; Samsudin, N. F.

    2016-11-01

    Cloud computing is an Internet-based computing service provided by third parties that allows sharing of resources and data among devices. It is widely used in many organizations nowadays and is becoming more popular because it changes the way the Information Technology (IT) of an organization is organized and managed. It provides many benefits such as simplicity and lower costs, almost unlimited storage, minimal maintenance, easy utilization, backup and recovery, continuous availability, quality of service, automated software integration, scalability, flexibility and reliability, easy access to information, elasticity, quick deployment and a lower barrier to entry. While the use of cloud computing services is increasing in this new era, their security issues have become a challenge. Cloud computing must be safe and secure enough to ensure the privacy of its users. This paper first outlines the architecture of cloud computing and then discusses the most common security issues of using the cloud and some solutions to them, since security is one of the most critical aspects of cloud computing due to the sensitivity of users' data.

  10. Catalog Descriptions Using VOTable Files

    NASA Astrophysics Data System (ADS)

    Thompson, R.; Levay, K.; Kimball, T.; White, R.

    2008-08-01

    Additional information is frequently required to describe database table contents and make them understandable to users. For this reason, the Multimission Archive at Space Telescope (MAST) creates "description files" for each table/catalog. After trying various XML and CSV formats, we finally chose VOTable. These files are easy to update via an HTML form, easily read using an XML parser such as (in our case) the PHP5 SimpleXML extension, and have found multiple uses in our data access/retrieval process.
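
    Because the description files are standard VOTable XML, they can be read with any VOTable parser; for example, with astropy in Python. This is a generic illustration of reading such a file, not MAST's own tooling, and the file name is a placeholder.

    ```python
    from astropy.io.votable import parse  # pip install astropy

    # Read a VOTable "description file" and list its column metadata.
    votable = parse("catalog_description.xml")   # placeholder path
    table = votable.get_first_table()

    for field in table.fields:
        # Each FIELD carries the name, datatype, unit and free-text description
        # that make the catalog contents understandable to users.
        print(field.name, field.datatype, field.unit, field.description)
    ```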

  11. IMAGESEER - IMAGEs for Education and Research

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Grubb, Thomas; Milner, Barbara

    2012-01-01

    IMAGESEER is a new Web portal that brings easy access to NASA image data for non-NASA researchers, educators, and students. The IMAGESEER Web site and database are specifically designed to be utilized by the university community, to enable teaching image processing (IP) techniques on NASA data, as well as to provide reference benchmark data to validate new IP algorithms. Along with the data and a Web user interface front-end, basic knowledge of the application domains, benchmark information, and specific NASA IP challenges (or case studies) are provided.

  12. Implementing a bubble memory hierarchy system

    NASA Technical Reports Server (NTRS)

    Segura, R.; Nichols, C. D.

    1979-01-01

    This paper reports on the implementation of a magnetic bubble memory in a two-level hierarchical system. The hierarchy used a major-minor loop device and RAM under microprocessor control. Dynamic memory addressing, a dual-bus primary memory, and hardware data-modification detection are incorporated in the system to minimize access time. The objective of the system is to combine the advantages of bipolar memory with those of bubble-domain memory to provide a smart, optimal memory system which is easy to interface and independent of the user's system.

  13. Surfing for Data: A Gathering Trend in Data Storage Is the Use of Web-Based Applications that Make It Easy for Authorized Users to Access Hosted Server Content with Just a Computing Device and Browser

    ERIC Educational Resources Information Center

    Technology & Learning, 2005

    2005-01-01

    In recent years, the widespread availability of networks and the flexibility of Web browsers have shifted the industry from a client-server model to a Web-based one. In the client-server model of computing, clients run applications locally, with the servers managing storage, printing functions, and network traffic. Because every client is…

  14. The ViennaRNA web services.

    PubMed

    Gruber, Andreas R; Bernhart, Stephan H; Lorenz, Ronny

    2015-01-01

    The ViennaRNA package is a widely used collection of programs for thermodynamic RNA secondary structure prediction. Over the years, many additional tools have been developed building on the core programs of the package to also address issues related to noncoding RNA detection, RNA folding kinetics, or efficient sequence design considering RNA-RNA hybridizations. The ViennaRNA web services provide easy and user-friendly web access to these tools. This chapter describes how to use this online platform to perform tasks such as prediction of minimum free energy structures, prediction of RNA-RNA hybrids, or noncoding RNA detection. The ViennaRNA web services can be used free of charge and can be accessed via http://rna.tbi.univie.ac.at.
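
    The same core functionality is also available programmatically. The minimal sketch below folds a short sequence into its minimum free energy structure, assuming the RNA Python bindings that ship with the ViennaRNA package are installed; the sequence is an arbitrary example.

    ```python
    import RNA  # Python bindings shipped with the ViennaRNA package

    sequence = "GCGCUUCGCCGCGCGGAA"

    # Predict the minimum free energy (MFE) secondary structure.
    structure, mfe = RNA.fold(sequence)
    print(structure)                  # dot-bracket notation
    print(f"MFE: {mfe:.2f} kcal/mol")
    ```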

  15. TerraLook: Providing easy, no-cost access to satellite images for busy people and the technologically disinclined

    USGS Publications Warehouse

    Geller, G.N.; Fosnight, E.A.; Chaudhuri, Sambhudas

    2008-01-01

    Access to satellite images has been largely limited to communities with specialized tools and expertise, even though images could also benefit other communities. This situation has resulted in underutilization of the data. TerraLook, which consists of collections of georeferenced JPEG images and an open source toolkit to use them, makes satellite images available to those lacking experience with remote sensing. Users can find, roam, and zoom images, create and display vector overlays, adjust and annotate images so they can be used as a communication vehicle, compare images taken at different times, and perform other activities useful for natural resource management, sustainable development, education, and other activities. © 2007 IEEE.

  16. TerraLook: Providing easy, no-cost access to satellite images for busy people and the technologically disinclined

    USGS Publications Warehouse

    Geller, G.N.; Fosnight, E.A.; Chaudhuri, Sambhudas

    2007-01-01

    Access to satellite images has been largely limited to communities with specialized tools and expertise, even though images could also benefit other communities. This situation has resulted in underutilization of the data. TerraLook, which consists of collections of georeferenced JPEG images and an open source toolkit to use them, makes satellite images available to those lacking experience with remote sensing. Users can find, roam, and zoom images, create and display vector overlays, adjust and annotate images so they can be used as a communication vehicle, compare images taken at different times, and perform other activities useful for natural resource management, sustainable development, education, and other activities. © 2007 IEEE.

  17. Aether: leveraging linear programming for optimal cloud computing in genomics.

    PubMed

    Luber, Jacob M; Tierney, Braden T; Cofer, Evan M; Patel, Chirag J; Kostic, Aleksandar D

    2018-05-01

    Across biology, we are seeing rapid developments in scale of data production without a corresponding increase in data analysis capabilities. Here, we present Aether (http://aether.kosticlab.org), an intuitive, easy-to-use, cost-effective and scalable framework that uses linear programming to optimally bid on and deploy combinations of underutilized cloud computing resources. Our approach simultaneously minimizes the cost of data analysis and provides an easy transition from users' existing HPC pipelines. Data utilized are available at https://pubs.broadinstitute.org/diabimmune and with EBI SRA accession ERP005989. Source code is available at https://github.com/kosticlab/aether. Examples, documentation and a tutorial are available at http://aether.kosticlab.org. Contact: chirag_patel@hms.harvard.edu or aleksandar.kostic@joslin.harvard.edu. Supplementary data are available at Bioinformatics online.
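
    The abstract does not spell out Aether's exact formulation, but the flavor of the optimization, choosing how many instances of each spot-market type to acquire so that total cost is minimized while resource requirements are met, can be sketched as a small linear program. The instance types, prices, and requirements below are invented for illustration.

    ```python
    # Toy sketch of LP-based bidding on underutilized cloud resources, in the
    # spirit of Aether's approach (not its actual formulation). All prices and
    # resource numbers are made up for illustration.
    from scipy.optimize import linprog

    # Candidate spot instance types: hourly price, vCPUs, memory (GiB).
    prices  = [0.12, 0.25, 0.48]
    vcpus   = [4,    8,    16]
    mem_gib = [16,   32,   64]

    need_vcpus, need_mem = 64, 256   # total resources the pipeline needs

    # Minimize total hourly cost subject to meeting CPU and memory needs.
    # linprog expects <= constraints, so the >= requirements are negated.
    res = linprog(
        c=prices,
        A_ub=[[-v for v in vcpus], [-m for m in mem_gib]],
        b_ub=[-need_vcpus, -need_mem],
        bounds=[(0, None)] * len(prices),
        method="highs",
    )
    print("instances of each type:", [round(x, 2) for x in res.x])
    print("hourly cost:", round(res.fun, 2))
    ```

    A production formulation would round to whole instances (or solve an integer program); the continuous relaxation above is only meant to show the shape of the cost-minimization problem.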

  18. Automated DICOM metadata and volumetric anatomical information extraction for radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Papamichail, D.; Ploussi, A.; Kordolaimi, S.; Karavasilis, E.; Papadimitroulas, P.; Syrgiamiotis, V.; Efstathopoulos, E.

    2015-09-01

    Patient-specific dosimetry calculations based on simulation techniques have as a prerequisite the modeling of the modality system and the creation of voxelized phantoms. This procedure requires knowledge of the scanning parameters and patient information included in a DICOM file, as well as image segmentation. However, the extraction of this information is complicated and time-consuming. The objective of this study was to develop a simple graphical user interface (GUI) to (i) automatically extract metadata from every slice image of a DICOM file in a single query and (ii) interactively specify the regions of interest (ROI) without explicit access to the radiology information system. The user-friendly application was developed in the Matlab environment. The user can select a series of DICOM files and manage their text and graphical data. The metadata are automatically formatted and presented to the user as a Microsoft Excel file. The volumetric maps are formed by interactively specifying the ROIs and by assigning a specific value to every ROI. The result is stored in DICOM format for data and trend analysis. The developed GUI is easy to use, fast, and constitutes a very useful tool for individualized dosimetry. One of the future goals is to incorporate remote access to a PACS server.
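
    The original tool is a Matlab GUI; as a rough Python analogue of the metadata-extraction step only, the sketch below reads every slice of a DICOM series with pydicom and writes selected header fields to a spreadsheet. The directory, output file, and list of tags are illustrative choices, not the tags the paper's GUI extracts.

    ```python
    # Rough Python analogue of the metadata-extraction step (the original tool
    # is a Matlab GUI). Reads each slice of a DICOM series and writes selected
    # header fields to an Excel sheet. Paths and the tag list are illustrative.
    from pathlib import Path

    import pandas as pd          # .to_excel() requires openpyxl
    import pydicom

    TAGS = ["PatientID", "Modality", "KVP", "SliceThickness", "PixelSpacing"]

    def extract_metadata(series_dir, out_xlsx="dicom_metadata.xlsx"):
        rows = []
        for path in sorted(Path(series_dir).glob("*.dcm")):
            ds = pydicom.dcmread(path, stop_before_pixels=True)
            row = {"file": path.name}
            for tag in TAGS:
                row[tag] = str(getattr(ds, tag, ""))
            rows.append(row)
        pd.DataFrame(rows).to_excel(out_xlsx, index=False)
        return out_xlsx

    if __name__ == "__main__":
        extract_metadata("ct_series/")
    ```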

  19. [Information resources in medicine: present and future trends].

    PubMed

    Campos-Asensio, C

    2010-12-01

    We are immersed in a new paradigm for scientific information access in which, in the future, information will only be available and transmitted in electronic format. The concept of using the internet as information storage has changed, with emphasis on its interactivity and the possibility of sharing content. Web 2.0 has revolutionized the way the internet is understood, promoting the participation of those who access it, who collaborate in its construction through intuitive and easy-to-use tools. Medicine 2.0 supposes the participation of the user in the design, selection and evaluation of the contents. The future of access to information is through Medicine 2.0 services. The aim of this paper is to review the tools and instruments available for health care professionals to access scientific information, with special emphasis on Web 2.0 tools. Copyright © 2010 Elsevier España, S.L. All rights reserved.

  20. Access Control for Mobile Assessment Systems Using ID.

    PubMed

    Nakayama, Masaharu; Ishii, Tadashi; Morino, Kazuma

    2015-01-01

    The assessment of shelters during a disaster is critical to ensure the health of evacuees and prevent pandemics. In the Ishinomaki area, one of the areas most damaged by the Great East Japan Earthquake, highly organized assessment helped to successfully manage a total of 328 shelters with a total of 46,480 evacuees. The input and analysis of vast amounts of data was tedious work for staff members; however, a web-based assessment system that utilized mobile devices was thought to decrease the workload and standardize the evaluation form. Access to information must be controlled in order to maintain individuals' privacy. We successfully developed an access control system using IDs. By utilizing a unique numerical ID, users can access the input form or assessment table. This avoids unnecessary queries to the server, resulting in a quick response and easy availability, even with a poor internet connection.
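
    The paper does not publish its implementation, so the following is only a schematic illustration of the idea of ID-based access control: a numeric ID is checked against a local permission table to decide which form or table a user may open, so unauthorized requests never reach the server. The ID ranges and resource names are hypothetical.

    ```python
    # Schematic illustration of ID-based access control (not the paper's code):
    # a numeric ID is checked against a local permission table before any
    # request is sent to the server, so unauthorized queries never leave the device.
    PERMISSIONS = {                            # hypothetical ID ranges and roles
        range(1000, 2000): "input_form",       # shelter staff: data entry
        range(2000, 3000): "assessment_table", # coordinators: read-only summary
    }

    def resource_for(user_id: int):
        """Return the resource this ID may access, or None if unauthorized."""
        for id_range, resource in PERMISSIONS.items():
            if user_id in id_range:
                return resource
        return None

    def open_resource(user_id: int):
        resource = resource_for(user_id)
        if resource is None:
            return "access denied (no server query issued)"
        return f"requesting '{resource}' from server for user {user_id}"

    print(open_resource(1042))   # -> requests the input form
    print(open_resource(9999))   # -> access denied locally
    ```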

  1. An Intelligent Terminal for Access to a Medical Database

    PubMed Central

    Womble, M. E.; Wilson, S. D.; Keiser, H. N.; Tworek, M. L.

    1978-01-01

    Very powerful data base management systems (DBMS) now exist which allow medical personnel access to patient record data bases. DBMS's make it easy to retrieve either complete or abbreviated records of patients with similar characteristics. In addition, statistics on data base records are immediately accessible. However, the price of this power is a large computer with the inherent problems of access, response time, and reliability. If a general purpose, time-shared computer is used to get this power, the response time to a request can be either rapid or slow, depending upon loading by other users. Furthermore, if the computer is accessed via dial-up telephone lines, there is competition with other users for telephone ports. If either the DBMS or the host machine is replaced, the medical users, who are typically not sophisticated in computer usage, are forced to learn the new system. Microcomputers, because of their low cost and adaptability, lend themselves to a solution of these problems. A microprocessor-based intelligent terminal has been designed and implemented at the USAF School of Aerospace Medicine to provide a transparent interface between the user and his data base. The intelligent terminal system includes multiple microprocessors, floppy disks, a CRT terminal, and a printer. Users interact with the system at the CRT terminal using menu selection (framing). The system translates the menu selection into the query language of the DBMS and handles all actual communication with the DBMS and its host computer, including telephone dialing and sign-on procedures, as well as the actual data base query and response. Retrieved information is stored locally for CRT display, hard copy production, and/or permanent retention. Microprocessor-based communication units provide security for sensitive medical data through encryption/decryption algorithms and high-reliability error detection transmission schemes. Highly modular software design permits adaptation to a different DBMS and/or host computer with only minor localized software changes. Importantly, this portability is completely transparent to system users. Although the terminal system is independent of the host computer and its DBMS, it has been linked to a UNIVAC 1108 computer supporting MRI's SYSTEM 2000 DBMS.

  2. Patient portals and health apps: Pitfalls, promises, and what one might learn from the other.

    PubMed

    Baldwin, Jessica L; Singh, Hardeep; Sittig, Dean F; Giardina, Traber Davis

    2017-09-01

    Widespread use of health information technology (IT) could potentially increase patients' access to their health information and facilitate future goals of advancing patient-centered care. Despite having increased access to their health data, patients do not always understand this information or its implications, and digital health data can be difficult to navigate when displayed in a small-format, complex interface. In this paper, we discuss two forms of patient-facing health IT tools-patient portals and applications (apps)-and highlight how, despite several limitations of each, combining high-yield features of mobile health (mHealth) apps with portals could increase patient engagement and self-management and be more effective than either of them alone. Patient portal adoption is variable, and due to design and interface limitations and health literacy issues, many people find the portal difficult to use. Conversely, apps have experienced rapid adoption and traditionally have more consumer-friendly features with easy log-in access, real-time tracking, and simplified data display. These features make the applications more intuitive and easy-to-use than patient portals. While apps have their own limitations and might serve different purposes, patient portals could adopt some high-yield features and functions of apps that lead to engagement success with patients. We thus suggest that to improve user experience with future portals, developers could look towards mHealth apps in design, function, and user interface. Adding new features to portals may improve their use and empower patients to track their overall health and disease states. Nevertheless, both these health IT tools should be subjected to rigorous evaluation to ensure they meet their potential in improving patient outcomes. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  3. The EXOSAT database and archive

    NASA Technical Reports Server (NTRS)

    Reynolds, A. P.; Parmar, A. N.

    1992-01-01

    The EXOSAT database provides on-line access to the results and data products (spectra, images, and lightcurves) from the EXOSAT mission, as well as access to data and logs from a number of other missions (such as EINSTEIN, COS-B, ROSAT, and IRAS). In addition, a number of familiar optical, infrared, and X-ray catalogs, including the Hubble Space Telescope (HST) guide star catalog, are available. The complete database is located at the EXOSAT observatory at ESTEC in the Netherlands and is accessible remotely via a captive account. The database management system was specifically developed to efficiently access the database and to allow the user to perform statistical studies on large samples of astronomical objects as well as to retrieve scientific and bibliographic information on single sources. The system was designed to be mission independent and includes timing, image processing, and spectral analysis packages as well as software to allow the easy transfer of analysis results and products to the user's own institute. The archive at ESTEC comprises a subset of the EXOSAT observations, stored on magnetic tape. Observations of particular interest were copied in compressed format to an optical jukebox, allowing users to retrieve and analyze selected raw data entirely from their terminals. Such analysis may be necessary if the user's needs are not accommodated by the products contained in the database (in terms of time resolution, spectral range, and the finesse of the background subtraction, for instance). Long-term archiving of the full final observation data is taking place at ESRIN in Italy as part of the ESIS program, again using optical media, and ESRIN have now assumed responsibility for distributing the data to the community. Tests showed that raw observational data (typically several tens of megabytes for a single target) can be transferred via the existing networks in reasonable time.

  4. Embedding online patient record access in UK primary care: a survey of stakeholder experiences.

    PubMed

    Pagliari, Claudia; Shand, Tim; Fisher, Brian

    2012-05-01

    To explore the integration of online patient Record Access within UK Primary Care, its perceived impacts on workload and service quality, and barriers to implementation. Mixed format survey of clinicians, administrators and patients. Telephone interviews with non-users. Primary care centres within NHS England that had offered online record access for the preceding year. Of the 57 practices initially agreeing to pilot the system, 32 had adopted it and 16 of these returned questionnaires. The 42 individual respondents included 14 practice managers, 15 clinicians and 13 patients. Follow-up interviews were conducted with one participant from 15 of the 25 non-adopter practices. Most professionals believed that the system is easy to integrate within primary care; while most patients found it easy to integrate within their daily lives. Professionals perceived no increase in the volume of patient queries or clinical consultations as a result of Record Access; indeed some believed that these had decreased. Most clinicians and patients believed that the service had improved mutual trust, communication, patients' health knowledge and health behaviour. Inhibiting factors included concerns about security, liability and resource requirements. Non-adoption was most frequently attributed to competing priorities, rather than negative beliefs about the service. Record access has an important role to play in supporting patient-focused healthcare policies in the UK and may be easily accommodated within existing services. Additional materials to facilitate patient recruitment, inform system set-up processes, and assure clinicians of their legal position are likely to encourage more widespread adoption.

  5. Discerning Trends in Performance Across Multiple Events

    NASA Technical Reports Server (NTRS)

    Slater, Simon; Hiltz, Mike; Rice, Craig

    2006-01-01

    Mass Data is a computer program that enables rapid, easy discernment of trends in performance data across multiple flights and ground tests. The program can perform Fourier analysis and other functions for the purposes of frequency analysis and trending of all variables. These functions facilitate identification of past use of diagnosed systems and of anomalies in such systems, and enable rapid assessment of related current problems. Many variables, for computation of which it is usually necessary to perform extensive manual manipulation of raw downlist data, are automatically computed and made available to all users, regularly eliminating the need for what would otherwise be an extensive amount of engineering analysis. Data from flight, ground test, and simulation are preprocessed and stored in one central location for instantaneous access and comparison for diagnostic and trending purposes. Rules are created so that an event log is created for every flight, making it easy to locate information on similar maneuvers across many flights. The same rules can be created for test sets and simulations, and are searchable, so that information on like events is easily accessible.
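
    Mass Data itself is not shown in this abstract, but the kind of frequency analysis it describes can be illustrated in a few lines of NumPy: compute an FFT of one telemetry channel and report the dominant frequency so the same spectral signature can be compared across flights. The sampling rate and signal below are synthetic and purely illustrative.

    ```python
    # Illustration of the frequency-analysis step described above (not Mass
    # Data itself): FFT of one telemetry channel and its dominant frequency,
    # which can then be trended across flights. Signal and rate are synthetic.
    import numpy as np

    fs = 100.0                                   # assumed sample rate, Hz
    t = np.arange(0, 30, 1 / fs)                 # 30 s of telemetry
    signal = 2.0 * np.sin(2 * np.pi * 3.2 * t) + 0.5 * np.random.randn(t.size)

    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))   # drop the DC term
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)

    dominant = freqs[np.argmax(spectrum)]
    print(f"dominant frequency: {dominant:.2f} Hz")   # ~3.2 Hz for this example
    ```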

  6. iview: an interactive WebGL visualizer for protein-ligand complex.

    PubMed

    Li, Hongjian; Leung, Kwong-Sak; Nakane, Takanori; Wong, Man-Hon

    2014-02-25

    Visualization of protein-ligand complex plays an important role in elaborating protein-ligand interactions and aiding novel drug design. Most existing web visualizers either rely on slow software rendering, or lack virtual reality support. The vital feature of macromolecular surface construction is also unavailable. We have developed iview, an easy-to-use interactive WebGL visualizer of protein-ligand complex. It exploits hardware acceleration rather than software rendering. It features three special effects in virtual reality settings, namely anaglyph, parallax barrier and oculus rift, resulting in visually appealing identification of intermolecular interactions. It supports four surface representations including Van der Waals surface, solvent excluded surface, solvent accessible surface and molecular surface. Moreover, based on the feature-rich version of iview, we have also developed a neat and tailor-made version specifically for our istar web platform for protein-ligand docking purpose. This demonstrates the excellent portability of iview. Using innovative 3D techniques, we provide a user friendly visualizer that is not intended to compete with professional visualizers, but to enable easy accessibility and platform independence.

  7. Arabidopsis Gene Family Profiler (aGFP)--user-oriented transcriptomic database with easy-to-use graphic interface.

    PubMed

    Dupl'áková, Nikoleta; Renák, David; Hovanec, Patrik; Honysová, Barbora; Twell, David; Honys, David

    2007-07-23

    Microarray technologies now belong to the standard functional genomics toolbox and have undergone massive development leading to increased genome coverage, accuracy and reliability. The number of experiments exploiting microarray technology has markedly increased in recent years. In parallel with the rapid accumulation of transcriptomic data, on-line analysis tools are being introduced to simplify their use. Global statistical data analysis methods contribute to the development of overall concepts about gene expression patterns and to query and compose working hypotheses. More recently, these applications are being supplemented with more specialized products offering visualization and specific data mining tools. We present a curated gene family-oriented gene expression database, Arabidopsis Gene Family Profiler (aGFP; http://agfp.ueb.cas.cz), which gives the user access to a large collection of normalised Affymetrix ATH1 microarray datasets. The database currently contains NASC Array and AtGenExpress transcriptomic datasets for various tissues at different developmental stages of wild type plants gathered from nearly 350 gene chips. The Arabidopsis GFP database has been designed as an easy-to-use tool for users needing an easily accessible resource for expression data of single genes, pre-defined gene families or custom gene sets, with the further possibility of keyword search. Arabidopsis Gene Family Profiler presents a user-friendly web interface using both graphic and text output. Data are stored at the MySQL server and individual queries are created in PHP script. The most distinguishable features of Arabidopsis Gene Family Profiler database are: 1) the presentation of normalized datasets (Affymetrix MAS algorithm and calculation of model-based gene-expression values based on the Perfect Match-only model); 2) the choice between two different normalization algorithms (Affymetrix MAS4 or MAS5 algorithms); 3) an intuitive interface; 4) an interactive "virtual plant" visualizing the spatial and developmental expression profiles of both gene families and individual genes. Arabidopsis GFP gives users the possibility to analyze current Arabidopsis developmental transcriptomic data starting with simple global queries that can be expanded and further refined to visualize comparative and highly selective gene expression profiles.

  8. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    PubMed Central

    Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA

    2008-01-01

    Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data management and analysis tasks through a single easy-to-use web application. The system architecture is flexible and scalable to allow new array types, analysis algorithms and tools to be added with relative ease and to cope with large increases in data volume. PMID:19032776

  9. Recommending images of user interests from the biomedical literature

    NASA Astrophysics Data System (ADS)

    Clukey, Steven; Xu, Songhua

    2013-03-01

    Every year hundreds of thousands of biomedical images are published in journals and conferences. Consequently, finding images relevant to one's interests becomes an ever daunting task. This vast amount of literature creates a need for intelligent and easy-to-use tools that can help researchers effectively navigate through the content corpus and conveniently locate materials of their interests. Traditionally, literature search tools allow users to query content using topic keywords. However, manual query composition is often time and energy consuming. A better system would be one that can automatically deliver relevant content to a researcher without having the end user manually manifest one's search intent and interests via search queries. Such a computer-aided assistance for information access can be provided by a system that first determines a researcher's interests automatically and then recommends images relevant to the person's interests accordingly. The technology can greatly improve a researcher's ability to stay up to date in their fields of study by allowing them to efficiently browse images and documents matching their needs and interests among the vast amount of the biomedical literature. A prototype system implementation of the technology can be accessed via http://www.smartdataware.com.

  10. The Climate-G Portal: a Grid Enabled Scientific Gateway for Climate Change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Negro, Alessandro; Aloisio, Giovanni

    2010-05-01

    Grid portals are web gateways that aim to conceal the underlying infrastructure through pervasive, transparent, user-friendly, ubiquitous and seamless access to heterogeneous and geographically distributed resources (i.e. storage, computational facilities, services, sensors, networks, databases). Ultimately, they provide an enhanced problem-solving environment able to deal with modern, large-scale scientific and engineering problems. Scientific gateways can revolutionize the way scientists and researchers organize and carry out their activities. Access to distributed resources, complex workflow capabilities, and community-oriented functionalities are just some of the features that can be provided by such a web-based environment. In the context of the EGEE NA4 Earth Science Cluster, Climate-G is a distributed testbed focusing on climate change research topics. The Euro-Mediterranean Center for Climate Change (CMCC) is actively participating in the testbed, providing the scientific gateway (Climate-G Portal) to access the entire infrastructure. The Climate-G Portal must address important and critical challenges and satisfy key requirements; the most relevant ones are presented and discussed in the following. Transparency: the portal has to provide transparent access to the underlying infrastructure, shielding users from low-level details and the complexity of a distributed grid environment. Security: users must be authenticated and authorized on the portal to access and exploit portal functionalities. A wide set of roles is needed to clearly assign the proper one to each user. Access to the computational grid must be completely secured, since the target infrastructure for running jobs is a production grid environment; a security infrastructure (based on X509v3 digital certificates) is strongly needed. Pervasiveness and ubiquity: access to the system must be pervasive and ubiquitous, which follows naturally from the web-based approach. Usability and simplicity: the portal has to provide simple, high-level and user-friendly interfaces to ease access to and exploitation of the entire system. Coexistence of general-purpose and domain-oriented services: along with general-purpose services (file transfer, job submission, etc.), the portal has to provide domain-based services and functionalities; subsetting of data, visualization of 2D maps around a virtual globe, and delivery of maps through OGC-compliant interfaces (i.e. Web Map Service - WMS) are just some examples. Since April 2009, about 70 users (85% from the climate change community) have been granted access to the portal. A key challenge of this work is the idea of providing users with an integrated working environment, that is, a place where scientists can find huge amounts of data, complete metadata support, a wide set of data access services, data visualization and analysis tools, easy access to the underlying grid infrastructure, and advanced monitoring interfaces.

  11. Design strategies and functionality of the Visual Interface for Virtual Interaction Development (VIVID) tool

    NASA Technical Reports Server (NTRS)

    Nguyen, Lac; Kenney, Patrick J.

    1993-01-01

    Development of interactive virtual environments (VE) has typically consisted of three primary activities: model (object) development, model relationship tree development, and environment behavior definition and coding. The model and relationship tree development activities are accomplished with a variety of well-established graphic library (GL) based programs - most utilizing graphical user interfaces (GUI) with point-and-click interactions. Because of this GUI format, little programming expertise on the part of the developer is necessary to create the 3D graphical models or to establish interrelationships between the models. However, the third VE development activity, environment behavior definition and coding, has generally required the greatest amount of time and programmer expertise. Behaviors, characteristics, and interactions between objects and the user within a VE must be defined via command-line C coding prior to rendering the environment scenes. In an effort to simplify this environment behavior definition phase for non-programmers, and to provide easy access to model and tree tools, a graphical interface and development tool has been created. The principal thrust of this research is to effect rapid development and prototyping of virtual environments. This presentation will discuss the 'Visual Interface for Virtual Interaction Development' (VIVID) tool, an X-Windows-based system employing drop-down menus for user selection of program access, models and trees, behavior editing, and code generation. Examples of these selections will be highlighted in this presentation, as will the currently available program interfaces. The functionality of this tool allows non-programming users access to all facets of VE development while providing experienced programmers with a collection of pre-coded behaviors. In conjunction with its existing interfaces and predefined suite of behaviors, future development plans for VIVID will be described. These include incorporation of dual-user virtual environment enhancements, tool expansion, and additional behaviors.

  12. A Study of Practical Proxy Reencryption with a Keyword Search Scheme considering Cloud Storage Structure

    PubMed Central

    Lee, Im-Yeong

    2014-01-01

    Data outsourcing services have emerged with the increasing use of digital information. They can be used to store data from various devices via networks that are easy to access. Unlike existing removable storage systems, storage outsourcing is available to many users because it has no storage limit and does not require a local storage medium. However, the reliability of storage outsourcing has become an important topic because many users employ it to store large volumes of data. To protect against unethical administrators and attackers, a variety of cryptography systems are used, such as searchable encryption and proxy reencryption. However, existing searchable encryption technology is inconvenient for use in storage outsourcing environments where users upload their data to be shared with others as necessary. In addition, some existing schemes are vulnerable to collusion attacks and have computing cost inefficiencies. In this paper, we analyze existing proxy re-encryption with keyword search. PMID:24693240

  13. A study of practical proxy reencryption with a keyword search scheme considering cloud storage structure.

    PubMed

    Lee, Sun-Ho; Lee, Im-Yeong

    2014-01-01

    Data outsourcing services have emerged with the increasing use of digital information. They can be used to store data from various devices via networks that are easy to access. Unlike existing removable storage systems, storage outsourcing is available to many users because it has no storage limit and does not require a local storage medium. However, the reliability of storage outsourcing has become an important topic because many users employ it to store large volumes of data. To protect against unethical administrators and attackers, a variety of cryptography systems are used, such as searchable encryption and proxy reencryption. However, existing searchable encryption technology is inconvenient for use in storage outsourcing environments where users upload their data to be shared with others as necessary. In addition, some existing schemes are vulnerable to collusion attacks and have computing cost inefficiencies. In this paper, we analyze existing proxy re-encryption with keyword search.

  14. User Driven Development of Software Tools for Open Data Discovery and Exploration

    NASA Astrophysics Data System (ADS)

    Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa

    2016-04-01

    The use of open data in research faces challenges not restricted to inherent properties such as the quality and resolution of open data sets. Open data is often insufficiently catalogued or fragmented. Software tools that support effective discovery, including assessment of the data's appropriateness for research, have shortcomings such as the lack of essential functionality like support for data provenance. We believe that one of the reasons is the neglect of real end-user requirements in the development process of the aforementioned software tools. In the context of the FP7 Switch-On project we have pro-actively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.

  15. Comparative analysis of data base management systems

    NASA Technical Reports Server (NTRS)

    Smith, R.

    1983-01-01

    A study to determine if the Remote File Inquiry (RFI) system would handle the future requirements of the user community is discussed. RFI is a locally written and locally maintained on-line query/update package. The current and future on-line requirements of the user community were studied. Additional consideration was given to the types of data structuring the users required. The survey indicated the features of greatest benefit were: sort, subtotals, totals, record selection, storage of queries, global updating, and the ability to page break. The major deficiencies were: one level of hierarchy, excessive response time, software unreliability, difficulty in adding, deleting, and modifying records, complicated error messages, and the lack of ability to perform interfield comparisons. Missing features users required were: formatted screens, interfield comparisons, interfield arithmetic, multiple file access, security, and data integrity. The survey team recommended Kennedy Space Center move forward to state-of-the-art software: a Data Base Management System which is thoroughly tested and easy to implement and use.

  16. Access to the NCAR Research Data Archive via the Globus Data Transfer Service

    NASA Astrophysics Data System (ADS)

    Cram, T.; Schuster, D.; Ji, Z.; Worley, S. J.

    2014-12-01

    The NCAR Research Data Archive (RDA; http://rda.ucar.edu) contains a large and diverse collection of meteorological and oceanographic observations, operational and reanalysis outputs, and remote sensing datasets to support atmospheric and geoscience research. The RDA contains more than 600 dataset collections which support the varying needs of a diverse user community. The number of RDA users is increasing annually, and the most popular method used to access the RDA data holdings is through web-based protocols, such as wget and cURL based scripts. In the year 2013, 10,000 unique users downloaded more than 820 terabytes of data from the RDA, and customized data products were prepared for more than 29,000 user-driven requests. To further support this increase in web download usage, the RDA is implementing the Globus data transfer service (www.globus.org) to provide a GridFTP data transfer option for the user community. The Globus service is broadly scalable, has an easy-to-install client, is sustainably supported, and provides a robust, efficient, and reliable data transfer option for RDA users. This paper highlights the main functionality and usefulness of the Globus data transfer service for accessing the RDA holdings. The Globus data transfer service, developed and supported by the Computation Institute at The University of Chicago and Argonne National Laboratory, uses GridFTP as a fast, secure, and reliable method for transferring data between two endpoints. A Globus user account is required to use this service, and data transfer endpoints are defined on the Globus web interface. In the RDA use cases, the access endpoint is created on the RDA data server at NCAR. The data user defines the receiving endpoint for the data transfer, which can be the main file system at a host institution, a personal workstation, or a laptop. Once initiated, the data transfer runs as an unattended background process managed by Globus, and Globus ensures that the transfer is accurately fulfilled. Users can monitor the data transfer progress on the Globus web interface and optionally receive an email notification once it is complete. Globus also provides a command-line interface to support scripted transfers, which can be useful when embedded in data processing workflows.
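
    For users who prefer to script such transfers, the general pattern with the Globus Python SDK looks roughly like the sketch below; the endpoint UUIDs, file paths, and token handling are placeholders, and the RDA's actual endpoint identifiers must be taken from its documentation or the Globus web interface.

    ```python
    # Sketch of a scripted transfer with the Globus Python SDK (globus_sdk).
    # Endpoint UUIDs, paths, and the access token are placeholders; the RDA's
    # actual endpoint must be looked up in its documentation or on globus.org.
    import globus_sdk

    TRANSFER_TOKEN = "..."                                   # from a Globus OAuth2 flow
    RDA_ENDPOINT = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"    # placeholder UUID
    MY_ENDPOINT = "yyyyyyyy-yyyy-yyyy-yyyy-yyyyyyyyyyyy"     # e.g. Globus Connect Personal

    tc = globus_sdk.TransferClient(
        authorizer=globus_sdk.AccessTokenAuthorizer(TRANSFER_TOKEN)
    )

    tdata = globus_sdk.TransferData(tc, RDA_ENDPOINT, MY_ENDPOINT,
                                    label="RDA subset download")
    tdata.add_item("/path/on/rda/file.grb", "/~/downloads/file.grb")  # placeholder paths

    task = tc.submit_transfer(tdata)      # Globus runs the transfer unattended
    print("submitted Globus task:", task["task_id"])
    ```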

  17. The CEBAF Element Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theodore Larrieu, Christopher Slominski, Michele Joyce

    2011-03-01

    With the inauguration of the CEBAF Element Database (CED) in Fall 2010, Jefferson Lab computer scientists have taken a step toward the eventual goal of a model-driven accelerator. Once fully populated, the database will be the primary repository of information used for everything from generating lattice decks to booting control computers to building controls screens. A requirement influencing the CED design is that it provide access to not only present, but also future and past configurations of the accelerator. To accomplish this, an introspective database schema was designed that allows new elements, types, and properties to be defined on-the-fly with no changes to table structure. Used in conjunction with Oracle Workspace Manager, it allows users to query data from any time in the database history with the same tools used to query the present configuration. Users can also check out workspaces to use as staging areas for upcoming machine configurations. All access to the CED is through a well-documented Application Programming Interface (API) that is translated automatically from the original C++ source code into native libraries for scripting languages such as Perl, PHP, and Tcl, making access to the CED easy and ubiquitous.

  18. Cooperative data dissemination to mission sites

    NASA Astrophysics Data System (ADS)

    Chen, Fangfei; Johnson, Matthew P.; Bar-Noy, Amotz; La Porta, Thomas F.

    2010-04-01

    Timely dissemination of information to mobile users is vital in many applications. In a critical situation, no network infrastructure may be available for use in dissemination beyond the on-board storage capability of the mobile users themselves. We consider the following specialized content distribution application: a group of users equipped with wireless devices build an ad hoc network in order to cooperatively retrieve information from certain regions (the mission sites). Each user requires access to some set of information items originating from sources lying within a region, and desires low-latency access to its desired data items upon request (i.e., when pulled). In order to minimize average response time, we allow users to pull data either directly from sources or, when possible, from other nearby users who have already pulled, and continue to carry, the desired data items. That is, we allow data to be pushed to one user and then pulled by one or more additional users. The total latency experienced by a user vis-à-vis a certain data item is then, in general, a combination of the push delay and the pull delay. We assume each delay is a function of the hop distance between the pair of points in question. Our goal in this paper is to assign data to mobile users in order to minimize the total cost and the average latency experienced by all the users. In a static setting, we solve this problem under two different schemes, one of which is easy to solve but wasteful, and one of which relates to NP-hard problems but is less wasteful. In a dynamic setting, we adapt the algorithm for the static setting and develop a new algorithm that handles users' gradual arrival. Finally, we show that a trade-off can be made between minimizing cost and minimizing latency.
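
    The paper's assignment algorithms are not reproduced here, but the latency accounting it describes, a push delay plus a pull delay, each a function of hop distance, can be illustrated with a toy calculation; the linear per-hop delay model and all numbers below are invented.

    ```python
    # Toy illustration of the latency accounting described above (not the
    # paper's algorithm): the latency a user sees for an item is a push delay
    # (source to carrier) plus a pull delay (carrier to user), both modeled
    # here as linear functions of hop distance. All numbers are invented.
    PER_HOP_DELAY = 1.0   # assumed delay per hop, arbitrary time units

    def push_delay(hops_source_to_carrier):
        return PER_HOP_DELAY * hops_source_to_carrier

    def pull_delay(hops_carrier_to_user):
        return PER_HOP_DELAY * hops_carrier_to_user

    def average_latency(assignments):
        """assignments: list of (hops source->carrier, hops carrier->user)."""
        total = sum(push_delay(a) + pull_delay(b) for a, b in assignments)
        return total / len(assignments)

    # Three users pulling one item: the first pulls directly from the source
    # (no push leg), the other two pull from a nearby carrier.
    print(average_latency([(0, 4), (2, 1), (2, 2)]))
    ```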

  19. Accessing Suicide-Related Information on the Internet: A Retrospective Observational Study of Search Behavior

    PubMed Central

    2013-01-01

    Background The Internet’s potential impact on suicide is of major public health interest as easy online access to pro-suicide information or specific suicide methods may increase suicide risk among vulnerable Internet users. Little is known, however, about users’ actual searching and browsing behaviors of online suicide-related information. Objective To investigate what webpages people actually clicked on after searching with suicide-related queries on a search engine and to examine what queries people used to get access to pro-suicide websites. Methods A retrospective observational study was done. We used a web search dataset released by America Online (AOL). The dataset was randomly sampled from all AOL subscribers’ web queries between March and May 2006 and generated by 657,000 service subscribers. Results We found 5526 search queries (0.026%, 5526/21,000,000) that included the keyword "suicide". The 5526 search queries included 1586 different search terms and were generated by 1625 unique subscribers (0.25%, 1625/657,000). Of these queries, 61.38% (3392/5526) were followed by users clicking on a search result. Of these 3392 queries, 1344 (39.62%) webpages were clicked on by 930 unique users but only 1314 of those webpages were accessible during the study period. Each clicked-through webpage was classified into 11 categories. The categories of the most visited webpages were: entertainment (30.13%; 396/1314), scientific information (18.31%; 240/1314), and community resources (14.53%; 191/1314). Among the 1314 accessed webpages, we could identify only two pro-suicide websites. We found that the search terms used to access these sites included “commiting suicide with a gas oven”, “hairless goat”, “pictures of murder by strangulation”, and “photo of a severe burn”. A limitation of our study is that the database may be dated and confined to mainly English webpages. Conclusions Searching or browsing suicide-related or pro-suicide webpages was uncommon, although a small group of users did access websites that contain detailed suicide method information. PMID:23305632

  20. MOLEonline 2.0: interactive web-based analysis of biomacromolecular channels.

    PubMed

    Berka, Karel; Hanák, Ondrej; Sehnal, David; Banás, Pavel; Navrátilová, Veronika; Jaiswal, Deepti; Ionescu, Crina-Maria; Svobodová Vareková, Radka; Koca, Jaroslav; Otyepka, Michal

    2012-07-01

    Biomolecular channels play important roles in many biological systems, e.g. enzymes, ribosomes and ion channels. This article introduces a web-based interactive MOLEonline 2.0 application for the analysis of access/egress paths to interior molecular voids. MOLEonline 2.0 enables platform-independent, easy-to-use and interactive analyses of (bio)macromolecular channels, tunnels and pores. Results are presented in a clear manner, making their interpretation easy. For each channel, MOLEonline displays a 3D graphical representation of the channel, its profile accompanied by a list of lining residues and also its basic physicochemical properties. The users can tune advanced parameters when performing a channel search to direct the search according to their needs. The MOLEonline 2.0 application is freely available via the Internet at http://ncbr.muni.cz/mole or http://mole.upol.cz.

  1. Data Archive and Portal Thrust Area Strategy Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sivaraman, Chitra; Stephan, Eric G.; Macduff, Matt C.

    2014-09-01

    This report describes the Data Archive and Portal (DAP), a key capability of the U.S. Department of Energy's Atmosphere to Electron (A2e) initiative. The DAP Thrust Area Planning Group was organized to develop a plan for deploying this capability. Primarily, the report focuses on a distributed system--a DOE Wind Cloud--that functions as a repository for all A2e data. The Wind Cloud will be accessible via an open, easy-to-navigate user interface that facilitates community data access, interaction, and collaboration. DAP management will work with the community, industry, and international standards bodies to develop standards for wind data and to capture important characteristics of all data in the Wind Cloud.

  2. WebSat--a web software for microsatellite marker development.

    PubMed

    Martins, Wellington Santos; Lucas, Divino César Soares; Neves, Kelligton Fabricio de Souza; Bertioli, David John

    2009-01-01

    Simple sequence repeats (SSR), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed a simple to use web software, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet, requiring no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, visualization of microsatellites and the design of primers suitable for their amplification. The program allows full control of parameters and the easy export of the resulting data, thus facilitating the development of microsatellite markers. The web tool may be accessed at http://purl.oclc.org/NET/websat/
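
    WebSat's own prediction code is not shown in this abstract; as a generic illustration of the underlying task, the sketch below finds candidate microsatellites with a back-referencing regular expression. The motif lengths and minimum repeat counts are assumed thresholds, not WebSat's parameters.

    ```python
    # Generic illustration of microsatellite (SSR) detection, the task WebSat
    # supports (this is not WebSat's algorithm): find 1-6 bp motifs repeated
    # at least a threshold number of times using a back-referencing regex.
    import re

    MIN_REPEATS = {1: 10, 2: 6, 3: 5, 4: 5, 5: 5, 6: 5}   # assumed thresholds

    def find_ssrs(sequence):
        ssrs = []
        for motif_len, min_rep in MIN_REPEATS.items():
            pattern = re.compile(r"(([ACGT]{%d})\2{%d,})" % (motif_len, min_rep - 1))
            for match in pattern.finditer(sequence.upper()):
                repeats = len(match.group(1)) // motif_len
                ssrs.append((match.start(), match.group(2), repeats))
        return ssrs  # list of (position, motif, repeat count)

    seq = "ATATATATATATATGCGCGCGCGCGCTTACGACGACGACGACG"
    for pos, motif, count in find_ssrs(seq):
        print(f"pos {pos}: ({motif}) x {count}")
    ```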

  3. Interactive Tree Of Life v2: online annotation and display of phylogenetic trees made easy.

    PubMed

    Letunic, Ivica; Bork, Peer

    2011-07-01

    Interactive Tree Of Life (http://itol.embl.de) is a web-based tool for the display, manipulation and annotation of phylogenetic trees. It is freely available and open to everyone. In addition to classical tree viewer functions, iTOL offers many novel ways of annotating trees with various additional data. Current version introduces numerous new features and greatly expands the number of supported data set types. Trees can be interactively manipulated and edited. A free personal account system is available, providing management and sharing of trees in user defined workspaces and projects. Export to various bitmap and vector graphics formats is supported. Batch access interface is available for programmatic access or inclusion of interactive trees into other web services.

  4. The CloudBoard Research Platform: an interactive whiteboard for corporate users

    NASA Astrophysics Data System (ADS)

    Barrus, John; Schwartz, Edward L.

    2013-03-01

    Over one million interactive whiteboards (IWBs) are sold annually worldwide, predominantly for classroom use with few sales for corporate use. Unmet needs for IWB corporate use were investigated and the CloudBoard Research Platform (CBRP) was developed to investigate and test technology for meeting these needs. The CBRP supports audio conferencing with shared remote drawing activity, casual capture of whiteboard activity for long-term storage and retrieval, use of standard formats such as PDF for easy import of documents via the web and email, and easy export of documents. Company RFID badges and key fobs provide secure access to documents at the board, and automatic logout occurs after a period of inactivity. Users manage their documents with a web browser. Analytics and remote device management are provided for administrators. The IWB hardware consists of off-the-shelf components (a Hitachi UST Projector, SMART Technologies, Inc. IWB hardware, Mac Mini, Polycom speakerphone, etc.) and a custom occupancy sensor. The three back-end servers provide the web interface, document storage, and stroke and audio streaming. Ease of use, security, and robustness sufficient for internal adoption were achieved. Five of the 10 boards installed at various Ricoh sites have been in daily or weekly use for the past year, and total system downtime was less than an hour in 2012. Since CBRP was installed, 65 registered users, 9 of whom use the system regularly, have created over 2600 documents.

  5. A cloud based brokering framework to support hydrology at global scale

    NASA Astrophysics Data System (ADS)

    Boldrini, E.; Pecora, S.; Bordini, F.; Nativi, S.

    2016-12-01

    This work presents the hydrology broker designed and deployed in the context of a collaboration between the Regional Agency for Environmental Protection in the Italian region of Emilia-Romagna (ARPA-ER) and CNR-IIA (National Research Council of Italy). The hydrology brokering platform eases the task of discovering and accessing hydrological observation data, usually acquired and made available by national agencies by means of a set of heterogeneous services (e.g. CUAHSI HIS servers, OGC services, FTP servers) and formats (e.g. WaterML, O&M, ...). The hydrology broker makes all the already published data available according to one or more of the desired and well-known discovery protocols, access protocols, and formats. As a result, the user is able to search and access the available hydrological data through his or her preferred client (e.g. CUAHSI HydroDesktop, 52North SWE client). It is also easy to build a hydrological web portal on top of the broker using the user-friendly JS API. The hydrology broker has been deployed on the Amazon cloud to ensure scalability and tested in the context of the work of the Commission for Hydrology of WMO in three different scenarios: the La Plata river basin, the Sava river basin and the Arctic-HYCOS project. In each scenario the hydrology broker discovered and accessed heterogeneous data formats (e.g. WaterML 1.0/2.0, proprietary CSV documents) from the heterogeneous services (e.g. CUAHSI HIS servers, FTP services and agency proprietary services) managed by several national agencies and international commissions. The hydrology broker made it possible to present all the available data uniformly through the user's desired service type and format (e.g. an HIS server publishing WaterML 2.0), producing a great improvement in both system interoperability and data exchange. Interoperability tests were also successfully conducted with WMO Information System (WIS) nodes, making it possible for a specific Global Information System Centre (GISC) to gather the available hydrological records as ISO 19115:2007 metadata documents through the OAI-PMH interface exposed by the broker. The framework's flexibility also makes it easy to add other sources, as well as additional published interfaces, in order to cope with future standard requirements of the hydrological community.

  6. EasyKSORD: A Platform of Keyword Search Over Relational Databases

    NASA Astrophysics Data System (ADS)

    Peng, Zhaohui; Li, Jing; Wang, Shan

    Keyword Search Over Relational Databases (KSORD) enables casual users to use keyword queries (a set of keywords) to search relational databases just like searching the Web, without any knowledge of the database schema or any need of writing SQL queries. Based on our previous work, we design and implement a novel KSORD platform named EasyKSORD for users and system administrators to use and manage different KSORD systems in a novel and simple manner. EasyKSORD supports advanced queries, efficient data-graph-based search engines, multiform result presentations, and system logging and analysis. Through EasyKSORD, users can search relational databases easily and read search results conveniently, and system administrators can easily monitor and analyze the operations of KSORD and manage KSORD systems much better.
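
    EasyKSORD's data-graph-based engines are far more sophisticated than plain SQL matching, but the basic idea of answering a keyword query against relational data can be illustrated with a naive LIKE-based scan; the schema and rows below are invented for the example.

    ```python
    # Minimal illustration of keyword search over a relational database (a
    # naive LIKE-based scan, not EasyKSORD's data-graph search engines). The
    # schema and data are invented.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE authors(id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE papers(id INTEGER PRIMARY KEY, title TEXT, author_id INTEGER);
        INSERT INTO authors VALUES (1, 'Peng'), (2, 'Wang');
        INSERT INTO papers VALUES (10, 'Keyword search over databases', 1),
                                  (11, 'Graph algorithms', 2);
    """)

    def keyword_search(keywords):
        """Return (title, author) rows whose text columns contain every keyword."""
        where = " AND ".join("(title LIKE ? OR name LIKE ?)" for _ in keywords)
        params = [p for k in keywords for p in (f"%{k}%", f"%{k}%")]
        sql = ("SELECT papers.title, authors.name FROM papers "
               "JOIN authors ON authors.id = papers.author_id WHERE " + where)
        return conn.execute(sql, params).fetchall()

    print(keyword_search(["keyword", "Peng"]))  # -> [('Keyword search over databases', 'Peng')]
    ```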

  7. National Geothermal Data System: Open Access to Geoscience Data, Maps, and Documents

    NASA Astrophysics Data System (ADS)

    Caudill, C. M.; Richard, S. M.; Musil, L.; Sonnenschein, A.; Good, J.

    2014-12-01

    The U.S. National Geothermal Data System (NGDS) provides free open access to millions of geoscience data records, publications, maps, and reports via distributed web services to propel geothermal research, development, and production. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG), and is compliant with international standards and protocols. NGDS currently serves geoscience information from 60+ data providers in all 50 states. Free and open source software is used in this federated system where data owners maintain control of their data. This interactive online system makes geoscience data easily discoverable, accessible, and interoperable at no cost to users. The dynamic project site http://geothermaldata.org serves as the information source and gateway to the system, allowing data and applications discovery and availability of the system's data feed. It also provides access to NGDS specifications and the free and open source code base (on GitHub), a map-centric and library style search interface, other software applications utilizing NGDS services, NGDS tutorials (via YouTube and USGIN site), and user-created tools and scripts. The user-friendly map-centric web-based application has been created to support finding, visualizing, mapping, and acquisition of data based on topic, location, time, provider, or key words. Geographic datasets visualized through the map interface also allow users to inspect the details of individual GIS data points (e.g. wells, geologic units, etc.). In addition, the interface provides the information necessary for users to access the GIS data from third party software applications such as GoogleEarth, UDig, and ArcGIS. A redistributable, free and open source software package called GINstack (USGIN software stack) was also created to give data providers a simple way to release data using interoperable and shareable standards, upload data and documents, and expose those data as a node in the NGDS or any larger data system through a CSW endpoint. The easy-to-use interface is supported by back-end software including Postgres, GeoServer, and custom CKAN extensions among others.
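
    Programmatic discovery against a CSW endpoint such as the one a GINstack node exposes typically follows the pattern sketched below with the third-party OWSLib library; the endpoint URL and search term are placeholders rather than the live NGDS service.

    ```python
    # Sketch of programmatic discovery against a CSW catalogue endpoint such
    # as the one a GINstack node exposes; the URL and search term are
    # placeholders, and the OWSLib library is assumed to be installed.
    from owslib.csw import CatalogueServiceWeb
    from owslib.fes import PropertyIsLike

    csw = CatalogueServiceWeb("https://example.org/csw")   # placeholder endpoint
    query = PropertyIsLike("csw:AnyText", "%geothermal well%")
    csw.getrecords2(constraints=[query], maxrecords=10)

    for record_id, record in csw.records.items():
        print(record_id, "-", record.title)
    ```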

  8. A Comprehensive Software and Database Management System for Glomerular Filtration Rate Estimation by Radionuclide Plasma Sampling and Serum Creatinine Methods.

    PubMed

    Jha, Ashish Kumar

    2015-01-01

    Glomerular filtration rate (GFR) estimation by the plasma sampling method is considered the gold standard. However, this method is not widely used because of the complex technique and cumbersome calculations, coupled with the lack of user-friendly software. The routinely used serum creatinine method (SrCrM) of GFR estimation also requires the use of online calculators which cannot be used without internet access. We have developed user-friendly software, "GFR estimation software", which gives the options to estimate GFR by the plasma sampling method as well as SrCrM. We have used Microsoft Windows(®) as the operating system, Visual Basic 6.0 as the front end, and Microsoft Access(®) as the database tool to develop this software. We have used Russell's formula for GFR calculation by the plasma sampling method. GFR calculations using serum creatinine have been done using the MIRD, Cockcroft-Gault, Schwartz, and Counahan-Barratt methods. The developed software performs the mathematical calculations correctly and is user-friendly. This software also enables storage and easy retrieval of the raw data, patient information, and calculated GFR for further processing and comparison. This is user-friendly software to calculate the GFR by various plasma sampling methods and blood parameters. This software is also a good system for storing the raw and processed data for future analysis.
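
    As one concrete example of the serum-creatinine estimates listed above, the widely used Cockcroft-Gault creatinine clearance can be written as a standalone function; this illustrates the published formula only and says nothing about the internals of the software described in the abstract.

    ```python
    # One of the serum-creatinine estimates mentioned above: the Cockcroft-
    # Gault creatinine clearance, as a standalone function (illustrates the
    # formula only, not the internals of the "GFR estimation software").
    def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
        """Estimated creatinine clearance in mL/min."""
        crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
        return crcl * 0.85 if female else crcl

    print(round(cockcroft_gault(60, 72, 1.1, female=False), 1))  # ~72.7 mL/min
    ```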

  9. Nutrient Tracking Tool - A user-friendly tool for evaluating the water and air quality and quantity as affected by various agricultural management practices

    NASA Astrophysics Data System (ADS)

    Saleh, A.; Niraula, R.; Gallego, O.; Osei, E.; Kannan, N.

    2017-12-01

    The Nutrient Tracking Tool (NTT) is a user-friendly, web-based computer program that estimates nutrient (nitrogen and phosphorus) and sediment losses from fields managed under a variety of cropping patterns and management practices. The NTT includes a user-friendly web-based interface and is linked to the Agricultural Policy Environmental eXtender (APEX) model. It also accesses USDA-NRCS's Web Soil Survey to obtain field, weather, and soil information. NTT provides producers, government officials, and other users with a fast and efficient method of estimating nutrient, sediment, and atmospheric gas (N2O, CO2, and NH4) losses, as well as crop production, under different conservation practice regimes at the farm level. The information obtained from NTT can help producers determine the most cost-effective conservation practice(s) to reduce nutrient and sediment losses while optimizing crop production. Also, the recent version of NTT (NTTg3) has been developed for countries without access to national databases, such as soils and weather. NTTg3 has also been designed as an easy-to-use APEX interface. NTT is currently being evaluated for trading and other programs in the Chesapeake Bay region and in numerous states in the US. During this presentation, the new capabilities of NTTg3 will be described and demonstrated.

  10. Evaluation of the large scale computing needs of the energy research program and how to meet them. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, B.

    The Energy Research program may be on the verge of abdicating an important role it has traditionally played in the development and use of state-of-the-art computer systems. The lack of easy access to Class VI systems coupled to the easy availability of local, user-friendly systems is conspiring to drive many investigators away from forefront research in computational science and in the use of state-of-the-art computers for more discipline-oriented problem solving. The survey conducted under the auspices of this contract clearly demonstrates a significant suppressed demand for actual Class VI hours totaling the full capacity of one such system. The current usage is about a factor of 15 below this level. There is also a need for about 50% more capacity in the current mini/midi availability. Meeting the needs of the ER community for this level of computing power and capacity is most probably best achieved through the establishment of a central Class VI capability at some site linked through a nationwide network to the various ER laboratories and universities and interfaced with the local user-friendly systems at those remote sites.

  11. User's Guide for the Commercial Modular Aero-Propulsion System Simulation (C-MAPSS)

    NASA Technical Reports Server (NTRS)

    Frederick, Dean K.; DeCastro, Jonathan A.; Litt, Jonathan S.

    2007-01-01

    This report is a Users Guide for the NASA-developed Commercial Modular Aero-Propulsion System Simulation (C-MAPSS) software, which is a transient simulation of a large commercial turbofan engine (up to 90,000-lb thrust) with a realistic engine control system. The software supports easy access to health, control, and engine parameters through a graphical user interface (GUI). C-MAPSS provides the user with a graphical turbofan engine simulation environment in which advanced algorithms can be implemented and tested. C-MAPSS can run user-specified transient simulations, and it can generate state-space linear models of the nonlinear engine model at an operating point. The code has a number of GUI screens that allow point-and-click operation, and have editable fields for user-specified input. The software includes an atmospheric model which allows simulation of engine operation at altitudes from sea level to 40,000 ft, Mach numbers from 0 to 0.90, and ambient temperatures from -60 to 103 F. The package also includes a power-management system that allows the engine to be operated over a wide range of thrust levels throughout the full range of flight conditions.
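
    As a rough illustration of what a state-space linear model at an operating point is, the following sketch steps a generic discrete-time linear model; the matrices are placeholders for illustration only, not C-MAPSS output or engine data.

```python
import numpy as np

# Minimal sketch: step response of a generic discrete-time state-space model
# x[k+1] = A x[k] + B u[k], y[k] = C x[k] + D u[k], the kind of linearized model
# that can be extracted at an operating point. Matrices are illustrative only.
A = np.array([[0.95, 0.02], [0.0, 0.90]])
B = np.array([[0.1], [0.05]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

x = np.zeros((2, 1))
u = np.array([[1.0]])                 # unit step in the input
y = []
for _ in range(50):
    y.append((C @ x + D @ u).item())  # record the output sample
    x = A @ x + B @ u                 # advance the state
print(y[:5])                          # first few samples of the linearized response
```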

  12. Cloud Computing with iPlant Atmosphere.

    PubMed

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  13. Collaborative Planetary GIS with JMARS

    NASA Astrophysics Data System (ADS)

    Dickenshied, S.; Christensen, P. R.; Edwards, C. S.; Prashad, L. C.; Anwar, S.; Engle, E.; Noss, D.; Jmars Development Team

    2010-12-01

    Traditional GIS tools have allowed users to work locally with their own datasets in their own computing environment. More recently, data providers have started offering online repositories of preprocessed data which helps minimize the learning curve required to access new datasets. The ideal collaborative GIS tool provides the functionality of a traditional GIS and easy access to preprocessed data repositories while also enabling users to contribute data, analysis, and ideas back into the very tools they're using. JMARS (Java Mission-planning and Analysis for Remote Sensing) is a suite of geospatial applications developed by the Mars Space Flight Facility at Arizona State University. This software is used for mission planning and scientific data analysis by several NASA missions, including Mars Odyssey, Mars Reconnaissance Orbiter, and the Lunar Reconnaissance Orbiter. It is used by scientists, researchers and students of all ages from more than 40 countries around the world. In addition to offering a rich set of global and regional maps and publicly released orbiter images, the JMARS software development team has been working on ways to encourage the creation of collaborative datasets. Bringing together users from diverse teams and backgrounds allows new features to be developed with an interest in making the application useful and accessible to as wide a potential audience as possible. Actively engaging the scientific community in development strategy and hands on tasks allows the creation of user driven data content that would not otherwise be possible. The first community generated dataset to result from this effort is a tool mapping peer-reviewed papers to the locations they relate to on Mars with links to ancillary data. This allows users of JMARS to browse to an area of interest and then quickly locate papers corresponding to that area. Alternately, users can search for published papers over a specified time interval and visually see what areas of Mars have received the most attention over the requested time span.

  14. The first step in using a robot in brain injury rehabilitation: patients' and health-care professionals' perspective.

    PubMed

    Boman, Inga-Lill; Bartfai, Aniko

    2015-01-01

    To evaluate the usability of a mobile telepresence robot (MTR) in a hospital training apartment (HTA). The MTR was manoeuvred remotely and was used for communication when assessing independent living skills, and for security monitoring of cognitively impaired patients. Occupational therapists (OTs) and nurses received training in how to use the MTR. The nurses completed a questionnaire regarding their expectations of using the MTR. OTs and patients staying in the HTA were interviewed about their experiences of the MTR. Interviews and questionnaires were analysed qualitatively. The HTA patients were very satisfied with the MTR. The OTs and nurses reported generally positive experiences. The OTs found that assessment via the MTR was more neutral than being physically present. However, the use of the MTR implied considerable difficulties for health-care professionals. The main obstacle for the nurses was the need for fast and easy access in emergency situations while protecting the patients' integrity. The results indicate that the MTR could be a useful tool to support daily living skills and safety monitoring of HTA patients. However, when designing technology for multiple users, such as health-care professionals, the needs of all users, their routines and the support services involved, should also be considered. Implications for Rehabilitation A mobile telepresence robot (MTR) can be a useful tool for assessments and communication in rehabilitation. The design of the robot has to allow easy use by remote users, particularly in emergency situations. When designing MTRs, the needs of ALL users have to be taken into consideration.

  15. Web-based Tool Suite for Plasmasphere Information Discovery

    NASA Astrophysics Data System (ADS)

    Newman, T. S.; Wang, C.; Gallagher, D. L.

    2005-12-01

    A suite of tools that enable discovery of terrestrial plasmasphere characteristics from NASA IMAGE Extreme Ultraviolet (EUV) images is described. The tool suite is web-accessible, allowing easy remote access without the need for any software installation on the user's computer. The features supported by the tool include reconstruction of the plasmasphere plasma density distribution from a short sequence of EUV images, semi-automated selection of the plasmapause boundary in an EUV image, and mapping of the selected boundary to the geomagnetic equatorial plane. EUV image upload and result download are also supported. The tool suite's plasmapause mapping feature is achieved via the Roelof and Skinner (2000) Edge Algorithm. The plasma density reconstruction is achieved through a tomographic technique that exploits physical constraints to allow for a moderate-resolution result. The tool suite's software architecture uses Java Server Pages (JSP) and Java applets on the front end for user-software interaction and Java servlets on the server side for task execution. The compute-intensive components of the tool suite are implemented in C++ and invoked by the server via the Java Native Interface (JNI).

  16. Granatum: a graphical single-cell RNA-Seq analysis pipeline for genomics scientists.

    PubMed

    Zhu, Xun; Wolfgruber, Thomas K; Tasato, Austin; Arisdakessian, Cédric; Garmire, David G; Garmire, Lana X

    2017-12-05

    Single-cell RNA sequencing (scRNA-Seq) is an increasingly popular platform to study heterogeneity at the single-cell level. Computational methods to process scRNA-Seq data are not very accessible to bench scientists as they require a significant amount of bioinformatic skills. We have developed Granatum, a web-based scRNA-Seq analysis pipeline to make analysis more broadly accessible to researchers. Without a single line of programming code, users can click through the pipeline, setting parameters and visualizing results via the interactive graphical interface. Granatum conveniently walks users through various steps of scRNA-Seq analysis. It has a comprehensive list of modules, including plate merging and batch-effect removal, outlier-sample removal, gene-expression normalization, imputation, gene filtering, cell clustering, differential gene expression analysis, pathway/ontology enrichment analysis, protein network interaction visualization, and pseudo-time cell series construction. Granatum enables broad adoption of scRNA-Seq technology by empowering bench scientists with an easy-to-use graphical interface for scRNA-Seq data analysis. The package is freely available for research use at http://garmiregroup.org/granatum/app.
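
    Granatum itself requires no code, but the steps it walks users through correspond roughly to a scripted workflow like the sketch below, written with the separate scanpy library purely for illustration; the dataset loader and parameter values are illustrative, not Granatum defaults.

```python
import scanpy as sc

# Purely illustrative sketch of the kinds of steps Granatum automates,
# expressed with the separate scanpy library (Granatum itself needs no code).
adata = sc.datasets.pbmc3k()                     # small public example dataset
sc.pp.filter_cells(adata, min_genes=200)         # low-quality/outlier cell removal
sc.pp.filter_genes(adata, min_cells=3)           # gene filtering
sc.pp.normalize_total(adata, target_sum=1e4)     # expression normalization
sc.pp.log1p(adata)
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
sc.tl.pca(adata)
sc.pp.neighbors(adata)
sc.tl.leiden(adata)                              # cell clustering
sc.tl.rank_genes_groups(adata, "leiden")         # differential expression by cluster
print(adata.obs["leiden"].value_counts())
```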

  17. Managing industrial risk--having a tested and proven system to prevent and assess risk.

    PubMed

    Heller, Stephen

    2006-03-17

    Some relatively easy techniques exist to improve the risk picture/profile and to aid in preventing losses. Today, with the advent of computer system resources, focusing on specific aspects of risk through systematic scoring and comparison makes risk analysis relatively easy to achieve. Techniques like these demonstrate how working experience and common sense can be combined mathematically into a flexible risk management tool or risk model for analyzing risk. The risk assessment methodology provided by companies today is no longer the ideas and practices of one group or even one company. It is reflective of the practice of many companies, as well as the ideas and expertise of academia and government regulators. The use of multi-criteria decision making (MCDM) techniques for making critical decisions has been recognized for many years for a variety of purposes. In today's computer age, the easy access to and user-friendly nature of these techniques make them a favorable choice for use in the risk assessment environment. The new user of these methodologies should find many ideas directly applicable to his or her needs when approaching risk decision making. The user should find these ideas readily adapted, with slight modification, to accurately reflect a specific situation using MCDM techniques. This makes them an attractive feature for use in assessment and risk modeling. The main advantage of decision-making techniques such as MCDM is that, in the early stages of a risk assessment, accurate data on industrial risk and failures are lacking. In most cases, the data are still insufficient to perform a thorough risk assessment using purely statistical concepts. The practical advantages of deviating from a strictly data-driven protocol seem to outweigh the drawbacks. Industry failure data often come at a high cost when a loss occurs. We can benefit from this unfortunate acquisition of data through the continuous refining of our decisions by incorporating this new information into our assessments. MCDM techniques offer the flexibility to make comparisons within broad data sets that reflect our best estimate of each factor's contribution to the risk picture. This allows for the accurate determination of the more probable and more consequential issues, which can later be refined using more intensive risk techniques, and for the avoidance of less critical issues.
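
    As a generic illustration of the weighted-scoring style of MCDM described here (not the author's specific risk model; the criteria, weights, and scores below are hypothetical), a simple additive weighting scheme might look like this:

```python
# Minimal sketch of a weighted-sum (simple additive weighting) MCDM score.
# Criteria weights and expert scores are hypothetical placeholders.
weights = {"likelihood": 0.4, "consequence": 0.4, "detectability": 0.2}

scenarios = {
    "pump seal failure": {"likelihood": 4, "consequence": 3, "detectability": 2},
    "tank overfill":     {"likelihood": 2, "consequence": 5, "detectability": 3},
}

def risk_score(scores, weights):
    """Combine expert scores (e.g., 1-5 scales) into one weighted index."""
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(scenarios.items(), key=lambda kv: risk_score(kv[1], weights), reverse=True)
for name, scores in ranked:
    print(f"{name}: {risk_score(scores, weights):.2f}")
```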

  18. Non-Orthogonal Random Access in MIMO Cognitive Radio Networks: Beamforming, Power Allocation, and Opportunistic Transmission

    PubMed Central

    Lin, Huifa; Shin, Won-Yong

    2017-01-01

    We study secondary random access in multi-input multi-output cognitive radio networks, where a slotted ALOHA-type protocol and successive interference cancellation are used. We first introduce three types of transmit beamforming performed by secondary users, where multiple antennas are used to suppress the interference at the primary base station and/or to increase the received signal power at the secondary base station. Then, we show a simple decentralized power allocation along with the equivalent single-antenna conversion. To exploit the multiuser diversity gain, an opportunistic transmission protocol is proposed, where the secondary users generating less interference are opportunistically selected, resulting in a further reduction of the interference temperature. The proposed methods are validated via computer simulations. Numerical results show that increasing the number of transmit antennas can greatly reduce the interference temperature, while increasing the number of receive antennas leads to a reduction of the total transmit power. Optimal parameter values of the opportunistic transmission protocol are examined according to three types of beamforming and different antenna configurations, in terms of maximizing the cognitive transmission capacity. All the beamforming, decentralized power allocation, and opportunistic transmission protocol are performed by the secondary users in a decentralized manner, thus resulting in an easy implementation in practice. PMID:28076402
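
    A minimal numerical sketch of the null-space beamforming idea (not the paper's exact scheme; the channels below are random placeholders) could look like the following:

```python
import numpy as np

# Sketch of null-space transmit beamforming: a secondary transmitter with M
# antennas picks a beam in the null space of its channel to the primary base
# station, so the primary sees (ideally) zero interference, and points the
# remaining degrees of freedom at the secondary base station.
rng = np.random.default_rng(0)
M = 4
h_sp = rng.standard_normal(M) + 1j * rng.standard_normal(M)  # secondary Tx -> primary BS
h_ss = rng.standard_normal(M) + 1j * rng.standard_normal(M)  # secondary Tx -> secondary BS

# Orthonormal basis for the null space of h_sp (received signal modeled as h @ w)
_, _, vh = np.linalg.svd(h_sp.reshape(1, -1))
null_basis = vh[1:].conj().T                  # shape (M, M-1); h_sp @ column = 0

# Within the null space, maximize the gain toward the secondary BS
w = null_basis @ (null_basis.conj().T @ h_ss.conj())
w /= np.linalg.norm(w)

print("interference at primary BS:", abs(h_sp @ w))   # numerically ~0
print("gain toward secondary BS:  ", abs(h_ss @ w))
```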

  19. Non-Orthogonal Random Access in MIMO Cognitive Radio Networks: Beamforming, Power Allocation, and Opportunistic Transmission.

    PubMed

    Lin, Huifa; Shin, Won-Yong

    2017-01-01

    We study secondary random access in multi-input multi-output cognitive radio networks, where a slotted ALOHA-type protocol and successive interference cancellation are used. We first introduce three types of transmit beamforming performed by secondary users, where multiple antennas are used to suppress the interference at the primary base station and/or to increase the received signal power at the secondary base station. Then, we show a simple decentralized power allocation along with the equivalent single-antenna conversion. To exploit the multiuser diversity gain, an opportunistic transmission protocol is proposed, where the secondary users generating less interference are opportunistically selected, resulting in a further reduction of the interference temperature. The proposed methods are validated via computer simulations. Numerical results show that increasing the number of transmit antennas can greatly reduce the interference temperature, while increasing the number of receive antennas leads to a reduction of the total transmit power. Optimal parameter values of the opportunistic transmission protocol are examined according to three types of beamforming and different antenna configurations, in terms of maximizing the cognitive transmission capacity. All the beamforming, decentralized power allocation, and opportunistic transmission protocol are performed by the secondary users in a decentralized manner, thus resulting in an easy implementation in practice.

  20. EnviroAtlas - Austin, TX - Park Access by Block Group

    EPA Pesticide Factsheets

    This EnviroAtlas dataset shows the block group population that is within and beyond an easy walking distance (500m) of a park entrance. Park entrances were included in this analysis if they were within 5km of the EnviroAtlas community boundary. This dataset was produced by the US EPA to support research and online mapping activities related to EnviroAtlas. EnviroAtlas (https://www.epa.gov/enviroatlas) allows the user to interact with a web-based, easy-to-use, mapping application to view and analyze multiple ecosystem services for the contiguous United States. The dataset is available as downloadable data (https://edg.epa.gov/data/Public/ORD/EnviroAtlas) or as an EnviroAtlas map service. Additional descriptive information about each attribute in this dataset can be found in its associated EnviroAtlas Fact Sheet (https://www.epa.gov/enviroatlas/enviroatlas-fact-sheets).

  1. Fast massive preventive security and information communication systems

    NASA Astrophysics Data System (ADS)

    Akopian, David; Chen, Philip; Miryakar, Susheel; Kumar, Abhinav

    2008-04-01

    We present a fast massive information communication system for data collection from distributive sources such as cell phone users. As a very important application one can mention preventive notification systems when timely notification and evidence communication may help to improve safety and security through wide public involvement by ensuring easy-to-access and easy-to-communicate information systems. The technology significantly simplifies the response to the events and will help e.g. special agencies to gather crucial information in time and respond as quickly as possible. Cellular phones are nowadays affordable for most of the residents and became a common personal accessory. The paper describes several ways to design such systems including existing internet access capabilities of cell phones or downloadable specialized software. We provide examples of such designs. The main idea is in structuring information in predetermined way and communicating data through a centralized gate-server which will automatically process information and forward it to a proper destination. The gate-server eliminates a need in knowing contact data and specific local community infrastructure. All the cell phones will have self-localizing capability according to FCC E911 mandate, thus the communicated information can be further tagged automatically by location and time information.

  2. Structure and software tools of AIDA.

    PubMed

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow for fast development and easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates terminal independent and is even to a great extent multi-lingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build a faster, but (of course) less flexible code from these table definitions. By separating the AIDA software in a source and a run-time version, one is able to write implementation-specific code which can be selected and loaded by a special source loader, being part of the AIDA software. This feature is also accessible for maintaining software on different sites and on different installations.

  3. WebSat ‐ A web software for microsatellite marker development

    PubMed Central

    Martins, Wellington Santos; Soares Lucas, Divino César; de Souza Neves, Kelligton Fabricio; Bertioli, David John

    2009-01-01

    Simple sequence repeats (SSR), also known as microsatellites, have been extensively used as molecular markers due to their abundance and high degree of polymorphism. We have developed a simple to use web software, called WebSat, for microsatellite molecular marker prediction and development. WebSat is accessible through the Internet, requiring no program installation. Although a web solution, it makes use of Ajax techniques, providing a rich, responsive user interface. WebSat allows the submission of sequences, visualization of microsatellites and the design of primers suitable for their amplification. The program allows full control of parameters and the easy export of the resulting data, thus facilitating the development of microsatellite markers. Availability The web tool may be accessed at http://purl.oclc.org/NET/websat/ PMID:19255650
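
    The underlying tandem-repeat search can be illustrated with a simple backreference regex; this is a generic sketch of microsatellite detection, not WebSat's actual algorithm or default parameters.

```python
import re

# Generic SSR (microsatellite) finder: perfect tandem repeats of 1-6 bp motifs
# repeated at least `min_repeats` times. Parameters are illustrative only.
def find_ssrs(seq, min_motif=1, max_motif=6, min_repeats=4):
    """Yield (start, motif, repeat_count) for perfect tandem repeats."""
    pattern = re.compile(r"(([ACGT]{%d,%d}?)\2{%d,})" % (min_motif, max_motif, min_repeats - 1))
    for m in pattern.finditer(seq):
        motif = m.group(2)
        yield m.start(), motif, len(m.group(1)) // len(motif)

seq = "TTGACACACACACAGGTAGATAGATAGATAGATTC"
for start, motif, count in find_ssrs(seq):
    print(start, motif, count)   # e.g. (3, 'AC', 5) and (17, 'AGAT', 4)
```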

  4. Exploring accessibility issues of a public building for the mobility impaired. Case study: interstate bus terminal (ISBT), Vijayawada, India.

    PubMed

    Alagappan, Valliappan; Hefferan, Albert; Parivallal, Aarthi

    2018-04-01

    The right to access the built environment creates equal and nondiscriminatory opportunities for a person with disabilities to move around freely and interact positively without hindrance or barriers. The objective of the study is to understand the existing accessibility-related issues and the implementation of guidelines and standards for public buildings. Technical verification of the current provision of facilities in the internal and external environment was carried out using onsite and offsite access audit formats prepared in reference to the Central Public Works Department (CPWD) accessibility guidelines for the mobility impaired and elderly and the Americans with Disabilities Act (ADA) guidelines. The access audit format included parameters such as accessibility, safety, security, comfort, and convenience, and it addresses the barriers faced by wheelchair users and people with crutches, prosthetics, and non-assistive devices. The study addressed accessibility compliance in three zones of the building, starting with the parking area, then inside the building, and then the area outside the building premises. The findings highlight the environmental barriers encountered by mobility-impaired people, represented graphically in the layout plan, and the physical effort required to overcome the challenges in the built environment. The overall accessibility compliance of the interstate bus terminal is 42%. Implications for rehabilitation The study identifies the environmental limitations and the human and technological facilitators with the help of the Central Public Works Department (CPWD) and Americans with Disabilities Act (ADA) guidelines (1990). It highlights barriers for mobility-impaired users by demonstrating, in a spatial layout, the means to facilitate easy access with minimal frustration, stress, and physical effort. It demonstrates the need to prepare separate guidelines for making existing types of buildings accessible and disabled-friendly. New accessibility guidelines should be prepared by incorporating concepts such as relative accessibility into new bus terminal buildings. Guidelines help the disabled in the process of rehabilitation and develop inclusiveness rather than alienation.

  5. Personalized summarization using user preference for m-learning

    NASA Astrophysics Data System (ADS)

    Lee, Sihyoung; Yang, Seungji; Ro, Yong Man; Kim, Hyoung Joong

    2008-02-01

    As the Internet and multimedia technology become more advanced, the amount of digital multimedia content in the learning area is also becoming abundant. In order to facilitate access to digital knowledge and to meet the need for lifelong learning, e-learning can be a helpful alternative to conventional learning paradigms. E-learning is known as a unifying term to express online, web-based, and technology-delivered learning. Mobile learning (m-learning) is defined as e-learning through mobile devices using wireless transmission. In a survey, more than half of the respondents remarked that re-consumption was one of the convenient features of e-learning. However, it is not easy to find a user's preferred segments in a full version of lengthy e-learning content. Especially in m-learning, a content-summarization method is strongly required because mobile devices are limited by low processing power and battery capacity. In this paper, we propose a new user preference model for re-consumption to construct personalized summaries. The user preference for re-consumption is modeled from user actions with a statistical model. Based on this model and personalized user actions, our method discriminates preferred parts over the entire content. Experimental results demonstrated successful personalized summarization.

  6. User testing and performance evaluation of the Electronic Quality Improvement Platform for Plans and Pharmacies.

    PubMed

    Pringle, Janice L; Kearney, Shannon M; Grasso, Kim; Boyer, Annette D; Conklin, Mark H; Szymanski, Keith A

    2015-01-01

    To user-test and evaluate a performance information management platform that makes standardized, benchmarked medication use quality data available to both health plans and community pharmacy organizations. Multiple health/drug plans and multiple chain and independent pharmacies across the United States. During the first phase of the study, user experience was measured via user satisfaction surveys and interviews with key personnel (pharmacists, pharmacy leaders, and health plan leadership). Improvements were subsequently made to the platform based on these findings. During the second phase of the study, the platform was implemented in a greater number of pharmacies and by a greater number of payers. User experience was then reevaluated to gather information for further improvements. The surveys and interviews revealed that users found the Web-based platform easy to use and beneficial in terms of understanding and comparing performance metrics. Primary concerns included lack of access to real-time data and patient-specific data. Many users also expressed uncertainty as to how they could use the information and data provided by the platform. The study findings indicate that while information management platforms can be used effectively in both pharmacy and health plan settings, future development is needed to ensure that the provided data can be transferred to pharmacy best practices and improved quality care.

  7. Image processing and pattern recognition with CVIPtools MATLAB toolbox: automatic creation of masks for veterinary thermographic images

    NASA Astrophysics Data System (ADS)

    Mishra, Deependra K.; Umbaugh, Scott E.; Lama, Norsang; Dahal, Rohini; Marino, Dominic J.; Sackman, Joseph

    2016-09-01

    CVIPtools is a software package for the exploration of computer vision and image processing developed in the Computer Vision and Image Processing Laboratory at Southern Illinois University Edwardsville. CVIPtools is available in three variants - a) the CVIPtools Graphical User Interface, b) the CVIPtools C library, and c) the CVIPtools MATLAB toolbox - which makes it accessible to a variety of different users. It offers students, faculty, researchers, and any other user a free and easy way to explore computer vision and image processing techniques. Many functions have been implemented and are updated on a regular basis, and the library has reached a level of sophistication that makes it suitable for both educational and research purposes. In this paper, the detailed list of functions available in the CVIPtools MATLAB toolbox is presented, along with how these functions can be used in image analysis and computer vision applications. The CVIPtools MATLAB toolbox allows the user to gain practical experience to better understand underlying theoretical problems in image processing and pattern recognition. As an example application, the algorithm for the automatic creation of masks for veterinary thermographic images is presented.
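
    A generic illustration of automatic mask creation by thresholding, written in Python rather than with the CVIPtools MATLAB toolbox itself; the synthetic image and the cleanup parameters are placeholders, not the paper's algorithm.

```python
import numpy as np
from skimage import filters, morphology

# Generic sketch: segment a "warm" subject from the background of a thermal-like
# image by Otsu thresholding, then clean the mask with simple morphology.
def make_mask(thermal_image):
    """Return a boolean foreground mask from a 2-D grayscale/thermal array."""
    t = filters.threshold_otsu(thermal_image)           # global Otsu threshold
    mask = thermal_image > t                             # warm pixels = foreground
    mask = morphology.remove_small_objects(mask, 64)     # drop speckle
    mask = morphology.remove_small_holes(mask, 64)       # fill small gaps
    return mask

# Example with a synthetic "warm blob" image
img = np.zeros((128, 128))
img[40:90, 30:100] = 1.0
img += 0.1 * np.random.default_rng(0).standard_normal(img.shape)
print(make_mask(img).sum(), "foreground pixels")
```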

  8. New Searching Capability and OpenURL Linking in the ADS

    NASA Astrophysics Data System (ADS)

    Eichhorn, Guenther; Accomazzi, A.; Grant, C. S.; Henneken, E.; Kurtz, M. J.; Thompson, D. M.; Murray, S. S.

    2006-12-01

    The ADS is the search system of choice for the astronomical community. It also covers a large part of the physics and physics/astronomy education literature. In order to make access to this system as easy as possible, we developed a Google-like version of our search form. This one-field search parses the user input and automatically detects author names and year ranges. Firefox users can set up their browser to have this search installed in the top right corner search box for even easier access to the ADS search capability. The basic search is available from the ADS homepage at: http://adsabs.harvard.edu To aid with access to subscription journals, the ADS now supports OpenURL linking. If your library supports an OpenURL server, you can specify this server in the ADS preference settings. All links to journal articles will then automatically be directed to the OpenURL with the appropriate link information. We provide a selection of known OpenURL servers to choose from. If your server is not in this list, please send the necessary information to ads@cfa.harvard.edu and we will include it in our list. The ADS is funded by NASA grant NNG06GG68G.
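
    A toy sketch of this kind of one-field query parsing (not the actual ADS parser; the patterns and example query are illustrative only) might look like this:

```python
import re

# Toy one-field query parser: pull out a year or year range, treat "Last, F."
# style tokens as author names, and keep the rest as free-text keywords.
def parse_query(q):
    years = re.search(r"\b(\d{4})(?:\s*-\s*(\d{4}))?\b", q)
    year_range = (years.group(1), years.group(2) or years.group(1)) if years else None
    q_wo_years = re.sub(r"\b\d{4}(?:\s*-\s*\d{4})?\b", "", q)
    authors = re.findall(r"\b([A-Z][a-z]+,\s*[A-Z]\.?)", q_wo_years)
    keywords = re.sub(r"\b[A-Z][a-z]+,\s*[A-Z]\.?", "", q_wo_years).split()
    return {"authors": authors, "years": year_range, "keywords": keywords}

print(parse_query("Kurtz, M. accretion disks 2000-2005"))
```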

  9. Glance Information System for ATLAS Management

    NASA Astrophysics Data System (ADS)

    Grael, F. F.; Maidantchik, C.; Évora, L. H. R. A.; Karam, K.; Moraes, L. O. F.; Cirilli, M.; Nessi, M.; Pommès, K.; ATLAS Collaboration

    2011-12-01

    The ATLAS Experiment is an international collaboration in which more than 37 countries, 172 institutes and laboratories, 2900 physicists, engineers, and computer scientists, plus 700 students, participate. The management of this teamwork involves several aspects such as institute contributions, employment records, members' appointments, the authors' list, the preparation and publication of papers, and speaker nominations. Previously, most of the information was accessible only to a limited group, and developers had to face problems such as different terminology, diverse data modeling, heterogeneous databases, and differing user needs. Moreover, the systems were not designed to handle new requirements. Maintenance has to be an easy task because of the experiment's long lifetime and the turnover of professionals. The Glance system, a generic mechanism for accessing any database, acts as an intermediate layer isolating the user from the particularities of each database. It retrieves, inserts, and updates the database independently of its technology and modeling. Relying on Glance, a group of systems was built to support ATLAS management and operation: ATLAS Membership, ATLAS Appointments, ATLAS Speakers, ATLAS Analysis Follow-Up, ATLAS Conference Notes, ATLAS Thesis, ATLAS Traceability, and DSS Alarms Viewer. This paper presents an overview of the Glance information framework and describes the privilege mechanism developed to grant different levels of access to each member and system.

  10. A novel Interactive Health Communication Application (IHCA) for parents of children with long-term conditions: Development, implementation and feasibility assessment.

    PubMed

    Swallow, Veronica; Carolan, Ian; Smith, Trish; Webb, Nicholas J A; Knafl, Kathleen; Santacroce, Sheila; Campbell, Malcolm; Harper-Jones, Melanie; Hanif, Noreen; Hall, Andrew

    2016-01-01

    Few evidence-based, on-line resources exist to support home-based care of childhood long-term conditions. In a feasibility study, children with stages 3, 4, or 5 chronic kidney disease, parents and professionals collaboratively developed a novel Online Parent Information and Support (OPIS) application. Parents were randomized to an intervention arm with access to OPIS or a control arm without access. OPIS usage was assessed using Google Analytics. Parents in the intervention arm completed the Suitability Assessment of Materials (SAM) and User Interface Satisfaction (USE) questionnaires and participated in qualitative interviews. Twenty parents accessed OPIS with a mean of 23.3 (SD 20.8, range 2-64) visits per user. Responses from the SAM and USE questionnaires were positive, most respondents rating OPIS highly and finding it easy to use. Qualitative suggestions include refinement of OPIS components, enabling personalization of OPIS functionalities and proactive endorsements of OPIS by professionals. Implementation of OPIS into standard practice is feasible in the centre where it was developed. Suggested developments will augment reported strengths to inform ongoing testing in the wider UK network of units. Our design and methods are transferrable to developing and evaluating web-applications to support home-based clinical care-giving for other long-term conditions.

  11. Using open-source programs to create a web-based portal for hydrologic information

    NASA Astrophysics Data System (ADS)

    Kim, H.

    2013-12-01

    Some hydrologic data sets, such as basin climatology, precipitation, and terrestrial water storage, are not easily obtainable and distributable due to their size and complexity. We present a Hydrologic Information Portal (HIP) that has been implemented at the University of California Center for Hydrologic Modeling (UCCHM) and that has been organized around the large river basins of North America. This portal can be accessed through a modern web browser, which enables easy access to and visualization of such hydrologic data sets. The main features of our HIP include a set of data visualization capabilities so that users can search, retrieve, analyze, integrate, organize, and map data within large river basins. Recent information technologies such as Google Maps, Tornado (a Python asynchronous web server), NumPy/SciPy (scientific libraries for Python), and d3.js (a visualization library for JavaScript) were incorporated into the HIP to ease navigation of large data sets. With such open source libraries, HIP gives public users a way to combine and explore various data sets by generating multiple chart types (line, bar, pie, scatter plot) directly from the Google Maps viewport. Every rendered object on the viewport, such as a basin shape, is clickable, and this is the first step in accessing the visualization of the data sets.
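
    A minimal sketch of the kind of asynchronous JSON endpoint a Tornado-based portal like this might expose; the route, handler name, and payload below are hypothetical, not part of the UCCHM portal.

```python
import tornado.ioloop
import tornado.web

# Hypothetical JSON endpoint serving per-basin data to a browser front end
# (e.g., for charting in d3.js); route and payload are illustrative only.
class BasinDataHandler(tornado.web.RequestHandler):
    def get(self, basin_id):
        # In a real portal this would query stored precipitation/storage series.
        self.write({"basin": basin_id, "precip_mm": [12.4, 3.1, 0.0]})

def make_app():
    return tornado.web.Application([
        (r"/api/basins/([A-Za-z0-9_-]+)", BasinDataHandler),
    ])

if __name__ == "__main__":
    make_app().listen(8888)
    tornado.ioloop.IOLoop.current().start()
```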

  12. The Gene Set Builder: collation, curation, and distribution of sets of genes

    PubMed Central

    Yusuf, Dimas; Lim, Jonathan S; Wasserman, Wyeth W

    2005-01-01

    Background In bioinformatics and genomics, there are many applications designed to investigate the common properties for a set of genes. Often, these multi-gene analysis tools attempt to reveal sequential, functional, and expressional ties. However, while tremendous effort has been invested in developing tools that can analyze a set of genes, minimal effort has been invested in developing tools that can help researchers compile, store, and annotate gene sets in the first place. As a result, the process of making or accessing a set often involves tedious and time consuming steps such as finding identifiers for each individual gene. These steps are often repeated extensively to shift from one identifier type to another; or to recreate a published set. In this paper, we present a simple online tool which – with the help of the gene catalogs Ensembl and GeneLynx – can help researchers build and annotate sets of genes quickly and easily. Description The Gene Set Builder is a database-driven, web-based tool designed to help researchers compile, store, export, and share sets of genes. This application supports the 17 eukaryotic genomes found in version 32 of the Ensembl database, which includes species from yeast to human. User-created information such as sets and customized annotations are stored to facilitate easy access. Gene sets stored in the system can be "exported" in a variety of output formats – as lists of identifiers, in tables, or as sequences. In addition, gene sets can be "shared" with specific users to facilitate collaborations or fully released to provide access to published results. The application also features a Perl API (Application Programming Interface) for direct connectivity to custom analysis tools. A downloadable Quick Reference guide and an online tutorial are available to help new users learn its functionalities. Conclusion The Gene Set Builder is an Ensembl-facilitated online tool designed to help researchers compile and manage sets of genes in a user-friendly environment. The application can be accessed via . PMID:16371163

  13. A RESTful API for accessing microbial community data for MG-RAST.

    PubMed

    Wilke, Andreas; Bischof, Jared; Harrison, Travis; Brettin, Tom; D'Souza, Mark; Gerlach, Wolfgang; Matthews, Hunter; Paczian, Tobias; Wilkening, Jared; Glass, Elizabeth M; Desai, Narayan; Meyer, Folker

    2015-01-01

    Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programming interface), we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us), we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase's microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. This API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service.
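
    Consuming such a RESTful JSON API from a script typically looks like the sketch below; the base URL, resource path, and accession ID are illustrative assumptions rather than verified MG-RAST endpoints, so consult the official API documentation for the actual routes and parameters.

```python
import requests

# Sketch of calling a RESTful JSON API with standard HTTP tooling.
# The URL, path, and ID below are assumptions for illustration only.
BASE = "https://api.mg-rast.org"               # assumed base URL
resp = requests.get(f"{BASE}/metagenome/mgm4440026.3", params={"verbosity": "minimal"})
resp.raise_for_status()
record = resp.json()                            # the API returns JSON objects
print(record.get("id"), record.get("name"))
```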

  14. Atmospheric Composition Data and Information Services Center (ACDISC)

    NASA Technical Reports Server (NTRS)

    Kempler, S.

    2005-01-01

    NASA's GSFC Earth Sciences (GES) Data and Information Services Center (DISC) manages the archive, distribution, and data access for atmospheric composition data from Aura's OMI, MLS, and, hopefully one day, HIRDLS instruments, as well as heritage datasets from TOMS, UARS, MODIS, and AIRS. These data are currently archived in the GES Distributed Active Archive Center (DAAC). The GES DISC has begun the development of a community-driven data management system whose sole purpose is to manage and provide value-added services for NASA's Atmospheric Composition (AC) data. This system, called the Atmospheric Composition Data and Information Services Center (ACDISC), will provide access to all AC datasets from the above-mentioned instruments, as well as AC datasets residing at remote archive sites (e.g., the LaRC DAAC). The goals of the ACDISC are to: 1) provide a data center for atmospheric scientists, guided by atmospheric scientists; 2) be absolutely responsive to the data and data service needs of the Atmospheric Composition (AC) community; 3) provide services (i.e., expertise) that will facilitate effortless access to and usage of AC data; and 4) collaborate with AC scientists to facilitate the use of data from multiple sensors for long-term atmospheric research. The ACDISC is an AC-specific, user-driven, multi-sensor, online, easy-access archive and distribution system employing data analysis and visualization, data mining, and other user-requested techniques that facilitate science data usage. The purpose of this presentation is to describe the evolution path the GES DISC is taking in order to better serve AC data, and also to receive continued community feedback and further foster collaboration with AC data users and providers.

  15. Automated collection of imaging and phenotypic data to centralized and distributed data repositories

    PubMed Central

    King, Margaret D.; Wood, Dylan; Miller, Brittny; Kelly, Ross; Landis, Drew; Courtney, William; Wang, Runtang; Turner, Jessica A.; Calhoun, Vince D.

    2014-01-01

    Accurate data collection at the ground level is vital to the integrity of neuroimaging research. Similarly important is the ability to connect and curate data in order to make it meaningful and sharable with other investigators. Collecting data, especially with several different modalities, can be time-consuming and expensive. These issues have driven the development of automated collection of neuroimaging and clinical assessment data within COINS (Collaborative Informatics and Neuroimaging Suite). COINS is an end-to-end data management system. It provides a comprehensive platform for data collection, management, secure storage, and flexible data retrieval (Bockholt et al., 2010; Scott et al., 2011). It was initially developed for the investigators at the Mind Research Network (MRN), but is now available to neuroimaging institutions worldwide. Self Assessment (SA) is an application embedded in the Assessment Manager (ASMT) tool in COINS. It is an innovative tool that allows participants to fill out assessments via the web-based Participant Portal. It eliminates the need for paper collection and data entry by allowing participants to submit their assessments directly to COINS. Instruments (surveys) are created through ASMT and include many unique question types and associated SA features that can be implemented to help the flow of assessment administration. SA provides an instrument queuing system with an easy-to-use drag-and-drop interface for research staff to set up participants' queues. After a queue has been created for the participant, they can access the Participant Portal via the internet to fill out their assessments. This allows them the flexibility to participate from home, a library, on site, etc. The collected data is stored in a PostgreSQL database at MRN. This data is only accessible by users that have explicit permission to access the data through their COINS user accounts and access to the MRN network. This allows for high-volume data collection with minimal user access to PHI (protected health information). An added benefit to using COINS is the ability to collect, store and share imaging data and assessment data with no interaction with outside tools or programs. All study data collected (imaging and assessment) is stored and exported with a participant's unique subject identifier so there is no need to keep extra spreadsheets or databases to link and keep track of the data. Data is easily exported from COINS via the Query Builder and study portal tools, which allow fine-grained selection of data to be exported into comma-separated value (CSV) file format for easy import into statistical programs. There is a great need for data collection tools that limit human intervention and error while at the same time providing users with intuitive design. COINS aims to be a leader in database solutions for research studies collecting data from several different modalities. PMID:24926252

  16. Automated collection of imaging and phenotypic data to centralized and distributed data repositories.

    PubMed

    King, Margaret D; Wood, Dylan; Miller, Brittny; Kelly, Ross; Landis, Drew; Courtney, William; Wang, Runtang; Turner, Jessica A; Calhoun, Vince D

    2014-01-01

    Accurate data collection at the ground level is vital to the integrity of neuroimaging research. Similarly important is the ability to connect and curate data in order to make it meaningful and sharable with other investigators. Collecting data, especially with several different modalities, can be time-consuming and expensive. These issues have driven the development of automated collection of neuroimaging and clinical assessment data within COINS (Collaborative Informatics and Neuroimaging Suite). COINS is an end-to-end data management system. It provides a comprehensive platform for data collection, management, secure storage, and flexible data retrieval (Bockholt et al., 2010; Scott et al., 2011). It was initially developed for the investigators at the Mind Research Network (MRN), but is now available to neuroimaging institutions worldwide. Self Assessment (SA) is an application embedded in the Assessment Manager (ASMT) tool in COINS. It is an innovative tool that allows participants to fill out assessments via the web-based Participant Portal. It eliminates the need for paper collection and data entry by allowing participants to submit their assessments directly to COINS. Instruments (surveys) are created through ASMT and include many unique question types and associated SA features that can be implemented to help the flow of assessment administration. SA provides an instrument queuing system with an easy-to-use drag-and-drop interface for research staff to set up participants' queues. After a queue has been created for the participant, they can access the Participant Portal via the internet to fill out their assessments. This allows them the flexibility to participate from home, a library, on site, etc. The collected data is stored in a PostgreSQL database at MRN. This data is only accessible by users that have explicit permission to access the data through their COINS user accounts and access to the MRN network. This allows for high-volume data collection with minimal user access to PHI (protected health information). An added benefit to using COINS is the ability to collect, store and share imaging data and assessment data with no interaction with outside tools or programs. All study data collected (imaging and assessment) is stored and exported with a participant's unique subject identifier so there is no need to keep extra spreadsheets or databases to link and keep track of the data. Data is easily exported from COINS via the Query Builder and study portal tools, which allow fine-grained selection of data to be exported into comma-separated value (CSV) file format for easy import into statistical programs. There is a great need for data collection tools that limit human intervention and error while at the same time providing users with intuitive design. COINS aims to be a leader in database solutions for research studies collecting data from several different modalities.

  17. TADPLOT program, version 2.0: User's guide

    NASA Technical Reports Server (NTRS)

    Hammond, Dana P.

    1991-01-01

    The TADPLOT Program, Version 2.0 is described. The TADPLOT program is a software package coordinated by a single, easy-to-use interface, enabling the researcher to access several standard file formats, selectively collect specific subsets of data, and create full-featured publication and viewgraph quality plots. The user-interface was designed to be independent from any file format, yet provide capabilities to accommodate highly specialized data queries. Integrated with an applications software network, data can be assessed, collected, and viewed quickly and easily. Since the commands are data independent, subsequent modifications to the file format will be transparent, while additional file formats can be integrated with minimal impact on the user-interface. The graphical capabilities are independent of the method of data collection; thus, the data specification and subsequent plotting can be modified and upgraded as separate functional components. The graphics kernel selected adheres to the full functional specifications of the CORE standard. Both interface and postprocessing capabilities are fully integrated into TADPLOT.

  18. Techniques for animation of CFD results. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Horowitz, Jay; Hanson, Jeffery C.

    1992-01-01

    Video animation is becoming increasingly vital to the computational fluid dynamics researcher, not just for presentation, but for recording and comparing dynamic visualizations that are beyond the current capabilities of even the most powerful graphic workstation. To meet these needs, Lewis Research Center has recently established a facility to provide users with easy access to advanced video animation capabilities. However, producing animation that is both visually effective and scientifically accurate involves various technological and aesthetic considerations that must be understood both by the researcher and those supporting the visualization process. These considerations include: scan conversion, color conversion, and spatial ambiguities.

  19. The interactive astronomical data analysis facility - image enhancement techniques to Comet Halley

    NASA Astrophysics Data System (ADS)

    Klinglesmith, D. A.

    1981-10-01

    A PDP 11/40 computer is at the heart of a general-purpose interactive data analysis facility designed to permit easy access to data in both visual imagery and graphic representations. The major components consist of: the 11/40 CPU and 256 K bytes of 16-bit memory; two TU10 tape drives; 20 million bytes of disk storage; three user terminals; and the COMTAL image processing display system. The application of image enhancement techniques to two sequences of photographs of Comet Halley taken in Egypt in 1910 provides evidence for eruptions from the comet's nucleus.

  20. Utilizing Non-Contact Stress Measurement System (NSMS) as a Health Monitor

    NASA Technical Reports Server (NTRS)

    Hayes, Terry; Hayes, Bryan; Bynum, Ken

    2011-01-01

    The system is required to: continuously monitor all 156 blades throughout the entire operating envelope without adversely affecting tunnel conditions or compromising compressor shell integrity; calculate dynamic response and identify the frequency/mode to determine individual blade deflection amplitudes, natural frequencies, phase, and damping (Q); log static deflection to build a database of deflection values at certain compressor conditions for use as the basis for a real-time online blade stack monitor; monitor for stall, surge, flutter, and blade damage; and operate with limited user input, low maintenance cost, safe illumination of probes, and easy probe replacement, while requiring little or no access to the compressor.

  1. Converting information from paper to optical media

    NASA Technical Reports Server (NTRS)

    Deaton, Timothy N.; Tiller, Bruce K.

    1990-01-01

    The technology of converting large amounts of paper into electronic form is described for use in information management systems based on optical disk storage. The space savings and photographic nature of microfiche are combined in these systems with the advantages of computerized data (fast and flexible retrieval of graphics and text, simultaneous instant access for multiple users, and easy manipulation of data). It is noted that electronic imaging systems offer a unique opportunity to dramatically increase the productivity and profitability of information systems. Particular attention is given to the CALS (Computer-aided Aquisition and Logistic Support) system.

  2. Social Networking Adapted for Distributed Scientific Collaboration

    NASA Technical Reports Server (NTRS)

    Karimabadi, Homa

    2012-01-01

    Sci-Share is a social networking site with novel, specially designed feature sets to enable simultaneous remote collaboration and sharing of large data sets among scientists. The site will include not only the standard features found on popular consumer-oriented social networking sites such as Facebook and Myspace, but also a number of powerful tools to extend its functionality to a science collaboration site. A Virtual Observatory is a promising technology for making data accessible from various missions and instruments through a Web browser. Sci-Share augments services provided by Virtual Observatories by enabling distributed collaboration and sharing of downloaded and/or processed data among scientists. This will, in turn, increase science returns from NASA missions. Sci-Share also enables better utilization of NASA's high-performance computing resources by providing an easy and central mechanism to access and share large files in users' space or those saved on mass storage. The most common means of remote scientific collaboration today remains the trio of e-mail for electronic communication, FTP for file sharing, and personalized Web sites for dissemination of papers and research results. Each of these tools has well-known limitations. Sci-Share transforms the social networking paradigm into a scientific collaboration environment by offering powerful tools for cooperative discourse and digital content sharing. Sci-Share differentiates itself by serving as an online repository for users' digital content with the following unique features: a) Sharing of any file type, any size, from anywhere; b) Creation of projects and groups for controlled sharing; c) Module for sharing files on HPC (High Performance Computing) sites; d) Universal accessibility of staged files as embedded links on other sites (e.g. Facebook) and tools (e.g. e-mail); e) Drag-and-drop transfer of large files, replacing awkward e-mail attachments (and file size limitations); f) Enterprise-level data and messaging encryption; and g) Easy-to-use intuitive workflow.

  3. Optimizing real-time Web-based user interfaces for observatories

    NASA Astrophysics Data System (ADS)

    Gibson, J. Duane; Pickering, Timothy E.; Porter, Dallan; Schaller, Skip

    2008-08-01

    In using common HTML/Ajax approaches for web-based data presentation and telescope control user interfaces at the MMT Observatory (MMTO), we were quickly confronted with web browser performance issues. Much of the operational data at the MMTO is highly dynamic and is constantly changing during normal operations. Status of telescope subsystems must be displayed with minimal latency to telescope operators and other users. A major motivation for migrating toward web-based applications at the MMTO is to provide easy access to current and past observatory subsystem data for a wide variety of users on their favorite operating system through a familiar interface, their web browser. Performance issues, especially for user interfaces that control telescope subsystems, led to investigations of more efficient use of HTML/Ajax and web server technologies as well as other web-based technologies, such as Java and Flash/Flex. The results presented here focus on techniques for optimizing HTML/Ajax web applications with near real-time data display. This study indicates that direct modification of the contents or "nodeValue" attribute of text nodes is the most efficient method of updating data values displayed on a web page. Other optimization techniques are discussed for web-based applications that display highly dynamic data.

  4. The Anatomy of a Grid portal

    NASA Astrophysics Data System (ADS)

    Licari, Daniele; Calzolari, Federico

    2011-12-01

    In this paper we introduce a new way to deal with Grid portals, referring to our implementation. L-GRID is a light portal to access the EGEE/EGI Grid infrastructure via Web, allowing users to submit their jobs from a common Web browser in a few minutes, without any knowledge about the Grid infrastructure. It provides control over the complete lifecycle of a Grid job, from its submission and status monitoring, to the output retrieval. The system, implemented as a client-server architecture, is based on the Globus Grid middleware. The client-side application is based on a Java applet; the server relies on a Globus User Interface. There is no need for user registration on the server side; the user needs only a personal X.509 certificate. The system is user-friendly, secure (it uses the SSL protocol and mechanisms for dynamic delegation and identity creation in public key infrastructures), highly customizable, open source, and easy to install. The X.509 personal certificate never leaves the local machine. The system reduces the time spent on job submission while providing higher efficiency and a better security level in proxy delegation and management.

  5. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    PubMed

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone other than a professional statistician to use them effectively. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  6. The value of usability testing for Internet-based adolescent self-management interventions: "Managing Hemophilia Online".

    PubMed

    Breakey, Vicky R; Warias, Ashley V; Ignas, Danial M; White, Meghan; Blanchette, Victor S; Stinson, Jennifer N

    2013-10-04

    As adolescents with hemophilia approach adulthood, they are expected to assume responsibility for their disease management. A bilingual (English and French) Internet-based self-management program, "Teens Taking Charge: Managing Hemophilia Online," was developed to support adolescents with hemophilia in this transition. This study explored the usability of the website and resulted in refinement of the prototype. A purposive sample (n=18; age 13-18; mean age 15.5 years) was recruited from two tertiary care centers to assess the usability of the program in English and French. Qualitative observations using a "think aloud" usability testing method and semi-structured interviews were conducted in four iterative cycles, with changes to the prototype made as necessary following each cycle. This study was approved by research ethics boards at each site. Teens responded positively to the content and appearance of the website and felt that it was easy to navigate and understand. The multimedia components (videos, animations, quizzes) were felt to enrich the experience. Changes to the presentation of content and the website user-interface were made after the first, second and third cycles of testing in English. Cycle four did not result in any further changes. Overall, teens found the website to be easy to use. Usability testing identified end-user concerns that informed improvements to the program. Usability testing is a crucial step in the development of Internet-based self-management programs to ensure information is delivered in a manner that is accessible and understood by users.

  7. Facilitating access to information in large documents with an intelligent hypertext system

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie

    1993-01-01

    Retrieving specific information from large amounts of documentation is not an easy task. It could be facilitated if information relevant to the current problem-solving context were automatically supplied to the user. As a first step towards this goal, we have developed an intelligent hypertext system called CID (Computer Integrated Documentation) and tested it on the Space Station Freedom requirement documents. The CID system enables integration of various technical documents in a hypertext framework and includes an intelligent context-sensitive indexing and retrieval mechanism. This mechanism utilizes on-line user information requirements and relevance feedback either to reinforce the current indexing in case of success or to generate new knowledge in case of failure. This allows the CID system to provide helpful responses, based on previous usage of the documentation, and to improve its performance over time.

  8. Open Source GIS Connectors to the NASA GES DISC Satellite Data

    NASA Astrophysics Data System (ADS)

    Pham, L.; Kempler, S. J.; Yang, W.

    2014-12-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) houses a suite of satellite-derived GIS data including high spatiotemporal resolution precipitation, air quality, and modeled land surface parameter data. The data are extremely useful to various GIS research and applications at regional, continental, and global scales, as evidenced by the growing number of GIS user requests for the data. On the other hand, we also found that some GIS users, especially those from the ArcGIS community, have difficulties obtaining, importing, and using our data, primarily due to their unfamiliarity with our products and to GIS software's limited ability to handle the predominantly raster data, which come in various and sometimes very complicated formats. In this presentation, we introduce a set of open source ArcGIS data connectors that significantly simplify the access and use of our data in ArcGIS. With the connectors, users do not need to know the data access URLs, the access protocols or syntaxes, and data formats. Nor do they need to browse through a long list of variables that are often embedded into one single science data file and whose names may sometimes be confusing to those not familiar with the file (such as variable CH4_VMR_D for "CH4 Volume mixing ratio from the descending orbit" and variable EVPsfc for "Total Evapotranspiration"). The connectors will expose most GIS-related variables to the users with easy-to-understand names. Users can simply define the spatiotemporal range of their study, select the parameter(s) of interest, and have the needed data downloaded, imported, and displayed in ArcGIS. The connectors are Python text files and there is no installation process; they can be placed in any user directory and started by simply clicking on them. In the presentation, we'll also demonstrate how to use the tools to load GES DISC time series air quality data with a few clicks and how such data depict the spatial and temporal patterns of air quality in different parts of the world during the past decade.
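
    The abstract notes that the connectors are plain Python scripts that hide the data-access URLs, protocols, and native variable names behind friendly parameter choices. The sketch below illustrates that idea only; the endpoint, dataset, and parameter names are placeholders and are not the actual GES DISC connector interface.

    ```python
    # Hypothetical sketch of a GIS data connector: map a friendly variable name to
    # a dataset variable, build a spatiotemporal subset request, and download the
    # result. The endpoint and parameter names are illustrative placeholders.
    from urllib.parse import urlencode
    from urllib.request import urlretrieve

    # Friendly names exposed to the user, mapped to variables embedded in the files.
    FRIENDLY_VARIABLES = {
        "CH4 volume mixing ratio (descending orbit)": "CH4_VMR_D",
        "Total evapotranspiration": "EVPsfc",
    }

    def download_subset(friendly_name, bbox, start, end,
                        endpoint="https://example.gov/subset",  # placeholder endpoint
                        out_path="subset.nc"):
        """Request a spatiotemporal subset of one variable and save it locally."""
        params = {
            "variable": FRIENDLY_VARIABLES[friendly_name],
            "bbox": ",".join(str(v) for v in bbox),   # west,south,east,north
            "start": start,                            # e.g. "2014-01-01"
            "end": end,                                # e.g. "2014-01-31"
            "format": "netCDF",
        }
        url = endpoint + "?" + urlencode(params)
        urlretrieve(url, out_path)                     # download the subset file
        return out_path

    # Example (commented out because the endpoint above is a placeholder):
    # download_subset("Total evapotranspiration", (-125, 24, -66, 50),
    #                 "2014-01-01", "2014-01-31")
    ```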

  9. Making SAR Data Accessible - ASF's ALOS PALSAR Radiometric Terrain Correction Project

    NASA Astrophysics Data System (ADS)

    Meyer, F. J.; Arko, S. A.; Gens, R.

    2015-12-01

    While SAR data have proven valuable for a wide range of geophysical research questions, so far largely only the SAR-educated science communities have been able to fully exploit the information content of internationally available SAR archives. The main issues that have been preventing a more widespread utilization of SAR are related to (1) the diversity and complexity of SAR data formats, (2) the complexity of the processing flows needed to extract geophysical information from SAR, (3) the lack of standardization and automation of these processing flows, and (4) the often ignored geocoding procedures, which leave the data in image coordinate space. In order to improve upon this situation, ASF's radiometric terrain-correction (RTC) project is generating uniformly formatted and easily accessible value-added products from the ASF Distributed Active Archive Center's (DAAC) five-year archive of JAXA's ALOS PALSAR sensor. Specifically, the project applies geometric and radiometric corrections to SAR data to allow for an easy and direct combination of obliquely acquired SAR data with remote sensing imagery acquired in nadir observation geometries. The value-added data are provided to the user in the broadly accepted GeoTIFF format, in order to support the easy integration of SAR data into GIS environments. The goal of ASF's RTC project is to make SAR data more accessible and more attractive to the broader SAR applications community, especially to those users that currently have limited SAR expertise. Production of RTC products commenced in October 2014 and will conclude late in 2015. As of July 2015, processing of 71% of ASF's ALOS PALSAR archive had been completed. Adding to the utility of this dataset are recent changes to the data access policy that allow the full-resolution RTC products to be provided to the public, without restriction. In this paper we will introduce the processing flow that was developed for the RTC project and summarize the calibration and validation procedures that were implemented to determine and monitor system performance. The paper will also show the current progress of RTC processing, provide examples of generated data sets, and demonstrate the benefit of the RTC archives for applications such as land-use classification and change detection.

  10. Parsley: a Command-Line Parser for Astronomical Applications

    NASA Astrophysics Data System (ADS)

    Deich, William

    Parsley is a sophisticated keyword + value parser, packaged as a library of routines that offers an easy method for providing command-line arguments to programs. It makes it easy for the user to enter values, and it makes it easy for the programmer to collect and validate the user's entries. Parsley is tuned for astronomical applications: for example, dates entered in Julian, Modified Julian, calendar, or several other formats are all recognized without special effort by the user or by the programmer; angles can be entered using decimal degrees or dd:mm:ss; time-like intervals as decimal hours, hh:mm:ss, or a variety of other units. Vectors of data are accepted as readily as scalars.
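
    Parsley itself is a compiled library, but the kind of value recognition described above is easy to illustrate. The following minimal Python sketch (not Parsley's actual API) accepts an angle as decimal degrees or dd:mm:ss, and a time-like interval as decimal hours or hh:mm:ss.

    ```python
    # Minimal sketch of flexible value parsing: accept either a decimal value or
    # sexagesimal notation (d:m:s / h:m:s). Illustration only, not Parsley's API.
    def parse_sexagesimal(text):
        """Return a decimal value from 'd', 'd:m', or 'd:m:s' notation."""
        parts = [float(p) for p in text.split(":")]
        sign = -1.0 if text.lstrip().startswith("-") else 1.0
        parts[0] = abs(parts[0])
        value = 0.0
        for i, p in enumerate(parts):
            value += p / (60.0 ** i)       # degrees/hours, minutes, seconds
        return sign * value

    def parse_angle_deg(text):
        """Angle in decimal degrees or dd:mm:ss."""
        return parse_sexagesimal(text)

    def parse_hours(text):
        """Time-like interval in decimal hours or hh:mm:ss."""
        return parse_sexagesimal(text)

    assert abs(parse_angle_deg("-12:30:00") - (-12.5)) < 1e-9
    assert abs(parse_hours("1:30") - 1.5) < 1e-9
    assert abs(parse_angle_deg("12.5") - 12.5) < 1e-9
    ```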

  11. Users guide for EASI graphics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sasser, D.W.

    1978-03-01

    EASI (Estimate of Adversary Sequence Interruption) is an analytical technique for measuring the effectiveness of physical protection systems. EASI Graphics is a computer graphics extension of EASI which provides a capability for performing sensitivity and trade-off analyses of the parameters of a physical protection system. This document reports on the implementation of EASI Graphics and illustrates its application with some examples.

  12. Development and validation of an online interactive, multimedia wound care algorithms program.

    PubMed

    Beitz, Janice M; van Rijswijk, Lia

    2012-01-01

    To provide education based on evidence-based and validated wound care algorithms, we designed and implemented an interactive, Web-based learning program for teaching wound care. A mixed methods quantitative pilot study design with qualitative components was used to test and ascertain the ease of use, validity, and reliability of the online program. A convenience sample of 56 RN wound experts (formally educated, certified in wound care, or both) participated. The interactive, online program consists of a user introduction, an interactive assessment of 15 acute and chronic wound photos, user feedback on the percentage of fully correct, partially correct, or incorrect algorithm and dressing choices, and a user survey. After giving consent, participants accessed the online program, provided answers to the demographic survey, and completed the assessment module and photographic test, along with a posttest survey. The construct validity of the online interactive program was strong. Eighty-five percent (85%) of algorithm and 87% of dressing choices were fully correct even though some programming design issues were identified. Online study results were consistently better than previously conducted comparable paper-pencil study results. Using a 5-point Likert-type scale, participants rated the program's value and ease of use as 3.88 (valuable to very valuable) and 3.97 (easy to very easy), respectively. Similarly, the research process was described qualitatively as "enjoyable" and "exciting." This digital program was well received, indicating its "perceived benefits" for nonexpert users, which may help reduce barriers to implementing safe, evidence-based care. Ongoing research using larger sample sizes may help refine the program or algorithms while identifying clinician educational needs. Initial design imperfections and programming problems identified also underscored the importance of testing all paper and Web-based programs designed to educate health care professionals or guide patient care.

  13. Implementation of an Embedded Web Server Application for Wireless Control of Brain Computer Interface Based Home Environments.

    PubMed

    Aydın, Eda Akman; Bay, Ömer Faruk; Güler, İnan

    2016-01-01

    Brain Computer Interface (BCI) based environment control systems could facilitate the lives of people with neuromuscular diseases, reduce dependence on their caregivers, and improve their quality of life. In addition to ease of use, low cost, and robust system performance, mobility is an important functionality expected from a practical BCI system in real life. In this study, in order to enhance users' mobility, we propose Internet-based wireless communication between the BCI system and the home environment. We designed and implemented a prototype of an embedded low-cost, low-power, easy-to-use web server, which is employed in Internet-based wireless control of a BCI-based home environment. The embedded web server provides remote access to the environmental control module through BCI and web interfaces. While the proposed system offers BCI users enhanced mobility, it also provides remote control of the home environment by caregivers as well as by individuals in the initial stages of neuromuscular disease. The input of the BCI system is P300 potentials. We used the Region Based Paradigm (RBP) as the stimulus interface. Performance of the BCI system is evaluated on data recorded from 8 non-disabled subjects. The experimental results indicate that the proposed web server enables internet based wireless control of electrical home appliances successfully through BCIs.
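
    As a rough illustration of the architecture described above, the standard-library sketch below maps HTTP requests to on/off commands for household appliances; a BCI selection or a caregiver's browser would issue such requests remotely. The routes, device names, and port are hypothetical and stand in for the authors' embedded implementation and P300 front end.

    ```python
    # A minimal sketch of an embedded web server that maps HTTP requests to
    # home-appliance commands. Device names, routes, and the GPIO layer are
    # hypothetical; the real system's firmware and BCI front end are not shown.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    APPLIANCES = {"lamp": False, "fan": False, "tv": False}  # on/off state

    class ControlHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Expected paths: /toggle/<appliance> or /status
            parts = self.path.strip("/").split("/")
            if parts[0] == "toggle" and len(parts) == 2 and parts[1] in APPLIANCES:
                APPLIANCES[parts[1]] = not APPLIANCES[parts[1]]
                body = f"{parts[1]} is now {'on' if APPLIANCES[parts[1]] else 'off'}"
            elif parts[0] == "status":
                body = ", ".join(f"{k}={'on' if v else 'off'}"
                                 for k, v in APPLIANCES.items())
            else:
                self.send_response(404)
                self.end_headers()
                return
            data = body.encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(data)))
            self.end_headers()
            self.wfile.write(data)

    if __name__ == "__main__":
        # A BCI selection (or a caregiver's browser) issues e.g. GET /toggle/lamp.
        HTTPServer(("0.0.0.0", 8080), ControlHandler).serve_forever()
    ```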

  14. Providing Effective Access to Shared Resources: A COIN Approach

    NASA Technical Reports Server (NTRS)

    Airiau, Stephane; Wolpert, David H.

    2004-01-01

    Managers of systems of shared resources typically have many separate goals. Examples are efficient utilization of the resources among its users and ensuring that no user's satisfaction in the system falls below a preset minimal level. Since such goals will usually conflict with one another, either implicitly or explicitly the manager must determine the relative importance of the goals, encapsulating that into an overall utility function rating the possible behaviors of the entire system. Here we demonstrate a distributed, robust, and adaptive way to optimize that overall function. Our approach is to interpose adaptive agents between each user and the system, where each such agent is working to maximize its own private utility function. In turn, each such agent's function should be both relatively easy for the agent to learn to optimize, and "aligned" with the overall utility function of the system manager - an overall function that is based on but in general different from the satisfaction functions of the individual users. To ensure this we enhance the COllective INtelligence (COIN) framework to incorporate user satisfaction functions in the overall utility function of the system manager and accordingly in the associated private utility functions assigned to the users' agents. We present experimental evaluations of different COIN-based private utility functions and demonstrate that those COIN-based functions outperform some natural alternatives.
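
    One private-utility construction commonly used in the COIN literature is the difference utility, which rewards each agent with the change in global utility attributable to its own action. The toy congestion example below is only an illustration of that alignment idea, not the experiments reported in this abstract; the resource and capacity numbers are invented.

    ```python
    # Sketch of a "difference utility": each agent i is rewarded with
    #   D_i(z) = G(z) - G(z with agent i's action replaced by a fixed default),
    # which is aligned with the global utility G while remaining sensitive to
    # agent i's own action. Toy shared-resource (congestion) setting.
    import random

    RESOURCES = 3          # shared resources users can pick
    CAPACITY = 4           # preferred load per resource

    def global_utility(choices):
        """G(z): penalize overloading any resource (higher is better)."""
        loads = [choices.count(r) for r in range(RESOURCES)]
        return -sum(max(0, load - CAPACITY) ** 2 for load in loads)

    def difference_utility(choices, i, default=0):
        """D_i(z): G minus its value with agent i clamped to a default action."""
        clamped = list(choices)
        clamped[i] = default
        return global_utility(choices) - global_utility(clamped)

    # Each agent greedily picks the resource that maximizes its own D_i,
    # given the other agents' current choices.
    choices = [random.randrange(RESOURCES) for _ in range(12)]
    for _ in range(20):                      # a few sweeps of best response
        for i in range(len(choices)):
            best = max(range(RESOURCES),
                       key=lambda r: difference_utility(
                           choices[:i] + [r] + choices[i + 1:], i))
            choices[i] = best
    print("final G:", global_utility(choices),
          "loads:", [choices.count(r) for r in range(RESOURCES)])
    ```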

  15. Providing Effective Access to Shared Resources: A COIN Approach

    NASA Technical Reports Server (NTRS)

    Airiau, Stephane; Wolpert, David H.; Sen, Sandip; Tumer, Kagan

    2003-01-01

    Managers of systems of shared resources typically have many separate goals. Examples are efficient utilization of the resources among its users and ensuring no user's satisfaction in the system falls below a preset minimal level. Since such goals will usually conflict with one another, either implicitly or explicitly the manager must determine the relative importance of the goals, encapsulating that into an overall utility function rating the possible behaviors of the entire system. Here we demonstrate a distributed, robust, and adaptive way to optimize that overall function. Our approach is to interpose adaptive agents between each user and the system, where each such agent is working to maximize its own private utility function. In turn, each such agent's function should be both relatively easy for the agent to learn to optimize, and 'aligned' with the overall utility function of the system manager - an overall function that is based on but in general different from the satisfaction functions of the individual users. To ensure this we enhance the COllective INtelligence (COIN) framework to incorporate user satisfaction functions in the overall utility function of the system manager and accordingly in the associated private utility functions assigned to the users' agents. We present experimental evaluations of different COIN-based private utility functions and demonstrate that those COIN-based functions outperform some natural alternatives.

  16. Yet More Visualized JAMSTEC Cruise and Dive Information

    NASA Astrophysics Data System (ADS)

    Tomiyama, T.; Hase, H.; Fukuda, K.; Saito, H.; Kayo, M.; Matsuda, S.; Azuma, S.

    2014-12-01

    Every year, JAMSTEC performs about a hundred research cruises and numerous dive surveys using its research vessels and submersibles. JAMSTEC provides data and samples obtained during these cruises and dives to international users through a series of data sites on the Internet. The "DARWIN (http://www.godac.jamstec.go.jp/darwin/e)" data site disseminates cruise and dive information. On DARWIN, users can search for cruises and dives of interest with a combination search form or an interactive tree menu, and find lists of observation data as well as links to related databases. The document catalog, physical sample databases, and visual archive of dive surveys (e.g., http://www.godac.jamstec.go.jp/jmedia/portal/e) are directly accessible from the lists. In 2014, DARWIN was updated, mainly to enable on-demand data visualization. Logged-in users can put listed data items into a virtual basket and then trim, plot, and download the data. The visualization tools help users to quickly grasp the quality and characteristics of observation data. Meanwhile, JAMSTEC launched a new data site named "JDIVES (http://www.godac.jamstec.go.jp/jdives/e)" to visualize data and sample information obtained by dive surveys. JDIVES shows tracks of dive surveys on the "Google Earth Plugin" and diagrams of deep-sea environmental data such as temperature, salinity, and depth. Submersible camera images and links to associated databases are placed along the dive tracks. The JDIVES interface enables users to perform so-called virtual dive surveys, which can help them understand the local geometry of dive spots and the geological settings of associated data and samples. It is not easy for individual researchers to organize the huge amount of information recovered from each cruise and dive. The improved visibility and accessibility of JAMSTEC databases are advantageous not only for second-hand users, but also for on-board researchers themselves.

  17. Use of cloud computing technology in natural hazard assessment and emergency management

    NASA Astrophysics Data System (ADS)

    Webley, P. W.; Dehn, J.

    2015-12-01

    During a natural hazard event, the most up-to-date data needs to be in the hands of those on the front line. Decision support system tools can be developed to provide access to pre-made outputs to quickly assess the hazard and potential risk. However, with the ever-growing availability of new satellite data as well as ground and airborne data generated in real time, there is a need to analyze the large volumes of data in an easy-to-access and effective environment. With the growth in the use of cloud computing, where the analysis and visualization system can grow with the needs of the user, these facilities can be used to provide this real-time analysis. Consider a central command center uploading the data to the cloud computing system while researchers in the field connect to a web-based tool to view the newly acquired data. New data can be added by any user and then viewed instantly by anyone else in the organization through the cloud computing interface. This provides the ideal tool for collaborative data analysis, hazard assessment and decision making. We present the rationale for developing a cloud computing system and illustrate how this tool can be developed for use in real-time environments. Users would have access to an interactive online image analysis tool without the need for specific remote sensing software on their local system, thereby increasing their understanding of the ongoing hazard and helping to mitigate its impact on the surrounding region.

  18. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  19. Communicating LightSail: Embedded Reporting and Web Strategies for Citizen-Funded Space Missions

    NASA Astrophysics Data System (ADS)

    Hilverda, M.; Davis, J.

    2015-12-01

    The Planetary Society (TPS) is a non-profit space advocacy group with a stated mission to "empower the world's citizens to advance space science and exploration." In 2009, TPS began work on LightSail, a small, citizen-funded spacecraft to demonstrate solar sailing propulsion technology. The program included a test flight, completed in June 2015, with a primary mission slated for late 2016. TPS initiated a LightSail public engagement campaign to provide the public with transparent mission updates, and foster educational outreach. A credentialed science journalist was given unrestricted access to the team and data, and provided regular reports without editorial oversight. An accompanying website, sail.planetary.org, provided project updates, multimedia, and real-time spacecraft data during the mission. Design approaches included a clean layout with text optimized for easy reading, balanced by strong visual elements to enhance reader comprehension and interest. A dedicated "Mission Control" page featured social media feeds, links to most recent articles, and a ground track showing the spacecraft's position, including overflight predictions based on user location. A responsive, cross-platform design allowed easy access across a broad range of devices. Efficient web server performance was prioritized by implementing a static content management system (CMS). Despite two spacecraft contingencies, the test mission successfully completed its primary objective of solar sail deployment. Qualitative feedback on the transparent, embedded reporting style was positive, and website metrics showed high user retention times. The website also grew awareness and support for the primary 2016 mission, driving traffic to a Kickstarter campaign that raised $1.24 million. Websites constantly evolve, and changes for the primary mission will include a new CMS to better support multiple authors and a custom dashboard to display real-time spacecraft sensor data.

  20. Facilitating Scientific Collaboration and Education with Easy Access Web Maps Using the AGAP Antarctic Geophysical Data

    NASA Astrophysics Data System (ADS)

    Abdi, A.

    2012-12-01

    Science and science education benefit from easy access to data, yet geophysical data sets are often large, complex, and difficult to share. The difficulty of sharing data and imagery inhibits both collaboration and the use of real data in educational applications. The dissemination of data products through web maps is a very efficient and user-friendly way for students, the public and the science community to gain insight and understanding from data. Few research groups provide direct access to their data, let alone map-based visualizations. By building upon current GIS infrastructure with web mapping technologies, like ArcGIS Server, scientific groups, institutions and agencies can enhance the value of their GIS investments. The advantages of web maps to serve data products are many; existing web-mapping technology allows complex GIS analysis to be shared across the Internet, and can be easily scaled from a few users to millions. This poster highlights the features of an interactive web map developed at the Polar Geophysics Group at the Lamont-Doherty Earth Observatory of Columbia University that provides a visual representation of, and access to, data products that resulted from the group's recently concluded AGAP project (http://pgg.ldeo.columbia.edu). The AGAP project collected more than 120,000 line km of new aerogeophysical data using two Twin Otter aircraft. Data included ice penetrating radar, magnetometer, gravimeter and laser altimeter measurements. The web map is based upon ArcGIS Viewer for Flex, which is a configurable client application built on the ArcGIS API for Flex that works seamlessly with ArcGIS Server 10. The application can serve a variety of raster and vector file formats through the Data Interoperability for Server, which eliminates data sharing barriers across numerous file formats. The ability of the application to serve large datasets is limited only by the availability of appropriate hardware. ArcGIS is a proprietary product, but there are a few data portals in the earth sciences that have a map interface using open access products such as MapServer and OpenLayers, the most notable being the NASA IceBridge Data Portal. Indeed, with the widespread availability of web mapping technology, the scientific community should move in this direction when disseminating its data.

  1. A coastal information system to propel emerging science and ...

    EPA Pesticide Factsheets

    The Estuary Data Mapper (EDM) is a free, interactive virtual gateway to coastal data aimed at promoting research and aiding environmental management. The graphical user interface allows users to custom select and subset data based on their spatial and temporal interests, giving them easy access to visualize, retrieve, and save data for further analysis. Data are accessible across estuarine systems of the Atlantic, Gulf of Mexico and Pacific regions of the United States and include: (1) time series data including tidal, hydrologic, and weather, (2) water and sediment quality, (3) atmospheric deposition, (4) habitat, (5) coastal exposure indices, (6) historic and projected land-use and population, (7) historic and projected nitrogen and phosphorous sources and load summaries. EDM issues Web Coverage Service Interface Standard queries (WCS; simple, standard one-line text strings) to a public web service to quickly obtain data subsets by variable, for a date-time range and area selected by the user. EDM is continuously being enhanced with updated data and new options. Recent additions include a comprehensive suite of nitrogen source and loading data, and inputs for supporting a modeling approach of seagrass habitat. Additions planned for the near future include 1) support for Integrated Water Resources Management cost-benefit analysis, specifically the Watershed Management Optimization Support Tool and 2) visualization of the combined effects of climate change, land-use a
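
    The WCS queries mentioned above are single URL-encoded text strings. A minimal sketch of building such a GetCoverage request in Python follows; the server address and coverage name are placeholders rather than EDM's actual endpoints.

    ```python
    # Sketch of a WCS 1.0.0-style GetCoverage key-value-pair request: one variable,
    # over a bounding box and time range. Endpoint and coverage are placeholders.
    from urllib.parse import urlencode

    def wcs_getcoverage_url(endpoint, coverage, bbox, time_range,
                            fmt="NetCDF", crs="EPSG:4326"):
        """Build a WCS GetCoverage URL as a single query string."""
        params = {
            "service": "WCS",
            "version": "1.0.0",
            "request": "GetCoverage",
            "coverage": coverage,                       # variable of interest
            "bbox": ",".join(str(v) for v in bbox),     # minx,miny,maxx,maxy
            "time": "/".join(time_range),               # start/end
            "crs": crs,
            "format": fmt,
        }
        return endpoint + "?" + urlencode(params)

    # Example: a hypothetical sediment-quality coverage over a Gulf of Mexico
    # estuary for summer 2015 (placeholder server and coverage name).
    url = wcs_getcoverage_url("https://example.gov/wcs",
                              "sediment_quality",
                              (-89.0, 29.0, -88.0, 30.0),
                              ("2015-06-01", "2015-08-31"))
    print(url)
    ```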

  2. The Hawaiian Algal Database: a laboratory LIMS and online resource for biodiversity data

    PubMed Central

    Wang, Norman; Sherwood, Alison R; Kurihara, Akira; Conklin, Kimberly Y; Sauvage, Thomas; Presting, Gernot G

    2009-01-01

    Background Organization and presentation of biodiversity data is greatly facilitated by databases that are specially designed to allow easy data entry and organized data display. Such databases also have the capacity to serve as Laboratory Information Management Systems (LIMS). The Hawaiian Algal Database was designed to showcase specimens collected from the Hawaiian Archipelago, enabling users around the world to compare their specimens with our photographs and DNA sequence data, and to provide lab personnel with an organizational tool for storing various biodiversity data types. Description We describe the Hawaiian Algal Database, a comprehensive and searchable database containing photographs and micrographs, geo-referenced collecting information, taxonomic checklists and standardized DNA sequence data. All data for individual samples are linked through unique accession numbers. Users can search online for sample information by accession number, numerous levels of taxonomy, or collection site. At the present time the database contains data representing over 2,000 samples of marine, freshwater and terrestrial algae from the Hawaiian Archipelago. These samples are primarily red algae, although other taxa are being added. Conclusion The Hawaiian Algal Database is a digital repository for Hawaiian algal samples and acts as a LIMS for the laboratory. Users can make use of the online search tool to view and download specimen photographs and micrographs, DNA sequences and relevant habitat data, including georeferenced collecting locations. It is publicly available at . PMID:19728892

  3. Estuary Data Mapper: A coastal information system to propel ...

    EPA Pesticide Factsheets

    The Estuary Data Mapper (EDM) is a free, interactive virtual gateway to coastal data aimed at promoting research and aiding environmental management. The graphical user interface allows users to custom select and subset data based on their spatial and temporal interests, giving them easy access to visualize, retrieve, and save data for further analysis. Data are accessible across estuarine systems of the Atlantic, Gulf of Mexico and Pacific regions of the United States and include: (1) time series data including tidal, hydrologic, and weather, (2) water and sediment quality, (3) atmospheric deposition, (4) habitat, (5) coastal exposure indices, (6) historic and projected land-use and population, (7) historic and projected nitrogen and phosphorous sources and load summaries. EDM issues Web Coverage Service Interface Standard queries (WCS; simple, standard one-line text strings) to a public web service to quickly obtain data subsets by variable, for a date-time range and area selected by the user. EDM is continuously being enhanced with updated data and new options. Recent additions include a comprehensive suite of nitrogen source and loading data, and inputs for supporting a modeling approach of seagrass habitat. Additions planned for the near future include 1) support for Integrated Water Resources Management cost-benefit analysis, specifically the Watershed Management Optimization Support Tool and 2) visualization of the combined effects of climate change, land-use a

  4. EPA Recovery Mapper

    EPA Pesticide Factsheets

    The EPA Recovery Mapper is an Internet interactive mapping application that allows users to discover information about every American Recovery and Reinvestment Act (ARRA) award that EPA has funded for six programs. By integrating data reported by the recipients of Recovery Act funding and data created by EPA, this application delivers a level of transparency and public accessibility to users interested in EPA's use of Recovery Act monies. The application is relatively easy to use and builds on the same mapping model as Google, Bing, MapQuest and other commonly used mapping interfaces. EPA Recovery Mapper tracks each award made by each program and gives basic Quick Facts information for each award including award name, location, award date, dollar amounts and more. Data Summaries for each EPA program or for each state are provided displaying dollars for Total Awarded, Total Received (Paid), and Total Jobs This Quarter by Recovery for the latest quarter of data released by Recovery.gov. The data are reported to the government and EPA four times a year by the award recipients. The latest quarterly report will always be displayed in the EPA Recovery Mapper. In addition, the application provides many details about each award. Users will learn more about how to access and interpret these data later in this document. Data shown in the EPA Recovery Mapper are derived from information reported back to FederalReporting.gov from the recipients of Recovery Act funding. EPA

  5. DataONE: A Distributed Environmental and Earth Science Data Network Supporting the Full Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Cook, R.; Michener, W.; Vieglais, D.; Budden, A.; Koskela, R.

    2012-04-01

    Addressing grand environmental science challenges requires unprecedented access to easily understood data that cross the breadth of temporal, spatial, and thematic scales. Tools are needed to plan management of the data, discover the relevant data, integrate heterogeneous and diverse data, and convert the data to information and knowledge. Addressing these challenges requires new approaches for the full data life cycle of managing, preserving, sharing, and analyzing data. DataONE (Observation Network for Earth) represents a virtual organization that enables new science and knowledge creation through preservation and access to data about life on Earth and the environment that sustains it. The DataONE approach is to improve data collection and management techniques; facilitate easy, secure, and persistent storage of data; continue to increase access to data and tools that improve data interoperability; disseminate integrated and user-friendly tools for data discovery and novel analyses; work with researchers to build intuitive data exploration and visualization tools; and support communities of practice via education, outreach, and stakeholder engagement.

  6. [Concept and applications of the Web 3.0: an introduction for medical doctors].

    PubMed

    Mayer, Miguel Angel; Leis, Angela

    2010-05-01

    The development of the Internet is continuous and appears to be never-ending, although with the arrival of Web 3.0 it could be said that the Internet is becoming what its creators intended it to be from the first moment: an extraordinary and immense database that is organised, understandable, and easy to access, characteristics that have not yet been fully achieved. The innovations and services included in Web 3.0 will result, first, in better, faster, and safer access to quality information. Second, they should provide better personalisation of the health services that Internet users access, avoiding irrelevant information that may contain wrong, false, and dangerous recommendations. However, these changes will have to be accompanied by the legal requirements common to the information society and by the ethical considerations associated with medical care, guaranteeing and contributing, in all cases, to improving the doctor-patient relationship. Copyright 2009 Elsevier España, S.L. All rights reserved.

  7. Collaborative Visualization Project: shared-technology learning environments for science learning

    NASA Astrophysics Data System (ADS)

    Pea, Roy D.; Gomez, Louis M.

    1993-01-01

    Project-enhanced science learning (PESL) provides students with opportunities for 'cognitive apprenticeships' in authentic scientific inquiry using computers for data-collection and analysis. Student teams work on projects with teacher guidance to develop and apply their understanding of science concepts and skills. We are applying advanced computing and communications technologies to augment and transform PESL at-a-distance (beyond the boundaries of the individual school), which is limited today to asynchronous, text-only networking and unsuitable for collaborative science learning involving shared access to multimedia resources such as data, graphs, tables, pictures, and audio-video communication. Our work creates user technology (a Collaborative Science Workbench providing PESL design support and shared synchronous document views, program, and data access; a Science Learning Resource Directory for easy access to resources including two-way video links to collaborators, mentors, museum exhibits, media-rich resources such as scientific visualization graphics), and refines enabling technologies (audiovisual and shared-data telephony, networking) for this PESL niche. We characterize participation scenarios for using these resources and we discuss national networked access to science education expertise.

  8. Application of Bayesian Classification to Content-Based Data Management

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Berrick, S.; Gopalan, A.; Hua, X.; Shen, S.; Smith, P.; Yang, K-Y.; Wheeler, K.; Curry, C.

    2004-01-01

    The high volume of Earth Observing System data has proven to be challenging to manage for data centers and users alike. At the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC), about 1 TB of new data are archived each day. Distribution to users is also about 1 TB/day. A substantial portion of this distribution is MODIS calibrated radiance data, which has a wide variety of uses. However, much of the data is not useful for a particular user's needs: for example, ocean color users typically need oceanic pixels that are free of cloud and sun-glint. The GES DAAC is using a simple Bayesian classification scheme to rapidly classify each pixel in the scene in order to support several experimental content-based data services for near-real-time MODIS calibrated radiance products (from Direct Readout stations). Content-based subsetting would allow distribution of, say, only clear pixels to the user if desired. Content-based subscriptions would distribute data to users only when they fit the user's usability criteria in their area of interest within the scene. Content-based cache management would retain more useful data on disk for easy online access. The classification may even be exploited in an automated quality assessment of the geolocation product. Though initially to be demonstrated at the GES DAAC, these techniques have applicability in other resource-limited environments, such as spaceborne data systems.
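
    As a simplified illustration of the per-pixel scheme described above, the sketch below scores each pixel's band values against Gaussian class models and picks the most probable class. The classes, band statistics, and priors are invented for illustration and are not the GES DAAC's operational parameters.

    ```python
    # Simplified Gaussian naive Bayes pixel classifier: score each pixel's band
    # values against per-class distributions and pick the most probable class
    # (e.g., clear ocean, cloud, land, sun-glint). All numbers are illustrative.
    import numpy as np

    CLASSES = ["clear_ocean", "cloud", "land", "sun_glint"]
    # Per-class mean reflectance for each of 3 hypothetical bands.
    MEANS = np.array([[0.02, 0.01, 0.05],    # clear_ocean
                      [0.60, 0.65, 0.55],    # cloud
                      [0.10, 0.25, 0.30],    # land
                      [0.35, 0.20, 0.15]])   # sun_glint
    STDS = np.full_like(MEANS, 0.08)
    PRIORS = np.array([0.45, 0.35, 0.15, 0.05])

    def classify(pixels):
        """pixels: (n, 3) array of band reflectances -> list of class labels."""
        # log P(class|pixel) = log prior + sum over bands of log N(band; mean, std)
        # (the constant term is dropped since it does not affect the argmax).
        x = pixels[:, None, :]                                   # (n, 1, bands)
        log_like = -0.5 * ((x - MEANS) / STDS) ** 2 - np.log(STDS)
        scores = log_like.sum(axis=2) + np.log(PRIORS)           # (n, classes)
        return [CLASSES[i] for i in scores.argmax(axis=1)]

    scene = np.array([[0.03, 0.02, 0.06],    # looks like clear ocean
                      [0.58, 0.62, 0.50]])   # looks like cloud
    print(classify(scene))
    ```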

  9. The SNS/HFIR Web Portal System - How Can it Help Me?

    NASA Astrophysics Data System (ADS)

    Miller, Stephen D.; Geist, Al; Herwig, Kenneth W.; Peterson, Peter F.; Reuter, Michael A.; Ren, Shelly; Bilheux, Jean-Christophe; Campbell, Stuart I.; Kohl, James A.; Vazhkudai, Sudharshan S.; Cobb, John W.; Lynch, Vickie E.; Chen, Meili; Trater, James R.; Smith, Bradford C.; Swain, Tom (William); Huang, Jian; Mikkelson, Ruth; Mikkelson, Dennis; Green, Mark L.

    2010-11-01

    In a busy world, continuing with the status quo, doing things the way we are already familiar with, often seems to be the most efficient way to conduct our work. We look for the value-add to decide if investing in a new method is worth the effort. How shall we evaluate if we have reached this tipping point for change? For contemporary researchers, understanding the properties of the data is a good starting point. The new generation of neutron scattering instruments being built offers higher resolution and produces data sets one or more orders of magnitude larger than those of the previous generation of instruments. For instance, we have grown out of being able to perform some important tasks with our laptops - the data are too big and the computations would simply take too long. These large datasets can be problematic as facility users now begin to grapple with many of the same issues faced by more established computing communities. These issues include data access, management, and movement, data format standards, distributed computing, and collaboration, among others. The Neutron Science Portal has been architected, designed, and implemented to provide users with an easy-to-use interface for managing and processing data, while also keeping an eye on meeting modern cybersecurity requirements imposed on institutions. The cost of entry for users has been lowered by utilizing a web interface providing access to backend portal resources. Users can browse or search for data which they are allowed to see, data reduction applications can be run without having to load the software, sample activation calculations can be performed for SNS and HFIR beamlines, McStas simulations can be run on TeraGrid and ORNL computers, and advanced analysis applications such as those being produced by the DANSE project can be run. Behind the scenes is a "live cataloging" system which automatically catalogs and archives experiment data via the data management system, and provides proposal team members access to their experiment data. The complexity of data movement and of utilizing distributed computing resources has been taken care of on behalf of users. Collaboration is facilitated by providing users a read/writeable common area, shared

  10. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web services, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove these two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling the online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as, distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons-learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet wide-range needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.

  11. User needs on Nursing Net (The Kango Net) - analyzing the total consultation page - http://www.kango-net.jp/en/index.html.

    PubMed

    Sakyo, Yumi; Nakayama, Kazuhiro; Komatsu, Hiroko; Setoyama, Yoko

    2009-01-01

    People are required to take in and comprehend a massive amount of health information and in turn make some serious decisions based on that information. We, at St. Luke's College of Nursing, provide a rich selection of high-quality health information and have set up Nursing Net (The Kango Net; "kango" is Japanese for nursing). This website provides information for consumers as well as people interested in the nursing profession. In an attempt to identify the needs of users, this study conducted an analysis of the contents of the total consultation page. Many readers indicated that responses to questions about nursing techniques and symptoms were instrumental in addressing their queries. Based on the results of this study, we can conclude that this is an easy-to-access, convenient site for getting health information about physical symptoms and nursing techniques.

  12. STINGRAY: system for integrated genomic resources and analysis.

    PubMed

    Wagner, Glauber; Jardim, Rodrigo; Tschoeke, Diogo A; Loureiro, Daniel R; Ocaña, Kary A C S; Ribeiro, Antonio C B; Emmel, Vanessa E; Probst, Christian M; Pitaluga, André N; Grisard, Edmundo C; Cavalcanti, Maria C; Campos, Maria L M; Mattoso, Marta; Dávila, Alberto M R

    2014-03-07

    The STINGRAY system has been conceived to ease the tasks of integrating, analyzing, annotating and presenting genomic and expression data from Sanger and Next Generation Sequencing (NGS) platforms. STINGRAY includes: (a) a complete and integrated workflow (more than 20 bioinformatics tools) ranging from functional annotation to phylogeny; (b) a MySQL database schema, suitable for data integration and user access control; and (c) a user-friendly graphical web-based interface that makes the system intuitive, facilitating the tasks of data analysis and annotation. STINGRAY proved to be an easy-to-use and complete system for analyzing sequencing data. While both Sanger and NGS platforms are supported, the system could be faster using Sanger data, since the large NGS datasets could potentially slow down the MySQL database usage. STINGRAY is available at http://stingray.biowebdb.org and the open source code at http://sourceforge.net/projects/stingray-biowebdb/.

  13. Proteopedia: Exciting Advances in the 3D Encyclopedia of Biomolecular Structure

    NASA Astrophysics Data System (ADS)

    Prilusky, Jaime; Hodis, Eran; Sussman, Joel L.

    Proteopedia is a collaborative, 3D web-encyclopedia of protein, nucleic acid and other structures. Proteopedia ( http://www.proteopedia.org ) presents 3D biomolecule structures in a broadly accessible manner to a diverse scientific audience through easy-to-use molecular visualization tools integrated into a wiki environment that anyone with a user account can edit. We describe recent advances in the web resource in the areas of content and software. In terms of content, we describe a large growth in user-added content as well as improvements in automatically-generated content for all PDB entry pages in the resource. In terms of software, we describe new features ranging from the capability to create pages hidden from public view to the capability to export pages for offline viewing. New software features also include an improved file-handling system and availability of biological assemblies of protein structures alongside their asymmetric units.

  14. Machine Translation-Supported Cross-Language Information Retrieval for a Consumer Health Resource

    PubMed Central

    Rosemblat, Graciela; Gemoets, Darren; Browne, Allen C.; Tse, Tony

    2003-01-01

    The U.S. National Institutes of Health, through its National Library of Medicine, developed ClinicalTrials.gov to provide the public with easy access to information on clinical trials on a wide range of conditions or diseases. Only English language information retrieval is currently supported. Given the growing number of Spanish speakers in the U.S. and their increasing use of the Web, we anticipate a significant increase in Spanish-speaking users. This study compares the effectiveness of two common cross-language information retrieval methods using machine translation, query translation versus document translation, using a subset of genuine user queries from ClinicalTrials.gov. Preliminary results conducted with the ClinicalTrials.gov search engine show that in our environment, query translation is statistically significantly better than document translation. We discuss possible reasons for this result and we conclude with suggestions for future work. PMID:14728236

  15. A steady and oscillatory kernel function method for interfering surfaces in subsonic, transonic and supersonic flow. [prediction analysis techniques for airfoils

    NASA Technical Reports Server (NTRS)

    Cunningham, A. M., Jr.

    1976-01-01

    The theory, results and user instructions for an aerodynamic computer program are presented. The theory is based on linear lifting surface theory, and the method is the kernel function. The program is applicable to multiple interfering surfaces which may be coplanar or noncoplanar. Local linearization was used to treat nonuniform flow problems without shocks. For cases with imbedded shocks, the appropriate boundary conditions were added to account for the flow discontinuities. The data describing nonuniform flow fields must be input from some other source such as an experiment or a finite difference solution. The results are in the form of small linear perturbations about nonlinear flow fields. The method was applied to a wide variety of problems for which it is demonstrated to be significantly superior to the uniform flow method. Program user instructions are given for easy access.

  16. STINGRAY: system for integrated genomic resources and analysis

    PubMed Central

    2014-01-01

    Background The STINGRAY system has been conceived to ease the tasks of integrating, analyzing, annotating and presenting genomic and expression data from Sanger and Next Generation Sequencing (NGS) platforms. Findings STINGRAY includes: (a) a complete and integrated workflow (more than 20 bioinformatics tools) ranging from functional annotation to phylogeny; (b) a MySQL database schema, suitable for data integration and user access control; and (c) a user-friendly graphical web-based interface that makes the system intuitive, facilitating the tasks of data analysis and annotation. Conclusion STINGRAY proved to be an easy-to-use and complete system for analyzing sequencing data. While both Sanger and NGS platforms are supported, the system could be faster using Sanger data, since the large NGS datasets could potentially slow down the MySQL database usage. STINGRAY is available at http://stingray.biowebdb.org and the open source code at http://sourceforge.net/projects/stingray-biowebdb/. PMID:24606808

  17. RatLab: an easy to use tool for place code simulations

    PubMed Central

    Schönfeld, Fabian; Wiskott, Laurenz

    2013-01-01

    In this paper we present the RatLab toolkit, a software framework designed to set up and simulate a wide range of studies targeting the encoding of space in rats. It provides open access to our modeling approach for establishing place and head direction cells within unknown environments, and it offers a set of parameters that allow for the easy construction of a variety of enclosures for a virtual rat, as well as control of its movement pattern over the course of experiments. Once a spatial code is formed, RatLab can be used to modify aspects of the enclosure or movement pattern and plot the effect of such modifications on the spatial representation, i.e., place and head direction cell activity. The simulation is based on a hierarchical Slow Feature Analysis (SFA) network that has been shown before to establish a spatial encoding of new environments using visual input data only. RatLab encapsulates such a network, generates the visual training data, and performs all sampling automatically, with each of these stages being further configurable by the user. RatLab was written with the intention of making our SFA model more accessible to the community and to that end features a range of elements that allow for experimentation with the model without the need for specific programming skills. PMID:23908627
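
    The core operation that RatLab's hierarchical network repeats layer by layer is an SFA step: whiten the input signal, then extract the directions in which the whitened signal varies most slowly over time. The bare-bones linear sketch below, applied to a toy signal, illustrates that step only and is not RatLab's actual code.

    ```python
    # Minimal linear Slow Feature Analysis (SFA): whiten the input, then find the
    # directions whose temporal derivative has the smallest variance. Toy example.
    import numpy as np

    def linear_sfa(x, n_features=2):
        """x: (T, d) time series -> (T, n_features) slowest output signals."""
        x = x - x.mean(axis=0)
        # Whitening: project onto principal components and normalize variance.
        cov = np.cov(x, rowvar=False)
        evals, evecs = np.linalg.eigh(cov)
        keep = evals > 1e-10                           # drop degenerate directions
        white = (x @ evecs[:, keep]) / np.sqrt(evals[keep])
        # Slowness: minimize the variance of the temporal derivative.
        dwhite = np.diff(white, axis=0)
        devals, devecs = np.linalg.eigh(np.cov(dwhite, rowvar=False))
        w = devecs[:, :n_features]                     # smallest eigenvalues = slowest
        return white @ w

    # Toy input: one slow sine mixed into fast noise across 5 channels.
    T = 2000
    t = np.linspace(0, 2 * np.pi, T)
    slow = np.sin(t)
    x = np.outer(slow, np.random.randn(5)) + 0.5 * np.random.randn(T, 5)
    y = linear_sfa(x, n_features=1)
    # The first SFA output should recover the hidden slow signal up to sign.
    print("correlation magnitude:", abs(np.corrcoef(y[:, 0], slow)[0, 1]))
    ```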

  18. HERA: A dynamic web application for visualizing community exposure to flood hazards based on storm and sea level rise scenarios

    NASA Astrophysics Data System (ADS)

    Jones, Jeanne M.; Henry, Kevin; Wood, Nathan; Ng, Peter; Jamieson, Matthew

    2017-12-01

    The Hazard Exposure Reporting and Analytics (HERA) dynamic web application was created to provide a platform that makes research on community exposure to coastal-flooding hazards influenced by sea level rise accessible to planners, decision makers, and the public in a manner that is both easy to use and easily accessible. HERA allows users to (a) choose flood-hazard scenarios based on sea level rise and storm assumptions, (b) appreciate the modeling uncertainty behind a chosen hazard zone, (c) select one or several communities to examine exposure, (d) select the category of population or societal asset, and (e) choose how to look at results. The application is designed to highlight comparisons between (a) varying levels of sea level rise and coastal storms, (b) communities, (c) societal asset categories, and (d) spatial scales. Through a combination of spatial and graphical visualizations, HERA aims to help individuals and organizations to craft more informed mitigation and adaptation strategies for climate-driven coastal hazards. This paper summarizes the technologies used to maximize the user experience, in terms of interface design, visualization approaches, and data processing.

  19. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
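
    The NCCV's own caching code is not shown here. The following is a minimal Python sketch of the general idea behind a map cache, assuming rendering is deterministic in its request parameters; the file layout and function names are illustrative only.

```python
import hashlib
import json
from pathlib import Path

CACHE_DIR = Path("map_cache")          # hypothetical on-disk cache location
CACHE_DIR.mkdir(exist_ok=True)

def cache_key(params: dict) -> str:
    """Stable key derived from the map request (model, scenario, variable, bbox, ...)."""
    canonical = json.dumps(params, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()

def get_map(params: dict, render) -> bytes:
    """Return a cached rendering if present; otherwise render once and store it."""
    path = CACHE_DIR / f"{cache_key(params)}.png"
    if path.exists():                  # cache hit: skip the expensive rendering step
        return path.read_bytes()
    image = render(params)             # expensive: read summary files, draw the map
    path.write_bytes(image)
    return image
```

    Because repeated requests with the same parameters are served straight from disk, the expensive rendering step runs only once per unique map, which is the essence of how a cache lets response times scale to many concurrent users.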

  20. Sharing tools and best practice in Global Sensitivity Analysis within academia and with industry

    NASA Astrophysics Data System (ADS)

    Wagener, T.; Pianosi, F.; Noacco, V.; Sarrazin, F.

    2017-12-01

    We have spent years trying to improve the use of global sensitivity analysis (GSA) in earth and environmental modelling. Our efforts included (1) the development of tools that provide easy access to widely used GSA methods, (2) the definition of workflows so that best practice is shared in an accessible way, and (3) the development of algorithms to close gaps in available GSA methods (such as moment independent strategies) and to make GSA applications more robust (such as convergence criteria). These elements have been combined in our GSA Toolbox, called SAFE (www.safetoolbox.info), which has up to now been adopted by over 1000 (largely) academic users worldwide. However, despite growing uptake in academic circles and across a wide range of application areas, transfer to industry applications has been difficult. Initial market research regarding opportunities and barriers for uptake revealed a large potential market, but also highlighted a significant lack of knowledge regarding state-of-the-art methods and their potential value for end-users. We will present examples and discuss our experience so far in trying to overcome these problems and move beyond academia in distributing GSA tools and expertise.
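
    SAFE's interfaces are not described in the abstract, so the sketch below does not use the toolbox. It only illustrates the simplest possible sensitivity screening, a one-at-a-time perturbation around a baseline, in plain Python/NumPy; global methods such as those provided by SAFE instead sample the entire input space.

```python
import numpy as np

def oat_sensitivity(model, x0, bounds, delta=0.05):
    """
    Crude one-at-a-time (OAT) screening: perturb each input by a fraction of its
    range around a baseline point and record the change in model output. This is
    a generic illustration only, not one of SAFE's global methods.
    """
    x0 = np.asarray(x0, dtype=float)
    y0 = model(x0)
    effects = {}
    for i, (lo, hi) in enumerate(bounds):
        x = x0.copy()
        x[i] += delta * (hi - lo)
        effects[i] = abs(model(x) - y0)
    return effects

# Toy model whose output is dominated by the first input.
model = lambda x: 5.0 * x[0] + 0.5 * x[1] ** 2
print(oat_sensitivity(model, x0=[1.0, 1.0], bounds=[(0, 2), (0, 2)]))
```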

  1. KNMI Data Centre: Easy access for all

    NASA Astrophysics Data System (ADS)

    van de Vegte, John; Som de Cerff, Wim; Plieger, Maarten; de Vreede, Ernst; Sluiter, Raymond; Willem Noteboom, Jan; van der Neut, Ian; Verhoef, Hans; van Versendaal, Robert; van Binnendijk, Martin; Kalle, Henk; Knopper, Arthur; Spit, Jasper; Mastop, Joeri; Klos, Olaf; Calis, Gijs; Ha, Siu-Siu; van Moosel, Wim; Klein Ikkink, Henk-Jan; Tosun, Tuncay

    2013-04-01

    KNMI is the Dutch institute for weather, climate research and seismology. It disseminates weather information to the public at large, the government, aviation and the shipping industry in the interest of safety, the economy and a sustainable environment. To gain insight into long-term developments, KNMI conducts research on climate change. Making the knowledge, data and information held at KNMI accessible is one of its core activities. A large part of the KNMI information comes from numerical models, in-situ sensor networks and remote sensing satellites. This digital collection has mostly been available internally only, in non-searchable, non-standardized file formats, lacking documentation and references to scientific publications. The KNMI Data Centre (KDC) project tackles these issues. In the project, a user-driven development approach with SCRUM was chosen to get maximum user involvement in a relatively short development timeframe. Building on open standards and proven open-source technology (including in-house developed software such as ADAGUC WMS and Portal) resulted in a first release in December 2012. This presentation will focus on the aspects of KDC relating to its technical challenges, the development strategy and the initial usage results of the data centre.
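
    ADAGUC serves maps through the standard OGC Web Map Service (WMS) interface. As a sketch of what a client request looks like, the snippet below builds a WMS 1.3.0 GetMap URL in Python; the server address, layer name and time value are placeholders rather than actual KDC endpoints.

```python
from urllib.parse import urlencode

# Standard OGC WMS 1.3.0 GetMap parameters; URL and layer are placeholders.
params = {
    "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
    "LAYERS": "precipitation", "STYLES": "", "CRS": "EPSG:4326",
    "BBOX": "50.0,3.0,54.0,8.0",            # lat/lon axis order for EPSG:4326 in 1.3.0
    "WIDTH": 800, "HEIGHT": 600,
    "FORMAT": "image/png",
    "TIME": "2012-12-01T00:00:00Z",
}
url = "https://example.org/wms?" + urlencode(params)
print(url)    # fetch with any HTTP client to receive a rendered PNG map
```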

  2. HERA: A dynamic web application for visualizing community exposure to flood hazards based on storm and sea level rise scenarios

    USGS Publications Warehouse

    Jones, Jeanne M.; Henry, Kevin; Wood, Nathan J.; Ng, Peter; Jamieson, Matthew

    2017-01-01

    The Hazard Exposure Reporting and Analytics (HERA) dynamic web application was created to provide a platform that makes research on community exposure to coastal-flooding hazards influenced by sea level rise accessible to planners, decision makers, and the public in a manner that is both easy to use and easily accessible. HERA allows users to (a) choose flood-hazard scenarios based on sea level rise and storm assumptions, (b) appreciate the modeling uncertainty behind a chosen hazard zone, (c) select one or several communities to examine exposure, (d) select the category of population or societal asset, and (e) choose how to look at results. The application is designed to highlight comparisons between (a) varying levels of sea level rise and coastal storms, (b) communities, (c) societal asset categories, and (d) spatial scales. Through a combination of spatial and graphical visualizations, HERA aims to help individuals and organizations to craft more informed mitigation and adaptation strategies for climate-driven coastal hazards. This paper summarizes the technologies used to maximize the user experience, in terms of interface design, visualization approaches, and data processing.

  3. Evaluating PLATO: postgraduate teaching and learning online.

    PubMed

    Brown, Menna; Bullock, Alison

    2014-02-01

    The use of the Internet as a teaching medium has increased rapidly over the last decade. PLATO (postgraduate learning and teaching online) was launched in 2008 by the e-learning unit (ELU) of Wales Deanery. Located within Learning@NHSWales, a Moodle virtual learning environment (VLE), it hosts a wide range of freely available courses and resources tailored to support the education, training and continuing professional development (CPD) needs of health care professionals working across the National Health Service (NHS) Wales. The evaluation aimed to identify the costs and benefits of PLATO, report its value as attributed by users, identify potential cost savings and make recommendations. Five courses (case studies) were selected, representing the range of available e-learning resources: e-induction; fetal heart monitoring; cervical screening; GP prospective trainers; and tools for trainers. Mixed methods were used: one-to-one qualitative interviews, focus group discussions and surveys explored user views, and identified individual and organisational value. Qualitative findings identified six key areas of value for users: ELU support and guidance; avoidance of duplication and standardisation; central reference; local control; flexibility for learners; and specific features. Survey results (n=72) indicated 72 per cent of consultants reported that PLATO was easy to access and user friendly. E-learning was rated as 'very/important' for CPD by 79 per cent of respondents. Key challenges were: access, navigation, user concerns, awareness and support. PLATO supports education and helps deliver UK General Medical Council standards. Future plans should address the suggested recommendations to realise cost savings for NHS Wales and the Wales Deanery. The findings have wider applicability to others developing or using VLEs. © 2014 John Wiley & Sons Ltd.

  4. Colorado Late Cenozoic Fault and Fold Database and Internet Map Server: User-friendly technology for complex information

    USGS Publications Warehouse

    Morgan, K.S.; Pattyn, G.J.; Morgan, M.L.

    2005-01-01

    Internet mapping applications for geologic data allow simultaneous data delivery and collection, enabling quick data modification while efficiently supplying the end user with information. Utilizing Web-based technologies, the Colorado Geological Survey's Colorado Late Cenozoic Fault and Fold Database was transformed from a monothematic, nonspatial Microsoft Access database into a complex information set incorporating multiple data sources. The resulting user-friendly format supports easy analysis and browsing. The core of the application is the Microsoft Access database, which contains information compiled from available literature about faults and folds that are known or suspected to have moved during the late Cenozoic. The database contains nonspatial fields such as structure type, age, and rate of movement. Geographic locations of the fault and fold traces were compiled from previous studies at 1:250,000 scale to form a spatial database containing information such as length and strike. Integration of the two databases allowed both spatial and nonspatial information to be presented on the Internet as a single dataset (http://geosurvey.state.co.us/pubs/ceno/). The user-friendly interface enables users to view and query the data in an integrated manner, thus providing multiple ways to locate desired information. Retaining the digital data format also allows continuous data updating and quick delivery of newly acquired information. This dataset is a valuable resource to anyone interested in earthquake hazards and the activity of faults and folds in Colorado. Additional geologic hazard layers and imagery may aid in decision support and hazard evaluation. The up-to-date and customizable maps are invaluable tools for researchers or the public.

  5. Making Dynamic Digital Maps Cross-Platform and WWW Capable

    NASA Astrophysics Data System (ADS)

    Condit, C. D.

    2001-05-01

    High-quality color geologic maps are an invaluable information resource for educators, students and researchers. However, maps with large datasets that include images, or various types of movies, in addition to site locations where analytical data have been collected, are difficult to publish in a format that facilitates their easy access, distribution and use. The development of capable desktop computers and object oriented graphical programming environments has facilitated publication of such data sets in an encapsulated form. The original Dynamic Digital Map (DDM) programs, developed using the Macintosh based SuperCard programming environment, exemplified this approach, in which all data are included in a single package designed so that display and access to the data did not depend on proprietary programs. These DDMs were designed for ease of use, and allowed data to be displayed by several methods, including point-and-click at icons pin-pointing sample (or image) locations on maps, and from clicklists of sample or site numbers. Each of these DDMs included an overview and automated tour explaining the content organization and program use. This SuperCard development culminated in a "DDM Template", which is a SuperCard shell into which SuperCard users could insert their own content and thus create their own DDMs, following instructions in an accompanying "DDM Cookbook" (URL http://www.geo.umass.edu/faculty/condit/condit2.html). These original SuperCard-based DDMs suffered two critical limitations: a single user platform (Macintosh) and, although they can be downloaded from the web, their use lacked integration with the WWW. Over the last eight months I have been porting the DDM technology to MetaCard, which is aggressively cross-platform (11 UNIX dialects, WIN32 and Macintosh). The new MetaCard DDM is redesigned to make the maps and images accessible either from CD or the web, using the "LoadNGo" concept. LoadNGo allows the user to download the stand-alone DDM program using a standard browser, and then use the program independently to access images, maps and data with fast web connections. DDMs are intended to be a fast and inexpensive way to publish and make accessible, as an integrated product, high-quality color maps and data sets. They are not a substitute for the analytical capability of GIS; however maps produced using GIS and CAD programs can be easily integrated into DDMs. The preparation of any map product is a time consuming effort. To complement that effort, the DDM Templates have built into them the capability to contain explanatory text at three different user levels (or perhaps in three different languages), thus one DDM may be used as both a research publication medium and an educational outreach product, with the user choosing which user mode to access the data.

  6. A searching and reporting system for relational databases using a graph-based metadata representation.

    PubMed

    Hewitt, Robin; Gobbi, Alberto; Lee, Man-Ling

    2005-01-01

    Relational databases are the current standard for storing and retrieving data in the pharmaceutical and biotech industries. However, retrieving data from a relational database requires specialized knowledge of the database schema and of the SQL query language. At Anadys, we have developed an easy-to-use system for searching and reporting data in a relational database to support our drug discovery project teams. This system is fast and flexible and allows users to access all data without having to write SQL queries. This paper presents the hierarchical, graph-based metadata representation and SQL-construction methods that, together, are the basis of this system's capabilities.
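
    The paper's metadata representation and SQL-construction code are not included in the abstract. A minimal sketch of the underlying idea, assuming a hypothetical metadata graph whose edges carry foreign-key join conditions, could look like this in Python:

```python
from collections import deque

# Hypothetical metadata graph: nodes are tables, edges carry the join condition.
SCHEMA = {
    ("compound", "assay_result"): "compound.compound_id = assay_result.compound_id",
    ("assay_result", "assay"):    "assay_result.assay_id = assay.assay_id",
}
ADJ = {}
for (a, b), cond in SCHEMA.items():
    ADJ.setdefault(a, []).append((b, cond))
    ADJ.setdefault(b, []).append((a, cond))

def join_path(start, goal):
    """Breadth-first search over the metadata graph for a chain of joins."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        table, conds = queue.popleft()
        if table == goal:
            return conds
        for nxt, cond in ADJ.get(table, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, conds + [(nxt, cond)]))
    raise ValueError("tables are not connected in the metadata graph")

def build_query(columns, start, goal):
    """Turn the discovered path into SQL so users never write joins by hand."""
    sql = f"SELECT {', '.join(columns)} FROM {start}"
    for table, cond in join_path(start, goal):
        sql += f" JOIN {table} ON {cond}"
    return sql

print(build_query(["compound.name", "assay.name"], "compound", "assay"))
```

    Because the join path is discovered from the metadata graph, end users only name the columns they want; the system assembles the SQL for them.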

  7. Managing Content in a Matter of Minutes

    NASA Technical Reports Server (NTRS)

    2004-01-01

    NASA software created to help scientists expeditiously search and organize their research documents is now aiding compliance personnel, law enforcement investigators, and the general public in their efforts to search, store, manage, and retrieve documents more efficiently. Developed at Ames Research Center, NETMARK software was designed to manipulate vast amounts of unstructured and semi-structured NASA documents. NETMARK is both a relational and object-oriented technology built on an Oracle enterprise-wide database. To ensure easy user access, Ames constructed NETMARK as a Web-enabled platform utilizing the latest in Internet technology. One of the significant benefits of the program was its ability to store and manage mission-critical data.

  8. SBEToolbox: A Matlab Toolbox for Biological Network Analysis

    PubMed Central

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J.

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases. PMID:24027418
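
    SBEToolbox is a Matlab toolbox, and its functions are not reproduced here. As a rough Python analogue of the same kind of computation, the sketch below uses the networkx library to compute a few centralities and a crude module grouping on a toy graph.

```python
import networkx as nx

# A network-analysis toolbox would load an edge list from a file;
# here we build a tiny example graph directly instead.
G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D")])

metrics = {
    "degree": nx.degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "clustering": nx.clustering(G),
}
modules = list(nx.connected_components(G))   # crude stand-in for module detection

for name, values in metrics.items():
    print(name, values)
print("modules:", modules)
```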

  9. SBEToolbox: A Matlab Toolbox for Biological Network Analysis.

    PubMed

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases.

  10. Recent Developments on the Turbulence Modeling Resource Website (Invited)

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2015-01-01

    The NASA Langley Turbulence Model Resource (TMR) website has been active for over five years. Its main goal of providing a one-stop, easily accessible internet site for up-to-date information on Reynolds-averaged Navier-Stokes turbulence models remains unchanged. In particular, the site strives to provide an easy way for users to verify their own implementations of widely-used turbulence models, and to compare the results from different models for a variety of simple unit problems covering a range of flow physics. Some new features have been recently added to the website. This paper documents the site's features, including recent developments, future plans, and open questions.

  11. Behavioral Economics and the Supplemental Nutrition Assistance Program: Making the Healthy Choice the Easy Choice.

    PubMed

    Ammerman, Alice S; Hartman, Terry; DeMarco, Molly M

    2017-02-01

    The Supplemental Nutrition Assistance Program (SNAP) serves as an important nutritional safety net program for many Americans. Given its aim to use traditional economic levers to provide access to food, the SNAP program includes minimal nutritional requirements and restrictions. As food choices are influenced by more than just economic constraints, behavioral economics may offer insights and tools for altering food purchases for SNAP users. This manuscript outlines behavioral economics strategies that have potential to encourage healthier food choices within the SNAP program. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  12. Database Development for Electrical, Electronic, and Electromechanical (EEE) Parts for the International Space Station Alpha

    NASA Technical Reports Server (NTRS)

    Wassil-Grimm, Andrew D.

    1997-01-01

    More effective electronic communication processes are needed to transfer contractor and international partner data into NASA and prime contractor baseline database systems. It is estimated that the International Space Station Alpha (ISSA) parts database will contain up to one million parts each of which may require database capabilities for approximately one thousand bytes of data for each part. The resulting gigabyte database must provide easy access to users who will be preparing multiple analyses and reports in order to verify as-designed, as-built, launch, on-orbit, and return configurations for up to 45 missions associated with the construction of the ISSA. Additionally, Internet access to this data base is strongly indicated to allow multiple user access from clients located in many foreign countries. This summer's project involved familiarization and evaluation of the ISSA Electrical, Electronic, and Electromechanical (EEE) Parts data and the process of electronically managing these data. Particular attention was devoted to improving the interfaces among the many elements of the ISSA information system and its global customers and suppliers. Additionally, prototype queries were developed to facilitate the identification of data changes in the data base, verifications that the designs used only approved parts, and certifications that the flight hardware containing EEE parts was ready for flight. This project also resulted in specific recommendations to NASA for further development in the area of EEE parts database development and usage.
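
    The prototype queries mentioned above are not given in the abstract. A hypothetical, much-simplified example of an approved-parts verification query, written against an invented schema with Python's sqlite3 module, might look like this:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE approved_parts (part_number TEXT PRIMARY KEY);
CREATE TABLE design_parts   (assembly TEXT, part_number TEXT);
""")
conn.executemany("INSERT INTO approved_parts VALUES (?)", [("R-100",), ("C-220",)])
conn.executemany("INSERT INTO design_parts VALUES (?, ?)",
                 [("PWR-1", "R-100"), ("PWR-1", "X-999")])

# Flag any part used in a design that is missing from the approved-parts list.
unapproved = conn.execute("""
    SELECT DISTINCT d.assembly, d.part_number
    FROM design_parts d
    LEFT JOIN approved_parts a ON a.part_number = d.part_number
    WHERE a.part_number IS NULL
""").fetchall()
print(unapproved)   # -> [('PWR-1', 'X-999')]
```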

  13. Archiving Space Geodesy Data for 20+ Years at the CDDIS

    NASA Technical Reports Server (NTRS)

    Noll, Carey E.; Dube, M. P.

    2004-01-01

    Since 1982, the Crustal Dynamics Data Information System (CDDIS) has supported the archive and distribution of geodetic data products acquired by NASA programs. These data include GPS (Global Positioning System), GLONASS (GLObal NAvigation Satellite System), SLR (Satellite Laser Ranging), VLBI (Very Long Baseline Interferometry), and DORIS (Doppler Orbitography and Radiolocation Integrated by Satellite). The data archive supports NASA's space geodesy activities through the Solid Earth and Natural Hazards (SENH) program. The CDDIS data system and its archive have become increasingly important to many national and international programs, particularly several of the operational services within the International Association of Geodesy (IAG), including the International GPS Service (IGS), the International Laser Ranging Service (ILRS), the International VLBI Service for Geodesy and Astrometry (IVS), the International DORIS Service (IDS), and the International Earth Rotation Service (IERS). The CDDIS provides easy and ready access to a variety of data sets, products, and information about these data. The specialized nature of the CDDIS lends itself well to enhancement and thus can accommodate diverse data sets and user requirements. All data sets and metadata extracted from these data sets are accessible to scientists through ftp and the web; general information about each data set is accessible via the web. The CDDIS, including background information about the system and its user communities, the computer architecture, archive contents, available metadata, and future plans will be discussed.

  14. Setting Access Permission through Transitive Relationship in Web-based Social Networks

    NASA Astrophysics Data System (ADS)

    Hong, Dan; Shen, Vincent Y.

    The rising popularity of various social networking websites has created a huge problem for Internet privacy. Although it is easy to post photos, comments, opinions on some events, etc. on the Web, some of these data (such as a person’s location at a particular time, criticisms of a politician, etc.) are private and should not be accessed by unauthorized users. Although social networks facilitate sharing, the fear of sending sensitive data to a third party without knowledge or permission of the data owners discourages people from taking full advantage of some social networking applications. We exploit the existing relationships on social networks and build a “trust network” with transitive relationships to allow controlled data sharing so that the privacy and preferences of data owners are respected. The trust network linking private data owners, private data requesters, and intermediary users is a directed weighted graph. The permission value for each private data requester can be automatically assigned in this network based on the transitive relationship. Experiments were conducted to confirm the feasibility of constructing the trust network from existing social networks, and to assess the validity of permission value assignments in the query process. Since the data owners only need to define the access rights of their closest contacts once, this privacy scheme can make private data sharing easily manageable by social network participants.
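
    The abstract does not give the exact rule used to assign permission values, so the sketch below assumes one common choice: the best product of trust weights along any directed path from owner to requester. This multiplicative rule is an assumption made for illustration, not necessarily the paper's formula.

```python
import heapq

def permission_value(trust_edges, owner, requester):
    """
    Highest product of edge trust weights (each in (0, 1]) over any directed path
    from the data owner to the requester. Works like Dijkstra: because weights
    never exceed 1, popping nodes in decreasing trust order is safe.
    """
    graph = {}
    for src, dst, w in trust_edges:
        graph.setdefault(src, []).append((dst, w))

    best = {owner: 1.0}
    heap = [(-1.0, owner)]                 # max-heap via negated trust values
    while heap:
        neg_trust, node = heapq.heappop(heap)
        trust = -neg_trust
        if node == requester:
            return trust
        if trust < best.get(node, 0.0):    # stale queue entry
            continue
        for nxt, w in graph.get(node, []):
            cand = trust * w
            if cand > best.get(nxt, 0.0):
                best[nxt] = cand
                heapq.heappush(heap, (-cand, nxt))
    return 0.0                             # no path: no access by default

# Owner -> friend (0.9), friend -> requester (0.8)  =>  permission 0.72
edges = [("owner", "friend", 0.9), ("friend", "requester", 0.8)]
print(permission_value(edges, "owner", "requester"))
```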

  15. Local storage federation through XRootD architecture for interactive distributed analysis

    NASA Astrophysics Data System (ADS)

    Colamaria, F.; Colella, D.; Donvito, G.; Elia, D.; Franco, A.; Luparello, G.; Maggi, G.; Miniello, G.; Vallero, S.; Vino, G.

    2015-12-01

    A cloud-based Virtual Analysis Facility (VAF) for the ALICE experiment at the LHC has been deployed in Bari. Similar facilities are currently running in other Italian sites with the aim to create a federation of interoperating farms able to provide their computing resources for interactive distributed analysis. The use of cloud technology, along with elastic provisioning of computing resources as an alternative to the grid for running data intensive analyses, is the main challenge of these facilities. One of the crucial aspects of the user-driven analysis execution is the data access. A local storage facility has the disadvantage that the stored data can be accessed only locally, i.e. from within the single VAF. To overcome such a limitation a federated infrastructure, which provides full access to all the data belonging to the federation independently from the site where they are stored, has been set up. The federation architecture exploits both cloud computing and XRootD technologies, in order to provide a dynamic, easy-to-use and well performing solution for data handling. It should allow the users to store the files and efficiently retrieve the data, since it implements a dynamic distributed cache among many datacenters in Italy connected to one another through the high-bandwidth national network. Details on the preliminary architecture implementation and performance studies are discussed.

  16. The value of usability testing for Internet-based adolescent self-management interventions: “Managing Hemophilia Online”

    PubMed Central

    2013-01-01

    Background As adolescents with hemophilia approach adulthood, they are expected to assume responsibility for their disease management. A bilingual (English and French) Internet-based self-management program, “Teens Taking Charge: Managing Hemophilia Online,” was developed to support adolescents with hemophilia in this transition. This study explored the usability of the website and resulted in refinement of the prototype. Methods A purposive sample (n=18; age 13–18; mean age 15.5 years) was recruited from two tertiary care centers to assess the usability of the program in English and French. Qualitative observations using a “think aloud” usability testing method and semi-structured interviews were conducted in four iterative cycles, with changes to the prototype made as necessary following each cycle. This study was approved by research ethics boards at each site. Results Teens responded positively to the content and appearance of the website and felt that it was easy to navigate and understand. The multimedia components (videos, animations, quizzes) were felt to enrich the experience. Changes to the presentation of content and the website user-interface were made after the first, second and third cycles of testing in English. Cycle four did not result in any further changes. Conclusions Overall, teens found the website to be easy to use. Usability testing identified end-user concerns that informed improvements to the program. Usability testing is a crucial step in the development of Internet-based self-management programs to ensure information is delivered in a manner that is accessible and understood by users. PMID:24094082

  17. GES DISC Datalist Enables Easy Data Selection For Natural Phenomena Studies

    NASA Technical Reports Server (NTRS)

    Li, Angela; Shie, Chung-Lin; Hegde, Mahabaleshwa; Petrenko, Maksym; Teng, William; Bryant, Keith; Liu, Zhong; Hearty, Thomas; Shen, Suhung; Seiler, Edward

    2017-01-01

    In order to investigate and assess natural hazards such as tropical storms, winter storms, volcanic eruptions, floods, and drought in a timely manner, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has been developing an efficient data search and access service. Called "Datalist," this service enables users to acquire their data of interest "all at once," with minimum effort. A Datalist is a virtual collection of predefined or user-defined data variables from one or more archived data sets. Datalists are more than just data. Datalists effectively provide users with a sophisticated integrated data and services package, including metadata, citation, documentation, visualization, and data-specific services (e.g., subset and OPeNDAP), all available from one-stop shopping. The predefined Datalists, created by the experienced GES DISC science support team, should save a significant amount of time that users would otherwise have to spend. The Datalist service is an extension of the new GES DISC website, which is completely data-driven. A Datalist, also known as "data bundle," is treated just as any other data set. Being a virtual collection, a Datalist requires no extra storage space.

  18. Research-oriented image registry for multimodal image integration.

    PubMed

    Tanaka, M; Sadato, N; Ishimori, Y; Yonekura, Y; Yamashita, Y; Komuro, H; Hayashi, N; Ishii, Y

    1998-01-01

    To provide multimodal biomedical images automatically, we constructed a research-oriented image registry, the Data Delivery System (DDS). DDS was constructed on the campus local area network. Machines which generate images (imagers: DSA, ultrasound, PET, MRI, SPECT and CT) were connected to the campus LAN. Once a patient is registered, all of the patient's images are automatically picked up by DDS as they are generated, transferred through the gateway server to the intermediate server, and copied into the directory of the user who registered the patient. DDS informs the user through e-mail that new data have been generated and transferred. The data format is automatically converted into the one chosen by the user. Data inactive for a certain period in the intermediate server are automatically archived to the final and permanent data server based on compact disk. As a soft link is automatically generated in this step, a user has access to all (old or new) image data of the patient of interest. As DDS runs with minimal maintenance, the cost and time of data transfer are significantly reduced. By making the complex process of data transfer and conversion invisible, DDS has made it easy for researchers with little computer experience to concentrate on their biomedical interests.

  19. Baby Steps Text: Feasibility Study of an SMS-Based Tool for Tracking Children's Developmental Progress.

    PubMed

    Suh, Hyewon; Porter, John R; Racadio, Robert; Sung, Yi-Chen; Kientz, Julie A

    2016-01-01

    To help reach populations of children without consistent Internet access or medical care, we designed and implemented Baby Steps Text, an automated text message-based screening tool. We conducted preliminary user research via storyboarding and prototyping with target populations and then developed a fully functional system. In a one-month deployment study, we evaluated the feasibility of Baby Steps Text with fourteen families. During a one-month study, 13 out of 14 participants were able to learn and use the response structure (yielding 2.88% error rate) and complete a child development screener entirely via text messages. All post-study survey respondents agreed Baby Steps Text was understandable and easy to use, which was also confirmed through post-study interviews. Some survey respondents expressed liking Baby Steps Text because it was easy, quick, convenient to use, and delivered helpful, timely information. Our initial deployment study shows text messaging is a feasible tool for supporting parents in tracking and monitoring their child's development.

  20. Cell phones and children: follow the precautionary road.

    PubMed

    Rosenberg, Suzanne

    2013-01-01

    Children are increasingly using cell phones. "Family package" deals make it easy for parents to obtain phones for their children, and the phones provide parents with the comfort of easy access to their children. However, cell phones emit radio frequency (RF) radiation (Bucher & the Committee on Appropriations, 2010). While the government has deemed RF radiation to be safe, there is no current significant research to make this claim. To determine the relationship between cell phone radiation and brain cancer requires long-term studies lasting decades and with inclusion of frequent users in the subject pool. Further, to extend the results of any study to children requires controlling for the differences between juveniles and adults regarding the composition of the head, and bone density and neural tissue. Dr. L. Hardell of the University Hospital of Sweden noted that "it is necessary to apply the precautionary principle in this situation," especially for long-term exposure that is likely to affect children (Hardell as cited in Mead, 2008, p. 1). There is cause for concern.

  1. Software Development Of XML Parser Based On Algebraic Tools

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2011-12-01

    This paper presents the development and implementation of software based on an algebraic method for XML data processing, which accelerates the XML parsing process. The nontraditional approach proposed here for fast XML navigation with algebraic tools contributes to ongoing efforts to build an easier, user-friendly API for XML transformations. The proposed XML-processing software (parser) is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach for searching and restructuring hierarchical XML data. This approach permits fast processing of XML documents, using an algebraic model developed in detail in previous works of the same authors. The proposed parsing mechanism is easily accessible to web consumers, who can control XML file processing, search for particular elements (tags), and delete or add XML content. Various tests show higher speed and lower resource consumption in comparison with some existing commercial parsers.
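
    The authors' algebraic parser is not reproduced in the abstract. The operations it exposes (searching, deleting and adding elements) can be illustrated generically with Python's standard-library ElementTree module; the document content below is invented.

```python
import xml.etree.ElementTree as ET

# Generic XML navigation with the standard library (not the algebraic parser):
# search, delete and add elements in a small document.
doc = ET.fromstring("""
<catalog>
  <device id="d1"><name>Infusion pump</name></device>
  <device id="d2"><name>Ventilator</name></device>
</catalog>
""")

# Search: find a device element by attribute value.
target = doc.find(".//device[@id='d2']")
print(target.find("name").text)            # -> Ventilator

# Delete: remove that element from its parent.
doc.remove(target)

# Add: append a new element with text content.
new_dev = ET.SubElement(doc, "device", id="d3")
ET.SubElement(new_dev, "name").text = "Defibrillator"

print(ET.tostring(doc, encoding="unicode"))
```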

  2. A proactive password checker

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1990-01-01

    Password selection has long been a difficult issue; traditionally, passwords are either assigned by the computer or chosen by the user. When the computer does the assignment, the passwords are often hard to remember; when the user makes the selection, the passwords are often easy to guess. This paper describes a technique, and a mechanism, to allow users to select passwords which to them are easy to remember but to others would be very difficult to guess. The technique is site, user, and group compatible, and allows rapid changing of constraints imposed upon the password. Although experience with this technique is limited, it appears to have much promise.
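
    The paper's checking mechanism is not reproduced in the abstract. A minimal sketch of a proactive checker, with purely illustrative constraints standing in for the site-, user- and group-specific rules the paper describes, might look like this:

```python
import re

def check_password(candidate: str, username: str, wordlist=frozenset()) -> list:
    """
    Return a list of reasons the candidate should be rejected; an empty list
    means the password passes these (illustrative) constraints. A real proactive
    checker would load configurable, rapidly changeable rules instead.
    """
    problems = []
    if len(candidate) < 10:
        problems.append("shorter than 10 characters")
    if username.lower() in candidate.lower():
        problems.append("contains the user name")
    if candidate.lower() in wordlist:
        problems.append("appears in a dictionary word list")
    if not re.search(r"[A-Za-z]", candidate) or not re.search(r"\d", candidate):
        problems.append("must mix letters and digits")
    return problems

print(check_password("hunter2", "alice", wordlist={"password", "hunter2"}))
```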

  3. Optimizing the Information Presentation on Mining Potential by using Web Services Technology with Restful Protocol

    NASA Astrophysics Data System (ADS)

    Abdillah, T.; Dai, R.; Setiawan, E.

    2018-02-01

    This study aims to develop an application of Web Services technology with the RESTful protocol to optimize the presentation of information on mining potential. The study used a User Interface Design approach for information accuracy and relevance, and Web Services for reliability in presenting the information. The results show that information accuracy and relevance regarding mining potential follow from a User Interface implementation based on the following rules: consideration of appropriate colours and objects, easy navigation, and user interaction through symbols and language understood by the users. Accuracy and relevance are further supported by presenting the information with charts and tool-tip text that help users understand the provided figures, while the reliability of the presentation is evident from the Web Services testing results (Figure 4.5.6). The study finds that User Interface Design and Web Services approaches (providing access from apps on different platforms) are able to optimize the presentation. The results of this study can be used as a reference for software developers and the Provincial Government of Gorontalo.
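
    The study's own service is not shown. As a generic sketch of a RESTful endpoint that returns mining-potential records as JSON for consumption by apps on different platforms, here is a minimal Flask example; the route, field names and values are hypothetical.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory data; a real service would query a database.
MINING_POTENTIAL = [
    {"district": "North", "commodity": "gold",      "estimated_tonnage": 1200},
    {"district": "South", "commodity": "limestone", "estimated_tonnage": 54000},
]

@app.route("/api/mining-potential", methods=["GET"])
def list_mining_potential():
    # A RESTful JSON response lets web, mobile and desktop clients share one backend.
    return jsonify(MINING_POTENTIAL)

if __name__ == "__main__":
    app.run(port=5000)
```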

  4. SimHap GUI: An intuitive graphical user interface for genetic association analysis

    PubMed Central

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-01-01

    Background Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to utilise them effectively. Results We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single-SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimates of haplotype simulation progress. Conclusion SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877

  5. DockoMatic 2.0: high throughput inverse virtual screening and homology modeling.

    PubMed

    Bullock, Casey; Cornia, Nic; Jacob, Reed; Remm, Andrew; Peavey, Thomas; Weekes, Ken; Mallory, Chris; Oxford, Julia T; McDougal, Owen M; Andersen, Timothy L

    2013-08-26

    DockoMatic is a free and open source application that unifies a suite of software programs within a user-friendly graphical user interface (GUI) to facilitate molecular docking experiments. Here we describe the release of DockoMatic 2.0; significant software advances include the ability to (1) conduct high throughput inverse virtual screening (IVS); (2) construct 3D homology models; and (3) customize the user interface. Users can now efficiently setup, start, and manage IVS experiments through the DockoMatic GUI by specifying receptor(s), ligand(s), grid parameter file(s), and docking engine (either AutoDock or AutoDock Vina). DockoMatic automatically generates the needed experiment input files and output directories and allows the user to manage and monitor job progress. Upon job completion, a summary of results is generated by Dockomatic to facilitate interpretation by the user. DockoMatic functionality has also been expanded to facilitate the construction of 3D protein homology models using the Timely Integrated Modeler (TIM) wizard. The wizard TIM provides an interface that accesses the basic local alignment search tool (BLAST) and MODELER programs and guides the user through the necessary steps to easily and efficiently create 3D homology models for biomacromolecular structures. The DockoMatic GUI can be customized by the user, and the software design makes it relatively easy to integrate additional docking engines, scoring functions, or third party programs. DockoMatic is a free comprehensive molecular docking software program for all levels of scientists in both research and education.

  6. Bayesian Network Webserver: a comprehensive tool for biological network modeling.

    PubMed

    Ziebarth, Jesse D; Bhattacharya, Anindya; Cui, Yan

    2013-11-01

    The Bayesian Network Webserver (BNW) is a platform for comprehensive network modeling of systems genetics and other biological datasets. It allows users to quickly and seamlessly upload a dataset, learn the structure of the network model that best explains the data and use the model to understand relationships between network variables. Many datasets, including those used to create genetic network models, contain both discrete (e.g. genotype) and continuous (e.g. gene expression traits) variables, and BNW allows for modeling hybrid datasets. Users of BNW can incorporate prior knowledge during structure learning through an easy-to-use structural constraint interface. After structure learning, users are immediately presented with an interactive network model, which can be used to make testable hypotheses about network relationships. BNW, including a downloadable structure learning package, is available at http://compbio.uthsc.edu/BNW. (The BNW interface for adding structural constraints uses HTML5 features that are not supported by current version of Internet Explorer. We suggest using other browsers (e.g. Google Chrome or Mozilla Firefox) when accessing BNW). ycui2@uthsc.edu. Supplementary data are available at Bioinformatics online.

  7. Gigwa-Genotype investigator for genome-wide analyses.

    PubMed

    Sempéré, Guilhem; Philippe, Florian; Dereeper, Alexis; Ruiz, Manuel; Sarah, Gautier; Larmande, Pierre

    2016-06-06

    Exploring the structure of genomes and analyzing their evolution is essential to understanding the ecological adaptation of organisms. However, with the large amounts of data being produced by next-generation sequencing, computational challenges arise in terms of storage, search, sharing, analysis and visualization. This is particularly true with regards to studies of genomic variation, which are currently lacking scalable and user-friendly data exploration solutions. Here we present Gigwa, a web-based tool that provides an easy and intuitive way to explore large amounts of genotyping data by filtering it not only on the basis of variant features, including functional annotations, but also on genotype patterns. The data storage relies on MongoDB, which offers good scalability properties. Gigwa can handle multiple databases and may be deployed in either single- or multi-user mode. In addition, it provides a wide range of popular export formats. The Gigwa application is suitable for managing large amounts of genomic variation data. Its user-friendly web interface makes such processing widely accessible. It can either be simply deployed on a workstation or be used to provide a shared data portal for a given community of researchers.
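
    Gigwa's internal document model is not described in the abstract. The sketch below only illustrates the kind of MongoDB filtering such a tool relies on, combining a functional-annotation filter with a genotype-pattern filter via pymongo; the database name and document layout are invented for the example.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # assumes a local MongoDB server
variants = client["genotype_db"]["variants"]        # hypothetical database/collection

# Hypothetical document layout: one document per variant, with an 'annotations'
# list and a per-sample genotype map, e.g. {"S1": "0/1", "S2": "1/1", ...}.
query = {
    "chrom": "chr1",
    "annotations.effect": "missense_variant",        # filter on functional annotation
    "genotypes.S1": {"$in": ["0/1", "1/1"]},          # genotype pattern: S1 carries the ALT allele
}
for doc in variants.find(query).limit(10):
    print(doc["chrom"], doc["pos"], doc["ref"], doc["alt"])
```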

  8. Pharmit: interactive exploration of chemical space.

    PubMed

    Sunseri, Jocelyn; Koes, David Ryan

    2016-07-08

    Pharmit (http://pharmit.csb.pitt.edu) provides an online, interactive environment for the virtual screening of large compound databases using pharmacophores, molecular shape and energy minimization. Users can import, create and edit virtual screening queries in an interactive browser-based interface. Queries are specified in terms of a pharmacophore, a spatial arrangement of the essential features of an interaction, and molecular shape. Search results can be further ranked and filtered using energy minimization. In addition to a number of pre-built databases of popular compound libraries, users may submit their own compound libraries for screening. Pharmit uses state-of-the-art sub-linear algorithms to provide interactive screening of millions of compounds. Queries typically take a few seconds to a few minutes depending on their complexity. This allows users to iteratively refine their search during a single session. The easy access to large chemical datasets provided by Pharmit simplifies and accelerates structure-based drug design. Pharmit is available under a dual BSD/GPL open-source license. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. Development of a user-centered radiology teaching file system

    NASA Astrophysics Data System (ADS)

    dos Santos, Marcelo; Fujino, Asa

    2011-03-01

    Learning radiology requires systematic and comprehensive study of a large knowledge base of medical images. This work presents the development of a digital radiology teaching file system. The proposed system has been created to offer a set of services customized to users' contexts and their informational needs. This is done by means of an electronic infrastructure that provides easy and integrated access to all relevant patient data at the time of image interpretation, so that radiologists and researchers can examine all available data to reach well-informed conclusions, while protecting patient data privacy and security. The system is presented as an environment that implements a distributed clinical database, including medical images, authoring tools, a repository for multimedia documents, and a peer-reviewed model that assures dataset quality. The current implementation has shown that creating clinical data repositories on networked computer environments is a good solution for reviewing information management practices in electronic environments and for creating customized, context-based tools for users connected to the system through electronic interfaces.

  10. Experience of wireless local area network in a radiation oncology department.

    PubMed

    Mandal, Abhijit; Asthana, Anupam Kumar; Aggarwal, Lalit Mohan

    2010-01-01

    The aim of this work is to develop a wireless local area network (LAN) between different types of users (Radiation Oncologists, Radiological Physicists, Radiation Technologists, etc.) for efficient patient data management and to make information readily available at the chair side, improving the quality of patient care in the Radiation Oncology department. We have used mobile workstations (laptops) and stationary workstations, all equipped with wireless-fidelity (Wi-Fi) access. The wireless standard 802.11g, as recommended by the Institute of Electrical and Electronics Engineers (IEEE, Piscataway, NJ), has been used. The wireless networking was configured with Service Set Identifier (SSID), Media Access Control (MAC) address filtering, and Wired Equivalent Privacy (WEP) network security. We are successfully using this wireless network to share the indigenously developed patient information management software. Proper selection of the hardware and the software, combined with a secure wireless LAN setup, leads to a more efficient and productive radiation oncology department.

  11. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    PubMed

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high throughput tool for proteomics based biomarker discovery. Until now, multiple challenges in protein MS data analysis remain: large-scale and complex data set management; MS peak identification, indexing; and high dimensional peak differential analysis with the concurrent statistical tests based false discovery rate (FDR). "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets to identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. Presented web application supplies large scale MS data online uploading and analysis with a simple user interface. This bioinformatic tool will facilitate the discovery of the potential protein biomarkers using MS.
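
    The portal's implementation is not shown, but the concurrent-testing FDR step it mentions is commonly handled with the Benjamini-Hochberg procedure. A minimal NumPy sketch of that procedure over per-peak p-values (assuming Benjamini-Hochberg is an acceptable stand-in) is given below.

```python
import numpy as np

def benjamini_hochberg(pvalues, alpha=0.05):
    """
    Return a boolean mask of peaks declared significant at the given FDR level
    using the Benjamini-Hochberg step-up procedure.
    """
    p = np.asarray(pvalues)
    m = p.size
    order = np.argsort(p)
    thresholds = alpha * (np.arange(1, m + 1) / m)
    passed = p[order] <= thresholds
    significant = np.zeros(m, dtype=bool)
    if passed.any():
        k = np.nonzero(passed)[0].max()         # largest rank meeting its threshold
        significant[order[: k + 1]] = True      # all smaller p-values pass as well
    return significant

pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
print(benjamini_hochberg(pvals, alpha=0.05))    # first two peaks are significant
```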

  12. A Global Repository for Planet-Sized Experiments and Observations

    NASA Technical Reports Server (NTRS)

    Williams, Dean; Balaji, V.; Cinquini, Luca; Denvil, Sebastien; Duffy, Daniel; Evans, Ben; Ferraro, Robert D.; Hansen, Rose; Lautenschlager, Michael; Trenham, Claire

    2016-01-01

    Working across U.S. federal agencies, international agencies, and multiple worldwide data centers, and spanning seven international network organizations, the Earth System Grid Federation (ESGF) allows users to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. Its architecture employs a system of geographically distributed peer nodes that are independently administered yet united by common federation protocols and application programming interfaces (APIs). The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP) output used by the Intergovernmental Panel on Climate Change assessment reports. Data served by ESGF not only include model output (i.e., CMIP simulation runs) but also include observational data from satellites and instruments, reanalyses, and generated images. Metadata summarize basic information about the data for fast and easy data discovery.

  13. Benefits of cloud computing for PACS and archiving.

    PubMed

    Koch, Patrick

    2012-01-01

    The goal of cloud-based services is to provide easy, scalable access to computing resources and IT services. The healthcare industry requires a private cloud that adheres to government mandates designed to ensure privacy and security of patient data while enabling access by authorized users. Cloud-based computing in the imaging market has evolved from a service that provided cost effective disaster recovery for archived data to fully featured PACS and vendor neutral archiving services that can address the needs of healthcare providers of all sizes. Healthcare providers worldwide are now using the cloud to distribute images to remote radiologists while supporting advanced reading tools, deliver radiology reports and imaging studies to referring physicians, and provide redundant data storage. Vendor managed cloud services eliminate large capital investments in equipment and maintenance, as well as staffing for the data center--creating a reduction in total cost of ownership for the healthcare provider.

  14. PlantDB – a versatile database for managing plant research

    PubMed Central

    Exner, Vivien; Hirsch-Hoffmann, Matthias; Gruissem, Wilhelm; Hennig, Lars

    2008-01-01

    Background Research in plant science laboratories often involves usage of many different species, cultivars, ecotypes, mutants, alleles or transgenic lines. This creates a great challenge to keep track of the identity of experimental plants and stored samples or seeds. Results Here, we describe PlantDB – a Microsoft® Office Access database – with a user-friendly front-end for managing information relevant for experimental plants. PlantDB can hold information about plants of different species, cultivars or genetic composition. Introduction of a concise identifier system allows easy generation of pedigree trees. In addition, all information about any experimental plant – from growth conditions and dates over extracted samples such as RNA to files containing images of the plants – can be linked unequivocally. Conclusion We have been using PlantDB for several years in our laboratory and found that it greatly facilitates access to relevant information. PMID:18182106
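
    PlantDB's identifier scheme is not detailed in the abstract. The sketch below only illustrates why concise identifiers with recorded parents make pedigree trees easy to generate; the ID format and records are invented.

```python
# Hypothetical concise identifiers: each plant record stores its parents' IDs,
# which is enough to print a pedigree tree (PlantDB's own scheme is not shown).
PLANTS = {
    "AT-0007": {"mother": "AT-0001", "father": "AT-0002"},
    "AT-0001": {"mother": None, "father": None},
    "AT-0002": {"mother": None, "father": None},
}

def print_pedigree(plant_id, depth=0):
    """Recursively print a plant and its ancestors, indented by generation."""
    print("  " * depth + plant_id)
    record = PLANTS.get(plant_id, {})
    for parent in ("mother", "father"):
        if record.get(parent):
            print_pedigree(record[parent], depth + 1)

print_pedigree("AT-0007")
```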

  15. Education and training in the MEDICOM system.

    PubMed

    Marinos, G; Palamas, S; Vlachos, I; Panou-Diamandi, O; Kalivas, D; Koutsouris, D

    2000-01-01

    The MEDICOM system is a worldwide telematics application for electronic commerce of medical devices. It has been designed to provide health care professionals with central Internet access to up-to-date information about medical equipment from multiple manufacturers in a particularly easy and friendly way. Moreover, the MEDICOM system will serve the health care professionals' requirements for high-quality information about specific products in the form of multimedia presentations, and will offer a secure communication channel with the community of manufacturers, especially for post-marketing surveillance. The system will provide the medical staff (physicians and technicians) with demonstrations of the operation procedures and the functioning of high-tech equipment in the form of virtual models. Moreover, through the MEDICOM system the end users of medical devices can have access to on-line libraries and participate in special newsgroups. This paper discusses the architectural structure of the MEDICOM system with emphasis on its educational and training functionality.

  16. TCIApathfinder: an R client for The Cancer Imaging Archive REST API.

    PubMed

    Russell, Pamela; Fountain, Kelly; Wolverton, Dulcy; Ghosh, Debashis

    2018-06-05

    The Cancer Imaging Archive (TCIA) hosts publicly available de-identified medical images of cancer from over 25 body sites and over 30,000 patients. Over 400 published studies have utilized freely available TCIA images. Images and metadata are available for download through a web interface or a REST API. Here we present TCIApathfinder, an R client for the TCIA REST API. TCIApathfinder wraps API access in user-friendly R functions that can be called interactively within an R session or easily incorporated into scripts. Functions are provided to explore the contents of the large database and to download image files. TCIApathfinder provides easy access to TCIA resources in the highly popular R programming environment. TCIApathfinder is freely available under the MIT license as a package on CRAN (https://cran.r-project.org/web/packages/TCIApathfinder/index.html) and at https://github.com/pamelarussell/TCIApathfinder. Copyright ©2018, American Association for Cancer Research.

  17. A New Cloud Architecture of Virtual Trusted Platform Modules

    NASA Astrophysics Data System (ADS)

    Liu, Dongxi; Lee, Jack; Jang, Julian; Nepal, Surya; Zic, John

    We propose and implement a cloud architecture of virtual Trusted Platform Modules (TPMs) to improve the usability of TPMs. In this architecture, virtual TPMs can be obtained from the TPM cloud on demand. Hence, TPM functionality is available to applications that do not have physical TPMs in their local platforms. Moreover, the TPM cloud allows users to access their keys and data in the same virtual TPM even if they move to untrusted platforms. The TPM cloud is easy to access for applications written in different languages, since cloud computing delivers services over standard protocols. The functionality of the TPM cloud is demonstrated by applying it to implement the Needham-Schroeder public-key protocol for web authentication, such that the strong security provided by TPMs is integrated into high-level applications. The chain of trust based on the TPM cloud is discussed and the security properties of the virtual TPMs in the cloud are analyzed.
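
    The paper's web-authentication implementation is not reproduced here. The snippet below is only a schematic of the Needham-Schroeder public-key message flow (with Lowe's well-known fix of including B's identity in message 2), using placeholder encryption helpers instead of real TPM-backed keys.

```python
import os

def encrypt_for(recipient_pub: bytes, payload: bytes) -> bytes:
    """Placeholder for public-key encryption under the recipient's (TPM-held) key."""
    return b"ENC[" + recipient_pub + b"]" + payload      # illustration only, not real crypto

def fresh_nonce() -> bytes:
    return os.urandom(16)

# Schematic Needham-Schroeder(-Lowe) public-key exchange between A and B.
na = fresh_nonce()
msg1 = encrypt_for(b"pkB", b"A|" + na)                   # 1. A -> B : {A, Na}_pkB
nb = fresh_nonce()
msg2 = encrypt_for(b"pkA", na + b"|" + nb + b"|B")       # 2. B -> A : {Na, Nb, B}_pkA  (Lowe's fix adds B)
msg3 = encrypt_for(b"pkB", nb)                           # 3. A -> B : {Nb}_pkB
# After step 3 both sides have exchanged fresh nonces Na, Nb and can derive a session key.
```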

  18. Application driven interface generation for EASIE. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Kao, Ya-Chen

    1992-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a user interface and a set of utility programs which support the rapid integration and execution of analysis programs about a central relational database. EASIE provides users with two basic modes of execution. One of them is a menu-driven execution mode, called Application-Driven Execution (ADE), which provides sufficient guidance to review data, select a menu action item, and execute an application program. The other mode of execution, called Complete Control Execution (CCE), provides an extended executive interface which allows in-depth control of the design process. Currently, the EASIE system is based on alphanumeric techniques only. It is the purpose of this project to extend the flexibility of the EASIE system in the ADE mode by implementing it in a window system. Secondly, a set of utilities will be developed to assist the experienced engineer in the generation of an ADE application.

  19. Concentrations of indoor pollutants (CIP) database user's manual (Version 4. 0)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apte, M.G.; Brown, S.R.; Corradi, C.A.

    1990-10-01

    This is the latest release of the database and the user manual. The user manual is a tutorial and reference for utilizing the CIP Database system. An installation guide is included to cover various hardware configurations. Numerous examples and explanations of the dialogue between the user and the database program are provided. It is hoped that this resource will, along with on-line help and the menu-driven software, make for a quick and easy learning curve. For the purposes of this manual, it is assumed that the user is acquainted with the goals of the CIP Database, which are: (1) to collect existing measurements of concentrations of indoor air pollutants in a user-oriented database and (2) to provide a repository of references citing measured field results openly accessible to a wide audience of researchers, policy makers, and others interested in the issues of indoor air quality. The database software, as distinct from the data, is contained in two files, CIP.EXE and PFIL.COM. CIP.EXE is made up of a number of programs written in dBase III command code and compiled using Clipper into a single, executable file. PFIL.COM is a program written in Turbo Pascal that handles the output of summary text files and is called from CIP.EXE. Version 4.0 of the CIP Database is current through March 1990.

  20. Data and Data Products for Climate Research: Web Services at the Asia-Pacific Data-Research Center (APDRC)

    NASA Astrophysics Data System (ADS)

    DeCarlo, S.; Potemra, J. T.; Wang, K.

    2012-12-01

    The International Pacific Research Center (IPRC) at the University of Hawaii maintains a data center for climate studies called the Asia-Pacific Data-Research Center (APDRC). This data center was designed within a center of excellence in climate research with the intention of serving the needs of the research scientist. The APDRC provides easy access to a broad collection of climate data and data products for a wide variety of users. The data center maintains an archive of approximately 100 data sets including in-situ and remote data, as well as a range of model-based output. All data are available via on-line browsing tools such as a Live Access Server (LAS) and DChart, and direct binary access is available through OPeNDAP services. On-line tutorials on how to use these services are now available. Users can keep up-to-date with new data and product announcements via the APDRC facebook page. The main focus of the APDRC has been climate scientists, and the services are therefore streamlined to such users, both in the number and types of data served and in the way data are served. In addition, due to the integration of the APDRC within the IPRC, several value-added data products (see figure for an example using Argo floats) have been developed via a variety of research activities. The APDRC, therefore, has three main foci: 1. acquisition of climate-related data, 2. maintenance of integrated data servers, and 3. development and distribution of data products. The APDRC can be found at http://apdrc.soest.hawaii.edu. The presentation will provide an overview along with specific examples of the data, data products and data services available at the APDRC. (Figure: APDRC product example, a gridded field from Argo profiling floats.)
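
    The "direct binary access ... through OPeNDAP services" mentioned above usually means a client can open a remote dataset lazily and pull only the slices it needs. A minimal Python sketch with xarray follows; the dataset path and variable name are placeholders, not actual APDRC holdings.

```python
# Minimal sketch of OPeNDAP-style direct access using xarray.
# The dataset URL and variable name below are placeholders, not real APDRC paths.
import xarray as xr

OPENDAP_URL = "http://apdrc.soest.hawaii.edu/dods/some_dataset"  # hypothetical path

ds = xr.open_dataset(OPENDAP_URL)        # lazily opens the remote dataset
print(ds)                                # inspect variables and coordinates
sst = ds["sst"].sel(time="2010-01")      # assumes a variable named "sst" with a time axis
print(float(sst.mean()))                 # data are fetched only when actually needed
```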

  1. Subsetting Tools for Enabling Easy Access to International Airborne Chemistry Data

    NASA Astrophysics Data System (ADS)

    Northup, E. A.; Chen, G.; Quam, B. M.; Beach, A. L., III; Silverman, M. L.; Early, A. B.

    2017-12-01

    In response to the Research Opportunities in Space and Earth Sciences (ROSES) 2015 release announcement for Advancing Collaborative Connections for Earth System Science (ACCESS), researchers at NASA Langley Research Center (LaRC) proposed to extend the capabilities of the existing Toolsets for Airborne Data (TAD) to include subsetting functionality to allow for easier access to international airborne field campaign data. Airborne field studies are commonly used to gain a detailed understanding of atmospheric processes for scientific research on international climate change and air quality issues. To accommodate the rigorous process for manipulating airborne field study chemistry data, and to lessen barriers for researchers, TAD was created with the ability to geolocate data from various sources measured on different time scales from a single flight. The analysis of airborne chemistry data typically requires data subsetting, which can be challenging and resource-intensive for end users. In an effort to streamline this process, new data subsetting features and updates to the current database model will be added to the TAD toolset. These will include two subsetters: a temporal/spatial subsetter and a vertical-profile subsetter. The temporal/spatial subsetter will allow users to focus on data from a specific location, a specific time period, or both. The vertical-profile subsetter will retrieve data collected during an individual aircraft ascent or descent spiral. These new web-based tools will allow for automation of the typically labor-intensive manual data subsetting process, which will provide users with data tailored to their specific research interests. The system has been designed to allow new in-situ airborne missions to be added as they become available, with only minor pre-processing required. The development of these enhancements will be discussed in this presentation.
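
    Conceptually, the two subsetters amount to simple filters over a geolocated flight-data table. The Python sketch below illustrates that idea with pandas; the column names and the climb-rate threshold are assumptions for illustration and do not reflect the actual TAD schema or algorithms.

```python
# Sketch of the two kinds of subsetting described above, applied to a flight
# data table with pandas. Column names (time, lat, lon, altitude) are assumed
# for illustration and do not reflect the actual TAD schema.
import pandas as pd

def temporal_spatial_subset(df, t0, t1, lat_box, lon_box):
    """Keep rows inside a time window and a lat/lon bounding box."""
    mask = (
        (df["time"] >= t0) & (df["time"] <= t1)
        & df["lat"].between(*lat_box) & df["lon"].between(*lon_box)
    )
    return df[mask]

def label_vertical_legs(df, min_climb=10.0):
    """Label each row as ascent/descent/level from the altitude change."""
    rate = df["altitude"].diff()
    out = df.copy()
    out["leg"] = rate.apply(
        lambda r: "ascent" if r > min_climb else "descent" if r < -min_climb else "level"
    )
    return out
```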

  2. AEGIS: a wildfire prevention and management information system

    NASA Astrophysics Data System (ADS)

    Kalabokidis, Kostas; Ager, Alan; Finney, Mark; Athanasis, Nikos; Palaiologou, Palaiologos; Vasilakos, Christos

    2016-03-01

    We describe a Web-GIS wildfire prevention and management platform (AEGIS) developed as an integrated and easy-to-use decision support tool to manage wildland fire hazards in Greece (http://aegis.aegean.gr). The AEGIS platform assists with early fire warning, fire planning, fire control and coordination of firefighting forces by providing online access to information that is essential for wildfire management. The system uses a number of spatial and non-spatial data sources to support key system functionalities. Land use/land cover maps were produced by combining field inventory data with high-resolution multispectral satellite images (RapidEye). These data support wildfire simulation tools that allow the users to examine potential fire behavior and hazard with the Minimum Travel Time fire spread algorithm. End-users provide a minimum number of inputs, such as fire duration, ignition point and weather information, to conduct a fire simulation. AEGIS offers three types of simulations, i.e., single-fire propagation, point-scale calculation of potential fire behavior, and burn probability analysis, similar to the FlamMap fire behavior modeling software. Artificial neural networks (ANNs) were utilized for wildfire ignition risk assessment based on various parameters, training methods, activation functions, pre-processing methods and network structures. The combination of ANNs and expected burned-area maps is used to generate an integrated fire hazard prediction map. The system also incorporates weather information obtained from remote automatic weather stations and weather forecast maps. The system and associated computation algorithms leverage parallel processing techniques (i.e., High Performance Computing and Cloud Computing) that provide the computational power required for real-time application. All AEGIS functionalities are accessible to authorized end-users through a web-based graphical user interface. An innovative smartphone application, AEGIS App, also provides mobile access to the web-based version of the system.

  3. NASA's Earth Observing Data and Information System

    NASA Technical Reports Server (NTRS)

    Mitchell, Andrew E.; Behnke, Jeanne; Lowe, Dawn; Ramapriyan, H. K.

    2009-01-01

    NASA's Earth Observing System Data and Information System (EOSDIS) has been a central component of NASA's Earth observation program for over 10 years. It is one of the largest civilian science information systems in the US, performing ingest, archive and distribution of over 3 terabytes of data per day, much of which is from NASA's flagship missions Terra, Aqua and Aura. The system supports a variety of science disciplines including polar processes, land cover change, radiation budget, and most especially global climate change. The EOSDIS data centers, collocated with centers of science discipline expertise, archive and distribute standard data products produced by science investigator-led processing systems. Key to the success of EOSDIS is the concept of core versus community requirements. EOSDIS supports a core set of services to meet specific NASA needs and relies on community-developed services to meet specific user needs. EOSDIS offers a metadata registry, ECHO (Earth Observing System Clearinghouse), through which the scientific community can easily discover and exchange NASA's Earth science data and services. Users can search, manage, and access the contents of ECHO's registries (data and services) through user-developed and community-tailored interfaces or clients. The ECHO framework has become the primary access point for cross-data-center search-and-order of EOSDIS and other Earth Science data holdings archived at the EOSDIS data centers. ECHO's Warehouse Inventory Search Tool (WIST) is the primary web-based client for discovering and ordering cross-discipline data from the EOSDIS data centers. The architecture of the EOSDIS provides a platform for the publication, discovery, understanding and access to NASA's Earth Observation resources and allows for easy integration of new datasets. The EOSDIS has also developed several methods for incorporating socioeconomic data into its data collection. Over the years, we have developed several methods for determining the needs of the user community, including use of the American Customer Satisfaction Index and a broad metrics program.

  4. An automatically updateable web publishing solution: taking document sharing and conversion to enterprise level

    NASA Astrophysics Data System (ADS)

    Rahman, Fuad; Tarnikova, Yuliya; Hartono, Rachmat; Alam, Hassan

    2006-01-01

    This paper presents a novel automatic web publishing solution, PageView (R). PageView (R) is a complete working solution for document processing and management. The principal aim of this tool is to allow workgroups to share, access and publish documents on-line on a regular basis. For example, assume that a person is working on some documents; the user will, in some fashion, organize the work either in a local directory or on a shared network drive. Now extend that concept to a workgroup. Within a workgroup, some users are working together on some documents, and they are saving them in a directory structure somewhere on a document repository. The next stage of this reasoning is that a workgroup is working on some documents and wants to publish them routinely on-line. It may happen that the members are using different editing tools, different software, and different graphics tools. The resulting documents may be in PDF, Microsoft Office (R), HTML, or Word Perfect format, just to name a few. In general, this process requires the documents to be converted to HTML, after which a web designer needs to work on that collection to make it available on-line. PageView (R) takes care of this whole process automatically, making the document workflow clean and easy to follow. PageView (R) Server publishes documents, complete with the directory structure, for online use. The documents are automatically converted to HTML and PDF so that users can view the content without downloading the original files or having to download browser plug-ins. Once published, other users can access the documents as if they were accessing them from their local folders. The paper describes the complete working system and discusses possible applications within document management research.

  5. Long-term effectiveness of the SpeechEasy fluency-enhancement device.

    PubMed

    Gallop, Ronald F; Runyan, Charles M

    2012-12-01

    The SpeechEasy has been found to be an effective device for reducing stuttering frequency for many people who stutter (PWS); published studies typically have compared stuttering reduction at initial fitting of the device to results achieved up to one year later. This study examines long-term effectiveness by examining whether effects of the SpeechEasy were maintained for longer periods, from 13 to 59 months. Results indicated no significant change for seven device users from post-fitting to the time of the study (t=-.074, p=.943); however, findings varied greatly on a case-by-case basis. Most notably, when stuttering frequency for eleven users and former users, prior to device fitting, was compared to current stuttering frequency while not wearing the device, the change over time was found to be statistically significant (t=2.851, p=.017), suggesting a carry-over effect of the device. There was no significant difference in stuttering frequency when users were wearing versus not wearing the device currently (t=1.949, p=0.92). Examinations of these results, as well as directions for future research, are described herein. The reader will be able to: (a) identify and briefly describe two types of altered auditory feedback which the SpeechEasy incorporates in order to help reduce stuttering; (b) describe the carry-over effect found in this study, which suggests effectiveness of the device over a longer period of time than previously reported, and its implications; and (c) list factors that might be assessed in future research involving this device in order to more narrowly determine which prospective users are most likely to benefit from employing the SpeechEasy. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. The IRIS Federator: Accessing Seismological Data Across Data Centers

    NASA Astrophysics Data System (ADS)

    Trabant, C. M.; Van Fossen, M.; Ahern, T. K.; Weekly, R. T.

    2015-12-01

    In 2013 the International Federation of Digital Seismograph Networks (FDSN) approved a specification for web service interfaces for accessing seismological station metadata, time series and event parameters. Since then, a number of seismological data centers have implemented FDSN service interfaces, with more implementations in development. We have developed a new system called the IRIS Federator which leverages this standardization and provides the scientific community with a service for easy discovery and access of seismological data across FDSN data centers. These centers are located throughout the world and this work represents one model of a system for data collection across geographic and political boundaries. The main components of the IRIS Federator are a catalog of time series metadata holdings at each data center and a web service interface for searching the catalog. The service interface is designed to support client-side federated data access, a model in which the client (software run by the user) queries the catalog and then collects the data from each identified center. By default the results are returned in a format suitable for direct submission to those web services, but could also be formatted in a simple text format for general data discovery purposes. The interface will remove any duplication of time series channels between data centers according to a set of business rules by default; however, a user may request results with all duplicate time series entries included. We will demonstrate how client-side federation is being incorporated into some of the DMC's data access tools. We anticipate further enhancement of the IRIS Federator to improve data discovery in various scenarios and to improve usefulness to communities beyond seismology. Data centers with FDSN web services: http://www.fdsn.org/webservices/ ; the IRIS Federator query interface: http://service.iris.edu/irisws/fedcatalog/1/
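
    A client-side federation query against the fedcatalog endpoint quoted above might look like the following Python sketch. The query parameters follow common FDSN conventions (network, station, channel, time window) and the example stream is arbitrary; treat both as assumptions.

```python
# Sketch of a client-side query against the IRIS fedcatalog service named in
# the abstract. Query parameters follow common FDSN conventions; the example
# network/station/channel and dates are arbitrary.
import requests

FEDCATALOG = "http://service.iris.edu/irisws/fedcatalog/1/query"

params = {
    "net": "IU", "sta": "ANMO", "cha": "BHZ",          # assumed example stream
    "starttime": "2015-01-01", "endtime": "2015-01-02",
}
resp = requests.get(FEDCATALOG, params=params, timeout=60)
resp.raise_for_status()
# The response lists, per data center, the service URLs and channel epochs the
# client should contact next to retrieve the actual time series.
print(resp.text[:500])
```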

  7. SAAMBE: Webserver to Predict the Charge of Binding Free Energy Caused by Amino Acids Mutations.

    PubMed

    Petukh, Marharyta; Dai, Luogeng; Alexov, Emil

    2016-04-12

    Predicting the effect of amino acid substitutions on protein-protein affinity (typically evaluated via the change of protein binding free energy) is important for both understanding the disease-causing mechanism of missense mutations and guiding protein engineering. In addition, researchers are also interested in understanding which energy components are most affected by the mutation and how the mutation affects the overall structure of the corresponding protein. Here we report a webserver, the Single Amino Acid Mutation based change in Binding free Energy (SAAMBE) webserver, which addresses the demand for tools for predicting the change of protein binding free energy. SAAMBE is an easy-to-use webserver which only requires that a coordinate file be inputted, and the user is provided with various, but easy to navigate, options. The user specifies the mutation position, wild-type residue and type of mutation to be made. The server predicts the binding free energy change and the changes of the corresponding energy components, and provides the energy-minimized 3D structure of the wild type and mutant proteins for download. The SAAMBE protocol performance was tested by benchmarking the predictions against over 1300 experimentally determined changes of binding free energy, and a Pearson correlation coefficient of 0.62 was obtained. How the predictions can be used for discriminating disease-causing from harmless mutations is discussed. The webserver can be accessed via http://compbio.clemson.edu/saambe_webserver/.

  8. Access Control Mechanism for IoT Environments Based on Modelling Communication Procedures as Resources.

    PubMed

    Cruz-Piris, Luis; Rivera, Diego; Marsa-Maestre, Ivan; de la Hoz, Enrique; Velasco, Juan R

    2018-03-20

    Internet growth has generated new types of services where the use of sensors and actuators is especially remarkable. These services compose what is known as the Internet of Things (IoT). One of the biggest current challenges is obtaining a safe and easy access control scheme for the data managed in these services. We propose integrating IoT devices in an access control system designed for Web-based services by modelling certain IoT communication elements as resources. This would allow us to obtain a unified access control scheme between heterogeneous devices (IoT devices, Internet-based services, etc.). To achieve this, we have analysed the most relevant communication protocols for these kinds of environments and then we have proposed a methodology which allows the modelling of communication actions as resources. Then, we can protect these resources using access control mechanisms. The validation of our proposal has been carried out by selecting a communication protocol based on message exchange, specifically Message Queuing Telemetry Transport (MQTT). As an access control scheme, we have selected User-Managed Access (UMA), an existing Open Authorization (OAuth) 2.0 profile originally developed for the protection of Internet services. We have performed tests focused on validating the proposed solution in terms of the correctness of the access control system. Finally, we have evaluated the energy consumption overhead when using our proposal.

  9. Access Control Mechanism for IoT Environments Based on Modelling Communication Procedures as Resources

    PubMed Central

    2018-01-01

    Internet growth has generated new types of services where the use of sensors and actuators is especially remarkable. These services compose what is known as the Internet of Things (IoT). One of the biggest current challenges is obtaining a safe and easy access control scheme for the data managed in these services. We propose integrating IoT devices in an access control system designed for Web-based services by modelling certain IoT communication elements as resources. This would allow us to obtain a unified access control scheme between heterogeneous devices (IoT devices, Internet-based services, etc.). To achieve this, we have analysed the most relevant communication protocols for these kinds of environments and then we have proposed a methodology which allows the modelling of communication actions as resources. Then, we can protect these resources using access control mechanisms. The validation of our proposal has been carried out by selecting a communication protocol based on message exchange, specifically Message Queuing Telemetry Transport (MQTT). As an access control scheme, we have selected User-Managed Access (UMA), an existing Open Authorization (OAuth) 2.0 profile originally developed for the protection of Internet services. We have performed tests focused on validating the proposed solution in terms of the correctness of the access control system. Finally, we have evaluated the energy consumption overhead when using our proposal. PMID:29558406
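
    To make the idea concrete, the sketch below shows an MQTT client presenting an access token when it publishes to a topic, i.e., the communication action treated as a protected resource. The broker address, topic, and token-as-password convention are assumptions for illustration; the UMA/OAuth token acquisition itself is not shown and this is not the authors' implementation.

```python
# Sketch: publishing to an MQTT topic as a protected resource. The client
# presents an access token obtained out-of-band (the UMA/OAuth exchange is
# not shown). Broker address and token-as-password convention are assumptions.
import paho.mqtt.client as mqtt

ACCESS_TOKEN = "signed-token-from-authorization-server"   # placeholder

client = mqtt.Client(client_id="sensor-42")   # paho-mqtt 1.x constructor; 2.x also
                                              # takes a CallbackAPIVersion argument
client.username_pw_set(username="sensor-42", password=ACCESS_TOKEN)
client.connect("broker.example.org", 1883)    # hypothetical broker
client.loop_start()

# The broker (or a gateway in front of it) checks the token's permissions
# before accepting the publish on this topic.
client.publish("building/room-1/temperature", payload="21.5", qos=1)

client.loop_stop()
client.disconnect()
```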

  10. DooSo6: Easy Collaboration over Shared Projects

    NASA Astrophysics Data System (ADS)

    Ignat, Claudia-Lavinia; Oster, Gérald; Molli, Pascal

    Existing tools for supporting parallel work have some disadvantages that prevent them from being widely used. Very often they require a complex installation and the creation of accounts for all group members. Users need to learn and deal with complex commands to use these collaborative tools efficiently. Some tools require users to abandon their favourite editors and force them to use a particular co-authorship application. In this paper, we propose the DooSo6 collaboration tool that offers support for parallel work, requires no installation and no creation of accounts, and is easy to use, allowing users to continue working with their favourite editors. User authentication is achieved by means of a capability-based mechanism.

  11. MapApp: A Java(TM) Applet for Accessing Geographic Databases

    NASA Astrophysics Data System (ADS)

    Haxby, W.; Carbotte, S.; Ryan, W. B.; OHara, S.

    2001-12-01

    MapApp (http://coast.ldeo.columbia.edu/help/MapApp.html) is a prototype Java(TM) applet that is intended to give easy and versatile access to geographic data sets through a web browser. It was developed initially to interface with the RIDGE Multibeam Synthesis. Subsequently, interfaces with other geophysical databases were added. At present, multibeam bathymetry grids, underway geophysics along ship tracks, and the LDEO Borehole Research Group's ODP well logging database are accessible through MapApp. We plan to add an interface with the Ridge Petrology Database in the near future. The central component of MapApp is a world physiographic map. Users may navigate around the map (zoom/pan) without waiting for HTTP requests to a remote server to be processed. A focus request loads image tiles from the server to compose a new map at the current viewing resolution. Areas in which multibeam grids are available may be focused to a pixel resolution of about 200 m. These areas may be identified by toggling a mask. Databases may be accessed through menus, and selected data objects may be loaded into MapApp by selecting items from tables. Once loaded, a bathymetry grid may be contoured or used to create bathymetric profiles; ship tracks and ODP sites may be overlain on the map and their geophysical data plotted in X-Y graphs. The advantage of applets over traditional web pages is that they permit dynamic interaction with data sets, while limiting time consuming interaction with a remote server. Users may customize the graphics display by modifying the scale, or the symbol or line characteristics of rendered data, contour interval, etc. The ease with which users can select areas, view the physiography of areas, and preview data sets and evaluate them for quality and applicability, makes MapApp a valuable tool for education and research.

  12. Integrating NASA Earth Science Enterprise (ESE) Data Into Global Agricultural Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Teng, W.; Kempler, S.; Chiu, L.; Doraiswamy, P.; Liu, Z.; Milich, L.; Tetrault, R.

    2003-12-01

    Monitoring global agricultural crop conditions during the growing season and estimating potential seasonal production are critically important for market development of U.S. agricultural products and for global food security. Two major operational users of satellite remote sensing for global crop monitoring are the USDA Foreign Agricultural Service (FAS) and the U.N. World Food Programme (WFP). The primary goal of FAS is to improve foreign market access for U.S. agricultural products. The WFP uses food to meet emergency needs and to support economic and social development. Both use global agricultural decision support systems that can integrate and synthesize a variety of data sources to provide accurate and timely information on global crop conditions. The Goddard Space Flight Center Earth Sciences Distributed Active Archive Center (GES DAAC) has begun a project to provide operational solutions to FAS and WFP, by fully leveraging results from previous work, as well as from existing capabilities of the users. The GES DAAC has effectively used its recently developed prototype TRMM Online Visualization and Analysis System (TOVAS) to provide ESE data and information to the WFP for its agricultural drought monitoring efforts. This prototype system will be evolved into an Agricultural Information System (AIS), which will operationally provide ESE and other data products (e.g., rainfall, land productivity) and services, to be integrated into and thus enhance the existing GIS-based, decision support systems of FAS and WFP. Agriculture-oriented, ESE data products (e.g., MODIS-based, crop condition assessment product; TRMM derived, drought index product) will be input to a crop growth model in collaboration with the USDA Agricultural Research Service, to generate crop condition and yield prediction maps. The AIS will have the capability for remotely accessing distributed data, by being compliant with community-based interoperability standards, enabling easy access to agriculture-related products from other data producers. The AIS's system approach will provide a generic mechanism for easily incorporating new products and making them accessible to users.

  13. A RESTful API for accessing microbial community data for MG-RAST

    DOE PAGES

    Wilke, Andreas; Bischof, Jared; Harrison, Travis; ...

    2015-01-08

    Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 Terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programmers interface) we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us) we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase’s microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. This API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service.

  14. A RESTful API for Accessing Microbial Community Data for MG-RAST

    PubMed Central

    Wilke, Andreas; Bischof, Jared; Harrison, Travis; Brettin, Tom; D'Souza, Mark; Gerlach, Wolfgang; Matthews, Hunter; Paczian, Tobias; Wilkening, Jared; Glass, Elizabeth M.; Desai, Narayan; Meyer, Folker

    2015-01-01

    Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 Terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programmers interface) we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us) we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase's microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. This API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service. PMID:25569221
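
    A typical interaction with a RESTful API of this kind is a plain HTTP GET that returns JSON. The Python sketch below illustrates the pattern; the API host, resource path, verbosity parameter, and accession are assumptions based on the description above, not verified details of the MG-RAST service.

```python
# Sketch of the kind of call a RESTful metagenomics API supports, returning JSON.
# The base URL, resource path, and accession are assumptions for illustration.
import requests

BASE = "https://api.mg-rast.org"              # assumed API host

def get_metagenome(metagenome_id, verbosity="metadata"):
    """Fetch one metagenome record as a JSON object."""
    url = f"{BASE}/metagenome/{metagenome_id}"
    resp = requests.get(url, params={"verbosity": verbosity}, timeout=60)
    resp.raise_for_status()
    return resp.json()

record = get_metagenome("mgm4447943.3")       # hypothetical accession
print(record.get("name"), record.get("status"))
```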

  15. SEMTAP (Serpentine End Match TApe program): The Easy Way to Program Your Numerically Controlled Router for the Production of SEM Joints

    Treesearch

    Ronald E. Coleman

    1977-01-01

    SEMTAP (Serpentine End Match TApe Program) is an easy and inexpensive method of programming a numerically controlled router for the manufacture of SEM (Serpentine End Matching) joints. The SEMTAP computer program allows the user to issue commands that will accurately direct a numerically controlled router along any SEM path. The user need not be a computer programmer to...

  16. Construction of a database for published phase II/III drug intervention clinical trials for the period 2009-2014 comprising 2,326 records, 90 disease categories, and 939 drug entities.

    PubMed

    Jeong, Sohyun; Han, Nayoung; Choi, Boyoon; Sohn, Minji; Song, Yun-Kyoung; Chung, Myeon-Woo; Na, Han-Sung; Ji, Eunhee; Kim, Hyunah; Rhew, Ki Yon; Kim, Therasa; Kim, In-Wha; Oh, Jung Mi

    2016-06-01

    To construct a database of published clinical drug trials suitable for use 1) as a research tool for accessing clinical trial information and 2) in evidence-based decision-making by regulatory professionals, clinical research investigators, and medical practitioners. Comprehensive information was obtained from a search of design elements and results of clinical trials in peer-reviewed journals using PubMed (http://www.ncbi.nlm.nih.gov/pubmed). The methodology to develop a structured database was devised by a panel composed of experts in medicine, pharmacy, and information technology, together with members of the Ministry of Food and Drug Safety (MFDS), using a step-by-step approach. A two-sided system consisting of a user mode and a manager mode served as the framework for the database; elements of interest from each trial were entered via the secure manager mode, enabling the input information to be accessed in a user-friendly manner (user mode). Information regarding the methodology used and the results of drug treatment was extracted as detailed elements of each data set and then entered into the web-based database system. Comprehensive information comprising 2,326 clinical trial records, 90 disease states, and 939 drug entities, covering study objectives, background, methods, results, and conclusions, could be extracted from published reports of phase II/III drug intervention clinical trials appearing in SCI journals within the last 10 years. The extracted data were successfully assembled into an easily accessible clinical drug trial database suitable for use as a research tool. The clinically most important therapeutic categories, i.e., cancer, cardiovascular, respiratory, neurological, metabolic, urogenital, gastrointestinal, psychological, and infectious diseases, are covered by the database. Names of test and control drugs, details on primary and secondary outcomes, and indexed keywords could also be retrieved and built into the database. The structure of the database enables the user to sort and download targeted information as a Microsoft Excel spreadsheet. Because of the comprehensive and standardized nature of the clinical drug trial database and its ease of access, it should serve as a valuable information repository and research tool for accessing clinical trial information and making evidence-based decisions by regulatory professionals, clinical research investigators, and medical practitioners.

  17. User's Guide for the Commercial Modular Aero-Propulsion System Simulation (C-MAPSS): Version 2

    NASA Technical Reports Server (NTRS)

    Liu, Yuan; Frederick, Dean K.; DeCastro, Jonathan A.; Litt, Jonathan S.; Chan, William W.

    2012-01-01

    This report is a User's Guide for version 2 of the NASA-developed Commercial Modular Aero-Propulsion System Simulation (C-MAPSS) software, which is a transient simulation of a large commercial turbofan engine (up to 90,000-lb thrust) with a realistic engine control system. The software supports easy access to health, control, and engine parameters through a graphical user interface (GUI). C-MAPSS v.2 has some enhancements over the original, including three actuators rather than one, the addition of actuator and sensor dynamics, and an improved controller, while retaining or improving on the convenience and user-friendliness of the original. C-MAPSS v.2 provides the user with a graphical turbofan engine simulation environment in which advanced algorithms can be implemented and tested. C-MAPSS can run user-specified transient simulations, and it can generate state-space linear models of the nonlinear engine model at an operating point. The code has a number of GUI screens that allow point-and-click operation and have editable fields for user-specified input. The software includes an atmospheric model which allows simulation of engine operation at altitudes from sea level to 40,000 ft, Mach numbers from 0 to 0.90, and ambient temperatures from -60 to 103 F. The package also includes a power-management system that allows the engine to be operated over a wide range of thrust levels throughout the full range of flight conditions.

  18. Interactive and Approachable Web-Based Tools for Exploring Global Geophysical Data Records

    NASA Astrophysics Data System (ADS)

    Croteau, M. J.; Nerem, R. S.; Merrifield, M. A.; Thompson, P. R.; Loomis, B. D.; Wiese, D. N.; Zlotnicki, V.; Larson, J.; Talpe, M.; Hardy, R. A.

    2017-12-01

    Making global and regional data accessible and understandable for non-experts can be both challenging and hazardous. While data products are often developed with end users in mind, the ease of use of these data can vary greatly. Scientists must take care to provide detailed guides for how to use data products to ensure users are not incorrectly applying data to their problem. For example, terrestrial water storage data from the Gravity Recovery and Climate Experiment (GRACE) satellite mission are notoriously difficult for non-experts to access and correctly use. However, making these data easily accessible to scientists outside the GRACE community is desirable because it would allow the data to see much more widespread use. We have developed a web-based interactive mapping and plotting tool that provides easy access to geophysical data. This work presents an intuitive method for making such data widely accessible to experts and non-experts alike, making the data approachable and ensuring proper use of the data. This tool has proven helpful to experts by providing fast and detailed access to the data. Simultaneously, the tool allows non-experts to gain familiarity with the information contained in the data and access to that information for both scientific studies and public use. In this presentation, we discuss the development of this tool and its application to both GRACE and ocean altimetry satellite missions, and demonstrate the capabilities of the tool. Focusing on the data visualization aspects of the tool, we showcase our integrations of the Mapbox API and the D3.js data-driven web document framework. We then explore the potential of these tools in other web-based visualization projects, and how incorporation of such tools into science can improve the presentation of research results. We demonstrate how the development of an interactive and exploratory resource can enable further layers of exploratory and scientific discovery.

  19. jSPyDB, an open source database-independent tool for data management

    NASA Astrophysics Data System (ADS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, and they are platform-dependent and very CPU- and memory-consuming. jSPyDB is a free web-based tool written using Python and Javascript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and to configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since users are not given the possibility to directly execute arbitrary SQL statements.
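
    The database-independent layer described above rests on libraries such as SQLAlchemy, where the same code path can talk to different backends simply by swapping the connection URL. A minimal sketch of that idea follows; the connection string and table name are illustrative only and unrelated to jSPyDB's internals.

```python
# Minimal sketch of database-independent access with SQLAlchemy (1.4+ style):
# the same code works against different backends by changing the URL.
# The connection string and table name below are illustrative only.
from sqlalchemy import create_engine, MetaData, select

engine = create_engine("sqlite:///example.db")   # could be postgresql://, oracle://, ...
metadata = MetaData()
metadata.reflect(bind=engine)                     # discover existing tables

for name, table in metadata.tables.items():
    print(name, [c.name for c in table.columns])  # list columns per table

# Read-only query through the same abstraction (no hand-written SQL strings).
if "users" in metadata.tables:
    users = metadata.tables["users"]
    with engine.connect() as conn:
        for row in conn.execute(select(users).limit(5)):
            print(row)
```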

  20. Improved Functionality and Curation Support in the ADS

    NASA Astrophysics Data System (ADS)

    Accomazzi, Alberto; Kurtz, Michael J.; Henneken, Edwin A.; Grant, Carolyn S.; Thompson, Donna; Chyla, Roman; Holachek, Alexandra; Sudilovsky, Vladimir; Murray, Stephen S.

    2015-01-01

    In this poster we describe the developments of the new ADS platform over the past year, focusing on the functionality which improves its discovery and curation capabilities. The ADS Application Programming Interface (API) is being updated to support authenticated access to the entire suite of ADS services, in addition to the search functionality itself. This allows programmatic access to resources which are specific to a user or class of users. A new interface, built directly on top of the API, now provides a more intuitive search experience and takes into account the best practices in web usability and responsive design. The interface now incorporates in-line views of graphics from the AAS Astroexplorer and the ADS All-Sky Survey image collections. The ADS Private Libraries, first introduced over 10 years ago, are now being enhanced to allow the bookmarking, tagging and annotation of records of interest. In addition, libraries can be shared with one or more ADS users, providing an easy way to collaborate in the curation of lists of papers. A library can also be explicitly made public and shared at large via the publishing of its URL. In collaboration with the AAS, the ADS plans to support the adoption of ORCID identifiers by implementing a plugin which will simplify the import of papers in ORCID via a query to the ADS API. Deeper integration between the two systems will depend on available resources and feedback from the community.

  1. Bio-Docklets: virtualization containers for single-step execution of NGS pipelines.

    PubMed

    Kim, Baekdoo; Ali, Thahmina; Lijeron, Carlos; Afgan, Enis; Krampis, Konstantinos

    2017-08-01

    Processing of next-generation sequencing (NGS) data requires significant technical skills, involving installation, configuration, and execution of bioinformatics data pipelines, in addition to specialized postanalysis visualization and data mining software. In order to address some of these challenges, developers have leveraged virtualization containers toward seamless deployment of preconfigured bioinformatics software and pipelines on any computational platform. We present an approach for abstracting the complex data operations of multistep bioinformatics pipelines for NGS data analysis. As examples, we have deployed 2 pipelines for RNA sequencing and chromatin immunoprecipitation sequencing, preconfigured within Docker virtualization containers we call Bio-Docklets. Each Bio-Docklet exposes a single data input and output endpoint and, from a user perspective, running a pipeline is as simple as running a single bioinformatics tool. This is achieved using a "meta-script" that automatically starts the Bio-Docklets and controls the pipeline execution through the BioBlend software library and the Galaxy Application Programming Interface. The pipeline output is postprocessed by integration with the Visual Omics Explorer framework, providing interactive data visualizations that users can access through a web browser. Our goal is to enable easy access to NGS data analysis pipelines for nonbioinformatics experts on any computing environment, whether a laboratory workstation, university computer cluster, or a cloud service provider. Beyond end users, the Bio-Docklets also enable developers to programmatically deploy and run a large number of pipeline instances for concurrent analysis of multiple datasets. © The Authors 2017. Published by Oxford University Press.

  2. Bio-Docklets: virtualization containers for single-step execution of NGS pipelines

    PubMed Central

    Kim, Baekdoo; Ali, Thahmina; Lijeron, Carlos; Afgan, Enis

    2017-01-01

    Processing of next-generation sequencing (NGS) data requires significant technical skills, involving installation, configuration, and execution of bioinformatics data pipelines, in addition to specialized postanalysis visualization and data mining software. In order to address some of these challenges, developers have leveraged virtualization containers toward seamless deployment of preconfigured bioinformatics software and pipelines on any computational platform. We present an approach for abstracting the complex data operations of multistep bioinformatics pipelines for NGS data analysis. As examples, we have deployed 2 pipelines for RNA sequencing and chromatin immunoprecipitation sequencing, preconfigured within Docker virtualization containers we call Bio-Docklets. Each Bio-Docklet exposes a single data input and output endpoint and, from a user perspective, running a pipeline is as simple as running a single bioinformatics tool. This is achieved using a “meta-script” that automatically starts the Bio-Docklets and controls the pipeline execution through the BioBlend software library and the Galaxy Application Programming Interface. The pipeline output is postprocessed by integration with the Visual Omics Explorer framework, providing interactive data visualizations that users can access through a web browser. Our goal is to enable easy access to NGS data analysis pipelines for nonbioinformatics experts on any computing environment, whether a laboratory workstation, university computer cluster, or a cloud service provider. Beyond end users, the Bio-Docklets also enable developers to programmatically deploy and run a large number of pipeline instances for concurrent analysis of multiple datasets. PMID:28854616
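
    As a rough picture of how a "meta-script" can drive a Galaxy-based pipeline through BioBlend, consider the hedged sketch below. The Galaxy URL, API key, file name, and history name are placeholders; this is not the Bio-Docklets code itself, just the general pattern of calls.

```python
# Hedged sketch of the kind of BioBlend calls a "meta-script" can use to drive
# a Galaxy-based pipeline. URL, API key, and file path are placeholders; this
# is not the Bio-Docklets implementation.
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="http://localhost:8080", key="YOUR_API_KEY")  # placeholders

history = gi.histories.create_history(name="ngs-run-1")       # workspace for this run
upload = gi.tools.upload_file("reads.fastq.gz", history["id"])  # stage input data

# Inspect available workflows; invoking one would launch the preconfigured pipeline.
for wf in gi.workflows.get_workflows():
    print(wf["id"], wf["name"])
```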

  3. FASH: A web application for nucleotides sequence search.

    PubMed

    Veksler-Lublinksy, Isana; Barash, Danny; Avisar, Chai; Troim, Einav; Chew, Paul; Kedem, Klara

    2008-05-27

    FASH (Fourier Alignment Sequence Heuristics) is a web application, based on the Fast Fourier Transform, for finding remote homologs within a long nucleic acid sequence. Given a query sequence and a long text sequence (e.g., the human genome), FASH detects subsequences within the text that are remotely similar to the query. FASH offers an alternative approach to Blast/Fasta for querying long RNA/DNA sequences. FASH differs from these other approaches in that it does not depend on the existence of contiguous seed sequences in its initial detection phase. The FASH web server is user-friendly and very easy to operate. FASH can be accessed at https://fash.bgu.ac.il:8443/fash/default.jsp (secured website).
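
    The general FFT trick behind tools of this kind (not necessarily FASH's exact heuristics) is to encode each base as an indicator vector and use FFT-based cross-correlation to count exact matches at every alignment offset in one pass. A small numpy sketch:

```python
# Illustration of the general FFT idea behind tools like FASH (not its actual
# algorithm): encode each base as an indicator vector and use FFT-based
# cross-correlation to count exact matches at every offset in one pass.
import numpy as np

def match_counts(text: str, query: str) -> np.ndarray:
    """Number of matching bases for each full alignment of query against text."""
    n = len(text) + len(query) - 1
    size = 1 << (n - 1).bit_length()                  # FFT-friendly length
    total = np.zeros(size)
    for base in "ACGT":
        t = np.array([c == base for c in text], dtype=float)
        q = np.array([c == base for c in query], dtype=float)[::-1]  # reverse => correlation
        total += np.fft.irfft(np.fft.rfft(t, size) * np.fft.rfft(q, size), size)
    return np.rint(total[len(query) - 1:len(text)])   # one count per valid offset

print(match_counts("ACGTACGTTT", "ACGT"))             # peaks of 4 at offsets 0 and 4
```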

  4. NASA SensorWeb and OGC Standards for Disaster Management

    NASA Technical Reports Server (NTRS)

    Mandl, Dan

    2010-01-01

    I. Goal: Enable users to cost-effectively find and create customized data products to help manage disasters: a) On-demand; b) Low cost and non-specialized tools such as Google Earth and browsers; c) Access via open networks but with sufficient security. II. Use standards to interface various sensors and the resultant data: a) Wrap sensors in Open Geospatial Consortium (OGC) standards; b) Wrap data processing algorithms and servers with OGC standards; c) Use standardized workflows to orchestrate and script the creation of these data products. III. Target the Web 2.0 mass market: a) Make it simple and easy to use; b) Leverage new capabilities and tools that are emerging; c) Improve speed and responsiveness.
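
    Point II is what buys client simplicity: any sensor wrapped as an OGC service answers the same key-value-pair requests. The sketch below issues a standard GetCapabilities request to a hypothetical Sensor Observation Service endpoint; only the endpoint URL is invented.

```python
# Sketch of what "wrapping sensors in OGC standards" buys the client: any
# compliant Sensor Observation Service answers the same GetCapabilities
# request. The endpoint URL is hypothetical.
import requests

SOS_ENDPOINT = "https://sensorweb.example.org/sos"   # hypothetical service

params = {
    "service": "SOS",
    "request": "GetCapabilities",
    "AcceptVersions": "2.0.0",
}
resp = requests.get(SOS_ENDPOINT, params=params, timeout=30)
resp.raise_for_status()
print(resp.headers.get("Content-Type"))   # typically an XML capabilities document
print(resp.text[:300])
```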

  5. Automatic generation of reports at the TELECOM SCC

    NASA Astrophysics Data System (ADS)

    Beltan, Thierry; Jalbaud, Myriam; Fronton, Jean Francois

    In-orbit satellite follow-up produces a number of reports on a regular basis (daily, weekly, quarterly, annually). Most of these documents reuse the information of former issues, with the increments of the last period of time. They are made up of text, tables, graphs or pictures. The system presented here is the SGMT (Systeme de Gestion de la Memoire Technique), which means Technical Memory Management System. It provides the system operators with tools to generate the greatest part of these reports as automatically as possible. It gives easy access to the reports, and the large amount of available memory enables the user to consult data on the complete lifetime of a satellite family.

  6. e-Learning system ERM for medical radiation physics education.

    PubMed

    Stoeva, Magdalena; Cvetkov, Asen

    2005-09-01

    The objective of this paper is to present the Education for Radiation in Medicine (ERM) e-Learning System. The system was developed, tested and piloted in the Inter-University Medical Physics Centre, Plovdiv, Bulgaria. It was based on the results of EU Project TEMPUS S-JEP 09826. The ERM e-Learning System is an integrated on-line system for remote education covering aspects of Medical Radiation Physics education (M.Sc. level). It provides a user-friendly interface and optimised functionality with three different access levels: trainee, professor and administrator. The minimum server requirements and the standard client-side working environment make the system a good, cost-effective and easy-to-support solution for remote education.

  7. Interoperable Access to Near Real Time Ocean Observations with the Observing System Monitoring Center

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Hankin, S.; Mendelssohn, R.; Simons, R.; Smith, B.; Kern, K. J.

    2013-12-01

    The Observing System Monitoring Center (OSMC), a project funded by the National Oceanic and Atmospheric Administration's Climate Observations Division (COD), exists to join the discrete 'networks' of in situ ocean observing platforms -- ships, surface floats, profiling floats, tide gauges, etc. -- into a single, integrated system. The OSMC is addressing this goal through capabilities in three areas focusing on the needs of specific user groups: 1) it provides real-time monitoring of the integrated observing system assets to assist management in optimizing the cost-effectiveness of the system for the assessment of climate variables; 2) it makes the stream of real-time data coming from the observing system available to scientific end users in an easy-to-use form; and 3) in the future, it will unify the delayed-mode data from platform-focused data assembly centers into a standards-based distributed system that is readily accessible to interested users from the science and education communities. In this presentation, we will be focusing on the efforts of the OSMC to provide interoperable access to the near real time data stream that is available via the Global Telecommunications System (GTS). This is a very rich data source, and includes data from nearly all of the oceanographic platforms that are actively observing. We will discuss how the data are being served out using a number of widely used 'web services' (including OPeNDAP and SOS) and downloadable file formats (KML, csv, xls, netCDF), so that they can be accessed in web browsers and popular desktop analysis tools. We will also be discussing our use of the Environmental Research Division's Data Access Program (ERDDAP), available from NOAA/NMFS, which has allowed us to achieve our goals of serving the near real time data. From an interoperability perspective, it's important to note that access to this stream of data is not just for humans, but also for machine-to-machine requests. We'll also delve into how we configured access to the near real time ocean observations in accordance with the Climate and Forecast (CF) metadata conventions describing the various 'feature types' associated with particular in situ observation types, or discrete sampling geometries (DSG). Wrapping up, we'll discuss some of the ways this data source is already being used.
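
    As an example of the ERDDAP-style access mentioned above, a tabledap dataset can be pulled straight into pandas as CSV. In the sketch below the server, dataset ID, variable names, and time constraints are placeholders rather than a real OSMC dataset; the skipped second row reflects ERDDAP's usual units line.

```python
# Sketch of ERDDAP tabledap access: request a CSV subset and load it with
# pandas. The server, dataset ID, variables, and constraints are placeholders.
import pandas as pd

base = "https://erddap.example.gov/erddap/tabledap/osmc_realtime"   # hypothetical
query = (
    ".csv?platform_code,time,latitude,longitude,sst"
    "&time>=2013-08-01&time<2013-08-02"
)
df = pd.read_csv(base + query, skiprows=[1])   # second row of ERDDAP CSV holds units
print(df.head())
```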

  8. Measuring Website Quality: Asymmetric Effect of User Satisfaction

    ERIC Educational Resources Information Center

    Kincl, Tomas; Strach, Pavel

    2012-01-01

    Website quality measurement tools have been largely static and have struggled to determine relevant attributes of user satisfaction. This study compares and contrasts attributes of user satisfaction based on usability guidelines, seeking to identify practical, easy-to-administer measurement tools. The website users assessed business school homepages…

  9. OntoCAT -- simple ontology search and integration in Java, R and REST/JavaScript

    PubMed Central

    2011-01-01

    Background: Ontologies have become an essential asset in the bioinformatics toolbox and a number of ontology access resources are now available, for example, the EBI Ontology Lookup Service (OLS) and the NCBO BioPortal. However, these resources differ substantially in mode, ease of access, and ontology content. This makes it relatively difficult to access each ontology source separately, map their contents to research data, and much of this effort is being replicated across different research groups. Results: OntoCAT provides a seamless programming interface to query heterogeneous ontology resources including OLS and BioPortal, as well as user-specified local OWL and OBO files. Each resource is wrapped behind easy to learn Java, Bioconductor/R and REST web service commands enabling reuse and integration of ontology software efforts despite variation in technologies. It is also available as a stand-alone MOLGENIS database and a Google App Engine application. Conclusions: OntoCAT provides a robust, configurable solution for accessing ontology terms specified locally and from remote services, is available as a stand-alone tool and has been tested thoroughly in the ArrayExpress, MOLGENIS, EFO and Gen2Phen phenotype use cases. Availability: http://www.ontocat.org PMID:21619703

  10. OntoCAT--simple ontology search and integration in Java, R and REST/JavaScript.

    PubMed

    Adamusiak, Tomasz; Burdett, Tony; Kurbatova, Natalja; Joeri van der Velde, K; Abeygunawardena, Niran; Antonakaki, Despoina; Kapushesky, Misha; Parkinson, Helen; Swertz, Morris A

    2011-05-29

    Ontologies have become an essential asset in the bioinformatics toolbox and a number of ontology access resources are now available, for example, the EBI Ontology Lookup Service (OLS) and the NCBO BioPortal. However, these resources differ substantially in mode, ease of access, and ontology content. This makes it relatively difficult to access each ontology source separately, map their contents to research data, and much of this effort is being replicated across different research groups. OntoCAT provides a seamless programming interface to query heterogeneous ontology resources including OLS and BioPortal, as well as user-specified local OWL and OBO files. Each resource is wrapped behind easy to learn Java, Bioconductor/R and REST web service commands enabling reuse and integration of ontology software efforts despite variation in technologies. It is also available as a stand-alone MOLGENIS database and a Google App Engine application. OntoCAT provides a robust, configurable solution for accessing ontology terms specified locally and from remote services, is available as a stand-alone tool and has been tested thoroughly in the ArrayExpress, MOLGENIS, EFO and Gen2Phen phenotype use cases. http://www.ontocat.org.
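
    For a sense of the kind of REST lookup OntoCAT abstracts away, the sketch below queries an OLS-style search endpoint directly with plain HTTP (this is not OntoCAT's Java/R API). The endpoint path, parameters, and response fields are assumptions for illustration.

```python
# Illustration of the kind of REST lookup an ontology client abstracts away.
# The EBI OLS search endpoint, parameters, and response fields are assumptions.
import requests

OLS_SEARCH = "https://www.ebi.ac.uk/ols/api/search"   # assumed endpoint

resp = requests.get(OLS_SEARCH, params={"q": "asthma", "rows": 5}, timeout=30)
resp.raise_for_status()
for doc in resp.json().get("response", {}).get("docs", []):
    print(doc.get("obo_id"), doc.get("label"), doc.get("ontology_name"))
```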

  11. Usability and feasibility of health IT interventions to enhance Self-Care for Lymphedema Symptom Management in breast cancer survivors.

    PubMed

    Fu, Mei R; Axelrod, Deborah; Guth, Amber A; Wang, Yao; Scagliola, Joan; Hiotis, Karen; Rampertaap, Kavita; El-Shammaa, Nardin

    2016-09-01

    The-Optimal-Lymph-Flow health IT system (TOLF) is a patient-centered, web-and-mobile-based educational and behavioral health IT system focusing on safe, innovative, and pragmatic self-care strategies for lymphedema symptom management. The purpose of this study was to evaluate the usability, feasibility, and acceptability of TOLF among its end users, breast cancer survivors. Two types of usability testing were completed with 30 breast cancer survivors: heuristic evaluation and end-user testing. Each participant was asked to think aloud while completing a set of specified tasks designed to explicate and freely explore the system features. A heuristic evaluation checklist, the Perceived Ease of Use and Usefulness Questionnaire, and the Post Study System Usability Questionnaire were used to evaluate usability of the system. Open-ended questions were used to gather qualitative data. Quantitative data were analyzed using descriptive statistics and qualitative data were summarized thematically. Breast cancer survivors were very satisfied with the system: 90% (n = 27) rated the system as having no usability problems; 10% (n = 3) noted minor cosmetic problems: spelling errors or text font size. The majority of participants, 96.6% (n = 29), strongly agreed that the system was easy to use and effective in helping to learn about lymphedema, symptoms, and self-care strategies. Themes from the qualitative data included empowerment, high-quality information, loving avatar simulation videos, easy accessibility, and user-friendliness. This usability study provided evidence of breast cancer survivors' acceptance and highly positive evaluation of TOLF's usability, as well as the feasibility of using a technologically driven delivery model to enhance self-care strategies for lymphedema symptom management.

  12. Design and Implementation of the CEBAF Element Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theodore Larrieu, Christopher Slominski, Michele Joyce

    2011-10-01

    With inauguration of the CEBAF Element Database (CED) in Fall 2010, Jefferson Lab computer scientists have taken a first step toward the eventual goal of a model-driven accelerator. Once fully populated, the database will be the primary repository of information used for everything from generating lattice decks to booting front-end computers to building controls screens. A particular requirement influencing the CED design is that it must provide consistent access to not only present, but also future, and eventually past, configurations of the CEBAF accelerator. To accomplish this, an introspective database schema was designed that allows new elements, element types, and element properties to be defined on-the-fly without changing table structure. When used in conjunction with the Oracle Workspace Manager, it allows users to seamlessly query data from any time in the database history with the exact same tools as they use for querying the present configuration. Users can also check out workspaces and use them as staging areas for upcoming machine configurations. All access to the CED is through a well-documented API that is translated automatically from the original C++ into native libraries for script languages such as Perl, PHP, and Tcl, making access to the CED easy and ubiquitous. Notice: Authored by Jefferson Science Associates, LLC under U.S. DOE Contract No. DE-AC05-06OR23177. The U.S. Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce this manuscript for U.S. Government purposes.
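
    The abstract outlines the introspective schema without giving its table layout. The sketch below is a minimal entity-attribute-value illustration of the same idea, assuming invented table and column names; it is not the actual CED schema or its C++/Perl/PHP/Tcl API.

    ```python
    # Toy entity-attribute-value (EAV) schema illustrating how new element
    # types and properties can be added without altering table structure.
    # Table and column names are invented; this is not the actual CED schema.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE element_type (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE property_type(id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE element      (id INTEGER PRIMARY KEY, name TEXT UNIQUE,
                               type_id INTEGER REFERENCES element_type(id));
    CREATE TABLE element_prop (element_id  INTEGER REFERENCES element(id),
                               property_id INTEGER REFERENCES property_type(id),
                               value TEXT);
    """)

    # Defining a new element type and property needs only INSERTs, no ALTER TABLE.
    db.execute("INSERT INTO element_type(name) VALUES ('Quadrupole')")
    db.execute("INSERT INTO property_type(name) VALUES ('Length')")
    db.execute("INSERT INTO element(name, type_id) VALUES ('MQA1S01', 1)")
    db.execute("INSERT INTO element_prop VALUES (1, 1, '0.3')")

    for row in db.execute("""
        SELECT e.name, t.name, p.name, ep.value
        FROM element e JOIN element_type t ON e.type_id = t.id
        JOIN element_prop ep ON ep.element_id = e.id
        JOIN property_type p ON ep.property_id = p.id"""):
        print(row)
    ```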

  13. LudusScope: Accessible Interactive Smartphone Microscopy for Life-Science Education.

    PubMed

    Kim, Honesty; Gerber, Lukas Cyrill; Chiu, Daniel; Lee, Seung Ah; Cira, Nate J; Xia, Sherwin Yuyang; Riedel-Kruse, Ingmar H

    2016-01-01

    For centuries, observational microscopy has greatly facilitated biology education, but we still cannot easily and playfully interact with the microscopic world we see. We therefore developed the LudusScope, an accessible, interactive do-it-yourself smartphone microscopy platform that promotes exploratory stimulation and observation of microscopic organisms, in a design that combines the educational modalities of build, play, and inquire. The LudusScope's touchscreen and joystick allow the selection and stimulation of phototactic microorganisms such as Euglena gracilis with light. Organismal behavior is tracked and displayed in real time, enabling open and structured game play as well as scientific inquiry via quantitative experimentation. Furthermore, we used the Scratch programming language to incorporate biophysical modeling. This platform is designed as an accessible, low-cost educational kit for easy construction and expansion. User testing with both teachers and students demonstrates the educational potential of the LudusScope, and we anticipate additional synergy with the maker movement. Transforming observational microscopy into an interactive experience will make microbiology more tangible to society, and effectively support the interdisciplinary learning required by the Next Generation Science Standards.

  14. LudusScope: Accessible Interactive Smartphone Microscopy for Life-Science Education

    PubMed Central

    Kim, Honesty; Gerber, Lukas Cyrill; Chiu, Daniel; Lee, Seung Ah; Cira, Nate J.; Xia, Sherwin Yuyang; Riedel-Kruse, Ingmar H.

    2016-01-01

    For centuries, observational microscopy has greatly facilitated biology education, but we still cannot easily and playfully interact with the microscopic world we see. We therefore developed the LudusScope, an accessible, interactive do-it-yourself smartphone microscopy platform that promotes exploratory stimulation and observation of microscopic organisms, in a design that combines the educational modalities of build, play, and inquire. The LudusScope’s touchscreen and joystick allow the selection and stimulation of phototactic microorganisms such as Euglena gracilis with light. Organismal behavior is tracked and displayed in real time, enabling open and structured game play as well as scientific inquiry via quantitative experimentation. Furthermore, we used the Scratch programming language to incorporate biophysical modeling. This platform is designed as an accessible, low-cost educational kit for easy construction and expansion. User testing with both teachers and students demonstrates the educational potential of the LudusScope, and we anticipate additional synergy with the maker movement. Transforming observational microscopy into an interactive experience will make microbiology more tangible to society, and effectively support the interdisciplinary learning required by the Next Generation Science Standards. PMID:27706189

  15. Innovative Technology Transfer Partnerships

    NASA Technical Reports Server (NTRS)

    Kohler, Jeff

    2004-01-01

    The National Aeronautics and Space Administration (NASA) seeks to license its Advanced Tire and Strut Pressure Monitor (TSPM) technology. The TSPM is a handheld system to accurately measure tire and strut pressure and temperature over a wide temperature range (20 to 120 °F), as well as improve personnel safety. Sensor accuracy, electronics design, and a simple user interface allow operators quick, easy access to required measurements. The handheld electronics, powered by 12-VAC or by 9-VDC batteries, provide the user with an easy-to-read visual display of pressure/temperature or the streaming of pressure/temperature data via an RS-232 interface. When connected to a laptop computer, this new measurement system can provide users with automated data recording and trending, eliminating the chance for data hand-recording errors. In addition, calibration software allows for calibration data to be automatically utilized for the generation of new data conversion equations, simplifying the calibration processes that are so critical to reliable measurements. The design places a high-accuracy pressure sensor (also used as a temperature sensor) as close to the tire or strut measurement location as possible, allowing the user to make accurate measurements rapidly, minimizing the amount of high-pressure volumes, and allowing reasonable distance between the tire or strut and the operator. The pressure sensor attaches directly to the pressure supply/relief valve on the tire and/or strut, with necessary electronics contained in the handheld enclosure. A software algorithm ensures high accuracy of the device over the wide temperature range. Using the pressure sensor as a temperature sensor permits measurement of the actual temperature of the pressurized gas. This device can be adapted to create a portable calibration standard that does not require thermal conditioning. This allows accurate pressure measurements without disturbing the gas temperature. In-place calibration can save considerable time and money and is suitable in many process applications throughout industry.
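
    The record notes that calibration data are turned into new data conversion equations automatically, but does not give the method. Purely as a hypothetical illustration, the sketch below fits a polynomial conversion curve to made-up sensor readings with NumPy; the data points and the quadratic model are assumptions, not the TSPM calibration algorithm.

    ```python
    # Hypothetical calibration fit: raw sensor counts -> pressure (psi).
    # The data points and the quadratic model are invented for illustration;
    # this is not the actual TSPM calibration procedure.
    import numpy as np

    raw_counts   = np.array([1020, 2045, 3080, 4110, 5150])   # sensor output
    ref_pressure = np.array([ 0.0, 25.0, 50.0, 75.0, 100.0])  # reference gauge, psi

    coeffs = np.polyfit(raw_counts, ref_pressure, deg=2)  # new conversion equation
    convert = np.poly1d(coeffs)

    print("pressure at 3500 counts: %.2f psi" % convert(3500))
    ```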

  16. The Proteins API: accessing key integrated protein and genome information

    PubMed Central

    Antunes, Ricardo; Alpi, Emanuele; Gonzales, Leonardo; Liu, Wudong; Luo, Jie; Qi, Guoying; Turner, Edd

    2017-01-01

    The Proteins API provides searching and programmatic access to protein and associated genomics data such as curated protein sequence positional annotations from UniProtKB, as well as mapped variation and proteomics data from large scale data sources (LSS). Using the coordinates service, researchers are able to retrieve the genomic sequence coordinates for proteins in UniProtKB. Thus, the LSS genomics and proteomics data for UniProt proteins are programmatically available only through this service. A Swagger UI has been implemented to provide documentation and an interface that lets users with little or no programming experience ‘talk’ to the services, quickly and easily formulate queries, and obtain dynamically generated source code for popular programming languages, such as Java, Perl, Python and Ruby. Search results are returned as standard JSON, XML or GFF data objects. The Proteins API is a scalable, reliable, fast, easy-to-use RESTful service that provides a broad protein information resource, letting users ask questions based upon their field of expertise and gain an integrated overview of the protein annotations available to aid their understanding of proteins in biological processes. The Proteins API is available at (http://www.ebi.ac.uk/proteins/api/doc). PMID:28383659

  17. The Proteins API: accessing key integrated protein and genome information.

    PubMed

    Nightingale, Andrew; Antunes, Ricardo; Alpi, Emanuele; Bursteinas, Borisas; Gonzales, Leonardo; Liu, Wudong; Luo, Jie; Qi, Guoying; Turner, Edd; Martin, Maria

    2017-07-03

    The Proteins API provides searching and programmatic access to protein and associated genomics data such as curated protein sequence positional annotations from UniProtKB, as well as mapped variation and proteomics data from large scale data sources (LSS). Using the coordinates service, researchers are able to retrieve the genomic sequence coordinates for proteins in UniProtKB. Thus, the LSS genomics and proteomics data for UniProt proteins are programmatically available only through this service. A Swagger UI has been implemented to provide documentation and an interface that lets users with little or no programming experience 'talk' to the services, quickly and easily formulate queries, and obtain dynamically generated source code for popular programming languages, such as Java, Perl, Python and Ruby. Search results are returned as standard JSON, XML or GFF data objects. The Proteins API is a scalable, reliable, fast, easy-to-use RESTful service that provides a broad protein information resource, letting users ask questions based upon their field of expertise and gain an integrated overview of the protein annotations available to aid their understanding of proteins in biological processes. The Proteins API is available at (http://www.ebi.ac.uk/proteins/api/doc). © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
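
    A minimal sketch of calling the coordinates service from Python with the requests library is shown below; the /coordinates path and the JSON Accept header are assumptions to be checked against the Swagger documentation at the URL above, and P04637 is simply an example UniProtKB accession.

    ```python
    # Minimal sketch of querying the Proteins API coordinates service for a
    # UniProtKB accession. The path and response layout are assumptions;
    # consult the Swagger documentation for the authoritative interface.
    import json
    import requests

    BASE = "https://www.ebi.ac.uk/proteins/api"

    def genomic_coordinates(accession):
        """Fetch genome coordinate records for a UniProtKB accession (assumed path)."""
        resp = requests.get(f"{BASE}/coordinates/{accession}",
                            headers={"Accept": "application/json"},
                            timeout=30)
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        data = genomic_coordinates("P04637")      # human p53 as an example accession
        print(json.dumps(data, indent=2)[:400])   # inspect the first part of the reply
    ```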

  18. xiSPEC: web-based visualization, analysis and sharing of proteomics data.

    PubMed

    Kolbowski, Lars; Combe, Colin; Rappsilber, Juri

    2018-05-08

    We present xiSPEC, a standard compliant, next-generation web-based spectrum viewer for visualizing, analyzing and sharing mass spectrometry data. Peptide-spectrum matches from standard proteomics and cross-linking experiments are supported. xiSPEC is to date the only browser-based tool supporting the standardized file formats mzML and mzIdentML defined by the proteomics standards initiative. Users can either upload data directly or select files from the PRIDE data repository as input. xiSPEC allows users to save and share their datasets publicly or password protected for providing access to collaborators or readers and reviewers of manuscripts. The identification table features advanced interaction controls and spectra are presented in three interconnected views: (i) annotated mass spectrum, (ii) peptide sequence fragmentation key and (iii) quality control error plots of matched fragments. Highlighting or selecting data points in any view is represented in all other views. Views are interactive scalable vector graphic elements, which can be exported, e.g. for use in publication. xiSPEC allows for re-annotation of spectra for easy hypothesis testing by modifying input data. xiSPEC is freely accessible at http://spectrumviewer.org and the source code is openly available on https://github.com/Rappsilber-Laboratory/xiSPEC.
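
    xiSPEC itself runs in the browser, so no code is needed to use it; as a side illustration of the standardized mzIdentML input it accepts, the sketch below lists peptide-spectrum matches from an mzIdentML file using only the Python standard library. The element and attribute names assume the mzIdentML 1.1 schema, and the file name is a placeholder.

    ```python
    # Not part of xiSPEC: a small illustration of reading the mzIdentML format
    # that xiSPEC consumes, using only the Python standard library.
    # Element and attribute names assume the mzIdentML 1.1 schema.
    import xml.etree.ElementTree as ET

    def list_psms(path):
        root = ET.parse(path).getroot()
        # '{*}' matches any XML namespace (Python 3.8+).
        for result in root.findall(".//{*}SpectrumIdentificationResult"):
            spectrum = result.get("spectrumID")
            for item in result.findall("{*}SpectrumIdentificationItem"):
                yield spectrum, item.get("peptide_ref"), item.get("rank")

    if __name__ == "__main__":
        for spectrum, peptide, rank in list_psms("example.mzid"):  # placeholder file
            print(spectrum, peptide, rank)
    ```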

  19. Going to where the users are! Making the collaborative resource management and science workspace mobile

    NASA Astrophysics Data System (ADS)

    Osti, D.; Osti, A.

    2013-12-01

    People are very busy today and getting stakeholders the information they need is an important part of our jobs. The BDL application is the mobile extension of the California collaborative resource management portal www.baydeltalive.com. BDL has been visited by more than 250,000 unique visitors this past year from various areas of water use and management including state and federal agencies, agriculture, scientists, policy makers, water consumers, voters, operations management and more. The audience is a qualified user group of more than 15,000 individuals participating in California hydrological ecosystem science, water management and policy. This is an important effort aimed at improving how scientists and policy makers work together to understand this complicated and divisive system and how they are becoming better managers of that system. The BayDeltaLive mobile application gives California watershed management stakeholders and the water user community unprecedented access to real-time natural resource management information. The application provides users with the following: 1. Access to real-time environmental conditions from the more than 600 California Data Exchange sensors, including hydrodynamic, water quality and meteorological data. Save important stations as favorites for easy access later. 2. Daily Delta operations data, including estimated hydrology, daily exports, status of infrastructure operations, reservoir storage, salvage data, major stations, drinking water quality reports, weather forecasts and more. 3. Photos/Videos/Documents: Browse and share from the more than 1000 current documents in the BDL library. Relevant images, videos, science journals, presentations and articles. 4. Science: Access the latest science articles, news, projects and journals. 5. Data Visualizations: View recently published real-time data interpolations of Delta conditions, from 30-day turbidity models to daily forecasts. This service is published as conditions produce scientifically relevant visuals, including winter conditions, first flush archives and fish migration seasons. 6. Maps: Access the entire Delta Atlas from anywhere! The atlas includes Delta levees, soils, islands and waterways, diversions, infrastructure, urban areas, land use, salinity, tidal flows, managed lands, protected lands and more. 7. Projects: Discover the latest summaries of projects currently underway in the Delta. Project categories include restoration, operations and infrastructure, to name a few. Share your discovery for more in-depth access on the BayDeltaLive.com website. 8. News: Current Delta science topics. App Keywords: California Delta, Water Management, Natural Resource Management, Real Time Data, Water Operations, Water Supply, Water Quality, Collaboration

  20. User assumptions about information retrieval systems: Ethical concerns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Froehlich, T.J.

    Information professionals, whether designers, intermediaries, database producers or vendors, bear some responsibility for the information that they make available to users of information systems. The users of such systems may tend to make many assumptions about the information that a system provides, such as believing: that the data are comprehensive, current and accurate; that the information resources or databases have the same degree of quality and consistency of indexing; that the abstracts, if they exist, correctly and adequately reflect the content of the article; that there is consistency in forms of author names or journal titles or indexing within and across databases; that there is standardization in and across databases; that once errors are detected, they are corrected; that appropriate choices of databases or information resources are a relatively easy matter, etc. The truth is that few of these assumptions are valid in commercial or corporate or organizational databases. However, given these beliefs and assumptions by many users, often promoted by information providers, information professionals, where possible, should intervene to warn users about the limitations and constraints of the databases they are using. With the growth of the Internet and end-user products (e.g., CD-ROMs), such interventions have significantly declined. In such cases, information should be provided on start-up or through interface screens, indicating to users the constraints and orientation of the system they are using. The principle of "caveat emptor" is naive and socially irresponsible: information professionals or systems have an obligation to provide some framework or context for the information that users are accessing.

  1. DockoMatic 2.0: High Throughput Inverse Virtual Screening and Homology Modeling

    PubMed Central

    Bullock, Casey; Cornia, Nic; Jacob, Reed; Remm, Andrew; Peavey, Thomas; Weekes, Ken; Mallory, Chris; Oxford, Julia T.; McDougal, Owen M.; Andersen, Timothy L.

    2013-01-01

    DockoMatic is a free and open source application that unifies a suite of software programs within a user-friendly Graphical User Interface (GUI) to facilitate molecular docking experiments. Here we describe the release of DockoMatic 2.0; significant software advances include the ability to: (1) conduct high throughput Inverse Virtual Screening (IVS); (2) construct 3D homology models; and (3) customize the user interface. Users can now efficiently set up, start, and manage IVS experiments through the DockoMatic GUI by specifying a receptor(s), ligand(s), grid parameter file(s), and docking engine (either AutoDock or AutoDock Vina). DockoMatic automatically generates the needed experiment input files and output directories, and allows the user to manage and monitor job progress. Upon job completion, a summary of results is generated by DockoMatic to facilitate interpretation by the user. DockoMatic functionality has also been expanded to facilitate the construction of 3D protein homology models using the Timely Integrated Modeler (TIM) wizard. The TIM wizard provides an interface that accesses the basic local alignment search tool (BLAST) and MODELLER programs, and guides the user through the necessary steps to easily and efficiently create 3D homology models for biomacromolecular structures. The DockoMatic GUI can be customized by the user, and the software design makes it relatively easy to integrate additional docking engines, scoring functions, or third party programs. DockoMatic is a free comprehensive molecular docking software program for all levels of scientists in both research and education. PMID:23808933
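
    DockoMatic drives the docking engines itself through its GUI; purely to illustrate what an inverse virtual screening loop amounts to underneath such a front end, the sketch below docks one ligand against several receptors with the AutoDock Vina command line. The file names and box parameters are placeholders, and flag spellings should be checked against the installed Vina release; this is not DockoMatic's own code.

    ```python
    # Illustration of an inverse virtual screening loop: one ligand docked
    # against several receptors with the AutoDock Vina command line.
    # File names and box parameters are placeholders; this is not DockoMatic.
    import subprocess

    ligand = "ligand.pdbqt"
    receptors = ["receptor_A.pdbqt", "receptor_B.pdbqt", "receptor_C.pdbqt"]

    for rec in receptors:
        out = rec.replace(".pdbqt", "_docked.pdbqt")
        subprocess.run([
            "vina",
            "--receptor", rec,
            "--ligand", ligand,
            "--center_x", "0", "--center_y", "0", "--center_z", "0",
            "--size_x", "20", "--size_y", "20", "--size_z", "20",
            "--out", out,
        ], check=True)
        print("finished", rec, "->", out)
    ```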

  2. Online Airtightness Calculator for the US, Canada and China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrestha, Som S; Hun, Diana E; Desjarlais, Andre Omer

    The contribution of air leakage to heating and cooling loads has been increasing as the thermal resistance of building envelopes continues to improve. Easy-to-access data are needed to convince building owners and contractors that enhancing the airtightness of buildings is the next logical step to achieve a high-performance building envelope. To this end, Oak Ridge National Laboratory, the National Institute of Standards and Technology, the Air Barrier Association of America, and the US-China Clean Energy Research Center for Building Energy Efficiency Consortium partnered to develop an online calculator that estimates the potential energy savings in major US, Canadian, and Chinese cities due to improvements in airtightness. This tool will have a user-friendly graphical interface that uses a database of CONTAM-EnergyPlus pre-run simulation results, and will be available to the public at no cost. Baseline leakage rates are either user-specified or the user selects them from the supplied typical leakage rates. Users will enter the expected airtightness after the proper installation of an air barrier system. Energy costs are estimated based on the building location and inputs from users. This paper provides an overview of the methodology that is followed in this calculator, as well as results from an example. The successful deployment of this calculator could influence construction practices so that greenhouse gas emissions from the US, Canada, and China are significantly curtailed.
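
    The calculator itself draws on pre-run CONTAM-EnergyPlus simulations. Purely as a back-of-the-envelope contrast, the sketch below estimates annual infiltration heating energy from an air-change rate, building volume and heating degree days using the approximate 0.018 Btu/(ft³·°F) volumetric heat capacity of air; all input numbers are illustrative assumptions, not the calculator's methodology.

    ```python
    # Back-of-the-envelope infiltration heating estimate -- NOT the calculator's
    # CONTAM-EnergyPlus methodology. All inputs below are illustrative assumptions.
    AIR_HEAT_CAPACITY = 0.018      # Btu per cubic foot per degree F (approximate)

    def annual_infiltration_btu(ach, volume_ft3, hdd_f):
        """ach: air changes per hour; hdd_f: heating degree days (deg F * days)."""
        return AIR_HEAT_CAPACITY * ach * volume_ft3 * 24 * hdd_f

    baseline  = annual_infiltration_btu(ach=0.6, volume_ft3=20000, hdd_f=6000)
    tightened = annual_infiltration_btu(ach=0.3, volume_ft3=20000, hdd_f=6000)
    print("estimated savings: %.0f therms/yr" % ((baseline - tightened) / 100000))
    ```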

  3. Parent use of touchscreen computer kiosks for child health promotion in community settings.

    PubMed

    Thompson, Darcy A; Lozano, Paula; Christakis, Dimitri A

    2007-03-01

    The goals were to evaluate the use of touchscreen computer kiosks, containing only child health-promoting information, in urban, low-income, community settings and to characterize the users of these kiosks. Three user-driven touchscreen computer kiosks were placed in low-income urban locations in Seattle, Washington, from March 2005 to October 2005. The locations included a public library, a Department of Motor Vehicles office, and a McDonald's restaurant. Users selected age-appropriate modules with prevention information and screening tools. Users entered the age of the child and were presented with age-appropriate modules. On exiting, users were asked to rate their experience and to provide basic demographic data. In total, there were 1846 kiosk sessions. Almost one half occurred at McDonald's. Seventy-eight percent of users identified themselves as first-time users. Users sought information for children of all ages. Sixty-one percent of first-time users explored 1 module. First-time users were most interested in television/media use (16%), smoke exposure (14%), attention-deficit/hyperactivity disorder screening (12%), and asthma assessment (11%). At-risk children were identified in 52% of sessions. Eighty-seven percent of first-time users who completed the asthma assessment had children whose asthma was uncontrolled. Twenty-eight percent of users responded to at least one question on the exit survey. Of those, 48% had less than a high school education, and 26% had never used the Internet. Approximately one half found the kiosk easy to use (57%) and the information easy to understand (55%); 66% said there was at least some new information. Fifty-five percent planned to try some of the things they had learned, and 49% intended to talk to their child's doctor about what they had learned. User-driven computer kiosks were used in community settings to obtain child health information. Users found the kiosks easy to use. Additional study on improving use and understanding its impact is needed.

  4. The CatchMod toolbox: easy and guided access to ICT tools for Water Framework Directive implementation.

    PubMed

    van Griensven, A; Vanrolleghem, P A

    2006-01-01

    Web-based toolboxes are handy tools to inform experienced users of existing software in their disciplines. However, for the implementation of the Water Framework Directive, a much more diverse public (water managers, consultancy firms, scientists, etc.) will ask for a very wide diversity of Information and Communication Technology (ICT) tools. It is obvious that the users of a web-based ICT-toolbox providing all this will not be experts in all of the disciplines and that a toolbox for ICT tools for Water Framework Directive implementation should thus go beyond just making interesting web-links. To deal with this issue, expert knowledge is brought to the users through the incorporation of visitor-geared guidance (materials) in the Harmoni-CA toolbox. Small workshops of expert teams were organized to deliver documents explaining why the tools are important, when they are required and what activity they support/perform, as well as a categorization of the multitude of available tools. An integration of this information in the web-based toolbox helps the users to browse through a toolbox containing tools, reports, guidance documents and interesting links. The Harmoni-CA toolbox thus provides not only a virtual toolbox, but incorporates a virtual expert as well.

  5. In-home firearm access among US adolescents and the role of religious subculture: Results from a nationally representative study.

    PubMed

    Stroope, Samuel; Tom, Joshua C

    2017-09-01

    Religious participation is linked to numerous positive safety outcomes for adolescents. Scant attention, however, has been paid to associations between religious participation and safety risks among adolescents. Using data from Add Health (N = 18,449), a nationally representative school-based sample of US adolescents, this study examines the relationship between adolescents' religious affiliation and easy access to firearms at home. Regression analyses adjust for complex sampling design and compare easy firearm access at home among conservative Protestant adolescents to adolescent firearm access in other religious traditions. Conservative Protestant adolescents have a substantially greater likelihood of easy access to a gun at home compared to adolescents of all other major religious traditions in the United States. Recognizing differences in adolescent firearm access between subcultural groups can help public health interventions more effectively identify and address the needs of vulnerable populations. The paper's conclusion considers suggestions for effective policy and programmatic initiatives. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. EUFAR the unique portal for airborne research in Europe

    NASA Astrophysics Data System (ADS)

    Gérard, Elisabeth; Brown, Philip

    2016-04-01

    Created in 2000 and supported by the EU Framework Programmes since then, EUFAR was born out of the necessity to create a central network and access point for the airborne research community in Europe. With the aim of supporting researchers by granting them access to research infrastructures not accessible in their home countries, EUFAR also provides technical support and training in the field of airborne research for the environmental and geo-sciences. Today, EUFAR2 (2014-2018) coordinates and facilitates transnational access to 18 instrumented aircraft and 3 remote-sensing instruments through the 13 operators who are part of EUFAR's current 24-partner European consortium. In addition, the current project supports networking and research activities focused on providing an enabling environment for and promoting airborne research. The EUFAR2 activities cover three objectives, supported by the internet website www.eufar.net: (i - Institutional) improvement of the access to the research infrastructures and development of the future fleet according to the strategic advisory committee (SAC) recommendations; (ii - Innovation) improvement of the scientific knowledge and promotion of innovative instruments, processes and services for the emergence of new industrial technologies, with an identification of industrial needs by the SAC; (iii - Service) optimisation and harmonisation of the use of the research infrastructures through the development of the community of young researchers in airborne science, of the standards and protocols and of the airborne central database. With the launch of a brand new website (www.eufar.net) in mid-November 2015, EUFAR aims to improve user experience on the website, which serves as a source of information and a hub where users are able to collaborate, learn, share expertise and best practices, and apply for transnational access, and education and training funded opportunities within the network. With its newly designed eye-catching interface, the website offers easy navigation and user-friendly functionalities. New features also include a section on news and airborne research stories to keep users up-to-date on EUFAR's activities, a career section, photo galleries, and much more. By elaborating new solutions for the web portal, EUFAR continues to serve as an interactive and dynamic platform bringing together experts, early-stage researchers, operators, data users, industry and other stakeholders in the airborne research community. A main focus of the current project is the establishment of a sustainable legal structure for EUFAR. This is critical to ensuring the continuity of EUFAR and securing, at the least, partial financial independence from the European Commission, which has been funding the project since its start. After carefully examining different legal forms relevant for EUFAR, the arguments are strongly in favour of establishing an international non-profit association under Belgian law (AISBL). Together with the implementation of an Open Access scheme by means of resource-sharing to support the mobility of personnel across countries envisaged in 2016, such a sustainable structure would contribute substantially toward broadening the user base of existing airborne research facilities in Europe and mobilising additional resources for this end. In essence, this would cement EUFAR's position as the key portal for airborne research in Europe.

  7. Easy GROMACS: A Graphical User Interface for GROMACS Molecular Dynamics Simulation Package

    NASA Astrophysics Data System (ADS)

    Dizkirici, Ayten; Tekpinar, Mustafa

    2015-03-01

    GROMACS is a widely used molecular dynamics simulation package. Since it is a command-driven program, it can be difficult to use for molecular biologists, biochemists, new graduate students and undergraduate researchers who are interested in molecular dynamics simulations. To alleviate the problem for those researchers, we wrote a graphical user interface that simplifies protein preparation for a classical molecular dynamics simulation. Our program can work with various GROMACS versions and it can perform essential analyses of GROMACS trajectories as well as protein preparation. We named our open source program `Easy GROMACS'. Easy GROMACS can give researchers more time for scientific research instead of dealing with technical intricacies.
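
    The abstract does not list the commands the GUI issues. As a hedged sketch of the preparation steps such a front end typically automates, the snippet below chains standard gmx tools through Python's subprocess module; exact flags vary between GROMACS versions, so the command lines are assumptions rather than Easy GROMACS' actual implementation.

    ```python
    # Sketch of the preparation steps a GROMACS front end might automate.
    # Command names and flags are typical for recent 'gmx' releases but vary
    # between versions; this is not Easy GROMACS' actual implementation.
    import subprocess

    def run(cmd):
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    def prepare(pdb="protein.pdb"):
        run(["gmx", "pdb2gmx", "-f", pdb, "-o", "processed.gro",
             "-water", "spce", "-ff", "oplsaa"])
        run(["gmx", "editconf", "-f", "processed.gro", "-o", "boxed.gro",
             "-c", "-d", "1.0", "-bt", "cubic"])
        run(["gmx", "solvate", "-cp", "boxed.gro", "-cs", "spc216.gro",
             "-o", "solvated.gro", "-p", "topol.top"])
        run(["gmx", "grompp", "-f", "minim.mdp", "-c", "solvated.gro",
             "-p", "topol.top", "-o", "em.tpr"])
        run(["gmx", "mdrun", "-deffnm", "em"])

    if __name__ == "__main__":
        prepare()
    ```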

  8. Web3DMol: interactive protein structure visualization based on WebGL.

    PubMed

    Shi, Maoxiang; Gao, Juntao; Zhang, Michael Q

    2017-07-03

    A growing number of web-based databases and tools for protein research are being developed. There is now a widespread need for visualization tools to present the three-dimensional (3D) structure of proteins in web browsers. Here, we introduce our 3D modeling program, Web3DMol, a web application focusing on protein structure visualization in modern web browsers. Users submit a PDB identification code or select a PDB archive from their local disk, and Web3DMol will display and allow interactive manipulation of the 3D structure. Featured functions, such as sequence plot, fragment segmentation, measure tool and meta-information display, are offered for users to gain a better understanding of protein structure. Easy-to-use APIs are available for developers to reuse and extend Web3DMol. Web3DMol can be freely accessed at http://web3dmol.duapp.com/, and the source code is distributed under the MIT license. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. A Secure Web Application Providing Public Access to High-Performance Data Intensive Scientific Resources - ScalaBLAST Web Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.

    2008-05-04

    This work presents the ScalaBLAST Web Application (SWA), a web based application implemented using the PHP script language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology, and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.

  10. The new Health Sciences Library at the State University of New York at Buffalo.

    PubMed Central

    Fabrizio, N; Huang, C K

    1988-01-01

    The new Health Sciences Library at the State University of New York at Buffalo is a harmonious and functional blend of the old and the new. The old is a renovated Georgian style building with formal rooms containing fireplaces, carved woodwork and English oak paneling. The new is a contemporary four-story addition. Through the arrangement of space and the interior design, the new library offers users easy access to services and resources; accommodates the heavy daily flow of users and library materials; provides an environment of comfort, quiet, and safety; and promotes efficient communication among all segments of the library staff. This was accomplished through sound architectural design which included close consultation with the library director and staff during the planning process. The new library is equipped to face the challenge of meeting the needs of biomedical education, research, and clinical programs of the institution and its constituents in the years to come. PMID:3370382

  11. The graphics and data acquisition software package

    NASA Technical Reports Server (NTRS)

    Crosier, W. G.

    1981-01-01

    A software package was developed for use with micro and minicomputers, particularly the LSI-11/PDP-11 series. The package has a number of Fortran-callable subroutines which perform a variety of frequently needed tasks for biomedical applications. All routines are well documented, flexible, easy to use and modify, and require minimal programmer knowledge of peripheral hardware. The package is also economical of memory and CPU time. A single subroutine call can perform any one of the following functions: (1) plot an array of integer values from sampled A/D data; (2) plot an array of Y values versus an array of X values; (3) draw horizontal and/or vertical grid lines of selectable type; (4) annotate grid lines with user units; (5) get coordinates of user controlled crosshairs from the terminal for interactive graphics; (6) sample any analog channel with program selectable gain; (7) wait a specified time interval; and (8) perform random access I/O of one or more blocks of a sequential disk file. Several miscellaneous functions are also provided.

  12. The XCatDB, a Rich 3XMM Catalogue Interface

    NASA Astrophysics Data System (ADS)

    Michel, L.; Grisé, F.; Motch, C.; Gomez-Moran, A. N.

    2015-09-01

    The last release of the XMM catalog, 3XMM-DR4, published in July 2013, is the largest X-ray catalog ever built. It includes many data products such as spectra, time series, images, previews, and extractions of archival catalogs matching the position of X-ray sources. The Strasbourg Observatory built an original interface called the XCatDB. It was designed to make the best of this wide set of related products, with an emphasis on the images. Besides, it offers easy access to all other catalog parameters. Users can select data with very elaborate queries and can process them with online services such as an X-ray spectral fitting routine. The combination of all these features allows users to select data of interest by eye as well as to filter catalog parameters. Data selections can be picked out for further scientific analysis thanks to an interface operating with external VO clients. The XCatDB has been developed with Saada.

  13. Regional Disparities in Online Map User Access Volume and Determining Factors

    NASA Astrophysics Data System (ADS)

    Li, R.; Yang, N.; Li, R.; Huang, W.; Wu, H.

    2017-09-01

    The regional disparities of online map user access volume (referred to simply as 'user access volume' in this paper) are a topic of growing interest as online maps gain popularity among public users; understanding these disparities helps target the construction of geographic information services for different areas. We first statistically analysed the online map user access logs and quantified these regional access disparities on different scales. The results show that the volume of user access decreases from east to west in China as a whole, with East China producing the most access volume; these cities are also the crucial economic and transport centres. Principal Component Regression (PCR) is then applied to explore the regional disparities of user access volume. A determining model for online map access volume is proposed, which indicates that area scale is the primary determining factor for regional disparities, followed by public transport development level and public service development level. Other factors, such as the user quality index and financial index, have very limited influence on user access volume. Based on this study of regional disparities in user access volume, map providers can reasonably dispatch and allocate data resources and service resources in each area and improve the operational efficiency of the online map server cluster.
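
    For readers unfamiliar with Principal Component Regression, the sketch below shows the generic PCR pattern with scikit-learn: standardize the predictors, project them onto a few principal components, then regress on those components. The data are random placeholders, not the paper's access-log dataset.

    ```python
    # Generic principal component regression (PCR) pattern with scikit-learn.
    # The data here are random placeholders, not the paper's access-log dataset.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 6))          # e.g. area scale, transport level, ...
    y = X @ np.array([2.0, 1.0, 0.5, 0, 0, 0]) + rng.normal(scale=0.1, size=100)

    pcr = make_pipeline(StandardScaler(), PCA(n_components=3), LinearRegression())
    pcr.fit(X, y)
    print("R^2 on training data:", round(pcr.score(X, y), 3))
    print("coefficients on the components:",
          pcr.named_steps["linearregression"].coef_.round(2))
    ```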

  14. User Access | Energy Systems Integration Facility | NREL

    Science.gov Websites

    The ESIF houses an unparalleled collection of state-of-the-art capabilities. Through its user access program, the ESIF allows researchers access to its premier laboratories in support of research and development that aims to optimize our entire energy system at full power. Requests for access ...

  15. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  16. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  17. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Charles G. Crawford

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eleventh month of Phase 1 development activities.

  18. Measurement of Usability for Multimedia Interactive Learning Based on Website in Mathematics for SMK

    NASA Astrophysics Data System (ADS)

    Sukardjo, Moch.; Sugiyanta, Lipur

    2018-04-01

    Web usability evaluation, if done correctly, can significantly improve the quality of a website. A website containing multimedia for education should have a user interface that is both easy to learn and easy to use. Multimedia plays a big role in changing the mindset of a person in learning. Using multimedia, learners can easily obtain, adjust and make use of information. Therefore, multimedia is utilized by teachers in developing learning techniques to improve student learning outcomes. For students with self-directed learning, multimedia provides the ease and completeness of the courses in such a way that students can complete the learning independently both at school and at home without the guidance of teachers. This learning independence lies in how students choose, absorb information, and follow the evaluation quickly and efficiently. The 2013 Curriculum for Vocational High School (SMK) requires teachers to create engaging teaching and learning activities that students enjoy in the classroom (also called an invitational learning environment). Creating such a learning environment is still a problem for most teachers. Various studies reveal that teaching and learning activities are more effective and easier when assisted by visual tools. Using multimedia, learning material can be presented more attractively, helping students understand the material easily. The opposite is found in learning environments that rely only on ordinary lectures. Usability is the degree to which multimedia is easy to learn, easy to use and encourages users to use it. The website Multimedia Interactive Learning for Mathematics SMK Class X is the object of this study. Usability of this website is an important indicator for measuring effectiveness, efficiency, and student satisfaction in accessing the functionality of the website. This usability measurement should be done carefully before the design is implemented thoroughly. The only way to obtain high-quality test results is to start testing at the beginning of the design process and to keep testing each subsequent step. This research performs usability testing on the website using the WAMMI criteria (Website Analysis and Measurement Inventory) and focuses on how convenient the website application is to use. The components Attractiveness, Controllability, Efficiency, Helpfulness, and Learnability are applied. The website Multimedia Interactive Learning for Mathematics SMK Class X can thus serve its purpose of being accepted by students to improve student learning outcomes. The results show that the WAMMI method gives a usability value for Multimedia Mathematics SMK Class X of about 70% to 90%.

  19. Open | SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    DOE PAGES

    Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...

    2008-01-01

    Over the last decades a large number of performance tools have been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open | SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open | SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open | SpeedShop are accessible through multiple fully equivalent interfaces including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.

  20. "Just Another Tool for Online Studies” (JATOS): An Easy Solution for Setup and Management of Web Servers Supporting Online Studies

    PubMed Central

    Lange, Kristian; Kühn, Simone; Filevich, Elisa

    2015-01-01

    We present here “Just Another Tool for Online Studies” (JATOS): an open source, cross-platform web application with a graphical user interface (GUI) that greatly simplifies setting up and communicating with a web server to host online studies that are written in JavaScript. JATOS is easy to install in all three major platforms (Microsoft Windows, Mac OS X, and Linux), and seamlessly pairs with a database for secure data storage. It can be installed on a server or locally, allowing researchers to try the application and feasibility of their studies within a browser environment, before engaging in setting up a server. All communication with the JATOS server takes place via a GUI (with no need to use a command line interface), making JATOS an especially accessible tool for researchers without a strong IT background. We describe JATOS’ main features and implementation and provide a detailed tutorial along with example studies to help interested researchers to set up their online studies. JATOS can be found under the Internet address: www.jatos.org. PMID:26114751

  1. User-Centered Indexing for Adaptive Information Access

    NASA Technical Reports Server (NTRS)

    Chen, James R.; Mathe, Nathalie

    1996-01-01

    We are focusing on information access tasks characterized by large volume of hypermedia connected technical documents, a need for rapid and effective access to familiar information, and long-term interaction with evolving information. The problem for technical users is to build and maintain a personalized task-oriented model of the information to quickly access relevant information. We propose a solution which provides user-centered adaptive information retrieval and navigation. This solution supports users in customizing information access over time. It is complementary to information discovery methods which provide access to new information, since it lets users customize future access to previously found information. It relies on a technique, called Adaptive Relevance Network, which creates and maintains a complex indexing structure to represent a user's personal information access maps organized by concepts. This technique is integrated within the Adaptive HyperMan system, which helps NASA Space Shuttle flight controllers organize and access large amounts of information. It allows users to select and mark any part of a document as interesting, and to index that part with user-defined concepts. Users can then do subsequent retrieval of marked portions of documents. This functionality allows users to define and access personal collections of information, which are dynamically computed. The system also supports collaborative review by letting users share group access maps. The adaptive relevance network provides long-term adaptation based both on usage and on explicit user input. The indexing structure is dynamic and evolves over time. Learning and generalization support flexible retrieval of information under similar concepts. The network is geared towards more recent information access, and automatically manages its size in order to maintain rapid access when scaling up to large hypermedia space. We present results of simulated learning experiments.
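
    The abstract does not give the network's indexing or update rules. The toy sketch below only illustrates the general idea of a concept-indexed access map whose weights adapt with usage; the class, its fields and the boost rule are invented for illustration and are not the Adaptive Relevance Network algorithm.

    ```python
    # Toy concept-indexed access map with usage-based weight adaptation.
    # This illustrates the general idea only; it is not the actual
    # Adaptive Relevance Network described in the record.
    from collections import defaultdict

    class ToyRelevanceMap:
        def __init__(self, boost=0.1):
            self.weights = defaultdict(dict)   # concept -> {doc_section: weight}
            self.boost = boost

        def index(self, concept, doc_section, weight=1.0):
            self.weights[concept][doc_section] = weight

        def retrieve(self, concept, top=5):
            ranked = sorted(self.weights[concept].items(),
                            key=lambda kv: kv[1], reverse=True)
            return [doc for doc, _ in ranked[:top]]

        def record_access(self, concept, doc_section):
            # Reinforce sections the user actually opens under this concept.
            self.weights[concept][doc_section] = (
                self.weights[concept].get(doc_section, 0.0) + self.boost)

    m = ToyRelevanceMap()
    m.index("ascent procedures", "flight-rules.html#sec3")
    m.index("ascent procedures", "checklist.html#abort-modes")
    m.record_access("ascent procedures", "checklist.html#abort-modes")
    print(m.retrieve("ascent procedures"))
    ```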

  2. Geoscience Information Network (USGIN) Solutions for Interoperable Open Data Access Requirements

    NASA Astrophysics Data System (ADS)

    Allison, M. L.; Richard, S. M.; Patten, K.

    2014-12-01

    The geosciences are leading development of free, interoperable open access to data. The US Geoscience Information Network (USGIN) is a freely available data integration framework, jointly developed by the USGS and the Association of American State Geologists (AASG) in compliance with international standards and protocols, to provide easy discovery, access, and interoperability for geoscience data. USGIN standards include the geologic exchange language GeoSciML (v 3.2), which enables instant interoperability of geologic formation data and is also the base standard used by the 117-nation OneGeology consortium. The USGIN deployment of NGDS serves as a continent-scale operational demonstration of the expanded OneGeology vision to provide access to all geoscience data worldwide. USGIN is developed to accommodate a variety of applications; for example, the International Renewable Energy Agency streams data live to the Global Atlas of Renewable Energy. Alternatively, users without robust data sharing systems can download and implement a free software package, "GINstack", to easily deploy web services for exposing data online for discovery and access. The White House Open Data Access Initiative requires all federally funded research projects and federal agencies to make their data publicly accessible in an open source, interoperable format, with metadata. USGIN currently incorporates all aspects of the Initiative, as it emphasizes interoperability. The system is successfully deployed as the National Geothermal Data System (NGDS), officially launched at the White House Energy Datapalooza in May 2014. The USGIN Foundation has been established to ensure this technology continues to be accessible and available.

  3. Check out the Atmospheric Science User Forum

    Atmospheric Science Data Center

    2016-11-16

    Check out the Atmospheric Science User Forum Tuesday, November 15, 2016 The ASDC would like to bring your attention to the Atmospheric Science User Forum. The purpose of this forum is to improve user service, quality, and efficiency of NASA atmospheric science data. The forum intends to provide a quick and easy way to facilitate ...

  4. Web-Based Family Life Education: Spotlight on User Experience

    ERIC Educational Resources Information Center

    Doty, Jennifer; Doty, Matthew; Dworkin, Jodi

    2011-01-01

    Family Life Education (FLE) websites can benefit from the field of user experience, which makes technology easy to use. A heuristic evaluation of five FLE sites was performed using Nielsen's heuristics, guidelines for making sites user friendly. Greater site complexity resulted in more potential user problems. Sites most frequently had problems…

  5. A data skimming service for locally resident analysis data

    NASA Astrophysics Data System (ADS)

    Cranshaw, J.; Gardner, R. W.; Gieraltowski, J.; Malon, D.; Mambelli, M.; May, E.

    2008-07-01

    A Data Skimming Service (DSS) is a site-level service for rapid event filtering and selection from locally resident datasets based on metadata queries to associated 'tag' databases. In US ATLAS, we expect most if not all of the AOD-based datasets to be replicated to each of the five Tier 2 regional facilities in the US Tier 1 'cloud' coordinated by Brookhaven National Laboratory. Entire datasets will consist of on the order of several terabytes of data, and providing easy, quick access to skimmed subsets of these data will be vital to physics working groups. Typically, physicists will be interested in portions of the complete datasets, selected according to event-level attributes (number of jets, missing Et, etc) and content (specific analysis objects for subsequent processing). In this paper we describe methods used to classify data (metadata tag generation) and to store these results in a local database. Next we discuss a general framework which includes methods for accessing this information, defining skims, specifying event output content, accessing locally available storage through a variety of interfaces (SRM, dCache/dccp, gridftp), accessing remote storage elements as specified, and user job submission tools through local or grid schedulers. The advantages of the DSS are the ability to quickly 'browse' datasets and design skims, for example, pre-adjusting cuts to get to a desired skim level with minimal use of compute resources, and to encode these analysis operations in a database for re-analysis and archival purposes. Additionally the framework has provisions to operate autonomously in the event that external, central resources are not available, and to provide, as a reduced package, a minimal skimming service tailored to the needs of small Tier 3 centres or individual users.
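
    As a rough illustration of the tag-database selection step described above (not the DSS implementation itself), the sketch below stores a few event-level attributes in SQLite and pulls back the event identifiers that satisfy a skim cut; the table and column names are invented.

    ```python
    # Toy tag-database skim: select event IDs by event-level attributes.
    # Table and column names are invented for illustration; this is not the
    # actual DSS or ATLAS tag schema.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE tags (run INT, event INT, n_jets INT, missing_et REAL)")
    db.executemany("INSERT INTO tags VALUES (?, ?, ?, ?)", [
        (5200, 1, 2, 35.0),
        (5200, 2, 4, 82.5),
        (5200, 3, 3, 61.2),
    ])

    # Skim definition: at least 3 jets and missing Et above 50 GeV.
    selected = db.execute(
        "SELECT run, event FROM tags WHERE n_jets >= 3 AND missing_et > 50"
    ).fetchall()
    print("events passing the skim:", selected)
    ```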

  6. 76 FR 71267 - Standardized and Enhanced Disclosure Requirements for Television Broadcast Licensee Public...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-17

    ... collect and publish data in forms that make it easy for citizens, entrepreneurs, software developers, and... serves the public interest easier to understand and more accessible will not only promote discussion... order to create a permanent, searchable record of these arrangements and afford easy access by consumers...

  7. Transforming paper-based assessment forms to a digital format: Exemplified by the Housing Enabler prototype app.

    PubMed

    Svarre, Tanja; Lunn, Tine Bieber Kirkegaard; Helle, Tina

    2017-11-01

    The aim of this paper is to provide the reader with an overall impression of the stepwise user-centred design approach including the specific methods used and lessons learned when transforming paper-based assessment forms into a prototype app, taking the Housing Enabler as an example. Four design iterations were performed, building on a domain study, workshops, expert evaluation and controlled and realistic usability tests. The user-centred design process involved purposefully selected participants with different Housing Enabler knowledge and housing adaptation experience. The design iterations resulted in the development of a Housing Enabler prototype app. The prototype app has several features and options that are new compared with the original paper-based Housing Enabler assessment form. These new features include a user friendly overview of the assessment form; easy navigation by swiping back and forth between items; onsite data analysis; and ranking of the accessibility score, photo documentation and a data export facility. Based on the presented stepwise approach, a high-fidelity Housing Enabler prototype app was successfully developed. The development process has emphasized the importance of combining design participants' knowledge and experiences, and has shown that methods should seem relevant to participants to increase their engagement.

  8. Applications of the U.S. Geological Survey's global land cover product

    USGS Publications Warehouse

    Reed, B.

    1997-01-01

    The U.S. Geological Survey (USGS), in partnership with several international agencies and universities, has produced a global land cover characteristics database. The land cover data were created using multitemporal analysis of advanced very high resolution radiometer satellite images in conjunction with other existing geographic data. A translation table permits the conversion of the land cover classes into several conventional land cover schemes that are used by ecosystem modelers, climate modelers, land management agencies, and other user groups. The alternative classification schemes include Global Ecosystems, the Biosphere Atmosphere Transfer Scheme, the Simple Biosphere, the USGS Anderson Level 2, and the International Geosphere Biosphere Programme. The distribution system for these data is through the World Wide Web (the web site address is: http://edcwww.cr.usgs.gov/landdaac/glcc/glcc.html) or by magnetic media upon special request. The availability of the data over the World Wide Web, in conjunction with the flexible database structure, allows easy data access to a wide range of users. The web site contains a user registration form that allows analysis of the diverse applications of large-area land cover data. Currently, applications are divided among mapping (20 percent), conservation (30 percent), and modeling (35 percent).

  9. Using Unix system auditing for detecting network intrusions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, M.J.

    1993-03-01

    Intrusion Detection Systems (IDSs) are designed to detect actions of individuals who use computer resources without authorization as well as legitimate users who exceed their privileges. This paper describes a novel approach to IDS research, namely a decision aiding approach to intrusion detection. The introduction of a decision tree represents the logical steps necessary to distinguish and identify different types of attacks. This tool, the Intrusion Decision Aiding Tool (IDAT), utilizes IDS-based attack models and standard Unix audit data. Since attacks have certain characteristics and are based on already developed signature attack models, experienced and knowledgeable Unix system administrators know what to look for in system audit logs to determine if a system has been attacked. Others, however, are usually less able to recognize common signatures of unauthorized access. Users can traverse the tree using available audit data displayed by IDAT and general knowledge they possess to reach a conclusion regarding suspicious activity. IDAT is an easy-to-use window based application that gathers, analyzes, and displays pertinent system data according to Unix attack characteristics. IDAT offers a more practical approach and allows the user to make an informed decision regarding suspicious activity.
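
    As an illustration of the decision-aiding idea (not IDAT's actual rules), the sketch below steps parsed audit records through a small hand-written decision tree and returns a verdict for each one; the fields and thresholds are invented examples of signature-style checks.

```python
# Illustrative sketch of walking a small decision tree over parsed Unix audit
# records to flag suspicious activity.  Fields and thresholds are hypothetical.
audit_records = [
    {"user": "guest", "event": "su_attempt", "success": False, "count": 7},
    {"user": "alice", "event": "login", "success": True, "count": 1},
]

def assess(record):
    """Return a short verdict by stepping through simple decision rules."""
    if record["event"] == "su_attempt":
        if not record["success"] and record["count"] >= 5:
            return "suspicious: repeated failed privilege escalation"
        return "watch: privilege escalation attempt"
    if record["event"] == "login" and not record["success"]:
        return "watch: failed login"
    return "normal"

for rec in audit_records:
    print(rec["user"], "->", assess(rec))
```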

  10. No Longer Have to Choose

    NASA Astrophysics Data System (ADS)

    Brown, H.; Ritchey, N. A.

    2017-12-01

    NOAA's National Centers for Environmental Information (NCEI) was once three separate data centers (NGDC, NODC, and NCDC); in 2015 the three centers merged into NCEI. NCEI has refined the art of long-term preservation and stewardship practices throughout the life cycle of various types of data, and can help you navigate the complicated world of preserving your data and make it user-friendly. Using tools at NCEI, data providers can request data to be archived, submit data for archival and create complete International Organization for Standardization (ISO) metadata records with ease. To ensure traceability, Digital Object Identifiers (DOIs) are minted for published data sets. The services offered at NCEI follow standards and NOAA directives such as the Open Archival Information System (OAIS) Reference Model (ISO 14721) to ensure consistent long-term preservation of the Nation's resource of global environmental data for a broad spectrum of users. The implementation of these standards ensures the data are accessible, independently understandable and reproducible in an easy-to-understand format for all types of users. Insights from more than 100 combined years of domain, data management and preservation expertise, and from the tools supporting these functions, will be shared.

  11. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    PubMed

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
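
    The kind of per-image quantification such tools automate can be illustrated with a short, generic sketch: threshold a fluorescence image and count the resulting connected components as puncta. This is not IFDOTMETER's algorithm, only a minimal stand-in that uses a synthetic image.

```python
# Illustrative sketch of automated puncta counting by thresholding and
# connected-component labelling; a generic stand-in, not IFDOTMETER's method.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

# Synthetic "immunofluorescence" image: dim background with a few bright spots.
rng = np.random.default_rng(0)
image = rng.normal(10, 2, size=(128, 128))
for y, x in [(20, 30), (64, 64), (100, 90)]:
    image[y - 2:y + 3, x - 2:x + 3] += 100

# Threshold, label connected components, and report count and sizes.
mask = image > threshold_otsu(image)
labels = label(mask)
regions = regionprops(labels)
print(f"{len(regions)} puncta; areas = {[r.area for r in regions]}")
```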

  12. EDENetworks: a user-friendly software to build and analyse networks in biogeography, ecology and population genetics.

    PubMed

    Kivelä, Mikko; Arnaud-Haond, Sophie; Saramäki, Jari

    2015-01-01

    The recent application of graph-based network theory analysis to biogeography, community ecology and population genetics has created a need for user-friendly software, which would allow a wider accessibility to and adaptation of these methods. EDENetworks aims to fill this void by providing an easy-to-use interface for the whole analysis pipeline of ecological and evolutionary networks starting from matrices of species distributions, genotypes, bacterial OTUs or populations characterized genetically. The user can choose between several different ecological distance metrics, such as Bray-Curtis or Sorensen distance, or population genetic metrics such as FST or Goldstein distances, to turn the raw data into a distance/dissimilarity matrix. This matrix is then transformed into a network by manual or automatic thresholding based on percolation theory or by building the minimum spanning tree. The networks can be visualized along with auxiliary data and analysed with various metrics such as degree, clustering coefficient, assortativity and betweenness centrality. The statistical significance of the results can be estimated either by resampling the original biological data or by null models based on permutations of the data. © 2014 John Wiley & Sons Ltd.
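
    A minimal sketch of the pipeline described above, not EDENetworks itself: a toy abundance matrix is turned into a Bray-Curtis dissimilarity matrix, thresholded manually into a network, and summarized with standard network metrics.

```python
# Minimal sketch of the pipeline: abundance matrix -> Bray-Curtis
# dissimilarity -> manually thresholded network -> basic metrics.
import numpy as np
import networkx as nx
from scipy.spatial.distance import pdist, squareform

# Rows = sites, columns = species abundances (toy data).
abundances = np.array([
    [10, 0, 3, 1],
    [ 8, 1, 2, 0],
    [ 0, 9, 0, 7],
    [ 1, 8, 1, 6],
])

dist = squareform(pdist(abundances, metric="braycurtis"))

# Manual thresholding: connect sites whose dissimilarity is below 0.5.
G = nx.Graph()
n = dist.shape[0]
G.add_nodes_from(range(n))
for i in range(n):
    for j in range(i + 1, n):
        if dist[i, j] < 0.5:
            G.add_edge(i, j, weight=dist[i, j])

print("degrees:", dict(G.degree()))
print("clustering:", nx.clustering(G))
print("betweenness:", nx.betweenness_centrality(G))
```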

  13. Tools and Services for Working with Multiple Land Remote Sensing Data Products

    NASA Astrophysics Data System (ADS)

    Krehbiel, C.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.; Maiersperger, T.

    2016-12-01

    The availability of increasingly large and diverse satellite remote sensing datasets provides both an opportunity and a challenge across broad Earth science research communities. On one hand, the extensive assortment of available data offer unprecedented opportunities to improve our understanding of Earth science and enable data use across a multitude of science disciplines. On the other hand, increasingly complex formats, data structures, and metadata can be an obstacle to data use for the broad user community that is interested in incorporating remote sensing Earth science data into their research. NASA's Land Processes Distributed Active Archive Center (LP DAAC) provides easy to use Python notebook tutorials for services such as accessing land remote sensing data from the LP DAAC Data Pool and interpreting data quality information from MODIS. We use examples to demonstrate the capabilities of the Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), such as spatially and spectrally subsetting data, decoding valuable quality information, and exploring initial analysis results within the user interface. We also show data recipes for R and Python scripts that help users process ASTER L1T and ASTER Global Emissivity Datasets.
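
    Decoding bit-packed quality flags is one of the recurring tasks these tutorials cover. The sketch below shows the general bitmask technique; the layout assumed here (the two lowest bits holding an overall quality code) is only an example, and the real layout must be taken from the specific product's user guide.

```python
# Hedged sketch of decoding bit-packed quality values of the kind MODIS land
# products use.  The bit layout below is an assumed example for illustration.
import numpy as np

qa = np.array([0b00000000, 0b00000001, 0b00000010, 0b11111111], dtype=np.uint16)

overall_quality = qa & 0b11          # extract bits 0-1 (assumed quality field)
good = overall_quality == 0          # assume 0 means "produced, good quality"

print("overall quality codes:", overall_quality)
print("good-quality mask:", good)
```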

  14. qPortal: A platform for data-driven biomedical research.

    PubMed

    Mohr, Christopher; Friedrich, Andreas; Wojnar, David; Kenar, Erhan; Polatkan, Aydin Can; Codrea, Marius Cosmin; Czemmel, Stefan; Kohlbacher, Oliver; Nahnsen, Sven

    2018-01-01

    Modern biomedical research aims at drawing biological conclusions from large, highly complex biological datasets. It has become common practice to make extensive use of high-throughput technologies that produce large amounts of heterogeneous data. In addition to the ever-improving accuracy, methods are getting faster and cheaper, resulting in a steadily increasing need for scalable data management and easily accessible means of analysis. We present qPortal, a platform providing users with an intuitive way to manage and analyze quantitative biological data. The backend leverages a variety of concepts and technologies, such as relational databases, data stores, data models and means of data transfer, as well as front-end solutions to give users access to data management and easy-to-use analysis options. Users are empowered to conduct their experiments from the experimental design to the visualization of their results through the platform. Here, we illustrate the feature-rich portal by simulating a biomedical study based on publicly available data. We demonstrate the software's strength in supporting the entire project life cycle. The software supports the project design and registration, empowers users to do all-digital project management and finally provides means to perform analysis. We compare our approach to Galaxy, one of the most widely used scientific workflow and analysis platforms in computational biology. Application of both systems to a small case study shows the differences between a data-driven approach (qPortal) and a workflow-driven approach (Galaxy). qPortal, a one-stop-shop solution for biomedical projects, offers up-to-date analysis pipelines, quality control workflows, and visualization tools. Through intensive user interactions, appropriate data models have been developed. These models build the foundation of our biological data management system and provide possibilities to annotate data, query metadata for statistics and future re-analysis on high-performance computing systems via coupling of workflow management systems. Integration of project and data management as well as workflow resources in one place presents clear advantages over existing solutions.

  15. Implementing Recommendations From Web Accessibility Guidelines: A Comparative Study of Nondisabled Users and Users With Visual Impairments.

    PubMed

    Schmutz, Sven; Sonderegger, Andreas; Sauer, Juergen

    2017-09-01

    The present study examined whether implementing recommendations of Web accessibility guidelines would have different effects on nondisabled users than on users with visual impairments. The predominant approach for making Web sites accessible for users with disabilities is to apply accessibility guidelines. However, it has hardly been examined whether this approach has side effects for nondisabled users. A comparison of the effects on both user groups would contribute to a better understanding of possible advantages and drawbacks of applying accessibility guidelines. Participants from two matched samples, comprising 55 participants with visual impairments and 55 without impairments, took part in a synchronous remote testing of a Web site. Each participant was randomly assigned to one of three Web sites, which differed in the level of accessibility (very low, low, and high) according to recommendations of the well-established Web Content Accessibility Guidelines 2.0 (WCAG 2.0). Performance (i.e., task completion rate and task completion time) and a range of subjective variables (i.e., perceived usability, positive affect, negative affect, perceived aesthetics, perceived workload, and user experience) were measured. Higher conformance to Web accessibility guidelines resulted in increased performance and more positive user ratings (e.g., perceived usability or aesthetics) for both user groups. There was no interaction between user group and accessibility level. Higher conformance to WCAG 2.0 may result in benefits for nondisabled users and users with visual impairments alike. Practitioners may use the present findings as a basis for deciding on whether and how to implement accessibility best practice.

  16. Enhancement of Local Climate Analysis Tool

    NASA Astrophysics Data System (ADS)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users including energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to connect human and computer perspectives on the application of data and scientific techniques while processing multiple users' tasks simultaneously. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow for climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR), NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, and plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data to ensure there is no redundancy in development of tools that facilitate scientific advancements and use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular, through the University Corporation for Atmospheric Research (UCAR).
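
    As a toy illustration of one analysis type LCAT offers, the sketch below fits a linear trend to a synthetic local temperature series; the data and the use of an ordinary least-squares fit are illustrative assumptions, not LCAT's implementation.

```python
# Toy trend analysis on a synthetic local temperature time series.
import numpy as np
from scipy.stats import linregress

years = np.arange(1980, 2011)
rng = np.random.default_rng(42)
temps = 12.0 + 0.02 * (years - years[0]) + rng.normal(0, 0.3, years.size)

fit = linregress(years, temps)
print(f"trend: {fit.slope * 10:.2f} deg per decade (p = {fit.pvalue:.3f})")
```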

  17. Distributed XQuery-Based Integration and Visualization of Multimodality Brain Mapping Data

    PubMed Central

    Detwiler, Landon T.; Suciu, Dan; Franklin, Joshua D.; Moore, Eider B.; Poliakov, Andrew V.; Lee, Eunjung S.; Corina, David P.; Ojemann, George A.; Brinkley, James F.

    2008-01-01

    This paper addresses the need for relatively small groups of collaborating investigators to integrate distributed and heterogeneous data about the brain. Although various national efforts facilitate large-scale data sharing, these approaches are generally too “heavyweight” for individual or small groups of investigators, with the result that most data sharing among collaborators continues to be ad hoc. Our approach to this problem is to create a “lightweight” distributed query architecture, in which data sources are accessible via web services that accept arbitrary query languages but return XML results. A Distributed XQuery Processor (DXQP) accepts distributed XQueries in which subqueries are shipped to the remote data sources to be executed, with the resulting XML integrated by DXQP. A web-based application called DXBrain accesses DXQP, allowing a user to create, save and execute distributed XQueries, and to view the results in various formats including a 3-D brain visualization. Example results are presented using distributed brain mapping data sources obtained in studies of language organization in the brain, but any other XML source could be included. The advantage of this approach is that it is very easy to add and query a new source, the tradeoff being that the user needs to understand XQuery and the schemata of the underlying sources. For small numbers of known sources this burden is not onerous for a knowledgeable user, leading to the conclusion that the system helps to fill the gap between ad hoc local methods and large scale but complex national data sharing efforts. PMID:19198662
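
    The fan-out/merge pattern at the heart of this architecture can be sketched as follows: a subquery is shipped to each remote data-source web service, the XML fragments come back, and they are merged into one result document. The endpoints below are hypothetical, and the real DXQP ships XQuery subqueries rather than this simplified GET interface.

```python
# Conceptual sketch of the fan-out/merge pattern: ship a subquery to each
# remote data-source web service and merge the XML fragments it returns.
# Endpoints are hypothetical; DXQP itself works with distributed XQuery.
import requests
import xml.etree.ElementTree as ET

SOURCES = {
    "fmri": "https://example.org/fmri/query",   # hypothetical endpoint
    "csm":  "https://example.org/csm/query",    # hypothetical endpoint
}
subquery = "//site[stimulus='naming']"

merged = ET.Element("results")
for name, url in SOURCES.items():
    resp = requests.get(url, params={"q": subquery}, timeout=30)
    resp.raise_for_status()
    fragment = ET.fromstring(resp.content)
    wrapper = ET.SubElement(merged, "source", name=name)
    wrapper.append(fragment)

print(ET.tostring(merged, encoding="unicode"))
```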

  18. Eurogrid: a new glideinWMS based portal for CDF data analysis

    NASA Astrophysics Data System (ADS)

    Amerio, S.; Benjamin, D.; Dost, J.; Compostella, G.; Lucchesi, D.; Sfiligoi, I.

    2012-12-01

    The CDF experiment at Fermilab ended its Run-II phase in September 2011 after 11 years of operations and 10 fb-1 of collected data. The CDF computing model is based on a Central Analysis Farm (CAF) consisting of local computing and storage resources, supported by OSG and LCG resources accessed through dedicated portals. At the beginning of 2011 a new portal, Eurogrid, was developed to effectively exploit computing and disk resources in Europe: a dedicated farm and storage area at the TIER-1 CNAF computing center in Italy, and additional LCG computing resources at different TIER-2 sites in Italy, Spain, Germany and France, are accessed through a common interface. The goal of this project is to develop a portal easy to integrate into the existing CDF computing model, completely transparent to the user and requiring a minimum amount of maintenance support by the CDF collaboration. In this paper we will review the implementation of this new portal, and its performance in the first months of usage. Eurogrid is based on the glideinWMS software, a glidein based Workload Management System (WMS) that works on top of Condor. As the CDF CAF is based on Condor, the choice of the glideinWMS software was natural and the implementation seamless. Thanks to the pilot jobs, user-specific requirements and site resources are matched in a very efficient way, completely transparent to the users. In official use since June 2011, Eurogrid effectively complements and supports CDF computing resources, offering an optimal solution for the future in terms of required manpower for administration, support and development.

  19. Dexterity: A MATLAB-based analysis software suite for processing and visualizing data from tasks that measure arm or forelimb function.

    PubMed

    Butensky, Samuel D; Sloan, Andrew P; Meyers, Eric; Carmel, Jason B

    2017-07-15

    Hand function is critical for independence, and neurological injury often impairs dexterity. To measure hand function in people or forelimb function in animals, sensors are employed to quantify manipulation. These sensors make assessment easier and more quantitative and allow automation of these tasks. While automated tasks improve objectivity and throughput, they also produce large amounts of data that can be burdensome to analyze. We created software called Dexterity that simplifies data analysis of automated reaching tasks. Dexterity is MATLAB software that enables quick analysis of data from forelimb tasks. Through a graphical user interface, files are loaded and data are identified and analyzed. These data can be annotated or graphed directly. Analysis is saved, and the graph and corresponding data can be exported. For additional analysis, Dexterity provides access to custom scripts created by other users. To determine the utility of Dexterity, we performed a study to evaluate the effects of task difficulty on the degree of impairment after injury. Dexterity analyzed two months of data and allowed new users to annotate the experiment, visualize results, and save and export data easily. Previous analysis of tasks was performed with custom data analysis, requiring expertise with analysis software. Dexterity made the tools required to analyze, visualize and annotate data easy to use by investigators without data science experience. Dexterity increases accessibility to automated tasks that measure dexterity by making analysis of large data intuitive, robust, and efficient. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. BioMart Central Portal: an open database network for the biological community

    PubMed Central

    Guberman, Jonathan M.; Ai, J.; Arnaiz, O.; Baran, Joachim; Blake, Andrew; Baldock, Richard; Chelala, Claude; Croft, David; Cros, Anthony; Cutts, Rosalind J.; Di Génova, A.; Forbes, Simon; Fujisawa, T.; Gadaleta, E.; Goodstein, D. M.; Gundem, Gunes; Haggarty, Bernard; Haider, Syed; Hall, Matthew; Harris, Todd; Haw, Robin; Hu, S.; Hubbard, Simon; Hsu, Jack; Iyer, Vivek; Jones, Philip; Katayama, Toshiaki; Kinsella, R.; Kong, Lei; Lawson, Daniel; Liang, Yong; Lopez-Bigas, Nuria; Luo, J.; Lush, Michael; Mason, Jeremy; Moreews, Francois; Ndegwa, Nelson; Oakley, Darren; Perez-Llamas, Christian; Primig, Michael; Rivkin, Elena; Rosanoff, S.; Shepherd, Rebecca; Simon, Reinhard; Skarnes, B.; Smedley, Damian; Sperling, Linda; Spooner, William; Stevenson, Peter; Stone, Kevin; Teague, J.; Wang, Jun; Wang, Jianxin; Whitty, Brett; Wong, D. T.; Wong-Erasmus, Marie; Yao, L.; Youens-Clark, Ken; Yung, Christina; Zhang, Junjun; Kasprzyk, Arek

    2011-01-01

    BioMart Central Portal is a first of its kind, community-driven effort to provide unified access to dozens of biological databases spanning genomics, proteomics, model organisms, cancer data, ontology information and more. Anybody can contribute an independently maintained resource to the Central Portal, allowing it to be exposed to and shared with the research community, and linking it with the other resources in the portal. Users can take advantage of the common interface to quickly utilize different sources without learning a new system for each. The system also simplifies cross-database searches that might otherwise require several complicated steps. Several integrated tools streamline common tasks, such as converting between ID formats and retrieving sequences. The combination of a wide variety of databases, an easy-to-use interface, robust programmatic access and the array of tools make Central Portal a one-stop shop for biological data querying. Here, we describe the structure of Central Portal and show example queries to demonstrate its capabilities. Database URL: http://central.biomart.org. PMID:21930507
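
    Programmatic access to a BioMart-style portal typically means posting an XML query document to a "martservice" endpoint and reading back a tab-separated table. The sketch below follows that pattern, but the service path, dataset and attribute names are assumptions for illustration; the portal's own documentation should be consulted for the real values.

```python
# Hedged sketch of BioMart-style programmatic access: an XML query document is
# posted to a martservice endpoint and a TSV table comes back.  The service
# path, dataset, filter and attribute names below are assumptions.
import requests

query_xml = """<?xml version="1.0" encoding="UTF-8"?>
<Query virtualSchemaName="default" formatter="TSV" header="1" uniqueRows="1">
  <Dataset name="hsapiens_gene_ensembl">
    <Filter name="chromosome_name" value="21"/>
    <Attribute name="ensembl_gene_id"/>
    <Attribute name="external_gene_name"/>
  </Dataset>
</Query>"""

MARTSERVICE = "http://central.biomart.org/martservice"   # assumed endpoint path
resp = requests.post(MARTSERVICE, data={"query": query_xml}, timeout=60)
resp.raise_for_status()
print(resp.text[:500])
```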

  1. Inexpensive demonstration set for teaching geometrical optics made by 3D printer

    NASA Astrophysics Data System (ADS)

    Havlíček, Karel; Ryston, Matěj

    2018-03-01

    Good sets for teaching geometric optics are relatively expensive to buy and difficult to make on your own, which often forces teachers to use less than ideal instruments and methods. This is a great shame, since this is a visually appealing topic that can motivate students. For this reason, we have designed a set that is relatively cheap, easy to use and can therefore (in some cases) remedy this situation. Our set is manufactured using 3D printing technology, which limits its users to those that have access to it; however, 3D printing technology is becoming more and more accessible every day (even in schools). On the other hand, 3D printing allows us to let the machines do the majority of the manufacturing work, making the process of building the set almost as simple as ‘download and press print’. This article presents this set, what it consists of, how it is made and where you can find all the necessary files and instructions.

  2. Effective spatial database support for acquiring spatial information from remote sensing images

    NASA Astrophysics Data System (ADS)

    Jin, Peiquan; Wan, Shouhong; Yue, Lihua

    2009-12-01

    In this paper, a new approach to maintaining spatial information acquired from remote-sensing images is presented, based on an object-relational DBMS (ORDBMS). According to this approach, the results of target detection and recognition are stored in an ORDBMS-based spatial database system, where they can be further accessed, and users can query the spatial information using the standard SQL interface. This approach differs from the traditional ArcSDE-based method, because the spatial information management module is totally integrated into the DBMS and becomes one of its core modules. We focus on three issues, namely the general framework for the ORDBMS-based spatial database system, the definitions of the add-in spatial data types and operators, and the process of developing a spatial DataBlade on Informix. The results show that ORDBMS-based spatial database support for image-based target detection and recognition is easy and practical to implement.
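
    The core idea, spatial operators integrated into the DBMS and used from standard SQL, can be illustrated with a runnable stand-in: a custom predicate registered with the database engine and called from a query. The real system implements such operators as an Informix DataBlade; sqlite3 and the bounding-box test below are only illustrative.

```python
# Conceptual, runnable stand-in for the "add-in operator" idea: a custom
# function registered with the database engine and then used from plain SQL.
# The real system uses an Informix DataBlade; sqlite3 is only an illustration.
import sqlite3

def within_bbox(x, y, xmin, ymin, xmax, ymax):
    """Toy spatial predicate: is point (x, y) inside the bounding box?"""
    return int(xmin <= x <= xmax and ymin <= y <= ymax)

conn = sqlite3.connect(":memory:")
conn.create_function("within_bbox", 6, within_bbox)
conn.execute("CREATE TABLE targets (name TEXT, x REAL, y REAL)")
conn.executemany("INSERT INTO targets VALUES (?, ?, ?)",
                 [("airstrip", 10.2, 4.5), ("bridge", 55.0, 60.1)])

# Users query recognized targets through standard SQL plus the add-in operator.
rows = conn.execute(
    "SELECT name FROM targets WHERE within_bbox(x, y, 0, 0, 20, 20) = 1"
).fetchall()
print(rows)
```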

  3. 3D-Lab: a collaborative web-based platform for molecular modeling.

    PubMed

    Grebner, Christoph; Norrby, Magnus; Enström, Jonatan; Nilsson, Ingemar; Hogner, Anders; Henriksson, Jonas; Westin, Johan; Faramarzi, Farzad; Werner, Philip; Boström, Jonas

    2016-09-01

    The use of 3D information has shown impact in numerous applications in drug design. However, it is often under-utilized and traditionally limited to specialists. We want to change that, and present an approach making 3D information and molecular modeling accessible and easy-to-use 'for the people'. A user-friendly and collaborative web-based platform (3D-Lab) for 3D modeling, including a blazingly fast virtual screening capability, was developed. 3D-Lab provides an interface to automatic molecular modeling, like conformer generation, ligand alignments, molecular dockings and simple quantum chemistry protocols. 3D-Lab is designed to be modular, and to facilitate sharing of 3D-information to promote interactions between drug designers. Recent enhancements to our open-source virtual reality tool Molecular Rift are described. The integrated drug-design platform allows drug designers to instantaneously access 3D information and readily apply advanced and automated 3D molecular modeling tasks, with the aim to improve decision-making in drug design projects.

  4. BioMart Central Portal: an open database network for the biological community.

    PubMed

    Guberman, Jonathan M; Ai, J; Arnaiz, O; Baran, Joachim; Blake, Andrew; Baldock, Richard; Chelala, Claude; Croft, David; Cros, Anthony; Cutts, Rosalind J; Di Génova, A; Forbes, Simon; Fujisawa, T; Gadaleta, E; Goodstein, D M; Gundem, Gunes; Haggarty, Bernard; Haider, Syed; Hall, Matthew; Harris, Todd; Haw, Robin; Hu, S; Hubbard, Simon; Hsu, Jack; Iyer, Vivek; Jones, Philip; Katayama, Toshiaki; Kinsella, R; Kong, Lei; Lawson, Daniel; Liang, Yong; Lopez-Bigas, Nuria; Luo, J; Lush, Michael; Mason, Jeremy; Moreews, Francois; Ndegwa, Nelson; Oakley, Darren; Perez-Llamas, Christian; Primig, Michael; Rivkin, Elena; Rosanoff, S; Shepherd, Rebecca; Simon, Reinhard; Skarnes, B; Smedley, Damian; Sperling, Linda; Spooner, William; Stevenson, Peter; Stone, Kevin; Teague, J; Wang, Jun; Wang, Jianxin; Whitty, Brett; Wong, D T; Wong-Erasmus, Marie; Yao, L; Youens-Clark, Ken; Yung, Christina; Zhang, Junjun; Kasprzyk, Arek

    2011-01-01

    BioMart Central Portal is a first of its kind, community-driven effort to provide unified access to dozens of biological databases spanning genomics, proteomics, model organisms, cancer data, ontology information and more. Anybody can contribute an independently maintained resource to the Central Portal, allowing it to be exposed to and shared with the research community, and linking it with the other resources in the portal. Users can take advantage of the common interface to quickly utilize different sources without learning a new system for each. The system also simplifies cross-database searches that might otherwise require several complicated steps. Several integrated tools streamline common tasks, such as converting between ID formats and retrieving sequences. The combination of a wide variety of databases, an easy-to-use interface, robust programmatic access and the array of tools make Central Portal a one-stop shop for biological data querying. Here, we describe the structure of Central Portal and show example queries to demonstrate its capabilities.

  5. Investigating health information needs of community radio stations and applying the World Wide Web to disseminate audio products.

    PubMed

    Snyders, Janus; van Wyk, Elmarie; van Zyl, Hendra

    2010-01-01

    The Web and Media Technologies Platform (WMTP) of the South African Medical Research Council (MRC) conducted a pilot project amongst community radio stations in South Africa. Based on previous research done in Africa, WMTP investigated the following research question: How reliable is the content of health information broadcast by community radio stations? The main objectives of the project were 1) to determine the intervals of health slots on community radio stations, 2) to identify the sources used by community radio stations for health slots, 3) to determine the type of audio products needed for health slots, and 4) to develop a user-friendly Web site in response to the stations' needs for easy access to audio material on health information.

  6. 3D Slicer as a tool for interactive brain tumor segmentation.

    PubMed

    Kikinis, Ron; Pieper, Steve

    2011-01-01

    User interaction is required for reliable segmentation of brain tumors in clinical practice and in clinical research. By incorporating current research tools, 3D Slicer provides a set of interactive, easy to use tools that can be efficiently used for this purpose. One of the modules of 3D Slicer is an interactive editor tool, which contains a variety of interactive segmentation effects. Use of these effects for fast and reproducible segmentation of a single glioblastoma from magnetic resonance imaging data is demonstrated. The innovation in this work lies not in the algorithm, but in the accessibility of the algorithm because of its integration into a software platform that is practical for research in a clinical setting.

  7. ReOpt[trademark] V2.0 user guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M K; Bryant, J L

    1992-10-01

    Cleaning up the large number of contaminated waste sites at Department of Energy (DOE) facilities in the US presents a large and complex problem. Each waste site poses a singular set of circumstances (different contaminants, environmental concerns, and regulations) that affect selection of an appropriate response. Pacific Northwest Laboratory (PNL) developed ReOpt to provide information about the remedial action technologies that are currently available. It is an easy-to-use personal computer program and database that contains data about these remedial technologies and auxiliary data about contaminants and regulations. ReOpt will enable engineers and planners involved in environmental restoration efforts to quickly identify potentially applicable environmental restoration technologies and access corresponding information required to select cleanup activities for DOE sites.

  8. WiseView: Visualizing motion and variability of faint WISE sources

    NASA Astrophysics Data System (ADS)

    Caselden, Dan; Westin, Paul, III; Meisner, Aaron; Kuchner, Marc; Colin, Guillaume

    2018-06-01

    WiseView renders image blinks of Wide-field Infrared Survey Explorer (WISE) coadds spanning a multi-year time baseline in a browser. The software allows for easy visual identification of motion and variability for sources far beyond the single-frame detection limit, a key threshold not surmounted by many studies. WiseView transparently gathers small image cutouts drawn from many terabytes of unWISE coadds, facilitating access to this large and unique dataset. Users need only input the coordinates of interest and can interactively tune parameters including the image stretch, colormap and blink rate. WiseView was developed in the context of the Backyard Worlds: Planet 9 citizen science project, and has enabled hundreds of brown dwarf candidate discoveries by citizen scientists and professional astronomers.

  9. Isfahan MISP Dataset.

    PubMed

    Kashefpur, Masoud; Kafieh, Rahele; Jorjandi, Sahar; Golmohammadi, Hadis; Khodabande, Zahra; Abbasi, Mohammadreza; Teifuri, Nilufar; Fakharzadeh, Ali Akbar; Kashefpoor, Maryam; Rabbani, Hossein

    2017-01-01

    An online depository was introduced to share clinical ground truth with the public and provide open access for researchers to evaluate their computer-aided algorithms. PHP was used for web programming and MySQL for database management. The website was entitled "biosigdata.com." It was a fast, secure, and easy-to-use online database for medical signals and images. Freely registered users could download the datasets and could also share their own supplementary materials while maintaining their privacy (citation and fee). Commenting was also available for all datasets, and automatic sitemap and semi-automatic SEO indexing have been set up for the site. A comprehensive list of available websites for medical datasets is also presented as a Supplementary file (http://journalonweb.com/tempaccess/4800.584.JMSS_55_16I3253.pdf).

  10. MiMiR – an integrated platform for microarray data sharing, mining and analysis

    PubMed Central

    Tomlinson, Chris; Thimma, Manjula; Alexandrakis, Stelios; Castillo, Tito; Dennis, Jayne L; Brooks, Anthony; Bradley, Thomas; Turnbull, Carly; Blaveri, Ekaterini; Barton, Geraint; Chiba, Norie; Maratou, Klio; Soutter, Pat; Aitman, Tim; Game, Laurence

    2008-01-01

    Background Despite considerable efforts within the microarray community for standardising data format, content and description, microarray technologies present major challenges in managing, sharing, analysing and re-using the large amount of data generated locally or internationally. Additionally, it is recognised that inconsistent and low quality experimental annotation in public data repositories significantly compromises the re-use of microarray data for meta-analysis. MiMiR, the Microarray data Mining Resource was designed to tackle some of these limitations and challenges. Here we present new software components and enhancements to the original infrastructure that increase accessibility, utility and opportunities for large scale mining of experimental and clinical data. Results A user friendly Online Annotation Tool allows researchers to submit detailed experimental information via the web at the time of data generation rather than at the time of publication. This ensures the easy access and high accuracy of meta-data collected. Experiments are programmatically built in the MiMiR database from the submitted information and details are systematically curated and further annotated by a team of trained annotators using a new Curation and Annotation Tool. Clinical information can be annotated and coded with a clinical Data Mapping Tool within an appropriate ethical framework. Users can visualise experimental annotation, assess data quality, download and share data via a web-based experiment browser called MiMiR Online. All requests to access data in MiMiR are routed through a sophisticated middleware security layer thereby allowing secure data access and sharing amongst MiMiR registered users prior to publication. Data in MiMiR can be mined and analysed using the integrated EMAAS open source analysis web portal or via export of data and meta-data into Rosetta Resolver data analysis package. Conclusion The new MiMiR suite of software enables systematic and effective capture of extensive experimental and clinical information with the highest MIAME score, and secure data sharing prior to publication. MiMiR currently contains more than 150 experiments corresponding to over 3000 hybridisations and supports the Microarray Centre's large microarray user community and two international consortia. The MiMiR flexible and scalable hardware and software architecture enables secure warehousing of thousands of datasets, including clinical studies, from microarray and potentially other -omics technologies. PMID:18801157

  11. a Digital Pre-Inventory of Architectural Heritage in Kosovo Using DOCU-TOOLS®

    NASA Astrophysics Data System (ADS)

    Jäger-Klein, C.; Kryeziu, A.; Ymeri Hoxha, V.; Rant, M.

    2017-08-01

    Kosovo is one of the new states in transition in the Western Balkans and its state institutions are not yet fully functional. Although the territory has a rich architectural heritage, the documentation and inventory of this cultural legacy by the national monument protection institutions is insufficiently structured and incomplete. Civil society has collected far more material than the state, but people are largely untrained in the terminology and categories of professional cultural inventories and in database systems and their international standards. What is missing is an efficient, user-friendly, low-threshold tool to gather together and integrate the various materials, archive them appropriately and make all the information suitably accessible to the public. Multiple groups of information-holders should be able to feed this open-access platform in an easy and self-explanatory way. In this case, existing systems such as the Arches Heritage Inventory and Management System would seem to be too complex, as they presuppose a certain understanding of the standard terminology and internationally used categories. Also, the platform, as an archive, must be able to guarantee the integrity and authenticity of the inputted material to avoid abuse by unauthorized users with nationalistic views. Such an open-access lay-inventory would enable Kosovo to meet the urgent need for a national heritage inventory, which the state institutions have thus far been unable to establish. The situation is time-sensitive, as Kosovo will soon repeat its attempt to join UNESCO, having failed to do so in 2015 by falling just short of the required number of votes in favour. In Austria, a program called docu-tools® was recently developed to tackle a similar problem. It can be used by non-professionals to document complicated and multi-structured cases within the building process. Its cloud and app-design structure allows archiving enormous numbers of images and documents in any format. Additionally, it allows parallel access by authorized users and avoids any hierarchy of structure or prerequisites for its users. The archived documents cannot be changed after input, which has given this documentation tool recognized relevance in court. The following article is an attempt to explore the potential for this tool to prepare Kosovo for a comprehensive heritage inventory.

  12. MiMiR--an integrated platform for microarray data sharing, mining and analysis.

    PubMed

    Tomlinson, Chris; Thimma, Manjula; Alexandrakis, Stelios; Castillo, Tito; Dennis, Jayne L; Brooks, Anthony; Bradley, Thomas; Turnbull, Carly; Blaveri, Ekaterini; Barton, Geraint; Chiba, Norie; Maratou, Klio; Soutter, Pat; Aitman, Tim; Game, Laurence

    2008-09-18

    Despite considerable efforts within the microarray community for standardising data format, content and description, microarray technologies present major challenges in managing, sharing, analysing and re-using the large amount of data generated locally or internationally. Additionally, it is recognised that inconsistent and low quality experimental annotation in public data repositories significantly compromises the re-use of microarray data for meta-analysis. MiMiR, the Microarray data Mining Resource was designed to tackle some of these limitations and challenges. Here we present new software components and enhancements to the original infrastructure that increase accessibility, utility and opportunities for large scale mining of experimental and clinical data. A user friendly Online Annotation Tool allows researchers to submit detailed experimental information via the web at the time of data generation rather than at the time of publication. This ensures the easy access and high accuracy of meta-data collected. Experiments are programmatically built in the MiMiR database from the submitted information and details are systematically curated and further annotated by a team of trained annotators using a new Curation and Annotation Tool. Clinical information can be annotated and coded with a clinical Data Mapping Tool within an appropriate ethical framework. Users can visualise experimental annotation, assess data quality, download and share data via a web-based experiment browser called MiMiR Online. All requests to access data in MiMiR are routed through a sophisticated middleware security layer thereby allowing secure data access and sharing amongst MiMiR registered users prior to publication. Data in MiMiR can be mined and analysed using the integrated EMAAS open source analysis web portal or via export of data and meta-data into Rosetta Resolver data analysis package. The new MiMiR suite of software enables systematic and effective capture of extensive experimental and clinical information with the highest MIAME score, and secure data sharing prior to publication. MiMiR currently contains more than 150 experiments corresponding to over 3000 hybridisations and supports the Microarray Centre's large microarray user community and two international consortia. The MiMiR flexible and scalable hardware and software architecture enables secure warehousing of thousands of datasets, including clinical studies, from microarray and potentially other -omics technologies.

  13. Project Assessment Skills Web Application

    NASA Technical Reports Server (NTRS)

    Goff, Samuel J.

    2013-01-01

    The purpose of this project is to utilize Ruby on Rails to create a web application that will replace a spreadsheet keeping track of training courses and tasks. The goal is to create a fast and easy to use web application that will allow users to track progress on training courses. This application will allow users to update and keep track of all of the training required of them. The training courses will be organized by group and by user, making readability easier. This will also allow group leads and administrators to get a sense of how everyone is progressing in training. Currently, updating and finding information from this spreadsheet is a long and tedious task. By upgrading to a web application, finding and updating information will be easier than ever as well as adding new training courses and tasks. Accessing this data will be much easier in that users just have to go to a website and log in with NDC credentials rather than request the relevant spreadsheet from the holder. In addition to Ruby on Rails, I will be using JavaScript, CSS, and jQuery to help add functionality and ease of use to my web application. This web application will include a number of features that will help update and track progress on training. For example, one feature will be to track progress of a whole group of users to be able to see how the group as a whole is progressing. Another feature will be to assign tasks to either a user or a group of users. All of these together will create a user friendly and functional web application.

  14. The Usability of Diabetes MAP: A Web-delivered Intervention for Improving Medication Adherence.

    PubMed

    Nelson, Lyndsay A; Bethune, Magaela C; Lagotte, Andrea E; Osborn, Chandra Y

    2016-05-12

    Web-delivered interventions are a feasible approach to health promotion. However, if a website is poorly designed, difficult to navigate, and has technical bugs, it will not be used as intended. Usability testing prior to evaluating a website's benefits can identify barriers to user engagement and maximize future use. We developed a Web-delivered intervention called Diabetes Medication Adherence Promotion (Diabetes MAP) and used a mixed-methods approach to test its usability prior to evaluating its efficacy on medication adherence and glycemic control in a randomized controlled trial. We recruited English-speaking adults with type 2 diabetes mellitus (T2DM) from an academic medical center who were prescribed diabetes medications. A trained research assistant administered a baseline survey, collected medical record information, and instructed participants on how to access Diabetes MAP. Participants were asked to use the site independently for 2 weeks and to provide survey and/or focus group feedback on their experience. We analyzed survey data descriptively and qualitative data thematically to identify participants' favorable and unfavorable experiences, characterize usability concerns, and solicit recommendations for improving Diabetes MAP. Enrolled participants (N=32) were an average of 51.7 ± 11.8 years old, 66% (21/32) female, 60% (19/32) non-Hispanic White, 88% (28/32) had more than 12 years of education, half had household incomes over $50,000, and 78% (25/32) were privately insured. Average duration of diagnosed diabetes was 7.8 ± 6.3 years, average A1c was 7.4 ± 2.0, and 38% (12/32) were prescribed insulin. Of enrolled participants, 91% (29/32) provided survey and/or focus group feedback about Diabetes MAP. On the survey, participants agreed website information was clear and easy to understand, but in focus groups they reported navigational challenges and difficulty overcoming user errors (eg, entering data in an unspecified format). Participants also reported difficulty accessing the site and, once accessed, using all of its features. Participants recommended improving the site's user interface to facilitate quick, efficient access to all features and content. Adults with T2DM rated the Diabetes MAP website favorably on surveys, but focus groups gave more in-depth feedback on the user experience (eg, difficulty accessing the site, maximizing all of the site's features and content, and recovering from errors). Appropriate usability testing methods ensure Web-delivered interventions work as intended and any benefits are not diminished by usability challenges.

  15. The Usability of Diabetes MAP: A Web-delivered Intervention for Improving Medication Adherence

    PubMed Central

    Nelson, Lyndsay A; Bethune, Magaela C; Lagotte, Andrea E

    2016-01-01

    Background Web-delivered interventions are a feasible approach to health promotion. However, if a website is poorly designed, difficult to navigate, and has technical bugs, it will not be used as intended. Usability testing prior to evaluating a website’s benefits can identify barriers to user engagement and maximize future use. Objective We developed a Web-delivered intervention called Diabetes Medication Adherence Promotion (Diabetes MAP) and used a mixed-methods approach to test its usability prior to evaluating its efficacy on medication adherence and glycemic control in a randomized controlled trial. Methods We recruited English-speaking adults with type 2 diabetes mellitus (T2DM) from an academic medical center who were prescribed diabetes medications. A trained research assistant administered a baseline survey, collected medical record information, and instructed participants on how to access Diabetes MAP. Participants were asked to use the site independently for 2 weeks and to provide survey and/or focus group feedback on their experience. We analyzed survey data descriptively and qualitative data thematically to identify participants’ favorable and unfavorable experiences, characterize usability concerns, and solicit recommendations for improving Diabetes MAP. Results Enrolled participants (N=32) were an average of 51.7 ± 11.8 years old, 66% (21/32) female, 60% (19/32) non-Hispanic White, 88% (28/32) had more than 12 years of education, half had household incomes over $50,000, and 78% (25/32) were privately insured. Average duration of diagnosed diabetes was 7.8 ± 6.3 years, average A1c was 7.4 ± 2.0, and 38% (12/32) were prescribed insulin. Of enrolled participants, 91% (29/32) provided survey and/or focus group feedback about Diabetes MAP. On the survey, participants agreed website information was clear and easy to understand, but in focus groups they reported navigational challenges and difficulty overcoming user errors (eg, entering data in an unspecified format). Participants also reported difficulty accessing the site and, once accessed, using all of its features. Participants recommended improving the site’s user interface to facilitate quick, efficient access to all features and content. Conclusions Adults with T2DM rated the Diabetes MAP website favorably on surveys, but focus groups gave more in-depth feedback on the user experience (eg, difficulty accessing the site, maximizing all of the site’s features and content, and recovering from errors). Appropriate usability testing methods ensure Web-delivered interventions work as intended and any benefits are not diminished by usability challenges. PMID:27174496

  16. The Psychosocial Development and Increased Fluency of Users of the SpeechEasyRTM Device: A Multiple Unit Case Study

    ERIC Educational Resources Information Center

    Horgan, David James

    2010-01-01

    This dissertation study explored the efficacy of the SpeechEasy[R] device for individuals who are gainfully employed stutterers and who participated in workplace education learning activities. This study attempted to fill a gap in the literature regarding efficacy of the SpeechEasy[R] device. It employed a qualitative multiple unit case study…

  17. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    NASA Astrophysics Data System (ADS)

    Kadow, C.; Illing, S.; Kunst, O.; Cubasch, U.

    2014-12-01

    The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION) as part of the German decadal prediction project MiKlip develops a central evaluation system. The fully operational hybrid system features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects in- and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard using the meta information of the self-describing model, reanalysis and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, helps users, developers and their tools retrieve the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via shell or web-system. Therefore, plugged-in tools automatically gain transparency and reproducibility. Furthermore, when configurations match while starting an evaluation tool, the system suggests reusing results already produced by other users, saving CPU time, I/O and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. website: www-miklip.dkrz.de visitor-login: guest password: miklip
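
    The plugin idea behind the generic API can be sketched as follows: an external verification tool is wrapped behind one common interface, and every run's configuration and outcome are recorded so results can be shared and reused. The class, method and file names are assumptions for illustration, not the MiKlip system's actual API.

```python
# Hypothetical sketch of a generic plugin interface: wrap an external tool,
# record each run's configuration so identical runs can be shared and reused.
# Names and structure are assumptions, not the MiKlip system's actual API.
import json
import subprocess
import time

class EvaluationPlugin:
    """Wraps an external verification tool behind one common interface."""

    def __init__(self, name, command_template):
        self.name = name
        self.command_template = command_template

    def run(self, **config):
        command = self.command_template.format(**config)
        started = time.time()
        result = subprocess.run(command, shell=True,
                                capture_output=True, text=True)
        # Store the configuration and outcome so matching runs can be reused.
        record = {"tool": self.name, "config": config,
                  "returncode": result.returncode,
                  "seconds": time.time() - started}
        with open("history.jsonl", "a") as log:
            log.write(json.dumps(record) + "\n")
        return result.stdout

# Example: a plugin wrapping a (hypothetical) command-line skill-score tool.
plugin = EvaluationPlugin("anomaly_correlation",
                          "echo scoring {model} vs {reference}")
print(plugin.run(model="decadal_hindcast", reference="HadCRUT"))
```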

  18. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    NASA Astrophysics Data System (ADS)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich

    2015-04-01

    The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION) as part of the German decadal prediction project MiKlip develops a central evaluation system. The fully operational hybrid system features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects in- and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard using the meta information of the self-describing model, reanalysis and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This metadata system, with its advanced but easy-to-handle search tool, helps users, developers and their tools retrieve the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via shell or web-system. Therefore, plugged-in tools automatically gain transparency and reproducibility. Furthermore, when configurations match while starting an evaluation tool, the system suggests reusing results already produced by other users, saving CPU time, I/O and disk space. This study presents the different techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. website: www-miklip.dkrz.de visitor-login: click on "Guest"

  19. Phytotracker, an information management system for easy recording and tracking of plants, seeds and plasmids

    PubMed Central

    2012-01-01

    Background A large number of different plant lines are produced and maintained in a typical plant research laboratory, both as seed stocks and in active growth. These collections need careful and consistent management to track and maintain them properly, and this is a particularly pressing issue in laboratories undertaking research involving genetic manipulation due to regulatory requirements. Researchers and PIs need to access these data and collections, and therefore an easy-to-use plant-oriented laboratory information management system that implements, maintains and displays the information in a simple and visual format would be of great help in both the daily work in the lab and in ensuring regulatory compliance. Results Here, we introduce ‘Phytotracker’, a laboratory management system designed specifically to organise and track plasmids, seeds and growing plants that can be used in mixed platform environments. Phytotracker is designed with simplicity of user operation and ease of installation and management as the major factor, whilst providing tracking tools that cover the full range of activities in molecular genetics labs. It utilises the cross-platform Filemaker relational database, which allows it to be run as a stand-alone or as a server-based networked solution available across all workstations in a lab that can be internet accessible if desired. It can also be readily modified or customised further. Phytotracker provides cataloguing and search functions for plasmids, seed batches, seed stocks and plants growing in pots or trays, and allows tracking of each plant from seed sowing, through harvest to the new seed batch and can print appropriate labels at each stage. The system enters seed information as it is transferred from the previous harvest data, and allows both selfing and hybridization (crossing) to be defined and tracked. Transgenic lines can be linked to their plasmid DNA source. This ease of use and flexibility helps users to reduce their time needed to organise their plants, seeds and plasmids and to maintain laboratory continuity involving multiple workers. Conclusion We have developed and used Phytotracker for over five years and have found it has been an intuitive, powerful and flexible research tool in organising our plasmid, seed and plant collections requiring minimal maintenance and training for users. It has been developed in an Arabidopsis molecular genetics environment, but can be readily adapted for almost any plant laboratory research. PMID:23062011

  20. Phytotracker, an information management system for easy recording and tracking of plants, seeds and plasmids.

    PubMed

    Nieuwland, Jeroen; Sornay, Emily; Marchbank, Angela; de Graaf, Barend Hj; Murray, James Ah

    2012-10-13

    A large number of different plant lines are produced and maintained in a typical plant research laboratory, both as seed stocks and in active growth. These collections need careful and consistent management to track and maintain them properly, and this is a particularly pressing issue in laboratories undertaking research involving genetic manipulation due to regulatory requirements. Researchers and PIs need to access these data and collections, and therefore an easy-to-use plant-oriented laboratory information management system that implements, maintains and displays the information in a simple and visual format would be of great help in both the daily work in the lab and in ensuring regulatory compliance. Here, we introduce 'Phytotracker', a laboratory management system designed specifically to organise and track plasmids, seeds and growing plants that can be used in mixed platform environments. Phytotracker is designed with simplicity of user operation and ease of installation and management as the major factor, whilst providing tracking tools that cover the full range of activities in molecular genetics labs. It utilises the cross-platform Filemaker relational database, which allows it to be run as a stand-alone or as a server-based networked solution available across all workstations in a lab that can be internet accessible if desired. It can also be readily modified or customised further. Phytotracker provides cataloguing and search functions for plasmids, seed batches, seed stocks and plants growing in pots or trays, and allows tracking of each plant from seed sowing, through harvest to the new seed batch and can print appropriate labels at each stage. The system enters seed information as it is transferred from the previous harvest data, and allows both selfing and hybridization (crossing) to be defined and tracked. Transgenic lines can be linked to their plasmid DNA source. This ease of use and flexibility helps users to reduce their time needed to organise their plants, seeds and plasmids and to maintain laboratory continuity involving multiple workers. We have developed and used Phytotracker for over five years and have found it has been an intuitive, powerful and flexible research tool in organising our plasmid, seed and plant collections requiring minimal maintenance and training for users. It has been developed in an Arabidopsis molecular genetics environment, but can be readily adapted for almost any plant laboratory research.

  1. Data Access Services that Make Remote Sensing Data Easier to Use

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher

    2010-01-01

    This slide presentation reviews some of the processes that NASA uses to make remote sensing data easy to use over the World Wide Web. This work involves much research into data formats, geolocation structures and quality indicators, often followed by coding a preprocessing program. Only then are the data usable within the analysis tool of choice. The Goddard Earth Sciences Data and Information Services Center is deploying a variety of data access services that are designed to dramatically shorten the time consumed in the data preparation step. On-the-fly conversion to the standard network Common Data Form (netCDF) format with Climate and Forecast (CF) conventions imposes a standard coordinate system framework that makes data instantly readable through several tools, such as the Integrated Data Viewer, Gridded Analysis and Display System, Panoply and Ferret. A similar benefit is achieved by serving data through the Open Source Project for a Network Data Access Protocol (OPeNDAP), which also provides subsetting. The Data Quality Screening Service goes a step further by filtering out data points using quality control flags, according to science team recommendations or user-specified criteria. Further still is the Giovanni online analysis system, which goes beyond handling formatting and quality to provide visualization and basic statistics of the data. This general approach of automating the preparation steps has the important added benefit of enabling use of the data by non-human users (i.e., computer programs), which often make sub-optimal use of the available data due to the need to hard-code data preparation on the client side.
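
    As a rough illustration of the kind of client-side access that OPeNDAP with CF conventions enables, the sketch below opens a remotely served dataset with the Python netCDF4 library. The URL and the variable name "temperature" are placeholders, not an actual GES DISC endpoint, and the library must be built with DAP support for remote URLs to work.

      # Minimal sketch of reading a dataset served via OPeNDAP with the netCDF4
      # library; the URL and variable name below are placeholders for illustration.
      from netCDF4 import Dataset

      OPENDAP_URL = "https://example.gov/opendap/hyrax/sample_dataset.nc"

      ds = Dataset(OPENDAP_URL)      # the server handles format conversion and subsetting
      print(ds.variables.keys())     # CF-style metadata makes variables self-describing

      # Subsetting happens server-side when slicing: only the requested block is transferred.
      temperature = ds.variables["temperature"][0, :10, :10]
      print(temperature.shape)
      ds.close()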

  2. ARIANE: integration of information databases within a hospital intranet.

    PubMed

    Joubert, M; Aymard, S; Fieschi, D; Volot, F; Staccini, P; Robert, J J; Fieschi, M

    1998-05-01

    Large information systems handle massive volumes of data stored in heterogeneous sources. Each server has its own model of representation of concepts with regard to its aims. One of the main problems end-users encounter when accessing different servers is to match their own viewpoint on biomedical concepts with the various representations that are made in the database servers. The aim of the project ARIANE is to provide end-users with easy-to-use and natural means to access and query heterogeneous information databases. The objectives of this research work are to build a conceptual interface by means of Internet technology inside an enterprise Intranet and to propose a method to realize it. This method is based on the knowledge sources provided by the Unified Medical Language System (UMLS) project of the US National Library of Medicine. Experiments concern queries to three different information servers: PubMed, a Medline server of the NLM; Thériaque, a French database on drugs implemented in the Hospital Intranet; and a Web site dedicated to Internet resources in gastroenterology and nutrition, located at the Faculty of Medicine of Nice (France). Access to each of these servers differs according to the kind of information delivered and the technology used to query it. With the health care professional workstation in mind, the authors introduced quality criteria into the ARIANE project in order to build, in a homogeneous and efficient way, a query system that can be integrated into existing information systems and can integrate existing and new information sources.

  3. LigoDV-web: Providing easy, secure and universal access to a large distributed scientific data store for the LIGO scientific collaboration

    NASA Astrophysics Data System (ADS)

    Areeda, J. S.; Smith, J. R.; Lundgren, A. P.; Maros, E.; Macleod, D. M.; Zweizig, J.

    2017-01-01

    Gravitational-wave observatories around the world, including the Laser Interferometer Gravitational-Wave Observatory (LIGO), record a large volume of gravitational-wave output data and auxiliary data about the instruments and their environments. These data are stored at the observatory sites and distributed to computing clusters for data analysis. LigoDV-web is a web-based data viewer that provides access to data recorded at the LIGO Hanford, LIGO Livingston and GEO600 observatories, and the 40 m prototype interferometer at Caltech. The challenge addressed by this project is to provide meaningful visualizations of small data sets to anyone in the collaboration in a fast, secure and reliable manner with minimal software, hardware and training required of the end users. LigoDV-web is implemented as a Java Enterprise Application, with Shibboleth Single Sign On for authentication and authorization, and a proprietary network protocol used for data access on the back end. Collaboration members with proper credentials can request data be displayed in any of several general formats from any Internet appliance that supports a modern browser with Javascript and minimal HTML5 support, including personal computers, smartphones, and tablets. Since its inception in 2012, 634 unique users have visited the LigoDV-web website in a total of 33,861 sessions and generated a total of 139,875 plots. This infrastructure has been helpful in many analyses within the collaboration including follow-up of the data surrounding the first gravitational-wave events observed by LIGO in 2015.

  4. iDrug: a web-accessible and interactive drug discovery and design platform

    PubMed Central

    2014-01-01

    Background The progress in computer-aided drug design (CADD) approaches over the past decades has accelerated early-stage pharmaceutical research. Many powerful standalone tools for CADD have been developed in academia. As programs are developed by various research groups, a consistent user-friendly online graphical working environment, combining computational techniques such as pharmacophore mapping, similarity calculation, scoring, and target identification, is needed. Results We present a versatile, user-friendly, and efficient online tool for computer-aided drug design based on pharmacophore and 3D molecular similarity searching. The web interface enables binding site detection, virtual screening hit identification, and drug target prediction in an interactive manner through a seamless interface to all adapted packages (e.g., Cavity, PocketV.2, PharmMapper, SHAFTS). Several commercially available compound databases for hit identification and a well-annotated pharmacophore database for drug target prediction were integrated in iDrug as well. The web interface provides tools for real-time molecular building/editing, converting, displaying, and analyzing. All the customized configurations of the functional modules can be accessed through the featured session files provided, which can be saved to the local disk and uploaded to resume or update previous work. Conclusions iDrug is easy to use, and provides a novel, fast and reliable tool for conducting drug design experiments. By using iDrug, various molecular design processing tasks can be submitted and visualized simply in one browser without locally installing any standalone modeling software. iDrug is accessible free of charge at http://lilab.ecust.edu.cn/idrug. PMID:24955134

  5. Wikipedia Chemical Structure Explorer: substructure and similarity searching of molecules from Wikipedia.

    PubMed

    Ertl, Peter; Patiny, Luc; Sander, Thomas; Rufener, Christian; Zasso, Michaël

    2015-01-01

    Wikipedia, the world's largest and most popular encyclopedia, is an indispensable source of chemistry information. It contains, among other content, entries for over 15,000 chemicals including metabolites, drugs, agrochemicals and industrial chemicals. To provide easy access to this wealth of information we decided to develop a substructure and similarity search tool for chemical structures referenced in Wikipedia. We extracted chemical structures from entries in Wikipedia and implemented a web system allowing structure and similarity searching on these data. The whole search as well as visualization system is written in JavaScript and therefore can run locally within a web page and does not require a central server. The Wikipedia Chemical Structure Explorer is accessible on-line at www.cheminfo.org/wikipedia and is available also as an open source project from GitHub for local installation. The web-based Wikipedia Chemical Structure Explorer provides a useful resource for research as well as for chemical education, enabling both researchers and students easy and user-friendly chemistry searching and identification of relevant information in Wikipedia. The tool can also help to improve the quality of chemical entries in Wikipedia by providing potential contributors with a regularly updated list of entries with problematic structures. And last but not least, this search system is a nice example of how modern web technology can be applied in the field of cheminformatics. Graphical abstract: Wikipedia Chemical Structure Explorer allows substructure and similarity searches on molecules referenced in Wikipedia.
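
    The Explorer itself runs as JavaScript in the browser; purely to illustrate what substructure and fingerprint-similarity matching involve, the sketch below uses the RDKit toolkit in Python on a tiny made-up set of structures. RDKit is an assumption for illustration, not the library the authors used.

      # Conceptual illustration of substructure and similarity searching with RDKit;
      # the molecule set and query are toy examples, not data from the Explorer.
      from rdkit import Chem, DataStructs
      from rdkit.Chem import AllChem

      molecules = {                      # tiny stand-in for structures extracted from Wikipedia
          "aspirin": "CC(=O)Oc1ccccc1C(=O)O",
          "caffeine": "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
          "ibuprofen": "CC(C)Cc1ccc(cc1)C(C)C(=O)O",
      }

      # Substructure query: an aromatic ring bearing a carboxylic acid group.
      query = Chem.MolFromSmarts("c1ccccc1C(=O)O")
      probe = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")
      probe_fp = AllChem.GetMorganFingerprintAsBitVect(probe, 2)

      for name, smiles in molecules.items():
          mol = Chem.MolFromSmiles(smiles)
          fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2)
          print(name,
                "substructure match:", mol.HasSubstructMatch(query),
                "similarity: %.2f" % DataStructs.TanimotoSimilarity(probe_fp, fp))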

  6. EST-PAC a web package for EST annotation and protein sequence prediction

    PubMed Central

    Strahm, Yvan; Powell, David; Lefèvre, Christophe

    2006-01-01

    With the decreasing cost of DNA sequencing technology and the vast diversity of biological resources, researchers increasingly face the basic challenge of annotating a larger number of expressed sequence tags (ESTs) from a variety of species. This typically consists of a series of repetitive tasks, which should be automated and easy to use. The results of these annotation tasks need to be stored and organized in a consistent way. All these operations should be self-installing, platform independent, easy to customize and amenable to using distributed bioinformatics resources available on the Internet. In order to address these issues, we present EST-PAC, a web-oriented multi-platform software package for expressed sequence tag (EST) annotation. EST-PAC provides a solution for the administration of EST and protein sequence annotations accessible through a web interface. Three aspects of EST annotation are automated: 1) searching local or remote biological databases for sequence similarities using Blast services, 2) predicting protein coding sequence from EST data and 3) annotating predicted protein sequences with functional domain predictions. In practice, EST-PAC integrates the BLASTALL suite, EST-Scan2 and HMMER in a relational database system accessible through a simple web interface. EST-PAC also takes advantage of the relational database to allow consistent storage, powerful queries of results and management of the annotation process. The system allows users to customize annotation strategies and provides an open-source data-management environment for research and education in bioinformatics. PMID:17147782
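
    To make the pipeline idea concrete, the hypothetical sketch below scripts one annotation step: running a similarity search with the legacy blastall program and recording the outcome in a relational table (SQLite here, standing in for the database EST-PAC uses). The file names and the 'nr' database are assumptions; this is not EST-PAC code.

      # Hypothetical sketch of one EST annotation step: run a BLAST similarity search
      # and store the outcome in a relational table for later querying.
      import sqlite3
      import subprocess

      def run_blast(est_fasta, database, out_file="blast_report.txt"):
          # Flags follow the legacy NCBI blastall conventions; a BLAST+ installation
          # would use a different command-line interface.
          cmd = ["blastall", "-p", "blastx", "-d", database, "-i", est_fasta, "-o", out_file]
          subprocess.run(cmd, check=True)
          return out_file

      def store_annotation(db_path, est_id, report_path):
          conn = sqlite3.connect(db_path)
          conn.execute("""CREATE TABLE IF NOT EXISTS est_annotation (
                              est_id TEXT, report_path TEXT)""")
          conn.execute("INSERT INTO est_annotation VALUES (?, ?)", (est_id, report_path))
          conn.commit()
          conn.close()

      if __name__ == "__main__":
          report = run_blast("est_batch_01.fasta", "nr")  # assumes blastall and an 'nr' db are installed
          store_annotation("annotations.db", "est_batch_01", report)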

  7. Implementing Recommendations From Web Accessibility Guidelines: Would They Also Provide Benefits to Nondisabled Users.

    PubMed

    Schmutz, Sven; Sonderegger, Andreas; Sauer, Juergen

    2016-06-01

    We examined the consequences of implementing Web accessibility guidelines for nondisabled users. Although there are Web accessibility guidelines for people with disabilities available, they are rarely used in practice, partly due to the fact that practitioners believe that such guidelines provide no benefits, or even have negative consequences, for nondisabled people, who represent the main user group of Web sites. Despite these concerns, there is a lack of empirical research on the effects of current Web accessibility guidelines on nondisabled users. Sixty-one nondisabled participants used one of three Web sites differing in levels of accessibility (high, low, and very low). Accessibility levels were determined by following established Web accessibility guidelines (WCAG 2.0). A broad methodological approach was used, including performance measures (e.g., task completion time) and user ratings (e.g., perceived usability). A high level of Web accessibility led to better performance (i.e., task completion time and task completion rate) than low or very low accessibility. Likewise, high Web accessibility improved user ratings (i.e., perceived usability, aesthetics, workload, and trustworthiness) compared to low or very low Web accessibility. There was no difference between the very low and low Web accessibility conditions for any of the outcome measures. Contrary to some concerns in the literature and among practitioners, high conformance with Web accessibility guidelines may provide benefits to users without disabilities. The findings may encourage more practitioners to implement WCAG 2.0 for the benefit of users with disabilities and nondisabled users. © 2016, Human Factors and Ergonomics Society.

  8. Making a web based ulcer record work by aligning architecture, legislation and users - a formative evaluation study.

    PubMed

    Ekeland, Anne G; Skipenes, Eva; Nyheim, Beate; Christiansen, Ellen K

    2011-01-01

    The University Hospital of North Norway selected a web-based ulcer record used in Denmark, available from mobile phones. Data was stored in a common database and easily accessible. According to Norwegian legislation, only employees of the organization that owns an IT system can access the system, and use of mobile units requires strong security solutions. The system had to be changed. The paper addresses interactions in order to make the system legal, and assesses regulations that followed. By addressing conflicting scripts and the contingent nature of knowledge, we conducted a formative evaluation aiming at improving the object being studied. Participatory observation in a one year process, minutes from meetings and information from participants, constitute the data material. In the technological domain, one database was replaced by four. In the health care delivery domain, easy access was replaced by a more complicated log on procedure, and in the domain of law and security, a clarification of risk levels was obtained, thereby allowing for access by mobile phones with today's authentication mechanisms. Flexibility concerning predefined scripts was important in all domains. Changes were made that improved the platform for further development of legitimate communication of patient data via mobile units. The study also shows the value of formative evaluations in innovations.

  9. Integrality in cervical cancer care: evaluation of access

    PubMed Central

    Brito-Silva, Keila; Bezerra, Adriana Falangola Benjamin; Chaves, Lucieli Dias Pedreschi; Tanaka, Oswaldo Yoshimi

    2014-01-01

    OBJECTIVE To evaluate integrality of access to uterine cervical cancer prevention, diagnosis and treatment services. METHODS The tracer condition was analyzed using a mixed quantitative and qualitative approach. The quantitative approach was based on secondary data from the analysis of cytology and biopsy exams performed between 2008 and 2010 on 25 to 59 year-old women in a municipality with a large population and with the necessary technological resources. Data were obtained from the Health Information System and the Regional Cervical Cancer Information System. Statistical analysis was performed using PASW Statistics 17.0 software. The qualitative approach involved semi-structured interviews with service managers, health care professionals and users. NVivo 9.0 software was used for the content analysis of the primary data. RESULTS Pap smear coverage was low, possibly due to insufficient screening and the difficulty of making appointments in primary care. The numbers of biopsies conducted are similar to those of abnormal cytologies, reflecting easy access to the specialized services. There was higher coverage among younger women. More serious diagnoses, for both cytologies and biopsies, were more prevalent in older women. CONCLUSIONS Insufficient coverage of cytologies, reported by the interviewees, allows us to understand access difficulties in primary care, as well as the fragility of screening strategies. PMID:24897045

  10. Easy access to firearms: juveniles' risks for violent offending and violent victimization.

    PubMed

    Ruback, R Barry; Shaffer, Jennifer N; Clark, Valerie A

    2011-07-01

    Keeping firearms at home may increase personal safety but it may also increase the risk of injury. This study uses data from three waves of the National Longitudinal Study of Adolescent Health to assess the extent to which adolescents' easy access to firearms at home increases the risk of violent offending and violent victimization. Access to firearms was higher for males, Whites, and adolescents having two parents, especially fathers. Current access to firearms at home significantly increased the odds of both violent offending and violent victimization, even after controlling for prior access, prior offending, and prior victimization. This relationship persisted into early adulthood; access to firearms still significantly increased the odds of violent offending and violent victimization.

  11. GeoMapApp: Using Authentic Geoscience Data to Promote Student Engagement and Understanding

    NASA Astrophysics Data System (ADS)

    Goodwillie, A. M.

    2016-12-01

    We increasingly expect geoscience data to be readily and freely accessible via the web in formats that are easy to handle. Yet, we are often required to compile data sets with different formats from multiple sources and, sometimes, we give up in frustration. Fortunately, recent advances in web-enabled technologies are helping to lower barriers by bridging the gap of data accessibility and integration. GeoMapApp (http://www.geomapapp.org), a free data discovery and visualisation tool developed with NSF funding at Lamont-Doherty Earth Observatory provides users with an intuitive map-based interface. GeoMapApp offers free access to hundreds of integrated research-grade geoscience data sets. Examples include earthquake and volcano data, geological maps, lithospheric plate boundary information, geochemical, oceanographic, and environmental data. Users can also import their own data files. The GeoMapApp interface presents data in its proper geographical context that enhances geospatial awareness and helps students more easily gain insight and understanding from the data. Simple tools for data manipulation help students analyse the data in different ways. An improved Save Session function allows users to store a pre-loaded state of GeoMapApp. When shared with a class, the saved file frees up valuable classroom time for students to explore and interrogate the data by allowing every student to open GeoMapApp at exactly the same starting point. GeoMapApp is adaptable to a range of learning environments from lab sessions, group projects, and homework assignments to in-class pop-ups. A wide range of undergraduate enquiry-driven education modules for GeoMapApp is already available at SERC. In this presentation, we will show GeoMapApp-based activities that promote student engagement with authentic geoscience data and that provide a better sense of data "ownership" and of academic equality - GeoMapApp presents the same data in the same tool used by researchers. Topics covered will include plate tectonics and climatology.

  12. Interactive Online Real-time Groundwater Model for Irrigation Water Allocation in the Heihe Mid-reaches, China

    NASA Astrophysics Data System (ADS)

    Pedrazzini, G.; Kinzelbach, W.

    2016-12-01

    In the Heihe Basin and many other semi-arid regions in the world, the ongoing introduction of smart meter IC-card systems on farmers' pumping wells will soon allow monitoring and control of abstractions with the goal of preventing further depletion of the resource. In this regard, a major interest of policy makers concerns the development of new and the improvement of existing legislation on pricing schemes and groundwater/surface water quotas. Predictive knowledge of the development of groundwater levels for different allocation schemes or climate change scenarios is required to support decision-makers in this task. In the past, groundwater models have been a static component of investigations, with their results delivered in the form of reports. We set up a groundwater model and integrated it into a user-friendly web-based environment, allowing direct and easy access for novice users. By operating sliders, the user can select an irrigation district and change irrigation patterns such as the partitioning of surface water and groundwater, the size of the irrigated area and irrigation efficiency, as well as a number of climate-related parameters. Reactive handles display the results in real time. All of the software used is license-free. The tool is currently being introduced to irrigation district managers in the project area. Findings will be reported once sufficient practical experience has been gained. The accessibility via a web interface is a novelty in the context of groundwater models. It allows delivery of a product that is accessible from anywhere and from any device. Maintenance and, if necessary, updating of the model or software can occur remotely. Feedback mechanisms between reality and prediction will be introduced and the model periodically updated through data assimilation as new data become available. This will render the model a dynamic tool that is steadily available and evolving over time.

  13. Learning to Detect Vandalism in Social Content Systems: A Study on Wikipedia

    NASA Astrophysics Data System (ADS)

    Javanmardi, Sara; McDonald, David W.; Caruana, Rich; Forouzan, Sholeh; Lopes, Cristina V.

    A challenge facing user generated content systems is vandalism, i.e. edits that damage content quality. The high visibility of and easy access to social networks make them popular targets for vandals. Detecting and removing vandalism is critical for these user generated content systems. Because vandalism can take many forms, there are many different kinds of features that are potentially useful for detecting it. The complex nature of vandalism, and the large number of potential features, make vandalism detection difficult and time consuming for human editors. Machine learning techniques hold promise for developing accurate, tunable, and maintainable models that can be incorporated into vandalism detection tools. We describe a method for training classifiers for vandalism detection that yields classifiers that are more accurate on the PAN 2010 corpus than others previously developed. Because of the high turnaround in social network systems, it is important for vandalism detection tools to run in real time. To this end, we use feature selection to find the minimal set of features consistent with high accuracy. In addition, because some features are more costly to compute than others, we use cost-sensitive feature selection to reduce the total computational cost of executing our models. In addition to the features previously used for spam detection, we introduce new features based on user action histories. The user history features contribute significantly to classifier performance. The approach we use is general and can easily be applied to other user generated content systems.
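
    The abstract does not spell out the selection procedure, so the sketch below only illustrates the general idea of cost-sensitive feature selection with scikit-learn on synthetic data: features are added greedily by accuracy gain per unit of (assumed) computation cost. The feature names, costs, and model are invented, not those of the PAN 2010 study.

      # Conceptual sketch of cost-aware greedy feature selection on synthetic data.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n = 2000
      X = rng.normal(size=(n, 4))                 # e.g. caps ratio, size delta,
      cost = np.array([1.0, 1.0, 5.0, 20.0])      # profanity score, user-history score
      y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n) > 0).astype(int)

      selected, remaining, best_score = [], list(range(X.shape[1])), 0.0
      while remaining:
          # Keep the feature with the best accuracy gain per unit of computation cost.
          gains = []
          for j in remaining:
              cols = selected + [j]
              score = cross_val_score(
                  RandomForestClassifier(n_estimators=50, random_state=0),
                  X[:, cols], y, cv=3).mean()
              gains.append(((score - best_score) / cost[j], score, j))
          gain, score, j = max(gains)
          if gain <= 0:
              break
          selected.append(j)
          remaining.remove(j)
          best_score = score
          print("kept feature", j, "cv accuracy %.3f" % best_score)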

  14. Voice Biometrics as a Way to Self-service Password Reset

    NASA Astrophysics Data System (ADS)

    Hohgräfe, Bernd; Jacobi, Sebastian

    Password resets are time consuming. Especially when urgent jobs need to be done, it is cumbersome to inform the user helpdesk, to identify oneself and then to wait for response. It is easy to enter a wrong password multiple times, which leads to the blocking of the application. Voice biometrics is an easy and secure way for individuals to reset their own password. Read more about how you can ease the burden of your user helpdesk and how voice biometric password resets benefit your expense situation without harming your security.

  15. A Tale of Two Archives: PDS3/PDS4 Archiving and Distribution of Juno Mission Data

    NASA Astrophysics Data System (ADS)

    Stevenson, Zena; Neakrase, Lynn; Huber, Lyle; Chanover, Nancy J.; Beebe, Reta F.; Sweebe, Kathrine; Johnson, Joni J.

    2017-10-01

    The Juno mission to Jupiter, which was launched on 5 August 2011 and arrived at the Jovian system in July 2016, represents the last mission to be officially archived under the PDS3 archive standards. Modernization and availability of the newer PDS4 archive standard have prompted the PDS Atmospheres Node (ATM) to provide on-the-fly migration of Juno data from PDS3 to PDS4. Data distribution under both standards presents challenges in terms of how to present data to the end user without sacrificing accessibility to the data or impacting the active PDS3 mission pipelines tasked with delivering the data on predetermined schedules. The PDS Atmospheres Node has leveraged its experience with prior active PDS4 missions (e.g., LADEE and MAVEN) and ongoing PDS3-to-PDS4 data migration efforts to provide a seamless distribution of Juno data in both PDS3 and PDS4. When ATM receives a data delivery from the Juno Science Operations Center, the PDS3 labels are validated and then fed through PDS4 migration software built at ATM. Specifically, a collection of Python methods and scripts has been developed to make the migration process as automatic as possible, even when working with the more complex labels used by several of the Juno instruments. This is used to create all of the PDS4 data labels at once and build PDS4 archive bundles with minimal human effort. Resultant bundles are then validated against the PDS4 standard and released alongside the certified PDS3 versions of the same data. The newer design of the distribution pages provides access to both versions of the data, utilizing some of the enhanced capabilities of PDS4 to improve search and retrieval of Juno data. Webpages are designed with the intent of offering easy access to all documentation for Juno data as well as the data themselves in both standards for users of all experience levels. We discuss the structure and organization of the Juno archive and associated webpages as examples of joint PDS3/PDS4 data access for end users.
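
    The ATM migration software itself is not shown in the abstract; the toy sketch below only illustrates the general shape of a PDS3-to-PDS4 translation step, parsing simple KEYWORD = VALUE pairs and emitting a minimal XML label with the Python standard library. The keywords and element names are simplified placeholders, far from the real PDS4 schema.

      # Toy illustration of PDS3-to-PDS4 label migration with made-up keywords.
      import xml.etree.ElementTree as ET

      PDS3_LABEL = """
      PRODUCT_ID        = JNO_EXAMPLE_0001
      INSTRUMENT_NAME   = EXAMPLE_INSTRUMENT
      START_TIME        = 2016-07-05T02:30:00
      TARGET_NAME       = JUPITER
      """

      def parse_pds3(text):
          pairs = {}
          for line in text.strip().splitlines():
              key, _, value = line.partition("=")
              pairs[key.strip()] = value.strip()
          return pairs

      def to_pds4(pairs):
          root = ET.Element("Product_Observational")
          ident = ET.SubElement(root, "Identification_Area")
          ET.SubElement(ident, "logical_identifier").text = pairs.get("PRODUCT_ID", "")
          obs = ET.SubElement(root, "Observation_Area")
          ET.SubElement(obs, "start_date_time").text = pairs.get("START_TIME", "")
          ET.SubElement(obs, "instrument_name").text = pairs.get("INSTRUMENT_NAME", "")
          ET.SubElement(obs, "target_name").text = pairs.get("TARGET_NAME", "")
          return ET.tostring(root, encoding="unicode")

      if __name__ == "__main__":
          print(to_pds4(parse_pds3(PDS3_LABEL)))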

  16. Infrastructure for the life sciences: design and implementation of the UniProt website.

    PubMed

    Jain, Eric; Bairoch, Amos; Duvaud, Severine; Phan, Isabelle; Redaschi, Nicole; Suzek, Baris E; Martin, Maria J; McGarvey, Peter; Gasteiger, Elisabeth

    2009-05-08

    The UniProt consortium was formed in 2002 by groups from the Swiss Institute of Bioinformatics (SIB), the European Bioinformatics Institute (EBI) and the Protein Information Resource (PIR) at Georgetown University, and soon afterwards the website http://www.uniprot.org was set up as a central entry point to UniProt resources. Requests to this address were redirected to one of the three organisations' websites. While these sites shared a set of static pages with general information about UniProt, their pages for searching and viewing data were different. To provide users with a consistent view and to cut the cost of maintaining three separate sites, the consortium decided to develop a common website for UniProt. Following several years of intense development and a year of public beta testing, the http://www.uniprot.org domain was switched to the newly developed site described in this paper in July 2008. The UniProt consortium is the main provider of protein sequence and annotation data for much of the life sciences community. The http://www.uniprot.org website is the primary access point to this data and to documentation and basic tools for the data. These tools include full text and field-based text search, similarity search, multiple sequence alignment, batch retrieval and database identifier mapping. This paper discusses the design and implementation of the new website, which was released in July 2008, and shows how it improves data access for users with different levels of experience, as well as for machines requiring programmatic access. http://www.uniprot.org/ is open for both academic and commercial use. The site was built with open source tools and libraries. Feedback is very welcome and should be sent to help@uniprot.org. The new UniProt website makes accessing and understanding UniProt easier than ever. The two main lessons learned are that getting the basics right for such a data provider website has huge benefits, but is not trivial and is easy to underestimate, and that there is no substitute for using empirical data throughout the development process to decide on what is and what is not working for your users.
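
    For the programmatic access mentioned above, the minimal sketch below fetches a FASTA record over HTTP with the Python requests library. The URL pattern follows the general style of the 2008 site and is an assumption; the current endpoints may differ, so consult the site's own documentation.

      # Minimal sketch of machine access to UniProt over HTTP; the URL pattern is an
      # assumption for illustration and may not match the current site layout.
      import requests

      def fetch_fasta(accession):
          url = f"https://www.uniprot.org/uniprot/{accession}.fasta"
          response = requests.get(url, timeout=30)
          response.raise_for_status()
          return response.text

      if __name__ == "__main__":
          print(fetch_fasta("P12345")[:200])  # print the start of the FASTA record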

  17. CloVR-Comparative: automated, cloud-enabled comparative microbial genome sequence analysis pipeline.

    PubMed

    Agrawal, Sonia; Arze, Cesar; Adkins, Ricky S; Crabtree, Jonathan; Riley, David; Vangala, Mahesh; Galens, Kevin; Fraser, Claire M; Tettelin, Hervé; White, Owen; Angiuoli, Samuel V; Mahurkar, Anup; Fricke, W Florian

    2017-04-27

    The benefit of increasing genomic sequence data to the scientific community depends on easy-to-use, scalable bioinformatics support. CloVR-Comparative combines commonly used bioinformatics tools into an intuitive, automated, and cloud-enabled analysis pipeline for comparative microbial genomics. CloVR-Comparative runs on annotated complete or draft genome sequences that are uploaded by the user or selected via a taxonomic tree-based user interface and downloaded from NCBI. CloVR-Comparative runs reference-free multiple whole-genome alignments to determine unique, shared and core coding sequences (CDSs) and single nucleotide polymorphisms (SNPs). Output includes short summary reports and detailed text-based results files, graphical visualizations (phylogenetic trees, circular figures), and a database file linked to the Sybil comparative genome browser. Data up- and download, pipeline configuration and monitoring, and access to Sybil are managed through the CloVR-Comparative web interface. CloVR-Comparative and Sybil are distributed as part of the CloVR virtual appliance, which runs on local computers or the Amazon EC2 cloud. Representative datasets (e.g. 40 draft and complete Escherichia coli genomes) are processed in <36 h on a local desktop or at a cost of <$20 on EC2. CloVR-Comparative allows anybody with Internet access to run comparative genomics projects, while eliminating the need for on-site computational resources and expertise.

  18. lncRNATargets: A platform for lncRNA target prediction based on nucleic acid thermodynamics.

    PubMed

    Hu, Ruifeng; Sun, Xiaobo

    2016-08-01

    Many studies have supported that long noncoding RNAs (lncRNAs) perform various functions in various critical biological processes. Advanced experimental and computational technologies allow access to more information on lncRNAs. Determining the functions and action mechanisms of these RNAs on a large scale is urgently needed. We provide lncRNATargets, a web-based platform for lncRNA target prediction based on nucleic acid thermodynamics. The nearest-neighbor (NN) model was used to calculate binding free energy. The NN model for nucleic acids assumes that the identity and orientation of neighboring base pairs determine the stability of a given base pair. lncRNATargets features the following options: setting of a specific temperature, which allows use not only for human data but also for other animals or plants; processing of all lncRNAs in high throughput without RNA size limitation, which is superior to any other existing tool; and a web-based, user-friendly interface with colored result displays that allow easy access for nonskilled computer operators and provide better understanding of results. This technique could provide accurate calculations of the binding free energy of lncRNA-target dimers to predict whether these structures are well targeted together. lncRNATargets provides high-accuracy calculations, and this user-friendly program is available for free at http://www.herbbol.org:8001/lrt/.
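
    The abstract does not give formulas, but the nearest-neighbor decomposition it refers to is conventionally written as below (SantaLucia-style): the duplex free energy is an initiation term plus the sum over adjacent base-pair stacks, with each term obtained from tabulated enthalpies and entropies at the chosen temperature. This is shown for orientation only, not quoted from the lncRNATargets paper.

      % Standard nearest-neighbor decomposition of duplex stability (illustrative).
      \Delta G^{\circ}(\text{duplex}) \;=\;
          \Delta G^{\circ}(\text{init})
          \;+\; \sum_{i=1}^{N-1} \Delta G^{\circ}(\text{NN}_i),
      \qquad
      \Delta G^{\circ}(T) \;=\; \Delta H^{\circ} - T\,\Delta S^{\circ}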

  19. Processing multilevel secure test and evaluation information

    NASA Astrophysics Data System (ADS)

    Hurlburt, George; Hildreth, Bradley; Acevedo, Teresa

    1994-07-01

    The Test and Evaluation Community Network (TECNET) is building a Multilevel Secure (MLS) system. This system features simultaneous access to classified and unclassified information and easy access through widely available communications channels. It provides the necessary separation of classification levels, assured through the use of trusted system design techniques, security assessments and evaluations. This system enables cleared T&E users to view and manipulate classified and unclassified information resources either using a single terminal interface or multiple windows in a graphical user interface. TECNET is in direct partnership with the National Security Agency (NSA) to develop and field the MLS TECNET capability in the near term. The centerpiece of this partnership is a state-of-the-art Concurrent Systems Security Engineering (CSSE) process. In developing the MLS TECNET capability, TECNET and NSA are providing members, with various expertise and diverse backgrounds, to participate in the CSSE process. The CSSE process is founded on the concepts of both Systems Engineering and Concurrent Engineering. Systems Engineering is an interdisciplinary approach to evolve and verify an integrated and life cycle balanced set of system product and process solutions that satisfy customer needs (ASD/ENS-MIL STD 499B 1992). Concurrent Engineering is design and development using the simultaneous, applied talents of a diverse group of people with the appropriate skills. Harnessing diverse talents to support CSSE requires active participation by team members in an environment that both respects and encourages diversity.

  20. Web-Based Satellite Products Database for Meteorological and Climate Applications

    NASA Technical Reports Server (NTRS)

    Phan, Dung; Spangenberg, Douglas A.; Palikonda, Rabindra; Khaiyer, Mandana M.; Nordeen, Michele L.; Nguyen, Louis; Minnis, Patrick

    2004-01-01

    The need for ready access to satellite data and associated physical parameters such as cloud properties has been steadily growing. Air traffic managers, weather forecasters, energy producers, and weather and climate researchers, among others, can utilize more satellite information than in the past. Thus, it is essential that such data are made available in near real-time and as archival products in an easy-access and user-friendly environment. A host of Internet web sites currently provide a variety of satellite products for various applications. Each site has a unique contribution with appeal to a particular segment of the public and scientific community. This is no less true for NASA Langley's Clouds and Radiation (NLCR) website (http://www-pm.larc.nasa.gov), which has been evolving over the past 10 years to support a variety of research projects. This website was originally developed to display cloud products derived from the Geostationary Operational Environmental Satellite (GOES) over the Southern Great Plains for the Atmospheric Radiation Measurement (ARM) Program. It has evolved into a site providing a comprehensive database of near real-time and historical satellite products used for meteorological, aviation, and climate studies. To encourage the user community to take advantage of the site, this paper summarizes the various products and projects supported by the website and discusses future options for new datasets.

  1. Automatic User Interface Generation for Visualizing Big Geoscience Data

    NASA Astrophysics Data System (ADS)

    Yu, H.; Wu, J.; Zhou, Y.; Tang, Z.; Kuo, K. S.

    2016-12-01

    Along with advanced computing and observation technologies, geoscience and its related fields have been generating a large amount of data at an unprecedented growth rate. Visualization is becoming an increasingly attractive and feasible means for researchers to effectively and efficiently access and explore data to gain new understandings and discoveries. However, visualization has been challenging due to a lack of effective data models and visual representations to tackle the heterogeneity of geoscience data. We propose a new geoscience data visualization framework that leverages interface automata theory to automatically generate the user interface (UI). Our study has the following three main contributions. First, geoscience data have a unique hierarchical data structure and complex formats, and therefore it is relatively easy for users to get lost or confused during their exploration of the data. By applying the interface automata model to the UI design, users can be clearly guided to find the exact visualization and analysis that they want. In addition, from a development perspective, an interface automaton is also easier to understand than conditional statements, which can simplify the development process. Second, it is common for geoscience data to have discontinuities in their hierarchical structure. The application of interface automata can prevent users from suffering automation surprises and enhance the user experience. Third, to support a variety of data visualizations and analyses, our design with interface automata also makes applications extensible, in that a new visualization function or a new data group can easily be added to an existing application, which significantly reduces the maintenance overhead. We demonstrate the effectiveness of our framework using real-world applications.
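
    To make the UI-generation idea concrete, the sketch below models a tiny interface automaton as states and labeled transitions: at each state, the UI exposes exactly the outgoing actions, and invalid actions are rejected rather than surprising the user. The states and actions are hypothetical examples, not the authors' framework.

      # Minimal sketch of an interface automaton driving UI generation.
      TRANSITIONS = {
          ("start", "select_dataset"): "dataset_chosen",
          ("dataset_chosen", "select_variable"): "variable_chosen",
          ("dataset_chosen", "show_metadata"): "dataset_chosen",
          ("variable_chosen", "render_map"): "visualized",
          ("variable_chosen", "render_histogram"): "visualized",
          ("visualized", "export_image"): "visualized",
      }

      def available_actions(state):
          """Actions the generated UI should display in the given state."""
          return sorted(action for (s, action) in TRANSITIONS if s == state)

      def step(state, action):
          """Advance the automaton; invalid actions are rejected, avoiding automation surprises."""
          try:
              return TRANSITIONS[(state, action)]
          except KeyError:
              raise ValueError(f"action {action!r} is not offered in state {state!r}")

      if __name__ == "__main__":
          state = "start"
          for action in ["select_dataset", "select_variable", "render_map", "export_image"]:
              print(f"state={state:16s} offers {available_actions(state)}")
              state = step(state, action)

    Extending the application then amounts to adding transitions to the table, which is the extensibility argument made in the abstract.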

  2. An electronic laboratory notebook based on HTML forms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marstaller, J.E.; Zorn, M.D.

    The electronic notebook records information that has traditionally been kept in handwritten laboratory notebooks. It keeps detailed information about the progress of the research, such as the optimization of primers, the screening of the primers and, finally, the mapping of the probes. The notebook provides two areas of service: data entry, and reviewing of data at all stages. World Wide Web browsers with HTML-based forms provide a fast and easy mechanism to create forms-based user interfaces. The computer scientist can sit down with the biologist and rapidly make changes in response to the user's comments. Furthermore, the HTML forms work equally well on a number of different hardware platforms; thus the biologists may continue using their Macintosh computers and find a familiar interface if they have to work on a Unix workstation. The web browser can be run from any machine connected to the Internet: thus the users are free to enter or view information even away from their labs, at home or while on travel. Access can be restricted by password and other means to secure the confidentiality of the data. A bonus that is hard to implement otherwise is the facile connection to outside resources. Linking local information to data in public databases is only a hypertext link away with little or no additional programming effort.

  3. mySyntenyPortal: an application package to construct websites for synteny block analysis.

    PubMed

    Lee, Jongin; Lee, Daehwan; Sim, Mikang; Kwon, Daehong; Kim, Juyeon; Ko, Younhee; Kim, Jaebum

    2018-06-05

    Advances in sequencing technologies have facilitated large-scale comparative genomics based on whole genome sequencing. Constructing and investigating conserved genomic regions among multiple species (called synteny blocks) are essential in comparative genomics. However, they require significant amounts of computational resources and time in addition to bioinformatics skills. Many web interfaces have been developed to make such tasks easier. However, these web interfaces cannot be customized for users who want to use their own set of genome sequences or definition of synteny blocks. To resolve this limitation, we present mySyntenyPortal, a stand-alone application package to construct websites for synteny block analyses by using users' own genome data. mySyntenyPortal provides both command line and web-based interfaces to build and manage websites for large-scale comparative genomic analyses. The websites can also be easily published and accessed by other users. To demonstrate the usability of mySyntenyPortal, we present an example study for building websites to compare genomes of three mammalian species (human, mouse, and cow) and show how they can be easily utilized to identify potential genes affected by genome rearrangements. mySyntenyPortal will contribute to extended comparative genomic analyses based on large-scale whole genome sequences by providing unique functionality to support the easy creation of interactive websites for synteny block analyses from users' own genome data.

  4. Full-Text Linking: Affiliated versus Nonaffiliated Access in a Free Database.

    ERIC Educational Resources Information Center

    Grogg, Jill E.; Andreadis, Debra K.; Kirk, Rachel A.

    2002-01-01

    Presents a comparison of access to full-text articles from a free bibliographic database (PubSCIENCE) for affiliated and unaffiliated users. Found that affiliated users had access to more full-text articles than unaffiliated users had, and that both types of users could increase their level of access through additional searching and greater…

  5. OpenTopography: Enabling Online Access to High-Resolution Lidar Topography Data and Processing Tools

    NASA Astrophysics Data System (ADS)

    Crosby, Christopher; Nandigam, Viswanath; Baru, Chaitan; Arrowsmith, J. Ramon

    2013-04-01

    High-resolution topography data acquired with lidar (light detection and ranging) technology are revolutionizing the way we study the Earth's surface and overlying vegetation. These data, collected from airborne, tripod, or mobile-mounted scanners have emerged as a fundamental tool for research on topics ranging from earthquake hazards to hillslope processes. Lidar data provide a digital representation of the earth's surface at a resolution sufficient to appropriately capture the processes that contribute to landscape evolution. The U.S. National Science Foundation-funded OpenTopography Facility (http://www.opentopography.org) is a web-based system designed to democratize access to earth science-oriented lidar topography data. OpenTopography provides free, online access to lidar data in a number of forms, including the raw point cloud and associated geospatial-processing tools for customized analysis. The point cloud data are co-located with on-demand processing tools to generate digital elevation models, and derived products and visualizations which allow users to quickly access data in a format appropriate for their scientific application. The OpenTopography system is built using a service-oriented architecture (SOA) that leverages cyberinfrastructure resources at the San Diego Supercomputer Center at the University of California San Diego to allow users, regardless of expertise level, to access these massive lidar datasets and derived products for use in research and teaching. OpenTopography hosts over 500 billion lidar returns covering 85,000 km2. These data are all in the public domain and are provided by a variety of partners under joint agreements and memoranda of understanding with OpenTopography. Partners include national facilities such as the NSF-funded National Center for Airborne Lidar Mapping (NCALM), as well as non-governmental organizations and local, state, and federal agencies. OpenTopography has become a hub for high-resolution topography resources. Datasets hosted by other organizations, as well as lidar-specific software, can be registered into the OpenTopography catalog, providing users a "one-stop shop" for such information. With several thousand active users, OpenTopography is an excellent example of a mature Spatial Data Infrastructure system that is enabling access to challenging data for research, education and outreach. Ongoing OpenTopography design and development work includes the archive and publication of datasets using digital object identifiers (DOIs); creation of a more flexible and scalable high-performance environment for processing of large datasets; expanded support for satellite and terrestrial lidar; and creation of a "pluggable" infrastructure for third-party programs and algorithms. OpenTopography has successfully created a facility for sharing lidar data. In the project's next phase, we are working to enable equally easy and successful sharing of services for processing and analysis of these data.

  6. Accessing the public MIMIC-II intensive care relational database for clinical research.

    PubMed

    Scott, Daniel J; Lee, Joon; Silva, Ikaro; Park, Shinhyuk; Moody, George B; Celi, Leo A; Mark, Roger G

    2013-01-10

    The Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC-II) database is a free, public resource for intensive care research. The database was officially released in 2006, and has attracted a growing number of researchers in academia and industry. We present the two major software tools that facilitate accessing the relational database: the web-based QueryBuilder and a downloadable virtual machine (VM) image. QueryBuilder and the MIMIC-II VM have been developed successfully and are freely available to MIMIC-II users. Simple example SQL queries and the resulting data are presented. Clinical studies pertaining to acute kidney injury and prediction of fluid requirements in the intensive care unit are shown as typical examples of research performed with MIMIC-II. In addition, MIMIC-II has also provided data for annual PhysioNet/Computing in Cardiology Challenges, including the 2012 Challenge "Predicting mortality of ICU Patients". QueryBuilder is a web-based tool that provides easy access to MIMIC-II. For more computationally intensive queries, one can locally install a complete copy of MIMIC-II in a VM. Both publicly available tools provide the MIMIC-II research community with convenient querying interfaces and complement the value of the MIMIC-II relational database.
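
    The example SQL queries themselves are not reproduced in the abstract; the sketch below shows the general shape of querying a locally installed copy of MIMIC-II (such as the one shipped in the VM) from Python with psycopg2. The connection settings and the table and column names are illustrative assumptions and must be matched to the actual installation.

      # Sketch of querying a local MIMIC-II PostgreSQL installation with psycopg2;
      # credentials, table and column names are assumptions for illustration.
      import psycopg2

      conn = psycopg2.connect(dbname="mimic2", user="mimic", password="mimic",
                              host="localhost")
      cur = conn.cursor()

      # Example: count ICU stays per admission year (hypothetical table/columns).
      cur.execute("""
          SELECT EXTRACT(YEAR FROM icustay_intime) AS yr, COUNT(*)
          FROM icustay_detail
          GROUP BY yr
          ORDER BY yr;
      """)
      for year, n_stays in cur.fetchall():
          print(int(year), n_stays)

      cur.close()
      conn.close()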

  7. Autoplot and the HAPI Server

    NASA Astrophysics Data System (ADS)

    Faden, J.; Vandegriff, J. D.; Weigel, R. S.

    2016-12-01

    Autoplot was introduced in 2008 as an easy-to-use plotting tool for the space physics community. It reads data from a variety of file resources, such as CDF and HDF files, and a number of specialized data servers, such as the PDS/PPI's DIT-DOS, CDAWeb, and the University of Iowa's RPWG Das2Server. Each of these servers has optimized methods for transmitting data for display in Autoplot, but requires coordination and specialized software to work, limiting Autoplot's ability to access new servers and datasets. Likewise, groups who would like software to access their APIs must either write their own clients or publish a specification document in hopes that people will write clients. The HAPI specification was written so that a simple, standard API could be used by both Autoplot and server implementations, removing these barriers to the free flow of time series data. Autoplot's software for communicating with HAPI servers is presented, showing the user interface scientists will use and how data servers might implement the HAPI specification to provide access to their data. This will also include instructions on how Autoplot is installed and used on desktop computers and used to view data from the RBSP, Juno, and other missions.
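
    As a rough sense of what a minimal HAPI client involves, the sketch below calls the catalog, info, and data endpoints with the Python requests library. The server URL is a placeholder, and the parameter names follow the early published HAPI conventions (id, time.min, time.max); later versions of the specification may use different names, so treat this as an illustration rather than a reference client.

      # Sketch of a minimal HAPI client; server URL and parameter names are
      # assumptions based on the early HAPI specification.
      import requests

      SERVER = "https://example.org/hapi"  # placeholder HAPI server

      def catalog():
          return requests.get(f"{SERVER}/catalog", timeout=30).json()["catalog"]

      def info(dataset_id):
          return requests.get(f"{SERVER}/info", params={"id": dataset_id}, timeout=30).json()

      def data(dataset_id, start, stop):
          params = {"id": dataset_id, "time.min": start, "time.max": stop, "format": "csv"}
          return requests.get(f"{SERVER}/data", params=params, timeout=30).text

      if __name__ == "__main__":
          datasets = catalog()
          first = datasets[0]["id"]
          print(info(first)["parameters"])
          print(data(first, "2016-01-01T00:00:00Z", "2016-01-02T00:00:00Z")[:500])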

  8. The diagnosis related groups enhanced electronic medical record.

    PubMed

    Müller, Marcel Lucas; Bürkle, Thomas; Irps, Sebastian; Roeder, Norbert; Prokosch, Hans-Ulrich

    2003-07-01

    The introduction of Diagnosis Related Groups as a basis for hospital payment in Germany announced essential changes in the hospital reimbursement practice. A hospital's economical survival will depend vitally on the accuracy and completeness of the documentation of DRG relevant data like diagnosis and procedure codes. In order to enhance physicians' coding compliance, an easy-to-use interface integrating coding tasks seamlessly into clinical routine had to be developed. A generic approach should access coding and clinical guidelines from different information sources. Within the Electronic Medical Record (EMR) a user interface ('DRG Control Center') for all DRG relevant clinical and administrative data has been built. A comprehensive DRG-related web site gives online access to DRG grouping software and an electronic coding expert. Both components are linked together using an application supporting bi-directional communication. Other web based services like a guideline search engine can be integrated as well. With the proposed method, the clinician gains quick access to context sensitive clinical guidelines for appropriate treatment of his/her patient and administrative guidelines for the adequate coding of the diagnoses and procedures. This paper describes the design and current implementation and discusses our experiences.

  9. OPACs: The User and Subject Access.

    ERIC Educational Resources Information Center

    Carson, Elizabeth

    1985-01-01

    This survey of the literature reveals user and professional opinions of changes in subject access features available for online public access catalogs. Highlights include expanded access to fields already incorporated into traditional MARC record, access to context of the record, and design of the user interface. Twenty-four references are cited.…

  10. 75 FR 4101 - Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-26

    ... Verification (EIV) System User Access Authorization Form and Rules of Behavior and User Agreement AGENCY... lists the following information: Title of Proposal: Enterprise Income Verification (EIV) System User Access, Authorization Form and Rules Of Behavior and User Agreement. OMB Approval Number: 2577-New. Form...

  11. An operational data access infrastructure for accessing integrated environmental and socio-economic data from the Dutch Wadden Sea

    NASA Astrophysics Data System (ADS)

    De Bruin, T.

    2012-12-01

    The Wadden Sea, a UNESCO World Heritage Site along the northern coasts of The Netherlands, Germany and Denmark, is a very valuable, yet also highly vulnerable tidal flats area. Knowledge is key to the sustainable management of the Wadden Sea. This knowledge should be reliable, founded on promptly accessible information and sufficiently broad to integrate both ecological and economic analyses. The knowledge is gained from extensive monitoring of both ecological and socio-economic parameters. Even though many organisations, research institutes, government agencies and NGOs carry out monitoring, there is no central overview of monitoring activities, nor easy access to the resulting data. The 'Wadden Sea Long-Term Ecosystem Research' (WaLTER) project (2011-2015) aims to set up an integrated monitoring plan for the main environmental and management issues relevant to the Wadden Sea, such as sea-level rise, fisheries management, recreation and industry activities. The WaLTER data access infrastructure will be a distributed system of data providers, with a centralized data access portal. It is based on and makes use of the existing data access infrastructure of the Netherlands National Oceanographic Data Committee (NL-NODC), which has been operational since early 2009. The NL-NODC system is identical to and in fact developed by the European SeaDataNet project, furthering standardisation on a pan-European scale. The presentation will focus on the use of a distributed data access infrastructure to address the needs of different user groups such as policy makers, scientists and the general public.

  12. Controlling user access to electronic resources without password

    DOEpatents

    Smith, Fred Hewitt

    2015-06-16

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. The received indicia of user-proximal environmental information are compared to the associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison, in which the user-proximal environmental information is sufficiently similar to the computer-resource-proximal environmental information. In at least some embodiments, the process further includes comparing a user-supplied biometric measure with a predetermined association of at least one biometric measure of an authorized user; access to the restricted computer resource is granted in response to a favorable comparison.
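    A minimal sketch of the comparison step described above, assuming the environmental indicia can be reduced to sets of observed identifiers (for example, nearby Wi-Fi access points); the function names, threshold, and set-overlap measure are illustrative choices, not the patented implementation.

```python
# Illustrative sketch (not the patented implementation): grant access when the
# environmental information reported by the user is sufficiently similar to the
# information pre-associated with the protected resource.
from typing import Set

SIMILARITY_THRESHOLD = 0.7  # hypothetical cut-off for a "favorable comparison"

def jaccard_similarity(a: Set[str], b: Set[str]) -> float:
    """Overlap between two sets of observed environmental indicia."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def grant_access(resource_env: Set[str], user_env: Set[str],
                 biometric_ok: bool = True) -> bool:
    """Selectively grant access on a favorable environmental comparison,
    optionally combined with a biometric check."""
    similar_enough = jaccard_similarity(resource_env, user_env) >= SIMILARITY_THRESHOLD
    return similar_enough and biometric_ok

# Example: Wi-Fi access points visible near the resource vs. near the user.
resource_env = {"ap:lab-2f", "ap:corridor", "ap:guest"}
user_env = {"ap:lab-2f", "ap:corridor", "ap:guest", "ap:printer"}
print(grant_access(resource_env, user_env))  # True (overlap 3/4 = 0.75 >= 0.7)
```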

  13. Modeling erosion under future climates with the WEPP model

    Treesearch

    Timothy Bayley; William Elliot; Mark A. Nearing; D. Phillp Guertin; Thomas Johnson; David Goodrich; Dennis Flanagan

    2010-01-01

    The Water Erosion Prediction Project Climate Assessment Tool (WEPPCAT) was developed to be an easy-to-use, web-based erosion model that allows users to adjust climate inputs for user-specified climate scenarios. WEPPCAT allows the user to modify monthly mean climate parameters, including maximum and minimum temperatures, number of wet days, precipitation, and...

  14. Promoting scientific collaboration and research through integrated social networking capabilities within the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.

    2009-04-01

    LiDAR (Light Detection And Ranging) topography data offer earth scientists the opportunity to study the earth's surface at very high resolutions. As a result, the popularity of these data is growing dramatically. However, the management, distribution, and analysis of community LiDAR data sets is a challenge due to their massive size (multi-billion point, multi-terabyte). We have also found that many earth science users of these data sets lack the computing resources and expertise required to process them. We have developed the OpenTopography Portal to democratize access to these large and computationally challenging data sets. The OpenTopography Portal uses cyberinfrastructure technology developed by the GEON project to provide access to LiDAR data in a variety of formats. LiDAR data products available range from simple Google Earth visualizations of LiDAR-derived hillshades to 1 km2 tiles of standard digital elevation model (DEM) products, as well as LiDAR point cloud data and user-generated custom DEMs. We have found that LiDAR users span a wide spectrum of scientific applications, computing resources and technical experience, and thus require a data system with multiple distribution mechanisms and platforms to serve a broad range of user communities. Because the volume of LiDAR topography data available is rapidly expanding, and data analysis techniques are evolving, there is a need for the user community to be able to communicate and interact to share knowledge and experiences. To address this need, the OpenTopography Portal enables social networking capabilities through a variety of collaboration tools, web 2.0 technologies and customized usage pattern tracking. Fundamentally, these tools offer users the ability to communicate, to access and share documents, to participate in discussions, and to keep up to date on upcoming events and emerging technologies. The OpenTopography Portal achieves these social networking capabilities by integrating various software technologies and platforms. These include the Expression Engine Content Management System (CMS), which comes with pre-packaged collaboration tools like blogs and wikis; the Gridsphere portal framework, which contains the primary GEON LiDAR System portlet with user job monitoring capabilities; and a Java web-based discussion forum (Jforums) application, all seamlessly integrated under one portal. The OpenTopography Portal also provides an integrated authentication mechanism between the various CMS collaboration tools and the core Gridsphere-based portlets. The integration of these various technologies allows for enhanced user interaction capabilities within the portal. By integrating popular collaboration tools like discussion forums and blogs we can promote conversation and openness among users. The ability to ask questions and share expertise in forum discussions allows users to easily find information and interact with users facing similar challenges. The OpenTopography Blog enables our domain experts to post ideas, news items, commentary, and other resources in order to foster discussion and information sharing. The content management capabilities of the portal allow for easy updates to information in the form of publications, documents, and news articles. Access to the most current information fosters better decision-making.
As has become the standard for web 2.0 technologies, the OpenTopography Portal is fully RSS enabled to allow users of the portal to keep track of news items, forum discussions, blog updates, and system outages. We are currently exploring how the information captured by user and job monitoring components of the Gridsphere based GEON LiDAR System can be harnessed to provide a recommender system that will help users to identify appropriate processing parameters and to locate related documents and data. By seamlessly integrating the various platforms and technologies under one single portal, we can take advantage of popular online collaboration tools that are either stand alone or software platform restricted. The availability of these collaboration tools along with the data will foster more community interaction and increase the strength and vibrancy of the LiDAR topography user community.

  15. Accessibility information in New Delhi for "EasenAccess" Android-based app for persons with disability: an observational study.

    PubMed

    Agarwal, Yashovardhan

    2018-06-14

    The World Health Organization and the World Bank's "World Report on Disability" reported that over 1 billion people have various kinds of disability worldwide, while the Indian Census 2011 reported about 26 million in India. The United Nations Convention states, "The Rights of Persons with Disabilities (PwD) include accessibility to Information, Transportation, Environment, Communication Technology and Services". This article presents the rationale for making the "EasenAccess" (EnA) Android-based app, which empowers PwD with wheelchair-accessibility information, communication sentences and the ability to send SOS signals with their location. A survey of the 25 places in New Delhi most frequented by the general public and tourists was carried out using 12 chosen parameters and compared with the Government of India's survey of the 100 most important buildings nationally. A statistical analysis was performed and recommendations about areas for improvement were made for the Government of India. EasenAccess helps millions of PwD by enabling freedom of movement for employment and socio-economic activities, allowing them to lead an independent lifestyle. EasenAccess increases the government's access to information about lacunae, gives it an easy way to tabulate the places where accessibility needs updating, and helps the government facilitate information flow to PwD. Implications for Rehabilitation: The Rights of Persons with Disabilities Act, 2016 covers both the concept of Universal Design of products, environments and programs, and accessibility. We are exploring with them the ways technology can help bridge the gap between rehabilitation and accessibility. In higher-income countries such as the UK or USA, it is normal for a person to receive training when being given a wheelchair to prevent future injuries. Frequently, even with this training, people develop upper-limb injuries, due in part to the high, repetitive loads needed to push a wheelchair. This training is given as part of a package of rehabilitation, which also normally includes adaptations to people's living environments, which enable them to use their wheelchair indoors. In Accessible Routes from Crowd-based Cloud Services (ARCCS), many NGOs have developed sensors, themselves part of the Internet of Things, which when attached to a mobility device extend its capabilities. Users can interact with the sensor data on their mobile phone via an app. They can also add geo-tagged photo or voice notes to annotate their journey. These can then be shared with other users of the ARCCS system. The system has been developed with a range of wheelchair users and other stakeholders. For example, one such initiative by the Government of India is called "Street Rehab". The aim of Street Rehab is to co-develop a new system for delivering a service for wheelchair users, which puts everyday activities at the heart of the rehabilitation process. To do this, a clear understanding of user needs, available technology and the accessibility of the city are all required. The first step is to understand the current accessibility of Delhi; the next is to map this against the rehabilitation and livelihood requirements of the wheelchair and tricycle users. This approach has led to the development of novel sensors and a data processing chain, which can automatically identify features of the sidewalk or surface, for example drop curbs, camber, and rough terrain. These classifications are then used to help improve localization of the person.
In addition, the sensors can be used to identify the pushing techniques of people who self-propel their wheelchairs. These sensors were developed because mobile phones alone, while useful if secured in a fixed position, are not adequate when loosely placed in bags or pockets. The aim is to find practical solutions for those who use mobility aids in India to access the services and places they wish to without risk of injury. Injuries can occur due to toppling out of a mobility device, being hit by a vehicle, or developing an injury over time due to the demands of pushing or cycling the mobility device. EasenAccess can be a synergistic platform for all such future community-based rehabilitation approaches. It can also help compile and collate data on accessibility gaps and rehabilitation issues encountered. The EasenAccess app can thus help create an emerging framework which puts the experience of the wheelchair user at the center but with a clear connection to people who can implement policy change on a broad scale, one that includes local people who will be advocates for the creation of accessible maps and local NGOs that provide hubs of training. These can be linked to a series of YouTube videos and supported via a messaging service such as local WhatsApp groups, or via social media groups such as a Facebook page. With the many possibilities illustrated above, the EasenAccess app can create a new technological paradigm for the convergence of accessibility and rehabilitation.

  16. Bridging the Gap between User Experience Research and Design in Industry: An Analysis of Two Common Communication Tools--Personas and Scenarios

    ERIC Educational Resources Information Center

    Putnam, Cynthia

    2010-01-01

    User experience (UX) research in the design of technology products utilizes human-centered design (HCD) methods to summarize and explain pertinent information about end users to designers. However, UX researchers cannot effectively communicate the needs and goals of users if designers do not find UX research (a) easy to integrate into design…

  17. The Johnson Space Center Management Information Systems (JSCMIS): An interface for organizational databases

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.; Erickson, Lloyd

    1990-01-01

    The Management Information and Decision Support Environment (MIDSE) is a research activity to build and test a prototype of a generic human interface on the Johnson Space Center (JSC) Information Network (CIN). The existing interfaces were developed specifically to support operations rather than to provide the type of data which management could use. The diversity of the many interfaces and their relative difficulty discouraged occasional users from attempting to use them for their purposes. The MIDSE activity approached this problem by designing and building an interface to one JSC data base - the personnel statistics tables of the NASA Personnel and Payroll System (NPPS). The interface was designed against the following requirements: generic (usable with any relational NOMAD data base); easy to learn (intuitive operations for new users); easy to use (efficient operations for experienced users); self-documenting (a help facility which informs users about the data base structure as well as the operation of the interface); and low maintenance (easy configuration to new applications). A prototype interface entitled the JSC Management Information Systems (JSCMIS) was produced. It resides on CIN/PROFS and is available to JSC management who request it. The interface has passed management review and is ready for early use. Three kinds of data are now available: personnel statistics, personnel register, and plan/actual cost.

  18. Use and perceptions of information among family physicians: sources considered accessible, relevant, and reliable.

    PubMed

    Kosteniuk, Julie G; Morgan, Debra G; D'Arcy, Carl K

    2013-01-01

    The research determined (1) the information sources that family physicians (FPs) most commonly use to update their general medical knowledge and to make specific clinical decisions, and (2) the information sources FPs found to be most physically accessible, intellectually accessible (easy to understand), reliable (trustworthy), and relevant to their needs. A cross-sectional postal survey of 792 FPs and locum tenens, in full-time or part-time medical practice, currently practicing or on leave of absence in the Canadian province of Saskatchewan was conducted during the period of January to April 2008. Of 666 eligible physicians, 331 completed and returned surveys, resulting in a response rate of 49.7% (331/666). Medical textbooks and colleagues in the main patient care setting were the top 2 sources for the purpose of making specific clinical decisions. Medical textbooks were most frequently considered by FPs to be reliable (trustworthy), and colleagues in the main patient care setting were most physically accessible (easy to access). When making specific clinical decisions, FPs were most likely to use information from sources that they considered to be reliable and generally physically accessible, suggesting that FPs can best be supported by facilitating easy and convenient access to high-quality information.

  19. Use and perceptions of information among family physicians: sources considered accessible, relevant, and reliable

    PubMed Central

    Kosteniuk, Julie G.; Morgan, Debra G.; D'Arcy, Carl K.

    2013-01-01

    Objectives: The research determined (1) the information sources that family physicians (FPs) most commonly use to update their general medical knowledge and to make specific clinical decisions, and (2) the information sources FPs found to be most physically accessible, intellectually accessible (easy to understand), reliable (trustworthy), and relevant to their needs. Methods: A cross-sectional postal survey of 792 FPs and locum tenens, in full-time or part-time medical practice, currently practicing or on leave of absence in the Canadian province of Saskatchewan was conducted during the period of January to April 2008. Results: Of 666 eligible physicians, 331 completed and returned surveys, resulting in a response rate of 49.7% (331/666). Medical textbooks and colleagues in the main patient care setting were the top 2 sources for the purpose of making specific clinical decisions. Medical textbooks were most frequently considered by FPs to be reliable (trustworthy), and colleagues in the main patient care setting were most physically accessible (easy to access). Conclusions: When making specific clinical decisions, FPs were most likely to use information from sources that they considered to be reliable and generally physically accessible, suggesting that FPs can best be supported by facilitating easy and convenient access to high-quality information. PMID:23405045

  20. Development of a Low-cost, Comprehensive Recording System for Circadian Rhythm Behavior.

    PubMed

    Kwon, Jea; Park, Min Gu; Lee, Seung Eun; Lee, C Justin

    2018-02-01

    Circadian rhythm is defined as a 24-hour biological oscillation which persists even without any external cues but can also be re-entrained by various environmental cues. One of the widely accepted circadian rhythm behavioral experiments is measuring the wheel-running activity (WRA) of rodents. However, commercially available WRA recording systems are not easily affordable for researchers due to the high cost of implementing sensors for wheel rotation. Here, we developed a cost-effective and comprehensive system for circadian rhythm recording by measuring house-keeping activities (HKA). We monitored an animal's HKA as an electrical signal by simply connecting the animal housing cage to a standard analog/digital converter: input to the metal lid and ground to the metal grid floor. We show that the acquired electrical signals are the combined activities of eating, drinking and natural locomotor behaviors, which are well-known indicators of circadian rhythm. Post-processing of the measured electrical signals enabled us to draw actograms, which verify HKA to be a reliable circadian rhythm indicator. To provide researchers with easy access to the HKA recording system, we have developed user-friendly MATLAB-based software, Circa Analysis. This software provides functions for easy extraction of scalable "touch activity" from raw data files by automating seven steps of post-processing, and it draws actograms with a highly intuitive user interface and various options. We estimate the cost of our HKA circadian rhythm recording system to be less than $150 per channel. We anticipate our system will benefit many researchers who would like to study circadian rhythm.
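    A short Python sketch of the binning step behind an actogram, under the assumption that the recorded HKA signal is a one-dimensional activity trace; this is an independent illustration, not the authors' MATLAB tool Circa Analysis.

```python
# Illustrative Python sketch (not the authors' MATLAB tool "Circa Analysis"):
# turn a continuous activity signal into per-day activity rows for an actogram.
import numpy as np

def actogram_rows(activity: np.ndarray, fs_hz: float, bin_minutes: int = 6) -> np.ndarray:
    """Bin an activity trace sampled at fs_hz into a days x bins array (24 h per row)."""
    samples_per_bin = int(fs_hz * 60 * bin_minutes)
    n_bins = len(activity) // samples_per_bin
    binned = activity[: n_bins * samples_per_bin].reshape(n_bins, samples_per_bin).sum(axis=1)
    bins_per_day = (24 * 60) // bin_minutes
    n_days = len(binned) // bins_per_day
    return binned[: n_days * bins_per_day].reshape(n_days, bins_per_day)

# Example with a synthetic 3-day recording sampled at 1 Hz.
rng = np.random.default_rng(0)
signal = rng.random(3 * 24 * 3600)
rows = actogram_rows(signal, fs_hz=1.0)
print(rows.shape)  # (3, 240) -> three daily rows of 6-minute bins
```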

  1. PRIMO: An Interactive Homology Modeling Pipeline.

    PubMed

    Hatherley, Rowan; Brown, David K; Glenister, Michael; Tastan Bishop, Özlem

    2016-01-01

    The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO's automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/.

  2. PRIMO: An Interactive Homology Modeling Pipeline

    PubMed Central

    Glenister, Michael

    2016-01-01

    The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO’s automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/. PMID:27855192

  3. A main path domain map as digital library interface

    NASA Astrophysics Data System (ADS)

    Demaine, Jeffrey

    2009-01-01

    The shift to electronic publishing of scientific journals is an opportunity for the digital library to provide non-traditional ways of accessing the literature. One method is to use citation metadata drawn from a collection of electronic journals to generate maps of science. These maps visualize the communication patterns in the collection, giving the user an easy-to-grasp view of the semantic structure underlying the scientific literature. For this visualization to be understandable, the complexity of the citation network must be reduced through an algorithm. This paper describes the Citation Pathfinder application and its integration into a prototype digital library. This application generates small-scale citation networks that expand upon the search results of the digital library. These domain maps are linked to the collection, creating an interface that is based on the communication patterns in science. The Main Path Analysis technique is employed to simplify these networks into linear, sequential structures. By identifying patterns that characterize the evolution of the research field, Citation Pathfinder uses citations to give users a deeper understanding of the scientific literature.
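    A minimal sketch of main path analysis on a citation network, using search path counts (SPC) as edge weights and a greedy traversal from a source to a sink; the function names and the toy graph are illustrative, and this is not the Citation Pathfinder implementation.

```python
# A minimal sketch of main path analysis on a citation DAG, using networkx.
# It illustrates the general technique (SPC edge weights + greedy traversal);
# it is not the Citation Pathfinder implementation described in the paper.
import networkx as nx

def spc_weights(g: nx.DiGraph) -> dict:
    """Search Path Count per edge: number of source-to-sink paths passing through it."""
    order = list(nx.topological_sort(g))
    # paths from any source (in-degree 0) to each node
    from_source = {n: 1 if g.in_degree(n) == 0 else 0 for n in g}
    for n in order:
        for succ in g.successors(n):
            from_source[succ] += from_source[n]
    # paths from each node to any sink (out-degree 0)
    to_sink = {n: 1 if g.out_degree(n) == 0 else 0 for n in g}
    for n in reversed(order):
        for succ in g.successors(n):
            to_sink[n] += to_sink[succ]
    return {(u, v): from_source[u] * to_sink[v] for u, v in g.edges}

def main_path(g: nx.DiGraph) -> list:
    """Start at the source with the strongest outgoing edge and greedily follow SPC weights."""
    w = spc_weights(g)
    sources = [n for n in g if g.in_degree(n) == 0]
    start = max(sources, key=lambda s: max((w[(s, t)] for t in g.successors(s)), default=0))
    path = [start]
    while g.out_degree(path[-1]) > 0:
        path.append(max(g.successors(path[-1]), key=lambda t: w[(path[-1], t)]))
    return path

# Toy citation network: edges point from older papers to the newer papers citing them.
g = nx.DiGraph([("A", "B"), ("A", "C"), ("B", "D"), ("C", "D"), ("D", "E")])
print(main_path(g))  # ['A', 'B', 'D', 'E']
```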

  4. Dental Informatics tool "SOFPRO" for the study of oral submucous fibrosis.

    PubMed

    Erlewad, Dinesh Masajirao; Mundhe, Kalpana Anandrao; Hazarey, Vinay K

    2016-01-01

    Dental informatics is an evolving branch widely used in dental education and practice. Numerous applications that support clinical care, education and research have been developed. However, very few such applications have been developed and utilized in epidemiological studies of oral submucous fibrosis (OSF), which affects a significant population of Asian countries. The aim was to design and develop user-friendly software for the descriptive epidemiological study of OSF. With the help of a software engineer, a computer program, SOFPRO, was designed and developed using MS Visual Basic 6.0 (VB), MS Access 2000, Crystal Report 7.0 and MS Paint on the Windows XP operating system. For analysis purposes, the available OSF data from the departmental precancer registry were fed into SOFPRO. Known, unknown and null data are successfully accepted in data entry and represented in the data analysis of OSF. The smooth working of SOFPRO and its correct data flow were tested against real-time data of OSF. SOFPRO was found to be a user-friendly automated tool for easy data collection, retrieval, management and analysis of OSF patients.

  5. SpidermiR: An R/Bioconductor Package for Integrative Analysis with miRNA Data.

    PubMed

    Cava, Claudia; Colaprico, Antonio; Bertoli, Gloria; Graudenzi, Alex; Silva, Tiago C; Olsen, Catharina; Noushmehr, Houtan; Bontempi, Gianluca; Mauri, Giancarlo; Castiglioni, Isabella

    2017-01-27

    Gene Regulatory Networks (GRNs) control many biological systems, but how such network coordination is shaped is still unknown. GRNs can be subdivided into basic connections that describe how the network members interact, e.g., co-expression, physical interaction, co-localization, genetic influence, pathways, and shared protein domains. The important regulatory mechanisms of these networks involve miRNAs. We developed an R/Bioconductor package, namely SpidermiR, which offers the end user easy access to both GRNs and miRNAs, and integrates this information with differentially expressed genes obtained from The Cancer Genome Atlas. Specifically, SpidermiR allows users to: (i) query and download GRNs and miRNAs from validated and predicted repositories; (ii) integrate miRNAs with GRNs in order to obtain miRNA-gene-gene and miRNA-protein-protein interactions, and to analyze miRNA GRNs in order to identify miRNA-gene communities; and (iii) graphically visualize the results of the analyses. These analyses can be performed through a single interface and without the need for any downloads. The full data sets are then rapidly integrated and processed locally.

  6. Foldit Standalone: a video game-derived protein structure manipulation interface using Rosetta

    PubMed Central

    Kleffner, Robert; Flatten, Jeff; Leaver-Fay, Andrew; Baker, David; Siegel, Justin B.; Khatib, Firas; Cooper, Seth

    2017-01-01

Summary: Foldit Standalone is an interactive graphical interface to the Rosetta molecular modeling package. In contrast to most command-line or batch interactions with Rosetta, Foldit Standalone is designed to allow easy, real-time, direct manipulation of protein structures, while also giving access to the extensive power of Rosetta computations. Derived from the user interface of the scientific discovery game Foldit (itself based on Rosetta), Foldit Standalone has added more advanced features and removed the competitive game elements. Foldit Standalone was built from the ground up with a custom rendering and event engine, configurable visualizations and interactions driven by Rosetta. Foldit Standalone contains, among other features: electron density and contact map visualizations, multiple sequence alignment tools for template-based modeling, rigid body transformation controls, RosettaScripts support and an embedded Lua interpreter. Availability and Implementation: Foldit Standalone is available for download at https://fold.it/standalone, under the Rosetta license, which is free for academic and non-profit users. It is implemented in cross-platform C++ and binary executables are available for Windows, macOS and Linux. Contact: scooper@ccs.neu.edu PMID:28481970

  7. User's manual for the Gaussian windows program

    NASA Technical Reports Server (NTRS)

    Jaeckel, Louis A.

    1992-01-01

    'Gaussian Windows' is a method for exploring a set of multivariate data, in order to estimate the shape of the underlying density function. The method can be used to find and describe structural features in the data. The method is described in two earlier papers. I assume that the reader has access to both of these papers, so I will not repeat material from them. The program described herein is written in BASIC and it runs on an IBM PC or PS/2 with the DOS 3.3 operating system. Although the program is slow and has limited memory space, it is adequate for experimenting with the method. Since it is written in BASIC, it is relatively easy to modify. The program and some related files are available on a 3-inch diskette. A listing of the program is also available. This user's manual explains the use of the program. First, it gives a brief tutorial, illustrating some of the program's features with a set of artificial data. Then, it describes the results displayed after the program does a Gaussian window, and it explains each of the items on the various menus.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singleton, Jr., Robert; Israel, Daniel M.; Doebling, Scott William

    For code verification, one compares the code output against known exact solutions. There are many standard test problems used in this capacity, such as the Noh and Sedov problems. ExactPack is a utility that integrates many of these exact solution codes into a common API (application program interface), and can be used as a stand-alone code or as a python package. ExactPack consists of python driver scripts that access a library of exact solutions written in Fortran or Python. The spatial profiles of the relevant physical quantities, such as the density, fluid velocity, sound speed, or internal energy, are returned at a time specified by the user. The solution profiles can be viewed and examined by a command line interface or a graphical user interface, and a number of analysis tools and unit tests are also provided. We have documented the physics of each problem in the solution library, and provided complete documentation on how to extend the library to include additional exact solutions. ExactPack's code architecture makes it easy to extend the solution-code library to include additional exact solutions in a robust, reliable, and maintainable manner.
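    To illustrate the "exact solution behind a common interface" pattern described above, the sketch below reimplements the classic spherical Noh density and velocity profiles (gamma = 5/3, unit inflow speed) as a callable that returns profiles at a user-requested time; it is an independent illustration and does not reproduce ExactPack's own API.

```python
# Independent illustration of an "exact solution as a callable" interface, in the
# spirit of the common API described above; this is not ExactPack's own code.
import numpy as np

class SphericalNoh:
    """Exact density/velocity profiles for the spherical Noh problem
    (gamma = 5/3, initial density 1, uniform inflow speed 1)."""
    shock_speed = 1.0 / 3.0
    rho_post = 64.0   # post-shock density for gamma = 5/3 in spherical geometry

    def __call__(self, r: np.ndarray, t: float) -> dict:
        r = np.asarray(r, dtype=float)
        r_shock = self.shock_speed * t
        inside = r < r_shock
        # pre-shock density (1 + t/r)^2, post-shock density 64, velocities -1 and 0
        density = np.where(inside, self.rho_post, (1.0 + t / np.maximum(r, 1e-12)) ** 2)
        velocity = np.where(inside, 0.0, -1.0)
        return {"density": density, "velocity": velocity}

solver = SphericalNoh()
profiles = solver(np.linspace(0.01, 1.0, 5), t=0.6)
print(profiles["density"].round(2))
```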

  9. Design and Implementation of an Interactive Web-Based Near Real-Time Forest Monitoring System.

    PubMed

    Pratihast, Arun Kumar; DeVries, Ben; Avitabile, Valerio; de Bruin, Sytze; Herold, Martin; Bergsma, Aldo

    2016-01-01

    This paper describes an interactive web-based near real-time (NRT) forest monitoring system using four levels of geographic information services: 1) the acquisition of continuous data streams from satellite and community-based monitoring using mobile devices, 2) NRT forest disturbance detection based on satellite time-series, 3) presentation of forest disturbance data through a web-based application and social media and 4) interaction of the satellite based disturbance alerts with the end-user communities to enhance the collection of ground data. The system is developed using open source technologies and has been implemented together with local experts in the UNESCO Kafa Biosphere Reserve, Ethiopia. The results show that the system is able to provide easy access to information on forest change and considerably improves the collection and storage of ground observation by local experts. Social media leads to higher levels of user interaction and noticeably improves communication among stakeholders. Finally, an evaluation of the system confirms the usability of the system in Ethiopia. The implemented system can provide a foundation for an operational forest monitoring system at the national level for REDD+ MRV applications.

  10. VO-KOREL: A Fourier Disentangling Service of the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Škoda, Petr; Hadrava, Petr; Fuchs, Jan

    2012-04-01

    VO-KOREL is a web service exploiting the technology of the Virtual Observatory to provide astronomers with an intuitive graphical front-end and a distributed computing back-end running the most recent version of the Fourier disentangling code KOREL. The system integrates the ideas of the e-shop basket, preserving the privacy of every user through transfer encryption and access authentication, with features of a laboratory notebook that allow easy housekeeping of both input parameters and final results, and it explores the newly emerging technology of cloud computing. While the web-based front-end allows the user to submit data and parameter files, edit parameters, manage a job list, resubmit or cancel running jobs and, most importantly, watch the text and graphical results of the disentangling process, the main part of the back-end is a simple job queue submission system executing multiple instances of the FORTRAN code KOREL in parallel. This may be easily extended for GRID-based deployment on massively parallel computing clusters. A short introduction to the underlying technologies is given, briefly mentioning the advantages as well as the bottlenecks of the design used.

  11. Modular Aero-Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Guo, Ten-Huei

    2006-01-01

    The Modular Aero-Propulsion System Simulation (MAPSS) is a graphical simulation environment designed for the development of advanced control algorithms and rapid testing of these algorithms on a generic computational model of a turbofan engine and its control system. MAPSS is a nonlinear, non-real-time simulation comprising a Component Level Model (CLM) module and a Controller-and-Actuator Dynamics (CAD) module. The CLM module simulates the dynamics of engine components at a sampling rate of 2,500 Hz. The controller submodule of the CAD module simulates a digital controller, which has a typical update rate of 50 Hz. The sampling rate for the actuators in the CAD module is the same as that of the CLM. MAPSS provides a graphical user interface that affords easy access to engine-operation, engine-health, and control parameters; is used to enter such input model parameters as power lever angle (PLA), Mach number, and altitude; and can be used to change controller and engine parameters. Output variables are selectable by the user. Output data as well as any changes to constants and other parameters can be saved and reloaded into the GUI later.
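    A conceptual Python sketch of the multi-rate structure described above, in which the plant model steps at 2,500 Hz while the digital controller updates at 50 Hz; the dynamics and control law are placeholders for illustration, not MAPSS code.

```python
# Conceptual sketch of the multi-rate simulation structure described above
# (not MAPSS code): the engine model advances at 2,500 Hz, the controller at 50 Hz.
CLM_RATE_HZ = 2500           # component-level model sampling rate
CONTROLLER_RATE_HZ = 50      # digital controller update rate
STEPS_PER_CONTROL = CLM_RATE_HZ // CONTROLLER_RATE_HZ  # 50 engine steps per control update

def simulate(duration_s: float, pla: float) -> list:
    """Run a toy engine/controller loop; 'pla' stands in for the power lever angle input."""
    dt = 1.0 / CLM_RATE_HZ
    speed, fuel_cmd, history = 0.0, 0.0, []
    for step in range(int(duration_s * CLM_RATE_HZ)):
        if step % STEPS_PER_CONTROL == 0:            # controller runs every 50th step
            fuel_cmd = 0.1 * (pla - speed)           # placeholder proportional law
        speed += dt * (fuel_cmd - 0.01 * speed)      # placeholder engine dynamics
        history.append(speed)
    return history

print(len(simulate(0.1, pla=40.0)))  # 250 engine-model steps in 0.1 s
```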

  12. Status and plans for the future of the Vienna VLBI Software

    NASA Astrophysics Data System (ADS)

    Madzak, Matthias; Böhm, Johannes; Böhm, Sigrid; Girdiuk, Anastasiia; Hellerschmied, Andreas; Hofmeister, Armin; Krasna, Hana; Kwak, Younghee; Landskron, Daniel; Mayer, David; McCallum, Jamie; Plank, Lucia; Schönberger, Caroline; Shabala, Stanislav; Sun, Jing; Teke, Kamil

    2016-04-01

    The Vienna VLBI Software (VieVS) is a VLBI analysis software developed and maintained at Technische Universität Wien (TU Wien) since 2008 with contributions from groups all over the world. It is used for both academic purposes in university courses as well as for providing VLBI analysis results to the geodetic community. Written in a modular structure in Matlab, VieVS offers easy access to the source code and the possibility to adapt the programs for particular purposes. The new version 2.3, released in December 2015, includes several new parameters to be estimated in the global solution, such as tidal ERP variation coefficients. The graphical user interface was slightly modified for an improved user functionality and, e.g., the possibility of deriving baseline length repeatabilities. The scheduling of satellite observations was refined, the simulator newly includes the effect of source structure which can also be corrected for in the analysis. This poster gives an overview of all VLBI-related activities in Vienna and provides an outlook to future plans concerning the Vienna VLBI Software.

  13. Exploratory visualization software for reporting environmental survey results.

    PubMed

    Fisher, P; Arnot, C; Bastin, L; Dykes, J

    2001-08-01

    Environmental surveys yield three principal products: maps, a set of data tables, and a textual report. The relationships between these three elements, however, are often cumbersome to present, making full use of all the information in an integrated and systematic sense difficult. The published paper report is only a partial solution. Modern developments in computing, particularly in cartography, GIS, and hypertext, mean that it is increasingly possible to conceive of an easier and more interactive approach to the presentation of such survey results. Here, we present such an approach which links map and tabular datasets arising from a vegetation survey, allowing users ready access to a complex dataset using dynamic mapping techniques. Multimedia datasets equipped with software like this provide an exciting means of quick and easy visual data exploration and comparison. These techniques are gaining popularity across the sciences as scientists and decision-makers are presented with increasing amounts of diverse digital data. We believe that the software environment actively encourages users to make complex interrogations of the survey information, providing a new vehicle for the reader of an environmental survey report.

  14. Operational Space-Assisted Irrigation Advisory Services: Overview Of And Lessons Learned From The Project DEMETER

    NASA Astrophysics Data System (ADS)

    Osann Jochum, M. A.; Demeter Partners

    2006-08-01

    The project DEMETER (DEMonstration of Earth observation TEchnologies in Routine irrigation advisory services) was dedicated to assessing and demonstrating improvements introduced by Earth observation (EO) and Information and Communication Technologies (ICT) in farm and Irrigation Advisory Service (IAS) day-to-day operations. The DEMETER concept of near-real-time delivery of EO-based irrigation scheduling information to IAS and farmers has proven to be valid. The operationality of the space segment was demonstrated for Landsat 5-TM in the Barrax pilot zone during the 2004 and 2005 irrigation campaigns. Extra-fast image delivery and quality controlled operational processing make the EO-based crop coefficient maps available at the same speed and quality as ground-based data (point samples), while significantly extending the spatial coverage and reducing service cost. Leading-edge online analysis and visualization tools provide easy, intuitive access to the information and personalized service to users. First feedback of users at IAS and farmer level is encouraging. The paper gives an overview of the project and its main achievements.

  15. Service-based analysis of biological pathways

    PubMed Central

    Zheng, George; Bouguettaya, Athman

    2009-01-01

    Background: Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled by leveraging a Web service-based modeling and mining approach. Results: Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying the user's hypotheses. Conclusion: Web service modeling of biological processes allows these processes to be easily accessed and invoked on the Web. The Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. The algorithm presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enables the user to formulate hypotheses, which can be tested using the simulation algorithm also described in this paper. PMID:19796403

  16. BamTools: a C++ API and toolkit for analyzing and managing BAM files.

    PubMed

    Barnett, Derek W; Garrison, Erik K; Quinlan, Aaron R; Strömberg, Michael P; Marth, Gabor T

    2011-06-15

    Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first C++ API publicly available for BAM file support as well as a command-line toolkit. BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.org/pezmaster31/bamtools.

  17. The Hubble Spectroscopic Legacy Archive

    NASA Astrophysics Data System (ADS)

    Peeples, Molly S.; Tumlinson, Jason; Fox, Andrew; Aloisi, Alessandra; Ayres, Thomas R.; Danforth, Charles; Fleming, Scott W.; Jenkins, Edward B.; Jedrzejewski, Robert I.; Keeney, Brian A.; Oliveira, Cristina M.

    2016-01-01

    With no future space ultraviolet instruments currently planned, the data from the UV spectrographs aboard the Hubble Space Telescope have a legacy value beyond their initial science goals. The Hubble Spectroscopic Legacy Archive will provide to the community new science-grade combined spectra for all publicly available data obtained by the Cosmic Origins Spectrograph (COS) and the Space Telescope Imaging Spectrograph (STIS). These data will be packaged into "smart archives" according to target type and scientific themes to facilitate the construction of archival samples for common science uses. A new "quick look" capability will make the data easy for users to quickly access, assess the quality of, and download for archival science starting in Cycle 24, with the first generation of these products for the FUV modes of COS available online via MAST in early 2016.

  18. Developing science gateways for drug discovery in a grid environment.

    PubMed

    Pérez-Sánchez, Horacio; Rezaei, Vahid; Mezhuyev, Vitaliy; Man, Duhu; Peña-García, Jorge; den-Haan, Helena; Gesing, Sandra

    2016-01-01

    Methods for in silico screening of large databases of molecules increasingly complement and replace experimental techniques to discover novel compounds to combat diseases. As these techniques become more complex and computationally costly, we face a growing challenge in providing the life sciences research community with a convenient tool for high-throughput virtual screening on distributed computing resources. To this end, we recently integrated the biophysics-based drug-screening program FlexScreen into a service, applicable for large-scale parallel screening and reusable in the context of scientific workflows. Our implementation is based on Pipeline Pilot and the Simple Object Access Protocol and provides an easy-to-use graphical user interface to construct complex workflows, which can be executed on distributed computing resources, thus accelerating the throughput by several orders of magnitude.

  19. Comply with regulations or risk paying hefty fines: ten tips for choosing call recording to help ensure compliance.

    PubMed

    Johnson, Bill

    2014-01-01

    Medical practices are paying hundreds of thousands of dollars in fines for not complying with various governmental regulations, including a variety of HIPAA rules and credit card compliance. One solution to help reduce this risk and avoid fines is to use call recording to help ensure compliance. This article provides readers with key considerations for choosing and implementing a call recording solution for their medical practices to ensure that it will be compliant with key regulations. These tips include being able to customize call recording policies and procedures for their unique needs; providing secure, private storage; allowing easy access for authorized users; secure sharing of call recordings; regulatory compliance training; disaster recovery; and maintaining an audit-ready and compliant-evident state at all times.

  20. Protect the sick: health insurance reform in one easy lesson.

    PubMed

    Stone, Deborah

    2008-01-01

    In thinking about how to expand insurance coverage, the issue that matters is whether insurance enables sick and high-risk people to get medical care. Over the course of three decades, market-oriented insurance reforms have shifted more costs of illness onto people who need and use medical care. By making the users of care pay for it (or even some of it), cost-sharing discourages sick people from getting care, even if they have insurance, and for people with low incomes and tight budgets, cost-sharing can effectively deny them access to care. Thus, covering or not covering sick people is the core issue of health insurance reform, both as a determinant of support and opposition to proposals, and as the proper yardstick for evaluating reform ideas.

  1. Isfahan MISP Dataset

    PubMed Central

    Kashefpur, Masoud; Kafieh, Rahele; Jorjandi, Sahar; Golmohammadi, Hadis; Khodabande, Zahra; Abbasi, Mohammadreza; Teifuri, Nilufar; Fakharzadeh, Ali Akbar; Kashefpoor, Maryam; Rabbani, Hossein

    2017-01-01

    An online depository was introduced to share clinical ground truth with the public and provide open access for researchers to evaluate their computer-aided algorithms. PHP was used for web programming and MySQL for database management. The website was entitled “biosigdata.com.” It was a fast, secure, and easy-to-use online database for medical signals and images. Freely registered users could download the datasets and could also share their own supplementary materials while maintaining their privacies (citation and fee). Commenting was also available for all datasets, and automatic sitemap generation and semi-automatic SEO indexing have been set up for the site. A comprehensive list of available websites for medical datasets is also presented as a supplementary file (http://journalonweb.com/tempaccess/4800.584.JMSS_55_16I3253.pdf). PMID:28487832

  2. Method and system to discover and recommend interesting documents

    DOEpatents

    Potok, Thomas Eugene; Steed, Chad Allen; Patton, Robert Matthew

    2017-01-31

    Disclosed are several examples of systems that can read millions of news feeds per day about topics (e.g., your customers, competitors, markets, and partners), and provide a small set of the most relevant items to read to keep current with the overwhelming amount of information currently available. Topics of interest can be chosen by the user of the system for use as seeds. The seeds can be vectorized and compared with the target documents to determine their similarity. The similarities can be sorted from highest to lowest so that the most similar seed and target documents are at the top of the list. This output can be produced in XML format so that an RSS Reader can format the XML. This allows for easy Internet access to these recommendations.
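    An illustrative sketch of the general seed-vectorization approach described above, using TF-IDF vectors and cosine similarity to rank target documents against seed topics; it shows the technique only and is not the patented implementation.

```python
# Illustrative sketch of the general approach (vectorize seed topics, rank target
# documents by similarity); this is not the patented implementation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

seeds = ["quantum computing hardware", "battery storage markets"]
targets = [
    "New qubit design improves error rates in quantum processors",
    "Grid-scale battery prices fall as storage demand grows",
    "Local bakery wins regional award",
]

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(seeds + targets)
seed_vecs, target_vecs = matrix[: len(seeds)], matrix[len(seeds):]

# Score each target by its highest similarity to any seed, then sort descending.
scores = cosine_similarity(target_vecs, seed_vecs).max(axis=1)
ranked = sorted(zip(scores, targets), reverse=True)
for score, doc in ranked:
    print(f"{score:.2f}  {doc}")
```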

  3. Intelligent retrieval of medical images from the Internet

    NASA Astrophysics Data System (ADS)

    Tang, Yau-Kuo; Chiang, Ted T.

    1996-05-01

    The objective of this study is to use Internet resources to provide a cost-effective, user-friendly method of accessing the medical image archive system and an easy way for the user to identify the required images. This paper describes the prototype system architecture, the implementation, and the results. In the study, we prototyped the Intelligent Medical Image Retrieval (IMIR) system as a Hypertext Transfer Protocol server and provided Hypertext Markup Language forms through which the user, as an Internet client, can use a browser to enter image retrieval criteria for review. We are developing the intelligent retrieval engine, with the capability to map free-text search criteria to the standard terminology used for medical image identification. We evaluate retrieved records based on the number of free-text entries matched and their relevance level to the standard terminology. We are in the integration and testing phase. We have collected only a few different types of images for testing and have trained a few phrases to map free text to standard medical terminology. Nevertheless, we are able to demonstrate IMIR's ability to search, retrieve, and review medical images from the archives using a general Internet browser. The prototype also uncovered potential problems in performance, security, and accuracy. Additional studies and enhancements will make the system clinically operational.
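    A toy sketch of the scoring idea described above, in which free-text query phrases are mapped to standard terms and retrieved records are scored by the number of matches weighted by a relevance level; the term map and weights are invented for illustration and are not the IMIR engine.

```python
# Illustrative scoring sketch for mapping free-text queries to standard terms
# (not the IMIR implementation): count matched standard terms, weighted by a
# hypothetical relevance level assigned to each mapping.
TERM_MAP = {  # free-text phrase -> (standard term, relevance weight); invented examples
    "chest xray": ("chest radiograph", 1.0),
    "head ct": ("computed tomography, head", 1.0),
    "brain scan": ("computed tomography, head", 0.6),
}

def score_record(query_phrases: list, record_terms: set) -> float:
    """Sum relevance weights of query phrases whose standard term appears in the record."""
    score = 0.0
    for phrase in query_phrases:
        standard, weight = TERM_MAP.get(phrase.lower(), (None, 0.0))
        if standard in record_terms:
            score += weight
    return score

record = {"computed tomography, head", "contrast agent"}
print(score_record(["brain scan", "chest xray"], record))  # 0.6
```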

  4. An Open Architecture to Support Social and Health Services in a Smart TV Environment.

    PubMed

    Costa, Carlos Rivas; Anido-Rifon, Luis E; Fernandez-Iglesias, Manuel J

    2017-03-01

    The aim was to design, implement, and test a solution based on smart TV technologies that provides the elderly at home with social and health services and access to all services. The architecture proposed is based on an open software platform and standard personal computing hardware. This provides great flexibility to develop new applications over the underlying infrastructure or to integrate new devices, for instance to monitor a broad range of vital signs in those cases where home monitoring is required. An actual system was designed, implemented, and deployed as a proof of concept. Applications range from social network clients to vital signs monitoring, and from interactive TV contests to conventional online care applications such as medication reminders or telemedicine. In both cases, the results have been very positive, confirming the initial perception of the TV as a convenient, easy-to-use technology to provide social and health care. The TV set is a much more familiar computing interface for most senior users, and as a consequence, smart TVs become a most convenient solution for the design and implementation of applications and services targeted at this user group. This proposal has been tested in a real setting with 62 senior people at their homes. Users included both individuals with experience using computers and others reluctant to use them.

  5. AEGIS: a wildfire prevention and management information system

    NASA Astrophysics Data System (ADS)

    Kalabokidis, K.; Ager, A.; Finney, M.; Athanasis, N.; Palaiologou, P.; Vasilakos, C.

    2015-10-01

    A Web-GIS wildfire prevention and management platform (AEGIS) was developed as an integrated and easy-to-use decision support tool (http://aegis.aegean.gr). The AEGIS platform assists with early fire warning, fire planning, fire control and coordination of firefighting forces by providing access to information that is essential for wildfire management. Databases were created with spatial and non-spatial data to support key system functionalities. Updated land use/land cover maps were produced by combining field inventory data with high-resolution multispectral satellite images (RapidEye) to be used as inputs in fire propagation modeling with the Minimum Travel Time algorithm. End users provide a minimum number of inputs, such as fire duration, ignition point and weather information, to conduct a fire simulation. AEGIS offers three types of simulations, i.e., single-fire propagation, conditional burn probabilities, and landscape-level simulations, similar to the FlamMap fire behavior modeling software. Artificial neural networks (ANNs) were utilized for wildfire ignition risk assessment based on various parameters, training methods, activation functions, pre-processing methods and network structures. The combination of ANNs and expected burned area maps produced an integrated output map for fire danger prediction. The system also incorporates weather measurements from remote automatic weather stations and weather forecast maps. The structure of the algorithms relies on parallel processing techniques (i.e., High Performance Computing and Cloud Computing) that ensure computational power and speed. All AEGIS functionalities are accessible to authorized end users through a web-based graphical user interface. An innovative mobile application, AEGIS App, acts as a complementary tool to the web-based version of the system.
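    A conceptual sketch of combining an ignition-probability surface with an expected burned-area surface into a single danger index, as described above; the normalization and weighting are assumptions made for illustration, not the AEGIS formulation.

```python
# Conceptual sketch (not the AEGIS implementation): combine a modeled ignition
# probability surface with an expected burned-area surface into one danger index.
import numpy as np

def fire_danger_index(ignition_prob: np.ndarray, expected_burned_area: np.ndarray) -> np.ndarray:
    """Normalize expected burned area to [0, 1] and weight it by ignition probability."""
    area_norm = expected_burned_area / max(expected_burned_area.max(), 1e-9)
    return ignition_prob * area_norm

rng = np.random.default_rng(1)
danger = fire_danger_index(rng.random((4, 4)), rng.random((4, 4)) * 500.0)
print(danger.round(2))  # 4x4 grid of combined danger values in [0, 1]
```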

  6. A Prototype Hydrologic Observatory for the Neuse River Basin Using Remote Sensing Data as a Part of the CUAHSI-HIS Effort

    NASA Astrophysics Data System (ADS)

    Kanwar, R.; Narayan, U.; Lakshmi, V.

    2005-12-01

    Remote sensing has the potential to immensely advance the science and application of hydrology as it provides multi-scale and multi-temporal measurements of several hydrologic parameters. There is a wide variety of remote sensing data sources available to a hydrologist with a myriad of data formats, access techniques, data quality issues and temporal and spatial extents. It is very important to make data availability and its usage as convenient as possible for potential users. The CUAHSI Hydrologic Information System (HIS) initiative addresses this issue of better data access and management for hydrologists with a focus on in-situ data, that is point measurements of water and energy fluxes which make up the 'more conventional' sources of hydrologic data. This paper explores various sources of remotely sensed hydrologic data available, their data formats and volumes, current modes of data acquisition by end users, metadata associated with data itself, and requirements from potential data models that would allow a seamless integration of remotely sensed hydrologic observations into the Hydrologic Information System. Further, a prototype hydrologic observatory (HO) for the Neuse River Basin is developed using surface temperature, vegetation indices and soil moisture estimates available from remote sensing. The prototype (HO) uses the CUAHSI digital library system (DLS) on the back (server) end. On the front (client) end, a rich visual environment has been developed in order to provide better decision making tools in order to make an optimal choice in the selection of remote sensing data for a particular application. An easy point and click interface to the remote sensing data is also implemented for common users who are just interested in location based query of hydrologic variable values.

  7. Support Services for Remote Users of Online Public Access Catalogs.

    ERIC Educational Resources Information Center

    Kalin, Sally W.

    1991-01-01

    Discusses the needs of remote users of online public access catalogs (OPACs). User expectations are discussed; problems encountered by remote-access users are examined, including technical problems and searching problems; support services are described, including instruction, print guides, and online help; and differences from the needs of…

  8. Lunar e-Library: Putting Space History to Work

    NASA Technical Reports Server (NTRS)

    McMahan, Tracy A.; Shea, Charlotte A.; Finckenor, Miria

    2006-01-01

    As NASA plans and implements the Vision for Space Exploration, managers, engineers, and scientists need historically important information that is readily available and easily accessed. The Lunar e-Library - a searchable collection of 1100 electronic (.PDF) documents - makes it easy to find critical technical data and lessons learned and put space history knowledge in action. The Lunar e-Library, a DVD knowledge database, was developed by NASA to shorten research time and put knowledge at users' fingertips. Funded by NASA's Space Environments and Effects (SEE) Program headquartered at Marshall Space Flight Center (MSFC) and the MSFC Materials and Processes Laboratory, the goal of the Lunar e-Library effort was to identify key lessons learned from Apollo and other lunar programs and missions and to provide technical information from those programs in an easy-to-use format. The SEE Program began distributing the Lunar e-Library knowledge database in 2006. This paper describes the Lunar e-Library development process (including a description of the databases and resources used to acquire the documents) and the contents of the DVD product, demonstrates its usefulness with focused searches, and provides information on how to obtain this free resource.

  9. Hand-held computer operating system program for collection of resident experience data.

    PubMed

    Malan, T K; Haffner, W H; Armstrong, A Y; Satin, A J

    2000-11-01

    To describe a system for recording resident experience involving hand-held computers with the Palm Operating System (3Com, Inc., Santa Clara, CA). Hand-held personal computers (PCs) are popular, easy to use, inexpensive, portable, and can share data with other operating systems. Residents in our program carry individual hand-held database computers to record Residency Review Committee (RRC) reportable patient encounters. Each resident's data are transferred to a single central relational database compatible with Microsoft Access (Microsoft Corporation, Redmond, WA). Patient data entry and subsequent transfer to the central database are accomplished with commercially available software that requires minimal computer expertise to implement and maintain. The central database can then be used for statistical analysis or to create the required RRC resident experience reports. As a result, the data collection and transfer process takes less time for residents and program director alike than paper-based or central computer-based systems. The system of collecting resident encounter data using hand-held computers with the Palm Operating System is easy to use, relatively inexpensive, accurate, and secure. The user-friendly system provides prompt, complete, and accurate data, enhancing the education of residents while facilitating the job of the program director.
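
    To make the data flow concrete, here is a minimal sketch of merging per-resident encounter exports into a central relational database and summarizing them for an experience report. It uses Python's sqlite3 as a stand-in for the Microsoft Access database described above, and the table and column names (encounters, resident_id, procedure_code) are hypothetical, not the program's actual schema.

    ```python
    import sqlite3

    # Hypothetical schema: each resident's hand-held device exports rows of
    # (resident_id, encounter_date, procedure_code); names are illustrative only.
    central = sqlite3.connect(":memory:")  # stand-in for the Access database
    central.execute(
        "CREATE TABLE encounters (resident_id TEXT, encounter_date TEXT, procedure_code TEXT)"
    )

    def merge_resident_export(conn, rows):
        """Append one resident's exported encounter records to the central table."""
        conn.executemany("INSERT INTO encounters VALUES (?, ?, ?)", rows)
        conn.commit()

    merge_resident_export(central, [
        ("R01", "2000-09-14", "vaginal_delivery"),
        ("R01", "2000-09-15", "cesarean_delivery"),
        ("R02", "2000-09-15", "vaginal_delivery"),
    ])

    # A simple experience summary of the kind an RRC report draws on.
    for resident, procedure, n in central.execute(
        "SELECT resident_id, procedure_code, COUNT(*) FROM encounters "
        "GROUP BY resident_id, procedure_code ORDER BY resident_id"
    ):
        print(resident, procedure, n)
    ```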

  10. Effects of accessible website design on nondisabled users: age and device as moderating factors.

    PubMed

    Schmutz, Sven; Sonderegger, Andreas; Sauer, Juergen

    2018-05-01

    This study examined how implementing recommendations from Web accessibility guidelines affects nondisabled people in different age groups using different technical devices. While recent research showed positive effects of implementing such recommendations for nondisabled users, it remained unclear whether such effects would apply across age groups and kinds of devices. A 2 × 2 × 2 design was employed with website accessibility (high accessibility vs. very low accessibility), age (younger adults vs. older adults) and type of device (laptop vs. tablet) as independent variables. A total of 110 nondisabled participants took part in a usability test, in which performance and satisfaction were measured as dependent variables. The results showed that higher accessibility improved task completion rate, task completion time and satisfaction ratings of nondisabled users. While user age did not have any effects, users showed faster task completion times under high accessibility when using a tablet rather than a laptop. The findings confirm previous work showing benefits of accessible websites for nondisabled users; these beneficial effects may now be generalised to a wide age range and across different devices. Practitioner Summary: This work is relevant to the design of websites since it emphasises the need to consider the characteristics of different user groups. Accessible website design (aimed at users with disabilities) leads to benefits for nondisabled users across different ages. These findings provide further encouragement for practitioners to apply WCAG 2.0.

  11. A standard format and a graphical user interface for spin system specification.

    PubMed

    Biternas, A G; Charnock, G T P; Kuprov, Ilya

    2014-03-01

    We introduce a simple and general XML format for spin system description that is the result of extensive consultations within the Magnetic Resonance community and unifies under one roof all major existing spin interaction specification conventions. The format is human-readable, easy to edit and easy to parse using standard XML libraries. We also describe a graphical user interface that was designed to facilitate the construction and visualization of complicated spin systems. The interface is capable of generating input files for several popular spin dynamics simulation packages. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
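
    As an illustration of how easily such a description can be parsed with a standard XML library, the sketch below reads a toy spin-system document with Python's xml.etree.ElementTree. The element and attribute names (spin_system, spin, interaction, value_hz) are invented for this example and do not reproduce the actual format defined in the paper.

    ```python
    import xml.etree.ElementTree as ET

    # A made-up spin system description; the tag and attribute names below are
    # hypothetical and are NOT the format defined in the paper.
    doc = """<spin_system>
      <spin id="1" isotope="1H"/>
      <spin id="2" isotope="13C"/>
      <interaction type="j_coupling" spins="1 2" value_hz="140.0"/>
    </spin_system>"""

    root = ET.fromstring(doc)

    # Collect the spins and their isotopes.
    spins = {s.get("id"): s.get("isotope") for s in root.findall("spin")}

    # Collect pairwise interactions.
    couplings = [
        (tuple(i.get("spins").split()), float(i.get("value_hz")))
        for i in root.findall("interaction")
    ]

    print(spins)      # {'1': '1H', '2': '13C'}
    print(couplings)  # [(('1', '2'), 140.0)]
    ```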

  12. Distributed visualization framework architecture

    NASA Astrophysics Data System (ADS)

    Mishchenko, Oleg; Raman, Sundaresan; Crawfis, Roger

    2010-01-01

    An architecture for distributed and collaborative visualization is presented. The design goals of the system are to create a lightweight, easy-to-use and extensible framework for research in scientific visualization. The system provides both single-user and collaborative distributed environments. The system architecture employs a client-server model. Visualization projects can be synchronously accessed and modified from different client machines. We present a set of visualization use cases that illustrate the flexibility of our system. The framework provides a rich set of reusable components for creating new applications. These components make heavy use of leading design patterns. All components are based on the functionality of a small set of interfaces. This allows new components to be integrated seamlessly with little to no effort. All user input and higher-level control functionality interface with proxy objects supporting a concrete implementation of these interfaces. These lightweight objects can easily be streamed across the web and even integrated with smart clients running on a user's cell phone. The back end is supported by concrete implementations wherever needed (for instance, for rendering). A middle tier manages any communication and synchronization with the proxy objects. In addition to the data components, we have developed several first-class GUI components for visualization. These include a layer compositor editor, a programmable shader editor, a material editor and various drawable editors. These GUI components interact strictly with the interfaces. Access to the various entities in the system is provided by an AssetManager. The asset manager keeps track of all registered proxies and responds to queries on the overall system. This allows all user components to be populated automatically. Hence, if a new component is added that supports the IMaterial interface, any instance of it can be used in the various GUI components that work with this interface. One of the main features is an interactive shader designer, which allows rapid prototyping of new shader-based visualization renderings and greatly accelerates the development and debug cycle.
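
    The interface/proxy/AssetManager arrangement described above can be sketched in a few lines. The sketch below is a loose Python illustration, not the framework's actual code: IMaterial and AssetManager are names taken from the abstract, while every method name and the PhongMaterialProxy and MaterialEditor classes are hypothetical.

    ```python
    from abc import ABC, abstractmethod

    class IMaterial(ABC):
        """Interface that GUI components program against (name from the abstract);
        the method below is a hypothetical stand-in for its real contract."""
        @abstractmethod
        def describe(self) -> str: ...

    class AssetManager:
        """Keeps track of registered proxies and answers queries by interface."""
        def __init__(self):
            self._proxies = []

        def register(self, proxy):
            self._proxies.append(proxy)

        def query(self, interface):
            return [p for p in self._proxies if isinstance(p, interface)]

    class PhongMaterialProxy(IMaterial):       # hypothetical concrete proxy
        def describe(self):
            return "Phong material (proxy for a back-end implementation)"

    class MaterialEditor:
        """A GUI component that is populated automatically from the registry."""
        def __init__(self, assets: AssetManager):
            self.materials = assets.query(IMaterial)

    assets = AssetManager()
    assets.register(PhongMaterialProxy())
    editor = MaterialEditor(assets)
    print([m.describe() for m in editor.materials])
    ```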

  13. LIMS for Lasers 2015 for achieving long-term accuracy and precision of δ2H, δ17O, and δ18O of waters using laser absorption spectrometry

    USGS Publications Warehouse

    Coplen, Tyler B.; Wassenaar, Leonard I

    2015-01-01

    Although laser absorption spectrometry (LAS) instrumentation is easy to use, its incorporation into laboratory operations is not easy, owing to extensive offline manipulation of comma-separated-values files for outlier detection, between-sample memory correction, nonlinearity (δ-variation with water amount) correction, drift correction, normalization to VSMOW-SLAP scales, and difficulty in performing long-term QA/QC audits. METHODS: A Microsoft Access relational-database application, LIMS (Laboratory Information Management System) for Lasers 2015, was developed. It automates LAS data corrections and manages clients, projects, samples, instrument-sample lists, and triple-isotope (δ17O, δ18O, and δ2H values) instrumental data for liquid-water samples. It enables users to (1) graphically evaluate sample injections for variable water yields and high isotope-delta variance; (2) correct for between-sample carryover, instrumental drift, and δ nonlinearity; and (3) normalize final results to VSMOW-SLAP scales. RESULTS: Cost-free LIMS for Lasers 2015 enables users to obtain improved δ17O, δ18O, and δ2H values with liquid-water LAS instruments, even those with under-performing syringes. For example, LAS δ2H VSMOW-scale measurements of USGS50 Lake Kyoga (Uganda) water using an under-performing syringe having ±10 % variation in water concentration gave +31.7 ± 1.6 ‰ (2-σ standard deviation), compared with the reference value of +32.8 ± 0.4 ‰, after correction for variation in δ value with water concentration, between-sample memory, and normalization to the VSMOW-SLAP scale. CONCLUSIONS: LIMS for Lasers 2015 enables users to create systematic, well-founded instrument templates, import δ2H, δ17O, and δ18O results, evaluate performance with automatic graphical plots, correct for δ nonlinearity due to variable water concentration, correct for between-sample memory, adjust for drift, perform VSMOW-SLAP normalization, and perform long-term QA/QC audits easily.
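
    One of the corrections mentioned above, normalization to the VSMOW-SLAP scale, is conventionally a two-point linear stretch anchored at the two reference waters. The sketch below shows that standard calculation in Python; it illustrates the general scheme only, not the LIMS for Lasers 2015 implementation, and the measured values in the example are invented.

    ```python
    def normalize_to_vsmow_slap(delta_measured, vsmow_measured, slap_measured,
                                slap_assigned):
        """Two-point linear normalization of a measured delta value (in per mil)
        to the VSMOW-SLAP scale; VSMOW is assigned 0 per mil by definition."""
        scale = slap_assigned / (slap_measured - vsmow_measured)
        return (delta_measured - vsmow_measured) * scale

    # Illustrative numbers only (not from the paper). SLAP carries an assigned
    # delta-2H value of -428.0 per mil on the VSMOW-SLAP scale.
    raw_sample = 30.1        # measured delta-2H of an unknown, per mil
    raw_vsmow = -1.2         # measured value of the VSMOW reference water
    raw_slap = -426.5        # measured value of the SLAP reference water
    print(round(normalize_to_vsmow_slap(raw_sample, raw_vsmow, raw_slap, -428.0), 1))
    ```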

  14. Dispel4py: An Open-Source Python library for Data-Intensive Seismology

    NASA Astrophysics Data System (ADS)

    Filgueira, Rosa; Krause, Amrey; Spinuso, Alessandro; Klampanos, Iraklis; Danecek, Peter; Atkinson, Malcolm

    2015-04-01

    Scientific workflows are a necessary tool for many scientific communities, as they enable easy composition and execution of applications on computing resources while scientists focus on their research without being distracted by computation management. Nowadays, scientific communities (e.g. Seismology) have access to a large variety of computing resources, and their computational problems are best addressed using parallel computing technology. However, successful use of these technologies requires a lot of additional machinery that is not straightforward for non-experts: different parallel frameworks (MPI, Storm, multiprocessing, etc.) must be used depending on the computing resources (local machines, grids, clouds, clusters) where applications are run. This implies that, to achieve the best application performance, users usually have to change their code depending on the features of the platform selected for running it. This work presents dispel4py, a new open-source Python library for describing abstract stream-based workflows for distributed data-intensive applications. Special care has been taken to give dispel4py the ability to map abstract workflows to different platforms dynamically at run-time. Currently dispel4py has four mappings: Apache Storm, MPI, multi-threading and sequential. The main goal of dispel4py is to provide an easy-to-use tool for developing and testing workflows on local resources by using the sequential mode with a small dataset. Later, once a workflow is ready for long runs, it can be automatically executed on different parallel resources, with dispel4py taking care of the underlying mappings and performing an efficient parallelisation. Processing Elements (PEs) represent the basic computational activities of any dispel4py workflow; a PE can be a seismological algorithm or a data transformation process. To create a dispel4py workflow, users only have to write a few lines of Python code describing their PEs and how they are connected; Python is widely supported on many platforms and popular in many scientific domains, such as the geosciences. Once a dispel4py workflow is written, a user only has to select which mapping to use, and everything else (parallelisation, distribution of data) is carried out by dispel4py at no cost to the user. Among dispel4py's features we would like to highlight the following: * PEs are connected by streams rather than by writing to and reading from intermediate files, avoiding many IO operations. * PEs can be stored in a registry, so different users can recombine them into many different workflows. * dispel4py has been enriched with a provenance mechanism to support runtime provenance analysis. We have adopted the W3C-PROV data model, which is accessible via a prototype browser-based user interface and a web API. It supports users with the visualisation of graphical products and offers combined operations to access and download the data, which may be selectively stored at runtime in dedicated data archives. dispel4py has already been used by seismologists in the VERCE project to develop different seismic workflows. One of them is the Seismic Ambient Noise Cross-Correlation workflow, which preprocesses and cross-correlates traces from several stations. First, this workflow was tested on a local machine using a small number of stations as input data. Later, it was executed on different parallel platforms (the SuperMUC cluster and the Terracorrelator machine), automatically scaling up to 1000 input stations by using the MPI and multiprocessing mappings. The results show that dispel4py achieves scalable performance with both mappings on different parallel platforms.
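
    To convey the flavour of a stream-based workflow built from Processing Elements, the self-contained sketch below mimics the concept with a toy 'sequential mapping'. It deliberately does not use the real dispel4py API: the PE classes (ReadTraces, Detrend, PeakAmplitude) and the run_sequential helper are hypothetical and exist only for illustration.

    ```python
    # A self-contained illustration of the stream-based Processing Element (PE)
    # idea; this is NOT the dispel4py API, just a sketch of the concept under
    # the assumption that each PE consumes items from one stream and emits to another.

    class PE:
        """Base class: a PE transforms each incoming item into zero or more items."""
        def process(self, item):
            raise NotImplementedError

    class ReadTraces:                     # hypothetical producer
        def __init__(self, traces):
            self.traces = traces
        def run(self):
            yield from self.traces

    class Detrend(PE):                    # hypothetical transformation PE
        def process(self, trace):
            mean = sum(trace) / len(trace)
            yield [x - mean for x in trace]

    class PeakAmplitude(PE):              # hypothetical consumer PE
        def process(self, trace):
            yield max(abs(x) for x in trace)

    def run_sequential(source, *stages):
        """A toy 'sequential mapping': push every item through the PE pipeline."""
        items = list(source.run())
        for stage in stages:
            items = [out for item in items for out in stage.process(item)]
        return items

    workflow_output = run_sequential(
        ReadTraces([[1.0, 2.0, 3.0], [4.0, 0.0, -4.0]]),
        Detrend(),
        PeakAmplitude(),
    )
    print(workflow_output)   # [1.0, 4.0]
    ```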

  15. Unified Desktop for Monitoring & Control Applications - The Open Navigator Framework Applied for Control Centre and EGSE Applications

    NASA Astrophysics Data System (ADS)

    Brauer, U.

    2007-08-01

    The Open Navigator Framework (ONF) was developed to provide a unified and scalable platform for user interface integration. The main objectives of the framework were to raise the usability of monitoring and control consoles and to allow reuse of software components in different application areas. ONF is currently applied to the Columbus onboard crew interface, the commanding application for the Columbus Control Centre, the specialized user interfaces of the Columbus user facilities, the Mission Execution Crew Assistant (MECA) study and EADS Astrium internal R&D projects. ONF provides a well-documented and proven middleware for GUI components (a Java plugin interface, a simplified concept similar to Eclipse). The overall application configuration is performed within a graphical user interface for layout and component selection, so the end user does not have to work in the underlying XML configuration files. ONF was optimized to provide harmonized user interfaces for monitoring and command consoles. It provides many convenience functions designed together with flight controllers and onboard crew: user-defined workspaces, including support for multiple screens; an efficient communication mechanism between the components; integrated web browsing and documentation search & viewing; consistent and integrated menus and shortcuts; common logging and application configuration (properties); and a supervision interface for remote plugin GUI access (web based). A large number of operationally proven ONF components have been developed: Command Stack & History (release commands and follow up command acknowledgements); System Message Panel (browse, filter and search system messages/events); Unified Synoptic System (a generic synoptic display system); Situational Awareness (shows overall subsystem status based on monitoring of key parameters); System Model Browser (browse mission database definitions: measurements, commands, events); Flight Procedure Executor (execute checklists and logical-flow interactive procedures); Web Browser (integrated browsing of reference documentation and operations data); Timeline Viewer (view the master timeline as a Gantt chart); and Search (local search of operations products, e.g. documentation, procedures, displays). All GUI components access the underlying spacecraft data (commanding, reporting data, events, command history) via a common library providing adaptors for the current deployments (Columbus MCS, the Columbus onboard Data Management System, and the Columbus Trainer raw packet protocol). New adaptors are easy to develop; an adaptor to SCOS 2000 is currently being developed as part of a study for the ESTEC standardization section ("USS for ESTEC Reference Facility").
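
    The adaptor idea described above (GUI plugins call a common data-access interface, with one adaptor per deployment) can be sketched briefly. ONF itself is Java-based; the sketch below uses Python purely for brevity, and every class, method and parameter name in it (TelemetryAdaptor, latest_value, CABIN_TEMP, and so on) is hypothetical rather than part of ONF.

    ```python
    from abc import ABC, abstractmethod

    class TelemetryAdaptor(ABC):
        """Hypothetical common data-access interface that GUI plugins call;
        one concrete adaptor is written per deployment."""
        @abstractmethod
        def latest_value(self, parameter: str) -> float: ...

    class ColumbusMcsAdaptor(TelemetryAdaptor):      # stand-in implementation
        def latest_value(self, parameter: str) -> float:
            return 21.5   # would query the Columbus MCS in a real adaptor

    class Scos2000Adaptor(TelemetryAdaptor):         # stand-in implementation
        def latest_value(self, parameter: str) -> float:
            return 21.7   # would query SCOS 2000 in a real adaptor

    class SituationalAwarenessPanel:
        """A GUI plugin that only knows the adaptor interface, not the deployment."""
        def __init__(self, adaptor: TelemetryAdaptor):
            self.adaptor = adaptor
        def status(self, parameter: str, limit: float) -> str:
            value = self.adaptor.latest_value(parameter)
            return "NOMINAL" if value <= limit else "WARNING"

    panel = SituationalAwarenessPanel(ColumbusMcsAdaptor())
    print(panel.status("CABIN_TEMP", limit=24.0))    # NOMINAL
    ```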

  16. Representing Graphical User Interfaces with Sound: A Review of Approaches

    ERIC Educational Resources Information Center

    Ratanasit, Dan; Moore, Melody M.

    2005-01-01

    The inability of computer users who are visually impaired to access graphical user interfaces (GUIs) has led researchers to propose approaches for adapting GUIs to auditory interfaces, with the goal of providing access for visually impaired people. This article outlines the issues involved in nonvisual access to graphical user interfaces, reviews…

  17. User Procedures Standardization for Network Access. NBS Technical Note 799.

    ERIC Educational Resources Information Center

    Neumann, A. J.

    User access procedures to information systems have become of crucial importance with the advent of computer networks, which have opened new types of resources to a broad spectrum of users. This report surveys user access protocols of six representative systems: BASIC, GE MK II, INFONET, MEDLINE, NIC/ARPANET and SPIRES. Functional access…

  18. Information broker: a useless overhead or a necessity

    NASA Astrophysics Data System (ADS)

    Maitan, Jacek

    1996-01-01

    The richness and diversity of information available over the Internet, its size, convenience of access, and its dynamic growth will create new ways to offer better education opportunities in medicine. The Internet will especially benefit the medical training process, which is expensive and requires continuous updating. Its use will lower delivery costs and make medical information available to all potential users. On the other hand, technologies alone are not enough: medical information must be trusted, and new policies must be developed to support these capabilities. In general, we must deal with issues of liability, remuneration for educational and professional services, and general issues of ethics associated with the patient-physician relationship in a complicated environment created by a mix of managed and private care combined with modern information technology. In this paper we focus only on the need to create, manage and operate an open system over the Internet, or similar low-cost, easy-access networks, for the medical education process. Finally, using business analysis, we argue why the medical education infrastructure needs an information broker, a third-party organization that helps users access the information and helps publishers display their titles. The first section outlines recent trends in medical education. In the second section, we discuss transfusion medicine requirements. In the third section we provide a summary of the American Red Cross (ARC) transfusion audit system and discuss the relevance of the assumptions used in this system to other areas of medicine. In the fourth section we describe the overall system architecture and discuss key components. The fifth section covers business issues associated with medical education systems and with the potential role of the ARC in particular. The last section provides a summary of findings.

  19. Cancer care management through a mobile phone health approach: key considerations.

    PubMed

    Mohammadzadeh, Niloofar; Safdari, Reza; Rahimi, Azin

    2013-01-01

    Greater use of mobile phone devices seems inevitable because the health industry and cancer care are facing challenges such as resource constraints, rising care costs, the need for immediate access to healthcare data of various types (audio, video, text) for early detection and treatment of patients, and increasing use of remote aids in telemedicine. To study the causes of cancer, detect cancer earlier, act on prevention measures, determine the effectiveness of treatment and specify the reasons for treatment ineffectiveness, physicians need access to accurate, comprehensive and timely cancer data. Mobile devices provide opportunities and can play an important role in consulting, diagnosis, treatment and quick access to health information, and their portability makes them well-suited tools for healthcare providers in cancer care management. Key factors in cancer care management systems based on a mobile phone health approach must be considered: in general terms, human resources, confidentiality and privacy, legal and ethical issues, appropriate ICT and provider infrastructure, and costs; in more specific terms, interoperability, human relationships, types of mobile devices and telecommunication-related issues. The successful implementation of mobile-based systems in cancer care management will constantly face many challenges. Hence, in applying mobile cancer care, it is essential to involve users and consider their needs in all phases of the project, provide adequate bandwidth, prepare standard tools that give users maximum mobility and flexibility, reduce obstacles that interrupt network communications, and use suitable communication protocols. Identifying and reducing barriers and strengthening the positive points will play a significant role in appropriate planning and in promoting the achievements of mobile cancer care systems. The aim of this article is to explain key points that should be considered in designing appropriate mobile health systems for cancer care as an approach to improving cancer care management.

  20. ASGARD: an open-access database of annotated transcriptomes for emerging model arthropod species.

    PubMed

    Zeng, Victor; Extavour, Cassandra G

    2012-01-01

    The increased throughput and decreased cost of next-generation sequencing (NGS) have shifted the bottleneck in genomic research from sequencing to annotation, analysis and accessibility. This is particularly challenging for research communities working on organisms that lack the basic infrastructure of a sequenced genome, or an efficient way to utilize whatever sequence data may be available. Here we present a new database, the Assembled Searchable Giant Arthropod Read Database (ASGARD). This database is a repository and search engine for transcriptomic data from arthropods that are of high interest to multiple research communities but currently lack sequenced genomes. We demonstrate the functionality and utility of ASGARD using de novo assembled transcriptomes from the milkweed bug Oncopeltus fasciatus, the cricket Gryllus bimaculatus and the amphipod crustacean Parhyale hawaiensis. We have annotated these transcriptomes with putative orthology assignment, coding region determination, protein domain identification and Gene Ontology (GO) term annotation for all possible assembly products. ASGARD allows users to search all assemblies by orthology annotation, by GO term annotation or with the Basic Local Alignment Search Tool. User-friendly features of ASGARD include search-term auto-completion suggestions based on database content, the ability to download assembly product sequences in FASTA format, direct links to NCBI data for predicted orthologs and graphical representation of the location of protein domains and matches to similar sequences from the NCBI non-redundant database. ASGARD will be a useful repository for transcriptome data from future NGS studies on these and other emerging model arthropods, regardless of sequencing platform, assembly or annotation status. This database thus provides easy, one-stop access to multi-species annotated transcriptome information. We anticipate that this database will be useful for members of multiple research communities, including developmental biology, physiology, evolutionary biology, ecology, comparative genomics and phylogenomics. Database URL: asgard.rc.fas.harvard.edu.
